If Parler is a conservative alternative to Twitter and MeWe is attempting to replicate Facebook, BitChute is best described as the right-wing alternative to YouTube.
Hate speech. Neo-Nazi propaganda. Anti-Semitic QAnon conspiracy theories. Even terrorism. It's all there. And the platform keeps growing.
Ben Horne, an assistant professor of Information Sciences at the University of Tennessee, Knoxville, recently co-authored a study that examined BitChute videos, metadata, and comments from June 2019 to December 2019. The results were alarming.
“BitChute hasn’t really received coverage like all those other alternative platforms,” said Horne. “It was relatively small when we started doing our data collection, but it's grown a lot since.”
A week before the 2020 presidential election, he told me the platform grew rapidly after YouTube and other sites banned Plandemic, a viral film that spread COVID-19 conspiracy theories.
“In terms of measuring hate speech, the first thing that we found was that there was more hate speech on BitChute than on Gab,” explained Horne, referencing another alternative platform popular with the far right.
“Gab had been receiving a lot of criticism for being very hateful and having a lot of hate speech ... but there was significantly more on BitChute."
An August report by the Anti-Defamation League (ADL) found white supremacist channels racking up hundreds of thousands of views on BitChute.
The UK-based Jewish advocacy group, Community Security Trust (CST), labeled BitChute — along with Gab, Telegram, and 4chan — as one of the internet's biggest spreaders of anti-Semitic content.
A report from UK advocacy group HOPE not hate also found that BitChute was riddled with anti-Semitic content.
“Holocaust denial is really, really big on the site,” explained Gregory Davis, a researcher with HOPE not hate. “Some of the most popular channels on there and the videos on its front page every day contain Holocaust denial.”
White supremacist organizations like Patriot Front and the Proud Boys have active accounts on the site alongside far-right personalities, including Stefan Molyneux and Alex Jones, who moved to BitChute after their popular channels were banned on YouTube.
Mashable reached out to BitChute for comment on several issues but did not hear back.
HOPE not hate’s report also described BitChute as “an increasingly important hub for terrorist propaganda." The group found more than 100 videos promoting ISIS and National Action, a UK neo-Nazi terrorist organization. It also found recruitment videos for Atomwaffen Division, a U.S.-based white supremacist terror group.
Footage of mass shootings can be viewed on the platform as well.
“The thing that was sort of surprising to us is that we ended up finding terrorist recruitment material,” said Horne. “And this was something that was specifically written out in their guidelines that they weren't going to tolerate. Clearly, that wasn’t the case.”
In the lead-up to November’s elections, many big social media companies began cracking down on misinformation.
When YouTube finally banned “harmful” conspiracy theory content “used to justify real-world violence” in October, many QAnon spreaders were purged from the site.
Many of them already had a BitChute channel up and running with the expectation they'd eventually be booted from YouTube. And it wasn't hard to move their content over. The company has an importer tool that makes it easy.
Users don't even have to own the content. There is a Reuters channel on BitChute, Horne pointed out, syndicating the company's video content without permission.
“Reuters has no relationship with BitChute," said a Reuters spokesperson. "We were unaware of this channel and are looking into the matter.”
As HOPE not hate’s report puts it: “BitChute exists to circumvent the moderation of mainstream platforms.”
BitChute really seems like the Wild West. The company lists basic community guidelines on the site, but users can easily find videos that violate them. And it's not like there's so much content that BitChute couldn't moderate it all.
“If you look at the ‘ALL’ videos section on BitChute, you can actually watch them being uploaded to the site in real time,” explained Davis. “The quantity of material uploaded to [BitChute] daily is actually not that large.”
The lack of moderation isn't an oversight; it's BitChute's selling point.
When Horne’s study was published in May, it found that approximately 50 videos had been removed for violating BitChute's policies. But a majority of those videos were taken down due to DMCA requests, i.e., copyright issues.
Horne’s study also found that BitChute only has a handful of popular video creators driving most of its traffic and engagement. On top of that, the site’s comments are often created by just a few of the site’s most frequent visitors.
BitChute was founded in 2017 by British web developer Ray Vahey in order to create a “free speech” alternative to YouTube. It was conceived following changes to the Google-owned video giant’s monetization policies, meant to cut down on hate speech and extremist content. While the company is based in the UK, Vahey lives and works in Thailand.
He and the company seem to invite fringe content. BitChute's Twitter account has shared links to videos on the site created by white nationalists. And Vahey has tweeted support from his personal account for conspiracy theories including Pizzagate, which claimed that a child-trafficking ring was being run out of a pizza place in Washington, D.C.
BitChute promotes itself as a peer-to-peer service. This has been one of the site's main selling points to right-wingers looking for a YouTube alternative.
Being a "decentralized" platform, BitChute wouldn’t technically host videos. Instead, it would simply facilitate the transfer of content between its users' computers. Thus, the company would have little control over user-uploaded content.
But it might not be a peer-to-peer service at all.
Fredrick Brennan, the founder of another far-right platform, 8chan, laid out evidence in the Daily Dot that BitChute doesn't use peer-to-peer technology and instead hosts the videos on its site. (Since leaving 8chan, Brennan has become a critic of the site, due to its role in spreading QAnon conspiracy theories.)
As for BitChute’s own web service providers, its domain is registered with Epik, a domain name registrar known for providing a safe haven to far-right websites.
BitChute’s funding, at least what’s public of it, comes from user donations. Its website says the company is closing in on its monthly goal of $30,000. The platform also recently started showing ads for the first time, served by an online advertising company called Criteo.
While organizations like the ADL have previously shined a light on the platform, a group of lawyers and activists in the UK are now looking to take further action against BitChute.
"Antisemitism and holocaust denial online are not specific criminal offences in the UK but other general criminal legislation can cover this behaviour," explained UK Lawyers for Israel (UKLFI) director Caroline Turner in an email. "Both the individuals uploading the content and the websites and organisations hosting the content could be committing criminal offences."
Turner provided Mashable with a list of UK laws meant to stop the spread of hate speech and misinformation — including Section 127 of the Communications Act 2003, the Public Order Act 1986, and the Malicious Communications Act 1988 — that BitChute could be violating.
As Turner points out, those committing the offenses would need to be located in the UK. While BitChute is registered in England, many of its content creators, and its founder, reside outside of the country.
In a letter to the British government's anti-Semitism tsar, John Mann, UKLFI condemned the site for the “violent, racist, and harmful videos” that it hosts and provided examples of anti-Semitic content. It ended with a request that the site be taken down.
In addition, UKLFI has sent statements to BitChute's web service providers, including Epik and Cloudflare, requesting that they cut ties with the platform. The organization said it's also working with other groups to figure out its next steps.
Vahey previously claimed that the company was working with UK counter-terrorism police to "improve" its "processes" for dealing with terrorist content. But if BitChute is taking action, it doesn't appear to be working.
"BitChute states on its website that it has community standards and it has actually removed many of its videos from sight," said Turner. "However, when a website such as BitChute becomes a magnet for this type of offensive material, it seems that a proliferation of hateful and offensive material is constantly being added."
BitChute has survived deplatforming attempts before.
In 2019, PayPal barred BitChute from using its payment-processing service. That same year, Twitter also temporarily banned links to the site from being shared on its platform. And, just last month, Google Play booted a third-party BitChute mobile app from its app store.
But BitChute is more popular than it's ever been. Traffic to the site has soared since the 2020 election. It's currently the 615th most visited site in the U.S., according to the Amazon-owned web analysis tool Alexa.com. BitChute's global Alexa rank earlier this year was in the 3,000 range. Now, it's the 1,667th most visited website in the world and rising.
That makes BitChute more popular than Gab and MeWe. For a time, BitChute was even attracting more visitors than Parler, now the most popular conservative social media platform since the end of the presidential election.
“YouTube has been doing a lot of banning, downgrading of search rankings recently,” explained Horne. “And so I think that's pushed a lot of people to move towards a platform like BitChute.”
When YouTube first started purging extremist content in 2017, there were quite a few platforms vying to be the alternative to the video giant. It’s clear now that BitChute has won that battle.
"BitChute has been taking up a greater and greater percentage of the video links shared in far-right groups on Facebook, Telegram, and Twitter," Davis said.
YouTubers played an integral part in the spread of QAnon and other conspiracy theories as people turned to the internet to pass the time during the pandemic.
As the U.S. experiences another coronavirus spike, it looks like the same conspiracy theorists could, once again, spread misinformation through videos, this time on BitChute.