YouTube Cracks Down on Far-Right Videos as Conspiracy Theories Spread

The New York Times/March 3, 2018

By Jonah Engel Bromwich

YouTube this week cracked down on the videos of some prominent far-right actors and conspiracy theorists, continuing an effort that has become more visible since the school shooting in Parkland, Fla., last month caused a torrent of misinformation to be featured prominently on the site.

A week after the shooting, many of the videos on YouTube’s “Trending” list contained misinformation about the teenage survivors of the shooting. The top video on the list for some time falsely claimed that a student at the school, David Hogg, was a paid actor.

That video and others like it led to intense criticism of the site. Since then, many prominent right-wing personalities have reported that YouTube has issued them strikes, which the site uses to enforce its community guidelines. If a channel receives three strikes within three months, YouTube terminates it.

The company’s guidelines prohibit “videos that contain nudity or sexual content, violent or graphic content, harmful or dangerous content, hateful content, threats, spam, misleading metadata, or scams.”

Mike Cernovich, the right-wing agitator and conspiracy theorist, said Wednesday that his channel, which has more than 66,000 subscribers, had been given a strike. (Mr. Cernovich said Saturday that YouTube had reversed the strike and that the video that had been banned was again available on the site.)

Infowars, the conspiracy theory outlet headed by Alex Jones, said Tuesday that it had received a second strike in two weeks, both for videos about the Parkland shooting. (Infowars, which has more than 2.2 million YouTube subscribers, later said the second strike had been removed.) Infowars’ Washington bureau chief, Jerome Corsi, said on Twitter that his YouTube channel had been terminated without notice or explanation.

News outlets including The Outline and Breitbart have pointed to more than a dozen other right-wing or right-leaning accounts that they say have been disciplined, either receiving strikes or being banned outright, in the past several weeks.

They include the violent neo-Nazi group Atomwaffen (banned for hate speech) and the YouTube star Carl Benjamin, known by his username Sargon of Akkad, who criticizes feminism and identity politics. Mr. Benjamin posted a screenshot on Facebook on Thursday that said he had been locked out of his Google account because “it looked like it was being used in a way that violated Google’s policies.”

YouTube said it was not aware of any prominent accounts that had falsely reported strikes, though it did say that Mr. Benjamin had violated its policy on copyright infringement.

YouTube denied that the deletions and other actions were ideologically driven. It said the accounts that had been disciplined or banned were only the most prominent and vocal of many across the ideological spectrum whose videos had been taken down for violating the site’s rules.

But critics said YouTube was reacting haphazardly in an attempt to purge actors who have garnered it negative attention. They questioned whether the site was prepared to substantively address the problem of the conspiracy theories that flourish on its platform.

And some of the right-wing YouTube stars and conspiracists who were affected saw the disciplinary action as a result of what they say is left-wing ideology flourishing inside Google, of which YouTube is a subsidiary. Mr. Benjamin told Breitbart that the company was “riddled with a far-left ideological orthodoxy that has taken hold to a radical degree.”

A YouTube spokeswoman said in a statement that its “reviewers remove content according to our policies, not according to politics or ideology, and we use quality control measures to ensure they are applying our policies without bias.”

The company is in the middle of hiring a large influx of moderators, and it attributed some of its recent enforcement actions to new employees who are still learning to apply its rules.

“With the volume of videos on our platform, sometimes we make mistakes and when this is brought to our attention we reinstate videos or channels that were incorrectly removed,” the company’s statement said.

The moderation efforts of YouTube, like those of Facebook and Twitter, have begun to receive more attention in the last year as academics and journalists have focused on how misinformation — like that sown by Russia during the 2016 presidential election — is spread.

In a December blog post, YouTube announced it would add many people to its work force in 2018, hoping eventually to have 10,000 working to moderate or otherwise address content that has violated its rules.

The company is in the process of hiring many of those people. A spokeswoman said that applying the company’s standards to any given video required training, and that new moderators were bound to make some mistakes. She said that some of the strikes handed out since the Parkland shooting had been mistaken, though she did not specify which.

Jonathan Albright, the research director of the Tow Center for Digital Journalism and an expert on how misinformation thrives on social media, said YouTube had long been inconsistent in enforcing its guidelines.

“If these accounts are getting deleted at the last minute because people are angry and news organizations are digging into this, should these accounts have existed in the first place?” he asked.

Negative news media attention results in more users flagging videos of right-wing conspiracy theorists, leading to a feedback loop in which channels promoting those views are disciplined. The effect is that YouTube’s disciplinary process is reactive, relying on users to flag content they find troublesome.

A YouTube spokeswoman said the company planned to release a transparency report in the spring that would show the full range of videos that had been taken down.

Zack Exley, a left-wing populist who runs the YouTube channel Left Right Forward, said that the bans of high-profile right-wing accounts may do more harm than good by building up the reputations of extremists among their bases.

“The extreme right loves being censored,” he said. “They become heroes and raise tons of money every time YouTube removes or demonetizes their videos.”

YouTube does not police misinformation, which, on its own, does not violate the site’s guidelines. But the company says it is confident that, with its steep increase in moderators and progress in machine learning, it will be capable of enforcing its rules effectively, which will help stem the tide of objectionable videos.

Mr. Albright said that hiring many more people would help. But he said that those people would have to be trained to address difficult situations, applying the platform’s rules to nuanced videos in which the right decision was not always clear.

“It would take a group of people specifically trained in this kind of situation,” he said. “It’s a problem of scale. This stuff doesn’t scale like algorithms. Humans don’t scale.”

“YouTube isn’t in a ‘too big to fail’ situation,” he added. “But they’re potentially too big to moderate.”
