YouTube’s secret life as an engine for right-wing radicalization

Columbia Journalism Review/September 19, 2018

By Mathew Ingram

For many casual YouTube users, the Google-owned video service is a harmless way to waste time, listen to music, or maybe even learn how to install a new appliance. But if you dig below the surface, as the non-profit research institute Data & Society does in a new report, you quickly start to see odd or even disturbing links to right-wing pundits and conspiracy theories. This is YouTube’s alter ego, what sociologist Zeynep Tufekci has called “one of the most powerful radicalizing instruments of the 21st century.” And it’s not a coincidence, the report says: it is the product of a deliberate effort by a network of influencers to radicalize users by pulling them into a vortex of reactionary content.

In the Data & Society analysis, “Alternative Influence: Broadcasting the Reactionary Right on YouTube,” researcher Rebecca Lewis looks at 65 political influencers across 81 YouTube channels and identifies what she calls an Alternative Influence Network, or AIN. The AIN uses the same techniques that brands and other social-media influencers use to build followings and garner traffic, but uses them to sell users on a specific right-wing ideology. The media pundits and internet celebrities in the network, who include Canadian professor Jordan Peterson and white supremacist Richard Spencer, “use YouTube to promote a range of political positions, from mainstream versions of libertarianism and conservatism, all the way to overt white nationalism,” Lewis writes in the report.

Just as Instagram users might market a new brand of alcohol by posting photos and videos of themselves and tagging others to extend their reach, social networking among right-wing influencers on YouTube “makes it easy for audience members to be incrementally exposed to, and come to trust, ever more extremist political positions,” Lewis writes. And Google, of course, happily monetizes all of that engagement and traffic with ads.

It’s not just that Google is taking advantage of the traffic generated by these networks. As I wrote for CJR earlier this year, the problem is exacerbated by Google’s recommendation engine, an algorithm that suggests new videos for users to watch after they have finished with the one they clicked on or searched for. For many younger users, this is the new TV—watching video after video on YouTube. And the site’s algorithm is often gamed by right-wing trolls to get their hoaxes or fake news high up in the recommended list, an example of what the Oxford Internet Institute has called “computational propaganda.”
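
To make the incentive problem concrete, consider a deliberately toy sketch of an engagement-driven ranker. YouTube’s actual recommendation system is proprietary and far more complex, and every name and number below is hypothetical, invented purely for illustration. The point is what such an objective leaves out: when predicted watch time is the only signal, nothing in the score penalizes hoaxes or rewards accuracy.

    # Toy sketch of engagement-driven ranking. This is NOT YouTube's
    # actual (proprietary) system; all names and numbers are invented.
    from dataclasses import dataclass

    @dataclass
    class Video:
        title: str
        predicted_watch_minutes: float  # hypothetical model output

    def recommend(candidates: list[Video], k: int = 3) -> list[Video]:
        # Rank purely by predicted watch time. Note what is absent:
        # no term for accuracy, source quality, or harm -- the gap
        # critics such as Guillaume Chaslot point to.
        return sorted(candidates,
                      key=lambda v: v.predicted_watch_minutes,
                      reverse=True)[:k]

    if __name__ == "__main__":
        pool = [
            Video("How to install a dishwasher", 4.2),
            Video("Ten-part 'hidden truth' series, episode 1", 11.7),
            Video("Local news segment", 2.1),
        ]
        for v in recommend(pool):
            print(f"{v.predicted_watch_minutes:5.1f} min  {v.title}")

Under this objective, the sensational multi-part series takes the top slot simply because it holds attention longest, which is the dynamic Chaslot describes below.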

Google has said it is concerned about misinformation on YouTube (especially after conspiracy theories were some of the top recommendations following the school shooting in Parkland, Florida, in February) and that it is implementing a number of features to reduce the likelihood that users will see fake news in the recommended list. But what Lewis describes in her Data & Society report is even harder to root out: a coordinated attempt to expose viewers to right-wing ideologies, not necessarily through the use of conspiracy theories or fakes, but through the kind of brand-building that YouTube and other social tools excel at.

Here are some more links related to misinformation and computational propaganda:

  • A conspiracy ecosystem: Jonathan Albright, research director at the Tow Center for Digital Journalism at Columbia University, looked at the rise of what he calls the “conspiracy ecosystem” viewers could get sucked into after searching for videos about the Parkland shootings. “It’s not YouTube getting gamed,” he told The Washington Post. “It’s that YouTube has allowed this to flourish. The Florida videos are now taking people to the larger conspiracy space.”
  • The intellectual dark web: Many of the right-wing or libertarian personalities Rebecca Lewis mentions in her Data & Society report like to think of themselves as members of what Eric Weinstein, a managing director of billionaire Peter Thiel’s venture capital firm, has called the “intellectual dark web.” New York Times writer Bari Weiss wrote about some of the members of this group in May.
  • Keep them clicking: Guillaume Chaslot, a former Google engineer who worked on the recommendation algorithms used by YouTube, told CJR earlier this year that the number-one metric staffers were supposed to focus on was time spent on the site, not the quality of the information. Chaslot has since left the company and created a site called AlgoTransparency, aimed at showing how YouTube’s recommendation engine often suggests hoaxes when users search for political or scientific terms.
  • A global problem: Gaming YouTube’s algorithms or social networking structure to spread right-wing messages in the US is clearly an issue, but the use of social platforms to spread political misinformation and even dangerous conspiracy theories is widespread, according to a recent report by the Oxford Internet Institute’s Computational Propaganda project. Researchers found evidence of “formally organized social media manipulation campaigns” in 48 countries, up from 28 countries last year.
  • Too late for 2018: Although Facebook has tried to clamp down on potential meddling in the US mid-term elections by removing networks of fake pages and “inauthentic” accounts, the social network’s former head of security said recently that it is too late to prevent social-media-driven interference in the elections, which he said could become “the World Cup of information warfare.”
