The Media Today

YouTube’s secret life as an engine for right-wing radicalization

September 19, 2018

For many casual YouTube users, the Google-owned video service is a harmless way to waste time, listen to music, or maybe even learn how to install a new appliance. But if you dig below the surface, as the non-profit research institute Data & Society does in a new report, you quickly start to see odd or even disturbing links to right-wing pundits and conspiracy theories. This is YouTube’s alter ego, what sociologist Zeynep Tufekci has called “one of the most powerful radicalizing instruments of the 21st century.” And it’s not a coincidence, the report says—it’s a deliberate attempt to radicalize users by pulling them into a vortex of reactionary content.

In the Data & Society analysis, “Alternative Influence: Broadcasting the Reactionary Right on YouTube,” researcher Rebecca Lewis looks at 65 political influencers across 81 YouTube channels and identifies what she calls an Alternative Influence Network, or AIN. The AIN relies on the same techniques that brands and other social-media influencers use to build followings and garner traffic, but deploys them to sell users on a specific right-wing ideology. The media pundits and internet celebrities in the network, who include Canadian professor Jordan Peterson and white supremacist Richard Spencer, “use YouTube to promote a range of political positions, from mainstream versions of libertarianism and conservatism, all the way to overt white nationalism,” Lewis writes in the report.

Just as Instagram users might market a new brand of alcohol by posting photos and videos of themselves and tagging others to extend their reach, social networking among right-wing influencers on YouTube “makes it easy for audience members to be incrementally exposed to, and come to trust, ever more extremist political positions,” Lewis writes. And Google, of course, happily monetizes all of that engagement and traffic with ads.

It’s not just that Google is taking advantage of the traffic generated by these networks. As I wrote for CJR earlier this year, the problem is exacerbated by Google’s recommendation engine, an algorithm that suggests new videos for users to watch after they have finished with the one they clicked on or searched for. For many younger users, this is the new TV—watching video after video on YouTube. And the site’s algorithm is often gamed by right-wing trolls to get their hoaxes or fake news high up in the recommended list, an example of what the Oxford Internet Institute has called “computational propaganda.”

Google has said it is concerned about misinformation on YouTube (especially after conspiracy theories appeared among the top recommendations following the school shooting in Parkland, Florida, in February), and that it is implementing a number of features to reduce the likelihood that users will see fake news in the recommended list. But what Lewis describes in her Data & Society report is even harder to root out: a coordinated attempt to expose viewers to right-wing ideologies, not necessarily through conspiracy theories or fakes, but through the kind of brand-building that YouTube and other social tools excel at.

Here are some more links related to misinformation and computational propaganda:

  • A conspiracy ecosystem: Jonathan Albright, research director at the Tow Center for Digital Journalism at Columbia University, looked at the rise of what he calls the “conspiracy ecosystem” that viewers could get sucked into after searching for videos about the Parkland shootings. “It’s not YouTube getting gamed,” he told The Washington Post. “It’s that YouTube has allowed this to flourish. The Florida videos are now taking people to the larger conspiracy space.”
  • The intellectual dark web: Many of the right-wing or libertarian personalities Rebecca Lewis mentions in her Data & Society report like to think of themselves as members of what Eric Weinstein, a managing director of billionaire Peter Thiel’s venture capital firm, has called the “intellectual dark web.” New York Times writer Bari Weiss wrote about some of the members of this group in May.
  • Keep them clicking: Guillaume Chaslot, a former Google engineer who worked on YouTube’s recommendation algorithms, told CJR earlier this year that the No. 1 metric staffers were expected to focus on was time spent on the site, not the quality of the information. Since leaving the company, Chaslot has created a site called AlgoTransparency, aimed at showing how YouTube’s recommendation engine often suggests hoaxes when users search for political or scientific terms.
  • A global problem: Gaming YouTube’s algorithms or social networking structure to spread right-wing messages in the US is clearly an issue, but the use of social platforms to spread political misinformation and even dangerous conspiracy theories is widespread, according to a recent report by the Oxford Internet Institute’s Computational Propaganda project. Researchers found evidence of “formally organized social media manipulation campaigns” in 48 countries, up from 28 countries last year.
  • Too late for 2018: Although Facebook has tried to clamp down on potential meddling in the US mid-term elections by removing networks of fake pages and “inauthentic” accounts, the social network’s former head of security said recently that it is too late to prevent social-media-driven interference in the elections, which he said could become “the World Cup of information warfare.”

Other notable stories:

  • Jonathan Kaiman, a former Beijing bureau chief for The Los Angeles Times, has resigned after being accused of sexual misconduct. Kaiman, who was suspended from the newspaper in May after accusations were made against him by two women, said in a statement that any sexual behavior he engaged in was consensual.
  • Twitter co-founder and CEO Jack Dorsey tells Wired, for its 25th anniversary issue, that he thinks one of the digital pioneers of the next 25 years will be ProPublica and its “experimental journalism.”
  • The New York Times apologized on Twitter after it mistakenly identified actress Angela Bassett, who was presenting at the Emmy Awards, as former White House staffer and reality show contestant Omarosa Manigault Newman. The Times said it regretted “running an incorrect caption from a photo wire service in some early print editions.”
  • For CJR, Andrew McCormick spoke with Time Editor in Chief Edward Felsenthal about the acquisition of the magazine by software billionaire Marc Benioff. Felsenthal says he thinks Benioff will be a “terrific fit” for the magazine, whose revenues have been on a downward trajectory for some time.
  • New York magazine announced that it is expanding its Intelligencer brand with a number of new hires, and will also bring its Select All technology vertical under the same umbrella. The magazine has hired former Business Insider editor Josh Barro, Mic writer Zak Cheney-Rice, and New Republic writer Sarah Jones.
  • Anita Hill, who testified about sexual harassment by Supreme Court nominee Clarence Thomas during his confirmation hearing in 1991, writes for The New York Times about how the Senate should handle the confirmation hearing of nominee Brett Kavanaugh, who has been accused of sexual assault by Christine Blasey Ford.
Mathew Ingram is CJR’s chief digital writer. Previously, he was a senior writer with Fortune magazine. He has written about the intersection between media and technology since the earliest days of the commercial internet. His writing has been published in the Washington Post and the Financial Times as well as by Reuters and Bloomberg.