The Media Today

Researchers say fears about ‘fake news’ are exaggerated

February 7, 2019
 

It’s so widely accepted that it’s verging on conventional wisdom: misinformation, or “fake news,” spread primarily by Facebook to hundreds of millions of people (and created by Russian agents), helped distort the political landscape before and during the 2016 US presidential election, and this resulted in Donald Trump becoming president. But is it really that cut and dried? Not according to Brendan Nyhan, a political scientist and professor of public policy at the University of Michigan. He and several colleagues have been researching this question since the election, and have come to a very different conclusion. Fears about the spread and influence of fake news have been over-hyped, Nyhan says, and many of the initial conclusions about the scope of the problem and its effect on US politics were exaggerated or just plain wrong.

Nyhan says his data shows so-called “fake news” reached only a tiny proportion of the population before and during the 2016 election. For most people, misinformation from a range of fake news sites made up just 2 percent or less of their online news consumption, and even among the group of older conservatives who were most likely to consume fake news, it made up only about 8 percent. Not only that, but the University of Michigan researcher says a new paper he and his colleagues recently published shows the reach of fake news actually fell significantly between the 2016 election and last year’s midterm elections, which suggests Facebook’s efforts to crack down on the problem may have had some effect. Nyhan also says “no credible evidence exists that exposure to fake news changed the outcome of the 2016 election.”

This might come as a surprise to Kathleen Hall Jamieson. She’s a veteran public policy researcher who published a book last year entitled Cyberwar: How Russian Hackers and Trolls Helped Elect a President. Jamieson, whose colleagues call her “the Drill Sergeant” for her no-nonsense attitude, has more than 40 years of studying human behavior under her belt. In the book, she says the evidence suggests misinformation propagated by Russian trolls likely influenced the outcome of the election, in part because of the number of “swing” or undecided voters who were susceptible to those kinds of tactics. Jamieson also notes that the traditional news media played a key role in spreading this fake news and propaganda by writing innumerable articles about Hillary Clinton’s emails. And she argues fake news wouldn’t have had to make much of an impact to influence the election, since a fairly small number of votes gave Trump the electoral college wins he needed.

Nyhan and his fellow researchers, however, including Princeton political scientist Andrew Guess, say their study tracked the actual online behavior of a large sample of users who consented to have their activity recorded in real time, then followed up with interviews about their perceptions of the content they saw. Not only was the amount of actual fake news they encountered incredibly tiny, Guess told CJR this past fall, but the idea that it would sway their behavior is also a bit of a stretch (something Nyhan wrote about for The New York Times last year). “It’s predominantly people who are inclined to believe the conclusions that are being made in this content, not so much swaying them to believe something,” Guess said. “In other words, it’s more or less just confirmation bias.”

So why has this myth of fake news swinging the election persisted despite a lack of evidence to support it? Nyhan’s theory is that it’s a little like the myth that Orson Welles’s radio play “War of the Worlds” caused widespread panic among the US population when it aired in 1938. The play was likely heard by only a tiny number of people, and there’s no actual evidence that it caused any kind of panic, yet the myth persists, in part because newspapers at the time played up the idea as a way of discrediting radio (a relatively new competitor) as a source of news. In the same way, Nyhan argues, concerns about fake news being spread by Russian agents on Facebook are fueled by broader concerns about the influence of social networks on society.

Here’s more about fake news and US politics:

  • A fake problem: In an article he wrote for CJR in 2017, Jacob Nelson, a professor at Arizona State University’s journalism school, argued that fake news was “a fake problem.” Evidence gathered from online analytics tools like comScore, he said, showed that “the audience for fake news is real, but it’s also really small.”
  • A crazy idea: In an infamous quote not long after the 2016 election, Facebook CEO Mark Zuckerberg scoffed at the idea that fake news could have influenced the outcome. “I think the idea that fake news on Facebook—which is a very small amount of the content—influenced the election in any way [is] a pretty crazy idea,” he said. He later said he regretted downplaying the effect of fake news.
  • Blame the media: In a study published in late 2017, Duncan Watts and David Rothschild from Microsoft Research argued that the amount and reach of misinformation was unlikely to have had any noticeable impact on the election, and that instead of blaming fake news, we should “blame the mainstream media.”
  • Blame the media 2: A group of researchers from Harvard and MIT argued in a paper analyzing media coverage of the election that the problem was a right-wing media ecosystem anchored around Breitbart News that “used social media as a backbone to transmit a hyper-partisan perspective to the world.”

Other notable stories:

  • Former New York Times Executive Editor Jill Abramson is facing multiple accusations of plagiarism related to her book, Merchants of Truth: The Business of News and the Fight for Facts, including for a passage similar to a CJR post I wrote. In an interview on Fox News, Abramson said she “certainly didn’t plagiarize” in the book, and on Twitter she said she takes the issues raised seriously and will review the passages in question.
  • Veteran Florida journalist Bob Norman writes for CJR about being targeted by Roger Stone, the notorious Trump advisor who has been arrested and charged with multiple felonies as part of the Mueller investigation. Norman says when he started writing about former Broward County Sheriff Scott Israel, he became the target of Stone’s “pathological attack machine.”
  • The New York Times announced that it had over $700 million in digital revenue last year, thanks in part to growth in digital subscriptions (both for the newspaper and other services such as the crossword) as well as new features such as its consumer recommendation service, Wirecutter. Times Company CEO Mark Thompson has committed to having at least $800 million in digital revenue by 2020.
  • Researchers from Northwestern University looked at 13 terabytes of anonymous audience and subscriber data from the Chicago Tribune, Indianapolis Star, and San Francisco Chronicle, and found that “frequency of consuming local news is the single biggest predictor of retaining subscribers—more than the number of stories read or the time spent reading them.”
  • Campbell Brown, the Head of News for Facebook, told a gathering of media-industry executives that the social network is not here to save them. “Facebook cannot be the entire solution to your problems,” she told the American Magazine Media conference. “By its very nature, Facebook is constantly changing and not dependable.”
  • Music-streaming service Spotify announced that it is acquiring Gimlet Media, which produces popular podcasts such as Homecoming and Reply All, for a rumored $230 million, and is also buying Anchor, a tool that makes it easy for podcasters to create and monetize their work. Spotify says it plans to spend as much as $500 million on acquisitions this year.
  • According to a Fortune magazine report based on an internal briefing with Facebook employees, one of the criteria that will be used to determine whether a staffer is eligible for a bonus—and how much—will be whether he or she helped the company deal with misinformation and hate speech on the platform.
  • Vox is launching a membership program, but it has a twist, according to a Nieman Lab report: the program is focused exclusively on video, and applies only to the news publisher’s content on YouTube. Members who pay $4.99 a month get access to behind-the-scenes content and a monthly Q&A with a producer.
  • A report at TechCrunch says that the online community Reddit is close to landing a new financing round led by Chinese media company Tencent that could value the company at $3 billion. The last funding round that Reddit closed in 2017 valued the former Advance Publications–owned site at $1.8 billion.
Mathew Ingram is CJR’s chief digital writer. Previously, he was a senior writer with Fortune magazine. He has written about the intersection between media and technology since the earliest days of the commercial internet. His writing has been published in the Washington Post and the Financial Times as well as by Reuters and Bloomberg.