The Media Today

How does fact-checking work when we can’t agree on the truth?

November 15, 2019
 

Last month, Facebook announced that it would exempt political advertising from the fact-checking standards imposed on the rest of its platform. The move was controversial. More than ever before, social media users are finding ways to debunk disinformation, yet the volume of inaccuracies and outright falsehoods never seems to diminish, thanks in large part to Donald Trump. How did we get here? And what are the best practices for fact-checking in the chaotic news environment we’re living in? Is fact-checking working at all to correct people’s impressions of misinformation? To explore these and related questions, CJR convened a virtual symposium on Galley, its discussion platform, with experts in the fact-checking field.

Jonathan Albright, who runs the Digital Forensic Research unit at Columbia’s Tow Center, says that his research shows that many of the same disinformation strategies from the 2016 election—aimed at reinforcing polarization and institutional distrust—are being leveraged this time around, too. As before, the focus is on religion, immigration, health, and climate change. When it comes to political advertising, Albright says, fact-checking wouldn’t be sufficient to confront the scope of the problem—even if Facebook did allow for it. “We need a [Federal Election Commission]-style portal on how citizen data is used in political campaigns, not separate platform political ad APIs,” he argues. Baybars Örsek, of the International Fact-Checking Network, says that Facebook’s decision not to fact-check political ads is a mistake: “I think fact-checkers should be able to flag not only political advertisements but also political claims and statements on Facebook.” 

Rampant disinformation may seem like a modern crisis, but Kelly Weill, of The Daily Beast, points out that “the US is a country that’s always held conspiratorial thinking close to its heart. The signers of the Declaration of Independence believed a number of falsehoods about plots by King George III against America.” Conspiratorial thinking often comes with new communication methods, Weill adds: the Flat Earth movement got its start in the United Kingdom in the mid-1840s, when newspapers became widely available. Renee DiResta, of the Stanford Internet Observatory, says that disinformation is “a chronic condition, and we’re now in the process of figuring out the best way to manage it.” What we need to do, she believes, is develop a much deeper understanding of how and when people internalize the messages they receive, “particularly in an era in which they’re barraged with messages and attention-grabbing content every time they pick up their phone.” 

ICYMI: Everyone is admitting what they get paid to work in journalism

Maarten Schenk, who runs a debunking site called Lead Stories, says that, over the past year or so, disinformation has evolved. “Operations are becoming larger and more complex, with widespread use of fake or stolen accounts to spread links around, often coming from dozens of websites that are all part of the same network,” he observes. Paradoxically, the changes indicate that countermeasures implemented by Facebook and others are working, he argues. “Creating a fake Facebook account is much more difficult these days, for example, with some of them getting caught within minutes for displaying non-human behavior.”

When Nathan Walter, a disinformation researcher at Northwestern University, analyzed whether fact-checking works, his team found evidence that it does, in the sense that “people’s beliefs become more accurate and factually consistent” after seeing a fact-checking message. Yet Walter says that the benefits are minimal—even absent—when it comes to political campaign statements. And attempts to add relevant context, he adds, can actually make the problem worse. Alexios Mantzarlis, who ran the International Fact-Checking Network for several years before moving to the Google News Lab, says that in the time he led the IFCN, “We went from not thinking about it enough to hoping it could be a silver bullet and ultimately to fighting among ourselves because it didn’t stand up to those expectations.”


Brooke Binkowski, a former managing editor at Snopes.com who now works for a fact-checking site called Truth or Fiction, says that fact-checking journalists confronting the scale of misinformation on social media need to adopt an aggressive stance. “You have to be prepared to stand up for the truth and defend it, in this Disinformation Age,” she says. “This isn’t ‘view from nowhere’ journalism—you have to be willing and prepared to get into people’s faces a bit, to tell them they’re wrong, to point your finger at them in the public square and say, ‘Look. This is a lie, and here is the liar who is spreading it.’”

Here’s more on disinformation and the challenges of fact-checking:

  • Non-responsive: David Mikkelson, the founder of Snopes, talks in a Galley interview about why his site quit working with Facebook. “We impressed upon Facebook after our last agreement with them expired at the end of 2018 that we needed them to address some issues before we could renew, and they were completely non-responsive,” he says. “It did not make sense for us as an organization to continue expending resources to benefit a platform that seemed unconcerned about the welfare of their partners or about making sincere efforts to improve.”
  • A precious thing: Angie Drobnic Holan, editor of PolitiFact at Poynter, says that she tries not to get disillusioned about whether fact-checking is having an impact or not. “My goal is to keep alive factual analysis and evidence-based methods, so they’re not lost from the world,” she explains. “We all need help to see the world as it is, not as we’d like it to be. I fact-check because I think the truth in and of itself is a precious thing that needs to be valued and defended.”
  • Infectious info: Researchers at Stanford University are tracking the spread of viral disinformation using tools designed to track infectious diseases like Ebola. One of the researchers says he isn’t concerned about large fake news events, but about a “death by a thousand cuts” that erodes democracy “slowly and over time, so that we don’t recognize the gravity of what’s going on until perhaps it’s too late.”

 

Other notable stories:

  • Two students were killed and three others were wounded in a shooting at Saugus High School in Santa Clarita, California, on Thursday morning. The attack began just after 7:30 am, when students were supposed to be in their first classes. The sheriff’s department arrived at the school to find six students with gunshot wounds. Police initially thought the shooter had fled the scene, but later determined that one of the injured students was the perpetrator, who had shot himself in the head. According to several reports, it was his 16th birthday.
  • Media analyst Ken Doctor writes that the merged GateHouse Media and Gannett will need to find as much as $400 million in cost savings to justify the deal. “What does that mean? Almost certainly, even more reduction in headcount than had been anticipated,” he writes. “How much? In any room of eight people at a current GateHouse or Gannett operation, one is likely to see her job gone in 2020.”
  • Jimmy Wales, co-founder of Wikipedia, has quietly launched WT:Social, which he hopes could someday become a rival to Facebook and Twitter. His network allows users to share links to articles and discuss them in a Facebook-style news feed, and will rely on donations from a small subset of users to fund the operation without advertising, which Wales blames for corrupting social media. “The business model of social media companies, of pure advertising, is problematic,” he told the Financial Times. “It turns out the huge winner is low-quality content.”
  • To test advertising rules at Google, Facebook, Instagram, YouTube, Twitter, and Snapchat, The Daily Beast submitted anti-vaccination ads to each. In the end, the ads, filled with fake health claims, were mostly rejected, but Google and Twitter both approved ads that repeated falsehoods and linked to conspiracy theory websites. Google not only approved two ads with blatantly anti-vaccine language—“Don’t get vaccinated” and “Vaccines aren’t safe”—but even sent multiple prompts via email about how to better optimize them.
  • Emily Tamkin, CJR’s public editor for CNN, writes about the network’s use of the chyron to fact-check untruths it airs. “The chyrons are clever. They’re cute. They’re wry. I am amused by the chyrons,” Tamkin writes. “But the chyron undermining [Steve] Bannon’s claim that a civil rights hero would support a profoundly divisive president does not change the fact that CNN is still covering Bannon’s words.”
  • Isaac Bailey wrote an open letter to Troy Closson, the editor of the Northwestern University student paper, which recently apologized for the way it covered a student protest. Bailey, a Black columnist in South Carolina, told Closson that “your instinct to empathize with marginalized groups who have for too long been overlooked or demonized or misconstrued throughout the history of the American media is a strong one. Don’t lose that.”
  • A Reuters news article that called Wednesday’s impeachment hearing “dull” sparked a social media revolt. The Reuters report drew criticism from Jay Rosen, a journalism professor at New York University, who chided Reuters for writing as if to “an audience craving a jolt.” James Fallows, of The Atlantic, said that “public-affairs writing suffers when it’s similar to theater reviews.” And Nikole Hannah-Jones, of the New York Times Magazine, argued that the report “is emblematic of a deep need for a reset in political reporting.”
  • When Apple News+ launched in March, it signed up 200,000 subscribers in its first 48 hours. Since then, according to a report from CNBC, it’s been stuck in neutral. Apple News+ includes magazines such as People and Vanity Fair, newspapers including the Los Angeles Times and the Wall Street Journal, and online publications like Vox, New York Magazine, and theSkimm. Bloomberg News reported Thursday that Apple is considering bundling Apple News+ with Apple Music and Apple TV+ as soon as next year.
  • In 2014, when Spain enacted a new copyright law, Google News pulled out. The News Media Alliance says that a study of Spanish media sites since then shows that news publishers “were minimally affected and that the reduction in traffic following the closing of Google News was, if anything, low and temporary.” France has now passed a similar law, and Google has said it will restrict the amount of information it provides for links to publishers there.
  • A single US website that specializes in anti-vaccination misinformation accounts for almost a third of all the anti-vaxx propaganda found on social media in Brazil, where thirteen percent of people don’t vaccinate themselves or their children, according to a new study from the Brazilian Society of Immunizations and Avaaz, a non-profit human rights activist network. The site had its account removed by Facebook, and was blocked by both YouTube and Twitter, but the falsehoods it published have continued to be distributed in Brazil and other countries.

ICYMI: The New York Times’ obsession with Trump, quantified

Mathew Ingram is CJR’s chief digital writer. Previously, he was a senior writer with Fortune magazine. He has written about the intersection between media and technology since the earliest days of the commercial internet. His writing has been published in the Washington Post and the Financial Times as well as by Reuters and Bloomberg.