The Media Today

Facebook’s disinformation problem is harder than it looks

July 22, 2021
 

That Facebook can distribute dangerous amounts of misinformation around the world in the blink of an eye is not a new problem. But the issue drew renewed attention when President Joe Biden told reporters during a White House scrum that Facebook was “killing people” by spreading disinformation, hoaxes, and conspiracy theories about COVID-19, and in particular about the efficacy of various vaccines. As Jon Allsop reported in the CJR newsletter on Wednesday, Biden backtracked somewhat on his original statement after pushback from the company and others: Facebook said that the country needed to “move past the finger pointing” when it comes to COVID disinformation, and that it takes action against such content when it sees it. Biden responded that his point was simply that Facebook has enabled a small group of about a dozen accounts to spread disinformation that might be causing people to avoid getting vaccinated, and that this could result in an increase in deaths.

Biden appears to have gotten his information about this “disinformation dozen” from a group called the Center for Countering Digital Hate, which recently published research that, it said, showed the bulk of the disinformation around COVID-19 and vaccines comes from a handful of accounts. The implication of the president’s comment is that all Facebook has to do is get rid of a few bad apples, and the COVID disinformation problem will be solved. As Farhad Manjoo of the New York Times put it, however, Biden “reduced the complex scourge of runaway vaccine hesitancy into a cartoonishly simple matter of product design: If only Facebook would hit its Quit Killing People button, America would be healed again.” Biden’s comments may make for a great TV news hit, but solving disinformation at Facebook’s scale is much harder than he makes it sound, in part because it involves far more than just a dozen bad accounts. And even the definition of what qualifies as disinformation when it comes to COVID has changed over time.

As Jon Allsop described yesterday, part of the problem is that media outlets like Fox News seem to feel no compunction about spreading “fake news” about the virus in return for the attention of their viewers. That’s not a problem Facebook can fix, nor will ridding the social network of every COVID or vaccine hoax make much of a dent in Fox’s influence, which information researcher Yochai Benkler of Harvard’s Berkman Klein Center for Internet & Society has argued mattered far more during the 2016 election than any social network. But even that’s just the tip of the disinformation iceberg. One of the most prominent sources of COVID and vaccine disinformation is a sitting US member of Congress: Marjorie Taylor Greene of Georgia. Another, Robert F. Kennedy, Jr., is a member of one of the most famous political families in US history, and his anti-vaccination conspiracy theories put him near the top of the Center for Countering Digital Hate’s “disinformation dozen” list. What is Facebook supposed to do about their repeated misstatements?

Twitter blocked Taylor Greene’s account for 12 hours because she was spreading anti-vax hysteria, and Facebook could easily do likewise. But then what? The social platforms could play Whack-a-Mole with such statements forever, or they could take definitive action and ban Taylor Greene and/or Kennedy for spreading disinformation. But as Manjoo pointed out in his Times column, doing so would hand right-wing critics, already energized by Donald Trump’s ongoing social-media ban, even more ammunition for their complaints about platform censorship. And it’s not just people like Taylor Greene and Kennedy, or obvious trolls like Alex Jones of Infowars, or even professional “bot” accounts that trade in disinformation for profit and influence. Another part of the problem is that things that once seemed like obvious COVID disinformation no longer do.

Take the idea that the virus might have originated in a research lab in Wuhan, China. Not that long ago, this was defined by almost everyone — including Facebook — as disinformation, and sharing theories to that effect would get your account blocked. In recent months, however, experts have started to entertain those theories more than they did in the past, in part because of a track record of poor record-keeping and slip-ups at a number of similar laboratories. The idea that COVID might have escaped into the wild accidentally may not have much more evidence behind it than it did six months or a year ago, but discussing the possibility is no longer seen as a direct ticket to Facebook or Twitter oblivion. It would be hard enough to pinpoint all the pieces of disinformation around something like COVID even if there were agreement about all aspects of it, and there isn’t.

Joe Biden and his advisors, and other critics of Facebook, might think that getting rid of disinformation is an easy task, and that the company is simply dragging its feet because it doesn’t want to disrupt its business. There is probably more than a little truth to that. But it’s also true that finding the right line between disinformation control, public-health awareness, and outright censorship is not easy. Blocking accounts en masse for normal speech about an ongoing problem is not going to solve anything.


Here’s more on Facebook and COVID:

  • Unknown: At the start of the pandemic, data scientists at Facebook met with senior executives to ask for resources to measure the prevalence of misinformation about COVID-19 on the social network, the New York Times reports. “The executives in question never approved the resources, but the team was never told why,” according to the people the Times spoke with, who requested anonymity because they were not authorized to speak to reporters. The report says that the company “doesn’t actually know many specifics about how misinformation about the coronavirus and the vaccines to combat it has spread.”
  • Bots: Much of the COVID-19 misinformation on Facebook is driven by groups of automated accounts known as “bots,” not by actual human users, according to a new study led by John Ayers, who specializes in public-health surveillance at the University of California, San Diego. “If we want to correct the ‘infodemic,’ eliminating bots on social media is the necessary first step,” Ayers said. “Unlike controversial strategies to censor actual people, silencing automated propaganda is something everyone can and should support.”
  • Double: New research released Tuesday suggests that Facebook is still a place where a lot of COVID misinformation circulates, despite the company’s claims that it has cracked down on such content, according to a Forbes report. Media Matters for America, a liberal tech watchdog organization, says it has found 284 active private and public Facebook Groups currently distributing vaccine misinformation, more than double the number researchers found in April. More than half a million users belong to these groups, the report says.

 

Other notable stories:

  • Four more former senior employees of the now-closed Apple Daily were charged under Hong Kong’s national security law on Wednesday night. Police officers detained former associate publisher Chan Pui-man and ex-editorial writers Fung Wai-kong and Yeung Ching-kee, as well as former executive editor-in-chief Lam Man-chung. Eight people who held senior roles at the newspaper have been arrested since last month, six of whom have now been charged with offenses involving collusion to subvert national security.
  • California journalist Matthew Keys was sentenced Monday to another six months in prison after a Sacramento judge found that he deleted the online videos and YouTube account belonging to Comstock’s magazine, his former employer. Chief US District Judge Kimberly Mueller also ordered Keys to serve eighteen months of supervised release after he gets out and to undergo outpatient mental health treatment to provide “further tools for self-reflection and self-control.” Keys served two years in prison following his 2015 conviction in a case involving a hack of the Los Angeles Times website.
  • Maria Bustillos, co-founder of Popula and the writers collective Brick House, and CJR’s public editor for MSNBC, reviews a controversial documentary about Anthony Bourdain, in which the director used artificial intelligence to create an audio facsimile of the TV host and author’s voice. What seems to have upset people the most about these audio deepfakes, Bustillos says, “is that fans, myself included, want so much to keep believing in the illusion, shared by millions, of Bourdain as a personal friend.”
  • Maria Taylor is leaving ESPN and will join NBC to cover the Olympics, according to a report from the New York Post. Taylor leaves the sports network just weeks after the New York Times reported on leaked comments from a year-old private recording, in which Rachel Nichols, an NBA host and reporter at ESPN, alleged that the network gave Taylor the Finals hosting job to make up for what Nichols described as its “crappy longtime record on diversity.”
  • Bryan Goldberg’s BDG Media, owner of the female-focused website Bustle and a revamped version of Gawker, is buying Some Spider Studios, a digital-media company behind the parenting websites Scary Mommy, Fatherly, and The Dad, Goldberg told the Wall Street Journal. BDG Media aims to go public by merging with a special-purpose acquisition company, or SPAC, later this year, Goldberg said. The all-stock deal for Some Spider values the publisher at about $150 million, a person familiar with the matter told the Journal.
  • The Dallas Morning News has named Katrice Hardy to be its next top editor, the paper reported on Wednesday. Hardy, who is forty-seven, is currently executive editor at the Indianapolis Star, which won this year’s Pulitzer Prize for national reporting, and she is also the Midwest regional editor for the USA Today Network. Hardy is the first woman and the first Black journalist to have the top job at the Dallas newspaper.
  • The Committee to Protect Journalists joined a coalition of US news and press freedom organizations asking the US government to provide humanitarian assistance and emergency visas to Afghans who have worked with US media outlets. In 2020, at least five journalists were killed in Afghanistan in relation to their work, the most of any conflict in the world, according to CPJ research. The organization said it has been working closely with partners to provide emergency support to at-risk local and international journalists in Afghanistan, and advocating for political action with government leaders.

Mathew Ingram is CJR’s chief digital writer. Previously, he was a senior writer with Fortune magazine. He has written about the intersection between media and technology since the earliest days of the commercial internet. His writing has been published in the Washington Post and the Financial Times as well as by Reuters and Bloomberg.