Political misinformation, and a matter of scale

In October of last year, during the runup to the presidential election, the New York Post dropped what looked like a bombshell story. It alleged that a laptop belonging to Joe Biden’s son, Hunter, had been found in a repair shop, and that emails taken from it implicated the Bidens in a political-influence scheme in Ukraine. The story buckled under close scrutiny, however: the owner of the repair shop contradicted himself multiple times, and also referenced conspiracy theories in an interview; the emails made their way to the Post via some questionable sources—former Trump advisor Steve Bannon and Trump lawyer Rudy Giuliani; and the Post story was co-written by a former producer on Sean Hannity’s Fox News program, in her first published article for the newspaper.

On Sunday, Ben Smith, the media writer for the New York Times, noted that this story was used in a session about misinformation that Harvard’s Shorenstein Center held recently for senior media executives. Although Twitter and Facebook blocked or restricted the spread of the Biden laptop story out of concern about its origins and accuracy, Smith argued the Post report was just “an old-fashioned, politically motivated dirty tricks campaign,” and that describing it as “misinformation” didn’t add much to our understanding of it. Misinformation is “a technocratic solution to a problem that’s as much about politics as technology,” Smith said; a reporter’s job isn’t to “put neat labels on the news. It’s to report out what’s actually happening, as messy and unsatisfying as that can be.”

In questioning the utility of labeling things “misinformation,” Smith is in sync with some other critics, including BuzzFeed writer Joe Bernstein, who wrote a recent piece for Harper’s magazine (which Smith linked to in his column) about the movement he calls “Big Disinfo.” Its adherents, Bernstein argued, want users of social-media platforms to believe they are gullible rubes, easily manipulated by sophisticated social targeting and advertising algorithms. The terms misinformation and disinformation, he wrote, “are used casually and interchangeably to refer to an enormous range of content, ranging from well-worn scams to viral news aggregation.” Bernstein argued that these terms are often just jargon used to refer to “things I disagree with.” (I spoke with Bernstein about his piece and some of the conclusions he reached in a recent discussion on CJR’s Galley platform.)

There’s no question that people throw around the terms misinformation and disinformation without much attention paid to how, or whether, they fit a given scenario. And Smith is right that stories like the one about Hunter Biden’s alleged laptop fit quite well into the “old-fashioned, politically motivated dirty tricks” category, which has been with us as long as politics itself. (Benjamin Franklin invented stories about his political opponents and printed them in a fake newspaper.) But as widely shared as those stories might have been, they weren’t instantaneously transmitted to millions of people via a recommendation algorithm based on metrics like “engagement.” Long before the internet, disinformation was artisanal, hand-crafted by people like Franklin. But its reach has grown with technology. Now it is mass-produced by Russian troll farms like the Internet Research Agency and distributed at scale by platforms like Facebook.

The term “political dirty tricks” also doesn’t begin to describe something like the impact that disinformation about the Rohingya people in Myanmar—amplified by Facebook’s algorithms—had on the genocide there, in which tens of thousands were killed or displaced, or the impact it has had in countries like Brazil, where president Jair Bolsonaro and his supporters were accused of mobilizing armies of disinfo spreaders. (Last month, a judge acquitted Bolsonaro of any role in spreading disinformation, but said the court would not allow “digital militias to try again to destabilize the elections, the democratic institutions.”) Disinformation in India, spread via Facebook-owned WhatsApp, has reportedly led to dozens of deaths. Smith seems to be arguing that all we really need to fight misinformation is good, old-fashioned reporting, and he may be right. But there aren’t enough reporters in the world to check all the misinformation that flies through social media every day. Even Facebook can’t keep up—and it has 15,000 moderators whose only job is to sift through that kind of content.

There are deep-seated social and political reasons why people invent and share disinformation; technological solutions will never be enough to solve that problem. But it seems naive to suggest that disinformation doesn’t have a technological aspect to it, given what we know about how it functions, or that technology isn’t going to have to play a role in potential solutions. Tracing the connections between, and the behavior of, people who understand and use these technologies for nefarious purposes is a complicated process—which is why it’s useful to have help from experts like Joan Donovan, who runs the Technology and Social Change project at Harvard’s Shorenstein Center, and led the misinformation program that Smith mentions in his column. Donovan, who spoke with Smith, later shared his column and added her own perspective. “From my vantage point,” Donovan wrote on Twitter, “we as a society are in dire shape if we don’t take up the persistence of misinformation-at-scale as a whole-of-society problem and reckon with it.”

Here’s more on misinformation:

  • Incentives: I spoke with Donovan as part of a discussion series on CJR’s Galley platform in 2019, and she talked about why she does the work she does. “I wouldn’t do this research if I did not believe deeply in the right to free and open communication, which includes the right to communicate free from hate speech, harassment, and violence,” Donovan said. Commitments from the social platforms to help stop these problems, she added, don’t really address “the fundamental incentives behind how hate groups are financed and resourced online by having access to payment processing and broadcast technologies.”
  • Squatting: In a piece published in CJR earlier this year, Donovan and Brandi Collins-Dexter, a colleague at the Shorenstein Center, wrote about some of the tactics that right-wing groups used to spread disinformation about the 1619 Project and critical race theory. “Our research reveals that the popularity of ‘1776’ owes in part to keyword squatting—a tactic by which right-wing media have dominated the keywords ‘1619’ and ‘critical race theory’ and enabled a racialized disinformation campaign, waged by Trump and his acolytes, against Black civil rights gains,” Donovan and Collins-Dexter wrote.
  • Definition: In July, I wrote about some of the challenges Facebook faces when it comes to defining what qualifies as disinformation, including a flip-flop on whether posting rumors about COVID-19 escaping from a lab would fit that definition. “Not that long ago, this was defined by almost everyone—including Facebook—as disinformation, and sharing theories to that effect would get your account blocked. In recent months, however, experts have started to entertain those theories more than they did in the past.” As a result, I wrote, discussing such a possibility “is no longer seen as a direct ticket to Facebook or Twitter oblivion.”


Other notable stories:

  • Scott Hechinger writes for The Nation about what he calls “a massive fail on crime reporting” by the New York Times and NPR, related to stories both outlets published about data showing a significant rise in homicides in 2020. “I write this not to attack the Times or NPR or the reporters of these stories, nor to take away or distract from the very real and disturbing tragedy of every single one of these murders,” Hechinger wrote. His intention, he said, was to “call attention to an insidious and historically rooted contributor to the system of policing and prison in our country: a pro-police worldview deeply ingrained in journalism.”
  • Julie K. Brown, the Miami Herald investigative reporter who helped break the story about Jeffrey Epstein, the former billionaire, and his abuse of under-age girls, suggested on Twitter that the trial of Ghislaine Maxwell—Epstein’s alleged associate, who the state says procured under-age girls for him—shows why cameras should be permitted in federal courtrooms. “Perhaps… it’s time to let cameras into federal courtrooms,” Brown wrote. “There is no better example of why this is important than in the Epstein/Maxwell case, where important information has never seen the light of day.”
  • Jack Dorsey, Twitter’s co-founder and CEO, announced Monday that he is stepping down and will be replaced by Parag Agrawal, who was most recently the company’s chief technology officer. Dorsey’s exit “marks a significant shift at the company,” the New York Times reported, saying the company “has navigated years of pressure from investors who thought it did not make enough money and criticism from Washington, particularly from Republican lawmakers who have complained Twitter has helped stifle conservative voices in social media.”
  • Chris Cuomo, the CNN host, “used his sources in the media world to seek information on women who accused his brother Andrew Cuomo, then the governor of New York, of sexual harassment,” according to a report by CNBC based on documents released Monday by the New York Attorney General’s office. “While Chris Cuomo has previously acknowledged advising his brother and his team on the response to the scandals,” CNBC reports, “the records show that his role in helping the then-governor was much larger and more intimate than previously known.” According to the documents, Chris Cuomo dictated statements for his brother, then the governor, to use.
  • Behind the external success at Politico “lies a series of burgeoning newsroom conflicts,” the Daily Beast reports. “From personnel issues, including complaints about internal ‘woke police,’ to a divisive unionization drive, to increasing competition in the profitable D.C. newsletter space, tensions appear to be growing within Politico,” the report states. The Daily Beast says it spoke with 22 current and former staffers for its story, and that most of the conflicts have to do with Playbook, the company’s high-profile newsletter. Politico was recently acquired by Axel Springer, a German media giant, for more than $1 billion.
  • Greg Sargent, a Washington Post columnist, argues that journalists have to change the way they cover politics if Donald Trump decides to run again in 2024, and says he agrees with a Twitter thread posted by Jay Rosen, a journalism professor at New York University, that calls for a different way of covering a fundamentally anti-democratic candidate. “When bad actors manufacture an issue, it isn’t necessarily news,” Sargent writes. “Sometimes news organizations amplify political attacks by treating them as inherently newsworthy.” CJR editor and publisher Kyle Pope also wrote about how the media needs to change the way it reports on Trump.
  • Ariana Pekary, CJR’s public editor for CNN, recently wrote about how the network’s “exaggerated tone and graphic content increasingly pushes it into the realm of tabloid-like material” as it tries to boost its viewership numbers, including using paparazzi video and photos for lurid stories like the accidental shooting of a film crew member by actor Alec Baldwin. “That is not ethical journalism,” Pekary writes. “TV news producers didn’t stop to think about the lives that are at stake in their race to get their video on the air. What’s more, by airing that material, reputable outlets like CNN encourage and enable similar tactics in the future.” 
  • An episode of “The Simpsons” that ridicules Chinese government censorship appears to have been censored on Disney’s newly launched Disney+ streaming service in Hong Kong, according to a report from the New York Times. The episode was critical of former Chinese leader Mao Zedong, as well as the government’s efforts to suppress any memory or evidence of the 1989 Tiananmen Square massacre. 

Mathew Ingram is CJR’s chief digital writer. Previously, he was a senior writer with Fortune magazine. He has written about the intersection between media and technology since the earliest days of the commercial internet. His writing has been published in the Washington Post and the Financial Times as well as by Reuters and Bloomberg.

TOP IMAGE: Laptop with a blank screen on a wood table, with a coffee-shop background.