The Media Today

What should we do about the algorithmic amplification of disinformation?

March 11, 2021
 

The results of the 2020 presidential election. The alleged dangers of the COVID vaccine. Disinformation continues to have a significant effect on almost every aspect of our lives, and some of the biggest sources of disinformation are the social platforms that we spend a large part of our lives using: Facebook, Twitter, and YouTube. On these platforms, conspiracy theories and hoaxes are distributed at close to the speed of light, thanks to the recommendation algorithms that all of these services use. But the algorithms themselves, and the inputs they use to choose what we see in our feeds, are opaque, known only to senior engineers within those companies, or to malicious actors who specialize in “computational propaganda” by weaponizing them. Apart from hoping that the companies will figure out a solution on their own, even when it goes against their financial interests, as it almost certainly will, is there anything we can do?

We invited some veteran disinformation researchers and other experts to discuss this topic and related issues on CJR’s Galley discussion platform this week, including: Joan Donovan, who runs the Technology and Social Change research project at Harvard’s Shorenstein Center; Sam Woolley, an assistant professor in both the School of Journalism and the School of Information at the University of Texas at Austin; Anne Washington, an assistant professor of data policy at New York University and an expert in data governance issues; Khadijah Abdurahman, an independent researcher specializing in content moderation and surveillance in Ethiopia; Irene Pasquetto, who studies information ethics and digital curation as an assistant professor at the University of Michigan’s School of Information; and Lilly Irani, an associate professor in the department of communication at the University of California, San Diego.

Donovan, whose specialty is media manipulation, disinformation, and adversarial movements that target journalists, says she believes the US needs legislation similar to the Glass-Steagall Act, which put limits on what banks and investment companies could do. This kind of law would “define what these businesses can do and lay out some consumer protections, coupled with oversight of human and civil rights violations by tech companies,” Donovan says. “The pandemic and the election revealed just how broken our information ecosystem is when it comes to getting the right information in front of the right people at the right time.” The way that Facebook and other platforms operate, she says, means that “those with money and power were able to exert direct influence over content moderation decisions.”

ICYMI: What the pandemic means for paywalls

The broader problem is not just the amplification of conspiracy theories about vaccines or other beliefs, says Woolley. “In Myanmar and India, we’ve seen disinformation on social media grow into offline hate, violence, and murder,” he says. The difficulty of proving that statement X led to the murder of Y shouldn’t stop us from holding the platforms to account, Woolley argues. “We know billions of people use these products, that they are key sources of information. If they prioritize, amplify and massively spread content that is associated with offline violence and other, societal, harms then they must be held accountable.” Washington notes that “users have no control, as in zero, about who they are compared to and what features are selected for that comparison. Instead of blaming people who take the wrong turn, I would instead suggest that we consider why we are building roads like that in the first place.”

Facebook, Twitter, and YouTube are often seen through a primarily Western lens, because they are based in the US and events like the 2020 presidential election draw a lot of attention, says Abdurahman, but the algorithmic amplification of disinformation is a global problem. And while Facebook in particular is happy to brag about how many users it has in other countries, it doesn’t always follow through with the services that would help moderate content in those countries. For example, Abdurahman says, “in Ethiopia there are 83+ languages. As of June there was only automated translation available for Amharic (the national language) and neither human nor automated translation for any other languages.” In other words, she says, Facebook, Twitter, and other social media companies “literally could not see/hear/understand the content circulating on their platform.” Facebook in particular, she says, has reached the point where it should be treated, and regulated, as a public utility.


Here’s more on algorithmic amplification:

  • Public interest: Pasquetto says we got where we are because “we have been prioritizing platforms’ interest (i.e., profiting out of the attention-economy business model) rather than the public interest (i.e., curating and giving access to good quality, reliable information over the Internet).” Also, she says, we have a cultural problem that involves “blind faith in the positive power of innovation,” best embodied by Facebook chief executive Mark Zuckerberg’s famous motto “Move fast and break things.” The other reason we are where we are now, she says, is that certain people figured out how algorithmic amplification works “and made use of it to amplify their own perspectives.”
  • Bias to action: Irani says her study of innovation showed her the mechanics behind this drive to move quickly. The source of this ethos, she says, “is a management theory response to a couple of changes that made long-term planning less effective,” including the rapid rate of technological change and the globalization of trade. “All of this led to this idea that McKinsey consultants called ‘the bias to action,’” she says. “This crucially assumes the organization doesn’t face the consequences of their mistakes or bear responsibility for the social costs of all that pivoting.” In other words, companies get to move fast and break things without being accountable for what they broke.
  • New rules: India is implementing new restrictions on social-media companies like Facebook, Twitter, and WhatsApp because of what that country says is a growing problem of hoaxes and hate speech. The new guidelines say that in order to counter the rise of problematic content, the companies must establish “grievance redressal mechanisms” to resolve user complaints about content, and share with the government the names and contact details of “grievance officers.” These officers must acknowledge complaints within a day and resolve them within fifteen days. Russia, meanwhile, has said it is “slowing down” Twitter because the company has failed to remove content the Russian government deems illegal.

 

Other notable stories:

  • A jury acquitted Iowa newspaper reporter Andrea Sahouri on Wednesday of two misdemeanor charges related to a racial justice protest last May. A reporter for the Des Moines Register, Sahouri was arrested and pepper-sprayed while covering the protest, along with her boyfriend at the time, Spenser Robnett, who said he attended the rally with her to help keep her safe. The pair faced fines and up to 30 days in jail on two charges: failure to disperse and interference with official acts. Both were acquitted on all charges after a three-day trial and less than two hours of deliberation, but similar charges remain pending against a number of other journalists.
  • The New York Times posted a response on Twitter to Tucker Carlson, who spent a significant part of his show attacking Times reporter Taylor Lorenz after she commented on the harassment she has received on the social network over her reporting on tech companies like Clubhouse. “In a familiar move, Tucker Carlson opened his show last night by attacking a journalist,” the statement read. “It was a calculated and cruel tactic, which he regularly deploys to unleash a wave of harassment and vitriol at his intended target. Taylor Lorenz is a talented New York Times journalist doing timely and essential reporting. Journalists should be able to do their jobs without facing harassment.”
  • The International Center for Journalists has published an in-depth data analysis of attacks on Maria Ressa, the founder and editor of Rappler, a news outlet based in the Philippines that has been a persistent target of online hatred. The group conducted a forensic analysis of the torrent of social media attacks on Ressa over a five-year period, from 2016 to 2021. “We detail the intensity and ferocity of this abuse, and demonstrate how it is designed not only to vilify a journalism icon, but to discredit journalism itself, and shatter public trust in facts,” the preamble to the analysis says. “These attacks also created an enabling environment for Ressa’s persecution and prosecution in the Philippines. Now, her life is at risk and she faces the prospect of decades in jail.”
  • Jennifer Lyons, the highest-ranking editor at NewsNation, a cable news outlet that markets itself as delivering “straight-ahead, unbiased news reporting,” has resigned, the New York Times reports. Lyons is the third top editor to quit in recent months, as some staff have complained of a rightward shift at the network. Her departure is effective immediately, the company’s staff were told at a meeting on Tuesday. Sandy Pudar, the news director for the network, left on February 2, and Richard Maginn, the managing editor, resigned on March 1.
  • Washington Post publisher Fred Ryan emailed staff about the paper’s plan to “initiate the gradual return” to offices starting July 6. “This will inevitably be a complicated process,” Ryan noted in the email. He later confirmed the date in a town hall meeting with staffers, as reported on Twitter by Post media columnist Margaret Sullivan. If the newsroom does start regrouping July 6, it will be one year to the day since the newspaper sent its employees home due to the COVID pandemic.
  • Mary Retta writes for CJR about what the pandemic means for news paywalls. “What I’m noticing now is anger on the part of readers when they hit a paywall on a COVID story,” said Joy Mayer, director of the Trusting News project. “I’m seeing comments along the lines of: ‘I thought you were going to make these stories free, I guess you’re just greedy.’” Journalists are not usually present in the comments on such stories to correct the record, Mayer notes, or to explain why they need community support and how subscriptions factor into their ability to stay in business.
  • The Times reports on a group of about a dozen websites that have been spreading misleading information, all of which appear to have ties to the Epoch Media Group, according to data provided by the research group Advance Democracy and analyzed by the Global Disinformation Index, a nonprofit that studies disinformation. Epoch Media has become a top purveyor of conspiracy theories and political misinformation, and is affiliated with the Chinese spiritual movement Falun Gong, which has been outlawed by the Chinese government.
  • Congress plans to reintroduce legislation Wednesday that would allow news organizations to band together to negotiate with the technology companies over payment for content and the data the companies have about readers. The legislation, which is being proposed in the Senate and House with bipartisan support, could make the US the next front in a news industry battle against Facebook and Google that flared up again in Australia last month, when the country passed a law to force the companies to pay for news content. Facebook briefly shut off the ability for Australian news companies and users to post news to the social network, but later reversed course.

 ICYMI: As protests grow in Myanmar, so do crackdowns on the press

Mathew Ingram is CJR’s chief digital writer. Previously, he was a senior writer with Fortune magazine. He has written about the intersection between media and technology since the earliest days of the commercial internet. His writing has been published in the Washington Post and the Financial Times as well as by Reuters and Bloomberg.