
Facebook’s latest changes will probably make misinformation worse

January 22, 2018
Image by downloadsource.fr via Flickr

There are good reasons to believe that Facebook’s recent News Feed changes not only won’t fix the problem of “fake news,” but could actually make it worse. And that looks even more likely after the company announced on Friday that decisions about which news sources to trust would be left up to users to vote on.

When Facebook announced the latest changes to its algorithmic ranking system earlier this month, most of the attention focused on how they might affect the amount of traffic coming from the network—which isn’t surprising, since many media companies depend on that traffic for revenue.


But a more important question is whether these changes will actually help solve any of the major problems Facebook claims it is trying to solve, such as the proliferation of hoaxes, false news and other types of misinformation. And the short answer is probably no.

Here’s why: Facebook co-founder and CEO Mark Zuckerberg and Adam Mosseri—the man in charge of the News Feed—said the new approach is designed to move away from passive consumption and to focus more on personal posts that generate discussion. (On Monday, the company also seemed to acknowledge that social media can sometimes be dangerous for democracy.)

Zuckerberg said the changes would promote content likely to “encourage meaningful interactions between people,” while Mosseri said they would highlight posts that “inspire back-and-forth discussion in the comments” and “that you might want to share and react to.”


The problem is that if the new system is designed primarily to encourage conversation and spark reactions, the sites that could get the biggest boost from these changes are the least credible ones—publishers who specialize either in completely fake stories or in stories that have a grain of truth but are wildly exaggerated.


Why? Because misinformation is almost always more interesting than the truth. After all, fake news stories don’t have to stick to the actual facts, and they don’t have to go out of their way to be balanced or objective, which means they are more likely to inflame people’s passions. Real news, by contrast, is complicated and often boring.


As a former Facebook product manager put it during the 2016 election: “Sadly, News Feed optimizes for engagement [and] as we’ve learned in this election, bullshit is highly engaging.”

In a report from First Draft News that looked at how disinformation works, Claire Wardle and Hossein Derakhshan pointed out that for many users, sharing is performative—in other words, they don’t share fake news posts because they believe they are factually accurate, but because doing so fits the worldview of a specific group they would like to belong to.

The posts most likely to generate the kind of comments and reactions Facebook says it will now optimize for are the worst of the worst.

We already have some evidence that this is the case: For the past several months, Facebook has been running a “split feed” experiment in several countries, in which users get a feed made up primarily of content from their friends and family, while news posts appear in a separate feed under a tab called “Explore.”

Slovakian journalist Filip Struharik, who has been studying the impact of the change, said his research shows mainstream news sites have seen engagement on Facebook (reactions, comments, and shares) decline much more than lower-quality sites. Serious news sites saw engagement fall by almost 40 percent, while engagement for fake news sites fell by less than 30 percent.

An analysis by Craig Silverman of BuzzFeed in 2016, meanwhile, showed that the top fake election news stories generated more total engagement on Facebook than the top election stories from 19 major news outlets combined.


Facebook plans to mitigate this issue by introducing “high quality” news into feeds even if it doesn’t get a lot of engagement. But it isn’t clear how the company will define that term, or how much promotion such posts will get, and Zuckerberg has also suggested that the company will crowdsource the question of which outlets are trustworthy.

This approach, besides putting news judgment in the hands of the same users who were duped into trusting fake news distributed by Russian troll factories, brings its own problems—namely, that trust is wrapped up in political ideology, as a recent Knight-Gallup poll found. People trust the outlets they agree with, which makes trust an ineffective metric for measuring the truth.

“If the only change was privileging engaging content shared by friends and family, I would be terrified about what this means in terms of the continuing challenges of disinformation on the platform,” said Wardle. “It sounds as if they will also be investigating ways of measuring credible sources [but] the question of how you measure credibility is so difficult.”

The bottom line is that Facebook is prioritizing discussion and engagement, and that is likely to reward some of the worst media outlets. And to try to ameliorate that effect, it’s going to decide which publishers are most trusted based on user votes—a solution that is almost as problematic, and as ripe for confusion, as the problem it is trying to solve.


Mathew Ingram is CJR’s chief digital writer. Previously, he was a senior writer with Fortune magazine. He has written about the intersection between media and technology since the earliest days of the commercial internet. His writing has been published in the Washington Post and the Financial Times as well as by Reuters and Bloomberg.