Facebook founder Mark Zuckerberg posted a picture of himself in quarter-profile, with his toddler daughter Max, as they watched Donald Trump become president of the United States. The smiley emoji on his post says he is “hopeful.” “Holding Max, I thought about all the work ahead of us to create the world we want for our children,” he wrote.
The good news for Zuckerberg is that, unlike most people, he can make the world a better place almost immediately just by taking more responsibility for Facebook’s publishing policies. By acknowledging that Facebook can and should play a more active part in editing—yes, editing—its own platform, and hiring actual people to do so, Zuckerberg will further the civic commons as well as address a growing problem of how people perceive Facebook.
Barack Obama called out the fake news problem directly at a rally in Michigan on the eve of the election: “And people, if they just repeat attacks enough, and outright lies over and over again, as long as it’s on Facebook and people can see it, as long as it’s on social media, people start believing it….And it creates this dust cloud of nonsense.”
Related: Facebook is eating the world
Yesterday, Zuckerberg disputed this, saying that “the idea that fake news on Facebook… influenced the election…is a pretty crazy idea” and defending the “diversity” of information Facebook users see. Adam Mosseri, the company’s VP of Product Management, said Facebook must work on “improving our ability to detect misinformation.” This line is part of Zuckerberg’s familiar but increasingly unconvincing narrative that Facebook is not a media company, but a tech company. Given the shock of Trump’s victory and the universal finger-pointing at Facebook as a key player in the election, it is clear that Zuckerberg is rapidly losing that argument.
In fact, Facebook, now the most influential and powerful publisher in the world, is becoming the “I didn’t do it” boy of global media. Clinton supporters and Trump detractors are searching for reasons why a candidate who lied so frequently and so flagrantly could have made it to the highest office in the land. News organizations, particularly cable news, are shouldering part of the blame for failing to report these lies for what they were. But a largely hidden sphere of propagandistic pages that target and populate the outer reaches of political Facebook are arguably even more responsible.
Closely reported pieces appeared in the later stages of the campaign: one by John Herrman in The New York Times Magazine, and a series for BuzzFeed by Craig Silverman. Herrman drew attention to the dark economy of false pro-Trump Facebook pages, describing how they live on Facebook and nowhere else:
These are news sources that essentially do not exist outside of Facebook, and you’ve probably never heard of them. They have names like Occupy Democrats; The Angry Patriot; US Chronicle; Addicting Info; RightAlerts; Being Liberal; Opposing Views; Fed-Up Americans; American News; and hundreds more. Some of these pages have millions of followers; many have hundreds of thousands.
As Herrman correctly noted, these pages represented 2016’s “most disruptive, least understood force in media.” Behind the pages, he found a new ecosystem of young entrepreneurs who used cheap labor from places like the Philippines to create and pump out large numbers of often false stories across the Facebook ecosystem.
Craig Silverman, an editor at BuzzFeed and leading exponent of verification techniques in journalism, took up the same theme and conducted a thorough content analysis of hyper-partisan websites and Facebook pages. He produced evidence showing that the more extreme political pages, both right and left, produce a far higher proportion of news that is either entirely made up or contains a mix of truth and falsehood than the comparatively “mainstream” media. Fake news can make up almost 40 percent of the pages’ daily output on Facebook.
Silverman describes how Macedonian teenagers churn out hundreds of pro-Trump Facebook pages not out of ideological commitment, but for money. Ad sales are automated and driven by demographic data, so publishers are rewarded for generating traffic, not for quality.
The rising tide of untruths circulating and having a direct effect on voters during the recent US election was highly reminiscent of the Brexit campaign in the UK; as in the US presidential race, the polling for Brexit was largely wrong. Writing in The Financial Times yesterday, John Lloyd draws a clear parallel between the rise of the social Web and the migration away from truth by those who publish there. He links this shift in attention to the lower print readership: “The decline of newspapers in physical form and their passing on to the internet puts them on all fours with the vast flows of information, fantasy, leaks, conspiracy theories, expressions of benevolence and hatred. There they have to live or die.”
Jim Rutenberg pushed a similar line in a widely shared New York Times column, commenting that the value of a journalist such as David Fahrenthold at The Washington Post has been thrown into sharp relief by the tide of “fake news.” “It could be Pollyannaish to think so, but maybe this year’s explosion in fake news will serve to raise the value of real news,” Rutenberg wrote. “If so, it will be great journalism that saves journalism.”
The line of argument that says we need better journalism to combat fake news is appealing. However, it conflates two different crises. Having a larger number of good journalists is an indisputable goal for any functioning democracy. Wiping out the malicious falsehoods that carpet swaths of the social Web should be a high priority too. But the former will not be an adequate antidote to the latter.
On any given day, far more journalism is produced by non-partisan media outlets than by the most popular partisan sites. That journalism might be relentlessly mundane, insufficiently serious, and a poor reflection of complex policy arguments, but it is rarely fabricated or hoaxed—fewer than 1 percent of pieces published in mainstream outlets fell into Silverman’s “totally false” category. The problem is that even where there is accurate journalism, it is not seen, or not believed, or both.
Like Gresham’s Law in economics, which states that “bad money drives out good,” in a competitive information and entertainment economy, the quality of journalism (or even the veracity of information) does not guarantee financial success. Fake news and real news are not different types of news; they are completely different categories of activity. But in Facebook’s News Feed, they look the same.
For those outside the belief systems of Trump truthers or their leftist equivalent, the very design of Facebook creates an echo chamber of similarity. Facebook isolates readers from difference and disagreement for solid commercial reasons. A troll war of competing belief systems, played out in abuse and slurs, is an expensive nightmare to moderate and, as Twitter has discovered, poison to advertisers.
One unintended consequence of commercially effective social platform design is that it undermines the architecture of knowledge and healthy discourse. Within Facebook, you can set your personal preferences in a way that allows you to opt out of targeted ads. In political campaigns, a failure to understand the ecosystem has untold consequences. Clinton’s campaign was arguably insufficiently clued in to the power of the network of disinformation against her.
Discomfort over the role Facebook has played, both in creating echo chambers and disseminating false news, preceded the election. There were already intense internal discussions at Facebook itself about the “vast gray area” of content masquerading as legitimate journalism that was at best completely misleading and at worst totally fabricated and actively damaging. There had been high profile external pressure, particularly after Facebook removed the iconic Napalm girl photo, for the platform to admit to and refine its own editorial practices.
What should Facebook do? Until the company and Zuckerberg specifically acknowledge that this ecosystem is a problem, nothing will happen. The large numbers of policy people Facebook has working on issues such as extremist recruitment, hate speech, and terrorism are effectively already editing the platform. But the system for moderating the site’s content is largely obscure, the echo chambers concealed, and the fake news out of control.
Facebook is doing a great deal behind the scenes, evaluating different approaches, listening to arguments for more editorial accountability, and thinking about how to improve the very expensive business of moderation. But while these processes remain both partial and secretive, Obama’s “dust cloud of nonsense” will stick. Facebook has helped create an environment where perception can matter more than truth. Now it must deal with the consequences of the public’s perception of Facebook.