In some respects, Facebook isn’t so different from the publishers that rely on it. A report from Gizmodo this week suggests that a team of “news curators” subjectively selected stories for the objective-sounding “trending” section of the Facebook homepage. One former contractor alleged that the team suppressed conservative news—and conservative outlets—that the social network’s almighty algorithm surfaced. The obvious analogy is to left-leaning journalists picking stories for a newspaper’s front page. Facebook, however, reaches audiences no newspaper could imagine.
The allegations have stoked fears that social platforms may be pressing a thumb on the ideological scale. The Senate Commerce Committee even requested a fuller explanation of “trending” selection in a letter sent to Facebook co-founder Mark Zuckerberg on Tuesday. In a statement to CNN’s Brian Stelter later that day, Facebook said it looks forward to answering questions about the process.
It’s unclear how much this small plot of homepage real estate affects stories’ reach; users’ newsfeeds have prime location by comparison, and the “trending” tab is hard to locate on mobile devices. Yet the report cuts against Facebook’s altruistic-sounding mission “to give people the power to share and make the world more open and connected.” What’s more, it points to the broader, editorial-like decisions social networks make in sorting news and digital content. Facebook may describe itself as “a platform,” but it acts a lot like a publisher.
As prominently argued by Emily Bell, director of Columbia’s Tow Center for Digital Journalism, Facebook is increasingly shaping the contours of the public square, and citizens and news organizations have little choice but to go along for the ride. The power shift raises the all-important question of how information travels in free societies—and what we know about it.
“This is an unregulated field. There is no transparency into the internal working of these systems,” Bell said in a University of Cambridge speech earlier this year. “We are handing the controls of important parts of our public and private lives to a very small number of people, who are unelected and unaccountable.”
News organizations once had a more central role in setting the terms of public debate, balancing money-making aspects of publishing with more civically minded accountability journalism. They also generally followed widely accepted journalistic standards. Social networks have assumed much of the same power, Bell and others have argued, though they typically use more opaque processes and have a greater focus on those profitable slices of publishing. That’s not to say this new construct is necessarily worse, but it is foreign. And Facebook has little incentive to open up about its methodology.
Indeed, the corporation occupies a historically unique position. On the one hand, it’s a tech giant whose in-house tools and massive user base give it outsized power over media outlets. On the other, it’s a media company in its own right, relying on a steady stream of content for survival. And just as journalists have always been defensive about accusations of bias, Facebook, too, has denied its own subjectivity.
The social network’s response to Gizmodo’s story on its “trending” section was telling in this regard. On Tuesday morning, VP of Search Tom Stocky wrote a post denying the allegations of political bias, adding that Facebook’s news curators merely shepherd topics already identified by its algorithm. The supposed impartiality of this algorithm acts as a smokescreen for the other charges. Wrote Stocky:
We have in place strict guidelines for our trending topic reviewers as they audit topics surfaced algorithmically: reviewers are required to accept topics that reflect real world events, and are instructed to disregard junk or duplicate topics, hoaxes, or subjects with insufficient sources….
We do not insert stories artificially into trending topics, and do not instruct our reviewers to do so. Our guidelines do permit reviewers to take steps to make topics more coherent….
We will also keep looking into any questions about Trending Topics to ensure that people are matched with the stories that are predicted to be the most interesting to them, and to be sure that our methods are as neutral and effective as possible.
To be fair to Facebook, Gizmodo’s report is based on interviews with anonymous, disgruntled former contractors. (The same outlet reported last week that Facebook wouldn’t even invite these contractors to company happy hours. The horror!)
But a few key words and phrases from Stocky’s statement stand out, as Zeynep Tufekci, a University of North Carolina professor who often writes about technology and society, highlighted on Twitter on Tuesday. What makes a “hoax” or a source “insufficient”? What, exactly, is “neutral and effective”?
“I am not saying why isn’t Facebook neutral, surfacing non-hoax news only from sufficient sources. I’m saying that will always be contested.” — Zeynep Tufekci (@zeynep), May 10, 2016
“Facebook feed algorithm sold as ‘what you want’ but it structures the experience. FB trending is sold as algorithm but is FB’s preferences. 🙄” — Zeynep Tufekci (@zeynep), May 9, 2016
Of course, such questions speak to a larger issue: While Facebook has become the public’s primary conduit for digital content, its business imperative is to maximize engagement, not objectivity. The algorithms designed to do that are human-made and therefore carry human biases. But we can only guess at how they’re constructed.
The obvious danger of the situation is that free societies have little knowledge of the systems funneling information into their newsfeeds. The sad irony is that the news organizations with the wherewithal to find out are the very same outfits that increasingly depend on Facebook for their survival.