
Q&A: Mike Ananny on Facebook, facts, and the problem with ‘a marketplace of ideas’

April 5, 2018
 

Even Facebook agrees that the platform desperately needs to adopt and apply basic editorial standards. But can it? For four months, USC Annenberg’s Mike Ananny interviewed members of one of Facebook’s most scrutinized projects: a fact-checking team made up of employees from six different newsrooms, all assigned the daunting task of identifying and debunking false or misleading stories circulated by users.

The job, Ananny writes in a report for the Tow Center, frustrated the veteran fact-checkers, in part because they couldn’t confirm the information Facebook was supplying them about the popularity of the stories being shared. They also weren’t sure the list of popular stories was complete. But they were sure the work needed to be done.

The need is urgent, especially for a company desperately trying to avoid regulation that will inevitably damage its bottom line. This week, Facebook admitted that it had uncovered yet more pages, users, and Instagram accounts associated with the Russian troll farm that disseminated mis- and disinformation, prompting condemnation from legislators including Mark Warner, the ranking Democrat on the Senate Intelligence Committee, who said he expected CEO Mark Zuckerberg “to work with Congress on updating our laws to better protect our democracy in the future.” Zuckerberg is set to testify before another committee next week.

In conversation with CJR, Ananny described his findings, his recommendations for the future, and what exactly he believes makes Silicon Valley so reluctant to change. This conversation has been edited for length and clarity.

 

You present the team-up between Facebook and these news outlets as very lopsided but still incredibly urgent despite its problems. To what extent is it working, and what’s going wrong?


My impression of this partnership, after talking with many of the participants, is that this is a bunch of people who are coming from very different organizational backgrounds, very different professional traditions, and really different assumptions about what the media system should be. But they’re all coming together around this central problem of wanting high-quality information that is somehow in the public interest to be circulating. They’ve also all got limitations on their ability to do that.

Journalists and the fact-checkers are also being swamped with this surfeit or overload of information that they care about. I heard that a lot from the fact-checkers in newsrooms: They care about making sure that there’s good-quality information circulating. And they’re willing to do what they need to do to increase the health of our media ecosystem. I didn’t hear anybody who was eager to work with Facebook, [or who said that] some lifelong ambition of theirs was to collaborate with Facebook. What I heard was people saying, “Look, this is a pragmatic thing that we’re doing. It’s not perfect. We are frustrated by lots of aspects of this partnership but we’re sticking with it because Facebook is such a central place for information and such a central place for news, increasingly.”


 

What about at Facebook itself?

I don’t want to impugn them and I can’t read tea leaves into their intentions, but I think Facebook has very, very quickly found itself in a situation that is beyond its organizational expertise and is beyond its culture. They say things in public that sound good, but honestly, I think Facebook is not very sophisticated on these issues of what a publicly healthy media system looks like. They don’t know, and they’re kind of making it up as they go along. And they’re looking for other people to shoulder the burden of a little bit of that responsibility.

It seems like they haven’t had much reason to care about whatever problems they’re causing to the news ecosystem until very recently, and I’m wondering why you think they care now.

Well, I think they’ve been publicly shamed and embarrassed. The 2016 elections pulled back the screen a little bit on how powerful social media platforms are, and Facebook took the brunt of that. But it’s important not to forget that Google, Apple, Twitter, Amazon, and Reddit are all pretty powerful in these spaces, and they’re not receiving as much scrutiny as Facebook has, although I think they should be, and they could be. In the aftermath of the 2016 election, Facebook got called to task for pursuing values that are just not aligned with publicly accountable, publicly responsible media systems. They’re interested in gathering as much data as possible, in profiling people in as much detail as they possibly can, and they’re interested in creating a walled garden that for many people becomes the entire internet—they’re on record as saying a lot of things like that. It’s not that Facebook has had wonderful intentions all along and this has happened all of a sudden.

Facebook has been a bad-faith actor for a while in the sense that it was pursuing its own ends. The thing I did get a sense of, through the report, is that there are people working for Facebook who are… I don’t know if embarrassed is the right word. They feel guilty, or they feel responsible on a personal level, and they don’t like working for a company that would be doing bad things.


 

So much of this is so far afield from the Silicon Valley mantra of improving the world through technology.

But kind of predictably so. I mean, in a lot of ways it’s not rocket science to have seen this coming.

 

The instinctive reaction to the news that all our data is being sold off in ways that violate our privacy tends to be to hunker down and wait for the apocalypse. At the end of the report, you offer a lot of really good, usable questions journalists can ask, which suggests not all is lost.

I’m not sure I could get up in the morning if I didn’t have some sense of hope around this stuff. The recommendations are kind of meant to do that. What I was trying to do was give technology companies, news organizations, regulators, and customers of these things some questions to think about and ask as they watch this increasing blurring between platforms and publishers. In an ideal world, honestly, if we could take the editorial and public ethics represented by news organizations, and really good public regulation and public accountability, and marry that with a lot of the engineering power and product-design power, that would be the holy grail: to do these things with real ethics and principles, and to debate what those ethics are.

 

These platforms have been built athwart a lot of ethical concerns; is it even possible to go back and kind of retrofit them so that they work toward the betterment of a collective that includes people who aren’t stockholders?

I’m not sure I would say that they were built without ethics. They were built under different ideologies, if that’s not too eggheady to say: an ideology based on individuals communicating amongst themselves without any kind of regulation, on the idea that community is the thing that’s going to save and help people, and that somehow community is going to emerge from a marketplace of ideas where people are unrestricted in their speech — an idea that there’s a flattened place of speech out there where anybody can have a voice. That’s one kind of ideology.

But the reality is that there are structural inequalities in this world, and not everybody has the same power to speak and be heard. So if you build a platform premised on an assumption of individualism and libertarianism and equality, you’re going to be like, ‘Wait a second!’ when it doesn’t behave how you want it to behave. You’re almost having to rediscover structural forces and structural inequalities. And that’s where it’s frustrating for folks who have experienced or known about these things for a long time, and they say, “Well, duh, of course these inequalities have been around for a while and maybe your ideology should be less about ignoring their existence and more about building a recognition of inequality into your systems.”


 

They’re also not typically from racial or class groups that are oppressed.

If the world works fine for you, then you don’t see problems, and that’s a problem in itself. There’s a point at which you’ve got to say, “This speech doesn’t belong on our system.” We’re not going to pretend it doesn’t exist, but we’re not going to give it a home and surface it and have a hands-off, neutral perspective where we say, “Oh well, if that’s what the community wants, then that’s what they’ll have!” That’s what Facebook is struggling with, because it’s not equipped to have these discussions in a sophisticated way, because I don’t think the people there who are in leadership positions fully, fully appreciate the depth of the cultural phenomenon they’re dealing with.

 

Do you think there was anything especially American about the way disinformation spread on Facebook in the 2016 elections?

The American approach to free speech equates free speech with free markets. That’s been built into this country for a very long time—the idea that anybody can speak, buyer beware, the best message will win, and if you don’t like something, then just speak up and respond to it and somehow this marketplace of everyone speaking will sort of magically add up to a good public life. Other countries and other cultures do not have that perspective, and that’s one of the Achilles’ heels of this country. If you have enough money, or a loud enough voice, or a sophisticated enough technological infrastructure, then you can hijack that approach to speech powerfully and quickly, because there aren’t checks and balances or subsidies or structural supports for marginalized voices and historically disempowered people, to sort of say, “Wait a second, a marketplace is not actually going to produce the public life we need.”
