On Monday, The New Yorker published a profile of Elon Musk by Ronan Farrow, illuminating the way the world’s richest man has constructed a “shadow rule” by investing in critical industries. “In the past twenty years,” Farrow wrote, “against a backdrop of crumbling infrastructure and declining trust in institutions, Musk has sought out business opportunities in crucial areas where, after decades of privatization, the state has receded.” Musk’s fleet of satellites, operated through his company SpaceX, has given him outsize influence over Ukraine’s defense against Russia’s invasion. His manufacture of electric vehicles, through Tesla, makes him central to policymakers’ plans for the transition to a net-zero carbon future. And his efforts to muscle into payment infrastructure—in the past with PayPal and more recently with his intention to make Twitter an “everything app”—represent another frontier of his empire.
To this list of critical infrastructure investments—although it is not an area of previous state oversight—we might add Twitter itself, which Musk acquired in October last year. Since launching in 2006, the social media platform has become a digital town square of sorts. It has helped spark (and crush) revolutions, and has developed into a digital watering hole for many voters and most journalists. By owning the platform, “what he can do is shape perceptions of his companies and shape perceptions of him,” Imran Ahmed, chief executive of the Center for Countering Digital Hate (CCDH), told me. “And also go on the attack against legislators who might pipe up” to propose regulating his actions.
The CCDH, a nonprofit that researches hate and disinformation online and campaigns for reform of tech platform regulation, has recently attracted Musk’s anger. On June 1, it released a report cataloguing a jump in hate speech on Twitter since Musk took over last year. Ahmed has said that, by reinstating many accounts that had been banned for harassment, Musk “put up the Bat-Signal to racists, to misogynists, to homophobes, to antisemites.” On August 1, X Corp filed a lawsuit against the CCDH, accusing it of embarking on a “scare campaign” to frighten advertisers, which it says cost the company “tens of millions of dollars” in ad revenue, and alleging that the group unlawfully scraped data. The CCDH, which does not take money from platforms or governments, intends to fight the suit. This afternoon, I spoke with Ahmed by video call. Our conversation has been edited for length and clarity.
JB: I wanted to start by talking about the research that seems to have angered Twitter so much. Your report found that Twitter failed to enforce its own rules on ninety-nine of one hundred hate tweets from Twitter Blue subscribers. Can you tell us more about the findings and what they say about the platform under its new ownership?
IA: That report is part of a series stretching back some years now. Platforms often say that the problem for them is that there’s so much content, they couldn’t possibly moderate it all, so they have to use algorithms. When their automated systems fail, they say the backup is people reporting content to them, and then they’ll take action. We wanted to test: to what extent do they actually take action? So we reported one hundred really hardcore, malignant tweets, which were clearly outside their rules, and then we went back and audited what action was taken. Ninety-nine out of one hundred tweets saw no action taken. And even with the one that did, you don’t know whether that person closed their account down or something else happened; we just assume the platform took action.
It’s essentially the equivalent of quality control in a factory: if you’re making five thousand cream cakes a day, and you take one hundred out and you test them—my dream job—and ninety-nine are toxic, is your reaction, “Don’t worry, it’s just those ninety-nine,” or is your reaction, “Look, we need to shut down the factory and work out what on earth is going wrong”? Musk’s reaction is neither of those—it’s to sue the person taking the samples.
When you found out that Musk intended to sue, and then when he called you a “rat” on Twitter, what was your immediate reaction?
My reaction was to retweet it, and then to ask people to donate to us if they thought it was outrageous that he was behaving in that way. And lots of people did. It was to try and take his energy and turn it into something positive for ourselves. I don’t mind being called a rat. I’ve been called a lot worse.
You’ve spoken about how you’ve held up a mirror to the platform, and said that Musk is trying to sue that mirror. What are your thoughts on what this lawsuit is trying to achieve?
He’s trying to bleed us. Every journalist will know that people try to bleed small organizations through strategic litigation—part of the decline of our local news sector is down to relatively small organizations lacking the scale to absorb a slew of legal threats from one disgruntled person. Now, that can destroy organizations, but it can also change the way other organizations work. They think, “Crumbs, I don’t want to experience that.” So there’s a general chilling effect on the rest of the sector, making people think, “You know what, maybe taking on Musk is too costly. We should focus on [Facebook founder Mark] Zuckerberg and TikTok and YouTube.” If we let that happen, others will learn that lesson too. The first report we ever wrote was “Don’t Feed the Trolls,” and the central insight there was: trolls don’t think you’re evil. They just want you to shut up—they want to stop you from speaking, to make you think, “I don’t want to go through this.” But I will go through it—otherwise we are essentially surrendering our ability to do that kind of work.
As we approach the next presidential election cycle, there seems to have been a fragmentation of the social media ecosystem. We’ve got people spread across a larger number of smaller platforms. What are your thoughts on what that will mean for misinformation during the campaign?
We’re well into these battles now. 2024 is going to be worse than 2020, which was worse than 2016. Things are going to keep getting worse until we start trying to make them better. The problem is that we are letting the information ecosystem decay, degrade, and become ever more fragmented. The sophistication of the algorithms improves every four years; they become better at keeping us addicted. So I expect 2024 to be the most disinformed democratic election in history—until six months later, when we’ll have the next one.
You’ve painted a picture of a degrading landscape for facts from election to election. And now we’re throwing AI into that picture. What impact do you think AI is going to have?
I’m most concerned about its undermining of our ability to know what’s true and what’s not. One of the terms we use is “epistemic anxiety.” We thought about that a lot in the pandemic, because epistemic anxiety is correlated with conspiracism. Think of epistemic anxiety as a yearning for certainty, or even just to know where you can get that certainty from. Conspiracy theories give you these big, macro explanations that seem to offer a single, turn-key solution for understanding everything. But in reality, they never do, because every conspiracy theory is based on a leap of faith, not on facts. Epistemic anxiety can only increase when we literally can’t tell whether things are real or not. The blurring of confected reality and reality drives even more epistemic anxiety, which in turn drives conspiracism.
One of my staff said to me, “Well, this is exactly what the Soviets used to do.” Once you make people completely unsure as to what’s true and what’s not, they just give up—and they’ll hold on to whatever is stable, unbending, and unmoving. And that’s the dictator. It’s a recipe for autocracy. And it can happen very fast. My grandfather grew up in Afghanistan. I remember him writing to me before he died about how, in the 1970s, the country had women in mini-skirts. A country like that can, a few years later, be considered one of the world’s basket cases.
After Frances Haugen became a Facebook whistleblower, you said that social media platforms know they have a detrimental impact on society, but their perverse incentives—a desire to retain eyeballs on feeds to keep serving people ads—stop them from reforming. What needs to happen to provoke change?
Ironically, what we need to do is introduce to social media some of the same costs, values, and social mores that are vital to the operation of a democracy—essentially, socializing social media. One thing we try to do is change the cost calculus. For companies to actually enforce their rules, they have to do two costly things. One, they have to take an action, which is a cost. Two, they reduce the amount of monetizable content. So it’s a double hit to their bottom line. Whereas if they do nothing, Section 230 of the Communications Decency Act means that they bear no cost. It’s not just that they’re not subject to defamation law, or the normal rules that apply to any publisher or journalist. They’re also not subject to negligence law, which all companies in America are subject to, and they’re not subject to product liability law, which all producers of products are subject to. They’re essentially completely free. So, actually, executives’ fiduciary duty to shareholders is to do absolutely nothing whatsoever, because there’s no cost to doing nothing. There are reputational costs, sure, and CCDH has been trying to make sure that those reputational costs carry economic costs too. In doing so, you change the calculus for those companies.
CCDH supports federal legislation to enforce platform transparency—on things like feed algorithms, enforcement statistics, business model information. Can you say more about that?
Transparency is not an end in itself. The mistake that many people make is thinking that transparency is sufficient. It’s not; it needs accountability to go with it. It needs the sharing of responsibility for the costs. Once a platform has made itself transparent, there needs to be someone who can pose questions, whether that’s an Ofcom [the UK communications regulator] or another body. Lindsey Graham and Elizabeth Warren have a bill for an FCC-type body in the US that would be similar to what Ofcom is doing in the UK. Under the [EU’s] Digital Services Act there will be new bodies to do that as well.
But then you need responsibility too, so that you’re sharing the costs of the harms being produced right now. The costs of the harms produced by the incompetent leadership of social media companies are [currently] borne by society, not by the companies themselves. You need to have that cost-sharing. That, we believe, will eventually lead to decisions at these companies to implement safety by design. Our framework is called the STAR framework: safety by design, transparency, accountability, and responsibility. They don’t actually go in that order; in practice it’s TARS: transparency is required for accountability; when people are held accountable, you have to make them bear economic responsibility; and eventually, we think, that will lead to safety by design.
Finally, your research from 2021 found that up to sixty-five percent of anti-vaccine content on Facebook and Twitter could be traced back to just twelve people—the “disinformation dozen.” One of those twelve, Robert F. Kennedy Jr., is now running for president. In your view, how should news organizations approach covering him?
Stop calling him a Kennedy and start calling him an anti-vaxxer. Reporters focus on the wrong things with him. We’ve learned to make it really clear to people who these figures are and what they say. So any report about Robert F. Kennedy Jr. that doesn’t mention the fact that he targeted black parents—telling them not to vaccinate their children because, he claimed, African blood is different from white blood—is doing a massive disservice to the reader. The opining about the cut of his jaw and how he wears his tie—just stop it. Start telling us who these people are, and then hold them accountable for the things they say. The funny thing is, when he went in front of the congressional committee, which he thought would be a slam dunk, they just repeated back to him what he’d said and asked, “Can you explain yourself?” Have you heard much about RFK Jr. since the House committee? Not really. Because he was exposed. So expose these people; hoist them with their own petard.