The Knight First Amendment Institute on Tuesday called on Facebook to add a special amendment to its terms of service that would create a “safe harbor” for journalists and researchers, allowing them to do things other users are forbidden from doing, including creating fake accounts and using automated tools to harvest user data. It may seem like a reasonable request, but it’s likely to be highly contentious, if only because those are the very practices that blew up in the company’s face with the Cambridge Analytica fiasco and the Internet Research Agency, the infamous Russian “troll farm.”
And there’s an inherent problem at the heart of its proposal: Who gets to decide who deserves protection? Having Facebook choose which researchers qualify might not raise too many red flags, but giving a private corporation the power to say who is or isn’t a journalist would be hugely contentious, as the backlash to Facebook’s recent attempts to rank “trusted” news outlets showed. And what’s to stop bad actors from posing as journalists or researchers in order to get around the rules?
The Institute’s proposal and letter to Facebook CEO Mark Zuckerberg suggest that the research or journalism in question would have to be designed to “inform the general public about matters of public concern,” including issues like echo chambers, misinformation, and discrimination. Under the proposal, researchers and journalists would have to take steps to protect user privacy, would be barred from misleading users about the purpose of their work, and couldn’t sell or transfer any data they acquired.
Jameel Jaffer, the Institute’s executive director, says in an email that the group isn’t asking Facebook to decide who is and who isn’t a journalist. “We’re asking it to decide, with respect to any given investigative project, whether the purpose of the project is to inform the general public about matters of public concern, and whether the project appropriately protects the privacy of Facebook’s users and the integrity of Facebook’s platform,” he writes. While there are risks in asking the platform to make those judgments, Jaffer says, that would be better than having journalists and researchers blocked from doing their work at all.
Digital journalism and research “are crucial to the public’s understanding of Facebook’s platform and its influence on our society,” the Institute says in its proposal. But Facebook’s terms of service “limit this kind of journalism and research because they ban tools that are often necessary to it.” The statement goes on to point out that journalists and researchers who use these tools risk not only having their accounts suspended or disabled but also civil and criminal liability under the Computer Fraud and Abuse Act.
Kashmir Hill, who works on investigative projects for Gizmodo, said in a piece published Tuesday that Facebook tried to shut down a tool the site built to research the social network’s “People You May Know” feature. The tool remains active, but Facebook made it clear that the kind of automated data collection Gizmodo was attempting is a breach of its terms. The Knight Institute said Hill is one of the journalists it is representing in its attempt to get a safe-harbor exemption, along with Kate Conger of The New York Times and award-winning journalist Cameron Hickey.
So far, Facebook’s response to the Institute’s request suggests it wants to appear concerned about the problem without signaling whether it intends to help. Campbell Brown, the social network’s head of news, said in a statement that journalists and researchers “play a critical role in helping people better understand companies and their products—as well as holding us accountable when we get things wrong,” and that Facebook recognizes its rules “sometimes get in the way of this work.” But the company said nothing about what, if anything, it plans to do about that.