In December of 2016, after receiving a firestorm of criticism about online disinformation during the presidential election, Facebook announced its Third Party Fact-Checking project. Independent organizations would debunk false news stories, and Facebook would make the findings obvious to users, down-ranking the relevant post in its News Feed. Now the project includes 50 partner organizations around the world, operating in 42 languages, yet it’s still very much an open question how effective the program is at stopping the spread of misinformation.
Recently, Full Fact, a non-profit partner in Facebook’s project, published an in-depth report on the first six months of its involvement in the program. Overall, the group says, third-party fact-checking is worthwhile, but the report makes a number of criticisms of the way the project works. For example, Full Fact says, the way Facebook rates misinformation needs to change, because the terminology and categories it applies aren’t granular enough to be useful. Facebook has also been too slow to flag fact-checked posts and to respond to checkers’ findings.
Another concern mentioned in the report is more fundamental: that Facebook simply doesn’t provide enough transparency or clarity on the impact of the fact-checking that its independent checkers do. How many users did the fact-checks reach? How many people clicked on the related links from a false story? Did the project slow or even halt the spread of that misinformation? Facebook doesn’t divulge enough data to even begin to answer those questions. Its only response to the Full Fact report, which contained eleven recommendations for how to do better, was to tell the group that it is “encouraged that many of the recommendations in the report are being actively pursued by our teams as part of continued dialogue with our partners, and we know there’s always room to improve.” There was no response to the criticism about a lack of data.
Full Fact’s critiques are not new. Earlier this year, a number of Facebook’s fact-checking partners told the BBC that they were concerned about having no way to see whether their work was having an effect, and suggested that Facebook didn’t care about the efficacy of the project. “Are we changing minds?” a fact-checker based in Latin America wondered. “Is it having an impact? Is our work being read? I don’t think it is hard to keep track of this. But it’s not a priority for Facebook.” Last year, a number of partners seemed deeply cynical about Facebook’s intentions. “They’re not taking anything seriously,” Brooke Binkowski, former managing editor of fact-checking site Snopes.com, who now works for a similar site called Truth or Fiction, told The Guardian. “They are more interested in making themselves look good and passing the buck.”
It’s a common theme with Facebook: a crucial project is given the minimum enthusiasm necessary for good PR. But users—including entire democracies—are owed an explanation. If the world’s most powerful social network wants to show that it takes fact-checking seriously, it should open up its vast database and share more information about how the project is working.
Here’s more on Facebook and fact-checking:
- Checking in with the checkers: Last year, Mike Ananny wrote for CJR about a report he helped write for Columbia University’s Tow Center for Digital Journalism, which looked at the Facebook fact-check project and criticisms participants had, including why some posts and news stories were chosen for down-ranking but others were not.
- What about Instagram? Among the recommendations in the report from Full Fact is that Facebook extend its fact-checking program to Instagram, the photo-sharing network it owns, which is much more popular with younger users than Facebook itself. “The potential to prevent harm is high [on Instagram] and there are known risks of health misinformation on the platform,” the group wrote.
- A booming business: Fact-checking groups in Uruguay, Bolivia, Argentina, and Brazil have joined forces to create a regional coalition to fight misinformation being spread both on Facebook and through WhatsApp, the encrypted messaging network Facebook owns. The groups are also working with organizations like First Draft, a fact-checking and training network based in the UK that is affiliated with City University in New York.
- Fact-checking Boris: The British TV network Channel 4 has done some fact-checking of government statements in the past. Now, in the wake of Boris Johnson’s ascent to the office of Prime Minister of the UK, Channel 4 says it is committed to fact-checking every public statement Johnson makes during his tenure, and has asked viewers to help.
Other notable stories:
- The New York Times profiled a dying local newspaper from rural Minnesota, The Warroad Pioneer, which is shutting down after 121 years of publishing. According to the Times story, the paper and its three remaining employees ended their run “with Bloody Marys, bold type and gloom about the void it would leave behind.” The Times also published a feature called A Future Without the Front Page, in which it asks: “What happens when the presses stop rolling? Who will tell the stories of touchdowns scored, heroes honored and neighbors lost?”
- Conveying the gravity of the climate change crisis is often a challenge for journalists, but sometimes a single smartphone video does the job. Yesterday, a video clip shared on Twitter by a former fellow for the Council on Foreign Relations showed a swollen river created by a melting glacier in Greenland. According to Danish officials, more than 12 billion tons of glacier ice melted in a 24-hour period.
- Less than a year after agents working for the Saudi Arabian royal family reportedly killed and dismembered Jamal Khashoggi, a columnist for The Washington Post, the country says it will hold a media forum aimed at repairing its reputation. That will be difficult; according to the Post, the number of journalists in prison in Saudi Arabia has tripled in the two years since Mohammed bin Salman took power.
- Facebook said it removed 259 Facebook accounts, 102 pages, five groups, and 17 Instagram accounts that were engaged in “coordinated inauthentic behavior.” The accounts originated in the United Arab Emirates and Egypt, according to Facebook, and their behavior was focused on a number of countries in the Middle East and Africa, including Libya, Sudan, and Qatar.
- Brenna Wynn Greer writes for CJR about the sale of the photo archives from Jet and Ebony, two pioneering magazines aimed at African-American readers. The archive, which included historic photos of Emmett Till, was acquired by four philanthropic foundations—the J. Paul Getty Trust, the Ford Foundation, the MacArthur Foundation, and the Andrew W. Mellon Foundation—for just under $30 million.
- YouTube trolls advertised a new shelter in Los Angeles called the Ice Poseidon Homeless Shelter that turned out to be a private mansion belonging to a YouTube personality whose nickname is Ice Poseidon. The YouTube star, whose real name is Paul Denino, told the LA Times that he has been the victim of trolling before, including “swatting” attacks, in which trolls call 911 in an attempt to get SWAT teams to descend on a user’s home or workplace.
- The Wall Street Journal reported that social media bots pushed divisive content and misinformation related to race during the Democratic debates, focusing specifically on Senator Kamala Harris. But Josh Russell, a disinformation researcher, said on Twitter that virtually every hashtag or search term will show signs of bot-like activity. “What matters are networks of bots,” said Russell, who found signs of only two relatively small spam networks.
- A joint team from the UK’s Centre for Democracy and Development and the University of Birmingham spent months researching the impact of WhatsApp on May’s Nigerian elections. Their resulting report finds that WhatsApp was used to mislead voters in sophisticated ways, but that in some areas it actually strengthened democracy by helping citizens mount anti-corruption campaigns.