Views, impressions, and what we see on Facebook

Yesterday, The Markup, a nonprofit investigative platform focused on the ethics of technology and its effects on society, published a piece detailing the popularity of sensationalist, partisan media on Facebook. Drawing on data from its Citizen Browser program, which pays a nationwide panel of Facebook users to automatically share their news feeds, The Markup produced a list of the Top 20 domains featured in its users’ news feeds from July through September. Meta, Facebook’s parent company, had recently released a similar Top 20, based on the platform’s “most-viewed” domains during the same period—an effort, Corin Faife wrote for The Markup, “to rebut critics who said that its algorithms were boosting extremist and sensational content.”

The Markup’s work—which drew on impressions, rather than Meta’s preferred “most-viewed” metric—would seem to confirm those critics. “We found that outlets like The Daily Wire, BuzzFeed’s viral content arm, Fox News, and Yahoo News jumped in the popularity rankings when we used the impressions metric,” Faife wrote. “Most striking, The Western Journal—which, similarly to The Daily Wire, does little original reporting and instead repackages stories to fit with right-wing narratives—improved its ranking by almost 200 places.” To underscore the disparity between Meta’s preferred metric and its own, The Markup noted that Facebook’s algorithms served one Citizen Browser panelist 1,065 Newsmax articles—which, under Facebook’s metric, would have counted as a single “view.” The Markup’s conclusion? “Facebook isn’t telling you how popular right-wing content is on the platform.”

A spokesperson for Meta told The Markup, “The focus of the Widely Viewed Content Report is to show the content that is seen by the most people on Facebook, not the content that is posted most frequently. That said, we will continue to refine and improve these reports as we engage with academics, civil society groups, and researchers to identify the parts of these reports they find most valuable, which metrics need more context, and how we can best support greater understanding of content distribution on Facebook moving forward.” The Markup published its own methodology here.

On Twitter, Faife noted that The Markup’s reporting took its lead from New York Times tech columnist Kevin Roose, whose use of Meta’s internal CrowdTangle engagement data to publish a daily Top Ten list of the platform’s most-engaged posts has brought greater scrutiny to Facebook’s influence, highlighting its tendency—by some metrics—to promote right-wing content. In July, Roose wrote for the Times that the company responded favorably to his use of CrowdTangle at first, but that things changed when he began to post the Top Ten list daily on Twitter. Roose described a drawn-out conflict among Facebook executives over CrowdTangle’s future: some, such as CrowdTangle co-founder Brandon Silverman and vice president Brian Boland, wanted the platform to commit to transparency, while other executives argued in favor of curation, fearing that journalists like Roose were creating problems for the company. Roose reviewed internal communications and spoke to more than a dozen current and former Facebook employees. “Transparency, they said, ultimately took a back seat to image management,” he wrote.

Obfuscation has been a longstanding problem with Facebook, and both internal and external views of the platform’s data are limited. As Will Oremus wrote last year, amid Facebook’s initial disgruntlement with reporting based on its CrowdTangle metrics, Roose’s posts are necessarily limited, and Facebook’s own reporting “is only a snapshot, one that may present an overly sanitized view of how news and political communication spreads on its platform. It doesn’t tell us how many people are actually clicking on those links to cnn.com et al., as opposed to simply scrolling past them. It doesn’t capture all the political content that people share in the form of memes, text posts, or videos.” Understanding Facebook, including user engagement on the platform, is “incredibly complicated and likely impossible,” tech journalist Charlie Warzel wrote in his Substack newsletter in August. “This isn’t an attempt to let Facebook off the hook. It’s to say that Facebook is far, far too big. What you glean about the platform is heavily dependent on the slice of data you’re looking at.”

The complexity, opacity, and sheer size of Facebook mean that reporters at places like The Markup (among many others pushing for transparency) carry a substantial burden in trying to make sense of the platform and its parent company for the rest of us. A host of talented journalists do an impressive job on that beat; still, the stakes remain high. As Tow Center director Emily Bell wrote in 2016, “Social media hasn’t just swallowed journalism, it has swallowed everything. It has swallowed political campaigns, banking systems, personal histories, the leisure industry, retail, even government and security.” Five years later, the same is true—and while we’re arguably closer to wrapping our heads around the problems, they’re still immense, as last month’s release of the Facebook papers illustrates.

For its part, The Markup soldiers on. Along with Faife’s story, the site launched its own Twitter bot—inspired by Roose’s—to report the engagement findings from its Citizen Browser project on a daily basis. “At the end of the day, we can’t take Facebook’s word for what’s happening on Facebook—or in the ‘metaverse,’” the account states in a pinned thread. “We need constant external checks of the company’s claims. And that’s exactly what we’ll be doing.”

Lauren Harris is a freelance journalist. She writes CJR's weekly newsletter for the Journalism Crisis Project. Follow her on Twitter @LHarrisWrites

TOP IMAGE: Photo by Peerapon Boonyakiat / SOPA Images / Sipa USA (Sipa via AP Images)