Twitter is excellent at capturing a moment in time—and that’s part of its problem. On the day Special Counsel Robert Mueller announced charges against Paul Manafort in the Trump-Russia investigation, some people on Twitter took perverse glee in sharing the story Fox & Friends previewed at 8:13am Eastern: cheeseburger emojis.
Paul Manafort just agreed to turn himself in to the FBI. What should we report on?
Fox News: Hamburger emoji pic.twitter.com/Wjsvkb2coW
— Roberto Ferdman (@robferdman) October 30, 2017
“As every other cable news show jumped on the story of Paul Manafort surrendering to the FBI Monday, Fox and Friends decided to tackle the real issue of the day: cheeseburger emojis,” the New York Daily News reported.
ICYMI: Merriam-Webster reveals its word of the year
But what does that prove? ABC, NBC, CNBC, and Univision covered the emoji story, too. More importantly, a one-second screen grab from any network tells you close to nothing, when there are 86,400 seconds in a day.
The New York Times shows there’s a better way. Using Third Eye, a chyron database developed by the Internet Archive, the Times found that Fox did cover Manafort’s indictment immediately after 8am, just like CNN and MSNBC. Unlike those competitors, Fox chose to cover other stories in that hour, and to run plenty of commercials. But the cheeseburger story lasted only a few minutes.
The Times analysis, released November 1, marks a vast improvement over armchair pronouncements of “bias,” and even better methods could be on the way.
ICYMI: Late last night, a media controversy came to an explosive conclusion
Academics are making advances in large-scale content analysis, with new machine-driven techniques and more sophisticated yardsticks with which to measure content. Such approaches can reveal much about news outlets’ choices of stories, sources, and language.
No single number will ever tell us how biased a news outlet is. But if academics and journalists could just agree upfront on the methodology, there could soon come a day when we could inform our discussions using an advanced dashboard of metrics, rather than a single anecdote or flawed human recollection. There’s just one problem: Everyone’s too scared to sign on.
‘There’s gotta be some influence’
US politicians have been lobbing charges of “bias” since the earliest days of the republic, so it’s easy to see those charges as empty partisan ploys. Often, they are. “Honestly, almost all the claims of bias that are out there are exceptionally poorly supported,” says Tim Groeling, a communication professor at the University of California, Los Angeles.
But these are claims most Americans believe. Eighty-seven percent of Republicans and 53 percent of Democrats think news organizations tend to favor one side when reporting on political and social issues, Pew reported last May. Among those who say the media fails to distinguish between fact and fiction, “bias” is the most commonly cited culprit, and that feeling is strongest in the US, according to a report by the Reuters Institute at the University of Oxford.
FROM THE ARCHIVES: Journalism Should Own Its Liberalism
Maybe journalists shouldn’t dismiss this opinion out of hand. After all, while 28 percent of journalists identify as Democrats, just 7 percent identify as Republicans. (A full 50 percent identify as independent.) Most reporters will claim they know how to keep their opinions out of their reporting—but with humans notoriously bad at recognizing and managing their own biases, there’s a case that media companies really should care about this issue.
“I think most conservatives would just say that there’s gotta be some influence if reporters are disproportionately Democrats or liberals [versus Republicans],” says Matt Grossmann, Director of the Institute for Public Policy and Social Research at Michigan State University. “And I guess I’m more open to that idea than some others.”
Biased compared to what?
Academics have been trying to quantify media bias for some time, and researchers in the field say techniques are improving. Algorithms and artificial intelligence are allowing computers to read facial expressions and vocal tone. Researchers are analyzing larger and larger datasets for occurrences of specific phrases, which could show how an issue is framed or what sources are cited. For example, economists Matthew Gentzkow and Jesse Shapiro analyzed the 2005 Congressional Record to find phrases frequently used by either liberal or conservative members of Congress, and compared these to the news content of over 400 newspapers over the same time period. Their 2010 analysis found that newspapers’ political slant correlated fairly well with the public’s perception of them: The Washington Times slanted right, The Washington Post slanted left, and so on.
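The core of the Gentzkow–Shapiro approach, matching partisan phrase use in news text, can be sketched in a few lines of Python. The phrase lists and the scoring formula below are illustrative stand-ins only; the economists derived their phrases statistically from the Congressional Record and fit a regression against congressional ideology rather than computing a raw ratio.

```python
import re

# Illustrative phrase lists. The real study derived its phrases
# statistically from the 2005 Congressional Record; these are stand-ins.
REPUBLICAN_PHRASES = ["death tax", "illegal aliens", "war on terror"]
DEMOCRAT_PHRASES = ["estate tax", "undocumented workers", "war in iraq"]

def count_phrases(text, phrases):
    """Count total occurrences of the given phrases in the text."""
    text = text.lower()
    return sum(len(re.findall(re.escape(p), text)) for p in phrases)

def slant_score(text):
    """Return a score in [-1, 1]: negative leans left, positive leans right.

    A toy estimator: the share of partisan-coded phrases that are
    Republican-coded, rescaled. Gentzkow and Shapiro's actual method
    is a statistical fit, not this simple ratio.
    """
    r = count_phrases(text, REPUBLICAN_PHRASES)
    d = count_phrases(text, DEMOCRAT_PHRASES)
    if r + d == 0:
        return 0.0
    return (r - d) / (r + d)

article = "Critics of the death tax say the war on terror demands lower spending."
print(slant_score(article))  # positive: only Republican-coded phrases appear
```

Applied across hundreds of outlets, as in the 2010 study, scores like this one can then be compared against external measures such as readers’ perceptions or circulation-area politics.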
Researchers have also come up with new ways to think about baselines, addressing the pesky question of what to treat as normal or desirable. If I say an article is biased, you’d be right to ask, “Biased compared to what?” Choosing a baseline is fraught with issues, journalistic and philosophical. How can we judge an individual outlet’s choice of stories without knowing the whole universe of stories the outlet had to choose from? Should major-party candidates get equal amounts of coverage—even when one has said or done things considered more “newsworthy”? Depending on the question you’re asking of the data, a widely acceptable baseline may not always be possible.
But researchers have had success using baseline measures that stay largely consistent over time, as the party in power changes. Groeling looked at coverage of presidential approval polls. Another study used economic data points, including unemployment and inflation.
Chartbeat for partisan balance?
Currently, media critics have access to a few analysis tools that make investigations of bias and balance more rigorous. The Internet Archive has launched several, such as Television Explorer. This collaboration with the Global Database of Events, Language and Tone (GDELT) spits out graphs showing how often a keyword is used over time in the transcripts of cable networks and broadcast affiliates. Vox and FiveThirtyEight have used the tool to document how various networks covered the Russia investigation. (Although in a data-light piece just a day before, Vox also blew the cheeseburger emoji out of proportion.)
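At bottom, a tool like Television Explorer counts keyword mentions in closed-caption transcripts over time. A minimal sketch of that counting step, using made-up transcript records (the real tool queries the Internet Archive’s caption corpus via GDELT):

```python
from collections import defaultdict
from datetime import date

# Hypothetical transcript records: (air date, caption text).
transcripts = [
    (date(2017, 10, 30), "breaking news on the manafort indictment"),
    (date(2017, 10, 30), "more on manafort and the russia investigation"),
    (date(2017, 10, 31), "today's top story: the cheeseburger emoji"),
]

def mentions_per_day(records, keyword):
    """Count caption segments mentioning the keyword, grouped by air date."""
    counts = defaultdict(int)
    for day, text in records:
        if keyword.lower() in text.lower():
            counts[day] += 1
    return dict(counts)

print(mentions_per_day(transcripts, "manafort"))  # two segments on Oct. 30
```

Graphing these daily counts per network is what lets outlets like Vox and FiveThirtyEight compare, say, Russia-investigation airtime across CNN, Fox News, and MSNBC.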
There’s also the aforementioned Third Eye, used by The New York Times for its opinion piece on Fox’s morning program Fox & Friends. And the Internet Archive’s experimental Face-o-Matic, developed with facial recognition startup Matroid, tracks how often select politicians pop up on CNN, Fox News, MSNBC, and the BBC.
For the most part, the techniques being developed in academic circles aren’t being applied in a way that could inform journalists or readers. Of course, none of these measures in isolation reveals whether a newspaper or network is “biased.” An outlet could quote more Republicans, but cite more liberal talking points.
But if research techniques continue to improve, Groeling thinks a battery of such measures could give us a decent picture of outlets’ relative biases.
“If people can agree in advance as to what certain standards are for measuring these things and start putting them together, I think they should be more credible to an organization that sees, for instance, we seem to be criticizing or having this set of sources on. [Or] whenever we talk about this person, we frown; whenever we were talking about this person, we smile,” Groeling says.
The armchair academics
Amateur attempts at such tools already exist, and have found plenty of fans. Google “media bias,” and you’ll find Media Bias/Fact Check, run by armchair media analyst Dave Van Zandt. The site’s methodology is simple: Van Zandt and his team rate each outlet from 0 to 10 on the categories of biased wording and headlines, factuality and sourcing, story choices (“does the source report news from both sides”), and political affiliation.
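A rubric like Media Bias/Fact Check’s reduces to combining a handful of 0-to-10 category scores. How the site actually weights its categories isn’t detailed here, so the plain average below is purely an assumption for illustration:

```python
# Category names mirror the site's rubric; the averaging is assumed,
# not documented, and is shown only to illustrate the approach.
CATEGORIES = ["biased_wording", "factuality", "story_choices", "affiliation"]

def overall_rating(scores):
    """Average 0-10 category scores into a single 0-10 rating."""
    missing = [c for c in CATEGORIES if c not in scores]
    if missing:
        raise ValueError(f"missing categories: {missing}")
    return sum(scores[c] for c in CATEGORIES) / len(CATEGORIES)

print(overall_rating({
    "biased_wording": 6,
    "factuality": 2,
    "story_choices": 7,
    "affiliation": 5,
}))  # 5.0
```

The simplicity is the point, and the problem: each input number is still a human judgment call.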
A similar effort is “The Media Bias Chart,” or simply, “The Chart.” Created by Colorado patent attorney Vanessa Otero, the chart has gone through several methodological iterations, but currently is based on her evaluation of outlets’ stories on dimensions of veracity, fairness, and expression.
Both efforts suffer from the very problem they’re trying to address: Their subjective assessments leave room for human biases, or even simple inconsistencies, to creep in. And next to Gentzkow and Shapiro’s 400-newspaper dataset, the five to 20 stories these sites typically judge per outlet are a drop in the ocean of mainstream news production.
Then there are the organizations with declared agendas. The Media Research Center and Media Matters for America scour the news for evidence of left-wing and right-wing bias, respectively. The MRC’s vice president for research and publications, Brent Baker, doesn’t see a need to agree on a suite of universal bias measures, because he’s confident in the organization’s existing quantitative methodology.
“The reality is, we’ve been finding, I think very effectively for 30 years now, [that] the media are tilted to the left,” Baker says.
So the rationale for a bias dashboard remains. Whether anyone wants such a dashboard—or if they just want others to use it—is a different question.
“If somebody invented what they thought was a brilliant tool to measure bias and he convinced a skeptic like me that it worked, I think in this polarized era, there [would] be a lot of public skepticism about who was behind it, who was making these judgments about when news organizations lean one way or another,” says Howard Kurtz, host of Fox News’s Media Buzz.
Groeling thinks the problem is even more basic: Most of the public just doesn’t care about the news. It seems far too big an ask for them to take note of an independent bias measure, let alone recognize that the new tool is better than ad-hoc or politically motivated measures.
“Let’s say every time you go to a site on the top-right corner, you have some plugin that shows you how biased something is, from the left or the right,” says Alvin Chang, a senior graphics reporter at Vox and author of that outlet’s work using Television Explorer. “Do readers care? … Or do they just think, well, yes, I know this is a more progressive news source, but this is what I think is true about the world?” Chang asks.
Another problem with this enterprise: Changing bias perceptions might not change consumption. “Measures of trust or belief in bias and what [media] you actually use, they’re not completely disconnected, but they’re only very loosely connected,” says Jonathan Ladd, a public policy and government professor at Georgetown University and author of the book Why Americans Hate the Media and How It Matters.
Then there’s the argument that a bias dashboard could help news outlets improve the quality of their work by showing them the frequency with which they cover various topics, quote different types of sources, or employ certain language.
If the dashboard results were public, Groeling says, he doesn’t see much incentive for news organizations to participate, even if they could approve the methodology beforehand. The chance of being found biased would pose too great a risk. Of course, if tech players like Facebook or Google used bias-related data to weight results, those incentives would shift rapidly, Groeling says.
An alternative could be to keep the tool completely internal, to advise the newsroom without publicizing findings. Think of it like Chartbeat for partisan balance.
Fox’s Kurtz questions how much appetite news outlets really have to pay for such a tool. But Jennifer Dargan, a John S. Knight journalism fellow at Stanford University, speculates that the economics might work: Publishers could use the tool to prove to advertisers that they have a better-quality product.
There’s at least one bias baseline this tool wouldn’t address: whether news outlets cover the interests of political and economic elites over the actual concerns of their readers.
“One legitimate concern that people who criticize the bias in the news right now have is…the fact that, increasingly, journalists are not living in [the] communities [they cover] and are from a somewhat separate strata of society,” Groeling says.
Whether journalists truly capture the feelings and lived experience of their readers, and whether they paint the spectrum of our country’s opinions, cultures, and belief systems—that’s a question altogether more difficult to answer.
ICYMI: What we learned from the bizarre interview on CNN

Tamar Wilner is a Dallas-based freelance journalist and researcher who writes about misinformation, fact-checking, science communication, and all things media. She tweets at @tamarwilner.