
Lists aren’t the best way to determine freedom

Journalists need to be a little more skeptical when looking at country rankings
December 30, 2014

Year’s end is an avalanche of lists: rankings of influential people, the year’s best movies and music, even the best and worst television moments.

A range of activist and advocacy groups also rank the world’s nation-states, from cleanest to most corrupt, from freest to most repressive.

For these groups, rankings offer a direct means of motivating governments to change their behavior. For readers, lists promise instantaneous insight into otherwise complex issues, combined with a fun, clickable format.

For journalists, lists can also offer a quick and easy story. The research has already been done by the advocacy groups; all that’s left for journalists is to report the scores. Therein lies the peril.

Country rankings are by definition a thumbnail picture of vast and intricate realities. They’re an attractive and sometimes powerful way of making sense of the world. But that does not mean that the media should regard every country’s position on every year-end list as an ironclad representation of reality.

“Lists can be helpful advocacy tools, because they grab headlines and boil down complex issues into digestible scores,” said Courtney C. Radsch, advocacy director at the Committee to Protect Journalists. “That can also be a drawback for measured and sustained engagement with governments on the root issues and causes of journalist insecurity and press freedom violations, because they become focused on the score rather than on the action they need to take.”


One list that draws its share of media attention is Transparency International’s Corruption Perceptions Index. The CPI is actually a “survey of surveys,” drawing on data from the World Bank and other institutions that poll experts and business figures around the world to develop a picture of a problem, corruption, that by definition takes place in secret.

It’s the most analytically tenuous aspect of the list, countries’ individual ranks, that often draws the most media attention within individual countries. Dramatic rises and falls generate a fair amount of discussion. This year, Egypt rose significantly, while China fell 20 places, to 100th out of 175 states, spurring debate and media scrutiny inside and outside the country.

“We very clearly indicate that one should not read too much into year-to-year changes and look at the long term picture over two, three, four, five years,” said Finn Heinrich, the research director at Transparency International. “The important thing is what is happening over time.”

Lists can also be a powerful motivator. When countries do well or fall behind over the years, it generates discussion, and governments listen because they want to improve their scores and maintain a positive image in international media. Beyond the emotional impact of a low or high ranking, a country’s score can influence the calculations of investors, tourists, or donor governments. The fact that governments sometimes pay attention is notable, particularly when it comes to an issue like corruption, which, beyond the few scandals that burst into public view, is neither sensational nor easy to report on.

“It helped a lot to put the issue on the agenda, even globally,” said Heinrich. “When you look at the media coverage we get about the issue of corruption around the Corruption Perception Index, it’s actually not so much about which countries ranked where, but it’s more about, ‘Well, corruption is really a problem worldwide.'”

But not all lists are as methodologically sound. Take the Good Country Index, which scores “goodness” based on a composite of 35 datasets assessing countries’ contributions in seven categories ranging from science and technology to culture, peace and international security, health, and global climate change. Overall, according to the list, Ireland scored the highest; Libya scored the lowest. This ranking is so abstract, so far removed from the original data, that it’s difficult to see how it would ever spur concrete action by governments. Coverage didn’t necessarily reflect that abstractness: Business Insider’s headline on a piece covering the list was “These 30 Countries Contribute The Most Good To The World.”

Another problem with lists is that the authorities can become focused on raising scores rather than addressing underlying issues. This is the infamous problem of “juking the stats,” masterfully portrayed in The Wire, in which Baltimore’s schools and police department become more concerned with elevating test scores and keeping crime rates down than with fixing the city’s deeply broken system.

Unlike some other groups, CPJ does not release a press freedom ranking; instead it publishes a Prison Census, a documented list of journalists imprisoned worldwide. This approach solves at least one of the dilemmas associated with lists: it’s raw data, as opposed to an expert survey in which at least some element of subjectivity is involved.

“If a country ranks number one in terms of the leading jailer of journalists, for example, this is based on documented evidence and provides a clear, measurable, and comparable number,” says Radsch. She says it also provides governments with a clear way to improve their position on the list: “Let journalists out of prison.”

Jared Malsin is a freelance journalist based in Cairo.