In the last two weeks, reporters have repeated false numbers provided by a study and a report (and by their respective press releases) related to the banding of penguins and global warming’s impact on global food production (the ever-vigilant Knight Science Journalism Tracker covered both episodes).
Most recently, an Argentina-based NGO, Universal Ecological Fund, released a report describing how climate change will affect food production in various parts of the world. One of its key findings was that the concentration of greenhouse gases in the atmosphere would rise to 490 parts per million (ppm) of CO2-equivalent (a unit of measure aggregating all such gases) by 2020, corresponding to a 2.4-degree Celsius rise in temperature.
EurekAlert!, the widely used news service run by the American Association for the Advancement of Science (AAAS), issued a press release from the NGO touting the report, which was covered by numerous news outlets worldwide. The claims about the rise in CO2-equivalent and temperature were patently false, however.
Guardian environment correspondent Suzanne Goldenberg was among the few reporters to catch the gaffe. Sensing that something was amiss, she e-mailed NASA climate modeler Gavin Schmidt, who wrote back: “2.4C by 2020 (which is 1.4C in the next 10 years - something like six to seven times the projected rate of warming) has no basis in fact.”
Schmidt followed up with a post at RealClimate.org, a blog he runs with some of his fellow climatologists, which explained exactly where Universal Ecological Fund went wrong, and it’s worth quoting at length. The NGO made two basic mistakes:
The first error is in misunderstanding what CO2-eq means and is used for. Unfortunately, there are two mutually inconsistent definitions out there (and they have been confused before). The first, used by policymakers in relation to the Kyoto protocol, relates the radiative impact of all the well-mixed greenhouse gases (i.e. CO2, CH4, N2O, CFCs) to an equivalent amount of CO2 for purposes of accounting across the basket of gases. Current GHG amounts under this definition are ~460 ppm, and conceivably could be 490 ppm by 2020.
However, the other definition is used when describing the total net forcing on the climate system. In that case, it is not just the Kyoto gases that must be included but also ozone, black carbon, sulfates, land use, nitrates etc. Coincidentally, all of the extra GHGs and aerosols actually cancel out to a large extent and so the CO2-eq in this sense is quite close to the actual value of CO2 all on its own (i.e. in IPCC 2007, the radiative forcing from CO2 was 1.7 W/m2, and the net radiative forcing was also 1.7 W/m2 (with larger uncertainties of course), implying the CO2-eq was equal to actual CO2 concentrations).
In deciding how the climate is going to react, one obviously needs to be using the second definition. Using the first is equivalent to assuming that between now and 2020 all anthropogenic aerosols, ozone and land use changes will go to zero. So, they used an excessive forcing value (3 W/m2 instead of ~2 W/m2).
The second mistake has a bigger consequence: they assumed that the instantaneous response to a forcing is the same as the long-term equilibrium response. This would be equivalent to a planet in which there was no thermal inertia - or one in which there were no oceans. Oceans have such a large heat capacity that it takes decades to hundreds of years for them to equilibrate to a new forcing. To quantify this, modelers often talk about transient climate sensitivity, a measure of a near term temperature response to an increasing amount of CO2, and which is often less than half of the standard climate sensitivity.
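Schmidt's two points can be checked with back-of-envelope arithmetic. The sketch below uses illustrative round numbers that are not taken from the report (an equilibrium sensitivity of about 3 °C per doubling of CO2, and about 3.7 W/m2 of forcing per doubling); it shows how stacking the two errors lands almost exactly on the report's 2.4 °C figure.

```python
# Back-of-envelope check of the two errors Schmidt describes.
# Assumed round numbers (not from the report): ~3.0 C of equilibrium
# warming per CO2 doubling, and ~3.7 W/m2 of forcing per doubling.
EQUILIBRIUM_SENSITIVITY_PER_DOUBLING = 3.0  # degrees C
FORCING_PER_DOUBLING = 3.7                  # W/m2

def equilibrium_warming(forcing_w_m2):
    """Long-term warming once the oceans have fully equilibrated."""
    return EQUILIBRIUM_SENSITIVITY_PER_DOUBLING * forcing_w_m2 / FORCING_PER_DOUBLING

# Both errors combined: the Kyoto-basket forcing (~3 W/m2) treated as if
# the full equilibrium response arrived instantly.
naive = equilibrium_warming(3.0)            # ~2.4 C, the report's figure

# Correcting both: the net forcing (~2 W/m2) and a transient response
# that is often less than half the equilibrium one.
transient = 0.5 * equilibrium_warming(2.0)  # well under 1 C

print(f"naive: {naive:.1f} C, corrected sketch: {transient:.1f} C")
```

With these stand-in values, the naive calculation reproduces the report's 2.4 °C almost exactly, while the corrected sketch comes out closer to 0.8 °C, in line with Schmidt's "six to seven times the projected rate of warming" remark.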
Goldenberg brought Universal Ecological Fund’s error to the attention of EurekAlert!, which quickly removed the related press release from its website.
“We primarily rely on the submitting organization to ensure the veracity of the scientific content of the news release,” Ginger Pinholster, the director of AAAS’s communications office, told her.
“In this case, we immediately contacted a climate-change expert after receiving your query. That expert has confirmed for us that the information indeed raises many questions in his mind, and therefore we have removed the news release from EurekAlert!”
It is disappointing that the press release made it to AAAS in the first place, however. On his blog, freelance environmental journalist Stephen Leahy reported that he had brought the error to the attention of Universal Ecological Fund’s executive director, Liliana Hisas, “days before the report’s release,” and suggested that it be withdrawn. Hisas refused, telling Leahy that an Argentine scientist who had worked on the Intergovernmental Panel on Climate Change’s 2007 Assessment Report had vetted the report.
“I used up (the) better part of two or three days of my time and still they went ahead with the release,” Leahy told the Knight Science Journalism Tracker. “They think it is better to have a conversation on this than to be right.”
The NGO’s position is unfortunate because, as Goldenberg’s article pointed out, climate change does, in fact, pose a threat to global food production, and such errors can undermine the credibility of the larger climate science community.
The Knight Science Journalism Tracker’s Charlie Petit rightly observed, “Experienced climate reporters, if they covered this report and read that assertion about a 2.4 degree rise by 2020, should have immediately realized that such a jump is way out of line of standard projections. But most reporters who don’t intensely cover such issues would not have any internalized feel for the scale and rate of climate change.” (To those in the latter category, Schmidt wisely recommended the American Geophysical Union’s Climate Q&A Service or the Climate Science Rapid Response Team.)
The other instance of flawed math echoed by the press should have been easier to spot. A week before the report about climate change and food production, a paper (the cover story) in the journal Nature reported that the practice of banding penguins’ flippers in order to track and study them is harmful to the birds.
A team of French and Norwegian scientists studied free-ranging king penguins over the course of ten years and found that “banded birds produced 39 percent fewer chicks and had a survival rate 16 percent lower than non-banded birds.” Unlike the Universal Ecological Fund’s mistake, however, in this case the situation was actually worse than the researchers said it was.
The data table accompanying the paper showed that 36 percent (.36) of the non-banded penguins survived for ten years, compared to only 20 percent (.20) of the banded ones. As the Knight Science Journalism Tracker explained, subtracting the latter from the former results in a difference of sixteen percentage points (.16). A decline of sixteen percentage points is not the same as a decline of 16 percent, however. In fact, .20 is about a 44 percent drop from .36. A similar mistake was made in relation to breeding success, with the actual decline between banded and non-banded birds being about 41 percent, rather than 39 percent.
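The distinction is easy to see in a few lines of arithmetic, using the survival figures from the paper:

```python
# Percentage points vs. percent, with the paper's survival figures.
non_banded_survival = 0.36  # 36% of non-banded penguins survived ten years
banded_survival = 0.20      # 20% of banded penguins survived

# Difference in percentage points: simple subtraction of the two rates.
point_drop = (non_banded_survival - banded_survival) * 100   # 16 points

# Relative decline in percent: the same gap measured against the baseline.
percent_drop = (non_banded_survival - banded_survival) / non_banded_survival * 100

print(f"{point_drop:.0f} percentage points, but a {percent_drop:.0f}% relative decline")
```

The subtraction gives the sixteen points that the paper and most reporters quoted; dividing that gap by the .36 baseline gives the roughly 44 percent decline that the headline figure should have been.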
The Associated Press’s Seth Borenstein was the first to spot the innumeracy. He surveyed the coverage elsewhere and shared the results with the Tracker:
“Getting the math wrong: 14. Getting it right: 2. Our batting average as science writers, a tepid .125.” That is well under the Mendoza line and if you’re a baseball fan you know what that means and especially if it’s the team average: back to the minor leagues for you.
This was a collective, rather than individual, average, of course, and certainly not the type of mistake that any reporter should be penalized for. The incident did provoke some thoughtful commentary and soul-searching, however. Cassandra Willyard, who repeated the erroneous 16-percent-decline figure in her article for ScienceNOW, offered this reflection on her personal blog:
It’s never fun to be singled out for something you did wrong. And it’s especially painful when you’re made to feel like you don’t know how to do your job. Initially, I felt terrible about making this mistake. But over the past week, I’ve come to terms with it. Yes, I reported 16%. So did the researchers and at least 13 of my colleagues. The peer reviewers didn’t catch the error, and neither did the Nature employee who wrote the press release.
Still, the whole mess has made me question whether I understand my job description. Should I have caught that mistake? Should I take a statistics class? Bone up on my math skills? Do I need to carefully comb through the tables and charts in every paper looking for oddities? I try to be a careful reporter. Am I failing?
The simple answer is no. Honest mistakes are part of the job, and any reporter who makes one should be judged by their reaction to the error rather than the error itself. Willyard owned up to the gaffe and admitted she’d learned a valuable lesson about the difference between percentage and percentage points. Moreover, the incident prompted a commenter on the Knight Science Journalism Tracker’s post to suggest three useful online math courses for journalists—from the Royal Statistical Society, the BBC, and Poynter’s News University.
The episodes related to the climate-change-and-food-production report and the penguin flipper-banding study do, of course, reinforce the vital importance of reporters not taking any information in studies, reports, or press releases for granted. Blindly repeating assertions made by scientists (or anybody else) is the job of stenographers, not journalists. The incidents also reinforce the value of mid-career training. The vast majority of reporters out there would surely benefit from a remedial math course; their editors should encourage—and, dare I say it, pay for—them to take one.
Clarification: The lede of this article was amended to reflect the fact that the study related to penguin banding and the report related to climate change and food production themselves contained the numerical errors, in addition to their accompanying press releases.