“I used up (the) better part of two or three days of my time and still they went ahead with the release,” Leahy told the Knight Science Journalism Tracker. “They think it is better to have a conversation on this than to be right.”

The NGO’s position is unfortunate because, as Goldenberg’s article pointed out, climate change does, in fact, pose a threat to global food production, and such errors can undermine the credibility of the larger climate science community.

The Knight Science Journalism Tracker’s Charlie Petit rightly observed, “Experienced climate reporters, if they covered this report and read that assertion about a 2.4 degree rise by 2020, should have immediately realized that such a jump is way out of line of standard projections… But most reporters who don’t intensely cover such issues would not have any internalized feel for the scale and rate of climate change.” (To those in the latter category, Schmidt wisely recommended the American Geophysical Union’s Climate Q&A Service or the Climate Science Rapid Response Team.)

The other instance of flawed math echoed by the press should have been easier to spot. A week before the report about climate change and food production, a paper (the cover story) in the journal Nature reported that the practice of banding penguins’ flippers in order to track and study them is harmful to the birds.

A team of French and Norwegian scientists studied free-ranging king penguins over the course of ten years and found that “banded birds produced 39 percent fewer chicks and had a survival rate 16 percent lower than non-banded birds.” Unlike the Universal Ecological Fund’s mistake, however, in this case the situation was worse than the researchers said it was.

The data table accompanying the paper showed that 36 percent (.36) of the non-banded penguins survived for ten years, compared to only 20 percent (.20) of the banded ones. As the Knight Science Journalism Tracker explained, subtracting the latter from the former results in a difference of sixteen percentage points (.16). A decline of sixteen percentage points is not the same as a decline of 16 percent, however. In fact, .20 is about a 44 percent drop from .36. A similar mistake was made in relation to breeding success, with the actual decline in banded birds’ breeding success being about 41 percent, rather than 39 percent.
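The distinction is easy to check with a few lines of arithmetic. Here is a minimal sketch (the function names are my own, not anything from the paper) using the survival rates reported in the data table:

```python
def percentage_point_drop(before, after):
    """Absolute difference between two rates, in percentage points."""
    return (before - after) * 100

def percent_decline(before, after):
    """Relative decline of `after` from `before`, as a percent."""
    return (before - after) / before * 100

non_banded = 0.36  # ten-year survival rate, non-banded penguins
banded = 0.20      # ten-year survival rate, banded penguins

# The two measures diverge: a 16-percentage-point drop...
print(round(percentage_point_drop(non_banded, banded)))  # 16
# ...is a roughly 44 percent relative decline.
print(round(percent_decline(non_banded, banded)))        # 44
```

Dividing the absolute drop by the starting rate, rather than stopping at the subtraction, is the step that most of the coverage skipped.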

The Associated Press’s Seth Borenstein was the first to spot the innumeracy. He surveyed the coverage elsewhere and shared the results with the Tracker:

“Getting the math wrong: 14. Getting it right: 2. Our batting average as science writers, a tepid .125.” That is well under the Mendoza Line, and if you’re a baseball fan you know what that means, especially when it’s the team average: back to the minor leagues for you.

This was a collective, rather than individual, average, of course, and certainly not the type of mistake that any reporter should be penalized for. The incident did provoke some thoughtful commentary and soul-searching, however. Cassandra Willyard, who repeated the erroneous 16-percent-decline figure in her article for ScienceNOW, offered the following reflection on her personal blog:

It’s never fun to be singled out for something you did wrong. And it’s especially painful when you’re made to feel like you don’t know how to do your job. Initially, I felt terrible about making this mistake. But over the past week, I’ve come to terms with it. Yes, I reported 16%. So did the researchers and at least 13 of my colleagues. The peer reviewers didn’t catch the error, and neither did the Nature employee who wrote the press release…

Still, the whole mess has made me question whether I understand my job description. Should I have caught that mistake? Should I take a statistics class? Bone up on my math skills? Do I need to carefully comb through the tables and charts in every paper looking for oddities? I try to be a careful reporter. Am I failing?

The simple answer is no. Honest mistakes are part of the job, and any reporter who makes one should be judged by their reaction to the error rather than by the error itself. Willyard owned up to the gaffe and admitted she’d learned a valuable lesson about the difference between percentages and percentage points. Moreover, the incident prompted a commenter on the Knight Science Journalism Tracker’s post to suggest three useful online math courses for journalists—from the Royal Statistical Society, the BBC, and Poynter’s News University.

Curtis Brainard is the editor of The Observatory, CJR's online critique of science and environment reporting. Follow him on Twitter @cbrainard.