Carl Bialik writes the weekly Numbers Guy column for the online edition of the Wall Street Journal. A former technology reporter for the online Journal, he co-writes a daily column about sports for the online Journal, and is co-editor of and writer for Gelf Magazine, which covers sports, the media and international news.
Bryan Keefer: Why do you think it is that journalists often fail to treat numbers with the same sort of skepticism — or same sort of enthusiasm, sometimes — as human sources, or as documents?
Carl Bialik: For some journalists, it could be that math was never their favorite subject, or that they never took a statistics class. But in a lot of cases, the numbers being thrown around don’t take any real advanced training to look into; it’s just a matter of reading the source documents. So I think in those cases, there are a number of reasons. And one of them is deadline pressure — it’s easier to just call the other side for five minutes and get a quick comment than to really put the responsibility on yourself to fact-check.
And in other cases I think, for some reason numbers seem to carry some sort of weight of authority, and if you can cite estimates and give a number, then it seems to stand on its own and be a very solid fact, even if what lies beneath the number isn’t much at all.
BK: I noticed a number of your pieces take the angle of challenging the conventional wisdom. Do you think that part of it might be that reporters get caught in the trap of assuming the conventional wisdom is true?
CB: Yeah, and I think part of that is the Nexis effect, or Google News effect. If you see a number, and it’s been reported somewhere authoritative, or if it’s just been reported in lots of places, then since numbers seem to have this air of certainty, reporters can start to accept it as fact. And even if the original article was responsible about saying where the number came from and giving some caveats and cautions, often when you’re writing the follow-up story, and you just have a paragraph to devote to what came before, a lot of the nuance gets lost.
BK: What do you think the biggest source of these bad numbers is? Is it politicians and interest groups and PR people promoting their agendas, is it just reporters’ misinterpretations of data, or is it something else entirely?
CB: I think it’s more the first category. Because people who are smart about dealing with the press know that reporters will often take numbers and repeat them, it’s just a good strategy for advocacy groups or interest groups to put numbers into press releases, speeches, other documents, and interviews. That seems to have become part of the standard media strategy.
There are definitely cases, though, especially in science or health reporting, where reporters are interacting directly with a less biased source, like a scientific researcher who really is just trying to share findings and who doesn’t have an agenda. And because of deadline pressures, and the story going through several edits by people who haven’t directly interacted with the researchers, you can get some misinterpretations. But of the examples I’ve seen so far, that seems to be the less common reason.
BK: Do you think that science and medical journalists, or any particular segment of journalists, get spun by these numbers more often? You’ve been a technology reporter, and you co-write a sports column — do you find that any particular segments of journalism are better with numbers or particularly worse with numbers?