Time’s Eben Harrell addresses the other side of the risk equation, hazard, in his third paragraph, noting that this information is missing too: “The question for parents, though, is what to do with this information. The long-term health effects [i.e. hazardousness] of these chemicals, which may save lives in the case of fire or car accident, have not been authoritatively established.”
But most stories fail to note that without these critical basics about hazard and exposure, the actual risk is unknown. In an early version of a story for Environmental Health News (picked up by Scientific American), the site’s editor, Marla Cone (a friend and an exemplary journalist), reported that these chemicals haven’t been fully studied, but failed to explain that this means we can’t tell whether there is a risk, or how big or small it might be. A day later, an amended version was posted that included comments from the chemical industry, which hadn’t replied in time to make the original.
The update, which ran on Environmental Health News, but not at Scientific American, acknowledges that without hazard and exposure data the risk is unknown, but this critical point, so important for any parent who wants to be informed, not just alarmed, doesn’t show up until twenty-one paragraphs in, seven from the bottom. Even then, Cone doesn’t present this crucial information as simple fact. Instead, she quotes the American Chemistry Council saying that while the study shows that chemicals are in baby products, “it does not address exposure or risks,” making the statement seem less like the truth than dubious and evasive spin from the industry’s principal trade group.
Another common mistake with risk coverage shows up in this episode as well. Science is a process, and rarely does one paper prove anything conclusively. Too few articles explained that the study is just the beginning of the risk assessment process, even though the paper made this abundantly clear. “Knowledge of the types of chemicals in use and the products they are used in are essential first steps for evaluating the potential for human exposure and subsequent health effects,” it reads.
Lots of reporters missed an obvious red flag: lead author Heather Stapleton, an assistant professor of environmental chemistry at Duke University, was far bolder and more proactive in her comments to reporters than in the paper itself. Take Emily Sohn’s story for Discovery News, which ran under the misleading and alarmist headline “Baby Products Loaded with Toxins.” Sohn begins her third paragraph by explaining, “The study was not able to quantify the health risks of baby products that are treated with these chemicals.” (Actually, it didn’t even try.) Nonetheless, Stapleton then gets prescriptive, telling Sohn that the findings are “worrisome enough that it’s worth seeking out alternative [products].”
Reporters should always be aware when scientists make statements that go beyond their published findings, as Stapleton was doing, and note that discrepancy for readers. Sohn didn’t, and not until near the end of her story, after plenty of worrisome language about carcinogenicity, brain damage, and other harms, did she make clear that the big questions of hazard and exposure remain unaddressed, so the risk is unknown and Stapleton’s study is just a first step.
“It’s important to know [that these potentially dangerous chemicals are found in kids products],” Arnold Schecter, a public health physician at the University of Texas School of Public Health in Dallas, told Sohn. “And the next important thing is to find out how frequent this is, what levels are there and what sort of risk this poses. The big question is: What is the toxicity? And how much is getting into children?”
Those are good questions, the basic hazard and exposure details, Risk Reporting 101, that a reader needs to make an informed judgment about the core question: is there a risk, and how much? But in many risk-related stories, those key details never show up, or they get buried at the end, after all the scarier, more attention-getting material.
I know journalists want attention for their work. I was an environmental reporter for years and did stuff just like this. But I also know that journalists want to get the basic facts right, and leaving out critical details like these is simply incomplete reporting. And no journalist wants their work to do harm. Yet a story that alarms without also fully informing leaves readers making choices, about what products to use, what to eat, how to behave, based on a dangerously incomplete picture.