There is a great story in the news right now that illustrates the challenges for journalists who cover environmental risks. It’s about a study in Environmental Science & Technology which found flame retardant chemicals in the foam padding used in car seats, high chairs, and other products for infants and young children. Some of these chemicals are suspected human carcinogens and hormone disruptors. One was removed from baby pajamas years ago because a version of it caused cancer in lab animals.
It’s an important story, easy to write in a way that will get a lot of attention. But it’s not such an easy story to write if the objective is not just to alarm (Fox News headline: “Study: 80% of Baby Products are Toxic”) but also to inform (those are not mutually exclusive). Based on the coverage so far, there are a lot more alarmed parents out there than informed ones.
“Mothers reading one of the several hundred news stories this week that covered a study of flame retardants in US baby products could be forgiven for panicking,” Daniel Cressey observed in a news article for the journal Nature.
Here’s what makes this such a teachable example. The study didn’t find a risk; it didn’t even try to measure exposure. It merely looked for the presence of chemicals, and the authors say, right in the abstract (you don’t even have to read the whole study to see this caveat), that because the chemicals were found to be so prevalent, “Future studies are therefore warranted to specifically measure infants’ exposure to these flame retardants from intimate contact with these products and to determine if there are any associated health concerns.”
Nice and careful, right? The authors themselves point out that there is relatively little information about infants’ actual exposure to flame retardant chemicals. So there is no way of knowing whether there is an actual increased likelihood of harm, or how much. But as is often the case with coverage of risk, many of the stories about this study play down, or fail to include, critical facts that would help the reader understand how big or small the threat might be: the “Who What When Where Why” basics that should be in any story about risk.
To know if something poses a risk, or to know the severity of the risk, you have to establish both hazard and exposure, and you have to know some important details about each:
• How hazardous is the substance (how severe is the health outcome)?
• At what dose is it hazardous? (Sometimes a small dose might not be harmful at all, and while usually a bigger dose is worse, that’s not always true, as with endocrine disruptors—the point is, dose matters.)
• How are you exposed (inhalation, ingestion, or dermal contact, which has a lot to do with how your body handles it)?
• Over what period of time does exposure occur (one acute dose, or lots of little doses every day)?
• What’s the age of the exposed population? (A small dose for an infant, whose brain and immune system and body are still developing, is a bigger risk than the same dose for a full-grown adult.)
These details separate a risk story that alarms from one that also informs. But this study was a tough one for journalists because it was looking for the presence of chemicals and nothing more. The toxicity of many of the chemicals it found hasn’t been studied, and whether the kids in all those car seats and high chairs are in fact exposed, and to what doses, and for how long, hasn’t been studied either. That’s a lot of missing detail that readers need to gauge the risk. It’s fine that the study didn’t provide it (again, that wasn’t its point), but reporters have to explain that gap right away.