The Observatory

Flame Retardants Raise Undue Alarm

Incomplete risk assessment mars coverage of chemicals in kids’ products
May 23, 2011

There is a great story in the news right now that illustrates the challenges for journalists who cover environmental risks. It’s about a study in Environmental Science & Technology which found flame retardant chemicals in the foam padding used in car seats, high chairs, and other products for infants and young children. Some of these chemicals are suspected human carcinogens and hormone disruptors. One was removed from baby pajamas years ago because a version of it caused cancer in lab animals.

It’s an important story, easy to write in a way that will get a lot of attention. But it’s not such an easy story to write if the objective is not just to alarm (Fox News headline: “Study: 80% of Baby Products are Toxic”) but also to inform (those are not mutually exclusive). Based on the coverage so far, there are a lot more alarmed parents out there than informed ones.

“Mothers reading one of the several hundred news stories this week that covered a study of flame retardants in US baby products could be forgiven for panicking,” Daniel Cressey observed in a news article for the journal Nature.

Here’s what makes this such a teachable example. The study didn’t find a risk; it didn’t even try to measure exposure. It merely looked for the presence of chemicals, and it says, right in the abstract (you don’t even have to read the whole study to see this caveat), that because they were found to be so prevalent, “Future studies are therefore warranted to specifically measure infants’ exposure to these flame retardants from intimate contact with these products and to determine if there are any associated health concerns.”

Nice and careful, right? The authors themselves point out that there is relatively little information about infants’ actual exposure to flame retardant chemicals. So there is no way of knowing whether there is an actual increased likelihood of harm, or how much. But as is often the case with coverage of risk, many of the stories about this study play down, or fail to include, critical facts that would help the reader understand how big or small the threat might be: the “Who, What, When, Where, Why” basics that should be in any story about risk.

To know if something poses a risk, or to know the severity of the risk, you have to establish both hazard and exposure, and you have to know some important details about each:


• How hazardous is the substance (how severe is the health outcome)?
• At what dose is it hazardous? (Sometimes a small dose might not be harmful at all, and while usually a bigger dose is worse, that’s not always true, as with endocrine disruptors—the point is, dose matters.)
• How are you exposed (inhalation, ingestion, or dermal contact, which has a lot to do with how your body handles it)?
• Over what period of time does exposure occur (one acute dose, or lots of little doses every day)?
• What’s the age of the exposed population? (A small dose for an infant, whose brain and immune system and body are still developing, is a bigger risk than the same dose for a larger adult.)

These details separate a risk story that alarms from one that also informs. But this study was a tough one for journalists because it was looking for the presence of chemicals and nothing more. The toxicity of many of those it found hasn’t been studied, and whether the kids in all those car seats and high chairs are in fact exposed, and to what doses, and for how long, hasn’t been studied either. That’s a lot of missing detail that readers need to gauge the risk. It’s fine that the study didn’t provide it (again, that wasn’t its point), but reporters have to explain that right away.

Some did. The New York Times notes the absence of exposure data right in the second paragraph, reporting that “the research does not determine if children absorbed the chemical[s] … from products,” and that the scientists behind it merely “suggest that infants who use the products have higher exposure to the chemical than the government recommends.” WebMD’s Salynn Boyles, in a story reviewed by an M.D., does it in the fourth paragraph. “They did not examine how much of the chemicals babies were exposed to when the products were used,” she points out, later cautioning that “more study is needed to determine if this poses a health risk.”

Time’s Eben Harrell deals with the other side of the risk equation, hazard, in the third paragraph, noting that that information is missing too: “The question for parents, though, is what to do with this information. The long-term health effects [i.e. hazardousness] of these chemicals, which may save lives in the case of fire or car accident, have not been authoritatively established.”

But most stories fail to note that without these critical basics about hazard and exposure, the actual risk is unknown. In an early version of a story for Environmental Health News (picked up by Scientific American), the site’s editor, Marla Cone (a friend and an exemplary journalist), reported that these chemicals haven’t been fully studied, but failed to explain that that means we can’t tell whether there is a risk, or how big or small it might be. A day later, an amended version was posted that included comments from the chemical industry, which hadn’t replied in time to make the previous one.

The update, which ran on Environmental Health News but not at Scientific American, acknowledges that without hazard and exposure data the risk is unknown. But this critical point, so important for any parent who wants to be informed, not just alarmed, doesn’t show up until twenty-one paragraphs in, seven from the bottom. Even then, Cone doesn’t present this crucial information as simple fact. Instead, she quotes the American Chemistry Council saying that while the study shows that chemicals are in baby products, “it does not address exposure or risks,” making the statement seem less like fact than like dubious and evasive spin from the industry’s principal trade group.

Another common mistake with risk coverage shows up in this episode as well. Science is a process, and rarely does one paper prove anything conclusively. Too few articles explained that the study is just the beginning of the risk assessment process, even though the paper made this abundantly clear. “Knowledge of the types of chemicals in use and the products they are used in are essential first steps for evaluating the potential for human exposure and subsequent health effects,” it reads.

Lots of reporters missed an obvious red flag when lead author Heather Stapleton, an assistant professor of environmental chemistry at Duke University, was far bolder and more proactive in comments she made to reporters than in the paper itself. Take Emily Sohn’s story for Discovery News, which ran under the misleading and alarmist headline, “Baby Products Loaded with Toxins.” Sohn begins the third paragraph by explaining, “The study was not able to quantify the health risks of baby products that are treated with these chemicals.” (Actually, it didn’t even try.) Nonetheless, Stapleton then gets prescriptive, telling Sohn that the findings are “worrisome enough that it’s worth seeking out alternative [products].”

Reporters should always be aware when scientists are making statements that go above and beyond their published findings—as Stapleton was doing—and note that discrepancy to readers. Sohn didn’t, and not until near the end of her piece, after lots of worrisome language higher up warning about carcinogenicity and brain damage and other harms, did she make clear that the big questions of hazard and exposure aren’t addressed, so the actual risk is unknown and Stapleton’s study is just the first step.

“It’s important to know [that these potentially dangerous chemicals are found in kids’ products],” Arnold Schecter, a public health physician at the University of Texas School of Public Health in Dallas, told Sohn. “And the next important thing is to find out how frequent this is, what levels are there and what sort of risk this poses. The big question is: What is the toxicity? And how much is getting into children?”

Those are good questions, basic hazard-and-exposure details, Risk Reporting 101, necessary to help a reader make an informed judgment about the core question: is there a risk, and how much? But in many risk-related stories, those key details never show up, or they get buried at the end of the story, after all the scarier, more attention-getting stuff.

I know journalists want attention for their work. I was an environmental reporter for years and did stuff just like this. But I also know that journalists want to get the basic facts right, and leaving out critical details like these is simply incomplete reporting. And no journalist wants their work to do harm. But coverage that alarms without also fully informing leaves readers making choices, about what products to use or what to eat or how to behave, based on a dangerously incomplete picture.

David Ropeik is an instructor in the Harvard University Extension School’s Environmental Management Program, author of How Risky Is It, Really? Why Our Fears Don’t Always Match the Facts, creator of the in-house newsroom training program “Media Coverage of Risk,” and a consultant in risk communication. He was an environment reporter in Boston for twenty-two years and a board member of the Society of Environmental Journalists for nine years.