The Observatory

Nutrition Coverage Under Fire

From red meat to white rice, not enough skepticism of observational studies
April 9, 2012

The incessant coverage of nutritional studies that make tenuous claims about the harms or benefits of consuming various foods and beverages has come under heavy fire from critics in recent months.

On Thursday, science writer Gary Taubes launched the latest broadside against credulous reporting of flimsy epidemiological research. “The last couple of weeks have witnessed a slightly-greater-than-usual outbreak of extremely newsworthy nutrition stories that could be described as bad journalism feasting on bad science,” he wrote in a guest post for Discover’s blog, The Crux.

A flood of stories about papers linking red meat to a higher chance of death and chocolate to lower body weight sparked Taubes’s ire.

The first study, from a team at the Harvard School of Public Health and published in the Archives of Internal Medicine, reviewed over two decades’ worth of data from more than 121,000 men and women who participated in either the Health Professionals Follow-up Study or the Nurses’ Health Study, two long-term observational research efforts. It claimed that one serving per day of unprocessed red meat was associated with a 13 percent increased risk of mortality after controlling for factors like smoking, exercise, and body weight, and that one serving per day of processed red meat (a hot dog or bacon, for example) was associated with a 20 percent increased risk.

The other study, from researchers at the University of California, San Diego (UCSD), was also published in the Archives of Internal Medicine and collected new data from over 1,000 men and women who participated in an observational study at the university. It found that eating chocolate was associated with a lower body mass index and claimed that eating it on a regular basis could lead to weight loss.

The problem with observational studies like the ones from Harvard and UCSD, Taubes explained, is that they identify associations rather than causal relationships or definitive links. In other words, they generate hypotheses, which, he added, more rigorously controlled experiments usually fail to support.
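The point about hypothesis-generation is easy to demonstrate. Below is a minimal, hypothetical simulation (the variable names, rates, and effect sizes are all invented for illustration, not taken from either study): a single confounder that drives both red-meat intake and mortality produces a meat-raises-death-risk association even though meat has no effect at all in the simulated data.

```python
import random

random.seed(42)

# Hypothetical illustration: "health consciousness" is a confounder that
# influences both red-meat intake and mortality; meat itself has NO effect.
n = 100_000
groups = {"high_meat": [0, 0], "low_meat": [0, 0]}  # [deaths, participants]

for _ in range(n):
    health_conscious = random.random() < 0.5
    # Health-conscious people eat less red meat (assumed rates, for illustration).
    eats_lots_of_meat = random.random() < (0.2 if health_conscious else 0.6)
    # Mortality depends only on the confounder, never on meat consumption.
    died = random.random() < (0.05 if health_conscious else 0.10)
    key = "high_meat" if eats_lots_of_meat else "low_meat"
    groups[key][0] += died
    groups[key][1] += 1

for key, (deaths, total) in groups.items():
    print(f"{key}: mortality = {deaths / total:.3f}")
```

Run it and the heavy meat eaters die at roughly a nine percent rate versus roughly seven percent for everyone else, an apparent risk increase of about 30 percent that is entirely an artifact of the confounder. A well-designed analysis tries to adjust such factors away, but researchers can only adjust for the confounders they think to measure.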


The difference between observational epidemiology and clinical trials—the “gold standard” in such matters—was a subject that Taubes discussed at length in a 2007 cover story for The New York Times Magazine, “Do We Really Know What Makes Us Healthy?” which CJR praised at the time as a great backgrounder in medical science for reporters. His blog post reminded readers that he “first wrote about the questionable nature of observational epidemiology in Science back in 1995,” in a piece titled “Epidemiology Faces Its Limits.”

Taubes has very particular views about diet and nutrition, however. His books—Why We Get Fat: And What to Do About It in 2010 and Good Calories, Bad Calories: Fats, Carbs, and the Controversial Science of Diet and Health in 2007—and other articles for the Times Magazine, including “Is Sugar Toxic?” in 2011 and “What if It’s All Been a Big Fat Lie?” in 2002, have promoted the benefits of a high-fat, high-protein, low-carbohydrate, zero-sugar diet. Critics such as Scientific American’s John Horgan and the Knight Science Journalism Tracker’s Paul Raeburn have questioned the evidence for Taubes’s formula and argued that his work is too intent on persuading readers to accept it.

At times, Taubes is similarly heavy-handed in his criticism of observational research, asserting in his blog post for Discover that it’s “closer to a pseudoscience than a real science.” And the no-to-meat, yes-to-chocolate advice in recent coverage clearly ran counter to his deeply held beliefs about what is healthy.
His cautions about the pitfalls of covering epidemiology are on the mark, however, and he’s not the only one making them.

In a post at the Knight Science Journalism Tracker titled “Red (Meat) Scare,” Pulitzer Prize-winning science writer and University of Wisconsin journalism professor Deborah Blum criticized “the alarmist tone of some coverage.” She faulted headlines such as “Eating All Red Meat Increases Death and More Reasons to Never Eat Meat” from The Daily Beast and “Scientists Warn Red Meat Can Be Lethal” from Sky News in the United Kingdom, while praising more balanced work elsewhere.

Fats and proteins aren’t the only foods drawing attention, however. On his blog, Dr. Yoni Freedhoff, the medical director of the Bariatric Medical Institute, a weight management center in Canada, recently slammed The Daily Mail for running an enormous front-page headline declaring, “Diabetes Warning on White Rice: Millions who regularly eat it are at risk.” The article was about a paper published in March in the British Medical Journal by some of the same Harvard researchers behind the red-meat-mortality paper. They reviewed data from four observational studies involving over 350,000 people and found an association between white rice consumption and a higher risk of type 2 diabetes, but stressed that the findings had “few immediate implications for doctors, patients, or public health services.”

Arguing that The Daily Mail had “overstated” the risk of eating rice, an analysis at the UK’s excellent NHS Choices website, run by the publicly funded National Health Service, explained that:

The main limitation with cohort studies such as the ones pooled in this review is that they may not have adjusted for all relevant factors that could be associated with intake of rice and with risk of diabetes. These include other dietary factors such as alcohol intake, physical activity and being overweight or obese. Also, studies assessing food intake can be particularly prone to some inaccuracy. Participants usually have to estimate their typical dietary intake, which can be hard to recall and variable over time.

Despite these significant limitations, journalists continue to seize upon the latest observational studies and tout their findings with little to no qualification. As Gary Schwitzer put it in a post for HealthNewsReview in early March:

Last week we saw stories about citrus fruits protecting women from stroke. This week it was stories about “sleeping pills could kill 500,000.” This week we also had stories about “omega-3 fatty acids protecting the aging brain” and about “Vitamin A may slash melanoma risk.” Sometimes it’s stories about lower risk (or protection), sometimes it’s stories about higher risk.

Week after week, year after year, for 6 years now, we have written about news stories that fail to explain the limitations of observational studies to readers. They use causal language – suggesting cause-and-effect findings – for studies that cannot prove cause-and-effect.

Unfortunately, those stories are like candy to undiscerning readers, but reporters who want to do better can turn to HealthNewsReview’s useful primer on observational studies, “Does the Language Fit the Evidence? Association Versus Causation.” It’s a simple, clear-cut guide to how various studies are designed and to how journalists can evaluate them with a more skeptical eye.

Curtis Brainard writes on science and environment reporting. Follow him on Twitter @cbrainard.