Yesterday, The New York Times brought readers the best kind of health story: a crisis that failed to show up. Despite hyperbolic headlines in the 1980s predicting a generation of inner-city children irreparably damaged by intrauterine cocaine exposure, the Times reports on new research showing that so-called “crack babies” are generally doing just fine. (It turns out alcohol is more dangerous to a fetus.)

The Washington Post published a far more typically discouraging health story on yesterday’s front page. When lead levels in D.C.’s drinking water spiked between 2001 and 2004, public health officials assured residents “that they found no measurable impact on the general public’s health.” But a new study shows that hundreds of children in fact had unsafe amounts of lead in their blood.

The sad irony of these stories appearing on the same day is that lead poisoning in young children actually produces some of the irreparable cognitive and developmental damage once believed to be caused by exposing infants to cocaine. Lead poisoning also disproportionately affects the low-income and African-American populations menaced by the crack epidemic. But while crack babies became a symbol of America’s deteriorating inner city during the Reagan administration, President Reagan cut funding for lead screening and ordered the Centers for Disease Control to stop keeping lead poisoning statistics.

Reagan’s drug war encouraged Americans to view the problems of the inner city as ones of moral decay, with no better example than the so-called “crack baby.” When politicians and panicked public health officials started warning that cocaine would produce a generation of learning disabled, aggressive, and addiction-prone slum dwellers, what reporter could be expected to suggest they should be more worried about water pipes than crack pipes?

Of course, it would be ridiculous to suggest that there were no reasons to fear the impact of crack on children. Reading through contemporary stories from The Washington Post—a newspaper in a city so ravaged by the crack epidemic that even its mayor was not immune—is heartbreaking. Reporters delivered tales of infants born deformed and HIV-positive, older children forced into prostitution and abandoned in foster care, and victims of non-stop violence. Yet the stories’ scientific reports on cocaine’s developmental effects seem, in hindsight, to be rooted more in fear than in research. When Post reporters wrote about the long-term effects of fetal cocaine exposure, they often treated speculation as fact and reached for the most alarmist language possible.

A column by Post reporter Courtland Milloy on a 1989 conference of doctors and educators was headlined “Time Bomb in Cocaine Babies.” They were meeting, he wrote, for “a crash course on how this country can stop a potential human plague almost too horrible to imagine. It is the dysfunctional development of cocaine babies, or, as some have called it, the emergence of a ‘bio-underclass.’”

Milloy continued:

President [George H. W.] Bush was photographed cuddling one such infant at D.C. General Hospital recently, which helped generate an outpouring of sympathy for the plight of the babies.

But a few years from now, the experts note, those infants won’t look so cute anymore. Already, a few of them are turning up in first- and second-grade classrooms around the country, wreaking havoc on themselves and others. Severe emotional damage and even physical deformities not so readily apparent today may mushroom in the near future.

Milloy’s story is based largely on quotes from a doctor named J. Harold Nickens, then-chairman of the D.C. Chapter of the American Society of Addiction Medicine. Though I’m sure Nickens’s remarks were echoed by many other experts at the conference, Milloy never asks for specific studies substantiating that these kids’ “emotional damage” was caused by prenatal cocaine exposure, rather than, say, the environmental effects of being the child of a cocaine addict. Nor does Milloy point out that the doctor’s prediction of a lurking danger in children without visible birth defects is based on speculation, not data. Milloy paraphrases: “Moreover, Nickens said, a time bomb exists even in those children who may appear ‘normal’ and are deemed medically ready for discharge from hospitals. They are, in effect, addicts unaware of the lifelong challenge of recovery ahead of them.”

Lester Feder is a freelance reporter based in Washington, D.C., and a research scientist at George Washington University School of Public Health.