Yesterday, The New York Times brought readers the best kind of health story: a crisis that failed to show up. Despite hyperbolic headlines in the 1980s predicting a generation of inner-city children irreparably damaged by intrauterine cocaine exposure, the Times reports on new research showing that so-called “crack babies” are generally doing just fine. (It turns out alcohol is more dangerous to a fetus.)
The Washington Post published a far more typically discouraging health story on yesterday’s front page. When lead levels in D.C.’s drinking water spiked between 2001 and 2004, public health officials assured residents “that they found no measurable impact on the general public’s health.” But a new study shows that hundreds of children in fact had unsafe amounts of lead in their blood.
The sad irony about these stories appearing on the same day is that lead poisoning in young children actually produces some of the irreparable cognitive and developmental damage that was once believed to be caused by exposing infants to cocaine. Lead poisoning also disproportionately affects the low-income and African-American populations menaced by the crack epidemic. But even as crack babies became a symbol of America’s deteriorating inner cities during his administration, President Reagan cut funding for lead screening and ordered the Centers for Disease Control to stop keeping lead poisoning statistics.
Reagan’s drug war encouraged Americans to view the problems of the inner city as ones of moral decay, with no better example than the so-called “crack baby.” When politicians and panicked public health officials started warning that cocaine would produce a generation of learning disabled, aggressive, and addiction-prone slum dwellers, what reporter could be expected to suggest they should be more worried about water pipes than crack pipes?
Of course, it would be ridiculous to suggest that there were no reasons to fear the impact of crack on children. Reading through contemporary stories from The Washington Post—a newspaper in a city so ravaged by the crack epidemic that even its mayor was not immune—is heartbreaking. Reporters delivered tales of infants born deformed and HIV-positive, older children forced into prostitution and abandoned in foster care, and victims of non-stop violence. Yet the stories’ scientific claims about cocaine’s developmental effects seem, in hindsight, to be rooted more in fear than in research. When Post reporters wrote about the long-term effects of fetal cocaine exposure, they often treated speculation as fact and reached for the most alarmist language available.
A column by Post reporter Courtland Milloy on a 1989 conference of doctors and educators was headlined “Time Bomb in Cocaine Babies.” They were meeting, he wrote, for “a crash course on how this country can stop a potential human plague almost too horrible to imagine. It is the dysfunctional development of cocaine babies, or, as some have called it, the emergence of a ‘bio-underclass.’”
President [George H. W.] Bush was photographed cuddling one such infant at D.C. General Hospital recently, which helped generate an outpouring of sympathy for the plight of the babies.
But a few years from now, the experts note, those infants won’t look so cute anymore. Already, a few of them are turning up in first- and second-grade classrooms around the country, wreaking havoc on themselves and others. Severe emotional damage and even physical deformities not so readily apparent today may mushroom in the near future.
Milloy’s story is largely based on quotes from a doctor named J. Harold Nickens, then-chairman of the D.C. Chapter of the American Society of Addiction Medicine. Though I’m sure Nickens’s remarks were echoed by many other experts at the conference, Milloy never asks for specific studies substantiating that these kids’ “emotional damage” was caused by prenatal cocaine exposure rather than by, say, the environmental effects of being the child of a cocaine addict. Nor does Milloy point out that the doctor’s prediction of a lurking danger in children without visible birth defects is based on speculation, not data. Milloy paraphrases: “Moreover, Nickens said, a time bomb exists even in those children who may appear ‘normal’ and are deemed medically ready for discharge from hospitals. They are, in effect, addicts unaware of the lifelong challenge of recovery ahead of them.”
Compare this doomsaying with the rhetoric of a 2,400-word story on lead poisoning published two years earlier in the Post’s A section. Though the experts quoted in the story make similarly dire predictions of widespread lead poisoning, the headline reads: “Persistent, Pervasive Pollutant; Found in Soil, Water and Food, Lead Increasingly Seen as Health Peril.” The story is based on a report by the Public Health Service, which said at least 17 percent of preschool-aged children in metropolitan areas were at risk of some level of lead poisoning. (Two scientists resigned in protest over political tampering with the report, indicating that the science was even more alarming than the public version let on.) The story, citing the American Academy of Pediatrics, identified symptoms of lead poisoning that were tied to specific research: “partial loss of hearing and IQ, growth retardation, inhibited metabolism of Vitamin D and disturbances in blood formation.”
Even more significant is that reporter Michael Weisskopf sought comment from someone who disputed these findings. “Lead was on earth before we people were,” Weisskopf quotes Lead Industries Association spokesman Werner T. Meyer as saying. “So, a certain tolerance for lead must exist in the human body.” Readers may find Meyer’s claims questionable, but at least the reporter flags the possibility that the science could be disputed, a caution missing from the crack baby stories.
While the Post deserves credit for its consistently strong reporting on lead poisoning, the asymmetrical rhetoric and uncritical acceptance of scientific claims reinforced the belief that the problems of low-income inner-city communities were produced by the bad choices of irresponsible individuals, most of whom happened to be black. Evidence of structural factors, like the old housing stock that put poor children at risk of lead poisoning, does not sink in as easily. (Even Milloy, who is African-American and whose columns seemed largely aimed at generating action within the black community, unfortunately contributed to this problem. A Lexis-Nexis search shows he did not write a column on lead poisoning until 1993, his tenth year as a columnist for the Post.) This is a reminder that the time to be most critical of data is when it supports easy explanations for difficult problems.

Lester Feder is a freelance reporter based in Washington, D.C., and a research scientist at George Washington University School of Public Health.