Within the last month, Wired magazine’s Mark Anderson and, in an interview published in the San Francisco Chronicle, author Tom Wolfe did something rarely seen in the manic world of neuroscience reporting. They broke ranks with the chorus of hypesters, saying, in essence, that we barely know what we think we know about the human brain. It was a stark departure from the usual drumbeat of flackery and dubious extrapolation common to the topic.
In the Wolfe interview by Steve Heilig, Wolfe appeared to backtrack from his own hyping of neuroscience in Hooking Up, his 2000 book about attraction in America, by saying he is fascinated by evidence of how little we actually know about the brain. In his words, theorists, and the reporters who give them ink, “are writing literature, which doesn’t mean they are wrong, but they don’t have a scientific leg to stand on. They literally don’t know what they are talking about.”
Call it a moment of clarity for a newspaper more prone to running articles like February’s “Stressed at work? Rewire your brain!” That article, by Chris Colin, was basically an advertorial posing as a news story about ProAttitude, a “peak performance” enhancement company in Napa, California, that uses “neuroscience … guided imagery … cognitive behavior therapy, humanistic psychology, positive psychology, and a form of learned optimism” to reduce workplace stress. The story concluded with the news that “ProAttitude is hosting a three-day workshop in Mill Valley.”
Similarly, Wired’s April issue is a bit conflicted itself. Anderson’s criticism of neuroscience is contained in a succinct sidebar that injects a dose of reality into a fawning feature by Gary Wolf about eternal-life guru Ray Kurzweil. “Almost nothing is known about how the brain produces awareness, and current models of brain function don’t accord with the little that is known,” Anderson writes; he then offers a point-by-point takedown of the accompanying feature and of neuroscience hype in general. Specifically, Anderson rebuts the notion that brains are like computers and that advances in neuroscience and computer science will enable the sixty-year-old Kurzweil to download his consciousness into a machine and extend his life past 2030.
Wired’s parroting of neuro-hype is more in tune with the almost-daily strains of flackery and extrapolation found in some of the nation’s top newspapers.
On March 31, The New York Times perpetuated the promulgation of “neuropunditry” that began last November on its op-ed pages with a news story of the same ilk. In “Is the Ad a Success? The Brain Waves Tell All,” reporter Stuart Elliott states that “Madison Avenue is all about the brain waves” and then details dubious EEG (electroencephalography) focus groups, quoting proponents like a Virgin Mobile vice president who says, “You actually see what they think and feel.” Well, no, you don’t. How electrical activity in certain parts of the brain (which is what EEG measures) translates into specific thoughts and feelings is poorly understood. And what, anyway, are the multi-million-dollar lessons advertisers learned from such dubious techniques? Have a good lead and use a twist. Genius.
The Los Angeles Times wandered into the same “neuropunditry” morass, following up a dubious December op-ed with a February news story on EmSense, a company often quoted in neuroscience articles that purports to divine voters’ specific feelings and opinions from neuroimaging techniques such as EEG. There’s not much evidence to support anything more than conclusions about whether or not a test subject is paying attention to a given candidate, and even that may change, but reporter Denise Gellene surely pleased her corporate sources by giving the marketing buzz an early push.
More often than tacitly advertising for moneyed neuro-research interests, however, the press falls victim to overblown extrapolation from small studies. The Associated Press, for example, recently published a story headlined “Sex and Financial Risk Linked in Brain, Study Finds,” about an fMRI study that found that the same areas of the brain “lit up” during sexual arousal (being shown erotic pictures) and gambling (roulette). The AP author, quoting the scientists, points out that the motivation to make bigger wagers or accept greater risk isn’t always lust. “The trigger doesn’t have to be sex,” according to the story; “it could be chocolate or a winning lottery ticket.” What really matters, according to the researchers, is raw emotion, which “bleeds over into your financial decisions.” Given that very general and unhelpful bottom line, the AP reporter, quoting a floor trader, makes a flimsy attempt to say something about sex’s impact on the stock market, but that impact is totally unclear. Readers are left with the information that science has found evidence that emotion, basically any intense, positive one, is linked to risk-taking. But the meaningful extrapolation, which promised to be the real news hook, fell flat.
This isn’t to say that all neuroscience reporting is gullible or exaggerated. Between its lapses of judgment, the Los Angeles Times sometimes sobers up and offers an eloquent example of quality neuroscience reporting. Terry McDermott’s long, narrative feature, “Trials, and a series of errors, in the brain lab,” revels in the most underreported aspect of brain science: that it is slow, laborious, and fraught with blunders and missteps, and that researchers are essentially working in the dark without a light, let alone a map. The experiments McDermott presents often yield contradictory findings, if and when they work at all. Fundamental aspects of brain function, like how a single memory is formed or the true function of a neuron, still baffle contemporary researchers. The passions of the researchers play as much of a role in the science as the research itself. The piece is nuanced and human, and it doesn’t overreach.
Journalists should continue to stoke the public’s sense of awe about the brain and the exciting research in the field. But they should do so with humility, given neuroscience’s infancy and imperfection, its immense financial motives, the limited applicability of any one study, and our own natural desire to believe reductionist explanations of cognitive phenomena. Or maybe they simply shouldn’t rely so much on those multi-colored maps of the brain to attract readers. As researchers at Colorado State University have shown, “presenting brain images with articles summarizing cognitive neuroscience research resulted in higher ratings of scientific reasoning for arguments made in those articles, as compared to articles accompanied by bar graphs, a topographical map of brain activation, or no image.” In other words, neuroscience isn’t as simple as a colorful graphic, and reporters should never present it that way.