The Observatory

Facebook and Procrastination

Runaway coverage mistakes correlation for causation
May 8, 2009

From the start, we knew that the news release we were distributing stood a good chance of drawing ample news coverage. After all, it involved the ubiquitous “social media” and student grades, either of which is all but guaranteed to garner attention.

What we didn’t figure was how badly most of the conventional news media would muck up the story in the process. Ultimately, the entire episode offers a good lesson in the inherent risks of reporters’ cavalierly covering the social sciences, as well as the risks that young researchers can face in dealing with the news media.

It began in March when our communications office at Ohio State University spotted a poster session by one of the school’s grad students titled, “A Description of Facebook Use and Academic Performance Among Undergraduate and Graduate Students.” It was one of hundreds of papers scheduled to be presented at the annual meeting of the American Educational Research Association in April, and an obvious candidate for a press release.

Research on one of the most popular social media engines was a strong news hook. So was any connection with student grades. And from our perspective, as writers charged with explaining ongoing university research, the fact that it arose from education as a discipline, and that it was work by a graduate student, made it even more appealing. (Any chance to tie research by students to their ongoing education reinforces the oft-forgotten relationship between the two at major universities.)

Our resulting story included these key points:

• It was a pilot study with a small but adequate sample
• It looked at Facebook use among undergraduate and graduate students in the sample and how much they said they studied
• It looked at the representative grade point averages (GPAs) of the students
• It looked for any correlation between Facebook use and GPAs, but suggested no causality
• It strongly recommended additional research.


Our office produces a lot of stories on social science research. We’re very careful to narrowly report the findings and avoid extrapolations or conjecture beyond what the data provides. After the Facebook study’s author, Aryn Karpinski, reviewed the draft of our press release and deemed it accurate, we distributed the story through both Eurekalert and Newswise, two of the largest distributors of research news releases to the media. It was embargoed until April 16 to coincide with Karpinski’s presentation at the educational research conference.

But that weekend, the Sunday Times of London ran an article about the research that carried the following statements:

“Research finds the website [Facebook] is damaging students’ academic performance. … Facebook users … are more likely to perform poorly in exams, according to new research. … The majority of students who use Facebook every day are underachieving by as much as an entire grade compared with those who shun the site.”

Sadly, the research showed no such thing.

The Times reporter wrote that he had talked with Karpinski and that she’d verified the story the newspaper published. Karpinski says she saw a version of the story, but what the Times printed wasn’t it. And while the paper did not technically break the embargo (the reporter said he didn’t get his information from any of the embargoed material), its story, printed several days before Karpinski’s presentation, set in motion a frantic race among the rest of the news media to catch up, and most of them used the exaggerated Times story as their baseline.

By Wednesday of that week, before the research was presented, Google News was showing hundreds of news stories from media around the world on the study. Many of those reports were wrong as well. Karpinski was overwhelmed with requests for interviews, most of which she granted—but neither her explanations to reporters nor her presentation (which we posted online after the embargo was broken) seemed to make much difference.

The crux of the problem centered on reporters’ apparent ignorance of the terms “correlation” and “causation,” two relatively common technical research terms that are as different as night and day.

Karpinski’s study showed that students who described themselves as Facebook users reported studying less and having lower GPAs than students who didn’t use Facebook. The Facebook users also said they believed, in their cases, there was no connection between their poorer academic performance and the social media engine.

So the study simply pointed to an apparent relationship between students’ lower grades and less time spent studying, and their Facebook use. It did not say that the latter caused the former. As one writer very nicely explained, “Facebook may be a symptom of a big procrastination habit, not a cause.”
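
For readers who want to see why that distinction matters, the toy simulation below is purely illustrative; it uses invented numbers, not Karpinski’s data or method. A single hidden trait (call it procrastination) drives both Facebook use and lower GPAs, so the two end up correlated even though the simulation gives Facebook no causal effect at all.

    import random

    random.seed(1)
    students = []
    for _ in range(1000):
        procrastination = random.random()        # hidden confounder, 0 to 1
        uses_facebook = procrastination > 0.5    # heavier procrastinators sign up
        # GPA depends only on procrastination, never on Facebook itself
        gpa = 4.0 - 1.5 * procrastination + random.gauss(0, 0.3)
        students.append((uses_facebook, gpa))

    users = [g for f, g in students if f]
    nonusers = [g for f, g in students if not f]
    print("mean GPA, Facebook users: %.2f" % (sum(users) / len(users)))
    print("mean GPA, non-users:      %.2f" % (sum(nonusers) / len(nonusers)))
    # A gap appears even though Facebook has no effect in this toy model.

The gap between the two averages comes entirely from the hidden trait, which is exactly the kind of lurking variable a correlational study cannot rule out.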

Unfortunately, most of the initial news stories didn’t get that.

A few writers from major media outlets did, however, point out the faulty reporting elsewhere. A couple of pieces, most notably a Wall Street Journal blog post in which I was quoted, even raised reasonable questions about whether the pilot study should have been publicized before peer review in the first place. An excellent piece in Ars Technica discussed the advantages and disadvantages of releasing such exploratory science.

With the embargo useless, Karpinski’s presentation poster and our release were made available to all. Later coverage improved. Karpinski continued giving interviews and, ultimately, was pleased with a second wave of stories that ran in USA Today and other outlets.

But the public and some researchers, reacting to the inaccurate reporting, blamed Karpinski for releasing her preliminary results, faulting her methodology. Last week, in the online journal First Monday, rival researchers published their account of why weaknesses in Karpinski’s research led to the media frenzy—an interesting misunderstanding of causality in its own right! Fortunately, the journal allowed Karpinski to publish a response.

In the past, I’ve seen respected, tenured professors retreat into their warrens when faced with half this onslaught. Surprisingly, Karpinski, while understandably miffed, is philosophical about the experience. Her parents, she said, raised her to be resolute, and the episode has since netted kudos from faculty and more invitations to publish, reasonably rare positive results for a grad student.

In the end, the frenzy to be first with the news helped the media misinform the public and betrayed the essence of the research in question.

Most science reporters, and researchers, know the consequences of pushing the data too far. It’s a good lesson for other journalists to learn as well.
