From the start, we knew that the news release we were distributing stood a good chance of attracting ample news coverage. After all, it involved the ubiquitous “social media” and student grades, each of which is all but guaranteed to garner attention.
What we didn’t figure on was how badly most of the conventional news media would muck up the story in the process. Ultimately, the entire episode offers a good lesson in the inherent risks of reporters covering the social sciences cavalierly, as well as the risks that young researchers can face in dealing with the news media.
It began in March when our communications office at Ohio State University spotted a poster session by one of the school’s grad students titled, “A Description of Facebook Use and Academic Performance Among Undergraduate and Graduate Students.” It was one of hundreds of papers scheduled to be presented at the annual meeting of the American Educational Research Association in April, and an obvious candidate for a press release.
Research on one of the most popular social media engines was a strong news hook. So was any connection with student grades. And from our perspective, as writers charged with explaining ongoing university research, the fact that it arose from education as a discipline, and that it was work by a graduate student, made it even more appealing. (Any chance to tie research by students to their ongoing education reinforces the oft-forgotten relationship between the two at major universities.)
Our resulting story included these key points:
• It was a pilot study with a small but adequate sample
• It looked at Facebook use among undergraduate and graduate students in the sample and how much they said they studied
• It looked at the students’ reported grade point averages (GPAs)
• It looked for any correlation between Facebook use and GPAs, but suggested no causality
• It strongly recommended additional research.
Our office produces a lot of stories on social science research. We’re very careful to report the findings narrowly and avoid extrapolations or conjecture beyond what the data provide. After the Facebook study’s author, Aryn Karpinski, reviewed the draft of our press release and deemed it accurate, we distributed the story through both EurekAlert! and Newswise, two of the largest distributors of research news releases to the media. It was embargoed until April 16 to coincide with Karpinski’s presentation at the educational research conference.
But that weekend, the Sunday Times of London ran an article about the research that carried the following statements:
“Research finds the website [Facebook] is damaging students’ academic performance. … Facebook users … are more likely to perform poorly in exams, according to new research. … The majority of students who use Facebook every day are underachieving by as much as an entire grade compared with those who shun the site.”
Sadly, the research showed no such thing.
The Times reporter wrote that he had talked with Karpinski and she’d verified the story the newspaper published. Karpinski says she saw a version of the story, but what the Times printed wasn’t it. And while the paper did not technically break the embargo (the reporter said he didn’t get his information from any of the embargoed material), its story, printed several days before Karpinski’s presentation, set in motion a frantic race among the rest of the news media to catch up, and most of them used the exaggerated Times story as their baseline.
By Wednesday of that week, before the research was presented, Google News was showing hundreds of news stories from media around the world on the study. Many of those reports were wrong as well. Karpinski was overwhelmed with requests for interviews, most of which she granted—but neither her explanations to reporters nor her presentation (which we posted online after the embargo was broken) seemed to make much difference.
The crux of the problem centered on reporters’ apparent ignorance of the terms “correlation” and “causation,” two relatively common technical research terms that are as different as night and day.
Karpinski’s study showed that students who described themselves as Facebook users reported studying less and having lower GPAs than students who didn’t use Facebook. The Facebook users also said they believed, in their cases, there was no connection between their poorer academic performance and the social media engine.
So the study simply pointed to an apparent relationship between students’ lower grades and less time spent studying, and their Facebook use. It did not say that the latter caused the former. As one writer very nicely explained, “Facebook may be a symptom of a big procrastination habit, not a cause.”
Unfortunately, most of the initial news stories didn’t get that.