FLORIDA—Late Saturday night, the Tampa Bay Times and the Miami Herald released the results of a new Mason-Dixon survey of Florida Republicans. One didn’t need to see the poll numbers to know the newspapers believed the results were dramatic.
Times political editor Adam Smith wrote:
Mitt Romney needed Florida to resuscitate his campaign after a South Carolina routing, and on Tuesday, Florida is poised to deliver big.
A new Tampa Bay Times/Miami Herald/Bay News 9 poll found Romney easily beating Newt Gingrich among likely Republican primary voters, with 42 percent support to Gingrich’s 31 percent. Rick Santorum trails with 14 percent, followed by Ron Paul at 6 percent.
And Marc Caputo, political writer for the Herald, wrote:
Newt Gingrich swaggered into Florida as a Republican front-runner, but now he’s close to slipping out as an also-ran against a resurgent Mitt Romney.
Gingrich is badly trailing Romney by 11 percentage points, garnering just 31 percent of likely Republican voters heading into Tuesday’s presidential primary, according to a Miami Herald/El Nuevo Herald/Tampa Bay Times poll released late Saturday night.
Both reporters went to considerable length to describe the complexities of the campaign and the reasons for Romney’s apparent resurgence and Gingrich’s apparent slide. Where they failed, however, is in what they did not report.
Newspaper polling (and the coverage of such polling) has long been a pet peeve of mine. Smith, Caputo and others with whom I worked as a political reporter have been subjected to my railing about newspaper polls countless times. I have written about the issue and what newspapers can do better on my blog, Crowley Political Report.
In Caputo’s and Smith’s stories, there is a clear example of what is wrong with newspaper polls and the coverage thereof. Both reporters wrote that their poll showed Romney with a 24-point lead over Gingrich among Hispanic voters—52 to 28 percent. But readers were not told important details about these numbers. I interviewed Brad Coker of Mason-Dixon Polling and Research, who conducted the poll for the newspapers. He said the survey of 500 registered Republicans deemed likely to vote included just 75 Hispanics. Those 75 Hispanics surveyed consisted “heavily” of Cuban Americans who live in Miami-Dade County.
Now, the argument can be made that Miami-Dade Cuban Americans will be the overwhelming majority of Hispanic voters in the Florida primary, as Caputo explains in his story. But neither Caputo nor Smith tells readers that their Hispanic survey is, in fact, not a comprehensive look at Florida Hispanic voters. Coker told me that if he were doing a detailed survey of Hispanics he would have surveyed 400, not 75.
Once you realize that the survey sample is 75, the next question arises: What is the margin of error? The Times and Herald reported that the margin of error for the 500 Republicans surveyed was plus or minus 4.5 percentage points. True. But what they did not tell readers was the margin of error for the 75 Hispanic voters surveyed, which Coker told me was plus or minus 12 percentage points. That means the percentage of Hispanics who support Romney ranges from 40 percent to 64 percent, and the percentage supporting Gingrich ranges from 16 percent to 40 percent.
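These numbers follow from the standard formula for margin of sampling error at 95 percent confidence. Here is a minimal Python sketch of the arithmetic, assuming a simple random sample and the conservative 50 percent response split that pollsters typically use (the function name is mine, not Mason-Dixon’s):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Margin of sampling error, in percentage points, at 95 percent
    confidence, assuming a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n) * 100

print(round(margin_of_error(500), 1))  # 4.4 -- the papers reported 4.5
print(round(margin_of_error(75), 1))   # 11.3 -- Coker rounds to 12

# Using Coker's rounded figure of 12 points for the 75-person subgroup:
moe = 12
for name, pct in [("Romney", 52), ("Gingrich", 28)]:
    print(f"{name}: {pct - moe} to {pct + moe} percent")
```

The last loop reproduces the 40-to-64 and 16-to-40 ranges exactly: shrink the sample from 500 to 75 and the error band nearly triples.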
What does it really mean? That there is little statistical value in the number of Hispanics surveyed. This is a common problem when news organizations report subgroups in their polls. Often the numbers surveyed are too small to reveal any meaningful information.
The American Association for Public Opinion Research warns journalists:
Sample sizes below 100 will have a large margin of sampling error—plus or minus 10 percentage points for a sample size of 100 and increasing as the sample size declines. Journalists should avoid reporting on groups this small unless there is a compelling reason to do so, and then only after consulting with an independent polling expert.
(Here’s a very useful chart for journalists that offers information on margin of error for subgroups).
I spoke with the Herald’s Caputo and he later sent me an email offering this explanation for his reporting on this poll:
I believe about 60 percent of the registered Republican Latino voters—if not those who actually perform in the GOP primary—are Miami-Dade Hispanics. So it’s not out of bounds to poll them heavily. In the end, we relied on our professional pollster to conduct an accurate poll and I believe he delivered. When I asked if the results were outside the error margin (because they rode the line), he said they were. I generally try to note the error margin if a result is inside it (if there’s a greater chance of a tie).
Obviously, we’ll know more on Election Day.
The Times’s Smith declined to be interviewed, saying he was too busy.
Coker of Mason-Dixon Polling said he did not rely solely on his survey numbers to arrive at his conclusions. He has been polling in Florida for many years and he says he also looks at the numbers and uses his best judgment based on experience to determine the validity of his polling. (Disclosure: last year I spoke with Coker about doing a poll for a non-profit client and may use his firm in the future). Still, Coker acknowledges that a survey of 400 Hispanics would give him and his clients a better understanding of the Hispanic community than a sampling of 75. (Although, in the Times piece, Coker is quoted as simply observing that Romney has “completely flipped the table with Hispanic voters this time around,” having lost Florida in 2008 to John McCain “largely,” the Times wrote, “thanks to Romney’s anemic showing in South Florida and among Hispanic Republicans.”)
Another problem is that neither the Times nor the Herald released its complete polling results with questions and crosstabs (the Times posted poll questions online). While print news holes may be too small, there is no reason not to post all of this information on the newspaper websites. The timing of the release is also problematic. The poll was conducted during the period of January 24-26. The results were not released on the newspaper websites until very late on Saturday, January 28. The results then appeared in the Sunday paper, three days after the survey was completed. Pollsters will tell you there is simply too much volatility that close to an election to hold on to poll results for three days. Even voters surveyed on the 24th could have changed their opinion by the 25th.
The worst example I can recall of a newspaper misusing a poll occurred in 1986, during the Democratic primary for Florida governor. The Palm Beach Post, where I spent nearly all of my 28 years as a political editor, conducted a reputable, well-done poll. The executive editor ordered that the poll not run until the Sunday before the election. The newsroom howled and I was one of those who howled the loudest. Why? Because by the time we ran the poll it was two weeks old. Our pollsters were rightfully outraged. The newspaper was humiliated.
Tom Fiedler, then the political editor of the Miami Herald and now the dean of the College of Communication at Boston University, took us to task, writing in his September 14, 1986 column:
The news media began using [polls] benignly enough. When we wanted to gauge how a candidate was doing, we found we could substitute the scientifically selected random sample for the traditional man-on-the-street interview.
Where would we be today without the poll to tell us the front-runner, the long-shot and the hopeless cause in any race?
Let me make plain that I do not want to discourage news media polling. Good polls provide an independent measure of how a candidate is doing; without them, we would be at the mercy of numbers leaked to us by candidate pollsters, always for selfish ends.
But if we print a poll in the closing hours of a race—a time when voters are making up their minds in massive blocs on God knows what basis—then we must move cautiously, if at all.
What can we do? First, we in the media have to understand this monster we have hired. Political polling is a fine art, having to measure such variables as likely voters, demographic balances and—trickiest of all—the dynamics of the electorate, which can stampede at the end.
We have to be honest with readers about the shortcomings of a poll we print—and courageous enough to spike one we don’t trust.
Fiedler was right 26 years ago. And he is right today. News organizations are getting increasingly sloppy with reporting on—and addressing the shortcomings of—their own polls, not to mention asking tough questions of all the other polls that seem to pop up every day.
The National Council on Public Polls offers 20 questions journalists should ask about poll results. Let’s ask them.