When I moved from Manhattan to Charlottesville, Virginia, in the summer of 2007, the first thing I did was call The Washington Post to subscribe to home delivery. I remembered fondly the weekend visits I had made to Washington in the 1980s and 1990s, when my group of Texas-raised journalist friends and I would sit around and devour every section of the Sunday Post, lamenting that our local papers from Texas could not or would not cover their regions so lyrically and comprehensively. We would marvel at the resources the Post put into local coverage.
My fond memories of the old Post would only haunt me as I established myself as an active citizen of Virginia. By the fall of 2009, I had cancelled my Post subscription. The Post was no longer a serious newspaper, willing to make sense of its readers’ world for them. By then, the Post had closed all its bureaus in the U.S. And the newsholes devoted to Virginia and Maryland were so small as to be largely irrelevant. So the states that taxed the two largest segments of Post readers would no longer be of interest to the Post newsroom. The Post would still devote massive resources to packages, series, and stories that might win big prizes. But covering the actual news about the country and region would be a luxury the Post could not afford.
The incentive systems of daily newspapers have been askew for years. Under current conditions, they are even more absurd. The feedback mechanisms at work have created a vicious cycle: as newspapers care less about news, readers care less about the brands that deliver it. So revenue decreases, so papers cut back on news, so readers flee.
The Post will never be great again until it re-opens bureaus across the country and decides to cover, rather than ignore, Maryland and Virginia.
Who Is Out There?
Lucas Graves’s “Traffic Jam: We’ll never agree about online audience size” (CJR, September/October) makes me think that some open cookie standard would help. For one thing, the cookie would be regulated, which would be good for users. But it could be a source of clout for the social networks as well. A news viewer may access the same site from several computers, but one thing those computers have in common is the cookie they get from Twitter or Facebook or Google.
As Graves’s article implies, Nielsen’s monopoly over television ratings has raised questions for years about how numbers are generated, tracked, manipulated, and published. I’ve spent more than a few nights looking at TV ratings, week by week and year over year, and in my view, complacency with a crooked system is just as bad as promoting it. Set-top-box data? Some people have legitimate concerns about it, but it’s a shame others dismiss it just because the new technology would force market researchers to be a little better at their jobs.
Graves writes: “But Nielsen’s numbers are better than nothing at all, and that’s what radio or TV broadcasting offers: no way to detect whether 5,000 people tuned in, or 5 million.”
Since we’re in the age of digital TV, this makes little sense to me. If the cable and satellite TV companies got together, they could tally an actual count of viewers. They could also indicate how many people watch commercials (not many), which is probably why they don’t report this stuff. I imagine they are doing this kind of research anyway—for their own internal optimization purposes. But the idea that Nielsen is the only option seems wrong to me. There are plenty of ways to “detect,” either by tallying actual numbers or by drawing a statistically representative sample. It just seems that there’s no interest in doing it—at least not for public consumption.
The New Video Storytellers