DETROIT, MI — Among the old metro dailies, those battered flagships of American journalism, the most celebrated investigative newsroom in the country probably belongs to the Milwaukee Journal Sentinel.

Actually, check that—the Journal Sentinel has one of the most acclaimed watchdog teams in the country, period. The paper’s investigative unit was organized in 2006, not long before the newspaper industry went into a tailspin. Since then, the watchdog team has earned two Pulitzer Prizes. (The paper also won a third prize for an explanatory piece edited by a member of the investigative unit.) Scarcely a year goes by without media watchers paying tribute to the paper’s dogged reporting and commitment to accountability coverage (AJR, Nieman Watchdog, Nieman Lab). Perhaps most meaningfully, the Journal Sentinel maintains one of the highest rates of market penetration in the country—and it’s clear from reader surveys that the paper’s commitment to investigative journalism has a lot to do with that.

So consider this another encomium to the Journal Sentinel’s watchdog work. Less than two weeks ago the paper published a multi-part report, national in scope, on delays in hospital screenings that put newborns at risk; it’s already been called “one of the most comprehensive, readable, and visually attractive investigative series… ever seen on the web.”

But when I reached out to reporters and editors on the paper’s investigative team recently, I wanted to focus on a particular question. The unit’s mission: “You’ve got to do stories that matter and you’ve got to have impact.”

It’s the second part of that mandate that’s the real challenge. Given editorial support, reporters anywhere can spend weeks or months grinding out investigative stories on worthy subjects. But once those stories are published, how do you measure their influence—which ones are effective, which fall flat, and why? How does one of the most celebrated investigative newsrooms in the country measure impact—and how does it achieve it?

Creating impact before a story is published

The question doesn’t have a simple answer. Ellen Gabler, an investigative reporter and assistant editor, is part of the team that spent the last five months immersed in “Deadly Delays,” the newborn screening story. To some extent, all that work is an act of faith—that the information, once public, will create change, even if she can’t see what it is. “I’m not sure there is any true test for tracing the impact,” Gabler told me via email. “There are lots of ways to effect change: raising awareness, holding people accountable, changing laws—although that’s not always the best measure of success.”

Greg Borowski, the investigative team’s editor, added that while the staff “certainly [tries] to track” impact, there is fundamentally “an elusive quality. You can’t measure it on a scale.” In other words, each story has a unique set of possible impacts, and a common set of standards can’t easily be applied.

But by any standard, “Deadly Delays” is having impact. As a result of the reporting, Gabler said, the Wisconsin State Lab will begin providing hospitals with performance reports, and the state’s largest hospital chain has “instituted immediate reforms to its newborn screening program.” As she reported over the weekend, major players in the medical field nationwide are already agitating for better practices.

That seems swift for a story that’s barely a week old. But impact isn’t just a post-publication phenomenon, Borowski said.

“If you do your story right, you have an impact just by virtue of what you find, and you can include it in your story,” he said. With the newborn screening series, reporters shared information with hospital administrators about the deadly consequences of screening delays while reporting was ongoing. As Borowski put it, several of them said, “Oh my gosh, I didn’t know we had that problem,” and immediately began work to fix it.

“Some reporters might not want to tell their sources what they found until the last minute, because they don’t want them to fix it before the story runs,” he said. “But we don’t feel that minimizes the story. In fact, it validates the story as accurate and important.”

Finding ‘pressure points’ to fix a broken system

The series’ ambitious scope also contributes to its influence. The project began with a local tip, but the Journal Sentinel reached far beyond the Milwaukee region, gathering data from every state where it was available—and then reaching out to media outlets around the country to highlight the issue and show reporters how to use the data. Articles based on the Journal Sentinel’s data work have so far been published in the Rochester Democrat-Chronicle in upstate New York; The Gazette of Cedar Rapids, IA; The Des Moines Register; The Seattle Times; and MinnPost. Gabler was also interviewed on public radio in Arizona about a Phoenix hospital’s worst-in-the-nation ranking.

Anna Clark is CJR's correspondent for Michigan, Wisconsin, Ohio, and Pennsylvania. A 2011 Fulbright fellow, Clark has written for The New York Times, The American Prospect, and Grantland. She can be found online at www.annaclark.net and on Twitter @annaleighclark. She lives in Detroit.