DETROIT, MI — Among the old metro dailies, those battered flagships of American journalism, the most celebrated investigative newsroom in the country probably belongs to the Milwaukee Journal Sentinel.
Actually, check that—the Journal Sentinel has one of the most acclaimed watchdog teams in the country, period. The paper’s investigative unit was organized in 2006, not long before the newspaper industry went into a tailspin. Since then the watchdog team has two Pulitzer Prizes to its credit. (The paper also won a third prize for an explanatory piece edited by a member of the investigative unit.) Scarcely a year goes by without media watchers paying tribute to the paper’s dogged reporting and commitment to accountability coverage (AJR, Nieman Watchdog, Nieman Lab). Perhaps most meaningfully, the Journal Sentinel maintains one of the highest rates of market penetration in the country—and it’s clear from reader surveys that the paper’s commitment to investigative journalism has a lot to do with that.
So consider this another encomium to the Journal Sentinel’s watchdog work. Less than two weeks ago the paper published a multi-part report, national in scope, on delays in hospital screenings that put newborns at risk; it’s already been called “one of the most comprehensive, readable, and visually attractive investigative series… ever seen on the web.”
But when I reached out to reporters and editors on the paper’s investigative team recently, I wanted to focus on a particular question. The unit’s mission is simple: “You’ve got to do stories that matter and you’ve got to have impact.”
It’s the second part of that mandate that’s the real challenge. Given editorial support, reporters anywhere can spend weeks or months grinding out investigative stories on worthy subjects. But once published, how do you measure what influence they had—which stories are effective, which fall flat, and why? How does one of the most celebrated investigative newsrooms in the country measure impact—and how does it achieve it?
Creating impact before a story is published
The question doesn’t have a simple answer. Ellen Gabler, an investigative reporter and assistant editor, is part of the team that spent the last five months immersed in “Deadly Delays,” the newborn screening story. To some extent, all that work is an act of faith—that the information, once public, will create change, even if she can’t see what it is. “I’m not sure there is any true test for tracing the impact,” Gabler told me via email. “There are lots of ways to effect change: raising awareness, holding people accountable, changing laws—although that’s not always the best measure of success.”
Greg Borowski, the investigative team’s editor, added that while the staff “certainly [tries] to track” impact, there is fundamentally “an elusive quality. You can’t measure it on a scale.” In other words, each story has a unique set of possible impacts, and a common set of standards can’t easily be applied.
But by any standard, “Deadly Delays” is having impact. As a result of the reporting, Gabler said, the Wisconsin State Lab will begin providing hospitals with performance reports, and the state’s largest hospital chain has “instituted immediate reforms to its newborn screening program.” As she reported over the weekend, major players in the medical field nationwide are already agitating for better practices.
That seems swift for a story that’s barely a week old. But impact isn’t just a post-publication phenomenon, Borowski said.
“If you do your story right, you have an impact just by virtue of what you find, and you can include it in your story,” he said. With the newborn screening series, reporters shared information with hospital administrators about the deadly consequences of screening delays while reporting was ongoing. As Borowski put it, several of them said, “Oh my gosh, I didn’t know we had that problem,” and immediately began work to fix it.
“Some reporters might not want to tell their sources what they found until the last minute, because they don’t want them to fix it before the story runs,” he said. “But we don’t feel that minimizes the story. In fact, it validates the story as accurate and important.”
Finding ‘pressure points’ to fix a broken system
The series’ ambitious scope also contributes to its influence. The project began with a local tip, but the Journal Sentinel reached far beyond the Milwaukee region, gathering data from every state where it was available—and then reaching out to media outlets around the country to highlight the issue and show reporters how to use the data. Articles based on the Journal Sentinel’s data work have been published so far in the Rochester Democrat and Chronicle in upstate New York; The Gazette of Cedar Rapids, IA; The Des Moines Register; The Seattle Times; and MinnPost. Gabler was also interviewed on public radio in Arizona about a Phoenix hospital’s worst-in-the-nation ranking.
The team’s emphasis on engaging as wide an audience as possible is evident in other ways. Persistent linking on social media helps. But so does old-fashioned attention to craft—strong narrative writing, and working to find just the right real-life example to elevate a worthy investigative report into an attention-demanding story.
And there’s a rough formula for the paper’s muckraking work, which might be summarized this way: Define the problem. Point to the solution. Show people how they can help get there.
Investigative journalism can come across as a pessimistic, even cynical, practice. But Gabler said that a “compelling and engaging tale” about a problem should include “solution pieces, where we highlight places where things are working right.” In the newborn screening story, that meant highlighting Iowa’s excellent state lab, now being held up as a model.
Stories are written in a way that gives readers information about who to call, write, or question, and suggests other ways to take individual action in response to the problem. The question made explicit in the Journal Sentinel’s stories, Borowski said, is: “What are pressure points for the system to get fixed?” Online, this shows up on the paper’s “Citizen Watchdog” page: “Your one-stop center to dig deeper.”
‘Your beat is wherever your sources are’
From start to finish, reporting is at the heart of the Journal Sentinel’s work. Some local news organizations, such as South Florida’s Sun Sentinel, have had success maintaining a firewall between the watchdog team and other reporters. But the Journal Sentinel’s investigative unit—a team of about seven, which swells to 12 if you count the local PolitiFact crew and an app developer and interactive designer—is integrated with the newsroom.
The newborn screening story, for example, came out of a tip to the paper’s medical reporters, John Fauber and Mark Johnson, who share a byline on the investigation with Gabler. Raquel Rutledge got the tip that led to her Pulitzer-winning 2009 exposé on fraud in the state’s childcare system while she was one of the watchdog team’s beat-style public investigators, covering day-to-day “quick hits,” with a consumer focus. (And talk about impact: her report prompted a new rating system for childcare quality in Wisconsin and saved the state more than $100 million between 2009 and 2011 by cutting off daycare providers who were stealing taxpayer money.)
“Our approach is to come off of the beats,” Borowski said. “We’re not going to say [to beat reporters], ‘Great idea, we’ll take that, you go back to the slog.’” The strategy is to pair beat reporters with someone from the investigative team to see the story through together. “The beat reporter has the contacts, the investigative reporter has the time and experience with piles of information,” Borowski said. “We get more projects done that way.”
To a person, the Journal Sentinel watchdog team argues that the best way to measure the impact of a story, and to amplify that impact, is to not stop reporting.
“We don’t drop off and move on to the next thing. Reporters track several months, or more, of fallout on the story, on top of their other work,” Borowski said. Maybe a new law is proposed, or an audit is conducted, because of the original story. The reporter will come back to evaluate how it’s working, or to see if there’s funding to back the cheery-sounding solutions. A successful investigation creates its own beat.
Not to put too fine a point on it, but this can be exhausting. “It consumes you,” Rutledge said. The fallout from her childcare fraud investigation has lasted four years and counting. “It’s hard to get away from the story once you’ve written about it. Even today I still get calls [with tips]. Oh my gosh, I’m sick of it, but I’ve got to keep on it.”
“It turns out your beat is wherever your sources are. … It’s why fraud is still on my plate,” Rutledge said.
Conversely, the stories that didn’t have much impact were ones without follow-up. In 2007, Rutledge worked with a photographer on a two-part story about kids who develop secondary post-traumatic stress disorder from their parents’ deployments. They traveled to Texas and put together “what I thought was a compelling story,” she said. But the story didn’t attract much attention. Rutledge was frustrated to see a much-shared piece on the same topic in Mother Jones recently, while her own story seems to have vanished into the ether.
While the Journal Sentinel’s PTSD story was more explanatory than investigative, there were still follow-up opportunities—examining how the military was or was not responding to the trauma of kids, for example—that might’ve helped it stick.
As Gabler points out, churning out epic investigations isn’t the only way to be an effective news organization—day-to-day reporting on crime, city politics, education and more are all “central to … having impact in the community.” But accountability reporting, at its best, is pivotal.
“This is about real people. We’re not doing this for fun,” Rutledge said. “People are struggling in the world, sometimes needlessly. If we can bring justice and make the world just a bit better, that’s important. This is the kind of work that improves people’s lives.”
That’s the optimism that the Journal Sentinel’s brand of investigative reporting hinges on: Solutions exist. To measure the impact of a story is to measure whether or not the community has moved closer to finding them.