In October of last year, Chris Quinn, the editor of Cleveland.com and the Plain Dealer, posted a job listing for an “AI rewrite specialist.” Quinn imagined this new employee would use an AI chatbot to turn reporting from the newsroom’s journalists into written articles, which the human hire would then fact-check. Quinn’s hope was that reporters would spend more time gathering information, less time typing it up.
Last month, Quinn filled the role with Joshua Newman—who had previously worked at LoneStarLive.com in Austin, covering the University of Texas. By mid-January, Newman was working on stories with AI’s help, using an in-house version of ChatGPT provided by the newsroom’s corporate parent, Advance Local. Quinn was pleased: the plan was working out. He wrote a letter to Cleveland.com readers, as he often does, noting the success of the AI rewrite desk—and calling out journalism schools for failing to instill in students a willingness to embrace AI. “Artificial intelligence is not bad for newsrooms,” he wrote. “It’s the future of them.”
The response was swift. “An editor for a newspaper encouraging ‘removing writing from reporters’ workloads’ should just resign,” Phil Lewis, an editor at HuffPost, posted on X. “While Quinn should be applauded for his transparency, his case against journalism education was anecdotal and full of unsupported generalizations, and he failed to explain to readers how using AI improves the journalism,” the American Press Institute declared in a column.
But Quinn remained undaunted, as did Leila Atassi—the public interest and advocacy team editor, who oversees Newman’s work and the day-to-day success of the AI-rewrite-desk experiment. Atassi, who started as an intern in the newsroom two decades ago, has since covered everything from courts to COVID. “This is the work of a real reporter,” she said of the AI rewrite desk. “It’s real accountability. AI is the assistant, but it’s not the journalist.”
Since Quinn launched the desk, Atassi said, reporters have produced the same number of stories per week as before, but they’ve saved enough time for an extra day to report in the field. She pointed to the example of a reporter named Hannah Drown, who has been covering a proposed land deal in rural Lorain County. “For months, we knew something significant was brewing,” Atassi said. “The State of Ohio had quietly been assembling farmland to attract large-scale industry, but farmers and longtime residents are worried about losing not just their acreage, but the identity of their community.” With the help of the AI rewrite desk, Drown was able to get out from behind her screen. “She sat at kitchen tables and listened to people explain what that land means to their families,” Atassi said. “The result is deeper, more textured journalism that simply doesn’t happen when reporters are just chained to production.”
Of a story produced by the rewrite desk, “obviously it’s not going to look exactly like I would have written it,” Drown told me. Still, she said, the model works well, “as long as we do our due diligence, and I give the notes and the correct and proper information that I need to.”
Stories produced with help from AI carry only a reporter’s byline unless the reporter did minimal work, like forwarding a press release or meeting transcript to the AI rewrite desk, in which case they share a byline with “Advance Local Express Desk.” “I look at AI as a tool, like Microsoft Excel is a tool,” Quinn told me. “We never did disclosures saying ‘This story was produced with the assistance of Microsoft Excel.’”
Gina Chua—the executive director of the Tow-Knight Center at the Craig Newmark Graduate School of Journalism at the City University of New York, where she focuses on the intersection of journalism and AI, and the executive editor at large at Semafor—said that while AI is better suited to editing, it can plausibly be used to write simple stories. “No, it’s not going to win any prizes, but it will be fine to communicate the information reasonably efficiently,” she said. “And it will, of course, do that even better if you have a human look at it before it goes out.”
Guardrails are in place. Atassi acknowledged that hallucinations arise—but they are addressed through a series of checks and balances. Both Newman and the reporters who feed in information verify that the resulting articles are accurate—including quotes, which they’ve found are the most likely thing for AI to screw up. “A lot of people are too quick to throw the baby out with the bathwater when it comes to those types of flaws in the technology,” Atassi said. Errors come up sometimes, but none, she told me, have made it to publication.
Despite the backlash, Quinn told me, the response from readers has been largely positive, with many appreciating the newsroom’s transparency about its use of AI. He has been firm on one thing from the start: the goal is not to replace the journalists in the newsroom or to decrease their numbers. It’s not even to increase production. Instead, it’s to grow the newsroom’s reporting and give reporters more time. (The newsroom had previously worked with a developer from Advance Local to adapt a suite of off-the-shelf software and tools, including AI scrapers, software that provides editing feedback, and a tool to assist with creating first drafts. But those tools left reporters with more stories on their hands than expected, and as a result they spent more time typing than ever.)
As for the future of the AI rewrite desk, Quinn floated an idea of assigning the role to reporters right out of college, so they could learn what makes a good story. “There’s so many things I don’t know, but we can’t know them until we do this experiment,” he said. “We’re learning a lot. We’re sponges. We’re soaking it in, but we’ve got a ways to go.”