How an algorithm helped the LAT scoop Monday’s quake

Everyone, that is, except those desk-diving anchors

On Monday morning, Ken Schwencke, a programmer and data reporter at the Los Angeles Times, was jolted awake at his Los Angeles home at 6:27 am by a 4.4 magnitude earthquake in Southern California. Three minutes later, at 6:30 am, a news brief about the quake appeared under his byline on the Times’ L.A. Now Blog, breaking the news three minutes before the Los Angeles Daily News and nine minutes before LA Weekly. The only journalists to beat him to it were local TV anchors, who were live on-air as the earthquake happened.

It’s not that he’s an exceptionally quick writer: Schwencke didn’t write the story; his algorithm did.

The algorithm, named “Quakebot,” lifts data from the US Geological Survey into a post that Schwencke can just hit “publish” on.

Quakebot takes a few moments to run the USGS data through a series of questions, pre-programmed by Schwencke, about the magnitude, location, and depth of the quake. With these answers, the program generates a short post and adds it to the Times’ content management system in draft status.
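The flow described above can be sketched in a few lines of Python. This is not the LAT’s actual code; the event fields, threshold, and template wording are all illustrative assumptions based on the article’s description: check the USGS event against a few pre-programmed questions, then fill a short template and save it as a draft for a human to review.

```python
# A minimal sketch of a Quakebot-style pipeline (illustrative only):
# run a USGS-style event record through simple editorial checks,
# then render a short draft brief from a pre-written template.

SAMPLE_EVENT = {
    "magnitude": 4.4,
    "place": "near Westwood, California",
    "depth_km": 7.9,
    "time": "6:25 a.m. Pacific on Monday",
}

def is_newsworthy(event, min_magnitude=3.0):
    """Editorial gate: only quakes above a magnitude threshold become drafts."""
    return event["magnitude"] >= min_magnitude

def render_brief(event):
    """Fill a template with the quake's magnitude, location, and depth."""
    return (
        f"A magnitude {event['magnitude']} earthquake struck "
        f"{event['place']} at {event['time']}, at a depth of "
        f"{event['depth_km']} kilometers, according to the "
        f"U.S. Geological Survey."
    )

def make_draft(event):
    """Return a draft post for human review, or None if it isn't newsworthy."""
    if not is_newsworthy(event):
        return None
    return {"status": "draft", "body": render_brief(event)}

draft = make_draft(SAMPLE_EVENT)
print(draft["body"])
```

The key design point the article highlights is that the algorithm stops at "draft" status: a human still hits "publish," which is where Schwencke's quick error check on Monday morning fit in.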

Though a similar brief might only take a reporter 10 minutes at most, Quakebot can spit out its stories at lightning speed. “We’re as fast as [the USGS],” Schwencke said. The posts run with this tagline at the bottom: “This information comes from the USGS Earthquake Notification Service and this post was created by an algorithm written by the author.”

Schwencke began work on the technology in 2011, and the LAT has used Quakebot for just over two years. His goal was to create a notification system that would provide quick information to readers in quake-prone California, while also incorporating maps and data to help track the tremors.

But eventually it became clear that the value of Quakebot was simple: speed. “When you feel something, the first question you have is, what was that?” Schwencke said. “Between the editors and myself, we realized that that was the most valuable part of the website.” In Monday morning’s post, for example, after quickly checking for errors, Schwencke was able to beat other local sites by three minutes or more. After it was posted, an editor added a video of news anchors reacting to the tremors, and the item became the anchor for the LAT’s live-blog of the quake.

The effort has also inspired other robo-posting projects at the Times. The technique has allowed the paper’s crime desk to update its Homicide Report blog as soon as information about a new homicide in the city is reported. “It saves a lot of time over the years for people,” Schwencke said.

Reporting by algorithm isn’t unique to the Times. Jeremy Gilbert, a professor of media product design at the Medill School of Journalism at Northwestern University, began dabbling with robo-posting in 2009, having his students write code to create baseball game recaps from results published online.

This project, called StatsMonkey, has turned into a full-fledged company, Narrative Science. The company has created a new artificial intelligence publishing platform called Quill that mines data to produce stories; clients include Forbes and Fox Sports.

Schwencke hasn’t heard of any other newspapers in Southern California using a similar type of algorithm. But after Quakebot sparked a wave of Twitter discussion among journalists on Monday, we may start to see more projects like this soon.

“Somebody today asked me if the code was available,” Schwencke said.


Joanna Plucinska is an intern at CJR.