Innovations

A day in the life of a journalist in 2027: Reporting meets AI



What might a day in the life of a journalist look like 10 years from now?

Reporting that would take an investigative team weeks or months today could take a lone journalist, aided by artificial intelligence, a single day. It’s not science fiction. The fictional scenario below was inspired by the very real technological progress detailed in a recent study by The Associated Press.

In fact, AP spent the past few months meeting with leaders in the artificial-intelligence field for an extensive report detailing the impact of AI on journalism. The document captures some of the major trends and use cases emerging so far. You can read the report here.

By 2027, newsrooms will have an arsenal of AI-powered tools at their disposal, and journalists will seamlessly integrate smart machines into their everyday work, the study predicts. Machine intelligence will be able to do much more than churn out straightforward, automated news reports. AI will allow reporters to analyze data; identify patterns and trends from multiple sources; see things that the naked eye can’t see; turn data and spoken words into text, and text into audio and video; understand sentiment; analyze scenes for objects, faces, text, or colors; and more. When journalists use those tools to enhance their reporting, writing and editing, we call it augmented journalism.


Of course, the coming wave of technological innovation is no different from those that have come before it. Success will still depend on how human journalists implement these new tools. Artificial intelligence is man-made, and all the ethical, editorial and economic considerations that go into producing traditional news content still apply in this new age of augmented journalism.


The impact of artificial intelligence on journalism will ultimately be a story of how human reporters adapt to work alongside machines. To leverage artificial intelligence for the benefit of news, the first step is to understand how the technology itself can be deployed in a newsroom setting.

So, what might that day in the life of a journalist actually look like by the year 2027? Here’s what we came up with by extrapolating technology and journalism trends highlighted in AP’s report:

 

8 am: An environmental journalist is commuting to his newsroom in a driverless car when air-quality sensors he placed throughout Springfield detect a shift in air pollution. The sensors send an alert to his vehicle’s smart dashboard: “There has been a 10 percent decrease in air quality in Springfield.”

This alert was designed specifically by the journalist, who worked with a data scientist to develop a system that feeds data into a template: “There has been a [X] [increase/decrease] in air quality in [location].”
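
For the technically curious, here’s a minimal sketch of what that templating might look like in Python; the 5 percent threshold and the sensor readings are invented for illustration:

```python
# A minimal sketch of the reporter's templated alert, assuming a sensor
# feed that supplies a current and a baseline reading for a location.
# The 5 percent threshold is an invented example.

def build_alert(location: str, baseline: float, current: float):
    """Fill the alert template when the change crosses a threshold."""
    change = (current - baseline) / baseline * 100
    if abs(change) < 5:
        return None  # no alert worth sending
    direction = "increase" if change > 0 else "decrease"
    return (f"There has been a {abs(change):.0f} percent {direction} "
            f"in air quality in {location}.")

print(build_alert("Springfield", baseline=80.0, current=72.0))
# There has been a 10 percent decrease in air quality in Springfield.
```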

Seeing the alert, the journalist deploys a pair of drones equipped with water- and air-quality testing kits to verify the reading.

 

8:30 am: As he nears the newsroom, a computer tracking social media alerts the journalist to increased chatter about air pollution and children suffering from asthma attacks in Springfield. The computer also detects a high rate of posts on the topic from a cluster (a group of users with similar demographics) of local mothers.

The journalist receives a notification that reads: “[Mothers in Springfield] are expressing [great] [concern] about [air pollution] and [their children.]”

The computer understands when individual posts are positive or negative, and whether they are referencing a person, place, or event. It detects similarities among different posts and even analyzes trends across large volumes of historical social media content.
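
A toy version of that social listening step, in Python: score each post’s sentiment against a small lexicon and tally the topics driving the negative chatter. A real system would use trained NLP models; the posts and lexicon here are invented.

```python
from collections import Counter

NEGATIVE_TERMS = {"asthma", "pollution", "coughing", "worried", "smog"}

posts = [
    "So worried about the smog today, my son's asthma is back.",
    "Air pollution in Springfield is awful, kids coughing all night.",
    "Lovely morning for a run by the river!",
]

def tokens(post):
    """Lowercase words with trailing punctuation stripped."""
    return {w.strip(".,!?").lower() for w in post.split()}

def sentiment(post):
    return "negative" if tokens(post) & NEGATIVE_TERMS else "other"

# Tally which negative topics appear across the worried posts.
topic_counts = Counter(
    w for p in posts if sentiment(p) == "negative"
    for w in tokens(p) & NEGATIVE_TERMS
)
print(topic_counts.most_common())
# e.g. [('asthma', 1), ('smog', 1), ('pollution', 1), ...]
```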

 

9 am: The journalist arrives at the office and asks his computer, out loud, to display the results of the water- and air-quality tests collected by his drones.

He inputs the data into a data frame (a sort of programmable spreadsheet) and instructs a program to determine whether today’s figures are statistical outliers. He confirms that today’s pollution rates are abnormally high compared to historical trends.
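
In today’s terms, that outlier check could be as simple as a z-score against the historical series; a Python sketch, with invented readings:

```python
# A z-score check of the kind that program might run: is today's
# reading abnormal against the historical series? Readings invented.
from statistics import mean, stdev

historical = [41, 39, 44, 40, 42, 38, 43, 41, 40, 42]  # past daily readings
today = 61

z = (today - mean(historical)) / stdev(historical)
print(f"z-score: {z:.1f}")          # z-score: 11.0
if abs(z) > 3:                      # a common outlier cutoff
    print("Today's reading is a statistical outlier.")
```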

The journalist also tweets to one of the mothers who has been posting about air pollution and asks for an interview later in the day to discuss her concerns.



 

10 am: The journalist projects a series of images from his desktop onto his augmented reality headset.

With an increased field of vision and hundreds of photos “floating” in the air, he determines there has been a decrease in visibility (an indicator of high pollution) around a newly constructed factory over the previous few days.

He downloads images from a series of robotic cameras posted throughout the region and uses computer vision (algorithms that can identify and interpret the contents of a photo or video) to compare photos of the area around the factory over time.
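
One crude proxy for that comparison: haze tends to flatten an image’s contrast, so a drop in grayscale variation between photos of the same scene can hint at reduced visibility. A Python sketch using the Pillow imaging library (real computer-vision pipelines are far more sophisticated, and the file names here are placeholders):

```python
from PIL import Image, ImageStat  # pip install Pillow

def contrast(path):
    gray = Image.open(path).convert("L")      # grayscale
    return ImageStat.Stat(gray).stddev[0]     # pixel standard deviation

before = contrast("factory_cam_last_week.jpg")
after = contrast("factory_cam_today.jpg")
if after < 0.7 * before:  # assumed threshold: a 30 percent contrast drop
    print("Visibility around the factory appears to have dropped.")
```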

 

11 am: The journalist searches public records using an automated assistant.

He uses an AI-powered text analysis tool to comb through thousands of government records and permits. The smart assistant highlights potential red flags: recorded fines, permit revocations, public criticism, or legal trouble.
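
A radically simplified version of that scan, in Python: search each record’s text for terms associated with enforcement trouble. The records and term list are invented.

```python
import re

# Terms that suggest fines, revocations, or legal trouble.
RED_FLAGS = re.compile(
    r"\b(fine[ds]?|penalt\w+|revok\w+|revocation\w*|violation\w*|lawsuit\w*)\b",
    re.I)

records = [
    "Permit #4471 renewed without incident.",
    "Operator fined $50,000 for falsified emissions test results.",
    "Permit revoked after repeated violations at the Fairview site.",
]

for record in records:
    hits = RED_FLAGS.findall(record)
    if hits:
        print(f"FLAGGED ({', '.join(hits)}): {record}")
```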

The reporter discovers that the factory owner has been fined for cheating emissions tests at other locations, and suspects that the same thing may be happening in Springfield.

He calls the public relations firm representing the factory owner to get their side of the story. The journalist suspects the representative may be hiding something: voice-analysis technology flags the tone of the person on the phone as “tentative” and “nervous.”

 

12 pm: The journalist is hungry and…

…he asks his AI-powered food delivery service to recommend a recipe based on the ingredients he inputs. The system analyzes the chemical properties of those ingredients and recommends an optimized recipe. Twenty minutes later, a drone drops his meal on a special landing platform extending from his newsroom window.

 

1 pm: The journalist, done with lunch, asks his automated assistant to run another public-records search.

Combing through marriage and birth certificates and social media data, the assistant finds records suggesting that the CEO of the company responsible for building the factory is a distant relative of the woman assigned to conduct factory tests in Springfield.

The computer relies on an algorithm that can make sense of public records and derive relationships between elements of the text. The algorithm depends on natural language processing, which parses text and infers connections among people, places, and things. In real time, the reporter can visualize a digital family tree and trace the links between the two parties.
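
A sketch of that last step, assuming the NLP stage has already extracted (person, relation, person) triples from the records; the names and connections are invented, and it uses the networkx graph library:

```python
import networkx as nx  # pip install networkx

# Each edge is one relationship extracted from a public record.
g = nx.Graph()
g.add_edge("Factory CEO", "J. Doe", relation="sibling")        # birth record
g.add_edge("J. Doe", "M. Doe", relation="spouse")              # marriage record
g.add_edge("M. Doe", "Factory inspector", relation="cousin")   # birth record

# The "family tree" link between the two parties is a path in the graph.
path = nx.shortest_path(g, "Factory CEO", "Factory inspector")
print(" -> ".join(path))
# Factory CEO -> J. Doe -> M. Doe -> Factory inspector
```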

 

2 pm: The journalist puts on a virtual-reality headset, takes control of a pair of drones and flies them over the area he is investigating. He discovers that a pipe at the factory is split in two and spewing some sort of substance into the air. He also sees a protest going on.

An algorithm analyzing the drone’s live video stream detects crowd density and notes that the factory’s security personnel are heavily concentrated in one area. The journalist sees that they have set up barriers around one small section of the industrial lot; a digital heat map leads him to conclude that the split pipe is behind those barriers.
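
The heat-map idea reduces to binning detected people into grid cells and finding the densest one; a toy Python version, with made-up coordinates:

```python
# Bin the (x, y) positions of people detected in the video feed into
# coarse grid cells and report the densest cell. Coordinates invented.
from collections import Counter

detections = [(12, 8), (13, 8), (12, 9), (14, 8), (55, 40), (13, 9)]
CELL = 10  # grid cell size, e.g. meters (assumed)

density = Counter((x // CELL, y // CELL) for x, y in detections)
cell, count = density.most_common(1)[0]
print(f"Densest cell {cell}: {count} detections")
# Densest cell (1, 0): 5 detections
```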

The journalist gets back in his driverless car and heads to the factory to investigate in person. Security prevents him from approaching the barrier hiding the chemical leak, but two factory workers say they have had difficulty breathing since that morning, and neither recalls seeing an inspector since the factory began operating.

 

3 pm: The journalist stops at a local cafe on the way back to the office to speak with the mother who had been tweeting that morning about her two young children and their sudden asthma attacks.

The mother tells the journalist that her children were sleeping with their windows open and woke up in the middle of the night complaining of respiratory problems. She shows a video of her children coughing repeatedly.

Sitting in his car on the way back to the newsroom, the journalist runs a voice recording of the interview through his sentiment analysis system, which determines the mother’s tone to be “genuine” and “analytical.”

 

4 pm: The journalist calls the firm again, but this time, the representative declines to comment on the matter.

The journalist, now ignored by his human sources, conducts additional research. Luckily, he has a stock of press releases pertaining to previous accusations that the factory and the health department contributed to pollution in local neighborhoods. He runs those documents through his smart analysis system and finds that neither party ever issued any form of apology.

 

5 pm: The journalist dictates his story to an app on his smart computer that formats and spell-checks the text. His editor gets a notification, reviews the draft, and approves the story.

In a matter of minutes, he has produced a well-researched investigative story about the increased pollution levels in Springfield, including quotes from the factory workers and the mother he met at the cafe.

The report, “Data show extent to which negligence and environmental code violations contribute to air pollution in Springfield,” supports the workers’ and the mother’s accounts of factory-related breathing trouble with the air and water samples he collected during the day.

The article is distributed on every platform—smartphone, smart watch, smart car, smart mirror, everything “smart”—and generates thousands of views in an engaged, local community. The following morning, a new health inspector is dispatched to the factory, notes several code violations and shuts it down indefinitely.

Francesco Marconi and Alex Siegman are the authors. Marconi is the manager of strategic planning and development at The Associated Press and an innovation fellow at The Tow Center for Digital Journalism at Columbia University. Siegman is a computational journalist and a master’s candidate at Columbia University’s Graduate School of Journalism.