Tow Center

The journalistic newsfeed: editorial values and algorithms

December 13, 2019
A newspaper with the headline "Artificial Intelligence" | Photo: Adobe Stock

Facebook, Google, Apple News: these are the technology companies vital to news distribution. Their algorithms, written with the values and priorities of Silicon Valley, drive the majority of online traffic to news, determining which publishers and stories gain exposure—sometimes with less-than-desirable results.  

Increasingly, news organizations are also joining this game of algorithmic curation. Maybe it’s the NPR One app picking the next story you hear, the Washington Post website recommending another article to read, or the New York Times dynamically assembling story packages for display on its website and apps. Algorithms are the new medium in which news organizations express editorial values and priorities, just as they are for the software and advertising businesses. 

Journalistic newsfeeds are an opportunity for news organizations to take back some control over distribution by writing algorithms with more traditionally recognizable editorial, ethical, and public interest values.

A journalistic curation algorithm might more closely consider diversity—the mix of content. It could improve transparency by providing clear explanations for every recommendation. Or it might mean adopting stricter accuracy standards for the sources of articles that enter a feed. When newsrooms build their own curation algorithms, they assert their independence by reducing reliance on algorithms designed by non-journalistic organizations. 
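
To make that concrete, here is a minimal sketch of such a values-driven curation step, written in Python. The article fields, the trusted-source list, and the diversity bonus are illustrative assumptions rather than any newsroom's actual system, but they show how diversity, transparency, and accuracy can become explicit parts of the ranking logic.

```python
from __future__ import annotations
from dataclasses import dataclass

@dataclass
class Article:
    headline: str
    topic: str
    source: str
    relevance: float  # 0..1, e.g. from an audience-interest model

# Hypothetical accuracy gate: only sources the newsroom has vetted.
TRUSTED_SOURCES = {"own-newsroom", "wire-service"}

def curate(candidates: list[Article], feed_size: int = 5) -> list[tuple[Article, str]]:
    """Pick articles one at a time, rewarding topics not yet in the feed,
    and keep a plain-language explanation with each pick."""
    # Accuracy standard: only admit articles from vetted sources.
    pool = [a for a in candidates if a.source in TRUSTED_SOURCES]
    feed, topics_seen = [], set()
    while pool and len(feed) < feed_size:
        # Diversity: a fixed bonus for topics the feed does not yet contain.
        best = max(pool, key=lambda a: a.relevance
                   + (0.3 if a.topic not in topics_seen else 0.0))
        reason = f"relevance {best.relevance:.2f}"
        if best.topic not in topics_seen:
            reason += "; adds a topic not yet in your feed"
        feed.append((best, reason))  # Transparency: the reason travels with the pick.
        topics_seen.add(best.topic)
        pool.remove(best)
    return feed
```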

In his keynote at the Computation + Journalism Symposium earlier this year, New York Times engineering VP Brian Hamman described the algorithmic system that curates the paper's home page and app content. The home page is itself a curated feed: lists of articles organized into blocks and packages, which form the atomic units of the layout. Ads and newsletter signup boxes can be slotted into the feed, too.

The Times’s algorithm can display more than 160 different layout combinations, adjusting to accommodate different types of articles and images and to vary emphasis. Subtle differences in labeling and typography let the algorithm use design to signal editorial importance.
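
Going only by Hamman's description, the underlying data model might look roughly like the sketch below; the class and field names are assumptions for illustration, not the Times's actual code.

```python
from __future__ import annotations
from dataclasses import dataclass, field

@dataclass
class FeedItem:
    kind: str                 # "article", "ad", or "newsletter_signup"
    slug: str
    image: str | None = None

@dataclass
class Package:
    label: str                # small label whose typography can signal emphasis
    layout: str               # one of the many layout variants
    items: list[FeedItem] = field(default_factory=list)

@dataclass
class Block:
    packages: list[Package] = field(default_factory=list)

# The feed is an ordered list of blocks; a renderer would map each package's
# `layout` name to a template with its own typography and labeling.
home_feed: list[Block] = [
    Block(packages=[
        Package(label="Breaking", layout="lead-photo",
                items=[FeedItem("article", "election-results", image="map.jpg")]),
        Package(label="", layout="headline-stack",
                items=[FeedItem("article", "local-forecast"),
                       FeedItem("newsletter_signup", "morning-briefing")]),
    ]),
]
```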

The approach allows the Times to personalize content feeds to suit different users. For instance, the Editor’s Picks section curates a feed adapted to a reader’s location. Editors first choose about 30 articles a day and then the algorithm learns which of those are more likely to be clicked in different locations. A reader in New York may see different Editor’s Picks articles than a reader in Chicago, though they’ll also see some of the same articles. 
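
Read as a two-stage design, editors supply a fixed pool and a model reorders it for each location by estimated click-through. The sketch below uses hypothetical article slugs and hand-set click-rate estimates; a real system would learn those values from logged reader behavior.

```python
from __future__ import annotations

def editors_picks_for(location: str,
                      editor_pool: list[str],
                      est_click_rate: dict[tuple[str, str], float],
                      k: int = 5) -> list[str]:
    """Rank the editor-chosen articles by estimated click-through for this location."""
    return sorted(editor_pool,
                  key=lambda slug: est_click_rate.get((location, slug), 0.0),
                  reverse=True)[:k]

pool = ["broadway-review", "midwest-flooding", "national-politics"]
rates = {("new-york", "broadway-review"): 0.09,
         ("chicago", "midwest-flooding"): 0.08,
         ("new-york", "national-politics"): 0.05,
         ("chicago", "national-politics"): 0.05}

print(editors_picks_for("new-york", pool, rates, k=2))  # Broadway review leads
print(editors_picks_for("chicago", pool, rates, k=2))   # flooding leads; politics appears in both
```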

The BBC is another organization considering the journalistic newsfeed. One recent BBC initiative aims to develop a personalized radio curation algorithm that may eventually find its way into the BBC Sounds app. Bill Thompson and Tim Cowlishaw, who work for the BBC’s research and development department, discussed the principles behind the project in an interview with CJR.

The BBC wanted its audio recommender to provide exposure to a diverse range of content, but it also wanted to curtail personalization so that people wouldn’t lose the shared experience of hearing broadly relevant content. Once made explicit, these principles could guide both the design and the choice of metrics toward ones that wouldn’t undermine the common ground between listeners that the BBC wanted to preserve.
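
One simple way to express that constraint in code, sketched below with invented numbers rather than anything the BBC has published, is to reserve a fixed share of every listener's feed for the same editorially chosen items and fill only the remainder from personal taste.

```python
from __future__ import annotations

def blended_feed(shared_items: list[str],
                 personal_items: list[str],
                 size: int = 6,
                 shared_share: float = 0.5) -> list[str]:
    """Fill a fixed share of the feed with items every listener gets,
    then top up from personalized recommendations."""
    n_shared = round(size * shared_share)
    feed = shared_items[:n_shared]                       # common ground for everyone
    extras = [p for p in personal_items if p not in feed]
    return feed + extras[: size - len(feed)]

shared = ["flagship-news", "documentary-hour", "national-debate"]
for_me = ["indie-music-mix", "tech-podcast", "flagship-news", "cricket-commentary"]
print(blended_feed(shared, for_me, size=6))
# ['flagship-news', 'documentary-hour', 'national-debate',
#  'indie-music-mix', 'tech-podcast', 'cricket-commentary']
```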

For Thompson, the goal isn’t “to capture the essence of the BBC in a set of tools or single neural network” but rather to have an ongoing awareness of how the newsroom’s values are reflected in its technology, and to continually review that technology with these in mind. That process needs metrics carefully aligned to ethical principles as much as it needs diligent and methodical people to implement it. 

Every news organization has different values, and organizations operating in different cultural and regulatory environments will need to orient their values-driven designs accordingly. In Germany, for instance, broadcasters seeking to comply with the German Broadcasting Act must consider the need to clearly label commentary and to include a mix of information, education, culture, and entertainment.
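
Purely as an illustration of how such requirements could be operationalized, and not a statement of what the Act actually demands, a newsroom might audit a proposed feed for category mix and commentary labeling along these lines; the categories and field names here are invented.

```python
from __future__ import annotations
from collections import Counter

REQUIRED_CATEGORIES = {"information", "education", "culture", "entertainment"}

def audit_feed(items: list[dict]) -> list[str]:
    """Return a list of problems; an empty list means the feed passes this (invented) check."""
    problems = []
    present = Counter(item["category"] for item in items)
    for missing in REQUIRED_CATEGORIES - set(present):
        problems.append(f"feed contains no {missing} items")
    for item in items:
        if item.get("is_commentary") and not item.get("labeled_commentary"):
            problems.append(f"'{item['slug']}' is commentary but is not labeled as such")
    return problems
```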

There’s no one set of universal values that should be designed into a journalistic newsfeed. And it may ultimately be better for society if there are a variety of news algorithms, each emphasizing different values. The New York Times’s algorithm may diverge from those of the British or German public broadcasters, and all of them would surely contrast with the Facebook feed.

Algorithm design is an important new way of thinking about media diversity in society. That raises the stakes for the people building these systems: engineers and designers of journalistic newsfeeds may well need to commit to stewardship of the values of journalism in the technologies they build. But if they do so, there’s a chance to reclaim news distribution from technology companies and ensure that commercial and public interest values are better balanced.

About the Tow Center

The Tow Center for Digital Journalism at Columbia's Graduate School of Journalism, a partner of CJR, is a research center exploring the ways in which technology is changing journalism, its practice and its consumption — as we seek new ways to judge the reliability, standards, and credibility of information online.
