Tow Center

NPR One: ‘We are the enemies of filter bubbles’

December 23, 2016
NPR Headquarters in Washington

In the post-election conversation about echo chambers, NPR is turning the microphone back on itself.

The team behind the NPR One app published “The Secret Sauce Behind NPR One: An Editorially Responsible Algorithm” on Wednesday, “raising the curtain” on how the app curates what it puts in front of users. NPR One is an audio app that gives users a feed of news, stories, and podcasts from public radio.

NPR wants to set an example of transparency not just for other news organizations, but for the social-media platforms as well. “We are the enemies of filter bubbles,” said Mike Oreskes, NPR’s senior vice president of news. (Oreskes also sits on CJR’s advisory board.)

The piece, written by Managing Editor Tamar Charney, Oreskes, and Chief Digital Officer Thomas Hjelm, reveals, very broadly, how personalization works on the app. NPR One offers a mix of national headlines and local news based on your location, as well as what they call “watercooler” stories. Everyone using the app sees the most important stories, but when it comes to lighter fare, the app uses factors such as listeners’ previous activity to determine what might be of interest. The app also personalizes podcast suggestions; a user’s listening behavior, Charney said, is often a far more reliable guide to what the listener might want to hear than the listener’s own explicit choices.
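NPR hasn’t published the code behind NPR One, but the tiered curation described above can be sketched in a few lines. The Python below is purely illustrative, with hypothetical function and field names: editors’ picks go to everyone, local stories are keyed to the listener’s region, and lighter “watercooler” items are ranked against past listening.

```python
# Illustrative sketch only -- not NPR's actual code. The function, fields,
# and scoring are hypothetical, based on the broad description above.

def build_feed(listener, lead_stories, local_stories, watercooler_stories):
    """Assemble a feed: editors' picks first, then local news,
    then lighter stories ranked by the listener's past behavior."""
    feed = list(lead_stories)  # everyone hears the most important stories

    # Local news keyed to the listener's location or member station.
    feed += [s for s in local_stories if s["region"] == listener["region"]]

    # Lighter "watercooler" fare: rank by affinity inferred from past listening.
    def affinity(story):
        history = listener["completed_topics"]  # e.g. {"science": 0.8, ...}
        return history.get(story["topic"], 0.0)

    feed += sorted(watercooler_stories, key=affinity, reverse=True)
    return feed
```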

The team is adamant that personalization shouldn’t block listeners from being exposed to stories outside their usual interests. Algorithms are “just another editorial tool,” they said, and NPR One’s personalization is informed by editors’ decisions about what listeners should hear: If a user listens to a story with one particular perspective one day, they should be given the option to listen to something with the opposite viewpoint the next.
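The balance rule the authors describe, offering a contrasting perspective after a one-sided story, might look something like the hypothetical sketch below; the data fields, including an editor-assigned perspective label, are invented for illustration.

```python
# Hypothetical sketch of the editorial balance rule described above:
# if a listener recently heard one perspective on a topic, surface a
# piece with a contrasting perspective the next time they open the app.

def pick_follow_up(recent_plays, candidates):
    """Return a candidate story whose perspective differs from the one
    the listener heard most recently on the same topic, if any exists."""
    for played in reversed(recent_plays):  # most recent plays first
        for story in candidates:
            if (story["topic"] == played["topic"]
                    and story["perspective"] != played["perspective"]):
                return story
    return None  # nothing to balance against
```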

As news creation and curation becomes more automated, transparency is crucial. As Nicholas Diakopoulos writes in CJR, “While such technologies enable an ostensibly objective and factual approach to editorial decision-making, they also harbor biases that shape how they include, exclude, highlight, or make salient information to users.”

NPR One has another mechanism, built into the algorithm, to guard against a user hearing too much of the same: the app will “periodically double check” that you really aren’t interested in, for instance, sports or tech.
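How that double check works under the hood isn’t spelled out; one way to picture it, purely as an assumption for illustration, is the app occasionally reinserting a story from a topic the listener has been skipping, so the disinterest is confirmed rather than locked in.

```python
import random

# Hypothetical sketch of a "periodic double check": every so often,
# slip one story from a skipped topic back into the feed so the app
# can confirm the listener really isn't interested. The probability
# and field names are invented for illustration.

RECHECK_PROBABILITY = 0.1  # assumed value, not NPR's

def maybe_recheck(feed, skipped_topics, backlog):
    """Occasionally reinsert one story from a topic the listener has skipped."""
    if skipped_topics and random.random() < RECHECK_PROBABILITY:
        topic = random.choice(sorted(skipped_topics))
        for story in backlog:
            if story["topic"] == topic:
                feed.insert(min(3, len(feed)), story)  # near, not at, the top
                break
    return feed
```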

The algorithm has been refined since NPR One launched in 2014. The team had certain assumptions, based on years of experience in broadcast radio, about the order in which people like to hear things. But tracking user behavior on the NPR One app yields much more granular data than, say, Nielsen ratings. They are also able to observe how users engage with the app and where they spend the most time. For example, Charney said she was surprised to learn that users enjoyed having a lighter story closer to the top of their feed.

Understanding what listeners are interested in has even begun to inform how stories are presented on the radio, across NPR. The team collects this knowledge and shares it with member stations featured on the app.

Public radio can set broadcasting practices that aren’t driven by business needs, Oreskes said. “Commercial platforms have an incentive to allow or even encourage filter bubbles, [and] we see breaking up those bubbles as a new way to fulfill our original mission.”

About the Tow Center

The Tow Center for Digital Journalism at Columbia's Graduate School of Journalism, a partner of CJR, is a research center exploring the ways in which technology is changing journalism, its practice and its consumption — as we seek new ways to judge the reliability, standards, and credibility of information online.
