
How algorithms decide the news you see

Past clicks affect future ones
May 20, 2014


Homepage traffic for news sites continues to decrease. This trend is the result of an “if the news is important, it will find me” mentality that developed with the rise of social media, as people began to read links recommended by friends and others in their networks. Readers are thus increasingly discovering news through social media, email, and reading apps.

Publishers are well aware of this, and have tweaked their infrastructure accordingly, building algorithms that change the site experience depending on where a reader enters from.

Publishers view optimizing sites for the reading and sharing preferences of specific online audiences as a good thing: it gets users quickly and efficiently to content they are likely to care about. But that kind of catering may not be good for readers.

“We can actually act on the psychological predisposition to just expose ourselves to things that we agree with,” explains Nick Diakopoulos, research fellow at the Tow Center for Digital Journalism, where he recently published a report on algorithmic accountability reporting. “And what the algorithms do is they throw gasoline on the fire.”

Visitors who enter BuzzFeed via Pinterest, for instance, see a larger “Pin It” button, no Twitter share button, and a “hot on Pinterest” module. Medium, launched less than two years ago by Twitter co-founder Evan Williams, recommends content to readers via an intelligent algorithm primarily based on how long users spend reading articles. Recommended content sidebars on any news site are calculated via algorithm, and Facebook has a recommended news content block that takes into account previous clicks and offers similar links.
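To make the mechanics concrete, here is a minimal sketch, in Python, of how a site might vary its share buttons and sidebar based on the referring domain. It is illustrative only; the module names and mappings are hypothetical and not drawn from any publisher’s real configuration.

```python
from urllib.parse import urlparse

# Hypothetical mapping from referring domain to the page modules to emphasize.
REFERRER_MODULES = {
    "pinterest.com": {"share_buttons": ["pin_it_large"], "sidebar": "hot_on_pinterest"},
    "twitter.com": {"share_buttons": ["tweet"], "sidebar": "trending_on_twitter"},
    "facebook.com": {"share_buttons": ["like", "share"], "sidebar": "recommended_for_you"},
}

DEFAULT_MODULES = {"share_buttons": ["tweet", "like", "pin_it"], "sidebar": "most_popular"}

def modules_for_visit(referrer_url: str) -> dict:
    """Pick which share buttons and sidebar to render, based on where the reader came from."""
    domain = urlparse(referrer_url).netloc.lower().removeprefix("www.")
    return REFERRER_MODULES.get(domain, DEFAULT_MODULES)

print(modules_for_visit("https://www.pinterest.com/pin/12345"))
# {'share_buttons': ['pin_it_large'], 'sidebar': 'hot_on_pinterest'}
```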

Diakopoulos sorts algorithms into several categories based on the types of decisions they make. Prioritization, for example, ranks content to bring attention to one thing at the expense of another. Association marks relationships between entities, such as articles or videos that share subject matter or features. Filtering involves the inclusion or exclusion of certain information based on a set of criteria.
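As a rough illustration of those three decision types, the sketch below applies each to a toy list of articles. The fields, scores, and criteria are invented for the example and are not taken from Diakopoulos’s report.

```python
articles = [
    {"title": "Election results roundup", "topic": "politics", "clicks": 900, "age_hours": 2},
    {"title": "Onion soup, perfected", "topic": "food", "clicks": 1500, "age_hours": 30},
    {"title": "Darfur sees bloodiest month", "topic": "world", "clicks": 120, "age_hours": 5},
]

# Prioritization: rank content so one item gets attention at the expense of another.
ranked = sorted(articles, key=lambda a: a["clicks"] / (1 + a["age_hours"]), reverse=True)

# Association: mark relationships between entities, here by shared topic.
def related(article, pool):
    return [a for a in pool if a is not article and a["topic"] == article["topic"]]

# Filtering: include or exclude information based on a set of criteria.
fresh_and_popular = [a for a in articles if a["age_hours"] < 24 and a["clicks"] > 100]
```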


“Algorithms make it much easier not just for you to find the content that you’re interested in, but for the content to find you that the algorithm thinks you’re interested in,” Diakopoulos says. That is, they maximize for clicks by excluding other kinds of content, helping reinforce an existing worldview by diminishing a reader’s chance of encountering content outside of what they already know and believe.
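The feedback loop Diakopoulos describes can be sketched in a few lines: if each click raises the weight of its topic, a click-maximizing ranker keeps surfacing more of the same. This is a toy model with a hypothetical weighting scheme, not any site’s actual recommender.

```python
from collections import Counter

def rank_by_history(articles, click_history):
    """Score each article by how often the reader has clicked its topic before."""
    topic_weight = Counter(a["topic"] for a in click_history)
    return sorted(articles, key=lambda a: topic_weight[a["topic"]], reverse=True)

# After a reader clicks mostly sports stories, sports keeps floating to the top
# and world news drifts out of view: the filter bubble in miniature.
history = [{"topic": "sports"}] * 5 + [{"topic": "world"}]
inventory = [{"title": "Cup final recap", "topic": "sports"},
             {"title": "Ceasefire talks stall", "topic": "world"}]
print([a["title"] for a in rank_by_history(inventory, history)])
# ['Cup final recap', 'Ceasefire talks stall']
```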

This type of exclusion on the internet has become known as the filter bubble, after a 2011 book by Eli Pariser. As CJR’s Alexis Fitts explains in a recent feature about Pariser’s viral site, Upworthy:

In Pariser’s conception, the filter bubble is the world created by the shift from “human gatekeepers,” such as newspaper editors who curate importance by what makes the front page, to the algorithmic ones employed by Facebook and Google, which present the content they believe a user is most likely to click on. This new digital universe is “a cozy place,” Pariser writes, “populated by our favorite people and things and ideas.” But it’s ultimately a dangerous one. These unique universes “alter the way we’d encounter ideas and information,” preventing the kind of spontaneous encounters with ideas that promote creativity and, perhaps more importantly, encouraging us to throw our attention to matters of irrelevance.

“It’s easy to push ‘Like’ and increase the visibility of a friend’s post about finishing a marathon or an instructional article about how to make onion soup,” writes Pariser. “It’s harder to push the ‘Like’ button on an article titled, ‘Darfur sees bloodiest month in two years.’”

These types of algorithms create a news literacy issue: if readers don’t know they are influencing content, they cannot make critical decisions about what they choose to read. In the print world, partisan media were transparent about their biases, and readers could therefore select which bias they preferred. Today, readers don’t necessarily know how algorithms are biased or how nuanced the filters through which they receive content really are.

“Newspapers have always been able to have an editorial voice and to possibly even affect voting patterns based on that editorial voice,” says Diakopoulos. “But what we’re seeing [now] is the ability to scale across a population in a much more powerful way.” Facebook recently conducted a study which found that simply showing more news in the newsfeed affects voting decisions.

Furthermore, the algorithms that social sites use to promote content don’t evaluate the validity of that content, so they can spread misinformation, and already have.

Beyond the filter bubble, algorithmic bias extends to search engine manipulation, which refers to the process undertaken by many companies, celebrities, and public figures to ensure that favorable content rises to the top of search engine results in particular regions. Though not obvious to the average Web user, it’s actually a form of soft censorship, explains Wenke Lee, director of the Georgia Tech Information Security Center.

After reading Pariser’s book, Lee and his research team set out to test the effect of personalized search results on Google. They built a tool called Bobble, a browser plug-in that runs simultaneous Google searches from different locations around the globe so users can see how results returned for the same query differ from person to person. They found that results vary based on several factors: the Web content available at any given time, the region from which a search is performed, recent search history, and how much search engine manipulation has occurred to favor a given result. Though Bobble has largely been confined to research purposes, it has been downloaded close to 10,000 times and has tremendous potential as a news literacy teaching tool.
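Bobble’s comparison step can be approximated with a simple overlap measure: given the result lists two users see for the same query, the fraction of shared links indicates how far personalization has diverged. This is an illustrative sketch with made-up result URLs, not Bobble’s actual implementation.

```python
def result_overlap(results_a, results_b):
    """Jaccard overlap between two users' result URLs for the same query."""
    set_a, set_b = set(results_a), set(results_b)
    if not set_a and not set_b:
        return 1.0
    return len(set_a & set_b) / len(set_a | set_b)

# Hypothetical top results for the same query, seen from two locations/accounts.
user_ny = ["nytimes.com/a", "cnn.com/b", "localnews.ny/c"]
user_sf = ["sfgate.com/x", "cnn.com/b", "nytimes.com/a"]
print(f"Overlap: {result_overlap(user_ny, user_sf):.0%}")  # Overlap: 50%
```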

“When we do this kind of work, there is always some pushback from people who say ‘Why should people care? Why should people care about the filter bubble or biased news?'” says Lee. “But in the print media age, if somebody was to give me a manipulated version of The New York Times, I would be able to put my newspaper next to yours and find out that mine is different. But now? You and I can very likely see different front pages of newspapers online because they are customized for individuals, and that’s pretty dangerous. Because that means I don’t have a baseline to compare what is real and what is not.”

For these reasons, the Center for News Literacy at Stony Brook University dedicates a portion of its curriculum to the filter bubble, covering issues of search engine manipulation and teaching how to search incognito in a Web browser, that is, without the browser storing your information.

Other efforts to mitigate media bias from algorithmic personalization include NewsCube, a Web service that automatically provides readers with multiple viewpoints on a given news item, and Balance, a research project at the University of Michigan that seeks to diversify the result sets provided by news aggregators (such as Google News).

Meanwhile, Diakopoulos is working on a framework for how to be transparent about algorithms, as well as processes for how they can be investigated, be it through reverse engineering by users (for which he offers methods in his report) or through policy regulation at an institutional level.

“Transparency is important for the same reason why we want our newspaper editors to be transparent,” he says. “If the purveyor of this very powerful media tool is honest with us about how they are using it, then at least we can be a little bit more trusting of them.”

And it’s also a way to give people the choice to be more media savvy–to exit the filter bubble, if they wish. “If I know your search engine works that way and I know someone else’s search engine works a different way, then I can choose which one I would prefer to use.”

Funding for this coverage is provided by the Robert R. McCormick Foundation.


Jihii Jolly is a freelance journalist and video producer in New York City.