Homepage traffic for news sites continues to decrease. This trend is the result of an “if the news is important, it will find me” mentality that developed with the rise of social media, when people began to read links that their friends and others in their networks recommended. Thus, readers are increasingly discovering news through social media, email, and reading apps.
Publishers are well aware of this, and have tweaked their infrastructure accordingly, building algorithms that change the site experience depending on where a reader enters from.
Publishers see optimizing sites for the reading and sharing preferences of specific online audiences as a good thing, since it gets users to content they are likely to care about quickly and efficiently. But that kind of catering may not be good for readers.
“We can actually act on the psychological predisposition to just expose ourselves to things that we agree with,” explains Nick Diakopoulos, research fellow at the Tow Center for Digital Journalism, where he recently published a report on algorithmic accountability reporting. “And what the algorithms do is they throw gasoline on the fire.”
Visitors who enter BuzzFeed via Pinterest, for instance, see a larger “Pin It” button, no Twitter share button, and a “hot on Pinterest” module. Medium, launched less than two years ago by Twitter co-founder Evan Williams, recommends content to readers via an intelligent algorithm primarily based on how long users spend reading articles. Recommended content sidebars on any news site are calculated via algorithm, and Facebook has a recommended news content block that takes into account previous clicks and offers similar links.
Diakopoulos sorts algorithms into several categories based on the types of decisions they make. Prioritization, for example, ranks content to bring attention to one thing at the expense of another. Association marks relationships between entities, such as articles or videos that share subject matter or features. Filtering involves the inclusion or exclusion of certain information based on a set of criteria.
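The three decision types can be sketched in a few lines of Python. The article data, field names, and thresholds below are invented for illustration; this is a toy model of the categories Diakopoulos describes, not any publisher's actual system.

```python
# Hypothetical article data; "clicks" stands in for whatever engagement
# signal a real recommendation system might use.
articles = [
    {"title": "Election results", "tags": {"politics"}, "clicks": 900},
    {"title": "Onion soup recipe", "tags": {"food", "howto"}, "clicks": 450},
    {"title": "Darfur violence", "tags": {"politics", "world"}, "clicks": 120},
]

# Prioritization: rank content, bringing attention to one item
# at the expense of another.
def prioritize(items):
    return sorted(items, key=lambda a: a["clicks"], reverse=True)

# Association: mark relationships between entities that share
# subject matter (here, overlapping tags).
def associated(item, items):
    return [a for a in items if a is not item and a["tags"] & item["tags"]]

# Filtering: include or exclude information based on a criterion
# (here, a minimum engagement threshold).
def filter_items(items, min_clicks):
    return [a for a in items if a["clicks"] >= min_clicks]
```

Note how the filtering step quietly drops the least-clicked story entirely, which is exactly the exclusion the rest of this piece is concerned with.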
“Algorithms make it much easier not just for you to find the content that you’re interested in, but for the content to find you that the algorithm thinks you’re interested in,” Diakopoulos says. That is, they maximize for clicks by excluding other kinds of content, helping reinforce an existing worldview by diminishing a reader’s chance of encountering content outside of what they already know and believe.
This type of exclusion on the internet has become known as the filter bubble, after a 2011 book by Eli Pariser. As CJR’s Alexis Fitts explains in a recent feature about Pariser’s viral site, Upworthy:
In Pariser’s conception, the filter bubble is the world created by the shift from “human gatekeepers,” such as newspaper editors who curate importance by what makes the front page, to the algorithmic ones employed by Facebook and Google, which present the content they believe a user is most likely to click on. This new digital universe is “a cozy place,” Pariser writes, “populated by our favorite people and things and ideas.” But it’s ultimately a dangerous one. These unique universes “alter the way we’d encounter ideas and information,” preventing the kind of spontaneous encounters with ideas that promote creativity and, perhaps more importantly, encouraging us to throw our attention to matters of irrelevance.
“It’s easy to push ‘Like’ and increase the visibility of a friend’s post about finishing a marathon or an instructional article about how to make onion soup,” writes Pariser. “It’s harder to push the ‘Like’ button on an article titled, ‘Darfur sees bloodiest month in two years.’”
This type of algorithm creates a news literacy issue: if readers don’t know that algorithms are shaping the content they see, they cannot make critical decisions about what they choose to read. In the print world, partisan media were transparent about their biases, and readers could therefore select which bias they preferred. Today, readers don’t necessarily know how algorithms are biased, or how nuanced the filters they receive content through really are.
“Newspapers have always been able to have an editorial voice and to possibly even affect voting patterns based on that editorial voice,” says Diakopoulos. “But what we’re seeing [now] is the ability to scale across a population in a much more powerful way.” Facebook recently did a study that found that simply showing more news in the newsfeed affects voting decisions.
Furthermore, the algorithms that social sites use to promote content don’t evaluate its validity, and so they can spread, and have spread, misinformation.