Tow Report

Local audiences consuming news on social platforms are hungry for transparency


Executive Summary

As more and more people get at least some of their news from social platforms, this study showcases perspectives on what the increasingly distributed environment looks like in day-to-day media lives. Drawing from thirteen focus groups conducted in four cities across the United States, we sample voices of residents who reflect on their news habits, the influence of algorithms, local news, brands, privacy concerns, and what all this means for journalistic business models.

 

Key findings include:

Tech platforms and news habits

Even people who frequently access news via tech platforms do not think of those platforms as news sources in their own right. Instead, participants of all ages typically reported that their encounters with news on tech platforms were usually unintentional and/or a byproduct of visiting the platforms for other purposes, as opposed to actively going there to consume news.

  • Convenience is key to audiences’ ongoing willingness to engage with news via platforms.

 

Algorithms

While the term “algorithm” came up relatively infrequently among our focus group participants, associated themes were very prominent throughout conversation, indicating that awareness—and at times concern—about platform algorithms extends to everyday news audiences.

  • News audiences’ understandings of algorithms are varied and often simplistic and/or inaccurate, reflecting platforms’ general opacity about their functionality.
  • At one end of the scale there was an observable lack of awareness about the existence and/or purpose of the algorithms that control the flow of news on users’ news feeds. At the other end, where people had some awareness of algorithms, we frequently observed a) a framing of them as “filters” that create one-sided filter bubbles or b) a perception that they, as autonomous individuals, wield more power over what news they see on platforms than the platforms’ algorithms do.
  • It is not unusual for people to underestimate platforms’ algorithms, taking them at face value and/or viewing them in overly simplistic terms. This also manifests in people overestimating the degree of control they have over their news feeds.
  • Some people have developed habits intended to limit their exposure to platforms’ algorithms.
  • Multiple participants described how they had abandoned platforms, curtailed their usage, or considered leaving a platform due to a lack of transparency about algorithm changes.

 

Local news

Whether explicitly or implicitly, participants often framed platforms as environments where local news was subjugated to national/international news.

  • Participants frequently stated that they did not feel local news stories surfaced on their news feeds, even if they followed people from their local area—or that only really big local stories tended to achieve any kind of visibility.
  • Audience members were able to pinpoint platforms they consider strongest for local news.

 

News outlets and brand 

Audiences claim they can—and do—recognize publishers’ brands on platforms. However, they may not always be as diligent in verifying the legitimacy of content as they believe.

  • Not only did a number of people admit to redistributing misinformation and fake news, but some also outlined questionable verification techniques that may leave them more susceptible to certain types of fake news than they realize.
  • There was an element of the “third-person effect” in some people’s assessment of brand visibility on platforms. Some participants recognized a problem but argued that while they are proficient in recognizing news brands and not recirculating content from unscrupulous sources, other people are less so.

 

Platforms, brand, and fake news

Fake news was not an intended focus of this study, yet the subject was raised repeatedly in all of our discussion groups, highlighting its pervasiveness in everyday discourse. Use of the term was varied and inconsistent, covering “imposter content,” “fabricated content,” and, at times, news/outlets with which people did not agree.

  • Audiences intrinsically linked fake news to social platforms and typically blamed them for its dissemination rather than the individuals sharing it.
  • There were divisions over whether or not platform companies have an ethical obligation to intervene in the dissemination of misinformation.
  • Platforms risk tarnishing their reputations if they attempt to push out ineffective, quick-fix solutions to problems.
  • Perceptions among some audience members that platforms are strongly swayed toward one particular end of the political spectrum highlighted the challenge these platforms face in persuading audiences that their efforts are in any way neutral.

 

Privacy on social platforms

The issue of privacy on tech platforms evoked strong reactions across our focus groups, the overwhelming consensus being that platforms’ data collection practices are too opaque and a cause for concern (or, as some put it, “scary” and “creepy”).

  • Despite these concerns, there was an air of resignation about platforms’ data collection practices and the lack of transparency around them. Data collection has become normalized to the extent that it is viewed as a price to be paid for accessing these services without charge.
  • Concerns about privacy among news audiences, and/or their willingness to relinquish data to tech companies, appeared to be partially generational. Participants who were more accepting of platforms’ data collection practices tended to come from the youngest groups. However, it is important not to over-generalize or stereotype younger audiences—while some younger participants expressed this sentiment, it was far from universal.

 

Business models 

This study did not set out to investigate audience attitudes toward business models. However, discussions about some of the most prominent ways in which news outlets are attempting to generate revenue emerged organically over the course of our focus groups.

  • Although participants frequently expressed guilt for not paying for the journalism they consume online and via tech platforms, noting that they “should” do so, many outlined strategies for avoiding paywalls and other forms of payment.
  • Tech platforms may (however inadvertently) be helping to perpetuate the view that journalism is/should be free.
  • On the relatively infrequent occasions that native advertising was discussed, it was typically framed as a form of deception.
  • Sponsored links—often labeled “recommended links” or “partner content”—were viewed with suspicion and/or disdain, and may damage the reputations of outlets that carry them.

 

Conclusions and recommendations

While our overall study complicates any notion of a singular audience with singular wants, it offers insights from varied perspectives that may be of value to both publishers and platforms.

  • Publishers and platforms interested in rebuilding and maintaining relationships of trust with audiences should invest in media literacy that includes a) skills for verifying brands, b) algorithm literacy, and c) privacy literacy. Effectively tackling these areas will require a shift in attitude and strategy for platform companies—reluctant companies should note the risk of losing users alienated by the opacity of their operations. However, it must be noted that algorithmic transparency is required before algorithmic literacy can be achieved.
  • Platforms should note that strategies to prolong engagement by exposing users only to perspectives with which they agree may backfire, as some people turn away from platforms due to perceived echo chambers.
  • Additional research is needed to monitor existing efforts to increase the visibility of local news on social platforms, though there is likely a need for platform companies to do more in addressing this critical element of the news ecosystem.
  • Platforms and other stakeholders committed to verification should take note of public skepticism regarding quick fixes to the challenge of fake news, and of the nuance required to address not only “imposter content” and “fabricated content” but also the presence or absence of partisan content.
  • Publishers should approach business models such as native advertising and sponsored links with caution given their potential to jeopardize relationships of trust with readers. However, additional research and a dedicated study of audience attitudes toward journalistic business models would be valuable.

Introduction

Platform companies talk about “users.” Publishers pursue “scale.” Both somewhat dehumanizing terms refer to people—people who make up news audiences, people who were once categorized in slightly more personable terms as listeners, viewers, or readers.

An ever-increasing amount of listening, viewing, and reading now takes place on social platforms. A recent study by the Pew Research Center found that two-thirds of Americans get at least some of their news from social platforms, with one in five reporting that they do so often.[i] A survey by the Reuters Institute for the Study of Journalism found that just over half of Americans had used social platforms as a source of news within the last week.[ii]

These figures should not come as a surprise. News organizations frequently describe a strategy of meeting their audiences where they are. That increasingly means places like Facebook, YouTube, Instagram, LinkedIn, Snapchat—social platforms with user bases ranging from the hundreds of millions to the billions—and Apple News, a news aggregator automatically installed on every iPhone updated in the last two years. In keeping with another of the major shifts that has transformed the media landscape, a number of these platforms, including Snapchat, Apple News, and WhatsApp, exist only on mobile. The emergence of these (mostly closed) platforms, together with the inexorable rise of the smartphone, has resulted in a third wave of technological change in journalism, following the advent of the commercial internet and the rise of Web 2.0.[iii]

While this swift, destabilizing reimagining of the news ecosystem has led to something of an existential crisis among traditional news organizations, audiences have, in some regards, never had it so good. They certainly have more choice than ever before, not just in terms of what they consume, but also the platforms on which they do so. By far the most dominant is Facebook. According to Pew, forty-five percent of the U.S. population gets news from Facebook. Factor in Instagram and WhatsApp, also owned by Facebook, and that figure jumps to fifty-two percent.[iv] The Reuters Institute places that figure at fifty-four percent globally (including Facebook Messenger).[v]

This ability to consume more journalism in more formats on more platforms via more devices than at any time in history has fundamentally changed news audiences’ relationship with the institutions that create it. News follows audiences rather than the other way around—whether audiences want it to or not. More people discover news through algorithms than through (human) editors.[vi] Technology companies are the new gatekeepers of information. This has created at least as many problems as it has solved. It is arguably no coincidence that “news fatigue,” “fake news,” “alternative facts,” and “echo chambers” have become buzzwords at exactly the time the world of news has been turned on its head by unfathomably large and powerful technology companies.

This report—the product of thirteen focus groups conducted in four cities across the United States—is part of an effort to better understand how news consumers are making sense of this new, distributed environment. It is structured around the six core pillars that underpin our study.

We begin with a brief overview of audiences’ platform-based news habits. This is followed by an in-depth exploration of their understandings of, and attitudes toward, algorithms, the mysterious black boxes that play such a key role in shaping when and how news gets surfaced via tech platforms. We then turn our attention to local news, paying particular attention to our participants’ perceptions of its visibility on platforms. In the following section, we discuss audience perceptions of news outlets’ brand visibility on platforms, a subject that unintentionally but unsurprisingly prompted a number of lengthy discussions about the role of platforms in the rise of fake news. Covered next is the issue of news consumers’ privacy on tech platforms, a topic that evoked some of the strongest reactions across our focus groups. Finally, we present a brief overview of audiences’ attitudes toward some of the business models that news outlets are attempting to utilize in the post-platform age.

Methodology

This report is based on the analysis of transcripts taken from thirteen focus groups conducted by the Tow Center for Digital Journalism between March and June 2017. These focus groups were composed of fifty-eight adults from across the United States (twenty-nine women and twenty-nine men) and were designed to explore a) how general audiences consume news via tech platforms and b) audience attitudes toward the increasingly powerful role tech platforms play in the contemporary news ecosystem. Accordingly, the only eligibility criterion was that participants had experience encountering news via these means.

Focus groups were conducted in four cities: San Francisco, California (March 28 and 30), Bowling Green, Kentucky (April 10–12), Elkhart, Indiana (May 3–4), and New York, New York (March 23 and June 7–8). These locations, on the East and West Coasts, and in the northern and southern Midwest, were chosen so as to achieve a degree of regional diversity. This study was designed to separate groups by age range (18–29, 30–49, and 50 and over). While this was largely achieved, there were occasions when, for practical or logistical reasons, participants joined groups that did not reflect their age range. This, however, was kept to a minimum and avoided wherever possible.

All focus groups lasted approximately ninety minutes and were moderated by two of three facilitators (Dr. Andrea Wenzel, Dr. Pete Brown, and Dr. Meritxell Roca-Sales). They followed the same structure and were divided into two sections, each organized around a card-based focusing exercise.[vii] The first was a two-part exercise designed to explore how participants consume news and how new distribution techniques have impacted their news consumption. To provide comparative, memorable timeframes, we based the exercise around the 2012 and 2016 U.S. Presidential elections. Participants were presented with a series of cards, each representing a different traditional/legacy or contemporary platform, and asked to discuss if/how they used each platform for news before working together to rank them by importance. In the second exercise, which was designed to explore attitudes toward some of the practical and ethical challenges associated with this transformation, participants were presented with a series of statements, which they were then asked to discuss and place on a continuum between “Strongly agree” and “Strongly disagree” (e.g., “When I see news on social platforms, I don’t always notice which news outlet it came from”).

As with all focus group research, the goal of this study was not to make generalizations about a population.[viii] Rather, we set out to explore the understandings, attitudes, and meanings that news audiences apply to tech platforms and those platforms’ role in the contemporary news ecosystem. Focus groups allow this to be achieved with a degree of depth that is not afforded by large-scale, quantitative methods such as surveys. It must, however, be acknowledged that our core objective was to uncover and explore salient themes and identify areas for further study, rather than to apply our findings to the United States as a whole.

 

Setting the Scene: Platform-Based News Habits

All of the participants[1] in our study were recruited on the basis of having experience receiving news via tech platforms. However, very few framed their platform-based news consumption as part of a regular routine or a concerted effort to find news.

Instead, while participants invariably had experience using multiple platforms, most reported that their encounters with news were typically unintentional and/or a byproduct of visiting the platforms for other purposes, as opposed to actively going there to consume news.

So as I’m looking through what my friends are doing that day [on Facebook], the news will pop up. (Bowling Green, Kentucky; 50-plus-year-olds)

I [don’t] really use Instagram, not for news, but there were a lot of things that happened during the election that would end up on Instagram. (San Francisco, California; 30–49-year-olds)

Sometimes I find myself there [Apple News] by accident because Twitter automatically sends me there, and the format is friendly enough, but I don’t seek it out. (Elkhart, Indiana; 30–49-year-olds)

This finding is in keeping with recent research into 18–24-year-olds’ news habits, which found that “News is frequently encountered by accident and in interstitial moments, as young people dip into flows of news across various platforms.”[ix] Our study suggests that this behavior is applicable to news consumers across an even broader age range.

Overall, three platforms dominated conversation: Facebook, Twitter and, to our slight surprise, Reddit. One participant summed up an attitude that many shared, describing Facebook as “all-encompassing” (New York, New York; 18–29-year-olds), due to its facilitation of public sharing and debate among friends, engagement with news organizations and journalists, and private sharing and debate in the confines of Messenger and private groups.

Across locations there were some predictable differences between age groups, with those from the younger group (18–29-year-olds) typically the only ones to encounter news on Snapchat. People in this age group and the middle tier (30–49-year-olds), albeit to a lesser extent, also reported fluid, complex relationships with a wide variety of platforms. For example, a group of 18–29-year-olds in New York described a multi-platform approach that included routinely going to Twitter for news and headlines; searching YouTube for videos (or to watch live election debates); visiting Reddit for commentary and debate; and using Facebook, Messenger, and WhatsApp to share links and debate with friends—all while occasionally “falling into” news on Snapchat Discover.

Although participants did not frame tech platforms as news sources in their own right, a striking number of people made references to platforms having become more news-oriented. One person asserted that news had “just invaded social media,” prompting a fellow group member to agree: “Yeah. I hear that. It definitely has invaded the social sphere,” (New York, New York; 18–29-year-olds). This extended to specific platforms, too. For example, one participant said, “It’s like at that point when Snapchat became more than Snapchat, like the news bit and stuff like that . . . I definitely fell into some Vox articles or somebody’s articles,” (New York, New York; 18–29-year-olds).

Another said of Twitter:

Yeah, a lot of people [originally] used it because you got to feel a little more connected with these celebrities, these famous people that you really admired. I kind of feel like over time, once Twitter got more popular, that’s not really the trend anymore. It is used now for news, but originally it was just to be connected with those [celebrities]. (Bowling Green, Kentucky; 18–29-year-olds)

Explanations for how and why tech platforms have become conduits in participants’ news consumption habits varied, but one theme ran through almost all of them: convenience (see Associated Press et al., 2016, for a discussion of how convenience can lead to engagement).[x] For many, this was primarily due to platforms’ ability to centralize news in one place. Asked why he thought he now got more news via Facebook than in the past, William said, “Just because it’s convenient. Probably habit, just easier to do. It’s just right there,” (San Francisco, California; 50-plus-year-olds).

Indeed, the convenience of news aggregation, having content gathered into one central location, recurred and played out in different ways. A participant in San Francisco waxed lyrical about Apple News because it eliminated the need to open and navigate multiple publishers’ consumer apps: “I don’t even have them [news apps] on my phone anymore, with the Apple News. Apple News is like everything together. So before I used to have like KGO, ABC. Everything was on my phone. So I was clicking on the app to get the news. [But now] I use Apple News more,” (San Francisco, California; 30–49-year-olds).

This centralized source of news is also valued, across audiences of all ages, because of the variety and sense of control that results from being able to select content that appeals to people on demand. As one participant told us, “I know that I can just pass it up if it doesn’t interest me. So, my husband will turn it to a certain channel on the television at night for news. I don’t want to just sit there and watch that. . . . I’d rather use my Facebook to pick what I want to look at,” (Bowling Green, Kentucky; 50-plus-year-olds).

Tech platforms were also portrayed as appealing and convenient vehicles for news because of the speed with which they allow content to be a) navigated and b) shared. Reflecting on the 2016 U.S. Presidential election, one person said, “[Snapchat] always had ten to fifteen snaps on Trump and the election—and they’re quick, so you can just . . . if it’s boring you can go to the next one and you can be on it for like ten seconds,” (San Francisco, California; 18–29-year-olds). This appeal also rang true for the ease and speed with which platforms allow people to share and discuss news content among their friend groups. At its most basic level, this was a point that transcended age groups and locations.

For example, there was universal agreement among the oldest group in Bowling Green when a participant noted that she and her friends increasingly share and discuss news via Facebook rather than face to face: “The way I look at it, I think word of mouth is slowly moving into social networking. That’s the way a lot of this word of mouth is now,” (Bowling Green, Kentucky; 50-plus-year-olds). Likewise, a college student in New York argued, “[Chat apps] just kind of replaced word of mouth . . . that’s what we used chats for: just talking about the election, posting articles in there,” (New York, New York; 18–29-year-olds).

Indeed, for some it is the private nature of these exchanges—typically termed “dark social”[xi]—together with the speed with which they can be initiated that holds so much appeal:

I do love that about social media, because we have an ongoing group chat with friends where when we see something we share it amongst ourselves and we discuss it. Not openly so the rest of the world can comment, but just amongst us. And I do love how easy it is to do that and to share these ideas without having to sit there and talk about, “Oh, guess what I read today?” or “Blah, blah, blah.” Just share your link and go. (Bowling Green, Kentucky; 30–49-year-olds)

Extending this point further, though, participants spoke positively of the way in which some platforms remove a step from this process, enabling them to quickly share and discuss news without having to leave the confines of the app. Reflecting on his use of Snapchat Discover during the 2016 U.S. Presidential election, Adil said:

I used it [Snapchat Discover] a lot actually, yeah. I think the coverage was great for Snapchat. Discover was really good and pretty much friends would snap you the big story. They wouldn’t text you anymore, they would just snap it real quick—like, “This is real life, you know?” I would say Snapchat was a great thing. (New York, New York; 18–29-year-olds)

Adding another interesting layer to this, Graham spoke of Snapchat’s role in gamifying news discovery within his social circle:

I do [use Snapchat for news] and . . . all of my friends do, so it’s like a thing where we just want to communicate about what we’re seeing on Snapchat. So we will frequently go in there and talk to one another about articles that we find. It’s kind of like almost a game for us. (Bowling Green, Kentucky; 18–29-year-olds)

Algorithms

No discussion of tech platforms and the news industry is complete without a conversation about algorithms, so pivotal are they to the process through which content is surfaced, prioritized, and distributed. Algorithms are at the core of almost every platform. They are the “all-powerful” mechanisms through which platforms seek to retain their users on-platform.[xii] News of the slightest tweak of an algorithm can lead to sharp, observable shifts in publishers’ distributed content strategies (see, for example, the relationship between Facebook’s promotion of video, in January 2017, and the “pivot to video” that followed).[xiii]

Described variously as “opaque,”[xiv] “unpredictable,”[xv] and “secretive,”[xvi] platform algorithms are also highly controversial. They are a source of huge frustration for many publishers, who are “producing more content than ever, without knowing who it is reaching or how—they are at the mercy of the algorithm.”[xvii] On the other side of the fence, algorithms also bring a degree of mystery to news consumers who have “no way of knowing how or why [news] reaches them, how data collected about them is used, or how their online behavior is being manipulated.”[xviii]

 

Understandings of Algorithms Are Varied and Often Inaccurate

The term “algorithm,” which our moderators deliberately avoided using, cropped up organically in four of the thirteen groups. However, the themes that platform algorithms embody formed a central strand of our study.

Across our research, awareness and understandings of how tech platforms mediate the flow of news via algorithms varied greatly. Through conversations, a number of themes emerged highlighting participants’ understanding of algorithms and attitudes toward them. These themes, which we explore in the sections below, included: a lack of awareness about the existence and/or purpose of algorithms; the framing of algorithms as “filters” that do not provide variety, thereby creating filter bubbles; and the defense of platforms through the arguments that a) users wield more control over their news feeds than algorithms do and/or b) users’ friends are the biggest factor in shaping their news feeds.

Some Users Are Adamant That They Are in Control of Their News Feeds

In one exercise, participants were asked to assess a series of statements as a group before attempting to reach a consensus about the extent to which they agreed with them. During one such discussion in Elkhart, Indiana, a couple of participants launched a spirited defense of social platforms while debating the merits of the statement: “Social networks don’t provide a varied enough news diet. I only tend to see things personalized to my interests.” Such exchanges provided valuable insight into the ways in which people view a) the relationship between the active choices they make on social platforms (liking, following, etc.) and the role of algorithms in shaping their news feeds, b) the balance of power between the two, and c) the degree of control they are able to exert.

Simon: I’m confused still. I’m trying to process it. Well, the second part [“I only tend to see things personalized to my interests”] is voluntary. It’s up to me what I see. What the social networks provide, I don’t know, I guess that’s also voluntary because you’re following who you want to follow.

Rick: Yeah.

Simon: It’s not really Twitter’s fault. It’s not Facebook’s fault. They have their faults, but what I see may not be their fault.

Rick: Oh, man. I don’t customize. I don’t subscribe or like . . . subscribe or like, follow certain . . . yeah, it’s a tough . . . it’s kind of a two-pronged. I guess, for me, I am able to pick and choose what I want. I don’t have a lot heavy on this [subject] or a lot heavy on that [subject]. So it’s not really customized to my interests. I know that’s not the actual answer you’re looking for, but I’m able to go through and pick and choose I guess [what I see].

(Elkhart, Indiana; 30–49-year-olds)

This sentiment was observed elsewhere. For example, in Bowling Green, Kentucky, another participant absolved Facebook of responsibility for content in his news feed, arguing that he had cultivated it entirely for himself:

What I’m seeing isn’t super varied, but it’s based upon whom I’ve decided to follow and whom I’ve decided to unfollow. I wouldn’t be like, “Oh, Facebook is doing this to me,” because I feel like it’s myself doing it to me—except the commercials that they put on there, which are always based on my latest Google searches. (Bowling Green, Kentucky; 18–29-year-olds)

These discussions occasionally resulted in comparisons of competing platforms. During a New York group, for example, two participants contrasted Facebook and Snapchat (the latter of which is actually curated by human editors), again arguing that their own choices—this time of Facebook friends—were a far greater factor in the composition of their news feeds than Facebook’s algorithm.

Cyrus: I think it’s your friends. Your news feed is populated by your friends, and your friends are the ones perpetuating your news feed . . . As far as Facebook having the control, I really don’t agree with that. I think it’s definitely your social circle that controls that, at least on Facebook.

Adil: Yeah. I agree with Facebook but there are certain platforms that do have a lot of control. For instance, Snapchat was, I guess, very pro-Hillary. You’d see a lot of pro-Hillary stuff leading up to the campaign, and anything Trump related was usually, you know, in a negative light.

(New York, New York; 18–29-year-olds)

Some participants outlined simple actions they have taken to try to limit their exposure to algorithms. For example, a participant in Bowling Green described how he has developed a habit of ensuring his news feeds are always ordered chronologically in an effort to counter algorithmic attempts to personalize news to what the platform “wants” him to see: “The first thing I do whenever I get on there [a social platform] is I always put most recent, because I don’t like the random, ‘Let me show you what I want you to see.’ I just go through the most recent ones,” (Bowling Green, Kentucky; 18–29-year-olds).

Others would brush off concerns about algorithms by outlining the simple measures they thought they could take to exert control, such as unfollowing or ignoring things they would rather avoid: “I feel like I can always unfollow or just not pay as much attention to what I don’t want to see, which is mostly good, but can be bad, I guess, if you’re developing an echo chamber. But I feel like I have a good amount of control,” (New York, New York; 18–29-year-olds).

This notion that algorithms can be “gamed” or outmaneuvered recurred with surprising frequency. For example, in San Francisco, one participant spoke positively about the Facebook algorithm’s ability to surface videos of interest to her. When another participant interjected with concerns about the increased scope this brings for creating a filter bubble, the discussion landed on a somewhat simplistic conceptualization of the algorithm, premised on the first participant’s claim that it “can’t really tell” what she likes because of the variety of news content she consumes.

Jill: I think Facebook is doing an amazing thing now where they know the kind of videos you like, they know the kind of posts you like, and they kind of put it in your [news feed].

Alyssa: That’s also dangerous when it comes to news because then you only are exposed to news that you would be receptive to. Like, I notice I’m getting very left-leaning news because I am pretty left-leaning.

Jill: Really? But for me, I get from every, I have . . .

Derrick: Every source.

Alyssa: That’s weird.

Jill: . . . From every source because I don’t like a particular kind of news. So it can’t really tell if this is what you like because I read everything. So I see every kind of information on my news feed once I click on a video.

Derrick: That’s good.

Jill: Yeah.

(San Francisco, California; 30–49-year-olds)

Others underestimated the role of algorithms by taking platforms at face value. For instance, one person described how responsibility for Reddit’s scoring is “all users.” He told the group, “If it’s something interesting, it’s voted up, and if it’s not, it’s useless,” the result of which is that “correct stuff [is] usually towards the top,” (Bowling Green, Kentucky; 18–29-year-olds). This, of course, is not the case. Reddit, like Facebook, Instagram, et al., is controlled by an algorithm. Indeed, following a change to Reddit’s front-page algorithm in December 2016, TechCrunch reported:

Yes, it’s true: Those numbers on the site aren’t just “upvotes minus downvotes” or anything so simple. The blue ball machine gif KeyserSosa [Reddit CTO and founding engineer, Christopher Slowe] shared as an indication of how the system works is probably closer to the truth. And he indicated in another comment that there is “some slight fuzzing” to stymie would-be reverse engineers of the algorithm.[xix]

However, far from reflecting negatively on our participant as an individual, this misconception about how Reddit, his preferred platform, operates is best viewed as an indication of how platforms could improve their transparency around the role that algorithms play in surfacing, prioritizing, and ranking the news they present to their users. While many platforms will typically announce major algorithm changes via blog posts, these are often vague and, arguably, underpublicized (e.g., Instagram notified users it was introducing an algorithm via a blog post, saying, “To improve your experience, your feed will soon be ordered to show the moments we believe you will care about the most”[xx]—without giving any indication of how it could or would go about making such a decision). In the case of the Reddit algorithm change referenced above, founding engineer Slowe offered a complicated image (reproduced below) as a “schematic of what the code looks like without revealing any trade secrets or compromising the integrity of the algorithm.”[xxi]

Image source: Reddit post by KeyserSosa, Reddit CTO and founding engineer, Christopher Slowe.
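To make concrete why a feed’s ordering is more than a simple vote tally, below is a minimal sketch (in Python) of the “hot” ranking formula from Reddit’s previously open-source codebase. It is illustrative only: it predates the December 2016 front-page change discussed above and omits the “fuzzing” Slowe alludes to, so it should not be read as a description of the current algorithm.

from datetime import datetime, timezone
from math import log10

# Illustrative sketch: the long-public "hot" ranking from Reddit's
# previously open-source code. It predates the December 2016 change
# and omits the "fuzzing" described above.

def epoch_seconds(date):
    # Seconds since the Unix epoch for a timezone-aware datetime.
    return (date - datetime(1970, 1, 1, tzinfo=timezone.utc)).total_seconds()

def hot(ups, downs, date):
    # Net votes enter on a logarithmic scale; posting time adds a large,
    # steadily growing recency bonus.
    score = ups - downs
    order = log10(max(abs(score), 1))
    sign = 1 if score > 0 else -1 if score < 0 else 0
    seconds = epoch_seconds(date) - 1134028003
    return round(sign * order + seconds / 45000, 7)

# A newer post with modest net votes can outrank an older, far more popular one.
older_popular = hot(5000, 200, datetime(2017, 6, 1, tzinfo=timezone.utc))
newer_modest = hot(40, 5, datetime(2017, 6, 3, tzinfo=timezone.utc))
print(newer_modest > older_popular)  # True

Even in this older, public version, recency is weighted so heavily that a post with a fraction of the votes can sit above a far more popular one, a further reminder that what surfaces on the front page is not a straightforward tally of user votes.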

 

Many People Want to Know More about How Platform Algorithms Work

At times, too, people blamed their own supposed deficiencies for the lack of variety in the news delivered by algorithms. For example, a participant in the oldest Bowling Green group showed awareness that Facebook prioritizes the surfacing of news that appeals to her preexisting interests, but presumed that the other, “smarter” people in the group would be receiving news from a more eclectic range of sources: “Facebook knows what I want to read, and so they send it to me. So I don’t feel like I get a varied mix. I don’t know. You guys probably do because you’re smart.”

The topic of how and why tech platforms determine which news gets surfaced played out in interesting ways when the subject of trending news arose. During one group session, a participant alluded to the Facebook Trending controversy[xxii] of May 2016, soliciting nods around the room when expressing his skepticism about the platform’s motives: “I’ve been told that those were just set up by Facebook anyway: Certain topics weren’t really trending they just made you think they were, just to get people talking . . . That could have been a rumor. But I believe it, that they want to promote one thing over enough for the likes or whatever,” (Bowling Green, Kentucky; 30–49-year-olds).

Another group got onto the subject of Apple News and, in particular, the trending stories displayed prominently on both the app’s splash page and its lock screen widget. This, too, led to discussion about how little is known about the process through which trending stories are identified and promoted.

Andrew: Yeah, on Apple News they have the top five. I don’t know top five what.

Susan: Apple’s top five.

[…]

Mahdi: I think the problem with some of those is they’re focused on, like, what’s most trending.

Susan: That’s exactly what it is.

Andrew: The thing that’s trending.

Mahdi: Which, I think, is what I don’t like.

Andrew: No.

Mahdi: You know, that doesn’t necessarily feed into what . . . the stuff that’s most trending is not what I signed up for. You know what I mean? I don’t know, it’s like, that seems to dominate the way it’s set up, the interface.

Mary: So who decides what’s important?

Susan: That’s what he [the moderator] is going to ask!

Andrew: I think it is the trending . . .

Mary: So it’s the things that I’ve liked versus the things you don’t like, right . . . I think that it would be really good to see something that I could check, an app or something, that had not what I want to see necessarily, but what I need to see—and who decides what I need to see? I’m not sure. I don’t know, because it’s going through all these filters.

(Bowling Green, Kentucky; 50-plus-year-olds)

During another group, one of the younger participants began talking about Facebook, then had a sudden realization that his Instagram feed had changed and was now also dictated by an algorithm, leading him to reflect on how little knowledge he had about the rationale for how his feed is now presented to him:

It’s just now hitting me that Instagram is not [chronological] either. Like, I’m sorry, but I would like to know why they’re telling me that I’m more likely to like this person’s pictures [rather] than a different person’s pictures. Maybe it was just that I just recently followed the last person that they’re going to be put up there [higher in the news feed]. So I haven’t liked as many of their pictures. So I don’t understand exactly how that works, and I would definitely like to. (Bowling Green, Kentucky; 18–29-year-olds)

Additionally, participants occasionally indicated that not only would they like to know more about what surfaced on their news feeds—and why—but they would also like greater insight into what was being suppressed or withheld. In other words, they expressed a desire to know more about what they are not seeing. During one discussion in Bowling Green, the group’s participants quickly reached a unanimous consensus that they would like platforms to be more transparent about how the content of their news feeds gets prioritized. This led one participant to ask the rest of the group whether the will of the algorithm was directed by selections she had made in the settings—and what she could do to change those settings—before reflecting on her lack of knowledge about how (if at all) the diet of news, and variety of news sources, served to her by social platforms had changed over time.

Mahdi: Is it based on the settings that I provided earlier, the parameters that I gave it? Because I will go back and refine them.

Susan: That’s a good point. That’s a good point.

Mahdi: I don’t try to see what percentage of stories I’m getting from what sources. So I don’t know, you know, you know, am I still getting the same content I was getting originally?

(Bowling Green, Kentucky; 50-plus-year-olds)

 

Platforms’ Secrecy around Their Algorithms May Drive Users Away

At times, participants exhibited outright hostility toward algorithms and the lack of transparency about their implementation, citing them as the reason for abandoning platforms or reducing their use of them.

In one instance, a participant in Bowling Green, Kentucky, went into great detail about Reddit’s supposed lack of transparency about changes made to its front-page algorithm. These changes, coupled with a more concerted effort to remove material deemed to be offensive, were, in his opinion, tantamount to “filtering” and designed to suppress right-wing perspectives. This, he argued, had been sufficiently egregious to drive him away from Reddit and over to 4Chan, a platform he felt was less “filtered”:

Well, they’ve specifically told users that they’ve changed their algorithm on what gets to the front page, which means right-wing news doesn’t make the front page anymore. You have to specifically search for right-wing news now . . . But it seems like once money comes into the website—and it’s getting very popular now—the filtering starts . . . It’s not completely useless or anything but . . . so there’ll be an Islamic bombing someplace in Europe. It won’t show up on Reddit—at least not until it’s a big, big story. If it’s just a small one it just won’t show up and they’ll filter it . . . They’re forcing me to go to 4Chan. They are forcing me to go to 4Chan to look at articles about Islamic bombings in places because they’re filtering. What they’re trying to get rid of is that viewpoint. They’re making me go to that website to look up that information because they’re filtering it out. It’s news! I’m sorry the world doesn’t sometimes go to your narrative, the way you like it, but bad things happen. They filter that out. (Bowling Green, Kentucky; 30–49-year-olds)

In another group, during a discussion about the range of perspectives people are exposed to on social platforms, one person gave her take on the power of the Facebook algorithm and what she perceived as its ability to track the amount of attention she paid to content of different political leanings. Its effectiveness in doing this, she said, was a factor in her growing reluctance to use the platform: “I know that the algorithms are doing that on purpose, like they know that if you’re spending more time looking at a liberal-focused article, they’re going to show more of the same, and they’ll not show you any people who are posting pro-Trump. So it’s controlled, they’re controlling what I’m seeing,” (New York, New York; 18–29-year-olds).

Asked to elaborate on how she felt about this, she said, “Well, that’s why I don’t go on there very often, because I know there’s so much data that they’re culling from me that I have a hard time giving them that data.”

Another New Yorker specifically cited the Facebook algorithm, and concerns about perpetuating a “liberal bubble,” as her reason for turning away from the platform as a news source: “I live in a very liberal bubble, New York and [college], so I’ve been trying to mediate. I don’t think Facebook helped that. There’s all sorts of stuff about their algorithm and stuff like that, so I don’t go to it for news anymore because I don’t want to keep spiraling down that sort of very narrow worldview,” (New York, New York; 30–49-year-olds).

Among participants with an awareness of algorithms, some reflected on the process through which they had “exposed” the algorithms’ existence and described their attitudes toward the way in which they operate. A particularly strong example of this occurred in Elkhart, Indiana, where a participant recalled the sequence of events that had led to their awakening about the powerful but seemingly superficial nature of the Facebook algorithm. This person went on to deliver a scathing assessment of the lack of transparency around the black box responsible for delivering what they had come to view as an unsatisfactorily one-dimensional news feed:

With the Facebook thing, I realized that I would pop up to do something on Facebook, and something would come up over here, and that would catch my eye, and I would go to it. Then I started watching information about Facebook and found out they were following me enough that they only sent me the stuff that I clicked on. I wasn’t getting both sides of the story. I was only getting the things that I had clicked on that caught my eye, and they were not giving [variety]. So when the things would start popping up, it was all, they were just following me and giving me sugar when I was really looking for more. They would just keep feeding me the same lines all the time. So I went on several other things and checked it out, and it was true. They were skewing the news to what I had picked. They personalized it . . . and that’s not why I was there. I was there to get information that was different or a different viewpoint than I was getting, and I’m very mad at Mark Zuckerberg. (Elkhart, Indiana; 50-plus-year-olds)

This, for us, is a particularly telling passage. The question of whether or not it is judicious to rely on—or expect—Facebook or any other social platform in its current form to deliver a balanced, varied, and/or impartial news stream is, frankly, moot. The bigger discussion point here is that a vision of news personalization that slants heavily toward repetition is not just unhelpful (“feeding the same lines all the time,” “giving me sugar when I was looking for more,” “skewing the news”) but disengaging (“not why I was there”).

This appetite for wider variety and more perspectives—which, as has been noted, recurred frequently across our study—coupled with the rumblings of dissatisfaction around the one-dimensional stream of news currently delivered, should be of particular interest to the social platforms. As David Uberti has noted:

While Facebook has become the public’s primary conduit for digital content, its business imperative is to maximize engagement, not objectivity. The algorithms designed to do that are human-made and therefore biased by nature. But we can only guess as to how they’re constructed.

The obvious danger of the situation is that free societies have little knowledge of the systems funneling information into their news feeds. The sad irony is that the news organizations with the wherewithal to find out are the very same outfits that increasingly depend on Facebook for their survival.[xxiii]

Our findings, as the example above illustrates, present a timely challenge to the assumption that platforms’ only route to achieving their business imperative of maximum engagement is to give users more of what the platforms know, or their algorithms deduce, they enjoy—“sugar,” as our participant characterized it.

On the contrary, our evidence suggests that, for parts of the news audience, an algorithm that deliberately skews its users’ news feeds—and is knowingly opaque about how and why it is doing so—can negatively impact users’ perception of, and trust in, the platform, and the quality of the service and user experience it delivers (e.g., “I was really looking for more”; “They were skewing the news . . . that’s not why I was there”; “I’m very mad at Mark Zuckerberg”). In other words, to return to Uberti’s assessment, far from being incompatible opposites, engagement and objectivity—or at least transparency—could actually make unlikely bedfellows.

Local News

The struggles of smaller, local newsrooms to remain viable long predate the rise of tech platforms. But this development certainly hasn’t helped. While there are notable examples of regionally focused outlets that have succeeded in the new, distributed environment (e.g., Billy Penn, a mobile-first outlet covering local Philadelphia news), these tend to be the exception rather than the rule. Among the rest, many in local newsrooms are uneasy about the proposition of getting into bed with Facebook et al. Some complain the platforms do not provide a level playing field because they do not afford local outlets the attention showered on their bigger, more glamorous rivals. Others report anxieties about sacrificing brand recognition in the pursuit of scale (brand is covered in detail later in this report), while ongoing belt-tightening means that even those who are open to new models of distribution simply do not have the resources required to make their journalism compatible with products such as Facebook Instant Articles or Google AMP (spots on the likes of Snapchat Discover are but a pipe dream).

These are criticisms that the platform companies claim to have taken on board. Facebook, for example, made local news one of the cornerstones of its Journalism Project.[xxiv] But tangible improvements remain few and far between, leaving some to conclude that unless there is “a civic sea change that miraculously alters the online behavior and spending patterns of the general public . . . the green shoots of small local newsrooms remain fragile.”[xxv]

 

For Most, Platforms Are Not Typically Viewed as a Reliable Source of Local News

Discussions about the relative paucity of local news on social platforms were plentiful and recurred across age groups and locations. During a discussion among 18–29-year-olds in Bowling Green, Kentucky, participants began addressing their growing recognition of the relevance of local news:

I kind of feel like—and I talk like I’m old or something—but the older you get, you kind of realize that the local stuff is what’s actually important and affecting you more than what you see on CNN. So what’s going on in Bowling Green, I’m a resident, that’s going to hit me sooner than whatever kind of stuff they’ve got going on at the White House right now. I think it’s nice to know what’s going on in your backyard. (Bowling Green, Kentucky; 18–29-year-olds)

This sentiment was shared in other groups, but as the conversation developed, participants reflected on how little their friendship groups share local news on social platforms, and the extent to which they subsequently struggle to keep informed about their communities.

Graham: I’m the worst at keeping up with local news as well as I do with national news. It’s really hard to do it with local news, and none of my Facebook friends share local news on Facebook. It’s all national-based. So I haven’t—honestly I couldn’t tell you what was happening in Bowling Green.

Jenny: I’m the same way.

Graham: It’s so bad.

Jenny: I know more about national news than I do local news.

(Bowling Green, Kentucky; 18–29-year-olds)

As this exchange evolved, Graham concluded that Facebook had “definitely created an environment” where local news was subjugated to the point of invisibility. Crucially, though, he placed responsibility for this on himself, leading a fellow participant to speculate that local news would never be surfaced if people did not actively try to surface it by liking local news outlets and the like. (Algorithms are covered in depth in an earlier section.)

Graham: For me, it’s definitely created an environment where I only get national or international news. Like, literally, my mom’s dog died and someone in the high school committed suicide from my hometown. I had no idea until three weeks later. I went home and I was like, “What? All of this stuff has been happening.” So it’s definitely the environment I’ve created for myself on social media.

Vanessa: I think overall, if you did not have that control, say you start your Facebook page from scratch. You don’t click any of the extra buttons. No one cares about the local stuff.

(Bowling Green, Kentucky; 18–29-year-olds)

This sentiment that individuals wishing to see more local news on social platforms have the agency to do so was echoed in another of the younger groups. During this exchange, in San Francisco, one participant told another of the supposed ease with which he could increase his exposure to local news. It strikes us as noteworthy because it again highlights the assumption that following/subscribing to more local news outlets and/or journalists will automatically result in greater exposure to local news, a direct correlation that arguably underplays the extent to which some platforms’ algorithms intervene in the process (for a more detailed discussion of algorithms, see our earlier section).

Tim: Sometimes I think I should be more involved locally and watch what’s going on here, but I just get so caught up on, like, CNN or the bigger outlets—Snapchat—or what’s going on a national level, and I should be more involved here.

Janet: I think it really depends on the person because, it’s like with social platforms, you can decide what you want to follow . . . [so] if you are more interested in local news then you can just subscribe to more local news.

(San Francisco, California; 18–29-year-olds)

Participants would often bring these points to life by citing specific events that had happened in their communities but had not surfaced on their social timelines, and which they had discovered only via other means.

Jill: OK, so do you ever see anything [news] that’s local pop up? Like, he brought up earlier that there was an incident, I think it was in Oakland, or in a San Francisco mall, where somebody sprayed, like, a chemical into the air.

Derrick: Pepper spray. Pepper spray.

Jill: People had to, like, evacuate. I didn’t hear about that, nor see it on any of my news feeds. It was something I had to hear about through word of mouth . . . I heard about it from word of mouth, but I would’ve never come across it in my news feed, and I follow a lot of people that live and work in San Francisco.

(San Francisco, California; 30–49-year-olds)

Similarly, a group in Elkhart, Indiana, told us that local stories would find their way onto their timelines only occasionally, and only if they were big enough. They shared the example of a recent hostage situation that, as far as they were concerned, “nobody knew” about until it made a splash on the platforms on which they spend much of their time.

Alex: It’s usually reshared from various friends if it’s a big news story like the . . . what was it? The hostage . . .

Rick: Oh yeah.

Alex: The hostage, everybody, you know, four people were hostage in Elkhart for five days and nobody knew. Well, yeah, the paper had it, but then everybody shared the local news station’s story.

(Elkhart, Indiana; 30–49-year-olds)

This notion that social platforms foreground big stories at the expense of more routine ones, and that the lack of blockbuster local news explains its subjugation by social platforms, was picked up elsewhere. As one person in Bowling Green put it:

Well, the big thing is: If it bleeds, it leads. Small-town Bowling Green, Kentucky, most of the time it’s not bleeding, so you’re not going to hear about it, but everything you see on CNN and Fox, that’s all violence and juicy stuff, you know. That’s the stuff that’s going to spread faster. That’s what’s exciting, unfortunately. That’s what you hear about first. So you know, oh, the fairness ordinance in Bowling Green, some people just don’t care about that because nobody, there’s nothing . . . (Bowling Green, Kentucky; 18–29-year-olds)

Some people questioned whether tech platforms were even a viable space for local news to thrive, so conditioned were they to seeing content from bigger, more nationally or internationally focused outlets dominating their news feeds.

They get kind of drowned out although, I mean, if you are a smaller or local news outlet, I don’t know if social media is where you should be targeting to get new readers because you’re just going to get drowned out. So maybe, I don’t know how they would acquire a readership, but I don’t think it should be on social platforms. (New York, New York; 18–29-year-olds)

At times, too, participants highlighted platforms’ perceived weakness at reliably surfacing local news in more implicit terms. Despite the fact that all participants were recruited on the basis that they had experience getting news via social platforms, some would describe an ongoing reliance on legacy outlets (and formats) for local news, specifically. For example, during one exchange in Bowling Green, Kentucky, participants described how older forms of media—specifically newspapers and radio—were the “only” ways to keep abreast of local news.

Andrew: I don’t want to [pay for the local newspaper], but it’s the only way I know to really keep up locally with what’s going on. I wouldn’t cancel it.

[ . . . ]

Javed: We have a weekly newspaper, and that’s the only source of information, and we have a radio station. And then we get a newspaper from Owensboro daily, and I think that has a bigger circulation . . .

(Bowling Green, Kentucky; 50-plus-year-olds)

Similarly, in San Francisco, another participant described how she still invariably defaults to regional television for local news.

Alyssa: I would agree with that, definitely [that local news gets subjugated on platforms].

Derrick: That sounds, off the top of my head, like it makes sense. It sounds like it’s just a statement of fact.

Moderator: So when you’re looking for local news for instance, do you go to social media? Do you go to Facebook for instance, or do you go somewhere else?

Alyssa: I’m going to go to Channel 2, if I’m looking for something in the Bay area.

(San Francisco, California; 30–49-year-olds)

 

Audiences Can Pinpoint Platforms They Consider Strongest for Local News

At times, participants’ discussion of local news would lead them to compare the relative merits of different platforms in this field. When this occurred, Twitter was almost invariably cited as the preferred platform for following local news. During one group session in San Francisco, a participant identified a regionally specific account as his reason for being on the platform: “I actually follow Twitter for that [local news]. Like, there’s a ‘what’s going on in SF’ Twitter account that gives you immediate local news or what’s going on, like where are the police going to or how’s the weather going to be today,” (San Francisco, California; 18–29-year-olds).

Similarly, a participant in Bowling Green identified localized trending as one of Twitter’s strongest features:

It’s my main go-to because what you can do now, I’m not sure if you could do the same thing in 2012, but you can select what location you’re in, and not only do you get the trending topics around the world, you get the trending topics in your area. So when I get on mine, sometimes I see things from Bowling Green, like what people in Bowling Green are tweeting about.

(Bowling Green, Kentucky; 18–29-year-olds)
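
The localized trending this participant describes has also been available programmatically: at the time of this research, Twitter’s v1.1 API offered a trends/place endpoint keyed to a location identifier (a “Where On Earth ID,” or WOEID). The snippet below is a minimal illustration of that endpoint rather than part of this study’s method; the bearer token is a placeholder, and the WOEID shown is the commonly used example value for New York City.

```python
# Illustrative only: fetch location-specific trending topics from
# Twitter's v1.1 trends/place endpoint (available at the time of this research).
import requests

BEARER_TOKEN = "YOUR_APP_BEARER_TOKEN"  # placeholder credential
NYC_WOEID = 2459115                     # example location ID (New York City)

resp = requests.get(
    "https://api.twitter.com/1.1/trends/place.json",
    params={"id": NYC_WOEID},
    headers={"Authorization": f"Bearer {BEARER_TOKEN}"},
)
resp.raise_for_status()

# The response is a one-element list; its "trends" key lists the topics
# currently trending in that location.
for trend in resp.json()[0]["trends"]:
    print(trend["name"], trend.get("tweet_volume"))
```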

During one exchange in Bowling Green, a number of participants got into a discussion about why Twitter was their preferred platform for local news (particularly over Facebook), citing the volume of local news it surfaces (and its perceived “reliability” in doing so) and the ability to home in on individual reporters.

Bill: I think it’s a bit more reliable for use than the stuff that’s on Facebook.

Meredith: I use it more so for, like, Kentucky News—specific.

Sarah: Bluegrass politics and things like that, just to kind of get some more—because you really can’t find those through Facebook, or even our local news stations don’t carry too much . . .

Gavin: But you can also even literally—you can follow actual, single journalists. There are certain ones that write for the Courier-Journal and these guys I’ll literally follow . . .

(Bowling Green, Kentucky; 30–49-year-olds)

Later on, the same person elaborated on this latter point, arguing that being able to follow individual reporters on Twitter was the main reason he was able to keep track of local developments: “I follow local journalists. That’s the reason [I see plenty of local news]. I mean, I don’t know about Facebook, but for Twitter I follow local folks and so therefore it doesn’t get drowned out because I use Twitter a lot,” (Bowling Green, Kentucky; 30–49-year-olds).

 

Brand

Another of the key concerns for publishers making the shift to tech platforms is that they risk relinquishing control of their brands. As one local publisher described it, “I think when our content is removed from the context of our own sites and placed in a different display, such as Facebook, it’s natural to assume that some of the branding will be lost to the new host. We have to work harder now to build that brand recognition and loyalty.”[xxvi] These concerns appear to be well founded. A 2017 study by the Pew Research Center found that just over half of its respondents could recall the source of news they had encountered on the social web.[xxvii] Similar work by the Reuters Institute for the Study of Journalism painted an even bleaker picture, finding that only forty-seven percent of participants could correctly attribute the brand of news found via social platforms, and just thirty-seven percent could do so for news encountered via search.[xxviii]

This criticism, too, has started to resonate with the platform companies. In August 2017, Facebook rolled out a new feature enabling publishers to upload multiple versions of their logos to be displayed alongside trending stories and search results as part of an effort to help news outlets improve brand recognition on the platform. Introducing the new feature, Andrew Anker, a product manager at Facebook, claimed: “By surfacing publisher logos next to article links, we want to make it easier for publishers to extend their brand identity on Facebook—to enhance people’s awareness of the source of content they see on Facebook, so they can better decide what to read and share.”[xxix]

 

Audiences Claim They Can—and Do—Recognize Publishers’ Brands on Platforms

Across our focus groups, the majority of participants reported that they have no significant problems identifying news brands when perusing tech platforms. Indeed, many were insistent that checking—and making a judgment about the perceived reliability or credibility of—news sources shared to their timelines had become second nature and formed a core part of their news routines. For these people, this process had become particularly important due to the rise of fake news, misinformation, and partisanship, which they typically traced back to the 2016 U.S. presidential campaign.

During a discussion about whether or not it’s easy to miss the news outlet responsible for producing a piece posted to social platforms, one participant noted:

It’s interesting because I would have agreed [that it’s easy to miss the brand] . . . maybe four or five months ago, but a lot of emphasis has been placed on it recently, so I would disagree because now I check everything. It’s not that I don’t trust media anymore, it’s just, like, it becomes necessary now to check everything. (New York, New York; 18–29-year-olds)

Others made similar claims about the importance they attribute to brand and the efforts they make to check:

Andrew: I make it a point to look [at the news brand], yeah.

[…]

Javed: It’s at the bottom of the link on the picture or whatever media they post. It’s at the bottom where it’s coming from.

(Bowling Green, Kentucky; 50-plus-year-olds)

 

Adil: The source is big for me. I would check the source and say is this right-wing, is it conservative, and is it liberal media? And then just use my best judgment.

Cyrus: I would probably judge by prior reputation. If this is a longstanding institution that I feel I can trust, you know . . .

Adil: Exactly.

(New York, New York; 18–29-year-olds)

Interestingly, too, and to our slight surprise, it was not uncommon for participants to say that the perceived credibility of the news source took precedence for them over the person sharing it with them, such was their frustration with the partisanship of large sections of their friendship groups: “I definitely look at the source, so like a news website, rather than who shares it. Definitely stick to the ones I’ve heard of—the news outlets,” (San Francisco, California; 18–29-year-olds). At times the removal of any identifying information about the sharer was framed as a positive because it forced the user to focus on the originating news outlet: “Well, on Reddit, you can’t really know who’s posting it. It’s like a random name. So you don’t really have the bias of who is posting it, but it does show the source of it though. So you can see if it’s credible or not right off the bat,” (Bowling Green, Kentucky; 18–29-year-olds).

 

. . . But They May Not Always Be as Diligent as They Would Like

While audiences’ apparent (re)emphasis on and (re)prioritization of news brand ought to be encouraging for news organizations (particularly when paired with the finding that readers do not think of the platforms as news sources in and of themselves), it should be treated with a degree of caution. People said it was straightforward to identify news brands, and that they focused on what they perceived to be credible outlets. At the same time, many also, somewhat contradictorily, admitted to having been duped by—and/or actively (re)sharing—false stories from less legitimate sources. For some, this was an occasional mishap that could, and should, be easily avoided:

Everyone slips up once in a while, and just forgets to check a source. We’re all guilty of it. They just, like, see a headline they either agree or disagree with, you know, the kind of interesting language in the article or the headline, and then share that without really checking the source. [But] I think if you look into it, it’s easy enough [to identify the source]. (New York, New York; 18–29-year-olds)

Others were adamant that they could easily identify which content on their timelines had originated from legitimate news brands, but outlined somewhat questionable strategies for doing so. For example, when, during an exchange in San Francisco, a participant said, “I think videos are the biggest mark of like, ‘OK, this is a news post,’ ” another weighed in and agreed, “Yes . . . when you click on the video, it tells you CNN, ABC, or this, and then you know it’s legit,” (San Francisco, California; 30–49-year-olds). Such attitudes raise alarm bells, of course, because of the rise of imposter content (see, for example, a 2017 BuzzFeed article by Craig Silverman and Jane Lytvynenko).[xxx]

At times, participants articulated their despondency at not knowing which brands they could trust when encountered via tech platforms. This was, in part, due to the increasing number of brands they felt they had been exposed to via social platforms, many of which they had no prior relationship with. One discussion about the fact-checking site Snopes illustrates how this proliferation of sources has been further complicated in the context of the current political moment.

Mary: I have shared things that are unreliable, and I didn’t know it, and somebody . . . on Snopes, but then I found out Snopes lies, so . . .

Moderator: How did you find that out?

Mary: There was an article about Snopes, maybe it’s not true.

Moderator: Where was the article?

Mary: It was on Facebook.

Andrew: Fake news, it’s a vicious circle.

Moderator: Interesting.

Mary: It was a particular thing that Snopes had said that was not true.

Moderator: Do you remember what it was?

Mary: Something about the president I’m sure. So I don’t even trust Snopes anymore.

(Bowling Green, Kentucky; 50-plus-year-olds)

It should, however, be noted that this crisis of trust was also raised in relation to established brands:

We didn’t hear all about fake news, alternative facts in 2012. You know, because the recent year, few months, it’s like, just because it’s written doesn’t mean it’s [true]. You know, I have to put my faith in some organization . . . I can’t question everything. I mean, I could, but I’d be a nutcase—I have to put credibility in, let’s say, CBS News and C-SPAN. Those are my credible news sources, but, you know, I’m hearing that they’re not credible. I’m hearing Fox News isn’t credible, CNN is biased. It’s like, short of actually hearing it or seeing it being a witness myself, how am I going to get my information? (Elkhart, Indiana; 30–49-year-olds)

 

Third-Person Effect Is a Factor: People Often Think That Other People Are the Problem

Participants occasionally invoked a variation on the “third-person effect”[xxxi] during conversations about brand recognition on social platforms, arguing that while they were proficient in recognizing and checking that content had come from a reputable brand, others were less capable. One such instance played out during a conversation between Graham and Jenny, who argued that their parents’ generation was less capable than theirs of distinguishing between reputable and disreputable brands.

Graham: I think it’s a generational gap. For people who I follow on Facebook that are older than the millennial generation, they frequently don’t check to see which news source it’s from, and I don’t know if maybe that’s because they’re not used to having to. We grew up where it was like—or at least I did, I’m assuming you all did as well—where it’s like, “Wow, there are these Facebook articles, but we need to read them and make sure that they’re correct.” My English teacher in high school told us to do that. It’s kind of ingrained. A lot of the people, I felt like my dad didn’t check at all, you know. It was just like, read the headlines, and that was enough. I guess they assume that anything that they’re getting . . . [even if it’s from] a source where they don’t know exactly where it’s coming from, they’re like, “Wow, this has to be true then,” and then they automatically share it. At least that’s my experience.

Jenny: I think he has a point with the generation gap. Now that he said that, I see that’s very true to my news feed. The older people tend to just share. I know my mom is super bad about this. She’ll share whatever she sees, and she doesn’t really check on stuff, whereas growing up with technology always at our fingertips, we have been more prone to check things before sending them out. So I would agree with him on that.

Moderator: When you say that your mom shares everything she sees, is it things she agrees with?

Jenny: It will be anything that she agrees with, yeah. She shares a lot, and it doesn’t matter what. She doesn’t read it. She’ll just share it based on the headline.

(Bowling Green, Kentucky; 18–29-year-olds)

This subject of media literacy in the platform age was picked up elsewhere. During another group, also in Bowling Green, Kentucky, a participant began by outlining what she perceived to be a lack of media literacy among older generations before questioning whether it was ethical for platform companies such as Facebook to facilitate the ongoing flow of misinformation (and/or hyper-partisan news) without intervention. She suggested that socio-cultural aspects of her local community make this a particularly pressing subject:

Facebook . . . is such a huge news source for so many people—across generations. My grandmother has a Facebook. That’s pathetic. When you have something like that, with all the fake news and with all the clickbait articles, does a corporation like Facebook, do they have an ethical duty to try and make sure that they are not swaying it one side or another? There was an argument over the election. And [pointing to person sitting next to her] he’ll say, “No, it’s all up to individual choice,” and I can understand that argument. But at the same time, especially in our area, we have a low education, mostly unskilled laborers and we’re just going to say a highly religious population. There are certain things where all you see every single day is a very slanted political view of the news because that’s what everyone’s sharing. (Bowling Green, Kentucky; 30–49-year-olds)

 

Audiences Intrinsically Link Fake News to Social Platforms

Fake news was not intended to be a central strand of this research. However, despite never being raised by any of the facilitators, it emerged organically in all thirteen focus groups. Indeed, analysis of the transcripts reveals seventy-three instances of the phrase “fake news” being used across the thirteen discussions, highlighting just how ubiquitous it has become in everyday discourse around the contemporary media landscape. That said, use of the phrase was notably varied and inconsistent: on the one hand it was used in relation to what has been termed “imposter content” and “fabricated content,”[xxxii] while on the other it was weaponized to refer to news or outlets with which people simply disagreed (e.g., “BuzzFeed is fake news” [New York, New York; 50-plus-year-olds]).
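
The report does not specify the tooling behind this tally; purely as an illustration, a phrase count of this kind could be produced with a few lines of Python run over plain-text transcripts. The directory and file naming below are hypothetical.

```python
from pathlib import Path

# Hypothetical layout: one plain-text transcript per focus group,
# e.g., transcripts/bowling_green_18-29.txt (names invented for illustration).
TRANSCRIPT_DIR = Path("transcripts")
PHRASE = "fake news"

files = sorted(TRANSCRIPT_DIR.glob("*.txt"))
total = 0
for path in files:
    count = path.read_text(encoding="utf-8").lower().count(PHRASE)
    total += count
    print(f"{path.name}: {count}")

print(f"Total mentions of '{PHRASE}' across {len(files)} transcripts: {total}")
```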

While an in-depth discussion of fake news is beyond the scope of this report, the fact that it emerged as such a central theme in our discussions of tech platforms cannot be ignored, and it is in this context that we explore some of the key points our participants raised.

Notably, culpability for the rise and spread of mis- and disinformation was often laid at the feet of platforms rather than the people sharing it. Even though participants recognized it was their friends who were engaging in the act of sharing said content (and did not entirely absolve them of blame), they ultimately viewed it as a black mark against the platforms for creating and sustaining an environment where unreliable information could circulate so freely:

I don’t like checking or reading over things that my own friends put up simply because a lot of things, from what I’ve noticed, especially with this election, people don’t read the articles that they post up. Like, they read the title, and they’re like, “Oh, this is bad,” repost, and then, you know, it’s from The Onion, which is sarcasm, or it’s from this website that everyone knows is fake. So a lot of times Facebook kind of loses its credibility. It’s supposed to have this filter of what’s real and what’s not, [but] they don’t do a very good job at it. (Bowling Green, Kentucky; 18–29-year-olds)

People with their ideological bend, they pick and choose the news, and then they put the links out there. It’s much easier to propagate their false news or fake news or whatever. So this part of the platform makes it easy for them. (Bowling Green, Kentucky; 50-plus-year-olds)

 

Ineffective, Quick-Fix Solutions Can Tarnish Platforms’ Brands

Platforms concerned about being tarred with the fake news brush may logically assume that they need to intervene. However, while rigorous efforts to reduce the flow of misinformation should always be welcomed, our findings suggest that unsuccessful, quick-fix attempts to do so may negatively affect audience perceptions of the platforms’ brands:

I believe maybe over a year ago or so, Facebook put out this big statement about how they were going to make the effort to filter those things along with inappropriate comments and inappropriate images. What they ended up doing was blocking out things that were completely OK. So I’m not sure if they’re still making the effort to do that, but when they did, it was a huge fail. (Bowling Green, Kentucky; 18–29-year-olds).

Others were even more scathing: “Like I said before, I don’t really trust any of it at this point. So there’s nothing Facebook can do at this point to make me think that they’re impartial or balanced,” (Bowling Green, Kentucky; 30–49-year-olds).

This partisan analysis, offered by a participant who felt that Facebook (and other tech platforms) are swayed toward the left of the political spectrum, also highlights how any efforts platforms make to try to combat fake news are unlikely to be viewed as neutral, and may instead be distrusted, or even disregarded, by sizeable proportions of their user bases, particularly in the current polarized climate.

In addition to seeking out their favorite news brands to verify news, some people reported that they had become so fatigued by the omnipresence of fake/highly partisan/opinion-based content on social platforms that they were phasing them out in favor of direct visits to what they perceived to be reliable, news-oriented sites, which of course varied according to individuals’ political leaning: “If I actually want news then I’ll go to The New York Times or an actual news outlet because a lot, as you say, on Facebook is definitely influenced by what the people around you are saying and so the columns that you read—they’re all like opinions,” (San Francisco, California; 18–29-year-olds).[2]

Privacy

The issue of privacy invariably elicited strong reactions across our focus groups. While there were some notable exceptions (particularly across generations—discussed later in this section), the overwhelming consensus was that platforms’ data collection practices are too opaque and a cause for concern. While there is obviously overlap with discussions of algorithms (discussed earlier), the main frame through which our participants instinctively discussed privacy and data collection was targeted advertising.

 

Privacy on Platforms Is a Major Cause of Concern for Many

Generally, participants expressed a strong degree of wariness and concern about tech platforms’ data collection endeavors and the implications for their personal privacy. For many, this concern arose from uncertainty about just how much of their personal data is harvested: “I think what I hear you saying is that they [tech platforms] do a lot of tracking, and I don’t know to what extent. This is a real science . . . I think what’s really going on, there’s more going on with them collecting stuff than I know, than I’m aware of,” (San Francisco, California; 30–49-year-olds).

Perhaps the most evocative reaction occurred in Elkhart, Indiana, where a participant asserted: “Social networks have become the KGB of our news system,” (Elkhart, Indiana; 50-plus-year-olds). While this specific response may be a touch hyperbolic, the sentiment was widespread and recurrent. It was not uncommon for people to use words such as “scary” and “creepy” when reflecting on the amount of information platforms are collecting about them, their habits, and interests:

It bothers me. I mean, people are making billions of dollars on everyday people. (New York, New York; 18–29-year-olds)

 

They have a wealth of information. They don’t tell you that, but that’s how they invest money, and they have too much information. So that scares me. (San Francisco, California; 30–49-year-olds)

Specific reasons for people’s concerns varied, but the most common themes centered on the lack of transparency—the notion that platforms are continually collecting unknown quantities of data for unspecified purposes—and on the perceived scale of the operation and its scope to expand.

So they ask you who your favorite Muppet is. They ask you who your favorite Looney Toons character is. They ask you about your favorite holiday. Already, they know too much about me. They’re not just taking that information in a vacuum. They’re comparing it to people across broad spectrum of demographics, and they’re collecting this data. They’ve got machines bigger than—well, they don’t have to be that big obviously—but they have got so much power that they can tell me more about me in two months, about what I’m going to eat for lunch on a Wednesday, than I even know. So call me cynical or paranoid, what you will. I actually think that paranoia is another thing that’s sort of creeping into the whole human experience because of social media. (Elkhart, Indiana; 30–49-year-olds)

Others picked up on this, arguing that a) they had become increasingly conscious of how they were being tracked, and b) the extent to which their personal data was being utilized and served back to them seemed to be increasing:

I mean, it’s social networks plus your cellphone in tandem—they just know too much. And it’s been getting very creepy. Like, when I go to my friend’s house every Sunday to go watch HBO shows—I’ve never scheduled it or anything—and it’s like, I’m getting ready to walk to my car and I look at my phone and it says, “Eleven minutes to so-and-so’s house.” Uh, what? How do you know where I’m going? (Bowling Green, Kentucky; 30–49-year-olds).

 

. . . But There Is an Air of Resignation about Data Collection

Regardless of their level of concern about privacy and data collection, many people were resigned to it as inevitable. For example, a man in the oldest group in San Francisco asserted, in a very matter-of-fact fashion, “That’s how they make money . . . They collect data, and they sell it to advertise . . . It goes with the territory,” (San Francisco, California; 50-plus-year-olds). Where this discourse of resigned acceptance emerged, as it did in multiple locations and discussions, the relinquishment of privacy was often framed as a price worth paying, either to receive free access to the services (e.g., allowing Facebook et al. to collect data in lieu of a membership fee) or for the perceived convenience they provide.

A participant in Elkhart, Indiana, pointed to a prioritization of convenience over privacy and a possible lack of awareness about when, how, and why personal data is being relinquished:

I think a lot of times people give up, you know, privacy and things for convenience. You can do the Apple Pay. You can do the Android Pay on your phone . . . I’m sure that somewhere there’s a database of all of us that knows where you go to lunch, where you swipe your card, where your GPS on your phone is. I think sometimes it’s over the top, but people don’t take time or don’t know how [their data is collected/used], you know. (Elkhart, Indiana; 30–49-year-olds)

That said, as was noted in our discussion of audience attitudes toward algorithms, there were instances where people admitted they had become so concerned about a) the amount of data being collected about them and b) the lack of transparency about how it is collected and used that they had either begun withdrawing from certain platforms or abandoned them completely. During one debate on this subject, a participant said of Facebook:

Well, that’s why I don’t go on there very often because I know there’s so much data that they’re culling from me that I have a hard time giving them that data. So I try to limit—you know, if I like something, I like it, but I don’t want to give everything and then [find] they’re sending me ads. (Bowling Green, Kentucky; 50-plus-year-olds)

 

Younger People, Especially, Report Fewer Concerns about Privacy—and See Benefits

The subject of platform-based privacy produced one of the stronger generational disparities we observed. Although the pattern was far from universal, participants from the younger groups were far more likely to express tolerance—and, at times, enthusiasm—for the data collection practices that came under the microscope.

This issue of generational disparity was explicitly raised during some of our younger groups, such as in Bowling Green, Kentucky, where Graham described the stark contrast between his perspective on platforms’ routine collection and sale of their users’ personal data and that of his parents. Whereas he said his parents viewed it as something to be concerned about, he thought the outcomes were beneficial and hypothesized that his view was commonplace among his peers (fellow “millennials,” as he described them):

I read an article the other day about how millennials are far more likely to not care about their private settings and their private life in general. We’re much more open to social media networks being able to track that. So I definitely, going back to the generational gap, it could also be a thing here, because my father and mother are very much against social media networks trying to be able to do that. They frequently tell me that, but honestly, it doesn’t bother me because, like I said, I really want to know about that newest Etsy typography. That’s all I need, you know. So like whenever it sends me that ad, I actually appreciate it. I have zero problem with it. (Bowling Green, Kentucky; 18–29-year-olds)

During these discussions, the subject of convenience—identified earlier as a key factor behind news audiences’ embrace of platforms more generally—was often cited as a reason for accepting platforms’ data collection endeavors. For example, participants discussed how Google’s ability to learn about their video preferences and habits saved them a step when using YouTube, and could enable them to discover new content that fit their interests.

Ben: I feel like the data they’re collecting is kind of the data that I have to input another way to get it. So if I’m searching for a certain kind of video, I like that they’re collecting that data and, I don’t know, going back and [serving] those videos back to me because [otherwise] I would have to search for it the next time I loaded up YouTube. But this time, I just click on YouTube, and it’s right there, all the video suggestions I would want to search anyway. It’s more data that I wouldn’t really mind them having. I don’t feel like they’re collecting data that I don’t want them to have.

Jenny: I like just to discover new things, so I like that they’re doing that, so I can find new songs or new videos or stuff like that.

(Bowling Green, Kentucky; 18–29-year-olds)

However, while there was undoubtedly less concern about privacy among the younger groups, it is important to note that the sentiment was not universal. It is not our intention to perpetuate stereotypes about young people’s laissez-faire attitude toward online privacy, nor would it be an accurate reflection of our findings to do so. For example, a participant in the Bowling Green discussion countered the wave of apathy toward privacy concerns, albeit with the addition of a degree of resignation: “I want to say I’m different. I don’t like that it collects and stores all that, but I also understand that when you’re doing stuff like on social media and all that, and you’re using their stuff, they’re going to make you pay for it. Basically, they’re going to do whatever they want,” (Bowling Green, Kentucky; 18–29-year-olds).

This emboldened another participant to express her own concerns about the scale of platforms’ current data collection and how much further they could potentially go in the future:

I see that stuff you [the other participants] agree to, you’re open to, but if I’m looking at a pet bed, because I’m an overly proud dog mom, and I see the same bed on Facebook when I’m just scrolling, what else can they get into? Where is the line that says, this is too much? And they’re going to push it until they can’t do it anymore. Maybe I’m just paranoid. (Bowling Green, Kentucky; 18–29-year-olds)

Business Models

The rise of tech platforms, and the rapid expansion of their role in shaping the production, discovery, distribution, and monetization of news, is inextricably linked to journalism’s existential crisis. On the one hand, publishers are enticed by the eye-watering audiences that Facebook, Twitter, Snapchat et al. can bring to their journalism (often succumbing to those platforms’ own proprietary formats due to their immunity from the ad blockers made available in Apple and Android’s app stores). Concurrently, though, those huge audience sizes have not yet translated into even modest returns from digital advertising revenue, not least because Google and Facebook hoover up ninety-percent-plus of mobile advertising growth—aided in no small part by the journalistic content that retains users on their platforms. In a nutshell, the catch-22 for publishers struggling to monetize their enterprises can often look like a case of: platforms—can’t afford to be on them, can’t afford not to be on them.

For publishers struggling to keep the lights on via digital advertising, two of the most common alternatives are digital subscriptions and so-called “native advertising”—that is, advertisements that look more like journalism than advertisements. While it was beyond the scope of this research to do a deep dive on news audiences’ attitudes toward journalistic business models (a topic comfortably big enough to justify a dedicated study in its own right), discussions pertaining to this topic occasionally emerged organically. Accordingly, we dedicate the final section of this report to a short discussion of native advertising and paid subscriptions as they emerged in the context of our discussion of tech platforms.

Perhaps the most alarming aspect of our participants’ discussions about paying for digital journalism was that the topic typically arose only in the context of their reluctance or unwillingness to pay to get beyond paywalls, or of frustrations with publishers’ attempts to generate income through alternative means. Through the course of our discussions, participants occasionally admitted to a sense of guilt about not paying for the journalism they consume so readily.

For example, in San Francisco, one person said, “We’re kind of taking them [news outlets] for granted . . . [as if] the journalistic stuff that we’re consuming is free and just kind of happens magically . . . [T]here are ads because people need to get paid for what they’re doing because this stuff isn’t free.” This led another to agree: “It’s kind of a black-mirror truth. You want to say that you would actually invest in getting accurate reports and stuff like that, but I don’t spend a penny.”

More often though, the topic came up in relation to strategies for avoiding or circumventing paywalls, e.g., “I don’t click on The New York Times because I’m running out of free articles a month . . . I would actually be compelled to go to another site just to, like, get the story,” (New York, New York; 18–29-year-olds). Another noted, “This is actually the year I started looking at newspapers because we got a deal—we could look at The Washington Post online for free,” (Bowling Green, Kentucky; 30–49-year-olds).

While the perception that journalism accessed via the web is largely free precedes the rise of Facebook et al. (although products such as Facebook Instant Articles, which do not currently allow publishers to paywall their output, have arguably helped perpetuate it), our findings suggested that tech platforms may not be helping to alter this viewpoint. For example, during a discussion in Bowling Green, Mahdi spoke positively about Apple News because he (mistakenly) thought it provided a way of “getting around” paywalls, which he described as “kind of an appeal” of the platform:

Mahdi: Well, [Apple News] is kind of like a way of getting around some of the stuff that needs subscription, I assume.

Susan: Yeah. That’s true, too.

Mahdi: Until it tells me, “Oh, you need to subscribe.” My wife subscribes to The New York Times; actually, I got her the subscription. So I think she’s used my phone, so I can see The New York Times’s articles that come up on this [Apple News], but sometimes The Washington Post will—it will give me a few days of it, and it will say, “Oh, you need to subscribe.” But then, I don’t know, I suppose I shut the phone off and turn it back on, and I am able to access it. So, to me, that’s kind of an appeal.

(Bowling Green, Kentucky; 50-plus-year-olds)
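
Mahdi’s experience of regaining access after turning his phone off and on is consistent with metered paywalls that track the free-article allowance on the reader’s device rather than on a server, so clearing local state effectively resets the meter. The sketch below is a generic, hypothetical illustration of such a device-side meter; it is not how Apple News, The Washington Post, or any particular publisher necessarily implements metering.

```python
from datetime import date

FREE_ARTICLES_PER_MONTH = 10  # illustrative allowance, not any publisher's real limit


class ClientSideMeter:
    """Naive, device-local article counter.

    Because the count lives only on the device, wiping local state
    (or switching devices) appears to restore free access.
    """

    def __init__(self) -> None:
        self.month = date.today().strftime("%Y-%m")
        self.count = 0

    def allow_article(self) -> bool:
        current = date.today().strftime("%Y-%m")
        if current != self.month:  # a new calendar month resets the meter
            self.month, self.count = current, 0
        if self.count >= FREE_ARTICLES_PER_MONTH:
            return False  # the app would show a subscription prompt here
        self.count += 1
        return True


meter = ClientSideMeter()
print(meter.allow_article())  # True until the monthly allowance is used up
```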

References to native advertising were fleeting, but there was evidence that some people are aware of its existence. Notably, when it was referenced, it was typically framed as a form of deception. A participant in San Francisco said, “Sometimes news is disguised as, no, the other way around: sometimes commercials or advertisements are disguised [as news],” (San Francisco, California; 30–49-year-olds). Elsewhere, a participant in Bowling Green argued that native advertising had become harder to distinguish from editorial content across legacy media and newer platforms alike: “It doesn’t say ‘advertising’ or anything. So, I think that line is getting fuzzier in print and other media as well as social platforms,” (Bowling Green, Kentucky; 50-plus-year-olds).

Participants also expressed frustration with sponsored links placed on publishers’ websites in the form of “recommended links” or “partner content.” A particularly noteworthy example emerged in Bowling Green, where the strength of consensus indicated that the financial gain publishers may derive from hosting such advertising could come at the cost of reputational damage.

Erica: Sometimes you might go to Yahoo or even CNN or somewhere that has a legitimate news story that you’re reading through. And then at the bottom they always sell that ad space but they say, suggested links. And there might be some kind of clickbait article or something. At first you’re kind of like, “What?” because you think it’s really a part of that site.

Tammy: Or right in the middle. Drives me crazy.

[ . . . ]

Erica: Whether or not it’s true, the article you’re reading, you just discarded everything that you just saw.

Bill: Yeah, it cheapens it.

Multiple voices: Yeah.

Moderator: Interesting.

Erica: It’s like playing Whack-A-Mole and it gets old.

Moderator: That sounds a bit like a kind of reflection on the brand itself. Is that fair?

Erica: Absolutely. If it’s got those suggested links and it’s got oh, guess what so-and-so looks like today or something like that—goodbye. I’m done.

(Bowling Green, Kentucky; 30–49-year-olds)

Conclusion

This research has explored the relationship between three stakeholders: news publishers, third-party platforms, and audiences. At its most rudimentary level, this study has acted as a pertinent reminder that the plural in that final term is more crucial than ever. “The audience” is not a monolith. Nevertheless, heeding the concerns of audiences with varied needs and interests is critical to the survival (or revival) of healthy news ecosystems.

Publishers

For publishers, there are a few signs of encouragement. Many of the participants in our focus groups described an ongoing reliance on their most trusted news brands for verifying or corroborating dubious content encountered via tech platforms and other aggregated resources. This appears especially important at a time when many have heightened consciousness of, and concerns about, mis/disinformation/fake news, particularly in light of the anxieties stemming from the proliferation of unfamiliar brands they are exposed to via social platforms. For these people, the familiarity of a trusted brand (whatever the rationale for that trust) provides a degree of comfort and reassurance.

Regarding brand recognition on platforms, the picture is somewhat muddy. While our participants generally assured us that it is not an issue, and that they find it easy to recognize and identify news brands, a number admitted to slipping up (or knowing people who have), while some outlined questionable practices of verification, suggesting their confidence may not be entirely justified. As with so much in this new paradigm, publishers are largely at the mercy of the platforms in this regard, relying on them to instigate measures that make their brands more visible and distinguishable, a point further supported by our participants’ tendency to frame their ability to distinguish news brands through visible elements of platforms’ user interfaces (e.g., “It’s at the bottom of the link or the picture or whatever media they post”). Further efforts to aid publishers in this regard would seem worthwhile.

Unfortunately, though, the task of persuading audiences that journalism needs to be funded appears to be as pressing, and unresolved, as ever. If guilt, and a recognition that journalism is valuable and worth paying for, translated into hard currency, then news organizations probably wouldn’t have become so reliant on tech platforms and their promises of jam tomorrow. But sadly, they don’t. Further, those news outlets attempting to generate income through some of the more prominent revenue channels du jour—in particular sponsored content and sponsored links—may be paying a price in terms of negative audience perceptions of their brand. Our findings suggest that some of these short-term pursuits may have more damaging ramifications down the line. That this sentiment emerged organically, without prompting, in multiple locations is significant and worthy of further investigation. It also suggests that other revenue models, particularly those that allow news outlets to retain more control over their brand, such as live events, membership programs, or more user-centric approaches to advertising,[xxxiii] may be worth pursuing. Again, though, this would require further research.

 

Platforms

On the platforms’ side, our findings suggest that news audiences have a strong desire for the very thing platform companies have arguably been least likely to provide: greater transparency around their algorithms and data collection practices/privacy. It is probably fanciful to expect this to change, but platform companies may take note of the finding that this is an issue about which some users are willing to vote with their feet. While it is impossible for a qualitative study to estimate the extent of this groundswell, we heard evidence that a lack of transparency around how and why algorithms surface and prioritize news content can be—and has been—enough to drive people away from a platform (and, at times, into the arms of another).

Our findings also present an important, and timely, challenge to the received wisdom among some platform companies that news audiences only want to be exposed to views with which they agree—that they just want to be “fed sugar,” as one participant memorably put it. Not only is this approach unhealthy for civic discourse, due to its role in the creation of echo chambers and filter bubbles; our findings also suggest that it may not be as universally desired as those companies assume—and that it may not be the direct route to achieving the prolonged engagement time their businesses rely upon.

As noted, some platforms are making efforts to address some of the issues raised in our focus groups, such as aiding local news outlets and improving brand visibility. While such efforts are to be encouraged—and it may be too early to observe any notable outcomes from the existing, mostly nascent initiatives—our findings suggest that more could and should be done. Platform companies undoubtedly have the resources available to make a difference. Now is the time to show they have the inclination to follow through on their professed good intentions.

 

The Case for Algorithmic Literacy

One of the key findings of this research is that regular audience members are thinking about—and at times even worrying about—the algorithms that underpin tech platforms. They may not have always used the precise terminology, but our findings highlighted that questions around algorithms—their transparency, platforms’ accountability for them—are not exclusive to media commentators, academics, or practitioners. If there is an appetite among the platforms’ many millions of users to be better informed about how these little black boxes operate, then it surely cannot continue to go unaddressed.

“Media/digital literacy” is a slippery and evolving term. Fallout from the 2016 election and the rise of fake news has breathed new life into it and contributed to the development of a variety of initiatives and research aimed at improving citizens’ ability to engage critically with news. Our research suggests that such initiatives are worthwhile and necessary. However, our research also shows that a case should be made for increased education around the existence and function of the algorithms that shape when, how, and why that news finds its way to users via tech platforms—in other words, algorithmic literacy. Education about the existence of these algorithms could arguably be rolled into existing media literacy efforts; education about their function relies, at least partially, on the willingness of the platform companies that engineer them. However, the key point here is that algorithmic literacy cannot be achieved without algorithmic transparency. That is the responsibility of the technology companies, which currently seem committed to keeping their algorithms as opaque as possible.

 

Acknowledgments

We would like to begin by thanking the John D. and Catherine T. MacArthur Foundation, the John S. and James L. Knight Foundation, the Foundation to Promote Open Society, and the Abrams Foundation, without whose generous support this research would not have been possible.

Thanks, too, to our colleagues at the Tow Center for Digital Journalism, particularly Nausicaa Renner and Abigail Hartstone, whose thoughtful feedback and copy-editing were as invaluable as always.

Thanks also to Ivy Tech Community College-Elkhart and Western Kentucky University for venue support, and to Sam Ford for his exceptional work in assisting our work in Kentucky.

Finally, we would like to extend heartfelt thanks to the people who gave up their time to participate in this research. Your openness, thoughtfulness, and insight made these focus groups a pleasure to moderate. Thank you.

Endnotes

[1] Participant names, where used, have been changed to protect their identities.

[2] Another response to distrust was news avoidance. For a discussion of this phenomenon, partially based on participants in this study, see Wenzel, Andrea (in progress) To Verify or to Disengage: Coping with ‘fake news’ and ambiguity.

 

Citations

[i] Elisa Shearer and Jeffrey Gottfried, “News Use Across Social Media Platforms 2017,” Pew Research Center, September 6, 2017, https://assets.pewresearch.org/wp-content/uploads/sites/13/2017/09/13163032/PJ_17.08.23_socialMediaUpdate_FINAL.pdf.

[ii] Nic Newman et al., “Reuters Institute Digital News Report 2017,” Reuters Institute for the Study of Journalism and University of Oxford, 2017, https://reutersinstitute.politics.ox.ac.uk/sites/default/files/Digital%20News%20Report%202017%20web_0.pdf.

[iii] Emily Bell and Taylor Owen, “The Platform Press: How Silicon Valley Reengineered Journalism,” Tow Center for Digital Journalism, March 29, 2017, https://www.cjr.org/tow_center_reports/platform-press-how-silicon-valley-reengineered-journalism.php.

[iv] Shearer and Gottfried, “News Use Across Social Media Platforms 2017.”

[v] Newman et al., “Reuters Institute Digital News Report 2017,” 12.

[vi] Newman et al., “Reuters Institute Digital News Report 2017.”

[vii] Michael Bloor et al., Focus Groups in Social Research (London: Sage, 2001).

[viii] Bloor et al., Focus Groups in Social Research.

[ix] Mary Madden, Amanda Lenhart, and Claire Fontaine, “How Youth Navigate the News Landscape,” Knight Foundation, February 2017, 4, https://kf-site-production.s3.amazonaws.com/publications/pdfs/000/000/230/original/Youth_News.pdf.

[x] The Associated Press, NORC Center for Public Affairs, and American Press Institute, “A New Understanding: What Makes People Trust the News,” April 2016, https://www.mediainsight.org/PDFs/Trust/TrustFinal.pdf.

[xi] Alexis C. Madrigal, “Dark Social: We Have the Whole History of the Web Wrong,” The Atlantic, October 12, 2012, https://www.theatlantic.com/technology/archive/2012/10/dark-social-we-have-the-whole-history-of-the-web-wrong/263523/.

[xii] John West, “Publishers Are Desperately Pivoting to Video—but They Should Be Standing up to Facebook,” Quartz, July 26, 2017, https://qz.com/1038396/publishers-are-desperately-pivoting-to-video-but-they-should-be-standing-up-to-facebook/.

[xiii] West, “Publishers Are Desperately Pivoting to Video.”

[xiv] West, “Publishers Are Desperately Pivoting to Video.”

[xv] Emily Bell, “Facebook Is Eating the World,” Columbia Journalism Review, March 7, 2016, https://www.cjr.org/analysis/facebook_and_media.php.

[xvi] Bell and Owen, “The Platform Press.”

[xvii] Bell and Owen, “The Platform Press.”

[xviii] Bell and Owen, “The Platform Press.”

[xix] Devin Coldewey, “Reddit Overhauls Upvote Algorithm to Thwart Cheaters and Show the Site’s True Scale,” TechCrunch, December 6, 2016, https://techcrunch.com/2016/12/06/reddit-overhauls-upvote-algorithm-to-thwart-cheaters-and-show-the-sites-true-scale/.

[xx] Instagram, “See the Moments You Care About First,” March 15, 2016, https://blog.instagram.com/post/141107034797/160315-news.

[xxi] Christopher Slowe, “Scores on Posts Are about to Start Going Up,” Reddit, December 6, 2016, https://www.reddit.com/r/announcements/comments/5gvd6b/scores_on_posts_are_about_to_start_going_up/.

[xxii] Michael Nunez, “Former Facebook Workers: We Routinely Suppressed Conservative News,” Gizmodo, May 9, 2016, https://gizmodo.com/former-facebook-workers-we-routinely-suppressed-conser-1775461006.

[xxiii] David Uberti, “Facebook Wants You to Think It’s Just a Platform. It’s Not.” Columbia Journalism Review, May 11, 2016, https://www.cjr.org/innovations/in_at_least_one_respect.php.

[xxiv] Tamar Wilner, “Facebook Appeals to Texas Reporters During Local Journalism Roadshow,” Columbia Journalism Review, February 13, 2017, https://www.cjr.org/local_news/facebook_local_news_texas.php.

[xxv] Emily Bell, “The Facebook Rescue That Wasn’t,” Columbia Journalism Review, Spring 2017, https://www.cjr.org/local_news/facebook-rescue-watershed-post.php.

[xxvi] Bell and Owen, “The Platform Press.”

[xxvii] Shearer and Gottfried, “News Use Across Social Media Platforms 2017.”

[xxviii] Newman et al., “Reuters Institute Digital News Report 2017.”

[xxix] Andrew Anker, “Showing Publisher Logos in Trending and Search,” Facebook, August 22, 2017, https://media.fb.com/2017/08/22/showing-publisher-logos-in-trending-and-search/.

[xxx] Craig Silverman and Jane Lytvynenko, “How a Hoax Made to Look Like a Guardian Article Made Its Way to Russian Media,” BuzzFeed, August 15, 2017, https://www.buzzfeed.com/craigsilverman/how-a-hoax-made-to-look-like-a-guardian-article-made-its?utm_term=.oil4Lqz5Ym#.qrON48q5nr.

[xxxi] W. Phillips Davison, “The Third-Person Effect in Communication,” Public Opinion Quarterly 47, no. 1 (January 1, 1983): 1–15.

[xxxii] Claire Wardle, “Fake News. It’s Complicated.” First Draft, February 16, 2017, https://medium.com/1st-draft/fake-news-its-complicated-d0f773766c79.

[xxxiii] Lucia Moses, “Publishers Obsess about User Experience, but Worry about Giving Up Revenue,” Digiday, September 7, 2017, https://digiday.com/media/publishers-obsess-user-experience-worry-giving-revenue/.

 

This project is underwritten by the John D. and Catherine T. MacArthur Foundation, with additional support by the John S. and James L. Knight Foundation, the Foundation to Promote Open Society, and The Abrams Foundation.

Pete Brown, Andrea Wenzel, and Meritxell Roca-Sales are the authors of this report. Pete Brown and Andrea Wenzel are fellows at the Tow Center for Digital Journalism. Meritxell Roca-Sales is project director for the Platforms and Publishers project at the Tow Center.

About the Tow Center

The Tow Center for Digital Journalism at Columbia's Graduate School of Journalism, a partner of CJR, is a research center exploring the ways in which technology is changing journalism, its practice and its consumption — as we seek new ways to judge the reliability, standards, and credibility of information online.
