Illustration by Hanna Barczyk

Making media literacy great again

A basic understanding of where news comes from is back on the syllabus as students navigate an increasingly bewildering media environment

October 13, 2017

Professor Carl T. Bergstrom began his first lecture for INFO198 at the University of Washington with a declaration about America. “There is so much bullshit,” he said, looking up at 160 students last spring. “We are drowning in it.” Bergstrom’s audience didn’t seem surprised or outraged by his phraseology. They had surely heard that word before, but they no doubt also recognized it from the title in the course catalog: “Calling Bullshit in the Age of Big Data.”

As Bergstrom spoke, a picture of Hillary Clinton and Donald Trump appeared on a screen behind him, followed moments later by a photo of a young woman typing on her phone. “The average American spends nearly an hour a day on Facebook,” he said. “Doing what? Mostly spreading bullshit.” The students laughed. Then Bergstrom shouted, “Enough! Enough bullshit! We are tired of this.”

That roughly explains how Bergstrom, an evolutionary biologist, wound up in a lecture hall declaring war on fake news. He and his colleague Jevin West, a data science professor, launched the class this past spring, not long after hoax stories, Russian bots, and clickbait headlines wreaked havoc on the US electoral process. The professors were also concerned about misleading science stories, journalism by press release, and the way interest groups and corporations twist data. The class filled up in less than a minute, with several hundred students turned away.

“We wanted to teach students how to evaluate the onslaught of information in their lives,” West tells CJR. “There’s information warfare going on right now.”

As a data expert, West lives in a world where algorithms and machine learning solve human problems. He is encouraged that Facebook, Google, and Twitter are rolling out such tools to eradicate fake news and surface trustworthy content. But he knows that ones and zeros alone can’t solve the problem.

Later in the semester, West and Bergstrom offered students a striking reason why, examining a fake news story about vaccines causing shaken baby syndrome. The claim was so absurd that literally no content existed online to refute it. Searching Google for this phrase—“do vaccinations cause shaken baby syndrome?”—only turned up links to other bogus websites that repeated and expanded on the invented data.


“We need a cultural solution as well,” West tells CJR. “That’s why we’re doing this.”

At least a dozen universities around the country have launched or are planning similar classes, using “Calling Bullshit” and curriculum from Stony Brook University’s Center for News Literacy as templates. There has been a burst of interest in secondary education as well, with legislators in at least 15 states introducing or recently passing laws mandating digitally focused media literacy instruction in public schools.

“This can get real,” West said that first day of class, taking over the mic from Bergstrom. “And when the shit gets real, that’s when we should care as a society.”

 

Fake news is not new.

Humans have manipulated and fabricated information for centuries—to persuade, confuse, entertain. There was Yellow Journalism, of course. During World War II, the United States used propaganda on American citizens to rally the country. And Adolf Hitler was a master of fake news.

Also not a new idea: media literacy. In the 1930s, an ex-journalist named Clyde Miller started the Institute for Propaganda Analysis, which designed curriculum for educators to teach students to recognize seven different propaganda devices. One was “glittering generalities,” defined as “[a]n attempt to sway emotions through the use of shining ideals or virtues, such as freedom, justice, truth, education, democracy in a large, general way.”

Schools have taught media literacy concepts for decades, but obviously never in an environment like this one, where owning a printing press or TV satellite isn’t needed to quickly and widely disseminate information. That, combined with hyperpartisan politics, has led to the weaponization of news by individuals, political groups, and foreign countries.

The old tools of media literacy—source checking, relying on known outlets—aren’t enough when a hacker in Macedonia can easily create a website that looks legitimate, then quickly make thousands of dollars from advertising as bogus stories circulate. Scrolling through social media feeds produces one challenge after another, from the serious to the mundane.

Illustration by Hanna Barczyk

Just ask the “Calling Bullshit” students.

“So many people just see stuff and share it,” says Conner Ardman, 19, who took the class and plans to major in Informatics with a concentration in data science. “They don’t even look at it. They have no idea what it is.”

Early in the semester, the professors displayed several memes that had been circulating online. The students voted electronically on whether the claims were true.

“You need to be careful on the toilet, you guys,” West said, showing a widely shared picture of a toilet with text reading, “No one tells you, but more than 30,000 Americans are injured each year in the process of using the toilet.”

Much laughter.

“Bullshit,” West said. “Or not bullshit?”

The students voted. Roughly 65 percent thought it was bullshit.

“It’s not bullshit,” West said. “This one is really true. People really do get injured on toilets.”

 

The truth of toilet danger raises a rather profound question for our time: If something online that sounds ridiculous is true, how can internet users know if something that sounds potentially plausible is actually false? That’s what classes like “Calling Bullshit” are trying to address. Though fake news might seem like a know-it-when-you-see-it nuisance, students say that social networks make it exactly the opposite.

“It’s hard because you normally trust your friends when they tell you something,” says Jessica Basa, 20, a University of Washington computer science and informatics major who took the class. “They’re not trying to trick you when they post stuff they think is interesting but isn’t true.”

Some of the instruction seems old and obvious, yet also newly relevant. As a first line of defense, the “Calling Bullshit” professors instructed students to ask themselves three questions when encountering a news story, scientific study, or complicated data: Who is telling me this? How do they know it? What’s in it for them?

“If you were going to a car dealership, you would be asking these types of questions,” West said during a lecture. “We want you to have that kind of frame of mind going forward. . . . You should put your skeptical hat on, especially when you’re in these digital environments.”

West and Bergstrom introduced theoretical defenses as well, such as Occam’s Razor, a 14th-century principle that says the simplest explanation is usually correct. They paired this with the so-called “Bullshit Asymmetry Principle,” an idea tweeted by an Italian software developer in 2013. It states, “The amount of energy needed to refute bullshit is an order of magnitude bigger than to produce it.” For evidence, the professors offered the Comet Ping Pong saga—allegations that Hillary Clinton ran a child sex ring in the basement of a Washington, DC, pizza shop. The fiction took months to dislodge, and references still pop up on social networks, shared by those who believe it or simply want to use it as ammo against the opposition.

Other detection tactics, such as the ones taught at Stony Brook, are more technical and workmanlike, but equally important. Students are taught how to look up domain name registration records. Though these records can be fudged, any English-language news site originating in Eastern Europe—the vast majority of fake news is manufactured there—is likely to be bogus. Another tool: image searches. To track down a photo’s origins, simply drop the questionable image into the search bar on Google Images. Definite warning signs: poor grammar and cheap-looking design.
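
For readers curious what that records check actually involves, here is a minimal sketch in Python of the kind of lookup the web-based WHOIS tools perform behind the scenes. It speaks the public WHOIS protocol directly; the Verisign server named below handles only .com and .net domains, and the fields printed are illustrative rather than a complete authenticity test. It is not part of the Stony Brook curriculum, just an illustration.

import socket

def whois_lookup(domain: str, server: str = "whois.verisign-grs.com") -> str:
    """Return the raw WHOIS record for a .com or .net domain as plain text."""
    # The WHOIS protocol (RFC 3912): open TCP port 43, send the domain name,
    # and read until the server closes the connection.
    with socket.create_connection((server, 43), timeout=10) as sock:
        sock.sendall((domain + "\r\n").encode("ascii"))
        chunks = []
        while True:
            data = sock.recv(4096)
            if not data:
                break
            chunks.append(data)
    return b"".join(chunks).decode("utf-8", errors="replace")

if __name__ == "__main__":
    record = whois_lookup("example.com")
    # A very recent creation date or a privacy-masked registrant is a warning sign.
    for line in record.splitlines():
        if line.strip().startswith(("Creation Date", "Registrar:", "Registrant")):
            print(line.strip())

In practice, students use web-based lookup tools rather than code, but the record they read back is the same.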

Stony Brook became a leader in media literacy soon after Howard Schneider, a former Newsday editor, founded the university’s journalism school in 2006. Besides training the next generation of journalists, Schneider was prophetic in recognizing that there should be, as he put it in a 2007 Nieman Reports article, an additional mission “of equal—perhaps greater—importance: to educate the next generation of news consumers.”

“The digital revolution might bring the promise of enlightenment, but in its pathological lack of accountability might just as easily spread a virus of confusion and disinformation,” Schneider wrote. “The ultimate check against an inaccurate or irresponsible press never would be just better-trained journalists, or more press critics and ethical codes.”

What was the answer?

“Consumers who could differentiate between raw, unmediated information coursing through the Internet and independent, verified journalism,” Schneider wrote.

More than 10,000 students have taken Stony Brook’s news literacy course, which is constantly updated to help students identify the latest ways bogus news and information are created. For instance, there are dozens of websites that let anyone easily produce counterfeit social media posts, then retweet them, post them on Facebook, or embed them in a news story. But fake tweets seem positively quaint compared to an even newer threat: using artificial intelligence to make videos of people saying things they didn’t say. Researchers recently made a video of Barack Obama speaking very earnestly about his priorities for the waning days of his administration.

“The single most important thing I can do now,” Obama said, according to the doctored audio track, “is to play golf.”

 

The following sentence is not fake news, nor does it rely on bullshit data: Media literacy works, and it just might save humanity. At least part of that sentence is backed up by a recent study that examines how people judge the accuracy of claims about controversial subjects. Researchers from the University of California, Riverside, and Santa Clara University presented more than 2,000 teenagers and young adults up to age 27 with assertions, some accurate and some not, connected to stories on highly charged topics such as economic inequality and tax policy. The study found that claims about controversial topics, whether true or false, were more often than not identified as accurate if they aligned with the person’s prior views. That’s essentially confirmation bias: seeking out and believing information that strengthens your worldview. Sobering. Also, not surprising.

Then the researchers dug further into the data, producing remarkable findings. They looked at two subsets of study subjects: those with higher-than-average political knowledge but no media literacy training, and those who had little or no political knowledge but took media literacy courses. Having political knowledge “did not improve judgments of accuracy,” the authors wrote. Media literacy education did. Other recent studies have hinted at similar success, but it’s unknown whether these lessons carry as much weight once students enter the real world.

The study’s authors were cautiously optimistic, noting the results seem to advance the notion of so-called “critical loyalty.” “Those with critical loyalty still hold strong values and beliefs,” the authors write, “but they adopt a critical stance when evaluating an argument—even when that argument aligns with their partisan preferences.” In other words, those with media literacy training can still be fiercely committed to their worldview, but they can also successfully question flimsy claims. They can call bullshit. Maybe they can even stop spreading it.

Some caution about these results is warranted. People don’t just share fake news because they don’t know any better. A Pew Research Center survey last year found that 14 percent of US adults shared news they knew was fake. In many cases, researchers say, it’s an identity thing—to show what groups and ideas they agree with, to feel part of a movement, even for entertainment. Then there’s the problem of volume. Researchers have recently used complicated mathematical models to show that the sheer amount of information shared online—both real and fake—creates a sort of whack-a-mole situation, in which many moles survive to surface another day.

Nevertheless, media literacy educators are excited by this moment. For one thing, they hope it leads to more resources. This past summer, the Knight Foundation awarded $1 million in grants to 20 media literacy projects around the country, including “Calling Bullshit.” But educators also hope the enthusiasm will lead to meaningful changes in how students and adults deal with the onslaught of digital information. They are especially encouraged that both red and blue states are passing or considering laws mandating new digitally focused media literacy initiatives at the secondary school level. California, a liberal bastion, passed one such measure. Texas, at the other end of the political spectrum, is contemplating one.

At the last lecture of “Calling Bullshit,” Bergstrom focused on a future in which the course’s inaugural students go out into the world with their new skills, share what they know, and stop bullshit in its tracks.

“Starting with this class and spreading out further,” he said, “I’m hoping we can make a difference.”

Ardman, the aspiring informatics major, is already trying. Not long ago, a friend on Facebook posted a story saying an FBI agent investigating Hillary Clinton’s private email server had killed his wife, then himself.

“This can’t be right,” Ardman thought.

He checked it out and posted a comment with his findings: fake news.

“It’s important to call it out,” Ardman says. “It’s your civic duty.”


Back to school

For today’s college students, a dubious story, photo, or video shared by a friend on social media can enter the bloodstream faster than a Jell-O shot. Here are key takeaways from students enrolled in the growing number of media literacy classes at colleges and universities around the country.

 

Sara Schabe

Junior, Stony Brook University

Major: Journalism

“I was like, ‘I’m such an idiot.’ ”

Before Hurricane Sandy hit New Jersey a few years ago, Schabe saw a photo online of an eerie storm cloud looming over New York City. “I thought it was a real picture,” she says. In class, she learned it was doctored from a movie. “I was like, ‘I’m such an idiot.’”

Lesson learned: Use Google’s reverse image search to check whether photos are legit.

 

Navid Azodi

Senior, University of Washington

Major: Business and information systems

“Just liking one fake story can become this huge downward spiral.”

“Some of these fake news sites look totally real,” Azodi says, pointing to the widely shared fake story claiming the Pope had endorsed Donald Trump. “Just liking one fake story can become this huge downward spiral.”

Lesson learned: If a news outlet is unfamiliar, look up domain records and use online fact-checkers such as Snopes.

 

Starla Sampaco

Senior, University of Washington

Major: Video journalism

“Debunking fake news is really hard.”

A surprising but fundamental research finding about fake news is that the more a false claim gets repeated, even in the course of refuting it, the harder it is to dislodge, particularly among those with strong ideological views. “You can’t just keep saying something isn’t true,” Sampaco says. “That’s why debunking fake news is really hard.”

Lesson learned: People don’t just share fake news because they’ve been duped—it’s a way to strengthen their own beliefs and position.

 

Gary Ghayrat

Junior, Stony Brook University

Major: Journalism

“You have to use multiple sources.”

“It was really shocking to learn what a rigorous process it is to publish a story,” Ghayrat says. “You have to have multiple sources [and] corroborate them with other sources to make sure you’re getting everything right.”

Lesson learned: Look for the basics of journalism in every story. Are the sources named? Who provided the information? Is there an opposing viewpoint? Is it too good to be true?

Michael Rosenwald is a reporter at the Washington Post. He has also written for The New Yorker, Esquire, and The Economist. Follow him on Twitter @mikerosenwald.