The tipoff came in a tweet.
In an April meeting of the Senate Finance Committee, a tragically buttoned-up affair, the subject of the day was tariff policy. It would have remained an event only of concern to the most deeply wonky of Beltway insiders, had Pat Roberts, the senior senator from Kansas, remembered to silence his phone.
Roberts had just finished asking a question when suddenly, from his coat pocket, came singer Idina Menzel, belting “Let It Go,” the hit song from the animated film sensation Frozen. A sweet, warbling children’s song; a cantankerous senator. All eyes shifted to Roberts. “Just let it go, mister,” he said, as if he knew the internet was watching.
That’s when Alyssa Kurtzman, a 26-year-old producer leading the trending team of NowThis, jumped into action. For NowThis, which publishes its news videos in rapid fire onto social media platforms, Roberts’ cellphone slip is exactly the kind of easily digestible moment that leads to lots of social shares. But Kurtzman, like the rest of the media, hadn’t been watching the hearing. She never would have known about the incident without an alert from a digital tool on her desk called Dataminr.
Dataminr is a stealthy tool that scours Twitter, looking for tweets that its algorithm considers important and newsworthy. It was Dataminr that alerted Kurtzman to a tweet from an attendee, allowing her to call all hands on deck to search for a video. After finding footage on C-SPAN’s AV website, they cut a quick story. The resulting video won the day.
Tools like Dataminr are woven so tightly into Kurtzman’s workday, there’s almost never a time when they’re not in use. On the morning I visited NowThis headquarters, Kurtzman’s team had already cut a video based on a Dataminr alert, which directed them to a tweet from a bystander near Penn Station, who’d just watched the NYPD shoot a suspect. While one producer cut a video of Bill Clinton’s appearance on David Letterman (“We like to get about six videos per producer, per day,” Kurtzman told me), another scanned Dataminr, tracking responses from a Jeb Bush campaign event. By 11 am, Kurtzman’s staff had received 57 Dataminr alerts, a small chunk of the hundreds they’d sift through by day’s end.
In the fast-paced news cycle of places like NowThis, an emerging generation of “social listening” tools like Dataminr looms large. These tools allow reporters to find a breaking story faster than news outlets have typically been able to, and they’ve proven so effective that even scoop-brokers like CNN, the Associated Press, and the New York Post employ them in their newsrooms. Their benefit comes from allowing editors to spot a story that might be lost in a cluster of their own feeds. “If I have one problem with Twitter it’s just that it’s so quick and so ephemeral. It’s so easy to miss things,” said Kurtzman. “If it’s a breaking story, nine times out of 10 we see it on Dataminr before we see it anywhere else.”
Social sites like Facebook and Twitter have become the unofficial homepage of the internet, and increasingly our gateway to the news. And as the mass of users on the social web has expanded, tweets and Facebook posts have become just as valuable for story-hunting as they are for story promotion. But beating the masses to a scoop on Twitter is less a game of skill than stamina; it requires wading through the glut of posts—millions and millions of pieces of content, a number that’s continually expanding—for a gem.
The problem is that the perfect post isn’t likely to attract a journalist’s attention until thousands of people have tweeted it—which means others have already written about it. This challenge is scaling beyond human abilities. That’s why “social listening” devices, like Dataminr, have become the best way for journalists to cut through the noise. They don’t get lost in the wave of social posts. Instead, they spot the one random tweet among millions and predict whether it will be big news.
Let’s say a cop shoots an unarmed black teenager in a little-known suburb of St. Louis. The news might get covered in the local press, and eventually by the national press once the masses start tweeting or posting on Facebook. But it will register as newsworthy far more quickly on social listening devices. A journalist searching for “gun violence” or scanning feeds from Midwestern states could find it from their office in New York or LA, swoop in and write about the story before the rest of the media move in. At their best, social listening devices can elevate a small local homicide into a major news event: Ferguson. Or, more absurdly, provide an elusive warning signal that five hours from now everyone will be talking about one particular dress.
But as these tools become increasingly crucial in sorting what is news from what is chatter, it’s hard not to wonder who or what is deciding what journalism reaches the public. “We’re living in a world now where algorithms adjudicate more and more consequential decisions in our lives,” Nicholas Diakopoulos, a researcher studying algorithmic accountability at the University of Maryland, wrote in a recent report. “Algorithms, driven by vast troves of data, are the new power brokers in society.”
That doesn’t mean using a tool to sort through data isn’t useful and even beneficial to the public. These tools can help an editor spot a story that a couple years ago might have languished.
Which is why newsrooms are betting money on social listening technology, incorporating it into both their editorial structures and their business models. CrowdTangle, a social listening device that locates well-performing posts across Facebook, has been called the secret behind Upworthy’s wild success. Dataminr has traditionally targeted its products toward the finance industry (its tools are powerful enough to predict stock market fluctuations). But after it released a system targeted to journalists in January, it garnered subscribers in over 150 newsrooms across the US. In 2012, Mashable increased its traffic when it introduced “Velocity,” an in-house social listening device that reporters now use to surface most of the stories on the site.
Yet these same companies are only beginning to set the rules for how to incorporate these tools into their editorial practices—and how to beat the competition in a way that’s consistent with journalistic ethics. I reached out to dozens of publishers for this story; roughly half declined to participate or simply never returned my many emails and phone calls. And many of the journalists who work with these tools on a daily basis requested anonymity, some citing company policies, but many, more interestingly, fearing retribution from their readers. “These tools are kind of weird,” one digital editor at a metro newspaper told me, “and I’m not quite sure what to think about them.” “I think about what it might be like to work somewhere where you don’t have these tools,” another told me. “You would not have a shot at being ahead of anybody.”
Which raises the question: What does it mean when we give an algorithm a say in story selection? Because like it or not, it’s already happening.
Around 2011, when Paul Quigley began to envision a way of sorting stories, he had a rather expansive vision of what he hoped to find. “I wanted a share box, for the whole internet, to see what the most talked about things in the world are,” Quigley told me over coffee in the chic midtown co-working space that houses the growing New York branch of his company, News Whip.
Based in Quigley’s native Ireland, News Whip bills itself as “a human signal of what matters right now.” That “human signal” is measured by Spike, a tool that tracks—or attempts to track—every piece of content published on the internet. It also tracks how quickly a story is shared across both Facebook and Twitter, a metric the company calls “social velocity.”
According to Quigley, measuring speed of sharing allows Spike to zero in on something that’s just beginning to get hot—a YouTube video of a hostile police officer, for instance, or an important business merger that’s only been covered by a local outfit in Manitoba. It allows traders to take action, and journalists to get the story to their site before the rest of the world is already talking about it. Ideally, Spike transforms the kind of scoop that used to be a mix of alchemy and chance into an easily replicated science. Editors used to find these kinds of stories by catching the right news tip, or listening to the right police radio, at the right time. With News Whip all it takes is the right filter.
And News Whip’s predictions, it seems, are usually on the right track. In February of 2014, the company commissioned the Irish Centre for High-End Computing to study how effective its algorithm is. The Centre analyzed 140,000 stories that passed through Spike’s 1-hour box, which highlights stories published in the last hour that show signs of virality. The study found that 79 percent of the most-shared stories each day had been caught by Spike.
Which is why publishers have quietly adopted News Whip en masse. About 80 percent of the “top 25 most-shared English-language publishers” use Spike, according to Quigley, an array of sites that include the New York Post, Buzzfeed, and ESPN, all of which declined to comment for this story.
Though he was in town for interviews—the company intends to double its staff in the next year—Quigley was also drumming up business. He had just come from a pitch meeting with a group of editors at Hearst; the next night he was presenting his product at a conference for credit card companies.
Less than a decade ago, Quigley was a lawyer in New York, working as a litigator at the monolithic firm Simpson Thacher & Bartlett, and hating life. “You realize you’re not jealous of any of your partners’ jobs in your law firm,” Quigley recalls, “And you think, ‘wait, but I’m working towards that job.’ I’m going to end up there, with a house up in Westchester and the ride down the Metro North every day.” Quigley’s compact frame and boyish face give him a roguish appearance, offset mainly by his speech—which is slow and deliberate.
To distract himself from the drudgery of legal work, he took to the internet. “I got very interested in, how do you find the coolest stuff? I became very interested in curated newsletters and the like.”
And, like many other disaffected workers, he became very interested in Gawker. He found Gawker’s characteristic snark “smart and funny,” but mostly he was impressed by the speed with which Nick Denton’s scribes lifted interesting stuff off of the Web, re-appropriating it for their own site with a sharp Gawker-esque slant. Which is why, just five years into his legal career, Quigley quit his job in order to bring the Gawker model to his home country, with what would become a news media site he named News Whip.
But Quigley’s dream of being the Nick Denton of Ireland stalled. He and his bloggers struggled to find a voice, an audience, a revenue model. Eventually, Quigley decided, “if you’re putting out stuff that’s ‘OK,’ and you know there’s better stuff elsewhere, then why make stuff?” He ended contracts with his bloggers and closed up shop.
Yet the most useful aspect of Quigley’s short-lived media company was its original focus on locating interesting ideas quickly. Cool things, he noticed, were filtering onto Facebook and Twitter much faster than human beings could sort through the sprawl. And the importance of a piece of content on his site, he felt, was usually determined by an empirical measurement: How many people shared it on social media? What if he could build a tool that tracks that performance and surfaces things quicker than his team of human aggregators could? He connected with a computer programmer named Andrew Mullaney to build the vision into a product.
This time, he found a niche. About 300 publishers, brands, and communications companies use News Whip’s social listening tool, Spike, paying a monthly fee per user that ranges from $300 for a small organization to $5,000 for a large one. Kevin Lowe, an affable News Whip account executive, took me through the system. About 90 seconds after something is published on a site, Lowe explained, Spike begins tracking the link across Twitter and Facebook, pinging against both sites to see how quickly the post is appearing. Though employees won’t comment on the specifics of the algorithm, strong interactions, like shares and tweets, which signal endorsement, are weighted more heavily than easy interactions, like a comment or a “like.”
Based on these numbers, each story is assigned a constantly changing speed, which measures how quickly it’s rising within Spike’s algorithm. The most shared story of an ordinary day “starts at about 3,000,” explains Lowe. A major media event, like Charlie Hebdo: “more like 8,000 or 9,000.” A graph next to each story shows its prevalence in a given network, which tends to follow a pattern. “You see a burst of velocity on Twitter,” says Quigley, “followed by an awkward sloping line as it moves to Facebook.”
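News Whip won’t disclose its formula, but the basic idea Lowe describes, polling a link’s interaction counts and weighting strong signals like shares above weaker ones like likes, can be sketched in a few lines. Everything here (the weights, the field names, the polling times) is an illustrative guess, not News Whip’s actual implementation:

```python
from dataclasses import dataclass

@dataclass
class Snapshot:
    """Interaction counts for one story at one polling time."""
    t: float         # seconds since the story was published
    shares: int      # strong signal: retweets, Facebook shares
    comments: int    # weaker signal
    likes: int       # weakest signal

def velocity(prev: Snapshot, curr: Snapshot,
             w_share: float = 3.0, w_comment: float = 1.5,
             w_like: float = 1.0) -> float:
    """Weighted interactions gained per minute between two polls.
    The weights are hypothetical, chosen only to show shares counting
    more than comments, and comments more than likes."""
    dt_min = (curr.t - prev.t) / 60.0
    if dt_min <= 0:
        return 0.0
    gained = (w_share * (curr.shares - prev.shares)
              + w_comment * (curr.comments - prev.comments)
              + w_like * (curr.likes - prev.likes))
    return gained / dt_min

# A story polled at 90 seconds and again at 150 seconds after publication:
a = Snapshot(t=90, shares=40, comments=10, likes=100)
b = Snapshot(t=150, shares=220, comments=35, likes=400)
print(velocity(a, b))
```

A real system would also decay the score over time, which is why the number next to each story in Spike keeps changing even after sharing slows down.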
The most powerful aspect of Spike is the fact that it can be filtered in different ways, allowing users to sort viral stories by keywords, like location, or a topic or a particular source. That means an editor at a political site, like The Blaze or ThinkProgress (both News Whip clients) can track small newspapers in Georgia or Michigan, looking for a provocative story on gun control or gay marriage, while brands like the American Kennel Club (another News Whip customer) can search for “golden retriever” or “border collie” to find, and highlight, the biggest dog stories of the day, sorted by breed.
It also means Quigley’s staff is mapping more and more media as the internet expands, or as they locate publishers that they’d missed, in a never-ending attempt to track thousands of small, niche areas. Currently, News Whip is trying to expand into international markets by contracting with native speakers to map media published in Russian, Polish, Arabic, and Japanese.
But, while tracking the most-shared content can be a powerful tool, it can also prove fallible. What people share on social media is only a small subset of what they actually read, a subset dominated by stories that provoke feelings of rage, triumph, or irreverence. What’s more, it’s hard to escape the fact that social media algorithms can be gamed—by homogenous groups that cluster together to lift a story beyond its natural reach, or by sneaky headlines.
“A lot of stories that go viral, they have a bent towards the totally outrageous or super disingenuous,” says Joe Ragazzo, deputy publisher of Talking Points Memo, which gives News Whip subscriptions to its news writers, who focus on aggregated and breaking stories. “They tend to have extremely high social velocity, because they’re really good at gaming headlines, or baiting outrage.”
Which hints at the problem: Measuring what gets shared is just another way of tracking what captures people’s attention, only earlier and more speedily than has ever been possible. And if a decade of nipple slips and Kim Kardashian footage has taught us anything, it’s this: People pay attention to a lot of crap. So, in a way, social listening tools have simply shifted the role of the editor: from someone trying to figure out what will capture people’s attention to someone sifting through what we already know will capture attention, in search of what’s actually quality news. Those same editors also have to be diligent in rooting out misinformation and hyperbole. Says Ragazzo, “You just have to be constantly checking in with the source—does it hold up to our editorial standards?”
In a similar vein, stories published without an easily sharable headline can get overlooked when tools focus mostly on shares and tweets. Holly Moore, the managing editor of USA Today’s Nation Now desk, regularly uses Spike to find trending local stories on Gannett sites that might be worthy of moving to the homepage of the national site. Recently, Moore followed the story of a science teacher in Salem, Oregon, who was being investigated for burning a student with a Tesla coil during class.
“I was like, whoa, that was an interesting story, I wonder if our local sites have it already,” says Moore. They did, but under the less click-y headline: “Science teacher still under investigation by school.”
As social listening tools become more integral to our lives, it’s also worth understanding their limits, says Gilad Lotan, chief data scientist at the New York tech incubator Betaworks, which helps build and analyze social listening tools for its companies.
The tools have many virtues, but the problem, Lotan says, is anticipating the bias in how a tool makes its decisions. Many algorithmic systems, like Twitter’s trending graph, work well in English, but don’t work well in other languages—which can slow stories that break outside of the English-speaking world. When Ebola began to spread through West Africa, the earliest reports were in French. “Had they been in English, it might have been identified by some of these algorithmic systems much earlier,” says Lotan.
And the algorithms themselves can favor specific content without intending to. In 2011, thought leaders and citizen activists were talking constantly about Occupy Wall Street on Twitter, yet while Kim Kardashian made it to the Twitter trending panel, the Occupy movement never did. Activists decided it was a conspiracy: Clearly Twitter management was purposefully keeping the movement down.
Yet when Lotan went into Twitter’s metadata, he found that the omission wasn’t purposeful; it was the result of Twitter normalizing the data, a process that favors topics that are increasing quickly over topics that are building slow, constant momentum. It was built into the system, ironically, to allow smaller movements like Occupy Wall Street to trend over a constantly talked-about celebrity, like Justin Bieber. And yet, “if you compare Occupy Wall Street and Kim Kardashian, her trend line was super bursty and then fell very fast while Occupy was slowly growing.” Occupy was the victim of a system that favors velocity over stamina.
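The dynamic Lotan describes can be shown with a toy calculation. Suppose a trending score is simply a topic’s current mention rate divided by its own recent baseline (a crude stand-in; Twitter’s real formula is not public, and the numbers below are invented). A quiet topic that suddenly bursts will vastly outscore a bigger topic that grows steadily:

```python
def trend_score(history: list[float], current: float) -> float:
    """Current mention rate relative to the topic's own recent average.
    A simplified stand-in for the normalization Lotan describes."""
    baseline = sum(history) / len(history)
    return current / max(baseline, 1.0)

# Hypothetical mentions per hour over six hours, then the current hour:
kardashian = [50, 60, 55, 40, 50, 45]             # quiet, then a sudden burst
occupy     = [900, 1000, 1100, 1200, 1300, 1400]  # large, steadily growing

print(trend_score(kardashian, 5000))  # huge jump over its own baseline
print(trend_score(occupy, 1500))      # barely above its own baseline
```

Under a rule like this, the topic with far fewer total mentions trends, because relative to its own quiet past it looks explosive, while the slowly building movement never clears the bar.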
“We’d like to think that the systems that we build are inherently democratic and that any piece of content could propagate and anyone is equal,” says Lotan. “But the more time you spend in these networks the more we realize that we’re really not all equal and there are users that have more strategic locations within these networks.”
That’s probably why just about every single editor that I spoke to emphasized that social listening tools are just that—tools—to be used with a hefty amount of editorial judgment. “We try to do the human oversight,” says Sarah Frank, an executive producer at NowThis, “because if we’re just data-driven it will drive us off a cliff.”
In theory, this is the opposite of the democratic, broadly sourced public opinion that social media is supposed to provide. But how it’s parsed is becoming a political decision. And a weighty one to plant on an editor rushing to identify breaking news stories. But like it or not, these tools are embedding further into newsrooms by the week. In late May, News Whip announced an additional $1.6 million in funding from a group of companies that includes The Associated Press, a heavy investment in the new editorial regime.
“We’ve had editors since the invention of mass media, for like 400 years, deciding what people should be reading,” says Quigley. “Ever since the first newspaper, the Daily Courant, in the 1600s, it’s been an editorial decision without much input. Now we’ve got input for the first time. So I hope there’s some good in that. I’m an optimist: Fingers crossed.”

Alexis Sobel Fitts is a senior writer at CJR. Follow her on Twitter at @fittsofalexis. A version of this article appeared in the July/August issue of CJR under the headline, “Viral hunting.”