The Media Today

Facebook shuts down possible Russian troll network ahead of midterms

August 1, 2018

The memory of what happened during the 2016 election is likely still fresh for Mark Zuckerberg—how he failed to take action against a Russian troll network running a misinformation campaign aimed at influencing the election, and was ordered to appear before Congress for a dressing down. This time around, the Facebook CEO is doing his best to crack down on similar behavior before it becomes a problem in the current political season: The company said Tuesday it had shut down more than 30 accounts and pages that were exhibiting behavior similar to that of the Russian troll farm known as the Internet Research Agency.

In its blog post announcing the move, Facebook said it couldn’t confirm whether the disinformation tactics it identified (which it called “coordinated inauthentic behavior”) came from Russian sources. According to Facebook, the accounts in question were “more careful to cover their tracks” than the Internet Research Agency was, using virtual private networks to disguise their location and paying third parties to run ads. “As we’ve told law enforcement and Congress, we still don’t have firm evidence to say with certainty who’s behind this effort,” Facebook said. The company admitted, however, that some of the activity—although not explicitly political in nature—was consistent with what it saw during the election, and that there was some evidence of a connection between the latest group of accounts and the Internet Research Agency accounts disabled last year.

ICYMI: NYT reporter’s tweet sparks controversy among journalists

But some observers appear to have already jumped to that conclusion. Democratic Senator Mark Warner, vice-chairman of the Senate Intelligence Committee, said in a prepared statement and on Twitter that he believes the campaign was also the work of Russian intelligence agencies. “More evidence the Kremlin continues to exploit platforms like Facebook to sow division and spread disinformation,” Warner said. (Warner’s own proposal to regulate social media was leaked earlier this week.)

Facebook’s head of cybersecurity said the company first identified suspicious activity on Facebook and Instagram two weeks ago, and ultimately came up with eight pages and 17 profiles on Facebook and seven accounts on Instagram that appeared to be part of a coordinated troll network. The earliest was created in March of 2017, and close to 300,000 users followed at least one of the pages. Facebook says it shared the information with US law enforcement as well as Congress, other unnamed technology companies, and the Atlantic Council’s Digital Forensic Research Lab.

Facebook’s analysis appears to show that the amount of influence these fake accounts had was relatively small: They created a total of 9,500 organic posts (i.e., not ads) and ran about 150 ads that cost about $11,000, which they paid for in Canadian and US dollars. They created 30 events, but about half of these had fewer than 100 accounts that said they were interested in attending. But the company clearly doesn’t want to take any chances this time around, especially with the midterm elections so close.

Here’s more on Facebook’s ongoing war against misinformation and “inauthentic behavior.”

  • A tangled web: Facebook’s head of cybersecurity described some of the suspicious accounts, which included pages with names like “Black Elevation” and “Mindful Being.” In one case, a page run by the Internet Research Agency and later disabled by Facebook shared an event created by a group called Resisters—a group that was recently involved in organizing a counter-protest to a planned sequel of the Unite the Right rally in Charlottesville, Virginia.
  • Sowing discord: Earlier this year, special counsel Robert Mueller indicted 13 Russian individuals and three Russian companies for using social-media messages, fake personas and staged rallies with the “strategic goal to sow discord in the U.S. political system.” President Trump has said the Russian government did not intervene in the US election, despite the fact that Mueller and most intelligence agencies believe it did.
  • The Troll Farm: Adrian Chen, writing for The New York Times Magazine, visited the Russian headquarters of the Internet Research Agency in 2015 and wrote one of the first in-depth accounts of the troll farm and its activities. “Russia’s information war might be thought of as the biggest trolling operation in history,” Chen wrote. “And its target is nothing less than the utility of the Internet as a democratic space.”
  • A futile crackdown? Some argue that the latest trolling behavior shows how futile Facebook’s recent crackdown on political advertising is, since the inauthentic accounts in question didn’t purchase obviously political ads (which are now segregated by Facebook in a special database), didn’t post messages in support of any specific candidate, and didn’t pay in Russian currency or use Russian addresses.
  • No smoking gun: Alex Stamos, chief security officer at Facebook, was said to have clashed with other Facebook executives over a previous report on malicious activity leading up to the 2016 election, and whether to link the bad actors to Russia or not. The latest report doesn’t do that either, but Stamos says in a blog post that this is because identifying who exactly is behind such an attack is very difficult.

Other notable stories:

  • Vanity Fair media writer Joe Pompeo says The New York Times has thrown a monkey wrench into plans for who might succeed Dean Baquet as executive editor (Baquet is turning 62 and Times editors typically retire at 65). Managing editor Joe Kahn and editorial page editor James Bennet were said to be the top candidates, but sources say newly appointed metro editor Clifford Levy is now also in the running.
  • The founders of The Colorado Sun, a digital journalism startup staffed by former Denver Post journalists who were laid off by their hedge-fund owner, talked to Poynter about their plans to cover the state and their fundraising efforts, which include a grant from blockchain-journalism startup Civil and $161,000 raised from 2,600 donors during a recent Kickstarter campaign.
  • Photographer Tom Starkweather uses pictures to tell a story for CJR about New York’s “honor boxes,” the iconic newspaper-stuffed containers that used to sit on every street corner but have become increasingly rare in the digital age.
  • Axios says that Yahoo is going to launch a streaming video service for its financial hub by the end of the year. The service is expected to include eight hours a day of live market and global financial news updates, which could make it a competitor for traditional financial networks such as CNBC, Fox Business and Bloomberg News, as well as for digital startups such as Cheddar.
  • Emma Best, a freedom of information activist, has published a searchable database of more than 11,000 direct messages sent by WikiLeaks to its supporters and advisers via Twitter. The messages appear to show WikiLeaks coordinating attacks and smear campaigns aimed at its media critics as well as prominent politicians, and also contain what some allege are anti-Semitic sentiments.
  • Christine Schmidt writes for NiemanLab about new research that shows low-income individuals tend to get lower quality news and information, and talks with researchers Fiona Morgan and Jay Hamilton about what the industry can do to try and redress that balance. “It’s a known but not discussed issue for journalists, that you know there are always stories that aren’t getting told,” says Morgan.
  • ICYMI: We spoke with writers, journalists, and novelists about writing on the web. One writer and one article were mentioned over and over again.

Mathew Ingram is CJR’s chief digital writer. Previously, he was a senior writer with Fortune magazine. He has written about the intersection between media and technology since the earliest days of the commercial internet. His writing has been published in the Washington Post and the Financial Times as well as by Reuters and Bloomberg.