What YouTube doesn’t say when it announces the removal of offensive videos

Google would really like everyone to know that its video sharing service, YouTube, is cracking down on offensive content. To that end, the company put on a full-court press this week, announcing that between April and June it had removed more than 100,000 videos and 17,000 channels for violating its hate speech rules, which is five times more than it removed in the previous three months. Google said it also took down more than 500 million comments because they included hate speech. According to a blog post about the crackdown, YouTube’s moderators removed about 30,000 videos last month alone. 

How popular were the offensive videos compared to the rest of what’s on YouTube? Google would like you to know that they “generated just 3% of the views that knitting videos did over the same time period.” In other words, Google won’t say. Instead of concrete numbers, we get a comparison to something else—knitting—that we also have no details about, in a way that provides only the illusion of transparency. How popular are knitting videos?

Still, knowing next to nothing about a positive development is better than seeing a turn for the worse. Just days before the announcement about the removal of videos, YouTube reinstated two accounts that it had previously banned after much criticism—one belonging to Martin Sellner, a white supremacist, and another belonging to a British YouTube broadcaster who calls himself The Iconoclast. Both have ties to the movement that inspired the shooter who opened fire on a mosque in Christchurch, New Zealand—and spread video of the shooting on YouTube. Why the sudden change of heart? Are new criteria being applied? All YouTube would say is that, though many “may find the viewpoints expressed in these channels deeply offensive,” the channels in question did not violate YouTube’s community guidelines after all.

ICYMI: I now publish #MeToo stories on my blog, for free. Here’s why.

In a recent open letter to people who use YouTube, Susan Wojcicki, the CEO, wrote that YouTube struggles to strike the right balance between allowing users to exercise their freedom of speech and providing a platform for hate. A commitment to openness “sometimes means leaving up content that is outside the mainstream, controversial or even offensive,” Wojcicki argued. She added that “hearing a broad range of perspectives ultimately makes us a stronger and more informed society.” There’s no question that monitoring and judging content is a difficult task for many digital platforms, and both Facebook and Twitter have had some significant failures. Yet YouTube seems to have spent the least amount of time engaged with the problem. And while Facebook is not known for its transparency, YouTube may be even more opaque when it comes to explaining its methods and rulings.

What we know from interviews with former YouTube staffers is that the company has spent much of its history caring about one thing above all else: engagement, or the amount of time users spend on the service and how often they click. Increasing those numbers has taken precedence over removing offensive content, those former staffers say, and it shows. Google would very much like you to think that all of the bad days are in the past. But it won’t provide enough detail about what it is doing to back up those statements, and Wojcicki’s statements leave the door wide open for all kinds of offensive content.


Here’s more on YouTube and its problems:

  • Radicalization engine: White supremacists and other right-wing agitators often say they were radicalized in part by the recommendation algorithm on YouTube, which they say can turn into a black hole of ever more conspiratorial video clips. “I think YouTube certainly played a role in my shift to the right through the recommendations I got,” a former right-wing radical told The Daily Beast. “It led me to discover other content that was very much right of center, and this only got progressively worse over time.”
  • Time spent: Guillaume Chaslot, a programmer who worked on the YouTube recommendation algorithm, told CJR that he offered to work on ways of keeping offensive content out of the “recommended” list. But his superiors told him that all they wanted to see was the amount of time spent by users increasing, and that it didn’t matter how they reached that goal. “Total watch time was what we went for—there was very little effort put into quality,” Chaslot said.
  • Slap on the wrist: As it was bragging about its crackdown, YouTube was hit by the largest fine ever recorded under the Children’s Online Privacy Protection Act, for targeting children with personalized advertising: $170 million. In practical terms, however, this amounts to barely a slap on the wrist for YouTube, since the company is estimated to bring in revenues of more than $10 billion a year, and the penalty may actually be less than YouTube made in ad revenue on the videos for which it was penalized.

Other notable stories:

  • In a four-hour hearing before the House Committee on Homeland Security, which subpoenaed him to testify, Jim Watkins, the owner of 8chan, said that hate speech and white-supremacist material come from only “a small minority of users” on his site, and that he “has no intent of deleting constitutionally protected hate speech.” 8chan has been linked to three mass shootings, including an attack in El Paso last month that killed 20 people. The shooter in that tragedy uploaded his manifesto to 8chan.
  • More than ten years after it dropped its paywall, The Atlantic has put it back up, adding a meter that gives visitors access to five free articles each month before requiring them to pay for a subscription. There are three plans: $49.99 for digital only, $59.99 for both print and digital, and $100 for a premium package that includes print and digital, ad-free web browsing, and other features.
  • The Pittsburgh Post-Gazette has donated the $15,000 monetary award it received as part of its Pulitzer Prize for breaking news coverage of the mass shooting at the Tree of Life synagogue. The paper said it will also sponsor an annual symposium in honor of the victims that will explore “how free speech and free thought can be used to confront hate speech and violence and overcome both with decency and love.”
  • CJR has been doing a series of interviews on our Galley discussion platform with experts on everything from the business of journalism to free speech to the rise of disinformation. Our next interview is with Mike Masnick, who runs a site called Techdirt and is an expert on the First Amendment and digital rights. Past interview subjects have included Joan Donovan of Harvard’s Shorenstein Center, free speech expert and law professor Kate Klonick, and Jillian York of the Electronic Frontier Foundation.
  • A group of academics has written a white paper arguing that the government could help restore the journalism industry to health by giving every American citizen a $50 annual tax rebate to donate to their favorite news outlet. Only outlets that provide “serious news,” as defined by a panel of experts, would be eligible, the group proposes, and no single entity could receive more than one percent of the proceeds. The package would cost an estimated $13 billion per year to finance.
  • NPR’s Board of Directors announced on Thursday that it has chosen John Lansing to become its next President and Chief Executive Officer. A former president of the Scripps cable network, Lansing was most recently the CEO of US Agency for Global Media, the federal agency that runs Voice of America, Radio Free Europe and several other entities that broadcast media into other countries with the intention of spreading American democratic principles.
  • An opinion column endorsing a development proposal by a private corporation called NH District Corp., signed by Virginia Commonwealth University President Michael Rao and published in the Richmond Times-Dispatch in January, was actually written by agents working for the company, according to documents the Times-Dispatch obtained through a freedom-of-information request. The column supported the corporation’s plan for a development that includes a new downtown Richmond arena, hotel, and apartments.
  • A police officer in St. Louis has been accused of misconduct after the Post-Dispatch published his description of a particularly violent shift, which he posted to his personal Facebook page along with a plea for state officials to support the police department. The officer, Ryan Lynch, wrote the post on August 23; it described a police chase involving an armed 16-year-old after a local high school game, as well as the fatal shooting of an 8-year-old. The police department has said that, for sharing his experience, Lynch is guilty of “conduct unbecoming an officer.”
  • Mandy Jenkins, the former CEO of Storyful and now the head of the Compass Experiment—a McClatchy project aimed at revitalizing local media—has published a report on disinformation she wrote while she was a Knight Fellow at Stanford last year. Jenkins says her interviews with news consumers convinced her that the sharing of disinformation is driven by a fundamental disconnect between audiences and the mainstream media, which they distrust.

ICYMI: Malaysian sex-tape scandal poses a challenge for Muslim reporters

Has America ever needed a media watchdog more than now? Help us by joining CJR today.

Mathew Ingram is CJR’s chief digital writer. Previously, he was a senior writer with Fortune magazine. He has written about the intersection between media and technology since the earliest days of the commercial internet. His writing has been published in the Washington Post and the Financial Times as well as by Reuters and Bloomberg.