Twitch may be unknown to many internet users over forty, but for video game aficionados, it’s the place to watch their favorite gamer play a new title or compete in one of a growing number of live-streamed professional e-sports competitions. On Sunday afternoon, during the airing of a Madden NFL tournament held in Jacksonville, Florida, Twitch became a place to watch (or at least listen to) people being shot in real time. Two players were killed in the gunfire, and almost a dozen others were injured. According to police reports, the alleged shooter was a competitor.
There have been other incidents in which the Twitch network played host to unsavory activity, including one in which a man was robbed during a broadcast, but Sunday’s shooting appears to be the first time that a live-stream included someone being killed. Like some kind of horrific rite of passage, the incident has vaulted Twitch into the same category as Facebook, YouTube, and Twitter, all of which have faced criticism for hosting videos containing graphic violence or disturbing imagery. How Twitch, which Amazon acquired in 2014 for almost $1 billion, decides to deal with similar incidents in the future could be a real test of its values.
We are shocked and saddened by the tragedy that took place in Jacksonville today. Twitch and all its staff send our deepest sympathies to the victims, their loved ones, and everyone in our community who's grieving today.
— Twitch (@Twitch) August 26, 2018
Viewers watching Twitch on Sunday couldn’t see the shooting, exactly; what appeared on the live-stream was what looked like a red dot from a gun’s laser sight, moving around on the chest of a player. The small window where the players were shown—which appears in a corner of the screen, below the game-play—then disappeared, and gunfire could be heard, as well as screaming and shouting. The screen showed an error message saying “Controllers disconnected.”
Despite the lack of any actual violent imagery, Twitch removed the live-stream within a matter of hours. (Its community guidelines forbid acts of violence or self-destructive behavior.) By then, copies were circulating on Twitter and YouTube, and even on Twitch itself, where some users appeared to be trying to use the footage to build their followings. By contrast, when Facebook’s live-streaming video feature was used to show real-time killings—including the drive-by shooting of Antonio Perkins, in Chicago, and the death of Philando Castile, shot by a police officer during a traffic stop—the company often left the videos up (with a warning) on the basis of newsworthiness.
Mark Zuckerberg, Facebook’s CEO, wrote a blog post after the Castile video was widely viewed, defending his company’s role in its distribution. Images of a dying man are “graphic and heartbreaking,” he wrote, but such videos also “shine a light on the fear that millions of members of our community live with every day.” Not long after this and other incidents, including several in which people committed suicide live on Facebook, executives at the company said that they would hire 3,000 moderators to help police live-streamed content. That sounds like a tough job—one Facebook could avoid if, like Twitch, it demonstrated an abundance of caution.