First Person

How user comments got ruined–and what to do about it

September 29, 2016
 

More and more news sites are shutting down the troll-ridden comment sections on their articles. But my experience as co-founder of a religion website that helped pioneer these kinds of online comments makes me think the troll infestation didn’t have to happen, and that news sites can and should preserve this valuable service.

We all know about the ugliness that has arisen as moderation has grown rarer. When FoxNews.com posted an article about Beau Biden’s death last year, the following comments accumulated in a matter of minutes:

I wish this was Chelsea’s funeral. Lol

blah blah blah blah. One down, so many more to go!

I think this story is false: Don’t you have to have a brain to get brain cancer

A Good Biden is a Dead Biden

How did we get here? The site I started in 1999, Beliefnet, was one of the first to run reader comments attached to articles rather than in a separate “forum” area, as many sites did then. When we first offered this feature, it was met with some concern from our staff and outside writers. How would we get people to write for us if the authors knew they’d be subjected to immediate, in-their-face criticism? Wouldn’t it dilute our authority to have people publicly declaring that our prose was shoddy or, since we were a religion site, an abomination unto the Lord?

Struggling to come up with traditional-media lingo to explain this digital innovation to colleagues, I initially called the comments “instant letters to the editor.” That helped the new format go down a bit easier.

But we also operated under the assumption that online community didn’t just happen; it had to be cultivated. I don’t just mean “policing.” Our community staffers (as well as those at other quality websites) were part police, part social workers, and part cruise directors, guiding the conversation, suggesting topics, and encouraging productive behavior.

It was fairly expensive and labor-intensive. At one point, we had four paid staff and 80 volunteers just to moderate comments and encourage civil discussion. The results were spectacular: vivid, detailed personal testimonials and surprisingly reasoned polemics about religion. It was early proof that great content could be created by readers themselves.

Of course, we also had plenty of trolls, though they had not yet been so named. We rode them hard. We had community standards requiring civility, and we actually enforced them.

Good community managers understand troll psychology. For instance, we found that if you banned a user, he’d just come back with a new name, angrier than ever. Our staff ingeniously decided that instead of ejecting them, we would send them to special message boards where they could yell at each other as much as they wanted. We called these areas “dialogue and debate” boards. Sometimes they spurred fascinating discussions; at other times, they acted as rubber rooms for the unhinged. By segregating such users, we helped other areas of the site, such as our online support groups, feel safer. The fulminators became like pigs rolling around in their own vitriol.

Over time, the idea of attaching comments to articles spread. Soon, technology improved so message boards could be moderated with less human effort. Ever more responsive flagging systems were developed, better filters caught profanity, and companies like Disqus started helping publishers empower visitors to vote comments up or down. It was considered another bit of digital-era magic: A combination of technology and the wisdom of crowds would bring order to the chaos.
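For readers curious what that machinery actually amounts to, here is a minimal sketch of the kind of automated triage such tools performed, combining a profanity word list, reader flags, and crowd votes. The word list, thresholds, and Comment fields are hypothetical stand-ins for illustration, not Disqus’s actual system or API.

```python
# A minimal sketch (hypothetical, not any vendor's real API) of automated
# comment triage: a profanity filter plus reader flags and crowd votes
# deciding whether a comment is shown, hidden, or sent to a human moderator.

from dataclasses import dataclass

PROFANITY = {"damn", "hell"}   # placeholder word list
HIDE_SCORE = -5                # net votes below this hides a comment
FLAG_COUNT = 3                 # reader flags that trigger human review

@dataclass
class Comment:
    text: str
    upvotes: int = 0
    downvotes: int = 0
    flags: int = 0

def triage(comment: Comment) -> str:
    """Return 'show', 'hide', or 'review' for a single comment."""
    words = {w.strip(".,!?").lower() for w in comment.text.split()}
    if words & PROFANITY:
        return "hide"          # the filter catches profanity outright
    if comment.flags >= FLAG_COUNT:
        return "review"        # enough reader flags escalate to a human
    if comment.upvotes - comment.downvotes < HIDE_SCORE:
        return "hide"          # the crowd votes it down
    return "show"

if __name__ == "__main__":
    print(triage(Comment("A thoughtful reply.", upvotes=4)))    # show
    print(triage(Comment("Total damn nonsense!", upvotes=10)))  # hide
    print(triage(Comment("Spam link here", flags=5)))           # review
```

Note that nothing in this sketch exercises judgment about quality; it only screens for the grossest offenses and defers the rest to crowd sentiment, which is precisely the gap human moderators were supposed to fill.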

Except it didn’t. Instead of using the new tools and algorithms to better empower community managers, some news sites cut back on the number of moderators to save money.

Perhaps they were susceptible to wishful thinking about the wisdom of crowds because it aligned with their desire to cut costs. Financially, message boards were considered low-revenue areas because they could not attract blue-chip advertisers (who were nervous about appearing amidst unregulated chatter). So it became hard to justify putting more staff resources into maintaining them.

Such non-moderation also could be rationalized as being more in sync with the freedom-of-speech ethos of the internet. But too often, with the cops and cruise directors gone, the trolls have taken over.

The primacy of comments on news sites has faded over the years as conversation has shifted to social media. But the move to Facebook and Twitter doesn’t entirely solve the problem because many of the old pathologies have simply migrated over there. As long as news organizations want people to visit their websites and read articles, they’re going to need on-site ways of engaging them.

News outlets could actually maintain meaningful and informative comments sections if they took certain steps:

1)   Devote real staff resources. Technology works better when overseen by a sufficient number of humans. Perhaps news organizations will determine it’s not worth the money to employ more moderators, but at least let’s be honest about why online comments sections have failed: It wasn’t because “the trolls took over,” but because we weren’t willing to put the money into making them work.

2)   Forbid anonymous posting. Most news sites do this anyway, and it’s amazing to see the self-incriminating sentences people will write alongside their real names and pictures. Still, it’s a first step.

3)   Restore some selectivity. The First Amendment does not require news sites to publish every comment submitted; their websites are their property. If a madman came into the newsroom and started shouting about government spies hiding in his backpack, the security guard would escort him out. There’s nothing wrong with taking that approach in the newspaper’s public spaces, too. Set community standards and abide by them. Leave it to Reddit or Twitter to conduct the social experiment in how vile people can be if given enough freedom.

I’ve come to think more news sites should institute prior approval, as The New York Times mostly does. Yes, this would prompt cries of bias. But before the internet, newspapers routinely made editorial judgments about “user-generated content” (a.k.a., letters to the editor). News organizations can’t be so afraid of bias accusations that they surrender the right to exert judgment about quality.

People have plenty of places they can post their opinions without interference. Why not make news websites the places where readers have to show some level of thoughtfulness? We could have quasi-editorial standards relating to whether the person offers evidence or personal experience for their conclusions, makes a novel point, or seems to understand their opponents’ arguments. Admit that the standard is subjective, and be transparent about how it was derived and implemented, as the Times did in this brilliant quiz, but stick to it.

This would not only raise the quality of the comments areas, it would also make news websites feel different from, and more special than, the free-wheeling spaces of social media. I have a friend who emails me every time the Times approves one of her comments. It’s an accomplishment for her, akin to getting a letter to the editor published. She could easily write the same thing on Facebook, but the very fact that it’s so easy to do makes it seem less worthwhile.

In the short term, such measures might reduce pageviews, but I suspect they eventually will generate more traffic as comments areas become worth reading again.

Having vicious, slander-filled comment areas is not defensible for a news organization. Having no comment areas silences readers’ voices at a time when news organizations should be better engaging them. So why not try making comments a more curated, special, exclusive area? It has worked before.

Steven Waldman is founder of LifePosts.com, co-founder of Beliefnet, and a former newsmagazine journalist.