When Trolls Win
The National Journal has thrown in the towel and eliminated its comment section:
At National Journal, we believe that public service is a noble calling; that ideas matter; and that trustworthy information about politics and policy will lead to wiser decisions in the national interest. Those principles are reflected in everything we do—from the stories we write, to the events we produce, to the research and insights we offer our members.
But there’s one place where those principles don’t seem to hold: in the comments that appear at the end of our Web stories. For every smart argument, there’s a round of ad hominem attacks—not just fierce partisan feuding, but the worst kind of abusive, racist, and sexist name-calling imaginable.
The debate isn’t joined. It’s cheapened, it’s debased, and, as National Journal’s Brian Resnick has written, research suggests that the experience leaves readers feeling more polarized and less willing to listen to opposing views.
Interestingly, the NJ has kept comments open on the announcement itself, which of course has attracted some of the very types of comments they (rightfully) object to hosting on their Web site; I will not reprint them here. Let’s just say Godwin’s Law is alive and kicking.
Previously, Popular Science was (I think) the highest-profile publication to turn off comments, arguing that the established facts of science are not up for debate. Mother Jones famously tracked down one of its trolls (and kind of liked him).
Is this a trend that other Big Media sites should follow, or does it show that they are clueless about how the online world works and need to adapt? Other than a Harry Reid-like nuclear option, what else can a website do to curb the abuse? Laura Hudson at Wired has a great piece up offering solutions from the experience of the online gaming community that could translate:
Really, freedom of speech is beside the point. Facebook and Twitter want to be the locus of communities, but they seem to blanch at the notion that such communities would want to enforce norms—which, of course, are defined by shared values rather than by the outer limits of the law. Social networks could take a strong and meaningful stand against harassment simply by applying the same sort of standards in their online spaces that we already apply in our public and professional lives. That’s not a radical step; indeed, it’s literally a normal one. Wishing rape or other violence on women or using derogatory slurs, even as “jokes,” would never fly in most workplaces or communities, and those who engaged in such vitriol would be reprimanded or asked to leave. Why shouldn’t that be the response in our online lives?
To truly shift social norms, the community, by definition, has to get involved in enforcing them. This could mean making comments of disapproval, upvoting and downvoting, or simply reporting bad behavior. The best online forums are the ones that take seriously their role as communities, including the famously civil MetaFilter, whose moderation is guided by a “don’t be an asshole” principle. On a much larger scale, Microsoft’s Xbox network implemented a community-powered reputation system for its new Xbox One console. Using feedback from players, as well as a variety of other metrics, the system determines whether a user gets rated green (“Good Player”), yellow (“Needs Improvement”), or red (“Avoid Me”).
In another initiative by Riot’s player-behavior team, League of Legends launched a disciplinary system called the Tribunal, in which a jury of fellow players votes on reported instances of bad behavior. Empowered to issue everything from email warnings to longer-term bans, users have cast tens of millions of votes about the behavior of fellow players. When Riot asked its staff to audit the verdicts, it found that the staff unanimously agreed with users in nearly 80 percent of cases. And this system is not just punishing players; it’s rehabilitating them, elevating more than 280,000 censured gamers to good standing. Riot regularly receives apologies from players who have been through the Tribunal system, saying they hadn’t understood how offensive their behavior was until it was pointed out to them. Others have actually asked to be placed in a Restricted Chat Mode, which limits the number of messages they can send in games—forcing a choice to communicate with their teammates instead of harassing others.
Is it the death of Free Speech for the National Journal to turn the comments off? Hardly. They can still be reached via email, Twitter, Facebook, and all the other usual social media channels. But the days of anonymous trolling at the NJ are over.