'The Universe is full of dots. Connect the right ones and you can draw anything. The important question is not whether the dots you picked are really there, but why you chose to ignore all the others.' -Sam Thomsen

Moderating Harassment and Abuse

by Ethan Glover, Fri, Sep 01, 2017 - (Edited) Fri, Sep 01, 2017

Originally published on WA.

Disqus is rolling out a new feature that automatically filters out comments on your websites that are considered "toxic": comments that are abusive, that are meant to degrade the conversation (trolling), that don't contribute to the discussion, or that would cause a reasonable person to leave the thread.
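Disqus hasn't published the internals of its filter, but the basic idea can be sketched as a threshold on a per-comment toxicity score. The keyword scorer and the threshold below are purely hypothetical illustrations, not Disqus's actual method; production systems use trained models rather than word lists.

```python
# Hypothetical sketch of a "toxicity" filter: score each comment and
# hide anything above a threshold. Real systems use trained classifiers;
# this toy keyword scorer is for illustration only.

TOXIC_WORDS = {"idiot", "moron", "trash"}  # toy list, not a real lexicon


def toxicity_score(comment: str) -> float:
    """Fraction of the comment's words that appear in the toy toxic-word list."""
    words = comment.lower().split()
    if not words:
        return 0.0
    hits = sum(1 for w in words if w.strip(".,!?") in TOXIC_WORDS)
    return hits / len(words)


def filter_comments(comments, threshold=0.2):
    """Keep only comments whose score falls below the threshold."""
    return [c for c in comments if toxicity_score(c) < threshold]


comments = [
    "Great article, thanks for sharing!",
    "You are an idiot and your site is trash.",
]
print(filter_comments(comments))  # only the first comment survives
```

The interesting design question is where to set the threshold: too low and you start suppressing blunt but legitimate disagreement, which is exactly the slippery slope discussed later in this post.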

This comes on the heels of a study by the Data & Society Research Institute on the same topic. The study's most important findings show that 27% of Americans have decided not to post a comment for fear of being harassed over what they have to say. And 13% of Americans say that they've felt less connected to information when they've been harassed about a comment.

I've always said that you should ask for comments at the end of your articles. Get people to engage. Throw out a question. What I've said less often, but still needs to be put out there, is to ensure that opinions will be respected and treated fairly. Take responsibility for your community as a website owner. Police the comments and let people know that you are doing your job to maintain a friendly environment.

Some people make a free speech argument for allowing everything anyone types, aside from blatant robot spam. I couldn't disagree with that more. All opinions are welcome. But comments like the one above that I recently received? Not OK.

I have a strong interest in social movements. I love to read about, watch, and even partake in various ideologies. I want to know how people think, why, and what led them there. But the mentality that is trending online, one that grew out of 4chan forums and is now known as the "alt-right," is toxic. (The email of the user above was 'counterfund@hitler.rocks') I understand it. These people are going to the absolute limit to offend, and anyone who expresses outrage becomes the target. To them it's a joke. If you can't handle it, the game becomes seeing how far you can be pushed. If you can handle it, you become a player in the game.

The problem is, it's ruining conversation. For some of the alt-right, loaded language is changing their very ideology so that it's no longer a joke, but real.

The point of this example is that when you allow toxic comments on your sites, not only are you scaring others away from making real, meaningful responses, but you're changing the very tone of the website itself.

I've seen website comment sections degrade into sand-throwing, constant cynicism, and sarcasm that prevents any real discussion. At that point, there's no point in having a comment section at all, because it's not contributing anything to anyone.

Comments should provide you with ideas and new conversations. Critiques should be opportunities to answer a question, not insults to avoid. Everyone should feel welcome to say something, except those who have nothing to say.

So I applaud the efforts of Disqus and I encourage you to think about what kind of community you want to build. Lay out clear principles and guidelines for what you will and won't accept. I'm OK with angry comments, but I'm not OK with the trolling and the fools who take joy in making others angry. I have, in the past, used private messages to ask the targets of harassment how they would like me to moderate. Do I need to shut it down, or do they want to keep the conversation going in order to help them make an important point?

I don't think it's a good idea to be an authoritarian about what people say on websites. But it is important to guide and lead conversations, regardless of the emotion, into something that helps people explore each other's minds. That's my core principle in moderation. What's yours?

Also, what do you think about "toxic" comments? Do you think it's possible to reliably moderate out all "toxicity"? Or do you think it's a slippery slope? Once you begin deleting harmful comments and banning bad users, how long does it take before you start deleting comments that simply disagree? Or comments that challenge you to think further and differently? Let me know in the comments below.