Best practice: Online moderation – Deciding when to moderate comments in social media

03/03/2008

Education & Training | by Danny Meadows-Klue
As a web publisher, how do you deal with conversations between viewers that descend into a flame war? How do you maintain a thread of comments that stays on topic? How do you build credibility in the debates on your site without censoring everything that's written? How do you avoid the legal risk of libel when you are the publisher but your readers are the authors? Social media tools are great ways to engage audiences and boost page traffic, but moderation needs to be taken into account from the very start.

Classic media never had this problem. The 'letters to the editor' team took unstructured analogue feedback and worked it into something publishable. The volumes were generally small, the process generally simple, and the quality of submissions generally high (after all, it takes quite some motivation to get a pen, paper, envelope and stamp). The model of pre-moderation and editing worked well.

Over a decade ago I was helping manage the UK's first online newspaper, telegraph.co.uk. One evening in the summer of 1996 we put in place a small button that invited readers to 'email the editor'. Overnight the floodgates were opened and the genie was out of the bottle. In the digital networked society, communications are frictionless and the barriers that stop people from communicating are so weak that a whole new set of economics has evolved. The same economics of zero marginal cost that lead to spam also led to the thousands of emails that poured into that newsroom. From across the corridor to across the planet, from the most inane to the most profound, from school kids to senators: anybody could use their voice.

And these are both the beauty and the challenge of social media.

Broadly, the choices facing publishers are to moderate or not to moderate, and to require registration from participants or not. Moderation can be pre-publication or post-publication; registration can be with or without validation. The moderation can be done by the publisher's staff teams, by the community, or by 'super-users' selected from the community.
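One way to picture that option space is as a simple configuration. The sketch below is only an illustration in Python; the names (CommentPolicy, ModerationTiming and so on) are hypothetical and not drawn from any particular platform.

```python
from dataclasses import dataclass
from enum import Enum

class ModerationTiming(Enum):
    NONE = "none"   # publish immediately, never review
    PRE = "pre"     # hold every comment until a moderator approves it
    POST = "post"   # publish immediately, review after the fact

class Moderator(Enum):
    STAFF = "staff"            # the publisher's own teams
    COMMUNITY = "community"    # readers flag and rate each other's posts
    SUPER_USER = "super_user"  # trusted members selected from the community

@dataclass
class CommentPolicy:
    timing: ModerationTiming
    moderators: Moderator
    require_registration: bool
    validate_registration: bool  # e.g. confirm the email address before posting

# Example: a cautious policy for a news site's comment threads
policy = CommentPolicy(
    timing=ModerationTiming.POST,
    moderators=Moderator.STAFF,
    require_registration=True,
    validate_registration=True,
)
```

Each field trades publisher risk against friction for the user, which is exactly the tension discussed next.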

Why not always pre-moderate? Since the first user comments were posted, the debate has been constant, because every tactic that reduces risk for the publisher carries a cost for the user. Pre-moderate and the conversation feels like it never starts. Over-moderate and opinion is cut out or watered down (and, on a more fundamental level, what are the site's rules for censorship?).

Do sites need a code? Every site should have a clear explanation of what its expectations are. Even if it's simply about keeping posts on-topic and being respectful to others, this should be spelled out.

How do you avoid online bullying and aggressiveness towards participants? Profanity filters are a starting point: a list of keywords can trigger a block that forces the person posting to think again about what they are writing. This means the filter acts at a deeper level than simply suppressing the post. On the screens that appear when it kicks in, publishers should remind their audience about the rules for the site. Smart content tools can also flag posts that have been edited after failing a profanity check, as this helps moderators focus their attention.
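As a rough illustration of that flow, the sketch below (Python, with hypothetical names such as submit_comment and BLOCKED_TERMS) blocks a post containing listed terms, echoes the site rules back to the poster, and flags for moderators any comment resubmitted after an earlier block. A production filter would need far more sophistication around obfuscation and context.

```python
import re

# Illustrative word list only; a real deployment would maintain a much larger,
# regularly reviewed list and handle obfuscations (spacing, substitutions, etc.).
BLOCKED_TERMS = {"idiot", "moron"}

SITE_RULES_REMINDER = (
    "Your comment contains language that breaks our house rules. "
    "Please keep posts on-topic and respectful, then try again."
)

def check_comment(text: str) -> tuple[bool, str | None]:
    """Return (allowed, message). If blocked, the message restates the site rules."""
    words = set(re.findall(r"[a-z']+", text.lower()))
    if words & BLOCKED_TERMS:
        return False, SITE_RULES_REMINDER
    return True, None

def submit_comment(text: str, previously_blocked: bool) -> dict:
    """Accept a comment, flagging for moderators any post revised after a failed check."""
    allowed, message = check_comment(text)
    if not allowed:
        return {"published": False, "message": message}
    return {
        "published": True,
        # Posts edited after tripping the filter deserve a closer human look.
        "needs_review": previously_blocked,
    }
```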

How do you keep posts on-topic? Like any good party, the conversation needs occasional stimulation from its host. Publishers have a moral responsibility to the community, but if discussion isn't in line with audience expectations then the health of the community weakens and traffic suffers. Nurture the star posters, seed threads with quality comments, stimulate the right conversations in the right way, and you'll be off to a good start.

In terms of liability, most countries follow a practice of 'notice and take down'. Whichever route the publisher chooses, if they are notified of a libellous comment they are duty bound to take it down from the website. Failure to respond creates legal exposure, while fast and intelligent customer service can defuse most incidents. That means very fast responses. And that means having a structure in place that ensures emails are read and teams are trained.
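The operational point is speed: once notice is received, the comment comes down first and the investigation follows. A minimal sketch of that workflow, assuming hypothetical names such as handle_libel_notice and TakedownLog, might look like this.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Comment:
    comment_id: str
    body: str
    visible: bool = True

@dataclass
class TakedownLog:
    entries: list = field(default_factory=list)

def handle_libel_notice(comment: Comment, complaint: str, log: TakedownLog) -> None:
    """On notice, hide the comment at once and record the complaint for legal review."""
    comment.visible = False  # take the material down first; investigate afterwards
    log.entries.append({
        "comment_id": comment.comment_id,
        "complaint": complaint,
        "actioned_at": datetime.now(timezone.utc).isoformat(),
    })
```

Keeping a timestamped record of when the notice arrived and when the comment was hidden is what lets the publisher demonstrate a fast response later.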

All of the areas touched on here are only personal opinions. Web publishers need a skilled lawyer to advise and an effective structure to be able to act.

