The Quick and Easy Guide to Comment Moderation

By Lena Harris on Tue, 30 Jan 2018

Putting yourself out there on social media can be intimidating. While the benefits of a vibrant and active community are many, the threat of trolls, spam and other abuse can make building a community feel daunting. But with the proper knowledge, you’ll have nothing to worry about!

This guide will help you get started designing a comment moderation policy and enforcing it. While different industries have different requirements (a digital publisher's needs differ from a game developer's, for example), it covers the basics that apply to anyone online.

The Importance of Comment Moderation

Comment moderation is essential to grow a digital community. Without comment moderation, you open your community to abuse, profanity and spam, and turn prospective fans and customers away. It's your responsibility to ensure your brand environment is safe and inviting to your audience. Moderating online communities while encouraging engagement is the most reliable way to achieve this.

Moderation is key to encouraging, not stifling, conversation. By building an environment that facilitates spirited debate and respectful discussion, you make fans feel they can contribute in a valuable way. This spurs your community's growth and in turn encourages brand evangelism. Closing comments should never be an option; in fact, your community gives you a competitive edge, because unlike a product, a community can't easily be replicated.

Plan a Comment Moderation Policy with Your Team

The first step to moderating online communities is to establish a comment moderation policy and approach. You might opt to stick to your network's own terms of service. In some cases, you may want to add some rules of your own. Get together with your team to come up with a vision for your community: what language do you consider inappropriate? Should certain topics be off-limits?

Your comment moderation policy should revolve around your content and audience. A news publication, for example, will want to encourage political debate, while most other online communities might moderate that debate away if it's considered off-topic. It's important that your team is on the same page when it comes to moderation policy, because you'll want to list out these rules for the community.
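If your team uses any moderation tooling, it can also help to write the agreed-upon rules down in one shared, machine-readable place. Here is a minimal, purely hypothetical sketch in Python of what such a policy definition might look like; the categories, settings and contact address are placeholders, not recommendations:

```python
# Hypothetical moderation policy, written down so human moderators and
# any tooling work from the same rules. All values are placeholders.
MODERATION_POLICY = {
    "inappropriate_language": ["profanity", "slurs", "personal attacks"],
    "off_limit_topics": ["off-topic political debate"],
    "allow_external_links": False,          # e.g. to limit link spam
    "contact": "moderators@example.com",    # where users can reach the team
}
```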

Your Approach to Moderating Online Communities Should Be Clear

If your rules do deviate from the network's terms of service, post or link them somewhere visible, such as in a page or group description. Also make your contact information clearly visible so users can reach the moderation team. This not only gives them a way to report infractions, but also lets them ask why a comment of theirs was removed if they aren't familiar with the rules.

Enforce Your Comment Moderation Policy

Now that you’ve developed a comment moderation policy, it’s time to enforce it and get moderating! First, you should know that moderating effectively isn’t just about erasing spam and abuse. We’ll get to tips on engagement below, but for now let’s focus on dealing with harmful content posted within your community.

Because social media generates an enormous volume of comments every minute, moderating online communities requires certain tools to be effective. If you relied on yourself (or your team) alone, you'd quickly drown under the deluge of harmful comments. Most social networks natively offer tools that automate part of the comment moderation process, with varying degrees of success. Usually, this means automatically detecting spam or setting keywords to ban.
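To make the idea concrete, here is a minimal sketch of the kind of keyword filtering these built-in tools perform. The banned phrases and sample comments are made up for illustration only:

```python
# Minimal keyword-based filter, the approach most built-in platform
# tools take. The banned phrases below are purely illustrative.
BANNED_PHRASES = {"buy followers", "free giveaway click here"}

def violates_keyword_policy(comment: str) -> bool:
    """Return True if the comment contains any banned phrase."""
    text = comment.lower()
    return any(phrase in text for phrase in BANNED_PHRASES)

for comment in ["Great write-up, thanks!", "BUY FOLLOWERS now, limited offer"]:
    action = "hide" if violates_keyword_policy(comment) else "keep"
    print(f"{action}: {comment}")
```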

Work Faster and More Accurately with Automated Comment Moderation

Artificial intelligence and machine learning-based tools take things a step further. These tools can understand a comment much as a human would; for example, if a troll tries to bypass a keyword blacklist by misspelling a banned word, an AI tool can still catch it. With machine learning, such tools also adapt to your moderation approach over time. And because they never need a break (unlike a human moderator), a customized, automated comment moderation tool quickly becomes indispensable.
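The models behind these tools are far more sophisticated than anything shown here, but a simple sketch illustrates why plain keyword lists fail against deliberate misspellings and how even basic text normalization helps. The substitution map and banned word below are hypothetical:

```python
import re

# Undo common character substitutions trolls use to dodge keyword filters.
LEET_MAP = str.maketrans({"0": "o", "1": "i", "3": "e", "4": "a",
                          "5": "s", "7": "t", "@": "a", "$": "s"})
BANNED_WORDS = {"idiot"}  # hypothetical entry on a keyword blacklist

def normalize(text: str) -> str:
    """Lowercase, reverse common substitutions, and strip punctuation."""
    text = text.lower().translate(LEET_MAP)
    return re.sub(r"[^a-z\s]", "", text)

def is_abusive(comment: str) -> bool:
    return any(word in BANNED_WORDS for word in normalize(comment).split())

print(is_abusive("what an 1d10t"))  # True; a plain keyword match would miss it
```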

Of course, this doesn’t make you or your team obsolete. Moderation tools simply help you deal with inappropriate comments immediately. This allows your team to focus on engaging with the genuine comments.

Engage and Grow Your Community

Once you’ve automated the moderation process, keep an eye out for engagement to respond to. Answering your fans shows them you’re listening and reminds them that humans are on the other side of the screen—both of which encourage users to provide more quality content in their comments. Some social networks, like YouTube, even let you highlight and single out high-quality comments.

It’s important to note that you should answer both positive and negative comments. Don’t try to hide negative engagement! As long as they aren’t breaking a rule or putting anyone in danger, users with complaints deserve validation and respect just as much as anyone else. In fact, your response can help them gain new appreciation for the brand and community. A good rule of thumb is to highlight the positive and respond respectfully and courteously to the negative. Professionalism is of the utmost importance.

Finally, you should spark conversation yourself! This keeps discussions on topic and sets the direction for conversation around your community. When posting content, ask pointed questions that invite the community to engage and discuss.

Tags: Comment Moderation, Content Moderation, Social Media, Guide