
Facebook Adds 3,000 Moderators to Its Community Management Team—Is It Enough?

By Çiler Ay on Tue, 30 Jan 2018

Facebook co-founder and CEO Mark Zuckerberg recently announced that the network is adding 3,000 new people to its community operations team. The team is responsible for monitoring user requests to remove harmful content. The news comes after recent disturbing, violent acts were caught on Facebook Live, including murder, rape, suicide and torture.

The Need to Expand Facebook Community Moderation

While the community operations team handles harmful content across the entire platform, criminal acts broadcast on Facebook Live have resulted in disastrous PR for the company. In fact, Zuckerberg focused on video specifically—live and pre-recorded—in his statement announcing the new hires.

That said, crimes committed on video exemplify a larger problem on Facebook and social media in general. Explicitly violent content is just one form of harmful posting, alongside cyberbullying, homophobia, racism, misogyny and other abuse. As social networks introduce new ways for users to share their lives, they must anticipate how those features can be abused—and put a solution in place. Zuckerberg’s announcement is a welcome step in that direction.

How Community Management Will Get Faster

The thousands of new team members will speed up Facebook’s community moderation efforts. Facebook’s reporting system is user-reactive: users flag content they find harmful, and the community operations team acts on those reports. With a larger team, Facebook can work through reports faster and reach out to authorities and support organizations when needed. In addition to building up the team, Zuckerberg mentioned a goal of making videos easier to report and act on.
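To make that reactive flow concrete, here is a minimal sketch in Python, assuming invented report categories and priorities and a hypothetical notify_authorities() escalation hook; none of this reflects Facebook’s actual tooling:

```python
# A minimal sketch of a reactive reporting queue. The severity categories
# and the notify_authorities() escalation hook are invented placeholders.
from dataclasses import dataclass, field
from queue import PriorityQueue

# Hypothetical severity ranking: lower number = reviewed sooner.
SEVERITY = {"self-harm": 0, "violence": 1, "harassment": 2, "spam": 3}

@dataclass(order=True)
class Report:
    priority: int
    post_id: str = field(compare=False)
    reason: str = field(compare=False)

pending = PriorityQueue()

def file_report(post_id: str, reason: str) -> None:
    """A user flags a post; the report joins a priority-ordered review queue."""
    pending.put(Report(SEVERITY.get(reason, 9), post_id, reason))

def review_next() -> None:
    """A moderator pulls the most urgent report; severe cases escalate."""
    report = pending.get()
    if report.reason in ("self-harm", "violence"):
        notify_authorities(report)  # placeholder for a real escalation path
    # ...human review and removal decision would happen here...

def notify_authorities(report: Report) -> None:
    print(f"escalating {report.post_id} ({report.reason})")
```

More reviewers simply means review_next() runs more often, which is essentially what the new hires buy: less time between a user report and a human decision.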

Despite the large number of new employees, Zuckerberg has said in the past that AI will need to work alongside Facebook’s human moderation team to keep up with the never-ending stream of content posted to the platform each day. In fact, just a couple of months ago the network unveiled an algorithm that automatically detects users likely to self-harm by reading the content of their posts. The algorithm isn’t without criticism—predicting whether someone will harm themselves or others before it happens conjures up images of the dystopian Minority Report—but it demonstrates Facebook’s willingness to embrace artificial intelligence that acts on what users post.
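Facebook has not published how that detection system works, but the general technique, scoring a post’s text with a trained classifier, can be sketched with off-the-shelf tools. The tiny training set, labels, and scoring below are invented placeholders, not Facebook’s model or data:

```python
# A toy text classifier for risk signals; the training data is an
# invented placeholder, not anything Facebook actually uses.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

posts = [
    "had a great day at the beach",
    "so excited about the new job",
    "i can't take this anymore, nothing matters",
    "everyone would be better off without me",
]
labels = [0, 0, 1, 1]  # 1 = potential risk signal (placeholder labels)

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(posts, labels)

# Score a new post; above some chosen threshold, route it to human review.
score = model.predict_proba(["nothing matters anymore"])[0][1]
print(f"risk score: {score:.2f}")
```

The point of the sketch is the division of labor: the model only surfaces candidates, and humans make the final call, which matches Zuckerberg’s framing of AI working alongside the moderation team.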

How AI Makes Community Management Easier

While the algorithms employed by Facebook are a high-tech black box for most users, any brand or page can make use of AI tools right now to help in their community management process. With AI-based comment moderation, brands and page owners don’t have to swat trolls, spam and harmful content like flies. Instead, that content can be filtered out automatically almost as soon as it’s posted.
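As a sketch of those mechanics, not of any particular product, the snippet below scores each incoming comment as it arrives and hides anything over a threshold. The blocklist scorer stands in for whatever trained model a real tool would use, and the threshold is an arbitrary assumption:

```python
# Minimal sketch of automated comment filtering. classify() stands in for a
# real trained model; the blocklist and threshold are invented placeholders.
PROFANITY = {"spamword", "slur"}  # placeholder blocklist

def classify(comment: str) -> float:
    """Toy scorer: the fraction of words that appear on the blocklist."""
    words = comment.lower().split()
    return sum(w in PROFANITY for w in words) / max(len(words), 1)

def moderate(comment: str, threshold: float = 0.2) -> str:
    """Hide a comment the moment it arrives if its score crosses the threshold."""
    return "hidden" if classify(comment) >= threshold else "published"

print(moderate("great post, thanks!"))       # -> published
print(moderate("this is spamword garbage"))  # -> hidden
```

The design point is that moderation runs in line with posting, so a harmful comment is hidden before most followers ever see it.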

Our software does all of this, trained with Facebook’s community standards as a base. And thanks to machine learning, it adapts to fit your existing community management approach. With AI community management tools like ours, moderation teams can focus on other responsibilities instead of dealing with abusive content, spam and profanity.

Doubling Down on Human-Led Community Management

Of course, there will always be a human, subjective element to moderation. Not too long ago, we wrote about Facebook’s struggle to develop community standards on a global scale. It’s important to consider cultural understanding and context when enforcing community standards—what’s offensive to one group may be essential and newsworthy to another. These are tough calls that people, not algorithms, need to make. If the larger community operations team also brings more diverse perspectives to Facebook’s review process, the platform may be better equipped to avoid cross-cultural pitfalls in its moderation decisions.

Tags: Comment Moderation, Facebook, Moderator