By Monika Bickert – One of the questions we’re asked most often is how we decide what’s allowed on Facebook. These decisions are among the most important we make because they’re central to ensuring that Facebook is both a safe place and a place to freely discuss different points of view. For years, we’ve had Community Standards that explain what stays up and what comes down. Today we’re going one step further and publishing the internal guidelines we use to enforce those standards. And for the first time we’re giving you the right to appeal our decisions on individual posts so you can ask for a second opinion when you think we’ve made a mistake.
We decided to publish these internal guidelines for two reasons. First, the guidelines will help people understand where we draw the line on nuanced issues. Second, providing these details makes it easier for everyone, including experts in different fields, to give us feedback so that we can improve the guidelines – and the decisions we make – over time.
The Policy Development Process
The content policy team at Facebook is responsible for developing our Community Standards. We have people in 11 offices around the world, including subject matter experts on issues such as hate speech, child safety and terrorism. Many of us worked on issues of expression and safety long before coming to Facebook. I worked on everything from child safety to counter-terrorism during my years as a criminal prosecutor, and other team members include a former rape crisis counsellor, an academic who has spent her career studying hate organizations, a human rights lawyer, and a teacher. Every week, our team seeks input from experts and organizations outside Facebook so we can better understand different perspectives on safety and expression, as well as the impact of our policies on different communities globally.
Based on this feedback, as well as changes in social norms and language, our standards evolve over time. What has not changed – and will not change – are the underlying principles of safety, voice and equity on which these standards are based. To start conversations and make connections, people need to know they are safe. Facebook should also be a place where people can express their opinions freely, even if some people might find those opinions objectionable. This can be challenging given the global nature of our service, which is why equity is such an important principle: we aim to apply these standards consistently and fairly to all communities and cultures. We outline these principles explicitly in the preamble to the standards, and we bring them to life by sharing the rationale behind each individual policy.
Our policies are only as good as the strength and accuracy of our enforcement – and our enforcement isn’t perfect.
One challenge is identifying potential violations of our standards so that we can review them. Technology can help here. We use a combination of artificial intelligence and reports from people to identify posts, pictures or other content that likely violates our Community Standards. These reports are reviewed by our Community Operations team, who work 24/7 in over 40 languages. Right now, we have 7,500 content reviewers – more than 40% above the number at this time last year.
Another challenge is accurately applying our policies to the content that has been flagged to us. In some cases, we make mistakes because our policies are not sufficiently clear to our content reviewers; when that’s the case, we work to fill those gaps. More often than not, however, we make mistakes because our processes involve people, and people are fallible.
We know we need to do more. That’s why, over the coming year, we are going to build out the ability for people to appeal our decisions. As a first step, we are launching appeals for posts that were removed for nudity/sexual activity, hate speech or graphic violence.
Here’s how it works:
• If your photo, video or post has been removed because it violates our Community Standards, you will be notified and given the option to request an additional review.
• This will lead to a review by our team (always by a person), typically within 24 hours.
• If we’ve made a mistake, we will notify you, and your post, photo or video will be restored.
This post shows an example that could have been incorrectly removed and can now be appealed.
We are working to extend this process further, by supporting more violation types, giving people the opportunity to provide more context that could help us make the right decision, and making appeals available not just for content that was taken down, but also for content that was reported and left up. We believe giving people a voice in the process is another essential component of building a fair system.
Participation and Feedback
Our efforts to improve and refine our Community Standards depend on participation and input from people around the world. In May, we will launch Facebook Forums: Community Standards, a series of public events in Germany, France, the UK, India, Singapore, the US and other countries where we’ll get people’s feedback directly. We will share more details about these initiatives as we finalize them.
As our CEO Mark Zuckerberg said at the start of the year: “we won’t prevent all mistakes or abuse, but we currently make too many errors enforcing our policies and preventing misuse of our tools.” Publication of today’s internal enforcement guidelines – as well as the expansion of our appeals process – will create a clear path for us to improve over time. These are hard issues and we’re excited to do better going forward.