Facebook explains why it bans some content, publishes internal enforcement guidelines
April 24, 2018
On Tuesday, April 24, 2018, Facebook for the first time made public the 27-page internal enforcement guidelines, called Community Standards, that it gives to its workforce of thousands of human censors to determine whether comments, messages, or images posted by its 2.2 billion users violate its policies.
The release of the guidelines is part of a wave of transparency that Facebook hopes will quell its many critics. It has also published political ads and streamlined its privacy controls after coming under fire for its lax approach to protecting consumer data.
The company said it was laying bare just how much ugliness its global content moderators deal with every day, and just how hard it is to always get it right.
Monika Bickert, its vice president of global product management, in a blog post distributed by APO Group, said one of the questions “we’re asked most often is how we decide what’s allowed on Facebook”, adding that the decisions are among the most important they make since they’re central to ensuring that Facebook is both a safe place and a place to freely discuss different points of view.
“For years, we’ve had Community Standards that explain what stays up and what comes down.
“Today we’re going one step further and publishing the internal guidelines we use to enforce those standards. And for the first time we’re giving you the right to appeal our decisions on individual posts so you can ask for a second opinion when you think we’ve made a mistake,” Bickert said.
She said Facebook decided to publish the internal guidelines for two reasons. First, to help people understand where the social network draws the line on nuanced issues. Second, to make it easier for everyone, including experts in different fields, to give it feedback so that it can improve the guidelines – and the decisions it makes over time.
The guidelines encompass dozens of topics including hate speech, violent imagery, misrepresentation, terrorist propaganda, and disinformation. Facebook said it would offer users the opportunity to appeal its decisions.
“We want people to know our standards and we want to give people clarity,” Bickert said, adding that she hoped publishing the guidelines would spark dialogue. “We are trying to strike the line between safety and giving people the ability to really express themselves.”
The company’s censors, called content moderators, have been chastised by civil rights groups for mistakenly removing posts by minorities who had shared stories of being the victims of racial slurs. Moderators have struggled to tell the difference between someone posting a slur as an attack and someone who was using the slur to tell the story of their own victimization.
The company’s content policies, which began in earnest in 2005, addressed nudity and Holocaust denial in the early years. They have ballooned from a single page in 2008 to 27 pages today.
As Facebook has come to reach nearly a third of the world’s population, Bickert’s team has expanded significantly and is expected to grow even more in the coming year. A far-flung team of 7,500 reviewers, in places like Austin, Dublin, and the Philippines, assesses posts 24 hours a day, seven days a week, in more than 40 languages. Moderators are sometimes temporary contract workers without much cultural familiarity with the content they are judging, and they must make complex decisions in applying Facebook’s rules.
Activists and users have been particularly frustrated by the absence of an appeals process when their posts are taken down. (Facebook users are allowed to appeal the shutdown of an entire account, but not individual posts.) The Washington Post previously documented how people have likened this predicament to being put into “Facebook jail” – without being given a reason why they were locked up.
On how appeals work, Bickert said, “If your photo, video or post has been removed because it violates our Community Standards, you will be notified, and given the option to request additional review.
“This will lead to a review by our team (always by a person), typically within 24 hours. If we’ve made a mistake, we will notify you, and your post, photo or video will be restored,” she noted, adding “We are working to extend this process further, by supporting more violation types, giving people the opportunity to provide more context that could help us make the right decision, and making appeals available not just for content that was taken down, but also for content that was reported and left up. We believe giving people a voice in the process is another essential component of building a fair system.”