Facebook makes its community guidelines public and introduces an appeals process

Last May, The Guardian published a leaked copy of Facebook's content moderation guidelines, which describe how the company decides whether posts should be removed from the service. Almost a year later, Facebook is making an expanded set of these guidelines available to the public, a move designed to gather feedback from users around the world. The company is also introducing a new appeals process, which lets users request a review if they believe their post was unfairly removed.

The community standards run 27 pages and cover topics including bullying, violent threats, self-harm, and nudity, among many others. "These are problems in the real world," Monika Bickert, director of global policy management at Facebook, said in an interview with reporters. "The community that uses Facebook and other large social networks reflects the community we have in the real world, so we're realistic about that. The vast majority of people who come to Facebook come for very good reasons, but we know there will always be people who try to post abusive content or engage in abusive behavior. This is our way of saying that these things are not tolerated."

The guidelines apply in every country where Facebook operates and have been translated into more than 40 languages. The company says it developed them together with a "couple hundred" experts and advocacy groups from around the world. As the guidelines evolve (and they will, Bickert said), they will be updated in all languages simultaneously.

The guidelines also largely apply to Facebook's other services, including Instagram, although there are differences. (You do not need to use your real name on Instagram, for example.) The underlying policies have not changed, Bickert said, although they now include additional guidance on how decisions are made. "What is changing is the level of explanation about how we apply those policies," Bickert said.

Amid a series of developing humanitarian crises, Facebook has been under pressure to improve content moderation around the world. In March, the United Nations blamed Facebook for spreading hatred toward the Rohingya minority. Facebook was also forced to temporarily shut down its services in Sri Lanka last month after incendiary messages posted on the service incited mob violence against the country's Muslim minority. This weekend, a report in The New York Times linked hate speech on Facebook to killings in Indonesia, India, and Mexico.

In response, the company has said it will double its 10,000-person safety and security team by the end of this year. It also plans to update the guidelines regularly as new threats emerge. Facebook is making the guidelines public now because it hopes to learn from user feedback, Bickert said.

"I think we're going to learn from that feedback," he said. "This is not an exercise in complacency, this is an exercise in saying, this is where we draw the lines, and we understand that people in the world can see these problems differently, and we want to know about that, in order to incorporate that into our process. "

Facebook also announced plans to develop a more robust process for appealing takedowns that were made in error. The company has faced regular criticism over high-profile removals over the years, whether of a photo of a woman breastfeeding her child or an iconic wartime photograph.

Users can now ask the company to review takedowns of content they posted themselves. If your post is removed, you will be notified on Facebook with an option to request a review. Facebook says it will review your request within 24 hours and, if it decides it made a mistake, it will restore the post and notify you. By the end of this year, if you have reported a post but were told it does not violate the community standards, you will also be able to request a review.
