Facebook releases long-secret rules on how it polices the service

WION Web Team
New Delhi, Delhi, India | Updated: Apr 24, 2018, 07:38 PM IST
File photo. Photograph: (Others)

Facebook Inc on Tuesday published an explanation of the types of posts it allows on its social network, giving far more detail than ever before on what is permitted on subjects ranging from drug use and sex work to bullying, hate speech and incitement to violence.

Facebook has for years had "community standards" for what people can post. But only a relatively brief and general version was publicly available, while the company relied on a far more detailed internal document to decide when individual posts or accounts should be removed.

Now, the company is providing the longer document on its website to clear up confusion and be more open about its operations, said Monika Bickert, Facebook's vice president of product policy and counter-terrorism.

New policies will, for the first time, allow people to appeal a decision to take down an individual piece of content. Previously, only the removal of accounts, Groups and Pages could be appealed.

Facebook is also beginning to give the specific reason content is taken down in a wider variety of situations.

Facebook, the world's largest social network, has become a dominant source of information in many countries around the world. It uses both automated software and an army of moderators that now numbers 7,500 to take down text, pictures and videos that violate its rules. Under pressure from several governments, it has been beefing up its moderator ranks since last year.

Appeals

We know we need to do more. That’s why, over the coming year, we are going to build out the ability for people to appeal our decisions. As a first step, we are launching appeals for posts that were removed for nudity/sexual activity, hate speech or graphic violence.

Here’s how it works:

  • If your photo, video or post has been removed because it violates our Community Standards, you will be notified, and given the option to request additional review.
  • This will lead to a review by our team (always by a person), typically within 24 hours.
  • If we’ve made a mistake, we will notify you, and your post, photo or video will be restored.

Participation and Feedback

Our efforts to improve and refine our Community Standards depend on participation and input from people around the world. In May, we will launch Facebook Forums: Community Standards, a series of public events in Germany, France, the UK, India, Singapore, the US and other countries where we’ll get people’s feedback directly. We will share more details about these initiatives as we finalize them.

As our CEO Mark Zuckerberg said at the start of the year: “we won’t prevent all mistakes or abuse, but we currently make too many errors enforcing our policies and preventing misuse of our tools.” Publication of today’s internal enforcement guidelines – as well as the expansion of our appeals process – will create a clear path for us to improve over time. These are hard issues and we’re excited to do better going forward.

Meanwhile, Facebook has long made clear that it does not allow people to buy and sell prescription drugs, marijuana or firearms on the social network, but the newly published document details what other speech on those subjects is permitted.

Content in which someone "admits to personal use of non-medical drugs" should not be posted on Facebook, the rulebook says.

The document elaborates on harassment and bullying, barring for example "cursing at a minor." It also prohibits content that comes from a hacked source, "except in limited cases of newsworthiness."

The new community standards do not incorporate separate procedures under which governments can demand the removal of content that violates local law.

In those cases, Bickert said, formal written requests are required and are reviewed by Facebook's legal team and outside attorneys. Content deemed permissible under the community standards but in violation of local law - such as Thailand's prohibition on disparaging the royal family - is then blocked in that country, but not globally.

The community standards also do not address false information - Facebook does not prohibit it, though it does try to reduce its distribution - or other contentious issues such as the use of personal data.