Saturday 23 September 2017

Revealed: Facebook's secret guidelines on sex, violence, self-harm and terrorism

  • Investigation reveals Facebook's guidelines and rules
  • Details of guidelines on hate speech, violence, revenge porn, self-harm
  • Calls for company to be regulated in same way as mainstream broadcasters and publishers
  • Facebook 'won't delete all self-harm videos'
Denise Calnan

Facebook videos of violent deaths and self-harm do not always have to be deleted, though they may be marked as disturbing, because they can help create awareness of issues like mental illness, a list of secret guidelines used by the social media giant has revealed.

The Guardian has revealed details of the rules shown to Facebook employees as they decide what the social network’s 2 billion users can post on the site.

The guidelines apply to issues such as violence, hate speech, terrorism, pornography, revenge porn, racism and self-harm. There are also guidelines on match-fixing and cannibalism.

The analysis of the company's training manuals, spreadsheets and flowcharts shows that:

  • Facebook reviews more than 6.5 million reports a week relating to potentially fake accounts
  • Some Facebook photos of non-sexual physical abuse and bullying of children do not have to be deleted or “actioned” unless there is a “sadistic” or celebratory element
  • Facebook uses software to intercept some graphic content before it is posted to the site
  • Facebook will allow people to live-stream attempts to self-harm because it “doesn’t want to censor or punish people in distress”

The guidelines include examples of what moderators should and should not delete.

Statements such as “Someone shoot Trump” should be deleted, according to the rules, because as a head of state he falls into a ‘protected category’. However, remarks such as “To snap a b***h’s neck, make sure to apply all your pressure to the middle of her throat” and statements like “fuck off and die” are permitted because they are not considered credible threats.

Furthermore:

  • Moderators are told that videos of violent deaths do not always have to be deleted because they can help create awareness of issues like mental illness.
  • Photos of non-sexual physical abuse of children can be shared, unless there is a “celebratory element”. Moderators are told that Facebook does not automatically delete evidence of abuse, so that the material can be shared and the “child can be identified and rescued”. Moderators may add a warning that the content is disturbing.
  • Photos of animal abuse and torture can also be shared; they can be marked as ‘disturbing’ rather than deleted.
  • “Handmade” art of nudity or sexual activity is allowed, but digitally made art showing sexual activity is not.
  • Videos of abortions are allowed, as long as there is no nudity.

Some critics are calling for Facebook to be regulated in the same way as mainstream broadcasters and publishers, but Facebook chiefs say it is a “new type of company”.

Facebook told the Guardian it was using software to intercept some graphic content before it appeared on the site, adding that “we want people to be able to discuss global and current events … so the context in which a violent image is shared sometimes matters”.
