Storm over Facebook failure to block live online self-harming
An Irish mental health service for young people has criticised Facebook's decision to allow users to livestream attempts to self-harm.
Facebook users are reportedly allowed to livestream acts of self-harm because the social networking giant "doesn't want to censor or punish people in distress who are attempting suicide". According to the rules, seen by 'The Guardian' newspaper, videos of violent deaths and self-harm, while marked as disturbing, don't always have to be deleted as they can help create awareness of issues like mental illness.
But Naoise Kavanagh, head of digital and communications at ReachOut.com, told the Irish Independent that the social network should immediately remove any threats of self-harm or suicide on its site.
"If someone communicates distress or threatens to take their own life we would of course urge Facebook to take them seriously and act on it immediately," she said.
The documents also tell moderators to ignore suicide threats when the "intention is only expressed through hashtags or emoticons" or when the proposed method is unlikely to succeed.
Any threat by a user to take their own life more than five days in the future can also be ignored, the files say. Ms Kavanagh added that this policy could have serious consequences.
"Telling moderators to ignore someone who intends to take their life is very worrying. The vulnerability of some people who use Facebook Live means the social network must bear some responsibility for the content being streamed."
Figures circulated to Facebook moderators appear to show that reports of potential self-harm on the site are rising. One document drafted last summer says moderators escalated 4,531 reports of self-harm in two weeks.
Figures for this year show 5,016 reports in one two-week period and 5,431 in another.
The documents show how Facebook will try to contact agencies to trigger a "welfare check" when it seems someone is attempting suicide.
Moderators have been told to "now delete all videos depicting suicide unless they are newsworthy, even when these videos are shared by someone other than the victim to raise awareness".
The ReachOut.com spokesperson added that Facebook should now be treated like an official news broadcaster.
She said there was plenty of evidence of copycat cases arising from graphic content or graphic details of methods.
"We work with clinical psychiatrists who believe that a lot of young people are learning self-harm techniques on social media. Facebook needs to start working with local organisations and to give relevant and proper support to people who need it."
Facebook said it was using software to intercept some graphic content before it got on the site, but that "we want people to be able to discuss global and current events… so the context in which a violent image is shared sometimes matters".
Facebook also receives tens of thousands of potential "sextortion" and "revenge porn" cases every month.
The documents, leaked to The Guardian, show Facebook users reported almost 54,000 incidents of sexual extortion and revenge porn in January, with the company disabling 14,130 accounts as a result.
Moderators escalated 33 cases involving children.
The files also reveal that Facebook will not delete videos and images depicting child abuse of a non-sexual nature, since they may draw attention to mental illness or be newsworthy. In some cases, it allows footage portraying physical bullying of children under seven.
Attempts were made to contact Facebook for comment.