Tuesday 19 March 2019

Moderators on Facebook 'turn to drugs and sex' to cope with stress

Pledge: Facebook has vowed to do more to support moderators. Photo: Dominic Lipinski/PA Wire

James Cook

Facebook moderators who review disturbing content on the site are suffering panic attacks and mental breakdowns due to the stress of the job, according to a new report.

The moderators, who are mostly contract labour on low salaries, are coping with the stress by taking drugs, drinking alcohol, making offensive jokes, and having sex in the workplace, an investigation by 'The Verge' website claims.

Speaking to the technology site, employees said the therapeutic activities and counselling Facebook provides to help them cope with exposure to inappropriate content were inadequate.

The report added that one moderator, traumatised after watching a video of a man being stabbed to death, had been diagnosed with PTSD and now sleeps with a gun by his side.

Other former contractors said that repeated exposure to conspiracy theories on Facebook had made them more likely to believe those theories themselves. After reading related content on the site, some moderators came to believe falsehoods, including that the Holocaust was fake and that the 9/11 terrorist attack in New York was part of a conspiracy.

The 'Harvard Digital Journal of Law & Technology' wrote last year that the increasing reliance on moderation contractors was "concerning".

It said: "One of the biggest problems in evaluating the existing systems is we have little information about them. The companies are intentionally opaque and resist any attempt by others to investigate the existing procedures.

"There is a growing body of evidence that content moderation, as currently constituted, entails considerable psychological risks to the employee."

Facebook has stepped up its efforts to remove posts involving violent crime, violent pornography and hate speech, relying on specialist partner firms to sift through the deluge of potentially offensive content to decide whether it needs to be taken down.

It provides moderation contractors with copies of its rules on which content should be removed and which posts are allowed to remain on the social network.

But 'The Verge' reported that in practice it is often unclear whether particular content is allowed on the site.

Facebook has since admitted that it needs to do more to support the wellness of moderators who remove harmful content from the social network.

Justin Osofsky, Facebook's vice president of global operations, said: "We are committed to working with our partners to demand a high level of support for their employees; that's our responsibility and we take it seriously."

Irish Independent
