Facebook to hire 3,000 staff to tackle violent videos
Facebook plans to hire another 3,000 people to review videos and other posts after being criticised for not responding quickly enough to murders, suicides and other violent acts shown live on its service.
The hiring spree over the next year is an acknowledgement by Facebook that, at least for now, it needs more than automated software to improve monitoring of posts. Facebook Live, a service that allows any user to broadcast live, has been marred since its launch last year by instances of people streaming violence.
The hiring spree will be in addition to the 4,500 people Facebook already employs to identify crime and other questionable content for removal.
CEO Mark Zuckerberg wrote yesterday that the company is "working to make these videos easier to report so we can take the right action sooner - whether that's responding quickly when someone needs help or taking a post down."
Videos and posts that glorify violence are against Facebook's rules, but the company has been criticised for being slow to respond to such content, including live videos of a murder in Cleveland and the killing of a baby in Thailand.
The Thailand video was up for 24 hours before it was removed. In most cases, content is reviewed and possibly removed only if users complain.
News reports and posts that condemn violence are allowed. This makes for a tricky balancing act for the company.
Facebook does not want to act as a censor, as videos of violence, such as those documenting police brutality or the horrors of war, can serve an important purpose.
Policing live video streams is especially difficult, as viewers don't know what will happen.
"We don't want to get rid of the positive aspects and benefits of live streaming," said Benjamin Burroughs, professor of emerging media at the University of Nevada in Las Vegas.
Prof Burroughs said Facebook clearly knew live streams would help the company make money, as they keep users on Facebook longer, which pleases advertisers.
If Facebook hadn't also considered the possibility that live streams of crime or violence would inevitably appear alongside the positive stuff, "they weren't doing a good enough job researching implications for societal harm," Burroughs said.
Mr Zuckerberg described the recent violent videos as "heart-breaking, and I've been reflecting on how we can do better for our community".