YouTube to have 10,000-strong team of moderators by 2018 in content crackdown
Google will "significantly" increase the number of staff tracking down extremist, violent and predatory content posted on YouTube to more than 10,000 by next year, the video site's boss has announced.
Susan Wojcicki, chief executive of Google-owned YouTube, said it would be expanding its team of moderators as she admitted that "bad actors" were "exploiting our openness to mislead, manipulate, harass or even harm".
In recent weeks YouTube has been widely criticised for failing to prevent predatory comments and accounts from targeting children, and has also been condemned for the availability of terrorist propaganda on the site and for not taking down extremist content.
Writing in a blog post, Ms Wojcicki said the company had already toughened its policies, including taking "aggressive action" on comment moderation, and was testing new systems to help fight emerging threats.
As well as using human moderators, the site is developing machine-learning technology to automatically flag inappropriate content.
She said: "Human reviewers remain essential to both removing content and training machine learning systems because human judgement is critical to making contextualised decisions on content.
"Since June, our trust and safety teams have manually reviewed nearly two million videos for violent extremist content, helping train our machine-learning technology to identify similar videos in the future.
"We will continue the significant growth of our teams into next year, with the goal of bringing the total number of people across Google working to address content that might violate our policies to over 10,000 in 2018."
While she did not disclose the current number of staff working to moderate content, Ms Wojcicki said efforts to tackle extremism on the site had already seen "tremendous progress".
Over the past six months, more than 150,000 videos of violent extremism have been removed, she added.
Last month, separate investigations by BBC News and The Times found paedophiles were posting indecent comments on videos of youngsters, evading discovery through flaws in YouTube's reporting system.
The investigations found that adverts for major brands were appearing alongside some of the videos, which led several big brands, including Mars and Adidas, to pull advertising from the site.
Ms Wojcicki said the site would now be taking a "new approach" to advertising, and would be "significantly ramping up" its team of ad reviewers to ensure ads only run where they should.
"We want advertisers to have peace of mind that their ads are running alongside content that reflects their brand's values," she said.
"As challenges to our platform evolve and change, our enforcement methods must and will evolve to respond to them. But no matter what challenges emerge, our commitment to combat them will be sustained and unwavering."