Facebook ramps up efforts to fight fake news and election interference with new monitoring centre at Dublin office
Facebook is ramping up its efforts to fight fake news and election interference with an eye on this year’s European Parliament polls.
As part of this effort, the tech giant is also cracking down on murky, unaccountable political ads.
The move is being partly run from Facebook’s Dublin office, the company says.
“To expand on work we did to fight misinformation in advance of the Brazil presidential election and the US midterms, we are planning to set up two new regional operations centres, focused on election integrity, located in our Dublin and Singapore offices,” said Katie Harbath and Samidh Chakrabarti, two Facebook directors overseeing the effort.
“This will allow our global teams to better work across regions in the run-up to elections and will further strengthen our coordination and response time between staff in Menlo Park [Facebook’s US headquarters] and in-country. These teams will add a layer of defence against fake news, hate speech and voter suppression, and will work cross-functionally with our threat intelligence, data science, engineering, research, community operations, legal and other teams.”
The move comes with the EU elections set to take place in May.
“First, we’re committed to setting a high standard for transparency when it comes to political advertising on Facebook,” said Harbath and Chakrabarti.
“In late March, we will launch new tools to help prevent foreign interference in the upcoming elections and make political advertising on Facebook more transparent. Advertisers will need to be authorised before purchasing political ads and far more information about the ads themselves will be made available for people to see.”
This will require advertisers to “confirm their identity and include additional information about who is responsible for their ads”, especially for “electoral ads or ads about highly debated and important issues related to the European Parliament elections”, said the two directors.
“While the vast majority of ads on Facebook are run by legitimate organisations, we know that there are bad actors that try to misuse our platform,” they said.
The system will deliver this transparency through a new, searchable ad library.
“When you click on the ‘paid for by’ disclaimer, you will be taken to the Ad Library,” said Harbath and Chakrabarti. “The library will share information on the ad’s performance, like range of spend and impressions, as well as demographics of who saw it, like age, gender and location.”
This library will be “completely searchable” and can be accessed by anyone in the world “regardless of whether they have a Facebook account or not” at facebook.com/adlibrary.
“These tools will cover not only campaign ads, but also issue ads which don’t mention a candidate or political party but do discuss highly-debated and important topics,” said Harbath and Chakrabarti.
“While we are pleased with the progress we’ve made in the countries where we have rolled out our ads transparency tools, we understand that they will not prevent abuse entirely. We’re up against smart and well-funded adversaries who are adapting and changing their tactics, just as we are getting better at preventing abuse. But we believe that this higher level of transparency is good for democracy and is good for the electoral process. Transparency helps everyone, including political watchdog groups and journalists, keep advertisers accountable for who they say they are and what they say to different audiences.”