Steve Dempsey: Facebook stepping up fight with fake news - but is it being entirely honest?
Last Monday, readers of the Guardian, the Times, and the Daily Telegraph in the UK were treated to full-page ads outlining Facebook's 10 commandments for spotting fake news. The patronising pointers included instructions to check the article date and URL and a reminder that some outlandish stories may be satirical.
It's no coincidence that these ads appeared a month out from the UK general election, set for June 8. Facebook ran similar print ads in France last month, and also in Germany, which has its own elections in September.
But the whole exercise is puzzling. Firstly, Facebook isn't overtly mentioned in the ads. Sure, there's a small Facebook logo in the top left-hand corner. But blink and you'll miss it. Could it be that the social network doesn't want to openly admit any responsibility for educating readers about the whiff of online bull?
Also, why take these ads out in print at all? Fake news is primarily an online issue. Surely Facebook would be better off keeping its media spend and using the cash to flag or block disputed stories on its own platform. Also, why are the ads only running in broadsheets? Is the UK's tabloid-reading population immune to fake news? Perhaps informing the masses isn't Facebook's goal here. Perhaps it's more interested in being seen to address the fake news furore by the broadsheet-reading establishment. And the establishment has been on Facebook's case.
In the UK, the Conservative MP Damian Collins, chair of the House of Commons culture, media and sport select committee, has called on Facebook to take the issue as seriously as it takes images of child abuse or copyright infringements.
There's pressure from further afield too. In Germany, for example, members of the Bundestag have backed legislation that would see social networks fined up to €50m for failing to address fake news, hate speech and illegal content.
And, in fairness, the company is responding. Despite initially denying it had been used to manipulate public opinion in last year's US election, it is now promising a host of "information operations" to combat fake news.
In the UK the company claims to have removed tens of thousands of fake accounts. It is also monitoring and flagging accounts that look like they're spreading false information. The social network is also decreasing the visibility of stories that people read but don't share. And it's working with local media outlets and third-party fact checkers such as Full Fact, and First Draft News, a Google-backed non-profit group that aims to address challenges relating to trust and truth in the digital age.
Perhaps the most interesting development is Facebook's plan to reduce the visibility of low-quality webpages. In an update announced last Wednesday, which will gradually roll out over the coming months, the news feed will deprioritise links to webpages with more ads than content, sexually suggestive or shocking content, and malicious ads, pop-ups or interstitials.
How will this quality control work? Well, Facebook has reviewed hundreds of thousands of web pages linked to from its platform to identify the sites with little or no content and bad ads. Artificial intelligence will examine new web pages shared on Facebook to see if they have similar characteristics. If they do, they will be deprioritised. And they won't be eligible to be turned into ads on Facebook. Refusing money from malware-peddling clickbait merchants is certainly a good indication that Facebook is taking this situation seriously.
So, a few strange press ads aside, it's fair to say that Facebook is stepping up its fight with fake news sites. Also, let's be clear: this problem isn't Facebook's alone. Just as truth is the first casualty of war, in modern elections it's now thrown straight under the campaign bus. Fake news, the targeted spread of it on social media sites, hacking and the leaking of sensitive documents are all now political realities. So yes, Facebook has a huge responsibility to police its own platform, but others need to step up to the plate too. The traditional media has a role to play in stymieing the spread of fake news, calling out claptrap and false allegations, or at least not repeating them.
So hey, maybe some sort of campaign on news literacy isn't a bad idea. But the voting public deserves better than a full-page print ad that pays lip service to the digital manifestation of the problem. They don't need 10 commandments, they just need one: don't believe everything you read.
Sunday Indo Business