Tuesday 17 September 2019

Facebook yet to prove it's serious about halting spread of fake news

Facebook has taken action to clamp down on misinformation on its WhatsApp messaging service. Photo: Reuters
Steve Dempsey

Facebook's struggles with fake news are well-documented. The social network, which let Cambridge Analytica plunder data, let Macedonian teenagers peddle lies about the US presidential election, and initially denied that bad actors could use the platform to undermine democracy, has been slow to get its act together. But maybe that's changing.

The social network has realised it needs to control the spread of misinformation on its other properties, Instagram and WhatsApp. It has just announced that Instagram users can report posts to fact-checkers.

But there are two glaring problems. Firstly, it's only available to American users. Secondly, if the offending posts are found to be 'false information', they won't be deleted. Instead, they'll be served less often on Instagram's Explore and Hashtag pages. Instagram won't alert users when they see a post that's been debunked. Think of a serial liar who admits they're not truthful, but agrees to limit their lies to a whisper.

When it comes to WhatsApp, the situation isn't much better - though not for want of trying. The recent Indian election was an opportunity for Facebook to prove it could handle misinformation. India is WhatsApp's biggest market, with 400 million users at last count.

WhatsApp has also been accused of enabling the spread of misinformation that has led to violence on the subcontinent. So Facebook got busy trying to prove the service could be trusted.

It limited the number of times a user can forward a message to five other users, and labelled all forwarded messages. New privacy settings allowed users to decide who could add them to groups. WhatsApp also started banning accounts: new accounts that sent out high volumes of spammy messages were axed, at a rate of around two million accounts per month.

It also gave users the ability to flag suspicious messages, in English and four other languages, to a fact-checking startup, and launched a fake news tip line. India has 22 official languages, so an advertising campaign in 10 of them encouraged Indians to "spread joy, not misinformation". There was little indication that any of this worked, and particular elements, such as the tip line, were branded a PR stunt.

But it's welcome news that Facebook is beginning to take misinformation in different languages seriously. It is, after all, a global phenomenon.

This week, the social network announced it was extending its fact-checking partnership with Africa Check, an independent fact-checking organisation, to cover a host of languages in different African countries.

The programme has been running in Kenya since 2018, where stories flagged as false are demoted in the news feed and tagged with warnings. As in many countries, Kenyans were bombarded with misinformation on Facebook and WhatsApp during the 2017 general election.

Now additional countries and native languages will benefit from some form of fact-checking. Fact-checkers will cover Yoruba and Igbo in Nigeria; Swahili in Kenya; Wolof in Senegal; and Afrikaans, Zulu, Setswana, Sotho, Northern Sotho and Southern Ndebele in South Africa. The big question is around resourcing.

Fact-checking is a time-consuming and expensive business (no wonder Facebook doesn't want to take direct responsibility for it). Africa Check's teams will be hard-pressed to stem the tide of misinformation on Facebook.

Only in Germany has the social network hired a sufficient number of fact-checkers. Why might that be? Legislation that imposes hefty fines on digital platforms with more than two million German users if they fail to remove posts containing hate speech or other criminal material within 24 hours might have something to do with it.

And the Germans have already used this legislation to slap Facebook on the wrist. The Federal Office of Justice has issued Facebook with a fine of €2m. Small beans compared with the recent US Facebook fine of $5bn (€4.5bn), and even smaller compared with Facebook's first-quarter revenue of more than $15bn.

But Facebook is eager to stay onside; if every jurisdiction decided to start piling in with fines, it could face death by a thousand cuts.

So Facebook is trying. But is it lip service, or is it putting its money where its mouth is?

According to Full Fact, a UK charity that specialises in fact-checking, Facebook's efforts are worthwhile and have clear social value, but further development is needed. Full Fact issued a report based on its experiences of working with Facebook that outlined two major concerns.

These relate to the need to increase the volume of content that can be fact-checked, and to Facebook's opacity. "We want Facebook to share more data with fact-checkers," the report stated, "so that we can better evaluate content we are checking and evaluate our impact."

Sadly, while the company does have a history of letting the likes of Cambridge Analytica access user data without consent, it doesn't have a history of willingly sharing information with NGOs like Full Fact. We'll know Facebook is doing more than PR-focused virtue-signalling when that changes.

Sunday Indo Business
