Friday 24 November 2017

Is online personalisation to blame for political polarisation?

Facebook CEO Mark Zuckerberg. Photo: Eric Risberg/AP Photo
Steve Dempsey

The best thing we can say about the US presidential election is that it's over. There were lies, damn lies, scandals, racism and misogyny. There was one candidate who played by a traditional rulebook, and another who understood how to exploit the needs of struggling print and broadcast media to create ubiquitous publicity.

Obviously, the US media needs to take a long, hard look at itself. The traditional media seems to have lost the ability to hold the middle ground, and perhaps one of the reasons for this is the online filter bubble that the modern electorate inhabits.

Could it be that the internet, which originally held the promise of access to all human knowledge, actually delivers dangerous levels of confirmation bias and an ill-informed electorate?

Earlier this year, Pew Research estimated that 62pc of Americans get their news from social media, with Facebook having the strongest grip on the American psyche. But Facebook's newsfeed has little to do with news in the traditional sense. It offers no analysis. It's not designed to challenge or inform, but to keep users engaged on Facebook for as long as possible. It's a honey trap, which serves up a sweet cocktail of personalised content users are likely to engage with and share.

Of course, Facebook isn't the only one engaging in this form of personalisation, which creates more habitual user behaviour - good news for the platforms that use it. But it also means users are exposed to very few ideas that challenge their world view - bad news for tolerance and democracy.

Market intelligence company Mintel recently issued a consumer trends report that found personalisation leads to curated worldviews and separate ideological ecosystems.

"Humans naturally tend to select what they like," the report states, "but now, many content publishers and social media sites employ algorithms to feed users only articles and posts which they know will be met with enjoyment and agreement."
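The mechanism Mintel describes can be boiled down to a few lines. The following is a deliberately crude sketch, not any platform's actual algorithm: posts are ranked by how much they overlap with topics the user has already engaged with, so agreeable content floats to the top and challenging content sinks. All names and data here are illustrative.

```python
# Toy sketch of engagement-based feed filtering (illustrative only,
# not any real platform's code): rank posts by overlap with topics
# the user has previously engaged with.

def rank_feed(posts, user_history):
    """Score each post by the fraction of its topics the user already likes."""
    liked_topics = set(user_history)

    def score(post):
        topics = set(post["topics"])
        # Agreeable posts score high; unfamiliar or opposing ones score low.
        return len(topics & liked_topics) / len(topics) if topics else 0.0

    return sorted(posts, key=score, reverse=True)

posts = [
    {"id": 1, "topics": ["candidate_a", "economy"]},
    {"id": 2, "topics": ["candidate_b", "immigration"]},
    {"id": 3, "topics": ["economy", "sport"]},
]
history = ["candidate_a", "economy"]

feed = rank_feed(posts, history)
# Post 1 (full overlap with past engagement) ranks first; post 2,
# which might challenge this user's world view, ranks last.
```

Run on a user whose history skews one way, the feed reinforces that skew on every refresh - the filter bubble in miniature.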

Search engines that give prominence to one candidate over another based on user data could also be swaying political outcomes. How much sway could they have? Researchers at the American Institute for Behavioral Research and Technology in California wanted to find out, so they built a search engine that deliberately skewed its results. Around 300 voters were invited to choose their preferred candidate in a two-horse political race.

They found that undecided voters' preferences could be shifted by ranking results that favoured one candidate above the other. They even estimated that biased search engine results could shift up to 2.6 million votes.

Not a huge amount in a national race, but concentrated in states carrying more electoral college votes, it could produce a very different result. The researchers were quick to point out that there is no evidence any search engine has deliberately manipulated results. But they did note that the algorithms that can sway voters' opinions are shrouded in secrecy, with factors such as relevance and personal data used to rank results.
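To see how little it would take, here is a toy illustration - emphatically not any real search engine's code - of the kind of skew the researchers built into their experimental engine: a small bias term added to an ordinary relevance score is enough to reorder results in favour of one candidate. The names and scores are invented for illustration.

```python
# Toy illustration (no real search engine's algorithm) of how a small
# deliberate bias added to a relevance score can flip result ordering.

def rank_results(results, favoured=None, bias=0.1):
    """Order results by relevance, optionally nudging one candidate up."""
    def score(r):
        boost = bias if r["candidate"] == favoured else 0.0
        return r["relevance"] + boost
    return sorted(results, key=score, reverse=True)

results = [
    {"url": "a.example/story", "candidate": "A", "relevance": 0.72},
    {"url": "b.example/story", "candidate": "B", "relevance": 0.75},
]

neutral = rank_results(results)               # B outranks A on pure relevance
skewed = rank_results(results, favoured="A")  # the bias term flips the order
```

The point of the researchers' finding is that users rarely look past the top results, so an invisible nudge of this size at ranking time can translate into a visible shift among undecided voters.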

What is clear is that whenever algorithms are used to serve up information related to politics based on personal information, some form of digital gerrymandering is possible. The filter bubble and resulting polarisation is an unintended consequence of the convenience which many technology companies are pursuing on behalf of their users. They want intuitive and addictive interfaces that present users with no cognitive load. But this desire is directly at odds with the principles needed to foster an informed and tolerant electorate.

Facebook seems to be aware of its responsibility in this regard. In response to an article on Vox entitled 'Facebook is harming our democracy, and Mark Zuckerberg needs to do something about it', the social network issued the following statement: "We understand there's so much more we need to do, and that is why it's important that we keep improving our ability to detect misinformation. We're committed to continuing to work on this issue."

To borrow a political slogan from our side of the pond, it's a case of "A lot done. More to do."

Sunday Indo Business
