Google and Facebook mean that we don’t know what we’re missing
Have you ever thought about what information you are not seeing on the internet every day?
Have you ever considered how your search queries may yield totally different results to your brother, husband or mum?
Last year Eli Pariser delivered a powerful speech at the annual TED conference about the dangers of the 'filter bubble', a concept he also explored in a book of the same name, and about the censorship of information by internet giants such as Google and Facebook.
“Your filter bubble is your own personal, unique universe of information that you live in online. What’s in your filter bubble depends on who you are, and it depends on what you do. But you don’t decide what gets in — and more importantly, you don’t see what gets edited out,” he said.
A video of this excellent nine-minute talk has been doing the rounds on the web again this week, after Google was found tracking the web-browsing habits of millions of iPhone users without their consent and was also accused by Microsoft of bypassing its Internet Explorer browser's privacy protections. Google argues that it is simply making its services work for the millions of users who have signed up for them.
Pariser’s thought-provoking talk has now been watched more than 1.2 million times. In it he warns against the new “unethical” gatekeepers of information in the 21st century: algorithms. Editors used to be the gatekeepers of the flow of information, but in the internet age, algorithms, powering the likes of Google and Facebook, are rapidly gathering information about each of their users, so they can serve them a mixture of content and ‘relevant advertising’.
However, what we think we want to see is not always what we need to see.
Facebook founder and chief executive, Mark Zuckerberg, once tellingly said: “A squirrel dying in front of your house may be more relevant to your interests right now than people dying in Africa.”
The internet was meant to set information free, not to hand editorial power to unseen forces serving us the equivalent of junk food news.
Pariser learned, via an unnamed Google engineer, that even if you haven’t logged into Google’s search (logging in allows a user to specify their content and advertising preferences), the search engine analyses some 57 signals, from the device a person is using to browse the web to their location, when deciding what information to serve them. The result is that there is no standard Google anymore.
While we may still be some way from this vision of truly personalised search, social network feeds and even TV listings, it is clearly the direction in which the flow of information online is heading. Pariser calls it the “invisible algorithmic editing of the web”.
Eric Schmidt, Google’s executive chairman, has been quoted as saying: “It will be very hard for people to watch or consume something that has not in some sense been tailored for them.”
Admittedly, the personalisation of some content can be very useful. For instance, if you support a particular football team, naturally you would want to see the latest news about your side first via a search engine or news feed. However, it is often the information you don’t yet know you want that proves the most interesting and mind-opening.
This is why, in the age of algorithms and computers, human interaction and editors remain so important. But then I would say that wouldn’t I?