16 September 2019
Opinions, algorithms and filter bubbles

Algorithms decide what we find online, selecting content that matches our opinions, habits, desires and even our emotions ever more closely. The risk is that we end up trapped in a filter bubble.

The promise of the World Wide Web – to spread information and knowledge, and to foster communication and pluralism – has been largely broken. The combined effect of several developments, including the rise of social media, has turned the Web into a space where opinions are increasingly polarized and dangerously bound to our own beliefs.

The raw material is the data we scatter online, more or less consciously, every time we search on Google, like a post on Facebook, buy on Amazon, write a review on Tripadvisor or use any other digital platform. This huge amount of data is endlessly processed by sophisticated algorithms to build an accurate picture of who we are, where and how we live, what we like and what interests us, and even of our political and religious opinions and our values.
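To make the idea concrete, here is a minimal sketch of how interaction data could be folded into an interest profile. The topics, signal weights and function names are entirely hypothetical; real profiling pipelines are vastly larger and largely undisclosed.

```python
from collections import Counter

# Hypothetical interaction log: (topic, signal) pairs a platform might record.
interactions = [
    ("politics", "like"), ("politics", "share"), ("cooking", "click"),
    ("politics", "comment"), ("sports", "click"), ("politics", "like"),
]

# Assumed weights: stronger engagement counts more toward the profile.
SIGNAL_WEIGHT = {"click": 1.0, "like": 2.0, "comment": 3.0, "share": 4.0}

def build_profile(log):
    """Aggregate weighted signals into a per-topic interest score."""
    scores = Counter()
    for topic, signal in log:
        scores[topic] += SIGNAL_WEIGHT.get(signal, 0.0)
    total = sum(scores.values()) or 1.0
    # Normalise so scores sum to 1: an estimated distribution of interests.
    return {topic: score / total for topic, score in scores.items()}

print(build_profile(interactions))
# approx. {'politics': 0.846, 'cooking': 0.077, 'sports': 0.077}
```

Even this toy shows the asymmetry: a handful of strong signals is enough to make one interest dominate the inferred profile.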

This mechanism allows brands to design one-to-one marketing campaigns (I know you like this, so you would probably buy it), but it also feeds an increasingly narrow and partial perception of reality, caging us inside actual filter bubbles. If algorithms choose what we see, we are exposed not only to customised advertisements but to content that conforms to and reinforces our beliefs, while whatever might spark debate and discussion is filtered out. We are thus enclosed in bubbles and gradually pushed into a state of cognitive and intellectual isolation.
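The narrowing effect can be sketched as a feedback loop: if each round recommends only whatever the user already engages with most, the range of topics they see collapses. The toy model below, with made-up topics and a naive "show the favourite" rule, illustrates the dynamic; it is not a description of any real platform's algorithm.

```python
import random

TOPICS = ["politics", "sports", "cooking", "science", "travel"]

def recommend(profile, exploration=0.0):
    """Pick a topic: usually the current favourite, rarely something random."""
    if random.random() < exploration:
        return random.choice(TOPICS)      # serendipity: show anything
    return max(profile, key=profile.get)  # bubble logic: show the favourite

def simulate(rounds=1000, exploration=0.0):
    # Start from a mildly skewed profile and update it with each "view".
    profile = {t: 1.0 for t in TOPICS}
    profile["politics"] = 1.1
    for _ in range(rounds):
        shown = recommend(profile, exploration)
        profile[shown] += 1.0             # engagement reinforces the profile
    return profile

random.seed(0)
print(simulate(exploration=0.0))  # one topic swallows everything: the bubble closes
print(simulate(exploration=0.2))  # a little forced diversity lets other topics survive
```

The tiny `exploration` parameter stands in for the standard remedy discussed in the recommender-systems literature: deliberately injecting diversity into what gets shown.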

Unfortunately, our own digital behaviour can deepen and strengthen the bubble. In the paper “Recursive Patterns in Online Echo Chambers”, a group of Italian researchers from CNR, together with Walter Quattrociocchi of Università Ca’ Foscari Venezia, examined how people stay inside these bubbles as reassuring comfort zones, sharing with their network only what builds rock-solid consensus around them.

Echo chambers trigger a process of tribal segregation, with each tribe defending its own position and denying any alternative narrative. Polarization is worrying in itself, but it becomes even worse when it ends up validating fake news and disinformation, foreclosing any possible dialogue.

It is not yet clear how to counter this drift and the manipulative use of algorithms. In a recent article, Paolo Boccardelli of Luiss Business School called on regulators to establish stricter rules and controls, forcing platforms to make their profiling mechanisms transparent and to fight fake news. The media bear some responsibility too, but the most powerful antidote probably lies with users themselves, who should reawaken their critical thinking, learn to tell fact from fabrication, and break the vicious circle of filter bubbles.
