Finally, a way to fight filter bubbles?

The phenomenon is not new, but it is much discussed. The filter bubble theory, coined by writer Eli Pariser, holds that social media algorithms only show us content related to what we already like. Politically and democratically, this can pose a real problem by locking citizens into an ideological comfort zone. Internet users then no longer understand one another, and discussions flare up without the slightest possibility of reaching a compromise.

MIT researchers delved deeper into the subject in a fascinating study. Unsurprisingly, the likelihood that someone will follow another Twitter account triples when the two share similar political positions.

Algorithms have to change

To conduct their experiment, the scientists gathered a list of users who had retweeted posts from MSNBC and Fox News (well-known left- and right-leaning channels, respectively, editor's note). In this way, they built a list of 842 profiles identified as Democrats or Republicans.

At the same time, the researchers created a network of eight highly partisan bots designed to look like progressive or conservative profiles. These bots then followed the 842 previously selected accounts; the idea was to see how those users would react.

Overall, 15% of the real users chose to follow back bot profiles that shared their opinions, while only 5% did the same for bots with opposing positions.
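As a quick sanity check on the arithmetic (the percentages come from the article; everything else here is illustrative), the reported follow-back rates line up with the "triples" claim made earlier:

```python
# Follow-back rates reported in the article; the variable names are ours.
same_side_rate = 0.15   # share of users who followed back a like-minded bot
cross_side_rate = 0.05  # share who followed back a bot with opposing views

# Ratio between the two rates: 0.15 / 0.05 = 3.0
ratio = same_side_rate / cross_side_rate
print(f"Users were {ratio:.0f}x more likely to follow a like-minded bot")
# prints: Users were 3x more likely to follow a like-minded bot
```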

Based on this observation, the scientists believe that the web giants should change their strategy if they want to strengthen interaction between citizens. David Rand, one of the authors, explains:

If you want to foster social connections across party lines, the friend-suggestion algorithm cannot simply be neutral. It would have to be designed to actively counteract these psychological predispositions of individuals.

The message has been received, and there is little doubt that this experiment will inspire other researchers to build on the study in the future.
