
Popping the digital filter bubble



Ever wondered why two people can search for the same thing online and get two totally different results? The answer lies in online echo chambers and digital filter bubbles: social media and search engines skew our access to information, with algorithms artificially promoting the content they think will suit us. Those invisible chains shrink our freedom to learn and to be confronted with new ideas. Want to break free? France 24 can help you pop the filter bubbles around you!

Social networks have revolutionised how we access information. In France, over a quarter of people get their news from social networks – second only to television. For young people, the change is even more drastic: 47% of under-35s say their primary source of information is social media (Ifop, 2019). And we are no longer just passive consumers – everyone can also generate content, adding to the vast quantity of news and views online.

Sifting through that ever-growing mountain of information forces search engines and social media to use algorithms – to sort the wheat they think will interest us from the chaff they assume won’t. For Jérôme Duberry of the University of Geneva, it’s a simple calculation: “if a web-user has a given profile, then they will be fed information of a certain type”. Posts that seem to appear at random on our Twitter or Facebook timelines are in fact carefully chosen according to what the platform already knows about us – interests, friends, “likes”. Highlighting content tailored specifically to our interests filters out topics from outside our comfort zone – and reinforces our existing beliefs.
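
To make that “simple calculation” concrete, here is a minimal sketch of profile-based feed filtering. The profile, posts and threshold are all invented for illustration – real platforms combine far richer signals such as likes, friends and watch time – but the mechanism is the same: content that doesn’t match the stored profile never reaches the feed.

```python
# A minimal sketch of profile-based feed filtering, in the spirit of
# Duberry's "simple calculation". All names, weights and the threshold
# are invented for illustration; real platforms use far richer signals.

user_profile = {"politics": 0.9, "football": 0.6, "opera": 0.05}

posts = [
    {"title": "Election debate highlights", "topic": "politics"},
    {"title": "Cup final match report", "topic": "football"},
    {"title": "New opera season opens", "topic": "opera"},
]

def predicted_interest(post):
    # "If a web-user has a given profile, then they will be fed
    # information of a certain type": the score is simply how strongly
    # the post's topic matches the stored profile.
    return user_profile.get(post["topic"], 0.0)

# Rank by predicted interest and drop anything below a threshold.
# Topics outside the comfort zone (here, opera) never reach the feed.
feed = sorted(
    (p for p in posts if predicted_interest(p) > 0.1),
    key=predicted_interest,
    reverse=True,
)
for post in feed:
    print(post["title"], predicted_interest(post))
```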


But social networks are only one part of the digital echo chamber. Search engines are also key – once again because of their reliance on algorithms. Google’s search results are generated from our own online history, mixed with that of thousands of other users. The search engine’s goal is to maximise engagement by surfacing the results most likely to prompt interest (and sales) from the user – and so generate advertising revenue.
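
As a rough illustration of that engagement-driven ranking, the hypothetical sketch below blends a user’s own history with aggregate click data. The URLs, signals and weights are assumptions for the sake of the example, not Google’s actual formula.

```python
# A hypothetical sketch of engagement-driven search ranking. The weights
# and signals are assumptions for illustration, not Google's real formula.

def rank_results(results, personal_history, global_clicks):
    def predicted_engagement(url):
        personal = personal_history.get(url, 0.0)  # our own online history
        crowd = global_clicks.get(url, 0.0)        # thousands of other users
        return 0.7 * personal + 0.3 * crowd        # invented blend of the two

    return sorted(results, key=predicted_engagement, reverse=True)

results = [
    "encyclopedia.example/neutral-article",
    "shop.example/targeted-offer",
    "blog.example/contrarian-view",
]
personal_history = {"shop.example/targeted-offer": 0.8}  # clicked before
global_clicks = {"encyclopedia.example/neutral-article": 0.5}

# The result we have engaged with before floats to the top,
# regardless of which page is objectively most informative.
for url in rank_results(results, personal_history, global_clicks):
    print(url)
```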

For Jérôme Duberry, those gatekeepers limit our access to knowledge: “it’s as if there was someone standing in front of the university library, who asks you a bunch of questions about who you are, and only then gives you access to a limited number of books. And you never get the chance to see all the books on offer, and you never know the criteria for those limits.”

The consequences of these so-called filter bubbles are far-reaching. For Tristan Mendès France, a specialist in digital cultures at the University of Paris, “being informed via social networks means an internet user is in a closed circuit of information”.

Blinkered online views, democratic bad news

Many academics warn that those echo chambers threaten the health of our democracies, arguing that the algorithms contribute to the polarisation of society. By confining us to views similar to our own and excluding contradictory opinions, they may reinforce our beliefs – but at the expense of a diversity of opinions.

And that could undermine the very basis of our democracies. For Jérôme Duberry, filter bubbles “could lead to us questioning the value of a vote. Today, we lend a great deal of importance to the vote, which is the extension of a person’s opinion. But that individual’s opinion is targeted by interest groups using an impressive array of techniques.”

That isn’t the only distortion the algorithms have created. They have also allowed more radical views to predominate. YouTube’s algorithm is blind to the actual content of a video – it decides what to make most visible according to which videos are viewed all the way to the end. But for Tristan Mendès France, “it is generally the most activist or militant internet users that view videos all the way through”. The result is “extra-visibility” for otherwise marginal content – at the expense of more nuanced or balanced views, or indeed verified information.
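
A toy sketch of that completion-weighted ranking is below. The videos and view logs are invented, and the real recommendation system is vastly more complex, but it shows how watch-through alone can push marginal content to the top.

```python
# A toy sketch of completion-weighted video ranking. Videos and view logs
# are invented; the real recommendation system is vastly more complex.

videos = {
    "Nuanced policy documentary": [0.3, 0.5, 0.4, 0.6],   # casual viewers drop off
    "Militant fringe monologue": [1.0, 1.0, 0.95, 1.0],   # activists watch to the end
}

def completion_rate(view_fractions):
    # The ranking is blind to the content itself: only the share of
    # each video that viewers actually watched is taken into account.
    return sum(view_fractions) / len(view_fractions)

ranked = sorted(videos.items(), key=lambda kv: completion_rate(kv[1]), reverse=True)
for title, views in ranked:
    print(f"{title}: {completion_rate(views):.0%} average completion")
```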

Escaping the echo chamber

So what happens to the spirit of debate in a world where your online habits reinforce your beliefs? Is the echo chamber a philosophical prison? And how easy is it to get back out into the fresh air of contradictory views?

In the US, the movement opposing algorithms is gaining pace. Since 2019, the Senate has been debating the Filter Bubble Transparency Act, a bipartisan bill that would allow web-users to search online without “being manipulated by algorithms driven by user-specific data”. And Twitter has already taken a small step in that direction: since 2018, users can choose between a personalised timeline, curated according to their interests, and a simple chronological one.
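
The difference between those two timeline modes can be sketched in a few lines. The posts and relevance scores here are invented for illustration; this is not Twitter’s actual code.

```python
# A minimal sketch of the two timeline modes. Posts and relevance scores
# are invented for illustration; this is not Twitter's actual code.

from datetime import datetime

posts = [
    {"text": "Breaking news from abroad", "time": datetime(2021, 5, 1, 10, 0), "relevance": 0.2},
    {"text": "Post matching your views", "time": datetime(2021, 5, 1, 8, 0), "relevance": 0.9},
    {"text": "Challenging op-ed", "time": datetime(2021, 5, 1, 9, 0), "relevance": 0.4},
]

def timeline(posts, personalised=True):
    if personalised:
        # Curated: whatever the algorithm predicts will engage you most.
        return sorted(posts, key=lambda p: p["relevance"], reverse=True)
    # Chronological: latest first, no profiling involved.
    return sorted(posts, key=lambda p: p["time"], reverse=True)

for post in timeline(posts, personalised=False):
    print(post["time"], "-", post["text"])
```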

But the change doesn’t need to be instigated from the top down – everyone can play a part in taking back their online independence from the diktats of algorithms. Jérôme Duberry has a simple suggestion: change your online habits, and “if you’re used to reading [left-wing newspaper] Libération, then you should go and read [conservative daily] Le Figaro!” For Duberry, it’s vital to remember that the algorithms are slaves to what they learn from us – and since what they learn can evolve all the time, a web-user can escape the bubble whenever they choose.
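
As a sketch of that retraining effect: the sites, update rule and learning rate below are all invented for illustration, but they show how deliberately changed habits gradually reshape a stored profile.

```python
# A sketch of how deliberately changing reading habits can retrain a
# profile, per Duberry's suggestion. Sites, update rule and learning
# rate are all invented for illustration.

user_profile = {"liberation.fr": 0.9, "lefigaro.fr": 0.0}

def record_visit(profile, site, learning_rate=0.2):
    # The algorithm is a slave to what it learns: each visit nudges the
    # stored interests toward the sites the user actually reads.
    for s in profile:
        target = 1.0 if s == site else 0.0
        profile[s] += learning_rate * (target - profile[s])

# Deliberately reading the other side of the political spectrum...
for _ in range(10):
    record_visit(user_profile, "lefigaro.fr")

print(user_profile)  # the bubble has visibly shifted
```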

Combating radical content needn’t be a hopeless battle either. Tristan Mendès France helps coordinate RiPost, an online initiative run by the Conspiracy Watch website. Its goal is to turn the algorithms’ strengths against them: users who type toxic keywords into search engines are directed to relevant educational content.

And traditional media have a role to play too. That’s why France 24 has joined forces with more than 20 media outlets around the EU to launch Europe Talks. We want to put thousands of Europeans in touch with each other. Our algorithm is different: we want to put you in contact with people whose views are totally different from yours.

So click here if you want to be transported outside your bubble.

 

