The “filter bubble” effect is summarized as follows: “People are attracted to news that reinforces their political views. Facebook’s software learns from users’ past visits and tries to guess what news they are likely to click on or share in the future. Taken to an extreme, this creates a ‘filter bubble,’ in which users are only exposed to content that reaffirms their biases. The danger is that these ‘filter bubbles’ will encourage misconceptions by hiding the truth.”
You can actually reduce filter bubbles by using private browsing mode (incognito mode), so that your browser will not remember what you have searched for.
Social media shows you what you already believe and drops you into an echo chamber. How does this happen?
An “echo chamber” is a metaphorical term for an environment in which insiders receive only information and viewpoints that reflect and reinforce their existing views.
An echo chamber can amplify fake news and distort an individual’s perspective, making it difficult for them to respect opposing views and broaden their horizons.
We can summarize:
An echo chamber prevents you from taking in opposing sources and widening your perspective.
The way the Internet filters and recommends content is a major reason users fall into information loops.
In addition, human psychological weaknesses also contribute to the formation of these echoes.
How does an echo chamber work?
1. Personalized search
Web search results are tailored to the interests of each user. This is done in two ways: modifying the user’s query and re-ranking the search results. Since this feature was introduced in 2005, Google may have personalized searches using information such as location, language, and web history.
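As a rough illustration (a minimal sketch, not Google’s actual ranking; the topic labels and scores below are made up), personalized re-ranking can be imagined as boosting results that match a user’s history, so two users see the same query ranked differently:

```python
# Toy personalized ranking: results whose topic appears in the user's
# web history get a score boost before sorting. Purely illustrative.

def personalized_rank(results, user_history):
    """Re-rank results, boosting any result whose topic is in the history."""
    def score(result):
        boost = 1.0 if result["topic"] in user_history else 0.0
        return result["relevance"] + boost
    return sorted(results, key=score, reverse=True)

results = [
    {"title": "Climate policy debate", "topic": "politics", "relevance": 0.6},
    {"title": "Local football scores", "topic": "sports", "relevance": 0.7},
]

# Same query, two different histories -> two different top results.
politics_fan = personalized_rank(results, user_history={"politics"})
sports_fan = personalized_rank(results, user_history={"sports"})
print(politics_fan[0]["title"])  # the politics story ranks first
print(sports_fan[0]["title"])    # the sports story ranks first
```

Each user’s feedback thus shapes future rankings, which is exactly the loop that narrows what they see.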
2. Targeted advertising
Have you ever had the unsettling experience of complaining about dry skin in a private message and then seeing tons of ads for moisturizers on Facebook or Instagram? It feels like being watched.
According to a Wall Street Journal study of 50 websites, from CNN to Yahoo and MSN, each site sets an average of 64 cookies to collect data. Pariser gives an example: “Dictionary.com places 223 tracking cookies on your computer. After you search for the keyword ‘depression’ there, other websites will immediately show you ads for depression medication.”
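To make the mechanism concrete, here is a toy simulation of cross-site tracking (the `AdTracker` class, cookie IDs, and site names are all hypothetical, not any real ad network’s API): one tracker embedded on many sites reads the same cookie everywhere and targets ads from the combined profile:

```python
# Toy model of third-party tracking cookies: a single ad server is
# embedded on many sites, so the same cookie ID links visits together.

class AdTracker:
    def __init__(self):
        self.profiles = {}  # cookie_id -> list of (site, search_term)

    def on_page_visit(self, cookie_id, site, search_term):
        """Record what this cookie's owner searched for on this site."""
        self.profiles.setdefault(cookie_id, []).append((site, search_term))

    def pick_ad(self, cookie_id):
        """Target an ad based on the tracked cross-site history."""
        history = self.profiles.get(cookie_id, [])
        if any(term == "depression" for _, term in history):
            return "ad: depression medication"
        return "ad: generic"

tracker = AdTracker()
tracker.on_page_visit("cookie-42", "dictionary.example", "depression")
# Later, on a completely different site, the same cookie is read:
print(tracker.pick_ad("cookie-42"))  # ad: depression medication
```

The point is that no single site needs your whole history; the shared cookie lets the tracker assemble it across sites.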
3. Recommended algorithm on social networks
When you ‘like’ a video of a celebrity, that person will begin to appear in the ‘Explore’ section of your Instagram or on your YouTube homepage; or you read one news story, and you are recommended more content like it, sometimes with polarizing, extreme takes on the news.
Suggesting more of the content you are already interested in is how social media sites keep you on their platforms. However, when a large number of users rely on social media as their primary news source, these recommendation algorithms create information loops.
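The loop described above can be sketched in a few lines (an illustrative toy, not any platform’s real recommender; the topics and weights are invented): every click raises that topic’s weight in the feed, so the feed steadily narrows toward what you already engage with:

```python
# Toy recommendation feedback loop: the feed samples topics in
# proportion to their weights, and each click boosts the clicked
# topic's weight, so one interest comes to dominate the feed.

import random

def next_feed(weights, size=5):
    """Sample a feed of `size` items, proportional to topic weights."""
    topics = list(weights)
    return random.choices(topics, weights=[weights[t] for t in topics], k=size)

weights = {"politics": 1.0, "sports": 1.0, "science": 1.0}
random.seed(0)  # fixed seed so the simulation is reproducible

for _ in range(50):              # this user always clicks politics items
    for item in next_feed(weights):
        if item == "politics":
            weights[item] *= 1.1  # each click boosts the clicked topic

share = weights["politics"] / sum(weights.values())
print(f"politics share of feed weight: {share:.0%}")
```

After a few dozen rounds, politics dominates the weights: the user’s own clicks have trained the feed into an information loop.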
Consequence of Filter Bubble and Echo Chamber
Beliefs like the flat Earth or conspiracy theories are still rampant on social media, despite all the mainstream scientific information that proves otherwise. Has this ever made you wonder why? Are so many people really lacking in common sense?
It’s possible these people have fallen into echo chambers. When the internet experience is being filtered for us and our own psychology has its weaknesses, many people end up completely ignoring the other side of the debate.
An echo chamber, when combined with fake news, becomes even more of a disaster scenario. In the run-up to Brexit (Britain’s separation from the European Union), a piece of fake news claimed that the UK was losing 350 million pounds per week to the EU instead of spending it on British citizens. Although official news channels tried to debunk it, the claim kept spreading, contributing to the eventual Brexit result.
As for filter bubbles, bursting them does not fit Facebook’s economic model, because digital advertising depends on tracking users and their clicks, shares, and “likes”. Breaking these “filter bubbles” therefore brings no commercial benefit to Facebook.