The Filter Bubble is a phenomenon in which the internet learns our interests and behavior patterns and then shows us only the information we seem to want to see. In this article, we explore the problems filter bubbles create and look for ways toward a healthier democracy.
―
Q: What is a Filter Bubble?
―
A: A Filter Bubble occurs when the internet tailors the information it provides to us based on our interests and behavior patterns.
Imagine going to a special restaurant. This restaurant offers a menu based on the food we've ordered in the past. For instance, if we frequently ordered pizza, the menu might show only various types of pizza, hiding the other options. In such a scenario, we might not even realize that other foods exist, making it more likely that we keep eating only pizza.
Internet portals and social media function similarly to this restaurant. These platforms identify our interests and search habits to provide customized content. While such algorithms might initially seem convenient, they can actually confine us to a narrow perspective, posing significant problems.
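To make this concrete, here is a minimal, purely illustrative sketch of the restaurant analogy above, written in Python. The function names and data are hypothetical simplifications, not how any real platform works: the "menu" a customer sees is narrowed to the categories they ordered most often in the past.

```python
# Purely illustrative sketch of the restaurant analogy: the "menu" shown to a
# customer is narrowed to the categories they ordered most often in the past.
# All names and data here are hypothetical simplifications.
from collections import Counter

def personalized_menu(full_menu, order_history, top_n=1):
    """Return only dishes whose category matches the customer's most-ordered categories."""
    favorites = [category for category, _ in Counter(order_history).most_common(top_n)]
    return [dish for dish in full_menu if dish["category"] in favorites]

full_menu = [
    {"name": "Margherita", "category": "pizza"},
    {"name": "Pepperoni", "category": "pizza"},
    {"name": "Bibimbap", "category": "korean"},
    {"name": "Pad Thai", "category": "thai"},
]
order_history = ["pizza", "korean", "pizza", "pizza"]

print(personalized_menu(full_menu, order_history))
# Only the pizza dishes remain; the customer never learns what else is on offer.
```

Just as this toy filter hides everything outside the customer's past choices, a personalization algorithm can quietly narrow the range of news, posts, and products we ever get to see.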
The term "Filter Bubble" was coined by Eli Pariser, an internet activist and author. In his 2011 book, "The Filter Bubble: What the Internet Is Hiding from You," Pariser introduced this term and argued that personalized search results and news feeds narrow users' perspectives and limit exposure to diverse information.
Pariser's insights sparked an important discussion about how the internet and social media shape the way we consume information and communicate, and the concept remains just as relevant today.
―
Q: What types of Filter Bubbles exist?
―
A: The filters that screen the information reaching us can be divided into social filters, search engine filters, and content recommendation filters.
◾ Social Filter Bubbles
This phenomenon occurs when individuals consume information solely within their social networks. As a result, we may be exposed only to information from people who hold opinions and perspectives similar to our own.
◾ Search Engine Filter Bubbles
Search engine filter bubbles provide customized search results based on a user's past search history and click behavior.
◾ Content Recommendation Filter Bubbles
Content recommendation filter bubbles arise when social media platforms and video streaming services recommend content tailored to an individual's tastes and interests, discouraging users from exploring a wider variety of content. The sketch after this list illustrates all three filter types in simplified form.
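As a rough illustration, here is a simplified, hypothetical sketch of the three filter types above. It is not any platform's actual algorithm; the data structures and scoring rules are invented purely to show how each filter narrows what reaches the user.

```python
# Hypothetical, simplified illustrations of the three filter types.
# Real systems are far more sophisticated; these are toy examples only.

def social_filter(posts, my_network):
    """Social filter: keep only posts written by accounts in the user's own network."""
    return [post for post in posts if post["author"] in my_network]

def search_engine_filter(results, click_history):
    """Search engine filter: move results about previously clicked topics to the top."""
    return sorted(results, key=lambda result: result["topic"] not in click_history)

def recommendation_filter(videos, watched_tags, limit=3):
    """Content recommendation filter: rank videos by overlap with past viewing tastes."""
    def taste_score(video):
        return sum(tag in watched_tags for tag in video["tags"])
    return sorted(videos, key=taste_score, reverse=True)[:limit]
```

In each case the user never sees what was filtered out, which is exactly what makes these bubbles so hard to notice.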
―
Q: What are the problems with Filter Bubbles?
―
A: The most significant issue with Filter Bubbles is their 'closed-off nature'.
When the information we encounter is limited, our opportunities to be exposed to diverse perspectives and opinions decrease. Filter Bubbles can lead to several negative outcomes:
◾ Isolation in Personalized Information
Algorithms provide us with tailored information, so we are exposed only to the news and advertisements we already prefer. For example, if we hold certain political views, social media may show us only news and posts that align with those views, which diminishes our ability to understand different perspectives.
◾ Manipulation of Consumer Choices
Algorithms analyze our online behavior to show us customized advertisements. For instance, if we search for a specific product, related ads will continue to appear, encouraging us to make a purchase. This can manipulate our choices and limit our awareness of other options.
◾ Biased Decision-Making
Personalized information can introduce bias into our decision-making. For example, someone interested in the stock market may only come across positive news about certain stocks, potentially leading to misguided investment decisions.
◾ Creation of 'Echo Chambers'
Filter Bubbles can strengthen echo chambers, where information circulates only among people with the same opinions. Echo chambers can amplify extreme opinions, which is dangerous.
―
A Case Study of Filter Bubbles: The 2016 US Presidential Election
―
During the 2016 US presidential election, social media users encountered vastly different news and information depending on their political leanings, which made it harder to understand opposing views and deepened social division.
◾ Social Media and Customized News Feeds
Many voters used social media platforms like Facebook as their primary news source. These platforms delivered personalized content based on users' past interactions. As a result, users were mainly exposed to information that aligned with their viewpoints, reducing their chances of encountering opposing opinions or different political perspectives.
◾ Spread of Fake News
Users were more likely to react to and share fake news that reflected their preferences or beliefs. Negative fake news about specific candidates spread rapidly on Facebook. According to a study by Stanford University, fake news supporting Trump was shared around 30 million times during the election period, nearly four times the number of shares for fake news supporting Hillary Clinton.
◾ Candidates' Social Media Strategies
The presidential candidates leveraged filter bubbles to run targeted campaign strategies. Through social media ads and content, they delivered customized messages to specific voter groups, further intensifying voters' exposure to only those messages that reflected their existing viewpoints.
◾ Formation of Divided Social Echo Chambers
Filter bubbles strengthened the phenomenon of information circulating only among individuals with similar opinions. This further amplified extreme opinions about certain candidates or political issues.
Ultimately, the 2016 US presidential election remains a clear example of the negative impact filter bubbles can have on democracy and public discourse.
―
Q: Is there a way to overcome Filter Bubbles?
―
"In a world deluged by irrelevant information, clarity is power. In theory, everybody can join the discussion about the future of humanity, but it is becoming harder to engage in meaningful dialogue. We are in an era of a personal information bubble, where what we see and hear is tailored to our existing beliefs."
-Yuval Noah Harari, Historian-
To overcome Filter Bubbles, we must approach the issue with Yuval Noah Harari's perspective: "In a world deluged by irrelevant information, clarity is power." Efforts must come from individuals, big tech companies, and governments.
◾ On an Individual Level:
It's crucial to consciously manage our information consumption habits.
Exploring news from diverse sources and perspectives, and listening to opinions different from our own are essential steps. Additionally, adjusting the settings on social media and search engines to expose ourselves to a variety of information is important.
◾ Big Tech Companies:
They need to increase the transparency of their algorithms and make greater efforts to provide users with balanced information. Rather than simply mirroring users' existing beliefs, algorithms should be improved so that they respect diversity of viewpoints.
◾ Governments and Regulatory Bodies:
They should pay close attention to the issue of Filter Bubbles and implement appropriate policies, for example by strengthening digital literacy education and enacting laws that promote the diversity and transparency of information.
When these efforts are combined, we can overcome the difficulty Yuval Noah Harari describes, in which "it is becoming harder to engage in meaningful dialogue," and move toward genuinely meaningful conversation and social participation.
For a peaceful coexistence among diverse ethnicities, religions, and nationalities, understanding and accepting different perspectives is essential!
Overcoming Filter Bubbles is not just about choosing different information. It's a crucial first step towards creating a more peaceful and unified world by understanding and accepting different perspectives.
“To embrace the future,
we must expand the scope of vocations that can herald the coming of peace.
Even though we may never meet our descendants, we must make sure that all their activities will harmonize in peaceful societies and nations.”
-Dr. Hak Ja Han Moon
Founder of Sunhak Peace Prize-
Written by: Yeon Je Choi, Director