The danger of filter bubbles: Only receiving personalized information and no longer being open to opposing viewpoints.


Filter Bubbles

A filter bubble – a term coined by internet activist Eli Pariser – is a state of intellectual isolation that allegedly can result from personalized searches when a website algorithm selectively guesses what information a user would like to see based on information about the user, such as location, past click behavior, and search history. As a result, users become separated from information that disagrees with their viewpoints, effectively isolating them in their own cultural or ideological bubbles. Pariser terms this state a filter bubble, a “personal ecosystem of information.”
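The selection mechanism Pariser describes can be illustrated with a toy sketch. The function and data below are purely hypothetical (no real platform's ranking system is this simple): articles are scored by how much their topics overlap with the user's past clicks, so content matching existing interests is surfaced and everything else is pushed out of view.

```python
from collections import Counter

def personalize(articles, click_history, k=3):
    """Rank articles by overlap with topics the user clicked before.

    articles: list of (title, topics) pairs.
    click_history: list of topic strings the user previously engaged with.
    Illustrative only; real recommender systems are far more complex.
    """
    interest = Counter(click_history)
    # Score each article by how often the user already engaged with its topics;
    # articles outside the user's history score zero and are filtered out.
    scored = sorted(
        articles,
        key=lambda a: sum(interest[t] for t in a[1]),
        reverse=True,
    )
    return [title for title, _ in scored[:k]]

articles = [
    ("Tax cuts debated", ["politics", "economics"]),
    ("New climate report", ["science", "climate"]),
    ("Markets rally", ["economics"]),
    ("Local sports final", ["sports"]),
]
history = ["economics", "politics", "economics"]

print(personalize(articles, history, k=2))
# → ['Tax cuts debated', 'Markets rally']
```

Note the feedback loop: the user only sees (and can only click) economics and politics stories, which further reinforces those interests on the next ranking pass – the narrowing dynamic the term "filter bubble" names.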

The choices made by these algorithms are not transparent. Prime examples include Google Personalized Search results and Facebook’s personalized news stream. The bubble effect may have negative implications for civic discourse, according to Pariser, but contrasting views regard the effect as minimal and addressable. The outcome of the 2016 US presidential election has been associated with the influence of social media platforms such as Twitter and Facebook, which has called into question the effects of the “filter bubble” phenomenon on user exposure to fake news and echo chambers. This spurred new interest in the term, with many concerned that the phenomenon may harm democracy.

According to an article by Farnam Street, many sites offer personalized content selections, based on our browsing history, age, gender, location, and other data. The result is a flood of articles and posts that support our current opinions and perspectives to ensure that we enjoy what we see. Even when a site is not offering specifically targeted content, we all tend to follow people whose views align with ours. When those people share a piece of content, we can be sure it will be something we are also interested in.

The article adds: “That might not sound so bad, but filter bubbles create echo chambers. We assume that everyone thinks like us, and we forget that other perspectives exist. One of the great problems with filters is our human tendency to think that what we see is all there is, without realizing that what we see is being filtered.”

In particular, the existence of filter bubbles has led to widespread concern. Pariser writes: “Democracy requires citizens to see things from one another’s point of view, but instead we’re more and more enclosed in our own bubbles. Democracy requires a reliance on shared facts; instead we’re being offered parallel but separate universes. … Personalization filters serve a kind of invisible autopropaganda, indoctrinating us with our own ideas, amplifying our desire for things that are familiar and leaving us oblivious to the dangers lurking in the dark territory of the unknown.”

“Nobody is going to suffer from lack of information,” Marcus Brauchli asserted. “They may suffer from lack of objective information because they live in a filter bubble. I worry about what happens to democracy when everybody is specialized. We all used to read a veneer of news on different subjects because that’s what newspapers used to offer. Now we all read vertically. I read lots about economics and politics but not about other subjects. What happens in a democracy when people don’t all have the same information? They may be deeply informed on one subject but not informed on other subjects.”

Analyzing the issue, Jacob Groshek and Karolina Koc-Michalska describe tightly linked chains of ideologically shaped information flows and filter bubbles, in which individuals intentionally or unintentionally self-select into media coverage that is ideologically monolithic, patently false, or a combination of both. According to the article, early research on populism during the 2016 US presidential election indicated that when more traditional or otherwise established political candidates cultivated populist support, it was primarily through an emphasis on one form of populist communication or another.

Read More

Groshek, J., & Koc-Michalska, K. (2017). Helping populism win? Social media use, filter bubbles, and support for populist presidential candidates in the 2016 US election campaign. Information, Communication & Society, 20(9), 1389-1407.

Zuiderveen Borgesius, F. J., Trilling, D., Möller, J., Bodó, B., de Vreese, C. H., & Helberger, N. (2016). Should we worry about filter bubbles? Internet Policy Review, 5(1).

Möller, J., Helberger, N., & Makhortykh, M. (2019). Filter bubbles in the Netherlands?

Möller, J., & Helberger, N. (2018). Beyond the filter bubble: Concepts, myths, evidence and issues for future debates. Commissioned by Commissariaat voor de Media.