Essays on Technology and Culture

Bubbles, Echo Chambers, and Algorithms

Online, we all live in a social bubble. Our bubble is permeable, but only to a certain extent. We allow in those things that please us, in one form or another. [1] Whatever it is you don’t want to see, there’s a tool to keep it out of your bubble, from keyword filters for your Twitter client, to ad blocking extensions for your browser. All it takes is a few minutes, and a few clicks, and your bubble is complete. I’ve written before about how easy it is to get trapped in an echo chamber of social media and news. When everything is on demand, there’s no incentive for us to demand the things that we don’t like. Our bubble is the echo chamber.

Amplifying this problem are the tools and algorithms that many services use to provide “custom” and “curated” content. Every social network scours your connections, and your friends’ connections, to suggest people similar to those you already follow. Media services know the movies, music, and books you like, and will suggest other titles that cover the same ground. These algorithms are constantly being honed and improved to provide you with things you’re more likely to want, which makes it all the less likely they’ll offer you something outside the bubble of your tastes and preferences. Statistically, you’re less likely to consume something unlike what you’re already into, so there’s no incentive to offer anything else.
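To make the idea concrete, here is a toy sketch of the kind of keyword matching described above. The titles, tags, and scoring method (simple tag overlap) are invented for illustration; real services use far more elaborate models, but the shape of the incentive is the same.

```python
# Toy content-based recommender: rank items by how much their tags
# overlap with tags the user already likes. All data here is made up.

def jaccard(a: set, b: set) -> float:
    """Overlap between two tag sets: |A ∩ B| / |A ∪ B|."""
    return len(a & b) / len(a | b) if a | b else 0.0

catalog = {
    "Movie A": {"kaiju", "action"},
    "Movie B": {"kaiju", "horror"},
    "Movie C": {"romance", "drama"},
}

def recommend(liked_tags: set, catalog: dict, top_n: int = 2) -> list:
    """Return the top_n catalog items most similar to the user's tastes."""
    scored = sorted(catalog.items(),
                    key=lambda item: jaccard(liked_tags, item[1]),
                    reverse=True)
    return [title for title, _ in scored[:top_n]]

print(recommend({"kaiju", "action"}, catalog))
```

Items that share tags with existing tastes float to the top every time; anything outside the bubble never scores well, so it is never shown.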

If you’re viewing this from a supply-side perspective, wherein it’s your job to exchange goods for money, there’s no problem here. You’re merely filling demand, faster and more effectively than you would have otherwise. Reducing a person’s interests and tastes down to a few keywords that can be cross-referenced in a database search is just good business. If you’re viewing this from the demand side, it’s easy to see it as a boon as well. “Amazon or Netflix knows me so well that it knows I’ll be interested in Japanese kaiju movies, stoner comedies, and albums by 90s alt-rock bands. [2] It’s so much easier for me to find something now.” It’s a system in which our own laziness, and our reliance on algorithms that decide for us, denies us the opportunity to explore anything outside our comfort zone. It’s not the algorithm’s fault. It can only work with the data we give it.

Fortunately, we’re not locked into what the algorithms supply us. As long as we’re interacting with other people, whether face-to-face or from behind keyboards, there’s the possibility of being exposed to something outside of that comfort zone. We’ll always have friends, family, co-workers, and casual acquaintances whose tastes and preferences differ from our own. Loath as I am to use “organic” to describe it, since it’s become a buzzword, it’s an accurate way of describing how we get exposed to new ideas. We allow these into our bubble because they come from a trusted source, someone we’ve connected with despite our own bubbles–a new voice to break up the echo chamber. We’ll always be more than keywords in a database.

  1. Including the things that displease us, so that we may gain pleasure from expressing our displeasure.  ↩

  2. These are cultural options selected almost at random by the author, and do not necessarily reflect actual preferences.  ↩