From Biodiversity to Monoculture: How Algorithms Are Killing Our Taste

Introduction: The Earth Isn’t Flat, But Our Culture Is

We live in an era ruled by algorithms: Tinder decides who we date, Facebook what we believe, Yelp where we eat, and Spotify what we listen to. All of this makes our culture incredibly homogeneous and flat. Despite having access to everything, we consume the same content and hold the same opinions. It’s why every Halloween party is just Barbie meeting Wednesday Addams while Sabrina Carpenter plays in the background. We live in what I call “modern monoculture”. The content is abundant, but we’re spiritually anemic. What if the reason nothing inspires us is that it isn’t ours? We have the power to choose anything, yet we relinquish that power and let algorithms shape our views. Every time we let machines decide for us, we lose a fraction of our humanity.

Relying on algorithms isn’t all bad. Since we have access to an effectively infinite amount of content, having something that narrows things down for us is convenient. Ironically, an algorithm can also introduce us to obscure content or creators we wouldn’t find otherwise. At their best, algorithms are convenient, time-saving, and accessible. The problem is that even though algorithms can expand our horizons, most of the time they narrow things down too much, pushing what’s easiest to consume rather than what’s most meaningful, which is how we ended up with ten Fast & Furious movies. When it comes to culture, we traded our taste for convenience, and we desperately need it back.

In the book Filterworld, author Kyle Chayka describes algorithmic recommendations as paradoxical: they want to make you feel unique while recommending the same content to everyone else. So how can you be different if you consume what everyone else does?

From Google to TikTok: How Algorithms Took Over Our Taste

Algorithms didn’t start out addictive; they slowly got there as companies competed for our attention. The first innovation to do this was Google’s PageRank, which launched with Google in 1998. Instead of ranking search results mainly by how well they matched your keywords, it ranked pages by how many other pages linked to them, treating links as votes for “relevance”. This tapped into something psychologists call authority bias, our tendency to be influenced by the opinion of an authority figure. When we use Google, we assume the top results must be better or more trustworthy, which isn’t always the case. It’s why my grandmother will click the first link on WebMD and immediately decide she has six months to live.
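To make that concrete, here’s a deliberately tiny sketch of the PageRank idea: a page matters if pages that matter link to it. The four-page link graph, the damping factor, and the variable names are all invented for illustration; real search ranking uses far more signals than this.

```python
# Toy PageRank: a page is "important" if important pages link to it.
# The link graph below is made up purely for illustration.
damping = 0.85
iterations = 50

links = {                      # page -> pages it links to
    "blog": ["news", "wiki"],
    "news": ["wiki"],
    "wiki": ["blog"],
    "forum": ["wiki"],
}

ranks = {page: 1 / len(links) for page in links}

for _ in range(iterations):
    new_ranks = {}
    for page in links:
        # Sum the rank "votes" flowing in from every page that links here.
        incoming = sum(
            ranks[src] / len(outgoing)
            for src, outgoing in links.items()
            if page in outgoing
        )
        new_ranks[page] = (1 - damping) / len(links) + damping * incoming
    ranks = new_ranks

# Results come out sorted by authority, not by date or keyword match.
for page, rank in sorted(ranks.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{page}: {rank:.3f}")
```

The point isn’t the math; it’s that the ordering you see is the output of a model, not a neutral list.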

Arguably, the next meaningful innovation came in the early 2010s, when Facebook adopted endless scrolling. By turning our news feeds into bottomless holes, this key design shift made algorithms addictive: the interface no longer had a stopping cue. Endless scrolling is fascinating from a psychology standpoint because it ties into two principles from behavioral science. The first is variable rewards: when we open Facebook, we don’t know what we’re going to get, and because of that randomness it’s also known as the slot machine effect. The second is the Zeigarnik effect, which describes how much we hate unfinished tasks. We remember interrupted and unfinished tasks better than completed ones, which explains why my brain won’t rest until I finish watching a show I don’t even like.

When TikTok debuted in 2016, it introduced a feature that most social media and streaming platforms would go on to copy: the “For You” feed, a personalized recommendation engine that lets posts go viral based on engagement signals rather than social connections.
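Here’s a rough sketch of what “engagement over connection” means in practice. The Post fields, the weights, and the scoring function are all hypothetical; real feeds learn these weightings from billions of interactions, but the shape of the decision is the same: sort by predicted engagement, not by who you follow.

```python
# A simplified, hypothetical "For You"-style ranker: posts are scored by
# engagement signals, not by whether you follow their authors.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    likes: int
    shares: int
    watch_time_s: float   # average seconds watched

def engagement_score(post: Post) -> float:
    # Made-up weights; real systems learn these from behavior at scale.
    return 1.0 * post.likes + 3.0 * post.shares + 0.5 * post.watch_time_s

posts = [
    Post("friend_you_follow", likes=40, shares=2, watch_time_s=8),
    Post("total_stranger", likes=9_000, shares=700, watch_time_s=21),
]

feed = sorted(posts, key=engagement_score, reverse=True)
for post in feed:
    print(post.author, round(engagement_score(post), 1))
# The stranger's viral post outranks your friend's: engagement beats connection.
```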

With these innovations, companies pursue one specific goal: keeping your attention for as long as possible. This is where the principle of social proof comes into play. If millions of others like something, you’re more likely to like it as well. This explains why popular videos on the internet get even more views, creating a snowball effect of attention. For example, you can go to YouTube and watch a video called “Jazz for Cows”, which currently has 28 million views. When I was a kid, I thought the future was going to have servant robots and flying cars, but this is better if you ask me.

Be Kind, Please Rewind: When Culture Lived on Shelves

The history of algorithms is interesting because it shows how we went from maintaining curated collections of things that mattered to us to letting platforms decide. There’s nothing permanent in what digital platforms offer: media has become meaningless and disposable. The content you consume today gets forgotten in a few days. Honestly, there’s a good chance you’ll forget this sentence by the time you finish the paragraph. I argue that the best relationship we can have with pieces of our culture comes from physical, non-algorithmic collections. This includes books on bookshelves or a stack of vinyl records. How we interact with something reflects how we consume it, and in the case of physical media, we often do it carefully, slowly, and intentionally.

Every time we interact with streaming services, we’re reminded of what we miss most: a stable and reliable way to access whatever piece of media we want. Have you tried figuring out which streaming service a movie is on? By the time you find it, you could’ve just reenacted it with sock puppets. We took stability for granted and gave it up in the name of convenience. Physical things shape our identity, and without them, we aren’t quite ourselves. They’re symbols of what we care about, what we know, or what we aspire to know. They represent a commitment to the kind of person we are and to the kind of person we want to become.

There’s also a permanence to the things we collect. They don’t disappear unless we want them to. Unlike that DVD of Catwoman I’ve been trying to “accidentally” lose for twenty years. The same can’t be said about content on streaming platforms. The content we consume isn’t ours. When we subscribe to a service, we get unlimited access to whatever is on the platform, but that content can disappear at a moment’s notice. Streaming services get shut down, their interfaces change, and content gets removed, and you have no control over any of that. Also, when you have access to your own library, you can sort things according to your priorities, not some company’s.

When Art Becomes Content

When I look at my old DVDs or bookshelves, I’m reminded that these works were meant to last. I mean, my copy of Catwoman has survived three moves and one bad breakup. The media that fills our feeds today is designed to be consumed quickly and forgotten just as fast. That shift in how we consume has changed how things are created, too. We no longer expect permanence from art, only convenience.

We live in an era where most media is forgettable. I can’t tell you anything about Mission: Impossible – The Final Reckoning, and I watched that movie last week. The atmosphere of most modern movies and television shows is compelling, but they don’t use it to say anything meaningful. It’s as if everything has been designed to make sense even if you’re distracted by your phone while watching. Modern movies feel cheap and ephemeral compared to older films. Few modern pieces of media will go on to become masterpieces we’ll revisit decades from now. Everyone keeps saying most Netflix shows are enjoyable to the point that they can’t stop watching, but I struggle to come up with anything that has truly stuck with me over the last decade.

Part of what makes Netflix so addictive is its autoplay feature: once an episode ends, a timer counts down ten seconds, and the next episode starts whether you like it or not. I’m old enough to remember having to wait another week to watch the next episode of a show I liked. Since Netflix shows are released one season at a time, you can spend an entire weekend watching them if you want. This is both a blessing and a curse. With a few exceptions, you can’t manufacture quality in high quantity. The entertainment we have access to is certainly hypnotic, but that doesn’t make it memorable.

Too Many Streams, Nothing to Watch

In his book The Paradox of Choice, American psychologist Barry Schwartz explains how removing choices greatly reduces anxiety. We have more choices than any other generation before us, but we’re not benefiting from them psychologically. If anything, having endless options is making us more overwhelmed than ever. Algorithmic feeds claim to solve this by curating content for us. But by solving the paradox of choice, they quietly reshape the content we encounter.

Recommendation systems create an illusion of choice. You think you’re choosing, but in reality you’re choosing from a set of options that has already been filtered for you. In psychology, this is called nudging: someone (or, in this case, something) makes subtle changes to your environment so that the options presented to you predictably influence your decision. Although you feel like you’re choosing, algorithms are guiding you toward choices that serve their interests.
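If you want to see how thin that “choice” really is, here’s a minimal sketch of nudging by pre-filtering. The catalog, the predicted-watch-time numbers, and the three-slot menu are all invented, but the structure mirrors the argument: whatever you pick, you’re picking from a shortlist built around the platform’s goal, not yours.

```python
# Nudging by pre-filtering: you only ever choose among candidates the
# platform has already ranked toward its own goal (here, a made-up
# predicted watch time in minutes).
catalog = {
    "challenging foreign drama": 12,
    "experimental documentary": 9,
    "safe crime procedural": 95,
    "reality show, season 7": 88,
    "another superhero sequel": 90,
}

def build_menu(catalog: dict[str, int], slots: int = 3) -> list[str]:
    # The "choice" you're offered is the top of this ranking, nothing else.
    ranked = sorted(catalog, key=catalog.get, reverse=True)
    return ranked[:slots]

menu = build_menu(catalog)
print("Pick anything you like:", menu)
# Whatever you pick, you picked something that maximizes predicted watch time.
```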

Baby Shark Made Me Do It

I was recently talking with someone who couldn’t understand the concept of advertising. He couldn’t conceive of how companies could spend millions of dollars on ads. Companies do that because it works, and even if you think you’re immune, you’re not. There’s a concept in psychology known as the mere exposure effect, a phenomenon where people tend to develop a preference for things they’re familiar with. It explains why you start liking a song after hearing it several times, or why you like someone simply because you interact with them frequently. It’s probably the only scientific explanation for why family holiday parties don’t end in riots. It also explains why repeated exposure to an advertisement makes you more likely to choose that brand over another.

Needless to say, algorithms have exploited this feature of our psychology to push recommendations. We end up liking those recommendations whether we want to or not, because we’re exposed to them all the time. It’s like trying to resist Baby Shark if you’re a parent. At some point, you’ll find yourself trapped in a loop of “doo doo doo doo doo doo.” The problem is that algorithms are weaponizing the mere exposure effect and scaling it up to billions of users. The result is a world that’s flat and boring, because our views grow narrower and narrower until we have no ideas of our own.

Another important concept is confirmation bias, our tendency to seek out information that supports our existing values and beliefs. For example, if I see myself as someone who likes rock music, I’ll ignore everything that contradicts that belief. That’s also why every new pop song I hear gets dismissed as “garbage” within 12 seconds, unless it has a guitar solo, in which case it’s suddenly genius. Confirmation bias makes us comfortable, reinforcing our worldview and sense of identity. In other words, it shrinks our world, making our personal taste an exercise in self-confirmation instead of exploration.

In the context of modern technology, this means algorithms keep feeding us more of whatever reinforces our preferences, narrowing our perspective. Watch one conspiracy-theory video and suddenly your feed is full of them. Before you know it, you’re Googling whether pigeons are drones controlled by the government. The deeper you go into an algorithmic rabbit hole, the narrower the views you’re shown become. This also ties into the illusion of choice I discussed earlier: while you think you’re discovering something new, the algorithm is simply confirming an interest you’ve already shown.
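Here’s a toy version of that rabbit hole. The topics, the starting weights, and the 1.5x reinforcement are invented; the point is the loop itself: every view reads as interest, interest gets reinforced, and after a handful of rounds one topic crowds out the rest.

```python
# A toy recommendation feedback loop: each view makes similar items more
# likely to be served, so the feed narrows with every interaction.
import random

random.seed(0)
weights = {"music": 1.0, "history": 1.0, "cooking": 1.0, "conspiracies": 1.0}

def recommend(weights: dict[str, float]) -> str:
    topics = list(weights)
    return random.choices(topics, weights=[weights[t] for t in topics])[0]

for _ in range(20):
    topic = recommend(weights)
    # Pretend the user watches whatever is served (mere exposure at work),
    # which the system reads as interest and reinforces.
    weights[topic] *= 1.5

print({topic: round(w, 1) for topic, w in weights.items()})
# After a few rounds, one topic dominates and the others barely surface.
```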

Everyone’s Watching the Same Show (and That Show Sucks)

Algorithms flatten culture by privileging what’s most broadly consumable, which usually translates into content that’s safe and popular. Basically, everything starts looking like a Marvel movie. Some of the best works of art in history challenge you in some way, and they weren’t created to satisfy consumers. If we’re constantly trying to satisfy the algorithm, we end up consuming only safe, popular content, and that means we’ll never get another 2001: A Space Odyssey or another Nevermind. The best works of art are an acquired taste: rough around the edges, but the more you’re exposed to them, the more layers they reveal. You’re not an immediate fan, but once you become one, the work becomes part of you. The best works of art aren’t safe and popular; they’re complex, weird, or challenging.

In contrast, we live in a homogeneous world where everyone streams the same shows, listens to the same playlists, and follows the same TikTok trends. Think Barbie, Game of Thrones, or Taylor Swift. Everyone knows someone who’s really into one of those things, and none of them would have become as popular as they did if algorithms hadn’t recommended them again and again. This leads me to my next point, which comes from an unlikely place: biology.

In biology, we have monocultures and biodiversity. Monocultures (a field planted with a single crop) are efficient in the short term but fragile. Without diversity to protect them, they’re vulnerable to pests, disease, and collapse. Biodiversity, on the other hand, creates resilience: if one species fails, another can thrive. Healthy ecosystems may be messy, unpredictable, and hard to control, but they last. The same is true of culture. When everyone consumes the same content (watches the same shows, listens to the same songs, laughs at the same jokes), everything looks and feels the same, and nothing feels uniquely yours. The solution isn’t to abandon technology completely, but to restore biodiversity.

How to Break the Algorithm (Without Moving to a Cabin in the Woods)

So how can we escape the feed? Start curating the content you deem important, or find someone you trust who does it for you. I believe the best way out is to avoid recommendation-heavy platforms; this creates space for slower, more intentional discovery. Depending on what kind of content you’re after, there are real-life alternatives. If you’re into books, go to bookstores or libraries. If you’re into politics, follow niche blogs. If you’re into movies, rely on word of mouth from friends. Some platforms already treat the algorithm as a problem to route around; the newsletter format, for example, became popular precisely because it bypasses the algorithmic feed. Start valuing what’s difficult, strange, and non-viral. True taste isn’t about convenience. True taste requires friction and discovery.

Conclusion: Is Your Taste Really Yours?

In a world run by algorithms, the question isn’t just whether our taste is really ours, it’s whether we’re willing to put in the work to keep it alive. A flat culture is efficient, but like any monoculture, it’s fragile. True resilience comes from biodiversity, from messy, strange, inconvenient art that algorithms will never serve us. Maybe our freedom isn’t measured by what we consume, but by how much effort we put into seeking it out. Otherwise, we’re just humming along to Baby Shark on repeat, thinking it’s a personal choice.
