Whether you’re shopping online, scrolling through social media or streaming a film, your every move is being logged and processed by software. There’s a recommendation engine at every digital turn, powered by a finely tuned algorithm that suggests what you should watch, buy or read next. But when your next move is driven directly by your existing interests, you can easily end up stuck in a bubble of similar content, products and ideas.
If you like this article, would you want to immediately read another just like it? What about five more? Fifteen more?
What if posts like this one were all you got in your news feed, your email newsletters, your little “recommended articles” box? Would you be excited to keep on reading my thoughts about how the creative world is changing, or would you get completely sick of them? And if you did get completely sick of them, what if there was no way out of the bubble some algorithm decided you’d be happy inside?
This “bubble” idea—that we’re each living inside cozy cocoons of our own opinions and preferences—is everywhere these days. Spotify makes us daily playlists and recommends new artists to “discover” based on our tastes. Five seconds after you finish a movie, Netflix is already suggesting the next one it thinks you’d like. Buy a blender on Amazon, and you’ll get ads for non-stick frying pans.
Sometimes, the bubble is harder to see. A lot of us only realized we were in an opinion bubble after the 2016 election, when it became obvious that Facebook and Twitter had been feeding us articles and opinions that mirrored our own. And the machine learning technology that turns our data into recommendations, the “recommendation engine,” crops up in places you wouldn’t expect. Waze uses it to suggest quicker driving routes. LinkedIn uses it to show you all the other jobs you could have if you had the courage to leap. Hell, it’s probably part of how you found this article in the first place.
It’s easy to see why companies like using our data this way. Why waste time and money broadcasting one-size-fits-all content when you know exactly what your customers want? If Netflix knows most of its viewers like true crime documentaries and hate gross-out comedies, it won’t sink millions into making a Johnny Knoxville joint. A customer who finds what they want to buy faster is less likely to get frustrated and give up; a customer who’s into the next song you play them is more likely to keep listening. The better the recommendation, the longer we stay; and the longer we stay, the more money we’re worth to our captor.
For consumers, it just makes things easier and faster. You don’t have to spend hours tracking down the perfect handbag if Amazon already knows your personal style. You don’t have to argue with your significant other over where to eat when one place has 4 stars and the other has 3.5. Slogging through every movie ever has a way of ruining the “Netflix and chill” mood; isn’t it nice, then, when the perfect choice is right up top and five seconds away from starting?
Every Black Mirror fan knows that technology with obvious benefits also has unintended consequences. Sure, we all get annoyed when we’re bombarded with non-stop ads for the shoes we just bought, but I’m talking about the deeper, long-game stuff.
For a lot of us, part of what we love about finding the perfect accessory or unearthing an unknown band is the thrill of the hunt. The best buys come with stories: the time you went wandering down side streets in Lisbon and found that unbelievable hole-in-the-wall restaurant, the angsty album the store clerk recommended when she could tell your broken heart needed healing, the $75 Louboutins you miraculously thrifted. “Amazon thought I’d like it” doesn’t exactly make for a great anecdote or a cherished memory. And just being handed something, without having to work for it, can make getting what you want feel kind of… empty. What’s the destination without the journey?
The hunt is also how our tastes evolve. It used to be that finding what you wanted meant browsing brick-and-mortar stores, consulting savvy friends and online communities, and going on sprawling internet deep-dives. Along the way, you’d come across all kinds of products and culture you hadn’t gone looking for. That’s the stuff that pushes your tastes in new directions, or at least shows you what else is out there. From a data-driven standpoint, though, that makes no sense. A recommendation engine that knows you like A will tell you about B, but why would it ever tell you about C, much less X, Y or Z? In other words, if an algorithm thinks you only like music like “Late Registration,” why would it ever suggest something that sounds like “Yeezus”?
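To make that concrete, here’s a minimal sketch of similarity-based recommendation in Python. The albums and taste vectors are invented for illustration (real engines learn from millions of users), but the core logic is faithful: rank everything by closeness to what you already like, so the nearest neighbor wins every time and the outliers never surface.

```python
import numpy as np

# Hypothetical taste vectors: each row is an album, each column a feature
# (say, soul-sample warmth, abrasiveness, experimentalism). Invented numbers.
items = {
    "Late Registration":  np.array([0.9, 0.6, 0.2]),
    "Graduation":         np.array([0.85, 0.65, 0.3]),  # close to what you like
    "Yeezus":             np.array([0.2, 0.95, 0.9]),   # a stylistic left turn
    "Trout Mask Replica": np.array([0.05, 0.3, 0.98]),  # far outside the bubble
}

def cosine(a, b):
    """Similarity between two taste vectors (1.0 = identical direction)."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def recommend(liked, catalog, top_n=1):
    """Rank every other item by similarity to what the user already likes."""
    scores = {name: cosine(catalog[liked], vec)
              for name, vec in catalog.items() if name != liked}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)[:top_n]

print(recommend("Late Registration", items))
# -> [('Graduation', 0.99...)]: B, every time. Never C, much less X, Y or Z.
```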
What if recommendation engines are training you to only like what’s familiar? They decide what you’re exposed to, based on what they already know about you. You can pick out what you like from what they offer, but at the end of the day, it’s going to be something familiar, something that was picked for you. At that point, are you really in control of what you see, or is it in control of you? Or did I just blow your mind?
The reason Facebook, at its peak, hit a DAU/MAU ratio of 45 percent (daily active users divided by monthly active users, meaning that on any given day, 45 percent of that month’s users opened the app) is that it purified its drug: by showing people exactly what they wanted, it conditioned them to come back again and again. Facebook used data to discern which posts, in which order, would get each individual to stay the longest. The result is echo chambers: endless feeds of exactly what we want to hear. The result is a bifurcated country where people surround themselves with articles, and politics, and music that confirm their worldview.
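(For the numerate: DAU/MAU, often called “stickiness,” is simple arithmetic. The figures below are placeholders for illustration, not Facebook’s actual numbers.)

```python
# DAU/MAU "stickiness": daily active users divided by monthly active users.
# Placeholder figures for illustration; not Facebook's real numbers.
daily_active_users = 450_000_000
monthly_active_users = 1_000_000_000

stickiness = daily_active_users / monthly_active_users
print(f"DAU/MAU: {stickiness:.0%}")  # DAU/MAU: 45%
```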
As our tastes and preferences drift further apart and the distance between us compounds, our best chance is to defy the very algorithms that are herding us.
Just for fun, imagine an app that only recommended stuff it was sure you’d hate. Not just stuff you don’t want or need; I’m talking about music, shoes, movies and home decor that you’d sooner die than have anyone associate with you. A system that operated on the logic “users who bought X would probably hate Y”; an anti-machine-learning tool. Would that make you double down on your tastes and habits, or would exposing you to stuff so far outside your comfort zone get you to expand it? How many new doors would that open for you?
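Turning the earlier sketch into this hypothetical anti-recommender takes a one-line change: rank by lowest similarity instead of highest. The snippet below reuses the illustrative cosine function and items catalog from above.

```python
# Builds on the illustrative `cosine` and `items` from the earlier sketch.
def anti_recommend(liked, catalog, top_n=1):
    """'Users who bought X would probably hate Y': rank by LOWEST similarity."""
    scores = {name: cosine(catalog[liked], vec)
              for name, vec in catalog.items() if name != liked}
    return sorted(scores.items(), key=lambda kv: kv[1])[:top_n]

print(anti_recommend("Late Registration", items))
# -> [('Trout Mask Replica', 0.37...)]: the farthest point from your taste vector.
```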
James Cole is the founder and CEO of H Collective.