We Are All Part of One Gigantic A/B Test

You may be part of a gigantic test, but you can still think for yourself—for now.

If all our actions are determined by machines that know how to elicit a certain response from us, do we really have free will?

This sounds like an existential question that would come up in an episode of Westworld or maybe in a Sartre play, but it’s the type of question friends were asking after watching last week’s powerful and controversial 60 Minutes special on what it describes as “brain hacking.”

Like a lot of my friends who have worked or currently work in technology, I watched the piece with great interest. Tristan Harris, the protagonist of the story, does a great job of explaining at a high level how companies think about building products to get further and further “down your brain stem,” as he put it. Ramsay Brown of Dopamine Labs effectively restates a mantra that the tech industry has been saying for years — “If you aren’t paying for the product, you are the product.”

But a lot of the concepts shared during the 60 Minutes piece are not really all that new. For years, the tech industry has been using a combination of psychology and technology to make the products you use extremely addictive. Four books in particular highlight exactly how tech companies think about building products before, after and while you are using them: The Lean Startup, Don’t Make Me Think, The Design of Everyday Things and Hooked. Hooked perhaps provides the most insightful look into the psychology of the product manager.

This post isn’t intended to be a book review but rather is meant to provide a deeper dive for non-tech people into some of the concepts that Hooked and other books cover.

Our Behavior Is Pretty Predictable

The idea posited by Hooked is basically this: an internal or external trigger (like a smartphone notification) cues an action, through which users of a product expect some sort of variable reward, and users eventually escalate their commitment to the product. Then the cycle repeats. That’s a lot to unpack, but in short, external triggers (like e-mail reminders, newsletters or paid advertising) are less desirable than internal triggers (like boredom, loneliness, social acceptance and uncertainty).

In essence, the way product managers think about building products is around internal triggers — you want a user to connect an internal situation (like loneliness) to the actual use of a product (like pulling up your smartphone).

There Is Even an Equation That Basically Describes This

Hooked and numerous other books describe a model posited by Dr. BJ Fogg, the famed founder of the Stanford Persuasive Tech Lab. The model is, in essence, an equation: B = MAT (Behavior = Motivation + Ability + Trigger). It describes the conditions under which a user undertakes a certain behavior. The most actionable part of the equation, ability, describes the ease with which a user can take the action — but you need all three of the elements present for the behavior to occur.
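As a rough illustration (this is not Fogg’s own formulation in code, and the 0-to-1 scales and threshold below are invented), the model can be sketched as a simple predicate: no trigger means no behavior, and motivation and ability trade off against each other.

```python
# A toy sketch of the Fogg Behavior Model (B = MAT). The numeric scales
# and the 0.25 threshold are illustrative assumptions, not Fogg's.

def behavior_occurs(motivation: float, ability: float, trigger: bool) -> bool:
    """Return True if a user is likely to act, given motivation and
    ability scores in [0, 1] and whether a trigger (e.g. a push
    notification) fired."""
    if not trigger:
        return False  # no prompt, no action, no matter how motivated
    # Higher ability (an easier action) compensates for lower motivation,
    # and vice versa.
    return motivation * ability > 0.25

# A bored user (moderate motivation) facing an easy, one-tap action:
print(behavior_occurs(motivation=0.6, ability=0.9, trigger=True))   # True
print(behavior_occurs(motivation=0.6, ability=0.9, trigger=False))  # False
```

Note how the trigger acts as a gate: without it, motivation and ability never get the chance to matter, which is exactly why notifications are so valuable to product teams.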

Hooked also describes the nucleus accumbens — a part of the brain that is associated with pleasure and cravings. If there is a craving or an itch for the reward, that is in essence a good thing. But it should not be black and white — the rewards should be variable. For example, psychologically, is there a difference between a Facebook like and a Facebook comment? They probably “feel” different to you. Having variable rewards that (users believe) contain different values is key to how companies think about building products.
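The variability itself is easy to sketch: the user never knows which outcome the next refresh will bring. The outcome labels and weights below are invented for illustration; real products tune them against engagement metrics.

```python
import random

# Illustrative only: a pull-to-refresh that pays out unpredictably.
OUTCOMES = ["nothing new", "a like", "a comment"]
WEIGHTS = [0.70, 0.25, 0.05]  # most refreshes yield nothing -- that's the hook

def refresh_feed(rng: random.Random) -> str:
    """Return one outcome per refresh, drawn from a weighted distribution."""
    return rng.choices(OUTCOMES, weights=WEIGHTS, k=1)[0]

rng = random.Random(42)  # seeded so the sketch is reproducible
print([refresh_feed(rng) for _ in range(5)])
```

The skew matters: if every refresh paid out equally, the craving would fade. The rare, higher-value outcome (the comment) is what keeps the user pulling.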

The Notion of System Feedback

One of the concepts posited by The Design of Everyday Things is the notion that users of a product need feedback from the product. The book cites a pedestrian crosswalk as an example of a system that doesn’t provide the right feedback. The reason you press the crosswalk button 50 times is that nothing confirms your press registered. The same is true for most tech products — when we talk to Siri or Alexa, we want the right feedback.

We don’t just need feedback from other people; we need feedback from our products.

How Does This Work in Practice?

You should assume that almost every tech company you love is experimenting on you right this second. Below are some examples of tech companies creating a variable reward structure to get you to stick around:

Facebook / Twitter / Snapchat and literally every other social media and mobile-app company. Where to begin here? The obvious examples are a) any notification you respond to; and b) the likes you give and receive. But the non-obvious examples are things like scrolling through a newsfeed (as described in the 60 Minutes piece). By scrolling through a newsfeed, you are essentially on a never-ending hunt for something you may never find — that one share that may change your life today. The internal trigger for these products is almost always boredom or loneliness. The external triggers are oftentimes notifications. You probably subconsciously know the notification sounds, maybe even the length of each vibration, and respond accordingly, right? All of this is by design.

Every publisher. One of the frameworks product managers talk about is the “pursuit of mastery.” The pursuit of knowledge will take users far and wide all over the Internet and can get them to do deep dives within publications. Every publisher on the planet thinks about this in terms of page views and unique visitors; they want the reader to stick with their publication for as long as possible — it means more money and more mindshare for them. The external triggers for this are notifications and clever headlines. You should basically assume that every article you read is being A/B/C/D tested with four drastically different headlines.
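A minimal sketch of such a headline test might look like the following; the headline variants, traffic numbers and click counts are all invented.

```python
import random

# Hypothetical headline variants for one article.
HEADLINES = [
    "Tech Companies Are Experimenting on You",
    "The Hidden Psychology of Your Newsfeed",
    "Why You Can't Stop Scrolling",
    "Inside the A/B Tests You Never See",
]

def serve_headline(rng: random.Random) -> str:
    """Assign an incoming reader to one of the four variants at random."""
    return rng.choice(HEADLINES)

def pick_winner(impressions: dict, clicks: dict) -> str:
    """Keep whichever variant earned the best click-through rate."""
    return max(impressions, key=lambda h: clicks[h] / impressions[h])

# Made-up results after a morning of traffic:
impressions = {h: 1000 for h in HEADLINES}
clicks = dict(zip(HEADLINES, [31, 54, 97, 62]))
print(pick_winner(impressions, clicks))  # the highest-CTR headline wins
```

In practice the losing variants are retired within hours, which is why the headline you saw in the morning may not be the one your friend sees at lunch.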

Every website. In another book I mentioned earlier, Don’t Make Me Think, the author, Steve Krug, lays out best practices of website design. In essence, the book rightly assumes that users scan pages rather than read them fully. Based on that assumption and numerous other factors (it’s worth a read), Krug lays out some basic principles of website design that most great companies adopt. You should assume that most websites are designed this way, specifically to make the “action” part of the Fogg model work effectively. Every website is starting to look the same for a reason: standard usability conventions are being applied everywhere on the Internet and within mobile apps. It’s sort of like how everyone is zeroing in on buying houses with a “Zen aesthetic.”

Alexa. Amazon will probably create a generation of kids who instantly think of Alexa whenever they want to play music.

Tinder and other dating apps. Self-explanatory with respect to motivation, action and triggers.

Non-Tech Companies Think about These Things Too

To be totally fair, non-tech companies obviously think about these things too. Here’s a classic example of a non-tech company using “brain hacking” as described in Hooked: You buy a piece of furniture from IKEA, and you don’t want to dump it because you spent a lot of time building it. That’s a prime example of a person overvaluing their own effort and, as a result, becoming attached to a product they can’t get rid of. There are countless other examples of this in practice.

How Is This Even Possible?

One of the common threads across the books is the notion of human-centered design (HCD, as described in The Design of Everyday Things). Because companies can now get user feedback extremely quickly and reliably, they can adapt products multiple times in a single day, even tuning variable rewards on the fly. For example, if a user posts on a social media site for the first time, and you want them to have a great variable reward, you might “rig” the number of likes the post receives. Or if a user hasn’t posted or been active on a site in a long time, it might help if they see that one of their friends posted. You see this happen all the time on apps like Instagram. It’s easy to get user feedback and activity data rapidly, which is what makes all this possible.

Quick user feedback is also what makes things like fake news possible: you can measure, almost in real time, what resonates with audiences and double down on what works.

In Defense of Tech

Books like Hooked do ponder the ethics of building products. Hooked even provides a framework called the “Manipulation Matrix” to describe how product managers can think about building products that materially improve users’ lives rather than ones that merely make them addicted to a good or a service. But are enough product managers thinking about the ethics of their products? Probably not. The incentive structures are misaligned. When you are building a new product, you need to make it as addictive as possible to gain traction. Engagement, especially in the early stages, is one of the top metrics venture capitalists consider when putting money into a new product. If you can’t retain customers, you effectively don’t have a company.

Product Managers Are Getting So Good

It’s hard to imagine an outcome other than our gradually losing free will. We are already seeing signs of it, and it’s going to accelerate with artificial intelligence. Facebook famously caught heat for quite literally manipulating people’s emotions through data — and that story is from three years ago. Imagine how far things have advanced since then.

With concepts like virtual reality and augmented reality taking off soon, the battle for (literally) your eyes will become fierce. Why compete for your time with a smartphone, a tablet and a TV when you can wrap a TV around a customer’s head? There are a large number of ethical questions involved in the application of virtual reality that some are pondering, but the conversation probably needs to be more mainstream.

Are we really headed for a future that looks like the “Nosedive” episode of Black Mirror? Quite possibly, yes. Remember, you are already essentially rating the quality of people through apps like Uber, Lyft, TaskRabbit and every other people-powered service in the on-demand economy. It’s only a matter of time until the practice gains broader adoption.

A discussion needs to be had about the ethics around product management. The 60 Minutes piece was a nice teaser to get the topic into the mainstream, but we have a long way to go. The future is already here.

Until then, just remember that it’s not too late to exercise free will. You may be part of a gigantic test, but you can still think for yourself—for now.

Sunil Rajaraman is the Co-Founder of Scripted.com, CEO of The Bold Italic, and a columnist at Inc.com.