- Trick number 1: If You Control the Menu, You Control the Choices
- Trick number 2: put a slot machine in a billion pockets
- Trick number 3: fear of missing something important (FOMSI)
- Trick number 4: social approval
- Trick number 5: social reciprocity (tit-for-tat)
- Trick number 6: bottomless bowls, infinite feeds, and autoplay
- Trick number 7: Instant Interruption vs. “Respectful” Delivery
- Trick number 8: Bundling Your Reasons with Their Reasons
- Trick number 9: Inconvenient Choices
- Trick number 10: Forecasting Errors, the “Foot in the Door” strategy
Tristan Harris, an expert at Google, reveals ten tricks tech companies use to exploit our psychological vulnerabilities.
Magicians start by looking for blind spots, edges, vulnerabilities, and limits of people’s perception, so they can influence what people do without them even realizing it. Once you know how to push people’s buttons, you can play them like a piano. This is exactly what product designers do: they play on your psychological vulnerabilities (conscious and unconscious) in the race to grab customers’ attention.
Trick number 1: If You Control the Menu, You Control the Choices
Western culture is built around the ideals of individual choice and freedom. Millions of us fiercely defend our right to make “free” choices, yet at the same time we ignore how those choices are manipulated through menus imposed on us by someone else. This is exactly what magicians do: they give people the illusion of free choice by arranging the menu so that they win no matter what you choose.
When people are given a menu of choices, they rarely ask: “What is not on the menu?” “Why am I being given these options and not others?” “Do I know the menu provider’s goals?” “Does this menu serve my original desire, or is it a distraction from it?”
Imagine you’re out with your friends on a Tuesday night and want to sit down somewhere and chat. You tap the Yelp icon to find recommended places nearby and see a list of bars. You and your friends end up staring down at your smartphones, comparing bars and discussing photos of each one’s cocktails. Are you sure this menu still matches your original intention?
The point is not that bars are a bad choice. Yelp simply substituted a different question (“what’s a bar with good photos of cocktails nearby?”) for the original one (“where should we go to keep talking?”) by shaping a new menu.
Moreover, you fall for the illusion that Yelp’s menu is a complete list of options for where to go. While people are looking down at their smartphones, they don’t notice the park across the road with a band playing live music, or the food truck with sandwiches and coffee. Neither shows up on Yelp’s menu. Yelp subtly reframes your need (“where can we go to keep talking?”) in terms of photos of cocktails served.
The more choices technology gives us in almost every aspect of our lives (information, events, places to go, friends, dating, jobs), the more we assume that our phone is always the most empowering and useful menu to pick from. But is that true?
The “most empowering” menu is different from the menu that merely has the most choices. When we blindly follow the menus we are given, it is easy to lose sight of the difference:
- “Who wants to meet tonight?” turns into a menu of the most recent people who texted us;
- “What’s happening in the world?” turns into a menu of news feed stories;
- “Who is single and ready to go on a date?” turns into a menu of faces to swipe on Tinder (instead of local events with friends, or urban adventures nearby);
- “I have to reply to this email” turns into a menu of keys for typing a response (instead of more empowering ways to communicate with that person).
When we wake up in the morning and turn on the phone to see a list of notifications, it frames the experience of “waking up in the morning” around a menu of “all the things I’ve missed since last night.” By creating the menus we pick from, technology hijacks the way we perceive our choices and replaces them with new ones. But the more closely we pay attention to the options we are given, the more we notice that they don’t actually align with our true needs.
Trick number 2: put a slot machine in a billion pockets
If you’re a mobile app, what would you do to grab people’s attention? Answer: turn yourself into a slot machine. On average, a typical person checks their smartphone 150 times a day. Why do we do this? Are we making 150 conscious choices? One of the main reasons mobile apps resemble slot machines is intermittent variable rewards.
To maximize the addictiveness of a product, all tech designers need to do is link a user’s action (like pulling a lever) to a variable reward. You pull the lever and immediately receive either an enticing reward (a match, a prize) or nothing. Addictiveness is maximized when the rate of reward is most variable.
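The mechanics are simple enough to sketch. Here is a minimal, purely illustrative simulation of an intermittent variable reward schedule (the reward probability and labels are invented for the example, not taken from any real app):

```python
import random

def pull_lever(reward_chance=0.3):
    """One 'lever pull': the user acts, and the reward arrives
    unpredictably -- sometimes a prize, usually nothing."""
    if random.random() < reward_chance:
        return "reward"  # a match, a like, a new message
    return "nothing"

# The schedule is variable: the user can never predict which
# pull pays off, which is exactly what keeps them pulling.
random.seed(42)  # fixed seed so the example is reproducible
pulls = [pull_lever() for _ in range(10)]
print(pulls)
```

The key property is not the average reward rate but its unpredictability: a fixed schedule (“every fifth check has a new message”) would be far less compelling.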
Does this effect really work on people? Yes. Slot machines make more money in the US than baseball, movies, and theme parks combined. According to Natasha Dow Schüll, a professor at New York University, slot machines get people addicted 3–4 times faster than other types of gambling.
Here’s the sad truth: several billion people carry a slot machine in their pockets:
- When we take our phone out of our pocket, we pull a slot machine lever to see what notifications we got;
- When we check our email, we play a slot machine to see what new emails arrived;
- When we swipe our finger down to refresh the Instagram feed, we pull a slot machine lever to see what photo comes next;
- When we swipe faces left or right on dating apps like Tinder, we pull a slot machine lever to see if we got a match;
- When we tap the red number of notifications, we pull a slot machine lever to see what’s inside.
Apps and websites use intermittent variable rewards throughout their products because it’s profitable for business. But in some cases, slot machines emerge by accident. There is no malicious corporation behind email that consciously turned it into a slot machine; no one profits when millions of people check their email. Nor did Apple’s and Google’s designers want smartphones to work like slot machines. It happened by accident.
But companies like Apple and Google now have a responsibility to reduce these effects by converting intermittent variable rewards into less addictive, more predictable ones through better design. They could let people set predictable times during the day or week when they want to check their “slot machine” apps, and align the delivery of new notifications with those times.
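A sketch of what such batched, predictable delivery might look like in code (the class and schedule below are hypothetical, invented for illustration; no real platform exposes this API):

```python
from datetime import datetime, time

class BatchedNotifier:
    """Queues notifications and releases them only at delivery times
    the user chose in advance, instead of interrupting the user the
    moment each one arrives."""

    def __init__(self, delivery_times):
        self.delivery_times = set(delivery_times)  # e.g. 9:00 and 18:00
        self.queue = []

    def notify(self, message):
        self.queue.append(message)  # queued silently, not shown yet

    def deliver(self, now):
        """Return (and clear) the queue, but only at a chosen time."""
        if now.time() in self.delivery_times:
            batch, self.queue = self.queue, []
            return batch
        return []

notifier = BatchedNotifier([time(9, 0), time(18, 0)])
notifier.notify("3 new likes")
notifier.notify("1 new message")
print(notifier.deliver(datetime(2017, 5, 1, 12, 0)))  # [] -- not a delivery time
print(notifier.deliver(datetime(2017, 5, 1, 18, 0)))  # the whole batch at once
```

The design choice is the point: the user, not the app, decides when the lever gets pulled, which turns a variable reward back into a predictable one.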
Trick number 3: fear of missing something important (FOMSI)
Another way to hijack our minds is to induce a “1% chance you could be missing something really important.” If I convince you that I’m a channel for important information, messages, friendships, or potential sexual opportunities, it will be hard for you to turn me off, unsubscribe, or delete your account, because you might miss something important:
- This keeps us subscribed to newsletters even when they haven’t delivered anything valuable in a while (“what if I miss an important announcement?”);
- This keeps us “friended” to people we haven’t talked to in years (“what if I miss something important from them?”);
- This keeps us swiping through photos on dating apps, even if we haven’t met up with anyone in a while (“what if I miss that one hot match who likes me?”);
- This keeps us using social media (“what if I miss that important news or fall behind what my friends are talking about?”).
But if we look closely at that fear, we discover it is unbounded: we will always miss something important whenever we put down our phones.
- We will miss magic moments on Facebook if we don’t open the app 76 times a day (an old friend who is visiting town right now);
- We will miss great opportunities on Tinder (our dream romantic partner);
- We will miss urgent calls if we don’t keep the phone in our hand 24/7.
But living moment to moment with the fear of missing something is not what we’re made for. And it’s amazing how quickly we wake up from the illusion once we let go of that fear. When we go offline for a day, unsubscribe from notifications, or go camping without internet, we realize we don’t really have anything to worry about. We don’t miss what we don’t see.
The thought “what if I miss something important?” bothers us before unplugging, but not after. Imagine if tech companies recognized that, and helped us proactively build our relationships with friends and businesses in terms of “time well spent” rather than in terms of what we might miss.
Trick number 4: social approval
We’re all vulnerable to social approval. The need for belonging and approval from our peers is among the highest human motivations. But today our social approval is in the hands of tech companies. When I get tagged by my friend Mark, I imagine him making a conscious decision to tag me. But I don’t see how much a company like Facebook orchestrated his making that choice in the first place.
Facebook, Instagram, or SnapChat can manipulate how often people get tagged in photos by automatically suggesting all the faces they recognize (showing a box with one-click confirmation: “Tag Tristan in this photo?”). So when Mark tags me, he’s actually responding to Facebook’s suggestion, not making an independent choice. Design decisions like this let Facebook control how often millions of people experience social approval.
The same thing happens when we change our profile photo. Facebook knows this is a moment when we are vulnerable to social approval (“what do my friends think of the new profile pic?”), so it can rank the photo higher in the news feed, where it sticks around longer and more friends like or comment on it. Every time someone does, I get pulled right back in.
Everyone responds to social approval, but some demographic segments (teenagers, for example) are more vulnerable to it than others. That is why it is so important to recognize how powerful designers become when they exploit this vulnerability.
Trick number 5: social reciprocity (tit-for-tat)
- You do me a favor, now I owe you one next time.
- You say, “thank you”— I have to say “you are welcome.”
- You send me an email— it’s rude not to get back to you.
- You follow me — it’s rude not to follow you back. (especially for teenagers)
We are vulnerable to the need to reciprocate others’ gestures. And as with social approval, tech companies manipulate how often we experience it. In some cases it’s by accident: email, texting, and messaging apps are social reciprocity factories. In other cases, companies exploit this vulnerability on purpose.
LinkedIn is the most obvious manipulator. LinkedIn wants as many people as possible creating social obligations for each other, because each time someone reciprocates (by accepting a connection, responding to a message, or endorsing someone back for a skill) they have to come back to linkedin.com.
Just like Facebook, LinkedIn exploits an asymmetry in perception. When you receive an invitation from someone to connect, you imagine that person making a conscious choice to invite you, when in reality, they likely unconsciously responded to LinkedIn’s list of suggested contacts. In other words, LinkedIn turns your unconscious impulses (to “add” a person) into new social obligations that millions of people feel obligated to repay. All while LinkedIn profits from the time people spend doing it.
Imagine millions of people getting interrupted throughout their day, running around like chickens with their heads cut off, reciprocating each other, all of it designed by companies that profit from it.
Imagine if tech companies had a responsibility to minimize social reciprocity. Or if there were an independent public organization that monitored whether tech companies abuse these vulnerabilities.
Trick number 6: bottomless bowls, infinite feeds, and autoplay
Another way to hijack people is to keep them consuming even when they aren’t hungry anymore. How? Easy: take an experience that was bounded and finite, and turn it into a bottomless flow that keeps going.
Professor Brian Wansink of Cornell University demonstrated this in a study showing that you can trick people into continuing to eat soup by giving them a bottomless bowl that automatically refills as they eat. With bottomless bowls, people ate 73% more calories than those with normal bowls.
Tech companies exploit the same principle. News feeds are purposely designed to auto-refill with reasons to keep you scrolling, and purposely eliminate any reason for you to pause, reconsider or leave.
It’s also why video and social media sites like Netflix, YouTube, or Facebook have an autoplay feature: the next video plays automatically after a countdown instead of waiting for you to make a conscious choice. A huge portion of traffic on these sites is driven by autoplaying the next thing.
Tech companies often claim “we’re just making it easier for users to watch the video they want,” when they are actually serving their business interests. And you can’t entirely blame them, because increasing “time spent” is the currency they compete for.
Now imagine tech companies making an effort not only to increase the time spent on their products, but also to improve the quality of that “time spent.”
Trick number 7: Instant Interruption vs. “Respectful” Delivery
Companies know that messages that interrupt people immediately are more persuasive at getting people to respond than messages delivered asynchronously (like email). Given the choice, Facebook Messenger (or WhatsApp, Viber, Telegram, WeChat) would prefer to design their messaging system to interrupt recipients immediately (and show a chat box) instead of helping users respect each other’s attention. In other words, interruption is good for business.
It’s also in tech companies’ interest to heighten the feeling of urgency and social reciprocity. For example, Facebook automatically tells the sender when you’ve “seen” their message, instead of letting you avoid disclosing whether you read it (“now that you know I’ve seen the message, I feel even more obligated to respond”).
By contrast, Apple more respectfully lets users toggle “Read Receipts” on or off.
The problem is that while messaging apps maximize interruptions in the name of business, they create a tragedy of the commons that ruins global attention spans and causes billions of unnecessary interruptions every day. This is a huge problem we need to fix with shared design standards as we develop new mobile apps and services.
Trick number 8: Bundling Your Reasons with Their Reasons
Another way mobile apps hijack you is by taking your reasons for visiting the app (to perform a task) and making them inseparable from the app’s business reasons (maximizing how much you consume once you’re there).
For example, let’s analyze grocery stores.
The most popular reasons people visit grocery stores are pharmacy refills and buying milk. But store managers want to maximize how much people buy, so they put the pharmacy and the milk at the back of the store. In other words, they make the things customers want (milk, the pharmacy) inseparable from what the business wants. If stores were truly organized to support people, they would put the most popular items at the front.
Tech companies design their products the same way. For example, when you want to look up a Facebook event happening tonight (your reason), the Facebook app doesn’t let you get to it without first landing on the news feed (their reason), and that’s on purpose. Facebook wants to convert every reason you have for using Facebook into its reason: maximizing the time you spend consuming things.
In an ideal world, apps would always give you a direct way to get what you want, separate from what they want. Imagine a digital “bill of rights” outlining design standards that force the products billions of people use to support empowering ways of navigating toward their goals.
Trick number 9: Inconvenient Choices
We’re told that it’s enough for businesses to “make choices available.”
- “If you don’t like it you can always use a different product.”
- “If you don’t like it, you can always unsubscribe.”
- “If you’re addicted to our app, you can always uninstall it.”
Businesses naturally want you to make the choice they prefer, so they make what they want easy to choose and everything else difficult. Magicians do the same thing: you make it easier for a spectator to pick the thing you want them to pick, and harder to pick the thing you don’t.
For example, NYTimes.com lets you “make a free choice” to cancel your digital subscription. But instead of just doing it when you click “Cancel Subscription,” they send you an email and force you to call a phone number that’s only open at certain hours.
Trick number 10: Forecasting Errors, “Foot in the Door” strategy
Finally, apps can exploit people’s inability to predict the consequences of their clicks. People don’t intuitively forecast the true cost of a click when it’s presented to them. Salespeople use “foot in the door” techniques: start with a small, innocuous request (“just one click”), then escalate from there (“why don’t you stay awhile?”). Virtually all engagement websites use this trick.
Imagine if web browsers and smartphones, the gateways through which people make these choices, were truly watching out for people and helped them forecast the consequences of their clicks (based on real data about what a click actually costs most people).
That’s why I add “Estimated reading time” to the top of my posts. When you put the “true cost” of a choice in front of people, you treat your users and audience with dignity and respect. In a Time Well Spent internet, choices would be framed in terms of projected costs and benefits, so people would be empowered to make informed choices.
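Such a label costs almost nothing to compute. A sketch, assuming an average reading speed of roughly 200 words per minute (the exact figure varies by reader and study):

```python
from math import ceil

def estimated_reading_time(text, words_per_minute=200):
    """Estimate the minutes needed to read `text`, rounded up."""
    word_count = len(text.split())
    return max(1, ceil(word_count / words_per_minute))

article = "word " * 1000  # stand-in for a 1,000-word article
print(estimated_reading_time(article))  # 5 (minutes)
```

The same idea generalizes to any “true cost” label: estimate the time a choice will take from data you already have, and show it before the click rather than after.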
Translated by Nata Kallissi