Defending A Billion People’s Minds From Getting Hijacked

A former Google design ethicist and magician has exposed how technology hits us at our weak spots.

Ever since technology started making our lives better, there’s been a collective anxiety about how it’s also making our lives worse. Most of this fear has been concentrated on how artificial intelligence might one day outwit us (robots gone wild), or on virtual reality robbing us of real lived experience. One dude, however, has recently written an article arguing that he’s more concerned about how technology worms its way deep into our brains… and takes over the controls.

“I spent the last three years as Google’s Design Ethicist, caring about how to design things in a way that defends a billion people’s minds from getting hijacked,” writes Tristan Harris on the publishing platform Medium. At Google, Harris helped invent and advocate for designs that embed mindfulness into the screens people use. He is an expert on how technology hijacks our psychological vulnerabilities, and his article, which outlines some of his findings from recent years, has been getting a fair bit of attention.

It’s a very interesting read, but kind of a lengthy one, so we thought we’d give you the low-down in case you’re strapped for time.

TECHNOLOGY IS MAKING OUR CHOICES FOR US

Harris points out that while we mostly think of technology optimistically (look at all the amazing things it can do for us!), we should turn our attention to how it is actually manipulating us and robbing us of free choice.

Harris begins with an analogy drawn from his past as a magician: technology works the way a magician does, looking for the blind spots, edges, vulnerabilities and limits of people’s perception so it can influence what people do without them even realising it. Magicians give people the illusion of free choice while architecting the menu so that they win, no matter what you choose. Technology plays us the same way, nudging us toward the choices it wants us to make. And when we give tech this power, it controls our experiences in more ways than we realise.

Here are some of the hijacks happening, according to Harris:

TECH CONTROLS WHAT'S ON THE MENU, SO WE EAT 'THE FEED' INSTEAD OF WHAT WE ACTUALLY WANT

'If You Control the Menu, You Control the Choices': When we use the internet to find a service, a product, or even other people, we often become distracted by what’s put in front of us and get sidetracked from our initial search. The ‘illusion of choice’ (a magician’s term) is presented in a way that makes us feel empowered, and we start treating our smartphones as the repository for everything we’re seeking. But the reality is that what we see on our screens is a highly filtered, algorithm-driven, and sometimes totally inaccurate collection of data. ‘What’s happening in the world?’ becomes a menu of news feed stories.


For instance, when you wake up, it’s normal to check your phone notifications before getting out of bed. Harris says, however, that this “frames the experience of ‘waking up in the morning’ around a menu of ‘all the things I’ve missed since yesterday’”. When we see that list of notifications first thing, we should ask ourselves: how empowering is this menu of choices? Does it reflect what we actually care about? Harris suggests that instead of reaching for your phone, you consider another activity more aligned with your true needs, like making a cup of coffee.

TECH HAS TURNED OUR PHONES INTO SLOT MACHINES AND US ALL INTO GAMBLERS

On a psychological level, many of us feel a hit of positive reinforcement when we realise we’ve received a text, missed a call, or been tagged in a post. It’s the same concept, in many ways, behind gambling, says Harris. “If you want to maximize addictiveness, all tech designers need to do is link a user’s action (like pulling a lever) with a variable reward. You pull a lever and immediately receive either an enticing reward (a match, a prize!) or nothing.” If you’re an app, how do you keep people hooked? Turn yourself into a slot machine.

Tech designers know this is a reliable way to hook users, given that “[s]lot machines make more money in the United States than baseball, movies, and theme parks combined.” Our smartphones become slot machines of sorts, and with each scan of our notifications we’re pulling the lever, hoping to feel as if we’ve won something. “The average person checks their phone 150 times a day. Why do we do this? Are we making 150 conscious choices?” Harris asks. Now imagine your phone appearing as a slot machine each time you mindlessly scanned its screen.


Harris suggests that it’s up to Apple and Google to reduce these effects by “converting intermittent variable rewards into less addictive, more predictable ones” with better design. For example, says Harris, Apple could empower people to set predictable times during the day or week for when they want to check “slot machine” apps, and correspondingly adjust when new messages are delivered to align with those times.
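To make that idea concrete, here’s a minimal, purely illustrative sketch in Python of what “predictable delivery” could look like. This is not Apple’s or Google’s actual notification API; the class name, the chosen delivery times and the five-minute window are all assumptions made for the example. Notifications are queued rather than pushed the instant they arrive, and released as a batch only at times the user has picked in advance.

```python
# Illustrative sketch only: batch notifications and deliver them at user-chosen times.
from dataclasses import dataclass, field
from datetime import datetime, time
from typing import List


@dataclass
class BatchedNotifier:
    # Times the user has chosen for delivery, e.g. 08:30, 12:30, 18:00 (assumed values).
    delivery_times: List[time]
    _queue: List[str] = field(default_factory=list)

    def push(self, message: str) -> None:
        """Queue a notification instead of interrupting the user immediately."""
        self._queue.append(message)

    def due(self, now: datetime, window_minutes: int = 5) -> bool:
        """True if the current time falls within a chosen delivery window."""
        minutes_now = now.hour * 60 + now.minute
        return any(abs(minutes_now - (t.hour * 60 + t.minute)) <= window_minutes
                   for t in self.delivery_times)

    def deliver(self, now: datetime) -> List[str]:
        """Release the queued batch only inside a delivery window; otherwise hold it."""
        if not self.due(now):
            return []
        batch, self._queue = self._queue, []
        return batch


# Messages queued during the morning arrive together at 12:30, not one by one.
notifier = BatchedNotifier(delivery_times=[time(8, 30), time(12, 30), time(18, 0)])
notifier.push("New comment on your photo")
notifier.push("3 new connection requests")
print(notifier.deliver(datetime(2024, 1, 8, 10, 0)))   # [] -- held back
print(notifier.deliver(datetime(2024, 1, 8, 12, 31)))  # both messages, one batch
```

The point of the sketch is simply that the reward becomes predictable: you know when the “lever” pays out, so there’s no reason to keep pulling it in between.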


APPS AND WEBSITES ARE FUELLING OUR FOMO

Another way apps and websites hijack people’s minds, Harris argues, is by inducing a “1% chance you could be missing something important.” Our phones and computers convince us that they’re channels for important information, messages, friendships, or potential sexual opportunities, which makes it hard for us to turn them off, unsubscribe, or delete our accounts: we’re worried we might miss something important. This is why, for instance, we keep swiping faces on dating apps even when we haven’t met up with anyone in a while (“what if I miss that one hot match who likes me?”). It also keeps us using social media (“what if I miss that important news story or fall behind what my friends are talking about?”).

Harris points out that “living moment to moment with the fear of missing something isn’t how we’re built to live.” He says that unplugging for a day or unsubscribing from those notifications makes us realise that the things we feared missing never actually materialise: “We don’t miss what we don’t see”.

PEOPLE WANT APPROVAL = MORE TIME SPENT ON BUSINESSES' APPS

In his article, Harris also looks at how social approval (the need to belong that we all feel) is now in the hands of tech companies. It’s encouraged by apps like Instagram, where we often judge the success of our day by how many people interacted with our photos. Or take Facebook tagging: when a friend tags you, they’re responding to Facebook’s suggestion (“Tag this person?”), not making an independent choice. So you think your friend is thinking about you, but really it’s a move Facebook has orchestrated. The same happens when we change our main profile photo. Facebook knows that’s a moment when we’re vulnerable to social approval (“what do my friends think of my new pic?”), so it can rank the photo higher in the news feed, where it sticks around longer and more friends like or comment on it. And each time they do, we get pulled right back in.


CRAFTING A SENSE OF OBLIGATION

Harris says we are vulnerable to the need to reciprocate others’ gestures (social reciprocity), and technology manipulates how we experience that, exploiting our vulnerability on purpose. Harris writes about LinkedIn and its formula for growing its membership: “LinkedIn wants as many people creating social obligations for each other as possible, because each time they reciprocate (by accepting a connection, responding to a message, or endorsing someone back for a skill) they have to come back to linkedin.com where they can get people to spend more time.”


When you receive an invitation from someone to connect, you imagine that person making a conscious choice to invite you, when in reality they likely responded unthinkingly to LinkedIn’s list of suggested contacts. In other words, LinkedIn turns our unconscious impulses (to “add” a person) into new social obligations that millions of people feel compelled to repay, all while it profits from the time people spend doing so.


"PLAY NEXT?" - AUTOREFILLS ARE AUTOPILOTING YOUR BRAIN

Another way tech hijacks people is by keeping them consuming even when they aren’t hungry anymore: taking an experience that was bounded and finite and turning it into a bottomless flow that keeps going. News feeds are purposely designed to auto-refill with reasons to keep us scrolling, and to eliminate any reason for us to pause, reconsider or leave. Likewise, a huge portion of traffic on Netflix, YouTube or Facebook is driven by autoplaying the next thing: these services start the next video after a countdown instead of waiting for you to make a conscious choice (in case you won’t). Harris says that while tech companies claim they’re just making it easier for users to see the video they want to watch, they’re really serving their own business interests.


MAXIMISING INTERRUPTIONS

Facebook’s chat function is the poster child for the fact that “messages that interrupt people immediately are more persuasive at getting people to respond than messages delivered asynchronously.” Given the choice, Facebook Messenger (or WhatsApp, WeChat or Snapchat, for that matter) would rather design its messaging system to interrupt recipients immediately (and show a chat box) than help users respect each other’s attention. It’s also in these companies’ interest to heighten the feeling of urgency and social reciprocity.

For example, Facebook automatically tells the sender when you “saw” their message, instead of letting you avoid disclosing whether you read it (“now that you know I’ve seen the message, I feel even more obligated to respond.”). Maximizing interruptions in the name of business creates a tragedy of the commons, ruining global attention spans and causing billions of unnecessary interruptions each day.

And so on…

Harris goes on to describe even more tech hijacks on our minds, including ‘Bundling Your Reasons with Their Reasons’ and the difficulty of cancelling things like digital newspaper subscriptions. You can read the rest of his article on Medium.

In conclusion, he suggests we think of our tech gadgets and programs as extensions of ourselves, and that tech companies should acknowledge that our time is valuable and deserves to be protected as carefully as other things, like privacy.
