Your Happiness Was Hacked: Why Tech Is Winning the Battle to Control Your Brain--and How to Fight Back

by Vivek Wadhwa, Alex Salkever

Hardcover

Overview

"Technology is a great servant but a terrible master. This is the most important book ever written about one of the most significant aspects of our lives--the consequences of our addiction to online technology and how we can liberate ourselves and our children from it."
--Dean Ornish, M.D., Founder & President, Preventive Medicine Research Institute; Clinical Professor of Medicine, UCSF; author of The Spectrum


Technology: your master, or your friend? Do you feel ruled by your smartphone and enslaved by your e-mail or social-network activities? Digital technology is making us miserable, say bestselling authors and former tech executives Vivek Wadhwa and Alex Salkever. We've become a tribe of tech addicts--and it's not entirely our fault.

Taking advantage of vulnerabilities in human brain function, tech companies entice us to overdose on technology interaction. This damages our lives, work, families, and friendships. Swipe-driven dating apps train us to evaluate people like products, diminishing our relationships. At work, we e-mail on average 77 times a day, ruining our concentration. At home, light from our screens is contributing to epidemic sleep deprivation.

But we can reclaim our lives without dismissing technology. The authors explain how to avoid getting hooked on tech and how to define and control the roles that tech is playing and could play in our lives. And they provide a guide to technological and personal tools for regaining control. This readable book turns personal observation into a handy action guide to adapting to our new reality of omnipresent technology.

Product Details

ISBN-13: 9781523095841
Publisher: Berrett-Koehler Publishers
Publication date: 06/26/2018
Pages: 192
Product dimensions: 5.40(w) x 8.60(h) x 1.00(d)

About the Author

Vivek Wadhwa is an entrepreneur, a technologist, and a professor at Carnegie Mellon University's College of Engineering. A globally syndicated columnist for the Washington Post, he coauthored with Alex Salkever The Immigrant Exodus (an Economist 2012 Book of the Year) and The Driver in the Driverless Car (long-listed for the Financial Times and McKinsey 2017 Business Book of the Year).

Alex Salkever is an author, futurist, and technology leader. He coauthored with Vivek Wadhwa The Driver in the Driverless Car and The Immigrant Exodus. He is a columnist for Fortune and previously served as a Vice President at Mozilla, as the Technology Editor of BusinessWeek.com, and as a Guest Researcher at the Duke University Pratt School of Engineering.

Read an Excerpt

CHAPTER 1

How Technology Removes Our Choices

The Tricks and Tactics Tech Uses to Control Our Actions and Stoke Addictions

If you use Google to search for "Italian restaurant," you are likely to see a small box at the top of the screen with a few results below a map. The positioning matters: viewers are significantly more likely to click on those results than on anything else on the page, much as shoppers are more likely to pick up products from shelves at eye level in supermarkets than from higher and lower shelves. But whereas in the physical world this limitation primarily affects our shopping experience, in the online and technology worlds, this algorithmic and sometimes intentional selection affects every subsequent thing that we see or do on that page — and far beyond it. The menu is the interface that controls the manner of engagement and sets limits on it, and the way menus are layered can radically alter the way we behave with technology.

For example, on iPhones Apple has an important — to Alex, critical — feature: the toggle that wipes in-app advertising identifiers that app makers can use to analyze and track users. Unfortunately, Apple places that feature deep in the menu: three layers deep. As a result, few people use it, even though regularly using the feature might significantly benefit their privacy by making it much harder for companies to track their behavior in smartphone apps. (The industry would say that using it would lead people to have less personalized and less useful experiences, which is certainly true; there is always a trade-off.)

Apple has in general taken a strong leadership position in protecting the privacy of its customers — by minimizing storage of customer data and by designing systems such as Apple Pay to present fewer opportunities for third parties to access and potentially intercept those data. But its placement of that single toggle deep in the weeds on the iPhone illustrates how decisions by product makers influence our freedom of choice and our relationship with technology. By clearing that identifier regularly, we would wipe away some of application developers' ability to accurately target and personalize in-product offers, e-mails, and other entreaties that further guide or limit our choices and set our agenda. Another example is notification settings on the iPhone. Apple does not allow us to make global changes to the notification settings of all our apps, so we must adjust them app by app. Sure, we can silence them all by putting our device in "Do Not Disturb" mode, but that is a clumsy fix. Apple's menu design for managing notifications reduces our choices, and not necessarily to our advantage (which seems odd from Apple, a company that has become dominant precisely by simplifying technology).

As a number of thinkers in this field, led by former Google design ethicist Tristan Harris, explain, menus also frame our view of the world. A menu that shows our "most important" e-mails becomes a list of the people we have corresponded with most often recently rather than of those who are most important to us. A message that asks "Who wants to meet for brunch tomorrow?" goes out to the most recent group of people we have sent a group text to, or to preset groups of friends, effectively locking in these groups and locking out new people we have met. On the set of potential responses to e-mail that Google automatically suggests in its Inbox e-mail program, we have yet to see "Pick up the phone and call this person" as an option, even if, after a heated e-mail exchange, a call or a face-to-face conversation may well be the best way to communicate and to smooth the waters.

A feed of world news becomes a list, built by a nameless, faceless algorithm, of topics and events that the system decides interest us. It limits our choice by confining it to options within a set of patterns derived from our past consumption history, which may or may not relate to our immediate needs or interests. Unfortunately, no one has yet developed an effective algorithm for serendipity.

From the start of the day, a feed of what we missed on Facebook or Twitter as we slept presents us with a menu of comparisons that stokes our fear of missing out (FOMO). This is by design. However benign the intent, the effect is to significantly limit our frames of reference and our thinking.

A Slot Machine in Our Pocket

In May 2016, Tristan Harris published an influential essay titled "How technology is hijacking your mind — from a magician and Google design ethicist," describing the many ways by which smartphones suck people into their vortex and demand constant attention. Harris traced the lineage of (both inadvertent and intentional) manipulation common in the design of technology products directly to the numerous techniques that slot-machine designers use to entice gamblers to sit for hours losing money.

Inspired by Harris and other advocates of more-mindful technology product design, a small but growing Silicon Valley movement in behavioral design is advocating greater consideration of the ethics and the human outcomes of technology consumption. (After leaving Google, Harris launched a website, Time Well Spent, that focuses on helping people build healthier interactions with technology.)

Harris, New York University marketing professor Adam Alter, and others have criticized the various techniques that product designers are using to encourage us to consume ever more technology even to our own clear detriment. Tightly controlling menus to direct our attention is one common technique (one that is not as easily available to offline businesses). For his part, Harris suggests that we ask four questions whenever we're presented with online menus: (1) What's not on the menu? (2) Why am I being given these options and not others? (3) Do I know the menu provider's goals? (4) Is this menu empowering for my original need, or are the choices actually a distraction? We assure you, once you start asking these questions, you will never look at the Internet or at software applications in the same light again!

Another technique, alluded to in the title of Harris's slot-machine article, is the use of intermittent variable rewards: unpredictability in the rewards of an interaction. The pioneering behaviorist B. F. Skinner introduced this concept with his "Skinner box" research. Skinner put rats into boxes and taught them to push a lever to receive a food pellet. The rats learned the connection between behavior and reward quickly, in only a few tries. With further research, Skinner learned that the best way to keep the rats motivated to press the lever repeatedly was to reward them with a pellet only some of the time — to give intermittent variable rewards. Otherwise, the rats pushed the lever only when they were hungry.
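The difference between Skinner's schedules is easy to see in simulation. Below is a minimal Python sketch of our own (the payout probability and press counts are invented for illustration) contrasting a fixed-ratio schedule, where every fourth press pays, with a variable-ratio schedule, where each press pays with probability 1/4. The long-run reward rate is identical; what differs is that under the variable schedule the gap to the next reward is never predictable, and that unpredictability is what keeps the lever getting pressed.

```python
import random
from collections import Counter

random.seed(42)

def gaps_between_rewards(schedule, presses=10_000):
    """Return the lengths of press runs between successive rewards."""
    gaps, run = [], 0
    for i in range(presses):
        run += 1
        if schedule(i):
            gaps.append(run)
            run = 0
    return gaps

# Fixed-ratio schedule: every 4th press pays. Entirely predictable.
fixed = gaps_between_rewards(lambda i: (i + 1) % 4 == 0)

# Variable-ratio schedule: each press pays with probability 1/4.
# Same long-run reward rate, but the next payout is never predictable.
variable = gaps_between_rewards(lambda i: random.random() < 0.25)

print("fixed-ratio gap lengths:   ", Counter(fixed))
print("variable-ratio gap lengths:", sorted(Counter(variable).items())[:8], "...")
```

The fixed schedule produces a single gap length (every gap is exactly four presses), while the variable schedule scatters rewards across gaps of one press, ten presses, or more.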

The casinos took the concept of the Skinner box and raised it to a fine art, designing multiple forms of variable rewards into the modern computerized versions of slot machines. Those machines now take in 70 to 80 percent of casino profits (or, according to an industry official, even 85 percent). Players not only receive payouts at seemingly random intervals but also receive partial payouts that feel like a win even if the player in fact loses money overall on a turn. With the newer video slots, players can place dozens of bets on the repetition of a screen icon in various directions and in varying sequence lengths.

Older mechanical slot machines displayed three reels and one line. Newer video slot machines display digital icon grids of five by five or more. This allows for many more types of bets and multiple bets in the same turn. For example, the player can bet on how many times the same icon will appear in a single row, how many times it will appear on a diagonal, and how many times it will appear in a screen full of icons, all in one turn. This allows players to win one or more small bets during a turn and gain the thrill of victory, even though in aggregate they lose money on their collective bets for the turn. The brain's pleasure centers do not distinguish well between actual winning and the techniques that researchers call losses disguised as wins (LDW). The machines are also programmed to highlight near misses (nearly enough matching symbols), since near misses actually stimulate the same neurons as real wins do.
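The arithmetic behind losses disguised as wins is straightforward to demonstrate. The toy simulation below is our own, and every parameter is invented for illustration rather than taken from a real machine: it stakes one unit on each of twenty lines and pays a winning line less than the whole stake, so a spin can light up as a "win" while still losing money.

```python
import random

random.seed(7)

LINES = 20         # simultaneous line bets per spin (hypothetical machine)
BET_PER_LINE = 1   # units staked on each line
HIT_PROB = 0.10    # chance that any single line pays
LINE_PAYOUT = 8    # units returned by a winning line (house keeps an edge)

spins = 10_000
ldw = 0            # spins that pay something, yet less than was staked
net = 0
for _ in range(spins):
    hits = sum(random.random() < HIT_PROB for _ in range(LINES))
    payout = hits * LINE_PAYOUT
    staked = LINES * BET_PER_LINE
    net += payout - staked
    # A loss disguised as a win: lights and sounds fire, money is lost.
    if 0 < payout < staked:
        ldw += 1

print(f"losses disguised as wins: {ldw / spins:.0%} of spins")
print(f"average result per spin:  {net / spins:+.2f} units")
```

With these invented numbers, more than half of all spins trigger a win event on at least one line, even though the player loses, on average, four units per spin.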

Machine designers use myriad other clever sensory tricks — both visual and auditory — to stimulate our neurons in ways that encourage more playing. As explained in a 2014 article in The Conversation, "Losses disguised as wins: the science behind casino profits,"

Special symbols might be placed on the reels that provide 10 free spins whenever three appear anywhere within the game screen. These symbols will often make a special sound, such as a loud thud when they land; and if two symbols land, many games will begin to play fast tempo music, display flashing lights around the remaining reels, and accelerate the rate of spin to enhance the saliency of the event. When you win these sorts of outcomes you feel as though you have won a jackpot; after all, 10 free spins is 10x the chances to win big money right? The reality is that those 10 free spins do not change the already small probability of winning on any given spin and are still likely to result in a loss of money. For many games, features such as this have entirely replaced standard jackpots.

What helps these techniques entice humans to keep playing is that our brains are hard-wired to become easily addicted to variable rewards. This makes sense when you consider that finding food in prehistoric, pre-agricultural times was a perfect example of intermittent variable rewards. According to research by Robert Breen, video-based gambling games (of which slots represent the majority) that rely on intermittent variable rewards produce gambling addiction three to four times faster than does betting on card games or sporting events.

Smartphones were not explicitly designed to behave like slot machines, but their effect is nearly the same. As Harris writes,

When we pull our phone out of our pocket, we're playing a slot machine to see what notifications we got. When we pull to refresh our email, we're playing a slot machine to see what new email we got. When we swipe down our finger to scroll the Instagram feed, we're playing a slot machine to see what photo comes next. When we swipe faces left/right on dating apps like Tinder, we're playing a slot machine to see if we got a match. When we tap the [red badge showing us the number of notifications in an app], we're playing a slot machine to [see] what's underneath.

Through this lens we can see how many actions deeply embedded in the technology we use act as variable-reward systems; when we look at the technology in our lives, we find intermittent variable rewards in nearly every product, system, or device. Embedded in everything from e-mail to social media to chat systems to Q&A sites such as Quora, this reward structure is omnipresent, and it is not easy to control without going to extremes and exercising constant vigilance.

The Empty Vessel of Social Approval

When you post your first picture on Instagram, the application automatically contacts your friends who are already on Instagram and asks them to give you some "love." This is to encourage you to use the app more often and to get you hooked on social approval. It is a well-known product-design tactic in social networks and other consumer products. Both Twitter and Facebook encourage new users to immediately follow or connect with others they may already know in order to ensure that their feeds fill sufficiently to attract steady interest and to create a feedback loop of intermittent variable rewards. Sending some love seems rather innocuous, and the request is clearly not malicious in intent. But a little too much love can be bad for your soul when that love is empty and demand for it arises from a hedonic treadmill of empty accumulation rather than from real social relationships and personal recognition.

We all need and compete for social approval at some level, from our families, our friends, and our colleagues. Even if we intentionally try to avoid seeking it, the social-media software and hardware and their mass penetration via the Internet have led social competition to occupy considerable portions of our devices, our time, and our thoughts. Teens posting messages on the popular photo-sharing site Instagram worry acutely about how many likes and comments they will receive. To members of Instagram, followers are social currency. In Snapchat, teens compete to maintain "Snapstreaks" — consecutive days of mutual messaging — with friends. On Facebook, the number of likes on a post or the number of messages you get on your birthday becomes a measure of your personal self-worth. On Twitter, journalists and intellectuals compete for retweets and "hearts." On LinkedIn, we check to see who has viewed our profile, and the application provides us with weekly stats on the increase (as a percentage or an absolute number) in the number of people who have checked us out.

To be fair, some evidence exists that active participation in social networks leads people to feel more connected. Facebook claims that chatting with friends and family, sharing pictures, and other positive interactions don't make people sad, although it concedes that negative comparisons can lead to less happiness. Certain personality types, it appears, can better control the craving for constant likes and approvals, and suffer less from the inevitable comparisons with those who are more popular.

But, in general, jealous comparisons kill joy, and technology has driven us to compare ourselves with others on the most superficial of measures. Furthermore, recent research on social-media use has found that it is the comparisons, which are unavoidable in social media, that contribute most to making users unhappy. Teenagers appear to be particularly vulnerable to this; being excluded or unloved on social media is one of the worst humiliations a high-schooler can suffer. Heavy social-media use has been linked to unhappy relationships and higher divorce rates. That may follow from social media's encouragement of social comparisons and self-objectification, which tend to lower self-esteem, reduce mental health, and inculcate body shame. Quitting social media has been linked to marked increases in well-being.

This behavior of seeking likes and approvals also relates directly to intermittent variable rewards: the slot machine in our pockets and on our tablets and laptops. Not knowing how many likes you will get or when they will roll in, you check your social-media accounts frequently. And limits on choice and control compound the active promotion of destructive behaviors, escalating users into borderline obsessiveness.

The Bottomless Well

It's 11 p.m. on a weeknight, and you reach the end of the first episode of the latest season of Stranger Things on Netflix. It's late, and you know you should go to sleep. You have to be up in eight hours to go to work, and you need your rest. But before you can close the application, the next episode begins to play. Netflix has conveniently loaded that episode in the background, anticipating your desire to continue following the story. And then, almost against your will, you are watching the next episode. Oh well, you figure, I can make up sleep on the weekend.

Along with the millions of others watching Netflix at that precise instant, you have just been sucked into the bottomless well of consumption. Netflix has teams of PhD data scientists who work to figure out how to get you to watch more movies. As you watch Netflix, they watch you, tracking your behavior in minute detail. They track when you pause, rewind, or fast-forward; the days of the week when you tend to watch; the times of day when you watch; where you watch (by zip code); what device you watch on; the content you watch; how long you pause for (and whether you return); whether you rate content; how often you search content; and how you browse and scroll — to name just a few parameters. Truly, they are watching you watching them!

So it's hardly surprising that Netflix figured out that starting the next episode without even asking you would entice you to consume far more content. They noticed that some users were binge-watching and decided that automatically activating the next episode might be a good feature. Netflix launched "Post-Play," as the feature is called, in 2012. Other video-hosting companies quickly followed suit. It got so bad that Apple built a feature into Safari that blocks auto-play videos on webpages and, in January 2018, Google made this a feature in its Chrome browser! So how much more do we consume when facing a bottomless pit of content? Real data on that aren't publicly available yet (although Netflix, YouTube, and Facebook certainly have them), but clues to the soaring amount of user time that Netflix, YouTube, and Facebook videos occupy are available in research and surveys. A 2017 report that surveyed 37,000 consumers found that Netflix binge-watching had become "the new normal," with 37% of binge-watchers actually partaking in their pastime at work!
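Netflix's Post-Play code is proprietary, but the interaction pattern the paragraph describes can be sketched in a few lines. The Python sketch below is our own hypothetical reconstruction of that pattern, not Netflix's implementation; its point is that the default is inverted: continuing to watch requires no action, while stopping does.

```python
import time

def post_play(next_episode, countdown_seconds=5, cancelled=lambda: False):
    """Illustrative autoplay loop: the next episode is already preloaded,
    and the default outcome of doing nothing is more viewing."""
    for remaining in range(countdown_seconds, 0, -1):
        print(f"Next episode starts in {remaining}s (press stop to cancel)")
        if cancelled():
            return None        # stopping requires an action from the viewer
        time.sleep(1)
    return next_episode        # no action taken, so playback continues

print("Now playing:", post_play("the next episode"))
```

Browser auto-play blockers like the ones Safari and Chrome added work against the same pattern: they restore the moment of explicit choice that designs like this remove.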

(Continues…)


Excerpted from "Your Happiness Was Hacked" by Vivek Wadhwa and Alex Salkever.
Copyright © 2018 Vivek Wadhwa and Alex Salkever.
Excerpted by permission of Berrett-Koehler Publishers, Inc.
All rights reserved. No part of this excerpt may be reproduced or reprinted without permission in writing from the publisher.

Table of Contents

Foreword

Preface

Introduction

1 How Technology Removes Our Choices

2 The Origins of Technology Addiction

3 Online Technology and Love

4 Online Technology and Work

5 Online Technology and Play

6 Online Technology and Life

7 How Can We Make Technology Healthier for Humans?

8 A Vision for a More Humane Tech

9 A Personal Epilogue

Notes

Acknowledgments

Index

About the Authors
