The Inmates are Running the Asylum: Why High Tech Products Drive Us Crazy and How to Restore the Sanity

by Alan Cooper

Overview

Why are VCRs impossible to program? Why do car alarms make us all crazy at all the wrong times? Why do our computers reprimand us when they screw up? All of these computerized devices are wildly sophisticated and powerful, and they have proliferated across our desks, our cars, our homes, and our offices. So why are they still so dauntingly complicated to use?

The highly renowned Alan Cooper, "The Father of Visual Basic," tackles this issue head-on with his new book, The Inmates are Running the Asylum, from Sams Publishing. Cooper believes that powerful and pleasurable software-based products can be created by the simple expedient of designing computer-based products first and then building them. Designing interactive software-based products is a specialty that is as demanding as the construction of these same products, Cooper says. Ironically, building computerized products isn't difficult; they only seem so because our process for making them is out of date. To compound the problem, the costs of badly designed software are incalculable, robbing us of time, customer loyalty, competitive advantage, and opportunity.

The Inmates are Running the Asylum also addresses the societal dangers of what Cooper calls "software apartheid," where otherwise normal people are kept from entering the job market and participating in society because they cannot use computers effectively. While social activists are working hard to break down race and class barriers, technologists are hard at work inadvertently erecting new, bigger ones. "By purposefully designing software-based products to be more human and forgiving, we can automatically make them more inclusive, more class- and color-blind," Cooper writes.

Using examples from his own work with companies of all sizes, Cooper offers a provocative, insightful and entertaining explanation for this phenomenon.

He believes that in part, the problem lies in the fact that business executives in the high-tech industry have relinquished their control to the engineers and techies. In the rush to accept the many benefits of the silicon chip, responsibility has been abandoned, and "the inmates have been allowed to run the asylum." The solution, Cooper says, is to harness those talents to create products that will both thrill their users and grow the bottom line.

The book is written for two new archetypes emerging in contemporary business: "The technology-savvy businessperson," who knows that his success depends on the quality of the information available to him and the sophistication with which he uses it; and "the business-savvy technologist," an entrepreneurial engineer or scientist with a keen business sense and an awareness of the power of information.

About the Author:

Alan Cooper is the "Father of Visual Basic," according to Mitch Waite, founder of Waite Group Press. In 1994, Bill Gates gave him the rare and coveted Windows Pioneer Award--recognizing how his part in the invention of VB contributed to the success of Microsoft Windows. Cooper's influential book, About Face: The Essentials of User Interface Design (IDG Books), has sold over 40,000 copies since August 1995, and continues to sell about 300-400 copies a month. He now leads Cooper Interaction Design, a consulting firm that has created breakthrough interactive product designs for IBM, Sony, Logitech and several Internet/intranet start-ups.

For twenty years Alan Cooper designed and developed consumer software products including SuperProject, MicroPhone II for Windows, and Microsoft's visual programming user interface for Visual Basic.

Cooper is a member of the Corporate Design Foundation and the American Center for Design. He is a former director of the Association for Software Design's Silicon Valley Chapter and a member of the national organization's Board of Directors.

He is a frequent, opinionated and engaging industry speaker and writer on the topics of user interface and conceptual design.


Editorial Reviews

Booknews
Armed with solutions to the dilemma of how dependent every one of us is becoming on electronic products, the author argues that, despite appearances, business executives are simply not in control of the high-tech industry. He explains how talented people continuously design bad technology-based products and uses his own work to show businesses of all sizes how to harness talent to create products that will both thrill users and grow the bottom line. Annotation c. by Book News, Inc., Portland, Or.

Product Details

ISBN-13: 9780672316494
Publisher: Sams
Publication date: 03/23/1999
Edition description: Older Edition
Pages: 261
Product dimensions: 6.35(w) x 9.56(h) x 1.04(d)

Read an Excerpt

[Figures are not included in this sample chapter]

The Inmates Are Running the Asylum
Chapter 1: Riddles for the Information Age

What Do You Get When You Cross a Computer with an Airplane?

In December 1995, American Airlines Flight 965 departed from Miami on a regularly scheduled trip to Cali, Colombia. On the landing approach, the pilot of the 757 needed to select the next radio navigation fix, named "ROZO." He entered an "R" into his navigation computer. The computer returned a list of nearby navigation fixes starting with "R," and the pilot selected the first of these, whose latitude and longitude appeared to be correct. Unfortunately, instead of "ROZO," the pilot selected "ROMEO," 132 miles to the northeast. The jet was southbound, descending into a valley that runs north-south, and any lateral deviation was dangerous. Following indications on the flight computer, the pilots began an easterly turn and slammed into a granite peak at 10,000 feet. One hundred fifty-two passengers and all eight crew members aboard perished. Four passengers survived with serious injuries. The National Transportation Safety Board investigated, and--as usual--declared the problem human error. The navigational aid the pilots were following was valid but not for the landing procedure at Cali. In the literal definition of the phrase, this was indeed human error, because the pilot selected the wrong fix. However, in the larger picture, it wasn't the pilot's fault at all.

The front panel of the airplane's navigation computer showed the currently selected navigation fix and a course deviation indicator. When the plane is on course, the needle is centered, but the needle gives no indication whatsoever about the correctness of the selected radio beacon. The gauge looks pretty much the same just before landing as it does just before crashing. The computer told the pilot he was tracking precisely to the beacon he had selected. Unfortunately, it neglected to tell him the beacon he selected was a fatal choice.


Communications can be precise and exacting while still being tragically wrong. This happens all too frequently when we communicate with computers, and computers are invading every aspect of our modern lives. From the planes we fly to just about every consumer product and service, computers are ubiquitous, and so is their characteristically poor way of communicating and behaving.

There is a widely told joke in the computer industry that goes like this: A man is flying in a small airplane and is lost in the clouds. He descends until he spots an office building and yells to a man in an open window, "Where am I?" The man replies, "You are in an airplane about 100 feet above the ground." The pilot immediately turns to the proper course, spots the airport and lands. His astonished passenger asks how the pilot figured out which way to go. The pilot replies, "The answer the man gave me was completely correct and factual, yet it was no help whatsoever, so I knew immediately he was a software engineer who worked for Microsoft and I know where Microsoft's building is in relation to the airport."

When seen in the light of the tragedy of Flight 965, the humor of the joke is macabre, yet professionals in the digital world tell it gleefully and frequently because it highlights a fundamental truth about computers: They may tell us facts, but they don't inform us. They may guide us with precision, but they don't guide us where we want to go. The flight computer on Flight 965 could easily have told the pilots that "ROMEO" was not an appropriate fix for their approach to Cali. Even a simple hint that it was "unusual" or "unfamiliar" could have saved the airplane. Instead, it seemed as though the computer was utterly unconcerned with the actual flight and its passengers. It cared only about its own internal computations.

Hard-to-use computers affect us all, sometimes fatally. Software-based products are not inherently hard to use; they are that way because we use the wrong process for creating them. In this book, I intend to reveal this bad process by showing its effect and describing its cause. I'll then show how to change the process so that our software-based products become friendly, powerful, and desirable. First, I'll use this chapter to show how serious this problem really is.

What Do You Get When You Cross a Computer with a Camera?

Here is a riddle for the information age: What do you get when you cross a computer with a camera? Answer: A computer! Thirty years ago, my first camera, a 35mm Pentax Model H, had a small battery in it that powered the light meter. Like a wristwatch battery, I merely swapped in a new one every couple of years.

Fifteen years ago, my first electronic camera, a 35mm Canon T70, used two AA batteries to power its rather simple exposure computer and its automatic film drive. It had a simple On/Off switch, so that the batteries wouldn't wear down needlessly.

Five years ago, my filmless Logitech, a first-generation digital camera, had a similar On/Off switch, but this time it had the smarts of a rudimentary computer inside it. So if I forgot to turn it off, it automatically shut down after one minute of inactivity. Neat.

One year ago, my second-generation digital camera, a Panasonic PalmCam, had an even smarter computer chip inside it. It was so smart that its On/Off switch had evolved into an Off/Rec/Play switch. It now had modes: I had to put it into Rec mode to take pictures and Play mode to view them on its small video display.

My newest camera, a Nikon CoolPix 900, is a third-generation digital camera and the smartest yet. In fact, it has a full-blown computer that displays a Windows-like hourglass while it "boots up." Like some mutant fish with extra heads, its On/Off switch has now grown to have four settings: Off/ARec/MRec/Play. "ARec" means "automatic record" and "MRec" means "manual record." As far as I can tell, there is no difference. There is no "On" setting, and none of my friends can figure out how to turn it on without a lengthy explanation.

The new camera is very power-hungry, and its engineers thoughtfully provided it with a sophisticated computer program that manages the consumption of battery power. A typical scenario goes like this: I turn the evil off/etc. switch to "MRec," wait about seven long seconds for the camera to boot up, then point it at my subject. I aim the camera and zoom in to properly frame the image. Just as I'm about to press the shutter button, the camera suddenly realizes that simultaneously running the zoom, charging the flash, and energizing the display has caused it to run out of power. In self-defense, it suspends its ability to actually take pictures. But I don't know that because I'm looking through the viewfinder, waving my arms, saying "Smile," and pressing the shutter button. The computer detects the button-press, but it simply cannot obey. In a misguided effort to help out, the power management program instantly takes over and makes an executive decision: Shed load. It shuts down the power-greedy LCD video display. I look at the camera quizzically, wondering why it didn't take the picture, shrug my shoulders, and let my arm holding the camera drop to my side. But as soon as the LCD is turned off, there is more battery power available for other systems. The power management program senses this increase and realizes that it now has enough electricity to take pictures. It returns control to the camera program, which is waiting patiently to process the command it received when I pressed the shutter button, and it takes a nicely auto-focused, well-exposed, high-resolution digital picture of my kneecap.
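The sequence reads like a scheduling decision gone wrong: the firmware silently queues the shutter command, sheds load, and then executes the command once power returns, long after the moment has passed. The following is a minimal, purely hypothetical sketch of that kind of policy; the class, thresholds, and power units are all invented for illustration and have nothing to do with Nikon's actual firmware.

```python
# Hypothetical sketch of a "shed load, then retry" power policy like the one described above.
# Every name and number here is invented for illustration.

from collections import deque

class ToyCamera:
    def __init__(self):
        self.power = 5           # arbitrary units of available battery headroom
        self.lcd_on = True       # the power-hungry display starts on
        self.pending = deque()   # commands accepted by the firmware but not yet run

    def press_shutter(self):
        """The button press is always *detected*; whether it runs now is another matter."""
        self.pending.append("take_picture")
        self.run_pending()

    def run_pending(self):
        # Assume taking a picture needs 4 units and the LCD consumes 3.
        while self.pending:
            if self.power >= 4:
                self.pending.popleft()
                print("click! (whatever the lens happens to point at right now)")
            elif self.lcd_on:
                # "Shed load": silently turn off the display instead of telling the user.
                self.lcd_on = False
                self.power += 3
            else:
                break  # not enough power; the command stays queued, the user is told nothing

cam = ToyCamera()
cam.power = 2          # zoom and flash have drained the headroom
cam.press_shutter()    # fires only after the LCD is shed -- by then the camera has drooped
```

The policy is internally "correct" at every step; the failure is that none of those steps ever informs the person holding the camera.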

That old mechanical Pentax had manual focusing, manual exposure, and manual shutter-speed, yet it was far less frustrating to use than the fully computerized modern Nikon CoolPix 900, which has automatic focusing, exposure, and shutter-speed. The camera may still take pictures, but it behaves like a computer instead of a camera.


When a frog is slipped into a pot of cold water on the stove, he never recognizes the deadly rising temperature. Instead, the heat anesthetizes the frog's senses. I was unaware, like the frog, of my cameras' slow march from easy to hard-to-use as they slowly became computerized. We are all experiencing this same, slow, anesthetizing encroachment of computer behavior in our everyday lives.

What Do You Get When You Cross a Computer with an Alarm Clock?

A computer! I just purchased an expensive new clock radio for my bedroom, a JVC FS-2000. It has a very sophisticated computer brain, and offers high fidelity, digital sound, and lots of features. It wakes me up at a preset time by playing a compact disc, and it has the delicacy and intelligence to slowly faaaaade up the volume when it begins to play at six o'clock in the morning. This feature is really pleasant and quite unique, and it compensates for the fact that I want to hurl the infuriating machine out the window.

It's very hard to tell when the alarm is armed, so it occasionally fails to wake me up on a Monday and rousts me out of bed early on a Saturday. Sure, it has an indicator to show the alarm is set, but that doesn't mean it's useful. The clock has a sophisticated alphanumeric liquid crystal display (LCD) that displays all of its many functions. The presence of a small clock symbol in the upper-left corner of the LCD indicates the alarm is armed, but in a dimly lit bedroom the clock symbol cannot be seen. The LCD has a built-in backlight that makes the clock symbol visible, but the backlight only comes on when the CD or audio is explicitly turned on. There's a gotcha, however, as the alarm simply won't ever sound while the CD is explicitly left on, regardless of the setting of the alarm. It is this paradoxical operation that frequently catches me unawares.

It is simple to disarm the alarm: Simply press the "Alarm" button once, and the clock symbol disappears from the display. However, to arm it, I must press the "Alarm" button exactly five times. The first time I press it, the display shows me the time of the alarm. On press two, it shows the time when it will turn the sound off. On press three, it shows me whether it will play the radio or the CD. On press four, it shows me the preset volume. On press five, it returns to the normal view, but with the alarm now armed. But with just one additional press, it disarms the alarm. Sleepy, in a dark bedroom, I find it quite difficult to perform this little digital ballet correctly.
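Modeled as code, the arming procedure is a six-state cycle in which only one press actually changes anything, and the "armed" state sits exactly one press away from "disarmed." This is a guessed reconstruction of the behavior described above, not the JVC firmware; the state names are invented, only the press counts come from the text.

```python
# Hypothetical model of the "Alarm" button cycle described above.

CYCLE = [
    "idle (disarmed)",      # starting point
    "show alarm time",      # press 1
    "show shut-off time",   # press 2
    "show radio-or-CD",     # press 3
    "show preset volume",   # press 4
    "idle (ARMED)",         # press 5 -- the only press that changes behavior
]

def press_alarm(times: int) -> str:
    """Return the state after pressing 'Alarm' the given number of times.
    Press six wraps back around to disarmed, which is the trap."""
    return CYCLE[times % len(CYCLE)]

print(press_alarm(5))   # idle (ARMED)
print(press_alarm(6))   # idle (disarmed) -- one sleepy extra press undoes everything
```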

Being a nerdy gizmologist, I continue to fiddle with the device in the hope that I will master it. My wife, however, long ago gave up on the diabolic machine. She loves the look of the sleek, modern design, and the fidelity of the sound it produces, but it failed to pass the alarm-clock test weeks ago because it is simply too hard to make work. The alarm clock may still wake me up, but it behaves like a computer.

By contrast, my old $11 non-computerized alarm clock woke me up with a sudden, unholy buzzing. When it was armed, a single red light glowed. When it was not armed, the red light was dark. I didn't like this old alarm clock for many reasons, but at least I could tell when it was going to wake me up.


Because it is far cheaper for manufacturers to use computers to control the internal functioning of devices than it is to use older, mechanical methods, it is economically inevitable that computers will insinuate themselves into every product and service in our lives. This means that the behavior of all of our products will soon be the same as that of the most obnoxious computers, unless we try something different.


This phenomenon is not restricted to consumer products. Just about every computerized device or service has more features and options than its manual counterpart. Yet, in practice, we often wield the manual devices with more flexibility, subtlety, and awareness than we do the modern versions driven by silicon-chip technology.

High-tech companies--in an effort to improve their products--merely add complicating and unwanted features to them. Because the broken process cannot solve the problem of bad products but can only add new functions, that is what vendors do. Later in this book I'll show how a better development process makes users happier without the extra work of adding unwanted features.

What Do You Get When You Cross a Computer with a Car?

A computer! Porsche's beautiful new high-tech sports car, the Boxster, has seven computers in it to help manage its complex systems. One of them is dedicated to managing the engine. It has special procedures built into it to deal with abnormal situations. Unfortunately, these sometimes backfire. In some early models, if the fuel level in the gas tank got very low--only a gallon or so remaining--the centrifugal force of a sharp turn could cause the fuel to collect in the side of the tank, allowing air to enter the fuel lines. The computer sensed this as a dramatic change in the incoming fuel mixture, and interpreted it as a catastrophic failure of the injection system. To prevent damage, the computer would shut down the ignition and stop the car. Also to prevent damage, the computer wouldn't let the driver restart the engine until the car had been towed to a shop and serviced.
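Read as a decision rule, the engine computer's behavior is a fault classifier with no branch for "low fuel plus hard cornering." A hedged sketch of that kind of rule follows; the function, sensor names, and thresholds are all invented for illustration and are not Porsche's code.

```python
# Hypothetical sketch of the protection rule described above; everything here is invented.

def engine_controller(fuel_gallons: float, mixture_swing: float) -> dict:
    """Return the action a naive protection rule would take.

    mixture_swing: how far the incoming fuel mixture deviates from nominal (0.0 = normal).
    """
    if mixture_swing > 0.5:
        # The rule sees only "catastrophic injection failure"; it has no case for
        # "the last gallon sloshed to one side of the tank in a sharp turn."
        return {"ignition": "off", "restart_locked": True, "message": "Service required"}
    return {"ignition": "on", "restart_locked": False, "message": ""}

# A sharp turn on a near-empty tank lets air into the fuel line...
print(engine_controller(fuel_gallons=0.8, mixture_swing=0.9))
# ...and the driver is stranded until the computer's memory is cleared.
```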

When owners of early Boxsters first discovered this problem, the only solution Porsche could devise was to tell them to open the engine compartment and disconnect the battery for at least five minutes, giving the computer time to forget all knowledge of the hiccup. The sports car may still speed down those two-lane blacktop roads, but now, in those tight turns, it behaves like a computer.


In a laudable effort to protect Boxster owners, the programmers turned them into humiliated victims. Every performance car aficionado knows that the Porsche company is dedicated to lavishing respect and privilege on its clientele. That something like this slipped through shows that the software inside the car is not coming from the same Porsche that makes the rest of the car. It comes from a company within a company: the programmers, and not the legendary German automobile engineers. Somehow, the introduction of a new technology surprised an older, well-established company into letting some of its core values slip away. Acceptable levels of quality for software engineers are far lower than are those for more traditional engineering disciplines.

What Do You Get When You Cross a Computer with a Bank?

A computer! Whenever I withdraw cash from an automatic teller machine (ATM), I encounter the same sullen and difficult behavior so universal with computers. If I make the slightest mistake, it rejects the entire transaction and kicks me out of the process. I have to pull my card out, reinsert it, reenter my PIN code, and then re-assert my request. Typically, it wasn't my mistake, either, but the ATM computer finesses me into a misstep. It always asks me whether I want to withdraw money from my checking, savings, or money market account, even though I have only a checking account. Subsequently, I always forget which type it is, and the question confuses me. About once a month I inadvertently select "savings," and the infernal machine summarily boots me out of the entire transaction to start over from the beginning. To reject "savings," the machine has to know that I don't have a savings account, yet it still offers it to me as a choice. The only difference between me selecting "savings" and the pilot of Flight 965 selecting "ROMEO" is the magnitude of the penalty.

The ATM also restricts me to a $200 "daily withdrawal limit." If I go through all of the steps--identifying myself, choosing the account, selecting the amount--and then ask for $220, the computer unceremoniously rejects the entire transaction, informing me rudely that I have exceeded my daily withdrawal limit. It doesn't tell me what that amount is, nor does it tell me how much money is in my account, nor does it give me the opportunity to key in a new, lower amount. Instead it spits out my card and leaves me to try the whole process again from scratch, no wiser than I was a moment ago, as the line of people growing behind me shifts, shuffles, and sighs. The ATM is correct and factual, but it is no help whatsoever.

The ATM has rules that must be followed, and I am quite willing to follow them, but it is unreasonably computer-like to fail to inform me of them, give me contradictory indications, and then summarily punish me for innocently transgressing them. This behavior--so typical of computers--is not intrinsic to them. Actually, nothing is intrinsic to computers: they merely act on behalf of their software, the program. And programs are as malleable as human speech. A person can speak rudely or politely, helpfully or sullenly. It is as simple for a computer to behave with respect and courtesy as it is for a human to speak that way. All it takes is for someone to describe how. Unfortunately, programmers aren't very good at teaching that to computers.
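Describing "how" can be as small as a different error path. The sketch below contrasts the two behaviors under stated assumptions: the limit, balance, and wording are invented, and no real ATM software is being quoted.

```python
# Hypothetical contrast between the ATM behavior described above and a more
# forgiving version. Account data, limits, and wording are all invented.

DAILY_LIMIT = 200
BALANCE = 1480.00

def rude_atm(amount: float) -> str:
    """Correct and factual, but no help whatsoever: reject and start over."""
    if amount > DAILY_LIMIT:
        return "Daily withdrawal limit exceeded. Transaction cancelled. Please remove your card."
    return f"Dispensing ${amount:.2f}."

def polite_atm(amount: float) -> str:
    """The same rule, but the machine explains the rule and offers a way forward."""
    if amount > DAILY_LIMIT:
        return (f"The daily limit is ${DAILY_LIMIT}. You asked for ${amount:.2f}. "
                f"Withdraw ${DAILY_LIMIT} instead? (Your balance is ${BALANCE:.2f}.)")
    return f"Dispensing ${amount:.2f}."

print(rude_atm(220))    # the behavior described above
print(polite_atm(220))  # the same rule, described helpfully
```

Both functions enforce exactly the same rule; the only difference is whether the rule is explained to the person standing at the machine.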

Computers Make It Easy to Get into Trouble

Computers that sit on a desk simply behave in the same irritating way computers always have, and they don't have to be crossed with anything. My friend Jane used to work in public relations as an account coordinator. She used Microsoft Word on her desktop PC, running Windows 95, to write memos and contracts. The core of Windows 95 is the hierarchical file system. All of Jane's documents were stored in little folders, which were stored in other little folders. Jane didn't understand this, nor did she see the advantage to storing things that way. Actually, Jane didn't give it a lot of thought, but merely took the path of least resistance.

Jane had just finished drafting the new PR contract for a Silicon Valley startup company. She selected "Close" from the "File" menu. Instead of simply doing as she directed and closing the document, Word popped up a dialog box. It was, of course, the all-too-familiar "Save Changes?" confirmation box. She responded--as always--by pressing the "Yes" button. She responded this way so consistently and often that she no longer even looked at the dialog.

The first dialog was followed immediately by another one, the equally familiar "Save As" box. It presented Jane with lots of confusing buttons, icons, and text fields. The only one that Jane understood and used was the text entry field for "File name." She typed in a likely name and then pressed the "Save" button. The program then saved the PR contract in the "My Documents" folder. Jane was so used to this unnecessary drill that she gave it no thought.

At lunchtime, while Jane was out of her office, Sunil, the company's computer tech, installed a new version of VirusKiller 2.1 on her computer. While working on Jane's PC, Sunil used Word to view a VirusKiller Readme file. After viewing the file, Sunil closed it and returned Jane's computer to exactly the way it was before lunch. At least, he thought he did.

After lunch, Jane needed to reopen the PR contract and get a printout to show to her boss. Jane selected "Open" from the "File" menu, and the "Open" dialog box appeared. Jane expected the "Open" dialog box to show her, in neat alphabetic order, all of her contracts and documents. Instead, it showed her a bunch of filenames that she had never seen before and didn't recognize. One of them was named "Readme.doc."

Of course, when Sunil used Word to view the Readme file, he instructed Jane's copy of Word to look in an obscure folder six levels deep and inadvertently steered it away from Jane's normal setting of "My Documents."
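The mechanism behind the surprise is a single shared default: the "Open" dialog starts wherever the program last looked, no matter who was driving. The toy sketch below illustrates that design choice under stated assumptions; the class, folder names, and file names are invented, and this is not Word's actual code.

```python
# Hypothetical sketch of a shared "last folder" default like the one that confused Jane.

class OpenDialog:
    def __init__(self, start_folder="My Documents"):
        self.last_folder = start_folder   # one global setting, shared by everyone who touches the program

    def open_file(self, folder, name):
        self.last_folder = folder         # side effect: the default silently moves
        return f"opened {folder}/{name}"

    def show(self):
        return f"listing files in: {self.last_folder}"

word = OpenDialog()
word.open_file("Program Files/VirusKiller/docs/readme", "Readme.doc")  # Sunil, at lunch
print(word.show())   # Jane, after lunch: a folder six levels deep she has never seen
```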

Jane was now quite bewildered. Her first, unavoidable thought was that all of her hard work had somehow been erased, and she got very worried. She called over René, her friend and coworker, but René was just as confused as Jane was. Finally, in a state approaching panic, she telephoned Sunil to ask for his help. Sunil was not at his desk, and it wasn't until Monday morning that he had a chance to stop by and set things right. Jane, René, Sunil--and the PR company--each lost a half-day's productivity.

Although computer operating systems need hierarchical file systems, the people who use them don't. It's not surprising that computer programmers like to see the underlying hierarchical file systems, but it is equally unremarkable that normal users like Jane don't. Unremarkable to everyone, that is, except the programmers who create the software that we all use. They create the behavior and information presentation that they like best, which is very different from the behavior and information presentation that is best for Jane. Jane's frustration and inefficiency are blamed on Jane, and not on the programmers who torpedoed her.

At least Jane has a job. Many people are considered insufficiently "computer literate" and are thus not employable. As more and more jobs demand interaction with computers, the rift between the employable and the unemployable becomes wider and more difficult to cross. Politicians may demand jobs for the underprivileged, but without the ability to use computers, no company can afford to let them put their untrained hands on the company's computers. There is too much training involved, and too much exposure to the destruction of data and the bollixing up of priceless databases.

The obnoxious behavior and obscure interaction that software-based products exhibit is institutionalizing what I call "software apartheid," where otherwise normal people are forbidden from entering the job market and participating in society because they cannot use computers effectively. In our enlightened society, social activists are working hard to break down race and class barriers while technologists are hard at work inadvertently erecting new, bigger ones. By purposefully designing our software-based products to be more human and forgiving, we can automatically make them more inclusive, more class- and color-blind.

Commercial Software Suffers, Too

Not only are computers taking over the cockpit of jet airliners, they are taking over the passenger cabin, too, behaving in that same obstinate, perverse way that is so easy to recognize and so hard to use. Modern jet planes have in-flight entertainment (IFE) systems that deliver movies and music to airline passengers. These IFEs are merely computers connected with local area networks, just like in your office. Advanced IFE systems are generally installed only on larger airplanes flying transoceanic routes.

One airline's IFE was so frustrating for the flight attendants to use that many of them were bidding to fly shorter, local routes to avoid having to learn and use the difficult systems. This is remarkable considering that the time-honored airline route-bidding process is based on seniority, and that those same long-distance routes have always been considered the most desirable plums because of their lengthy layovers in exotic locales like Singapore or Paris. For flight attendants to bid for unglamorous, unromantic yo-yo flights from Denver-to-Dallas or LA-to-San Francisco just to avoid the IFE indicated a serious morale problem. Any airline that inflicted bad tools on its most prized employees--the ones who spent the most time with the customer--was making a foolish decision and was profligately discarding money, customer loyalty, and staff loyalty.

The computer-IFE of another large airline was even worse. The airline had created an in-flight entertainment system that linked movie delivery with the cash collection function. In a sealed jet airplane flying at 37,000 feet, cash collection procedures had typically been quite laissez-faire; after all, nobody was going to sneak out the back door. Flight attendants delivered goods and services when it was convenient and collected cash in only a very loosely coupled fashion. This kept them from running unnecessarily up and down the narrow aisles. Sure, there were occasional errors, but never more than a few dollars were involved, and the system was quite human and forgiving; everyone was happy and the work was not oppressive.

With cash-collection connected to content delivery by computer, the flight attendant had to first get the cash from the passenger, then walk all the way to the head-end of the cabin, where the attendant's console was, enter an attendant password, then perform a cash register-like transaction. Only when that transaction was completed could the passenger actually view a movie or listen to music. This inane product design forced the flight attendants to walk up and down those narrow aisles hundreds of extra times during a typical trip. Out of sheer frustration, the flight attendants would trip the circuit breaker on the in-flight entertainment system at the beginning of each long flight, shortly after departure. They would then blandly announce to the passengers that, sorry, the system was broken and there would be no movie on this flight.

The airline had spent millions of dollars constructing a system so obnoxious that its users deliberately turned it off to avoid interacting with it. The thousands of bored passengers were merely innocent victims. And this happened on long, overseas trips typically packed with much-sought-after frequent flyers. I cannot put a dollar figure on the expense this caused the airline, but I can say with conviction that it was catastrophically expensive.

The software inside the IFEs worked with flawless precision, but was a resounding failure because it misbehaved with its human keepers. How could a company fail to predict this sad result? How could it fail to see the connection? The goal of this book is to answer these questions and to show you how to avoid such high-tech debacles.

What Do You Get When You Cross a Computer with a Warship?

In September 1997, while conducting fleet maneuvers in the Atlantic, the USS Yorktown, one of the Navy's new Aegis guided-missile cruisers, stopped dead in the water. A Navy technician, while calibrating an on-board fuel valve, entered a zero into one of the shipboard management computers, a Pentium Pro running Windows NT. The program attempted to divide another number by that zero--a mathematically undefined operation--which resulted in a complete crash of the entire shipboard control system. Without the computers, the engine halted and the ship sat wallowing in the swells for two hours and forty-five minutes until it could be towed into port. Good thing it wasn't in a war zone.
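The failure mode itself is one line of arithmetic left unguarded: a calibration value of zero reaches a division, the exception propagates, and the whole control program dies with it. A minimal sketch follows; the function and field names are invented for illustration and are not the Yorktown's code.

```python
# Hypothetical sketch of the unguarded arithmetic described above; names are invented.

def flow_rate(volume_pumped: float, valve_calibration: float) -> float:
    # An operator's zero reaches the division, ZeroDivisionError is raised, and
    # if nothing catches it, it takes the whole control program down with it.
    return volume_pumped / valve_calibration

def flow_rate_guarded(volume_pumped: float, valve_calibration: float) -> float:
    # The forgiving version: refuse the bad entry with an explanation, keep running.
    if valve_calibration == 0:
        raise ValueError("Calibration value cannot be zero; please re-enter it.")
    return volume_pumped / valve_calibration

try:
    flow_rate(1500.0, 0.0)
except ZeroDivisionError:
    print("entire shipboard control system down")

try:
    flow_rate_guarded(1500.0, 0.0)
except ValueError as err:
    print("operator sees:", err)
```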

What do you get when you cross a computer with a warship? Admiral Nimitz is rolling in his grave! Despite this setback, the Navy is committed to computerizing all of its ships because of the manpower cost savings, and to deflect criticism of this plan, it has blamed the "incident" on human error. Because the software creation process is out of control, the high-tech industry must either bring its process to heel or it will continue to put the blame on ordinary users while ever-bigger machines sit dead in the water.

Techno-Rage

An article in a recent issue of the Wall Street Journal described an anonymous video clip circulated widely by email that showed a "...Mustachioed Everyman in a short-sleeved shirt hunched over a computer terminal, looking puzzled. Suddenly, he strikes the side of his monitor in frustration. As a curious co-worker peers over his cubicle, the man slams the keyboard into the monitor, knocking it to the floor. Rising from his chair, he goes after the fallen monitor with a final, ferocious kick." The article went on to say that reaction to the clip had been "intense" and that it had apparently tapped into "a powerful undercurrent of techno-rage."

It's ironic that one needs to be moderately computer savvy to even send or view this video clip. While the man in the video may well be an actor, he touches a widespread, sympathetic chord in our business world. The frustration that difficult and unpleasant software-based products are bringing to our lives is rising rapidly.

Joke emails circulate on private email lists about "Computer Tourette's." This is a play on the disorder known as Tourette's Syndrome, where some sufferers engage in uncontrollable bouts of swearing. The joke is that you can walk down the halls of most modern office buildings and hear otherwise-normal people sitting in front of their monitors, jaws clenched, swearing repeatedly in a rictus of tense fury. Who knows what triggered such an outburst: a misplaced file, an inaccessible image, or a frustrating interaction. Or maybe the program just blandly erased the user's only copy of a 500-page manuscript because he responded with a "Yes" to a confirmation dialog box, assuming that it had asked him if he wanted to "save your changes?" when it actually asked him if he wanted to "discard your work?"

An Industry in Denial

We are a world awash in high-tech tools. Computers dominate the workplace and our homes, and vehicles are filling up with silicon-powered gadgets. All of these computerized devices are wildly sophisticated and powerful, but every one of them is dauntingly difficult and confusing to use.

The high-tech industry is in denial of a simple fact that every person with a cell phone or a word processor can clearly see: Our computerized tools are too hard to use. The technologists who create software and high-tech gadgets are satisfied with their efforts. The software engineers who create them have tried as hard as they can to make them easy to use and they have made some minor progress. They believe that their products are as easy to use as it is technically possible to make them. As engineers, their belief is in technology, and they have faith that only some new technology, like voice recognition or artificial intelligence, will improve the user's experience.

Ironically, the thing that will likely make the least improvement in the ease of use of software-based products is new technology. There is little difference technically between a complicated, confusing program and a simple, fun, and powerful product. The problem is one of culture, training, and attitude of the people who make them, more than it is one of chips and programming languages. We are deficient in our development process, not in our development tools.

The high-tech industry has inadvertently put programmers and engineers in charge, so their hard-to-use engineering culture dominates. Despite appearances, business executives are simply not the ones in control of the high-tech industry. It is the engineers who are running the show. In our rush to accept the many benefits of the silicon chip, we have abdicated our responsibilities. We have let the inmates run the asylum.

When the inmates run the asylum, it is hard for them to see clearly the nature of the problems that bedevil them. When you look in the mirror, it is all too easy to single out your best features and overlook the warts. When the creators of software-based products examine their handiwork, they overlook how bad it is. Instead they see its awesome power and flexibility. They see how rich the product is in features and functions. They ignore how excruciatingly difficult it is to use, how many mind-numbing hours it takes to learn, or how it diminishes and degrades the people who must use it in their everyday lives.

The Origins of This Book

I have been inventing and developing software-based products for twenty-five years. This problem of hard-to-use software has puzzled and confounded me for years. Finally, in 1992, I ceased all programming to devote one hundred percent of my time to helping other development firms make their products easier to use. And a wonderful thing happened! I immediately discovered that after I freed myself from the demands of programming, I saw for the first time how powerful and compelling those demands were. Programming is such a difficult and absorbing task that it dominates all other considerations, including the concerns of the user. I could only see this after I had extricated myself from its grip.

Upon making this discovery, I began to see what influences drove software-based products to be so bad from the user's point of view. In 1995 I wrote a book about what I learned, and it has had a significant effect on the way some software is designed today.

To be a good programmer, one must be sympathetic to the nature and needs of the computer. But the nature and needs of the computer are utterly alien from the nature and needs of the human being who will eventually use it. The creation of software is so intellectually demanding, so all-consuming, that programmers must completely immerse themselves in an equally alien thought process. In the programmer's mind, the demands of the programming process not only supersede any demands from the outside world of users, but the very languages of the two worlds are at odds with each other.

The process of programming subverts the process of making easy-to-use products for the simple reason that the goals of the programmer and the goals of the user are dramatically different. The programmer wants the construction process to be smooth and easy. The user wants the interaction with the program to be smooth and easy. These two objectives almost never result in the same program. In the computer industry today, the programmers are given the responsibility to create interaction that makes the user happy, but in the unrelenting grip of this conflict of interest, they simply cannot do so.

In software, typically nothing is visible until it is done, meaning that any second-guessing by non-programmers is too late to be effective. Desktop computer software is infamously hard to use because it is purely the product of programmers; nobody comes between them and the user. Objects like phones and cameras have always had a hefty mechanical component that forced them into the open for review. But as we've established, when you cross a computer with just about any product, the behavior of the computer dominates completely.

The key to solving the problem is interaction design. We need a new class of professional interaction designers who design the way software behaves. Today, programmers consciously design the "code" inside programs but only inadvertently design the interaction with humans. They design what it does but not how it behaves, communicates, or informs. Conversely, interaction designers focus directly on the way users see and interact with software-based products. This craft of interaction design is new and unfamiliar to programmers, so--when they admit it at all--they let it in only after their programming is already completed. At that point, it is too late.

The people who manage the creation of software-based products are typically either hostage to programmers because they are insufficiently technical, or they are all too sympathetic to programmers because they are programmers themselves. The people who use software-based products are simply unaware that those products can be as pleasurable to use and as powerful as any other well-designed tool.

Programmers aren't evil. They work hard to make their software easy to use. Unfortunately, their frame of reference is themselves, so they only make it easy to use for other software engineers, not for normal human beings.

The costs of badly designed software are incalculable. The cost of Jane and Sunil's time, the cost of offended air travelers, and the cost of the lives of passengers on Flight 965 cannot easily be quantified. The greatest cost, though, is the opportunity we are squandering. While we let our products frustrate, cost, confuse, irritate, and kill us, we are not taking advantage of the real promise of software-based products: to be the most human and powerful and pleasurable creations ever imagined. Because software truly is malleable far beyond any other media, it has the potential to go well beyond the expectations of even the wildest dreamer. All it requires is the judicious partnering of interaction design with programming.



What People are saying about this

Jeff Hadfield
Whether you build high-tech software or just use it, this book will change your relationship with technology. Technology should help you -- not make you feel stupid. This is the best book I've read on interaction design: Alan shows what's wrong with how computers interact with us, then how to fix these chronic problems. It's essential for anyone who uses technology. --Jeff Hadfield, Editor in Chief, Visual Basic Programmer's Journal
Jean-Louis Gassee
Frightening but true. Personal computers have engendered another New Age co-dependency. They shame us, they frustrate us, and yet we keep spending money on them. Alan Cooper's book explains why it shouldn't be so and what we can do about it. A humbling and enjoyable read. --Jean-Louis Gassee, Founder of Be, Inc. and of Apple Computer, Inc.'s French subsidiary (its largest business unit outside the US)
Clement Mok
Bravo! A lively and insightful discourse on software products from the ultimate insider. Give copies to your friends, your peers, your clients. This book will undoubtedly spark conversations if not arguments. --Clement Mok, Founder, Studio Archetype; Chief Creative Officer, Sapient


Meet the Author


Alan Cooper is the "Father of Visual Basic," according to Mitch Waite, founder of Waite Group Press. In 1994, Bill Gates gave him the rare and coveted Windows Pioneer Award, recognizing how his part in the invention of Visual Basic contributed to the success of Microsoft Windows. He also received a Software Visionary Award in 1998. He now leads Cooper Interaction Design, a consulting firm that has created breakthrough interactive product designs for 3M, Elemental, Ericsson, Fujitsu, IBM, Logitech, McGraw-Hill, Sagent, SAP, Sony, Varian, VISA, and Sun Microsystems.

Alan is also an outspoken champion of the forgotten person in the electronic product development process--the customer.

For twenty years Alan Cooper designed and developed consumer software products including SuperProject, MicroPhone II for Windows, and Microsoft's visual programming user interface for Visual Basic. In 1976 Cooper founded Structured Systems Group, Inc., a company that Fire in the Valley said produced "perhaps the first serious business software for a microcomputer."

Cooper is a member of the Corporate Design Foundation and the American Center for Design. He is a former director of the Association for Software Design's Silicon Valley Chapter and a member of the national organization's Board of Directors. Cooper is a director for both Software Design and Software Forum, as well as the founder of SEF's Windows SIG, the largest Windows developers group in the world. He is a frequent, opinionated, and engaging industry speaker and writer on the topics of user interface and conceptual software design.

