The Inmates are Running the Asylum: Why High Tech Products Drive Us Crazy and How to Restore the Sanity

Overview

Why are VCRs impossible to program? Why do car alarms make us all crazy at all the wrong times? Why do our computers reprimand us when they screw up? All of these computerized devices are wildly sophisticated and powerful, and they have proliferated across our desks, our cars, our homes and our offices. So why are they still so dauntingly complicated to use?

The highly renowned Alan Cooper, "The Father of Visual Basic," tackles this issue head-on with his new book, The Inmates are Running the Asylum, from Sams Publishing. Cooper believes that powerful and pleasurable software-based products can be created by the simple expedient of designing computer-based products first and then building them. Designing interactive software-based products is a specialty that is as demanding as the construction of these same products, Cooper says. Ironically, computerized products aren't difficult to build; they only seem so because our process for making them is out of date. To compound the problem, the costs of badly designed software are incalculable, robbing us of time, customer loyalty, competitive advantage and opportunity.

The Inmates are Running the Asylum also addresses the societal dangers of what Cooper calls "software apartheid," where otherwise normal people are kept from entering the job market and participating in society because they cannot use computers effectively. While social activists are working hard to break down race and class barriers, technologists are hard at work inadvertently erecting new, bigger ones. "By purposefully designing software-based products to be more human and forgiving, we can automatically make them more inclusive, more class- and color-blind," Cooper writes.

Using examples from his own work with companies of all sizes, Cooper offers a provocative, insightful and entertaining explanation for this phenomenon.

He believes that in part, the problem lies in the fact that business executives in the high-tech industry have relinquished their control to the engineers and techies. In the rush to accept the many benefits of the silicon chip, responsibility has been abandoned, and "the inmates have been allowed to run the asylum." The solution, Cooper says, is to harness those talents to create products that will both thrill their users and grow the bottom line.

The book is written for two new archetypes emerging in contemporary business: "The technology-savvy businessperson," who knows that his success depends on the quality of the information available to him and the sophistication with which he uses it; and "the business-savvy technologist," an entrepreneurial engineer or scientist with a keen business sense and an awareness of the power of information.

About the Author:

Alan Cooper is the "Father of Visual Basic," according to Mitch Waite, founder of Waite Group Press. In 1994, Bill Gates gave him the rare and coveted Windows Pioneer Award--recognizing how his part in the invention of VB contributed to the success of Microsoft Windows. Cooper's influential book, About Face: The Essentials of User Interface Design (IDG Books), has sold over 40,000 copies since August 1995. The title continues to sell about 300-400 copies a month. He now leads Cooper Interaction Design, a consulting firm that has created breakthrough interactive product designs for IBM, Sony, Logitech and several Internet/intranet start-ups.

For twenty years Alan Cooper designed and developed consumer software products including SuperProject, MicroPhone II for Windows, and Microsoft's visual programming user interface for Visual Basic.

Cooper is a member of the Corporate Design Foundation and American Center for Design. He is a former director of the Association for Software Design's Silicon Valley Chapter and a member of the national organization's Board of Directors.

He is a frequent, opinionated and engaging industry speaker and writer on the topics of user interface and conceptual design.


Editorial Reviews

Booknews
Armed with solutions to the dilemma of how dependent every one of us is becoming on electronic products, the author argues that, despite appearances, business executives are simply not in control of the high-tech industry. He explains how talented people continuously design bad technology-based products and uses his own work to show businesses of all sizes how to harness talent to create products that will both thrill users and grow the bottom line. Annotation © Book News, Inc., Portland, OR.

Product Details

  • ISBN-13: 9780672316494
  • Publisher: Sams
  • Publication date: 3/23/1999
  • Edition description: Older Edition
  • Pages: 261
  • Product dimensions: 6.35 (w) x 9.56 (h) x 1.04 (d)

Meet the Author


Alan Cooper is the "Father of Visual Basic," according to Mitch Waite, founder of Waite Group Press. In 1994, Bill Gates gave him the rare and coveted Windows Pioneer Award--recognizing how his part in the invention of Visual Basic contributed to the success of Microsoft Windows. He also received a Software Visionary Award in 1998. He now leads Cooper Interaction Design, a consulting firm that has created breakthrough interactive product designs for 3M, Elemental, Ericsson, Fujitsu, IBM, Logitech, McGraw-Hill, Sagent, SAP, Sony, Varian, VISA, and Sun Microsystems.

Alan is also an outspoken champion of the forgotten person in the electronic product development process--the customer.

For twenty years Alan Cooper designed and developed consumer software products including SuperProject, MicroPhone II for Windows, and Microsoft's visual programming user interface for Visual Basic. In 1976 Cooper founded Structured Systems Group, Inc., a company that Fire in the Valley said produced "perhaps the first serious business software for a microcomputer."

Cooper is a member of the Corporate Design Foundation and the American Center for Design. He is a former director of the Association for Software Design's Silicon Valley Chapter and a member of the national organization's Board of Directors. Cooper is a director for both Software Design and Software Forum, as well as the founder of SEF's Windows SIG, the largest Windows developers' group in the world. He is a frequent, opinionated, and engaging industry speaker and writer on the topics of user interface and conceptual software design.


Read an Excerpt

[Figures are not included in this sample chapter]

The Inmates Are Running the Asylum
-1-
Riddles for the Information Age

What Do You Get When You Cross a Computer with an Airplane?

In December 1995, American Airlines Flight 965 departed from Miami on a regularly scheduled trip to Cali, Colombia. On the landing approach, the pilot of the 757 needed to select the next radio navigation fix, named "ROZO." He entered an "R" into his navigation computer. The computer returned a list of nearby navigation fixes starting with "R" and the pilot selected the first of these, whose latitude and longitude appeared to be correct. Unfortunately, instead of "ROZO," the pilot selected "ROMEO," 132 miles to the northeast. The jet was southbound, descending into a valley that runs north-south, and any lateral deviation was dangerous. Following indications on the flight computer, the pilots began an easterly turn and slammed into a granite peak at 10,000 feet. One hundred and fifty-two passengers and all eight crewmembers aboard perished. Four passengers survived with serious injuries. The National Transportation Safety Board investigated, and--as usual--declared the problem human error. The navigational aid the pilots were following was valid but not for the landing procedure at Cali. In the literal definition of the phrase, this was indeed human error, because the pilot selected the wrong fix. However, in the larger picture, it wasn't the pilot's fault at all.

The front panel of the airplane's navigation computer showed the currently selected navigation fix and a course deviation indicator. When the plane is on course, the needle is centered, but the needle gives no indication whatsoever about the correctness of the selected radio beacon. The gauge looks pretty much the same just before landing as it does just before crashing. The computer told the pilot he was tracking precisely to the beacon he had selected. Unfortunately, it neglected to tell him the beacon he selected was a fatal choice.


Communications can be precise and exacting while still being tragically wrong. This happens all too frequently when we communicate with computers, and computers are invading every aspect of our modern lives. From the planes we fly to just about every consumer product and service, computers are ubiquitous, and so is their characteristically poor way of communicating and behaving.

There is a widely told joke in the computer industry that goes like this: A man is flying in a small airplane and is lost in the clouds. He descends until he spots an office building and yells to a man in an open window, "Where am I?" The man replies, "You are in an airplane about 100 feet above the ground." The pilot immediately turns to the proper course, spots the airport and lands. His astonished passenger asks how the pilot figured out which way to go. The pilot replies, "The answer the man gave me was completely correct and factual, yet it was no help whatsoever, so I knew immediately he was a software engineer who worked for Microsoft and I know where Microsoft's building is in relation to the airport."

When seen in the light of the tragedy of Flight 965, the humor of the joke is macabre, yet professionals in the digital world tell it gleefully and frequently because it highlights a fundamental truth about computers: They may tell us facts, but they don't inform us. They may guide us with precision, but they don't guide us where we want to go. The flight computer on Flight 965 could easily have told the pilots that "ROMEO" was not an appropriate fix for their approach to Cali. Even a simple hint that it was "unusual" or "unfamiliar" could have saved the airplane. Instead, it seemed as though the computer was utterly unconcerned with the actual flight and its passengers. It cared only about its own internal computations.

Hard-to-use computers affect us all, sometimes fatally. Software-based products are not inherently hard to use; they are that way because we use the wrong process for creating them. In this book, I intend to reveal this bad process by showing its effect and describing its cause. I'll then show how to change the process so that our software-based products become friendly, powerful, and desirable. First, I'll use this chapter to show how serious this problem really is.

What Do You Get When You Cross a Computer with a Camera?

Here is a riddle for the information age: What do you get when you cross a computer with a camera? Answer: A computer! Thirty years ago, my first camera, a 35mm Pentax Model H, had a small battery in it that powered the light meter. Like a wristwatch battery, I merely swapped in a new one every couple of years.

Fifteen years ago, my first electronic camera, a 35mm Canon T70, used two AA batteries to power its rather simple exposure computer and its automatic film drive. It had a simple On/Off switch, so that the batteries wouldn't wear down needlessly.

Five years ago, my filmless Logitech, a first-generation digital camera, had a similar On/Off switch, but this time it had the smarts of a rudimentary computer inside it. So if I forgot to turn it off, it automatically shut down after one minute of inactivity. Neat.

One year ago, my second-generation digital camera, a Panasonic PalmCam, had an even smarter computer chip inside it. It was so smart that its On/Off switch had evolved into an Off/Rec/Play switch. It now had modes: I had to put it into Rec mode to take pictures and Play mode to view them on its small video display.

My newest camera, a Nikon CoolPix 900, is a third-generation digital camera and the smartest yet. In fact, it has a full-blown computer that displays a Windows-like hourglass while it "boots up." Like some mutant fish with extra heads, its On/Off switch has now grown to have four settings: Off/ARec/MRec/Play. "ARec" means "automatic record" and "MRec" means "manual record." As far as I can tell, there is no difference. There is no "On" setting, and none of my friends can figure out how to turn it on without a lengthy explanation.

The new camera is very power-hungry, and its engineers thoughtfully provided it with a sophisticated computer program that manages the consumption of battery power. A typical scenario goes like this: I turn the evil off/etc. switch to "MRec," wait about seven long seconds for the camera to boot up, then point it at my subject. I aim the camera and zoom in to properly frame the image. Just as I'm about to press the shutter button, the camera suddenly realizes that simultaneously running the zoom, charging the flash, and energizing the display has caused it to run out of power. In self-defense, it suspends its ability to actually take pictures. But I don't know that because I'm looking through the viewfinder, waving my arms, and saying "Smile" and pressing the shutter button. The computer detects the button-press, but it simply cannot obey. In a misguided effort to help out, the power management program instantly takes over and makes an executive decision: Shed load. It shuts down the power-greedy LCD video display. I look at the camera quizzically, wondering why it didn't take the picture, shrug my shoulders, and let my arm holding the camera drop to my side. But as soon as the LCD is turned off, there is more battery power available for other systems. The power management program senses this increase and realizes that it now has enough electricity to take pictures. It now returns control to the camera program, which is waiting patiently to process the command it received when I pressed the shutter button, and it takes a nicely auto-focused, well-exposed, high-resolution digital picture of my kneecap.
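
What makes the kneecap picture possible is that the shutter press is remembered as a pending command while a separate power manager sheds load; when the display shuts off and power returns, the stale command is finally executed. The following rough sketch of that interplay is hypothetical (the class, numbers, and behavior are invented for illustration; this is not Nikon's actual firmware):

    # Hypothetical sketch of the interaction described above: a queued
    # shutter command outliving the moment the photographer wanted the
    # picture. None of this is Nikon's actual firmware.

    from collections import deque

    class Camera:
        def __init__(self):
            self.power_budget = 5      # arbitrary units of available power
            self.lcd_on = True         # the LCD costs 3 units while lit
            self.pending = deque()     # commands wait here until power allows

        def press_shutter(self):
            self.pending.append("take_picture")   # the press is never refused...

        def power_available(self):
            return self.power_budget - (3 if self.lcd_on else 0)

        def tick(self, subject):
            # The power manager's "executive decision": shed load when starved.
            if self.pending and self.power_available() < 4:
                self.lcd_on = False    # shed load; nothing tells the user why
                return None
            if self.pending:
                self.pending.popleft()
                return "photo of " + subject   # ...it is just executed too late
            return None

    camera = Camera()
    camera.press_shutter()                        # pressed while framing a smile
    print(camera.tick(subject="smiling friend"))  # None: the LCD is shed instead
    print(camera.tick(subject="my kneecap"))      # photo of my kneecap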

That old mechanical Pentax had manual focusing, manual exposure, and manual shutter-speed, yet it was far less frustrating to use than the fully computerized modern Nikon CoolPix 900, which has automatic focusing, exposure, and shutter-speed. The camera may still take pictures, but it behaves like a computer instead of a camera.


When a frog is slipped into a pot of cold water on the stove, he never recognizes the deadly rising temperature. Instead, the heat anesthetizes the frog's senses. I was unaware, like the frog, of my cameras' slow march from easy to hard-to-use as they slowly became computerized. We are all experiencing this same, slow, anesthetizing encroachment of computer behavior in our everyday lives.

What Do You Get When You Cross a Computer with an Alarm Clock?

A computer! I just purchased an expensive new clock radio for my bedroom, a JVC FS-2000. It has a very sophisticated computer brain, and offers high fidelity, digital sound, and lots of features. It wakes me up at a preset time by playing a compact disc, and it has the delicacy and intelligence to slowly faaaaade up the volume when it begins to play at six o'clock in the morning. This feature is really pleasant and quite unique, and it compensates for the fact that I want to hurl the infuriating machine out the window.

It's very hard to tell when the alarm is armed, so it occasionally fails to wake me up on a Monday and rousts me out of bed early on a Saturday. Sure, it has an indicator to show the alarm is set, but that doesn't mean it's useful. The clock has a sophisticated alphanumeric liquid crystal display (LCD) that displays all of its many functions. The presence of a small clock symbol in the upper-left corner of the LCD indicates the alarm is armed, but in a dimly lit bedroom the clock symbol cannot be seen. The LCD has a built-in backlight that makes the clock symbol visible, but the backlight only comes on when the CD or audio is explicitly turned on. There's a gotcha, however, as the alarm simply won't ever sound while the CD is explicitly left on, regardless of the setting of the alarm. It is this paradoxical operation that frequently catches me unawares.

It is simple to disarm the alarm: Simply press the "Alarm" button once, and the clock symbol disappears from the display. However, to arm it, I must press the "Alarm" button exactly five times. The first time I press it, the display shows me the time of the alarm. On press two, it shows the time when it will turn the sound off. On press three, it shows me whether it will play the radio or the CD. On press four, it shows me the preset volume. On press five, it returns to the normal view, but with the alarm now armed. But with just one additional press, it disarms the alarm. Sleepy, in a dark bedroom, it is quite difficult to perform this little digital ballet correctly.
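
The arming sequence described above is a small modal state machine: one button cycles through five display states, and only the fifth press leaves the alarm armed. The sketch below is a guess at that behavior (the JVC's firmware is not published; the state names are invented); it makes the "one press too many disarms it" trap easy to see.

    # Hypothetical model of the clock radio's single "Alarm" button, as the
    # excerpt describes it. Not JVC's firmware; just the five-press cycle.

    STATES = [
        "idle (alarm off)",
        "show alarm time",
        "show shut-off time",
        "show radio/CD source",
        "show preset volume",
        "idle (alarm ARMED)",
    ]

    class ClockRadio:
        def __init__(self):
            self.state = 0            # starts disarmed
            self.alarm_armed = False

        def press_alarm_button(self):
            if self.state == 5:
                # One extra press from the armed state silently disarms it.
                self.state = 0
                self.alarm_armed = False
            else:
                self.state += 1
                self.alarm_armed = (self.state == 5)
            return STATES[self.state]

    clock = ClockRadio()
    for press in range(1, 7):
        print(press, clock.press_alarm_button(), "armed:", clock.alarm_armed)
    # Presses 1-4 only show settings, press 5 arms the alarm,
    # and press 6 quietly disarms it again: the "digital ballet" in the text.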

Being a nerdy gizmologist, I continue to fiddle with the device in the hope that I will master it. My wife, however, long ago gave up on the diabolic machine. She loves the look of the sleek, modern design, and the fidelity of the sound it produces, but it failed to pass the alarm-clock test weeks ago because it is simply too hard to make work. The alarm clock may still wake me up, but it behaves like a computer.

By contrast, my old $11 non-computerized alarm clock woke me up with a sudden, unholy buzzing. When it was armed, a single red light glowed. When it was not armed, the red light was dark. I didn't like this old alarm clock for many reasons, but at least I could tell when it was going to wake me up.


Because it is far cheaper for manufacturers to use computers to control the internal functioning of devices than it is to use older, mechanical methods, it is economically inevitable that computers will insinuate themselves into every product and service in our lives. This means that the behavior of all of our products will soon be the same as that of the most obnoxious computers, unless we try something different.


This phenomenon is not restricted to consumer products. Just about every computerized device or service has more features and options than its manual counterpart. Yet, in practice, we often wield the manual devices with more flexibility, subtlety, and awareness than we do the modern versions driven by silicon-chip technology.

High-tech companies--in an effort to improve their products--are merely adding complicating and unwanted features to them. The broken process cannot solve the problem of bad products; it can only add new functions, so that is what vendors do. Later in this book I'll show how a better development process makes users happier without the extra work of adding unwanted features.

What Do You Get When You Cross a Computer with a Car?

A computer! Porsche's beautiful new high-tech sports car, the Boxster, has seven computers in it to help manage its complex systems. One of them is dedicated to managing the engine. It has special procedures built into it to deal with abnormal situations. Unfortunately, these sometimes backfire. In some early models, if the fuel level in the gas tank got very low--only a gallon or so remaining--the centrifugal force of a sharp turn could cause the fuel to collect in the side of the tank, allowing air to enter the fuel lines. The computer sensed this as a dramatic change in the incoming fuel mixture, and interpreted it as a catastrophic failure of the injection system. To prevent damage, the computer would shut down the ignition and stop the car. Also to prevent damage, the computer wouldn't let the driver restart the engine until the car had been towed to a shop and serviced.

When owners of early Boxsters first discovered this problem, the only solution Porsche could devise was to tell them to open the engine compartment and disconnect the battery for at least five minutes, giving the computer time to forget all knowledge of the hiccup. The sports car may still speed down those two-lane blacktop roads, but now, in those tight turns, it behaves like a computer.


In a laudable effort to protect Boxster owners, the programmers turned them into humiliated victims. Every performance car aficionado knows that the Porsche company is dedicated to lavishing respect and privilege on its clientele. That something like this slipped through shows that the software inside the car is not coming from the same Porsche that makes the rest of the car. It comes from a company within a company: the programmers, and not the legendary German automobile engineers. Somehow, the introduction of a new technology surprised an older, well-established company into letting some of its core values slip away. Acceptable levels of quality for software engineers are far lower than are those for more traditional engineering disciplines.

What Do You Get When You Cross a Computer with a Bank?

A computer! Whenever I withdraw cash from an automatic teller machine (ATM), I encounter the same sullen and difficult behavior so universal with computers. If I make the slightest mistake, it rejects the entire transaction and kicks me out of the process. I have to pull my card out, reinsert it, reenter my PIN code, and then re-assert my request. Typically, it wasn't my mistake, either, but the ATM computer finesses me into a misstep. It always asks me whether I want to withdraw money from my checking, savings, or money market account, even though I have only a checking account. Subsequently, I always forget which type it is, and the question confuses me. About once a month I inadvertently select "savings," and the infernal machine summarily boots me out of the entire transaction to start over from the beginning. To reject "savings," the machine has to know that I don't have a savings account, yet it still offers it to me as a choice. The only difference between me selecting "savings" and the pilot of Flight 965 selecting "ROMEO" is the magnitude of the penalty.

The ATM also restricts me to a $200 "daily withdrawal limit." If I go through all of the steps--identifying myself, choosing the account, selecting the amount--and then ask for $220, the computer unceremoniously rejects the entire transaction, informing me rudely that I have exceeded my daily withdrawal limit. It doesn't tell me what that amount is, nor does it tell me how much money is in my account, nor does it give me the opportunity to key in a new, lower amount. Instead it spits out my card and leaves me to try the whole process again from scratch, no wiser than I was a moment ago, as the line of people growing behind me shifts, shuffles, and sighs. The ATM is correct and factual, but it is no help whatsoever.

The ATM has rules that must be followed, and I am quite willing to follow them, but it is unreasonably computer-like to fail to inform me of them, give me contradictory indications, and then summarily punish me for innocently transgressing them. This behavior--so typical of computers--is not intrinsic to them. Actually nothing is intrinsic to computers: they merely act on behalf of their software, the program. And programs are as malleable as human speech. A person can speak rudely or politely, helpfully or sullenly. It is as simple for a computer to behave with respect and courtesy as it is for a human to speak that way. All it takes is for someone to describe how. Unfortunately, programmers aren't very good at teaching that to computers.
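
Describing that behavior is not a technical challenge. Here is a minimal sketch of the difference between the menu the excerpt describes and one built from what the machine already knows; the account data and function names are invented for illustration and are not any real bank's software.

    # Hypothetical sketch: two ways to build the ATM's account menu.
    # Account data and names are invented for illustration.

    customer_accounts = {"checking": 412.57}   # this customer has no savings account

    def account_menu_rude():
        # The behavior in the excerpt: offer every account type the bank
        # supports, then abort the whole transaction if the guess is wrong.
        return ["checking", "savings", "money market"]

    def account_menu_polite(accounts):
        # The machine already knows which accounts exist; offer only those.
        # With a single account there is nothing to ask, so no menu at all.
        return list(accounts) if len(accounts) > 1 else []

    print(account_menu_rude())                     # ['checking', 'savings', 'money market']
    print(account_menu_polite(customer_accounts))  # [] : no question, no chance to pick wrong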

Computers Make It Easy to Get into Trouble

Computers that sit on a desk simply behave in the same, irritating way computers always have, and they don't have to be crossed with anything. My friend Jane used to work in public relations as an account coordinator. She used Microsoft Word on her desktop PC, running Windows 95, to write memos and contracts. The core of Windows 95 is the hierarchical file system. All of Jane's documents were stored in little folders, which were stored in other little folders. Jane didn't understand this, nor did she see the advantage to storing things that way. Actually, Jane didn't give it a lot of thought, but merely took the path of least resistance.

Jane had just finished drafting the new PR contract for a Silicon Valley startup company. She selected "Close" from the "File" menu. Instead of simply doing as she directed and closing the document, Word popped up a dialog box. It was, of course, the all-too-familiar "Save Changes?" confirmation box. She responded--as always--by pressing the "Yes" button. She responded this way so consistently and often that she no longer even looked at the dialog.

The first dialog was followed immediately by another one, the equally familiar "Save As" box. It presented Jane with lots of confusing buttons, icons, and text fields. The only one that Jane understood and used was the text entry field for "File name." She typed in a likely name and then pressed the "Save" button. The program then saved the PR contract in the "My Documents" folder. Jane was so used to this unnecessary drill that she gave it no thought.

At lunchtime, while Jane was out of her office, Sunil, the company's computer tech, installed a new version of VirusKiller 2.1 on her computer. While working on Jane's PC, Sunil used Word to view a VirusKiller Readme file. After viewing the file, Sunil closed it and returned Jane's computer to exactly the way it was before lunch. At least, he thought he did.

After lunch, Jane needed to reopen the PR contract and get a printout to show to her boss. Jane selected "Open" from the "File" menu, and the "Open" dialog box appeared. Jane expected the "Open" dialog box to show her, in neat alphabetic order, all of her contracts and documents. Instead, it showed her a bunch of filenames that she had never seen before and didn't recognize. One of them was named "Readme.doc."

Of course, when Sunil used Word to view the Readme file, he instructed Jane's copy of Word to look in an obscure folder six levels deep and inadvertently steered it away from Jane's normal setting of "My Documents."

Jane was now quite bewildered. Her first, unavoidable thought was that all of her hard work had somehow been erased, and she got very worried. She called over René, her friend and coworker, but René was just as confused as Jane was. Finally, in a state approaching panic, she telephoned Sunil to ask for his help. Sunil was not at his desk and it wasn't until Monday morning that he had a chance to stop by and set things right. Jane, René, Sunil--and the PR company--each lost a half-day's productivity.

Although computer operating systems need hierarchical file systems, the people who use them don't. It's not surprising that computer programmers like to see the underlying hierarchical file systems, but it is equally unremarkable that normal users like Jane don't. Unremarkable to everyone, that is, except the programmers who create the software that we all use. They create the behavior and information presentation that they like best, which is very different from the behavior and information presentation that is best for Jane. Jane's frustration and inefficiency are blamed on Jane, and not on the programmers who torpedoed her.

At least Jane has a job. Many people are considered insufficiently "computer literate" and are thus not employable. As more and more jobs demand interaction with computers, the rift between the employable and the unemployable becomes wider and more difficult to cross. Politicians may demand jobs for the underprivileged, but without the ability to use computers, no company can afford to let them put their untrained hands on the company's computers. There is too much training involved, and too much exposure to the destruction of data and the bollixing up of priceless databases.

The obnoxious behavior and obscure interaction that software-based products exhibit is institutionalizing what I call "software apartheid," where otherwise normal people are forbidden from entering the job market and participating in society because they cannot use computers effectively. In our enlightened society, social activists are working hard to break down race and class barriers while technologists are hard at work inadvertently erecting new, bigger ones. By purposefully designing our software-based products to be more human and forgiving, we can automatically make them more inclusive, more class- and color-blind.

Commercial Software Suffers, Too

Not only are computers taking over the cockpit of jet airliners, they are taking over the passenger cabin, too, behaving in that same obstinate, perverse way that is so easy to recognize and so hard to use. Modern jet planes have in-flight entertainment (IFE) systems that deliver movies and music to airline passengers. These IFEs are merely computers connected with local area networks, just like in your office. Advanced IFE systems are generally installed only on larger airplanes flying transoceanic routes.

One airline's IFE was so frustrating for the flight attendants to use that many of them were bidding to fly shorter, local routes to avoid having to learn and use the difficult systems. This is remarkable considering that the time-honored airline route-bidding process is based on seniority, and that those same long-distance routes have always been considered the most desirable plums because of their lengthy layovers in exotic locales like Singapore or Paris. For flight attendants to bid for unglamorous, unromantic yo-yo flights from Denver-to-Dallas or LA-to-San Francisco just to avoid the IFE indicated a serious morale problem. Any airline that inflicted bad tools on its most prized employees--the ones who spent the most time with the customer--was making a foolish decision and was profligately discarding money, customer loyalty, and staff loyalty.

The computer-IFE of another large airline was even worse. The airline had created an in-flight entertainment system that linked movie delivery with the cash collection function. In a sealed jet airplane flying at 37,000 feet, cash collection procedures had typically been quite laissez-faire; after all, nobody was going to sneak out the back door. Flight attendants delivered goods and services when it was convenient and collected cash in only a very loosely coupled fashion. This kept them from running unnecessarily up and down the narrow aisles. Sure, there were occasional errors, but never more than a few dollars were involved, and the system was quite human and forgiving; everyone was happy and the work was not oppressive.

With cash-collection connected to content delivery by computer, the flight attendant had to first get the cash from the passenger, then walk all the way to the head-end of the cabin, where the attendant's console was, enter an attendant password, then perform a cash register-like transaction. Only when that transaction was completed could the passenger actually view a movie or listen to music. This inane product design forced the flight attendants to walk up and down those narrow aisles hundreds of extra times during a typical trip. Out of sheer frustration, the flight attendants would trip the circuit breaker on the in-flight entertainment system at the beginning of each long flight, shortly after departure. They would then blandly announce to the passengers that, sorry, the system was broken and there would be no movie on this flight.

The airline had spent millions of dollars constructing a system so obnoxious that its users deliberately turned it off to avoid interacting with it. The thousands of bored passengers were merely innocent victims. And this happened on long, overseas trips typically packed with much-sought-after frequent flyers. I cannot put a dollar figure on the expense this caused the airline, but I can say with conviction that it was catastrophically expensive.

The software inside the IFEs worked with flawless precision, but was a resounding failure because it misbehaved with its human keepers. How could a company fail to predict this sad result? How could it fail to see the connection? The goal of this book is to answer these questions and to show you how to avoid such high-tech debacles.

What Do You Get When You Cross a Computer with a Warship?

In September 1997, while conducting fleet maneuvers in the Atlantic, the USS Yorktown, one of the Navy's new Aegis guided-missile cruisers, stopped dead in the water. A Navy technician, while calibrating an on-board fuel valve, entered a zero into one of the shipboard management computers, a Pentium Pro running Windows NT. The program attempted to divide another number by that zero--a mathematically undefined operation--which resulted in a complete crash of the entire shipboard control system. Without the computers, the engine halted and the ship sat wallowing in the swells for two hours and forty-five minutes until it could be towed into port. Good thing it wasn't in a war zone.
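
The failure mode is easy to reproduce in any language that treats division by zero as an error. The sketch below is hypothetical (the names and structure are invented, not the Yorktown's actual shipboard software, which was never published); it shows how one unvalidated zero typed at a console can take down an entire calculation loop, and how a trivial guard confines the damage to that single bad record.

    # Hypothetical sketch: how one unvalidated zero can stop the whole "ship".
    # The names and data are invented for illustration only.

    def fuel_efficiency(distance_travelled, fuel_consumed):
        # An unguarded division: a zero in fuel_consumed raises
        # ZeroDivisionError and, if nothing catches it, kills the process.
        return distance_travelled / fuel_consumed

    def fuel_efficiency_guarded(distance_travelled, fuel_consumed):
        # The same calculation with the bad input rejected at the edge,
        # so one mistyped calibration value cannot halt the control loop.
        if fuel_consumed == 0:
            return None   # flag the record as invalid instead of crashing
        return distance_travelled / fuel_consumed

    if __name__ == "__main__":
        readings = [(12.0, 3.0), (10.0, 0.0), (8.0, 2.0)]   # second entry is the bad one

        for distance, fuel in readings:
            print(fuel_efficiency_guarded(distance, fuel))

        # Uncommenting the next line reproduces the crash the excerpt describes:
        # fuel_efficiency(10.0, 0.0)   # raises ZeroDivisionError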

What do you get when you cross a computer with a warship? Admiral Nimitz is rolling in his grave! Despite this setback, the Navy is committed to computerizing all of its ships because of the manpower cost savings, and to deflect criticism of this plan, it has blamed the "incident" on human error. Because the software creation process is out of control, the high-tech industry must either bring its process to heel or it will continue to put the blame on ordinary users while ever-bigger machines sit dead in the water.

Techno-Rage

An article in a recent issue of the Wall Street Journal described an anonymous video clip circulated widely by email that showed a "...Mustachioed Everyman in a short-sleeved shirt hunched over a computer terminal, looking puzzled. Suddenly, he strikes the side of his monitor in frustration. As a curious co-worker peers over his cubicle, the man slams the keyboard into the monitor, knocking it to the floor. Rising from his chair, he goes after the fallen monitor with a final, ferocious kick." The article went on to say that reaction to the clip had been "intense" and that it had apparently tapped into "a powerful undercurrent of techno-rage."

It's ironic that one needs to be moderately computer savvy to even send or view this video clip. While the man in the video may well be an actor, he touches a widespread, sympathetic chord in our business world. The frustration that difficult and unpleasant software-based products are bringing to our lives is rising rapidly.

Joke emails circulate on private email lists about "Computer Tourette's." This is a play on the disorder known as Tourette's Syndrome, where some sufferers engage in uncontrollable bouts of swearing. The joke is that you can walk down the halls of most modern office buildings and hear otherwise-normal people sitting in front of their monitors, jaws clenched, swearing repeatedly in a rictus of tense fury. Who knows what triggered such an outburst: a misplaced file, an inaccessible image, or a frustrating interaction. Or maybe the program just blandly erased the user's only copy of a 500-page manuscript because he responded with a "Yes" to a confirmation dialog box, assuming that it had asked him if he wanted to "save your changes?" when it actually asked him if he wanted to "discard your work?"

An Industry in Denial

We are a world awash in high-tech tools. Computers dominate the workplace and our homes, and vehicles are filling up with silicon-powered gadgets. All of these computerized devices are wildly sophisticated and powerful, but every one of them is dauntingly difficult and confusing to use.

The high-tech industry is in denial of a simple fact that every person with a cell phone or a word processor can clearly see: Our computerized tools are too hard to use. The technologists who create software and high-tech gadgets are satisfied with their efforts. The software engineers who create them have tried as hard as they can to make them easy to use and they have made some minor progress. They believe that their products are as easy to use as it is technically possible to make them. As engineers, their belief is in technology, and they have faith that only some new technology, like voice recognition or artificial intelligence, will improve the user's experience.

Ironically, the thing that will likely make the least improvement in the ease of use of software-based products is new technology. There is little difference technically between a complicated, confusing program and a simple, fun, and powerful product. The problem is one of culture, training, and attitude of the people who make them, more than it is one of chips and programming languages. We are deficient in our development process, not in our development tools.

The high-tech industry has inadvertently put programmers and engineers in charge, so their hard-to-use engineering culture dominates. Despite appearances, business executives are simply not the ones in control of the high-tech industry. It is the engineers who are running the show. In our rush to accept the many benefits of the silicon chip, we have abdicated our responsibilities. We have let the inmates run the asylum.

When the inmates run the asylum, it is hard for them to see clearly the nature of the problems that bedevil them. When you look in the mirror, it is all too easy to single out your best features and overlook the warts. When the creators of software-based products examine their handiwork, they overlook how bad it is. Instead they see its awesome power and flexibility. They see how rich the product is in features and functions. They ignore how excruciatingly difficult it is to use, how many mind-numbing hours it takes to learn, or how it diminishes and degrades the people who must use it in their everyday lives.

The Origins of This Book

I have been inventing and developing software-based products for twenty-five years. This problem of hard-to-use software has puzzled and confounded me for years. Finally, in 1992, I ceased all programming to devote one hundred percent of my time to helping other development firms make their products easier to use. And a wonderful thing happened! I immediately discovered that after I freed myself from the demands of programming, I saw for the first time how powerful and compelling those demands were. Programming is such a difficult and absorbing task that it dominates all other considerations, including the concerns of the user. I could only see this after I had extricated myself from its grip.

Upon making this discovery, I began to see what influences drove software-based products to be so bad from the user's point of view. In 1995 I wrote a book about what I learned, and it has had a significant effect on the way some software is designed today.

To be a good programmer, one must be sympathetic to the nature and needs of the computer. But the nature and needs of the computer are utterly alien from the nature and needs of the human being who will eventually use it. The creation of software is so intellectually demanding, so all-consuming, that programmers must completely immerse themselves in an equally alien thought process. In the programmer's mind, the demands of the programming process not only supersede any demands from the outside world of users, but the very languages of the two worlds are at odds with each other.

The process of programming subverts the process of making easy-to-use products for the simple reason that the goals of the programmer and the goals of the user are dramatically different. The programmer wants the construction process to be smooth and easy. The user wants the interaction with the program to be smooth and easy. These two objectives almost never result in the same program. In the computer industry today, the programmers are given the responsibility to create interaction that makes t he user happy, but in the unrelenting grip of this conflict of interest, they simply cannot do so.

In software, typically nothing is visible until it is done, meaning that any second-guessing by non-programmers is too late to be effective. Desktop computer software is infamously hard to use because it is purely the product of programmers; nobody comes between them and the user. Objects like phones and cameras have always had a hefty mechanical component that forced them into the open for review. But as we've established, when you cross a computer with just about any product, the behavior of the computer dominates completely.

The key to solving the problem is interaction design. We need a new class of professional interaction designers who design the way software behaves. Today, programmers consciously design the "code" inside programs but only inadvertently design the interaction with humans. They design what it does but not how it behaves, communicates, or informs. Conversely, interaction designers focus directly on the way users see and interact with software-based products. This craft of interaction design is new and unfamiliar to programmers, so--when they admit it at all--they let it in only after their programming is already completed. At that point, it is too late.

The people who manage the creation of software-based products are typically either hostage to programmers because they are insufficiently technical, or they are all too sympathetic to programmers because they are programmers themselves. The people who use software-based products are simply unaware that those products can be as pleasurable to use and as powerful as any other well-designed tool.

Programmers aren't evil. They work hard to make their software easy to use. Unfortunately, their frame of reference is themselves, so they only make it easy to use for other software engineers, not for normal human beings.

The costs of badly designed software are incalculable. The cost of Jane and Sunil's time, the cost of offended air travelers, and the cost of the lives of passengers on Flight 965 cannot easily be quantified. The greatest cost, though, is the opportunity we are squandering. While we let our products frustrate, cost, confuse, irritate, and kill us, we are not taking advantage of the real promise of software-based products: to be the most human and powerful and pleasurable creations ever imagined. Because software truly is malleable far beyond any other media, it has the potential to go well beyond the expectations of even the wildest dreamer. All it requires is the judicious partnering of interaction design with programming.



Table of Contents

I. COMPUTER OBLITERACY.

1. Riddles for the Information Age.
What Do You Get When You Cross a Computer with an Airplane? What Do You Get When You Cross a Computer with a Camera? What Do You Get When You Cross a Computer with an Alarm Clock? What Do You Get When You Cross a Computer with a Car? What Do You Get When You Cross a Computer with a Bank? Computers Make it Easy to Get into Trouble. Commercial Software Suffers, Too. What Do You Get When You Cross a Computer with a Warship? Techno-Rage. An Industry in Denial. The Origins of This Book.

2. Cognitive Friction.
Behavior Unconnected to Physical Forces. Design Is a Big Word. The Relationship Between Programmers and Designers. Most Software is Designed by Accident. “Interaction” Versus “Interface” Design. Why Software-Based Products are Different. The Dancing Bear. The Cost of Features. Apologists and Survivors. How We React to Cognitive Friction. The Democratization of Consumer Power. Blaming the User. Software Apartheid.

II. IT COSTS YOU BIG TIME.

3. Wasting Money.
Deadline Management. What Does “Done” Look Like? Parkinson's Law. The Product That Never Ships. Shipping Late Doesn't Hurt. Feature List Bargaining. Programmers Are in Control. Features Are Not Necessarily Good. Iteration and the Myth of the Unpredictable Market. The Hidden Costs of Bad Software. The Only Thing More Expensive Than Writing Software is Writing Bad Software. Opportunity Cost. The Cost of Prototyping.

4. The Dancing Bear.
If it Were a Problem, Wouldn't It Have Been Solved by Now? Consumer Electronics Victim. How Email Programs Fail. How Scheduling Programs Fail. How Calendar Software Fails. Mass Web Hysteria. What's Wrong with Software? Software Forgets. Software is Lazy. Software is Parsimonious with Information. Software is Inflexible. Software Blames Users. Software Won't Take Responsibility.

5. Customer Disloyalty.
Desirability. A Comparison. Time to Market.

III. EATING SOUP WITH A FORK.

6. The Inmates are Running the Asylum.
Driving from the Backseat. Hatching a Catastrophe. Computers Versus Humans. Teaching Dogs to Be Cats.

7. Homo Logicus.
The Jetway Test. The Psychology of Computer Programmers. Programmers Trade Simplicity for Control. Programmers Exchange Success for Understanding. Programmers Focus on What is Possible to the Exclusion of What is Probable. Programmers Act Like Jocks.

8. An Obsolete Culture.
The Culture of Programming. Reusing Code. The Common Culture. Programming Culture at Microsoft. Cultural Isolation. Skin in the Game. Scarcity Thinking. The Process is Dehumanizing, Not the Technology.

IV. INTERACTION DESIGN IS GOOD BUSINESS.

9. Designing for Pleasure.
Personas. Design for Just One Person. The Roll-Aboard Suitcase and Sticky Notes. The Elastic User. Be Specific. Hypothetical. Precision, Not Accuracy. A Realistic Look at Skill Levels. Personas End Feature Debates. Both Designers and Programmers Need Personas. It's a User Persona, Not a Buyer Persona. The Cast of Characters. Primary Personas. Case Study: Sony Trans Com's Passport. The Conventional Solution. Personas. Designing for Clevis.

10. Designing for Power.
Goals are the Reason Why We Perform Tasks. Tasks Are Not Goals. Programmers Do Task-Directed Design. Goal-Directed Design. Goal-Directed Television News. Goal-Directed Classroom Management. Personal and Practical Goals. The Principle of Commensurate Effort. Personal Goals. Corporate Goals. Practical Goals. False Goals. Computers Are Human, Too. Designing for Politeness. What is Polite? What Makes Software Polite? Polite Software is Interested in Me. Polite Software is Deferential to Me. Polite Software is Forthcoming. Polite Software has Common Sense. Polite Software Anticipates My Needs. Polite Software is Responsive. Polite Software is Taciturn About its Personal Problems. Polite Software is Well Informed. Polite Software is Perceptive. Polite Software is Self-Confident. Polite Software Stays Focused. Polite Software is Fudgable. Polite Software Gives Instant Gratification. Polite Software is Trustworthy. Case Study: Elemental Drumbeat. The Investigation. Who Serves Whom. The Design. Pushback. Other Issues.

11. Designing for People.
Scenarios. Daily Use Scenarios. Necessary Use Scenarios. Edge Case Scenario. Inflecting the Interface. Perpetual Intermediates. Pretend it's Magic. Vocabulary. Breaking Through with Language. Reality Bats Last. Case Study: Logitech ScanMan. Malcolm, the Web-Warrior. Chad Marchetti, Boy. Magnum, DPI. Playing Pretend It's Magic. World-Class Cropping. World-Class Image Resize. World-Class Image Reorient. World-Class Results. Bridging Hardware and Software. Less is More.

V. GETTING BACK INTO THE DRIVER'S SEAT.

12. Desperately Seeking Usability.
The Timing. User Testing. User Testing Before Programming. Fitting Usability Testing into the Process. Multidisciplinary Teams. Programmers Designing. How Do You Know? Style Guides. Conflict of Interest. Focus Groups. Visual Design. Industrial Design. Cool New Technology. Iteration.

13. A Managed Process.
Who Really Has the Most Influence? The Customer-Driven Death Spiral. Conceptual Integrity is a Core Competence. A Faustian Bargain. Taking a Longer View. Taking Responsibility. Taking Time. Taking Control. Finding Bedrock. Knowing Where to Cut. Making Movies. The Deal. Document Design to Get it Built. Design Affects the Code. Design Documents Benefit Programmers. Design Documents Benefit Marketing. Design Documents Help Documenters and Tech Support. Design Documents Help Managers. Design Documents Benefit the Whole Company. Who Owns Product Quality? Creating a Design-Friendly Process. Where Interaction Designers Come From. Building Design Teams.

14. Power and Pleasure.
An Example of a Well-Run Project. A Company-Wide Awareness of Design. Benefits of Change. Let Them Eat Cake. Changing the Process.

Index.

Read More Show Less

First Chapter

[Figures are not included in this sample chapter]

The Inmates Are Running the Asylum
-1-
Riddles for the Information Age

What Do You Get When You Cross a Computer with an Airplane?

InDecember 1995, American Airlines Flight 965 departed from Miami on a regularlyscheduled trip to Cali, Columbia. On the landing approach, the pilot of the 757needed to select the next radio navigation fix, named "ROZO." He entered an "R"into his navigation computer. The computer returned a list of nearby navigationfixes starting with "R" and the pilot selected the first of these, whoselatitude and longitude appeared to be correct. Unfortunately, instead of"ROZO," the pilot selected "ROMEO," 132 miles to the northeast. The jet wassouthbound, descending into a valley that runs north-south, and any lateraldeviation was dangerous. Following indications on the flight computer, thepilots began an easterly turn and slammed into a granite peak at 10,000 feet.One hundred and fifty two passengers and all eight crewmembers aboard perished.Four passengers survived with serious injuries. The National TransportationSafety Board investigated, and--as usual--declared the problem human error. Thenavigational aid the pilots were following was valid but not for the landingprocedure at Cali. In the literal definition of the phrase, this was indeedhuman error, because the pilot selected the wrong fix. However, in the largerpicture, it wasn't the pilot's fault at all.

The front pan el of the airplane's navigation computer showed the currentlyselected navigation fix and a course deviation indicator. When the plane is oncourse, the needle is centered, but the needle gives no indication whatsoeverabout the correctness of the selected radio beacon. The gauge looks pretty muchthe same just before landing as it does just before crashing. The computer toldthe pilot he was tracking precisely to the beacon he had selected.Unfortunately, it neglected to tell him the beacon he selected was a fatalchoice.


Communications can be precise and exacting while still being tragicallywrong. This happens all too frequently when we communicate with computers, andcomputers are invading every aspect of our modern lives. From the planes we flyto just about every consumer product and service, computers are ubiquitous, andso is their characteristically poor way of communicating and behaving.

There is a widely told joke in the computer industry that goes like this: Aman is flying in a small airplane and is lost in the clouds. He descends untilhe spots an office building and yells to a man in an open window, "Where am I?"The man replies, "You are in an airplane about 100 feet above the ground." Thepilot immediately turns to the proper course, spots the airport and lands. Hisastonished passenger asks how the pilot figured out which way to go. The pilotreplies, "The answer the man gave me was completely correct and factual, yet itwas no help whatsoever, so I knew immediately he was a software engineer whoworked for Microsoft and I know where Microsoft's building is in relation tothe airport."

When seen in the light of the tragedy of Flight 965, the humor of the jokeis macabr e, yet professionals in the digital world tell it gleefully andfrequently because it highlights a fundamental truth about computers: They maytell us facts, but they don't inform us. They may guide us with precision, butthey don't guide us where we want to go. The flight computer on Flight 965could easily have told the pilots that "ROMEO" was not an appropriate fix fortheir approach to Cali. Even a simple hint that it was "unusual" or"unfamiliar" could have saved the airplane. Instead, it seemed as though thecomputer was utterly unconcerned with the actual flight and its passengers. Itcared only about its own internal computations.

Hard-to-use computers affect us all, sometimes fatally. Software-basedproducts are not inherently hard to use; they are that way because weuse the wrong process for creating them. In this book, I intend to reveal thisbad process by showing its effect and describing its cause. I'll then show howto change the process so that our software-based products become friendly,powerful, and desirable. First, I'll use this chapter to show how serious thisproblem really is.

What Do You Get When You Cross a Computer with a Camera?

Hereis a riddle for the information age: What do you get when you cross a computerwith a camera? Answer: A computer! Thirty years ago, my first camera, a 35mmPentax Model H, had a small battery in it that powered the light meter. Like awristwatch battery, I merely swapped in a new one every couple of years.

Fifteen years ago, my first electronic camera, a 35mm Canon T70, used two AAbatteries to power its rather simple exposure computer and its automatic filmdrive. It had a simple On/Off switch, so that th e batteries wouldn't wear downneedlessly.

Five years ago, my filmless Logitech, a first-generation digital camera, had a similar On/Off switch, but this time it had the smarts of a rudimentary computer inside it. So if I forgot to turn it off, it automatically shut down after one minute of inactivity. Neat.

One year ago, my second-generation digital camera, a Panasonic PalmCam, had an even smarter computer chip inside it. It was so smart that its On/Off switch had evolved into an Off/Rec/Play switch. It now had modes: I had to put it into Rec mode to take pictures and Play mode to view them on its small video display.

My newest camera, a Nikon CoolPix 900, is a third-generation digital camera and the smartest yet. In fact, it has a full-blown computer that displays a Windows-like hourglass while it "boots up." Like some mutant fish with extra heads, its On/Off switch has now grown to have four settings: Off/ARec/MRec/Play. "ARec" means "automatic record" and "MRec" means "manual record." As far as I can tell, there is no difference. There is no "On" setting, and none of my friends can figure out how to turn it on without a lengthy explanation.

The new camera is very power-hungry, and its engineers thoughtfully provided it with a sophisticated computer program that manages the consumption of battery power. A typical scenario goes like this: I turn the evil off/etc. switch to "MRec," wait about seven long seconds for the camera to boot up, then point it at my subject. I aim the camera and zoom in to properly frame the image. Just as I'm about to press the shutter button, the camera suddenly realizes that simultaneously running the zoom, charging the flash, and energizing the display has caused it to run out of power. In self-defense, it suspends its ability to actually take pictures. But I don't know that because I'm looking through the viewfinder, waving my arms, and saying "Smile" and pressing the shutter button. The computer detects the button-press, but it simply cannot obey. In a misguided effort to help out, the power management program instantly takes over and makes an executive decision: Shed load. It shuts down the power-greedy LCD video display. I look at the camera quizzically, wondering why it didn't take the picture, shrug my shoulders, and let my arm holding the camera drop to my side. But as soon as the LCD is turned off, there is more battery power available for other systems. The power management program senses this increase and realizes that it now has enough electricity to take pictures. It now returns control to the camera program, which is waiting patiently to process the command it received when I pressed the shutter button, and it takes a nicely auto-focused, well-exposed, high-resolution digital picture of my kneecap.

That old mechanical Pentax had manual focusing, manual exposure, and manual shutter-speed, yet it was far less frustrating to use than the fully computerized modern Nikon CoolPix 900, which has automatic focusing, exposure, and shutter-speed. The camera may still take pictures, but it behaves like a computer instead of a camera.


When a frog is slipped into a pot of cold water on the stove, he never recognizes the deadly rising temperature. Instead, the heat anesthetizes the frog's senses. I was unaware, like the frog, of my cameras' slow march from easy to hard-to-use as they slowly became computerized. We are all experiencing this same, slow, anesthetizing encroachment of computer behavior in our everyday lives.

What Do You Get When You Cross a Computer with an Alarm Clock?

A computer! I just purchased an expensive new clock-radio for my bedroom, a JVC FS-2000. It has a very sophisticated computer brain, and offers high fidelity, digital sound, and lots of features. It wakes me up at a preset time by playing a compact disc, and it has the delicacy and intelligence to slowly faaaaade up the volume when it begins to play at six o'clock in the morning. This feature is really pleasant and quite unique, and it compensates for the fact that I want to hurl the infuriating machine out the window.

It's very hard to tell when the alarm is armed, so it occasionally fails to wake me up on a Monday and rousts me out of bed early on a Saturday. Sure, it has an indicator to show the alarm is set, but that doesn't mean it's useful. The clock has a sophisticated alphanumeric liquid crystal display (LCD) that displays all of its many functions. The presence of a small clock symbol in the upper-left corner of the LCD indicates the alarm is armed, but in a dimly lit bedroom the clock symbol cannot be seen. The LCD has a built-in backlight that makes the clock symbol visible, but the backlight only comes on when the CD or radio is explicitly turned on. There's a gotcha, however, as the alarm simply won't ever sound while the CD is explicitly left on, regardless of the setting of the alarm. It is this paradoxical operation that frequently catches me unawares.

It is simple to disarm the alarm: Simply press the "Alarm" button once, and the clock symbol disappears from the display. However, to arm it, I must press the "Alarm" button exactly five times. The first time I press it, the display shows me the time of the alarm. On press two, it shows the time when it will turn the sound off. On press three, it shows me whether it will play the radio or the CD. On press four, it shows me the preset volume. On press five, it returns to the normal view, but with the alarm now armed. But with just one additional press, it disarms the alarm. Sleepy, in a dark bedroom, it is quite difficult to perform this little digital ballet correctly.

Being a nerdy gizmologist, I continue to fiddle with the device in the hope that I will master it. My wife, however, long ago gave up on the diabolic machine. She loves the look of the sleek, modern design, and the fidelity of the sound it produces, but it failed to pass the alarm-clock test weeks ago because it is simply too hard to make work. The alarm clock may still wake me up, but it behaves like a computer.

By contrast, my old $11 non-computerized alarm clock woke me up with a sudden, unholy buzzing. When it was armed, a single red light glowed. When it was not armed, the red light was dark. I didn't like this old alarm clock for many reasons, but at least I could tell when it was going to wake me up.


Because it is far cheaper for manufacturers to use computers to control the internal functioning of devices than it is to use older, mechanical methods, it is economically inevitable that computers will insinuate themselves into every product and service in our lives. This means that the behavior of all of our products will soon be the same as that of most obnoxious computers, unless we try something different.


This phenomenon is not restricted to consumer products. Just about every computerized device or service has more features and options than its manual counterpart. Yet, in practice, we often wield the manual devices with more flexibility, subtlety, and awareness than we do the modern versions driven by silicon-chip technology.

High-tech companies--in an effort to improve their products--are merely adding complicating and unwanted features to them. Because the broken process cannot solve the problem of bad products, but can only add new functions, that is what vendors do. Later in this book I'll show how a better development process makes users happier without the extra work of adding unwanted features.

What Do You Get When You Cross a Computer with a Car?

A computer! Porsche's beautiful new high-tech sports car, the Boxster, has seven computers in it to help manage its complex systems. One of them is dedicated to managing the engine. It has special procedures built into it to deal with abnormal situations. Unfortunately, these sometimes backfire. In some early models, if the fuel level in the gas tank got very low--only a gallon or so remaining--the centrifugal force of a sharp turn could cause the fuel to collect in the side of the tank, allowing air to enter the fuel lines. The computer sensed this as a dramatic change in the incoming fuel mixture, and interpreted it as a catastrophic failure of the injection system. To prevent damage, the computer would shut down the ignition and stop the car. Also to prevent damage, the computer wouldn't let the driver restart the engine until the car had been towed to a shop and serviced.

When owners of early Boxsters first discovered this problem, the only solution Porsche could devise was to tell them to open the engine compartment and disconnect the battery for at least five minutes, giving the computer time to forget all knowledge of the hiccup. The sports car may still speed down those two-lane blacktop roads, but now, in those tight turns, it behaves like a computer.


In a laudable effort to protect Boxster owners, the programmers turned them into humiliated victims. Every performance car aficionado knows that the Porsche company is dedicated to lavishing respect and privilege on its clientele. That something like this slipped through shows that the software inside the car is not coming from the same Porsche that makes the rest of the car. It comes from a company within a company: the programmers, and not the legendary German automobile engineers. Somehow, the introduction of a new technology surprised an older, well-established company into letting some of its core values slip away. Acceptable levels of quality for software engineers are far lower than are those for more traditional engineering disciplines.

What Do You Get When You Cross a Computer with a Bank?

A computer! Whenever I withdraw cash from an automatic teller machine (ATM), I encounter the same sullen and difficult behavior so universal with computers. If I make the slightest mistake, it rejects the entire transaction and kicks me out of the process. I have to pull my card out, reinsert it, reenter my PIN code, and then re-assert my request. Typically, it wasn't my mistake, either, but the ATM computer finesses me into a misstep. It always asks me whether I want to withdraw money from my checking, savings, or money market account, even though I have only a checking account. Subsequently, I always forget which type it is, and the question confuses me. About once a month I inadvertently select "savings," and the infernal machine summarily boots me out of the entire transaction to start over from the beginning. To reject "savings," the machine has to know that I don't have a savings account, yet it still offers it to me as a choice. The only difference between me selecting "savings" and the pilot of Flight 965 selecting "ROMEO" is the magnitude of the penalty.

The ATM also restricts me to a $200 "daily withdrawal limit." If I go through all of the steps--identifying myself, choosing the account, selecting the amount--and then ask for $220, the computer unceremoniously rejects the entire transaction, informing me rudely that I have exceeded my daily withdrawal limit. It doesn't tell me what that amount is, nor does it tell me how much money is in my account, nor does it give me the opportunity to key in a new, lower amount. Instead it spits out my card and leaves me to try the whole process again from scratch, no wiser than I was a moment ago, as the line of people growing behind me shifts, shuffles, and sighs. The ATM is correct and factual, but it is no help whatsoever.

The ATM has rules that must be followed, and I am quite willing to follow them, but it is unreasonably computer-like to fail to inform me of them, give me contradictory indications, and then summarily punish me for innocently transgressing them. This behavior--so typical of computers--is not intrinsic to them. Actually nothing is intrinsic to computers: they merely act on behalf of their software, the program. And programs are as malleable as human speech. A person can speak rudely or politely, helpfully or sullenly. It is as simple for a computer to behave with respect and courtesy as it is for a human to speak that way. All it takes is for someone to describe how. Unfortunately, programmers aren't very good at teaching that to computers.

Computers Make It Easy to Get into Trouble

Computers that sit on a desk simply behave in the same, irritating way computers always have, and they don't have to be crossed with anything. My friend Jane used to work in public relations as an account coordinator. She used Microsoft Word on her desktop PC, running Windows 95, to write memos and contracts. The core of Windows 95 is the hierarchical file system. All of Jane's documents were stored in little folders, which were stored in other little folders. Jane didn't understand this, nor did she see the advantage to storing things that way. Actually, Jane didn't give it a lot of thought, but merely took the path of least resistance.

Jane had just finished drafting the new PR contract for a Silicon Valley startup company. She selected "Close" from the "File" menu. Instead of simply doing as she directed and closing the document, Word popped up a dialog box. It was, of course, the all-too-familiar "Save Changes?" confirmation box. She responded--as always--by pressing the "Yes" button. She responded this way so consistently and often that she no longer even looked at the dialog.

The first dialog was followed immediately by another one, the equally familiar "Save As" box. It presented Jane with lots of confusing buttons, icons, and text fields. The only one that Jane understood and used was the text entry field for "File name." She typed in a likely name and then pressed the "Save" button. The program then saved the PR contract in the "My Documents" folder. Jane was so used to this unnecessary drill that she gave it no thought.

At lunchtime, while Jane was out of her office, Sunil, the company's computer tech, installed a new version of VirusKiller 2.1 on her computer. While working on Jane's PC, Sunil used Word to view a VirusKiller Readme file. After viewing the file, Sunil closed it and returned Jane's computer to exactly the way it was before lunch. At least, he thought he did.

After lunch, Jane needed to reopen the PR contract and get a printout to show to her boss. Jane selected "Open" from the "File" menu, and the "Open" dialog box appeared. Jane expected the "Open" dialog box to show her, in neat alphabetic order, all of her contracts and documents. Instead, it showed her a bunch of filenames that she had never seen before and didn't recognize. One of them was named "Readme.doc."

Of course, when Sunil used Word to view the Readme file, he instructed Jane's copy of Word to look in an obscure folder six levels deep and inadvertently steered it away from Jane's normal setting of "My Documents."

Jane was now quite bewildered. Her first, unavoidable thought was that all of her hard work had somehow been erased, and she got very worried. She called over René, her friend and coworker, but René was just as confused as Jane was. Finally, in a state approaching panic, she telephoned Sunil to ask for his help. Sunil was not at his desk, and it wasn't until Monday morning that he had a chance to stop by and set things right. Jane, René, Sunil--and the PR company--each lost a half-day's productivity.

Although computer operating systems need hierarchical file systems, the people who use them don't. It's not surprising that computer programmers like to see the underlying hierarchical file systems, but it is equally unremarkable that normal users like Jane don't. Unremarkable to everyone, that is, except the programmers who create the software that we all use. They create the behavior and information presentation that they like best, which is very different from the behavior and information presentation that is best for Jane. Jane's frustration and inefficiency are blamed on Jane, and not on the programmers who torpedoed her.

At least Jane has a job. Many people are considered insufficiently "computer literate" and are thus not employable. As more and more jobs demand interaction with computers, the rift between the employable and the unemployable becomes wider and more difficult to cross. Politicians may demand jobs for the underprivileged, but without the ability to use computers, no company can afford to let them put their untrained hands on the company's computers. There is too much training involved, and too much exposure to the destruction of data and the bollixing up of priceless databases.

The obnoxious behavior and obscure interaction that software-based products exhibit are institutionalizing what I call "software apartheid," where otherwise normal people are forbidden from entering the job market and participating in society because they cannot use computers effectively. In our enlightened society, social activists are working hard to break down race and class barriers while technologists are hard at work inadvertently erecting new, bigger ones. By purposefully designing our software-based products to be more human and forgiving, we can automatically make them more inclusive, more class- and color-blind.

Commercial Software Suffers, Too

Not only are computers taking over the cockpit of jet airliners, they are taking over the passenger cabin, too, behaving in that same obstinate, perverse way that is so easy to recognize and so hard to use. Modern jet planes have in-flight entertainment (IFE) systems that deliver movies and music to airline passengers. These IFEs are merely computers connected with local area networks, just like in your office. Advanced IFE systems are generally installed only on larger airplanes flying transoceanic routes.

One airline's IFE was so frustrating for the flight attendants to use that many of them were bidding to fly shorter, local routes to avoid having to learn and use the difficult systems. This is remarkable considering that the time-honored airline route-bidding process is based on seniority, and that those same long-distance routes have always been considered the most desirable plums because of their lengthy layovers in exotic locales like Singapore or Paris. For flight attendants to bid for unglamorous, unromantic yo-yo flights from Denver-to-Dallas or LA-to-San Francisco just to avoid the IFE indicated a serious morale problem. Any airline that inflicted bad tools on its most prized employees--the ones who spent the most time with the customer--was making a foolish decision and was profligately discarding money, customer loyalty, and staff loyalty.

The computer-IFE of another large airline was even worse. The airline had created an in-flight entertainment system that linked movie delivery with the cash collection function. In a sealed jet airplane flying at 37,000 feet, cash collection procedures had typically been quite laissez-faire; after all, nobody was going to sneak out the back door. Flight attendants delivered goods and services when it was convenient and collected cash in only a very loosely coupled fashion. This kept them from running unnecessarily up and down the narrow aisles. Sure, there were occasional errors, but never more than a few dollars were involved, and the system was quite human and forgiving; everyone was happy and the work was not oppressive.

With cash-collection connected to content delivery by computer, the flight attendant had to first get the cash from the passenger, then walk all the way to the head-end of the cabin, where the attendant's console was, enter an attendant password, then perform a cash register-like transaction. Only when that transaction was completed could the passenger actually view a movie or listen to music. This inane product design forced the flight attendants to walk up and down those narrow aisles hundreds of extra times during a typical trip. Out of sheer frustration, the flight attendants would trip the circuit breaker on the in-flight entertainment system at the beginning of each long flight, shortly after departure. They would then blandly announce to the passengers that, sorry, the system was broken and there would be no movie on this flight.

The airline had spent millions of dollars constructing a system so obnoxious that its users deliberately turned it off to avoid interacting with it. The thousands of bored passengers were merely innocent victims. And this happened on long, overseas trips typically packed with much-sought-after frequent flyers. I cannot put a dollar figure on the expense this caused the airline, but I can say with conviction that it was catastrophically expensive.

The software inside the IFEs worked with flawless precision, but was a resounding failure because it misbehaved with its human keepers. How could a company fail to predict this sad result? How could it fail to see the connection? The goal of this book is to answer these questions and to show you how to avoid such high-tech debacles.

What Do You Get When You Cross a Computer with a Warship?

In September of 1997, while conducting fleet maneuvers in the Atlantic, the USS Yorktown, one of the Navy's new Aegis guided-missile cruisers, stopped dead in the water. A Navy technician, while calibrating an on-board fuel valve, entered a zero into one of the shipboard management computers, a Pentium Pro running Windows NT. The program attempted to divide another number by that zero--a mathematically undefined operation--which resulted in a complete crash of the entire shipboard control system. Without the computers, the engine halted and the ship sat wallowing in the swells for two hours and forty-five minutes until it could be towed into port. Good thing it wasn't in a war zone.
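
For readers who want to see how a single zero can take a whole program down with it, here is a minimal sketch in Python with hypothetical function names; it is not the Yorktown's shipboard software, only an illustration of an unguarded division versus the guarded version the passage implies should have existed.

    # Hypothetical illustration only -- not the actual shipboard code.
    def fuel_flow_ratio(reading, calibration):
        # Unguarded: a calibration value of 0 raises ZeroDivisionError, and if
        # nothing above this call catches it, the whole process dies with it.
        return reading / calibration

    def fuel_flow_ratio_safe(reading, calibration):
        # Guarded: reject the bad input, report it, and keep the system running.
        if calibration == 0:
            print("Calibration value of zero rejected; keeping last known ratio.")
            return None
        return reading / calibration

    if __name__ == "__main__":
        print(fuel_flow_ratio_safe(42.0, 0))  # handled: prints a message, continues
        print(fuel_flow_ratio(42.0, 0))       # unhandled: crashes right here

The difference between the two functions is a two-line guard; the difference in outcome is a warning message versus a cruiser dead in the water.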

What do you get when you cross a computer with a warship? Admiral Nimitz is rolling in his grave! Despite this setback, the Navy is committed to computerizing all of its ships because of the manpower cost savings, and to deflect criticism of this plan, it has blamed the "incident" on human error. Because the software creation process is out of control, the high-tech industry must bring its process to heel, or it will continue to put the blame on ordinary users while ever-bigger machines sit dead in the water.

Techno-Rage

An article in a recent issue of the Wall Street Journal described an anonymous video clip circulated widely by email that showed a "...Mustachioed Everyman in a short-sleeved shirt hunched over a computer terminal, looking puzzled. Suddenly, he strikes the side of his monitor in frustration. As a curious co-worker peers over his cubicle, the man slams the keyboard into the monitor, knocking it to the floor. Rising from his chair, he goes after the fallen monitor with a final, ferocious kick." The article went on to say that reaction to the clip had been "intense" and that it had apparently tapped into "a powerful undercurrent of techno-rage."

It's ironic that one needs to be moderately computer savvy to even send or view this video clip. While the man in the video may well be an actor, he touches a widespread, sympathetic chord in our business world. The frustration that difficult and unpleasant software-based products are bringing to our lives is rising rapidly.

Joke emails circulate on private email lists about "Computer Tourette's." This is a play on the disorder known as Tourette's Syndrome, where some sufferers engage in uncontrollable bouts of swearing. The joke is that you can walk down the halls of most modern office buildings and hear otherwise-normal people sitting in front of their monitors, jaws clenched, swearing repeatedly in a rictus of tense fury. Who knows what triggered such an outburst: a misplaced file, an inaccessible image, or a frustrating interaction. Or maybe the program just blandly erased the user's only copy of a 500-page manuscript because he responded with a "Yes" to a confirmation dialog box, assuming that it had asked him if he wanted to "save your changes?" when it actually asked him if he wanted to "discard your work?"

An Industry in Denial

We are a world awash in high-tech tools. Computers dominate the workplace and our homes, and vehicles are filling up with silicon-powered gadgets. All of these computerized devices are wildly sophisticated and powerful, but every one of them is dauntingly difficult and confusing to use.

The high-tech industry is in denial of a simple fact that every person with a cell phone or a word processor can clearly see: Our computerized tools are too hard to use. The technologists who create software and high-tech gadgets are satisfied with their efforts. The software engineers who create them have tried as hard as they can to make them easy to use and they have made some minor progress. They believe that their products are as easy to use as it is technically possible to make them. As engineers, their belief is in technology, and they have faith that only some new technology, like voice recognition or artificial intelligence, will improve the user's experience.

Ironically, the thing that will likely make the least improvement in the ease of use of software-based products is new technology. There is little difference technically between a complicated, confusing program and a simple, fun, and powerful product. The problem is one of culture, training, and attitude of the people who make them, more than it is one of chips and programming languages. We are deficient in our development process, not in our development tools.

The high-tech industry has inadvertently put programmers and engineers in charge, so their hard-to-use engineering culture dominates. Despite appearances, business executives are simply not the ones in control of the high-tech industry. It is the engineers who are running the show. In our rush to accept the many benefits of the silicon chip, we have abdicated our responsibilities. We have let the inmates run the asylum.

When the inmates run the asylum, it is hard for them to see clearly the nature of the problems that bedevil them. When you look in the mirror, it is all too easy to single out your best features and overlook the warts. When the creators of software-based products examine their handiwork, they overlook how bad it is. Instead they see its awesome power and flexibility. They see how rich the product is in features and functions. They ignore how excruciatingly difficult it is to use, how many mind-numbing hours it takes to learn, or how it diminishes and degrades the people who must use it in their everyday lives.

The Origins of This Book

I have been inventing and developing software-based products for twenty-five years. This problem of hard-to-use software has puzzled and confounded me for years. Finally, in 1992, I ceased all programming to devote one hundred percent of my time to helping other development firms make their products easier to use. And a wonderful thing happened! I immediately discovered that after I freed myself from the demands of programming, I saw for the first time how powerful and compelling those demands were. Programming is such a difficult and absorbing task that it dominates all other considerations, including the concerns of the user. I could only see this after I had extricated myself from its grip.

Upon making this discovery, I began to see what influences drove software-based products to be so bad from the user's point of view. In 1995 I wrote a book about what I learned, and it has had a significant effect on the way some software is designed today.

To be a good programmer, one must be sympathetic to the nature and needs of the computer. But the nature and needs of the computer are utterly alien from the nature and needs of the human being who will eventually use it. The creation of software is so intellectually demanding, so all-consuming, that programmers must completely immerse themselves in an equally alien thought process. In the programmer's mind, the demands of the programming process not only supersede any demands from the outside world of users, but the very languages of the two worlds are at odds with each other.

The process of programming subverts the process of making easy-to-use products for the simple reason that the goals of the programmer and the goals of the user are dramatically different. The programmer wants the construction process to be smooth and easy. The user wants the interaction with the program to be smooth and easy. These two objectives almost never result in the same program. In the computer industry today, the programmers are given the responsibility to create interaction that makes the user happy, but in the unrelenting grip of this conflict of interest, they simply cannot do so.

In software, typically nothing is visible until it is done, meaning that any second-guessing by non-programmers is too late to be effective. Desktop computer software is infamously hard to use because it is purely the product of programmers; nobody comes between them and the user. Objects like phones and cameras have always had a hefty mechanical component that forced them into the open for review. But as we've established, when you cross a computer with just about any product, the behavior of the computer dominates completely.

The key to solving the problem is interaction design. We need a new class of professional interaction designers who design the way software behaves. Today, programmers consciously design the "code" inside programs but only inadvertently design the interaction with humans. They design what it does but not how it behaves, communicates, or informs. Conversely, interaction designers focus directly on the way users see and interact with software-based products. This craft of interaction design is new and unfamiliar to programmers, so--when they admit it at all--they let it in only after their programming is already completed. At that point, it is too late.

The people who manage the creation of software-based products are typically either hostage to programmers because they are insufficiently technical, or they are all too sympathetic to programmers because they are programmers themselves. The people who use software-based products are simply unaware that those products can be as pleasurable to use and as powerful as any other well-designed tool.

Programmers aren't evil. They work hard to make their software easy to use. Unfortunately, their frame of reference is themselves, so they only make it easy to use for other software engineers, not for normal human beings.

The costs of badly designed software are incalculable. The cost of Jane and Sunil's time, the cost of offended air travelers, and the cost of the lives of passengers on Flight 965 cannot easily be quantified. The greatest cost, though, is the opportunity we are squandering. While we let our products frustrate, cost, confuse, irritate, and kill us, we are not taking advantage of the real promise of software-based products: to be the most human and powerful and pleasurable creations ever imagined. Because software truly is malleable far beyond any other media, it has the potential to go well beyond the expectations of even the wildest dreamer. All it requires is the judicious partnering of interaction design with programming.


Interviews & Essays

On Thursday, May 13th, barnesandnoble.com welcomed Alan Cooper to discuss THE INMATES ARE RUNNING THE ASYLUM.

Moderator: Good afternoon, Alan Cooper! We're so glad to have you online with us this afternoon to chat about THE INMATES ARE RUNNING THE ASYLUM. How are you today?

Alan Cooper: Thanks for inviting me! I'm just fine today, and I'm looking forward to some good discussions.


Reenie P. from Los Angeles, CA: In your book you are critical of Web-based software, saying companies are sacrificing functionality for the appearance of simpler distribution. But with the web browser more and more becoming each person's interface of communication with their computer, and certainly becoming their computer's connection to the outside world, aren't there advantages to distributing software over the Internet?

Alan Cooper: Of course there are advantages to software distribution over the Internet. But why should we sacrifice good interaction to get it? It's like saying, "Sure, you can have free dinner, but you have to eat it off of the ground." There is simply no reason for this.


todd from Jamestown: What do you think are some of the best web sites out there today -- as far as interface design?

Alan Cooper: The advantage of the browser is that it is good for -- ta-da, browsing. Browsing, unfortunately, doesn't involve a lot of interaction. Many companies have begun to allow interaction with their applications via the Web, and they have used browsers as the tool. This is like saying, "My lawnmower is so good at mowing grass that I'm going to use it to shave and mix my martini, too".


Marco from New York City: In your book, you stress that all software should start with good design and that the design should be oriented toward typical end users. What about software that is never intended to have a "public face," like embedded systems? Is there any kind of development in which it is OK to have the design reflect the way the computer works, rather than the way a person might use the software?

Alan Cooper: Marco, what a provocative question! Before I answer it, I'd like to examine your motivation for asking it. It seems to me that you are implying that it is somehow purer, smoother, better to design software so that it reflects the way computers work. Do you think this is true?


harry from St. Paul, MN: What's the future of VB? VB 7, 8, and 9.

Alan Cooper: You will have to ask Microsoft that question. I invented the original product and sold it to MS over a decade ago. They haven't asked me for help or advice since.


Rochelle from Indianapolis, IN: I like what you have said about "software apartheid." I think this is becoming more and more an important issue in today's times, and in the future, where so many things are computer-based. Could you talk more about this? What is your biggest concern about this?

Alan Cooper: Rochelle, thanks for asking this question. When programmers assume that it is "reasonable" to demand that users understand "a little bit" about how computers work in order to have the "privilege" of using them, they are inadvertently erecting social barriers. It is absolutely not okay to make that demand. You don't have to understand how electricity works to turn on the lights in your house. The kind of understanding that programmers find elementary is often very sophisticated and difficult for others. What's more, it is unwanted and unnecessary. Programmers merely demand "computer literacy" because that is how they learned, and they cannot see any other way to use the technology. Well, I can see many very easy ways to use the technology that don't demand that kind of up-front understanding. Such approaches will be qualitatively less divisive and more inclusive.


Fran from Procrastinating at Work: Alan, a couple of questions -- I went to a business seminar the other day, and a management "guru" said that e-commerce entrepreneurs can't expect to succeed unless they're based in Silicon Valley. Is there any validity to this statement? Also, if I want to start an online business and don't have much tech know-how, where do you suggest I start? Get a computer science degree, or hire someone else who knows all that stuff?

Alan Cooper: Fran, how's the weather in Procrastinating? In Silval everyone breathes their own exhaust. I think there is a lot to be said for not basing your business here. You'll get better perspective on things. Ask Michael Dell, for example. On the other hand, as a consultant, it's good for my business to hail from Silval. If you get a CS degree, you will be qualified to be a computer geek. Ask yourself, Do I want to be a computer geek, or do I want to create a business?


Georgette from Cyberspace: What products out there today do you think are good examples of how designers should be working and thinking?

Alan Cooper: I like AGE OF EMPIRES, the strategy game from Microsoft. Excellent use of audio feedback, with a very limited control vocabulary, and an interaction based on giving real-time directions, rather than responding to queries. I also like Sid Meier's use of Rich Visual Modeless Feedback in CIVILIZATION and RAILROAD TYCOON. Microsoft's POWERPOINT has some nice touches that I pointed out in ABOUT FACE. Unfortunately, I don't see a lot of good examples. I think that the inmates are running the asylum.


Gary from New York: Where do you see the future of interface design going?

Alan Cooper: Kinda depends on your definition of "interface design." I think that it is a small part of interaction design, which is growing in importance very rapidly. I'm bullish, in other words.


Kevin Paris from Urbana Champaign, IL: Macs are infinitely more intuitive to use than PCs, so how come the whole freakin' world is using the harder system?

Alan Cooper: Kevin, your assumption about Macs is arguable, if not downright false. But the answer to your question is that Windows delivers more value than the Mac does. I know, I know, I hate Windows with a passion (wryly: try programming for it sometime), but value is not the same as pleasure, or ease, or worship. MS delivers that better value not with better design but by outmaneuvering their opponents, including Apple, by being stronger, smarter, better business competitors.


Dale Evans from Florrisant, MO: I have secretly entertained the paranoid and conspiratorial notion that evil computer nerds are purposely making everything harder. They were oppressed in high school, emotionally scarred and physically abused by jocks. Now we've all grown up into a semblance of adulthood, and they've regressed. Am I crazy or do you share my conspiratorial view, as the title of your book suggests?

Alan Cooper: Dale: Well, you should immediately purchase several copies of THE INMATES ARE RUNNING THE ASYLUM and read Chapter Seven, "Homo Logicus," where I make this exact point! Thanx, Alan


Computer Tourette's Sufferer from Austin, TX: I work in e-commerce and have what you refer to in INMATES as Computer Tourette's and spew obscenities when our server crashes, etc. Do you recommend any de-stressing methods to help me curb this problem? My neighboring cubicle dwellers thank you in advance.

Alan Cooper: Yes, the thing to do is become an ardent warrior in the fight to keep people from blaming themselves when computers and software screw up. Whenever you hear a friend or coworker saying something like, "Oh, I guess I should have read the manual," stop them and say, "Don't blame yourself for the failings of badly designed technology!" When enough people refuse to blame themselves for bad software, then IT managers will have to take some corrective action. Keep the faith!


Christopher Blair from New York City: There is something so uncivilized about the interface with computers. I rarely find it to be a pleasant experience, yet my expectations for this evolution remain constant in light of our ever-expanding technological growth. Am I deluded in this expectation/hope? Are we doomed to social devolution with those geeky computer monkeys?

Alan Cooper: No, we are not doomed to a life with bad software. All we have to do is become aware that there is something that can be done about it. Most people assume, reasonably, that software is hard to use because it is inherently hard to use. They assume that if it could be made easy to use, the technologists would have already done so. What they don't realize is that technologists kinda like it hard to use. It will be a good thing when people become indignant about having to live with bad software. Eventually we will get improvement.


Frank C. from Boston, MA: What do you think of the WEB TV interface?

Alan Cooper: Frankly, Frank, I haven't used it. However, that certainly won't stop me from having an opinion! I think the whole idea of putting computer programs onto television is silly. I rather think that we will soon be putting television onto our computers. We'll have to figure out some solution to the screen size discrepancy, but that is a detail.


Halley from Oak Park, IL: What do you think, ideally, is the role of the designer? The programmer?

Alan Cooper: The interaction designer thinks about users and understands what they will be trying to accomplish with the product. The designer can then translate this into a detailed and precise product specification that the programmer can then use to determine what to build. Of course, it is up to the programmer to design the program, that is, how the product is constructed. There are two critical elements here: one, the design must come before any programming begins; and two, the designer must not be involved with the programming -- otherwise there is an unavoidable conflict of interest.


Greg from Durham, NC: Could you explain the ideas behind your entertaining manifesto: "Most software needs to be spanked"?

Alan Cooper: Sure! Most software spanks me with error messages, inscrutable communications, rude behavior, and failure to actually accomplish my goals, so I feel it is only fair to spank it back.


Cristine from Hoboken, NJ: How did you get your start in the software industry? How do you think your career would be different if you had to start out in today's environment?

Alan Cooper: Cristine, when I started as a programmer in the early '70s, I was in a small, specialized business, far from the mainstream. If I were in my 20s today, I'd be somewhere far, far away from the computer business, because it has now become such a big part of our culture. I like to blaze my own trails. Thanks for the interesting question!


Geralyn from Chicago, IL: Hi, Alan. What do you think of electronic books? Are they in the "high-tech products that drive us crazy" category?

Alan Cooper: Are they in the "high-tech products that drive us crazy" category? So far, yes. Generally, I think that Neal Stephenson in THE DIAMOND AGE has a good grasp of future books.


Marco from New York City: I'm just wondering if you consider all software as software that needs to have a human, friendly face. Are there classes of applications we can say only exist to talk to other machines or other specialist humans -- that have no need to have a general-interest public face. I personally don't have a bias toward either design. I think you have an excellent point about designing for users -- I want to know if you think these principles should apply to all software, regardless of its intended function.

Alan Cooper: In a pure sense, yes. All software is used by humans at some point. If the opposite were true, then why don't we write our embedded systems in binary machine code? Compilers are nothing but a user-friendly interface to the CPU's native instruction set. On a more practical level, it is often beyond the point of diminishing returns to do lots of fancy interaction design for stuff that is only handled by professionals. On yet another level, you have to look at the user's goals. User friendliness is not always the goal. In a jet fighter cockpit, victory at any price is the goal, so exchanging intensely difficult training for ease of use is appropriate if it yields milliseconds of advantage in a dogfight. The downside of this discussion is that it gives tacit license to programmers to decide that a given program is "embedded" and doesn't require design. That's letting the nose of the camel into the tent. Thanks very much for your question.


Amy from New York, NY: When do you think the PC will become obsolete, or do you think it will become obsolete?

Alan Cooper: The computer is a general-purpose tool. At some point, every general-purpose tool is applied to some specific problem or task. The "PC" is applied to a specific task only at a very high level of abstraction, and late in the design/manufacturing/marketing cycle. As computer components become cheaper, they can be applied to more specific purposes earlier in the product creation cycle. This is what Don Norman means when he talks about "smart appliances". Unfortunately, the point in the cycle where computers are applied to specific problems has no bearing on their ease of use whatsoever. The terrible blinking-12:00 VCR, after all, is a "smart appliance," and it sucks as bad as any desktop PC does. So the answer to your question is: Probably in the next decade, and that in itself won't help the problem much.


David from Palo Alto, CA: But interaction design is expensive, isn't it? Is the industry ready for good design?

Alan Cooper: Actually, doing design is cheaper than not doing it. Visual Basic, for example, was a designed product. It was a critical success from Release 1. Most other Microsoft products are not designed, but are engineered. Windows, for example, was the laughingstock of the industry for the first five years -- and four major releases -- of its life. Designing first is always cheaper than designing afterward.


Shirley from Westford, MA: My company is new, but it seems the technology types can never get me what I want. I'm not asking for much! It takes them twice as long, and they give me choices between the lesser of several evils. Any advice for me?

Alan Cooper: Work with a professional interaction designer.


Moderator: Thank you so much for joining us this afternoon, Alan Cooper, for a fascinating chat. You have been an excellent guest. Before you go, do you have any last words for the online audience?

Alan Cooper: Thanks for inviting me. The questions were great, and I've had a lot of fun. Bye! Alan



Customer Reviews

  • Anonymous

    Posted March 17, 2005

    Think you know what your customers want...think again!

    An excellent read and very well written. Alan walks you eloquently through the issues that plague the industry and defines common-sense practice for good interaction design. Valuable lessons seep from each page. Unfortunately, 6 years after its release the industry is still suffering from those with too much skin in the game who resist the light of change. The book is a wake-up call for the industry to practice more K.I.S.S. development, and it adds clarity to the reason why software companies are fueling their tech-support bonfires with poor design implementation.

  • Anonymous

    Posted June 26, 2000

    Couldn't Put it Down

    I believe all of his theories and I just hope I can imbue his philosophies in my own consulting company.

  • Anonymous

    Posted February 21, 2000

    Don't Give Up, It's Not Your Fault; Blame It on the Programmers

    If you've ever tried learning how to use a new piece of software or to program your digital telephone & answering machine only to give up angry and frustrated, this is the book for you. After reading this book, you won't feel so inadequate or technologically challenged. No, it's not your fault. Blame it on the programmers and the dysfunctional technology design industry. Cooper speaks from experience. He's an insider who decided to blow the whistle on the problematic nature of the software development industry. Cooper offers an explanation of why humans are having such difficulty working with many of the "wonders" of technology. The problem is that the programmers are running the industry. As a programmer of products like Visual Basic, Cooper discovered that the programmers are calling the shots. He explains that the problem is that programmers think like programmers, not like the typical technology user. So, when we look at an alarm clock or a software title that we just can't seem to figure out, the problem isn't with the user; it is with the designer. Cooper's book examines the structure of software development and the need for interaction designers to play a key role in this process. This book offers a design methodology that calls for "identifying who all the key users and other stakeholders [are] and write up profiles on them, and then develop statements of their goals and the tasks they would go through to accomplish those goals." I particularly liked Cooper's discussion of the culture of different key players in the software development industry. I found Cooper's process of developing fictitious personas to gear the design process towards very useful. I think that this book could be of value to writers, designers, programmers, artists, and anyone else who tries to create a product that will be used by another person.

  • Anonymous

    Posted February 9, 2000

    Notes on the Asylum

    The toughest obstacle to overcoming bad software interface design, according to Alan Cooper, is programmers (the inmates). Unfortunately, these programmers are also given almost complete power in the software development process because of the skills they possess. The result is a situation where the inmates are allowed to run the software development asylum because no one else can stop them once they've begun writing code. Interested primarily in interactive design (the design of the environment for users), Cooper makes the case for dis-empowering programmers and requiring them to pay more attention to how users interact with software, rather than continuing to operate in the vacuum of the programming community. The implications of this shift in power would be more intuitive programs that don't test users' patience and drive them crazy with unwanted (or unnecessary) functions and procedures. Cooper proposes that these kinds of friendly interfaces can be (and are) quite easy to develop once the programmers are kept in check. This is an interesting and eye-opening read for anyone interested in computing, or for those frustrated with the entire computing process. A must-read for anyone who wants a fresh perspective on interface design.

