"You Can't Enlarge the Pie": The Psychology of Ineffective Government


Why do our leaders pursue policies that everybody knows are foolish? It's not because they're stupid or corrupt, say the authors, but because our leaders, like the rest of us, are trapped in foolish and unproductive habits of thinking. "You Can't Enlarge the Pie" analyzes the unspoken assumptions that lead to bad policy, wasted resources, and lost lives, and shows exactly why they're wrong. With fascinating case studies and clear, compelling analysis, the authors dissect six beliefs that serve as psychological barriers to effective government:

1. Do no harm
2. Their gain is our loss
3. Competition is always good
4. Support our group
5. Live for the moment
6. No pain for us, no gain for them

By freeing ourselves from the narrow way we evaluate our government leaders, say the authors, we can learn to judge their performance just as we judge that of business leaders: by the overall health of their organizations.


Editorial Reviews

Publishers Weekly
Bazerman (a Harvard professor of business administration), Baron (a University of Pennsylvania professor of psychology) and Shonk (a Harvard research associate) have a promising idea for improving government. Drawing on "an approach that now dominates the curriculum of business schools," they declare, "Our core argument is that large gains can often only be achieved when citizens learn to accept small losses in return," as with vaccines, which save far more lives than they cost in fatal side effects. The authors devote separate chapters to each of six cognitive barriers they claim prevent us from making such wise trade-offs. Some are clearly related to their main theme ("do no harm" describes the rationale opposing vaccination), but others (notably "competition is always good") require more elaboration, which is generally lacking. Furthermore, they sometimes criticize behavior in one chapter and praise or simply overlook it in another, indicating a schematized approach that ignores crucial sources of policy-making difficulty. One chapter touts free trade between countries while another decries cities' ruinous competitive spending on sports arenas, without acknowledging a similar dynamic when labor, consumer and environmental laws are construed as "trade barriers." The authors' cognitive focus obscures genuine objective dilemmas, while their psychologizing is often implausible. They say campaign finance reform has low priority as an ill-defined "process issue" that people can't grasp because, like most business negotiators, they don't think ahead. But most citizens grasp political corruption, which seems similarly to be a "process issue." Despite some obviously promising ideas, the relentless reductionism oversimplifies and psychologizes problems that have complex, historical, real-world roots. (Sept.) Copyright 1999 Cahners Business Information.

Product Details

  • ISBN-13: 9780465006311
  • Publisher: Basic Books
  • Publication date: 8/15/2001
  • Pages: 288
  • Product dimensions: 6.45 (w) x 9.58 (h) x 1.02 (d)

Read an Excerpt

Chapter One

Since the early 1960s, millions of American women have undergone surgery for silicone breast implants, either as reconstruction after a mastectomy or for the cosmetic enlargement of normal breasts. These silicone pouches, usually filled with saline solution, are intended to improve the quality of life rather than its duration. The popularity of the procedure suggests that many women expect implants to enhance their confidence, well-being and enjoyment of life.

    breast implants in 1976, it acted on the assumption that these devices were safe. Meanwhile, in Japan, reports in medical journals were beginning to document illnesses, particularly connective-tissue disease, among prostitutes who had received direct injections of silicone or wax. In 1982, connective-tissue disease was reported in three Australian women with implants. Soon after, American women with breast implants who developed connective-tissue disease or another disorder began to initiate lawsuits against the manufacturers. In 1990, after the television show Face to Face with Connie Chung interviewed victims of these diseases and implicitly blamed the manufacturers and the FDA, Congressman Ted Weiss began public hearings to explore a possible link between breast implants and these ailments.

    who claimed that her Dow-Corning implants had caused her connective-tissue disease. In 1992, after failing to receive evidence of their safety, FDA Commissioner David Kessler placed a ban on most implants; at the same time, he assured women who already had them that there was no evidence of danger. The ban galvanized tens of thousands of women to launch lawsuits against implant manufacturer Dow-Corning; these were eventually consolidated into one class-action suit. Some of the women involved in the suit blamed their implants for poor health. Others had no sign of disease but joined the suit for fear of future illness. In 1994, Dow-Corning agreed to an initial settlement of $4.25 billion, at the time the largest class-action settlement in history. One billion of the settlement went directly to the plaintiffs' lawyers. Under the terms of the settlement, a woman had only to present a doctor's diagnosis of some illness to receive a share of the money. Other women were allowed to file for their shares retroactively, making the per-person amount very small. Dow-Corning filed for bankruptcy in 1995.

    you may not have heard about the surprising scientific results that emerged—without much attention from the press—when implant manufacturers began to put their products to the test. This research began at the time of the lawsuit and continues today. Consider that about 1 percent of adult American women have implants, and about 1 percent of adult American women have connective tissue disease. If the implants had no relation to the disease, we would expect about 10,000 American women (1 percent of 1 percent of the 100 million women in the United States) to have both. Several studies showed that this rate was approximately correct. The only study that showed a small association between implants and disease was conducted after the negative publicity; the criterion for "disease" was a self-report questionnaire, raising the possibility that some women might have been more likely to report signs of disease after hearing that it could be caused by implants.
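The base-rate reasoning above is easy to verify directly. A quick sketch, using only the round figures quoted in the text:

```python
# If implants and connective-tissue disease were unrelated, the expected
# number of women with both is just the product of the two base rates.
adult_women = 100_000_000  # ~100 million U.S. women (figure from the text)
p_implants = 0.01          # ~1 percent have implants
p_disease = 0.01           # ~1 percent have connective-tissue disease

expected_overlap = adult_women * p_implants * p_disease
print(f"{expected_overlap:,.0f}")  # 10,000 women with both, by coincidence alone
```

An observed overlap near 10,000 cases is exactly what independence predicts, which is why the studies cited found no association between implants and the disease.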

    the existing evidence and concluded that, to date, there was no connection between implants and the diseases blamed on them. Meanwhile, as more and more women have filed claims, the amount paid out by Dow-Corning in the class-action settlement has surpassed $7 billion. The fear of unfounded lawsuits led DuPont to refuse to supply Dacron polyester for vascular grafts. In 1994, foreign suppliers refused to supply American manufacturers with Dacron for the first time. Because silicone is used in catheters, pacemakers and artificial-heart valves, it is possible that these products could some day be in short supply. After all, in the aftermath of a lawsuit based on no scientific evidence whatsoever, why would any company supply a product that could bring about its financial ruin? It would not be surprising if firms chose to steer clear of this risky business altogether.

    to victims of diseases its products did not cause. Drug companies have cut back on research on contraceptives and vaccines because these products, given to healthy people, inspire lawsuits. Obstetricians are switching to the less risky field of gynecology, and the specialists who remain sometimes refuse to deliver lawyers' babies. In addition, companies guilty of no crimes spend millions on attorneys' fees to protect themselves from litigation.

    of torts has today become almighty. Every day, plaintiffs win millions of dollars in damages for harms that cannot be considered crimes, and our existing tort system funnels as much money to lawyers as it does to victims. These fees could be avoided if parties were able to settle disputes through efficient negotiation.

    psychological reason is that we pay too much attention to the losses that might result from action or change, but we ignore potential gains. As a consequence, we refuse to take small risks in order to reduce a large risk. People often resist changing jobs, homes or relationships because, biased toward the status quo, they focus on what they will lose rather than what they might gain.

    status quo also make them resist government policies that might improve matters for them as individuals. Citizens and government officials oppose tort reform proposals because of the vivid but small risk that recouping losses in court will become more difficult. This fear of risk overshadows the greater benefits they will gain in the form of lower health care costs, tax reductions and product availability. Through wise tradeoffs, citizens will discover where their true self-interest lies, and they will be more accepting of government policies that advance this interest.

On December 21, 1988, 270 people died when a bomb planted by a terrorist on Pan American Airways Flight 103 exploded over Lockerbie, Scotland. In response to the tragedy, the U.S. Federal Aviation Administration (FAA) tightened regulations on air travel, including more thorough baggage checks and mandatory early arrival for travelers on international flights.

    the cost per life saved by the new regulations. Taking into account that the main cost factor is extra time, Farlie figured lost time to passengers at what was then the minimum wage, $3.35 per hour, even though most international travelers earn quite a bit more. He generously assumed that the new regulations would prevent all deaths from terrorism, which had averaged sixty-one each year since 1976. Given the 221,471,000 people who take international flights each year, this comes to $6,081,375.81 per life saved.

    human life. In other realms, the cost per life saved by governmental regulation is much higher. Still, there are far less expensive ways of saving lives, such as reducing air and water pollution. In general, developed nations spend far too much money on reducing certain risks. For example, the complete removal of asbestos from American school buildings would save about 400 lives over 40 years, at an estimated cost of about $100 billion, or $250 million per life saved. Of course, because of the danger to asbestos-removal workers, it's possible this $100 billion might save no lives at all. Estimates of the amount of money it costs to save a certain number of lives are full of inaccuracy; but even if each figure given in the examples above were off by a factor of ten, the allocation of resources to reduce risk would still be shockingly high.
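These cost-per-life figures are simple to cross-check. A minimal sketch in Python, using only the numbers quoted in the passage (the ratio at the end is a derived comparison, not a figure from the text):

```python
# Cost per life saved: school asbestos removal vs. the airline-security estimate.
asbestos_total_cost = 100e9           # ~$100 billion (figure from the text)
asbestos_lives_saved = 400            # over 40 years (figure from the text)
airline_cost_per_life = 6_081_375.81  # Farlie's estimate, quoted earlier

asbestos_cost_per_life = asbestos_total_cost / asbestos_lives_saved
print(f"${asbestos_cost_per_life:,.0f} per life")  # $250,000,000 per life

# Derived comparison: asbestos removal costs roughly 41 times as much
# per life saved as the (already generous) airline-security estimate.
print(round(asbestos_cost_per_life / airline_cost_per_life, 1))  # 41.1
```

Even allowing for the order-of-magnitude uncertainty the authors concede, the gap between the two programs survives intact, which is the point of the passage.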

    money and time come in limited quantities. Just as companies generate a finite amount of profit, governments collect a limited amount of funds each year. Government officials and employees can devote only a fixed amount of hours to solving a problem. Once money and time run out, they're gone for good.

    Ideally, reforms reduce one risk more than they increase another, achieving an overall improvement for each individual. Yet citizens often resist reforms, believing that they will suffer from the increased risk. Why, for instance, do we have so few organs to allocate to those who need them? Because people focus on the contributions they will be making—and by association, the risks they will be taking—without thought for the benefits they may receive as potential organ recipients. The solution may be a system in which people accept or reject both roles at once. An even simpler solution would be to assume that all citizens are both potential donors and potential recipients unless they specifically opt out of the program. These measures would frame organ donation as a cooperative enterprise in which the benefits to the individual far outweigh the costs.

    optimal legislation for society, and there are numerous reasons for these failures, including partisanship, political processes, special-interest groups and incompetence. We will examine many of these in later chapters. For now, we will focus on the most central explanation: the failure of the human mind to make wise tradeoffs. Citizens and legislators make a variety of systematic cognitive mistakes that lead to suboptimal legislation.

    and money spent on very expensive measures to control one risk and use them to reduce other risks. This would save money and lives. Instead, we tend to react to each catastrophe with new regulations, which pile up to massive size and become institutionalized. When someone suggests the repeal of an existing regulation, activists rush to its defense regardless of its cost inefficiency.

    government's ignoring the wishes of the people. When elected officials vote to pay huge sums on regulations aimed at risk reduction, they are usually responding to the demands of their constituents. The problem is that citizens look at risk through the lens of a variety of irrational biases. Because these biases favor the status quo, they become roadblocks on the path to beneficial change: the type of change that requires a small increase in one risk in return for a large decrease in another risk.

The Omission Bias

Suppose you have a 10 percent chance of catching a new strain of flu virus. The only available vaccine completely prevents this type of flu, but it has a 5 percent chance of causing symptoms identical to those it is supposed to prevent, and with the same severity. If all other factors (such as cost) were the same, would you get the vaccine? Many people would not. They would be more concerned about the risk of harm from action—the 5 percent risk of an adverse reaction to the vaccine—than about the risks of inaction, or the 10 percent risk of catching the flu without the vaccine. This is true even though the vaccine reduces the chance of flu symptoms by 5 percent. Although the flu example is hypothetical, the same bias affects people's decisions about vaccination in real life.
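The hypothetical flu numbers make the tradeoff explicit; a short sketch of the expected-risk comparison:

```python
# Chance of flu-like symptoms under each choice (hypothetical figures
# from the example above).
p_flu = 0.10          # risk of catching the flu if unvaccinated
p_side_effect = 0.05  # risk the vaccine itself causes identical symptoms

# The vaccine prevents the flu entirely, so the vaccinated risk is just
# the side-effect risk; declining the vaccine leaves the full flu risk.
risk_vaccinated = p_side_effect
risk_unvaccinated = p_flu
print(risk_unvaccinated - risk_vaccinated)  # 0.05 -> vaccination halves the risk
```

A pure expected-value chooser takes the vaccine; the omission bias makes the 5 percent harm through action loom larger than the 10 percent harm through inaction.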

    is known as the omission bias. More than any other cognitive error, the omission bias pervades our decisions regarding risk. When contemplating risky choices, many people follow the rule of thumb, "Do no harm." Implicit in this advice is the notion that "do" means "do through action," making harms of omission easy to ignore. Our susceptibility to the omission bias means that every year many more people catch the flu than need to. The bias is not limited to government officials who oppose the wishes of super-rational citizens; nor is it the case that citizens need to be set straight by all-knowing officials. While the strength of the bias varies from person to person, it is found in every group of people.

The Status Quo Bias

One feature of the omission bias is that it usually supports the status quo. When contemplating a change, people are more likely to attend to the risk of change than to the risk of failing to change. Taking losses more seriously than gains, they will be motivated to preserve the status quo.

    status quo bias. Students were divided into three groups: "sellers," "buyers," and "choosers." The sellers were each given a coffee mug from their university bookstore and were asked to indicate on a form whether they would sell the mug at each of a series of prices ranging from $0 to $9.25. On a similar form, buyers indicated whether they were willing to buy a mug at each price in the same range. The choosers were asked to choose between a mug (which they did not "own") and various amounts of money. Sellers valued the mug at a median price of $7.12, buyers at a median of only $2.87, and choosers at a median of $3.12. Motivated to avoid a "loss" and maintain the status quo, sellers irrationally overvalued the mug.

    aggravates many of the problems discussed in this book. Any kind of reform requires some losses and some gains. If it is a good reform, gains outweigh losses. But because people worry more about losses, they will tend to oppose the reform.

The Preference for Natural Risk

A final cognitive bias that interferes with rational decision making regarding risk is the human preference for natural risk over artificial risk. In general, "don't mess with nature" is a good rule for human beings to follow. We have evolved by adapting to the natural world and learning that tampering with nature—by bleeding people as a medical treatment, for example, or destroying entire forests for firewood—can lead to trouble. But, as often happens with such cognitive shortcuts, we fail to recognize important exceptions to the rule. In particular, we follow it even when the consequences of letting nature take its course would be worse than the consequences of altering it by "artificial" means.

    the same consequences differently depending upon whether they are brought about by humans or by nature. Specifically, we are more tolerant of "natural" disasters than of artificial ones. "I don't mind natural extinctions, but I'm not too enthusiastic about extinctions that are directly caused by man," commented a subject in a study of environmental values. This bias toward nature leads us to ignore opportunities for lessening the devastation of natural risks such as hurricanes, earthquakes, floods and epidemics. People tend to regard such disasters as the inevitable "will of God." Of course, we cannot prevent hurricanes and earthquakes, but we can do a great deal to protect ourselves against their worst consequences (such as not building homes on sandbars in hurricane-prone regions).

    the source of harm is human error than when it is a nonhuman natural source. Subjects were willing to contribute about $19 to an international fund to save Mediterranean dolphins when the dolphins were "threatened by pollution," but only $6 when they were "threatened by a new virus"—even though the same number of dolphins were expected to die in both cases. Similarly, subjects in one study (some of whom were judges) thought that workmen's compensation paid by a state panel should be greater when an injury is caused by a drug rather than by a natural disease. These questionnaire studies suggest that economic allocations are based on the psychological properties of human judgment rather than on the amount of benefit from the allocation. This leads to a misallocation. We could spend the same money more fairly by dividing it equally among those with the same need.

    to be suspicious of new technology, even when we have every reason to believe that the technology will improve on natural outcomes. Synthetic chemicals added to food are often banned because they cause cancer in laboratory animals. Yet when natural foods are broken into their constituent chemicals, they too have been found to cause cancer in lab animals. Caffeic acid, for example, is a carcinogen found in coffee, lettuce, apples, pears, plums, celery, carrots and potatoes. One review found that 94 percent of synthetic carcinogens were subject to government regulation, compared to 41 percent of a sample of natural carcinogens; the review also showed that the average risk to humans from synthetic chemicals was lower than from natural ones. Researchers have argued that animal tests require such high doses of the chemicals that cancer is an almost inevitable result of increased cell division brought on by an overdose of the chemical in the animal's body. At lower doses, a given chemical typically does not cause cell poisoning; for this reason, animal tests may be highly inaccurate. If these tests provide inaccurate and alarming information about natural chemicals, they probably do the same for synthetic chemicals.

    the human tendency to consider some technologies "natural" simply because they are old. For example, although most of our crops are the result of breeding—a technology—people resist further improvements from biotechnology on the grounds that they are "unnatural." It is inefficient to spend money reducing the risk of artificial chemicals when these risks are no more serious than those of nature itself. If expenditures are needed at all, we should treat all risks equally and simply strive to save the most lives per million dollars spent.

    preference for natural risk—in the context of cases in which gaining a large social benefit requires acceptance of a small risk. The examples we will explore are:

    worry too much about one risk and not enough about another. To restore balance, we, as a society, will have to reduce one risk greatly and increase the other risk by a small amount. The human tendency to be more concerned about losses than gains often prevents us from taking such potentially beneficial action. We will explore these three dilemmas, consider practical solutions to each one and offer a better way to think about tradeoffs across a broad array of policy decisions.

In his book Galileo's Revenge: Junk Science in the Courtroom, Peter Huber presents several alarming signs that something is wrong with the American legal system.

    scientific standards and tolerance for "junk science" in the courtroom. All sorts of shady characters are paid to pose as expert witnesses, regardless of their credentials. By leaving juries with the impression that "experts disagree," lawyers enable them to rationalize siding with the plaintiff purely out of sympathy. After all, who can truly judge which experts are credible?

    breast implants case to these triumphs of junk science, is that companies are sued for their actions—but never for inaction. No company will ever be taken to court for refusing to manufacture a silicone product or an anti-nausea drug. Thus, the omission bias is practically built into the law. The result? Lawsuits tend to encourage the omission bias in the companies themselves. If lawsuits were predictable and preventable, companies could fend them off by establishing adequate safety measures. But when companies can be sued for misfortunes they have not caused, they may refuse to take the risk. Large corporations, whose deep pockets make them an easy target of spurious claims, will be afraid to act when action involves developing a beneficial product that carries some degree of risk, as most medical products do. They will choose to avoid the healthcare industry and turn to other markets in which lawsuits are less likely.

    harms of new medical devices are often small compared to their benefits. Lawsuits are supposed to deter the production of goods that have few benefits and many harms, such as automobiles with faulty brakes or drugs that don't work as advertised. But when producers are sued for unforeseeable harms that are in any case minor when compared with the benefits, the lawsuits lead to the withdrawal of beneficial products, and the legal system has done a poor job of balancing harms and benefits. A prime example is the case of vaccines.

Excerpted from "You Can't Enlarge the Pie" by Max H. Bazerman, Jonathan Baron and Katherine Shonk. Copyright © 2001 by Max H. Bazerman, Jonathan Baron, and Katherine Shonk. Excerpted by permission. All rights reserved. No part of this excerpt may be reproduced or reprinted without permission in writing from the publisher.


Table of Contents

1 Do No Harm 1
2 Their Gain Is Our Loss 44
3 Competition Is Always Good 66
4 Support Our Group 99
5 Live for the Moment 128
6 No Pain for Us, No Gain for Them 154
7 The Case of Global Warming 201
Notes 229
Index 251