Lead Wars: The Politics of Science and the Fate of America's Children

by Gerald Markowitz, David Rosner

In this incisive examination of lead poisoning during the past half century, Gerald Markowitz and David Rosner focus on one of the most contentious and bitter battles in the history of public health. Lead Wars details how the nature of the epidemic has changed and highlights the dilemmas public health agencies face today in terms of prevention strategies and chronic illness linked to low levels of toxic exposure. The authors use the opinion by Maryland’s Court of Appeals—which considered whether researchers at Johns Hopkins University’s prestigious Kennedy Krieger Institute (KKI) engaged in unethical research on 108 African-American children—as a springboard to ask fundamental questions about the practice and future of public health. Lead Wars chronicles the obstacles faced by public health workers in the conservative, pro-business, anti-regulatory climate that took off in the Reagan years and that stymied efforts to eliminate lead from the environments and the bodies of American children.

Editorial Reviews

New York Review of Books

“In Lead Wars, CUNY’s Gerald Markowitz and Columbia University’s David Rosner convincingly show that the Baltimore toddler study emerged from a century of policymaking in which the US government, faced at times with a choice between protecting children from lead poisoning and protecting the businesses that produced and marketed lead paint, almost invariably chose the latter.”
Mother Nature Network - Helen Jupiter
“Lead Wars clearly shows that the scandalous and tragic history of lead is one that our society is doomed to repeat over and over again unless we develop and fight for better safeguards against chemicals and new technology.”
PBS NewsHour The Rundown Blog - Howard Markel
“A fascinating new book.”
SE Journal - Bill Kovarik
"A deeply conceived and well-written book by two of America's best public health historians. It's also an important background briefing on the politics and ethics of scientific research for journalists who will be covering environmental health issues like these."
New York Times - Nicholas D. Kristof
"Chronicles the monstrous irresponsibility of companies in the lead industry over the course of the 20th century."
Senator Sheldon Whitehouse
"I want to thank David Rosner and Gerald Markowitz for what that they've done to bring the story of the lead paint wars to the public."
Health Affairs - Elizabeth Fee
"The prolific team of Gerald Markowitz and David Rosner has done it again. Lead Wars: The Politics of Science and the Fate of America’s Children is a thoroughly researched, passionate, and gripping history of a major public health problem. . . . Lead Wars challenges us to take better care of our children by fighting those industries that appear to regard them—especially poor black and Latino children—as disposable."
Library Journal
When there is scientific consensus that even low levels of a toxin in children's blood will cause lasting damage, are there valid reasons to stop short of eliminating that threat? Using the decades-long conflict over lead in the environment, Markowitz (history, John Jay Coll. CUNY) and Rosner (public health & history, Columbia Univ.), coauthors of Deceit and Denial: The Deadly Politics of Industrial Pollution, explore the complexity of that question. Beginning with an appeals court's decision that Johns Hopkins University researchers violated their ethical obligation to children in their study of varying levels of lead paint abatement in homes, the book traces the evolution of the widespread scientific condemnation of environmental lead, as well as the parallel efforts by the lead industry to question that science and to resist regulatory changes. Antiregulatory policies put into place in the 1980s and court decisions that upheld the industry's denial of liability have left thousands of families living in homes containing lead paint. The authors compare the industry's obfuscating tactics to those used by the tobacco industry and anti-climate-change forces, both of which leave many children (and society at large) at risk in the service of economic concerns. VERDICT Thoroughly researched and clearly written, this book does an excellent job of illustrating the problem society encounters when science and industry face off over likely harm versus economic benefit.—Richard Maxwell, Porter Adventist Hosp. Lib., Denver

Product Details

Publisher: University of California Press
Series: California/Milbank Books on Health and the Public, #24
Sold by: Barnes & Noble
File size: 3 MB

Read an Excerpt

Lead Wars

The Politics of Science and the Fate of America's Children

By Gerald Markowitz, David Rosner


Copyright © 2013 Gerald Markowitz and David Rosner
All rights reserved.
ISBN: 978-0-520-95495-3



A Legacy of Neglect

In August 2001, the Court of Appeals of Maryland, that state's highest court, handed down a strongly worded, even shocking opinion in what has become one of the most contentious battles in the history of public health, a battle that goes to the heart of beliefs about what constitutes public health and what our responsibility to others should be. The court had been asked to decide whether or not researchers at Johns Hopkins University, among the nation's most prestigious academic institutions, had engaged in unethical research on children. The case pitted two African American children and their families against the Kennedy Krieger Institute (KKI), Johns Hopkins's premier children's clinic and research center, which in the 1990s had conducted a six-year study of children who were exposed by the researchers to differing amounts of lead in their homes.

Organized by two of the nation's top lead researchers and children's advocates, J. Julian Chisolm and Mark Farfel, the KKI project was designed to find a relatively inexpensive, effective method for reducing—though not eliminating—the amount of lead in children's homes and thereby reducing the devastating effect of lead exposure on children's brains and, ultimately, on their life chances. For the study, the Johns Hopkins researchers had recruited 108 families of single mothers with young children to live in houses with differing levels of lead exposure, ranging from none to levels just within Baltimore's existing legal limit, and then measured the extent of lead in the children's blood at periodic intervals. By matching the expense of varying levels of lead paint abatement with changing levels of lead found in the blood, the researchers hoped to find the most cost-effective means of reducing childhood exposure to the toxin. Completely removing lead paint from the homes, Chisolm and Farfel recognized, would be ideal for children's health; but they believed, with some justification, that a legal requirement to do so would be considered far too costly in such politically conservative times and would likely result in landlord abandonment of housing in the city's more poverty-stricken districts.

Despite the intentions of KKI researchers to benefit children, the court of appeals found that KKI had engaged in highly suspect research that had direct parallels with some of the most infamous incidents of abuse of vulnerable populations in the twentieth century. The KKI project, the court argued, differed from but presented "similar problems as those in the Tuskegee Syphilis Study, ... the intentional exposure of soldiers to radiation in the 1940s and 50s, the test involving the exposure of Navajo miners to radiation ... and the secret administration of LSD to soldiers by the CIA and the army in the 1950s and 60s." The research defied many aspects of the Nuremberg Code, the court said, and included aspects that were similar to Nazi experimentation on humans in the concentration camps and the "notorious use of 'plague bombs' by the Japanese military in World War II where entire villages were infected in order for the results to be 'studied.'" More specifically, the court was appalled that many of the children selected for the study were recruited to live in homes where the researchers knew they would be exposed to lead and thus knowingly placed in harm's way. Children, the court argued, "are not in our society the equivalent of rats, hamsters, monkeys and the like." The court was deeply troubled that a major university would conduct research that might permanently damage children, given what was already known about the effects of lead.

How could two public health researchers who had devoted their scientific lives to alleviating one of the oldest and most devastating neurological conditions affecting children be likened to Nazis? Was this just a "rogue court," an out-of-control panel of judges, as many in the public health community would argue? These were the questions that initially drew our attention. We soon became aware, however, of the much more complex and troubling story underlying the case, about not just the KKI research but also the public health profession, the nation's dedication to the health of its citizens in the new millennium, and the conundrum that we as a society face when confronting revelations about a host of new environmental threats in the midst of a conservative political culture. In its ubiquity and harm, lead is an exemplary instance of these threats. Yet there are many others we encounter in everyday life that entail similar issues, from mercury in fish and emitted by power plants to cadmium, certain flame retardants, and bisphenol A, the widely distributed plastics additive that has been identified as a threat to children.

For much of its history, the public health field provided the vision and technical expertise for remedying the conditions—both biological and social—that created environments conducive to harm and within which disease could spread. And throughout much of the profession's history, public health leaders have joined with reformers, radicals, and other social activists to find ways within the existing political and economic structures to prevent diseases. Although the medical profession has often been given credit for the vast improvements in Americans' health and life span, the nineteenth- and early-twentieth-century public health reformers who pushed for housing reforms, mass vaccination campaigns, clean water and sewage systems, and pure food laws in fact played a major role in improving children's health, lowering infant mortality, and limiting the impact of viral and bacterial diseases such as cholera, typhoid, diphtheria, smallpox, tuberculosis, measles, and whooping cough. In the opening years of the twentieth century, for example, Chicago's public health department joined with Jane Addams and social reformers at Hull House to successfully advocate for new housing codes that, by reducing overcrowding and assuring fresh air in every room, led to reduced rates of tuberculosis. And New York's Commissioner of Health Hermann Biggs worked with Lillian Wald and other settlement house leaders to initiate nursing services for the poor, pure milk campaigns, vaccination programs, and well-baby clinics that dramatically reduced childhood mortality. Biggs, Addams, and other Progressives worked from a firm conviction that as citizens we have a collective responsibility to maintain conditions conducive to every person's health and well-being.

These broad public health campaigns to control infectious diseases yielded great victories from the 1890s through the 1930s. But in the first decades of the twentieth century, a different view of the profession began to gain ascendancy, redefining the mission of public health in ways that belied its role as an agent of social reform. In 1916 Hibbert Hill, a leading advocate of this new direction, put it this way: "The old public health was concerned with the environment; the new is concerned with the individual. The old sought the sources of infectious disease in the surroundings of man; the new finds them in man himself. The old public health ... failed because it sought [the sources] ... in every place and in every thing where they were not." In this view, the idea was for the fast-growing science of biological medicine to concentrate on treating disease person by person rather than on eradicating conditions that facilitated disease and its spread, in some cases encouraging reforms in behavior to reduce individual exposure to harm. Hence, like numerous other fields in the early decades of the century, public health became professionalized, imbuing itself with the aura of science and setting itself off as possessing special expertise.

By the middle decades of the twentieth century, public health officials thus typically conceived of their field mainly as a laboratory-based scientific enterprise, and many public health professionals saw their work as a technocratic and scientific effort to control the agents that imperiled the public's health individual by individual. We can see this shift in perspective in treating tuberculosis, for example. An infectious disease that terrified the American public in the eighteenth and nineteenth centuries, tuberculosis had begun to decline as a serious threat by the early twentieth century, mainly because of housing reforms, improvements in nutritional standards, and general environmental sanitation. By mid-century, public health officials tended to downplay such environmental conditions and came to rely instead on the armamentarium of new antibiotic therapies to address the relatively small number of tuberculosis victims. The history of responding to industrial accidents and disease offers another example. In the early years of the twentieth century, reformers such as Crystal Eastman addressed the plague of industrial accidents and disease in the steel and coal towns of Pennsylvania by advocating for higher wages, shorter hours, and better working conditions through unionization. By the 1950s, industrial disease and accidents had largely faded from public health view—ironically, in part because the earlier reform efforts had led to protective legislation—and it was left to company physicians to treat individual workers. This turn toward technological and individualistic solutions to problems that had once been defined as societal was by mid-century part of a general shift in American culture away from divisive class politics and toward a faith in ostensibly class-neutral science, technology, and industrial prowess as the best way to address social or public-health-related problems.

Since the early twentieth century, a tension has existed within the public health field—which mirrors a societal one—between, on the one hand, those who set their sights on prevention of disease and conditions dangerous to health through society-wide efforts and, on the other, those who believe in the more modest and pragmatic goal of ameliorating conditions through piecemeal reforms, personal education, and individual treatments. Despite the tremendous successes of environmental and political efforts to stem epidemics and lower mortality from infectious diseases, the credit for these improvements went to physicians (and the potent drugs they sometimes had at hand), whose role was to treat individuals. This shift also coincidentally, or not so coincidentally, undermined a public health logic that was potentially disruptive to existing social and power relationships between landlord and tenant, worker and industrialist, and poor immigrants and political leaders.

At elite universities around the country—from Harvard, Yale, and Columbia to Johns Hopkins and Tulane—new schools of public health were established in the first two decades of the twentieth century with funds from the Rockefeller and Carnegie Foundations. Educators at these new schools had faith that science and technology could ameliorate the public health threats that fed broader social conflicts. They envisioned a politically neutral technological and scientific field removed from the politics of reform. The Johns Hopkins School of Hygiene and Public Health was at the center of this movement. William Welch, the school's founder and first director (as well as the first dean of the university's medical school), argued persuasively that bacteriology and the laboratory sciences held the key to the future of the field. By the mid-twentieth century, municipal public health officials in most cities had adopted this approach. If early in the century public health workers in alliance with social reformers succeeded in getting legislation passed to control child labor and the dangers to health that accompanied it, and to protect women from having to work with such dangerous chemicals as phosphorus and lead, by midcentury departments of health worked more often to reduce exposures of workers to "acceptable" levels that would limit damage rather than eliminate it. Similarly, by the 1970s departments of health had established clinics aimed at treating the person with tuberculosis but displayed little interest in joining with reformers to tear down slums and build better houses for at-risk low-income people.

By the 1950s and 1960s, when childhood lead poisoning emerged as a major national issue, public health practitioners were divided between those who defined their roles as identifying victims and treating symptoms and those who in addition sought alliances with social activists to prevent poisoning through housing reforms that would require lead removal. Drawing on the social movements of the 1960s, health professionals joined with antipoverty groups, civil rights organizations, environmentalists, and antiwar activists to struggle for access to health facilities for African Americans in the South and in underserved urban areas, for Chicanos active in the United Farm Workers' strikes in the grape-growing areas of California and the West, for Native Americans on reservations throughout the country, and for soldiers returning from Vietnam suffering from post-traumatic stress disorders, among others. By the end of the twentieth century, though, the effort to eliminate childhood lead poisoning through improving urban infrastructure had largely been abandoned in favor of reducing exposures.


The campaign to halt childhood lead poisoning is often told as one of the great public health victories, like the efforts to eliminate diphtheria, polio, and other childhood scourges. After all, with the removal of lead from gasoline, blood lead levels of American children between the ages of one and five years declined precipitously from 15 micrograms per deciliter (µg/dl) in 1976–80 to 2.7 µg/dl by 1991–94, and levels have continued to drop. Today, the median blood lead level among children aged one–five years is 1.4 µg/dl, and 95 percent of children in this age group have levels below 4.1 µg/dl. Viewed from a broader perspective, however, the story is more complicated, and disturbing, and may constitute what Bruce Lanphear, a leading lead researcher, calls "a pyrrhic victory." If 95 percent of American children have blood lead levels below what is today considered the danger level, then 5 percent—a half million children—still have dangerous amounts of lead in their bodies. A century of knowledge about the harmful effects of lead in the environment and the success of efforts to eliminate some of its sources have not staunched the flood of this toxic material that is polluting our children, our local environments, and our planet.

Today, despite broad understanding of the toxicity of this material, the world mines more lead and uses it in a wider variety of products than ever before. Our handheld electronic devices, the sheathing in our computers, and the batteries in our motor vehicles, even in new "green" cars such as the Prius, depend on it. While in the United States the new uses of lead are to a certain degree circumscribed, the disposal of all our electronic devices and the production of lead-bearing materials through mining, smelting, and manufacture in many countries continue to poison communities around the world. Industrial societies in the West may have significantly reduced the levels of new lead contamination, but the horror of lead poisoning here is hardly behind us, with exposure coming from lead paint in hundreds of thousands of homes, airborne particles from smelters and other sources, contaminated soil, lead solder and pipes in city water systems, and some imported toys and trinkets. Over time, millions of children have been poisoned.

In the past, untold numbers of children suffered noticeably from irritability, loss of appetite, awkward gait, abdominal pain, and vomiting; many went into convulsions and comas, often leading to death. The level of exposure that results in such symptoms still occurs in some places. But today new concerns have arisen as researchers have found repeatedly that what a decade earlier was thought to be a "safe" level of lead in children's bodies itself turned out to result in life-altering neurological and physiological damage. Even by the federal standard in place at the beginning of 2012 (10 µg/dl), more than a quarter of a million American children were victims of lead poisoning, a condition that almost a century ago was already considered, with some accuracy, as totally preventable. Later in 2012, the Centers for Disease Control (CDC) lowered the level of concern to 5 µg/dl, nearly doubling estimates of the number of possible victims.


Excerpted from Lead Wars by Gerald Markowitz, David Rosner. Copyright © 2013 Gerald Markowitz and David Rosner. Excerpted by permission of UNIVERSITY OF CALIFORNIA PRESS.
All rights reserved. No part of this excerpt may be reproduced or reprinted without permission in writing from the publisher.

Meet the Authors

Gerald Markowitz is Distinguished Professor of History at John Jay College and the Graduate Center of the City University of New York. He is, along with David Rosner, coauthor of Deceit and Denial: The Deadly Politics of Industrial Pollution (UC Press), and eight other books.

David Rosner is Ronald Lauterstein Professor of Public Health and Professor of History at Columbia University and Co-director of the Center for the History and Ethics of Public Health at Columbia's Mailman School of Public Health. In 2010 he was elected to the Institute of Medicine of the National Academy of Sciences.
