They are the disease detective corps of the US Centers for Disease Control and Prevention (CDC), the federal agency that tracks and tries to prevent disease outbreaks and bioterrorist attacks around the world. They are formally called the Epidemic Intelligence Service (EIS), a group founded more than fifty years ago out of fear that the Korean War might bring the use of biological weapons, and, like intelligence operatives in the traditional sense, they perform their work largely in anonymity. They are not household names, but over the years they have been first to confront the outbreaks that became known as hantavirus, Ebola, and AIDS. Every day they work to protect us by hunting down the deadly threats that we forget until they dominate our headlines: West Nile virus, anthrax, and SARS among others.
In this riveting narrative, Maryn McKenna—the only journalist ever given full access to the EIS in its fifty-three-year history—follows the first class of disease detectives to come to the CDC after September 11, the first to confront not just naturally occurring outbreaks but the man-made threat of bioterrorism. They are talented researchers—many with young families—who trade two years of low pay and extremely long hours for the chance to be part of the group on the front lines, in the yellow suits and masks, that has helped eradicate smallpox, push back polio, and solve the first major outbreaks of Legionnaires' disease, toxic shock syndrome, and E. coli O157, and that works to battle every new disease before it becomes an epidemic.
Urgent, exhilarating, and compelling, Beating Back the Devil takes you inside the world of these medical detectives who are trying to stop the next epidemic—before the epidemics stop us.
Read an Excerpt
July 2002, Atlanta
On a humid, sunny morning in July 2002, a group of eighty-nine men and women met in a windowless auditorium in an unremarkable pale-brick building set in a leafy suburban neighborhood in northeast Atlanta.
The building, the street-side face of a sprawling warren of cubicles, laboratories, and conference rooms, was the headquarters of the U.S. Centers for Disease Control and Prevention, usually known as the CDC. The CDC is the public health agency for the federal government, the organization charged with identifying, tracking, and trying to prevent the infectious and chronic diseases, environmental hazards, and workplace dangers that may afflict Americans.
At least, that was its official mission. Over its fifty-six years of existence, the CDC's duties, and even more its reputation, had outgrown that dry description. Within its undistinguished 1960s-style architecture, made interesting only by a small marble bust of Hygieia, the Greek goddess of health, in front of the building and some expensive new laboratories rising behind it, worked people who were responsible for eradicating smallpox, the world's worst killer. Its scientists had identified Legionnaires' disease and hantavirus pulmonary syndrome, and had come face to face with Ebola virus and survived. Its researchers had taught Third World villagers low-cost, low-tech ways to protect their drinking water from terrifying parasites. Its acronym was recognized around the globe, often by people who had no idea what its initials meant.
On the inside, the CDC was a combination of cumbersome bureaucracy, research university, idealistic nonprofit, and skunk works. From the outside, it had acquired an unlikely glamour. Imagined versions of its researchers had been featured in movies and best-selling novels, usually (to its employees' sour amusement) in far more plush surroundings than the ones they actually worked in.
The group waiting in the auditorium on that hot July morning was feeling far from glamorous. Its members were variously nervous, excited, jet-lagged, grumpily undercaffeinated, appalled by the heat, anxious over a baby just dropped off for her first-ever morning of day care, and wondering with a thrill of apprehension what they had gotten themselves into.
They were the newest cadre of recruits to the Epidemic Intelligence Service, the fifty-first entering class of the CDC's rapid-reaction force. They had agreed to trade two years of their careers for two years of intensive training in real-time disease detection. They had accepted, along with the training, a commitment to leave at a moment's notice for whatever outbreak needed them, whether it was a deadly new encephalitis on the far side of the planet or a church-supper attack of diarrhea one state away.
This was their first day.
At the front of the auditorium, Polly Marchbanks, a silver-haired nurse and Ph.D. in a rainbow-striped dress, looked out at the recruits.
"I advise you to fasten your seatbelts," she said. "This is going to be difficult."
In the group that assembled in Atlanta that morning, the class of 2002 (named by a quirk of CDC tradition for the year they enrolled rather than the year they would graduate), there were thirty-two men and fifty-seven women; seventy-five Americans and fourteen from other countries; sixty-three Caucasians, fourteen Asians, nine of African heritage, and three whose background was Spanish-speaking.
The EIS is a corps of health professionals; it accepts doctors, dentists, Ph.D.s in some disciplines, physicians' assistants, nurses, and veterinarians. The class of 2002 included fifty-six physicians, twenty-three Ph.D.s, seven veterinarians, two nurses, a dentist, and, for the first time, a lawyer. Each of them, to get their slot, had prevailed over at least two other applicants. Seven of them were Phi Beta Kappa members. Six had been college athletes and four were former Peace Corps volunteers. More than half of them were married or partnered. One was pregnant. Nine of them had children who were not yet walking.
The signature tool of the EIS is epidemiology, the quantitative study of the distribution of disease in populations. Epidemiologists count cases of disease, discern whether there are trends, and devise strategies if the trends need to be reversed. Health workers' exposure to epidemiology varies with their professional training. A Ph.D. in biostatistics might be steeped in it. A family-practice physician might have taken one course in epidemiology early in medical school, and by the time he was done with residency, have forgotten everything he learned five years earlier.
To make sure all its recruits know as much epidemiology as they need to, the CDC puts its new EIS members through what it politely calls a refresher. For three weeks, the group would cram the basics of finding cases, designing studies, and handling lab samples. In effect, they were beginning with boot camp: Epi 101, sixteen hours a day.
"When you arrive at an outbreak, you will find that you get very different reactions to your arrival," said Doug Hamilton, the EIS's chief. A tall, genial family physician and microbiologist who frequently wore Hawaiian shirts to the office, he had headed the EIS for five years. He was born the week the corps was founded, and joined it in 1991 after meeting an EIS member at his twentieth high school reunion. "You will get everything from, 'This is my investigation, back off,' to, 'Thank God, the cavalry has arrived.' "
The new EIS members had been assigned seats in the auditorium at cafeteria-style tables arranged across its width. The seating was alphabetical by last name. It was an uncomplicated method for imposing order on a group that was wildly diverse.
In the front row sat Karen Broder, a pediatrician from a medical family (her father had been director of the National Cancer Institute) who had left private practice to join the EIS. She and her husband, an assistant professor at Emory University next door to the CDC, had two sons, who were three and a half and five years old. She had been pregnant with the younger boy while she was a resident at Massachusetts General Hospital in Boston; she joked that the first noises he recognized were her voice and the sound of her beeper.
Near her sat Wayne Duffus, a Jamaican-born physician and virologist whose family had emigrated to the Bronx when he was in high school. Duffus, who published his first scientific paper when he was still a teenager, had spent all of his academic career in New York City: bachelor's degree from Brooklyn College, M.D. and Ph.D. from Albert Einstein College of Medicine, residency at Columbia-Presbyterian Hospital. For the past year, he had been an infectious diseases fellow at Emory, concentrating on HIV. He had always been torn between the broad implications of lab research and the fine details of patient care; he hoped that coming to the CDC, with its focus on the health of groups, would let him find a middle ground.
Next to him sat Danice Eaton, a Ph.D. in behavioral science who had been an AmeriCorps volunteer, and Kirsten Ernst, a nurse with two master's degrees. Behind them, but not sitting together (because changing seats was frowned on), were Scott Filler and Sami Gottlieb, both doctors and the only married couple in the class. Gottlieb, an internist, had wanted to be an EIS officer since she finished medical school. Meeting Scott, who was three years younger in age but six years behind her in training, had put the goal on hold until he finished as well. They had just returned from volunteering for the summer at an AIDS program in Vietnam.
In between them was Victoria Gammino, an anthropologist with a Ph.D. in international health who had done her dissertation on tuberculosis on a remote atoll in the Pacific, part of the Marshall Islands. Close by sat Alexandre Macedo de Oliveira, an infectious-disease specialist from São Paulo, Brazil, who had applied three times to get into the EIS, and Angela McGowan, the corps' first lawyer. She was a second-generation EIS officer; she had been born during her father's EIS assignment thirty-one years earlier. Her father and her mother both still worked in public health. She had resisted the family business as long as she could, studying international relations in college and international law afterward, but then she succumbed and went for a public health degree as well.
In the same row was Joel Montgomery, a microbiologist from Texas who had already worked at the CDC for two years as a postdoc in parasitic diseases. He was also, as far as anyone knew, the only one in the group to have been in a movie. While working on his Ph.D., he had been caught on camera by a documentary crew filming Komodo dragons, poisonous, eight-foot-long reptiles, in Indonesia. Komodo dragons kill their prey by chewing on them and leaving mouth bacteria in the wound; the prey dies of blood poisoning several days later, and the dragons eat the carcass. Montgomery had been after the same bacteria. He had swabbed the mouths of the thrashing, angry animals while five Indonesian assistants held the reptiles down. After fifteen years in lab science, he was eager to come out from behind the bench, to study the human impact of disease.
In the back of the auditorium were Jennifer Gordon Wright, a veterinarian who had had her first child, a daughter, two months earlier, and John Watson, a South Carolina native who had worked as a family practitioner in Alaska and Seattle and gotten a public-health degree in London. He had wanted to apply to the EIS since he had visited a friend working at the CDC eight years before. His years caring for patients, many of them low-income, had left him with a deep political commitment to health care as a basic right. At the CDC, which had sent two generations of epidemiologists fanning across the globe to improve the health of distant countries, he thought his views would find a home.
The CDC had humble beginnings: It started in 1942 as a government bureau called the Office of Malaria Control in War Areas. "War areas" took in most of the southern states, where there were dozens of military bases to train troops being sent to World War II and hundreds of industrial establishments (boatyards, airfields, factories, and shipping depots) working to support the war effort. Mosquito-borne diseases were a severe challenge for the coastal United States, especially in the South. In the summer of 1853, yellow fever killed more than nine thousand people just in New Orleans; throughout the nineteenth century, yellow-fever quarantines had routinely closed southern port cities. The debilitating recurrent fever of malaria was such a constant in southern life that it had fueled the colonial-era slave trade. European indentured servants sickened and died of malaria; searching for replacements, planters realized that West Africans who had been exposed to malaria since childhood appeared to have some immunity against the disease. In 1942, malaria still persisted in more than thirty states.
Mosquito-borne diseases historically had troubled the military as well, so much so that the U.S. Army funded the research that in 1900 determined mosquitoes were responsible for transmitting yellow fever. Money for the antimalarial drug quinine was one of the first military expenditures authorized by the Continental Congress in the Revolutionary War. Malaria in military personnel had been a persistent and intractable problem ever since.
Malaria control meant, essentially, mosquito control, which before the U.S. patent for DDT was issued in 1943 required searching out the swamps, ditches, and stagnant ponds where mosquito larvae were maturing and either clearing the water of pests with insecticide or smothering the larvae on the surface of the water with oil. DDT made the process much simpler. It killed mosquito adults as well as larvae, so it could be used where the insects encountered humans instead of only where they bred, and once it was sprayed on a surface its effects lasted a long time. Still, DDT's success created more work for the Office of Malaria Control, rather than less. The new chemical was so effective and so cheap that it stimulated wider demand for mosquito control and led to a national program, created by Congress in 1945, that offered to spray civilian homes and businesses as well.
The mosquito programs quickly built the Office of Malaria Control in War Areas into a sizable government unit. It had a significant budget, training facilities and laboratories in several states, and fleets of trucks and several thousand employees to handle the labor-intensive jobs of mosquito control. But it was also a wartime creation with a limited mandate. That left its parent agency, the U.S. Public Health Service, wondering what to do with the new bureau and its possessions once the war ended. The solution, suggested by an assistant surgeon general named Joseph W. Mountin, was to make the malaria-control unit into a peacetime health agency. It would handle diseases caused by insects and parasites, and expand into tracking other infectious diseases as well. Congress approved the change in 1946; it dubbed the new agency the Communicable Disease Center, giving it initials that would persist through three name changes and come to be recognized worldwide.
The Public Health Service was the original federal health agency: It had been chartered by Congress in 1798 to operate public hospitals for sailors. Over 150 years it had grown to take in quarantine offices at ports, immigrant examination stations such as Ellis Island, venereal disease eradication campaigns, field teams of health investigators who tackled rural problems such as hookworm and trachoma, hospitals for military veterans, and research laboratories. Mountin headed a Health Service division called the Bureau of State Services that forged links between the federal agency and state health departments. He intended the new CDC to strengthen those links, funneling assistance and eventually money to the states.
Giving assistance to the states implied going out into the field to find and control disease outbreaks, effectively the same task the agency had performed for malaria control, but expanded to include other diseases. The mission made the new CDC the mirror image of the sixteen-year-old National Institute (later Institutes) of Health, which began as the National Hygienic Laboratory and conducted basic research into infectious and chronic diseases. Lab research was the heart of NIH's work. But at the CDC, which had built up its laboratories in order to do malaria research, Mountin envisioned the labs not as the center of the agency but as support for the work its personnel were doing with state health departments.
Despite the difference in mission, there was conflict between the two agencies. Scientists at NIH and its predecessors had investigated outbreaks as well, though usually in support of an existing research program; that is, not to respond to the emergency, but because they were already interested in the bug causing the problem.
"The CDC asked if NIH was planning to investigate every disease outbreak every time the states asked for help," said Dr. David Sencer, who joined the Public Health Service in 1955, came to the CDC in 1960 and was its director from 1966 to 1977. "They said, 'Certainly not. Only the interesting ones.' "
The new agency solved the conflict by promising to at least look into every outbreak that a state asked about. But the plan had an intrinsic flaw: Though the CDC had engineers, entomologists, and lab scientists left from its malaria work, it had relatively few epidemiologists. In other words, it had plenty of personnel to devise and apply solutions to public health problems, but it was thinly supplied with the researchers who could verify that a problem was occurring in the first place.
The problem was not the CDC's alone: Epidemiologists were in short supply across the country. The first academic department that taught the discipline in the United States was founded in 1919 at Johns Hopkins University. In the next twenty-seven years, many epidemiology graduates went directly to other universities, securing jobs in a field that was just beginning to grow. For the most part, they focused on calculations of risk factors and causes of disease rather than involving themselves in real-time outbreaks. It was a shift from the actions of the man whom the field considered its icon: John Snow, a London doctor who in 1854 mapped cases of cholera in a London neighborhood, and then shut down the outbreak by recommending that authorities take the handle off the neighborhood's water pump.
The nascent CDC wanted workers in the John Snow tradition, but it had relatively little to offer the academic epidemiologists who had already opted to do pure and well-compensated research. It found one: an associate professor at Hopkins named Alexander Duncan Langmuir, who believed that epidemiologists needed as much as possible to come face-to-face with raw data in the field. In 1949, Langmuir agreed to come to the CDC as its chief epidemiologist, a post he would hold for twenty-one years.
He had little success, at first, getting others to join him. In a memoir article published in 1980, ten years after he retired, Langmuir confessed that his first nationwide recruiting campaign netted only "two young physicians who were genuinely interested but totally untrained." Faced with a dearth of qualified candidates to be CDC epidemiologists, Langmuir proposed a novel solution: The agency would grow its own. He would find young physicians who had the qualities he wanted, and train them to be epidemiologists after they were hired.
The newly conceived program received a boost from an unexpected source: the start of the Korean War.
The conflict began in June 1950. Within months, the decision to commit troops revived wartime mechanisms for training and supporting forces in the field that had been on hold since the end of World War II. The ramp-up included a military draft for physicians, scheduled to start in July 1951. However, the rules of the draft included a provision allowing doctors to opt out of military service if they agreed to spend two years in the Public Health Service instead. Physicians quickly took advantage of the loophole: In September 1950, only three months after the war began, Langmuir received the first letters from physicians asking to spend their two years of service at the CDC.
Epi boot camp broke down into distinct parts. Mornings held courses in methods of statistical analysis, interview techniques, and respecting the rights of patients. In the afternoons, the class reviewed accounts of past investigations, absorbing a thirdhand sense of what it felt like to land in an unfamiliar place and be handed responsibility for the health of strangers.
In the middle of the first week, Hamilton led a small group through a 1999 investigation that began with seven children hospitalized with bloody diarrhea near Albany, New York. One child had developed hemolytic uremic syndrome, a life-threatening disorder that includes loss of red blood cells and kidney failure. Lab analysis showed the child was infected with Escherichia coli O157:H7, a foodborne bacterium that releases a potent toxin. All seven children had attended the Washington County Fair nearby. So had more than 100,000 local residents. The New York State Department of Health had called the CDC for help.
"Is this an outbreak?" Hamilton asked.
"An outbreak is more cases than expected in a certain time and place," replied Waimar Tun, a Ph.D. epidemiologist who had worked in Bangladesh and Tibet. "We don't know what that baseline would be."
"So is it worth investigating?" Hamilton prodded.
"The health department asked us to investigate," said Angela McGowan, the attorney. "When they ask us, don't we have to go?"
Hamilton nodded. "The No. 1 reason we do outbreak investigations is to control disease," he said. "The second most important reason to do this is public concern: This is a life-threatening disease. And the third reason is, they are good training for what we do, and responding to public and political concern is an important part of what we do."
He walked them through the steps to be considered. How would they educate themselves about the organism? What supplies should they consider taking with them? If there are additional cases, how will they find them? When they draw a graph of the cases, what will it look like, and what will it tell them about the kind of outbreak it is?
At the end of the exercise, they learned that the seven children were the first indicator of an epidemic. Within a month, there were 761 known cases, and possibly 5,000 all together; sixty-five people were hospitalized, and two died. After a study of the patients and an environmental investigation, the source of the bacteria was found in a fairground well contaminated by cattle manure. It was one of the largest E. coli outbreaks in history.
Hamilton pointed out that the three EIS officers who were sent to the outbreak had no idea how widely the illness would spread, or whether they themselves were at risk.
"It is vital that you take care of yourselves," he said. "If you get sick, you will not do anyone any good."
Doctors' apprehensiveness about going to war had given Langmuir his first slate of epidemiologists-to-be, but it was broader fears about the Korean conflict that truly shaped the EIS. The rise of Cold War tensions had produced deep anxiety in the United States about the possible use of biological weapons, either smuggled covertly into the country or delivered by the long-distance missile systems that the Cold War arms race was rapidly producing. The start of the Korean War amplified those fears. China, which became the United States' chief opponent, had experience with biological weapons: Japan had used them against the Chinese in World War II, and had built weapons-research facilities in the part of China it occupied. (Popular anxiety about China's germ-warfare plans ignored or failed to realize that significant amounts of biological-weapons knowledge had been collected after the war not by China, but by the United States, which granted immunity in exchange for testimony to the heads of Japan's germ warfare program, Unit 731.)
Once the conflict began, the U.S. government evaluated what was known about biological weapons research and concluded that germ warfare directed against the United States was a real risk. A civil defense manual published in December 1950 stated flatly: "An enemy could employ...biological warfare against us effectively."
Langmuir, who as the government's chief epidemiologist was a consultant to the U.S. Army's own biological warfare program at Fort Detrick, Maryland, concurred. In a speech he gave in February 1951 in Kansas City, he outlined the possibilities: Infectious organisms could be released inside buildings. "Specially designed bombs, shells, or other types of disseminating devices discharged from enemy aircraft or from warships offshore" could cast clouds of pathogens over cities. Water supplies could be contaminated, as well as food stores.
"One's imagination is almost unlimited when one considers the wide variety of possibilities and potentialities of this form of warfare," he concluded.
Germ warfare was still a new thought in the 1950s, so Langmuir took care to underline an essential point: Biological weapons would be most valuable to an enemy if they caused full-blown epidemics rather than scattered cases of disease. Because germ warfare is covert by definition, intentionally caused outbreaks would come as a surprise. The number of victims would overload hospitals and emergency medical services. State laboratories that would handle diagnostic samples from victims would be overwhelmed by the amount of analysis required to explain an outbreak; in addition, their scientists might never have seen the organisms likely to be used in weapons. Lacking an explanation for the cases flooding their offices, physicians would treat victims with anything that looked as though it might be effective, an approach that might uncover a useful treatment but would waste time and drugs along the way. Quarantines would be imposed by civil authorities. Mass panic would ensue.
Langmuir was envisioning attacks in which weaponized pathogens would be delivered directly to the United States. He shortly was forced to revise his views. In June 1951, American and United Nations troops in Korea began falling seriously ill.
At first, the illness looked like flu, with severe headache, high fever, loss of appetite, and pain in the back and abdomen. The first sign of something unusual happening showed in the sick soldiers' faces: They became bright red, as though the men were sunburned. Within a few days, victims began to look puffy-faced: Fluid collected around their eye sockets, and their eyes turned pink and then red as small blood vessels broke. Pinpoint hemorrhages appeared first on the palate, then in the armpits and along the chest and neck. The patients vomited blood. Their blood pressure crashed, sending them into shock, and then their kidneys shut down.
More than three thousand troops became sick. Almost two hundred died, most of them within a week of falling ill. When they were autopsied, their kidneys were found to be engorged with blood and dead tissue. Bewildered military doctors dubbed the illness "epidemic hemorrhagic fever." Western medical literature held no record of it but Japanese records did. Japanese soldiers occupying Manchuria more than ten years earlier had been attacked by the same disease, which they believed was carried by field mice and transmitted to humans by mites. But there was no record of the disease ever occurring in Korea. The U.S. military's reluctance to trust the Japanese, combined with widespread anxiety over biological weapons, led to a fresh set of fears: that soldiers had been infected deliberately by an enemy.
Frightening though the cases were, they remained confined to the battlefield. But the outbreak caused Langmuir to contemplate a second possibility: that combatants in Korea might infect U.S. troops with slower-acting organisms that would not make them sick until they returned home. He imagined companies of soldiers unknowingly made to act as human weapons against their own side, a much more efficient delivery method than the bombs and shells he had described months earlier.
Langmuir's predictions appalled Mountin and the new CDC's leadership. They transformed the dearth of epidemiologists from a problem into an emergency. In a tense meeting, Mountin said: "What we need is an epidemic intelligence service."
Years later, Langmuir wrote with satisfaction: "That is what he got." The first EIS class, twenty-one physicians and an environmental engineer who had worked on malaria, reported for two years of duty in July 1951.
Three months later, Langmuir appeared in San Francisco at the annual meeting of the American Public Health Association, before an audience of staff from state and local health departments from across the country. In a speech that made the EIS concept public for the first time, partly to stimulate recruiting for the next year's class, Langmuir described why the new corps was so necessary.
"Any plan of defense against biological warfare sabotage requires trained epidemiologists, alert to all possibilities and available for call at a moment's notice anywhere in the country," he said. "To achieve the desired mobility, the activities of these epidemiologists should be coordinated by a logical federal agency. They must be immediately available, however, to any state or local health department in need."
In one paragraph, Langmuir raised the possibility of biological attack, proposed a corps of detectives to combat it, sold it as a new addition to the taxpayer-funded CDC, and, most important, dangled before the overworked health department employees the promise of highly trained, unpaid help whenever they needed it. Neither Langmuir nor his audience needed to voice the next thought: Biological attacks would at first be indistinguishable from naturally occurring epidemics. The new disease detectives were intended to uncover intentionally caused outbreaks, but they were likely to help health authorities solve many other epidemics along the way.
The states needed the help. Infectious disease outbreaks were common, in part because controls were few: Penicillin had just come into wide use, and childhood immunization programs had yet to make much of an impact. Since full-time investigation of disease outbreaks would be a new undertaking, Langmuir chose his troops carefully. He looked for a specific personality type.
"I remember he would say he wanted three characteristics in his EIS officers," said J. Lyle Conrad, a physician who came to the CDC as an EIS officer in 1965 and afterward ran a division of the CDC for eleven years. "He wanted the brightest people he could get his hands on. He wanted them to be aggressive; he even wanted them to be abrasive to a degree, to make some headway against the dead wood in public health. And he wanted them to be enthusiastic. He wanted people who, if he said, 'All right, young man, you've been doing cancer epidemiology too long; I've got a measles outbreak in South Dakota, and you leave tomorrow,' would pack their bags, leave tomorrow, and land in the snow in South Dakota without batting an eye."
The CDC's willingness to dispatch an EIS officer at any hour became the hallmark of the program. "State health officials were astounded to find bright, young, responsive epidemiologists in their offices the next morning, or even sometimes the same day that they called," Langmuir wrote in 1980. "Any situation to which the term 'epidemic' could be even remotely applied was accepted as within our jurisdiction, at least for a preliminary investigation."
Langmuir's plan contained substantial risk: He effectively staked the reputation of the nascent CDC on the performance of the newly minted researchers it sent out into the field. For the agency to succeed as Mountin envisioned, it needed to be closely tied to state health departments. Those departments would not have cared how good a CDC representative's medical credentials were. The only thing that mattered to them was whether novice epidemiologists arriving from the CDC could solve the outbreaks the states were grappling with.
Out of luck, good planning, or perhaps because Langmuir was a good judge of character, the plan succeeded. The freshly trained epidemiologists who were sent out from Atlanta puzzled out the epidemics they were presented with, even if they sometimes needed to read up on the suspected problem while they were en route to the outbreak. The group quickly began to pride itself on what it called "shoe-leather epidemiology," its core technique: trekking through the landscape of an investigation, door-to-door if need be, to conduct the face-to-face interviews that would supply the data its analyses needed. It adopted a symbol that hinted at the labor involved: the outline of the sole of a shoe with a hole worn through the ball of the foot, superimposed on a globe.
Langmuir had sold the EIS to the public health establishment as an always-available source of assistance. At home at the CDC, he hit different notes. The ultimate purpose of the EIS was to make epidemiologists, in the hopes that some at least would stick with the specialty once their two years of CDC service had ended. Investigating outbreaks was one facet of training, but not the only one. The new EIS members were taught to conduct routine surveillance for undetected health problems and to sift through collections of data for unrecognized opportunities to prevent and control diseases. They were expected to write papers and to deliver presentations at scientific conferences.
In the early years, they did all of it under Langmuir's direct supervision.
"Alex was a bit of a tyrant," Sencer said. "He was very demanding. Anybody who wrote a paper had to sit down with Alex, and Alex would go through it and tell you what needed work. You'd come back two or three times, but when that paper was done, it was good."
Langmuir retired from the CDC in 1970. He died in 1993 after a second career teaching public health. The program outlived him. Between 1951 and 2002, more than 2,700 people joined the EIS. Afterward, one-third of them stayed with the CDC; another third took jobs in state health departments and other federal agencies, and one-fifth joined university faculties. Three of them became directors of the CDC; ten ended up as deans of schools of public health. Two became U.S. surgeons general. Langmuir's initial goal, creating a cadre of trained epidemiologists to spread through U.S. public health, had been achieved.
His second goal, establishing an early-warning system against domestic biological attack, gradually came to seem less relevant as the Cold War ended and the Soviet Union dismantled itself. In the world of defense intelligence, deep concern about biological weapons never abated. U.S. research into them continued, first offensive, and then defensive after germ warfare was outlawed by international treaty in 1972.
Little of that continuing disquiet, though, touched the CDC. At its Atlanta headquarters, the onetime mosquito-control bureau built itself over fifty years into the world's premier public health agency. Its mandate expanded, taking in chronic diseases, environmental health, sexually transmitted diseases, birth defects, and the acquisition and processing of millions of data-points of health statistics. It fielded teams of field and laboratory scientists who worked in dozens of countries, eradicating smallpox and curbing measles, polio, and tuberculosis. Its name became so well recognized that the People's Republic of China dubbed its own health agency the "China CDC" in tribute, though the initials are meaningless in Chinese.
Physically and psychologically, the CDC remained remote from the rest of the government, even when concerns about biological-weapon defense were renewed in the 1990s. That changed abruptly with the October 2001 revelation that an unknown assailant had launched anthrax-laden letters against media and politicians in Florida, Washington, D.C., and New York. Five people died; seventeen others were made ill. The assumption that the American homeland was safe from biological attack was permanently shattered.
The CDC realized that Langmuir's intentions for the EIS still had relevance. For the first time since the corps' founding, new members would have to be taught to counter outbreaks of disease that had been deliberately caused.
"I really hope," said Colonel Ted Cieslak, "that this week is a waste of all your time."
Cieslak, an Army physician, stood at the front of a windowless white classroom in a decommissioned Army hospital in Anniston, Alabama. Cieslak was chief of pediatrics at a military medical center in San Antonio, Texas, and the brother of an EIS officer from the class of 1992. The CDC had asked him to speak based on the job he used to have: He was a former division chief from the U.S. Army Medical Research Institute of Infectious Diseases, the government installation at Fort Detrick, Maryland, that had conducted all of the Pentagon's biological-weapons research.
"Until recently, biological warfare was a paragraph tucked into the back of our chemical weapons manual," he said. "But in this day and age, you are going to have to go about your business with the possibility always tucked into the back of your minds."
The members of the EIS class shifted in their seats. The CDC had bused them all to Anniston, ninety-five miles from Atlanta, thinking that the change in routine would break up the monotony of training and give the group one last chance to get to know each other. "We want them to be bonded as a group," Hamilton said. "The friendships that are made in these two years are really enduring. And if we face a situation again like the anthrax attacks, the cohesion they have developed will help to pull them through."
The class looked exhausted. After weeks of intensive schooling in recognizing natural outbreaks of disease, their brains were almost numb. They had little stamina left for five more twelve-hour days of training, even training that promised, in theory at least, to save not just the general public's lives but their own. And the bland setting, a cinder-block room inside a nondescript brick building on an overgrown military reservation, bled the reality out of what would otherwise be a course in varieties of nightmare.
However, the setting was less benign than it looked. The reservation, Fort McClellan, was the former home of the U.S. Army's school for chemical warfare. Buried on the grounds, along with German and Italian prisoners of war who died while being held there, and bomb-sniffing military dogs that had been decorated for exemplary service, were 2,200 heavily guarded tons of nerve gas, mustard gas, and Sarin. The chemical weapons were scheduled for mass incineration, but their destruction had been postponed multiple times while the Army struggled with the risks of cracking open the stockpile. Without knowing it, the EIS officers were surrounded by some of the weapons they were being trained to defend against.
Cieslak took them briskly through the basics. Category A agents, the most dangerous not only for their infectiousness but for the panic they would inspire: anthrax, botulism, smallpox, plague, and viral hemorrhagic fevers including Ebola. Category B agents, ones that would cause more sickness though less death, such as the livestock diseases brucellosis and Q fever, and ricin, a toxin expressed from castor beans. Category C agents, newly recognized diseases that the CDC feared could be engineered into weapons, like hantavirus, which had emerged unexpectedly in the Southwest a decade earlier, and Nipah virus, which had killed one hundred people in Malaysia in 1999.
PowerPoint images clicked by with the lecture. An electron-micrograph of a virus. An image of a first responder at an anthrax hoax, hidden from head to toe behind hooded coveralls and scuba-style breathing gear. A child with full-blown smallpox infection. That drew gasps. Smallpox had been eradicated before most of the class members reached high school. None of the physicians in the group had studied it in medical school. Few people, other than intelligence analysts, had expected to have to encounter it again. But on the list of organisms likely to be deployed by a terrorist, smallpox now ranked close to the top. Because most vaccinations had ceased thirty years ago, much of the world was considered vulnerable to it. Its release could potentially cause mass outbreaks, and a faster-moving epidemic of fear.
"You should be asking yourself: What is the average terrorist thinking?" Cieslak said. "If they are seeking a strategic goal, the destruction of the United States, for instance, there are probably only four or five biological agents they are going to use. But if the terrorist's goal is not so grandiose, the list of potential agents is much longer. For the big four, anthrax, smallpox, botulism, and plague, diagnosis may actually be quite easy. Unfortunately, if it's one of the rest, your problems immediately increase."
An attack would come as a surprise, the physician reminded them. The first signs that it had happened would be the cases of illness that it caused.
"If you suspect you've been hit with a biological weapon, the rest is relatively easy," he said. "You'll be able to do the right thing, if you suspect it in time. The problem in biological defense is recognizing the attack in the first place."
Cieslak sat down. A radiation physicist, a small, dry man in a suit, took his place. He looked out at the class. Shaken out of their exhaustion, all eighty-nine of them stared back.
"Is there anyone here," he said, "who knows how to operate a Geiger counter?"
The EIS class members stood in a rough circle inside another Anniston classroom, two buildings away from where they had started the week. At their feet, each of them had a large plastic trash bag. The bags were twisted shut; blank name-tags with wire tails held them closed.
The bags held the essentials of the last phase of their training. They had spent most of the week listening to the possible forms a terrorist attack could take. Now, they were going to enact one. The simulation, they had been told in advance, would be a poison-gas attack. They would suit up, set up their emergency response, find and treat the victims, and then learn how to retreat without becoming victims themselves, all within the confines of the building. The simulation was expected to be stressful. Outside the building, several paramedics were standing by.
The class split up into smaller groups, trailing after instructors into separate corridors of the building. One went with Danny Garrick, a tall, broad-shouldered man with hair so short and a manner so curt he could have passed for a drill instructor. Coached by Garrick, the group opened up their bags and got acquainted with the contents: Surgical gloves, with thick electricians' gauntlets to cover them. Heavy rubber boots with clear plastic liners to make them easier to pull on. A zipped white coverall with booties, a hood, and elastic at the waist, neck, and sleeves.
"This is Tyvek," Garrick said, holding it up. "It's a good protection against particulate matter. It offers limited splash protection. If you're somewhere where it's wet, don't kneel when you're wearing this; whatever the liquid is, it will soak right through."
The last item in the bags was an air-purifying respirator: a thick clear-plastic mask mounted on a pair of filter cartridges, surrounded by a heavy rubber gasket and held on the head with wide elastic straps. The masks were a struggle to don quickly: The elastic pulled hair out of place and the gaskets sat uncomfortably on beards.
"Take deep, slow breaths," Garrick warned them. He strolled around the group, stopping at a woman who had pushed the gasket down over her bangs. He bent slowly, bringing his face level with the front of her mask, and peered in.
"That hair inside your mask is breaking the seal that's protecting you," he said, gently. "You realize, if this were a real emergency, you'd be dead right now."
The group shuffled rapidly into the training area, confronting the body on the gurney and a set of incongruous props: a stack of trash bags, a garden sprayer, and a child's plastic wading pool.
The pool and the sprayer, Garrick said, were for victims who could walk into decontamination. If nothing like them was available, trash cans would do, or buckets or a garden hose: anything that would let them rinse victims down rapidly with water, plus bleach if they could get it or soap if they could not.
"Think about the complexities here," he said. "Time is of the essence. You're going to have to move these people to medical care. If you have a parent with a child, you're going to have to decontaminate them together. If you have a blind person with a dog, you're going to have to decontaminate the dog."
Victims got washed down once with their clothes on, then again naked. They could take their clothes off themselves if they were able to, but things would go faster if the EIS cut them off instead. The trash bags served two functions: They were evidence bags for the clothing, because a terrorist attack would be considered a crime scene; and they were a modest emergency covering once the clothing was taken away.
"If you're at a hospital, you might be able to find sheets, or a stash of scrubs. If you can't, then trash bags are your best substitute," Garrick said. "If you're grabbing them in a hurry, try to make sure you pick the dark ones, not the ones that are clear."
The last part of the exercise was learning how to take the suits off safely. The class members stepped into the trash bags the suits had arrived in, holding them open with their knees. The gear came off in a careful sequence: First, drop the gauntlets into the bag, peel the suit away to the knees, and step out of the boots and suit. Then, take a deep breath and hold it; pull off the respirator and gloves; tilt your face away; drop them into the bag and twist it shut; and breathe again. The last step was to label the bag with the name-tag, in case the contaminated gear was considered evidence as well.
The class members reassembled in the classroom, breathing deeply and rubbing the red marks the gaskets had left on their faces. The exercise had lasted fifty minutes. If it ever occurred in real life, it would be hours, Messanier warned them on the way out.
"If something happens, you are on your own," he said. "You will be on your own for at least eight to twelve hours. That's the minimum amount of time it will take for the rest of the feds to show up."
Copyright © 2004 by Maryn McKenna