Author Biography: Eli Ginzberg is director of The Eisenhower Center for Conservation of Human Resources, Columbia University. He has been a consultant to nine U.S. presidents and chaired the National Commission for Employment Policy for six presidents. He is the author of numerous books as well as articles on health affairs in the New England Journal of Medicine, the Journal of the American Medical Association, and many other journals.
Although this chapter focuses on the important changes in the structure and function of U.S. medicine between World War II and the passage of Medicare and Medicaid, we begin by identifying key interactions between the leading urban academic health centers and their low-income neighbors. We will use 1910 as a starting date, when the young educator Abraham Flexner completed the first major report on U.S. medical education under the sponsorship of the Carnegie Foundation for the Advancement of Teaching and with the support of the AMA. Flexner's recommendations were twofold. First, he recognized the urgent need for the United States to reduce significantly the number of medical schools from around 160 to 39 to enhance the quality of medical education. Second, but not secondarily, he believed that the nation's medical school faculties should be strengthened and enlarged so that both the courses they taught and the medical diagnoses and treatments they provided would be firmly undergirded by a scientific foundation based on modern biology.
We should recall that some of today's leading AHCs began their operations in the nineteenth century, and a few even as early as the eighteenth. Established largely with philanthropic funding, these AHCs arose from major teaching hospitals that combined with medical schools, with the primary goal of providing hospital care and comfort to the neighborhood sick and injured, particularly to the victims of the periodic epidemics that afflicted so many urban residents. Not until the late nineteenth and early twentieth centuries did increasing numbers of the middle- and upper-income population begin to use the acute care hospital.
By the mid-1920s Flexner's reforms were largely in place, and a number of the nation's leading medical schools had allied themselves with a well-functioning acute care hospital to form an academic health center. These early AHCs had a small full-time faculty, relying on volunteer physicians to provide most of the clinical instruction. Their research activity was limited, and funding was often provided by an affluent physician on the staff or one of his wealthy patients. The AHC students were predominantly undergraduates. Most of these students, after earning their M.D. and completing a one-year internship and passing the licensing examination, would enter practice.
Many neighborhood practitioners were willing to provide hours of uncompensated care in the clinics and on the wards of teaching hospitals. This was particularly the case at AHCs that operated under public auspices and provided charity care as part of their mission and responsibilities. Until the advent of penicillin, which occurred at about the same time as the start of World War II, not even the AHCs could do much for many severely ill and injured patients other than to provide comfort care. Their limited capabilities largely explain why many of the nation's leading teaching hospitals were able to meet their budgets by charging patients, often 40 to 50 in a single ward, only $5 per day. When patients themselves could not make payments, the charges were covered by philanthropic funds and by the extra costs assessed against private patients. An important difference between then and now was the dependence of the medical school on its affiliated hospital to admit reasonable numbers of charity patients. Students in training (medical students, interns, and the relatively small number of graduates who sought advanced, specialized training) needed access to these patients, because paying patients were not then generally available to them.
The United States experienced striking advances in the health of its people during the first decades of the twentieth century, but these gains had relatively little to do with the improved diagnoses and therapies of curative medicine. Instead they mostly reflected such public health advances as the elimination of many infectious and contagious diseases; improvements in the water supply; better sanitation; the improved handling of food products, including milk pasteurization; a better understanding of nutrition; and a host of other highly successful interventions. Another way of describing the relatively limited scale and scope of the nation's health care sector when the United States entered World War II is to recall that total expenditure on health care was approximately 4 percent of the nation's quite modest gross domestic product (GDP). At the end of the 1930s the GDP was no larger than it had been at the beginning of the decade because of the depth of the depression and the slow recovery that followed.
Flexner's report set U.S. medicine on its scientific path for the next three decades. During this time the quality of medical education was also greatly improved, as many inadequate schools were closed and the surviving schools raised their requirements for admission and graduation. Accompanying these reforms were selected advances in radiology, pathology, surgery, and other specialties, and in pharmaceuticals. But the substantial gains in the health of the American people during these decades mostly reflected gains in the public health arena. Progress in public health was itself closely related to advances in the educational level of Americans and their rising standard of living, at least before the onset of the Great Depression. It is thus ironic that medical schools did not teach their students more about public health issues. But overall the United States made tremendous progress in curative medicine in the half-century following the war and ultimately emerged as the world's leader in high-tech medicine.
After the nation went to war, many army enlistees revealed that they had had no previous contact with physicians, hospitals, or clinics. Their first encounter with a modern health care system occurred after they were in uniform, and most responded favorably to the care that they received. Approximately 15 million enlistees and draftees saw service during the war in the army (which included the air corps) or the navy (which included the marines), and all had access to the military health care system. Additionally, several million dependents of the uniformed forces also had some exposure to the medical departments of the army and navy. Together, people in the armed services and their dependents made up about 1 in 6 Americans.
The experiences of the millions who were in the uniformed services during the war, and especially the several hundred thousand battle casualties who were returned to the United States for medical or surgical treatment, produced a widespread, generally positive perception of the contribution that an effectively functioning health and hospital system could make to the well-being of both the armed forces and Americans in general. Wounded World War II soldiers who received immediate medical treatment on the battlefield from a medical corpsman had a 93 percent chance of recovery when finally discharged from a U.S. army hospital.
The active participation of the United States in World War II also brought changes to the health care system at home. The army and navy called on about 40 percent of active physicians who had been engaged in providing health care services to the civilian population. To compensate, at least in part, for the reduced availability of physicians, most remaining practitioners stopped making house calls and saw patients in their office or at a hospital clinic. The federal government also intervened and enrolled medical students in the reserves, paying them a stipend and encouraging medical schools to accelerate their teaching programs to enable graduates to join the armed services more quickly.
Of long-term significance to the U.S. health care system was the importance that the army attached to its commissioned medical officers who entered the service as board-qualified specialists. The army made a commitment to provide the highest possible level of medical care to its troops, especially those seriously injured in battle. As part of this obligation the army organized its general hospitals by specialty center. Facing shortages of specialists among its commissioned physician ranks, the army offered higher rank, more pay, and better assignments to those who were board certified. As the war began to wind down, Congress passed the GI Bill with its multiple monetary and educational benefits for those who had earned an honorable discharge. Impressed by the military's emphasis on specialty certification, many medical officers took advantage of this opportunity to enter or complete their residency training. Underscoring this attention to specialty care was the army policy of matching the type of care that the wounded or otherwise disabled returnee required with the general hospital best able to provide it. The only other criterion that played a role in the assignment process was locating a specialized hospital closest to the patient's home.
The growth in the number of Americans with health insurance was another outcome of U.S. involvement in the war. Although the beginnings of private health insurance under the Blue Cross movement are usually dated to 1929 in Dallas, Texas, the share of Americans with health insurance had risen to no more than 15 percent of the population by the outbreak of World War II. Only after the United States entered the war do we see a considerable increase in the number of insured persons. The trade unions petitioned the War Labor Board in 1942 for permission to negotiate for health care benefits in collective bargaining with employers despite the board's earlier ruling against wage increases. The unions persuaded the board that acquiring health care benefits would not be inflationary. In addition, the Treasury Department decided that employers and employees could pay for such health benefits using pretax income. This subsidy accelerated the growth in the number of beneficiaries in employer health care benefit programs. As of November 1997, health care economist and Princeton University professor Uwe Reinhardt estimated that the current annual value of the tax subsidy approximated $100 billion. This subsidy partially explains continued U.S. corporate involvement with health insurance despite growing conflicts with disgruntled current and retired employees. A second explanation is the general reluctance of private business to expand the role of the government in this area. Third, the tax subsidy is especially valuable to those in higher earning brackets.
U.S. involvement in World War II had two additional repercussions on the nation's health care sector. The first concerned federal support for biomedical research. The federal government had taken the lead during the war to finance important research and development projects, with the atom bomb being the most important example. With prompting from Chicago activist Mrs. Albert Lasker and other proponents of expanded federal funding for research into heart disease, stroke, and cancer, the government entered into a policy of active support for biomedical research after the war. By 1998 this support approximated $14 billion. Bolstering the AHCs in their efforts to secure these funds is the $20 billion that they receive annually in corporate investments to speed the development of new and improved health care techniques and products. AHCs with intellectually strong faculties and ongoing access to philanthropic funds are well positioned to compete successfully in the peer-reviewed process to obtain support from the enlarged National Institutes of Health program for biomedical research.
The passage of the Hill-Burton Act in 1946 was the war's other important consequence affecting health care financing and delivery. This legislation made significant federal funds available for the first time to small and medium-sized communities to build or upgrade their community hospitals. In the absence of such assistance these areas might otherwise not have been able to attract or retain physicians who would be willing to practice locally. The depression and the war had resulted in a serious backlog in hospital construction, making the new financing of Hill-Burton all the more important. Senate Republican leader Robert Taft played a key role in the passage of this legislation. We should not underestimate the importance of this initiative to expand the access of Americans to hospital care. Hill-Burton was a departure for the federal government; Congress had never before become directly involved in financing hospital care services for the American people. Although this assistance was restricted to smaller and medium-sized communities, it was an important development nevertheless.
At the same time, the lack of activity in other areas likewise affected the restructuring of the U.S. health care system. The failure of Congress to increase the supply of physicians is one such example. In the decade after the war various states and nonprofit sponsors established new medical schools. In 1948 Social Security administrator Oscar Ewing recommended that President Truman seek federal funding for expanding the physician supply. However, in the face of strong opposition from the AMA to direct federal funding for medical schools, Congress failed to act.
Universal health care coverage is another example. Shortly after the war, Truman, with the support of liberal Democrats in Congress, sought to put universal health insurance on the congressional docket. This effort did not succeed, largely because of strong opposition from the AMA. Also, most trade unions that had won private health insurance benefits from their employers saw little gain from a new system of tax-based insurance coverage. Moreover, a growing number of employers were comfortable with the health care benefits that they had recently introduced and saw no point in involving the federal government. Waiting in the wings were the private insurance companies that had belatedly begun to recognize the new profit opportunities once they learned how to sell health care coverage to private employers. Not to be overlooked were Blue Cross and Blue Shield, both of which had a major stake in the continuation of the status quo.
One way to assess the long-term impacts of World War II on the development of the U.S. health and hospital system is to point to the lack of changes in overall structure and functioning. Most physicians continued to practice as generalists, primarily as solo entrepreneurs or as members of small groups.
Excerpted from Teaching Hospitals and the Urban Poor by Eli Ginzberg, Howard Berliner, Miriam Ostow, Panos Minogiannis, and J. Warren Salmon. Copyright © 2000 by Yale University. Excerpted by permission.