When Computers Were Human
by David Alan Grier
Before Palm Pilots and iPods, PCs and laptops, the term "computer" referred to the people who did scientific calculations by hand. These workers were neither calculating geniuses nor idiot savants but knowledgeable people who, in other circumstances, might have become scientists in their own right. When Computers Were Human represents the first in-depth account of this little-known, 200-year epoch in the history of science and technology.
Beginning with the story of his own grandmother, who was trained as a human computer, David Alan Grier provides a poignant introduction to the wider world of women and men who did the hard computational labor of science. His grandmother's casual remark, "I wish I'd used my calculus," hinted at a career deferred and an education forgotten, a secret life unappreciated; like many highly educated women of her generation, she studied to become a human computer because nothing else would offer her a place in the scientific world.
The book begins with the return of Halley's comet in 1758 and the effort of three French astronomers to compute its orbit. It ends four cycles later, with a UNIVAC electronic computer projecting the 1986 orbit. In between, Grier tells us about the surveyors of the French Revolution, describes the calculating machines of Charles Babbage, and guides the reader through the Great Depression to marvel at the giant computing room of the Works Progress Administration.
When Computers Were Human is the sad but lyrical story of workers who gladly did the hard labor of research calculation in the hope that they might be part of the scientific community. In the end, they were rewarded by a new electronic machine that took the place and the name of those who were, once, the computers.
"David Alan Grier's recovery of the wonderfully rich story of human computers . . . ask[s] why human computers were made to disappear in the first place. . . . It is notoriously difficult to recover details of the lives of ordinary people. . . . But Grier triumphantly achieves his aim when discussing the twentieth-century human computer, as many are alive to tell their tales." (Jon Agar, Nature)
"Prior to the advent of programmable data-processing electronic devices in the mid-20th century, the word computer was commonly used to describe a person hired to crank out stupefyingly tedious calculations. . . . Human computers have . . . been largely forgotten, and David Alan Grier . . . is intent on restoring them to their rightful place in history." (Ann Finkbeiner, Discover)
"When Computers Were Human is a detailed and fascinating look at a world I had not even known existed." (James Fallows, National Correspondent, Atlantic Monthly)
"The strength of this book is its breadth of research and its human touch. . . . [A] well written, informative and enjoyable work." (Amy Shell-Gellasch, MAA Reviews)
"Overall, this book provides a wonderful survey of human computing from 1682 onward. . . . I recommend this book to all historians of computing, both professional and amateur." (Jonathan P. Bowen, IEEE Annals of the History of Computing)
- Publisher: Princeton University Press
- Publication date:
- Sold by: Barnes & Noble
- Format: NOOK Book
- File size: 4 MB
Read an Excerpt
When Computers Were Human
By David Alan Grier
Princeton University Press
Copyright © 2005 Princeton University Press
All rights reserved.
Introduction: A Grandmother's Secret Life
After a while nothing matters ... any more than these little things, that used to be necessary and important to forgotten people, and now have to be guessed at under a magnifying glass and labeled: "Use unknown."
Edith Wharton, The Age of Innocence (1920)
IT BEGAN with a passing remark, a little comment, a few words not understood, a confession of a secret life. On a cold winter evening, now many years ago, I was sharing a dinner with my grandmother. I was home from graduate school, full of myself and confident of the future. We sat at a small table in her kitchen, eating foods that had been childhood favorites and talking about cousins and sisters and aunts and uncles. There was much to report: marriages and great-grandchildren, new homes and jobs. As we cleared the dishes, she became quiet for a moment, as if she were lost in thought, and then turned to me and said, "You know, I took calculus in college."
I'm certain that I responded to her, but I could not have said anything beyond "Oh really" or "How interesting" or some other empty phrase that allowed the conversation to drift toward another subject and lose the opportunity of the moment. In hindsight, her statement was every bit as strange and provocative as if she had said that she'd fought with the Loyalists in the Spanish Civil War or had spent her youth dealing baccarat at Monte Carlo. Yet, at that instant, I could not recognize that she had told me something unusual. I studied with many women who had taken calculus and believed they would have careers in the mathematical sciences like my intended career. I did not stop to consider that only a few women of my grandmother's generation had even attended college and that fewer still had ever heard of calculus.
My grandmother's comment was temporarily ignored, but it was not lost. It came rushing back into my thoughts, some six or seven years later, as I was sitting in a mathematics seminar. Such events are often conducive to reflection, and this occasion promised plenty of opportunity to think about other subjects. The speaker, a wild-haired, ill-clad academic, was discussing a new mathematical theory with allegedly important applications that were far more abstract than the theory itself. As I was helping myself to tea and cookies, a staple of mathematical talks, I caught a remark from a senior professor. I had always admired this individual, for he had the ability to sleep during the boring parts of seminars and still catch enough of the material to ask deep and penetrating questions during the discussion period. This professor, who had recently retired, was describing his early days at the university during the Great Depression of the 1930s. Having just arrived in the United States from his native Poland and knowing only rudimentary English, he was assigned to teach the engineering calculus course. "This," he stated with a flourish, "was the first time that calculus was required of engineering students at the university." As I listened to his story, I heard my grandmother's phrase from that night long before. "You know, I took calculus in college." I did not know when she had attended college, but having heard my mother's stories of the Depression, I was certain it would have been before 1930. As I settled into my chair, I started to ponder what my grandmother had said. For the next hour, I was lost in my own thoughts and oblivious to the talk, which proved to be the best seminar of the term. During the discussion period, I was asking myself the questions I should have raised at that dinner years before: Where had my grandmother attended college? What courses had she taken? What had she hoped to learn from calculus?
By then, it was too late to ask these questions. My grandmother was gone, and no one knew much about her early life. My mother believed that my grandmother had studied to be an accountant or an actuary. My uncle thought that my grandmother had taken some bookkeeping classes. Our family genealogist, a distant cousin who seemed to know everything about our relations, expressed her opinion that my grandmother's family had been too poor to send her to college. Still bothered by that one phrase, I decided to see what I could learn. My grandmother had been raised in Ann Arbor, the home of the University of Michigan. So one day, I called the college registrar and asked if she had a transcript for my grandmother. I tried to use a tone of voice to suggest that it was the most natural thing in the world for a grandson to review his grandmother's college grades, rather than the other way around. With surprisingly little hesitation, the registrar agreed to my request and left the phone. In a few minutes, she returned and said, "I have her records here."
Catching my breath, I asked, "When did she graduate?"
"Nineteen twenty-one," the registrar responded.
"What was her major?" was my next question.
After a moment of shuffling paper, she replied, "Mathematics."
Three weeks later, I was sitting at a long library table with a little gray box that contained the university's record of my grandmother's life. As I worked through her transcript and the course record books, I was surprised but pleased to see that she had taken a rigorous program of study. In all, she had taken about two-thirds of the mathematics courses that I had taken as an undergraduate, and she had studied with several well-known mathematicians of the 1920s. The professors' record books were particularly intriguing, for they contained little notes that hinted at the activity and turmoil outside the classroom. One mentioned the male students who had left for the First World War; another recorded that he had devoted part of the term to analyzing ballistics problems; a third mentioned that two students had died in the influenza epidemic.
Perhaps the most surprising revelation was the fact that my grandmother was not the only female mathematics student. Of the twelve students who had taken a mathematics degree in 1921, six of them, including my grandmother, were women. The University of Michigan was more progressive than the Ivy League schools, but its liberalism had limits. About a quarter of the university student body was female, but the school provided no dormitory for women and barred them from the student union building, as it was attached to a men's residence hall. University officials also discouraged women from studying medicine, business, engineering, physics, biology, and chemistry. For women with scientific interests, the mathematics department was about the only division of the school that welcomed them. Much of this welcome was provided by a single professor, James W. Glover (1868-1941), who served as the advisor to my grandmother and most of her female peers.
Glover was an applied mathematician, an expert in the mathematics of finance, insurance, and governance. He had been employed as an actuary for Michigan's Teacher Retirement fund, had held the presidency of Andrew Carnegie's Teachers Insurance and Annuity Association, and, in the early years of the century, had served as a member of the Progressive Party's "brain trust," the circle of academic advisors to the party leader, Robert La Follette. Within the University of Michigan, Glover was an advocate for women's education, though he was at least partly motivated by a desire to increase enrollments in mathematics courses. He welcomed women to his classes, encouraged them to study in the department lounge, prepared them for graduate study, and helped them search for jobs. He pushed the women to look beyond the traditional role of schoolteacher and consider careers in business and government. At a time when clerical jobs were still dominated by men, Glover helped his female students find positions as assistant actuaries and human computers, the workers who undertook difficult calculations in the days before electronic computers. At the end of his career, he recorded that he had advised nearly fifty women and that only "one-third have married and have retired from active business life."
Of the six women who graduated in 1921, only one, my grandmother, never worked outside the home. The remaining five had mathematical careers that lasted into the 1950s. One was a human computer for the United States Army and prepared ballistics trajectories. A second did calculations for the Chemical Rubber Company, a publisher that sold handbooks to engineers and scientists. Another compiled health statistics for the state of Michigan. The fourth worked for the United States Bureau of Labor Statistics and eventually became the assistant director of a field office in Baton Rouge. The last female mathematics major of 1921 became an actuary, moved to New York City, and operated her own business. Though my grandmother's hidden mathematical career held a special emotional appeal to me, it was the story of the other five women that captured my interest. What kind of world did they inhabit? What were their aspirations? What did they do each day? At the ends of their careers, what had they accomplished? Rather than restrict my scope to the five women who had known my grandmother or even the women mathematics graduates of the University of Michigan, I decided to look at the history of scientific computers, the workers who had done calculations for scientific research.
Scientific computation is not mathematics, though it is closely related to mathematical practice. One eighteenth-century computer remarked that calculation required nothing more than "persevering industry and attention, which are not precisely the qualifications a mathematician is most anxious to be thought to possess." It might be best described as "blue-collar science," the hard work of processing data or deriving predictions from scientific theories. "Mental labor" was the term used by the English mathematician Charles Babbage (1791-1871). The origins of scientific calculation can be found in some of the earliest records of human history: the clay tablets of Sumeria, the astronomical records of ancient shepherds who watched over their flocks by night, the land surveys of early China, the knotted cords of the Inca. Its traditions were developed by astronomers and engineers and statisticians. It is kept alive, in a sophisticated form, by those graduate students and laboratory assistants who use electronic calculators and computer spreadsheets to prepare numbers for senior researchers.
Though many human computers toiled alone, the most influential worked in organized groups, which were sometimes called computing offices or computing laboratories. These groups form some of the earliest examples of a phenomenon known informally as "big science," the combination of labor, capital, and machinery that undertakes the large problems of scientific research. Many commentators identify the start of large-scale scientific research with the coordinated military research of the Second World War or the government-sponsored laboratories of the Cold War, but the roots of these projects can be traced to the computing offices of the eighteenth and nineteenth centuries.
It is possible to begin the story of organized computing long before the eighteenth century by starting with the great heavenly Almagest, the charts of the planets created by Claudius Ptolemy (85-165) in classical Egypt. As the connection between the ancient world and its modern counterpart is sometimes tenuous, we will begin our story just a few years before the opening of the eighteenth century with two events: the invention of calculus and the start of the Industrial Revolution. Both events are difficult to date exactly, but that is of little concern to this narrative. Identifying the inventors of specific ideas is less important than understanding how these ideas developed within the scientific community. Calculus gave scientists new ways of analyzing motion. Most historians of mathematics have concluded that it was invented independently by Isaac Newton (1642-1727) and Gottfried Wilhelm Leibniz (1646-1716) in the 1680s. It was initially used in astronomy, but it also opened new fields for scientific research. The Industrial Revolution, the economic and social change that was driven by the factory system and the invention of large machinery, created new techniques of management, developed public journalism as a means of disseminating ideas, and produced the modern factory. Most scholars place the start of the Industrial Revolution at the end of the eighteenth century, but this change was deeply influenced by the events of Newton's time. "It is enough to record that by 1700 the foundations of modern technology have been laid," concluded historian Donald Cardwell.
By starting with the invention of calculus, we will overlook several important computational projects, including the Arithmetica Logarithmica by Henry Briggs (1561-1630), the ballistic trajectories of Galileo Galilei (1564-1642), and the planetary computations in the Rudolphine Tables by Johannes Kepler (1571-1630). Each of these projects contributed to the development of science and mathematics. Briggs gave science one of its most important computing tools, the logarithm table. Galileo and Kepler laid the foundation for calculus. However, none of these projects is an example of organized computation, as we define it. None of these scientists employed a staff of computers. Instead, they did the computations themselves with the occasional assistance of a student or friend.
The story of organized scientific computation shares three themes with the history of labor and the history of factories: the division of labor, the idea of mass production, and the development of professional managers. All of these themes emerge in the first organized computing groups of the eighteenth century and reappear in new forms as the story develops. All three were identified by Charles Babbage in the 1820s, when he was considering problems of computation. These themes are tightly intertwined, as mass production clearly depends upon the division of labor, and the appearance of skilled managers can be seen as a specific example of divided and specified labor. However, this book separates these ideas and treats them individually in an attempt to clarify and illuminate the different forces that shaped computation.
The first third of this book, which deals with computation from the start of the eighteenth century up to 1880, treats the first theme, the division of labor. During this period, astronomy was the dominant field of scientific research and the discipline that required the greatest amount of calculation. Some of this calculation was done in observatories for astronomers, but most of it was done in practical settings by individuals who used astronomy in their work, most notably navigators and surveyors. It was a period when the borders of scientific practice were not well defined and many a scientist moved easily through the learned disciplines, scanning the sky one night, navigating a ship the next, and perhaps, on the night following, designing a fortification or preparing an insurance table. The great exponent of divided labor, the Scottish philosopher Adam Smith (1723-1790), wrote The Wealth of Nations during this period. Smith discussed the nature of divided labor in scientific work and even commented briefly on the nature of astronomy. The astronomers of the age were familiar with Smith's ideas and cited them as the inspiration for their computing staffs.
The second third of the book covers the period from 1880 to 1930, a time when astronomy was still an important force behind scientific computation but was no longer the only discipline that required large-scale calculations. In particular, electrical engineers and ordnance engineers started building staffs to deal with the demands of computation. The major change during this period came from the mass-produced adding and calculating machines. Such machines have histories that can be traced back to the seventeenth century, but they were not commonly found in computing offices until the start of the twentieth century. While these machines decreased the amount of time required for astronomical calculations, they had a greater impact in the fields of economics and social statistics. They allowed scientists to summarize large amounts of data and to develop mathematical means for analyzing large populations. With the calculating machines came other ideas that we associate with mass production, such as standardized methods, generalized techniques, and tighter managerial control.
Excerpted from When Computers Were Human by David Alan Grier
Copyright © 2005 by Princeton University Press. Excerpted by permission.
All rights reserved. No part of this excerpt may be reproduced or reprinted without permission in writing from the publisher.
What People Are Saying About This
- George Dyson, author of "Darwin among the Machines"
- Michael R. Williams, Head Curator, Computer History Museum
- James Fallows, National Correspondent, "Atlantic Monthly"
- Theodore M. Porter, author of "Trust in Numbers: The Pursuit of Objectivity in Science and Public Life"
Meet the Author
David Alan Grier is Associate Professor in the Center for International Science and Technology Policy at George Washington University. His articles on the history of science have appeared in the "American Mathematical Monthly", "Chance", the "Christian Science Monitor", and the "Washington Post". He is Editor in Chief of the "IEEE Annals of the History of Computing". Long before he learned that his grandmother had been trained as a human computer, he absorbed the methods of programming the electronic computer from his father, who was a scientific computing specialist for the Burroughs Corporation.
Most Helpful Customer Reviews
Usually, the word "computer" generates images of a powerful, programmable machine that can perform almost any task. However, a "computer" was originally a person who performed complex math. Some "human computers" were scientists who did advanced calculations, but most were workers who labored over the same types of adding, subtracting, multiplying and dividing hour after hour, day after day. Scientist David Alan Grier weaves a wonderful story of the history of computing, framed by the 1758 return of Halley's Comet and its subsequent appearances. The comet gives the story a nice structure that helps readers see the advances in computing over the past three centuries. Grier introduces colorful personalities and covers pivotal historical events in the rise of mechanical computing. getAbstract finds that this history book informs your understanding of how computerization advanced while also being a terrific read.
I heard about the book through a Marketplace segment while I was working through my AP Economics book, and it sounded interesting. I got the impression it would be about a woman who studied calculus but never did anything with it. The book is actually about all these people who majored in mathematics and what they did with their degrees; granted, most of them became human computers, making only $30 a week. Things must have been cheap back in the day! If you have time to sit down and read this book, go for it. I really got into it in the last 100 pages; the first 100 or so were kind of slow and dull, but once I got to Gertrude Blanch I thought, wow, this is so cool! I like mathematics and I plan to do something amazing! Everyone should read this book to see what life was like before everyone became dependent on computers.
The best book I have read about computers.