When Computers Were Human

Hardcover (Print)
Used and New from Other Sellers
from $13.71 (Save 75%)
Usually ships in 1-2 business days
Other sellers (Hardcover)
  • All (18) from $13.71
  • New (6) from $18.14
  • Used (12) from $13.71

Overview

Before Palm Pilots and iPods, PCs and laptops, the term "computer" referred to the people who did scientific calculations by hand. These workers were neither calculating geniuses nor idiot savants but knowledgeable people who, in other circumstances, might have become scientists in their own right. When Computers Were Human represents the first in-depth account of this little-known, 200-year epoch in the history of science and technology.

Beginning with the story of his own grandmother, who was trained as a human computer, David Alan Grier provides a poignant introduction to the wider world of women and men who did the hard computational labor of science. His grandmother's casual remark, "I wish I'd used my calculus," hinted at a career deferred and an education forgotten, a secret life unappreciated; like many highly educated women of her generation, she studied to become a human computer because nothing else would offer her a place in the scientific world.

The book begins with the return of Halley's comet in 1758 and the effort of three French astronomers to compute its orbit. It ends four cycles later, with a UNIVAC electronic computer projecting the 1986 orbit. In between, Grier tells us about the surveyors of the French Revolution, describes the calculating machines of Charles Babbage, and guides the reader through the Great Depression to marvel at the giant computing room of the Works Progress Administration.

When Computers Were Human is the sad but lyrical story of workers who gladly did the hard labor of research calculation in the hope that they might be part of the scientific community. In the end, they were rewarded by a new electronic machine that took the place and the name of those who were, once, the computers.


Editorial Reviews

Nature - Jon Agar
David Alan Grier's recovery of the wonderfully rich story of human computers . . . ask[s] why human computers were made to disappear in the first place. . . . It is notoriously difficult to recover details of the lives of ordinary people. . . . But Grier triumphantly achieves his aim when discussing the twentieth-century human computer, as many are alive to tell their tales.
Discover - Ann Finkbeiner
Prior to the advent of programmable data-processing electronic devices in the mid-20th century, the word computer was commonly used to describe a person hired to crank out stupefyingly tedious calculations. . . . Human computers have . . . been largely forgotten, and David Alan Grier . . . is intent on restoring them to their rightful place in history.
Atlantic Monthly - James Fallows
When Computers Were Human is a detailed and fascinating look at a world I had not even known existed.
MAA Reviews - Amy Shell-Gellasch
The strength of this book is its breadth of research and its human touch. . . . [A] well written, informative and enjoyable work.
IEEE Annals of the History of Computing - Jonathan P. Bowen
Overall, this book provides a wonderful survey of human computing from 1682 onward. . . . I recommend this book to all historians of computing, both professional and amateur.
From the Publisher
Winner of the 2006 Book Award in Computers/Internet, Independent Publisher Book Awards


Product Details

  • ISBN-13: 9780691091570
  • Publisher: Princeton University Press
  • Publication date: 2/22/2005
  • Pages: 424
  • Product dimensions: 6.54 (w) x 9.60 (h) x 1.14 (d)

Meet the Author

David Alan Grier is Associate Professor in the Center for International Science and Technology Policy at George Washington University. His articles on the history of science have appeared in the "American Mathematical Monthly", "Chance", the "Christian Science Monitor", and the "Washington Post". He is Editor in Chief of the "IEEE Annals of the History of Computing". Long before he learned that his grandmother had been trained as a human computer, he absorbed the methods of programming the electronic computer from his father, who was a scientific computing specialist for the Burroughs Corporation.


Read an Excerpt

When Computers Were Human
By David Alan Grier
Princeton University Press
Copyright © 2005 Princeton University Press
All rights reserved.

ISBN: 978-0-691-13382-9


Introduction: A Grandmother's Secret Life

After a while nothing matters ... any more than these little things, that used to be necessary and important to forgotten people, and now have to be guessed at under a magnifying glass and labeled: "Use unknown."
Edith Wharton, The Age of Innocence (1920)

IT BEGAN with a passing remark, a little comment, a few words not understood, a confession of a secret life. On a cold winter evening, now many years ago, I was sharing a dinner with my grandmother. I was home from graduate school, full of myself and confident of the future. We sat at a small table in her kitchen, eating foods that had been childhood favorites and talking about cousins and sisters and aunts and uncles. There was much to report: marriages and great-grandchildren, new homes and jobs. As we cleared the dishes, she became quiet for a moment, as if she were lost in thought, and then turned to me and said, "You know, I took calculus in college."

I'm certain that I responded to her, but I could not have said anything beyond "Oh really" or "How interesting" or some other empty phrase that allowed the conversation to drift toward another subject and lose the opportunity of the moment. In hindsight, her statement was every bit as strange and provocative as if she had said that she'd fought with the Loyalists in the Spanish Civil War or had spent her youth dealing baccarat at Monte Carlo. Yet, at that instant, I could not recognize that she had told me something unusual. I studied with many women who had taken calculus and believed they would have careers in the mathematical sciences like my intended career. I did not stop to consider that only a few women of my grandmother's generation had even attended college and that fewer still had ever heard of calculus.

My grandmother's comment was temporarily ignored, but it was not lost. It came rushing back into my thoughts, some six or seven years later, as I was sitting in a mathematics seminar. Such events are often conducive to reflection, and this occasion promised plenty of opportunity to think about other subjects. The speaker, a wild-haired, ill-clad academic, was discussing a new mathematical theory with allegedly important applications that were far more abstract than the theory itself. As I was helping myself to tea and cookies, a staple of mathematical talks, I caught a remark from a senior professor. I had always admired this individual, for he had the ability to sleep during the boring parts of seminars and still catch enough of the material to ask deep and penetrating questions during the discussion period. This professor, who had recently retired, was describing his early days at the university during the Great Depression of the 1930s. Having just arrived in the United States from his native Poland and knowing only rudimentary English, he was assigned to teach the engineering calculus course. "This," he stated with a flourish, "was the first time that calculus was required of engineering students at the university." As I listened to his story, I heard my grandmother's phrase from that night long before. "You know, I took calculus in college." I did not know when she had attended college, but having heard my mother's stories of the Depression, I was certain it would have been before 1930. As I settled into my chair, I started to ponder what my grandmother had said. For the next hour, I was lost in my own thoughts and oblivious to the talk, which proved to be the best seminar of the term. During the discussion period, I was asking myself the questions I should have raised at that dinner years before: Where had my grandmother attended college? What courses had she taken? What had she hoped to learn from calculus?

By then, it was too late to ask these questions. My grandmother was gone, and no one knew much about her early life. My mother believed that my grandmother had studied to be an accountant or an actuary. My uncle thought that my grandmother had taken some bookkeeping classes. Our family genealogist, a distant cousin who seemed to know everything about our relations, expressed her opinion that my grandmother's family had been too poor to send her to college. Still bothered by that one phrase, I decided to see what I could learn. My grandmother had been raised in Ann Arbor, the home of the University of Michigan. So one day, I called the college registrar and asked if she had a transcript for my grandmother. I tried to use a tone of voice to suggest that it was the most natural thing in the world for a grandson to review his grandmother's college grades, rather than the other way around. With surprisingly little hesitation, the registrar agreed to my request and left the phone. In a few minutes, she returned and said, "I have her records here."

Catching my breath, I asked, "When did she graduate?"

"Nineteen twenty-one," the registrar responded.

"What was her major?" was my next question.

After a moment of shuffling paper, she replied, "Mathematics."

Three weeks later, I was sitting at a long library table with a little gray box that contained the university's record of my grandmother's life. As I worked through her transcript and the course record books, I was surprised but pleased to see that she had taken a rigorous program of study. In all, she had taken about two-thirds of the mathematics courses that I had taken as an undergraduate, and she had studied with several well-known mathematicians of the 1920s. The professors' record books were particularly intriguing, for they contained little notes that hinted at the activity and turmoil outside the classroom. One mentioned the male students who had left for the First World War; another recorded that he had devoted part of the term to analyzing ballistics problems; a third mentioned that two students had died in the influenza epidemic.

Perhaps the most surprising revelation was the fact that my grandmother was not the only female mathematics student. Of the twelve students who had taken a mathematics degree in 1921, six of them, including my grandmother, were women. The University of Michigan was more progressive than the Ivy League schools, but its liberalism had limits. About a quarter of the university student body was female, but the school provided no dormitory for women and barred them from the student union building, as it was attached to a men's residence hall. University officials also discouraged women from studying medicine, business, engineering, physics, biology, and chemistry. For women with scientific interests, the mathematics department was about the only division of the school that welcomed them. Much of this welcome was provided by a single professor, James W. Glover (1868-1941), who served as the advisor to my grandmother and most of her female peers.

Glover was an applied mathematician, an expert in the mathematics of finance, insurance, and governance. He had been employed as an actuary for Michigan's Teacher Retirement fund, had held the presidency of Andrew Carnegie's Teachers Insurance and Annuity Association, and, in the early years of the century, had served as a member of the Progressive Party's "brain trust," the circle of academic advisors to the party leader, Robert La Follette. Within the University of Michigan, Glover was an advocate for women's education, though he was at least partly motivated by a desire to increase enrollments in mathematics courses. He welcomed women to his classes, encouraged them to study in the department lounge, prepared them for graduate study, and helped them search for jobs. He pushed the women to look beyond the traditional role of schoolteacher and consider careers in business and government. At a time when clerical jobs were still dominated by men, Glover helped his female students find positions as assistant actuaries and human computers, the workers who undertook difficult calculations in the days before electronic computers. At the end of his career, he recorded that he had advised nearly fifty women and that only "one-third have married and have retired from active business life."

Of the six women who graduated in 1921, only one, my grandmother, never worked outside the home. The remaining five had mathematical careers that lasted into the 1950s. One was a human computer for the United States Army and prepared ballistics trajectories. A second did calculations for the Chemical Rubber Company, a publisher that sold handbooks to engineers and scientists. Another compiled health statistics for the state of Michigan. The fourth worked for the United States Bureau of Labor Statistics and eventually became the assistant director of a field office in Baton Rouge. The last female mathematics major of 1921 became an actuary, moved to New York City, and operated her own business. Though my grandmother's hidden mathematical career held a special emotional appeal to me, it was the story of the other five women that captured my interest. What kind of world did they inhabit? What were their aspirations? What did they do each day? At the ends of their careers, what had they accomplished? Rather than restrict my scope to the five women who had known my grandmother or even the women mathematics graduates of the University of Michigan, I decided to look at the history of scientific computers, the workers who had done calculations for scientific research.

Scientific computation is not mathematics, though it is closely related to mathematical practice. One eighteenth-century computer remarked that calculation required nothing more than "persevering industry and attention, which are not precisely the qualifications a mathematician is most anxious to be thought to possess." It might be best described as "blue-collar science," the hard work of processing data or deriving predictions from scientific theories. "Mental labor" was the term used by the English mathematician Charles Babbage (1791-1871). The origins of scientific calculation can be found in some of the earliest records of human history, the clay tablets of Sumeria, the astronomical records of ancient shepherds who watched over their flocks by night, the land surveys of early China, the knotted cords of the Inca. Its traditions were developed by astronomers and engineers and statisticians. It is kept alive, in a sophisticated form, by those graduate students and laboratory assistants who use electronic calculators and computer spreadsheets to prepare numbers for senior researchers.

Though many human computers toiled alone, the most influential worked in organized groups, which were sometimes called computing offices or computing laboratories. These groups form some of the earliest examples of a phenomenon known informally as "big science," the combination of labor, capital, and machinery that undertakes the large problems of scientific research. Many commentators identify the start of large-scale scientific research with the coordinated military research of the Second World War or the government-sponsored laboratories of the Cold War, but the roots of these projects can be traced to the computing offices of the eighteenth and nineteenth centuries.

It is possible to begin the story of organized computing long before the eighteenth century by starting with the great heavenly Almagest, the charts of the planets created by Claudius Ptolemy (85-165) in classical Egypt. As the connection between the ancient world and its modern counterpart is sometimes tenuous, we will begin our story just a few years before the opening of the eighteenth century with two events: the invention of calculus and the start of the Industrial Revolution. Both events are difficult to date exactly, but that is of little concern to this narrative. Identifying the inventors of specific ideas is less important than understanding how these ideas developed within the scientific community. Calculus gave scientists new ways of analyzing motion. Most historians of mathematics have concluded that it was invented independently by Isaac Newton (1642-1727) and Gottfried Wilhelm Leibniz (1646-1716) in the 1680s. It was initially used in astronomy, but it also opened new fields for scientific research. The Industrial Revolution, the economic and social change that was driven by the factory system and the invention of large machinery, created new techniques of management, developed public journalism as a means of disseminating ideas, and produced the modern factory. Most scholars place the start of the Industrial Revolution at the end of the eighteenth century, but this change was deeply influenced by the events of Newton's time. "It is enough to record that by 1700 the foundations of modern technology have been laid," concluded historian Donald Caldwell.

By starting with the invention of calculus, we will overlook several important computational projects, including the Arithmetica Logarithmica by Henry Briggs (1561-1630), the ballistic trajectories of Galileo Galilei (1564-1642), and the planetary computations in the Rudolphine Tables by Johannes Kepler (1571-1630). Each of these projects contributed to the development of science and mathematics. Briggs gave science one of its most important computing tools, the logarithm table. Galileo and Kepler laid the foundation for calculus. However, none of these projects is an example of organized computation, as we define it. None of these scientists employed a staff of computers. Instead, they did the computations themselves with the occasional assistance of a student or friend.

The story of organized scientific computation shares three themes with the history of labor and the history of factories: the division of labor, the idea of mass production, and the development of professional managers. All of these themes emerge in the first organized computing groups of the eighteenth century and reappear in new forms as the story develops. All three were identified by Charles Babbage in the 1820s, when he was considering problems of computation. These themes are tightly intertwined, as mass production clearly depends upon the division of labor, and the appearance of skilled managers can be seen as a specific example of divided and specified labor. However, this book separates these ideas and treats them individually in an attempt to clarify and illuminate the different forces that shaped computation.

The first third of this book, which deals with computation from the start of the eighteenth century up to 1880, treats the first theme, the division of labor. During this period, astronomy was the dominant field of scientific research and the discipline that required the greatest amount of calculation. Some of this calculation was done in observatories for astronomers, but most of it was done in practical settings by individuals who used astronomy in their work, most notably navigators and surveyors. It was a period when the borders of scientific practice were not well defined and many a scientist moved easily through the learned disciplines, scanning the sky one night, navigating a ship the next, and perhaps, on the night following, designing a fortification or preparing an insurance table. The great exponent of divided labor, the Scottish philosopher Adam Smith (1723-1790), wrote The Wealth of Nations during this period. Smith discussed the nature of divided labor in scientific work and even commented briefly on the nature of astronomy. The astronomers of the age were familiar with Smith's ideas and cited them as the inspiration for their computing staffs.

The second third of the book covers the period from 1880 to 1930, a time when astronomy was still an important force behind scientific computation but was no longer the only discipline that required large-scale calculations. In particular, electrical engineers and ordnance engineers started building staffs to deal with the demands of computation. The major change during this period came from the mass-produced adding and calculating machines. Such machines have histories that can be traced back to the seventeenth century, but they were not commonly found in computing offices until the start of the twentieth century. While these machines decreased the amount of time required for astronomical calculations, they had a greater impact in the fields of economics and social statistics. They allowed scientists to summarize large amounts of data and to develop mathematical means for analyzing large populations. With the calculating machines came other ideas that we associate with mass production, such as standardized methods, generalized techniques, and tighter managerial control.

(Continues...)



Excerpted from When Computers Were Human by David Alan Grier
Copyright © 2005 by Princeton University Press. Excerpted by permission.
All rights reserved. No part of this excerpt may be reproduced or reprinted without permission in writing from the publisher.
Excerpts are provided by Dial-A-Book Inc. solely for the personal use of visitors to this web site.

Table of Contents

Introduction: A Grandmother's Secret Life

Part I: Astronomy and the Division of Labor, 1682-1880

Chapter One: The First Anticipated Return: Halley's Comet 1758
Chapter Two: The Children of Adam Smith
Chapter Three: The Celestial Factory: Halley's Comet 1835
Chapter Four: The American Prime Meridian
Chapter Five: A Carpet for the Computing Room

Part II: Mass Production and New Fields of Science, 1880-1930

Chapter Six: Looking Forward, Looking Backward: Machinery 1893
Chapter Seven: Darwin's Cousins
Chapter Eight: Breaking from the Ellipse: Halley's Comet 1910
Chapter Nine: Captains of Academe
Chapter Ten: War Production
Chapter Eleven: Fruits of the Conflict: Machinery 1922

Part III: Professional Computers and an Independent Discipline, 1930-1964

Chapter Twelve: The Best of Bad Times
Chapter Thirteen: Scientific Relief
Chapter Fourteen: Tools of the Trade: Machinery 1937
Chapter Fifteen: Professional Ambition
Chapter Sixteen: The Midtown New York Glide Bomb Club
Chapter Seventeen: The Victor's Share
Chapter Eighteen: I Alone Am Left to Tell Thee
Epilogue: Final Passage: Halley's Comet 1986

Acknowledgments
Appendix: Recurring Characters, Institutions, and Concepts
Notes
Research Notes and Bibliography
Index
Illustration Credits


Customer Reviews

Average Rating: 5 (1 review)

Rating Distribution
  • 5 Star (1)
  • 4 Star (0)
  • 3 Star (0)
  • 2 Star (0)
  • 1 Star (0)

Showing all of 4 Customer Reviews
  • Posted June 5, 2013

    WOW... I love MyDeals247 model - they create competition among the sellers real-time.
  • Posted January 5, 2010


    Remember the team sport of complex calculations?

    Usually, the word "computer" generates images of a powerful, programmable machine that can perform almost any task. However, a "computer" was originally a person who performed complex math. Some "human computers" were scientists who did advanced calculations, but most were workers who labored over the same types of adding, subtracting, multiplying and dividing hour after hour, day after day. Scientist David Alan Grier weaves a wonderful story of the history of computing, framed by the discovery of Halley's Comet and its three subsequent appearances. The comet gives the story a nice structure that helps readers see the advances in computing over the past three centuries. Grier introduces colorful personalities and covers pivotal historical events in the rise of mechanical computing. getAbstract finds that this history book informs your understanding of how computerization advanced while also being a terrific read.

  • Anonymous

    Posted August 28, 2005

    This book really goes into detail...

    I heard about the book through a marketplace segment while I was doing my AP Economics book, and it sounded interesting. I got the impression it would be about this woman who studied calculus but didn't do anything with it. The book was about all these people who majored in mathematics and what they did with their degrees; granted, most of them were human computers, making only $30 a week. Man, things must have been cheap back in the day! If you have time to sit down and read this book, go for it! I really got into the book in the last 100 pages; the first 100 or so were kind of run-on and dull, but once I got to Gertrude Blanch I was like, whoa, this is so cool! I like mathematics and I plan to do something amazing! Everyone should read this book to see what life was like before everyone became dependent on computers!

  • Anonymous

    Posted April 25, 2005

    great

    The best book I have read about computers.
