
Eniac: The Triumphs and Tragedies of the World's First Computer

Overview

ENIAC is the story of John Mauchly and Presper Eckert, the men who built the first digital, electronic computer. Their three-year race to create the legendary ENIAC is a compelling tale of brilliance and misfortune that has never been told before.

It was the size of a three-bedroom apartment, weighed 30 tons, and cost nearly half a million dollars to build--and $650 an hour to run. But in 1945, this behemoth was the cutting edge in technology, and a herald of the digital age to come. This "little gem of a book" tells the story of this machine and the men who built it--as well as the secrecy, controversy, jealousy, and lawsuits that surrounded it--in a compelling real-life techno-thriller.

Engagingly written...this book is an absorbing read for anyone who savors the human stories that always underlie great events. (Wired)

A great story...excellent documentation, interesting asides (the world's first computer programmers were all women) and real drama. (Publishers Weekly)

McCartney has performed an important service by rescuing this tale from obscurity. (Philadelphia Inquirer)

"Fascinating and insightful account...will capture and sustain the interest of even the least technically sophisticated reader. (Kirkus Reviews)

Author Bio: A staff writer for the Wall Street Journal, Scott McCartney is the author of Defying the Gods: Inside the New Frontiers of Organ Transplants and coauthor of Trinity's Children: Living Along America's Nuclear Highway.


Product Details

  • ISBN-13: 9781455128051
  • Publisher: Blackstone Audio, Inc.
  • Publication date: 11/20/2011
  • Format: CD
  • Edition description: Unabridged
  • Pages: 5
  • Product dimensions: 6.80 (w) x 6.10 (h) x 1.20 (d)

Read an Excerpt

Eniac

The Triumphs and Tragedies of the World's First Computer
By Scott McCartney

Berkley Publishing Group

Copyright © 2001 Scott McCartney
All rights reserved.

ISBN: 0425176444


Chapter One


The Ancestors


"You've got mail!" The phrase has become part of our everyday lexicon, a new signpost of popular culture, one that defines an era, much like "I am not a crook" or "I can't believe I ate the whole thing."

Today, a wired America, physically and emotionally, communicates on-line. We have taken Morse code to a new extreme. We can send the equivalent of a library full of information around the globe in bits and bytes, a form of dots and dashes, as fast as energy can move through wires. None of this would be possible, of course, without computers, which provide the backbone of the Internet, the telephone systems, and just about every other mode of communication, transportation, commerce, and government. The computer today has become the heart and lungs of our society, powering and sustaining our lives efficiently and safely ... until it crashes.

We rely on the computer to process, to collect, to store, to track, to decide, and to resolve. It is everywhere. What does the grocery store do when the computer goes down? Could we communicate in the office without E-mail? The computer is controlling our lives if not driving them. And to think, the computer came into existence just over fifty years ago. Generation after generation of computer has been born and, within a few years, died unceremoniously, but this rapid growth has occurred without much careful planning.

So shortsighted was computer development that nobody planned for the millennium, and businesses and governments are spending billions of dollars to ready computers for the next century. Herbert Kelleher, the chairman and chief executive of Southwest Airlines, recalled the first meeting his company had on the Year 2000 problem, when computer engineers warned that the airline's computers might think it was January 1, 1900, instead of January 1, 2000.

"So what?" Kelleher asked. "Won't we know?"

Turns out, it's more important that the computer know. To save precious electronic real estate, shortsighted software developers abbreviated years to two digits instead of using all four. When the year rolls over to 2000, some computers will become confused, and confused computers are likely to shut down. Already, credit cards with expiration dates past 1999 have been unusable in some stores because computers rejected them as "expired." For Southwest and other airlines, the Year 2000 problem may manifest itself in many ways. For example, Federal Aviation Administration computers keep airplanes from crashing into one another. But if the computers themselves crash and airport control towers are unable to use them, it won't be safe for the planes to fly.
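
To make the two-digit shortcut concrete, here is a minimal Python sketch (hypothetical values, not drawn from any real billing or airline system) of why a year stored as "00" looks older than "99":

    # Naive comparison on two-digit years, as many 1990s programs did:
    # a card expiring in "00" (meaning 2000) appears older than the current year "99".
    def card_expired_two_digit(current_yy: int, expiry_yy: int) -> bool:
        return expiry_yy < current_yy

    # The same test once all four digits are kept.
    def card_expired_four_digit(current_year: int, expiry_year: int) -> bool:
        return expiry_year < current_year

    print(card_expired_two_digit(99, 0))        # True  -- wrongly flagged "expired"
    print(card_expired_four_digit(1999, 2000))  # False -- still valid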

The Year 2000 problem looms so large because society depends on computers to make sense of information, to turn data into intelligence.

The power of the computer is that it brings order to chaos. We know that even in nature's randomness, there often is a pattern--we just don't see it. Computers help us sort and shape the world in brand-new ways. They get their intelligence from their power to count, sort, organize, and compare. By brute force--speed and capacity--computers enable government to provide Social Security checks to millions, manufacturers to efficiently run factories, stock exchanges to process billions of transactions, schoolchildren to study frogs in Australia. In each case, the computer is able to order the jumble of data, to make intelligence out of information.

In the history of civilization, truly important advances have come when one is able to bring new order to the world. Around 1600, the Italian astronomer, mathematician, and physicist Galileo mathematicized the physical sciences, developing formulas to explain how the physical world worked. That was a watershed event in scientific history, opening new doors to knowledge. Before Galileo, scientists investigated nature and made measurements, but not much more. With his telescopes, Galileo gained a new understanding of the solar system. His conclusions were so radical he was even put on trial by the Inquisition for saying the sun was the center of the universe and the earth revolved around the sun. Galileo timed the oscillations of a swinging lamp, discovering the isochronism of the pendulum, and that the path of a projectile is a parabola. His findings foreshadowed Sir Isaac Newton's laws of motion. But what was most important, Galileo showed how to create order out of chaos.

The development of the computer is intertwined with this quest for understanding, this search for order. The machine was designed to sort and organize information, and solve problems. Some inventors simply wanted a tool to handle drudgery and repetitive work; others had grander visions of mechanizing the mind. Often the computer is thought of as a twentieth-century phenomenon, but its genealogy is sprinkled within three hundred years of scientific advancement.


Pascal to Babbage: Getting into Gear


In 1642, the year Galileo died, a young man named Blaise Pascal developed an adding machine for his father, Etienne, who was a high official in France, where a way of reorganizing taxation was needed in the face of revolt. The government wasn't keeping up. Pascal fashioned an eight-digit mechanical calculator called the Pascaline that could perform addition when the operator turned a series of rotating gears. The brass rectangular box had eight movable dials; as one dial moved ten notches, or one complete revolution, it moved the next dial one place. When dials were turned in proper sequence, a series of numbers was entered and a cumulative sum obtained. But because the gears were driven by a complicated set of hanging weights, the machine could work in only one direction; hence it could handle addition but not subtraction. By 1652, fifty prototypes had been produced, but few machines were sold. Nevertheless, Pascal demonstrated the importance of calculating machines, and he proved two axioms of computing history. First, the young will lead--Pascal was only nineteen at the time of his invention, making him perhaps the first teenage whiz kid in computer history. Second, new technologies will catch on slowly unless there is a clear and urgent use awaiting them.
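
The carrying mechanism is easy to mimic in a few lines of code. Here is a rough Python sketch of the idea (an illustration only, not a model of the actual brass mechanism) of dials that roll over and nudge their neighbor:

    # Eight dials, each holding 0-9; when a dial completes a full revolution it
    # advances the next dial one place, the carrying idea described above.
    def pascaline_add(dials, digit, position):
        dials = list(dials)
        dials[position] += digit
        while position < len(dials) and dials[position] > 9:
            dials[position] -= 10              # the dial completes a revolution...
            position += 1
            if position < len(dials):
                dials[position] += 1           # ...and moves the next dial one notch
        return dials

    # Accumulate 278 + 345 by entering each number digit by digit (ones, tens, hundreds).
    dials = [0] * 8
    for value in (278, 345):
        for pos, ch in enumerate(reversed(str(value))):
            dials = pascaline_add(dials, int(ch), pos)
    print(dials[:4])  # [3, 2, 6, 0] -- read right to left: 623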

Gottfried Wilhelm Leibniz, a German mathematician and philosopher, took Pascal's wheel one giant turn farther, and learned the same lesson. Born in 1646, four years after Pascal's invention, Leibniz produced a machine capable of multiplication, division, and subtraction as well as addition. Leibniz wanted to free people from the slavery of dull, simple tasks. Partly by studying Pascal's original notes and drawings, he was able to refine the Pascaline into a calculator so ingenious that the design would last for centuries in mechanical desk calculators. Instead of a simple, flat gear, Leibniz used a drum for his wheel, putting on the drum nine teeth of increasing length to make steps. He called his machine the Stepped Reckoner. The user turned a crank once for each unit in each digit in the multiplier, and a fluted drum translated those turns into a series of additions. In 1673, the Leibniz machine was completed and exhibited in London. But it was so far ahead of its time, it was not appreciated until after Leibniz's death. Leibniz's calculation machine couldn't be a success because economics were against it, Vannevar Bush, one of the foremost American scientists, wrote in 1945. There were no savings in labor: Pencil and paper were still cheaper and faster.
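
That crank-per-unit design amounts to multiplication by repeated, shifted addition. A short Python sketch of the idea (illustrative only; the names are invented, not Leibniz's terminology):

    # One crank turn per unit of each multiplier digit; each turn adds the
    # multiplicand, shifted to that digit's place, into a running accumulator.
    def stepped_reckoner_multiply(multiplicand: int, multiplier: int) -> int:
        accumulator = 0
        for place, digit in enumerate(reversed(str(multiplier))):
            shifted = multiplicand * (10 ** place)   # carriage shifted one place
            for _ in range(int(digit)):              # one crank turn per unit
                accumulator += shifted               # each turn is one addition
        return accumulator

    print(stepped_reckoner_multiply(347, 25))  # 8675, built from 5 + 2 crank turns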

Not until 1820 or so did mechanical calculators gain widespread use. By then, the industrial revolution was under way, mass-production manufacturing was in hand, and the public had an enormous hunger for anything mechanical. The steam engine was the prime technology of the time, iron suspension bridges were being constructed, and in England, the Stockton and Darlington Railway was inaugurated in 1825. At the same time, Charles Babbage, a founding member of the Royal Astronomical Society and part of England's intellectual upper crust, conceived of the computer, though he didn't call it that and his device was far from an actual computer. Babbage, who was a mathematician, astronomer, and economist, recognized that an increasingly industrialized society needed better, more accurate methods of calculating. One particular problem was that tables used in maritime navigation, astronomy, and industry were often filled with inaccuracies, and doing calculations yourself required weeks and months of mindless number crunching. Babbage knew that many long computations consisted of operations that were often repeated, so he reasoned he could create a machine that would do those operations automatically. In effect, he wanted to mechanize mathematics, just as machines were converting factories to mass production.

Following Pascal and Leibniz, he designed a machine with a set of adding mechanisms--gears and wheels--that could calculate logarithms and even print the result by punching digits into a soft metal plate for a printing press. He called his room-size contraption the Difference Engine and even designed it as a steam-powered device. Its logic was conceptually simple; the machine would produce mathematical tables by comparing differences in numbers. But mechanically, it was complex.

Work began on the full machine in 1823, and it took ten years to produce a working prototype. Funded in part by the British government, the project was plagued by cost overruns and production difficulties. Unfortunately, the plans for the Difference Engine exceeded the manufacturing capabilities of the nineteenth century. (The task was later likened to a pharaoh having detailed designs for an automobile and trying to build one using the tools, materials, and expertise of the day.) As it turned out, the Difference Engine was never built--partly because Babbage had already fallen into a common trap in computer development. He had come up with a better idea before he completed his first machine, and he turned his attention away from the project already in progress.

In 1833, Babbage unveiled his idea for a new machine, the Analytical Engine. Remarkably similar in design to modern computers, the device could perform any kind of calculation, not just generate tables. Babbage had struggled to make the Difference Engine an automatic device that could take the result of one calculation--one point on the table--and use it to begin the next calculation, for the next point on the table, a process he called "eating its own tail." He found his answer in one of the most brilliant industrial developments of the day, the Jacquard loom, which became the basis for the Analytical Engine.

Joseph-Marie Jacquard, a Frenchman, had devised a loom that would weave by following a pattern laid out in a series of cards with holes punched in them. Intricate patterns of flowers and leaves could be repeated or revised, all by changing the perforated cards. Babbage decided to make a calculating device driven by two sets of cards, one to direct the operation to be performed, and the other for the variables of the problem. His assistant, Augusta Ada King, countess of Lovelace, who was the daughter of the English poet Lord Byron, said the Analytical Engine "weaves algebraical patterns." By placing cards in an orderly series, Babbage could organize various computational tasks--he could program his computer.

The Analytical Engine had another very important concept built into it: It could hold partial results for further use. In effect, it could store information. The machine had a "mill," a term Babbage borrowed from textiles, which consisted of two main accumulators and some auxiliary ones for specific purposes. It functioned as the central processing unit--numbers could be added together in the accumulators, for example. In the textile industry, yarns were brought from the store to the mill, where they were woven. So it was with the Analytical Engine; numbers were brought from storage and woven in the mill into a new product.

Most important, the Analytical Engine had a key function that distinguishes computers from calculators: the conditional statement. The Analytical Engine, Babbage said, could reach a conditional point in its calculations and see what the current value was. If it met one condition, say, greater than 0, the Analytical Engine would proceed one way. If the interim value was less than 0, the Analytical Engine could take a different course.
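
In modern terms this is simply an if/else branch. A small illustration in Python (not Babbage's card notation) of inspecting an interim value and taking one of two courses:

    # The step that separates a computer from a calculator: look at an interim
    # value and choose the next course of action based on what it is.
    def next_step(interim_value: float) -> str:
        if interim_value > 0:    # condition met: proceed one way
            return "continue the current sequence of operations"
        return "branch to an alternative sequence of operations"

    print(next_step(4.2))    # value above zero -> first course
    print(next_step(-1.7))   # value below zero -> different course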

The Analytical Engine was a computing device so brilliant Babbage was derided as a lunatic. The British government refused to fund such a radical contraption, although the government was undoubtedly perturbed that the Difference Engine it had helped fund a decade earlier was never completed and was now defunct. Babbage was largely ignored by contemporaries after he made the proposal for the Analytical Engine. There were many impediments to Babbage's machine. Not only was there no real need for it, but it would not have been very fast. Nor was the technology available to build all the gears and wheels to exacting tolerances. But Babbage's idea would eventually become a reality.


The Dawn of "Data Processing"


The need to handle numbers in an increasingly industrial society did ultimately spur further development. "Data processing" became a mechanical function in 1890 when Herman Hollerith of the Massachusetts Institute of Technology built a punch-card tabulator for the U.S. Census Bureau. The census of 1880 had taken nearly seven years to count, and the Census Bureau feared that with an expanding population, the 1890 census might take ten years, rendering it useless. Hollerith's proposal wasn't inspired by Jacquard's loom or Babbage's Analytical Engine. Instead, the idea came to him as he watched a train conductor punch tickets so that each ticket described the ticket holder: light hair, dark eyes, large nose, and so forth. The ticket was called a punch photograph, and Hollerith recalled to biographer Geoffrey D. Austrian that his inspiration was simply to make a punch photograph of each person for the census. Herman Hollerith was far more practical a genius than his dreamy data-processing predecessors.

Hollerith didn't use punch cards to instruct his machine the way Babbage had envisioned. Instead, Hollerith's cards stored the information. A punch at a certain location on the card could represent a number, and a combination of two holes could represent one letter. As many as eighty variables could be stored on one card. Then the cards were fed through a tabulating and sorting machine. The tabulator had a plate of 288 retractable, spring-loaded pins. A pin encountering a hole completed an electrical circuit as it dipped through, and the electricity drove a counter, not unlike the gears and wheels in Pascal's original calculator. The circuit could also open the lid of one of twenty-four sorting compartments and send the card to the next phase of operation. If the bureau wanted to count men with three children, the cards could be sorted accordingly. For the first time, electricity and computing were converging as intelligence inside a machine. It made for a smarter machine because it made for a faster machine.
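
The tabulating and sorting logic is easy to express in code. Here is a toy Python sketch (with a hypothetical card layout, not Hollerith's actual column assignments) of counting and sorting punched cards:

    from collections import Counter

    # Each "card" is the set of positions punched into it.
    cards = [
        {("sex", "M"), ("children", 3)},
        {("sex", "F"), ("children", 3)},
        {("sex", "M"), ("children", 1)},
        {("sex", "M"), ("children", 3)},
    ]

    # Count men with three children: a card registers only if both holes are
    # present, just as a pin had to drop through a hole to close the circuit.
    wanted = {("sex", "M"), ("children", 3)}
    print(sum(1 for card in cards if wanted <= card))  # 2

    # The sorter's job, by analogy: route each card to a compartment keyed on one field.
    print(Counter(dict(card)["children"] for card in cards))  # Counter({3: 3, 1: 1})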

Initial results from the 1890 census were tabulated in just six weeks, but Hollerith's punch cards were hardly an instant sensation. The Census Bureau declared that the nation's population totaled 62,622,250. Pundits had been hoping that the country had grown to 75 million people, and in the initial shock and disappointment, the machine was blamed. "Useless Machines," the Boston Herald declared. "Slip Shod Work Has Spoiled the Census," the New York Herald chimed in.

Hollerith, of course, had the last laugh. The entire census was completed in less than three years, at an estimated savings of $5 million. It was also far more accurate than previous tallies. Hollerith had figured out a way for his card-tabulating machine to perform multiplication, enhancing its capabilities. Even at the racetrack, Hollerith's ideas took the form of mechanical calculators called totalizators--or tote boards--that continuously added bets and figured odds for each entry. "Real-time computing," or at least calculating, was born. His invention was the backbone of a new company he formed in 1896, Tabulating Machine Company, which through acquisitions ultimately became International Business Machines Corporation. Punch-card machines became the gold standard for data processing. The technology opened new avenues for government. It enabled the Social Security Administration to come into existence in 1935, using punch cards to store wage data on every working American.

Already, there was pressure from scientists to produce faster machines, which would allow for more complicated calculations and more accurate tasks. The 1930s was a time of rapid advancement in "calculating machines"--mechanical contraptions that began making life faster and easier--and MIT was at the center of the leading research.

In 1930, MIT's Vannevar Bush made an enormous breakthrough by developing a machine called the Differential Analyzer. Bush's machine, which used a mathematical method different from that of Babbage's Difference Engine, was full of shafts, wires, wheels, pulleys, and 1,000 gears that appeared to be the work of overzealous elves rather than an effulgent scientist. Everything was driven by small electric motors that had to be carefully adjusted for accuracy of calculations. The Differential Analyzer had to run at slow speeds, because disks could slip and throw numbers off. It measured movements and distances to yield the results of computations based on those measurements.

The machine had input tables, into which operators fed numbers and functions. Its gears could be adjusted to introduce constant functions: Want to double a result somewhere along the calculation? Simply adjust the gears. There was a unit to do multiplication and an output table that could display results in graphs. Different units could be connected in a certain sequence by long shafts, and how the connections were made programmed how the machine would solve the problem. It looked like a huge erector set, wider than the length of a banquet-size dining table. Programming it was a major chore, since all the interconnections had to be undone and redone.

For all that, the Differential Analyzer could solve only one type of problem--differential equations. Differential equations are an important type of calculation involving rates of change and curves that are used to describe happenings in the physical environment, such as the area lying under a curve or the path of a projectile. The formula


x²(d²y/dx²) + x(dy/dx) + y(x² - n²) = 0


is an example of a differential equation used to describe vibration and periodic motion. Once assembled on the Differential Analyzer, substituting any number for n would yield the values of the other variables in the equation. What's the position of the projectile after two seconds? After nine seconds?
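
In software, the same questions are answered by stepping the equations of motion forward in small increments. A brief numerical sketch in Python (simple Euler integration with assumed launch values, not the analyzer's mechanical method):

    # Step dx/dt = vx, dy/dt = vy, dvy/dt = -g forward in small time increments.
    def projectile_position(t_end, vx=50.0, vy=50.0, g=9.81, dt=0.001):
        x = y = t = 0.0
        while t < t_end:
            x += vx * dt      # horizontal position advances at constant speed
            y += vy * dt      # vertical position advances at current vertical speed
            vy -= g * dt      # gravity slows, then reverses, the vertical speed
            t += dt
        return x, y

    print(projectile_position(2.0))  # roughly (100.0, 80.4) after two seconds
    print(projectile_position(9.0))  # roughly (450.0, 52.7) after nine seconds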

The Differential Analyzer was "analog" rather than "digital." That is, it worked in continuous waves rather than specific digits. Like a clock with hands, an analog-computing device gets to the right answer by moving a distance specified by the numbers in the problem. Digital devices, on the other hand, use only discrete numbers, such as those on a digital clock. A speedometer is analog; an odometer is digital. Analog devices are less accurate because the readout is an estimate. In addition, gears wear down, wheels slip, and the machines, over time, grow more inaccurate.

Despite its inherent inaccuracy, the Differential Analyzer was still far superior to other calculating devices, and it became the most sophisticated computational tool for scientific research. It helped lay the groundwork for the notion that if you wanted something fast, you had to make it electric.

Still, electricity was only the muscle, not the brains. As several people around the world were pressing on with more sophisticated calculating machines in the 1930s, a key technological innovation came from an unlikely source: the telephone company.


Electric Calculators Make Their Mark


Bell Telephone Laboratories had been working for years on the science of turning information--the numbers dialed into a telephone, the sound to be transmitted--into electrical signals. To route each call, the telephone company has a network of switches. Early switches required operators to physically connect lines with patch cords. Later, devices called relays were developed to make the connections much faster. Relays are electromechanical gadgets with an armature inside that opens or closes based on electrical current. Like a light switch, they can actually represent information depending on their position. Thus, making a computing machine with switches was a natural pursuit for Bell Labs.

In the mid-1930s, one of Bell Labs' research mathematicians, George R. Stibitz, began using telephone relays to register either a 1 or 0, depending on the position of the relay. He created a circuit of relays, called a flip-flop, which could count based on the flow of electricity. The circuit had lamps that glowed for the digit 1 and were dark for the digit 0. Stibitz's "breadboard" circuit, first assembled on his kitchen table, could count in binary arithmetic, where our traditional base-10 numbers are coded as a series of 1s or 0s. In base 2, four relays could represent ten digits: 0 was 0000, 1 was 0001, 2 was 0010, 3 was 0011, 4 was 0100, 5 was 0101, 6 was 0110, 7 was 0111, 8 was 1000, and 9 was 1001.
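
The same table falls out of a couple of lines of Python, treating each relay as one binary digit (a sketch of the encoding only, not of Stibitz's circuit):

    # Four on/off relays are enough to encode any decimal digit in binary.
    def four_relays(digit: int) -> str:
        return format(digit, "04b")   # "on" prints as 1, "off" as 0

    for d in range(10):
        print(d, four_relays(d))
    # 0 0000, 1 0001, 2 0010, 3 0011, 4 0100,
    # 5 0101, 6 0110, 7 0111, 8 1000, 9 1001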

Stibitz had realized that the telephone relay--the on-off switch--was well suited to the binary arithmetic he had learned in algebra class. If you could sling enough relays together, you could make a very fast calculator. Instead of numbers being represented by slow-moving wheels and drums, they could be represented by fast-moving electricity.

In the fall of 1937, Stibitz drew up plans for a machine to multiply complex numbers and handle computations the company's engineers needed to better construct voice circuitry. The device, creatively called the Complex Number Computer, couldn't be programmed; once numbers were fed in, it multiplied them automatically. It couldn't do further calculation with the result on its own. Construction began at the old Bell Laboratories Building on West Street in Manhattan in April 1939 and was completed in just six months. The machine was an improvement over mechanical desk calculators of the day, and it operated for nine years. But its real value lay not in what it could do but in what it showed could be done. The flip-flop would prove to be a key invention in the development of the computer. It was the means by which electricity could be converted into numbers, and circuits could ultimately be made to process intelligence.

The same year Stibitz was drawing up his plans, Howard Aiken at Harvard was working on computing machines when he stumbled upon Babbage's work. He began working on machines with IBM that would become electromagnetic versions of the Hollerith punch cards. It seemed like a strong alliance. IBM, funding Aiken's work, had itself grown out of Hollerith's Tabulating Machine Company. Hollerith had sold his firm to businessman Charles Ranlett Flint, who merged it with a maker of shopkeeper's scales and a manufacturer of workplace time clocks to create the Computing-Tabulating-Recording Company, or CTR. In 1924, CTR's president, Thomas J. Watson Sr., renamed the conglomerate International Business Machines and issued his famous proclamation: "Everywhere ... there will be IBM machines in use. The sun never sets on IBM."

It was that hunger for ubiquity that drove IBM to fund not only its own development laboratories but also university research at several institutions. IBM, faced with a rapidly changing industry and growing competition, spent liberally so it could stay on top of developments that might lead to new office machines. One of the premier projects was work at Harvard on automatic computing machines--especially Howard Aiken's Mark I.

The Mark I was a slow electromechanical giant that borrowed some of Babbage's principles and used wheels with ten positions on them to represent digits. In fact, Aiken had come across a fragment of Babbage's calculating engine in the attic of Harvard's physics lab, a donation in 1886 by Babbage's son, Henry. Aiken became fascinated with Babbage and began to see himself as the new carrier of Babbage's baton.

The Mark I, constructed by IBM, was a five-ton calculator. The machine's components were lined up and driven by a fifty-foot shaft. Altogether, it had some 750,000 moving parts, and when it churned, observers said it sounded like the roar of a textile mill. Those suffering the din probably would not have appreciated just how fitting that comparison actually was. The beauty of the Mark I was that it was the first fully automatic machine: It could run for hours or even days without operator intervention. Programs were entered on paper tape, and if the same functions were required over and over again, the tape was simply glued together. But Aiken, or IBM, apparently hadn't studied Babbage closely enough; the Mark I had no "conditional branch"--the If ... then statement of computing--even though Babbage designed such a function into the Analytical Engine. And the machine had one huge drawback: Although it could undertake calculations never before mechanized, it was painfully slow.

Now part of the war effort, Mark I undertook computations for the U.S. Navy. As its public unveiling neared in 1944, Aiken began clashing with IBM. Thomas Watson, now the company's chairman, wanted to make a splash with his investment. So he commissioned a designer to spruce up the look of the machine, and it was fitted, over the objections of Aiken, who wanted easy access to all components, with a shiny cabinet of stainless steel and glass with the IBM livery. Then, at the dedication ceremony at Harvard in 1944, according to computer historians Martin Campbell-Kelly and William Aspray, Aiken took credit as the sole inventor of the calculator. He acknowledged neither IBM's underwriting nor the contribution of the firm's engineers, who had taken Aiken's rough concepts and hammered them into a functioning machine. Watson was said to have been shaken with anger and became determined that IBM would undertake its computing research without Harvard.

Aiken himself declared that Mark I was "Babbage's Dream Come True." While Mark I was indeed the kind of machine Babbage had proposed one hundred years earlier, it was, according to Campbell-Kelly and Aspray, perhaps only ten times as fast as the Analytical Engine would have been. The Mark I was so slow it was a technical dead end, soon to be eclipsed by electronic machines thousands of times faster than Babbage's dream. Aiken had used electricity to drive his counting and calculating components, but he had not made the leap to using electricity as the counting and calculating component itself. That leap--using electricity to think--would come in a most unusual way.

Continues...


Excerpted from Eniac by Scott McCartney Copyright © 2001 by Scott McCartney. Excerpted by permission.
All rights reserved. No part of this excerpt may be reproduced or reprinted without permission in writing from the publisher.


Table of Contents

Introduction
Chapter 1: The Ancestors
Chapter 2: A Kid and a Dinner
Chapter 3: Crunched by Numbers
Chapter 4: Getting Started
Chapter 5: Five Times One Thousand
Chapter 6: Whose Machine Was It, Anyway?
Chapter 7: Out on Their Own
Chapter 8: Whose Idea Was It, Anyway?
Epilogue
Notes
Bibliography
Acknowledgments
Index

Customer Reviews

  • Posted April 22, 2011

    The true history of a computer

    It is hard to imagine today, when there is literally a computer in every pocket in the form of a smartphone, that digital computers are a relatively recent development in the course of human history. They have, more than anything else in the past fifty years, changed the way we live and communicate with each other, the way we entertain ourselves, and have touched almost every aspect of our lives in ways that we have increasingly come to take for granted. And yet it is ironic that almost no one would be able to tell you who invented the computer. This is in marked contrast with many other technological inventions that have changed modern civilization. Almost any kid could tell you who invented the steam engine, the cotton gin, the automobile, the telephone, the airplane, the light bulb or the radio. For better or for worse, all of those inventions have a particular name or two associated with them. Unfortunately, because of a series of historical misfortunes, the true inventors of the first functioning digital computer, ENIAC, are hardly household names. Presper Eckert and John Mauchly were the minds behind this seminal WWII effort, and even had the patent to the computer to their credit for a while, but, due to historic misfortune and legal wrangling, lost that piece of prestige.

    This book goes a long way towards righting that wrong. It is well researched and replete with details of the effort that led to the construction of ENIAC, with many interesting and amusing anecdotes. It paints a very humane and sympathetic picture of Eckert and Mauchly, all with their characteristic human foibles and weaknesses. And yet, Scott McCartney is not entirely opposed to the fact that no single individual ultimately benefited from the invention of the computer. To him at least this was the reason why the huge advances in computer industry were possible in such a short span of time.

    Ultimately, this is a very readable and enjoyable book, with a lot of important historical insight.

  • Anonymous

    Posted February 14, 2000

    techie treasure

    a great book, especially for a philly boy who was always told that penn dropped the ball so philly never became silicon valley.

  • Anonymous

    Posted January 4, 2000

    OK Nick, you made your point

    I've been working with computers for over 33 years. I think today's 'kids' don't understand or appreciate the advances in technology and where we came from. I think all college students should be required to read books like this, especially CS majors. Here's a question that most, if not all, CS majors can't answer: How many transistors were in the first commercial computer? Answer: None -- it used all tubes.
