A History of Modern Computing, second edition / Edition 2

by Paul E. Ceruzzi
ISBN-10:
0262532034
ISBN-13:
9780262532037
Pub. Date:
04/08/2003
Publisher:
MIT Press
Price: $40.00

Overview

From the first digital computer to the dot-com crash—a story of individuals, institutions, and the forces that led to a series of dramatic transformations.

This engaging history covers modern computing from the development of the first electronic digital computer through the dot-com crash. The author concentrates on five key moments of transition: the transformation of the computer in the late 1940s from a specialized scientific instrument to a commercial product; the emergence of small systems in the late 1960s; the beginning of personal computing in the 1970s; the spread of networking after 1985; and, in a chapter written for this edition, the period 1995-2001. The new material focuses on the Microsoft antitrust suit, the rise and fall of the dot-coms, and the advent of open source software, particularly Linux. Within the chronological narrative, the book traces several overlapping threads: the evolution of the computer's internal design; the effect of economic trends and the Cold War; the long-term role of IBM as a player and as a target for upstart entrepreneurs; the growth of software from a hidden element to a major character in the story of computing; and the recurring issue of the place of information and computing in a democratic society. The focus is on the United States (though Europe and Japan enter the story at crucial points), on computing per se rather than on applications such as artificial intelligence, and on systems that were sold commercially and installed in quantities.


Product Details

ISBN-13: 9780262532037
Publisher: MIT Press
Publication date: 04/08/2003
Series: History of Computing
Edition description: second edition
Pages: 460
Sales rank: 733,008
Product dimensions: 6.13(w) x 9.06(h) x 1.14(d)
Age Range: 18 Years

About the Author

Paul E. Ceruzzi is Curator at the National Air and Space Museum at the Smithsonian Institution. He is the author of Computing: A Concise History, A History of Modern Computing, and Internet Alley: High Technology in Tysons Corner, 1945–2005, all published by the MIT Press, and other books.

Read an Excerpt

Chapter One

The Advent of Commercial Computing, 1945-1956


"[Y]ou ... fellows ought to go back and change your program entirely, stop this ... foolishness with Eckert and Mauchly." That was the opinion of Howard Aiken, Harvard mathematician and builder of the Mark I calculator, expressed to Edward Cannon of the U.S. National Bureau of Standards in 1948. Aiken made that remark as a member of a National Research Council committee that had just recommended that the Bureau of Standards not support J. Presper Eckert and John Mauchly's proposal to make and sell electronic computers (figure 1.1). In Aiken's view, a commercial market would never develop; in the United States there was a need for perhaps five or six such machines, but no more.

    Howard Aiken was wrong. There turned out to be a market for millions of electronic digital computers by the 1990s, many of them personal devices that fit easily into a briefcase. That would not have happened were it not for advances in solid state physics, which provided a way of putting the circuits of a computer on a few chips of silicon. Nevertheless, the nearly ubiquitous computers of the 1990s are direct descendants of what Eckert and Mauchly hoped to commercialize in the late 1940s.

    The Eckert-Mauchly Computer Corporation did not remain an independent entity for long; it was absorbed by Remington Rand and became a division of that business-machine company. Eckert and Mauchly's computer, the UNIVAC, was a technical masterpiece but was eclipsed in the market by computers made by Remington Rand's competitor, IBM. So one could say that they were indeed foolish in their underestimation of the difficulties of commercializing their invention. What was not foolish was their vision, not only of how to design and build a computer but also of how a society might benefit from large numbers of them.

    Computing after 1945 is a story of people who at critical moments redefined the nature of the technology itself. In doing so they opened up computing to new markets, new applications, and a new place in the social order. Eckert and Mauchly were the first of many who effected such a transformation. They took an expensive and fragile scientific instrument, similar to a cyclotron, and turned it into a product that could be manufactured and sold, if only in small quantities. In the mid-1950s the IBM Corporation developed a line of products that met the information-handling needs of American businesses. A decade later, alumni from MIT's Project Whirlwind turned the computer into a device that one interacted with, a tool with which to augment one's intellectual efforts. In the mid-1970s, a group of hobbyists and enthusiasts transformed it into a personal appliance. Around 1980, it was transformed from a piece of specialized hardware to a standardized consumer product defined by its now-commercialized software. In the 1990s it is going through another transformation, turning into an agent of a worldwide nexus, a communications medium. The "computer age" — really a series of "computer ages" — was not just invented; it was willed into existence by people who wanted it to happen. This process of reinvention and redefinition is still going on.


The UNIVAC in Context


Eckert and Mauchly brought on the first of these transformations in 1951 with a computer they called "UNIVAC." The acronym came from "Universal Automatic Computer," a name that they chose carefully. "Universal" implied that it could solve problems encountered by scientists, engineers, and businesses. "Automatic" implied that it could solve complex problems without requiring constant human intervention or judgment, as existing techniques required. Before discussing its creation, one needs to understand how computing work was being done in different areas and why a single machine, a UNIVAC, could serve them equally well. One must also understand how existing calculating machines, the results of decades of refinement and use, were deficient. It was that deficiency that made room for the UNIVAC, which broke with past practices in many ways.


Punched Cards

During the Second World War, Eckert and Mauchly designed and built the ENIAC at the University of Pennsylvania's Moore School of Electrical Engineering. The ENIAC was an electronic calculator that inaugurated the era of digital computing in the United States. Its purpose was to calculate firing tables for the U.S. Army, a task that involved the repetitive solution of complex mathematical expressions. It was while working on this device that they conceived of something that had a more universal appeal.

    The flow of information through the UNIVAC reflected Eckert and Mauchly's background in physics and engineering. That is, the flow of instructions and data in the UNIVAC mirrored the way humans using mechanical calculators, books of tables, and pencil and paper performed scientific calculations. Although the vacuum tube circuits might have appeared novel, a scientist or engineer would not have found anything unusual in the way a UNIVAC attacked a problem.

    However, those engaged in business calculations, customers Eckert and Mauchly also wanted their machine to serve, would have found the UNIVAC's method of processing unusual. In the late nineteenth century, many businesses adopted a practice that organized work using a punched card machine; typically an ensemble of three to six different punched-card devices would comprise an installation. To replace these machines with a computer, the business had also to adopt the UNIVAC's way of processing information. Punched-card machines are often called "unit record equipment." With them, all relevant information about a particular entity (e.g., a sales transaction) is encoded on a single card that can serve multiple uses by being run through different pieces of equipment; for example, to count, sort, tabulate, or print on a particular set of columns. Historical accounts of punched-card machinery have described in great detail the functioning of the individual machines. More relevant is the "architecture" of the entire room — including the people in it — that comprised a punched-card installation, since it was that room, not the individual machines, that the electronic computer eventually replaced.

    In a typical punched-card installation, the same operation was performed on all the records in a file as a deck of cards went through a tabulator or other machine (figure 1.2). The UNIVAC and its successors could operate that way, but they could also perform a long sequence of operations on a single datum before fetching the next record from memory. In punched-card terms, that would require carrying a "deck" of a single card around the room — hardly an economical use of the machinery or the people. Processing information gathered into a deck of cards was entrenched in business practices by the mid-1930s, and reinforced by the deep penetration of punched-card equipment salesmen into the accounting offices of their customers.

    By the 1930s a few scientists, in particular astronomers, began using punched-card equipment for scientific problems. They found that it made sense to perform sequences of operations on each datum, since often the next operation depended on the results of the previous one. One such person was Wallace Eckert (no relation to J. Presper Eckert), who with the aid of IBM established the Thomas J. Watson Computing Bureau at Columbia University in New York in 1934. In 1940 he summarized his work in an influential book, Punched Card Methods in Scientific Computation. In it, he explained that punched-card machines "are all designed for computation where each operation is done on many cards before the next operation is begun." He emphasized how one could use existing equipment to do scientific work, but he stated that it was not worth the "expense and delay involved" in building specialized machines to solve scientific problems. A decade later, that was precisely what J. Presper Eckert and John Mauchly were proposing to do — go to great expense and effort to create a "universal" machine that could handle both business and scientific problems.

    Ironically, Wallace Eckert was among the first to venture away from traditional punched-card practices and toward one more like the digital computers that would later appear. Despite his recommendation against building specialized equipment, he did have a device called a control switch designed at his laboratory. He installed this switch between the multiplier, tabulator, and summary punch. Its function was to allow short sequences of operations (up to 12) to be performed on a single card before the next card was read. Following his advice, IBM built and installed two specially built punched-card machines at the U.S. Army's Ballistic Research Laboratory at Aberdeen, Maryland. IBM called these machines the "Aberdeen Relay Calculators"; they were later known as the PSRC, for "Pluggable Sequence Relay Calculator."

    In late 1945, three more were built for other military labs, and these were even more complex. During the time one of these machines read a card, it could execute a sequence of up to forty-eight steps. More complex sequences-within-sequences were also possible. One computer scientist later noted that this method of programming demanded "the kind of detailed design of parallel subsequencing that one sees nowadays at the microprogramming level of some computers." When properly programmed, the machines were faster than any other nonelectronic calculator. Even after the ENIAC was completed, installed, and moved from Philadelphia to Aberdeen, the Ballistic Research Lab had additional Relay Calculators built. They were still in use in 1952, by which time the BRL not only had the ENIAC but also the EDVAC, the ORDVAC (both electronic computers), an IBM Card-Programmed Calculator (described next), and the Bell Labs Model V, a very large programmable relay calculator.


The Card-Programmed Calculator


The Aberdeen Relay Calculators never became a commercial product, but they reveal an attempt to adapt existing equipment to post-World War II needs, rather than take a revolutionary approach, such as the UNIVAC. There were also other punched-card devices that represented genuine commercial alternatives to Eckert and Mauchly's proposed invention. In 1935 IBM introduced a multiplying punch (the Model 601), which soon became popular for scientific or statistical work. In 1946 IBM introduced an improved model, the 603, the first commercial IBM product to use vacuum tubes for calculating. Two years later IBM replaced it with the 604, which not only used tubes but also incorporated the sequencing capability pioneered by the Aberdeen machines. Besides the usual plugboard control common to other punched-card equipment, it could execute up to 60 steps for each reading of a card and setting of the plugboard. The 604 and its successor, the IBM 605, became the mainstays of scientific computing at many installations until reliable commercial computers became available in the mid-1950s. It was one of IBM's most successful products during that era: over 5,000 were built between 1948 and 1958.

    One of IBM's biggest engineering customers, Northrop Aircraft of Hawthorne, California, connected a 603 multiplying punch to one of their tabulating machines. That allowed Northrop's users to print the results of a calculation on paper instead of punching them on cards. With a slight further modification and the addition of a small box that stored numbers in banks of relays, the machine could use punched cards run through the tabulator to control the sequences carried out by the multiplier.

    Logically, the arrangement was no different from an ordinary punched card installation, except that a set of cables and control boxes replaced the person whose job had been to carry decks of cards from one machine to the next. One of the Northrop engineers recalled years later that they rigged up the arrangement because they were running a problem whose next step depended on the results of the previous step. What this meant was that the normal decks of cards that ran through a machine were reduced to "a batch of one [card], which was awkward." In other words, with cables connecting the machines, the installation became one that executed instructions sequentially and was programmable in a more flexible way than plugging cables.

    IBM later marketed a version of this ensemble as the Card-Programmed Calculator (CPC). Perhaps several hundred in all were installed between 1948 and the mid-1950s — far fewer than the thousands of tabulators, punches, and other equipment installed in the traditional way. But even that was many times greater than the number of electronic computer installations worldwide until about 1954. For engineering-oriented companies like Northrop, the CPC filled a pressing need that could not wait for the problems associated with marketing stored-program computers to be resolved.

    The Aberdeen calculators and the 604 were transitional machines, between calculators, tabulators, and genuine computers like the UNIVAC. The CPC carried the punched-card approach too far to be of value to computer designers. By the time of its introduction, it was already clear that the design used by the UNIVAC, in which both the instructions and the data were stored in an internal memory device, was superior. The Card-Programmed Calculator's combination of program cards, plugboards, and interconnecting cables was like the epicycles of a late iteration of Ptolemaic cosmology, while the Copernican system was already gaining acceptance. Customers needing to solve difficult engineering problems, however, accepted it. It cost less than the computers then being offered, and it was available. Other southern California aerospace firms besides Northrop carefully evaluated the Card-Programmed Calculator against vendors' claims for electronic computers. Nearly all of them installed at least one CPC.


The Stored-Program Principle


No one who saw a UNIVAC failed to see how much it differed from existing calculators and punched card equipment. It used vacuum tubes — thousands of them. It stored data on tape, not cards. It was a large and expensive system, not a collection of different devices. The biggest difference was its internal design, not visible to the casual observer. The UNIVAC was a "stored program" computer, one of the first. More than anything else, that made it different from the machines it was designed to replace.

    The origins of the notion of storing a computer's programs internally are clouded in war-time secrecy. The notion arose as Eckert, Mauchly, and others were rushing to finish the ENIAC to assist the U.S. Army, which was engaged in a ground war in Europe and North Africa. It arose because the ENIAC's creators recognized that while the ENIAC was probably going to work, it was going to be a difficult machine to operate.

    Applying the modern term "to program" to a computer probably originated with the ENIAC team at the Moore School. More often, though, they used the phrase "set up" to describe configuring the ENIAC to solve different problems. Setting up the ENIAC meant plugging and unplugging a maze of cables and setting arrays of switches. In effect, the machine had to be rebuilt for each new problem it was to solve. When completed in late 1945, the ENIAC operated much faster than any other machine before it. But while it could solve a complex mathematical problem in seconds, it might take days to set up the machine properly to do that.

    It was in the midst of building this machine that its creators conceived of an alternative. It was too late to incorporate that insight into the ENIAC, but it did form the basis for a proposed follow-on machine called the "EDVAC" (Electronic Discrete Variable Computer). In a description written in September of 1945, Eckert and Mauchly stated the concept succinctly: "An important feature of this device was that operating instructions and function tables would be stored exactly in the same sort of memory device as that used for numbers." Six months later, Eckert and Mauchly left the Moore School, and work on the EDVAC was turned over to others (which was mainly why it took five more years to finish building it). The concept of storing both instructions and data in a common storage unit would become a basic feature of the UNIVAC and nearly every computer that followed.

    The stored-program principle was a key to the UNIVAC's success. It allowed Eckert and Mauchly, first of all, to build a computer that had much more general capabilities than the ENIAC, yet required fewer vacuum tubes. It led to the establishment of "programming" (later "software") as something both separate from and as important as hardware design. The basics of this design remained remarkably stable during the evolution of computing from 1945 to 1995. Only toward the end of this period do we encounter significant deviations from it, in the form of "massively parallel" processors or "non-von Neumann" architectures.


John von Neumann's Role

Although Eckert and Mauchly had realized as early as 1944 that computers would need to store information, the "First Draft of a Report on the EDVAC," by John von Neumann, dated June 30, 1945, is often cited as the founding document of modern computing. From it, and a series of reports co-authored by von Neumann a few years later, comes the term "von Neumann Architecture" to describe such a design. According to Herman Goldstine, an army officer assigned to the ENIAC project, John von Neumann (1903-1957) learned of the ENIAC from a chance meeting with Goldstine in the summer of 1944 at the Aberdeen, Maryland, railroad station. Despite his involvement in many other projects, including the design of the atomic bomb, von Neumann was sufficiently intrigued by what was going on at the Moore School to have himself introduced to Eckert and Mauchly and brought onto the project.

    Eckert and Mauchly were at that time busy thinking of ways to speed up the process of setting up a computer. One possibility was to use perforated paper tape to feed instructions, as several relay machines of the 1940s did, but this was too slow for the high speeds of the ENIAC's calculating circuits. So were the decks of cards used by the Card-Programmed Calculator. In Mauchly's words, "calculations can be performed at high speed only if instructions are supplied at high speed."

    In the midst of the ENIAC's construction in 1944, Eckert wrote a "Disclosure of a Magnetic Calculating Machine," in which he described the use of "[d]iscs or drums which have at least their outer edge made of a magnetic alloy" on which numbers can be stored. Although it focused on ways of designing a machine that was "speedier, simpler as well as providing features of utility, ruggedness and ease of repair," the disclosure did not articulate the design concepts that later would become known as the stored-program principle. Von Neumann's 1945 Report on the EDVAC went farther — it described a machine in terms of its logical structure rather than its hardware construction. The memorandum that Eckert and Mauchly submitted in September 1945 stated the principle plainly: instructions and numerical data would be stored "in exactly the same sort of memory device."

    From the above sequence of reports and memorandums it appears that Eckert and Mauchly had conceived of something like a stored-program principle by 1944, but that it was von Neumann who clarified it and stated it in a form that gave it great force. Von Neumann's international reputation as a mathematician also gave the idea more clout than it might have had coming solely from Eckert and Mauchly, neither of whom was well known outside the Moore School. Although the term "von Neumann Architecture" is too entrenched to be supplanted, Eckert and Mauchly, who demonstrated such a deep understanding of the nature of electronic computing from an engineering perspective, deserve equal credit.

    In the summer of 1946, the Moore School and the U.S. military cosponsored a course on the "Theory and Techniques for Design of Electronic Digital Computers." The course was a recognition of the school's inability to accommodate the numerous requests for information following the public unveiling of the ENIAC. That series of course lectures and the mimeographed reports that appeared a year or two later firmly established the Moore School's approach to computer design. Machines soon appeared that were based on that concept. An experimental computer at the University of Manchester, England, was running test programs by mid-1948. Maurice Wilkes, of Cambridge University, implemented the idea in his EDSAC, operational in the spring of 1949. Eckert and Mauchly completed the BINAC later that year. And of course the UNIVAC would also employ it. Others would continue to propose and build electronic computers of alternate designs, but after the summer of 1946, computing's path, in theory at least, was clear.


The von Neumann Architecture and Its Significance


Before providing a description of the UNIVAC, it is worth taking a brief look at the essentials of the architecture that von Neumann described in his 1945 report, especially those aspects of it that have remained stable through the past half-century of computer design.

    Aside from the internal storage of programs, a major characteristic of a von Neumann computer is that the units that process information are separate from those that store it. Typically there is only a single channel between these two units, through which all transfers of information must go (the so-called von Neumann Bottleneck, about which more later). This feature arose primarily for engineering reasons: it was easier to design storage cells that did not also have to perform arithmetic on their contents.

    The main characteristic is that instructions and data are stored in the same memory device, from which any datum can be retrieved as quickly as any other. This concept arose from considering that the processing unit of a computer should not have to sit idle awaiting delivery of the next instruction. Besides that, the ratio of instructions to data usually varies for each problem, so it would not make sense to dedicate separate, expensive storage devices to each. This design implies that one may treat a coded instruction as a piece of data and perform an operation on it, thus changing it into another instruction, but that was not fully understood at first. To give a sense of how this was first implemented, the UNIVAC main store could hold up to 1,000 "words", which could either be numbers (11 digits plus sign), characters (12 characters per word), or instructions (6 characters per instruction; 2 in each word).
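
The storage figures just quoted imply the following capacities. This is only a back-of-the-envelope sketch: the constant names are mine, while the numbers come from the text above.

```python
# UNIVAC I main store, using the figures quoted above.
WORDS = 1000            # words of main storage
CHARS_PER_WORD = 12     # one word holds 12 characters...
CHARS_PER_INSTR = 6     # ...or two 6-character instructions
DIGITS_PER_NUMBER = 11  # or one 11-digit number plus sign

instrs_per_word = CHARS_PER_WORD // CHARS_PER_INSTR   # 2 instructions per word

# A fully loaded store could therefore hold at most:
max_characters = WORDS * CHARS_PER_WORD       # 12,000 characters
max_instructions = WORDS * instrs_per_word    # 2,000 instructions
max_numbers = WORDS                           # 1,000 signed 11-digit numbers

print(max_characters, max_instructions, max_numbers)
```

The point of the shared store is that any mix of these is possible: the machine does not dedicate separate, fixed-size devices to instructions and to data.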

    Finally, the basic cycle of a von Neumann computer is to transfer an instruction from the store to the processor, decode that instruction, and execute it, using data retrieved from that same store or already present in the processor. Once the processor executed an instruction, it fetched, decoded, and executed another, from the very next position in memory unless directed elsewhere. Having a fast storage device meant that the processor could branch to another stream of instructions quickly whenever necessary. Except when explicit branch instructions are encountered, the flow through the instructions stored in memory was sequential and linear. This concept, of fetching and then executing a linear stream of instructions, is the most lasting of all; even computer designs that purport to be non-von Neumann typically retain the fetch-decode-execute heartbeat of a single-processor machine. As Alan Perlis once remarked, "Sometimes I think the only universal in the computing field is the fetch-execute cycle." The UNIVAC could perform this sequence and add two numbers in about half a millisecond.
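
The cycle just described can be sketched as a toy single-accumulator machine in which program and data share one store. The opcode names, the dictionary-as-memory encoding, and the `run` function are illustrative inventions, not UNIVAC's actual instruction set:

```python
def run(memory, pc=0):
    """Repeat the fetch-decode-execute cycle until a HALT instruction."""
    acc = 0  # single accumulator register in the processor
    while True:
        op, arg = memory[pc]   # fetch the next instruction from the store
        pc += 1                # by default, flow is sequential and linear
        if op == "LOAD":       # decode, then execute
            acc = memory[arg]
        elif op == "ADD":
            acc += memory[arg]
        elif op == "STORE":
            memory[arg] = acc
        elif op == "JUMP":     # an explicit branch redirects the flow
            pc = arg
        elif op == "HALT":
            return memory

# Program (cells 0-3) and data (cells 8-10) live in the same store.
mem = {0: ("LOAD", 8), 1: ("ADD", 9), 2: ("STORE", 10), 3: ("HALT", None),
       8: 2, 9: 3, 10: 0}
run(mem)   # afterwards mem[10] holds the sum of cells 8 and 9
```

Because instructions sit in the same memory as data, nothing in this sketch prevents a program from overwriting cell 0 through cell 3 — the instruction-as-data property noted above.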

    Since 1990, computer systems with parallel processing structures have become more common, and genuine alternatives to the fetch-execute cycle have been accepted in a few limited markets. Elsewhere the von Neumann architecture, though much modified, prevails. The emergence of practical parallel designs reveals, however, the unifying effect of the von Neumann model as it influenced the computer design of the past five decades.

(Continues...)

Table of Contents

Dedication
Preface to the Second Edition
Acknowledgments
Introduction: Defining "Computer"
1. The Advent of Commercial Computing, 1945-1956
2. Computing Comes of Age, 1956-1964
3. The Early History of Software, 1952-1968
4. From Mainframe to Minicomputer, 1959-1969
5. The "Go-Go" Years and the System/360, 1961-1975
6. The Chip and Its Impact, 1965-1975
7. The Personal Computer, 1972-1977
8. Augmenting Human Intellect, 1975-1985
9. Workstations, UNIX, and the Net, 1981-1995
10. "Internet Time," 1995-2001
Conclusion: The Digitization of the World Picture
Notes
Bibliography
Index

What People are Saying About This

Thomas P. Hughes, author of Rescuing Prometheus and American Genesis

Paul E. Ceruzzi explores the mostly unmapped history of computing since 1945. Readers seeking to understand a half century of turbulent and complex history will find him an informed and thoughtful guide. A path-breaking book.

G. Pascal Zachary, author of Endless Frontier: Vannevar Bush, Engineer of the American Century

Paul E. Ceruzzi is America's finest historian of computing, and his masterful retelling of the past fifty years of these machines is an instant classic. A distinguished scholar with a poet's sense of deeper questions, Ceruzzi has given us a balanced, comprehensive, clear, and incisive account of the digital computer from its origins in World War II to its place in the World Wide Web of the 1990s. Brilliantly, he elevates the computer beyond the arcane world of gadgetry and places it squarely in the center of American life, which is where it belongs.

