Turing's Cathedral: The Origins of the Digital Universe

by George Dyson

Narrated by Arthur Morey

Unabridged — 15 hours, 44 minutes

Audiobook (Digital)

$24.00

Overview

“It is possible to invent a single machine which can be used to compute any computable sequence,” twenty-four-year-old Alan Turing announced in 1936. In Turing's Cathedral, George Dyson focuses on a small group of men and women, led by John von Neumann at the Institute for Advanced Study in Princeton, New Jersey, who built one of the first computers to realize Alan Turing's vision of a Universal Machine. Their work would break the distinction between numbers that mean things and numbers that do things, and our universe would never be the same.

Using five kilobytes of memory (the amount allocated to displaying the cursor on a computer desktop of today), they achieved unprecedented success in both weather prediction and nuclear weapons design, while tackling, in their spare time, problems ranging from the evolution of viruses to the evolution of stars.

Dyson's account, both historic and prophetic, sheds important new light on how the digital universe exploded in the aftermath of World War II. The proliferation of both codes and machines was paralleled by two historic developments: the decoding of self-replicating sequences in biology and the invention of the hydrogen bomb. It's no coincidence that the most destructive and the most constructive of human inventions appeared at exactly the same time.

How did code take over the world? In retracing how Alan Turing's one-dimensional model became John von Neumann's two-dimensional implementation, Turing's Cathedral offers a series of provocative suggestions as to where the digital universe, now fully three-dimensional, may be heading next.


Editorial Reviews

William Poundstone

…a groundbreaking history of the Princeton computer. Though the English mathematician Alan Turing gets title billing, Dyson's true protagonist is the Hungarian-American John von Neumann, presented here as the Steve Jobs of early computers—a man who invented almost nothing, yet whose vision changed the world…Turing's Cathedral, incorporating original research and reporting…is an expansive narrative wherein every character, place and idea rates a digression…The book brims with unexpected detail.
—The New York Times Book Review

Publishers Weekly

An overstuffed meditation on all things digital sprouts from this engrossing study of how engineers at Princeton’s Institute for Advanced Study, under charismatic mathematician John von Neumann (the book should really be titled Von Neumann’s Cathedral), built a pioneering computer (called MANIAC) in the years after WWII. To readers used to thinking of computers as magical black boxes, historian Dyson (Darwin Among the Machines) gives an arresting view of old-school mechanics hammering the first ones together from vacuum tubes, bicycle wheels, and punch-cards. Unfortunately, his account of technological innovations is too sketchy for laypeople to quite follow. The narrative frames a meandering tour of the breakthroughs enabled by early computers, from hydrogen bombs to weather forecasting, and grandiose musings on the digital worldview of MANIAC’s creators, in which the author loosely connects the Internet, DNA, and the possibility of extraterrestrial invasion via interstellar radio signals. Dyson’s portrait of the subculture of Von Neumann and other European émigré scientists who midwifed America’s postwar technological order is lively and piquant. But the book bites off more science than it can chew, and its expositions of hard-to-digest concepts from Gödel’s theorem to the Turing machine are too hasty and undeveloped to sink in. (Mar.)

From the Publisher

“The best book I’ve read on the origins of the computer. . . not only learned, but brilliantly and surprisingly idiosyncratic and strange.”
The Boston Globe
 
“A groundbreaking history . . . the book brims with unexpected detail.”
The New York Times Book Review
 
“A technical, philosophical and sometimes personal account . . . wide-ranging and lyrical.”
The Economist
 
“The story of the [von Neumann] computer project and how it begat today’s digital universe has been told before, but no one has told it with such precision and narrative sweep.”
The New York Review of Books

“A fascinating combination of the technical and human stories behind the computing breakthroughs of the 1940s and ‘50s. . . . An important work.”
The Philadelphia Inquirer
 
“Vivid. . . . [A] detailed yet readable chronicle of the birth of modern computing. . . . Dyson’s book is one small step toward reminding us that behind all the touch screens, artificial intelligences and cerebellum implants lies not sorcery but a machine from the middle of New Jersey.”
The Oregonian
 
“Well-told. . . . Dyson tells his story as a sort of intellectual caper film. He gathers his cast of characters . . . and tracks their journey to Princeton. When they converge, it’s great fun, despite postwar food rationing and housing shortages. . . . Dyson is rightly as concerned with the machine’s inventors as with the technology itself.”
The Wall Street Journal
 
“Charming. . . . Creation stories are always worth telling, especially when they center on the birth of world-changing powers. . . . Dyson creatively recounts the curious Faustian bargain that permitted mathematicians to experiment with building more powerful computers, which in turn helped others build more destructive bombs.”
San Francisco Chronicle
 
“The story of the invention of computers has been told many times, from many different points of view, but seldom as authoritatively and with as much detail as George Dyson has done. . . . Turing’s Cathedral will enthrall computer enthusiasts. . . . Employing letters, memoirs, oral histories and personal interviews, Dyson organizes his book around the personalities of the men (and occasional woman) behind the computer, and does a splendid job in bringing them to life.”
The Seattle Times
 
 “A powerful story of the ethical dimension of scientific research, a story whose lessons apply as much today in an era of expanded military R&D as they did in the ENIAC and MANIAC era . . . Dyson closes the book with three absolutely, hair-on-neck-standing-up inspiring chapters on the present and future, a bracing reminder of the distance we have come on some of the paths envisioned by von Neumann, Turing, et al.”
—Cory Doctorow, Boing Boing
 
“No other book about the beginnings of the digital age . . . makes the connections this one does between the lessons of the computer’s origin and the possible paths of its future.”
The Guardian
 
“If you want to be mentally prepared for the next revolution in computing, Dyson’s book is a must read. But it is also a must read if you just want a ripping yarn about the way real scientists (at least, some real scientists) work and think.”
Literary Review
 
“More than just a great book about science. It’s a great book, period.”
The Globe and Mail

Library Journal

In the 1940s and 1950s, scientists at the Institute for Advanced Study in Princeton, NJ, worked to realize Alan Turing's dream of a universal machine, which led to computers, digital television, modern genetics, and more. Because their work was funded by the government, which therefore expected to benefit from the results, it also led to the creation of the hydrogen bomb. Distinguished science writer Dyson is the son of renowned physicist Freeman Dyson, who worked at the institute in the 1950s, so you can expect an insightful book. With an eight-city tour.

MAY 2012 - AudioFile

It takes a long time to get to Alan Turing's 1936 paper "On Computable Numbers." Even George Washington fits into George Dyson's time line to the digital era. The many stops along the way include DNA research, the founding of Princeton's Math Department, and the development of the hydrogen bomb. Further details include a love story set against a WWII backdrop and action at the gaming tables. For the most part, Arthur Morey's narration maintains listener interest as all the threads are tied together. While Morey’s reading occasionally creates excitement, it is capable in a low-key way that doesn't draw attention to itself. While it might not cross over for a general audience, listeners interested in the digital revolution will find the journey fascinating. J.A.S. © AudioFile 2012, Portland, Maine

Kirkus Reviews

That we live in a digital universe is indisputable; how we got there is a mesmerizing tale brilliantly told by science historian Dyson (Project Orion: The Atomic Spaceship 1957–1965, 2002, etc.). The author establishes late 1945 as the birth date of the first stored-program machine, built at the Institute for Advanced Study, established in Princeton in 1932 as a haven for theoreticians. It happened under the watch of the brilliant mathematician John von Neumann, fresh from commutes to Los Alamos, where the atom bomb had been built and the hydrogen bomb was only a gleam in Edward Teller's eye. Dyson makes clear that the motivation for some of the world's greatest technological advances has always been to perfect instruments of war. Indeed, von Neumann's colleagues included some who had been at Aberdeen Proving Grounds, where a dedicated-purpose computer, ENIAC, had been built to calculate firing tables for antiaircraft artillery. The IAS computer, MANIAC, was used to determine the parameters governing the fission of an atomic device inside an H-bomb that would then ignite the fusion reaction. But for von Neumann and others, the MANIAC was also the embodiment of Alan Turing's universal machine, an abstract invention in the '30s by the mathematician who would go on to crack the Nazis' infamous Enigma code in World War II. In addition to these stories, Dyson discusses climate and genetic-modeling projects programmed on the MANIAC. The use of wonderful quotes and pithy sketches of the brilliant cast of characters further enriches the text. Who knew that eccentric mathematician-logician Kurt Gödel had married a Viennese cabaret dancer? Meticulously researched and packed with not just technological details, but sociopolitical and cultural details as well--the definitive history of the computer.

Product Details

BN ID: 2940169465785
Publisher: Penguin Random House
Publication date: 03/06/2012
Edition description: Unabridged
Sales rank: 1,199,215

Read an Excerpt

Preface
 
POINT SOURCE SOLUTION
 
I am thinking about something much more important than bombs. I am thinking about computers.
—John von Neumann, 1946
 
 
There are two kinds of creation myths: those where life arises out of the mud, and those where life falls from the sky. In this creation myth, computers arose from the mud, and code fell from the sky.
 
In late 1945, at the Institute for Advanced Study in Princeton, New Jersey, Hungarian American mathematician John von Neumann gathered a small group of engineers to begin designing, building, and programming an electronic digital computer, with five kilobytes of storage, whose attention could be switched in 24 microseconds from one memory location to the next. The entire digital universe can be traced directly to this 32-by-32-by-40-bit nucleus: less memory than is allocated to displaying a single icon on a computer screen today.
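[Editorial aside, not part of Dyson's text: the five-kilobyte figure follows directly from the dimensions he gives, since the storage amounted to 1,024 words (32 by 32 addresses) of 40 bits each:]

\[
32 \times 32 \times 40 = 40{,}960 \ \text{bits} = 5{,}120 \ \text{bytes} = 5 \ \text{kilobytes}
\]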
 
Von Neumann’s project was the physical realization of Alan Turing’s Universal Machine, a theoretical construct invented in 1936. It was not the first computer. It was not even the second or third computer. It was, however, among the first computers to make full use of a high-speed random-access storage matrix, and became the machine whose coding was most widely replicated and whose logical architecture was most widely reproduced. The stored-program computer, as conceived by Alan Turing and delivered by John von Neumann, broke the distinction between numbers that mean things and numbers that do things. Our universe would never be the same.
 
Working outside the bounds of industry, breaking the rules of academia, and relying largely on the U.S. government for support, a dozen engineers in their twenties and thirties designed and built von Neumann’s computer for less than $1 million in under five years. “He was in the right place at the right time with the right connections with the right idea,” remembers Willis Ware, fourth to be hired to join the engineering team, “setting aside the hassle that will probably never be resolved as to whose ideas they really were.”
 
As World War II drew to a close, the scientists who had built the atomic bomb at Los Alamos wondered, “What’s next?” Some, including Richard Feynman, vowed never to have anything to do with nuclear weapons or military secrecy again. Others, including Edward Teller and John von Neumann, were eager to develop more advanced nuclear weapons, especially the “Super,” or hydrogen bomb. Just before dawn on the morning of July 16, 1945, the New Mexico desert was illuminated by an explosion “brighter than a thousand suns.” Eight and a half years later, an explosion one thousand times more powerful illuminated the skies over Bikini Atoll. The race to build the hydrogen bomb was accelerated by von Neumann’s desire to build a computer, and the push to build von Neumann’s computer was accelerated by the race to build a hydrogen bomb.
 
Computers were essential to the initiation of nuclear explosions, and to understanding what happens next. In “Point Source Solution,” a 1947 Los Alamos report on the shock waves produced by nuclear explosions, von Neumann explained that “for very violent explosions . . . it may be justified to treat the original, central, high pressure area as a point.” This approximated the physical reality of a nuclear explosion closely enough to enable some of the first useful predictions of weapons effects.
 
Numerical simulation of chain reactions within computers initiated a chain reaction among computers, with machines and codes proliferating as explosively as the phenomena they were designed to help us understand. It is no coincidence that the most destructive and the most constructive of human inventions appeared at exactly the same time. Only the collective intelligence of computers could save us from the destructive powers of the weapons they had allowed us to invent.
 
Turing’s model of universal computation was one-dimensional: a string of symbols encoded on a tape. Von Neumann’s implementation of Turing’s model was two-dimensional: the address matrix underlying all computers in use today. The landscape is now three-dimensional, yet the entire Internet can still be viewed as a common tape shared by a multitude of Turing’s Universal Machines.
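[Editorial aside, not part of Dyson's text: a minimal Python sketch of the contrast drawn above. The variable names are illustrative only; the point is that a one-dimensional tape is reached by stepping square by square, while a two-dimensional address matrix of 40-bit words is reached in a single step.]

# Turing's one-dimensional model: a tape of symbols, reached only by
# moving the head left or right, one square at a time.
tape = list("0110100110010110")
head = 0
for _ in range(5):            # five separate moves to reach square 5
    head += 1
symbol = tape[head]

# Von Neumann's two-dimensional implementation: an address matrix of
# 1,024 forty-bit words (32 by 32), any one of them addressable directly.
memory = [[0] * 32 for _ in range(32)]
memory[17][5] = (1 << 40) - 1  # write a full 40-bit word in one random access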
 
Where does time fit in? Time in the digital universe and time in our universe are governed by entirely different clocks. In our universe, time is a continuum. In a digital universe, time (T) is a countable number of discrete, sequential steps. A digital universe is bounded at the beginning, when T = 0, and at the end, if T comes to a stop. Even in a perfectly deterministic universe, there is no consistent method to predict the ending in advance. To an observer in our universe, the digital universe appears to be speeding up. To an observer in the digital universe, our universe appears to be slowing down.
 
Universal codes and universal machines, introduced by Alan Turing in his “On Computable Numbers, with an Application to the Entscheidungsproblem” of 1936, have prospered to such an extent that Turing’s underlying interest in the “decision problem” is easily overlooked. In answering the Entscheidungsproblem, Turing proved that there is no systematic way to tell, by looking at a code, what that code will do. That’s what makes the digital universe so interesting, and that’s what brings us here.
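[Editorial aside, not part of Dyson's text: a compact sketch of Turing's argument in Python, with hypothetical function names. Suppose a procedure halts(program, data) could always decide whether running program on data ever stops; the program below would then defeat it.]

def halts(program, data):
    # Hypothetical oracle: assumed, for the sake of contradiction, to return
    # True if program(data) eventually stops and False if it runs forever.
    raise NotImplementedError("Turing showed no such procedure can exist")

def paradox(program):
    # Do the opposite of whatever the oracle predicts about a program
    # applied to its own text.
    if halts(program, program):
        while True:           # predicted to halt, so loop forever
            pass
    else:
        return                # predicted to run forever, so halt at once

# Asking halts(paradox, paradox) yields a contradiction either way, so no
# general halts() can be written: there is no systematic way to tell, by
# looking at a code, what that code will do.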
 
It is impossible to predict where the digital universe is going, but it is possible to understand how it began. The origin of the first fully electronic random-access storage matrix, and the propagation of the codes that it engendered, is as close to a point source as any approximation can get.
