Chaos: Making a New Science

The “highly entertaining” New York Times bestseller, which explains chaos theory and the butterfly effect, from the author of The Information (Chicago Tribune).

For centuries, scientific thought was focused on bringing order to the natural world. But even as relativity and quantum mechanics undermined that rigid certainty in the first half of the twentieth century, the scientific community clung to the idea that any system, no matter how complex, could be reduced to a simple pattern. In the 1960s, a small group of radical thinkers began to take that notion apart, placing new importance on the tiny experimental irregularities that scientists had long learned to ignore. Minuscule differences in data, they said, would eventually produce massive ones—and complex systems like the weather, economics, and human behavior suddenly became clearer and more beautiful than they had ever been before.

In this seminal work of scientific writing, James Gleick lays out a cutting-edge field of science with enough grace and precision that any reader will be able to grasp the science behind the beautiful complexity of the world around us. With more than a million copies sold, Chaos is “a groundbreaking book about what seems to be the future of physics” by a writer who has been a finalist for both the Pulitzer Prize and the National Book Award, the author of Time Travel: A History and Genius: The Life and Science of Richard Feynman (Publishers Weekly).
by James Gleick

Product Details

ISBN-13: 9781453210475
Publisher: Open Road Media
Publication date: 03/22/2011
Sold by: Barnes & Noble
Format: eBook
Pages: 360
Sales rank: 442,345
Lexile: 1160L
File size: 8 MB

About the Author

Born in New York City in 1954, James Gleick is one of the nation’s preeminent science writers. Upon graduating from Harvard in 1976, he founded Metropolis, a weekly Minneapolis newspaper, and spent the next decade working at the New York Times. Gleick’s prominent works include Genius: The Life and Science of Richard Feynman, Isaac Newton, and Chaos: Making a New Science, all of which were shortlisted for the Pulitzer Prize. His latest book, The Information: A History, a Theory, a Flood, was published in March 2011. He lives and works in New York.

Read an Excerpt

CHAOS

MAKING A NEW SCIENCE


By James Gleick

OPEN ROAD INTEGRATED MEDIA

Copyright © 2008 James Gleick
All rights reserved.
ISBN: 978-1-4532-1047-5



CHAPTER 1

THE BUTTERFLY EFFECT

Physicists like to think that all you have to do is say, these are the conditions, now what happens next?

—RICHARD P. FEYNMAN


The sun beat down through a sky that had never seen clouds. The winds swept across an earth as smooth as glass. Night never came, and autumn never gave way to winter. It never rained. The simulated weather in Edward Lorenz's new electronic computer changed slowly but certainly, drifting through a permanent dry midday midseason, as if the world had turned into Camelot, or some particularly bland version of southern California.

Outside his window Lorenz could watch real weather, the early-morning fog creeping along the Massachusetts Institute of Technology campus or the low clouds slipping over the rooftops from the Atlantic. Fog and clouds never arose in the model running on his computer. The machine, a Royal McBee, was a thicket of wiring and vacuum tubes that occupied an ungainly portion of Lorenz's office, made a surprising and irritating noise, and broke down every week or so. It had neither the speed nor the memory to manage a realistic simulation of the earth's atmosphere and oceans. Yet Lorenz created a toy weather in 1960 that succeeded in mesmerizing his colleagues. Every minute the machine marked the passing of a day by printing a row of numbers across a page. If you knew how to read the printouts, you would see a prevailing westerly wind swing now to the north, now to the south, now back to the north. Digitized cyclones spun slowly around an idealized globe. As word spread through the department, the other meteorologists would gather around with the graduate students, making bets on what Lorenz's weather would do next. Somehow, nothing ever happened the same way twice.

Lorenz enjoyed weather—by no means a prerequisite for a research meteorologist. He savored its changeability. He appreciated the patterns that come and go in the atmosphere, families of eddies and cyclones, always obeying mathematical rules, yet never repeating themselves. When he looked at clouds, he thought he saw a kind of structure in them. Once he had feared that studying the science of weather would be like prying a jack-in-the-box apart with a screwdriver. Now he wondered whether science would be able to penetrate the magic at all. Weather had a flavor that could not be expressed by talking about averages. The daily high temperature in Cambridge, Massachusetts, averages 75 degrees in June. The number of rainy days in Riyadh, Saudi Arabia, averages ten a year. Those were statistics. The essence was the way patterns in the atmosphere changed over time, and that was what Lorenz captured on the Royal McBee.

He was the god of this machine universe, free to choose the laws of nature as he pleased. After a certain amount of undivine trial and error, he chose twelve. They were numerical rules—equations that expressed the relationships between temperature and pressure, between pressure and wind speed. Lorenz understood that he was putting into practice the laws of Newton, appropriate tools for a clockmaker deity who could create a world and set it running for eternity. Thanks to the determinism of physical law, further intervention would then be unnecessary. Those who made such models took for granted that, from present to future, the laws of motion provide a bridge of mathematical certainty. Understand the laws and you understand the universe. That was the philosophy behind modeling weather on a computer.

Indeed, if the eighteenth-century philosophers imagined their creator as a benevolent noninterventionist, content to remain behind the scenes, they might have imagined someone like Lorenz. He was an odd sort of meteorologist. He had the worn face of a Yankee farmer, with surprising bright eyes that made him seem to be laughing whether he was or not. He seldom spoke about himself or his work, but he listened. He often lost himself in a realm of calculation or dreaming that his colleagues found inaccessible. His closest friends felt that Lorenz spent a good deal of his time off in a remote outer space.

As a boy he had been a weather bug, at least to the extent of keeping close tabs on the max-min thermometer recording the days' highs and lows outside his parents' house in West Hartford, Connecticut. But he spent more time inside playing with mathematical puzzle books than watching the thermometer. Sometimes he and his father would work out puzzles together. Once they came upon a particularly difficult problem that turned out to be insoluble. That was acceptable, his father told him: you can always try to solve a problem by proving that no solution exists. Lorenz liked that, as he always liked the purity of mathematics, and when he graduated from Dartmouth College, in 1938, he thought that mathematics was his calling. Circumstance interfered, however, in the form of World War II, which put him to work as a weather forecaster for the Army Air Corps. After the war Lorenz decided to stay with meteorology, investigating the theory of it, pushing the mathematics a little further forward. He made a name for himself by publishing work on orthodox problems, such as the general circulation of the atmosphere. And in the meantime he continued to think about forecasting.

To most serious meteorologists, forecasting was less than science. It was a seat-of-the-pants business performed by technicians who needed some intuitive ability to read the next day's weather in the instruments and the clouds. It was guesswork. At centers like M.I.T., meteorology favored problems that had solutions. Lorenz understood the messiness of weather prediction as well as anyone, having tried it firsthand for the benefit of military pilots, but he harbored an interest in the problem—a mathematical interest.

Not only did meteorologists scorn forecasting, but in the 1960s virtually all serious scientists mistrusted computers. These souped-up calculators hardly seemed like tools for theoretical science. So numerical weather modeling was something of a bastard problem. Yet the time was right for it. Weather forecasting had been waiting two centuries for a machine that could repeat thousands of calculations over and over again by brute force. Only a computer could cash in the Newtonian promise that the world unfolded along a deterministic path, rule-bound like the planets, predictable like eclipses and tides. In theory a computer could let meteorologists do what astronomers had been able to do with pencil and slide rule: reckon the future of their universe from its initial conditions and the physical laws that guide its evolution. The equations describing the motion of air and water were as well known as those describing the motion of planets. Astronomers did not achieve perfection and never would, not in a solar system tugged by the gravities of nine planets, scores of moons and thousands of asteroids, but calculations of planetary motion were so accurate that people forgot they were forecasts. When an astronomer said, "Comet Halley will be back this way in seventy-six years," it seemed like fact, not prophecy. Deterministic numerical forecasting figured accurate courses for spacecraft and missiles. Why not winds and clouds?

Weather was vastly more complicated, but it was governed by the same laws. Perhaps a powerful enough computer could be the supreme intelligence imagined by Laplace, the eighteenth-century philosopher-mathematician who caught the Newtonian fever like no one else: "Such an intelligence," Laplace wrote, "would embrace in the same formula the movements of the greatest bodies of the universe and those of the lightest atom; for it, nothing would be uncertain and the future, as the past, would be present to its eyes." In these days of Einstein's relativity and Heisenberg's uncertainty, Laplace seems almost buffoon-like in his optimism, but much of modern science has pursued his dream. Implicitly, the mission of many twentieth-century scientists—biologists, neurologists, economists—has been to break their universes down into the simplest atoms that will obey scientific rules. In all these sciences, a kind of Newtonian determinism has been brought to bear. The fathers of modern computing always had Laplace in mind, and the history of computing and the history of forecasting were intermingled ever since John von Neumann designed his first machines at the Institute for Advanced Study in Princeton, New Jersey, in the 1950s. Von Neumann recognized that weather modeling could be an ideal task for a computer.

There was always one small compromise, so small that working scientists usually forgot it was there, lurking in a corner of their philosophies like an unpaid bill. Measurements could never be perfect. Scientists marching under Newton's banner actually waved another flag that said something like this: Given an approximate knowledge of a system's initial conditions and an understanding of natural law, one can calculate the approximate behavior of the system. This assumption lay at the philosophical heart of science. As one theoretician liked to tell his students: "The basic idea of Western science is that you don't have to take into account the falling of a leaf on some planet in another galaxy when you're trying to account for the motion of a billiard ball on a pool table on earth. Very small influences can be neglected. There's a convergence in the way things work, and arbitrarily small influences don't blow up to have arbitrarily large effects." Classically, the belief in approximation and convergence was well justified. It worked. A tiny error in fixing the position of Comet Halley in 1910 would only cause a tiny error in predicting its arrival in 1986, and the error would stay small for millions of years to come. Computers rely on the same assumption in guiding spacecraft: approximately accurate input gives approximately accurate output. Economic forecasters rely on this assumption, though their success is less apparent. So did the pioneers in global weather forecasting.

With his primitive computer, Lorenz had boiled weather down to the barest skeleton. Yet, line by line, the winds and temperatures in Lorenz's printouts seemed to behave in a recognizable earthly way. They matched his cherished intuition about the weather, his sense that it repeated itself, displaying familiar patterns over time, pressure rising and falling, the airstream swinging north and south. He discovered that when a line went from high to low without a bump, a double bump would come next, and he said, "That's the kind of rule a forecaster could use." But the repetitions were never quite exact. There was pattern, with disturbances. An orderly disorder.

To make the patterns plain to see, Lorenz created a primitive kind of graphics. Instead of just printing out the usual lines of digits, he would have the machine print a certain number of blank spaces followed by the letter a. He would pick one variable—perhaps the direction of the airstream. Gradually the a's marched down the roll of paper, swinging back and forth in a wavy line, making a long series of hills and valleys that represented the way the west wind would swing north and south across the continent. The orderliness of it, the recognizable cycles coming around again and again but never twice the same way, had a hypnotic fascination. The system seemed slowly to be revealing its secrets to the forecaster's eye.
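Lorenz's printer graphics can be sketched in a few lines of modern code. This is an illustrative reconstruction only: the sine wave below is a placeholder signal standing in for the airstream variable his weather equations actually produced.

```python
import math

# A sketch of Lorenz's printer "graphics": one variable rendered as a
# column of a's wandering down the roll of paper. The sine wave is a
# placeholder; the real values came from his twelve weather equations.
width = 40
lines = []
for step in range(24):
    value = math.sin(step * 0.4)                 # placeholder signal in [-1, 1]
    column = int((value + 1) / 2 * (width - 1))  # map the value to a print column
    lines.append(" " * column + "a")
print("\n".join(lines))
```

Read down the output and the drifting column of a's traces the same hills and valleys Lorenz watched march down his printout.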

One day in the winter of 1961, wanting to examine one sequence at greater length, Lorenz took a shortcut. Instead of starting the whole run over, he started midway through. To give the machine its initial conditions, he typed the numbers straight from the earlier printout. Then he walked down the hall to get away from the noise and drink a cup of coffee. When he returned an hour later, he saw something unexpected, something that planted a seed for a new science.


This new run should have exactly duplicated the old. Lorenz had copied the numbers into the machine himself. The program had not changed. Yet as he stared at the new printout, Lorenz saw his weather diverging so rapidly from the pattern of the last run that, within just a few months, all resemblance had disappeared. He looked at one set of numbers, then back at the other. He might as well have chosen two random weathers out of a hat. His first thought was that another vacuum tube had gone bad.

Suddenly he realized the truth. There had been no malfunction. The problem lay in the numbers he had typed. In the computer's memory, six decimal places were stored: .506127. On the printout, to save space, just three appeared: .506. Lorenz had entered the shorter, rounded-off numbers, assuming that the difference—one part in a thousand—was inconsequential.

It was a reasonable assumption. If a weather satellite can read ocean-surface temperature to within one part in a thousand, its operators consider themselves lucky. Lorenz's Royal McBee was implementing the classical program. It used a purely deterministic system of equations. Given a particular starting point, the weather would unfold exactly the same way each time. Given a slightly different starting point, the weather should unfold in a slightly different way. A small numerical error was like a small puff of wind—surely the small puffs faded or canceled each other out before they could change important, large-scale features of the weather. Yet in Lorenz's particular system of equations, small errors proved catastrophic.
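The effect of that rounded third decimal place is easy to reproduce with any chaotic map. The sketch below uses the logistic map as a stand-in for Lorenz's twelve-equation model (which the excerpt does not reproduce); only the six- and three-digit starting values mirror his printout.

```python
# Sensitive dependence on initial conditions, sketched with the logistic
# map x -> r*x*(1-x). This is a stand-in for Lorenz's weather model; the
# starting values mirror the stored and printed numbers from his run.
def logistic_orbit(x0, r=4.0, steps=60):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

full = logistic_orbit(0.506127)   # six decimal places, as stored in memory
rounded = logistic_orbit(0.506)   # three decimal places, as printed

# The runs begin about one part in a thousand apart...
print(abs(full[0] - rounded[0]))
# ...yet within a few dozen steps they bear no resemblance to each other.
print(max(abs(a - b) for a, b in zip(full, rounded)))
```

The gap roughly doubles each step, so an error of one part in a thousand reaches order one within a dozen or so iterations, just as Lorenz's two printouts diverged within a few simulated months.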

He decided to look more closely at the way two nearly identical runs of weather flowed apart. He copied one of the wavy lines of output onto a transparency and laid it over the other, to inspect the way it diverged. First, two humps matched detail for detail. Then one line began to lag a hairsbreadth behind. By the time the two runs reached the next hump, they were distinctly out of phase. By the third or fourth hump, all similarity had vanished.

It was only a wobble from a clumsy computer. Lorenz could have assumed something was wrong with his particular machine or his particular model—probably should have assumed. It was not as though he had mixed sodium and chlorine and got gold. But for reasons of mathematical intuition that his colleagues would begin to understand only later, Lorenz felt a jolt: something was philosophically out of joint. The practical import could be staggering. Although his equations were gross parodies of the earth's weather, he had a faith that they captured the essence of the real atmosphere. That first day, he decided that long-range weather forecasting must be doomed.

"We certainly hadn't been successful in doing that anyway and now we had an excuse," he said. "I think one of the reasons people thought it would be possible to forecast so far ahead is that there are real physical phenomena for which one can do an excellent job of forecasting, such as eclipses, where the dynamics of the sun, moon, and earth are fairly complicated, and such as oceanic tides. I never used to think of tide forecasts as prediction at all—I used to think of them as statements of fact—but of course, you are predicting. Tides are actually just as complicated as the atmosphere. Both have periodic components—you can predict that next summer will be warmer than this winter. But with weather we take the attitude that we knew that already. With tides, it's the predictable part that we're interested in, and the unpredictable part is small, unless there's a storm.

"The average person, seeing that we can predict tides pretty well a few months ahead would say, why can't we do the same thing with the atmosphere, it's just a different fluid system, the laws are about as complicated. But I realized that any physical system that behaved nonperiodically would be unpredictable."


The fifties and sixties were years of unreal optimism about weather forecasting. Newspapers and magazines were filled with hope for weather science, not just for prediction but for modification and control. Two technologies were maturing together, the digital computer and the space satellite. An international program was being prepared to take advantage of them, the Global Atmosphere Research Program. There was an idea that human society would free itself from weather's turmoil and become its master instead of its victim. Geodesic domes would cover cornfields. Airplanes would seed the clouds. Scientists would learn how to make rain and how to stop it.

The intellectual father of this popular notion was Von Neumann, who built his first computer with the precise intention, among other things, of controlling the weather. He surrounded himself with meteorologists and gave breathtaking talks about his plans to the general physics community. He had a specific mathematical reason for his optimism. He recognized that a complicated dynamical system could have points of instability—critical points where a small push can have large consequences, as with a ball balanced at the top of a hill. With the computer up and running, Von Neumann imagined that scientists would calculate the equations of fluid motion for the next few days. Then a central committee of meteorologists would send up airplanes to lay down smoke screens or seed clouds to push the weather into the desired mode. But Von Neumann had overlooked the possibility of chaos, with instability at every point.


(Continues...)

Excerpted from CHAOS by James Gleick. Copyright © 2008 James Gleick. Excerpted by permission of OPEN ROAD INTEGRATED MEDIA.
All rights reserved. No part of this excerpt may be reproduced or reprinted without permission in writing from the publisher.

Table of Contents

Prologue
The Butterfly Effect
Revolution
Life's Ups and Downs
A Geometry of Nature
Strange Attractors
Universality
The Experimenter
Images of Chaos
The Dynamical Systems Collective
Inner Rhythms
Chaos and Beyond
Afterword
Notes on Sources and Further Reading
Acknowledgments
Index
