Overview
The million-copy bestseller by National Book Award nominee and Pulitzer Prize finalist James Gleick that reveals the science behind chaos theory
National bestseller
More than a million copies sold
A work of popular science in the tradition of Stephen Hawking and Carl Sagan, this 20th-anniversary edition of James Gleick’s groundbreaking bestseller Chaos introduces a whole new readership to chaos theory, one of the most significant waves of scientific knowledge in our time. From Edward Lorenz’s discovery of the Butterfly Effect, to Mitchell Feigenbaum’s calculation of a universal constant, to Benoit Mandelbrot’s concept of fractals, which created a new geometry of nature, Gleick’s engaging narrative focuses on the key figures whose genius converged to chart an innovative direction for science. In Chaos, Gleick makes the story of chaos theory not only fascinating but also accessible to beginners, and opens our eyes to a surprising new view of the universe.
Editorial Reviews
From the Publisher
“Fascinating . . . almost every paragraph contains a jolt.” —The New York Times
“Taut and exciting . . . a fascinating illustration of how the pattern of science changes.” —The New York Times Book Review
“Highly entertaining . . . a startling look at newly discovered universal laws.” —Chicago Tribune
“An awe-inspiring book. Reading it gave me that sensation that someone had just found the light switch.” —Douglas Adams, author of The Hitchhiker’s Guide to the Galaxy
“Chaos is a feast.” —The Washington Post Book World
Publishers Weekly
Science readers who have gone through relativity theory, quantum physics, Heisenbergian uncertainty, black holes and the world of quarks and virtual particles, only to be stunned by recent Grand Unified Theories (GUTs), will welcome New York Times science writer Gleick's adventurous attempt to describe the revolutionary science of chaos. “Chaos” is what a handful of theorists steeped in math and computer know-how are calling their challengingly abstract new look at nature in terms of nonlinear dynamics. Gleick traces the ideas of these little-known pioneers, including Mitchell Feigenbaum and his Butterfly Effect; Benoit Mandelbrot, whose “fractal” concept led to a new geometry of nature; and Joseph Ford, who countered Einstein with “God plays dice with the universe. But they're loaded dice.” Chaos is deep, even frightening in its holistic embrace of nature as paradoxically complex, wildly disorderly, random and yet stable in its infinite stream of “self-similarities.” A groundbreaking book about what seems to be the future of physics. Illustrations. QPBC alternate. (October 20)
Library Journal
Chaos theory, touted as the third revolution in 20th-century science after relativity and quantum mechanics, uses traditional mathematics to understand complex natural systems with too many variables to study. Philosophically, it counters the Second Law of Thermodynamics by demonstrating the “spontaneous emergence of self-organization.” In this new science apparent disorder is meaningful; the structure of chaos can be mapped by plotting graphically the calculations of nonlinear mathematics using “fractal” geometry, a brainchild of Benoit Mandelbrot in which symmetrical patterns repeat across different scales. With jocular descriptions of eccentric characters such as the “Dynamical Systems Collective,” a.k.a. the Chaos Cabal of the University of California–Santa Cruz, Chaos offers an absorbing look at trailblazers on a new scientific frontier. Laurie Tynan, Montgomery Cty.–Norristown P.L., Pa.
Booknews
Reissue of the 1987 Viking ed. Annotation c. Book News, Inc., Portland, OR (booknews.com)
Product Details
Related Subjects
Meet the Author
James Gleick was born in New York City in 1954. He worked for ten years as an editor and reporter for The New York Times, founded an early Internet portal, the Pipeline, and has written several books of popular science, including The Information: A History, a Theory, a Flood, which won the PEN/E. O. Wilson Literary Science Writing Award. He lives in Key West and New York.
Read an Excerpt
CHAOS
MAKING A NEW SCIENCE
By James Gleick
OPEN ROAD INTEGRATED MEDIA
Copyright © 2008 James Gleick. All rights reserved.
ISBN: 9781453210475
CHAPTER 1
THE BUTTERFLY EFFECT
Physicists like to think that all you have to do is say, these are the conditions, now what happens next?
—RICHARD P. FEYNMAN
The sun beat down through a sky that had never seen clouds. The winds swept across an earth as smooth as glass. Night never came, and autumn never gave way to winter. It never rained. The simulated weather in Edward Lorenz's new electronic computer changed slowly but certainly, drifting through a permanent dry midday midseason, as if the world had turned into Camelot, or some particularly bland version of southern California.
Outside his window Lorenz could watch real weather, the early-morning fog creeping along the Massachusetts Institute of Technology campus or the low clouds slipping over the rooftops from the Atlantic. Fog and clouds never arose in the model running on his computer. The machine, a Royal McBee, was a thicket of wiring and vacuum tubes that occupied an ungainly portion of Lorenz's office, made a surprising and irritating noise, and broke down every week or so. It had neither the speed nor the memory to manage a realistic simulation of the earth's atmosphere and oceans. Yet Lorenz created a toy weather in 1960 that succeeded in mesmerizing his colleagues. Every minute the machine marked the passing of a day by printing a row of numbers across a page. If you knew how to read the printouts, you would see a prevailing westerly wind swing now to the north, now to the south, now back to the north. Digitized cyclones spun slowly around an idealized globe. As word spread through the department, the other meteorologists would gather around with the graduate students, making bets on what Lorenz's weather would do next. Somehow, nothing ever happened the same way twice.
Lorenz enjoyed weather—by no means a prerequisite for a research meteorologist. He savored its changeability. He appreciated the patterns that come and go in the atmosphere, families of eddies and cyclones, always obeying mathematical rules, yet never repeating themselves. When he looked at clouds, he thought he saw a kind of structure in them. Once he had feared that studying the science of weather would be like prying a jack-in-the-box apart with a screwdriver. Now he wondered whether science would be able to penetrate the magic at all. Weather had a flavor that could not be expressed by talking about averages. The daily high temperature in Cambridge, Massachusetts, averages 75 degrees in June. The number of rainy days in Riyadh, Saudi Arabia, averages ten a year. Those were statistics. The essence was the way patterns in the atmosphere changed over time, and that was what Lorenz captured on the Royal McBee.
He was the god of this machine universe, free to choose the laws of nature as he pleased. After a certain amount of undivine trial and error, he chose twelve. They were numerical rules—equations that expressed the relationships between temperature and pressure, between pressure and wind speed. Lorenz understood that he was putting into practice the laws of Newton, appropriate tools for a clockmaker deity who could create a world and set it running for eternity. Thanks to the determinism of physical law, further intervention would then be unnecessary. Those who made such models took for granted that, from present to future, the laws of motion provide a bridge of mathematical certainty. Understand the laws and you understand the universe. That was the philosophy behind modeling weather on a computer.
Indeed, if the eighteenthcentury philosophers imagined their creator as a benevolent noninterventionist, content to remain behind the scenes, they might have imagined someone like Lorenz. He was an odd sort of meteorologist. He had the worn face of a Yankee farmer, with surprising bright eyes that made him seem to be laughing whether he was or not. He seldom spoke about himself or his work, but he listened. He often lost himself in a realm of calculation or dreaming that his colleagues found inaccessible. His closest friends felt that Lorenz spent a good deal of his time off in a remote outer space.
As a boy he had been a weather bug, at least to the extent of keeping close tabs on the max-min thermometer recording the days' highs and lows outside his parents' house in West Hartford, Connecticut. But he spent more time inside playing with mathematical puzzle books than watching the thermometer. Sometimes he and his father would work out puzzles together. Once they came upon a particularly difficult problem that turned out to be insoluble. That was acceptable, his father told him: you can always try to solve a problem by proving that no solution exists. Lorenz liked that, as he always liked the purity of mathematics, and when he graduated from Dartmouth College, in 1938, he thought that mathematics was his calling. Circumstance interfered, however, in the form of World War II, which put him to work as a weather forecaster for the Army Air Corps. After the war Lorenz decided to stay with meteorology, investigating the theory of it, pushing the mathematics a little further forward. He made a name for himself by publishing work on orthodox problems, such as the general circulation of the atmosphere. And in the meantime he continued to think about forecasting.
To most serious meteorologists, forecasting was less than science. It was a seat-of-the-pants business performed by technicians who needed some intuitive ability to read the next day's weather in the instruments and the clouds. It was guesswork. At centers like M.I.T., meteorology favored problems that had solutions. Lorenz understood the messiness of weather prediction as well as anyone, having tried it firsthand for the benefit of military pilots, but he harbored an interest in the problem—a mathematical interest.
Not only did meteorologists scorn forecasting, but in the 1960s virtually all serious scientists mistrusted computers. These souped-up calculators hardly seemed like tools for theoretical science. So numerical weather modeling was something of a bastard problem. Yet the time was right for it. Weather forecasting had been waiting two centuries for a machine that could repeat thousands of calculations over and over again by brute force. Only a computer could cash in the Newtonian promise that the world unfolded along a deterministic path, rule-bound like the planets, predictable like eclipses and tides. In theory a computer could let meteorologists do what astronomers had been able to do with pencil and slide rule: reckon the future of their universe from its initial conditions and the physical laws that guide its evolution. The equations describing the motion of air and water were as well known as those describing the motion of planets. Astronomers did not achieve perfection and never would, not in a solar system tugged by the gravities of nine planets, scores of moons and thousands of asteroids, but calculations of planetary motion were so accurate that people forgot they were forecasts. When an astronomer said, "Comet Halley will be back this way in seventy-six years," it seemed like fact, not prophecy. Deterministic numerical forecasting figured accurate courses for spacecraft and missiles. Why not winds and clouds?
Weather was vastly more complicated, but it was governed by the same laws. Perhaps a powerful enough computer could be the supreme intelligence imagined by Laplace, the eighteenth-century philosopher-mathematician who caught the Newtonian fever like no one else: "Such an intelligence," Laplace wrote, "would embrace in the same formula the movements of the greatest bodies of the universe and those of the lightest atom; for it, nothing would be uncertain and the future, as the past, would be present to its eyes." In these days of Einstein's relativity and Heisenberg's uncertainty, Laplace seems almost buffoonlike in his optimism, but much of modern science has pursued his dream. Implicitly, the mission of many twentieth-century scientists—biologists, neurologists, economists—has been to break their universes down into the simplest atoms that will obey scientific rules. In all these sciences, a kind of Newtonian determinism has been brought to bear. The fathers of modern computing always had Laplace in mind, and the history of computing and the history of forecasting were intermingled ever since John von Neumann designed his first machines at the Institute for Advanced Study in Princeton, New Jersey, in the 1950s. Von Neumann recognized that weather modeling could be an ideal task for a computer.
There was always one small compromise, so small that working scientists usually forgot it was there, lurking in a corner of their philosophies like an unpaid bill. Measurements could never be perfect. Scientists marching under Newton's banner actually waved another flag that said something like this: Given an approximate knowledge of a system's initial conditions and an understanding of natural law, one can calculate the approximate behavior of the system. This assumption lay at the philosophical heart of science. As one theoretician liked to tell his students: "The basic idea of Western science is that you don't have to take into account the falling of a leaf on some planet in another galaxy when you're trying to account for the motion of a billiard ball on a pool table on earth. Very small influences can be neglected. There's a convergence in the way things work, and arbitrarily small influences don't blow up to have arbitrarily large effects." Classically, the belief in approximation and convergence was well justified. It worked. A tiny error in fixing the position of Comet Halley in 1910 would only cause a tiny error in predicting its arrival in 1986, and the error would stay small for millions of years to come. Computers rely on the same assumption in guiding spacecraft: approximately accurate input gives approximately accurate output. Economic forecasters rely on this assumption, though their success is less apparent. So did the pioneers in global weather forecasting.
With his primitive computer, Lorenz had boiled weather down to the barest skeleton. Yet, line by line, the winds and temperatures in Lorenz's printouts seemed to behave in a recognizable earthly way. They matched his cherished intuition about the weather, his sense that it repeated itself, displaying familiar patterns over time, pressure rising and falling, the airstream swinging north and south. He discovered that when a line went from high to low without a bump, a double bump would come next, and he said, "That's the kind of rule a forecaster could use." But the repetitions were never quite exact. There was pattern, with disturbances. An orderly disorder.
To make the patterns plain to see, Lorenz created a primitive kind of graphics. Instead of just printing out the usual lines of digits, he would have the machine print a certain number of blank spaces followed by the letter a. He would pick one variable—perhaps the direction of the airstream. Gradually the a's marched down the roll of paper, swinging back and forth in a wavy line, making a long series of hills and valleys that represented the way the west wind would swing north and south across the continent. The orderliness of it, the recognizable cycles coming around again and again but never twice the same way, had a hypnotic fascination. The system seemed slowly to be revealing its secrets to the forecaster's eye.
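Lorenz's printer graphics are easy to mimic. The sketch below is illustrative only: a sine wave stands in for the model's airstream variable (Lorenz's actual output came from his twelve equations), and each output line is a run of blank spaces followed by the letter a, so the column of a's traces a wavy line down the page.

```python
import math

def print_wave(steps=20, width=60):
    """Render a slowly varying variable as Lorenz-style printer
    graphics: each line is blank spaces followed by the letter 'a'."""
    lines = []
    for t in range(steps):
        value = math.sin(t * 0.3)                 # stand-in for model output
        col = int((value + 1) / 2 * (width - 1))  # map [-1, 1] to a column
        lines.append(" " * col + "a")
    return "\n".join(lines)

print(print_wave())
```

Read top to bottom, the drifting column of a's forms the hills and valleys Lorenz watched for.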
One day in the winter of 1961, wanting to examine one sequence at greater length, Lorenz took a shortcut. Instead of starting the whole run over, he started midway through. To give the machine its initial conditions, he typed the numbers straight from the earlier printout. Then he walked down the hall to get away from the noise and drink a cup of coffee. When he returned an hour later, he saw something unexpected, something that planted a seed for a new science.
This new run should have exactly duplicated the old. Lorenz had copied the numbers into the machine himself. The program had not changed. Yet as he stared at the new printout, Lorenz saw his weather diverging so rapidly from the pattern of the last run that, within just a few months, all resemblance had disappeared. He looked at one set of numbers, then back at the other. He might as well have chosen two random weathers out of a hat. His first thought was that another vacuum tube had gone bad.
Suddenly he realized the truth. There had been no malfunction. The problem lay in the numbers he had typed. In the computer's memory, six decimal places were stored: .506127. On the printout, to save space, just three appeared: .506. Lorenz had entered the shorter, roundedoff numbers, assuming that the difference—one part in a thousand—was inconsequential.
It was a reasonable assumption. If a weather satellite can read ocean-surface temperature to within one part in a thousand, its operators consider themselves lucky. Lorenz's Royal McBee was implementing the classical program. It used a purely deterministic system of equations. Given a particular starting point, the weather would unfold exactly the same way each time. Given a slightly different starting point, the weather should unfold in a slightly different way. A small numerical error was like a small puff of wind—surely the small puffs faded or canceled each other out before they could change important, large-scale features of the weather. Yet in Lorenz's particular system of equations, small errors proved catastrophic.
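The mechanism is simple to reproduce. In the sketch below, the logistic map stands in for Lorenz's twelve equations (an assumption for illustration; his model was a system of weather equations, not this map). Two runs start from the stored six-digit value and the printed three-digit value, and the one-part-in-a-thousand difference grows until the trajectories bear no resemblance to each other.

```python
def logistic(x, r=3.9):
    # one step of the logistic map, a standard chaotic system
    return r * x * (1 - x)

# Lorenz's stored value vs. the rounded value he retyped
x_full, x_rounded = 0.506127, 0.506

x_full, x_rounded = logistic(x_full), logistic(x_rounded)
first_gap = abs(x_full - x_rounded)  # still tiny after one step

max_gap = first_gap
for _ in range(59):
    x_full, x_rounded = logistic(x_full), logistic(x_rounded)
    max_gap = max(max_gap, abs(x_full - x_rounded))

print(f"gap after one step: {first_gap:.2e}")
print(f"largest gap over 60 steps: {max_gap:.3f}")
```

The gap stays negligible at first, then compounds at each iteration until the two runs are effectively unrelated, which is exactly what Lorenz saw in his diverging printouts.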
He decided to look more closely at the way two nearly identical runs of weather flowed apart. He copied one of the wavy lines of output onto a transparency and laid it over the other, to inspect the way it diverged. First, two humps matched detail for detail. Then one line began to lag a hairsbreadth behind. By the time the two runs reached the next hump, they were distinctly out of phase. By the third or fourth hump, all similarity had vanished.
It was only a wobble from a clumsy computer. Lorenz could have assumed something was wrong with his particular machine or his particular model—probably should have assumed. It was not as though he had mixed sodium and chlorine and got gold. But for reasons of mathematical intuition that his colleagues would begin to understand only later, Lorenz felt a jolt: something was philosophically out of joint. The practical import could be staggering. Although his equations were gross parodies of the earth's weather, he had a faith that they captured the essence of the real atmosphere. That first day, he decided that long-range weather forecasting must be doomed.
"We certainly hadn't been successful in doing that anyway and now we had an excuse," he said. "I think one of the reasons people thought it would be possible to forecast so far ahead is that there are real physical phenomena for which one can do an excellent job of forecasting, such as eclipses, where the dynamics of the sun, moon, and earth are fairly complicated, and such as oceanic tides. I never used to think of tide forecasts as prediction at all—I used to think of them as statements of fact—but of course, you are predicting. Tides are actually just as complicated as the atmosphere. Both have periodic components—you can predict that next summer will be warmer than this winter. But with weather we take the attitude that we knew that already. With tides, it's the predictable part that we're interested in, and the unpredictable part is small, unless there's a storm.
"The average person, seeing that we can predict tides pretty well a few months ahead would say, why can't we do the same thing with the atmosphere, it's just a different fluid system, the laws are about as complicated. But I realized that any physical system that behaved nonperiodically would be unpredictable."
The fifties and sixties were years of unreal optimism about weather forecasting. Newspapers and magazines were filled with hope for weather science, not just for prediction but for modification and control. Two technologies were maturing together, the digital computer and the space satellite. An international program was being prepared to take advantage of them, the Global Atmosphere Research Program. There was an idea that human society would free itself from weather's turmoil and become its master instead of its victim. Geodesic domes would cover cornfields. Airplanes would seed the clouds. Scientists would learn how to make rain and how to stop it.
The intellectual father of this popular notion was Von Neumann, who built his first computer with the precise intention, among other things, of controlling the weather. He surrounded himself with meteorologists and gave breathtaking talks about his plans to the general physics community. He had a specific mathematical reason for his optimism. He recognized that a complicated dynamical system could have points of instability—critical points where a small push can have large consequences, as with a ball balanced at the top of a hill. With the computer up and running, Von Neumann imagined that scientists would calculate the equations of fluid motion for the next few days. Then a central committee of meteorologists would send up airplanes to lay down smoke screens or seed clouds to push the weather into the desired mode. But Von Neumann had overlooked the possibility of chaos, with instability at every point.
(Continues...)
Table of Contents
Prologue
The Butterfly Effect
Edward Lorenz and his toy weather. The computer misbehaves. Long-range forecasting is doomed. Order masquerading as randomness. A world of nonlinearity. "We completely missed the point."
Revolution
A revolution in seeing. Pendulum clocks, space balls, and playground swings. The invention of the horseshoe. A mystery solved: Jupiter's Great Red Spot.
Life's Ups and Downs
Modeling wildlife populations. Nonlinear science, "the study of nonelephant animals." Pitchfork bifurcations and a ride on the Spree. A movie of chaos and a messianic appeal.
A Geometry of Nature
A discovery about cotton prices. A refugee from Bourbaki. Transmission errors and jagged shores. New dimensions. The monsters of fractal geometry. Quakes in the schizosphere. From clouds to blood vessels. The trash cans of science. "To see the world in a grain of sand."
Strange Attractors
A problem for God. Transitions in the laboratory. Rotating cylinders and a turning point. David Ruelle's idea for turbulence. Loops in phase space. Millefeuilles and sausage. An astronomer's mapping. "Fireworks or galaxies."
Universality
A new start at Los Alamos. The renormalization group. Decoding color. The rise of numerical experimentation. Mitchell Feigenbaum's breakthrough. A universal theory. The rejection letters. Meeting in Como. Clouds and paintings.
The Experimenter
Helium in a Small Box. "Insolid billowing of the solid." Flow and form in nature. Albert Libchaber's delicate triumph. Experiment joins theory. From one dimension to many.
Images of Chaos
The complex plane. Surprise in Newton's method. The Mandelbrot set: sprouts and tendrils. Art and commerce meet science. Fractal basin boundaries. The chaos game.
The Dynamical Systems Collective
Santa Cruz and the sixties. The analog computer. Was this science? "A longrange vision." Measuring unpredictability. Information theory. From microscale to macroscale. The dripping faucet. Audiovisual aids. An era ends.
Inner Rhythms
A misunderstanding about models. The complex body. The dynamical heart. Resetting the biological clock. Fatal arrhythmia. Chick embryos and abnormal beats. Chaos as health.
Chaos and Beyond
New beliefs, new definitions. The Second Law, the snowflake puzzle, and loaded dice. Opportunity and necessity.
Afterword
Notes on Sources and Further Reading
Acknowledgments
Index