Isaac Newton invented physics, and all of science depends on physics. Newton certainly built upon the work of others, but it was the publication of his three laws of motion and theory of gravity, almost exactly three hundred years ago, that set science off on the road that has led to space flight, lasers, atomic energy, genetic engineering, an understanding of chemistry, and all the rest. For two hundred years, Newtonian physics (what is now called “classical” physics) reigned supreme; in the twentieth century revolutionary new insights took physics far beyond Newton, but without those two centuries of scientific growth those new insights might never have been achieved. This book is not a history of science, and it is concerned with the new physics—quantum physics—rather than with those classical ideas. But even in Newton’s work three centuries ago there were already signs of the changes that were to come—not from his studies of planetary motions and orbits, or his famous three laws, but from his investigations of the nature of light.
Newton’s ideas about light owed a lot to his ideas about the behavior of solid objects and the orbits of planets. He realized that our everyday experiences of the behavior of objects may be misleading, and that an object, a particle, free from any outside influences must behave very differently from such a particle on the surface of the earth. Here, our everyday experience tells us that things tend to stay in one place unless they are pushed, and that once you stop pushing them they soon stop moving. So why don’t objects like planets, or the moon, stop moving in their orbits? Is something pushing them? Not at all. It is the planets that are in a natural state, free from outside interference, and the objects on the surface of the earth that are being interfered with. If I try to slide a pen across my desk, my push is opposed by the friction of the pen rubbing against the desk, and that is what brings it to a halt when I stop pushing. If there were no friction, the pen would keep moving. This is Newton’s first law: every object stays at rest, or moves with constant velocity, unless an outside force acts on it. The second law tells us how much effect an outside force—a push—has on an object. Such a force changes the velocity of the object, and the rate of change of velocity is called acceleration; if you divide the force by the mass of the object the force is acting upon, the result is the acceleration produced on that body by that force. Usually, this second law is expressed slightly differently: force equals mass times acceleration. And Newton’s third law tells us something about how the object reacts to being pushed around: for every action there is an equal and opposite reaction.
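The arithmetic of the second law can be made concrete in a few lines. This is a minimal sketch, not from the book; the function name and the sample numbers are illustrative.

```python
# Newton's second law: acceleration is force divided by mass (a = F / m),
# more often written as force = mass * acceleration (F = m * a).

def acceleration(force_newtons: float, mass_kg: float) -> float:
    """Return the acceleration a given force produces on a given mass."""
    return force_newtons / mass_kg

# A 10-newton push on a 2-kilogram object produces an acceleration
# of 5 meters per second per second.
print(acceleration(10.0, 2.0))  # -> 5.0
```

Doubling the mass while keeping the push the same halves the acceleration, which is why the same shove moves a pen much more readily than a desk.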
If I hit a tennis ball with my racket, the force with which the racket pushes on the tennis ball is exactly matched by an equal force pushing back on the racket; the pen on my desk top, pulled down by gravity, is pushed against with an exactly equal reaction by the desk top itself; the force of the explosive process that pushes the gases out of the combustion chamber of a rocket produces an equal and opposite reaction force on the rocket itself, which pushes it in the opposite direction.
These laws, together with Newton’s law of gravity, explained the orbits of the planets around the sun, and the moon around the earth. When proper account was taken of friction, they explained the behavior of objects on the surface of the earth as well, and formed the foundation of mechanics. But they also had puzzling philosophical implications. According to Newton’s laws, the behavior of a particle could be exactly predicted on the basis of its interactions with other particles and the forces acting on it. If it were ever possible to know the position and velocity of every particle in the universe, then it would be possible to predict with utter precision the future of every particle, and therefore the future of the universe. Did this mean that the universe ran like clockwork, wound up and set in motion by the Creator, down some utterly predictable path? Newton’s classical mechanics provided plenty of support for this deterministic view of the universe, a picture that left little place for human free will or chance. Could it really be that we are all puppets following our own preset tracks through life, with no real choice at all? Most scientists were content to let the philosophers debate that question. But it returned, with full force, at the heart of the new physics of the twentieth century.
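The deterministic picture can be sketched in code: once a particle's position, velocity, and the forces acting on it are specified, Newton's laws fix its entire future. This is a minimal illustration, not from the book; the constant-gravity force and the step sizes are assumptions chosen for simplicity.

```python
# A "clockwork universe" in miniature: step a particle forward in time
# under a known force (constant gravity), using Newton's laws.
# Given the same initial state, the predicted future is always the same.

def predict(position: float, velocity: float, steps: int, dt: float = 0.01) -> float:
    """Return the particle's position after `steps` small time steps."""
    g = -9.8  # acceleration due to gravity, in meters per second squared
    for _ in range(steps):
        velocity += g * dt         # second law: the force changes the velocity
        position += velocity * dt  # the velocity changes the position
    return position

# Identical initial conditions always yield an identical future state:
print(predict(0.0, 20.0, 100) == predict(0.0, 20.0, 100))  # -> True
```

Laplace's famous "demon" is essentially this calculation scaled up to every particle in the universe; the unsettling philosophical implications the text describes follow directly.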
WAVES OR PARTICLES?
With his physics of particles such a success, it is hardly surprising that when Newton tried to explain the behavior of light he did so in terms of particles. After all, light rays are observed to travel in straight lines, and the way light bounces off a mirror is very much like the way a ball bounces off a hard wall. Newton built the first reflecting telescope, explained white light as a superposition of all the colors of the rainbow, and did much more with optics, but always his theories rested upon the assumption that light consisted of a stream of tiny particles, called corpuscles. Light rays bend as they cross the barrier between a lighter and a denser substance, such as from air to water or glass (which is why a swizzle stick in a gin and tonic appears to be bent), and this refraction is neatly explained on the corpuscular theory provided the corpuscles move faster in the more “optically dense” substance. Even in Newton’s day, however, there was an alternative way of explaining all of this.
The Dutch physicist Christiaan Huygens was a contemporary of Newton, although thirteen years older, having been born in 1629. He developed the idea that light is not a stream of particles but a wave, rather like the waves moving across the surface of a sea or lake, but propagating through an invisible substance called the “luminiferous ether.” Like ripples produced by a pebble dropped into a pond, light waves in the ether were imagined to spread out in all directions from a source of light. The wave theory explained reflection and refraction just as well as the corpuscular theory. It predicted that light waves, instead of speeding up, moved more slowly in a more optically dense substance; but since there was no way of measuring the speed of light in the seventeenth century, this difference could not resolve the conflict between the two theories. In one key respect, however, the two ideas did differ observably in their predictions. When light passes a sharp edge, it produces a sharply edged shadow. This is exactly the way streams of particles, traveling in straight lines, ought to behave. A wave tends to bend, or diffract, some of the way into the shadow (think of the ripples on a pond, bending around a rock). Three hundred years ago, this evidence clearly favored the corpuscular theory, and the wave theory, although not forgotten, was discarded. By the early nineteenth century, however, the status of the two theories had been almost completely reversed.
In the eighteenth century, very few people took the wave theory of light seriously. One of the few who not only took it seriously but wrote in support of it was the Swiss Leonhard Euler, the leading mathematician of his time, who made major contributions to the development of geometry, calculus and trigonometry. Modern mathematics and physics are described in arithmetical terms, by equations; the techniques on which that arithmetical description depends were largely developed by Euler, and in the process he introduced shorthand methods of notation that survive to this day—the name “pi” for the ratio of the circumference of a circle to its diameter; the letter i to denote the square root of minus one (which we shall meet again, along with pi); and the symbols used by mathematicians to denote the operation called integration. Curiously, though, Euler’s entry in the Encyclopaedia Britannica makes no mention of his views on the wave theory of light, views which a contemporary said were not held “by a single physicist of prominence.”* About the only prominent contemporary of Euler who did share those views was Benjamin Franklin; but physicists found it easy to ignore them until crucial new experiments were performed by the Englishman Thomas Young just at the beginning of the nineteenth century, and by the Frenchman Augustin Fresnel soon after.