The Matchbox that Ate a Forty-Ton Truck
What Everyday Things Tell Us About the Universe
By Marcus Chown
Faber and Faber, Inc. Copyright © 2009 Marcus Chown
All rights reserved.
The Face in the Window
How, when you stand in front of a window, the most shocking discovery in the history of science – that ultimately things happen for no reason – is literally staring you in the face
Une difficulté est une lumière. Une difficulté insurmontable est un soleil. (A difficulty is a light. An insurmountable difficulty is a sun.)
No progress without paradox.
John Wheeler, 1985
It is night-time and it is raining. You are staring dreamily out of a window at the lights of the city. You can see the cars driving past on the street and you can see the faint reflection of your face among the runnels of water streaming down the pane. Believe it or not, this simple observation is telling you something profound and shocking about fundamental reality. It is telling you that the universe, at its deepest level, is founded on randomness and unpredictability, the capricious roll of a dice – that, ultimately, things happen for no reason at all.
The reason you can see the lights of the city outside and simultaneously the faint image of your face staring back at you is because about 95 per cent of the light striking the window goes straight through while about 5 per cent is reflected. This is easy to understand if light is a wave, like a ripple on water, which is the commonly held view. Imagine a speedboat streaking across a lake and creating a bow wave which runs into a piece of partially submerged driftwood. Most of the wave just keeps on going, unaffected by the obstacle, while a small portion doubles back on itself. Similarly, when a light wave encounters the obstacle of a window, most of the wave is transmitted, while a small portion is reflected.
This explanation of why you see your face in a window is straightforward. It certainly does not appear to have any profound implications for the nature of ultimate reality. However, this is an illusion. Light is not what it seems. It has a trick up its sleeve which undermines this simple picture and changes everything. In the twentieth century, a number of phenomena were discovered that revealed that light behaved not as a wave, like a ripple spreading on a pond, but as a stream of bullet-like particles. For instance, there was the Compton effect, which revealed something very peculiar about the way light bounced, or 'scattered', off an electron. Discovered in 1897 by the Cambridge physicist 'J. J.' Thomson, the electron was a particle smaller than an atom. In fact, it was one of the atom's key constituents.
In 1920, the American physicist Arthur Compton decided to investigate what happened to light when it was shone on electrons. He had a picture in his mind of light waves bouncing off an electron like water waves off a buoy. If you have seen such a thing, you will know that the size, or 'wavelength', of the waves remains unchanged. In other words, the distance between successive wave crests is the same for the outgoing wave as the incoming wave. But in Compton's experiment this was not the case at all. After the light waves had bounced off electrons, their wavelength was bigger than before. And the more the direction of the light was changed in the encounter, the bigger the change in wavelength. It was as if the mere act of bouncing off an electron magically changed blue light, which is characterised by a short wavelength, into red light, which has a longer wavelength. A longer, more sluggish wave turns out to be less energetic than a short, frenetic wave. So what Compton's experiments were telling him was that, when light bounced off an electron, it was somehow sapped of energy.
Compton's mental picture of what was going on was demolished. The light in his experiments was not behaving anything like a water wave bouncing off a buoy. In fact, the more he thought about it, the more he realised that it was behaving like a billiard ball hitting another billiard ball. When a ball is struck by the cue ball, it shoots off, carrying with it some of the energy of the cue ball. Inevitably, the cue ball loses energy. Electrons were known to be like tiny billiard balls, but light was known to ripple through space like a wave. Compton's experiments were unequivocal, however. Despite centuries of evidence to the contrary, light must also consist of particles like tiny billiard balls. For his ground-breaking work in confirming the particle-like nature of light, Compton was awarded the 1927 Nobel Prize for Physics.
More evidence that light behaved like a stream of particles came from the photoelectric effect, familiar to everyone who sees supermarket doors part like the Red Sea when they walk towards them. What triggers the doors to swish aside is the breaking of a beam of light by an approaching leg or a foot. The beam illuminates a 'photocell', a device containing a metal which spits out electrons whenever light falls on it. This happens because the electrons are only loosely bound to their parent atoms, so the energy delivered by the light is sufficient to kick them free. When someone breaks the light beam, the photocell is cast into shadow and the sputtering of electrons stops. The electronics are rigged in such a way that the instant the flow of electrons chokes off the doors open.
So what has the photoelectric effect got to do with the particle nature of light? If light is a wave, it is nigh on impossible to explain how it can deliver energy efficiently to a tiny, localised electron. Being spread out, a typical light wave will interact with a large number of electrons spread over the surface of the metal. Inevitably, some will get kicked out after others. In fact, calculations show that some electrons will be kicked out up to ten minutes after others. Imagine if the flow of electrons took ten minutes to build up in the photocell, so supermarket customers had to wait ten minutes for an automatic door to open.
Everything makes sense if the light is made of tiny particles and each interacts with a single electron in the metal. Rather than spreading its energy over large numbers of electrons, the light tied up in such 'photons' packs a real punch. Not only does each photon eject a single electron but it ejects it promptly, not after a ten-minute delay. Thank the particle-like nature of light for your prompt admission to a supermarket.
It was for explaining the photoelectric effect in terms of tiny chunks, or 'quanta', of light that Einstein won the 1921 Nobel Prize for Physics. Many people find this surprising. They wonder why he did not win the prize for 'relativity', the theory for which he is most famous and which changed forever our view of space and time. Einstein himself, however, always saw relativity as a natural and unsurprising outgrowth of nineteenth-century physics. He considered 'quanta', alone among his achievements, the only truly revolutionary idea of his life.
Einstein published his paper on the existence of quanta in the same 'miraculous year' as his theory of relativity. Five years earlier, in 1900, the German physicist Max Planck had found a way to explain the puzzling character of the heat coming from a furnace by suggesting that atoms can vibrate only at certain permissible energies and that those energies come in multiples of some basic chunk, or quantum, of energy. Planck believed these quanta to be no more than a mathematical sleight of hand, with no physical significance whatsoever. Einstein was the first person to view them as truly real – as flying through space as a stream of photons in a beam of light.
The Matchbox That Ate a Forty-Ton Truck
Actually, the fact that light must in some circumstances behave as tiny, localised particles is forced on us by the most familiar of everyday phenomena – the emission of light by the filament of a light bulb and the absorption of light by your eye. The reason has to do with the make-up of the filament and your retina. Like all matter, they are made of atoms.
The idea that everything is made of atoms comes from the Greek philosopher Democritus, who, around 440 BC, picked up a rock or a branch or maybe it was a piece of pottery and asked himself: 'If I cut this object in half, then cut the halves in half, can I go on subdividing it like this forever?' Democritus answered his own question. It was inconceivable to him that matter could be subdivided in this way forever. Sooner or later, he reasoned, you must come to a tiny grain of matter which could not be cut in half any more. Since the Greek for 'uncuttable' was a-tomos, Democritus' ultimate grains of matter have come to be known as 'atoms'.
Democritus actually went further and postulated that atoms come in a handful of different types, like microscopic LEGO bricks, and that, by assembling them in different ways, it is possible to make a rose or a cloud or a shining star. But the key idea is that reality is ultimately grainy, composed of tiny, hard bullets of matter. It is an idea that has certainly stood the test of time.
Atoms turn out to be very small. It takes more than a million to span a pinhead. Confirming their existence was therefore very hard. A lot of indirect evidence accumulated over the centuries of science. Remarkably, however, no one actually 'saw' an atom until the early 1980s, when two physicists at IBM built an ingenious device called the Scanning Tunnelling Microscope.
The STM earned Gerd Binnig and Heinrich Rohrer the 1986 Nobel Prize for Physics. Basically, the device drags a microscopic 'finger' across the surface of a material, sensing the up-and-down motion as it passes over the atoms in much the same way that a blind person senses the undulations of someone's face with their finger. And, in the same way a blind person builds up a mental picture of the face they are feeling, the STM builds up a picture on a computer display of the atomic landscape over which it is travelling.
Using the STM, Binnig and Rohrer became the first people in history to look down, like gods, on the microscopic world of atoms. And what they saw, swimming into view on their computer screen, was exactly what Democritus had imagined 2,500 years earlier. Atoms looked like tiny tennis balls. They looked like apples stacked in boxes. Never, in the history of science, had someone made a prediction so far in advance of its experimental confirmation. If only Binnig and Rohrer had had a time machine, they could have transported Democritus to their Zürich lab, stood him in front of their remarkable image and said: 'Look. You got it right.' Just like artists who die in obscurity, never having seen their reputations go stratospheric and their paintings sell for tens of millions of dollars, scientists may never live to see the spectacular success of their ideas.
Atoms, it turns out, are not the ultimate grains of matter. They are made of smaller things. Nevertheless, Democritus' idea that matter is ultimately grainy, not continuous, persists, with 'quarks' and 'leptons' now wearing the mantle of nature's uncuttable grains. But quarks, it turns out, are not important when it comes to the meeting of light and matter in your eye or in the filament of a light bulb. When light is absorbed or spat out, it is atoms that do the absorbing and spitting. And herein lies the problem.
An atom, according to our theory of matter, is a tiny, localised thing like a microscopic billiard ball. Light, on the other hand, is a spread-out thing like a ripple on a pond. Take visible light. A convenient measure of its size is its wavelength – the distance it travels during a complete up-and-down oscillation, which is the same as the separation of successive wave crests. The wavelength of visible light is about five thousand times bigger than an atom. Imagine you have a matchbox. You open it and out drives a forty-ton truck. Or say a forty-ton truck is driving towards you, you open your matchbox and the truck disappears inside. Ridiculous? But this is precisely the paradox that exists at the interface where light meets matter.
How does an atom in your eye swallow something five thousand times bigger than itself? How does an atom in the filament of a light bulb cough out something five thousand times more spread out? The British survival expert Ray Mears said during one of his TV programmes: 'Nothing fits inside a snake like another snake.' Apply this logic to the interface between light and matter. If light is to fit inside an atom, which is small and localised, it too must be small and localised. The trouble is there are a thousand instances where light shows itself to be a spread-out wave.
In the first decades of the twentieth century, physicists too went round and round in circles, trying desperately to resolve paradoxes of this kind. As the German physicist Werner Heisenberg wrote: 'I remember discussions which went through many hours until very late at night and ended almost in despair; and when at the end of the discussion I went alone for a walk in the neighbouring park I repeated to myself again and again the question: Can nature possibly be so absurd as it seemed to us in these atomic experiments?'
A paradox in which one theory predicts one thing in a particular circumstance and another theory something quite different is often hugely fruitful. It tells us that at least one of the theories is wrong. And the bigger and better established the theories at loggerheads, the more revolutionary the consequences. In the case of light being emitted from a light bulb or being absorbed by your eye, the two theories which predict conflicting things are the wave theory of light and the atomic theory of matter. And they are two of the biggest and best-established theories of all.
So which theory is wrong? The extraordinary answer embraced by physicists is both. Or neither. Light is both a wave and a particle. Or, rather, it is something for which we have no word in our vocabulary, and there is nothing we can compare it with in the everyday world. It is fundamentally ungraspable – like a three-dimensional object is to creatures confined to the two-dimensional world of a sheet of paper, with no concept of up above or down below. All they can ever experience are 'shadows' of the object, never the object in its entirety. Similarly, light is not a wave or a particle but 'something else' that we can never grasp completely. All we can see are its shadows – in some circumstances its wave-like face and in others its particle-like face.
Clearly, atoms do spit out light. But, just as clearly, visible light is many thousands of times bigger than an atom that spits it out. Both facts are incontrovertible. The only way to resolve the paradox, therefore, is to accept something that sounds like sheer madness – that light is both thousands of times bigger than and smaller than an atom. It is both spread out and localised. It is both a wave and a particle. When it travels through space, light travels like a ripple on a pond. However, when it is absorbed or spat out by an atom, it behaves like a stream of tiny machine-gun bullets. Imagine you are standing by a fire hydrant in New York's Times Square and simultaneously spread out like a fog throughout Manhattan. Ridiculous? Yes. Nevertheless, that is the way light is.
The wave picture of light was correct. So too was the particle picture. Paradoxically, light is both a wave and a particle.
A World That Defies Common Sense
Should we be surprised to find that light is fundamentally different from anything in the everyday world? Should we be surprised that it is ungraspable in its entirety, that its properties are counter-intuitive, that they defy common sense? Perhaps it helps to spell out what we mean by intuition or common sense. Really, it is just the body of information we have gathered about how the world around us works. In evolutionary terms we needed that information to survive on an African plain in the midst of a lot of creatures which were bigger, faster and fiercer than us. Survival depended on having vision that enabled us to see relatively big objects between us and the horizon, hearing that enabled us to hear relatively loud sounds, and so on. There was no survival value in developing senses that could take us beyond the world of our immediate surroundings – eyes, for instance, that could show us the microscopic realm of atoms. Consequently, we developed no intuition whatsoever about these domains. We should, therefore, not be surprised that when we began to explore the domain of the very small compared to our everyday world, we found counter-intuitive things. An atom is about 10 billion times smaller than a human being. It would be surprising if it behaved in any way like a football or a chair or a table, or anything else in the world of our senses.
The first person to realise that the fundamental reality that underpins the everyday world is totally unlike the everyday world was the Scottish physicist James Clerk Maxwell, arguably the most important physicist between the time of Newton and Einstein (tragically, he died of stomach cancer, aged only forty-eight). His great triumph, in the 1860s, was to distil all magnetic and electrical phenomena into one neat set of formulae. 'Maxwell's equations' are so super-compact you could write them on the back of a stamp (if you have small handwriting!).
Excerpted from The Matchbox that Ate a Forty-Ton Truck by Marcus Chown. Copyright © 2009 Marcus Chown. Excerpted by permission of Faber and Faber, Inc.
All rights reserved. No part of this excerpt may be reproduced or reprinted without permission in writing from the publisher.