Quanta and Fields: The Biggest Ideas in the Universe

by Sean Carroll

Narrated by Sean Carroll

Unabridged — 9 hours, 49 minutes

Audiobook (Digital)

$20.00

Overview

Quanta and Fields, the second book of Sean Carroll's already internationally acclaimed series The Biggest Ideas in the Universe, is an adventure into the bare stuff of reality.
Sean Carroll is creating a profoundly new approach to sharing physics with a broad audience, one that goes beyond analogies to show how physicists really think. He cuts to the bare mathematical essence of our most profound theories, explaining every step in a uniquely accessible way.*

Quantum field theory is how modern physics describes nature at its most profound level. Starting with the basics of quantum mechanics itself, Sean Carroll explains measurement and entanglement before showing how the world is really made of fields. You will finally understand why matter is solid, why there is antimatter, where the sizes of atoms come from, and why the predictions of quantum field theory are so spectacularly successful. Fundamental ideas like spin, symmetry, Feynman diagrams, and the Higgs mechanism are explained for real, not just through amusing stories. Beyond Newton, beyond Einstein, and beyond all the intuitive notions that have guided Homo sapiens for millennia, this book is a journey to a once unimaginable truth about what our universe is.


* This audiobook edition includes a downloadable PDF of graphs, equations, and images.

Editorial Reviews

From the Publisher

Praise for Quanta and Fields:

“Readers will be electrified by his discussion of wave functions, entanglement, fields, and so much more. From the most infinitesimal of subatomic particles to the seemingly vast infinities of the universe’s great expanse, Carroll’s latest inquiry illuminates, well, everything.” (Booklist)

Kirkus Reviews

2024-02-14
The author’s second volume on the laws that govern the universe.

In 2022, Carroll, professor of natural philosophy at Johns Hopkins, published The Biggest Ideas in the Universe: Space, Time, and Motion, in which he explained the first fundamental description of nature in physics: classical mechanics, from Newton’s gravity to Einstein’s relativity. In the classical world, the workings of particles and energy are often complicated and even bizarre (general relativity is both), but they make sense. Adding that no one, physicists included, can sensibly explain quantum mechanics—even though it’s “the way the world works”—the author emphasizes that this book follows naturally from its predecessor. Although Carroll maintains that his field contains an enormous amount of material that he will “boil down to its bare essence,” readers may wish that he had kept the pot on longer. Science writers dealing with complex areas—e.g., DNA, immunity, brain function, quantum mechanics—traditionally start with easy material, usually the history, proceed to basic concepts, and slowly add complexity. Carroll, however, is a take-no-prisoners popularizer, and he dispenses with history in a dozen pages. His first equation appears on page 14, and it’s a doozy. Torrents of others follow as he eschews analogies, metaphors, and amusing stories for straightforward explanations of matter, fields, and forces as well as quantum behavior that he argues is less bizarre than other authors claim. That may be true, but most readers will agree that concepts such as entanglement, spin, symmetry, antimatter, and gauge theory are difficult to comprehend. This should not be anyone’s introduction to quantum theory; for that, try Quantum Mechanics by Leonard Susskind and Art Friedman. Readers who have forgotten first-year college physics and calculus will struggle. Those who remember and pay close attention will receive an unvarnished education on how the universe works.

Quantum theory for serious readers.

Product Details

BN ID: 2940159610089
Publisher: Penguin Random House
Publication date: 05/14/2024
Edition description: Unabridged
Sales rank: 420,890

Read an Excerpt

ONE
Wave Functions

As the nineteenth century drew to a close, you would have forgiven physicists for hoping that they were on track to understand everything. The universe, according to this tentative picture, was made of particles that were pushed around by fields.

The idea of fields filling space had really taken off over the course of the 1800s. Earlier, Isaac Newton had presented a beautiful and compelling theory of motion and gravity, and Pierre-Simon Laplace had shown how we could reformulate that theory in terms of a gravitational field stretching between every object in the universe. A field is just something that has a value at each point in space. The value could be a simple number, or it could be a vector or something more complicated, but any field exists everywhere throughout space.
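To make "a value at each point in space" concrete, here is a minimal Python sketch (the particular functions are arbitrary illustrations, not anything from the text): a scalar field returns one number per point, while a vector field returns a vector per point.

```python
import numpy as np

# A field assigns a value to every point (x, y, z) in space.
# These specific functions are illustrative stand-ins.

def scalar_field(x, y, z):
    """Scalar field: a single number at each point."""
    return np.exp(-(x**2 + y**2 + z**2))

def vector_field(x, y, z):
    """Vector field: a 3-component value at each point."""
    return np.array([-y, x, 0.0 * z])

print(scalar_field(0.0, 0.0, 0.0))  # 1.0 at the origin
print(vector_field(1.0, 2.0, 3.0))  # [-2.  1.  0.]
```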

But if all you cared about was gravity, the field seemed optional: a point of view you could choose to take or not, depending on your preferences. It was equally okay to think as Newton did, directly in terms of the force created on one object by the gravitational pull of others without anything stretching between them.

That changed in the nineteenth century, as physicists came to grips with electricity and magnetism. Electrically charged objects exert forces on each other, which is natural to attribute to the existence of an electric field stretching between them. Experiments by Michael Faraday showed that a moving magnet could induce an electrical current in a wire without actually touching it, pointing to the existence of a separate magnetic field, and James Clerk Maxwell managed to combine these two kinds of fields into a single theory of electromagnetism, published in 1873. This was an enormous triumph of unification, explaining a diverse set of electrical and magnetic phenomena in a single compact theory. "Maxwell's equations" bedevil undergraduate physics students to this very day.

One of the triumphant implications of Maxwell's theory was an understanding of the nature of light. Rather than a distinct kind of substance, light is a propagating wave in the electric and magnetic fields, also known as electromagnetic radiation. We think of electromagnetism as a "force," and it is, but Maxwell taught us that fields carrying forces can vibrate, and in the case of electric and magnetic fields those vibrations are what we perceive as light. The quanta of light are particles called photons, so we will sometimes say "photons carry the electromagnetic force." But at the moment we're still thinking classically.

Take a single charged particle, like an electron. Left sitting by itself, it will have an electric field surrounding it, with lines of force pointing toward the electron. The force will fall off as an inverse-square law, just as in Newtonian gravity. If we move the electron, two things happen: First, a charge in motion creates a magnetic field as well as an electric one. Second, the existing electric field will adjust how it is oriented in space, so that it remains pointing toward the particle. And together, these two effects (small magnetic field, small deviation in the existing electric field) ripple outward, like waves from a pebble thrown into a pond. Maxwell found that the speed of these ripples is precisely the speed of light, because it is light. Light, of any wavelength from radio to x-rays and gamma rays, is a propagating vibration in the electric and magnetic fields. Almost all the light you see around you right now has its origin in a charged particle being jiggled somewhere, whether it's in the filament of a lightbulb or the surface of the sun.
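Both claims are easy to check numerically. A minimal sketch (the constants are standard CODATA values, assumed here rather than quoted from the text): the electron's field obeys an inverse-square law, and Maxwell's ripple speed 1/sqrt(epsilon_0 * mu_0) comes out to the measured speed of light.

```python
import math

# Standard physical constants (CODATA values, assumed here).
EPSILON_0 = 8.8541878128e-12  # vacuum permittivity, F/m
MU_0 = 1.25663706212e-6       # vacuum permeability, N/A^2
E_CHARGE = 1.602176634e-19    # elementary charge, C

def coulomb_field(r):
    """Magnitude of an electron's electric field at distance r (meters)."""
    return E_CHARGE / (4 * math.pi * EPSILON_0 * r**2)

# Inverse-square law: doubling the distance quarters the field.
print(coulomb_field(2e-10) / coulomb_field(1e-10))  # 0.25

# Maxwell's wave speed is 1/sqrt(eps0 * mu0) -- the speed of light.
print(1 / math.sqrt(EPSILON_0 * MU_0))  # ~2.9979e8 m/s
```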

Simultaneously in the nineteenth century, the role of particles was also becoming clear. Chemists, led by John Dalton, championed the idea that matter was made of individual atoms, with one specific kind of atom associated with each chemical element. Physicists belatedly caught on, once they realized that thinking of gases as collections of bouncing atoms could explain things like temperature, pressure, and entropy.

But the term "atom," borrowed from the ancient Greek idea of an indivisible elementary unit of matter, turned out to be a bit premature. Though they are the building blocks of chemical elements, modern-day atoms are not indivisible. A quick-and-dirty overview, with details to be filled in later: atoms consist of a nucleus made of protons and neutrons, surrounded by orbiting electrons. Protons have positive electrical charge, neutrons have zero charge, and electrons have negative charge. We can make a neutral atom if we have equal numbers of protons and electrons, since their electrical charges will cancel each other out. Protons and neutrons have roughly the same mass, with neutrons being just a bit heavier, but electrons are much lighter, about 1/1,800th the mass of a proton. So most of the mass in a person or another macroscopic object comes from the protons and neutrons. The lightweight electrons are more able to move around and are therefore responsible for chemical reactions as well as the flow of electricity. These days we know that protons and neutrons are themselves made of smaller particles called quarks, which are held together by gluons, but there was no hint of that in the early 1900s.

This picture of atoms was put together gradually. Electrons were discovered in 1897 by British physicist J. J. Thomson, who measured their charge and established that they were much lighter than atoms. So somehow there must be two components in an atom: the lightweight, negatively charged electrons, and a heavier, positively charged piece. A few years later Thomson suggested a picture in which tiny electrons floated within a larger, positively charged volume. This came to be called the plum pudding model, with electrons playing the role of the plums.

The plum pudding model didn't flourish for long. A famous experiment by Ernest Rutherford, Hans Geiger, and Ernest Marsden shot alpha particles (now known to be nuclei of helium atoms) at a thin sheet of gold foil. The expectation was that they would mostly pass right through, with their trajectories slightly deflected if they happened to pass through an atom and interact with the electrons (the plums) or the diffuse positively charged blob (the pudding). Electrons are too light to disturb the alpha particles' trajectories, and a spread-out positive charge would be too diffuse to have much effect. But what happened was, while most of the particles did indeed zip through unaffected, some bounced off at wild angles, even straight back. That could only happen if there was something heavy and substantial for the particles to carom off of. In 1911 Rutherford correctly explained this result by positing that the positive charge was concentrated in a massive central nucleus. When an incoming alpha particle was lucky enough to score a direct hit on the small but heavy nucleus, it would be deflected at a sharp angle, which is what was observed. In 1920 Rutherford proposed the existence of protons (which were just hydrogen nuclei, so had already been discovered), and in 1921 he theorized the existence of neutrons (which were eventually discovered in 1932).

So far, so good, thinks our imagined fin de siècle physicist. Matter is made of particles, the particles interact via forces, and those forces are carried by fields. The entire mechanism would run according to rules established by the framework of classical physics. For particles this is pretty familiar: we specify the positions and the momenta of all the particles, then use one of our classical techniques (Newton's laws or their equivalent) to describe their dynamics. Fields work in essentially the same way, except that the "position" of a field is its value at every point in space, and its "momentum" is how fast it's changing at every point. The overall classical picture applies in either case.
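That parallel between particles and fields can be made concrete with a small sketch (an illustrative discretization; the grid size and names are invented for illustration): a particle's classical state is its position and momentum, while a field's state on a sampled grid is its value at every point plus its rate of change at every point.

```python
import numpy as np

# Classical state of one particle: position and momentum vectors.
particle_state = {
    "position": np.zeros(3),
    "momentum": np.zeros(3),
}

# Classical state of a field, sampled at n points in space:
# the field's value everywhere plays the role of "position,"
# and its rate of change everywhere plays the role of "momentum."
n = 100
field_state = {
    "value": np.zeros(n),
    "rate_of_change": np.zeros(n),
}

print(particle_state["position"].shape, field_state["value"].shape)  # (3,) (100,)
```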

The suspicion that physics was close to being all figured out was tempting. Albert Michelson, at the dedication of a new physics laboratory at the University of Chicago in 1894, proclaimed, "It seems probable that most of the grand underlying principles [of physics] have been firmly established."

He was quite wrong.

But he was also in the minority. Other physicists, starting with Maxwell himself, recognized that the known behavior of collections of particles and waves didn't always accord with our classical expectations. William Thomson, Lord Kelvin, is often the victim of a misattributed quote: "There is nothing new to be discovered in physics now. All that remains is more and more precise measurement." His real view was the opposite. In a lecture in 1900, Thomson highlighted the presence of two "clouds" looming over physics, one of which was eventually to be dispersed by the formulation of the theory of relativity, the other by the theory of quantum mechanics.

Blackbody Radiation

The history of science is subtle and complicated, and progress rarely takes the straight path we remember in retrospect. Quantum mechanics in particular had a painful and messy development. We're going to skip over many of the historical twists and turns to focus on two puzzling phenomena that kicked off the quantum revolution: waves exhibiting particle-like properties, and particles exhibiting wave-like properties.

The particle-like properties of light came first. The idea arose from studying blackbody (or "thermal") radiation, which is the radiation emitted by an object that absorbs any incident light but nevertheless radiates just because it has a nonzero temperature. To physicists, the temperature of an object characterizes the random jiggling of its constituent particles, and randomly jiggling particles are going to emit radiation depending on how fast they are moving. When you look at a painting, you see an intricate configuration of shape and color that reflects the light that is shining on it. A blackbody, by contrast, is what you get when you turn off all the ambient light and just let objects glow because of their temperature; the glow from a heating element on an electric stove is a good example. Everything with a nonzero temperature gives off some thermal radiation, but pure blackbody radiation depends only on the temperature, unspoiled by color or reflectivity or other properties of the object. A low-temperature blackbody will primarily radiate at infrared or even radio wavelengths, and as we increase the temperature we see more visible light, ultraviolet, and ultimately x-rays.
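That trend from infrared glow to visible light to x-rays follows from Wien's displacement law (not named in the excerpt, so an addition here): the peak wavelength scales inversely with temperature.

```python
WIEN_B = 2.897771955e-3  # Wien displacement constant, m*K

def peak_wavelength(temperature):
    """Wavelength (m) at which a blackbody at this temperature (K) peaks."""
    return WIEN_B / temperature

for label, t in [("warm room", 300.0), ("stove element", 1000.0),
                 ("sunlight", 5800.0), ("very hot plasma", 1e7)]:
    print(f"{label}: {peak_wavelength(t):.2e} m")
# 300 K peaks near 1e-5 m (infrared); 5800 K near 5e-7 m (visible);
# 1e7 K near 3e-10 m (x-rays).
```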

So blackbody radiation represents a seemingly simple physics problem (a spherical cow, one might say). It has a temperature, and none of its other properties matter. Temperature measures the kinetic energy of atoms in the body jiggling back and forth, and those atoms contain charged particles, so this jiggling leads to the emission of electromagnetic radiation. Our physics problem is, how much radiation is given off at each wavelength?

Physicists in the nineteenth century set about both measuring the radiation as a function of wavelength (the spectrum of the blackbody) and calculating it theoretically. The measured curve is a thing of beauty, climbing up from zero at short wavelengths to a peak that depends on the temperature, then decaying back down to zero at long wavelengths.

The theoretical situation, however, was a mess. One proposed theory, by Wilhelm Wien in 1896, seemed to fit well at short wavelengths but diverged from experimental data at longer wavelengths. Another, by John Strutt (Lord Rayleigh) in 1900, worked the other way around: his formula fit well at long wavelengths but not at short ones. Indeed, it predicted an infinite amount of radiation at short wavelengths.
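The mismatch is easy to reproduce with the standard formulas (the excerpt names none of them, so this is a sketch; Planck's law, which the story is building toward, stands in for the measured curve). Wien's form tracks the true spectrum at short wavelengths, while Rayleigh's tracks it at long wavelengths and blows up as the wavelength shrinks.

```python
import math

H = 6.62607015e-34   # Planck constant, J*s
C = 299792458.0      # speed of light, m/s
K_B = 1.380649e-23   # Boltzmann constant, J/K

def planck(lam, t):
    """Planck spectral radiance: matches the measured blackbody curve."""
    return (2 * H * C**2 / lam**5) / (math.exp(H * C / (lam * K_B * t)) - 1)

def wien(lam, t):
    """Wien's 1896 form: accurate at short wavelengths only."""
    return (2 * H * C**2 / lam**5) * math.exp(-H * C / (lam * K_B * t))

def rayleigh_jeans(lam, t):
    """Rayleigh's 1900 form: accurate at long wavelengths, diverges as lam -> 0."""
    return 2 * C * K_B * t / lam**4

T = 5000.0
for lam in (1e-7, 5e-7, 1e-5):  # short to long wavelengths, in meters
    print(f"{lam:.0e} m  wien/planck={wien(lam, T)/planck(lam, T):.3g}  "
          f"rj/planck={rayleigh_jeans(lam, T)/planck(lam, T):.3g}")
# Wien's ratio is ~1 at short wavelengths; Rayleigh's is ~1 at long
# wavelengths but explodes at short ones -- the infinity in the text.
```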
