States of Matter

by David L. Goodstein


Product Details

ISBN-13: 9780486319186
Publisher: Dover Publications
Publication date: 07/04/2013
Series: Dover Books on Physics
Sold by: Barnes & Noble
Format: NOOK Book
Pages: 512
File size: 34 MB

About the Author

David L. Goodstein is Professor of Physics and Applied Physics at the California Institute of Technology, where in 1995 he was named the Frank J. Gilloon Distinguished Teaching and Service Professor. His other books include Feynman's Lost Lecture and On Fact and Fraud: Cautionary Tales from the Front Lines of Science.

Read an Excerpt

States of Matter

By David L. Goodstein

Dover Publications, Inc.

Copyright © 1985 David L. Goodstein
All rights reserved.
ISBN: 978-0-486-79551-5

Ludwig Boltzmann, who spent much of his life studying statistical mechanics, died in 1906, by his own hand. Paul Ehrenfest, carrying on the work, died similarly in 1933. Now it is our turn to study statistical mechanics.

Perhaps it will be wise to approach the subject cautiously. We will begin by considering the simplest meaningful example, the perfect gas, in order to get the central concepts sorted out. In Chap. 2 we will return to complete the solution of that problem, and the results will provide the foundation of much of the rest of the book.

The quantum mechanical solution for the energy levels of a particle in a box (with periodic boundary conditions) is

εq = ħ²q²/2m (1.1.1)

where m is the mass of the particle, ħ is Planck's constant divided by 2π (h = 2πħ), and q (which we shall call the wave vector) has three components, qx, qy, and qz, given by

qx = (2π/L)lx, etc. (1.1.2)


lx = 0, ±1, ±2, etc. (1.1.3)


q² = qx² + qy² + qz² (1.1.4)

L is the dimension of the box, whose volume is L3. The state of the particle is specified if we give three integers, the quantum numbers lx, ly, and lz. Notice that the energy of a particle is fixed if we give the set of three integers (or even just the sum of their squares) without saying which is lx, for example, whereas lx, ly, and lz are each required to specify the state, so that there are a number of states for each energy of the single particle.
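The counting just described can be made concrete with a short sketch (my own illustration, not part of the text): enumerate the triples (lx, ly, lz) and tally how many states share each value of lx² + ly² + lz², and therefore each energy.

```python
# Group single-particle states of Eqs. (1.1.2)-(1.1.4) by the value of
# lx^2 + ly^2 + lz^2, which fixes the energy through Eq. (1.1.1).
from collections import defaultdict

def degeneracies(l_max):
    """Map lx^2 + ly^2 + lz^2 -> number of states, for |lx|,|ly|,|lz| <= l_max."""
    counts = defaultdict(int)
    for lx in range(-l_max, l_max + 1):
        for ly in range(-l_max, l_max + 1):
            for lz in range(-l_max, l_max + 1):
                counts[lx**2 + ly**2 + lz**2] += 1
    return counts

g = degeneracies(2)
# the sum 1 is reached by (+-1, 0, 0), (0, +-1, 0), (0, 0, +-1):
# six distinct states, all with the same energy
print(g[0], g[1], g[2], g[3])  # 1 6 12 8
```

One energy, six states at lx² + ly² + lz² = 1: exactly the degeneracy the paragraph above points out.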

The perfect gas is a large number of particles in the same box, each of them independently obeying Eqs. (1.1.1) to (1.1.4). The particles occupy no volume, have no internal motions, such as vibration or rotation, and, for the time being, no spin. What makes the gas perfect is that the states and energies of each particle are unaffected by the presence of the other particles, so that there are no potential energies in Eq. (1.1.1). In other words, the particles are noninteracting. However, the perfect gas, as we shall use it, really requires us to make an additional, contradictory assumption: we shall assume that the particles can exchange energy with one another, even though they do not interact. We can, if we wish, imagine that the walls somehow help to mediate this exchange, but the mechanism actually does not matter much as long as the questions we ask concern the possible states of the many-particle system, not how the system contrives to get from one state to another.

From the point of view of quantum mechanics, there are no mysteries left in the system under consideration; the problem of the possible states of the system is completely solved (although some details are left to add on later). Yet we are not prepared to answer the kind of questions that one wishes to ask about a gas, such as: If it is held at a certain temperature, what will its pressure be? The relationship between these quantities is called the equation of state. To answer such a question—in fact, to understand the relation between temperature and pressure on the one hand and our quantum mechanical solution on the other—we must bring to bear the whole apparatus of statistical mechanics and thermodynamics.

This we shall do and, in the course of so doing, try to develop some understanding of entropy, irreversibility, and equilibrium. Let us outline the general ideas briefly in this section, then return for a more detailed treatment.

Suppose that we take our box and put into it a particular number of perfect gas particles, say 10²³ of them. We can also specify the total energy of all the particles or at least imagine that the box is physically isolated, so that there is some definite energy; and if the energy is caused to change, we can keep track of the changes that occur. Now, there are many ways for the particles to divide up the available energy among themselves—that is, many possible choices of lx, ly, and lz for each particle such that the total energy comes out right. We have already seen that even a single particle generally has a number of possible states of the same energy; with 10²³ particles, the number of possible quantum states of the set of particles that add up to the same energy can become astronomical. How does the system decide which of these states to choose?

The answer depends, in general, on details that we have not yet specified: What is the past history—that is, how was the energy injected into the box? And how does the system change from one state to another—that is, what is the nature of the interactions? Without knowing these details, there is no way, even in principle, to answer the question.

At this point we make two suppositions that form the basis of statistical mechanics.

1. If we wait long enough, the initial conditions become irrelevant. This means that whatever the mechanism for changing state, however the particles are able to redistribute energy and momentum among themselves, all memory of how the system started out must eventually get washed away by the multiplicity of possible events. When a system reaches this condition, it is said to be in equilibrium.

2. For a system in equilibrium, all possible quantum states are equally likely. This second statement sounds like the absence of an assumption—we do not assume that any particular kind of state is in any way preferred. It means, however, that a state in which all the particles have roughly the same energy has exactly the same probability as one in which most of the particles are nearly dead, and one particle goes buzzing madly about with most of the energy of the whole system. Would we not be better off assuming some more reasonable kind of behavior?

The fact is that our assumptions do lead to sensible behavior. The reason is that although the individual states are equally likely, the number of states with energy more or less fairly shared out among the particles is enormous compared to the number in which a single particle takes nearly all the energy. The probability of finding approximately a given situation in the box is proportional to the number of states that approximate that situation.
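The disproportion is easy to exhibit in a toy count (an assumption of mine, not the gas of the text: N particles sharing q indistinguishable energy quanta, whose distributions are counted by the stars-and-bars formula).

```python
# Compare all ways of distributing q quanta over N particles with the
# few distributions that pile every quantum onto a single particle.
from math import comb

N, q = 20, 100
total = comb(q + N - 1, N - 1)   # stars and bars: all distributions
one_hog = N                      # one particle holds everything: N choices
print(total, total // one_hog)   # the ratio is astronomical
```

Even at N = 20 the shared-out states outnumber the fully concentrated ones by a factor of roughly 10²¹; at 10²³ particles the disproportion is beyond astronomical.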

The two assumptions we have made should seem sensible; in fact, we have apparently assumed as little as we possibly can. Yet they will allow us to bridge the gap between the quantum mechanical solutions that give the physically possible microscopic states of the system and the thermodynamic questions we wish to ask about it. We shall have to learn some new language, and especially learn how to distinguish and count quantum states of many-particle systems, but no further fundamental assumptions will be necessary.

Let us defer for the moment the difficult problem of how to count possible states and pretend instead that we have already done so. We have N particles in a box of volume V = L3, with total energy E, and find that there are Γ possible states of the system. The entropy of the system, S, is then defined by

S = k log Γ (1.1.5)

where k is Boltzmann's constant

k = 1.38 × 10⁻¹⁶ erg per degree Kelvin

Thus, if we know Γ, we know S, and Γ is known in principle if we know N, V, and E, and know in addition that the system is in equilibrium. It follows that, in equilibrium, S may be thought of as a definite function of E, N, and V,

S = S(E, N, V)

Furthermore, since S is just a way of expressing the number of choices the system has, it should be evident that S will always increase if we increase E, keeping N and V constant; given more energy, the system will always have more ways to divide it. Being thus monotonic, the function can be inverted

E = E(S, N, V)

or if changes occur,

dE = T dS - P dV + μ dN (1.1.6)
The coefficients of dS, dV, and dN in Eq. (1.1.6) play special roles in thermodynamics. They are, respectively, the temperature

T = (∂E/∂S)N,V (1.1.7)

the negative of the pressure

-P = (∂E/∂V)S,N (1.1.8)

and the chemical potential

μ = (∂E/∂N)S,V (1.1.9)

These are merely formal definitions. What we have now to argue is that, for example, the quantity T in Eq. (1.1.7) behaves the way a temperature ought to behave.

How do we expect a temperature to behave? There are two requirements. One is merely a question of units, and we have already taken care of that by giving the constant k a numerical value; T will come out in degrees Kelvin. The other, more fundamental point is its role in determining whether two systems are in equilibrium with each other. In order to predict whether anything will happen if we put two systems in contact (barring deformation, chemical reactions, etc.), we need only know their temperatures. If their temperatures are equal, contact is superfluous; nothing will happen. If we separate them again, we will find that each has the same energy it started with.

Let us see if T defined in Eq. (1.1.7) performs in this way. We start with two systems of perfect gas, each with some E, N, V, each internally in equilibrium, so that it has an S and a T; use subscripts 1 and 2 for the two boxes. We establish thermal contact between the two in such a way that the N's and V's remain fixed, but energy is free to flow between the boxes. The question we ask is: When contact is broken, will we find that each box has the same energy it started with?

During the time that the two systems are in contact, the combined system fluctuates about among all the states that are allowed by the physical circumstances. We might imagine that at the instant in which contact is broken, the combined system is in some particular quantum state that involves some definite energy in box 1 and the rest in box 2; when we investigate later, these are the energies we will find. The job, then, is to predict the quantum state of the combined system at the instant contact is broken, but that, of course, is impossible. Our fundamental postulate is simply that all states are equally likely at any instant, so that we have no basis at all for predicting the state.

The precise quantum state is obviously more than we need to know in any case—it is the distribution of energy between the two boxes that we are interested in. That factor is also impossible to predict exactly, but we can make progress if we become a bit less particular and ask instead: About how much energy is each box likely to have? Obviously, the larger the number of states of the combined system that leave approximately a certain energy in each box, the more likely it is that we will catch the boxes with those energies.

When the boxes are separate, either before or after contact, the total number of available choices of the combined system is

Γt = Γ1Γ2 (1.1.10)

It follows from Eq. (1.1.5) that the total entropy of the system is

St = S1 + S2 (1.1.11)

Now suppose that, while contact exists, energy flows from box 1 to box 2. This flow has the effect of decreasing Γ1 and increasing Γ2. By our argument, we are likely to find that it has occurred if the net result is to have increased the total number of available states Γ1Γ2, or, equivalently, the sum S1 + S2. Obviously, the condition that no net energy flow be the most likely circumstance is just that the energy had already been distributed in such a way that Γ1Γ2, or S1 + S2, was a maximum. In this case, we are more likely to find the energy in each box approximately unchanged than to find that energy flowed in either direction.

The differences between conditions in the boxes before contact is established and after it is broken are given by

δE1 = T1 δS1 (1.1.12)

δE2 = T2 δS2 (1.1.13)

from Eqs. (1.1.6) to (1.1.9), with the N's and V's fixed, and

δ(E1 + E2) = 0 (1.1.14)

since energy is conserved overall. Thus,

T1 δS1 + T2 δS2 = 0 (1.1.15)

If S1 + S2 was already a maximum, then, for whatever small changes do take place, the total will be stationary,

δS1 + δS2 = 0 (1.1.16)

so that Eq. (1.1.15) reduces to

T1 = T2 (1.1.17)

which is the desired result.
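The argument can be checked numerically in a toy version (my own sketch; Einstein-solid multiplicities Γ(N, q) = (q+N−1)!/[q!(N−1)!] stand in for the not-yet-counted perfect-gas states). With the N's and the total energy fixed, S1 + S2 peaks at the split where the quanta per particle, and hence the temperature, match on the two sides.

```python
# Two boxes in thermal contact share q_total quanta; locate the split
# of energy that maximizes the total entropy S1 + S2 = k log(Γ1 Γ2).
from math import comb, log

N1, N2, q_total = 300, 200, 1000

def Omega(N, q):
    """Multiplicity: number of ways to give q quanta to N particles."""
    return comb(q + N - 1, N - 1)

S_tot = [log(Omega(N1, q1)) + log(Omega(N2, q_total - q1))
         for q1 in range(q_total + 1)]          # entropy in units of k
q1_star = max(range(q_total + 1), key=S_tot.__getitem__)

# the maximum sits where q1/N1 = q2/N2, i.e. q1 near 600 of the 1000
print(q1_star, q_total - q1_star)
```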

It is easy to show by analogous arguments that if the individual volumes are free to change, the pressures must be equal in equilibrium, and that if the boxes can exchange particles, the chemical potentials must be equal. We shall, however, defer formal proof of these statements to Secs. 1.2f and 1.2g, respectively.

We are now in a position to sketch a possible procedure for answering the prototype question suggested earlier: At a given temperature, what will the pressure be? Given the quantities E, N, V for a box of perfect gas, we count the possible states to compute S. Knowing E(S, V, N), we can then find

T(S, V, N) = (∂E/∂S)V,N (1.1.18)

-P(S, V, N) = (∂E/∂V)S,N (1.1.19)

and finally arrive at P(T, V, N) by eliminating S between Eqs. (1.1.18) and (1.1.19). That is not the procedure we shall actually follow—there will be more convenient ways of doing the problem—but the very argument that that procedure could, in principle, be followed itself plays an important role. It is really the logical underpinning of everything we shall do in this chapter. For example, Eqs. (1.1.6) to (1.1.9) may be written together:

dE = T dS - P dV + μ dN (1.1.20)

Our arguments have told us that not only is this equation valid, and the meanings of the quantities in it, but also that it is integrable; that is, there exists a function E(S, V, N) for a system in equilibrium. All of equilibrium thermodynamics is an elaboration of the consequences of those statements.
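The elimination procedure can even be walked through numerically if one borrows a result from later in the book. The sketch below assumes the Sackur–Tetrode entropy of the monatomic perfect gas (taken on faith here and inverted to give E(S, V, N)); the constants are SI values rather than the erg value quoted above. Differentiating as in Eqs. (1.1.18) and (1.1.19) and eliminating S recovers the familiar equation of state PV = NkT.

```python
# E(S, V, N) for a monatomic perfect gas (inverted Sackur-Tetrode form,
# assumed here); T and -P follow by numerical differentiation, and
# eliminating S between them leaves PV = NkT.
from math import exp, pi

k = 1.380649e-23    # Boltzmann's constant, J/K
h = 6.62607015e-34  # Planck's constant, J s
m = 6.6e-27         # particle mass, kg (roughly a helium atom)

def E(S, V, N):
    return (3 * h**2 * N / (4 * pi * m)) * (N / V)**(2 / 3) \
        * exp(2 * S / (3 * N * k) - 5 / 3)

N, V = 1e22, 1e-3            # particle number and volume (m^3)
S = 1.5 * N * k              # some definite entropy, in J/K
dS, dV = S * 1e-6, V * 1e-6  # steps for central differences

T = (E(S + dS, V, N) - E(S - dS, V, N)) / (2 * dS)    # Eq. (1.1.18)
P = -(E(S, V + dV, N) - E(S, V - dV, N)) / (2 * dV)   # Eq. (1.1.19)

print(P * V / (N * k * T))  # ~1.0: the perfect gas law, with S eliminated
```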

In the course of this discussion we have ignored a number of fundamental questions. For example, let us return to the arguments that led to Eq. (1.1.7). As we can see from the argument, even if the temperatures were equal, contact was not at all superfluous. It had an important effect: we lost track of the exact amount of energy in each box. This realization raises two important problems for us. The first is the question: How badly have we lost track of the energy? In other words, how much uncertainty has been introduced? The second is that whatever previous operations put the original amounts of energy into the two boxes, they must have been subject to the same kinds of uncertainties: we never actually knew exactly how much energy was in the boxes to begin with. How does that affect our earlier arguments? Stated differently: Can we reapply our arguments to the box now that we have lost track of its exact energy?

The answer to the first question is basically that the uncertainties introduced into the energies are negligibly, even absurdly, small. This is a quantitative effect, which arises from the large numbers of particles found in macroscopic systems, and is generally true only if the system is macroscopic. We cannot yet prove this fact, since we have not yet learned how to count states, but we shall return to this point later and compute how big the uncertainties (in the energy and other thermodynamic quantities as well) actually are when we discuss thermodynamic fluctuations in Sec. 1.3f. It turns out, however, that in our example, if the temperatures in the two boxes were equal to start with, the number of possible states with energies very close to the original distribution is not only larger than any other possibility, it is also vastly greater than all other possibilities combined. Consequently, the probability of catching the combined system in any other kind of state is very nearly zero. It is due to this remarkable fact that statistical mechanics works.
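How sharply the probability concentrates can be previewed with a toy calculation (again an assumption of mine: two equal boxes with Einstein-solid multiplicities in place of the gas). Weighting each split of the quanta by Γ1Γ2 gives a distribution of energy whose relative width shrinks roughly as 1/√N — already visible at modest sizes, and fantastically small by 10²³ particles.

```python
# Relative width (std/mean) of the energy in box 1 when two equal boxes
# share q quanta, each split weighted by the multiplicity product.
from math import comb, exp, log

def relative_width(N, q):
    logw = [log(comb(q1 + N - 1, N - 1)) + log(comb(q - q1 + N - 1, N - 1))
            for q1 in range(q + 1)]
    top = max(logw)
    w = [exp(x - top) for x in logw]   # rescale to avoid overflow
    Z = sum(w)
    mean = sum(q1 * wi for q1, wi in enumerate(w)) / Z
    var = sum((q1 - mean) ** 2 * wi for q1, wi in enumerate(w)) / Z
    return var ** 0.5 / mean

# a tenfold larger system is ~sqrt(10), about 3, times sharper
print(relative_width(50, 100), relative_width(500, 1000))
```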



Table of Contents

1. Thermodynamics and Statistical Mechanics
1.1 Introduction: Thermodynamics and Statistical Mechanics of the Perfect Gas
1.2 Thermodynamics
1.3 Statistical Mechanics
1.4 Remarks on the Foundations of Statistical Mechanics
2. Perfect Gases
2.1 Introduction
2.2 The Ideal Gas
2.3 Bose-Einstein and Fermi-Dirac Statistics
2.4 Slightly Degenerate Perfect Gases
2.5 The Very Degenerate Fermi Gas: Electrons in Metals
2.6 The Bose Condensation: A First Order Phase Transition
3. Solids
3.1 Introduction
3.2 The Heat Capacity Dilemma
3.3 Normal Modes
3.4 Crystal Structures
3.5 Crystal Space: Phonons, Photons, and the Reciprocal Lattice
3.6 Electrons in Crystals
4. Liquids and Interacting Gases
4.1 Introduction
4.2 The Structure of a Fluid
4.3 The Potential Energy
4.4 Interacting Gases
4.5 Liquids
5. Some Special States
5.1 Introduction
5.2 Superfluidity
5.3 Superconductivity
5.4 Magnetism
6. Critical Phenomena and Phase Transitions
6.1 Introduction
6.2 Weiss Molecular Field Theory
6.3 The van der Waals Equation of State
6.4 Analogies Between Phase Transitions
6.5 The Generalized Theory
6.6 Critical Point Exponents
6.7 Scaling Laws: Physics Discovers Dimensional Analysis
6.8 Evaluation and Summary: Testing the Scaling Laws
