"This book is that rare thing: an edited volume that will be a lasting contribution to the literature."
— Ray Streater, King's College
Andreas Greven and Gerhard Keller, Universität Erlangen
Gerald Warnecke, Otto-von-Guericke University Magdeburg
The concept of entropy arose in the physical sciences during the 19th century, in particular in thermodynamics and in the development of statistical physics. From the very beginning the objective was to describe the equilibria and the evolution of thermodynamic systems. The development of the theory followed two conceptually rather different lines of thought. Nevertheless, they are symbiotically related, in particular through the work of Boltzmann.
The historically older line adopted a macroscopic point of view on the systems of interest. They are described in terms of a relatively small number of real variables, namely, temperature, pressure, specific volume, mass density, etc., whose values determine the macroscopic properties of a system in thermodynamic equilibrium. Clausius, building on the earlier intuition of Carnot, introduced for the first time in 1865 a mathematical quantity, S, which he called entropy. It describes the heat exchanges that occur in thermal processes via the relation
dS = dQ/T.
Here Q denotes the amount of heat and T is the absolute temperature at which the exchange takes place. Among the scientists who developed this idea further are some of the most influential physicists of that time, most notably Gibbs and Planck, as well as the mathematician Carathéodory.
The other conceptual line started from a microscopic point of view on nature, where macroscopic phenomena are derived from microscopic dynamics: any macrostate is represented by many different microstates, i.e. different configurations of molecular motion. Also, different macrostates can be realized by largely differing numbers of corresponding microstates. Equilibria are those macrostates which are most likely to appear, i.e. they have the largest number of corresponding microstates (if one talks about the simplest case of unconstrained equilibria). In particular, two names are associated with this idea: Boltzmann and Maxwell. Boltzmann argued that the Clausius entropy S associated with a system in equilibrium is proportional to the logarithm of the number W of microstates which form the macrostate of this equilibrium,
S = k ln W.
The formula is engraved on his tombstone in Vienna. Here the idea in the background, even though not explicitly formulated, is a relation between the encoded information specifying a microstate and the complexity of the system.
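Boltzmann's formula can be made concrete with a toy model. The sketch below (all numbers illustrative, with the Boltzmann constant set to k = 1) takes N coins, treats "n heads" as the macrostate, counts its W = C(N, n) compatible microstates, and confirms that the balanced macrostate, realized by the most microstates, is the equilibrium in the sense described above.

```python
from math import comb, log

def boltzmann_entropy(N, n, k=1.0):
    """S = k ln W for the macrostate 'n heads out of N coins',
    where W = C(N, n) counts the compatible microstates."""
    W = comb(N, n)
    return k * log(W)

N = 100
entropies = {n: boltzmann_entropy(N, n) for n in range(N + 1)}

# The equilibrium macrostate is the one realized by the most microstates,
# i.e. the one of maximal entropy:
n_star = max(entropies, key=entropies.get)
assert n_star == N // 2  # the balanced macrostate dominates
```

The same counting argument, applied to molecular configurations instead of coins, is what links the number W on Boltzmann's tombstone to the Clausius entropy S of a gas in equilibrium.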
Since then both of these approaches to entropy have led to deep insights into the nature of thermodynamic and other microscopically unpredictable processes. In the 20th century these lines of thought had a considerable impact on a number of fields of mathematics, and this development still continues. Among the topics where these ideas were most fruitful are stochastic processes and random fields, information and coding, data analysis and statistical inference, dynamical systems and ergodic theory, as well as partial differential equations and rational mechanics. However, the mathematical tools employed were very diverse, and they developed somewhat independently of their initial physical background.

After decades of specialization and diversification, researchers involved in one of the areas of research mentioned above are often unprepared to understand the work of the others. Nowadays it takes a considerable effort to learn and appreciate the techniques of a complementary approach, and the structure of this book reflects both this specialization and the current state of the whole field. On the other hand, it was our intention with the symposium and this book to help bridge the gaps between the various specialized fields of research dealing with the concept of entropy. It was our goal to identify the unifying threads by inviting outstanding representatives of these research areas to give surveys for a general audience. The reader will find examples of most of the topics mentioned above in this book. To a large extent, the idea of entropy has developed in these areas in the context of quite different mathematical techniques, as the reader will see, but on the other hand there are common themes which we hope to exhibit in this book and which surfaced in the discussions taking place during the meeting.
We found two major methods in which entropy plays a crucial role. These are variational principles and Lyapunov functionals. They provide common tools to the different research areas.
Variational principles come in many guises. But they always take the following form, which we discuss for the case of the Gibbs variational principle. There, the sum of an energy functional and the entropy functional are minimized or maximized, depending on a choice of sign (see, for example, Section 2.3.1 on p. 29 and Section 3.6 on p. 50). Along the macroscopic line of thought such a variational problem appears as a first principle. It is clear that such a principle is crucial in thermodynamics and statistical physics. However, the reader will learn that it also plays a key role in the description of dynamical systems generated by differentiable maps, for populations evolving in a random medium, for the transport of particles and even in data analysis. As we proceed in the description of the various contributions we will show in more detail how this variational principle plays its role as an important tool.
For the microscopic line of thought, take the situation of a system where a macrostate corresponds to many microstates. In the Gibbs variational principle, one term reflects the number of possibilities to realize a state or path of a dynamical system or of a stochastic process under a macroscopic constraint such as, for example, a fixed energy. Then for a gain, e.g. in energy, one may have to 'pay' by a reduction in the number of ways in which microstates can achieve this gain. This results in a competition between energy and entropy that is reflected in the variational principle. Bearing this in mind it is not surprising that in many contributions to this book, a major role is played by information or information gain. There are two sides to the coin: one side is the description of the complexity of the problem to specify a microstate satisfying a macroscopic constraint; the other is the problem of coding this information. It is therefore also plausible that there are deeper connections between problems which at the surface look quite different, e.g. the description of orbits of a map on the one hand and data analysis on the other. We hope that these seemingly mysterious relations become clearer by reading through this book.
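The energy-entropy competition described above can be illustrated numerically. In the sketch below (a toy example; the three energy levels and the temperature are arbitrary choices, not from the text) the Gibbs free energy F(p) = Σ p_i E_i − T S(p) is evaluated on distributions over finitely many states, and the Gibbs distribution p_i ∝ exp(−E_i/T) is checked to be its minimizer, as the variational principle asserts.

```python
import math
import random

def free_energy(p, E, T):
    """F(p) = sum_i p_i E_i - T * S(p), with Shannon entropy
    S(p) = -sum_i p_i ln p_i playing the role of the entropy term."""
    energy = sum(pi * Ei for pi, Ei in zip(p, E))
    entropy = -sum(pi * math.log(pi) for pi in p if pi > 0)
    return energy - T * entropy

def gibbs(E, T):
    """The distribution p_i proportional to exp(-E_i / T) that the
    variational principle singles out."""
    w = [math.exp(-Ei / T) for Ei in E]
    Z = sum(w)  # partition function
    return [wi / Z for wi in w]

E, T = [0.0, 1.0, 3.0], 0.7  # illustrative energy levels and temperature
p_star = gibbs(E, T)
F_star = free_energy(p_star, E, T)

# Any other distribution on the same states has larger free energy:
random.seed(0)
for _ in range(1000):
    q = [random.random() for _ in E]
    s = sum(q)
    q = [qi / s for qi in q]
    assert free_energy(q, E, T) >= F_star - 1e-12
```

Lowering a state's energy increases its Gibbs weight, but concentrating all probability on the lowest state costs entropy; the minimizer balances the two terms, which is precisely the competition described above.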
Another such common theme in many areas touched on in this book is the close connection of entropy to stability properties of dynamical processes. The entropy may, for instance, be interpreted as a Lyapunov function. Originally introduced for systems of ordinary differential equations, Lyapunov functions are an important tool in proving asymptotic stability and other stability properties for dynamical systems. These are functions that are increasing or decreasing with the dynamics of a system, i.e. the trajectories of solutions to the system cut the level sets of the function. This is exactly the type of behavior that the entropy exhibits due to the second law of thermodynamics. It is also exhibited in the axiomatic framework of Lieb-Yngvason when considering sequences of adiabatically accessible states, especially if the sequence of transitions is irreversible. For time-dependent partial differential equations in the framework of evolution equations, i.e. considering them as ordinary differential equations on infinite-dimensional spaces, such as Sobolev spaces, one has the analogous tool of Lyapunov functionals. Basically, in all these cases asymptotic stability tells us in which state (or states), in the long run, a dynamical system ends up.
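A standard concrete instance of entropy acting as a Lyapunov function, sketched below with an arbitrarily chosen two-state chain (the transition matrix and starting distribution are illustrative, not from the text), is the relative entropy of a Markov chain with respect to its stationary distribution: it can only decrease along the dynamics, so its level sets are crossed in one direction, exactly the behavior described above.

```python
import math

# Illustrative row-stochastic transition matrix of a two-state Markov chain.
P = [[0.9, 0.1],
     [0.2, 0.8]]

def step(p, P):
    """One step of the dynamics: p_{t+1} = p_t P."""
    n = len(p)
    return [sum(p[i] * P[i][j] for i in range(n)) for j in range(n)]

def rel_entropy(p, q):
    """Relative entropy (Kullback-Leibler divergence) D(p || q)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Stationary distribution of P, solving pi P = pi:
pi = [2/3, 1/3]

p = [0.99, 0.01]  # start far from equilibrium
divs = []
for _ in range(20):
    divs.append(rel_entropy(p, pi))
    p = step(p, P)

# D(p_t || pi) is nonincreasing along the dynamics: a Lyapunov function
# whose decrease drives the chain toward its equilibrium.
assert all(a >= b - 1e-12 for a, b in zip(divs, divs[1:]))
```

The monotone decay of D(p_t || π) toward zero is the probabilistic shadow of the second law: the dynamics ends up, in the long run, in the stationary state.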
There are other theoretical implications concerning the stability of a system as well, which come up in the chapters by Dafermos and Young. For instance, on the one hand, Sections 6.3 and 6.4 on pp. 110 and 113 of the chapter by Dafermos are devoted to illuminating the implications of entropy inequalities for stability properties of weak, i.e. discontinuous, solutions to hyperbolic conservation laws. Similarly, entropy inequalities are also used as a tool for systems of parabolic differential equations. On the other hand this might, at first sight, contrast with the role played by entropy in the chapter by Young, where it is connected to the instability of dynamical systems with chaotic behavior. However, instability of individual trajectories is a feature of the microscopic level which is responsible for the macroscopic stability of the actually observed invariant measures. Keeping in mind that the ergodic theory of dynamical systems is an equilibrium theory that cannot describe the dynamics of transitions between invariant measures, the role of entropy in variational principles, which single out the actually observed invariant measure among all others, comes as close as possible to the role of a Lyapunov functional.
1.1 Outline of the Book
Now we outline and comment on the contents of this book. In the first part, basic concepts, terminology and examples from both the macroscopic and the microscopic line of thought are introduced in a way that should provide just the right amount of detail to prepare the interested reader for further reading. The second part comprises five contributions tracing various lines of evolution that have emerged from the macroscopic point of view. The third part collects five contributions from the 'probabilistic branch,' and in a final part we offer the reader four contributions that illustrate how entropic ideas have penetrated coding theory and the theory of dynamical systems.
Due to limited time at the symposium, a number of further important topics related to entropy could not be dealt with appropriately, and as a consequence there are a number of deplorable omissions in this book, in particular regarding the role of entropy in statistics, in information theory, in the ergodic theory of amenable group actions, and in numerical methods for fluid dynamics.
Part 1 The concept of entropy emerging from the macroscopic point of view is emphasized by Ingo Müller's first contribution (p. 19). Starting from Clausius' point of view, Boltzmann's statistical ideas are added into the mix and the role of entropy as a governing quantity behind a number of simple real world phenomena is discussed: the elastic properties of rubber bands as an illustration for configurational entropy, the different atmospheres of the planets in our Solar System as a result of the 'competition between entropy and energy,' and the role of entropy in the synthesis of ammonia are just a few examples.
The approach to entropy in statistical physics developed its full strength in the framework of probability theory. It had a large impact on information theory, stochastic processes, statistical physics and dynamical systems (both classical and quantum). Concepts, terminology and fundamental results which are basic for all these branches are provided by Hans-Otto Georgii (p. 37), the second introductory chapter of this book.
Part 2 In the second part we have collected expositions on the macroscopic approach. There are two chapters on thermodynamics in the context of continuum mechanics. The first one, by Hutter and Wang, reviews alternative approaches to nonequilibrium thermodynamics, while in the second Müller surveys one of these approaches more extensively: the theory of extended thermodynamics, which he developed with various collaborators. Next, in the chapter by Dafermos on conservation laws, we pursue the bridge from continuum mechanics to differential equations further. Finally, we have two chapters that reflect the views of physicists on the fundamental issues concerning the understanding of entropy and irreversibility. Surprisingly, it turns out that there has not been until now one unified, universally accepted theory on the thermodynamics of irreversible processes. There are competing theories and concepts with a certain overlap and some fundamental differences. This diversity is a bit confusing, but bringing more clarity and unity to this field is also a challenging research goal. The contributions we have collected here are attempting to expose this diversity to a certain extent and clearly point out some of the conceptual differences that need to be resolved.
Interestingly, the uncertainty about some of the basic physical concepts involved is reflected by the fact that the state of the mathematical theory of the equations that follow from these theories is also far from satisfactory. Numerical methods play an increasingly important role in the effort to sort this out. This is because numerical computations allow us to explore more deeply the consequences of mathematically complicated theories, i.e. the large systems of nonlinear differential equations they lead to. They may allow us to make quantitative predictions and comparisons with physical reality. Entropy also plays a key role in numerical analysis and numerical computations, an important topic that is missing here. However, the analytical tools are actually discussed in the chapter by Dafermos. The link between analytical theory and numerical approximations is particularly close in this field since key existence results were actually obtained via convergence proofs for numerical schemes. In numerical computations it is important to verify that the method being used is consistent with an entropy condition, since otherwise unphysical shock discontinuities may be approximated. It is also useful to monitor the entropy production in computations, because false entropy production is an indicator of an error in the scheme or the implementation.
Let us now turn to the contributions. The first chapter in this part is a survey by Kolumban Hutter and Yongqi Wang (p. 57) on some theories of irreversible thermodynamics. It highlights two different forms of the second law of thermodynamics, namely, the Coleman-Noll approach to the Clausius-Duhem inequality and the Müller-Liu entropy principle, in a detailed comparison. The structure of these descriptions of thermodynamics is exposed in very concise form. The interesting lesson for the nonexpert is that there are different theories available that have much common ground but marked differences as well, a theme we will encounter again in Uffink's chapter. The second approach that Hutter and Wang discuss is then exposed much more extensively in the next chapter, by Ingo Müller (p. 79). The reader is given an introduction to the theory of extended thermodynamics, which provides a macroscopic theory for irreversible thermodynamic processes. It builds a vital link to the microscopic approach of kinetic gas theory formulated in terms of the Boltzmann equation and the density function that solves it. From these one can rigorously derive an infinite sequence of balance equations involving higher and higher moments of the density function. In order to have a finite number of equations, these are truncated and certain closure laws have to be derived. This approach leads to symmetric hyperbolic systems of partial differential equations that are automatically endowed with an entropy function. The entropy production plays an important role in deriving closure laws for these systems. The reader who wishes to learn more should consult the book on extended thermodynamics that Müller and Ruggeri recently published in updated form.
Excerpted from Entropy Copyright © 2003 by Princeton University Press. Excerpted by permission.
"This book is that rare thing: an edited volume that will be a lasting contribution to the literature."—Ray Streater, King's College