Overview
"This book is that rare thing: an edited volume that will be a lasting contribution to the literature." — Ray Streater, King's College
Read an Excerpt
Entropy
Princeton University Press
Copyright © 2003 Princeton University Press. All rights reserved.
ISBN: 9780691113388
Chapter One
Introduction

Andreas Greven, Gerhard Keller, Universität Erlangen
Gerald Warnecke, Otto-von-Guericke University Magdeburg
The concept of entropy arose in the physical sciences during the 19th century, in particular in thermodynamics and in the development of statistical physics. From the very beginning the objective was to describe the equilibria and the evolution of thermodynamic systems. The development of the theory followed two conceptually rather different lines of thought. Nevertheless, they are symbiotically related, in particular through the work of Boltzmann.
The historically older line adopted a macroscopic point of view on the systems of interest. They are described in terms of a relatively small number of real variables, namely, temperature, pressure, specific volume, mass density, etc., whose values determine the macroscopic properties of a system in thermodynamic equilibrium. Clausius, building on the previous intuition of Carnot, introduced for the first time in 1867 a mathematical quantity, S, which he called entropy. It describes the heat exchanges that occur in thermal processes via the relation
dS = dQ/T.
Here Q denotes the amount of heat and T is the absolute temperature at which the exchange takes place. Among the scientists who developed this idea further are some of the most influential physicists of that time, most notably Gibbs and Planck, as well as the mathematician Carathéodory.
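As a small worked illustration (our own, not from the text): for a process at constant temperature, the relation dS = dQ/T integrates to ΔS = Q/T. A minimal Python sketch, using the standard reference value for the latent heat of fusion of water:

```python
# Illustrative sketch: entropy change for an isothermal process.
# At constant temperature T, dS = dQ/T integrates to Delta S = Q / T.
# The latent-heat value below is a standard reference number, not from the text.

def entropy_change_isothermal(q_joules: float, t_kelvin: float) -> float:
    """Return Delta S = Q / T for heat Q exchanged at fixed temperature T."""
    return q_joules / t_kelvin

# Melting 1 kg of ice at 273.15 K absorbs Q = m * L_f with L_f ~ 334 kJ/kg.
q = 1.0 * 334e3   # heat absorbed, in joules
t = 273.15        # melting temperature, in kelvin
delta_s = entropy_change_isothermal(q, t)
print(f"Delta S = {delta_s:.1f} J/K")  # about 1222.8 J/K
```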
The other conceptual line started from a microscopic point of view on nature, where macroscopic phenomena are derived from microscopic dynamics: any macrostate is represented by many different microstates, i.e. different configurations of molecular motion. Also, different macrostates can be realized by largely differing numbers of corresponding microstates. Equilibria are those macrostates which are most likely to appear, i.e. they have the largest number of corresponding microstates (if one talks about the simplest case of unconstrained equilibria). In particular, two names are associated with this idea: Boltzmann and Maxwell. Boltzmann argued that the Clausius entropy S associated with a system in equilibrium is proportional to the logarithm of the number W of microstates which form the macrostate of this equilibrium,
S = k ln W.
The formula is engraved on his tombstone in Vienna. Here the idea in the background, even though not explicitly formulated, is a relation between the encoded information specifying a microstate and the complexity of the system.
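The counting behind Boltzmann's formula can be made concrete with a toy model of our own (not from the text): for N independent two-state particles, the macrostate "n particles in the up state" is realized by W = C(N, n) microstates, so S = k ln W, and the evenly split macrostate has the most microstates:

```python
import math

# Toy model (ours, not the authors'): N two-state particles.  The macrostate
# "n particles up" is realized by W = C(N, n) microstates, and S = k ln W.
# The evenly split macrostate maximizes W, hence the entropy.

K_B = 1.380649e-23  # Boltzmann constant in J/K

def boltzmann_entropy(n_up: int, n_total: int) -> float:
    w = math.comb(n_total, n_up)   # number of microstates for this macrostate
    return K_B * math.log(w)       # S = k ln W

n_total = 100
entropies = [boltzmann_entropy(n, n_total) for n in range(n_total + 1)]
n_max = max(range(n_total + 1), key=lambda n: entropies[n])
print(n_max)  # 50: the equilibrium macrostate has the most microstates
```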
Since then both of these approaches to entropy have led to deep insights into the nature of thermodynamic and other microscopically unpredictable processes. In the 20th century these lines of thought had a considerable impact on a number of fields of mathematics and this development still continues. Among the topics where these ideas were most fruitful are stochastic processes and random fields, information and coding, data analysis and statistical inference, dynamical systems and ergodic theory, as well as partial differential equations and rational mechanics. However, the mathematical tools employed were very diverse. They also developed somewhat independently from their initial physical background. After decades of specialization and diversification, researchers involved in one of the areas of research mentioned above are often unprepared to understand the work of the others. Nowadays, it takes a considerable effort to learn and appreciate the techniques of a complementary approach, and the structure of this book reflects this and the state of the whole field. On the other hand, it was our intention with the symposium and this book to help bridge the gaps between the various specialized fields of research dealing with the concept of entropy. It was our goal to identify the unifying threads by inviting outstanding representatives of these research areas to give surveys for a general audience. The reader will find examples of most of the topics mentioned above in this book. To a large extent, the idea of entropy has developed in these areas in the context of quite different mathematical techniques, as the reader will see, but on the other hand there are common themes which we hope to exhibit in this book and which surfaced in the discussions taking place during the meeting.
We found two major methods in which entropy plays a crucial role. These are variational principles and Lyapunov functionals. They provide common tools to the different research areas.
Variational principles come in many guises. But they always take the following form, which we discuss for the case of the Gibbs variational principle. There, the sum of an energy functional and the entropy functional is minimized or maximized, depending on a choice of sign (see, for example, Section 2.3.1 on p. 29 and Section 3.6 on p. 50). Along the macroscopic line of thought such a variational problem appears as a first principle. It is clear that such a principle is crucial in thermodynamics and statistical physics. However, the reader will learn that it also plays a key role in the description of dynamical systems generated by differentiable maps, for populations evolving in a random medium, for the transport of particles and even in data analysis. As we proceed in the description of the various contributions we will show in more detail how this variational principle plays its role as an important tool.
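The Gibbs variational principle can be sketched numerically in a toy setting of our own choosing (the energy levels and temperature are made up for illustration): over probability vectors p on finitely many energy levels E_i, the free energy F(p) = Σ p_i E_i + T Σ p_i ln p_i is minimized exactly by the Gibbs distribution p_i ∝ exp(-E_i/T).

```python
import math, random

# Minimal numerical sketch (our own toy example, not from the text) of the
# Gibbs variational principle: the free energy
#     F(p) = sum_i p_i E_i + T sum_i p_i ln p_i
# over probability vectors p is minimized by p_i proportional to exp(-E_i/T).

def free_energy(p, energies, temperature):
    return sum(pi * ei + temperature * pi * math.log(pi)
               for pi, ei in zip(p, energies) if pi > 0)

def gibbs(energies, temperature):
    weights = [math.exp(-e / temperature) for e in energies]
    z = sum(weights)                 # partition function
    return [w / z for w in weights]

energies = [0.0, 1.0, 2.0]           # made-up energy levels
temperature = 0.7                    # made-up temperature
p_star = gibbs(energies, temperature)
f_star = free_energy(p_star, energies, temperature)

# Random competing distributions never beat the Gibbs minimizer.
random.seed(0)
for _ in range(1000):
    raw = [random.random() for _ in energies]
    p = [r / sum(raw) for r in raw]
    assert free_energy(p, energies, temperature) >= f_star - 1e-12
print("Gibbs distribution minimizes the free energy")
```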
For the microscopic line of thought, take the situation of a system where a macrostate corresponds to many microstates. In the Gibbs variational principle, one term reflects the number of possibilities to realize a state or path of a dynamical system or of a stochastic process under a macroscopic constraint such as, for example, a fixed energy. Then for a gain, e.g. in energy, one may have to 'pay' by a reduction in the number of ways in which microstates can achieve this gain. This results in a competition between energy and entropy that is reflected in the variational principle. Bearing this in mind it is not surprising that in many contributions to this book, a major role is played by information or information gain. There are two sides to the coin: one side is the description of the complexity of the problem to specify a microstate satisfying a macroscopic constraint; the other is the problem of coding this information. It is therefore also plausible that there are deeper connections between problems which at the surface look quite different, e.g. the description of orbits of a map on the one hand and data analysis on the other. We hope that these seemingly mysterious relations become clearer by reading through this book.
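The coding side of this coin can be illustrated with a small, self-contained example of ours (not from the text): the Shannon entropy of a symbol source, in bits per symbol, lower-bounds the expected codeword length of any uniquely decodable binary code for that source.

```python
import math
from collections import Counter

# Small illustration (our example, not the authors'): the Shannon entropy of
# the empirical symbol distribution of a message, in bits per symbol.  It is
# the lower bound on the expected codeword length of any uniquely decodable
# binary code for that distribution.

def shannon_entropy_bits(text: str) -> float:
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

message = "aaaabbbccd"              # skewed frequencies compress well
h = shannon_entropy_bits(message)
print(f"entropy = {h:.3f} bits/symbol")  # below log2(4) = 2 bits for 4 symbols
```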
Another such common theme in many areas touched on in this book is the close connection of entropy to stability properties of dynamical processes. The entropy may, for instance, be interpreted as a Lyapunov function. Originally introduced for systems of ordinary differential equations, Lyapunov functions are an important tool in proving asymptotic stability and other stability properties for dynamical systems. These are functions that are increasing or decreasing with the dynamics of a system, i.e. the trajectories of solutions to the system cut the level sets of the function. This is exactly the type of behavior that the entropy exhibits due to the second law of thermodynamics. It is also exhibited in the axiomatic framework of Lieb-Yngvason when considering sequences of adiabatically accessible states, especially if the sequence of transitions is irreversible. For time-dependent partial differential equations in the framework of evolution equations, i.e. considering them as ordinary differential equations on infinite-dimensional spaces, such as Sobolev spaces, one has the analogous tool of Lyapunov functionals. Basically, in all these cases asymptotic stability tells us in which state (or states), in the long run, a dynamical system ends up.
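A minimal sketch of entropy acting as a Lyapunov function, in a setting of our own choosing (a finite Markov chain, not an example from the text): the relative entropy D(p_t || π) of the time-t distribution with respect to the stationary distribution π is non-increasing along the dynamics.

```python
import math

# Toy sketch (our own example): relative entropy as a Lyapunov function.
# For a Markov chain with stationary distribution pi, the relative entropy
# D(p_t || pi) is non-increasing in t (data-processing inequality).

def kl(p, q):
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def step(p, transition):
    n = len(p)
    return [sum(p[i] * transition[i][j] for i in range(n)) for j in range(n)]

# A doubly stochastic 2-state chain, so the stationary distribution is uniform.
transition = [[0.9, 0.1],
              [0.1, 0.9]]
pi_stat = [0.5, 0.5]

p = [1.0, 0.0]                      # start far from equilibrium
divergences = []
for _ in range(20):
    divergences.append(kl(p, pi_stat))
    p = step(p, transition)

# Monotone decay toward 0, as a Lyapunov function should show.
assert all(a >= b - 1e-12 for a, b in zip(divergences, divergences[1:]))
print(f"D drops from {divergences[0]:.3f} to {divergences[-1]:.6f}")
```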
There are other theoretical implications concerning the stability of a system as well, which come up in the chapters by Dafermos and Young. For instance, on the one hand, Sections 6.3 and 6.4 on pp. 110 and 113 of the chapter by Dafermos are devoted to illuminating the implications of entropy inequalities for stability properties of weak, i.e. discontinuous, solutions to hyperbolic conservation laws. Similarly, entropy inequalities are also used as a tool for systems of parabolic differential equations. On the other hand this might, at first sight, contrast with the role played by entropy in the chapter by Young, where it is connected to the instability of dynamical systems with chaotic behavior. However, instability of individual trajectories is a feature of the microscopic level which is responsible for the macroscopic stability of the actually observed invariant measures. Keeping in mind that the ergodic theory of dynamical systems is an equilibrium theory that cannot describe the dynamics of transitions between invariant measures, the role of entropy in variational principles, which single out the actually observed invariant measure among all other ones, comes as close as possible to the role of a Lyapunov functional.
1.1 Outline of the Book
Now we outline and comment on the contents of this book. In the first part, basic concepts, terminology and examples from both the macroscopic and the microscopic line of thought are introduced in a way that should provide just the right amount of detail to prepare the interested reader for further reading. The second part comprises five contributions tracing various lines of evolution that have emerged from the macroscopic point of view. The third part collects five contributions from the 'probabilistic branch,' and in a final part we offer the reader four contributions that illustrate how entropic ideas have penetrated coding theory and the theory of dynamical systems.
Due to limited time at the symposium, a number of further important topics related to entropy could not be dealt with appropriately, and as a consequence there are a number of deplorable omissions in this book, in particular regarding the role of entropy in statistics, in information theory, in the ergodic theory of amenable group actions, and in numerical methods for fluid dynamics.
Part 1 The concept of entropy emerging from the macroscopic point of view is emphasized by Ingo Müller's first contribution (p. 19). Starting from Clausius' point of view, Boltzmann's statistical ideas are added into the mix and the role of entropy as a governing quantity behind a number of simple real world phenomena is discussed: the elastic properties of rubber bands as an illustration for configurational entropy, the different atmospheres of the planets in our Solar System as a result of the 'competition between entropy and energy,' and the role of entropy in the synthesis of ammonia are just a few examples.
The approach to entropy in statistical physics developed its full strength in the framework of probability theory. It had a large impact on information theory, stochastic processes, statistical physics and dynamical systems (both classical and quantum). Concepts, terminology and fundamental results which are basic for all these branches are provided by Hans-Otto Georgii (p. 37) in the second introductory chapter of this book.
Part 2 In the second part we have collected expositions on the macroscopic approach. There are two chapters on thermodynamics in the context of continuum mechanics. The first one, by Hutter and Wang, reviews alternative approaches to nonequilibrium thermodynamics, while in the second Müller surveys one of these approaches more extensively, the theory of extended thermodynamics, which he developed with various collaborators. Next, in the chapter by Dafermos on conservation laws, we pursue the bridge from continuum mechanics to differential equations further. Finally, we have two chapters that reflect the views of physicists on the fundamental issues concerning the understanding of entropy and irreversibility. Surprisingly, it turns out that there is still no single unified, universally accepted theory of the thermodynamics of irreversible processes. There are competing theories and concepts with a certain overlap and some fundamental differences. This diversity is a bit confusing, but bringing more clarity and unity to this field is also a challenging research goal. The contributions we have collected here attempt to expose this diversity to a certain extent and clearly point out some of the conceptual differences that need to be resolved.
Interestingly, the uncertainty about some of the basic physical concepts involved is reflected by the fact that the state of the mathematical theory of the equations that follow from these theories is also very inadequate. Numerical methods play an increasingly important role in the effort to sort this out. This is because numerical computations allow us to explore more deeply consequences of mathematically complicated theories, i.e. the large systems of nonlinear differential equations they lead to. They may allow us to make quantitative predictions and comparisons with physical reality. Entropy also plays a key role in numerical analysis and numerical computations, an important topic that is missing here. However, the analytical tools are actually discussed in the chapter by Dafermos. The link between analytical theory and numerical approximations is particularly close in this field since key existence results were actually obtained via convergence proofs for numerical schemes. In numerical computations it is important to have verified that the method being used is consistent with an entropy condition, otherwise unphysical shock discontinuities may be approximated. Also, it is useful to monitor the entropy production in computations, because false entropy production is an indicator of some error in the scheme or the implementation.
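The monitoring practice described above can be sketched in a toy computation (entirely our own; the scheme, grid, and data are assumptions, not code from the book): an upwind finite-volume scheme for Burgers' equation with positive data should dissipate the discrete entropy Σ (u_i²/2) Δx, and growth of this quantity would signal an error in the scheme or the implementation.

```python
# Hedged sketch (our own minimal example): monitoring entropy production in a
# finite-volume computation for Burgers' equation u_t + (u^2/2)_x = 0.
# With positive data, the upwind scheme under the CFL condition should
# dissipate the discrete entropy eta(u) = u^2/2; growth would signal a bug.

def total_entropy(u, dx):
    return sum(0.5 * ui * ui for ui in u) * dx

def upwind_step(u, dt, dx):
    # flux f(u) = u^2/2; upwind differencing is valid here because all u > 0
    flux = [0.5 * ui * ui for ui in u]
    # flux[i-1] wraps around for i = 0, giving periodic boundary conditions
    return [u[i] - dt / dx * (flux[i] - flux[i - 1]) for i in range(len(u))]

nx, dx, dt = 100, 1.0 / 100, 0.004       # CFL = dt * max|u| / dx = 0.8 < 1
u = [2.0 if i < nx // 2 else 1.0 for i in range(nx)]  # a shock forms and moves right

entropies = [total_entropy(u, dx)]
for _ in range(50):
    u = upwind_step(u, dt, dx)
    entropies.append(total_entropy(u, dx))

assert all(a >= b - 1e-12 for a, b in zip(entropies, entropies[1:]))
print("discrete entropy is non-increasing")
```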
Let us now turn to the contributions. The first chapter in this part is a survey by Kolumban Hutter and Yongqi Wang (p. 57) on some theories of irreversible thermodynamics. It highlights two different forms of the second law of thermodynamics, namely, the Coleman-Noll approach to the Clausius-Duhem inequality and the Müller-Liu entropy principle, in a detailed comparison. The structure of these descriptions of thermodynamics is exposed in very concise form. The interesting lesson to the nonexpert is that there are different theories available that have much common ground but marked differences as well, a theme we will encounter again in Uffink's chapter. The second approach that Hutter and Wang discuss is then exposed much more extensively in the next chapter, by Ingo Müller (p. 79). The reader is given an introduction to the theory of extended thermodynamics, which provides a macroscopic theory for irreversible thermodynamic processes. It builds a vital link to the microscopic approach of kinetic gas theory formulated in terms of the Boltzmann equation and the density function it has as solution. From these one can rigorously derive an infinite sequence of balance equations involving higher and higher moments of the density function. In order to have a finite number of equations, these are truncated and certain closure laws have to be derived. This approach leads to symmetric hyperbolic systems of partial differential equations that are automatically endowed with an entropy function. The entropy production plays an important role in deriving closure laws for these systems. The reader who wishes to learn more should consult the book on extended thermodynamics that Müller together with Ruggeri recently published in updated form.
(Continues...)
Table of Contents
Preface xi
List of Contributors xiii
Chapter 1. Introduction
A. Greven, G. Keller, G. Warnecke 1
1.1 Outline of the Book 4
1.2 Notations 14
PART 1. FUNDAMENTAL CONCEPTS 17
Chapter 2. Entropy: a Subtle Concept in Thermodynamics
I. Müller 19
2.1 Origin of Entropy in Thermodynamics 19
2.2 Mechanical Interpretation of Entropy in the Kinetic Theory of Gases 23
2.2.1 Configurational Entropy 25
2.3 Entropy and Potential Energy of Gravitation 28
2.3.1 Planetary Atmospheres 28
2.3.2 Pfeffer Tube 29
2.4 Entropy and Intermolecular Energies 30
2.5 Entropy and Chemical Energies 32
2.6 Omissions 34
References 35
Chapter 3. Probabilistic Aspects of Entropy
H.-O. Georgii 37
3.1 Entropy as a Measure of Uncertainty 37
3.2 Entropy as a Measure of Information 39
3.3 Relative Entropy as a Measure of Discrimination 40
3.4 Entropy Maximization under Constraints 43
3.5 Asymptotics Governed by Entropy 45
3.6 Entropy Density of Stationary Processes and Fields 48
References 52
PART 2. ENTROPY IN THERMODYNAMICS 55
Chapter 4. Phenomenological Thermodynamics and Entropy Principles
K. Hutter and Y. Wang 57
4.1 Introduction 57
4.2 A Simple Classification of Theories of Continuum Thermodynamics 58
4.3 Comparison of Two Entropy Principles 63
4.3.1 Basic Equations 63
4.3.2 Generalized Coleman-Noll Evaluation of the Clausius-Duhem Inequality 66
4.3.3 Müller-Liu's Entropy Principle 71
4.4 Concluding Remarks 74
References 75
Chapter 5. Entropy in Nonequilibrium
I. Müller 79
5.1 Thermodynamics of Irreversible Processes and Rational Thermodynamics for Viscous, Heat-Conducting Fluids 79
5.2 Kinetic Theory of Gases, the Motivation for Extended Thermodynamics 82
5.2.1 A Remark on Temperature 82
5.2.2 Entropy Density and Entropy Flux 83
5.2.3 13-Moment Distribution. Maximization of Nonequilibrium Entropy 83
5.2.4 Balance Equations for Moments 84
5.2.5 Moment Equations for 13 Moments. Stationary Heat Conduction 85
5.2.6 Kinetic and Thermodynamic Temperatures 87
5.2.7 Moment Equations for 14 Moments. Minimum Entropy Production 89
5.3 Extended Thermodynamics 93
5.3.1 Paradoxes 93
5.3.2 Formal Structure 95
5.3.3 Pulse Speeds 98
5.3.4 Light Scattering 101
5.4 A Remark on Alternatives 103
References 104
Chapter 6. Entropy for Hyperbolic Conservation Laws
C. M. Dafermos 107
6.1 Introduction 107
6.2 Isothermal Thermoelasticity 108
6.3 Hyperbolic Systems of Conservation Laws 110
6.4 Entropy 113
6.5 Quenching of Oscillations 117
References 119
Chapter 7. Irreversibility and the Second Law of Thermodynamics
J. Uffink 121
7.1 Three Concepts of (Ir)reversibility 121
7.2 Early Formulations of the Second Law 124
7.3 Planck 129
7.4 Gibbs 132
7.5 Carathéodory 133
7.6 Lieb and Yngvason 140
7.7 Discussion 143
References 145
Chapter 8. The Entropy of Classical Thermodynamics
E. H. Lieb, J. Yngvason 147
8.1 A Guide to Entropy and the Second Law of Thermodynamics 148
8.2 Some Speculations and Open Problems 190
8.3 Some Remarks about Statistical Mechanics 192
References 193
PART 3. ENTROPY IN STOCHASTIC PROCESSES 197
Chapter 9. Large Deviations and Entropy
S. R. S. Varadhan 199
9.1 Where Does Entropy Come From? 199
9.2 Sanov's Theorem 201
9.3 What about Markov Chains? 202
9.4 Gibbs Measures and Large Deviations 203
9.5 Ventcel-Freidlin Theory 205
9.6 Entropy and Large Deviations 206
9.7 Entropy and Analysis 209
9.8 Hydrodynamic Scaling: an Example 211
References 214
Chapter 10. Relative Entropy for Random Motion in a Random Medium
F. den Hollander 215
10.1 Introduction 215
10.1.1 Motivation 215
10.1.2 A Branching Random Walk in a Random Environment 217
10.1.3 Particle Densities and Growth Rates 217
10.1.4 Interpretation of the Main Theorems 219
10.1.5 Solution of the Variational Problems 220
10.1.6 Phase Transitions 223
10.1.7 Outline 224
10.2 Two Extensions 224
10.3 Conclusion 225
10.4 Appendix: Sketch of the Derivation of the Main Theorems 226
10.4.1 Local Times of Random Walk 226
10.4.2 Large Deviations and Growth Rates 228
10.4.3 Relation between the Global and the Local Growth Rate 230
References 231
Chapter 11. Metastability and Entropy
E. Olivieri 233
11.1 Introduction 233
11.2 van der Waals Theory 235
11.3 Curie-Weiss Theory 237
11.4 Comparison between Mean-Field and Short-Range Models 237
11.5 The 'Restricted Ensemble' 239
11.6 The Pathwise Approach 241
11.7 Stochastic Ising Model. Metastability and Nucleation 241
11.8 First-Exit Problem for General Markov Chains 244
11.9 The First Descent Tube of Trajectories 246
11.10 Concluding Remarks 248
References 249
Chapter 12. Entropy Production in Driven Spatially Extended Systems
C. Maes 251
12.1 Introduction 251
12.2 Approach to Equilibrium 252
12.2.1 Boltzmann Entropy 253
12.2.2 Initial Conditions 254
12.3 Phenomenology of Steady-State Entropy Production 254
12.4 Multiplicity under Constraints 255
12.5 Gibbs Measures with an Involution 258
12.6 The Gibbs Hypothesis 261
12.6.1 Pathspace Measure Construction 262
12.6.2 Space-Time Equilibrium 262
12.7 Asymmetric Exclusion Processes 263
12.7.1 MEP for ASEP 263
12.7.2 LFT for ASEP 264
References 266
Chapter 13. Entropy: a Dialogue
J. L. Lebowitz, C. Maes 269
References 275
PART 4. ENTROPY AND INFORMATION 277
Chapter 14. Classical and Quantum Entropies: Dynamics and Information
F. Benatti 279
14.1 Introduction 279
14.2 Shannon and von Neumann Entropy 280
14.2.1 Coding for Classical Memoryless Sources 281
14.2.2 Coding for Quantum Memoryless Sources 282
14.3 Kolmogorov-Sinai Entropy 283
14.3.1 KS Entropy and Classical Chaos 285
14.3.2 KS Entropy and Classical Coding 285
14.3.3 KS Entropy and Algorithmic Complexity 286
14.4 Quantum Dynamical Entropies 287
14.4.1 Partitions of Unit and Decompositions of States 290
14.4.2 CNT Entropy: Decompositions of States 290
14.4.3 AF Entropy: Partitions of Unit 292
14.5 Quantum Dynamical Entropies: Perspectives 293
14.5.1 Quantum Dynamical Entropies and Quantum Chaos 295
14.5.2 Dynamical Entropies and Quantum Information 296
14.5.3 Dynamical Entropies and Quantum Randomness 296
References 296
Chapter 15. Complexity and Information in Data
J. Rissanen 299
15.1 Introduction 299
15.2 Basics of Coding 301
15.3 Kolmogorov Sufficient Statistics 303
15.4 Complexity 306
15.5 Information 308
15.6 Denoising with Wavelets 311
References 312
Chapter 16. Entropy in Dynamical Systems
L. S. Young 313
16.1 Background 313
16.1.1 Dynamical Systems 313
16.1.2 Topological and Metric Entropies 314
16.2 Summary 316
16.3 Entropy, Lyapunov Exponents, and Dimension 317
16.3.1 Random Dynamical Systems 321
16.4 Other Interpretations of Entropy 322
16.4.1 Entropy and Volume Growth 322
16.4.2 Growth of Periodic Points and Horseshoes 323
16.4.3 Large Deviations and Rates of Escape 325
References 327
Chapter 17. Entropy in Ergodic Theory
M. Keane 329
References 335
Combined References 337
Index 351