A FRESH LOOK AT ENTROPY AND THE SECOND LAW OF THERMODYNAMICS

The existence of entropy, and its increase, can be understood without reference to either statistical mechanics or heat engines.

Elliott H. Lieb and Jakob Yngvason

Physics Today 53(4), 32 (2000); doi: 10.1063/1.883034

ELLIOTT LIEB is a Higgins Professor of Physics and a professor of mathematics at Princeton University in Princeton, New Jersey. JAKOB YNGVASON is a professor of theoretical physics at the University of Vienna and president of the Erwin Schrödinger Institute for Mathematical Physics in Vienna, Austria.

In days long gone, the second law of thermodynamics (which predated the first law) was regarded as perhaps the most perfect and unassailable law in physics. It was even supposed to have philosophical import: It has been hailed for providing a proof of the existence of God (who started the universe off in a state of low entropy, from which it is constantly degenerating); conversely, it has been rejected as being incompatible with dialectical materialism and the perfectibility of the human condition.

Alas, physicists themselves eventually demoted the second law to a lesser position in the pantheon—because (or so it was declared) it is "merely" statistics applied to the mechanics of large numbers of atoms. Willard Gibbs wrote: "The laws of thermodynamics may easily be obtained from the principles of statistical mechanics, of which they are the incomplete expression"1—and Ludwig Boltzmann expressed similar sentiments.

Is that really so? Is it really true that the second law is merely an "expression" of microscopic models, or could it exist in a world that was featureless at the 10⁻⁸ cm level? We know that statistical mechanics is a powerful tool for understanding physical phenomena and calculating many quantities, especially in systems at or near equilibrium. We use it to calculate entropy, specific and latent heats, phase transition properties, transport coefficients, and so on, often with good accuracy. Important examples abound, such as Max Planck's realization that by staring into a furnace he could find Avogadro's number, and Linus Pauling's highly accurate back-of-the-envelope calculation of the residual entropy of ice. But is statistical mechanics essential for the second law?

In any event, it is still beyond anyone's computational ability (except in idealized situations) to account for a very precise, essentially infinitely accurate law of physics from statistical mechanical principles. No exception to the second law of thermodynamics has ever been found—not even a tiny one. Like conservation of energy (the "first" law), the existence of a law so precise and so independent of details of models must have a logical foundation that is independent of the fact that matter is composed of interacting particles. Our aim here is to explore that foundation. The full details can be found in reference 2.

As Albert Einstein put it, "A theory is the more impressive the greater the simplicity of its premises, the more different kinds of things it relates, and the more extended its area of applicability. Therefore the deep impression which classical thermodynamics made upon me. It is the only physical theory of universal content concerning which I am convinced that, within the framework of the applicability of its basic concepts, it will never be overthrown."3

In an attempt to reaffirm the second law as a pillar of physics in its own right, we have returned to a little-noticed movement that began in the 1950s with the work of Peter Landsberg,4 Hans Buchdahl,5 Gottfried Falk, Herbert Jung,6 and others2 and culminated in the book of Robin Giles,7 which must be counted one of the truly great, but unsung, works in theoretical physics. It is in these works that the concept of "comparison" (explained below) emerges as one of the key underpinnings of the second law. The approach of these authors is quite different from lines of thought in the tradition of Sadi Carnot, which base thermodynamics on the efficiency of heat engines. (See reference 8, for example, for modern expositions of the latter approach.)

The basic question

The paradigmatic event that the second law deals with can be described as follows. Take a macroscopic system in an equilibrium state X and place it in a room along with a gorilla equipped with arbitrarily complicated machinery (a metaphor for the rest of the universe), and a weight—and close the door. As in the old advertisement for indestructible luggage, the gorilla can do anything to the system—including tearing it apart. At the end of the day, however, when the door is opened, the system is found to be in some other equilibrium state, Y, the gorilla and machinery are found in their original state, and the only other thing that has possibly changed is that the weight has been raised or lowered. Let us emphasize that although our focus is on equilibrium states, the processes that take one such state into another can be arbitrarily violent. The gorilla knows no limits. (See figure 1.)

The question that the second law answers is this: What distinguishes those states Y that can be reached from X in this manner from those that cannot?


The answer: There is a function of the equilibrium states, called entropy and denoted by S, that characterizes the possible pairs of equilibrium states X and Y by the inequality S(X) ≤ S(Y). The function can be chosen so as to be additive (in a sense explained below), and with this requirement it is unique, up to a change of scale. Our main point is that the existence of entropy relies on only a few basic principles, independent of any statistical model—or even of atoms.

What is exciting about this seemingly innocuous statement is the uniqueness of entropy, for it means that all the different methods for measuring or computing entropy must give the same answer. The usual textbook derivation of entropy as a state function, starting with some version of "the second law," proceeds by considering certain slow, almost reversible processes (along adiabats and isotherms). It is not at all evident that a function obtained in this way can contain any information about processes that are far from being slow or reversible. The clever physicist might think that with the aid of modern computers, sophisticated feedback mechanisms, unlimited amounts of mechanical energy (represented by the weight), and lots of plain common sense and funding, the system could be made to go from an equilibrium state X to a state Y that could not be reached by the primitive quasistatic processes used to define entropy in the first place. This cannot happen, however, no matter how clever the experimenter or how far from equilibrium one travels!

What logic lies behind this law? Why can't one gorilla undo what another one has wrought? The atomistic foundation of the logic is not as simple as is often suggested. It concerns not only such matters as the enormous number of atoms involved (10²³), but also other aspects of statistical mechanics that are beyond our present mathematical abilities. In particular, the interaction of a system with the external world (represented by the gorilla and machinery) cannot be described in any obvious way by Hamiltonian mechanics. Although irreversibility is an important open problem in statistical mechanics, it is fortunate that the logic of thermodynamics itself is independent of atoms and can be understood without knowing the source of irreversibility.

The founders of thermodynamics—Rudolf Clausius, Lord Kelvin, Planck, Constantin Carathéodory, and so on—clearly had transitions between equilibrium states in mind when they stated the law in sentences such as "No process is possible, the sole result of which is that a body is cooled and work is done" (Kelvin). Later it became tacitly understood that the law implies a continuous increase in some property called entropy, which was supposedly defined for systems out of equilibrium. The ongoing, unsatisfactory debates (see reference 9, for example) about the definition of this nonequilibrium entropy and whether it increases show, in fact, that what is supposedly "easily" understood needs clarification. Once again, it is a good idea to try to understand first the meaning of entropy for equilibrium states—the quantity that our textbooks talk about when they draw Carnot cycles. In this article we restrict our attention to just those states; by "state" we always mean "equilibrium state." Entropy, as the founders of thermodynamics understood the quantity, is subtle enough, and it is worthwhile to understand the "second law" in this restricted context.

FIGURE 1. THE SECOND LAW OF THERMODYNAMICS says that increased entropy characterizes those final states of a macroscopic system that can be reached from a given initial state without leaving an imprint on the rest of the universe, apart from the displacement of a weight. The scenario shown here illustrates that the process can be quite violent. (a) A system in an equilibrium state X (blue) is placed in a room with a gorilla, some intricate machinery (green), and a weight. (b) The gorilla, machinery, and system interact and the system undergoes a violent transition. (c) The system is found in a new equilibrium state Y (red), the gorilla and machinery are found in their original state, while the weight may have been displaced. The role of the weight is to supply energy (via the machinery) both for the actions of the gorilla and for bringing the machinery and gorilla back to their initial states. The recovery process may involve additional interactions between machinery, system, and gorilla—interactions besides those indicated in (b).

To do so, it is not necessary to decide whether Boltzmann or Gibbs had the right view of irreversibility. (Their views are described in Joel L. Lebowitz's article, "Boltzmann's Entropy and Time's Arrow," PHYSICS TODAY, September 1993, page 32.)

The basic concepts

To begin at the beginning, we suppose we know what is meant by a thermodynamic system and equilibrium states of such a system. Admittedly, these are not always easy to define, and there are certainly systems, such as a mixture of hydrogen and oxygen or an interstellar ionized gas, capable of behaving as though they were in equilibrium even if they are not truly so. The prototypical system is a so-called "simple system," consisting of a substance in a container with a piston. But a simple system can be much more complicated than that. Besides its volume, it can have other coordinates, which can be changed by mechanical or electrical means—shear in a solid or magnetization, for example. In any event, a state of a simple system is described by a special coordinate U, which is its energy, and one or more other coordinates (such as the volume V) called work coordinates. An essential point is that the concept of energy, which we know about from moving weights and Newtonian mechanics, can be defined for thermodynamic systems. This fact is the content of the first law of thermodynamics.

Another type of system is a "compound system," which consists of several different or identical independent, simple systems. By means of mixing or chemical reactions, systems can be created or destroyed.

Let us briefly discuss some concepts that are relevant for systems and their states, which are denoted by capital letters such as X, X′, Y, . . . . Operationally, the composition, denoted (X, X′), of two states X and X′ is obtained simply by putting one system in a state X and one in a state X′ side by side on the experimental table and regarding them jointly as a state of a new, compound system. For instance, X could be a glass containing 100 g of whiskey at standard pressure and 20 °C, and X′ a glass containing 50 g of ice at standard pressure and 0 °C. To picture (X, X′), one should think of the two glasses standing on a table without touching each other. (See figure 2.)

Another operation is the "scaling" of a state X by a factor λ > 0, leading to a state denoted λX. Extensive properties such as mass, energy, and volume are multiplied by λ, while intensive properties such as pressure stay intact. For the states X and X′ as in the example above, ½X is 50 g of whiskey at standard pressure and 20 °C, and ⅕X′ is 10 g of ice at standard pressure and 0 °C. Compound systems scale in the same way: ⅕(X, X′) is 20 g of whiskey and 10 g of ice in separate glasses, with pressure and temperatures as before.

A central notion is adiabatic accessibility. If our gorilla can take a system from X to Y as described above—that is, if the only net effect of the action, besides the state change of the system, is that a weight has possibly been raised or lowered, we say that Y is adiabatically accessible from X and write X ≺ Y (the symbol ≺ is pronounced "precedes"). It has to be emphasized that for macroscopic systems the relation is an absolute one: If a transition from X to Y is possible at one time, then it is always possible (that is, it is reproducible), and if it is impossible at one time, then it never happens. This absolutism is guaranteed by the large powers of 10 involved—the impossibility of a chair's spontaneously jumping up from the floor is an example.

The role of entropy

Now imagine that we are given a list of all possible pairs of states X, Y such that X ≺ Y. The foundation on which thermodynamics rests, and the essence of the second law, is that this list can be simply encoded in an entropy function S on the set of all states of all systems (including compound systems), so that when X and Y are related at all, then

X ≺ Y if and only if S(X) ≤ S(Y).

Moreover, the entropy function can be chosen in such a way that if X and X′ are states of two (different or identical) systems, then the entropy of the compound system in this pair of states is given by

S(X, X′) = S(X) + S(X′).

This additivity of entropy is a highly nontrivial assertion. Indeed, it is one of the most far-reaching properties of the second law. In compound systems such as the whiskey/ice example above, all states (Y, Y′) such that X ≺ Y and X′ ≺ Y′ are adiabatically accessible from (X, X′). For instance, by letting a falling weight run an electric generator one can stir the whiskey and also melt some ice. But it is important to note that (Y, Y′) can be adiabatically accessible from (X, X′) without Y being adiabatically accessible from X. Bringing the two glasses into contact and separating them again is adiabatic for the compound system, but the resulting cooling of the whiskey is not adiabatic for the whiskey alone.

FIGURE 2. ADIABATIC STATE CHANGES for a compound system consisting of a glass of whiskey and a glass of ice. The states (Y1, Y1′), (Y2, Y2′), and Y3 are all adiabatically accessible from (X, X′). (Y1, Y1′) can be reached by weight-powered stirrers (not shown) acting on each glass. (Y2, Y2′) is obtained by bringing the two subsystems temporarily into thermal contact. Y3 is obtained by pouring the whiskey on the ice and stirring; this is a mixing process and changes the system. The original state (X, X′) is not adiabatically accessible from any of the three states (Y1, Y1′), (Y2, Y2′), or Y3.

The fact that the inequality S(X) + S(X′) ≤ S(Y) + S(Y′) exactly characterizes the possible adiabatic transitions for the compound system, even when S(X) > S(Y), is quite remarkable. It means that it is sufficient to know the entropy of each part of a compound system to decide which transitions due to interactions between the parts (brought about by the gorilla) are possible.
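To see this bookkeeping in numbers, here is a minimal sketch (our own illustration, not from the article; the whiskey is treated as water, the temperatures follow figure 2, and the heat capacity and latent heat are standard textbook values) of the thermal-contact transition (X, X′) ≺ (Y2, Y2′):

```python
# Entropy budget for figure 2's thermal-contact process (our illustration;
# whiskey treated as water).  Standard values: c = 4.18 J/(g K) for liquid
# water, 334 J/g to melt ice at 0 C.
import math

c, L = 4.18, 334.0                        # J/(g K) and J/g
m = 100.0                                 # g of whiskey
T1, T2, T_ice = 293.15, 283.15, 273.15    # 20 C -> 10 C; ice stays at 0 C

dS_whiskey = m * c * math.log(T2 / T1)    # < 0: the whiskey cools
heat = m * c * (T1 - T2)                  # J released, absorbed by the ice
dS_ice = heat / T_ice                     # > 0: some ice melts at fixed 0 C

print(heat / L)                           # ~12.5 g of ice melts
print(dS_whiskey, dS_ice, dS_whiskey + dS_ice)   # ~ -14.5, +15.3, +0.8 J/K
```

The whiskey's entropy drops, the ice's rises by slightly more, so the compound inequality holds even though Y2 is not adiabatically accessible from X alone.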

Closely related to additivity is extensivity, or scalingof entropy,

S(λX) = λS(X),

which means that the entropy of an arbitrary mass of a substance is determined by the entropy of some standard reference mass, such as 1 kg of the substance. Without this scaling property, engineers would have to use different steam tables each time they designed a new engine.

In traditional presentations of thermodynamics, based for example on Kelvin's principle given above, entropy is arrived at in a rather roundabout way that tends to obscure its connection with the relation ≺. The basic message we wish to convey is that the existence and uniqueness of entropy are equivalent to certain simple properties of the relation ≺. This equivalence is the concern of reference 2.

An analogy leaps to mind: When can a vector field E(x) be encoded in an ordinary function (potential) v(x) whose gradient is E? The well-known answer is that a necessary and sufficient condition is that curl E = 0. The importance of this encoding does not have to be emphasized to physicists; entropy's role is similar to the potential's role, and the existence and meaning of entropy are not based on any formula such as S = −Σi pi ln pi, involving probabilities pi of "microstates." Entropy is derived (uniquely, we hope) from the list of pairs X ≺ Y; our aim is to figure out what properties of this list (analogous to the curl-free condition) will allow it to be described by an entropy. That entropy will then be endowed with an unambiguous physical meaning independent of anyone's assumptions about "the arrow of time," "coarse graining," and so on. Only the list, which is given by physics, is important for us now.

The required properties of ≺ do not involve concepts like "heat" or "reversible engines"; not even "hot" and "cold" are needed. Besides the "obvious" conditions "X ≺ X for all X" (reflexivity) and "X ≺ Y and Y ≺ Z implies X ≺ Z" (transitivity), one needs to know that the relation behaves reasonably with respect to the composition and scaling of states. By this we mean the following (a toy model illustrating these axioms is sketched after the list):
• Adiabatic accessibility is consistent with the composition of states: X ≺ Y and Z ≺ W implies (X, Z) ≺ (Y, W).
• Scaling of states does not affect adiabatic accessibility: If X ≺ Y, then λX ≺ λY.
• Systems can be cut adiabatically into two parts: If 0 < λ < 1, then X ≺ ([1 − λ]X, λX), and the recombination of the parts is also adiabatic: ([1 − λ]X, λX) ≺ X.
• Adiabatic accessibility is stable with respect to small perturbations: If (X, εZ) ≺ (Y, εW) for arbitrarily small ε > 0, then X ≺ Y.
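To make these requirements concrete, here is a minimal sketch (a toy model of our own devising, not from the article) in which a state is just a pair (mass, total entropy), composition concatenates states, scaling multiplies both coordinates, and X ≺ Y is declared to hold when total mass is conserved and total entropy does not decrease; the axioms can then be checked mechanically:

```python
# Toy model of the composition, scaling, and splitting axioms (our own
# construction): a simple state is (mass, total entropy); a compound state
# is a list of simple states.
from typing import List, Tuple

State = Tuple[float, float]        # (mass, total entropy)
Compound = List[State]

def precedes(xs: Compound, ys: Compound) -> bool:
    """X -< Y in the toy model: equal total mass, nondecreasing entropy."""
    total = lambda zs, i: sum(z[i] for z in zs)
    return abs(total(xs, 0) - total(ys, 0)) < 1e-12 and total(xs, 1) <= total(ys, 1)

def scale(lam: float, zs: Compound) -> Compound:
    """Scaling multiplies every extensive coordinate by lam."""
    return [(lam * m, lam * s) for m, s in zs]

X, Y = [(1.0, 2.0)], [(1.0, 3.0)]  # X -< Y
Z, W = [(2.0, 1.0)], [(2.0, 1.5)]  # Z -< W
lam = 0.3

assert precedes(X + Z, Y + W)                     # composition axiom
assert precedes(scale(lam, X), scale(lam, Y))     # scaling axiom
split = scale(1 - lam, X) + scale(lam, X)
assert precedes(X, split) and precedes(split, X)  # splitting and recombination
print("composition, scaling, and splitting axioms hold in the toy model")
```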

These requirements are all very natural. In fact, in traditional approaches they are usually taken for granted, without mention. They are not quite sufficient, however, to define entropy. A crucial additional ingredient is the comparison hypothesis for the relation ≺. In essence, this is the hypothesis that equilibrium states, whether simple or compound, can be grouped into classes such that if X and Y are in the same class, then either X ≺ Y or Y ≺ X. In nature, a class consists of all states with the same mass and chemical composition—that is, with the same amount of each of the chemical elements. If chemical reactions and mixing processes are excluded, the classes are smaller and may be identified with the "systems" in the usual parlance. But it should be noted that systems may be compound, or consist of two or more vessels of different substances. In any case, the role of the comparison hypothesis is to ensure that the list of pairs X ≺ Y is sufficiently long. Indeed, we shall give an example later of a system whose pairs satisfy all the other axioms, but that is not describable by an entropy function.

Construction of entropy

Our main conclusion (which we do not claim is obvious, but whose proof can be found in reference 2) is that the existence and uniqueness of entropy is a consequence of the comparison hypothesis and the assumptions about adiabatic accessibility stated above. In fact, if X0, X, and X1 are three states of a system and λ is any scaling factor between 0 and 1, then either X ≺ ([1 − λ]X0, λX1) or ([1 − λ]X0, λX1) ≺ X, by the comparison hypothesis. If both alternatives hold, then the properties of entropy demand that

S(X) = (1 − λ)S(X0) + λS(X1).

If S(X0) ≠ S(X1), then this equality can hold for at most one λ. With X0 and X1 as reference states, the entropy is therefore fixed, apart from two free constants, namely the values S(X0) and S(X1).

From the properties of the relation ≺ listed above, one can show that there is, indeed, always a λ with 0 ≤ λ ≤ 1 having the required properties, provided that X0 ≺ X ≺ X1.

FIGURE 3. DEFINITION OF ENTROPY. One can define the entropy of 1 kg of water in a given state (represented by the orange color) by obtaining the state from a fraction λ kg of steam in a fixed, standard state (red) and a fraction 1 − λ kg of ice in a fixed, standard state (blue), with the aid of a device (green) and a weight (yellow). The device returns to its initial state at the end of the process, but the weight may end up raised or lowered. The entropy Swater, measured in units of Ssteam, is the maximum fraction λ = λmax for which the transformation to 1 kg of water in the given (orange) state is possible. The system of steam and ice is used here only for illustration. The definition of entropy need not involve phase changes.

It is equal to the largest λ, denoted λmax, such that ([1 − λ]X0, λX1) ≺ X. Defining the entropies of the reference states arbitrarily as S(X0) = 0 and S(X1) = 1 unit, we obtain the following simple formula for entropy:

S(X) = λmax units.

The scaling factors (1 − λ) and λ measure the amount of substance in the states X0 and X1, respectively. The formula for entropy can therefore be stated in the following words: S(X) is the maximal fraction of substance in the state X1 that can be transformed adiabatically (that is, in the sense of ≺) into the state X with the aid of a complementary fraction of substance in the state X0. This way of measuring S in terms of substance is reminiscent of an old idea, suggested by Pierre Laplace and Antoine Lavoisier, that heat be measured in terms of the amount of ice melted in a process. As a concrete example, let us assume that X is a state of liquid water, X0 of ice, and X1 of vapor. Then S(X) for a kilogram of liquid, measured with the entropy of a kilogram of water vapor as a unit, is the maximal fraction of a kilogram of vapor that can be transformed adiabatically into liquid in state X with the aid of a complementary fraction of a kilogram of ice. (See figure 3.)
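To see the construction in action, here is a minimal sketch (our illustration; the entropy values are invented for the example). In a toy model where accessibility is already encoded by a hidden per-kilogram entropy s, so that ([1 − λ]X0, λX1) ≺ X exactly when (1 − λ)s(X0) + λs(X1) ≤ s(X), bisection on λ recovers λmax, and hence S(X) on the scale where S(X0) = 0 and S(X1) = 1:

```python
# Recovering entropy from the lambda_max recipe (our illustration; the
# numbers are made up).  Accessibility is encoded by a hidden entropy s.
def lam_max(s, X, X0, X1, tol=1e-9):
    """Largest lam for which ([1-lam]X0, lam X1) -< X, found by bisection."""
    lo, hi = 0.0, 1.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if (1 - mid) * s(X0) + mid * s(X1) <= s(X):
            lo = mid             # mixture still precedes X: try more of X1
        else:
            hi = mid
    return lo

s = lambda state: state["s"]          # the hidden entropy (per kg)
X0, X1 = {"s": 0.5}, {"s": 8.0}       # reference states ("ice", "steam")
X = {"s": 3.5}                        # the state to be measured

print(lam_max(s, X, X0, X1))              # ~0.4
print((s(X) - s(X0)) / (s(X1) - s(X0)))   # same number, as uniqueness demands
```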

In this example the maximal fraction λmax cannot be achieved by simply exposing the ice to the vapor, causing the former to melt and the latter to condense. That would be an irreversible process—that is, it would not be possible to reproduce the initial amounts of vapor and ice adiabatically (in the sense of the definition given earlier) from the liquid. By contrast, λmax is uniquely determined by the requirement that one can pass adiabatically from X to ([1 − λmax]X0, λmaxX1) and vice versa. For this transformation it is necessary to extract or add energy in the form of work—for example, by running a little reversible Carnot machine that transfers energy between the high-temperature and low-temperature parts of the system (see figure 3). We stress, however, that neither the concept of a "reversible Carnot machine" nor that of "temperature" is needed for the logic behind the formula for entropy given above. We mention these concepts only to relate our definition of entropy to concepts for which the reader may have an intuitive feeling.

By interchanging the roles of the three states, the definition of entropy is easily extended to situations where X ≺ X0 or X1 ≺ X. Moreover, the reference points X0 and X1, where the entropy is defined to be 0 and 1 unit respectively, can be picked consistently for different systems such that the formula for entropy will satisfy the crucial additivity and extensivity conditions

S(X, X′) = S(X) + S(X′) and S(λX) = λS(X).

It is important to understand that once the existence and uniqueness of entropy have been established, one need not rely on the λmax formula displayed above to determine it in practice. There are various experimental means to determine entropy that are usually much more practical. The standard method consists of measuring pressures, volumes, and temperatures (on some empirical scale), as well as specific and latent heats. The empirical temperatures are converted into absolute temperatures T (by means of formulas that follow from the mere existence of entropy but do not involve S directly), and the entropy is computed by means of formulas like ΔS = ∫(dU + P dV)/T, with P the pressure. The existence and uniqueness of entropy implies that this formula is independent of the path of integration.
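As a numerical check of that path independence, here is a minimal sketch (our illustration, for one mole of a monatomic ideal gas, where dU = Cv dT and P = RT/V): the integral ΔS = ∫(dU + P dV)/T gives the same answer along a direct path and along an isothermal-then-isochoric path, matching the closed form Cv ln(T2/T1) + R ln(V2/V1).

```python
# Path independence of Delta S = integral (dU + P dV)/T for one mole of a
# monatomic ideal gas (our illustration): dU = Cv dT, P = R T / V.
import numpy as np

R = 8.314
Cv = 1.5 * R                      # monatomic ideal gas, one mole

def dS(T1, V1, T2, V2, steps=50_000):
    """Midpoint-rule integral of (Cv dT + (R T / V) dV) / T along a
    straight line in (T, V) space."""
    T = np.linspace(T1, T2, steps)
    V = np.linspace(V1, V2, steps)
    dT, dV = np.diff(T), np.diff(V)
    Tm, Vm = (T[:-1] + T[1:]) / 2, (V[:-1] + V[1:]) / 2
    return np.sum((Cv * dT + (R * Tm / Vm) * dV) / Tm)

T1, V1, T2, V2 = 300.0, 0.010, 450.0, 0.025        # K, m^3

direct = dS(T1, V1, T2, V2)
two_leg = dS(T1, V1, T1, V2) + dS(T1, V2, T2, V2)  # isothermal, then isochoric
exact = Cv * np.log(T2 / T1) + R * np.log(V2 / V1)
print(direct, two_leg, exact)     # all ~12.7 J/K
```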

Comparability of states

The possibility of defining entropy entirely in terms of the relation ≺ was first clearly stated by Giles.7 (Giles's definition is different from ours, albeit similar in spirit.) The importance of the comparison hypothesis had been realized earlier, however.4–6 All the authors take the comparison hypothesis as a postulate—that is, they do not attempt to justify it from other, simpler premises. However, it is in fact possible to derive comparability for any pair of states of the same system from some natural and directly accessible properties of the relation ≺.2 The derivation uses the customary parameterization of states in terms of energy and work coordinates. But such parameterizations are irrelevant, and therefore not used, for our definition of entropy—once the comparison hypothesis has been established.

To appreciate the significance of the comparison hypothesis, it may be helpful to consider the following example. Imagine a world whose thermodynamical systems consist exclusively of incompressible solid bodies. Moreover, all adiabatic state changes in this world are supposed to be obtained by means of the following elementary operations:
• Mechanical rubbing of the individual systems, increasing their energy.
• Thermal equilibration in the conventional sense (by bringing the systems into contact).
The state space of the compound system consisting of two identical bodies, 1 and 2, can be parameterized by their energies, U1 and U2. Figure 4 shows two states, X and Y, of this compound system, and the states that are adiabatically accessible from each of these states. It is evident from the picture that neither X ≺ Y nor Y ≺ X holds. The comparison hypothesis is therefore violated in this hypothetical example, and so it is not possible to characterize adiabatic accessibility by means of an additive entropy function. A major part of our work consists of understanding why such situations do not happen—why the comparison hypothesis appears to hold true in the real world.
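The figure 4 example can be made fully explicit. In the sketch below (our own formalization of "rubbing plus equilibration"; the reachability rule is our reading of the figure, not a formula from the article), a compound state (c, d) is accessible from (a, b) exactly when it dominates (a, b) coordinate-wise, or when both coordinates are at least the mean (a + b)/2 (equilibrate first, then rub):

```python
# Our formalization of the incompressible-solids world: rubbing raises U1
# or U2 individually; equilibrating two identical bodies replaces both
# energies by their mean.  Neither operation can undo the other.
def reachable(x, y):
    """Is compound state y = (c, d) adiabatically accessible from x = (a, b)?"""
    (a, b), (c, d) = x, y
    by_rubbing_alone = c >= a and d >= b
    after_equilibration = min(c, d) >= (a + b) / 2
    return by_rubbing_alone or after_equilibration

X, Y = (0.0, 4.0), (3.0, 0.5)     # energies (U1, U2), arbitrary units
print(reachable(X, Y), reachable(Y, X))   # False False: noncomparable states
```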

The derivation of the comparison hypothesis is based on an analysis of simple systems, which are the building blocks of thermodynamics. As we already mentioned, the states of such systems are described by an energy coordinate U and at least one work coordinate, such as the volume V. The following concepts play a key role in this analysis (a small numerical illustration follows below):
• The possibility of forming "convex combinations" of states with respect to the energy U and volume V (or other work coordinates). This means that given any two states X and Z of one kilogram of our system, we can pick any state Y on the line between them in U,V space and, by taking appropriate fractions λ and 1 − λ in states X and Z, respectively, there will be an adiabatic process taking this pair of states into state Y. This process is usually quite elementary. For example, for gases and liquids one need only remove the barrier that separates the two fractions of the system. The fundamental property of entropy increase will then tell us that S(Y) ≥ λS(X) + (1 − λ)S(Z). As Gibbs emphasized, this "concavity" is the basis for thermodynamic stability—namely, positivity of specific heats and compressibilities.
• The existence of at least one irreversible adiabatic state change, starting from any given state. In conjunction with the concavity of S, this seemingly weak requirement excludes the possibility that the entropy is constant in a whole neighborhood of some state. The classical formulations of the second law follow from this.
• The concept of thermal equilibrium between simple systems, which means, operationally, that no state changes take place when the systems are allowed to exchange energy with each other at fixed work coordinates.


The zeroth law of thermodynamics says that if two systems are in thermal equilibrium with a third, then they are in thermal equilibrium with one another. This property is essential for the additivity of entropy, because it allows a consistent adjustment of the entropy unit for different systems. The zeroth law leads to a definition of temperature by the usual formula 1/T = (∂S/∂U)V.
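As a small numerical illustration (our own, using the standard entropy S(U, V) = Cv ln U + R ln V + const of one mole of monatomic ideal gas), a finite-difference derivative recovers 1/T = (∂S/∂U)V, and the concavity of S in U, the stability property discussed above, can be checked directly:

```python
# Temperature from entropy for one mole of monatomic ideal gas (our
# illustration): S(U, V) = Cv ln U + R ln V + const, so (dS/dU)_V = Cv/U,
# i.e. U = Cv T.
import math

R = 8.314
Cv = 1.5 * R

def S(U, V):
    return Cv * math.log(U) + R * math.log(V)

U, V, h = Cv * 300.0, 0.02, 1e-3              # energy chosen so that T = 300 K
invT = (S(U + h, V) - S(U - h, V)) / (2 * h)  # central difference for (dS/dU)_V
print(1.0 / invT)                             # ~300.0 K

U1, U2 = 2000.0, 6000.0                       # concavity: midpoint beats the chord
assert S((U1 + U2) / 2, V) >= (S(U1, V) + S(U2, V)) / 2
```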

Using these notions (and a few others of a more technical nature), the comparison hypothesis can be established for all simple systems and their compounds.

It is more difficult to justify the comparability of states if mixing processes or chemical reactions are taken into account. In fact, although a mixture of whiskey and water at 0 °C is obviously adiabatically accessible from separate whiskey and ice by pouring whiskey from one glass onto the rocks in the other glass, it is not possible to reverse this process adiabatically. Hence it is not clear that a block of a frozen whiskey/water mixture at −10 °C, say, is at all related in the sense of ≺ to a state in which whiskey and water are in separate glasses. Textbooks usually appeal here to gedanken experiments with "semipermeable membranes" that let only water molecules through and hold back the whiskey molecules, but such membranes really exist only in the mind.10 However, without invoking any such device, it turns out to be possible to shift the entropy scales of the various substances in such a way that X ≺ Y always implies S(X) ≤ S(Y). The converse assertion, namely, that S(X) ≤ S(Y) implies X ≺ Y provided that X and Y have the same chemical composition, cannot be guaranteed a priori for mixing and chemical reactions, but it is empirically testable and appears to be true in the real world. This aspect of the second law, comparability, is not usually stressed, but it is important; it is challenging to figure out how to turn the frozen whiskey/water block into a glass of whiskey and a glass of water without otherwise changing the universe, except for moving a weight, but such an adiabatic process is possible.

What has been gained?

The line of thought that started more than 40 years ago has led to an axiomatic foundation for thermodynamics. It is appropriate to ask what, if anything, has been gained in comparison to the usual approaches involving quasistatic processes and Carnot machines on the one hand and statistical mechanics on the other hand. There are several points. One is the elimination of intuitive but hard-to-define concepts such as "hot," "cold," and "heat" from the foundations of thermodynamics. Another is the recognition of entropy as a codification of possible state changes, X ≺ Y, that can be accomplished without changing the rest of the universe in any way except for moving a weight. Temperature is eliminated as an a priori concept and appears in its natural place—as a quantity derived from entropy and whose consistent definition really depends on the existence of entropy, rather than the other way around. To define entropy, there is no need for special machines and processes on the empirical side, and there is no need for assumptions about models on the statistical mechanical side. Just as energy conservation was eventually seen to be a consequence of time translation invariance, in like manner entropy can be seen to be a consequence of some simple properties of the list of state pairs related by adiabatic accessibility.

If the second law can be demystified, so much the better. If it can be seen to be a consequence of simple, plausible notions, then, as Einstein said, it cannot be overthrown.

We are grateful to Shivaji Sondhi and Roderich Moessner for helpful suggestions. Lieb's work was supported by the National Science Foundation. Yngvason's work was supported by the Adalsteinn Kristjansson Foundation and the University of Iceland.

References
1. C. Kittel, H. Kroemer, Thermal Physics, Freeman, New York (1980), p. 57.
2. E. H. Lieb, J. Yngvason, Phys. Rep. 310, 1 (1999); erratum 314, 669 (1999). For a summary, see Notices Amer. Math. Soc. 45, 571 (1998).
3. A. Einstein, autobiographical notes in Albert Einstein: Philosopher-Scientist, P. A. Schilpp, ed., Library of Living Philosophers, vol. VII, Cambridge U.P., London (1970), p. 33.
4. P. T. Landsberg, Rev. Mod. Phys. 28, 363 (1956).
5. H. A. Buchdahl, The Concepts of Classical Thermodynamics, Cambridge U.P., London (1966).
6. G. Falk, H. Jung, in Handbuch der Physik III/2, S. Flügge, ed., Springer, Berlin (1959), p. 199.
7. R. Giles, Mathematical Foundations of Thermodynamics, Pergamon, Oxford, England (1964).
8. D. R. Owen, A First Course in the Mathematical Foundations of Thermodynamics, Springer, Heidelberg, Germany (1984). J. Serrin, Arch. Rat. Mech. Anal. 70, 355 (1979). M. Šilhavý, The Mechanics and Thermodynamics of Continuous Media, Springer, Heidelberg, Germany (1997). C. A. Truesdell, S. Bharatha, The Concepts and Logic of Thermodynamics as a Theory of Heat Engines, Springer, Heidelberg, Germany (1977).
9. J. L. Lebowitz, I. Prigogine, D. Ruelle, Physica A 263, 516, 528, 540 (1999).
10. E. Fermi, Thermodynamics, Dover, New York (1956), p. 101.


FIGURE 4. HYPOTHETICALLY NONCOMPARABLE STATES. The graph shows the state space of a pair of identical, incompressible solids with the energies U1 and U2 as the only coordinates of the compound system. The states adiabatically accessible from X (yellow/orange) and Y (red/orange) are shown under the assumption that the only adiabatic changes consist in combinations of rubbing (increasing U1 or U2) and thermal equilibration (moving to the diagonal U1 = U2). In this example, adiabatic accessibility cannot be characterized by an entropy function, because neither a transformation from X to Y nor from Y to X is possible. The comparison hypothesis does not hold here. In the real world, however, it always holds.
