Dr Roger Bennett [email protected] Rm. 23 Xtn. 8559 Lecture 15


Page 1: Dr Roger Bennett R.A.Bennett@Reading.ac.uk Rm. 23 Xtn. 8559 Lecture 15

Dr Roger Bennett R.A.Bennett@Reading.ac.uk

Rm. 23 Xtn. 8559

Lecture 15

Page 2:

Free expansion

• No temperature change means no change in kinetic energy distribution.

• The only physical difference is that the atoms have more space in which to move.

• We may imagine that there are more ways in which the atoms may be arranged in the larger volume.

• Statistical mechanics takes this viewpoint and analyses how many different states are possible that give rise to the same macroscopic properties.

Page 3:

Statistical View

• The constraints on the system (U, V and n) define the macroscopic state of the system (macrostate).

• We need to know how many microscopic states (microstates or quantum states) satisfy the macrostate.

• A microstate for a system is one for which everything that can in principle be known is known.

• The number of microstates that give rise to a macrostate is called the thermodynamic probability, Ω, of that macrostate (alternatively the statistical weight W).

• The macrostate with the largest thermodynamic probability dominates.

• The essential assumption of statistical mechanics is that each microstate is equally likely.

Page 4:

Statistical View

• Boltzmann’s Hypothesis:

• The entropy is a function of the statistical weight or thermodynamic probability: S = φ(W)

• If we have two systems A and B with entropies S_A and S_B respectively, then we expect the total entropy of the two systems to be S_AB = S_A + S_B (entropy is extensive).

• Think about the probabilities:

• W_AB = W_A × W_B

• So S_AB = φ(W_A) + φ(W_B) = φ(W_AB) = φ(W_A W_B)

Page 5:

Statistical View

• Boltzmann’s Hypothesis:

• S_AB = φ(W_A) + φ(W_B) = φ(W_AB) = φ(W_A W_B)

• The only functions that behave like this are logarithms.

• S = k ln(W) Boltzmann relation

• The microscopic viewpoint thus interprets the increase in entropy for an isolated system as a consequence of the natural tendency of the system to move from a less probable to a more probable state.

Page 6:

Expansion of an ideal gas - microscopic

• Expansion of ideal gas contained in volume V.

• U and T are unchanged; no work is done and no heat flows.

• Entropy increases – what is the physical basis?

Page 7:

Expansion of an ideal gas - microscopic

• Split the volume into elemental cells of size ΔV.

• The number of ways of placing one atom in the volume is V/ΔV.

• The number of ways of placing n atoms is

– W = (V/ΔV)^n, so S = k ln W = nk ln(V/ΔV)

– Is this right? It depends on the size of ΔV.

Page 8:

Expansion of an ideal gas - microscopic

• Is this right? It depends on the size of ΔV.

• Yes – we only ever measure changes in entropy, and ΔV cancels:

• S_f − S_i = nk(ln(V_f/ΔV) − ln(V_i/ΔV)) = nk ln(V_f/V_i)

• Doubling the volume gives ΔS = nk ln 2 = NR ln 2 (n atoms, N moles)
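The doubling result can be checked with a short Python sketch. The choice of one mole of atoms is illustrative, not from the slide; k and N_A are the CODATA values.

```python
import math

# Entropy change for free expansion of an ideal gas: Delta S = n k ln(V_f/V_i).
# Illustrative numbers: n is taken as one mole of atoms (an assumption).
k = 1.380649e-23      # Boltzmann constant, J/K
N_A = 6.02214076e23   # Avogadro's number, 1/mol

n = N_A                          # number of atoms in one mole
delta_S = n * k * math.log(2)    # volume doubles: V_f/V_i = 2
print(delta_S)                   # equals R ln 2, about 5.76 J/K
```

This reproduces ΔS = nk ln 2 = NR ln 2 with N = 1 mole, since N_A k = R.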

Page 9:

Statistical mechanics

• We have seen that the entropy of a system is related to the probability of its state – entropy is a statistical phenomenon.

• To calculate thermal properties we must combine our knowledge of the particles that make up a system with statistical properties.

• Statistical mechanics starts from conceptually simple ideas but evolves into a powerful and general tool.

• The first and cornerstone concept is a clear understanding of probability.

Page 10:

Probability

• Two common versions

– Classical probability: the power to predict the likely outcome of an experiment.

– Statistical probability: by repeated measurement we can determine the probability of an experimental outcome by measuring its frequency of occurrence. This relies on the system being in equilibrium and on the existence of well-defined frequencies.

Page 11:

Classical Probability

• Classical probability

– We must determine all the possible outcomes and assign equal probabilities to each.

– Why equal probabilities? Surely not all outcomes are equal?

– We ensure this is the case by looking at the system in the finest possible detail, so that each outcome corresponds to a simple event.

– By definition, no further refinement would enable us to define the properties of the state in any finer detail. This is the microstate or quantum state of the system.

– We have already done this in the example above by boxing the atoms of a gas into small volumes ΔV.

Page 12:

Example of Classical Probability

• Heads and tails coin toss – best of 3.

• Possible outcomes:

– HHH, THH, HTH, HHT, TTH, THT, HTT, TTT

– Each one of these is a microstate.

• Probability of exactly two heads?

• 3 microstates have two heads out of a total of 8 possible outcomes, so the probability is 3/8.

• This is easy – we can count the number of microstates directly.
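Counting the microstates directly is easy to mimic in code; a minimal Python enumeration of the best-of-3 toss:

```python
from itertools import product

# Enumerate all 2^3 equally likely microstates of three coin tosses.
microstates = list(product("HT", repeat=3))
two_heads = [m for m in microstates if m.count("H") == 2]

print(len(microstates))                   # 8
print(len(two_heads))                     # 3 (THH, HTH, HHT)
print(len(two_heads) / len(microstates))  # 0.375 = 3/8
```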

Page 13:

Example of Classical Probability

• Heads and tails coin toss – best of 30.

– What is the probability of 30 heads?

– Only one microstate has 30 H, but how many microstates are possible in total? 2^30, about 10^9.

– The probability of an H on each toss is ½, so 30 tosses give P(30 H) = (½)^30.

– What is the probability of 5 heads and 25 tails?

– How many ways can this occur – how many microstates?

30! / (5! 25!) = 142,506

Page 14:

Example of Classical Probability

• Heads and tails coin toss – best of 30.

– What is the probability of 5 heads and 25 tails?

– How many ways can this occur – how many microstates?

30! / (5! 25!) = 142,506

– Each microstate is equally likely.

– Probability = 142,506 × (½)^30 ≈ 1.3×10^-4

• Prob. of 15 H and 15 T = 155,117,520 × (½)^30 ≈ 0.14
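The same counting for 30 tosses is easiest with binomial coefficients; a Python sketch of the probabilities on this slide:

```python
from math import comb

# Each of the 2^30 equally likely microstates has probability (1/2)^30;
# the number of microstates with exactly h heads is C(30, h).
total = 2 ** 30

print(comb(30, 5))           # 142506 microstates with 5 H, 25 T
print(comb(30, 5) / total)   # about 1.3e-4
print(comb(30, 15))          # 155117520 microstates with 15 H, 15 T
print(comb(30, 15) / total)  # about 0.14
```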

Page 15:

Microstates in a configuration

• Number of microstates in a configuration where the particles are distinguishable:

W = N! / (Π_i n_i!)

• Where N is the total number of particles/events/options etc.

• n_i is the number of particles in the ith distinct state.

• Π means product (cf. Σ for sum).

• E.g. how many distinct anagrams of STATISTICS?

W = 10! / (3! 3! 2!) = 50,400
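The multiplicity formula W = N!/(Π_i n_i!) can be checked directly in Python; the helper name `multiplicity` is mine:

```python
from math import factorial
from collections import Counter

def multiplicity(word):
    """W = N! / prod(n_i!) for the letter counts of `word`."""
    W = factorial(len(word))
    for n_i in Counter(word).values():
        W //= factorial(n_i)   # integer division is exact here
    return W

# STATISTICS: 10 letters with S x 3, T x 3, I x 2, A x 1, C x 1
print(multiplicity("STATISTICS"))  # 50400
```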

Page 16:

Dr Roger Bennett R.A.Bennett@Reading.ac.uk

Rm. 23 Xtn. 8559

Lecture 16

Page 17:

Microstates in a configuration

• Number of microstates in a configuration where the particles are distinguishable:

W = N! / (Π_i n_i!)

• Where N is the total number of particles/events/options etc.

• n_i is the number of particles in the ith distinct state.

• Π means product (cf. Σ for sum).

• E.g. how many distinct anagrams of STATISTICS?

W = 10! / (3! 3! 2!) = 50,400

Page 18:

Equilibrium

• Take an isolated system which is partitioned into two subsystems.

• U=U1+U2

• V=V1+V2

• N=N1+N2

• The statistical weight W of the entire system (total number of microstates) is the product of the weights of the subsystems.

• W(U,V,N,U1,V1,N1)=W1(U1V1N1) × W2(U2V2N2)

• S(U,V,N,U1,V1,N1)=S1(U1V1N1) + S2(U2V2N2)

[Diagram: two subsystems (U1, V1, N1) and (U2, V2, N2) separated by a partition]

Page 19:

Equilibrium

• S(U,V,N,U1,V1,N1)=S1(U1V1N1) + S2(U2V2N2)

• Now let us exchange energy through the wall (same methodology works for exchange of particles or volume).

• Use Clausius Entropy principle (at equilibrium entropy is a maximum) – independent variable is U1 holding all other terms fixed.

dS = (∂S1/∂U1)_{V1,N1} dU1 + (∂S2/∂U2)_{V2,N2} dU2 = 0

Since dU2 = −dU1:

[(∂S1/∂U1)_{V1,N1} − (∂S2/∂U2)_{V2,N2}] dU1 = 0

Page 20:

Equilibrium

• This is the condition for thermal equilibrium – the two subsystems must be at the same temperature.

• We can now define an absolute temperature for each subsystem i. At equilibrium all subsystems are at the same temperature.

dS = (∂S1/∂U1)_{V1,N1} dU1 + (∂S2/∂U2)_{V2,N2} dU2 = 0

⇒ (∂S1/∂U1)_{V1,N1} = (∂S2/∂U2)_{V2,N2}

(∂Si/∂Ui)_{Vi,Ni} = 1/Ti

Page 21:

Example – The Schottky Defect

• At absolute zero all atoms in a crystal are perfectly ordered on a crystal lattice.

• Raising the temperature introduces point defects

• Schottky defects are atoms displaced from the lattice that end up on the surface leaving vacancies

Page 22:

Schottky Defect

• What is the concentration of defects in a crystal at thermal equilibrium?

• Creation of a defect costs energy ε.

• The energy U associated with n of these defects is U = nε.

• Assumptions? Defects are dilute and so do not interact.

• We can now use our understanding of probability to investigate the configurational entropy.

Page 23:

Schottky Defects

• Configurational entropy – how many ways to distribute n defects in a crystal of N atoms?

• We can calculate the number of microstates!

W = N! / (Π_i n_i!)

W = N! / ((N − n)! n!)

S(n) = k ln W = k ln [N! / ((N − n)! n!)]

Page 24:

Schottky Defects

• At equilibrium, and remembering U = nε:

• Leaves us with just the differential of the entropy to calculate. As crystals have large numbers we can approximate the factorial functions with Stirling’s formula.

1/T = (∂S/∂U) = (∂S(n)/∂n)(dn/dU) = (1/ε)(∂S(n)/∂n)

ln N! ≈ N ln N − N    (Stirling's formula)

Page 25:

Schottky Defects

1/T = (1/ε)(∂S(n)/∂n)

ln N! ≈ N ln N − N

S(n) = k ln W = k ln [N! / ((N − n)! n!)]

S(n) = k[N ln N − (N − n) ln(N − n) − n ln n]

∂S(n)/∂n = k[ln(N − n) − ln n]

1/T = (k/ε) ln((N − n)/n)

Page 26:

Schottky Defects

1/T = (k/ε) ln((N − n)/n)

n/N = 1 / (e^{ε/kT} + 1)

n ≈ N e^{−ε/kT}    (for ε ≫ kT)

• For typical values of ε = 1 eV, and kT at room temperature ~ 1/40 eV, we find the density of Schottky defects to be n/N ≈ e^{−40} ≈ 10^{−17}. At 1000 K, n/N ~ 10^{−6}.
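A sketch of the defect fraction at the two temperatures quoted, using the exact expression n/N = 1/(e^{ε/kT} + 1) with ε = 1 eV as on the slide; the function name `defect_fraction` is mine:

```python
import math

k_eV = 8.617333262e-5   # Boltzmann constant in eV/K
eps = 1.0               # defect formation energy in eV, from the slide

def defect_fraction(T):
    # Exact result n/N = 1/(exp(eps/kT) + 1);
    # for eps >> kT this reduces to exp(-eps/kT).
    return 1.0 / (math.exp(eps / (k_eV * T)) + 1.0)

print(defect_fraction(300.0))    # ~1.6e-17 at room temperature
print(defect_fraction(1000.0))   # ~9e-6 at 1000 K
```

The slide's e^{−40} comes from rounding kT at room temperature to 1/40 eV.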

Page 27:

Dr Roger Bennett R.A.Bennett@Reading.ac.uk

Rm. 23 Xtn. 8559

Lecture 17

Page 28:

Equilibrium for an isolated system

• Take an isolated system which is partitioned into two subsystems.

• U=U1+U2

• V=V1+V2

• N=N1+N2

• The statistical weight W of the entire system (total number of microstates) is the product of the weights of the subsystems.

• W(U,V,N,U1,V1,N1)=W1(U1V1N1) × W2(U2V2N2)

• S(U,V,N,U1,V1,N1)=S1(U1V1N1) + S2(U2V2N2)

[Diagram: two subsystems (U1, V1, N1) and (U2, V2, N2) separated by a partition]

Page 29:

Equilibrium for an isolated system

• S(U,V,N,U1,V1,N1)=S1(U1V1N1) + S2(U2V2N2)

• Now let us allow the walls to move thus changing volumes to maximise entropy (same methodology as before).

• Use Clausius Entropy principle (at equilibrium entropy is a maximum) – independent variable is V1 holding all other terms fixed.

dS = (∂S1/∂V1)_{U1,N1} dV1 + (∂S2/∂V2)_{U2,N2} dV2 = 0

Since dV2 = −dV1:

[(∂S1/∂V1)_{U1,N1} − (∂S2/∂V2)_{U2,N2}] dV1 = 0

Page 30:

Equilibrium for an isolated system

• This is the condition for equilibrium – the two subsystems must be at the same pressure as the wall has moved to maximise entropy.

• We can now define pressure:

(∂S1/∂V1)_{U1,N1} = (∂S2/∂V2)_{U2,N2}

Pi = Ti (∂Si/∂Vi)_{Ui,Ni}

Page 31:

Equilibrium for an isolated system

• S(U,V,N,U1,V1,N1)=S1(U1V1N1) + S2(U2V2N2)

• Now let us exchange particles through the wall (same methodology as before)

• Use Clausius Entropy principle (at equilibrium entropy is a maximum) – independent variable is N1 holding all other terms fixed.

dS = (∂S1/∂N1)_{U1,V1} dN1 + (∂S2/∂N2)_{U2,V2} dN2 = 0

Since dN2 = −dN1:

[(∂S1/∂N1)_{U1,V1} − (∂S2/∂N2)_{U2,V2}] dN1 = 0

Page 32:

Equilibrium for an isolated system

• This is the condition for particle equilibrium – the two subsystems must have no driving force to change particle numbers.

• We can now define the driving force to exchange particles as the Chemical Potential:

(∂S1/∂N1)_{U1,V1} = (∂S2/∂N2)_{U2,V2}

μi = −Ti (∂Si/∂Ni)_{Ui,Vi}

Page 33:

Equilibrium for system in a heat bath

• Take a system which is partitioned into two subsystems – the same problem as before, so far.

• U0 = U + UR

• The combined system is again totally isolated and we assume T,V,N describe the macrostate of the system.

• The system will possess a discrete set of microstates, however, which we could group and label according to the energy of that microstate.

[Diagram: system (U, V, N) embedded in a reservoir (TR, VR, NR)]

Page 34:

Equilibrium for system in a heat bath

• By grouping the microstates by energy we can associate a statistical weight with each energy level, i.e. U1 < U2 < U3 < U4 … < Ur.

• The total energy of the composite system is conserved so U0=U+UR.

[Diagram: system (U, V, N) in a heat bath at temperature T]

• The probability of finding our system with U = Ur

must be proportional to the number of microstates associated with the reservoir having energy UR = U0-Ur.

• pr = const × W(U0-Ur) (all volumes and particle numbers constant)

Page 35:

Equilibrium for system in a heat bath

• pr = const × W(U0 − Ur)

• The constant of proportionality must just depend upon all the available microstates and so can be properly normalised:

• We can also write W(U0-Ur) in terms of entropy:

pr = W(U0 − Ur) / Σ_r W(U0 − Ur)

W(U0 − Ur) = e^{S(U0 − Ur)/k}

pr = const × W(U0 − Ur) = const × e^{S(U0 − Ur)/k}

Page 36:

Equilibrium for system in a heat bath

• So far we haven’t used the fact that the reservoir is a heat bath. Its energy U0 >> Ur, our system's energy.

• This is not true for all states r, but it is true for all the overwhelmingly likely states!

• We expand S(U0 − Ur) as a Taylor series:

(1/k) S(U0 − Ur) = (1/k) S(U0) − (Ur/k) ∂S(U0)/∂U + (Ur²/2k) ∂²S(U0)/∂U² − …

pr = const × W(U0 − Ur) = const × e^{S(U0 − Ur)/k}

Page 37:

Equilibrium for system in a heat bath

(1/k) S(U0 − Ur) = (1/k) S(U0) − (Ur/k) ∂S(U0)/∂U + (Ur²/2k) ∂²S(U0)/∂U² − …

• The first term is simple• The second term is related to the temperature as

before through:

• The third term therefore describes changes in temperature of a heat bath due to temperature exchange with the system. By definition this and higher terms must be negligible. We keep terms up to linear in Ur.

iNVi

i

TU

S

ii

1

,

Page 38:

Equilibrium for system in a heat bath

(1/k) S(U0 − Ur) ≈ (1/k) S(U0) − Ur/kT

pr = const × e^{S(U0 − Ur)/k} = const × e^{S(U0)/k} e^{−Ur/kT} = (1/Z) e^{−Ur/kT}

Z = Σ_r e^{−Ur/kT}

Page 39:

The Boltzmann Distribution

• This is the Boltzmann distribution and gives “the probability that a system in contact with a heat bath at temperature T should be in a particular state”.

• The only property of the heat bath on which it depends is the temperature.

• The function Z is called the partition function of the system. It is fundamental to the study of systems at fixed temperature.

pr = (1/Z) e^{−Ur/kT}

Page 40:

Dr Roger Bennett R.A.Bennett@Reading.ac.uk

Rm. 23 Xtn. 8559

Lecture 18

Page 41:

The Boltzmann Distribution

• This is the Boltzmann distribution and gives “the probability that a system in contact with a heat bath at temperature T should be in a particular state”.

• r labels all the states of the system. At low temperature only the lowest states have any chance of being occupied. As the temperature is raised higher lying states become more and more likely to be occupied.

• In this case, in contact with the heat bath, all the microstates are therefore not equally likely to be populated.

pr = (1/Z) e^{−Ur/kT},    Z = Σ_r e^{−Ur/kT}

Page 42:

The Boltzmann Distribution – Example

• Take a very simple system that has only three energy levels each corresponding to one microstate (non-degenerate).

• The energies of these states are:

– U1 = 0 J, U2 = 1.4×10^-23 J, U3 = 2.8×10^-23 J

• If the heat bath has a temperature of 2 K (kT ≈ 2.8×10^-23 J):

• Z = e^0 + e^{-1/2} + e^{-1} = 1.9744

• The probabilities of being in each state are p1 = 0.506, p2 = 0.307 and p3 = 0.186.

pr = (1/Z) e^{−Ur/kT},    Z = Σ_r e^{−Ur/kT}
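The slide's numbers follow from a direct evaluation. Note the slide rounds kT at 2 K to 2.8×10^-23 J (equivalently k ≈ 1.4×10^-23 J/K) so that the exponents come out as exactly −1/2 and −1; the sketch below uses the same rounding:

```python
import math

k = 1.4e-23                   # J/K, rounded as on the slide so U2/kT = 1/2 exactly
T = 2.0                       # heat bath temperature, K
U = [0.0, 1.4e-23, 2.8e-23]   # energies of the three non-degenerate states, J

weights = [math.exp(-u / (k * T)) for u in U]   # Boltzmann factors
Z = sum(weights)                                # partition function
p = [w / Z for w in weights]                    # Boltzmann probabilities

print(round(Z, 4))                  # 1.9744
print([round(pi, 3) for pi in p])   # [0.506, 0.307, 0.186]
```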

Page 43:

The Boltzmann Distribution

• Usually there are huge numbers of microstates that can all have the same energy. This is called degeneracy.

• In this case we can do our summations above over each individual energy level rather than sum over each individual microstate.

• The summation is now over all the different energies Ur and g(Ur) is the number of states possessing the energy Ur. The probability is that of finding the system with energy Ur.

pr = (1/Z) e^{−Ur/kT},    Z = Σ_r e^{−Ur/kT}

p(Ur) = (1/Z) g(Ur) e^{−Ur/kT},    Z = Σ_{Ur} g(Ur) e^{−Ur/kT}

Page 44:

Entropy in ensembles

• Our system embedded in a heat bath is called a canonical ensemble (our isolated system on its own from Lecture 16 is termed a microcanonical ensemble).

• When isolated the microcanonical ensemble has a defined internal energy so that the probability of finding a system in a particular microstate is the same as any other microstate.

• In a heat bath the energy of the system fluctuates and the probability of finding any particular microstate is not equal. Can we now calculate the entropy for such a system and hence derive thermodynamic variables from statistical properties?

Page 45:

Entropy in the canonical ensemble

• Embed our system in a heat bath made up of (M − 1) replica subsystems to the one we're interested in.

• Each subsystem may be in one of many microstates. The number of subsystems in the ith microstate is n_i.

• The number of ways of arranging n_1 systems in microstate 1, n_2 systems in microstate 2, n_3 … is:

W = M! / (Π_i n_i!)

Page 46:

Entropy in the canonical ensemble

• If we make M huge, so that all the n_i are also large, then we can (eventually) use Stirling's approximation in calculating the entropy S_M for the entire ensemble of M systems:

W = M! / (Π_i n_i!)

S_M = k ln W = k ln M! − k Σ_i ln n_i!

Page 47:

Entropy in the canonical ensemble

S_M = k ln M! − k Σ_i ln n_i!

S_M = k[M ln M − M − Σ_i (n_i ln n_i − n_i)]    (Stirling)

S_M = k[M ln M − Σ_i n_i ln n_i]    (since Σ_i n_i = M)

S_M = −k Σ_i n_i ln(n_i/M)    (using M ln M = Σ_i n_i ln M)

S_M = −kM Σ_i (n_i/M) ln(n_i/M)

Page 48:

Entropy in the canonical ensemble

S_M = −kM Σ_i (n_i/M) ln(n_i/M)

• As M becomes very large, the ratios n_i/M tend to the probability p_i of finding the subsystem in state i.

• S_M is the entropy for the ensemble of all the subsystems. But we know that entropy is extensive and scales with the size of the system, so the entropy per system is:

S = S_M / M = −k Σ_i p_i ln p_i

Page 49:

Entropy in the canonical ensemble

• This is the general definition of entropy and holds even if the probabilities of each individual microstate are different.

• If all microstates are equally probable pi= 1/W (microcanonical ensemble)

• Which brings us nicely back to the Boltzmann relation

S = −k Σ_i p_i ln p_i

S = −k Σ_i (1/W) ln(1/W) = −k W (1/W) ln(1/W) = k ln W
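The reduction to the Boltzmann relation is easy to verify numerically; W = 1024 is an arbitrary illustrative choice and the helper name `gibbs_entropy` is mine:

```python
import math

k = 1.380649e-23   # Boltzmann constant, J/K

def gibbs_entropy(probs):
    # General definition S = -k sum_i p_i ln p_i (terms with p_i = 0 contribute 0)
    return -k * sum(p * math.log(p) for p in probs if p > 0)

W = 1024                    # number of equally likely microstates
uniform = [1.0 / W] * W     # microcanonical case: p_i = 1/W
S = gibbs_entropy(uniform)

print(S / k)                # ln 1024 ~ 6.93, i.e. S = k ln W
```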

Page 50:

Entropy in the canonical ensemble

• The general definition of entropy, in combination with the Boltzmann distribution allows us to calculate real properties of the system.

S = −k Σ_i p_i ln p_i

p_i = (1/Z) e^{−U_i/kT},    Z = Σ_i e^{−U_i/kT}

ln p_i = −U_i/kT − ln Z

S = −k Σ_i p_i ln p_i = k Σ_i p_i (U_i/kT + ln Z)

TS = Σ_i p_i U_i + kT ln Z Σ_i p_i = Ū + kT ln Z

Page 51:

Helmholtz Free Energy

• Ū is the average value of the internal energy of the system.

• (Ū – TS) is the average value of the Helmholtz free energy, F. This is a function of state that we briefly mentioned in earlier lectures. It is central to statistical mechanics.

• The Partition function Z has appeared in our result –it seems to be much more than a mere normalising factor. Z acts as a bridge linking the microscopic world of microstates (quantum states) to the free energy and hence to all the large scale properties of a system.

S = Ū/T + k ln Z    ⇒    Ū − TS = −kT ln Z
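The identity Ū − TS = −kT ln Z (i.e. the average Helmholtz free energy F) can be checked numerically for any Boltzmann distribution; here a two-level system with illustrative energies of my own choosing:

```python
import math

k = 1.380649e-23     # Boltzmann constant, J/K
T = 300.0            # heat bath temperature, K
U = [0.0, 5.0e-21]   # two energy levels in J (illustrative values)

weights = [math.exp(-u / (k * T)) for u in U]
Z = sum(weights)                  # partition function
p = [w / Z for w in weights]      # Boltzmann probabilities

U_bar = sum(pi * ui for pi, ui in zip(p, U))   # average internal energy
S = -k * sum(pi * math.log(pi) for pi in p)    # Gibbs entropy
F = U_bar - T * S                              # Helmholtz free energy

print(F)
print(-k * T * math.log(Z))   # the two agree: F = -kT ln Z
```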