Entropy and the 2nd Law of Thermodynamics

Upload: hye
Post on 23-Feb-2016
Category: Documents

DESCRIPTION

Entropy and the 2nd Law of Thermodynamics - PowerPoint PPT Presentation

TRANSCRIPT

Page 1: Entropy and the 2nd Law of Thermodynamics

Entropy and the 2nd Law of Thermodynamics

If you do a Google search on the term “entropy” or on the phrase “2nd Law of Thermodynamics”, you will get millions of hits… but they won’t all say the same thing! There are several different ways to state these ideas; they sound so different from each other, but they’re all related.

Page 2

Flipping a Fair Coin

If we toss two fair coins, there are four possible outcomes:

It will be easier to list these outcomes with letters (H means “heads” and T means “tails”):

TT TH HT HH

These four outcomes are equally likely with fair coins. That's true whether we flip one coin twice, or two coins at the same time and then arrange them in some sequential order.

Page 3

Now we flip 3 fair coins

When we list the possible outcomes, we see that there are eight (8):

TTT TTH THT HTT THH HTH HHT HHH

We'll be exploring the outcomes with larger numbers of coins, so it will be convenient for us to categorize these outcomes. We'll categorize them by the number of heads:

Zero heads:  TTT (1 way)
One head:    TTH THT HTT (3 ways)
Two heads:   THH HTH HHT (3 ways)
Three heads: HHH (1 way)

Page 4

Now we flip 4 fair coins

Zero heads:  TTTT (1 way)
One head:    TTTH TTHT THTT HTTT (4 ways)
Two heads:   TTHH THTH THHT HTTH HTHT HHTT (6 ways)
Three heads: THHH HTHH HHTH HHHT (4 ways)
Four heads:  HHHH (1 way)

The 16 possible outcomes are categorized above, and I've also included the number of specific outcomes in each category. We've seen these numbers before… they're the rows of Pascal's Triangle! It's easy to calculate the numbers for each category – just use the "combination" function:

4C0  4C1  4C2  4C3  4C4
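These combination values can be checked directly; a minimal sketch in Python using the standard library's math.comb:

```python
from math import comb

# Number of specific outcomes (ways) in each category when flipping 4 fair
# coins, categorized by the number of heads: 4C0, 4C1, 4C2, 4C3, 4C4.
ways = [comb(4, k) for k in range(5)]
print(ways)  # [1, 4, 6, 4, 1], a row of Pascal's Triangle
```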

Page 5

Entropy and the 2nd Law of Thermodynamics

“Entropy” is just the number of specific outcomes for a given category.

These numbers are shown in the table below, and are plotted on the vertical axis of the bar chart at right.

[Bar chart: "Flipping 10 Fair Coins"; number of specific outcomes (0 to 300) vs. categories: proportion of flips with heads facing up (0 to 1)]

Number of specific outcomes, for 0 through 10 heads:

1  10  45  120  210  252  210  120  45  10  1

The 2nd Law of Thermodynamics simply says that the system is most likely to be observed in the category having the largest entropy. Duh!
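The 10-coin counts can be reproduced the same way; a small Python sketch:

```python
from math import comb

n = 10
# "Entropy" here = number of specific outcomes in each category (0..10 heads)
entropy = [comb(n, k) for k in range(n + 1)]
print(entropy)       # [1, 10, 45, 120, 210, 252, 210, 120, 45, 10, 1]
print(max(entropy))  # 252: the 5-heads category (proportion 0.5) is likeliest
```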

Page 6

Switching to Probabilities

[Bar chart: "Flipping 10 Fair Coins"; number of specific outcomes (0 to 300) vs. proportion of coins with heads facing up (0 to 1)]

[Bar chart: "Probability Distribution for 10 Fair Coins"; probability (0.00 to 0.30) vs. proportion of coins with heads facing up (0 to 1)]

So, we can use the combination function to calculate the number of specific outcomes in each category. The total number of all outcomes is given by 2^n, where n is the number of coins flipped.

Therefore, it is also easy to turn these counts into the probability of observing each category. Thus,

P(H = 0.4) = 210/1024 ≈ 0.205, for example.

If we list the resulting probability for each category, we have something called a “probability distribution”. The following slides show probability distributions for increasing numbers of coins.
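That calculation can be sketched in a few lines of Python (the 210/1024 example corresponds to dist[4]):

```python
from math import comb

n = 10
total = 2 ** n  # 1024 equally likely specific outcomes
# Probability of each category (k heads, i.e. proportion k/n of heads)
dist = [comb(n, k) / total for k in range(n + 1)]

print(dist[4])   # 210/1024 = 0.205078125
print(sum(dist)) # 1.0, since the probabilities cover every outcome
```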

Page 7

[Bar chart: "10 Fair Coins"; probability (0.00 to 0.30) vs. proportion of coins with heads facing up (0 to 1)]

Page 8

[Bar chart: "50 Fair Coins"; probability (0.00 to 0.12) vs. proportion of coins with heads facing up (0 to 1)]

Page 9

[Bar chart: "100 Fair Coins"; probability (0 to 0.09) vs. proportion of coins with heads facing up (0 to 1)]

Page 10

[Bar chart: "200 Fair Coins"; probability (0.00 to 0.06) vs. proportion of coins with heads facing up (0 to 1)]

Page 11

[Bar chart: "500 Fair Coins"; probability (0 to 0.04) vs. proportion of coins with heads facing up (0 to 1)]

Page 12

[Bar chart: "1,000 Fair Coins"; probability (0 to 0.03) vs. proportion of coins with heads facing up (0 to 1)]
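The trend across these charts can be quantified; a minimal sketch computing the peak probability (at proportion 0.5) for each number of coins:

```python
from math import comb

# The peak of the distribution (at proportion 0.5) shrinks as n grows,
# while the distribution of the proportion of heads narrows around 0.5.
ns = [10, 50, 100, 200, 500, 1000]
peaks = [comb(n, n // 2) / 2 ** n for n in ns]
for n, peak in zip(ns, peaks):
    print(n, round(peak, 4))  # peaks shrink roughly like 1/sqrt(n)
```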

Page 13

Evolution of the Coin System

[Line chart: "Sequential Coin Flips"; running probability total (0 to 1) vs. number of coin flips (0 to 60)]

Alternatively, we could flip the coins one at a time and keep track of the evolving running proportion of heads. Each coin flip has a random outcome, yet after many flips, the system will evolve toward its most likely outcome. Also, the variation away from the equilibrium value (i.e. the zig-zags in the graph) will shrink away after many flips.
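A sequential-flip run like the one charted can be simulated; a minimal sketch (the seed and the flip count are arbitrary choices, made only so the run repeats):

```python
import random

random.seed(1)  # arbitrary seed so the run is repeatable
heads = 0
running = []  # running proportion of heads after each flip
for flips in range(1, 1001):
    heads += random.randint(0, 1)  # 1 = heads, 0 = tails
    running.append(heads / flips)

# Early flips zig-zag; after many flips the proportion settles near 0.5
print(running[9], running[-1])
```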

Page 14

Evolution of the Peg and Washer System

[Line chart: "Entropy Simulation"; entropy (0 to 90) vs. number of collisions (0 to 1400)]

We began with 160 units of energy uniformly distributed among 40 particles, and this system quickly evolved to a more likely distribution of energy. With only 40 particles in our system, the variation in entropy persists, even at equilibrium… with a mole of particles, any variation would be undetectable.
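The peg-and-washer board itself isn't reproduced here, but the same idea can be sketched: 40 particles share 160 energy units, each random "collision" moves one unit between particles, and entropy is tracked as ln W, the log of the number of specific arrangements matching the observed energy distribution. The collision rule and the ln W scale are my assumptions, not claims about the slide's chart:

```python
import math
import random
from collections import Counter

def log_multiplicity(energies):
    # ln W, where W = N! / (n_0! * n_1! * ...) counts the specific ways the
    # particles could be arranged into the observed energy categories
    log_w = math.lgamma(len(energies) + 1)
    for count in Counter(energies).values():
        log_w -= math.lgamma(count + 1)
    return log_w

random.seed(0)  # arbitrary seed so the run is repeatable
N, E = 40, 160
energies = [E // N] * N                # 160 units spread uniformly: 4 each
s_start = log_multiplicity(energies)   # 0.0: one way to be perfectly uniform

for _ in range(1400):                  # 1400 random collisions
    i, j = random.randrange(N), random.randrange(N)
    if energies[i] > 0:                # one unit hops from particle i to j
        energies[i] -= 1
        energies[j] += 1

s_end = log_multiplicity(energies)
print(s_start, round(s_end, 1))        # entropy rises as the system evolves
```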

Page 15

View of a Typical Example of an Evolved System

The image at left shows an entropy board after 1450 collisions. Describe the distribution of energy in this system.

Page 16

Discussion Questions

1) If I toss 10 fair coins, which of the following specific outcomes is more likely: HHHHHHHHHH or HTHHTHTTTH?

2) The first example above is from the category H=1.0 while the second is from the category H=0.5. Which category are we more likely to observe if we toss 10 coins?

3) Let’s say I toss 10 coins and then arrange them in some order. Then I tell you only which of the 11 categories I’ve got, but ask you to guess the specific order of heads and tails. Which category would give you the lowest chance of success? This is meant to show why the best definition may be, “Entropy is a measure of the amount of missing information”. With this new definition, restate the 2nd Law of Thermodynamics.

4) Using the trend observed in slides 7 – 12, what would the probability distribution look like if we tossed one billion coins?

5) Evaluate this statement from the new Next Generation Science Standards: “Uncontrolled systems always evolve toward more stable states—that is, toward more uniform energy distribution”.