The Statistical Mechanics View of Entropy

Goal

By shaking a box of wiggle-eyes, which can be either “eyes-up” or “eyes-down,” you will investigate the statistical distribution of microstates that can be randomly generated. You will calculate the entropy associated with the various macrostates, and will show that the most likely macrostate is the one with the maximum entropy.

Microstate: A specific pattern of “eyes-up” and “eyes-down.”
Macrostate: A specific total of “eyes-up” (or, equivalently, “eyes-down”).

Despite the length of this document and the appearance of a lot of math, this is actually a straightforward lab. Equations (2), (6), and (14) in the tutorial are the key equations here.

Equipment

(1) A plastic organizer box with 18 compartments (sprayed internally with an anti-static coating).
(2) 18 small plastic wiggle-eyes, one per compartment.

Reference

Young and Freedman, Section 20.8.

Procedure

The experiment is simple but repetitive.

(1) Shake the 18-compartment box vigorously for about 1–2 seconds and set it down flat on the table.

(2) If any of the wiggle-eyes are not flat (i.e. tilted on their side), tap the box until all 18 wiggle-eyes are flat (i.e. either looking up or down).

(3) Record the particular microstate (i.e. the sequence of 18 up/downs) that you have just created. For example, if a wiggle-eye is up, record a “1”; if down, record a “0”.

Data recording will be done in the Google spreadsheet accessible here when you are logged into Google services with your ASU network ID. One person should shake the box, and another should enter the results in the spreadsheet. One possible method is for the person who shook the box to read back the result in threes, i.e. “one – one – zero,” “one – zero – zero,” etc. The person at the computer can enter these rapidly using the “1”, “0”, and right-arrow keys. The macrostate n that a microstate belongs to can be found by summing its row of 1’s and 0’s. Your group will enter exactly 100 rows (microstates). For your group’s dataset of 100 results, answer the following questions:

(1) How many microstates are possible? How many microstates are there with 9 “up” wiggle-eyes (use Eq. 1)? How many microstates are there with all 18 wiggle-eyes “up”?
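As a quick check on these counts, here is a minimal Python sketch using only the standard library (the handout does not prescribe a software tool, so the choice of Python is an assumption; N = 18 comes from the lab):

```python
# Counting microstates and macrostates for N = 18 wiggle-eyes.
import math

N = 18
total_microstates = 2 ** N          # each eye is independently up or down
w_9 = math.comb(N, 9)               # microstates with exactly 9 eyes up (Eq. 1)
w_18 = math.comb(N, 18)             # microstates with all 18 eyes up
total_macrostates = N + 1           # n can be 0, 1, ..., 18

print(total_microstates)            # 262144
print(w_9)                          # 48620
print(w_18)                         # 1
print(total_macrostates)            # 19
```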

(2) How many macrostates are possible? Did your experiment “visit” all of them?

(3) Plot a histogram of your macrostates. You should also calculate the mean value n̄ for your experiment.
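A minimal sketch of the Question 3 analysis, assuming you have copied your group’s 100 macrostate values (the row sums n) out of the spreadsheet into a Python list; the variable name `macrostates` and the sample values below are placeholders:

```python
# Histogram and mean of the recorded macrostates.
import matplotlib.pyplot as plt

macrostates = [9, 11, 8, 10, 7, 9]  # placeholder: use your 100 recorded n values

mean_n = sum(macrostates) / len(macrostates)
print(f"mean n = {mean_n:.2f}")

# One bin per possible macrostate n = 0..18, centered on the integers.
plt.hist(macrostates, bins=[n - 0.5 for n in range(20)], edgecolor="black")
plt.xlabel("macrostate n (number of eyes up)")
plt.ylabel("count")
plt.show()
```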

Page 2: The Statistical Mechanics View of Entropy

Page 2 of 6

(4) Fit your data to a Gaussian distribution with mean μ and variance σ², namely¹

$$ n(\mu, \sigma^2) \sim \frac{1}{\sqrt{2\pi\sigma^2}} \exp\left(-\frac{(x-\mu)^2}{2\sigma^2}\right) $$

and report μ ± δμ and σ ± δσ with units.

(5) Compare your result to Eq. 6 to find u. Calculate P(n, N) using Eqs. 2 and 6. How well does the approximation (Eq. 6) agree with the accurate result (Eq. 2)? How does each compare with the probability we might expect if we used a coin instead of wiggle-eyes?
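One way to carry out the fit and the comparison is sketched below, assuming numpy and scipy are available; the data list is a placeholder, and parametrizing the fit by σ rather than σ² is a convenience choice:

```python
# Fit the normalized macrostate histogram to the Gaussian of Question 4,
# then compare Eq. 2 (exact binomial) with Eq. 6 (Gaussian approximation)
# for the fair-coin case u = d = 0.5 (Question 5).
import numpy as np
from math import comb
from scipy.optimize import curve_fit

macrostates = [9, 11, 8, 10, 7, 9]      # placeholder: your 100 recorded n values

def gaussian(x, mu, sigma):
    return np.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / np.sqrt(2 * np.pi * sigma ** 2)

n_vals = np.arange(19)
freqs = np.array([macrostates.count(n) for n in n_vals]) / len(macrostates)

popt, pcov = curve_fit(gaussian, n_vals, freqs, p0=[9.0, 2.1])
mu, sigma = popt
dmu, dsigma = np.sqrt(np.diag(pcov))    # 1-sigma uncertainties from the fit
print(f"mu = {mu:.2f} +/- {dmu:.2f},  sigma = {sigma:.2f} +/- {dsigma:.2f}")

N, u, d = 18, 0.5, 0.5
for n in n_vals:
    exact = comb(N, n) * u**n * d**(N - n)            # Eq. 2
    approx = gaussian(n, u * N, np.sqrt(u * d * N))   # Eq. 6
    print(n, f"{exact:.4f}  {approx:.4f}")
```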

(6) How does μ compare to σ²? Does this make sense? Explain.

Your data will be combined with the results of the other groups at the end of the lab to create a superset of data. You are to repeat your analysis (questions 1–6) on this superset and compare your group’s results with those for the superset. Using the superset results:

(7) How does the superset data differ from your group’s data? If there are differences, what do you think are the possible explanations?

(8) Calculate the nominal value of $\Delta E/k_B T$ for these wiggle-eyes using Eq. 14. If there were no bias between the “up” and “down” states, we would expect $\Delta E/k_B T$ to be zero. Is $\Delta E/k_B T$ significantly different from zero for the superset of data? Discuss reasons why it might differ from zero for the wiggle-eyes. If we assume that T = 300 K, what value would $\Delta E$ have for our wiggle-eye system? Is this meaningful with respect to thermal fluctuations?
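Eq. 14 can be inverted by hand: since n̄/N = u and d = 1 − u, Eq. 14 gives $\Delta E/k_B T = \ln\big((N-\bar{n})/\bar{n}\big)$. A short sketch of the Question 8 arithmetic, where the superset mean shown is a made-up placeholder:

```python
# Invert Eq. 14 to estimate Delta E / (kB T) from the superset mean.
import math

N = 18
n_bar = 9.3                       # placeholder: substitute the superset mean
u = n_bar / N
d = 1 - u

x = math.log(d / u)               # x = Delta E / (kB T)
print(f"Delta E / (kB T) = {x:+.3f}")

kB = 1.380649e-23                 # Boltzmann constant, J/K
delta_E = x * kB * 300            # energy gap if T = 300 K
print(f"Delta E = {delta_E:.3e} J at T = 300 K")
```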

(9) Use Eqs. 12 and 13 to calculate the probability of an “up” state, u, as well as the probability of a “down” state, d, from the superset of data.

(10) What happens to the probability of an “up” state, u(T), as temperature² T → 0 and T → ∞? Repeat this calculation for the “down” state, d(T). How would you explain this heuristically in terms of energy?

(11) Plot u(T) and d(T) on the same axes. Suppose the system changes phase (freezing ↔ melting) when the probability of finding a “down” state reaches 60%. At what temperature does this phase transition occur? Would it occur at room temperature, T = 300 K?
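A sketch of one way to make this plot and locate the 60% crossing, assuming numpy/matplotlib and a value of ΔE carried over from Question 8 (the value used here is a placeholder):

```python
# Plot u(T) and d(T) from Eqs. 12-13 and find where d(T) = 0.6.
import numpy as np
import matplotlib.pyplot as plt

kB = 1.380649e-23
delta_E = 1.0e-21                     # placeholder: use your Eq. 14 result (J)

T = np.linspace(1, 1000, 500)         # start at 1 K to avoid dividing by zero
boltz = np.exp(-delta_E / (kB * T))
u = boltz / (1 + boltz)               # Eq. 12
d = 1 / (1 + boltz)                   # Eq. 13

plt.plot(T, u, label="u(T)")
plt.plot(T, d, label="d(T)")
plt.axhline(0.6, linestyle="--", label="d = 60%")
plt.xlabel("T (K)"); plt.ylabel("probability"); plt.legend(); plt.show()

# Setting d(T*) = 0.6 in Eq. 13 gives exp(-dE/kB T*) = 2/3,
# so T* = dE / (kB ln(3/2))  (real and positive only if dE > 0).
T_star = delta_E / (kB * np.log(1.5))
print(f"T* = {T_star:.0f} K")
```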

(12) Calculate the entropy associated with each macrostate by inserting your values for u and d into Eq. 8, and then using Eq. 9. Which macrostate has the maximum entropy associated with it? Is this what one would expect? Why or why not?

(Extra Credit) How many macrostates does the superset “visit”? Estimate how many shakes of the box would be needed in order to visit every possible microstate with 50% certainty. What about absolute (100%) certainty³?
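For a rough feel for the 50% estimate (not the careful confidence-limit calculation that footnote 3 points to), one can assume u = d = ½ so that all M = 2¹⁸ microstates are equally likely, and use the standard coupon-collector approximation P(all M seen after k shakes) ≈ exp(−M e^(−k/M)):

```python
# Coupon-collector estimate: shakes needed to see every microstate
# with 50% probability, assuming all 2^18 microstates equally likely.
import math

M = 2 ** 18
# Solve exp(-M * exp(-k/M)) = 0.5 for k:
k_half = M * math.log(M / math.log(2))
print(f"~{k_half:,.0f} shakes")       # roughly 3.4 million

# Absolute (100%) certainty is unreachable after any finite number of
# random shakes: each shake misses a given microstate with probability
# 1 - 1/M > 0, so some microstate can always remain unseen.
```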

¹ Here the symbol “~” means “distributed as.” This is the probability density function for the normal (Gaussian) distribution with independent variable x.
² The limit of T → ∞ is often known as the “thermodynamic limit.”
³ This is a difficult problem requiring a nuanced Poisson-statistics confidence-limit (CL) calculation. I highly recommend you read PHYS 252 Macrostates.pdf, posted on Blackboard.


Brief Tutorial on the Statistical Interpretation of Entropy

This tutorial provides a quick, simplified introduction to probability and entropy. Although there is a lot of math here, there are only a few key equations that you need in order to analyze your data.

Almost all systems move in the direction of increasing entropy. Thus, at equilibrium, a system is in its maximum-entropy state. Entropy maximization is the key principle underlying ALL of equilibrium thermodynamics. The concepts of free energy, internal energy, and work all flow from the concept of entropy, not the other way around. Further, much of the mystery of thermodynamics (but not the difficulty!) starts to evaporate when we examine the subject in this microscopic manner. Entropy can be interpreted physically in terms of the number, or weight, of the microscopic states of the system. The purpose of this lab and tutorial is to introduce you to the key statistical concepts underlying what we mean by entropy and its maximization.

Suppose we have a system of N particles, each of which can be either in an “up” state or a “down” state. It could be a collection of N coins lying on a table that can be “heads” or “tails,” or it can be the little wiggle-eyes in this experiment that can be looking up or down. In microscopic physics it can be the spins of a system of electrons in a magnetic field, “spin up” or “spin down.” A specific sequence of the 18 particles, which can be represented by the binary string

1 0 0 1 1 0 1 1 0 0 0 1 1 1 0 1 0 1,

(for example) is called a microstate. Here we have designated “up” = 1 and “down” = 0. In this particular sequence, there are 10 “up” particles and 8 “down” particles. All sequences that have 10 “up” particles (and therefore 8 “down” particles) are part of the same macrostate. If we designate u as the probability of a particle being “up” and d as the probability of a particle being “down,” then u + d = 1, and the probability of the above microstate is then

$$ u \cdot d \cdot d \cdot u \cdot u \cdot d \cdot u \cdot u \cdot d \cdot d \cdot d \cdot u \cdot u \cdot u \cdot d \cdot u \cdot d \cdot u \;=\; u^{10} d^{8} . $$

As you can see, every sequence that contains ten “ups” and eight “downs” has the same probability. If u = d = 0.5, then every microstate has the same small probability of (½)¹⁸. There are 2¹⁸ = 262,144 distinct microstates. Clearly, it would take a very long time for every possible microstate to appear in our experiment. However, by studying the statistical trends, we can learn a lot from as few as 100 microstates, as we do in this lab.

We now need to find out how many microstates have n “up” states, and therefore (N − n) “down” states, where N := the total number of wiggle-eyes (18). The number of such microstates, w(n, N), is given by the combinatorial quantity

$$ w(n,N) = \frac{N!}{n!\,(N-n)!} = \frac{18!}{n!\,(18-n)!} \tag{1} $$


where $N! = N \times (N-1) \times \cdots \times 2 \times 1$. (It is a peculiar fact that⁴ 0! = 1.) This expression tells us that there are N! ways to line up N distinguishably different objects. However, if n ≤ N of them are identical, then n! of these arrangements are indistinguishable, so we have to divide out those arrangements to avoid over-counting. In addition, if the other N − n objects are also indistinguishable, then we have to divide out those over-counted arrangements too. The figure below illustrates the situation for four objects that are either black (“up”) or white (“down”).

The probability P(n, N) that there will be n “up” particles in a system of N particles will be the product of the probability of each such microstate, $u^n d^{N-n}$, times the number of microstates that have n “up”:

$$ P(n,N) = \frac{N!}{n!\,(N-n)!}\, u^n d^{N-n} \tag{2} $$

This is the probability of the macrostate n. If we sum over all possible n we get

$$ \sum_{n=0}^{N} \frac{N!}{n!\,(N-n)!}\, u^n d^{N-n} = (u+d)^N = 1^N = 1 . \tag{3} $$

⁴ This “peculiar fact” comes from a rigorous definition of the factorial function in a branch of mathematics known as group theory. The symmetric group on N elements (denoted $S_N$) is the group of permutations (or different orderings) of N things, and the factorial function can be defined as the number of elements in $S_N$, which differs from N for N ∉ {1, 2}. One calculates $S_0 = \{\emptyset\}$, containing the single empty ordering, by vacuous implication: one can order zero things exactly one way, so 0! = 1.

Figure: Pictorial representation of the 2⁴ = 16 microstates available to N = 4 objects (dots) that can be black or white. This could also be a box of just four wiggle-eyes that are “up” (black) or “down” (white). Each row of four is a microstate. Each group with the same number of black and white objects is a macrostate; here N := (# of dots per row), n := (# of black dots), and (N − n) := (# of white dots). Clearly, there are five (N + 1) macrostates. The macrostate with two white and two black objects is the most probable in this example, having six different arrangements.


This simply means that the probability of finding the system in any state 0 ≤ n ≤ N is unity; the system has to be in one of the available states. The total number of microstates is found from the sum

$$ \sum_{n=0}^{N} w(n,N) = \sum_{n=0}^{N} \frac{N!}{n!\,(N-n)!} = 2^N , \tag{4} $$

which is easily confirmed arithmetically by setting u = d = ½ in equation (3).

When N is large (e.g., of order Avogadro’s number), the factorial terms are astronomically large. It is easier to use an approximation given by Stirling’s formula,

$$ \ln N! \approx N \ln N - N . \tag{5} $$
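Both statements are easy to verify numerically for N = 18; a quick sketch:

```python
# Check Eq. 4 (binomial coefficients sum to 2^N) and Eq. 5 (Stirling).
import math

N = 18
assert sum(math.comb(N, n) for n in range(N + 1)) == 2 ** N   # Eq. 4

exact = math.log(math.factorial(N))   # ln 18! ~ 36.40
stirling = N * math.log(N) - N        # ~ 34.03; relative error shrinks as N grows
print(exact, stirling)
```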

Provided n is not too close to either 0 or N, it can be shown that

$$ P(n,N) \approx \frac{1}{\sqrt{2\pi u d N}} \exp\left(-\frac{(n-uN)^2}{2 u d N}\right) . \tag{6} $$

This is a Gaussian distribution (also called the normal distribution) with mean value n̄ = uN and standard deviation σ = √(udN). Thus, for N = 18 and u = d = 0.5, we have n̄ = 18 × 0.5 = 9.0, and the standard deviation is σ = √(0.5 × 0.5 × 18) ≈ 2.12. Thus, for any randomly chosen microstate, we expect to encounter n = 9.00 ± 2.12 wiggle-eyes in the “up” state 68% of the time. Although n is necessarily an integer, n̄ does not have to be an integer.

If N is very large but u is very small, such that n̄ = uN is a small number of order 1, then it is more appropriate to express the probability as a Poisson distribution:

$$ P(n,N) \approx \frac{(uN)^n}{n!} \exp(-uN) . \tag{7} $$
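This regime can be illustrated numerically; the values N = 1000 and u = 0.002 below are illustrative assumptions (chosen so that uN = 2), not numbers from this lab:

```python
# Compare Eq. 2 (binomial) with Eq. 7 (Poisson) for large N, small u.
import math

N, u = 1000, 0.002
lam = u * N                                                 # uN = 2
for n in range(6):
    binom = math.comb(N, n) * u**n * (1 - u)**(N - n)       # Eq. 2
    poisson = lam**n / math.factorial(n) * math.exp(-lam)   # Eq. 7
    print(n, f"{binom:.5f}  {poisson:.5f}")
```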

The number of microstates Ω(n, N) (constrained by the potentially unequal probabilities u and d) is found by multiplying P(n, N) by the total number of all microstates, which is 2^N. Thus,

$$ \Omega(n,N) = 2^N \frac{N!}{n!\,(N-n)!}\, u^n d^{N-n} \approx \frac{2^N}{\sqrt{2\pi u d N}} \exp\left(-\frac{(n-uN)^2}{2 u d N}\right) . \tag{8} $$


The entropy S(n, N) of macrostate n is given by Boltzmann’s famous expression

$$ S(n,N) = k_B \ln \Omega(n,N) . \tag{9} $$

The logarithm appears because, in a nutshell, probabilities multiply, whereas entropies add.

Logarithms have the useful property that ln(xy) = ln x + ln y, which turns products of probabilities into sums.

When N is large, the entropy of a system in equilibrium is well approximated by the entropy of the macrostate that has the maximum number of microstates, $w_{\max}(n,N)$, where

$$ w_{\max}(n,N) = 2^N P(n_{\max}, N) . \tag{10} $$

This leads to

$$ S(n,N) \approx k_B \ln w_{\max}(n,N) . \tag{11} $$
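A minimal sketch of the Eq. 8–9 bookkeeping (this is also what Question 12 asks for): compute Ω(n, N) and S(n, N) for every macrostate and locate the maximum. The unbiased values u = d = 0.5 are placeholders for your superset values:

```python
# Entropy of each macrostate n via Omega(n,N) = 2^N P(n,N) and S = kB ln Omega.
import math

kB = 1.380649e-23   # J/K
N, u = 18, 0.5
d = 1 - u

entropies = {}
for n in range(N + 1):
    P = math.comb(N, n) * u**n * d**(N - n)   # Eq. 2
    omega = 2**N * P                          # Eq. 8 (exact form)
    entropies[n] = kB * math.log(omega)       # Eq. 9

for n, S in entropies.items():
    print(n, f"{S:.3e} J/K")
print("max-entropy macrostate:", max(entropies, key=entropies.get))  # n = 9 if u = d
```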

The units of entropy are the same as the units of $k_B$, namely joules per kelvin (J/K), which is consistent with the units for macroscopic entropy, which was defined as $S = \int \frac{dQ}{T}$.

In thermodynamics, it is asserted that each microstate is inherently equally probable. It is the energy of a microstate that can vary, which in turn constrains how often that microstate occurs. The equilibrium state is influenced by the energies of the microstates, and all microstates within a macrostate have the same energy. If the energy of the “up” state is $E_u$ and the energy of the “down” state is $E_d$, then the probability of an “up” state is

$$ u = \frac{\exp(-\Delta E/k_B T)}{1 + \exp(-\Delta E/k_B T)} \tag{12} $$

Similarly, the probability of a “down” state is

$$ d = \frac{\exp(-E_d/k_B T)}{\exp(-E_u/k_B T) + \exp(-E_d/k_B T)} = \frac{1}{1 + \exp(-\Delta E/k_B T)} , \tag{13} $$

where $\Delta E = E_u - E_d$. Clearly, u + d = 1, as it should be. The average number of “up” states, n̄, is then

$$ \bar{n} = Nu = N\, \frac{\exp(-\Delta E/k_B T)}{1 + \exp(-\Delta E/k_B T)} . \tag{14} $$

The energy of microstate n is

$$ E_n = n E_u + (N-n) E_d , \tag{15} $$

and the equilibrium energy E of the system is therefore

$$ E = \bar{n} E_u + (N - \bar{n}) E_d , \tag{16} $$

which must be constant for a thermodynamic system (provided no heat flows in, and no work is done). The macrostate with the largest entropy is the equilibrium macrostate.
