Page 1:

Statistical Mechanics

Victor Naden Robinson

vlnr500 3rd Year MPhys

17/2/12

Lectured by Rex Godby

Lecture 1: Probabilities

Lecture 2: Microstates for system of N harmonic oscillators

Lecture 3: More Thermodynamics, Boltzmann and Entropy

Lecture 4: Entropy

Lecture 5: Entropy and applications of statistical mechanics

Lecture 6: 2-Level System

Lecture 7: More harmonic oscillator and heat capacity

Lecture 8: Modes

Lecture 9: Debye continued

Lecture 10: The ideal gas

Lecture 11: More ideal gas

Lecture 12: Maxwell-Boltzmann distribution

Lecture 13: Summary and Gibbs approach

Lecture 14: Identical particles

Lecture 15: Bose-Einstein distribution

Lecture 16: Continuation and condensation

Lecture 17: Energy in EM modes

Lecture 18: Black body radiation


Lecture 1: Probabilities

{Note lots of graphs in this course}

Probability of obtaining score “S” on: 1 die

[Figure 1. Plotting P vs. S, dots on the P=1/6 line]

2 dice:

Total Score | Average Score | Number of Configurations
     2      |      1.0      |  1
     3      |      1.5      |  2
     4      |      2.0      |  3
     5      |      2.5      |  4
     6      |      3.0      |  5
     7      |      3.5      |  6
     8      |      4.0      |  5
     9      |      4.5      |  4
    10      |      5.0      |  3
    11      |      5.5      |  2
    12      |      6.0      |  1

Total: 36 configurations

[Figure 2: Prob vs. avg score, 1 to 6 on x-axis, 0 to 1/36 to 1/6 on y-axis, triangle dot formation]

7 Dice:

Total | Average Score | Number of ways | Probability
  7   |      1.0      |       1        |  1/6^7
  8   |      8/7      |       7        |  7/6^7
  9   |      9/7      |      28*       | 28/6^7

*21 ways as 5×(1 dot) + 2×(2 dots), plus 7 ways as 6×(1 dot) + 1×(3 dots): 21 + 7 = 28.

[Figure 3: Prob vs. score plotted as a Gaussian between scores 1 to 6]

Scores obtained: 2.85, 3.80, 4.70

10^24 dice:

[Figure 4: same as Figure 3 but very thin (relative width ~10^-12)]

With a large number of components we can describe properties with precision. We can make predictions about the collective behaviour of large systems without needing to predict the detailed motion of its components.
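This narrowing with the number of dice can be checked directly; a small sketch (the function name and the chosen values of n are mine), building the exact distribution of the total score by repeated convolution of the single-die distribution:

```python
import numpy as np

def sum_distribution(n_dice):
    """Exact probability distribution of the total score of n fair dice,
    built by repeatedly convolving the single-die distribution."""
    die = np.ones(6) / 6.0                 # P(score) = 1/6 for scores 1..6
    dist = die.copy()
    for _ in range(n_dice - 1):
        dist = np.convolve(dist, die)
    return dist                            # index k corresponds to total score k + n_dice

for n in (1, 2, 7, 100):
    dist = sum_distribution(n)
    scores = np.arange(n, 6 * n + 1)
    mean = (dist * scores).sum()
    std = np.sqrt((dist * (scores - mean) ** 2).sum())
    print(n, mean / n, std / n)            # relative width shrinks like 1/sqrt(n)
```

The mean average score stays at 3.5 while the relative width of the distribution falls off as 1/√n, which is why 10^24 dice give a spike of relative width ~10^-12.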

Equally likely elementary outcomes

1.1 Microstates


Microstates are QM eigenstate solutions to the Schrödinger equation, but for a whole system rather than an elementary particle.

For an isolated system the energy is fixed, but this energy is degenerate: there are W (a large number) microstates with energy E.

[Figure: E on y-axis, then lots of dashed lines (energy levels), with a circle enclosing some of the dashes labelled "W microstates"]

From Fermi's Golden Rule, let p_i be the probability of the system being in state i (among the W states):

dp_i/dt = ∑_j R_ij p_j − ∑_j R_ji p_i (1.1)

R_ij = R_ji (1.2)

When a steady state has been reached, dp_i/dt = 0 for all states i. This can happen only if p_i = p_j for all pairs of states, i.e. all probabilities are equal and all equal 1/W. This is the Principle of Equal Equilibrium Probability (PEEP).

Refresher: recall the possible energies of a single simple harmonic oscillator, then think about the energies of N oscillators.

Lecture 2: Microstates for System of N harmonic oscillators

Classically,

E = p²/2m + ½ m ω_c² x² (2.2)

where ω_c is the classical angular frequency.

Energies E_n = (n + ½)ħω, where n = 0, 1, 2, …

[Figure 2.1: parabola with horizontal lines across it showing energy levels at ħω/2, 3ħω/2, 5ħω/2, … going up]

A microstate of a system of N oscillators is given by the states each oscillator is in.

The energy of N oscillators is:

E = (n₁ + ½)ħω + (n₂ + ½)ħω + … + (n_N + ½)ħω = (Q + N/2)ħω (2.3)

where Q = n₁ + n₂ + … + n_N is the total number of quanta.

What is the number of microstates W corresponding to a given value of Q (i.e. to a given total energy)? Consider:

∙∙|∙|∙∙∙|∙ : Q dots, (N−1) partitions

The number of ways W is equal to the number of ways of arranging the Q dots and (N−1) partitions:

W = (Q + N − 1)! / (Q! (N − 1)!) (2.4)
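Equation (2.4) can be sanity-checked against a brute-force enumeration; a quick sketch (helper names are mine):

```python
from itertools import product
from math import comb

def W(Q, N):
    """Microstates of N oscillators sharing Q quanta: arrangements of
    Q dots and (N - 1) partitions, i.e. eq. (2.4)."""
    return comb(Q + N - 1, Q)

def W_brute(Q, N):
    """Direct count of N-tuples of quantum numbers (n1, ..., nN) summing to Q."""
    return sum(1 for ns in product(range(Q + 1), repeat=N) if sum(ns) == Q)

for Q, N in [(3, 2), (4, 3), (5, 4)]:
    assert W(Q, N) == W_brute(Q, N)
print(W(5, 4))
```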

2.1 Thermal Equilibrium

Two systems of N harmonic oscillators

The number of microstates corresponding to a given distribution of energy between them is:

W = W₁W₂ (2.5)

From PEEP each of these W microstates is equally likely.

In the general case, rather than W₁ we shall focus on the density of states, i.e. the number of microstates per unit energy:

g₁(E₁) = W₁/δE (2.6)

Note that W₁ = g₁(E₁)δE, where δE is the band of energies within which E₁ is fluctuating.

W = g₁(E₁) g₂(E₂) (δE)² (2.7)

Lecture 3: More thermodynamics

[Figure 3.1: two systems in thermal contact exchanging energy, labelled (E₁, W₁, g₁) and (E₂, W₂, g₂)]

The probability of this division of energy is:

P(E₁) ∝ W₁W₂ = g₁(E₁) g₂(E₂) (δE)² (3.1)

For the most likely partition, maximise g₁(E₁)g₂(E₂), that is, maximise:

ln g₁(E₁) + ln g₂(E₂) (3.2)

d/dE₁ [ln g₁(E₁) + ln g₂(E₂)] = 0, with dE₂ = −dE₁ (3.3)

Thus for the most likely partition of energy (i.e. equilibrium):

∂ln g₁/∂E₁ = ∂ln g₂/∂E₂ (3.4)

This quantity, ∂ln g/∂E, is therefore equalised between two systems when they can exchange energy.

[Figure 3.2: Ok I’ll try and draw it!]

Ok that went well.

Define the statistical mechanical temperature as:

1/(kT) = ∂ln g/∂E (3.5)*

*For a large system

This T is identical to the ideal gas temperature.

3.2 Boltzmann Distribution

Now consider systems held at constant T.

[Figure 3.3: a system in microstate i with energy E_i, in contact with a heat reservoir (energy E_R, density of states g_R) at temperature T]

Because the reservoir is large, ln g_R varies slowly with E_R, so to a good approximation T is constant.

ln g_R(E_R − E_i) ≈ ln g_R(E_R) − E_i ∂ln g_R/∂E_R = ln g_R(E_R) − E_i/kT (3.6)

By PEEP the probability of the situation shown is:

p_i ∝ g_R(E_R − E_i) (3.7)

But:

g_R(E_R − E_i) = g_R(E_R) e^(−E_i/kT) (3.8)

Since g_R(E_R) is a constant,

p_i ∝ e^(−E_i/kT) (3.9)

p_i = e^(−E_i/kT) / Z, where Z = ∑_j e^(−E_j/kT) (3.10)

Z is the sum over all microstates j of the system of the Boltzmann factor e^(−E_j/kT).

Mean energy of a system at constant T:

Ē = ∑_i p_i E_i (3.11)

The mean energy

Ē = kT² ∂ln Z/∂T (3.12)

Recall: with β = 1/kT this is

Ē = −∂ln Z/∂β (3.13)
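These relations are easy to verify numerically for a toy spectrum (the three energy levels and the temperature are my choices, in units where k = 1):

```python
import numpy as np

k = 1.0                          # units where Boltzmann's constant is 1
E = np.array([0.0, 1.0, 2.0])    # toy microstate energies
T = 0.8
beta = 1.0 / (k * T)

Z = np.exp(-beta * E).sum()      # partition function, eq. (3.10)
p = np.exp(-beta * E) / Z        # Boltzmann probabilities
E_mean = (p * E).sum()           # mean energy, eq. (3.11)

# eq. (3.13): E_mean = -d(ln Z)/d(beta), checked by central difference
lnZ = lambda b: np.log(np.exp(-b * E).sum())
h = 1e-6
E_deriv = -(lnZ(beta + h) - lnZ(beta - h)) / (2 * h)

assert abs(p.sum() - 1.0) < 1e-12
assert abs(E_mean - E_deriv) < 1e-7
```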

3.3 Entropy

For an isolated system we define:

S = k ln W (3.14)

For a non-isolated system exploring different microstates with probabilities p_i (assumed to be known): {continued next lecture}

Lecture 4: Entropy

S = −k ∑_i p_i ln p_i (4.1)

For the general case where our system explores its microstates with given (generally unequal) probabilities p_i, consider N replicas of the system, where N is arbitrarily large. Then ~Np₁ systems will be in microstate 1, ~Np₂ in microstate 2, etc.

The total energy Np₁E₁ + Np₂E₂ + … is effectively fixed, thus the ensemble of replicas is thermally isolated in its behaviour, exploring states only with a fixed total energy.

Thus the effective W is the number of distinct permutations of Np₁ identical objects, Np₂ identical objects, etc.

W_N = N! / [(Np₁)! (Np₂)! …] (4.2)

S_N = k ln W_N (4.3)

Using Stirling's approximation (ln N! ≈ N ln N − N):

ln W_N = N ln N − ∑_i Np_i ln(Np_i) = −N ∑_i p_i ln p_i (4.4)

So the entropy of the N replicas is:

S_N = −Nk ∑_i p_i ln p_i (4.5)

For N = 1, then:

S = −k ∑_i p_i ln p_i (4.6)

Entropy of an isolated system

p_i = { 1/W for the W accessible microstates; 0 otherwise } (4.7)
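A quick numerical check that the Gibbs form (4.6) reduces to the isolated-system result (3.14) for equal probabilities, and that any other distribution over the same states has lower entropy (the example probabilities are mine):

```python
import numpy as np

k = 1.0
W = 8
p_uniform = np.full(W, 1.0 / W)

S_gibbs = -k * (p_uniform * np.log(p_uniform)).sum()   # eq. (4.6)
S_boltzmann = k * np.log(W)                            # eq. (3.14)
assert abs(S_gibbs - S_boltzmann) < 1e-12

# a non-uniform distribution over the same W states has lower entropy
p = np.array([0.5, 0.2, 0.1, 0.1, 0.05, 0.03, 0.01, 0.01])
assert abs(p.sum() - 1.0) < 1e-12
S = -k * (p * np.log(p)).sum()
assert S < S_boltzmann
```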

Substituting into (4.6): S = −k ∑ (1/W) ln(1/W) = k ln W (4.8)

Entropy of a system at constant temperature

ln p_i = −E_i/kT − ln Z (4.9)

S = −k ∑_i p_i (−E_i/kT − ln Z) = Ē/T + k ln Z (4.10)

Recall the Helmholtz free energy:

F = Ē − TS = −kT ln Z (4.11)

Thermodynamic entropy

Consider a system thermally isolated, on which work can be done. The total internal energy is:

U = Ē = ∑_i p_i E_i (4.12)

When the volume of the system is changed:

dU = ∑_i E_i dp_i + ∑_i p_i dE_i (4.13)

The last term is the work done on the system; the remaining term on the RHS must be associated with heat.

Lecture 5: Entropy and applications of statistical mechanics

If a system is insulated (thermally isolated), đQ = 0, so:

dU = đW (5.1)


Now,

dU = ∑_i E_i dp_i + ∑_i p_i dE_i (5.2)

For a slow change in constraints, Fermi's Golden Rule shows that no additional transitions between microstates are induced, i.e. dp_i = 0 for all i:

dU = ∑_i p_i dE_i (5.3)

For an insulated system:

đW = ∑_i p_i dE_i (5.4)

Since ∑_i p_i dE_i is the work, the other term in (5.2) must be the heat:

đQ = ∑_i E_i dp_i (5.5)

At constant temperature the probabilities p_i are given by the Boltzmann distribution, so ln p_i = −E_i/kT − ln Z.

Also, from (4.6): S = −k ∑_i p_i ln p_i, so

dS = −k ∑_i dp_i (ln p_i + 1) = −k ∑_i dp_i ln p_i   {since ∑_i dp_i = 0}

= −k ∑_i dp_i {−E_i/kT − ln Z} = (1/T) ∑_i E_i dp_i

dS = đQ/T (5.6)

Thus statistical mechanical and thermodynamic entropy are identical.

5.1 Applications of Statistical Mechanics

Vacancies in crystals

For N atoms, how many vacancies will there be at temperature T?

Each vacancy costs an amount of energy ε, due to not being bonded fully. The free energy F is to be minimised:

dF = −S dT − p dV (5.7)

So dF = 0 when volume and temperature are held constant; hence we minimise F, not E.

What are E and S for a given number of vacancies n?

E = nε (5.8)

Provided a system is sufficiently large, its energy is effectively fixed, as if it were thermally isolated, so we may use S = k ln W.

W is the number of arrangements of the N atoms and n vacancies on a total of N + n sites:

(5.9)

{ }

{ }

{ }

(

) (

)

(5.10)

Since ,

(5.11)

Note:

At 300K,

At 3000K,

Practical crystals at room temperature are not yet at thermal equilibrium but reflect vacancy concentrations from the temperatures where the crystal first formed.
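The missing numerical values above can be illustrated with eq. (5.11); the formation energy ε = 1 eV is my assumption of a typical order of magnitude, not the lecture's value:

```python
from math import exp

k_eV = 8.617e-5        # Boltzmann constant in eV/K
eps = 1.0              # assumed vacancy formation energy, ~1 eV (typical order)

def vacancy_fraction(T):
    """Equilibrium vacancy concentration n/N = exp(-eps/kT), eq. (5.11)."""
    return exp(-eps / (k_eV * T))

print(vacancy_fraction(300.0))    # utterly negligible at room temperature
print(vacancy_fraction(3000.0))   # a few percent near a typical melting point
```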


Lecture 6: 2-Level System

E.g. each magnetic atom/ion in a dilute magnetic semiconductor.

In a field B the two states have energies E = −μB (moment aligned) and E = +μB (anti-aligned).

p₁ = e^(μB/kT) / Z (6.1)

Take the partition function

Z = e^(μB/kT) + e^(−μB/kT) = 2 cosh(μB/kT) (6.2)

p₁ = 1 / (1 + e^(−2μB/kT)) (6.3)

Low T: p₁ → 1. High T: p₁ → ½.

So the mean energy is

Ē = −μB tanh(μB/kT) (6.4)

[Figures: energy levels at −μB and +μB, splitting linearly with B; plot of p₁ vs T, falling from 1 toward ½ around kT ~ μB]


Diagram: [Ē vs T, rising from −μB toward 0 around kT ~ μB]

Heat capacity

C = dĒ/dT = k (μB/kT)² sech²(μB/kT) (6.5)

Mean magnetic dipole moment: μ̄ = μ tanh(μB/kT)

So the magnetic polarisation with n magnetic ions per unit volume of crystal is M = nμ tanh(μB/kT).

Harmonic Oscillator

E_n = (n + ½)ħω, so Z = ∑_n e^(−(n+½)ħω/kT)

[Figures: C vs T showing the Schottky anomaly peak near kT ~ μB; μ̄ vs T falling from μ to 0; SHO energy levels at ħω/2, 3ħω/2, 5ħω/2, …]
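The Schottky anomaly can be reproduced from the mean energy Ē = −μB tanh(μB/kT) by differentiating numerically (the units k = μB = 1 are my choice):

```python
import numpy as np

k = 1.0
muB = 1.0    # magnetic energy scale mu*B

def E_mean(T):
    """Mean energy of the 2-level system: -muB tanh(muB/kT)."""
    return -muB * np.tanh(muB / (k * T))

def C(T, h=1e-5):
    """Heat capacity dE/dT by central difference."""
    return (E_mean(T + h) - E_mean(T - h)) / (2 * h)

T = np.linspace(0.05, 5.0, 500)
heat = np.array([C(t) for t in T])
T_peak = T[heat.argmax()]
# a single peak near kT ~ muB, vanishing at both low and high T
assert heat[0] < 0.01 and heat[-1] < 0.1
assert 0.5 < T_peak < 1.5
```

The peak sits near kT ≈ 0.83 μB: at low T the upper level is inaccessible, at high T both levels are equally occupied, so only in between can added heat shift population.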


So

Z = e^(−ħω/2kT) ∑_{n=0}^∞ (e^(−ħω/kT))^n (6.6)

This is a geometric series:

Z = e^(−ħω/2kT) / (1 − e^(−ħω/kT)) (6.7)

Lecture 7: More Harmonic Oscillator and Heat Capacity

Ē = −∂ln Z/∂β, with β = 1/kT (7.1)

Harmonic oscillator continued,

ln Z = −ħω/2kT − ln(1 − e^(−ħω/kT)) (7.2)

Ē = ħω/2 + ħω e^(−ħω/kT)/(1 − e^(−ħω/kT)) (7.3)

Ē = ħω/2 + ħω/(e^(ħω/kT) − 1) (7.4)

This is the Planck oscillator formula (eqn. 7.4).

At low T (kT ≪ ħω): Ē ≈ ħω/2, the zero-point energy.

At high T (kT ≫ ħω): e^(ħω/kT) − 1 ≈ ħω/kT, so

Ē ≈ ħω/2 + kT ≈ kT (7.5)

An example of the classical equipartition theorem: a mean energy of ½kT per degree of freedom, for each term in the total energy proportional to position² or velocity².

Heat capacity of oscillator

C = dĒ/dT = k (ħω/kT)² e^(ħω/kT) / (e^(ħω/kT) − 1)² (7.6)
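The two limits quoted above follow from the Planck formula (7.4); a numerical check (the units ħω = k = 1 are mine):

```python
import numpy as np

k = 1.0
hw = 1.0    # hbar * omega

def E_planck(T):
    """Planck oscillator formula, eq. (7.4)."""
    return hw / 2 + hw / np.expm1(hw / (k * T))

def C(T, h=1e-5):
    """Heat capacity by central difference."""
    return (E_planck(T + h) - E_planck(T - h)) / (2 * h)

assert abs(C(100.0) - k) < 1e-3    # high T: equipartition, C -> k
assert C(0.04) < 1e-7              # low T: oscillations frozen out, C -> 0
```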

Graph: [Ē vs T approaching kT, starting from the zero-point value ħω/2; C vs T rising to k around kT ~ ħω]

Vibrational energy of a crystal

First consider two atoms in one dimension, masses m coupled by a spring of stiffness λ.

2 modes: [one with the atoms moving together, one with them moving oppositely, ω ~ √(λ/m)]

Arbitrary motion of the atoms can be written as a linear combination of these normal modes.

Next: 3 atoms in 1D. Now 3 modes.

N atoms in 3D have 3N modes, each with a well-defined frequency ω_i.

Lecture 8: Modes

For a periodic crystal the modes can be classified by their wave vectors q. The displacement of the atom at position r is:

u(r, t) ∝ sin(q_x x) sin(q_y y) sin(q_z z) e^(−iωt) (8.1)

'Zero' boundary conditions give a wave of zero displacement at the faces of the crystal (a cube of side L), so

q = (n_x π/L, n_y π/L, n_z π/L), n_i = 1, 2, 3, … (8.2)

This gives a cubic lattice of points in q-space, each one representing an allowed value of q.

[Figure: q_x–q_y plane showing the grid of allowed q points, with spacing π/L in each direction]

For a realistic solid the dispersion ω(q) is complicated, so we study two simplified models:

Einstein model

Debye model

Einstein model

Treat the crystal as 3N simple harmonic oscillators, all with the same frequency ω_E:

Ē = 3N [ħω_E/2 + ħω_E/(e^(ħω_E/kT) − 1)] (8.3)

C = 3Nk (ħω_E/kT)² e^(ħω_E/kT) / (e^(ħω_E/kT) − 1)² (8.4)

[Figures: realistic dispersion ω(q) up to the boundary of the 1st Brillouin zone; Einstein model as a flat line at ω_E (3N modes); Debye model as a straight line of slope c up to q_max; C vs T approaching 3Nk]


At low T,

C ≈ 3Nk (ħω_E/kT)² e^(−ħω_E/kT) (8.5)

The Einstein model is good except at low T (see hand out).

Debye model

Take ω = cq (sound waves of speed c) up to a cut-off q_max. The number of q points per unit volume of q-space is

(L/π)³ (8.6)

The volume of a sphere of radius q in q-space is (4/3)πq³. The volume of the positive octant of this sphere is:

(1/8)(4/3)πq³ (8.7)

Lecture 9: Debye model continued

[Picture of positive octant of sphere in q-space with a strip on the surface]

For the Debye model ω = cq.

We had, for a single oscillator {lecture 7}: C = k (ħω/kT)² e^(ħω/kT)/(e^(ħω/kT) − 1)²

Adding over the modes, with 3 polarisations per q point and (L/π)³ q points per unit volume of q-space, the contribution to the heat capacity from modes in the shell of thickness dq is

dC = (1/8) 4πq² dq ∙ (L/π)³ ∙ 3 ∙ k (ħcq/kT)² e^(ħcq/kT)/(e^(ħcq/kT) − 1)² (9.1)

[1st factor from the volume of the octant shell, 2nd is the number of points per unit volume, 3rd is the number of modes per q point]


Looking at the total over all shells:

dC = (3Vk/2π²) q² (ħcq/kT)² e^(ħcq/kT)/(e^(ħcq/kT) − 1)² dq (9.2)

C = (3Vk/2π²) ∫₀^{q_max} q² (ħcq/kT)² e^(ħcq/kT)/(e^(ħcq/kT) − 1)² dq (9.3)

Using: x = ħcq/kT

Sub into (9.3):

C = (3Vk/2π²)(kT/ħc)³ ∫₀^{x_max} x⁴ e^x/(e^x − 1)² dx (9.4)

[Note the limits change too: x_max = ħc q_max/kT] (9.5)

[Graph showing the integrand x⁴eˣ/(eˣ−1)² vs. x, and which parts of it are sampled for low T and high T]

Therefore for high T (x_max ≪ 1), the integrand ≈ x², so the integral ≈ x_max³/3 and

C = 3Nk (9.6)

as expected for 3N oscillators at high T, each with heat capacity k.

For low T: let x_max → ∞, since the rest of the tail (graph) contributes a negligible amount to C. A standard integral (which can be done by contour integration) gives:

∫₀^∞ x⁴ e^x/(e^x − 1)² dx = 4π⁴/15 (9.7)

C = (2π²/5) Vk (kT/ħc)³ ∝ T³ (9.8)

This is the Debye T³ law.
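The low-temperature integral quoted in (9.7) can be checked numerically:

```python
import numpy as np
from scipy.integrate import quad

# Debye low-T integral: integral of x^4 e^x / (e^x - 1)^2 from 0 to infinity = 4 pi^4 / 15
integrand = lambda x: x**4 * np.exp(x) / np.expm1(x) ** 2

# start just above 0 (integrand ~ x^2 there, so the cut costs ~1e-19)
value, _ = quad(integrand, 1e-6, 100.0)
assert abs(value - 4 * np.pi**4 / 15) < 1e-6
```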

[Graph comparing C vs. T for Debye and Einstein]

[Graph of omega vs. q showing straight line relationship and something about not being frozen]

Sometimes we prefer to use as the parameter in the Debye model not c but the Debye frequency ω_D = c q_max.

Lecture 10: The ideal gas

Gas molecules move around independently:

Ignore long-range Van der Waals attraction

Ignore scattering (except to allow molecules to explore all accessible microstates)

Initially we study monatomic gases. Partition function of ideal monatomic gas is required.

For one atom in a box:

Z₁ = ∑_k e^(−ε_k/kT) (10.1)

For two non-identical atoms:

Z = ∑_k ∑_l e^(−(ε_k + ε_l)/kT) = Z₁² (10.2)

For two identical atoms, states differing only by exchange of the atoms are the same microstate: (10.3)

Z = ∑_k ∑_l e^(−(ε_k + ε_l)/kT) ≈ Z₁²/2!   {valid when double occupation is negligible} (10.4)

For three atoms:

Z ≈ Z₁³/3! (10.5)

For N atoms:

Z ≈ Z₁^N/N! (10.6)


In an ideal gas, most of the states are empty, so as long as the temperature is not extremely low, an atom has so many particle states to choose from that a state with more than one atom in it is extremely unlikely.

To a good approximation we can apply the 1/N! correction factor to all cases. Thus:

Z_N = (1/N!) (∑_k e^(−ε_k/kT))^N (10.7)

For a single atom the sum factorises, since ε = ε_x + ε_y + ε_z:

Z₁ = (∑ e^(−ε_x/kT)) (∑ e^(−ε_y/kT)) (∑ e^(−ε_z/kT)) (10.8)

Thus Z for N atoms is:

Z_N = Z₁^N/N! (10.9)

Schrödinger eqn:

−(ħ²/2m)∇²ψ = εψ (10.10)

Choosing boundaries for which ψ = 0 at the walls of the box (side L), the solution is:

ψ ∝ sin(q_x x) sin(q_y y) sin(q_z z) (10.11)

where q = (n_x π/L, n_y π/L, n_z π/L), n_i = 1, 2, 3, …

The energy eigenvalue is:

ε = ħ²q²/2m (10.12)

The volume of the octant shell of radius q, thickness dq, in q-space is:

(1/8) 4πq² dq (10.13)

The number of states per unit volume of q-space is:

(L/π)³ (10.14)

Z₁ = ∫₀^∞ (1/8) 4πq² (L/π)³ e^(−ħ²q²/2mkT) dq (10.15)

Lecture 11: More ideal gas! Continues from L10


Continuing on, define

a = ħ²/2mkT (11.1)

Z₁ = (V/2π²) ∫₀^∞ q² e^(−aq²) dq

and since ∫₀^∞ q² e^(−aq²) dq = (√π/4) a^(−3/2):

Z₁ = V (mkT/2πħ²)^(3/2) (11.2)

The free energy is given by

F = −kT ln Z_N = −kT [N ln Z₁ − ln N!]

Using Stirling's approximation:

F = −NkT [ln(V/N) + (3/2) ln(mkT/2πħ²) + 1] (11.3)

Note that F is extensive: doubling both V and N simply doubles F.

Pressure of ideal gas

Recall

p = −(∂F/∂V)_T (11.4)

p = NkT ∂/∂V [ln(V/N)] = NkT/V

pV = NkT (11.5)

This establishes the equivalence of the statistical mechanical and (ideal gas) thermodynamic temperatures.
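This chain can be verified numerically: build ln Z_N for a monatomic gas and differentiate F = −kT ln Z with respect to V (the atomic mass and conditions are my choices, roughly argon at room temperature):

```python
import numpy as np
from math import lgamma

kB = 1.380649e-23        # J/K
hbar = 1.0545718e-34     # J s
m = 6.63e-26             # kg, argon-like atom (my choice)
T = 300.0                # K
N = 1.0e22               # number of atoms

def lnZ(V):
    """ln Z_N with Z_N = Z1^N / N! and Z1 = V (m kB T / 2 pi hbar^2)^(3/2)."""
    lnZ1 = np.log(V) + 1.5 * np.log(m * kB * T / (2 * np.pi * hbar**2))
    return N * lnZ1 - lgamma(N + 1)

def pressure(V):
    """p = -(dF/dV)_T = kB T d(ln Z)/dV, by central difference."""
    h = V * 1e-6
    return kB * T * (lnZ(V + h) - lnZ(V - h)) / (2 * h)

V = 1e-3                 # m^3
p_ideal = N * kB * T / V
assert abs(pressure(V) - p_ideal) / p_ideal < 1e-6   # pV = NkT
```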

Energy of ideal gas

Ē = kT² ∂ln Z/∂T = (3/2) NkT (11.6)

I.e. an energy of (3/2)kT per gas molecule (1 atom), as predicted by the classical equipartition theorem: ½kT per K.E. term (½mv_x², ½mv_y², ½mv_z²).

Note that our "non-degenerate" approximation that led to Z_N = Z₁^N/N! assumes that T is not too low, hence the gas is already hot enough for equipartition to apply.

Heat capacity of an ideal gas:

C_V = (3/2) Nk (11.7)

Maxwell-Boltzmann distribution

The probability distribution function of c, the speed of a gas molecule:

f(c) dc = probability that the speed lies in the range (c, c + dc) (11.8)

If speed c corresponds to |q| = q, then c + dc corresponds to q + dq.

Lecture 12: Maxwell-Boltzmann distribution (cont.)

[Opens with another three-dimensional picture]

mc = ħq (12.1)

From the Boltzmann distribution, the probability of an atom (only one for now) being in a particular state q is:

p(q) ∝ e^(−ε(q)/kT) (12.2)

ε(q) = ħ²q²/2m (12.3)

i.e. ε = ½mc² (12.4)

Now the number of states with |q| in the range (q, q + dq) is

(1/8) 4πq² dq ∙ (L/π)³ (12.5)

i.e. proportional to q² dq (12.6)

The total probability that the speed is in the range (c, c + dc) is:

f(c) dc ∝ q² e^(−ε/kT) dq

Cancelling constants and subbing in q = mc/ħ:

f(c) dc ∝ c² e^(−mc²/2kT) dc (12.7)

Leading to the Maxwell-Boltzmann distribution (with the normalising prefactor):

f(c) = 4π c² (m/2πkT)^(3/2) e^(−mc²/2kT) (12.8)

NB: e^(−mc²/2kT) is the Boltzmann factor for a state; c² is proportional to the number of states with speed near c; the prefactor could be deduced from normalisation of the probability distribution:

∫₀^∞ f(c) dc = 1 (12.9)
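The normalisation (12.9), and the equipartition result ⟨½mc²⟩ = (3/2)kT, can both be checked numerically for eq. (12.8) (the mass is my choice, roughly an argon atom):

```python
import numpy as np
from scipy.integrate import quad

kB = 1.380649e-23
m = 6.63e-26     # kg, argon-like atom (my choice)
T = 300.0

def f(c):
    """Maxwell-Boltzmann speed distribution, eq. (12.8)."""
    pref = 4 * np.pi * (m / (2 * np.pi * kB * T)) ** 1.5
    return pref * c**2 * np.exp(-m * c**2 / (2 * kB * T))

c_max = 5000.0   # m/s; far out in the exponential tail at 300 K
norm, _ = quad(f, 0.0, c_max)
assert abs(norm - 1.0) < 1e-6                     # eq. (12.9)

# mean square speed gives (1/2) m <c^2> = (3/2) kB T
c2_mean, _ = quad(lambda c: c**2 * f(c), 0.0, c_max)
assert abs(0.5 * m * c2_mean - 1.5 * kB * T) / (1.5 * kB * T) < 1e-6
```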

[Two figures of f(c) vs c for various Temps]

Molecular gases: e.g. diatomic molecules.

Vibration: a single SHO with frequency ω_v:

Ē_vib = ħω_v/2 + ħω_v/(e^(ħω_v/kT) − 1) (12.10)

C_vib = k (ħω_v/kT)² e^(ħω_v/kT)/(e^(ħω_v/kT) − 1)² (12.11)

Typically ħω_v ≫ kT at normal temperatures, therefore we can use a low-temp approximation:

C_vib ≈ k (ħω_v/kT)² e^(−ħω_v/kT) ≈ 0   {vibration is "frozen out"} (12.12)

Rotation:

[Picture of degrees of freedom for a diatomic molecule: two independent rotation axes]

The rotational energy eigenvalues are spaced by much less than kT at normal temperatures, therefore use the high-T approximation. From the classical equipartition theorem (two rotational degrees of freedom):

C_rot = 2 × ½k = k (12.13)

So for a gas of N such molecules, the total heat capacity is:

C = (3/2)Nk + Nk + N C_vib (12.14)

where the 1st term is from translation, the 2nd is from rotation and the 3rd is vibration.

Entropy of ideal gas (monatomic)

Get S from the free energy eqn (11.3) using S = −(∂F/∂T)_V:

S = Nk [ln(V/N) + (3/2) ln(mkT/2πħ²) + 5/2] (12.15)

There was a problem question which helps practice this part well.

As T → 0 this gives S → −∞, but our non-degenerate approximation breaks down at low temperatures.

Lecture 13: Key Theory so far

Bit of a recap with some new stuff too, aids course completeness. If you can't remember what these are then go back and learn them:

Principle of equal equilibrium probability (PEEP): p_i = 1/W

∂ln g/∂E equalised at equilibrium; 1/kT = ∂ln g/∂E

Boltzmann distribution: p_i = e^(−E_i/kT)/Z, with partition function Z = ∑_j e^(−E_j/kT)

Recall Ē = −∂ln Z/∂β, with β = 1/kT

For an isolated system the entropy is S = k ln W. Otherwise it is S = −k ∑_i p_i ln p_i (Gibbs' formula)


At constant T: F = Ē − TS = −kT ln Z

Harmonic oscillator (Planck oscillator formula): Ē = ħω/2 + ħω/(e^(ħω/kT) − 1)

Now the lecture moves on to a new topic based on everything so far.

Systems with variable numbers of particles

Chemical potential

[Figure similar to before: two systems able to exchange both energy and particles]

The most likely division of energy and particles will maximise:

S₁(E₁, N₁) + S₂(E₂, N₂) (13.1)

Considering exchange of energy gives, as before, equal temperatures. (13.2)

Considering exchange of particles:

(∂S₁/∂N₁) = (∂S₂/∂N₂) (13.3)

Define

μ = −T (∂S/∂N)_{E,V} (13.4)

Thus both the temperature and the chemical potential are equalised in equilibrium. μ has dimensions of energy.

For a large system, then

μ = (∂F/∂N)_{T,V} (13.5)

which comes from F = E − TS.

[Now adds or amends the original diagram to concern particle number as well]

So now, allowing both E and N to vary:

dS = (∂S/∂E) dE + (∂S/∂N) dN = dE/T − μ dN/T (13.6)

New version of the thermodynamic identity:

dE = T dS − p dV + μ dN (13.7)

dF = −S dT − p dV + μ dN (13.8)


For the ideal gas, using (11.3):

μ = (∂F/∂N)_{T,V} = −kT ln[(V/N)(mkT/2πħ²)^(3/2)] (13.9)–(13.11)

Gibbs distribution

For a system able to exchange energy and particles with a reservoir:

[Figure of a system connected to a heat and particle reservoir]

For the reservoir:

S_R(E − E_i, N − N_i) ≈ S_R(E, N) − E_i (∂S_R/∂E) − N_i (∂S_R/∂N) (13.12)

This gives

S_R ≈ S_R(E, N) − E_i/T + μN_i/T (13.13)

From PEEP: p_i ∝ W_R = e^(S_R/k)

Since S_R(E, N) is a constant, absorbing constants into the proportionality:

p_i ∝ e^((μN_i − E_i)/kT)   (the Gibbs distribution) (13.14)

Since ∑_i p_i = 1:

p_i = e^((μN_i − E_i)/kT) / Ξ, where Ξ = ∑_j e^((μN_j − E_j)/kT) (13.15)

where Ξ is the Grand Partition Function.

Lecture 14: Identical particles

We ignore, for simplicity, any interaction between particles (e.g. Coulomb repulsion between electrons). Specify how many particles are in each 1-particle quantum state.

For Bosons (spin 0, 1, 2, ...) (e.g. photon, hydrogen atom) there is no restriction on the occupation.

For Fermions (spin ½, 3/2, ...) (e.g. electrons, protons, neutrons) the Pauli Exclusion Principle

prohibits two or more fermions in the same 1-particle state.

[Figure of particles in states for bosons]

[Similar figure isolating all other particles (bar one and its state) as a reservoir (for fermions)]

N.B.1-particle state means spatial wave function together with spin, e.g. 1s state with spin up

Key concept: Treat the 1-particle state as our system with the other states playing the role of

the particle reservoir (figure). The whole system is to be held at temperature T.

Apply Gibbs distribution:

(1) For fermions, the 1-particle state can hold N = 0 or N = 1 particles:

N | E | Gibbs factor e^((μN−E)/kT)
0 | 0 | 1
1 | ε | e^((μ−ε)/kT)

Ξ = 1 + e^((μ−ε)/kT) (14.1)

n̄ = e^((μ−ε)/kT)/(1 + e^((μ−ε)/kT)), which gives the Fermi-Dirac distribution:

n̄ = 1 / (e^((ε−μ)/kT) + 1) (14.2)

Note 0 ≤ n̄ ≤ 1, where n̄ is the average number of particles in a state with energy ε.

(2) For bosons, N = 0, 1, 2, …:

Ξ = 1 + e^((μ−ε)/kT) + e^(2(μ−ε)/kT) + … = 1/(1 − e^((μ−ε)/kT)) (14.3)

n̄ = ∑_N N p_N = kT ∂ln Ξ/∂μ (14.4)

n̄ = e^((μ−ε)/kT)/(1 − e^((μ−ε)/kT)) (14.5)

[I think that step is done using another geometric progression (the square term etc.)]

n̄ = 1/(e^((ε−μ)/kT) − 1) (14.6)

This is the Bose-Einstein distribution. (14.7)

[Two graphs describing each distribution for a range of temperatures]

NB. For the purpose of the graphs only, the T-dependence of μ is ignored.
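The two distributions can be compared directly; a sketch (μ = 0 and units kT = 1 are my choices):

```python
import numpy as np

kT = 1.0
mu = 0.0

def fermi_dirac(eps):
    """Average occupation of a fermion state, eq. (14.2)."""
    return 1.0 / (np.exp((eps - mu) / kT) + 1.0)

def bose_einstein(eps):
    """Average occupation of a boson state, eq. (14.6); requires eps > mu."""
    return 1.0 / (np.exp((eps - mu) / kT) - 1.0)

eps = np.linspace(0.1, 5.0, 50)
fd, be = fermi_dirac(eps), bose_einstein(eps)
assert np.all((fd > 0) & (fd < 1))              # Pauli: occupation between 0 and 1
assert np.all(be > fd)                          # bosons always more occupied at the same eps
assert abs(fermi_dirac(mu) - 0.5) < 1e-12       # n = 1/2 exactly at eps = mu
# both reduce to the Boltzmann factor for (eps - mu) >> kT
assert abs(fd[-1] / np.exp(-(eps[-1] - mu) / kT) - 1.0) < 0.01
```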

Lecture 15: Bose-Einstein Distribution

n̄(ε) = 1/(e^((ε−μ)/kT) − 1) (15.1)

[Figure plotting n̄(ε) with asymptote]

Asymptote at ε = μ. Therefore μ must be less than the lowest energy of the 1-particle states.

n̄(ε_i) is the number of particles with energy ε_i. The total number of particles, N, in the system is known. This means that n̄ summed over all the states, of energies ε_i, must yield a total of N:

∑_i n̄(ε_i) = N (15.2)

This applies for both fermions and bosons; μ depends on T (albeit weakly).


The free-electron metal

ε is given by solving the Schrödinger equation:

[Cubic box of side L]

ε = ħ²q²/2m, with q = (n_x π/L, n_y π/L, n_z π/L) (15.3)

The density of these single-particle quantum states, g(ε), is the number of states per unit energy. (15.4)

Number of states with |q| in (q, q + dq):

(1/8) 4πq² dq ∙ (L/π)³ (15.5)

{1st factor is the positive octant of a spherical shell in q-space, 2nd is the number of q-points per unit volume of q-space}

Converting from q to ε using ε = ħ²q²/2m: (15.6)

g(ε) = (V/4π²)(2m/ħ²)^(3/2) √ε (15.7)

BUT: for each spatial solution of the Schrödinger equation there are 2 states (spin up and spin down), so

g(ε) = (V/2π²)(2m/ħ²)^(3/2) √ε (15.8)

[Two graphs: the step-function occupation f(ε) at T = 0, and the rising density of states g(ε) ∝ √ε]

N = ∫₀^∞ g(ε) f(ε) dε (15.9)

This fixes μ at any temperature. At T = 0, f is a step function:

N = ∫₀^μ g(ε) dε = (V/3π²)(2mμ/ħ²)^(3/2) (15.10)

μ(T = 0) = (ħ²/2m)(3π²N/V)^(2/3) (15.11)

This value of μ at T = 0 is called the Fermi Energy, ε_F.

[Figure displaying Fermi energy on a Fermi-Dirac graph]
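Equation (15.11) gives Fermi energies of a few eV for real metals; a check using the conduction-electron density of copper (the density value is my input, not from the lecture):

```python
import numpy as np

hbar = 1.0545718e-34   # J s
m_e = 9.109e-31        # kg
eV = 1.602e-19         # J

n = 8.5e28             # conduction electrons per m^3, roughly copper (my value)

# eq. (15.11): eps_F = (hbar^2 / 2m)(3 pi^2 n)^(2/3)
eps_F = (hbar**2 / (2 * m_e)) * (3 * np.pi**2 * n) ** (2.0 / 3.0)
print(eps_F / eV)      # of order several eV
```

The result, about 7 eV, is enormous compared with kT ≈ 0.025 eV at room temperature, which is why only the electrons within ~kT of ε_F matter for the heat capacity below.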

Lecture 16: Continuation and condensation {continues Lecture 15 really}

Roughly half of the electrons within kT below the Fermi energy, ε_F, have been raised in energy by ~kT. The number of such electrons is:

N_excited ~ g(ε_F) kT (16.1)


In fact careful integration gives C = γT + αT³, where the 1st term is from the electrons and the 2nd is vibrational (Debye).

[Graph showing these two terms plotted]

Bose-Einstein condensation

[Figure of μ moving for the BE distribution; for details, see Mandl.]

Below what critical temperature are a significant fraction of the bosons in the very lowest quantum state?

For particles in a box:

g(ε) = (V/4π²)(2m/ħ²)^(3/2) √ε (16.2)

{i.e. the free-particle density of states with the factor of 2 spin degeneracy removed, for spin-0 bosons}

For all states but the lowest, μ ≈ 0 is a good approximation, so the total number of particles in all the states but the lowest is:

N_excited = ∫₀^∞ g(ε)/(e^(ε/kT) − 1) dε (16.3)

This is approximately g(kT) ∙ kT, since the occupied band of states has a width of ~kT. (16.4)

For condensation, this must fall below N, the total number of particles.


N ~ V (mkT_c/2πħ²)^(3/2) (16.5)

kT_c ~ (2πħ²/m)(N/V)^(2/3) (16.6)

Careful integration (introducing the factor ζ(3/2) ≈ 2.612) gives:

N = 2.612 V (mkT_c/2πħ²)^(3/2)

[Figure with two graphs showing the proportion of occupied states with temperature, with N₀/N being the fraction of bosons in the very lowest state]

Blackbody radiation

[Cube again with side L]

Electric field E = 0 at the boundary, so

q = (n_x π/L, n_y π/L, n_z π/L) (16.7)

Each mode has ω = cq.

Each mode is a simple harmonic oscillator, so we can apply the Planck oscillator formula.

Lecture 17: Energy in EM modes

Useful in Electrons in solids to know these parts well – actually S.M. ties into most modules


Using all this, with 2 polarisations per q point:

E = ∫₀^∞ (1/8) 4πq² dq (L/π)³ ∙ 2 ∙ [ħω/2 + ħω/(e^(ħω/kT) − 1)], with ω = cq (17.1)

The integral of the 1st term gives infinity, but this is the zero-point energy and is independent of T, with no observable consequences. Dropping it:

E = (V/π²c³) ∫₀^∞ ħω³/(e^(ħω/kT) − 1) dω (17.2)

E/V = ∫₀^∞ u(ω) dω (17.3)

Spectral density (excluding the zero-point term):

u(ω) = (ħ/π²c³) ω³/(e^(ħω/kT) − 1) (17.4)

Since ∫₀^∞ x³/(e^x − 1) dx = π⁴/15:

E/V = (π²/15)(kT)⁴/(ħc)³ (17.5)


Low ω (ħω ≪ kT): u(ω) ≈ ω²kT/π²c³, the classical (Rayleigh-Jeans) form, independent of ħ.

High ω (ħω ≫ kT): u(ω) ≈ (ħω³/π²c³) e^(−ħω/kT), the exponential (Wien) tail.

Lecture 18: Blackbody Radiation Continued

Energy density:

u = E/V = (π²/15)(kT)⁴/(ħc)³ (18.1)

Spectral density:

u(ω) = (ħ/π²c³) ω³/(e^(ħω/kT) − 1) (18.2)

Absorption and emission

[Figure of radiation being absorbed and emitted by a surface; also a figure of a hemisphere with directions labelled in polar co-ordinates]

Blackbody: zero reflection, absorbs all incident radiation (and subsequently re-emits).

Consider EM radiation arriving at angles between θ and θ + dθ to the normal. The fraction of the total modes considered is:

½ sin θ dθ (18.3)

[Another figure with light hitting a surface, the angles and lengths labelled. In retrospect the previous diagram was also describing light hitting a surface; see audio lecture for description until diagrams drawn here]

Energy arriving on area A (figure) per unit time is:

u c A cos θ ∙ ½ sin θ dθ (18.4)

Total for all modes (therefore all angles):

∫₀^{π/2} u c A cos θ ∙ ½ sin θ dθ = ucA/4 (18.5)

The energy flux is then uc/4 = σT⁴, and σ = π²k⁴/(60ħ³c²) is called the Stefan-Boltzmann constant.
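The combination π²k⁴/(60ħ³c²) can be checked against the accepted value of the Stefan-Boltzmann constant:

```python
import numpy as np

kB = 1.380649e-23        # J/K
hbar = 1.054571817e-34   # J s
c = 2.99792458e8         # m/s

# sigma = pi^2 kB^4 / (60 hbar^3 c^2), from flux = u c / 4 with u = (pi^2/15)(kT)^4/(hbar c)^3
sigma = np.pi**2 * kB**4 / (60 * hbar**3 * c**2)
print(sigma)             # W m^-2 K^-4, should come out near 5.67e-8
```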

This flux is also the energy flux EMITTED by a blackbody at temperature T.

Classical statistical mechanics

Phase Space: For N particles in D dimensions there are DN position variables and DN

momentum coordinates, giving 2DN dimensional phase space.

For a 1D particle in an infinite well:

[Figure with lengths 0 to L and then a p vs x phase-space plot related to it]

QM shows that the volume of phase space between successive quantised states is h.

Boltzmann, Gibbs and co. assumed one can integrate over phase space to achieve what in QM is given by a sum over microstates. This was later justified by the above result.

Fin

