7/29/2019 Boltzmann Shannon
Boltzmann, Shannon, and
(Missing) Information
Second Law of Thermodynamics.
Entropy of a gas.
Entropy of a message. Information?
B.B. (Before Boltzmann): Carnot, Kelvin, Clausius (19th c.)
Second Law of Thermodynamics: The entropy of an isolated system never decreases.
Entropy defined in terms of heat exchange:
Change in entropy = (Heat absorbed)/(Absolute temp)
(+ if absorbed, - if emitted).
(Molecules unnecessary.)
Hot (Th) --Q--> Cold (Tc)
Isolated system. Has some structure (ordered). Heat, Q, extracted from hot; the same amount absorbed by cold: energy conserved (1st Law).
Entropy of hot decreases by Q/Th; entropy of cold increases by Q/Tc > Q/Th (2nd Law).
In the fullness of time: no structure (no order). Lukewarm.
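The bookkeeping on this slide can be sketched numerically; the temperatures and heat Q below are made-up illustrative values, not from the slides.

```python
# Entropy bookkeeping for heat Q passing from a hot to a cold reservoir,
# using: change in entropy = (heat absorbed)/(absolute temp).
# The numbers are made-up illustrative values.
def entropy_changes(Q, T_hot, T_cold):
    """Return (dS_hot, dS_cold, dS_total) when heat Q flows hot -> cold."""
    dS_hot = -Q / T_hot     # hot emits Q: its entropy decreases
    dS_cold = Q / T_cold    # cold absorbs Q: its entropy increases, by more
    return dS_hot, dS_cold, dS_hot + dS_cold

dS_h, dS_c, dS_tot = entropy_changes(Q=1000.0, T_hot=500.0, T_cold=300.0)
assert dS_tot > 0  # Q/Tc > Q/Th, so total entropy goes up (2nd Law)
```

Because Tc < Th, the cold side always gains more entropy than the hot side loses, so the total never decreases.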
Paul's entropy picture:
- Sun releases heat Q at high temp: its entropy decreases.
- Living stuff absorbs heat Q at lower temp: its entropy increases by a larger amount.
- Stuff releases heat q, gets more organized: its entropy decreases.
- Surroundings absorb q, get more disorganized: entropy increases.
Overall, entropy increases.
2nd Law of Thermodynamics does not forbid emergence of local complexity (e.g., life, brain, ...).
2nd Law of Thermodynamics does not require emergence of local complexity (e.g., life, brain, ...).
Boltzmann (1872)
Entropy of a dilute gas.
N molecules obeying Newtonian physics (time reversible). State of each molecule given by its position and momentum.
Molecules may collide, i.e., transfer energy and momentum among each other.
Represent the system in a space whose coordinates are positions and momenta p = mv (phase space).
(Figure: axes are momentum vs. position.)
Subdivide the space into B bins.
p_k = fraction of particles whose positions and momenta are in bin k.
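The binning step can be sketched in code; the toy 1-D gas, the uniform random states, and the grid sizes are all assumptions for illustration.

```python
import random

# Sketch of the binning step for a toy 1-D gas whose positions x and
# momenta p both lie in [0, 1): divide phase space into a Bx-by-Bp grid
# of B = Bx*Bp bins and compute p_k, the fraction of molecules in bin k.
random.seed(0)
N, Bx, Bp = 10_000, 4, 2
molecules = [(random.random(), random.random()) for _ in range(N)]  # (x, p)

def bin_fractions(molecules, Bx, Bp):
    counts = [0] * (Bx * Bp)
    for x, p in molecules:
        i = min(int(x * Bx), Bx - 1)   # position cell
        j = min(int(p * Bp), Bp - 1)   # momentum cell
        counts[i * Bp + j] += 1
    return [c / len(molecules) for c in counts]

pk = bin_fractions(molecules, Bx, Bp)
assert abs(sum(pk) - 1.0) < 1e-12  # the fractions p_k sum to 1
```

The list `pk` is exactly the histogram the next slides build on.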
The p_k's change because of:
- Motion
- Collisions
- External forces
Build a histogram of the p_k's.
All in 1 bin: highly structured, highly ordered; no missing information, no uncertainty.
Uniformly distributed: unstructured, disordered, random; maximum uncertainty, maximum missing information.
In-between case: intermediate amount of missing information (uncertainty).
Any flattening of the histogram (phase-space landscape) increases uncertainty.
Given the p_k's, how much information do you need to locate a molecule in phase space?
Boltzmann:
The amount of uncertainty, or missing information, or randomness, of the distribution of the p_k's can be measured by
H_B = Σ_k p_k log(p_k)
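A quick numerical check of this measure, keeping the slide's sign convention (H_B carries a plus sign; entropy later gets the minus sign via S = -k_B H_B). B = 4 bins is an arbitrary choice for illustration.

```python
import math

# Boltzmann's measure as written on the slide: H_B = sum_k p_k log(p_k).
# B = 4 bins, chosen arbitrarily for illustration.
def H_B(p):
    return sum(pk * math.log(pk) for pk in p if pk > 0)  # treat 0*log 0 as 0

B = 4
all_in_one = [1.0] + [0.0] * (B - 1)   # everything in one bin: ordered
uniform = [1.0 / B] * B                # spread evenly: maximal disorder

assert H_B(all_in_one) == 0.0                    # maximum H_B
assert abs(H_B(uniform) + math.log(B)) < 1e-12   # minimum H_B = -log B
```

Flattening the histogram moves H_B from 0 down toward -log B, matching the extremes on the next slide.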
All in 1 bin: highly structured, highly ordered. H_B = 0. Maximum H_B.
Uniformly distributed: unstructured, disordered, random. H_B = -log B. Minimum H_B.
In-between case: intermediate amount of missing information (uncertainty). In-between value of H_B.
(p_k histogram revisited.)
Boltzmann's Famous H Theorem
Define: H_B = Σ_k p_k log(p_k)
Assume: Molecules obey Newton's laws of motion.
Show: H_B never increases.
AHA! -H_B never decreases: behaves like entropy!!
If it looks like a duck...
Identify entropy with H_B:
S = -k_B H_B
(k_B = Boltzmann's constant)
New version of the Second Law:
The phase-space landscape either does not change or it becomes flatter.
It may peak locally provided it flattens overall.
(Life?)
Two paradoxes
1. Reversal (Loschmidt, Zermelo).
Irreversible phenomena (2nd Law, arrow of time) emerge from reversible molecular dynamics. (How can this be? cf. Tony Rothman.)
2. Recurrence (Poincaré).
Sooner or later, you are back where you started. (So, what does approach to equilibrium mean?)
Graphic from: J. P. Crutchfield et al., Chaos, Sci. Am., Dec. 1986.
Well...
1. Interpret the H theorem probabilistically. Boltzmann's treatment of collisions is really probabilistic: molecular chaos, coarse-graining, indeterminacy anticipating quantum mechanics? Entropy is probability of a macrostate; is it something that emerges in the transition from the micro to the macro?
2. The Poincaré recurrence time is really very, very long for real systems, longer than the age of the universe, even. Anyhow, entropy does not decrease!
...on to Shannon
A.B. (After Boltzmann): Shannon (1949)
Entropy of a message.
Message encoded in an alphabet of B symbols, e.g.,
- English sentence (26 letters + space + punctuation)
- Morse code (dot, dash, space)
- DNA (A, T, G, C)
p_k = fraction of the time that symbol k occurs (~ probability that symbol k occurs).
Pick a symbol, any symbol.
Shannon's problem: Want a quantity that measures
- missing information: how much information is needed to establish what the symbol is, or
- uncertainty about what the symbol is, or
- how many yes-no questions need to be asked to establish what the symbol is.
Shannon's answer:
H_S = -k Σ_k p_k log(p_k)
(a positive number)
Morse code example:
All dots: p1 = 1, p2 = p3 = 0. Take any symbol: it's a dot; no uncertainty, no question needed, no missing information. H_S = 0.
50-50 chance that it's a dot or a dash: p1 = p2 = 1/2, p3 = 0. Given the p's, need to ask one question (what question?), one piece of missing information. H_S = log(2) = 0.69.
Random: all symbols equally likely, p1 = p2 = p3 = 1/3. Given the p's, need to ask as many as 2 questions: 2 pieces of missing information. H_S = log(3) = 1.1.
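The three Morse cases can be checked in a few lines; natural logarithms and k = 1 are assumed, which reproduces the slide's numbers.

```python
import math

# The three Morse cases computed with H_S = -k * sum_k p_k log(p_k),
# assuming natural logs and k = 1.
def H_S(p, k=1.0):
    return -k * sum(pk * math.log(pk) for pk in p if pk > 0)

all_dots = [1.0, 0.0, 0.0]       # p1 = 1: no missing information
dot_or_dash = [0.5, 0.5, 0.0]    # 50-50: one yes/no question
random_sym = [1/3, 1/3, 1/3]     # uniform: up to two questions

assert H_S(all_dots) == 0.0
assert abs(H_S(dot_or_dash) - math.log(2)) < 1e-12   # log 2 = 0.693...
assert abs(H_S(random_sym) - math.log(3)) < 1e-12    # log 3 = 1.098...
```

With log base 2 instead of natural log, the same cases would give 0, 1, and log2(3) ≈ 1.58 bits.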
Two comments:
1. It looks like a duck, but does it quack? There's no H theorem for Shannon's H_S.
2. H_S is insensitive to meaning.
Shannon: "[The] semantic aspects of communication are irrelevant to the engineering problem."
On H theorems:
Q: What did Boltzmann have that Shannon didn't?
A: Newton (or equivalent dynamical rules for the evolution of the p_k's).
Does Shannon have rules for how the p_k's evolve? In a communications system, the p_k's may change because of transmission errors. In genetics, is it mutation? Is the result always a flattening of the p_k landscape, or an increase in missing information?
Is Shannon's H_S just a metaphor? What about Maxwell's demon?
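One hedged way to picture such rules: assume a symmetric noise channel that keeps each symbol with probability 1-e and substitutes a uniformly random one with probability e. This toy dynamics is an assumption, not anything from the slides, but under it the p_k landscape does flatten and H_S never decreases:

```python
import math

# Toy "dynamics" for the p_k's: a symmetric noise channel mixes the
# distribution toward uniform, so missing information H_S can only grow.
def H_S(p):
    return -sum(pk * math.log(pk) for pk in p if pk > 0)

def noisy_step(p, e=0.1):
    B = len(p)
    return [(1 - e) * pk + e / B for pk in p]  # mix toward uniform

p = [0.9, 0.05, 0.05]
for _ in range(5):
    nxt = noisy_step(p)
    assert H_S(nxt) >= H_S(p)  # flattening: H_S never decreases
    p = nxt
```

Concavity of H_S guarantees this monotonicity for any mixing toward the uniform distribution, which is the closest Shannon analogue of an H theorem under this assumed rule.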
On dynamical rules.
Is a neuron like a refrigerator?
(Figure: entropy of fridge decreases; entropy of signal decreases.)
The entropy of a refrigerator may decrease, but it needs electricity.
The entropy of the message passing through a neuron may decrease, but it needs nutrients.
General Electric designs refrigerators. Who designs neurons?
Insensitive to meaning: Morse revisited.
X = {.... . .-.. .-.. --- .-- --- .-. .-.. -..}
     ( H  E  L    L    O   W   O   R   L   D )
Y = {.- -... -.-. -.. . ..-. --. .... .. .---}
     ( A  B    C   D  E  F    G   H   I   J )
Same p_k's, same entropies, same missing information.
If X and Y are separately scrambled: still same p_k's, same missing information, same entropy.
The message is in the sequence? What do geneticists say?
Information as entropy is not a very useful way to characterize the genetic code?
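The scrambling claim can be checked directly; the dot-dash message below is an arbitrary made-up example, not the X or Y from the slides.

```python
import math
import random
from collections import Counter

# Entropy depends only on the symbol frequencies p_k, not on their order,
# so shuffling a message leaves H_S unchanged. The message is made up.
def H_S(msg):
    n = len(msg)
    counts = sorted(Counter(msg).values())   # order-independent frequencies
    return -sum((c / n) * math.log(c / n) for c in counts)

random.seed(1)
msg = "....-..-..----.----.-..-.."
scrambled = "".join(random.sample(msg, len(msg)))  # a random permutation
assert H_S(scrambled) == H_S(msg)  # same p_k's, same missing information
```

This is exactly why H_S cannot see meaning: every permutation of HELLO WORLD scores the same.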
Do Boltzmann and Shannon mix?
Boltzmann's entropy of a gas: S_B = -k_B Σ_k p_k log p_k
k_B relates temperature to energy (E ~ k_B T); it relates temperature of a gas to PV.
Shannon's entropy of a message: S_S = -k Σ_k p_k log p_k
k is some positive constant; no reason for it to be k_B.
Does S_B + S_S mean anything? Does the sum never decrease? Can an increase in one make up for a decrease in the other?
Maxwell's demon yet once more.
The demon measures the velocity of a molecule by bouncing light off it and absorbing the reflected light; the process transfers energy to the demon and increases the demon's entropy, making up for the entropy decrease of the gas.