
Information Thermodynamics on Causal Networks

Takahiro Sagawa Department of Basic Science, University of Tokyo

In collaboration with

Sosuke Ito Department of Physics, University of Tokyo

The 6th KIAS Conference on Statistical Physics “Nonequilibrium Statistical Physics of Complex Systems” (NSPCS14)

10 July 2014, Seoul, Korea

Collaborators on Information Thermodynamics

• Masahito Ueda (Univ. Tokyo)

• Shoichi Toyabe (LMU Munich)

• Eiro Muneyuki (Chuo Univ.)

• Masaki Sano (Univ. Tokyo)

• Sosuke Ito (Univ. Tokyo)

• Naoto Shiraishi (Univ. Tokyo)

• Sang Wook Kim (Pusan National Univ.)

• Jung Jun Park (National Univ. Singapore)

• Kang-Hwan Kim (KAIST)

• Simone De Liberato (Univ. Paris VII)

• Juan M. R. Parrondo (Univ. Madrid)

• Jordan M. Horowitz (Univ. Massachusetts)

• Jukka Pekola (Aalto Univ.)

• Jonne Koski (Aalto Univ.)

• Ville Maisi (Aalto Univ.)

Outline

• Introduction

• Information and Entropy

• Second Law with Measurement and Feedback

• Information Thermodynamics on Causal Networks

• Summary

Information Thermodynamics

Information processing at the level of thermal fluctuations

Foundation of the second law of thermodynamics

Application to nanomachines and nanodevices

[Schematic: the demon acquires information about the system by measurement and acts back on it by feedback.]

Szilard Engine (1929)

[Schematic: a single-particle gas in a box, in contact with a heat bath at temperature T.]

• Initial state: insert a partition into the box; the particle is on the left or the right.

• Measurement: find out which side the particle is on (1 bit = ln 2 of information).

• Feedback: depending on the outcome ("left" or "right"), perform an isothermal, quasi-static expansion and extract the work $W = k_{\rm B}T \ln 2$.

Free energy $F = E - TS$: the measurement decreases the entropy by $\ln 2$ and thereby increases the free energy by $k_{\rm B}T \ln 2$; the subsequent feedback (expansion) decreases it again, extracting the work.

→ We can control physical entropy by using information.

L. Szilard, Z. Phys. 53, 840 (1929)

Experimental Realizations

• With a colloidal particle: Toyabe, TS, Ueda, Muneyuki, & Sano, Nature Physics (2010). Efficiency: 30%. Validation of the generalized Jarzynski equality $\langle e^{-\beta(W - \Delta F)} \rangle = \gamma$.

• With a single electron: Koski, Maisi, TS, & Pekola, to appear in PRL (2014). Efficiency: 75%. Validation of $\langle e^{-\beta(W - \Delta F) - I} \rangle = 1$.

Outline

• Introduction

• Information and Entropy

• Second Law with Measurement and Feedback

• Information Thermodynamics on Causal Networks

• Summary

Shannon Information

Information content associated with an event $k$ that occurs with probability $p_k$: $\ln \frac{1}{p_k}$

Shannon information = the average of the information content: $H = \sum_k p_k \ln \frac{1}{p_k}$

[Example: a binary event with probabilities 9/10 and 1/10.]
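A few lines of Python reproduce these numbers for the slide's 9/10 vs. 1/10 example (a minimal sketch; entropies are in nats):

```python
import numpy as np

p = np.array([9/10, 1/10])        # the slide's example probabilities
info = np.log(1 / p)              # information content of each event, in nats
H = np.sum(p * info)              # Shannon information (the average)
print(info)  # [0.105..., 2.302...]: the rare event carries more information
print(H)     # ~0.325 nats
```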

Mutual Information

System S and memory M (measurement device): a measurement with stochastic errors creates a correlation between S and M, quantified by the mutual information

$I(S{:}M) = H(S) + H(M) - H(S,M)$

$0 \le I \le H(M)$: $I = 0$ when no information is obtained, $I = H(M)$ when the measurement is error-free.

Example: binary symmetric channel with error rate $\varepsilon$ (the outcome equals the state with probability $1-\varepsilon$ and is flipped with probability $\varepsilon$):

$I = \ln 2 + \varepsilon \ln \varepsilon + (1 - \varepsilon) \ln(1 - \varepsilon)$
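A minimal Python sketch of this formula, assuming a uniform binary input as on the slide (the error-rate values are illustrative):

```python
import numpy as np

def H2(p):
    """Binary Shannon entropy in nats."""
    return -p * np.log(p) - (1 - p) * np.log(1 - p)

def bsc_mutual_info(eps):
    """I = ln 2 + eps*ln(eps) + (1-eps)*ln(1-eps) for a uniform binary input."""
    return np.log(2) - H2(eps)

for eps in (1e-12, 0.1, 0.25, 0.5):
    print(f"eps = {eps:.2f}:  I = {bsc_mutual_info(eps):.4f} nats")
# eps -> 0 gives I -> ln 2 (error-free); eps = 0.5 gives I = 0 (no information)
```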

Entropy Production

System S in contact with a heat bath B (inverse temperature β). $\langle Q \rangle$: averaged heat absorbed by S. $W$: work. $\Delta F$: free-energy difference.

Entropy production in the total system: $\langle \Delta S_{\rm SB} \rangle = \Delta S_{\rm S} - \beta \langle Q \rangle$, where $\Delta S_{\rm S}$ is the change in the Shannon entropy of S.

If the initial and the final states are canonical distributions: $\langle \Delta S_{\rm SB} \rangle = \beta(\langle W \rangle - \Delta F)$.

Stochastic Entropy Production

Stochastic dynamics of system S (e.g., a Langevin system) with phase-space point x, in contact with a heat bath at inverse temperature β.

Stochastic entropy production along a trajectory of the system from time 0 to τ:

$\Delta s_{\rm SB} = \Delta s_{\rm S} - \beta Q$

where $s_{\rm S}[x,t] \equiv -\ln P[x,t]$ ($P[x,t]$: probability distribution at time t), $\Delta s_{\rm S} \equiv s_{\rm S}[x(\tau), \tau] - s_{\rm S}[x(0), 0]$, and $\langle \Delta s_{\rm S} \rangle = \Delta S_{\rm S}$.

If the initial and the final states are canonical distributions: $\Delta s_{\rm SB} = \beta(W - \Delta F)$.

Fluctuation Theorem and Second Law

Integral fluctuation theorem (Jarzynski equality), valid for any initial and final distributions:

$\langle e^{-\Delta s_{\rm SB}} \rangle = 1$

By Jensen's inequality, it implies the second law of thermodynamics (Clausius inequality):

$\langle \Delta s_{\rm SB} \rangle \ge 0$, i.e., $\Delta S_{\rm S} \ge \beta \langle Q \rangle$ and $\langle W \rangle \ge \Delta F$.

Jarzynski, PRL (1997); Seifert, PRL (2005), …

The second law can thus be expressed as an equality involving the full cumulants.
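As a sanity check, here is a minimal Python sketch that verifies the Jarzynski equality numerically for an overdamped particle in a harmonic trap whose stiffness is ramped $k_0 \to k_1$ (all parameter values are illustrative assumptions; for this protocol $\Delta F = \frac{1}{2\beta}\ln(k_1/k_0)$ is known exactly):

```python
import numpy as np

rng = np.random.default_rng(0)
beta, gamma = 1.0, 1.0            # inverse temperature, friction (illustrative)
k0, k1, tau = 1.0, 4.0, 1.0       # trap stiffness ramped k0 -> k1 over time tau
n_steps, n_traj = 1000, 200_000
dt = tau / n_steps

# Canonical initial distribution for V(x) = k0 x^2 / 2
x = rng.normal(0.0, np.sqrt(1.0 / (beta * k0)), size=n_traj)
W = np.zeros(n_traj)              # stochastic work accumulated along each trajectory
dk = (k1 - k0) / n_steps

for i in range(n_steps):
    k = k0 + (k1 - k0) * i / n_steps
    W += 0.5 * dk * x**2          # work = change of V at fixed x when k is switched
    # overdamped Euler-Maruyama step at stiffness k
    x += -(k / gamma) * x * dt + np.sqrt(2 * dt / (beta * gamma)) * rng.normal(size=n_traj)

dF = 0.5 / beta * np.log(k1 / k0)                    # exact free-energy difference
print("<W>              =", W.mean())                # exceeds dF (second law)
print("Delta F          =", dF)
print("<e^{-b(W - dF)}> =", np.exp(-beta * (W - dF)).mean())  # ~1 up to O(dt) error
```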

Outline

• Introduction

• Information and Entropy

• Second Law with Measurement and Feedback

• Information Thermodynamics on Causal Networks

• Summary

Our Setup

[Setup: systems X and Y, each in contact with a heat bath B.]

Time evolution of X under the influence of Y with initial and final correlations

X and Y interact with each other only through information exchange

Special Cases: Measurement and Feedback

• Measurement: X is the memory, Y the engine.

• Feedback: X is the engine, Y the memory.

Stochastic Entropy Production & Mutual Information

Entropy production in X and its heat bath: $\Delta s_{\rm XB} = \Delta s_{\rm X} - \beta Q_{\rm X}$

Stochastic mutual information between X and Y:

Initial correlation: $I^{\rm i}_{xy} = \ln \frac{P[x,y]}{P[x]\,P[y]}$, with average $\langle I^{\rm i} \rangle = \int dx\, dy\, P[x,y] \ln \frac{P[x,y]}{P[x]\,P[y]}$

Final correlation: $I^{\rm f}_{x'y} = \ln \frac{P[x',y]}{P[x']\,P[y]}$, with average $\langle I^{\rm f} \rangle = \int dx'\, dy\, P[x',y] \ln \frac{P[x',y]}{P[x']\,P[y]}$

General Result

Change in the correlation: $\Delta I \equiv I^{\rm f}_{x'y} - I^{\rm i}_{xy}$

Integral fluctuation theorem: $\langle e^{-\Delta s_{\rm XB} + \Delta I} \rangle = 1$

Generalized second law: $\langle \Delta s_{\rm XB} \rangle \ge \langle \Delta I \rangle$

TS and M. Ueda, PRL 109, 180602 (2012).
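To make the general result concrete, here is a minimal exact-enumeration sketch in Python for the simplest measurement-feedback model: a two-state system X, a binary outcome y with error rate ε, and one partial-thermalization feedback step (the parameter values and the y-dependent energy assignment are illustrative assumptions, not from the talk). It checks $\langle e^{-\Delta s_{\rm XB} + \Delta I}\rangle = 1$ to machine precision and $\langle \Delta s_{\rm XB}\rangle \ge \langle \Delta I\rangle$:

```python
import numpy as np
from itertools import product

beta, eps, r = 1.0, 0.2, 0.8        # inverse temperature, error rate, relaxation rate

# Feedback: after outcome y, lower the energy of state x = y (E = 0) vs x != y (E = 1).
def pi_y(y):                        # canonical distribution under the y-dependent energies
    w = np.exp(-beta * np.array([0.0 if x == y else 1.0 for x in (0, 1)]))
    return w / w.sum()

def T_y(y):                         # partial thermalization; detailed balance wrt pi_y
    return (1 - r) * np.eye(2) + r * np.tile(pi_y(y), (2, 1))

p_x = np.array([0.5, 0.5])                             # initial distribution of X
p_xy = np.array([[p_x[x] * ((1 - eps) if y == x else eps)
                  for y in (0, 1)] for x in (0, 1)])   # joint after measurement
p_y = p_xy.sum(axis=0)
p_fy = np.array([[sum(p_xy[x, y] * T_y(y)[x, xp] for x in (0, 1))
                  for y in (0, 1)] for xp in (0, 1)])  # final joint p_f(x', y)
p_f = p_fy.sum(axis=1)                                 # final marginal of X

ift = sigma_av = dI_av = 0.0
for x, y, xp in product((0, 1), repeat=3):
    T = T_y(y)
    p_path = p_xy[x, y] * T[x, xp]
    # entropy production: Shannon-entropy change of X minus beta * heat absorbed
    sigma = np.log(p_x[x]) - np.log(p_f[xp]) + np.log(T[x, xp] / T[xp, x])
    # change in the stochastic mutual information between X and the outcome y
    dI = np.log(p_fy[xp, y] / (p_f[xp] * p_y[y])) - np.log(p_xy[x, y] / (p_x[x] * p_y[y]))
    ift += p_path * np.exp(-sigma + dI)
    sigma_av += p_path * sigma
    dI_av += p_path * dI

print("<exp(-sigma + dI)> =", ift)                   # exactly 1 (up to round-off)
print("<sigma> =", sigma_av, ">= <dI> =", dI_av)     # generalized second law
```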

Special Case 1: Feedback Control

Feedback: the control protocol depends on the measurement outcome. X: engine, Y: memory.

Here $\Delta I = I^{\rm rem} - I$, where $I$ is the mutual information obtained by the measurement and $I^{\rm rem}$ is the remaining correlation; $I - I^{\rm rem}$ is (an upper bound of) the correlation that is used by feedback.

Integral fluctuation theorem: $\langle e^{-\Delta s_{\rm XB} - (I - I^{\rm rem})} \rangle = 1$

Generalized second law: $\langle \Delta s_{\rm XB} \rangle \ge -(\langle I \rangle - \langle I^{\rm rem} \rangle)$

In terms of the extractable work: $\langle W_{\rm ext} \rangle \le -\Delta F + k_{\rm B}T\, \langle I \rangle$

Special Case 2: Measurement

X: memory, Y: engine. The measurement creates the correlation from scratch, so $\Delta I = I$.

Integral fluctuation theorem: $\langle e^{-\Delta s_{\rm XB} + I} \rangle = 1$

Generalized second law: $\langle \Delta s_{\rm XB} \rangle \ge \langle I \rangle$ (the entropy production of the memory is bounded from below by the information obtained).

Outline

• Introduction

• Information and Entropy

• Second Law with Measurement and Feedback

• Information Thermodynamics on Causal Networks

• Summary

Generalization to Many-body Systems with Complex Information Flow

Sosuke Ito & TS, PRL 111, 180603 (2013).

Characterize the dynamics by Bayesian networks

Bayesian Networks

[Network diagram: nodes x0, x1, x2, y0, y1, z0, z1 connected by arrows.]

Node: event. Arrow: causal relationship.

Parents of node x (denoted by "pa(x)"): the set of nodes that have arrows to x.

Bayesian Networks

Example parent sets read off the diagrams:

pa(x0) = ∅ (the empty set)

pa(x1) = {x0}, pa(x2) = {x1, x0} (a chain within x alone)

pa(x1) = {x0, y0}, pa(x2) = {x1, y0}, pa(y1) = {x1, x0, y0, z0} (with arrows among x, y, z)
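The parent-set representation is easy to put in code. A minimal Python sketch (the edge pa(z1) = {z0} is an assumption added for illustration; it is not stated on the slide):

```python
from graphlib import TopologicalSorter

# Parent sets of the example network (x0, x1, x2, y0, y1, z0, z1).
parents = {
    "x0": [], "y0": [], "z0": [],
    "x1": ["x0", "y0"],
    "x2": ["x1", "y0"],
    "y1": ["x1", "x0", "y0", "z0"],
    "z1": ["z0"],                   # assumed edge, for illustration only
}

# A topological ordering (a1, a2, ...): every node comes after its parents.
order = list(TopologicalSorter(parents).static_order())
print("topological ordering:", order)

# The joint distribution factorizes as p = prod_k p(a_k | pa(a_k)).
factors = [f"p({v})" if not parents[v] else f"p({v}|{','.join(parents[v])})"
           for v in order]
print("joint probability =", " ".join(factors))
```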

Simplest Case: Measurement and Feedback

[Diagram: x1 → m1 (measurement); x1, m1 → x2 (time evolution under feedback control), as in the Szilard engine.]

The path probability: $p(x_1)\, p(m_1 \mid x_1)\, p(x_2 \mid x_1, m_1)$

Measurement: the outcome $m_1$ is generated from $x_1$. Feedback: the subsequent time evolution of the system depends on both $x_1$ and the outcome $m_1$.
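A minimal numeric sketch of this path probability (the error rate and the feedback kernel are illustrative assumptions; the toy kernel here happens to depend only on m1, which is a special case of p(x2 | x1, m1)):

```python
from itertools import product

eps = 0.1                                       # measurement error rate (illustrative)
p_x1 = {0: 0.5, 1: 0.5}                         # p(x1): e.g. left/right in the Szilard engine
p_m1 = lambda m1, x1: (1 - eps) if m1 == x1 else eps     # p(m1 | x1), measurement
p_x2 = lambda x2, x1, m1: 0.9 if x2 == m1 else 0.1       # p(x2 | x1, m1), toy feedback kernel

total = 0.0
for x1, m1, x2 in product((0, 1), repeat=3):
    path_prob = p_x1[x1] * p_m1(m1, x1) * p_x2(x2, x1, m1)   # p(x1) p(m1|x1) p(x2|x1,m1)
    print((x1, m1, x2), round(path_prob, 4))
    total += path_prob
print("sum over all paths =", total)            # normalization: sums to 1
```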

Multidimensional Langevin Dynamics

$\frac{dx}{dt} = f_x(x, y) + \xi_x, \qquad \frac{dy}{dt} = f_y(y, z) + \xi_y, \qquad \frac{dz}{dt} = f_z(z, x) + \xi_z$

with white Gaussian noise: $\langle \xi_X(t) \rangle = 0$, $\langle \xi_X(t)\, \xi_{X'}(t') \rangle = D_X\, \delta_{XX'}\, \delta(t - t')$ for $X, X' = x, y, z$.

Discretizing time as $x_i \equiv x(t = i\,dt)$, $y_i \equiv y(t = i\,dt)$, $z_i \equiv z(t = i\,dt)$, e.g.

$x_{i+1} = x_i + f_x(x_i, y_i)\, dt + \xi_x\, dt$,

the dynamics is described by a Bayesian network over the nodes $x_1, x_2, x_3, \dots$; $y_1, y_2, y_3, \dots$; $z_1, z_2, z_3, \dots$
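A minimal Euler-Maruyama sketch of these equations (the linear drifts and noise intensities are illustrative assumptions; note that all three variables are updated simultaneously from the step-i values, exactly as in the Bayesian-network discretization):

```python
import numpy as np

rng = np.random.default_rng(1)
dt, n_steps = 1e-3, 5000
D = {"x": 1.0, "y": 1.0, "z": 1.0}      # noise intensities D_X (illustrative)

# Illustrative linear drifts with the coupling structure of the slide:
# dx/dt = f_x(x, y), dy/dt = f_y(y, z), dz/dt = f_z(z, x)
f_x = lambda x, y: -x + 0.5 * y
f_y = lambda y, z: -y + 0.5 * z
f_z = lambda z, x: -z + 0.5 * x

x = y = z = 0.0
traj = np.empty((n_steps, 3))
for i in range(n_steps):
    # Euler-Maruyama: the noise increment xi*dt has variance D*dt per step
    x, y, z = (
        x + f_x(x, y) * dt + np.sqrt(D["x"] * dt) * rng.normal(),
        y + f_y(y, z) * dt + np.sqrt(D["y"] * dt) * rng.normal(),
        z + f_z(z, x) * dt + np.sqrt(D["z"] * dt) * rng.normal(),
    )
    traj[i] = (x, y, z)
print("steady-state standard deviations:", traj[n_steps // 2:].std(axis=0))
```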

Main Result:

Fluctuation theorem on Causal Networks

$\left\langle \exp\!\left( -\Delta s_{\rm XB} + I_{\rm fin} - I_{\rm ini} - \sum_l I_{\rm tr}^l \right) \right\rangle = 1$

Generalized second law: $\langle \Delta s_{\rm XB} \rangle \ge \langle I_{\rm fin} \rangle - \langle I_{\rm ini} \rangle - \sum_l \langle I_{\rm tr}^l \rangle$

$I_{\rm ini}$: initial correlation between the target system and the other systems

$I_{\rm fin}$: final correlation between the system and the others

$I_{\rm tr}^l$: information transfer from the system to the others during the dynamics

$I_{\rm ini}$: Initial correlation

$I_{\rm ini} \equiv \ln \frac{p(x_1, {\rm pa}(x_1))}{p(x_1)\, p({\rm pa}(x_1))}$

$I_{\rm ini}$: initial correlation between the system X (with nodes $x_1, \dots, x_N$) and the other systems.

Ifin: Final correlation

xN-1

xN

C

x1

cN '

cN '-1cN '-2

cN '-3 cN '-4

Ifin º lnp(xN ,C)

p(xN )p(C)

C is the set of

random variables

that effect on the final state

from outside of X.

Ifin : Final correlation between the system and others

Nj xxaaC ,,\,, 11

a j = xNwith

↓Topological ordering

Iltr : Information transfer

xN-1

xN

C

x1

cN '

cN '-2

cN '-3cN '-4

),,|pa(),,|(

),,|,pa(ln

1111

11tr

cccpcccp

ccccpI

llXll

lllXl

C = cl 1£ l £ N '{ }

),,1( Nl

Itr : Information transfer from X into cl during the dynamics

This term is equivalent to

Transfer entropy [T. Schreiber, PRL 85, 461, 2000.]

NllX xxcc ,,)(pa)(pa 1

↑Topological ordering

cN '-1
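For a discrete toy model, the transfer entropy can be evaluated directly from the joint distribution. A minimal Python sketch for the smallest case, where $c_l = y_1$, the history is $y_0$, and ${\rm pa}_X(c_l) = \{x_1\}$ (the numbers and the copy-with-probability-q kernel are illustrative assumptions):

```python
import numpy as np
from itertools import product

# Toy joint distribution p(y0, x1, y1) on binary variables:
# y1 copies x1 with probability q, otherwise stays at y0.
q = 0.3
p_y0 = {0: 0.5, 1: 0.5}
p_x1 = {0: 0.5, 1: 0.5}                       # x1 independent of y0 here
p_y1 = lambda y1, x1, y0: q * (y1 == x1) + (1 - q) * (y1 == y0)

p = {(y0, x1, y1): p_y0[y0] * p_x1[x1] * p_y1(y1, x1, y0)
     for y0, x1, y1 in product((0, 1), repeat=3)}

def marg(keep):
    """Marginal distribution over the given index positions (0=y0, 1=x1, 2=y1)."""
    out = {}
    for k, v in p.items():
        kk = tuple(k[i] for i in keep)
        out[kk] = out.get(kk, 0.0) + v
    return out

# Transfer entropy from x into y1 given the history y0:
# I_tr = sum p(y0,x1,y1) ln[ p(x1,y1|y0) / (p(x1|y0) p(y1|y0)) ]
p_y0m, p_xy0, p_y1y0 = marg((0,)), marg((0, 1)), marg((0, 2))
I_tr = sum(v * np.log(v * p_y0m[(y0,)] / (p_xy0[(y0, x1)] * p_y1y0[(y0, y1)]))
           for (y0, x1, y1), v in p.items() if v > 0)
print("transfer entropy I_tr =", I_tr, "nats")
```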

Toward Application to Biochemical Signal Transduction

Information thermodynamics gives a useful bound that is stronger than the usual second law!

Ito & TS, arXiv:1406.5810

Analogy with communication theory

Outline

• Introduction

• Information and Entropy

• Second Law with Measurement and Feedback

• Information Thermodynamics on Causal Networks

• Summary

Summary

• Unified framework of information thermodynamics

– Fluctuation theorem for information exchanges

• Generalization to causal networks

– Entropy transfer (information flow) is crucial

• Toward application to signal transduction

Thank you for your attention!

S. Ito & T. Sagawa, arXiv:1406.5810.

S. Ito & T. Sagawa, PRL 111, 180603 (2013).

T. Sagawa & M. Ueda, PRL 109, 180602 (2012).