
    Entropy

CONSTANTINO TSALLIS 1,2
1 Centro Brasileiro de Pesquisas Físicas, Rio de Janeiro, Brazil
2 Santa Fe Institute, Santa Fe, USA

    Article Outline

    Glossary

    Definition of the Subject

    Introduction

    Some Basic Properties

    Boltzmann–Gibbs Statistical Mechanics

On the Limitations of Boltzmann–Gibbs Entropy and Statistical Mechanics

The Nonadditive Entropy $S_q$

A Connection Between Entropy and Diffusion

    Standard and q-Generalized Central Limit Theorems

    Future Directions

    Acknowledgments

    Bibliography

    Glossary

Absolute temperature Denoted T.

Clausius entropy Also called thermodynamic entropy. Denoted S.

Boltzmann–Gibbs entropy Basis of Boltzmann–Gibbs statistical mechanics. This entropy, denoted $S_{BG}$, is additive. Indeed, for two probabilistically independent subsystems A and B, it satisfies $S_{BG}(A+B) = S_{BG}(A) + S_{BG}(B)$.

Nonadditive entropy It usually refers to the basis of nonextensive statistical mechanics. This entropy, denoted $S_q$, is nonadditive for $q \neq 1$. Indeed, for two probabilistically independent subsystems A and B, it satisfies $S_q(A+B) \neq S_q(A) + S_q(B)$ ($q \neq 1$). For historical reasons, it is frequently (but inadequately) referred to as nonextensive entropy.

q-logarithmic and q-exponential functions Denoted $\ln_q x$ ($\ln_1 x = \ln x$) and $e_q^x$ ($e_1^x = e^x$), respectively.

Extensive system So called for historical reasons. A more appropriate name would be additive system. It is a system which, in one way or another, relies on or is connected to the (additive) Boltzmann–Gibbs entropy. Its basic dynamical and/or structural quantities are expected to be of the exponential form. In the sense of complexity, it may be considered a simple system.

Nonextensive system So called for historical reasons. A more appropriate name would be nonadditive system. It is a system which, in one way or another, relies on or is connected to a (nonadditive) entropy such as $S_q$ ($q \neq 1$). Its basic dynamical and/or structural quantities are expected to asymptotically be of the power-law form. In the sense of complexity, it may be considered a complex system.

    Definition of the Subject

Thermodynamics and statistical mechanics are among the most important formalisms in contemporary physics. They have overwhelming and intertwined applications in science and technology. They essentially rely on two basic concepts, namely energy and entropy. The mathematical expression used for the first one is well known to be nonuniversal; indeed, it depends on whether we are, say, in classical, quantum, or relativistic regimes. The second concept, and very specifically its connection with the microscopic world, has been considered for well over one century as essentially unique and universal as a physical concept. Although some mathematical generalizations of the entropy have been proposed during the last forty years, they have frequently been considered as mere practical expressions for disciplines such as cybernetics and control theory, with no particular physical interpretation. What we have witnessed during the last two decades is the growth, among physicists, of the belief that this is not necessarily so. In other words, the physical entropy would basically rely on the microscopic dynamical and structural properties of the system under study. For example, for systems microscopically evolving with strongly chaotic dynamics, the connection between the thermodynamical entropy and the thermostatistical entropy would be the one found in standard textbooks. But, for more complex systems (e.g., for weakly chaotic dynamics), it becomes either necessary, or convenient, or both, to extend the traditional connection. The present article presents the ubiquitous concept of entropy, useful even for systems for which no energy can be defined at all, within a standpoint reflecting a nonuniversal conception for the connection between the thermodynamic and the thermostatistical entropies. Consequently, both the standard entropy and its recent generalizations, as well as the corresponding statistical mechanics, are presented here on an equal footing.

    Introduction

The concept of entropy (from the Greek ἐν τρέπω, en trepo: at turn, at transformation) was first introduced in 1865 by the German physicist and mathematician Rudolf Julius Emanuel Clausius, in order to mathematically complete the formalism of classical thermodynamics [55], one of the most important theoretical achievements of contemporary physics. The term was coined to parallel energy (from the Greek ἐνεργός, energos: at work), the other fundamental concept of thermodynamics. Clausius' connection was given by

$$dS = \frac{\delta Q}{T} \,, \tag{1}$$

where $\delta Q$ denotes an infinitesimal transfer of heat. In other words, 1/T acts as an integrating factor for $\delta Q$. In fact, it was only in 1909 that thermodynamics was finally given a logically consistent axiomatic formulation, by the Greek mathematician Constantin Carathéodory.

In 1872, some years after Clausius' proposal, the Austrian physicist Ludwig Eduard Boltzmann introduced a quantity, which he denoted H, defined in terms of microscopic quantities:

$$H \equiv \int f(v)\, \ln[f(v)]\, dv \,, \tag{2}$$

where $f(v)\,dv$ is the number of molecules in the velocity-space interval $dv$. Using Newtonian mechanics, Boltzmann showed that, under some intuitive assumptions (Stoßzahlansatz, or molecular chaos hypothesis) regarding the nature of molecular collisions, H does not increase with time. Five years later, in 1877, he identified this quantity with Clausius' entropy through $-kH \equiv S$, where k is a constant. In other words, he established that

$$S = -k \int f(v)\, \ln[f(v)]\, dv \,, \tag{3}$$

later on generalized into

$$S = -k \iint f(q,p)\, \ln[f(q,p)]\, dq\, dp \,, \tag{4}$$

where (q, p) is called the μ-space and constitutes the phase space (coordinate q and momentum p) corresponding to one particle.

Boltzmann's genius insight – the first ever mathematical connection of the macroscopic world with the microscopic one – was, for well over three decades, highly controversial, since it was based on the hypothesis of the existence of atoms. Only a few selected scientists, like the English chemist and physicist John Dalton, the Scottish physicist and mathematician James Clerk Maxwell, and the American physicist, chemist and mathematician Josiah Willard Gibbs, believed in the reality of atoms and molecules. A large part of the scientific establishment was, at the time, strongly against such an idea. The intricate evolution of Boltzmann's lifelong epistemological struggle, which ended tragically with his suicide in 1906, may be considered a neat illustration of Thomas Kuhn's paradigm shift, and of the corresponding reaction of the scientific community, as described in The Structure of Scientific Revolutions. There are in fact two important formalisms in contemporary physics where the mathematical theory of probabilities enters as a central ingredient. These are statistical mechanics (with the concept of entropy as a functional of probability distributions) and quantum mechanics (with the physical interpretation of wave functions and measurements). In both cases, contrasting viewpoints and passionate debates have taken place along more than one century, and they still continue today. This is no surprise after all. If it is undeniable that energy is a very deep and subtle concept, entropy is even more so. Indeed, energy concerns the world of (microscopic) possibilities, whereas entropy concerns the world of the probabilities of those possibilities, a step further in epistemological difficulty.

In his celebrated 1902 book Elementary Principles of Statistical Mechanics, Gibbs introduced the modern form of the entropy for classical systems, namely

$$S = -k \int d\Gamma\, f(q,p)\, \ln[C f(q,p)] \,, \tag{5}$$

where Γ represents the full phase space of the system, thus containing all coordinates and all momenta of its elementary particles, and C is introduced to take into account the finite size and the physical dimensions of the smallest admissible cell in Γ-space. The constant k is known today to be a universal one, called the Boltzmann constant, and given by $k = 1.3806505(24) \times 10^{-23}$ Joule/Kelvin. The studies of the German physicist Max Planck along the lines of Boltzmann and Gibbs, after the appearance of quantum mechanical concepts, eventually led to the expression

$$S = k \ln W \,, \tag{6}$$

which he coined the Boltzmann entropy. This expression is carved on the stone of Boltzmann's grave at the Central Cemetery of Vienna. The quantity W is the total number of microstates of the system that are compatible with our macroscopic knowledge of it. It is obtained from Eq. (5) under the hypothesis of a uniform distribution, or equal probabilities.

The Hungarian–American mathematician and physicist Johann von Neumann extended the concept of BG entropy in two steps – in 1927 and 1932, respectively – in order to also cover quantum systems. The following expression, frequently referred to as the von Neumann entropy, resulted:

$$S = -k\, \mathrm{Tr}\, \rho \ln \rho \,, \tag{7}$$

ρ being the density operator (with $\mathrm{Tr}\, \rho = 1$).

Another important step was taken in 1948 by the American electrical engineer and mathematician Claude Elwood Shannon. Having in mind the theory of digital communications, he explored the properties of the discrete form

$$S = -k \sum_{i=1}^{W} p_i \ln p_i \,, \tag{8}$$

frequently referred to as the Shannon entropy (with $\sum_{i=1}^{W} p_i = 1$). This form can be recovered from Eq. (5) for the particular case in which the phase-space density is $f(q,p) = \sum_{i=1}^{W} p_i\, \delta(q - q_i)\, \delta(p - p_i)$. It can also be recovered from Eq. (7) when ρ is diagonal. We may generically refer to Eqs. (5), (6), (7) and (8) as the BG entropy, denoted $S_{BG}$. It is a measure of the disorder of the system or, equivalently, of our degree of ignorance or lack of information about its state. To illustrate a variety of properties, the discrete form (8) is particularly convenient.
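As a concrete numerical aside (a minimal sketch, not part of the original text; the first distribution is an arbitrary illustrative choice, and we set k = 1), the discrete form (8) and its maximal value ln W at equal probabilities, Eq. (6), can be checked directly:

```python
import numpy as np

# Minimal sketch of the discrete BG entropy, Eq. (8), with k = 1.
def S_BG(p):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]              # zero-probability events drop out (expansibility)
    return -np.sum(p * np.log(p))

W = 4
print(S_BG([0.7, 0.1, 0.1, 0.1]))   # an arbitrary distribution: below ln W
print(S_BG(np.full(W, 1.0 / W)))    # equal probabilities: equals ln 4
print(np.log(W))                    # the maximal value of Eq. (6)
```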

Some Basic Properties

Non-negativity It can be easily verified that, in all cases, $S_{BG} \geq 0$, the zero value corresponding to certainty, i.e., $p_i = 1$ for one of the W possibilities, and zero for all the others. To be more precise, this is exactly so whenever $S_{BG}$ is expressed either in the form (7) or in the form (8). However, this property of non-negativity may no longer be true if it is expressed in the form (5). This violation is one of the mathematical manifestations of the fact that, at the microscopic level, the state of any physical system exhibits its quantum nature.

Expansibility Also $S_{BG}(p_1, p_2, \ldots, p_W, 0) = S_{BG}(p_1, p_2, \ldots, p_W)$, i.e., zero-probability events do not modify our information about the system.

Maximal value $S_{BG}$ is maximized at equal probabilities, i.e., for $p_i = 1/W\ \forall i$. Its value is that of Eq. (6). This corresponds to the Laplace principle of indifference, or principle of insufficient reason.

Concavity If we have two arbitrary probability distributions $\{p_i\}$ and $\{p'_i\}$ for the same set of W possibilities, we can define the intermediate probability distribution $p''_i = \lambda p_i + (1-\lambda)\, p'_i$ ($0 < \lambda < 1$). It straightforwardly follows that $S_{BG}(\{p''_i\}) \geq \lambda\, S_{BG}(\{p_i\}) + (1-\lambda)\, S_{BG}(\{p'_i\})$. This property is essential for thermodynamics, since it eventually leads to thermodynamic stability, i.e., to robustness with regard to energy fluctuations. It also leads to the tendency of the entropy to attain, as time evolves, its maximal value compatible with our macroscopic knowledge of the system, i.e., with the possibly known values of the macroscopic constraints.

Lesche stability or experimental robustness B. Lesche introduced in 1982 [107] the definition of an interesting property, which he called stability. It reflects the experimental robustness that a physical quantity is expected to exhibit. In other words, similar experiments should yield similar numerical results for the physical quantities. Let us consider two probability distributions $\{p_i\}$ and $\{p'_i\}$, assumed to be close, in the sense that $\sum_{i=1}^{W} |p_i - p'_i| < \delta$, $\delta > 0$ being a small number. An entropic functional $S(\{p_i\})$ is said to be stable, or experimentally robust, if, for any given $\epsilon > 0$, a $\delta > 0$ exists such that $|S(\{p_i\}) - S(\{p'_i\})| / S_{max} < \epsilon$, where $S_{max}$ is the maximal value that the functional can attain ($\ln W$ in the case of $S_{BG}$). This implies that $\lim_{\delta \to 0} \lim_{W \to \infty} [S(\{p_i\}) - S(\{p'_i\})] / S_{max} = 0$. As we shall soon see, this property is much stronger than it seems at first sight. Indeed, it provides a (necessary but not sufficient) criterion for classifying entropic functionals as physically admissible or not. It can be shown that $S_{BG}$ is Lesche-stable (or experimentally robust).

Entropy production If we start the (deterministic) time evolution of a generic classical system from an arbitrarily chosen point in its Γ phase space, it typically follows a quite erratic trajectory which, in many cases, gradually visits the entire (or almost the entire) phase space. By making partitions of this Γ-space, and counting the frequency of visits to the various cells (and related symbolic quantities), it is possible to define probability sets. Through them, we can calculate a sort of time evolution of $S_{BG}(t)$. If the system is chaotic (sometimes called strongly chaotic), i.e., if its sensitivity to the initial conditions increases exponentially with time, then $S_{BG}(t)$ increases linearly with t in the appropriate asymptotic limits. This rate of increase of the entropy is called the Kolmogorov–Sinai entropy rate and, for a large class of systems, it coincides (Pesin identity, or Pesin theorem) with the sum of the positive Lyapunov exponents. These exponents characterize the exponential divergences, along various directions in the Γ-space, of a small discrepancy in the initial condition of a trajectory.

It turns out, however, that the Kolmogorov–Sinai entropy rate is, in general, quite inconvenient for computational calculations for arbitrary nonlinear dynamical systems. In practice, another quantity is used instead [102], usually referred to as the entropy production per unit time, which we denote $K_{BG}$. Its definition is as follows. We first make a partition of the Γ-space into many W cells ($i = 1, 2, \ldots, W$). In one of them, arbitrarily chosen, we randomly place M initial conditions (i.e., an ensemble). As time evolves, the occupancy of the W cells determines the set $\{M_i(t)\}$, with $\sum_{i=1}^{W} M_i(t) = M$. This set enables the definition of a probability set with $p_i(t) \equiv M_i(t)/M$, which in turn determines $S_{BG}(t)$. We then define the entropy production per unit time as follows:

$$K_{BG} \equiv \lim_{t\to\infty}\, \lim_{W\to\infty}\, \lim_{M\to\infty}\, \frac{S_{BG}(t)}{t} \,. \tag{9}$$

To date, no theorem guarantees that this quantity coincides with the Kolmogorov–Sinai entropy rate. However, many numerical studies of various chaotic systems strongly suggest so. The same turns out to occur with what is frequently referred to in the literature as a Pesin-like identity. For instance, if we have a one-dimensional dynamical system, its sensitivity to the initial conditions $\xi \equiv \lim_{\Delta x(0) \to 0} \Delta x(t)/\Delta x(0)$ is typically given by

$$\xi(t) = e^{\lambda t} \,, \tag{10}$$

where $\Delta x(t)$ is the discrepancy in the one-dimensional phase space of two trajectories initially differing by $\Delta x(0)$, and λ is the Lyapunov exponent ($\lambda > 0$ corresponds to strong sensitivity to the initial conditions, i.e., strong chaos, and $\lambda < 0$ corresponds to strong insensitivity to the initial conditions). The so-called Pesin-like identity amounts, if $\lambda \geq 0$, to

$$K_{BG} = \lambda \,. \tag{11}$$
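As a rough numerical illustration (a sketch, not part of the original text: the cell number W, ensemble size M, and time horizon are arbitrary finite stand-ins for the three limits in Eq. (9)), one can estimate $K_{BG}$ for the fully chaotic logistic map $x \mapsto 4x(1-x)$, whose Lyapunov exponent is $\lambda = \ln 2 \approx 0.693$:

```python
import numpy as np

# Sketch of the construction behind Eq. (9) for the logistic map x -> 4x(1-x).
W, M, t_max = 10_000, 100_000, 12       # illustrative finite stand-ins
rng = np.random.default_rng(0)
x = (rng.random(M) + 5_000) / W         # M initial conditions in one cell

for t in range(1, t_max + 1):
    x = 4.0 * x * (1.0 - x)             # one step of the map
    idx = np.minimum((x * W).astype(int), W - 1)
    counts = np.bincount(idx, minlength=W)
    p = counts[counts > 0] / M          # occupancy probabilities p_i(t)
    S = -np.sum(p * np.log(p))          # S_BG(t), with k = 1
    print(f"t={t:2d}  S_BG(t)/t = {S / t:5.3f}")
# In the intermediate regime, before finite-W saturation, the ratio
# approaches values near ln 2, illustrating the Pesin-like identity (11).
```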

Additivity and extensivity If we consider a system A+B constituted by two probabilistically independent subsystems A and B, i.e., if we consider $p_{ij}^{A+B} = p_i^A\, p_j^B$, we immediately obtain from Eq. (8) that

$$S_{BG}(A+B) = S_{BG}(A) + S_{BG}(B) \,. \tag{12}$$

In other words, the BG entropy is additive [130]. If our system is constituted by N probabilistically independent identical subsystems (or elements), we clearly have $S_{BG}(N) \propto N$. It frequently happens, however, that the N elements are not exactly independent but only asymptotically so in the $N \to \infty$ limit. This is the usual case of many-body Hamiltonian systems involving only short-range interactions, where the concept of short range will be discussed in detail later on. For such systems, $S_{BG}$ is only asymptotically additive, i.e.,

$$0 < \lim_{N\to\infty} \frac{S_{BG}(N)}{N} < \infty \,. \tag{13}$$

An entropy $S(\{p_i\})$ of a specific system is said to be extensive if it satisfies

$$0 < \lim_{N\to\infty} \frac{S(N)}{N} < \infty \,, \tag{14}$$

where no hypothesis at all is made about the possible independence or the weak or strong correlations between the elements of the system whose entropy S we are considering. Equation (13) amounts to saying that the additive entropy $S_{BG}$ is extensive for weakly correlated systems such as the already mentioned many-body short-range-interacting Hamiltonian ones. It is important to clearly realize that additivity and extensivity are independent properties. An additive entropy such as $S_{BG}$ is extensive for simple systems such as the ones just mentioned, but it turns out to be nonextensive for other, more complex, systems discussed later on. For many of these more complex systems, it is the nonadditive entropy $S_q$ (to be analyzed later on) which turns out to be extensive for a nonstandard value of q (i.e., $q \neq 1$).

Boltzmann–Gibbs Statistical Mechanics

Physical systems (classical, quantum, relativistic) can be theoretically described in very many ways, through microscopic, mesoscopic, or macroscopic equations, reflecting either stochastic or deterministic time evolutions, or even both types simultaneously. Those systems whose time evolution is completely determined by a well-defined Hamiltonian with appropriate boundary conditions and admissible initial conditions are the main purpose of an important branch of contemporary physics, named statistical mechanics. This remarkable theory (or formalism, as it is sometimes called), which for large systems satisfactorily matches classical thermodynamics, was primarily introduced by Boltzmann and Gibbs. The physical system can be in all types of situations. Two paradigmatic such situations correspond to isolation, and to thermal contact with a large reservoir called a thermostat. Their stationary state ($t \to \infty$) is usually referred to as thermal equilibrium. Both situations have been formally considered by Gibbs within his mathematical formulation of statistical mechanics, and they respectively correspond to the so-called micro-canonical and canonical ensembles (other ensembles do exist, such as the grand-canonical ensemble, appropriate for those situations in which the total number of elements of the system is not fixed; this is however out of the scope of the present article).

The stationary state of the micro-canonical ensemble is determined by $p_i = 1/W$ ($\forall i$, where i runs over all possible microscopic states), which corresponds to the extremization of $S_{BG}$ with a single (and trivial) constraint, namely

$$\sum_{i=1}^{W} p_i = 1 \,. \tag{15}$$

To obtain the stationary state for the canonical ensemble, the thermostat being at temperature T, we must (typically) add one more constraint, namely

$$\sum_{i=1}^{W} p_i E_i = U \,, \tag{16}$$

where $\{E_i\}$ are the energies of all the possible states of the system (i.e., eigenvalues of the Hamiltonian with the appropriate boundary conditions). The extremization of $S_{BG}$ with the two constraints above straightforwardly yields

$$p_i = \frac{e^{-\beta E_i}}{Z} \tag{17}$$

$$Z \equiv \sum_{j=1}^{W} e^{-\beta E_j} \tag{18}$$

with the partition function Z, and the Lagrange parameter $\beta = 1/kT$. This is the celebrated BG distribution for thermal equilibrium (also called the Boltzmann weight, or the Gibbs state), which has been at the basis of an enormous number of successes (in fluids, magnets, superconductors, superfluids, Bose–Einstein condensation, conductors, chemical reactions, percolation, among many other important situations). The connection with classical thermodynamics, and its Legendre-transform structure, occurs through relations such as

$$\frac{1}{T} = \frac{\partial S}{\partial U} \tag{19}$$

$$F \equiv U - TS = -\frac{1}{\beta} \ln Z \tag{20}$$

$$U = -\frac{\partial}{\partial \beta} \ln Z \tag{21}$$

$$C \equiv T\, \frac{\partial S}{\partial T} = \frac{\partial U}{\partial T} = -T\, \frac{\partial^2 F}{\partial T^2} \,, \tag{22}$$

where F, U and C are the Helmholtz free energy, the internal energy, and the specific heat, respectively.
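As a minimal numerical sketch (the three-level spectrum below is an arbitrary illustrative choice, and we set k = 1), the canonical machinery of Eqs. (17), (18) and (20) can be checked directly:

```python
import numpy as np

# Sketch: canonical BG weights (17)-(18) and relation (20), with k = 1.
k = 1.0
E = np.array([0.0, 1.0, 2.0])        # toy energy eigenvalues (an assumption)

def thermo(T):
    beta = 1.0 / (k * T)
    w = np.exp(-beta * E)
    Z = w.sum()                       # partition function, Eq. (18)
    p = w / Z                         # BG distribution, Eq. (17)
    U = np.dot(p, E)                  # internal energy, Eq. (16)
    S = -k * np.sum(p * np.log(p))    # S_BG, Eq. (8)
    F = -np.log(Z) / beta             # Helmholtz free energy, Eq. (20)
    return U, S, F

T = 1.5
U, S, F = thermo(T)
assert abs(F - (U - T * S)) < 1e-12   # consistent with F = U - TS
print(U, S, F)
```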

The BG statistical mechanics historically appeared as the first connection between the microscopic and the macroscopic descriptions of the world, and it constitutes one of the cornerstones of contemporary physics. The Establishment resisted heavily before accepting the validity and power of Boltzmann's revolutionary ideas. In 1906 Boltzmann dramatically committed suicide, 34 years after he had first proposed the deep ideas that we are summarizing here. In the early 20th century, few people believed in Boltzmann's proposal (among those few, we must certainly mention Albert Einstein), and most physicists were simply unaware of the existence of Gibbs and of his profound contributions. It was only half a dozen years later that the emerging new generation of physicists recognized their respective genius (thanks in part to various clarifications produced by Paul Ehrenfest, and also to the experimental successes related to Brownian motion, the photoelectric effect, the specific heat of solids, and black-body radiation).

    On the Limitations of Boltzmann–Gibbs Entropy

and Statistical Mechanics

    Historical Background

Like any other human intellectual construct, the applicability of the BG entropy, and of the statistical mechanics to which it is associated, naturally has restrictions. The understanding of present developments of both the concept of entropy and its corresponding statistical mechanics demands some knowledge of the historical background.

Boltzmann was aware of the relevance of the range of the microscopic interactions between atoms and molecules. He wrote, in his 1896 Lectures on Gas Theory [41], the following words:

When the distance at which two gas molecules interact with each other noticeably is vanishingly small relative to the average distance between a molecule and its nearest neighbor—or, as one can also say, when the space occupied by the molecules (or their spheres of action) is negligible compared to the space filled by the gas—then the fraction of the path of each molecule during which it is affected by its interaction with other molecules is vanishingly small compared to the fraction that is rectilinear, or simply determined by external forces. [ . . . ] The gas is "ideal" in all these cases.

Gibbs was also aware. In his 1902 book [88], he wrote:

In treating of the canonical distribution, we shall always suppose the multiple integral in equation (92) [the partition function, as we call it nowadays] to have a finite value, as otherwise the coefficient of probability vanishes, and the law of distribution becomes illusory. This will exclude certain cases, but not such apparently, as will affect the value of our results with respect to their bearing on thermodynamics. It will exclude, for instance, cases in which the system or parts of it can be distributed in unlimited space [ . . . ]. It also excludes many cases in which the energy can decrease without limit, as when the system contains material points which attract one another inversely as the squares of their distances. [ . . . ]. For the purposes of a general discussion, it is sufficient to call attention to the assumption implicitly involved in the formula (92).

The extensivity/additivity of $S_{BG}$ has been challenged, throughout the last century, by many physicists. Let us mention just a few. In his 1936 Thermodynamics [82], Enrico Fermi wrote:

The entropy of a system composed of several parts is very often equal to the sum of the entropies of all the parts. This is true if the energy of the system is the sum of the energies of all the parts and if the work performed by the system during a transformation is equal to the sum of the amounts of work performed by all the parts. Notice that these conditions are not quite obvious and that in some cases they may not be fulfilled. Thus, for example, in the case of a system composed of two homogeneous substances, it will be possible to express the energy as the sum of the energies of the two substances only if we can neglect the surface energy of the two substances where they are in contact. The surface energy can generally be neglected only if the two substances are not very finely subdivided; otherwise, it can play a considerable role.

Laszlo Tisza wrote, in his Generalized Thermodynamics [178]:

The situation is different for the additivity postulate P a2, the validity of which cannot be inferred from general principles. We have to require that the interaction energy between thermodynamic systems be negligible. This assumption is closely related to the homogeneity postulate P d1. From the molecular point of view, additivity and homogeneity can be expected to be reasonable approximations for systems containing many particles, provided that the intermolecular forces have a short range character.

Corroborating the above, virtually all textbooks of quantum mechanics contain the mechanical calculations corresponding to a particle in a square well, the harmonic oscillator, the rigid rotator, a spin 1/2 in the presence of a magnetic field, and the hydrogen atom. In the textbooks of statistical mechanics we can find the thermostatistical calculations of all these systems . . . except the hydrogen atom! Why? Because the long-range electron-proton interaction produces an energy spectrum which leads to a divergent partition function. This is but a neat illustration of the above alert by Gibbs.

A Remark on the Thermodynamics of Short- and Long-Range Interacting Systems

We consider here a simple d-dimensional classical fluid, constituted by N point particles, governed by the Hamiltonian

$$\mathcal{H} = K + V = \sum_{i=1}^{N} \frac{p_i^2}{2m} + \sum_{i \neq j} V(r_{ij}) \,, \tag{23}$$

where the potential V(r) has, if it is attractive at short distances, no singularity at the origin, or an integrable singularity, and whose asymptotic behavior at infinity is given by $V(r) \sim -B/r^{\alpha}$ with $B > 0$ and $\alpha \geq 0$. One such example is the d = 3 Lennard–Jones fluid, for which $V(r) = A/r^{12} - B/r^{6}$ ($A > 0$), i.e., repulsive at short distances and attractive at long distances. In this case $\alpha = 6$. Another example could be Newtonian gravitation with a phenomenological short-distance cutoff (i.e., $V(r) \to \infty$ for $r \leq r_0$, with $r_0 > 0$). In this case, $\alpha = 1$. The full Γ-space of such a system has 2dN dimensions.

The total potential energy is expected to scale (assuming a roughly homogeneous distribution of the particles) as

$$\frac{U_{pot}(N)}{N} \propto -B \int_{1}^{\infty} dr\, r^{d-1}\, r^{-\alpha} \,, \tag{24}$$

where the integral starts contributing appreciably above a typical cutoff, here taken to be unity.

This integral is finite [$= -B/(\alpha - d)$] for $\alpha/d > 1$ (short-range interactions), and diverges for $0 \leq \alpha/d \leq 1$ (long-range interactions). In other words, the energy cannot be generically characterized by Eq. (24), and we must turn to a different and more powerful estimation. Given the finiteness of the size of the system, an appropriate one is, in all cases, given by

$$\frac{U_{pot}(N)}{N} \propto -B \int_{1}^{N^{1/d}} dr\, r^{d-1}\, r^{-\alpha} = -\frac{B}{d}\, N^{\star} \,, \tag{25}$$

where

$$N^{\star} \equiv \frac{N^{1-\alpha/d} - 1}{1 - \alpha/d} \sim
\begin{cases}
\dfrac{1}{\alpha/d - 1} & \text{if } \alpha/d > 1\,; \\[1ex]
\ln N & \text{if } \alpha/d = 1\,; \\[1ex]
\dfrac{N^{1-\alpha/d}}{1 - \alpha/d} & \text{if } 0 < \alpha/d < 1\,.
\end{cases} \tag{26}$$

Notice that $N^{\star} = \ln_{\alpha/d} N$, where the q-logarithm function $\ln_q x \equiv (x^{1-q} - 1)/(1-q)$ ($x > 0$; $\ln_1 x = \ln x$) will be shown to play an important role later on. Satisfactorily enough, Eqs. (26) recover the characterization of Eq. (24) in the limit $N \to \infty$, but they have the great advantage of providing, for finite N, a finite value. This fact will now be shown to enable a proper scaling of the macroscopic quantities in the thermodynamic limit ($N \to \infty$), for all values of $\alpha/d \geq 0$.

Let us address the thermodynamical consequences of the microscopic interactions being short- or long-ranged. To present a slightly more general illustration, we shall assume from now on that our homogeneous and isotropic classical fluid is made of magnetic particles. Its Gibbs free energy is then given by

$$G(N,T,p,H) = U(N,T,p,H) - T\, S(N,T,p,H) + p\, V(N,T,p,H) - H\, M(N,T,p,H) \,, \tag{27}$$

where (T, p, H) correspond respectively to the temperature, pressure and external magnetic field, V is the volume and M the magnetization. If the interactions are short-ranged (i.e., if $\alpha/d > 1$), we can divide this equation by N and then take the $N \to \infty$ limit. We obtain

$$g(T,p,H) = u(T,p,H) - T\, s(T,p,H) + p\, v(T,p,H) - H\, m(T,p,H) \,, \tag{28}$$

where $g(T,p,H) \equiv \lim_{N\to\infty} G(N,T,p,H)/N$, and analogously for the other variables of the equation. If the interactions were instead long-ranged (i.e., if $0 \leq \alpha/d \leq 1$), all these quantities would be divergent, hence thermodynamically nonsensical. Consequently, the generically correct procedure, i.e. $\forall\, \alpha/d \geq 0$, must conform to the following lines:

$$\lim_{N\to\infty} \frac{G(N,T,p,H)}{N N^{\star}} = \lim_{N\to\infty} \frac{U(N,T,p,H)}{N N^{\star}} - \lim_{N\to\infty} \frac{T}{N^{\star}}\, \frac{S(N,T,p,H)}{N} + \lim_{N\to\infty} \frac{p}{N^{\star}}\, \frac{V(N,T,p,H)}{N} - \lim_{N\to\infty} \frac{H}{N^{\star}}\, \frac{M(N,T,p,H)}{N} \,, \tag{29}$$

hence

$$g(T^{\star},p^{\star},H^{\star}) = u(T^{\star},p^{\star},H^{\star}) - T^{\star} s(T^{\star},p^{\star},H^{\star}) + p^{\star} v(T^{\star},p^{\star},H^{\star}) - H^{\star} m(T^{\star},p^{\star},H^{\star}) \,, \tag{30}$$

where the definitions of $T^{\star}$ and all the other variables are self-explanatory (e.g., $T^{\star} \equiv T/N^{\star}$). In other words, in order to have finite thermodynamic equations of state, we must in general express them in the $(T^{\star}, p^{\star}, H^{\star})$ variables. If $\alpha/d > 1$, this procedure recovers the usual equations of state, and the usual extensive (G, U, S, V, M) and intensive (T, p, H) thermodynamic variables. But, if $0 \leq \alpha/d \leq 1$, the situation is more complex, and we realize that three, instead of the traditional two, classes of thermodynamic variables emerge. We may call them extensive (S, V, M, N), pseudo-extensive (G, U) and pseudo-intensive (T, p, H) variables. All the energy-type thermodynamical variables (G, F, U) give rise to pseudo-extensive ones, whereas those which appear in the usual Legendre thermodynamical pairs give rise to pseudo-intensive ones (T, p, H, μ) and extensive ones (S, V, M, N). See Figs. 1 and 2.

The possibly long-range interactions within Hamiltonian (23) refer to the dynamical variables themselves. There is another important class of Hamiltonians, where the possibly long-range interactions refer to the coupling constants between localized dynamical variables. Such is, for instance, the case of the following classical Hamiltonian:

$$\mathcal{H} = K + V = \sum_{i=1}^{N} \frac{L_i^2}{2I} - \sum_{i \neq j} \frac{J_x\, s_i^x s_j^x + J_y\, s_i^y s_j^y + J_z\, s_i^z s_j^z}{r_{ij}^{\alpha}} \qquad (\alpha \geq 0) \,, \tag{31}$$

where $\{L_i\}$ are the angular momenta, I the moment of inertia, $\{(s_i^x, s_i^y, s_i^z)\}$ are the components of classical rotators, $(J_x, J_y, J_z)$ are coupling constants, and $r_{ij}$ runs over all distances between sites i and j of a d-dimensional lattice. For example, for a simple hypercubic lattice with unit crystalline parameter we have $r_{ij} = 1, 2, 3, \ldots$ if d = 1; $r_{ij} = 1, \sqrt{2}, 2, \ldots$ if d = 2; $r_{ij} = 1, \sqrt{2}, \sqrt{3}, 2, \ldots$ if d = 3; and so on. For such a case, we have that

$$N^{\star} \equiv \sum_{i=2}^{N} r_{1i}^{-\alpha} \,, \tag{32}$$

which has in fact the same asymptotic behaviors as indicated in Eq. (26). In other words, here again $\alpha/d > 1$ corresponds to short-range interactions, and $0 \leq \alpha/d \leq 1$ corresponds to long-range ones.

Entropy, Figure 1 For long-range interactions ($0 \leq \alpha/d \leq 1$) we have three classes of thermodynamic variables, namely the pseudo-intensive (scaling with $N^{\star}$), pseudo-extensive (scaling with $N N^{\star}$) and extensive (scaling with N) ones. For short-range interactions ($\alpha/d > 1$) the pseudo-intensive variables become intensive (independent of N), and the pseudo-extensive ones merge with the extensive ones, all of them now being extensive (scaling with N), thus recovering the traditional two textbook classes of thermodynamic variables.

Entropy, Figure 2 The so-called extensive systems ($\alpha/d > 1$ for the classical ones) typically involve absolutely convergent series, whereas the so-called nonextensive systems ($0 \leq \alpha/d < 1$ for the classical ones) typically involve divergent series. The marginal systems ($\alpha/d = 1$ here) typically involve conditionally convergent series, which therefore depend on the boundary conditions, i.e., typically on the external shape of the system. Capacitors constitute a notorious example of the $\alpha/d = 1$ case. The model usually referred to in the literature as the Hamiltonian Mean Field (HMF) model lies on the $\alpha = 0$ axis ($\forall d > 0$). The model usually referred to as the d-dimensional α-XY model [19] lies on the vertical axis at abscissa d ($\forall \alpha \geq 0$).

The correctness of the present generalized thermodynamical scalings has already been specifically checked in many physical systems, such as a ferrofluid-like model [97], Lennard–Jones-like fluids [90], magnetic systems [16,19,59,158], anomalous diffusion [66], and percolation [85,144].

Let us mention that, for the $\alpha = 0$ models (i.e., mean-field models), it is widespread in the literature to divide the potential term of the Hamiltonian by N in order to make it extensive by force. Although mathematically admissible (see [19]), this is obviously very unsatisfactory in principle, since it implies a microscopic coupling constant which depends on N. What we have described here is the thermodynamically proper way of eliminating the mathematical difficulties emerging in these models in the presence of long-range interactions.

Last but not least, we verify a point which is crucial for the developments below, namely that the entropy S is expected to be extensive no matter what the range of the interactions is.

The Nonadditive Entropy $S_q$

    Introduction and Basic Properties

The possibility of generalizing BG statistical mechanics on the basis of an entropy $S_q$ which generalizes $S_{BG}$ was introduced in 1988 [183] (see also [42,112,157,182]). This entropy is defined as follows:

$$S_q \equiv k\, \frac{1 - \sum_{i=1}^{W} p_i^q}{q - 1} \qquad (q \in \mathbb{R};\ S_1 = S_{BG}) \,. \tag{33}$$

For equal probabilities, this entropy takes the form

$$S_q = k \ln_q W \qquad (S_1 = k \ln W) \,, \tag{34}$$

where the q-logarithmic function has already been defined.
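A minimal numerical sketch of Eq. (33) (k = 1; the values of W and q are arbitrary illustrative choices), together with checks of Eq. (34) and of the $q \to 1$ limit:

```python
import numpy as np

# Sketch: the nonadditive entropy S_q of Eq. (33), with k = 1.
def S_q(p, q):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if np.isclose(q, 1.0):
        return -np.sum(p * np.log(p))           # S_1 = S_BG
    return (1.0 - np.sum(p**q)) / (q - 1.0)

def ln_q(x, q):
    return np.log(x) if np.isclose(q, 1.0) else (x**(1 - q) - 1) / (1 - q)

W, q = 8, 0.5                                    # illustrative values
uniform = np.full(W, 1.0 / W)
print(S_q(uniform, q), ln_q(W, q))               # equal, by Eq. (34)
print(S_q(uniform, 1.0), np.log(W))              # q -> 1 recovers ln W
```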

Remark With the same or a different prefactor, this entropic form has been successively and independently introduced on many occasions during the last decades. J. Havrda and F. Charvat [92] were apparently the first to ever introduce this form, though with a different prefactor (adapted to binary variables), in the context of cybernetics and information theory. I. Vajda [207] further studied this form, quoting Havrda and Charvat. Z. Daroczy [74] rediscovered this form (he quotes neither Havrda–Charvat nor Vajda). J. Lindhard and V. Nielsen [108] rediscovered this form (they quote none of the predecessors) through the property of entropic composability. B.D. Sharma and D.P. Mittal [163] introduced a two-parameter form which reproduces both $S_q$ and the Renyi entropy [145] as particular cases. A. Wehrl [209] mentions the form of $S_q$ on p. 247, quotes Daroczy, but ignores Havrda–Charvat, Vajda, Lindhard–Nielsen, and Sharma–Mittal. I myself rediscovered this form in 1985 with the aim of generalizing Boltzmann–Gibbs statistical mechanics, but quoted none of the predecessors in the 1988 paper [183]. In fact, I came to know the whole story only quite a few years later, thanks to S.R.A. Salinas and R.N. Silver, who were the first to provide me with the corresponding information. Such rediscoveries can by no means be considered particularly surprising. Indeed, this happens in science more frequently than is usually realized. This point is lengthily and colorfully developed by S.M. Stigler [167]. On p. 284, a most interesting example is described, namely that of the celebrated normal distribution. It was first introduced by Abraham De Moivre in 1733, then by Pierre Simon de Laplace in 1774, then by Robert Adrain in 1808, and finally by Carl Friedrich Gauss in 1809, nothing less than 76 years after its first publication! This distribution is universally called Gaussian because of the remarkable insights of Gauss concerning the theory of errors, applicable in all experimental sciences. A less glamorous illustration of the same phenomenon, but nevertheless interesting in the present context, is that of the Renyi entropy [145]. According to I. Csiszar [64], p. 73, the Renyi entropy had already been essentially introduced by Paul-Marcel Schutzenberger [161].

The entropy defined in Eq. (33) has the following main properties:

(i) $S_q$ is nonnegative ($\forall q$);

(ii) $S_q$ is expansible ($\forall q > 0$);

(iii) $S_q$ attains its maximal (minimal) value $k \ln_q W$ for $q > 0$ (for $q < 0$);

(iv) $S_q$ is concave (convex) for $q > 0$ (for $q < 0$);

(v) $S_q$ is Lesche-stable ($\forall q > 0$) [2];

(vi) $S_q$ yields a finite upper bound of the entropy production per unit time for a special value of q, whenever the sensitivity to the initial conditions exhibits an upper bound which asymptotically increases as a power of time. For example, many D = 1 nonlinear dynamical systems have a vanishing maximal Lyapunov exponent $\lambda_1$ and exhibit a sensitivity to the initial conditions which is (upper) bounded by

$$\xi = e_q^{\lambda_q t} \,, \tag{35}$$

with $\lambda_q > 0$, $q < 1$, the q-exponential function $e_q^x$ being the inverse of $\ln_q x$. More explicitly (see Fig. 3),

$$e_q^x \equiv
\begin{cases}
[1 + (1-q)\, x]^{\frac{1}{1-q}} & \text{if } 1 + (1-q)\, x > 0\,; \\
0 & \text{otherwise}\,.
\end{cases} \tag{36}$$

Such systems have a finite entropy production per unit time, which satisfies a q-generalized Pesin-like identity, namely, for the construction described in Sect. "Some Basic Properties",

$$K_q \equiv \lim_{t\to\infty}\, \lim_{W\to\infty}\, \lim_{M\to\infty}\, \frac{S_q(t)}{t} = \lambda_q \,. \tag{37}$$

The situation is in fact considerably richer than briefly described here. For further details, see [27,28,29,30,93,116,117,146,147,148,149,150,151,152].

(vii) $S_q$ is nonadditive for $q \neq 1$. Indeed, for independent subsystems A and B, it can be straightforwardly proved that

$$\frac{S_q(A+B)}{k} = \frac{S_q(A)}{k} + \frac{S_q(B)}{k} + (1-q)\, \frac{S_q(A)}{k}\, \frac{S_q(B)}{k} \,, \tag{38}$$

or, equivalently,

$$S_q(A+B) = S_q(A) + S_q(B) + \frac{(1-q)}{k}\, S_q(A)\, S_q(B) \,, \tag{39}$$

which makes explicit that $(1-q) \to 0$ plays the same role as $k \to \infty$. Property (38), occasionally referred to in the literature as pseudo-additivity, can be called subadditivity (superadditivity) for $q > 1$ ($q < 1$).
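Pseudo-additivity (39) is easy to verify numerically. The sketch below (k = 1; the two distributions and the value of q are arbitrary illustrative choices) builds the joint distribution of two independent subsystems and compares both sides:

```python
import numpy as np

# Sketch: numerical check of pseudo-additivity, Eq. (39), with k = 1.
def S_q(p, q):
    p = np.asarray(p, dtype=float)
    return -np.sum(p * np.log(p)) if np.isclose(q, 1.0) else \
        (1.0 - np.sum(p**q)) / (q - 1.0)

pA = np.array([0.5, 0.3, 0.2])       # arbitrary distribution for subsystem A
pB = np.array([0.7, 0.3])            # arbitrary distribution for subsystem B
q = 0.8
pAB = np.outer(pA, pB).ravel()       # joint distribution: p_ij = p_i^A p_j^B
lhs = S_q(pAB, q)
rhs = S_q(pA, q) + S_q(pB, q) + (1 - q) * S_q(pA, q) * S_q(pB, q)
assert np.isclose(lhs, rhs)          # Eq. (39) holds to machine precision
print(lhs, rhs)
```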

(viii) $S_q = -k\, D_q \sum_{i=1}^{W} p_i^x \big|_{x=1}$, where the 1909 Jackson differential operator is defined as follows:

$$D_q f(x) \equiv \frac{f(qx) - f(x)}{qx - x} \qquad (D_1 f(x) = df(x)/dx) \,. \tag{40}$$

(ix) A uniqueness theorem has been proved by Santos [159], which generalizes, for arbitrary q, that of Shannon [162]. Let us assume that an entropic form $S(\{p_i\})$ satisfies the following properties:

(a) $S(\{p_i\})$ is a continuous function of $\{p_i\}$; (41)

(b) $S(p_i = 1/W,\ \forall i)$ monotonically increases with the total number of possibilities W; (42)

(c) $\dfrac{S(A+B)}{k} = \dfrac{S(A)}{k} + \dfrac{S(B)}{k} + (1-q)\, \dfrac{S(A)}{k}\, \dfrac{S(B)}{k}$ if $p_{ij}^{A+B} = p_i^A\, p_j^B\ \forall (i,j)$, with $k > 0$; (43)

(d) $S(\{p_i\}) = S(p_L, p_M) + p_L^q\, S(\{p_i/p_L\}) + p_M^q\, S(\{p_i/p_M\})$, with $p_L \equiv \sum_{L\ \mathrm{terms}} p_i$, $p_M \equiv \sum_{M\ \mathrm{terms}} p_i$ ($L + M = W$), and $p_L + p_M = 1$. (44)

Then and only then [159] $S(\{p_i\}) = S_q(\{p_i\})$.

(x) Another (equivalent) uniqueness theorem was proved by Abe [1], which generalizes, for arbitrary q, that of Khinchin [100]. Let us assume that an entropic form $S(\{p_i\})$ satisfies the following properties:

(a) $S(\{p_i\})$ is a continuous function of $\{p_i\}$; (45)

(b) $S(p_i = 1/W,\ \forall i)$ monotonically increases with the total number of possibilities W; (46)

(c) $S(p_1, p_2, \ldots, p_W, 0) = S(p_1, p_2, \ldots, p_W)$; (47)

(d) $\dfrac{S(A+B)}{k} = \dfrac{S(A)}{k} + \dfrac{S(B|A)}{k} + (1-q)\, \dfrac{S(A)}{k}\, \dfrac{S(B|A)}{k}$, where $S(A+B) \equiv S(\{p_{ij}^{A+B}\})$, $S(A) \equiv S\big(\big\{\sum_{j=1}^{W_B} p_{ij}^{A+B}\big\}\big)$, and the conditional entropy $S(B|A) \equiv \dfrac{\sum_{i=1}^{W_A} (p_i^A)^q\, S(\{p_{ij}^{A+B}/p_i^A\})}{\sum_{i=1}^{W_A} (p_i^A)^q}$ ($k > 0$). (48)

Then and only then [1] $S(\{p_i\}) = S_q(\{p_i\})$.

    Additivity Versus Extensivity of the Entropy

It is of great importance to distinguish additivity from extensivity. An entropy S is additive [130] if its value for a system composed of two independent subsystems A and B satisfies $S(A+B) = S(A) + S(B)$ (hence, for N independent equal subsystems or elements, we have $S(N) = N S(1)$). Therefore, $S_{BG}$ is additive, and $S_q$ ($q \neq 1$) is nonadditive. A substantially different matter is whether a given entropy S is extensive for a given system. An entropy is extensive if and only if $0 < \lim_{N\to\infty} S(N)/N < \infty$. What matters for satisfactorily matching thermodynamics is extensivity, not additivity. For systems whose elements are nearly independent (i.e., essentially weakly correlated), $S_{BG}$ is extensive and $S_q$ is nonextensive. For systems whose elements are strongly correlated in a special manner, $S_{BG}$ is nonextensive, whereas $S_q$ is extensive for a special value of $q \neq 1$ (and nonextensive for all the others).

Let us illustrate these facts with some simple examples of equal probabilities. If $W(N) \sim A \mu^N$ ($A > 0$, $\mu > 1$, and $N \to \infty$), the entropy which is extensive is $S_{BG}$. Indeed, $S_{BG}(N) = k \ln W(N) \sim (\ln \mu)\, N \propto N$ (it is equally trivial to verify that $S_q(N)$ is nonextensive for any $q \neq 1$). If $W(N) \sim B N^{\rho}$ ($B > 0$, $\rho > 0$, and $N \to \infty$), the entropy which is extensive is $S_{1-(1/\rho)}$. Indeed, $S_{1-(1/\rho)}(N) \sim k\, \rho\, B^{1/\rho}\, N \propto N$ (it is equally trivial to verify that $S_{BG}(N) \propto \ln N$, hence nonextensive). If $W(N) \sim C \nu^{N^{\gamma}}$ ($C > 0$, $\nu > 1$, $\gamma \neq 1$, and $N \to \infty$), then $S_q(N)$ is nonextensive for any value of q. Therefore, in such a complex case, one must in principle refer to some other kind of entropic functional in order to match the extensivity required by classical thermodynamics. The second example is checked numerically below.
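The following sketch (B and ρ are arbitrary illustrative values; k = 1) shows $S_q(N)/N$ approaching a finite constant at the special index $q = 1 - 1/\rho$ while $S_{BG}(N)/N$ vanishes:

```python
import numpy as np

# Sketch: extensivity of S_q for W(N) ~ B*N**rho (equal probabilities, k = 1).
def ln_q(x, q):
    return np.log(x) if np.isclose(q, 1.0) else (x**(1 - q) - 1) / (1 - q)

B, rho = 2.0, 3.0                     # illustrative assumptions
q = 1 - 1 / rho                       # the special value making S_q extensive
for N in (10**2, 10**4, 10**6):
    W = B * N**rho
    print(f"N={N:>9,}  S_q/N={ln_q(W, q)/N:.4f}  S_BG/N={np.log(W)/N:.2e}")
# S_q/N -> rho * B**(1/rho) ~ 3.78, a finite constant;
# S_BG/N ~ rho*(ln N)/N vanishes, i.e., S_BG is nonextensive here.
```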

Various nontrivial abstract mathematical models can be found in [113,160,186,198,199] for which $S_q$ ($q \neq 1$) is extensive. Moreover, a physical realization is also available now [60,61] for a many-body quantum Hamiltonian, namely the ground state of the following one:

$$\mathcal{H} = -\sum_{i=1}^{N-1} \left[ (1+\gamma)\, S_i^x S_{i+1}^x + (1-\gamma)\, S_i^y S_{i+1}^y \right] - 2\lambda \sum_{i=1}^{N} S_i^z \,, \tag{49}$$

where λ is a transverse magnetic field, and $(S_i^x, S_i^y, S_i^z)$ are Pauli matrices; for $|\gamma| = 1$ we have the Ising model, for $0 < |\gamma| < 1$ we have the anisotropic XY model, and, for $\gamma = 0$, we have the isotropic XY model. The two former share the same symmetry and consequently belong to the same critical universality class (the Ising universality class, which corresponds to a so-called central charge c = 1/2), whereas the latter one belongs to a different universality class (the XX one, which corresponds to a central charge c = 1). At temperature T = 0 and $N \to \infty$, this model exhibits a second-order phase transition as a function of λ. For the Ising model, the critical value is λ = 1, whereas, for the XX model, the entire line $0 \leq \lambda \leq 1$ is critical. Since the system is at its ground state (assuming a vanishingly small magnetic field component in the x-y plane), it is a pure state (i.e., its density matrix $\rho_N$ is such that $\mathrm{Tr}\, \rho_N^2 = 1$, $\forall N$), hence the entropy $S_q(N)$ ($\forall q > 0$) is strictly zero. However, the situation is drastically different for any L-sized block of the infinite chain. Indeed, $\rho_L \equiv \mathrm{Tr}_{N-L}\, \rho_N$ is such that $\mathrm{Tr}\, \rho_L^2 < 1$, i.e., it is a mixed state, hence it has a nonzero entropy. The block entropy $S_q(L) \equiv \lim_{N\to\infty} S_q(N, L)$ monotonically increases with L for all values of q. And it does so linearly for

$$q = \frac{\sqrt{9 + c^2} - 3}{c} \,, \tag{50}$$

where c is the central charge which emerges in quantum field theory [54]. In other words, $0 < \lim_{L\to\infty} S_{(\sqrt{9+c^2}-3)/c}(L)/L < \infty$. Notice that q increases from zero to unity when c increases from zero to infinity; $q = \sqrt{37} - 6 \simeq 0.083$ for c = 1/2 (Ising model), $q = \sqrt{10} - 3 \simeq 0.16$ for c = 1 (isotropic XY model), $q = 1/2$ for c = 4 (dimension of space-time), and $q = (\sqrt{685} - 3)/26 \simeq 0.89$ for c = 26, related to string theory [89]. The possible physical interpretation of the limit $c \to \infty$ is still unknown, although it could correspond to some sort of mean-field approach.

Entropy, Figure 3 The q-exponential and q-logarithm functions in typical representations: a Linear-linear representation of $e_q^x$; b Linear-linear representation of $e_q^{-x}$; c Log-log representation of $y(x) = e_q^{-a_q x}$, solution of $dy/dx = -a_q y^q$ with $y(0) = 1$; d Linear-linear representation of $S_q = \ln_q W$ (value of the entropy for equal probabilities).

Nonextensive Statistical Mechanics

To generalize BG statistical mechanics for the canonical ensemble, we optimize $S_q$ with constraint (15) and also

\sum_{i=1}^{W} P_i E_i = U_q ,   (51)

where

P_i \equiv \frac{p_i^q}{\sum_{j=1}^{W} p_j^q} \qquad \left( \sum_{i=1}^{W} P_i = 1 \right)   (52)

is the so-called escort distribution [33]. It follows that $p_i = P_i^{1/q} / \sum_{j=1}^{W} P_j^{1/q}$. There are various converging reasons for it being appropriate to impose the energy constraint with the $\{P_i\}$ instead of with the original $\{p_i\}$. The full discussion of this delicate point is beyond the present scope. However, some of these intertwined reasons are explored in [184]. By imposing Eq. (51), we follow [193], which in turn reformulates the results presented in [71,183]. The passage from one to the other of the various existing formulations of the above optimization problem is discussed in detail in [83,193].

The entropy optimization yields, for the stationary state,

p_i = \frac{e_q^{-\beta_q (E_i - U_q)}}{\bar{Z}_q} ,   (53)

with

\beta_q \equiv \frac{\beta}{\sum_{j=1}^{W} p_j^q} ,   (54)

and

\bar{Z}_q \equiv \sum_{i=1}^{W} e_q^{-\beta_q (E_i - U_q)} ,   (55)

$\beta$ being the Lagrange parameter associated with the constraint (51). Equation (53) makes explicit that the probability distribution is, for fixed $\beta_q$, invariant with regard to the arbitrary choice of the zero of energies. The stationary-state (or (meta)equilibrium) distribution (53) can be rewritten as follows:

p_i = \frac{e_q^{-\beta'_q E_i}}{Z'_q} ,   (56)

with

Z'_q \equiv \sum_{j=1}^{W} e_q^{-\beta'_q E_j} ,   (57)

and

\beta'_q \equiv \frac{\beta_q}{1 + (1-q)\beta_q U_q} .   (58)

The form (56) is particularly convenient for many applications where comparison with experimental or computational data is involved. Also, it makes clear that $p_i$ asymptotically decays like $1/E_i^{1/(q-1)}$ for $q > 1$, and has a cutoff for $q < 1$, instead of the exponential decay with $E_i$ for $q = 1$.

The connection to thermodynamics is established in what follows. It can be proved that

\frac{1}{T} = \frac{\partial S_q}{\partial U_q} ,   (59)

with $T \equiv 1/(k\beta)$. Also we can prove, for the free energy,

F_q \equiv U_q - T S_q = -\frac{1}{\beta} \ln_q Z_q ,   (60)

where

\ln_q Z_q = \ln_q \bar{Z}_q - \beta U_q .   (61)

This relation takes into account the trivial fact that, in contrast with what is usually done in BG statistics, the energies $\{E_i\}$ are here referred to $U_q$ in (53). It can also be proved that

U_q = -\frac{\partial}{\partial \beta} \ln_q Z_q ,   (62)

as well as relations such as

C_q \equiv T \frac{\partial S_q}{\partial T} = \frac{\partial U_q}{\partial T} = -T \frac{\partial^2 F_q}{\partial T^2} .   (63)

In fact, the entire Legendre-transformation structure of thermodynamics is q-invariant, which is both remarkable and welcome.
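For concreteness, here is a minimal Python sketch (added for illustration; the ten-level spectrum and the value of $\beta'_q$ are arbitrary) of the stationary distribution (56):

  import numpy as np

  def exp_q(x, q):
      # q-exponential e_q^x = [1 + (1-q) x]_+^{1/(1-q)}; reduces to exp(x) at q = 1.
      # For the x <= 0 arguments used here, this expression is well defined for q < 3.
      if q == 1.0:
          return np.exp(x)
      return np.maximum(1.0 + (1.0 - q) * x, 0.0) ** (1.0 / (1.0 - q))

  def stationary_distribution(energies, beta_prime, q):
      # Eq. (56): p_i = e_q^{-beta'_q E_i} / Z'_q, with Z'_q given by Eq. (57)
      w = exp_q(-beta_prime * np.asarray(energies), q)
      return w / w.sum()

  E = np.arange(10.0)  # hypothetical energy spectrum
  for q in (1.0, 1.5):
      p = stationary_distribution(E, beta_prime=0.5, q=q)
      print(q, np.round(p, 4))  # q > 1: slower, asymptotically power-law decay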

    A Connection Between Entropy and Diffusion

    We review here one of the main common aspects of en-

    tropy and diffusion. We shall present on equal footing

    both the BG and the nonextensive cases [13,138,192,216].

Let us extremize the entropy

S_q = k \, \frac{1 - \int_{-\infty}^{\infty} d(x/\sigma) \, [\sigma p(x)]^q}{q-1}   (64)

with the constraints

\int_{-\infty}^{\infty} dx \, p(x) = 1   (65)

and

\langle x^2 \rangle_q \equiv \frac{\int_{-\infty}^{\infty} dx \, x^2 [p(x)]^q}{\int_{-\infty}^{\infty} dx \, [p(x)]^q} = \sigma^2 ,   (66)

$\sigma > 0$ being some fixed value having the same physical dimensions as the variable x. We straightforwardly obtain the following distribution:

p_q(x) =
\begin{cases}
\dfrac{1}{\sigma} \left[ \dfrac{q-1}{\pi(3-q)} \right]^{1/2} \dfrac{\Gamma\left( \frac{1}{q-1} \right)}{\Gamma\left( \frac{3-q}{2(q-1)} \right)} \left[ 1 + \dfrac{q-1}{3-q} \dfrac{x^2}{\sigma^2} \right]^{-1/(q-1)} & \text{if } 1 < q < 3 , \\[2mm]
\dfrac{1}{\sigma\sqrt{2\pi}} \, e^{-x^2/2\sigma^2} & \text{if } q = 1 , \\[2mm]
\dfrac{1}{\sigma} \left[ \dfrac{1-q}{\pi(3-q)} \right]^{1/2} \dfrac{\Gamma\left( \frac{5-3q}{2(1-q)} \right)}{\Gamma\left( \frac{2-q}{1-q} \right)} \left[ 1 - \dfrac{1-q}{3-q} \dfrac{x^2}{\sigma^2} \right]^{1/(1-q)} & \text{if } q < 1 \text{ and } |x| < \sigma[(3-q)/(1-q)]^{1/2} \text{ (zero otherwise)} .
\end{cases}   (67)

These distributions are frequently referred to as q-Gaussians. For $q > 1$, they asymptotically have a power-law tail ($q \ge 3$ is not admissible because the norm (65) cannot be satisfied); for $q < 1$, they have a compact support. For $q = 1$, the celebrated Gaussian is recovered; for $q = 2$, the Cauchy–Lorentz distribution is recovered; finally, for $q \to -\infty$, the uniform distribution within the interval $[-1, 1]$ is recovered. For $q = \frac{3+m}{1+m}$, m being an integer ($m = 1, 2, 3, \ldots$), we recover the Student's t-distributions with m degrees of freedom [79]. For $q = \frac{n-4}{n-2}$, n being an integer ($n = 3, 4, 5, \ldots$), we recover the so-called r-distributions with n degrees of freedom [79]. In other words, q-Gaussians are analytical extensions of Student's t- and r-distributions. In some communities they are also referred to as the Barenblatt form. For $q < 5/3$, they have a finite variance which monotonically increases for q varying from $-\infty$ to 5/3; for $5/3 \le q < 3$, the variance diverges.
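The normalization in Eq. (67) and the variance threshold $q = 5/3$ can be cross-checked by direct numerical quadrature; the following Python sketch is an illustration added here (the grid and the q values are arbitrary):

  import numpy as np
  from math import gamma, pi, sqrt

  def q_gaussian(x, q, sigma=1.0):
      # q-Gaussian density of Eq. (67); q = 1 gives the ordinary Gaussian
      u = np.asarray(x) / sigma
      if q > 1.0:  # 1 < q < 3: power-law tail
          A = sqrt((q - 1.0) / (pi * (3.0 - q))) * gamma(1.0 / (q - 1.0)) \
              / gamma((3.0 - q) / (2.0 * (q - 1.0)))
          return A / sigma * (1.0 + (q - 1.0) / (3.0 - q) * u**2) ** (-1.0 / (q - 1.0))
      if q == 1.0:
          return np.exp(-u**2 / 2.0) / (sigma * sqrt(2.0 * pi))
      A = sqrt((1.0 - q) / (pi * (3.0 - q))) * gamma((5.0 - 3.0*q) / (2.0 * (1.0 - q))) \
          / gamma((2.0 - q) / (1.0 - q))
      s = 1.0 - (1.0 - q) / (3.0 - q) * u**2  # compact support for q < 1
      return A / sigma * np.where(s > 0.0, s, 0.0) ** (1.0 / (1.0 - q))

  x = np.linspace(-200.0, 200.0, 400001)
  dx = x[1] - x[0]
  for q in (0.5, 1.0, 1.5, 1.69):
      p = q_gaussian(x, q)
      print(f"q = {q}: norm ~ {p.sum()*dx:.4f}, <x^2> ~ {(x**2*p).sum()*dx:.2f}")
  # The norm is ~1 in every case; the numerical second moment is finite for
  # q < 5/3 and keeps growing with the integration range for q >= 5/3.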

Let us now make a connection of the above optimization problem with diffusion. We focus on the following quite general diffusion equation:

\frac{\partial^\delta p(x,t)}{\partial t^\delta} = \frac{\partial}{\partial x} \left[ \frac{\partial U(x)}{\partial x} \, p(x,t) \right] + D \, \frac{\partial^\alpha [p(x,t)]^{2-q}}{\partial |x|^\alpha} \qquad (0 < \delta \le 1;\ 0 < \alpha \le 2;\ q < 3;\ t \ge 0) ,   (68)

    with a generic nonsingular potential U(x), and a gen-

    eralized diffusion coefficient D which is positive (nega-

    tive) for q < 2 (2 < q < 3). Several particular instances

    of this equation have been discussed in the literature

    (see [40,86,106,131,188] and references therein).

For example, the stationary state for $\alpha = 2$, $\forall \delta$, and any confining potential (i.e., $\lim_{|x|\to\infty} U(x) = \infty$) is given by [106]

p_q(x,\infty) = \frac{e_q^{-\beta[U(x)-U(0)]}}{Z} ,   (69)

Z \equiv \int_{-\infty}^{\infty} dx \, e_q^{-\beta[U(x)-U(0)]} ,   (70)

1/\beta \equiv kT \propto |D| ,   (71)

    which precisely is the distribution obtained within nonex-

    tensive statistical mechanics through extremization of Sq.

Also, the solution for $\alpha = 2$, $\delta = 1$, $U(x) = -k_1 x + \frac{k_2}{2} x^2$ ($\forall k_1$, and $k_2 \ge 0$), and $p(x,0) = \delta(x)$ is given by [188]

p_q(x,t) = \frac{e_q^{-\beta(t)[x - x_M(t)]^2}}{Z_q(t)} ,   (72)

\frac{\beta(t)}{\beta(0)} = \left[ \frac{Z_q(0)}{Z_q(t)} \right]^2 = \left[ \left( 1 - \frac{1}{K_2} \right) e^{-t/\tau} + \frac{1}{K_2} \right]^{-2/(3-q)} ,   (73)

K_2 \equiv \frac{k_2}{2(2-q) D \beta(0) [Z_q(0)]^{q-1}} ,   (74)

\tau \equiv \frac{1}{k_2 (3-q)} ,   (75)

x_M(t) \equiv \frac{k_1}{k_2} + \left[ x_M(0) - \frac{k_1}{k_2} \right] e^{-k_2 t} .   (76)

In the limit $k_2 \to 0$, Eq. (73) becomes

Z_q(t) = \left\{ [Z_q(0)]^{3-q} + 2(2-q)(3-q) D \beta(0) [Z_q(0)]^2 \, t \right\}^{1/(3-q)} ,   (77)

which, in the $t \to \infty$ limit, yields

\frac{1}{\beta(t)} \propto [Z_q(t)]^2 \propto t^{2/(3-q)} .   (78)

In other words, $x^2$ scales like $t^\gamma$, with

\gamma = \frac{2}{3-q} ;   (79)

hence, for $q > 1$ we have $\gamma > 1$ (i.e., superdiffusion; in particular, $q = 2$ yields $\gamma = 2$, i.e., ballistic diffusion), for $q < 1$ we have $\gamma < 1$ (i.e., subdiffusion; in particular, $q \to -\infty$ yields $\gamma = 0$, i.e., localization), and, naturally, for $q = 1$ we obtain normal diffusion. Four systems are known for which results consistent with prediction (79) have been found: the motion of Hydra viridissima [206], defect turbulence [73], the simulation of silo drainage [22], and the molecular dynamics of a many-body long-range-interacting classical system of rotators (the $\alpha$-XY model) [143]. For the first three, it has been found that $(q, \gamma) \simeq (3/2, 4/3)$. For the latter, relation (79) has been verified for various situations corresponding to $\gamma > 1$.
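The scaling law (79) can also be probed numerically. The Python sketch below (added here for illustration; a crude explicit finite-difference scheme, not the method of the works cited above) integrates the nonlinear diffusion equation of Eq. (80) (for $\alpha = 2$ and $q = 1/2$) and estimates $\gamma$ from the growth of $\langle x^2 \rangle$; the result should come out close to the predicted $2/(3-q) = 0.8$:

  import numpy as np

  q, D = 0.5, 1.0
  x = np.linspace(-30.0, 30.0, 1201)
  dx, dt = x[1] - x[0], 1e-4  # dt small enough for stability of the explicit scheme
  p = np.exp(-x**2 / 0.1)     # narrow initial condition
  p /= p.sum() * dx

  samples = {}
  for step in range(1, 200001):
      u = p ** (2.0 - q)  # nonlinear "porous-medium" term, alpha = 2 case of Eq. (80)
      lap = (np.roll(u, 1) - 2.0*u + np.roll(u, -1)) / dx**2
      lap[0] = lap[-1] = 0.0  # crude boundaries; the solution stays well inside the box
      p = np.maximum(p + dt * D * lap, 0.0)
      if step in (20000, 200000):
          samples[step * dt] = (x**2 * p).sum() * dx

  (t1, v1), (t2, v2) = samples.items()
  print("estimated gamma:", np.log(v2 / v1) / np.log(t2 / t1))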

Finally, for the particular case $\delta = 1$ and $U(x) = 0$, Eq. (68) becomes

\frac{\partial p(x,t)}{\partial t} = D \, \frac{\partial^\alpha [p(x,t)]^{2-q}}{\partial |x|^\alpha} \qquad (0 < \alpha \le 2;\ q < 3) .   (80)

The diffusion constant D just rescales time t. Only two parameters are therefore left, namely $\alpha$ and q.

The linear case (i.e., $q = 1$) has two types of solutions: Gaussians for $\alpha = 2$, and Lévy (or $\alpha$-stable) distributions for $0 < \alpha < 2$. The case $\alpha = 2$ corresponds to the Central Limit Theorem, where the $N \to \infty$ attractor of the sums of N independent random variables with finite variance precisely is a Gaussian. The case $0 < \alpha < 2$ corresponds to the sometimes so-called Lévy–Gnedenko Central Limit Theorem, where the $N \to \infty$ attractor of the sums of N independent random variables with infinite variance (and appropriate asymptotics) precisely is a Lévy distribution with index $\alpha$.
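These two linear attractors are easy to visualize numerically. The following Python sketch (added here for illustration; the choices of distributions and sample sizes are arbitrary) contrasts rescaled sums in the finite-variance ($\alpha = 2$) and infinite-variance ($\alpha = 3/2$) cases:

  import numpy as np

  rng = np.random.default_rng(0)
  M, N = 5000, 500  # M independent sums of N terms each

  # Finite variance: rescaling by N^{1/2} leads to the Gaussian attractor.
  s_gauss = rng.uniform(-1.0, 1.0, (M, N)).sum(axis=1) / np.sqrt(N)

  # Infinite variance (symmetric tail ~ |x|^{-1-alpha}): rescaling by N^{1/alpha}
  # leads to the Levy attractor with index alpha.
  alpha = 1.5
  heavy = rng.pareto(alpha, (M, N)) * rng.choice([-1.0, 1.0], (M, N))
  s_levy = heavy.sum(axis=1) / N ** (1.0 / alpha)

  for s, name in ((s_gauss, "finite variance"), (s_levy, "infinite variance")):
      scale = np.median(np.abs(s))
      # the Gaussian attractor has fast-decaying tails; the Levy one a power-law tail
      print(name, "-> fraction beyond 10 typical scales:", np.mean(np.abs(s) > 10.0 * scale))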

The nonlinear case (i.e., $q \ne 1$) has solutions that are q-Gaussians for $\alpha = 2$, and one might conjecture that, similarly, interesting solutions exist for $0 < \alpha < 2$. Furthermore, in analogy with the $q = 1$ case, one expects corresponding q-generalized Central Limit Theorems to exist [187]. This is precisely what we present in the next Section.

Standard and q-Generalized Central Limit Theorems

    The q-Product

A generalization of the product, called the q-product, has recently been introduced (independently and virtually simultaneously) [43,125]. It is defined, for $x \ge 0$ and $y \ge 0$, as follows:

x \otimes_q y \equiv
\begin{cases}
[x^{1-q} + y^{1-q} - 1]^{1/(1-q)} & \text{if } x^{1-q} + y^{1-q} > 1 , \\
0 & \text{otherwise} .
\end{cases}   (81)

It has, among others, the following properties:
it recovers the standard product as a particular instance, i.e.,

x \otimes_1 y = x y ;   (82)

it is commutative, i.e.,

x \otimes_q y = y \otimes_q x ;   (83)

it is additive under the q-logarithm, i.e.,

\ln_q (x \otimes_q y) = \ln_q x + \ln_q y   (84)

(whereas we remind that $\ln_q(xy) = \ln_q x + \ln_q y + (1-q)(\ln_q x)(\ln_q y)$);

it has a $(2-q)$-duality/inverse property, i.e.,

1/(x \otimes_q y) = (1/x) \otimes_{2-q} (1/y) ;   (85)

it is associative, i.e.,

x \otimes_q (y \otimes_q z) = (x \otimes_q y) \otimes_q z = x \otimes_q y \otimes_q z = (x^{1-q} + y^{1-q} + z^{1-q} - 2)^{1/(1-q)} ;   (86)

it admits a unity, i.e.,

x \otimes_q 1 = x ;   (87)

and, for $q \ge 1$, also a zero, i.e.,

x \otimes_q 0 = 0 \qquad (q \ge 1) .   (88)
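A minimal Python transcription of definition (81), added here for illustration, together with a check of properties (82)-(84) at arbitrarily chosen values:

  import math

  def ln_q(x, q):
      # q-logarithm: ln_q x = (x^{1-q} - 1)/(1 - q); ln_1 x = ln x
      return math.log(x) if q == 1.0 else (x**(1.0 - q) - 1.0) / (1.0 - q)

  def q_product(x, y, q):
      # Eq. (81), for x >= 0 and y >= 0
      if q == 1.0:
          return x * y
      s = x**(1.0 - q) + y**(1.0 - q) - 1.0
      return s ** (1.0 / (1.0 - q)) if s > 0.0 else 0.0

  q, x, y = 1.5, 2.0, 3.0
  assert abs(q_product(x, y, 1.0) - x * y) < 1e-12                  # Eq. (82)
  assert abs(q_product(x, y, q) - q_product(y, x, q)) < 1e-12       # Eq. (83)
  assert abs(ln_q(q_product(x, y, q), q)
             - (ln_q(x, q) + ln_q(y, q))) < 1e-12                   # Eq. (84)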

    The q-Fourier Transform

We shall introduce the q-Fourier transform of a quite generic function f(x) ($x \in \mathbb{R}$) as follows [140,189,202,203,204,205]:

F_q[f](\xi) \equiv \int_{-\infty}^{\infty} dx \; e_q^{i x \xi} \otimes_q f(x) = \int_{-\infty}^{\infty} dx \; e_q^{\, i x \xi [f(x)]^{q-1}} \, f(x) ,   (89)

where we have primarily focused on the case $q \ge 1$. In contrast with the $q = 1$ case (the standard Fourier transform), this integral transformation is nonlinear for $q \ne 1$. It has a remarkable property, namely that the q-Fourier transform of a q-Gaussian is another q-Gaussian:

F_q\left[ N_q \sqrt{\beta} \, e_q^{-\beta x^2} \right](\xi) = e_{q_1}^{-\beta_1 \xi^2} ,   (90)

with

N_q \equiv
\begin{cases}
\left[ \dfrac{q-1}{\pi} \right]^{1/2} \dfrac{\Gamma\left( \frac{1}{q-1} \right)}{\Gamma\left( \frac{3-q}{2(q-1)} \right)} & \text{if } 1 < q < 3 , \\[2mm]
\dfrac{1}{\sqrt{\pi}} & \text{if } q = 1 , \\[2mm]
\dfrac{3-q}{2} \left[ \dfrac{1-q}{\pi} \right]^{1/2} \dfrac{\Gamma\left( \frac{3-q}{2(1-q)} \right)}{\Gamma\left( \frac{1}{1-q} \right)} & \text{if } q < 1 ,
\end{cases}   (91)

and

q_1 = z(q) \equiv \frac{1+q}{3-q} ,   (92)

\beta_1 = \frac{1}{\beta^{2-q} N_q^{2(1-q)}} \, \frac{3-q}{8} .   (93)

Equation (93) can be rewritten as $\beta^{\sqrt{2-q}} \beta_1^{1/\sqrt{2-q}} = [N_q^{2(1-q)}(3-q)/8]^{1/\sqrt{2-q}} \equiv K(q)$, which, for $q = 1$, recovers the well-known Heisenberg-uncertainty-principle-like relation $\beta \beta_1 = 1/4$.

If we iterate n times the relation z(q) in Eq. (92), we obtain the following algebra:

q_n(q) = \frac{2q + n(1-q)}{2 + n(1-q)} \qquad (n = 0, \pm 1, \pm 2, \ldots) ,   (94)

which can be conveniently rewritten as

\frac{2}{1 - q_n(q)} = \frac{2}{1-q} + n \qquad (n = 0, \pm 1, \pm 2, \ldots) .   (95)

(See Fig. 4.) We easily verify that $q_n(1) = 1$ $(\forall n)$ and $q_{\pm\infty}(q) = 1$ $(\forall q)$, as well as

\frac{1}{q_{n+1}} = 2 - q_{n-1} .   (96)

This relation connects the so-called additive duality $q \to (2-q)$ and the multiplicative duality $q \to 1/q$, which frequently emerge in all types of calculations in the literature.

Moreover, we see from Eq. (95) that multiple values of q are expected to emerge in connection with the diverse properties of nonextensive systems, i.e., of systems whose basic entropy is the nonadditive one $S_q$. Such is the case of the so-called q-triplet [185], observed for the first time in the magnetic-field fluctuations of the solar wind, as revealed by the analysis of the data sent to NASA by the spacecraft Voyager 1 [48].

Entropy, Figure 4: The q-dependence of $q_n(q) \equiv q_{2,n}(q)$.
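The algebra (94)-(96) can be verified exactly with rational arithmetic; a small illustrative Python sketch added here:

  from fractions import Fraction

  def q_n(q, n):
      # Eq. (94)
      return (2*q + n*(1 - q)) / (2 + n*(1 - q))

  q = Fraction(3, 2)  # any q != 1 works, kept exact via Fraction
  for n in range(-3, 3):
      assert 2 / (1 - q_n(q, n)) == 2 / (1 - q) + n     # Eq. (95)
      assert 1 / q_n(q, n + 1) == 2 - q_n(q, n - 1)     # Eq. (96)
  print("q_1 =", q_n(q, 1), "= (1+q)/(3-q) =", (1 + q) / (3 - q))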

q-Independent Random Variables

Two random variables X [with density $f_X(x)$] and Y [with density $f_Y(y)$], having zero q-mean values (e.g., if $f_X(x)$ and $f_Y(y)$ are even functions), are said to be q-independent, with $q_1$ given by Eq. (92), if

F_q[X+Y](\xi) = F_q[X](\xi) \otimes_{q_1} F_q[Y](\xi) ,   (97)

i.e., if

\int_{-\infty}^{\infty} dz \; e_q^{i z \xi} \otimes_q f_{X+Y}(z) = \left[ \int_{-\infty}^{\infty} dx \; e_q^{i x \xi} \otimes_q f_X(x) \right] \otimes_{(1+q)/(3-q)} \left[ \int_{-\infty}^{\infty} dy \; e_q^{i y \xi} \otimes_q f_Y(y) \right] ,   (98)

with

f_{X+Y}(z) = \int_{-\infty}^{\infty} dx \int_{-\infty}^{\infty} dy \; h(x,y) \, \delta(x+y-z) = \int_{-\infty}^{\infty} dx \; h(x, z-x) = \int_{-\infty}^{\infty} dy \; h(z-y, y) ,   (99)

where h(x,y) is the joint density.

Clearly, q-independence means independence for $q = 1$ (i.e., $h(x,y) = f_X(x) f_Y(y)$), and implies a special correlation for $q \ne 1$. Although the full understanding of this correlation is still in progress, q-independence appears to be consistent with scale-invariance.
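For q = 1, definition (97) together with Eq. (99) reduces to the familiar statement that densities of independent variables convolve. A small numerical check of this baseline case, added here for illustration (the grid and densities are chosen arbitrarily):

  import numpy as np

  n = 4001  # odd, so that x = 0 sits exactly on the grid
  x = np.linspace(-20.0, 20.0, n)
  dx = x[1] - x[0]
  fX = np.exp(-x**2 / 2.0) / np.sqrt(2.0 * np.pi)  # X: standard normal, variance 1
  fY = 0.5 * np.exp(-np.abs(x))                    # Y: Laplace(1), variance 2

  # Independent case h(x,y) = fX(x) fY(y): Eq. (99) becomes an ordinary convolution.
  fXY = np.convolve(fX, fY, mode="same") * dx

  moment = lambda f, k: (x**k * f).sum() * dx
  print("norm     :", moment(fXY, 0))  # ~ 1
  print("variance :", moment(fXY, 2))  # ~ var(X) + var(Y) = 3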

Entropy, Table 1
The attractors corresponding to the four basic cases, where the N variables that are being summed are q-independent (i.e., globally correlated) with $q_1 = (1+q)/(3-q)$; $\sigma_Q \equiv (\int_{-\infty}^{\infty} dx\, x^2 [f(x)]^Q)/(\int_{-\infty}^{\infty} dx\, [f(x)]^Q)$ with $Q \equiv 2q-1$. The attractor for $(q,\alpha) = (1,2)$ is a Gaussian $G(x) \equiv L_{1,2}$ (standard Central Limit Theorem); for $q = 1$ and $0 < \alpha < 2$, it is a Lévy distribution $L_\alpha \equiv L_{1,\alpha}$ (the so-called Lévy–Gnedenko limit theorem); for $\alpha = 2$ and $q \ne 1$, it is a q-Gaussian $G_q \equiv L_{q,2}$ (the q-Central Limit Theorem [203]); finally, for $q \ne 1$ and $0 < \alpha < 2$, it is a generic $(q,\alpha)$-stable distribution $L_{q,\alpha}$ [204,205]. See [140,189] for typical illustrations of the four types of attractors. The distribution $L_\alpha(x)$ remains, for $1 < \alpha < 2$, close to a Gaussian for $|x|$ up to about $x_c(1,\alpha)$, where it makes a crossover to a power-law. The distribution $G_q(x)$ remains, for $q > 1$, close to a Gaussian for $|x|$ up to about $x_c(q,2)$, where it makes a crossover to a power-law. The distribution $L_{q,\alpha}(x)$ remains, for $q > 1$ and $\alpha < 2$, close to a Gaussian for $|x|$ up to about $x_c^{(1)}(q,\alpha)$, where it makes a crossover to a power-law (intermediate regime), which lasts further up to about $x_c^{(2)}(q,\alpha)$, where it makes a second crossover to another power-law (distant regime).

Case $\sigma_Q < \infty$ ($\alpha = 2$):
-- $q = 1$ [independent]: $G(x)$ [with same $\sigma_1$ as $f(x)$].
-- $q \ne 1$ (i.e., $Q \ne 1$) [globally correlated]: $G_q(x) = G_{(3q_1-1)/(1+q_1)}(x)$ [with same $\sigma_Q$ as $f(x)$]; $G_q(x) \sim G(x)$ if $|x| \ll x_c(q,2)$, and $G_q(x) \sim C_{q,2}/|x|^{2/(q-1)}$ if $|x| \gg x_c(q,2)$, for $q > 1$, with $\lim_{q\to 1} x_c(q,2) = \infty$.

Case $\sigma_Q \to \infty$ ($\alpha < 2$):
-- $q = 1$ [independent]: $L_\alpha(x)$ [with same $|x| \to \infty$ behavior as $f(x)$]; $L_\alpha(x) \sim G(x)$ if $|x| \ll x_c(1,\alpha)$, and $L_\alpha(x) \sim C_{1,\alpha}/|x|^{1+\alpha}$ if $|x| \gg x_c(1,\alpha)$, with $\lim_{\alpha\to 2} x_c(1,\alpha) = \infty$.
-- $q \ne 1$ [globally correlated]: $L_{q,\alpha}(x)$ [with same $|x| \to \infty$ behavior as $f(x)$]; $L_{q,\alpha} \sim C^{(\mathrm{intermediate})}_{q,\alpha}/|x|^{\frac{2(1-q)-\alpha(3-q)}{2(q-1)}}$ if $x_c^{(1)}(q,\alpha) \ll |x| \ll x_c^{(2)}(q,\alpha)$, and $L_{q,\alpha} \sim C^{(\mathrm{distant})}_{q,\alpha}/|x|^{\frac{1+\alpha}{1+\alpha(q-1)}}$ if $|x| \gg x_c^{(2)}(q,\alpha)$.

    q-Generalized Central Limit Theorems

It is beyond the scope of the present survey to provide the details of the complex proofs of the q-generalized central limit theorems. We shall restrict ourselves to the presentation of their structure. Let us start by introducing a notation which is important for what follows. A distribution is said to be a $(q,\alpha)$-stable distribution $L_{q,\alpha}(x)$ if its q-Fourier transform $L_{q,\alpha}(\xi)$ is of the form

L_{q,\alpha}(\xi) = a \, e_{q_1}^{-b |\xi|^\alpha} \qquad [a > 0;\ b > 0;\ 0 < \alpha \le 2;\ q_1 = (q+1)/(3-q)] .   (100)

Consistently, $L_{1,2}$ are Gaussians, $L_{1,\alpha}$ are Lévy distributions, and $L_{q,2}$ are q-Gaussians.

We seek the $N \to \infty$ attractor associated with the sum of N identical and distinguishable random variables, each of them associated with one and the same arbitrary symmetric distribution f(x). The random variables are independent for $q = 1$, and correlated in a special manner for $q \ne 1$. To obtain the $N \to \infty$ invariant distribution, i.e., the attractor, the sum must be rescaled, i.e., divided by $N^\delta$, where

\delta = \frac{1}{\alpha(2-q)} .   (101)

For $(\alpha, q) = (2, 1)$, we recover the traditional $1/\sqrt{N}$ rescaling of Brownian motion.

At the present stage, the theorems have been established for $q \ge 1$; they are summarized in Table 1. The case $q < 1$ is still open at the time at which these lines are being written. Two $q < 1$ cases have been preliminarily explored numerically in [124] and in [171]. The numerics seemed to indicate that the $N \to \infty$ limits would be q-Gaussians for both models. However, it has been analytically shown [94] that this is not exactly so. The limiting distributions are numerically amazingly close to q-Gaussians, but they are in fact different. Very recently, another simple scale-invariant model has been introduced [153], whose attractor has been analytically shown to be a q-Gaussian.

These $q \ne 1$ theorems play, for the nonadditive entropy $S_q$ and nonextensive statistical mechanics, the same grounding role that the well-known $q = 1$ theorems play for the additive entropy $S_{BG}$ and BG statistical mechanics. In particular, and interestingly enough, the ubiquity of Gaussians and of q-Gaussians in natural, artificial and social systems may be understood on an equal footing.

    Future Directions

The concept of entropy permeates virtually all quantitative sciences. The future directions could therefore be very varied. If we restrict ourselves, however, to the evidence


Entropy, Figure 5: Snapshot of a nongrowing dynamic network with N = 256 nodes (see details in [172]; by courtesy of the author).

    presently available, the main lines along which evolution

    occurs are:

Networks Many of the so-called scale-free networks, among others, systematically exhibit a degree distribution p(k) (the probability of a node having k links) which is of the form

p(k) \propto \frac{1}{(k_0 + k)^\gamma} \qquad (\gamma > 0;\ k_0 > 0) ,   (102)

or, equivalently,

p(k) \propto e_q^{-k/\kappa} \qquad (q \ge 1;\ \kappa > 0) ,   (103)

with $\gamma = 1/(q-1)$ and $k_0 = \kappa/(q-1)$ (see Figs. 5 and 6, and the short sketch below). This is not surprising since, if we associate with each link an "energy" (or cost) and with each node half of the "energy" carried by its links (the other half being associated with the other nodes to which any specific node is linked), the distribution of energies optimizing $S_q$ precisely coincides with the degree distribution. If, for any reason, we consider k as the modulus of a d-dimensional vector $\mathbf{k}$, the optimization of the functional $S_q[p(\mathbf{k})]$ may lead to $p(k) \propto k^\delta e_q^{-k/\kappa}$, where $k^\delta$ plays the role of a density of states, $\delta(d)$ being either zero (which reproduces Eq. (103)), positive, or negative. Several examples [12,39,76,91,165,172,173,212,213] already exist in the literature; in particular, the Barabasi–Albert universality class $\gamma = 3$ corresponds to $q = 4/3$. A deeper understanding of this connection might enable the systematic calculation of several meaningful properties of networks.

Entropy, Figure 6: Nongrowing dynamic network: (a) Cumulative degree distribution for typical values of the number N of nodes; (b) Same data as in (a) in the convenient representation linear q-log versus linear, with $Z_q(k) \equiv \ln_q[P_q(>k)] \equiv ([P_q(>k)]^{1-q} - 1)/(1-q)$ (the optimal fitting with a q-exponential is obtained for the value of q which has the highest value of the linear correlation r, as indicated in the inset; here this is $q_c = 1.84$, which corresponds to the slope −1.19 in (a)). See details in [172,173].

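The equivalence between the forms (102) and (103) can be checked directly; a small Python sketch added here for illustration (the value of $\kappa$ is arbitrary):

  import numpy as np

  def p_power_law(k, gam, k0):
      return (k0 + k) ** (-gam)  # Eq. (102), up to normalization

  def p_q_exponential(k, q, kappa):
      # e_q^{-k/kappa} for q > 1, up to normalization
      return (1.0 + (q - 1.0) * k / kappa) ** (-1.0 / (q - 1.0))

  q, kappa = 4.0/3.0, 2.0                   # Barabasi-Albert class: gamma = 3
  gam, k0 = 1.0 / (q - 1.0), kappa / (q - 1.0)
  k = np.arange(1000.0)
  ratio = p_power_law(k, gam, k0) / p_q_exponential(k, q, kappa)
  print(gam, k0, ratio.min(), ratio.max())  # constant ratio: same distribution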

    Nonlinear dynamical systems, self-organized criticality,

    and cellular automata Various interesting phenomena

    emerge in both low- and high-dimensional weakly

    chaotic deterministic dynamical systems, either dis-

    sipative or conservative. Among these phenomena

    we have the sensitivity to the initial conditions and

    the entropy production, which have been briefly ad-

dressed in Eq. (37) and related papers.

Entropy, Figure 7: Distribution of velocities for the HMF model at the quasi-stationary state (whose duration appears to diverge when $N \to \infty$). The blue curves indicate a Gaussian, for comparison. See details in [137].

But there is much more, such as relaxation, escape, glassy states, and distributions associated with the stationary state [14,15,31,46,62,67,68,77,103,111,122,123,154,

    170,174,176,177,179]. Also, recent numerical indi-

    cations suggest the validity of a dynamical version of

the q-generalized central limit theorem [175]. The exploration of the possible connections between all these various properties is still in its infancy.

    Long-range-interacting many-body Hamiltonians

    A wide class of long-range-interacting N-body clas-

    sical Hamiltonians exhibits collective states whose

    Lyapunov spectrum has a maximal value that vanishes

in the $N \to \infty$ limit. As such, they constitute natural candidates for studying whether the concepts derived

    from the nonadditive entropy Sq are applicable. A vari-

    ety of properties have been calculated, through molec-

    ular dynamics, for various systems, such as Lennard–

    Jones-like fluids, XY and Heisenberg ferromagnets,

    gravitational-like models, and others. One or more

    long-standing quasi-stationary states (infinitely long-

standing in the limit $N \to \infty$) are typically observed before the terminal entrance into thermal equilib-

    rium. Properties such as distribution of velocities

    and angles, correlation functions, Lyapunov spectrum,

    metastability, glassy states, aging, time-dependence of

    the temperature in isolated systems, energy whenever

    thermal contact with a large thermostat at a given

    temperature is allowed, diffusion, order parameter,

    and others, are typically focused on. An ongoing de-

    bate exists, also involving Vlasov-like equations, Lyn-

    den–Bell statistics, among others. The breakdown of

    ergodicity that emerges in various situations makes

the whole discussion rich and complex.

Entropy, Figure 8: Quantum Monte Carlo simulations in [81]: (a) Velocity distribution (superimposed with a q-Gaussian); (b) Index q (superimposed with Lutz's prediction [110]). [By courtesy of the authors]

The activity of the research nowadays in this area is illustrated in papers such as [21,26,45,53,56,57,63,104,119,121,126,127,132,133,134,135,136,142,169,200]. A quite re-

    markable molecular-dynamical result has been ob-

    tained for a paradigmatic long-range Hamiltonian: the

distribution of time-averaged velocities differs appreciably

    from that found for the ensemble-averaged velocities,

    and has been shown to be numerically consistent with

    a q-Gaussian [137], as shown in Fig. 7. This result pro-

    vides strong support to a conjecture made long ago:

    see Fig. 4 at p. 8 of [157].

Entropy, Figure 9: Experiments in [81]: (a) Velocity distribution (superimposed with a q-Gaussian); (b) Index q as a function of the frequency; (c) Velocity distribution (superimposed with a q-Gaussian; the red curve is a Gaussian); (d) Tail of the velocity distribution (superimposed with the asymptotic power-law of a q-Gaussian). [By courtesy of the authors]

    Stochastic differential equations Quite generic Fokker–

    Planck equations are currently being studied. Aspects

    such as fractional derivatives, nonlinearities, space-de-

    pendent diffusion coefficients are being focused on,

    as well as their connections to entropic forms, and

    associated generalized Langevin equations [20,23,24,

    70,128,168,214]. Quite recently, computational (see

    Fig. 8) and experimental (see Fig. 9) verifications of

    Lutz’ 2003 prediction [110] have been exhibited [81],

    namely about the q-Gaussian form of the velocity

    distribution of cold atoms in dissipative optical lat-

tices, with $q = 1 + 44 E_R/U_0$ ($E_R$ and $U_0$ being energy parameters of the optical lattice). These experimental verifications are at variance with some of

    those exhibited previously [96], namely double-Gaus-

    sians. Although it is naturally possible that the ex-

    perimental conditions have not been exactly equiv-

    alent, this interesting question remains open at the

    present time. A hint might be hidden in the recent

    results [62] obtained for a quite different problem,

    namely the size distributions of avalanches; indeed,

    at a critical state, a q-Gaussian shape was obtained,

    whereas, at a noncritical state, a double-Gaussian was

    observed.

Quantum entanglement and quantum chaos The non-

    local nature of quantum physics implies phenomena

    that are somewhat analogous to those originated by

    classical long-range interactions. Consequently, a va-

    riety of studies are being developed in connection

    with the entropy Sq [3,36,58,59,60,61,155,156,195].

    The same happens with some aspects of quantum

    chaos [11,180,210,211].

    Astrophysics, geophysics, economics, linguistics, cog-

    nitive psychology, and other interdisciplinary appli-

cations Applications are available and presently being sought

    in many areas of physics (plasmas, turbulence, nuclear

    collisions, elementary particles, manganites), but also

in interdisciplinary sciences such as astrophysics [38,47,

    48,49,78,84,87,101,109,129,196], geophysics [4,5,6,7,8,

    9,10,62,208], economics [25,50,51,52,80,139,141,197,

    215], linguistics [118], cognitive psychology [181], and

    others.

    Global optimization, image and signal processing

    Optimizing algorithms and related techniques for

    signal and image processing are currently being de-

    veloped using the entropic concepts presented in this

    article [17,35,72,75,95,105,114,120,164,166,191].

    Superstatistics and other generalizations The methods

    discussed here have been generalized along a vari-

    ety of lines. These include Beck–Cohen superstatis-

    tics [32,34,65,190], crossover statistics [194,196], spec-

    tral statistics [201]. Also, a huge variety of entropies

    have been introduced which generalize in different

    manners the BG entropy, or even focus on other pos-

    sibilities. Their number being nowadays over forty, we

    mention here just a few of them: see [18,44,69,98,99,

    115].

    Acknowledgments

Among the very many colleagues to whom I am deeply grateful for profound and long-lasting comments over many years, it is a must to explicitly thank S. Abe,

    E.P. Borges, E.G.D. Cohen, E.M.F. Curado, M. Gell-Mann,

    R.S. Mendes, A. Plastino, A.R. Plastino, A.K. Rajagopal, A.

    Rapisarda and A. Robledo.

    Bibliography

    1. Abe S (2000) Axioms and uniqueness theorem for Tsallis en-

    tropy. Phys Lett A 271:74–79

    2. Abe S (2002) Stability of Tsallis entropy and instabilities of

Renyi and normalized Tsallis entropies: A basis for q-exponen-

    tial distributions. Phys Rev E 66:046134

    3. Abe S, Rajagopal AK (2001) Nonadditive conditional entropy

    and its significance for local realism. Physica A 289:157–164

    4. Abe S, Suzuki N (2003) Law for the distance between succes-

    sive earthquakes. J Geophys Res (Solid Earth) 108(B2):2113

    5. Abe S, Suzuki N (2004) Scale-free network of earthquakes. Eu-

    rophys Lett 65:581–586

    6. Abe S, Suzuki N (2005) Scale-free statistics of time interval be-

    tween successive earthquakes. Physica A 350:588–596

    7. Abe S, Suzuki N (2006) Complex network of seismicity. Prog

    Theor Phys Suppl 162:138–146

    8. Abe S, Suzuki N (2006) Complex-network description of seis-

    micity. Nonlinear Process Geophys 13:145–150

    9. Abe S, Sarlis NV, Skordas ES, Tanaka H, Varotsos PA (2005) Op-

    timality of natural time representation of complex time series.

    Phys Rev Lett 94:170601

    10. Abe S, Tirnakli U, Varotsos PA (2005) Complexity of seismicity

    and nonextensive statistics. Europhys News 36:206–208

11. Abul-Magd AY (2005) Nonextensive random matrix theory ap-

    proach to mixed regular-chaotic dynamics. Phys Rev E

    71:066207

    12. Albert R, Barabasi AL (2000) Phys Rev Lett 85:5234–5237

    13. Alemany PA, Zanette DH (1994) Fractal random walks from

    a variational formalism for Tsallis entropies. Phys Rev E

    49:R956–R958

    14. Ananos GFJ, Tsallis C (2004) Ensemble averages and nonex-

    tensivity at the edge of chaos of one-dimensional maps. Phys

    Rev Lett 93:020601

    15. Ananos GFJ, Baldovin F, Tsallis C (2005) Anomalous sensitiv-

    ity to initial conditions and entropy production in standard

    maps: Nonextensive approach. Euro Phys J B 46:409–417

    16. Andrade RFS, Pinho STR (2005) Tsallis scaling and the long-

    range Ising chain: A transfer matrix approach. Phys Rev E

    71:026126

    17. Andricioaei I, Straub JE (1996) Generalized simulated an-

    nealing algorithms using Tsallis statistics: Application to

    conformational optimization of a tetrapeptide. Phys Rev E

    53:R3055–R3058

18. Anteneodo C, Plastino AR (1999) Maximum entropy approach

    to stretched exponential probability distributions. J Phys A

    32:1089–1097

    19. Anteneodo C, Tsallis C (1998) Breakdown of exponential sen-

    sitivity to initial conditions: Role of the range of interactions.

    Phys Rev Lett 80:5313–5316

    20. Anteneodo C, Tsallis C (2003) Multiplicative noise: A mech-

    anism leading to nonextensive statistical mechanics. J Math

    Phys 44:5194–5203

    21. Antoniazzi A, Fanelli D, Barre J, Chavanis P-H, Dauxois T,

Ruffo S (2007) Maximum entropy principle explains quasi-sta-

    tionary states in systems with long-range interactions: The

    example of the Hamiltonian mean-field model. Phys Rev E

    75:011112

    22. Arevalo R, Garcimartin A, Maza D (2007) A non-standard sta-

    tistical approach to the silo discharge. Eur Phys J Special Top-

    ics 143:191–197

    23. Assis PC Jr, da Silva LR, Lenzi EK, Malacarne LC, Mendes RS

    (2005) Nonlinear diffusion equation, Tsallis formalism and ex-

    act solutions. J Math Phys 46:123303

    24. Assis PC Jr, da Silva PC, da Silva LR, Lenzi EK, Lenzi MK (2006)

Nonlinear diffusion equation and nonlinear external force: Ex-

    act solution. J Math Phys 47:103302

    25. Ausloos M, Ivanova K (2003) Dynamical model and nonexten-

sive statistical mechanics of a market index on large time win-

    dows. Phys Rev E 68:046122

26. Baldovin F, Orlandini E (2006) Incomplete equilibrium in long-

    range interacting systems. Phys Rev Lett 97:100601

    27. Baldovin F, Robledo A (2002) Sensitivity to initial conditions

    at bifurcations in one-dimensional nonlinear maps: Rigorous

    nonextensive solutions. Europhys Lett 60:518–524

    28. Baldovin F, Robledo A (2002) Universal renormalization-

    group dynamics at the onset of chaos in logistic maps and

    nonextensive statistical mechanics. Phys Rev E 66:R045104

    29. Baldovin F, Robledo A (2004) Nonextensive Pesin identity. Ex-

    act renormalization group analytical results for the dynam-

    ics at the edge of chaos of the logistic map. Phys Rev E

    69:R045202

    30. Baldovin F, Robledo A (2005) Parallels between the dynamics

    at the noise-perturbed onset of chaos in logistic maps and

    the dynamics of glass formation. Phys Rev E 72:066213

    31. Baldovin F, Moyano LG, Majtey AP, Robledo A, Tsallis C (2004)

    Ubiquity of metastable-to-stable crossover in weakly chaotic

    dynamical systems. Physica A 340:205–218

    32. Beck C, Cohen EGD (2003) Superstatistics. Physica A 322:267–

    275

    33. Beck C, Schlogl F (1993) Thermodynamics of Chaotic Systems.

    Cambridge University Press, Cambridge

    34. Beck C, Cohen EGD, Rizzo S (2005) Atmospheric turbulence

    and superstatistics. Europhys News 36:189–191

    35. Ben A Hamza (2006) Nonextensive information-theoretic

    measure for image edge detection. J Electron Imaging 15:

    013011

    36. Batle J, Plastino AR, Casas M, Plastino A (2004) Inclusion rela-

    tions among separability criteria. J Phys A 37:895–907

    37. Batle J, Casas M, Plastino AR, Plastino A (2005) Quantum en-

    tropies and entanglement. Intern J Quantum Inf 3:99–104

    38. Bernui A, Tsallis C, Villela T (2007) Deviation from Gaussian-

    ity in the cosmic microwave background temperature fluctu-

    ations. Europhys Lett 78:19001

39. Boccaletti S, Latora V, Moreno Y, Chavez M, Hwang D-U (2006)

    Phys Rep 424:175–308

    40. Bologna M, Tsallis C, Grigolini P (2000) Anomalous diffu-

    sion associated with nonlinear fractional derivative Fokker-

    Planck-like equation: Exact time-dependent solutions. Phys

    Rev E 62:2213–2218

    41. Boltzmann L (1896) Vorlesungen über Gastheorie. Part II, ch

    I, paragraph 1. Leipzig, p 217; (1964) Lectures on Gas Theory

    (trans: Brush S). Univ. California Press, Berkeley

    42. Boon JP, Tsallis C (eds) (2005) Nonextensive Statistical Me-

    chanics: New Trends, New Perspectives. Europhysics News

    36(6):185–231

    43. Borges EP (2004) A possible deformed algebra and calculus

    inspired in nonextensive thermostatistics. Physica A 340:95–

    101

    44. Borges EP, Roditi I (1998) A family of non-extensive entropies.

    Phys Lett A 246:399–402

    45. Borges EP, Tsallis C (2002) Negative specific heat in a Len-

    nard–Jones-like gas with long-range interactions. Physica A

    305:148–151

    46. Borges EP, Tsallis C, Ananos GFJ, Oliveira PMC (2002)

    Nonequilibrium probabilistic dynamics at the logistic map

    edge of chaos. Phys Rev Lett 89:254103

    47. Burlaga LF, Vinas AF (2004) Multiscale structure of the mag-

    netic field and speed at 1 AU during the declining phase of

    solar cycle 23 described by a generalized Tsallis PDF. J Geo-

    phys Res Space – Phys 109:A12107

    48. Burlaga LF, Vinas AF (2005) Triangle for the entropic index q

of non-extensive statistical mechanics observed by Voyager 1

    in the distant heliosphere. Physica A 356:375–384

    49. Burlaga LF, Ness NF, Acuna MH (2006) Multiscale structure

    of magnetic fields in the heliosheath. J Geophys Res Space –

    Phys 111:A09112

    50. Borland L (2002) A theory of non-gaussian option pricing.

    Quant Finance 2:415–431

    51. Borland L (2002) Closed form option pricing formulas based

    on a non-Gaussian stock price model with statistical feed-

    back. Phys Rev Lett 89:098701

    52. Borland L, Bouchaud J-P (2004) A non-Gaussian option pric-

ing model with skew. Quant Finance 4:499–514

    53. Cabral BJC, Tsallis C (2002) Metastability and weak mix-

ing in classical long-range many-rotator systems. Phys Rev E

    66:065101(R)

    54. Calabrese P, Cardy J (2004) JSTAT – J Stat Mech Theory Exp

    P06002

    55. Callen HB (1985) Thermodynamics and An Introduction to

    Thermostatistics, 2nd edn. Wiley, New York

    56. Chavanis PH (2006) Lynden-Bell and Tsallis distributions for

    the HMF model. Euro Phys J B 53:487–501

    57. Chavanis PH (2006) Quasi-stationary states and incomplete

    violent relaxation in systems with long-range interactions.

    Physica A 365:102–107

    58. Canosa N, Rossignoli R (2005) General non-additive entropic

forms and the inference of quantum density operators. Phys-

    ica A 348:121–130

    59. Cannas SA, Tamarit FA (1996) Long-range interactions and

    nonextensivity in ferromagnetic spin models. Phys Rev B

    54:R12661–R12664

    60. Caruso F, Tsallis C (2007) Extensive nonadditive entropy in

    quan

