The rate of entropy increase at the edge of chaos
arXiv:cond-mat/9907412v4 [cond-mat.stat-mech] 7 Jul 2000
Vito Latora (a)
Center for Theoretical Physics, Laboratory for Nuclear Sciences and Department of Physics, Massachusetts Institute of Technology, Cambridge, Massachusetts 02139, USA;
Department of Physics, Harvard University, Cambridge, Massachusetts 02138, USA;
and Dipartimento di Fisica, Università di Catania, and Istituto Nazionale di Fisica Nucleare, Sezione di Catania, Corso Italia 57, I-95129 Catania, Italy

Michel Baranger (b)
Center for Theoretical Physics, Laboratory for Nuclear Sciences and Department of Physics, Massachusetts Institute of Technology, Cambridge, Massachusetts 02139, USA

Andrea Rapisarda (c)
Dipartimento di Fisica, Università di Catania, and Istituto Nazionale di Fisica Nucleare, Sezione di Catania, Corso Italia 57, I-95129 Catania, Italy

Constantino Tsallis (d)
Centro Brasileiro de Pesquisas Físicas, Xavier Sigaud 150, 22290-180, Rio de Janeiro-RJ, Brazil;
Physics Department, University of North Texas, P.O. Box 311427, Denton, Texas 76203, USA;
and Department of Mechanical Engineering, Massachusetts Institute of Technology, Rm. 3-164, Cambridge, Massachusetts 02139, USA
(February 1, 2008)
Under certain conditions, the rate of increase of the statistical entropy of a simple, fully chaotic,
conservative system is known to be given by a single number, characteristic of this system, the
Kolmogorov–Sinai entropy rate. This connection is here generalized to a simple dissipative system,
the logistic map, and especially to the chaos threshold of the latter, the edge of chaos. It is found
that, in the edge–of–chaos case, the usual Boltzmann–Gibbs–Shannon entropy is not appropriate. Instead, the non–extensive entropy S_q ≡ (1 − Σ_{i=1}^{W} p_i^q)/(q − 1) must be used. The latter contains a parameter q, the entropic index, which must be given a special value q* ≠ 1 (for q = 1 one recovers the usual entropy) characteristic of the edge of chaos under consideration. The same q* enters also in the
description of the sensitivity to initial conditions, as well as in that of the multifractal spectrum of
the attractor.
PACS numbers: 05.45.-a, 05.45.Df, 05.70.Ce
Final Version accepted for publication in Physics Letters A
The connection between chaos and thermodynamics has been receiving increased attention lately. A review of the
central ideas can be found in [1]. Recent studies have focused on both conservative (classical long–range interacting
many–body Hamiltonians [2], low–dimensional conservative maps [3]) and dissipative (low–dimensional maps [4,5],
many–body self–organized criticality [6], symbolic sequences [7]) systems. In ref. [3] the connection between the
Kolmogorov–Sinai (KS) entropy rate and the statistical entropy (or thermodynamic entropy) was brought out for
simple conservative systems. In ref. [4] the non–extensive entropy introduced some years ago by one of us [8] was
shown to be the relevant quantity at the chaos threshold (the edge of chaos). This entropy contains a parameter q
which has been called the entropic index and it reduces to the usual Boltzmann–Gibbs–Shannon (BGS) entropy when
q = 1. In refs. [4,5] q was determined in two completely different ways: from the sensitivity to initial conditions and
from the multifractal spectrum (using 1/(1 − q) = 1/αmin − 1/αmax, where αmin and αmax are respectively the lower
and upper values where the multifractal function f (α) vanishes).
The purpose of this letter is to extend the work of [3] to a dissipative case and to focus especially on the edge of
chaos. The results are: 1) In the chaotic regime the linear rate of the BGS entropy gives the KS entropy; 2) At the
edge of chaos, the non–extensive entropy for one particular value q ≠ 1 grows linearly with time, as did the usual
entropy in [3]; 3) The value of q thus determined at the edge of chaos is identical with that found with the two other
independent methods respectively used in refs. [4] and [5].
The dissipative system chosen is the simplest possible: the logistic map, a nonlinear one–dimensional dynamical
system described by the iterative rule [1]:
x_{t+1} = 1 − a x_t^2   (−1 ≤ x_t ≤ 1; 0 ≤ a ≤ 2; t = 0, 1, 2, ...) .   (1)
It has chaotic behavior (with a positive Lyapunov exponent) for most of the values of the control parameter a above
the critical value ac ≡ 1.40115519.... This critical value marks the edge of chaos.
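As a concrete illustration (ours, not part of the original letter), the map of eq. (1) and its Lyapunov exponent λ1 = ⟨ln |f′(x_t)|⟩, with f′(x) = −2ax, can be estimated in a few lines; the values quoted below (ln 2 at a = 2, about 0.36 at a = 1.6, essentially zero at ac) are easy to reproduce. Function and parameter names are ours.

```python
# Minimal sketch: iterate the logistic map of eq. (1) and estimate the
# Lyapunov exponent lambda_1 = <ln |f'(x_t)|>, with f'(x) = -2 a x.
import math
import random

def lyapunov(a, n_iter=100_000, n_discard=1_000, x0=None):
    """Estimate lambda_1 for x_{t+1} = 1 - a x_t^2 by averaging ln|-2 a x_t|."""
    x = random.uniform(-1.0, 1.0) if x0 is None else x0
    for _ in range(n_discard):            # discard the initial transient
        x = 1.0 - a * x * x
    acc = 0.0
    for _ in range(n_iter):
        acc += math.log(abs(-2.0 * a * x))
        x = 1.0 - a * x * x
    return acc / n_iter

a_c = 1.40115519
print(lyapunov(2.0))   # fully chaotic: close to ln 2 ~ 0.693
print(lyapunov(1.6))   # chaotic: close to 0.36 (cf. the value 0.3578 quoted below)
print(lyapunov(a_c))   # edge of chaos: close to 0 (convergence is slow here)
```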
For convenience, we recall here the definition of the non–extensive entropy [8]. If the phase space R has been
divided into W cells of equal measure, and if the probability of being in cell i is pi , we define the entropy Sq by
S_q ≡ (1 − Σ_{i=1}^{W} p_i^q) / (q − 1)   (q ∈ R) .   (2)
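Two elementary checks (ours, not spelled out in the letter) are worth recording: the q → 1 limit of eq. (2), and its value for equal probabilities, which is the saturation bound quoted later in the text.

```latex
% q -> 1 limit: expand p_i^q = p_i e^{(q-1)\ln p_i} to first order in (q-1)
S_q=\frac{1-\sum_{i=1}^{W}p_i^{\,q}}{q-1}
   =\frac{1-\sum_{i=1}^{W}p_i\left[1+(q-1)\ln p_i+\mathcal{O}\!\left((q-1)^2\right)\right]}{q-1}
   \;\xrightarrow[\;q\to1\;]{}\;-\sum_{i=1}^{W}p_i\ln p_i=S_1 .
% equal probabilities p_i = 1/W (equiprobability) give the maximum value
S_q\Big|_{p_i=1/W}=\frac{1-W\cdot W^{-q}}{q-1}=\frac{W^{1-q}-1}{1-q}
   \;\xrightarrow[\;q\to1\;]{}\;\ln W .
```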
For q = 1 this is S_1 = − Σ_{i=1}^{W} p_i ln p_i, the usual entropy. This generalized entropy was proposed a decade ago [8] to
allow statistical mechanics to cover certain anomalies due to a possible (multi)fractal structure of the relevant phase
space (for example, whenever we have long–range interactions, long–range microscopic memory, multifractal boundary
conditions, some dissipative process, etc). A review of the existing theoretical, experimental and computational
evidence and connections is now available [9] (very recent verifications in fully developed turbulence and in electron–positron annihilation producing hadronic jets are exhibited in [10] and [11], respectively). We also recall the two main
results of refs. [4,5]. The first one [4] concerns the sensitivity to initial conditions. In a truly chaotic system, the separation ξ(t) between nearby trajectories, suitably averaged over phase space, diverges in time like the exponential exp(λ1 t), where λ1 > 0 is the Lyapunov exponent. At the edge of chaos, on the other hand, the upper bound of the growth of the separation follows a power law which may be written
ξ(t) ∝ [1 + (1 − q) λ_q t]^{1/(1−q)}   (q ∈ R)   (3)
in terms of a certain parameter q. The exponential is recovered in the limit q = 1. For the logistic map at the edge
of chaos, the Lyapunov exponent λ1 is found to vanish, but the growth of the separation is fitted well [12] by eq. (3)
with q = q ∗ ≡ 0.2445.... The second result concerns the geometrical description of the multifractal attractor existing
at ac . See ref. [5] for details. This gives a different method for finding a special value of q which fits the results, and
this value turns out to be again 0.2445.
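A hedged numerical illustration (ours; parameters are arbitrary) of the two regimes: at a = 2 the separation between two trajectories started 10^{-12} apart grows roughly like exp(t ln 2), while at ac it stays orders of magnitude smaller and fluctuates strongly, eq. (3) describing only the upper envelope of that fluctuating curve (see [12]).

```python
# Sketch: separation xi(t) = |delta x(t)| / |delta x(0)| between two nearby
# logistic-map trajectories, in the chaotic case and at the edge of chaos.
import math

def separation(a, t_max=25, x0=0.1, dx0=1e-12):
    x, y = x0, x0 + dx0
    xi = []
    for _ in range(t_max + 1):
        xi.append(abs(y - x) / dx0)
        x, y = 1.0 - a * x * x, 1.0 - a * y * y
    return xi

a_c = 1.40115519
xi_strong = separation(2.0)    # strong sensitivity: roughly exp(t ln 2)
xi_weak = separation(a_c)      # weak sensitivity: slow, strongly fluctuating
for t in (5, 10, 15, 20):
    print(t, xi_strong[t], math.exp(t * math.log(2.0)), xi_weak[t])
```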
The power–law sensitivity to the initial conditions has already been noticed in the literature [13]. We shall refer to it
as weak sensitivity, as opposed to the exponential law which we call strong sensitivity. The weak case is characteristic
of the edge of chaos. The conclusion which seems to emerge from our work is that the various manifestations of the
edge of chaos all contain in their description a certain parameter q ∗ , which has the same value for all of them; and
that moreover this q ∗ is the entropic index which must be used (instead of the usual value 1) in a thermodynamic
description of an edge–of–chaos system.
We shall now present the numerical work we have done, in which q ∗ is calculated in a third, completely different
way, which involves rates of increase of entropies. We use the analysis developed in ref. [3] for conservative systems.
In order to do so we partition the interval −1 ≤ x ≤ 1 into W equal cells; we choose (randomly or not) one of them
and we select (randomly or uniformly) N values of x (inside that cell) which will be considered as initial conditions
for the logistic map at a given value of a. As t evolves, the N points spread within the [−1, 1] interval in such a way
that we have a set {N_i(t)} with Σ_{i=1}^{W} N_i(t) = N ∀t, and a set of probabilities {p_i(t) ≡ N_i(t)/N}. Differently from ref. [3], we consider here the general entropy (2), which for q = 1 reduces to the entropy used in [3].
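The procedure just described is easy to sketch in code. The following is our own minimal illustration, with W and N far smaller than the W = 10^5 and N = 10^6 used in the paper (function and parameter names are ours).

```python
# Sketch of the numerical procedure: start N points inside one of the W cells
# of [-1, 1], iterate the logistic map, and track S_q of the occupation
# probabilities p_i(t) = N_i(t)/N.
import numpy as np

def S_q(p, q):
    """Non-extensive entropy of eq. (2); the q -> 1 case is -sum p ln p."""
    p = p[p > 0]
    if abs(q - 1.0) < 1e-12:
        return float(-np.sum(p * np.log(p)))
    return float((1.0 - np.sum(p ** q)) / (q - 1.0))

def entropy_evolution(a, q, W=1000, N=10_000, t_max=20, cell=137, seed=0):
    rng = np.random.default_rng(seed)
    edges = np.linspace(-1.0, 1.0, W + 1)
    x = rng.uniform(edges[cell], edges[cell + 1], size=N)   # initial cell
    S = []
    for _ in range(t_max + 1):
        counts, _ = np.histogram(x, bins=edges)
        S.append(S_q(counts / N, q))
        x = 1.0 - a * x * x
    return S

print(entropy_evolution(2.0, q=1.0))  # roughly linear, slope near ln 2, then saturation
```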
At t = 0, all probabilities but one are zero, hence Sq(0) = 0. And, as t evolves, Sq(t) tends to increase, in all cases bounded by (W^{1−q} − 1)/(1 − q) (ln W when q = 1), which corresponds to equiprobability. Fluctuations are of course present
and can be reduced by considering averages over the initial conditions. As a last step, we define the following rate of increase
κ_q ≡ lim_{t→∞} lim_{W→∞} lim_{N→∞} S_q(t)/t ,   (4)
where κ1 , in the case of chaotic conservative systems, is expected to coincide with the standard KS entropy rate [3].
Our expectations, based on the work of refs. [4,16], are:
(i) A special value q ∗ exists such that Sq∗ (t) increases linearly with time. κq is then finite for q = q ∗ , vanishes for
q > q ∗ and diverges for q < q ∗ .
(ii) When the system is strongly sensitive to initial conditions (λ1 > 0), q ∗ is 1 and the results of ref. [3] can be
extended to dissipative systems.
(iii) At the edge of chaos (λ1 = 0, weakly sensitive systems), q ∗ is different from 1 and coincides with the value
determined from the sensitivity to initial conditions (eq.(3)) and from the multifractal spectrum.
The following results confirm these expectations.
In fig. 1 we present the case a = 2, for which the system is chaotic (with λ1 = ln 2), so we expect q* = 1. We show the time evolution of Sq for three different values of q. Only the curves for q = 1 show a clear linear behavior
before we reach the asymptotic constant value which characterizes the equilibrium distribution in the available part
of phase space. The slope in the intermediate time stage does not depend on W and it is equal to the KS entropy
rate ln 2 (for any one–dimensional system the KS entropy is given by the positive Lyapunov exponent [14]). This is
clear in fig. 2, which shows S1(t) for two different cases, a = 2 and a = 1.6. In both cases the fitted slope agrees with the predicted KS entropy rates, respectively ln 2 and 0.3578.
So far we have shown that q* is 1 for all the cases in which the logistic map is chaotic, i.e. strongly sensitive to
the initial conditions. Now we want to study the same system at its chaos threshold a = ac = 1.40115519... for which
we expect q ∗ = qc = 0.2445.... For this value of a, considerable fluctuations are observed which require an efficient
and careful averaging over the initial conditions. Consider that the attractor occupies only a tiny part of phase space
(844 cells out of the W = 10^5 of our partition). We adopt the following criterion: we choose the initial distribution (made of N = 10^6 points) in one of the W = 10^5 cells of the partition and we study the number of occupied cells
during the time evolution; the integrated number of occupied cells, i.e., the sum of the numbers of occupied cells for
all time steps from iteration 1 to iteration 50, is a measure of how good this initial condition is at spreading itself.
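A minimal sketch (ours) of this spreading score for a single initial cell, again with smaller W and N than in the paper:

```python
# Sketch: integrated number of occupied cells over iterations 1..50 for an
# initial distribution concentrated in one cell of the partition of [-1, 1].
import numpy as np

def spreading_score(cell, a=1.40115519, W=10_000, N=100_000, t_max=50, seed=0):
    rng = np.random.default_rng(seed)
    edges = np.linspace(-1.0, 1.0, W + 1)
    x = rng.uniform(edges[cell], edges[cell + 1], size=N)   # N points in one cell
    score = 0
    for _ in range(t_max):
        x = 1.0 - a * x * x
        counts, _ = np.histogram(x, bins=edges)
        score += int(np.count_nonzero(counts))               # occupied cells at this step
    return score

print(spreading_score(cell=7500))   # an initial cell near x = 0.5 (cf. fig. 3)
```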
We repeat the same study for each one of the W/2 cells in the interval 0 ≤ x ≤ 1 and, in fig.3, we plot the integrated
number of occupied cells vs. the position of the initial distribution. The cells for which this number is larger than
a fixed cutoff (= 5000 in fig. 3) are selected for inclusion in the averaging process (in fig. 3 this is 1251 cells, out of a total of 10^5). In fig. 4 we plot Sq(t) for four different values of q; the curves are an average over the 1251 initial cells
selected by fig. 3. The growth of Sq(t) is found to be linear when q = qc = 0.2445, while for q < qc (q > qc) the curve is convex (concave). This behavior is similar to the one in fig. 1, with a major difference: the linear growth is
not at q = 1 (see inset(a) in fig.4), but at a particular value of the entropic index, which happens to coincide with
q* = 0.2445. In order to make this result much more convincing, we fitted the curves Sq(t) in the time interval [t_1, t_2] with the polynomial S(t) = a + b t + c t^2 [17]. We define R = |c| · (t_1 + t_2)/b as a measure of the importance of the nonlinear term in the fit: if the points were on a perfect straight line, R should be zero. We choose t_1 = 15 and t_2 = 38 for all q's, so that the factor (t_1 + t_2) is just a normalizing constant. Fig. 4(b) shows the minimum of R for
q = qc = 0.2445. These results are not sensitive to small changes in t1 and t2 .
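For completeness, a sketch (ours) of this linearity criterion; the Sq(t) data would come from a routine like the entropy_evolution() sketched earlier, averaged over the selected initial cells, so here we only illustrate R on synthetic curves.

```python
# Sketch: nonlinearity measure R = |c| (t1 + t2) / b from a quadratic fit
# S(t) = a + b t + c t^2 over the window [t1, t2]; small R means linear growth.
import numpy as np

def nonlinearity_R(t, S, t1=15, t2=38):
    mask = (t >= t1) & (t <= t2)
    c, b, _a = np.polyfit(t[mask], S[mask], deg=2)   # coefficients, highest power first
    return abs(c) * (t1 + t2) / b

t = np.arange(0, 41, dtype=float)
print(nonlinearity_R(t, 0.3 * t))                  # perfectly linear: R ~ 0
print(nonlinearity_R(t, 0.3 * t + 0.01 * t**2))    # visible curvature: R ~ 1.8
```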
Precisely the same behavior, in all of its aspects, that we find here for the standard logistic map has also been
found recently [18] for its extended version where the term x_t^2 is generalized into |x_t|^z (z ∈ R). As expected from
well known universality class considerations for this family of maps, it was found that q ∗ depends on z. The same
values for q ∗ (z) were also found [18] for a periodic family of maps which belongs to the same universality class as the
logistic-like family. Finally, for the usual logistic map, the value q ∗ = 0.24... was found through a different algorithm
[19], closer in fact to the original definition of the Kolmogorov-Sinai entropy.
To summarize, we have illustrated for the logistic map the connections between the sensitivity to initial conditions,
the geometrical support in phase space, and the linear growth in time of the entropy Sq . In the case of strong sensitivity
(exponential divergence between trajectories), the geometrical support is Euclidean, and the relevant entropy with
linear growth is S1 , the usual entropy. In the case of weak sensitivity (power–law divergence), the geometrical support
is multifractal, and the relevant entropy whose growth is linear is the non–extensive entropy Sq , with a special value
q* ≠ 1 of the entropic index. This q* is the same parameter which enters also in the power–law divergence and in
the multifractal support: the same q ∗ describes all three phenomena. Thus strong sensitivity and weak sensitivity
to initial conditions become unified in a single description, the difference residing in the particular value of q ( = 1
for the strong case). The KS entropy rate, which is an average loss of information, is also indexed by q. We believe
that the scenario herein exhibited is valid for vast classes of nonlinear dynamical systems, whose full and rigorous
characterization would be very welcome. We conclude, as a final remark, that the still unclear foundation of statistical
mechanics on microscopic dynamics seems more than ever to follow along the lines pioneered by Krylov [20]. Indeed,
the crucial concept appears to be mixing (and not only ergodicity): if the mixing is exponential (strong mixing), then q = 1 and the standard thermodynamical extensivity is the adequate hypothesis for the corresponding physical phenomena, whereas at the edge of chaos the mixing is only algebraic (weak mixing), and then q ≠ 1 and thermodynamical nonextensivity is expected to emerge.
One of us (C.T.) acknowledges the warm hospitality of the Dipartimento di Fisica dell'Università di Catania
where this work was initiated, as well as useful remarks from A. Politi and partial support by CNPq, FAPERJ
and PRONEX/FINEP (Brazilian agencies). V.L. thanks the Blanceflor–Ludovisi Foundation and CNR for financial
support. M.B. acknowledges partial support from the U.S. Department of Energy (D.O.E.) under Contract No.
DE-FC02-94ER40818.
(a) E-mail: [email protected]
(b) E-mail: [email protected]
(c) E-mail: [email protected]
(d) E-mail: [email protected]
[1] C. Beck and F. Schlögl, Thermodynamics of chaotic systems (Cambridge University Press, Cambridge, 1993).
[2] C. Anteneodo and C. Tsallis, Phys. Rev. Lett. 80, 5313 (1998); V. Latora, A. Rapisarda and S. Ruffo, Phys. Rev. Lett. 80,
692 (1998) and V. Latora, A. Rapisarda and S. Ruffo, Physica D 131, 38 (1999); V. Latora, A. Rapisarda and S. Ruffo,
Phys. Rev. Lett. 83, 2104 (1999).
[3] V. Latora and M. Baranger, Phys. Rev. Lett. 82, 520 (1999).
[4] C. Tsallis, A.R. Plastino and W.-M. Zheng, Chaos, Solitons and Fractals 8, 885 (1997); U.M.S. Costa, M.L. Lyra, A.R.
Plastino and C. Tsallis, Phys. Rev. E 56, 245 (1997).
[5] M.L. Lyra and C. Tsallis, Phys. Rev. Lett. 80, 53 (1998); U. Tirnakli, C. Tsallis and M.L. Lyra, Eur.
Phys. J. B. 10, 309 (1999); C.R. da Silva, H.R. da Cruz and M.L. Lyra, Braz. J. Phys. 29, 144 (1999)
[http://sbf.if.usp.br/WWW− pages/Journals/BJP/Vol29/Num1/index.htm].
[6] P.Bak, How Nature Works : The Science of Self-Organized Criticality, (Springer–Verlag, New York 1996); A. Bhowal,
Physica A 247, 327 (1997); F.A. Tamarit, S.A. Cannas and C. Tsallis, Eur. Phys. J. B 1, 545 (1998); A.R.R. Papa and C.
Tsallis, Phys. Rev. E 57, 3923 (1998).
[7] M. Buiatti, P. Grigolini and L. Palatella, Physica A 268, 214 (1999).
[8] C. Tsallis, J. Stat. Phys. 52, 479 (1988); E.M.F. Curado and C. Tsallis, J. Phys. A 24, L69 (1991) [Corrigenda: 24, 3187
(1991) and 25, 1019 (1992)]; C. Tsallis, R.S. Mendes and A.R. Plastino, Physica A 261, 534 (1998). A regularly updated
bibliography on the subject is accessible at http://tsallis.cat.cbpf.br/biblio.htm
[9] C. Tsallis, Nonextensive Statistical Mechanics and Thermodynamics, eds. S.R.A. Salinas and C. Tsallis, Braz. J. Phys. 29,
1 (1999) [http://sbf.if.usp.br/ WWW− pages/Journals/BJP/Vol29/Num1/index.htm]; C. Tsallis, “Nonextensive statistical mechanics and thermodynamics: Historical background and present status”, in Nonextensive Statistical Mechanics and Its Applications, eds. S. Abe and Y. Okamoto, Series “Lecture Notes in Physics” (Springer-Verlag, Berlin, 2000), in press.
[10] C. Beck, Physica A 277, 115 (2000); see also T. Arimitsu and N. Arimitsu, Phys. Rev. E 61, 3237 (2000) and J. Phys. A
(2000), in press.
[11] I. Bediaga, E.M.F. Curado and J. Miranda, Physica A (2000) in press [hep-ph/9905255]; see also C. Beck, Physica A
(2000), in press [hep-ph/0004225].
[12] At the chaos threshold the separation has a very fluctuating behavior. The time series is a set of scattered points organized
in a regular but complicated way. Eq. (3) describes the upper bound of this complicated structure, i.e. the quickest possible
mixing. See details in ref. [4].
[13] P. Grassberger and M. Scheunert, J. Stat. Phys. 26, 697 (1981); T. Schneider, A. Politi and D. Wurtz, Z. Phys. B 66, 469
(1987); G. Anania and A. Politi, Europhys. Lett. 7, 119 (1988).
[14] Ya. Pesin, Russ. Math. Surveys 32, 55 (1977).
[15] B.B. Mandelbrot, J. Fluid Mech. 62, 331 (1974); P. Grassberger, Phys. Lett. A 97, 227 (1983); H.G.E. Hentschel and I.
Procaccia, Physica D 8, 435 (1983); P. Grassberger and I. Procaccia, Phys. Rev. A 28 , 2591 (1983) and Phys. Rev. Lett.
50, 364 (1983); R. Benzi, G. Paladin, G. Parisi and A. Vulpiani, J. Phys. A 17, 3521 (1984); M.J. Feigenbaum, J. Stat.
Phys. 46, 919, 925 (1987); T.C. Halsey, M.H. Jensen, L.P. Kadanoff, I. Procaccia and B.I. Shraiman, Phys. Rev A 33,
1141 (1986).
[16] G.R. Guerberoff, private communication (1998).
[17] This method is more sensitive than the standard correlation coefficient of the linear regression analysis in determining the
best linear slope among different values of q.
[18] U. Tirnakli, G.F.J. Ananos and C. Tsallis, preprint (2000) [cond-mat/0005210].
[19] S. Montangero, L. Fronzoni and P. Grigolini, preprint (1999) [cond-mat/9911412].
[20] N.S. Krylov, Works on the Foundations of Statistical Physics, translated by A.B. Migdal, Ya. G. Sinai and Yu. L. Zeeman,
Princeton Series in Physics (Princeton University Press, Princeton, 1979)
[Figure 1 about here: Sq vs. time for a = 2; curves for q = 0.8, 1, 1.2 with W = 10^4 and W = 10^5.]
FIG. 1. Time evolution of Sq for a = 2. The interval −1 ≤ x ≤ 1 is partitioned into W equal cells. The initial distribution consists of N = 10^6 points placed at random inside a cell picked at random anywhere on the map. We consider three different values of q and the two cases W = 10^4 and W = 10^5. Results are averages over 100 runs.
[Figure 2 about here: S1 vs. time for a = 2 (fit slope 0.69) and a = 1.6 (fit slope 0.36).]
FIG. 2. Time evolution of S1 for a = 2 and a = 1.6; W = 10^5 and averages of 100 runs. The slopes of the fits shown are respectively equal to the averaged Lyapunov exponents.
[Figure 3 about here: integrated number of occupied cells (log scale) vs. x, with the selection cutoff shown as a horizontal line.]
FIG. 3. Integrated number of occupied cells vs. position of the initial cell. The horizontal line selects the best initial
conditions (see text). If we increase or decrease the value of this (numerically convenient, but dispensable) cut-off, the value
for q ∗ remains the same; what changes is the proportionality coefficient between Sq∗ and time.
[Figure 4 about here: Sq vs. time for a = 1.40115519; curves for q = 0.05, 0.2445, 0.5; inset (a): q = 1; inset (b): R vs. q.]
FIG. 4. Time evolution of Sq for a = ac. We consider four different values of q and W = 10^5. The case q = 1 is reported in the inset (a) with a different scale. Results are averages over 1251 runs. We show the coefficient of nonlinearity R vs. q in the inset (b). See text. The 4-digit precision for q* was not attained through the present numerical procedure, but using the scaling 1/(1 − q) = 1/αmin − 1/αmax. The present procedure does not provide higher precision than q* = 0.24....