
What Entropy at the Edge of Chaos?

2005, Complexity, Metastability and Nonextensivity

Numerical experiments support the interesting conjecture that statistical methods may be applicable not only to fully-chaotic systems, but also at the edge of chaos, by using Tsallis' generalizations of the standard exponential and entropy. In particular, the entropy increases linearly and the sensitivity to initial conditions grows as a generalized exponential. We show that this conjecture actually has a broader validity, using a large class of deformed entropies and exponentials and the logistic map as test cases.

arXiv:cond-mat/0501299v2 [cond-mat.stat-mech] 20 Jan 2005

WHAT ENTROPY AT THE EDGE OF CHAOS?*

MARCELLO LISSIA, MASSIMO CORADDU AND ROBERTO TONELLI
Ist. Naz. Fisica Nucleare (I.N.F.N.), Dipart. di Fisica dell'Università di Cagliari, INFM-SLACS Laboratory, I-09042 Monserrato (CA), Italy
E-mail: [email protected]

Chaotic systems at the edge of chaos constitute natural experimental laboratories for extensions of Boltzmann-Gibbs statistical mechanics. The concept of a generalized exponential can unify power-law and exponential sensitivity to initial conditions, leading to the definition of generalized Lyapunov exponents [1]: the sensitivity is

ξ ≡ lim_{t→∞} lim_{Δx(0)→0} Δx(t)/Δx(0) ∼ ẽxp(λt),

where the generalized exponential is ẽxp(x) = exp_q(x) = [1 + (1 − q)x]^{1/(1−q)}; the exponential behavior of the fully-chaotic regime is recovered for q → 1: lim_{q→1} exp_q(λ_q t) = exp(λt). Analogously, a generalization of the Kolmogorov entropy should describe the relevant rate of loss of information. A general discussion of the relation between the Kolmogorov-Sinai entropy rate and the statistical entropy of fully-chaotic systems can be found in Ref. 2: asymptotically and for ideal coarse graining, the entropy grows linearly with time.
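As a concrete illustration, the q-exponential can be sketched in a few lines of Python (a minimal sketch; the function name `exp_q` is ours, and the cutoff convention for a negative base is the standard one):

```python
import math

def exp_q(x, q):
    """Tsallis generalized exponential: [1 + (1-q)x]^(1/(1-q));
    reduces to the ordinary exp(x) in the limit q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return math.exp(x)
    base = 1.0 + (1.0 - q) * x
    # Standard convention: the q-exponential is cut off where the base turns negative.
    return base ** (1.0 / (1.0 - q)) if base > 0.0 else 0.0

# The fully-chaotic (exponential) regime is recovered as q -> 1:
for q in (0.5, 0.9, 0.99, 0.999):
    print(q, exp_q(2.0, q))
print("exp(2) =", math.exp(2.0))
```

For q < 1 and large arguments the function grows as a power law, x^{1/(1−q)}, which is the behavior exploited below at the edge of chaos.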
The generalized entropy proposed by Tsallis [3],

S_q = (1 − Σ_{i=1}^{N} p_i^q)/(q − 1),

with p_i the fraction of the ensemble found in the i-th cell, reproduces this picture at the edge of chaos; it grows linearly for a specific value of the entropic parameter, q = q_sens = 0.2445 in the logistic map: lim_{t→∞} lim_{L→0} S_q(t)/t = K_q. The same exponent describes the asymptotic power-law sensitivity to initial conditions [4]. This conjecture includes an extension of the Pesin identity, K_q = λ_q. Numerical evidence with the entropic form S_q exists for the logistic [1] and generalized logistic-like maps [5]. Renormalization-group methods [6] yield the asymptotic exponent of the sensitivity to initial conditions in the logistic and generalized logistic maps for specific initial conditions on the attractor; the Pesin identity for Tsallis' entropy has also been studied [7]. Sensitivity and entropy production have been studied in one-dimensional dissipative maps using ensemble-averaged initial conditions [8] and for two symplectic standard maps [9]: the statistical picture has been confirmed with a different exponent, q = q^av_sens ≈ 0.36 [8]. Ensemble averaging over initial conditions is relevant for the relation between ergodicity and chaos and for practical experiments.

The present study demonstrates the broader applicability of the picture described above by using the consistent statistical mechanics arising from the two-parameter family [10,11,12,13] of logarithms

ẽln(ξ) ≡ (ξ^α − ξ^{−β})/(α + β).   (1)

Physical requirements [14] on the resulting entropy select [15] 0 ≤ α ≤ 1 and 0 ≤ β < 1.

* This work was partially supported by MIUR (Ministero dell'Istruzione, dell'Università e della Ricerca) under the MIUR-PRIN-2003 project "Theoretical physics of the nucleus and the many-body systems".
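Eq. (1) is straightforward to put on a computer. The following sketch (function names are ours) implements the two-parameter logarithm and the parameter maps for the three one-parameter cases discussed next; every member reduces to the ordinary logarithm as α, β → 0:

```python
import math

def ln_deformed(x, alpha, beta):
    """Two-parameter deformed logarithm of Eq. (1):
    (x**alpha - x**(-beta)) / (alpha + beta), for x > 0."""
    return (x ** alpha - x ** (-beta)) / (alpha + beta)

# The three one-parameter cases considered in the text:
def ln_tsallis(x, q):         # alpha = 1 - q, beta = 0
    return ln_deformed(x, 1.0 - q, 0.0)

def ln_abe(x, q_a):           # alpha = 1/q_A - 1, beta = 1 - q_A
    return ln_deformed(x, 1.0 / q_a - 1.0, 1.0 - q_a)

def ln_kaniadakis(x, kappa):  # alpha = beta = kappa
    return ln_deformed(x, kappa, kappa)

# For alpha, beta -> 0 the ordinary logarithm is recovered:
print(ln_deformed(5.0, 1e-8, 1e-8), math.log(5.0))
```

Note that for Abe's case β = α/(1 + α) = 1 − q_A and α + β = 1/q_A − q_A, which reproduces the denominator of Eq. (3) below.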
All the entropies of this class (i) are concave [12], (ii) are Lesche stable [16], and (iii) yield normalizable distributions [15]; in addition, we shall show that they (iv) yield a finite non-zero asymptotic rate of entropy production for the logistic map with the appropriate choice of α. We have considered the whole class, but we report here results for three interesting one-parameter cases:

(1) the original Tsallis proposal [3] (α = 1 − q, β = 0):

ẽln(ξ) = ln_q(ξ) ≡ (ξ^{1−q} − 1)/(1 − q);   (2)

(2) Abe's logarithm,

ẽln(ξ) = ln_A(ξ) ≡ (ξ^{1/q_A − 1} − ξ^{q_A − 1})/(1/q_A − q_A),   (3)

where q_A = 1/(1 + α) and β = α/(1 + α), which has the same quantum-group symmetry as, and is related to, the entropy introduced in Ref. 17;

(3) Kaniadakis' logarithm (α = β = κ), which shares the symmetry group of the relativistic momentum transformation [18]:

ẽln(ξ) = ln_κ(ξ) ≡ (ξ^κ − ξ^{−κ})/(2κ).   (4)

The sensitivity to initial conditions and the entropy production have been studied in the logistic map x_{i+1} = 1 − μx_i^2 at the infinite-bifurcation point μ∞ = 1.401155189. The generalized logarithm ẽln(ξ) of the sensitivity, ξ(t) = (2μ)^t ∏_{i=0}^{t−1} |x_i| for 1 ≤ t ≤ 80, has been uniformly averaged by randomly choosing 4 × 10^7 initial conditions −1 < x_0 < 1. Analogously to the chaotic regime, the deformed logarithm of ξ should yield a straight line: ẽln(ξ(t)) = ẽln(ẽxp(λt)) = λt. Following Ref. 8, where the exponent obtained with this averaging procedure, indicated by ⟨· · ·⟩, was denoted q^av_sens for Tsallis' entropy, each of the generalized logarithms ⟨ẽln(ξ(t))⟩ has been fitted to a quadratic function for 1 ≤ t ≤ 80, and α has been chosen such that the coefficient of the quadratic term is zero: we call this value α^av_sens.
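A scaled-down version of this numerical experiment can be sketched as follows (our own sketch: far fewer initial conditions than the 4 × 10^7 of the text, Tsallis-form logarithm only, and the logarithm of ξ(t) is accumulated instead of the product itself for numerical stability):

```python
import math
import random

MU = 1.401155189  # infinite-bifurcation (Feigenbaum) point of x' = 1 - mu*x^2

def avg_deformed_log_sensitivity(alpha, n_init=2000, t_max=40, seed=12345):
    """Uniform average of ln~(xi(t)) over random initial conditions,
    with xi(t) = (2*mu)^t * prod_{i<t} |x_i| and the Tsallis-form
    logarithm ln~(x) = (x**alpha - 1)/alpha (beta = 0)."""
    rng = random.Random(seed)
    acc = [0.0] * (t_max + 1)
    for _ in range(n_init):
        x = rng.uniform(-1.0, 1.0)
        log_xi = 0.0
        for t in range(1, t_max + 1):
            log_xi += math.log(2.0 * MU * abs(x))  # |f'(x_i)| = 2*mu*|x_i|
            acc[t] += (math.exp(alpha * log_xi) - 1.0) / alpha
            x = 1.0 - MU * x * x
    return [a / n_init for a in acc]

s = avg_deformed_log_sensitivity(alpha=0.65)
# <ln~ xi(t)> should grow roughly linearly in t near alpha = alpha_sens^av
print(s[10], s[20], s[40])
```

With the full statistics of the text, the quadratic-fit procedure over this curve selects α^av_sens as the value of α that kills the quadratic term.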
Statistical errors, estimated by repeating the whole procedure with sub-samples of the 4 × 10^7 initial conditions, and systematic uncertainties, estimated by including different numbers of points in the fit, have been combined in quadrature. We find that the asymptotic exponent α^av_sens = 0.650 ± 0.005 is consistent with the value of Ref. 8, q^av_sens = 1 − α^av_sens ≈ 0.36. The error on α^av_sens is dominated by the systematic one (choice of the number of points), due to the inclusion of small values of ξ, which produces 1% discrepancies from the common asymptotic behavior.

Figure 1 shows the straight-line behavior of ẽln(ξ) for all formulations when α = α^av_sens (left frame); the corresponding slopes λ (generalized Lyapunov exponents) are 0.271 ± 0.004 (Tsallis), 0.185 ± 0.004 (Abe) and 0.148 ± 0.004 (Kaniadakis). While α is a universal characteristic of the map, the slope λ depends strongly on the choice of the logarithm.

Figure 1. Generalized logarithm of the sensitivity to initial conditions (left) and generalized entropy (right) as functions of time. From top to bottom: Tsallis', Abe's and Kaniadakis' logarithms (entropies) for α = α^av_sens. In the left frame the slopes λ (generalized Lyapunov exponents) are 0.271 ± 0.004, 0.185 ± 0.004 and 0.148 ± 0.004; in the right frame the slopes K (generalized Kolmogorov entropies) are 0.267 ± 0.004, 0.186 ± 0.004 and 0.152 ± 0.004.

The entropy has been calculated by dividing the interval (−1, 1) into W = 10^5 equal-size boxes, putting at the initial time N = 10^6 copies of the system with a uniform random distribution within one box, and then letting the systems evolve according to the map.
At each time t, with p_i(t) ≡ n_i(t)/N, where n_i(t) is the number of systems found in box i at time t, the entropy of the ensemble is

S(t) ≡ ⟨Σ_{i=1}^{W} p_i(t) ẽln(1/p_i(t))⟩ = ⟨Σ_{i=1}^{W} (p_i^{1−α}(t) − p_i^{1+β}(t))/(α + β)⟩,   (5)

where ⟨· · ·⟩ is an average over 2 × 10^4 experiments, each one starting from one box randomly chosen among the W boxes. The application of the MaxEnt principle to the entropy (5) yields as distribution the deformed exponential that is the inverse function of the corresponding logarithm of Eq. (1): ẽxp(x) = ẽln^{−1}(x) [15].

Figure 2. Tsallis' (top left), Abe's (top right) and Kaniadakis' (bottom) entropies as functions of time for α = 1 − q = 0.80, 0.74, 0.64, 0.56, 0.52 (from top to bottom). Straight lines are guides for the eye when α = 0.64 ≈ α^av_sens.

Analogously to the strongly chaotic case, where an exponential sensitivity (α = β = 0) is associated with a linearly rising Shannon entropy, which is defined in terms of the usual logarithm (α = β = 0), we use for the entropy of Eq. (5) the same values of α and β as for the sensitivity. Fig. 1 (right frame) shows that this choice leads to entropies that grow linearly: the corresponding slopes K (generalized Kolmogorov entropies) are 0.267 ± 0.004 (Tsallis), 0.186 ± 0.004 (Abe) and 0.152 ± 0.004 (Kaniadakis). This linear behavior disappears when α ≠ α^av_sens, as shown in Fig. 2 for Tsallis', Abe's and Kaniadakis' entropies. In addition, the whole class of entropies and logarithms verifies the Pesin identity K = λ, confirming what was already known for Tsallis' formulation [1,8]. The values of λ and K for the Tsallis, Abe and Kaniadakis formulations are given in the caption of Fig. 1 as important explicit examples of this identity.
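The box-counting procedure just described can be reproduced at reduced scale (a sketch with W, N and the number of experiments much smaller than in the text; function names are ours):

```python
import random

MU = 1.401155189  # infinite-bifurcation point of x' = 1 - mu*x^2

def deformed_entropy(probs, alpha, beta):
    """Trace-form entropy of Eq. (5): sum_i (p_i^(1-alpha) - p_i^(1+beta))/(alpha+beta)."""
    return sum((p ** (1.0 - alpha) - p ** (1.0 + beta)) / (alpha + beta)
               for p in probs if p > 0.0)

def entropy_growth(alpha, beta, W=1000, N=2000, t_max=15, n_exp=20, seed=7):
    """Average S(t) over experiments, each starting with all N copies
    uniformly distributed inside one randomly chosen box of (-1, 1)."""
    rng = random.Random(seed)
    avg = [0.0] * (t_max + 1)
    for _ in range(n_exp):
        box = rng.randrange(W)
        lo = -1.0 + 2.0 * box / W
        xs = [rng.uniform(lo, lo + 2.0 / W) for _ in range(N)]
        for t in range(t_max + 1):
            counts = {}
            for x in xs:
                i = min(int((x + 1.0) * W / 2.0), W - 1)  # box index in (-1, 1)
                counts[i] = counts.get(i, 0) + 1
            avg[t] += deformed_entropy((c / N for c in counts.values()), alpha, beta)
            xs = [1.0 - MU * x * x for x in xs]
    return [a / n_exp for a in avg]

S = entropy_growth(alpha=0.65, beta=0.0)  # Tsallis case at alpha = alpha_sens^av
print(S[0], S[5], S[15])
```

At t = 0 all copies sit in a single box (p = 1), so S(0) = 0 by construction; the entropy then increases with time, and with the full statistics of the text the growth is linear for α = α^av_sens.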
An intuitive explanation of the dependence of the value of K on β, and details on the calculations, can be found in Ref. 19.

In summary, numerical evidence corroborates and extends Tsallis' conjecture that, analogously to strongly chaotic systems, weakly chaotic systems can also be described by an appropriate statistical formalism. In addition to sharing the asymptotic power-law behavior needed to correctly describe chaotic systems, extended formalisms should verify precise theoretical requirements. These requirements define a large class of entropies; within this class we use the two-parameter formula (5), which includes Tsallis' seminal proposal. Its simple power-law form describes both the small- and large-probability behaviors. Specifically, the logistic map shows:

(a) a power-law sensitivity to initial conditions with a specific exponent, ξ ∼ t^{1/α}, where α = 0.650 ± 0.005; this sensitivity can be described by deformed exponentials with the same asymptotic behavior, ξ(t) = ẽxp(λt) (see Fig. 1, left frame);

(b) a constant asymptotic entropy-production rate (see Fig. 1, right frame) for trace-form entropies that go as p^{1−α} in the limit of small probabilities p, where α is the same exponent as for the sensitivity;

(c) the asymptotic exponent α is related to the parameters of known entropies: α = 1 − q, where q is the entropic index of Tsallis' thermodynamics [3]; α = 1/q_A − 1, where q_A appears in the generalization (3) of Abe's entropy [17]; α = κ, where κ is the parameter of Kaniadakis' statistics [18];

(d) the Pesin identity holds, S_β/t → K_β = λ_β, for each choice of entropy and corresponding exponential in the class, even if the value of K_β = λ_β depends on the specific entropy and is not a characteristic of the map as α is [19];

(e) this picture is not valid for every entropy: an important counterexample is the Renyi entropy^a,

S_q^(R)(t) = (1 − q)^{−1} log [Σ_{i=1}^{N} p_i^q(t)],

which has a non-linear behavior for any choice of the parameter q = 1 − α (see Fig. 3).

Figure 3. Renyi's entropy for 0.1 ≤ α = 1 − q ≤ 0.95 (from top to bottom).

We gratefully thank S. Abe, F. Baldovin, G. Kaniadakis, G. Mezzorani, P. Quarati, A. Robledo, A. M. Scarfone, U. Tirnakli, and C. Tsallis for suggestions and comments.

^a A comparison of Tsallis' and Renyi's entropies for the logistic map can also be found in Ref. 20.

References
1. C. Tsallis, A. R. Plastino, and W.-M. Zheng, Chaos Solitons Fractals 8, 885 (1997).
2. V. Latora and M. Baranger, Phys. Rev. Lett. 82, 520 (1999).
3. C. Tsallis, J. Statist. Phys. 52, 479 (1988).
4. V. Latora, M. Baranger, A. Rapisarda and C. Tsallis, Phys. Lett. A 273, 97 (2000).
5. U. M. S. Costa, M. L. Lyra, A. R. Plastino and C. Tsallis, Phys. Rev. E 56, 245 (1997).
6. F. Baldovin and A. Robledo, Phys. Rev. E 66, 045104 (2002); Europhys. Lett. 60, 518 (2002).
7. F. Baldovin and A. Robledo, Phys. Rev. E 69, 045202 (2004).
8. G. F. J. Ananos and C. Tsallis, Phys. Rev. Lett. 93, 020601 (2004).
9. G. F. J. Ananos, F. Baldovin and C. Tsallis, arXiv:cond-mat/0403656.
10. D. P. Mittal, Metrika 22, 35 (1975); B. D. Sharma and I. J. Taneja, Metrika 22, 205 (1975).
11. E. P. Borges and I. Roditi, Phys. Lett. A 246, 399 (1998).
12. G. Kaniadakis, M. Lissia and A. M. Scarfone, Physica A 340, 41 (2004).
13. G. Kaniadakis and M. Lissia, Physica A 340, xv (2004) [arXiv:cond-mat/0409615].
14. J. Naudts, Physica A 316, 323 (2002) [arXiv:cond-mat/0203489].
15. G. Kaniadakis, M. Lissia and A. M. Scarfone, arXiv:cond-mat/0409683.
16. S. Abe, G. Kaniadakis and A. M. Scarfone, J. Phys. A (Math. Gen.) 37, 10513 (2004).
17. S. Abe, Phys. Lett. A 224, 326 (1997).
18. G. Kaniadakis, Physica A 296, 405 (2001); Phys. Rev. E 66, 056125 (2002).
19. R. Tonelli, G. Mezzorani, F. Meloni, M. Lissia and M. Coraddu, arXiv:cond-mat/0412730.
20. R. S. Johal and U. Tirnakli, Physica A 331, 487 (2004).