Optimal information measures for weakly chaotic dynamical systems

Lossless Compression and Complexity of Chaotic Sequences

Computing Research Repository, 2011

We propose a new measure of complexity, defined as the number of iterations of Non-Sequential Recursive Pair Substitution (NSRPS) required to transform the input sequence into a constant sequence. We test this measure on symbolic sequences of the Logistic map for various values of the bifurcation parameter. The proposed measure of complexity is easy to compute and is observed to be highly correlated with the Lyapunov exponent of the original non-linear time series, even for very short symbolic sequences (as short as 50 samples). Finally, we construct symbolic sequences from the Skew-Tent map which are incompressible by popular compression algorithms like WinZip, WinRAR and 7-Zip, but compressible by NSRPS.
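The iteration count described in the abstract can be sketched as follows. This is an illustrative pair-substitution implementation, not the paper's exact code; the conventions for ties and overlapping pairs are assumptions here:

```python
# Sketch of an NSRPS-based complexity measure: repeatedly replace the most
# frequent pair of adjacent symbols with a new symbol, and count how many
# passes are needed before the sequence becomes constant.
from collections import Counter

def nsrps_step(seq):
    """One NSRPS pass: substitute the most frequent adjacent pair."""
    pairs = Counter(zip(seq, seq[1:]))
    target = pairs.most_common(1)[0][0]
    new_symbol = max(seq) + 1          # a fresh symbol not yet in the sequence
    out, i = [], 0
    while i < len(seq):
        if i + 1 < len(seq) and (seq[i], seq[i + 1]) == target:
            out.append(new_symbol)
            i += 2                     # non-overlapping substitution
        else:
            out.append(seq[i])
            i += 1
    return out

def nsrps_complexity(seq):
    """Number of NSRPS iterations needed to reach a constant sequence."""
    seq, n = list(seq), 0
    while len(set(seq)) > 1:
        seq = nsrps_step(seq)
        n += 1
    return n
```

A constant sequence already has complexity 0, and a purely periodic sequence such as 0101... collapses in a single pass, consistent with the idea that low-complexity (low Lyapunov exponent) symbolic dynamics should need few iterations.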

Dynamical systems and computable information

Discrete and Continuous Dynamical Systems - Series B, 2004

We present some new results which relate information to chaotic dynamics. In our approach the quantity of information is measured by the Algorithmic Information Content (Kolmogorov complexity) or by a sort of computable version of it (Computable Information Content), in which the information is measured by the use of a suitable universal data compression algorithm. We apply these notions to the study of dynamical systems by considering the asymptotic behavior of the quantity of information necessary to describe their orbits. When a system is ergodic, this method provides an indicator which equals the Kolmogorov-Sinai entropy almost everywhere. Moreover, if the entropy is 0, our method gives new indicators which measure the unpredictability of the system and allow us to classify various kinds of weak chaos. Actually, this is the main motivation of this work. The behaviour of a zero-entropy dynamical system is far from being completely predictable, except in particular cases. In fact, there are zero-entropy systems which exhibit a sort of weak chaos, where the information necessary to describe the orbit behavior increases with time more than logarithmically (the periodic case) but less than linearly (the positive-entropy case). Also, we believe that the above method is useful for the classification of zero-entropy time series. To support this point of view, we show some theoretical and experimental results in specific cases.
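As a toy illustration of the compression idea (using the generic zlib compressor rather than the authors' own algorithm; map parameters and the binary partition below are chosen for illustration), one can symbolize logistic-map orbits and compare compressed sizes across dynamical regimes:

```python
# Estimate orbit information content with an off-the-shelf compressor:
# symbolize a logistic-map orbit and measure the compressed size.
import zlib

def logistic_orbit_symbols(r, x0, n):
    """Binary symbolization of the logistic map x -> r*x*(1-x): '1' if x > 0.5."""
    x, bits = x0, []
    for _ in range(n):
        x = r * x * (1.0 - x)
        bits.append('1' if x > 0.5 else '0')
    return ''.join(bits)

def compressed_size(s):
    """Length in bytes of the zlib-compressed symbol string."""
    return len(zlib.compress(s.encode('ascii'), 9))

chaotic = logistic_orbit_symbols(4.0, 0.3, 4000)   # positive-entropy regime
periodic = logistic_orbit_symbols(3.2, 0.3, 4000)  # period-2 regime
```

In the chaotic regime the compressed size grows roughly linearly with orbit length, while in the periodic regime it stays nearly constant, mirroring the linear vs. logarithmic information growth discussed in the abstract.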

Characterizing complexity of non-invertible chaotic maps in the Shannon–Fisher information plane with ordinal patterns

Chaos Solitons & Fractals, 2021

Being able to distinguish the different types of dynamics present in a given nonlinear system is of great importance in complex dynamics. It allows one to characterize the system, find similarities and differences with other nonlinear systems, and classify those dynamical regimes in order to understand them better. For systems that develop chaos it is not always easy to distinguish determinism from stochasticity. We analyze several non-invertible maps by projecting them on the two-dimensional Fisher-Shannon plane using ordinal patterns. We find that this technique unfolds the complex structure of chaotic systems, showing more details than other methods. It also reveals signatures common to most of the non-invertible maps, and demonstrates its capability to distinguish determinism from stochasticity.

Information of sequences and applications

Physica A: Statistical Mechanics and its Applications, 2002

In this short note, we outline some results about complexity of orbits of a dynamical system, entropy and initial condition sensitivity in weakly chaotic dynamical systems. We present a technique to estimate orbit complexity by the use of data compression algorithms. We also outline how this technique has been applied by our research group to dynamical systems and to DNA sequences.

In What Sense is the Kolmogorov-Sinai Entropy a Measure for Chaotic Behaviour? Bridging the Gap Between Dynamical Systems Theory and Communication Theory

The British Journal for the Philosophy of Science, 2004

On an influential account, chaos is explained in terms of random behaviour; and random behaviour in turn is explained in terms of having positive Kolmogorov-Sinai entropy (KSE). Though intuitively plausible, the association of the KSE with random behaviour needs justification, since the definition of the KSE does not make reference to any notion that is connected to randomness. I provide this justification for the case of Hamiltonian systems by proving that the KSE is equivalent to a generalized version of Shannon's communication-theoretic entropy under certain plausible assumptions. I then discuss consequences of this equivalence for randomness in chaotic dynamical systems.

The complexity of the logistic map at the chaos threshold

Physics Letters A, 2001

We apply a generalized version of the Kolmogorov-Sinai entropy, based on a non-extensive form, to analyze the dynamics of the logistic map at the chaos threshold, the paradigm of power-law sensitivity to initial conditions. We take statistical averages over the distribution of the power indexes β, and we show that the resulting entropy time evolution becomes a linear function of time if we assign to the non-extensive index q the value Q < 1 prescribed by the heuristic arguments of earlier work. We also show that the emerging entropy index Q is determined by the asymptotic mean value of the index β, and that this same mean value determines the strength of the logarithmic time increase of entropy stemming from the adoption of the ordinary Shannon form.
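The non-extensive entropy form referred to here is, presumably, the Tsallis entropy; for a probability distribution $\{p_i\}$ and entropic index $q$ it reads, recovering the Shannon form in the limit $q \to 1$:

```latex
S_q = \frac{1 - \sum_i p_i^{\,q}}{q - 1},
\qquad
\lim_{q \to 1} S_q = -\sum_i p_i \ln p_i .
```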

The volume of information as a measure of the chaos synchronization

Technical Physics Letters, 2001

A characteristic is suggested for evaluating the degree of synchronization of the chaotic oscillations in a system of two coupled oscillators. The proposed quantity is tested by application to the case of two unidirectionally coupled logistic maps. It is shown that this characteristic is stable with respect to low noise and nonlinear distortion of the signal. © 2001 MAIK "Nauka/Interperiodica".

State Predictability and Information Flow in Simple Chaotic Systems

Zeitschrift für Naturforschung A

The possibility of state prediction in deterministic chaotic systems, which are described by 1-D maps, is discussed in the light of information theory. A quantity h(l) is defined which represents the production of uncertainty about a future state by the chaotic dynamics (intrinsic noise) after l time steps have passed. h(l) is related to the Lyapunov characteristic exponent. Moreover, the influence of the measuring process (overlappings of mapped boxes of the state-space partition) and of external noise on state predictability are investigated quantitatively.
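Since h(l) is related to the Lyapunov characteristic exponent, it may help to recall the standard numerical estimate of that exponent for a 1-D map (a textbook recipe, not taken from the paper): the orbit average of ln|f'(x_n)|. For the logistic map at r = 4 the exact value is ln 2:

```python
# Numerical Lyapunov exponent of the logistic map x -> r*x*(1-x),
# estimated as the orbit average of ln|f'(x)| with f'(x) = r*(1 - 2x).
from math import log

def lyapunov_logistic(r, x0, n, transient=100):
    x = x0
    for _ in range(transient):          # discard the transient
        x = r * x * (1.0 - x)
    acc = 0.0
    for _ in range(n):
        acc += log(abs(r * (1.0 - 2.0 * x)))
        x = r * x * (1.0 - x)
    return acc / n
```

A positive value of this average signals the uncertainty production (intrinsic noise) that h(l) quantifies per time step.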

Characterization of chaotic maps using the permutation Bandt-Pompe probability distribution

By appealing to a long list of different nonlinear maps we review the characterization of time series arising from chaotic maps. The main tool for this characterization is the permutation Bandt-Pompe probability distribution function. We focus attention on both local and global characteristics of the components of this probability distribution function. We show that forbidden ordinal patterns (local quantifiers) exhibit an exponential growth for pattern-length range 3 ≤ D ≤ 8, in the case of finite time series data. Indeed, there is a minimum Dmin-value such that forbidden patterns cannot appear for D < Dmin. The system's localization in an entropy-complexity plane (global quantifier) displays typical specific features associated with its dynamics' nature. We conclude that a more "robust" distinction between deterministic and stochastic dynamics is achieved via the present time series' treatment based on the global characteristics of the permutation Bandt-Pompe probability distribution function.
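A minimal sketch of the Bandt-Pompe construction described above (illustrative function names, embedding delay 1 assumed): each length-D window is mapped to the permutation that sorts it, forbidden patterns are the permutations that never occur, and the normalized Shannon entropy of the pattern distribution is the global quantifier:

```python
# Bandt-Pompe ordinal-pattern distribution, forbidden-pattern count, and
# normalized permutation entropy for a scalar time series.
from collections import Counter
from math import log, factorial

def ordinal_patterns(series, D):
    """Histogram of ordinal patterns of length D (embedding delay 1)."""
    counts = Counter()
    for i in range(len(series) - D + 1):
        window = series[i:i + D]
        counts[tuple(sorted(range(D), key=lambda k: window[k]))] += 1
    return counts

def permutation_entropy(series, D):
    """Shannon entropy of the pattern distribution, normalized by ln(D!)."""
    counts = ordinal_patterns(series, D)
    total = sum(counts.values())
    return -sum((c / total) * log(c / total)
                for c in counts.values()) / log(factorial(D))

# Logistic map at r = 4: the strictly decreasing pattern of length 3
# never occurs (two consecutive decreases are impossible), so at least
# one of the 3! = 6 patterns is forbidden.
x, orbit = 0.3, []
for _ in range(2000):
    x = 4.0 * x * (1.0 - x)
    orbit.append(x)
forbidden = factorial(3) - len(ordinal_patterns(orbit, 3))
```

The exponential growth of forbidden patterns with D reported in the abstract can be explored by repeating the count for D = 3, ..., 8 on longer series.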