Entropies for complex systems: generalized-generalized entropies


In information theory the four Shannon–Khinchin (SK) axioms [1,2] determine the Boltzmann–Gibbs entropy, S ∼ −∑i pi log pi, as the unique entropy. Physics differs from information theory in that physical systems can be non-ergodic or non-Markovian. To characterize such strongly interacting statistical systems – complex systems in particular – within a thermodynamical framework, it may be necessary to introduce generalized entropies. A series of such entropies has been proposed in the past decades, yet the understanding of their fundamental origin and their deeper relation to complex systems has remained unclear. To clarify the situation we note that non-ergodicity explicitly violates the fourth SK axiom. We show that by relaxing this axiom the entropy generalizes to S ∼ ∑i Γ(d + 1, 1 − c log pi), where Γ is the incomplete gamma function, and c and d are scaling exponents. All recently proposed entropies compatible with the first three SK axioms appear to be special cases. We prove th...
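The generalized entropy above can be evaluated numerically. The sketch below is a minimal illustration, assuming the (c,d)-entropy form quoted in the abstract with normalization constants omitted, and restricted to integer d so that the upper incomplete gamma function has a closed form in terms of elementary functions. For c = d = 1 one can check by hand that Γ(2, 1 − log p) = (2 − log p)·p/e, so the sum reduces, up to the affine map x ↦ (2 + x)/e, to the Boltzmann–Gibbs entropy, which the example verifies.

```python
import math

def upper_gamma_int(n, x):
    """Upper incomplete gamma Γ(n, x) for integer n >= 1, via the
    closed form Γ(n, x) = (n-1)! e^{-x} Σ_{k=0}^{n-1} x^k / k!."""
    return (math.factorial(n - 1) * math.exp(-x)
            * sum(x**k / math.factorial(k) for k in range(n)))

def entropy_cd(p, c=1.0, d=1):
    """(c,d)-entropy S ~ Σ_i Γ(d+1, 1 - c log p_i); integer d only here,
    and overall normalization constants are omitted (an assumption)."""
    return sum(upper_gamma_int(d + 1, 1.0 - c * math.log(pi))
               for pi in p if pi > 0)

def shannon(p):
    """Boltzmann-Gibbs / Shannon entropy in nats."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

p = [0.5, 0.25, 0.125, 0.125]
# For c = d = 1 the sum equals (2 + S_BG)/e, i.e. BG entropy up to
# an affine transformation -- the (1,1) special case of the family.
print(entropy_cd(p, c=1.0, d=1))
print((2 + shannon(p)) / math.e)
```

For non-integer d one would replace `upper_gamma_int` with a general incomplete gamma routine (e.g. `scipy.special.gammaincc(d+1, x) * scipy.special.gamma(d+1)`).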

Andrey Nikolaevich Kolmogorov's classical system of probability axioms can be extended to encompass the set of imaginary numbers by adding three axioms to his original five. Any experiment can then be executed in the complex probability set C, which is the sum of the real set R, with its corresponding real probability, and the imaginary set M, with its corresponding imaginary probability. The objective here is to evaluate complex probabilities by considering supplementary imaginary dimensions of the event occurring in the 'real' laboratory. Whatever the probability distribution of the input random variable in R, the corresponding probability in the whole set C is always one, so the outcome of the random experiment in C can be predicted totally and perfectly. Chance and luck in R are thus replaced by total determinism in C. This is a consequence of the fact that the probability in C is obtained by subtracting the chaotic factor from the degree of our knowledge of the stochastic system. This novel complex probability paradigm is applied here to Ludwig Boltzmann's classical concept of entropy in thermodynamics and in statistical mechanics.
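The bookkeeping behind "probability in C is always one" can be sketched numerically. The definitions below (Pm = i(1 − Pr) for the imaginary complement, DOK = Pr² + (1 − Pr)² for the degree of knowledge, Chf = −2Pr(1 − Pr) for the chaotic factor) are one reading of the complex probability literature, not the paper's own code; with them, Pc² = DOK − Chf = (Pr + (1 − Pr))² = 1 for every real probability Pr.

```python
def complex_probability(Pr):
    """Illustrative complex-probability bookkeeping for a real probability Pr.
    Symbol names (Pr, Pm, DOK, Chf, Pc2) follow the paradigm's terminology;
    the exact definitions are an assumption made for this sketch."""
    Pm = 1j * (1 - Pr)             # imaginary complement in the set M
    DOK = Pr**2 + (1 - Pr)**2      # degree of knowledge, lies in [0.5, 1]
    Chf = (2j * Pr * Pm).real      # chaotic factor -2 Pr (1 - Pr), in [-0.5, 0]
    Pc2 = DOK - Chf                # probability in C (squared): always 1
    return DOK, Chf, Pc2

for Pr in (0.0, 0.3, 0.7, 1.0):
    DOK, Chf, Pc2 = complex_probability(Pr)
    print(f"Pr={Pr:.1f}  DOK={DOK:.2f}  Chf={Chf:.2f}  Pc^2={Pc2:.2f}")
```

The determinism claimed in the abstract shows up as Pc² = 1 independently of Pr: the chaotic factor exactly cancels the missing knowledge.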

For statistical systems that violate one of the four Shannon–Khinchin axioms, entropy takes a more general form than the Boltzmann–Gibbs entropy. The framework of superstatistics allows one to formulate a maximum entropy principle with these generalized entropies, making them useful for understanding distribution functions of non-Markovian or non-ergodic complex systems. For such systems, where the composability axiom is violated, there exist only two ways to implement the maximum entropy principle: one using escort probabilities, the other not. The two ways are connected through a duality. Here we show that this duality fixes a unique escort probability, which allows us to derive a complete theory of the generalized logarithms that naturally arise from the violation of this axiom. We then show how the functional forms of these generalized logarithms are related to the asymptotic scaling behavior of the entropy.
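The escort probabilities and generalized logarithms mentioned above can be made concrete in the best-known special case, the Tsallis q-deformed logarithm. The sketch below is a standard textbook example, not the paper's general construction: it shows the escort distribution P_i = p_i^q / Σ_j p_j^q and the known q ↔ 2 − q duality identity ln_q(1/x) = −ln_{2−q}(x) that connects the two maximum-entropy formulations in this special case.

```python
import math

def q_log(x, q):
    """Tsallis q-logarithm ln_q(x) = (x^(1-q) - 1) / (1 - q);
    reduces to the ordinary logarithm in the limit q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return math.log(x)
    return (x**(1 - q) - 1) / (1 - q)

def escort(p, q):
    """Escort distribution P_i = p_i^q / Σ_j p_j^q."""
    w = [pi**q for pi in p]
    Z = sum(w)
    return [wi / Z for wi in w]

p = [0.5, 0.3, 0.2]
q = 0.7
x = 2.5

# Duality between the two formulations: ln_q(1/x) = -ln_{2-q}(x).
print(q_log(1 / x, q), -q_log(x, 2 - q))   # the two values agree

# The escort distribution is again a normalized probability distribution.
print(escort(p, q))
```

For q < 1 the escort map flattens the distribution (it boosts rare events), for q > 1 it sharpens it; which of the two dual formulations one works in is exactly the choice the abstract says is fixed uniquely by the duality.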