A nonequilibrium entropy for dynamical systems

Entropy and irreversibility in dynamical systems

Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences, 2013

A method of defining non-equilibrium entropy for a chaotic dynamical system is proposed which, unlike the usual method based on Boltzmann’s principle, does not involve the concept of a macroscopic state. The idea is illustrated using an example based on Arnold’s ‘cat’ map. The example also demonstrates that it is possible to have irreversible behaviour, involving a large increase of entropy, in a chaotic system with only two degrees of freedom.
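The abstract does not reproduce the paper's entropy construction, but the map itself is the standard area-preserving torus map (x, y) ↦ (2x + y, x + y) mod 1, and a box-counting (coarse-grained) entropy of an evolving point ensemble gives a quick numerical feel for the irreversible spreading it describes. The sketch below is only an illustration under those assumptions; the grid resolution, ensemble size, and initial square are arbitrary choices, not values from the paper.

```python
import numpy as np

def cat_map(x, y):
    """One step of Arnold's cat map on the unit torus: (x, y) -> (2x + y, x + y) mod 1."""
    return (2.0 * x + y) % 1.0, (x + y) % 1.0

def coarse_grained_entropy(x, y, n_cells):
    """Shannon entropy of the occupation probabilities of an n_cells x n_cells grid."""
    hist, _, _ = np.histogram2d(x, y, bins=n_cells, range=[[0, 1], [0, 1]])
    p = hist.ravel() / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p))

rng = np.random.default_rng(0)
n_points, n_cells = 100_000, 32

# Initial ensemble confined to a single grid cell: coarse-grained entropy starts at zero.
x = rng.uniform(0.0, 1.0 / n_cells, n_points)
y = rng.uniform(0.0, 1.0 / n_cells, n_points)

for step in range(12):
    print(f"step {step:2d}  S = {coarse_grained_entropy(x, y, n_cells):.3f}"
          f"  (max = {np.log(n_cells**2):.3f})")
    x, y = cat_map(x, y)
```

Starting from a single occupied cell, the printed entropy climbs within a few iterations toward its maximum ln(N^2) for an N x N grid, after which the finite resolution of the cells keeps it there: a two-degree-of-freedom chaotic system showing a large entropy increase, as the abstract describes.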

On the Nonequilibrium Entropy of Large and Small Systems

Stochastic Dynamics Out of Equilibrium, 2019

Thermodynamics makes definite predictions about the thermal behavior of macroscopic systems in and out of equilibrium. Statistical mechanics aims to derive this behavior from the dynamics and statistics of the atoms and molecules making up these systems. A key element in this derivation is the large number of microscopic degrees of freedom of macroscopic systems. Therefore, the extension of thermodynamic concepts, such as entropy, to small (nano) systems raises many questions. Here we shall reexamine various definitions of entropy for nonequilibrium systems, large and small. These include thermodynamic (hydrodynamic), Boltzmann, and Gibbs-Shannon entropies. We shall argue that, despite its common use, the last is not an appropriate physical entropy for such systems, either isolated or in contact with thermal reservoirs: physical entropies should depend on the microstate of the system, not on a subjective probability distribution. To square this point of view with the experimental results of Bechhoefer, we shall argue that the Gibbs-Shannon entropy of a nanoparticle in a thermal fluid should be interpreted as the Boltzmann entropy of a dilute gas of Brownian particles in the fluid.
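For orientation, the two entropies being contrasted have standard textbook forms; the notation below (phase point X, macrostate M(X), phase-space volume |Γ_M|, ensemble density ρ) is ours, not quoted from the paper.

```latex
% Boltzmann entropy: a function of the microstate X, via the phase-space
% volume |\Gamma_{M(X)}| of the macrostate containing X
S_B(X) = k_B \ln \left|\Gamma_{M(X)}\right|

% Gibbs--Shannon entropy: a functional of the ensemble (probability) density \rho
S_{GS}[\rho] = -k_B \int \rho(X) \ln \rho(X) \, dX
```

The contrast the abstract draws is that the first depends on the actual microstate of the system, while the second depends on a probability distribution over microstates, which is why the authors regard only the first as a physical entropy for an individual system.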

On the entropy of nonequilibrium states

Journal of Statistical Physics, 1989

A definition originally proposed by H. S. Green is used to calculate the entropy of nonequilibrium steady states. This definition provides a well-defined coarse graining of the entropy. Although the dimension of the phase space accessible to nonequilibrium steady states is less than the ostensible dimension of that space, the Green entropy is computed from within the accessible phase space, thereby avoiding the divergences inherent in the fine-grained entropy. It is shown that the Green entropy is a maximum at equilibrium and that away from equilibrium, the thermodynamic temperature computed from the Green entropy is different from the kinetic temperature.

The Entropy Universe

Entropy

About 160 years ago, the concept of entropy was introduced in thermodynamics by Rudolf Clausius. Since then, it has been continually extended, interpreted, and applied by researchers in many scientific fields, such as general physics, information theory, chaos theory, data mining, and mathematical linguistics. This paper presents The Entropy Universe, which aims to review the many variants of entropies applied to time-series. The purpose is to answer research questions such as: How did each entropy emerge? What is the mathematical definition of each variant of entropy? How are entropies related to each other? What are the most applied scientific fields for each entropy? We describe in-depth the relationship between the most applied entropies in time-series for different scientific fields, establishing bases for researchers to properly choose the variant of entropy most suitable for their data. The number of citations over the past sixteen years of each paper proposing a new entropy ...
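As a concrete taste of what an "entropy applied to a time series" looks like in practice, here is a minimal sketch of one of the simpler variants in this family, the Bandt-Pompe permutation entropy; it is an illustration only, and the signals, embedding order, and delay are arbitrary choices rather than anything prescribed by the review.

```python
import numpy as np

def permutation_entropy(series, order=3, delay=1):
    """Bandt-Pompe permutation entropy (in nats) of a 1-D time series."""
    series = np.asarray(series, dtype=float)
    n = len(series) - (order - 1) * delay
    counts = {}
    for i in range(n):
        window = series[i : i + order * delay : delay]
        pattern = tuple(np.argsort(window))      # ordinal pattern of the window
        counts[pattern] = counts.get(pattern, 0) + 1
    p = np.array(list(counts.values()), dtype=float)
    p /= p.sum()
    return -np.sum(p * np.log(p))

rng = np.random.default_rng(1)
noise = rng.normal(size=5000)                     # irregular signal
wave = np.sin(np.linspace(0, 40 * np.pi, 5000))   # regular signal
print(f"noise: {permutation_entropy(noise):.3f}   "
      f"sine: {permutation_entropy(wave):.3f}   max: {np.log(6):.3f}")
```

A highly irregular signal visits the possible orderings of its embedded windows roughly equally and lands near the maximum ln(order!), while a smooth periodic signal concentrates on a few monotone patterns and scores much lower.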

A New Nonextensive Entropy

IMA Journal of Applied Mathematics, 2004

We propose a new way of defining the entropy of a system, which gives a general form that may be nonextensive, like the Tsallis entropy, but is linearly dependent on component entropies, like the Rényi entropy, which is extensive. This entropy has a conceptually novel but simple origin and is mathematically easy to define by a very simple expression, though the probability distribution ...
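The excerpt does not quote the paper's own expression, so only the two reference families it is compared against are written out here, in their standard forms (k is a fixed positive constant and the p_i are probabilities); both reduce to the Gibbs-Shannon form as q, α → 1.

```latex
% Tsallis entropy: nonextensive for q \neq 1
S_q = \frac{k}{q-1}\left(1 - \sum_i p_i^{\,q}\right)

% R\'enyi entropy: additive over independent subsystems for every \alpha
H_\alpha = \frac{1}{1-\alpha}\,\ln \sum_i p_i^{\,\alpha}
```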

Entropy balance, time reversibility, and mass transport in dynamical systems

Chaos: An Interdisciplinary Journal of Nonlinear Science, 1998

We review recent results concerning entropy balance in low-dimensional dynamical systems modeling mass (or charge) transport. The key ingredient for understanding entropy balance is the coarse graining of the local phase-space density. It mimics the fact that ever refining phase-space structures caused by chaotic dynamics can only be detected up to a finite resolution. In addition, we derive a new relation for the rate of irreversible entropy production in steady states of dynamical systems: It is proportional to the average growth rate of the local phase-space density. Previous results for the entropy production in steady states of thermostated systems without density gradients and of Hamiltonian systems with density gradients are recovered. As an extension we derive the entropy balance of dissipative systems with density gradients valid at any instant of time, not only in stationary states. We also find a condition for consistency with thermodynamics. A generalized multi-Baker map is used as an illustrative example.

Rethinking the Concept of Entropy

2019

It is shown that constructing thermodynamics with a direct focus on nonequilibrium systems leads to an understanding of entropy as a “thermoimpulse”, the impulse of internal motion that has lost its vector nature because of its chaotic character. Unlike entropy, the thermoimpulse can decrease in adiabatic processes, which removes the contradiction between the principle of entropy increase and the laws of evolution, as well as the threat of the “thermal death of the Universe”, the Gibbs paradox, and the paradoxes of absolute temperatures and relativistic heat engines, while leaving experimentally established laws unshaken.

On the time and cell dependence of the coarse-grained entropy. I

Physica 82A, 417-437, 1976

We consider a finite, thermally isolated, classical system which passes from an equilibrium state A, by the removal of an internal constraint, to another equilibrium state B after an empirical relaxation time. In the phase space of the system, cells are introduced according to the set of measuring instruments used and their experimental inaccuracies. It is shown that the coarse-grained entropy S_cg(t) tends to its new equilibrium value in general faster than the expectation values of the macroscopic variables tend to their new equilibrium values. We then investigate the dependence of S_cg(t) on the size of the phase cells. For fixed t, we find a lower bound on S_cg(t) by going to the limit of infinite accuracy of the measuring instruments. In the limit t → ∞, this lower bound on S_cg(t) also converges to the equilibrium entropy of B. These properties strongly support the opinion that S_cg(t) is a proper microscopic expression for the entropy for equilibrium and nonequilibrium states. Finally, explicit calculations of S_cg(t) for the model of a point particle enclosed in a one-dimensional box are presented which confirm the general results.
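The explicit calculations in the paper are analytical; the following is only a rough numerical analogue of that model, with arbitrary ensemble parameters and cell sizes, meant to show a coarse-grained entropy relaxing toward its equilibrium value after the confining constraint (here, restriction to part of the box) is removed.

```python
import numpy as np

def evolve(x0, v0, t, L=1.0):
    """Free flight in [0, L] with elastic wall reflections, evaluated at time t."""
    s = (x0 + v0 * t) % (2.0 * L)          # unfold onto a circle of circumference 2L
    x = np.where(s <= L, s, 2.0 * L - s)   # fold back into the box
    v = np.where(s <= L, v0, -v0)          # velocity flips on the folded branch
    return x, v

def coarse_grained_entropy(x, v, n_x=20, n_v=20, L=1.0, vmax=1.0):
    """Shannon entropy of cell-occupation probabilities in (x, v) phase space."""
    hist, _, _ = np.histogram2d(x, v, bins=[n_x, n_v],
                                range=[[0, L], [-vmax, vmax]])
    p = hist.ravel() / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p))

rng = np.random.default_rng(0)
n = 200_000
# Constraint: the ensemble starts confined to the left quarter of the box.
x0 = rng.uniform(0.0, 0.25, n)
v0 = rng.uniform(-1.0, 1.0, n)

for t in [0.0, 0.5, 1.0, 2.0, 5.0, 20.0]:
    x, v = evolve(x0, v0, t)
    print(f"t = {t:5.1f}   S_cg = {coarse_grained_entropy(x, v):.3f}")
```

The printed values rise from roughly the logarithm of the number of initially occupied cells toward ln(n_x * n_v), the equilibrium value for this grid; choosing finer cells changes both numbers, which is the cell-size dependence the paper analyzes in detail.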