Entropy production selects nonequilibrium states in multistable systems
Related papers
Statistical Thermodynamic Foundation for the Origin and Evolution of Life
arXiv: Subcellular Processes, 2015
In this paper we review and extend our recent work on thermostated systems. A description of nano-biological systems by Markov chains in coordinate space, in the strongly overdamped limit, is presented. The most probable path is characterized, and a new formula for the probability of this special path is obtained from recursion formulae. The deterministic limit is derived, and the significance of the Lagrange multipliers introduced when constructing the most probable path is elucidated. The generation of path entropy along the most probable path is given an equivalent interpretation in terms of the rate of entropy production along that path. The paper concludes with an account of the biological implications: we address why the origin of life and its subsequent evolution took place, not the particular chemical details of how it happened.
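The most-probable-path construction described in this abstract can be illustrated for a discrete-state, discrete-time Markov chain with a standard Viterbi-style dynamic programme. This is a minimal sketch under illustrative assumptions (the two-state transition matrix below is hypothetical, not taken from the paper):

```python
import numpy as np

def most_probable_path(T, p0, n_steps):
    """Viterbi-style recursion: return the single most probable
    trajectory of length n_steps + 1 and its log-probability."""
    n = len(p0)
    logT = np.log(T)
    # best[t, j] = max log-probability of any path ending in state j at time t
    best = np.full((n_steps + 1, n), -np.inf)
    back = np.zeros((n_steps + 1, n), dtype=int)
    best[0] = np.log(p0)
    for t in range(1, n_steps + 1):
        for j in range(n):
            scores = best[t - 1] + logT[:, j]
            back[t, j] = np.argmax(scores)
            best[t, j] = scores[back[t, j]]
    # backtrack from the most probable final state
    path = [int(np.argmax(best[-1]))]
    for t in range(n_steps, 0, -1):
        path.append(int(back[t, path[-1]]))
    path.reverse()
    return path, float(best[-1].max())

# Toy two-state chain, strongly biased toward staying in state 0
T = np.array([[0.9, 0.1],
              [0.5, 0.5]])
path, logp = most_probable_path(T, p0=np.array([0.5, 0.5]), n_steps=4)
```

For this biased chain the most probable path simply remains in state 0, and its log-probability is the sum of the log transition probabilities along that path; the paper's recursion formulae play the role of the `best` table here.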
Physical Review Research
A general framework for describing a vast majority of biology-inspired systems is to model them as stochastic processes in which multiple couplings are in play at the same time. Molecular motors, chemical reaction networks, catalytic enzymes, and particles exchanging heat with different baths are some interesting examples of such modeling. Moreover, these systems usually operate out of equilibrium, being characterized by a net production of entropy, which entails a constrained efficiency. Hitherto, theoretical approaches to multiple processes simultaneously driving a system have dealt with them independently, at a coarse-grained level, or by employing a separation of time scales. Here, we explicitly take into account the interplay among the time scales of different processes, and whether or not each process eventually relaxes toward an equilibrium state in its own subspace. We propose a general framework for multiple couplings, from which the well-known formulas for the entropy production can be derived, depending on the available information about each single process. Furthermore, when one of the processes does not equilibrate in its subspace, even if it is much faster than all the others, it introduces a finite correction to the entropy production. We apply our framework to various simple and pedagogical examples, for which the corrective term can be related to a typical scaling of the physical quantities in play.
Journal of physics. Condensed matter : an Institute of Physics journal, 2016
Nonequilibrium thermodynamics (NET) investigates processes in systems out of global equilibrium. On a mesoscopic level, it provides a statistical dynamic description of various complex phenomena such as chemical reactions, ion transport, diffusion, thermochemical, thermomechanical and mechanochemical fluxes. In the present review, we introduce a mesoscopic stochastic formulation of NET by analyzing entropy production in several simple examples. The fundamental role of nonequilibrium steady-state cycle kinetics is emphasized. The statistical mechanics of Onsager's reciprocal relations in this context is elucidated. Chemomechanical, thermomechanical, and enzyme-catalyzed thermochemical energy transduction processes are discussed. It is argued that mesoscopic stochastic NET in phase space provides a rigorous mathematical basis of fundamental concepts needed for understanding complex processes in chemistry, physics and biology. This theory is also relevant for nanoscale technologica...
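The "fundamental role of nonequilibrium steady-state cycle kinetics" stressed in this abstract can be made concrete with a minimal example. The sketch below (a three-state Markov jump process with hypothetical rates, not a system from the review) computes the standard Schnakenberg entropy production rate, which vanishes when detailed balance holds and is positive when a net cycle current flows:

```python
import numpy as np

def stationary_distribution(W):
    """Stationary distribution of a rate matrix W (columns sum to zero):
    the null eigenvector of W, normalized to unit total probability."""
    vals, vecs = np.linalg.eig(W)
    p = np.real(vecs[:, np.argmin(np.abs(vals))])
    return p / p.sum()

def entropy_production_rate(rates):
    """Schnakenberg steady-state entropy production (units of k_B)
    for a dict {(i, j): rate i -> j} over a finite state space."""
    states = sorted({s for pair in rates for s in pair})
    n = len(states)
    W = np.zeros((n, n))
    for (i, j), k in rates.items():
        W[j, i] += k   # column i holds the outflow from state i
        W[i, i] -= k
    p = stationary_distribution(W)
    sigma = 0.0
    for (i, j), k in rates.items():
        flux_fwd = p[i] * k
        flux_bwd = p[j] * rates[(j, i)]
        # each edge is visited in both directions, hence the factor 1/2
        sigma += 0.5 * (flux_fwd - flux_bwd) * np.log(flux_fwd / flux_bwd)
    return sigma

# Detailed-balanced 3-cycle: zero entropy production at the steady state
eq = {(0, 1): 1.0, (1, 0): 1.0, (1, 2): 1.0,
      (2, 1): 1.0, (2, 0): 1.0, (0, 2): 1.0}
# Driven 3-cycle: doubled clockwise rates give a net cycle current
driven = {**eq, (0, 1): 2.0, (1, 2): 2.0, (2, 0): 2.0}
```

With the symmetric rates the stationary fluxes balance edge by edge and the entropy production is zero; doubling the clockwise rates drives a net current around the cycle and yields a strictly positive rate, which is the cycle-kinetics picture the review builds on.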
It is not the entropy you produce, rather, how you produce it
Philosophical Transactions of the Royal Society B: Biological Sciences, 2010
The principle of maximum entropy production (MEP) seeks to better understand a large variety of the Earth's environmental and ecological systems by postulating that processes far from thermodynamic equilibrium will ‘adapt to steady states at which they dissipate energy and produce entropy at the maximum possible rate’. Our aim in this ‘outside view’, invited by Axel Kleidon, is to focus on what we think is an outstanding challenge for MEP and for irreversible thermodynamics in general: making specific predictions about the relative contribution of individual processes to entropy production. Using studies that compared entropy production in the atmosphere of a dry versus humid Earth, we show that two systems might have the same entropy production rate but very different internal dynamics of dissipation. Using the results of several of the papers in this special issue and a thought experiment, we show that components of life-containing systems can evolve to either lower or raise t...
Entropy
The paper discusses how two thermodynamic properties, energy (U) and exergy (E), can be used to solve the problem of quantifying the entropy of non-equilibrium systems. Both energy and exergy are a priori concepts, and their formal dependence on thermodynamic state variables at equilibrium is known. Exploiting the results of a previous study, we first show that the non-equilibrium exergy E_n-eq can be calculated for arbitrary temperature distributions across a macroscopic body with an accuracy that depends only on the available information about the initial distribution: the analytical results confirm that E_n-eq relaxes exponentially to its equilibrium value. Using the Gyftopoulos-Beretta formalism, a non-equilibrium entropy S_n-eq(x,t) is then derived from E_n-eq(x,t) and U(x,t). It is finally shown that the non-equilibrium entropy generation between two states is always larger than its equilibrium (herein referred to as "classical") counterpart. We conclude that every iso-energetic non-equilibrium state corresponds to an infinite set of non-equivalent states that can be ranked in terms of increasing entropy. Each point of the Gibbs plane therefore corresponds to a set of possible initial distributions: the non-equilibrium entropy is a multi-valued function that depends on the initial mass and energy distribution within the body. Though the concept cannot be directly extended to microscopic systems, it is argued that the present formulation is compatible with a possible reinterpretation of the existing non-equilibrium formulations, namely those of Tsallis and Grmela, and answers at least in part one of the objections set forth by Lieb and Yngvason. A systematic application of this paradigm is very convenient from a theoretical point of view and may be beneficial for meaningful future applications in the fields of nano-engineering and biological sciences.
On the Nonequilibrium Entropy of Large and Small Systems
Stochastic Dynamics Out of Equilibrium, 2019
Thermodynamics makes definite predictions about the thermal behavior of macroscopic systems in and out of equilibrium. Statistical mechanics aims to derive this behavior from the dynamics and statistics of the atoms and molecules making up these systems. A key element in this derivation is the large number of microscopic degrees of freedom of macroscopic systems. The extension of thermodynamic concepts, such as entropy, to small (nano) systems therefore raises many questions. Here we shall reexamine various definitions of entropy for nonequilibrium systems, large and small. These include thermodynamic (hydrodynamic), Boltzmann, and Gibbs-Shannon entropies. We shall argue that, despite its common use, the last is not an appropriate physical entropy for such systems, either isolated or in contact with thermal reservoirs: physical entropies should depend on the microstate of the system, not on a subjective probability distribution. To square this point of view with the experimental results of Bechhoefer, we shall argue that the Gibbs-Shannon entropy of a nanoparticle in a thermal fluid should be interpreted as the Boltzmann entropy of a dilute gas of Brownian particles in the fluid.
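The distinction this abstract draws hinges on the Gibbs-Shannon entropy being a functional of a probability distribution rather than of any single microstate. A minimal numerical illustration (the four-state distributions are illustrative, not from the paper):

```python
import math

def gibbs_shannon_entropy(p):
    """S_GS = -sum_i p_i ln p_i (in units of k_B). Note that S_GS is a
    functional of the distribution p, not of any single microstate --
    the property the authors argue disqualifies it as a physical
    entropy for small systems."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

uniform = [0.25] * 4           # maximal uncertainty over 4 microstates
peaked = [1.0, 0.0, 0.0, 0.0]  # microstate known with certainty

# gibbs_shannon_entropy(uniform) == ln 4; gibbs_shannon_entropy(peaked) == 0
```

Two systems in the same microstate but described by different distributions thus get different Gibbs-Shannon entropies, which is precisely the subjectivity the authors object to.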
Beyond Disorder: A New Perspective on Entropy in Chemistry
Am J Med Chem, 2024
The concept of entropy, a fundamental principle in the field of chemistry, has traditionally been oversimplified as a mere measure of disorder. However, this simplistic perspective fails to capture the intricate and multifaceted nature of entropy, along with its profound influence on various phenomena. This paper seeks to delve deeper into the understanding of entropy by moving beyond the conventional disorder-centric viewpoint and adopting a more nuanced approach that integrates both disorder and energy considerations. Through the redefinition of potential energy and microstates as integral components of entropy, the study explores the intricate interplay between disorder, energy, and molecular transformations within chemical systems. The implications of this refined conceptualization extend beyond the boundaries of chemistry, impacting fields such as physics, biology, and medicine. The potential transformative effects of this enhanced understanding hold promise for advancing scientific knowledge and applications across diverse disciplines.
Entropy, extropy and information potential in stochastic systems far from equilibrium
Physica A-statistical Mechanics and Its Applications, 2002
The relations between information, entropy and energy, which are well known in equilibrium thermodynamics, are not clear far from equilibrium. Moreover, the usual expression of the classical thermodynamic potentials is only valid near equilibrium. In previous publications, we showed for a chemical system maintained far from equilibrium that a new thermodynamic potential, the information potential, can be defined by using the stochastic formalism of the Master Equation.
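The Master-Equation setting on which such a potential is built can be sketched with a minimal relaxation example. The three-state rate matrix below is hypothetical (not the chemical system studied in the paper), and the final log-of-stationary-probability step is only the generic stochastic-potential construction, not the authors' specific definition:

```python
import numpy as np

# Illustrative 3-state master equation dp/dt = W p; columns of W sum to
# zero, so total probability is conserved. Rates are hypothetical.
W = np.array([[-2.0,  1.0,  0.5],
              [ 1.5, -2.0,  0.5],
              [ 0.5,  1.0, -1.0]])

def relax(p0, dt=1e-3, t_max=50.0):
    """Euler integration of the master equation toward its steady state."""
    p = np.array(p0, dtype=float)
    for _ in range(int(t_max / dt)):
        p = p + dt * (W @ p)
    return p

p_ss = relax([1.0, 0.0, 0.0])   # stationary state: W @ p_ss ≈ 0, sum = 1
phi = -np.log(p_ss)             # generic stochastic potential -ln p_ss
```

Far from equilibrium, it is a potential of this stochastic kind, rather than the near-equilibrium classical potentials, that remains well defined, which is the point of the information-potential construction.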