On the Nonequilibrium Entropy of Large and Small Systems

Statistical mechanics and thermodynamics of large and small systems

arXiv (Cornell University), 2017

Thermodynamics makes definite predictions about the thermal behavior of macroscopic systems in and out of equilibrium. Statistical mechanics aims to derive this behavior from the dynamics and statistics of the atoms and molecules making up these systems. A key element in this derivation is the large number of microscopic degrees of freedom of macroscopic systems. Therefore, the extension of thermodynamic concepts, such as entropy, to small (nano) systems raises many questions. Here we shall reexamine various definitions of entropy for nonequilibrium systems, large and small. These include thermodynamic (hydrodynamic), Boltzmann, and Gibbs-Shannon entropies. We shall argue that, despite its common use, the last of these is not an appropriate physical entropy for such systems, either isolated or in contact with thermal reservoirs: physical entropies should depend on the microstate of the system, not on a subjective probability distribution. To square this point of view with the experimental results of Bechhoefer, we shall argue that the Gibbs-Shannon entropy of a nanoparticle in a thermal fluid should be interpreted as the Boltzmann entropy of a dilute gas of Brownian particles in the fluid.
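For reference (standard definitions, not specific to this paper): the Gibbs-Shannon entropy S_GS = −k_B ∑ p_i ln p_i (or −k_B ∫ ρ ln ρ for a phase-space density ρ) is a functional of a probability distribution, whereas the Boltzmann entropy S_B(X) = k_B ln |Γ_M(X)| is a function of the system's actual microstate X through the phase-space volume |Γ_M| of the macrostate M(X) to which X belongs. This distinction between distribution-dependent and microstate-dependent quantities is the one the abstract turns on.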

A comparison of Boltzmann and Gibbs definitions of microcanonical entropy for small systems

AIP Advances

Two different definitions of entropy, S = k ln W, in the microcanonical ensemble have been competing for over 100 years. The Boltzmann/Planck definition is that W is the number of states accessible to the system at its energy E (also called the surface entropy). The Gibbs/Hertz definition is that W is the number of states of the system up to the energy E (also called the volume entropy). These two definitions agree for large systems but differ by terms of order 1/N for small systems, where N is the number of particles in the system. For three analytical examples (a generalized classical Hamiltonian, identical quantum harmonic oscillators, and the spinless quantum ideal gas), neither the Boltzmann/Planck entropy nor the corresponding heat capacity is extensive, because each is proportional to N − 1 rather than N; the Gibbs/Hertz entropy, by contrast, is extensive and, in addition, gives thermodynamic quantities that are in remarkable agreement with canonical ensemble calculations for systems of even a few particles. In a fourth example, a collection of two-level atoms, the Boltzmann/Planck entropy is in somewhat better agreement with canonical ensemble results. Similar model systems show that temperature changes when two subsystems come to thermal equilibrium are in better agreement with expectations for the Gibbs/Hertz temperature than for the Boltzmann/Planck temperature, except when the density of states is decreasing. I conclude that the Gibbs/Hertz entropy is more useful than the Boltzmann/Planck entropy for comparing microcanonical simulations with canonical molecular dynamics simulations of small systems.
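As a concrete illustration of the two counting prescriptions (a minimal numerical sketch, not the paper's code), take one of the examples named above: N identical quantum harmonic oscillators sharing n quanta, with energy measured in units of the level spacing and k_B = 1. The surface count is the number of ways to place exactly n quanta, C(n+N−1, N−1), and the volume count is its cumulative sum over all energies up to n:

```python
# Minimal sketch: Boltzmann/Planck (surface) vs Gibbs/Hertz (volume)
# microcanonical entropy for N identical quantum harmonic oscillators.
# Units: energy in quanta of hbar*omega, k_B = 1.  Illustrative only.
from math import comb, log

def w_surface(n, N):
    """Microstates with exactly n quanta among N oscillators."""
    return comb(n + N - 1, N - 1)

def w_volume(n, N):
    """Microstates with at most n quanta among N oscillators."""
    return sum(comb(m + N - 1, N - 1) for m in range(n + 1))

def temperatures(n, N):
    """Finite-difference temperatures 1/T = dS/dE (dE = 1 quantum)."""
    s_b = lambda m: log(w_surface(m, N))
    s_g = lambda m: log(w_volume(m, N))
    beta_b = (s_b(n + 1) - s_b(n - 1)) / 2.0
    beta_g = (s_g(n + 1) - s_g(n - 1)) / 2.0
    return 1.0 / beta_b, 1.0 / beta_g

for N in (3, 10, 100):
    n = 2 * N                      # same mean excitation (2 quanta) per oscillator
    t_b, t_g = temperatures(n, N)
    print(f"N={N:4d}  T_surface={t_b:7.4f}  T_volume={t_g:7.4f}")
```

Here the two temperatures differ by tens of percent at N = 3 and by well under one percent at N = 100, consistent with differences of order 1/N; both approach the canonical value 1/ln(3/2) ≈ 2.47 for this mean excitation, with the Gibbs/Hertz value the closer of the two even at small N.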

On the (Boltzmann) entropy of non-equilibrium systems

Physica D: Nonlinear Phenomena, 2004

Boltzmann defined the entropy of a macroscopic system in a macrostate M as the log of the volume of phase space (number of microstates) corresponding to M. This agrees with the thermodynamic entropy of Clausius when M specifies the locally conserved quantities of a system in local thermal equilibrium (LTE). Here we discuss Boltzmann's entropy, involving an appropriate choice of macro-variables, for systems not in LTE. We generalize the formulas of Boltzmann for dilute gases and of Resibois for hard sphere fluids, and show that for macro-variables satisfying any deterministic autonomous evolution equation arising from the microscopic dynamics, the corresponding Boltzmann entropy must satisfy an H-theorem.
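In symbols (a standard summary, not a quotation from the paper): the Boltzmann entropy is S_B(M) = k_B ln |Γ_M|, with |Γ_M| the phase-space volume of the set of microstates realizing the macrostate M. For a dilute gas described by its empirical one-particle distribution f(x, v), this reduces, up to constants, to S_B = −k_B ∫ f ln f dx dv, and the H-theorem referred to here is the statement that S_B(M_t) is non-decreasing in time when M_t evolves according to the autonomous macroscopic equation (for the dilute gas, the Boltzmann equation for f).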

Thermodynamics, Statistical Mechanics and Entropy

Entropy

The proper definition of thermodynamics and the thermodynamic entropy is discussed in the light of recent developments. The postulates for thermodynamics are examined critically, and some modifications are suggested to allow for the inclusion of long-range forces (within a system), inhomogeneous systems with non-extensive entropy, and systems that can have negative temperatures. Only the thermodynamics of finite systems is considered, with the condition that the system is large enough for the fluctuations to be smaller than the experimental resolution. The statistical basis for thermodynamics is discussed, along with four different forms of the (classical and quantum) entropy. The strengths and weaknesses of each are evaluated in relation to the requirements of thermodynamics. Effects of order 1/N, where N is the number of particles, are included in the discussion because they have played a significant role in the literature, even if they are too small to have a measurable effect in an experiment. The discussion includes the role of discreteness, the non-zero width of the energy and particle number distributions, the extensivity of models with non-interacting particles, and the concavity of the entropy with respect to energy. The results demonstrate the validity of negative temperatures.

Inconsistency of microcanonical entropy: the case of chemical potential

2015

Attempts to establish the microcanonical entropy as an adiabatic invariant date back to the works of Gibbs and Hertz. More recently, a consistency relation based on adiabatic invariance has been used to argue for the validity of the Gibbs (volume) entropy over the Boltzmann (surface) entropy. This consistency relation equates derivatives of the thermodynamic entropy to ensemble averages of the corresponding quantities in micro-state space (phase space or Hilbert space). In this work we reexamine the consistency relation when the number of particles, N, is taken as the independent thermodynamic variable; in other words, we investigate the consistency relation for the chemical potential, which is a fundamental thermodynamic quantity. We show, both by simple analytical calculations and by a model example, that neither definition of entropy satisfies the consistency condition when the relation is written for the chemical potential. This remains true regardless of the system size. Our results therefore cast doubt on the validity of adiabatic invariance as a required property of the thermodynamic entropy. We close with commentary on the derivation of thermostatistics from mechanics, which typically leads to controversial and inconsistent results.
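For orientation (the form commonly used in this debate, not quoted from the paper): for a parameter λ appearing in the Hamiltonian, the consistency condition reads T (∂S/∂λ)_E = −⟨∂H/∂λ⟩, with the average taken in the microcanonical ensemble, and the Gibbs volume entropy satisfies it exactly for such mechanical parameters. The paper's observation is that when the role of λ is played by the particle number N, so that the left-hand side produces the chemical potential through μ = −T (∂S/∂N)_E, the analogous condition fails for both the volume and the surface entropy.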

The nonequilibrium thermodynamics of small systems

Comptes Rendus Physique, 2007

Small systems found throughout physics, chemistry, and biology manifest striking properties as a result of their tiny dimensions. Examples of such systems include magnetic domains in ferromagnets, which are typically smaller than 300 nm; quantum dots and biological molecular machines that range in size from 2 to 100 nm; and solidlike clusters that are important in the relaxation of glassy systems and whose dimensions are a few nanometers. Scientists nowadays are interested in understanding the properties of such small systems. For example, they are beginning to investigate the dynamics of the biological motors responsible for converting chemical energy into useful work in the cell (see the article by Terence Strick, Jean-François Allemand, Vincent Croquette, and David Bensimon, PHYSICS TODAY October 2001, page 46). Those motors operate away from equilibrium, dissipate energy continuously, and make transitions between steady states. Until the early 1990s, researchers had lacked experimental methods to investigate such properties of small systems as how they exchange heat and work with their environments. The development of modern techniques of microscopic manipulation has changed the experimental situation. In parallel, during the past decade, theorists have developed several results collectively known as fluctuation theorems (FTs), some of which have been experimentally tested. The much-improved experimental access to the energy fluctuations of small systems and the formulation of the principles that govern both energy exchanges and their statistical excursions are starting to shed light on the unique properties of microscopic systems. Ultimately, the knowledge physicists are gaining with their new experimental and theoretical tools may serve as the basis for a theory of the nonequilibrium thermodynamics of small systems.
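Two representative fluctuation theorems (stated here only for orientation; they are not spelled out in the abstract) are the Jarzynski equality, ⟨exp(−W/k_B T)⟩ = exp(−ΔF/k_B T), which relates the distribution of work W performed in repeated nonequilibrium protocols to the equilibrium free-energy difference ΔF, and the Crooks relation, P_F(W)/P_R(−W) = exp[(W − ΔF)/k_B T], which compares the work distributions of a protocol and its time reverse; both have been tested in single-molecule pulling experiments.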

A definition of thermodynamic entropy valid for non-equilibrium states and few-particle systems

arXiv: Mathematical Physics, 2014

From a new rigorous formulation of the general axiomatic foundations of thermodynamics we derive an operational definition of entropy that responds to the emergent need in many technological frameworks to understand and deploy thermodynamic entropy well beyond the traditional realm of equilibrium states of macroscopic systems. The new definition is achieved by avoiding recourse to the traditional concepts of "heat" (which restricts a priori the traditional definitions of entropy to the equilibrium domain) and of "thermal reservoir" (which restricts in practice our previous definitions of non-equilibrium entropy to the many-particle domain). The measurement procedure that defines entropy is free from intrinsic limitations and can be applied, in principle, even to non-equilibrium states of few-particle systems, provided they are separable and uncorrelated. The construction starts from a previously developed set of carefully worded operational definitions...

A Novel Derivation of the Time Evolution of the Entropy for Macroscopic Systems in Thermal Non-Equilibrium

Entropy

The paper discusses how the two thermodynamic properties, energy (U) and exergy (E), can be used to solve the problem of quantifying the entropy of non-equilibrium systems. Both energy and exergy are a priori concepts, and their formal dependence on thermodynamic state variables at equilibrium is known. Exploiting the results of a previous study, we first calculate the non-equilibrium exergy E_n-eq for an arbitrary temperature distribution across a macroscopic body, with an accuracy that depends only on the available information about the initial distribution: the analytical results confirm that E_n-eq relaxes exponentially to its equilibrium value. Using the Gyftopoulos-Beretta formalism, a non-equilibrium entropy S_n-eq(x,t) is then derived from E_n-eq(x,t) and U(x,t). It is finally shown that the non-equilibrium entropy generation between two states is always larger than its equilibrium (herein referred to as "classical") counterpart. We conclude that every iso-energetic non-equilibrium state corresponds to an infinite set of non-equivalent states that can be ranked in terms of increasing entropy. Each point of the Gibbs plane therefore corresponds to a set of possible initial distributions: the non-equilibrium entropy is a multi-valued function that depends on the initial mass and energy distribution within the body. Though the concept cannot be directly extended to microscopic systems, it is argued that the present formulation is compatible with a possible reinterpretation of the existing non-equilibrium formulations, namely those of Tsallis and Grmela, and answers at least in part one of the objections set forth by Lieb and Yngvason. A systematic application of this paradigm is very convenient from a theoretical point of view and may be beneficial for meaningful future applications in the fields of nano-engineering and biological sciences.
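For orientation (a standard definition; the paper's exact construction is only sketched here and should be read as an assumption): the exergy of a closed system relative to an environment at temperature T_0 and pressure p_0 is E = (U − U_0) − T_0 (S − S_0) + p_0 (V − V_0), with the volume term dropping for a rigid body. In the Gyftopoulos-Beretta framework, entropy is itself defined in terms of energy and available energy with respect to a reservoir, which is what permits inverting a relation of this kind to extract S_n-eq(x,t) once U(x,t) and E_n-eq(x,t) are known.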

Boltzmann or Gibbs Entropy? Thermostatistics of Two Models with Few Particles

We study the statistical mechanics of small clusters (N ~ 10 - 100) for two-level systems and harmonic oscillators. Both Boltzmann’s and Gibbs’s definitions of entropy are used. The properties of the studied systems are evaluated numerically but exactly; this means that Stirling’s approximation was not used in the calculation and that the discrete nature of energy was taken into account. Results show that, for the two-level system, using the Gibbs entropy prevents temperatures from assuming negative values; however, the Gibbs temperatures reach very high values that are not plausible in physical terms. In the case of harmonic oscillators, there are no significant differences when using either definition of entropy. Both systems show that for N = 100 the exact results evaluated with statistical mechanics coincide with those found in the thermodynamic limit. This suggests that thermodynamics can be applied to systems as small as these.
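A minimal numerical sketch in the same spirit (not the authors' code; k_B = 1 and the level spacing taken as the energy unit): for N two-level atoms with n excitations, count W_Boltzmann = C(N, n) and W_Gibbs = Σ_{m ≤ n} C(N, m) exactly, without Stirling's approximation, and estimate temperatures by finite differences.

```python
# Minimal sketch: exact Boltzmann (surface) vs Gibbs (volume) entropy for
# N two-level atoms with n excitations.  Energy E = n (level spacing = 1),
# k_B = 1; temperatures from centred finite differences.  Illustrative only.
from math import comb, log

def s_boltzmann(n, N):
    return log(comb(N, n))                                # ln W_surface

def s_gibbs(n, N):
    return log(sum(comb(N, m) for m in range(n + 1)))     # ln W_volume

def temperature(entropy, n, N):
    return 2.0 / (entropy(n + 1, N) - entropy(n - 1, N))  # 1/T = dS/dE, dE = 1

N = 20
for n in (2, 8, 12, 18):   # below and above half filling (n = N/2 = 10)
    t_b = temperature(s_boltzmann, n, N)
    t_g = temperature(s_gibbs, n, N)
    print(f"n={n:2d}  T_Boltzmann={t_b:8.3f}  T_Gibbs={t_g:10.3f}")
```

With these numbers the Boltzmann temperature changes sign across half filling, because the surface count decreases with energy there, while the Gibbs temperature remains positive but climbs to roughly 10^4 at n = 18, which is the behavior described in the abstract.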

The Elusive Nature of Entropy and Its Physical Meaning

Entropy is the most used and often abused concept in science, but also in philosophy and society. Further confusion is produced by attempts to generalize entropy with similar, but not identical, concepts in other disciplines. The physical meaning of phenomenological, thermodynamic entropy is reasoned and elaborated by generalizing the Clausius definition to include generated heat, since it is irrelevant whether entropy is changed due to reversible heat transfer or irreversible heat generation. Irreversible, caloric heat transfer is introduced as complementing reversible heat transfer. It is also reasoned, and thus proven, why entropy cannot be destroyed but is always generated (and thus overall increased) locally and globally, at all space and time scales, without any exception. It is concluded that entropy is a thermal displacement (dynamic thermal-volume) of thermal energy due to absolute temperature as a thermal potential (dQ = T dS), and is thus associated with thermal heat and absolute temperature, i.e., with the distribution of thermal energy within thermal micro-particles in space. Entropy is an integral measure of (random) thermal energy redistribution (due to heat transfer and/or irreversible heat generation) within a material system structure in space, per absolute temperature level: dS = dQ_Sys/T = m C_Sys dT/T, thus a logarithmic integral function with units of J/K. It may also be expressed as a measure of "thermal disorder", being related to the logarithm of the number of all thermal, dynamic microstates W (their positions and momenta), S = k_B ln W, or to the sum of their logarithmic probabilities, S = −k_B ∑ p_i ln p_i, that correspond to, or are consistent with, the given thermodynamic macro-state. The number of thermal microstates W is correlated with the macro-properties temperature T and volume V for ideal gases. A system's form and/or functional order or disorder is not (thermal) energy order/disorder, and the former is not related to thermodynamic entropy. Expanding entropy to any type of disorder or information is a source of many misconceptions. Granted, there are certain benefits of simplified statistical descriptions for better comprehending the randomness of thermal motion and related physical quantities, but the limitations should be stated so the generalizations are not overstretched and the real physics overlooked, or, worse, discredited.
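A worked example of the integrated form above (standard textbook numbers, not taken from the article): heating 1 kg of water, with C_Sys ≈ 4186 J/(kg·K), from 293 K to 353 K gives ΔS = m C_Sys ln(T_2/T_1) = 4186 × ln(353/293) ≈ 7.8 × 10^2 J/K, and the result is the same whether the heat is transferred reversibly or generated irreversibly within the water (say, by electrical dissipation), in line with the article's point that the entropy change is indifferent to how the thermal energy was delivered.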