Beyond Disorder: A New Perspective on Entropy in Chemistry

The Elusive Nature of Entropy and Its Physical Meaning

Entropy, 2014

Entropy is among the most used, and often abused, concepts in science, as well as in philosophy and society. Further confusion is produced by attempts to generalize entropy with similar, but not identical, concepts in other disciplines. The physical meaning of phenomenological, thermodynamic entropy is reasoned and elaborated by generalizing Clausius' definition to include generated heat, since it is irrelevant whether entropy changes due to reversible heat transfer or irreversible heat generation. Irreversible, caloric heat transfer is introduced as complementing reversible heat transfer. It is also reasoned, and thus proven, why entropy cannot be destroyed but is always generated (and thus overall increased) locally and globally, at every space and time scale, without exception. It is concluded that entropy is a thermal displacement (dynamic thermal-volume) of thermal energy due to absolute temperature as a thermal potential (dQ = T dS), and is thus associated with thermal heat and absolute temperature, i.e., with the distribution of thermal energy among thermal micro-particles in space. Entropy is an integral measure of (random) thermal energy redistribution (due to heat transfer and/or irreversible heat generation) within a material system structure in space, per absolute temperature level: dS = dQ_Sys/T = m C_Sys dT/T, thus a logarithmic integral function, with the unit J/K. It may also be expressed as a measure of "thermal disorder", being related to the logarithm of the number of all thermal, dynamic microstates W (their positions and momenta), S = k_B ln W, or to the sum of their logarithmic probabilities, S = −k_B ∑ p_i ln p_i, that correspond to, or are consistent with, the given thermodynamic macro-state. The number of thermal microstates W is correlated with the macro-properties temperature T and volume V for ideal gases. A system's form and/or functional order or disorder is not (thermal) energy order/disorder, and the former is not related to thermodynamic entropy. Extending entropy to any type of disorder or information is a source of many misconceptions. Granted, there are certain benefits to simplified statistical descriptions for better comprehending the randomness of thermal motion and related physical quantities, but the limitations should be stated so the generalizations are not overstretched and the real physics overlooked, or worse, discredited.
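
As a concrete illustration of the definitions above, the following minimal sketch (Python, with hypothetical masses and temperatures) evaluates the Clausius entropy change for heating a body of constant heat capacity, ΔS = m·c·ln(T2/T1), and checks that the Gibbs form S = −k_B ∑ p_i ln p_i reduces to the Boltzmann form S = k_B ln W when all W microstates are equally probable.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def clausius_entropy_change(mass_kg, specific_heat, t1_kelvin, t2_kelvin):
    """Entropy change of a heated body: integral of m*c*dT/T = m*c*ln(T2/T1)."""
    return mass_kg * specific_heat * math.log(t2_kelvin / t1_kelvin)

def boltzmann_entropy(num_microstates):
    """S = k_B * ln(W) for W equally probable thermal microstates."""
    return K_B * math.log(num_microstates)

def gibbs_entropy(probabilities):
    """S = -k_B * sum(p_i * ln(p_i)) over a microstate distribution."""
    return -K_B * sum(p * math.log(p) for p in probabilities if p > 0.0)

# Heating 1 kg of water (c ~ 4186 J/(kg K)) from 300 K to 350 K: ~645 J/K
print(clausius_entropy_change(1.0, 4186.0, 300.0, 350.0))

# A uniform distribution over W microstates reproduces the Boltzmann form
w = 10**6
print(abs(gibbs_entropy([1.0 / w] * w) - boltzmann_entropy(w)))  # ~0
```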

Using Entropy Leads to a Better Understanding of Biological Systems

Biophysical Journal, 2011

In studying biological systems, conventional approaches based on the laws of physics almost always require introducing appropriate approximations. We argue that a comprehensive approach that integrates the laws of physics and principles of inference provides a better conceptual framework for revealing emergence in such systems. The crux of this comprehensive approach is entropy. Entropy is not merely a physical quantity; it is also a reasoning tool for processing information with the least bias. By reviewing three distinctive examples, from protein folding dynamics to drug design, we demonstrate the developments and applications of this comprehensive approach in the area of biological systems.
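
The "least bias" reasoning referred to above is the maximum-entropy principle: among all distributions consistent with the available constraints, choose the one with the largest entropy. The sketch below is a simplified illustration of that idea, not the authors' code; the state values and the mean constraint are hypothetical.

```python
import math

def maxent_distribution(values, target_mean, lam_lo=-50.0, lam_hi=50.0):
    """Least-biased (maximum-entropy) distribution over discrete states whose
    mean is constrained to target_mean: p_i ~ exp(-lam * x_i), with the
    Lagrange multiplier lam found by bisection."""
    def mean_for(lam):
        weights = [math.exp(-lam * x) for x in values]
        z = sum(weights)
        return sum(w * x for w, x in zip(weights, values)) / z

    lo, hi = lam_lo, lam_hi
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if mean_for(mid) > target_mean:
            lo = mid  # the mean decreases as lam increases
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    weights = [math.exp(-lam * x) for x in values]
    z = sum(weights)
    return [w / z for w in weights]

# Hypothetical example: six states with a constrained mean of 1.5
p = maxent_distribution([0, 1, 2, 3, 4, 5], 1.5)
print([round(x, 4) for x in p], round(sum(p), 6))
```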

Directionality of Chemical Reaction and Spontaneity of Biological Process in the Context of Entropy

Int J Regenr Med, 2022

Chemical and biochemical reactions are carried out either to generate energy or to produce useful macromolecules. Entropy is a widely applied concept in many fields, including physics, chemistry, biology, and medicine, but the various perspectives used to describe it have created confusion and misconceptions. In chemical and biochemical reactions, entropy plays a significant role in directionality and spontaneity. Potential energy can be used to better understand the concept of entropy: potential energy represents order, while entropy represents disorder; they are therefore inversely proportional and intimately linked. Molecules with high potential usually have rich sets of functions and information, owing to the richness of their constitution, configuration, and conformation. In molecules with low potential, greater vibrational, rotational, and translational motions are associated with decreased order in constitution, configuration, and conformation. The distribution of electronic charge in macromolecules changes over time, increasing the rotation of side-chain residues and thereby increasing entropy and affecting potential in terms of structure, function, and information. Entropy can thus be defined as a state of spontaneous change, bound to time and constantly increasing, which causes structural changes in the form of constitution, configuration, and conformation; functional changes in the form of the ability to do work; and informational changes in the form of the transmission of commands.
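
The link between entropy and reaction directionality can be made explicit with the standard Gibbs-energy criterion, which combines the enthalpy and entropy changes of the system at constant temperature and pressure. The sketch below is a generic illustration with hypothetical numbers, not a calculation taken from the paper.

```python
def reaction_is_spontaneous(delta_h, delta_s, temperature):
    """Gibbs criterion at constant T and P: a reaction is spontaneous when
    dG = dH - T*dS < 0, i.e. when the entropy of system plus surroundings
    increases overall.  delta_h in J/mol, delta_s in J/(mol K), T in K."""
    delta_g = delta_h - temperature * delta_s
    return delta_g < 0.0, delta_g

# Hypothetical exothermic reaction with a modest entropy gain at 298 K
spontaneous, dg = reaction_is_spontaneous(-50_000.0, 100.0, 298.0)
print(spontaneous, dg)  # True, -79800.0 J/mol
```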

Defining and quantifying frustration in the energy landscape: Applications to atomic and molecular clusters, biomolecules, jammed and glassy systems

Journal of Chemical Physics, 2017

The emergence of observable properties from the organisation of the underlying potential energy landscape is analysed, spanning a full range of complexity from self-organising to glassy and jammed systems. The examples include atomic and molecular clusters, a β-barrel protein, the GNNQQNY peptide dimer, and models of condensed matter that exhibit structural glass formation and jamming. We have considered measures based on several different properties, namely, the Shannon entropy, an equilibrium thermodynamic measure that uses a sample of local minima, and indices that require additional information about the connections between local minima in the form of transition states. A frustration index is defined that correlates directly with key properties that distinguish relaxation behaviour within this diverse set. The index uses the ratio of the energy barrier to the energy difference with reference to the global minimum. The contributions for each local minimum are weighted by the equilibrium occupation probabilities. Hence we obtain fundamental insight into the connections and distinctions between systems that cover the continuum from efficient structure-seekers to landscapes that exhibit broken ergodicity and rare event dynamics.
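
One plausible reading of the frustration index described above is that each local minimum contributes the ratio of its escape barrier to its energy above the global minimum, weighted by its equilibrium (Boltzmann) occupation probability. The Python sketch below is only a schematic interpretation of the abstract, with made-up reduced-unit energies and barriers; the precise weighting and barrier conventions used in the paper may differ.

```python
import math

def frustration_index(minima_energies, escape_barriers, temperature, k_b=1.0):
    """Schematic frustration measure: each local minimum (excluding the global
    minimum) contributes the ratio of its escape barrier to its energy above
    the global minimum, weighted by its Boltzmann occupation probability."""
    e_global = min(minima_energies)
    beta = 1.0 / (k_b * temperature)
    weights = [math.exp(-beta * (e - e_global)) for e in minima_energies]
    z = sum(weights)

    index = 0.0
    for e, barrier, w in zip(minima_energies, escape_barriers, weights):
        gap = e - e_global
        if gap > 0.0:  # skip the global minimum itself
            index += (w / z) * (barrier / gap)
    return index

# Made-up reduced-unit energies (global minimum first) and escape barriers
energies = [0.0, 0.5, 1.2]
barriers = [0.0, 2.0, 0.3]
print(frustration_index(energies, barriers, temperature=0.5))
```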

Entropy Perspectives of Molecular and Evolutionary Biology

2022

Attempts to find and quantify the supposed low entropy of organisms and its preservation are reviewed. The absolute entropy of the mixed components of non-living biomass (around −1.6 × 10³ J K⁻¹ L⁻¹) is the reference against which other entropy decreases would be ascribed to life. Compartmentation of metabolites and the departure from equilibrium of metabolic reactions account for entropy decreases of 1 and 40–50 J K⁻¹ L⁻¹, respectively, and, though small, these are distinctive features of living tissues. Intense experimental and theoretical investigation suggests that no other living feature contributes significantly to the low entropy associated with life. Macromolecular structures, despite their informational relevance for life, do not supply significant decreases of thermodynamic entropy. The photosynthetic conversion of radiant energy to biomass energy accounts for most of the entropy (2.8 × 10⁵ J K⁻¹ per kg of carbon) produced by living beings. The comparatively very low entropy produced in other proc...

Structuring Information and Entropy: Catalyst as Information Carrier

2013

Many authors have tried to exploit the similarities between the expressions for entropy in statistical thermodynamics and those of Shannon's information theory. In a new approach, we highlight the role of information involved in chemical systems, in particular in the interaction between catalysts and reactants, which we call structuring information. By means of examples, we present applications of this concept to the biosphere, spanning a vast domain from the appearance of life on Earth to its present evolution.
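
The formal similarity referred to here is that Shannon's H = −∑ p_i log₂ p_i and the statistical-thermodynamic S = −k_B ∑ p_i ln p_i differ only by the constant factor k_B ln 2. A minimal sketch of that correspondence (illustrative only, with an arbitrary distribution, not taken from the paper):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def shannon_entropy_bits(probabilities):
    """Shannon information entropy H = -sum(p_i * log2(p_i)), in bits."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0.0)

def gibbs_entropy(probabilities):
    """Statistical-thermodynamic entropy S = -k_B * sum(p_i * ln(p_i)), in J/K."""
    return -K_B * sum(p * math.log(p) for p in probabilities if p > 0.0)

# The two expressions differ only by the constant factor k_B * ln(2)
p = [0.5, 0.25, 0.125, 0.125]
print(shannon_entropy_bits(p))                 # 1.75 bits
print(gibbs_entropy(p) / (K_B * math.log(2)))  # 1.75 again
```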