
Entropy in evolution

Biology and Philosophy, 1986

Daniel R. Brooks and E. O. Wiley have proposed a theory of evolution in which fitness is merely a rate-determining factor. Evolution is driven by nonequilibrium processes which increase the entropy and information content of species together. Evolution can occur without environmental selection, since increased complexity and organization result from the likely "capture" at the species level of random variations produced at the chemical level. Speciation can occur as the result of variation within the species which decreases the probability of sharing genetic information. Critics of the Brooks-Wiley theory argue that its authors have abused terminology from information theory and thermodynamics. In this paper I review the essentials of the theory and give an account of hierarchical physical information systems within which the theory can be interpreted. I then show how the major conceptual objections can be answered.

The Thermodynamic Considerations of Biological Evolution; the Role of Entropy

Although Darwin's theory of biological evolution is the cornerstone of modern biology, it lacks proper physical foundations. We consider ecosystems as closed systems that exchange energy and information, but not matter, with the outside. Moreover, predictable and periodic fluctuations in entropy, genetic diversity, population number, and resource availability form a cyclic process that can be analyzed via thermodynamic principles. The sun's energy input drives a reversed Carnot cycle in four phases. The first phase is a low-entropy, fast-changing environment that spurs genotype-phenotype plasticity. In phase 2, the growing population increases entropy, forming nutrient cycles via symbiotic, parasitic, and predator-prey relationships. In phase 3, competitive and chaotic interactions spread genetic innovations in the overpopulated, stressed ecosystem. Finally, in phase 4, extinction purges the non-evolvable genomes, but the surviving species carry the cycle's genetic innovations and make renewal possible.

Information, Entropy, and the Evolution of Living Systems

Brittonia, 1979

Information, entropy, and the evolution of living systems. Brittonia 31: 428-430. 1979. Selection at constant selective pressures results in the optimization of the average productivity within the system and an increase in the information content. The entropy increase through evolutionary time is therefore minimized. The "pattern" of entropy descriptions for ontogenetic (developmental) and phylogenetic (evolutionary) changes is shown to be different, and the latter is consistent with the Prigogine-Glansdorff principle for irreversible thermodynamic processes.
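For reference, a sketch of the principle this abstract invokes, in its standard textbook form (background material, not text from the paper): with thermodynamic fluxes $J_k$ conjugate to forces $X_k$, the local entropy production and its volume integral are

\[
\sigma = \sum_k J_k X_k \ge 0, \qquad P(t) = \int_V \sigma \, dV,
\]

and near equilibrium (linear flux-force relations) Prigogine's theorem gives $dP/dt \le 0$, with the minimum reached at the steady state; the more general Glansdorff-Prigogine evolution criterion constrains only the force contribution, $d_X P/dt \le 0$.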

The Thermodynamic Considerations of Evolution; the Role of Entropy in Biological Complexity

2023

Darwin's theory of biological evolution became a cornerstone of modern biology. Predictable fluctuations in entropy, genetic diversity, population number, and resource availability in ecosystems turn evolution into a cyclic process, making a thermodynamic analysis of the ecosystem possible. The sun's energy input drives a closed theoretical process, the reversed Carnot cycle, which divides into four distinct phases with standard features. Phase 1 is a low-entropy, fast-changing environment that spurs phenotypic plasticity. In phase 2, population growth increases entropy, forming nutrient cycles via symbiotic, parasitic, predator-prey, and other interdependent relationships. In phase 3, the overpopulated, stressed ecosystem outgrows its boundaries; competitive and chaotic interactions spread genetic innovations through horizontal gene transfer. Finally, in phase 4, extinction purges the non-evolvable genomes, but the surviving species carry the cycle's genetic innovations and make renewal possible. Therefore, compression and expansion of the ecospace by energy fluxes (i.e., ecosystem dynamics) are potent drivers of change. Thus, the Darwinian concept is a cyclic sequestering of the sun's energy into genetic complexity. The "second law of intellect" holds that genetic complexity either increases or remains constant; it never decreases.
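For orientation, the textbook bookkeeping of the reversed Carnot cycle that the two abstracts above use as an analogy (standard thermodynamics, not taken from the papers): two isothermal phases at temperatures $T_h$ and $T_c$ bracket two adiabatic phases, the working system returns to its initial state each cycle, and external work pumps heat against the temperature gradient:

\[
\oint dS = 0, \qquad \Delta S_{\text{isothermal}} = \pm\frac{Q}{T}, \qquad \mathrm{COP} = \frac{Q_c}{W} = \frac{T_c}{T_h - T_c}.
\]

On this reading, the ecosystem analogy treats solar input as the work that drives the cycle and the four ecological phases as the cycle's legs.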

Does maximal entropy production play a role in the evolution of biological complexity? A biological point of view

Rendiconti Lincei. Scienze Fisiche e Naturali, 2020

A considerable literature has developed around the concept that the complexity of biological organisms, and the development of ever-increasing complexity during biological evolution, is driven in some way by the maximization of entropy production (MEP). Most of these studies deal in very general terms with the living state and do not examine specific biological models. In the present study, we discuss some of the basic postulates of MEP as they are applied to living organisms. It is concluded that the MEP ideas seem to have little relevance either for the development of biological complexity or for biological evolution.

Thermodynamics and Evolution

Journal of Theoretical Biology, 2000

The science of thermodynamics is concerned with understanding the properties of inanimate matter insofar as they are determined by changes in temperature. The Second Law asserts that in irreversible processes there is a uni-directional increase in thermodynamic entropy, a measure of the degree of uncertainty in the thermal energy state of a randomly chosen particle in the aggregate. The science of evolution is concerned with understanding the properties of populations of living matter insofar as they are regulated by changes in generation time. Directionality theory, a mathematical model of the evolutionary process, establishes that in populations subject to bounded growth constraints, there is a uni-directional increase in evolutionary entropy, a measure of the degree of uncertainty in the age of the immediate ancestor of a randomly chosen newborn. This article reviews the mathematical basis of directionality theory and analyses the relation between directionality theory and statistical thermodynamics. We exploit an analytic relation between temperature and generation time to show that the directionality principle for evolutionary entropy is a nonequilibrium extension of the principle of a uni-directional increase of thermodynamic entropy. The analytic relation between these directionality principles is consistent with the hypothesis of the equivalence of fundamental laws as one moves up the hierarchy, from a molecular ensemble where the thermodynamic laws apply, to a population of replicating entities (molecules, cells, higher organisms), where evolutionary principles prevail.
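As a sketch of the quantity the abstract describes (the standard definition from Demetrius's directionality theory; the abstract itself does not write it out): let $l(x)$ be survivorship to age $x$, $m(x)$ the fecundity at age $x$, and $r$ the intrinsic growth rate fixed by the Euler-Lotka equation. Then $p(x)$ is the probability density that the mother of a randomly chosen newborn has age $x$, and evolutionary entropy $H$ is the Shannon uncertainty of that age distribution per unit generation time:

\[
\int_0^\infty e^{-rx} l(x)\, m(x)\, dx = 1, \qquad p(x) = e^{-rx} l(x)\, m(x),
\]
\[
H = \frac{S}{T} = \frac{-\int_0^\infty p(x)\ln p(x)\, dx}{\int_0^\infty x\, p(x)\, dx}.
\]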

It is not the entropy you produce, rather, how you produce it

Philosophical Transactions of the Royal Society B: Biological Sciences, 2010

The principle of maximum entropy production (MEP) seeks to better understand a large variety of the Earth's environmental and ecological systems by postulating that processes far from thermodynamic equilibrium will ‘adapt to steady states at which they dissipate energy and produce entropy at the maximum possible rate’. Our aim in this ‘outside view’, invited by Axel Kleidon, is to focus on what we think is an outstanding challenge for MEP and for irreversible thermodynamics in general: making specific predictions about the relative contribution of individual processes to entropy production. Using studies that compared entropy production in the atmosphere of a dry versus humid Earth, we show that two systems might have the same entropy production rate but very different internal dynamics of dissipation. Using the results of several of the papers in this special issue and a thought experiment, we show that components of life-containing systems can evolve to either lower or raise t...
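The paper's central point, that equal totals can hide very different internal dissipation maps, can be illustrated with a toy example that is ours, not the authors': two resistive elements in series carry the same current, so configurations with equal total resistance produce entropy at the same total rate while distributing it very differently. A minimal sketch (all values assumed):

```python
# Toy illustration (ours, not the paper's): equal total entropy production,
# different internal distribution of dissipation.
# Assumed model: steady Joule heating, sigma_i = I^2 * R_i / T for element i.

T = 300.0  # ambient temperature in kelvin (assumed)
I = 2.0    # steady current in amperes (assumed)

def entropy_production(resistances, current=I, temperature=T):
    """Return per-element entropy production rates (W/K) and their total."""
    per_element = [current**2 * r / temperature for r in resistances]
    return per_element, sum(per_element)

# Two configurations with the same total resistance (6 ohms):
even, total_even = entropy_production([3.0, 3.0])
skewed, total_skewed = entropy_production([5.0, 1.0])

print(total_even, total_skewed)  # 0.08 and 0.08 -- identical totals
print(even, skewed)              # [0.04, 0.04] vs [~0.067, ~0.013]
```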

Population and Entropy Fluctuations in Ecology: A Thermodynamic Model of Biological Evolution

2024

Although Darwin's theory of biological evolution is the cornerstone of modern biology, it lacks proper physical foundations. We applied the second law of thermodynamics to analyze biological evolution. Oscillations in state variables such as entropy, energy, temperature, pressure, and volume can be conceptualized as an endothermic, reversed Carnot cycle. This endothermic process can accumulate genetic and morphological complexity through a multi-step, cyclic process. The cycle alternates between phases that favor order and maximum energy use (low entropy) and phases of high-entropy competition, in which natural selection promotes minimal entropy production and favors highly specialized species. Our argument reconciles the contradiction between the maximum power principle and Prigogine's theory of minimum entropy production. Periodic mass extinctions act as pivotal reset points, removing highly specialized evolutionary dead ends while creating opportunities for surviving species to initiate new cycles of enhanced complexity. Notably, genetic material serves as an orthogonal, inert medium, carrying innovations forward and enabling the accumulation of biological complexity. Evolution's capacity to enhance complexity spontaneously through entropic effects suggests a conceptual extension: the "second law of intellect," a complementary principle to the second law of thermodynamics. This principle can aid a deeper understanding of Darwinian theory and inspire artificial intelligence research.
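The tension the abstract claims to reconcile can be anchored with a classic textbook example (our illustration, not the paper's): a source of fixed electromotive force $\varepsilon$ and internal resistance $r$ driving a load $R$ delivers power

\[
P_{\text{load}}(R) = \frac{\varepsilon^2 R}{(R+r)^2},
\]

which is maximized at $R = r$ (the maximum power point, with efficiency only $1/2$), whereas total entropy production $\sigma = \varepsilon^2 / [T(R+r)]$ falls monotonically as $R$ grows. Maximum power and minimum dissipation therefore select different operating points, which is precisely the kind of conflict the abstract assigns to different phases of the cycle.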

Entropy Perspectives of Molecular and Evolutionary Biology

2022

Attempts to find and quantify the supposed low entropy of organisms and its preservation are reviewed. The absolute entropy of the mixed components of non-living biomass (around -1.6 × 10³ J K⁻¹ L⁻¹) is the reference against which other entropy decreases would be ascribed to life. Compartmentation of metabolites and the departure from equilibrium of metabolic reactions account for entropy decreases of 1 and 40-50 J K⁻¹ L⁻¹, respectively, and, though small, are distinctive features of living tissues. Intense experimental and theoretical investigations suggest that no other living feature contributes significantly to the low entropy associated with life. Macromolecular structures, despite their informational relevance for life, do not supply significant decreases of thermodynamic entropy. The photosynthetic conversion of radiant energy to biomass energy accounts for most of the entropy (2.8 × 10⁵ J K⁻¹ per kg of carbon) produced by living beings. The comparatively very low entropy produced in other proc...
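Putting the abstract's figures on a single scale (our arithmetic, using only the numbers quoted above): the life-specific entropy decreases are small against the absolute entropy of biomass,

\[
\frac{40\text{-}50\ \mathrm{J\,K^{-1}\,L^{-1}}}{1.6\times 10^{3}\ \mathrm{J\,K^{-1}\,L^{-1}}} \approx 3\%,
\]

which supports the abstract's framing that macroscopic thermodynamic order is a minor signature of life compared with the entropy exported through photosynthesis and metabolism.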

Information and Entropy – Top-down or Bottom-up development in living systems

Int. J. of Design & Nature and Ecodynamics, 2009

ABSTRACT: This paper deals with the fundamental and challenging question of the ultimate origin of genetic information from a thermodynamic perspective. The theory of evolution postulates that random mutations and natural selection can increase genetic information over successive generations. It is often argued from an evolutionary perspective that this does not violate the second law of thermodynamics, because the entropy of a non-isolated system can decrease due to energy input from an outside source, especially the sun when considering the earth as a biotic system. By this it is proposed that a particular system can become organised at the expense of an increase in entropy elsewhere. However, whilst this argument works for structures such as snowflakes that are formed by natural forces, it does not work for genetic information, because the information system is composed of machinery which requires precise and non-spontaneous raised free energy levels, whereas crystals like snowflakes have zero free energy as the phase transition occurs. The functional machinery of biological systems such as DNA, RNA and proteins requires that precise, non-spontaneous raised free energies be formed in the molecular bonds, which are maintained in a far-from-equilibrium state. Furthermore, biological structures contain coded instructions which, as is shown in this paper, are not defined by the matter and energy of the molecules carrying this information. Thus, the specified complexity cannot be created by natural forces even in conditions far from equilibrium. The genetic information needed to code for complex structures like proteins actually requires information which organises the natural forces surrounding it and not the other way around: the information is crucially not defined by the material on which it sits. The information system locally requires the free energies of the molecular machinery to be raised in order for the information to be stored. Consequently, the fundamental laws of thermodynamics show that the entropy reduction which can occur naturally in non-isolated systems is not a sufficient argument to explain the origin of either biological machinery or the genetic information that is inextricably intertwined with it. This paper highlights the distinctive and non-material nature of information and its relationship with matter, energy and natural forces. It is proposed in conclusion that it is the non-material information (transcendent to the matter and energy) that is actually itself constraining the local thermodynamics to be in ordered disequilibrium, with the specified raised free energy levels necessary for the molecular and cellular machinery to operate.

1 INTRODUCTION

How does information relate to the hardware used? Is it possible to measure information and its effect on the local thermodynamics of the system hardware? How do these relate in real systems, and can one connect and quantify the effect of information on matter and energy? This issue is most importantly considered in the realm of living systems, where one quickly becomes aware of the extraordinary complexity which so organises the biochemical proteins at the molecular level as to effectively build digital machinery that, ever since the discovery of the DNA model by Crick and Watson [1], has been the goal of modern software engineers to emulate. Genetic information in the form of DNA is responsible for defining the vast array of biological structures and processes that we see around us in the natural world.
In the human genome, there are said to be around three billion units of information. These units specify the structures and processes
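To give the "three billion units" a rough scale, a hedged back-of-envelope calculation (ours, and counting raw storage capacity only, not functional or compressed information):

```python
# Back-of-envelope (assumption: "units" = base pairs at 2 bits per base,
# since log2(4) = 2 for the alphabet {A, C, G, T}).
base_pairs = 3_000_000_000   # approximate length of the human genome
bits_per_base = 2

bits = base_pairs * bits_per_base
gigabytes = bits / 8 / 1e9
print(f"{bits:.1e} bits ≈ {gigabytes:.2f} GB of raw capacity")
# -> 6.0e+09 bits ≈ 0.75 GB
```

On the paper's own terms this number measures the carrying medium's capacity, not the "specified" information the author argues is independent of it.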