Review: "Structuring Information and Entropy: Catalyst as Information Carrier"
Related papers
Structuring Information and Entropy: Catalyst as Information Carrier
Many authors have tried to exploit the similarities between the expressions for entropy in statistical thermodynamics and those of Shannon's information theory. In a new approach, we highlight the role of information involved in chemical systems, in particular in the interaction between catalysts and reactants, which we call structuring information. By means of examples, we present some applications of this concept to the biosphere, visiting a vast domain ranging from the appearance of life on Earth to its present evolution.
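For reference, the formal parallel alluded to in this abstract is the textbook one between the Gibbs entropy of statistical thermodynamics and Shannon's measure of information (the following expressions are standard forms, not taken from the paper itself):

S = -k_B \sum_i p_i \ln p_i    (Gibbs entropy, with p_i the probabilities of microstates)

H = -\sum_i p_i \log_2 p_i    (Shannon information, with p_i the probabilities of messages or symbols)

Up to the constant factor k_B \ln 2 and the interpretation given to the probabilities, the two expressions share the same mathematical form.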
Information and Entropy – Top-down or Bottom-up development in living systems
Int. J. of Design & Nature and Ecodynamics, 2009
ABSTRACT: This paper deals with the fundamental and challenging question of the ultimate origin of genetic information from a thermodynamic perspective. The theory of evolution postulates that random mutations and natural selection can increase genetic information over successive generations. It is often argued from an evolutionary perspective that this does not violate the second law of thermodynamics, because the entropy of a non-isolated system can decrease due to energy input from an outside source, especially the sun when the earth is considered as a biotic system. By this it is proposed that a particular system can become organised at the expense of an increase in entropy elsewhere. However, whilst this argument works for structures such as snowflakes that are formed by natural forces, it does not work for genetic information, because the information system is composed of machinery which requires precise and non-spontaneous raised free energy levels, whereas crystals like snowflakes have zero free energy as the phase transition occurs. The functional machinery of biological systems such as DNA, RNA and proteins requires that precise, non-spontaneous raised free energies be formed in the molecular bonds, which are maintained in a far-from-equilibrium state. Furthermore, biological structures contain coded instructions which, as is shown in this paper, are not defined by the matter and energy of the molecules carrying this information. Thus, the specified complexity cannot be created by natural forces even in conditions far from equilibrium. The genetic information needed to code for complex structures like proteins actually requires information which organises the natural forces surrounding it, and not the other way around; the information is crucially not defined by the material on which it sits. The information system locally requires the free energies of the molecular machinery to be raised in order for the information to be stored. Consequently, the fundamental laws of thermodynamics show that entropy reduction, which can occur naturally in non-isolated systems, is not a sufficient argument to explain the origin of either biological machinery or the genetic information that is inextricably intertwined with it. This paper highlights the distinctive and non-material nature of information and its relationship with matter, energy and natural forces. It is proposed in conclusion that it is the non-material information (transcendent to the matter and energy) that is actually itself constraining the local thermodynamics to be in ordered disequilibrium, with the specified raised free energy levels necessary for the molecular and cellular machinery to operate.

1 INTRODUCTION
How does information relate to the hardware used? Is it possible to measure information and its effect on the local thermodynamics of the system hardware? How do these relate in real systems, and can one connect and quantify the effect of information on matter and energy? This issue is most importantly considered in the realm of living systems, where one quickly becomes aware of the extraordinary complexity which so organises the biochemical proteins at the molecular level as to effectively build digital machinery that, since the discovery of the DNA model by Crick and Watson [1], has long been the goal of modern software engineers to emulate. Genetic information in the form of DNA is responsible for defining the vast array of biological structures and processes that we see around us in the natural world.
In the human genome, there are said to be around three billion units of information. These units specify the structures and processes
Information and Thermodynamics in Living Systems
Are there laws of information exchange? And how do the principles of thermodynamics connect with the communication of information? We consider first the concept of information and examine the various alternatives for its definition. The reductionist approach has been to regard information as arising out of matter and energy. In such an approach, coded information systems such as DNA are regarded as accidental in terms of the origin of life, and it is argued that these then led to the evolution of all life forms as a process of increasing complexity by natural selection operating on mutations on these first forms of life. However, scientists in the discipline of thermodynamics have long been aware that organisational systems are inherently systems with low local entropy, and have argued that the only way to have consistency with an evolutionary model of the universe and common descent of all life forms is to posit a flow of low entropy into the earth's environment; in this second approach they suggest that islands of low entropy form the organisational structures found in living systems. A third alternative proposes that information is in fact non-material and that coded information systems (such as, but not restricted to, the coding of DNA in all living systems) are not defined at all by the biochemistry or physics of the molecules used to store the data. Rather than matter and energy defining the information sitting on the polymers of life, this approach posits that the reverse is in fact the case. Information has its definition outside the matter and energy on which it sits, and furthermore constrains it to operate in a highly non-equilibrium thermodynamic environment. This proposal resolves the thermodynamic issues and invokes the correct paradigm for understanding the vital area of thermodynamic/organisational interactions, for which alternative paradigms have not given a satisfactory explanation of the way information in systems operates. Starting from the paradigm of information being defined by non-material arrangement and coding, one can then postulate the idea of laws of information exchange which have some parallels with the laws of thermodynamics that undergird such an approach. These issues are explored tentatively in this paper, and lay the groundwork for further investigative study.
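As background to the "flow of low entropy" argument summarised in the abstracts above (a standard statement of the second law, not taken from either paper), the compensation argument is usually written as an entropy balance:

\Delta S_\text{total} = \Delta S_\text{system} + \Delta S_\text{surroundings} \ge 0

A local decrease \Delta S_\text{system} < 0 (an "island of low entropy") is therefore permitted, provided the surroundings gain at least as much entropy. The papers above accept this balance but dispute whether it is sufficient to account for coded biological information.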
Information, Thermodynamics and Life: A Narrative Review
2021
Information is probably one of the most difficult physical quantities to comprehend. This applies not only to the very definition of information, but also to the physical entity of information, meaning how it can be quantified and measured. In recent years, information theory and its function in systems has been an intense field of study, due to the large increase in available information technology, where the notion of the bit dominates the information discipline. Information theory has also expanded from the “simple” bit to the quantal “qubit”, which added more variables for consideration. One of the main applications of information theory could be considered to be the field of “autonomy”, which is the main characteristic of living organisms in nature, since they all have self-sustainability, motion and self-protection. These traits, along with the ability to be aware of existence, make them difficult and complex to simulate in artificial constructs. There are many approaches to the concept of s...
Information, Entropy, and the Evolution of Living Systems
Brittonia, 1979
Information, entropy, and the evolution of living systems. Brittonia 31: 428-430. 1979. Selection at constant selective pressures results in the optimization of the average productivity within the system and an increase in the information content. The entropy increase through evolutionary time is, therefore, minimized. The "pattern" of entropy descriptions for ontogenetic (developmental) and phylogenetic (evolutionary) changes is shown to be different, and the latter is consistent with the Prigogine-Glansdorff principle for irreversible thermodynamic processes.
"A few exciting words": information and entropy revisited
A review is presented of the relation between information and entropy, focusing on two main issues: the similarity of the formal definitions of physical entropy, according to statistical mechanics, and of information, according to information theory; and the possible subjectivity of entropy considered as missing information. The paper updates the 1983 analysis of Shaw and Davis. The difference in the interpretations of information given respectively by Shannon and by Wiener, significant for the information sciences, receives particular consideration. Analysis of a range of material, from literary theory to thermodynamics, is used to draw out the issues. Emphasis is placed on recourse to the original sources, and on direct quotation, to attempt to overcome some of the misunderstandings and oversimplifications that have occurred with these topics. While it is strongly related to entropy, information is neither identical with it, nor its opposite. Information is related to order and pattern, but also to disorder and randomness. The relations between information and the "interesting complexity," which embodies both patterns and randomness, are worthy of attention.
Flows of Information and Informational Trajectories in Chemical Processes
2012
What is the importance of the concept of an information measure in quantum mechanics? To answer, it is necessary to try to establish the importance of the concept of information itself, which is a general concept and perfectly applicable to any case. For example: what do the codes used to send messages from a communications satellite have in common with the bases of a DNA molecule? How is the second law of thermodynamics related to communication, to the extent that it is possible to speak of the entropy of a musical score? Why are the intricate problems of probability related to the way we express ourselves orally or in writing? The answer to all of these is information, and the fact that one concept can link such different ideas reveals its great generality and power. Until the forties, information had not been defined as a scientific term; the definition that then emerged was quite new, different from all the common meanings, precisely because it was described with sufficient accuracy for mathematicians and engineers...
"Entropy, free energy and information in living systems"
In this paper we consider the concept of information and the various alternatives for its definition. The reductionist approach has been to regard information as arising out of matter and energy. Coded information systems such as DNA are regarded as accidental in terms of the origin of life, and it is argued that these then led to the evolution of all life forms as a process of increasing complexity by natural selection operating on mutations on these first forms of life. Thermodynamicists have long been aware that organisational systems are inherently systems with low local entropy, and have argued that the only way to have consistency with an evolutionary model of the Universe and common descent of all life forms is to posit a flow of low entropy into the earth's environment; in such a model they suggest that islands of low entropy form the organisational structures found in living systems. There is a third alternative explored in this paper, which proposes that information is in fact non-material and that the coding of DNA and of all living systems is not defined at all by the biochemistry or physics of the molecules used to store the data. Rather than matter and energy defining the information sitting on the polymers of life, the reverse is the case. Information has its definition outside the matter and energy on which it sits, and furthermore constrains it to operate in a highly non-equilibrium thermodynamic environment. This approach resolves the thermodynamic issues and invokes the correct metaphysical paradigm for understanding the vital area of thermodynamic/organisational interaction, for which alternative paradigms have not given a satisfactory explanation of the way information in systems operates.

1. Introduction
The concept of information has been a major issue since the discovery by Francis Crick and James Watson of the coding structure of DNA in 1953. Crick himself stated [1]: "If the code does indeed have some logical foundation then it is legitimate to consider all the evidence, both good and bad, in any attempt to deduce it." This was stated in the context of the discovery that triplets of nucleotides running along the rungs of the double-helix molecule of DNA carry information to code for a specific amino acid, which then makes up the proteins of the living organism. Crick was always of a reductionist mindset and had no sympathy with any approach which regarded the coding as essentially an expression of a non-material intelligence transcendent to the polymer itself, and the above statement in its original context is most definitely not advocating an exploration of information in any paradigm other than a purely materialist approach. However, it is significant because it shows that scientific investigation can be trapped by only considering one pathway: what if the search for a 'logical foundation' advocated by Crick actually leads one to the edge of the material region of scientific enquiry? Stephen Jay Gould wrote of non-overlapping magisteria [2], often referred to with the acronym NOMA, in order to resolve the issue of how to approach science describing the physical realm and the metaphysical/philosophical concepts describing the realities dealing with the non-material. This is diagrammatically shown in figure 1.
Information, Entropy, Life and the Universe
2015
In 2015, I wrote a book with the same title as this article. The book's subtitle is: "What we know and what we do not know." On the book's dedication page, I wrote [1]: "This book is dedicated to readers of popular science books who are baffled, perplexed, puzzled, astonished, confused, and discombobulated by reading about Information, Entropy, Life and the Universe." In the first part of this article, I will present the definitions of two central concepts: the "Shannon measure of information" (SMI) in information theory, and "entropy" in thermodynamics. Following these definitions, I will discuss the framework of their applicability. In the second part of the article, I will examine the question of whether living systems and the entire universe are, or are not, within the framework of applicability of the concepts of SMI and entropy. I will show that much of the confusion that exists in the literature arises because of people's ignorance about the framework of applicability of these concepts.
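As a simple numerical illustration of how the SMI is quantified (a textbook example, not drawn from the article): for a fair coin the SMI is \log_2 2 = 1 bit; for a fair six-sided die it is \log_2 6 \approx 2.585 bits; and for any n equally likely outcomes it is \log_2 n, the maximum attainable over all distributions on n outcomes.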