How the Relationship Between Information Theory and Thermodynamics Can Contribute to Explaining Brain and Cognitive Activity: An Integrative Approach

Brain activity and cognition: a connection from thermodynamics and information theory

Frontiers in Psychology, 2015

The connection between brain and mind is an important scientific and philosophical question that we are still far from completely understanding. A crucial point for our work is the observation that thermodynamics provides a convenient framework to model brain activity, whereas cognition can be modeled in information-theoretical terms; indeed, several models have been proposed so far from both approaches. A second critical remark is the existence of deep theoretical connections between thermodynamics and information theory: some well-known authors claim that the laws of thermodynamics are nothing but principles of information theory. Unlike in physics or chemistry, a formalization of the relationship between information and energy is currently lacking in neuroscience. In this paper we propose a framework that connects physical brain models and cognitive models by means of the theoretical links between information theory and thermodynamics. Ultimately, this article aims to provide further insight into the formal relationship between cognition and neural activity.

The Thermodynamics of Cognition: A Mathematical Treatment

Computational and Structural Biotechnology Journal 19(2), 2021

There is a general expectation that the laws of classical physics must apply to biology, particularly to the neural system. The evoked cycle represents the brain's energy/information exchange with the physical environment through stimuli. The thermodynamics of emotions might therefore elucidate the neurological origin of intellectual evolution and explain the psychological and health consequences of positive and negative emotional states based on their energy profiles. We utilized the Carnot cycle and Landauer's principle to analyze the energetic consequences of the brain's resting and evoked states during and after various cognitive states. Specifically, positive emotional states can be represented by the reversed Carnot cycle, whereas negative emotional reactions trigger the Carnot cycle. The two conditions have contrasting energetic and entropic aftereffects, with consequences for mental energy. The mathematics of the Carnot and reversed Carnot cycles, which can explain recent findings in human psychology, might be constructive in the scientific endeavor of turning psychology into a hard science.
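For reference, the two textbook relations this abstract leans on are the Carnot efficiency of a heat engine running between a hot reservoir at temperature T_h and a cold one at T_c, and Landauer's lower bound on the heat dissipated when one bit of information is erased at temperature T (standard statements, not the paper's own notation):

    \eta_{\mathrm{Carnot}} = 1 - \frac{T_c}{T_h},
    \qquad
    Q_{\mathrm{erase}} \ge k_B T \ln 2

The reversed Carnot cycle is the same cycle run as a refrigerator: work is consumed to move heat against the temperature gradient, which is the sense in which the abstract assigns contrasting energy profiles to the two emotional regimes.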

The Mental Maxwell Relations: A Thermodynamic Allegory for Higher Brain Functions

Frontiers in Neuroscience

The theoretical framework of classical thermodynamics unifies vastly diverse natural phenomena and captures once-elusive effects in concrete terms. Neuroscience confronts equally varied, equally ineffable phenomena in the mental realm, but has yet to unite or to apprehend them rigorously, perhaps due to an insufficient theoretical framework. The terms for mental phenomena, the mental variables, typically used in neuroscience are overly numerous and imprecise. Unlike in thermodynamics or other branches of physics, in neuroscience, there are no core mental variables from which all others formally derive and it is unclear which variables are distinct and which overlap. This may be due to the nature of mental variables themselves. Unlike the variables of physics, perhaps they cannot be interpreted as composites of a small number of axioms. However, it is well worth exploring if they can, as that would allow more parsimonious theories of higher brain function. Here we offer a theoretical...
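For readers unfamiliar with the namesake, a standard thermodynamic Maxwell relation (a textbook identity, offered purely as background to the allegory) follows from the exactness of the internal-energy differential:

    dU = T\,dS - P\,dV
    \quad\Longrightarrow\quad
    \left(\frac{\partial T}{\partial V}\right)_{\!S} = -\left(\frac{\partial P}{\partial S}\right)_{\!V}

The appeal of such relations is that a handful of state variables (U, S, T, P, V) generates a web of exact interdependencies; that parsimony is what the authors ask whether mental variables could, even in principle, admit.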

The Thermodynamic Analysis of Neural Computation

J Neurosci Clin Res, 2018

The brain displays a low-frequency ground energy conformation, called the resting state, which is characterized by an energy/information balance maintained via self-regulatory mechanisms. Despite high-frequency evoked activity, such as the detail-oriented sensory processing of environmental data and the accumulation of information, the brain's automatic regulation is always able to recover the resting state. Indeed, we show that the two energetic processes, activation that decreases temporal dimensionality via transient bifurcations and the brain's ensuing response, lead to complementary and symmetric procedures that satisfy Landauer's principle. Landauer's principle, which states that information erasure requires energy, predicts heat accumulation in the system; this means that information accumulation is correlated with increases in temperature, leading to actions that recover the resting state. We explain how brain synaptic networks frame a closed system, similar to the Carnot cycle, in which the information/energy cycle accumulates energy in synaptic connections. In deep learning, representation of information might occur via the same mechanism.
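To make the Landauer bookkeeping concrete, here is a minimal numeric sketch (not from the paper; the function name and the choice of body temperature are illustrative) of the minimum heat released when erasing information at a given temperature:

    import math

    # Boltzmann constant (exact value since the 2019 SI redefinition), in J/K.
    K_B = 1.380649e-23

    def landauer_heat(bits: float, temperature_k: float) -> float:
        """Minimum heat (in joules) dissipated when erasing `bits` of
        information in an environment at `temperature_k` kelvin."""
        return bits * K_B * temperature_k * math.log(2)

    # Erasing a single bit at roughly body temperature (~310 K):
    print(f"{landauer_heat(1, 310.0):.3e} J")  # ~2.97e-21 J

The bound is tiny per bit (around 3 zeptojoules at body temperature), many orders of magnitude below the energy turnover of real synaptic events.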

Information Processing in the Brain as Optimal Entropy Transport: A Theoretical Approach

Entropy, 2020

We consider brain activity from an information theoretic perspective. We analyze information processing in the brain, considering the optimality of Shannon entropy transport using the Monge–Kantorovich framework. It is proposed that some of these processes satisfy an optimal transport of informational entropy condition. This optimality condition allows us to derive an equation of the Monge–Ampère type for the information flow; via the linearization of this equation, the framework accounts for the branching structure of neurons. On this basis, we discuss a version of Murray's law in this context.
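As background on the machinery invoked here (standard formulations; the notation is not necessarily the paper's): the Monge–Kantorovich problem seeks a map T pushing a source density f forward to a target density g at minimal transport cost, and for quadratic cost the optimal map is the gradient of a convex potential \varphi solving an equation of Monge–Ampère type:

    \min_{T:\, T_{\#} f = g} \int c\bigl(x, T(x)\bigr)\, f(x)\, dx,
    \qquad
    \det\bigl(D^{2}\varphi(x)\bigr) = \frac{f(x)}{g\bigl(\nabla\varphi(x)\bigr)}

Murray's law, for comparison, is the classical branching rule for biological transport networks: at a junction, the cube of the parent vessel's radius equals the sum of the cubes of the daughter radii.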

Information and Thermodynamics in Living Systems

Are there laws of information exchange? And how do the principles of thermodynamics connect with the communication of information? We first consider the concept of information and examine the various alternatives for its definition. The reductionist approach has been to regard information as arising out of matter and energy. In such an approach, coded information systems such as DNA are regarded as accidental in terms of the origin of life, and it is argued that these then led to the evolution of all life forms through a process of increasing complexity, by natural selection operating on mutations of these first forms of life. However, scientists in the discipline of thermodynamics have long been aware that organisational systems are inherently systems with low local entropy. They have argued that the only way to achieve consistency with an evolutionary model of the universe and the common descent of all life forms is to posit a flow of low entropy into the earth's environment; in this second approach, islands of low entropy form the organisational structures found in living systems. A third alternative proposes that information is in fact non-material and that coded information systems (such as, but not restricted to, the coding of DNA in all living systems) are not defined at all by the biochemistry or physics of the molecules used to store the data. Rather than matter and energy defining the information sitting on the polymers of life, this approach posits that the reverse is the case: information has its definition outside the matter and energy on which it sits, and furthermore constrains them to operate in a highly non-equilibrium thermodynamic environment. This proposal resolves the thermodynamic issues and invokes the correct paradigm for understanding the vital area of thermodynamic/organisational interactions, which, despite the efforts of alternative paradigms, has not yet received a satisfactory explanation. Starting from the paradigm of information being defined by non-material arrangement and coding, one can then postulate laws of information exchange that have some parallels with the laws of thermodynamics which undergird such an approach. These issues are explored tentatively in this paper, laying the groundwork for further investigative study.

Entropy and statistical complexity in brain activity

Europhysics News

Another fundamental issue is the choice of a proper discriminating measure when two systems are under consideration. In classical information theory, one employs the Kullback-Leibler relative entropy for this purpose, which also has its quantum version. These are additive measures, and their Tsallis counterparts have been put forward and employed in the quantum context as well [10, 11]. There is promise in future work applying the Tsallis approach to problems arising in quantum information theory, especially in the areas of quantum algorithms and quantum computing. There has been some discussion of the thermodynamics of information, in particular quantum information. Since there are hints that quantum entanglement may not be additive, and since the concept of entropy has been introduced into the discussion, the maximization of Tsallis entropy subject to constraints such as the Bell-Clauser-Horne-Shimony-Holt observable has been studied for the purpose of inferring quantum entanglement [5, 6].
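For concreteness, the two measures under discussion are, in their standard forms, the Kullback-Leibler relative entropy between distributions P and Q and the Tsallis entropy of order q, which recovers the Shannon-Gibbs form in the limit q → 1:

    D_{\mathrm{KL}}(P \,\|\, Q) = \sum_i p_i \ln \frac{p_i}{q_i},
    \qquad
    S_q(P) = \frac{1 - \sum_i p_i^{\,q}}{q - 1}
    \;\xrightarrow[\; q \to 1 \;]{}\; -\sum_i p_i \ln p_i

Unlike the Shannon form, the Tsallis entropy is non-additive for independent subsystems, S_q(A,B) = S_q(A) + S_q(B) + (1-q)\,S_q(A)\,S_q(B), which is precisely what makes it a candidate tool where the additivity of entanglement measures is in doubt.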

Information: From Philosophic to Physics Concepts for Informational Modeling of Consciousness

Information is a frequently used concept in many fields of investigation. However, this concept is still not fully understood, especially when it is applied to consciousness and its informational structure. In this paper, the concept of information is followed from a philosophical to a physics perspective, showing in particular how it can be extended to matter in general and to living matter in particular, as a result of the intimate interaction between matter and information, with the human body appearing as a bipolar informed-matter structure. It is then detailed how this concept can be applied to consciousness, and an informational modeling of consciousness as an informational system of the human body is presented. Based on the anatomic architecture of the organism and on the inference of the specific information concepts, it is shown that the informational system of the human body can be described by seven informational subsystems, which are reflected in consciousness as corresponding cognitive centers. These results are able to explain the main properties of consciousness, both the cognitive and extra-cognitive properties of the mind, such as those observed during near-death experiences and other similar phenomena. Moreover, the results of such a modeling are compared with existing empirical concepts and models of the energetic architecture of the organism, showing their relevance for the understanding of consciousness.

Entropy: From Thermodynamics to Information Processing

Entropy, 2021

Entropy is a concept that emerged in the 19th century, when it was associated with the heat harnessed by a thermal machine to perform work during the Industrial Revolution. The 20th century, however, saw an unprecedented scientific revolution driven by one of its most essential innovations, information theory, which also encompasses a concept of entropy. The following question is therefore naturally raised: "what is the difference, if any, between the concepts of entropy in each field of knowledge?" Misconceptions persist, as there have been multiple attempts to reconcile the entropy of thermodynamics with that of information theory. Entropy is most commonly defined as "disorder", although this is not a good analogy, since "order" is a subjective human concept and "disorder" cannot always be obtained from entropy. Therefore, this paper presents a historical background on the evolution of the term "entropy", and provides mathematical evidence and logical arguments regard...
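For a side-by-side view (standard definitions, not drawn from this paper): the Gibbs entropy of statistical mechanics and the Shannon entropy of information theory share one functional form, differing only by Boltzmann's constant and the base of the logarithm:

    S = -k_B \sum_i p_i \ln p_i \quad \text{(Gibbs)},
    \qquad
    H = -\sum_i p_i \log_2 p_i \quad \text{(Shannon, in bits)}

so that, evaluated on the same probability distribution, S = (k_B \ln 2)\, H. Whether this formal identity licenses a conceptual identification is exactly the question the paper interrogates.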