A generalization of Jaynes' principle: an information-theoretic interpretation of the minimum principles of quantum mechanics and gravitation
Related papers
The universal nature of Boltzmann statistical mechanics, generalized thermodynamics, quantum mechanics, spacetime, black hole mechanics, Shannon information theory, Faraday lines of force, and the Banach-Tarski paradox (BTP) is studied. The nature of matter and Dirac anti-matter is described in terms of states of compression and rarefaction of physical space, Aristotle's fifth element, or the Casimir vacuum identified as a compressible tachyonic fluid. The model is in harmony with the perceptions of Plato, who believed that the world was formed from a formless primordial medium that was initially in a state of total chaos or "Tohu Vavohu" (Sohrab, Int J Mech 8:873-84 [1]). Hierarchies of statistical fields from photonic to cosmic scales lead to a universal scale-invariant Schrödinger equation, thus allowing for new perspectives regarding connections between classical mechanics, quantum mechanics, and chaos theory. The nature of external physical time and its connections to internal thermodynamic time and Rovelli's thermal time are described. Finally, some implications of the renormalized Planck distribution function for economic systems are examined.
Keywords: Thermodynamics, Quantum mechanics, Anti-matter, Spacetime, Thermal time, Information theory, Faraday lines of force, Banach-Tarski paradox, T.O.E.
Quantum-Informational Principles for Physics
The Frontiers Collection, 2015
It is time to take a pause of reflection on the general foundations of physics, re-examining the solidity of its most basic principles, such as the relativity and equivalence principles, which are currently under dispute for violations at the Planck scale. A constructive criticism engages us in seeking new general principles, which reduce to the old ones as approximations holding in the physical domain already explored. At the very basis of physics are epistemological and operational rules for the very formulability of the physical law and for the computability of its theoretical predictions, rules that give rise to new solid principles. These rules lead us to a quantum-information-theoretic formulation, hinging on a logical identification of the experimental protocol with the quantum algorithm.
Information in statistical physics
Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics, 2005
We review with a tutorial scope the information theory foundations of quantum statistical physics. Only a small proportion of the variables that characterize a system at the microscopic scale can be controlled, for both practical and theoretical reasons, and a probabilistic description involving the observers is required. The criterion of maximum von Neumann entropy is then used for making reasonable inferences. It means that no spurious information is introduced besides the known data. Its outcomes can be given a direct justification based on the principle of indifference of Laplace. We introduce the concept of relevant entropy associated with some set of relevant variables; it characterizes the information that is missing at the microscopic level when only these variables are known. For equilibrium problems, the relevant variables are the conserved ones, and the Second Law is recovered as a second step of the inference process. For non-equilibrium problems, the increase of the relevant entropy expresses an irretrievable loss of information from the relevant variables towards the irrelevant ones. Two examples illustrate the flexibility of the choice of relevant variables and the multiplicity of the associated entropies: the thermodynamic entropy (satisfying the Clausius-Duhem inequality) and the Boltzmann entropy (satisfying the H-theorem). The identification of entropy with missing information is also supported by the paradox of Maxwell's demon. Spin-echo experiments show that irreversibility itself is not an absolute concept: use of hidden information may overcome the arrow of time.
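A one-line sketch of the inference step described above (the generic operators A_i and data a_i are placeholders, not the paper's notation): maximizing the von Neumann entropy subject to the known expectation values yields a generalized Gibbs state whose multipliers are fixed by the data.

```latex
% Maximum von Neumann entropy subject to Tr(rho) = 1 and Tr(rho A_i) = a_i
% (requires amsmath; illustrative notation only)
\begin{align}
  \max_{\rho}\; S[\rho] &= -\mathrm{Tr}\,\rho\ln\rho
  \quad\text{s.t.}\quad \mathrm{Tr}\,\rho = 1,\;\; \mathrm{Tr}(\rho A_i) = a_i \\
  \Longrightarrow\quad \rho &= \frac{1}{Z}\,\exp\!\Big(-\sum_i \lambda_i A_i\Big),
  \qquad Z = \mathrm{Tr}\,\exp\!\Big(-\sum_i \lambda_i A_i\Big).
\end{align}
```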
Information Theory and Statistical Mechanics
Physical Review, 1957
Information theory provides a constructive criterion for setting up probability distributions on the basis of partial knowledge, and leads to a type of statistical inference which is called the maximum-entropy estimate. It is the least biased estimate possible on the given information; i.e., it is maximally noncommittal with regard to missing information. If one considers statistical mechanics as a form of statistical inference rather than as a physical theory, it is found that the usual computational rules, starting with the determination of the partition function, are an immediate consequence of the maximum-entropy principle. In the resulting "subjective statistical mechanics," the usual rules are thus justified independently of any physical argument, and in particular independently of experimental verification; whether or not the results agree with experiment, they still represent the best estimates that could have been made on the basis of the information available.
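As a minimal numerical sketch of this procedure (the energy levels and the known mean energy below are hypothetical, not taken from the paper), one can recover the maximum-entropy distribution and its partition function from a single constraint:

```python
# Maximum-entropy estimate over discrete levels E_i given only the mean energy:
# p_i = exp(-beta * E_i) / Z, with beta fixed by the constraint <E> = E_target.
import numpy as np
from scipy.optimize import brentq

E = np.array([0.0, 1.0, 2.0, 3.0])    # hypothetical energy levels
E_target = 1.2                        # hypothetical known mean energy

def mean_energy(beta):
    w = np.exp(-beta * E)
    return (w * E).sum() / w.sum()    # <E> computed from the partition function w.sum()

beta = brentq(lambda b: mean_energy(b) - E_target, -10.0, 10.0)
p = np.exp(-beta * E) / np.exp(-beta * E).sum()
S = -(p * np.log(p)).sum()            # the maximum entropy compatible with the data
print(beta, p, S)
```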
Information theory and statistical mechanics. II
Physical review, 1957
Information theory and statistical mechanics revisited
The statistical mechanics of Gibbs is a juxtaposition of subjective, probabilistic ideas on the one hand and objective, mechanical ideas on the other. From the mechanics point of view, the term 'statistical mechanics' implies that to solve physical problems, we must first acknowledge a degree of uncertainty as to the experimental conditions. Turning this problem around, it also appears that purely statistical arguments are incapable of yielding any physical insight unless some mechanical information is first assumed. In this paper, we follow the path set out by Jaynes, including elements added subsequently to that original work, to explore the consequences of the purely statistical point of view. Because of the amount of material on this subject, we have found that an ordered presentation, emphasizing the logical and mathematical foundations, removes ambiguities and difficulties associated with new applications. In particular, we show how standard methods in the equilibrium theory could have been derived simply from a description of the available problem information. In addition, our presentation leads to novel insights into questions associated with symmetry and non-equilibrium statistical mechanics. Two surprising consequences to be explored in further work are that (in)distinguishability factors are automatically predicted from the problem formulation and that a quantity related to the thermodynamic entropy production is found by considering information loss in non-equilibrium processes. Using the problem of ion-channel thermodynamics as an example, we illustrate the idea of building up complexity by successively adding information to create progressively more complex descriptions of a physical system. Our result is that such statistical mechanical descriptions can be used to create transparent, computable, experimentally relevant models that may be informed by more detailed atomistic simulations. We also derive a theory for the kinetic behavior of this system, identifying the non-equilibrium 'process' free energy functional. The Gibbs relation for this functional is a fluctuation-dissipation theorem applicable arbitrarily far from equilibrium, which captures the effect of non-local and time-dependent behavior arising from transient driving forces. Based on this work, it is clear that statistical mechanics is a general tool for constructing relationships between constraints on system information.
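A small sketch of "building up complexity by successively adding information" (the grid and constraint values are illustrative and unrelated to the ion-channel system of the paper): each new constrained moment adds one Lagrange multiplier, found here by minimizing the convex dual of the maximum-entropy problem.

```python
# Maximum-entropy distributions on a grid, first with a mean constraint only,
# then with mean and second-moment constraints added (illustrative values).
import numpy as np
from scipy.optimize import minimize
from scipy.special import logsumexp

x = np.linspace(-5.0, 5.0, 201)

def maxent(features, targets):
    F = np.vstack(features)                            # each row is a constrained function of x
    a = np.array(targets)
    dual = lambda lam: logsumexp(-lam @ F) + lam @ a   # convex dual: log Z + lam . a
    lam = minimize(dual, np.zeros(len(a))).x
    logp = -lam @ F - logsumexp(-lam @ F)
    return np.exp(logp)

p1 = maxent([x], [1.0])                     # mean only
p2 = maxent([x, x**2], [1.0, 2.0])          # mean and second moment -> Gaussian-like
print((p1 * x).sum(), (p2 * x).sum(), (p2 * x * x).sum())
```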
AIP Conference Proceedings, 2002
The Kullback-Leibler inequality is a way of comparing any two density matrices. A technique to set up the density matrix for a physical system is to use the maximum entropy principle, given the entropy as a functional of the density matrix, subject to known constraints. In conjunction with the master equation for the density matrix, these two ingredients allow us to formulate the second law of thermodynamics in its widest possible setting. Thus problems arising in both quantum statistical mechanics and quantum information can be handled. Aspects of thermodynamic concepts such as the Carnot cycle will be discussed. A model is examined to elucidate the role of entanglement in the Landauer erasure problem.
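A minimal sketch of the comparison the abstract refers to (generic qubit states chosen for illustration, not the paper's model): the density-matrix version of the Kullback-Leibler inequality is the non-negativity of the quantum relative entropy.

```python
# Quantum relative entropy S(rho || sigma) = Tr[rho (ln rho - ln sigma)] >= 0.
import numpy as np

def matrix_log(A):
    """Log of a positive-definite Hermitian matrix via eigendecomposition."""
    w, V = np.linalg.eigh(A)
    return V @ np.diag(np.log(w)) @ V.conj().T

def relative_entropy(rho, sigma):
    return float(np.real(np.trace(rho @ (matrix_log(rho) - matrix_log(sigma)))))

rho   = np.array([[0.7, 0.2], [0.2, 0.3]], dtype=complex)   # made-up full-rank state
sigma = np.array([[0.5, 0.0], [0.0, 0.5]], dtype=complex)   # maximally mixed qubit
print(relative_entropy(rho, sigma))   # non-negative, zero iff rho == sigma
```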
A link of information entropy and kinetic energy for quantum many-body systems
Physics Letters A, 2001
A direct connection between information entropy S and kinetic energy T is obtained for nuclei and atomic clusters, which establishes T as a measure of the information in a distribution. It is conjectured that this is a universal property of fermionic many-body systems. We also check rigorous inequalities previously found to hold between S and T for atoms and verify that they hold for nuclei and atomic clusters as well. These inequalities relate Shannon's information entropy in position space to an experimental quantity, namely the rms radius of nuclei and clusters.
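A toy illustration of the kind of S-T link described above (a single-particle 1-D Gaussian density, not the nuclear or cluster densities used in the paper): narrowing the density lowers the position-space Shannon entropy while raising the kinetic energy.

```python
# Position-space Shannon entropy S and kinetic energy T of a 1-D Gaussian wavefunction.
import numpy as np

x = np.linspace(-20.0, 20.0, 4001)
dx = x[1] - x[0]

def entropy_and_kinetic(sigma, hbar=1.0, m=1.0):
    psi = (1.0 / (2.0 * np.pi * sigma**2))**0.25 * np.exp(-x**2 / (4.0 * sigma**2))
    rho = psi**2                                        # normalized probability density
    nz = rho > 1e-300                                   # avoid 0 * log(0)
    S = -np.sum(rho[nz] * np.log(rho[nz])) * dx         # Shannon entropy of |psi|^2
    T = (hbar**2 / (2.0 * m)) * np.sum(np.gradient(psi, dx)**2) * dx   # <p^2> / 2m
    return S, T

for sigma in (0.5, 1.0, 2.0):
    print(sigma, entropy_and_kinetic(sigma))            # S rises and T falls as sigma grows
```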
Reinterpreting Boltzmann's H-theorem in the light of Information Theory
Cornell University - arXiv, 2013
Prompted by the realisation that the statistical entropy of an ideal gas in the micro-canonical ensemble should not fluctuate or change over time, the meaning of the H-theorem is re-interpreted from the perspective of information theory in which entropy is a measure of uncertainty. We propose that the Maxwellian velocity distribution should more properly be regarded as a limiting distribution which is identical with the distribution across particles in the asymptotic limit of large numbers. In smaller systems, the distribution across particles differs from the limiting distribution and fluctuates. Therefore the entropy can be calculated either from the actual distribution across the particles or from the limiting distribution. The former fluctuates with the distribution but the latter does not. However, only the latter represents uncertainty in the sense implied by information theory by accounting for all possible microstates. We also argue that the Maxwellian probability distribution for the velocity of a single particle should be regarded as a limiting distribution. Therefore the entropy of a single particle is well defined, as is the entropy of an N-particle system, regardless of the microstate. We argue that the meaning of the H-theorem is to reveal the underlying distribution in the limit of large numbers. Computer simulations of a hard-sphere fluid are used to demonstrate the ideas.
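A small numerical sketch of the paper's central distinction (Gaussian random velocities stand in for the hard-sphere simulation, which is not reproduced here): the entropy of the empirical distribution across a finite number of particles fluctuates below that of the limiting Maxwellian, and the gap closes as N grows.

```python
# Entropy of the actual (empirical) velocity distribution vs. the limiting distribution.
import numpy as np

rng = np.random.default_rng(0)
bins = np.linspace(-5.0, 5.0, 51)
centers = 0.5 * (bins[:-1] + bins[1:])

p_lim = np.exp(-centers**2 / 2.0)
p_lim /= p_lim.sum()                                   # limiting 1-D Maxwellian on the bins
S_lim = -np.sum(p_lim * np.log(p_lim))

for N in (100, 10_000, 1_000_000):
    v = rng.normal(size=N)                             # stand-in for simulated particle velocities
    counts, _ = np.histogram(v, bins=bins)
    p_emp = counts / N
    nz = p_emp > 0
    S_emp = -np.sum(p_emp[nz] * np.log(p_emp[nz]))
    print(N, S_emp, S_lim)                             # S_emp approaches S_lim as N grows
```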
On Definitions of Information in Physics
Foundations of Science, 2011
During the refereeing procedure of Anthropomorphic Quantum Darwinism by Thomas Durt, it became apparent in the dialogue between him and me that the definition of information in physics is something on which not all authors agree. This text aims at describing the concepts associated with information that are accepted as the standard in the physics community.