Information theory after Shannon

Towards a New Information Theory

2004

Modern information theory was established by Claude Shannon around the middle of the last century. The key challenge at the time was how to ensure reliable transmission of signals. A prototypical example illustrates the point: a signal represented by a binary sequence is to be transmitted over a noisy channel, and the channel may randomly flip each bit with a given error rate. At the receiver's end we would like to recover the original message correctly; what should we do?
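As a toy illustration of the scenario described above, the following Python sketch (my own illustration, not code from the paper) simulates a binary symmetric channel that flips each bit with probability p, and shows how a simple 3-fold repetition code with majority-vote decoding reduces the residual error rate compared to sending the raw bits.

```python
import random

def bsc(bits, p, rng):
    """Binary symmetric channel: flip each bit independently with probability p."""
    return [b ^ (rng.random() < p) for b in bits]

def encode_repetition(bits, n=3):
    """Repeat each bit n times (a simple error-correcting code)."""
    return [b for b in bits for _ in range(n)]

def decode_repetition(bits, n=3):
    """Majority vote over each block of n received bits."""
    return [int(sum(bits[i:i + n]) > n // 2) for i in range(0, len(bits), n)]

rng = random.Random(0)
message = [rng.randint(0, 1) for _ in range(10_000)]
p = 0.1  # channel flip probability (assumed value for the demo)

# Uncoded transmission: the bit error rate is roughly p.
raw = bsc(message, p, rng)
raw_errors = sum(a != b for a, b in zip(message, raw)) / len(message)

# Coded transmission: a decoded bit is wrong only if 2 or 3 of its 3 copies flip,
# so the residual error rate is about 3p^2(1-p) + p^3.
decoded = decode_repetition(bsc(encode_repetition(message), p, rng))
coded_errors = sum(a != b for a, b in zip(message, decoded)) / len(message)

print(f"uncoded error rate ~ {raw_errors:.3f}, coded error rate ~ {coded_errors:.3f}")
```

The repetition code trades bandwidth for reliability; Shannon's theory characterizes exactly how far such trade-offs can be pushed for a given channel.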

Quantum theory is not only about information

2018

In his recent book Bananaworld: Quantum Mechanics for Primates, Jeff Bub revives and provides a mature version of his influential information-theoretic interpretation of Quantum Theory (QT). In this paper, I test Bub's conjecture that QT should be interpreted as a theory about information, by examining whether his information-theoretic interpretation has the resources to explain (or explain away) quantum conundrums. The discussion of Bub's theses will also serve to investigate, more generally, whether other approaches succeed in defending the claim that QT is about quantum information. First of all, I argue that Bub's interpretation of QT as a principle theory fails to fully explain quantum non-locality. Secondly, I argue that a constructive interpretation, where the quantum state is interpreted ontically as information, also fails to provide a full explanation of quantum correlations. Finally, while epistemic interpretations might succeed in this respect, I argue that such success comes at the price of rejecting some of the most basic scientific standards of physical theories.

Information Physics: The Final Frontier

Bayesian Inference and Maximum Entropy Methods in Science and Engineering, Chamonix, France, 2010

At this point in time, two major areas of physics, statistical mechanics and quantum mechanics, rest on the foundations of probability and entropy. The last century saw several significant fundamental advances in our understanding of the process of inference, which make it clear that these are inferential theories. That is, rather than being descriptions of the behavior of the universe, these theories describe how observers can make optimal predictions about the universe. In such a picture, information plays a critical role. Moreover, small clues, such as the fact that black holes have entropy, continue to suggest that information is fundamental to physics in general. In the last decade, our fundamental understanding of probability theory has led to a Bayesian revolution. In addition, we have come to recognize that the foundations go far deeper and that Cox’s approach of generalizing a Boolean algebra to a probability calculus is the first specific example of the more fundamental idea of assigning valuations to partially ordered sets. By considering this as a natural way to introduce quantification to the more fundamental notion of ordering, one obtains an entirely new way of deriving physical laws. I will introduce this new way of thinking by demonstrating how one can quantify partially ordered sets and, in the process, derive physical laws. The implication is that physical law does not reflect the order in the universe; instead, it is derived from the order imposed by our description of the universe. Information physics, which is based on understanding the ways in which we both quantify and process information about the world around us, is a fundamentally new approach to science.
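To make the idea of quantifying a partially ordered set concrete, here is a minimal Python sketch (my own illustration under assumed weights, not code from the paper) that assigns an additive valuation to the Boolean lattice of subsets of a small ground set and checks that it obeys the sum rule v(x ∨ y) = v(x) + v(y) − v(x ∧ y), the same rule that Cox-style arguments recover for probability.

```python
from itertools import combinations

# Boolean lattice: all subsets of a small ground set, ordered by inclusion.
# The atom weights below are arbitrary non-negative values chosen for the demo.
atoms = {"a": 0.2, "b": 0.5, "c": 0.3}

def powerset(items):
    """All subsets of the ground set, i.e. the elements of the Boolean lattice."""
    items = list(items)
    return [frozenset(c) for r in range(len(items) + 1) for c in combinations(items, r)]

def valuation(x):
    """Additive valuation: the value of a lattice element is the sum of its atoms' weights."""
    return sum(atoms[a] for a in x)

lattice = powerset(atoms)

# In this lattice, join is set union and meet is set intersection. Verify the
# sum rule v(x ∨ y) = v(x) + v(y) − v(x ∧ y) for every pair of elements.
for x in lattice:
    for y in lattice:
        join, meet = x | y, x & y
        assert abs(valuation(join) - (valuation(x) + valuation(y) - valuation(meet))) < 1e-12

print("sum rule holds for all", len(lattice) ** 2, "pairs")
```

When the atom weights are normalized to sum to one, this valuation behaves exactly like a probability measure, which is the sense in which quantifying an ordered structure can recover familiar physical and inferential laws.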