A Possible Extension of Shannon's Information Theory

Shannon Entropy, Renyi Entropy, and Information

2000

This memo contains proofs that the Shannon entropy is the limiting case of both the Renyi entropy and the information. These results are also confirmed experimentally. We conclude with some general observations on the utility of entropy measures.
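The limiting relation is easy to see from the standard definition of the Renyi entropy, H_a(p) = log2(sum_i p_i^a) / (1 - a): as a -> 1 the expression tends, by L'Hopital's rule, to the Shannon entropy -sum_i p_i log2 p_i. Below is a minimal numerical sketch of that convergence; the distribution p is hypothetical, not taken from the memo.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum p_i log2 p_i, in bits (0 log 0 taken as 0)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def renyi_entropy(p, alpha):
    """Renyi entropy H_alpha(p) = log2(sum p_i^alpha) / (1 - alpha), for alpha != 1."""
    p = np.asarray(p, dtype=float)
    return np.log2(np.sum(p ** alpha)) / (1.0 - alpha)

p = [0.5, 0.25, 0.125, 0.125]  # hypothetical distribution; H(p) = 1.75 bits exactly
print(f"Shannon entropy: {shannon_entropy(p):.6f} bits")
for alpha in [0.5, 0.9, 0.99, 0.999, 1.001, 1.01, 1.1, 2.0]:
    # H_alpha approaches the Shannon value from both sides as alpha -> 1
    print(f"alpha = {alpha:6.3f}  ->  H_alpha = {renyi_entropy(p, alpha):.6f} bits")
```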

Shannon theory beyond quantum: information content of a source

2021

The information content of a source is defined in terms of the minimum number of bits needed to store the output of the source in a perfectly recoverable way. A similar definition can be given for quantum sources, with qubits replacing bits. In these cases the information content can be quantified through Shannon’s and von Neumann’s entropy, respectively. Here we extend the definition of information content to operational probabilistic theories, and prove relevant properties such as subadditivity and the relation between the purity and the information content of a state. We prove the consistency of the present notion of information content when applied to the classical and the quantum case. Finally, the relation to one of the notions of entropy that can be introduced in general probabilistic theories, the maximum accessible information, is given in terms of a lower bound.
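For the two special cases the abstract mentions, the relevant quantities are standard. Below is a minimal sketch, assuming the usual formulas: Shannon entropy for a classical source, and von Neumann entropy S(rho) = -Tr(rho log2 rho) computed from the eigenvalues of a density matrix. The example source and state are hypothetical, and the paper's operational-probabilistic-theory framework is not modeled here.

```python
import numpy as np

def shannon_entropy(p):
    """Bits per symbol needed, asymptotically, to store a classical source losslessly."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log2 rho), evaluated on the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]  # drop numerically zero eigenvalues
    return -np.sum(evals * np.log2(evals))

# Classical source: a biased coin.
print(shannon_entropy([0.9, 0.1]))  # ~0.469 bits per symbol

# Quantum source: equal mixture of the non-orthogonal states |0> and |+>.
ket0 = np.array([1.0, 0.0])
ketp = np.array([1.0, 1.0]) / np.sqrt(2)
rho = 0.5 * np.outer(ket0, ket0) + 0.5 * np.outer(ketp, ketp)
print(von_neumann_entropy(rho))  # ~0.601 qubits per symbol, below 1 full qubit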

Towards a New Information Theory

2004

Modern information theory was established by Claude Shannon around the middle of the past century. The key challenge at the time was how to ensure reliable transmission of signals. A prototypical example is enough to illustrate the point. A signal represented by a binary sequence is to be transmitted over a noisy channel, which may randomly flip each bit with a given error rate. At the receiver side we would like to recover the original message correctly. What should we do?
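Shannon's answer is to add redundancy before transmission. Below is a minimal sketch of the simplest such scheme, assuming a binary symmetric channel with an illustrative 10% flip rate and a 3-repetition code decoded by majority vote; the parameters are hypothetical, not taken from the paper.

```python
import random

def bsc(bits, flip_prob, rng):
    """Binary symmetric channel: flips each bit independently with probability flip_prob."""
    return [b ^ (rng.random() < flip_prob) for b in bits]

def encode_repetition(bits, n=3):
    """Repeat each bit n times, a simple error-correcting code."""
    return [b for b in bits for _ in range(n)]

def decode_repetition(bits, n=3):
    """Majority vote over each block of n received bits."""
    return [int(sum(bits[i:i + n]) > n // 2) for i in range(0, len(bits), n)]

rng = random.Random(0)
message = [rng.randint(0, 1) for _ in range(10_000)]

received_raw = bsc(message, 0.1, rng)
received_coded = decode_repetition(bsc(encode_repetition(message), 0.1, rng))

print("uncoded error rate:     ",
      sum(a != b for a, b in zip(message, received_raw)) / len(message))
print("3-repetition error rate:",
      sum(a != b for a, b in zip(message, received_coded)) / len(message))
```

The repetition code cuts the bit error rate from about 0.1 to about 0.028, at the cost of tripling the number of transmitted bits. Shannon's noisy-channel coding theorem shows far better trade-offs exist: reliable communication is possible at any rate below the channel capacity, here 1 - H2(0.1), roughly 0.53 bits per channel use.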