Information, Entropy and Their Geometric Structures

Shannon's entropy revisited

arXiv, 2015

I consider the effect of a finite sample size on the entropy of a sample of independent events. I propose a formula for entropy which satisfies Shannon's axioms and which reduces to Shannon's entropy when the sample size is infinite. I discuss the physical meaning of the difference between the two formulas, including some practical implications, such as the maximum achievable channel utilization and the minimum achievable communication protocol overhead for a given message size.
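
The abstract does not reproduce the paper's finite-sample formula. As a point of reference, here is a minimal sketch contrasting the plug-in Shannon entropy of a finite sample with the classical Miller-Madow bias correction, which likewise converges to Shannon's entropy as the sample size grows (the correction shown is an illustration, not the paper's formula):

```python
import math
from collections import Counter

def plugin_entropy(sample):
    """Plug-in (maximum-likelihood) Shannon entropy of a sample, in bits."""
    n = len(sample)
    return -sum((c / n) * math.log2(c / n) for c in Counter(sample).values())

def miller_madow_entropy(sample):
    """Plug-in entropy plus the Miller-Madow correction (K - 1) / (2 n ln 2),
    where K is the number of distinct symbols observed and n the sample size.
    The correction vanishes as n grows, recovering Shannon's entropy."""
    n, k = len(sample), len(set(sample))
    return plugin_entropy(sample) + (k - 1) / (2 * n * math.log(2))

sample = list("abracadabra")
print(plugin_entropy(sample))        # biased low for small samples
print(miller_madow_entropy(sample))  # bias-corrected estimate
```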

Shannon Entropy, Renyi Entropy, and Information

2000

This memo contains proofs that the Shannon entropy is the limiting case of both the Renyi entropy and the information. These results are also confirmed experimentally. We conclude with some general observations on the utility of entropy measures.
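
For reference, the Renyi entropy of order q is H_q(p) = (1/(1-q)) log2(sum_i p_i^q), and it tends to the Shannon entropy as q -> 1. A minimal numerical check of that limit (the distribution is chosen arbitrarily for illustration):

```python
import math

def shannon_entropy(p):
    """Shannon entropy in bits."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def renyi_entropy(p, q):
    """Renyi entropy of order q (q > 0, q != 1), in bits."""
    return math.log2(sum(x ** q for x in p)) / (1 - q)

p = [0.5, 0.25, 0.125, 0.125]
print(shannon_entropy(p))          # exactly 1.75 bits
for q in (0.9, 0.99, 0.999, 1.001):
    print(q, renyi_entropy(p, q))  # approaches 1.75 as q -> 1
```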

Shannon's information is not entropy

Physics Letters A, 1991

Contrary to widespread belief, Shannon's information and entropy are not in general equivalent. The purpose of this Letter is to discuss their conceptual difference and to pinpoint the mathematical reason for it. This is further illustrated through a toy model consisting of a harmonic oscillator in a coherent state, showing explicitly the dependence of Shannon's information on the class of quantum states the system is in. The interrelation between Shannon's information and entropy is reestablished after proving a theorem that identifies the class of states maximizing the (missing) information. We conclude that entropy is the maximum amount of missing information.

A new basic theorem of information theory

The Annals of Mathematical Statistics, 1953

A new theorem for noisy channels, similar to Shannon's in its general statement but giving sharper results, is formulated and proven. It is shown that the equivocation of the channel defined by the present theorem vanishes with increasing code length. A continuous channel is defined in a manner that permits the application of these results. A detailed proof of the equivalence of this definition and Shannon's is given in an appendix.

Derivation and Development of Noise Channel Information Transmission Capacity

Proceedings of the 2018 International Conference on Network, Communication, Computer Engineering (NCCE 2018), 2018

In this paper, we derive one of Shannon's theorems, the maximum capacity theorem for noisy channels, C = B log2(1 + S/N). The derivation of the noisy continuous channel capacity is based on the derivation for discrete channels, and most of the conclusions follow as limits of the corresponding theorems for discrete channels. The channel capacity is the maximum transmission rate of information, which is the difference between the source entropy and the lost information entropy. Using the independence of the signal and the noise, the channel capacity can be obtained as the entropy difference between the received signal and the white Gaussian noise. Based on this theorem, the channel capacity can be further improved in engineering practice, for example in spread-spectrum communication.
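
As a concrete illustration of the capacity formula, a minimal sketch that evaluates C = B log2(1 + S/N) for a hypothetical 3 kHz channel at 30 dB SNR (the numbers are assumptions for the example):

```python
import math

def channel_capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley capacity C = B * log2(1 + S/N), in bits per second."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Hypothetical example: a 3 kHz telephone-grade channel at 30 dB SNR.
snr = 10 ** (30.0 / 10)              # dB to linear power ratio
print(channel_capacity(3000, snr))   # roughly 29.9 kbit/s
```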

A Possible Extension of Shannon's Information Theory

Entropy, 2001

As a possible generalization of Shannon's information theory, we review the formalism based on the non-logarithmic information content parametrized by a real number q, which exhibits nonadditivity of the associated uncertainty. Moreover, it is shown that establishing the concept of mutual information is important for the generalization.
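
The abstract does not state the functional form explicitly. A widely used non-logarithmic information content of this kind is the Tsallis entropy S_q = (1 - sum_i p_i^q)/(q - 1), which is nonadditive for independent systems; the sketch below assumes that form for illustration:

```python
def tsallis_entropy(p, q):
    """Tsallis entropy S_q = (1 - sum_i p_i**q) / (q - 1); recovers the
    Shannon entropy (in nats) in the limit q -> 1."""
    return (1 - sum(x ** q for x in p)) / (q - 1)

def joint(p, r):
    """Joint distribution of two independent systems."""
    return [a * b for a in p for b in r]

p, r, q = [0.5, 0.5], [0.9, 0.1], 2.0
s_p, s_r = tsallis_entropy(p, q), tsallis_entropy(r, q)
# Nonadditivity: S_q(A,B) = S_q(A) + S_q(B) + (1 - q) * S_q(A) * S_q(B)
print(tsallis_entropy(joint(p, r), q))  # 0.59
print(s_p + s_r + (1 - q) * s_p * s_r)  # 0.59 as well
```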

Information theory after Shannon

Communications of The ACM, 2011

Researchers have founded a Science of Information Center with the goal of moving beyond Shannon. They aim to develop principles that encompass such concepts as structure, time, space, and semantics. These principles might help design better mobile networks, lead to new insights in biology and neuroscience, drive research in quantum computing, and even aid our understanding of social networks and economic behavior.

What is Shannon information?

Although the use of the word 'information', with different meanings, can be traced back to ancient and medieval texts (see Adriaans 2013), it is only in the 20th century that the term begins to acquire its present-day sense. Nevertheless, the pervasiveness of the notion of information, both in our everyday life and in our scientific practice, does not imply agreement about the content of the concept. As Luciano Floridi stresses, it is a polysemantic concept associated with different phenomena, such as communication, computation, knowledge, reference, meaning, and truth. In the second half of the 20th century, philosophy began to direct its attention to this omnipresent but intricate concept in an effort to unravel the tangle of meanings surrounding it.

A Modification to the Shannon Formula

2020

Most recently, an experiment was published that cannot be described by the Shannon formula. We give the reader the details needed to check the deviation between the Shannon formula and the experiment. We then derive a modification of the formula. Consequences of the modification are extended generality and clear bounds on the average rate of reliable information transfer over a noisy channel of finite bandwidth.