Statistical view of Information theory

Information Theoretic Measures and Their Applications

Entropy, 2020

The concept of entropy, an ever-growing physical quantity that measures the degree of decay of order in a physical system, was introduced by Rudolf Clausius in 1865 through an elegant formulation of the second law of thermodynamics [...].

Information Theory, Relative Entropy and Statistics

Lecture Notes in Computer Science, 2009

It is commonly assumed that computers process information. But what is information? In a technical, important, but nevertheless rather narrow sense, Shannon's information theory gives a first answer to this question. This theory focuses on measuring the information content of a message. Essentially, this measure is the reduction of uncertainty obtained by receiving a message. The uncertainty of a situation of ignorance is in turn measured by entropy. This theory has had an immense impact on the technology of information storage, data compression, information transmission, and coding, and it remains a very active domain of research.
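To make the "uncertainty reduction" reading concrete, here is a minimal Python sketch (the distributions and names are our own illustration, not part of the lecture notes): entropy measures the prior uncertainty, and the information carried by a message is the drop in entropy once the message is received.

    import math

    def shannon_entropy(probs):
        # Entropy in bits of a discrete distribution (zero-probability terms skipped).
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # Before the message: four equally likely outcomes, H = 2 bits.
    prior = [0.25, 0.25, 0.25, 0.25]

    # The message rules out two outcomes; the rest are re-weighted, H = 1 bit.
    posterior = [0.5, 0.5, 0.0, 0.0]

    gain = shannon_entropy(prior) - shannon_entropy(posterior)
    print(f"information gained: {gain:.2f} bits")  # 1.00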

An Alternative to Entropy in the Measurement of Information

Entropy, 2004

Entropy has been the main tool in the analysis of the concept of information since information theory was conceived in the work of Shannon more than fifty years ago. There have been some attempts to find a more general measure of information, but their outcomes were more of formal, theoretical interest, and none has provided better insight into the nature of information. The strengths of entropy seemed so obvious that little effort has been made to find an alternative to entropy that gives different values but is consistent with entropy, in the sense that the results obtained in information theory thus far can be reproduced with the new measure. In this article the need for such an alternative measure is demonstrated based on a historical review of the problems with the conceptualization of information. Then, an alternative measure is presented in the context of a modified definition of information applicable outside of the conduit metaphor of Shannon's approach, and formulated without reference to uncertainty. It has several features superior to those of entropy. For instance, unlike entropy it can be easily and consistently extended to continuous probability distributions, and unlike differential entropy this extension is always positive and invariant with respect to linear transformations of coordinates.
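The two shortcomings of differential entropy named at the end of this abstract are easy to verify numerically. The sketch below (our own illustration, not the article's) uses the uniform family, whose differential entropy is ln(width) nats: it is negative for widths below 1, and a linear change of coordinates shifts it, so it is not invariant.

    import math

    def uniform_diff_entropy(width):
        # Differential entropy (nats) of Uniform(0, width) is ln(width).
        return math.log(width)

    print(uniform_diff_entropy(0.5))  # -0.693...: negative, unlike discrete entropy
    print(uniform_diff_entropy(1.0))  #  0.0
    # Rescaling y = 2x maps Uniform(0, 1) to Uniform(0, 2) and adds ln 2,
    # so differential entropy is not invariant under linear coordinate changes:
    print(uniform_diff_entropy(2.0))  #  0.693...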

Information Theory and its Applications

Cryptography, Information Theory, and Error-Correction, 2021

In this note, we look at several definitions of entropy and some of their consequences in information theory. We also obtain the entropy and relative entropy for the general error distribution, which is used to model errors that are not normally distributed. For a continuous random variable Y with density f, the differential entropy is $H(Y) = H(f) = -\int f(y) \log f(y)\, dy$.
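Reading "general error distribution" as the generalized normal (exponential power) family, which is the usual identification, the differential entropy with unit scale has the closed form h = 1/β − log(β / (2Γ(1/β))). The sketch below (our own check, not the book's code) compares this formula against scipy's built-in value; β = 2 recovers a Gaussian.

    import math
    from scipy.special import gamma
    from scipy.stats import gennorm  # generalized normal / general error distribution

    def gen_error_entropy(beta):
        # Closed-form differential entropy (nats) for shape beta, scale 1.
        return 1.0 / beta - math.log(beta / (2.0 * gamma(1.0 / beta)))

    for beta in (1.0, 2.0, 4.0):  # beta=2 is Gaussian; smaller beta has heavier tails
        print(beta, gen_error_entropy(beta), gennorm.entropy(beta))
    # The two columns agree; e.g. beta=2 gives 0.5*ln(pi*e), about 1.0724 nats.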

Shannon Entropy, Rényi Entropy, and Information

2000

This memo contains proofs that the Shannon entropy is the limiting case of both the Rényi entropy and the information. These results are also confirmed experimentally. We conclude with some general observations on the utility of entropy measures.
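The limiting relationship is easy to reproduce numerically. The sketch below (our own version of such an experiment, not the memo's code) evaluates the Rényi entropy H_α = log(Σ p_i^α) / (1 − α) for α approaching 1 and compares it with the Shannon value.

    import math

    def renyi_entropy(probs, alpha):
        # Renyi entropy of order alpha (alpha > 0, alpha != 1), in nats.
        return math.log(sum(p ** alpha for p in probs)) / (1.0 - alpha)

    p = [0.5, 0.25, 0.125, 0.125]
    for alpha in (2.0, 1.5, 1.1, 1.01, 1.001):
        print(alpha, renyi_entropy(p, alpha))           # tends to the Shannon value
    print("Shannon:", -sum(q * math.log(q) for q in p))  # 1.2130 nats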

On measure theoretic definitions of generalized information measures and …

arXiv preprint cs/0601080

Abstract. Though Shannon entropy of a probability measure P, defined as $-\int_X \frac{dP}{d\mu} \ln \frac{dP}{d\mu}\, d\mu$ on a measure space (X, M, µ), does not qualify itself as an information measure (it is not a natural extension of the discrete case), maximum entropy (ME) prescriptions in the ...
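For orientation, a standard computation (not specific to this preprint): when X is countable and µ is the counting measure, the Radon–Nikodym derivative dP/dµ is just the probability mass function p, and the measure-theoretic expression reduces to the discrete Shannon entropy:

    -\int_X \frac{dP}{d\mu} \ln \frac{dP}{d\mu}\, d\mu
      \;=\; -\sum_{x \in X} p(x) \ln p(x),
      \qquad p(x) = \frac{dP}{d\mu}(x) \quad (\mu = \text{counting measure}).

When µ is instead a probability measure, the same integral equals −D(P‖µ), a negative relative entropy, which is one way to see why the expression behaves differently from the discrete case.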

Information Theory and Generalized Statistics

Lecture Notes in Physics, 2003

In this lecture we present a discussion of generalized statistics based on Rényi's, Fisher's and Tsallis's measures of information. The unifying conceptual framework which we employ here is provided by information theory. Important applications of generalized statistics to systems with (multi-)fractal structure are examined. Such methods have found use in areas as diverse as astronomy, geophysics, biology, medical diagnosis and economics. For the latest developments in classical MaxEnt the interested reader may consult ref. [...]
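Of the three measures named here, the Tsallis entropy S_q = (1 − Σ p_i^q)/(q − 1) is the least standard; like the Rényi entropy, it recovers the Shannon entropy in the q → 1 limit, as the short sketch below (our own illustration, not the lecture's code) shows.

    import math

    def tsallis_entropy(probs, q):
        # Tsallis entropy S_q = (1 - sum p_i^q) / (q - 1); Shannon as q -> 1.
        return (1.0 - sum(p ** q for p in probs)) / (q - 1.0)

    p = [0.5, 0.25, 0.125, 0.125]
    for q in (2.0, 1.5, 1.01, 1.001):
        print(q, tsallis_entropy(p, q))                     # tends to the Shannon value
    print("Shannon (nats):", -sum(x * math.log(x) for x in p))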