Information Measures in Perspective

On the Measure of the Information in a Statistical Experiment

2008

Abstract. Setting aside experimental costs, the choice of an experiment is usually formulated in terms of the maximization of a measure of information, often presented as an optimality design criterion. However, there does not seem to be universal agreement on what objects can qualify as a valid measure of the information in an experiment. In this article we explicitly state a minimal set of requirements that must be satisfied by all such measures. Under that framework, the measure of the information in an experiment is equivalent to the measure of the variability of its likelihood ratio statistics or, what is the same, to the measure of the variability of its posterior-to-prior ratio statistics and to the measure of the variability of the distribution of the posterior distributions yielded by it. The larger that variability, the more peaked the likelihood functions and posterior distributions that tend to be yielded by the experiment, and the more informative the...
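
As a minimal sketch of this equivalence (my own illustration, assuming a Beta-Bernoulli experiment; none of the names or numbers come from the article), one can simulate the posterior-to-prior ratio statistic for experiments of increasing size and watch its variability grow, which corresponds to increasingly peaked posteriors:

```python
import numpy as np
from scipy.stats import beta

# Illustrative sketch (assumed setup): Bernoulli trials with a Beta(a, b) prior.
# The log posterior-to-prior ratio is evaluated at a fixed reference value;
# its spread across repeated experiments is larger for larger n, i.e. the
# experiment with the more variable ratio statistic is the more informative one.
rng = np.random.default_rng(0)
a, b = 2.0, 2.0        # prior hyperparameters (assumed)
theta_ref = 0.5        # reference parameter value at which the ratio is evaluated
n_sim = 20_000

for n in (5, 20, 80):
    theta = rng.beta(a, b, size=n_sim)      # parameters drawn from the prior
    s = rng.binomial(n, theta)              # simulated experiment outcomes
    log_ratio = (beta.logpdf(theta_ref, a + s, b + n - s)
                 - beta.logpdf(theta_ref, a, b))
    print(f"n={n:3d}  sd of log posterior-to-prior ratio = {log_ratio.std():.3f}")
```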

Statistical view of Information theory

The paper provides a statistical interpretation of the properties of entropy-information measures, assessing the extent of their agreement with statistical theorems and their degree of success in statistical applications.

Information Theory, Relative Entropy and Statistics

Lecture Notes in Computer Science, 2009

It is commonly assumed that computers process information. But what is information? In a technical, important, but nevertheless rather narrow sense, Shannon's information theory gives a first answer to this question. This theory focuses on measuring the information content of a message. Essentially, this measure is the reduction of uncertainty obtained by receiving a message, where the uncertainty of a situation of ignorance is in turn measured by entropy. This theory has had an immense impact on the technology of information storage, data compression, information transmission and coding, and it remains a very active domain of research.
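
A minimal sketch of this "reduction of uncertainty" reading (my own example, not from the lecture notes): for a binary source observed through a noisy channel, the information carried by the received message is the prior entropy of the source minus the expected entropy that remains after reception.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a discrete distribution."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# Assumed example: a uniform binary source X sent through a binary symmetric
# channel with crossover probability eps, received as Y.
p_x = np.array([0.5, 0.5])
eps = 0.1
p_y_given_x = np.array([[1 - eps, eps],
                        [eps, 1 - eps]])

p_xy = p_x[:, None] * p_y_given_x       # joint distribution of (X, Y)
p_y = p_xy.sum(axis=0)

h_prior = entropy(p_x)                  # uncertainty before the message: H(X)
h_post = sum(p_y[j] * entropy(p_xy[:, j] / p_y[j]) for j in range(len(p_y)))
print(f"information gained by receiving the message: {h_prior - h_post:.3f} bits")
```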

An Extension to the Revised Approach in the Assessment of Informational Entropy

Entropy

This study attempts to extend the prevailing definition of informational entropy, where entropy relates to the amount of reduction of uncertainty or, indirectly, to the amount of information gained through measurements of a random variable. The approach adopted herein describes informational entropy not as an absolute measure of information, but as a measure of the variation of information. This makes it possible to obtain a single value for informational entropy, instead of several values that vary with the selection of the discretizing interval, when discrete probabilities of hydrological events are estimated through relative class frequencies and discretizing intervals. Furthermore, the present work introduces confidence limits for the informational entropy function, which facilitates a comparison between the uncertainties of various hydrological processes with different scales of magnitude and different probability structures. The work addresses hydrologists and environmental engineers more than it does mathematicians and statisticians. In particular, it is intended to help solve information-related problems in hydrological monitoring design and assessment. This paper first considers the selection of probability distributions of best fit to hydrological data, using generated synthetic time series. Next, it attempts to assess hydrometric monitoring duration in a network, this time using observed runoff data series. In both applications, it focuses on the theoretical background for the extended definition of informational entropy. The methodology is shown to give valid results in each case.
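
To see the problem that this extension addresses (a hedged sketch with synthetic data, not the authors' procedure), one can estimate entropy from relative class frequencies under different discretizing intervals; the estimate shifts by roughly the logarithm of the bin-width ratio, so it is not a single value:

```python
import numpy as np

def discrete_entropy(sample, bin_width):
    """Entropy (nats) estimated from relative class frequencies for a given
    discretizing interval."""
    edges = np.arange(sample.min(), sample.max() + bin_width, bin_width)
    counts, _ = np.histogram(sample, bins=edges)
    p = counts[counts > 0] / counts.sum()
    return float(-(p * np.log(p)).sum())

# Assumed synthetic series standing in for hydrological data.
rng = np.random.default_rng(1)
x = rng.lognormal(mean=2.0, sigma=0.5, size=5000)

for width in (0.5, 1.0, 2.0):
    print(f"discretizing interval {width}:  H = {discrete_entropy(x, width):.3f} nats")
# Each doubling of the interval lowers the estimate by about log(2) = 0.693 nats,
# which is the interval dependence the extended definition removes.
```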

Some Information Theoretic Ideas Useful in Statistical Inference

Methodology and Computing in Applied Probability, 2007

In this paper we discuss four information theoretic ideas and present their implications for statistical inference: (1) Fisher information and divergence generating functions, (2) information optimum unbiased estimators, (3) the information content of various statistics, and (4) characterizations based on Fisher information.
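
As a small, assumed illustration of the first item (not taken from the paper), the Fisher information about a parameter can be read off the curvature of the log-likelihood; for a normal mean with known variance it is n/sigma^2:

```python
import numpy as np

def observed_information(loglik, theta, h=1e-3):
    """Negative second derivative of the log-likelihood at theta,
    approximated with a central finite difference (illustrative helper)."""
    return -(loglik(theta + h) - 2.0 * loglik(theta) + loglik(theta - h)) / h**2

# Assumed example: normal sample with known sigma, parameter of interest is the mean.
rng = np.random.default_rng(2)
sigma, n = 2.0, 50
x = rng.normal(loc=1.0, scale=sigma, size=n)

loglik = lambda mu: float(-0.5 * np.sum((x - mu) ** 2) / sigma**2)
print("observed information at the MLE:", observed_information(loglik, x.mean()))
print("theoretical Fisher information n/sigma^2:", n / sigma**2)
```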

A Study of Generalized Information Measures & Their Inequalities

Journal of Emerging Technologies and Innovative Research, 2018

Information theory is a mathematical theory of communication. It informs research in disciplines such as coding theory, semantics, decision theory, economics, radar detection, biology, and psychology, among many others that demonstrate its necessity and importance. Shannon modelled communication as a probabilistic phenomenon, introducing in the process the concepts of entropy and mutual information. Entropy is used to measure the uncertainty of a random variable associated with a randomized experiment, while mutual information quantifies the dependency between two random variables. Information theory examines the theoretical problems associated with the transmission of information over communication channels, including the measurement of uncertainty (information) and practical, economical methods for encoding information for transmission. An important feature ...
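
As a brief, assumed illustration of the dependency reading mentioned here, the mutual information between two discrete variables can be computed as the Kullback-Leibler divergence of their joint distribution from the product of its marginals, which is zero exactly when the variables are independent:

```python
import numpy as np

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) in bits for discrete distributions."""
    p = np.asarray(p, float).ravel()
    q = np.asarray(q, float).ravel()
    mask = p > 0
    return float((p[mask] * np.log2(p[mask] / q[mask])).sum())

# Assumed joint distribution of two dependent binary variables X and Y.
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])
p_x = p_xy.sum(axis=1)
p_y = p_xy.sum(axis=0)

print("I(X;Y) =", kl_divergence(p_xy, np.outer(p_x, p_y)), "bits")
```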

Statistical Problem Classes and Their Links to Information Theory

Econometric Reviews, 2014

ABSTRACT We begin by recalling the tripartite division of statistical problems into three classes, M-closed, M-complete, and M-open, and then review the key ideas of introductory Shannon theory. Focusing on the related but distinct goals of model selection and prediction, we argue that different techniques for these two goals are appropriate for the three different problem classes. For M-closed problems we give a relative entropy justification that the Bayes information criterion (BIC) is appropriate for model selection and that the Bayes model average is information optimal for prediction. For M-complete problems, we discuss the principle of maximum entropy and a way to use the rate distortion function to bypass the inaccessibility of the true distribution. For prediction in the M-complete class, there is little work done on information based model averaging, so we discuss the Akaike information criterion (AIC) and its properties and variants. For the M-open class, we argue that essentially only predictive criteria are suitable. Thus, as an analog to model selection, we present the key ideas of prediction along a string under a codelength criterion and propose a general form of this criterion. Since little work appears to have been done on information methods for general prediction in the M-open class of problems, we mention the field of information theoretic learning in certain general function spaces.
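
As a hedged sketch of the model-selection side (generic textbook formulas, not the authors' derivation), BIC and AIC can be computed from a fitted model's maximized log-likelihood, with BIC penalizing each parameter by log n and AIC by 2; the assumed example below compares polynomial regressions of different degrees:

```python
import numpy as np

def gaussian_loglik(y, y_hat):
    """Maximized Gaussian log-likelihood of a regression fit (noise variance profiled out)."""
    n = len(y)
    rss = float(np.sum((y - y_hat) ** 2))
    return -0.5 * n * (np.log(2.0 * np.pi * rss / n) + 1.0)

def aic(loglik, k):
    return 2.0 * k - 2.0 * loglik

def bic(loglik, k, n):
    return k * np.log(n) - 2.0 * loglik

# Assumed data: a quadratic signal plus noise; candidate models are polynomials.
rng = np.random.default_rng(3)
x = np.linspace(0.0, 1.0, 100)
y = 1.0 + 2.0 * x - 3.0 * x**2 + rng.normal(scale=0.2, size=x.size)

for degree in (1, 2, 5):
    coef = np.polyfit(x, y, degree)
    ll = gaussian_loglik(y, np.polyval(coef, x))
    k = degree + 2      # polynomial coefficients plus the noise variance
    print(f"degree {degree}:  AIC = {aic(ll, k):7.2f}   BIC = {bic(ll, k, len(y)):7.2f}")
```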

Information Theory and Generalized Statistics

Lecture Notes in Physics, 2003

In this lecture we present a discussion of generalized statistics based on Rényi's, Fisher's and Tsallis's measures of information. The unifying conceptual framework which we employ here is provided by information theory. Important applications of generalized statistics to systems with (multi-)fractal structure are examined.
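
To make the generalized measures named here concrete (a sketch with an assumed distribution, not taken from the lecture), Rényi and Tsallis entropies are one-parameter families that both recover the Shannon entropy in the limit q → 1:

```python
import numpy as np

def shannon(p):
    """Shannon entropy in nats."""
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

def renyi(p, q):
    """Rényi entropy of order q (q > 0, q != 1), in nats."""
    return float(np.log(np.sum(p ** q)) / (1.0 - q))

def tsallis(p, q):
    """Tsallis entropy of index q (q != 1)."""
    return float((1.0 - np.sum(p ** q)) / (q - 1.0))

p = np.array([0.5, 0.25, 0.15, 0.10])    # assumed discrete distribution
for q in (0.5, 0.999, 2.0):
    print(f"q = {q}:  Renyi = {renyi(p, q):.4f}   Tsallis = {tsallis(p, q):.4f}")
print(f"Shannon: {shannon(p):.4f}")      # the q -> 1 limit of both families
```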