A Note on the Comparison of the Quadratic and Shannon’s Mutual Information

The mutual information. Estimation in the sampling without replacement

Kybernetika, Vol. 23, Num. 5, pp. 407-419, 1987

In previous papers, the "mutual information of order β concerning two random variables" was defined from the concept of conditional entropy of order β (Z. Daróczy, 1970). The aim of the present paper is to approximate the value of the mutual information of order β = 2 in a large population on the basis of a sample drawn at random and without replacement from it. This purpose is achieved by obtaining an unbiased estimator of that value and estimating its mean square error. In addition, a contrast between samplings with and without ...
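For readers unfamiliar with the order-β quantities, the following minimal Python sketch computes a naive plug-in estimate of the order-2 ("quadratic") mutual information from a joint contingency table, using the convention H_2(P) = 1 − Σ p_i² (normalizing constants vary across authors). It only illustrates the quantity being estimated; it is not the unbiased, without-replacement estimator derived in the paper.

```python
import numpy as np

def quadratic_entropy(p):
    """Quadratic (order-2) entropy in the unnormalized form 1 - sum(p_i^2).

    Conventions differ by a constant factor (e.g. Daróczy's normalization);
    this sketch ignores that factor.
    """
    p = np.asarray(p, dtype=float)
    return 1.0 - np.sum(p ** 2)

def plug_in_quadratic_mi(counts):
    """Naive plug-in estimate of I_2(X; Y) = H_2(Y) - H_2(Y | X)
    from a joint contingency table.

    counts[i, j] is the number of sampled units with X = x_i and Y = y_j.
    This plug-in estimate is biased; the paper's point is to remove that bias
    under sampling without replacement.
    """
    counts = np.asarray(counts, dtype=float)
    p_xy = counts / counts.sum()
    p_x = p_xy.sum(axis=1)   # marginal of X
    p_y = p_xy.sum(axis=0)   # marginal of Y
    # Conditional quadratic entropy H_2(Y | X) = sum_i p(x_i) * H_2(Y | X = x_i)
    h_y_given_x = sum(
        p_x[i] * quadratic_entropy(counts[i] / counts[i].sum())
        for i in range(counts.shape[0])
        if counts[i].sum() > 0
    )
    return quadratic_entropy(p_y) - h_y_given_x

# Example: a small 2x3 contingency table
table = np.array([[30, 10, 5],
                  [5, 20, 30]])
print(plug_in_quadratic_mi(table))
```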

Estimation of Mutual Information: A Survey

Rough Sets and Knowledge Technology, 2009

A common problem found in statistics, signal processing, data analysis and image processing research is the estimation of mutual information, which tends to be difficult. The aim of this survey is threefold: an introduction for those new to the field, an overview for those working in the field and a reference for those searching for literature on different estimation methods. Comparison studies on mutual information estimation are also considered. The paper starts with a description of entropy and mutual information and closes with a discussion of the performance of different estimation methods and some future challenges.
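As a concrete example of the kind of baseline method such a survey covers, here is a minimal histogram ("plug-in") estimator in Python. It is a generic illustration, not any specific estimator discussed in the survey; the bin count is an assumed tuning parameter.

```python
import numpy as np

def histogram_mi(x, y, bins=16):
    """Plug-in (histogram) estimate of I(X; Y) in nats for continuous samples.

    Bins both variables, forms the empirical joint distribution, and applies
    I = sum p(x, y) * log(p(x, y) / (p(x) p(y))).  Too few bins hides
    structure; too many inflates the estimate.
    """
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    p_xy = joint / joint.sum()
    p_x = p_xy.sum(axis=1, keepdims=True)
    p_y = p_xy.sum(axis=0, keepdims=True)
    mask = p_xy > 0
    return float(np.sum(p_xy[mask] * np.log(p_xy[mask] / (p_x @ p_y)[mask])))

# Example: correlated Gaussian pair; the true MI is -0.5 * log(1 - rho^2)
rng = np.random.default_rng(0)
x = rng.normal(size=5000)
y = 0.8 * x + 0.6 * rng.normal(size=5000)   # correlation rho = 0.8
print(histogram_mi(x, y))                   # roughly 0.5 nats
```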

Some data analyses using mutual information

Brazilian Journal of Probability and Statistics

This paper presents a number of data analyses making use of the concept of mutual information. Statistical uses of mutual information are seen to include: comparative studies, variable selection, estimation of parameters and assessment of model fit. The examples are taken from the fields of sports, neuroscience, and forest science. There is an Appendix providing proofs.
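To make the variable-selection use concrete, a minimal sketch (assuming scikit-learn and a synthetic dataset, neither of which comes from the paper) ranks features by their estimated mutual information with a class label:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import mutual_info_classif

# Toy data: 10 features, only a few of which carry information about the label.
X, y = make_classification(n_samples=2000, n_features=10, n_informative=3,
                           n_redundant=0, random_state=0)

# Estimated mutual information between each feature and the class label (nats).
mi = mutual_info_classif(X, y, random_state=0)

# Rank features by estimated MI and keep, say, the top three.
ranking = np.argsort(mi)[::-1]
print("MI per feature:", np.round(mi, 3))
print("Top-3 features by MI:", ranking[:3])
```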

Estimating the Mutual Information between Two Discrete, Asymmetric Variables with Limited Samples

Entropy

Determining the strength of nonlinear, statistical dependencies between two variables is a crucial matter in many research fields. The established measure for quantifying such relations is the mutual information. However, estimating mutual information from limited samples is a challenging task. Since the mutual information is the difference of two entropies, the existing Bayesian estimators of entropy may be used to estimate information. This procedure, however, is still biased in the severely under-sampled regime. Here, we propose an alternative estimator that is applicable to those cases in which the marginal distribution of one of the two variables—the one with minimal entropy—is well sampled. The other variable, as well as the joint and conditional distributions, can be severely undersampled. We obtain a consistent estimator that presents very low bias, outperforming previous methods even when the sampled data contain few coincidences. As with other Bayesian estimators, our prop...
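The estimator in this paper exploits the identity I(X; Y) = H(X) − H(X|Y), with X the well-sampled, minimal-entropy variable. The sketch below uses plain plug-in entropies rather than the paper's Bayesian estimator; it only shows the structure of the decomposition and, incidentally, how badly a naive plug-in version behaves when Y is undersampled.

```python
import numpy as np
from collections import Counter

def plug_in_entropy(labels):
    """Plug-in (maximum-likelihood) entropy estimate in nats."""
    counts = np.array(list(Counter(labels).values()), dtype=float)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log(p)))

def mi_via_decomposition(x, y):
    """I(X; Y) = H(X) - H(X | Y) with plug-in entropies.

    X is the discrete, well-sampled, low-entropy variable; Y may be severely
    undersampled.  The paper replaces these plug-in terms with Bayesian
    estimates; this sketch only illustrates the decomposition.
    """
    x, y = np.asarray(x), np.asarray(y)
    h_x = plug_in_entropy(x)
    h_x_given_y = 0.0
    n = len(y)
    for value in np.unique(y):
        idx = y == value
        h_x_given_y += (idx.sum() / n) * plug_in_entropy(x[idx])
    return h_x - h_x_given_y

# Example: binary X weakly informative about a high-cardinality Y.
rng = np.random.default_rng(1)
y = rng.integers(0, 200, size=500)                # many Y states, few samples each
x = (y % 2 == 0).astype(int) ^ (rng.random(500) < 0.3).astype(int)
# The plug-in value overstates the true MI because H(X | Y) is biased downward.
print(mi_via_decomposition(x, y))
```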

New parametric measures of information

Information and Control, 1981

In this paper, methods are presented for obtaining parametric measures of information from non-parametric ones and from information matrices. The properties of these measures are examined. The one-dimensional parametric measures derived from the non-parametric ones are superior to Fisher's information measure because they are free from regularity conditions. But if we impose the regularity conditions of the Fisherian theory of information, these measures become linear functions of Fisher's measure.
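As background for the last sentence, the classical instance of this phenomenon is the Kullback–Leibler divergence: parametrizing it along a regular family and expanding in the parameter shift recovers Fisher's information. This is standard theory, shown here for orientation, not necessarily the paper's specific construction.

```latex
% KL divergence along a regular parametric family reduces, to second order,
% to Fisher's information; background for the abstract's last sentence.
\[
  D\bigl(f_\theta \,\|\, f_{\theta+\Delta}\bigr)
  = \int f(x;\theta)\,\log\frac{f(x;\theta)}{f(x;\theta+\Delta)}\,dx
  = \tfrac{1}{2}\, I_F(\theta)\,\Delta^2 + o(\Delta^2),
  \qquad
  I_F(\theta) = \mathbb{E}_\theta\!\left[\left(\frac{\partial}{\partial\theta}\log f(X;\theta)\right)^{\!2}\right].
\]
```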

Information Theory and its Applications

Cryptography, Information Theory, and Error‐Correction, 2021

In this note, we look at several definitions of entropy and some of their consequences in information theory. We also obtain the entropy and relative entropy for the general error distribution, which is used to model errors that are not normal. For a continuous random variable Y with density f, the differential entropy is $H(Y) = H(f) = -\int f(y)\,\log f(y)\,dy$.
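To make the displayed definition concrete, a short numerical check (in Python with SciPy, purely as an illustration of the formula, not material from the chapter) evaluates the integral for a normal density and compares it with the known closed form:

```python
import numpy as np
from scipy import integrate, stats

def differential_entropy(pdf, lower=-np.inf, upper=np.inf):
    """Numerically evaluate H(f) = -integral of f(y) * log f(y) dy (in nats)."""
    integrand = lambda y: -pdf(y) * np.log(pdf(y)) if pdf(y) > 0 else 0.0
    value, _ = integrate.quad(integrand, lower, upper)
    return value

# Check against the closed form for a normal density: 0.5 * log(2 * pi * e * sigma^2)
sigma = 2.0
pdf = stats.norm(loc=0.0, scale=sigma).pdf
print(differential_entropy(pdf))                    # numerical value
print(0.5 * np.log(2 * np.pi * np.e * sigma ** 2))  # closed form, same result
```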

A new theorem of information theory

Journal of Statistical Physics, 1969

Consider a random experiment whose possible outcomes are $z_1, z_2, \ldots, z_n$. Let the prior probabilities be $p_1^0, \ldots, p_n^0$, and let the posterior probabilities be $p_1, \ldots, p_n$. It is shown that, subject to certain prescribed and intuitively reasonable conditions, the expression $I = k \sum_i p_i \ln(p_i/p_i^0)$, where $k$ is a positive constant, is the unique expression for the information contained in a message which alters the probabilities from the $p_i^0$ to the $p_i$.
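As a small worked example of the theorem's expression (with the conventional choice k = 1, which the abstract does not fix), the information carried by a message that sharpens a uniform prior over four outcomes is:

```python
import numpy as np
from scipy.stats import entropy

# Worked example of I = k * sum_i p_i * ln(p_i / p_i^0) with k = 1 (nats).
prior = np.array([0.25, 0.25, 0.25, 0.25])      # p_i^0: no prior preference
posterior = np.array([0.70, 0.10, 0.10, 0.10])  # p_i after the message

# scipy.stats.entropy(p, q) returns the Kullback-Leibler divergence sum p*ln(p/q).
information_gained = entropy(posterior, prior)
print(information_gained)                        # approx. 0.446 nats
```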