Quantifying Synergistic Information Using Intermediate Stochastic Variables

A Unified Definition of Mutual Information with Applications in Machine Learning

Mathematical Problems in Engineering, 2015

There are various definitions of mutual information. Essentially, these definitions can be divided into two classes: (1) definitions with random variables and (2) definitions with ensembles. However, there are some mathematical flaws in these definitions. For instance, Class 1 definitions either neglect the probability spaces or assume that the two random variables share the same probability space. Class 2 definitions redefine the marginal probabilities from the joint probabilities. In fact, the marginal probabilities are given by the ensembles and should not be redefined from the joint probabilities. Both Class 1 and Class 2 definitions assume that a joint distribution exists, yet they ignore the important fact that the joint distribution, or the joint probability measure, is not unique. In this paper, we first present a new unified definition of mutual information to cover all the various definitions and to fix their mathematical flaws. Our idea is to define the joint distribution of two random variables...
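
As background for the definitional discussion above, here is a minimal sketch of the textbook computation of mutual information for a finite joint distribution given as a table (the function name and example values are illustrative, not taken from the paper). Note that it derives the marginals by summing the joint table, which is precisely the convention the abstract attributes to Class 2 definitions.

```python
import numpy as np

def mutual_information(joint):
    """I(X;Y) in nats for a finite joint distribution given as a 2-D table.

    The marginals are derived by summing the joint table -- the very
    convention the abstract attributes to "Class 2" definitions.
    """
    joint = np.asarray(joint, dtype=float)
    joint = joint / joint.sum()               # normalise, just in case
    px = joint.sum(axis=1, keepdims=True)     # marginal of X (column vector)
    py = joint.sum(axis=0, keepdims=True)     # marginal of Y (row vector)
    mask = joint > 0                          # 0 * log 0 := 0 convention
    return float((joint[mask] * np.log(joint[mask] / (px @ py)[mask])).sum())

# Example: a symmetric binary channel with 20% crossover probability.
joint = [[0.4, 0.1],
         [0.1, 0.4]]
print(mutual_information(joint))  # ~0.193 nats
```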

Non–Parametric Estimation of Mutual Information through the Entropy of the Linkage

Estimating mutual information

Unique additive information measures - Boltzmann-Gibbs-Shannon, Fisher and beyond

Predictive Information, Multi-Information, and Binding Information

Pairwise network information and nonlinear correlations

Physical Review E, 2016

Reconstructing the structural connectivity between interacting units from observed activity is a challenge across many different disciplines. The fundamental first step is to establish whether or to what extent the interactions between the units can be considered pairwise and, thus, can be modeled as an interaction network with simple links corresponding to pairwise interactions. In principle, this can be determined by comparing the maximum entropy given the bivariate probability distributions to the true joint entropy. In many practical cases, this is not an option since the bivariate distributions needed may not be reliably estimated or the optimization is too computationally expensive. Here we present an approach that allows one to use mutual informations as a proxy for the bivariate probability distributions. This has the advantage of being less computationally expensive and easier to estimate. We achieve this by introducing a novel entropy maximization scheme that is based on c...
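
To make the kind of comparison described above concrete, here is a hedged sketch of the raw quantities involved: the empirical joint entropy, the sum of marginal entropies (their difference is the total correlation), and the pairwise mutual informations that serve as the proxy. This is not the paper's entropy maximization scheme, which the truncated abstract does not specify; the toy three-variable system and all names are purely illustrative.

```python
import itertools
import numpy as np

def entropy(counts):
    """Shannon entropy (nats) of an empirical distribution given as counts."""
    p = np.asarray(counts, dtype=float)
    p = p[p > 0] / p.sum()
    return float(-(p * np.log(p)).sum())

def joint_entropy(samples, cols):
    """Empirical joint entropy of the selected columns of a sample matrix."""
    _, counts = np.unique(samples[:, cols], axis=0, return_counts=True)
    return entropy(counts)

rng = np.random.default_rng(0)
n = 20_000
# Toy system: x0 drives both x1 and x2, so every pair is genuinely coupled.
x0 = rng.integers(0, 2, n)
x1 = np.where(rng.random(n) < 0.9, x0, 1 - x0)
x2 = np.where(rng.random(n) < 0.8, x0, 1 - x0)
data = np.column_stack([x0, x1, x2])

H_joint = joint_entropy(data, [0, 1, 2])                    # true joint entropy
H_indep = sum(joint_entropy(data, [i]) for i in range(3))   # independent upper bound
pair_mi = {
    (i, j): joint_entropy(data, [i]) + joint_entropy(data, [j])
            - joint_entropy(data, [i, j])
    for i, j in itertools.combinations(range(3), 2)
}

print("total correlation :", H_indep - H_joint)  # how far from independence
print("pairwise MIs      :", pair_mi)            # the proxy quantities
```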

Estimation of Mutual Information: A Survey

The Limits of Pairwise Correlation to Model the Joint Entropy. Comment on Nguyen Thi Thanh et al. "Entropy Correlation and Its Impacts on Data Aggregation in a Wireless Sensor Network", Sensors 2018, 18, 3118

Maximizable informational entropy as measure of probabilistic uncertainty

2008

In this work, we consider a recently proposed entropy I (called varentropy) defined by the variational relationship dI = β(d⟨x⟩ − ⟨dx⟩) as a measure of the uncertainty of a random variable x. By definition, varentropy underlies a generalized virtual work principle ⟨dx⟩ = 0, leading to the maximum-entropy condition d(I − β⟨x⟩) = 0. This paper presents an analytical investigation of this maximizable entropy for several distributions such as the stretched exponential distribution, the κ-exponential distribution and the Cauchy distribution.
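
For orientation, the maximum-entropy condition quoted in the abstract follows in one line from the stated variational relationship once the virtual work principle is imposed. In the abstract's notation (⟨·⟩ denoting the expectation and β the multiplier), a sketch of the step reads:

```latex
\begin{align*}
  dI &= \beta\bigl(d\langle x\rangle - \langle dx\rangle\bigr)
     && \text{variational definition of the varentropy } I \\
  \langle dx\rangle &= 0
     && \text{generalized virtual work principle} \\
  \Rightarrow\quad dI &= \beta\, d\langle x\rangle
     \;\Longleftrightarrow\; d\bigl(I - \beta\langle x\rangle\bigr) = 0
     && \text{maximum-entropy condition}
\end{align*}
```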
