Quantifying Synergistic Information Using Intermediate Stochastic Variables
Related papers
A Unified Definition of Mutual Information with Applications in Machine Learning
Mathematical Problems in Engineering, 2015
There are various definitions of mutual information. Essentially, these definitions can be divided into two classes: (1) definitions with random variables and (2) definitions with ensembles. However, there are some mathematical flaws in these definitions. For instance, Class 1 definitions either neglect the probability spaces or assume the two random variables have the same probability space. Class 2 definitions redefine marginal probabilities from the joint probabilities. In fact, the marginal probabilities are given from the ensembles and should not be redefined from the joint probabilities. Both Class 1 and Class 2 definitions assume a joint distribution exists. Yet they all ignore the important fact that the joint distribution or the joint probability measure is not unique. In this paper, we first present a new unified definition of mutual information to cover all the various definitions and to fix their mathematical flaws. Our idea is to define the joint distribution of two random variables...
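As a rough illustration (not taken from the paper), the sketch below is the conventional "Class 2" style computation the abstract critiques: the marginals are derived by summing the joint probabilities, rather than being given independently from the ensembles. All names here are hypothetical.

```python
import math

def mutual_information(joint):
    """Mutual information I(X;Y) in nats from a joint pmf.

    `joint` maps (x, y) pairs to probabilities. Note that the marginals
    are obtained here by summing the joint -- exactly the convention the
    paper argues should not be taken for granted.
    """
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * math.log(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

# Independent fair bits: I(X;Y) = 0.
indep = {(x, y): 0.25 for x in (0, 1) for y in (0, 1)}

# Perfectly correlated fair bits: I(X;Y) = log 2.
corr = {(0, 0): 0.5, (1, 1): 0.5}
```

The two toy joints bracket the usual range of behavior: independence gives zero mutual information, and a deterministic copy gives the full entropy of one bit, log 2 nats.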
Pairwise network information and nonlinear correlations
Physical review. E, 2016
Reconstructing the structural connectivity between interacting units from observed activity is a challenge across many different disciplines. The fundamental first step is to establish whether or to what extent the interactions between the units can be considered pairwise and, thus, can be modeled as an interaction network with simple links corresponding to pairwise interactions. In principle, this can be determined by comparing the maximum entropy given the bivariate probability distributions to the true joint entropy. In many practical cases, this is not an option since the bivariate distributions needed may not be reliably estimated or the optimization is too computationally expensive. Here we present an approach that allows one to use mutual informations as a proxy for the bivariate probability distributions. This has the advantage of being less computationally expensive and easier to estimate. We achieve this by introducing a novel entropy maximization scheme that is based on c...
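A minimal sketch (my own, not the paper's scheme) of the quantity the abstract proposes as a proxy: plug-in estimates of all pairwise mutual informations from observed discrete activity, which are cheaper to obtain reliably than full bivariate distributions. Function and variable names are hypothetical.

```python
import math
from collections import Counter

def pairwise_mi(samples):
    """Plug-in estimates of I(X_i; X_j) for every pair of units.

    `samples` is a list of equal-length tuples of discrete states, one
    tuple per observation. Each pairwise mutual information needs only
    the empirical pair counts, not a reliably estimated bivariate
    distribution over all units -- the practical motivation above.
    """
    n_units = len(samples[0])
    n = len(samples)
    mi = {}
    for i in range(n_units):
        for j in range(i + 1, n_units):
            cij = Counter((s[i], s[j]) for s in samples)
            ci = Counter(s[i] for s in samples)
            cj = Counter(s[j] for s in samples)
            mi[(i, j)] = sum(
                (c / n) * math.log((c / n) / ((ci[x] / n) * (cj[y] / n)))
                for (x, y), c in cij.items())
    return mi

# Toy data: unit 1 copies unit 0; unit 2 is unrelated.
data = [(0, 0, 0), (1, 1, 0), (0, 0, 1), (1, 1, 1)]
```

On the toy data the copied pair shows log 2 nats of mutual information while the unrelated pairs show none, which is the kind of signature a pairwise reconstruction would pick up.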
Maximizable informational entropy as measure of probabilistic uncertainty
2008
In this work, we consider a recently proposed entropy I (called varentropy) defined by the variational relationship dI = β(d⟨x⟩ − ⟨dx⟩) as a measure of the uncertainty of a random variable x. By definition, varentropy underlies a generalized virtual work principle ⟨dx⟩ = 0 leading to maximum entropy, d(I − β⟨x⟩) = 0. This paper presents an analytical investigation of this maximizable entropy for several distributions, such as the stretched exponential distribution, the κ-exponential distribution, and the Cauchy distribution.
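Written out step by step (assuming ⟨·⟩ denotes the expectation over x), the maximization argument in this abstract runs:

```latex
\begin{align*}
dI &= \beta\bigl(d\langle x\rangle - \langle dx\rangle\bigr)
      && \text{(variational definition of varentropy)}\\
\langle dx\rangle &= 0
      && \text{(generalized virtual work principle)}\\
\Rightarrow\quad dI &= \beta\, d\langle x\rangle
      \;\Longrightarrow\; d\bigl(I - \beta\langle x\rangle\bigr) = 0.
\end{align*}
```

The last line is the stated maximum-entropy condition: I − β⟨x⟩ is stationary under the admissible variations.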