Peter Gmeiner | Friedrich-Alexander-Universität Erlangen-Nürnberg
Papers by Peter Gmeiner
The persistent mutual information (PMI) is a complexity measure for stochastic processes. It is related to well-known complexity measures such as excess entropy or statistical complexity. Essentially, it is a variation of the excess entropy, so that it can be interpreted as a specific measure of system-internal memory. The PMI was first introduced in 2010 by Ball, Diakonova and MacKay as a measure for (strong) emergence. In this paper we define the PMI mathematically and investigate its relation to excess entropy and statistical complexity. In particular, we prove that the excess entropy is an upper bound of the PMI. Furthermore, we show some properties of the PMI and calculate it explicitly for some example processes. We also discuss to what extent it is a measure for emergence and compare it with alternative approaches used to formalize emergence.
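To fix ideas, here is a sketch of the standard definitions, assuming the usual block notation ($X_{(-\infty,0]}$ for the infinite past of the process, $X_{[\tau,\infty)}$ for the future after a gap of length $\tau$); the paper's own notation may differ:

$$ I(\tau) = I\big(X_{(-\infty,0]}\,;\,X_{[\tau,\infty)}\big), \qquad \mathrm{PMI} = \lim_{\tau \to \infty} I(\tau). $$

The excess entropy is the gap-free case $E = I\big(X_{(-\infty,0]}\,;\,X_{(0,\infty)}\big) = I(0)$. Since $X_{[\tau,\infty)}$ is a sub-block of $X_{(0,\infty)}$, enlarging the gap can only discard shared information, which is the intuition behind the upper bound $\mathrm{PMI} \le E$ proved in the paper.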
Inferring the causal direction and the causal effect between two discrete random variables X and Y from a finite sample is often a crucial and challenging task. However, if we have access to observational and interventional data, it is possible to solve that task. If X causes Y, then it does not matter whether we observe an effect in Y by observing changes in X or by intervening actively on X. This invariance principle creates a link between observational and interventional distributions in a higher-dimensional probability space. We embed distributions that originate from samples of X and Y into that higher-dimensional space such that the embedded distribution is closest, with respect to the relative entropy, to the distributions that follow the invariance principle. This allows us to calculate, for a given empirical distribution, the best information-theoretic approximation that follows an assumed underlying causal model. We show that this information-theoretic approximation to ...
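To make the projection step concrete, here is a minimal sketch in Python, assuming binary X and Y: it numerically minimizes the relative entropy between given observational/interventional data and a candidate distribution constrained to satisfy P(Y | X=x) = P(Y | do(X=x)). The numbers, the equal weighting of the two divergence terms, and all names are illustrative, not the authors' construction.

import numpy as np
from scipy.optimize import minimize

def kl(p, q, eps=1e-12):
    # Relative entropy D(p || q) for discrete distributions of equal shape.
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(np.sum(p * (np.log(p + eps) - np.log(q + eps))))

# Illustrative empirical data: observational joint P(X, Y) and
# interventional conditionals P(Y | do(X = x)); rows index x, columns y.
p_obs = np.array([[0.35, 0.15],
                  [0.10, 0.40]])
p_do = np.array([[0.65, 0.35],
                 [0.25, 0.75]])

def unpack(theta):
    # Map unconstrained parameters to a marginal q(x) and a conditional
    # q(y | x) via softmax; the invariance constraint holds by design,
    # because one shared conditional serves both regimes.
    mx = np.exp(theta[:2]); mx /= mx.sum()
    c = np.exp(theta[2:].reshape(2, 2))
    c /= c.sum(axis=1, keepdims=True)
    return mx, c

def objective(theta):
    mx, c = unpack(theta)
    joint = mx[:, None] * c  # q(x, y) = q(x) q(y | x)
    # Divergence from the observational data plus divergence of the shared
    # conditional from each interventional distribution (equal weights).
    return kl(p_obs, joint) + sum(kl(p_do[x], c[x]) for x in range(2))

res = minimize(objective, np.zeros(6), method="Nelder-Mead")
mx, c = unpack(res.x)
print("fitted q(x):", mx)
print("fitted q(y|x):", c)
print("residual divergence:", res.fun)

The residual divergence then scores how well the assumed direction X → Y explains the data; refitting with the roles of X and Y swapped and comparing residuals gives a simple direction test in the spirit of the abstract.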