Information in experiments and sufficiency
Related papers
On the Measure of the Information in a Statistical Experiment
2008
Abstract. Setting aside experimental costs, the choice of an experiment is usually formulated in terms of the maximization of a measure of information, often presented as a design optimality criterion. However, there does not seem to be universal agreement on what objects can qualify as a valid measure of the information in an experiment. In this article we explicitly state a minimal set of requirements that must be satisfied by all such measures. Under that framework, the measure of the information in an experiment is equivalent to the measure of the variability of its likelihood ratio statistics or, what is the same, to the measure of the variability of its posterior-to-prior ratio statistics and of the distribution of the posterior distributions it yields. The larger that variability, the more peaked the likelihood functions and posterior distributions that tend to be yielded by the experiment, and the more informative the...
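A minimal numerical sketch of the equivalence the abstract describes, under a hypothetical setup not taken from the paper: for a binomial experiment with a two-point parameter and a uniform prior, the variability of the posterior-to-prior ratio statistic grows with the sample size, i.e. with the informativeness of the experiment.

    import numpy as np
    from scipy.stats import binom

    # Hypothetical illustration: two candidate success probabilities, uniform prior.
    thetas = np.array([0.3, 0.7])
    prior = np.array([0.5, 0.5])

    def ratio_variability(n):
        """Variance of the posterior-to-prior ratio statistic for Bin(n, theta)."""
        x = np.arange(n + 1)
        lik = np.stack([binom.pmf(x, n, t) for t in thetas])  # shape (2, n+1)
        marginal = prior @ lik                                 # predictive pmf of x
        posterior = (prior[:, None] * lik) / marginal          # posterior over theta
        ratio = posterior / prior[:, None]                     # posterior/prior ratio
        # Spread of the ratio statistic under the predictive distribution,
        # averaged over the prior-weighted parameter values.
        var_per_theta = (ratio**2 * marginal).sum(axis=1) - (ratio * marginal).sum(axis=1)**2
        return float(prior @ var_per_theta)

    for n in (1, 5, 25):
        print(n, ratio_variability(n))  # increases with n: more data, more information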
Differentiability of Statistical Experiments
2007
Let E = (P_θ) be a dominated experiment with an open Euclidean parameter space and densities (f_θ). The experiment is said to be continuously L2-differentiable if the family (√f_θ) is continuously L2-differentiable. The following assertions are proved: (1) The experiment E is continuously L2-differentiable iff the family (f_θ) is continuously L1-differentiable and Fisher's information function is continuous. (2) Suppose that the experiment E is continuously L2-differentiable and the experiment F is less informative than E. Then F is continuously L2-differentiable, too.
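For reference, the standard definition behind that statement, as it is usually given in the literature (the paper's exact formulation may differ): with dominating measure μ and score ℓ̇_θ, L2-differentiability at θ means

\[
\int \left( \sqrt{f_{\theta+h}} - \sqrt{f_{\theta}} - \tfrac{1}{2}\, h^{\top} \dot{\ell}_{\theta} \sqrt{f_{\theta}} \right)^{2} d\mu = o\!\left(\lVert h \rVert^{2}\right), \qquad h \to 0,
\]

with Fisher information \( I(\theta) = \int \dot{\ell}_{\theta} \dot{\ell}_{\theta}^{\top} f_{\theta} \, d\mu \); continuity of the differentiability means the map \( \theta \mapsto \dot{\ell}_{\theta} \sqrt{f_{\theta}} \) is continuous in L2(μ).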
An application of information theory to the problem of the scientific experiment
Synthese, 2004
There are two basic approaches to the problem of induction: the empirical one, which deems that the possibility of induction depends on how the world was made (and how it works), and the logical one, which considers the formation (and function) of language. The first is closer to being useful for induction, while the second is more rigorous and clearer. The purpose of this paper is to create an empirical approach to induction that has the same formal exactitude as the logical approach. This requires: a) that the empirical conditions for induction are enunciated, and b) that the most important results already obtained in inductive logic are demonstrated anew to be valid.
On the information in two-level experiments
Model Assisted Statistics and Applications, 2008
The information matrix of a sequence of independent experiments is the sum of their information matrices, but the value taken on that sequence by the real-valued measures of information based on that matrix is not the sum of the values they take on the individual experiments. Nevertheless, when information is measured through the determinant of the information matrix and one observes a response that can be modeled through a generalized linear model, a different type of additivity holds. Here, that property is stated in its full generality, and it is used to obtain the determinant of the information matrix for any two-level generalized linear experiment. That allows us to explore how the information in two-level factorial experiments depends on the location of their center point and on their range for linear normal, log-linear Poisson, and logistic and probit binary-response models. That property is also used to explore the effect of the addition of one support point on the information in the experiment.
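As a concrete, hypothetical illustration of the quantity involved (not the paper's own derivation): for a generalized linear model the Fisher information is X'WX with the usual GLM weights W, and its determinant for a two-level logistic design can be computed directly as a function of the center point and range.

    import numpy as np

    def glm_info_det(X, beta, weight_fn):
        """Determinant of the Fisher information X'WX for a GLM."""
        eta = X @ beta
        W = np.diag(weight_fn(eta))
        return np.linalg.det(X.T @ W @ X)

    # Logistic weight: Var of a Bernoulli response at linear predictor eta.
    logistic_w = lambda eta: np.exp(eta) / (1 + np.exp(eta))**2

    # Hypothetical 2^2 factorial with intercept; center and range chosen for illustration.
    center, half_range = 0.0, 1.0
    levels = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1]]) * half_range + center
    X = np.column_stack([np.ones(4), levels])
    beta = np.array([0.5, 1.0, -0.5])

    print(glm_info_det(X, beta, logistic_w))

Varying center and half_range in this sketch traces out the dependence of the determinant on the design's location and spread that the abstract refers to.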
Information concepts in filtered experiments
In this paper we define randomized filtered experiments with an abstract parameter space and analyze some properties of this concept. To that end, we associate with the parametric family of density processes the corresponding arithmetic and geometric mean processes; the latter are naturally linked to a generalized Hellinger process and to Hellinger integrals. We also introduce and analyze the Kullback-Leibler information processes of a posterior distribution on the parameter space with respect to a prior (and vice versa), and we characterize their development in terms of the concepts given above.
Keywords: Kullback-Leibler information, relative entropy, information from data, expected utility from data, Hellinger processes and Hellinger integrals, posterior and prior distributions.
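In the static single-observation case these objects reduce to familiar integrals; a small numerical sketch for two normal densities (a hypothetical example, not from the paper):

    import numpy as np
    from scipy.integrate import quad
    from scipy.stats import norm

    p = lambda x: norm.pdf(x, 0.0, 1.0)
    q = lambda x: norm.pdf(x, 1.0, 1.0)

    # Hellinger integral of order 1/2 (mass of the geometric mean) and KL divergence.
    hellinger_half, _ = quad(lambda x: np.sqrt(p(x) * q(x)), -20, 20)
    kl_pq, _ = quad(lambda x: p(x) * np.log(p(x) / q(x)), -20, 20)

    print(hellinger_half)  # exp(-1/8) ~ 0.8825 for N(0,1) vs N(1,1)
    print(kl_pq)           # (mu1 - mu2)^2 / 2 = 0.5 here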
Reliability criteria in information theory and in statistical hypothesis testing
Foundations and Trends in Communications and Information Theory, 2007
This survey is devoted to one of the central problems of Information Theory: determining the interdependence between coding rate and error probability exponent for different information transmission systems. The overview deals with memoryless systems in the finite-alphabet setting. It presents material complementary to the contents of the most remarkable books in Information Theory, by Feinstein, Fano, Wolfowitz, Gallager, Csiszár and Körner, Kolesnik and Poltirev, Blahut, and Cover and Thomas, and of the papers by Dobrushin, Gelfand and Prelov. We briefly formulate fundamental notions and results of Shannon theory on reliable transmission via coding and give a survey of results obtained in the last two to three decades by the authors, their colleagues, and some other researchers. The paper is written with the goal of making the theory of rate-reliability accessible to a broader circle of readers. We regard this concept as useful for advancing the solution of the noted problem, in parallel with the elaboration of the notion of the reliability-reliability dependence relative to statistical hypothesis testing and identification.
Information, ε-Sufficiency and Data Reduction Problems
2014
Placing ourselves in the framework of the Bayes model of statistical decision, we estimate in information-theoretic terms the average (respectively, Bayes) risk change caused by a modification of the probability law in action. In particular, upper estimates are given for the Bayes risk increase on passing from an initial decision problem to a reduced one, the latter resulting from the first by a reduction of the sample-space σ-algebra as well as of the parameter-space σ-algebra. The concept of ε-sufficiency, previously introduced by the author as a natural extension of the concept of sufficiency used in mathematical statistics, is in a certain sense automatically involved in the above estimates as the decrease of information implied by the reduction.
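A toy instance of the kind of reduction considered, under hypothetical numbers of our own choosing: coarsening the sample space of a finite decision problem can only increase the Bayes risk, and the increase is directly computable.

    import numpy as np

    # Hypothetical finite problem: 2 parameter values, 4 sample points, 0-1 loss
    # on guessing the parameter. Rows: theta; columns: sample-point probabilities.
    prior = np.array([0.5, 0.5])
    lik = np.array([[0.4, 0.3, 0.2, 0.1],
                    [0.1, 0.2, 0.3, 0.4]])

    def bayes_risk(lik, prior):
        joint = prior[:, None] * lik
        # The Bayes rule picks the a posteriori most probable theta at each point;
        # its risk is one minus the mass it classifies correctly.
        return 1.0 - joint.max(axis=0).sum()

    print(bayes_risk(lik, prior))   # 0.30 under the full sigma-algebra

    # Reduce: merge the two middle sample points into one (a coarser partition).
    reduced = np.column_stack([lik[:, 0], lik[:, 1:3].sum(axis=1), lik[:, 3]])
    print(bayes_risk(reduced, prior))  # 0.35: information lost by the reduction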
A new theorem of information theory
Journal of Statistical Physics, 1969
Consider a random experiment whose possible outcomes are z_1, z_2, ..., z_n. Let the prior probabilities be p_1^0, ..., p_n^0, and let the posterior probabilities be p_1, ..., p_n. It is shown that, subject to certain prescribed and intuitively reasonable conditions, the expression I = k Σ_i p_i ln(p_i / p_i^0), where k is a positive constant, is the unique expression for the information contained in a message which alters the probabilities from the p_i^0 to the p_i.
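A direct numerical reading of that expression, with k = 1 and a made-up message:

    import numpy as np

    def message_information(p, p0, k=1.0):
        """I = k * sum_i p_i * ln(p_i / p0_i); k > 0 fixes the unit of information."""
        p, p0 = np.asarray(p, float), np.asarray(p0, float)
        return k * np.sum(p * np.log(p / p0))

    prior = [0.25, 0.25, 0.25, 0.25]      # the p_i^0
    posterior = [0.7, 0.1, 0.1, 0.1]      # the p_i after the message
    print(message_information(posterior, prior))  # > 0; zero iff p == p0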