Information Theory

Basics of Probability Theory

Statistics deals with the collection and interpretation of data. This chapter lays a foundation that allows us to rigorously describe non-deterministic processes and to reason about non-deterministic quantities. The mathematical framework is given by probability theory, whose objects of interest are random quantities, their description, and their properties.

Information Dependency and Its Applications

2012

Independence is a basic concept of probability theory and statistics. In many fields of science, the dependency of different variables has received a great deal of attention from scientists. A measure, named information dependency, is proposed to express the dependency of a group of random variables. This measure is defined as the Kullback-Leibler divergence of the joint distribution with respect to the product-marginal distribution of these random variables. In the bivariate case, this measure is known as the mutual information of two random variables. Thus, the information dependency measure has a strong relationship with information theory. The thesis aims to give a thorough study of the information dependency from both mathematical and practical viewpoints. Concretely, we would like to study the following three problems: i. Proving that the information dependency is a useful tool for expressing the dependency of a group of random variables by comparing it with other measures of dependency. ii. Studying...
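As a side note, the bivariate case is easy to compute explicitly: for a discrete pair (X, Y), the information dependency is the Kullback-Leibler divergence of the joint table with respect to the product of its marginals, which is exactly the mutual information I(X; Y). A minimal sketch in Python, with an illustrative joint distribution (the numbers are not taken from the thesis):

```python
import numpy as np

def information_dependency(joint: np.ndarray) -> float:
    """Kullback-Leibler divergence D(P_XY || P_X x P_Y) for a discrete joint pmf.

    `joint` is a 2-D array of joint probabilities summing to 1.  In this
    bivariate case the value equals the mutual information I(X; Y), in nats.
    """
    px = joint.sum(axis=1, keepdims=True)   # marginal distribution of X (rows)
    py = joint.sum(axis=0, keepdims=True)   # marginal distribution of Y (columns)
    product = px * py                       # product-marginal distribution
    mask = joint > 0                        # convention: 0 * log(0 / q) = 0
    return float(np.sum(joint[mask] * np.log(joint[mask] / product[mask])))

# Illustrative example: a dependent pair versus an independent pair.
dependent = np.array([[0.4, 0.1],
                      [0.1, 0.4]])
independent = np.outer([0.5, 0.5], [0.5, 0.5])
print(information_dependency(dependent))    # positive: X and Y are dependent
print(information_dependency(independent))  # 0.0: X and Y are independent
```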

Joint and Conditional Probability Distributions

Abdella Mohammed Ahmed (MSc), 2024

A random vector X = (X1, ..., Xn) is a vector of, say, n random variables. There are three types of random vectors: continuous, discrete, and mixed. The latter is essentially a vector containing both continuous and discrete random variables, and hence we will focus only on continuous and discrete random vectors. In this chapter, we will consider a bivariate random vector (X, Y), though extending the discussion to n-dimensional random vectors is straightforward.
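A short sketch makes the discrete bivariate case concrete. Assuming the joint pmf of (X, Y) is stored as a table p[i, j] = P(X = x_i, Y = y_j) (the numbers below are invented for illustration), the marginal distributions are the row and column sums, and the conditional distribution of Y given X = x_i is obtained by normalising the i-th row:

```python
import numpy as np

# Joint pmf of a discrete bivariate random vector (X, Y): p[i, j] = P(X = x_i, Y = y_j).
joint = np.array([[0.10, 0.20],
                  [0.30, 0.40]])

# Marginal pmfs: sum the joint distribution over the other variable.
p_x = joint.sum(axis=1)              # P(X = x_i), here [0.3, 0.7]
p_y = joint.sum(axis=0)              # P(Y = y_j), here [0.4, 0.6]

# Conditional pmf: P(Y = y_j | X = x_i) = P(X = x_i, Y = y_j) / P(X = x_i).
p_y_given_x = joint / p_x[:, None]

print(p_y_given_x)                   # each row is a probability distribution summing to 1
```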

On the Use of Entropy as a Measure of Dependence of Two Events

2021

We define the degree of dependence of two events A and B in a probability space by using the Boltzmann-Shannon entropy function of an appropriate probability distribution produced by these events and depending on one parameter (the probability of the intersection of A and B) varying within a closed interval I. The entropy function attains its global maximum when the events A and B are independent. The important particular case of a discrete uniform probability space motivates this definition in the following way. The entropy function has a minimum at the left endpoint of I exactly when one of the events and the complement of the other are connected by the relation of inclusion (maximal negative dependence). It has a minimum at the right endpoint of I exactly when one of these events is included in the other (maximal positive dependence). Moreover, the deviation of the entropy from its maximum is equal to the average information that one of the binary trials defined by A and B carries with respect t...
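The construction can be checked numerically. A minimal sketch, assuming (as the abstract suggests) that the distribution produced by A and B is (P(A∩B), P(A\B), P(B\A), P((A∪B)ᶜ)) with P(A∩B) as the free parameter; the values of P(A) and P(B) below are arbitrary:

```python
import numpy as np

def entropy(probs) -> float:
    """Boltzmann-Shannon entropy in nats, with the convention 0 * log 0 = 0."""
    probs = np.asarray(probs, dtype=float)
    nonzero = probs > 0
    return float(-np.sum(probs[nonzero] * np.log(probs[nonzero])))

def dependence_entropy(p_a: float, p_b: float, p_ab: float) -> float:
    """Entropy of the one-parameter distribution produced by events A and B:
    (P(A and B), P(A only), P(B only), P(neither)), parameterised by p_ab."""
    return entropy([p_ab, p_a - p_ab, p_b - p_ab, 1.0 - p_a - p_b + p_ab])

p_a, p_b = 0.5, 0.4
# P(A and B) ranges over the closed interval I = [max(0, P(A)+P(B)-1), min(P(A), P(B))].
grid = np.linspace(max(0.0, p_a + p_b - 1.0), min(p_a, p_b), 1001)
values = [dependence_entropy(p_a, p_b, p) for p in grid]
print(grid[int(np.argmax(values))])  # ~0.2 = P(A) * P(B): entropy is maximal at independence
```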