A Note on the Comparison of the Quadratic and Shannon’s Mutual Information

The mutual information. Estimation in the sampling without replacement

Kybernetika, Vol. 23, No. 5, pp. 407-419, 1987

In previous papers, the "mutual information of order β concerning two random variables" was defined from the concept of conditional entropy of order β (Z. Daróczy, 1970). The aim of the present paper is to approach the value of the mutual information of order β = 2 in a large population on the basis of a sample drawn at random and without replacement from it. This purpose is achieved by obtaining an unbiased estimator of that value and estimating its mean square error. In addition, a contrast between samplings with and without ...
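For orientation, a sketch of the definitions the abstract refers to, following the Daróczy (1970) normalization; the paper's own convention may differ:

$$
H_\beta(X) = \frac{1}{2^{1-\beta}-1}\left(\sum_x p(x)^\beta - 1\right), \qquad \beta \neq 1,
$$

$$
H_\beta(Y \mid X) = \sum_x p(x)^\beta \, H_\beta(Y \mid X = x), \qquad
I_\beta(X;Y) = H_\beta(Y) - H_\beta(Y \mid X).
$$

For β = 2 this reduces to the quadratic entropy H_2(X) = 2(1 − Σ_x p(x)²), and I_2(X;Y) is the quadratic mutual information whose unbiased estimation the paper studies.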

A Study of Generalized Information Measures & Their Inequalities

Journal of Emerging Technologies and Innovative Research, 2018

Information theory is a mathematical theory of communication. It provides tools for research in disciplines such as coding theory, semantics, decision theory, economics, radar detection, biology, and psychology, among many others, which demonstrates its necessity and importance. Shannon modeled communication as a probabilistic phenomenon and, in doing so, introduced the concepts of entropy and mutual information. Entropy measures the uncertainty of a random variable associated with a randomized experiment, while mutual information quantifies the dependency between two random variables. Information theory examines the theoretical problems associated with the transmission of information over communication channels, including the measurement of uncertainty (information) and practical, economical methods for encoding information for transmission. An important feature ...
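As a concrete illustration of the two quantities the abstract contrasts, here is a minimal Python sketch computing Shannon entropy and mutual information for a discrete joint distribution; the joint pmf `p_xy` is a made-up example, not data from either paper:

```python
import numpy as np

def entropy(p):
    """Shannon entropy H = -sum p log2 p, skipping zero cells."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(p_xy):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), with the joint pmf as a 2-D array."""
    p_x = p_xy.sum(axis=1)  # marginal of X (row sums)
    p_y = p_xy.sum(axis=0)  # marginal of Y (column sums)
    return entropy(p_x) + entropy(p_y) - entropy(p_xy.ravel())

# Hypothetical joint distribution of two dependent binary variables.
p_xy = np.array([[0.25, 0.25],
                 [0.10, 0.40]])
print(mutual_information(p_xy))  # ~0.07 bits; 0 iff X and Y independent
```

The identity I(X;Y) = H(X) + H(Y) − H(X,Y) used here is equivalent to the conditional-entropy form I(X;Y) = H(Y) − H(Y|X) mentioned above.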