Generalized divergence measures: Information matrices, amount of information, asymptotic distribution, and its applications to test statistical hypotheses

Connections of generalized divergence measures with Fisher information matrix

Information Sciences, 1993

In this paper parametric measures of information for the multiparameter case are obtained using Csiszár's nonparametric measure of information as well as continuous and differentiable functions thereof. The same idea is applied to some of the unified (r, s)-divergence measures presented by Taneja. Relationships of the new parametric measures with Fisher's information matrix are derived, and their connections with the Cramér-Rao inequality are established. This work generalizes the results of Kagan, Aggarwal, and Ferentinos and Papaioannou.
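
For orientation, a brief sketch of the standard quantities such parametric constructions build on (common textbook definitions, not quoted from the paper): the Csiszár f-divergence between discrete distributions P and Q, its local expansion in a parametric family, which brings in the Fisher information matrix, assuming f is twice differentiable at 1, and the Cramér-Rao bound it feeds into.

\[
D_f(P\|Q) = \sum_i q_i\, f\!\left(\frac{p_i}{q_i}\right), \qquad f \ \text{convex on } (0,\infty),\ f(1)=0,
\]
\[
D_f\!\left(p_{\theta+\mathrm{d}\theta}\,\|\,p_{\theta}\right) \approx \frac{f''(1)}{2}\, \mathrm{d}\theta^{\top} I_F(\theta)\, \mathrm{d}\theta, \qquad
I_F(\theta)_{jk} = \mathbb{E}_{\theta}\!\left[\partial_{\theta_j}\log p_{\theta}(X)\,\partial_{\theta_k}\log p_{\theta}(X)\right],
\]
\[
\operatorname{Cov}_{\theta}(\hat{\theta}) \succeq I_F(\theta)^{-1} \qquad \text{(Cramér-Rao bound for an unbiased estimator } \hat{\theta}\text{)}.
\]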

Information geometry of divergence functions

Bulletin of the Polish Academy of Sciences: Technical Sciences, 2000

Measures of divergence between two points play a key role in many engineering problems. One such measure is a distance function, but there are many important measures which do not satisfy the properties of a distance. The Bregman divergence, Kullback-Leibler divergence and f-divergence are such measures. In the present article, we study the differential-geometrical structure of a manifold induced by a divergence function. It consists of a Riemannian metric and a pair of dually coupled affine connections, which are studied in information geometry. The class of Bregman divergences is characterized by a dually flat structure, which originates from the Legendre duality. A dually flat space admits a generalized Pythagorean theorem. The class of f-divergences, defined on a manifold of probability distributions, is characterized by information monotonicity, and the Kullback-Leibler divergence belongs to the intersection of both classes. The f-divergence always gives the α-geometry, which consists of the Fisher information metric and a dual pair of ±α-connections. The α-divergence is a special class of f-divergences; it is unique in sitting at the intersection of the f-divergence and Bregman divergence classes in a manifold of positive measures. The geometry derived from the Tsallis q-entropy and related divergences is also addressed.
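
As a quick reference for the two classes discussed above (standard definitions from the information-geometry literature, not reproduced from the article itself):

\[
B_{\varphi}(x, y) = \varphi(x) - \varphi(y) - \langle \nabla\varphi(y),\, x - y\rangle \qquad (\varphi \ \text{strictly convex and differentiable}),
\]
\[
D_f(P\|Q) = \sum_i q_i\, f\!\left(\frac{p_i}{q_i}\right) \qquad (f \ \text{convex},\ f(1)=0),
\]
and the Kullback-Leibler divergence \( \mathrm{KL}(P\|Q) = \sum_i p_i \log (p_i/q_i) \) arises both as the Bregman divergence of the negative entropy \( \varphi(p) = \sum_i p_i \log p_i \) and as the f-divergence with \( f(t) = t\log t \), which is why it sits in the intersection of the two classes.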

On Unified Generalizations of Relative Jensen–Shannon and Arithmetic–Geometric Divergence Measures, and Their Properties

2005

In this paper we consider one-parameter generalizations of some non-symmetric divergence measures. The non-symmetric divergence measures considered are: Kullback-Leibler relative information, χ²-divergence, relative J-divergence, relative Jensen-Shannon divergence and relative arithmetic-geometric divergence. All the generalizations considered can be written as particular cases of Csiszár's f-divergence. By putting some conditions on the probability distributions, the aim here is to develop bounds on these measures and their parametric generalizations.
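
For readers less familiar with these measures, the usual generators that express them as Csiszár f-divergences are listed below; these follow the common conventions in this literature and the exact normalizations used in the paper may differ.

\[
\begin{aligned}
&\text{Kullback-Leibler relative information:} && f(t) = t\log t,\\
&\chi^2\text{-divergence:} && f(t) = (t-1)^2,\\
&\text{relative Jensen-Shannon divergence:} && f(t) = t\log\frac{2t}{1+t},\\
&\text{relative arithmetic-geometric divergence:} && f(t) = \frac{1+t}{2}\log\frac{1+t}{2t},\\
&\text{relative J-divergence:} && f(t) = (t-1)\log\frac{1+t}{2},
\end{aligned}
\]
each plugged into \( D_f(P\|Q) = \sum_i q_i f(p_i/q_i) \).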

Normalized information-based divergences

2007

This paper is devoted to the mathematical study of some divergences based on mutual information which are well suited to categorical random vectors. These divergences are generalizations of the "entropy distance" and "information distance". Their main characteristic is that they combine a complexity term and the mutual information. We then introduce the notion of (normalized) information-based divergence, propose several examples, and discuss their mathematical properties, in particular in a prediction framework.
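
The two baseline quantities being generalized are commonly defined as follows (standard forms; the paper's own normalizations may differ slightly):

\[
d_{\mathrm{ent}}(X, Y) = H(X\mid Y) + H(Y\mid X) = H(X, Y) - I(X; Y),
\qquad
d_{\mathrm{info}}(X, Y) = 1 - \frac{I(X; Y)}{H(X, Y)} = \frac{d_{\mathrm{ent}}(X, Y)}{H(X, Y)},
\]
so both combine a complexity term (conditional or joint entropy) with the mutual information, which is the pattern that normalized information-based divergences extend.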

Divergence Measures Estimation and its Asymptotic Normality Theory in the Discrete Case

European Journal of Pure and Applied Mathematics

In this paper we provide the asymptotic theory of the general class of φ-divergence measures, which includes the most common divergence measures: the Rényi and Tsallis families and the Kullback-Leibler measure. We are interested in divergence measures in the discrete case. One-sided and two-sided statistical tests are derived, as well as symmetrized estimators. Almost-sure rates of convergence and an asymptotic normality theorem are obtained in the general case, and then particularized for the Rényi and Tsallis families and for the Kullback-Leibler measure. Our theoretical results are validated by simulations.
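
A minimal plug-in sketch of the kind of estimator analyzed here, assuming i.i.d. samples from two discrete distributions and the usual definitions of the Rényi, Tsallis and Kullback-Leibler divergences; function and variable names are illustrative and not taken from the paper.

import numpy as np

def empirical_pmf(sample, support):
    """Relative frequencies of `sample` over a common finite support."""
    counts = np.array([np.sum(sample == x) for x in support], dtype=float)
    return counts / counts.sum()

def renyi_divergence(p, q, alpha):
    """Plug-in Rényi divergence D_alpha(P||Q), alpha > 0, alpha != 1."""
    return np.log(np.sum(p**alpha * q**(1.0 - alpha))) / (alpha - 1.0)

def tsallis_divergence(p, q, alpha):
    """Plug-in Tsallis divergence T_alpha(P||Q), alpha > 0, alpha != 1."""
    return (np.sum(p**alpha * q**(1.0 - alpha)) - 1.0) / (alpha - 1.0)

def kl_divergence(p, q):
    """Plug-in Kullback-Leibler divergence KL(P||Q)."""
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

# Example: estimate the divergences from two simulated samples.
rng = np.random.default_rng(0)
support = np.arange(5)
x = rng.choice(support, size=2000, p=[0.1, 0.2, 0.3, 0.25, 0.15])
y = rng.choice(support, size=2000, p=[0.2, 0.2, 0.2, 0.2, 0.2])
p_hat, q_hat = empirical_pmf(x, support), empirical_pmf(y, support)
print(renyi_divergence(p_hat, q_hat, 2.0),
      tsallis_divergence(p_hat, q_hat, 2.0),
      kl_divergence(p_hat, q_hat))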

A Family of Bounded Divergence Measures Based on The Bhattacharyya Coefficient

We introduce a new one-parameter family of divergence measures, called bounded Bhattacharyya distance (BBD) measures, for quantifying the dissimilarity between probability distributions. These measures are bounded, symmetric and positive semi-definite and do not require absolute continuity. In the asymptotic limit, the BBD measure approaches the squared Hellinger distance. A generalized BBD measure for multiple distributions is also introduced. We prove an extension of a theorem of Bradt and Karlin for BBD relating Bayes error probability and divergence ranking. We show that BBD belongs to the class of generalized Csiszár f-divergences and derive some properties such as curvature and the relation to Fisher information. For distributions with vector-valued parameters, the curvature matrix is related to the Fisher-Rao metric. We derive certain inequalities between BBD and well-known measures such as the Hellinger and Jensen-Shannon divergences. We also derive bounds on the Bayesian error probability. We give an application of these measures to the problem of signal detection, where we compare two monochromatic signals buried in white noise and differing in frequency and amplitude.
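
The building blocks referenced in the abstract, in their standard discrete forms (the specific one-parameter BBD construction is defined in the paper and is not reproduced here):

\[
\rho(P, Q) = \sum_i \sqrt{p_i q_i}, \qquad 0 \le \rho \le 1, \qquad
H^2(P, Q) = 1 - \rho(P, Q),
\]
where \( \rho \) is the Bhattacharyya coefficient and \( H^2 \) the squared Hellinger distance (up to a normalization constant, depending on the convention); the BBD family is a bounded, symmetric one-parameter construction built from \( \rho \).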

On ‘Useful’ R-Norm Relative Information and J-Divergence Measures

In this paper some new generalized R-norm measures of useful relative information are defined and their particular cases are studied. From these measures, new useful R-norm information measures are also derived. We obtain the J-divergence corresponding to each measure of useful relative R-norm information. In the end, an equality satisfied by the useful J-divergence of type β is proved.
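
For context, the (non-'useful') R-norm information measure of Boekee and Van der Lubbe, on which such generalizations are typically built, is

\[
H_R(P) = \frac{R}{R-1}\left[1 - \Big(\sum_i p_i^{R}\Big)^{1/R}\right], \qquad R > 0,\ R \neq 1,
\]
and the 'useful' variants in this line of work additionally attach a utility weight \( u_i > 0 \) to each outcome, in the spirit of the Belis-Guiaşu useful information measure; the exact weighted R-norm relative information and J-divergence of type β are as defined in the paper.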

A New Family of Bounded Divergence Measures and Application to Signal Detection

Proceedings of the 5th International Conference on Pattern Recognition Applications and Methods, 2016

We introduce a new one-parameter family of divergence measures, called bounded Bhattacharyya distance (BBD) measures, for quantifying the dissimilarity between probability distributions. These measures are bounded, symmetric and positive semi-definite and do not require absolute continuity. In the asymptotic limit, the BBD measure approaches the squared Hellinger distance. A generalized BBD measure for multiple distributions is also introduced. We prove an extension of a theorem of Bradt and Karlin for BBD relating Bayes error probability and divergence ranking. We show that BBD belongs to the class of generalized Csiszár f-divergences and derive some properties such as curvature and the relation to Fisher information. For distributions with vector-valued parameters, the curvature matrix is related to the Fisher-Rao metric. We derive certain inequalities between BBD and well-known measures such as the Hellinger and Jensen-Shannon divergences. We also derive bounds on the Bayesian error probability. We give an application of these measures to the problem of signal detection, where we compare two monochromatic signals buried in white noise and differing in frequency and amplitude.
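
A hedged sketch of the flavor of the signal-detection comparison, not the paper's exact experiment: two monochromatic signals observed in additive white Gaussian noise induce Gaussian observation distributions whose means are the signal waveforms, so their Bhattacharyya coefficient, and hence any divergence built from it, has a closed form. All signal parameters below are illustrative.

import numpy as np

def bhattacharyya_gaussian_equal_cov(mu1, mu2, sigma):
    """Bhattacharyya distance between N(mu1, sigma^2 I) and N(mu2, sigma^2 I)."""
    d2 = np.sum((mu1 - mu2) ** 2)
    return d2 / (8.0 * sigma ** 2)

# Two monochromatic signals differing in amplitude and frequency.
fs, T = 1000.0, 0.1                      # sampling rate (Hz), duration (s)
t = np.arange(0.0, T, 1.0 / fs)
s1 = 1.0 * np.sin(2 * np.pi * 50.0 * t)  # amplitude 1.0, 50 Hz
s2 = 1.2 * np.sin(2 * np.pi * 55.0 * t)  # amplitude 1.2, 55 Hz
sigma = 0.5                              # white-noise standard deviation

db = bhattacharyya_gaussian_equal_cov(s1, s2, sigma)
rho = np.exp(-db)          # Bhattacharyya coefficient
hellinger_sq = 1.0 - rho   # squared Hellinger distance (one common convention)
print(db, rho, hellinger_sq)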

Generalized Non-Symmetric Divergence Measures and Inequalities

2005

In this paper we consider one-parameter generalizations of some non-symmetric divergence measures. The measures considered are relative information, χ²-divergence, relative J-divergence, relative Jensen-Shannon divergence and relative arithmetic-geometric divergence. All the generalizations considered can be written as particular cases of Csiszár's f-divergence. By conditioning the probability distributions, relationships among the relative divergence measures are obtained.
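
A small numerical sketch of the base (unparametrized) measures, using the definitions commonly attributed to this line of work; the formulas follow the usual conventions and the function names are illustrative, not from the paper.

import numpy as np

def kl(p, q):
    """Kullback-Leibler relative information K(P||Q) = sum p ln(p/q)."""
    return np.sum(p * np.log(p / q))

def chi2(p, q):
    """Pearson chi-square divergence: sum (p - q)^2 / q."""
    return np.sum((p - q) ** 2 / q)

def rel_jensen_shannon(p, q):
    """Relative Jensen-Shannon divergence F(P||Q) = sum p ln(2p/(p+q))."""
    return np.sum(p * np.log(2 * p / (p + q)))

def rel_arithmetic_geometric(p, q):
    """Relative arithmetic-geometric divergence G(P||Q) = sum ((p+q)/2) ln((p+q)/(2p))."""
    return np.sum((p + q) / 2 * np.log((p + q) / (2 * p)))

p = np.array([0.1, 0.2, 0.3, 0.4])
q = np.array([0.25, 0.25, 0.25, 0.25])
print(kl(p, q), chi2(p, q), rel_jensen_shannon(p, q), rel_arithmetic_geometric(p, q))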