On Elliptical Possibility Distributions

A generalization of the Kullback–Leibler divergence and its properties

Journal of Mathematical Physics, 2009

A generalized Kullback–Leibler relative entropy is introduced, starting from the symmetric Jackson derivative of the generalized overlap between two probability distributions. The generalization retains much of the structure possessed by the original formulation. We present its fundamental properties, including positivity, metricity, concavity, bounds and stability. In addition, a connection to shift information and the behavior under Liouville dynamics are discussed.
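
As a rough numerical illustration (my reading of the construction, not necessarily the paper's exact formulation): if the generalized overlap is taken as Omega(x) = sum_i p_i^x q_i^(1-x), then the ordinary KL divergence is Omega'(1), and replacing the ordinary derivative by the symmetric Jackson q-derivative gives a one-parameter deformation that recovers KL as the deformation parameter tends to 1.

```python
import numpy as np

def overlap(p, q, x):
    """Assumed generalized overlap Omega(x) = sum_i p_i^x * q_i^(1 - x)."""
    return np.sum(p**x * q**(1.0 - x))

def generalized_kl(p, q, deform):
    """Symmetric Jackson derivative of the overlap at x = 1:

        D_q f(x) = (f(q x) - f(x / q)) / ((q - 1/q) x),  evaluated at x = 1.

    Recovers the ordinary KL divergence as deform -> 1 (deform != 1 required).
    """
    return (overlap(p, q, deform) - overlap(p, q, 1.0 / deform)) / (deform - 1.0 / deform)

p = np.array([0.2, 0.3, 0.5])
q = np.array([0.4, 0.4, 0.2])
print(np.sum(p * np.log(p / q)))    # ordinary KL divergence
print(generalized_kl(p, q, 1.001))  # close to KL for deform near 1
print(generalized_kl(p, q, 2.0))    # deformed value
```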

Transformation of Bimodal Probability Distributions Into Possibility Distributions

IEEE Transactions on Instrumentation and Measurement, 2010

At the application level, it is important to be able to define the measurement result as an interval that will contain an important part of the distribution of the measured values, that is, a coverage interval. This practice, acknowledged by the International Organization for Standardization (ISO) Guide, is a major shift from the probabilistic representation. It can be viewed as a probability/possibility transformation, with possibility distributions encoding coverage intervals. In this paper, we extend previous work on unimodal distributions by proposing a possibility representation of bimodal probability distributions. Indeed, U-shaped distributions and Gaussian mixture distributions are not rare in the context of physical measurements. Some elements for further propagating such bimodal possibility distributions are also presented. The proposed method is applied to the case of three independent or positively correlated C-grade resistors in series and compared with the Guide to the Expression of Uncertainty in Measurement (GUM) and Monte Carlo methods.
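
For context, a minimal sketch of the unimodal building block that this work extends (not the bimodal construction itself): for a symmetric unimodal density, the standard optimal probability-to-possibility transformation is pi(x) = 2 min(F(x), 1 - F(x)), so that each alpha-cut of pi is a coverage interval of level 1 - alpha. The Gaussian choice below is only an example.

```python
from math import erf, sqrt

def gaussian_cdf(x, mu=0.0, sigma=1.0):
    return 0.5 * (1.0 + erf((x - mu) / (sigma * sqrt(2.0))))

def possibility_symmetric_unimodal(x, mu=0.0, sigma=1.0):
    """Optimal probability->possibility transform for a symmetric unimodal pdf.

    pi(x) = 2 * min(F(x), 1 - F(x)): the probability mass left outside the
    symmetric coverage interval whose endpoint is x.  pi(mode) = 1.
    """
    f = gaussian_cdf(x, mu, sigma)
    return 2.0 * min(f, 1.0 - f)

# The alpha-cut {x : pi(x) >= alpha} is a coverage interval of level 1 - alpha,
# e.g. pi(1.96) ~= 0.05 marks the endpoints of the 95 % interval of N(0, 1).
for x in (0.0, 1.0, 1.96, 3.0):
    print(x, round(possibility_symmetric_unimodal(x), 4))
```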

Bivariate and multivariate distributions with bimodal marginals

We consider simple n-variate distributions, which are special cases of general elliptically contoured and Kotz-type distributions. In particular, we analyze distributions having joint probability density functions (pdfs) defined as functions of (x_1^2 + x_2^2 + ... + x_n^2), allowing easy calculation of different probabilities when using transformations to spherical coordinates. In the case n = 2, we give various examples of such pdfs whose graphs resemble a bell sunken from the middle. These distributions can be used for modelling data clustered in some areas between concentric circles or ellipses. The simple analytical form of the considered distributions makes it possible to use them in many applications which require simplicity of calculation. An example of a probability density function allowing high correlation is also considered. We also discuss the multivariate conditional ordering of random vectors and compute the structure functions of the probability density functions considered in the paper. Distributions with bimodal marginals can be used in many areas, such as hydrology, biology, medicine, economics, ecology, physics, and astronomy.
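
One concrete density of this type, given here as an illustrative Kotz-type example rather than one taken from the paper, is f(x, y) = (1/pi)(x^2 + y^2) exp(-(x^2 + y^2)): it vanishes at the origin ("bell sunken from the middle"), its normalizing constant follows from a polar-coordinate integral, and its marginals are bimodal with modes at +/- 1/sqrt(2).

```python
import numpy as np

def sunken_bell_pdf(x, y):
    """Illustrative Kotz-type density f(x, y) = (1/pi) * (x^2 + y^2) * exp(-(x^2 + y^2)).

    Zero at the origin, radially symmetric, and with bimodal marginals.
    """
    r2 = x**2 + y**2
    return r2 * np.exp(-r2) / np.pi

# Numerical check of the normalization and of the bimodal x-marginal on a grid.
g = np.linspace(-6.0, 6.0, 1201)
X, Y = np.meshgrid(g, g)
Z = sunken_bell_pdf(X, Y)
dx = g[1] - g[0]
print(Z.sum() * dx * dx)                   # ~ 1.0: total probability mass
marginal_x = Z.sum(axis=0) * dx            # marginal density of X on the grid
print(abs(g[np.argmax(marginal_x)]))       # ~ 0.707: location of a marginal mode
```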

A New Family of Bounded Divergence Measures and Application to Signal Detection

Proceedings of the 5th International Conference on Pattern Recognition Applications and Methods, 2016

We introduce a new one-parameter family of divergence measures, called bounded Bhattacharyya distance (BBD) measures, for quantifying the dissimilarity between probability distributions. These measures are bounded, symmetric and positive semi-definite and do not require absolute continuity. In the asymptotic limit, the BBD measure approaches the squared Hellinger distance. A generalized BBD measure for multiple distributions is also introduced. We prove an extension of a theorem of Bradt and Karlin for BBD relating Bayes error probability and divergence ranking. We show that BBD belongs to the class of generalized Csiszár f-divergences and derive some properties such as curvature and the relation to Fisher information. For distributions with vector-valued parameters, the curvature matrix is related to the Fisher–Rao metric. We derive certain inequalities between BBD and well-known measures such as the Hellinger and Jensen–Shannon divergences. We also derive bounds on the Bayesian error probability. We give an application of these measures to the problem of signal detection, where we compare two monochromatic signals buried in white noise and differing in frequency and amplitude.
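
The family is built on the Bhattacharyya coefficient; the exact one-parameter form is defined in the paper, so the sketch below shows only the ingredients it relies on: the coefficient rho(P, Q) = sum_i sqrt(p_i q_i) and the squared Hellinger distance 1 - rho (in the convention H^2 = (1/2) sum (sqrt(p) - sqrt(q))^2) that the BBD measures approach asymptotically.

```python
import numpy as np

def bhattacharyya_coefficient(p, q):
    """rho(P, Q) = sum_i sqrt(p_i * q_i); equals 1 iff P = Q."""
    return np.sum(np.sqrt(p * q))

def squared_hellinger(p, q):
    """H^2(P, Q) = 1 - rho(P, Q), the asymptotic limit of the BBD measures."""
    return 1.0 - bhattacharyya_coefficient(p, q)

p = np.array([0.2, 0.3, 0.5])
q = np.array([0.4, 0.4, 0.2])
print(bhattacharyya_coefficient(p, q), squared_hellinger(p, q))
```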

Study of incompatibility or near compatibility of bivariate discrete conditional probability distributions through divergence measures

Journal of Statistical Computation and Simulation, 2013

Consider a two-dimensional discrete random variable (X, Y) with possible values 1, 2, ..., I for X and 1, 2, ..., J for Y. For specifying the distribution of (X, Y), suppose both conditional distributions, of X given Y and of Y given X, are provided. Under this setting, we present here different ways of measuring the discrepancy between incompatible conditional distributions in the finite discrete case. In the process, we also suggest different ways of defining the most nearly compatible distributions in incompatible cases. Many new divergence measures are discussed, along with those that are already known, for determining the most nearly compatible joint distribution P. Finally, a comparative study is carried out between all these divergence measures using some examples.
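
One plausible way to set up such a discrepancy (a sketch of the general idea, not the specific measures compared in the paper): given the two specified conditional matrices, score any candidate joint matrix P by how far its induced conditionals are from the specified ones, for example with a Kullback-Leibler-type criterion, and call the pair of conditionals nearly compatible when the best achievable score is small.

```python
import numpy as np

def induced_conditionals(P):
    """From a joint I x J matrix P, recover P(X=i | Y=j) and P(Y=j | X=i)."""
    x_given_y = P / P.sum(axis=0, keepdims=True)   # each column sums to 1
    y_given_x = P / P.sum(axis=1, keepdims=True)   # each row sums to 1
    return x_given_y, y_given_x

def discrepancy(P, A, B, eps=1e-12):
    """KL-type discrepancy between the conditionals induced by the joint P
    and the specified conditional matrices A = P(X|Y), B = P(Y|X)."""
    A_hat, B_hat = induced_conditionals(P)
    kl = lambda U, V: np.sum(U * np.log((U + eps) / (V + eps)))
    return kl(A, A_hat) + kl(B, B_hat)

# Specified conditionals: columns of A and rows of B are distributions.
A = np.array([[0.7, 0.2], [0.3, 0.8]])   # P(X=i | Y=j)
B = np.array([[0.6, 0.4], [0.1, 0.9]])   # P(Y=j | X=i)
P = np.full((2, 2), 0.25)                # a candidate joint to score
print(discrepancy(P, A, B))              # 0 would mean exact compatibility with P
```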

Generalized divergence measures: Information matrices, amount of information, asymptotic distribution, and its applications to test statistical hypotheses

Information Sciences, 1995

In this paper, a transformation of Csiszár's measures which generalizes the unified (r, s) measures defined by Sharma and Mittal and by Taneja is presented. For these transformations, the information matrices associated with a differential metric in the direction of the tangent space are obtained, as well as the amount of information resulting from parameter perturbation in the direction of the coordinate axes. Finally, the asymptotic distribution of the information matrices and of the amount of information, together with its applications to testing statistical hypotheses, is obtained.
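
For orientation, a sketch of one standard form of a two-parameter (r, s) relative entropy of the Sharma-Mittal type (this particular expression is an assumption on my part and may differ from the unified measures transformed in the paper); it interpolates between the Rényi and Kullback-Leibler divergences.

```python
import numpy as np

def sharma_mittal_divergence(p, q, r, s):
    """One common form of a two-parameter (r, s) relative entropy:

        D_{r,s}(P||Q) = ((sum_i p_i^r q_i^(1-r))^((1-s)/(1-r)) - 1) / (s - 1)

    The Rényi divergence of order r is recovered as s -> 1, and the
    Kullback-Leibler divergence as r, s -> 1 (r, s != 1 required here).
    """
    rho = np.sum(p**r * q**(1.0 - r))
    return (rho**((1.0 - s) / (1.0 - r)) - 1.0) / (s - 1.0)

p = np.array([0.2, 0.3, 0.5])
q = np.array([0.4, 0.4, 0.2])
print(sharma_mittal_divergence(p, q, r=0.5, s=2.0))
print(sharma_mittal_divergence(p, q, r=1.0001, s=1.0001))  # ~ KL divergence
print(np.sum(p * np.log(p / q)))                           # KL for comparison
```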

On Unified Generalizations of Relative Jensen–Shannon and Arithmetic–Geometric Divergence Measures, and Their Properties

2005

In this paper we consider one-parameter generalizations of some non-symmetric divergence measures: the Kullback–Leibler relative information, the χ²-divergence, the relative J-divergence, the relative Jensen–Shannon divergence and the relative arithmetic–geometric divergence. All the generalizations considered can be written as particular cases of Csiszár's f-divergence. By putting some conditions on the probability distributions, the aim here is to develop bounds on these measures and their parametric generalizations.
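
A minimal sketch of the common framework: every measure above can be written as a Csiszár f-divergence D_f(P||Q) = sum_i q_i f(p_i / q_i) for a suitable convex generator f. The generators below are the standard textbook ones for three of the listed measures; the parametric generalizations themselves are developed in the paper.

```python
import numpy as np

def f_divergence(p, q, f):
    """Csiszár f-divergence D_f(P||Q) = sum_i q_i * f(p_i / q_i)."""
    u = p / q
    return np.sum(q * f(u))

# Standard generators for some of the non-symmetric measures mentioned above.
kl          = lambda u: u * np.log(u)                    # Kullback-Leibler
chi_squared = lambda u: (u - 1.0)**2                     # chi^2 divergence
rel_js      = lambda u: u * np.log(2.0 * u / (u + 1.0))  # relative Jensen-Shannon

p = np.array([0.2, 0.3, 0.5])
q = np.array([0.4, 0.4, 0.2])
for name, f in [("KL", kl), ("chi^2", chi_squared), ("relative JS", rel_js)]:
    print(name, f_divergence(p, q, f))
```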

On mutual information for elliptical distributions: a case of nonlinear dependence of ‘n’ vectors

Sri Lankan Journal of Applied Statistics, 2017

In this paper, we model dependent categorical data via the mutual information concept to obtain a measure of statistical dependence. We first derive the entropy and the mutual information index for the exponential power distribution. These concepts, developed by Shannon in the context of information theory, are important, and several results have already been published for the multivariate normal distribution. We then extend these tools to the special case of the full symmetric class of multivariate elliptical distributions. The upper bound for the entropy, which is attained for the normal density, is established. We further derive a nonlinear joint model for dependent random vectors spanning an elliptical vector space, to enhance multivariate relationships among non-empty subsets of vectors via multivariate mutual information, based on the assumption that the subsets of each vector and their interactions can be represented in discrete form. To illustrate its application, the multivariate dependency among various sites, based on the dominance of some attributes, is investigated.
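
A sketch of the Gaussian baseline that such elliptical extensions build on (the corrective terms for a general density generator are not reproduced here): for a multivariate normal vector, the mutual information among its components (total correlation) reduces to -1/2 * ln det(R), where R is the correlation matrix.

```python
import numpy as np

def gaussian_mutual_information(cov):
    """Total correlation among the components of a multivariate normal vector:
    I(X_1; ...; X_n) = -1/2 * ln det(R), with R the correlation matrix."""
    d = np.sqrt(np.diag(cov))
    R = cov / np.outer(d, d)
    return -0.5 * np.log(np.linalg.det(R))

cov = np.array([[1.0, 0.8, 0.3],
                [0.8, 1.0, 0.5],
                [0.3, 0.5, 1.0]])
print(gaussian_mutual_information(cov))   # grows as the off-diagonal correlations grow
```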

On the Fisher Information Matrix for Multivariate Elliptically Contoured Distributions

IEEE Signal Processing Letters, 2000

The Slepian-Bangs formula provides a very convenient way to compute the Fisher information matrix (FIM) for Gaussian distributed data. The aim of this letter is to extend it to a larger family of distributions, namely elliptically contoured (EC) distributions. More precisely, we derive a closed-form expression of the FIM in this case. This new expression involves the usual term of the Gaussian FIM plus some corrective factors that depend only on the expectations of some functions of the so-called modular variate. Hence, for most distributions in the EC family, deriving the FIM from its Gaussian counterpart requires only minor additional calculations. We show that the new formula reduces to the Slepian-Bangs formula in the Gaussian case, and we provide an illustrative example, based on Student's t distributions, of how it can be used.
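
For reference, the Gaussian Slepian-Bangs formula that the letter generalizes is F_ij = (d mu / d theta_i)' Sigma^(-1) (d mu / d theta_j) + 1/2 tr(Sigma^(-1) dSigma/dtheta_i Sigma^(-1) dSigma/dtheta_j). The sketch below implements only this Gaussian case; the EC corrective factors involving the modular variate are not reproduced.

```python
import numpy as np

def slepian_bangs_fim(dmu, Sigma, dSigma):
    """Gaussian Slepian-Bangs Fisher information matrix.

    dmu:    list of d(mu)/d(theta_i) vectors
    Sigma:  covariance matrix
    dSigma: list of d(Sigma)/d(theta_i) matrices
    """
    k = len(dmu)
    Sinv = np.linalg.inv(Sigma)
    F = np.zeros((k, k))
    for i in range(k):
        for j in range(k):
            mean_term = dmu[i] @ Sinv @ dmu[j]
            cov_term = 0.5 * np.trace(Sinv @ dSigma[i] @ Sinv @ dSigma[j])
            F[i, j] = mean_term + cov_term
    return F

# Example: x ~ N([a, a], b * I), with theta = (a, b).
b = 2.0
Sigma  = b * np.eye(2)
dmu    = [np.array([1.0, 1.0]), np.array([0.0, 0.0])]   # d mu / da, d mu / db
dSigma = [np.zeros((2, 2)), np.eye(2)]                   # d Sigma / da, d Sigma / db
print(slepian_bangs_fim(dmu, Sigma, dSigma))             # diag(2/b, 1/(2 b^2) * 2) = diag(1.0, 0.25)
```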

An Information Geometry Justification of the Relatively Invariant Prior

2004

Given a random sample from a distribution with a density function that depends on an unknown parameter, we consider its Bayes estimate using as loss function an element of the class of α-divergences. When the parametric family is one-dimensional, it can be suitably parameterized by an α-affine parameter. We show that the maximum likelihood estimator of the
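
The loss functions referred to are the α-divergences; in Amari's convention (which may be parameterized differently from the one used in this paper) the family is D_alpha(p, q) = 4/(1 - alpha^2) * (1 - sum_i p_i^((1-alpha)/2) q_i^((1+alpha)/2)), recovering KL(p||q) as alpha -> -1, KL(q||p) as alpha -> +1, and four times the squared Hellinger distance at alpha = 0.

```python
import numpy as np

def alpha_divergence(p, q, alpha):
    """Amari alpha-divergence (one common convention, alpha != +/- 1):

        D_alpha(p, q) = 4 / (1 - alpha^2) * (1 - sum_i p_i^((1-alpha)/2) * q_i^((1+alpha)/2))
    """
    rho = np.sum(p**((1.0 - alpha) / 2.0) * q**((1.0 + alpha) / 2.0))
    return 4.0 / (1.0 - alpha**2) * (1.0 - rho)

p = np.array([0.2, 0.3, 0.5])
q = np.array([0.4, 0.4, 0.2])
print(alpha_divergence(p, q, -0.999))   # ~ KL(p||q)
print(np.sum(p * np.log(p / q)))        # KL(p||q) for comparison
print(alpha_divergence(p, q, 0.0))      # 4 * squared Hellinger distance
```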