A Cramér-Wold theorem for elliptical distributions
Related papers
Testing for affine equivalence of elliptically symmetric distributions
Journal of Multivariate Analysis, 2004
Let X and Y be d-dimensional random vectors having elliptically symmetric distributions. Call X and Y affinely equivalent if Y has the same distribution as AX + b for some nonsingular d × d matrix A and some b ∈ R^d. This paper studies a class of affine-invariant tests for affine equivalence under certain moment restrictions. The test statistics measure the discrepancy between the empirical distributions of the norms of suitably standardized data.
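The affine invariance of norm-based statistics can be seen in a short numerical sketch (an illustration of the standardization idea, not the paper's actual test): centering and whitening a sample by the inverse symmetric square root of its sample covariance yields norms that are unchanged under any affine map AX + b.

```python
import numpy as np

def standardized_norms(x):
    """Center and whiten the sample, then return the norms of the
    standardized points; these norms are affine-invariant."""
    mu = x.mean(axis=0)
    cov = np.cov(x, rowvar=False)
    # inverse symmetric square root of the sample covariance
    w, v = np.linalg.eigh(cov)
    cov_inv_sqrt = v @ np.diag(w ** -0.5) @ v.T
    z = (x - mu) @ cov_inv_sqrt
    return np.linalg.norm(z, axis=1)

rng = np.random.default_rng(0)
x = rng.standard_normal((500, 3))
A = rng.standard_normal((3, 3))          # almost surely nonsingular
y = x @ A.T + np.array([1.0, -2.0, 0.5])  # affine image of x
# the standardized norms of x and y agree up to numerical error
print(np.allclose(np.sort(standardized_norms(x)),
                  np.sort(standardized_norms(y))))
```

Invariance holds exactly here because the sample covariance of AX + b is A S Aᵀ, so the Mahalanobis-type norms cancel the transformation.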
We consider random vectors X (K × 1) and Y (N × 1) having a multivariate elliptical joint distribution, and derive the exact joint distribution of X and L-statistics from Y as a mixture of multivariate unified skew-elliptical distributions. This mixture representation enables us to predict X based on L-statistics from Y, and vice versa, when K = 1 and under normal and t-distributions. Our results extend and generalize previous ones in two ways: first, we consider a general multivariate set-up in which K ≥ 1 and N ≥ 2, and second, we adopt the multivariate elliptical distribution, which includes previous multivariate normal and t-formulations as special cases.
On Elliptical Possibility Distributions
IEEE Transactions on Fuzzy Systems
This paper proposes two main contributions to multivariate data analysis through possibility theory. The first is the definition of a generalized family of multivariate elliptical possibility distributions, derived from a consistent probability-possibility transformation over the family of so-called elliptical probability distributions. The second is the definition of two divergence measures between possibility distributions. We prove that a symmetric version of the Kullback-Leibler divergence satisfies all divergence properties on the space of possibility distributions. We further derive analytical expressions of this divergence and of the Hellinger divergence for certain possibility distributions in the proposed elliptical family, in particular the multivariate normal possibility divergence in two dimensions. Finally, the paper illustrates the developed possibilistic tools in an application of bi-band change detection between optical satellite images.
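The symmetrization trick the abstract mentions is simple to state in the probabilistic setting (the paper works with possibility distributions, but the construction is the same): add the two directed KL divergences, which yields a quantity that is symmetric by construction and zero exactly when the two distributions coincide. A minimal sketch for discrete distributions:

```python
import numpy as np

def kl(p, q):
    """Kullback-Leibler divergence between two discrete distributions
    with common support (all entries assumed strictly positive)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(np.sum(p * np.log(p / q)))

def symmetric_kl(p, q):
    """Symmetrized KL: kl(p, q) + kl(q, p)."""
    return kl(p, q) + kl(q, p)

p = [0.2, 0.5, 0.3]
q = [0.3, 0.4, 0.3]
print(symmetric_kl(p, q) == symmetric_kl(q, p))  # True by construction
print(symmetric_kl(p, p))                         # 0.0
```

Plain KL fails symmetry (kl(p, q) ≠ kl(q, p) in general), which is why the symmetric version is needed for a proper divergence on the possibility space.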
Akaike's Information Criterion, C_p and Estimators of Loss for Elliptically Symmetric Distributions
International Statistical Review, 2014
In this article, we develop a modern perspective on Akaike's Information Criterion (AIC) and Mallows' C_p for model selection, and propose generalizations to spherically and elliptically symmetric distributions. Despite the differences in their respective motivations, C_p and AIC are equivalent in the special case of Gaussian linear regression. In this case they are also equivalent to a third criterion, an unbiased estimator of the quadratic prediction loss, derived from loss estimation theory. We then show that the form of the unbiased estimator of the quadratic prediction loss under a Gaussian assumption still holds under a more general distributional assumption, the family of spherically symmetric distributions. One of the features of our results is that our criterion does not rely on the specificity of the distribution, but only on its spherical symmetry. The same kind of criterion can be derived for a family of elliptically contoured distributions, which allows for correlations, when considering the invariant loss. More specifically, the unbiasedness property is relative to a distribution associated with the original density.
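The Gaussian-case equivalence of C_p and AIC can be made concrete with the standard textbook forms of the two criteria (an illustrative sketch; notation here, with a common noise-variance estimate \hat\sigma^2 taken e.g. from the full model, is ours and not the article's):

```latex
% Submodel with p regressors and residual sum of squares RSS_p,
% n observations, common variance estimate \hat\sigma^2:
C_p = \frac{\mathrm{RSS}_p}{\hat\sigma^2} - n + 2p,
\qquad
\mathrm{AIC} = n\log\!\left(2\pi\hat\sigma^2\right)
             + \frac{\mathrm{RSS}_p}{\hat\sigma^2} + 2p .
% Their difference,
\mathrm{AIC} - C_p = n\log\!\left(2\pi\hat\sigma^2\right) + n,
% does not depend on the submodel, so the two criteria
% rank all candidate submodels identically.
```

Because the difference is constant across submodels, minimizing either criterion selects the same model, which is the equivalence referred to above.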
On the Fisher Information Matrix for Multivariate Elliptically Contoured Distributions
IEEE Signal Processing Letters, 2000
The Slepian-Bangs formula provides a very convenient way to compute the Fisher information matrix (FIM) for Gaussian distributed data. The aim of this letter is to extend it to a larger family of distributions, namely elliptically contoured (EC) distributions. More precisely, we derive a closed-form expression of the FIM in this case. This new expression involves the usual term of the Gaussian FIM plus some corrective factors that depend only on the expectations of certain functions of the so-called modular variate. Hence, for most distributions in the EC family, deriving the FIM from its Gaussian counterpart requires only slight additional calculations. We show that the new formula reduces to the Slepian-Bangs formula in the Gaussian case, and we provide an illustrative example, based on Student distributions, of how it can be used.
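For reference, the classical Gaussian Slepian-Bangs formula to which the letter's EC expression reduces is the following (the EC version, per the abstract, modifies these terms by scalar factors involving moments of the modular variate; its exact form is in the letter itself):

```latex
% Data x ~ N(\mu(\theta), \Sigma(\theta)); FIM entry (i, j):
\left[\mathbf{F}(\boldsymbol\theta)\right]_{ij}
= \frac{\partial \boldsymbol\mu^{T}}{\partial\theta_i}
  \boldsymbol\Sigma^{-1}
  \frac{\partial \boldsymbol\mu}{\partial\theta_j}
+ \frac{1}{2}\,\operatorname{tr}\!\left(
  \boldsymbol\Sigma^{-1}\frac{\partial\boldsymbol\Sigma}{\partial\theta_i}
  \boldsymbol\Sigma^{-1}\frac{\partial\boldsymbol\Sigma}{\partial\theta_j}
  \right).
```

The first term captures information carried by the mean, the second term information carried by the covariance.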
Elliptically Contoured Distribution
The Annals of Statistics, 1999
This paper derives an extended version of the Haff or, more appropriately, Stein-Haff identity for an elliptically contoured distribution (ECD). This identity is then used to show that the minimax estimators of the covariance matrix obtained under normal models remain robust under the ECD model.
A goodness-of-fit test for elliptical distributions with diagnostic capabilities
Journal of Multivariate Analysis
This paper develops a smooth test of goodness-of-fit for elliptical distributions. The test is adaptively omnibus, invariant to affine-linear transformations, and has a convenient expression that can be broken into components. These components have diagnostic capabilities and can be used to identify specific departures, which helps in correcting the null model when the test rejects. As an example, the results are applied to the multivariate normal distribution, for which the R package ECGofTestDx is available. It is shown that the proposed test strategy encompasses and generalizes a number of existing approaches. Some other cases are studied, such as the bivariate Laplace, logistic and Pearson type II distributions. A simulation experiment shows the usefulness of the diagnostic tools.
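A much simpler diagnostic in the same spirit (a minimal sketch only, not the paper's smooth test and unrelated to the ECGofTestDx package) checks elliptical-normal fit through squared Mahalanobis distances: under bivariate normality they are approximately chi-square with 2 degrees of freedom, whose CDF has the closed form 1 − exp(−x/2).

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal((2000, 2))  # bivariate normal sample, d = 2

# squared Mahalanobis distances from the sample mean
mu = x.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(x, rowvar=False))
c = x - mu
d2 = np.einsum('ij,jk,ik->i', c, cov_inv, c)

# probability-integral transform via the chi-square(2) CDF, 1 - exp(-x/2);
# under the null these values should look uniform on (0, 1)
u = 1.0 - np.exp(-np.sort(d2) / 2.0)
n = len(u)
ks = np.max(np.abs(u - (np.arange(1, n + 1) - 0.5) / n))
print(ks < 0.1)  # small discrepancy for a genuinely normal sample
```

A large discrepancy between the transformed distances and the uniform grid flags a radial departure from normality, one of the specific departures the paper's components are designed to isolate.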