Generalized divergence measures: Information matrices, amount of information, asymptotic distribution, and its applications to test statistical hypotheses

In this paper, a transformation of Csiszár's measures which generalizes the unified (r, s) measures defined by Sharma and Mittal and by Taneja is presented. For these transformations, the information matrices associated with a differential metric in the direction of the tangent space are obtained, as well as the amount of information resulting from a parameter perturbation in the direction of the coordinate axes. Finally, the asymptotic distributions of the information matrices and of the amount of information are obtained, together with their applications to testing statistical hypotheses.

$$\left|\frac{\partial f(x;\theta)}{\partial \theta_i}\right| < F(x), \qquad \left|\frac{\partial^2 f(x;\theta)}{\partial \theta_i\,\partial \theta_j}\right| < \Psi(x), \qquad \left|\frac{\partial^3 f(x;\theta)}{\partial \theta_i\,\partial \theta_j\,\partial \theta_k}\right| < H(x),$$

where $F$ is finitely integrable and $E[H(X)] < M$, with $M$ independent of $\theta$.

(iii) The Fisher information matrix

$$I_F(\theta) = E\left[\frac{\partial \log f(X;\theta)}{\partial \theta_i}\,\frac{\partial \log f(X;\theta)}{\partial \theta_j}\right], \qquad i,j = 1, \ldots$$
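As a minimal numerical sketch of the kind of measure being generalized, the following computes Csiszár's f-divergence for discrete distributions. The function name `csiszar_divergence` and the example distributions are illustrative assumptions, not taken from the paper; the Kullback-Leibler divergence is recovered with the convex generator f(t) = t log t.

```python
import math

def csiszar_divergence(p, q, f):
    """Csiszár f-divergence D_f(P, Q) = sum_x q(x) f(p(x)/q(x))
    for discrete distributions given as lists of probabilities.
    Terms with q(x) = 0 are skipped (standard convention 0 * f(p/0) handled separately)."""
    return sum(qx * f(px / qx) for px, qx in zip(p, q) if qx > 0)

# Kullback-Leibler divergence arises from the generator f(t) = t log t,
# with the convention 0 log 0 = 0.
kl = lambda t: t * math.log(t) if t > 0 else 0.0

p = [0.5, 0.25, 0.25]      # hypothetical example distribution
q = [1/3, 1/3, 1/3]        # uniform reference distribution

print(csiszar_divergence(p, p, kl))  # 0.0 for identical distributions
print(csiszar_divergence(p, q, kl))  # strictly positive when p != q
```

With f convex and f(1) = 0, Jensen's inequality gives D_f(P, Q) ≥ 0 with equality iff P = Q, which is the basic property the paper's transformations preserve.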