Kosmas Ferentinos | University of Ioannina, Greece
Papers by Kosmas Ferentinos
Statistical Papers, 1988
ABSTRACT The aim of this paper is to investigate the possibility of constructing shortest-length confidence intervals and to give some results and observations concerning shortest confidence intervals and uniformly minimum variance unbiased (UMVU) estimators.
Three inequalities of Tchebycheff type are presented. Two of them give lower bounds for the probability of intervals not necessarily symmetric around the mean. The third one generalizes the extension of Tchebycheff's inequalities given by Miyamoto (1978). They are based on the inequality of Markov. Attainability of the lower bounds is also discussed.
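As an illustration not drawn from the paper itself, the classical symmetric Chebyshev bound, which likewise follows from Markov's inequality, can be checked numerically; the distribution and constants below are arbitrary choices.

```python
import random

# Illustrative check (not from the paper): the classical Chebyshev bound
# P(|X - mu| >= k*sigma) <= 1/k^2 follows from Markov's inequality
# applied to the nonnegative variable (X - mu)^2.
random.seed(0)
mu, sigma, k, n = 0.0, 1.0, 2.0, 100_000
samples = [random.gauss(mu, sigma) for _ in range(n)]
tail = sum(abs(x - mu) >= k * sigma for x in samples) / n
bound = 1 / k ** 2
print(tail <= bound)  # the empirical tail mass respects the 1/k^2 bound
```

For a normal sample the bound is loose (the true two-sided tail at k = 2 is about 0.046 versus the bound 0.25), which is exactly the slack that sharper, possibly asymmetric bounds of the kind studied in the paper aim to reduce.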
Metrika, 1990
ABSTRACT In this paper we construct shortest confidence intervals for families of distributions involving two unknown truncation parameters. The method used is that of the pivotal quantity.
Communications in Statistics - Theory and Methods, 2006
ABSTRACT An interesting topic in mathematical statistics is that of constructing confidence intervals. Two types of intervals, both based on the method of the pivotal quantity, are available: the Shortest Confidence Interval (SCI) and the Equal Tails Confidence Interval (ETCI). The aims of this article are: (i) to clarify and comment on methods of finding such intervals; (ii) to investigate the relationship between these types of intervals; (iii) to point out that confidence intervals with the shortest length do not always exist, even when the distribution of the pivotal quantity is symmetric; and finally, (iv) to give similar results when the Bayesian approach is used.
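A minimal numerical sketch of the SCI/ETCI distinction (an illustration, not taken from the article): for a pivotal quantity with a strictly decreasing density, such as a standard exponential, the shortest 1 - alpha interval starts at zero, while the equal-tails interval wastes length on the right.

```python
import math

# Illustrative sketch (not from the article): SCI vs. ETCI for a pivot
# Q ~ Exp(1), with cdf F(q) = 1 - exp(-q) and a strictly decreasing
# density, so the shortest 95% interval is [0, F^-1(0.95)].
alpha = 0.05
inv_cdf = lambda p: -math.log(1 - p)  # quantile function of Exp(1)

etci = (inv_cdf(alpha / 2), inv_cdf(1 - alpha / 2))  # equal tail masses
sci = (0.0, inv_cdf(1 - alpha))                      # all mass on the right

len_etci = etci[1] - etci[0]
len_sci = sci[1] - sci[0]
print(len_sci < len_etci)  # True: the SCI is strictly shorter
```

Both intervals have exact 95% coverage for the pivot; only their lengths differ, which is the comparison the article develops in general.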
Communications in Statistics - Theory and Methods, 1990
In this paper we construct uniformly minimum variance unbiased estimators for U-estimable functions when the underlying family of distributions involves two unknown truncation parameters and the sample is doubly Type II censored. Previous relevant results for the complete sample case are obtained as special cases of our results.
The American Statistician, 2004
Assunção (1996) restated the lower bound for the mean squared relative error (MSRE) of an unbiased estimator θ̂ of a parameter θ in terms of the elasticity of the likelihood function with respect to θ. The MSRE is defined as R(θ̂, θ) = E[((θ̂ - θ)/θ)²] ...
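A hedged numeric illustration (not from the cited paper): for the exponential scale family with mean θ, the Fisher information is I(θ) = 1/θ², so the elasticity-based lower bound on the MSRE of an unbiased estimator from a sample of size n works out to 1/(θ² · n · I(θ)) = 1/n, and the sample mean attains it.

```python
import random
import statistics

# Hedged illustration (not from the cited paper): for X_1..X_n i.i.d.
# exponential with mean theta, I(theta) = 1/theta^2, so the
# elasticity-based MSRE lower bound is 1/(theta^2 * n * I(theta)) = 1/n,
# and the MSRE of the sample mean equals Var(Xbar)/theta^2 = 1/n.
random.seed(2)
theta, n, reps = 3.0, 20, 40_000
msre_draws = []
for _ in range(reps):
    xbar = statistics.fmean(random.expovariate(1 / theta) for _ in range(n))
    msre_draws.append(((xbar - theta) / theta) ** 2)
emp_msre = statistics.fmean(msre_draws)
bound = 1 / n
print(abs(emp_msre - bound) / bound < 0.1)  # empirical MSRE matches the bound
```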
The American Statistician, 1990
Communications in Statistics - Theory and Methods
Key words and phrases: Fisher and divergence-type measures of information; order preserving property. ABSTRACT If the amount of information contained in a r.v. is greater than that ...
Information and Control, 1981
In this paper methods are presented for obtaining parametric measures of information from the non-parametric ones and from information matrices. The properties of these measures are examined. The one-dimensional parametric measures derived from the non-parametric ones are superior to Fisher's information measure because they are free from regularity conditions. If, however, we impose the regularity conditions of the Fisherian theory of information, these measures become linear functions of Fisher's measure.
Advances in Communications, 1980
Measures of information are characterized as parametric, non-parametric, and entropy-type. Statistical measures of information are nonnegative, strongly or weakly additive, have the maximal information property, and are invariant under sufficient transformations of the data. The various measures of information are examined in the light of the above properties and characterizations. They are also examined for convexity, loss of information, and appearance in Cramér-Rao type inequalities. Methods for obtaining parametric measures of information from the non-parametric ones and from information matrices are also presented.
Metrika, 1994
Based on the Cramér-Rao inequality (in the multiparameter case), the lower bound of the Fisher information matrix is achieved if and only if the underlying distribution is the r-parameter exponential family. This family and the lower bound of the Fisher information matrix are characterized when some constraints in the form of expected values of some statistics are available. Combining the previous results, we can find the class of parametric functions and the corresponding UMVU estimators via the Cramér-Rao inequality.
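The attainment condition can be illustrated with a hedged one-parameter example (not from the paper): the N(μ, σ²) family with σ known is an exponential family with I(μ) = 1/σ², and the sample mean attains the Cramér-Rao lower bound σ²/n.

```python
import random
import statistics

# Hedged illustration (not from the paper): in the N(mu, sigma^2) family
# with sigma known, Fisher information for mu is I(mu) = 1/sigma^2, and
# the sample mean attains the Cramer-Rao bound Var(Xbar) = sigma^2 / n.
random.seed(1)
mu, sigma, n, reps = 0.0, 2.0, 25, 40_000
means = [statistics.fmean(random.gauss(mu, sigma) for _ in range(n))
         for _ in range(reps)]
crb = sigma ** 2 / n                 # Cramer-Rao lower bound, 0.16 here
emp = statistics.variance(means)     # Monte Carlo variance of the mean
print(abs(emp - crb) / crb < 0.05)   # bound is attained, up to noise
```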
Methodology and Computing in Applied Probability, 2007
In this paper we discuss four information-theoretic ideas and present their implications for statistical inference: (1) Fisher information and divergence generating functions; (2) information optimum unbiased estimators; (3) the information content of various statistics; (4) characterizations based on Fisher information.
Journal of Statistical Planning and Inference, 1982
In this paper we present measures of information contained in an experiment which are by-products of the parametric measures of Fisher, Vajda, Mathai, and Boekee and the non-parametric measures of Bhattacharyya, Rényi, Matusita, Kagan, and Csiszár. We use these measures to compare experiments according to Blackwell's definition. In particular, we prove that if E_X and E_Y are two experiments and E_X ⊇ E_Y, then I_X ≥ I_Y for all of the above measures.
Information Sciences, 1996
Fisher and divergence type measures of information in the area of random censoring are introduced and compared with the measures of Hollander, Proschan, and Sconing. The basic properties of statistical information theory are established for these measures of information. The winners are the classical measures of information.
Communications in Statistics - Theory and Methods, 1990
Communications in Statistics - Theory and Methods, 2005
Communications in Statistics - Theory and Methods, 34: 1461–1470, 2005. DOI: 10.1081/STA-200063386. On Two Forms of Fisher's Measure of Information, by Takis Papaioannou and Kosmas Ferentinos.