Kosmas Ferentinos | University of Ioannina, Greece

Papers by Kosmas Ferentinos

On shortest confidence intervals and their relation with uniformly minimum variance unbiased estimators

Statistical Papers, 1988

The aim of this paper is to investigate the possibility of constructing shortest-length confidence intervals and to give some results and aspects concerning shortest confidence intervals and uniformly minimum variance unbiased (UMVU) estimators.

On Tchebycheff's type inequalities

Three inequalities of Tchebycheff type are presented. Two of them give lower bounds for the probability of intervals not necessarily symmetric around the mean. The third generalizes the extension of Tchebycheff's inequality given by Miyamoto (1978). All three are based on Markov's inequality. Attainability of the lower bounds is also discussed.
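As a purely illustrative sketch (not from the paper), the classical two-sided Tchebycheff bound that these results generalize, P(|X − μ| < kσ) ≥ 1 − 1/k², can be checked by Monte Carlo; the exponential distribution and sample size used here are arbitrary choices:

```python
# Monte Carlo check of the classical Tchebycheff lower bound
# P(|X - mu| < k*sigma) >= 1 - 1/k^2, here on Exp(1), which has
# mean 1 and standard deviation 1. Illustrative only.
import random

random.seed(0)
mu, sigma = 1.0, 1.0          # mean and sd of Exp(1)
n, k = 100_000, 2.0
xs = [random.expovariate(1.0) for _ in range(n)]
freq = sum(abs(x - mu) < k * sigma for x in xs) / n
print(freq, 1 - 1 / k**2)     # empirical frequency vs lower bound
```

For Exp(1) the true probability is P(X < 3) = 1 − e⁻³ ≈ 0.95, comfortably above the bound 0.75, which illustrates why sharper, asymmetric-interval bounds of the kind the paper derives are of interest.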

Shortest confidence intervals and UMVU estimators for families of distributions involving truncation parameters

Shortest confidence interval estimation for families of distributions involving two truncation parameters

Metrika, 1990

In this paper we construct shortest confidence intervals for families of distributions involving two unknown truncation parameters. The method used is that of the pivotal quantity.

More on Shortest and Equal Tails Confidence Intervals

Communications in Statistics - Theory and Methods, 2006

An interesting topic in mathematical statistics is that of constructing confidence intervals. Two types of intervals, both based on the method of the pivotal quantity, are available: the Shortest Confidence Interval (SCI) and the Equal Tails Confidence Interval (ETCI). The aims of this article are: (i) to clarify and comment on methods of finding such intervals; (ii) to investigate the relationship between these types of intervals; (iii) to point out that confidence intervals with the shortest length do not always exist, even when the distribution of the pivotal quantity is symmetric; and finally, (iv) to give similar results when the Bayesian approach is used.
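The SCI/ETCI distinction can be sketched numerically for one standard case: the variance of a normal sample via the pivot (n−1)S²/σ² ~ χ²ₙ₋₁. The grid search below is a hypothetical illustration of the idea, not the article's construction; function names and the sample values are assumptions:

```python
# Sketch: equal-tails vs shortest confidence interval for a normal
# variance sigma^2, using the pivotal quantity (n-1)S^2/sigma^2,
# which is chi-square with n-1 degrees of freedom. The "shortest"
# branch searches over tail splits with total mass alpha; this
# brute-force grid is illustrative only.
import numpy as np
from scipy.stats import chi2

def variance_ci(s2, n, alpha=0.05, kind="equal"):
    """CI for sigma^2 given sample variance s2 from n observations."""
    df = n - 1
    if kind == "equal":
        # put alpha/2 in each tail of the pivot's distribution
        a, b = chi2.ppf(alpha / 2, df), chi2.ppf(1 - alpha / 2, df)
    else:
        # "shortest": minimise interval length over lower-tail mass p
        ps = np.linspace(1e-6, alpha - 1e-6, 2000)
        a_grid = chi2.ppf(ps, df)
        b_grid = chi2.ppf(ps + 1 - alpha, df)
        lengths = df * s2 * (1 / a_grid - 1 / b_grid)
        i = np.argmin(lengths)
        a, b = a_grid[i], b_grid[i]
    return df * s2 / b, df * s2 / a

lo_e, hi_e = variance_ci(2.0, 15, kind="equal")
lo_s, hi_s = variance_ci(2.0, 15, kind="shortest")
print(hi_e - lo_e, hi_s - lo_s)  # shortest is never longer
```

Because the chi-square density is skewed, the length-minimising tail split differs from the equal split, so the SCI is strictly shorter than the ETCI here; point (iii) of the abstract warns that such a minimiser need not exist in general.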

Minimum variance unbiased estimation in doubly type II censored samples from families of distributions involving two truncation parameters

Communications in Statistics - Theory and Methods, 1990

In this paper we construct uniformly minimum variance unbiased estimators for U-estimable functions when the underlying family of distributions involves two unknown truncation parameters and the sample is doubly Type II censored. Previous relevant results for the complete sample case are obtained as special cases of our results.

On the Role of Semielasticity in Statistics

The American Statistician, 2004

Assunção (1996) restated the lower bound for the mean squared relative error (MSRE) of an unbiased estimator θ̂ of a parameter θ in terms of the elasticity of the likelihood function with respect to θ. The MSRE is defined as R(θ̂, θ) = E(θ̂ − θ)…

Shortest Confidence Intervals for Families of Distributions Involving Truncation Parameters

The American Statistician, 1990


Discrete approximations to the Csiszár, Rényi, and Fisher measures of information

Loss of information due to groupings

Order preserving property of measures of information

Communications in Statistics - Theory and Methods

Key words and phrases: Fisher and divergence-type measures of information; order preserving property. If the amount of information contained in a r.v. is greater than that …

Discretization of (h, φ)-divergences

New parametric measures of information

Information and Control, 1981

In this paper methods are presented for obtaining parametric measures of information from the non-parametric ones and from information matrices. The properties of these measures are examined. The one-dimensional parametric measures derived from the non-parametric ones are superior to Fisher's information measure because they are free from regularity conditions. But if we impose the regularity conditions of the Fisherian theory of information, these measures become linear functions of Fisher's measure.

Statistical and Non Statistical Measures of Information

Advances in Communications, 1980

Measures of information are characterized as parametric, non-parametric, and entropy-type. Statistical measures of information are nonnegative, strongly or weakly additive, have the maximal information property, and are invariant under sufficient transformations of the data. The various measures of information are examined in the light of the above properties and characterizations. They are also examined for convexity, loss of information, and appearance in Cramér-Rao type inequalities. Methods for obtaining parametric measures of information from the non-parametric ones and from information matrices are also presented.

An information theoretic argument for the validity of the exponential model

Metrika, 1994

Based on the Cramér-Rao inequality (in the multiparameter case), the lower bound of the Fisher information matrix is achieved if and only if the underlying distribution is the r-parameter exponential family. This family and the lower bound of the Fisher information matrix are characterized when some constraints in the form of expected values of some statistics are available. Combining the previous results, we can find the class of parametric functions and the corresponding UMVU estimators via the Cramér-Rao inequality.

Some Information Theoretic Ideas Useful in Statistical Inference

Methodology and Computing in Applied Probability, 2007

In this paper we discuss four information theoretic ideas and present their implications for statistical inference: (1) Fisher information and divergence generating functions; (2) information optimum unbiased estimators; (3) information content of various statistics; (4) characterizations based on Fisher information.

Information in experiments and sufficiency

Journal of Statistical Planning and Inference, 1982

We consider measures of information contained in an experiment which are by-products of the parametric measures of Fisher, Vajda, Mathai, and Boekee and the non-parametric measures of Bhattacharyya, Rényi, Matusita, Kagan, and Csiszár. We use these measures to compare experiments according to Blackwell's definition. In particular, we prove that if ℰX and ℰY are two experiments and ℰX ≥ ℰY, then IX ≥ IY for all of the above measures.

Information and random censoring

Information Sciences, 1996

Fisher and divergence type measures of information in the area of random censoring are introduced and compared with the measures of Hollander, Proschan, and Sconing. The basic properties of statistical information theory are established for these measures of information. The winners are the classical measures of information.

Divergence statistics: sampling properties and multinomial goodness of fit and divergence tests

Communications in Statistics - Theory and Methods, 1990


On Two Forms of Fisher's Measure of Information

Communications in Statistics - Theory and Methods, 2005

Communications in Statistics - Theory and Methods, 34: 1461–1470, 2005. Copyright © Taylor & Francis, Inc. ISSN: 0361-0926 print / 1532-415X online. DOI: 10.1081/STA-200063386. By Takis Papaioannou and Kosmas …
