Characterizations of the Best Linear Unbiased Estimator in the General Gauss-Markov Model with the Use of Matrix Partial Orderings
Related papers
Characterizations of the Best Linear Unbiased Estimator
Under the general Gauss-Markov model {Y, Xβ, σ²V}, two new characterizations of BLUE(Xβ) are derived involving the Löwner and rank-subtractivity partial orderings between the dispersion matrix of BLUE(Xβ) and the dispersion matrix of Y. As particular cases of these characterizations, three new criteria for the equality between OLSE(Xβ) and BLUE(Xβ) are given.
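For reference, the two partial orderings named in this abstract have standard definitions (stated here from the general matrix-theory literature, not quoted from the paper itself):

```latex
% Löwner partial ordering: A precedes B when B - A is nonnegative definite
A \preceq_{\mathrm{L}} B \;\iff\; B - A \succeq 0,
% Rank-subtractivity (minus) partial ordering: ranks subtract exactly
A \leq^{-} B \;\iff\; \operatorname{rank}(B - A) = \operatorname{rank}(B) - \operatorname{rank}(A).
```

Both orderings are applied in the paper to the pair D(BLUE(Xβ)) and D(Y), the dispersion matrices of the estimator and of the observation vector.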
Reshetov LA On Equalities for BLUEs under Misspecified Gauss–Markov Models
This paper studies relationships between the best linear unbiased estimators (BLUEs) of estimable parametric functions Kβ under the Gauss-Markov model {y, Xβ, σ²Σ} and its misspecified model {y, X₀β, σ²Σ₀}. In addition, relationships between BLUEs under a restricted Gauss-Markov model and its misspecified model are also investigated.
International Journal of Statistics and Probability, 2020
We provide a new, concise derivation of necessary and sufficient conditions for the explicit characterization of the general nonnegative-definite covariance structure V of a general Gauss-Markov model with E(y) = Xβ and Var(y) = σ²V such that the best linear unbiased estimator, the weighted least squares estimator, and the least squares estimator of Xβ are identical. In addition, we derive a representation of the general nonnegative-definite covariance structure V defined above in terms of its Moore-Penrose pseudo-inverse.
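As a minimal numerical sketch of the equality discussed above (this is an illustration of the well-known commutation criterion HV = VH, where H is the orthogonal projector onto the column space of X, not the paper's own derivation): a covariance structure that commutes with H makes OLSE(Xβ) and BLUE(Xβ) coincide.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 6, 2
X = rng.standard_normal((n, p))
H = X @ np.linalg.pinv(X)          # orthogonal projector X(X'X)^- X' onto C(X)

# A covariance structure commuting with H: V = 2H + 5(I - H).
# For such V, OLSE(Xb) and BLUE(Xb) coincide, since HV = VH.
V = 2 * H + 5 * (np.eye(n) - H)
print(np.allclose(H @ V, V @ H))   # True

# Direct check with a nonsingular V: BLUE(Xb) = X (X'V^-1 X)^-1 X'V^-1 y
y = rng.standard_normal(n)
Vinv = np.linalg.inv(V)
blue = X @ np.linalg.solve(X.T @ Vinv @ X, X.T @ Vinv @ y)
olse = H @ y
print(np.allclose(blue, olse))     # True
```

Here V is deliberately built from the projectors H and I − H so that the commutation condition holds by construction; replacing V by an arbitrary positive-definite matrix will generally break the equality.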
Linear Algebra and its Applications, 1989
Given matrices A, B and vectors a, b, a necessary and sufficient condition is established for the Löwner partial ordering (Am + a)(Am + a)' ≼ (Bm + b)(Bm + b)' to hold for all vectors m. This result is then applied to derive a complete characterization of estimators that are admissible for a given vector of parametric functions among the set of all linear estimators under the general Gauss-Markov model, when the mean square error matrix is adopted as the criterion for evaluating the estimators.
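A small numerical illustration of the Löwner comparison appearing in this abstract, for a single fixed m (so with u = Am + a and v = Bm + b the comparison is between the rank-one matrices uu' and vv'; the code and the function name are illustrative, not from the paper):

```python
import numpy as np

def loewner_leq(A, B, tol=1e-10):
    """Check A <=_L B, i.e. B - A is nonnegative definite (symmetric inputs)."""
    return np.linalg.eigvalsh(B - A).min() >= -tol

# For rank-one matrices, uu' <=_L vv' forces u to be proportional to v
# with proportionality factor at most 1 in absolute value.
v = np.array([1.0, 2.0, 3.0])
u_ok  = 0.5 * v                    # proportional and shorter: ordering holds
u_bad = np.array([2.0, 0.0, 0.0])  # not proportional to v: ordering fails

print(loewner_leq(np.outer(u_ok, u_ok), np.outer(v, v)))    # True
print(loewner_leq(np.outer(u_bad, u_bad), np.outer(v, v)))  # False
```

The eigenvalue test is the standard way to verify nonnegative definiteness of the difference B − A numerically; the paper's contribution is the algebraic condition that makes this hold simultaneously for all m.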
Admissible linear estimators in the general Gauss-Markov model
Journal of Statistical Planning and Inference, 1988
This paper derives a complete characterization of estimators that are admissible for a given identifiable vector of parametric functions among the set of linear estimators under the general Gauss-Markov model with a possibly singular dispersion matrix. The characterization obtained implies some corollaries, which are then compared with results known in the literature.
Reshetov LA The BLUE’s covariance matrix revisited: A review
In this paper we comment on and review some unexpected but interesting features of the BLUE (best linear unbiased estimator) of the expectation vector in the general linear model and in particular, the BLUE's covariance matrix. Most of these features appear in the literature but are rather scattered or hidden.
Journal of Statistical Planning and Inference, 1992
It is known that if the Gauss-Markov model M = {Y, Xβ, σ²V} has the column space of the model matrix X not contained in the column space of the dispersion matrix σ²V, then the vector of parameters β has to satisfy certain linear equations. However, these equations become restrictions on β in the usual sense only when the random vector Y involved in them is replaced by an observed outcome y. In this paper, explicit solutions to several statistical problems are derived in two situations: when β is unconstrained and when β is constrained by the 'natural restrictions' mentioned above. The problems considered are: linear unbiased estimation and best linear unbiased estimation of an identifiable vector of parametric functions, comparison of estimators of any vector of parametric functions with respect to the matrix risk, and admissibility among the class of all linear estimators with respect to the matrix risk and with respect to the mean square error. The solutions corresponding to the unconstrained and constrained cases are compared to show in what sense β may be considered to be free to vary without loss of generality. AMS Subject Classification: 62J05, 62C15. Key words and phrases: general Gauss-Markov model; singular linear model; linear estimator; unbiasedness; minimum dispersion linear unbiased estimator; matrix risk; quadratic risk; admissibility. (J.K. Baksalary et al., Restrictions on estimation in Gauss-Markov models)
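A one-line sketch of how the 'natural restrictions' arise (standard reasoning in the singular-model literature, not quoted from the paper): since Y − Xβ lies in the column space C(V) almost surely, any matrix N whose columns annihilate V yields linear equations on β.

```latex
% Y - X\beta \in \mathcal{C}(V) \text{ a.s.; choose } N \text{ with } N'V = 0. \text{ Then}
N'Y = N'X\beta \quad \text{a.s.},
% which becomes a genuine restriction on \beta once Y is replaced
% by an observed outcome y:
N'y = N'X\beta .
```

When C(X) ⊆ C(V) one may take N'X = 0, so the equations are vacuous; restrictions appear exactly in the case described in the abstract, C(X) ⊄ C(V).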
Reshetov LA Comparing the BLUEs Under Two Linear Models
In this article, we consider two linear models, M₁ = {y, Xβ, V₁} and M₂ = {y, Xβ, V₂}, which differ only in their covariance matrices. Our main focus lies on the difference of the best linear unbiased estimators, BLUEs, of Xβ under these models. The corresponding problems between the models {y, Xβ, Iₙ} and {y, Xβ, V}, i.e., between the OLSE (ordinary least squares estimator) and BLUE, are pretty well studied. Our purpose is to review the corresponding considerations between the BLUEs of Xβ under M₁ and M₂. This article is an expository one presenting also new results.