Abstract: In a linear model Y = Xβ + Z, a linear functional β → γ′β is to be estimated under squared-error loss. It is well known that, provided Y is normally distributed, the ordinary least squares estimation function minimizes the risk uniformly in the class J of all equivariant estimation functions and is admissible in the class E of all unbiased estimation functions. For the design matrix X of a polynomial regression setup, it is shown for almost all estimation problems that the ordinary least squares estimation function is uniformly best in J and admissible in E only if Y is normally distributed.
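As a concrete illustration of the setting in this abstract (a minimal sketch, not the paper's construction — the design points, coefficients, and noise scale below are arbitrary choices for the demo), the following computes the ordinary least squares estimate β̂ in a polynomial regression model Y = Xβ + Z with normal errors, and then evaluates the linear functional γ′β̂:

```python
import numpy as np

rng = np.random.default_rng(0)

# Polynomial design matrix: columns 1, t, t^2 at n = 20 observation points t_i.
t = np.linspace(0.0, 1.0, 20)
X = np.vander(t, N=3, increasing=True)   # shape (20, 3)

# True coefficients and normally distributed errors Z (both chosen for the demo).
beta = np.array([1.0, -2.0, 0.5])
Y = X @ beta + rng.normal(scale=0.1, size=t.size)

# OLS estimate beta_hat = (X'X)^{-1} X'Y, computed stably via least squares.
beta_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)

# The linear functional gamma'beta, here the fitted polynomial value at t = 0.5.
gamma = np.array([1.0, 0.5, 0.25])
estimate = gamma @ beta_hat
```

Under normality of Y, this γ′β̂ is the uniformly best equivariant estimate under squared-error loss; the abstract's point is that for polynomial designs this optimality essentially characterizes the normal distribution.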
Abstract: Some necessary and sufficient conditions are given for two equalities of ordinary least-squa...
Sockloff (1976), in reviewing the appropriateness of fixed and random models in regression analysis,...
We consider two types of problems in maximum likelihood estimation of parameters of linear functions...
Summary: Let $\bold Y$ be an $n$-dimensional random vector which is $N_n(\bold {A0,K})$ distributed. ...
Abstract: The problem of comparing the ordinary least-squares estimator β̂ and the restricted least-sq...
Abstract: Let X be an observation from a p-variate (p ≥ 3) normal random vector with unknown mean vect...
Abstract: The criterion robustness of the standard likelihood ratio test (LRT) under the multivariate ...
Abstract: In a standard linear model, we explore the optimality of the least squares estimator under a...
Abstract: This paper is concerned with the linear regression model in which the variance of the depend...
Abstract: Admissibility of linear estimators of a regression coefficient in linear models with and wit...
Two methods of estimating the parameters of a polynomial regression with measurement errors in the r...
Summary: The paper deals with the linear model with uncorrelated observations. The dispersions of the...
The ordinary least squares (OLS) method has been extensively applied to estimation of d...