In a linear model $Y = X\beta + Z$, a linear functional $\beta \mapsto \gamma'\beta$ is to be estimated under squared error loss. It is well known that, provided $Y$ is normally distributed, the ordinary least squares estimation function minimizes the risk uniformly in the class ${\cal P}$ of all equivariant estimation functions and is admissible in the class ${\cal E}$ of all unbiased estimation functions. For the design matrix $X$ of a polynomial regression setup, it is shown for almost all estimation problems that the ordinary least squares estimation function is uniformly best in ${\cal P}$, and also admissible in ${\cal E}$, only if $Y$ is normally distributed.
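The setting above can be sketched numerically: a minimal example, assuming a hypothetical degree-2 polynomial design and normal errors, of computing the OLS estimate $\hat\beta$ and then the linear functional $\gamma'\hat\beta$ (the degree, sample size, coefficients, and choice of $\gamma$ are all illustrative, not from the paper).

```python
import numpy as np

rng = np.random.default_rng(0)

# Polynomial regression design: rows (1, x_i, x_i^2) for a degree-2 setup
x = np.linspace(-1.0, 1.0, 50)
X = np.vander(x, N=3, increasing=True)

# Model Y = X beta + Z with normally distributed errors Z
beta = np.array([2.0, -1.0, 0.5])                   # illustrative true coefficients
Y = X @ beta + rng.normal(scale=0.1, size=x.size)

# Ordinary least squares estimate: beta_hat = (X'X)^{-1} X'Y
beta_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Linear functional gamma' beta; here gamma picks out the slope coefficient
gamma = np.array([0.0, 1.0, 0.0])
estimate = gamma @ beta_hat
```

Under normal errors, `estimate` is the uniformly best equivariant (and unbiased-admissible) estimate of $\gamma'\beta$ in the sense of the abstract; here it should land near the true slope $-1$.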
Abstract: Admissibility of linear estimators of a regression coefficient in linear models with and wit...
The evaluation of Ordinary Least Squares (OLS) and polynomial regression (PR) on their predictive pe...
In this paper we develop a general theory of local asymptotics for least squares estimates over poly...
The ordinary least squares (OLS) method had been extensively applied to estimation of d...
A polynomial functional relationship with errors in both variables can be consistently estimated by ...
In a standard linear model, we assume that . Alternatives can be considered, when the linear assumpt...
In a standard linear model, we explore the optimality of the least squares estimator under assumption...
Consider the heteroscedastic polynomial regression model $ Y = \beta_0 + \beta_1X + ... + \beta_pX^...
We give the limiting distribution of the least squares estimator in the polynomial regression model ...
In general, the theory developed in the area of linear regression analysis assumes that the error ∊ ...
The local least-squares estimator for a regression curve cannot provide optimal results when non-Gau...
Let (X, Y) be a pair of random variables such that X = (X1,..., Xd) ranges over a nondegenerate comp...
Ordinary Least Squares (OLS) is a method for analyzing and estimating the relationship among vari...