In a linear model Y = Xβ + Z a linear functional β → γ′β is to be estimated under squared error loss. It is well known that, provided Y is normally distributed, the ordinary least squares estimation function minimizes the risk uniformly in the class J of all equivariant estimation functions and is admissible in the class E of all unbiased estimation functions. For the design matrix X of a polynomial regression setup it is shown, for almost all estimation problems, that the ordinary least squares estimation function is uniformly best in J and also admissible in E only if Y is normally distributed.
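As a point of reference for the abstract above, the ordinary least squares estimate of γ′β and its risk under squared error loss take the standard Gauss–Markov form (textbook material, not a result specific to the paper):
\[
\hat\beta = (X'X)^{-1}X'Y, \qquad \widehat{\gamma'\beta} = \gamma'\hat\beta,
\]
and, when the error vector Z has mean zero and covariance matrix $\sigma^2 I$,
\[
E\bigl(\gamma'\hat\beta - \gamma'\beta\bigr)^2 = \sigma^2\,\gamma'(X'X)^{-1}\gamma .
\]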
Consider the heteroscedastic polynomial regression model $ Y = \beta_0 + \beta_1X + ... + \beta_pX^p...
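A minimal simulation sketch of the model named in this snippet, assuming a cubic polynomial and a variance function chosen purely for illustration; the fit below is plain OLS, not the estimator proposed in the paper:

import numpy as np

rng = np.random.default_rng(0)
n, p = 200, 3
beta = np.array([1.0, -2.0, 0.5, 0.25])           # hypothetical true coefficients beta_0..beta_p
x = rng.uniform(-1.0, 1.0, size=n)
X = np.vander(x, N=p + 1, increasing=True)        # design matrix with columns 1, x, x^2, x^3
sigma_x = 0.2 + 0.8 * x**2                        # error scale depends on x (heteroscedasticity)
y = X @ beta + sigma_x * rng.standard_normal(n)

beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)  # ordinary least squares fit of beta
print(beta_hat)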
Multivariate local polynomial fitting is applied to the multivariate linear heteroscedastic regressi...
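For readers unfamiliar with the term, a one-dimensional local linear smoother illustrates what "local polynomial fitting" means; the snippet above concerns the multivariate heteroscedastic case, which this toy sketch does not cover:

import numpy as np

def local_linear(x0, x, y, h):
    """Local linear estimate of E[Y | X = x0] with a Gaussian kernel and bandwidth h."""
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)          # kernel weights centred at x0
    X = np.column_stack([np.ones_like(x), x - x0])  # local design: intercept and slope
    XtW = X.T * w                                   # apply weights without forming diag(w)
    a, _ = np.linalg.solve(XtW @ X, XtW @ y)        # weighted least squares coefficients
    return a                                        # intercept = fitted value at x0

# Example: smooth noisy samples of a sine curve at one point
rng = np.random.default_rng(1)
x = rng.uniform(0, 2 * np.pi, 300)
y = np.sin(x) + 0.3 * rng.standard_normal(300)
print(local_linear(np.pi / 2, x, y, h=0.4))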
Let (X, Y) be a pair of random variables such that X = (X1,…, Xd) ranges over a nondegenerat...
A polynomial functional relationship with errors in both variables can be consistently estimated by ...
In a standard linear model, we assume that . Alternatives can be considered, when the linear assumpt...
The ordinary least squares (OLS) method has been extensively applied to estimation of d...
We give the limiting distribution of the least squares estimator in the polynomial regression model ...
Ordinary Least Squares (OLS) is a method for analyzing and estimating the relationship among vari...
In general, the theory developed in the area of linear regression analysis assumes that the error ∊ ...
The local least-squares estimator for a regression curve cannot provide optimal results when non-Gau...
This paper considers a class of densities formed by taking the product of nonnegative polynomials an...
The evaluation of Ordinary Least Squares (OLS) and polynomial regression (PR) on their predictive pe...