This thesis considers the regression analysis problem in which the estimators of the parameters are selected according to some criterion other than least squares. Two basic areas are considered. First, some basic properties are derived for the estimators that minimize the sum of the absolute values of the residuals raised to the $X$ power. Both the homoscedastic and heteroscedastic cases are considered. Second, procedures for estimating weights for two types of heteroscedastic models are presented.
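The criterion this abstract describes, minimizing the sum of absolute residuals raised to a power, can be sketched numerically. The data, the choice of power $p = 1.5$, and the use of `scipy.optimize.minimize` below are illustrative assumptions, not the thesis's own method.

```python
# Sketch of L_p regression: choose beta to minimize sum_i |y_i - x_i' beta|^p.
# Hypothetical example data; p = 1.5 sits between least absolute deviations
# (p = 1) and ordinary least squares (p = 2).
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n = 200
X = np.column_stack([np.ones(n), rng.uniform(-2, 2, n)])
beta_true = np.array([1.0, 3.0])
y = X @ beta_true + rng.normal(0, 0.5, n)

def lp_loss(beta, p=1.5):
    # Sum of absolute residuals raised to the p-th power.
    return np.sum(np.abs(y - X @ beta) ** p)

# The OLS solution is a convenient starting point for the minimization.
beta0 = np.linalg.lstsq(X, y, rcond=None)[0]
res = minimize(lp_loss, beta0, method="Nelder-Mead")
print(res.x)  # close to beta_true for this well-behaved example
```

For $1 < p < 2$ the objective is convex and smooth away from zero residuals, so a general-purpose minimizer converges reliably from the OLS start.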
This paper provides a root-n consistent, asymptotically normal weighted least squares estimator of t...
This thesis considers the problem of estimation in the presence of heteroskedasticity of unknown for...
Abstract: In classical regression analysis, the error in the independent variable is usually not taken in...
In the present thesis we deal with the linear regression models based on least squares. These method...
In the normal linear regression the least square estimation of the coefficients has a series of nice...
Classical least squares regression consists of minimizing the sum of the squared residuals. Many aut...
There is a well-known simple formula for computing prediction sum of squares (PRESS) residuals in a ...
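The well-known shortcut this abstract refers to is the identity that, in a linear model, the $i$-th PRESS (leave-one-out) residual equals $e_i / (1 - h_{ii})$, where $e_i$ is the ordinary residual and $h_{ii}$ the $i$-th diagonal of the hat matrix $H = X(X'X)^{-1}X'$. The toy data below is a hypothetical illustration of that identity:

```python
# PRESS residuals via the hat-matrix shortcut, verified against brute-force
# leave-one-out refits.
import numpy as np

rng = np.random.default_rng(1)
n = 30
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ np.array([2.0, -1.0]) + rng.normal(0, 1.0, n)

H = X @ np.linalg.inv(X.T @ X) @ X.T
e = y - H @ y                          # ordinary residuals
press = e / (1 - np.diag(H))           # PRESS residuals via the shortcut

# Brute force: refit n times, each time leaving observation i out.
loo = np.empty(n)
for i in range(n):
    mask = np.arange(n) != i
    b = np.linalg.lstsq(X[mask], y[mask], rcond=None)[0]
    loo[i] = y[i] - X[i] @ b
print(np.allclose(press, loo))  # the two routes agree
```

The formula is exact, so the shortcut replaces $n$ refits with a single fit plus the hat-matrix diagonal.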
Much of the data analysed by least squares regression methods violates the assumption that independe...
The ordinary least squares regression can be misleading when there are outliers, heteroscedasticity ...
Consider the heteroscedastic polynomial regression model $ Y = \beta_0 + \beta_1X + ... + \beta_pX^...
Abstract: This paper is concerned with the linear regression model in which the variance of the depend...
In this thesis the concept of regression analysis is discussed. It involves the methods for finding ...
The asymptotic properties of the least squares estimator are derived for a non-regular nonlinear mod...
We consider the problem of making inferences about the parameters in a heteroskedastic regression mo...
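One standard route to inference under heteroskedasticity of unknown form is to fit by OLS and then compute White's (HC0) sandwich covariance, $(X'X)^{-1} X' \operatorname{diag}(e_i^2) X (X'X)^{-1}$, for the standard errors. The abstract does not say which method the paper uses; the sketch below, with hypothetical data whose error variance grows with $x$, only illustrates this common approach:

```python
# OLS point estimates with White's heteroskedasticity-robust (HC0) covariance.
import numpy as np

rng = np.random.default_rng(2)
n = 500
x = rng.uniform(1, 5, n)
X = np.column_stack([np.ones(n), x])
y = X @ np.array([0.5, 2.0]) + rng.normal(0, x)   # error sd grows with x

XtX_inv = np.linalg.inv(X.T @ X)
beta = XtX_inv @ X.T @ y                # ordinary least squares fit
e = y - X @ beta                        # residuals
# Sandwich: (X'X)^{-1} X' diag(e^2) X (X'X)^{-1}; X.T * e**2 scales the
# columns of X' by the squared residuals.
cov_hc0 = XtX_inv @ (X.T * e**2) @ X @ XtX_inv
se_robust = np.sqrt(np.diag(cov_hc0))
print(beta, se_robust)
```

The point estimates are the usual OLS ones; only the covariance, and hence the standard errors, change.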
For the given data $(w_i, x_i, y_i)$, $i = 1, \ldots, n$, and the given model function $f(x; \theta)$, where $\theta$ i...
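The weighted setup this abstract opens with can be sketched as choosing $\theta$ to minimize $\sum_i w_i \bigl(y_i - f(x_i; \theta)\bigr)^2$. The exponential model function and the inverse-variance weights below are hypothetical choices for illustration:

```python
# Weighted least squares for a general model f(x; theta): minimizing the sum
# of squares of sqrt(w_i) * (y_i - f(x_i; theta)) is the weighted problem.
import numpy as np
from scipy.optimize import least_squares

def f(x, theta):
    a, b = theta
    return a * np.exp(b * x)

rng = np.random.default_rng(3)
x = np.linspace(0.0, 2.0, 50)
theta_true = np.array([2.0, 0.7])
sigma = 0.1 * (1 + x)                  # noise level grows with x
y = f(x, theta_true) + rng.normal(0, sigma)
w = 1.0 / sigma**2                     # inverse-variance weights

def wres(theta):
    # Weighted residuals handed to the least-squares solver.
    return np.sqrt(w) * (y - f(x, theta))

fit = least_squares(wres, x0=[1.0, 0.0])
print(fit.x)  # close to theta_true
```

Absorbing the weights into the residuals lets any unweighted nonlinear least-squares solver handle the weighted problem unchanged.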