We assume as our model a standard multivariate regression of y on x, fitted to a controlled calibration sample and used to estimate unknown x's from observed y-values. The standard weighted least squares estimator ('classical': regress y on x and 'solve' for x) and the biased inverse regression estimator (regress x on y) are compared with respect to mean squared error. We derive the regions in which the inverse regression estimator yields the smaller MSE. For any particular component of x, this region is likely to contain 'most' future values in usual practice. For simultaneous estimation, however, this need not be true.

Keywords: mean squared error; multivariate regression
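The comparison above can be illustrated in the simplest univariate case. The sketch below is an illustrative assumption, not the paper's own setup: it fits a calibration line, then estimates an unknown x from new y-observations with both the classical estimator (invert the y-on-x fit) and the inverse estimator (regress x on y directly), and compares their empirical MSEs.

```python
# Minimal sketch (assumed setup): classical vs. inverse calibration
# estimators in univariate linear calibration.
import numpy as np

rng = np.random.default_rng(0)

# Calibration sample: known x, observed y = alpha + beta*x + noise.
x = np.linspace(0.0, 10.0, 25)
alpha, beta, sigma = 1.0, 2.0, 0.5
y = alpha + beta * x + rng.normal(0.0, sigma, x.size)

# Classical estimator: regress y on x, then solve for x.
b1, b0 = np.polyfit(x, y, 1)           # y ~ b0 + b1*x
def classical(y0):
    return (y0 - b0) / b1

# Inverse estimator: regress x on y directly.
c1, c0 = np.polyfit(y, x, 1)           # x ~ c0 + c1*y
def inverse(y0):
    return c0 + c1 * y0

# Empirical MSE over repeated noisy observations of a true x0
# chosen near the centre of the calibration range, where the
# inverse estimator's region of smaller MSE typically lies.
x0 = 5.0
y_new = alpha + beta * x0 + rng.normal(0.0, sigma, 10_000)
mse_classical = np.mean((classical(y_new) - x0) ** 2)
mse_inverse = np.mean((inverse(y_new) - x0) ** 2)
print(mse_classical, mse_inverse)
```

The inverse estimator shrinks estimates toward the calibration mean of x, trading bias for variance; far from the centre of the calibration range the classical estimator's MSE advantage returns, which is the region result summarized above.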
A family of dimension reduction methods was developed by Cook and Ni [Sufficient dimension reduction...
This chapter deals with multiple linear regression. That is, we investigate...
Consider multivariate linear calibration of a single standard. We show that a selection of the q' mo...
In univariate calibration, two standard estimators are usually opposed: the classical estima...
Conditions are derived under which the mixed regression estimator (MRE) is better than the ordinary ...
In univariate calibration problems two different estimators are commonly in use. They are re...
Since simple linear regression theory was established at the beginning of the 1900s, it has been use...
In univariate calibration problems two different estimators are commonly in use. They are referred t...
Regression analysis makes up a large part of supervised machine learning, and consists of the predic...
The inverse estimation problem consists of a calibration stage and a prediction stage. In the calibr...
The present article considers the problem of consistent estimation in measurement error models. A li...
Graduation date: 2006. Regression calibration inference seeks to estimate regression models with measu...
In the presence of omitted variables or similar validity threats, regression estimates are biased. U...
Researchers need to consider robust estimation methods when analyzing data in multiple regression. T...