Weak consistency and asymptotic normality of the ordinary least-squares estimator in a linear regression with adaptive learning are derived when the crucial, so-called 'gain' parameter is estimated in a first step by nonlinear least squares from an auxiliary model. The singular limiting distribution of the two-step estimator is normal and in general affected by the sampling uncertainty from the first step. However, this 'generated-regressor' issue disappears for certain parameter combinations.
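The two-step procedure described above can be sketched in miniature. The snippet below is an illustrative stand-in, not the paper's actual estimator: it assumes a simple constant-gain mean-updating rule for the agents' belief, estimates the gain in a first step by minimizing the (profiled) nonlinear least-squares criterion, and then runs OLS on the generated regressor built from the estimated gain. All model choices (the belief recursion, the scalar regression, the simulated parameter values) are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)

def beliefs(y, gamma):
    """Constant-gain learning: a_t = a_{t-1} + gamma * (y_t - a_{t-1}).

    Returns the lagged belief a_{t-1}, which serves as the regressor for y_t.
    """
    a = np.empty(len(y) + 1)
    a[0] = 0.0
    for t, yt in enumerate(y):
        a[t + 1] = a[t] + gamma * (yt - a[t])
    return a[:-1]

# Simulate a toy adaptive-learning regression: y_t = beta * a_{t-1} + eps_t,
# with the belief a_t updated by the constant-gain rule above.
T, beta_true, gamma_true = 2000, 0.5, 0.1
y = np.empty(T)
a = 0.0
for t in range(T):
    y[t] = beta_true * a + rng.normal()
    a = a + gamma_true * (y[t] - a)

# Step 1: estimate the gain by nonlinear least squares, concentrating
# beta out by OLS at each candidate gamma (a profiled NLS criterion).
def ssr(gamma):
    x = beliefs(y, gamma)
    b = (x @ y) / (x @ x)          # OLS slope given this gamma
    r = y - b * x
    return r @ r

gamma_hat = minimize_scalar(ssr, bounds=(1e-3, 0.999), method="bounded").x

# Step 2: OLS with the generated regressor x_t(gamma_hat).  Because the
# regressor is built from the first-step estimate, naive OLS standard
# errors would in general understate the sampling uncertainty.
x = beliefs(y, gamma_hat)
beta_hat = (x @ y) / (x @ x)
```

The second step is ordinary OLS, but the regressor depends on the first-step estimate of the gain; this is exactly the 'generated-regressor' problem the abstract refers to, which propagates first-step sampling uncertainty into the limiting distribution of the second-step estimator.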
This paper is concerned with the linear regression model in which the variance of the depend...
We derive expressions for the asymptotic approximation of the bias of the least squares esti...
We consider a linear model where the coefficients - intercept and slopes - are random with a distrib...
This paper looks at the strong consistency of the ordinary least squares (OLS) estimator in linear r...
We discuss techniques of estimation and inference for nonlinear cohort panels with learning from exp...
The paper uses empirical process techniques to study the asymptotics of the least-squares es...
Most of the paper is about modeling (or approximating) nonstochastic regressors. Examples o...
More than thirty years ago Halbert White inaugurated a “model-robust” form of statistical inference b...
We build the Conditional Least Squares Estimator of θ0 based on the observation of a single trajecto...
We consider a mixed vector autoregressive model with deterministic exogenous regressors and an autor...
We consider the least-squares regression problem and provide a detailed asympt...
We study the asymptotics for jump-penalized least squares regression aiming at approximating a regre...
Assume observations Y_t, defined on a complete probability space (Ω, F, P), are ge...
We investigate the asymptotic behavior of the OLS estimator for regressions with two slowly varying ...