Local polynomials are used to construct estimators for the value $m(x_{0})$ of the regression function $m$ and the values of its derivatives $D_{\gamma}m(x_{0})$ in a general class of nonparametric regression models. The covariables are allowed to be random or non-random. Only asymptotic conditions on the average distribution of the covariables are imposed as a smoothness requirement on the experimental design; this smoothness condition is discussed in detail. The optimal stochastic rate of convergence of the estimators is established. The results cover the special cases of regression models with i.i.d. errors and of observations on an equidistant lattice.
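For context, a sketch of the standard local polynomial construction behind such estimators (the general technique, not necessarily this paper's exact formulation; the kernel $K$, bandwidth $h$, and degree $p$ are generic ingredients, not taken from the abstract): a polynomial in $X_i - x_0$ is fitted by kernel-weighted least squares,
$$
\hat{\beta} \;=\; \operatorname{arg\,min}_{\beta}\ \sum_{i=1}^{n}\Big(Y_i - \sum_{|\gamma|\le p}\beta_{\gamma}\,(X_i - x_0)^{\gamma}\Big)^{2}\, K\!\Big(\frac{X_i - x_0}{h}\Big),
$$
and the estimators are read off from the coefficients as $\widehat{m}(x_0)=\hat{\beta}_{0}$ and $\widehat{D_{\gamma}m}(x_0)=\gamma!\,\hat{\beta}_{\gamma}$ for multi-indices $|\gamma|\le p$.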
We consider the estimation of multivariate regression functions r(x1,...,xd) and their partial deriv...
We explore a class of vector smoothers based on local polynomial regression for fitting nonparametri...
Consider the estimation of g(ν), the ν-th derivative of the mean function, in a fixed-design nonpara...
Let $(X, Y)$ be a pair of random variables such that $X$ ranges over $[0, 1]$ and $Y$ is real-valued and ...
We consider local polynomial fitting for estimating a regression function and its derivatives nonpar...
This article considers the problem of nonparametric estimation of the regressi...
In this work, we consider a multivariate regression model with one-sided error...
Theoretical thesis. Bibliography: pages 51-53. 1. Introduction -- 2. Notations and assumptions -- 3. R...
The effect of errors in variables in nonparametric regression estimation is examined. To account for...
This thesis is focused on local polynomial smoothers of the conditional variance function in a het...
We consider the estimation of the multivariate regression function m(x1, …, xd) = E[ψ(Yd)|X1...
This paper concerns M-estimators for the partly linear model $Y_i = X_i^{\tau}\beta_o + g_o(T_i)$ ...
A nonparametric regression estimator based on locally weighted least squares fitting has been ...