In regression with a high-dimensional predictor vector, dimension reduction methods aim at replacing the predictor by a lower-dimensional version without loss of information on the regression. In this context, the so-called central mean subspace is the key to dimension reduction. The last two decades have seen the emergence of many methods to estimate the central mean subspace. In this paper, we go one step further, and we study the performance of a k-nearest neighbor type estimate of the regression function, based on an estimator of the central mean subspace. The estimate is first proved to be consistent. Improvement due to the dimension reduction step is then observed in terms of its rate of convergence. All the results are distributio...
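The following is a minimal, self-contained sketch of the kind of procedure described above: estimate a basis of a dimension-reduction subspace, project the predictors onto it, and run k-nearest-neighbor regression on the projected predictors. Sliced inverse regression is used here only as a stand-in subspace estimator; the function names, the synthetic single-index model, and the tuning constants are illustrative assumptions, not the authors' own construction.

import numpy as np

def sir_basis(X, y, n_slices=10, n_dirs=1):
    # Sliced inverse regression: weighted covariance of slice means of the
    # standardized predictors; its leading eigenvectors span the estimate.
    n, p = X.shape
    mu = X.mean(axis=0)
    cov = np.cov(X, rowvar=False)
    evals, evecs = np.linalg.eigh(cov)
    inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T   # Sigma^{-1/2}
    Z = (X - mu) @ inv_sqrt                               # standardized predictors
    M = np.zeros((p, p))
    for idx in np.array_split(np.argsort(y), n_slices):   # slice by the response
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    eta = np.linalg.eigh(M)[1][:, -n_dirs:]               # leading eigenvectors
    return inv_sqrt @ eta                                 # basis on the original scale

def knn_regress(X_train, y_train, X_query, k=10):
    # Plain k-nearest-neighbor regression: average the responses of the k
    # closest training points to each query point.
    preds = np.empty(len(X_query))
    for i, x in enumerate(X_query):
        nn = np.argsort(np.linalg.norm(X_train - x, axis=1))[:k]
        preds[i] = y_train[nn].mean()
    return preds

# Synthetic single-index example: y depends on X only through beta^T X.
rng = np.random.default_rng(0)
n, p = 1000, 10
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[0] = 1.0
index = X @ beta
y = index + 0.25 * index ** 3 + 0.1 * rng.standard_normal(n)

B_hat = sir_basis(X, y, n_slices=10, n_dirs=1)            # estimated basis (p x 1)
y_hat = knn_regress(X[:800] @ B_hat, y[:800], X[800:] @ B_hat, k=15)
print("test MSE after dimension reduction:", np.mean((y_hat - y[800:]) ** 2))

Running k-nearest neighbors on the one-dimensional projected predictor rather than on the raw ten-dimensional one is exactly the kind of gain in the rate of convergence that the abstract refers to.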
In linear dimension reduction for a p-variate random vector x, the general idea is to find an orthog...
Regression is the study of the dependence of a response variable y on a collection of p predictors coll...
We obtain the maximum likelihood estimator of the central subspace under conditional normality of th...
Consider a univariate response Y and a p-dimensional vector X of continuous predictors. Sufficient d...
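For readers unfamiliar with the formalism alluded to above, the standard population-level definitions read as follows (the matrix $B$ and the subspace notation are assumed here for illustration, not quoted from the abstract). Sufficient dimension reduction seeks a matrix $B$ with as few columns as possible such that
\[
  Y \mathrel{\perp\!\!\!\perp} X \mid B^{\top}X,
\]
so that $X$ can be replaced by the lower-dimensional $B^{\top}X$ without loss of information on the conditional distribution of $Y$; the central subspace $\mathcal{S}_{Y\mid X}$ is the intersection of all subspaces $\operatorname{span}(B)$ satisfying this condition. When only the conditional mean is of interest, the analogous requirement is
\[
  \mathbb{E}(Y \mid X) = \mathbb{E}\bigl(Y \mid B^{\top}X\bigr),
\]
and the central mean subspace is the intersection of all $\operatorname{span}(B)$ for which it holds.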
In regression with a high-dimensional predictor vector, it is important to estimate the central and ...
Let $(X,Y)$ be an $\mathcal X\times\mathbb R$ valued random variable, where $\mathcal X\subset \math...
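As a hedged illustration of the nonparametric estimate discussed in this setting (the notation $\hat r_n$, the number of neighbors $k$, and the neighbor sets below are assumptions, not taken from the truncated abstract), the classical $k$-nearest-neighbor regression estimate of $r(x)=\mathbb{E}[Y\mid X=x]$ is
\[
  \hat r_n(x) = \frac{1}{k}\sum_{i=1}^{n} Y_i\,
  \mathbf{1}\bigl\{X_i \text{ is among the } k \text{ nearest neighbors of } x \text{ within } X_1,\dots,X_n\bigr\},
\]
and in the dimension-reduced variant of the first abstract the neighbors are determined after projection, i.e. with $x$ and $X_i$ replaced by $\hat B^{\top}x$ and $\hat B^{\top}X_i$ for an estimated basis $\hat B$ of the central mean subspace.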
Sufficient dimension reduction (SDR) methods target finding lower-dimensional representations of a m...
This article proposes a novel approach to linear dimension reduction for regression using nonparamet...
Dimension reduction for regression is a prominent issue today because technological advances now all...
We propose a new method to estimate the intra-cluster adjusted central subspace for regressions wit...
We propose a general framework for dimension reduction in regression to fill the gap between linear ...
We investigate the estimation efficiency of the central mean subspace in the framework of sufficient...
Sufficient dimension reduction is a useful tool for studying the dependence between a response and a...
We study the asymptotic behavior of a class of methods for sufficient dimension reduction in high-di...