This article proposes a novel approach to linear dimension reduction for regression using nonparametric estimation with positive-definite kernels, or reproducing kernel Hilbert spaces (RKHSs). The purpose of the dimension reduction is to find directions in the explanatory variables that sufficiently explain the response; this is called sufficient dimension reduction. The proposed method is based on an estimator of the gradient of the regression function, considered for the feature vectors mapped into RKHSs. It is proved that the method is able to estimate the directions that achieve sufficient dimension reduction. In comparison with other existing methods, the proposed one has wide applicability without strong assumptions on the distrib...
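The gradient-based idea behind this line of work can be illustrated with a minimal sketch: estimate the regression gradient at each sample point by a locally weighted linear fit, average the outer products of those gradients, and take the top eigenvectors of the resulting matrix as the dimension-reduction directions. This is the classical outer-product-of-gradients heuristic, not the paper's RKHS estimator; the bandwidth `h` and ridge term are illustrative choices.

```python
import numpy as np

def opg_directions(X, y, d, h=1.0):
    """Sketch of gradient-outer-product dimension reduction:
    local linear fits give gradient estimates, whose averaged
    outer product is eigendecomposed for the top-d directions."""
    n, p = X.shape
    M = np.zeros((p, p))
    for i in range(n):
        diff = X - X[i]                                    # local coordinates around X[i]
        w = np.exp(-np.sum(diff**2, axis=1) / (2 * h**2))  # Gaussian kernel weights
        Z = np.hstack([np.ones((n, 1)), diff])             # intercept + linear terms
        WZ = Z * w[:, None]
        # Ridge-regularized weighted least squares for stability.
        beta = np.linalg.solve(WZ.T @ Z + 1e-4 * np.eye(p + 1), WZ.T @ y)
        g = beta[1:]                                       # estimated gradient at X[i]
        M += np.outer(g, g) / n
    vals, vecs = np.linalg.eigh(M)
    return vecs[:, ::-1][:, :d]                            # top-d eigenvectors

# Toy check: y depends on X only through its first coordinate,
# so the leading direction should align with the first axis.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = np.sin(X[:, 0]) + 0.05 * rng.normal(size=200)
B = opg_directions(X, y, d=1)
print(abs(B[0, 0]))
```

On this toy single-index model the recovered direction concentrates on the first coordinate; in practice the bandwidth would be tuned rather than fixed.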
In regression with a high-dimensional predictor vector, dimension reduction methods aim at replacing...
We introduce a principal support vector machine (PSVM) approach that can be used for both linear and...
A family of regularized least squares regression models in a Reproducing Kernel Hilbert Space is ext...
We present a new methodology for sufficient dimension reduction (SDR). Our methodology derives dire...
Nonparametric regression is a powerful tool to estimate nonlinear relations between some p...
We study the problem of discovering a manifold that best preserves information relevant to a nonline...
We propose a novel method of dimensionality reduction for supervised learning. Given a regression or...
We consider the task of dimensionality reduction for regression (DRR), whose goal is to find a low d...
We propose nonparametric Bayesian models for supervised dimension reduction and regression problems....
A novel general framework is proposed in this paper for dimension reduction in regression to fill th...
Consider a univariate response Y and a p-dimensional vector X of continuous predictors. Sufficient d...
We propose a general framework for dimension reduction in regression to fill the gap between linear ...
The multivariate adaptive regression spline (MARS) is one of the popular estimation methods for nonp...
A dimension reduction method in kernel discriminant analysis is presented, based on the concept of d...