Ridge functions have recently emerged as a powerful set of ideas for subspace-based dimension reduction. In this paper we begin by drawing parallels between ridge subspaces, sufficient dimension reduction and active subspaces, contrasting techniques rooted in statistical regression with those rooted in approximation theory. This sets the stage for our new algorithm that approximates what we call a Gaussian ridge function---the posterior mean of a Gaussian process on a dimension-reducing subspace---suitable for both regression and approximation problems. To compute this subspace we develop an iterative algorithm that alternates between optimizing over the Stiefel manifold to compute the subspace and optimizing the hyperparameters of the Gaussian process...
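To make the alternating scheme sketched in this abstract concrete, the following is a minimal, hedged illustration: a Gaussian process is fitted on the projected inputs UᵀX, then the subspace U is updated, with the Stiefel constraint handled crudely by a QR retraction after an unconstrained L-BFGS step. It uses scikit-learn's GaussianProcessRegressor and a synthetic ridge function; it is a sketch of the idea, not the authors' algorithm or implementation.

```python
# Hedged sketch: alternate between (i) fitting a GP on projected inputs X @ U and
# (ii) updating the subspace U. Not the authors' algorithm; the Stiefel constraint
# is handled by re-orthonormalizing with a QR retraction after each unconstrained
# update, and GP hyperparameters are tuned by scikit-learn's built-in
# marginal-likelihood optimizer.
import numpy as np
from numpy.linalg import qr
from scipy.optimize import minimize
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(0)
d, m, N = 10, 1, 200                       # ambient dim, subspace dim, samples
X = rng.uniform(-1, 1, size=(N, d))
a = np.ones(d) / np.sqrt(d)                # hidden ridge direction (toy data only)
y = np.sin(3 * X @ a) + 0.01 * rng.standard_normal(N)

def fit_gp(U):
    """Fit a GP to the projected data; hyperparameters optimized internally."""
    gp = GaussianProcessRegressor(ConstantKernel() * RBF(), alpha=1e-4,
                                  normalize_y=True)
    gp.fit(X @ U, y)
    return gp

def residual(u_flat, gp):
    """Squared error of the current GP posterior mean as a function of U."""
    U = qr(u_flat.reshape(d, m))[0]        # retraction back onto the Stiefel manifold
    return np.mean((gp.predict(X @ U) - y) ** 2)

U = qr(rng.standard_normal((d, m)))[0]     # random orthonormal start
for it in range(5):                        # alternate: GP fit <-> subspace update
    gp = fit_gp(U)
    res = minimize(residual, U.ravel(), args=(gp,), method="L-BFGS-B")
    U = qr(res.x.reshape(d, m))[0]
    print(f"iter {it}: mse = {residual(U.ravel(), gp):.4f}, "
          f"|cos angle| = {abs(U[:, 0] @ a):.3f}")
```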
We present a new methodology for sufficient dimension reduction (SDR). Our methodology derives dire...
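The abstract above is truncated before it describes the new SDR methodology. As a point of reference only, the sketch below shows what an SDR method estimates, a basis B such that y is independent of x given Bᵀx, using classic sliced inverse regression (SIR) on synthetic data; this is explicitly not the methodology referred to above.

```python
# Hedged illustration of the SDR target: estimate a basis B with y independent of x
# given B^T x. This is classic sliced inverse regression (SIR), shown only as a
# baseline; it is not the methodology of the abstract above.
import numpy as np

def sir(X, y, n_slices=10, n_components=1):
    """Sliced inverse regression on whitened predictors."""
    n, d = X.shape
    mu, C = X.mean(0), np.cov(X, rowvar=False)
    L = np.linalg.cholesky(C)
    Z = np.linalg.solve(L, (X - mu).T).T              # whiten: cov(Z) ~ I
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)
    M = sum(len(s) / n * np.outer(Z[s].mean(0), Z[s].mean(0)) for s in slices)
    _, vecs = np.linalg.eigh(M)
    directions = vecs[:, ::-1][:, :n_components]      # top eigenvectors of slice means
    B = np.linalg.solve(L.T, directions)              # undo the whitening
    return B / np.linalg.norm(B, axis=0)

rng = np.random.default_rng(1)
X = rng.standard_normal((2000, 6))
beta = np.array([1.0, -1.0, 0, 0, 0, 0]) / np.sqrt(2)
y = (X @ beta) ** 3 + 0.1 * rng.standard_normal(2000)
print(np.round(sir(X, y).ravel(), 2))                 # should align with +/- beta
```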
Uncertainty quantification (UQ) tasks, such as model calibration, uncertainty propagation, and optim...
We study the problem of learning ridge functions of the form f(x) = g(aᵀx), x ∈ ℝᵈ, from random sam...
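As a hedged illustration of the recovery problem stated above (not the sampling scheme or guarantees studied in that work): with Gaussian inputs and a link g whose derivative has nonzero mean along aᵀx, ordinary least squares recovers the direction a up to scale, after which g can be fitted in one dimension.

```python
# Hedged sketch: one elementary way to recover a ridge function y = g(a^T x) from
# samples. With Gaussian inputs and E[g'(a^T x)] != 0, an ordinary least-squares fit
# recovers the direction a up to scale (Stein/Brillinger-type argument); g is then
# fitted in one dimension. This is an illustration, not the paper's method.
import numpy as np

rng = np.random.default_rng(3)
d, N = 20, 2000
a = rng.standard_normal(d); a /= np.linalg.norm(a)
X = rng.standard_normal((N, d))
g = np.tanh                                           # unknown 1-D link (toy choice)
y = g(X @ a)

a_hat, *_ = np.linalg.lstsq(X, y, rcond=None)         # OLS: direction of a up to scale
a_hat /= np.linalg.norm(a_hat)
print("alignment |<a_hat, a>| =", round(abs(a_hat @ a), 3))

# fit g on the recovered 1-D coordinate with a cubic polynomial (any 1-D smoother works)
t = X @ a_hat
coeffs = np.polyfit(t, y, deg=3)
y_pred = np.polyval(coeffs, t)
print("training RMSE:", round(float(np.sqrt(np.mean((y_pred - y) ** 2))), 4))
```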
Much research has gone into scaling up classical machine learning algorithms such as Gaussian Proces...
Multivariate functions encountered in high-dimensional uncertainty quantificat...
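The abstract above is truncated, so the following is only a generic illustration of how a few dominant directions of a multivariate function are often identified, via the active-subspace construction (eigendecomposition of the averaged outer product of gradients). It assumes gradient samples are available and is not necessarily the method of that paper.

```python
# Hedged sketch (assumption: gradient samples are available). A standard way to find
# a few dominant directions of a multivariate function is the active-subspace
# construction: eigendecompose the Monte Carlo estimate of E[grad f grad f^T].
# Generic illustration only, not necessarily the method of the abstract above.
import numpy as np

rng = np.random.default_rng(2)
d, N = 8, 500
W = np.array([1.0, 0.5] + [0.0] * (d - 2))          # toy function varies along W only
X = rng.uniform(-1, 1, size=(N, d))

def grad_f(x):
    # f(x) = exp(W^T x)  =>  grad f(x) = exp(W^T x) * W
    return np.exp(W @ x) * W

G = np.array([grad_f(x) for x in X])                # N x d matrix of gradient samples
C = G.T @ G / N                                     # Monte Carlo estimate of E[grad f grad f^T]
eigvals, eigvecs = np.linalg.eigh(C)
eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]  # descending order
print("eigenvalue decay:", np.round(eigvals[:4], 3))
print("leading direction:", np.round(eigvecs[:, 0], 2))  # ~ +/- W / ||W||
```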
The thesis investigates the search for a dimension-reduction subspace for the Poisson point process dr...
There has been growing recent interest in probabilistic interpretations of kernel-based me...
This paper proposes a novel scheme for reduced-rank Gaussian process regression. The method is based...
Gaussian processes are a powerful and flexible class of nonparametric models that use covariance fun...
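As a hedged, minimal illustration of the point above, that a Gaussian process model is specified through its covariance function, the following NumPy sketch performs exact GP regression with a squared-exponential kernel using the standard Cholesky-based posterior equations.

```python
# Hedged sketch of the basic construction: a GP prior is specified entirely through a
# covariance (kernel) function, and regression reduces to linear algebra with the
# kernel matrix. Plain NumPy, squared-exponential kernel, fixed hyperparameters.
import numpy as np

def rbf(A, B, lengthscale=0.3, variance=1.0):
    """Squared-exponential covariance k(x, x') = s^2 exp(-|x - x'|^2 / (2 l^2))."""
    sq = (A[:, None, :] - B[None, :, :]) ** 2
    return variance * np.exp(-sq.sum(-1) / (2 * lengthscale ** 2))

rng = np.random.default_rng(4)
X = rng.uniform(0, 1, size=(30, 1))
y = np.sin(2 * np.pi * X[:, 0]) + 0.1 * rng.standard_normal(30)
Xs = np.linspace(0, 1, 100)[:, None]

noise = 0.1 ** 2
K = rbf(X, X) + noise * np.eye(len(X))                # noisy training covariance
Ks, Kss = rbf(X, Xs), rbf(Xs, Xs)
L = np.linalg.cholesky(K)
alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))   # K^{-1} y via Cholesky
mean = Ks.T @ alpha                                    # posterior mean at test points
v = np.linalg.solve(L, Ks)
var = np.diag(Kss) - np.sum(v ** 2, axis=0)            # posterior variance
print(mean[:5].round(3), var[:5].round(3))
```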
We study the problem of discovering a manifold that best preserves information relevant to a nonline...
Learning a feature of an expensive black-box function (optimum, contour line,...) is a difficult tas...
This article proposes a novel approach to linear dimension reduction for regression using nonparamet...
A priori dimension reduction is a widely adopted technique for reducing the computational complexity...