Gaussian process hyperparameter optimization requires linear solves with, and log-determinants of, large kernel matrices. Iterative numerical techniques are becoming popular to scale to larger datasets, relying on the conjugate gradient method (CG) for the linear solves and stochastic trace estimation for the log-determinant. This work introduces new algorithmic and theoretical insights for preconditioning these computations. While preconditioning is well understood in the context of CG, we demonstrate that it can also accelerate convergence and reduce variance of the estimates for the log-determinant and its derivative. We prove general probabilistic error bounds for the preconditioned computation of the log-determinant, log-marginal likel...
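The two ingredients this abstract refers to can be made concrete with a short sketch: a preconditioned conjugate gradient (PCG) solve with the regularized kernel matrix, and a Hutchinson-style stochastic trace estimate of the log-determinant's derivative, tr(K̂⁻¹ ∂K/∂θ), where each probe vector requires one PCG solve. This is only an illustrative sketch, not the paper's implementation: the RBF kernel, the Nyström-style low-rank preconditioner, and the Rademacher probes are assumptions chosen for brevity.

```python
# Minimal sketch (illustrative, not the paper's code): preconditioned CG solves
# with K_hat = K + sigma^2 I, and a Hutchinson estimate of the derivative of
# log det(K_hat) w.r.t. the RBF lengthscale, i.e. tr(K_hat^{-1} dK/d(ell)).
import numpy as np
from scipy.sparse.linalg import cg, LinearOperator

def rbf_kernel(X, lengthscale):
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * sq / lengthscale**2)

def rbf_kernel_grad(X, lengthscale):
    # dK/d(lengthscale) for the RBF kernel above.
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * sq / lengthscale**2) * sq / lengthscale**3

def nystrom_preconditioner(K, sigma2, rank, rng):
    # Low-rank (Nystrom-style) approximation K ~ U U^T from a random column
    # subset; the preconditioner P = U U^T + sigma^2 I is inverted with the
    # Woodbury identity, so applying P^{-1} costs O(n * rank) per vector.
    n = K.shape[0]
    idx = rng.choice(n, rank, replace=False)
    C, W = K[:, idx], K[np.ix_(idx, idx)]
    L = np.linalg.cholesky(W + 1e-8 * np.eye(rank))
    U = np.linalg.solve(L, C.T).T                       # K ~ U @ U.T
    inner = np.linalg.inv(sigma2 * np.eye(rank) + U.T @ U)
    def apply_inv(v):
        return (v - U @ (inner @ (U.T @ v))) / sigma2   # Woodbury inverse
    return LinearOperator((n, n), matvec=apply_inv)

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 2))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=500)
sigma2, ell = 0.1, 1.0
K_hat = rbf_kernel(X, ell) + sigma2 * np.eye(500)
P_inv = nystrom_preconditioner(rbf_kernel(X, ell), sigma2, rank=50, rng=rng)

# Preconditioned CG solve K_hat @ alpha = y (GP posterior mean weights).
alpha, info = cg(K_hat, y, M=P_inv)

# Hutchinson estimate of tr(K_hat^{-1} dK/d(ell)) = d logdet(K_hat) / d(ell);
# each Rademacher probe z contributes z^T K_hat^{-1} (dK z) via one PCG solve.
dK = rbf_kernel_grad(X, ell)
probes = rng.choice([-1.0, 1.0], size=(10, 500))
est = np.mean([z @ cg(K_hat, dK @ z, M=P_inv)[0] for z in probes])
print("stochastic estimate of d logdet / d ell:", est)
```

As the abstract argues, the quality of the preconditioner affects not only the CG iteration count but also the variance of stochastic estimates like the one above; with a better preconditioner, fewer probe vectors and fewer CG iterations per probe are needed for the same accuracy.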
This dissertation uses structured linear algebra to scale kernel regression methods based on Gaussia...
Gaussian process regression (GPR) is a non-parametric Bayesian technique for interpolating or fittin...
Kernel machines have sustained continuous progress in the field of quantum chemistry. In particular,...
The kernel function and its hyperparameters are the central model selection choice in a Gaussian pro...
Stochastic gradient descent (SGD) and its variants have established themselves as the go-to algorith...
The task of choosing a preconditioner M to use when solving a linear system Ax=b with iterative meth...
The computational and storage complexity of kernel machines presents the primary barrier to their sc...
The conjugate gradient method (CG) is usually used with a preconditioner which i...
A matrix free and a low rank approximation preconditioner are proposed to accelerate the convergence...
We provide approximation guarantees for a linear-time inferential framework for Gaussian processes, ...
Gaussian Processes (GPs) provide powerful probabilistic frameworks for interpolation, forecasting, a...
This paper introduces two randomized preconditioning techniques for robustly solving kernel ridge re...
The application of Gaussian processes (GPs) is limited by the rather slow process of optimizing the ...
Most machine learning methods require careful selection of hyper-parameters in order to train a high...
Gaussian processes (GPs) produce good probabilistic models of functions, but most GP kernels require...