Gradient-based optimization and Markov chain Monte Carlo sampling lie at the heart of many machine learning methods. In high-dimensional settings, well-known issues such as slow mixing, non-convexity, and correlations can hinder the algorithms' efficiency. To overcome these difficulties, we propose AdaGeo, a preconditioning framework that adaptively learns the geometry of the parameter space during optimization or sampling. In particular, we use the Gaussian process latent variable model (GP-LVM) to represent a lower-dimensional embedding of the parameters, identifying the underlying Riemannian manifold on which the optimization or sampling takes place. Samples or optimization steps are consequently proposed based o...
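The core idea behind this kind of geometry-aware sampler can be illustrated with a preconditioned Langevin step, where a metric (here an inverse matrix `G_inv`, assumed given rather than learned from a GP-LVM as in AdaGeo) rescales both the gradient drift and the injected noise. This is a minimal sketch, not the paper's actual algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)

def preconditioned_langevin_step(theta, grad_log_p, G_inv, eps=1e-2):
    """One preconditioned (Riemannian-style) Langevin proposal.

    theta      : current parameter vector
    grad_log_p : callable returning the gradient of the log target density
    G_inv      : inverse metric / preconditioner; a method like AdaGeo
                 would learn this from the parameter geometry -- here it
                 is simply assumed given
    """
    drift = 0.5 * eps * G_inv @ grad_log_p(theta)
    noise = rng.multivariate_normal(np.zeros(theta.size), eps * G_inv)
    return theta + drift + noise

# Toy target: standard 2-D Gaussian, so grad log p(theta) = -theta
grad_log_p = lambda th: -th
theta = np.array([3.0, -2.0])
G_inv = np.eye(2)  # identity metric recovers plain Langevin dynamics
for _ in range(1000):
    theta = preconditioned_langevin_step(theta, grad_log_p, G_inv)
```

With the identity metric this reduces to unadjusted Langevin dynamics; the point of geometry-adaptive methods is to replace `G_inv` with a learned, position-dependent metric so that steps are larger along flat directions and smaller along strongly curved or correlated ones.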
We study the connections between optimization and sampling. In one direction, we study sampling algo...
© 2018 Curran Associates Inc. All rights reserved. Monte Carlo sampling in high-dimensional, low-sam...
Score-based generative models (SGMs) have recently emerged as a promising class of generative models...
Gradient-based optimization and Markov Chain Monte Carlo sampling can be found at the heart of a mul...
Thesis: Ph.D., Massachusetts Institute of Technology, Department of Brain and Cognitive Sciences, 2...
abstract: This thesis presents a family of adaptive curvature methods for gradient-based stochastic ...
Bayesian inference tells us how we can incorporate information from the data into the parameters. In...
In this chapter, we identify fundamental geometric structures that underlie the problems of sampling...
Effective training of deep neural networks suffers from two main issues. The first is that the param...
This work considers optimization methods for large-scale machine learning (ML). Optimization in ML ...
The Gaussian Process Latent Variable Model (GPLVM) is an attractive model for dimensionality reducti...
In this paper we propose a new framework for learning from large scale datasets based on iterative l...
In this paper we propose STANLEY, a STochastic gradient ANisotropic LangEvin dYnamics, for sampling...
Thesis: S.M., Massachusetts Institute of Technology, Department of Aeronautics and Astronautics, 201...
The emergent field of machine learning has by now become the main proponent of data-driven discovery...