Many constrained, nonconvex and nonsmooth optimization problems can be tackled with the Majorization-Minimization (MM) method, which alternates between constructing a surrogate function that upper-bounds the objective and minimizing that surrogate. For problems that minimize a finite sum of functions, a stochastic version of MM selects a random batch of functions at each iteration and optimizes the accumulated surrogate. In many cases of interest, however, such as variational inference for latent variable models, the surrogate functions are themselves expressed as expectations. In this contribution, we propose a doubly stochastic MM method based on Monte Carlo approximation of these stochastic surrogates...
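A minimal sketch of how one such doubly stochastic MM iteration could look, assuming an L-smooth per-sample loss so that the standard quadratic (Lipschitz-gradient) majorizer is minimized in closed form; the helpers `loss_grad`, `sample_z`, and the constant `L` are placeholders for illustration and are not taken from the cited work:

```python
import numpy as np

def doubly_stochastic_mm(loss_grad, sample_z, data, theta0, L,
                         n_iters=200, batch_size=32, n_mc=10, seed=0):
    """Toy doubly stochastic MM loop (illustrative sketch only).

    Two sources of randomness per iteration:
      1. a random batch of data indices (stochastic MM over a finite sum),
      2. Monte Carlo draws z approximating the expectation that defines
         each surrogate.
    With an L-smooth loss, the quadratic majorizer
        g(theta | theta_k) = f(theta_k) + <grad f(theta_k), theta - theta_k>
                              + (L / 2) * ||theta - theta_k||^2
    upper-bounds f, and its exact minimizer is a step of length 1/L along
    the (Monte Carlo estimated) negative gradient.
    """
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    n = len(data)
    for _ in range(n_iters):
        batch = rng.choice(n, size=batch_size, replace=False)   # randomness 1: data batch
        zs = [sample_z(rng) for _ in range(n_mc)]                # randomness 2: MC samples
        # Monte Carlo gradient of the accumulated surrogate at the current theta.
        grads = [loss_grad(theta, data[i], z) for i in batch for z in zs]
        theta = theta - np.mean(grads, axis=0) / L               # exact surrogate minimizer
    return theta
```

In the setting of the abstract the surrogate need not be quadratic; the sketch only illustrates how the two layers of sampling (data batches and Monte Carlo draws) enter a single MM update.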
We propose the general Filter-based Stochastic Algorithm (FbSA) for the global optimization of nonco...
The Expectation Maximization (EM) algorithm is a key reference for inference i...
Many problems in machine learning amount to minimizing a possibly non-convex and ...
Majorization-minimization algorithms consist of successively minimizing a sequ...
Majorization...
Majorization-minimization algorithms consist of iteratively minimizing a major...
Majorization-minimization algorithms consist of successively minimizing a sequence of uppe...
Surrogate maximization (or minimization) (SM) algorithms are a family of algorithms that can be rega...
We introduce and analyze a parallel sequential Monte Carlo methodology for the numerical solution of...
Research by IS was supported by the NSF (USA) under Grant No. CCF-1525398. Research by SM and...
In a learning context, data distributions are usually unknown. Observation mode...
Stochastic Approximation has been a prominent set of tools for solving problems with noise and uncer...
The purpose of this thesis is the design of algorithms that can be used to determine optimal solutio...
Mean-field variational inference is a method for approximate Bayesian posterior inference. It approx...
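As a concrete illustration of the mean-field idea (a fully factorized approximating family updated coordinate-wise), here is a small CAVI sketch for the textbook Normal-Gamma model, i.e. x_i ~ N(mu, 1/tau) with a conjugate prior; it is a generic example and not the method of the cited work:

```python
import numpy as np

def cavi_normal_gamma(x, mu0=0.0, lam0=1.0, a0=1.0, b0=1.0, n_iters=50):
    """Mean-field (CAVI) updates for x_i ~ N(mu, 1/tau) with prior
    mu | tau ~ N(mu0, 1/(lam0*tau)) and tau ~ Gamma(a0, b0).

    The posterior is approximated by q(mu, tau) = q(mu) q(tau), with a
    Gaussian q(mu) and a Gamma q(tau); each factor is updated in turn
    using the current moments of the other.
    """
    x = np.asarray(x, dtype=float)
    n, xbar = len(x), np.mean(x)
    a_n = a0 + (n + 1) / 2.0           # fixed shape: likelihood + prior on mu
    E_tau = a0 / b0                    # initial guess for E_q[tau]
    for _ in range(n_iters):
        # q(mu) = N(mu_n, 1/lam_n), given the current E_q[tau]
        lam_n = (lam0 + n) * E_tau
        mu_n = (lam0 * mu0 + n * xbar) / (lam0 + n)
        # q(tau) = Gamma(a_n, b_n), given the current moments of q(mu)
        E_sq = np.sum((x - mu_n) ** 2) + n / lam_n            # E_q[sum_i (x_i - mu)^2]
        b_n = b0 + 0.5 * (E_sq + lam0 * ((mu_n - mu0) ** 2 + 1.0 / lam_n))
        E_tau = a_n / b_n
    return mu_n, lam_n, a_n, b_n
```

Each update maximizes the ELBO with respect to one factor while the other is held fixed, which is the coordinate-ascent structure that mean-field variational inference exploits.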