In Bayesian inference, a statistical model is assumed between an unknown vector of parameters and a set of observations. The goal is to construct a posterior distribution of the unknowns conditioned on the given data. However, for most practical models, the posterior is not available in closed form, mostly due to intractable integrals, and approximations must be performed via Monte Carlo (MC) methods [1]. Among the available classes of MC methods, importance sampling (IS) consists of simulating random samples from a proposal distribution and weighting them properly with the aim of building consistent estimators of the moments of the posterior. The performance of IS depends on the choice of the proposal distribution [2–4]. Adaptive...
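
As a concrete illustration of the IS mechanism described above, the following minimal Python sketch estimates posterior moments by self-normalized importance sampling. The toy model (Gaussian prior and likelihood), the Gaussian proposal, and all variable names are illustrative assumptions, not part of the original work.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Toy Bayesian model (assumed for illustration):
# prior theta ~ N(0, 1), observations y_i ~ N(theta, 1).
y = rng.normal(loc=1.5, scale=1.0, size=20)

def log_posterior_unnorm(theta):
    """Unnormalized log posterior: log prior + log likelihood."""
    log_prior = stats.norm.logpdf(theta, loc=0.0, scale=1.0)
    log_lik = stats.norm.logpdf(y[:, None], loc=theta, scale=1.0).sum(axis=0)
    return log_prior + log_lik

# Proposal distribution (assumed here: a wide Gaussian).
proposal = stats.norm(loc=0.0, scale=3.0)

N = 10_000
theta = proposal.rvs(size=N, random_state=rng)                 # simulate from the proposal
log_w = log_posterior_unnorm(theta) - proposal.logpdf(theta)   # importance log-weights
w = np.exp(log_w - log_w.max())                                # stabilize before normalizing
w /= w.sum()                                                   # self-normalized weights

posterior_mean = np.sum(w * theta)                             # estimate of E[theta | y]
posterior_var = np.sum(w * (theta - posterior_mean) ** 2)      # estimate of Var[theta | y]
ess = 1.0 / np.sum(w ** 2)                                     # effective sample size

print(f"posterior mean ~ {posterior_mean:.3f}, var ~ {posterior_var:.3f}, ESS ~ {ess:.0f}")
```

A poorly matched proposal yields highly unequal weights and a small effective sample size, which is why adaptive schemes that iteratively improve the proposal are of interest.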