Standard Monte Carlo computation is widely known to exhibit a canonical square-root convergence rate in the sample size. Two recent techniques, one based on control variates and the other on importance sampling, both derived from a combination of reproducing kernels and Stein's identity, have been proposed to reduce the error of Monte Carlo computation to supercanonical convergence. This paper presents a more general framework that encompasses both techniques and is especially beneficial when the sample generator is biased and noise-corrupted. We show that our general estimator, which we call the doubly robust Stein-kernelized estimator, outperforms both existing methods in terms of mean squared error (MSE) rates across different scenarios. We also demonstrate the supe...
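To make the control-variate side of the abstract concrete, here is a minimal sketch of a Stein control variate: for a target density p with score s(x) = d/dx log p(x), the Langevin Stein operator maps any smooth g to h(x) = g'(x) + s(x) g(x), which has zero mean under p, so h can be subtracted from the integrand to reduce variance. The example below is an illustrative toy (standard normal target, f(x) = x², polynomial basis for g), not the paper's doubly robust Stein-kernelized estimator.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000
x = rng.standard_normal(n)  # samples from the target p = N(0, 1)

f = x ** 2                  # integrand; true value E[X^2] = 1
score = -x                  # d/dx log p(x) for the standard normal

# Stein control variates: h_j = g_j' + score * g_j satisfies E_p[h_j] = 0.
# Basis g_j(x) = 1, x, x^2 (a deliberately small illustrative choice).
gs = np.stack([np.ones_like(x), x, x ** 2], axis=1)                 # g_j(x)
dgs = np.stack([np.zeros_like(x), np.ones_like(x), 2 * x], axis=1)  # g_j'(x)
h = dgs + score[:, None] * gs

# Fit coefficients by least squares on centered quantities, then apply
# the standard control-variate correction f - h @ beta.
beta, *_ = np.linalg.lstsq(h - h.mean(0), f - f.mean(), rcond=None)
est_plain = f.mean()
est_cv = (f - h @ beta).mean()
```

With g(x) = x in the basis, h contains 1 - x², so f + h is constant and the fitted control variate removes essentially all of the variance in this toy; in general the quality of the estimator depends on how well the chosen basis (or kernel) can represent the integrand.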