We introduce a treatment of parametric estimation in which optimality of an estimator is measured in probability rather than in variance (the measure for which the strongest general results are known in statistics). Our motivation is that the quality of an approximation algorithm is measured by the probability that it fails to approximate the desired quantity within a set tolerance. We concentrate on the Gaussian distribution and show that the sample mean is the unique “best” estimator, in probability, for the mean of a Gaussian distribution. We also extend this method to general penalty functions and to multidimensional spherically symmetric Gaussians. The algorithmic significance of studying the Gaussian distribution is established by ...
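A minimal numerical sketch of the "best in probability" criterion, assuming Python with numpy; the sample size n, the tolerance eps, the trial count, and the sample median as a competing estimator are illustrative choices, not taken from the paper:

    import numpy as np

    rng = np.random.default_rng(0)
    mu, sigma, n, eps, trials = 0.0, 1.0, 25, 0.2, 100_000

    # Draw `trials` independent samples of size n from N(mu, sigma^2).
    samples = rng.normal(mu, sigma, size=(trials, n))

    # Estimate, for each estimator, the probability of landing
    # within tolerance eps of the true mean.
    p_mean = np.mean(np.abs(samples.mean(axis=1) - mu) <= eps)
    p_median = np.mean(np.abs(np.median(samples, axis=1) - mu) <= eps)

    print(f"P(|sample mean - mu|   <= {eps}) ~ {p_mean:.4f}")
    print(f"P(|sample median - mu| <= {eps}) ~ {p_median:.4f}")

On such runs the sample mean attains the higher within-tolerance probability, consistent with (though far weaker than) the uniqueness result stated above.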
We apply Gaussian methods to the approximation of expected utility. An explicit formula, in terms of...
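The sentence above is truncated; as background, one classical formula of this kind comes from exponential (CARA) utility, where the Gaussian moment generating function gives expected utility in closed form in terms of the mean and variance alone (a standard identity, not necessarily the paper's formula):

\[
W \sim \mathcal{N}(\mu, \sigma^2) \quad\Longrightarrow\quad \mathbb{E}\!\left[-e^{-aW}\right] = -\exp\!\left(-a\mu + \tfrac{1}{2}a^2\sigma^2\right), \qquad a > 0.
\]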
We study the computation of Gaussian orthant probabilities, i.e. the probabili...
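A crude Monte Carlo sketch of what an orthant probability is, assuming Python with numpy; the covariance matrix is invented for illustration, and plain sampling is only a baseline, not the paper's method:

    import numpy as np

    rng = np.random.default_rng(1)
    mu = np.zeros(3)
    Sigma = np.array([[1.0, 0.5, 0.3],
                      [0.5, 1.0, 0.5],
                      [0.3, 0.5, 1.0]])

    # P(X_1 >= 0, ..., X_d >= 0) for X ~ N(mu, Sigma), estimated by
    # the fraction of draws landing in the positive orthant.
    draws = rng.multivariate_normal(mu, Sigma, size=500_000)
    p_hat = np.mean(np.all(draws >= 0.0, axis=1))
    print(f"orthant probability ~ {p_hat:.4f}")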
Variational inference is a technique for approximating intractable posterior distributions in order ...
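For reference, the standard objective variational inference maximizes is the evidence lower bound (ELBO), taken over a tractable family of distributions q(z):

\[
\log p(x) \;\ge\; \mathbb{E}_{q(z)}\big[\log p(x,z)\big] - \mathbb{E}_{q(z)}\big[\log q(z)\big] \;=\; \log p(x) - \mathrm{KL}\big(q(z)\,\|\,p(z\mid x)\big),
\]

so maximizing the bound over q drives q(z) toward the intractable posterior p(z | x).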
We consider a problem in parametric estimation: given n samples from an unknown distributi...
Due to their flexibility, Gaussian processes (GPs) have been widely used in no...
We provide lower estimates for the norm of gradients of Gaussian distribution functions and apply th...
This paper extends the results of Andrews (1984), which considers the problem of robust estimation of...
As Gaussian processes are used to answer increasingly complex questions, analytic solutions become s...
Mathematical models implemented as computer code are gaining widespread use across the sciences and ...
We study the optimization of a continuous function by its stochastic relaxatio...
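Stochastic relaxation, as the term is commonly used, means replacing the objective f by its expectation under a parametric family of distributions p_theta (e.g. Gaussian) and optimizing over theta; the standard score-function identity below is stated as background, not as this paper's contribution:

\[
F(\theta) = \mathbb{E}_{x \sim p_\theta}\!\left[f(x)\right], \qquad \nabla_\theta F(\theta) = \mathbb{E}_{x \sim p_\theta}\!\left[f(x)\,\nabla_\theta \log p_\theta(x)\right],
\]

which makes \(\nabla_\theta F\) estimable from samples of \(p_\theta\) without differentiating f.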
Introducing inequality constraints in Gaussian process (GP) models can lead to...
The paper aims to reconsider Le Cam's famous LAN theory. The main features of the approach whic...
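As background on what LAN (local asymptotic normality) asserts: the local log-likelihood ratios admit the quadratic expansion

\[
\log \frac{dP^{\,n}_{\theta + h/\sqrt{n}}}{dP^{\,n}_{\theta}} = h^\top \Delta_{n,\theta} - \tfrac{1}{2}\, h^\top I(\theta)\, h + o_{P_\theta}(1), \qquad \Delta_{n,\theta} \xrightarrow{d} \mathcal{N}\big(0, I(\theta)\big),
\]

with I(θ) the Fisher information; this is the standard statement, not the reformulation developed in the paper.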
Probabilistic constraints represent a major model of stochastic optimization. A possible approach fo...
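A worked scalar instance of turning a probabilistic constraint into a deterministic one, assuming Python with numpy and scipy; the Gaussian model and the threshold p = 0.95 are illustrative, and this quantile reformulation is a textbook device rather than necessarily the approach of the paper:

    import numpy as np
    from scipy.stats import norm

    # For xi ~ N(mu, sigma^2), the chance constraint P(xi <= x) >= p
    # is equivalent to the deterministic bound
    #   x >= mu + sigma * Phi^{-1}(p),
    # where Phi^{-1} is the standard normal quantile function.
    mu, sigma, p = 2.0, 0.5, 0.95
    x_min = mu + sigma * norm.ppf(p)
    print(f"deterministic equivalent: x >= {x_min:.4f}")

    # Monte Carlo check of the reformulation.
    xi = np.random.default_rng(2).normal(mu, sigma, 1_000_000)
    print(f"P(xi <= x_min) ~ {np.mean(xi <= x_min):.4f}")  # ~ p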