Laplace random variables are commonly used to model extreme noise in many fields, while systems trained to deal with such noise are often characterized by robustness properties. We introduce new learning algorithms that minimize objectives derived directly from PAC-Bayes bounds, incorporating Laplace distributions. The resulting algorithms are regulated by the Huber loss function and are robust to noise, as the Laplace distribution integrates out large deviations of the parameters. We analyze the convexity properties of the objective and propose a few bounds which are fully convex, two of which are jointly convex in the mean and standard deviation under certain conditions. We derive new forward algorithms analogous to recent boosting algorithms, p...
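The Huber-like regulation mentioned in this abstract can be illustrated with a standard smoothing identity: averaging the absolute loss over Laplace(0, b) noise gives E|m + eps| = |m| + b * exp(-|m|/b), a function that is curved near zero and linear in the tails. The sketch below is our own illustration of that mechanism, not the paper's bound or algorithm; the function name laplace_smoothed_abs is invented for this example, and it simply checks the closed form against a Monte-Carlo estimate.

import numpy as np

def laplace_smoothed_abs(m, b=1.0):
    # Closed form of E_{eps ~ Laplace(0, b)} |m + eps| = |m| + b * exp(-|m|/b):
    # a smooth, Huber-like loss, quadratic-looking near zero, linear in the tails.
    a = np.abs(m)
    return a + b * np.exp(-a / b)

# Monte-Carlo check of the identity.
rng = np.random.default_rng(0)
b = 0.5
eps = rng.laplace(0.0, b, size=1_000_000)
for m in (-2.0, 0.0, 0.3, 1.5):
    mc = np.abs(m + eps).mean()
    print(f"m={m:+.1f}  closed form={laplace_smoothed_abs(m, b):.4f}  MC={mc:.4f}")

At m = 0 both values equal b, and for large |m| the smoothing term vanishes, recovering the absolute loss; this is the sense in which Laplace averaging tempers large deviations.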
We give sharper bounds for uniformly stable randomized algorithms in a PAC-Bayesian framework, which...
We consider a general statistical learning problem where an unknown fraction of the training data is...
The sensitivity of AdaBoost to random label noise is a well-studied problem. LogitBoost, BrownBoost...
We show that convex KL-regularized objective functions are obtained from a PAC-Bayes risk bound when...
In recent decades, boosting methods have emerged as one of the leading ensemble learning techniques....
PAC-Bayes bounds have been proposed to get risk estimates based on a training sample. In this paper...
Thesis: S.M., Massachusetts Institute of Technology, Department of Electrical Engineering and Comput...
We provide a PAC-Bayesian bound for the expected loss of convex combinations of classifiers under a ...
Recent research in robust optimization has shown an overfitting-like phenomenon in which models trai...
We establish risk bounds for Regularized Empirical Risk Minimizers (RERM) when the loss is Lipschitz...
We present new PAC-Bayesian generalisation bounds for learning problems with unbounded loss function...
Robustness is a major problem in Kalman filtering and smoothing that can be solved using heavy taile...
In this paper, we propose a computationally tractable and provably convergent algorithm for robust o...
The paper brings together methods from two disciplines: machine learning theory and robust statistic...
Convex potential minimisation is the de facto approach to binary classification. However, Long and S...