Minimization of the L∞ norm, which can be viewed as approximately solving the non-convex least median estimation problem, is a powerful method for outlier removal and hence robust regression. However, current techniques for solving the problem at the heart of L∞ norm minimization are slow, and therefore cannot be scaled to large problems. A new method for the minimization of the L∞ norm is presented here, which provides a speedup of multiple orders of magnitude for data with high dimension. This method, termed Fast L∞ Minimization, allows robust regression to be applied to a class of problems which was previously inaccessible. It is shown how the L∞ norm minimization problem can be broken up into smaller sub-problems, which can then be solv...
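For concreteness, the core sub-problem min_x ||Ax − b||∞ can be written as a small linear program; the sketch below (using scipy.optimize.linprog) shows that standard formulation only, not the accelerated Fast L∞ Minimization method described above. In the outlier-removal scheme, the observations attaining the maximal residual are the natural candidates for removal.

```python
# Minimal sketch (not the Fast L-infinity algorithm above): min_x ||Ax - b||_inf
# as a linear program in the variables (x, t), where t bounds every absolute
# residual. The rows attaining that bound are the outlier candidates.
import numpy as np
from scipy.optimize import linprog

def linf_fit(A, b):
    n, d = A.shape
    c = np.zeros(d + 1)
    c[-1] = 1.0                                   # minimize the residual bound t
    ones = np.ones((n, 1))
    A_ub = np.vstack([np.hstack([A, -ones]),      #  Ax - b <= t
                      np.hstack([-A, -ones])])    # -(Ax - b) <= t
    b_ub = np.concatenate([b, -b])
    bounds = [(None, None)] * d + [(0, None)]     # x is free, t >= 0
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    x, t = res.x[:d], res.x[-1]
    support = np.flatnonzero(np.isclose(np.abs(A @ x - b), t))  # rows at the max
    return x, t, support
```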
Nonparametric methods are widely applicable to statistical learning problems, since they rely on a ...
The methods of very robust regression resist up to 50% of outliers. The algorithms for very robust r...
We consider a reformulation of Reduced-Rank Regression (RRR) and Sparse Reduce...
We propose a procedure for computing a fast approximation to regression estimates based on the minim...
Given a dataset, an outlier can be defined as an observation that is unlikely to follow the statis...
We establish risk bounds for Regularized Empirical Risk Minimizers (RERM) when...
Data mining aims to extract previously unknown patterns or substructures from large databases. In st...
This paper considers the problem of inference in a linear regression model with out...
Robust linear regression is one of the most popular problems in the robust statistics community. It ...
Using a few very basic observations, we take full advantage of the special structure of the ...
We provide fast algorithms for overconstrained ℓp regression and related problems: for an n × d inpu...
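As a point of reference for the ℓp objective above, the short sketch below uses plain iteratively reweighted least squares (IRLS); it illustrates the objective only and is not the fast algorithm the abstract announces.

```python
# Generic IRLS baseline for min_x ||Ax - b||_p on an n x d matrix with n >> d;
# an illustration of the objective, not the paper's fast algorithm.
import numpy as np

def lp_regression_irls(A, b, p=1.5, iters=50, eps=1e-8):
    x = np.linalg.lstsq(A, b, rcond=None)[0]      # start from the l_2 solution
    for _ in range(iters):
        r = A @ x - b
        w = (np.abs(r) + eps) ** (p - 2)          # IRLS weights |r|^(p-2)
        Aw = A * w[:, None]                       # row-scaled copy of A
        x = np.linalg.solve(A.T @ Aw, Aw.T @ b)   # weighted normal equations
    return x
```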
We address the problem of fast estimation of ordinary least squares (OLS) from large amounts of data...
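One standard way to make OLS cheap on data that does not fit in memory is sketched below under the assumption of chunked access to the design matrix; it illustrates the general idea only and is not necessarily the estimator studied in the paper.

```python
# Hedged sketch: OLS over large data by accumulating A^T A and A^T b in chunks
# (e.g., streamed from disk), then solving one small d x d system.
import numpy as np

def ols_from_chunks(chunks, d):
    """`chunks` yields (A_block, b_block) pairs with d columns in each A_block."""
    AtA = np.zeros((d, d))
    Atb = np.zeros(d)
    for A_block, b_block in chunks:
        AtA += A_block.T @ A_block                # d x d update per chunk
        Atb += A_block.T @ b_block
    return np.linalg.solve(AtA, Atb)              # OLS coefficients
```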
This paper considers inference in a linear regression model with outliers in which the number of out...
We present a new algorithm to solve min-max or min-min problems outside the convex setting. We use rigi...