Large outliers break down linear and nonlinear regression models. Robust regression methods allow one to filter out the outliers when building a model. By replacing the traditional least squares criterion with the least trimmed squares (LTS) criterion, in which half of the data is treated as potential outliers, one can fit accurate regression models to strongly contaminated data. High-breakdown methods are well established in linear regression, but have only recently begun to be applied to nonlinear regression. In this work, we examine the problem of fitting artificial neural networks (ANNs) to contaminated data using the LTS criterion. We introduce a penalized LTS criterion that prevents unnecessary removal of valid data. Trainin...
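The abstract above describes the LTS criterion only in words, so a minimal sketch of how it can serve as a training loss for an ANN may help. The sketch assumes PyTorch (the paper does not specify a framework), and the names lts_loss and trim_frac are illustrative; the penalized variant mentioned in the abstract is not reproduced here.

```python
import torch

def lts_loss(y_pred: torch.Tensor, y_true: torch.Tensor,
             trim_frac: float = 0.5) -> torch.Tensor:
    """Least trimmed squares loss: average only the h smallest squared
    residuals, where h = n - floor(trim_frac * n).  With trim_frac = 0.5,
    up to half of the data points are treated as potential outliers and
    excluded from the loss on each forward pass."""
    residuals = (y_pred - y_true).flatten() ** 2        # squared residuals, length n
    n = residuals.numel()
    h = max(1, n - int(trim_frac * n))                  # number of points kept
    kept, _ = torch.topk(residuals, h, largest=False)   # h smallest residuals
    return kept.mean()
```

In a training loop this is a drop-in replacement for the MSE loss: compute loss = lts_loss(model(x), y) and call loss.backward(); gradients then flow only through the kept residuals, so points currently flagged as outliers do not influence the weight update.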
Motivated by the requirement of controlling the number of false discoveries that arises in several a...
In this paper, we are interested in the recovery of an unknown signal corrupte...
Outliers widely occur in big-data applications and may severely affect statistical estimation and in...
In the largest samplings of data, outliers are observations that are well separated from the major s...
The least trimmed sum of squares (LTS) regression estimation criterion is a robust statistical metho...
Regression analysis is one of the most important branches of multivariate statistical techniques. It...
Sparse model estimation is a topic of high importance in modern data analysis due to the increasing ...
We study the dynamics and equilibria induced by training an artificial neural network for regression...
High breakdown estimation (HBE) addresses the problem of getting reliable parameter estimates in the...
The methods of very robust regression resist up to 50% of outliers. The algorithms for very robust r...
Challenges with data in the big-data era include (i) the dimension $p$ is often larger than the samp...
Most supervised neural networks are trained by minimizing the mean square error (MSE) of the trainin...
We present a Distributionally Robust Optimization (DRO) approach to outlier detection in a linear re...