In high-dimensional data, many sparse regression methods have been proposed. However, they may not be robust against outliers. Recently, the use of density power weights has been studied for robust parameter estimation, and the corresponding divergences have been discussed. One such divergence is the γ-divergence, and the robust estimator based on the γ-divergence is known to have strong robustness. In this paper, we extend the γ-divergence to the regression problem, consider robust and sparse regression based on the γ-divergence, and show that it retains strong robustness under heavy contamination even when the outliers are heterogeneous. The loss function is constructed from an empirical estimate of the γ-divergence...
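The idea described in the abstract above can be sketched numerically: for a Gaussian linear model, the empirical γ-divergence loss downweights each observation by a density power weight exp(−γ r²/2σ²), so heavy outliers contribute almost nothing, and an L1 penalty gives sparsity. The following is a minimal numpy sketch under those assumptions; the alternating reweighting/coordinate-descent scheme and the values of `gam`, `sigma`, and `lam` are illustrative choices, not the paper's actual algorithm or tuning rule.

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator used in lasso coordinate descent."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def gamma_lasso(X, y, gam=0.5, sigma=1.0, lam=0.05, n_iter=100):
    """Sketch of robust sparse regression with a gamma-divergence-type loss:
    alternate density-power reweighting with weighted-lasso coordinate updates.
    gam, sigma, lam are illustrative, not the paper's tuning procedure."""
    n, p = X.shape
    beta = np.linalg.lstsq(X, y, rcond=None)[0]  # non-robust warm start
    for _ in range(n_iter):
        r = y - X @ beta
        # density power weights: outliers with large residuals get ~0 weight
        w = np.exp(-gam * r**2 / (2 * sigma**2))
        w /= w.sum()
        for j in range(p):
            rj = r + X[:, j] * beta[j]            # partial residual without beta_j
            zj = np.sum(w * X[:, j] * rj)
            aj = np.sum(w * X[:, j] ** 2)
            b_new = soft_threshold(zj, lam) / aj  # weighted lasso update
            r = rj - X[:, j] * b_new
            beta[j] = b_new
    return beta

# Synthetic demo: 10% of responses are shifted far from the model (heavy contamination).
rng = np.random.default_rng(0)
n, p = 200, 5
X = rng.normal(size=(n, p))
beta_true = np.array([2.0, -1.5, 0.0, 0.0, 1.0])
y = X @ beta_true + 0.5 * rng.normal(size=n)
y[:20] += 15.0  # outliers

beta_hat = gamma_lasso(X, y)
print(np.round(beta_hat, 2))
```

Despite 10% contamination, the density power weights effectively exclude the outlying rows, so the fit stays close to the true coefficients, whereas an ordinary lasso fit would be pulled toward the outliers.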
The aim of robust statistics is to develop statistical procedures which are not unduly influenced by...
The problem of fitting a model to noisy data is fundamental to statistics and machine learning. In t...
It is well known that classical robust estimators tolerate only less than fifty percent of outliers....
Outliers in the data are a common problem in applied statistics. Estimators that give reliable resul...
Minimum density power divergence estimation provides a general framework for robust statistics, depe...
Minimum density power divergence estimation provides a general framework for robust statistics depen...
The density power divergence (DPD) measure, defined in terms of a single parameter α,...
Due to the increasing availability of data sets with a large number of variables, sparse model estim...
Amidst the exponential surge in big data, managing high-dimensional datasets across diverse fields a...
Algorithms such as Least Median of Squares (LMedS) and Random Sample Consensus (RANSAC) have been v...
The robust lasso-type regularized regression is a useful tool for simultaneous estimation and variab...
A minimum divergence estimation method is developed for robust parameter estimation. The proposed ap...