Due to the poor generalization performance of traditional empirical risk minimization (ERM) under distributional shift, Out-of-Distribution (OoD) generalization algorithms have received increasing attention. However, existing OoD generalization algorithms overlook the large variance in the quality of training data, which significantly compromises their accuracy. In this paper, we theoretically characterize the relationship between training-data quality and algorithm performance, and analyze the optimal regularization scheme for Lipschitz-regularized invariant risk minimization. Based on these theoretical results, we propose a novel algorithm that alleviates the influence of low-quality data at both the sample level and the domain level. The e...
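For context, the sketch below shows one way a Lipschitz-regularized invariant risk minimization objective can be assembled in PyTorch. It is a minimal sketch, assuming an IRMv1-style penalty (Arjovsky et al.) and a gradient-norm surrogate for the Lipschitz constraint; the function names, the per-domain averaging, and the weights `lam_irm` and `lam_lip` are illustrative assumptions, not the paper's actual algorithm or its quality-aware weighting scheme.

```python
# Minimal sketch: ERM loss + IRMv1-style invariance penalty + Lipschitz
# (gradient-norm) regularizer, averaged over training domains.
# All names and hyperparameters here are hypothetical.
import torch
import torch.nn.functional as F


def irm_penalty(logits, y):
    """IRMv1 penalty: squared gradient of the risk w.r.t. a dummy scale."""
    scale = torch.ones(1, requires_grad=True, device=logits.device)
    loss = F.cross_entropy(logits * scale, y)
    grad = torch.autograd.grad(loss, scale, create_graph=True)[0]
    return (grad ** 2).sum()


def lipschitz_penalty(model, x):
    """Gradient-norm surrogate for the Lipschitz constant of the predictor."""
    x = x.clone().requires_grad_(True)
    out = model(x).sum()
    grad = torch.autograd.grad(out, x, create_graph=True)[0]
    return grad.pow(2).sum(dim=tuple(range(1, grad.dim()))).mean()


def lipschitz_irm_loss(model, envs, lam_irm=1.0, lam_lip=0.1):
    """Total objective over a list of per-domain (x, y) batches."""
    erm, irm, lip = 0.0, 0.0, 0.0
    for x, y in envs:
        logits = model(x)
        erm += F.cross_entropy(logits, y)
        irm += irm_penalty(logits, y)
        lip += lipschitz_penalty(model, x)
    n = len(envs)
    return erm / n + lam_irm * irm / n + lam_lip * lip / n
```

In a quality-aware variant, the per-sample losses and per-domain penalty weights would be reweighted according to estimated data quality, which is where a sample-level and domain-level correction would enter this objective.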
Learning algorithms can perform poorly in unseen environments when they learn spurious correlations. ...
Regularization plays an important role in the generalization of deep learning. In this paper, we study t...
A) A two-dimensional example illustrating how a two-class classification between the two data sets ...
We establish risk bounds for Regularized Empirical Risk Minimizers (RERM) when the loss is Lipschitz...
Out-of-Domain (OOD) generalization is a challenging problem in machine learning about learning a mod...
A central problem in statistical learning is to design prediction algorithms that not only perform w...
Machine learning methods suffer from test-time performance degeneration when faced with out-of-distr...
Despite impressive success in many tasks, deep learning models are shown to rely on spurious feature...
The goal of regression and classification methods in supervised learning is to minimize the empirica...
Learning invariant (causal) features for out-of-distribution (OOD) generalization has attracted ext...
Many settings in empirical economics involve estimation of a large number of parameters. In such set...
Recently, generalization on out-of-distribution (OOD) data with correlation shift has attracted grea...