The main focus of this thesis is the robustness properties of the Schwarz Information Criterion (SIC) based on sample objective functions defining bias-robust M-estimators. The Bayesian underpinnings of such a criterion are established by extending Schwarz's original framework to densities that do not belong to the exponential family. A definition of qualitative robustness appropriate for model selection is provided, and it is shown that the crucial restriction needed to achieve robustness is the uniform boundedness of the objective function defining bias-robust M-estimators. In the process, the asymptotic performance of the SIC for generalized M-estimators is also studied. The finite sample behavior of the SIC for different types of M-estimators is...
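To fix ideas, a minimal sketch of how an SIC of this kind can be computed for linear regression is given below. The Tukey bisquare objective, the tuning constant 4.685, and the names bisquare_rho and robust_sic are illustrative assumptions for the sketch, not the exact criterion studied in the thesis; the construction simply mirrors the classical BIC = -2 log L + k log n with the log-likelihood replaced by a uniformly bounded M-estimation objective.

import numpy as np

def bisquare_rho(r, c=4.685):
    # Tukey's bisquare rho: increases smoothly and is uniformly bounded by c^2/6,
    # i.e. the kind of bounded objective the robustness results above call for.
    a = np.minimum(np.abs(r) / c, 1.0)
    return (c**2 / 6.0) * (1.0 - (1.0 - a**2)**3)

def robust_sic(y, X, beta_hat, sigma_hat, c=4.685):
    # SIC-type criterion: 2 * sum_i rho(r_i / scale) + k * log(n), where k is the
    # number of regression parameters in the candidate model and beta_hat, sigma_hat
    # are assumed to come from a robust M-estimator and a robust scale estimate.
    n, k = X.shape
    resid = (y - X @ beta_hat) / sigma_hat
    return 2.0 * bisquare_rho(resid, c).sum() + k * np.log(n)

Candidate models would then be compared by evaluating robust_sic on each and selecting the minimizer, exactly as one would with the classical BIC.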
The classical model selection criteria, such as the Bayesian Information Criterion (BIC) or Akaike i...
Data sets where the number of variables p is comparable to or larger than the number of observations...
We develop a generalized Bayesian information criterion for regression model selection. The new crit...
Model selection is a key component in any statistical analysis. In this paper we discuss this issue ...
Variable selection in the presence of outliers may be performed by using a robust version of Akaike'...
Regularized M-estimators are widely used in science, due to their ability to fit a simpler, low- dim...
Examining the robustness properties of maximum likelihood (ML) estimators of parameters in exponenti...
The goal of this PhD Thesis is the definition of new robust estimators, thereby extending the availa...
In this paper we derive Schwarz's information criterion and two modifications for choosing fixe...
We consider penalized likelihood criteria for selecting models of dependent processes. T...
Model selection is of fundamental importance to high dimensional modelling featured in many contempo...