The Akaike information criterion (AIC) is a common tool for model selection. It is frequently used in violation of regularity conditions at parameter space singularities and boundaries. The expected AIC is generally not asymptotically equivalent to its target at singularities and boundaries, and convergence to the target at nearby parameter points may be slow. We develop a generalized AIC for candidate models with or without singularities and boundaries. We show that the expectation of this generalized form converges everywhere in the parameter space, and its convergence can be faster than that of the AIC. We illustrate the generalized AIC on example models from phylogenomics, showing that it can outperform the AIC and gives rise to an inte...
Many authors have argued that identifying parsimonious statistical models (those that are neither ov...
We apply the nonconcave penalized likelihood approach to obtain variable selections as well as shrin...
We consider the problem of data classification where the training set consists of just a few data po...
The corrected Akaike information criterion (AICc) is a widely used tool in analyzing environmental a...
The Akaike information criterion, AIC, is widely used for model selection. Using the AIC as the esti...
The Akaike information criterion (AIC) is a widely used tool for model selection. AIC is derived as ...
This paper studies the asymptotic and finite-sample performance of penalized regression methods when d...
In statistical settings such as regression and time series, we can condition on observed informatio...
Information of interest can often only be extracted from data by model fitting. When the functional ...
It has been shown that AIC-type criteria are asymptotically efficient selectors of the tuning parame...
An improved AIC-based criterion is derived for model selection in general smoothing-based modeling, ...
I discuss the behavior of the Akaike Information Criterion in the limit when the sample size grows. ...
Akaike Information Criterion (AIC) has been used widely as a statistical criterion to compare the ap...
Various aspects of statistical model selection are discussed from the viewpoint of a statistician. ...
AIC is Akaike's information criterion with a second order correction for small sample sizes ...
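Several of the abstracts above concern the AIC and its small-sample correction AICc. As a quick reference for the standard definitions (AIC = 2k − 2 ln L; AICc adds the correction 2k(k+1)/(n−k−1)), here is a minimal sketch in Python; the log-likelihood values and parameter counts in the example are illustrative placeholders, not results from any of the papers listed:

```python
def aic(log_likelihood: float, k: int) -> float:
    """Akaike information criterion: 2k - 2*ln(L),
    where k is the number of estimated parameters."""
    return 2 * k - 2 * log_likelihood

def aicc(log_likelihood: float, k: int, n: int) -> float:
    """Corrected AIC: AIC plus the second-order term 2k(k+1)/(n-k-1),
    recommended when the sample size n is small relative to k."""
    if n - k - 1 <= 0:
        raise ValueError("AICc requires n > k + 1")
    return aic(log_likelihood, k) + (2 * k * (k + 1)) / (n - k - 1)

# Illustrative comparison of two candidate models fit to the same
# n = 30 observations (log-likelihoods are made up for the example):
print(aic(-45.2, k=3))           # simpler model, plain AIC
print(aicc(-45.2, k=3, n=30))    # same model with small-sample correction
print(aicc(-43.9, k=6, n=30))    # richer model pays a larger penalty
```

Lower values indicate the preferred model; note how the correction term grows as k approaches n, which is why AICc is favored for small samples.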