The Akaike information criterion (AIC) is a widely used tool for model selection. AIC is derived as an asymptotically unbiased estimator of a function used for ranking candidate models, which is a variant of the Kullback-Leibler divergence between the true model and the approximating candidate model. Despite the Kullback-Leibler divergence's computational and theoretical advantages, its lack of symmetry can become inconvenient in model selection applications. Simple examples show that reversing the roles of the arguments in the Kullback-Leibler divergence can yield substantially different results. In this paper, three new functions for ranking candidate models are proposed. These functions are constructed by symmetrizing the Kullback-Leibl...
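To illustrate the asymmetry mentioned above, the following minimal sketch (not taken from the paper; the Gaussian parameters and the Jeffreys-type symmetrization are illustrative assumptions, not the three ranking functions proposed here) evaluates the closed-form Kullback-Leibler divergence between two univariate normal densities in both directions and then sums the two directed divergences into a symmetric quantity.

```python
import math

def kl_gaussian(mu1, s1, mu2, s2):
    """Directed KL divergence KL(N(mu1, s1^2) || N(mu2, s2^2)), closed form."""
    return math.log(s2 / s1) + (s1**2 + (mu1 - mu2)**2) / (2 * s2**2) - 0.5

# Two example densities (hypothetical values, chosen only for illustration).
p = (0.0, 1.0)   # N(0, 1)
q = (1.0, 2.0)   # N(1, 4)

kl_pq = kl_gaussian(*p, *q)
kl_qp = kl_gaussian(*q, *p)
print(f"KL(p||q) = {kl_pq:.4f}")   # ~0.4431
print(f"KL(q||p) = {kl_qp:.4f}")   # ~1.3069 -> reversing the arguments changes the value

# One classical symmetrization: Jeffreys' J-divergence, the sum of the two
# directed divergences, which is symmetric in its arguments by construction.
print(f"J(p, q)  = {kl_pq + kl_qp:.4f}")
```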
The Akaike information criterion, AIC, and its corrected version, AICc, are two methods for selectin...
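For reference, the standard forms of these two criteria (stated here only as background, with n the sample size, k the number of estimated parameters, and \hat{L} the maximized likelihood; the corrected version shown is the small-sample adjustment of Hurvich and Tsai, 1989):

```latex
% n = sample size, k = number of estimated parameters,
% \hat{L} = maximized likelihood of the candidate model.
\[
\mathrm{AIC} = -2\ln\hat{L} + 2k,
\qquad
\mathrm{AIC}_c = \mathrm{AIC} + \frac{2k(k+1)}{n-k-1}.
\]
```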
A bias correction to the Akaike information criterion, AIC, is derived for regression and autoregres...
A new estimator, AICc, of the Kullback-Leibler information is proposed for Gaussian autoregressive t...
For regression and time series model selection, Hurvich and Tsai (1989) obtained a bias correction A...
The Kullback information criterion (KIC) was proposed by Cavanaugh (1999) to serve as an asymptotica...
Recently, Azari et al. (2006) showed that the AIC criterion and its corrected ver...
Estimation of the expected Kullback-Leibler information is the basis for deriving the Akaike informa...
The Akaike information criterion, AIC, is widely used for model selection. Using the AIC as the esti...
The Kullback Information Criterion, KIC, and its univariate bias-corrected version, KICc, are two ne...
The selection of an appropriate model is a fundamental step of the data analysis in small area estim...
The Akaike Information Criterion (AIC) has been used widely as a statistical criterion to compare the ap...
This paper investigates and evaluates an extension of the Akaike information criterion, KIC, which i...
Estimation of Kullback-Leibler amount of information is a crucial part of deriving a statistical mod...