Various aspects of statistical model selection are discussed from the viewpoint of a statistician. Our concern here is with selection procedures based on the Kullback-Leibler information number. A derivation of AIC (Akaike's Information Criterion) is given, from which a natural extension of AIC, called TIC (Takeuchi's Information Criterion), follows. It is shown that TIC is asymptotically equivalent to cross-validation in a general context, whereas AIC is asymptotically equivalent only in the case of independent, identically distributed observations. Next, the maximum penalized likelihood estimate is considered in place of the maximum likelihood estimate for estimating the parameters after a model is selected. Then the weight of penal...
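For reference (standard forms, added here rather than quoted from the abstract), write \(\hat\theta\) for the maximum likelihood estimate, \(k\) for the number of estimated parameters, and \(x_1,\dots,x_n\) for the observations. The two criteria are then
\[
\mathrm{AIC} = -2\sum_{i=1}^{n}\log f(x_i;\hat\theta) + 2k,
\qquad
\mathrm{TIC} = -2\sum_{i=1}^{n}\log f(x_i;\hat\theta) + 2\,\operatorname{tr}\!\bigl(\hat J\hat I^{-1}\bigr),
\]
with
\[
\hat I = -\frac{1}{n}\sum_{i=1}^{n}\frac{\partial^{2}\log f(x_i;\hat\theta)}{\partial\theta\,\partial\theta^{\top}},
\qquad
\hat J = \frac{1}{n}\sum_{i=1}^{n}\frac{\partial\log f(x_i;\hat\theta)}{\partial\theta}\,
\frac{\partial\log f(x_i;\hat\theta)}{\partial\theta^{\top}}.
\]
When the model is correctly specified, \(\operatorname{tr}(\hat J\hat I^{-1})\) tends to \(k\), and TIC reduces to AIC.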
We consider the problem of model (or variable) selection in the classical regression model using the...
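As an illustration of criterion-based variable selection in the classical linear model, here is a minimal sketch (mine, not taken from the paper above) of exhaustive subset selection by AIC, assuming Gaussian errors and ordinary least-squares fits:

```python
# Minimal sketch: exhaustive AIC-based subset selection in a Gaussian
# linear regression, fitted by ordinary least squares.
import itertools
import numpy as np

def gaussian_aic(y, design):
    """AIC of an OLS fit, using -2*logL = n*log(RSS/n) up to an additive
    constant, and counting the error variance as one extra parameter."""
    n, k = design.shape
    beta, *_ = np.linalg.lstsq(design, y, rcond=None)
    rss = float(np.sum((y - design @ beta) ** 2))
    return n * np.log(rss / n) + 2 * (k + 1)

def best_subset_by_aic(y, X):
    """Return (AIC, column indices of X) of the AIC-minimizing subset;
    an intercept column is always included."""
    n, p = X.shape
    best_score, best_cols = np.inf, ()
    for r in range(p + 1):
        for cols in itertools.combinations(range(p), r):
            design = np.column_stack([np.ones(n), X[:, list(cols)]])
            score = gaussian_aic(y, design)
            if score < best_score:
                best_score, best_cols = score, cols
    return best_score, best_cols
```

The exhaustive search is only feasible for a moderate number of candidate regressors; the point of the sketch is the criterion comparison itself, not the search strategy.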
The selection of an appropriate model is a fundamental step of the data analysis in small area estim...
A bias correction to the Akaike information criterion, AIC, is derived for regression and autoregres...
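The resulting corrected criterion, usually written AICc, takes the standard small-sample form (stated here for completeness; \(n\) is the sample size and \(k\) the number of estimated parameters):
\[
\mathrm{AIC}_c \;=\; -2\log L(\hat\theta) + 2k + \frac{2k(k+1)}{n-k-1}
\;=\; \mathrm{AIC} + \frac{2k(k+1)}{n-k-1},
\]
which approaches AIC as \(n\) grows but penalizes heavily parameterized models much more strongly when \(n/k\) is small.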
The Akaike information criterion, AIC, is widely used for model selection. Using the AIC as the esti...
For regression and time series model selection, Hurvich and Tsai (1989) obtained a bias correction A...
Estimation of Kullback-Leibler amount of information is a crucial part of deriving a statistical mod...
Akaike Information Criterion (AIC) has been used widely as a statistical criterion to compare the ap...
We consider penalized likelihood criteria for selecting models of dependent processes. T...
This book is unique in that it covers the philosophy of model-based data analysis and an omnibus str...
A new estimator, AIC_i, of the Kullback-Leibler information is proposed for Gaussian autoregressive t...
Estimation of the expected Kullback-Leibler information is the basis for deriving the Akaike informa...
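Concretely (standard definitions, added for context), for a true density \(g\) and a candidate model \(f(\cdot;\theta)\) the Kullback-Leibler information is
\[
I(g; f_\theta) \;=\; \int g(x)\,\log\frac{g(x)}{f(x;\theta)}\,dx
\;=\; E_g[\log g(X)] - E_g[\log f(X;\theta)].
\]
Since \(E_g[\log g(X)]\) is common to all candidate models, model comparison only requires an estimate of \(E_g[\log f(X;\hat\theta)]\); the \(2k\) term in AIC is the asymptotic correction for the bias incurred by evaluating this expectation on the same data used to compute \(\hat\theta\).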
Since the 1990s, the Akaike Information Criterion (AIC) and its various modifications/extensions, in...
This paper studies the problem of model selection in a large class of causal t...
Information-theoretic approaches to model selection, such as Akaike's information criterion (AIC) an...
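A common device in this information-theoretic literature (stated here as background, not as a claim about this particular paper) is to rescale the criterion values of the candidate models into differences and weights,
\[
\Delta_i = \mathrm{AIC}_i - \min_j \mathrm{AIC}_j,
\qquad
w_i = \frac{\exp(-\Delta_i/2)}{\sum_{j}\exp(-\Delta_j/2)},
\]
so that models can be ranked and the weights \(w_i\) used for multimodel inference or model averaging.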
An improved AIC-based criterion is derived for model selection in general smoothing-based modeling, ...