Abstract: We consider penalized likelihood criteria for selecting models of dependent processes. The models may be strictly nested, overlapping, or nonnested; linear or nonlinear; and correctly specified or misspecified. We provide sufficient conditions on the penalty that guarantee selection, with probability one (or with probability approaching one), of the model attaining the lower average Kullback-Leibler Information Criterion (KLIC) or, when both models have the same KLIC, of the more parsimonious model. As special cases, our results cover the Akaike, Schwarz, and Hannan-Quinn information criteria. As examples, we consider selection of ARMAX-GARCH and STAR models.

File no.: 2070223010006
Department: Economics
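The three criteria named in the abstract share the penalized-likelihood form -2·loglik + c(n)·k and differ only in the penalty weight c(n) per parameter: 2 for Akaike, log(n) for Schwarz, and 2·log(log(n)) for Hannan-Quinn. As a minimal illustrative sketch (not the paper's own code), the comparison can be written as follows; the model names and log-likelihood values are hypothetical, chosen so that the lighter AIC penalty favors the larger model while the heavier Schwarz and Hannan-Quinn penalties favor the more parsimonious one:

```python
import math

def info_criterion(loglik, k, n, penalty="aic"):
    """Penalized likelihood criterion: -2*loglik + c(n)*k,
    where c(n) is the penalty weight per estimated parameter."""
    weights = {
        "aic": 2.0,                          # Akaike
        "bic": math.log(n),                  # Schwarz
        "hq": 2.0 * math.log(math.log(n)),   # Hannan-Quinn
    }
    return -2.0 * loglik + weights[penalty] * k

# Hypothetical fits on n = 500 observations: (maximized log-likelihood, #parameters)
n = 500
models = {"ARMA(1,1)": (-712.4, 3), "ARMA(2,2)": (-709.0, 5)}

for penalty in ("aic", "bic", "hq"):
    best = min(models, key=lambda m: info_criterion(*models[m], n, penalty))
    print(penalty, "selects", best)
```

With these numbers, AIC selects ARMA(2,2) while the more heavily penalized BIC and Hannan-Quinn criteria select ARMA(1,1), illustrating how the choice of penalty governs which model is chosen when log-likelihood differences are small.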