This paper focuses on the consequences of assuming a wrong model for multinomial data when using minimum penalized φ-divergence estimators, also known as minimum penalized disparity estimators, to estimate the model parameters. These estimators are shown to converge to a well-defined limit. An application of the results obtained shows that a parametric bootstrap consistently estimates the null distribution of a certain class of test statistics for detecting model misspecification. An illustrative application to the accuracy assessment of the thematic quality of a global land cover map is included.
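The following is a minimal sketch, under stated assumptions, of the kind of procedure the abstract describes, not the paper's exact estimator or test: it takes the Cressie–Read power divergence as the φ-divergence, penalizes empty cells by a term h·p_j(θ) (one common penalization scheme), and assumes a user-supplied function `model_probs(theta)` mapping the parameter to a valid vector of cell probabilities. The function and variable names are illustrative.

```python
import numpy as np
from scipy.optimize import minimize

def penalized_phi_divergence(counts, probs, lam=2.0 / 3.0, h=1.0):
    """Penalized phi-divergence between observed relative frequencies and
    model cell probabilities (Cressie-Read power divergence used as an
    example of phi).  Empty cells contribute a penalty h * p_j instead of
    their usual divergence term -- one common penalization scheme."""
    n = counts.sum()
    p_hat = counts / n
    nonzero = p_hat > 0
    # power-divergence contribution of the non-empty cells
    ratio = p_hat[nonzero] / probs[nonzero]
    div = np.sum(p_hat[nonzero] * (ratio**lam - 1.0)) / (lam * (lam + 1.0))
    # penalty for the empty cells
    div += h * probs[~nonzero].sum()
    return div

def fit_minimum_penalized_divergence(counts, model_probs, theta0):
    """Minimum penalized divergence estimator: minimize the penalized
    divergence over theta, where model_probs(theta) returns the vector of
    model cell probabilities."""
    obj = lambda theta: penalized_phi_divergence(counts, model_probs(theta))
    return minimize(obj, theta0, method="Nelder-Mead").x

def parametric_bootstrap_pvalue(counts, model_probs, theta0, B=500, seed=None):
    """Parametric bootstrap for a misspecification test statistic of the
    form T = 2 n * (penalized divergence at the fitted parameter): resample
    from the fitted model, refit, and compare bootstrap statistics with the
    observed one."""
    rng = np.random.default_rng(seed)
    n = counts.sum()
    theta_hat = fit_minimum_penalized_divergence(counts, model_probs, theta0)
    t_obs = 2 * n * penalized_phi_divergence(counts, model_probs(theta_hat))
    t_boot = np.empty(B)
    for b in range(B):
        counts_b = rng.multinomial(n, model_probs(theta_hat))
        theta_b = fit_minimum_penalized_divergence(counts_b, model_probs, theta_hat)
        t_boot[b] = 2 * n * penalized_phi_divergence(counts_b, model_probs(theta_b))
    # bootstrap p-value: proportion of resampled statistics at least as large
    return float((t_boot >= t_obs).mean())
```

As a usage sketch, `model_probs` could encode, say, a binomial cell-probability model indexed by a single parameter; the returned p-value then approximates the null distribution of the test statistic under the fitted model, which is the role the abstract attributes to the parametric bootstrap.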