Zhang (2019) presented a general estimation approach based on the Gaussian distribution for parametric models in which the likelihood of the data is difficult to obtain or unknown, but the mean vector and variance-covariance matrix are known. Castilla and Zografos (2021) extended the method to density power divergence-based estimators, which are more robust against data contamination than the likelihood-based Gaussian estimator. Here, we present the restricted minimum density power divergence Gaussian estimator (MDPDGE) and study its asymptotic and robustness properties through its asymptotic distribution and influence function, respectively. Restricted estimators are required in many practical situations, and here they provide constrained estimators...
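To make the construction concrete, the following is a minimal sketch of how such a restricted estimator could be computed. It assumes the standard density power divergence (DPD) objective of Basu et al. (1998) applied to a Gaussian working model N(mu(theta), Sigma(theta)), with the restriction imposed as an equality constraint in the optimizer; the function names (dpd_objective, mean_fn, cov_fn), the toy model, and the restriction g(theta) = m - 1 = 0 are all hypothetical illustrations, not the authors' implementation.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import multivariate_normal

# Hypothetical sketch of a restricted MDPDGE. Only the mean vector mu(theta)
# and covariance Sigma(theta) are assumed known; the Gaussian working density
# f_theta = N(mu(theta), Sigma(theta)) stands in for the unknown likelihood,
# and theta is estimated by minimizing the empirical DPD objective
#
#   H_n(theta) = int f_theta^{1+alpha} dx
#                - (1 + 1/alpha) * (1/n) * sum_i f_theta(x_i)^alpha,
#
# where, for a p-variate Gaussian,
#   int f_theta^{1+alpha} dx
#     = (1 + alpha)^{-p/2} * (2*pi)^{-p*alpha/2} * |Sigma|^{-alpha/2}.

def dpd_objective(theta, X, alpha, mean_fn, cov_fn):
    """Empirical DPD objective for the Gaussian working model (alpha > 0)."""
    n, p = X.shape
    mu, Sigma = mean_fn(theta), cov_fn(theta)
    f = multivariate_normal.pdf(X, mean=mu, cov=Sigma)   # f_theta(x_i)
    integral = ((1 + alpha) ** (-p / 2)
                * (2 * np.pi) ** (-p * alpha / 2)
                * np.linalg.det(Sigma) ** (-alpha / 2))  # int f_theta^{1+alpha}
    return integral - (1 + 1 / alpha) * np.mean(f ** alpha)

# Toy model (hypothetical): theta = (m, s), observations with mean (m, m)
# and covariance s^2 * I_2; restriction g(theta) = m - 1 = 0.
mean_fn = lambda th: np.array([th[0], th[0]])
cov_fn = lambda th: th[1] ** 2 * np.eye(2)

rng = np.random.default_rng(0)
X = rng.multivariate_normal([1.0, 1.0], 0.25 * np.eye(2), size=200)

res = minimize(
    dpd_objective, x0=np.array([0.5, 1.0]),
    args=(X, 0.3, mean_fn, cov_fn),                      # tuning parameter alpha = 0.3
    method="SLSQP",
    constraints=[{"type": "eq", "fun": lambda th: th[0] - 1.0}],  # restriction
    bounds=[(None, None), (1e-6, None)],                 # keep s > 0
)
print("restricted MDPDGE:", res.x)
```

Larger values of alpha downweight observations with small working-model density, which is the source of the robustness against contamination noted above; alpha -> 0 recovers the likelihood-based Gaussian estimator as a limiting case.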