The main purpose of this paper is to introduce and study the behavior of minimum φ-divergence estimators as an alternative to the maximum-likelihood estimator in latent class models for binary items. As will become clear below, minimum φ-divergence estimators are a natural extension of the maximum-likelihood estimator. The asymptotic properties of minimum φ-divergence estimators for latent class models for binary data are developed. Finally, to compare the efficiency and robustness of these new estimators with those of maximum likelihood when the sample size is not large enough to apply the asymptotic results, we have carried out a simulation study.
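The abstract above concerns minimum φ-divergence estimation for binary data. As an illustrative sketch only (not the paper's actual latent class estimator), the following fits a single success probability θ to multinomial cell counts by grid-minimising a Cressie-Read power divergence between the empirical cell proportions and the model probabilities. The toy model (two i.i.d. binary items, cells for 0/1/2 successes), the function names, and the default λ = 2/3 are all assumptions made for illustration.

```python
import numpy as np

def power_divergence(p, q, lam=2/3):
    # Cressie-Read power divergence D_lam(p || q) between probability vectors
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(np.sum(p * ((p / q) ** lam - 1.0)) / (lam * (lam + 1.0)))

def min_divergence_theta(counts, lam=2/3, grid=np.linspace(0.001, 0.999, 999)):
    """Minimum power-divergence estimate of theta for a toy model of two
    i.i.d. binary items: cell probabilities ((1-t)^2, 2t(1-t), t^2) for
    0, 1, or 2 successes.  Grid search stands in for a real optimiser."""
    p_hat = np.asarray(counts, float) / np.sum(counts)  # empirical proportions
    best_t, best_d = None, np.inf
    for t in grid:
        q = np.array([(1 - t) ** 2, 2 * t * (1 - t), t ** 2])
        d = power_divergence(p_hat, q, lam)
        if d < best_d:
            best_t, best_d = t, d
    return best_t

# Observed counts of respondents with 0, 1, and 2 positive items
theta = min_divergence_theta([30, 50, 20])
```

For λ → 0 the objective reduces to the Kullback-Leibler divergence and the minimiser coincides with the maximum-likelihood estimate, which is the sense in which these estimators extend maximum likelihood.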
The Cressie-Read (CR) family of power divergence measures is used to identify a new class of statist...
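The Cressie-Read family mentioned above is indexed by a single power parameter λ. A minimal sketch of its standard closed form, with the λ → 0 and λ → −1 cases taken as KL limits (the function name and tolerance are my own choices):

```python
import numpy as np

def cressie_read(p, q, lam):
    """Cressie-Read power divergence between probability vectors p and q.

    lam = 1 gives half the Pearson chi-square discrepancy; lam -> 0 and
    lam -> -1 are handled as the KL and reverse-KL limits respectively."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    if abs(lam) < 1e-12:            # limit lam -> 0: KL(p || q)
        return float(np.sum(p * np.log(p / q)))
    if abs(lam + 1.0) < 1e-12:      # limit lam -> -1: KL(q || p)
        return float(np.sum(q * np.log(q / p)))
    return float(np.sum(p * ((p / q) ** lam - 1.0)) / (lam * (lam + 1.0)))

# lam = 2/3 is the value Cressie and Read recommended for goodness-of-fit work
d = cressie_read([0.2, 0.3, 0.5], [0.25, 0.25, 0.5], 2/3)
```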
Recent work has focused on the problem of nonparametric estimation of information divergence functio...
The problem of estimating the Kullback-Leibler divergence D(P||Q) between two unknown distributions ...
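A crude baseline for the sample-based KL estimation problem described above is a histogram plug-in estimator: bin both samples on a common grid and evaluate KL on smoothed empirical frequencies. This is only a sketch under my own assumptions (bin count, additive smoothing, names), and such plug-in estimators are precisely what the nonparametric literature tries to improve on.

```python
import numpy as np

def kl_plugin(x, y, bins=20):
    """Histogram plug-in estimate of D(P || Q) from samples x ~ P, y ~ Q.
    Additive (Laplace-style) smoothing keeps every bin strictly positive."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    edges = np.linspace(min(x.min(), y.min()), max(x.max(), y.max()), bins + 1)
    p, _ = np.histogram(x, bins=edges)
    q, _ = np.histogram(y, bins=edges)
    p = (p + 1.0) / (p.sum() + bins)   # smoothed bin frequencies
    q = (q + 1.0) / (q.sum() + bins)
    return float(np.sum(p * np.log(p / q)))

rng = np.random.default_rng(0)
same = kl_plugin(rng.normal(size=5000), rng.normal(size=5000))
shifted = kl_plugin(rng.normal(size=5000), rng.normal(2.0, 1.0, size=5000))
```

The estimate is near zero for two samples from the same distribution and grows as the distributions separate, but it inherits the bias of the binning and smoothing choices.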
In this paper we present two new families of test statistics for studying the problem of goodness-of...
In the present work, the problem of estimating parameters of statistical models for categori...
This paper uses information theoretic methods to introduce a new class of probability distributions...
We establish a decomposition of the Jensen-Shannon divergence into a linear combination of a scaled ...
This paper introduces a new class of estimators based on minimization of the Cressie-Read (CR) power...
The consequences of model misspecification for multinomial data when using minimum φ-divergence ...
This paper studies the Minimum Divergence (MD) class of estimators for econometric models specified...
Stochastic modeling for large-scale datasets usually involves a varying-dimensional model sp...
This paper focuses on the consequences of assuming a wrong model for multinomial data when using min...
The Expectation-Maximization (EM) algorithm is routinely used for maximum likelihood estimation in l...