Minimum divergence procedures based on the density power divergence and the logarithmic density power divergence have been extremely popular and successful in generating inference procedures that combine a high degree of model efficiency with strong outlier stability. Such procedures are preferable in practice to methods that either achieve robustness at a substantial cost in efficiency or are highly efficient but have poor robustness properties. The density power divergence (DPD) family of Basu et al. (1998) and the logarithmic density power divergence (LDPD) family of Jones et al. (2001) provide flexible classes of divergences in which the trade-off between efficiency and robustness is controlled by a single, real, non-negative tuning parameter.
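For concreteness, the DPD between the data density g and the model density f is commonly written as
\[
d_\alpha(g, f) = \int \left\{ f^{1+\alpha}(x) - \Big(1 + \tfrac{1}{\alpha}\Big) f^{\alpha}(x)\, g(x) + \tfrac{1}{\alpha}\, g^{1+\alpha}(x) \right\} dx, \qquad \alpha > 0,
\]
with the Kullback-Leibler divergence recovered in the limit $\alpha \to 0$. One common form of the LDPD (the exact parameterization varies across authors) replaces each integral in the DPD by its logarithm:
\[
d_\alpha^{(0)}(g, f) = \log \int f^{1+\alpha}\, dx - \frac{1+\alpha}{\alpha} \log \int f^{\alpha}\, g\, dx + \frac{1}{\alpha} \log \int g^{1+\alpha}\, dx .
\]
In both families, larger values of the tuning parameter $\alpha$ downweight observations that are improbable under the model, trading efficiency at the model for greater outlier stability.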