The Chernoff information between two probability measures is a statistical divergence measuring their deviation, defined as their maximally skewed Bhattacharyya distance. Although the Chernoff information was originally introduced for bounding the Bayes error in statistical hypothesis testing, the divergence has since found many other applications, owing to its empirical robustness, in areas ranging from information fusion to quantum information. From the viewpoint of information theory, the Chernoff information can also be interpreted as a minimax symmetrization of the Kullback-Leibler divergence. In this paper, we first revisit the Chernoff information between two densities of a measurable Lebesgue space by considering the exponent...
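For context (these standard formulas are recalled here and are not quoted from the truncated abstract above), the α-skewed Bhattacharyya distance with respect to a dominating measure μ, and the Chernoff information as its maximal skewing, are usually written as

$$ B_\alpha(p:q) = -\log \int p(x)^{\alpha}\, q(x)^{1-\alpha}\,\mathrm{d}\mu(x), \qquad C(p:q) = \max_{\alpha\in(0,1)} B_\alpha(p:q), $$

and the minimax symmetrization of the Kullback-Leibler divergence referred to in the abstract is commonly stated as $C(p:q) = \min_{r} \max\{\mathrm{KL}(r:p),\ \mathrm{KL}(r:q)\}$.

The following sketch is purely illustrative and not an implementation from the paper: it estimates the Chernoff information between two univariate Gaussians by numerically maximizing the α-skewed Bhattacharyya distance. The choice of densities, the integration bounds, and the SciPy-based optimization are all assumptions made for this example.

```python
# Illustrative sketch (assumptions, not the paper's method): Chernoff information
# between two univariate Gaussians, obtained by maximizing the alpha-skewed
# Bhattacharyya distance B_alpha(p:q) = -log ∫ p(x)^alpha q(x)^(1-alpha) dx.
import numpy as np
from scipy.integrate import quad
from scipy.optimize import minimize_scalar
from scipy.stats import norm

def skewed_bhattacharyya(alpha, p, q, lo=-50.0, hi=50.0):
    """B_alpha(p:q), computed by numerical integration of the geometric mixture."""
    integrand = lambda x: p.pdf(x) ** alpha * q.pdf(x) ** (1.0 - alpha)
    integral, _ = quad(integrand, lo, hi)
    return -np.log(integral)

def chernoff_information(p, q):
    """Maximize B_alpha(p:q) over alpha in (0, 1); B_alpha is concave in alpha,
    so a bounded scalar search recovers the optimal skewing exponent alpha*."""
    res = minimize_scalar(lambda a: -skewed_bhattacharyya(a, p, q),
                          bounds=(1e-6, 1.0 - 1e-6), method="bounded")
    return -res.fun, res.x

# Example: two Gaussians with different means and scales (hypothetical inputs).
p, q = norm(loc=0.0, scale=1.0), norm(loc=2.0, scale=1.5)
c, alpha_star = chernoff_information(p, q)
print(f"Chernoff information ~ {c:.4f} attained at alpha* ~ {alpha_star:.3f}")
```

For two Gaussians with equal variances the optimal exponent is α* = 1/2 (the Bhattacharyya case); with unequal variances, as above, the maximizing α* generally differs from 1/2, which is what makes the skewing step necessary.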