A new canonical divergence is put forward for generalizing an information-geometric measure of complexity for both classical and quantum systems. On the simplex of probability measures, we prove that the new divergence coincides with the Kullback–Leibler divergence, which is used to quantify how much a probability measure deviates from the non-interacting states that are modeled by exponential families of probabilities. On the space of positive density operators, we prove that the same divergence reduces to the quantum relative entropy, which quantifies many-party correlations of a quantum state from a Gibbs family.
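The two reductions stated in this abstract can be checked numerically: for commuting (diagonal) density operators the Umegaki quantum relative entropy Tr ρ(log ρ − log σ) collapses to the classical Kullback–Leibler divergence of the eigenvalue distributions. A minimal sketch with NumPy/SciPy (the function names are illustrative, not from the paper):

```python
import numpy as np
from scipy.linalg import logm

def kl_divergence(p, q):
    """Classical Kullback-Leibler divergence D(p||q) in nats."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(np.sum(p * (np.log(p) - np.log(q))))

def quantum_relative_entropy(rho, sigma):
    """Umegaki relative entropy D(rho||sigma) = Tr rho (log rho - log sigma)."""
    return float(np.real(np.trace(rho @ (logm(rho) - logm(sigma)))))

p = np.array([0.7, 0.3])
q = np.array([0.5, 0.5])

# Diagonal density operators commute, so the quantum relative entropy
# must equal the classical KL divergence of the diagonals.
rho, sigma = np.diag(p), np.diag(q)
assert np.isclose(quantum_relative_entropy(rho, sigma), kl_divergence(p, q))
```

For non-commuting ρ and σ the two quantities differ, which is exactly where the quantum divergence carries more information than its classical restriction.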
Given two density matrices ρ and σ, there are a number of different expressions that reduce to the α...
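This snippet breaks off at "α…"; assuming it refers to the quantum α-Rényi divergences, all of those quantum expressions reduce, for commuting states, to the single classical α-Rényi divergence D_α(p‖q) = (α − 1)⁻¹ log Σᵢ pᵢ^α qᵢ^(1−α). A short sketch of that classical baseline (names are illustrative):

```python
import numpy as np

def renyi_divergence(p, q, alpha):
    """Classical alpha-Renyi divergence D_alpha(p||q) in nats, alpha != 1."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(np.log(np.sum(p**alpha * q**(1.0 - alpha))) / (alpha - 1.0))

p = np.array([0.7, 0.3])
q = np.array([0.5, 0.5])

# As alpha -> 1 the Renyi divergence recovers the KL divergence,
# the alpha = 1 limit shared by the competing quantum definitions.
kl = float(np.sum(p * np.log(p / q)))
assert abs(renyi_divergence(p, q, 1.0 + 1e-6) - kl) < 1e-4
```

The quantum generalizations (e.g. Petz and sandwiched versions) differ precisely when ρ and σ fail to commute.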
We extend algorithmic information theory to quantum mechanics, taking a universal semicomputable den...
We study many-party correlations quantified in terms of the Umegaki relative entropy (divergence) fr...
Having a distance measure between quantum states satisfying the right properties is of fun...
Distance measures between quantum states like the trace distance and the fidelity can naturally be d...
In this paper it is proved that the quantum relative entropy D(k) can be asymptotically attained by...
The quantum relative entropy is a measure of the distinguishability of two quantum states, and it is...
We give systematic ways of defining monotone quantum relative entropies and (multi-variate) quantum ...
Recently a new quantum generalization of the Rényi divergence and the corresponding conditional Rény...
University of Technology Sydney, Faculty of Engineering and Information Technology. We investigate qu...
Data science, information theory, probability theory, statistical learning and other related discipl...
The quantum relative entropy is a measure of the distinguishability of two quantum states, and it is...
We discuss an alternative to relative entropy as a measure of distance between mixed quantum states....