The H-theorem states that entropy production is nonnegative and, therefore, the entropy of a closed system should change monotonically in time. In information processing, the entropy production is positive for random transformations of signals (the information processing lemma). Originally, the H-theorem and the information processing lemma were proved for the classical Boltzmann-Gibbs-Shannon entropy and for the corresponding divergence (the relative entropy). Many new entropies and divergences have been proposed during recent decades, and for all of them the H-theorem is needed. This note proposes a simple and general criterion to check whether the H-theorem is valid for a convex divergence H and demonstrates that some of the popular divergen...
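The H-theorem for Markov chains is easy to illustrate numerically: relative entropy between two distributions cannot increase when both are pushed through the same stochastic map (the information processing lemma). A minimal sketch, using a hypothetical two-state transition matrix chosen only for illustration:

```python
import numpy as np

def kl(p, q):
    # Kullback-Leibler divergence D(p || q), the classical relative entropy
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Row-stochastic transition matrix of a Markov chain (hypothetical example)
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])

# Two initial distributions, evolved by the same Markov step p -> p @ P
p = np.array([0.7, 0.3])
q = np.array([0.4, 0.6])

# Information processing lemma / H-theorem for Markov chains:
# the divergence is non-increasing along the dynamics
d_before = kl(p, q)
d_after = kl(p @ P, q @ P)
assert d_after <= d_before
```

Taking q to be the stationary distribution of P recovers the usual H-theorem statement: D(p_t || q) decreases monotonically toward zero.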
It is well known that in Information Theory and Machine Learning the Kullback-Leibler divergence, wh...
Jaynes' information theory formalism of statistical mechanics is applied to th...
We introduce an axiomatic approach to entropies and relative entropies that relies only on minimal i...
Remarkable progress in quantum information theory (QIT) has made it possible to formulate mathematical theorems f...
We consider the question of the existence of a generalized H-theorem in the context of the variation...
We start with a clear distinction between Shannon’s Measure of Information (SMI) and the Thermodynam...
The H-theorem is an extension of the Second Law to a time-sequence of states that need not be equili...
It is demonstrated that the second-order Markovian closures frequently used in turbulence ...
The dissipation of general convex entropies for continuous time Markov process...
We analyze Furth's 1933 classical uncertainty relations in the modern language of stochastic differe...
In information theory, the four Shannon-Khinchin (SK) axioms determine the Boltzmann-Gibbs entropy, S ~ -Sig...
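The Boltzmann-Gibbs-Shannon entropy singled out by the SK axioms is S = -Σ_i p_i log p_i. A short sketch verifying two of its defining properties (maximality on the uniform distribution and the 0·log 0 = 0 convention):

```python
import numpy as np

def bg_entropy(p):
    # Boltzmann-Gibbs-Shannon entropy S = -sum_i p_i * log(p_i), natural log
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # convention: 0 * log(0) = 0
    return float(-np.sum(p * np.log(p)))

# The uniform distribution on n states maximizes S, with S = log(n)
n = 4
S_uniform = bg_entropy(np.full(n, 1.0 / n))
assert abs(S_uniform - np.log(n)) < 1e-12

# A deterministic (one-point) distribution has zero entropy
assert bg_entropy([1.0, 0.0, 0.0, 0.0]) == 0.0
```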
In this paper we review various information-theoretic characterizations of the approach to equilibri...
We use a well-studied soluble model to define a nonequilibrium entropy. This entropy has all the req...
The starting point of my thesis is a recent result of microscopic thermodynamics obtained with techn...
We show via counterexamples that relative entropy between the solution of a Markovian m...