We introduce the concept of information compressibility, KI, which measures the relative change in the number of available microstates of an open system in response to an energy variation. We then prove that, once the system reaches a steady state, the second and third time derivatives of the information entropy are proportional to the corresponding time derivatives of the energy, with KI as the proportionality constant. We argue that if two steady states with different but same-sign KI are dynamically connected in a non-adiabatic way, it takes longer to reach the state whose compressibility is closer to zero than the reverse. We also show analytically that for a two-level system in contact with external baths, the info...
Abstract: We consider the information and thermodynamic aspects of the transition between ...
We show that the conservation and the non-additivity of information, together with the additivity of...
A probability distribution encodes all the statistics of its corresponding random variable, hence it...
The act of measuring a system has profound consequences of dynamical and thermodynamic nature. In pa...
Jaynes' information theory formalism of statistical mechanics is applied to th...
This book aims to present an information-theoretical approach to thermodynamics and its generalisati...
We put forth a unifying formalism for the description of the thermodynamics of continuously monitore...
Information-theoretic approaches provide a promising avenue for extending the laws of thermodynamics...
Markovian master equations (formally known as quantum dynamical semigroups) can be used to describe ...
The dynamical convergence of a system to the thermal distribution, or Gibbs state, is a standard ass...
We explore the dynamics of two interacting information systems. We show that for the Markovian marg...
Thermodynamic entropy is not an entirely satisfactory measure of information of a quantum state. Thi...
We apply a certain unifying physical description of the results of Information Theory. Assuming that...
Landauer's principle provides a perspective on the physical meaning of information as well as on the...