A central result that arose in applying information theory to the stochastic thermodynamics of nonlinear dynamical systems is the Information-Processing Second Law (IPSL): the physical entropy of the universe can decrease if compensated by the Shannon-Kolmogorov-Sinai entropy change of appropriate information-carrying degrees of freedom. In particular, the asymptotic-rate IPSL precisely delineates the thermodynamic functioning of autonomous Maxwellian demons and information engines. How do these systems begin to function as engines, Landauer erasers, and error correctors? Here, we identify a minimal, inescapable transient dissipation engendered by physical information processing...
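A hedged aside for context (notation assumed here for illustration, not quoted from the abstract above): the asymptotic-rate IPSL referred to in this abstract is commonly written as

    \langle W \rangle \;\le\; k_B T \ln 2 \,\bigl( h'_\mu - h_\mu \bigr),

where h_\mu and h'_\mu denote the Shannon entropy rates (in bits per symbol) of the input and output information-bearing sequences: work can be extracted only insofar as the device randomizes its output relative to its input.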
Information-theoretic approaches provide a promising avenue for extending the laws of thermodynamics...
We show that a rate of conditional Shannon entropy reduction, characterizing the learning ...
Algorithmic information theory in conjunction with Landauer’s principle can quantify the cost ...
Understanding structured information and computation in thermodynamic systems is crucial to progres...
Irreversible information processing cannot be carried out without some inevitable thermodynamical wo...
Landauer's Principle states that the energy cost of information processing must exceed the product o...
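For orientation (a standard statement of the bound this abstract refers to, not text from the paper itself): erasing one bit of information in contact with a heat bath at temperature T costs at least

    W_{\mathrm{erase}} \;\ge\; k_B T \ln 2

of work per bit, roughly 2.9 \times 10^{-21} J (about 0.018 eV) at T = 300 K.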
In this thesis we explore the relationship between information processing and physics. We use variou...
Information is often considered as an abstract entity, but it is always stored and processed by a ph...
The common saying, that information is power, takes a rigorous form in stochastic thermodynamics, wh...
We provide a unified thermodynamic formalism describing information transfers in autonomous as well ...
A restricted form of Landauer’s Principle, independent of computational considerations, is shown to ...
Transfer entropy is a recently introduced information-theoretic measure quantifying directed statist...
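As a hedged pointer (the standard definition due to Schreiber, with history lengths k and l assumed here for illustration): the transfer entropy from a process Y to a process X is

    T_{Y \to X} \;=\; \sum p\bigl(x_{n+1}, x_n^{(k)}, y_n^{(l)}\bigr) \, \log \frac{p\bigl(x_{n+1} \mid x_n^{(k)}, y_n^{(l)}\bigr)}{p\bigl(x_{n+1} \mid x_n^{(k)}\bigr)},

i.e. the reduction in uncertainty about the next state of X obtained from the past of Y beyond what X's own past already provides.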
We study a two-level system controlled in a discrete feedback loop, modeling both the system and the...
One of the most powerful laws in physics is the second law of thermodynamics, which states that the ...