We investigate the relationship between thermodynamic and information-theoretic inefficiencies in an individual neuron model, the adaptive exponential integrate-and-fire neuron. Recent work has revealed that minimization of energy dissipation is tightly related to optimal information processing, in the sense that a system has to compute a maximally predictive model. In this thesis we justify the extension of these results to the neuron and quantify the neuron's thermodynamic and information-processing inefficiencies.
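The adaptive exponential integrate-and-fire (AdEx) neuron named above combines a leaky integrator with an exponential spike-initiation term and a slow adaptation variable (Brette & Gerstner 2005). As a point of reference, a minimal forward-Euler simulation might look like the sketch below; the parameter values are illustrative textbook defaults, not the ones used in the thesis.

```python
import math

# Minimal Euler-integration sketch of the adaptive exponential
# integrate-and-fire (AdEx) neuron:
#
#   C dV/dt     = -g_L (V - E_L) + g_L * Delta_T * exp((V - V_T) / Delta_T) - w + I
#   tau_w dw/dt = a (V - E_L) - w
#
# with the reset rule V -> V_r, w -> w + b whenever V crosses V_peak.
# All parameter values are illustrative defaults, not those of the thesis.

def simulate_adex(I=1000.0, T=200.0, dt=0.01):
    """Simulate the AdEx neuron for T ms with step dt ms under a constant
    input current I (pA); return the number of spikes emitted."""
    C, g_L, E_L = 281.0, 30.0, -70.6      # pF, nS, mV
    V_T, Delta_T = -50.4, 2.0             # mV, mV
    a, tau_w, b = 4.0, 144.0, 80.5        # nS, ms, pA
    V_r, V_peak = -70.6, 20.0             # mV, mV

    V, w, spikes = E_L, 0.0, 0
    for _ in range(int(T / dt)):
        dV = (-g_L * (V - E_L)
              + g_L * Delta_T * math.exp((V - V_T) / Delta_T)
              - w + I) / C                # pA / pF = mV / ms
        dw = (a * (V - E_L) - w) / tau_w
        V += dt * dV
        w += dt * dw
        if V >= V_peak:                   # spike: reset voltage, bump adaptation
            V, w, spikes = V_r, w + b, spikes + 1
    return spikes
```

The spike-triggered increment `b` and the subthreshold coupling `a` give the model its adaptive firing patterns; with the defaults above, a sufficiently strong constant current produces tonic spiking with gradually lengthening interspike intervals.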
The human brain is the most complex computational machine known to science, even though its componen...
Selective pressure may drive neural systems to process as much information as possible with the lowe...
Abstract. We show that a rate of conditional Shannon entropy reduction, characterizing the learning ...
In systems biology, questions concerning the molecular and cellular makeup of an organism are of utm...
Unravelling the physical limits of information processing is an important goal of non-equilibrium st...
Biological sensory systems react to changes in their surroundings. They are characterized by fast re...
International audience—Probabilistic and neural approaches, through their incorporation of nonlinear...
In contrast to artificial systems, animals must forage for food. In biology, the availability of ene...
Information measures are often used to assess the efficacy of neural networks, and learning rules ca...
Identifying the determinants of neuronal energy consumption and their relationship to information co...
Human and animal experiments have shown that acquiring and storing information can require substanti...
About 50-80% of total energy is consumed by signaling in neural networks. A neural network consumes ...
Active inference is a normative framework for explaining behaviour under the free energy principle—a...
In this paper, we pursue recent observations that, through selective dendritic filtering, single neu...