In this work we determine a process-level Large Deviation Principle (LDP) for a model of interacting neurons indexed by the lattice Z^d. The neurons are subject to noise, which is modelled as a correlated martingale. The probability law governing the noise is strictly stationary, and we are therefore able to find an LDP for the probability laws Π_n governing the stationary empirical measure μ̂_n generated by the neurons in a cube of side length (2n + 1). We use this LDP to determine an LDP for the neural network model. The connection weights between the neurons evolve according to a learning rule (neuronal plasticity), and these results are adaptable to a large variety of neural network models. This LDP is of great use in the mathematical modelling ...
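As a point of reference (and not a reproduction of the paper's precise statement), the objects involved can be sketched in standard Donsker–Varadhan form; the cube V_n, the speed (2n + 1)^d, the shifts θ_j and the rate function I below are illustrative notation assumed here, with X denoting the field of neuron trajectories:

\[
\hat{\mu}_n \;=\; \frac{1}{(2n+1)^d} \sum_{j \in V_n} \delta_{\theta_j X},
\qquad V_n = \{-n, \dots, n\}^d,
\]
and an LDP for the laws \Pi_n of \hat{\mu}_n with rate function I means that, for Borel sets A of stationary measures,
\[
-\inf_{\mu \in A^{\circ}} I(\mu)
\;\le\; \liminf_{n \to \infty} \frac{1}{(2n+1)^d} \log \Pi_n(A)
\;\le\; \limsup_{n \to \infty} \frac{1}{(2n+1)^d} \log \Pi_n(A)
\;\le\; -\inf_{\mu \in \overline{A}} I(\mu).
\]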
We analyze the macroscopic behavior of multi-populations randomly connected ne...
We here unify the field-theoretical approach to neuronal networks with large deviations theory. For ...
In this work we determine a Large Deviation Principle (LDP) for a model of neurons interacting on a ...
We study the asymptotic behaviour for asymmetric neuronal dynamics in a network of Hopfield...
We study the asymptotic law of a network of interacting neurons when the number of neurons b...
This thesis addresses the rigorous derivation of mean-field results for the continuous time dynamics...
How does reliable computation emerge from networks of noisy neurons? While individual neurons are in...