We here unify the field-theoretical approach to neuronal networks with large deviations theory. For a prototypical random recurrent network model with continuous-valued units, we show that the effective action is identical to the rate function, and we derive the latter using field theory. This rate function takes the form of a Kullback-Leibler divergence, which enables data-driven inference of model parameters and the calculation of fluctuations beyond mean-field theory. Lastly, we expose a regime with fluctuation-induced transitions between mean-field solutions.
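As a generic illustration of how a large-deviations rate function can take a Kullback-Leibler form, consider the classical Sanov-type setting (a standard textbook sketch, not the network-specific derivation of the paper above): for $N$ i.i.d. samples drawn from a reference law $\mu_0$, the empirical measure $\hat{\mu}_N$ concentrates exponentially,

```latex
\begin{align}
  P\bigl(\hat{\mu}_N \approx \mu\bigr) &\asymp e^{-N I(\mu)}, \\
  I(\mu) &= D_{\mathrm{KL}}(\mu \,\|\, \mu_0)
          = \int \mu(\mathrm{d}x)\,\log\frac{\mathrm{d}\mu}{\mathrm{d}\mu_0}(x),
\end{align}
```

so that atypical empirical distributions $\mu$ are penalized by their KL divergence from the reference law $\mu_0$. In the interacting-network setting, the reference law and the rate function are model-dependent, but the KL structure is what makes data-driven parameter inference natural.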
We study the mean-field limit and stationary distributions of a pulse-coupled network mode...
Considering a model of dynamical units interconnected through a random weighted directed graph, we i...
Statistical field theory captures collective non-equilibrium dynamics of neuronal networks, but it d...
The remarkable properties of information-processing by biological and artificial neuronal networks a...
This paper is a review dealing with the study of large size random recurrent neural networks. The co...
Recurrent spiking neural networks can provide a biologically inspired model of robot controll...
In this work we determine a process-level Large Deviation Principle (LDP) for a model of interacting...
In this work we determine a Large Deviation Principle (LDP) for a model of neurons interacting on a ...
We study the asymptotic law of a network of interacting neurons when the number of neurons b...
How does reliable computation emerge from networks of noisy neurons? While individual neurons are in...
Controlling activity in recurrent neural network models of brain regions is essential both...
We analyze the macroscopic behavior of multi-populations randomly connected ne...