Statistical field theory captures the collective non-equilibrium dynamics of neuronal networks, but it does not address the inverse problem of finding the connectivity that implements a desired dynamics. We here show, for an analytically solvable network model, that the effective action in statistical field theory is identical to the rate function in large deviation theory; using field-theoretical methods we derive this rate function. It takes the form of a Kullback-Leibler divergence and enables data-driven inference of model parameters and Bayesian prediction of time series.
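The Kullback-Leibler divergence mentioned above has the standard form D_KL(P‖Q) = Σ p log(p/q). As a minimal numerical sketch of that form (for discrete distributions only; the paper's rate function acts on path measures, which this does not reproduce):

```python
import numpy as np

def kl_divergence(p, q):
    """Discrete Kullback-Leibler divergence D_KL(P || Q).

    p, q: probability vectors over the same support; q must be
    strictly positive wherever p is positive.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0  # terms with p = 0 contribute 0 by convention
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# D_KL vanishes iff P = Q and is positive otherwise -- the property
# that makes it a natural rate function: typical behavior has zero
# cost, deviations are exponentially suppressed.
```
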
This paper is a review dealing with the study of large size random recurrent neural networks. The co...
Understanding capabilities and limitations of different network architectures is of fundamental impo...
We here unify the field-theoretical approach to neuronal networks with large deviations theory. For ...
The remarkable properties of information-processing by biological and artificial neuronal networks a...
In this work we determine a process-level Large Deviation Principle (LDP) for a model of interacting...
Recurrent spiking neural networks can provide biologically inspired models of robot controll...
We study the asymptotic law of a network of interacting neurons when the number of neurons b...
Cerebral cortex is composed of intricate networks of neurons. These neuronal networks are strongly i...