Information theory provides a powerful framework to analyse the representation of sensory stimuli in neural population activity. However, estimating the quantities involved, such as entropy and mutual information, from finite samples is notoriously hard, and any direct estimate is known to be heavily biased. This is especially true for large neural populations. We study a simple model of sensory processing and show, through a combinatorial argument, that for large neural populations the response patterns in any finite set of samples of neural activity recorded across a set of stimuli are, with high probability, mutually distinct. As a consequence, the mutual information estimated directly from empirical histograms will be equal to the stimulus entropy...
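The saturation effect described above is straightforward to reproduce numerically: when every sampled response pattern is unique, each pattern co-occurs with exactly one stimulus, the empirical conditional entropy of the stimulus given the response vanishes, and the plug-in estimate of the mutual information collapses onto the empirical stimulus entropy. The sketch below illustrates this with a toy model of independent binary neurons whose firing probabilities depend on the stimulus; the model, the parameter values, and the function names are illustrative assumptions, not the construction used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)


def plugin_mutual_information(stimuli, responses):
    """Plug-in (empirical histogram) estimate of I(S; R) in bits."""
    n = len(stimuli)
    joint, s_counts, r_counts = {}, {}, {}
    for s, r in zip(stimuli, responses):
        joint[(s, r)] = joint.get((s, r), 0) + 1
        s_counts[s] = s_counts.get(s, 0) + 1
        r_counts[r] = r_counts.get(r, 0) + 1
    mi = 0.0
    for (s, r), c in joint.items():
        p_sr = c / n
        mi += p_sr * np.log2(p_sr / ((s_counts[s] / n) * (r_counts[r] / n)))
    return mi


# Toy encoding model (an assumption for illustration): independent binary
# neurons whose firing probability depends on which of n_stim stimuli is shown.
n_neurons, n_stim, n_samples = 100, 8, 2000
stim_rates = rng.uniform(0.2, 0.8, size=(n_stim, n_neurons))

stimuli = rng.integers(n_stim, size=n_samples)
responses = [tuple((rng.random(n_neurons) < stim_rates[s]).astype(int))
             for s in stimuli]

# Empirical stimulus entropy, for comparison with the plug-in MI estimate.
p_s = np.bincount(stimuli, minlength=n_stim) / n_samples
h_s = -np.sum(p_s[p_s > 0] * np.log2(p_s[p_s > 0]))

print("distinct response patterns:", len(set(responses)), "out of", n_samples)
print("plug-in MI estimate (bits): %.4f" % plugin_mutual_information(stimuli, responses))
print("empirical stimulus entropy (bits): %.4f" % h_s)
```

In this toy setting every one of the 2000 sampled patterns is distinct, so the plug-in estimate reproduces the empirical stimulus entropy (about log2(8) = 3 bits) irrespective of how strongly the neurons are actually tuned to the stimulus, which is exactly the bias the abstract describes.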