To investigate directed interactions in neural networks we often use Norbert Wiener's famous definition of observational causality. Wiener's definition states that an improvement in the prediction of the future of a time series X from its own past, obtained by incorporating information from the past of a second time series Y, is seen as an indication of a causal interaction from Y to X. Early implementations of Wiener's principle, such as Granger causality, modelled the interacting systems as linear autoregressive processes and assumed the interactions themselves to be linear. However, in complex systems such as the brain, nonlinear behaviour of the parts and nonlinear interactions between them have to be expected. In fact nonlinear ...
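As a minimal illustration of the prediction-improvement idea described above, the sketch below fits two linear least-squares predictors with numpy and compares their residual variances. The function name granger_improvement, the model order p, and the log-variance-ratio statistic are illustrative choices made for this sketch, not the method of any particular paper cited here.

```python
import numpy as np

def granger_improvement(x, y, p=2):
    """Wiener-style prediction-improvement statistic from y to x.

    Fits two least-squares predictors of x[t]: one from the past p samples of x
    alone, and one from the past p samples of both x and y. Returns the log ratio
    of the residual variances; values > 0 indicate that y's past improved the
    prediction of x, which Wiener's principle reads as an interaction y -> x.
    """
    n = len(x)
    # Lagged design matrices: column k holds the series delayed by k + 1 samples.
    past_x = np.column_stack([x[p - k - 1:n - k - 1] for k in range(p)])
    past_y = np.column_stack([y[p - k - 1:n - k - 1] for k in range(p)])
    target = x[p:]
    # Restricted model: predict x from its own past only.
    coef_r, *_ = np.linalg.lstsq(past_x, target, rcond=None)
    res_r = target - past_x @ coef_r
    # Full model: predict x from the past of both x and y.
    full = np.hstack([past_x, past_y])
    coef_f, *_ = np.linalg.lstsq(full, target, rcond=None)
    res_f = target - full @ coef_f
    return float(np.log(np.var(res_r) / np.var(res_f)))

# Toy example: y drives x with a one-sample delay, so the statistic should be
# clearly positive for y -> x and close to zero for the reverse direction.
rng = np.random.default_rng(0)
y = rng.standard_normal(5000)
x = np.empty_like(y)
x[0] = rng.standard_normal()
x[1:] = 0.8 * y[:-1] + 0.2 * rng.standard_normal(len(y) - 1)
print(granger_improvement(x, y))   # y -> x: positive
print(granger_improvement(y, x))   # x -> y: near zero
```

This linear sketch corresponds to the early, Granger-style implementations mentioned above; it will miss purely nonlinear couplings, which is the motivation for model-free measures such as transfer entropy.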
In complex networks such as gene networks, traffic systems or brain circuits it is important to unde...
Objective: Assessing brain connectivity from electrophysiological signals is of great relevance in n...
Current neural network architectures are many times harder to train because of the increasing size ...
Background: Transfer entropy (TE) is a measure for the detection of directed interactions. Transfer ...
This paper aims at estimating causal relationships between signals to detect f...
Final-year undergraduate physics project (Treballs Finals de Grau de Física), Facultat de Física, Universitat de Barcelona, academic year 2016, Tutor: ...
We assume that even the simplest model of the brain is nonlinear and ‘causal’. Proceeding with the ...
Understanding causal relationships, or effective connectivity, between parts of the brain is of utmo...
A challenge for physiologists and neuroscientists is to map information transfer between components ...
It is a common notion in neuroscience research that the brain and neural systems in general "perform...
Poster presentation: Functional connectivity of the brain describes the network of correlated activi...
Transfer entropy (TE) provides a generalized and model-free framework to study Wiener-Granger causal...
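For reference, a standard formulation of TE, following Schreiber's original definition (not necessarily the exact estimator used in the work above; the history lengths k and l are free parameters), is

T_{Y \to X} = \sum_{x_{t+1},\, x_t^{(k)},\, y_t^{(l)}} p\!\left(x_{t+1}, x_t^{(k)}, y_t^{(l)}\right) \log \frac{p\!\left(x_{t+1} \mid x_t^{(k)}, y_t^{(l)}\right)}{p\!\left(x_{t+1} \mid x_t^{(k)}\right)},

where x_t^{(k)} = (x_t, \ldots, x_{t-k+1}) denotes the length-k past of X and y_t^{(l)} the length-l past of Y. The measure is zero exactly when the past of Y carries no additional information about the next value of X beyond X's own past, which makes it the model-free counterpart of the Wiener-Granger prediction-improvement criterion.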
Cross-frequency interactions, a form of oscillatory neural activity, are thought to play an essentia...