In everyday life, we receive affective information from a multisensory environment. What we see and what we hear jointly influence how we feel, think, and act. Outstanding questions remain about the behavioral and neural mechanisms underlying how we combine visual and auditory affective signals. In this dissertation, I report a series of behavioral, EEG, and fMRI experiments addressing this question. Behaviorally, I found congruency, visual dominance, and negativity dominance effects. Using ERPs, I showed that these behavioral effects map onto distinct time courses in audiovisual affective processing. Time-frequency analyses of the EEG data showed early sub-additive evoked theta, long-lasting supra-addit...
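The additivity criterion invoked here compares the multisensory response with the sum of the unimodal responses: theta power to audiovisual (AV) stimulation that falls below the summed auditory-only (A) and visual-only (V) power is sub-additive, and power exceeding that sum is supra-additive. The sketch below illustrates this comparison on synthetic single-channel data; it is not the dissertation's analysis pipeline, and the sampling rate, the 4-8 Hz theta band, the band-pass + Hilbert power estimate, and all variable names are illustrative assumptions.

import numpy as np
from scipy.signal import butter, filtfilt, hilbert
from scipy.stats import ttest_1samp

fs = 250                        # sampling rate in Hz (assumed)
n_trials, n_times = 60, fs      # 60 synthetic 1-second epochs per condition
rng = np.random.default_rng(0)

def theta_power(epochs, fs, band=(4.0, 8.0)):
    """Mean theta-band power per trial via band-pass filtering + Hilbert envelope."""
    b, a = butter(4, band, btype="bandpass", fs=fs)
    filtered = filtfilt(b, a, epochs, axis=-1)
    envelope = np.abs(hilbert(filtered, axis=-1))
    return (envelope ** 2).mean(axis=-1)

# Synthetic single-channel epochs standing in for auditory-only (A),
# visual-only (V), and audiovisual (AV) conditions.
a_only = rng.standard_normal((n_trials, n_times))
v_only = rng.standard_normal((n_trials, n_times))
av = rng.standard_normal((n_trials, n_times))

pow_a = theta_power(a_only, fs)
pow_v = theta_power(v_only, fs)
pow_av = theta_power(av, fs)

# Additive model: compare AV power with the sum of the unimodal powers.
# A reliably negative difference indicates sub-additivity; a positive one, supra-additivity.
diff = pow_av - (pow_a + pow_v)
res = ttest_1samp(diff, 0.0)
print(f"AV - (A + V) theta power: mean={diff.mean():.3f}, "
      f"t={res.statistic:.2f}, p={res.pvalue:.3f}")

With real recordings, the AV - (A + V) difference would be computed per participant and channel and then tested at the group level; the Hilbert-envelope estimate used here is only one of several equivalent ways to obtain band-limited power.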
In this study we were interested in the neural system supporting the audiovisual integration of emot...
Induced affect is the emotional effect of an object on an individual. It can be quantified through tw...
The brain integrates or segregates audio-visual signals effortlessly in everyday life. In order to d...
We are capable of effortlessly parsing a complex scene presented to ...
Sensory information can both impair and enhance low-level visual feature processing, and this can be...
Our affective experiences are influenced by combined multisensory information. Although the enhanced...
In our daily environment, we are constantly encountering an endless stream of information which we m...
The processing of valence is known to recruit the amygdala, orbitofrontal cortex, and relevant senso...
Making sense of acoustic environments is a challenging task. At any moment, the signals from distinc...
The processing of valence is known to recruit the amygdala, orbitofrontal cortex and relevant sensor...
Our ability to perceive meaningful action events involving objects, people and other animate agents ...
In humans, emotions from music serve important communicative roles. Despite a growing interest in th...
Our perception is continuous and unified. Yet, sensory information reaches our brains through differ...