Although emotions are usually recognized by combining facial and vocal expressions, the multisensory nature of affect perception has scarcely been investigated. In the present study, we report the results of three experiments on multisensory perception of emotions using newly validated sets of dynamic visual and non-linguistic vocal clips of affect expressions. In Experiment 1, participants were asked to categorise fear and disgust expressions presented auditorily, visually, or as congruent or incongruent audio-visual stimuli. Results showed faster and more accurate categorisation in the bimodal congruent condition than in the unimodal conditions. In the incongruent condition, participants preferentially categorised the af...
Humans seamlessly extract and integrate the emotional content delivered by the face and the voice of...
Emotions play a crucial role in human-human communication with a complex socio-psychological nature,...
We used human electroencephalogram to study early audiovisual integration of dynamic angry and neutr...
Multimodal perception of emotions has been typically examined using displays of a solitary character...
In everyday life, multiple sensory channels jointly trigger emotional experiences and one channel ma...
Face-to-face communication works multimodally. Not only do we employ vocal and facial expressions; b...
Affective information constitutes a key feature of audiovisual speech and plays a fundamental role i...
Previous research has shown that redundant information in faces and voices leads to faster emotional...
Emotion perception naturally entails multisensory integration. It is also assumed that multisensory ...
Audiovisual perception of emotions has been typically examined using displays of a solitary characte...
Multisensory integration may occur independently of visual attention as previously shown with compou...
Emotions have a pivotal role in our lives, and we massively express and perceive them through faces ...
Multisensory integration is a powerful mechanism for increasing adaptive responses, as illustrated b...
Humans extract and integrate the emotional content delivered through faces and voices of others. It ...