Integrating emotional information from multiple sensory modalities is generally assumed to be a pre-attentive process (de Gelder et al., 1999). This assumption presupposes that the integrative process operates independently of attention. Using event-related potentials (ERPs), the present study investigated whether the neural processes underlying the integration of dynamic facial expressions and emotional prosody are indeed unaffected by attentional manipulations. To this end, participants were presented with congruent and incongruent face-voice combinations (e.g., an angry face combined with a neutral voice) and performed different two-choice tasks in four consecutive blocks. Three of the tasks directed the participants' attention to emotion expres...
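To make the congruency comparison described above concrete, the following is a minimal analysis sketch. The study reports no code, so everything here is hypothetical: simulated single-trial epochs, arbitrary array dimensions, and an illustrative 150-250 ms time window stand in for the study's actual data and parameters. It only shows one common way such an ERP congruency effect (incongruent minus congruent difference wave) could be quantified within a single attention block.

# Illustrative sketch only; not the authors' analysis pipeline.
import numpy as np

rng = np.random.default_rng(0)
n_trials, n_channels, n_times = 80, 64, 500   # hypothetical dimensions
sfreq = 500.0                                  # hypothetical sampling rate in Hz
times = np.arange(n_times) / sfreq - 0.2       # epoch from -200 ms to +800 ms

# Hypothetical single-trial epochs for one attention block (e.g., "attend facial emotion").
epochs_congruent = rng.normal(size=(n_trials, n_channels, n_times))
epochs_incongruent = rng.normal(size=(n_trials, n_channels, n_times))

# ERPs are trial-averaged waveforms; the congruency effect is their difference wave.
erp_congruent = epochs_congruent.mean(axis=0)
erp_incongruent = epochs_incongruent.mean(axis=0)
difference_wave = erp_incongruent - erp_congruent      # channels x time

# Mean congruency effect in an a priori time window (150-250 ms chosen here for illustration).
# Repeating this per task block is what an attention-by-congruency comparison would contrast.
window = (times >= 0.15) & (times <= 0.25)
effect_per_channel = difference_wave[:, window].mean(axis=1)
print(effect_per_channel.shape)  # (64,) mean amplitude difference per channel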
Previous research has shown that redundant information in faces and voices leads to faster emotional...
Both facial expression and tone of voice represent key signals of emotional communication but their ...
& Emotional attention, the boosting of the processing of emotionally relevant stimuli, has, up t...
Recent findings on multisensory integration suggest that selective attention influences cross-sensor...
The aim of the present study was to test whether multisensory interactions of emotional signals are ...
Evidence suggests that emotion is represented supramodally in the human brain. Emotional facial expr...
Results from recent event-related brain potential (ERP) studies investigating brain processes involv...
In the everyday environment, affective information is conveyed by both the face and the voice. Studi...
Objective: The study investigated the simultaneous processing of emotional tone of voice and emotion...
Behavioral studies have observed that facial recognition can bypass attentional limitations when performed wi...
Facial emotional processing can be bypassed when faces are task-irrelevant and attention is diverted...
The decoding of social signals from nonverbal cues plays a vital role in the social interact...