Recent magneto-encephalographic and electro-encephalographic studies provide evidence for cross-modal integration during audio-visual and audio-haptic speech perception, with speech gestures viewed or felt through manual tactile contact with the speaker's face. Given the temporal precedence of the haptic and visual signals over the acoustic signal in these studies, the observed modulation of N1/P2 auditory evoked responses during bimodal compared to unimodal speech perception suggests that relevant and predictive visual and haptic cues may facilitate auditory speech processing. To further investigate this hypothesis, auditory evoked potentials were here compared during auditory-only, audio-visual and audio-haptic speech perc...
Speech perception often benefits from vision of the speaker's lip movements when they are available....
Synchronous presentation of stimuli to the auditory and visual systems can modify the formation of a...
Recent neurophysiological studies show that cortical brain regions involved in...
Speech can be perceived not only by the ear and by the eye but also by the han...
Seeing the articulatory gestures of the speaker significantly enhances auditor...
Do cross-modal interactions during speech perception only depend on well-known auditory and visuo-fa...
While everyone has experienced that seeing lip movements may improve speech perception, little is kn...
Recent neurophysiological studies demonstrate that audio-visual speech integra...
Audio-visual speech perception is a special case of multisensory processing th...
Hemodynamic studies have shown that the auditory cortex can be activated by vi...
Previous electrophysiological studies have provided strong evidence for early ...
We investigated the existence of a cross-modal sensory gating reflected by the...
An interaction between orofacial somatosensation and the perception of speech was demonstrated in re...