Quantitative relationships were established between speech intelligibility and gaze patterns when subjects listened to sentences spoken by a single talker at different auditory signal-to-noise ratios (SNRs) while viewing one or more talkers. When the auditory SNR was reduced and subjects moved their eyes freely, the main gaze strategy involved looking closer to the mouth. The natural tendency to move gaze closer to the mouth was found to be consistent with a gaze strategy that helps subjects improve their speech intelligibility in environments that include multiple talkers. With a single talker and a fixed point of gaze, subjects' speech intelligibility was found to be optimal for fixations that were distributed within 10 degrees of the center of the mouth. Lower pe...
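As a point of reference for the 10 degree figure in the abstract above: fixation eccentricity in such studies is conventionally expressed as a visual angle computed from the on-screen offset and the viewing distance. The following is a minimal sketch of that standard arctangent conversion; the function name and the 57 cm viewing distance are illustrative assumptions, not values taken from the study.

    import math

    def eccentricity_deg(offset_cm: float, viewing_distance_cm: float) -> float:
        # Visual angle subtended at the eye by an on-screen offset,
        # via the standard arctangent formula.
        return math.degrees(math.atan2(offset_cm, viewing_distance_cm))

    # Illustrative check: at a 57 cm viewing distance (where 1 cm on
    # screen subtends roughly 1 degree), a fixation 10 cm from the
    # mouth center sits at about 10 degrees of eccentricity.
    print(eccentricity_deg(10.0, 57.0))  # ~9.95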
Speech has been used as the foundation for many human/machine interactive systems to convey the user...
The study examined whether people can extract speech-related information from the talker's upper fac...
We present a summary overview of recent work using eye movement data to improve speech technologies....
The goal of this study was to examine the role of gaze in speech perception and to investigate gaze ...
The behavior of a person during a conversation typically involves both auditory and visual attention...
Superdirectional acoustic beamforming technology provides a high signal-to-noise ratio, but potentia...
We evaluated effects of gaze direction and other nonverbal visual cues on multiparty mediated commun...
Gaze and language are major pillars in multimodal communication. Gaze is a non-verbal mechanism that...
Hearing-impaired people often point their eyes at a speaker in a conversation in order to get the be...
In order to explore verbal-nonverbal integration, we investigated the influence of cognitive and lin...
Speech is often a multimodal process, presented audiovisually through a talking face. One area of sp...
PURPOSE: Visual cues from a speaker's face may benefit perceptual adaptation to degraded speech, but...
Speech is inextricably multisensory: both auditory and visual components provide critical informatio...
Looking at the mouth region is thought to be a useful strategy for speech-perception tasks....