This paper examines the degree of correlation between lip and jaw configuration and speech acoustics. The lip and jaw positions are characterised by a system of measurements taken from video images of the speaker's face and profile, and the acoustics are represented using line spectral pair parameters and a measure of RMS energy. A correlation is found between the measured acoustic parameters and a linear estimate of the acoustics recovered from the visual data. This correlation exists despite the simplicity of the mapping and is in rough agreement with correlations measured in earlier work by Yehia et al. The linear estimates are also compared to estimates made using nonlinear models. In particular it is shown that although performa...
This paper presents a method for the extraction of articulatory parameters fro...
This paper presents technical refinements and extensions of our system for correlating audible and ...
This paper examines the degrees of correlation among vocal tract and orofacial movement data and the...
This paper examines the degree of correlation between lip and jaw configuration and speech acoustics...
This article presents a thorough experimental comparison of several acoustic modeling techniques by ...
Facial motion during speech is a direct consequence of vocal-tract motion which also shapes the acou...
The aim of this work is to examine the correlation between audio and visual speech features. The mot...
In this paper an evaluation of visual speech features is performed specifically for the ta...
The multimodal nature of speech is often ignored in human-computer interaction, but lip def...
The aim of this work is to investigate a selection of audio and visual speech features with the aim ...
This paper investigates the statistical relationship between acoustic and visual speech features for...
One of the most challenging tasks in automatic visual speech recognition is the extraction of featur...
We measure face deformations during speech production using a motion capture system, which provides ...
This paper gives an insight into biometrics used for speaker recognition. Three different biometrics...