In this study, we incorporate automatically obtained system/user performance features into machine learning experiments to detect student emotion in computer tutoring dialogs. Our results show a relative improvement of 2.7% in classification accuracy and 8.08% in Kappa over using standard lexical, prosodic, sequential, and identification features alone. This level of improvement is comparable to the gains reported in previous studies from adding dialog acts or lexical-, prosodic-, and discourse-level contextual features.

Index Terms: emotional speech, emotion detection, spoken dialog system
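To make the evaluation setup concrete, the sketch below illustrates one way such a comparison could be run: a classifier is trained on the standard feature set, then on the same set augmented with performance features, and the relative improvements in accuracy and Kappa are computed. The learner (logistic regression), the placeholder feature matrices, and the train/test split are illustrative assumptions, not the configuration actually used in the study.

    # Illustrative sketch (not the paper's actual pipeline): compare a classifier
    # trained on standard features against one trained on standard + performance
    # features, reporting relative improvements in accuracy and Cohen's kappa.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import accuracy_score, cohen_kappa_score
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)

    # Placeholder data: in practice these would be lexical/prosodic/sequential/
    # identification features plus system/user performance features per student turn.
    n_turns = 500
    X_standard = rng.normal(size=(n_turns, 20))    # standard feature set
    X_performance = rng.normal(size=(n_turns, 5))  # performance features (e.g., answer correctness)
    y = rng.integers(0, 3, size=n_turns)           # emotion labels (e.g., negative/neutral/positive)

    def evaluate(X, y):
        """Train a simple classifier and return (accuracy, kappa) on a held-out split."""
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
        clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
        pred = clf.predict(X_te)
        return accuracy_score(y_te, pred), cohen_kappa_score(y_te, pred)

    acc_base, kappa_base = evaluate(X_standard, y)
    acc_aug, kappa_aug = evaluate(np.hstack([X_standard, X_performance]), y)

    # Relative improvement, as reported in the abstract (e.g., 2.7% accuracy, 8.08% Kappa).
    rel_acc = 100 * (acc_aug - acc_base) / acc_base
    rel_kappa = 100 * (kappa_aug - kappa_base) / abs(kappa_base) if kappa_base else float("nan")
    print(f"Relative accuracy improvement: {rel_acc:.2f}%")
    print(f"Relative Kappa improvement:    {rel_kappa:.2f}%")

With real turn-level features, the same comparison would be run with whatever learner and cross-validation scheme the experiments call for; the point of the sketch is only the with/without-performance-features contrast and the relative-improvement calculation.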