Most research that explores the emotional state of users of spoken dialog systems does not fully exploit the context that dialog structure provides. This paper reports results of machine learning experiments designed to automatically classify the emotional state of user turns, using a corpus of 5,690 dialogs collected with the "How May I Help You℠" spoken dialog system. We show that augmenting standard lexical and prosodic features with contextual features that exploit the structure of spoken dialog and track user state increases classification accuracy by 2.6%.
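To make the idea of augmenting turn-level features with dialog context concrete, the minimal sketch below compares a classifier trained on lexical and prosodic features alone against one that also sees contextual features (turn position and the previous turn's label). This is an illustrative assumption, not the paper's implementation: the feature names, the synthetic data, and the RandomForest classifier are placeholders chosen for a self-contained example.

```python
"""Sketch: contextual dialog features added to lexical + prosodic features
for turn-level emotion classification. All data here is synthetic."""
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_turns = 500

# Lexical features: e.g. counts of negative cue words per turn (assumed).
lexical = rng.poisson(lam=1.0, size=(n_turns, 3))

# Prosodic features: e.g. mean pitch, pitch range, energy per turn (assumed).
prosodic = rng.normal(size=(n_turns, 3))

# Contextual features tracking dialog structure and user state (assumed):
# position of the turn in the dialog and the label of the previous turn.
turn_index = np.arange(n_turns) % 20
labels = (lexical.sum(axis=1) + prosodic[:, 1] + 0.1 * turn_index
          + rng.normal(scale=1.0, size=n_turns)) > 2.5  # toy "frustrated" flag
prev_label = np.roll(labels.astype(int), 1)
prev_label[0] = 0
contextual = np.column_stack([turn_index, prev_label])

baseline = np.hstack([lexical, prosodic])
augmented = np.hstack([baseline, contextual])

clf = RandomForestClassifier(n_estimators=100, random_state=0)
for name, X in [("lexical+prosodic", baseline),
                ("+contextual     ", augmented)]:
    acc = cross_val_score(clf, X, labels, cv=5).mean()
    print(f"{name}: cross-validated accuracy = {acc:.3f}")
```

On this toy data the contextual variant typically scores higher, mirroring the direction (though not the magnitude) of the reported 2.6% gain.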
In this paper we propose to combine speech-based and linguistic classification in order to obtain be...
Emotion recognition in conversations is essential for ensuring advanced human-machine interactions. ...
The present research focuses on analyzing and detecting emotions in speech as revealed by task-depen...
In this study, we incorporate automatically obtained system/user performance features into machine l...
Conversational agents are increasingly being used for training of social skills. One of their most i...
As a branch of sentiment analysis tasks, emotion recognition in conversation (ERC) aims to explore t...
Objective: The goal of this work is to develop and test an automated system methodology that can det...