From a user perspective, immersive content can elicit more intense emotions than flat-screen presentations. From a system perspective, efficient storage and distribution remain challenging and must take user attention into account. Understanding the connection between user attention, user emotions, and immersive content is therefore key. In this article, we present PEM360, a new dataset of user head movements and gaze recordings in 360° videos, along with self-reported emotional ratings of valence and arousal and continuous physiological measurements of electrodermal activity and heart rate. The stimuli are selected to enable spatiotemporal analysis of the connection between content, user motion, and emotion. We describe and provide a set of soft...
This paper presents a user-independent emotion recognition method with the goal of recovering affect...
Emotions influence our cognitive functioning heavily. Therefore, it is interesting to develop measur...
Watching 360 videos using Virtual Reality (VR) head-mounted displays (HMDs) provides interactive and...
Inferring emotions from Head Movement (HM) and Eye Movement (EM) data in 360° Virtual Reality (VR) c...
From a computational viewpoint, emotions continue to be intriguingly hard to understand. In research...
While immersive media have been shown to generate more intense emotions, saliency information has be...
The use of questionnaires at the end of a specific task only evaluates what is expressed by the cons...
We develop the CEAP-360VR dataset to address the lack of continuously annotated behavioral and physi...
The paper introduces a multimodal affective dataset named VREED (VR Eyes: Emotions Dataset) in which...
Viewers' preference for multimedia selection depends highly on their emotional experience. In this p...