The MediaEval 2015 Affective Impact of Movies Task challenged participants to automatically detect violent scenes in a set of videos and to predict the affective impact that video content will have on viewers. We propose the use of several multimodal descriptors, covering visual, motion, and auditory features, and fuse their predictions to detect violent or affective content. With regard to the official metric, our best-performing run achieved a MAP of 0.1419 in the violence detection task, and accuracies of 45.038% for arousal estimation and 36.123% for valence estimation.
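The abstract describes a late-fusion setup: separate classifiers are trained on visual, motion, and auditory descriptors, and their per-shot predictions are then combined. The exact fusion rule is not specified here, so the sketch below assumes a simple weighted average of class probabilities, which is one common choice for this kind of pipeline; the function and variable names are illustrative, not taken from the paper.

```python
# Minimal late-fusion sketch (illustrative only). The fusion scheme is an
# assumption: a weighted average of per-modality class probabilities.
import numpy as np

def late_fuse(prob_visual, prob_motion, prob_audio, weights=(1.0, 1.0, 1.0)):
    """Fuse per-shot class-probability matrices from three modality classifiers.

    Each input has shape (n_shots, n_classes); the output is the weighted
    average of the three, renormalised so each row sums to 1.
    """
    stacked = np.stack([prob_visual, prob_motion, prob_audio])   # (3, n, c)
    w = np.asarray(weights, dtype=float).reshape(-1, 1, 1)
    fused = (w * stacked).sum(axis=0) / w.sum()
    return fused / fused.sum(axis=1, keepdims=True)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n_shots, n_classes = 5, 2                    # e.g. violent vs. non-violent
    # Stand-ins for the outputs of three hypothetical per-modality classifiers.
    probs = [rng.dirichlet(np.ones(n_classes), size=n_shots) for _ in range(3)]
    fused = late_fuse(*probs)
    violent = fused[:, 1] >= 0.5                 # threshold the "violent" class
    print(fused.round(3))
    print("violent shots:", np.flatnonzero(violent))
```

The same fused probabilities could feed the affect subtask by training the per-modality classifiers on arousal or valence classes instead of the binary violence label.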