We present a novel method for monocular hand gesture recognition in ego-vision scenarios that handles both static and dynamic gestures and achieves high accuracy using only a few positive samples. Specifically, we use and extend the dense trajectories approach that has been successfully introduced for action recognition. Dense features are extracted around regions selected by a new hand segmentation technique that integrates superpixel classification with temporal and spatial coherence. We extensively test our gesture recognition and segmentation algorithms on public datasets and propose a new dataset shot with a wearable camera. In addition, we demonstrate that our solution can work in near real time on a wearable device.
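To make the described pipeline concrete, the sketch below illustrates, under simplified assumptions, how a superpixel-based hand mask can gate dense-point sampling before trajectory tracking. It is not the authors' implementation: hand_probability is a hypothetical per-superpixel skin/hand classifier, and Farneback optical flow stands in for whatever flow estimator the full dense-trajectories pipeline uses.

    # Illustrative sketch only: superpixel hand mask + masked dense-point tracking.
    import cv2
    import numpy as np
    from skimage.segmentation import slic

    def hand_mask(frame_bgr, hand_probability, n_segments=300, thresh=0.5):
        """Classify each SLIC superpixel and keep those likely to be hand."""
        labels = slic(frame_bgr[..., ::-1], n_segments=n_segments, compactness=10)
        mask = np.zeros(labels.shape, dtype=np.uint8)
        for sp in np.unique(labels):
            region = frame_bgr[labels == sp]          # pixels of this superpixel
            if hand_probability(region) >= thresh:    # hypothetical classifier
                mask[labels == sp] = 1
        return mask

    def track_dense_points(prev_gray, gray, mask, step=8):
        """Sample grid points inside the hand mask and advance them by dense flow."""
        flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)
        ys, xs = np.mgrid[0:gray.shape[0]:step, 0:gray.shape[1]:step]
        pts = np.stack([xs.ravel(), ys.ravel()], axis=1)
        pts = pts[mask[pts[:, 1], pts[:, 0]] > 0]     # keep only points on the hand
        displ = flow[pts[:, 1], pts[:, 0]]            # per-point (dx, dy) displacement
        return pts, pts + displ                       # one tracking step per point

In the full method, each point would be tracked over several frames to form a trajectory, and descriptors (e.g. HOG/HOF/MBH in the original dense-trajectories work) would be computed along it; the sketch only shows how the segmentation restricts where trajectories start.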