Wearable computing technologies are advancing rapidly and enabling users to easily record daily activities for applications such as life-logging or health monitoring. Recognizing hand and object interactions in these videos will help broaden application domains, but recognizing such interactions automatically remains a difficult task. Activity recognition from the first-person point of view is difficult because the video includes constant motion, cluttered backgrounds, and sudden changes of scenery. Recognizing hand-related activities is particularly challenging due to the many temporal and spatial variations induced by hand interactions. We present a novel approach to recognize hand-object interactions by extracting both local motion fea...
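The abstract above is cut off before it names the features, so the following is only a minimal, hypothetical sketch of one common way to compute "local motion" features in egocentric video: a magnitude-weighted histogram of dense optical-flow orientations inside a hand region. The function name, bounding-box input, and all parameter values are illustrative assumptions, not the authors' method.

```python
# Hypothetical sketch: histogram-of-optical-flow (HOF) style descriptor
# computed inside a hand region. Assumes grayscale frames and a known
# (x, y, w, h) box; bin count and Farneback parameters are arbitrary choices.
import cv2
import numpy as np

def local_motion_descriptor(prev_gray, gray, box, bins=8):
    """Magnitude-weighted histogram of dense optical-flow orientations in `box`."""
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    x, y, w, h = box
    fx = flow[y:y + h, x:x + w, 0]
    fy = flow[y:y + h, x:x + w, 1]
    mag = np.sqrt(fx ** 2 + fy ** 2)
    ang = np.arctan2(fy, fx)                      # orientation in [-pi, pi]
    hist, _ = np.histogram(ang, bins=bins, range=(-np.pi, np.pi), weights=mag)
    return hist / (hist.sum() + 1e-6)             # L1-normalised descriptor
```

Descriptors of this kind, pooled over a clip, could then be fed to any off-the-shelf classifier as a local motion feature; the actual feature design in the paper may differ.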
One challenging research problem of hand pose recognition is the accurate detection of finger abduct...
This paper presents a new method to describe spatio-temporal relations between objects and hands, t...
Most methods proposed in the literature for predicting movements involved in a reach-to-gr...
Hands appear very often in egocentric video, and their appearance and pose give important cues about...
In order to develop effective interventions for restoring upper extremity function after cervical sp...
In this project, we propose an action estimation pipeline based on the simultaneous recognition of t...
We study natural human activity under difficult settings of cluttered background, volatile illuminat...
In this paper we develop a first step towards the recognition of hand activity by detecting objects ...
Wearable cameras allow people to record their daily activities from a user-centered (First Person Vi...
A large number of works in egocentric vision have concentrated on action and object recognition. Det...
In the field of pervasive computing, wearable devices have been widely used for recognizing human ac...
In this work we study the use of 3D hand poses to recognize first-person dynamic hand actions intera...
Background: Monitoring hand function at home ...
We present a novel method for monocular hand gesture recognition in ego-vision scenarios that deals ...
We address the task of pixel-level hand detection in the context of ego-centric cameras. Extracting ...