This paper describes a system that uses a camera and a point light source to track a user's hand in three dimensions. Using depth cues obtained from projections of the hand and its shadow, the system computes the 3D position and orientation of two fingers (thumb and pointing finger). The system recognizes one dynamic and two static gestures. Recognition and pose estimation are user-independent and robust. The system operates at 60 Hz and can be used as an intuitive input interface to applications that require multi-dimensional control. Examples include 3D fly-throughs, object manipulation, and computer games.
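The shadow-based depth cue in the abstract above can be made concrete with a small geometric sketch: with a calibrated camera, a table plane at z = 0, and a known point-light position, the fingertip's shadow fixes a second ray (from the light through the shadow point) whose intersection with the camera's viewing ray yields the fingertip's 3D position. The Python/NumPy sketch below is an illustrative reconstruction under these assumptions; the function names, calibration values, and closest-point formulation are hypothetical and not taken from the paper.

```python
import numpy as np

# Illustrative sketch (not the paper's algorithm): triangulate a fingertip from
# one camera view plus the fingertip's shadow on a flat table at z = 0, given a
# calibrated camera center and a known point-light position. All names and
# numbers below are hypothetical.

def ray_plane_intersect(origin, direction, plane_z=0.0):
    """Intersect the ray origin + t*direction with the horizontal plane z = plane_z."""
    t = (plane_z - origin[2]) / direction[2]
    return origin + t * direction

def closest_point_between_rays(o1, d1, o2, d2):
    """Midpoint of the shortest segment joining two (nearly intersecting) rays."""
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    n = np.cross(d1, d2)                      # degenerate if the rays are parallel
    # Solve o1 + t1*d1 = o2 + t2*d2 + s*n for (t1, t2, s).
    A = np.stack([d1, -d2, -n], axis=1)
    t1, t2, _ = np.linalg.solve(A, o2 - o1)
    return 0.5 * ((o1 + t1 * d1) + (o2 + t2 * d2))

def fingertip_3d(cam_center, ray_to_finger, ray_to_shadow, light_pos):
    """Estimate a fingertip's 3D position from its image ray and its shadow's image ray."""
    # The shadow lies on the table, so its 3D location follows from one ray-plane intersection.
    shadow_pt = ray_plane_intersect(cam_center, ray_to_shadow, plane_z=0.0)
    # The fingertip lies on the camera ray AND on the light ray through the shadow point.
    light_ray = shadow_pt - light_pos
    return closest_point_between_rays(cam_center, ray_to_finger, light_pos, light_ray)

# Toy example: camera 1 m above the table, light offset 0.5 m to the side.
cam = np.array([0.0, 0.0, 1.0])
light = np.array([0.5, 0.0, 1.0])
p = fingertip_3d(cam,
                 ray_to_finger=np.array([0.1, 0.0, -1.0]),
                 ray_to_shadow=np.array([-0.4, 0.0, -1.0]),
                 light_pos=light)
print(p)   # approximately [0.05, 0.0, 0.5]
```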
We present an interactive system to manipulate a virtual object by tracking multiple hands in 3D spa...
Interest in gesture-based interaction has been growing considerably, but most ...
This paper describes our ongoing research work on deviceless interaction using hand gesture recognit...
Direct use of the hand as an input device is a smart method for providing natural human-computer int...
We present a novel technique implementing barehanded interaction with virtual 3D content by employin...
Gestures from hands and fingers have rich meanings in communication, even without a word of sound. I...
In this paper, we present a vision-based human–computer interaction system, which integrat...
Many researchers have been investigating interactive portable projection systems such as a mini-...
In this paper, novel 2D hand-tracking algorithms used in a system for hand gesture interaction are pr...
We propose a system for human computer interaction via 3D hand movements, based on a combination of ...
Using hand gestures as input in human–computer interaction is of ever-increasing interest. Markerles...
This paper suggests a low-cost way to enable 3D hand interaction based on the frame capturing functi...
This paper introduces an algorithm to track palm and fingertips based on images with depth ...
Vision-based hand gesture interactions are natural and intuitive when interacting with computers, si...
In this paper, we use the output of a 3D sensor (e.g., the Microsoft Kinect) to capture depth images ...