The HANDS dataset was created for human-robot interaction research and is composed of spatially and temporally aligned RGB and depth frames. It contains 12 static single-hand gestures, performed with both the right hand and the left hand, and 3 static two-hand gestures, for a total of 29 unique classes. Five subjects (2 female, 3 male) performed the gestures, each with different background and lighting conditions. For each subject, 150 RGB frames and their 150 corresponding depth frames were collected per gesture, for a total of 2400 RGB frames and 2400 depth frames per subject. Data were collected with a Kinect v2 camera, intrinsically calibrated to spatially align the RGB data to the depth data. The temporal alignme...
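The spatial alignment mentioned above follows the usual RGB-D registration idea: back-project each depth pixel to a 3D point using the depth camera's intrinsics, transform it into the RGB camera frame, and re-project it with the RGB intrinsics. The sketch below illustrates this for a single pixel; all intrinsic and extrinsic values are made-up placeholders, not the actual Kinect v2 calibration shipped with the dataset.

```python
import numpy as np

# Hypothetical depth-camera intrinsics (placeholder values)
DEPTH_K = np.array([[365.0,   0.0, 256.0],
                    [  0.0, 365.0, 212.0],
                    [  0.0,   0.0,   1.0]])
# Hypothetical RGB-camera intrinsics (placeholder values)
RGB_K = np.array([[1060.0,    0.0, 960.0],
                  [   0.0, 1060.0, 540.0],
                  [   0.0,    0.0,   1.0]])
# Hypothetical depth-to-RGB extrinsics: rotation R, translation t (metres)
R = np.eye(3)
t = np.array([0.052, 0.0, 0.0])  # assumed ~5 cm baseline between sensors

def depth_pixel_to_rgb(u, v, depth_m):
    """Map one depth pixel (u, v) with depth in metres to RGB pixel coordinates."""
    # Back-project the depth pixel to a 3D point in the depth camera frame
    p_depth = np.linalg.inv(DEPTH_K) @ np.array([u, v, 1.0]) * depth_m
    # Transform the point into the RGB camera frame
    p_rgb = R @ p_depth + t
    # Project onto the RGB image plane and dehomogenize
    uvw = RGB_K @ p_rgb
    return uvw[0] / uvw[2], uvw[1] / uvw[2]

# Usage: where does the depth principal point at 1 m land in the RGB image?
u_rgb, v_rgb = depth_pixel_to_rgb(256, 212, 1.0)
print(u_rgb, v_rgb)
```

Applying this mapping per pixel (or vectorized over the whole depth map) yields depth frames registered to the RGB frames, which is what "spatially aligned" refers to in the dataset description.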
Two files with a dataset of five different/independent hand gestures are provided. The data were gen...
Hand Gesture Recognition (HGR) is a form of perceptual computing that allows artificial systems to c...
To recognize different hand gestures and achieve efficient classification to understand static and d...
We introduce the UC2017 static and dynamic gesture dataset. Most researchers use vision-based system...
The dataset contains RGB and depth version video frames of various hand movements captured with the ...
Hand gestures are one of the natural forms of communication in human-robot interaction scen...
Two files with a dataset of ten different/independent hand gestures are provided (seven static gestu...
This dataset provides valuable insights into hand gestures and their associated measurements. Hand g...
Intuitive user interfaces are indispensable to interact with the human centric smart environments. I...
Computer vision systems are commonly used to design touch-less human-computer interfaces (HCI) based...
In the light of factories of the future, we present a reliable framework for r...
In daily life humans perform a great number of actions continuously. We recognize and interpret thes...
This hand recognition dataset comprises a comprehensive collection of hand images from 65 individual...
It usually takes a fusion of image processing and machine learning algorithms in order to build a f...
This paper introduces a fast and feasible method for the collection of hand gesture sample...