This paper presents Digito, a gesturally controlled virtual musical instrument. Digito is controlled through a number of intricate hand gestures that provide both discrete and continuous control of its sound engine; these fine-grained hand gestures are captured by a 3D depth sensor and recognized using computer vision and machine learning algorithms. We describe the design and initial iterative development of Digito, the hand and finger tracking and gesture recognition algorithms that drive the system, and report the insights gained during the initial development cycles and user testing of this gesturally controlled virtual musical instrument.
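The abstract outlines a pipeline in which a depth sensor captures the hand, a recognition stage yields both discrete gestures and continuous parameters, and those drive a sound engine. The following Python sketch is a minimal illustration of that architecture, not Digito's actual implementation: the fingertip extractor, the pinch threshold, and the cutoff mapping are all assumed for demonstration purposes.

import math
import random

# Hedged sketch (assumptions only): depth frame -> fingertip positions ->
# discrete gesture (pinch) plus continuous parameter (hand openness) ->
# control messages for a sound engine.

def extract_fingertips(frame_id):
    """Stand-in for the vision stage: return five (x, y, z) fingertip
    positions in metres. A real tracker would segment the hand in the
    depth image; here we fabricate deterministic values per frame id."""
    random.seed(frame_id)
    return [(random.uniform(-0.1, 0.1),
             random.uniform(-0.1, 0.1),
             random.uniform(0.4, 0.6)) for _ in range(5)]

def classify_gesture(fingertips, pinch_threshold=0.03):
    """Discrete control: report a 'pinch' when thumb and index tips are
    closer than pinch_threshold (an assumed value in metres)."""
    thumb, index = fingertips[0], fingertips[1]
    return "pinch" if math.dist(thumb, index) < pinch_threshold else "open"

def hand_openness(fingertips):
    """Continuous control: mean fingertip spread from the centroid,
    clamped to [0, 1] using an assumed maximum spread of 0.12 m."""
    centroid = tuple(sum(c) / 5 for c in zip(*fingertips))
    spread = sum(math.dist(p, centroid) for p in fingertips) / 5
    return max(0.0, min(1.0, spread / 0.12))

def frame_to_synth_message(frame_id):
    """Map one depth frame to a (note_on, cutoff_hz) pair for a synth."""
    tips = extract_fingertips(frame_id)
    note_on = classify_gesture(tips) == "pinch"
    cutoff_hz = 200.0 + hand_openness(tips) * 4800.0  # 200 Hz to 5 kHz
    return note_on, cutoff_hz

if __name__ == "__main__":
    for frame in range(3):
        print(frame, frame_to_synth_message(frame))

The design choice illustrated here is the split between discrete events (note triggers from a classified gesture) and continuous streams (a normalized hand measurement mapped onto a synthesis parameter), which mirrors the discrete/continuous control distinction named in the abstract.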
This paper describes the design and evaluation of Netz, a novel mixed reality musical instrument tha...
The increasing availability of software for creating real-time simulations of ...
MyoSpat is an interactive audio-visual system that aims to augment musical performances by empowerin...
Gestural interfaces, which make use of physiological signals, hand / body postures or movements, hav...
In this research, we propose a state-of-the-art 3D finger gesture tracking and recogniti...
The availability of electronic audio synthesizers has led to the development of many novel control i...
The introduction of new gesture interfaces has been expanding the possibilities of creating new Digi...
This article describes a system for interactive performance that generates live musical accompanimen...
Figure 1: (left) Computer model of virtual percussion instrument with controlling interface. (right)...
With current state-of-the-art human movement tracking technology it is possible to represent in real-...
Kinect, a 3D sensing device from Microsoft, invokes the human-computer interaction resea...
This article presents a new gestural computer interface (3DGC = Three-Dimensional Gestural Controlle...
In this paper, we introduce and analyze four gesture-controlled musical instruments. We briefly disc...
Figure 1: HandSonor provides users with a customizable vision-based control interface for musical ex...