In this paper we describe a system that allows users to control, in real time, the generation of expressive audio-visual feedback with their full body. The system extracts expressive motion features from the user's full-body movements and gestures. The values of these motion features are mapped both onto acoustic parameters for the real-time expressive rendering of a piece of music, and onto visual feedback generated in real time and projected on a screen in front of the user.
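To make the mapping stage concrete, the following is a minimal sketch, not the authors' implementation: it assumes two illustrative motion features (quantity of motion and a contraction index, both normalised to [0, 1]) and maps them with simple linear rules onto hypothetical acoustic and visual rendering parameters. All names and coefficients below are assumptions for illustration only.

# Minimal sketch (illustrative, not the paper's implementation): mapping
# assumed expressive motion features onto acoustic and visual parameters.

from dataclasses import dataclass


@dataclass
class MotionFeatures:
    quantity_of_motion: float   # overall movement energy, assumed in [0, 1]
    contraction_index: float    # body contraction/expansion, assumed in [0, 1]


@dataclass
class RenderingParams:
    tempo_scale: float   # multiplier applied to the nominal tempo of the piece
    loudness: float      # normalised dynamics level in [0, 1]
    brightness: float    # brightness of the projected visual feedback in [0, 1]


def clamp(x: float, lo: float = 0.0, hi: float = 1.0) -> float:
    return max(lo, min(hi, x))


def map_features(f: MotionFeatures) -> RenderingParams:
    """Map motion features to audio/visual parameters with simple linear rules."""
    energy = clamp(f.quantity_of_motion)
    openness = clamp(1.0 - f.contraction_index)
    return RenderingParams(
        tempo_scale=0.8 + 0.4 * energy,    # more movement -> faster rendering
        loudness=0.2 + 0.8 * energy,       # more movement -> louder dynamics
        brightness=0.3 + 0.7 * openness,   # expanded posture -> brighter visuals
    )


if __name__ == "__main__":
    # Example: an energetic, expanded gesture drives a fast, loud, bright rendering.
    print(map_features(MotionFeatures(quantity_of_motion=0.9, contraction_index=0.2)))

In a real system of this kind the mapping would be applied once per analysis frame of the motion-capture or video stream, so that the rendered music and projected visuals follow the performer's movement continuously.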