This paper describes a new interface for mobile music creation, the MobileMuse, which introduces the capability of using physiological indicators of emotion as a new mode of interaction. Combining both kinematic and physiological measurement in a mobile environment creates the possibility of integral music control—the use of both gesture and emotion to control sound creation—where it has never been possible before. This paper reviews the concept of integral music control and describes the motivation for creating the MobileMuse, its design, and future possibilities.
This research aims to develop a wearable musical interface which enables control of audio and video s...
How to play an iPhone? You can talk, sing, or blow into the microphone; shake, stroke, or spin the d...
Traditional musical instruments assume a high degree of mental and physical interaction, which resul...
License, which permits unrestricted use, distribution, and reproduction in any medium, provided the ...
We present research that extends the scope of the mobile application Control, a prototyping environm...
This paper presents MoodifierLive, a mobile phone application for interactive control of rule-based ...
This project proposes the use of the sensory features of modern mobile phones that were previously u...
In this paper, we describe the networking of multiple Integral Music Controllers (IMCs) to enable an...
Mobile phones offer an attractive platform for interactive music performance. We provide a theoretic...
This paper describes the use of physiological and kinematic sensors for the direct measurement of ph...
Background in musicology. Understanding the gesture-based foundations of musical involvement open...
In this paper we describe the SAME networked platform for context-aware, experience-centric mobile m...
We discuss how the environment urMus was designed to allow creation of mobile musical instruments on...