Abstract. Synergistic multimodal human-machine interfaces are characterised by their ability to interpret user input from more than one input modality. Such interfaces may contribute to better driver information systems in terms of efficiency and comfort of use. In this article we present an approach for the integration of voice and touchscreen input as well as capacitive proximity sensing for two scenarios: interaction with a map of points of interest and with a media player. We present details of the system realisation and of the implementation of the scenarios. Finally, we report results from a recent user study.
Abstract. The development of interfaces has been a technology-driven process. However, the newly dev...
The need for novel interaction paradigms in automotive human-machine interface (HMI) applications ha...
This paper presents a multimodal interface featuring fusion of multiple modalities for natural human...
The use of multiple modes of user input to interact with computers and devices is an active area of ...
With such a rapid advancement in powerful mobile devices and sensors in recent years, inclusion of m...
Multimodal interaction is one of the taxonomies for Human-Computer Interaction (HCI). With the intro...
Our sensory modalities are specialized in perceiving different attributes of an object or event. Thi...
We describe our system which facilitates collaboration using multiple modalities, including speech, ...
Advancements in input device and sensor technologies led to the evolution of the traditional human-m...
In this paper we present a strategy for handling of multimodal signals from pen-based mobile devices...
This article describes requirements and a prototype system for a flexible multimodal human-machine i...
Abstract. In this paper, we present a novel multimodal system de-signed for smooth multi-party human...