The utility of vision-based face tracking for dual pointing tasks is evaluated. We first describe a 3-D face tracking technique based on real-time parametric motion-stereo, which is non-invasive, robust, and self-initialized. The tracker provides a real-time estimate of a "frontal face ray" whose intersection with the display surface plane is used as a second stream of input for scrolling or pointing, in parallel with hand input. We evaluated the performance of combined head/hand input on a box selection and coloring task: users selected boxes with one pointer and colors with a second pointer, or performed both tasks with a single pointer. We found that performance with head and one hand was intermediate between single hand performance an...
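As an illustration only (not from the paper): the pointing geometry described above reduces to intersecting the estimated frontal face ray with the display plane. The short Python sketch below shows that ray-plane intersection; all variable names, units, and the choice of a z = 0 display plane are assumptions made for the example, not details of the authors' tracker.

# Illustrative sketch (assumed setup): intersect a "frontal face ray" with a
# planar display to obtain an on-screen pointing coordinate.
import numpy as np

def ray_plane_intersection(ray_origin, ray_dir, plane_point, plane_normal):
    """Return the point where the ray meets the plane, or None if they are parallel."""
    denom = np.dot(plane_normal, ray_dir)
    if abs(denom) < 1e-9:          # ray nearly parallel to the display plane
        return None
    t = np.dot(plane_normal, plane_point - ray_origin) / denom
    if t < 0:                      # plane lies behind the head
        return None
    return ray_origin + t * ray_dir

# Example: head roughly 60 cm in front of a screen lying in the z = 0 plane.
head_pos = np.array([0.05, 0.10, 0.60])    # metres, tracker output (assumed)
face_dir = np.array([-0.05, -0.08, -1.0])  # frontal face ray direction (assumed)
face_dir /= np.linalg.norm(face_dir)

hit = ray_plane_intersection(head_pos, face_dir,
                             plane_point=np.zeros(3),
                             plane_normal=np.array([0.0, 0.0, 1.0]))
print(hit)   # x, y give the pointer position on the display; z is ~0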
This study investigated whether the execution of an accurate pointing response...
In this paper, we present a vision-based human–computer interaction system, which integrat...
In this paper, we present an approach to improve pointing methods and target selection on tactile hu...
This study investigated eye pointing in stereoscopic displays. Ten participants performed 18 tapping...
Eye gaze involves the coordination of eye and head movement to acquire gaze targets, but existing ap...
Human-machine interfaces can be enhanced by incorporating knowledge of the user's current point of r...
For people with poor upper limb mobility or control, interaction with a computer may be facilitated ...
This paper presents a head-mounted virtual reality study that compared gaze, head, and controller po...
Inputs with multimodal information provide more natural ways to interact with virtual 3...
The aim of this paper was to evaluate the use of three facial actions (i.e. frowning, raising the ey...
Gaze and freehand gestures suit Augmented Reality as users can interact with objects at a distance w...
This paper examines and seeks to enhance gaze based pointing and interaction in virtual 3D environme...
The benefits of two-point interaction for tasks that require users to simultaneously manipulate mult...
Augmentative and alternative communication tools allow people with severe motor disabilities to inte...
In this work, we investigate gaze selection in the context of mid-air hand gestural manipulation of ...