In today's digitized world, interacting with multimedia information occupies a large portion of our everyday activities; it has become an essential part of how we gather knowledge and communicate with others. This interaction involves several operations, including selecting, navigating through, and modifying multimedia such as text, images, animations, and videos. These operations are typically performed with devices such as a mouse or keyboard, which people with motor disabilities are often unable to use. This limits their ability to interact with multimedia content and thus excludes them from the digital information spaces that help us stay connected with family, friends, and colleagues. In this paper, we primarily focus on the gaze-based control paradigm that we...