A fundamental skill required in both humans and robots is the ability to direct gaze toward a target. Multiple body segments can be used to fixate a target, and here we describe our architecture for gaze control using eye and head movements and its implementation on an iCub robot. Human gaze control is learnt in infancy and progresses through stages of development that are shaped by constraints on the infant. We describe how constraints are used to develop gaze control in our system, and how they impact the learning process.
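To illustrate the idea of constraint-shaped staged learning described above, the following is a minimal sketch (not the authors' implementation): a simplified 1-D gaze plant in which eye and head gains are adapted from retinal error, with the head initially locked (an infant-like constraint) and released only once eye-only fixation has been learned. All names, the plant model, and the delta-rule update are illustrative assumptions.

```python
import numpy as np

# Minimal sketch, assuming a simplified 1-D gaze plant (degrees).
# Stage 1 locks the head, so the eyes alone learn to foveate the target;
# stage 2 lifts that constraint and the head contribution is learned on top.

rng = np.random.default_rng(0)

def retinal_error(target, eye, head):
    """Offset of the target on the retina given current eye and head angles."""
    return target - (eye + head)

def run_stage(eye_gain, head_gain, head_locked, trials=2000, lr=0.05):
    errors = []
    for _ in range(trials):
        target = rng.uniform(-40.0, 40.0)   # target direction in degrees
        eye, head = 0.0, 0.0
        err = retinal_error(target, eye, head)

        # One open-loop gaze shift driven by the current learned gains.
        eye += eye_gain * err
        if not head_locked:
            head += head_gain * err

        post_err = retinal_error(target, eye, head)
        errors.append(abs(post_err))

        # Error-driven gain adaptation (delta-rule style update):
        # gains grow or shrink until the residual retinal error vanishes.
        update = lr * post_err * err / (err ** 2 + 1e-6)
        eye_gain += update
        if not head_locked:
            head_gain += update
    return eye_gain, head_gain, float(np.mean(errors[-200:]))

# Stage 1: head constrained, eyes learn to foveate alone.
eye_gain, head_gain, err1 = run_stage(0.2, 0.0, head_locked=True)
# Stage 2: constraint lifted, eye and head gains re-balance together.
eye_gain, head_gain, err2 = run_stage(eye_gain, 0.1, head_locked=False)

print(f"mean |retinal error|: stage 1 = {err1:.2f} deg, stage 2 = {err2:.2f} deg")
```

In this toy setting the stage-1 constraint gives the eye controller a stable context in which to converge before head movement is introduced, which is the general role the constraints play in the developmental schedule; the real architecture on the iCub is of course far richer than this gain-adaptation sketch.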