In this work, the problem of bootstrapping knowledge in language and vision for autonomous robots is addressed through novel techniques in grammar induction and word grounding to the perceptual world. In particular, we demonstrate a system, called OLAV, which is able, for the first time, to (1) learn to form discrete concepts from sensory data; (2) ground language (n-grams) to these concepts; and (3) induce a grammar for the language used to describe the perceptual world; moreover, it does all of this incrementally, without storing all previous data. The learning is achieved in a loosely supervised manner from raw linguistic and visual data. Furthermore, the learnt model is transparent rather than a black box, and is thus open to human i...
In order to be able to understand a conversation in interaction, a robot has ...
Spoken conversation between two present interlocutors is the foundation of natural language interact...
We present a cognitively plausible novel framework capable of learning the grounding in visual seman...
With the recent proliferation of robotic applications in domestic and industrial scenarios, it is vi...
Visually grounded human-robot interaction is recognized to be an essential ingredient of socially i...
For robots to interact with humans at the language level, it becomes fundamental that robots and hum...
The current state of the art in military and first responder ground robots involves heavy physical a...
We present a cognitively plausible system capable of acquiring knowledge in language and vision from...
The physical world and the language that we use to describe it are full of structure. Very young chi...
In order to behave autonomously, it is desirable for robots to have the ability to use human supervi...
In order for robots to effectively understand natural language commands, they must be able to acquir...
Language is among the most fascinating and complex cognitive activities that develops rapidly since ...
The objective of this research is to develop a system for language learning based on a minimum of pr...