The agility of a robotic system is ultimately limited by the speed of its processing pipeline. The use of a Dynamic Vision Sensor (DVS), a sensor producing asynchronous events as luminance changes are perceived by its pixels, makes it possible to have a sensing pipeline with a theoretical latency of a few microseconds. However, several challenges must be overcome: a DVS does not provide absolute grayscale values but only changes in luminance, and because its output is a sequence of events, traditional frame-based visual odometry methods are not applicable. This paper presents the first visual odometry system based on a DVS plus a normal CMOS camera that provides the absolute brightness values. The two sources of data are a...
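For context, the event stream mentioned above is typically modeled as a sequence of tuples (x, y, t, polarity): the pixel location, a microsecond-resolution timestamp, and the sign of the detected log-brightness change. The sketch below is a minimal illustration under that assumption (the names Event and accumulate_events are illustrative, not from the paper); it also shows why summing events only yields a brightness-change map, so absolute intensity must come from a conventional frame.

```python
from dataclasses import dataclass
from typing import Iterable
import numpy as np

@dataclass
class Event:
    """A single DVS event: pixel location, timestamp, and polarity."""
    x: int          # pixel column
    y: int          # pixel row
    t: float        # timestamp in seconds (microsecond resolution on real hardware)
    polarity: int   # +1 for a brightness increase, -1 for a decrease

def accumulate_events(events: Iterable[Event], height: int, width: int,
                      t_start: float, t_end: float) -> np.ndarray:
    """Sum event polarities per pixel over a time window.

    The result approximates the log-brightness *change* during the window;
    it is not an absolute grayscale image, which is why a complementary
    CMOS frame is needed for absolute brightness values.
    """
    img = np.zeros((height, width), dtype=np.float32)
    for e in events:
        if t_start <= e.t < t_end:
            img[e.y, e.x] += e.polarity
    return img
```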
In order to safely navigate and orient in their local surroundings, autonomous systems need to rapidl...
Event cameras, such as the Dynamic Vision Sensor (DVS), are bio-inspired vision sensors that output ...
Event-based cameras are a new type of vision sensor whose pixels work independently and respond asynchro...
New vision sensors, such as the Dynamic and Active-pixel Vision sensor (DAVIS), incorporate a conven...
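Because a DAVIS-style sensor delivers conventional frames and asynchronous events from the same pixel array, a common first processing step is to associate each frame with the events that arrive before the next frame. The helper below is a hedged sketch of that grouping (the function name and signature are illustrative assumptions, not the API of any particular driver); frame timestamps are assumed to be sorted.

```python
from bisect import bisect_right
from typing import Dict, List, Sequence

def group_events_by_frame(event_times: Sequence[float],
                          frame_times: Sequence[float]) -> Dict[int, List[int]]:
    """Map each frame index to the indices of events that occur between that
    frame and the next one. frame_times must be sorted; events arriving
    before the first frame are dropped."""
    groups: Dict[int, List[int]] = {i: [] for i in range(len(frame_times))}
    for k, t in enumerate(event_times):
        i = bisect_right(frame_times, t) - 1  # last frame taken at or before t
        if i >= 0:
            groups[i].append(k)
    return groups
```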
At the current state of the art, the agility of an autonomous flying robot is limited by its sensing...
Rather than generating images constantly and synchronously, neuromorphic vision sensors, also known ...
Event-based vision sensors, such as the Dynamic Vision Sensor (DVS), do not output a sequence of vid...
We propose an algorithm to estimate the “lifetime” of events from retinal cameras, such as a Dynamic...
In the last few years, we have witnessed impressive demonstrations of aggressive flights a...
In this paper, we introduce IDOL, an optimization-based framework for IMU-DVS Odometry using Lines. ...
We present a system that estimates the motion of a robot relying solely on images from onboard omnid...