This dataset is intended for predicting velocity from the event pixels present in the data, given in (t, x, y, p) format alongside a ground-truth velocity reading. A novel dataset was created using people and various objects moving in front of an RGB video camera. The position of each entity over time was captured with a Vicon motion-tracking system, and the two data streams were calibrated so that the movement in the video matched the measurements recorded by the Vicon system. The recordings include two different people carrying a calibrated Vicon Active Wand while moving around the room, a Lambda aerial robot with motion-tracking markers flying around the room, a box with motion tracking marker...
The combination of spiking neural networks and event-based vision sensors holds the potential of hig...
We present the first event-based learning approach for motion segmentation in indoor scenes and the ...
For self-driving vehicles, aerial drones, and autonomous robots to be successfully deployed in the r...
In recent years, event-based sensors have been combined with spiking neural ne...
For spiking networks to perform computational tasks, benchmark data sets are required for model desi...
Spiking Neural Networks (SNNs) are bio-inspired networks that process information conveyed as tempor...
Neuromorphic computing aims to mimic the computational principles of the brain in silico and has mot...
Event-based vision offers high dynamic range, time resolution and lower latency than conventional fr...
Current advances in technology have highlighted the importance of video analysis in the domain of co...
Event cameras and spiking neural networks (SNNs) allow for a highly bio-inspired, low-latency and po...
Apparent motion of the surroundings on an agent's retina can be used to navigate through cluttered e...
This research project develops a new deep neural network model for real-time human movement predicti...
See https://github.com/event-driven-robotics/batch-selection-experiments for examples on how to read...
This thesis presents the study, analysis, and implementation of a framework to perform trajectory pr...