Low-latency event-based visual odometry. In IEEE International Conference on Robotics and Automation (ICRA), May 2014. [pdf] [supp. material] [bibtex]
Abstract: The agility of a robotic system is ultimately limited by the speed of its processing pipeline. The use of a Dynamic Vision Sensor (DVS), a sensor producing asynchronous events as luminance changes are perceived by its pixels, makes it possible to have a sensing pipeline with a theoretical latency of a few microseconds. However, several challenges must be overcome: a DVS does not provide absolute grayscale values, only changes in luminance; and because its output is a sequence of events, traditional frame-based visual odometry methods are not applicable. This paper presents the first visual odometry system based on a DVS plus a normal CMOS camera, which provides the absolute brightness values. The two sources of data are automatically spatiotemporally calibrated from logs taken during normal operation. We design a visual odometry method that uses the DVS events to estimate the relative displacement since the previous CMOS frame by processing each event individually. Experiments show that rotation can be estimated with surprising accuracy, while translation can be estimated only very noisily, because it produces few events due to very small apparent motion.
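To give a feel for the event-by-event processing the abstract describes, here is a minimal Python sketch: each DVS event nudges a 2-D displacement estimate using the spatial gradients of the last CMOS frame. This is an illustrative toy, not the paper's actual estimator; the function name, the gain parameter, and the event tuple layout are all assumptions made for the example.

```python
import numpy as np

def process_events(frame, events, gain=0.05):
    """Event-by-event update of a 2-D displacement estimate.

    frame:  grayscale CMOS frame (H x W float array)
    events: iterable of (x, y, t, polarity) tuples from the DVS
    Returns the accumulated (dx, dy) displacement since the frame.
    """
    gy, gx = np.gradient(frame)  # spatial gradients of the CMOS frame
    d = np.zeros(2)              # running displacement estimate (dx, dy)
    for x, y, t, pol in events:
        # An event means the brightness at (x, y) changed by a fixed
        # contrast step; under small motion this is explained by the
        # frame's gradient sweeping past the pixel, so each event nudges
        # the estimate along or against the local gradient per its polarity.
        d -= gain * pol * np.array([gx[y, x], gy[y, x]])
    return d

# Toy usage with a synthetic frame and two fabricated events.
frame = np.outer(np.linspace(0.0, 1.0, 64), np.ones(64))  # vertical ramp
events = [(10, 20, 0.001, +1), (11, 20, 0.002, -1)]
print(process_events(frame, events))
```

The key point the sketch preserves is that no frame is ever reconstructed from the events: the estimate is refined incrementally as each event arrives, which is what allows the microsecond-scale sensing latency the abstract claims.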
Additional materials
- slides:
- Keynote (older Keynote 5 format)
- PDF format
- Powerpoint format (approximate conversion)
- datasets/source code: Not ready yet; we are waiting for the first request before writing proper documentation. In the meantime, you might also be interested in the source code and datasets for our previous DVS paper.