Michael Gschwandtner

Wow, that is one really impressive real-time SLAM.
I wish you all a happy new year by announcing our latest work. We present EVO: a geometric approach to event-based 6-DOF parallel tracking and mapping in real time, which has recently been accepted at IEEE RA-L. EVO leverages the outstanding properties of event cameras to track fast camera motions while recovering a semi-dense 3D map of the environment. The implementation runs in real time on a standard CPU and outputs up to several hundred pose estimates per second.

Due to the nature of event cameras, our algorithm is unaffected by motion blur and operates very well in challenging, high-dynamic-range conditions with strong illumination changes (see in particular minute 0:57 of the video, when we point the camera towards the sun, and minute 1:43, when we switch the lights off and on). To achieve this, we combine a novel event-based tracking approach based on image-to-model alignment with our recent event-based multi-view stereo algorithm (EMVS, BMVC'16) in a parallel fashion. Additionally, we show that the output of our pipeline can be used to reconstruct intensity images from the binary event stream, although our algorithm does not require such intensity information.

We believe that this work makes significant progress in SLAM by unlocking the potential of event cameras, allowing us to tackle challenging scenarios that are currently inaccessible to standard cameras.

Reference: Henri Rebecq, Timo Horstschaefer, Guillermo Gallego, Davide Scaramuzza, "EVO: A Geometric Approach to Event-based 6-DOF Parallel Tracking and Mapping in Real-time," IEEE Robotics and Automation Letters (RA-L), 2016.
http://rpg.ifi.uzh.ch/docs/RAL16_EVO.pdf

Our research page on event-based vision:
http://rpg.ifi.uzh.ch/research_dvs.html

Robotics and Perception Group, University of Zurich, 2016
http://rpg.ifi.uzh.ch/

https://youtu.be/bYqD2qZJlxE
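
Below is a minimal, self-contained Python sketch of the parallel tracking-and-mapping structure described in the post: one thread emits pose estimates at the event rate by aligning incoming event batches against the current map, while a second thread grows a semi-dense map from the tracked poses. All names, data structures, and the trivial pose/map updates are illustrative assumptions for the sake of the sketch, not the authors' EVO implementation.

# Toy sketch of the parallel tracking-and-mapping structure described above.
# The tracking thread stands in for image-to-model alignment; the mapping
# thread stands in for event-based multi-view stereo (EMVS). Everything here
# is an illustrative assumption, not the authors' EVO code.

import queue
import threading
import time
from dataclasses import dataclass


@dataclass
class EventBatch:
    """A small packet of events: (x, y, timestamp, polarity) tuples."""
    events: list
    t_ref: float


def track(event_q, pose_q, map_lock, semi_dense_map):
    """Tracking thread: emits one 6-DOF pose estimate per event batch,
    so the output rate follows the event rate, not a fixed frame rate."""
    pose = [0.0] * 6  # placeholder pose (tx, ty, tz, rx, ry, rz)
    while True:
        batch = event_q.get()
        if batch is None:          # poison pill -> shut down
            pose_q.put(None)
            return
        with map_lock:
            n_landmarks = len(semi_dense_map)
        # Stand-in for aligning the event image against the projected map.
        pose[0] += 0.001 * len(batch.events) / max(n_landmarks, 1)
        pose_q.put((batch.t_ref, tuple(pose)))


def build_map(pose_q, map_lock, semi_dense_map):
    """Mapping thread: consumes tracked poses and grows the semi-dense map."""
    while True:
        item = pose_q.get()
        if item is None:
            return
        t_ref, pose = item
        with map_lock:
            semi_dense_map.append((t_ref, pose))  # placeholder "3D point"


if __name__ == "__main__":
    event_q, pose_q = queue.Queue(), queue.Queue()
    map_lock = threading.Lock()
    semi_dense_map = [(0.0, (0.0,) * 6)]  # bootstrap map

    threads = [
        threading.Thread(target=track,
                         args=(event_q, pose_q, map_lock, semi_dense_map)),
        threading.Thread(target=build_map,
                         args=(pose_q, map_lock, semi_dense_map)),
    ]
    for t in threads:
        t.start()

    # Feed a few synthetic event batches, then shut down.
    for i in range(5):
        events = [(10, 20, i * 0.001 + j * 1e-5, 1) for j in range(100)]
        event_q.put(EventBatch(events=events, t_ref=i * 0.001))
        time.sleep(0.01)
    event_q.put(None)
    for t in threads:
        t.join()
    print("map size:", len(semi_dense_map), "points")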