Robotics: Science and Systems XI

Get Out of My Lab: Large-scale, Real-Time Visual-Inertial Localization

Simon Lynen, Torsten Sattler, Michael Bosse, Joel Hesch, Marc Pollefeys, Roland Siegwart

Abstract:

Accurately estimating a robot's pose relative to a global scene model and precisely tracking the pose in real time is a fundamental problem for navigation and obstacle avoidance tasks. Due to the computational complexity of localization against a large map and the memory consumed by the model, state-of-the-art approaches are either limited to small workspaces or rely on a server-side system to query the global model while tracking the pose locally. The latter approaches face the problem of smoothly integrating the server's pose estimates into the trajectory computed locally to avoid temporal discontinuities. In this paper, we demonstrate that large-scale, real-time pose estimation and tracking can be performed on mobile platforms with limited resources without the use of an external server. This is achieved by employing map and descriptor compression schemes as well as efficient search algorithms from computer vision. We derive a formulation for integrating the global pose information into a local state estimator that produces much smoother trajectories than current approaches. Through detailed experiments, we evaluate each of our design choices individually and document its impact on the overall system performance, demonstrating that our approach outperforms state-of-the-art algorithms for localization at scale.
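
The abstract mentions fusing global pose information from map-based localization into a locally running state estimator so that the trajectory stays smooth. The following is only a minimal illustrative sketch of that general idea, not the formulation derived in the paper: it fuses an absolute position fix into a locally tracked estimate with a standard Kalman update. The 3-D position-only state, identity measurement model, and all covariance values are assumptions made for the example.

import numpy as np

def fuse_global_position(x_local, P_local, z_global, R_global):
    """Fuse a global position measurement into a local estimate.

    x_local  : (3,)  locally tracked position estimate
    P_local  : (3,3) covariance of the local estimate
    z_global : (3,)  absolute position from localization against the map
    R_global : (3,3) covariance of the global measurement
    Returns the updated (position, covariance) pair.
    """
    H = np.eye(3)                         # direct position measurement model (assumed)
    S = H @ P_local @ H.T + R_global      # innovation covariance
    K = P_local @ H.T @ np.linalg.inv(S)  # Kalman gain
    x_new = x_local + K @ (z_global - H @ x_local)
    P_new = (np.eye(3) - K @ H) @ P_local
    return x_new, P_new

# Example: a confident global fix gently pulls a drifting local estimate.
x, P = np.array([10.2, 4.9, 1.1]), np.eye(3) * 0.5
z, R = np.array([10.0, 5.0, 1.0]), np.eye(3) * 0.1
x, P = fuse_global_position(x, P, z, R)

Blending the global fix into the local estimate in this manner, rather than overwriting the local trajectory, is one way to avoid the temporal discontinuities the abstract refers to.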


Bibtex:

@INPROCEEDINGS{Lynen-RSS-15, 
    AUTHOR    = {Simon Lynen AND Torsten Sattler AND Michael Bosse AND Joel Hesch AND Marc Pollefeys AND Roland Siegwart}, 
    TITLE     = {Get Out of My Lab: Large-scale, Real-Time Visual-Inertial Localization}, 
    BOOKTITLE = {Proceedings of Robotics: Science and Systems}, 
    YEAR      = {2015}, 
    ADDRESS   = {Rome, Italy}, 
    MONTH     = {July},
    DOI       = {10.15607/RSS.2015.XI.037} 
}