Real-time large-scale dense RGB-D SLAM with volumetric fusion

T. Whelan, M. Kaess, H. Johannsson, M. Fallon, J. J. Leonard, J. Mcdonald

Research output: Contribution to journal › Article › peer-review

Abstract / Description of output

We present a new simultaneous localization and mapping (SLAM) system capable of producing high-quality globally consistent surface reconstructions over hundreds of meters in real time with only a low-cost commodity RGB-D sensor. By using a fused volumetric surface reconstruction we achieve a much higher quality map than would be achieved using raw RGB-D point clouds. In this paper we highlight three key techniques associated with applying a volumetric fusion-based mapping system to the SLAM problem in real time. First, the use of a GPU-based 3D cyclical buffer trick to efficiently extend dense every-frame volumetric fusion of depth maps to function over an unbounded spatial region. Second, overcoming camera pose estimation limitations in a wide variety of environments by combining both dense geometric and photometric camera pose constraints. Third, efficiently updating the dense map according to place recognition and subsequent loop closure constraints by the use of an ‘as-rigid-as-possible’ space deformation. We present results on a wide variety of aspects of the system and show through evaluation on de facto standard RGB-D benchmarks that our system performs strongly in terms of trajectory estimation, map quality and computational performance in comparison to other state-of-the-art systems.
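The first technique, a 3D cyclical buffer, lets a fixed-size voxel volume follow the camera over an unbounded region: world voxel indices are mapped onto physical buffer slots by modulo addressing, and when the volume shifts, only the slab that wraps around to the new frontier is cleared. The sketch below illustrates the addressing idea in Python rather than the paper's GPU implementation; the class, its parameters, and the axis-by-axis shift are illustrative assumptions, not the authors' code.

```python
import numpy as np

class CyclicalTSDFBuffer:
    """Hypothetical sketch of cyclical (rolling) voxel-buffer addressing.

    A fixed-size TSDF buffer tracks an unbounded world region: world voxel
    indices wrap onto physical slots via modulo, so shifting the active
    region only requires clearing the slab that cycles to the new frontier.
    """

    def __init__(self, size=8):
        self.size = size
        self.tsdf = np.zeros((size, size, size), dtype=np.float32)
        # World voxel index of the buffer's lowest corner.
        self.origin = np.zeros(3, dtype=np.int64)

    def _wrap(self, world_idx):
        # World voxel index -> physical buffer slot (modulo addressing).
        return tuple(np.mod(np.asarray(world_idx, dtype=np.int64), self.size))

    def write(self, world_idx, value):
        self.tsdf[self._wrap(world_idx)] = value

    def read(self, world_idx):
        return float(self.tsdf[self._wrap(world_idx)])

    def shift(self, delta):
        """Move the active region by `delta` voxels per axis, clearing
        each plane that leaves the volume (its slot becomes the frontier)."""
        delta = np.asarray(delta, dtype=np.int64)
        for axis in range(3):
            step = int(np.sign(delta[axis]))
            for _ in range(abs(int(delta[axis]))):
                if step > 0:
                    leaving = int(self.origin[axis]) % self.size
                else:
                    leaving = int(self.origin[axis] - 1) % self.size
                idx = [slice(None)] * 3
                idx[axis] = leaving
                self.tsdf[tuple(idx)] = 0.0  # reset the recycled slab
                self.origin[axis] += step
```

In the real system the buffer lives in GPU memory and surface data leaving the volume is extracted into the global map before its slab is recycled; this sketch only shows the index arithmetic.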
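The second technique, combining dense geometric and photometric camera pose constraints, amounts to minimising a weighted sum of point-to-plane residuals (geometric) and intensity-difference residuals (photometric) over a joint objective. A minimal sketch of the combined cost, assuming correspondences have already been established; the function names and weights here are hypothetical, not from the paper:

```python
import numpy as np

def joint_residuals(src_pts, dst_pts, dst_normals,
                    src_intensity, dst_intensity,
                    w_geo=1.0, w_rgb=0.1):
    """Stack weighted geometric and photometric residuals.

    Geometric term: point-to-plane distance between corresponded points.
    Photometric term: intensity difference at the same correspondences.
    Correspondences are assumed precomputed (hypothetical sketch).
    """
    # Point-to-plane: (q_i - p_i) . n_i for each correspondence.
    geo = np.einsum('ij,ij->i', dst_pts - src_pts, dst_normals)
    # Photometric: intensity mismatch at each correspondence.
    rgb = dst_intensity - src_intensity
    return np.concatenate([w_geo * geo, w_rgb * rgb])

def joint_cost(*args, **kwargs):
    """Sum of squared stacked residuals (the quantity a solver minimises)."""
    r = joint_residuals(*args, **kwargs)
    return float(r @ r)
```

A real tracker would linearise these residuals with respect to the 6-DoF camera pose and iterate (e.g. Gauss-Newton on a coarse-to-fine pyramid); the sketch only shows how the two error types combine into one objective.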
Original language: English
Pages (from-to): 598-626
Number of pages: 29
Journal: International Journal of Robotics Research
Issue number: 4-5
Early online date: 9 Dec 2014
Publication status: Published - Apr 2015


