Dense RGB SLAM With Neural Implicit Maps

ICLR 2023

Heng Li 1,2,3, Xiaodong Gu 2, Weihao Yuan 2, Luwei Yang 3, Zilong Dong 2, Ping Tan 1,2,3
1 Hong Kong University of Science and Technology, 2 Alibaba Group, 3 Simon Fraser University
Framework

DIM-SLAM is the first dense RGB SLAM system with a NeRF-style neural implicit map representation.

Abstract

There is an emerging trend of using neural implicit functions for map representation in Simultaneous Localization and Mapping (SLAM). Some pioneering works have achieved encouraging results on RGB-D SLAM. In this paper, we present a dense RGB SLAM method with a neural implicit map representation. To reach this challenging goal without depth input, we introduce a hierarchical feature volume to facilitate the implicit map decoder. This design effectively fuses shape cues across different scales to improve map reconstruction. Our method simultaneously solves for the camera motion and the neural implicit map by matching the rendered and input video frames. To better constrain the camera pose and scene geometry during optimization, we further propose a photometric warping loss in the spirit of multi-view stereo. We evaluate our method on commonly used benchmarks and compare it with modern RGB and RGB-D SLAM systems. Our method achieves more favorable results than previous methods and even surpasses some recent RGB-D SLAM methods.
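
The abstract mentions two technical components: a hierarchical feature volume decoded into geometry by an MLP, and a photometric warping loss that couples camera pose and scene geometry. The following is a minimal PyTorch sketch of both ideas under assumed shapes and hyperparameters; every class name, grid resolution, and interface here is an illustrative assumption, not the authors' released implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class HierarchicalFeatureVolume(nn.Module):
    """Multi-scale dense feature grids fused by a shared MLP decoder
    (a sketch of the hierarchical feature volume idea; all sizes assumed)."""
    def __init__(self, channels=4, resolutions=(16, 32, 64, 128)):
        super().__init__()
        # One learnable grid per scale: coarse grids capture global shape,
        # fine grids capture detail.
        self.grids = nn.ParameterList([
            nn.Parameter(0.01 * torch.randn(1, channels, r, r, r))
            for r in resolutions
        ])
        self.decoder = nn.Sequential(   # decodes fused features to geometry
            nn.Linear(channels * len(resolutions), 32), nn.ReLU(),
            nn.Linear(32, 1),           # e.g. an occupancy / density value
        )

    def forward(self, xyz):             # xyz: (N, 3) in [-1, 1]
        pts = xyz.view(1, -1, 1, 1, 3)  # 5-D sample coords for grid_sample
        feats = [
            F.grid_sample(g, pts, align_corners=True)  # (1, C, N, 1, 1)
             .squeeze(-1).squeeze(-1).squeeze(0).t()   # -> (N, C)
            for g in self.grids
        ]
        return self.decoder(torch.cat(feats, dim=-1))  # fuse scales, decode

A photometric warping loss in the multi-view-stereo spirit can be sketched as below: back-project reference pixels with the rendered depth, reproject them into a source view, and penalize the color difference. The tensor layouts and the pose convention (T_ref_to_src maps reference-frame points into the source frame) are assumptions for this sketch.

def photometric_warp_loss(img_src, img_ref, depth_ref, K, T_ref_to_src):
    """img_*: (1, 3, H, W) float; depth_ref: (H, W) rendered depth;
    K: (3, 3) float intrinsics; T_ref_to_src: (4, 4) relative pose."""
    H, W = depth_ref.shape
    v, u = torch.meshgrid(torch.arange(H), torch.arange(W), indexing="ij")
    pix = torch.stack([u, v, torch.ones_like(u)], dim=-1).float()  # (H, W, 3)
    # Back-project reference pixels to 3-D using the rendered depth.
    pts = (torch.linalg.inv(K) @ pix.reshape(-1, 3).t()) * depth_ref.reshape(-1)
    # Transform into the source camera frame and project.
    pts = T_ref_to_src[:3, :3] @ pts + T_ref_to_src[:3, 3:]
    proj = K @ pts
    uv = proj[:2] / proj[2].clamp(min=1e-6)
    # Normalize pixel coords to [-1, 1] for grid_sample, then warp and compare.
    grid = torch.stack([uv[0] / (W - 1), uv[1] / (H - 1)], dim=-1) * 2 - 1
    warped = F.grid_sample(img_src, grid.view(1, H, W, 2), align_corners=True)
    return (warped - img_ref).abs().mean()

Because the loss is differentiable with respect to both the depth (hence the implicit map) and the relative pose, gradients from this single term can drive the joint optimization of camera motion and map described in the abstract.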

Results

BibTeX

@inproceedings{li2023dense,
  author    = {Li, Heng and Gu, Xiaodong and Yuan, Weihao and Yang, Luwei and Dong, Zilong and Tan, Ping},
  title     = {Dense RGB SLAM With Neural Implicit Maps},
  booktitle = {Proceedings of the International Conference on Learning Representations},
  year      = {2023},
  url       = {https://openreview.net/forum?id=QUK1ExlbbA}
}