Robust RGB-D SLAM using point and line features for low textured scene

Yajing Zou, Amr Eldemiry, Yaxin Li, Wu Chen

Research output: Journal article publication › Journal article › Academic research › peer-review

16 Citations (Scopus)

Abstract

Three-dimensional (3D) reconstruction using an RGB-D camera, which captures color images and depth simultaneously, is attractive because it significantly reduces equipment cost and data-collection time. Point features are commonly used to align two RGB-D frames. However, because reliable point features are scarce in low textured scenes, RGB-D simultaneous localization and mapping (SLAM) is prone to failure there. To overcome this problem, this paper proposes a robust RGB-D SLAM system that fuses both points and lines, since lines provide robust geometric constraints when points are insufficient. To fuse line constraints comprehensively, we combine 2D and 3D line reprojection errors with the point reprojection error in a novel cost function. To minimize this cost function and filter out wrong feature matches, we build a robust pose solver using the Gauss–Newton method and the Chi-Square test. To correct the drift of camera poses, we maintain a sliding-window framework that updates the keyframe poses and the related features. We evaluate the proposed system on public datasets and in real-world experiments, and demonstrate that it is comparable to or better than state-of-the-art methods in terms of both accuracy and robustness.
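The abstract does not spell out the cost function. A plausible form of the joint objective it describes, combining the point reprojection error with 2D and 3D line reprojection errors over the estimated camera pose (the symbols below are illustrative, not taken from the paper), is:

$$
E(\boldsymbol{\xi}) = \sum_{i} \big\lVert \pi\big(T(\boldsymbol{\xi})\,\mathbf{P}_i\big) - \mathbf{u}_i \big\rVert^2_{\Sigma_{p,i}}
+ \sum_{j} \big\lVert \mathbf{e}^{2\mathrm{D}}_{l,j}(\boldsymbol{\xi}) \big\rVert^2_{\Sigma_{l,j}}
+ \sum_{k} \big\lVert \mathbf{e}^{3\mathrm{D}}_{l,k}(\boldsymbol{\xi}) \big\rVert^2_{\Lambda_{l,k}},
$$

where $T(\boldsymbol{\xi})$ is the camera pose, $\pi(\cdot)$ the pinhole projection, $\mathbf{P}_i$ and $\mathbf{u}_i$ a matched 3D point and its 2D observation, $\mathbf{e}^{2\mathrm{D}}_{l,j}$ the distance between the reprojected endpoints of a 3D line and its matched 2D line segment, and $\mathbf{e}^{3\mathrm{D}}_{l,k}$ the discrepancy between the transformed and the measured 3D lines.

Likewise, the abstract only names the solver. A minimal sketch of a single Gauss–Newton update with Chi-square gating of residuals, in the spirit of the described pose solver (the function names, the 2-DOF threshold, and the whitened-residual assumption are illustrative, not the authors' implementation), might look like:

```python
# Minimal sketch, not the authors' implementation: one Gauss-Newton update with
# Chi-square gating of (assumed whitened) 2-DOF reprojection residuals.
import numpy as np

CHI2_2DOF_95 = 5.991  # 95% Chi-square threshold for a 2-DOF residual

def gauss_newton_step(residual_fn, jacobian_fn, x, observations):
    """Return the updated state and the observations kept as inliers."""
    dim = len(x)
    H = np.zeros((dim, dim))   # Gauss-Newton approximation of the Hessian, J^T J
    b = np.zeros(dim)          # gradient term, J^T r
    inliers = []
    for obs in observations:
        r = residual_fn(x, obs)            # 2-vector residual for this observation
        if float(r @ r) > CHI2_2DOF_95:    # Chi-square test: reject likely wrong matches
            continue
        J = jacobian_fn(x, obs)            # 2 x dim Jacobian of the residual
        H += J.T @ J
        b += J.T @ r
        inliers.append(obs)
    dx = np.linalg.solve(H, -b)            # solve the normal equations H dx = -b
    return x + dx, inliers
```

In a full system this step would be iterated to convergence, with separate Chi-square thresholds for the point, 2D-line, and 3D-line residual blocks according to their dimensions.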

Original language: English
Article number: 4984
Pages (from-to): 1-20
Number of pages: 20
Journal: Sensors (Switzerland)
Volume: 20
Issue number: 17
DOIs
Publication status: Published - 1 Sept 2020

Keywords

  • Line features
  • Low textured scene
  • RGB-D SLAM
  • Sliding-window

ASJC Scopus subject areas

  • Analytical Chemistry
  • Biochemistry
  • Atomic and Molecular Physics, and Optics
  • Instrumentation
  • Electrical and Electronic Engineering

