Improving monocular visual SLAM in dynamic environments: an optical-flow-based approach

Jiyu Cheng, Yuxiang Sun, Max Q.H. Meng

Research output: Journal article, academic research, peer-reviewed

73 Citations (Scopus)

Abstract

Visual Simultaneous Localization and Mapping (visual SLAM) has attracted increasing research attention in recent decades, and many state-of-the-art algorithms achieve satisfactory performance in static scenarios. In dynamic scenarios, however, the performance of current visual SLAM algorithms degrades significantly due to the disturbance caused by dynamic objects. To address this problem, we propose a novel method that uses optical flow to distinguish and eliminate dynamic feature points from the extracted ones, using RGB images as the only input. The remaining static feature points are fed into the visual SLAM system for camera pose estimation. We integrate our method with the original ORB-SLAM system and validate it on the challenging dynamic sequences of the TUM dataset and on our recorded office dataset. The whole system runs in real time. Qualitative and quantitative evaluations demonstrate that our method significantly improves the performance of ORB-SLAM in dynamic scenarios.
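To illustrate the general idea of separating static from dynamic feature points by their optical flow, the sketch below compares each point's flow vector against the dominant (camera-induced) flow, approximated here by the per-component median. This is a minimal, hypothetical simplification for intuition only; the paper's actual method and thresholds are not reproduced here, and the function name and parameters are assumptions.

```python
import numpy as np

def filter_dynamic_points(points, flow, threshold=2.0):
    """Split feature points into (static, dynamic) sets.

    A point is labeled dynamic when its optical-flow vector deviates
    from the dominant flow (median over all points, a crude proxy for
    camera-induced motion) by more than `threshold` pixels.
    Illustrative simplification, not the paper's exact algorithm.
    """
    points = np.asarray(points)
    flow = np.asarray(flow, dtype=float)
    dominant = np.median(flow, axis=0)                 # rough camera motion
    residual = np.linalg.norm(flow - dominant, axis=1) # deviation per point
    static_mask = residual <= threshold
    return points[static_mask], points[~static_mask]

# Example: five points share a small rightward flow (camera pan);
# the last point moves very differently (a dynamic object).
pts = np.array([[10, 10], [20, 15], [30, 40], [50, 60], [70, 80], [90, 90]])
flw = np.array([[1.0, 0.1], [1.1, 0.0], [0.9, 0.2],
                [1.0, 0.1], [1.2, -0.1], [8.0, 6.0]])
static_pts, dynamic_pts = filter_dynamic_points(pts, flw)
```

In a full pipeline, only `static_pts` would be passed on to the SLAM front end for pose estimation; in practice the deviation test would typically use an epipolar or homography constraint rather than a simple median.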

Original language: English
Pages (from-to): 576-589
Number of pages: 14
Journal: Advanced Robotics
Volume: 33
Issue number: 12
DOIs
Publication status: Published - 18 Jun 2019

Keywords

  • dynamic environments
  • optical flow
  • Visual SLAM

ASJC Scopus subject areas

  • Software
  • Control and Systems Engineering
  • Human-Computer Interaction
  • Hardware and Architecture
  • Computer Science Applications
