Abstract
Visual Simultaneous Localization and Mapping (visual SLAM) has attracted growing research attention in recent decades, and many state-of-the-art algorithms achieve satisfactory performance in static scenarios. In dynamic scenarios, however, the performance of current visual SLAM algorithms degrades significantly because of disturbances from dynamic objects. To address this problem, we propose a novel method that uses optical flow to distinguish and eliminate dynamic feature points from the extracted ones, using RGB images as the only input. The remaining static feature points are fed into the visual SLAM system for camera pose estimation. We integrate our method with the original ORB-SLAM system and validate it on challenging dynamic sequences from the TUM dataset and on our recorded office dataset. The whole system runs in real time. Qualitative and quantitative evaluations demonstrate that our method significantly improves the performance of ORB-SLAM in dynamic scenarios.
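The core idea sketched below is a minimal illustration of the filtering step the abstract describes, not the authors' actual implementation: given feature points tracked between two RGB frames, points whose optical-flow vectors deviate strongly from the dominant (camera-induced) flow are treated as dynamic and discarded. The median-flow heuristic and the `thresh` parameter are assumptions for this sketch.

```python
import numpy as np

def filter_dynamic_points(prev_pts, curr_pts, thresh=2.0):
    """Split tracked feature points into static/dynamic sets.

    Each point's flow vector is compared against the dominant flow
    (estimated here as the per-axis median, a rough stand-in for the
    camera-induced motion). Points with a large residual are labeled
    dynamic. `thresh` (pixels) is an assumed tuning parameter.
    """
    flow = curr_pts - prev_pts                     # per-point flow vectors
    dominant = np.median(flow, axis=0)             # rough ego-motion flow
    residual = np.linalg.norm(flow - dominant, axis=1)
    static_mask = residual < thresh                # small residual -> static
    return curr_pts[static_mask], curr_pts[~static_mask]

# Toy example: 8 background points move by (1, 0) due to camera motion,
# while 2 points on a moving object shift by (6, 5).
prev = np.array([[i * 10.0, i * 5.0] for i in range(10)])
motion = np.tile([1.0, 0.0], (10, 1))
motion[8:] = [6.0, 5.0]
curr = prev + motion

static_pts, dynamic_pts = filter_dynamic_points(prev, curr)
print(len(static_pts), len(dynamic_pts))  # 8 2
```

In the paper's pipeline, only the static set would then be passed on to ORB-SLAM's pose estimation; a real implementation would obtain `prev_pts`/`curr_pts` from an optical-flow tracker rather than synthetic data.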
| Original language | English |
| --- | --- |
| Pages (from-to) | 576-589 |
| Number of pages | 14 |
| Journal | Advanced Robotics |
| Volume | 33 |
| Issue number | 12 |
| DOIs | |
| Publication status | Published - 18 Jun 2019 |
Keywords
- dynamic environments
- optical flow
- Visual SLAM
ASJC Scopus subject areas
- Software
- Control and Systems Engineering
- Human-Computer Interaction
- Hardware and Architecture
- Computer Science Applications