An Accurate Localization Scheme for Mobile Robots Using Optical Flow in Dynamic Environments

Jiyu Cheng, Yuxiang Sun, Wenzheng Chi, Chaoqun Wang, Hu Cheng, Max Q.H. Meng

Research output: Chapter in book / Conference proceeding › Conference article published in proceeding or book › Academic research › peer-review

7 Citations (Scopus)

Abstract

Visual Simultaneous Localization and Mapping (Visual SLAM) has been studied extensively in recent years, and many state-of-the-art algorithms have been proposed with rather satisfactory performance in static scenarios. In dynamic scenarios, however, off-the-shelf Visual SLAM algorithms cannot localize the robot accurately. To address this problem, we propose a novel method that uses optical flow to distinguish and eliminate dynamic feature points from the extracted ones, using RGB images as the only input. The remaining static feature points are fed into the Visual SLAM algorithm for camera pose estimation. We integrate our method with the ORB-SLAM system and validate it on challenging dynamic sequences from the TUM dataset. The entire system runs in real time. Qualitative and quantitative evaluations demonstrate that our method significantly improves the performance of Visual SLAM in dynamic scenarios.
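To illustrate the core idea in the abstract — separating dynamic from static feature points by their optical-flow vectors — here is a minimal, hedged sketch. It is not the authors' implementation: the function name, the use of the median flow as a stand-in for camera-induced motion, and the pixel threshold are all illustrative assumptions.

```python
# Hedged sketch of optical-flow-based dynamic-point rejection.
# Assumption: for a mostly static scene, the median flow vector
# approximates the camera-induced motion, so points whose flow
# deviates strongly from it likely lie on moving objects.

from statistics import median

def split_static_dynamic(points, flows, thresh=2.0):
    """Split feature points into static and dynamic sets.

    points: list of (x, y) feature locations in the previous frame.
    flows:  list of (dx, dy) optical-flow vectors, one per point
            (e.g. from a Lucas-Kanade tracker).
    thresh: deviation (in pixels) from the median flow beyond which
            a point is treated as dynamic. Illustrative value.
    """
    # Median flow is robust to a minority of dynamic outliers.
    mdx = median(dx for dx, _ in flows)
    mdy = median(dy for _, dy in flows)

    static, dynamic = [], []
    for pt, (dx, dy) in zip(points, flows):
        dev = ((dx - mdx) ** 2 + (dy - mdy) ** 2) ** 0.5
        (static if dev <= thresh else dynamic).append(pt)
    return static, dynamic

# Example: five background points sharing ~(1, 0) px of flow, plus
# one point on a moving object with clearly independent motion.
pts = [(10, 10), (50, 20), (90, 40), (30, 80), (70, 60), (55, 55)]
flows = [(1.0, 0.1), (0.9, 0.0), (1.1, -0.1), (1.0, 0.0),
         (0.95, 0.05), (8.0, 5.0)]
static, dynamic = split_static_dynamic(pts, flows)
# Only the static set would be passed on to pose estimation.
```

In the paper's pipeline, the surviving static points would then be handed to ORB-SLAM's tracking thread; here the filter is shown in isolation for clarity.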

Original language: English
Title of host publication: 2018 IEEE International Conference on Robotics and Biomimetics, ROBIO 2018
Publisher: IEEE
Pages: 723-728
Number of pages: 6
ISBN (Electronic): 9781728103761
DOIs
Publication status: Published - 2 Jul 2018
Event: 2018 IEEE International Conference on Robotics and Biomimetics, ROBIO 2018 - Kuala Lumpur, Malaysia
Duration: 12 Dec 2018 - 15 Dec 2018

Publication series

Name: 2018 IEEE International Conference on Robotics and Biomimetics, ROBIO 2018

Conference

Conference: 2018 IEEE International Conference on Robotics and Biomimetics, ROBIO 2018
Country: Malaysia
City: Kuala Lumpur
Period: 12/12/18 - 15/12/18

ASJC Scopus subject areas

  • Biotechnology
  • Artificial Intelligence
  • Human-Computer Interaction
