TY - GEN
T1 - An Accurate Localization Scheme for Mobile Robots Using Optical Flow in Dynamic Environments
AU - Cheng, Jiyu
AU - Sun, Yuxiang
AU - Chi, Wenzheng
AU - Wang, Chaoqun
AU - Cheng, Hu
AU - Meng, Max Q.H.
N1 - Funding Information:
This project is partially supported by RGC GRF grants CUHK 415512 and CUHK 415613, CRF grant CUHK 6CRF13G, and CUHK VC discretional fund #4930765, awarded to Prof. Max Q.-H. Meng.
Publisher Copyright:
© 2018 IEEE.
PY - 2018/7/2
Y1 - 2018/7/2
N2 - Visual Simultaneous Localization and Mapping (Visual SLAM) has been studied extensively in recent years, and many state-of-the-art algorithms have been proposed that achieve satisfactory performance in static scenarios. In dynamic scenarios, however, off-the-shelf Visual SLAM algorithms cannot localize the robot accurately. To address this problem, we propose a novel method that uses optical flow to distinguish dynamic feature points among the extracted ones and eliminate them, using RGB images as the only input. The remaining static feature points are fed into the Visual SLAM algorithm for camera pose estimation. We integrate our method with the ORB-SLAM system and validate it on challenging dynamic sequences from the TUM dataset. The entire system runs in real time. Qualitative and quantitative evaluations demonstrate that our method significantly improves the performance of Visual SLAM in dynamic scenarios.
AB - Visual Simultaneous Localization and Mapping (Visual SLAM) has been studied extensively in recent years, and many state-of-the-art algorithms have been proposed that achieve satisfactory performance in static scenarios. In dynamic scenarios, however, off-the-shelf Visual SLAM algorithms cannot localize the robot accurately. To address this problem, we propose a novel method that uses optical flow to distinguish dynamic feature points among the extracted ones and eliminate them, using RGB images as the only input. The remaining static feature points are fed into the Visual SLAM algorithm for camera pose estimation. We integrate our method with the ORB-SLAM system and validate it on challenging dynamic sequences from the TUM dataset. The entire system runs in real time. Qualitative and quantitative evaluations demonstrate that our method significantly improves the performance of Visual SLAM in dynamic scenarios.
UR - http://www.scopus.com/inward/record.url?scp=85064127560&partnerID=8YFLogxK
U2 - 10.1109/ROBIO.2018.8664893
DO - 10.1109/ROBIO.2018.8664893
M3 - Conference article published in proceedings or book
AN - SCOPUS:85064127560
T3 - 2018 IEEE International Conference on Robotics and Biomimetics, ROBIO 2018
SP - 723
EP - 728
BT - 2018 IEEE International Conference on Robotics and Biomimetics, ROBIO 2018
PB - IEEE
T2 - 2018 IEEE International Conference on Robotics and Biomimetics, ROBIO 2018
Y2 - 12 December 2018 through 15 December 2018
ER -
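
The abstract above describes the general pipeline: track extracted feature points with optical flow between consecutive RGB frames, reject points whose motion is inconsistent with the camera's ego-motion as dynamic, and pass only the static points to the SLAM back end for pose estimation. Below is a minimal Python/OpenCV sketch of that idea, not the authors' implementation; the function name filter_dynamic_points, the epipolar threshold, the feature-detector parameters, and the frame file paths are all illustrative assumptions.

import cv2
import numpy as np

def filter_dynamic_points(prev_gray, curr_gray, points, epipolar_thresh=1.0):
    """Track points with Lucas-Kanade optical flow, fit a fundamental
    matrix with RANSAC, and treat points whose flow disagrees with the
    dominant epipolar geometry as dynamic. Returns the tracked static
    points in the current frame. (Hypothetical helper, for illustration.)"""
    # Track the points from the previous frame into the current one.
    curr_points, status, _ = cv2.calcOpticalFlowPyrLK(
        prev_gray, curr_gray, points, None)
    status = status.ravel().astype(bool)
    p0, p1 = points[status], curr_points[status]

    # RANSAC over the flow correspondences: inliers follow the camera's
    # ego-motion; outliers are candidate dynamic points to discard.
    _, inlier_mask = cv2.findFundamentalMat(
        p0, p1, cv2.FM_RANSAC, epipolar_thresh, 0.99)
    if inlier_mask is None:
        return p1  # too few correspondences to fit a model; keep all
    inliers = inlier_mask.ravel().astype(bool)
    return p1[inliers]  # static points to feed into pose estimation

if __name__ == "__main__":
    # Usage with two consecutive RGB frames (paths are placeholders).
    prev = cv2.cvtColor(cv2.imread("frame0.png"), cv2.COLOR_BGR2GRAY)
    curr = cv2.cvtColor(cv2.imread("frame1.png"), cv2.COLOR_BGR2GRAY)
    pts = cv2.goodFeaturesToTrack(prev, maxCorners=500,
                                  qualityLevel=0.01, minDistance=7)
    static_pts = filter_dynamic_points(prev, curr, pts)
    print(f"{len(static_pts)} static points retained")

Note that the paper integrates this filtering with ORB-SLAM and evaluates on TUM dynamic sequences; the sketch only shows the flow-based static/dynamic split, using a RANSAC fundamental-matrix fit as one plausible consistency test.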