TY - GEN
T1 - Perception-aided Visual-Inertial Integrated Positioning in Dynamic Urban Areas
AU - Bai, Xiwei
AU - Zhang, Bo
AU - Wen, Weisong
AU - Hsu, Li Ta
AU - Li, Huiyun
N1 - Funding Information:
ACKNOWLEDGMENT: The authors acknowledge the support of the Hong Kong PolyU internal grant on the project ZVKZ, "Navigation for Autonomous Driving Vehicle using Sensor Integration".
Publisher Copyright:
© 2020 IEEE.
PY - 2020/4
Y1 - 2020/4
N2 - Visual-inertial navigation systems (VINS) have been extensively studied over the past decades to provide positioning services for autonomous systems, such as autonomous driving vehicles (ADV) and unmanned aerial vehicles (UAV). VINS can achieve decent performance in indoor scenarios with stable illumination and rich texture information. Unfortunately, applying VINS in dynamic urban areas remains a challenging problem, because excessive dynamic objects can significantly degrade its performance. A straightforward idea for mitigating the impacts of dynamic objects on VINS is to use a deep neural network (DNN) to detect and remove the image features that belong to unexpected objects, such as moving vehicles and pedestrians. However, excessive exclusion of features can significantly distort the geometric distribution of visual features. Even worse, excessive removal can render the system states unobservable. Instead of directly excluding the features that possibly belong to dynamic objects, this paper proposes to remodel the uncertainty of the dynamic features, so that both healthy and dynamic features are used in the VINS. An experiment in a typical urban canyon is conducted to validate the performance of the proposed method. The results show that the proposed method effectively mitigates the impacts of dynamic objects and achieves improved accuracy.
AB - Visual-inertial navigation systems (VINS) have been extensively studied over the past decades to provide positioning services for autonomous systems, such as autonomous driving vehicles (ADV) and unmanned aerial vehicles (UAV). VINS can achieve decent performance in indoor scenarios with stable illumination and rich texture information. Unfortunately, applying VINS in dynamic urban areas remains a challenging problem, because excessive dynamic objects can significantly degrade its performance. A straightforward idea for mitigating the impacts of dynamic objects on VINS is to use a deep neural network (DNN) to detect and remove the image features that belong to unexpected objects, such as moving vehicles and pedestrians. However, excessive exclusion of features can significantly distort the geometric distribution of visual features. Even worse, excessive removal can render the system states unobservable. Instead of directly excluding the features that possibly belong to dynamic objects, this paper proposes to remodel the uncertainty of the dynamic features, so that both healthy and dynamic features are used in the VINS. An experiment in a typical urban canyon is conducted to validate the performance of the proposed method. The results show that the proposed method effectively mitigates the impacts of dynamic objects and achieves improved accuracy.
KW - INS
KW - Navigation
KW - Positioning
KW - Urban Areas
KW - VINS
KW - Visual Odometry
UR - http://www.scopus.com/inward/record.url?scp=85085578671&partnerID=8YFLogxK
U2 - 10.1109/PLANS46316.2020.9109963
DO - 10.1109/PLANS46316.2020.9109963
M3 - Conference article published in proceeding or book
AN - SCOPUS:85085578671
T3 - 2020 IEEE/ION Position, Location and Navigation Symposium, PLANS 2020
SP - 1563
EP - 1571
BT - 2020 IEEE/ION Position, Location and Navigation Symposium, PLANS 2020
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2020 IEEE/ION Position, Location and Navigation Symposium, PLANS 2020
Y2 - 20 April 2020 through 23 April 2020
ER -