Performance Analysis of Visual/Inertial Integrated Positioning in Typical Urban Scenarios of Hong Kong

Research output: Chapter in book / Conference proceeding › Conference article published in proceeding or book › Academic research › peer-review


There is an increasing demand for accurate and robust positioning in many application domains, such as unmanned aerial vehicles (UAVs) and autonomous driving vehicles (ADVs). The integration of visual odometry and an inertial navigation system (INS) has been extensively studied to fulfill this positioning requirement. Visual odometry provides aiding positioning by matching consecutive image frames; however, it can be sensitive to illumination conditions and feature availability in urban environments. In this paper, we evaluate the performance of tightly coupled visual/inertial integrated positioning in a typical urban scenario of Hong Kong using an existing state-of-the-art visual/inertial integration algorithm. The performance of visual/inertial integrated positioning is tested and validated in a typical urban scenario of Hong Kong that includes numerous dynamic participants, such as vehicles, pedestrians, and trucks. The results show that visual/inertial integration can be degraded in scenes with excessive dynamic objects.
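The idea of correcting inertial dead reckoning with visual position fixes can be illustrated with a toy one-dimensional Kalman filter. This is only a hedged sketch of the general visual/inertial fusion principle, not the tightly coupled algorithm evaluated in the paper; all function names, noise values (`q`, `r`), and the scalar state model are illustrative assumptions.

```python
def fuse(ins_vel, vis_pos, dt=0.1, q=0.01, r=0.25):
    """Toy scalar Kalman filter: dead-reckon position from inertial
    velocity, then correct drift with visual-odometry position fixes.

    ins_vel : list of velocity readings (one per time step)
    vis_pos : list of visual position fixes, or None when the camera
              fails (e.g. poor illumination or too few features)
    q, r    : assumed process / measurement noise variances
    """
    x, p = 0.0, 1.0          # position estimate and its variance
    est = []
    for v, z in zip(ins_vel, vis_pos):
        # predict: propagate position with the inertial velocity
        x += v * dt
        p += q
        # update: fuse a visual fix when one is available
        if z is not None:
            k = p / (p + r)  # Kalman gain
            x += k * (z - x)
            p *= (1.0 - k)
        est.append(x)
    return est


# Unbiased inertial input with matching visual fixes: estimate tracks truth.
traj = fuse([1.0] * 10, [0.1 * (i + 1) for i in range(10)])
```

When the visual fixes drop out (all `None`), the filter degenerates to pure dead reckoning and any inertial bias accumulates unchecked, which mirrors the degradation the paper observes when excessive dynamic objects corrupt the visual measurements.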
Original language: English
Title of host publication: Proceedings of 2019 Asian-Pacific Conference on Aerospace Technology and Science, Taiwan
Publication status: Published - 29 Aug 2019
