Perception in the dark—development of a ToF visual inertial odometry system

Shengyang Chen, Ching Wei Chang, Chih Yung Wen

Research output: Journal article publication › Journal article › Academic research › peer-review

8 Citations (Scopus)

Abstract

Visual inertial odometry (VIO) is the front-end of visual simultaneous localization and mapping (vSLAM) methods and has been actively studied in recent years. In this context, a time-of-flight (ToF) camera, with its high accuracy of depth measurement and strong resilience to ambient light of variable intensity, draws our interest. Thus, in this paper, we present a real-time visual inertial system based on a low-cost ToF camera. The iterative closest point (ICP) methodology is adopted, incorporating salient point-selection criteria and a robustness-weighting function. In addition, an error-state Kalman filter is used to fuse the visual odometry estimates with inertial measurement unit (IMU) data. To test its capability, the ToF–VIO system is mounted on an unmanned aerial vehicle (UAV) platform and operated in a variable light environment. The estimated flight trajectory is compared with the ground truth data captured by a motion capture system. Real flight experiments are also conducted in a dark indoor environment, demonstrating good estimation performance. The current system is thus shown to be accurate and efficient for use in UAV applications in dark and Global Navigation Satellite System (GNSS)-denied environments.
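The abstract does not give the paper's exact point-selection criteria or weighting function, but the general shape of a robustness-weighted point-to-plane ICP update can be sketched as follows. The Huber-style weight, the 0.05 m threshold, and all function names below are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch only: one linearised, robustness-weighted point-to-plane ICP step.
# The weighting function (Huber-style) and the threshold are assumptions; the paper's
# actual salient point selection and weighting are not reproduced here.
import numpy as np

def huber_weight(residual, delta=0.05):
    """Down-weight large point-to-plane residuals (in metres) to limit outlier influence."""
    r = np.abs(residual)
    return np.where(r <= delta, 1.0, delta / r)

def icp_point_to_plane_step(src_pts, dst_pts, dst_normals):
    """Solve for a small motion [rx, ry, rz, tx, ty, tz] that reduces the weighted
    point-to-plane error between matched source/target point pairs (N x 3 arrays)."""
    # Signed point-to-plane residual for each match: (q - p) . n
    residuals = np.einsum("ij,ij->i", dst_pts - src_pts, dst_normals)
    w = huber_weight(residuals)

    # Jacobian rows: [ (p x n)^T, n^T ] from the small-angle linearisation of the rotation.
    J = np.hstack([np.cross(src_pts, dst_normals), dst_normals])

    # Weighted normal equations.
    A = J.T @ (w[:, None] * J)
    b = J.T @ (w * residuals)
    return np.linalg.solve(A, b)
```

In the full system described above, the pose increment obtained from such an ICP alignment would then be fused with IMU propagation in an error-state Kalman filter; that fusion step is omitted from this sketch.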

Original language: English
Article number: 1263
Journal: Sensors (Switzerland)
Volume: 20
Issue number: 5
DOIs
Publication status: Published - 26 Feb 2020

Keywords

  • Data fusion
  • Error-state Kalman Filter
  • ICP
  • Real time
  • ToF camera
  • VIO

ASJC Scopus subject areas

  • Analytical Chemistry
  • Biochemistry
  • Atomic and Molecular Physics, and Optics
  • Instrumentation
  • Electrical and Electronic Engineering
