Metric learning with generator for closed loop detection in VSLAM

Jianfang Chang, Na Dong, Donghui Li, Wai Hung Ip, Kai Leung Yung

Research output: Journal article › Academic research › peer-review

1 Citation (Scopus)

Abstract

The development of driverless cars, unmanned aerial vehicles, human–computer interaction and artificial intelligence has promoted the Internet of Things (IoT) industry, in which Visual Simultaneous Localization and Mapping (VSLAM) is an important localization and mapping technique. Closed loop detection can alleviate the accumulation of error during the operation of VSLAM. Traditional closed loop detection methods mostly rely on manually defined features, which are subjective and unstable, making them difficult to apply in complex and repetitive scenarios. Thus, triplet loss-based metric learning has been considered a better solution for closed loop detection. In this paper, first, a constructed Generator is applied to generate the feature vectors of hard negative samples. Second, a triplet loss and a generative loss are combined to construct the loss function. Keyframes are converted into feature vectors with the well-trained model, and the similarity of keyframes is evaluated by calculating the distance between their feature vectors, which is used to determine whether a closed loop is formed. Finally, the TUM dataset is introduced to evaluate the precision and recall of the proposed metric learning. The well-trained model is applied to establish the loop closing thread of a VSLAM system. The experimental results illustrate the feasibility and effectiveness of metric learning-based closed loop detection, which can be further applied to practical VSLAM systems.
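The core mechanism described in the abstract can be sketched as follows. This is an illustrative example only, not the paper's implementation: the function names, embedding dimensions, margin, and distance threshold are all assumptions, and the triplet margin loss shown is the standard formulation rather than the paper's combined triplet-plus-generative loss.

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=1.0):
    """Standard triplet margin loss on embedding vectors.

    Pulls the anchor toward the positive (a keyframe of the same place)
    and pushes it away from the negative (a different place) until the
    gap between the two distances is at least `margin`.
    The margin value here is an assumed example, not the paper's setting.
    """
    d_pos = np.linalg.norm(anchor - positive)  # anchor-positive distance
    d_neg = np.linalg.norm(anchor - negative)  # anchor-negative distance
    return max(d_pos - d_neg + margin, 0.0)

def is_loop_closure(vec_a, vec_b, threshold=0.5):
    """Declare a closed loop when two keyframe embeddings are close.

    The threshold is a hypothetical value; in practice it would be
    tuned on a validation set such as sequences from the TUM dataset.
    """
    return np.linalg.norm(vec_a - vec_b) < threshold

# Example: an easy triplet already satisfies the margin, so its loss is zero,
# while a hard triplet (negative closer than positive) yields a positive loss.
anchor = np.array([0.0, 0.0])
positive = np.array([0.1, 0.0])
hard_negative = np.array([0.2, 0.0])
easy_negative = np.array([10.0, 0.0])
print(triplet_loss(anchor, positive, easy_negative))  # 0.0
print(triplet_loss(anchor, positive, hard_negative) > 0.0)  # True
```

Generating hard negatives (as the paper's Generator does in feature space) matters precisely because easy negatives like the one above contribute zero gradient, so training stalls without harder examples.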

Original language: English
Pages (from-to): 1025-1036
Number of pages: 12
Journal: Journal of Real-Time Image Processing
Volume: 18
Issue number: 4
DOIs
Publication status: Published - 19 Jan 2021

Keywords

  • Feature vector
  • Generator
  • Internet of Things
  • Loop closure detection
  • Metric learning

ASJC Scopus subject areas

  • Information Systems
