Robust Hyperspectral Image Domain Adaptation with Noisy Labels

Wei Wei, Wei Li, Lei Zhang, Cong Wang, Peng Zhang, Yanning Zhang

Research output: Journal article publication › Journal article › Academic research › peer-review

12 Citations (Scopus)

Abstract

In hyperspectral image (HSI) classification, domain adaptation (DA) methods have proved effective at addressing the unsatisfactory classification results caused by the distribution difference between training (i.e., source domain) and testing (i.e., target domain) pixels. However, these methods rely on accurate labels in the source domain and seldom consider the performance drop caused by noisy labels, which occur frequently because labeling pixels in an HSI is a challenging task. To improve the robustness of DA methods to label noise, we propose a new unsupervised HSI DA method that operates at both the feature level and the classifier level. First, a linear transformation function is learned at the feature level to align the source (domain) subspace with the target (domain) subspace. Then, a robust classifier based on low-rank representation is developed to cope well with the features obtained from the aligned subspace. Since both the subspace alignment and the classifier are immune to noisy labels, the proposed method obtains good classification results when confronted with noisy labels in the source domain. Experimental results on two DA benchmarks demonstrate the effectiveness of the proposed method.
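As a rough illustration of the feature-level step the abstract describes, the sketch below shows a standard PCA-based subspace alignment, where a linear map M aligns the source basis with the target basis. The function name, the subspace dimension d, and the plain 1-NN classifier used in the usage comment are illustrative assumptions only; they stand in for, and are not, the paper's robust low-rank representation classifier.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier

def subspace_alignment(Xs, Xt, d=20):
    """Align the source PCA subspace with the target PCA subspace
    via the closed-form linear map M = Ps^T Pt (standard subspace alignment;
    a stand-in for the feature-level step, not the paper's exact formulation)."""
    Ps = PCA(n_components=d).fit(Xs).components_.T   # (n_bands, d) source basis
    Pt = PCA(n_components=d).fit(Xt).components_.T   # (n_bands, d) target basis
    M = Ps.T @ Pt                                    # (d, d) alignment matrix
    Xs_aligned = Xs @ Ps @ M                         # source pixels mapped into the aligned subspace
    Xt_proj = Xt @ Pt                                # target pixels in their own subspace
    return Xs_aligned, Xt_proj

# Hypothetical usage: Xs/ys are labeled source pixels (labels possibly noisy),
# Xt holds unlabeled target pixels; both have shape (n_pixels, n_bands).
# Xs_a, Xt_p = subspace_alignment(Xs, Xt, d=20)
# clf = KNeighborsClassifier(n_neighbors=1).fit(Xs_a, ys)   # simple stand-in classifier
# yt_pred = clf.predict(Xt_p)
```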

Original language: English
Article number: 8610142
Pages (from-to): 1135-1139
Number of pages: 5
Journal: IEEE Geoscience and Remote Sensing Letters
Volume: 16
Issue number: 7
DOIs
Publication status: Published - 1 Jul 2019
Externally published: Yes

Keywords

  • Domain adaptation (DA)
  • hyperspectral image (HSI) classification
  • low-rank representation
  • subspace alignment

ASJC Scopus subject areas

  • Geotechnical Engineering and Engineering Geology
  • Electrical and Electronic Engineering
