Classification of hyperspectral images by deep learning of spectral-spatial features

Haiyong Ding, Luming Xu, Yue Wu, Wenzhong Shi

Research output: Journal article publication › Journal article › Academic research › peer-review

18 Citations (Scopus)

Abstract

Creating accurate land use and land cover maps from remote sensing images is one of the most important applications of remotely sensed data. The abundant spectral information in hyperspectral images (HSI) makes it possible to distinguish materials that multi-spectral sensors cannot separate. Both spectral and spatial information in HSI are of primary importance for image classification. In this study, a hybrid architecture combining a stacked autoencoder (SAE) with a support vector machine (SVM) classifier was constructed to classify HSI. The SAE is built by stacking multiple autoencoder (AE) deep learning layers, each consisting of an encoder and a decoder. Spatial features extracted from a neighborhood region by principal component analysis (PCA) and texture features extracted from the gray-level co-occurrence matrix (GLCM) were fed into the classifier. The best result was obtained from the combination of GLCM texture features, PCA spatial features, and spectral features. Moreover, the representative features derived from the SAE deep learning network outperformed the original features, indicating that extracting representative features from hyperspectral images is a key step toward improving classification accuracy.
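
The pipeline summarized in the abstract can be sketched in code. The following is a minimal illustration, not the authors' implementation: the layer sizes, training settings, and the `hsi_cube`/`labels` inputs are assumptions, and the GLCM texture is reduced to a single whole-image statistic for brevity, whereas the paper extracts texture per pixel neighborhood.

```python
# Minimal sketch of a spectral-spatial HSI classification pipeline:
# PCA spatial features + GLCM texture + spectral bands -> stacked
# autoencoder (SAE) encoding -> SVM classification.
# Shapes, layer sizes, and inputs are illustrative assumptions.
import numpy as np
import tensorflow as tf
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from skimage.feature import graycomatrix, graycoprops  # scikit-image >= 0.19 spelling


def build_sae_encoder(input_dim, hidden_dims=(128, 64, 32)):
    """Stack dense AE layers; return the full autoencoder and its encoder part."""
    inputs = tf.keras.Input(shape=(input_dim,))
    x = inputs
    for h in hidden_dims:                      # encoder layers
        x = tf.keras.layers.Dense(h, activation="relu")(x)
    encoded = x
    for h in reversed(hidden_dims[:-1]):       # decoder mirrors the encoder
        x = tf.keras.layers.Dense(h, activation="relu")(x)
    outputs = tf.keras.layers.Dense(input_dim, activation="linear")(x)
    autoencoder = tf.keras.Model(inputs, outputs)
    encoder = tf.keras.Model(inputs, encoded)
    autoencoder.compile(optimizer="adam", loss="mse")
    return autoencoder, encoder


def classify_hsi(hsi_cube, labels, n_pca=4):
    """hsi_cube: (rows, cols, bands); labels: (rows, cols), 0 = unlabeled."""
    rows, cols, bands = hsi_cube.shape
    pixels = hsi_cube.reshape(-1, bands).astype("float32")

    # Spatial feature: leading principal components of the spectral bands.
    pca_feat = PCA(n_components=n_pca).fit_transform(pixels)

    # Texture feature: GLCM contrast on the first PC image.
    # (Simplified to one whole-image statistic; the paper uses neighborhoods.)
    pc1 = pca_feat[:, 0].reshape(rows, cols)
    pc1_u8 = np.uint8(255 * (pc1 - pc1.min()) / (pc1.max() - pc1.min() + 1e-9))
    glcm = graycomatrix(pc1_u8, distances=[1], angles=[0], levels=256)
    contrast = graycoprops(glcm, "contrast")[0, 0]
    glcm_feat = np.full((pixels.shape[0], 1), contrast, dtype="float32")

    # Concatenate spectral + PCA spatial + GLCM texture features.
    features = StandardScaler().fit_transform(
        np.hstack([pixels, pca_feat, glcm_feat])).astype("float32")

    # Learn a compact representation with the SAE, then classify with an SVM.
    ae, encoder = build_sae_encoder(features.shape[1])
    ae.fit(features, features, epochs=20, batch_size=256, verbose=0)
    codes = encoder.predict(features, verbose=0)

    mask = labels.reshape(-1) > 0
    clf = SVC(kernel="rbf").fit(codes[mask], labels.reshape(-1)[mask])
    return clf.predict(codes).reshape(rows, cols)
```

In this arrangement the SAE is trained in an unsupervised way on the concatenated features, and only its encoder output is passed to the SVM, mirroring the hybrid SAE-SVM design described in the abstract.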

Original language: English
Article number: 464
Journal: Arabian Journal of Geosciences
Volume: 13
Issue number: 12
DOIs
Publication status: Published - 1 Jun 2020

Keywords

  • Deep learning
  • Hyperspectral image classification
  • Spatial features
  • Stacked autoencoder

ASJC Scopus subject areas

  • Environmental Science (all)
  • Earth and Planetary Sciences (all)
