Extracting man-made objects from high spatial resolution remote sensing images via fast level set evolutions

Zhongbin Li, Wen Zhong Shi, Qunming Wang, Zelang Miao

Research output: Journal article (academic research, peer-reviewed)

22 Citations (Scopus)

Abstract

Object extraction from remote sensing images has long been an intensive research topic in the field of surveying and mapping. Most past methods are devoted to handling just one type of object, and little attention has been paid to improving computational efficiency. In recent years, level set evolution (LSE) has been shown to be very promising for object extraction in the field of image processing, because it handles topological changes automatically while achieving high accuracy. However, the application of state-of-the-art LSEs is compromised by laborious parameter tuning and expensive computation. In this paper, we propose two fast LSEs for man-made object extraction from high spatial resolution remote sensing images. We replace the traditional mean curvature-based regularization term with a Gaussian kernel, which is mathematically sound, and can therefore use a larger time step in the numerical scheme to expedite the proposed LSEs. Compared with existing methods, the proposed LSEs are significantly faster. Most importantly, they involve far fewer parameters while achieving better performance. Their advantages over other state-of-the-art approaches are verified by a range of experiments.
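The core idea in the abstract, replacing the curvature-based regularization term with Gaussian smoothing of the level set function so a larger time step can be used, can be sketched as a simplified two-phase Chan-Vese evolution. This is an illustrative reconstruction, not the authors' exact formulation; the function name and parameter values (`dt`, `sigma`, the rectangular initialization) are assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def chan_vese_gaussian_lse(image, iterations=100, dt=5.0, sigma=1.0):
    """Two-phase Chan-Vese-style level set evolution in which the usual
    mean-curvature regularization is replaced by Gaussian smoothing of
    the level set function phi, permitting a large time step dt.
    (Illustrative sketch, not the paper's exact algorithm.)"""
    image = np.asarray(image, dtype=float)
    h, w = image.shape
    # Initialize phi: positive inside a central rectangle, negative outside.
    phi = -np.ones((h, w))
    phi[h // 4 : 3 * h // 4, w // 4 : 3 * w // 4] = 1.0
    for _ in range(iterations):
        inside = phi > 0
        c1 = image[inside].mean() if inside.any() else 0.0      # mean inside
        c2 = image[~inside].mean() if (~inside).any() else 0.0  # mean outside
        # Chan-Vese region force: pushes phi up where the pixel matches c1.
        force = (image - c2) ** 2 - (image - c1) ** 2
        phi += dt * force / (np.abs(force).max() + 1e-8)
        # Gaussian kernel regularization in place of the curvature term:
        # smoothing phi keeps the contour regular without the small time
        # step that an explicit curvature scheme would require.
        phi = gaussian_filter(phi, sigma)
    return phi > 0  # boolean foreground mask
```

On a synthetic image with a bright square on a dark background this recovers the square; `sigma` controls the smoothness of the extracted boundary, playing roughly the role that the curvature weight plays in the standard formulation.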
Original language: English
Article number: 6863659
Pages (from-to): 883-899
Number of pages: 17
Journal: IEEE Transactions on Geoscience and Remote Sensing
Volume: 53
Issue number: 2
DOIs
Publication status: Published - 1 Jan 2015

Keywords

  • airport runway extraction
  • building roof extraction
  • Chan-Vese model
  • high spatial resolution
  • level set evolution (LSE)
  • man-made object extraction
  • road network extraction

ASJC Scopus subject areas

  • Electrical and Electronic Engineering
  • Earth and Planetary Sciences (all)
