Abstract
Remote sensing image fusion extracts the spatial information of the panchromatic (PAN) image to sharpen the geometric structure of a multi-spectral (MS) image. Traditional algorithms that solve the fusion problem by applying various transformations often lose spatial and spectral details. To improve the quality of the fused result, we develop a novel fusion method based on collaborative representation for multi-band remote sensing images. In the developed collaborative representation model, a spectral preservation coefficient based on the spectral contribution and the spectral-spatial dependency is designed to retain the spectral information of the low-resolution MS (LRMS) image. An intensity modulation coefficient, based on the spectrally dependent spatial difference between the PAN and MS images, is designed to adaptively recover and modulate the spatial information of the MS image. Through the proposed collaborative representation model, the LRMS, low-resolution PAN, and PAN images, together with the designed coefficients, collaboratively represent the fused image. Experimental results on various satellite datasets show that the developed method is effective and robust in enhancing pansharpening.
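The abstract describes a band-wise fusion in which the LRMS, low-resolution PAN, and PAN images are combined through a spectral preservation coefficient and an intensity modulation coefficient. The exact definitions of these coefficients are not given in the abstract, so the following Python sketch only illustrates the general idea with hypothetical placeholder coefficients (`alpha_k` as a correlation-based spectral weight, `g_k` as a band-wise modulation ratio); it is not the authors' actual model.

```python
# Minimal illustrative sketch of coefficient-weighted detail injection,
# assuming hypothetical definitions for the two coefficients described
# in the abstract. Not the paper's actual collaborative representation model.
import numpy as np

def pansharpen_sketch(lrms, pan, lrpan):
    """Fuse an upsampled LRMS cube (H, W, B) with a PAN image (H, W).

    lrpan: the PAN image degraded to MS resolution and resampled back
    to (H, W), used as the spatial reference at the MS scale.
    """
    bands = lrms.shape[2]
    fused = np.empty_like(lrms, dtype=np.float64)
    for k in range(bands):
        ms_k = lrms[..., k].astype(np.float64)
        # Hypothetical spectral preservation coefficient: correlation between
        # band k and the low-resolution PAN (a spectral-spatial dependency proxy).
        alpha_k = np.corrcoef(ms_k.ravel(), lrpan.ravel())[0, 1]
        # Hypothetical intensity modulation coefficient: band-wise ratio that
        # makes the injected spatial detail spectrally dependent.
        g_k = ms_k / (lrpan + 1e-8)
        # LRMS, LRPAN, and PAN jointly form the fused band: keep the spectral
        # base and inject modulated spatial detail from (PAN - LRPAN).
        fused[..., k] = ms_k + alpha_k * g_k * (pan - lrpan)
    return fused
```

In this sketch the spectral base `ms_k` is preserved unchanged while the high-frequency difference `pan - lrpan` is injected with a band-dependent gain, which is one common way such coefficient-driven fusion schemes are structured.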
Original language | English |
---|---|
Pages (from-to) | 23-35 |
Number of pages | 13 |
Journal | Information Fusion |
Volume | 90 |
DOIs | |
Publication status | Published - Feb 2023 |
Keywords
- Collaborative representation
- Multi-band remote sensing image
- Pansharpening
- Spectral dependency
ASJC Scopus subject areas
- Software
- Signal Processing
- Information Systems
- Hardware and Architecture