Multi-Scale Fusion of High Spatial Resolution Satellite Images Based on Steerable Gaussian Filters and Local Saliency Map
Negar Jovhari, Reza Shah-Hosseini *, Mahdi Hasanlou, Amin Sedaghat, Nazila Mohammadi, Siamand Avestan
University of Tehran
Abstract:
High-resolution remote sensing images capture intricate local structural features across a range of frequencies, which poses significant challenges for producing reliable remote sensing and photogrammetry products, particularly when fusing multispectral images with panchromatic bands. Many existing methods suffer from spectral and spatial distortions due to inadequate reconstruction of these local features, a critical obstacle for both traditional and learning-based approaches. Furthermore, discrepancies between training and test samples, together with a lack of access to reference data, can hinder the generalizability of learning-based methods. Accordingly, this paper introduces a novel two-stage approach aimed at enhancing the pixel-based fusion of satellite images, specifically in pan-sharpening applications. The first stage reconstructs the local structural features through a multiscale technique inspired by the well-established SIFT algorithm; it uses a median index map derived from steerable Gaussian filters in the spatial domain and also incorporates the sensor's Modulation Transfer Function (MTF). Following this, a binary saliency map is generated using morphological gradients and Otsu thresholding. The reconstructed image and the saliency map are subsequently employed as a guiding image and guiding filter, respectively, and are integrated with various pan-sharpening methods across both conventional and learning-based computational frameworks. The qualitative and quantitative results demonstrate a marked improvement in the performance of these methods, yielding more realistic pan-sharpened images. Notably, the learning-based LPPN algorithm outperforms the other methods and is further enhanced by the proposed approach.
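Two of the building blocks named above can be illustrated with a short sketch. This is not the authors' implementation: the function names, the 3x3 window, and the use of first-order steerable Gaussian derivatives are illustrative assumptions, and the paper's median index map and MTF matching are not reproduced here.

```python
import numpy as np
from scipy import ndimage

def steered_response(img, sigma, theta):
    """First-order steerable Gaussian derivative at angle theta.

    By the steerability property, the oriented response is a linear
    combination of the x- and y-derivative responses:
    cos(theta) * Gx + sin(theta) * Gy.
    """
    gx = ndimage.gaussian_filter(img, sigma, order=(0, 1))  # derivative along x (axis 1)
    gy = ndimage.gaussian_filter(img, sigma, order=(1, 0))  # derivative along y (axis 0)
    return np.cos(theta) * gx + np.sin(theta) * gy

def otsu_threshold(values, nbins=256):
    """Otsu's threshold: bin center that maximizes between-class variance."""
    hist, edges = np.histogram(values.ravel(), bins=nbins)
    p = hist.astype(float) / hist.sum()
    centers = 0.5 * (edges[:-1] + edges[1:])
    w0 = np.cumsum(p)                 # weight of the lower class
    w1 = 1.0 - w0                     # weight of the upper class
    m = np.cumsum(p * centers)        # unnormalized cumulative mean of lower class
    mu_t = m[-1]                      # global mean
    # sigma_b^2 = w0*w1*(mu0 - mu1)^2 rewritten to avoid 0/0 divisions
    var_between = (mu_t * w0 - m) ** 2 / np.maximum(w0 * w1, 1e-12)
    return centers[np.argmax(var_between)]

def binary_saliency_map(pan, size=3):
    """Binary saliency map: morphological gradient + Otsu binarization."""
    # morphological gradient = grayscale dilation minus erosion (edge strength)
    grad = ndimage.grey_dilation(pan, size=size) - ndimage.grey_erosion(pan, size=size)
    return (grad > otsu_threshold(grad)).astype(np.uint8)
```

On a synthetic panchromatic patch containing a bright square, `binary_saliency_map` marks the square's boundary as salient while leaving flat interior and background regions at zero, which is the behavior the guiding-filter stage relies on.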
On average, the Quality No-Reference (QNR) and Hybrid Quality No-Reference (HQNR) evaluation metrics for the selected methods increased from 0.931 to 0.943 and from 0.920 to 0.931, respectively, under the proposed approach. Additionally, the average spectral and spatial distortions decreased from 0.05 to 0.04 and from 0.0576 to 0.0483, respectively. Moreover, the average Spectral Angle Mapper (SAM) decreased from 3.5022 to 3.268, indicating reduced distortion and a more realistic fusion result.
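For context, the QNR index reported above is conventionally defined from the spectral distortion $D_\lambda$ and the spatial distortion $D_s$; with the common choice $\alpha = \beta = 1$ it reduces to their complementary product (the averages quoted here are taken across methods, so they need not reproduce this identity exactly):

```latex
\mathrm{QNR} = \left(1 - D_{\lambda}\right)^{\alpha} \left(1 - D_{s}\right)^{\beta}
```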
Keywords: remote sensing image fusion, steerable Gaussian filter, saliency map, morphological gradients, sensor MTF, learning-based LPPN
Full-Text [PDF 1740 kb]
Type of Study: Research
Subject: RS
Received: 2023/10/8 | Accepted: 2024/10/8 | ePublished ahead of print: 2024/10/29 | Published: 2024/10/29