Spectral-Temporal fusion of satellite images via an End-to-End Two-Stream attention with an effective reconstruction network

dc.contributor.author: Benzenati, Tayeb
dc.contributor.author: Kessentini, Yousri
dc.contributor.author: Kallel, Abdelaziz
dc.date.accessioned: 2023-03-15T08:44:42Z
dc.date.available: 2023-03-15T08:44:42Z
dc.date.issued: 2023
dc.description.abstract: Due to technical and budget constraints on current optical satellites, the acquisition of satellite images with the best resolutions is not practicable. In this article, aiming to produce products with high spectral (HS) and temporal resolutions, we introduce a two-stream spectral–temporal fusion technique based on an attention mechanism, called STA-Net. STA-Net combines high spectral and low temporal (HSLT) resolution images with low spectral and high temporal (LSHT) resolution images to generate products with the best characteristics. The proposed technique involves two stages. In the first one, two fused images are generated by a two-stream architecture based on residual attention blocks. The temporal difference estimator stream estimates the temporal difference between HS images at desired and neighboring dates. The reflectance difference estimator is the second stream. It predicts the reflectance difference between the input images (HS–LS) to map LS images into HS products. In the second stage, a reconstruction network combines the two-stream outputs via an effective learnable weighted-sum strategy. The two-stage model is trained in an end-to-end fashion using an effective loss function to ensure the best fusion quality. To the best of our knowledge, this work represents the first attempt to address spectral–temporal fusion using an end-to-end deep neural network model. Experimental results conducted on two real datasets of Sentinel-2 (HSLT: 10 spectral bands and long revisit period) and Planetscope (LSHT: four spectral bands and daily images) images prove the effectiveness of the proposed technique with respect to baseline techniques.
dc.identifier.issn: 1939-1404
dc.identifier.uri: DOI: 10.1109/JSTARS.2023.3234722
dc.identifier.uri: https://ieeexplore.ieee.org/document/10008044
dc.identifier.uri: https://dspace.univ-boumerdes.dz/handle/123456789/11199
dc.language.iso: en
dc.publisher: IEEE
dc.relation.ispartofseries: IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing / Vol. 16 (2023); pp. 1308-1320
dc.subject: Attention mechanism
dc.subject: Convolutional neural network (CNN)
dc.subject: Image fusion
dc.subject: Multisensor image fusion
dc.subject: Planetscope
dc.subject: Sentinel-2
dc.subject: Spectral-temporal fusion
dc.title: Spectral-Temporal fusion of satellite images via an End-to-End Two-Stream attention with an effective reconstruction network
dc.type: Article

Files

Original bundle

Name: Benzenati, Tayeb.pdf
Size: 2.54 MB
Format: Adobe Portable Document Format

License bundle

Name: license.txt
Size: 1.71 KB
Description: Item-specific license agreed upon to submission