Yuan Wang et al.
The pansharpening task has long been a focal point for researchers. However, existing methods have yet to fully exploit the modality correlation between panchromatic (PAN) and multispectral (MS) images, leading to the loss of high-frequency spatial information and to spectral distortion during fusion. In addition, they apply the same processing to information at different scales, causing that information to mix and interfere. To address these issues, we propose CIGformer, a multi-scale fusion network based on a continuous information guidance mechanism, which focuses on improving the interaction between PAN and MS. First, we introduce an Intensity Substitute Block (ISB) that decomposes the shared and unique information of PAN and MS, initializing the subsequent information guidance. Then, we construct an Information Guidance Block (IGB) based on the Transformer architecture, the core of our information guidance: it adaptively balances the retention of the unique information of PAN and MS without hindering the utilization of their shared information. Furthermore, we build a multi-level encoder-decoder bidirectional pyramid structure to mitigate the intermixing of multi-scale information, applying the IGB continuously at each encoder level to achieve optimal information utilization. Finally, we introduce a consistency loss function that evaluates the retention of unique information, assisting the training process. Experiments on the GaoFen-2 and WorldView-3 datasets show that, guided by unique information, our proposed interaction method significantly improves the utilization efficiency of PAN and MS. Our code is available at https://github.com/Xidian-AIGroup190726/CIGformer.
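As a rough illustration of the shared/unique decomposition idea behind intensity substitution, the plain-Python sketch below splits a toy MS pixel into a shared intensity (band mean, the classic IHS-style proxy) and per-modality residuals, and applies a simple L1-style consistency penalty to a unique component. The function names, the band-mean definition of intensity, and the L1 form of the loss are our own assumptions for illustration, not the paper's actual ISB or loss.

```python
def decompose(ms_pixel, pan_value):
    """Toy shared/unique split at one pixel location.

    Assumed definitions (illustration only, not the paper's ISB):
    shared = band-mean intensity of the MS pixel;
    unique = residuals after removing that intensity.
    """
    intensity = sum(ms_pixel) / len(ms_pixel)      # shared information
    ms_unique = [b - intensity for b in ms_pixel]  # per-band spectral residuals
    pan_unique = pan_value - intensity             # spatial detail residual
    return intensity, ms_unique, pan_unique


def consistency_loss(unique_before, unique_after):
    """Hypothetical L1-style penalty on how much a unique component
    changed, encouraging the fusion to retain it."""
    n = len(unique_before)
    return sum(abs(a - b) for a, b in zip(unique_after, unique_before)) / n


shared, ms_u, pan_u = decompose([0.2, 0.4, 0.6], 0.5)
# shared is the band mean (0.4); the MS residuals sum to zero by construction
loss = consistency_loss(ms_u, [-0.25, 0.05, 0.20])
```

In the actual network these components would be feature maps rather than scalars, but the same retention idea applies: the loss is small only when the fused features keep the source-specific residuals intact.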