Attention-Guided Shared Hybrid Network for Enhanced Land Cover Segmentation

Bibliographic Details
Main Authors: Yinbing Jiang, Linfeng Shi, Xinyu Fan
Format: Article
Language: English
Published: IEEE 2025-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/10819395/
Description
Summary: Accurate land cover segmentation is crucial for numerous environmental and urban planning applications. However, irregular land types and varying illumination conditions can adversely affect segmentation results. Most existing remote sensing image segmentation models prioritize lightweight design, which often leads to difficulties in identifying small objects and edge information, as well as insufficient multi-scale capability. To address these issues, we propose an innovative Attention-Guided Shared Hybrid Network (AGSHN) aimed at enhancing the precision and robustness of land cover segmentation. Our network integrates attention mechanisms with shared hybrid architectures to effectively capture spatial dependencies and contextual information. During the network fusion stage, our Dual Feature Complementary Module (DFCM) selectively emphasizes informative features while suppressing irrelevant ones. Additionally, during the decoding stage, we introduce a Multi-Scale Dual Representation Alignment Filter (MDAF) module to mitigate semantic ambiguity between shallow and deep features. Finally, a specialized auxiliary segmentation method is employed to reinforce the network's boundary representation. Extensive experiments conducted on benchmark land cover datasets, including WHDLD, GID-15, and L8SPARCS, demonstrate that AGSHN significantly outperforms existing state-of-the-art methods in segmentation accuracy.
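Note: The abstract names the DFCM fusion step but gives no internals. As a rough illustration of the kind of attention-gated fusion it describes ("selectively emphasizes informative features while suppressing irrelevant ones"), the minimal PyTorch sketch below assumes a conventional channel-attention gate over two feature branches; the class name, reduction ratio, and gating scheme are illustrative assumptions, not the paper's actual DFCM design.

    import torch
    import torch.nn as nn

    class GatedFeatureFusion(nn.Module):
        # Hypothetical stand-in for the abstract's DFCM: a channel-attention
        # gate, computed from both input branches, weights one branch and
        # suppresses the other. The published module may differ.
        def __init__(self, channels: int, reduction: int = 16):
            super().__init__()
            self.pool = nn.AdaptiveAvgPool2d(1)  # global spatial context
            self.gate = nn.Sequential(
                nn.Conv2d(2 * channels, channels // reduction, kernel_size=1),
                nn.ReLU(inplace=True),
                nn.Conv2d(channels // reduction, channels, kernel_size=1),
                nn.Sigmoid(),  # per-channel weights in (0, 1)
            )

        def forward(self, a: torch.Tensor, b: torch.Tensor) -> torch.Tensor:
            # Derive a gate from the pooled concatenation of both branches,
            # then blend them complementarily: w * a + (1 - w) * b.
            w = self.gate(self.pool(torch.cat([a, b], dim=1)))
            return w * a + (1.0 - w) * b

    # Example: fuse two 256-channel feature maps of matching spatial size.
    fuse = GatedFeatureFusion(channels=256)
    fused = fuse(torch.randn(1, 256, 64, 64), torch.randn(1, 256, 64, 64))

The complementary weighting (one gate shared by both branches) is one common way to emphasize informative channels while suppressing the rest during feature fusion.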
ISSN:2169-3536