ESLiteU²-Net: A Lightweight U²-Net for Road Extraction From High-Resolution Remote Sensing Images

Bibliographic Details
Main Authors: Rui Xu, Zhenxing Zhuang, Renzhong Mao, Yihui Yang
Format: Article
Language: English
Published: IEEE 2025-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/10975038/
Description
Summary: Extracting road information from high-resolution remote sensing images has become a research hotspot in remote sensing image processing due to its cost-effectiveness and efficiency. Current road extraction methods generally face challenges such as large parameter sizes and limited accuracy when dealing with roads at different scales. To overcome these limitations, this study proposes a novel lightweight attention network model (ESLiteU²-Net) to improve both the efficiency and the accuracy of road extraction. Based on U²-Net, the proposed model reduces complexity through a channel-reduction strategy and introduces an Efficient Spatial and Channel Attention Module (ESCA). This design enables the model to better capture and reinforce road features across both spatial and channel dimensions, yielding significant improvements in extraction accuracy and robustness while maintaining a lightweight structure. Experimental results demonstrate that ESLiteU²-Net outperforms existing methods on the CHN6-CUG and Massachusetts road datasets. Compared to U²-Net, the proposed model not only achieves superior accuracy but also reduces computational load and parameter count by 30.98% and 81.91%, respectively, achieving a balanced combination of lightweight design, efficiency, and accuracy for road extraction.
ISSN: 2169-3536
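
The record gives no implementation details beyond the abstract. Purely as an illustration of the kind of mechanism the abstract describes (attention over both channel and spatial dimensions at low parameter cost), the sketch below shows a combined channel-and-spatial attention block in PyTorch. The class name ESCASketch, the ECA-style adaptive 1-D convolution in the channel branch, and the 7×7 convolutional spatial branch are all assumptions for illustration, not the paper's actual ESCA design.

```python
import math
import torch
import torch.nn as nn


class ESCASketch(nn.Module):
    """Hypothetical sketch of an efficient spatial-and-channel attention
    block; this is NOT the paper's ESCA module, whose exact design the
    abstract does not specify."""

    def __init__(self, channels: int, gamma: int = 2, b: int = 1):
        super().__init__()
        # Channel branch (ECA-style): the 1-D kernel size adapts to the
        # channel count, so the branch stays O(C) instead of O(C^2).
        t = int(abs((math.log2(channels) + b) / gamma))
        k = t if t % 2 else t + 1
        self.avg_pool = nn.AdaptiveAvgPool2d(1)
        self.channel_conv = nn.Conv1d(1, 1, kernel_size=k,
                                      padding=k // 2, bias=False)
        # Spatial branch: a single 7x7 conv over pooled channel statistics.
        self.spatial_conv = nn.Conv2d(2, 1, kernel_size=7,
                                      padding=3, bias=False)
        self.sigmoid = nn.Sigmoid()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Channel attention: squeeze to (B, C, 1, 1), mix channels with
        # a cheap 1-D convolution, and rescale the feature maps.
        w = self.avg_pool(x)                                  # (B, C, 1, 1)
        w = self.channel_conv(w.squeeze(-1).transpose(1, 2))  # (B, 1, C)
        w = self.sigmoid(w.transpose(1, 2).unsqueeze(-1))     # (B, C, 1, 1)
        x = x * w
        # Spatial attention: channel-wise mean and max maps, fused by
        # the 7x7 conv into a single spatial weight map.
        s = torch.cat([x.mean(dim=1, keepdim=True),
                       x.amax(dim=1, keepdim=True)], dim=1)   # (B, 2, H, W)
        return x * self.sigmoid(self.spatial_conv(s))


if __name__ == "__main__":
    # Smoke test: output shape matches input shape.
    block = ESCASketch(channels=64)
    y = block(torch.randn(1, 64, 128, 128))
    print(y.shape)  # torch.Size([1, 64, 128, 128])
```

In a U²-Net-style encoder-decoder, a block like this would typically be inserted after each residual U-block (RSU) stage so that road responses are reweighted at every scale; whether the paper attaches ESCA at exactly that point is not stated in the abstract.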