Improving Road Semantic Segmentation Using Generative Adversarial Network
Format: Article
Language: English
Published: IEEE, 2021-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/9416669/
Summary: Road network extraction from remotely sensed imagery has become a powerful tool for updating geospatial databases, owing to the success of convolutional neural network (CNN) based deep learning semantic segmentation techniques combined with the high-resolution imagery that modern remote sensing provides. However, most CNN approaches cannot obtain high-precision segmentation maps with rich detail when processing high-resolution remote sensing imagery. In this study, we propose a generative adversarial network (GAN)-based deep learning approach for road segmentation from high-resolution aerial imagery. In the generative part of the presented GAN approach, we use a modified UNet model (MUNet) to obtain a high-resolution segmentation map of the road network. Combined with simple pre-processing comprising edge-preserving filtering, the proposed approach offers a significant improvement in road network segmentation compared with prior approaches. In experiments conducted on the Massachusetts road image dataset, the proposed approach achieves 91.54% precision and 92.92% recall, corresponding to a Matthews correlation coefficient (MCC) of 91.13%, a mean intersection over union (MIoU) of 87.43%, and an F1-score of 92.20%. Comparisons demonstrate that the proposed GAN framework outperforms prior CNN-based approaches and is particularly effective in preserving edge information.
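The abstract credits part of the improvement to a simple edge-preserving pre-processing step but does not name the filter used. As an illustration only, the sketch below applies a bilateral filter with OpenCV; the filter choice, parameter values, and file names are assumptions for demonstration, not the paper's stated configuration.

```python
import cv2

# Illustrative edge-preserving pre-processing: a bilateral filter smooths
# homogeneous regions of an aerial tile while keeping road edges sharp.
# (The abstract only says "edge-preserving filtering"; the specific filter
# and the parameter values below are assumptions, not the paper's setup.)
image = cv2.imread("aerial_tile.png")  # hypothetical input tile

# Arguments: source image, pixel neighborhood diameter, sigma in color
# space, sigma in coordinate space (all values illustrative).
filtered = cv2.bilateralFilter(image, 9, 75, 75)

cv2.imwrite("aerial_tile_filtered.png", filtered)
```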
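The reported F1-score can be recovered from the reported precision and recall, since F1 is their harmonic mean. A minimal check in plain Python (not from the paper):

```python
# F1 is the harmonic mean of precision and recall: F1 = 2PR / (P + R).
def f1_score(precision: float, recall: float) -> float:
    return 2 * precision * recall / (precision + recall)

# Values reported in the abstract for the Massachusetts road dataset.
print(f"{f1_score(0.9154, 0.9292):.4f}")  # 0.9222, consistent with the reported 92.20%
```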
ISSN: 2169-3536