Stone inscription image segmentation based on Stacked-UNets and GANs
| Main Authors: | , , |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Springer, 2024-10-01 |
| Series: | Discover Applied Sciences |
| Subjects: | |
| Online Access: | https://doi.org/10.1007/s42452-024-06264-8 |
| Summary: | Abstract: To overcome the challenges of effectively extracting stone inscriptions characterized by high self-similarity between foreground and background, a character image segmentation framework is proposed that integrates Stacked-UNets and Generative Adversarial Networks (GANs). Initially, a convolutional rule tailored for self-similar feature extraction is introduced to enhance image detail segmentation. Subsequently, to improve the stacked units, multi-scale character masks with a Spatial Transformer Network (STN) are added to guide character segmentation. Finally, with two GANs and corresponding datasets, the character recognition and generation capabilities of the Stacked-UNets are trained, further building its character segmentation and restoration abilities. Ultimately, edge segmentation performance is enhanced, both in extracting characters from stone inscriptions with self-similar blocks and in effectively recovering partially incomplete characters. Compared to state-of-the-art methods, the Stacked-UNets framework demonstrates improvements of 6.41%, 7.39%, 8.43%, 8.11%, 4.13%, and 2.58% across six indicators, respectively. |
| ISSN: | 3004-9261 |
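
To illustrate the general shape of the architecture the summary describes, below is a minimal, hypothetical PyTorch sketch of two stacked U-Net-style stages in which an STN-aligned coarse mask guides the second stage. This is not the authors' code: module names, layer sizes, and the mask-injection scheme are assumptions, and the adversarial (two-GAN) training described in the abstract is omitted.

```python
# Hypothetical sketch of "Stacked-UNets with STN-guided masks" (not the paper's implementation).
import torch
import torch.nn as nn
import torch.nn.functional as F


class TinyUNet(nn.Module):
    """A small encoder-decoder stand-in for one U-Net stage."""
    def __init__(self, in_ch, out_ch, base=16):
        super().__init__()
        self.enc = nn.Sequential(nn.Conv2d(in_ch, base, 3, padding=1), nn.ReLU(),
                                 nn.Conv2d(base, base, 3, stride=2, padding=1), nn.ReLU())
        self.dec = nn.Sequential(nn.ConvTranspose2d(base, base, 2, stride=2), nn.ReLU(),
                                 nn.Conv2d(base, out_ch, 3, padding=1))

    def forward(self, x):
        return self.dec(self.enc(x))


class STNMaskGuide(nn.Module):
    """Predicts an affine warp from the input image and applies it to a coarse mask."""
    def __init__(self, in_ch):
        super().__init__()
        self.loc = nn.Sequential(nn.Conv2d(in_ch, 8, 7, stride=2, padding=3), nn.ReLU(),
                                 nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                                 nn.Linear(8, 6))
        # Initialise the localisation head to the identity transform.
        self.loc[-1].weight.data.zero_()
        self.loc[-1].bias.data.copy_(torch.tensor([1, 0, 0, 0, 1, 0], dtype=torch.float))

    def forward(self, feat, mask):
        theta = self.loc(feat).view(-1, 2, 3)
        grid = F.affine_grid(theta, mask.size(), align_corners=False)
        return F.grid_sample(mask, grid, align_corners=False)


class StackedUNets(nn.Module):
    """Stage 1 proposes a coarse mask; the STN-aligned mask conditions stage 2."""
    def __init__(self):
        super().__init__()
        self.stage1 = TinyUNet(1, 1)      # grayscale inscription patch -> coarse mask
        self.guide = STNMaskGuide(in_ch=1)
        self.stage2 = TinyUNet(2, 1)      # image + aligned mask -> refined mask

    def forward(self, x):
        coarse = torch.sigmoid(self.stage1(x))
        aligned = self.guide(x, coarse)
        refined = self.stage2(torch.cat([x, aligned], dim=1))
        return coarse, refined


if __name__ == "__main__":
    model = StackedUNets()
    rubbing = torch.randn(2, 1, 64, 64)   # dummy inscription patches
    coarse, refined = model(rubbing)
    print(coarse.shape, refined.shape)    # both (2, 1, 64, 64)
```

In the paper's full pipeline, the refined mask would additionally be judged by adversarial discriminators for recognition and generation; the sketch above only shows the stacked-segmentation path.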