Downhole Coal–Rock Recognition Based on Joint Migration and Enhanced Multidimensional Full-Scale Visual Features
| Main Authors: | , , , , , , , |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | MDPI AG, 2025-05-01 |
| Series: | Applied Sciences |
| Online Access: | https://www.mdpi.com/2076-3417/15/10/5411 |
| Summary: | The accurate identification of coal and rock at the mining face is often hindered by adverse underground imaging conditions, including poor lighting and strong reflectivity. To tackle these issues, this work introduces a recognition framework specifically designed for underground environments, leveraging joint migration and enhancement of multidimensional and full-scale visual representations. A Transformer-based architecture is employed to capture global dependencies within the image and perform reflectance component denoising. Additionally, a multi-scale luminance adjustment module is integrated to merge features across perceptual ranges, mitigating localized brightness anomalies such as overexposure. The model is structured around an encoder–decoder backbone, enhanced by a full-scale connectivity mechanism, a residual attention block with dilated convolution, Res2Block elements, and a composite loss function. These components collectively support precise pixel-level segmentation of coal–rock imagery. Experimental evaluations reveal that the proposed luminance module achieves a PSNR of 21.288 dB and an SSIM of 0.783, outperforming standard enhancement methods such as RetinexNet and RRDNet. The segmentation framework achieves a MIoU of 97.99% and an MPA of 99.28%, surpassing U-Net by 2.21 and 1.53 percentage points, respectively. |
| ISSN: | 2076-3417 |
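The MIoU and MPA figures quoted in the summary follow the standard per-class definitions: IoU is TP / (TP + FP + FN) and pixel accuracy is TP / (TP + FN), each averaged over classes. A minimal sketch of how they can be computed from flattened label arrays (an illustrative helper, not code from the paper):

```python
def miou_mpa(pred, gt, num_classes):
    """Compute mean IoU and mean pixel accuracy from flat label lists."""
    # Confusion matrix: conf[true_class][predicted_class]
    conf = [[0] * num_classes for _ in range(num_classes)]
    for p, g in zip(pred, gt):
        conf[g][p] += 1

    ious, accs = [], []
    for c in range(num_classes):
        tp = conf[c][c]
        fn = sum(conf[c]) - tp                                # missed pixels of class c
        fp = sum(conf[r][c] for r in range(num_classes)) - tp  # pixels wrongly labeled c
        union = tp + fp + fn
        ious.append(tp / union if union else 0.0)
        accs.append(tp / (tp + fn) if tp + fn else 0.0)

    return sum(ious) / num_classes, sum(accs) / num_classes
```

For a binary coal/rock mask, `num_classes` would be 2 and `pred`/`gt` would be the flattened segmentation output and ground-truth mask.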