Robust coal granularity estimation via deep neural network with an image enhancement layer
| Main Authors: | , , |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Taylor & Francis Group, 2022-12-01 |
| Series: | Connection Science |
| Subjects: | |
| Online Access: | http://dx.doi.org/10.1080/09540091.2021.2015290 |
| Summary: | Accurate granularity estimation of ore images is vital for automatic geometric parameter detection and composition analysis in the ore dressing process. Machine learning based methods have been widely used for multi-scenario ore granularity estimation. However, the adhesion of coal particles in the images usually lowers segmentation accuracy, because powdery coal fills the spaces between blocky pieces, leaving the edge contrast between them indistinct. Currently, coal granularity estimation is still largely empirical. To address this problem, we propose a novel method for coal granularity estimation based on a deep neural network called Res-SSD. Then, to further improve detection performance, we propose an image enhancement layer for Res-SSD. Since the dust generated during production and transportation seriously degrades image quality, we first propose an image denoising method based on dust modelling. Second, by investigating the imaging characteristics of coal, we propose the optical balance transformation (OBT), which increases the distinguishability of coal in dark zones while also suppressing overexposed spots in images. Experimental results show that the proposed method outperforms classic and state-of-the-art methods in accuracy while achieving comparable speed. |
| ISSN: | 0954-0091 1360-0494 |
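The summary describes the OBT as brightening coal in dark zones while suppressing overexposed spots. The record does not give the actual transformation, but the described effect can be sketched as a generic tone curve: a gamma lift for shadows plus a soft roll-off above a highlight knee. The function name and all parameters (`gamma`, `knee`, `strength`) are hypothetical illustrations, not the paper's formulation.

```python
import numpy as np

def obt_sketch(img, gamma=0.6, knee=0.85, strength=0.5):
    """Illustrative tone curve only (not the paper's OBT):
    lift dark zones with gamma < 1, then softly compress
    values above `knee` to suppress overexposed spots."""
    x = img.astype(np.float64) / 255.0        # normalise to [0, 1]
    y = np.power(x, gamma)                    # brighten shadows
    over = y > knee                           # pixels near overexposure
    # soft roll-off: shrink the headroom above the knee
    y[over] = knee + (y[over] - knee) * (1.0 - strength)
    return np.clip(y * 255.0, 0.0, 255.0).astype(np.uint8)
```

With these example settings, a dark pixel (e.g. intensity 30) is raised while a near-saturated pixel (e.g. 250) is pulled down, matching the two effects the abstract attributes to the OBT.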