Binocular Video-Based Automatic Pixel-Level Crack Detection and Quantification Using Deep Convolutional Neural Networks for Concrete Structures

Bibliographic Details
Main Authors: Liqu Liu, Bo Shen, Shuchen Huang, Runlin Liu, Weizhang Liao, Bin Wang, Shuo Diao
Format: Article
Language: English
Published: MDPI AG 2025-01-01
Series: Buildings
Subjects:
Online Access: https://www.mdpi.com/2075-5309/15/2/258
Description
Summary: Crack detection and quantification play crucial roles in assessing the condition of concrete structures. Herein, a novel real-time crack detection and quantification method that leverages binocular vision and a lightweight deep learning model is proposed. The proposed method comprises four modules: a lightweight classification algorithm, a high-precision segmentation algorithm, the semi-global block matching (SGBM) algorithm, and a crack quantification technique. Based on the crack segmentation results, a framework is developed for pixel-level quantitative analysis of the major geometric parameters, including crack length, crack width, and crack angle of orientation. Results indicate that incorporating channel attention and spatial attention mechanisms into the MBConv module increases the detection accuracy of the improved EfficientNetV2 by 1.6% compared with the original EfficientNetV2, and that the proposed quantification method achieves low quantification errors of 2%, 4.5%, and 4% for crack length, width, and angle of orientation, respectively. The proposed method can support crack detection and quantification in practice when deployed on smart devices.
ISSN: 2075-5309
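
The abstract's two quantitative claims rest on fairly standard building blocks, sketched below for orientation. First, "channel attention and spatial attention mechanisms in the MBConv module" can be read as a CBAM-style refinement of the block's output feature map. The record does not include the authors' exact layer layout, so the PyTorch sketch below is only one plausible arrangement; the class names (ChannelAttention, SpatialAttention, AttentiveMBConvTail) and the reduction ratio are illustrative assumptions.

```python
# A minimal sketch, assuming a CBAM-style arrangement of channel attention
# followed by spatial attention on an MBConv output; not the authors' code.
import torch
import torch.nn as nn


class ChannelAttention(nn.Module):
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        # Shared MLP applied to global-average and global-max pooled features.
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        avg = self.mlp(x.mean(dim=(2, 3)))   # average-pooled branch
        mx = self.mlp(x.amax(dim=(2, 3)))    # max-pooled branch
        return x * torch.sigmoid(avg + mx).view(b, c, 1, 1)


class SpatialAttention(nn.Module):
    def __init__(self, kernel_size: int = 7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x):
        avg = x.mean(dim=1, keepdim=True)    # channel-averaged map
        mx = x.amax(dim=1, keepdim=True)     # channel-max map
        return x * torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))


class AttentiveMBConvTail(nn.Module):
    """Channel attention followed by spatial attention on an MBConv output."""

    def __init__(self, channels: int):
        super().__init__()
        self.ca = ChannelAttention(channels)
        self.sa = SpatialAttention()

    def forward(self, x):
        return self.sa(self.ca(x))
```

Second, the SGBM and quantification modules can be approximated with off-the-shelf OpenCV, SciPy, and scikit-image primitives: SGBM produces a disparity map from the rectified binocular pair, while length, width, and orientation angle follow from the skeleton, the distance transform, and a fitted line over the binary crack mask produced by the segmentation network. The SGBM parameter values, the mm_per_pixel scale (which in a binocular setup would follow from depth Z = fB/d), and the helper names below are assumptions rather than the paper's implementation.

```python
# A minimal sketch of disparity estimation and pixel-level crack geometry;
# parameter values and function names are illustrative assumptions.
import cv2
import numpy as np
from scipy.ndimage import distance_transform_edt
from skimage.morphology import skeletonize


def disparity_map(left_gray: np.ndarray, right_gray: np.ndarray) -> np.ndarray:
    sgbm = cv2.StereoSGBM_create(
        minDisparity=0, numDisparities=64, blockSize=5,
        P1=8 * 5 * 5, P2=32 * 5 * 5, uniquenessRatio=10,
    )
    # OpenCV returns fixed-point disparities scaled by 16.
    return sgbm.compute(left_gray, right_gray).astype(np.float32) / 16.0


def crack_geometry(mask: np.ndarray, mm_per_pixel: float):
    """Length, mean width, and orientation angle of a binary crack mask."""
    skeleton = skeletonize(mask > 0)                # one-pixel-wide centerline
    length_px = float(skeleton.sum())               # rough length in pixels
    dist = distance_transform_edt(mask > 0)         # half-width at each pixel
    width_px = 2.0 * float(dist[skeleton].mean())   # mean width along centerline

    ys, xs = np.nonzero(skeleton)
    pts = np.column_stack([xs, ys]).astype(np.float32)
    vx, vy, _, _ = cv2.fitLine(pts, cv2.DIST_L2, 0, 0.01, 0.01).ravel()
    angle_deg = float(np.degrees(np.arctan2(vy, vx))) % 180.0

    return length_px * mm_per_pixel, width_px * mm_per_pixel, angle_deg
```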