IMAGE PROCESSING FOR STRUCTURAL HEALTH: YOLO-ENABLED MACHINE LEARNING TECHNIQUES
Mitali Chaudhary
Abstract
Object detection plays a vital role in structural health monitoring (SHM), which safeguards the safety and long-term performance of infrastructure. Conventional inspection methods, though widely practiced, are limited by their reliance on human expertise, extended assessment times, and the possibility of overlooking fine defects. Recent advances in computer vision and deep learning have created opportunities to automate these processes, with the YOLO (You Only Look Once) family of object detection models gaining particular attention for real-time inspection tasks. Unlike multi-stage detectors that separate localization and classification into distinct phases, YOLO employs a unified single-stage framework, making it both fast and accurate. This capability is especially beneficial when evaluating the condition of large structural elements such as beams, columns, and slabs under practical field conditions.

Over its evolution, YOLO has consistently introduced architectural improvements that enhance detection robustness, efficiency, and adaptability to complex inspection environments. These developments include improved backbone networks, refined loss functions, and optimized training procedures, which collectively strengthen performance under varied lighting and background conditions. In this study, a comparative analysis of YOLOv12n, YOLOv12s, and YOLOv12m is carried out to evaluate their ability to detect structural defects. Each variant offers a different balance of computational load and accuracy, so identifying the most effective model for engineering practice is essential. The outcomes are expected to inform the choice of suitable YOLO variants for automated SHM systems, contributing to more reliable inspections, reduced maintenance costs, and enhanced infrastructure resilience.
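As a concrete illustration of the comparison described above, the following sketch shows how the three YOLOv12 variants could be fine-tuned and evaluated with the Ultralytics Python API. This is not the paper's code: the dataset configuration file defects.yaml, the training settings, and the use of mAP@0.5 as the comparison metric are illustrative assumptions.

# Minimal sketch: fine-tune and evaluate each YOLOv12 variant on a defect dataset.
# "defects.yaml" is a hypothetical dataset config; epochs and imgsz are illustrative.
from ultralytics import YOLO

for weights in ("yolo12n.pt", "yolo12s.pt", "yolo12m.pt"):
    model = YOLO(weights)                                     # load pretrained weights
    model.train(data="defects.yaml", epochs=100, imgsz=640)   # fine-tune on defect images
    metrics = model.val()                                     # evaluate on the validation split
    print(weights, f"mAP@0.5 = {metrics.box.map50:.3f}")      # compare accuracy across variants

In practice, the comparison would also weigh inference speed and model size against these accuracy figures, since the n, s, and m variants trade computational load for detection quality.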
Copyright
Copyright © 2025 Mitali Chaudhary. This is an open access article distributed under the Creative Commons Attribution License.