Increasing Neural-Based Pedestrian Detectors’ Robustness to Adversarial Patch Attacks Using Anomaly Localization
| Main Authors: | Olga Ilina; Maxim Tereshonok; Vadim Ziyadinov |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | MDPI AG, 2025-01-01 |
| Series: | Journal of Imaging |
| Subjects: | adversarial patch attack; robustness; pedestrian detection; deep convolutional neural network |
| Online Access: | https://www.mdpi.com/2313-433X/11/1/26 |
author | Olga Ilina; Maxim Tereshonok; Vadim Ziyadinov |
collection | DOAJ |
description | Object detection in images is a fundamental component of many safety-critical systems, such as autonomous driving, video surveillance, and robotics. Adversarial patch attacks, which are easy to implement in the real world, effectively disrupt object detection by state-of-the-art neural-based detectors, posing a serious danger in many fields of activity. Existing defense methods against patch attacks are insufficiently effective, which underlines the need for new, reliable solutions. In this manuscript, we propose a method that increases the robustness of neural network systems to adversarial input images. The proposed method consists of a Deep Convolutional Neural Network to reconstruct a benign image from the adversarial one; a Calculating Maximum Error block to highlight the mismatches between the input and reconstructed images; a Localizing Anomalous Fragments block to extract anomalous regions by applying the Isolation Forest algorithm to histograms of image fragments; and a Clustering and Processing block to group and evaluate the extracted anomalous regions. The proposed method, based on anomaly localization, demonstrates high resistance to adversarial patch attacks while maintaining high object detection quality. The experimental results show that the proposed method is effective in defending against adversarial patch attacks: using the YOLOv3 detector with the proposed defense for pedestrian detection on the INRIAPerson dataset under adversarial attack, the mAP50 metric reaches 80.97%, compared to 46.79% without a defense. These results demonstrate that the proposed method is promising for improving the security of object detection systems. |
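The pipeline sketched in the abstract — compute a maximum-error map between the input image and its reconstruction, histogram fixed-size fragments of that map, and flag outlier fragments with Isolation Forest — can be illustrated as follows. This is a minimal sketch of the general idea, not the paper's implementation: the fragment size, histogram bin count, value range, and contamination rate are illustrative assumptions, and the reconstruction network itself is stubbed out with a precomputed array.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

def max_error_map(image, reconstruction):
    """Per-pixel maximum absolute error across color channels."""
    diff = np.abs(image.astype(float) - reconstruction.astype(float))
    return diff.max(axis=-1)

def fragment_histograms(error_map, frag=32, bins=16):
    """Split the error map into frag x frag tiles and histogram each tile."""
    h, w = error_map.shape
    feats, coords = [], []
    for y in range(0, h - frag + 1, frag):
        for x in range(0, w - frag + 1, frag):
            tile = error_map[y:y + frag, x:x + frag]
            hist, _ = np.histogram(tile, bins=bins, range=(0, 255), density=True)
            feats.append(hist)
            coords.append((y, x))
    return np.array(feats), coords

def localize_anomalies(image, reconstruction, contamination=0.05):
    """Return top-left coordinates of tiles flagged as outliers."""
    feats, coords = fragment_histograms(max_error_map(image, reconstruction))
    labels = IsolationForest(contamination=contamination,
                             random_state=0).fit_predict(feats)
    return [c for c, lab in zip(coords, labels) if lab == -1]  # -1 = outlier

# Toy usage: the "reconstruction" matches the input everywhere except one
# region, simulating a large reconstruction mismatch under an adversarial patch.
rng = np.random.default_rng(0)
reconstruction = rng.integers(0, 20, (128, 128, 3))
attacked = reconstruction.copy()
attacked[32:64, 32:64] += 200  # high-error square at tile (32, 32)
anomalies = localize_anomalies(attacked, reconstruction)
```

In this toy setup the fragment whose error histogram differs from all the others is the one covering the simulated patch, so `anomalies` contains the coordinate `(32, 32)`. A real deployment would mask or inpaint the flagged regions before passing the image to the detector.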
format | Article |
id | doaj-art-61005d9ff6ef484c898a153beb0efb5d |
institution | Kabale University |
issn | 2313-433X |
language | English |
publishDate | 2025-01-01 |
publisher | MDPI AG |
record_format | Article |
series | Journal of Imaging |
doi | 10.3390/jimaging11010026 |
author_affiliation | Science and Research Department, Moscow Technical University of Communications and Informatics, 111024 Moscow, Russia (all three authors) |
title | Increasing Neural-Based Pedestrian Detectors’ Robustness to Adversarial Patch Attacks Using Anomaly Localization |
topic | adversarial patch attack; robustness; pedestrian detection; deep convolutional neural network |
url | https://www.mdpi.com/2313-433X/11/1/26 |