UAV target tracking method based on global feature interaction and anchor-frame-free perceptual feature modulation.



Bibliographic Details
Main Authors: Yuanhong Dan, Jinyan Li, Yu Jin, Yong Ji, Zhihao Wang, Dong Cheng
Format: Article
Language: English
Published: Public Library of Science (PLoS) 2025-01-01
Series: PLoS ONE
Online Access: https://doi.org/10.1371/journal.pone.0314485
Description
Summary: Target tracking from the UAV perspective uses onboard cameras to capture video streams and to identify and track specific targets in real time. Deep-learning UAV trackers based on the Siamese family have achieved significant results but still struggle to reconcile accuracy with speed. In this study, to refine the feature representation and reduce computational cost, we perform feature fusion within the depth-wise cross-correlation operation and introduce a global attention mechanism that enlarges the model's receptive field and sharpens feature refinement, improving tracking performance on small targets. In addition, we design an anchor-frame-free perceptual feature modulation mechanism that reduces computation and generates high-quality candidate boxes, while optimizing bounding-box refinement to better adapt to target deformation and motion. Comparison experiments against several popular algorithms on UAV tracking datasets such as UAV123@10fps, UAV20L, and DTB70 show that the algorithm balances speed and accuracy. To verify its reliability, we built a physical experimental environment on the Jetson Orin Nano platform and achieved a real-time processing speed of 30 frames per second.
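The depth-wise cross-correlation mentioned in the abstract is the core matching step of Siamese trackers: each channel of the template (exemplar) features is slid over the corresponding channel of the search-region features, yielding one response map per channel. A minimal pure-Python sketch of that operation follows; the shapes and values are illustrative only and are not taken from the paper, which additionally fuses features and applies global attention around this step.

```python
# Depth-wise cross-correlation sketch: per-channel sliding dot product of a
# small template feature map over a larger search-region feature map.
# Feature maps are plain nested lists [channel][row][col] for self-containment.

def depthwise_xcorr(search, template):
    """search: [C][H][W], template: [C][h][w] -> response: [C][H-h+1][W-w+1]."""
    C = len(search)
    H, W = len(search[0]), len(search[0][0])
    h, w = len(template[0]), len(template[0][0])
    out = []
    for c in range(C):  # channels are correlated independently (depth-wise)
        resp = []
        for i in range(H - h + 1):
            row = []
            for j in range(W - w + 1):
                s = 0.0
                for di in range(h):
                    for dj in range(w):
                        s += search[c][i + di][j + dj] * template[c][di][dj]
                row.append(s)
            resp.append(row)
        out.append(resp)
    return out

# Toy example: 1 channel, 3x3 search region, 2x2 diagonal template.
search = [[[1, 0, 0],
           [0, 1, 0],
           [0, 0, 1]]]
template = [[[1, 0],
             [0, 1]]]
print(depthwise_xcorr(search, template))  # peaks where the pattern matches
```

In practice this is implemented as a grouped convolution on the GPU (e.g. a convolution with one group per channel), which is what makes the operation cheap enough for real-time tracking on embedded hardware such as the Jetson Orin Nano.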
ISSN: 1932-6203