The Attention Mechanism Performance Analysis for Football Players Using the Internet of Things and Deep Learning
Main Author:
Format: Article
Language: English
Published: IEEE, 2024-01-01
Series: IEEE Access
Subjects:
Online Access: https://ieeexplore.ieee.org/document/10380550/
Summary: This work proposes a novel Class Aware Network (CANet) for analyzing football player performance by decoding body movements. First, the role of the Internet of Things in football sports analysis and the advantages of deep learning techniques are introduced. Second, pyramid pooling modules and attention mechanisms are incorporated (a minimal sketch of these components follows this record). Moreover, a Group-split-bottleneck (GS-bt) module is employed, and CANet is designed to extract and exploit multi-scale feature information and to enhance the network's ability to perceive fine details. Finally, the effectiveness of the proposed model is validated through comparisons with other models. In image classification experiments, the mean accuracy of the GS-bt module is at least 2.79% higher than that of the other models. In human body parsing experiments, results on two different datasets show that CANet achieves the highest mean Intersection over Union, improving on the other models by at least 6.02%. These findings indicate that the proposed CANet model performs better on image classification and human body parsing tasks, with higher accuracy and stronger generalization. This work provides new methods and technologies for analyzing football player performance and may promote their development and application in athletics.
ISSN: 2169-3536
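
The record does not include implementation details, so the following is a minimal PyTorch sketch, assuming a PSPNet-style pyramid pooling module followed by an SE-style channel-attention gate. All class names, bin sizes, and the reduction ratio are illustrative assumptions, not the paper's CANet or GS-bt implementation.

```python
# Illustrative sketch only: the paper's CANet internals are not published in
# this record, so the module wiring and hyperparameters below are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class PyramidPooling(nn.Module):
    """PSPNet-style pyramid pooling: pool the feature map to several grid
    sizes, project each pooled map with a 1x1 conv, upsample back to the
    input resolution, and concatenate with the input for multi-scale context."""

    def __init__(self, in_ch, bins=(1, 2, 3, 6)):
        super().__init__()
        branch_ch = in_ch // len(bins)  # channels per pyramid branch
        self.branches = nn.ModuleList(
            nn.Sequential(
                nn.AdaptiveAvgPool2d(b),
                nn.Conv2d(in_ch, branch_ch, kernel_size=1, bias=False),
                nn.BatchNorm2d(branch_ch),
                nn.ReLU(inplace=True),
            )
            for b in bins
        )
        self.out_ch = in_ch + branch_ch * len(bins)

    def forward(self, x):
        h, w = x.shape[2:]
        feats = [x]
        for branch in self.branches:
            y = branch(x)
            feats.append(
                F.interpolate(y, size=(h, w), mode="bilinear", align_corners=False)
            )
        return torch.cat(feats, dim=1)


class ChannelAttention(nn.Module):
    """SE-style channel attention: squeeze spatial dims to per-channel
    statistics, then gate channels with a learned sigmoid weighting."""

    def __init__(self, ch, reduction=16):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(ch, ch // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(ch // reduction, ch),
            nn.Sigmoid(),
        )

    def forward(self, x):
        n, c, _, _ = x.shape
        w = self.fc(x.mean(dim=(2, 3)))  # squeeze: (N, C) channel statistics
        return x * w.view(n, c, 1, 1)    # excite: rescale each channel


if __name__ == "__main__":
    x = torch.randn(1, 256, 32, 32)
    ppm = PyramidPooling(256)
    att = ChannelAttention(ppm.out_ch)
    print(att(ppm(x)).shape)  # torch.Size([1, 512, 32, 32])
```

Concatenating the pooled branches with the unpooled input keeps full-resolution features alongside coarser context, which is one common way to realize the abstract's stated goal of multi-scale feature extraction while preserving the network's perception of detail.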