A Dynamic Branch Automatic Modulation Recognition Method Driven by Heterogeneous Data
| Main Authors: | , , |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | IEEE, 2025-01-01 |
| Series: | IEEE Access |
| Online Access: | https://ieeexplore.ieee.org/document/11119632/ |
| Summary: | To address the insufficient mining of feature complementarity and the limited recognition accuracy of end-to-end deep learning models in complex channel environments, this paper proposes a dynamic branch automatic modulation recognition method driven by heterogeneous data. A multi-modal parallel feature extraction architecture learns time-frequency synergistic features, enhancing the discriminative representation of modulation characteristics. Specifically, data preprocessing intensifies the phase information in the original I/Q data, and a sparse multi-scale convolutional module strengthens spatial feature extraction. An LSTM network captures the time-dependent interactions among three channels: I/Q, I, and Q. A residual-based recursive encoder maps hierarchical features of the Amplitude-Phase (AP) data, so that spatiotemporal characteristics are extracted jointly. Modulation classification is performed by a fully connected layer. Experiments on the RadioML2016.10A dataset demonstrate superior performance: the proposed model achieves 1%–13.5% higher average accuracy than mainstream models at 0–18 dB SNR, with peak accuracy reaching 94.68% at 14 dB. By fusing multi-modal features driven by heterogeneous data, the method improves robustness in complex channels and offers new insights for intelligent communication signal demodulation. (A minimal architecture sketch based on this abstract follows the record below.) |
| ISSN: | 2169-3536 |
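
The abstract describes a three-branch design: a sparse multi-scale convolutional branch over raw I/Q frames, an LSTM branch over three channels (I/Q, I, Q), and a residual recursive encoder over AP data, fused into a fully connected classifier. The PyTorch sketch below illustrates one plausible reading of that description under explicit assumptions: all layer sizes are invented, "sparse multi-scale" is approximated with dilated convolutions, the combined I/Q channel for the LSTM is approximated by the product I·Q, and the 2 × 128 frame shape and 11 classes follow the RadioML2016.10A convention. It is not the paper's confirmed configuration.

```python
# Hedged sketch of the dynamic-branch AMR architecture described in the abstract.
# All dimensions, the dilated-conv reading of "sparse multi-scale", and the
# combined I/Q channel construction are assumptions for illustration.
import torch
import torch.nn as nn


def iq_to_ap(iq: torch.Tensor) -> torch.Tensor:
    """Map I/Q frames (batch, 2, T) to Amplitude-Phase frames (batch, 2, T)."""
    i, q = iq[:, 0], iq[:, 1]
    amplitude = torch.sqrt(i ** 2 + q ** 2)
    phase = torch.atan2(q, i)  # emphasises phase information, per the abstract
    return torch.stack([amplitude, phase], dim=1)


class MultiScaleConv(nn.Module):
    """Parallel dilated 1-D convolutions; dilation stands in for 'sparse multi-scale'."""

    def __init__(self, in_ch: int, out_ch: int):
        super().__init__()
        self.branches = nn.ModuleList(
            nn.Conv1d(in_ch, out_ch, kernel_size=3, dilation=d, padding=d)
            for d in (1, 2, 4)  # three receptive-field scales
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.relu(torch.cat([b(x) for b in self.branches], dim=1))


class ResidualEncoder(nn.Module):
    """Stacked residual conv blocks applied to the AP stream."""

    def __init__(self, ch: int, depth: int = 2):
        super().__init__()
        self.blocks = nn.ModuleList(
            nn.Conv1d(ch, ch, kernel_size=3, padding=1) for _ in range(depth)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        for block in self.blocks:
            x = torch.relu(x + block(x))  # residual (skip) connection
        return x


class DynamicBranchAMR(nn.Module):
    def __init__(self, n_classes: int = 11, hidden: int = 64):
        super().__init__()
        self.conv_branch = MultiScaleConv(2, 32)  # spatial features from raw I/Q
        self.lstm = nn.LSTM(input_size=3, hidden_size=hidden, batch_first=True)
        self.ap_proj = nn.Conv1d(2, 32, kernel_size=1)
        self.ap_encoder = ResidualEncoder(32)
        self.pool = nn.AdaptiveAvgPool1d(1)
        self.classifier = nn.Linear(3 * 32 + hidden + 32, n_classes)

    def forward(self, iq: torch.Tensor) -> torch.Tensor:
        # iq: (batch, 2, T), channel 0 = I, channel 1 = Q
        conv_feat = self.pool(self.conv_branch(iq)).squeeze(-1)  # (B, 96)

        # Per-step features for the LSTM: I, Q, and a combined I/Q term.
        # The product I*Q is a stand-in; the abstract does not specify
        # how the combined channel is built.
        seq = torch.stack([iq[:, 0], iq[:, 1], iq[:, 0] * iq[:, 1]], dim=-1)
        _, (h, _) = self.lstm(seq)
        lstm_feat = h[-1]  # final hidden state, (B, hidden)

        ap = iq_to_ap(iq)  # AP preprocessing of the raw I/Q frame
        ap_feat = self.pool(self.ap_encoder(self.ap_proj(ap))).squeeze(-1)  # (B, 32)

        fused = torch.cat([conv_feat, lstm_feat, ap_feat], dim=1)
        return self.classifier(fused)  # class logits


if __name__ == "__main__":
    model = DynamicBranchAMR()
    frames = torch.randn(4, 2, 128)  # RadioML2016.10A frames are 2 x 128
    print(model(frames).shape)       # torch.Size([4, 11])
```

The fusion step simply concatenates the pooled branch outputs before the classifier; the "dynamic branch" weighting implied by the title would require details beyond this abstract.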