Transformers in EEG Analysis: A Review of Architectures and Applications in Motor Imagery, Seizure, and Emotion Classification
| Main Authors: | |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | MDPI AG, 2025-02-01 |
| Series: | Sensors |
| Subjects: | |
| Online Access: | https://www.mdpi.com/1424-8220/25/5/1293 |
| Summary: | Transformers have rapidly influenced research across various domains. With their superior capability to encode long sequences, they have demonstrated exceptional performance, outperforming existing machine learning methods. There has been a rapid increase in the development of transformer-based models for EEG analysis. The high volume of recently published papers highlights the need for further studies exploring transformer architectures, key components, and models employed particularly in EEG studies. This paper aims to explore four major transformer architectures: Time Series Transformer, Vision Transformer, Graph Attention Transformer, and hybrid models, along with their variants in recent EEG analysis. We categorize transformer-based EEG studies according to the most frequent applications in motor imagery classification, emotion recognition, and seizure detection. This paper also highlights the challenges of applying transformers to EEG datasets and reviews data augmentation and transfer learning as potential solutions explored in recent years. Finally, we provide a summarized comparison of the most recent reported results. We hope this paper serves as a roadmap for researchers interested in employing transformer architectures in EEG analysis. |
| ISSN: | 1424-8220 |