Beyond averaging: A transformer approach to decoding event related brain potentials

The objective of this study is to assess the potential of a transformer-based deep learning approach applied to event-related brain potentials (ERPs) derived from electroencephalographic (EEG) data. Traditional methods average the EEG signal over many trials to extract the neural signal from the high noise content of EEG data; however, this averaging may conceal relevant information. Our investigation focuses on whether a transformer-based deep learning approach, specifically its attention maps, an essential component of transformer networks, can provide deeper insights into ERP data than traditional averaging-based analyses.

We investigated data from an experiment on loudness perception in which 29 normal-hearing participants aged between 18 and 30 years were presented with acoustic stimuli at five sound levels between 65 and 95 dB and rated each stimulus as either "too loud" or "not too loud". EEG signals were recorded during sound presentation.

A convolutional transformer was trained to categorize the EEG data into the two classes ("not too loud" and "too loud"). The classifier achieved over 86% accuracy and an area under the curve (AUC) of up to 0.95.

Attention maps generated from the trained networks provided insight into the time windows relevant for classification within the EEG data. Above all, the attention maps focused on the window around 150 to 200 ms, where the averaging-based analysis did not indicate relevant potentials. These attention maps offered new perspectives on the ERPs and demonstrate their potential as a tool for deeper analysis of event-related potentials.
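
As a rough illustration of the pipeline described above, the following Python sketch (a hypothetical example, not the authors' implementation) contrasts classical trial averaging with a minimal convolutional transformer whose self-attention weights can be read out as an attention map over time. Channel counts, layer sizes, and data shapes are assumptions; only the two-class setup ("not too loud" vs. "too loud") follows the abstract.

# Hypothetical sketch: classical ERP averaging vs. a single-trial
# convolutional-transformer classifier that exposes its attention map.
import numpy as np
import torch
import torch.nn as nn

def erp_average(epochs: np.ndarray) -> np.ndarray:
    """epochs: (n_trials, n_channels, n_samples) -> averaged ERP (n_channels, n_samples)."""
    return epochs.mean(axis=0)

class ConvTransformer(nn.Module):
    def __init__(self, n_channels: int = 32, d_model: int = 64, n_classes: int = 2):
        super().__init__()
        # A temporal convolution turns the multichannel epoch into a sequence
        # of d_model-dimensional tokens, one token per time patch.
        self.embed = nn.Conv1d(n_channels, d_model, kernel_size=16, stride=8)
        self.attn = nn.MultiheadAttention(d_model, num_heads=4, batch_first=True)
        self.norm = nn.LayerNorm(d_model)
        self.head = nn.Linear(d_model, n_classes)

    def forward(self, x: torch.Tensor):
        # x: (batch, n_channels, n_samples)
        tokens = self.embed(x).transpose(1, 2)                  # (batch, n_tokens, d_model)
        attended, attn_map = self.attn(tokens, tokens, tokens,  # attn_map: (batch, n_tokens, n_tokens)
                                       need_weights=True)
        tokens = self.norm(tokens + attended)
        logits = self.head(tokens.mean(dim=1))                  # pool over time, then classify
        return logits, attn_map

# Usage on synthetic data shaped like EEG epochs (trials x channels x samples).
epochs = np.random.randn(128, 32, 512).astype("float32")
erp = erp_average(epochs)                                       # classical averaged ERP
model = ConvTransformer()
logits, attn_map = model(torch.from_numpy(epochs[:8]))          # single-trial predictions
time_relevance = attn_map.mean(dim=(0, 1))                      # rough relevance per time patch

Because each token corresponds to a fixed time patch of the epoch, averaged attention weights of this kind can be mapped back to latency windows (for example, 150 to 200 ms after stimulus onset), which is the sense in which attention maps are used in the abstract.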

Bibliographic Details
Main Authors: Philipp Zelger, Manuel Arnold, Sonja Rossi, Josef Seebacher, Franz Muigg, Simone Graf, Antonio Rodríguez-Sánchez
Format: Article
Language: English
Published: Elsevier, 2025-03-01
Series: NeuroImage, Vol. 308, Article 121049
ISSN: 1095-9572
Subjects: Event related potentials; EEG; Deep learning; Transformer; Loudness perception
Online Access:http://www.sciencedirect.com/science/article/pii/S1053811925000515

Author affiliations:
Philipp Zelger: University Hospital for Hearing, Speech & Voice Disorders, Medical University of Innsbruck, Anichstrasse 35, Innsbruck, 6020, Austria; ICONE – Innsbruck Cognitive Neuroscience, Medical University of Innsbruck, Anichstrasse 35, Innsbruck, 6020, Austria
Manuel Arnold: Department of Computer Science, University of Innsbruck, Technikerstrasse 21a, Innsbruck, 6020, Austria
Sonja Rossi (corresponding author): University Hospital for Hearing, Speech & Voice Disorders, Medical University of Innsbruck, Anichstrasse 35, Innsbruck, 6020, Austria; ICONE – Innsbruck Cognitive Neuroscience, Medical University of Innsbruck, Anichstrasse 35, Innsbruck, 6020, Austria
Josef Seebacher: University Hospital for Hearing, Speech & Voice Disorders, Medical University of Innsbruck, Anichstrasse 35, Innsbruck, 6020, Austria
Franz Muigg: University Hospital for Hearing, Speech & Voice Disorders, Medical University of Innsbruck, Anichstrasse 35, Innsbruck, 6020, Austria
Simone Graf: University Hospital for Hearing, Speech & Voice Disorders, Medical University of Innsbruck, Anichstrasse 35, Innsbruck, 6020, Austria
Antonio Rodríguez-Sánchez: Department of Computer Science, University of Innsbruck, Technikerstrasse 21a, Innsbruck, 6020, Austria