EEG-to-EEG: Scalp-to-Intracranial EEG Translation Using a Combination of Variational Autoencoder and Generative Adversarial Networks


Bibliographic Details
Main Authors: Bahman Abdi-Sargezeh, Sepehr Shirani, Antonio Valentin, Gonzalo Alarcon, Saeid Sanei
Format: Article
Language:English
Published: MDPI AG 2025-01-01
Series:Sensors
Subjects:
Online Access:https://www.mdpi.com/1424-8220/25/2/494
Description
Summary:A generative adversarial network (GAN) makes it possible to map a data sample from one domain to another. It has been extensively employed in image-to-image and text-to-image translation. We propose an EEG-to-EEG translation model to map scalp-mounted EEG (scEEG) sensor signals to intracranial EEG (iEEG) sensor signals recorded by foramen ovale sensors inserted into the brain. The model is based on a GAN structure in which a conditional GAN (cGAN) is combined with a variational autoencoder (VAE), named VAE-cGAN. scEEG sensors are plagued by noise and suffer from low resolution, whereas iEEG sensor recordings enjoy high resolution. Here, we consider the task of mapping scEEG sensor information to iEEG sensors to enhance the scEEG resolution. In this study, our EEG data contain epileptic interictal epileptiform discharges (IEDs), whose identification is crucial in clinical practice. The proposed VAE-cGAN is first employed to map the scEEG to iEEG; the IEDs are then detected from the resulting iEEG. Our model achieves a classification accuracy of 76%, an increase of 11%, 8%, and 3%, respectively, over the previously proposed least-square regression, asymmetric autoencoder, and asymmetric–symmetric autoencoder mapping models.
ISSN:1424-8220