Scale-Dependent Signal Identification in Low-Dimensional Subspace: Motor Imagery Task Classification

Bibliographic Details
Main Authors: Qingshan She, Haitao Gan, Yuliang Ma, Zhizeng Luo, Tom Potter, Yingchun Zhang
Format: Article
Language: English
Published: Wiley 2016-01-01
Series: Neural Plasticity
Online Access: http://dx.doi.org/10.1155/2016/7431012
Description
Summary: Motor imagery electroencephalography (EEG) has been successfully used in locomotor rehabilitation programs. While the noise-assisted multivariate empirical mode decomposition (NA-MEMD) algorithm has been utilized to extract task-specific frequency bands from all channels at the same scale as the intrinsic mode functions (IMFs), identifying and extracting the specific IMFs that contain significant information remain difficult. In this paper, a novel method has been developed to identify the information-bearing components in a low-dimensional subspace without prior knowledge. Our method trains a Gaussian mixture model (GMM) on the composite data, which comprises the IMFs from both the original signal and noise, after employing kernel spectral regression to reduce the dimension of the composite data. The informative IMFs are then discriminated using a GMM clustering algorithm; the common spatial pattern (CSP) approach is exploited to extract the task-related features from the reconstructed signals; and a support vector machine (SVM) is applied to the extracted features to recognize the classes of EEG signals during different motor imagery tasks. The effectiveness of the proposed method has been verified by both computer simulations and motor imagery EEG datasets.
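The identification step described in the summary can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the "IMFs" are simulated components rather than NA-MEMD output, PCA stands in for kernel spectral regression, and the band-energy features and frequency bands are assumptions made for the example.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
fs = 250                         # assumed sampling rate (Hz)
t = np.arange(fs * 2) / fs       # 2 s of samples

def band_feature(x, fs):
    """Crude per-component feature: log power in a few EEG bands (illustrative)."""
    spec = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), 1 / fs)
    bands = [(1, 8), (8, 13), (13, 30), (30, 60)]
    return np.log([spec[(freqs >= lo) & (freqs < hi)].sum() + 1e-12
                   for lo, hi in bands])

# Simulated composite set: 4 task-related components (mu-band oscillations)
# plus 6 noise components, mimicking the signal+noise IMF collection.
signal_imfs = [np.sin(2 * np.pi * 10 * t + rng.uniform(0, np.pi))
               + 0.1 * rng.standard_normal(t.size) for _ in range(4)]
noise_imfs = [rng.standard_normal(t.size) for _ in range(6)]
imfs = signal_imfs + noise_imfs

# Feature matrix -> low-dimensional subspace (PCA here; the paper uses
# kernel spectral regression) -> two-component GMM clustering.
X = np.array([band_feature(x, fs) for x in imfs])
Z = PCA(n_components=2, random_state=0).fit_transform(X)
labels = GaussianMixture(n_components=2, random_state=0).fit_predict(Z)

# Label the cluster with higher mu-band (8-13 Hz) energy as informative.
mu_energy = X[:, 1]
informative_cluster = max((0, 1), key=lambda c: mu_energy[labels == c].mean())
informative = [i for i, lab in enumerate(labels) if lab == informative_cluster]
print(informative)
```

The informative components would then be summed to reconstruct a denoised signal, after which CSP filtering and an SVM (e.g. via `mne.decoding.CSP` and `sklearn.svm.SVC`) complete the classification pipeline described above.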
ISSN: 2090-5904
1687-5443