This app implements a two-class motor imagery (MI) experiment in which the user imagines moving either the left or the right hand. MI-based BCIs are of great interest due to their potential for rehabilitation by inducing neural plasticity in the brain.
MI is traditionally detected by decoding the sensorimotor rhythms (SMR), i.e., oscillations of the electric field over the sensorimotor cortex of the brain. During MI, these oscillations exhibit an event-related desynchronization (ERD) followed by an event-related synchronization (ERS). A representation can be observed in the following figure:
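In practice, the ERD/ERS pattern is commonly quantified with the band-power method. The sketch below illustrates the computation; the mu band, the reference window and all variable names are illustrative assumptions, not this app's code:

```python
# Minimal sketch of the classical ERD/ERS quantification (band-power method).
# The 8-13 Hz mu band and the reference window are illustrative assumptions.
import numpy as np
from scipy.signal import butter, filtfilt

def erd_percent(epochs, fs, band=(8, 13), ref_window=(0.0, 1.0)):
    """Compute ERD/ERS (%) over time for epochs shaped (n_trials, n_samples).

    ERD% = (A - R) / R * 100, where A is the trial-averaged band power at each
    sample and R is the mean power in a reference (baseline) window.
    Negative values indicate desynchronization (ERD), positive values ERS.
    """
    b, a = butter(4, band, btype="bandpass", fs=fs)
    filtered = filtfilt(b, a, epochs, axis=-1)   # band-pass each trial
    power = np.mean(filtered ** 2, axis=0)       # trial-averaged instantaneous power
    r0, r1 = int(ref_window[0] * fs), int(ref_window[1] * fs)
    ref = power[r0:r1].mean()                    # reference (baseline) power
    return (power - ref) / ref * 100.0
```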
The app shows a color-changing sphere with a sliding bar. The sphere's color and the bar's position are continuously updated according to the classification outcome computed from the recorded electroencephalogram (EEG). Important parameters can be configured, such as the preparation, trial and rest durations, or the classification model.
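As a rough idea of the kind of settings involved, a run configuration could look like the sketch below; the keys and values are hypothetical examples, not the app's actual configuration fields or defaults:

```python
# Hypothetical example of run settings; key names and values are
# illustrative only and do not match the app's actual configuration.
run_settings = {
    "preparation_duration_s": 2.0,   # cue shown before each trial
    "trial_duration_s": 4.0,         # motor imagery period
    "rest_duration_s": 3.0,          # inter-trial rest
    "model": "CSP + rLDA",           # or "EEGSym" for the calibrationless mode
}
```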
It includes the traditional machine learning (ML) approach to decode SMR from the EEG: common spatial patterns (CSP) for feature extraction, followed by classification with a regularized linear discriminant analysis (rLDA). For this approach, you will first need to run a calibration session with around 60 to 100 trials of both classes to achieve sufficient accuracy in the online session.
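For illustration, a CSP + shrinkage-LDA decoder of this kind can be sketched with MNE and scikit-learn as shown below. This is not the app's internal code; the epoch shapes, number of CSP components and cross-validation settings are assumptions:

```python
# Sketch of a CSP + regularized LDA decoder, in the spirit of the approach
# described above. Hyperparameters and data shapes are illustrative assumptions.
import numpy as np
from mne.decoding import CSP
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

# X: band-pass filtered calibration epochs, shape (n_trials, n_channels, n_samples)
# y: labels, e.g. 0 = left-hand MI, 1 = right-hand MI
rng = np.random.default_rng(0)
X = rng.standard_normal((80, 16, 500))   # placeholder for real calibration data
y = rng.integers(0, 2, size=80)

clf = make_pipeline(
    CSP(n_components=6, log=True),                                # spatial filters + log-variance features
    LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto"),  # rLDA via automatic shrinkage
)

# Estimate calibration accuracy with cross-validation before going online
scores = cross_val_score(clf, X, y, cv=5)
print(f"CV accuracy: {scores.mean():.2f}")
```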
Additionally, this app includes an initialized version of a deep learning model, EEGSym, trained with data from 280 subjects, which allows starting an experiment without calibration with an expected accuracy above 80% (for more information on EEGSym and this calibrationless approach, see [1]).
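Conceptually, calibrationless inference amounts to loading the pretrained network and predicting on each preprocessed EEG window. The sketch below is only a hedged illustration: the weights file name, input layout, preprocessing and threshold are hypothetical placeholders, not the app's actual API (see the app's code and [1] for the real details):

```python
# Hedged sketch of calibrationless inference with a pretrained network.
# 'eegsym_pretrained.h5', the (batch, samples, channels, 1) input layout and
# the 0.5 threshold are hypothetical placeholders.
import numpy as np
import tensorflow as tf

model = tf.keras.models.load_model("eegsym_pretrained.h5")  # hypothetical weights file

def classify_epoch(epoch):
    """epoch: preprocessed EEG window, shape (n_samples, n_channels)."""
    x = epoch[np.newaxis, ..., np.newaxis]           # add batch and channel dims (assumed layout)
    prob_right = float(model.predict(x, verbose=0)[0, -1])
    return "right" if prob_right > 0.5 else "left"
```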
Visit the forum to report issues and make improvement suggestions!
The following GIF shows the application:
References:
[1] S. Pérez-Velasco, E. Santamaría-Vázquez, V. Martínez-Cagigal, D. Marcos-Martínez and R. Hornero, "EEGSym: Overcoming Inter-Subject Variability in Motor Imagery Based BCIs With Deep Learning," in IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 30, pp. 1766-1775, 2022, doi: 10.1109/TNSRE.2022.3186442.