Comparative analysis of spectral and temporal combinations in CSP-based methods for decoding hand motor imagery tasks
Academic Article in Scopus

abstract

  • Background: A widely used paradigm for brain-computer interfaces (BCIs) is the detection of event-related desynchronization/synchronization (ERD/ERS) in response to hand motor imagery (MI) tasks. The common spatial pattern (CSP) algorithm is a powerful tool for designing spatial filters for ERD/ERS detection. However, a limitation of CSP is that it identifies only discriminative spatial information, not spectral information.
  • New method: An open problem in the literature is how to extract the most discriminative brain patterns in MI-based BCIs using an optimal time segment and spectral information that account for intersubject variability. In recent years, several CSP variants have been proposed to decode motor imagery tasks under intersubject variability of the frequency bands related to ERD/ERS events, including Filter Bank Common Spatial Patterns (FBCSP) and Filter Bank Common Spatio-Spectral Patterns (FBCSSP).
  • Comparison with existing methods: We performed a comparative study of different combinations of time segments and filter banks for three methods (CSP, FBCSP, and FBCSSP) to decode hand (right and left) motor imagery tasks on two EEG datasets (GigaScience and BCI competition IVa).
  • Results: The best configuration is a filter bank with three filters (8–15 Hz, 15–22 Hz, and 22–29 Hz) applied to a 1.5 s time window after the trigger, which yields accuracies of approximately 74% and estimated information transfer rates (ITRs) of approximately 7 bits/min.
  • Conclusion: Discriminative information in the time and spectral domains can be extracted with a suitable filter bank and time-segment configuration, enhancing the classification rate and ITR for detecting hand motor imagery tasks with CSP-related methods, toward the implementation of a real-time BCI system.
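
For readers who want to prototype the pipeline, below is a minimal FBCSP-style sketch in Python. The band edges (8–15, 15–22, 22–29 Hz) and the 1.5 s window follow the best configuration reported in the abstract; everything else (MNE-Python's CSP, a zero-phase Butterworth filter bank, an LDA classifier, the synthetic data, the 250 Hz sampling rate, and four CSP components per band) is an illustrative assumption, not the authors' exact implementation.

    # Hypothetical FBCSP-style sketch (not the paper's code): band-pass the EEG
    # into three sub-bands, extract CSP log-variance features per band,
    # concatenate them, and classify with LDA.
    import numpy as np
    from scipy.signal import butter, sosfiltfilt
    from mne.decoding import CSP
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    fs = 250                                 # assumed sampling rate (Hz)
    bands = [(8, 15), (15, 22), (22, 29)]    # filter bank from the reported best configuration
    win = int(1.5 * fs)                      # 1.5 s window after the trigger

    # Synthetic stand-in data: 80 trials x 22 channels x 1.5 s of EEG
    rng = np.random.default_rng(0)
    X = rng.standard_normal((80, 22, win))
    y = rng.integers(0, 2, 80)               # 0 = left-hand MI, 1 = right-hand MI

    def bandpass(x, lo, hi, fs):
        """Zero-phase Butterworth band-pass along the time axis."""
        sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
        return sosfiltfilt(sos, x, axis=-1)

    # Fit one CSP per sub-band and stack the log-variance features
    features = []
    for lo, hi in bands:
        Xb = bandpass(X, lo, hi, fs)
        csp = CSP(n_components=4, log=True)  # 4 spatial filters per band (assumption)
        features.append(csp.fit_transform(Xb, y))
    F = np.hstack(features)                  # shape: (trials, 3 bands x 4 components)

    clf = LinearDiscriminantAnalysis().fit(F, y)
    print("training accuracy:", clf.score(F, y))

In practice one would cross-validate rather than report training accuracy and, for full FBCSP, add a feature-selection step (e.g., mutual information) between the filter bank and the classifier.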
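
As a sanity check on the reported numbers, the standard Wolpaw ITR formula from the BCI literature (not spelled out in the abstract itself) reproduces roughly 7 bits/min for a binary task at 74% accuracy with one decision per 1.5 s window:

    # Wolpaw ITR: B = log2(N) + P*log2(P) + (1-P)*log2((1-P)/(N-1)) bits per trial
    import math

    def itr_bits_per_min(n_classes, accuracy, trial_seconds):
        p, n = accuracy, n_classes
        bits_per_trial = (math.log2(n)
                          + p * math.log2(p)
                          + (1 - p) * math.log2((1 - p) / (n - 1)))
        return bits_per_trial * (60.0 / trial_seconds)

    # 2 classes (left/right hand MI), ~74% accuracy, 1.5 s per decision
    print(itr_bits_per_min(2, 0.74, 1.5))    # ~6.9 bits/min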

publication date

  • April 1, 2022