Editorial. Front. Neurosci. 15:732165, 29 September 2021. doi: 10.3389/fnins.2021.732165

Editorial: Datasets for Brain-Computer Interface Applications

Ian Daly 1,*, Ana Matran-Fernandez 1, Davide Valeriani 2, Mikhail Lebedev 3,4, Andrea Kübler 5
PMCID: PMC8511440  PMID: 34658770

Non-invasive brain-computer interfaces (BCIs) are an exciting technology that provides a channel for communication between the brain and a computer system. They can be used as communication devices (Chaudhary et al., 2016; Brumberg et al., 2018), rehabilitation systems (Cervera et al., 2018), and entertainment devices (Gürkök et al., 2017), as well as for a wide range of other applications (Finke et al., 2009; Makeig et al., 2011).

Research in non-invasive BCIs is developing rapidly and is highly multidisciplinary, involving, among others, neuroscientists, engineers, psychologists, computer scientists, and clinicians. Continued development of BCI technology relies on advances in each of these fields, which individually and collectively contribute to improving all aspects of BCI systems, including signal acquisition, signal processing, classification, and user interface design.

Many individual components of a BCI system are typically first developed and evaluated on pre-existing datasets. However, there are only a few high-quality, publicly available datasets on which new systems, tools, and technologies can be evaluated and compared. For example, the publicly available BCI competition datasets (Sajda et al., 2003; Blankertz et al., 2004, 2006) provide an excellent set of resources for BCI researchers and have been widely used to develop and evaluate new signal processing and classification methods (Arvaneh et al., 2013; Ghaemi et al., 2017; Lotte et al., 2018; Sakhavi et al., 2018; Zanini et al., 2018; Zhang et al., 2018). Yet, the relatively small size and number of such datasets introduce the risk that methods developed and evaluated on them overfit to their particular characteristics. In other words, the reliability and reproducibility of BCI research are held back by the scarcity of publicly available datasets.
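To make this evaluation workflow concrete, the following minimal sketch cross-validates a standard CSP + LDA motor imagery pipeline on the public PhysioNet EEGBCI dataset, assuming MNE-Python and scikit-learn are installed. It illustrates the kind of benchmarking that public datasets enable; it is not a method from any of the works cited above, and the subject, runs, frequency band, and epoch window are chosen purely for demonstration.

```python
import mne
from mne.datasets import eegbci
from mne.decoding import CSP
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

# Fetch runs 4, 8, and 12 (imagined left- vs. right-hand movement) for one
# subject of the public EEGBCI/PhysioNet dataset (downloads on first use).
raw_fnames = eegbci.load_data(1, [4, 8, 12])
raw = mne.concatenate_raws(
    [mne.io.read_raw_edf(f, preload=True) for f in raw_fnames])
eegbci.standardize(raw)        # normalize channel names
raw.set_montage("standard_1005")
raw.filter(7.0, 30.0)          # keep the mu and beta bands

# Epoch the data around the task cues (T1 = left hand, T2 = right hand).
events, _ = mne.events_from_annotations(raw, event_id=dict(T1=1, T2=2))
epochs = mne.Epochs(raw, events, event_id=dict(left=1, right=2),
                    tmin=0.5, tmax=2.5, baseline=None, preload=True)
X = epochs.get_data()
y = epochs.events[:, -1]

# Common spatial patterns followed by linear discriminant analysis,
# scored with 5-fold cross-validation to limit optimistic bias.
clf = make_pipeline(CSP(n_components=4), LinearDiscriminantAnalysis())
scores = cross_val_score(clf, X, y, cv=5)
print(f"Mean cross-validated accuracy: {scores.mean():.2f}")
```

Because the same few public datasets are reused across many studies, cross-validation alone cannot rule out overfitting at the level of the field; additional corpora such as those collected in this Research Topic allow pipelines tuned on one dataset to be re-evaluated on unseen recordings.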

This Research Topic collects descriptions of publicly available physiological datasets recorded by BCI research labs around the world during the development, training, and evaluation of non-invasive BCI systems.

The collected datasets consist of signals recorded via a wide variety of modalities, including, but not limited to, electroencephalography (EEG), functional near infrared spectroscopy (fNIRS), electromyography (EMG), electrocardiography (ECG), galvanic skin response (GSR), skin temperature measures, respiration rates, and body movement data. Many datasets include multi-modal recordings with combinations of two or more of these signal modalities.

Data from a wide variety of different BCI paradigms are described. These include development of novel event-related potential (ERP) and steady state visual evoked potential (SSVEP) based BCIs for communication, motor imagery BCIs, affective BCIs, collaborative BCIs, and neurofeedback-based BCIs for nicotine addiction, as well as resting-state data.

Data from ERP-based BCIs are provided by several authors. For example, Delijorge et al. describe a P300-based BCI for robotic hand control; Simões et al. provide a large P300-based BCI dataset; and Li et al. implemented an ERP-based BCI for communication.

Motor control-based BCIs and associated datasets are also included in this collection. For example, Brandl and Blankertz provide an EEG dataset recorded during motor imagery while distractions were presented to simulate day-to-day events experienced outside the lab. Schwarz et al. attempted to decode reach-and-grasp actions from the EEG, and Ortega et al. collected a multimodal dataset comprising EEG, fNIRS, EMG, and movement data recorded during a force grip task.

A wide range of other types of EEG-based BCIs are also presented. These include a dataset for a BCI based on covert attention shifts (Reichert et al.) and an affective BCI based on neurofeedback (Charles et al.), as well as two BCIs based on the rapid serial visual presentation paradigm (Zhang et al.; Zheng et al.). The collection also includes a BCI for treating nicotine addiction via neurofeedback (Bu et al.) and a dataset of SSVEP signals (Liu et al.).

A diverse range of other experimental paradigms is also represented. For example, von Lühmann et al. present a resting-state fNIRS dataset, while Parent et al. provide a multimodal dataset, comprising EEG, ECG, and respiration activity, recorded during a range of physical activities and induced stress. Finally, Albuquerque et al. offer a multimodal dataset, comprising EEG, ECG, and GSR, recorded during a mental workload paradigm.

We expect that the collected datasets will enable novel developments and applications of BCI technology, as well as extensive validation studies of current and future BCIs.

Author Contributions

All authors co-wrote the editorial and edited the Research Topic.

Funding

ML was supported by the Russian Science Foundation grant 21-75-30024.

Conflict of Interest

DV is employed by Neurable Inc. The remaining authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Publisher's Note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

References

1. Arvaneh M., Guan C., Ang K. K., Quek C. (2013). Optimizing spatial filters by minimizing within-class dissimilarities in electroencephalogram-based brain-computer interface. IEEE Trans. Neural Netw. Learn. Syst. 24, 610–619. doi: 10.1109/TNNLS.2013.2239310
2. Blankertz B., Müller K.-R., Curio G., Vaughan T. M., Schalk G., Wolpaw J. R., et al. (2004). The BCI Competition 2003: progress and perspectives in detection and discrimination of EEG single trials. IEEE Trans. Biomed. Eng. 51, 1044–1051. doi: 10.1109/TBME.2004.826692
3. Blankertz B., Müller K.-R., Krusienski D. J., Schalk G., Wolpaw J. R., Schlögl A., et al. (2006). The BCI competition III: validating alternative approaches to actual BCI problems. IEEE Trans. Neural Syst. Rehabil. Eng. 14, 153–159. doi: 10.1109/TNSRE.2006.875642
4. Brumberg J. S., Pitt K. M., Mantie-Kozlowski A., Burnison J. D. (2018). Brain-computer interfaces for augmentative and alternative communication: a tutorial. Am. J. Speech Lang. Pathol. 27, 1–12. doi: 10.1044/2017_AJSLP-16-0244
5. Cervera M. A., Soekadar S. R., Ushiba J., Millán J. D. R., Liu M., Birbaumer N., et al. (2018). Brain-computer interfaces for post-stroke motor rehabilitation: a meta-analysis. Ann. Clin. Transl. Neurol. 5, 651–663. doi: 10.1002/acn3.544
6. Chaudhary U., Birbaumer N., Ramos-Murguialday A. (2016). Brain-computer interfaces for communication and rehabilitation. Nat. Rev. Neurol. 12, 513–525. doi: 10.1038/nrneurol.2016.113
7. Finke A., Lenhardt A., Ritter H. (2009). The MindGame: a P300-based brain-computer interface game. Neural Netw. 22, 1329–1333. doi: 10.1016/j.neunet.2009.07.003
8. Ghaemi A., Rashedi E., Pourrahimi A. M., Kamandar M., Rahdari F. (2017). Automatic channel selection in EEG signals for classification of left or right hand movement in Brain Computer Interfaces using improved binary gravitation search algorithm. Biomed. Signal Process. Control 33, 109–118. doi: 10.1016/j.bspc.2016.11.018
9. Gürkök H., Hakvoort G., Poel M., Nijholt A. (2017). Meeting the expectations from brain-computer interfaces. Comput. Entertain. 15:5. doi: 10.1145/2633431
10. Lotte F., Bougrain L., Cichocki A., Clerc M., Congedo M., Rakotomamonjy A., et al. (2018). A review of classification algorithms for EEG-based brain-computer interfaces: a 10 year update. J. Neural Eng. 15:031005. doi: 10.1088/1741-2552/aab2f2
11. Makeig S., Leslie G., Mullen T., Sarma D., Bigdely-Shamlo N., Kothe C. (2011). First demonstration of a musical emotion BCI, in International Conference on Affective Computing and Intelligent Interaction (Berlin; Heidelberg: Springer), 487–496.
12. Sajda P., Gerson A., Müller K.-R., Blankertz B., Parra L. (2003). A data analysis competition to evaluate machine learning algorithms for use in brain-computer interfaces. IEEE Trans. Neural Syst. Rehabil. Eng. 11, 184–185. doi: 10.1109/TNSRE.2003.814453
13. Sakhavi S., Guan C., Yan S. (2018). Learning temporal information for brain-computer interface using convolutional neural networks. IEEE Trans. Neural Netw. Learn. Syst. 29, 5619–5629. doi: 10.1109/TNNLS.2018.2789927
14. Zanini P., Congedo M., Jutten C., Said S., Berthoumieu Y. (2018). Transfer learning: a Riemannian geometry framework with applications to brain-computer interfaces. IEEE Trans. Biomed. Eng. 65, 1107–1116. doi: 10.1109/TBME.2017.2742541
15. Zhang Y., Wang Y., Zhou G., Jin J., Wang B., Wang X., et al. (2018). Multi-kernel extreme learning machine for EEG classification in brain-computer interfaces. Expert Syst. Appl. 96, 302–310. doi: 10.1016/j.eswa.2017.12.015
