Author manuscript; available in PMC: 2019 Jan 1.
Published in final edited form as: Brain Topogr. 2017 Sep 6;31(1):125–128. doi: 10.1007/s10548-017-0586-7

Real-Time Clustered Multiple Signal Classification (RTC-MUSIC)

Christoph Dinh 1,2, Lorenz Esch 2, Johannes Rühle 2, Steffen Bollmann 2,3, Daniel Güllmar 4, Daniel Baumgarten 2,5, Matti S Hämäläinen 1, Jens Haueisen 2,6
PMCID: PMC5773364  NIHMSID: NIHMS904603  PMID: 28879632

Abstract

Magnetoencephalography (MEG) and Electroencephalography (EEG) provide a high temporal resolution, which allows estimation of the detailed time courses of neuronal activity. However, two major challenges must be handled in real-time analysis of these data: the low signal-to-noise ratio (SNR) and the limited time available for computations. In this work, we present Real-Time Clustered Multiple Signal Classification (RTC-MUSIC), a real-time source localization algorithm that can handle low SNRs and reduce the computational effort. It provides correlation information together with sparse source estimation results, which can, e.g., be used to identify evoked responses with high sensitivity. RTC-MUSIC clusters the forward solution based on an anatomical brain atlas and optimizes the scanning process inherent to MUSIC approaches. We evaluated RTC-MUSIC by analyzing auditory and somatosensory MEG data. The results demonstrate that the proposed method localizes sources reliably. For the auditory experiment, the most dominant correlated source pair was located bilaterally in the superior temporal gyri. The highest activation in the somatosensory experiment was found in the contralateral primary somatosensory cortex (SI).

Keywords: Magnetoencephalography, Electroencephalography, Real-Time, Source Estimation, RAP-MUSIC, RTC-MUSIC, Powell’s Conjugate Direction Method, K-Means Clustering, Brain Atlas, CUDA

1 Introduction

Real-time source estimation based on MEG and EEG can help to better understand brain functions [1]–[3] and to create more effective brain-computer interface (BCI) systems [4]–[9]. There are two major challenges in real-time source localization: the low SNR and the limited time available for computations. In this work, we present RTC-MUSIC, derived from RAP-MUSIC [11], which quickly obtains sparse real-time localization results based on an anatomical brain atlas. RAP-MUSIC is a scanning approach, which provides information on correlated source pairs in addition to their locations. It reduces the activity pattern to a sparse set of the most prominent sources. In RTC-MUSIC, the forward problem is optimized by reducing the gain matrix to the most representative sources. This makes the gain matrix better conditioned, leading to the ability to handle low SNRs [10]. The decimation of the gain matrix also reduces the computational effort. To preserve the location information, this decimation is done per cortical parcel. The parcels are part of a functional brain atlas and therefore have physiological relevance, which helps interpret the highly transient brain activation in real-time analysis.
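The per-parcel decimation of the gain matrix can be illustrated with a plain k-means over dipole columns. The following is a minimal sketch with hypothetical function names, using numpy only; the paper's actual clustering and parcel handling are described in the supplementary material:

```python
import numpy as np

def cluster_gain_matrix(G, parcel_labels, k_per_parcel=2, n_iter=50, seed=0):
    """Reduce a gain matrix G (n_sensors x n_dipoles) by k-means clustering
    the dipole columns within each atlas parcel; the cluster centers become
    the columns of the reduced gain matrix (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    reduced = []
    for parcel in np.unique(parcel_labels):
        cols = G[:, parcel_labels == parcel].T   # (dipoles in parcel, n_sensors)
        k = min(k_per_parcel, len(cols))
        # random initial centers, then plain Lloyd iterations
        centers = cols[rng.choice(len(cols), size=k, replace=False)].copy()
        for _ in range(n_iter):
            dist = np.linalg.norm(cols[:, None, :] - centers[None, :, :], axis=2)
            assign = dist.argmin(axis=1)
            for j in range(k):
                if np.any(assign == j):
                    centers[j] = cols[assign == j].mean(axis=0)
        reduced.append(centers)
    return np.vstack(reduced).T                  # (n_sensors, reduced dipoles)
```

Because the cluster centers are computed per parcel, each reduced column can still be attributed to an atlas region, which is what preserves the location information.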

The application of correlated scanning approaches for source localization on a real-time data stream is only feasible when the algorithm itself is accelerated and the calculations are optimized. In this work, we present a performance-optimized and parallelized RTC-MUSIC algorithm, which can estimate sources in real-time.

2 Results

Compared to RAP-MUSIC, RTC-MUSIC results in a significant reduction of the computational effort, see Fig. 1. Approximately half of the reduction is based on methodical improvements. Harnessing Powell’s Conjugate Direction Method reduced the remaining computational effort by a further three orders of magnitude. Finally, reducing the forward solution with clustering shrank the task by another two orders of magnitude, see supplementary material S.1.1. Sampling rates up to 1250 Hz together with a clustered gain matrix mapping 416 brain regions to 306 MEG sensors were successfully tested. The measured real-time delay of 23.22 ± 24.27 ms for 100 ms windows, i.e., 125 samples, allows real-time localizations with the Elekta Neuromag® VectorView 306.

Fig. 1 Comparing S.1.1 Eq. (19) and (20): influence of (a) the number of gain matrix dipoles, (b) the number of sensors, and (c) the number of localized sources on RTC- and RAP-MUSIC.

We compared RAP- and RTC-MUSIC in a simulation study across different noise levels, see S.1.2 and S.2.2. Both RTC- and RAP-MUSIC proved suitable at single-trial noise levels, which were −12.09 dB and −9.88 dB across all channels for the auditory and somatosensory data, respectively. The original RAP-MUSIC had a slightly higher accuracy for the correlated dipole at lower noise levels, see S.2.2.

Subsequently, the localization precision of RAP- and RTC-MUSIC was evaluated on real data using a dipole fit as a reference, which is independent of the introduced clustering approach. The dipole activities were mapped to their related regions and used as active reference regions.

RAP- and RTC-MUSIC are subspace correlation methods, which were applied to the signal subspace determined from the SVD of the data, see S.1.1 Eq. (10). Consequently, they require an adequately dimensioned observation time window to deliver useful results. As a first step, we investigated the effect of the length of the analysis window, see Fig. 2.
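The core scanning quantity shared by both methods can be sketched as follows (a minimal illustration; function names are hypothetical): the signal subspace is taken from the leading left singular vectors of the data window, and a candidate source topography is scored by the norm of its projection onto that subspace.

```python
import numpy as np

def signal_subspace(X, rank):
    """Signal subspace of a data window X (n_sensors x n_samples):
    the first `rank` left singular vectors of X."""
    U, _, _ = np.linalg.svd(X, full_matrices=False)
    return U[:, :rank]

def subcorr(g, Us):
    """Subspace correlation of a single topography g with the signal
    subspace Us: 1 if g lies in the subspace, 0 if orthogonal to it."""
    g = g / np.linalg.norm(g)
    return np.linalg.norm(Us.T @ g)
```

A scanning algorithm evaluates this score for candidate topographies (here, columns of the clustered gain matrix) and keeps the best-matching source.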

Fig. 2 Mean of the largest and second largest singular value over window size. The estimation is based on post-stimulus auditory data starting at 100 ms, increasing the window size sample by sample up to 300 ms, resulting in a maximal window length of 200 ms. This visualizes the impact of the window size on the data rank, i.e., the number of localizable source pairs. The largest singular value has a relatively robust plateau at a 100 ms window size.

We used a time window of 100 ms to localize one auditory correlated dipole pair, which was a good trade-off between capturing the transient auditory patterns and providing a sufficiently long window, see Fig. 2. We did not use the recursive search capabilities of RTC-/RAP-MUSIC, i.e., we localized only the most prominent source.
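The window-size analysis underlying Fig. 2 can be sketched by tracking the leading singular values while the window grows sample by sample. This is illustrative code on a single epoch; names, defaults, and the 1 kHz sampling rate are assumptions:

```python
import numpy as np

def singular_values_over_window(X, sfreq, t0_ms=100, t1_ms=300, n_vals=2):
    """Leading singular values of X (n_sensors x n_samples) for analysis
    windows starting at t0_ms and growing sample-wise until t1_ms."""
    s0 = int(t0_ms * sfreq / 1000)
    s1 = int(t1_ms * sfreq / 1000)
    out = []
    for end in range(s0 + n_vals, s1 + 1):
        # singular values only; no need for the singular vectors here
        s = np.linalg.svd(X[:, s0:end], compute_uv=False)
        out.append(s[:n_vals])
    return np.array(out)          # (n_windows, n_vals), descending per row
```

Plotting the two returned columns against window length reproduces the kind of curve shown in Fig. 2, where a plateau of the largest singular value suggests an adequate window size.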

The closest distance between the localized RTC- and RAP-MUSIC regions and the reference region was taken as the error measure. In the case of the auditory evoked response, the most active correlated regions are located at the contra- and ipsilateral auditory cortex, see Fig. 3.

Fig. 3 a) Localization error of RAP-MUSIC and RTC-MUSIC using the N100 dipole fit as reference. The analyzed time window ranges from 100 to 200 ms. Friedman’s test, comparing each number of averages separately across all subjects, showed no significant difference between the RAP-MUSIC and RTC-MUSIC localization errors at a significance level of p = 0.01. b) RTC-MUSIC localization result for a right auditory stimulus using 2 averages. The localization was performed with a window size of 100 ms. The figure shows the most active correlated dipole pair (regions depicted in white).

The most active region of the correlated pair matches the contralateral N100 dipole activity. RTC-MUSIC shows a localization precision similar to RAP-MUSIC. Friedman’s test revealed no significant difference in the RTC- and RAP-MUSIC localization errors. The test was applied separately for each number of averages, comprising all four subjects with two auditory measurements each. The single-trial errors showed the lowest p-values, with pLH = 0.02 for the error in the left hemisphere and pRH = 0.25 in the right hemisphere, see Fig. 3.

Results based on somatosensory evoked responses can be found in supplementary material S.2.3.

3 Discussion

The two major real-time source localization challenges, the low SNR and the limited time available for the computations, were successfully addressed by the reduction of the gain matrix similar to [10]. The computational effort was further reduced by introducing computational improvements in the RAP-MUSIC calculation and applying Powell’s Conjugate Direction Method (Diwakar et al. [13]). Powell’s Conjugate Direction Method integrates seamlessly into RTC-MUSIC and is a fast search algorithm, since no gradient computations are needed. In future work, we plan to improve the performance of the Powell search by introducing a tabu list to prevent recomputing recurring combinations. With the GPU implementation, we were able to show the good scalability of RTC-MUSIC to many-core systems, which allows the analysis of large problems with appropriate hardware.
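Powell's conjugate direction method is available as a derivative-free optimizer in SciPy. The following minimal sketch minimizes a hypothetical quadratic cost standing in for the (negative) subspace correlation that RTC-MUSIC maximizes; it is not the paper's actual cost function:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical smooth 2-D cost surface; RTC-MUSIC's real objective is the
# negative subspace correlation over candidate region combinations.
def cost(x):
    return (x[0] - 1.0) ** 2 + 10.0 * (x[1] + 2.0) ** 2

# Powell's method performs line searches along an evolving set of conjugate
# directions and needs only function evaluations, no gradients.
res = minimize(cost, x0=np.zeros(2), method="Powell")
```

The absence of gradient computations is what makes the method attractive for a scanning objective evaluated on discrete region combinations, though, as noted above, it can converge to a local maximum.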

RTC-MUSIC shows a localization accuracy that is not significantly different from that of RAP-MUSIC, even though Powell’s search can have the disadvantage of converging to a local maximum. With both RTC- and RAP-MUSIC we were able to obtain reliable localization results down to single-trial auditory responses. The auditory response was localized at and around the contra- and ipsilateral superior temporal gyrus using one correlated dipole pair. This is an improvement over our previous results with RTC-MNE [10]: we can now compute source estimates from lower-SNR data and localize the sources from single-trial data in real-time. Furthermore, RTC-MUSIC produces sparse estimates and the sources can be correlated, which is often an advantage.

To follow rapidly evolving activity, we employed short observation time windows. The analyzed data windows therefore have a low rank, indicating only a small number of sources to find. Consequently, we evaluated only the most prominent source and did not use the recursive iteration capabilities that both RTC- and RAP-MUSIC provide.

Despite the use of Powell’s iterative search algorithm, the variance of the RTC-MUSIC calculation time was small, which allowed us to conduct real-time source localizations.

The real-time data processing chains were realized with our MNE Scan [12] software. MNE Scan is part of our open-source electrophysiological data processing software MNE-CPP (https://mne-cpp.org).

Supplementary Material

10548_2017_586_MOESM2_ESM

Acknowledgments

This work was supported by the National Institutes of Health (NIH, grants 4R01EB009048 and 5P41EB015896), the European Union’s Horizon 2020 research and innovation program under grant agreement No 686865, the German Research Foundation (DFG, grant Ba 4858/1-1), the Thuringian Ministry of Science under grant number 2015 FGR 0085, and the German Academic Exchange Service (DAAD).

Footnotes

Supplementary Material

For a detailed methodological description please consult our supplementary material.

A video of RTC-MUSIC localizations can be found here: http://www.mne-cpp.org/wp-content/uploads/2016/12/RTC-MUSIC.mp4

References

1. Jones SR, Kerr CE, Wan Q, Pritchett DL, Hamalainen MS, Moore CI. Cued Spatial Attention Drives Functionally Relevant Modulation of the Mu Rhythm in Primary Somatosensory Cortex. J Neurosci. 2010 Oct;30(41):13760–5. doi: 10.1523/JNEUROSCI.2969-10.2010.
2. Ziegler DA, Pritchett DL, Hosseini-Varnamkhasti P, Corkin S, Hamalainen MS, Moore CI, Jones SR. Transformations in Oscillatory Activity and Evoked Responses in Primary Somatosensory Cortex in Middle Age: A Combined Computational Neural Modeling and MEG Study. Neuroimage. 2010 Sep;52(3):897–912. doi: 10.1016/j.neuroimage.2010.02.004.
3. Sudre GP, Parkkonen L, Bock E, Baillet S, Wang W, Weber DJ. rtMEG: A Real-Time Software Interface for Magnetoencephalography. Comput Intell Neurosci. 2011 Jan;2011:327953. doi: 10.1155/2011/327953.
4. Besserve M, Martinerie J, Garnero L. Improving Quantification of Functional Networks with EEG Inverse Problem: Evidence from a Decoding Point of View. Neuroimage. 2011 Apr;55(4):1536–47. doi: 10.1016/j.neuroimage.2011.01.056.
5. Lotte F, Lecuyer A, Arnaldi B. FuRIA: An Inverse Solution Based Feature Extraction Algorithm Using Fuzzy Set Theory for Brain-Computer Interfaces. IEEE Trans Signal Process. 2009 Aug;57(8):3253–3263.
6. Noirhomme Q, Kitney RI, Macq B. Single-trial EEG Source Reconstruction for Brain-Computer Interface. IEEE Trans Biomed Eng. 2008 May;55(5):1592–601. doi: 10.1109/TBME.2007.913986.
7. Congedo M, Lotte F, Lecuyer A. Classification of Movement Intention by Spatially Filtered Electromagnetic Inverse Solutions. Phys Med Biol. 2006 Apr;51(8):1971–89. doi: 10.1088/0031-9155/51/8/002.
8. Qin L, Ding L, He B. Motor Imagery Classification by Means of Source Analysis for Brain-Computer Interface Applications. J Neural Eng. 2004 Sep;1(3):135–41. doi: 10.1088/1741-2560/1/3/002.
9. Buch E, Weber C, Cohen LG, Braun C, Dimyan MA, Ard T, Mellinger J, Caria A, Soekadar S, Fourkas A, Birbaumer N. Think to Move: a Neuromagnetic Brain-Computer Interface (BCI) System for Chronic Stroke. Stroke. 2008 Mar;39(3):910–7. doi: 10.1161/STROKEAHA.107.505313.
10. Dinh C, Strohmeier D, Luessi M, Güllmar D, Baumgarten D, Haueisen J, Hamalainen MS. Real-Time MEG Source Localization using Regional Clustering. Brain Topogr. 2015:1–14. doi: 10.1007/s10548-015-0431-9.
11. Mosher JC, Leahy RM. Source Localization using Recursively Applied and Projected (RAP) MUSIC. IEEE Trans Signal Process. 1999;47(2):332–340. doi: 10.1109/10.867959.
12. Dinh C, Luessi M, Sun L, Haueisen J, Hamalainen MS. MNE-X: MEG/EEG Real-Time Acquisition, Real-Time Processing, and Real-Time Source Localization Framework. Biomed Eng/Biomed Tech. 2013 Sep;58(1):4184. doi: 10.1515/bmt-2013-4184.
13. Diwakar M, Tal O, Liu TT, Harrington DL, Srinivasan R, Muzzatti L, Song T, Theilmann RJ, Lee RR, Huang MX. Accurate Reconstruction of Temporal Correlation for Neuronal Sources using the Enhanced Dual-Core MEG Beamformer. Neuroimage. 2011 Jun;56(4):1918–28. doi: 10.1016/j.neuroimage.2011.03.042.
