F1000Research. 2023 Nov 3; 11:989. Originally published 2022 Sep 1. [Version 4]. doi: 10.12688/f1000research.123515.4

Brain-to-brain communication during musical improvisation: a performance case study

Mauricio A Ramírez-Moreno 1,2,a, Jesús G Cruz-Garza 2, Akanksha Acharya 2, Girija Chatufale 2,3, Woody Witt 4,5, Dan Gelok 4, Guillermo Reza 6, José L Contreras-Vidal 2
PMCID: PMC10558998  PMID: 37809054

Version Changes

Revised. Amendments from Version 3

This new version (4) includes additional references in the Introduction section. These references are noteworthy due to their similarity to our study: a single-subject analysis of EEG signals during free jazz improvisation by an expert musician in ecological settings. The database linked to this work previously returned errors and readers were not able to access it; the error was fixed on Oct 27, 2023, and the database should now be accessible.

Abstract

Understanding and predicting others' actions in ecological settings is an important research goal in social neuroscience. Here, we deployed a mobile brain-body imaging (MoBI) methodology to analyze inter-brain communication between professional musicians during a live jazz performance. Specifically, bispectral analysis was conducted to assess the synchronization of scalp electroencephalographic (EEG) signals from three expert musicians during a three-part, 45-minute jazz performance, during which a new musician joined every five minutes. The bispectrum was estimated for all musician dyads, electrode combinations, and five frequency bands. The results showed higher bispectrum in the beta and gamma frequency bands (13-50 Hz) when more musicians performed together, and when they played a musical phrase synchronously. Positive bispectrum amplitude changes were found approximately three seconds prior to the identified synchronized performance events, suggesting preparatory cortical activity predictive of concerted behavioral action. Moreover, a greater amount of synchronized EEG activity, across electrode regions, was observed as more musicians performed, with inter-brain synchronization between the temporal, parietal, and occipital regions being the most frequent. Increased synchrony between the musicians' brain activity reflects shared multi-sensory processing and movement intention in a musical improvisation task.

Keywords: Brain on arts, hyperscanning, brain-to-brain synchrony, musical improvisation

Introduction

Advances in neuroengineering have fostered the development of mobile brain-body imaging (MoBI) technologies and denoising algorithms that allow the acquisition, interpretation, and decoding of brain activity from freely behaving individuals in real settings. 1 - 3 These advances have led to the development of neurofeedback systems, brain-computer interfaces (BCIs) and neuroprostheses. 4 These devices aid in the treatment of neurological disorders such as Parkinson’s disease, epilepsy and depression, 5 - 7 motor impairments, 8 and diminished brain functioning. 9 Although all of these systems are extremely helpful to patients and healthy users, they follow an individualistic, personal-use approach. 10

While an understanding of an individual’s cognitive function is of utmost importance in the development of neurological treatments, the comprehension of social interactions at a neurological level is also important, as humans are social beings by nature, 11 and neurological disorders such as autism spectrum disorders (ASD) can affect social communication and interaction. 12 Furthermore, many common daily human activities are carried out in groups, e.g. at school, work, sports, creative art, and leisure. 11 , 13 Thus, research advances in social neuroscience are likely to revolutionize different fields such as entertainment, communication, education, healthcare, and social embedding, among others. 14 Recently, researchers have started to explore brain activity from a collective perspective, using a contemporary approach, known as hyperscanning. 15

Brain synchrony assessment through hyperscanning

Hyperscanning refers to the simultaneous recording of brain activity from two or more individuals, and has been implemented to study dynamic similarities or differences between the brain signals of multiple participants engaged in interactive or social tasks. 15 Such an approach holds promise for understanding the nature of cognitive traits during social interactions. 15 Recent hyperscanning studies have documented traces of shared cognition emerging during moments of social interaction, collaboration, competition, and in educational settings. 16 , 17 The study of neural synchrony between individuals provides an insight into human connectedness and may aid in the development of treatments for social cognition disorders such as autism. 18 A desired outcome of hyperscanning is the development of neural biomarkers that track in real time the quality or strength of shared cognitive states such as brain-to-brain communication, shared attention, message conveying, and high engagement during human interactions.

Indeed, recent studies on human interactions have analyzed shared brain dynamics during teamwork tasks 19 and cooperative/competitive interactions. 16 It has been reported that neural synchronization increases when participants interact cooperatively, and decreases when they compete against each other. A hyperscanning study quantified the synchronization between the brain signals of infants and adults during gaze interactions, showing increased neural coupling during direct eye contact. 20 Neural coupling between humans has also been associated with the degree of mutual pro-sociality, where higher synchronization reflects stronger social relationships 21 and a greater likelihood of interpersonal bonding. 22 As these studies show, by analyzing inter-brain activity, hyperscanning offers a quantitative assessment of the strength and quality of different types of social interactions. 23

Regarding neural synchrony metrics, among the most common are coherence, 17 phase coherence, 16 phase locking value (PLV) and phase locking index (PLI), 15 Granger causality, 20 correlation, 21 wavelet transform coherence (WTC), 22 graph theory, and partial directed coherence (PDC). 23 The bispectrum is another, more recent, metric in the hyperscanning literature, 19 , 24 and offers insight at the temporal, spatial and spectral levels. The bispectrum of a signal is a higher-order spectrum that reflects the degree of temporal synchronization and phase coupling between two time series at different frequencies. 25 The bispectrum offers additional insight compared to other neural synchrony metrics, as it captures phase coupling, 19 resonance, temporal synchronization and non-linear interactions 26 between any analyzed signal pair, 25 rather than focusing only on phase coupling or correlation. Moreover, the bispectrum is feasible to implement in real-time applications, which makes it attractive for closed-loop brain-computer interface applications. 19

The estimation of temporally synchronous activity of brain areas, within and across brains, at different frequencies is critical for understanding social interaction in various contexts, including musical interaction as shown in this study. Several approaches to quantifying these neural interactions have been proposed. The bispectrum method is typically used to detect nonlinear interactions and identify cross-frequency interactions in EEG signals acquired during various cognitive states. 27 It has the added advantage of suppressing Gaussian noise, since Gaussian processes have vanishing bispectra. 28 PLV (a popular algorithm used in hyperscanning studies) quantifies the strength of phase coupling; however, it yields false positive correlations in the presence of source signal mixing due to instantaneous field spread and volume conduction in EEG recordings. 29 Amplitude-correlation based connectivity measures, which are insensitive to zero-lag correlations, may still be affected by spurious interactions due to field spread in the vicinity of true interactions. 29 Thus, the bispectrum is deemed to provide the best quantification approach to nonlinear cross-frequency interactions in EEG.

Musical improvisation as a window to study Brain-to-brain synchrony

Studies on intra- and inter-brain neural synchrony between pairs of guitarists during musical improvisation have revealed dynamical networks that connect different brain regions, depending on the situation and/or expectations, with involvement of the fronto-parietal region as well as the somatosensory, auditory, and visual cortices. 30 , 31 The analysis of such networks can be used to study the temporal dynamics of these interactions and to provide a neurophysiological interpretation of the observed behavior. Considering the rich and complex interchange of cognitive processes necessary during collaborative artistic performances, hyperscanning is a valid approach to explore the shared neural cognitive traces that emerge from these interactions. 32 , 33

Collaborative, free musical production (improvisation) is a complex and rich form of social interaction; 34 it has also been described as a continuous process of generation and transformation of musical interaction, 35 and offers an interesting object of study for hyperscanning. Similarities between music and language have been observed previously in terms of the social interaction they entail; as described in Ref. 36, a jazz improvisation can be interpreted as a conversation, and a good improvisation as a complex, meaningful conversation. Over recent years, there has been growing interest in the study of improvisational, freely-moving, collaborative musical production in live performance settings. 23 Hyperscanning in the musical context allows the observation of neural traces of dynamic processes, for example the patterns in musicians’ brain activity when performing cooperatively or not, as such actions have been reported to create differences in their peripersonal space 37 and in the rhythmical alignment of the overall performance. 38

This process of improvised production can be perceived as a creative act of communication: one that is complex, nuanced, and technical, integrating simultaneous cognitive processes in real time. Musical improvisation involves complex but rapid interactions of several components, including the generation and evaluation of melodic, harmonic, and rhythmic pattern ideas on a fast time-scale within a performance. 39 Mobile brain-body imaging (MoBI) provides the tools for analyzing neural patterns in real time in freely-moving participants, 1 , 2 , 40 with hyperscanning techniques that provide an experimental approach to assess non-verbal communication in musical performance. 32 During an improvised performance, musicians interact with each other, making use of different skills such as creativity, 41 emotional expression and perception, 34 self-organization, 42 memory retrieval and procedural memory, 43 and integration of visual and auditory stimuli with complex and precise motor coordination. 44 , 45 Musical improvisation can also be considered an 'in the moment' composition, where improvisers generate immediate responses to the present musical environment, manipulating ideas and sequences in a spontaneous manner. 46

Brain-to-brain synchrony between musicians in free-jazz improvisation

Among the many musical styles, this study is centered on jazz, which has roots in a mixture of Afrological (African-American) and Eurological (classical and contemporary) music. 47 More specifically, the performance of “free jazz” is of interest to this study, due to its unique characteristics that grant freedom to the performers. It is not based on pre-defined musical components (e.g. melody, rhythm and harmony), pushes the boundaries of improvisational norms, avoids shared musical frameworks, and allows musicians to generate temporally coherent pieces using improvisation as the major element. 47

In a theoretical model of group jazz improvisation, Biasutti and Frezza 48 identify the processes that are essential for creative musical improvisation: anticipation, use of repertoire, emotive communication, feedback, and flow. In Wopereis et al., 49 26 expert musicians provided statements about musical improvisation in two 10-min individual brainstorm sessions. The statements resulted in a 7-cluster concept map, with self-regulation as the central concept, and affect, risk-taking, ideal, basic skills, responsivity, and creation as constituent concepts for improvisational expertise. Specifically for collaborative improvisation, monitoring, feedback, and evaluation must be performed in association with other musicians, with both generative and communicative attentional demands. 50 Another study on jazz improvisation remarks that shared intentions emerge on the fly, and their presence fosters acoustic and temporal coordination and improves the quality of the performance, as perceived by the performers and listeners. 13 Time perception is likewise important during improvisation, as noted in Refs. 51, 52, where theta band power was found to increase when time was perceived more precisely by an expert musician during a free jazz improvisation in ecological settings.

Recently, the predictive coding of music (PCM) model has been introduced to model how listeners form expectations which may be fulfilled or not, through perception, action, emotion and, over time, learning. 53 Under this model, musical interaction is guided by mutual reduction of prediction errors, evidenced by alpha-band intrabrain neural synchronization (phase-locking analysis) in a right-lateralized temporoparietal network, with a higher occurrence probability in mutually adapting dyads than in leader-leader dyads. 54 These models of music improvisation highlight the centrality of anticipation, self-regulation, generation and evaluation, with feedback and communication in joint performance.

Research aims of this case study

This article aims to contribute to the understanding of brain-to-brain communication during a collaborative musical improvisation between jazz musicians. Here, brain-to-brain communication can be understood as the non-verbal communication (e.g., anticipation, planning, and actions such as taking turns) between dyads of participants (jazz musicians improvising), 55 which can be assessed by quantifying and analyzing the synchrony between their brain signals, recorded simultaneously using the hyperscanning approach. 56 A jazz performance incorporates each of the five elements of musical improvisation: anticipation, feedback, use of previous repertoire, emotive communication, and coordinated flow. 48 Moreover, in a free jazz performance, as described in Ref. 57, a continuous process of evaluation is present, where musicians can decide to maintain or change the current theme; initiate or respond to a change; and adopt, augment, or contrast a given idea.

While improvising, musicians can elaborate over (but are not constrained to) a composition’s underlying chord structure 58 and theme, with variations that incorporate multiple derivations from instantaneous decisions in real-world practice. 59 An important aspect of jazz performance and proficiency lies in embodied cognition and motor memory. 60 However, most neuroimaging studies on musical improvisation have used functional magnetic resonance imaging (fMRI). 39 , 50 Lying down in an fMRI scanner alters spatial and visual perception 61 and restricts body movement, which limits the capability of fMRI studies to observe realistic musical performance given the importance of embodied cognition in the task (due to the continuous retrieval and processing of spatial, auditory, visual and somatosensory information). 60 Because mobile electroencephalography (EEG) does not impose the movement constraints of lying in an fMRI scanner, and thus allows participants to engage naturally in creative production with minimal instrumentation constraints, it may afford advantages in studying musical improvisation. 62 , 63

The current study examines the neural correlates of brain-to-brain communication between jazz musicians during collaborative musical improvisation through hyperscanning, and addresses the limited body of knowledge on collaborative musical improvisation in an ecologically valid production, with freely-moving expert musicians interacting in a jazz performance with a live audience. Here, the presence of a live audience is important, as our cohort of musicians is accustomed to one, to the point that the audience becomes a relevant part of the performance. An inter-brain synchronization analysis was implemented by estimating the bispectrum of EEG signals between musician pairs during collaborative improvisations as they performed for a live audience. Following the concept of ecological validity, the exquisite corpse method was adopted to obtain realistic collaborative improvised art pieces. 40 , 62 The exquisite corpse is a game played by surrealists, in which different artists integrate their contributions into a unique piece, taking turns to add their input in an iterative manner until completing a final piece with the contributions of all members. 40 Under this paradigm, the complete performance is a multi-participant, improvisational free jazz piece built from the creativity of each player.

Methods

Human participants

The experimental methods were approved by the Institutional Review Board of the University of Houston, and are in accordance with the Declaration of Helsinki. All participants provided written informed consent, including agreement to the open-access online publication of information obtained during the experiments, such as data, images, audio, and video. Three healthy male adults (P 1, P 2 and P 3) volunteered for this study. Musicians P 2 and P 3 received formal musical instruction for 12 and 6 years, respectively, and P 1 received informal instruction for 6 years. At the time of the experiments, P 1, P 2 and P 3 had 31, 38, and 26 years of experience performing music, respectively. P 2 and P 3 were music educators at the University of Houston at the time of the experiment. The musicians performed jazz improvisation at a public event at the Student Center of the University of Houston while wearing the MoBI technology. Musicians P 1 and P 2 have a jazz background, whereas P 3 had a ‘classical music’ education. Musician P 1 played the drums, P 2 played the saxophone, and P 3 played the soprano saxophone. P 1 and P 2 had performed jazz regularly together for 6 years, P 2 and P 3 had performed a concert together once before, and P 1 and P 3 had not performed together previously. The heterogeneity in the participants' experience and familiarity was not deliberately targeted; however, it led to interesting results and interpretations, presented in the Results and Discussion sections.

Equipment

High-density active electrode scalp EEG and electrooculography (EOG) recordings were obtained simultaneously for the three musicians during their musical performances. EEG data were acquired wirelessly using 64-channel actiCAP (BP gel) electrodes along with the BrainAmp DC amplifier (actiCAP system, Brain Products GmbH, Germany) at a sampling frequency of 1000 Hz. Electrode placement followed the international 10-20 system. EEG data were referenced online to channel FCz on the superior region of the scalp. Four channels were used to record EOG data. Channels TP9 and TP10 were placed on the right and left temples, respectively, to record horizontal eye movement, whereas channels PO9 and PO10 were placed above and below the right eye, respectively, to record vertical eye movement. Electrode impedances were recorded prior to the experiment and maintained below 60 kΩ, following the manufacturer's guidelines in the Brain Vision Recorder software and user manual on impedance measurement (where impedance values above 60 kΩ are considered ‘bad’, and below 25 kΩ ‘good’). 64

Performances were recorded by three video cameras coupled with a Zoom H6 ( https://zoomcorp.com/) audio recorder from a frontal, superior and lateral perspective. Audio was recorded in a single stereo file at 44100 Hz. Three Sterling ST31 FET condenser microphones ( https://sterlingaudio.net/) were used to amplify the sound from each musician’s instrument during the live performance.

Experimental design

Musicians performed three 15-minute improvisations (trials). Each trial was divided into three 5-minute pieces (segments). In Segment 1, one musician performed while the other two listened. In Segment 2, a different musician joined the first, while the remaining musician listened to the other two. In Segment 3, the third musician joined and all participants performed together until the end of the trial. In a given segment, the musicians who are performing are referred to as the “active” musicians, whereas the musicians who are not performing are the “passive” musicians. At the beginning and at the end of the experiment, three blocks comprising an EEG impedance check, a one-minute eyes-open (EO) recording and a one-minute eyes-closed (EC) recording were obtained.

In each trial, the order of the musicians joining at each segment was pseudo-randomized so that each musician entered one trial as the first, second or third player. Each musician was given a visual cue to signal their start time in the piece. Between one trial and the next, there were short pauses of 3-5 minutes, in which the audience clapped and the musicians prepared for the next trial.

Figure 1(a) shows the protocol for baseline measurements, and the order of musicians joining at each segment and trial. Figure 1(b) shows the locations of EEG electrodes with impedance higher than 25 kΩ at the start and at the end of the recordings, for all musicians. Figure 1(c) shows the setup of the instruments and microphones during the experiments, and the three musicians wearing the EEG caps. Figure 1(d) depicts, as an example, from top to bottom, the recorded raw EOG, EEG and audio signals obtained for the first five seconds of Segment 1 of Trial 1, when only P 3 is performing.

Figure 1. a) Impedance check, baseline (eyes open and eyes closed) measurements and performance times for each participant across the three improvisation trials. b) Impedance values larger than 25 kΩ across electroencephalographic (EEG) electrodes at the start and end of all experiments for the three participants. c) Experimental setup of musicians on stage wearing EEG caps and performing. From left to right: P 1 (drums), P 2 (saxophone) and P 3 (soprano saxophone). d) Representative electrooculography (EOG), EEG and sound recordings during musical performance of the first five seconds of Trial 1.


Three independent raters with training in music composition annotated the data. The annotators were familiar with neither the researchers nor the musicians involved in the performances, and were tasked with writing annotations of the performance independently from each other by watching a recording of the live performance. The annotators were asked to write a short description (e.g. “players are performing in sync”, “mirroring each other”, “performing in discord”), and the time each event happened. Table 1 shows sample descriptions from the annotators during one trial of musical improvisation.

Table 1. Type, times and annotations of events labelled by annotators (in the audience) during Trial 1.

Only synchronized performance (SP) and desynchronized performance (DP) events are presented.

Type of event Time Annotation
SP 1 5:20 Drums and soprano saxophone synchronize
DP 1 5:46 Drum solo
DP 2 6:28 Both play unevenly
SP 2 7:03 Mirror each other
DP 3 7:34 Drum deviates
SP 3 8:23 Mirror each other
DP 4 11:07 Saxophone and soprano saxophone discord; both are trying to lead
SP 4 11:17 Rapid, loud performing - some mirroring
DP 5 12:01 Discord

At the beginning of each trial, video, audio and physiological signals were synchronized using manual event markers (i.e. pressing a button). Recordings from the three trials were obtained simultaneously using this procedure. Unfortunately, data transmission was interrupted from 4:25-5:25 of Trial 3 due to a loss in connection, which resulted in missing data. The events that happened in this period were therefore not included in the analyses.

Signal preprocessing

EEG signals were acquired at 1000 Hz and resampled to 250 Hz to reduce computational cost in subsequent calculations. Signals were band-pass filtered from 0.1 to 100 Hz using a 4th-order Butterworth filter to remove unwanted noise. The PREP pipeline from the EEGLAB package ( https://sccn.ucsd.edu/eeglab/download.php) was used as the initial step to clean the data. 65 This procedure ensures the removal of power line noise, as well as a “true” average reference of the signals. EOG artifacts were removed from the EEG signals using an adaptive noise cancelling (ANC) framework known as the H∞ (H-infinity) filter. 66 Raw EOG signals were used as input to the H∞ filter with parameters γ = 1.15 and q = 1e-10 for simultaneous removal of eye blinks, eye motion, amplitude drifts and recording biases. The obtained signals were further processed using the artifact subspace reconstruction (ASR) algorithm included in the EEGLAB package. 67 The ASR algorithm calculates the standard deviation of a “clean” portion of the signals in the principal component analysis (PCA) subspace, and reconstructs segments whose standard deviation is more than κ times higher than in the clean portion. Here, a value of κ = 15 was chosen to remove remnants of eye movement and muscle artifacts. According to Ref. 68, κ values between 10 and 20 are recommended to reduce artifacts while preserving the content of EEG signals. As a final step, independent component analysis (ICA) was performed on the data, and artifactual components (eye, muscle, electrode popping) were removed before projecting the signals back. A graphical representation of the pre-processing steps, together with feature extraction and subsequent signal analysis, is presented in Figure 2.
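As a concrete illustration of the first two steps above (resampling and band-pass filtering), the following Python sketch uses SciPy under the stated parameters. It is a minimal reconstruction for illustration, not the authors' pipeline; the later EEGLAB-based steps (PREP, the H∞ filter, ASR, and ICA) are not reproduced here, and all variable names are ours.

```python
# Minimal sketch of the initial preprocessing: resample 1000 Hz -> 250 Hz,
# then apply a zero-phase 4th-order Butterworth band-pass (0.1-100 Hz).
import numpy as np
from scipy.signal import butter, sosfiltfilt, resample_poly

FS_RAW, FS_NEW = 1000, 250            # original and target sampling rates (Hz)

def preprocess(eeg, fs_raw=FS_RAW, fs_new=FS_NEW, band=(0.1, 100.0)):
    """eeg: (n_channels, n_samples) array of raw EEG."""
    # 1) Downsample by the rational factor fs_new/fs_raw (with anti-aliasing).
    eeg = resample_poly(eeg, up=fs_new, down=fs_raw, axis=-1)
    # 2) Zero-phase 4th-order Butterworth band-pass, 0.1-100 Hz.
    sos = butter(4, band, btype="bandpass", fs=fs_new, output="sos")
    return sosfiltfilt(sos, eeg, axis=-1)

# Example: one minute of 64-channel noise standing in for a recording.
eeg_clean = preprocess(np.random.randn(64, 60 * FS_RAW))
```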

Figure 2. Signal processing methodology flowchart divided into four main steps: (1) electroencephalographic (EEG) data acquisition, (2) pre-processing and denoising, (3) brain-to-brain feature extraction, and (4) statistical analysis.


Each step is described in detail in the Methods section.

Brain-to-brain feature extraction

The improvisational nature of the performance allowed for the examination of the musical communication between the musicians as the piece progressed. At times, they built from the established theme, returned to the main theme, or proposed new ideas. Using the annotations from the three independent raters, we clustered sections of the performance into synchronized performance (SP) and desynchronized performance (DP). SP included moments of in-time synchronized execution, as well as improvisation and interactions under the same underlying pulse, while DP reflected moments where musicians traded material without being aligned to the same underlying pulse and with no coordination (as well as deviation from the current theme). A temporal (across-time) analysis was performed to observe neural synchronization in those moments of SP and DP; and these times were evaluated across three participant conditions: passive-passive, when neither musician in a dyad was performing; passive-active, when one musician in a dyad was performing; and active-active, when both musicians in a dyad were performing. For the sake of the ecological validity approach, rather than manipulating experimental conditions (e.g. rest vs improvisation), we observed brain synchrony across SP and DP, and the participant conditions posed by the exquisite corpse approach.

The quantitative measurement of brain-to-brain communication was achieved by calculating the bispectrum between musician dyads of EEG data obtained during improvised musical performance at different stages of interactive performance. As referred to in previous works, higher bispectrum magnitudes at given pairs of frequencies reflect non-random interactions, phase coupling, 19 and non-linear multi-frequency interactions, 26 which have been observed as traces of inter-brain synchrony during teamwork interactions. 19 , 24

The denoised EEG signals were used to estimate the bispectrum between all possible channel combinations, for all participant pairs, trials and segments. The bispectrum was estimated across the EEG recordings using four-second windows with 75 % overlap (a one-second step between windows). The bispectrum at each time window was estimated using Equation 1:

B(f_i, f_j)_{t,s,P_{ab}} = \sum_{l=1}^{L} X_l(f_i)\, X_l(f_j)\, X_l^{*}(f_i + f_j), \quad (1)

where X_l(f_i) and X_l(f_j) represent the Fourier transform of window l at frequencies f_i and f_j, respectively, and L is the total number of windows. Subscripts t and s denote the trial and segment in which the bispectrum is calculated for participants a and b, on two different EEG channels. The term X_l^{*}(f_i + f_j) represents the complex conjugate of the Fourier transform at the sum frequency f_i + f_j. 69 Using this method, the bispectrum was estimated for all f_i = f_j, in 50 frequency bins between 1-50 Hz.
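The following Python sketch illustrates one way to compute Equation 1 on the diagonal f_i = f_j, with four-second windows and a one-second step. Equation 1 does not spell out which signal supplies each of the three Fourier terms; the assignment below (the first term from one musician's channel, the remaining two from the other's) and all variable names are assumptions made for illustration only.

```python
# Minimal sketch of the diagonal cross-bispectrum of Equation 1 for one pair
# of channels from two musicians, summed over all four-second windows.
import numpy as np

FS = 250                      # sampling rate after preprocessing (Hz)
WIN = 4 * FS                  # four-second analysis window (1000 samples)
STEP = 1 * FS                 # one-second step, i.e. 75 % overlap
FREQS = np.arange(1, 51)      # 1-50 Hz bins

def cross_bispectrum(x_a, x_b, fs=FS, win=WIN, step=STEP, freqs=FREQS):
    """Return |B(f, f)| for f in `freqs`, summed over all windows."""
    starts = range(0, min(len(x_a), len(x_b)) - win + 1, step)
    fft_freqs = np.fft.rfftfreq(win, d=1.0 / fs)   # 0.25 Hz resolution
    idx_f = np.searchsorted(fft_freqs, freqs)      # indices of f
    idx_2f = np.searchsorted(fft_freqs, 2 * freqs) # indices of f + f
    bisp = np.zeros(len(freqs), dtype=complex)
    for s in starts:
        Xa = np.fft.rfft(x_a[s:s + win])           # musician a, channel i
        Xb = np.fft.rfft(x_b[s:s + win])           # musician b, channel j
        bisp += Xa[idx_f] * Xb[idx_f] * np.conj(Xb[idx_2f])
    return np.abs(bisp)
```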

The bispectrum was estimated for 60 × 60 EEG channel combinations between pairs of participants, for all segments and trials. A bispectral representation of a segment was obtained by averaging all four-second windows in each segment, for each frequency bin (1-50 Hz). Bispectral representations were normalized to the bispectral representations of the same channel combinations during the pre-trial EO task using Equation 2. The pre-trial EO recording was treated as the rest condition, in which participants did not communicate with each other.

BS_N = \frac{BS_{Seg} - BS_{EO}}{BS_{EO}}, \quad (2)

where BS N is the normalized bispectrum, BS Seg is the average bispectral representation during a segment and BS EO is the bispectral representation during the EO task at the same channel combination. Normalized bispectrum representations were obtained using Equation 2 for all segments, trials, channel combinations and participant pairs. By applying this normalization, positive values of BS N for a specific channel combination represent higher temporal synchronization between specific participant pairs during performance when compared to Rest state (pre-trial EO).

Bispectral values for five frequency bands were obtained as the average of the normalized bispectral representation in the following frequency ranges: delta (1-4 Hz), theta (4-7 Hz), alpha (8-12 Hz), beta (13-29 Hz) and gamma (30-50 Hz).
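A minimal sketch of the normalization in Equation 2 and the band averaging described above is given below; the variable names (e.g. bs_segment, bs_eo) are illustrative and not from the paper.

```python
# Minimal sketch of Equation 2 (normalization to the pre-trial eyes-open
# bispectrum) and averaging of the normalized bispectrum within each band.
import numpy as np

BANDS = {"delta": (1, 4), "theta": (4, 7), "alpha": (8, 12),
         "beta": (13, 29), "gamma": (30, 50)}
FREQS = np.arange(1, 51)          # 1-50 Hz bins, as in Equation 1

def normalize_bispectrum(bs_segment, bs_eo):
    """Equation 2: relative change of the segment bispectrum vs. rest (EO)."""
    return (bs_segment - bs_eo) / bs_eo

def band_average(bs_norm, freqs=FREQS, bands=BANDS):
    """Average the normalized bispectrum within each frequency band."""
    return {name: bs_norm[(freqs >= lo) & (freqs <= hi)].mean()
            for name, (lo, hi) in bands.items()}
```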

A temporal bispectrum series was also estimated, using sliding four-second windows with 75 % overlap. At each window, the temporal bispectrum values were obtained as the average normalized bispectral representation ( Equation 2) at each frequency band, thus obtaining a temporal representation of the synchronization of EEG signals between musicians. These temporal bispectrum values were estimated for all windows, using the channel combinations found to be significant in the implemented statistical analysis. The analysis is described in detail in the Statistical analysis subsection.

Statistical analysis

Right-tailed Wilcoxon signed rank tests were used to evaluate statistically significant differences between the average bispectrum at different frequency bands for all channel combinations. Average bispectral representations during rest and specific segments were compared.

This procedure ensures the discovery of only those channel combinations with significantly higher bispectrum at a specific segment and for a given frequency band as compared to rest. At each segment, 60 × 60 = 3,600 tests were performed ( p < 0.05, corrected for multiple comparisons via Bonferroni correction). Statistical tests were performed for all trials (3), segments (3), participant pairs (3) and frequency bands (5), for a total of 486,000 tests. A different number of samples was used for each frequency band, owing to differences in bandwidth: 16 for delta and theta, 20 for alpha, 68 for beta, and 82 for gamma.
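As an illustration of the per-segment testing procedure, the sketch below applies a right-tailed Wilcoxon signed-rank test to each of the 60 × 60 channel combinations for one dyad, segment and frequency band, with a Bonferroni-corrected threshold. The array shapes and names are assumptions for illustration, not the authors' code.

```python
# Minimal sketch: right-tailed Wilcoxon signed-rank test (segment > rest)
# per channel combination, Bonferroni-corrected over 60 x 60 tests.
import numpy as np
from scipy.stats import wilcoxon

N_TESTS = 60 * 60                       # channel combinations per segment
ALPHA = 0.05 / N_TESTS                  # Bonferroni-corrected threshold

def significant_combinations(bs_segment, bs_rest, alpha=ALPHA):
    """bs_segment, bs_rest: (60, 60, n_samples) band-limited bispectrum
    samples for one dyad, segment and band. Returns a (60, 60) boolean mask
    of channel combinations with significantly higher bispectrum than rest."""
    sig = np.zeros(bs_segment.shape[:2], dtype=bool)
    for i in range(bs_segment.shape[0]):
        for j in range(bs_segment.shape[1]):
            _, p = wilcoxon(bs_segment[i, j], bs_rest[i, j],
                            alternative="greater")   # right-tailed test
            sig[i, j] = p < alpha
    return sig
```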

Through this procedure, the identified channel combinations were used as representative traces of brain-to-brain communication during musical improvisation. To further explore the behaviour of such traces, temporal and spatial analyses were implemented.

Temporal analysis

The temporal analysis of the bispectrum was implemented on representative bispectrum traces to observe their dynamics under different conditions during the performance. The traces used in this analysis were those of the most significant channel combination (in the gamma band, as described in the Results section) at the third segment of each trial, for each participant pair.

The bispectrum analysis was divided into two groups of events of naturally occurring experimental conditions: SP and DP, as labelled by the annotators in the audience. For each event, a two-minute representative bispectrum trace over the period (-60 to 60 s) was obtained per dyad, with the time of the annotation taken as the 0 s mark. To observe relative differences between SP and DP, the bispectrum traces were baseline corrected at each event for both groups: the average bispectrum in the (-60 to 0 s) period was computed and subtracted from each two-minute trace.

Baseline-corrected bispectrum traces were obtained for all events, trials and participant pairs, grouped by event type, and compared between the two groups. Wilcoxon signed rank tests were used to find statistically significant differences ( p < 0.05) at every time point in the (-60 to 60 s) window between SP and DP, at each performance condition. This analysis was implemented independently for events in the passive-passive, passive-active and active-active conditions.
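A minimal sketch of this event-locked analysis is shown below: each trace is baseline-corrected with its own -60 to 0 s mean, and SP and DP traces are then compared at every time point. Because the SP and DP groups here contain different numbers of events, the sketch uses an unpaired rank test at each time point; this choice, and all variable names and shapes, are assumptions made for illustration.

```python
# Minimal sketch: baseline correction of event-locked bispectrum traces and a
# per-time-point comparison of SP vs. DP groups.
import numpy as np
from scipy.stats import ranksums

T = np.arange(-60, 60)      # one bispectrum value per second, -60 to 60 s

def baseline_correct(traces):
    """traces: (n_events, len(T)) traces; subtract each trace's -60..0 s mean."""
    baseline = traces[:, T < 0].mean(axis=1, keepdims=True)
    return traces - baseline

def compare_sp_dp(sp_traces, dp_traces, alpha=0.05):
    """Per-time-point unpaired rank test of baseline-corrected SP vs. DP traces."""
    sp, dp = baseline_correct(sp_traces), baseline_correct(dp_traces)
    pvals = np.array([ranksums(sp[:, t], dp[:, t]).pvalue
                      for t in range(len(T))])
    return pvals < alpha     # boolean mask of significant time points
```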

Spatial analysis

Spatial analysis was implemented to identify regions of interest (ROIs) involved in musical performance. The selected ROIs group spatially close electrodes into 13 regions: anterior frontal (AF), left fronto-central (LFC), midline fronto-central (MFC), right fronto-central (RFC), left centro-parietal (LCP), midline centro-parietal (MCP), right centro-parietal (RCP), left parieto-occipital (LPO), middle parieto-occipital (MPO), right parieto-occipital (RPO), left temporal (LT), right temporal (RT) and occipital (O). 70 Figure 7 shows the locations of the 13 ROIs on the scalp map. The significant channel combinations identified through the statistical analysis at every segment and trial were grouped for the different performance conditions: passive-passive, passive-active and active-active.
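As an illustration of this ROI summary, the sketch below maps significant channel combinations to ROI pairs and counts their occurrences. The channel-to-ROI assignment shown is a small illustrative subset, not the full 13-region mapping used in the study.

```python
# Minimal sketch: count significant channel combinations per ROI pair.
from collections import Counter

CHANNEL_TO_ROI = {              # illustrative subset of the 13-ROI mapping
    "AF3": "AF", "AF4": "AF",
    "T7": "LT", "T8": "RT",
    "P3": "LCP", "P4": "RCP",
    "O1": "O", "O2": "O",
}

def roi_counts(significant_pairs, channel_to_roi=CHANNEL_TO_ROI):
    """significant_pairs: iterable of (channel_a, channel_b) combinations that
    reached significance for one dyad/condition. Returns counts per ROI pair."""
    counts = Counter()
    for ch_a, ch_b in significant_pairs:
        roi_pair = tuple(sorted((channel_to_roi[ch_a], channel_to_roi[ch_b])))
        counts[roi_pair] += 1
    return counts

# Example: three significant combinations between one dyad.
print(roi_counts([("T7", "O1"), ("T7", "O2"), ("P3", "P4")]))
```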

Figure 7. Topographical representations of between-participants inter-synchronization of 13 regions of interest (ROIs) (anterior frontal, left fronto-central, midline fronto-central, right fronto-central, left centro-parietal, midline centro-parietal, right centro-parietal, left parieto-occipital, middle parieto-occipital, right parieto-occipital, left temporal, right temporal and occipital) across all dyads (3) and trials (3), in passive-passive (left), passive-active (middle) and active-active (right) conditions, in the gamma (top) and beta (bottom) bands.


Shading represents the degree of inter-synchronization within the same (circles) and different (lines) ROIs.

Results

A general representation of the bispectral dynamics between pairs of participants during Trial 1 is shown in Figure 3. Here, the normalized bispectrum is presented for the three participant pairs (P 12, P 13 and P 23) in separate insets. In each inset, four plots are shown: a bispectrogram (top left), the average bispectrum at each time window in the gamma band (bottom left), the average bispectrum at each frequency bin (top right), and the most significant channel combination found in the gamma band for each dyad (bottom right).

Figure 3. Bispectral estimations in frequency and time domain during Trial 1 for all participant pairs: P 12 (top), P 13 (middle) and P 23 (bottom).


For each participant pair, four insets are provided: (1) average bispectrum in the gamma band across the 15 minutes of musical improvisation (bottom left); (2) bispectrogram (1-50 Hz) across the 15 minutes (top left); (3) a representative significant channel combination in the gamma band (bottom right); (4) average bispectrum (1-50 Hz) across the 15 minutes of performance (top right).

The bispectrogram shows positive values from 0-0.3, which means that bispectrum values were up to 30 % higher during musical performance than during rest. 98 From the bispectrum representation in frequency, it can be observed that, on average, higher frequencies show the highest values. This behaviour is most evident for P 12 and P 23. The temporal dynamics of the bispectrum show oscillations at different moments of the trial, which correspond to fluctuations in EEG signal synchronization between participants. Across all participant pairs, the average bispectrum in the gamma band tends to increase from the initial segments to the latter ones, where more musicians are performing together. The regions where the highest significant synchronization was found include temporo-occipital (P 12), occipito-occipital (P 13) and temporo-frontal (P 23) connections.

As Figure 1 shows, in Trial 1, P 3 starts performing. During Segment 1, participants P 1 and P 2 were listening to P 3 perform, and the bispectrum of dyad P 12 is lowest. The bispectrum trace of dyad P 13 is also low in Segment 1, whereas the strongest bispectrum is observed for dyad P 23. In Segment 2, P 1 joins the performance and an increase in bispectrum is observed for P 13, as both participants are performing together. The bispectral traces of P 12 and P 23 show slightly higher values towards the end of Segment 2. By Segment 3, P 2 joins the other two participants and all are improvising together. Bispectrum increases for all participant pairs are observed during this final segment, reflecting higher EEG synchronization compared to segments where participants are not actively interacting in the performance. These observations were tested statistically for all channel combinations in Figure 4.

Figure 4. a) Total significant bispectrum channel combinations between musician dyads, across all frequency bands when one, two and three musicians perform together. b) Most significant channel combinations (up to 5) during all trials (rows) and segments (columns), for all dyads (P 12, P 13 and P 23) and frequency bands (alpha, beta and gamma).


Lines represent specific channel combinations with significantly higher bispectrum during improvisation than in rest condition ( p < 0.05) *. White and gray heads show the passive and active musicians, respectively. Bars insets show the total significant channel combinations for all participant pairs for alpha, beta and gamma bands at each segment. *Statistical tests were corrected for multiple comparisons via Bonferroni correction.

Statistical analysis

The procedure of the statistical tests presented in the Statistical analysis subsection was implemented for all Segments, Trials, frequency bands and participant pairs. No significant channel combinations were found for the delta and theta band for any segment. Most statistically significant channel combinations were found for the beta and gamma bands, and a few in the alpha band.

Figure 4(a) shows the total significant channel combinations (between all dyads) when different musicians were performing together, and Figure 4(b) shows a topographical representation of the statistical analyses.

In Figure 4(a), the dyads P 12 and P 23 show consistently more significant inter-brain synchronized channels than the P 13 dyad. The three dyads showed few synchronized channels when musician P 3 performed alone, and when P 1 performed together with P 3.

The most significant channel combinations (visualizing the top 5 channel pairs) for the alpha, beta and gamma bands are shown for each participant pair. Bar graphs show the number of significant channel combinations at each frequency band and dyad. In each specific segment, passive and active musicians are shown as white and gray heads, respectively. Topographical representations are presented for all trials and segments; therefore the first row of Figure 4 corresponds to the data shown in Figure 3.

In Trial 1 (top row of Figure 4(b)) and Segment 1, P 3 starts performing, and only a few significant channel combinations were found for dyad P 12 in the gamma band. In Segment 2, P 1 joins and more channel combinations are shown in the beta and gamma bands for dyad P 12 and P 13, who are performing. In Segment 3, when all musicians are performing, an increase in the amount of significant channel combinations is observed for all participant pairs. In this last Segment, a few channel combinations were observed in the alpha band.

In Trial 2 (middle row of Figure 4(b)), P 1 starts performing in Segment 1. A few channel combinations were found to be significant for all participant pairs in the beta and gamma bands. In Segment 2, P 2 joins and an increase in the number of significant channel combinations is observed for dyads P 12 and P 23. In Segment 3, P 3 joins and a decrease in the number of significant channel combinations is observed across all participant pairs.

In Trial 3 (bottom row of Figure 4(b)), P 2 starts performing, and significant channel combinations for P 12 and P 23 are observed in the beta and gamma bands, with only one for P 13. In Segment 2, P 3 joins and a connection pattern similar to that of Segment 1 is observed between participants. In Segment 3, P 1 joins and a considerable increase in significant channel combinations is observed for both P 12 and P 13, while those for P 23 remain similar.

Some general patterns were observed through this analysis. The number of significant channel combinations increased as more musicians joined the performance, as shown in Figure 4(a). Dyad P 13 showed a smaller number of significant channel combinations throughout the experiment, across segments and trials ( Figure 4(a)). Also, in all segments, the number of significant channel combinations was higher for the gamma band than for the beta or alpha bands. Finally, the most common interconnected regions across segments and trials are those involving the temporal, occipital and parietal regions.

Temporal analysis

Figure 5 shows the normalized bispectrum trace for the 15 minutes of Trial 1, using the significant channel combinations described in the temporal analysis subsection. The vertical dashed lines mark the division between segments, and bars are used to visualize the moments during the performance when the experts identified either an SP or DP event. The volume of the recorded audio file from the performance is shown below the bispectrum traces. The individual events shown in Figure 5 are described in Table 1. The corresponding Figures and Tables for Trials 2 and 3 are presented in the Extended data, in Figures S1 and S2, and Tables S1 and S2, respectively. Representative SP and DP events from Trials 1-3 are presented in Videoclips S1-S3 in the Extended data. 98

Figure 5. Bispectrum temporal dynamics at the most significant channel combination per participant pair in the gamma band, and normalized volume intensity (unitless) of the audio recorded during the performance of Trial 1.


Vertical dashed lines represent the times when a new musician joined the performance. Vertical bars represent time of synchronized performance (SP) or desynchronized performance (DP) events, as labelled by experts. A representation of the selected channels for each dyad is shown in the bottom right corner. The annotations from the events are shown in Table 1.

Figure 6(a) and (b) show the average bispectrum change across participant pairs for the passive-active and active-active conditions, respectively, for both SP and DP. The 0-second vertical dotted line in Figure 6 indicates the start of the event (either SP or DP). The numbers of averaged traces were 24 (passive-active) and 31 (active-active) for SP, and 14 (passive-active) and 26 (active-active) for DP. Figure 6(c) and (d) show every individual trace analyzed in (a) and (b), respectively.

Figure 6. Average bispectrum (BS) change (%) across all segments and participant pairs at their most significant channel combination in gamma band, for both synchronized performance (SP) and desynchronized performance (DP) events.


BS change (%) was obtained by applying baseline correction to each trace using the 60 s prior to each event. Average BS changes (%), and histograms (distribution of all traces), are shown for the passive-active (a) and active-active (b) conditions. The numbers of averaged traces in a) and b), respectively, are 24 and 31 during SP (n SP ), and 14 and 26 during DP (n DP ). Individual BS changes (%) for all n SP and n DP events are presented in c) and d) for the passive-active and active-active conditions, respectively.

No statistical significance was observed between SP and DP in the passive-active condition. Bispectrum change was significantly higher during SP than DP in the active-active condition, slightly before the onset of annotated events (− 3 s), as well as 40 s after the onset.

Spatial analysis

A topographical visualization of the most significant scalp ROIs for the participant pair conditions (summarizing the findings of Figure 4) is shown in Figure 7. Visualization maps were plotted to represent the degree of synchronization between and within the evaluated ROIs for all conditions (passive-passive, passive-active and active-active), dyads (3) and trials (3). Figure 7 shows this representation in the gamma and beta bands. The most synchronized ROIs are observed in the active-active condition, fewer in the passive-active condition, and the fewest in the passive-passive condition.

Summary of main findings

Brain synchrony (as assessed via the bispectrum) increases when more musicians are performing. This is most noticeable in the higher frequency bands (beta and gamma).

The number of inter-connected channels and ROIs increases when more musicians are performing. This happened mainly across temporal, occipital and parietal regions, related to multi-sensory processing.

Changes in brain synchrony (bispectrum) allowed the identification of different performing states across musicians during the improvisation: bispectrum was higher during synchronized performance and lower during desynchronized performance.

Discussion

The bispectrum analysis allowed us to obtain a quantitative representation of brain-to-brain communication, by analyzing the temporal synchronization strength (i.e. bispectrum) of EEG signals between musicians during a free jazz improvisation performance. In such performances, musicians continuously engage in a dynamic communication formed by perception, evaluation and action. The presented methods were applied to observe the synchronized interactions of neural activity at different stages of the performance, across recording sites, and in five frequency bands.

Following our proposed methods, bispectral representations in the time and frequency domains were obtained for all possible combinations of the assessed variables (segment, trial, frequency band, and channel). Statistical analysis revealed that the most significant synchronization between EEG signals of paired musicians was found in the high frequency bands, beta and gamma, as shown in Figure 4. In the same analysis, it was noted that most of the significant neural synchronization links were formed between the occipital, parietal and left temporal regions. These results were used to assess the most frequent connections between ROIs across pairs of musicians under different performance conditions (see Figure 7).

Although musicians exhibit differences in their brain activity due to individual preferences, domain-specific memory for previously encountered auditory-motor patterns, 39 , 71 as well as the nature of their instruments (e.g. drummers use more spatial and visual processing), common patterns were observed. In this study, the most synchronized ROIs between musicians were found at left temporal, and bilateral parietal and occipital sites (LT, LPO, O, RPO and RT), with increased synchronization in the beta and gamma bands.

These results have further implications for cross-modal plasticity due to musical training between individuals. The posterior coupling between musicians can be strengthened through extensive training. 72 , 73 Such processes are present when musicians rhythmically engage in a collaborative, creative work. Two processes give rise to this dynamical sensorimotor integration: motor commands, and sensory predictions that provide feedback regarding the given command. 73 - 75 This feedback loop often informs individuals of errors or mismatches between predicted and real sensory feedback, which results in the reconfiguration of this perception-action cycle. 75 , 76 However, this cycle is not restricted to self-generated action. An increasing body of research suggests that in musical contexts, musicians are able to form multiple action representations, performing real-time integration of perceptual stimuli, motor commands, and outcome predictions for oneself and others. 73 This complex, moment-to-moment processing within the perception-action cycle, informed by internal forward models, may be the foundation of inter-personal synchrony in creative, musical contexts. 73

fMRI studies of neural synchronization in the resting state in musicians have found increased functional connectivity between the auditory and motor cortices within an individual’s brain, 77 and within the default mode and executive control networks. 78 fMRI studies have also shown long-term induced plasticity 79 in trained musicians when compared to non-musicians. Improvising jazz musicians exhibit weaker connectivity in pre-frontal areas during musical improvisation than when performing pre-learned segments. 80 Findings resembling ours regarding gamma-band activity in music-related processes have been reported: expert musicians exhibit neural synchronization between multiple cortical areas in the gamma band 81 and left-hemispheric gamma synchrony while listening to music, 82 while such patterns are not observed in non-musicians. Inter-brain synchronization in theta/alpha amplitudes between temporal and lateral-parietal regions has also been described during speech-rhythm coordination in Ref. 83. The results obtained from the statistical and ROI analyses suggest that beta and gamma synchronization is present during the performance of higher cognitive tasks that require a dynamic binding of information, such as an improvised collaborative musical performance. In our case study, the presence of higher synchronization between temporal, parietal and occipital sites during improvised musical performance suggests the establishment of functional inter-connections between musicians, reflecting shared multi-sensory processing, integration, and communication. 31 Auditory and visual cues from co-performers have been reported to relate to the strength of inter-musician coordination during musical improvisation. 84

The results presented in this study show evidence for an inter-musician perception-action cycle, in which a circular, feedback-based, hierarchical mode of information processing is conducted by the interplay between posterior (i.e. sensory input) and anterior (i.e. motor, executive output) regions of the cortex. 76 In this experiment, inter-brain bispectrum analysis showed synchronicity in sensory areas. Cross-modal plasticity, and reinforcement of intra-brain coupling of posterior and anterior areas, has been shown to be enhanced by musical training. 73 Experience in joint performance leads to fine-tuning of the internal forward model representation, which allows the prediction of observed or heard actions from fellow musicians with high temporal resolution. Our results suggest that coupling in posterior and temporal regions is associated with such predictions of the actions of other members of the performing group. The musicians generate predictions about both when a peer’s new musical idea will occur and what it will be. Musicians with experience performing together may in fact learn which successions of tones are likely to occur, stemming from regularities in previous performances.

Musicians P 1 and P 3 performed together for the first time in this experiment, while musicians P 1 and P 2 had performed regularly together prior to this study. Thus, it is likely that P 1 and P 2 had developed strong internal forward models of each other that enabled them to predict and respond to recognized sequences between them, as shown in Figure 4(a). Musician P 3 had the least prior musical collaboration with the other musicians. This difference in familiarity is consistent with the finding of a smaller number of synchronized channels between P 3 and the other musicians throughout the three trials of the performance.

Across all trials of the present study, significant bispectrum synchronization was found in posterior (e.g., parietal, temporal and occipital) regions that are involved in the processing of sensory input and are important in interpreting sensory feedback from the external environment. Because musical improvisation is founded in a nuanced, interpersonal exchange of motor commands generated from constantly evolving sensory input, these findings support the notion that this musical, creative synchrony between participants is highly dependent on the sensory, perceptual inputs they receive from each other and their surroundings. The output in this cycle (i.e. action) would be represented by activation of anterior (e.g. frontal) regions. In Figure 7, connections involving anterior regions are more prevalent during active-active interactions, when both musicians are performing (producing an action) together, while a predictive component can also be observed in Figure 6, where a significant positive bispectrum change was observed across all analyzed SP events approximately three seconds before the onset of the labelled events. Neither component was present during passive-active and passive-passive interactions, in which action and anticipation are less needed given the nature of those conditions.

Another key variable to address in our study is the temporal dynamics of the bispectrum. In Figure 5, the temporal bispectrum dynamics show a consistent increasing trend as more musicians joined the performance. Towards the final segments of the performance, there were more ‘musical voices’ interacting, increasing the complexity of the piece, as well as the stimulation, perception, and engagement. This increase in bispectrum was also observed in Trial 3 for P 12 and P 13, but not for P 23, which presented a decreasing trend (see Figures S1 and S2, Extended data 98 ). In Trial 2, a general bispectrum decrease was observed for all dyads, with a sudden increase at the end of the first segment, when a new musician joined the performance. As mentioned in the Experimental design subsection, P 3 is a classically trained musician, while P 1 and P 2 are professional jazz musicians. It is also important to mention that P 1 and P 2 often perform together, while P 3 was not previously acquainted with them. Brain-to-brain synchrony has been studied between dyads in different social contexts, such as between romantic couples and strangers, 21 , 22 and it has been reported that higher neural coupling relates to the degree of social connectedness and mutual pro-sociality. It has also been noted from recent musical improvisation studies that familiarity between musicians predicts stronger coordination of intentions during the performance. 85 Increased neural synchrony between two participating individuals may indicate mutual, efficient, and effective social interaction 86 and can be modulated by the degree to which the participating individuals feel socially connected, the activity they are engaging in, and the interaction setting. 21 , 86 - 88 An interpretation of the aforementioned temporal bispectrum changes is that during bispectrum decreases, participants were not communicating effectively, and the “closeness” between them had an important role in this communication. It can be observed from the bars in Figure 4 that a lower number of significant channel combinations was found for the dyads involving the unacquainted P 3, whereas a higher number of significant channel combinations was found between the more acquainted musicians P 1 and P 2. This is also evident in Figure 4(a), where the highest number of significant channels is presented between P 1 and P 2, and fewer combinations are significant when P 3 is involved.

By analyzing the fluctuations of bispectrum change relative to the stimulus type and onset, differences in bispectrum dynamics were observed depending on whether the musicians were performing in synchrony or not. This analysis revealed, on average, higher bispectrum during synchronized performance than during desynchronized performance. Higher inter-brain synchrony has been reported in participants performing cooperative tasks, and lower synchrony when performing competitive tasks. 15 Based on the results of this study, the higher bispectrum observed between participants while performing in synchrony might be a reflection of cooperative intention. Conversely, lower bispectrum values during desynchronized performance might be indicative of competitive behavior (e.g., changing the current theme or proposing a new idea). A review of this topic is presented in, 16 however most studies in this area are based on experimental designs under controlled laboratory settings. Our study presents a real-world scenario and is therefore well suited to studying these types of interactions. An interesting point in this regard is that in musical improvisation, both alignment and misalignment between musicians are needed to contribute a new perspective on an established theme 59 and to continuously propose new musical paths within the piece. Moreover, the observed brain synchrony dynamics are associated with the spontaneous decisions and interactions of the musicians during an unconstrained free jazz improvisation, which does not afford the temporal prediction that a steady pulse would induce in the musicians. 33
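For readers who wish to explore similar analyses, the following minimal Python sketch illustrates one common way to estimate a cross-bispectrum magnitude between two EEG channels and to average it over a frequency band of interest (e.g., gamma). It assumes the formulation B_xy(f1, f2) = E[ X(f1) Y(f2) X*(f1 + f2) ]; it is an illustration only, not the exact estimator, windowing, or normalization used in this study, and the segment length, overlap, and band limits are assumed values.

import numpy as np

def cross_bispectrum(x, y, fs, seg_len=512, overlap=0.5):
    # Segment-averaged cross-bispectrum magnitude |B(f1, f2)| between two
    # equally sampled signals x and y, using the formulation
    # B(f1, f2) = mean over segments of X(f1) * Y(f2) * conj(X(f1 + f2)).
    step = max(int(seg_len * (1.0 - overlap)), 1)
    n_freqs = seg_len // 2                       # one-sided frequency axis
    acc = np.zeros((n_freqs, n_freqs), dtype=complex)
    window = np.hanning(seg_len)
    n_segs = 0
    for start in range(0, len(x) - seg_len + 1, step):
        X = np.fft.fft(window * x[start:start + seg_len])
        Y = np.fft.fft(window * y[start:start + seg_len])
        for i in range(n_freqs):                 # only pairs with f1 + f2 below Nyquist
            j = np.arange(n_freqs - i)
            acc[i, j] += X[i] * Y[j] * np.conj(X[i + j])
        n_segs += 1
    freqs = np.fft.fftfreq(seg_len, d=1.0 / fs)[:n_freqs]
    return np.abs(acc / max(n_segs, 1)), freqs

def band_bispectrum(B, freqs, band=(30.0, 50.0)):
    # Average the bispectrum magnitude over a square (f1, f2) region,
    # here a nominal gamma band; the band limits are illustrative.
    sel = (freqs >= band[0]) & (freqs <= band[1])
    return B[np.ix_(sel, sel)].mean()

# Example with surrogate noise standing in for two musicians' EEG channels.
fs = 250.0
rng = np.random.default_rng(0)
x = rng.standard_normal(int(60 * fs))
y = rng.standard_normal(int(60 * fs))
B, freqs = cross_bispectrum(x, y, fs)
print("Gamma-band |bispectrum|:", band_bispectrum(B, freqs))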

Our results suggest that the bispectrum was able to capture relevant temporal and spatial information about the musicians' interactions during the performance. The proposed method could therefore be used to track the degree of synchronized interaction and can be applied in different contexts. Among the applications and desired outcomes of research in this field is the development of neural biomarkers that measure, in real time, the quality or strength of shared cognitive states such as brain-to-brain communication, shared attention, message conveyance, and high engagement during human interactions. Two possible applications are the use of such methods to track changes in social interactions in patients suffering from communication disorders, and to enhance learning in educational settings.
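As a sketch of how such a metric could track the degree of synchronized interaction over time, the snippet below slides a window along two signals and returns a time course of band-averaged bispectrum magnitude. It reuses the hypothetical cross_bispectrum and band_bispectrum helpers from the previous example; the window length and step are illustrative choices, not those used in the study.

import numpy as np

def sliding_band_bispectrum(x, y, fs, win_s=10.0, step_s=2.0, band=(30.0, 50.0)):
    # Time course of band-averaged |cross-bispectrum| computed in a sliding
    # window; assumes cross_bispectrum and band_bispectrum are defined as above.
    win, step = int(win_s * fs), int(step_s * fs)
    times, values = [], []
    for start in range(0, len(x) - win + 1, step):
        B, freqs = cross_bispectrum(x[start:start + win], y[start:start + win], fs)
        times.append((start + win / 2) / fs)     # window centre, in seconds
        values.append(band_bispectrum(B, freqs, band))
    return np.array(times), np.array(values)

# e.g., t, synchrony = sliding_band_bispectrum(x, y, fs) for the surrogate signals
# above; peaks in 'synchrony' would mark windows of stronger coupling.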

While the social nature of individuals has long been recognized as foundational to human interaction, research on the inter-brain neural basis of these interactions in naturalistic social settings has only just begun. 88 Hyperscanning applied to social interactions opens the possibility of studying and enhancing social exchanges. 15 In a recent study, large groups of museum-goers participating in face-to-face pairs in an artistic neurofeedback installation exhibited higher levels of inter-brain synchronization in low alpha and beta frequency bands, correlated with the pair's empathy, social closeness, engagement, joint action, and eye contact. 89 Observing brain-to-brain synchronization during naturalistic exchanges could not only aid in developing a more comprehensive understanding of its neural underpinnings, but could also shed light on various communication disabilities. 86

A bispectral analysis of brain-to-brain synchronization during social interactions could also be used in education, for example to increase teacher-learner synchronization and thereby enhance learning outcomes and experiences. 90 A study examining the effects of brain-to-brain synchronization in a classroom setting was performed on a group of four science students and a teacher; in that study, alpha-band synchrony between students significantly predicted subsequent performance (i.e., memory retention) on both immediate and delayed post-tests. 91 Inter-brain synchronization in prefrontal and superior temporal cortices between instructor-learner dyads has been shown to increase when the instructor engages with the learner through guiding questions and hints. 92 These results are also consistent with previous research on the synchrony between speakers and listeners. 91 , 93 , 94 Brain-to-brain synchronization in a naturalistic musical performance provides a window into the perception-action and communicative cognitive processes required during musical improvisation 48 ; coupled with instructor-learner interactions, inter-brain synchronization metrics could inform effective pedagogical techniques.

This study faced some limitations given the logistical challenges of integrating performance and MoBI research in a public setting. First, the study has a small sample size (three professional musicians). However, this drawback is offset by the ecological validity of our experiments, which were intended to capture the interactions of musicians during real-world improvised performance, 32 , 33 and by an experimental design that included counterbalancing to allow testing different participants in different orders. The authors believe the ecological approach and experimental methodology used herein represent a milestone in the acquisition and understanding of brain data "in action and in context" and in the development of brain-to-brain communication metrics. The authors intend to implement the presented methods in different experimental designs aimed at unveiling the shared neural traces of human interactions in a variety of contexts (e.g., dance, theater, teaming, and education).

Another limitation of this study is the potential lingering effect of artifacts associated with the movements needed to perform music on the (percussion and wind) instruments involved in the experiments. Such artifacts are likely related to body and facial movements, blowing, and head swaying, among others, and can contaminate the EEG signals. Although we deployed well-known pre-processing and de-noising methods from the literature 1 , 2 , 95 and performed visual inspection of the raw and cleaned data, it is possible that residual motion- and muscle-related artifacts remain in the processed EEG signals, and these results should therefore be interpreted with caution. As additional information, a comparison of the EEG signals' independent components (ICs) before and after the de-noising framework implemented in this study is presented in Figures S5-S10 ( Extended data 98 ). On the other hand, the existence of body movements can provide insight into behavioral synchronization between musicians. 96 At the same time, it has been reported that movement synchrony in groups is correlated with the presence of positive emotions. 97
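For illustration only, the sketch below shows a generic artifact-reduction pass in MNE-Python (band-pass filtering followed by ICA-based removal of components judged to reflect motion or muscle activity). This is not the specific de-noising framework used in this study, and the file name, component count, and filter settings are assumptions chosen for the example.

import mne

# Hypothetical recording file; the study's data are distributed as .mat files.
raw = mne.io.read_raw_brainvision("Block1_P1.vhdr", preload=True)
raw.filter(l_freq=1.0, h_freq=50.0)        # keep the delta-to-gamma range analyzed here

ica = mne.preprocessing.ICA(n_components=20, random_state=0)
ica.fit(raw)
# Indices of components judged (by visual inspection or an automatic classifier)
# to capture motion, muscle, or ocular artifacts would be listed here.
ica.exclude = []
clean = ica.apply(raw.copy())              # reconstruct the signal without those components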

It is also possible that synchronized breathing is another source of similarity in the performers' EEG signals, particularly in the context of making music together. Future studies could replicate the current study with additional psychophysiological and behavioral measures, e.g., EMG or accelerometers, to more clearly identify the relative sources of synchrony in the bispectrum. Although outside the scope of this work, which focuses on the synchrony of brain signals, movement and breathing synchrony and emotional context (across musicians and with the audience) are undoubtedly of interest in the studied scenario and will be considered in future steps of this research.

Nevertheless, movement artifacts will be present (to a reduced extent, after proper pre-processing) in any ecologically valid study of musical improvisation, owing to the musicians' freedom of movement. 32 Additionally, this performance case study offers a novel way to investigate inter-brain synchronization, "in action and in context", during a free jazz improvisation performed in a real-world scenario by expert musicians. It is important to note that movement and psychophysiological synchrony are not necessarily best understood as artifacts; they may well play an important functional role in supporting joint performance. It would be interesting, in future studies, to investigate to what extent joint performance and its neural correlates depend on the availability of different modalities, that is, seeing and hearing the other players versus only hearing or only seeing them. Such comparisons would provide new insights into how joint performance is actually achieved in musical improvisation.

Conclusion

In this study, temporal synchronization of EEG signals between musicians interacting during a jazz performance was observed through a bispectral analysis. The most significant interactions were found between left temporal, bilateral occipital, and parietal regions in the gamma band, suggesting shared, dynamic, and synchronized processing of the auditory, visual, and spatial information needed during a cooperative improvised performance. The inter-brain interaction between electrodes over sensory integration areas provides evidence for the centrality of sensory processing, 50 feedback, 48 , 71 and communication 48 , 49 during collaborative musical improvisation.

A temporal analysis of the bispectrum dynamics during synchronized and desynchronized performance showed higher bispectrum when the musicians were performing in a synchronized manner than when they were performing in a desynchronized manner. In this study, the bispectrum was useful for identifying differences between competitive and collaborative performance in a real-world scenario, namely musicians improvising a collaborative piece. Based on the presented results, the implemented bispectral analysis method is proposed as a tool for studying social interactions and brain-to-brain communication in hyperscanning measurements.
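As an illustration of how synchronized and desynchronized windows could be compared statistically, the sketch below applies a simple two-sided permutation test to two sets of band-bispectrum values (e.g., obtained from the sliding-window helper above). This generic test is shown only as an example and is not necessarily the statistical procedure used in this study.

import numpy as np

def permutation_test(a, b, n_perm=5000, rng=None):
    # Two-sided permutation test on the difference of means between two sets of
    # values, e.g., band-bispectrum magnitudes from synchronized vs desynchronized
    # windows. Returns the observed difference and an approximate p-value.
    rng = np.random.default_rng() if rng is None else rng
    a, b = np.asarray(a, float), np.asarray(b, float)
    observed = a.mean() - b.mean()
    pooled = np.concatenate([a, b])
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)                                  # random relabeling
        diff = pooled[:a.size].mean() - pooled[a.size:].mean()
        if abs(diff) >= abs(observed):
            count += 1
    return observed, (count + 1) / (n_perm + 1)

# e.g., diff, p = permutation_test(sync_values, desync_values)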

Data availability

Underlying data

OSF: MOBILE EEG RECORDINGS OF MUSICAL (JAZZ) IMPROVISATION.

https://doi.org/10.17605/OSF.IO/YUEQK 98

This project contains the following underlying data:

  • Block1_P1.mat (EEG data - Recording Block1, Participant 1).

  • Block1_P2.mat (EEG data - Recording Block1, Participant 2).

  • Block1_P3.mat (EEG data - Recording Block1, Participant 3).

  • Block2_P1.mat (EEG data - Recording Block2, Participant 1).

  • Block2_P2.mat (EEG data - Recording Block2, Participant 2).

  • Block2_P3.mat (EEG data - Recording Block2, Participant 3).

  • Impedances.xlsx (Impedance values of EEG electrodes from all participants, at start and end of recordings).

  • Performance Notes.xlsx (Notes with times of trials, segments and relevant events during the performance).

  • ZOOM0001.mp3 (Audio recording of the complete performance).

  • Blaffer_Floor_1210.mp4 (Video Recording1 from the performance).

  • Blaffer_Floor_1221.mp4 (Video Recording2 from the performance).

Extended data

OSF: MOBILE EEG RECORDINGS OF MUSICAL (JAZZ) IMPROVISATION.

https://doi.org/10.17605/OSF.IO/YUEQK 98

This project contains the following extended data:

  • Extended Data.pptx (An extended data file containing additional figures and tables from this work).

Data are available under the terms of the Creative Commons Zero “No rights reserved” data waiver (CC0 1.0 Public domain dedication).

Funding Statement

The authors would like to acknowledge the support of the IUCRC BRAIN at University of Houston (NSF award #1650536) for the development of this research project.

The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

[version 4; peer review: 2 approved

References

  • 1. Cruz-Garza JG, Ravindran AS, Kopteva AE, et al. : Characterization of the stages of creative writing with mobile eeg using generalized partial directed coherence. Front. Hum. Neurosci. 2020;14:533. 10.3389/fnhum.2020.577651 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 2. Ravindran AS, Mobiny A, Cruz-Garza JG, et al. : Assaying neural activity of children during video game play in public spaces: A deep learning approach. J. Neural Eng. 2019;16(3):036028. . 10.1088/1741-2552/ab1876 [DOI] [PubMed] [Google Scholar]
  • 3. Kilicarslan A, Contreras-Vidal JL: Full characterization and removal of motion artifacts from scalp EEG recordings. 2018; pages1–1. 10.1109/werob.2017.8383881 [DOI]
  • 4. Kilicarslan A, Contreras-Vidal J: Neuro-Robotics: Rehabilitation and Restoration of Walking Using Exoskeletons via Non-invasive Brain-Machine Interfaces. 2021;04: pages143–166. 10.1007/978-3-030-68545-4_6 [DOI] [Google Scholar]
  • 5. Collomb-Clerc A, Welter M-L: Effects of deep brain stimulation on balance and gait in patients with parkinson’s disease: A systematic neurophysiological review. Neurophysiologie Clinique/Clinical Neurophysiology. 2015;45(4):371–388. . Special issue: Balance and Gait. 10.1016/j.neucli.2015.07.001 Reference Source [DOI] [PubMed] [Google Scholar]
  • 6. Li MCH, Cook MJ: Deep brain stimulation for drug-resistant epilepsy. Epilepsia. 2018;59(2):273–290. . 10.1111/epi.13964 [DOI] [PubMed] [Google Scholar]
  • 7. Widge AS, Malone DA, Dougherty DD: Closing the loop on deep brain stimulation for treatment-resistant depression. Front. Neurosci. 2018;12(MAR):1–10. . 10.3389/fnins.2018.00175 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 8. Bowsher K, Civillico EF, Coburn J, et al. : Brain-computer interface devices for patients with paralysis and amputation: A meeting report. J. Neural Eng. 2016;13(2). . 10.1088/1741-2560/13/2/023001 [DOI] [PubMed] [Google Scholar]
  • 9. Tyler DJ: U. S. Department of Veterans Affairs prosthesis. Curr. Opin. Neurol. 2015;28(6):574–581. 10.1097/WCO.0000000000000266.Neural [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 10. Ienca M, Vayena E: Direct-to-Consumer Neurotechnology: What Is It and What Is It for? AJOB Neurosci. 2019;10(4):149–151. . 10.1080/21507740.2019.1668493 [DOI] [PubMed] [Google Scholar]
  • 11. Behaviour H: The cooperative human. Nat. Hum. Behav. 2018;2(7):427–428. . 10.1038/s41562-018-0389-1 [DOI] [PubMed] [Google Scholar]
  • 12. Cole EJ, Barraclough NE, Andrews TJ: Reduced connectivity between mentalizing and mirror systems in autism spectrum condition. Neuropsychologia. 2019;122(November 2018):88–97. . 10.1016/j.neuropsychologia.2018.11.008 [DOI] [PubMed] [Google Scholar]
  • 13. Goupil L, Wolf T, Saint-Germier P, et al. : Emergent Shared Intentions Support Coordination During Collective Musical Improvisations. Cogn. Sci. 2021;45(1):e12932. . 10.1111/cogs.12932 [DOI] [PubMed] [Google Scholar]
  • 14. Elmalaki S, Demirel BU, Taherisadr M, et al.: Towards internet-of-things for wearable neurotechnology. 2021 22nd International Symposium on Quality Electronic Design (ISQED). 2021; pages559–565. 10.1109/ISQED51717.2021.9424364 [DOI]
  • 15. Liu D, Shen L, Liu X, et al. : Interactive brain activity: Review and progress on EEG-based hyperscanning in social interactions. Front. Psychol. 2018;9(OCT):1–11. . 10.3389/fpsyg.2018.01862 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 16. Balconi M, Vanutelli ME: Cooperation and competition with hyperscanning methods: Review and future application to emotion domain. Front. Comput. Neurosci. 2017;11(September):1–6. . 10.3389/fncom.2017.00086 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 17. Dikker S, Lu W, Davidesco I, et al. : Brain-to-Brain Synchrony Tracks Real-World Dynamic Group Interactions in the Classroom. Curr. Biol. 2017;27(9):1375–1380. . 10.1016/j.cub.2017.04.002 [DOI] [PubMed] [Google Scholar]
  • 18. Contreras-Vidal J, Robleto D, Cruz-Garza J, et al. : Mobile Brain-Body Imaging and the Neuroscience of Art, Innovation and Creativity. 01 2019. 978-3-030-24325-8. 10.1007/978-3-030-24326-5 [DOI]
  • 19. Cha KM, Lee HC: A novel qEEG measure of teamwork for human error analysis: An EEG hyperscanning study. Nucl. Eng. Technol. 2019;51(3):683–691. . 10.1016/j.net.2018.11.009 [DOI] [Google Scholar]
  • 20. Leong V, Byrne E, Clackson K, et al. : Speaker gaze increases information coupling between infant and adult brains. Proc. Natl. Acad. Sci. U. S. A. 2017;114(50):13290–13295. . 10.1073/pnas.1702493114 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 21. Kinreich S, Djalovski A, Kraus L, et al. : Brain-to-Brain Synchrony during Naturalistic Social Interactions. Sci. Rep. 2017;7(1):17060–12. . 10.1038/s41598-017-17339-5 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 22. Hu Y, Hu Y, Li X, et al. : Brain-to-brain synchronization across two persons predicts mutual prosociality. Soc. Cogn. Affect. Neurosci. 2017;12(12):1835–1844. . 10.1093/scan/nsx118 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 23. Czeszumski A, Eustergerling S, Lang A, et al. : Hyperscanning: A Valid Method to Study Neural Inter-brain Underpinnings of Social Interaction. Front. Hum. Neurosci. 2020;14(February):1–17. . 10.3389/fnhum.2020.00039 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 24. Nam CS, Choo S, Huang J, et al. : Brain-to-brain neural synchrony during social interactions: A systematic review on hyperscanning studies. Applied Sciences (Switzerland). 2020;10(19):1–23. . 10.3390/APP10196669 [DOI] [Google Scholar]
  • 25. Nikias CL, Raghuveer MR: Bispectrum Estimation: A Digital Signal Processing Framework. Proc. IEEE. 1987;75(7):869–891. . 10.1109/PROC.1987.13824 [DOI] [Google Scholar]
  • 26. Gagliano L, Assi EB, Nguyen DK, et al. : Bispectrum and Recurrent Neural Networks: Improved Classification of Interictal and Preictal States. Sci. Rep. 2019;9(1):15649–9. . 10.1038/s41598-019-52152-2 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 27. Chella F, Pizzella V, Zappasodi F, et al. : Bispectral pairwise interacting source analysis for identifying systems of cross-frequency interacting brain sources from electroencephalographic or magnetoencephalographic signals. Phys. Rev. E. 2016 May;93(5): 052420. Epub 2016 May 25. 10.1103/PhysRevE.93.052420 [DOI] [PubMed] [Google Scholar]
  • 28. Hinich MJ: Higher order cumulants and cumulant spectra. Circuits Syst. Signal Process. 1994;13:391–402. 10.1007/BF01183737 [DOI] [Google Scholar]
  • 29. Palva JM, Wang SH, Palva S, et al. : Ghost interactions in MEG/EEG source space: A note of caution on inter-areal coupling measures. Neuroimage. 2018 Jun;173:632–643. 10.1016/j.neuroimage.2018.02.032 [DOI] [PubMed] [Google Scholar]
  • 30. Müller V, Sänger J, Lindenberger U: Intra- and Inter-Brain Synchronization during Musical Improvisation on the Guitar. PLoS One. 2013;8(9). . 10.1371/journal.pone.0073852 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 31. Müller V, Lindenberger U: Dynamic Orchestration of Brains and Instruments During Free Guitar Improvisation. Front. Integr. Neurosci. 2019;13(September):1–12. . 10.3389/fnint.2019.00050 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 32. Acquadro MAS, Congedo M, De Riddeer D: Music performance as an experimental approach to hyperscanning studies. Front. Hum. Neurosci. 2016;10(MAY2016):1–13. . 10.3389/fnhum.2016.00242 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 33. Saint-Germier P, Goupil L, Rouvier G, et al. : What it is like to improvise together? Investigating the phenomenology of joint action through improvised musical performance. Phenomenol. Cogn. Sci. 2021; (0123456789). . 10.1007/s11097-021-09789-0 [DOI] [Google Scholar]
  • 34. McPherson MJ, Lopez-Gonzalez M, Rankin SK, et al. : The role of emotion in musical improvisation: An analysis of structural features. PLoS One. 2014;9(8):1–11. . 10.1371/journal.pone.0105144 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 35. Walton A, Richardson MJ, Chemero A: Self-Organization and Semiosis in Jazz Improvisation. International Journal of Signs and Semiotic Systems. 2015a;3(2):12–25. . 10.4018/ijsss.2014070102 [DOI] [Google Scholar]
  • 36. Monson I: Saying Something: Jazz improvisation and interaction. The University of Chicago Press;1996. 9788490225370. [Google Scholar]
  • 37. Dell’Anna A, Rosso M, Bruno V, et al. : Does musical interaction in a jazz duet modulate peripersonal space? Psychol. Res. 2021;85(5):2107–2118. . 10.1007/s00426-020-01365-6 [DOI] [PubMed] [Google Scholar]
  • 38. Setzler M, Goldstone R: Coordination and consonance between interacting, improvising musicians. Open Mind. 2020;4:88–101. . 10.1162/opmi_a_00036 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 39. Loui P: Rapid and flexible creativity in musical improvisation: Review and a model. Ann. N. Y. Acad. Sci. 2018;1423(1):138–145. . 10.1111/nyas.13628 [DOI] [PubMed] [Google Scholar]
  • 40. Cruz-Garza JG, Chatufale G, Contreras-Vidal JL: Examining the Improvisational Creative Process in the Visual Arts: A Mobile Brain Body Imaging Approach. 2017; page2008. Reference Source
  • 41. Lopata JA, Nowicki EA, Joanisse MF: Creativity as a distinct trainable mental state: An EEG study of musical improvisation. Neuropsychologia. 2017;99(March):246–258. . 10.1016/j.neuropsychologia.2017.03.020 [DOI] [PubMed] [Google Scholar]
  • 42. Walton AE, Washburn A, Langland-Hassan P, et al. : Creating Time: Social Collaboration in Music Improvisation. Top. Cogn. Sci. 2018;10(1):95–119. . 10.1111/tops.12306 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 43. Tseng Y-L, Liu H-H, Liou M, et al. : Lingering Sound: Event-Related Phase-Amplitude Coupling and Phase-Locking in Fronto-Temporo-Parietal Functional Networks During Memory Retrieval of Music Melodies. Front. Hum. Neurosci. 2019;13(May). 10.3389/fnhum.2019.00150 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 44. Zatorre RJ, Chen JL, Penhune VB: When the brain plays music: auditory-motor interactions in music perception and production. Nat. Rev. Neurosci. jul 2007;8(7):547–558. . 10.1038/nrn2152 [DOI] [PubMed] [Google Scholar]
  • 45. Palmer C: Time course of retrieval and movement preparation in music performance. Ann. N. Y. Acad. Sci. 2005;1060(October):360–367. . 10.1196/annals.1360.030 [DOI] [PubMed] [Google Scholar]
  • 46. Gorow R. Hearing and Writing Music Professional Training for Today’s Musician. 2002. ISBN: 0-9629496-7-1. [Google Scholar]
  • 47. Pras A, Schober MF, Spiro N: What About Their Performance Do Free Jazz Improvisers Agree Upon? A Case Study. Front. Psychol. 2017 Jun 26;8:966. 10.3389/fpsyg.2017.00966 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 48. Biasutti M, Frezza L: Dimensions of music improvisation. Creat. Res. J. 2009;21(2-3):232–242. . 10.1080/10400410902861240 [DOI] [Google Scholar]
  • 49. Wopereis IGJH, Stoyanov S, Kirschner PA, et al. : What makes a good musical improviser? an expert view on improvisational expertise. Psychomusicology: Music, Mind, and Brain. 2013;23(4):222–235. 10.1037/pmu0000021 [DOI] [Google Scholar]
  • 50. Beaty R: The neuroscience of musical improvisation. Neurosci. Biobehav. Rev. 01 2015;51:108–117. 10.1016/j.neubiorev.2015.01.004 [DOI] [PubMed] [Google Scholar]
  • 51. Dumas G, Fairhurst MT: Reciprocity and alignment: quantifying coupling in dynamic interactions. R. Soc. Open Sci. 2021;8(5):210138. 10.1098/rsos.210138 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 52. Farrugia N, Lamouroux A, Rocher C, et al. : Beta and Theta Oscillations Correlate With Subjective Time During Musical Improvisation in Ecological and Controlled Settings: A Single Subject Study. Front. Neurosci. 2021;15:626723. 10.3389/fnins.2021.626723 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 53. Vuust P, Heggli OA, Friston KJ, et al. : Music in the brain. Nat. Rev. Neurosci. 2022;23:287–305. . 10.1038/s41583-022-00578-5 [DOI] [PubMed] [Google Scholar]
  • 54. Heggli OA, Konvalinka I, Cabral J, et al.: Transient brain networks underlying interpersonal strategies during synchronized action. Soc. Cogn. Affect. Neurosci. 2021;16(1-2):19–30. 10.1093/scan/nsaa056 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 55. Donnay GF, Rankin SK, Lopez-Gonzalez M, et al. : Neural Substrates of Interactive Musical Improvisation: An fMRI Study of ‘Trading Fours’ in Jazz.(D. R.Chialvo, Ed.). PLoS One. 2014 Feb 19;9(2): e88665. 10.1371/journal.pone.0088665 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 56. Saint-Germier P, Goupil L, Rouvier G, et al. : What it is like to improvise together? Investigating the phenomenology of joint action through improvised musical performance. Phenom. Cogn. Sci. Springer Science and Business: Media LLC: 2021, December 2. 10.1007/s11097-021-09789-0 [DOI] [Google Scholar]
  • 57. Wilson GB, MacDonald RAR: Musical choices during group free improvisation: A qualitative psychological investigation. Psychol. Music. 2016;44(5):1029–1043. . 10.1177/0305735615606527 [DOI] [Google Scholar]
  • 58. Limb CJ, Braun AR: Neural substrates of spontaneous musical performance: An fMRI study of jazz improvisation. PLoS One. 2008;3(2):e1679. . 10.1371/journal.pone.0001679 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 59. Brandt A: Theme and Variations as a Window into the Creative Mind. Cham: Springer International Publishing;2019;29–39. 978-3-030-24326-5. 10.1007/978-3-030-24326-5_5 [DOI] [Google Scholar]
  • 60. Goldman A: Towards a cognitive-scientific research program for improvisation: Theory and an experiment. Psychomusicology: Music, Mind, and Brain. 2013;23(4):210–221. ISSN 2162-1535(Electronic),0275-3987(Print). 10.1037/pmu0000020 [DOI] [Google Scholar]
  • 61. Thibault RT, Lifshitz M, Jones JM, et al. : Posture alters human resting-state. Cortex. 2014;58:199–205. . 10.1016/j.cortex.2014.06.014 [DOI] [PubMed] [Google Scholar]
  • 62. Contreras-Vidal JL, Cruz-Garza J, Kopteva A: Towards a whole body brain-machine interface system for decoding expressive movement intent Challenges and Opportunities. 5th International Winter Conference on Brain-Computer Interface, BCI 2017. 2017; pages1–4. 10.1109/IWW-BCI.2017.7858142 [DOI]
  • 63. Rosen DS, Yongtaek O, Erickson B, et al. : Dual-process contributions to creativity in jazz improvisations: An SPM-EEG study. NeuroImage. 2020;213(September 2019):116632. . 10.1016/j.neuroimage.2020.116632 [DOI] [PubMed] [Google Scholar]
  • 64. Brain Vision Recorder user manual . Accessed:2023-04-13. https://tsgdoc.socsci.ru.nl/images/d/d1/BrainVision_Recorder_UM.pdf
  • 65. Bigdely-Shamlo N, Mullen T, Kothe C, et al. : The PREP pipeline: Standardized preprocessing for large-scale EEG analysis. Front. Neuroinform. 2015;9(JUNE):1–19. . 10.3389/fninf.2015.00016 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 66. Kilicarslan A, Grossman RG, Contreras-Vidal JL: A robust adaptive denoising framework for real-time artifact removal in scalp EEG measurements. J. Neural Eng. feb 2016;13(2):026013. 10.1088/1741-2560/13/2/026013 [DOI] [PubMed] [Google Scholar]
  • 67. Delorme A, Makeig S: Eeglab: An open source toolbox for analysis of single-trial eeg dynamics including independent component analysis. J. Neurosci. Methods. 2004;134(1):9–21. . 10.1016/j.jneumeth.2003.10.009 [DOI] [PubMed] [Google Scholar]
  • 68. Chang CY, Hsu SH, Pion-Tonachini L, et al. : Evaluation of Artifact Subspace Reconstruction for Automatic EEG Artifact Removal. Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society, EMBS. 2018-July (June);1242–1245. . 10.1109/EMBC.2018.8512547 [DOI] [PubMed]
  • 69. Sigl JC, Chamoun NG: An introduction to bispectral analysis for the electroencephalogram. J. Clin. Monit. 1994;10(6):392–404. . 10.1007/BF01618421 [DOI] [PubMed] [Google Scholar]
  • 70. Zhang Y, Prasad S, Kilicarslan A, et al. : Multiple kernel based region importance learning for neural classification of gait states from EEG signals. Front. Neurosci. 2017;11(APR):1–11. . 10.3389/fnins.2017.00170 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 71. Pressing J: Improvisation: methods and models. John A. Sloboda (Hg.): Generative processes in music, Oxford. 1988; pages129–178.
  • 72. Herholz SC, Zatorre RJ: Musical training as a framework for brain plasticity: behavior, function, and structure. Neuron. 2012;76(3):486–502. 10.1016/j.neuron.2012.10.011 [DOI] [PubMed] [Google Scholar]
  • 73. Novembre G, Keller PE: A conceptual review on action-perception coupling in the musicians’ brain: What is it good for?. Front. Hum. Neurosci. 2014;8(AUG):1–11. . 10.3389/fnhum.2014.00603 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 74. Cisek P: Image Schemata. Encyclopedia of Neuroscience. 2009. 10.1007/978-3-540-29678-2 [DOI] [Google Scholar]
  • 75. Wolpert D, Ghahramani Z, Jordan M: An internal model for sensorimotor integration. Science (New York, N.Y.). 10 1995;269:1880–1882. 10.1126/science.7569931 [DOI] [PubMed] [Google Scholar]
  • 76. Fuster JM: Upper processing stages of the perception - action cycle. Trends Cogn. Sci. 2004;8(4):143–145. 10.1016/j.tics.2004.02.004 [DOI] [PubMed] [Google Scholar]
  • 77. Palomar-García MÁ, Zatorre RJ, Ventura-Campos N, et al. : Modulation of Functional Connectivity in Auditory-Motor Networks in Musicians Compared with Nonmusicians. Cereb. Cortex. 2017;27(5):2768–2778. . 10.1093/cercor/bhw120 [DOI] [PubMed] [Google Scholar]
  • 78. Belden A, Zeng T, Przysinda E, et al. : Improvising at rest: Differentiating jazz and classical music training with resting state functional connectivity. NeuroImage. 2020;207:116384. 10.1016/j.neuroimage.2019.116384 [DOI] [PubMed] [Google Scholar]
  • 79. Cheng L, Guo ZW, Lai YX, et al. : Musical training induces functional plasticity in perceptual and motor networks: Insights from resting-state fMRI. PLoS One. 2012;7(5):1–10. . 10.1371/journal.pone.0036568 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 80. Vergara VM, Norgaard M, Miller R, et al. : Functional network connectivity during jazz improvisation. Sci. Rep. 2021;11(1):1–12. 10.1038/s41598-021-98332-x [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 81. Bhattacharya J, Petsche H, Pereda E: Long-range synchrony in the γ band: Role in music perception. J. Neurosci. 2001;21(16):6329–6337. . 10.1523/jneurosci.21-16-06329.2001 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 82. Bhattacharya J, Petsche H: Phase synchrony analysis of EEG during music perception reveals changes in functional connectivity due to musical expertise. Signal Process. 2005;85(11):2161–2177. . 10.1016/j.sigpro.2005.07.007 [DOI] [Google Scholar]
  • 83. Kawasaki M, Yamada Y, Ushiku Y, et al. : Inter-brain synchronization during coordination of speech rhythm in human-to-human social interaction. Sci. Rep. 2013;3:1–8. . 10.1038/srep01692 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 84. Walton AE, Richardson MJ, Langland-Hassan P, et al. : Improvisation and the self-organization of multiple musical bodies. Front. Psychol. 2015b;06(MAR):1–9. . 10.3389/fpsyg.2015.00313 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 85. Goupil L, Saint-Germier P, Rouvier G, et al. : Musical coordination in a large group without plans nor leaders. Sci. Rep. 2020;10(1):20377–14. . 10.1038/s41598-020-77263-z [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 86. Kruppa JA, Reindl V, Gerloff C, et al. : Interpersonal Synchrony Special Issue Brain and motor synchrony in children and adolescents with ASD–a fNIRS hyperscanning study. Soc. Cogn. Affect. Neurosci. 2020;16: (January):103–116. . 10.1093/scan/nsaa092 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 87. Gvirts HZ, Perlmutter R: What Guides Us to Neurally and Behaviorally Align With Anyone Specific? A Neurobiological Model Based on fNIRS Hyperscanning Studies. Neuroscientist. 2020;26(2):108–116. . 10.1177/1073858419861912 [DOI] [PubMed] [Google Scholar]
  • 88. Babiloni F, Astolfi L: Social neuroscience and hyperscanning techniques: Past, present and future. Neurosci. Biobehav. Rev. 2014;44:76–93. . 10.1016/j.neubiorev.2012.07.006 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 89. Dikker S, Michalareas G, Oostrik M, et al. : Crowdsourcing neuroscience: inter-brain coupling during face-to-face interactions outside the laboratory. NeuroImage. 2021;227:117436. 10.1016/j.neuroimage.2020.117436 [DOI] [PubMed] [Google Scholar]
  • 90. Bevilacqua D, Ido Davidesco L, Wan KC, et al. : Brain-to-brain synchrony and learning outcomes vary by student-teacher dynamics: Evidence from a real-world classroom electroencephalography study. J. Cogn. Neurosci. 2019;31(3):401–411. 10.1162/jocn_a_01274 [DOI] [PubMed] [Google Scholar]
  • 91. Davidesco I, Laurent E, Valk H, et al. : Brain-to-brain synchrony between students and teachers predicts learning outcomes (preprint). 2019. 10.1101/644047 [DOI] [PubMed]
  • 92. Pan Y, Dikker S, Goldstein P, et al. : Instructor-learner brain coupling discriminates between instructional approaches and predicts learning. NeuroImage. 2020;211:116657. 10.1016/j.neuroimage.2020.116657 [DOI] [PubMed] [Google Scholar]
  • 93. Stephens GJ, Silbert LJ, Hasson U: Speaker-listener neural coupling underlies successful communication. Proc. Natl. Acad. Sci. U. S. A. 2010;107(32):14425–14430. . 10.1073/pnas.1008662107 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 94. Liu T, Saito G, Lin C, et al. : Inter-brain network underlying turn-based cooperation and competition: A hyperscanning study using near-infrared spectroscopy. Sci. Rep. 2017;7(1):8684–12. . 10.1038/s41598-017-09226-w [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 95. Kontson KL, Megjhani M, Brantley JA, et al. : Your brain on art: Emergent cortical dynamics during aesthetic experiences. Front. Hum. Neurosci. 2015;9(NOVEMBER):1–17. . 10.3389/fnhum.2015.00626 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 96. Damm L, Varoqui D, De Cock VC, et al. : Why do we move to the beat? A multi-scale approach, from physical principles to brain dynamics. Neurosci. Biobehav. Rev. 2020 May;112:553–584. 10.1016/j.neubiorev.2019.12.024 [DOI] [PubMed] [Google Scholar]
  • 97. Smykovskyi A, Bieńkiewicz MMN, Pla S, et al. : Positive emotions foster spontaneous synchronisation in a group movement improvisation task. Front. Hum. Neurosci. 2022 Aug 30;16: 944241. 10.3389/fnhum.2022.944241 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 98. Ramírez-Moreno MA, Cruz-Garza JG, Paek A, et al. : MOBILE EEG RECORDINGS OF MUSICAL (JAZZ) IMPROVISATION. OSF. [Dataset]. 2022. 10.17605/OSF.IO/YUEQK [DOI]
F1000Res. 2023 Nov 16. doi: 10.5256/f1000research.158355.r220645

Reviewer response for version 4

Guido Orgs 1

I have no further comments to make

Is the case presented with sufficient detail to be useful for teaching or other practitioners?

Yes

Is the work clearly and accurately presented and does it cite the current literature?

Yes

If applicable, is the statistical analysis and its interpretation appropriate?

Yes

Are all the source data underlying the results available to ensure full reproducibility?

Partly

Are the conclusions drawn adequately supported by the results?

Partly

Is the background of the case’s history and progression described in sufficient detail?

Yes

Reviewer Expertise:

Dance, Neuroaesthetics, Action Observation, EEG Hyperscanning.

I confirm that I have read this submission and believe that I have an appropriate level of expertise to confirm that it is of an acceptable scientific standard.

F1000Res. 2023 Oct 6. doi: 10.5256/f1000research.152004.r201607

Reviewer response for version 3

Nicolas Farrugia 1

The authors here present an ecological study on brain-to-brain synchrony during free jazz improvisation with three musicians. They used EEG on the three musicians, with an experimental design that enables testing whether inter-brain synchrony was modulated as a function of who is playing with whom, and with active/passive roles in improvisation. The setup is particularly challenging due to its ecological validity, and due to noise related to movements in the EEG. After extensive preprocessing, the authors have performed a very elaborate analysis pipeline focusing on time-frequency analysis, and in particular on bispectrum estimation. The presented results show many interesting effects related to the interplay between musicians, and are quite clearly presented and convincing. Overall, the paper is well written, the statistical analysis is sufficient, and the results are convincing towards the given objectives. This kind of ecological study is really welcome, and I want to congratulate the authors for running this complex project and encourage them to investigate follow-up questions in future work. For the time being, I only have a few points which I think would make an overall better contribution; these are detailed below.

Major issues :

  1. As awkward as it can sometimes be to ask to add one's own reference, I think it would make sense to cite our previous study on a single case study on improvisation, also done using EEG, in a free jazz context (Farrugia et al. 2021, ref 1 below). While it's not an hyperscanning study, it s an ecological study with an audience, so it would make sense to at least cite it in the related works. Our paper investigated other questions on the subjective state of the improviser, namely flow state and subjective time perception.

  2. Another interesting reference to add would be the review by Fairhurst and Dumas, ref 2 below.

  3. The OSF link to the data is broken https://osf.io/yueqk so I cannot guarantee that the data is accessible currently (reference 96 in the manuscript).

  4. The following statement in the discussion is very speculative, and not backed by any references. "An interesting finding from this study is that brain-to brain synchrony appears to be limited to the higher frequency bands. Although it may seem that music performance requires behavioral coordination across these longer timescales (i.e. musical phrases for example) that may involve delta and theta bands, it is possible that the jazz improvisation performed by the musicians imposed higher demands in higher frequency bands, which are also more localized. Future studies comparing jazz improvisation with other types of music performances that may emphasize longer timescales could elucidate this question." In  [Farrugia et al. 2021], we did find modulations in the theta and beta oscillations related to subjective states of the improviser, and this was also on "free jazz" (see for example https://www.youtube.com/watch?v=ILhaZYtW8fs&t=390s ). But still, I don't find the statement about linking musical timescales and oscillations frequencies very convincing overall. I'm also not sure what the authors mean by "more localized" in this context. I would recommend removing this paragraph.

  5. Authors should refrain from doing direct inferences about neural sources based on scalp electrode positions, such as " The number of inter-connected channels and ROIs increase when there are more musicians performing. This happened mainly across temporal, occipital and parietal regions, related to auditory, visual and spatial processing.", or " posterior (e.g., parietal, temporal and occipital) regions that are involved in the processing of sensory input and are important in interpreting sensory feedback from the external environment". It's ok to mention interconnected channels and ROIs, but it's impossible to say how much this is related to auditory, visual and spatial processing without doing source localization. Generally speaking, in the results and in the discussion, the authors should stick to channel positions and ROIs at the scalp level, without linking those to brain areas or brain functions.

Minor issues :

  • "Towards the final segments of the performance, there were more ’musical voices’ interacting, " a space is missing before 'musical

  • A very general question related to one of the limitations that is discussed. The authors do mention that the observed effects might be related to movement artifacts; is there any statistical analysis that can be presented in this paper to at least show that such a confound is equally "strong" across conditions (e.g. between synchronized and not synchronized performances)?

Is the case presented with sufficient detail to be useful for teaching or other practitioners?

Yes

Is the work clearly and accurately presented and does it cite the current literature?

Partly

If applicable, is the statistical analysis and its interpretation appropriate?

Yes

Are all the source data underlying the results available to ensure full reproducibility?

Partly

Are the conclusions drawn adequately supported by the results?

Yes

Is the background of the case’s history and progression described in sufficient detail?

Yes

Reviewer Expertise:

Auditory cognitive neuroscience, prior work on music-based rehabilitation of Parkinson's disease using ryhtmical auditory cueing, as well as musical improvisation with EEG, and music neurosciences in general.

I confirm that I have read this submission and believe that I have an appropriate level of expertise to confirm that it is of an acceptable scientific standard, however I have significant reservations, as outlined above.

References

  • 1. Dumas G, Fairhurst MT: Reciprocity and alignment: quantifying coupling in dynamic interactions. R Soc Open Sci. 2021;8(5):210138. 10.1098/rsos.210138 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 2. Farrugia N, Lamouroux A, Rocher C, et al.: Beta and Theta Oscillations Correlate With Subjective Time During Musical Improvisation in Ecological and Controlled Settings: A Single Subject Study. Front Neurosci. 2021;15:626723. 10.3389/fnins.2021.626723 [DOI] [PMC free article] [PubMed] [Google Scholar]
F1000Res. 2023 Oct 30.
Mauricio Ramirez 1

We thank the reviewer for the time invested in reading and making such useful comments for our work. Below is a response to each of the comments (in bold letters), as well as the locations of the corrections made to the manuscript. All changes are carried out in the next version (Version 4) of the manuscript, which should be published soon.

Major issues: 

  1. As awkward as it can sometimes be to ask to add one's own reference, I think it would make sense to cite our previous study on a single case study on improvisation, also done using EEG, in a free jazz context (Farrugia et al. 2021, ref 1 below). While it's not an hyperscanning study, it s an ecological study with an audience, so it would make sense to at least cite it in the related works. Our paper investigated other questions on the subjective state of the improviser, namely flow state and subjective time perception.

  2. Another interesting reference to add would be the review by Fairhurst and Dumas, ref 2 below.

Introduction, Brain-to-brain synchrony between musicians in free-jazz improvisation, Paragraph 2. Both references were added. As stated by the reviewer, although not using hyperscanning, the mentioned study is also an ecological study with an audience, and it is worth including in the introduction. 

3. The OSF link to the data is broken  https://osf.io/yueqk so I cannot guarantee that the data is accessible currently (reference 96 in the manuscript). 

References, Reference 96. Thank you for noticing this, we contacted OSF and their team helped us to fix the problem. The database is available again.

4. The following statement in the discussion is very speculative, and not backed by any references.  "An interesting finding from this study is that brain-to brain synchrony appears to be limited to the higher frequency bands. Although it may seem that music performance requires behavioral coordination across these longer timescales (i.e. musical phrases for example) that may involve delta and theta bands, it is possible that the jazz improvisation performed by the musicians imposed higher demands in higher frequency bands, which are also more localized. Future studies comparing jazz improvisation with other types of music performances that may emphasize longer timescales could elucidate this question." In  [Farrugia et al. 2021], we did find modulations in the theta and beta oscillations related to subjective states of the improviser, and this was also on "free jazz" (see for example https://www.youtube.com/watch?v=ILhaZYtW8fs&t=390s ). But still, I don't find the statement about linking musical timescales and oscillations frequencies very convincing overall. I'm also not sure what the authors mean by "more localized" in this context. I would recommend removing this paragraph.

Discussion. The paragraph was removed following this recommendation.  

5. Authors should refrain from doing direct inferences about neural sources based on scalp electrode positions, such as " The number of inter-connected channels and ROIs increase when there are more musicians performing. This happened mainly across temporal, occipital and parietal regions, related to auditory, visual and spatial processing.", or " posterior (e.g., parietal, temporal and occipital) regions that are involved in the processing of sensory input and are important in interpreting sensory feedback from the external environment". It's ok to mention interconnected channels and ROIs, but it's impossible to say how much this is related to auditory, visual and spatial processing without doing source localization. Generally speaking, in the results and in the discussion, the authors should stick to channel positions and ROIs at the scalp level, without linking those to brain areas or brain functions.

In the Results and Discussion sections, we limited the discussion to channel positions and the ROI level. Sentences linking brain areas and functions in the Results and Discussion were removed, for instance in: Discussion, Paragraph 5; Results, Summary of main findings.

Minor issues: 

  • "Towards the final segments of the performance, there were more ’musical voices’ interacting, " a space is missing before 'musical

Thank you for noticing, the space was added.

  • A very general question related to one of the limitations that is discussed. The authors do mention that the observed effects might be related to movement artifacts ; is there any statistical analysis that can be presented in this paper to at least show that such a confound is equally "strong" across conditions (e.g. between synchronized and not synchronized performances)

  • Movement artifacts are indeed something to consider in future steps. However, to answer the reviewer's comment, Figure 6 shows that when comparing brain synchrony across different conditions, there is a statistical difference at the onset of the events (which were labeled by raters independently). If movement artifacts were the only cause of the changes in synchrony, the synchrony changes would appear across the whole time window, and not only at the onset.

F1000Res. 2023 Jun 2. doi: 10.5256/f1000research.147429.r172450

Reviewer response for version 2

Stephane Perrey 1

Thanks to the authors for revising their manuscript satisfactorily. I have no further major comments to address. This study makes a notable step towards conducting neuroscience studies during natural human activities. This case study is methodologically sound and highlights scientific questions about cooperative behaviour and its neural correlates.

Is the case presented with sufficient detail to be useful for teaching or other practitioners?

Yes

Is the work clearly and accurately presented and does it cite the current literature?

Yes

If applicable, is the statistical analysis and its interpretation appropriate?

Yes

Are all the source data underlying the results available to ensure full reproducibility?

Yes

Are the conclusions drawn adequately supported by the results?

Yes

Is the background of the case’s history and progression described in sufficient detail?

Yes

Reviewer Expertise:

Neuroscience and Neuroimaging on the field.

I confirm that I have read this submission and believe that I have an appropriate level of expertise to confirm that it is of an acceptable scientific standard.

F1000Res. 2023 May 18. doi: 10.5256/f1000research.147429.r153202

Reviewer response for version 2

Guido Orgs 1

This paper reports a hyper-scanning study on jazz improvisation, using a clever experimental design that dissociates brain synchrony during listening from brain synchrony during joint performance. The authors compute bispectra between musicians during joint improvisation and use musical feature analysis to distinguish brain-to-brain synchrony between synchronised and desynchronised sections of the music. Overall, the study reports greater brain-to-brain synchrony in gamma and beta bands during joint performance compared to baseline, and greater brain-to-brain synchrony during synchronised as compared to desynchronised sections. This is an interesting study that makes important steps towards conducting neuroscience studies in real world environments and during natural human activities such as making music. The study is methodologically strong and raises important questions about cooperative behaviour and its neural correlates. A few more general comments follow.

  1. It would be interesting to see how the bispectral measures compare to more established measures of brain-to-brain synchrony such as PLV or inter-subject correlations. From a methodological perspective it will be important to understand how different measures of neural synchrony relate to specific tasks or activities.

  2. How do the authors interpret the lack of brain-to-brain synchrony on lower frequency bands, like theta and delta. Music performance requires behavioural coordination across these longer timescales, i.e. musical phrases for example. It is interesting that brain-to brain synchrony appears to be limited to the higher frequency bands.

  3. The authors acknowledge a potential role for movement in mediating synchrony between music performers. Aside from movement, synchronised breathing might be another source of similarity in the performers’ EEG signals, particular in the context of making music together. It would be interesting to replicate the current study with additional psychophysiological and behavioural measures, i.e. EMG or accelerometers to more clearly identify the relative sources of synchrony in the bispectrum.

  4. In relation to the previous point, it seems to me that movement and psychophysiological synchrony are not necessarily best understood as artifacts but may well provide an important functional role in supporting joint performance. It would be interesting to see to what extent joint performance and its neural correlates depends on the ability of different modalities, that is seeing and hearing the other players or only hearing or seeing the other players. Such comparisons would provide new insights into how joint performance is actually achieved in musical improvisation.

Overall, this study provides an important step towards better understanding how musicians perform together and which cognitive mechanisms are at play in the process.

Is the case presented with sufficient detail to be useful for teaching or other practitioners?

Yes

Is the work clearly and accurately presented and does it cite the current literature?

Yes

If applicable, is the statistical analysis and its interpretation appropriate?

Yes

Are all the source data underlying the results available to ensure full reproducibility?

Partly

Are the conclusions drawn adequately supported by the results?

Partly

Is the background of the case’s history and progression described in sufficient detail?

Yes

Reviewer Expertise:

Live performance, Mobile EEG, Time-series analyses, Dance, Neuroaesthetics, Joint action and perception.

I confirm that I have read this submission and believe that I have an appropriate level of expertise to confirm that it is of an acceptable scientific standard, however I have significant reservations, as outlined above.

F1000Res. 2023 Jun 22.
Mauricio Ramirez 1

We thank the reviewer for the time invested in reading and making such useful comments for our work. Below is a response to each of the comments (in bold letters), as well as the locations of the corrections made to the manuscript. All changes are carried out in the next version (Version 3) of the manuscript, which should be published soon.

  1. It would be interesting to see how the bispectral measures compare to more established measures of brain-to-brain synchrony such as PLV or inter-subject correlations. From a methodological perspective it will be important to understand how different measures of neural synchrony relate to specific tasks or activities.

Introduction, Brain synchrony assessment through hyperscanning, Paragraph 4. A paragraph (from the methodological perspective of brain-to-brain synchrony) comparing bispectral measures to the PLV metric was added as suggested.

  1. How do the authors interpret the lack of brain-to-brain synchrony on lower frequency bands, like theta and delta. Music performance requires behavioural coordination across these longer timescales, i.e. musical phrases for example. It is interesting that brain-to brain synchrony appears to be limited to the higher frequency bands.

Discussion, Paragraph 6. A paragraph was added in the Discussion section where we discuss our thinking on the appearance of brain synchrony at higher frequency bands for this particular case study.

  1. The authors acknowledge a potential role for movement in mediating synchrony between music performers. Aside from movement, synchronised breathing might be another source of similarity in the performers’ EEG signals, particular in the context of making music together. It would be interesting to replicate the current study with additional psychophysiological and behavioural measures, i.e. EMG or accelerometers to more clearly identify the relative sources of synchrony in the bispectrum.

Discussion, Paragraph 17. We agree with the reviewer’s comment, and added a statement in the discussion section mentioning that breathing synchrony could be a source of similarity in the EEG signals, and will need to be addressed in future studies.

  1. In relation to the previous point, it seems to me that movement and psychophysiological synchrony are not necessarily best understood as artifacts but may well provide an important functional role in supporting joint performance. It would be interesting to see to what extent joint performance and its neural correlates depends on the ability of different modalities, that is seeing and hearing the other players or only hearing or seeing the other players. Such comparisons would provide new insights into how joint performance is actually achieved in musical improvisation.

Discussion, Paragraph 18. We completely agree with the reviewer’s comment, and added a statement regarding this in the Discussion section.

F1000Res. 2023 Jan 20. doi: 10.5256/f1000research.135627.r158229

Reviewer response for version 1

Stephane Perrey 1

This study aimed to assess temporal synchronization of EEG signals between three professional musicians interacting during a jazz performance. Some relevant interactions were found through a bispectral analysis. A reflective cooperative intention was observed during the task. This original pilot study was carried out during a long-duration free jazz improvisation task in a real-world scenario by expert musicians. This case study is well presented and proposes very interesting thoughts and new perspectives in social neuroscience.

The authors were able to deal nicely with possible artifacts associated with the movements during such long performances by applying suitable analyses to the EEG signals.

Below you will find some comments for your convenience.

Introduction:

  • Concerning the following sentence, the additional insight offered by the bispectrum, when compared to other neural synchrony metrics, needs to be clarified further. First, what is meant here by " a more complete intuition on phase coupling, resonance, temporal synchronization and non-linear interactions between any analyzed signal pair"? Second, please highlight the added value of the bispectrum, if any.

  • The following sentence without reference does not bring more into the rationale of this study and could be removed: " Musical improvisation can also be considered as an’on the fly’ composition, one that is temporally ubiquitous, spontaneous, and is not restricted by critique."

  • Jazz performance needs to be defined/characterized in the introduction; different stages do exist.

  • " Brain-to-brain communication", as an important metric proposed by authors is cited as a shared cognitive state. Please confirm is this is right and how this metric is associated to hyperscanning methods.

  • Overall, the introduction delivers many terms around hyperscanning techniques, which makes reading difficult. 

  • Please explain/inform on what is a " creative collaboration" task, not really defined in the introduction. Thereafter you wrote " collaborative musical improvisation". Are you saying the same?

  • In the following sentence (end of the introduction), since the previously underlined/questioned terms are present, they have to be sufficiently clear: "The current study examines the neural correlates of brain-to-brain communication of jazz musicians during collaborative musical improvisation through hyperscanning".

  • Mobile EEG can impose movement constraints. This is not so straightforward. Please adjust accordingly.

Methods

  • Participants. In terms of experience (years performing jazz together, musical background) for collaborative tasks, P1 and P2 likely have a better chance to "interact" during collaborative musical improvisation; P3 should present some different responses. Was this heterogeneity voluntarily targeted?

  • Impedance was set to less than 25 kOhms. This appears relatively high compared to classic/regular use in the field. Can you provide some methodological references/guidelines?

  • Analysis methods (EEG features, temporal and spatial) are well detailed and are suitable for the data set. The assessment of brain synchrony (maybe state "brain-to-brain communication") across two ecological sections (synchronized or not) and three participant conditions (active vs passive in a dyad) of the jazz performance is relevant and within the scope of Neuroergonomics. The statistical analyses are complete.

Results

  • There are many results, some of them very observational due to the exploratory nature of this case study. However, can the authors select the main findings regarding the brain-to-brain communication of jazz musicians during the collaborative musical improvisation sequences?

Discussion

  • This section presents some first evidence and underlines well some specific intra- and inter-brain coupling modulated by musical training and previous experience. Perspectives and methodological limitations are provided. Body movements, as main sources of EEG artefacts, should be captured in a further study in order to relate synchronisation at the behavioral level to its neural correlates.

  • Emotion likely plays a role in the proposed collaborative musical improvisation task (see doi: 10.3389/fnhum.2022.944241), not only between the musicians but also with the audience, as musicians interact with a live audience during a jazz performance. Please advise.

  • This study included freely-moving expert musicians. Did you collect/record the body motion of the musicians? Do you have any idea of the kind of movements over the two defined sequences (SP and DP)? Also, it might be surprising not to observe brain synchronization in the motor regions. The ability to synchronize movement with auditory rhythms relies on motor networks, such as cortical areas, the basal ganglia and the cerebellum, which also participate in rhythm perception and movement production (doi: 10.1016/j.neubiorev.2019.12.024). Have you looked at these cortical regions?

  • Minor: correct "synchornicity in sensory areas".

Is the case presented with sufficient detail to be useful for teaching or other practitioners?

Yes

Is the work clearly and accurately presented and does it cite the current literature?

Yes

If applicable, is the statistical analysis and its interpretation appropriate?

Yes

Are all the source data underlying the results available to ensure full reproducibility?

Yes

Are the conclusions drawn adequately supported by the results?

Yes

Is the background of the case’s history and progression described in sufficient detail?

Yes

Reviewer Expertise:

Neuroscience and neuroimaging in the field.

I confirm that I have read this submission and believe that I have an appropriate level of expertise to confirm that it is of an acceptable scientific standard, however I have significant reservations, as outlined above.

References

  • 1. Positive emotions foster spontaneous synchronisation in a group movement improvisation task. Front Hum Neurosci. 2022;16:944241. doi: 10.3389/fnhum.2022.944241
  • 2. Why do we move to the beat? A multi-scale approach, from physical principles to brain dynamics. Neurosci Biobehav Rev. 2020;112:553-584. doi: 10.1016/j.neubiorev.2019.12.024
F1000Res. 2023 Apr 24.
Mauricio Ramirez 1

We thank the reviewer for the time invested in reading our work and making such useful comments. Below is a response to each of the comments (in bold letters), as well as the locations of the corrections made to the manuscript. All changes are included in the next version of the manuscript.

 

  • Concerning the following sentence, the additional insight offered by the bispectrum, when compared to other neural synchrony metrics, needs to be clarified further. First, what is meant here by "a more complete intuition on phase coupling, resonance, temporal synchronization and non-linear interactions between any analyzed signal pair"? Second, please highlight the added value of the bispectrum, if any.

    Introduction, Page 2, Lines 111-115. The mentioned paragraph was extended for clarity. References were included, as well as some advantages of using the bispectrum metric (see also the illustrative sketch below).
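    [Editorial note: the following is a minimal illustrative sketch of the kind of quantity under discussion, not the authors' pipeline. It estimates a segment-averaged cross-bispectrum between two EEG channels, one per musician, using one common convention, B_xy(f1, f2) = E[X(f1) X(f2) Y*(f1 + f2)]; the segment length, window, and FFT size are assumed values.]

```python
# Illustrative sketch only: direct, segment-averaged cross-bispectrum estimator
# between two signals (e.g. one EEG channel from each musician).
import numpy as np

def cross_bispectrum(x, y, fs, seg_len=256, nfft=256):
    """Return frequencies and |B_xy(f1, f2)| averaged over non-overlapping segments."""
    n_segs = len(x) // seg_len
    half = nfft // 2
    window = np.hanning(seg_len)
    B = np.zeros((half, half), dtype=complex)
    for k in range(n_segs):
        xs = x[k * seg_len:(k + 1) * seg_len] * window
        ys = y[k * seg_len:(k + 1) * seg_len] * window
        X = np.fft.fft(xs, nfft)
        Y = np.fft.fft(ys, nfft)
        for i in range(half):
            for j in range(half - i):      # keep f1 + f2 below the Nyquist frequency
                B[i, j] += X[i] * X[j] * np.conj(Y[i + j])
    freqs = np.fft.fftfreq(nfft, d=1.0 / fs)[:half]
    return freqs, np.abs(B) / max(n_segs, 1)

# Toy usage: two 10-second signals sharing a 10 Hz component, sampled at 250 Hz.
fs = 250
t = np.arange(0, 10, 1.0 / fs)
x = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)
y = np.sin(2 * np.pi * 10 * t + 0.4) + 0.5 * np.random.randn(t.size)
freqs, B = cross_bispectrum(x, y, fs)
```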

  • The following sentence without a reference does not add to the rationale of this study and could be removed: "Musical improvisation can also be considered as an 'on the fly' composition, one that is temporally ubiquitous, spontaneous, and is not restricted by critique."

    Introduction, Page 2, Lines 169-173. The sentence was rephrased, and a new reference from a music composition book was added.

  • Jazz performance needs to be defined/characterized in the introduction; different stages do exist.

    Introduction, Page 3, Lines 176-186. A paragraph with more detail about jazz and free jazz performance was added, along with a reference.

  • " Brain-to-brain communication", as an important metric proposed by authors is cited as a shared cognitive state. Please confirm is this is right and how this metric is associated to hyperscanning methods.

    Introduction, Page 3, Lines 231-238. A paragraph and a reference were included to add more detail about the use of hyperscanning for assessment of brain-to-brain communication. 

  • Overall, the introduction delivers many terms around hyperscanning techniques, which makes it difficult to read.

    Introduction, Pages 1-3. Subsections were added to the Introduction to guide the reader through the text. No text was removed since, in further comments, the reviewer asks that some terms be explained clearly.

  • Please explain what a "creative collaboration" task is; it is not really defined in the introduction. Thereafter you write "collaborative musical improvisation". Are you saying the same thing?

    Introduction, Page 3, Line 230. Yes, they imply the same idea; however, the term “collaborative musical improvisation” was used instead, to avoid confusion.

  • In the following sentence (end of the introduction), since the previously underlined/questioned terms are present, they have to be sufficiently clear: "The current study examines the neural correlates of brain-to-brain communication of jazz musicians during collaborative musical improvisation through hyperscanning".

    Thank you for the comment; we made sure the previously questioned terms are clear, as suggested.

  • Mobile EEG can impose movement constraints. This is not so straightforward. Please adjust accordingly.

    Introduction, Page 3, Lines 263-264. A clarification was added: the movement constraints imposed by mobile EEG are smaller than those of other brain activity measurement techniques, for example fMRI scans.

Methods

  • Participants. In terms of experience (years performing jazz together, musical background) for collaborative tasks, P1 and P2 likely have a better chance to "interact" during collaborative musical improvisation; P3 should present some different responses. Was this heterogeneity voluntarily targeted?

    Methods, Page 4, Lines 316-319. This heterogeneity was not targeted on purpose; these were the volunteers who agreed to participate in the study. An observation in this regard was included.

  • Impedance was set to less than 25 kOhms. This appears relatively high compared to classic/regular use in the field. Can you provide some methodological references/guidelines?

    Methods, Page 4, Lines 335-339. A reference explaining how this value was selected was added.

  • Analysis methods (EEG features, temporal and spatial) are well detailed and are suitable for the data set. The assessment of brain synchrony (maybe state "brain-to-brain communication") across two ecological sections (synchronized or not) and three participant conditions (active vs passive in a dyad) of the jazz performance is relevant and within the scope of Neuroergonomics. The statistical analyses are complete.

    Thank you, we appreciate your comments.

Results

  • There are many results, some of them very observational due to the exploratory nature of this case study. However, can the authors select the main findings regarding the brain-to-brain communication of jazz musicians during the collaborative musical improvisation sequences?

    Results, Page 10, Lines 730-744. Main findings were added and highlighted at the end of the Results section, in a new subsection.

Discussion

  • This section presents some first evidence and underlines well some specific intra- and inter-brain coupling modulated by musical training and previous experience. Perspectives and methodological limitations are provided. Body movements, as main sources of EEG artefacts, should be captured in a further study in order to relate synchronisation at the behavioral level to its neural correlates.

    Discussion, Page 15, Lines 1049-1051. Thank you, this is a great comment, which can indeed be explored in a future study. A note in this regard was added to the Discussion section.

  • Emotion likely plays a role in the proposed collaborative musical improvisation task (see doi: 10.3389/fnhum.2022.944241), not only between the musicians but also with the audience, as musicians interact with a live audience during a jazz performance. Please advise.

    Discussion, Page 15, Lines 1052-1059. A comment on the emotional context during musical improvisation was included, as suggested.

  • This study included freely-moving expert musicians. Did you collect/record the body motion of the musicians? Do you have any idea of the kind of movements over the two defined sequences (SP and DP)? Also, it might be surprising not to observe brain synchronization in the motor regions. The ability to synchronize movement with auditory rhythms relies on motor networks, such as cortical areas, the basal ganglia and the cerebellum, which also participate in rhythm perception and movement production (doi: 10.1016/j.neubiorev.2019.12.024). Have you looked at these cortical regions?

    Only head motion was collected, and it was not analyzed in this study; it will indeed be included in future steps. The auditory-motor synchronization described in the references concerns such a functional network within one person. However, when analyzing our data, inter-brain synchronization between motor areas, or between motor and auditory areas, was not observed. We believe this was due to the individuality of movement in the production of sounds (motor areas not in sync), whereas perceptual information and processes were shared among the musicians, hence the synchrony in temporal, parietal, and occipital regions (perceptual areas in sync).

  • Minor: correct "synchornicity in sensory areas".

    This error was fixed. Discussion, Page 12, Line 838.

Associated Data

    This section collects any data citations, data availability statements, or supplementary materials included in this article.

    Data Availability Statement

    Underlying data

    OSF: MOBILE EEG RECORDINGS OF MUSICAL (JAZZ) IMPROVISATION.

    https://doi.org/10.17605/OSF.IO/YUEQK 98

    This project contains the following underlying data:

    • Block1_P1.mat (EEG data - Recording Block1, Participant 1).

    • Block1_P2.mat (EEG data - Recording Block1, Participant 2).

    • Block1_P3.mat (EEG data - Recording Block1, Participant 3).

    • Block2_P1.mat (EEG data - Recording Block2, Participant 1).

    • Block2_P2.mat (EEG data - Recording Block2, Participant 2).

    • Block2_P3.mat (EEG data - Recording Block2, Participant 3).

    • Impedances.xlsx (Impedance values of EEG electrodes from all participants, at start and end of recordings).

    • Performance Notes.xlsx (Notes with times of trials, segments and relevant events during the performance).

    • ZOOM0001.mp3 (Audio recording of the complete performance).

    • Blaffer_Floor_1210.mp4 (Video Recording1 from the performance).

    • Blaffer_Floor_1221.mp4 (Video Recording2 from the performance).

    Extended data

    OSF: MOBILE EEG RECORDINGS OF MUSICAL (JAZZ) IMPROVISATION.

    https://doi.org/10.17605/OSF.IO/YUEQK 98

    This project contains the following extended data:

    • Extended Data.pptx (An extended data file containing additional figures and tables from this work).

    Data are available under the terms of the Creative Commons Zero “No rights reserved” data waiver (CC0 1.0 Public domain dedication).
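    [Editorial note: the following is a minimal sketch, not part of the original article, of one way the shared .mat EEG files might be inspected in Python. It assumes standard MATLAB-format files readable by scipy.io.loadmat; the variable names and array layouts inside each file are not documented here, so the code only lists them.]

```python
# Assumption-laden sketch for inspecting one of the shared .mat files.
# Actual variable names and shapes must be checked by the reader.
from scipy.io import loadmat

data = loadmat("Block1_P1.mat")        # EEG data - Recording Block 1, Participant 1
for name, value in data.items():
    if not name.startswith("__"):      # skip MATLAB file header entries
        print(name, getattr(value, "shape", type(value)))
```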


