Journal of Undergraduate Neuroscience Education. 2019 Jun 30;17(2):A119–A124.

Hands-on Undergraduate Experiences Using Low-Cost Electroencephalography (EEG) Devices

Jennifer A Segawa 1,
PMCID: PMC6650260  PMID: 31360127

Abstract

Most methods used in cognitive neuroscience use expensive equipment that requires extensive training. This normally limits the hands-on experiences available to undergraduate neuroscience students, despite the known benefits of this type of learning. However, new commercially-available electroencephalography (EEG) systems aim to make the classic methodology available to laypeople, for instance, for the purposes of meditation practice. In this study, we evaluated the use of one such device – the Muse headband – to teach undergraduate neuroscience majors about cognitive neuroscience methodology and the research process. Students at Stonehill College practiced using the devices and then conceived, designed, and implemented their own experiments related to a topic of their choosing as part of a Research Methods in Neuroscience course. Objectively, students better retained material related to their experience compared with material only presented in lecture. Subjectively, they reported better understanding the material because of their experiences. They also reported that the experience made them more excited about studying neuroscience.

Keywords: electroencephalography (EEG), cognitive neuroscience


Past literature has shown the advantages of active learning (Oliver-Hoyo et al., 2004; Michael, 2006; Freeman et al., 2014) and project-based learning (cf. Thomas, 2000) over a traditional lecture format, but it is challenging to provide these experiences when teaching cognitive neuroscience, especially its methodology. For instance, one of the most popular techniques in the field today is magnetic resonance imaging (MRI). MRI scanners cost millions of dollars, require months of training to operate, and pose safety concerns even for trained users. Most undergraduate students are therefore unlikely to have access to an MRI scanner for their own research, and even less likely to operate one independently.

In contrast, electroencephalography (EEG) – which measures the electrical activity of the brain from the scalp – is a more accessible methodology for undergraduates (Steinmetz and Atapattu, 2010; Nyhus and Curtis, 2016; Shields et al., 2016). It does not have the safety concerns associated with MRI and requires less training. However, typical research EEG systems cost tens or hundreds of thousands of dollars; faculty are unlikely to entrust such equipment to relatively untrained undergraduates in a classroom context where individual supervision is difficult. In addition, while faculty may have an EEG system for their own research, they likely have only one such system, so a class of students cannot use the equipment independently for an extended period.

In contrast, new commercially available EEG devices, such as the MUSE EEG system (InteraXon), are built to be rugged and easy to use without extensive training. They are portable, so small groups of students can each use their own system for an entire semester and take it to whatever testing conditions their project requires. Additionally, MUSE – marketed for personal meditation practice – is inexpensive enough that purchasing a classroom set is feasible even for a small institution. This paper describes the use of the MUSE EEG system in a course in which an entire class of undergraduate students had hands-on experience designing and implementing their own EEG experiments.

Students used the EEG systems throughout the semester. First, in guided exercises, students learned to use the systems to collect and then analyze data. Then, students developed independent projects based on a cognitive topic of their choosing, which they planned and implemented under faculty guidance.

To evaluate the EEG system as a pedagogical tool, we used both quantitative and qualitative assessment methods at the end of the semester. We examined objective differences in students’ learning between knowledge gained from the hands-on EEG methods compared to information presented only in lecture. We also measured students’ perception of an EEG system as a teaching tool. To explore any other issues or insights from students, we conducted group interviews.

MATERIALS AND METHODS

Participants

Thirteen junior and senior neuroscience majors at Stonehill College were enrolled in Research Methods in Neuroscience in Fall 2017. Twelve students were female, one was male. They completed the EEG exercises and their student-led project as part of the course, and voluntarily participated in subsequent assessments. The projects and assessments were conducted with the approval of the Stonehill College Institutional Review Board.

Equipment: EEG Headbands And Interface Devices

The MUSE headband by InteraXon (InteraXon, 2018) costs $200 per system. It is marketed toward laypeople to improve their meditation practice using EEG biofeedback. The EEG system is built into a headband worn across the user’s forehead, wrapping behind the ears. It has five sensors: two behind the ears (approximately TP9 and TP10 in the 10–20 system), two on the forehead (approximately AF7 and AF8), and a reference sensor positioned in the middle of the forehead (approximately FPz).

Signals from the headband sensors are sent via Bluetooth to an Android or Apple device. We used the MUSEMonitor app to connect to the headbands ($9.99 for Android, $14.99 for Apple; see Muse Monitor). The app shows real-time signals arranged either by sensor or as discrete frequency values on a log scale (Figure 1). In addition, data can be saved in comma separated value (CSV) format and exported using a variety of methods (e.g., email, cloud-based servers).

Figure 1. The MUSE Monitor interface showing raw data (μV) across the 4 MUSE headband sensors.
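
To make the export workflow concrete, the short sketch below loads a Muse Monitor CSV export and plots each channel for visual inspection. It is a minimal illustration, not part of the course materials; the file name is hypothetical, and the RAW_* column names follow the app's export format but should be verified against an actual export.

    % Minimal sketch (not the course's materials): load a Muse Monitor CSV
    % export and plot each channel for visual inspection.
    T = readtable('muse_recording.csv');            % hypothetical file name
    channels = {'RAW_TP9','RAW_AF7','RAW_AF8','RAW_TP10'};
    figure;
    for i = 1:numel(channels)
        subplot(numel(channels), 1, i);
        plot(T.(channels{i}));                      % raw signal in microvolts
        ylabel(channels{i}, 'Interpreter', 'none');
    end
    xlabel('Sample number');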

We used older, donated smartphones to reduce not only the cost of the devices themselves but also the cost associated with the app (as compared to students using their personal devices). Both Google Play and the Apple App Store allow users to share an app with up to 10 devices under their account from a single purchase. This also allowed the faculty member to set up the Muse headbands and their associated devices before class, so students could start immediately on the main scientific content of the exercises. Additionally, providing devices and the app removed the financial burden from students and provided technology to students who may not have smartphones.

EEG Class Exercises: Learning To Collect And Analyze Data

To learn how to use the EEG headbands, teams of two or three students completed two in-class guided exercises (Segawa, 2018) over two weeks. The first exercise focused on data collection and the second on data analysis. Each exercise took about 3 hours to complete and consisted of written instructions and corresponding closed- and open-ended questions. Students conducted the guided exercises relatively independently, but the faculty member periodically checked students’ work during the exercise and intervened when asked by students or when she observed a group in need of help. The students turned in their responses to the questions posed in the exercises. The following week, the faculty member provided individual written feedback, and the class discussed common missteps as a group.

Exercise 1: Training and Data Collection

During the data collection exercise, one student in the group acted as the “participant” and the other or others as the “experimenter(s)”. Then, time permitting, students alternated roles to experience both sides of the study. Students first practiced placing the headband using anatomical landmarks, based on the International 10–20 system for scalp electrodes. They learned to recognize a noisy EEG signal and practiced improving it, for instance, by cleaning the participant’s skin, refitting the headband for a closer fit, or moving away from other electronic devices with interfering electromagnetic fields.

Then, to identify (and later rectify) common EEG artifacts encountered when collecting data, such as those generated by blinking or head motion, the participant intentionally generated these artifacts. The students described the resulting signals and then compared them across artifact types (Figure 2). For instance, eye-blink artifacts tend to affect primarily the frontal electrodes, whereas head motion artifacts affect all electrodes; tension in the facial muscles creates constant high-frequency noise, whereas each eye-blink generates a single, Gaussian-shaped signal.

Figure 2. Examples of (A) an artifact-free raw signal, (B) a signal with two eye-blink artifacts (indicated by the arrows), and (C) a signal showing facial muscle tension, as seen with the MUSEMonitor app.
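
For groups who want to go beyond visual inspection, a simple amplitude-threshold check is one way to flag contaminated segments automatically. The sketch below is an illustrative assumption, not the approach used in the course; the 150 μV peak-to-peak cutoff and 1-second window are arbitrary starting points.

    % Minimal sketch (assumed approach, not the course's script): flag any
    % 1-second window whose peak-to-peak amplitude exceeds a threshold.
    % 'eeg' is one channel's raw signal in microvolts.
    fs  = 256;                                  % Muse sampling rate (Hz)
    win = fs;                                   % 1-second windows
    thresholdUV = 150;                          % illustrative peak-to-peak cutoff
    nWin = floor(numel(eeg) / win);
    isArtifact = false(nWin, 1);
    for k = 1:nWin
        seg = eeg((k-1)*win + 1 : k*win);
        isArtifact(k) = (max(seg) - min(seg)) > thresholdUV;
    end
    fprintf('%d of %d windows flagged as possible artifacts\n', sum(isArtifact), nWin);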

Finally, once the EEG headband was properly fitted, students recorded data during several tasks that typically elicit greater power in a specific frequency band. For instance, the participant closed his or her eyes and relaxed to elicit greater power in the alpha band (centered on 10 Hz; cf. Barry et al., 2007). The subsequent data analysis exercise was performed on these data.

Exercise 2: Data Analysis

In the data analysis exercise, students used a MATLAB-based script (Figure 3; created by the author, available online). They preprocessed the data by examining the raw signals to find and remove data from noisy electrodes, and then found and removed any time segments with artifacts. A fast Fourier transform was then used to calculate power across frequencies. After identifying the peak in power around 60 Hz caused by AC electrical current, students notch-filtered the data at that frequency to remove its effect. Finally, students extracted the average power in frequency bands of interest (10*log10(μV²/Hz)) – averaged, for instance, across 8–12 Hz for the alpha band – for each data set collected the previous week and compared power across conditions.

Figure 3. Output of the data analysis script. The top plot shows the overlaid raw signals from the electrodes; here, TP10 was very noisy, so it was excluded using the script. The bottom plot shows the power spectrum for TP9.
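
As a rough illustration of the pipeline described above (a minimal sketch under assumed parameters, not the course's actual script), the MATLAB fragment below notch-filters 60 Hz mains noise, estimates the power spectrum with Welch's method, and averages power over the alpha band. It assumes the Signal Processing Toolbox and an artifact-free single-channel signal 'eeg' in microvolts.

    % Minimal sketch of the band-power analysis (assumptions noted above).
    fs = 256;                                   % Muse sampling rate (Hz)
    % Notch filter around 60 Hz to remove AC line noise
    d = designfilt('bandstopiir', 'FilterOrder', 2, ...
                   'HalfPowerFrequency1', 59, 'HalfPowerFrequency2', 61, ...
                   'DesignMethod', 'butter', 'SampleRate', fs);
    eegFilt = filtfilt(d, eeg);
    % Power spectral density via Welch's method (windowed FFT), in uV^2/Hz
    [pxx, f] = pwelch(eegFilt, fs*2, fs, [], fs);
    % Average alpha-band power, expressed as 10*log10(uV^2/Hz)
    alphaIdx   = f >= 8 & f <= 12;
    alphaPower = 10*log10(mean(pxx(alphaIdx)));
    fprintf('Alpha band power: %.2f dB\n', alphaPower);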

EEG Projects

In teams of two or three, students chose a research topic, and formulated and tested a hypothesis. Projects varied across a wide range of cognitive fields:

  • “Neural Differences Between Tasting Sweet and Sour” measured frequency band power between conditions in which a participant was given a sour candy or a sweet candy. The project also compared trials in which the experimenters accurately described the candy flavor with trials in which participants were surprised with an incorrect description (e.g., they were told they were receiving a sour candy but were given a sweet candy).

  • “An Examination of the Effects of Calming and Excitatory Music on the Human Brain through Electroencephalography” compared oscillatory activity of participants listening to different genres of music (classical and “screamo”), as well as the effect of participants’ self-described music preferences.

  • “How Stressed are You Really?” examined how brain waves changed when students with various basal levels of stress underwent a stress-inducing task with negative reinforcement.

  • “Perceptions of Language and Music” compared the brain waves of English-Spanish bilinguals and native English speakers learning Spanish when listening to songs in Spanish and English.

Final presentations for several projects are available online (Segawa, 2018).

The project was scaffolded throughout the semester, coinciding with lecture material. During the semester, the steps were as follows: the data collection and analysis exercises described above, topic proposals, literature searches, method formulation, creation of IRB materials, piloting, data collection, data analysis, and written and oral presentation of results. In addition, drafts of research paper sections were assigned in parallel with their analogous steps; for instance, a draft of the methods section was due after the students formulated their methods. The course syllabus and schedule are available online (Segawa, 2018). At each step, the project was vetted and refined with the professor for ethics, sound scientific methods, and the practicality of completing the project within the semester with the available resources.

Quantifying Students’ Learning and Experiences

During the last week of the semester, we measured two facets of the students’ experiences with the MUSE EEG system: 1) their content retention of EEG experience-related material, and 2) their subjective experience using the EEG systems in class. These assessments were conducted by an outside faculty member from the College’s Center for Teaching and Learning who had not had previous contact with the students. The students were informed that their responses were anonymous and had no bearing on their grades. The teaching faculty member was not present for any portion of the assessment, and the resulting data were compiled by the outside faculty member.

To test if and how their experience enhanced learning, we compared students’ retention of material related to the EEG experience with retention of other material that was presented in lecture at the same time but not used in the exercises and project. In lecture, students learned about two types of EEG analyses: those measuring frequency band power – as in their research projects – and those measuring event-related potentials (ERPs). ERPs are deflections in the EEG signal in response to a stimulus or cognitive event. Both frequency bands and ERPs were presented in the same lecture, but only frequency bands (and the analogous methodology) were reinforced with the EEG exercises and project.

Short-answer, open-ended questions related to the EEG experience included not only material on frequency bands but also methodological questions about EEG artifacts and electrode positioning:

What cognitive state is typically associated with increased power in the alpha band?

Based on its name, where on the head is the Fz electrode located?

In contrast, questions about material only presented in lecture included questions about ERPs and other aspects of methodology not reinforced by students’ hands-on experiences:

Why do we need a reference electrode for EEG measurements?

What cognitive process is typically associated with the P300?

See Table 1 for the full list of questions and the online material for scoring criteria. This approach of comparing different sets of material was not ideal: it confounds the EEG experience with exposure duration and compares different sets of questions. However, a control group was not available and we could not compare responses on identical questions, so only a within-group comparison using different questions was practical.

Table 1.

Questions and descriptive statistics for content knowledge assessment. Percentages indicate the percent of students correctly answering the question. See the online material for a detailed description of how each question was scored (Segawa, 2018).

EEG experience questions % correct
Name two sources of artifacts one might see in an EEG recording. 100%
Name the landmarks used to position electrodes in the 10–20 system. 53.8%
Based on its name, where on the head is the Fz electrode located? 80.8%
What cognitive state is typically associated with increased power in the alpha band? 84.6%
Lecture-only questions
What does the acronym ERP stand for? 61.5%
What cognitive process is typically associated with the P300? 46.2%
Why do we need a reference electrode for EEG measurements? 30.8%
What does a Fourier transform do? 38.5%

In addition to the content assessment, we asked students about their subjective experience with the MUSE EEG system using a written survey of six closed-ended questions (Table 2). The first four questions asked whether the EEG project enhanced their learning of various topics – conducting scientific research, the biological basis of EEG, collecting EEG data, and analyzing EEG data – over a traditional lecture. For balance, half of these questions were phrased to ask whether a lecture format was more helpful, and the rest were phrased to ask whether the headbands were more helpful.

Table 2.

Questions and descriptive statistics for student evaluations of the EEG experience. Higher values indicate students’ subjective preference for the EEG experience; ratings ranged from 1 (preference for lecture format or dislike of headband) to 5 (preference for the use of the EEG system or enjoyment of headband).

Mean SD
Because of the EEG headbands, I understand how to conduct scientific research better than if I had only learned about it in a lecture. 4.62 0.51
I would have understood the biological basis of EEG just as well if I only learned about it in a lecture. 4.15 0.38
Because of the EEG headbands, I understand how to collect EEG data better than if I only learned about it in a lecture. 4.77 0.44
I would have understood how to analyze EEG data just as well if I only learned about it in a lecture. 4.46 0.52
The EEG headbands increased my interest in studying neuroscience. 4.85 0.38
Should the professor use the EEG headbands in the future? 5.00 0.00

The 5-point rating scale values always indicated the same sentiments. While the exact wording accompanying each question’s rating scale was tailored to the question, 1 always indicated that the student thought a lecture would have been just as effective as the EEG experience, 3 always indicated that the student did not have a preference, and 5 always indicated that the student strongly preferred the EEG experience.

Finally, to illuminate issues beyond the closed-ended questions, the outside faculty member conducted a semi-structured group interview with the students. The interview consisted of open-ended questions asking students about their experience, what they learned, what challenges they faced, and what they would change about the assignments.

RESULTS

Content Knowledge Assessment

To compare knowledge retention of content related to the EEG experience with similar content taught only in lecture but not used in the exercises or project, we used a paired t-test (Table 1, Figure 4). Consistent with our expectation that hands-on experience with the EEG system would bolster knowledge retention, students were significantly more accurate on EEG-experience questions (mean = 79.81% correct, standard deviation = 25.79%) than on lecture-only questions (mean = 44.23% correct, standard deviation = 23.17%), t(12) = 3.98, p < 0.001.

Figure 4. On average, students correctly answered significantly more questions related to the EEG experience than questions related to content covered only in lecture on the content knowledge assessment (p < 0.001). Error bars indicate standard error.
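
For readers who want to reproduce this kind of comparison with their own class, the fragment below shows a paired t-test on per-student proportion-correct scores. The score vectors are hypothetical placeholders, not the study's data, and the ttest function assumes MATLAB's Statistics and Machine Learning Toolbox.

    % Minimal sketch of the paired comparison (placeholder data, see above).
    expScores = [1.00 0.75 0.75 1.00 0.50 0.75];    % EEG-experience questions
    lecScores = [0.50 0.25 0.50 0.75 0.25 0.50];    % lecture-only questions
    [~, p, ~, stats] = ttest(expScores, lecScores); % paired t-test
    fprintf('t(%d) = %.2f, p = %.4f\n', stats.df, stats.tstat, p);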

Student Experience: Closed-Ended Questions

In all cases, students indicated a strong preference for the use of the EEG system in class over only learning the material in a lecture format (Table 2). On average, on a 5-point scale with 1 being a preference for lecture and 5 being a preference for the use of the EEG system, students said their hands-on experience with the EEG system increased their understanding of the following:

  • the biological basis of EEG (mean: 4.15)

  • how to conduct scientific research (mean: 4.62)

  • how to collect EEG data (mean: 4.77)

  • how to analyze EEG data (mean: 4.46)

We also asked students if using the EEG systems increased their interest in studying neuroscience. They almost unanimously responded that the systems “…made studying neuroscience come alive” (mean: 4.85). Finally, we asked students if the professor should use the EEG systems again in the future, and students unanimously responded 5:

“…keep giving students this opportunity!”

Student Experience: Open-Ended Questions

In addition, students were asked a series of open-ended questions to illuminate aspects of their experiences that were not covered by the closed-ended survey. In the interview, the students’ overall view of the use of EEG was very clear: they reported that they enjoyed the EEG and appreciated the ability to conduct a “real” independent experiment. Many said that using the EEG system got them (and their friends) more interested in neuroscience. Their follow-up comments were both enthusiastic and enlightening.

Specifically, when asked, “What was your experience like with the headsets? Did you have fun with it?” students unanimously answered “Yes”. Representative comments were:

  • “It was the coolest thing ever!”

  • “We did some testing with our housemates, and they were all like ‘WHAT? You guys get to do that?’ Like everybody we’d be around when we’d test would say, ‘Can I see MY brainwaves?’ or, ‘I wish I did science!’”

Students were also asked “What did you learn, intellectually and experientially?” In general, they reported they learned how the devices, and EEGs, work. They also said:

  • “I learned a lot about how EEGs work, and like later, reading scientific articles, I was better able to understand what they were talking about.”

  • “It made me want to be a neuroscientist.”

  • “I liked the experience of using something that is going to be actually used in the field.”

  • “It was cool to design an experiment and see what actually worked well.”

  • “… nice to do real work like in the field…”

  • “It made it ‘really concrete’.”

DISCUSSION

We have described the methodology for using MUSE headbands to give undergraduate neuroscience students extended hands-on experience using EEG. Students worked in small groups, first to practice using the system, and then to design an EEG paradigm on a topic of their choosing, collect and analyze data, and present their results.

Students better retained material related to their experiences and reported enjoying their experiences. To quantify these effects, we compared retention of material related to the EEG system to other EEG-related material presented only in lecture. Students retained significantly more material related to their experience compared to the control material. Moreover, students reported the experience increased their understanding of scientific research and EEG methodology, and that they subjectively enjoyed the experience. They unanimously agreed that the EEG system should continue to be used in the class.

In general, students developed clear and novel hypotheses for their EEG projects. They generated sound scientific paradigms and collected clean data. And despite small sample sizes – 8–10 participants per project – four of the six groups had at least one significant or near-significant finding. For instance, in the “Perceptions of Language and Music” project, the team found that native English speakers who were learning Spanish had significantly higher alpha power in right-hemisphere electrodes compared to native Spanish speakers when listening to music with Spanish lyrics. In “Neural Differences Between Tasting Sweet and Sour,” the left-hemisphere electrodes showed near-significant differences in beta power between the sweet and sour tasting conditions. This demonstrates that even with a small number of participants and a minimal number of electrodes per headband, students can produce scientifically sound experiments with the MUSE EEG system.

We found two main limitations to the learning experiences described here. First, our students’ projects measured frequency band power of the EEG data; they could not measure ERPs, which is the other predominant paradigm in EEG research. While others have reported using the system to measure ERPs (Krigolson et al., 2017), the software development kit (SDK) necessary for ERP paradigms is not supported for the more recent MUSE 2017 systems; only the older 2014 systems can use the SDK. Moreover, our students generally lack the programming skills needed to create an ERP experimental paradigm. The second limitation was the MATLAB-based script used to preprocess and analyze the data. Most students had never used command-line prompts before, and the script proved more difficult to use than expected. In the future, we plan to develop a GUI that will be easier for students to use.

In addition, many students expressed their desire for more time for their EEG projects, both to collect and analyze data. They also asked for a structured system with which to recruit and schedule participants. Both suggestions will be implemented in the next iteration of the course.

A minor limitation is one that affects all undergraduate-level class projects: limited resources. Projects included fewer than 15 participants, and low statistical power was a common issue. This also forced students to choose simple study designs; most projects compared only two groups or two conditions, even though many students initially wished to pursue more complex hypotheses. Similarly, study materials – other than the EEG system – were limited to those already accessible to or easily created by students, e.g., free visual stimuli from Google Images, easily available foods for gustatory stimuli, or music from YouTube.

Despite these limitations, this study demonstrates the utility of a low-cost, portable, commercially available EEG system for providing students with real-world, hands-on training in the field of cognitive neuroscience. We found it to be a beneficial tool for teaching students about the scientific method, EEG, and the various cognitive neuroscience topics that they chose to study (e.g., music perception, language, stress, sensation). Students learned more than they would have from a traditional lecture, were deeply engaged, and had fun.

Footnotes

The author has no financial relationship with the makers of MUSE (InteraXon). The author thanks Dr. Phyllis Thompson for conducting the surveys and interviews, Dr. Hilary Gettman for the invaluable feedback, and the students in NEU271 for their creativity, patience, and enthusiasm for neuroscience.

REFERENCES

  1. Barry RJ, Clarke AR, Johnstone SJ, Magee CA, Rushby JA. EEG differences between eyes-closed and eyes-open resting conditions. Clin Neurophysiol. 2007;118(12):2765–2773. doi: 10.1016/j.clinph.2007.07.028.
  2. Freeman S, Eddy SL, McDonough M, Smith MK, Okoroafor N, Jordt H, Wenderoth MP. Active learning increases student performance in science, engineering, and mathematics. PNAS. 2014;111(23):8410–8415. doi: 10.1073/pnas.1319030111.
  3. InteraXon. MUSE. 2018. Available at https://choosemuse.com/
  4. Krigolson OE, Williams CC, Norton A, Hassall CD, Colino FL. Choosing MUSE: Validation of a low-cost, portable EEG system for ERP research. Front Neurosci. 2017;11:109. doi: 10.3389/fnins.2017.00109.
  5. Michael J. Where’s the evidence that active learning works? Adv Physiol Edu. 2006;30(4):159–167. doi: 10.1152/advan.00053.2006.
  6. Muse Monitor. Download. Available at https://musemonitor.com/
  7. Nyhus E, Curtis N. Incorporating an ERP project into undergraduate instruction. J Undergrad Neurosci Ed. 2016;14(2):A91–96.
  8. Oliver-Hoyo MT, Allen D, Hunt WF, Hutson J, Pitts A. Effects of an active learning environment: Teaching innovations at a research I institution. J Chem Edu. 2004;81(3):441–448.
  9. Segawa JA. Teaching. 2018. Available at https://www.segawalab.com/teaching.
  10. Shields SM, Morse CE, Applebaugh ED, Muntz TL, Nichols DF. Are electrode caps worth the investment? An evaluation of EEG methods in undergraduate neuroscience laboratory courses and research. J Undergrad Neurosci Ed. 2016;15(1):A29–37.
  11. Steinmetz KRM, Atapattu RK. Meeting the challenge of preparing undergraduates for careers in cognitive neuroscience. J Undergrad Neurosci Ed. 2010;9(1):A36–42.
  12. Thomas JW. A Review of Research on Project-Based Learning. San Rafael, CA: Autodesk Foundation; 2000.
