Abstract
Purpose
Brain–computer interfaces (BCIs) can provide access to augmentative and alternative communication (AAC) devices using neurological activity alone without voluntary movements. As with traditional AAC access methods, BCI performance may be influenced by the cognitive–sensory–motor and motor imagery profiles of those who use these devices. Therefore, we propose a person-centered, feature matching framework consistent with clinical AAC best practices to ensure selection of the most appropriate BCI technology to meet individuals' communication needs.
Method
The proposed feature matching procedure is based on the current state of the art in BCI technology and published reports on cognitive, sensory, motor, and motor imagery factors important for successful operation of BCI devices.
Results
Considerations for successful selection of BCI for accessing AAC are summarized based on interpretation from a multidisciplinary team with experience in AAC, BCI, neuromotor disorders, and cognitive assessment. The set of features that support each BCI option are discussed in a hypothetical case format to model possible transition of BCI research from the laboratory into clinical AAC applications.
Conclusions
This procedure is an initial step toward consideration of feature matching assessment for the full range of BCI devices. Future investigations are needed to fully examine how person-centered factors influence BCI performance across devices.
Children and adults who cannot communicate, or have difficulty communicating, often rely on compensatory strategies to convey their thoughts and intentions that may include augmentative and alternative communication (AAC; Fager, Beukelman, Fried-Oken, Jakobs, & Baker, 2012). A large number of AAC options and access methods currently exist to support individuals with complex communication needs. Some access methods include directly selecting a communication icon using a touch screen–type interface or controlling an on-screen selection cursor using eye-gaze tracking, head mouse, or light indicator. In contrast, scanning is an indirect access method for selecting a desired communication item where a predetermined movement (e.g., a limb movement) indicates one's intention to a communication partner or operates a selection switch (American Speech-Language-Hearing Association, 2017; Beukelman & Mirenda, 2013).
While AAC intervention has been successful for many individuals with complex communication needs, existing AAC access methods (aided and unaided, direct and indirect) depend upon some minimal amount of voluntary motor capacity (e.g., single muscle movements, orofacial movements, and/or eye control). This requirement of voluntary motor behavior may prevent individuals with severe motor impairments or paralysis from successfully accessing conventional AAC devices. For example, individuals with profound motor impairments due to neurological disease or disorder, such as amyotrophic lateral sclerosis (ALS) and brainstem stroke, may present with near-total paralysis and akinetic mutism, a condition known as locked-in syndrome (Plum & Posner, 1972). Locked-in syndrome is often described along a continuum of severity, ranging from incomplete (some limb movement control) to classical (vertical eye movements and blinking) to total (no voluntary motor behavior; Bauer, Gerstenbrand, & Rumpl, 1979). Because all AAC access methods require some amount of voluntary motor behavior, current access methods may be inefficient or completely ineffective for individuals with locked-in syndrome, including eye-gaze tracking techniques for those with classical or total locked-in syndrome.
Brain–computer interfaces (BCIs) have the potential to provide an alternate modality for accessing AAC for individuals with profound speech and motor impairments, including paralysis and akinetic mutism. BCIs circumvent the requirement of voluntary motor movement needed for accessing AAC devices by enabling control via brain activity alone. Thus, BCIs may be the only communication option available for individuals who cannot access AAC devices by current commercial methods (Fried-Oken, Mooney, Peters, & Oken, 2013; Kübler, Kotchoubey, Kaiser, Wolpaw, & Birbaumer, 2001).
A majority of BCI approaches involve noninvasive recordings of brain activity using electroencephalography, acquired from the scalp via a wearable electrode cap and converted into commands for controlling communication software (e.g., keyboard typing; Donchin, Spencer, & Wijesinghe, 2000), simulated button presses (Brumberg, Burnison, & Pitt, 2016; Scherer et al., 2015), and mouse cursors (Wolpaw, Birbaumer, McFarland, Pfurtscheller, & Vaughan, 2002). There are many variations in the way BCIs translate brain activity into communication output, which are based on neurological signals related to sensory, motor, and/or cognitive processes. Each BCI method has specific requirements that may either support or obstruct successful operation depending on an individual's unique sensory–cognitive–motor and motor imagery profile. Therefore, effective person-centered, feature matching procedures are necessary to identify the most appropriate BCI method to facilitate individual success, decrease training time, and limit the potential for device abandonment (e.g., Light & McNaughton, 2013).
Feature Matching in AAC and BCI
Given the wide variety of AAC methods and access techniques currently available, it is important that individuals are matched with the AAC access method or methods that emphasize their strengths (Beukelman & Mirenda, 2013). Feature matching is the established standard practice in AAC for choosing the AAC device, layout, access method, and intervention strategy that best suits each individual on the basis of a combination of their current and future profile, level of support, and levels of sensory, motor, cognitive, and literacy skills (Gosnell, Costello, & Shane, 2011; Thistle & Wilkinson, 2015). Comprehensive assessments in these domains allow clinicians to understand clients' unique profiles, which, in turn, inform trial-based testing of multiple AAC devices and communication layouts to establish stakeholder preferences (Beukelman & Mirenda, 2013). As BCI technology progresses toward integration with existing AAC systems and clinical best practices, it is important to ensure that feature matching procedures are available to guide selection of the most appropriate BCI device for accessing AAC, not just the most technologically advanced (Fager et al., 2012; Fried-Oken et al., 2013; Light & McNaughton, 2013). The application of strategic approaches for assessment of BCI methods is important (Ahn & Jun, 2015), particularly for accessing AAC, due to the complexity of BCI methods; the variability in the sensory, cognitive, motor, and motor imagery skills and communication needs of individuals who may access AAC via BCI (Brumberg, Pitt, Mantie-Kozlowski, & Burnison, 2018); and the need to account for variable outcomes during BCI trial evaluation (Peters, Mooney, Oken, & Fried-Oken, 2016).
Cognitive impairments concomitant with neuromotor disorders clearly illustrate the need for feature matching practices in BCI. For instance, approximately 30% of individuals with ALS (a population often targeted for BCI interventions) have cognitive impairments that may include deficits in language, executive function, social cognition, and verbal memory (Beeldman et al., 2016). Cognitive deficits have also been noted for individuals with locked-in syndrome due to cortical damage associated with brainstem stroke (Schnakers et al., 2008), and additional sensorimotor deficits are associated with other populations who may use BCI for accessing AAC, including stroke, Parkinson's disease, Parkinson-Plus syndromes, traumatic brain injury, cerebral palsy, and brain tumors (Fried-Oken et al., 2013). The exact nature of impairments to language and cognition associated with these neurological disorders is likely to vary from person to person, even within a disorder, and to have specific consequences for successful BCI outcomes. Another example that demonstrates the need for feature matching in BCI is that individuals with poor vision may not be ideally matched to visually based BCIs (McCane et al., 2014), which illustrates the importance of formally assessing levels of visual ability to either rule out or modify certain BCI techniques. Similarly, neurological signals associated with motor imagery (i.e., simulation of an action without physical movement), which are necessary for controlling motor-based BCIs, are not observable in 15% to 30% of the population (Blankertz et al., 2010) and must be assessed prior to device selection. These cases highlight that BCIs are not a one-size-fits-all technology and underscore the need to evaluate the sensory, motor, and cognitive profiles of individuals who may use BCI in relation to the requirements of each possible BCI method.
Development of a Clinically Focused Guide to Feature Matching for BCI Access Methods
Nine variations of noninvasive BCIs for accessing AAC are reviewed in the following sections, each of which includes a summary of feature matching guidelines and considerations associated with each device type. We focus primarily on auditory, visual, and motor (imagery)-based BCI methods, including the visual P300 speller (e.g., Donchin et al., 2000); auditory P300 speller (e.g., Kübler et al., 2009); modified P300 grid displays/rapid serial visual presentation speller (e.g., Oken et al., 2014); steady state visually evoked potential speller (e.g., Sutter, 1992); motor imagery (e.g., Blankertz et al., 2006; Nijboer et al., 2008); and auditory steady state response (e.g., Lopez, Pomares, Pelayo, Urquiza, & Perez, 2009). The proposed feature matching guidelines are based on existing AAC best practices and attempt to balance the tradeoffs between requirements for each BCI type, initial assessment of BCI performance, and other AAC assessment considerations, including physical barriers, language, literacy, sensory and cognitive function, and motor ability. The clinical criteria important for assessing BCI access methods for AAC were identified by a multidisciplinary team with combined experience in BCI, AAC, neuromotor disorders, and cognitive assessment. The team included three speech-language pathologists (SLPs), two neuroscientists, one BCI engineer, and one individual with ALS who is a regular participant in BCI studies. The major areas identified focused on the following:
Sensory: visual acuity and hearing sensitivity.
Medical considerations: for example, history of seizures (important for some sensory BCI techniques), use of medications.
Motor: oculomotor (eye) movement and absence of involuntary motor movements.
Motor imagery: ability to perform first-person motor imagery, presence of neurological activity related to movement imagery.
Cognition: attention, memory/working memory, cognitive and motor learning performance factors (e.g., task switching, self-monitoring, and abstract reasoning).
Literacy: reading and spelling.
General considerations: physical barriers, age, device positioning, and training time.
In the following sections, we first briefly review the technical fundamentals and skills needed for successful operation of a range of BCI access methods for AAC. We then present a proposed template checklist for BCI and AAC professionals to refer to when determining the combination of devices and access techniques that best meet the needs of individuals who may use BCI. The template checklist (Figure 1) emphasizes domains most likely to discriminate between BCI access methods and, along with consideration of extrinsic factors, may be used to help guide professional decision making. Examples for implementing these procedures are then outlined in three hypothetical cases. A more thorough review of BCI techniques used to access AAC can be found in Brumberg, Pitt, et al. (2018).
Figure 1.
Feature matching assessment template with domains, possible BCI options, and pertinent considerations. Items included within the template are those most likely to discriminate between BCI methods. Items matching an individual's profile are to be checked in the blue column, with the total feature matches for each device to be indicated in the yellow row. Flexibility when interpreting clinical levels of severity is required on a case-by-case basis. BCI = brain–computer interface; RSVP = rapid serial visual presentation; SSVEP = steady state visually evoked potential; ASSR = auditory steady state response; Lit = literacy; MI = motor imagery; AAC = augmentative and alternative communication.
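To illustrate how the checklist tally might be computed, the sketch below (in Python) sums an individual's checked features against the features each BCI option supports, mirroring the yellow total row of the template. The device and feature names are hypothetical placeholders rather than the actual template items, and the tally is only one input to the clinical decision alongside the pertinent considerations discussed above.

```python
# Minimal sketch of the checklist tally in Figure 1 (hypothetical feature/device names).
# Each BCI option lists the profile features it supports; the tally counts how many of
# an individual's checked features each option matches, mirroring the yellow "total" row.

DEVICE_FEATURES = {
    "visual P300 grid": {"visual acuity", "oculomotor control", "selective attention", "literacy"},
    "auditory P300 speller": {"hearing", "selective attention", "working memory", "literacy"},
    "SSVEP": {"visual acuity", "selective attention", "no seizure history"},
    "motor imagery (auditory feedback)": {"hearing", "motor imagery", "working memory"},
}

def tally_matches(checked_features):
    """Return each device with its count of matched features, best match first."""
    totals = {
        device: len(features & checked_features)
        for device, features in DEVICE_FEATURES.items()
    }
    return sorted(totals.items(), key=lambda item: item[1], reverse=True)

if __name__ == "__main__":
    # Example profile: intact hearing and motor imagery, impaired vision (cf. Case 1).
    profile = {"hearing", "motor imagery", "working memory", "selective attention", "literacy"}
    for device, count in tally_matches(profile):
        print(f"{device}: {count} feature matches")
```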
P300 Spellers
Visual P300 Grid Speller
The visual P300 grid speller uses the P300 event-related potential in a visual oddball paradigm (Donchin et al., 2000; Farwell & Donchin, 1988) to access typing interfaces. Individuals communicate using the P300 grid speller by focusing their attention on communication items arranged in a grid on a computer screen, often 6 × 6 with letters and numbers (Sellers, Krusienski, McFarland, Vaughan, & Wolpaw, 2006). In a visual oddball paradigm, individuals must maintain attention on a single item (known as the rare or oddball stimulus) as each row and column within the grid is randomly flashed (the frequent stimulus). A positive voltage event can be detected in electroencephalography scalp recordings approximately 300 ms after the attended item (the oddball stimulus) is flashed when compared with the electroencephalography recordings for frequent stimuli. The BCI detection algorithm then associates each P300 event with the flashed row or column to determine the attended item, which is then selected.
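As a rough illustration of this detection step, the sketch below assumes already-epoched, baseline-corrected electroencephalography segments (one per row/column flash) stored in a NumPy array and simply averages the amplitude in a window around 300 ms for each flashed row and column. Deployed P300 spellers typically use trained classifiers (e.g., stepwise linear discriminant analysis) rather than this simple averaging, so this is a conceptual sketch only.

```python
import numpy as np

# Hypothetical sketch of visual P300 grid decoding (not a production classifier).
# `epochs` holds one baseline-corrected EEG trace per flash, time-locked to flash onset;
# `flash_codes` gives which row (0-5) or column (6-11) flashed on each trial (6 x 6 grid).
def decode_p300_selection(epochs, flash_codes, fs=256, window=(0.25, 0.40)):
    """Average epochs per row/column code and return the (row, column) pair with the
    largest mean amplitude in the P300 window (~300 ms after the flash)."""
    start, stop = int(window[0] * fs), int(window[1] * fs)
    scores = np.zeros(12)
    for code in range(12):
        code_epochs = epochs[flash_codes == code]
        scores[code] = code_epochs[:, start:stop].mean()  # mean amplitude in the window
    row = int(np.argmax(scores[:6]))    # best-scoring row
    col = int(np.argmax(scores[6:]))    # best-scoring column
    return row, col                     # their intersection identifies the attended item

# Synthetic example: 120 flashes (10 per row/column code) of 0.5-s epochs at 256 Hz.
rng = np.random.default_rng(0)
epochs = rng.normal(size=(120, 128))
flash_codes = np.repeat(np.arange(12), 10)
print(decode_p300_selection(epochs, flash_codes))
```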
P300 Rapid Serial Visual Presentation Speller
A variant of the visual P300 speller uses a rapid serial visual presentation of icons (Acqualagna & Blankertz, 2013; Oken et al., 2014), in which items formerly organized in a grid format are presented individually and sequentially at a single location on the screen. To make a selection in this format, individuals operating the BCI must keep their target communication item in working memory as all available elements are randomly and serially presented. In some implementations, the stimulus sequence is also optimized based on language modeling to improve accuracy and speed (Oken et al., 2014). As with the grid format, active attention to the intended (rare) item among the ignored (frequent) items will elicit a P300 event-related potential that is used for selection.
Auditory P300
BCIs using the auditory-evoked P300 event-related potential allow individuals to select communication items through an auditory oddball paradigm and do not rely on visual perception. Like its visual counterpart, the auditory P300 BCI requires individuals to attend to a rare, target auditory stimulus (e.g., tone) while ignoring frequent nontarget stimuli (Halder et al., 2010), and the elicited P300 event-related potential is detected by the BCI algorithm for item selection. Using the auditory P300 BCI, it is possible to represent a grid layout by relating each row and column to distinct auditory stimuli (Käthner et al., 2013; Kübler et al., 2009), similar to existing auditory scanning AAC devices. In such a paradigm, communication items are selected by attending to the audio stimulus corresponding to the target row and column while ignoring all other audio. In this paradigm, individuals are required to encode the mapping between auditory stimuli and grid layout to memory. An alternative auditory P300 paradigm uses binary selections that are driven by detecting P300 event-related potentials while selectively attending to one of two sound streams containing linguistic information (e.g., “yes” and “no” presented dichotically; Hill et al., 2014).
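As a small, purely hypothetical illustration of the stimulus-to-grid mapping an individual must hold in memory, the sketch below associates spoken row and column cues with positions in a 3 × 3 symbol grid; the cue words and vocabulary are placeholders, not stimuli from the cited studies. In practice, the attended cues would be inferred from detected P300 events rather than supplied directly.

```python
# Hypothetical mapping between auditory stimuli and a 3 x 3 grid layout: the listener
# memorizes which spoken cue corresponds to each row and column, attends to the cue
# for the target row and then the target column, and the detected P300 events for
# those cues identify the selection.

ROW_CUES = {"one": 0, "two": 1, "three": 2}        # spoken row cues
COL_CUES = {"alpha": 0, "bravo": 1, "charlie": 2}  # spoken column cues
GRID = [["yes", "no", "help"],
        ["more", "stop", "rest"],
        ["eat", "drink", "pain"]]

def select_item(attended_row_cue, attended_col_cue):
    """Return the grid item addressed by the attended row and column cues."""
    return GRID[ROW_CUES[attended_row_cue]][COL_CUES[attended_col_cue]]

print(select_item("two", "charlie"))  # -> "rest"
```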
Competency Skills Necessary for Visual and Auditory P300 Devices
Sensory. Individuals with limited vision and/or other visual deficits, including reduced visual field due to ptosis or cataracts, may require approaches using auditory stimulation or motor (imagery) paradigms (below). For those using auditory BCIs who also have hearing loss, it may be possible to adapt tone-based stimuli to include only frequencies within their audible range or to use visual or motor imagery paradigms.
Medical considerations. Visual P300 displays carry a risk of seizure due to the rapid flashing of items (Ikegami, Takano, Saeki, & Kansaku, 2011), though this risk may be lower than that associated with steady state visually evoked potential BCIs (below) because the flickering stimuli can vary in location across the screen. If assessment reveals a history of seizures, then auditory and motor imagery approaches may be more appropriate than both visual P300 grid and rapid serial visual presentation spellers. Although the direct impacts of pharmaceuticals on P300 BCI performance for individuals with neuromotor disorders require further study, some antidepressant, anticholinergic, and antiepileptic drugs may negatively impact cognitive performance (Meador, 1998), and sedatives, such as alcohol (Polich & Criado, 2006) and benzodiazepines, may decrease P300 amplitudes (Hayakawa et al., 1999), both of which can reduce BCI performance. In addition, pain medications, such as opioids, may cause mental clouding and confusion; however, because pain itself may negatively impact cognitive performance, long-term chronic pain relief may comparatively improve cognitive function (Jamison et al., 2003).
Motor. The visual P300 grid speller is more successful for individuals with a full range of oculomotor function that enables overt attention to communication icons using eye gaze than for those only able to use covert, peripheral attention (Brunner et al., 2010). Individuals with locked-in syndrome and others with profound neuromotor impairments often have severe deficits in visual perception due to a lack of oculomotor control (Fried-Oken et al., 2013) that may limit success with the grid layout. In these cases, graphical display adaptations may be needed to elicit the visual P300 by arranging items in the active visual field or by using the rapid serial visual presentation paradigm. However, auditory or motor imagery approaches may be required if suitable adaptations to the visual P300 interface are not possible or insufficient.
Motor imagery. Motor imagery skills are not necessary to elicit the P300 through the grid, auditory, or rapid serial visual presentation paradigms.
Cognition. All types of P300 BCIs depend on attention and working memory to focus on a single target while ignoring all other stimuli. Both visual and auditory P300 BCIs are expected to have some similarities in cognitive requirements; however, research on their differences is limited aside from the use of their respective sensory modality (e.g., auditory vs. visual attention). That said, there is some evidence that auditory P300 spelling interfaces may require greater attention and short-term memory capacity (Klobassa et al., 2009; Kübler et al., 2009) compared with visual interfaces, due to an increased demand for navigating auditory grid systems and memorizing the mapping of all items to their auditory representations.
Attention. An individual's ability to rapidly update their selective attention to a new target stimulus while ignoring other irrelevant information is important for successful visual P300 BCI outcomes in individuals with neuromotor disorders (Geronimo, Simmons, & Schiff, 2016; Riccio et al., 2013). In addition, motivation may lead to increases in attention (Engelmann, Damaraju, Padmala, & Pessoa, 2009), which, in turn, leads to greater P300 BCI performance (Nijboer, Birbaumer, & Kübler, 2010).
Working memory. Aided AAC devices commonly impose substantial working memory demands (e.g., to locate the target icon and ignore distracting stimuli; Thistle & Wilkinson, 2013), and this is no different for BCI techniques. In addition, prior studies have shown that working memory skills (assessed via the List Sorting Working Memory Task; Gershon et al., 2013) are positively correlated with P300 speller performance in healthy adults (Sprague, McBee, & Sellers, 2016).
Cognitive and motor learning performance factors. P300 BCI control does not require motor learning, though some executive functions, such as self-monitoring and problem solving, may still play a role in P300 BCI success as all BCI applications require some form of skill learning. Further study is needed to more fully explore these executive function factors.
Literacy. P300 systems largely provide access to spelling-based interfaces. However, both the visual grid and rapid serial visual presentation paradigms may be easily adapted for symbol-based communication (e.g., Brumberg, Pitt, et al., 2018). Further, word prediction software can be used to aid spelling, though word prediction can increase cognitive load (Koester & Levine, 1996). For auditory P300 BCI systems, separate streams of spoken words presented to the right and left ears have been used for binary selection (Hill et al., 2014), an approach that can support individuals with even minimal literacy skills. However, if the BCI assessment reveals strengths in literacy, executive function, and short-term memory, it may be beneficial to consider auditory BCIs with greater numbers of possible items for selection, such as a spelling device.
Additional considerations. Incorrect positioning and poor physical support can affect levels of fatigue, comfort, emotional state, and ability to attend to a given task. Therefore, AAC best practices for positioning (e.g., ensuring a stable base of support, limiting the influence of atypical muscle tone, extraneous movements, and reflexes, and providing support for rest; Beukelman & Mirenda, 2013) should also be applied to BCI to facilitate successful outcomes. In addition, the peak P300 amplitude observed from posterior electroencephalography electrodes is positively correlated with visual P300 BCI performance when compared with amplitudes recorded from fronto-central electrodes for individuals with ALS (Sugata et al., 2016). Therefore, positioning factors that affect posterior electrode recordings, such as uncontrolled neck movements or spasticity (e.g., Daly et al., 2013; Sutter, 1992) or physical pressure from a wheelchair head support (Daly et al., 2013), are a heightened consideration for BCI modalities that specifically rely on recordings from electrodes at these locations (e.g., P300 and steady state visually evoked potential). Last, visual P300 BCI performance has been shown to increase with age for individuals with ALS (Geronimo et al., 2016; Silvoni et al., 2009), though age effects are unknown for auditory P300 BCIs, and the underlying rationale for these findings (e.g., whether this is due to cognitive, motivational, or other related factors) is not yet fully understood.
Steady State Responses
Steady State Visually Evoked Potential
The steady state visually evoked potential BCI paradigm uses steady state electroencephalography rhythms, which are physiological responses to a driving input stimulus (Regan, 1989), such as a strobe, for selecting items from an AAC device. Presenting the driving visual stimulus at a specific flicker frequency (e.g., 5–30 Hz) with eyes open will generate electroencephalography signals in posterior electrodes with oscillations at the same frequency as the stimulus and its harmonics (Regan, 1989). When multiple flickering stimuli are presented simultaneously, all frequencies are observable in the electroencephalography recording, but the one to which the individual is attending will have the greatest amplitude (Müller-Putz, Scherer, Brauneis, & Pfurtscheller, 2005) and temporal correlation to the stimulus (Lin, Zhang, Wu, & Gao, 2007). In a BCI context, a four-choice, steady state visually evoked potential display may have four items each flickering at a different rate (e.g., 12, 13, 14, and 15 Hz). Attending to the item flickering at 14 Hz will elicit electroencephalography signals from occipital electrodes with heightened amplitude at 14 and 28 Hz (first harmonic), and the greatest temporal correlation to the 14-Hz strobe in comparison to all other competitors (Lin et al., 2007; Müller-Putz et al., 2005). The amplified steady state visually evoked potential response would then be used to select the 14-Hz item.
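The following sketch illustrates, under simplifying assumptions, how the attended target might be identified from a single occipital channel: power at each candidate flicker frequency and its first harmonic is compared, and the largest wins. Practical steady state visually evoked potential spellers often use canonical correlation analysis over multiple channels instead, and the signal here is synthetic.

```python
import numpy as np

# Hypothetical SSVEP target identification sketch: pick the candidate flicker frequency
# whose power (fundamental plus first harmonic) is largest in the recorded signal.
def identify_ssvep_target(signal, fs, candidate_freqs):
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)

    def band_power(f, halfwidth=0.25):
        mask = (freqs >= f - halfwidth) & (freqs <= f + halfwidth)
        return spectrum[mask].sum()

    scores = [band_power(f) + band_power(2 * f) for f in candidate_freqs]
    return candidate_freqs[int(np.argmax(scores))]

# Synthetic example: 4 s of noisy data with an attended 14-Hz flicker (and its harmonic).
fs = 250
t = np.arange(0, 4, 1 / fs)
signal = (np.sin(2 * np.pi * 14 * t) + 0.3 * np.sin(2 * np.pi * 28 * t)
          + np.random.default_rng(1).normal(0, 1, t.size))
print(identify_ssvep_target(signal, fs, [12, 13, 14, 15]))  # expected: 14
```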
Auditory Steady State Response
The auditory steady state response is a steady state electroencephalography rhythm generated by an auditory driving stimulus, for instance, a carrier tone (e.g., 1000 Hz) that is amplitude modulated at a rate observable by electroencephalography (e.g., 30–40 Hz; Picton, John, Dimitrijevic, & Purcell, 2003). Simultaneous presentation of two streams, one to each ear, with distinct amplitude modulation rates will generate neural responses at both rates (Lopez et al., 2009). Attention to a particular stream will amplify the neural response at the specific modulation frequency, which can be used to identify the attended stream (similar to the attended strobe frequency and steady state visually evoked potential). In this way, each stream can be associated with a communicative action (e.g., binary selection) to control an AAC device. The auditory steady state response is a relatively new technique for BCI, and its application in populations with neuromotor impairments is not known, though under active investigation (see Brumberg, Pitt, et al., 2018, for a review).
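Because the same frequency-tagging logic applies, a binary auditory steady state response selection can be sketched analogously: whichever stream's modulation rate shows the larger response is taken as the attended (selected) option. The modulation rates and yes/no mapping below are hypothetical, and the signal is synthetic.

```python
import numpy as np

# Hypothetical binary ASSR selection sketch: two streams are amplitude modulated at
# different rates (here 37 Hz left, 43 Hz right); the attended stream's modulation
# rate shows the larger response, which is mapped to a communicative choice.
def assr_binary_choice(signal, fs, rate_left=37.0, rate_right=43.0):
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
    power = lambda f: spectrum[np.argmin(np.abs(freqs - f))]
    return "yes (left stream)" if power(rate_left) > power(rate_right) else "no (right stream)"

# Synthetic example: the left-stream modulation rate dominates the recording.
fs = 500
t = np.arange(0, 4, 1 / fs)
eeg = np.sin(2 * np.pi * 37 * t) + 0.2 * np.sin(2 * np.pi * 43 * t)
print(assr_binary_choice(eeg, fs))  # -> "yes (left stream)"
```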
Competency Skills for Steady State Visually Evoked Potential and Auditory Steady State Response Devices
Sensory. Steady state visually evoked potential and auditory steady state response primarily require acuity in their respective modalities, vision and hearing. Individuals considered for steady state BCIs, but with visual impairment, may be better matched to auditory (auditory steady state response) versus visual (steady state visually evoked potential) stimulation and vice versa.
Medical considerations. Due to the nature of the steady state visually evoked potential flickering stimulus, it is possible to trigger a seizure event (Volosyak, Valbuena, Lüth, Malechka, & Gräser, 2011) in individuals with epilepsy or other seizure disorders; however, there is no evidence of increased seizure risk for auditory steady state response devices (Higashi, Rutkowski, Washizawa, Cichocki, & Tanaka, 2011). There is little information available about the influence of pharmaceuticals on the steady state visually evoked potential and auditory steady state response performance. However, it is reasonable to hypothesize that these techniques may be impacted by medications in a similar manner to P300 techniques.
Motor. There is some evidence that overt attention with intact oculomotor control leads to the highest steady state visually evoked potential-based BCI performance, though both overt and covert visual attention can be used (Brumberg, Nguyen, Pitt, & Lorenz, 2018; Kelly, Lalor, Finucane, McDarby, & Reilly, 2005; Zhang et al., 2010). For instance, steady state visually evoked potential stimuli can be presented in an overlapping fashion (Allison et al., 2008) for individuals with severely limited or absent oculomotor control and poor peripheral vision. Steady state visually evoked potential interfaces can also be adapted to suit the strengths of individuals with moderate deficits to oculomotor control and/or selective attention processes (e.g., fewer icons, optimal placement in the visual field; Brumberg, Nguyen, et al., 2018).
Motor imagery. Motor imagery skills are not needed to access these devices.
Cognition.
Attention. Successful operation of steady state BCIs requires excellent selective visual or auditory attention. Individuals must be able to focus on a single item among many other competing stimuli; however, unlike P300 approaches, active decisions about whether a novel target is presented are not required. As a result, steady state visually evoked potential spellers have been associated with lower mental workload (vs. the P300 grid speller) for individuals with locked-in syndrome (Combaz et al., 2013).
Memory/working memory. The working memory demands of steady state visually evoked potential and auditory steady state response paradigms have not been fully explored. However, one possible explanation for reports of lower mental workload with steady state visually evoked potential BCIs is a decreased demand on working memory.
Cognitive and motor learning performance factors. Motor learning is not necessary to achieve device control; however, executive function skills, such as self-monitoring and problem solving, may still play a role in BCI success.
Literacy. Steady state visually evoked potential-based devices are typically configured for spelling but may be adapted for symbol-based communication (e.g., Brumberg, Pitt, et al., 2018).
Additional considerations. Steady state visually evoked potential signals are best recorded from the posterior scalp; therefore, positioning should follow AAC best practices with additional emphasis on minimizing physical obstructions to signal acquisition from these electrode locations. Auditory responses are often recorded from central scalp locations near the top of the head and may be less susceptible to physical contact artifacts. Auditory steady state response-based devices are currently limited to binary choices; therefore, BCI assessment scores on executive function tests should be considered when deciding between binary-choice auditory steady state response and other multichoice BCIs to best match the strengths of each individual.
Motor (Imagery)-Based
Motor-based BCI techniques use the neural activity resulting from imagined movements to control communication devices (e.g., Blankertz et al., 2006; Miner, McFarland, & Wolpaw, 1998; Neuper, Müller, Kübler, Birbaumer, & Pfurtscheller, 2003). Specifically, motor-based BCIs detect changes in the sensorimotor rhythm, an electroencephalography signal that occurs in the mu (8–12 Hz) and beta (15–25 Hz) frequency bands and is recorded from electrodes at central scalp locations (over the sensorimotor areas of the brain). During motor imagery (and actual movements), the sensorimotor rhythm decreases in power when compared with rest (Brumberg, Pitt, et al., 2018). Sensorimotor modulations (increases and decreases in band power) can be used in both continuous (Wolpaw et al., 2002) and discrete paradigms (Pfurtscheller & Neuper, 2001) for accessing communication software using either (a) continuous control of a selection cursor (e.g., Blankertz et al., 2006; Miner et al., 1998; Wolpaw et al., 2002) or (b) discrete selection of interface items (e.g., switch-type access; Neuper et al., 2003; Pfurtscheller & Neuper, 2001).
For example, one continuous motor BCI paradigm moves a computer cursor in proportion to changes in sensorimotor rhythm amplitude toward screen locations with communication icons (e.g., the words yes and no or letters placed along borders of the screen; Miner et al., 1998; Vaughan et al., 2006). The Berlin BCI (another continuous variety) uses the “Hex-o-spell” communication system in which imagined limb movements rotate an arrow (again in proportion to sensorimotor rhythm amplitude changes) within a hexagonal display to select groups of letters arranged around its border (Blankertz et al., 2006). Discrete motor BCIs depend on an external stimulus (e.g., visual highlighting) to cue motor imagery for selection of communication items. One example, the Graz BCI (Neuper et al., 2003; Obermaier, Müller, & Pfurtscheller, 2003), uses a binary tree approach that places communication items (e.g., letters) into two groups on either side of the screen, and individuals using the BCI are prompted to perform movement imagery to select one group or the other (e.g., a left or right hand movement). The resulting event-related desynchronization (i.e., a decrease in the sensorimotor rhythm aligned to an external cue) is detected by the BCI to directly select the desired group of items, which are then split into two new groups, and the process is repeated.
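A minimal sketch of the discrete (switch-type) case is shown below, assuming a single channel over sensorimotor cortex and a hypothetical threshold: mu-band (8–12 Hz) power during a cued imagery window is compared with power during rest, and a sufficient drop (event-related desynchronization) is treated as a switch activation. Real systems learn subject-specific frequency bands, electrode sets, and classifiers, so this is a conceptual illustration only.

```python
import numpy as np

# Hypothetical sensorimotor-rhythm switch sketch: motor imagery suppresses mu-band
# (8-12 Hz) power relative to rest, and that drop is treated as a switch press.
def band_power(signal, fs, low=8.0, high=12.0):
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
    return spectrum[(freqs >= low) & (freqs <= high)].sum()

def is_switch_activated(rest_segment, imagery_segment, fs, threshold=0.7):
    """Return True if mu power during imagery falls below `threshold` x resting power."""
    return band_power(imagery_segment, fs) < threshold * band_power(rest_segment, fs)

# Synthetic example: a strong 10-Hz rhythm at rest that is attenuated during imagery.
fs = 250
t = np.arange(0, 2, 1 / fs)
rng = np.random.default_rng(2)
rest = np.sin(2 * np.pi * 10 * t) + 0.2 * rng.normal(size=t.size)
imagery = 0.3 * np.sin(2 * np.pi * 10 * t) + 0.2 * rng.normal(size=t.size)
print(is_switch_activated(rest, imagery, fs))  # expected: True
```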
Competency Skills for Motor (Imagery) BCI Devices
Sensory. Sensory abilities are only needed to interact with communication interfaces (e.g., locate items on a visual display, utilize auditory feedback), in contrast to sensory-based BCIs (e.g., P300) that depend on some form of sensory stimulation to elicit the target signal (e.g., identification of an oddball target). Instead, the neurological activity required for motor BCI control is independently generated via imagined movements. This sensory independence enables adaptation of the communication interface to the sensory modality that best matches the strengths of each individual.
Medical considerations. Flashing stimuli are not used with motor-based BCIs, which supports device selection for individuals with a history of seizures. Motor BCIs do have a cognitive requirement, and though research is limited, medications that affect cognition may impair BCI performance. Some evidence suggests that benzodiazepines may negatively impact the sensorimotor rhythm (Silva et al., 2011); however, results are conflicting and may be dependent upon task type (e.g., these medications may aid performance of predictable motor tasks, such as typing, but impair performance on reaction time tasks; Cunha et al., 2008). Likewise, a single dose of morphine may impair motor behavior, whereas long-term use for managing chronic pain may lead to improved psychomotor performance due to decreased pain (Jamison et al., 2003).
Motor. Motor BCIs do not require any overt movement capabilities, and an individual's level of functional motor control is not predictive of motor imagery BCI performance (e.g., Geronimo et al., 2016).
Motor imagery. Motor BCIs require an ability to control the sensorimotor rhythm using imagined movements. However, the sensorimotor rhythm is not observable in 15%–30% of the general population (Vidaurre & Blankertz, 2010). When present, sensorimotor rhythm amplitude over the right and left motor areas recorded at rest with eyes open is positively correlated with initial BCI performance for neurotypical participants (Blankertz et al., 2010) and individuals with ALS (Geronimo et al., 2016). Previous studies have also shown that performance of first-person kinesthetic imagery tasks, during which individuals imagine “feeling” the sensations associated with the action, leads to greater motor BCI performance compared with third-person visual imagery, in which individuals imagine watching the action from across the room (Neuper, Scherer, Reiner, & Pfurtscheller, 2005). Using the Kinesthetic and Visual Imagery Questionnaire's 5-point kinesthetic subscale (1 = no sensation, to 5 = as intense as executing the action; Malouin et al., 2007), further evidence has shown that self-reports of motor imagery skill are reliable and predictive of future motor imagery BCI performance in individuals without neuromotor disorders (Vuckovic & Osuagwu, 2013).
Cognition.
Attention. While attention to task performance and online feedback is important for motor BCI control (Hammer, Kaufmann, Kleih, Blankertz, & Kübler, 2014; Halder et al., 2011), individuals with poor selective attention (a skill typically required for sensory-elicited BCIs) may benefit from the absence of sensory stimulation in sensorimotor rhythm modulation. As a result, motor BCIs can be individually customized according to AAC best practices.
Memory/working memory. During first-person motor imagery, one must mentally recreate an action, recalling the sensations associated with physical task performance. Individuals must then maintain the imagined movement throughout the BCI control period, while simultaneously monitoring task performance and attending to auditory and/or visual device feedback (e.g., Brumberg, Pitt, & Burnison, 2018). Therefore, working memory has been highlighted as an important factor in motor imagery BCI success in part by facilitating mental rehearsal of a motor movement sequence from memory (Halder et al., 2011).
Cognitive and motor learning performance factors. Learning to perform tasks via motor imagery has been likened to physical motor learning processes (Wander et al., 2013; Wolpaw et al., 2002). The early stages of motor learning have been specifically linked to cognitive functions that support visuospatial skills (Marinelli, Quartarone, Hallett, Frazzitta, & Ghilardi, 2017), which play an important role in visuomotor adaption during motor learning (Seidler, Bo, & Anguera, 2012). The later stages of motor learning are associated with refining and automatizing learned actions through error detection and correction processes (Sigrist, Rauter, Riener, & Wolf, 2013). Therefore, motor BCI performance may be influenced by (a) one's ability to self-regulate the appropriate allocation of cognitive resources (Kleih & Kübler, 2015), such as visuomotor coordination (Hammer et al., 2014), visuospatial skills (Jeunet, N'Kaoua, Subramanian, Hachet, & Lotte, 2015) and monitoring task performance (Halder et al., 2011), and (b) abstract reasoning needed to reflect upon imagery performance (Jeunet et al., 2015) and to learn new imagery tasks without traditional forms of multimodal sensory feedback (Wander et al., 2013), executive function for switching between different imagined movements (Geronimo et al., 2016), and motivation (e.g., challenge, confidence mastery, and fear of incompetence; Nijboer et al., 2010).
Literacy. Like most AAC devices, motor BCIs can be configured for symbol communication and letter spelling. Because a separate external device is not needed to deliver stimuli necessary for eliciting the BCI neural control signal, it is possible to use existing communication devices and interfaces (Brumberg et al., 2016; Scherer et al., 2015).
Additional considerations. Motor BCIs rely on recordings from electrodes adjacent to the sensorimotor areas (middle and top of scalp) and are less susceptible to poor signal quality due to physical contact from head positioning (cf. sensory BCIs). The training time for successful use of motor BCIs is typically much longer than for sensory-style BCIs, such as the P300 (Geronimo et al., 2016; Mak & Wolpaw, 2009), because new imagined motor skills must be learned for device operation. In addition, age has been inversely correlated with motor BCI performance (Geronimo et al., 2016), but more research is needed to tease apart the effects of age from those of declining motor cortical function in individuals with neuromotor disorders (and ALS specifically).
Case Study Application
We describe the application of our proposed feature matching guidelines to three hypothetical cases, each involving a unique cognitive–sensory–motor profile for individuals who may need BCI to access AAC. With each case, we provide a description of the major feature matching considerations that discriminate BCI access methods, along with a checklist to record assessment outcomes and aid the selection of the most appropriate BCI device. A blank template of the checklist is shown in Figure 1, and filled versions are shown in Figures 2–4. In many cases, more than one BCI option can be matched for possible selection subject to trial-based assessments and individual preferences. In addition, it is possible that the assessment reveals that a non-BCI approach is an appropriate match to an individual's strengths and preferences (e.g., eye gaze); these techniques are not included in our analysis, but clinicians should keep these and other access techniques in mind.
Figure 2.
Feature matching template filled from assessments in Case 1. All nonselected items/rows have been deleted for clarity. The yellow row is the sum of features (rows) supported by each individual for using a BCI (column). The BCIs with the greatest number of matches were multiclass (speller) and binary (left/right stream) auditory P300, motor imagery with auditory feedback, and auditory steady state response (eight feature matches). However, after evaluation of pertinent assessment considerations (outlined in Table 1), multiclass auditory P300 and motor imagery devices were selected (highlighted in orange). BCI = brain–computer interface; RSVP = rapid serial visual presentation; SSVEP = steady state visually evoked potential; ASSR = auditory steady state response; S = sensory; MC = medical considerations; M = motor; MI = motor imagery; Cog = cognition; Lit = literacy.
Figure 3.
Feature matching template filled from assessment results in Case 2. All nonselected items/rows have been deleted for clarity. The yellow row is the sum of features (rows) supported by each individual for using a BCI (column). The BCIs with the greatest number of matches included the visual modified P300 grid, rapid serial visual presentation, and steady state visually evoked potential spellers (seven feature matches) and are highlighted in orange. BCI = brain–computer interface; RSVP = rapid serial visual presentation; SSVEP = steady state visually evoked potential; ASSR = auditory steady state response; S = sensory; MC = medical considerations; M = motor; MI = motor imagery; Cog = cognition; Lit = literacy.
Figure 4.
Feature matching template filled from assessments in Case 3. All nonselected items/rows have been deleted for clarity. The yellow row is the sum of features (rows) supported by each individual for using a BCI (column). The BCI with the greatest number of matches was motor imagery, utilizing visual and/or auditory feedback (seven feature matches). Specific adaptations include symbol-based communication page sets and BCI switch-type access to commercial augmentative and alternative communication devices. The final device choices are highlighted in orange. BCI = brain–computer interface; RSVP = rapid serial visual presentation; SSVEP = steady state visual evoked potential; ASSR = auditory steady state response; S = sensory; MC = medical considerations; M = motor; MI = motor imagery; Cog = cognition; Lit = literacy.
Example Case 1
Mr. Dixon is a 65-year-old man who sought an evaluation for AAC following progressive paralysis associated with his ALS diagnosis. An evaluation found that he had severe difficulty with visual acuity tasks, unimpaired hearing, limited oculomotor control, high self-ratings performing first-person motor imagery (an average score of 4/5 on the 5-point Kinesthetic and Visual Imagery Questionnaire's kinesthetic subscale), and no history of seizures. Electroencephalography analysis revealed a present sensorimotor rhythm. He also performed well on tasks designed to assess working memory, selective attention, abstract reasoning, self-monitoring, and literacy (reading and spelling). The key assessment criteria are listed in Table 1, and the feature matching checklist is shown in Figure 2. Following assessment, device trials are recommended for motor imagery with auditory feedback and auditory P300 spelling devices.
Table 1.
Case 1 key assessment considerations for brain–computer interface (BCI) feature matching.
Domain | Pertinent assessment findings | Feature matching consideration |
---|---|---|
Sensory | Severe visual impairments, normal hearing | Visual sensory BCIs are not appropriate due to visual impairment; interfaces with auditory feedback are preferred. |
Medical | No history of seizures | As visual sensory BCIs are not appropriate, seizure activity does not need to be considered in this case. |
Motor | Limited oculomotor control | Oculomotor impairment in addition to visual impairment further increases the suitability of auditory feedback BCIs. |
Motor imagery | High self-ratings for first-person motor imagery (4/5), present sensorimotor rhythm | Strengths in motor imagery afford selection of an auditory feedback motor imagery BCI. |
Cognition | Strengths in working memory and selective attention, skills in cognitive–motor learning tasks | Strengths in cognitive motor learning factors support the use of a motor imagery BCI. Strengths in selective attention and normal hearing support selection of an auditory P300 device. Additional testing needed to determine whether auditory P300 performance exceeds motor imagery approaches (cf. Geronimo et al., 2016). |
Literacy | Strengths in literacy | Strengths in literacy support selection of a spelling-based BCI display. |
Considerations | Strengths in executive function support multiclass (> two-choice) BCI approaches: P300 speller, motor imagery. |
Example Case 2
Following a brainstem stroke, Mrs. Holden (a 70-year-old woman) received a diagnosis of locked-in syndrome. An AAC evaluation revealed strengths in visual acuity, literacy, and selective attention/working memory skills. However, she also had weaknesses in a number of domains, with moderate difficulty completing cognitive–motor learning tasks (e.g., task switching, problem solving), low self-ratings on first-person motor imagery (an average score of 1/5 on the Kinesthetic and Visual Imagery Questionnaire's kinesthetic subscale), and an absent sensorimotor rhythm. In addition, a limited range of eye (oculomotor) movement was observed. She does not have a history of seizure activity, and posterior electroencephalography electrode recordings were largely unimpeded by her wheelchair headrest. The key assessment criteria are listed in Table 2, and the completed feature matching checklist is in Figure 3. Following assessment, device trials are suggested for modified visual P300 grid/rapid serial visual presentation devices and steady state visually evoked potential BCIs, with icon arrangements adapted to match individual strengths.
Table 2.
Case 2 key assessment considerations for brain–computer interface (BCI) feature matching.
Domain | Pertinent assessment findings | Feature matching consideration |
---|---|---|
Sensory | No noticeable visual acuity impairment | Intact visual acuity supports BCIs with visual feedback. |
Medical | No history of seizures | Lack of seizure history supports steady state visually evoked potential techniques. |
Motor | Limited oculomotor control | Oculomotor limitations contraindicate grid-based visual P300 and steady state visually evoked potential devices unless item locations can be placed in the unimpaired visual field. Otherwise, rapid serial visual presentation P300 devices are preferred. |
Motor imagery | Low self-ratings for first-person motor imagery (1/5), sensorimotor rhythm not present | Motor imagery BCIs are not appropriate due to lack of observable sensorimotor rhythm and weaknesses in first-person motor imagery. |
Cognition | Deficits in cognitive–motor learning and performance tasks, strengths in selective attention and working memory | In addition to weaknesses in motor imagery, difficulties with completing cognitive–motor learning tasks make use of motor imagery BCIs unsuitable. Strengths in attention and working memory support visual attention modulated BCIs (e.g., steady state visually evoked potential) or P300 with appropriate screen layouts (e.g., rapid serial visual presentation). |
Literacy | Strengths in literacy | Strengths in literacy support selection of a spelling-based BCI display. |
Considerations | Minimal headrest obstructions to posterior electroencephalography electrodes support use of P300 and steady state visually evoked potential systems. |
Example Case 3
Ms. Leeson (a 17-year-old young woman) has a diagnosis of spastic cerebral palsy and was referred for BCI evaluation for AAC intervention. Ms. Leeson performed well on many assessment tasks, including visual acuity, multistep directions, and cognitive motor learning (e.g., self-monitoring and task switching). Self-ratings for first-person motor imagery were high (an average score of 4.5/5 on the Kinesthetic and Visual Imagery Questionnaire's kinesthetic subscale), and the sensorimotor rhythm was present. She had moderate difficulty with oculomotor control, selective attention, and literacy tasks, and has a history of seizures. Posterior electroencephalography electrode recordings may be impeded by her wheelchair headrest. The key assessment criteria are listed in Table 3 and the feature matching checklist in Figure 4. Following assessment, device trials are suggested for auditory and/or visual feedback motor imagery devices with symbol selection (e.g., BCI switch-type access to commercial AAC devices with standard pictorial symbol sets).
Table 3.
Case 3 key assessment considerations for brain–computer interface (BCI) feature matching.
Domain | Pertinent assessment findings | Feature matching consideration |
---|---|---|
Sensory | No noticeable visual acuity impairment | Normal visual acuity supports BCIs with visual feedback over auditory despite normal hearing. |
Medical | History of seizures | History of seizures contraindicates steady state visually evoked potential devices and decreases P300 suitability. |
Motor | Moderate oculomotor impairment | Adaptations to support oculomotor ability are necessary to enhance individual strengths (e.g., screen placement and number of communication items). |
Motor imagery | High self-ratings for motor imagery (4.5/5), present sensorimotor rhythm | Motor imagery proficiency and present sensorimotor rhythm support a range of motor imagery BCIs that can be adapted to individual needs and preferences (e.g., binary/multiclass selection vs. continuous selection and control). |
Cognition | Strengths in cognitive–motor learning tasks, moderate difficulty with selective attention tasks | Individuals with impaired selective attention (a skill typically required for sensory-elicited BCIs) may be better supported by motor imagery-based BCI devices. |
Literacy | Moderate difficulty with literacy tasks | Motor imagery BCIs can be readily configured for symbol-based and text-based communication to support literacy skills. |
Considerations | Headrest obstructions to posterior electroencephalography electrodes may impede signal recording for visual processing, limiting P300 and steady state visually evoked potential systems. Young age is a positive factor for motor imagery approaches versus P300. |
Extrinsic and Environmental Factors
Though this review has primarily focused on intrinsic factors related to successful outcomes within BCI modalities, external/environmental factors that cut across all modalities are crucial considerations for AAC (and BCI) success, including caregiver support, intervention approaches focused on achieving individuals' communication goals (American Speech-Language-Hearing Association, 2017), improving quality of life, and supporting participation in preferred activities and environments (Beukelman & Mirenda, 2013). Consideration of both intrinsic and extrinsic factors is critical for communication success and for minimizing device abandonment.
The environment in which an individual uses BCI is particularly important and can limit the usefulness of specific BCI modalities (Sellers, Kubler, & Donchin, 2006). For instance, glare or reflections may obscure a BCI visual display, reducing the effectiveness of visual stimulation for eliciting electroencephalography signals (He, Huang, & Li, 2016), or environmental noise from sources such as ventilators and air conditioning units may degrade electroencephalography signals and, thus, lower BCI performance (Sellers, Kubler, & Donchin, 2006). Electrical artifacts from muscle contractions can also impede signal quality, and although often caused by intrinsic factors, such as spasticity and uncontrolled movements (Daly et al., 2013), they may also be elicited by environmental distractions, such as orienting responses (Brandl, Höhne, Müller, & Samek, 2015). Finally, the importance of caregiver support cannot be overstated for any AAC method and especially for BCI. Caregivers are often the primary personnel for troubleshooting basic environmental and technical difficulties for AAC and BCI, and additional services will be required for BCIs, including device setup (e.g., correct electroencephalography cap placement, application of electrolyte gel), basic device operation (e.g., turning on the device, starting calibration programs), and supporting and monitoring device use (Sellers, Vaughan, & Wolpaw, 2010). Additional extrinsic factors, including battery/power supply, size of equipment, software platforms, and the need for device programming, are important considerations in current device development, which will likely rely on commercial partnerships (e.g., with AAC device manufacturers) to integrate BCI processes into existing clinical/commercial frameworks. Taken together, it is clear that the technology itself is only one piece of the service provision necessary for AAC success, especially for BCI-based access. Therefore, the way in which intrinsic and extrinsic factors combine to inform BCI procedures over time is an important area of continued study.
Conclusions and Future Directions
Individuals who may use BCI to access AAC represent a heterogeneous population with unique and variable sensory–cognitive–motor profiles. Therefore, formal assessment and feature matching procedures are needed to ensure selection of the most appropriate BCI access technique for AAC. Previous efforts have been successful for screening some BCI-related skills prior to BCI use for individuals with severe paralysis (e.g., Fried-Oken et al., 2013; Geronimo et al., 2016), though they were largely limited to single modalities. The feature matching framework described here is designed to be comprehensive across all BCI types, adheres to the best practices currently used by SLPs specializing in AAC (Beukelman & Mirenda, 2013; Gosnell et al., 2011; Light & McNaughton, 2013), and is based upon the current literature and multidisciplinary viewpoints. That said, our framework should be considered an initial step toward feature matching across the full range of BCI devices, and we anticipate and welcome modifications as more BCIs are translated into clinical practice. Future investigations are needed to more fully examine how executive function and its specific components influence BCI performance across devices.
The visual P300 grid speller (Donchin et al., 2000) is the most well-known and most mature technology, with ongoing at-home trials (Holz, Botrel, Kaufmann, & Kübler, 2015; Sellers et al., 2010), and is currently available from commercial partners (e.g., the intendiX Speller; g.tec medical engineering). Therefore, these devices are likely to be among the first BCIs that professionals encounter in clinical settings, though logistical issues of insurance, funding, and prescription have not been resolved. Nearly all other BCIs are still primarily focused on laboratory testing, and our proposed feature matching guidelines may be modified and expanded upon over time. Despite possible future changes to the clinical BCI landscape, the fundamental science and engineering underlying the use of brain signals and feedback modalities for BCI access to AAC will remain consistent. Therefore, many of the domains considered in our proposed feature matching guidelines will still be applicable to future BCI implementations (e.g., sensory ability, motor imagery requirements, medical considerations). Other domains (e.g., cognition) may be modified with future improvements in the quality of BCI signal recording (e.g., less time needed for focusing on items in P300 and steady state visually evoked potential approaches). Further, external/environmental considerations will also change with advancements in electroencephalography technology, specifically through the development of dry electrodes that do not require electrolytic gel, wireless signal transmission, and cloud-based virtual telepresence for faster technology support responses. Last, though BCIs currently have limited representation in speech-language pathology, as they enter clinical practice, they will also need to be incorporated into AAC course curricula and preservice training to allow SLPs to obtain expertise in their function.
Maintaining a focus on each individual with complex communication needs by matching their unique present and future cognitive–sensory–motor profiles to BCI access methods will increase buy-in and quality of life and decrease device abandonment for individuals who use BCI. Multidisciplinary teams of professionals (e.g., SLPs, AAC professionals, neuroscientists, engineers, physical and occupational therapists, and other AAC-related research and clinical intervention disciplines), along with individuals who use AAC and BCI and their caregivers, must all be involved in the translation of current research to clinical practice and in the further development of person-centered BCI technology, assessment, and training methods.
Acknowledgments
This work was supported in part by National Institute on Deafness and Other Communication Disorders Grant R03-DC011304, awarded to J. Brumberg; the University of Kansas New Faculty Research Fund, awarded to J. Brumberg; and the American Speech-Language-Hearing Foundation New Century Scholars Research Grant, awarded to J. Brumberg. The authors would like to thank Nancy Brady, Jeremy Burnison, Alana Mantie-Kozlowski, Allison Meder, Caitlin Masterson, and Kelli Johnsen for their discussions on the development of the feature matching framework, and Anthony Pitt, for assistance with the checklist design.
References
- Acqualagna L., & Blankertz B. (2013). Gaze-independent BCI-spelling using rapid serial visual presentation (RSVP). Clinical Neurophysiology, 124(5), 901–908.
- Ahn M., & Jun S. C. (2015). Performance variation in motor imagery brain–computer interface: A brief review. Journal of Neuroscience Methods, 243, 103–110.
- Allison B., McFarland D., Schalk G., Zheng S., Jackson M., & Wolpaw J. R. (2008). Towards an independent brain–computer interface using steady state visual evoked potentials. Clinical Neurophysiology, 119(2), 399–408.
- American Speech-Language-Hearing Association. (2017). Practice portal: Augmentative and alternative communication. Retrieved from http://www.asha.org/PRPSpecificTopic.aspx?folderid=8589942773&section=Key_Issues
- Bauer G., Gerstenbrand F., & Rumpl E. (1979). Varieties of the locked-in syndrome. Journal of Neurology, 221(2), 77–91.
- Beeldman E., Raaphorst J., Twennaar M. K., de Visser M., Schmand B. A., & de Haan R. J. (2016). The cognitive profile of ALS: A systematic review and meta-analysis update. Journal of Neurology, Neurosurgery, & Psychiatry, 87(6), 611–619.
- Beukelman D., & Mirenda P. (2013). Augmentative and alternative communication: Supporting children and adults with complex communication needs (4th ed.). Baltimore, MD: Brookes.
- Blankertz B., Dornhege G., Krauledat M., Müller K.-R., Kunzmann V., Losch F., & Curio G. (2006). The Berlin brain–computer interface: EEG-based communication without subject training. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 14(2), 147–152.
- Blankertz B., Sannelli C., Halder S., Hammer E. M., Kübler A., Müller K., … Dickhaus T. (2010). Neurophysiological predictor of SMR-based BCI performance. NeuroImage, 51(4), 1303–1309.
- Brandl S., Höhne J., Müller K. R., & Samek W. (2015). Bringing BCI into everyday life: Motor imagery in a pseudo realistic environment. In 7th International IEEE/EMBS Conference on Neural Engineering (NER) (pp. 224–227). Washington, DC: IEEE.
- Brumberg J. S., Burnison J. D., & Pitt K. M. (2016). Using motor imagery to control brain–computer interfaces for communication. In Schmorrow D. D. & Fidopiastis C. M. (Eds.), Foundations of augmented cognition: Neuroergonomics and operational neuroscience (pp. 14–25). Cham, Switzerland: Springer.
- Brumberg J. S., Nguyen A., Pitt K. M., & Lorenz S. D. (2018). Examining sensory ability, feature matching, and assessment-based adaptation for a brain–computer interface using the steady-state visually evoked potential. Disability and Rehabilitation: Assistive Technology. Advance online publication. https://doi.org/10.1080/17483107.2018.1428369
- Brumberg J. S., Pitt K. M., & Burnison J. D. (2018). A non-invasive brain–computer interface for real-time speech synthesis: The importance of multimodal feedback. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 26(4), 874–881. https://doi.org/10.1109/TNSRE.2018.2808425
- Brumberg J. S., Pitt K. M., Mantie-Kozlowski A., & Burnison J. D. (2018). Brain–computer interfaces for augmentative and alternative communication: A tutorial. American Journal of Speech-Language Pathology, 27, 1–12.
- Brunner P., Joshi S., Briskin S., Wolpaw J. R., Bischof H., & Schalk G. (2010). Does the ‘P300’ speller depend on eye gaze? Journal of Neural Engineering, 7(5), 056013.
- Combaz A., Chatelle C., Robben A., Vanhoof G., Goeleven A., Thijs V., … Laureys S. (2013). A comparison of two spelling brain–computer interfaces based on visual P3 and SSVEP in locked-in syndrome. PLoS One, 8(9), e73691.
- Cunha M., Portela C., Bastos V. H., Machado D., Machado S., Velasques B., … Ribeiro P. (2008). Responsiveness of sensorimotor cortex during pharmacological intervention with bromazepam. Neuroscience Letters, 448(1), 33–36.
- Daly I., Billinger M., Laparra-Hernández J., Aloise F., García M. L., Faller J., … Müller-Putz G. (2013). On the control of brain–computer interfaces by users with cerebral palsy. Clinical Neurophysiology, 124(9), 1787–1797.
- Donchin E., Spencer K. M., & Wijesinghe R. (2000). The mental prosthesis: Assessing the speed of a P300-based brain–computer interface. IEEE Transactions on Rehabilitation Engineering, 8(2), 174–179.
- Engelmann J. B., Damaraju E., Padmala S., & Pessoa L. (2009). Combined effects of attention and motivation on visual task performance: Transient and sustained motivational effects. Frontiers in Human Neuroscience, 3, 4.
- Fager S., Beukelman D., Fried-Oken M., Jakobs T., & Baker J. (2012). Access interface strategies. Assistive Technology, 24(1), 25–33.
- Farwell L. A., & Donchin E. (1988). Talking off the top of your head: Toward a mental prosthesis utilizing event-related brain potentials. Electroencephalography and Clinical Neurophysiology, 70(6), 510–523.
- Fried-Oken M., Mooney A., Peters B., & Oken B. (2013). A clinical screening protocol for the RSVP keyboard brain–computer interface. Disability and Rehabilitation: Assistive Technology, 10(1), 11–18.
- Geronimo A., Simmons Z., & Schiff S. J. (2016). Performance predictors of brain–computer interfaces in patients with amyotrophic lateral sclerosis. Journal of Neural Engineering, 13(2), 026002.
- Gershon R. C., Wagster M. V., Hendrie H. C., Fox N. A., Cook K. F., & Nowinski C. J. (2013). NIH toolbox for assessment of neurological and behavioral function. Neurology, 80(11, Supp. 3), S2–S6. https://doi.org/10.1212/WNL.0b013e3182872e5f
- Gosnell J., Costello J., & Shane H. (2011). Using a clinical approach to answer “what communication apps should we use?” SIG 12 Perspectives on Augmentative and Alternative Communication, 20(3), 87–96.
- Halder S., Agorastos D., Veit R., Hammer E. M., Lee S., Varkuti B., … Kübler A. (2011). Neural mechanisms of brain–computer interface control. NeuroImage, 55(4), 1779–1790.
- Halder S., Rea M., Andreoni R., Nijboer F., Hammer E. M., Kleih S. C., … Kübler A. (2010). An auditory oddball brain–computer interface for binary choices. Clinical Neurophysiology, 121(4), 516–523.
- Hammer E. M., Kaufmann T., Kleih S. C., Blankertz B., & Kübler A. (2014). Visuo-motor coordination ability predicts performance with brain–computer interfaces controlled by modulation of sensorimotor rhythms (SMR). Frontiers in Human Neuroscience, 8, 1–9.
- Hayakawa T., Uchiyama M., Urata J., Enomoto T., Okubo J., & Okawa M. (1999). Effects of a small dose of triazolam on P300. Psychiatry and Clinical Neurosciences, 53(2), 185–187.
- He S., Huang Q., & Li Y. (2016, June). Toward improved P300 speller performance in outdoor environment using polarizer. In 2016 12th World Congress on Intelligent Control and Automation (WCICA) (pp. 3172–3175). Institute of Electrical and Electronics Engineers.
- Higashi H., Rutkowski T. M., Washizawa Y., Cichocki A., & Tanaka T. (2011). EEG auditory steady state responses classification for the novel BCI. Conference Proceedings of the IEEE Engineering in Medicine and Biology Society, 2011, 4576–4579.
- Hill N. J., Ricci E., Haider S., McCane L. M., Heckman S., Wolpaw J. R., & Vaughan T. M. (2014). A practical, intuitive brain–computer interface for communicating ‘yes’ or ‘no’ by listening. Journal of Neural Engineering, 11(3), 035003.
- Holz E. M., Botrel L., Kaufmann T., & Kübler A. (2015). Long-term independent brain–computer interface home use improves quality of life of a patient in the locked-in state: A case study. Archives of Physical Medicine and Rehabilitation, 96(3), S16–S26.
- Ikegami S., Takano K., Saeki N., & Kansaku K. (2011). Operation of a P300-based brain–computer interface by individuals with cervical spinal cord injury. Clinical Neurophysiology, 122(5), 991–996.
- Jamison R. N., Schein J. R., Vallow S., Ascher S., Vorsanger G. J., & Katz N. P. (2003). Neuropsychological effects of long-term opioid use in chronic pain patients. Journal of Pain and Symptom Management, 26(4), 913–921.
- Jeunet C., N'Kaoua B., Subramanian S., Hachet M., & Lotte F. (2015). Predicting mental imagery-based BCI performance from personality, cognitive profile and neurophysiological patterns. PLoS One, 10(12), 1–21.
- Käthner I., Ruf C. A., Pasqualotto E., Braun C., Birbaumer N., & Halder S. (2013). A portable auditory P300 brain–computer interface with directional cues. Clinical Neurophysiology, 124(2), 327–338.
- Kelly S. P., Lalor E. C., Finucane C., McDarby G., & Reilly R. B. (2005). Visual spatial attention control in an independent brain–computer interface. IEEE Transactions on Biomedical Engineering, 52(9), 1588–1596.
- Kleih S. C., & Kübler A. (2015). Psychological factors influencing brain–computer interface (BCI) performance. IEEE International Conference on Systems, Man, and Cybernetics, 2015, 3192–3196.
- Klobassa D. S., Vaughan T. M., Brunner P., Schwartz N. E., Wolpaw J. R., Neuper C., & Sellers E. W. (2009). Toward a high-throughput auditory P300-based brain–computer interface. Clinical Neurophysiology, 120(7), 1252–1261.
- Koester H. H., & Levine S. (1996). Effect of a word prediction feature on user performance. Augmentative and Alternative Communication, 12(3), 155–168.
- Kübler A., Furdea A., Halder S., Hammer E. M., Nijboer F., & Kotchoubey B. (2009). A brain–computer interface controlled auditory event-related potential (P300) spelling system for locked-in patients. Annals of the New York Academy of Sciences, 1157, 90–100.
- Kübler A., Kotchoubey B., Kaiser J., Wolpaw J. R., & Birbaumer N. (2001). Brain–computer communication: Unlocking the locked in. Psychological Bulletin, 127(3), 358–375.
- Light J., & McNaughton D. (2013). Putting people first: Re-thinking the role of technology in augmentative and alternative communication intervention. Augmentative and Alternative Communication, 29(4), 299–309.
- Lin Z., Zhang C., Wu W., & Gao X. (2007). Frequency recognition based on canonical correlation analysis for SSVEP-based BCIs. IEEE Transactions on Biomedical Engineering, 54(6), 1172–1176.
- Lopez M., Pomares H., Pelayo F., Urquiza J., & Perez J. (2009). Evidences of cognitive effects over auditory steady-state responses by means of artificial neural networks and its use in brain computer interfaces. Neurocomputing, 72(16–18), 3617–3623.
- Mak J. N., & Wolpaw J. R. (2009). Clinical applications of brain–computer interfaces: Current state and future prospects. IEEE Reviews in Biomedical Engineering, 2, 187–199.
- Malouin F., Richards C. L., Jackson P. L., Lafleur M. F., Durand A., & Doyon J. (2007). The Kinesthetic and Visual Imagery Questionnaire (KVIQ) for assessing motor imagery in persons with physical disabilities: A reliability and construct validity study. Journal of Neurologic Physical Therapy, 31(1), 20–29.
- Marinelli L., Quartarone A., Hallett M., Frazzitta G., & Ghilardi M. F. (2017). The many facets of motor learning and their relevance for Parkinson's disease. Clinical Neurophysiology, 128(7), 1127–1141.
- McCane L. M., Sellers E. W., McFarland D. J., Mak J. N., Carmack C. S., Zeitlin D., … Vaughan T. M. (2014). Brain–computer interface (BCI) evaluation in people with amyotrophic lateral sclerosis. Amyotrophic Lateral Sclerosis and Frontotemporal Degeneration, 15(3–4), 207–215.
- Meador K. J. (1998). Cognitive side effects of medications. Neurologic Clinics, 16(1), 141–155.
- Miner L. A., McFarland D. J., & Wolpaw J. R. (1998). Answering questions with an electroencephalogram-based brain–computer interface. Archives of Physical Medicine and Rehabilitation, 79(9), 1029–1033.
- Müller-Putz G. R., Scherer R., Brauneis C., & Pfurtscheller G. (2005). Steady-state visual evoked potential (SSVEP)-based communication: Impact of harmonic frequency components. Journal of Neural Engineering, 2(4), 123–130.
- Neuper C., Müller G. R., Kübler A., Birbaumer N., & Pfurtscheller G. (2003). Clinical application of an EEG-based brain–computer interface: A case study in a patient with severe motor impairment. Clinical Neurophysiology, 114(3), 399–409.
- Neuper C., Scherer R., Reiner M., & Pfurtscheller G. (2005). Imagery of motor actions: Differential effects of kinesthetic and visual-motor mode of imagery in single-trial EEG. Cognitive Brain Research, 25(3), 668–677.
- Nijboer F., Birbaumer N., & Kübler A. (2010). The influence of psychological state and motivation on brain–computer interface performance in patients with amyotrophic lateral sclerosis—A longitudinal study. Frontiers in Neuroscience, 4(55), 1–13.
- Nijboer F., Furdea A., Gunst I., Mellinger J., McFarland D. J., Birbaumer N., & Kübler A. (2008). An auditory brain–computer interface (BCI). Journal of Neuroscience Methods, 167(1), 43–50.
- Obermaier B., Müller G. R., & Pfurtscheller G. (2003). “Virtual keyboard” controlled by spontaneous EEG activity. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 11(4), 422–426.
- Oken B. S., Orhan U., Roark B., Erdogmus D., Fowler A., Mooney A., … Fried-Oken M. B. (2014). Brain–computer interface with language model–electroencephalography fusion for locked-in syndrome. Neurorehabilitation and Neural Repair, 28(4), 387–394.
- Peters B., Mooney A., Oken B., & Fried-Oken M. (2016). Soliciting BCI user experience feedback from people with severe speech and physical impairments. Brain–Computer Interfaces, 2621, 1–12.
- Pfurtscheller G., & Neuper C. (2001). Motor imagery and direct brain–computer communication. Proceedings of the IEEE, 89(7), 1123–1134.
- Picton T. W., John M. S., Dimitrijevic A., & Purcell D. (2003). Human auditory steady-state responses. International Journal of Audiology, 42, 177–219.
- Plum F., & Posner J. B. (1972). The diagnosis of stupor and coma. Contemporary Neurology Series, 10, 1–286.
- Polich J., & Criado J. R. (2006). Neuropsychology and neuropharmacology of P3a and P3b. International Journal of Psychophysiology, 60(2), 172–185.
- Regan D. (1989). Human brain electrophysiology: Evoked potentials and evoked magnetic fields in science and medicine. New York, NY: Elsevier.
- Riccio A., Simione L., Schettini F., Pizzimenti A., Inghilleri M., Belardinelli M. O., … Cincotti F. (2013). Attention and P300-based BCI performance in people with amyotrophic lateral sclerosis. Frontiers in Human Neuroscience, 7, 732.
- Scherer R., Billinger M., Wagner J., Schwarz A., Tassilo D., Bolinger E., … Mu G. (2015). Thought-based row-column scanning communication board for individuals with cerebral palsy. Annals of Physical and Rehabilitation Medicine, 58, 14–22.
- Schnakers C., Majerus S., Goldman S., Boly M., Van Eeckhout P., Gay S., … Laureys S. (2008). Cognitive function in the locked-in syndrome. Journal of Neurology, 255(3), 323–330.
- Seidler R. D., Bo J., & Anguera J. A. (2012). Neurocognitive contributions to motor skill learning: The role of working memory. Journal of Motor Behavior, 44(6), 445–453.
- Sellers E. W., Krusienski D. J., McFarland D. J., Vaughan T. M., & Wolpaw J. R. (2006). A P300 event-related potential brain–computer interface (BCI): The effects of matrix size and inter stimulus interval on performance. Biological Psychology, 73(3), 242–252.
- Sellers E. W., Kübler A., & Donchin E. (2006). Brain–computer interface research at the University of South Florida Cognitive Psychophysiology Laboratory: The P300 speller. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 14(2), 221–224.
- Sellers E. W., Vaughan T. M., & Wolpaw J. R. (2010). A brain–computer interface for long-term independent home use. Amyotrophic Lateral Sclerosis, 11(5), 449–455.
- Sigrist R., Rauter G., Riener R., & Wolf P. (2013). Augmented visual, auditory, haptic, and multimodal feedback in motor learning: A review. Psychonomic Bulletin & Review, 20(1), 21–53.
- Silva J., Arias-Carrión O., Paes F., Velasques B., Teixeira S., Basile L., … Ribeiro P. (2011). Bromazepam impairs motor response: An ERSP study. CNS and Neurological Disorders-Drug Targets, 10(8), 945–950.
- Silvoni S., Volpato C., Cavinato M., Marchetti M., Priftis K., Merico A., … Piccione F. (2009). P300-based brain–computer interface communication: Evaluation and follow-up in amyotrophic lateral sclerosis. Frontiers in Neuroscience, 3, 60.
- Sprague S. A., McBee M. T., & Sellers E. W. (2016). The effects of working memory on brain–computer interface performance. Clinical Neurophysiology, 127(2), 1331–1341.
- Sugata H., Hirata M., Kageyama Y., Kishima H., Sawada J., & Yoshimine T. (2016). Relationship between the spatial pattern of P300 and performance of a P300-based brain–computer interface in amyotrophic lateral sclerosis. Brain–Computer Interfaces, 3(1), 1–8.
- Sutter E. E. (1992). The brain response interface: Communication through visually-induced electrical brain responses. Journal of Microcomputer Applications, 15(1), 31–45.
- Thistle J. J., & Wilkinson K. M. (2013). Working memory demands of aided augmentative and alternative communication for individuals with developmental disabilities. Augmentative and Alternative Communication, 29(3), 235–245.
- Thistle J. J., & Wilkinson K. M. (2015). Building evidence-based practice in AAC display design for young children: Current practices and future directions. Augmentative and Alternative Communication, 31(2), 124–136.
- Vaughan T. M., McFarland D. J., Schalk G., Sarnacki W. A., Krusienski D. J., Sellers E. W., & Wolpaw J. R. (2006). The Wadsworth BCI Research and Development Program: At home with BCI. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 14(2), 229–233.
- Vidaurre C., & Blankertz B. (2010). Towards a cure for BCI illiteracy. Brain Topography, 23(2), 194–198.
- Volosyak I., Valbuena D., Lüth T., Malechka T., & Gräser A. (2011). BCI demographics II: How many (and what kinds of) people can use a high-frequency SSVEP BCI? IEEE Transactions on Neural Systems and Rehabilitation Engineering, 19(3), 232–239.
- Vuckovic A., & Osuagwu B. A. (2013). Using a motor imagery questionnaire to estimate the performance of a brain–computer interface based on object oriented motor imagery. Clinical Neurophysiology, 124(8), 1586–1595.
- Wander J., Blakely T., Miller K., Weaver K., Johnson L., Olson J., … Ojemann J. (2013). Distributed cortical adaptation during learning of a brain–computer interface task. Proceedings of the National Academy of Sciences, 110(26), 10818–10823.
- Wolpaw J. R., Birbaumer N., McFarland D. J., Pfurtscheller G., & Vaughan T. M. (2002). Brain–computer interfaces for communication and control. Clinical Neurophysiology, 113(6), 767–791.
- Zhang D., Maye A., Gao X., Hong B., Engel A. K., & Gao S. (2010). An independent brain–computer interface using covert non-spatial visual selective attention. Journal of Neural Engineering, 7(1), 016010.