Exploration. 2024 Mar 14;4(5):20230146. doi: 10.1002/EXP.20230146

Neural interfaces: Bridging the brain to the world beyond healthcare

Shumao Xu 1, Yang Liu 2, Hyunjin Lee 1, Weidong Li 2
PMCID: PMC11491314  PMID: 39439491

Abstract

Neural interfaces, emerging at the intersection of neurotechnology and urban planning, promise to transform how we interact with our surroundings and communicate. By recording and decoding neural signals, these interfaces facilitate direct connections between the brain and external devices, enabling seamless information exchange and shared experiences. Nevertheless, their development is challenged by complexities in materials science, electrochemistry, and algorithmic design. Electrophysiological crosstalk and the mismatch between electrode rigidity and tissue flexibility further complicate signal fidelity and biocompatibility. Recent closed‐loop brain‐computer interfaces, while promising for mood regulation and cognitive enhancement, are limited by decoding accuracy and the adaptability of user interfaces. This perspective outlines these challenges and discusses the progress in neural interfaces, contrasting non‐invasive and invasive approaches, and explores the dynamics between stimulation and direct interfacing. Emphasis is placed on applications beyond healthcare, highlighting the need for implantable interfaces with high‐resolution recording and stimulation capabilities.

Keywords: decoded neural activity, human‐machine interactions, mind communication, neural interfaces, remote control, smart homes


This perspective explores diverse neural interfaces in the brain, which offer transformative possibilities for engaging with our environment and enhancing communication. Emphasis is placed on the role of both non‐invasive and invasive neural interfaces in practical applications beyond the realm of healthcare, with promising implications for remote control, mind connectivity, and integrated urban design.


1. INTRODUCTION

With over 86 billion neurons and trillions of connections, the human brain is a complex organ, characterized by its remarkable capacity for processing, learning, and adapting.[ 1 , 2 , 3 , 4 , 5 , 6 , 7 ] This has led to significant interest in brain‐machine interfaces (BMIs) and brain‐computer interfaces (BCIs),[ 8 , 9 , 10 , 11 ] which are transforming how we interact with our surroundings and communicate.[ 12 ] By interpreting brain activity, these technologies enable intuitive and natural manipulation of external devices.[ 13 , 14 ] High‐resolution and reliable neural interfaces are paving the way for direct brain‐to‐device and brain‐to‐computer connection, heralding a new era of information exchange and thought communication.[ 11 , 15 ]

Historically, neural interfaces have played a pivotal role in healthcare. Serving as an intermediary between external electronic devices and biological tissue, these interfaces, particularly neural microelectrodes, have been crucial in both recording bioelectrical signals for sensory information and motility mapping and in electrically stimulating neural tissues for biological function regulation, such as altering ion concentrations inside and outside the cell membrane and improving neural signal transmission.[ 16 , 17 , 18 , 19 , 20 ] This advancement has deepened our insight into the workings of the nervous system and improved the treatment of neurological conditions like Parkinson's disease and epilepsy, as well as sensory impairments like hearing and vision loss.[ 21 ]

Early multi‐channel silicon‐based neural electrodes, like the Utah array and the Michigan probe, caused cell death and inflammatory tissue reactions, hampering signal stability.[ 22 , 23 ] The evolution towards flexible polymer microelectrode arrays has been a significant step forward, yet challenges remain in electrocorticography (ECoG) signal clarity due to tissue‐electrode interface mismatches.[ 24 , 25 ] Neural interfaces have since evolved considerably, transcending their traditional medical applications to pioneer groundbreaking uses across various sectors. Central to these advancements is the ability to decode brain patterns, enabling control over external devices like prosthetics.[ 10 , 26 ] These interfaces offer intuitive control, akin to the natural use of limbs, representing a breakthrough in assistive technology. The development of these interfaces involves direct measurement of brain activity with diverse temporal and spatial resolutions, combined with advanced mathematical modeling. Nevertheless, these technological advances face significant challenges. Precise control requires sophisticated algorithms capable of decoding complex neural signals.[ 9 ] Additionally, ensuring long‐term biocompatibility and minimizing adverse biological responses, such as inflammation, fibrosis, infection, and neurodegeneration, is crucial for their safe and effective deployment.[ 10 , 27 ] Recent advancements in fully internalized microelectrodes for deep brain stimulation (DBS), cochlear implants, and retinal prostheses have been driven by designs that enhance safety and enable long‐term stimulation and recording.[ 17 , 28 , 29 ]

Beyond healthcare, these neural interfaces have the potential to transform fields such as virtual reality, smart home technology, and urban planning.[ 30 , 31 , 32 , 33 , 34 , 35 , 36 , 37 ] These interfaces offer immersive interactions with virtual worlds, improved Internet of Things (IoT) control,[ 38 ] and the potential for emotion sharing and mind connectivity.[ 39 ] Envision a future where self‐driving cars,[ 40 ] smart homes,[ 41 ] and other urban utilities are not just automated, but also directly controlled by brain signals,[ 42 , 43 ] resulting in an efficient and interconnected landscape.[ 44 ]

This perspective delves into the progress and challenges of neural interfaces, which are changing how we interact with our environment. These interfaces work by recording and decoding neural signals, thus enabling direct connections between the brain and devices for seamless information sharing and collective experiences. However, significant challenges remain. These include the complexity of the required algorithms, the interference between electrophysiological signals, and the mechanical mismatch between rigid electrodes and soft neural tissue. This perspective also highlights the differences between non‐invasive and invasive neural interfaces. While non‐invasive methods are less risky and easier to use, invasive interfaces offer higher resolution and are more effective for specific applications. Advances in high‐density, biocompatible implantable interfaces are increasingly essential for applications beyond healthcare, such as human‐machine interaction. The potential of neural interfaces in these areas is vast, but realizing this potential requires addressing the technical limitations and ethical concerns.

2. BRAIN NEURAL INTERFACE

Brain neural interfaces, such as electroencephalography (EEG), ECoG, subcortical microelectrode arrays (MEA), and DBS, offer unprecedented opportunities for healthcare and human‐machine interactions.

2.1. Brain regions and neural interfaces

The brain is an intricate organ with various regions in charge of thoughts, emotions, and behaviors.[ 45 , 46 , 47 , 48 , 49 ] The prefrontal cortex, often considered the “thinking brain”, is primarily accountable for executive functions like decision‐making and planning[ 50 ] (Figure 1A). Regions like the cingulate gyrus and ventral striatum are commonly referred to as the “emotional brain”, as they are critical in shaping our emotional experiences and responses.[ 51 ] The amygdala, hypothalamus, and hippocampus are often described as the “doing brain”, with functions that include sleep regulation, autonomic control, and motor activities.[ 52 , 53 ] Techniques like DBS are powerful tools that target these specific regions, presenting potential treatments for mood disorders and other neural conditions. The field of neural interfaces is witnessing significant growth, especially in the areas of artificial intelligence (AI)‐driven neural decoding and neural stimulation therapies (Figure 1B). This expansion is marked by a diverse set of international collaborations, as indicated by various color‐coded clusters in research output (Figure 2). The United States leads in scholarly output on “neural interface”, followed by China, Germany, and the United Kingdom. Among the top 20 contributing nations, approximately 30% of their published work involves international cooperation. This trend underscores a global, interdisciplinary effort in advancing neural interface science. The burgeoning neural interfaces hold promise not only for medical applications but also for broad societal impacts, especially as these technologies increasingly integrate into daily life.

FIGURE 1.

(A) Representation of selected brain regions and their associated clinical symptoms; panel created with Biorender.com. (B) Top keywords in neural interface publications.

FIGURE 2.

Publications with the subject of “neural interface” in different countries. A bibliometric approach and network analysis were employed to visualize the collaboration patterns. The size of the circles on the map corresponds to the total number of publications from each country, while different colors indicate collaborative publications.

2.2. Historical evolution in different neural interfaces

Starting in the 1960s, EEG advancements enabled brain source localization, such as identifying epilepsy foci,[ 54 ] marking a significant leap in understanding and treating neurological conditions (Figure 3). The 1970s saw further progress with the development of methods for topographic analyses of EEG data, enhancing visualization of the spatial distribution of brain activity.[ 55 , 56 , 57 ] Additionally, improvements in computational methods enabled single‐trial EEG data analyses, enhancing temporal resolution and precision.[ 58 ] The 1980s were marked by directional tuning research, focusing on how a neuron firing rate changes with the movement direction,[ 59 , 60 ] which significantly enhanced our understanding of motor control and brain‐coordinated movement. The 1990s introduced deep learning algorithms in EEG data analysis[ 61 , 62 ] and saw the first clinical demonstrations of BCIs in humans, especially for individuals with amyotrophic lateral sclerosis (ALS), showcasing the potential of BCIs in augmenting communication and control.[ 63 ] The 2000s brought about the development in sensory haptic devices[ 64 ] and research into how stimulation affects sensation and perception,[ 65 , 66 ] providing insights into sensory processing. In the 2010s, the integration of brain‐controlled therapies and AI with BCIs for clinical diagnosis opened new pathways in treating neurological disorders.[ 67 ] The emergence of the metaverse in 2021, integrating BCIs with virtual and augmented reality, followed by the development of the brain‐AI closed‐loop system (BACLoS) in 2022,[ 68 ] has marked recent progress. ECoG has evolved significantly, with the development of foldable and flexible ECoG in 2011,[ 69 ] and high‐density Neurogrid in 2015.[ 70 ] The evolution of MEAs traces back to the early development of the Michigan silicon electrode which laid the groundwork for precise neural recording and stimulation. 
By 1983, tetrodes allowed for simultaneous recording from multiple neurons,[ 71 ] and the 1990s saw the development of the Utah array, notable for its detailed brain mapping capabilities.[ 72 ] In 2005, advancements in tetrode technology improved single‐neuron representations, allowing for precise studies of individual neural activities.[ 73 ] The development of a transparent intracortical microprobe array in 2015 enabled simultaneous electrical recording and optical stimulation, further advancing neuroscience research.[ 74 ]

FIGURE 3.

Evolution of different neural interface techniques. ALS, amyotrophic lateral sclerosis; Ag, silver; stim, stimulation; IPG, implantable pulse generator; BACLoS, brain‐AI closed‐loop system. Micro‐DBS, reproduced under the terms of the CC‐BY Creative Commons Attribution International license (https://creativecommons.org/licenses).[ 87 ] Copyright 2019, The Authors, published by Frontiersin.org. Partially created by Biorender.com.

DBS has evolved from stereotactic frames in 1947 for precise brain targeting[ 75 ] to the first silver‐based DBS in 1948[ 76 ] and Leksell's arc‐based design in 1949[ 77 ] (Figure 3). The 1970s introduced DBS for pain treatment[ 78 ] and external DBS systems with handheld radio frequency transmitters.[ 79 ] The 1980s saw the first fully internalized DBS systems,[ 80 ] and by the late 1980s, DBS was successfully used for tremor treatment.[ 81 ] The late 1990s and early 2000s saw the development of dual‐channel implantable pulse generators (IPGs)[ 82 ] and commercialized Quadripolar electrodes, allowing for precise controlled stimulation.[ 83 ] In 2009, rechargeable batteries were introduced,[ 84 ] followed by advancements in 2013 with stimulation‐recording adaptive DBS in Parkinson's disease treatment,[ 85 ] closed‐loop systems for epilepsy,[ 86 ] and micro‐DBS probes.[ 87 , 88 ] The introduction of directional Quadripolar DBS electrodes in 2016, with their ability to shape the electric field through directional electrodes, offered targeted stimulation to enhance therapeutic efficacy.[ 89 ] The year 2019 saw developments, with the introduction of closed‐loop DBS enabling control over external devices[ 90 ] and the advent of wireless 3 Tesla‐compatible DBS systems with imaging techniques to enhance targeted precision and treatment efficacy.[ 91 ]

2.3. Neural interfaces: Current advances and challenges

DBS, primarily recognized for its role in treating neurological conditions, is now being explored in applications beyond healthcare, particularly to mitigate movement disorder symptoms like tremors and rigidity.[ 92 , 93 ] However, its invasive nature poses risks and challenges. ECoG, another invasive method, involves direct electrode placement on the brain surface to detect electrical signals. It shows promise in controlling prosthetic limbs and monitoring epileptic seizures.[ 11 , 94 ] Current endeavors aim to develop more flexible and chronic electrodes to broaden their applicability.[ 95 , 96 ] MEA has the capability to precisely monitor neural activity, capturing signals from neuron clusters. As with other invasive methods, concerns about their long‐term integration and potential health impacts persist. EEG stands apart as a non‐invasive technique for monitoring neural activity by placing electrodes on the scalp.[ 97 ] It has been effectively used in various applications including navigating virtual environments and monitoring neural responses in situations like fatigued driving. Nevertheless, its relatively low signal‐to‐noise ratio can be a limitation. Advanced signal processing techniques like time‐frequency analysis and independent component analysis (ICA) are utilized to improve EEG data accuracy.[ 98 , 99 ] Despite the potential of various neural interfaces, challenges in ensuring their accuracy, reliability, and long‐term safety, particularly when integrated into daily urban life, remain critical concerns.
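The time‐frequency analysis mentioned above for improving EEG data accuracy can be sketched with a short‐time spectrogram. In this minimal illustration, the sampling rate, chirp parameters, and noise level are assumptions for demonstration, not values from the cited studies:

```python
import numpy as np
from scipy.signal import chirp, spectrogram

rng = np.random.default_rng(0)
fs = 250.0  # assumed sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)
# synthetic trace whose dominant rhythm drifts from 8 Hz ("alpha") to 25 Hz ("beta")
x = chirp(t, f0=8, f1=25, t1=10, method="linear") + 0.3 * rng.standard_normal(t.size)

# short-time spectrogram: rows are frequencies, columns are time segments
f, seg_t, Sxx = spectrogram(x, fs=fs, nperseg=256, noverlap=192)
dominant = f[np.argmax(Sxx, axis=0)]  # dominant frequency in each time segment
```

Tracking `dominant` over time is the essence of time‐frequency decoding: a fatigued‐driving monitor, for instance, would watch for power migrating into slower bands rather than faster ones.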

2.3.1. Biocompatibility, durability, and efficiency

When foreign electrodes are implanted in the brain, they typically trigger a foreign body response, including inflammation and scarring, which can reduce the effectiveness of the interface over time.[ 24 , 100 , 101 ] To enhance biocompatibility, recent advancements have focused on utilizing nanomaterials and coatings to reduce inflammation and scarring.[ 24 , 101 ] As electrodes are miniaturized to enhance spatial resolution and reduce tissue damage, they face challenges such as a decreased signal‐to‐noise ratio and increased impedance, which hinders efficient signal transmission.[ 102 ] To improve the durability and mechanical adhesion of electrodes to tissue, various treatments, including microwave treatment[ 103 ] and tough interfacial covalent bonding,[ 104 , 105 , 106 ] have been employed. Furthermore, the development of 3D nanostructures and the use of durable materials like nanostructured platinum (Pt)[ 107 ] and iridium oxide (IrOx)[ 108 ] have proven effective in improving long‐term stability. Over time, electrodes also suffer from degradation due to mechanical strain and continual cyclic loads, leading to cracking, delamination, and potential failure.[ 17 ] To address these challenges, efforts have been made to optimize efficiency by balancing miniaturization with efficient signal transmission. This includes advancements in material engineering and electrode design, aiming to resolve issues related to decreased signal‐to‐noise ratio and increased impedance.[ 109 ]

2.3.2. Crosstalk in high‐density electrode arrays

Crosstalk in high‐density electrode arrays presents a significant challenge in neural recordings and stimulation.[ 24 , 110 , 111 , 112 ] This issue arises from the proximity of electrodes within the array, causing their electric fields to overlap in time and space at the electrode and tissue interface. Crosstalk can significantly interfere with signal clarity and may lead to complex and adverse neural interactions. Crosstalk exceeding 1% is not negligible in neural signal recording,[ 24 , 110 ] as it can induce neural interactions and even inhibit neural activation if the extracellular potential exceeds the inhibition threshold. This limits the spatial‐temporal resolution and can adversely affect nearby sites in high‐density electrode arrays.[ 24 ] Additionally, crosstalk issues may be exacerbated in flexible polymer arrays due to insulation limitations of polymer substrate and encapsulation layers.[ 111 ]

To overcome this, the design of electrode arrays is being refined by adjusting both electrode spacing and diameter.[ 113 ] Increasing the distance between electrodes could reduce the overlap of their electric fields, thereby minimizing crosstalk.[ 113 ] This spacing is fine‐tuned based on application needs and the specific neural tissue targeted, balancing the need for high spatial resolution with reduced interference. Additionally, reducing the diameter of electrodes limits the spatial extent of electric fields, further reducing crosstalk potential.[ 113 ] However, smaller electrodes could result in increased impedance, necessitating a balance that minimizes crosstalk while maintaining signal quality. Furthermore, the development of multi‐channel sites on electrode arrays represents a significant step forward.[ 114 , 115 ] These arrays could uniformly distribute electric fields across the surface, thereby reducing edge effects where electric field density tends to concentrate at the edge of the electrode.[ 18 , 114 ] This field uniformity mitigates crosstalk in high‐density arrays and supports a more consistent modified coating, improving both the accuracy of electrode recordings and the efficacy of stimulation.
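The spacing‐versus‐diameter trade‐off can be illustrated with a deliberately simplified monopolar point‐source model in a homogeneous volume conductor. The conductivity, electrode radius, and pitch values below are illustrative assumptions, not measurements from any cited array:

```python
import numpy as np

SIGMA = 0.3  # assumed tissue conductivity in S/m (illustrative only)

def point_source_potential(i_amp, r, sigma=SIGMA):
    """Potential at distance r from a monopolar point current source
    in a homogeneous volume conductor: V = I / (4*pi*sigma*r)."""
    return i_amp / (4 * np.pi * sigma * r)

def crosstalk_fraction(electrode_radius, pitch):
    """Rough ratio of the potential seen at a neighboring site (distance = pitch)
    to the potential near the source electrode (distance ~ its radius).
    Larger pitch or smaller radius both shrink this estimate."""
    i = 1e-6  # 1 uA test current; it cancels in the ratio
    return point_source_potential(i, pitch) / point_source_potential(i, electrode_radius)

# doubling the pitch halves this simple crosstalk estimate
fractions = {pitch: crosstalk_fraction(10e-6, pitch * 1e-6) for pitch in (100, 200, 400)}
```

In this toy model the fraction reduces to radius/pitch, so a 10 µm site at 100 µm pitch sees about 10% coupling, well above the 1% level the text flags as non‐negligible; real arrays involve insulation, shunt paths, and inhomogeneous tissue that this sketch ignores.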

2.3.3. Stability

The enduring functionality and effectiveness of neural interfaces are closely associated with the interface stability between electrodes and neural tissues. Challenges such as corrosion, dissolution, and material swelling at these electrode interfaces significantly impact the durability and operational performance of implants.[ 22 , 116 ] It is crucial to maintain interface stability and high‐quality electrochemical properties for long‐lasting and effective signal recording and neural stimulation. In response, material engineering has been a focal point. Conductive polymers like polypyrrole (PPy), polyaniline (PANI), poly(3,4‐ethylenedioxythiophene) (PEDOT), and poly(3‐hexylthiophene) (P3HT), renowned for their enhanced electrochemical properties and structural stability, are utilized to prolong the life and reliability of neural interfaces by mitigating interface degradation.[ 19 , 117 ] Polydopamine (PDA) can enhance neural interfaces through its biocompatibility and adhesion, promoting electrode–tissue integration, supporting functionalization for neuron growth, reducing inflammation, and synergizing with conductive polymers to form a robust, biocompatible interface.[ 19 , 117 ] Moreover, the use of biomolecules, such as zwitterionic polymers for antifouling coatings,[ 118 ] has significantly improved interface stability by effectively minimizing interface degradation. Additionally, electrochemical copolymerization can also be employed to create coatings that enhance electrode performance and durability.[ 119 ] Furthermore, the development of biodegradable and flexible electrode materials like polylactic acid (PLA), polyglycolic acid (PGA), and polycaprolactone (PCL) offers adaptability to tissue deformation.[ 120 ] Their biodegradability also contributes to minimizing adverse reactions, further enhancing interface functionality.

2.4. Monitoring EEG signals in real‐world applications beyond healthcare

EEG, which captures brain electrical activity using scalp‐placed electrodes, is becoming central to brain‐to‐device interactions, providing insights into brain activity and connectivity. EEG analyses are multi‐step processes including recording and preprocessing brain signals[ 121 , 122 ] (Figure 4A), identifying the power and specific frequency bands of these signals by band‐power estimation[ 121 ] (Figure 4B), and delving into connectivity across distinct brain regions[ 123 , 124 ] (Figure 4C). The preprocessing of raw EEG data typically involves the use of EEGLAB, along with functions like “eegfilt”, to filter out noise and artifacts. In this process, “eegfilt” is specifically used for band‐pass filtering, which helps in reducing edge artifacts and in obtaining more accurate EEG readings. Following preprocessing, LORETA source localization is generally applied to identify the origins of the electrical activity within the brain, which helps in pinpointing the specific brain regions associated with the recorded electrical signals (Figure 4B). Additionally, the Hilbert transform can extract features from the EEG data, such as the characteristic frequencies of brain waves, instantaneous amplitudes, and phases. The outcomes obtained from EEG data analysis typically include power spectral density plots and topographical maps. The power spectral density plots demonstrate the distribution of signal power across various frequencies, while the topographical maps visually represent the spatial distribution of this power across the scalp. These steps are crucial for analyzing task‐state EEG in the time domain, particularly for identifying event‐related potentials (ERPs), which are distinct waveforms in EEG data that respond to specific stimuli. Conversely, the analysis of resting‐state EEG is predominantly centered on power spectrum analysis to identify variations in power across different frequency bands.
Synchrony measures, which utilize coupling functions and simulated histograms, elucidate the regularity and synchronization of neuronal firing, providing a deep understanding of the brain oscillatory dynamics.
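The band‐pass filtering and Hilbert‐transform steps above can be sketched in Python with SciPy as a rough analogue of EEGLAB's `eegfilt`. The sampling rate, band edges, and synthetic signal are illustrative assumptions:

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert, welch

rng = np.random.default_rng(0)
fs = 250.0  # assumed sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)
# synthetic "EEG": a 10 Hz alpha component buried in broadband noise
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)

def bandpass(x, lo, hi, fs, order=4):
    """Zero-phase band-pass filter, similar in spirit to EEGLAB's eegfilt."""
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x)  # forward-backward filtering avoids phase distortion

alpha = bandpass(eeg, 8, 13, fs)
analytic = hilbert(alpha)        # analytic signal via the Hilbert transform
inst_amp = np.abs(analytic)      # instantaneous amplitude
inst_phase = np.angle(analytic)  # instantaneous phase

# power spectral density, the input to band-power estimates and topographic maps
freqs, psd = welch(eeg, fs=fs, nperseg=512)
```

The instantaneous amplitude and phase extracted here are exactly the quantities that feed the synchrony and coupling measures discussed next.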

FIGURE 4.

Neural interfaces monitoring and modulation. (A–C) EEG processing for neural signal monitoring. (A) Raw data collection. Reproduced with permission.[ 121 ] Copyright 2013, Society for Neuroscience. (B) Band‐power estimation including Loreta localization, spectral power, and further calculation for the theta/alpha/beta bands. Reproduced with permission.[ 121 , 123 ] Copyright 2013, Society for Neuroscience; and permission.[ 48 ] Copyright 2023, Elsevier. (C) Connectivity analyses. (I) Synchrony measures including connectivity matrices, phase‐difference estimation, and classification recognition. (II) EEG network analysis including group network connectivity and partial directed coherence (PDC) interactions. Reproduced with permission.[ 123 , 124 , 134 ] Copyright 2022, Springer; 2023, Elsevier; and 2018, PLoS. (D) ECoG for neural signal monitoring. Reproduced under the terms of the CC‐BY Creative Commons Attribution 4.0 International license (https://creativecommons.org/licenses/by/4.0).[ 145 , 148 ] Copyright 2023, The Authors, published by Springer Nature; and reproduced with permission.[ 69 ] Copyright 2011, IEEE. (I) ECoG grid and self‐driving application; (II) Gamma activity modulation of ECoG rhythms in local cortical processing. (III) Sleep fMRI: ECoG and BOLD signals. SWR: sharp wave ripples; NREM: non‐rapid eye movement; AW: awake. (E) MEA electrode. (F) DBS electrophysiological mechanism (calcium waves formation and gliotransmitters release with arteriole dilation and increased blood flow). (G) Closed‐loop DBS control. Reproduced with permission.[ 90 , 170 ] Copyright 2021 and 2019, Springer Nature. (I) Closed‐loop close model. (II) Sensing and stimulation through the same DBS electrodes. (III) Utilizing exterior sensing devices and stimulating.

Connectivity matrices illustrate the strength and pattern of connections between various brain regions, and phase‐difference estimation highlights the phase relationships between EEG signals, offering insights into the timing of information transfer across these regions (Figure 4C). Connectivity analysis in EEG typically includes both undirected and directed connections. For undirected connectivity, measures such as coherence, phase lock value (PLV), and mutual information are utilized. These assess the degree of synchronization or shared information between different brain regions without specifying the direction of information flow. Directed connections, which provide insights into the directionality of information flow within the brain neural network, are analyzed using methods like the phase slope index and Granger causality‐based indicators. The Mean Vector Length Modulation Index (MVL‐MI) is noteworthy within the broad connectivity framework for understanding complex neural interactions. It quantifies the coupling between the phase of low‐frequency oscillations and the amplitude of high‐frequency activity, thus offering a metric for amplitude‐phase coupling. Similarly, the PLV is used for measuring phase synchronization between EEG signals from different brain regions, revealing the coherence of neuronal oscillatory activities. Furthermore, classification recognition is utilized to interpret these data patterns and classify different brain states or responses, which is vital for applying EEG in diagnostic and monitoring scenarios, where accurate interpretation of brain activity is essential.
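The PLV and MVL‐MI metrics described above have standard definitions that can be sketched as follows; the synthetic theta/gamma signals and coupling strength are illustrative assumptions:

```python
import numpy as np
from scipy.signal import hilbert

def plv(x, y):
    """Phase-locking value between two narrow-band signals (0 = none, 1 = perfect locking)."""
    dphi = np.angle(hilbert(x)) - np.angle(hilbert(y))
    return np.abs(np.mean(np.exp(1j * dphi)))

def mvl_mi(phase_sig, amp_sig):
    """Mean Vector Length Modulation Index: couples the phase of a low-frequency
    signal to the amplitude envelope of a high-frequency signal."""
    phase = np.angle(hilbert(phase_sig))   # phase of the slow oscillation
    amp = np.abs(hilbert(amp_sig))         # envelope of the fast oscillation
    return np.abs(np.mean(amp * np.exp(1j * phase))) / np.mean(amp)

fs = 500.0
t = np.arange(0, 5, 1 / fs)
theta = np.sin(2 * np.pi * 6 * t)  # 6 Hz "theta" phase signal
# 60 Hz "gamma" whose amplitude follows the theta cycle (coupled case)
gamma_coupled = (1 + 0.8 * np.sin(2 * np.pi * 6 * t)) * np.sin(2 * np.pi * 60 * t)
gamma_flat = np.sin(2 * np.pi * 60 * t)  # constant-amplitude control

mi_coupled = mvl_mi(theta, gamma_coupled)
mi_flat = mvl_mi(theta, gamma_flat)
```

The coupled case yields a clearly larger modulation index than the flat control, which is how MVL‐MI flags amplitude‐phase coupling between regions or bands.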

A “status × condition” analysis represents an EEG analytical framework, in which various statuses (such as different groups or states of subjects) are mapped against specific conditions, including experimental manipulations, environmental factors, or task conditions. This framework usually arises from statistical analysis or computational modeling and aims to elucidate how different brain states are modulated under varying conditions. In this process, matrices are formed that might display the strength of EEG signals, connectivity measures, or statistical outputs from regression analyses. Analyzing these matrix patterns is crucial for understanding how different conditions differentially impact various statuses, thereby offering insights into the complex correlations between brain activity and behavior. In contrast, partial directed coherence (PDC) is employed to assess the directional flow of information between different brain regions. It plays a key role in unraveling the pathways of communication within the brain during various tasks, states, or in response to stimuli. PDC, therefore, is instrumental in providing a comprehensive view of brain dynamics by combining the effects of specific stimuli or conditions with the intricate network of neural interactions in the brain.

Diverse wave patterns detected by EEG are key to uncovering the nuances of neural dynamics. For instance, the alpha band (7−14 Hz), prevalent during relaxation with closed eyes, plays a pivotal role in treatments like vagus nerve stimulation (VNS). Shifts in alpha rhythms during VNS can reflect the effectiveness of epilepsy and depression treatments.[ 123 , 125 , 126 , 127 , 128 , 129 ] On the other hand, the beta frequencies (15−30 Hz) represent alert cognitive states and intersect with attention‐requiring tasks. ERPs, notably the P300 signal that emerges approximately 300 ms post‐visual stimulus, hold significant BMI implications.[ 130 , 131 ] Furthermore, theta waves (4−7 Hz) often indicate drowsiness or meditation and correlate with learning and memory, while gamma (30−100 Hz) resonates with high‐order cognitive tasks and information solidification.[ 132 , 133 ]
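As a rough illustration of band‐power estimation over the frequency ranges quoted above (the synthetic signal, sampling rate, and Welch parameters are assumptions for demonstration):

```python
import numpy as np
from scipy.signal import welch

# band edges as given in the text (alpha follows the 7-14 Hz convention used above)
BANDS = {"theta": (4, 7), "alpha": (7, 14), "beta": (15, 30), "gamma": (30, 100)}

def band_powers(x, fs):
    """Integrate the Welch power spectral density over each named frequency band."""
    freqs, psd = welch(x, fs=fs, nperseg=1024)
    df = freqs[1] - freqs[0]
    return {name: psd[(freqs >= lo) & (freqs < hi)].sum() * df
            for name, (lo, hi) in BANDS.items()}

rng = np.random.default_rng(0)
fs = 250.0
t = np.arange(0, 20, 1 / fs)
# eyes-closed-like trace: a strong 10 Hz alpha rhythm over broadband noise
x = 2 * np.sin(2 * np.pi * 10 * t) + rng.standard_normal(t.size)
powers = band_powers(x, fs)
```

A relaxation‐oriented application would look for `powers["alpha"]` dominating the other bands, while alert‐interaction scenarios would weight the beta estimate instead.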

Within the sprawling blueprint of applications beyond healthcare, EEG has the potential to improve traffic safety by detecting driver fatigue and issuing timely alerts for rest breaks, thus preventing accidents and reducing traffic congestion.[ 97 , 134 , 135 ] Urban zones could be tailored to promote relaxation by analyzing alpha activity, while areas designated for alert interactions could utilize beta frequencies. Analyzing collective neural responses with different frequency band activities can refine urban design.[ 136 ] Within this framework, time‐frequency spectra, event‐related spectral perturbations, and scalp maps[ 137 , 138 , 139 , 140 ] serve as powerful methods to decode intricate brain dynamics. The prevailing appeal of EEG is its non‐invasive nature, with potential spanning from helping paralyzed individuals control wheelchairs to enabling hands‐free computer interaction. Furthermore, advances in wearable EEG, such as dry and capacitive in‐ear electrodes with integrated circuits,[ 141 , 142 ] herald a future where BCI is accessible and portable.

2.5. ECoG: Monitoring and modulation in practical implementations

ECoG, with its high‐resolution and high‐density electrode arrays spaced mere hundreds of microns apart, offers enhanced signal detection and a broad frequency spectrum, significantly reducing noise compared to EEG (Table 1). This precision enables applications like speech prosthetics,[ 143 ] which convert ECoG signals from speech articulation into reproduced speech. Unlike EEG, ECoG requires surgical implantation of electrodes either above or beneath the dura mater, directly under the skull, capturing localized and high‐quality signals[ 144 , 145 , 146 , 147 ] (Figure 4D). This subdural placement is effective in detecting high‐frequency oscillations, especially within the gamma‐band range of 30−150 Hz. Additionally, the modulation by rhythm phase in these activities is promising for applications such as intraoperative cortical mapping[ 148 ] (Figure 4D). The intricacies of ECoG signals are highlighted by the contrast between high‐frequency (>40 Hz) and low‐frequency oscillations (<40 Hz). Low‐frequency oscillations are notable for their large amplitude and long propagation distance. Their phase could reflect brain connections and is instrumental in the mechanism of information communication across brain regions. These low‐frequency oscillations, associated with states of consciousness and relaxation, can be adjusted by external interventions. In contrast, high‐frequency oscillations, while characterized by low amplitude and short propagation distance, represent different brain activities. These oscillations, particularly in the high‐frequency gamma range, are associated with the activation of local brain areas. The power of these high‐frequency waves is inversely related to their amplitude and is crucial in understanding local brain dynamics. High‐frequency oscillations are typically linked to complex cognitive functions like attention, memory, sensory perception, and inter‐brain region connectivity. 
Generally, high‐frequency brain signals, unlike low‐frequency oscillations, are inherently generated internally and are less susceptible to external modulation, serving mainly as indicators of various cognitive states.
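The low‐ versus high‐frequency contrast above can be made concrete with a toy band‐power computation. The sketch below is purely illustrative: it applies a naive DFT to a synthetic one‐second signal containing a large 3 Hz slow wave and a small 80 Hz gamma component. The function name `band_power` and all numeric parameters are invented for this example; real ECoG pipelines use FFT libraries, windowing, and proper filtering.

```python
import math

def band_power(signal, fs, f_lo, f_hi):
    """Average power across DFT bins whose frequency lies in [f_lo, f_hi) Hz.
    Naive O(n) per bin; real pipelines use FFTs and proper windowing."""
    n = len(signal)
    total, count = 0.0, 0
    for k in range(1, n // 2):
        freq = k * fs / n
        if f_lo <= freq < f_hi:
            re = sum(signal[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
            im = sum(signal[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
            total += (re * re + im * im) / n
            count += 1
    return total / count if count else 0.0

fs = 512  # Hz (hypothetical sampling rate)
# 1 s synthetic "ECoG": large-amplitude 3 Hz slow wave + small-amplitude 80 Hz gamma
sig = [2.0 * math.sin(2 * math.pi * 3 * i / fs) + 0.3 * math.sin(2 * math.pi * 80 * i / fs)
       for i in range(fs)]

low = band_power(sig, fs, 1, 40)      # low-frequency oscillations (<40 Hz)
gamma = band_power(sig, fs, 40, 150)  # gamma-band activity (40-150 Hz)
```

On this synthetic signal the low‐frequency band carries far more average power than the gamma band, mirroring the large‐amplitude/small‐amplitude distinction described above.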

TABLE 1.

Comparison and summary of different neural interfaces in the brain for smart city applications.

Electrode location: EEG, scalp; ECoG, cortical surface; MEA a, subcortical regions; DBS, deep brain.
Main utilized signal frequency: EEG, alpha (8−13 Hz) for relaxation and treatment efficacy b such as vagus nerve stimulation, and beta (13−30 Hz) for alertness and active thought; ECoG, low‐frequency (1−4 Hz) for sleep‐state analysis, high‐frequency (140−165 Hz) for specialized modeling, and gamma (30−150 Hz) in specific cortical regions for cognitive analysis or sensory perception; MEA, local field potential (<200 Hz) and neuronal spikes (0.1−7 kHz); DBS, high‐frequency stimulation (>70 Hz) for targeted modulation, with closed‐loop control typically in the beta range (13−30 Hz).
Potential applications: EEG, real‐time feedback, neurofeedback therapy, sleep monitoring, treatment evaluation (EEG synchronization) c, connectivity analysis, traffic safety, self‐driving vehicles, and cognitive enhancement; ECoG, sleep monitoring and enhancement, smart homes, self‐driving vehicles, direct brain‐machine communication, movement prediction, mood/text/voice decoding, epilepsy management, and neural disease detection; MEA, neuroprosthetics, cognitive modeling, mood/text/voice decoding, movement prediction, drug delivery, and neural rehabilitation; DBS, motor/limb control, fMRI imaging, mood enhancement, treatments for neurological conditions, robotics and assistive devices, connectivity analyses, cognitive therapies, cognitive learning, mobility treatment, and rehabilitation.
Advantages: EEG, non‐invasive and real‐time monitoring; ECoG, high spatial resolution and localized neural activity; MEA, high spatial and temporal resolution, recording and stimulation, compactness (Utah), and adjustable features (Michigan); DBS, targeted modulation of specific brain regions.
Disadvantages: EEG, limited spatial resolution and noise susceptibility; ECoG, semi‐invasive procedure and risks associated with brain‐surface placement; MEA, invasive, with surgical implantation risks, biocompatibility concerns, and potential limitations with high‐frequency synaptic transmission; DBS, invasive, with surgical implantation and stimulation‐related risks.
a MEA refers specifically to subcortical microelectrode arrays in this work, while cortical surface electrodes are termed ECoG.

b Anomalies in alpha wave patterns can serve as indicators for diagnosing neurological disorders or identifying cognitive variations.

c Coordinated oscillations of electrical activity in different brain regions often indicate effective communication between those areas or a specific cognitive state.

In future intelligent cities or smart homes, monitoring high‐frequency gamma activity, linked to cognitive functions like attention and memory, could indicate when residents are deeply engaged in mental tasks, prompting the system to optimize the environment for concentration by adjusting lighting and temperature and reducing distractions. Low‐frequency oscillations, associated with relaxation or consciousness, could be leveraged by smart home systems to induce relaxation or alertness. For instance, in the evenings, the system could enhance relaxation through environmental adjustments such as dimming the lights and playing soothing music, thereby improving sleep quality. During mornings, or when increased alertness is necessary, the environment could be adjusted to energize residents, perhaps through changes in lighting or ambient sound. This integration of ECoG in smart homes goes beyond simple task automation and energy efficiency,[ 149 ] aiming to develop living environments that align with the mental and emotional states of residents, potentially transforming how we interact with our surroundings. Additionally, ECoG is crucial in sleep monitoring, enabling a deep understanding of sleep disorders.[ 145 , 150 ] By analyzing functional connectivity matrices across frequency bands, such as 1−4 and 140−165 Hz, predictive models for sleep‐state transitions can be developed.[ 151 ] Advanced prediction models, such as those built on long short‐term memory (LSTM) units, have exhibited high precision, particularly in brain regions affecting sleep such as the medial mammillary nucleus and the ventral thalamus.[ 145 , 152 ] Complementing this, blood oxygen level‐dependent (BOLD) signals in functional magnetic resonance imaging (fMRI) provide a holistic view of brain dynamics during sleep transitions[ 145 , 153 , 154 ] (Figure 4D).
Furthermore, integrating ECoG into autonomous driving technologies could facilitate transitions between manual and automated driving modes (Figure 4D), ultimately reducing traffic congestion.
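At their simplest, the functional connectivity matrices mentioned above are channel‐by‐channel correlation matrices, computed separately per frequency band. The sketch below illustrates the idea on synthetic data using Pearson correlation; the band‐pass filtering and the LSTM prediction stage are omitted, and all names and values are hypothetical.

```python
import math
import random

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length channels."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def connectivity_matrix(channels):
    """Symmetric channel-by-channel correlation matrix; a full pipeline would
    compute one such matrix per frequency band after band-pass filtering."""
    k = len(channels)
    return [[pearson(channels[i], channels[j]) for j in range(k)] for i in range(k)]

random.seed(0)
base = [random.gauss(0, 1) for _ in range(200)]
ch1 = base
ch2 = [b + random.gauss(0, 0.3) for b in base]   # strongly coupled to ch1
ch3 = [random.gauss(0, 1) for _ in range(200)]   # independent channel
C = connectivity_matrix([ch1, ch2, ch3])
```

In this toy example the coupled channel pair shows a high off‐diagonal entry while the independent channel's entries stay near zero, which is the kind of structure a downstream sleep‐state predictor would consume.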

Leveraging technologies like fMRI and ECoG gamma activity markers can usher in a new era of applications in healthcare, emotion detection, urban safety, and smart home automation. Short‐term ECoG applications, particularly those focused on diagnosing and managing seizures, have been approved. However, the long‐term safety of ECoG for BMI applications remains under investigation. The transition of ECoG from research to real‐world applications requires cooperation involving industrial production, comprehensive clinical trials, and rigorous regulatory oversight.

2.6. MEA: Recording and neural modulation in practical implementations

Originating from a single‐electrode system akin to the patch clamp for monitoring bioelectric activity in neurons, MEAs have evolved into devices capable of simultaneous recordings from multiple electrode arrays[ 155 , 156 ] (Figure 4E). Traditional silicon‐based MEAs, notably the Utah and Michigan electrodes, have shaped neurophysiology over the past few decades owing to their high spatial and temporal resolution. Utah arrays excel in their compactness, while Michigan electrodes stand out with their adjustable features, adept at capturing signals across varying depths and ranges (Figure 4E). Because they penetrate neural tissue, these tools are essential for capturing detailed electrophysiological signals, including local field potentials (LFPs), typically below 200 Hz, which reflect the collective synaptic potentials of neuron groups, and neuronal spikes (0.1−7 kHz), which offer insights into individual neuronal activity.[ 157 ] The integration of both spiking activity and LFPs into future BMIs promises the creation of dexterous prostheses, paving the way for complex tasks like reach, grasp, and intricate finger movements to become commonplace. By leveraging data from MEAs, urban spaces can potentially evolve to be adaptive to their inhabitants. This transformative potential opens pathways for groundbreaking enhancements in healthcare, notably in precisely targeted drug delivery systems, personalized neural rehabilitation programs, and the development of neural interfaces that make prosthetic limbs feel natural and intuitive.[ 158 , 159 ]

2.7. DBS: Stimulation and neural modulation in practical implementations

DBS, which targets specific neurons, is generally used to treat neurological conditions such as Parkinson's disease.[ 160 , 161 , 162 , 163 ] Assisted by neuroimaging and targeting techniques, DBS is refining its spatial precision and increasingly focusing on temporal sequencing to enhance treatment efficacy.[ 164 , 165 ] Additionally, emerging DBS technologies align with IoT and virtual reality trends by employing encrypted telemetry for wireless data transfer and cloud‐based controls, which extend their uses in real‐world applications beyond healthcare.[ 166 ]

2.7.1. DBS therapy

DBS, employing high‐frequency stimulation typically over 70 Hz, is precisely targeted to specific brain regions like the subthalamic nucleus, commonly associated with Parkinson's disease treatment.[ 165 ] DBS operates on multiple scales to modulate neural activity, spanning from molecular interactions to broad neuronal network dynamics. On a molecular level, the implanted DBS electrode generates an electrical field that influences voltage‐sensitive sodium channels in neuronal membranes[ 167 ] (Figure 4F). This stimulation leads to the opening of these channels and the propagation of action potentials along axons. Despite facing challenges such as limited synaptic transmission with high‐frequency signals, DBS effectively serves as a synaptic filter, preventing the spread of abnormal or pathological neural activity, particularly within sensory and motor regions. At the broad neuronal network level, the efficacy of DBS emerges in its modulation of specific neural circuits. For instance, while the thalamus receives inputs from the basal ganglia, it preferentially transmits only those that synchronize with the high‐frequency signals produced by DBS.[ 167 , 168 ] This selectivity allows DBS to suppress low‐frequency oscillations without causing widespread network disruption, thereby minimizing its impact on neural plasticity and alleviating symptoms such as akinesia, rigidity, tremor, and dystonia.[ 169 ]

2.7.2. Closed‐loop DBS control

An open‐loop DBS system delivers electrical stimulation without feedback or adjustment based on the outcome. In contrast, a closed‐loop system continuously monitors outcomes to adaptively modify the control action (Figure 4G), thereby enhancing effectiveness and reducing side effects.[ 170 , 171 , 172 , 173 ] The closed‐loop approach initially focuses on passive sensing and the identification of specific neural biomarkers. For instance, upon detecting the gamma biomarker in the amygdala (neural activity within the gamma frequency range, often measured via EEG or LFP and indicative of conditions such as anxiety and depression), the closed‐loop system activates stimulation. Within the realm of neural signals, LFP in the beta range (13−30 Hz) is a hallmark of rigidity and bradykinesia, whereas gamma‐band oscillations, particularly from cortical strip electrodes, are indicative of dyskinesia. In closed‐loop DBS, two control approaches are prominent[ 90 ] (Figure 4G). The first uses the DBS electrodes for both sensing and stimulation, relying on rhythmic neural signals in either the gamma or beta range to guide stimulation intensity. The second employs external sensors to monitor disease symptoms, which are then fed back to the implanted stimulator to adjust stimulation timing.
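The first control approach, sensing and stimulating through the same electrodes, can be caricatured as a threshold controller on a beta‐band biomarker. The sketch below is a deliberately simplified illustration, not a clinical algorithm: the amplitude proxy, threshold, step size, and clamping range are all invented, and a real device would band‐pass filter the LFP to 13−30 Hz and use a validated control law.

```python
def beta_amplitude(lfp_window):
    """Crude proxy for beta-band amplitude: mean absolute deviation of the
    window. A real system would band-pass filter to 13-30 Hz first."""
    m = sum(lfp_window) / len(lfp_window)
    return sum(abs(v - m) for v in lfp_window) / len(lfp_window)

def closed_loop_step(lfp_window, stim_amp, threshold=1.0, step=0.1,
                     lo=0.0, hi=3.0):
    """One control iteration: raise stimulation when the beta biomarker
    exceeds the threshold, lower it otherwise, clamped to a safe range."""
    marker = beta_amplitude(lfp_window)
    if marker > threshold:
        stim_amp += step
    else:
        stim_amp -= step
    return max(lo, min(hi, stim_amp)), marker

# A high-beta window drives stimulation up; a quiet window drives it back down
noisy = [2.0, -2.0] * 50
quiet = [0.1, -0.1] * 50
amp = 1.0
amp, m1 = closed_loop_step(noisy, amp)   # biomarker above threshold
amp, m2 = closed_loop_step(quiet, amp)   # biomarker below threshold
```

The second approach in the text differs only in where the biomarker comes from: an external symptom sensor would replace `beta_amplitude` while the same adjust‐and‐clamp loop remains.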

The efficacy of DBS is influenced by several factors, including the frequency and intensity of stimulation as well as the inherent physiological and anatomical characteristics of the targeted region.[ 174 ] For instance, low‐frequency DBS below 30 Hz can increase beta oscillations in the subthalamic nucleus. Beta oscillations, especially within the 13−30 Hz range,[ 175 ] can disrupt normal neural communication, resulting in behavioral anomalies, and have therefore gained significant attention as a metric for assessing the clinical condition of patients. Through closed‐loop control, typically in the beta range, adjusting stimulation based on the amplitude of the LFP signal can enhance the effectiveness of DBS treatments compared to conventional methods. Moreover, the integration of DBS with IoT technologies heralds a new era in real‐time monitoring, allowing treatments to be fine‐tuned based on extensive sensor data. Such a data‐centric approach not only improves DBS efficacy but also paves the way for more timely and adaptive care tailored to individual needs and the diverse applications of closed‐loop controls. The capacity of DBS to precisely modulate neural dynamics (the temporal patterns of neural signaling and connections) opens intriguing possibilities for practical implementations, including adaptive public services and personalized urban experiences that can potentially respond in real time to individual cognitive and emotional states.

2.7.3. DBS frequency modulation and MRI imaging

DBS is primarily used for its therapeutic effects on movement disorders, requiring accurate targeting of specific brain areas. In its initial phases, structural MRI is critical for identifying neuroanatomical landmarks. This targeting is further refined through fMRI, essential for ensuring precise and effective modulation of DBS frequencies. Figure 5A illustrates the integration of neural activation models with neuroimaging methods like diffusion tensor imaging, fMRI, and connectomic targeting imaging to optimize DBS‐induced signal variations,[ 176 , 177 , 178 ] thereby facilitating predictions of treatment responses in movement disorders.

FIGURE 5.

FIGURE 5

DBS frequency modulation and MRI imaging. (A) DBS electrodes into the striatum (red). Reproduced under the terms of the Creative Commons Attribution‐Share 4.0 International License. Copyright, The Authors. (B) DBS connectivity. Reproduced with permission.[ 179 ] Copyright 2017, Wiley‐VCH. (C) Time‐frequency representation of LFP during high‐frequency DBS of ipsilateral subthalamic nucleus. Reproduced with permission.[ 174 ] Copyright 2007, Springer Nature. (D) DBS connectomic targeting of the ventral nucleus (pink) and beta oscillations. Reproduced with permission.[ 175 , 176 ] Copyright 2022, Frontiers; and Copyright 2020, IOP Publishing. (E) Low‐frequency DBS (<30 Hz) enhancing LFP power. Reproduced with permission.[ 62 ] Copyright 2019, Springer Nature. (F) DBS reducing beta‐band activity in the subthalamic nucleus. Reproduced with permission.[ 68 ] Copyright 2007, Springer Nature. (G) Ultra‐high‐field MRI for pain sensations imaging: Scans of amputees revealed activation of neurons associated with the movement of missing digits; and sensorimotor plasticity by BMI training for pre‐ and post‐BMI comparison. Reproduced under the terms of the CC‐BY Creative Commons Attribution 4.0 International license (https://creativecommons.org/licenses/by/4.0).[ 181 , 183 ] Copyright 2016, The Authors, published by Elife and Springer Nature.

Despite its promise, this method faces challenges such as the extended duration of fMRI scans and the need for specialized analytical skills, which limit its prevalence in current clinical settings. Topographical visualization of neural activity in the brain, which highlights regions involved in specific functions[ 179 ] (Figure 5B), is crucial for precise electrode placement in DBS. This spatial mapping, correlating brain functions with exact anatomical locations, ensures targeted electrical stimulation of neural areas, essential for effectively treating movement disorders by modulating dysfunctional neural circuits. Figure 5C illustrates the variations in neural oscillations within the beta frequency band over time and frequency.[ 174 ] These beta oscillations, closely associated with motor control, can be altered through DBS to improve motor function in conditions like Parkinson's disease. Spectrograms serve as crucial tools for clinicians to visualize these oscillations and make informed decisions about the most effective DBS settings for individual patients. By adjusting the stimulation parameters in response to the unique patterns of neural activity observed in the spectrogram, the treatment can be personalized to optimize therapeutic outcomes.
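A spectrogram of the kind clinicians inspect is, in essence, band power computed window by window over time. The toy sketch below tracks a single beta‐band bin (20 Hz) across two one‐second windows of a synthetic LFP, showing how a beta burst appears as a jump in power; the naive per‐bin DFT and all parameters are invented for this illustration.

```python
import math

def dft_power(window, k):
    """Power at DFT bin k of a short window; naive O(n) per bin."""
    n = len(window)
    re = sum(window[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
    im = sum(window[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
    return (re * re + im * im) / n

def spectrogram(signal, win_len, bins):
    """Rows = consecutive time windows, columns = power at the requested bins."""
    rows = []
    for start in range(0, len(signal) - win_len + 1, win_len):
        window = signal[start:start + win_len]
        rows.append([dft_power(window, k) for k in bins])
    return rows

fs = 64  # Hz (hypothetical); with win_len = fs, bin k corresponds to k Hz
# Beta-like 20 Hz burst in the second half of a 2 s recording, silence before
sig = [0.0] * fs + [math.sin(2 * math.pi * 20 * i / fs) for i in range(fs)]
S = spectrogram(sig, win_len=fs, bins=[20])  # track the 20 Hz bin over time
```

Here the first row (pre‐burst) has near‐zero 20 Hz power and the second row shows the burst, the temporal pattern a clinician would read off a real time‐frequency plot.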

Connectomic targeting imaging combines neural activation with neuroimaging, improving electrode placement precision[ 176 , 178 ] (Figure 5D). Notably, DBS at 3 V shows significant shifts in beta oscillations[ 175 ] (Figure 5D), which could offer insights into Parkinson's disease management. Modulating the power of LFPs across different DBS frequencies reveals complex neural response patterns, providing a detailed view of brain electrical activity[ 174 ] (Figure 5E). LFP modulation indicates the brain's immediate response to stimulation and sheds light on the mechanisms by which DBS exerts its effects. By adjusting DBS frequencies and observing the resultant LFP power changes, clinicians can more accurately target therapeutic interventions to the neural basis of movement disorders. Reversible beta oscillation shifts in response to DBS further highlight its potential to tailor neural activities[ 167 ] (Figure 5F).

High‐field MRI, particularly at 3 Tesla, is the standard in clinical imaging for its high‐resolution capabilities, while ultra‐high‐field MRI at 7 Tesla is primarily a research tool that offers detailed images of brain structure and function.[ 180 ] These imaging modalities, especially when combined with fMRI, provide invaluable insights into brain regions activated by DBS, enhancing our understanding and management of movement disorders and sensory deficits. For instance, ultra‐high‐field MRI captures the activation patterns in amputees, such as the movement of missing digits, shedding light on the neuronal basis of phantom limb pain and the potential for sensorimotor plasticity through BMI training[ 181 , 182 ] (Figure 5G). Furthermore, the digit topography revealed by this imaging, characterized by inter‐digit overlaps and digit selectivity, influences tactile interface design for practical applications beyond healthcare,[ 183 , 184 ] particularly in developing sensory applications in smart cities.

3. NEURAL DECODING

Neural decoding acts as a computational link between the human brain and external devices, employing algorithms and high‐density electrodes to interpret neural signals.[ 185 , 186 ] Beyond command recognition, neural decoding has the potential to adapt urban environments according to the emotional and cognitive states of their inhabitants. Envision public information kiosks translating brain signals directly into text or voice, facilitating immediate, hands‐free access to vital information.

3.1. Optimized neural decoding algorithms

Advancements in neuroscience and machine learning have led to optimized neural decoding, facilitating efficient communication between devices and the brain.[ 187 ] For individuals suffering from degenerative motor diseases, the decoding of brain neural signals is essential, as it transforms neural signals into understandable outputs.[ 185 , 188 ] The Filter Bank Common Spatial Pattern (FBCSP) method, an enhancement over standard CSP, employs a filter bank to obtain features across multiple frequency bands[ 189 , 190 ] (Figure 6A). This leads to improved BCI accuracy and has applications ranging from neurorehabilitation to immersive gaming and virtual reality experiences.[ 191 ] Independent component analysis (ICA) is another technique that enhances EEG or ECoG signal decoding by separating pure signals from noise[ 192 , 193 ] (Figure 6B). It increases the signal‐to‐noise ratio, preserving valuable data while excluding disturbances. By isolating statistically independent cortical processes, ICA finds applications in neurology, cognitive neuroscience, and neuroengineering.[ 194 ] BCI systems that use classifiers such as artificial neural networks (ANN) or linear discriminant analysis (LDA)[ 195 , 196 ] can control humanoid robots through brain signals (Figure 6C). Integration with inputs from multiple sensors opens possibilities in rehabilitation, communication, and device control. Support vector machine (SVM) algorithms, known for their noise resilience and efficient data handling,[ 195 , 197 , 198 ] are instrumental in directing prosthetic devices and aiding individuals with motor and communication difficulties (Figure 6D).
Furthermore, deep learning algorithms, including LSTM, deep neural networks (DNN), deep belief networks (DBN), and convolutional neural networks (CNN), are valuable in BCI applications[ 199 , 200 , 201 , 202 , 203 ] (Figure 6E,F) owing to their strengths in feature identification and signal decoding, but they require careful optimization to prevent overfitting and to manage their complex training procedures. The evolution of these algorithms not only empowers individuals with enhanced capabilities but also reshapes urban experiences toward inclusivity and cutting‐edge technological integration.
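To make the feature‐extraction idea behind CSP‐style pipelines concrete: trials are typically classified by the log‐variance of filtered signals. The toy example below substitutes a nearest‐class‐mean rule for the LDA/SVM classifiers discussed above and skips the spatial and band‐pass filtering entirely; every name, trial, and value is synthetic, so this is a sketch of the principle rather than an FBCSP implementation.

```python
import math

def log_variance(x):
    """Log-variance is the standard feature extracted after CSP filtering."""
    m = sum(x) / len(x)
    return math.log(sum((v - m) ** 2 for v in x) / len(x))

def train_nearest_mean(features, labels):
    """Class-mean prototypes; a stand-in for the LDA/SVM classifiers."""
    groups = {}
    for f, y in zip(features, labels):
        groups.setdefault(y, []).append(f)
    return {y: sum(fs) / len(fs) for y, fs in groups.items()}

def classify(model, feature):
    """Assign the class whose prototype feature is nearest."""
    return min(model, key=lambda y: abs(model[y] - feature))

# Synthetic trials: "left" imagery with low signal variance, "right" with high
left = [[0.1 * math.sin(0.3 * i) for i in range(100)] for _ in range(5)]
right = [[1.5 * math.sin(0.3 * i) for i in range(100)] for _ in range(5)]
feats = [log_variance(tr) for tr in left + right]
labels = ["left"] * 5 + ["right"] * 5
model = train_nearest_mean(feats, labels)
pred = classify(model, log_variance([1.4 * math.sin(0.3 * i) for i in range(100)]))
```

A new high‐variance trial lands near the "right" prototype, which is the same variance‐based separation that CSP filters are designed to maximize before a real classifier is applied.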

FIGURE 6.

FIGURE 6

Optimized neural decoding. (A) Sparse FBCSP algorithm for motor‐related action recognition. Reproduced with permission.[ 190 ] Copyright 2015, Elsevier. (B) ICA of unmixing EEG channels to identify independent components. Reproduced with permission.[ 192 ] Copyright 2011, Oxford University Press. (C, D) Typical optimized decision boundaries to differentiate between the various classifiers. (C) LDA, and (D) SVM. Reproduced with permission.[ 195 ] Copyright 2020, MDPI. (E, F) Typical machine learning. (E) DNN and (F) DBN. Reproduced with permission.[ 199 ] Copyright 2020, Elsevier. (G) Representative ECoG decoding of text (I), reproduced with permission.[ 204 ] Copyright 2020, Springer Nature; and voice (II), reproduced with permission.[ 205 ] Copyright 2019, Springer Nature. (H) Representative MEA decoding for movement prediction. LSTM, long short‐term memory layer. Reproduced with permission.[ 209 ] Copyright 2018, Springer Nature.

3.2. Text, voice, and movement decoding

Algorithms such as recurrent neural networks (RNN) and LSTM have significantly enhanced our ability to decode neural signals.[ 200 , 201 ] These computational tools are instrumental in translating brain signals into actionable data, extending their applications beyond the traditional medical domain. One pivotal study demonstrated the capability of RNN decoding to transform ECoG signals into textual representations[ 204 ] (Figure 6G). Upon comprehensive model training, this approach has applications not just in medical devices like speech aids but also within the broad IoT framework of smart city infrastructures, facilitating an environment where citizens can seamlessly interact with integrated intelligent systems. Moreover, recent advancements have enabled the conversion of brain activity directly into audible speech[ 205 ] (Figure 6G). This holds promise for individuals who are speech‐impaired due to neurological conditions, and it opens the door for voice‐activated functionalities. Reflecting on the history of BCIs, these interfaces were originally developed to reinstate communication abilities in individuals with significant disabilities.[ 206 ] A notable example is the P300 speller, developed to enable patients to type text on a computer screen utilizing brain activities.[ 207 ] Paired with speech synthesizers, these technologies have evolved to potentially decode full sentences from minimally invasive brain recordings, suggesting a future of prostheses for speech restoration and privacy‐respectful communication forms like silent‐speech interfaces.[ 208 ]

In addition, strides have been made in developing an MEA‐based system specifically for movement decoding[ 209 , 210 ] (Figure 6H). This system can translate neural signals into accurate movement predictions, offering benefits ranging from assisting people with motor disabilities to enhancing interactions with robotic assistive systems in urban settings. Various machine learning algorithms, such as regression models, linear classifiers, DNN, and SVM, have been utilized to achieve high‐accuracy discrimination of movement intentions. This includes both broad and precise motor movements, as demonstrated in able‐bodied and paralyzed participants using ECoG electrodes and Utah arrays.[ 211 ]
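A minimal instance of the regression‐based movement decoders mentioned above is an ordinary least‐squares fit from firing rate to cursor velocity. The sketch below is one‐dimensional and uses fabricated training data purely for illustration; real decoders regress against hundreds of channels, often with regularization and temporal smoothing.

```python
def fit_linear(rates, velocities):
    """Ordinary least squares for v = a * rate + b, the simplest linear decoder."""
    n = len(rates)
    mr = sum(rates) / n
    mv = sum(velocities) / n
    a = sum((r - mr) * (v - mv) for r, v in zip(rates, velocities)) / \
        sum((r - mr) ** 2 for r in rates)
    b = mv - a * mr
    return a, b

def decode(model, rate):
    """Map an observed firing rate to a predicted velocity."""
    a, b = model
    return a * rate + b

# Fabricated training data: cursor velocity grows with firing rate (spikes/s)
rates = [5, 10, 15, 20, 25]
vels = [1.0, 2.0, 3.0, 4.0, 5.0]  # cm/s
model = fit_linear(rates, vels)
v_hat = decode(model, 12)  # predicted velocity for an unseen rate
```

The same fit‐then‐decode structure underlies the higher‐dimensional regression decoders used with Utah arrays, just with a matrix of weights in place of the single slope.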

3.3. Cognitive therapies and treatments

In recent years, BCIs have gained prominence as potential therapeutic tools for neuropsychiatric conditions like depression and anxiety.[ 212 ] A major challenge lies in accurately decoding mood states, which is crucial for both diagnosis and treatment. Recent strides in closed‐loop DBS treatments have been pivotal in addressing this challenge. Further, the integration of wirelessly transmitted BCIs into smart city infrastructures offers promising avenues for mood decoding, stress reduction through mindfulness, and cognitive learning and training.

3.3.1. Mood decoding

Accurate mood decoding is crucial for the effective treatment of mood disorders, but real‐time tracking of emotional states remains a significant challenge due to the complex interactions within neural systems.[ 213 , 214 ] Advances in neuroimaging provide insights into the neural basis of emotional responses, yet the intricate dynamics within the cortical and limbic systems require further exploration. Closed‐loop DBS offers a solution by enabling real‐time mood state decoding and facilitating targeted electrical therapies[ 212 , 215 ] (Figure 7A). Unlike motor‐function BCIs, which often employ algorithms like FBCSP and SVM, mood BCIs use closed‐loop systems with both control and stimulation components, leveraging machine learning algorithms, such as DNN or LSTM (Table S1), to analyze neural activity in various brain regions and identify specific mood states[ 39 , 216 , 217 ] (Figure 7A). Combining feedback controllers for mood‐based stimulation adjustment with neural decoders for mood identification enables the potential for personalized treatments. Key neural regions like the limbic system and the orbitofrontal cortex play crucial roles in this decoding process. Integrating advances in mood research with urban planning, mood BCIs can reshape city designs to be attuned to residents' emotional states, elevating both individual well‐being and the overall urban experience.

FIGURE 7.

FIGURE 7

Neural decoding. (A) Mood decoding. Reprinted with permission.[ 39 ] Copyright 2019, Springer Nature. (I) Closed‐loop electrical stimulation for therapies and feedback; (II) multisite neural activity mapping via dynamic latent state‐space models; (III) decoded mood relation. (B) Neural decoding for mindfulness stress reduction. Reprinted with permission.[ 170 ] Copyright 2021, Springer Nature. (I) Intracranial electrodes for biomarker identification; (II) two symptoms for depression and anxiety; (III) evoked potentials in the corticolimbic network; (IV) hemisphere network with circumference strength and color‐coded start locations. (C) Neural decoding for cognitive learning and training. (I) Spectral‐geometric emotion‐forecasting neural pathways (green for positive and red for negative correlations), reproduced with permission.[ 223 ] Copyright 2018, Springer Nature; (II) the trajectory before (black), during (red), and after (grey) near‐infrared DBS stimulation, reproduced with permission.[ 224 ] Copyright 2022, Springer Nature.

3.3.2. BCI‐enhanced mindfulness

Mindfulness practice, known for reducing stress and improving cognitive function, could be further enhanced by integrating BCIs that provide real‐time neural feedback.[ 218 , 219 ] Cutting‐edge methods such as implanting electrodes to monitor brain activity[ 170 ] and targeting specific neural regions associated with major depressive disorder symptoms[ 170 , 220 ] present the groundbreaking possibility of highly personalized neurostimulation therapies (Figure 7B). Further insights into emotional processing mechanisms can be garnered by analyzing the N1 amplitude of evoked potentials,[ 170 , 221 ] which is the negative voltage peak observed approximately 100 ms after a stimulus and is commonly used to investigate attention and sensory processing. Additionally, stimulation‐induced mapping of brain region connections underscores the significant impact of DBS on connectivity (Figure 7B). Urban areas equipped with BCI‐enhanced mindfulness interventions can be tailored to cater to individual needs, thereby providing valuable resources for stress mitigation.
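Operationally, the N1 amplitude is often taken as the most negative voltage in a window around 100 ms post‐stimulus. The sketch below illustrates this on a synthetic evoked response; the window bounds, sampling rate, and voltage values are assumptions made for the example, not a standard from the cited studies.

```python
def n1_amplitude(evoked, fs, stim_index, window=(0.08, 0.12)):
    """Most negative voltage in a window around 100 ms after the stimulus,
    a common operational definition of the N1 component."""
    start = stim_index + int(window[0] * fs)
    stop = stim_index + int(window[1] * fs)
    return min(evoked[start:stop])

fs = 1000  # Hz, so one sample per millisecond
evoked = [0.0] * 400
evoked[100] = -7.5   # N1 trough at 100 ms post-stimulus (microvolts)
evoked[180] = 4.0    # later positive deflection, outside the N1 window
amp = n1_amplitude(evoked, fs, stim_index=0)
```

Restricting the search to the 80−120 ms window is what keeps later components, like the positive deflection at 180 ms here, from contaminating the N1 measure.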

3.3.3. Cognitive learning and training

DBS‐based BCI approaches offer the potential for cognitive learning and training, particularly for individuals with cognitive deficits.[ 222 ] These methods provide real‐time feedback, enabling individuals to assess and improve their cognitive skills. Analysis across different frequency bands can reveal the temporal predictability of mood states, underscoring the dynamic capabilities of neural encoding models[ 223 ] (Figure 7C). Passive BCIs, which are part of this technology spectrum, enhance higher‐order brain functions like reasoning and decision‐making by monitoring brain activity. This includes assessing decision‐making processes and confidence levels in those decisions. Another promising development is the use of near‐infrared deep brain modulation in cognitive enhancement, which demonstrates the potential to improve cognitive abilities non‐invasively[ 224 , 225 , 226 ] (Figure 7C). Furthermore, the integration of neurotechnologies in cognitive enhancement is poised to facilitate effective human–AI collaboration. While AI excels in computation‐intensive tasks, like playing Go, humans outperform AI in tasks requiring advanced reasoning and intricate problem‐solving skills. Future neurotechnologies are expected to augment these human strengths, improving performance in a variety of tasks through efficient human–AI collaboration.

4. NEURAL INTERFACES FOR WEARABLE INTERACTIONS

The rapid evolution of neurotechnology has ushered in a new era of wearable interactions, seamlessly integrating the human brain with external devices.[ 171 , 227 , 228 , 229 ] Central to this progression are wearable BCIs and haptic interactions, which together revolutionize our connection to urban landscapes and digital platforms. Wearable BCIs, encompassing designs like headsets, EEG‐integrated smart glasses, and baseball caps, provide continuous monitoring of neural activities, promising vast implications in healthcare, sports, and gaming. On the other hand, haptic interactions, tailored for tactile communication, bring a tangible dimension to digital experiences,[ 172 , 230 ] ranging from force feedback in virtual reality to intuitive touch in public service kiosks.

4.1. Wearable BCIs

Wearable EEGs come in diverse designs like headsets,[ 231 ] headbands,[ 232 ] baseball caps,[ 233 ] and smart glasses[ 234 ] (Figure 8A). These devices are tailored for comfort and continuous brain activity monitoring. EEG skin devices, featuring mesh electronics with stretchable interconnectors,[ 142 ] conform to the contour of the skin (Figure 8B,C), offering enhanced resolution and robustness. Wearable DBS devices treat neurological and psychiatric conditions[ 235 ] (Figure 8D), while the WIMAGINE implant, a wearable ECoG device, replaces part of the cranium to streamline surgery and improve safety[ 236 ] (Figure 8E). The CLINATEC device, worn on the head, captures ECoG data for interpreting movement intentions[ 237 ] (Figure 8F) and is currently undergoing clinical trials, showing promise for patients with severe disabilities. The WIMAGINE systems, in both wired and wireless versions, have been utilized in applications such as controlling motorized exoskeletons using brain signals[ 236 ] (Figure 8G,H). These wearable BCIs promise a transformative impact on urban living, including stress monitoring, telehealth solutions, and controlling robots and transport systems.

FIGURE 8.

FIGURE 8

Wearable BCI devices. (A) Wearable EEG devices. Top: headset, reprinted with permission.[ 231 ] Copyright 2012, Springer; baseball cap, reprinted with permission.[ 233 ] Copyright 2008, IEEE. Bottom: monitoring headset, reprinted with permission.[ 287 ] Copyright 2010, IEEE; wireless steady‐state visual evoked potential (SSVEP) device, reprinted with permission.[ 232 ] Copyright 2006, IEEE; commercial Quasar DSI 10/20, reprinted with permission.[ 288 ] Copyright 2021, Quasar; and smart glass, reprinted with permission.[ 234 ] Copyright 2021, Cognixion. (B, C) EEG skin devices including fractal device architectures and tripolar concentric ring and capacitive designs. B,C are reproduced with permission.[ 142 ] Copyright 2015, National Academy of Sciences. (D) DBS clinically implanted device. Reproduced with permission.[ 235 ] Copyright 2019, EMBO Press. (E–H) Wearable ECoG devices. (E) WIMAGINE anatomical implant, reproduced with permission.[ 236 ] Copyright 2014, IEEE; (F) wearable Clinatec device, reproduced with permission.[ 237 ] Copyright 2019, Clinatec; (G, H) WIMAGINE wired and wireless wearable ECoG devices, reproduced with permission.[ 236 ] Copyright 2014, IEEE.

4.2. Haptic interactions

Neural interfaces tailored for tactile communication are reshaping our interactions. Force feedback, crucial for virtual reality and robotics, can be enhanced by neural interfaces like EEG or DBS, enabling tactilely immersive experiences and intuitive robot-assisted public services (Figure 9A). Tactile feedback technologies, when integrated with neural interfaces like EEG or ECoG, provide direct sensations of touch; these systems simulate finger sensations and optimize interactions using technologies like hydraulically amplified taxels[238, 239, 240, 241] (Figure 9B).

FIGURE 9.


Neural interfaces for haptic interactions. (A) Force feedback for robotics and virtual reality. (B) Tactile feedback technologies, such as triboelectric sensors, designed for feeling sensations. (C) Mid‐air vibrations for touchscreens and similar interfaces, incorporating technologies like ferroelectric, electromagnetic, ultrasonic piezoelectric, and thermoelastic laser devices. (D) Electrovibration‐based haptic interfaces for wearables such as gloves, featuring pneumatic, piezoelectric, thermoelectric, or ferroelectric actuators; and heating haptic interfaces.

Mid‐air haptic technologies provide contactless sensations in the air and, when integrated with neural interfaces, enable public venues to offer air‐touch menus controlled by EEG‐detected neural patterns[ 242 , 243 , 244 , 245 , 246 , 247 , 248 ] (Figure 9C). Electrovibration technologies, essential for wearable devices, when combined with neural interfaces like EEG or invasive methods such as ECoG and MEA, enable haptic gloves to provide an intuitive touch experience in kiosks and digital installations[ 249 , 250 ] (Figure 9D). These wearable devices feature diverse actuators—from pneumatic to piezoelectric, thermoelectric, and ferroelectric—and can include heating interfaces[ 251 , 252 , 253 , 254 , 255 , 256 , 257 ] to deliver detailed tactile sensations. The convergence of haptic technologies with neural interfaces paves the way for tactile interactivity in practical applications, amplifying accessibility and enriching user experience (Table 2).
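Mid-air ultrasonic haptics of the kind used by phased-array devices rests on a simple principle: delay each transducer so that all wavefronts arrive at the focal point in phase. A geometric sketch, assuming a 40 kHz carrier (typical for such arrays) and idealized point-source elements:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at ~20 °C
FREQ = 40_000.0         # Hz, a typical mid-air haptic carrier frequency

def focus_delays(elements, focal_point):
    """Per-element time delays (s) that make all wavefronts arrive at
    `focal_point` simultaneously: the farthest element fires first."""
    dists = [math.dist(e, focal_point) for e in elements]
    t_max = max(dists) / SPEED_OF_SOUND
    return [t_max - d / SPEED_OF_SOUND for d in dists]

# 4-element linear array along x (1 cm pitch), focus 10 cm above its centre.
elements = [(i * 0.01, 0.0, 0.0) for i in range(4)]
focal = (0.015, 0.0, 0.10)
delays = focus_delays(elements, focal)
phases = [(2 * math.pi * FREQ * d) % (2 * math.pi) for d in delays]
print([round(p, 3) for p in phases])
```

Real arrays additionally modulate the focal point at tens to hundreds of hertz, since the skin responds to amplitude modulation rather than the ultrasonic carrier itself.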

TABLE 2.

Comparison of different haptic interfaces for real‐world applications beyond healthcare.

Haptic devices | Description | Structure | Applications | Ref.
Electrovibration for haptic gloves and other wearable devices
Flexible pneumatic electrovibration glove | Provides a realistic experience of virtual objects with tactile feedback | Pneumatic soft actuator, interface board, piezoelectric sensors, and high-voltage converter | Gaming, educational training, medical training | [249]
Piezoelectric glove | Identifies user gestures | Kapton film, AlN layers, and Mo-based electrode | Healthcare, rehabilitation | [250]
Thermo-haptic device | Skin-like mechanical properties | Si nanomembrane diodes, and electrode arrays | Human-machine interfaces | [252]
Joule heating interfaces | Efficient heat transfer | Ag nanowires, SBS thermoplastic elastomer, serpentine-mesh structure | Virtual reality, gaming | [253]
Thermoelectric actuators | Provide haptic feedback through heat transfer, creating warm or cool sensations | p-, n-type Bi2Te3 pellets, and low thermal conductivity polymer | Virtual reality, gaming, and medical devices | [254]
Thermoelectric actuators (audio) | Provide tactile and audible feedback | Ag nanowires, touch sensors, ribbon-shaped actuators, and electrode grid | Human-computer interaction, gaming, and mobile devices | [255]
Tactile feedback for feeling sensations
Ring-like tactile device | Multi-sensory feedback and sensing rings | Triboelectric nanogenerator (TENG) sensors, and NiCr heaters | Sensory devices, and haptic interfaces | [238]
Hydraulically amplified taxels (HAXEL) | Provide flexible and dense cutaneous haptic feedback | Fluid-filled cavity, non-stretchable polymer coating, and segmented electrodes | Wearable haptics, microfluidics, soft robotics, and virtual reality | [239]
Piezoelectric glove | Enables multidimensional motion detection and prompt haptic feedback | Piezoelectric stimulators, and elastomer-derived triboelectric sensors | Virtual reality, robotics, and rehabilitation | [240]
Triboelectrification | Mimics Merkel cells for touch and pressure detection | Receptive substrate, and triboelectric capacitive potential | Prosthetics, smart wearables, and intelligent systems | [241]
Mid-air vibration for touchscreens and other user interfaces
Ferroelectric-based dynamic interfacing | Utilizes a ferroelectric layer to transform mechanical energy into electrical energy, and mimics synapses | Ferroelectric layer, receptive substrate, and transistor | Robotics, virtual reality, and haptic feedback systems | [242]
Piezoelectric micromachined ultrasonic transducer (PMUT) | Thin layer of piezoelectric material patterned into an array of dots or squares | Piezoelectric material, and thin metallic layer | Medical imaging, therapeutic ultrasound, and cell manipulation | [243, 244]
Ultrasound haptic devices (e.g., Ultraleap) | Ultrasound array creating points of pressure on the skin | Ultrasound arrays | Virtual reality, augmented reality, and haptic feedback | [245, 246]
Indirect laser radiation | Utilizes a laser system and an elastic substance to stimulate the skin through thermoelastic effects | Laser and elastic substance | Virtual reality, gaming, and human-computer interaction | [247]
Electromagnetic arrays or rotating magnet disks | Small electromagnets or rotating disks generating a magnetic field that induces air vibrations | Electromagnets or rotating disks | Virtual reality, augmented reality, and medical training simulations | [248]
Force feedback for robotics and virtual reality
Self-powered electrotactile device | Electrotactile interface combined with a TENG array, inducing a discharge signal to the skin | Electrotactile interface, and TENG arrays | Robotics and virtual reality | [289]
Prosthetic sensory-motor interfaces | Controlled and powered wirelessly; frequencies between 100 and 300 Hz provide strong sensations | Wireless components, and frequency modulation | Prosthetics, rehabilitation, virtual reality, and augmented reality | [290]
Augmented virtual reality with force feedback | Interconnected virtual environments with haptic feedback delivered through gloves, suits, or exoskeletons | Gloves, suits, and exoskeletons | Robotics and virtual reality | [238]
Force feedback for robotic or limb control | Sensors, control algorithms, and actuators deliver tactile feedback | Sensors, control algorithms, and actuators | Prosthetics, professional training simulators, and virtual reality | [291]

5. NEURAL INTERFACES FOR AI‐DRIVEN BCI

AI-driven BCIs that facilitate direct interaction between external devices and the human brain are revolutionizing how we interact with our digital environments.[258] This integration promises not only to enhance efficiency but also to provide personalized interaction models tailored to individual needs.[259] AI-driven BCIs could redefine urban living, enabling actions like altering street illumination with a thought, orchestrating traffic flows to mitigate congestion, or even steering autonomous vehicles. Virtual assistants, already a staple of urban life, could evolve to be more responsive and intuitive through BCI integration.[260]

5.1. BCI operation modes in real‐world applications beyond healthcare

AI‐driven BCIs extend their impact beyond healthcare, enabling thought translation into actionable commands for applications like rehabilitation and enhanced communication. Software advancements in machine learning and signal processing have improved the signal‐to‐noise ratio in neural signal recording, potentially allowing non‐invasive BCIs to rival invasive ones in performance while reducing health risks and costs. Open‐source software tools like EEGLAB and OpenViBE have accelerated BCI research by providing accessible signal processing and machine learning. However, hardware development has lagged due to high costs and lengthy development processes. Advancements include biocompatible invasive interfaces and the shift from wet to dry in‐ear EEG electrodes, making brain recording more accessible and efficient.
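As a minimal illustration of the signal-processing side, the sketch below improves the effective signal-to-noise ratio of a toy evoked response by band-pass filtering and averaging time-locked epochs, the kind of step toolchains like EEGLAB and OpenViBE automate. The sampling rate, band, and synthetic data are illustrative assumptions.

```python
import numpy as np

def bandpass(epoch, fs, low, high):
    """Zero-phase FFT band-pass: zero every frequency bin outside [low, high] Hz."""
    spec = np.fft.rfft(epoch)
    freqs = np.fft.rfftfreq(len(epoch), 1 / fs)
    spec[(freqs < low) | (freqs > high)] = 0
    return np.fft.irfft(spec, n=len(epoch))

def preprocess(epochs, fs=250, band=(1.0, 40.0)):
    """Band-pass each epoch, then average: averaging N time-locked epochs
    shrinks uncorrelated noise by roughly sqrt(N)."""
    return np.mean([bandpass(ep, fs, *band) for ep in epochs], axis=0)

# Toy demo: a 10 Hz "evoked response" buried in noise across 50 epochs.
rng = np.random.default_rng(1)
fs, n = 250, 250
t = np.arange(n) / fs
signal = np.sin(2 * np.pi * 10 * t)
epochs = [signal + rng.normal(0, 2, n) for _ in range(50)]
avg = preprocess(epochs, fs)
snr_single = signal.std() / (epochs[0] - signal).std()
snr_avg = signal.std() / (avg - signal).std()
print(snr_avg > snr_single)
```

Production pipelines would use proper IIR/FIR filters and artifact rejection rather than hard FFT masking, but the SNR-by-averaging principle is the same.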

5.1.1. Communication mode

The teleoperated communication mode uses EEG-based BCIs to control external devices, such as wheelchairs and traffic-light systems, with neural signals[261] (Figure 10A). Neural signals undergo filtration, augmentation, and categorization by algorithms,[262] enabling device control and direct stimulation of brain areas or muscle groups.[263] Because this BCI mode is primarily non-invasive, it lends itself to diverse smart-city applications, including traffic management, air-quality control, and energy optimization.[264, 265, 266, 267] Another application in this domain is recreating tactile feedback with intracortical microstimulation (ICMS), a method that delivers electrical currents through microelectrodes to specific cortical regions to map neural circuits or induce artificial sensations or movements.[268] This can offer transformative experiences, especially in prosthetic design. AI techniques, like transfer learning, refine the decoding of neural intentions[269] (Figure 10B).
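The filtration, feature-extraction, and categorization stages described above can be sketched as a toy motor-imagery decoder. The band definitions and the nearest-centroid classifier below are illustrative stand-ins, not the algorithms of any cited system.

```python
import numpy as np

def features(epoch, fs=250):
    """Log band-power features (mu 8-12 Hz, beta 13-30 Hz) from an FFT."""
    freqs = np.fft.rfftfreq(len(epoch), 1 / fs)
    power = np.abs(np.fft.rfft(epoch)) ** 2
    mu = power[(freqs >= 8) & (freqs <= 12)].mean()
    beta = power[(freqs >= 13) & (freqs <= 30)].mean()
    return np.log([mu, beta])

class NearestCentroid:
    """Minimal categorization stage: one centroid per command class."""
    def fit(self, X, y):
        self.classes_ = sorted(set(y))
        self.centroids_ = {c: np.mean([x for x, lbl in zip(X, y) if lbl == c], axis=0)
                           for c in self.classes_}
        return self
    def predict(self, x):
        return min(self.classes_, key=lambda c: np.linalg.norm(x - self.centroids_[c]))

# Toy data: "left" epochs dominated by mu power, "right" by beta power.
rng = np.random.default_rng(2)
fs, n = 250, 250
t = np.arange(n) / fs
def epoch(f): return np.sin(2 * np.pi * f * t) + rng.normal(0, 0.3, n)
X = [features(epoch(10)) for _ in range(20)] + [features(epoch(20)) for _ in range(20)]
y = ["left"] * 20 + ["right"] * 20
clf = NearestCentroid().fit(X, y)
print(clf.predict(features(epoch(10))), clf.predict(features(epoch(20))))
```

Transfer learning, as mentioned above, would initialize such a decoder from other users' data and adapt the centroids (or deeper models) to a new user with only a short calibration session.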

FIGURE 10.


AI-driven BCIs. (A) Teleoperated communication mode; (B) BCI transfer learning; (C) autonomous movement mode; (D) EEG-driven motor intention for enhanced instrumental learning, reproduced with permission.[272] Copyright 2013, Wiley-VCH. (E) Brain-AI closed-loop system; (F) ErrP feedback versus a manual stop button in an autonomous navigation system. Reproduced with permission.[68] Copyright 2022, Springer Nature. Potential applications of (G) limb control; (H) visual prosthesis; (I) remote control; and (J) mind connectivity.

5.1.2. Movement mode

Autonomous movement BCIs primarily focus on restoring neural activity in patients with conditions like strokes or spinal cord injuries[261, 270, 271] (Figure 10C). Non-invasive methods such as EEG are used to guide devices based on motor intentions[272] (Figure 10D), while invasive approaches, involving surgically placed electrodes such as ECoG or MEA, are required for more precise interventions. Neural bypasses and bridges reroute signals around injured parts of the nervous system, linking decoded signals to electrical muscle or nerve stimulation to potentially restore movement. The first use of a neural bypass to restore voluntary movement in a human involved placing an MEA on the primary motor cortex.[273] This allowed finger and hand motions, and subsequently more intricate actions, to be deciphered, illustrating the approach's potential for restoring motor function. Potential future uses of this mode include directing robots and vehicles for urban services like deliveries and maintenance. Both teleoperated and autonomous BCIs offer unique advantages for diverse applications. Teleoperated communication BCIs can control traffic, monitor air quality, and regulate energy systems, while autonomous movement BCIs can manage autonomous vehicles or robots. Integrating these two modes could redefine urban services, elevating their efficiency and user accessibility.
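A neural bypass must ultimately map decoded intents onto stimulation hardware. The sketch below shows one hedged way such routing logic might look; the command names, channel assignments, and current limits are hypothetical and for illustration only, not taken from any cited system.

```python
# Hypothetical mapping from decoded motor intents to functional electrical
# stimulation (FES) channels. All names and parameter values are illustrative.
STIM_MAP = {
    "hand_open":  {"channel": 1, "pulse_hz": 30, "amp_ma": 8.0},
    "hand_close": {"channel": 2, "pulse_hz": 30, "amp_ma": 10.0},
    "wrist_ext":  {"channel": 3, "pulse_hz": 25, "amp_ma": 6.0},
}
AMP_LIMIT_MA = 20.0  # hard safety ceiling on stimulation amplitude

def route(intent, confidence, threshold=0.7):
    """Forward a decoded intent to its FES channel only when the decoder is
    confident; scale amplitude with confidence, clamped to the safety limit."""
    if confidence < threshold or intent not in STIM_MAP:
        return None  # below threshold or unknown intent: stimulate nothing
    cfg = dict(STIM_MAP[intent])
    cfg["amp_ma"] = min(cfg["amp_ma"] * confidence, AMP_LIMIT_MA)
    return cfg

print(route("hand_open", 0.9))   # confident decode: stimulation config
print(route("hand_open", 0.5))   # uncertain decode: fail safe, no output
```

The fail-safe default (no stimulation on low confidence) reflects a general design principle for such systems: a missed command is recoverable, an unintended contraction may not be.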

5.2. Closed‐loop AI‐BCI systems in real‐world applications beyond healthcare

BCI development typically progresses from identifying robust neural patterns in controlled lab experiments to testing those patterns in realistic settings through open-loop BCIs, which operate without user feedback. The next stage is closing the loop, creating neuroadaptive AI-BCIs that update in real time based on the user's state. While open-loop BCIs have been widely explored, closed-loop systems are less investigated but offer more seamless user interaction, such as BCIs for arousal regulation in flight simulators and therapeutic BCIs for controlling epileptic seizures and restoring emotional function in neuropsychiatric disorders. Closed-loop AI-BCIs, in contrast to one-way open-loop systems, are adaptive, using real-time user feedback for ongoing optimization. Current challenges include synchronizing operations in real time, especially when using visual evoked potentials (VEPs)[274, 275], neural responses to visual stimuli used to assess the functionality of visual pathways, and ERPs[276], brainwaves triggered by specific sensory events. To address these, asynchronous BCIs have been developed.[277] These systems operate independently of external cues, enabling continuous, natural interaction between the user and the system, and have achieved milestones such as a transmission speed of 67.7 bit/min[278] and the ability to detect error-related potentials (ErrP), neural signals generated when an error is perceived.[68, 279] A key example is BACLoS[68] (Figure 10E), which continually refines decision-making via a feedback loop. BACLoS typically pairs earbud-like EEG devices for high-quality data capture with low-power neuromorphic platforms such as SpiNNaker, TrueNorth, and Loihi. ErrP feedback has notably reduced reaction times[68] (Figure 10F) in scenarios like driving.
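Throughput figures such as the 67.7 bit/min quoted above are conventionally computed with Wolpaw's information transfer rate (ITR) formula, which assumes equiprobable classes and uniformly distributed errors. A small calculator (the example parameters are illustrative, not those of the cited system):

```python
import math

def itr_bits_per_min(n_classes, accuracy, trial_seconds):
    """Wolpaw ITR: bits per selection scaled to bits/min. Assumes
    equiprobable classes and uniformly distributed errors; clamps
    below-chance accuracy to zero throughput."""
    if accuracy <= 1 / n_classes:
        return 0.0
    p, n = accuracy, n_classes
    bits = math.log2(n) + p * math.log2(p)
    if p < 1.0:
        bits += (1 - p) * math.log2((1 - p) / (n - 1))
    return bits * 60 / trial_seconds

# e.g. a hypothetical 4-class BCI at 90% accuracy, one decision per second
print(round(itr_bits_per_min(4, 0.90, 1.0), 1))  # → 82.4
```

The formula makes the design trade-off explicit: throughput rises with more classes, higher accuracy, or shorter trials, so asynchronous BCIs gain speed mainly by shrinking the effective trial time.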

5.3. AI‐driven BCIs for future smart city applications

AI‐driven BCIs hold the potential to address urban challenges and pave the way for human‐machine interactions. Applications range from limb and robot control to visual prostheses, enabling remote control and mind connectivity.

5.3.1. Limb, robot control, and clinical applications

Traditional prostheses rely on electromyography (EMG) signals from peripheral muscles, which convey limited information about movements like hand opening or closing. BCIs can aid motor-control recovery after stroke or multiple sclerosis by bypassing the impaired neuromotor system. Next-generation neural prostheses will capture detailed motor intentions directly from brain activity, offering precise control and seamless integration with the body. Despite challenges in speed and control accuracy, advancements in software and hardware, along with hybrid neural interfaces combining multiple signals (e.g., ECoG, MEA, and EEG) or BCI paradigms (e.g., SSVEP and P300), are expected to improve robotic arm control[280] (Figure 10G). These BCIs allow robots to conduct complex tasks, such as waste management and surveillance.[281] Invasive BCIs are crucial in medical rehabilitation, linking the brain and muscles to restore neural function in limb-impaired individuals. Non-invasive EEG-based BCIs, while more accessible, often provide limited control and depend on AI for enhanced accuracy. Techniques like random forest algorithms improve the accuracy of non-invasive sensorimotor-rhythm BCIs by effectively assessing somatosensory evoked potentials (SEPs), electrical signals measured via EEG in response to tactile or proprioceptive stimuli, often used to evaluate the integrity of sensory pathways.[282] Research is underway to decode neural signals of somatosensory experiences in healthy individuals, with the goal of replicating these stimuli using methods like ICMS,[283] potentially enhancing virtual reality and simulation-training experiences.
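To make the random-forest idea concrete, the toy below bags one-feature threshold stumps over synthetic SEP-like features and classifies by majority vote. It is a simplified stand-in for a full random forest, and the "P40 amplitude" feature and all data are fabricated for illustration.

```python
import numpy as np

class StumpForest:
    """Toy random forest: bootstrap-bagged one-feature threshold stumps with
    majority voting, echoing the ensemble principle behind random-forest
    scoring of somatosensory evoked potential (SEP) features."""
    def __init__(self, n_trees=25, seed=0):
        self.n_trees, self.rng = n_trees, np.random.default_rng(seed)

    def fit(self, X, y):
        X, y = np.asarray(X, float), np.asarray(y)
        self.stumps = []
        for _ in range(self.n_trees):
            idx = self.rng.integers(0, len(X), len(X))   # bootstrap sample
            f = self.rng.integers(0, X.shape[1])         # random feature
            thr = X[idx, f].mean()
            # orient the stump so "above threshold" predicts the majority class
            above = y[idx][X[idx, f] > thr]
            hi = int(above.mean() > 0.5) if len(above) else 1
            self.stumps.append((f, thr, hi))
        return self

    def predict(self, X):
        X = np.asarray(X, float)
        votes = np.array([np.where(X[:, f] > thr, hi, 1 - hi)
                          for f, thr, hi in self.stumps])
        return (votes.mean(axis=0) > 0.5).astype(int)

# Toy SEP features: class 1 has larger amplitudes on two of three features.
rng = np.random.default_rng(3)
X0 = rng.normal(0.0, 1.0, (40, 3))
X1 = rng.normal(0.0, 1.0, (40, 3)); X1[:, :2] += 3
X, y = np.vstack([X0, X1]), np.array([0] * 40 + [1] * 40)
clf = StumpForest().fit(X, y)
print((clf.predict(X) == y).mean())
```

A real implementation would use full decision trees, out-of-bag validation, and a held-out test set; the point here is only how bagging plus voting stabilizes a weak per-feature rule.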

5.3.2. Prostheses and future urban transportation

Visual prostheses, a cornerstone of biomedical engineering that offer hope to those with blindness resulting from conditions such as age-related maculopathy or congenital amaurosis[284] (Figure 10H), are evolving with AI-driven BCIs that tailor electrode configurations to individual needs. A shift to flexible multielectrode systems could refine retinal stimulation to match the intricacies of human retinal physiology. Beyond medical solutions, visual prostheses can enhance urban living, with promise for adaptive lighting tailored to individual needs and augmented-reality enhancements that turn simple city tours into immersive experiences. In the broader urban landscape, visual prostheses are also reshaping the future of transportation. These interfaces are poised to revolutionize self-driving cars by introducing thought-controlled navigation systems. Such developments could enhance road safety, with drivers guiding vehicles using their thoughts, thereby decreasing accidents due to human error or fatigue.

5.3.3. Remote device control and smart homes

Neural interfaces integrated with the IoT offer transformative potential for smart devices (Figure 10I), allowing direct thought-to-device communication and obviating the need for physical intermediaries. From modulating home settings to plunging into virtual realities, the horizon of possibilities is vast. Advanced algorithms identify the specific neural patterns associated with various commands, ensuring that a range of IoT devices, including thermostats and lighting systems, are controlled precisely and reliably. Picture a smart home where brain-controlled devices give residents unprecedented control over their environment while promoting sustainability and better health outcomes. Advances in nanotechnology and wireless capabilities are anticipated to produce BCIs that are both small and compatible with biological tissues.[96, 285]
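At the software level, thought-to-device control reduces to dispatching decoded commands, gated by decoder confidence, to device handlers. A minimal sketch, in which all device names, commands, and thresholds are hypothetical:

```python
# Hypothetical mapping from decoded neural commands to smart-home actions;
# the commands, devices, and confidence gate are illustrative assumptions.
class SmartHomeBCI:
    def __init__(self):
        self.state = {"thermostat_c": 21.0, "lights_on": False}
        self.handlers = {
            "lights_toggle": self._toggle_lights,
            "warmer":        lambda: self._adjust_temp(+1.0),
            "cooler":        lambda: self._adjust_temp(-1.0),
        }

    def _toggle_lights(self):
        self.state["lights_on"] = not self.state["lights_on"]

    def _adjust_temp(self, delta):
        # Clamp to a sane range so a misdecoded burst of commands cannot run away.
        self.state["thermostat_c"] = min(30.0, max(15.0, self.state["thermostat_c"] + delta))

    def dispatch(self, command, confidence, threshold=0.8):
        """Execute a decoded command only above a confidence threshold."""
        if confidence >= threshold and command in self.handlers:
            self.handlers[command]()
            return True
        return False

home = SmartHomeBCI()
home.dispatch("lights_toggle", 0.95)
home.dispatch("warmer", 0.91)
home.dispatch("cooler", 0.40)   # rejected: the decoder was not confident
print(home.state)
```

The confidence gate and the clamped state updates embody the reliability concern raised above: in a home, an ignored command is merely annoying, while an unintended one erodes trust in the interface.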

5.3.4. Communication and shared experiences

In future smart cities, BCIs could go beyond basic control to facilitate sharing emotions and thoughts[ 286 ] (Figure 10J). This interaction could remove the need for physical interfaces altogether, providing extraordinary accessibility in telemedicine, remote collaborations, and entertainment domains such as thought‐driven gaming. However, challenges in reliability, protecting privacy, and accurately decoding complex brain neural signals remain significant.

6. CONCLUSIONS AND PERSPECTIVES

Neural interfaces in the brain, including MEA, DBS, ECoG, and EEG, are ushering in a revolutionary era that extends beyond healthcare into diverse realms of human-technology interaction. These interfaces are pivotal in decoding brain signals, reshaping our communication with the external world, and integrating human cognition with urban technology. They offer a new level of autonomy, particularly for individuals with disabilities or mobility constraints (Figure 11). However, understanding their wide-ranging implications for safety, inclusivity, and health is vital as they become integrated into future urban development.

FIGURE 11.


Perspective and possible designs on neural interfaces for future smart city applications including IoT and remote control; mind connectivity; limb and robot control; neural diseases treatments; traffic safety; and self‐driving vehicles.

Invasive neural interfaces (e.g., intracortical BMIs using action potentials and LFPs) offer direct and intuitive motor control, with higher accuracy and the potential for fine motor command. However, they are subject to surgical risks and long-term biocompatibility challenges, including nerve-cell and blood-vessel damage, infection, and immune rejection. Addressing these requires a deep understanding of the interaction between tissue and foreign materials, and the development of biocompatible interfaces. Non-invasive interfaces like EEG, while safer and less intrusive, offer lower performance because they detect broader neural activity and are subject to more noise and interference. Their future development hinges on improved signal processing and artifact-removal techniques. The pursuit of high-density electrodes aims to record high-resolution signals from numerous neurons, promising advancements in sensorimotor applications and beyond. However, these technologies still face challenges in miniaturization and signal crosstalk. A multidisciplinary approach involving neuroscience, engineering, psychology, and algorithm design is essential to harness the full potential of these interfaces.

Initially, the primary application of neural interfaces was in rehabilitation and medical care, restoring social interaction and movement capabilities in patients. The success of these applications has inspired the development of bidirectional and commercial BMIs. Technologies like cochlear implants are successful examples of BMIs that have significantly enhanced human capabilities, although their interaction with the environment remains distinct from natural experience. The future of BMIs lies in balancing precision and safety, with advancements in signal processing, miniaturization, and biocompatibility being critical. Intracortical BMIs promise higher information transfer rates and the potential for more natural control and feedback, including restoration of somatosensory feedback. However, challenges in implant longevity, signal stability, and ethical considerations remain.

As BMIs advance, they raise significant ethical and societal questions regarding privacy, the potential for mind communication, and brain enhancement implications. The lack of specific standards for the development and use of these technologies poses risks of unauthorized access to sensitive brain signals. Rigorous standards for data acquisition, access control, and encryption should be established to protect user privacy. The path forward involves not only technological advancements but also a robust ethical framework, multidisciplinary collaboration, and regulatory oversight. Overcoming these challenges will enable neural interfaces to enhance human capabilities and quality of life responsibly and inclusively.

CONFLICT OF INTEREST STATEMENT

The authors declare no conflicts of interest.

Supporting information

Supporting Information

EXP2-4-20230146-s001.docx (67.5KB, docx)

ACKNOWLEDGEMENTS

S. X. and Y. L. contributed equally to this work. This work was supported by the National Natural Science Foundation (grant 22205254). We acknowledge the use of Biorender.com and GIS network analyses for the creation of figures. Tools such as EEGLAB, OpenViBE, BCI2000, and MNE were instrumental in the analysis and BCI prototyping. Preprocessing, decoding, imaging, and data analyses were aided by MATLAB and Python, and the wearable BCIs discussed originated from Quasar, Cognixion, WIMAGINE, and Clinatec, for which we are grateful. Additionally, the authors extend their sincere gratitude to Prof. Bruce Gluckman for insightful discussions. The authors also thank the reviewers, whose constructive feedback significantly refined this work.

Biographies

Dr. Shumao Xu is presently affiliated with Pennsylvania State University. He received his Ph.D. from Shanghai Jiao Tong University and completed postdoctoral training at the Max Planck Institute for Solid-State Research. His current research focuses on the development of implantable microdevices for enhancing brain-machine and brain-computer interfaces.


Dr. Yang Liu, an assistant professor at the Global Institute of Future Technology, Shanghai Jiao Tong University, specializes in mechanical engineering. He received postdoctoral training in the Interventional Radiology Department at the Mayo Clinic, Rochester, and in the Department of Mechanical Engineering at the University of Michigan. His research interests lie in biomedical manufacturing, interventional medical devices, and brain-computer interfaces. He has been honored with several awards, including the Tech Transfer Talent Network Postdoc Fellowship and the Pryor-Hale Award for Best Business.


Hyunjin Lee is currently a Ph.D. student in the Biomedical Engineering Department and a University Graduate Fellow at the Pennsylvania State University. Previously, he received his B.S. (2019) in mechanical engineering from Hanyang University and his M.S. (2021) in mechanical engineering from the Pohang University of Science and Technology in South Korea. His research focuses on the development of novel bioelectronic interface devices for an enhanced understanding of the in vivo nervous system.


Prof. Weidong Li is a tenured full professor at Shanghai Jiao Tong University, a distinguished oriental scholar of Shanghai Universities, a Shanghai outstanding academic leader, a Pujiang talent, a Shuguang scholar, subject leader of the Key Laboratory for the Genetics of Developmental and Neuropsychiatric Disorders (Ministry of Education), and PI of a National 973 project. His group works on the super brain population; the pathogenesis of depression, schizophrenia, mental retardation, epilepsy, and other neuropsychiatric diseases; and the mechanisms of emotion, learning, and memory.


Xu S., Liu Y., Lee H., Li W., Exploration 2024, 4, 20230146. 10.1002/EXP.20230146

Shumao Xu and Yang Liu contributed equally to this work.

Contributor Information

Shumao Xu, Email: shumaoxu@163.com.

Weidong Li, Email: liwd@sjtu.edu.cn.

REFERENCES

1. Chen H. I., Wolf J. A., Blue R., Song M. M., Moreno J. D., Ming G.‐l., Song H., Cell Stem Cell 2019, 25, 462.
2. Pediaditakis I., Kodella K. R., Manatakis D. V., Le C. Y., Hinojosa C. D., Tien‐Street W., Manolakos E. S., Vekrellis K., Hamilton G. A., Ewart L., Nat. Commun. 2021, 12, 5907.
3. Yang J. Q., Wang R., Ren Y., Mao J. Y., Wang Z. P., Zhou Y., Han S. T., Adv. Mater. 2020, 32, 2003610.
4. Liu Y., Wang W., Zhang D., Sun Y., Li F., Zheng M., Lovejoy D. B., Zou Y., Shi B., Exploration 2022, 2, 20210274.
5. Lim J.‐S., Kim H. J., Park I., Woo S., Kim J.‐H., Park J. W., Nano Lett. 2022, 22, 3865.
6. Yearley A. G., Patel R. V., Blitz S. E., Park S., Madinger A. M., Li J., Johnston B. R., Peruzzi P. P., Lee S., Srinivasan S. S., Device 2023, 1, 100068.
7. Zhang A., Jiang Y., Loh K. Y., Bao Z., Deisseroth K., Nat. Rev. Bioeng. 2023, 2, 82.
8. Greenberg A., Cohen A., Grewal M., Nat. Biotechnol. 2021, 39, 1194.
9. Willsey M. S., Nason‐Tomaszewski S. R., Ensel S. R., Temmar H., Mender M. J., Costello J. T., Patil P. G., Chestek C. A., Nat. Commun. 2022, 13, 6899.
10. Heng W., Solomon S., Gao W., Adv. Mater. 2022, 34, 2107902.
11. Drew L., Nat. Electron. 2023, 6, 90.
12. Nikitas A., Michalakopoulou K., Njoya E. T., Karampatzakis D., Sustainability 2020, 12, 2789.
13. Lin S., Liu J., Li W., Wang D., Huang Y., Jia C., Li Z., Murtaza M., Wang H., Song J., Nano Lett. 2019, 19, 6853.
14. Tang X., Shen H., Zhao S., Li N., Liu J., Nat. Electron. 2023, 6, 109.
15. Abiri R., Borhani S., Sellers E. W., Jiang Y., Zhao X., J. Neural Eng. 2019, 16, 011001.
16. Acarón Ledesma H., Li X., Carvalho‐de‐Souza J. L., Wei W., Bezanilla F., Tian B., Nat. Nanotechnol. 2019, 14, 645.
17. Zeng Q., Huang Z., Adv. Funct. Mater. 2023, 33, 2301223.
18. Duan Y., Wang S., Yuan Q., Shi Y., Jiang N., Jiang D., Song J., Wang P., Zhuang L., Small 2023, 19, 2205768.
19. Go G. T., Lee Y., Seo D. G., Lee T. W., Adv. Mater. 2022, 34, 2201864.
20. Song H., Kim M., Kim E., Lee J., Jeong I., Lim K., Ryu S. Y., Oh M., Kim Y., Park J. U., BMEMat 2023, DOI: 10.1002/bmm2.12048.
21. Wu N., Wan S., Su S., Huang H., Dou G., Sun L., InfoMat 2021, 3, 1174.
22. Shen K., Chen O., Edmunds J. L., Piech D. K., Maharbiz M. M., Nat. Biomed. Eng. 2023, 7, 424.
23. Xu S., Momin M., Ahmed S., Hossain A., Veeramuthu L., Pandiyan A., Kuo C. C., Zhou T., Adv. Mater. 2023, 35, 2303267.
24. Bianchi M., De Salvo A., Asplund M., Carli S., Di Lauro M., Schulze‐Bonhage A., Stieglitz T., Fadiga L., Biscarini F., Adv. Sci. 2022, 9, 2104701.
25. Wang J., Wang T., Liu H., Wang K., Moses K., Feng Z., Li P., Huang W., Adv. Mater. 2023, 35, 2211012.
26. Andersen R. A., Aflalo T., Bashford L., Bjånes D., Kellis S., Annu. Rev. Psychol. 2022, 73, 131.
27. Cho Y., Jeong H. H., Shin H., Pak C. J., Cho J., Kim Y., Kim D., Kim T., Kim H., Kim S., Adv. Sci. 2023, 10, 2303728.
28. Murphy B. B., Apollo N. V., Unegbu P., Posey T., Rodriguez‐Perez N., Hendricks Q., Cimino F., Richardson A. G., Vitale F., iScience 2022, 25, 104652.
29. Hou B., Liu X., BMEMat 2023, 1, e12054.
30. An A., IET Smart Cities 2023, 5, 64.
31. Algnbri M. K., J. Metaverse 2022, 2, 29.
32. Wang G., Badal A., Jia X., Maltz J. S., Mueller K., Myers K. J., Niu C., Vannier M., Yan P., Yu Z., Nat. Mach. Intell. 2022, 4, 922.
33. Zhou Y., Xiao X., Chen G., Zhao X., Chen J., Joule 2022, 6, 1381.
34. Zhao X., Askari H., Chen J., Joule 2021, 5, 1391.
35. Libanori A., Chen G., Zhao X., Zhou Y., Chen J., Nat. Electron. 2022, 5, 142.
36. Chen G., Xiao X., Zhao X., Tat T., Bick M., Chen J., Chem. Rev. 2021, 122, 3259.
37. Venkatesan T., Williams S., Appl. Phys. Rev. 2022, 9, 010401.
38. Zhang X., Yao L., Zhang S., Kanhere S., Sheng M., Liu Y., IEEE IoT J. 2018, 6, 2084.
39. Shanechi M. M., Nat. Neurosci. 2019, 22, 1554.
40. Gagana P., Kumar S., Meghana S., Advithi M., Airaddi S., J. Remote Sens. GIS Technol. 2022, 8, 10.
41. Putze F., Weiß D., Vortmann L.‐M., Schultz T., in IEEE Int. Conf. Syst. Man Cybern., IEEE, Bari, Italy 2019, 2812.
42. Donoghue J. P., Nat. Neurosci. 2002, 5, 1085.
43. Sun W., Lin J.‐W., Su S.‐F., Wang N., Er M. J., IEEE Trans. Cybern. 2020, 51, 1099.
44. Oña E. D., Garcia‐Haro J. M., Jardón A., Balaguer C., Appl. Sci. 2019, 9, 2586.
45. Zhou Y., Song H., Ming G.‐l., Nat. Rev. Genet. 2024, 25, 26.
46. D'Angelo E., Jirsa V., Trends Neurosci. 2022, 45, 777.
47. Rust N. C., LeDoux J. E., Trends Neurosci. 2023, 46, 3.
48. Tang Y.‐Y., Tang R., Posner M. I., Gross J. J., Trends Cogn. Sci. 2022, 26, 567.
49. Mildenberger W., Stifter S. A., Greter M., Curr. Opin. Immunol. 2022, 76, 102181.
50. Chini M., Hanganu‐Opatz I. L., Trends Neurosci. 2021, 44, 227.
51. Luby J. L., Baram T. Z., Rogers C. E., Barch D. M., Trends Neurosci. 2020, 43, 744.
52. Banwinkler M., Theis H., Prange S., van Eimeren T., Brain Sci. 2022, 12, 1248.
53. Svara F., Förster D., Kubo F., Januszewski M., Dal Maschio M., Schubert P. J., Kornfeld J., Wanner A. A., Laurell E., Denk W., Nat. Methods 2022, 19, 1357.
54. Gloor P., Am. J. EEG Technol. 1969, 9, 1.
55. Banquet J.‐P., Electroencephalogr. Clin. Neurophysiol. 1973, 35, 143.
56. Kuhlman W. N., Electroencephalogr. Clin. Neurophysiol. 1978, 44, 83.
57. Pfurtscheller G., Aranibar A., Electroencephalogr. Clin. Neurophysiol. 1977, 42, 817.
58. Carmon A., Friedman Y., Coger R., Kenton B., Pain 1980, 8, 21.
59. Hammond P., Smith A., J. Physiol. 1983, 342, 35.
60. Hawken M. J., Parker A., Lund J., J. Neurosci. 1988, 8, 3541.
61. Gevins A., Smith M. E., Leong H., McEvoy L., Whitfield S., Du R., Rush G., Hum. Factors 1998, 40, 79.
62. Kostov A., Polak M., IEEE Trans. Rehabil. Eng. 2000, 8, 203.
63. Pfurtscheller G., Neuper C., Schlogl A., Lugger K., IEEE Trans. Rehabil. Eng. 1998, 6, 316.
64. Meystre S., Telemed. J. E Health 2005, 11, 63.
65. Zaehle T., Rach S., Herrmann C. S., PLoS One 2010, 5, e13766.
66. Tamietto M., De Gelder B., Nat. Rev. Neurosci. 2010, 11, 697.
67. Raghavendra U., Acharya U. R., Adeli H., Eur. Neurol. 2020, 82, 41.
68. Shin J. H., Kwon J., Kim J. U., Ryu H., Ok J., Joon Kwon S., Park H., Kim T.‐I., npj Flex. Electron. 2022, 6, 32.
69. Viventi J., Kim D.‐H., Vigeland L., Frechette E. S., Blanco J. A., Kim Y.‐S., Avrin A. E., Tiruvadi V. R., Hwang S.‐W., Vanleer A. C., Nat. Neurosci. 2011, 14, 1599.
70. Khodagholy D., Gelinas J. N., Thesen T., Doyle W., Devinsky O., Malliaras G. G., Buzsáki G., Nat. Neurosci. 2015, 18, 310.
71. Harjes C., Schoenbach K., Schaefer G., Kristiansen M., Krompholz H., Skaggs D., Rev. Sci. Instrum. 1984, 55, 1684.
72. Maynard E. M., Nordhausen C. T., Normann R. A., Electroencephalogr. Clin. Neurophysiol. 1997, 102, 228.
  • 73. Blanche T. J., Spacek M. A., Hetke J. F., Swindale N. V., J. Neurophysiol. 2005, 93, 2987. [DOI] [PubMed] [Google Scholar]
  • 74. Lee J., Ozden I., Song Y.‐K., Nurmikko A. V., Nat. Methods 2015, 12, 1157. [DOI] [PubMed] [Google Scholar]
  • 75. Spiegel E. A., Wycis H. T., Marks M., Lee A. J., Science 1947, 106, 349. [DOI] [PubMed] [Google Scholar]
  • 76. Bate‐Smith E., Nature 1948, 161, 835. [DOI] [PubMed] [Google Scholar]
  • 77. Leksell L., Acta Chir Scand. 1949, 99, 229. [PubMed] [Google Scholar]
  • 78. Richardson D. E., Akil H., J. Neurosurg. 1977, 47, 184. [DOI] [PubMed] [Google Scholar]
  • 79. Gleason C. A., Wise B. L., Feinstein B., Neurosurg. 1978, 2, 217. [DOI] [PubMed] [Google Scholar]
  • 80. Dieckmann G., Witzmann A., Stereotact. Funct. Neurosurg. 1982, 45, 167. [Google Scholar]
  • 81. Benabid A.‐L., Pollak P., Louveau A., Henry S., De Rougemont J., Stereotact. Funct. Neurosurg. 1987, 50, 344. [DOI] [PubMed] [Google Scholar]
  • 82. Holsheimer J., Nuttin B., King G. W., Wesselink W. A., Gybels J. M., De Sutter P., Neurosurg. 1998, 42, 541. [DOI] [PubMed] [Google Scholar]
  • 83. Volkmann J., Herzog J., Kopper F., Deuschl G., Mov. Disord. 2002, 17, S181. [DOI] [PubMed] [Google Scholar]
  • 84. Malone Jr D. A., Dougherty D. D., Rezai A. R., Carpenter L. L., Friehs G. M., Eskandar E. N., Rauch S. L., Rasmussen S. A., Machado A. G., Kubu C. S., Biol. Psychiatry 2009, 65, 267. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 85. Little S., Pogosyan A., Neal S., Zavala B., Zrinzo L., Hariz M., Foltynie T., Limousin P., Ashkan K., FitzGerald J., Ann. Neurol. 2013, 74, 449. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 86. Li S., Zhou W., Yuan Q., Liu Y., IEEE Trans. Neural Syst. Rehabil. Eng. 2013, 21, 880. [DOI] [PubMed] [Google Scholar]
  • 87. Anderson D. N., Anderson C., Lanka N., Sharma R., Butson C. R., Baker B. W., Dorval A. D., Front. Neurosci. 2019, 13, 1152. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 88. Kouzani A. Z., Abulseoud O. A., Tye S. J., Hosain M. K., Berk M., IEEE J. Transl. Eng. Health Med. 2013, 1, 1500109. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 89. Steigerwald F., Müller L., Johannes S., Matthies C., Volkmann J., Mov. Disord. 2016, 31, 1240. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 90. Cagnan H., Denison T., McIntyre C., Brown P., Nat. Biotechnol. 2019, 37, 1024. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 91. Boutet A., Hancu I., Saha U., Crawley A., Xu D. S., Ranjan M., Hlasny E., Chen R., Foltz W., Sammartino F., J. Neurosurg. 2019, 132, 586. [DOI] [PubMed] [Google Scholar]
  • 92. Potel S. R., Marceglia S., Meoni S., Kalia S. K., Cury R. G., Moro E., Curr. Neurol. Neurosci. Rep. 2022, 22, 577. [DOI] [PubMed] [Google Scholar]
  • 93. Pozzi N. G., Isaias I. U., Handb. Clin. Neurol. 2022, 184, 273. [DOI] [PubMed] [Google Scholar]
  • 94. Wu M., Yao K., Huang N., Li H., Zhou J., Shi R., Li J., Huang X., Li J., Jia H., Adv. Sci. 2023, 10, 2300504. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 95. Zhang X., Wu C., Lv Y., Zhang Y., Liu W., Nano Lett. 2022, 22, 7246. [DOI] [PubMed] [Google Scholar]
  • 96. Huang Y., Li H., Hu T., Li J., Yiu C. K., Zhou J., Li J., Huang X., Yao K., Qiu X., Nano Lett. 2022, 22, 5944. [DOI] [PubMed] [Google Scholar]
  • 97. Tibon R., Geerligs L., Campbell K., Trends Neurosci. 2022, 45, 507. [DOI] [PubMed] [Google Scholar]
  • 98. de Borman A., Vespa S., El Tahry R., Absil P.‐A., J. Neural Eng. 2022, 19, 026005. [DOI] [PubMed] [Google Scholar]
  • 99. Yan W., Wu Y., Front. Neurosci. 2022, 16, 991136. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 100. Gong Q., Yu Y., Kang L., Zhang M., Zhang Y., Wang S., Niu Y., Zhang Y., Di J., Li Q., Adv. Funct. Mater. 2022, 32, 2107360. [Google Scholar]
  • 101. Zhang J., Wang L., Xue Y., Lei I. M., Chen X., Zhang P., Cai C., Liang X., Lu Y., Liu J., Adv. Mater. 2023, 35, 2209324. [DOI] [PubMed] [Google Scholar]
  • 102. Bourceau P., Geier B., Suerdieck V., Bien T., Soltwisch J., Dreisewerd K., Liebeke M., Nat. Protoc. 2023, 18, 3050. [DOI] [PubMed] [Google Scholar]
  • 103. Han I. K., Song K. I., Jung S. M., Jo Y., Kwon J., Chung T., Yoo S., Jang J., Kim Y. T., Hwang D. S., Adv. Mater. 2023, 35, 2203431. [DOI] [PubMed] [Google Scholar]
  • 104. Tian G., Yang D., Liang C., Liu Y., Chen J., Zhao Q., Tang S., Huang J., Xu P., Liu Z., Adv. Mater. 2023, 35, 2212302. [DOI] [PubMed] [Google Scholar]
  • 105. Yang W., Kang X., Gao X., Zhuang Y., Fan C., Shen H., Chen Y., Dai J., Adv. Funct. Mater. 2023, 33, 2211340. [Google Scholar]
  • 106. Yang C., Suo Z., Nat. Rev. Mater. 2018, 3, 125. [Google Scholar]
  • 107. Fu Y., Zhou Y., Huang X., Dong B., Zhuge F., Li Y., He Y., Chai Y., Miao X., Adv. Funct. Mater. 2022, 32, 2111996. [Google Scholar]
  • 108. Zheng X. S., Yang Q., Vazquez A., Cui X. T., iScience 2022, 25, 104539. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 109. Lee C., Kim B., Kim J., Lee S., Jeon T., Choi W., Yang S., Ahn J.‐H., Bae J., Chae Y., IEEE J. Solid‐State Circuits. 2022, 57, 3212. [Google Scholar]
  • 110. Qiang Y., Gu W., Liu Z., Liang S., Ryu J. H., Seo K. J., Liu W., Fang H., Nano Res. 2021, 14, 3240. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 111. Zhang Y., Lu Q., He J., Huo Z., Zhou R., Han X., Jia M., Pan C., Wang Z. L., Zhai J., Nat. Commun. 2023, 14, 1252. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 112. Chang Y. Y., He J., Pu D., Chen L., Zhang Y.‐M., Zhang S. X.‐A., Device 2023, 1, 100021. [Google Scholar]
  • 113. Liu Z., Bai L., Zhu Z., Chen L., Sun Q., in IEEE Electron. Compon. Technol. Conf., IEEE Denver, United States: 2022, 2078. [Google Scholar]
  • 114. Jiang D., Demosthenous A., IEEE Trans. Biomed. Circuits Syst. 2018, 12, 940. [DOI] [PubMed] [Google Scholar]
  • 115. Wilke R., Moghadam G. K., Lovell N., Suaning G., Dokos S., J. Neural Eng. 2011, 8, 046016. [DOI] [PubMed] [Google Scholar]
  • 116. Dimov I. B., Moser M., Malliaras G. G., McCulloch I., Chem. Rev. 2022, 122, 4356. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 117. Gao X., Bao Y., Chen Z., Lu J., Su T., Zhang L., Ouyang J., Adv. Electron. Mater. 2023, 9, 2300082. [Google Scholar]
  • 118. Li Q., Wen C., Yang J., Zhou X., Zhu Y., Zheng J., Cheng G., Bai J., Xu T., Ji J., Chem. Rev. 2022, 122, 17073. [DOI] [PubMed] [Google Scholar]
  • 119. Kosuri S., Borca C. H., Mugnier H., Tamasi M., Patel R. A., Perez I., Kumar S., Finkel Z., Schloss R., Cai L., Adv. Healthc. Mater. 2022, 11, 2102101. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 120. Stephen M., Nawaz A., Lee S. Y., Sonar P., Leong W. L., Adv. Funct. Mater. 2023, 33, 2208521. [Google Scholar]
  • 121. Snider J., Plank M., Lynch G., Halgren E., Poizner H., J. Neurosci. 2013, 33, 15056. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 122. O'Byrne J., Jerbi K., Trends Neurosci. 2022, 45, 820. [DOI] [PubMed] [Google Scholar]
  • 123. Ray K. L., Griffin N. R., Shumake J., Alario A., Allen J. J., Beevers C. G., Schnyer D. M., Brain Res. 2023, 1806, 148282. [DOI] [PubMed] [Google Scholar]
  • 124. Onojima T., Goto T., Mizuhara H., Aoyagi T., PLoS Comput. Biol. 2018, 14, e1005928. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 125. Holroyd C. B., Trends Neurosci. 2022, 45, 346. [DOI] [PubMed] [Google Scholar]
  • 126. Coa R., La Cava S. M., Baldazzi G., Polizzi L., Pinna G., Conti C., Defazio G., Pani D., Puligheddu M., Front. Neurol. 2022, 13, 1030118. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 127. Yang X., Chen Z., Wang Z., He G., Li Z., Shi Y., Gong N., Zhao B., Kuang Y., Takahashi E., Mol. Brain 2022, 15, 16. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 128. Royer J., Bernhardt B. C., Larivière S., Gleichgerrcht E., Vorderwülbecke B. J., Vulliémoz S., Bonilha L., Epilepsia 2022, 63, 537. [DOI] [PubMed] [Google Scholar]
  • 129. Yang D., Ren Q., Nie J., Zhang Y., Wu H., Chang Z., Wang B., Dai J., Fang Y., Nano Lett. 2023, 24, 1052. [DOI] [PubMed] [Google Scholar]
  • 130. Guger C., Daban S., Sellers E., Holzner C., Krausz G., Carabalona R., Gramatica F., Edlinger G., Neurosci. Lett. 2009, 462, 94. [DOI] [PubMed] [Google Scholar]
  • 131. Criscuolo A., Schwartze M., Kotz S. A., Trends Neurosci. 2022, 45, 667. [DOI] [PubMed] [Google Scholar]
  • 132. Adaikkan C., Tsai L.‐H., Trends Neurosci. 2020, 43, 24. [DOI] [PubMed] [Google Scholar]
  • 133. Benussi A., Cantoni V., Grassi M., Brechet L., Michel C. M., Datta A., Thomas C., Gazzina S., Cotelli M. S., Bianchi M., Ann. Neurol. 2022, 92, 322. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 134. Zheng R., Wang Z., He Y., Zhang J., Cogn. Neurodyn. 2022, 16, 325. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 135. Li P., Wang C., Li M., Xuan X., Zhou B., Li H., Adv. Intell. Syst. 2023, 5, 2300018. [Google Scholar]
  • 136. Voytek B., Nat. Methods 2022, 19, 1349. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 137. Borra D., Fantozzi S., Bisi M. C., Magosso E., Sensors 2023, 23, 3530. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 138. Asher E. E., Plotnik M., Günther M., Moshel S., Levy O., Havlin S., Kantelhardt J. W., Bartsch R. P., Commun. Biol. 2021, 4, 1017. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 139. Strauch C., Wang C.‐A., Einhäuser W., Van der Stigchel S., Naber M., Trends Neurosci. 2022, 45, 635. [DOI] [PubMed] [Google Scholar]
  • 140. Clancy K. J., Baisley S. K., Albizu A., Kartvelishvili N., Ding M., Li W., Soc. Cogn. Affect. Neurosci. 2018, 13, 1305. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 141. Kaveh R., Doong J., Zhou A., Schwendeman C., Gopalan K., Burghardt F. L., Arias A. C., Maharbiz M. M., Muller R., IEEE Trans. Biomed. Circuits Syst. 2020, 14, 727. [DOI] [PubMed] [Google Scholar]
  • 142. Norton J. J., Lee D. S., Lee J. W., Lee W., Kwon O., Won P., Jung S.‐Y., Cheng H., Jeong J.‐W., Akce A., Proc. Natl. Acad. Sci. U. S. A. 2015, 112, 3920. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 143. Bouchard K. E., Mesgarani N., Johnson K., Chang E. F., Nature 2013, 495, 327. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 144. Schalk G., Leuthardt E. C., IEEE Rev. Biomed. Eng. 2011, 4, 140. [DOI] [PubMed] [Google Scholar]
  • 145. Yu Y., Qiu Y., Li G., Zhang K., Bo B., Pei M., Ye J., Thompson G. J., Cang J., Fang F., Feng Y., Duan X., Tong C., Liang Z., Nat. Commun. 2023, 14, 1651. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 146. Tan L. L., Oswald M. J., Kuner R., Trends Neurosci. 2021, 44, 629. [DOI] [PubMed] [Google Scholar]
  • 147. Imai A., Takahashi S., Furubayashi S., Mizuno Y., Sonoda M., Miyazaki T., Miyashita E., Fujie T., Adv. Mater. Technol. 2023, 8, 2300300. [Google Scholar]
  • 148. Leuthardt E. C., Schalk G., Wolpaw J. R., Ojemann J. G., Moran D. W., J. Neural Eng. 2004, 1, 63. [DOI] [PubMed] [Google Scholar]
  • 149. Asman P., Prabhu S., Tummala S., Ince N. F., in Proc. 44th Annu. Int. Conf. IEEE Eng. Med. Biol. Soc., IEEE, Glasgow, United Kingdom 2022, 4892. [DOI] [PubMed]
  • 150. Owolabi M. O., Leonardi M., Bassetti C., Jaarsma J., Hawrot T., Makanjuola A. I., Dhamija R. K., Feng W., Straub V., Camaradou J., Nat. Rev. Neurol. 2023, 19, 371. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 151. Betzel R. F., Medaglia J. D., Kahn A. E., Soffer J., Schonhaut D. R., Bassett D. S., Nat. Biomed. Eng. 2019, 3, 902. [DOI] [PubMed] [Google Scholar]
  • 152. JeyaJothi E. S., Anitha J., Rani S., Tiwari B., Biomed. Res. Int. 2022, 2022, 7242667. [DOI] [PMC free article] [PubMed] [Google Scholar] [Retracted]
  • 153. Ye Z., Hu C., Wang J., Liu H., Li L., Yuan J., Ha J. W., Li Z., Xiao L., Exploration 2023, 3, 2023002. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 154. Zhang P., Li W., Liu C., Qin F., Lu Y., Qin M., Hou Y., Exploration 2023, 3, 20230070. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 155. Hong G., Lieber C. M., Nat. Rev. Neurosci. 2019, 20, 330. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 156. Liu S., Liu L., Zhao Y., Wang Y., Wu Y., Zhang X.‐D., Ming D., Nano Lett. 2022, 22, 4400. [DOI] [PubMed] [Google Scholar]
  • 157. Thakor N. V., Sci. Tranl. Med. 2013, 5, 210. [DOI] [PubMed] [Google Scholar]
  • 158. Ren N., Hang C., Liu X., Jiang X., Nano Lett. 2022, 22, 7554. [DOI] [PubMed] [Google Scholar]
  • 159. Ferschmann L., Bos M. G., Herting M. M., Mills K. L., Tamnes C. K., Curr. Opin. Psychol. 2022, 44, 170. [DOI] [PubMed] [Google Scholar]
  • 160. Mankin E. A., Fried I., Neuron 2020, 106, 218. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 161. Lowet E., Kondabolu K., Zhou S., Mount R. A., Wang Y., Ravasio C. R., Han X., Nat. Commun. 2022, 13, 7709. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 162. Neumann W.‐J., Horn A., Kühn A. A., Trends Neurosci. 2023, 46, 472. [DOI] [PubMed] [Google Scholar]
  • 163. Sun F., Shen H., Yang Q., Yuan Z., Chen Y., Guo W., Wang Y., Yang L., Bai Z., Liu Q., Adv. Mater. 2023, 35, 2210018. [DOI] [PubMed] [Google Scholar]
  • 164. Emiliani V., Entcheva E., Hedrich R., Hegemann P., Konrad K. R., Lüscher C., Mahn M., Pan Z.‐H., Sims R. R., Vierock J., Nat. Rev. Methods Primers 2022, 2, 55. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 165. Rossi M. A., Trends Neurosci. 2023, 46, 738. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 166. Chen R., Romero G., Christiansen M. G., Mohr A., Anikeeva P., Science 2015, 347, 1477. [DOI] [PubMed] [Google Scholar]
  • 167. Lozano A. M., Lipsman N., Bergman H., Brown P., Chabardes S., Chang J. W., Matthews K., McIntyre C. C., Schlaepfer T. E., Schulder M., Nat. Rev. Neurol. 2019, 15, 148. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 168. Xiao B., Tan E.‐K., Trends Neurosci. 2023, 46, 1. [DOI] [PubMed] [Google Scholar]
  • 169. Udupa K., Chen R., Prog. Neurobiol. 2015, 133, 27. [DOI] [PubMed] [Google Scholar]
  • 170. Scangos K. W., Khambhati A. N., Daly P. M., Makhoul G. S., Sugrue L. P., Zamanian H., Liu T. X., Rao V. R., Sellers K. K., Dawes H. E., Nat. Med. 2021, 27, 1696. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 171. Zhou H., Kim K.‐N., Sung M.‐J., Han S. J., Lee T.‐W., Device 2023, 1, 100060. [Google Scholar]
  • 172. Ghanim R., Kaushik A., Park J., Abramson A., Device 2023, 1, 100092. [Google Scholar]
  • 173. Aron L., Zullo J., Yankner B. A., Curr. Opin. Neurobiol. 2022, 72, 91. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 174. Kringelbach M. L., Jenkinson N., Owen S. L., Aziz T. Z., Nat. Rev. Neurosci. 2007, 8, 623. [DOI] [PubMed] [Google Scholar]
  • 175. Vissani M., Isaias I. U., Mazzoni A., J. Neural Eng. 2020, 17, 051002. [DOI] [PubMed] [Google Scholar]
  • 176. Frey J., Cagle J., Johnson K. A., Wong J. K., Hilliard J. D., Butson C. R., Okun M. S., de Hemptinne C., Front. Neurol. 2022, 13, 825178. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 177. Messina M. S., Chang C. J., ACS Cent. Sci. 2023, 9, 1706. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 178. Lecoq J. A., Boehringer R., Grewe B. F., Nat. Methods 2023, 20, 495. [DOI] [PubMed] [Google Scholar]
  • 179. Horn A., Reich M., Vorwerk J., Li N., Wenzel G., Fang Q., Schmitz‐Hübsch T., Nickl R., Kupsch A., Volkmann J., Ann. Neurol. 2017, 82, 67. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 180. Krauss J. K., Lipsman N., Aziz T., Boutet A., Brown P., Chang J. W., Davidson B., Grill W. M., Hariz M. I., Horn A., Nat. Rev. Neurol. 2021, 17, 75. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 181. Yanagisawa T., Fukuma R., Seymour B., Hosomi K., Kishima H., Shimizu T., Yokoi H., Hirata M., Yoshimine T., Kamitani Y., Nat. Commun. 2016, 7, 13209. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 182. Liu X., Zhao Y., Dou J., Hou Q., Cheng J., Jiang X., Nano Lett. 2022, 22, 1091. [DOI] [PubMed] [Google Scholar]
  • 183. Kikkert S., Kolasinski J., Jbabdi S., Tracey I., Beckmann C. F., Johansen‐Berg H., Makin T. R., Elife 2016, 5, e15292. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 184. Shan L., Zeng H., Liu Y., Zhang X., Li E., Yu R., Hu Y., Guo T., Chen H., Nano Lett. 2022, 22, 7275. [DOI] [PubMed] [Google Scholar]
  • 185. Godwin‐Jones R., Lang. Learn. Technol. 2023, 27, 6. [Google Scholar]
  • 186. Yang Q., Mishra R., Cen Y., Shi G., Sharma R., Fong X., Yang H., Nano Lett. 2022, 22, 8437. [DOI] [PubMed] [Google Scholar]
  • 187. Padhy S. P., Kalinin S. V., Device 2023, 1, 100115. [Google Scholar]
  • 188. Schreiber S., Bernal J., Arndt P., Schreiber F., Müller P., Morton L., Braun‐Dullaeus R. C., Valdés‐Hernández M. D. C., Duarte R., Wardlaw J. M., Cell 2023, 12, 957. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 189. Ang K. K., Chin Z. Y., Zhang H., Guan C., in Proc. 2008 Int. Joint Conf. Neural Networks , IEEE, Hong Kong, China 2008, 2390.
  • 190. Zhang Y., Zhou G., Jin J., Wang X., Cichocki A., J. Neurosci. Methods 2015, 255, 85. [DOI] [PubMed] [Google Scholar]
  • 191. Shuaibu Z., Qi L., Int. J. Comput. Applic. 2020, 175, 16. [Google Scholar]
  • 192. Makeig S., Onton J., ERP Features and EEG Dynamics: An ICA Perspective. Oxford University Press, Oxford Handbooks Online: 2011. [Google Scholar]
  • 193. Wu X., Zhou B., Lv Z., Zhang C., IEEE J. Biomed. Health Inf. 2019, 24, 775. [DOI] [PubMed] [Google Scholar]
  • 194. Lin C.‐T., Tian Y., Wang Y.‐K., Do T.‐T. N., Chang Y.‐L., King J.‐T., Huang K.‐C., Liao L.‐D., IEEE Trans. Intell. Transp. Syst. 2022, 23, 10395. [Google Scholar]
  • 195. Chamola V., Vineet A., Nayyar A., Hossain E., Sensors 2020, 20, 3620. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 196. Atangana R., Tchiotsop D., Kenne G., Chanel L., Health Inf. Int. J. 2020, 9, 14. [Google Scholar]
  • 197. Siddharth T., Gajbhiye P., Tripathy R. K., Pachori R. B., IEEE Sens. J. 2020, 20, 11421. [Google Scholar]
  • 198. Sha'Abani M., Fuad N., Jamal N., Ismail M., in Proc. 5th Int. Conf. Electr. Control Comput. Eng., IEEE, Kuantan, Malaysia 2020, 555.
  • 199. Huang M., Liu Z., Tao Y., Simul. Model. Pract. Theory 2020, 102, 101981. [Google Scholar]
  • 200. Lin P.‐J., Jia T., Li C., Li T., Qian C., Li Z., Pan Y., Ji L., IEEE Trans. Neural Syst. Rehabil. Eng. 2021, 29, 1936. [DOI] [PubMed] [Google Scholar]
  • 201. Hosman T., Vilela M., Milstein D., Kelemen J. N., Brandman D. M., Hochberg L. R., Simeral J. D., in Proc. 9th Int. IEEE/EMBS Conf. Neural Eng., IEEE, San Francisco, United States 2019, 1066.
  • 202. Guo J., Liu Y., Lin L., Li S., Cai J., Chen J., Huang W., Lin Y., Xu J., Nano Lett. 2023, 23, 9651. [DOI] [PubMed] [Google Scholar]
  • 203. Chan K. K., Shang L.‐W., Qiao Z., Liao Y., Kim M., Chen Y.‐C., Nano Lett. 2022, 22, 8949. [DOI] [PubMed] [Google Scholar]
  • 204. Makin J. G., Moses D. A., Chang E. F., Nat. Neurosci. 2020, 23, 575. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 205. Anumanchipalli G. K., Chartier J., Chang E. F., Nature 2019, 568, 493. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 206. Wolpaw J. R., Birbaumer N., McFarland D. J., Pfurtscheller G., Vaughan T. M., Clin. Neurophysiol. 2002, 113, 767. [DOI] [PubMed] [Google Scholar]
  • 207. Farwell L. A., Donchin E., Electroencephalogr. Clin. Neurophysiol. 1988, 70, 510. [DOI] [PubMed] [Google Scholar]
  • 208. Luo S., Angrick M., Coogan C., Candrea D. N., Wyse‐Sookoo K., Shah S., Rabbani Q., Milsap G. W., Weiss A. R., Anderson W. S., Adv. Sci. 2023, 10, 202304853. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 209. Schwemmer M. A., Skomrock N. D., Sederberg P. B., Ting J. E., Sharma G., Bockbrader M. A., Friedenberg D. A., Nat. Med. 2018, 24, 1669. [DOI] [PubMed] [Google Scholar]
  • 210. Willett F. R., Kunz E. M., Fan C., Avansino D. T., Wilson G. H., Choi E. Y., Kamdar F., Hochberg L. R., Druckmann S., Shenoy K. V., Nature 2023, 620, 1031. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 211. Rapeaux A. B., Constandinou T. G., Curr. Opin. Biotechnol. 2021, 72, 102. [DOI] [PubMed] [Google Scholar]
  • 212. Sani O. G., Yang Y., Shanechi M. M., BCI Res. 2021, 9, 121. [Google Scholar]
  • 213. Yang Y., Qiao S., Sani O. G., Sedillo J. I., Ferrentino B., Pesaran B., Shanechi M. M., Nat. Biomed. Eng. 2021, 5, 324. [DOI] [PubMed] [Google Scholar]
  • 214. Xia F., Kheirbek M. A., Trends Neurosci. 2020, 43, 902. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 215. Lee J., Liao H., Wang Q., Han J., Han J. H., Shin H. E., Ge M., Park W., Li F., Exploration 2022, 2, 20210086. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 216. Mei J., Muller E., Ramaswamy S., Trends Neurosci. 2022, 45, 237. [DOI] [PubMed] [Google Scholar]
  • 217. Wan X., Li Z., Yu W., Wang A., Ke X., Guo H., Su J., Li L., Gui Q., Zhao S., Adv. Mater. 2023, DOI: 10.1002/adma.202305192 [DOI] [PubMed] [Google Scholar]
  • 218. Kinney‐Lang E., Auyeung B., Escudero J., J. Neural Eng. 2016, 13, 061002. [DOI] [PubMed] [Google Scholar]
  • 219. Wang T., Song Z., Zhao X., Wu Y., Wu L., Haghparast A., Wu H., Exploration 2023, 3, 20220133. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 220. Wang Z., Yang X., Zhao B., Li W., Heliyon 2023, 9, e14786. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 221. Fulton S., Décarie‐Spain L., Fioramonti X., Guiard B., Nakajima S., Trends Endocrinol. Metab. 2022, 33, 18. [DOI] [PubMed] [Google Scholar]
  • 222. Yuan Z., Peng Y., Wang L., Song S., Chen S., Yang L., Liu H., Wang H., Shi G., Han C., IEEE Trans. Neural Syst. Rehabil. Eng. 2021, 29, 2569. [DOI] [PubMed] [Google Scholar]
  • 223. Sani O. G., Yang Y., Lee M. B., Dawes H. E., Chang E. F., Shanechi M. M., Nat. Biotechnol. 2018, 36, 954. [DOI] [PubMed] [Google Scholar]
  • 224. Wu X., Jiang Y., Rommelfanger N. J., Yang F., Zhou Q., Yin R., Liu J., Cai S., Ren W., Shin A., Nat. Biomed. Eng. 2022, 6, 754. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 225. Cui H., Zhao S., Hong G., Device 2023, 1, 100113. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 226. Brennan M., Device 2023, 1, 100096. [Google Scholar]
  • 227. Pandiyan A., Veeramuthu L., Yan Z.‐L., Lin Y.‐C., Tsai C.‐H., Chang S.‐T., Chiang W.‐H., Xu S., Zhou T., Kuo C.‐C., Prog. Mater. Sci. 2023, 140, 101206. [Google Scholar]
  • 228. Dai Z., Lei M., Ding S., Zhou Q., Ji B., Wang M., Zhou B., Exploration 2023, 3, 20230046. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 229. Tat T., Zhao X., Chen J., Device 2023, 1, 100094. [Google Scholar]
  • 230. Liao L.‐D., Chen C.‐Y., Wang I.‐J., Chen S.‐F., Li S.‐Y., Chen B.‐W., Chang J.‐Y., Lin C.‐T., J. Neuroeng. Rehabil. 2012, 9, 5. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 231. Piccini L., Parini S., Maggi L., Andreoni G., in 27th Annu. Int. Conf. IEEE Eng. Med. Biol. Soc., IEEE, Piscataway, United States 2006, 5384.
  • 232. Sullivan T. J., Deiss S. R., Jung T.‐P., Cauwenberghs G., A brain‐machine interface using dry‐contact, low‐noise EEG sensors, in Proc. IEEE Int. Symp. Circuits Syst , IEEE Seattle, United States 2008, 1986.
  • 233. Design Y., https://www.yankodesign.com/2021/02/23/cognixions (accessed February 2021).
  • 234. Jakobs M., Fomenko A., Lozano A. M., Kiening K. L., EMBO Mol. Med. 2019, 11, e9575. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 235. Mestais C. S., Charvet G., Sauter‐Starace F., Foerster M., Ratel D., Benabid A. L., IEEE Trans. Neural Syst. Rehabil. Eng. 2014, 23, 10. [DOI] [PubMed] [Google Scholar]
  • 236. Fonds Clinatec , https://fonds‐clinatec.fr/projets‐projets‐soutenus/ (accessed June 2022).
  • 237. Sun Z., Zhu M., Shan X., Lee C., Nat. Commun. 2022, 13, 5224. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 238. Leroy E., Hinchet R., Shea H., Adv. Mater. 2020, 32, 2002564. [DOI] [PubMed] [Google Scholar]
  • 239. Zhu M., Sun Z., Zhang Z., Shi Q., He T., Liu H., Chen T., Lee C., Sci. Adv. 2020, 6, eaaz8693. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 240. Lee Y. R., Trung T. Q., Hwang B.‐U., Lee N.‐E., Nat. Commun. 2020, 11, 2753. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 241. Park J., Kang D.‐h., Chae H., Ghosh S. K., Jeong C., Park Y., Cho S., Lee Y., Kim J., Ko Y., Sci. Adv. 2022, 8, eabj9220. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 242. Halbach A., Gijsenbergh P., Jeong Y., Devriese W., Gao H., Billen M., Torri G. B., Chare C., Cheyns D., Rottenberg X., Rochus V., In 20th Int. Conf. Solid‐State Sens. Actuat. Microsyst. Eurosens. XXXIII . IEEE, Estrel Berlin, Germany 2019, 158.
  • 243. Yang T. H., Kim J. R., Jin H., Gil H., Koo J. H., Kim H. J., Adv. Funct. Mater. 2021, 31, 2008831. [Google Scholar]
  • 244. Ultraleap, https://www.ultraleap.com/ (accessed June 2021).
  • 245. Hirayama R., Plasencia D.M., Masuda N., Subramanian S., Nature 2019, 575, 320. [DOI] [PubMed] [Google Scholar]
  • 246. Yin H., Jiang W., Liu Y., Zhang D., Wu F., Zhang Y., Li C., Chen G., Wang Q., BMEMat 2023, 1, e12023. [Google Scholar]
  • 247. Lee H., Kim J.‐S., Choi S., Jun J.‐H., Park J.‐R., Kim A.‐H., Oh H.‐B., Kim H.‐S., Chung S.‐C., in Proc. IEEE World Haptics Conf. Evanston, IEEE, United States 2015, 374.
  • 248. Zhang Q., Dong H., El Saddik A., IEEE Access 2016, 4, 299. [Google Scholar]
  • 249. Song K., Kim S. H., Jin S., Kim S., Lee S., Kim J.‐S., Park J.‐M., Cha Y., Sci. Rep. 2019, 9, 8988. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 250. De Fazio R., Mastronardi V. M., Petruzzi M., De Vittorio M., Visconti P., Future Internet 2022, 15, 14. [Google Scholar]
  • 251. Ying M., Bonifas A. P., Lu N., Su Y., Li R., Cheng H., Ameen A., Huang Y., Rogers J. A., Nanotechnology 2012, 23, 344004. [DOI] [PubMed] [Google Scholar]
  • 252. Lee J., Sul H., Lee W., Pyun K. R., Ha I., Kim D., Park H., Eom H., Yoon Y., Jung J., Adv. Funct. Mater. 2020, 30, 1909171. [Google Scholar]
  • 253. Choi S., Park J., Hyun W., Kim J., Kim J., Lee Y. B., Song C., Hwang H. J., Kim J. H., Hyeon T., ACS Nano 2015, 9, 6626. [DOI] [PubMed] [Google Scholar]
  • 254. Kim S., Kim T., Kim C. S., Choi H., Kim Y. J., Lee G. S., Oh O., Cho B. J., Soft Rob. 2020, 7, 736. [DOI] [PubMed] [Google Scholar]
  • 255. Xu S., Ahmed S., Momin M., Hossain A., Zhou T., Device 2023, 1, 100067. [Google Scholar]
  • 256. Xu S., Han Z., Yuan K., Qin P., Zhao W., Lin T., Zhou T., Huang F., Nat. Rev. Methods Primers 2023, 3, 44. [Google Scholar]
  • 257. Wang H., Xu S., Dong W., Sun D., Zhang S., Han Z., Huang F., Chem. Eur. J. 2022, 28, e202200124. [DOI] [PubMed] [Google Scholar]
  • 258. Beyeler M., Sanchez‐Garcia M., J. Neural Eng. 2022, 19, 063001. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 259. Zgallai W., Brown J. T., Ibrahim A., Mahmood F., Mohammad K., Khalfan M., Mohammed M., Salem M., Hamood N., in Proc. Adv. Sci. Eng. Technol. Int. Conf., IEEE, Dubai, United Arab Emirates 2019, 1.
  • 260. Aslam A. M., Chaudhary R., Bhardwaj A., Budhiraja I., Kumar N., Zeadally S., IEEE IoT J. 2023, 6, 32. [Google Scholar]
  • 261. Chaudhary U., Birbaumer N., Ramos‐Murguialday A., Nat. Rev. Neurol. 2016, 12, 513. [DOI] [PubMed] [Google Scholar]
  • 262. Chella A., Pagello E., Menegatti E., Sorbello R., Anzalone S. M., Cinquegrani F., Tonin L., Piccione F., Prifitis K., Blanda C., in Proc. Int. Conf. Complex Int. Softw. Intensive Syst., IEEE, Fukuoka, Japan 2009, 783.
  • 263. Spataro R., Chella A., Allison B., Giardina M., Sorbello R., Tramonte S., Guger C., La Bella V., Front. Hum. Neurosci. 2017, 11, 68. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 264. Xu S.‐M., Liang X., Wu X.‐Y., Zhao S.‐L., Chen J., Wang K.‐X., Chen J.‐S., Nat. Commun. 2019, 10, 5810. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 265. Xu S., Peng B., Pang X., Huang F., ACS Mater. Lett. 2022, 4, 2195. [Google Scholar]
  • 266. Xu S.‐M., Liang X., Liu X., Bai W.‐L., Liu Y.‐S., Cai Z.‐P., Zhang Q., Zhao C., Wang K.‐X., Chen J.‐S., Energy Storage Mater. 2020, 25, 52. [Google Scholar]
  • 267. Gillani S. H., Sohail M., El Maati L. A., Altuijri R., Zairov R., Nazar M. F., Ahmad I., J. Alloys Compd. 2024, 976, 172931. [Google Scholar]
  • 268. Flesher S. N., Downey J. E., Weiss J. M., Hughes C. L., Herrera A. J., Tyler‐Kabara E. C., Boninger M. L., Collinger J. L., Gaunt R. A., Science 2021, 372, 831. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 269. Gu X., Cao Z., Jolfaei A., Xu P., Wu D., Jung T.‐P., Lin C.‐T., IEEE/ACM Trans. Comput. Biol. Bioinform. 2021, 18, 1645. [DOI] [PubMed] [Google Scholar]
  • 270. Wang Z., Pan J., Yuan R., Chen M., Guo X., Zhou S., Nano Lett. 2023, 23, 6544. [DOI] [PubMed] [Google Scholar]
  • 271. Godoy D. A., Rabinstein A. A., Curr. Opin. Crit. Care 2022, 28, 111. [DOI] [PubMed] [Google Scholar]
  • 272. Ramos‐Murguialday A., Broetz D., Rea M., Läer L., Yilmaz Ö., Brasil F. L., Liberati G., Curado M. R., Garcia‐Cossio E., Vyziotis A., Ann. Neurol. 2013, 74, 100. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 273. Bouton C. E., Shaikhouni A., Annetta N. V., Bockbrader M. A., Friedenberg D. A., Nielson D. M., Sharma G., Sederberg P. B., Glenn B. C., Mysiw W. J., Nature 2016, 533, 247. [DOI] [PubMed] [Google Scholar]
  • 274. Wang Y., Gao X., Hong B., Jia C., Gao S., IEEE Eng. Med. Biol. Mag. 2008, 27, 64. [DOI] [PubMed] [Google Scholar]
  • 275. Singh R., Rai N. K., Gupta A., Chouhan S., Joshi A., Goyal M., Int. J. Neurosci. 2023, 1, 2269472. [DOI] [PubMed] [Google Scholar]
  • 276. Sellers E. W., Krusienski D. J., McFarland D. J., Vaughan T. M., Wolpaw J. R., Biol. Psychol. 2006, 73, 242. [DOI] [PubMed] [Google Scholar]
  • 277. Suefusa K., Tanaka T., IEEE Trans. Biomed. Eng. 2017, 65, 2119. [DOI] [PubMed] [Google Scholar]
  • 278. Nagel S., Spüler M., Sci. Rep. 2019, 9, 8269. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 279. Iwane F., Sobolewski A., Chavarriaga R., Millán J. D. R., iScience 2023, 26, 107524. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 280. Osborn L. E., Dragomir A., Betthauser J. L., Hunt C. L., Nguyen H. H., Kaliki R. R., Thakor N. V., Sci. Rob. 2018, 3, eaat3818. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 281. Zhen K., Zhang S., Tao X., Li G., Lv Y., Yu L., npj Park. Dis. 2022, 8, 146. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 282. Zhang X., Ma Z., Zheng H., Li T., Chen K., Wang X., Liu C., Xu L., Wu X., Lin D., Ann. Transl. Med. 2020, 8, 712. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 283. Hughes C., Herrera A., Gaunt R., Collinger J., Handb. Clin. Neurol. 2020, 168, 163. [DOI] [PubMed] [Google Scholar]
  • 284. Berthouze L., Tijsseling A., J. Three Dimens. Images 2002, 16, 141. [Google Scholar]
  • 285. Anderson C. F., Chakroun R. W., Grimmett M. E., Domalewski C. J., Wang F., Cui H., Nano Lett. 2022, 22, 4182. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 286. Roelfsema P. R., Denys D., Klink P. C., Trends Cogn. Sci. 2018, 22, 598. [DOI] [PubMed] [Google Scholar]
  • 287. Brown L., van de Molengraft J., Yazicioglu R. F., Torfs T., Penders J., Van Hoof C., in Proc. Annu. Int. Conf. IEEE Eng. Med. Biol. Soc., IEEE, Buenos Aires, Argentina 2010, 4197. [DOI] [PubMed]
  • 288. Quasar, http://www.quasarusa.com/products_dsi.htm (accessed June 2022).
  • 289. Shi Y., Wang F., Tian J., Li S., Fu E., Nie J., Lei R., Ding Y., Chen X., Wang Z. L., Sci. Adv. 2021, 7, eabe2943. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 290. Yu X., Xie Z., Yu Y., Lee J., Vazquez‐Guardado A., Luan H., Ruban J., Ning X., Akhtar A., Li D., Nature 2019, 575, 473. [DOI] [PubMed] [Google Scholar]
  • 291. Kumar R., Mehta U., Chand P., Procedia Comput. Sci. 2017, 105, 264. [Google Scholar]

Associated Data


Supplementary Materials

Supporting Information

EXP2-4-20230146-s001.docx (67.5KB, docx)
