Abstract
Tongue drive system (TDS) is a tongue-operated, minimally invasive, unobtrusive, noncontact, and wireless assistive technology that infers users’ intentions by detecting and classifying their voluntary tongue motions, and translating them to user-defined commands. We have developed customized interface circuitry between an external TDS (eTDS) prototype and a commercial powered wheelchair (PWC) as well as three control strategies to evaluate tongue motion as an alternative control input for wheeled mobility. We tested the eTDS performance in driving PWCs on 12 able-bodied human subjects, of which 11 were novices. The results showed that all subjects could complete the navigation tasks by operating the PWC with their tongue motions. Despite little prior experience, the average time using the eTDS and the tongue was only approximately three times longer than using a joystick and the fingers. Navigation time was strongly dependent on the number of issued commands, which decreased as the subjects gained experience. In particular, unintended issued commands (the Midas touch problem) were rare, demonstrating the effectiveness of the tongue tracking and external magnetic field cancellation algorithms as well as the safety of the TDS for wheeled mobility.
Keywords: Assistive technologies (ATs), environmental control, magnetic sensors, telemetry, tongue motion, wheeled mobility
I. Introduction
Persons with severe disabilities, such as tetraplegia, generally find it extremely difficult to carry out daily tasks without receiving continuous help [1]. In order to move around in home, office, and outdoor environments, these individuals are completely dependent on their electrically powered wheelchairs (PWCs). PWCs are among the most helpful tools for allowing individuals to complete daily tasks with greater independence, and to access school, work, and community environments [2]. The default method for controlling a PWC is a joystick, which requires a certain level of physical movement ability in the upper limbs that may not exist in people with severe disabilities. A few assistive technologies (ATs) have been developed to provide alternative PWC control by detecting certain patterns of users’ remaining abilities, such as brain waves, muscular activities, diaphragm control, or eye position [3]. However, most of these devices have limitations and safety issues that have prevented all but a few from becoming popular among potential users.
Sip-n-puff is a simple, low-cost, and easy-to-use AT, which allows its users to control a PWC by blowing or sucking through a straw. However, it is slow, cumbersome, and needs to be frequently cleaned. It has only a limited number of direct choices (four commands), which should be entered in series [4]. It also requires diaphragm control, and may not benefit those who continuously use ventilators.
A group of ATs based on tracking eye motion from corneal reflection, pupil position, or electrooculogram (EOG) potentials have been used for computer access and PWC control [5]-[7]. Since the eyes have evolved as sensory parts of our body, a drawback of eye-tracking systems is that they affect the users’ normal vision by requiring extra eye movements that sometimes interfere with the users’ visual tasks. Despite some recent progress, the Midas touch problem, in which unintended commands are issued when the user merely looks at a point and the system interprets the gaze as a command, has not been entirely solved for eye gaze devices [8]. Moreover, there should always be a camera or display in front of the users for detection or visual feedback, respectively, which may block their sight.
Another group of controllers, such as head arrays, based on chin or head movements, require a certain level of neck and shoulder movement ability and the strength that may not exist [9], [10]. Even in individuals who do have control over their neck and shoulders, these muscles are often quite weak and cannot be used on a continuous basis for an extended period of time.
There is considerable ongoing research on a series of ATs, known as brain–computer interfaces (BCIs), which directly discern users’ intentions from their electrophysiological brain activities, such as EEG, electrocorticograms (ECoGs), evoked potentials (EPs), and intracortical neural signals [11]. So far, BCIs have been used for controlling the cursor on a computer screen or moving a graphical object through a virtual environment [12]. However, their applicability to wheeled mobility is still in doubt due to their slow response (EEG or ECoG), high error rate, and large instrumentation needed for signal acquisition and processing (neural signals). Except for the EEG-based BCIs, they also require highly invasive procedures for electrode placement on the surface of the brain or in the cortex, which may not be desired by potential users.
According to recent reports, EEG-based BCIs have achieved information transfer rates (ITRs) of up to 25 b/min, which means that it would take ~2 s on average to issue a command to stop the PWC, for example, see [11]. As a result, in an experiment described in [13], the PWC had to be stopped during EEG analysis, which is not acceptable in practical conditions. The small amplitude of the brain waves (often less than 1 mV) makes them susceptible to electromagnetic interference and motion artifacts, which are inevitable during PWC operation outside controlled laboratory environments, leading to serious safety concerns. Also, using EEG-based BCIs requires concentration, which would be difficult to maintain during emergency situations. Further, most of these devices require training and considerable time for preparing the scalp for good electrode contact (setup time) and cleaning up after removal [11].
Electromyogram (EMG) is arguably one of the most suitable sources of bioelectric signal for PWC control [3]. However, EMG-based systems are relatively error-prone and need complex muscular interactions as well as electrode attachment [14]. A few hands-free PWC control approaches use voice commands as input signals [15]. These systems are suitable for computer access in quiet places. However, they can be unreliable for PWC control in noisy and outdoor environments. There are also methods that utilize a combination of different input signals to improve the reliability of the alternative control system at the expense of more sophisticated signal processing and higher costs [16].
The tongue and mouth occupy an amount of motor cortex in the human brain that rivals that of the fingers and the hand. Therefore, they are inherently capable of sophisticated manipulation tasks [17]. The tongue is connected to the brain via the hypoglossal nerve, which generally escapes severe damage in spinal cord injuries (SCIs). The tongue muscle is similar to the heart muscle in that it does not fatigue easily [18]. Finally, the tongue is noninvasively accessible, and is not influenced by the position of the rest of the body, which can be adjusted for maximum user comfort. These reasons have resulted in the development of a few tongue-operated ATs, such as the tongue-touch-keypad [19]-[22]. These devices, however, require bulky objects inside the mouth that may interfere with the users’ speech, ingestion, and sometimes breathing. There are also a number of mouth-operated joysticks, which can provide proportional control [23], [24]. However, they can only be used when the user is in a certain position, and they require head movement to grab the mouth joystick. They also require tongue and lip contact and pressure, which may cause fatigue and irritation over long-term use [22], [25].
Tongue drive system (TDS) is a minimally invasive, unobtrusive, noncontact, wireless, and wearable tongue-operated AT that can potentially substitute some of the arm and hand functions, which are considered the highest priorities for individuals with severe disabilities, with tongue motion [26]-[28]. Here, we are reporting on our progress in interfacing the latest TDS prototype with a PWC, and performing human trials on 12 able-bodied human subjects.
II. System Architecture
A. TDS Overview
TDS consists of an array of magnetic sensors, mounted on a headset, and a small permanent magnetic tracer, secured on the tongue. The magnet can be temporarily attached to the tongue using tissue adhesives. For long-term usage, however, the user should receive a tongue piercing embedded with the magnetic tracer. Alternatively, the magnet can be coated with biocompatible materials, such as titanium or gold, and implanted under the tongue mucosa. The magnetic field generated by the tracer varies inside and around the mouth with the tongue movements. These variations are detected by the magnetic sensors and wirelessly transmitted to a smartphone or an ultramobile personal computer (UMPC), which can be worn by the user or attached to his/her PWC. A sensor signal processing (SSP) algorithm running on the UMPC classifies the sensor signals and converts them into user control commands, which are then wirelessly communicated to the target devices in the user’s environment [27]-[29].
A principal advantage of the TDS is that a few sensors and a small magnetic tracer can potentially capture a large number of tongue movements, each of which can represent a particular command. By tracking tongue movements in real time, TDS also has the potential to provide its users with proportional control, which is easier, smoother, and more natural than switch-based control for complex tasks such as driving a PWC in confined spaces [30], [31].
To avoid the “Midas touch” problem, in which an AT issues unintended commands as a result of users’ unrelated activities, the tongue gestures associated with TDS commands can be defined so that they are sufficiently different from the voluntary or reflexive tongue movements that result from speech, swallowing, coughing, or sneezing. In addition, we have defined a specific tongue command to switch the TDS to standby mode when the user intends to eat or engage in a conversation. Issuing the same command returns the TDS back to the operational mode. A mild concern about TDS usage is that its users should avoid inserting ferromagnetic objects in their mouth. For example, they should use silver, plastic, or wooden utensils instead of stainless steel. Similar to cochlear implants, the magnetic tracer should be removed if the user is undergoing MRI.
B. External TDS (eTDS) Prototype
In the latest eTDS prototype, shown in Fig. 1, the magnetic tracer is a small disk-shaped rare earth permanent magnet (Ø5 mm × 1.3 mm) that we attach to the subjects’ tongues using a cyanoacrylic tissue adhesive (Cyanodent, Ellman International, Inc., Oceanside, NY). Magnetic field variations are detected by a pair of three-axial magnetoinductive sensor modules (PNI, Santa Rosa, CA) mounted bilaterally on a headgear near the user’s cheeks. In PNI sensors, an inductor with a soft ferromagnetic core varies the resonance frequency of an LC oscillator proportional to the external magnetic field (EMF) along its main axis, which is measured to produce a digital output [32]. An ultralow-power microcontroller unit (MCU) (MSP430, Texas Instruments, Dallas, TX) takes 13 samples/s from each of the six sensors (three per module), while activating only one sensor at a time to save power. These samples are arranged in a data frame, which is then wirelessly transmitted to a laptop PC across a 2.4-GHz wireless link that is established between two identical low-power transceivers (nRF24L01, Nordic Semiconductor, Norway). Some of the important technical specifications of the eTDS prototype are summarized in Table I.
Fig. 1.
eTDS prototype.
TABLE I.
eTDS Specifications
| Specification | Value |
|---|---|
| Control Unit | |
| Microcontroller | Texas Instruments MSP430F1232 |
| Control unit dimensions | 22.5 × 18 × 16 mm³ |
| Wireless transceiver | Nordic nRF24L01 @ 2.4 GHz |
| Operating voltage / current | 2.2 V / ~4 mA |
| Weight | 5 g (without battery) |
| Magnetic Sensor Module | |
| Magnetoinductive sensors | PNI 2-axis MS2100 + SEN-S65 |
| Sensor dimensions | 7 × 7 × 1.5 mm³ (MS2100); 6.3 × 2.3 × 2.2 mm³ (SEN-S65) |
| Sensor module dimensions | 25 × 23 × 13 mm³ |
| Sensor resolution / range | 0.0263 μT / 1100 μT (MS2100); 0.015 μT / 1100 μT (SEN-S65) |
| Sensitivity (programmable) | 0.30–67 counts/μT |
| Weight | 3 g |
| Magnetic Tracer | |
| Source and type | RadioShack rare-earth super magnet |
| Size (diameter × thickness) | Ø5 mm × 1.3 mm |
| Residual magnetic strength | 10,800 G |
| Weight | 0.2 g |
The eTDS has two operating modes: standby and active. By holding their tongue near the right sensor module for more than 2 s, users can toggle between the two modes. In the standby mode, the transceiver is off, and the microcontroller samples only the right module at 1 Hz. In this mode, the SSP output is locked in “neutral” to avoid any unintended commands. In the active mode, the transceiver is on and all sensors are sampled, one at a time, at a rate of 13 Hz. The SSP algorithm classifies the incoming signals into six PWC control commands, which are described in the following section. SSP also includes a preprocessing component that filters out the common-mode signals resulting from EMF interference, such as the earth’s magnetic field, based on a novel stereodifferential cancellation mechanism, described in [29].
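The hold-to-toggle behavior described above can be modeled as a small state machine. The sketch below is illustrative only; the class name, sampling interface, and internal logic are our assumptions, not the eTDS firmware.

```python
# Illustrative model of the standby/active toggle: holding the tongue near
# the right sensor module for more than 2 s switches between modes.
class ModeController:
    HOLD_TIME_S = 2.0           # hold duration required to toggle modes

    def __init__(self):
        self.active = False     # start in standby mode
        self.hold_start = None  # time at which the toggle gesture began

    def update(self, near_right_module: bool, t: float) -> bool:
        """Feed one sample at time t; returns True while in active mode."""
        if near_right_module:
            if self.hold_start is None:
                self.hold_start = t                 # gesture begins
            elif t - self.hold_start >= self.HOLD_TIME_S:
                self.active = not self.active       # toggle standby <-> active
                self.hold_start = None              # require a fresh gesture
        else:
            self.hold_start = None                  # gesture interrupted
        return self.active
```

In standby, a real implementation would also gate the transceiver and lock the SSP output to neutral, as the text describes.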
C. PWC Control
In order to evaluate the TDS performance in controlling PWCs, we have developed a TDS–PWC interface module, which consists of a computer-controlled adapter circuitry and several control strategies, running in the LabVIEW environment, to operate a low-end commercial PWC (Golden Technologies, Old Forge, PA) by substituting its default joystick controller.
1) TDS-PWC Interface Module
The interface module receives eTDS control commands from the laptop through the same Universal Serial Bus (USB) port that wirelessly receives the sensor signals from the headset, and converts them into four synchronized rectangular waveforms, whose lower amplitudes vary between 0 and 5 V while their upper levels always stay at 5 V. These signals substitute for the signals that the VR2 motor controller (PG Drives Technology, Anaheim, CA) receives from its designated proportional joystick module. Hence, we take advantage of the rest of the motor control circuitry without any changes. Each pair of the aforementioned four signals is associated with one state vector, as described in Section II-C2. The direction and speed of the two PWC electric motors can be smoothly controlled by changing the lower amplitudes of these four signals [33].
Fig. 2 shows the TDS–PWC interface block diagram and signal flow graph. eTDS control commands, once detected from the sensor data, are sent from the laptop to an MCU in the interface module, which determines the amplitudes of the PWC control signals through 12-bit digital-to-analog converters (DACs). These dc levels are chopped by an analog switch to synchronize them with the VR2 controller master clock before substituting for its joystick input signals.
Fig. 2.
Block diagram and signal flow graph of the TDS–PWC interface circuitry.
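To illustrate the command-to-amplitude path, the sketch below maps a signed state-vector value onto a 12-bit DAC code. The assumption that 0–5 V spans codes 0–4095 with neutral at mid-scale (2.5 V) is ours, modeled on typical joystick emulation; the actual VR2 voltage mapping is not specified in the text.

```python
# Hypothetical mapping from a signed state-vector value to a 12-bit DAC code.
# Assumes 0 V -> code 0, 5 V -> code 4095, and neutral at mid-scale.
def vector_to_dac(v: float, v_max: float = 1.0) -> int:
    """Map v in [-v_max, +v_max] onto a 12-bit code, neutral at mid-scale."""
    v = max(-v_max, min(v_max, v))                  # clamp to valid range
    return int((v / v_max + 1.0) / 2.0 * 4095 + 0.5)
```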
To improve safety, we have added a watchdog timer to the TDS–PWC interface. If the wireless link is broken due to a malfunction in the eTDS or electromagnetic interference, or if the laptop freezes, the resulting slowdown in receiving control commands is detected by the watchdog. In this case, the MCU resets all control signals to bring the PWC to a standstill. It will not respond to any new incoming control commands until a normal command rate is resumed.
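The watchdog behavior above can be sketched as follows. The 0.5 s timeout is an illustrative assumption (the actual interval is not given in the text); when the command stream stalls, outputs are forced to standstill, and commands are ignored until the normal rate resumes.

```python
# Illustrative watchdog for the TDS-PWC interface (timeout value assumed).
class CommandWatchdog:
    def __init__(self, timeout_s: float = 0.5):
        self.timeout = timeout_s
        self.last_rx = None      # time of the last received command
        self.tripped = False     # True while the PWC is forced to standstill

    def on_command(self, t: float) -> bool:
        """Returns True if the command may be forwarded to the motor DACs."""
        gap_ok = self.last_rx is not None and (t - self.last_rx) <= self.timeout
        self.last_rx = t
        if self.tripped:
            if gap_ok:           # normal command rate has resumed
                self.tripped = False
            return False         # drop commands received while tripped
        return True

    def tick(self, t: float) -> None:
        """Periodic check; trips (zeroing all outputs) if the link stalls."""
        if self.last_rx is None or (t - self.last_rx) > self.timeout:
            self.tripped = True
```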
2) Control Strategies
To allow users to control the PWC with their tongue motion, seven TDS commands were defined: five main commands to drive the PWC forward (FD) and backward (BD), turn left (TL) and turn right (TR), and stop/neutral (N). In addition, two auxiliary commands were defined to adjust the PWC maximum speed. The auxiliary commands can also be used to switch the control mode in PWCs that are equipped with powered seating.
The fundamental PWC control mechanism is based on two state vectors, one for linear motion, controlled by (FD and BD) commands, and one for rotations, controlled by (TR and TL) commands. The PWC speed in each major direction is proportional to the absolute value of these state vectors. Each TDS command increases or decreases its associated state vector by a certain adjustable value until a predefined maximum or minimum level is reached. For example, if the user keeps issuing the FD command, the linear motion state vector increases, and the PWC accelerates in the FD direction until it reaches a certain predefined maximum speed. Based on these fundamental rules, we have implemented the following three control strategies.
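The incremental state-vector rule above can be written compactly. In the sketch below, the step size and saturation limits are illustrative values, not the ones used in the actual system.

```python
# Illustrative state-vector update: each command nudges its vector by a fixed
# step until it saturates at a predefined maximum magnitude.
def step_vector(v: float, command: int, step: float = 0.1,
                v_max: float = 1.0) -> float:
    """command: +1 (e.g., FD or TR), -1 (e.g., BD or TL), 0 (no change)."""
    v += command * step
    return max(-v_max, min(v_max, v))   # clamp to [-v_max, +v_max]
```

For example, repeatedly issuing FD ramps the linear vector up until it saturates, which corresponds to the PWC accelerating to its maximum speed.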
1) Discrete control
In this strategy, five commands are utilized: FD, BD, TR, TL, and N. Each directional command linearly changes the state vector value in that direction, while the N command, which is issued automatically when the tongue returns back to its resting position, linearly returns all state vectors back to zero. Therefore, by simply returning the tongue to its resting position, the user can bring the PWC to a standstill.
Another important feature of this strategy is that the state vectors are mutually exclusive, i.e., only one state vector can be nonzero at any time. If a new command changes the current state, for example, from FD to TL, the current state vector has to be gradually reduced to zero before the new vector can ramp up. Hence, the PWC always stops before its direction can be changed. This is a safety feature, particularly for novice users, at the cost of reduced PWC agility. Fig. 3(a) shows the graphical user interface (GUI) for this strategy. A vertical bar and a dial provide the user with visual feedback (VF) on the status of the linear and rotation vectors, respectively. We have also added two calibration knobs that allow the investigator (not the user) to fine-tune the speeds of the left and right motors by adjusting the control signal amplitudes so that the PWC does not deviate from a straight path while moving FD or BD.
Fig. 3.
GUIs for the TDS–PWC control interface in LabVIEW environment. (a) GUI designed for the discrete and continuous control strategies. (b) GUI for the gearshift control strategy.
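The mutual-exclusion rule of the discrete strategy can be sketched as a single update function. Command names follow the text; the step size and ramp rate are illustrative assumptions.

```python
# Illustrative discrete-control update: the active state vector must ramp
# back to zero before the other vector may grow (PWC stops before turning).
def discrete_step(linear: float, rotation: float, cmd: str,
                  step: float = 0.1, v_max: float = 1.0):
    """cmd in {'FD','BD','TR','TL','N'}; returns updated (linear, rotation)."""
    clamp = lambda v: max(-v_max, min(v_max, v))
    decay = lambda v: max(0.0, abs(v) - step) * (1 if v > 0 else -1)
    if cmd in ('FD', 'BD'):
        if rotation != 0.0:                  # other vector active: ramp down
            return linear, decay(rotation)
        return clamp(linear + (step if cmd == 'FD' else -step)), rotation
    if cmd in ('TR', 'TL'):
        if linear != 0.0:
            return decay(linear), rotation
        return linear, clamp(rotation + (step if cmd == 'TR' else -step))
    return decay(linear), decay(rotation)    # N: all vectors ramp to zero
```

Dropping the mutual-exclusion checks (the two `!= 0.0` branches) yields the continuous strategy, in which the PWC can steer while moving.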
2) Continuous control
This strategy uses the same command definitions as the discrete control in (a). However, the state vectors are no longer mutually exclusive, which means that the user is allowed to steer the PWC to the left or right while it is moving FD or BD. Therefore, the PWC movements are continuous and much smoother, making it possible to follow a curve. Similar to (a), returning the tongue to its resting position (N) results in braking and eventually stopping the PWC.
3) Gearshift control
In this strategy, we have imitated the gearshifting concept in driving stick shift vehicles. The GUI for this strategy is shown in Fig. 3(b). By employing an additional TDS command, users are able to shift the gear to operate the PWC at a different speed by setting a different maximum level for the linear state vector. By issuing the sixth command for 1 s, the user can shift the gear from N to 1, to 2, to 3, to R, and back to N. For safety reasons, the user has to stop the PWC before shifting gears.
Since a reverse gear is already included in the gearbox in this strategy, the FD and BD commands, which set the direction of motion in (a) and (b), act here as acceleration and deceleration. We also made the speed increments a quadratic function to make it easier to fine-tune the lower speeds. When users issue an FD command, the PWC speeds up to a maximum value that depends on the selected gear. If the reverse gear is selected, the maximum speed is always limited to that of gear 1, and the FD command increases the backup speed. Similarly, the BD command decreases the PWC speed regardless of the direction of motion. The N command does not affect the linear state vector, but it decreases the rotation state vector to zero. No fine-tuning knob is needed in this strategy, since users have full control over the PWC movement direction with their tongues.
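The gear cycle and quadratic speed curve above can be sketched as follows. The gear sequence comes from the text; the per-gear speed caps and the number of speed steps are illustrative assumptions.

```python
# Illustrative gearshift model: the shift command cycles N -> 1 -> 2 -> 3 -> R
# -> N, and speed follows a quadratic curve up to the selected gear's cap.
GEARS = ['N', '1', '2', '3', 'R']
GEAR_MAX = {'N': 0.0, '1': 0.3, '2': 0.4, '3': 0.5, 'R': 0.3}  # m/s (assumed)

def shift(gear: str) -> str:
    """Advance to the next gear in the cycle."""
    return GEARS[(GEARS.index(gear) + 1) % len(GEARS)]

def speed_for_step(k: int, n_steps: int, gear: str) -> float:
    """Quadratic mapping of step k in [0, n_steps] to speed: finer
    resolution at low speeds, as described in the text."""
    return GEAR_MAX[gear] * (k / n_steps) ** 2
```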
III. Experimental Results From Human Trials
A. Human Subjects
Twelve able-bodied human subjects, comprising ten males and two females aged 23 to 35 years, were recruited from the Georgia Institute of Technology graduate student population. We obtained the necessary approvals from Georgia Tech’s Institutional Review Board (IRB) and informed consent from each subject. Subjects had no prior experience with other ATs. One of the subjects (subject A) was a member of the research team and quite familiar with the TDS; however, he was not a daily TDS user. Four subjects had used the eTDS for computer access for less than 3 h in previous human trials, but none of them had prior experience in using the eTDS to control a PWC. All other subjects were novices, with no prior exposure to the eTDS before these trials.
B. Human Trial Procedure
Detailed instructions were prepared ahead of the trials, provided to the subjects, and then strictly followed to ensure that every subject followed the same procedure.
1) Sensor calibration
This step was intended to obtain the linear regression coefficients for our stereodifferential SSP algorithm that cancels out the EMF interference [28]. This step should be taken before attaching the magnetic tracer to the subject’s tongue, because the recorded data should only include the EMF. Subjects wore the eTDS prototype headset, and were asked to move around in the laboratory, while the GUI recorded 1000 data points. The calibration coefficients were then calculated and saved for the following steps.
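As a simplified illustration of this calibration step: with no magnet on the tongue, both sensor modules see only the external field, so a per-axis linear fit of one module's readings onto the other's yields coefficients that later cancel common-mode EMF. The sketch below is an ordinary 1-D least-squares fit of our own devising, not the authors' exact stereodifferential algorithm.

```python
# Illustrative per-axis EMF calibration (assumed form, not the actual SSP).
def fit_axis(left: list, right: list):
    """Least-squares coefficients a, b such that right ~ a*left + b."""
    n = len(left)
    mx = sum(left) / n
    my = sum(right) / n
    sxx = sum((x - mx) ** 2 for x in left)
    sxy = sum((x - mx) * (y - my) for x, y in zip(left, right))
    a = sxy / sxx
    return a, my - a * mx

def cancel_emf(left_sample: float, right_sample: float,
               a: float, b: float) -> float:
    """Differential output: near zero for pure external-field components,
    so only tracer-induced (tongue-related) field variations remain."""
    return right_sample - (a * left_sample + b)
```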
2) Magnet attachment
A new magnetic tracer was washed with detergent and tap water, disinfected using 70% isopropyl alcohol, dried, and attached to the subject’s tongue, about 1 cm from the tip, using Cyanodent tissue adhesive. Subjects were allowed to familiarize themselves with the eTDS and magnetic tracer on their tongues for ~10 min.
3) Command definition and eTDS training
Subjects were encouraged to define the tongue position for each command in their mouth as sparsely as possible to facilitate the SSP command classification. To shorten the preparation time, they were provided with a few recommended tongue positions based on previous human trials [29]. In addition, a visual cue in the form of an analog gauge was provided in the GUI to help the subjects get a sense of how far their current tongue position was from all previously defined commands [34]. This step was repeated three times to achieve the best command definitions and help the subjects remember them.
Once command-related tongue positions were identified and practiced, the subjects were ready for the training session, during which the GUI prompted the subjects to issue each command by turning on its associated indicator on the screen in 3 s intervals. The subjects were asked to issue the prompted command by moving their tongue from its resting position to the corresponding command position when the command light was on, and returning it back to the resting position when the light went off. This procedure was repeated ten times for the set of six commands plus the tongue resting position [34].
4) PWC control experiment
To gain more experience, subjects initially used their tongue gestures to navigate the mouse cursor through an on-screen maze, similar to the procedure that we had previously reported in [29]. Once the subjects felt comfortable driving the mouse cursor in the virtual environment, the laptop was connected to the TDS–PWC interface module, and the subjects were allowed to test-drive the PWC with their tongue for ~10 min, while the maximum PWC speed and rotation rate were set to 0.5 m/s and 36°/s, respectively, and the acceleration and deceleration rates were set to 0.125 and −0.5 m/s², respectively.
During the PWC trials, the subjects were required to drive the PWC with their tongue through an obstacle course, as shown in Fig. 4. The track was designed for the subjects to use all TDS control commands and perform various navigation tasks, such as making a U-turn, backing up, and fine-tuning the PWC orientation in a limited space while moving FD or BD. The subjects were asked to navigate the PWC from point A to point F as quickly as they could, while avoiding collisions. Aisle 1 was designed wide enough for the subjects to make a U-turn at point C. However, the only way they could get out of aisle 2 was to back up the PWC from point E and make a few left and right adjustments to avoid hitting the tables. Fig. 5 shows one of the subjects sitting on the PWC with his hands crossed, which is the position he was asked to maintain throughout the experiments.
Fig. 4.
Plan of the obstacle course used in the PWC navigation human trials, using the TDS, showing the dimensions and approximate PWC trajectory.
Fig. 5.
eTDS prototype worn by an able-bodied subject to wirelessly control a PWC. eTDS is wirelessly connected to the laptop under the seat, which is connected to the TDS–PWC interface circuitry through a USB port.
During the experiment, the laptop was either placed in front of the subjects to provide them with VF or hidden beneath the seat. The subjects were required to repeat the experiment three times for each control strategy. The discrete control strategy (a) was tried with and without VF. Finally, the subjects were asked to navigate through the same track using the PWC’s default proportional joystick. The navigation time from A to F, the number of collisions, and the number of issued commands (NICs) were recorded for each experiment. After completing the trial, each subject was asked to fill out a questionnaire including eight ratings and two open-ended questions to compare their perceptions of different control strategies. The open-ended questions were: “What is the most important feature that a PWC controller should have?” and “What other features do you expect to be added to the current TDS control strategies?”
C. Experimental Results
All subjects could successfully complete the TDS–PWC control tasks. In the following, we have separated the test results obtained by the experienced subject A from other subjects to demonstrate the effect of prior experience in using eTDS to control PWCs.
Fig. 6 shows the average time, number of collisions, and NIC for each experiment. Overall, the continuous control (b) resulted in the best performance, with the minimum elapsed time (130.9 s for novice subjects and 114.3 s for subject A) and a relatively low number of collisions (0.42 and 0 per trial for novice and experienced, respectively). As expected, the discrete control (a) was the slowest but safest, with the minimum number of collisions. No significant difference in performance was observed in discrete control between the novice subjects and subject A, showing that this method barely relies on the users’ prior experience. Gearshift control (c) was in the middle in terms of elapsed time, but it had the highest rate of collisions. Furthermore, subject A performed significantly better than the other subjects with this strategy, showing that prior experience did matter in this case.
Fig. 6.
PWC navigation experimental results using eTDS with different control strategies. (a) Average navigation time. (b) Number of collisions. (c) NICs. The mean values along with their 95% confidence interval are shown for each variable.
The average time to complete the obstacle course using the joystick was 51.3 s, which was 39% of the average time taken by the novice subjects using the eTDS (45% for subject A). Considering that the subjects had much less experience in operating a PWC with their tongue using the eTDS than with their fingers using a joystick, these results demonstrate the viability of the TDS in substituting for some of the arm and hand functions in tasks related to navigation and wheeled mobility.
Fig. 6(c) shows the average NIC for each strategy. It can be seen that subject A issued fewer commands than the other 11 novice subjects, especially in gearshift control (c), perhaps because he issued the control commands more accurately and with better timing, which can lead to shorter navigation times.
Fig. 7 shows the subjects’ ratings of the three control strategies in the eight categories covered by the questionnaire. The continuous control received the highest overall rating as well as the best ratings for flexibility and accuracy. This is in agreement with the quantitative experimental results shown in Fig. 6. The gearshift control received the lowest overall rating due to its safety issues and the difficulty of learning and remembering it. Almost all subjects agreed that the discrete control was the easiest strategy to learn, use, and remember. It is also safer than the other strategies. However, its poor timing performance led us to conclude that the continuous control (b) is perhaps the best choice for driving a PWC with tongue motion.
Fig. 7.
Subjective rating of three PWC control strategies using tongue motions: discrete, continuous, and gearshift based on a questionnaire filled by subjects after the trials.
IV. Discussion
In the human trials, we observed that the subjects’ performance with the discrete control strategy (a) was not significantly different with and without visual feedback. Similarly, the subjects’ prior experience did not seem to be very helpful in this case. These outcomes, combined with the fact that subjects found this strategy safe and easy to learn and remember, suggest that discrete control is probably the best strategy to begin with when one starts using the TDS.
We performed linear regression analysis to study the relationship between navigation time and NIC for each control strategy. The results showed that they were positively correlated in all cases. NIC depends on how accurately the subjects can remember the tongue gestures and repeat them consistently. When subjects were not able to correctly issue the command that they had intended, they had to issue another command to correct the previous one, further increasing the NIC and slowing the navigation. Another important parameter affecting NIC was the timing of the commands. For example, to perform a 90° left turn into an aisle, subject A could drive the PWC to a proper position and issue a TL command at the right time to make a single sharp turn, followed by an FD when the rotation was close to 90°. On the other hand, the novice subjects either started turning the PWC too early and too little, or too late and too much, in which case they needed to issue a few other TL and TR commands to adjust the PWC position and enter the aisle. This is similar to what people do when learning how to drive a car.
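The positive correlation reported above corresponds to a standard regression/correlation computation. The helper below computes the Pearson coefficient; the data in the usage test are made-up values for illustration only, not the trial data.

```python
# Pearson correlation coefficient between two samples (e.g., navigation
# time and NIC across trials), computed from scratch for clarity.
def pearson_r(x: list, y: list) -> float:
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5
```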
Over time, TDS users are expected to minimize the NIC and achieve better performance by remembering the tongue movements more easily (spatial accuracy) and executing them with better timing (temporal accuracy). Fig. 8 shows the relationship between the average NIC and the trial number for each control strategy. The NIC of the discrete control strategies (with and without VF) decreased as the subjects became more experienced, while the NICs of the other strategies did not show any trend within only three trials. This shows that the effect of learning rapidly becomes evident in the discrete control strategy, which is quite easy to learn; more time and trials would be needed for the other, more advanced strategies.
Fig. 8.
Average NIC versus the trial number for each control strategy. The learning effect in discrete control strategy is quite evident from the early trials. However, the other two strategies, which are relatively more complicated, require more trials to show the effects of learning.
Robustness against the “Midas touch” problem is particularly important in wheeled mobility, because an unintended movement of the PWC can lead to dangerous consequences. This is unfortunately a common problem in eye gaze systems and EEG-based BCIs [11]. In our trials, however, the eTDS was able to differentiate between command-related tongue movements and natural tongue movements thanks to our stereodifferential cancellation algorithm [28]. Most natural tongue movements, such as those related to speech, occur in the sagittal plane, resulting in common-mode variations in the magnetic field at the symmetrical locations of the sensor modules. Therefore, these components are cancelled by our differential transformations and interpreted as neutral, which designates the tongue resting position [28]. To avoid the “Midas touch” problem during eating, when tongue movements are not limited to the sagittal plane, the user is supposed to switch the eTDS to the standby mode, as explained in Section II-A.
V. Conclusion
TDS is a tongue-operated wireless AT, which can potentially benefit people with severe disabilities by enabling them to control their environments, access computers, and operate PWCs using their tongue motion. We have implemented a prototype TDS–PWC interface, including an adapter circuit and multiple control strategies. Human trials with 12 able-bodied subjects showed that learning to drive a PWC using the eTDS prototype is easy and effective. Different TDS–PWC control strategies were tested in an obstacle course and compared with controlling the same PWC with its default proportional joystick. The continuous control strategy was found to be the most efficient method for driving a PWC with tongue motion. Using this method, subjects with reasonable experience could complete the obstacle course with their tongue and the eTDS prototype in only approximately twice the time needed with their fingers and a joystick.
Our future directions include further refining the eTDS hardware, SSP algorithms, and PWC control strategies, and having them assessed by people with severe disabilities in their home and office environments. We also plan to add proportional control capability to the TDS to make its control of wheeled mobility smoother and more natural. The TDS–PWC interface should also be evaluated in unstructured outdoor environments.
Acknowledgment
The authors would like to thank members of the GT-Bionics and NC-Bionics Laboratories for helping with the human trials, and M. Jones, J. Bruce, R. Fierman, and J. Anschutz from the Shepherd Center, Atlanta, GA, for their constructive comments.
This work was supported in part by the Christopher and Dana Reeve Foundation and in part by the National Science Foundation under Grant IIS-0803184.
Biographies

Xueliang Huo (S’07) was born in 1981. He received the B.S. and M.S. degrees in mechanical engineering (instrument science and technology) from Tsinghua University, Beijing, China, in 2002 and 2005, respectively. He is currently working toward the Ph.D. degree at the Georgia Tech (GT) Bionics Laboratory, School of Electrical and Computer Engineering, Georgia Institute of Technology, Atlanta.
His current research interests include low-power circuit and system design for biomedical applications, brain–computer interfacing, and assistive technologies.

Maysam Ghovanloo (S’00–M’04) was born in 1973. He received the B.S. degree in electrical engineering from the University of Tehran, Tehran, Iran, in 1994, the M.S. degree in biomedical engineering from the Amirkabir University of Technology, Tehran, Iran, in 1997, and the M.S. and Ph.D. degrees in electrical engineering from the University of Michigan, Ann Arbor, in 2003 and 2004, respectively.
From 2004 to 2007, he was an Assistant Professor in the Department of Electrical and Computer Engineering, North Carolina (NC) State University, Raleigh, where he was the Founder of the NC-Bionics Laboratory. In June 2007, he joined the faculty of the Georgia Institute of Technology, Atlanta, where he is currently an Assistant Professor and the Founding Director of the Georgia Tech (GT) Bionics Laboratory in the School of Electrical and Computer Engineering. He has authored or coauthored more than 60 conference and journal publications.
Dr. Ghovanloo received awards in the 40th and 41st Design Automation Conference (DAC)/International Solid-State Circuits Conference (ISSCC) Student Design Contest in 2003 and 2004, respectively. He has organized special sessions and has been a member of Technical Review Committees for major conferences and journals in the areas of circuits, systems, sensors, and biomedical engineering. He is a member of Tau Beta Pi, Sigma Xi, the IEEE Solid-State Circuits Society, the IEEE Circuits and Systems Society, and the IEEE Engineering in Medicine and Biology Society.
Footnotes
Color versions of one or more of the figures in this paper are available online at http://ieeexplore.ieee.org.
References
- [1]. National Institute of Neurological Disorders and Stroke (NINDS), NIH. Spinal cord injury: Hope through research. [Online]. Available: http://www.ninds.nih.gov/disorders/sci/detail_sci.htm (accessed May 6, 2009).
- [2]. Cooper RA, Boninger ML, Spaeth DM, Ding D, Guo S, Koontz AM, Fitzgerald SG, Cooper R, Kelleher A, Collins DM. Engineering better wheelchairs to enhance community participation. IEEE Trans. Rehabil. Eng. 2006 Dec;14(4):438–455. doi: 10.1109/TNSRE.2006.888382.
- [3]. Felzer T, Nordman R. Alternative wheelchair control. Proc. Int. IEEE-BAIS Symp. Res. Assistive Technol., 2007, pp. 67–74.
- [4]. Origin Instruments Corporation. Sip and Puff Switch. [Online]. Available: http://orin.com/access/sip_puff/sp_mu/index.htm (accessed May 6, 2009).
- [5]. Chen YL, Tang FT, Chang WH, Wong MK, Shih YY, Kuo TS. The new design of an infrared-controlled human–computer interface for the disabled. IEEE Trans. Rehabil. Eng. 1999 Dec;7(4):474–481. doi: 10.1109/86.808951.
- [6]. Law C, Leung M, Xu Y, Tso S. A cap as interface for wheelchair control. Proc. IEEE/RSJ Int. Conf. Intell. Robots Syst., 2002, pp. 1439–1444.
- [7]. Barea R, Boquete L, Mazo M, Lopez E. System for assisted mobility using eye movements based on electrooculography. IEEE Trans. Rehabil. Eng. 2002 Dec;10(4):209–218. doi: 10.1109/TNSRE.2002.806829.
- [8]. Jacob R. The use of eye movements in human–computer interaction techniques: What you look at is what you get. ACM Trans. Inf. Syst. (TOIS). 1991;9:152–169.
- [9]. Craig DA, Nguyen HT. Wireless real-time head movement system using a personal digital assistant (PDA) for control of a power wheelchair. Proc. IEEE Eng. Med. Biol. Conf., 2005, pp. 772–775.
- [10]. Adaptive Switch Labs, Inc. [Online]. Available: http://www.asl-inc.com/Products/Index.htm (accessed May 6, 2009).
- [11]. Hochberg LR, Donoghue JP. Sensors for brain computer interfaces. IEEE Eng. Med. Biol. Mag. 2006 Sep–Oct;25(5):32–38. doi: 10.1109/memb.2006.1705745.
- [12]. Wolpaw JR, Birbaumer N, McFarland DJ, Pfurtscheller G, Vaughan TM. Brain–computer interfaces for communication and control. Clin. Neurophysiol. 2002;113:767–791. doi: 10.1016/s1388-2457(02)00057-3.
- [13]. Tanaka K, Matsunaga K, Wang HO. Electroencephalogram-based control of an electric wheelchair. IEEE Trans. Robot. 2005 Aug;21(4):762–766.
- [14]. Han JS, Bien ZZ, Kim DJ, Lee HE, Kim JS. Human machine interface for wheelchair control with EMG and its evaluation. Proc. IEEE Eng. Med. Biol. Conf., 2003, pp. 1602–1605.
- [15]. Pacnik G, Benkic K, Brecko B. Voice operated intelligent wheelchair—VOIC. Proc. ISIE, 2005, pp. 1221–1226.
- [16]. Moon I, Lee M, Ryu J, Mun M. Intelligent robotic wheelchair with EMG-, gesture-, and voice-based interfaces. Proc. Int. IEEE Conf. Intell. Robots Syst., 2003, pp. 3453–3458.
- [17]. Kandel ER, Schwartz JH, Jessell TM. Principles of Neural Science. 4th ed. Hoboken, NJ: McGraw-Hill; 2000.
- [18]. Lau C, O’Leary S. Comparison of computer interface devices for persons with severe physical disabilities. Amer. J. Occup. Ther. 1993 Nov;47:1022–1030. doi: 10.5014/ajot.47.11.1022.
- [19]. TongueTouch Keypad (TTK). [Online]. Available: http://www.newabilities.com/ (accessed May 6, 2009).
- [20]. Nutt W, Arlanch C, Nigg S, Staufert G. Tongue-mouse for quadriplegics. J. Micromech. Microeng. 1998;8(2):155–157.
- [21]. Salem C, Zhai S. An isometric tongue pointing device. Proc. CHI, 1997, pp. 22–27.
- [22]. Struijk LNSA. An inductive tongue computer interface for control of computers and assistive devices. IEEE Trans. Biomed. Eng. 2006 Dec;53(12):2594–2597. doi: 10.1109/TBME.2006.880871.
- [23]. Jouse2, Compusult Limited. [Online]. Available: http://www.jouse.com/ (accessed May 6, 2009).
- [24]. USB Integra Mouse, Tash, Inc. [Online]. Available: http://www.tashinc.com/catalog/ca_usb_integra_mouse.html (accessed May 6, 2009).
- [25]. Ghovanloo M. Tongue operated assistive technologies. Proc. IEEE 29th Eng. Med. Biol. Conf., Aug. 2007, pp. 4376–4379.
- [26]. Anderson KA. Targeting recovery: Priorities of the spinal cord-injured population. J. Neurotrauma. 2004;21:1371–1383. doi: 10.1089/neu.2004.21.1371.
- [27]. Huo X, Wang J, Ghovanloo M. Use of tongue movements as a substitute for arm and hand functions in people with severe disabilities. Presented at the RESNA Conf., Phoenix, AZ, Jun. 2007.
- [28]. Huo X, Wang J, Ghovanloo M. A wireless tongue-computer interface using stereo differential magnetic field measurement. Proc. 29th IEEE Eng. Med. Biol. Conf., Aug. 2007, pp. 5723–5726.
- [29]. Huo X, Wang J, Ghovanloo M. Introduction and preliminary evaluation of tongue drive system: A wireless tongue-operated assistive technology for people with severe disabilities. J. Rehabil. Res. Develop. 2008 Nov;45(6):921–938. doi: 10.1682/jrrd.2007.06.0096.
- [30]. Cook AM, Polgar JM. Assistive Technologies: Principles and Practice. 3rd ed. St. Louis, MO: Mosby-Year Book; 2008.
- [31]. Wang J, Huo X, Ghovanloo M. Tracking tongue movements for environment control using particle swarm optimization. Proc. IEEE Int. Symp. Circuits Syst., May 2008, pp. 1982–1985.
- [32]. PNI, MS2100 two-axis sensor. [Online]. Available: http://www.pnicorp.com (accessed May 6, 2009).
- [33]. PG Drives Technology, VR2. [Online]. Available: http://www.pgdt.com/products/vr2/index.html (accessed May 6, 2009).
- [34]. Huo X, Wang J, Ghovanloo M. A magneto-inductive sensor based wireless tongue–computer interface. IEEE Trans. Neural Syst. Rehabil. Eng. 2008 Oct;16(5):497–504. doi: 10.1109/TNSRE.2008.2003375.








