Author manuscript; available in PMC: 2022 May 4.
Published in final edited form as: Assist Technol. 2019 Jun 10;33(3):117–128. doi: 10.1080/10400435.2019.1593260

Assistive game controller for artificial intelligence-enhanced telerehabilitation post-stroke

Grigore Burdea a, Nam Kim a, Kevin Polistico a, Ashwin Kadaru a, Namrata Grampurohit a, Doru Roll a, Frank Damiani b
PMCID: PMC6901786  NIHMSID: NIHMS1523965  PMID: 31180276

Abstract

Off-the-shelf gaming technology is designed for young, fit, and motor-intact individuals. Artificial intelligence has a role in making controllers and therapeutic games adaptable to people with disabilities. Post-stroke rehabilitation outcomes can be enhanced by gaming technology within the home that enables engaging telerehabilitation. BrightBrainer™ Grasp is a novel hand controller designed to adapt to arm and hand impairments post-stroke, to support the desired intensity of training, and to track relevant outcomes. The newly designed controller uses the BrightBrainer gamification system with artificial intelligence technology to provide automatic adaptation, requiring minimal clinician input. This article describes the BrightBrainer Grasp design, hardware, force and movement detection and calibration, and its integration with the therapeutic games. The use of artificial intelligence in adapting a library of custom therapeutic games is also described. Results of a usability study with healthy individuals and related design modifications are presented, with implications for future trials.

Keywords: BrightBrainer, BrightBrainer Grasp, telerehabilitation, artificial intelligence, therapeutic games, stroke, usability evaluation

Introduction

Telerehabilitation, as described by the American Telemedicine Association, involves providing rehabilitation remotely via telecommunication technology (Richmond et al., 2017). This method of care delivery supports home care in the early phase after hospital discharge (Tousignant et al., 2011) and in locations where there is a scarcity of therapists (Saywell et al., 2012; Cherry et al., 2017).

Several studies have demonstrated telerehabilitation benefits in cardiac (Barnason et al., 2009), orthopedic (Tousignant et al., 2011) and neurological conditions (Piron et al., 2008). In stroke rehabilitation, functional outcomes and cost-effectiveness are both desired goals for all stakeholders (Bramanti, Manuli, & Calabrò, 2018).

As elaborated by Agostini et al. (2015), telerehabilitation should be designed so as not to discriminate based on level of impairment. The telerehabilitation system needs to be usable by patients with a wide range of disability (from mild to severe), and thus needs to be adaptable to each patient, each day. Measurement of patient actions needs to be accurate, and therapeutic goals need to be set realistically, so as to be achievable without discouraging or disenfranchising the patient. Furthermore, patients need a level of challenge appropriate to advance their function and aligned with their cognitive level (Hung, Huang, Chen, & Chu, 2016). The system's adaptation needs to be almost instantaneous, and the system highly responsive.

Artificial intelligence (AI) is one technology that may be used to provide automatic system adaptation with minimal clinician input (Dilsizian & Siegel, 2014; Massalha, Clarkin, Thornhill, Wells, & Chow, 2018). The benefits AI brings to telerehabilitation also relate to speed of adaptation, which is even more important when telerehabilitation incorporates therapeutic games (Darzi, Gorsic, & Novak, 2017).

Use of games in stroke rehabilitation has long been recognized as beneficial in terms of patient motivation, higher intensity of exercise, and the ability to transparently measure objective outcomes (such as arm repetitions, task completion time, or error rates) (Burdea, 2003; Holden, 2005). However, adaptability is currently lacking in off-the-shelf video games (Webster & Celik, 2014; Cheok, Tan, Low, & Hewitt, 2015). For instance, the Nintendo Wii is limited by its game controllers, which make play a challenge for people with severely impaired hand function (Cheok et al., 2015). The Microsoft Kinect, in turn, lacks fine movement tracking, a barrier for people with mildly impaired hand function who want to exercise their fingers (Webster & Celik, 2014). The Kinect is designed to translate gross kinematic movement into the virtual environment by reducing the limbs to a stick figure. Due to the computing power constraints of the Kinect hardware, and the occlusion intrinsic to vision-based motion tracking, the Kinect is not well suited to detecting distal fine motor movement. Furthermore, off-the-shelf games adapt based on game scores alone; functional changes do not drive game-related adaptations. In general, this lack of adaptation makes games less winnable by players with disabilities, leading to frustration, increased depression severity, and eventual abandonment of game-based rehabilitation (Threapleton et al., 2016).

Although virtual reality studies have been conducted for over two decades in people with stroke, few options offer specialized controllers beyond traditional gaming sticks (Laver et al., 2015). The unique controllers reported in the literature take the form of a car steering wheel (Akinwuntan et al., 2005), robotic orthoses (Byl et al., 2013; Coupar, 2012), and gaming gloves (Cho & Jung, 2012; Crosbie, 2012). The steering wheel controller (Akinwuntan et al., 2005) trains car driving, a task that may not be a rehabilitation goal for many individuals; thus the skill learned during the game may not translate into real-life tasks. The robotic orthoses (Byl et al., 2013; Coupar, 2012) are expensive, and donning and doffing can be challenging due to finger positioning in the orthoses. Furthermore, creating space for the device at home and setting it up for daily use are challenges for home use.

Gaming gloves (Cho & Jung, 2012; Crosbie et al., 2012), if used in hand rehabilitation, present sizing, donning, doffing, and hygiene challenges. The HandTutor™ glove is an exception, since it was designed from the start for hand rehabilitation. This commercially available system, developed in Israel, attaches to the back of the hand and uses built-in sensors to allow fine motor training of the hand. The rehabilitation glove is connected to a library of simple rehabilitation exercises of varying difficulty levels and has been used successfully in telerehabilitation settings (Carmeli et al., 2011). The HandTutor does not, however, track 3D arm movement, and it cannot measure grasping force. Training is uni-manual, thus engaging only half of the motor centers in the brain.

MindMotion™ PRO (MindMaze SA, Switzerland) is a virtual rehabilitation system that has been used for upper extremity training of individuals in the chronic phase post-stroke (Perez-Marcos et al., 2017). Similar to the Kinect, it uses vision tracking, coupled here with passive markers worn on the shoulder, elbow, and wrist. This is supplemented by inertial sensors on the back of the hand, so as to measure wrist flexion/extension. Exercises take the form of games that train shoulder, elbow, and wrist movement, and allow mirror therapy by controlling the affected arm's avatar with the less affected arm. The system appears designed for the clinic (a therapist was reported as present during the study), and it tracks neither finger extension nor grasping force.

BrightBrainer is a gamification system that uses the principles of integrative medicine to simultaneously train motor and cognitive function while also benefitting the patient's wellbeing (Bright Cloud International, 2018). Earlier versions of BrightBrainer used the Hydra (Sixense, 2011) and subsequently the VIVE (HTC, 2016) game controllers. Both Hydra and VIVE controllers are off-the-shelf products meant primarily for gaming. They are appropriate for virtual rehabilitation of populations with dementia (Burdea et al., 2015a) or mild traumatic brain injury (Burdea et al., 2018), as long as users do not have severe hand dysfunction.

The present article describes the BrightBrainer Grasp (BBG), a new game controller designed from the start for individuals with hand impairments. The use of AI in adapting the BrightBrainer library of custom therapeutic games for BBG control is also described, and results of an initial usability evaluation are included. The BBG and AI-adaptable games had undergone a feasibility evaluation at the time of manuscript acceptance; those results will be presented elsewhere. That study involved telerehabilitation of stroke survivors in the chronic phase, living at home.

Methods

BrightBrainer Grasp game controller

The design of a game controller for users with disabilities had to accommodate individuals with spastic hands, as well as those who lack gravity bearing (that is, individuals unable to lift the arm). It had to be simple, with few buttons, so as to facilitate use by the cognitively impaired. Finally, it had to be unencumbering, with no wires or tethers to interfere with the patient's free arm movement. Since the controller was meant to interact with real-time graphics, it had to track hand movement in 3D space accurately, and at a sufficiently high measurement rate so as to minimize lag.

The BBG is a hand-held, forearm-attached wireless controller that measured power grasp, finger extension, and wrist position and orientation (Burdea et al., 2017). In order to facilitate supported arm movement (needed for weak arms) while also allowing forearm pronation/supination, the BBG had a rounded forearm support sled (Figure 1).

Figure 1. BrightBrainer Grasp therapeutic game controller: a) left side view of prototype; b) frontal view; c) right side sectional detail. © Bright Cloud International Corp. Reprinted by permission.

The underside of the forearm support was covered with low-friction material so as to minimize resistance when moving on a gravity-bearing surface.

The BBG incorporated an HTC VIVE Wireless Tracker (HTC Model# 99HANL002–00) which had a compact form factor (0.7 lb., 99.65 mm Dia × 42.27 mm H), and was powered via a micro USB cable.

This 3D tracker was chosen for its small dimensions, ability to provide 120 data sets per second (Kreylos, 2012), wireless transmission, and easy integration with the HTC VIVE simulation system. Upon setup, the VIVE tracker established a wireless connection between the object it was mounted on (in this case the BBG controller) and the VIVE head-mounted display (HMD). This provided 3D tracking data for the simulation, allowing the player to move objects in the virtual world (Niehorster, 2017). The actual tracking was based on infrared (IR) beams transmitted by two VIVE IR emitters (part of the BrightBrainer system) and detected by an array of IR sensors integrated in the VIVE tracker.

The tracker was mounted with standard DSLR camera (1/4–20 tripod) screws on top of a custom support. This support integrated a finger extension detection mechanism and a stainless steel hollow tube mounted to its underside. The tube's other end connected to the curved forearm sled through a one-degree-of-freedom rotary joint designed to accommodate varying wrist yaw angles.

The supporting sled had three major components: a rounded shell housing that attached to the user's forearm using Velcro strips, an electronics board with rechargeable batteries and a wireless transmitter, and a detachable pad that provided comfort to the user's forearm. The total weight of a BBG controller was about 570 grams (a little over one pound), which did not hinder movement for those with higher motor function. Those with diminished or no gravity bearing could still use the BBG, as long as it was supported by a low-friction table or similar surface.

Grasp Force Detection and Calibration

The BBG game controller measured the user's grasp force by detecting the pressure inside a rubber bulb (WelchAllyn Model# 5086–03H) held by the patient's hand (Figure 1). The bulb was mounted coaxially on the stainless steel tube and hermetically sealed at both ends where the tube passed through, forming an air chamber whose pressure was proportional to the force with which the patient was able to grasp. The central axial tube had a small orifice, to which a plastic tube was attached. This thin tube was routed inside the stainless steel tube and connected to a pressure sensor (Honeywell SSCMRND015PGSA3) on the BBG electronics board.

Pressure sensor calibration was performed so as to determine the linear dependency of the pressure sensor output signal on the force applied to the rubber bulb, in Newtons. A Jamar pinch gauge was placed between the tester's hand and the rubber bulb, and its dial indicated the measured grasp force. Simultaneously, the pressure sensor digital output was sampled and plotted (Figure 2a). As can be seen, the dependency was linear in the range of 4 N to 40 N, corresponding to a sensor output of approximately 900 to 2,500 digital ticks. The sensor manufacturer specified a measurement resolution of 10⁻⁴ atmosphere (Honeywell, 2014). However, the BBG pressure sensor's response to small forces of 0.4–4.4 N was a monotonically increasing scaled value spanning only about 40 ticks, which made detection of very small forces problematic. The overall linear fit had an R² goodness-of-fit value of 0.9934. Each calibration step (for grasp force, finger extension, or wrist pronation/supination) was repeated three times, which took about 3 minutes in total. Depending on a subject's functioning level, once arm range-of-motion calibration had been added, calibration lasted up to 15 minutes per training session.
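The tick-to-force conversion implied by this calibration can be illustrated with a minimal sketch: within the reported linear region (approximately 900–2,500 ticks corresponding to 4–40 N), raw sensor ticks map to Newtons by linear interpolation. The constants and function name below are illustrative assumptions, not the actual BBG firmware.

```python
# Hypothetical sketch of converting raw pressure-sensor ticks to grasp
# force in Newtons, using the linear region reported for the BBG
# (approx. 900-2500 ticks <-> 4-40 N). Illustrative values only.

TICKS_LO, TICKS_HI = 900.0, 2500.0   # digital ticks at 4 N and 40 N
FORCE_LO, FORCE_HI = 4.0, 40.0       # corresponding forces, in Newtons

def ticks_to_newtons(ticks: float) -> float:
    """Linear interpolation within the calibrated region."""
    slope = (FORCE_HI - FORCE_LO) / (TICKS_HI - TICKS_LO)
    return FORCE_LO + (ticks - TICKS_LO) * slope

force = ticks_to_newtons(1700.0)  # mid-range reading -> about 22 N
```

Outside the calibrated region (below roughly 4 N), the sketch would over-trust the sensor; the text above notes that such small forces were not reliably detectable.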

Figure 2. BrightBrainer Grasp calibration curves: a) grasp force; b) finger extension angle. © Bright Cloud International Corp. Reprinted by permission.

Finger Extension Detection and Calibration

The VIVE tracker support served as the grounding block for the BBG finger extension detection mechanism. This assembly consisted of a rotating lever, a translation plate, a spring, and a manual screw. The rotating lever incorporated a rotary magnetic sensor (RLS Merilna Tehnick D.O.O., Model# RM08A, analogue sinusoidal) with a miniature (4 mm diameter) magnet (Figure 3). The magnet created a magnetic field that varied proportionally with the lever rotation angle. The RM08 magnetic sensor had a 10-bit embedded digital-to-analog converter (DAC) and an accuracy of ±3°.

Figure 3. Magnetic rotary sensor elements of BrightBrainer Grasp controller: a) frontal assembly view; b) magnetic rotary sensor and 4 mm diameter miniature magnet. © Bright Cloud International Corp. Reprinted by permission.

The combination of rotating lever and rotary sensor accommodated a detectable lever rotation range of 0° to 60°. The rotary position sensor output a sinusoidal signal, which was then calibrated to the user's finger extension range using a Jamar goniometer (Figure 2b). A particular BBG's signal depended on how the magnet was mounted on the lever support, which meant that each game controller had an individual finger extension measurement signature.

The BBG had several features designed to accommodate various hand sizes and wrist orientations. A translation mechanism (part of the finger extension detection mechanism) allowed the rotating lever to be brought closer to or moved further away from the rubber bulb (for smaller or larger hands, respectively). Once the desired distance had been found, the rotating screw was tightened so as to prevent in–out translation during hand therapy. A spring incorporated in the rotating hinge kept the lever in constant contact with the dorsal side of the user's fingers when the hand was wrapped around the central rubber bulb. This positive contact made detection of even the smallest finger extensions possible, as long as the angle was larger than the rotary sensor resolution of 1°. The force with which the spring brought the rotating lever into contact with the fingers was small enough so as not to materially resist finger extension.
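As a rough illustration of recovering the lever angle from the sensor's sinusoidal output, the sketch below inverts an assumed signal model, v = offset + gain·sin(angle), over the 0°–60° range. The offset and gain stand in for the per-controller calibration signature obtained with the goniometer; all names and constants are hypothetical, not the BBG's actual signal processing.

```python
import math

# Hypothetical sketch: recover the finger-extension lever angle (0-60
# degrees) from a sinusoidal rotary-sensor reading. The offset and gain
# represent the per-controller calibration "signature"; illustrative only.

def sensor_to_angle_deg(raw: float, offset: float, gain: float) -> float:
    """Invert raw = offset + gain * sin(angle), angle in [0, 60] degrees."""
    s = (raw - offset) / gain
    s = max(-1.0, min(1.0, s))          # clamp to guard against sensor noise
    return math.degrees(math.asin(s))

# a reading halfway up the sine corresponds to 30 degrees of extension
reading = 2048.0 + 1000.0 * math.sin(math.radians(30.0))
angle = sensor_to_angle_deg(reading, offset=2048.0, gain=1000.0)
```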

Communication with the PC

The BBG controller sent user data to the computer rendering the custom therapeutic games, allowing a patient to control avatars and, in the process, undergo arm and hand rehabilitation. Communication was done via an XBee-Pro S2C 802.15.4 wireless chip mounted on the BBG electronics board, at a rate of about 10 packets/second. Higher rates were possible; however, these would have drained the BBG's onboard rechargeable battery faster. In the authors' view, 10 packets/second was sufficient when sampling finger extension or grasping force values. This number stems from empirical data, namely subject feedback on the responsiveness of the extension and grasp functions when controlling avatars. Furthermore, the maximum amplitude of finger movement can only be maintained up to 2 Hz (Freund, 1986), thus 4 measurements per second would have sufficed.

Each data packet contained multiple bit fields, which transferred grasp strength and finger extension data. In parallel, wrist 3D motion data (acquired by the HTC VIVE Tracker) was processed separately via HTC SteamVR software, at 120 data packets/second (Niehorster, 2017).
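The actual BBG bit layout is not published; the following sketch shows one plausible way a packet could carry the two measurement fields (grasp and finger-extension ticks) at ~10 packets/second. The 4-byte little-endian format and both function names are purely illustrative assumptions.

```python
import struct

# Hypothetical sketch of a BBG-style data packet: two unsigned 16-bit
# fields (grasp ticks, finger-extension ticks), little-endian. The real
# BBG bit layout is not published; this format is illustrative only.

def pack_sample(grasp_ticks: int, extension_ticks: int) -> bytes:
    """Serialize one measurement sample into a 4-byte packet."""
    return struct.pack("<HH", grasp_ticks & 0xFFFF, extension_ticks & 0xFFFF)

def unpack_sample(packet: bytes) -> tuple:
    """Recover (grasp_ticks, extension_ticks) from a packet."""
    return struct.unpack("<HH", packet)

grasp, ext = unpack_sample(pack_sample(1700, 512))
```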

Controller interaction with therapeutic games

Training survivors of stroke, especially those in the chronic phase presenting with spastic hands and weak arms, is challenging, and this motivated the development of the BBG controller. The existing BrightBrainer library of therapeutic games had been designed for the Hydra and VIVE controllers, off-the-shelf devices meant for normal hand function. Neither the Hydra nor the VIVE controller measures grasp strength or finger extension. Thus BrightBrainer games had to be modified to include grasping and finger extension functionality if the full potential of the BBG was to be harnessed.

Game modifications to allow BBG control centered on virtual object “activation” (through grasping) and “resetting” states (through finger extension). The following section describes a sub-sample of four games in order to illustrate the concepts of activation and reset, as they relate to BBG use.

Breakout3D (Figure 4a) was a BrightBrainer game in which the player had to use rectangular paddle avatars to bounce a ball so that it collided with crates placed on an island. The paddle avatars were moved forward/back, left, and right, corresponding to the user's real-life arm movements, in order to bounce the ball towards the crates. A ball collision destroyed only one crate (to maximize the movements needed to destroy all crates). A partition was placed at the island's center, such that each avatar could only move in its half of the island. This was meant to prevent learned non-use, i.e., a stroke survivor's tendency to use only the less affected arm when playing. Higher difficulty levels required that the player remember to squeeze the controller (the paddle "solidified" and became active) just before bouncing a ball; otherwise, the ball passed right through a transparent (inactive) paddle and was lost.

Figure 4. BrightBrainer Games: a) Breakout 3D; b) Card Island; c) Towers of Hanoi 3D; d) Musical Drums. © Bright Cloud International Corp. Reprinted by permission.

Another variable in setting Breakout3D difficulty was the paddle length, with shorter paddle avatars requiring a shorter reaction time and more precise movement. Prior versions of the game (using the Hydra or VIVE) changed paddle length with the game difficulty level but otherwise kept it fixed for a given level. When playing Breakout3D with a BBG controller, however, the paddle avatar shrank gradually during play. Every finger extension reset the paddle to the full length corresponding to that difficulty level, followed again by gradual shrinking. Thus, in order to make the game playable, the user had to repeatedly extend the fingers so as to keep the paddle from becoming too small.

Card Island (Figure 4b) was a short-term memory game that required users to match pairs of cards, initially placed face down in an array at the center of a small island. Hand avatars were positioned by moving the BBG controller forward, back, left, and right. Only two cards could be turned face up at one time. If the cards matched, they were removed; if not, the two cards turned face down again. The game completed once all cards had been matched. In the Hydra or VIVE versions of Card Island, players had to move the hand avatar over the selected card, then squeeze a trigger button to turn the card face up. For BBG control, grasping was used to turn a card face up (corresponding to object activation). However, the fingers needed to be extended to reset the hand avatar so that it could subsequently select another card. Since the game involved a card array, playing Card Island resulted in a sequence of alternating grasps and finger extensions to match all cards.

Towers of Hanoi 3D (Figure 4c) was a puzzle game that trained executive function. It required the player to restack disks from one pole to a destination pole, using a third pole as a waypoint. The stacking needed to be done such that the largest disk was at the bottom and disks decreased in size upwards, with the smallest always placed at the top. At no time was a larger disk allowed to be placed on top of a smaller one, and the best performance meant the minimum sequence of moves for the stacking on the destination pole. Movement of the hand avatars was achieved as in Card Island, with the BBG controllers moving forward, back, left, and right in real space to move the hand avatars in the corresponding directions. When the Hydra or VIVE had been used in previous game versions, a disk was picked up when the hand avatar overlapped it and the player squeezed a trigger; a disk dropped automatically on a pole once brought on top of it. The BBG version of the game, however, required that the player squeeze the rubber bulb to pick up a disk and extend the fingers to drop it. The number of repeated grasp–extend cycles depended on the number of disks to be stacked, which was the variable determining the game's difficulty level. The Towers of Hanoi 3D game had been part of a study of elderly individuals with dementia who were long-term residents in a skilled nursing facility (Burdea et al., 2015b). That study's outcomes showed a statistically significant improvement in executive function (decision making) as measured with the Neuropsychological Assessment Battery (NAB) Word Generation Test (White and Stern, 2003).

Finally, Musical Drums (Figure 4d) was a game that trained grasp strength, finger extension, and reaction time. The task was to play drums by hitting notes that scrolled vertically across the screen. When a note reached the center of a drum it turned green, indicating the moment the player was supposed to squeeze the controller so as to hit that note with a mallet avatar. After hitting a note, the mallet remained in a down position, and the player was required to extend the fingers to raise the mallet back up. At lower difficulty levels, only one drum was present on each side of the screen and notes scrolled slowly. At higher difficulty levels, there were two drums on each side of the screen and the notes moved faster. The number of notes also increased with difficulty level, requiring a faster reaction time to hit all the descending notes. An error was recorded if the user missed a note by not hitting it when it turned green at the center of the drum. The game ended when all notes had scrolled across the drums.

Baseline game adaptation to the patient's functioning level

A key differentiator of therapeutic games vs. off-the-shelf ones is the former's ability to adapt to the player's motor function. In the case of integrative games, adaptation also involves the player's cognitive function. Motor function varied not only between patients (as a function of their impairments), but also within a patient from day to day, being adversely impacted by fatigue, lack of sleep, or illness. Thus, baselines needed to be taken at the start of every rehabilitation session to properly map the physical range of motion of arm and fingers, and the grasping strength, to the virtual space. Performing all baselines took around 3–15 minutes in total, depending on the patient's mood, motivation, and comprehension of the directions given.

In the case of the BrightBrainer system using the Hydra or VIVE controllers, baselines were divided into several scenes, each asking the user to perform certain tasks while the system recorded measurements. Each baseline required verification of the measured value to ensure accurate readings, with an option to redo it if necessary. All baseline measurements were saved on a remote secure cloud server. Baselines for shoulder horizontal adduction/overhead shoulder flexion (for writing on a virtual whiteboard), shoulder horizontal adduction/elbow extension, shoulder horizontal abduction (for writing on a virtual tabletop), and index flexion/extension were described in Burdea et al. (2014). The discussion here focuses on baseline enhancements that adjusted the games to the patient's BBG use, so as to measure the ability to grasp, extend the fingers, or pronate/supinate the arm.

Grasp Baseline

The BBG grasp baseline screen consisted of a tube connected to a mockup of the rubber bulb, a circular display to one side of the tube, a digital display to the other side, and a hand avatar (Figure 5a). The tube progressively filled with color based on how strongly the user grasped. The circular display gave a visual cue as to how long the user needed to maintain maximal grasp, while the digital display showed the percentage of force applied. The hand avatar changed posture from flat to flexed, as a cue for when the user was to grasp. Instructions were given as text displayed at the top of the baseline screen.

Figure 5. BrightBrainer Grasp baseline screens: a) grasp baseline; b) finger extension baseline; c) pronation/supination baseline. © Bright Cloud International Corp. Reprinted by permission.

The grasp baseline involved a sequence of steps. First, the user was asked to relax his/her hand and the system took a measurement of any residual pressure (owing to hand spasticity, for example). Subsequently the user was asked to squeeze the rubber bulb three times with the average of the three trials being the maximum squeeze threshold value.

It is known from prior studies on healthy individuals (Wiker et al., 1989) and on stroke survivors (Bautmans et al., 2007) that repeated grasping causes hand fatigue and discomfort. In order to minimize such unwanted effects during game play involving grasping, BBG games used 50% of the maximum squeeze force measured at baseline that day: exceeding 50% of the maximum activated game avatars whenever the user had been asked to squeeze. This method minimized fatigue and finger pain during play with the BBG controller.
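The grasp-baseline logic described above can be sketched as follows. Whether the resting residual measurement is subtracted before taking 50% is an assumption made here for illustration; the function names are likewise hypothetical.

```python
# Hypothetical sketch of the grasp baseline: record a resting residual,
# average three maximal squeezes, and set the in-game activation threshold
# at 50% of the way from resting to maximum. Illustrative names only;
# subtracting the residual is an assumption, not the documented method.

def grasp_threshold(resting: float, trials: list) -> float:
    """Return the in-game activation threshold from baseline measurements."""
    maximum = sum(trials) / len(trials)     # mean of three maximal squeezes
    return resting + 0.5 * (maximum - resting)

def is_activated(current_force: float, threshold: float) -> bool:
    """An avatar activates once the squeeze exceeds the threshold."""
    return current_force >= threshold

thresh = grasp_threshold(resting=2.0, trials=[30.0, 34.0, 32.0])  # Newtons
```

Using a sub-maximal threshold keeps every required squeeze well within the patient's measured capability, which is the fatigue-minimizing rationale given above.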

Extension Baseline

In this baseline, the user saw a circle depicting the angle of global finger extension, placed next to a simplified representation of the BBG controller (Figure 5b). Similar to the grasp baseline, a rotary dial located above the BBG mockup served as a visual cue for how long the user needed to maintain maximal finger extension. First, the user was asked to relax his/her hand and the system took a measurement of any residual extension due to spasticity or other causes. Subsequently the user was asked to stretch (extend) the fingers three times, with the average of the trials being the extension threshold. Again, only 50% of this average was subsequently used in the games to detect finger extension. One reason to choose 50% of the finger extension baseline related to the difficulty spastic stroke survivors have extending their fingers (Thibaud et al., 2013). Since playing games with the BBG induced repeated finger extensions, fatigue had been detected even in the healthy subjects who took part in the usability study reported here. Thus it became important to reduce the extension detection threshold to facilitate game play by impaired users.

Arm Pronation/supination Baseline

In this baseline, the user saw a hand avatar holding a knob that could rotate (Figure 5c). This baseline measured wrist pronation and supination range of motion. First, the user was asked to rotate the wrist inwards (pronation) to the maximum comfortable position, without compensatory elbow movement. Next, the user was asked to rotate the wrist outwards (supination) to the maximum comfortable position, again without elbow movement. As the user rotated the knob's arrow avatar, a digital display located above the knob showed the corresponding angle of pronation or supination achieved. The system computed the minimum and maximum angles of rotation so that in-game avatars could rotate proportionally: a user's unique range of wrist rotation was mapped to the avatar's full wrist rotation in the game, making games easier to play and win. If the user did not have gravity bearing in the affected arm, the baseline was performed while the BBG controller rested on a horizontal surface. The controller's curved bottom allowed the forearm rotation necessary to perform this baseline, provided that spastic pronation did not bring the tracker all the way down against the supporting surface.
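The proportional mapping just described can be sketched as a per-user linear map; the avatar's assumed ±90° range and all names below are illustrative assumptions, not the BBG implementation.

```python
# Hypothetical sketch: map a user's measured pronation/supination range
# onto the avatar's full wrist rotation. The avatar range of -90..+90
# degrees is an assumption for illustration.

AVATAR_MIN, AVATAR_MAX = -90.0, 90.0

def map_rotation(angle: float, user_min: float, user_max: float) -> float:
    """Linearly map [user_min, user_max] onto the avatar's full range."""
    t = (angle - user_min) / (user_max - user_min)
    t = max(0.0, min(1.0, t))            # clamp outside the measured range
    return AVATAR_MIN + t * (AVATAR_MAX - AVATAR_MIN)

# a user with only 40 degrees of total rotation still drives full avatar motion
mapped = map_rotation(0.0, user_min=-20.0, user_max=20.0)  # midpoint -> 0.0
```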

Role of Artificial Intelligence in game adaptation

During gameplay, if the user was not able to reach as far as she/he did in the initial baseline, or was not reaching far enough, a supervisor could adjust the baseline. Namely, the baseline range was manually adjustable, to increase or decrease the range of motion captured at baseline and so make the game harder or easier to play. Typically, this feature was used when a user had become fatigued towards the end of a session and was no longer able to produce the full range of motion measured at baseline. After the manual adjustment, the AI component took over so as to adapt the games to the new range of motion.

Artificial intelligence played several roles in ensuring the therapy was adaptable and the games winnable, even by patients with minimal functioning levels. Each BrightBrainer game had an adaptive algorithm that took the user's previous performance trend as a reference and used this information to adjust subsequent game difficulty accordingly, so as to provide appropriately challenging game settings for users at different functioning levels.

The first use of AI related to the mapping of the baseline-detected workspace and grasp strength to the VR scene. If the range of motion was very small for the affected arm or fingers, it was automatically scaled to the full dimensions of the surrounding virtual scene. This amplification allowed the affected arm to control avatars anywhere in the scene. Similarly, if the active finger extension was very limited, it was still mapped to the full finger extension of the hand avatar. This way even low functioning patients could still play the game in settings that required finger extension.
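This automatic amplification can be sketched as a per-axis linear map from the user's baseline-measured reach box to the scene's bounds, so that even a very small reach spans the whole virtual scene. All names and the example dimensions are assumptions for illustration.

```python
# Hypothetical sketch of workspace amplification: scale a small
# baseline-measured reach box to the full virtual scene, per axis.
# Names and example dimensions (meters) are illustrative only.

def amplify(pos, reach_min, reach_max, scene_min, scene_max):
    """Per-axis linear map from the user's reach box to the scene box."""
    out = []
    for p, rmin, rmax, smin, smax in zip(pos, reach_min, reach_max,
                                         scene_min, scene_max):
        t = (p - rmin) / (rmax - rmin) if rmax > rmin else 0.5
        out.append(smin + t * (smax - smin))
    return out

# a 10 cm reach drives avatars across a 1 m wide scene
v = amplify([0.05, 0.0, 0.0],            # current hand position
            [0.0, -0.05, 0.0],           # measured reach minimums
            [0.1, 0.05, 0.1],            # measured reach maximums
            [-0.5, -0.5, 0.0],           # scene minimums
            [0.5, 0.5, 1.0])             # scene maximums
```

The same one-dimensional form applies to mapping a limited finger-extension range onto the hand avatar's full extension, as the text describes.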

Artificial Intelligence also played a role in keeping the patient engaged maximally, by setting game difficulty based on past performance. Completing the game while exceeding a game-specific number of errors (losing too many balls in Breakout 3D, not matching pairs of cards in Card Island, making too many moves in Towers of Hanoi 3D, or missing too many notes in Musical Drums) twice in a row resulted in a decrease in game difficulty the next time that particular game was to be played.

Conversely, if the user had performed well in three immediately preceding instances of a game, the difficulty level for that game was subsequently increased, up to a maximum allowed level. This introduced higher-functioning tasks (e.g., more disks in Towers of Hanoi 3D) and demanded faster reaction times (e.g., faster balls in Breakout 3D).

To make sure a user stayed engaged with the system, difficulty levels were set independently for every game, based on the skill exhibited in that particular game. If users reached a difficulty that was too high for them, or exceeded the game-specified error count two times in a row (as stated above), the AI subsequently decreased the difficulty for that particular game and patient. This adjustment was iterative whenever the above conditions were met, all the way back down to level one. The combination of automatic increases and decreases in difficulty, occurring over a sequence of rehabilitation sessions, resulted in a form of “shaping” of the therapy, a term known in physical medicine to mean the conditioning of behavioral responses to promote the desired movement (Peterson, 2004). This AI difficulty adjustment process was designed to keep a patient engaged at a point where they remained challenged but not bored, while exercising maximally over the set session duration.
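The per-game shaping rules above (two consecutive sessions over the error budget lower the level; three consecutive successes raise it) can be sketched as a small state machine. This is an illustrative sketch only; the class and method names, the streak-reset policy, and the level bounds are assumptions where the article does not specify them:

```python
class DifficultyAdapter:
    """Per-game difficulty shaping: three consecutive successes raise the
    level (up to a maximum); two consecutive sessions exceeding the game's
    error budget lower it (never below level one). Illustrative sketch."""

    def __init__(self, max_level, min_level=1):
        self.level = min_level
        self.min_level = min_level
        self.max_level = max_level
        self.fail_streak = 0
        self.success_streak = 0

    def record_session(self, errors, error_budget):
        """Update the difficulty after one play of this game."""
        if errors > error_budget:
            self.fail_streak += 1
            self.success_streak = 0
            if self.fail_streak >= 2:      # two failures in a row -> easier
                self.level = max(self.min_level, self.level - 1)
                self.fail_streak = 0
        else:
            self.success_streak += 1
            self.fail_streak = 0
            if self.success_streak >= 3:   # three successes in a row -> harder
                self.level = min(self.max_level, self.level + 1)
                self.success_streak = 0
        return self.level
```

Each game would hold its own adapter instance per patient, so that skill in one game never alters the difficulty of another, consistent with the per-game adjustment described above.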

An important by-product of AI game adaptation was that the therapeutic games became winnable. In all cases of successful game completion, the patient received “rewards,” such as applause, congratulatory text, or pleasant sounds. A consequence of success was to give the patient a feeling of being in control, something known to increase wellbeing and reduce depression severity (Lau, Smit, Fleming, & Riper, 2016).

Usability Evaluation

Subsequent to approval from the Western Institutional Review Board, a usability study was conducted of the BBG controllers integrated with the BrightBrainer therapeutic game library. Two healthy participants, one male (Subject 1) and one female (Subject 2), were recruited; their demographics are shown in Table 1. Novice healthy volunteers were chosen for the usability evaluation because the researchers wanted an objective evaluation, unbiased by motor or cognitive impairments. Had stroke survivors participated, their responses would have been confounded by their high or low motor function. Usability testing with healthy participants thus gave the researchers a better idea of the response of a typical individual.

Table 1.

BrightBrainer Grasp usability study participants’ demographics. © Bright Cloud International Corp. Reprinted by permission.

ID | Gender | Age | Race             | Years of school | Dominant arm | Primary language | Prior computer usage
1  | M      | 50  | White            | 18              | Left         | Romanian         | Yes
2  | F      | 60  | African American | 16              | Right        | English          | Yes

Each participant performed four usability sessions, each lasting approximately one hour. After each session, participants received a $25 participation fee.

During the usability sessions, participants were seated and rested their arms on a low-friction table. A BBG controller was donned on each forearm, such that participants could play the games uni-manually or bimanually (for increased motor and cognitive load). Of the four usability sessions, the first was played uni-manually and the subsequent three involved bimanual interaction.

Baselines were completed every session to ensure that both participants could comfortably play the therapeutic games. The difficulty of the selected games started at level 1 and reached levels 10 to 16 (depending on the game).

During the first session, usability participants learned proper arm motions and game mechanics. After playing each game once at the lowest level, both participants understood the basic rules and requirements of each game. As difficulty levels increased and new game mechanics were introduced, participants required explanations and instructions on how to proceed. By the last session, participants were able to complete the session with minimal assistance.

Initially, games took longer to complete as participants learned each game, how to use the BBG controllers, and how to interact with the BrightBrainer system. This resulted in occasional game timeouts, in which tasks were interrupted before completion so the evaluation session could proceed to another game. After the first session, game completion times for both participants became shorter, with only one game (Avalanche) timing out.

Both participants made fewer errors at lower levels of difficulty and more errors at higher levels. For the Breakout 3D game, an error was counted if the subject missed the ball and the ball disappeared into the ocean. Subject 1 made between 0 and 6 errors at lower levels of game difficulty and between 17 and 40 errors at higher levels of Breakout 3D difficulty. Subject 2 made between 1 and 5 errors at lower levels and between 6 and 23 errors at higher levels (Figure 6a). For the Towers of Hanoi 3D game, Subject 1 completed the game in the ideal (minimum) number of moves at the lower levels of difficulty (2/2 and 5/5, respectively). At higher levels, however, the subject made errors resulting in extra moves (11/9 and 28/22, respectively) and was unable to complete the game in the allotted time during the final session. Subject 2 completed the lower levels in the minimum number of moves (2/2 and 5/5), whereas at higher levels she made errors resulting in additional moves (16/15 and 36/22). She was also unable to complete the game in the allotted time during the last evaluation session (Figure 6c). Finally, for the Musical Drums game, both subjects made fewer errors at difficulty levels 1 and 4, when the number and speed of notes were smaller. At difficulty level 13, the number and speed of notes increased, resulting in 47 errors for Subject 1 and 29 errors for Subject 2 (Figure 6d).

Figure 6.

Usability study participants’ error rates as a function of game difficulty for: a) Breakout 3D; b) Card Island; c) Towers of Hanoi 3D; d) Musical Drums. © Bright Cloud International Corp. Reprinted by permission.

This increased error count may have been the result of multiple increases in game difficulty per session. Normally, a patient would experience a gradual increase in difficulty over the course of several weeks (not days), with the AI adjusting levels based on game performance.

After each session, participants completed a custom Feedback Form with 21 questions (Figure 7); a USE Questionnaire (Lund, 2001) was completed after the final session. Participants rated individual therapeutic games and the overall system on a 5-point Likert scale, with 1 being the least desired and 5 the most desired outcome.

Figure 7.

Usability Evaluation Feedback Form. © Bright Cloud International Corp. Reprinted by permission.

For all but one game, Participant 1’s average rating per game was 4/5 or above. Participant 2 rated all games 4.7/5 or above on the custom evaluation form. Participants reported no difficulty donning the controllers and did not require rest periods during sessions.

Participants mentioned some tracker delay while playing the games. Upon examination, the researchers increased the distance between the BrightBrainer Grasp controllers and the BrightBrainer IR illuminators used for tracking, which resolved the delay issue.

Over the course of the study, the games were continuously updated based on feedback from both participants and on observations from the research team. Feedback included requests for additional visual cues indicating when to grasp or when to extend the fingers. For example, when playing the Card Island game, both participants recommended that the game provide an additional visual cue to differentiate when the hand was open versus closed. The researchers addressed this by changing the color of the hand avatar based on its open/closed status.

The researchers further observed that, when playing the Towers of Hanoi 3D game, both participants were unable to differentiate disk sizes due to depth-perspective artifacts: when a bigger disk was placed farther away than a smaller, closer disk, the two disks appeared to the subjects to be the same size. Researchers addressed this by adding a number to each disk, assigned in increasing order from the smallest disk to the largest (see also Figure 4c).

Lastly, both participants experienced a small zone in which the avatars were unresponsive when they attempted to reach forward or raise their arms. Researchers discovered an issue with the calibration of the BBG controllers and fixed it, allowing accurate controller tracking and seamless avatar movements.

Participant 2 commented that the “length of time was long” for the second session and that her arms were becoming fatigued. Towards the end of that session, this fatigue left her unable to reach the extension and squeeze baseline thresholds set at the start of the session. In an actual telerehabilitation intervention, the exercise time increases week by week, allowing patients to gradually build up their arm/hand endurance; over the shorter usability evaluation, the participant had no time to do so.

Both participants rated all questions on the USE Questionnaire 6/7 or above, strongly agreeing that the system was useful. Additionally, both strongly agreed that the system was easy to use, user-friendly, and fun, and that they were able to learn to use it quickly. Both participants listed positive aspects of the system, such as “Fun and interesting” (Participant 1) or “Fun overall. Agility and Sharpness with hand and arm movements” (Participant 2).

Discussion

The BrightBrainer Grasp controller is an adaptive but passive system. While active-assisted arm reach is possible in a clinical setting when a therapist is present (as long as the trackers are not occluded), BBG is aimed primarily at telerehabilitation applications. Gameplay is possible at home; however, the patient needs to be able to generate minimal voluntary (active) movement. Such movement needs to be several times larger than the HTC tracker resolution in order to be detected with minimal jitter.

Another limitation of the current design relates to the detectable grasp forces when a pressure sensor is used. Air compressibility is not a significant factor here, owing to the short distance between the air chamber in the rubber bulb and the pressure sensor on the BBG electronics board. As seen in Figure 2a, however, the sensor output is essentially flat for grasp strengths below 1 lbf, which makes detection of very weak forces problematic.

While the usability evaluation reported here is the first to target the BBG controller, it is limited by the number of participants (only two) and their age. The oldest participant was 60, while 66% of stroke survivors are 65 or older (Hall et al., 2012). Even more importantly, the usability participants had intact upper extremity motor function and normal cognition. It is highly probable that usage by elderly stroke survivors (some also having cognitive impairments) will be substantially more challenging. A usability study with stroke survivors would certainly have been beneficial; however, it was not conducted due to time and funding constraints.

A BBG feasibility evaluation with individuals living at home in the chronic phase post-stroke was completed after the original submission of the present manuscript. Results from this subsequent study will be published elsewhere. Randomized clinical trials of the device, together with more adaptable therapeutic games for training at home, are in the planning phase and will incorporate lessons learned in this study.

Acknowledgement

The authors would like to thank the participating healthy volunteers for taking part in the usability data collection, and Team Design Group for manufacturing the prototypes used in this study and contributing to subsequent design improvements.

Funding

This study was funded by the National Institutes of Health/NIA grants R43AG052290 and R44AG044639.

Footnotes

Trial Registration

ClinicalTrials.gov registration TBA.

References

  1. Agostini M, Moja L, Banzi R, Pistotti V, Tonin P, Venneri A, and Turolla A (2015). Telerehabilitation and recovery of motor function: a systematic review and meta-analysis. J Telemed Telecare, 21(4), 202–213. 10.1177/1357633X15572201 [DOI] [PubMed] [Google Scholar]
  2. Akinwuntan AE, De Weerdt W, Feys H, Pauwels J, Baten G, Arno P, et al. (2005), Effect of simulator training on driving after stroke. Neurology. 65(6):843–50. [DOI] [PubMed] [Google Scholar]
  3. Barnason S, Zimmerman L, Nieveen J, Schulz P, Miller C, Hertzog M, and Chunhao T (2009). Influence of a Symptom Management Telehealth Intervention on Older Adults’ Early Recovery Outcomes following Coronary Artery Bypass Surgery (CABS). Heart & Lung: The Journal of Critical Care, 38(5), 364–376. 10.1016/j.hrtlng.2009.01.005 [DOI] [PMC free article] [PubMed] [Google Scholar]
  4. Bower KJ, Louie J, Landesrocha Y, Seedy P, Gorelik A, Bernhardt J (2015). Clinical feasibility of interactive motion-controlled games for stroke rehabilitation. J. Neuroeng. Rehabil 12, 63. [DOI] [PMC free article] [PubMed] [Google Scholar]
  5. Bramanti A, Manuli A, and Calabrò RS (2018). Stroke Telerehabilitation in Sicily: a Cost-Effective Approach to Reduce Disability? Innov Clin Neurosc, 15(1–2), 11–15. [PMC free article] [PubMed] [Google Scholar]
  6. Bright Cloud International (2018). BrightBrainer BBX Rehabilitation System. Product Brochure. North Brunswick, NJ, 08904. [Google Scholar]
  7. Burdea G (2003). Virtual Rehabilitation: Benefits and Challenges. Methods Inf Med (Schattauer, Germany), 42(5):519–523. [PubMed] [Google Scholar]
  8. Burdea G, Polistico K, Grampurohit N, House G, Kim N, Nordstrom M, Buccellato K, Murphy J, Pasquina P. Concurrent Virtual Rehabilitation of Service Members Post-Acquired Brain Injury – A Randomized Clinical Study, 12th Int Conference on Disability and Virtual Reality Technology, September 2018, Nottingham, UK. pp. 22–31. [Google Scholar]
  9. Burdea G, Polistico K, Krishnamoorthy A, House G, Rethage D, Hundal J, Damiani F, and Pollack S. (2015a). A feasibility study of the BrightBrainer™ cognitive therapy system for elderly nursing home residents with dementia. Disability and Rehabilitation – Assistive Technology. 10(5):421–432. doi: 10.3109/17483107.2014.900575 [DOI] [PMC free article] [PubMed] [Google Scholar]
  10. Burdea G, Polistico K, House G, Liu R, Muñiz Roberto, Macaro N, and Slater L. (2015b) Novel Integrative Virtual Rehabilitation Reduces Symptomatology in Primary Progressive Aphasia – A Case Study. International Journal of Neuroscience. 125(12):949–958. [DOI] [PMC free article] [PubMed] [Google Scholar]
  11. Burdea G, Roll D, Kim NH, Polistico K, Kadaru A (2017) Bimanual Integrative Virtual Rehabilitation System and Methods. US Patent Application 2017/0361217 A1, December. 21.
  12. Byl N, Abrams G, Pitsch E, Fedulow I, Kim H, Simkins M, et al. (2013), Chronic stroke survivors achieve comparable outcomes following virtual task specific repetitive training guided by a wearable robotic orthosis (UL-EXO7) and actual task specific repetitive training guided by a physical therapist. Journal of Hand Therapy, 26(4):343–51. [DOI] [PubMed] [Google Scholar]
  13. Carmeli E, Peleg S, Bartur G, Elbo E et al. (2011). HandTutor enhanced hand rehabilitation after stroke-a pilot study. Physiotherapy Research International, 16(4):101–200. [DOI] [PubMed] [Google Scholar]
  14. Cheok G, Tan D, Low A, and Hewitt J (2015). Is Nintendo Wii an Effective Intervention for Individuals With Stroke? A Systematic Review and Meta-Analysis. J A Med Dir Assoc, 16(11), 923–932. 10.1016/j.jamda.2015.06.010 [DOI] [PubMed] [Google Scholar]
  15. Cherry CO, Chumbler NR, Richards K, Huff A, Wu D, Tilghman LM, and Butler A (2017). Expanding stroke telerehabilitation services to rural veterans: a qualitative study on patient experiences using the robotic stroke therapy delivery and monitoring system program. Disab Rehab. Assist Tech, 12(1), 21–27. 10.3109/17483107.2015.1061613 [DOI] [PubMed] [Google Scholar]
  16. Cho K, Yu J, Jung J. (2012). Effects of virtual reality based rehabilitation on upper extremity function and visual perception in stroke patients: a randomized control trial. Journal of Physical Therapy Science, 24:1205–8. [Google Scholar]
  17. Coupar F (2012). Exploring Upper Limb Interventions After Stroke [PhD thesis]. Glasgow, UK: University of Glasgow. [Google Scholar]
  18. Crosbie J, Lennon S, McGoldrick M, McNeil M, McDonough S. (2012). Virtual reality in the rehabilitation of the arm after hemiplegic stroke: a randomized controlled pilot study. Clinical Rehabilitation, 26(9):798–806. [DOI] [PubMed] [Google Scholar]
  19. Darzi A, Gorsic M, and Novak D (2017). Difficulty adaptation in a competitive arm rehabilitation game using real-time control of arm electromyogram and respiration. IEEE … Int Conf Rehab Robot. 2017, 857–862. 10.1109/ICORR.2017.8009356 [DOI] [PMC free article] [PubMed] [Google Scholar]
  20. Dilsizian SE, and Siegel EL (2014). Artificial intelligence in medicine and cardiac imaging: harnessing big data and advanced computing to provide personalized medical diagnosis and treatment. Current Cardiol Rep, 16(1), 441. 10.1007/s11886-013-0441-8 [DOI] [PubMed] [Google Scholar]
  21. Freund H-J (1986) Time control of hand movements. In Freund Biittner. Cohen and Noth (Eds) Progress in Brain Research. 64: 287–294. [DOI] [PubMed] [Google Scholar]
  22. Hall MJ, Levant S, and DeFrances CJ Hospitalization for stroke in U.S. hospitals, 1989–2009. NCHS data brief, No. 95. Hyattsville, MD: National Center for Health Statistics; 2012. https://www.cdc.gov/nchs/data/databriefs/db95.pdf [PubMed] [Google Scholar]
  23. Holden M (2005). Virtual environments for motor rehabilitation: review. Cyberpsychol Behav. 8(3):187–211 [DOI] [PubMed] [Google Scholar]
  24. Honeywell (2014). TruStability® Board Mount Pressure Sensors. Golden Valley, MN. https://sensing.honeywell.com/index.php?ci_id=151134 [Google Scholar]
  25. HTC Corporation. (2016). VIVE Pre User Guide. http://www.htc.com/managed-assets/shared/desktop/vive/Vive_PRE_User_Guide.pdf
  26. Hung Y-X, Huang P-C, Chen K-T, and Chu W-C (2016). What Do Stroke Patients Look for in Game-Based Rehabilitation. Medicine, 95(11). 10.1097/MD.0000000000003032 [DOI] [PMC free article] [PubMed] [Google Scholar]
  27. Kreylos O (2016). Lighthouse tracking examined. Retrieved from http://doc-ok.org/?p=1478
  28. Lau HM, Smit JH, Fleming TM, and Riper H (2016). Serious Games for Mental Health: Are They Accessible, Feasible, and Effective? A Systematic Review and Meta-analysis. Frontiers in Psychiatry, 7, 209. 10.3389/fpsyt.2016.00209 [DOI] [PMC free article] [PubMed] [Google Scholar]
  29. Laver KE, George S, Thomas S, Deutsch JE, Crotty M. (2015). Virtual reality for stroke rehabilitation. Cochrane Database of Systematic Reviews, Issue 2. Art. No.: CD008349. DOI: 10.1002/14651858.CD008349.pub3. [DOI] [PMC free article] [PubMed] [Google Scholar]
  30. Lund AM (2001). Measuring Usability with the USE Questionnaire. STC Usability SIG Newsletter. 8:2. http://hcibib.org/perlman/question.cgi?form=USE [Google Scholar]
  31. Massalha S, Clarkin O, Thornhill R, Wells G, and Chow BJW (2018). Decision Support Tools, Systems, and Artificial Intelligence in Cardiac Imaging. Canadian J Cardio, 34(7), 827–838. 10.1016/j.cjca.2018.04.032 [DOI] [PubMed] [Google Scholar]
  32. Niehorster D, Li L, Lappe M (2017). The Accuracy and Precision of Position and Orientation Tracking in the HTC Vive Virtual Reality System for Scientific Research. I-Perception,1–23. [DOI] [PMC free article] [PubMed] [Google Scholar]
  33. Perez-Marcos D, Chevalley O, Schmidlin T, Garipelli G et al. , (2017) Increasing upper limb training intensity in chronic stroke using embodied virtual reality: a pilot study. J NeuroEngineering and Rehabilitation, 14:119, 14 pp. [DOI] [PMC free article] [PubMed] [Google Scholar]
  34. Peterson GB (2004). A day of great illumination: B. F. Skinner’s discovery of shaping. J Exp Analysis Behav, 82(3), 317–328. 10.1901/jeab.2004.82-317 [DOI] [PMC free article] [PubMed] [Google Scholar]
  35. Piron L, Turolla A, Tonin P, Piccione F, Lain L, and Dam M (2008). Satisfaction with care in post-stroke patients undergoing a telerehabilitation programme at home. J Telemed Telecare, 14(5), 257–260. 10.1258/jtt.2008.080304 [DOI] [PubMed] [Google Scholar]
  36. Richmond T, Peterson C, Cason J, Billings M, Terrell E. Lee, et al. (2017). American Telemedicine Association’s Principles for Delivering Telerehabilitation Services. Int J Telerehab, 9(2), 63–68. 10.5195/ijt.2017.6232 [DOI] [PMC free article] [PubMed] [Google Scholar]
  37. Saywell N, Vandal AC, Brown P, Hanger HC, Hale L, Mudge S, … Taylor D. (2012). Telerehabilitation to improve outcomes for people with stroke: study protocol for a randomized controlled trial. Trials, 13, 233. 10.1186/1745-6215-13-233 [DOI] [PMC free article] [PubMed] [Google Scholar]
  38. Sixense Entertainment. Razer Hydra Master Guide. (2011). Available from: dl.razerzone.com/master-guides/Hydra/HydraOMG-ENG.pdf
  39. Threapleton K, Drummond A, and Standen P (2016). Virtual rehabilitation: What are the practical barriers for home-based research? Digital Health, 2:1–11. [DOI] [PMC free article] [PubMed] [Google Scholar]
  40. Thibaut A, Chatelle C, Ziegler E, Bruno MA, Laureys S and Gosseries O. (2013). Spasticity after stroke: Physiology, assessment and treatment, Brain Injury, 27:10, 1093–1105, DOI: 10.3109/02699052.2013.804202. [DOI] [PubMed] [Google Scholar]
  41. Tousignant M, Moffet H, Boissy P, Corriveau H, Cabana F, and Marquis F (2011). A randomized controlled trial of home telerehabilitation for post-knee arthroplasty. J Telemed Telecare, 17(4), 195–198. 10.1258/jtt.2010.100602 [DOI] [PubMed] [Google Scholar]
  42. Webster D, and Celik O (2014). Systematic review of Kinect applications in elderly care and stroke rehabilitation. J Neuroeng Rehab, 11, 108. 10.1186/1743-0003-11-108 [DOI] [PMC free article] [PubMed] [Google Scholar]
  43. White T and Stern R (2003) Neuropsychological Assessment Battery, Ch. 4, Psychometric and Technical Manual, Psychological Assessment Resources Inc, Lutz. FL, 49 pp. [Google Scholar]
  44. Wiker S, Hershkowitz E and Zik J. (1989) Teleoperator Comfort and Psychometric Stability: Criteria for Limiting Master Controller Forces of Operation and Feedback During Telemanipulation, Proc of NASA Conference Space Telerobotics, NASA, 1:99–107. [Google Scholar]
