Abstract
Introduction:
Deep brain stimulation (DBS) is an effective treatment for Parkinson’s disease (PD), but its efficacy is tied to DBS programming, which is often time consuming and burdensome for patients, caregivers, and clinicians. Our aim is to test whether the Mobile Application for PD DBS (MAP DBS), a clinical decision support system, can improve programming.
Methods:
We conducted an open-label, 1:1 randomized, controlled, multicenter clinical trial comparing six months of standard of care (SOC) to six months of MAP DBS-aided programming. We enrolled patients between 30 and 80 years old who received DBS to treat idiopathic PD at six expert centers across the United States. The primary outcome was time spent DBS programming, and secondary outcomes measured changes in motor symptoms, caregiver strain, and medication requirements.
Results:
We found a significant reduction in initial visit time (SOC: 43.8 ± 28.9 minutes, n = 37; MAP DBS: 27.4 ± 13.0 minutes, n = 35; p = 0.001). We did not find a significant difference in total programming time between the groups over the 6-month study duration. MAP DBS-aided patients experienced a significantly larger reduction in UPDRS III on-medication scores (−7.0 ± 7.9) compared to SOC (−2.7 ± 6.9, p = 0.01) at six months.
Conclusion:
MAP DBS was well tolerated and improved key aspects of DBS programming time and clinical efficacy.
Introduction:
Deep brain stimulation (DBS) has been shown to be an effective treatment for select patients with Parkinson’s disease (PD),1–3 as well as other neurological and psychiatric disorders.4–10 The therapeutic effectiveness of DBS depends on careful patient screening,11,12 precise surgical targeting,13,14 and optimization of stimulation parameters delivered to the patient.15–17 Determining the best stimulation parameters, commonly referred to as DBS programming, can be a cumbersome and time-consuming process. At many expert centers, DBS programming requires four or more clinic visits during the first six months after lead implantation,18 creating a substantial burden for patients and caregivers, especially those traveling long distances.
The selection of optimal DBS settings for a patient is challenging given the tens of thousands of potential programming combinations. The efficacy of DBS therapy is tied to the interaction between the precise anatomical location of patients’ DBS leads and their stimulation parameters.19,20 However, many clinicians perform programming without access to information regarding the patient-specific anatomical fields of activation caused by different stimulation settings. Instead, DBS programming is often performed using a trial-and-error-based iterative approach, and usually requires an experienced programming clinician to effectively navigate the functionally unlimited number of parameter combinations while being guided by therapeutic response and side effects.18
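The scale of that parameter space can be made concrete with a back-of-the-envelope count. The ranges below are illustrative assumptions for a conventional four-contact lead, not the specification of any particular device:

```python
# Illustrative (assumed) monopolar parameter ranges for a 4-contact DBS lead.
contacts = 4                              # choice of active cathode
amplitudes = len(range(5, 51))            # 0.5-5.0 mA in 0.1 mA steps -> 46
pulse_widths = len(range(60, 121, 10))    # 60-120 us in 10 us steps -> 7
frequencies = len(range(130, 186, 5))     # 130-185 Hz in 5 Hz steps -> 12

combinations = contacts * amplitudes * pulse_widths * frequencies
print(combinations)  # 15456 monopolar settings alone; bipolar and
                     # multi-contact configurations multiply this further
```

Even this deliberately conservative count lands in the tens of thousands, which is why exhaustive testing of every setting is infeasible in clinic.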
To address this information gap and improve the DBS programming process, we developed the Mobile Application for PD DBS (MAP DBS), a DBS programming mobile decision support system that provides interactive, patient-specific computational models of activation (Fig 1). Building on the legacy of other computational model-based programming tools,21,22 MAP DBS allows the programming clinician to modify the stimulation settings through an intuitive user interface and to interactively visualize how the area of activation changes relative to each patient’s target nuclei. Previously, we demonstrated in a small pilot study that MAP DBS could improve multiple aspects of programming.23
Figure 1:
iPad screenshots of the MAP DBS interface. MAP DBS supports both a 3D view of patient anatomy and stimulation, as well as a 2D view with multiple coregistered MRI and CT image volumes. Top row: Example of the interface for a patient with a Medtronic 3389 DBS lead implanted in the left globus pallidus internus (GPi). The globus pallidus externus (GPe) is shown for additional context. Bottom row: Example of the interface for a patient with a Boston Scientific Vercise DBS lead implanted in the left subthalamic nucleus (STN). Left column: 3D interface. Right column: 2D slice view. The top left portion of the screen in both the 3D and 2D views contains the stimulation parameters chosen by the user to generate the volume of tissue activated.
Based on our previous work, we hypothesized that MAP DBS could simplify the programming process. To test this hypothesis, we conducted an open-label, randomized clinical trial to evaluate our mobile decision support system in a clinical setting.
Methods:
This study was a multicenter (six sites), 1:1, open-label, randomized controlled clinical trial designed to compare MAP DBS-aided programming to SOC programming for PD DBS patients from initial programming to six months. The primary outcome for the study was the mean difference between the two groups in total time required for device programming during the study period. Additionally, changes in clinically validated rating scales of motor symptoms, mood, quality of life, and caregiver strain between baseline and six-month outcomes were compared. The study was approved by the institutional review board (IRB) at each participating institution and was registered on ClinicalTrials.gov (identifier: NCT02474459).
Participants
The study was conducted at the University of Florida (UF), Baylor College of Medicine, Wake Forest University, New York University, University of Utah, and University of California San Francisco.
Study subjects could enroll either before or after DBS surgery, but prior to their first programming session. Inclusion criteria were age between 30 and 80 years; a diagnosis of idiopathic PD by a movement disorders neurologist; a minimum of five years since PD diagnosis; stage II or higher on the Hoehn and Yahr scale during the off-dopaminergic-medication state; unsatisfactory clinical response to optimal medical management; and a stable antiparkinsonian medication regimen for at least one month prior to DBS surgery. Exclusion criteria were suspected secondary or atypical parkinsonism; an MRI scan with significant evidence of brain atrophy or other abnormalities; a DBS system previously implanted at another institution; or plans to receive an additional DBS lead within six months of enrolling in the study. The primary caregivers (usually the spouse) of patients were given the option to enroll as research participants for assessment of caregiver strain. Informed consent was obtained from all study subjects.
Randomization and masking
A stratified blocked randomization scheme (block size of 4) was used to achieve balance over time and reduce confounds introduced by concurrent changes in other supportive care. Stratification was performed based on clinical site, target nucleus (subthalamic nucleus (STN) vs. globus pallidus internus (GPi)), programming clinician (i.e., the specific nurse or neurologist), DBS lead manufacturer, and number of DBS leads implanted (unilateral or bilateral). Study personnel at the University of Utah generated the randomization sequence using a computer program, and the sequences were distributed to the sites. Only the study coordinator soliciting consent from patients at each site had access to the randomization scheme. No attempt was made to conceal the study arm assignment from the patient or clinician.
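The stratified blocked scheme described above can be sketched in a few lines. This is an illustrative reconstruction of 1:1 block randomization, not the actual program run at the University of Utah:

```python
import random

# Sketch of 1:1 blocked randomization with a block size of 4. The trial
# generated one sequence per stratum (site x target nucleus x clinician x
# lead manufacturer x laterality); this reconstruction is illustrative only.
def blocked_sequence(n_blocks, rng):
    sequence = []
    for _ in range(n_blocks):
        block = ["SOC", "SOC", "MAP DBS", "MAP DBS"]
        rng.shuffle(block)  # every block of 4 contains exactly 2 of each arm
        sequence.extend(block)
    return sequence

seq = blocked_sequence(3, random.Random(0))  # seeded for reproducibility
print(seq)
```

Blocking guarantees that, within each stratum, the two arms never drift more than two assignments apart, which is what protects the comparison from slow changes in concurrent supportive care.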
Intervention
The study period started at each patient’s initial DBS programming session and finished at their six-month outcomes visit. Because of differences in SOC practices at each institution, the schedule of DBS programming sessions within the study period and the timing of the six-month outcomes visit varied between sites. All patients within an institution underwent the same schedule of study visits regardless of study arm.
Patients randomized to SOC underwent SOC DBS programming for the study duration. Patients randomized to MAP DBS-aided programming received a modified version of SOC, in which each patient’s programming clinician was provided with MAP DBS. In the MAP DBS group, clinicians used MAP DBS to download and explore patient-specific computational models of activation to aid in DBS programming decision making. Patient-specific computational models were generated at the Scientific Computing and Imaging (SCI) Institute at the University of Utah and were downloaded by each patient’s programming clinician prior to the patient’s first DBS programming session.
Typically, as part of DBS SOC, patients undergo a monopolar review at their first DBS programming visit. In short, the monopolar review is the process of testing each individual electrode along the DBS lead for symptomatic benefit and adverse effects. However, patients enrolled at UF received MAP DBS programming instead of the SOC monopolar review. No constraints were placed on the programming process; the clinicians were free to test and use any programming parameters based on their expert opinion.
During the first year, UF was the only clinical site in the study; in year two, the five other clinical sites were added (detailed above). Based on preliminary feedback from year one at UF, the other five sites were given more constrained instructions on how to use MAP DBS. First, clinicians used MAP DBS to select the two electrodes (of the four or eight, depending on DBS lead model) most likely to result in therapeutic benefit prior to the patient’s initial DBS programming visit. Next, clinicians were instructed to limit DBS programming to the two selected electrodes. Programming was done on one or two DBS leads at each session depending on whether the patient had unilateral or bilateral DBS implantation surgery. For patients implanted with leads containing segmented directional electrodes, the three coplanar segments were considered a single electrode (i.e., electrode selection was done in ring mode). MAP DBS visualizations on the iPad were available for use during all visits at all six sites.
Outcomes
The primary outcome measure was the between-group difference in programming time, calculated as the cumulative time spent clinically optimizing DBS settings across all scheduled and unscheduled DBS programming sessions prior to the six-month outcomes visit. Programming time was recorded using a stopwatch, and physicians were instructed to self-time, noting the start and stop times of each programming session. Secondary outcome measures were the change from baseline to six-month outcomes in PD motor symptoms as measured by the Unified Parkinson’s Disease Rating Scale part III (UPDRS III) in both the off-medication and on-medication states (both with DBS on); total PD symptom severity as measured by off-medication total UPDRS (sum of parts I, II, III, and IV); caregiver strain as measured by the Multidimensional Caregiver Strain Index (MCSI); quality of life as measured by the 39-question Parkinson’s disease rating scale (PDQ-39); and PD medication dose as measured by levodopa equivalent daily dose (LEDD).
The baseline used for the UPDRS III was the last evaluation done prior to the patient undergoing DBS surgery. All other baseline metrics were captured at the first study visit prior to the initiation of DBS programming. Outcome metrics were captured at all study visits, with the exception of the on-medication UPDRS III, which was captured only at baseline and at the six-month visit. Additionally, at the first study visit prior to the initiation of DBS programming, and at the six-month outcomes visit, off-medication UPDRS III assessments were performed by an assessor blinded to the study arm of the patient. The blinded assessments were done both by video review and in person. The MCSI was completed by primary caregivers who consented to the study. LEDD values were calculated using the Parkinson’s Measurement Toolbox (https://www.parkinsonsmeasurement.org). Programming visit attendance rate and total number of extra, unscheduled programming visits were captured to measure protocol compliance. Unscheduled programming visits were defined as visits not part of the initial schedule of visits. All untoward medical occurrences were recorded as adverse events for all patients throughout the trial.
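LEDD collapses a patient’s antiparkinsonian regimen into a single levodopa-equivalent number by weighting each drug’s daily dose. As a minimal sketch, using commonly cited conversion factors (the Parkinson’s Measurement Toolbox’s actual factors and drug list may differ):

```python
# Assumed levodopa-equivalence factors (mg drug -> mg levodopa equivalent);
# illustrative only -- the Parkinson's Measurement Toolbox may use other values.
LED_FACTORS = {
    "levodopa": 1.0,
    "levodopa_cr": 0.75,   # controlled-release formulation
    "pramipexole": 100.0,
    "ropinirole": 20.0,
    "rasagiline": 100.0,
    "amantadine": 1.0,
}

def ledd(daily_doses_mg):
    """Sum each drug's daily dose weighted by its equivalence factor."""
    return sum(LED_FACTORS[drug] * dose for drug, dose in daily_doses_mg.items())

# Hypothetical regimen: 600 mg/day levodopa plus 3 mg/day pramipexole.
print(ledd({"levodopa": 600, "pramipexole": 3}))  # 900.0 mg levodopa equivalents
```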
Finally, we conducted an unplanned sub-analysis of programming time and of the change in UPDRS III off-medication and on-medication scores between baseline and the six-month visit. This sub-analysis was conducted to determine whether the more constrained protocol instructions affected site outcomes. Additionally, we performed an unplanned sub-analysis comparing the between-group difference in programming time at each patient’s initial DBS programming session.
Statistical methods
All analyses were performed on an intent-to-treat basis, with subjects placed into treatment groups based on the randomization assignment. Initial session DBS programming times and total programming times over the course of the study were analyzed on a logarithmic scale to account for a skewed distribution and variability increasing with the mean. We estimated effect sizes and interpreted the back-transformed times as fold-changes in the original scale. Multiple imputation was used to estimate programming time for missing data, including study visits missed because of discontinued intervention. All analyses were performed separately on each imputed data set, and Rubin’s formula24 was used to combine the estimates and standard errors across the multiple imputations. To test for differences between the two arms of the study, we performed a randomized-blocked ANOVA that accounted for the a priori randomization strata. Changes in clinical rating scales and LEDD between baseline and the six-month visit were compared using a randomized-blocked ANCOVA adjusting for the baseline value. Programming visit attendance was calculated as the percentage of originally scheduled study visits attended, not counting visits missed by patients after they had discontinued the intervention. Attendance rates and unscheduled visits were compared using a randomized-blocked ANOVA. Lin’s concordance correlation coefficient25 was used to quantify the agreement between the UPDRS III scores as recorded by the treating clinician and as assessed in a blinded review. Rigidity questions were not included in this calculation because of the difficulty of assessing this symptom from a video recording; the UPDRS III scores were therefore modified for the assessment of agreement only. Outcomes are reported as mean ± standard deviation.
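Lin’s concordance correlation coefficient penalizes both poor correlation and systematic offset between two raters. A minimal sketch of the standard formula (not the study’s actual analysis code):

```python
# Minimal sketch of Lin's concordance correlation coefficient:
#   ccc = 2*cov(x, y) / (var(x) + var(y) + (mean(x) - mean(y))**2)
def lins_ccc(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx2 = sum((v - mx) ** 2 for v in x) / n               # variance, rater 1
    sy2 = sum((v - my) ** 2 for v in y) / n               # variance, rater 2
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n  # covariance
    return 2 * sxy / (sx2 + sy2 + (mx - my) ** 2)

# Perfect agreement yields 1.0; a constant offset between raters lowers it
# even though the Pearson correlation stays at 1.0.
print(lins_ccc([1, 2, 3, 4], [1, 2, 3, 4]))  # 1.0
print(lins_ccc([1, 2, 3, 4], [2, 3, 4, 5]))
```

The second call illustrates why this statistic suits rater-agreement questions: the offset rater is perfectly correlated with the first but is not in concordance with it.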
Sample size calculation and interim analysis
The target enrollment was 80 patients, with 40 per group. We anticipated a 10% dropout rate; therefore, 36 patients per group completing study procedures were needed to achieve 80% power to detect a 33% decrease in programming time at a two-sided 0.05 significance level, assuming a log-normal distribution with a coefficient of variation of 0.66. If a consenting patient had a primary caregiver, the caregiver was given the option to participate in the study.
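These numbers can be checked by reproducing the standard two-sample calculation on the log scale. This is a sketch of the arithmetic under the stated assumptions, not the study’s actual statistical code:

```python
from math import ceil, log
from statistics import NormalDist

# Sample-size arithmetic for a two-sample comparison of log-normal times.
cv = 0.66                        # assumed coefficient of variation
sigma2 = log(1 + cv ** 2)        # log-scale variance implied by the CV
delta = abs(log(1 - 0.33))       # a 33% decrease is a 0.67 fold-change

z = NormalDist()
z_alpha = z.inv_cdf(1 - 0.05 / 2)  # two-sided alpha = 0.05
z_beta = z.inv_cdf(0.80)           # 80% power

n_per_group = 2 * (z_alpha + z_beta) ** 2 * sigma2 / delta ** 2
print(ceil(n_per_group))  # 36 completers per group, matching the protocol
```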
No interim efficacy or futility evaluations were performed. The study design allowed for a blinded interim sample-size recalculation using the approach of Kieser and Friede26 to assess whether the initial estimate of a 0.66 coefficient of variation in programming time underestimated the true variability and, if so, to increase the target sample size. This assessment was performed at the point in the study when 36 patients had been enrolled, 22 of whom had completed all study procedures. The interim analysis estimated the coefficient of variation as 0.46 instead of 0.66, and no modifications to the study design were made.
MAP DBS patient-specific model generation
MAP DBS consisted of patient-specific computational models that were generated as described in our previous publication23 and were delivered to the clinicians on an iPad via ImageVis3D Mobile (https://itunes.apple.com/us/app/imagevis3d-mobile-universal/id378071694), a free iOS mobile application. Each patient-specific model consisted of four primary pieces of information: the location of the patient’s DBS lead or leads, the targeted brain nuclei (STN or GPi), a visualization of the volume of tissue activated for a range of DBS programming settings, and the patient’s head medical imaging from DBS surgery.
The MAP DBS patient-specific models of activation were generated from medical imaging, requiring at least one preoperative head MRI and one postoperative head CT or MRI from each patient. For each patient, all imaging sequences were rigidly co-registered to the patient’s preoperative T1 MRI sequence using the Advanced Normalization Tools (ANTs) antsRegistration algorithm.45 The DBS lead locations were manually identified from the metal artifact produced by the lead in the postoperative imaging, using the SCIRun 4 software (http://www.sci.utah.edu/cibc-software/scirun.html). Segmentations of the DBS target nuclei were obtained by using the antsRegistration SyN algorithm46 to perform a nonlinear registration of the patient’s preoperative MRI to the Montreal Neurological Institute PD25 template.47 Once the transformation between the two images was established, the nuclei segmentations provided by the PD25 template were warped into the patient space, where they could be viewed in the same reference frame as the patient’s DBS lead and medical imaging.
Computational models of the volume of tissue activated were precomputed for each DBS lead model using the axon model method detailed in our previous publications.48,49 Briefly, voltage solutions for each DBS lead model were computed using a finite element approach and were mapped onto a regular 3D grid of cellular axon models surrounding the lead. Simulations were run using NEURON50 and assumed an idealized waveform to determine the firing threshold for each axon in the grid. The threshold volumes were isosurfaced to generate discrete predictions of the extent of neuronal activation.
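Conceptually, the precomputed threshold grid acts as a lookup table: an axon at a grid point is predicted to fire whenever the stimulation amplitude meets its threshold, and the activated points define the volume of tissue activated. A toy sketch with synthetic thresholds (not the study’s finite element or NEURON models):

```python
# Toy volume-of-tissue-activated lookup: each grid point stores the minimum
# stimulation amplitude (mA) needed to fire the axon model placed there.
# Thresholds here are synthetic, growing with distance from a lead at (0, 0, 0).
def axon_threshold(x, y, z):
    distance_mm = (x ** 2 + y ** 2 + z ** 2) ** 0.5
    return 0.5 + 0.8 * distance_mm  # assumed monotonic distance-threshold curve

# Regular 3D grid of axon model positions surrounding the lead (mm offsets).
grid = [(x, y, z) for x in range(-3, 4) for y in range(-3, 4) for z in range(-3, 4)]

def activated(amplitude_ma):
    """Grid points whose axon model fires at the given stimulation amplitude."""
    return [p for p in grid if axon_threshold(*p) <= amplitude_ma]

# Raising the amplitude monotonically grows the predicted activation volume.
print(len(activated(1.5)), len(activated(3.0)))
```

Isosurfacing the real threshold grid at a given amplitude produces the smooth activation volumes visualized in the MAP DBS interface.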
Role of the funding source
The study sponsor (NIH) did not play a role in study design, collection, analysis, interpretation of data, writing the manuscript, or the decision to submit the paper for publication.
Results
Study participants
Seventy-seven patients across six clinical sites were enrolled in the study between November 15th, 2015 and January 17th, 2020, with the last study visit occurring on April 29th, 2020. Study enrollment was terminated after the target sample size was met. Seventy-three patients were randomized, with 37 allocated to SOC and 36 allocated to MAP DBS-aided programming. Of the 36 patients assigned to MAP DBS-aided programming, 35 underwent at least one study visit, of whom 32 received programming that adhered to the study protocol. All 37 patients allocated to SOC underwent at least one study visit and received the intended treatment. Baseline characteristics were similar between the groups with the exception of PD disease duration (SOC: 10.0 ± 5.4 years, MAP DBS: 7.9 ± 4.1 years). However, the difference in disease duration did not manifest as a difference in baseline PD symptom severity as measured by total off-medication UPDRS scores (SOC: 46.5 ± 15.5, MAP DBS: 44.0 ± 12.6) (Table 1).
Table 1:
Demographics and baseline characteristics by study arm.
 | Standard of care | MAP DBS |
---|---|---|
Age (n) | 63.6 ± 8.9 (37) | 63.3 ± 8.2 (35) |
Sex | ||
− Male | 28 (78%) | 25 (71%) |
− Female | 8 (22%) | 10 (29%) |
Duration of Parkinson’s disease (years) (n) | 10.0 ± 5.4 (37) | 7.9 ± 4.1 (35) |
Race | ||
− Asian | 3 (8%) | 1 (3%) |
− White | 33 (89%) | 33 (94%) |
− Other | 0 (0%) | 1 (3%) |
− Multiple | 1 (3%) | 0 (0%) |
Ethnicity | ||
− Hispanic or Latino | 0 (0%) | 1 (3%) |
− Not Hispanic or Latino | 37 (100%) | 33 (97%) |
DBS target | ||
− GPi | 21 (57%) | 21 (60%) |
− STN | 16 (43%) | 14 (40%) |
Number of DBS leads | ||
− Unilateral | 22 (59%) | 21 (60%) |
− Bilateral | 15 (41%) | 14 (40%) |
UPDRS III off-medication (n) * | 36.9 ± 13.8 (36) | 37.9 ± 13.7 (34) |
UPDRS III on-medication (n) * | 23.3 ± 12.9 (37) | 21.9 ± 11.2 (33) |
Total UPDRS (I, II, III, IV) off-medication (n) | 54.0 ± 17.7 (36) | 49.7 ± 14.3 (32) |
PDQ-39 (n) | 26.3 ± 16.4 (37) | 26.0 ± 15.0 (35) |
MCSI (n) | 16.7 ± 11.9 (34) | 16.0 ± 11.3 (30) |
LEDD (n) | 1078.1 ± 502.1 (35) | 868.6 ± 514.3 (29) |
Caregivers consented | 34 (92%) | 32 (91%) |
DBS lead model | ||
− Medtronic 3387 | 22 (59%) | 23 (66%) |
− Medtronic 3389 | 10 (27%) | 6 (17%) |
− Abbott Infinity Directional Lead (0.5 mm spacing) | 2 (5%) | 1 (3%) |
− Abbott Infinity Directional Lead (1.5 mm spacing) | 1 (3%) | 1 (3%) |
− Boston Scientific Vercise | 1 (3%) | 3 (9%) |
− Boston Scientific Vercise Cartesia | 1 (3%) | 1 (3%) |
GPi = globus pallidus internus, STN = subthalamic nucleus, UPDRS = Unified Parkinson’s Disease Rating Scale, PDQ-39 = 39-question Parkinson’s disease rating scale, MCSI = Multidimensional Caregiver Strain Index, LEDD = levodopa equivalent daily dose. Data are: mean ± standard deviation (n) or n (%).
* Preoperative baseline used
Programming time
We found that the time spent during the initial programming session was significantly less for patients assigned to MAP DBS (27.4 ± 13.0 minutes, n = 35) than for those assigned to SOC (43.8 ± 28.9 minutes, n = 37, p-value = 0.001). We did not find a significant difference in total programming time between the SOC arm (97.8 ± 53.0 minutes, n = 37) and the MAP DBS arm (75.2 ± 40.9 minutes, n = 35, p-value = 0.11) over the entire six-month study duration (primary outcome) (Fig 2a).
Figure 2:
A) CONSORT patient flow diagram. B) Summary of MAP DBS clinical trials. MAP DBS I refers to the current study.
Programming visit attendance
The number of clinic visits scheduled for DBS programming between the sites varied as sites were instructed to follow their own SOC procedure, ranging from two to five programming visits within the six-month study window (Fig 2b). We found no significant difference in scheduled visit attendance between patients who were allocated to SOC (98.2 ± 6.6 %) and those allocated to MAP DBS-aided programming (98.6 ± 4.7 %, p-value = 0.92). We also found no significant difference in the number of extra, unscheduled, DBS programming visits for patients allocated to SOC (0.3 ± 0.7) compared to those allocated to MAP DBS-aided programming (0.2 ± 0.5, p-value = 0.52) (Table 2).
Table 2:
Protocol adherence and secondary outcomes. Improvement in on-medication motor symptoms, as quantified by UPDRS III, was significantly larger for patients randomized to MAP DBS-aided programming than for those randomized to SOC. Data are: mean ± standard deviation (n).
 | SOC | MAP DBS | p-value |
---|---|---|---|
Percent of planned visits attended % (n) † | 98.2 ± 6.6 (37) | 98.6 ± 4.7 (35) | 0.92 |
Number of unscheduled programming visits (n) | 0.3 ± 0.7 (37) | 0.2 ± 0.5 (35) | 0.52 |
Change from baseline in off-medication UPDRS III (n) | −11.6 ± 11.5 (28) | −15.7 ± 13.2 (30) | 0.15 |
Change from baseline in on-medication UPDRS III (n) | −2.7 ± 6.9 (28) | −7.0 ± 7.9 (29) | 0.01* |
Change from baseline in total off-medication UPDRS (I, II, III, IV) (n) | −11.9 ± 16.6 (28) | −11.8 ± 15.4 (29) | 0.69 |
Change from baseline in PDQ-39 (n) | −3.3 ± 18.2 (29) | −7.1 ± 13.5 (30) | 0.12 |
Change from baseline in caregiver MCSI (n) | −0.4 ± 5.6 (26) | −3.4 ± 9.3 (21) | 0.48 |
Change from baseline in LEDD (n) | −33.9 ± 371.9 (29) | −68.5 ± 287.1 (25) | 0.70 |
UPDRS = Unified Parkinson’s Disease Rating Scale, PDQ-39 = 39-question Parkinson’s disease rating scale, MCSI = Multidimensional Caregiver Strain Index, LEDD = levodopa equivalent daily dose
* p < 0.05 (statistically significant)
† Does not include visits missed because of discontinued intervention
Clinical rating scales and medications
Motor symptoms, as measured by UPDRS III in the on-medication state, showed greater decreases for patients who received MAP DBS-aided programming (−7.0 ± 7.9) compared to those who received SOC (−2.7 ± 6.9, p-value = 0.01). The change in PD medications as measured by LEDD did not differ between those who underwent SOC (−33.9 ± 371.9) and those who underwent MAP DBS-aided programming (−68.5 ± 287.1, p-value = 0.70). No statistically significant differences were found between MAP DBS-aided programming and SOC for changes in total UPDRS, UPDRS III off-medication, or PDQ-39 (Table 2). The agreement between the original and blinded off-medication UPDRS III ratings was good (Lin’s r = 0.75 and 0.70 for initial programming and six-month outcomes, respectively). The change in caregiver strain, as measured by MCSI, did not differ between the caregivers in the MAP DBS-aided programming arm (−3.4 ± 9.3) and those in the SOC arm (−0.4 ± 5.6, p = 0.48) (Table 2).
Safety
We recorded 21 severe or moderate adverse events (AEs) for patients allocated to SOC, and 15 for those allocated to the MAP DBS arm. SOC patients experienced four serious adverse events (SAEs), two of which led to withdrawal of the patient. MAP DBS-aided patients experienced three SAEs, one of which was deemed related to study procedures (the patient accidentally deactivated the DBS device). No MAP DBS-aided patients were withdrawn because of an SAE (Supplemental Table 1).
Constrained MAP DBS protocol sub-analysis
A sub-analysis revealed that patients programmed under the more restrictive MAP DBS usage protocol did not substantially differ from the total population. We found no statistically significant difference in mean total programming time between the constrained-protocol patients allocated to SOC (92.1 ± 60.8 minutes, n = 18) and those allocated to MAP DBS programming (68.1 ± 35.2 minutes, n = 17, p-value = 0.58). Just as in the full analysis, patients in the MAP DBS-aided arm experienced a greater on-medication UPDRS III change from baseline (−8.6 ± 8.6, n = 12) than those in SOC (−5.2 ± 6.1, n = 13, p-value = 0.01). There was no statistically significant difference between the SOC patients (−18.1 ± 9.7, n = 13) and MAP DBS-aided patients (−21.9 ± 14.7, n = 14, p-value = 0.99) in the change from baseline in UPDRS III off-medication scores.
Discussion
The effect of MAP DBS on DBS programming
We conducted an open-label randomized trial on a mobile visual decision support tool for DBS programming. MAP DBS increased the efficiency of the initial programming visit, although it did not significantly affect the overall total programming time during the study period. Additionally, MAP DBS helped expert clinicians identify more effective stimulation settings for patients, as indicated by greater improvement of on-medication UPDRS III scores. The integration of MAP DBS into the workflows of six clinical sites revealed the versatility and scalability of the platform. Our results support MAP DBS as an effective and simple-to-use programming tool, which will be further tested in our second MAP DBS study using a home health care model.
DBS programming time is a challenging outcome to influence in part because DBS programming has no formal stopping criteria; clinicians often explore different settings until they believe a good setting for the patient has been reached. Additionally, the clinical efficacy of DBS is highly variable.2,27,28 As a result, a DBS programming clinician may find it difficult to know if a patient is receiving optimal benefit from a stimulation setting. Although MAP DBS may help the programming clinician identify an effective setting, the clinician is still likely to spend an extended period of time iterating to ensure no better option is available. As most clinical appointments are scheduled for fixed amounts of time, clinicians may continue to iterate on parameters until their scheduled time with the patient ends.
The theory that programming time is inflated by extensive iteration is supported by the sub-analysis of the patients programmed with the constrained protocol, in which the clinician was allowed to select only programming settings that exclusively used the two electrodes they anticipated would be most effective after exploring the patient model. Despite eliminating over half of the possible stimulation settings on the device, no total programming time was saved. However, just as with the total population, the constrained MAP DBS patients experienced significantly larger improvement in UPDRS III on-medication scores than the SOC arm. This finding indicates that although clinicians were able to use MAP DBS to effectively narrow down the parameter space, programming was not more time-efficient. Future work should consider alternative metrics of programming efficiency, such as the number of settings tried before the optimal setting or the number of adjustments made by the patient using the at-home programmer, to gain insight into the impact of technology on the programming process.
One possible explanation for why MAP DBS-programmed patients experienced greater on-medication UPDRS III improvement without a statistical difference in off-medication scores is that DBS programming is typically performed in the off-medication state. Therefore, although experts are able to use traditional programming techniques to help alleviate off-medication symptoms, the visual feedback about stimulation location helped produce programming settings that enhance on-medication benefits. Another explanation is that MAP DBS decreased the complexity of DBS programming, which increased the clinicians’ capacity to focus on medication management. However, we found no differences between our study arms in change in medication as measured by LEDD.
MAP DBS-aided programming resulted in no differences between the two study arms in quality of life, as measured by PDQ-39, or in caregiver burden, as measured by the MCSI. These measures, which are only indirectly related to the effectiveness of the therapy an individual patient receives, are likely not sensitive enough to detect the small advantages yielded by MAP DBS-aided programming. Caregiver strain is a frequently overlooked aspect of DBS programming, as many PD patients cannot travel to clinical appointments without the assistance of a caregiver due to mobility disability.
Patient-specific visual models of activation in programming
The therapeutic effects of DBS result from a synergistic interaction between lead location and DBS programming, which consists of multiple independent parameters. However, DBS programming is usually performed without any visual reference to the lead location relative to the surrounding anatomical structures. Traditional DBS programming is an iterative process in which the clinician selects a stimulation setting, observes the symptom response, and elicits verbal feedback from the patient. The clinician uses this feedback to guide device setting changes, and the process is repeated until the patient and clinician agree on a “best” setting. Although attempts have been made to use systematic approaches to DBS programming,29–31 the process remains iterative, with no widely adopted standardized approach. Novel DBS device technologies have added to the number of options for a clinician managing DBS,32 and these additions have further complicated attempts to simplify programming strategies.
MAP DBS fills a critical gap in the typically opaque DBS programming process by providing information about the interaction between stimulation settings and local anatomy via an easy-to-use mobile interface. Computational models have previously been used to guide the DBS programming process;21,22 however, these models have had limited penetration into SOC clinical workflows. MAP DBS is a simple mobile computing platform, and our study demonstrates that it can be integrated into the DBS clinic at multiple centers. DBS device manufacturers have included visual models of activation in their modern programming platforms, but the inclusion of the patient’s anatomical lead location and medical imaging, together with compatibility with all DBS leads on the United States market, sets MAP DBS apart.
Programming is currently performed at most centers without tools to enable a deeper understanding of the effects of changes in stimulation parameters. MAP DBS provides a medium for clinician education and has the potential to foster a more nuanced understanding of the relationship between stimulation and clinical effect. The MAP DBS tool does not make decisions, but it does inform them. As clinicians become familiar with this decision support tool, it has the potential to further enhance and inform care. Having a visual reference for the anatomical location of a patient’s DBS lead can transform traditionally complex programming tasks into simple ones. A clinician could, for example, immediately identify a suboptimally placed lead in need of revision without going through months of fruitless empirical programming. Similarly, a clinician can immediately identify a reasonably placed DBS lead within a target and commence programming by constraining the setting choices based on visual feedback. Because our study was conducted on only a subset of DBS patients at each clinical site (newly implanted, diagnosed with PD, consenting research subjects), the clinicians using MAP DBS were not able to incorporate the platform into their everyday programming workflows. Increased use of and familiarity with MAP DBS may lead clinicians to place greater trust in the platform, with the benefit of further time savings. The impact of daily use of the platform on the programming process should be investigated in future work.
The benefits of a decision-making tool such as MAP DBS likely stem from providing critical information early in the management algorithm, as indicated by the time savings during the first DBS programming visit, which can often be lengthy and tiring for patients. Additionally, in a second study we tested whether the simplicity of the tool could facilitate programming by a nonexpert in the home health setting. Tools such as MAP DBS may, in the future, enhance access for patients and reduce the need for specialist-driven care.
MAP DBS and emerging DBS technology
MAP DBS as a platform provides static predictions based on medical imaging. Ideally, MAP DBS will evolve beyond a system that provides visual models requiring clinician interpretation into a platform that recommends stimulation settings based on verified predictors of effective stimulation. Recent imaging-based research has shown that PD DBS efficacy is predicted by the brain networks modulated by a patient’s stimulation,33 and emerging evidence indicates an ideal stimulation location “hot spot” for STN DBS.19,20 Integration of imaging biomarkers into MAP DBS may allow for a future in which stimulation settings are predicted using existing optimization algorithms.16,34 Beyond imaging-based metrics, emerging DBS technologies provide avenues to create platforms that deliver real-time feedback on patient status. Wearable technologies have proven to be an effective way to continuously monitor changes in patient status,35 and they provide real-time data that can be used to help guide DBS parameter selection.36–38 Next-generation commercial DBS devices emerging on the market can measure patient physiology via local field potentials, which have been shown to be good biomarkers for stimulation parameter selection.39–42 A future mobile computing platform could serve as a hub for remote monitoring of multimodal patient data and integrate those data to predict DBS settings. Such technology would not only improve the quality of patient care but also facilitate at-home management by providing clinicians with remote monitoring of patient status.
Beyond improvements in DBS monitoring technologies, DBS systems are becoming more feature rich with regard to how stimulation can be delivered. Features such as multiple independent current sources and directional DBS leads provide increased resolution in parameter selection, creating an opportunity to deliver more precise stimulation to the patient. However, the increasing complexity of these technologies will provide additional patient benefit only if they can be efficiently programmed. Although expert centers are adapting,43,44 technological advances are increasing the specialization required to perform optimal DBS programming, and patients who cannot access expert care are being left further behind. We argue that unless tools such as MAP DBS are developed, advancements in DBS technology will not deliver on their promises for most DBS patients. Our study enrolled patients with all six PD DBS leads currently on the market in the United States, and future work should aim to determine whether our platform provides additional benefit for programming devices with advanced features.
Limitations
Neither our programming clinicians nor our study patients were blinded, although UPDRS ratings were performed from video by blinded raters. The open-label design was largely a matter of necessity, because there was no practical way to blind the patient to the clinician’s use of MAP DBS during programming. Although our blinded and unblinded UPDRS III scores showed strong concordance, the study would have been strengthened by using a blinded rater for all UPDRS III evaluations. Because of the multicenter nature of our study and differences in clinical practice between institutions, several factors likely influenced programming time, including institutional differences in SOC programming. We accounted for anticipated differences in programming time by stratifying our randomization blocks and using a randomized-block ANOVA as our statistical model. Although randomization was conducted as originally planned, there were notable differences between groups; with samples of this size, randomization is unlikely to match on every factor. Similarly, there was a slightly higher loss/withdrawal rate in the MAP DBS group than in the SOC group, and the sample size makes it difficult to speculate on the cause of this difference. As stated, rigidity was eliminated from UPDRS III scoring because it was difficult to assess on video. The MAP DBS tool helped to identify suboptimally placed leads; however, in the current study we did not collect data about misplaced leads, and future studies could systematically investigate suboptimally placed leads using the MAP DBS software. Finally, MAP DBS provides visual models of activation, but it relies on the clinician to integrate this information into the DBS programming process.
Clinician experience and formal education likely influence how the programming clinician interacts with the platform, and future research should identify how to create a system that provides the most benefit to programmers of all backgrounds.
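The strong concordance noted above can be made precise. Assuming it was quantified with Lin’s concordance correlation coefficient (cited as reference 25; the exact statistic used is our inference from the reference list), the coefficient for paired blinded ratings $x$ and unblinded ratings $y$ is:

```latex
% Lin's concordance correlation coefficient (Lin, 1989; reference 25)
% x = blinded UPDRS III scores, y = unblinded UPDRS III scores
% \rho = Pearson correlation; \mu, \sigma = means and standard deviations
\rho_c = \frac{2\rho\,\sigma_x \sigma_y}{\sigma_x^2 + \sigma_y^2 + \left(\mu_x - \mu_y\right)^2}
```

Unlike Pearson’s $\rho$, which measures only linear association, $\rho_c$ is penalized by the $(\mu_x - \mu_y)^2$ term for any systematic offset between raters, so $\rho_c = 1$ only when the two sets of ratings agree exactly.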
In conclusion, we showed that MAP DBS can reduce the time spent at the initial DBS programming visit compared to SOC procedures, which have traditionally been burdensome and time consuming. As new DBS systems with added complexity become available, the need to improve the DBS programming process becomes more critical. The results of this study demonstrate that technology can positively impact DBS programming, and they should be the launching point for future studies to improve DBS programming for patients, caregivers, and clinicians.
Supplementary Material
Figure 3:
Programming time. A) Left: cumulative programming time; bold lines represent mean values. Right: comparison of programming time for the initial programming visit and all other programming visits. The difference in cumulative programming time between patients allocated to SOC and those allocated to MAP DBS-aided programming was not significant, but the time savings during the initial programming visit were.
Acknowledgments
We thank Dr. Jens Krüger for his collaboration in the development of the ImageVis3D Mobile iOS application. We thank Matt Barabas, Derek Ridgeway, Erin Monari, Julie Segura, Raisa Syed, Suneel Kumar, Maria Banvard, Jessica Dimos, Jennifer Love, Jessica Tate, Elizabeth Sanguinetti, Melissa Butson, Elizabeth Nuttall, and Theresa Lins for their help collecting data and managing the study. This research was funded by the National Institute of Nursing Research (NIH: NR014852).
Funding:
National Institute of Nursing Research (NIH: NR014852)
Footnotes
Potential Conflicts of Interest
There are potential conflicts of interest related to Medtronic, Boston Scientific, and Abbott, the manufacturers of the DBS systems implanted in the study subjects. The University of Florida has received grants from Medtronic, Boston Scientific, and Abbott, but the authors have a financial interest in these grants. ARZ serves as a consultant for Medtronic and Boston Scientific. IUH has performed research for Boston Scientific and has consulted for Boston Scientific and Medtronic. JLO has received grant support from Boston Scientific and Medtronic, has consulted for Medtronic and Abbott, and has received non-financial research support from Boston Scientific. MSS has received grant support from Boston Scientific and honoraria/non-financial support as a scientific advisor from Boston Scientific. As the director of the Movement Disorders Fellowship at the University of Florida, CWH has received industry grants from Medtronic, Boston Scientific, and Abbott for educational support of the fellowship program; these grants are paid directly to the University of Florida and used solely for fellow salary support. MHP has received consultation fees from Medtronic, Boston Scientific, and Abbott. MSL has received consulting fees from Boston Scientific. CRB has received consulting fees from Boston Scientific and Abbott, and he holds intellectual property related to DBS. KDF has received occasional consulting fees from Medtronic and Boston Scientific for DBS-related work. The University of Florida has received implantable devices from Medtronic for KDF’s DBS-related research, but not for this trial. The University of Florida receives partial funding for KDF’s functional neurosurgery fellowship from Medtronic. KDF holds three DBS-related patents for which he has received no royalties. KDF has participated as a site implanting surgeon in multicenter DBS-related research studies sponsored by Abbott and Boston Scientific. MV has received consultation fees from Medtronic.
PZ has received honoraria as a consultant and advisory panel member for Medtronic. JJS has received research support from Medtronic and Abbott, and has received consulting fees from Medtronic, Abbott, and Boston Scientific.
MSO, EM, BJL, CCA, SC, AT, GD, JLW, CG, AB, NN, DS, JC, MH, MA, and AS have nothing to disclose.
References
- 1. Deuschl G, Schade-Brittinger C, Krack P, et al. A randomized trial of deep-brain stimulation for Parkinson’s disease. N Engl J Med 2006; 355: 896–908.
- 2. Follett KA, Weaver FM, Stern M, et al. Pallidal versus subthalamic deep-brain stimulation for Parkinson’s disease. N Engl J Med 2010; 362: 2077–91.
- 3. Okun MS, Fernandez HH, Wu SS, et al. Cognition and mood in Parkinson’s disease in subthalamic nucleus versus globus pallidus interna deep brain stimulation: the COMPARE trial. Ann Neurol 2009; 65: 586–95.
- 4. Benabid AL, Pollak P, Gao D, et al. Chronic electrical stimulation of the ventralis intermedius nucleus of the thalamus as a treatment of movement disorders. J Neurosurg 1996; 84: 203–14.
- 5. Morrell MJ. Responsive cortical stimulation for the treatment of medically intractable partial epilepsy. Neurology 2011; 77: 1295–304.
- 6. Greenberg BD, Giftakis JE, Rasmussen SA, et al. Deep brain stimulation of the ventral internal capsule/ventral striatum for obsessive-compulsive disorder: worldwide experience. Mol Psychiatry 2008; 15: 64–79.
- 7. Vidailhet M, Vercueil L, Houeto J-L, et al. Bilateral deep-brain stimulation of the globus pallidus in primary generalized dystonia. N Engl J Med 2005; 352: 459–67.
- 8. Martinez-Ramirez D, Jimenez-Shahed J, Leckman JF, et al. Efficacy and safety of deep brain stimulation in Tourette syndrome: the International Tourette Syndrome Deep Brain Stimulation Public Database and Registry. JAMA Neurol 2018; 75: 353–9.
- 9. Laxton AW, Tang-Wai DF, McAndrews MP, et al. A phase I trial of deep brain stimulation of memory circuits in Alzheimer’s disease. Ann Neurol 2010; 68: 521–34.
- 10. Lozano AM, Giacobbe P, Hamani C, et al. A multicenter pilot study of subcallosal cingulate area deep brain stimulation for treatment-resistant depression: clinical article. J Neurosurg 2012; 116: 315–22.
- 11. Okun MS, Foote KD. Parkinson’s disease DBS: what, when, who and why? The time has come to tailor DBS targets. Expert Rev Neurother 2010; 10: 1847–57.
- 12. Okun MS, Fernandez HH, Pedraza O, et al. Development and initial validation of a screening tool for Parkinson disease surgical candidates. Neurology 2004; 63: 161–3.
- 13. Rolston JD, Englot DJ, Starr PA, Larson PS. An unexpectedly high rate of revisions and removals in deep brain stimulation surgery: analysis of multiple databases. Park Relat Disord 2016; 33: 72–7.
- 14. Okun MS, Tagliati M, Pourfar M, et al. Management of referred deep brain stimulation failures: a retrospective analysis from 2 movement disorders centers. Arch Neurol 2005; 62: 1250–5.
- 15. Moro E, Poon Y-YW, Lozano AM, Saint-Cyr JA, Lang AE. Subthalamic nucleus stimulation: improvements in outcome with reprogramming. Arch Neurol 2006; 63: 1266.
- 16. Anderson DN, Osting B, Vorwerk J, Dorval AD, Butson CR. Optimized programming algorithm for cylindrical and directional deep brain stimulation electrodes. J Neural Eng 2018; 15: 026005.
- 17. Vorwerk J, Brock A, Anderson DN, Rolston JD, Butson CR. A retrospective evaluation of automated optimization of deep brain stimulation parameters. J Neural Eng 2019; 16: 064002.
- 18. Bronstein JM, Tagliati M, Alterman RL, et al. Deep brain stimulation for Parkinson disease: an expert consensus and review of key issues. Arch Neurol 2011; 68: 165–71.
- 19. Butson CR, Cooper SE, Henderson JM, Wolgamuth B, McIntyre CC. Probabilistic analysis of activation volumes generated during deep brain stimulation. Neuroimage 2011; 54: 2096–104.
- 20. Dembek TA, Roediger J, Horn A, et al. Probabilistic sweet spots predict motor outcome for deep brain stimulation in Parkinson disease. Ann Neurol 2019; 86: 527–38.
- 21. Pourfar MH, Mogilner AY, Farris S, et al. Model-based deep brain stimulation programming for Parkinson’s disease: the GUIDE pilot study. Stereotact Funct Neurosurg 2015; 93: 231–9.
- 22. Frankemolle AMM, Wu J, Noecker AM, et al. Reversing cognitive–motor impairments in Parkinson’s disease patients using a computational modelling approach to deep brain stimulation programming. Brain 2010; 133: 746–61.
- 23. Butson CR, Tamm G, Jain S, Fogal T, Krüger J. Evaluation of interactive visualization on mobile computing platforms for selection of deep brain stimulation parameters. IEEE Trans Vis Comput Graph 2013; 19: 108–17.
- 24. Rubin DB. Multiple Imputation for Nonresponse in Surveys. J Mark Res 1989; 26: 485.
- 25. Lin LI-K. A concordance correlation coefficient to evaluate reproducibility. Biometrics 1989; 45: 255–68.
- 26. Kieser M, Friede T. Simple procedures for blinded sample size adjustment that do not affect the type I error rate. Stat Med 2003; 22: 3571–81.
- 27. Odekerken VJJ, van Laar T, Staal MJ, et al. Subthalamic nucleus versus globus pallidus bilateral deep brain stimulation for advanced Parkinson’s disease (NSTAPS study): a randomised controlled trial. Lancet Neurol 2013; 12: 37–44.
- 28. Okun MS, Fernandez HH, Wu SS, et al. Cognition and mood in Parkinson’s disease in subthalamic nucleus versus globus pallidus interna deep brain stimulation: the COMPARE trial. Ann Neurol 2009; 65: 586–95.
- 29. Volkmann J, Herzog J, Kopper F, Deuschl G. Introduction to the programming of deep brain stimulators. Mov Disord 2002; 17 Suppl 3: S181–7.
- 30. Volkmann J, Moro E, Pahwa R. Basic algorithms for the programming of deep brain stimulation in Parkinson’s disease. Mov Disord 2006; 21: 284–9.
- 31. Picillo M, Lozano AM, Kou N, Puppi Munhoz R, Fasano A. Programming deep brain stimulation for Parkinson’s disease: the Toronto Western Hospital algorithms. Brain Stimul 2016; 9: 425–37.
- 32. Schüpbach WMM, Chabardes S, Matthies C, et al. Directional leads for deep brain stimulation: opportunities and challenges. Mov Disord 2017; 32: 1371–5.
- 33. Horn A, Reich M, Vorwerk J, et al. Connectivity predicts deep brain stimulation outcome in Parkinson’s disease. Ann Neurol 2017: 1–39.
- 34. Pena E, Zhang S, Patriat R, et al. Multi-objective particle swarm optimization for postoperative deep brain stimulation targeting of subthalamic nucleus pathways. J Neural Eng 2018; 15: 066020.
- 35. Joshi R, Bronstein JM, Keener A, et al. PKG movement recording system use shows promise in routine clinical care of patients with Parkinson’s disease. Front Neurol 2019; 10: 1–11.
- 36. Pulliam CL, Heldman DA, Orcutt TH, Mera TO, Giuffrida JP, Vitek JL. Motion sensor strategies for automated optimization of deep brain stimulation in Parkinson’s disease. Park Relat Disord 2015; 21: 378–82.
- 37. Mera T, Vitek JL, Alberts JL, Giuffrida JP. Kinematic optimization of deep brain stimulation across multiple motor symptoms in Parkinson’s disease. J Neurosci Methods 2011; 198: 280–6.
- 38. Malekmohammadi M, Herron J, Velisar A, et al. Kinematic adaptive deep brain stimulation for resting tremor in Parkinson’s disease. Mov Disord 2016; 31: 426–8.
- 39. Swann NC, De Hemptinne C, Miocinovic S, et al. Chronic multisite brain recordings from a totally implantable bidirectional neural interface: experience in 5 patients with Parkinson’s disease. J Neurosurg 2018; 128: 605–16.
- 40. Tinkhauser G, Pogosyan A, Debove I, et al. Directional local field potentials: a tool to optimize deep brain stimulation. Mov Disord 2018; 33: 159–64.
- 41. Torrecillos F, Tinkhauser G, Fischer P, et al. Modulation of beta bursts in the subthalamic nucleus predicts motor performance. J Neurosci 2018; 38: 8905–17.
- 42. Horn A, Neumann W-J, Degen K, Schneider G-H, Kühn AA. Toward an electrophysiological “sweet spot” for deep brain stimulation in the subthalamic nucleus. Hum Brain Mapp 2017; 38: 3377–90.
- 43. Dembek TA, Reker P, Visser-Vandewalle V, et al. Directional DBS increases side-effect thresholds: a prospective, double-blind trial. Mov Disord 2017; 32: 1380–8.
- 44. Vitek JL, Jain R, Chen L, et al. Subthalamic nucleus deep brain stimulation with a multiple independent constant current-controlled device in Parkinson’s disease (INTREPID): a multicentre, double-blind, randomised, sham-controlled study. Lancet Neurol 2020; 19: 491–501.
- 45. Avants B, Tustison N, Song G. Advanced Normalization Tools (ANTS). Insight J 2009; 2: 1–35.
- 46. Avants BB, Epstein CL, Grossman M, Gee JC. Symmetric diffeomorphic image registration with cross-correlation: evaluating automated labeling of elderly and neurodegenerative brain. Med Image Anal 2008; 12: 26–41.
- 47. Xiao Y, Fonov V, Bériault S, et al. Multi-contrast unbiased MRI atlas of a Parkinson’s disease population. Int J Comput Assist Radiol Surg 2015; 10: 329–41.
- 48. Duffley G, Anderson DN, Vorwerk J, Dorval AD, Butson CR. Evaluation of methodologies for computing the deep brain stimulation volume of tissue activated. J Neural Eng 2019; 16: 066024.
- 49. Butson CR, McIntyre CC. Role of electrode design on the volume of tissue activated during deep brain stimulation. J Neural Eng 2006; 3: 1–8.
- 50. Carnevale NT, Hines ML. The Neuron Book. Cambridge, United Kingdom: Cambridge University Press, 2008. DOI: 10.1017/CBO9780511541612.