npj Digital Medicine. 2022 Mar 18;5:32. doi: 10.1038/s41746-022-00568-y

Co-evolution of machine learning and digital technologies to improve monitoring of Parkinson’s disease motor symptoms

Anirudha S Chandrabhatla 1, I Jonathan Pomeraniec 2,3, Alexander Ksendzovsky 4
PMCID: PMC8933519  PMID: 35304579

Abstract

Parkinson’s disease (PD) is a neurodegenerative disorder characterized by motor impairments such as tremor, bradykinesia, dyskinesia, and gait abnormalities. Current protocols assess PD symptoms during clinic visits and can be subjective. Patient diaries can help clinicians evaluate at-home symptoms, but can be incomplete or inaccurate. Therefore, researchers have developed in-home automated methods to monitor PD symptoms to enable data-driven PD diagnosis and management. We queried the US National Library of Medicine PubMed database to analyze the progression of the technologies and computational/machine learning methods used to monitor common motor PD symptoms. A subset of the roughly 12,000 papers returned was reviewed, chosen to best characterize the machine learning and technology timelines that emerged from the literature. The technology used to monitor PD motor symptoms has advanced significantly in the past five decades. Early monitoring began with in-lab devices such as needle-based EMG, transitioned to in-lab accelerometers/gyroscopes, then to wearable accelerometers/gyroscopes, and finally to smartphone- and mobile/web-application-based in-home monitoring. Significant progress has also been made with respect to the use of machine learning algorithms to classify PD patients. Using data from different devices (e.g., video cameras, phone-based accelerometers), researchers have designed neural network and non-neural network-based machine learning algorithms to categorize PD patients across tremor, gait, bradykinesia, and dyskinesia. The five-decade co-evolution of technology and computational techniques used to monitor PD motor symptoms has driven significant progress that is enabling the shift from in-lab/clinic to in-home monitoring of PD symptoms.

Subject terms: Translational research, Learning algorithms, Technology

Introduction

Parkinson’s disease (PD) is a complex neurodegenerative disorder commonly characterized by motor impairments such as tremor, bradykinesia, dyskinesia, and gait abnormalities1. Proper assessment of PD motor impairments is vital for clinical management of the disease2,3. Appropriate timing of dopaminergic medications4 to avoid sudden increases in symptom severity5 and selection for interventions such as deep brain stimulation6 both require a precise understanding of symptom fluctuations in patients with PD. In addition, objective characterization of non-motor manifestations of PD such as sleep disorders, gastrointestinal symptoms, and psychiatric symptoms is needed to understand long-term disease progression3.

Characterization of motor and non-motor PD symptoms traditionally relied on the Unified Parkinson’s Disease Rating Scale (UPDRS), a PD severity rating system with four parts related to (I) Mentation, Behavior and Mood, (II) Activities of Daily Living, (III) Motor, and (IV) Complications7. The UPDRS was eventually updated by the Movement Disorder Society (MDS), creating the MDS-UPDRS, in an attempt to reduce subjectivity in the scale8. Clinicians also use other rating systems such as the WHIGET Tremor Rating Scale for action tremor9 and the Modified Bradykinesia Rating Scale (MBRS) for bradykinesia10. However, these rating systems suffer from two main flaws. First, they lack granularity during disease or medication cycles, as they only provide a snapshot of a patient’s symptoms as seen during in-clinic visits; when assessing PD symptoms outside of the clinic, physicians must rely on patient diaries or recall, which can be inaccurate2. Second, these rating systems are inherently subjective, leading to high inter- and intra-rater variability3.

Addressing these flaws is vital to ensure proper diagnosis and management of patients with PD. To that end, considerable efforts have been made to develop objective, at-home, and automated methods to monitor the main motor symptoms characteristic of PD. Leveraging motion sensor and, in some instances, video-based technologies can first enable physicians to take data-driven approaches to PD diagnoses. Adding at-home patient monitoring through smart devices (e.g., smartphones, watches) could then enable physicians to adjust treatment plans based on patient activity data. The end goal of these technologies is to achieve continuous, at-home monitoring, which will require continued research using data from at-home, continuous studies, rather than applying laboratory data to develop at-home solutions. This review aims to summarize the co-evolution of the technologies and computational methods used to assess and monitor common motor symptoms of PD such as tremor, gait abnormalities, bradykinesia, and dyskinesia.

Review of literature

Technology

The technology used to diagnose and monitor PD has evolved significantly over time (Fig. 1A). Most notably, this technology has progressed from laboratory to at-home/everyday settings, enabling more robust data collection related to PD symptoms. This section describes this progression, along with the purpose and advantages of different technologies.

Fig. 1. A 50+ year timeline illustrating the progression of (A) the technology and (B) the computational and machine learning techniques used to assess and monitor symptoms in patients with PD.

Fig. 1

A In the 1970s, the main technologies used were lab-based, such as EMG and potentiometer measurements. Adoption of lab-based accelerometers began in the late 1980s and continued until the early 2000s when smaller devices such as tablets and wearable accelerometers started being leveraged. Since the late 2010s, smart devices and apps on those devices were the primary technologies used for symptom monitoring. Over time, the evolution of technology has enabled greater and more continuous data collection. B Since the 1970s, computational and statistical techniques such as frequency domain analyses of accelerometer data have enabled researchers and clinicians to quantify symptom severity in patients with PD. Improvements in technologies used to monitor symptoms have enabled increased data collection, allowing for the growth in adoption of machine learning techniques. Supervised techniques were applied first to analyze symptom data, followed by unsupervised techniques.

Laboratory-based technologies to assess PD symptoms had two main purposes: (1) develop methodologies to diagnose PD/categorize severity (e.g., distinguish PD from similar neurological conditions) and (2) set the foundation for smaller, more portable, and more user-friendly technologies that could assist in PD diagnosis and monitoring in the future (Table 1).

Table 1.

Progression of technology used to monitor and assess PD symptoms in laboratory/clinic settings.

Authors, Years Device Primary Symptom(s) Measured
Andrews et al.13 Surface EMG Freezing of gait
Milner-Brown et al.13 EMG Bradykinesia
Bathien et al.11 EMG Resting tremor and dyskinesia
Hacisalihzade et al.14 In-lab potentiometer-based motion tracker Bradykinesia while tracking moving target
van Hilten et al.44 Accelerometer on non-dominant wrist Continuous monitoring of tremor and dyskinesia
Weller et al.16 Infrared-based shoe sensor Straight-line gait
Beuter et al.15 In-lab laser-based system Resting and action tremor
Deuschl et al.141 Monoaxial accelerometer Resting tremor
Dunnewold et al.24 Tri-axial accelerometer Bradykinesia
Someren et al.142 Uniaxial accelerometer Tremor
Dunnewold et al.22 Uniaxial accelerometer Bradykinesia and hypokinesia
Spyers-Ashby et al.19 Tri-axial accelerometer Postural tremor
Giovannoni et al.38 Computer keyboard to administer the BRAIN TEST Bradykinesia while alternately striking computer keys for a period of 60 seconds.
Rajaraman et al.20 Tri-axial electromagnetic sensors Resting, postural, and intention tremor. Also included distraction and mental stress conditions.
Manson et al.35 Tri-axial accelerometer on shoulder Dyskinesia in multiple conditions (e.g., sitting, writing)
O’Suilleabhain et al.17 Electromagnetic motion tracking system Quantitative tremor assessment in multiple conditions (e.g., arms horizontal and straight ahead, shoulders abducted to 90°)
Hoff et al.23 Bi-axial accelerometers Dyskinesia during rest, talking, stress, and four activities of daily life (ADL)
Burne et al.45 Tri-axial accelerometer and surface EMG Resting and postural tremor
Hoff et al.21 Uniaxial accelerometers “On” and “off” tremor states
Sekine et al.143 Tri-axial accelerometer and photoelectric sensor Gait
Salarian et al.25 Tri-axial gyroscope Bradykinesia and tremor while performing activities of daily life (e.g., brushing hair and teeth, putting on and taking off a jacket and shoes)
Allen et al.39 Computer with videogame joystick and steering wheel Bradykinesia while using videogame joystick and steering wheel
Rao et al.43 Video Dyskinesia (face and neck) during speech task
Giansanti et al.144 Force sensor/step counter Gait
Salarian et al.29 Tri-axial accelerometer and gyroscope Straight-line gait with turning
Mancini et al.145 Tri-axial accelerometers and gyroscopes. Force plate Bradykinesia
Bachlin et al.41 Accelerometer and headphones for audio cues Freezing of gait
Cole et al.46 Tri-axial accelerometer and surface EMG Scripted (e.g., tooth-brushing) and unscripted action tremor
Espay et al.40 4 m electronic walkway. VR goggles and earphones Straight-line gait with or without feedback from goggles and earphones
Papapetropoulos et al.48 Tremor pen with bi-axial accelerometer, touch recording plate, reaction time handle, and force plate Postural and action tremor (with distraction conditions), reaction time, and postural stability
Mancini et al.146 Tri-axial accelerometer and gyroscope. Force plate Gait (via postural sway)
Heldman et al.27 KinetiSense motion sensor on heel Bradykinesia
Tsipouras et al.47 Tri-axial accelerometers and gyroscopes Action tremor in scripted conditions (e.g., rising from bed and sitting on chair)
Mera et al.26 Tri-axial accelerometer and gyroscope Bradykinesia and tremor in multiple conditions (e.g., rest, repetitive finger-tapping)
Moore et al.30 7 inertial measurement units Freezing of gait from timed up-and-go tasks
Tripoliti et al.33 6 accelerometers and gyroscopes Freezing of gait during simulated activities of daily life
Morris et al.147 Animations generated from inertial sensors Freezing of gait
Zach et al.34 Tri-axial linear waist-mounted accelerometer Freezing of gait during walking tasks
Ginis et al.148 Inertial measurement units and smartphone app Gait
Phan et al.28 Tri-axial accelerometer, gyroscope, and compass Axial bradykinesia in multiple conditions (e.g., pouring water from a jug into 9 cups)
Pulliam et al.36 Kinesia motion sensor (tri-axial accelerometer and gyroscope) on each wrist and ankle. Motor fluctuations (“ON” vs “OFF” state) during simulated activities of daily life
Rodriguez-Molinero et al.37 Waist-worn accelerometer, gyroscope, magnetometer Dyskinesia while performing activities of daily life (e.g., brushing teeth, drying a glass)
Reches et al.32 Opal sensors (tri-axial accelerometer, gyroscope, and magnetometer) Freezing of gait during walking tasks
Lee et al.42 Google Glass Freezing of gait during walking tasks
Mancini et al.31 Opal sensors (tri-axial accelerometer, gyroscope, and magnetometer) Freezing of gait during walking tasks and during activities of daily life

All data were collected in controlled environments (e.g., laboratories, hospitals).

Laboratory-based electromyography (EMG) techniques were among the first technologies used to assess PD. More specifically, the data collected using these techniques was primarily meant to help distinguish/diagnose PD from similar conditions or quantify disease progression. In 1984, Bathien et al. quantified tremor of the head, hands, and lower extremities with EMG11. The group found that analyzing phase-shifts between bursts of EMG activity in agonist-antagonist muscles enabled categorization between the tremor seen in PD and that of tardive dyskinesia, creating one of the first quantitative methodologies for distinguishing PD from other conditions. In-lab EMG was also leveraged to quantify and monitor gait abnormalities in patients with PD. EMG data helped distinguish between normal and “Parkinsonian” gait and quantify response to therapy over time12. Similar studies were conducted to assess other symptoms of PD. In 1979, Milner-Brown et al. reported that needle-based hand EMG detected abnormal motor unit properties during muscle contraction that could be used to track progression of bradykinesia13. Of note, these EMG-based techniques were not meant for making initial PD diagnoses, but were rather used for tracking progression of already established disease.

Starting in the latter half of the 1980s, researchers began moving past EMG and towards less invasive methods. The technologies developed during this era also attempted to diagnose PD in addition to monitoring/quantifying symptoms. The initial techniques developed varied widely. Some groups tested potentiometer-based systems that could monitor multiple symptoms at once14, allowing for “one stop” assessments of, for example, how patients were responding to pharmacologic therapy. Laser-based technologies were also popular. Beuter et al. developed a laser system that could measure hand movements to distinguish between healthy controls and patients with PD15, while Weller et al. developed a system to track how gait abnormalities changed in response to various medications16. Though these technologies were less invasive and more portable than EMG, their use was often limited to special laboratory environments (e.g., areas with laser-safety equipment) and required significant expertise to operate17. Accelerometers and gyroscopes addressed both of these concerns, thereby solidifying them as two of the main technologies that defined the next era of PD monitoring. The use of accelerometers and gyroscopes enabled increased data collection, which improved the granularity with which researchers were able to monitor and assess patients with PD. Early use of accelerometers and gyroscopes collected in-lab data in one axis and looked to differentiate between PD and other conditions. Deuschl et al. used a monoaxial accelerometer to demonstrate that time series analysis alone was sufficient to differentiate between PD and essential tremor18. The use of tri-axial accelerometers and gyroscopes improved classification accuracy and allowed for more robust in-lab measurements. The tri-axial accelerometers employed by Spyers-Ashby et al. in 1999 led to greater than 60% classification accuracy among controls and patients with essential tremor, multiple sclerosis, and PD19.
Additionally, Rajaraman et al. demonstrated that using an increased number of tri-axial accelerometers on various parts of the hand, forearm, and arm allowed for quantification of tremor despite altered hand positions and orientations20. Seminal studies by the van Hilten group also demonstrated that tri-axial accelerometry was beneficial in identifying and characterizing tremor, bradykinesia, and dyskinesia21–24.
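The frequency-domain reasoning behind these tremor studies is simple to sketch: parkinsonian rest tremor typically concentrates its power around 4–6 Hz, while essential tremor tends to sit higher, so the dominant spectral peak of an accelerometer trace is itself a discriminating feature. The following is a minimal illustrative sketch (not the method of any cited study); the sampling rate, band limits, and synthetic signal are assumptions:

```python
import math

def band_power(signal, fs, freq):
    """Power of a single DFT component at `freq` (naive single-bin DFT)."""
    n = len(signal)
    re = sum(x * math.cos(2 * math.pi * freq * i / fs) for i, x in enumerate(signal))
    im = sum(x * math.sin(2 * math.pi * freq * i / fs) for i, x in enumerate(signal))
    return (re * re + im * im) / n

def dominant_frequency(signal, fs, f_lo=2.0, f_hi=12.0, step=0.5):
    """Scan a tremor-relevant band and return the frequency carrying the most power."""
    n_steps = int((f_hi - f_lo) / step) + 1
    freqs = [f_lo + k * step for k in range(n_steps)]
    return max(freqs, key=lambda f: band_power(signal, fs, f))

# Synthetic 5 Hz "parkinsonian" rest tremor, sampled at 100 Hz for 4 seconds
fs = 100.0
sig = [math.sin(2 * math.pi * 5.0 * t / fs) for t in range(400)]
peak = dominant_frequency(sig, fs)
```

In practice a real pipeline would window the signal and use an FFT-based power spectral density estimate, but the feature extracted (the location of the spectral peak) is the same.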

The use of wearable accelerometers and gyroscopes extended to quantifying other PD symptoms. Data from tri-axial accelerometers and gyroscopes on various parts of the body (e.g., wrists, index finger, back) allowed for models to estimate UPDRS scores and determine bradykinesia severity under both scripted and unscripted conditions25–28. Salarian et al. investigated using tri-axial accelerometers and gyroscopes along with inertial sensors to track postural instability and gait difficulty (PIGD sub-score of UPDRS III) during gait-assessment turning trials, reporting that patients with PD had significantly longer turning duration and delay before initiating a turn29. Similar findings were reported by Moore et al., who showed that freezing-of-gait (FoG) identification based on frequency characteristics of lower extremity motion correlated strongly (intraclass correlation >0.7) with clinical assessments by specialists30. The use of accelerometers to identify FoG has been reported by many other groups as well31–34. Multiple studies investigating dyskinesia severity used tri-axial accelerometers, gyroscopes, and/or magnetometers on various body parts (e.g., shoulder, wrist, ankle, waist) and found strong correlations between the magnitudes of dyskinesia measured by devices and those observed by clinicians35–37.
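Frequency-based FoG identification of the kind Moore et al. describe is often summarized as a "freeze index": the ratio of leg-acceleration power in a high "freeze" band to power in the normal locomotor band, with a freeze episode flagged when the ratio crosses a threshold. The band edges (3–8 Hz freeze, 0.5–3 Hz locomotor) and the synthetic signals below are illustrative assumptions, not the exact parameters of the cited studies:

```python
import math

def dft_power(signal, fs):
    """Naive DFT power spectrum: list of (frequency_hz, power) pairs up to Nyquist."""
    n = len(signal)
    spectrum = []
    for k in range(n // 2 + 1):
        re = sum(x * math.cos(2 * math.pi * k * i / n) for i, x in enumerate(signal))
        im = sum(x * math.sin(2 * math.pi * k * i / n) for i, x in enumerate(signal))
        spectrum.append((k * fs / n, (re * re + im * im) / n))
    return spectrum

def freeze_index(signal, fs):
    """Power in the 3-8 Hz 'freeze' band over power in the 0.5-3 Hz 'locomotor' band."""
    spectrum = dft_power(signal, fs)
    freeze = sum(p for f, p in spectrum if 3.0 <= f <= 8.0)
    locomotor = sum(p for f, p in spectrum if 0.5 <= f < 3.0)
    return freeze / max(locomotor, 1e-12)

fs = 50.0
walking = [math.sin(2 * math.pi * 1.5 * t / fs) for t in range(200)]    # ~1.5 Hz stepping
trembling = [math.sin(2 * math.pi * 6.0 * t / fs) for t in range(200)]  # ~6 Hz leg trembling
```

Normal walking keeps the index low because stepping power sits in the locomotor band, while the trembling that accompanies a freeze shifts power upward and drives the index well above one.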

At the same time, in-lab methodologies were being developed specific to quantifying and better understanding certain manifestations of PD. Unique to bradykinesia was the use of computer game-based technologies. In 1999, Giovannoni et al. introduced the BRAIN TEST as a computer-based way to monitor the progression of bradykinesia in PD. By requiring participants to use their index fingers to alternately strike the “S” and “;” keys on a standard computer keyboard, the BRAIN TEST provided a rapid and objective measurement of upper-limb motor function38. Allen et al. built upon Giovannoni’s work and developed a joystick and toy steering wheel-based computer test that was able to discriminate pathologic bradykinesia of varying severity39. Espay et al. studied the effect of virtual reality (VR) and audio-based gait feedback in identifying and correcting gait abnormalities in PD patients as they walked on an in-lab four meter GAITRite electronic walkway. Overall, nearly 70% of patients improved by at least 20% in either walking velocity, stride length, or both40. Bachlin et al. developed a similar correction-focused platform that detected FoG in patients with PD and provided audio cues to resume walking. The system detected FoG events in real-time with a sensitivity of 73% and specificity of 82%41. Visually cued FoG correction platforms have been developed using technologies such as Google Glass42. Finally, Rao et al. reported a video-based facial tracking algorithm that assessed severity of face and neck dyskinesia during a speech task. The calculated severity scores showed a high correlation to dyskinesia ratings by neurologists43.
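The keystroke-based bradykinesia measures above lend themselves to a simple sketch: from the timestamps of alternating key presses, count the strikes completed within the 60-second test window and summarize the inter-keystroke intervals, with slower and more variable tapping suggesting bradykinesia. The scoring below is an illustrative assumption, not Giovannoni et al.'s exact BRAIN TEST formulas, and the subject data are hypothetical:

```python
def kinesia_score(press_times, window=60.0):
    """Count of keystrokes completed within the test window (seconds)."""
    return sum(1 for t in press_times if t <= window)

def interval_stats(press_times):
    """Mean and variance of inter-keystroke intervals (a proxy for slowness
    and irregularity of alternating movement)."""
    gaps = [b - a for a, b in zip(press_times, press_times[1:])]
    mean = sum(gaps) / len(gaps)
    var = sum((g - mean) ** 2 for g in gaps) / len(gaps)
    return mean, var

# Hypothetical subjects: steady 4 taps/s vs slower, irregular tapping
healthy = [i * 0.25 for i in range(240)]
slowed = [i * 0.6 + (0.1 if i % 2 else 0.0) for i in range(100)]
healthy_ks = kinesia_score(healthy)
slowed_ks = kinesia_score(slowed)
```

The appeal of this class of test is that it needs no sensor beyond a standard keyboard, which is part of why later smartphone finger-tapping tasks adopted the same idea.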

Leveraging the data and analyses from in-lab studies, researchers began to develop methodologies for not just monitoring, but also diagnosing PD outside of the lab. Initial studies in this area included work by van Hilten et al. in which patients wore small accelerometers over the course of six days and completed quality of life surveys, enabling the first objective measures of dyskinesia44. Tremor analyses continued, incorporating progressively more wearable accelerometers and enabling accurate classification among patients with PD, patients with essential tremor, and controls while starting to step outside the boundaries of the lab45,46. Tsipouras et al. demonstrated that using multiple, wearable accelerometers and gyroscopes allowed for effective monitoring of patients while performing activities of daily life under real-life, but simulated, conditions47. Finally, using accelerometers embedded in a pen along with other sensors (e.g., touch recording plate), Papapetropoulos et al. showed the ability of multiple, small sensors to discriminate types of pathological tremor48.

Over the past decade, monitoring of PD symptoms has experienced two thematic changes. First, monitoring has become more remote and accessible due to the ease of use and widespread availability of more wearable accelerometers/gyroscopes and smartphones with those devices built-in. Second, monitoring has become more continuous through the use of web and mobile applications. Together, these changes are making way for more smart technology-mediated assessment of PD, with platforms for diagnosis currently in development (Table 2).

Table 2.

Progression of technology used to monitor and assess PD symptoms in the home setting.

Authors, Years Primary study setting Device Primary symptom(s) measured
Holmes et al.134 Controlled Professional microphone Voice recordings
Keijsers et al.68 Controlled (home-like) Tri-axial accelerometers Fluctuations in Levodopa-induced dyskinesia
Banaszkiewicz et al.64 Controlled Tablet with stylus Bradykinesia while drawing spirals
Patel et al.69 Controlled Uniaxial accelerometers Tremor, bradykinesia and dyskinesia severity and fluctuation
Chen et al.58 Home Wearable sensors, web services for live streaming and storage of data, and web-based graphical user interface client Home monitoring of tremor, bradykinesia, dyskinesia, medication compliance, and other qualitative patient data
Chen et al.58 Controlled In-lab video Straight-line gait
Kostikis et al.62 Controlled iPhone Postural tremor
Yang et al.49 Controlled Tri-axial accelerometer Straight-line gait
Cavanaugh et al.149 Home StepWatch 3 Step Activity Monitor Number of complete gait cycles completed
Cancela et al.59 Home Web interface, tri-axial accelerometers, and belt sensor with accelerometer and gyroscope Monitoring of tremor, medication (dose, time), meals (type of food, amount, time), and PDQ-39
Daneault et al.60 Controlled Smartphone with accelerometer Resting, postural, intention, and kinetic tremor
Klucken et al.50 Controlled Tri-axial gyroscopes and accelerometers Straight-line gait and lower extremity coordination (e.g., heel/toe tapping)
Cancela et al.59 Home Four tri-axial accelerometers. Waist-worn accelerometer and gyroscope. Gait and bradykinesia
Ferreira et al.54 Home Accelerometers and angular rate sensors, Wii Balance Board, SENSE-PARK app Feasibility of and compliance with continuous sensor-based monitoring
Fisher et al.55 Home Tri-axial accelerometers Feasibility of and compliance with continuous sensor-based monitoring.
Bank et al.76 Controlled In-lab video Bradykinesia
Heldman et al.53 Home Wireless motion sensor and touch screen tablet Resting and postural tremor and bradykinesia
Silva de Lima et al.65 Home “Fox Wearable Companion” app on a smartwatch and smartphone Continuous monitoring of tremor and qualitative surveys on quality of life
Lakshminarayana et al.150 Home Smartphone with uMotif app Motor: Bradykinesia Other: Sleep, mood, cognition
Rusz et al.79 Controlled Smartphone and professional microphone Voice recordings
Zhan et al.72 Home Smartphone Gait, finger tapping, and voice samples
Lo et al.61 Home Smartphone (7+ models were used) Falls, FoG, postural instability, cognitive impairment, difficulty doing hobbies, need for help at home
Prince et al.66 Home iPhone Dexterity, gait, phonation, and memory
Isaacson et al.151 Home Kinesia 360 motion sensor and mobile application Tremor, bradykinesia and dyskinesia severity and fluctuation
Aich et al.71 Controlled Tri-axial accelerometers Gait kinematic features
Erb et al.57 Controlled (home-like) and Home Accelerometers, gyroscopes, magnetometers, barometers, EKG, EMG, and/or galvanic skin response Feasibility of and compliance with continuous sensor-based monitoring. Dynamics of tremor, dyskinesia, and bradykinesia over the medication cycle
Evers et al.56 Home Accelerometer, gyroscope, magnetometer, barometer, galvanic skin response, photoplethysmogram, thermometer, and Android smartphone Gait abnormalities via unscripted, real-world collection
Ghoraani et al.70 Controlled (home-like) Tri-axial gyroscopes Fluctuations between medication ON and OFF states
Lu et al.74 Controlled Video Straight-line gait
Mahadevan et al.52 Controlled Tri-axial accelerometer, gyroscope, and magnetometer Identification of resting tremor constancy, bradykinesia MDS-UPDRS scores, and gait abnormalities. Assessment of medication state (“ON” or “OFF”) related changes in tremor
Pfister et al.73 Controlled Microsoft Band 2 (Tri-axial accelerometer and gyroscope along with Bluetooth capabilities) Fluctuations between “ON”, “OFF”, and “Dyskinetic” states
Sajal et al.80 Controlled Smartphone-based accelerometer Tremor and voice recordings
Singh et al.78 Home Smartphone Voice recordings
van Brummelen et al.63 Controlled 7 consumer product accelerometers (CPAs) (e.g., iPhone 7, iPod Touch 5) and laboratory-grade accelerometer (Biometrics ACL300) Postural and action tremor
Dominey et al.152 Home KinetiGraph wrist-worn movement recording system. Motor: tremor, bradykinesia, and dyskinesia. Other: Immobility/somnolence and medication adherence
Gatsios et al.135 Home Microsoft Band, sensor insoles, smartphone with an Android app, and a cloud backend Motor: Tremor, gait, weight-bearing Other: Heart rate, skin temperature, sleep quality/duration
Daneault et al.67 Home Smartwatch and smartphone with “Fox Wearable Companion” app Bradykinesia and tremor from continuous home monitoring and in-lab tasks (e.g., finger-to-nose, typing on a keyboard)
Marcante et al.51 Controlled (home-like) Insole sensors consisting of 13 pressure sensors and a tri-axial accelerometer Gait in scripted scenarios (e.g., rise from bed, walk to chair)
Powers et al.2 Home Apple Watch Fluctuations in resting tremor and dyskinesia. Tremor severity and presence of dyskinesia
Hadley et al.153 Home Smartphone and smartwatch Tremor, bradykinesia and dyskinesia severity and fluctuation
Sundgren et al.154 Home KinetiGraph wrist-worn movement recording system. Motor: tremor, bradykinesia, and dyskinesia. Other: Immobility/somnolence and medication adherence

Data were either collected in controlled environments and applied to the home setting or were directly collected in home settings.

Wearable sensors are making way for more remote assessment of PD symptoms. Yang et al. found that a single, small tri-axial accelerometer attached to the belt buckle enabled multiple gait parameters such as cadence, step regularity, stride regularity, and step symmetry to be estimated in real time, allowing for immediate quantification of gait49. Klucken et al. also reported the use of a small, heel-clipped device that achieved a classification accuracy of 81% in differentiating between PD patients and healthy controls50. More recently, a study of insole sensors enabled detection of PD-related FoG episodes with 90% accuracy51, and wrist-worn accelerometers achieved “good to strong” agreement with clinical ratings of resting tremor and bradykinesia, in addition to discriminating between treatment-caused changes in motor symptoms52. Though some of these studies were conducted in laboratory settings, the collective results indicate that patients could wear similar devices at home, enabling remote mobility assessment. Studies specifically assessing wearable technologies’ ability to track motor symptoms in at-home settings have reported high compliance and clinical utility26,53–57.
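Step and stride regularity of the kind Yang et al. estimate are commonly derived from the autocorrelation of the trunk acceleration signal: the correlation at a lag of one step period measures step regularity, the correlation at one stride period measures stride regularity, and their ratio indexes left-right symmetry. The sketch below assumes the step and stride times are already known; a real system would first locate them as the autocorrelation's dominant peaks, and the synthetic signal is an assumption:

```python
import math

def autocorr(x, lag):
    """Biased, normalized autocorrelation of x at the given sample lag."""
    n = len(x)
    mean = sum(x) / n
    c0 = sum((v - mean) ** 2 for v in x)
    ck = sum((x[i] - mean) * (x[i + lag] - mean) for i in range(n - lag))
    return ck / c0

def gait_regularity(accel, fs, step_time, stride_time):
    """Step/stride regularity as autocorrelation at one-step/one-stride lags;
    symmetry as their ratio (values near 1 indicate regular, symmetric gait)."""
    step = autocorr(accel, round(step_time * fs))
    stride = autocorr(accel, round(stride_time * fs))
    return step, stride, step / stride

# Synthetic vertical trunk acceleration: ~2 Hz stepping (1 Hz stride), 10 s at 100 Hz
fs = 100.0
sig = [math.sin(2 * math.pi * 2.0 * t / fs) for t in range(1000)]
step_r, stride_r, symmetry = gait_regularity(sig, fs, step_time=0.5, stride_time=1.0)
```

For irregular or asymmetric gait, the correlations drop and the ratio drifts away from one, which is what makes a single belt-worn accelerometer sufficient for a quick gait summary.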

In 2011, Chen et al. introduced MercuryLive, a web-based system that integrated data from wearable sensors and qualitative patient surveys for real-time, in-home monitoring of symptoms. Specifically, the system was used to guide potential changes in medications for patients with later-stage disease58. The advantage of such systems over sensor-only platforms is the ability to more seamlessly collect qualitative patient data, allowing clinicians and researchers to better contextualize quantitative sensor data. Other web application-based systems, like the PERFORM system presented by Cancela et al. in 2013, continued deploying wearable accelerometers and gyroscopes, but expanded the functionalities of the associated web application to include medication adherence questionnaires, food diaries, and the PDQ-39 questionnaire59, further expanding the qualitative information that supplements the objective data collected by wearable devices.

In-home monitoring became even more practical following the adoption of smartphones and other smart devices60,61. In 2011, Kostikis et al. showed the feasibility of remote tremor monitoring using an Apple iPhone 3G’s built-in accelerometer and gyroscope62. As recently as 2020, van Brummelen et al. tested seven consumer product accelerometers in smartphones (e.g., iPhone 7) and consumer smart devices (e.g., Huawei watch) and found that these products performed comparably to laboratory-grade accelerometers when assessing the severity of certain PD symptoms63. Smart tablets have also been shown to be helpful through the use of spiral drawing tests whose results significantly correlated with UPDRS scores and with the results of other tests including the BRAIN Test64.

Further expansion of smart devices came with the advent of user-friendly mobile applications such as the Fox Wearable Companion app developed by the Michael J. Fox Foundation. Silva de Lima et al. showed that using the app along with an Android smartphone and Pebble smartwatch resulted in high patient engagement and robust quantitative and qualitative data collection for clinicians to monitor PD progression and medication adherence65. Prince et al. reported success using an independently designed iOS application66. Use of smartwatches in conjunction with such mobile applications also allows for cloud-based data storage, thereby enabling research and clinical teams to more effectively monitor symptom progression and severity in real-time67. In 2021, Powers et al. developed the “Motor fluctuations Monitor for Parkinson’s Disease” (MM4PD) system that used continuous monitoring from an Apple Watch to quantify resting tremor and dyskinesia. MM4PD strongly correlated with evaluations of tremor severity, aligned with expert ratings of dyskinesia, and matched clinician expectations of patients 94% of the time2. Multiple other groups, including Keijsers et al., have presented solutions that can assess motor fluctuations in real or simulated home settings using either wearable sensors68–71 or smart devices72,73. These types of solutions are particularly important for PD monitoring since assessing symptom fluctuations can give clinicians insight into medication dosing, disease severity, and even symptom triggers (e.g., a patient has worse tremor when driving compared to washing dishes). Monitoring fluctuations using smart devices can be particularly useful, as the device can document what a patient was doing when symptoms worsened and the time of day at which it happened, among other environmental factors, providing clinicians a more holistic picture of a patient’s disease.
Data collected from fluctuation monitoring could also inform whether certain patients might be candidates for procedures such as deep brain stimulation.

Finally, multiple studies have proposed using technologies other than accelerometers and gyroscopes (either stand-alone or in smartphones). Instead, some studies used computer vision-based algorithms to assess data from video cameras, time-of-flight sensors, and other motion devices74–76. In the future, similar video analysis technologies could be combined with existing video platforms (e.g., Zoom, FaceTime) to regularly and reliably monitor motor impairments outside of the clinic. Significant work has also been conducted assessing the feasibility of using voice recordings to monitor and even diagnose PD. Arora et al. analyzed at-home voice recordings to estimate patients’ UPDRS scores and differentiated between patients with PD and healthy controls with a sensitivity of 96% and specificity of 97%77. Similar work on voice data from smartphones has been reported by many others78–80, indicating that voice analyses might be beneficial when developing technologies for monitoring and diagnosing PD.

Computational approaches

Non-ML techniques to evaluate PD symptoms have evolved considerably over the last 30 years (Fig. 1B). Prior to the adoption of machine learning algorithms, researchers used more traditional statistical and frequency domain analysis techniques. This likely occurred for two main reasons: (1) the requisite computing power for ML was not as widely available and (2) the datasets collected in early studies were relatively less complex with respect to size and noise. Additionally, certain key machine learning techniques (e.g., backpropagation applied to neural networks) were not popularized until the late 1980s and early 1990s, with more widespread adoption occurring many years later with the advent of machine learning software libraries81,82. One of the first such studies, by Albers et al. in 1973, showed that Parkinsonian hand tremor power spectra were easily distinguished from those of control patients83 (Table 3). Statistical testing of the frequency power spectrum also showed a significant correlation between selected features, such as the total power of the frequency power spectrum, and clinical ratings of dyskinesia severity84. Edwards et al. showed that combining multiple tremor characteristics (e.g., amplitude, dominant frequency) into a single index could also differentiate PD from non-PD movement85. Further development of computational techniques included applying more advanced regression models to data collected through different modalities (e.g., accelerometers, mechanical devices)86,87.
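The autoregressive approach Matsumoto et al. took (Table 3) can be sketched with a low-order AR model fit by the Yule-Walker equations: the fitted coefficients are themselves features, and the model's pole angle gives the main tremor frequency. The order-2 model, the closed-form solution, and the synthetic 5 Hz signal below are illustrative assumptions, not the cited study's exact procedure:

```python
import math

def yule_walker_ar2(x):
    """Fit an AR(2) model x_t = a1*x_{t-1} + a2*x_{t-2} + e via Yule-Walker."""
    n = len(x)
    mean = sum(x) / n
    def r(k):  # biased autocovariance at lag k
        return sum((x[i] - mean) * (x[i + k] - mean) for i in range(n - k)) / n
    r0, r1, r2 = r(0), r(1), r(2)
    # Solve [[r0, r1], [r1, r0]] @ [a1, a2]^T = [r1, r2]^T
    det = r0 * r0 - r1 * r1
    a1 = (r0 * r1 - r1 * r2) / det
    a2 = (r0 * r2 - r1 * r1) / det
    return a1, a2

# Synthetic 5 Hz tremor sampled at 100 Hz for 5 seconds
fs = 100.0
tremor = [math.sin(2 * math.pi * 5.0 * t / fs) for t in range(500)]
a1, a2 = yule_walker_ar2(tremor)
# For a resonant AR(2), the peak frequency follows from the pole angle
peak_hz = fs * math.acos(a1 / (2 * math.sqrt(-a2))) / (2 * math.pi)
```

Parametric models like this were attractive on early hardware because a couple of coefficients summarize the whole spectrum without computing a full transform.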

Table 3.

Progression of non-machine learning techniques used to monitor and assess PD symptoms.

Authors, Year Primary symptom(s) Device(s) Selected features Key takeaways
Albers et al.83 Tremor Hand-controlled force-measuring stick Frequency and power spectrum Parkinsonian tremor-power spectra are easily distinguished from control power spectra, providing an additional descriptive measure of tremor.
Blin et al.88 Gait Potentiometer Spatiotemporal and kinematic features Relationships between certain spatiotemporal parameters were preserved (e.g., linear relationship between velocity and stride length) in PD and control.
Burkhard et al.84 Dyskinesia Gyroscope RMS of the frequency power spectrum between 0–20 Hz Frequency power spectrum (FPS) values showed a statistically significant correlation with the clinical ratings for dyskinesia severity.
Edwards et al.85 Tremor Laser-based hand displacement tracker Tremor amplitude, frequency, and power spectrum Multiple characteristics of tremor (e.g., amplitude, frequency) can be combined to form an index that can differentiate PD from non-PD.
Lewis et al.90 Gait Force-sensitive resistors, video cameras, force plates, and EMG Velocity, cadence, and stride length Compared to age-matched control subjects, PD patients demonstrated 24% reduction in gait velocity and 23% reduction in stride length.
Matsumoto et al.86 Tremor Accelerometer Accelerations Autoregression model parameters and the main tremor frequency can be used to differentiate between PD and non-PD (control and essential tremor)
Salarian et al.155 Gait Tri-axial accelerometers and gyroscopes Kinematic walking data PD patients had significantly different gait compared to controls (e.g., 52% lower stride velocity). Patients with deep brain stimulation devices had significant gait improvement.
Sofuwa et al.91 Gait Force plates, video cameras Kinematic walking data PD patients showed significant reduction in step length, walking velocity, and ankle plantarflexion compared to controls.
Elble et al.87 Tremor Graphics tablet, accelerometer, and mechanical-linkage device Tremor characteristics (time and frequency domain) Analyses of 5 different tremor datasets revealed a logarithmic relationship between a 5-point tremor rating scale and tremor amplitude.
Chien et al.156 Gait GAITRite pressure-sensitive walkway Kinematic walking data Stride length is the best indicator of UPDRS III score improvements.
Moore et al.157 Gait Uniaxial accelerometer Vertical linear acceleration and pitch angular velocity of left leg Power analysis revealed distinctions between FoG and standing (e.g., high-frequency leg movements, overall power) which resulted in FoG detection accuracy of 89%.
Giuffrida et al.158 Tremor Kinesia sensor (tri-axial accelerometer and gyroscope) Tremor characteristics (time and frequency domain) Tremor characteristics determined from the sensor correlated with clinician scores.
Hwang et al. (2009) Tremor Displacement-transducing laser Tremor characteristics (time and frequency domain) PD patients demonstrated abnormal tremor modulation compared to healthy controls in a load-bearing task.
Kim et al.89 Bradykinesia Gyroscope Peak and total power in power spectrum of angular velocity All features showed significant differences between control and PD patients and significant correlations with clinical finger tap score.
Sant’Anna et al.159 Gait Inertial measurement unit and gyroscopes (uni- and bi-axial) Kinematic walking data A novel “symmetry index” of upper and lower limb movement during gait had an AUC of ~0.87 in differentiating between PD and control.
Palmerini et al.160 Gait Smartphone Kinematic walking data 10 principal components from PCA accounted for more than 90% of the variance in the original data. The first principal component accounted for 33% of the variance and was most correlated with standard deviation of step duration.
Stamatakis et al.161 Bradykinesia Tri-axial accelerometer Finger-tapping characteristics (e.g., movement frequency, acceleration, number of halts, number of hesitations) Logistic regression model trained using selected features achieved an AUC of 0.92 in classifying patients based on MDS-UPDRS scores.
Cancela et al.59 Gait Tri-axial accelerometers Step frequency, stride length, stride speed, arm swing, and entropy Step frequency, stride length, arm swing, and entropy vary significantly between the OFF and ON state. Arm swing and entropy vary the most.
Buchman et al.162 Gait Tri-axial accelerometer and gyroscope Kinematic walking data Regression revealed that performance on different mobility tasks (e.g., walking, sit to stand) was associated with Parkinsonian gait. Different tasks accounted for a wide range of gait variance.
Nair et al.92 Gait Tri-axial accelerometer Kinematic walking data Logistic regression had accuracy of 94%, specificity of 96%, and sensitivity of 89%

Many studies also found success with standard statistical hypothesis testing such as t-tests and ANOVAs. Blin et al. used an in-lab potentiometer-linked string and pulley system to collect data on stride length. Using a Mann-Whitney U test and linear regression, they found that variability of stride length was significantly more marked in PD patients and increased with Hoehn and Yahr clinical stage88. ANOVA conducted on finger tapping data (e.g., RMS angular velocity, RMS angular displacement) showed significant differences between PD and control subjects89.

To harness insights about gait abnormalities, researchers incorporated kinematic analyses into their studies. Using ANOVA on kinematic measurements of gait, Lewis et al. found that patients with Parkinson's displayed lower gait velocity and stride length, but comparable cadence, relative to healthy controls, while exhibiting reductions in peak joint angles in the sagittal plane and in ankle plantarflexion at toe-off of the gait cycle90. These gait and kinematic characteristics were corroborated by the spatiotemporal analysis of Sofuwa et al., who showed that patients with PD had a significant reduction in step length and walking velocity compared to controls, with the major feature defining the PD group being a reduction in ankle plantarflexion91. More recently, Nair et al. used standard logistic regression on centroids from k-means clustering of tri-axial accelerometer data to classify PD and control subjects with an accuracy of ~94%, specificity of ~96%, and sensitivity of ~89%92.
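The cluster-then-classify pipeline attributed to Nair et al. can be sketched roughly as follows. The feature values, cohort sizes, and the centroid-flattening step are invented for illustration; the actual study's features and preprocessing differ.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

def subject_features(stride_samples, k=2):
    """Cluster one subject's per-stride measurements with k-means and
    flatten the sorted centroids into a fixed-length feature vector."""
    km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(stride_samples)
    centroids = km.cluster_centers_[np.argsort(km.cluster_centers_[:, 0])]
    return centroids.ravel()

def make_subject(mean):
    # 40 strides of (walking speed, stride length); values are invented
    return rng.normal(loc=mean, scale=0.05, size=(40, 2))

X = np.array([subject_features(make_subject([0.8, 1.0])) for _ in range(20)]     # PD-like
             + [subject_features(make_subject([1.1, 1.4])) for _ in range(20)])  # control-like
y = np.array([1] * 20 + [0] * 20)

clf = LogisticRegression().fit(X, y)
print(clf.score(X, y))  # training accuracy on this separable toy data
```

Clustering first compresses each subject's variable-length stride data into a fixed-length summary, which is what allows a simple linear classifier to operate at the subject level.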

In more recent literature, machine learning techniques have proven highly effective in identifying PD symptom characteristics, especially when applied to varied datasets obtained using smart devices (Fig. 1B). The literature demonstrates strong performance across multiple machine learning techniques: both neural network and non-neural network algorithms achieved high sensitivities and specificities in classification of PD symptoms using both raw and processed data (Table 4).

Table 4.

Progression of machine learning techniques used to monitor and assess PD symptoms.

Authors, Year Primary symptom(s) Device(s) Selected features Supervised or unsupervised Key takeaways
Jakubowski et al.123 Tremor Accelerometer 30 statistical features Supervised Using "higher order" statistical characteristics of tremor (e.g., spectral moments of polyspectra) led to average error < 3% in distinguishing between PD, essential, and physiological tremor.
Keijsers et al.163 Hypokinesia, bradykinesia, and tremor Six tri-axial accelerometers Kinematic features (e.g., mean gait velocity, tremor frequency) Supervised A neural network with 2 hidden units and 4 input parameters resulted in a sensitivity and specificity of 100 and 98%, respectively.
Roy et al.124 Tremor Accelerometer and EMG Spectrum data Supervised Combining accelerometer and EMG data led to a global error rate < 10%.
Kostikis et al.101 Tremor iPhone Tri-axial accelerometer data Supervised Bagged decision trees had the highest AUC (0.94).
Lee et al.103 Dyskinesia Accelerometer Correlation between accelerometers on and entropy of accelerometer time series Supervised The proposed method (data pre-processing + random forest) significantly improves the estimation of limb-specific clinical scores.
Martinez-Manzanera et al.107 Bradykinesia Accelerometer, gyroscope, and magnetometer Raw and smooth splined signals Supervised Lowest classification error was ~33% for finger tapping, ~33% of diadochokinesis, and ~30% for toe tapping.
Memedi et al.125 Dyskinesia Hand-held touch screen device Characteristics of spiral drawing task Supervised Multilayer Perceptron classified the motor symptom (bradykinesia or dyskinesia) with an accuracy of 84% and ROC AUC 0.86 in relation to classifications of raters.
Rigas et al.99 Tremor Microsoft Band on wrist Accelerometer and gyroscope readings Supervised Decision trees achieved accuracy of 94% with 0.01% false positives.
Alam et al.164 Tremor Tri-axial accelerometer and gyroscope Tremor characteristics (time and frequency domain) Supervised Linear kernel SVM trained on data collected from the index finger classified rest and postural tremor severity with ~89% and ~82% accuracy, respectively.
Jeon et al.113 Tremor Accelerometer and gyroscope Temporal and frequency characteristics of tremor Supervised The best performing machine learning method changed depending on the type of tremor being assessed (e.g., resting tremor: Polynomial SVM, resting tremor with mental stress: Decision tree)
Butt et al.126 Tremor Inertial measurement units Movement characteristics (e.g., velocity, frequency) of multiple tasks (e.g., thumb–forefinger tapping, hand opening/closing) Supervised Neural network (accuracy of ~83%) outperformed SVM and logistic regression in distinguishing between slight-mild and moderate-severe PD
Oung et al.127 Tremor Accelerometer, gyroscope, and magnetometer Tri-axial accelerometer data Supervised Maximum average accuracy of ~91% using ELM, ~90% using PNN, and ~87% using KNN.
Jeon et al.113 Tremor Tri-axial accelerometer and gyroscope Tremor characteristics (time and frequency domain) Supervised Decision tree trained with features selected using PCA performed the best in predicting UPDRS scores. However, multiple ML algorithms performed well as measured by accuracy and RMSE.
Samà et al.165 Bradykinesia Tri-axial accelerometer Gait characteristics (time and frequency domain) Supervised SVM detected bradykinesia with ~91% accuracy. Regression models estimated UPDRS scores with ~10% error.
Gao et al.105 Gait Multiple across 2 separate studies Numerous, including: demographic, gait, balance, and imaging data Supervised Feature selection streamlined large datasets. Model-free ML techniques (e.g., Neural Networks, SVM) performed better than model-based techniques (e.g., regression) in forecasting falls.
Bazgir et al.115 Tremor Android smartphone with tri-axial accelerometer and gyroscope Mean and max power spectrum density Supervised Naïve Bayes achieved 100% accuracy in classifying UPDRS scores of PD patients. NN had accuracy of ~93%, KNN of ~87%, and SVM of ~75%.
Butt et al.116 Tremor Infrared-based motion detector Speed and frequency of movement tasks Supervised Naïve Bayes on features selected by SVM performed the best.
Zhan et al.72 Gait Smartphone Features from gait, finger tapping, and voice tests Unsupervised mPDS correlated strongly with Hoehn and Yahr stage (0.91), MDS-UPDRS part III (r = 0.88), and MDS-UPDRS total (0.81).
Kim et al.129 Tremor Accelerometer and gyroscope Tri-axial accelerometer and gyroscope readings Supervised 3-layer CNN resulted in ~85% accuracy when estimating UPDRS scores.
Pereira et al.130 Tremor Smartpen with accelerometer Time series images of 4 drawing tasks and 2 wrist movements Supervised CNNs performed best when utilizing data from each task separately and when combining data from each task
Vivar et al.102 Tremor Infrared-based hand/finger motion detector (Leap Motion Controller) 9 texture features (e.g., mean, variance, energy, correlation) Supervised Bagged tree classifier using contrast achieved a 98% accuracy in classifying patients’ UPDRS scores.
Hssayeni et al.104 Tremor Motion sensor X, Y, and Z axis signal power Supervised Gradient tree boosting outperformed LSTM when estimating UPDRS-III scores.
Rehman et al.106 Gait Tri-axial accelerometer and instrumented force mat Gait kinematic features Supervised SVM performed better than random forest. Classification of PD versus control was significantly more accurate when using accelerometer data compared to using force mat data.
Rehman et al.106 Gait Instrumented force mat Gait kinematic features Supervised Random forest achieved the highest classification accuracy of 97% with 100% sensitivity and 94% specificity.
Rios-Urrego et al.114 Tremor Tablet with stylus Kinematic and spatial features from drawing task Supervised KNN using kinematic features and the kinematic + spatial + NLD features was most accurate (83%).
Steinmetzer et al.166 Gait Tri-axial accelerometer and gyroscope 3D Euler angle and linear acceleration of the arms Supervised A three-layer CNN detected motor dysfunction with an accuracy of 93% based on data from a timed up-and-go test.
Aich et al.71 Gait Accelerometer Gait kinematic features Supervised Decision tree had the highest accuracy of 88%, sensitivity of 93%, and specificity of 91%
Aich et al.71 Gait Tri-axial accelerometers Gait kinematic features Supervised Random forest had an accuracy of 96%, SVM of 93%, Naïve Bayes of 88%, and KNN of 86% classifying between the medication ON and OFF states.
Reches et al.32 Gait Opal sensors (accelerometer, gyroscope, and magnetometer) Time and frequency domain features Supervised SVM had ~80% sensitivity, ~83% specificity, and ~87% accuracy.
de Araújo et al.112 Tremor Triple-axis gyroscope and an accelerometer Time and frequency domain tremor features Supervised KNN outperformed 6 other machine learning algorithms in classifying hand resting tremor in patients with PD.
Sajal et al.80 Tremor Smartphone-based accelerometer Time and frequency domain features Supervised KNN had the highest accuracy for both PD vs non-PD and UPDRS 0–4 classifications
Moon et al.119 Gait 6 IMU sensors Gait and postural sway features Supervised The accuracy of the models in classifying between PD and ET ranged from 0.65 (KNN) to 0.89 (NN).
Veeraragavan et al.122 Gait 8 force sensors on the soles of each foot Swing, stance, and force metrics Supervised The neural network used for PD diagnosis had an accuracy of 97%. H&Y staging was conducted with an accuracy of 87%.
Shi et al.128 Gait 3 IMU sensors with tri-axial sensors Wavelet transformed gait data Supervised CNN achieved an accuracy of 89% using wavelet-transformed data, 84% using FFT of the time series, and 74% using raw time series
Sigcha et al.131 Gait IMU sensor Time and frequency domain features Supervised Highest AUC (0.94) was achieved with CNN-LSTM using FFT + 3 previous windows when classifying presence vs non-presence of FOG.
Ibrahim et al.132 Tremor Motion sensor 1D time-domain tremor amplitude signal Supervised CNN with perceptron was used to estimate amplitude of future tremor at time steps of 10, 20, 50, and 100 ms. Accuracy ranged from 90–97%.
Channa et al.108 Bradykinesia Accelerometer and gyroscope Time and frequency domain features Supervised Bradykinesia classification had a sensitivity of 100% and specificity of 89%
Mirelman et al.110 Gait Tri-axial accelerometers and gyroscopes Gait kinematic features Supervised Different gait features held varying levels of importance for distinguishing between stages of PD (AUC of 0.76–0.9). Upper-limb features best discriminated controls from early PD, turning features were important in mid-stage PD, and stride-related features were more important in more advanced stages.
Rupprechter et al.111 Gait KELVIN-PD video recording platform Gait kinematic features Supervised Random forest highly aligned with clinical examiners’ ratings of UPDRS scores (rarely differed by more than one point; 95% agreement) when trained on computer vision-derived features.
Liu et al.167 Tremor Tri-axial accelerometer Tremor characteristics (time and frequency domain) Supervised SVM achieved the best performance with an overall accuracy of ~95% when differentiating between patients with different tremor severities

The vast majority of machine learning techniques are supervised.

There is still significant research being conducted on optimizing and refining most of the ML algorithms discussed here, as many aspects of ML design still proceed through trial and error. This applies both to determining model parameters (e.g., learning rates for gradient descent, impurity levels in decision trees) and to selecting the algorithms themselves (e.g., neural network versus decision tree)93–96. In reality, multiple different models could be effective at performing the same task on a given set of data97,98. Here, we present objective measures of ML model performance while also attempting to provide rationale regarding the design criteria that may have led researchers to choose one algorithm over another.

Non-neural network machine learning algorithms have proven effective in Parkinson's disease classification, as they often provide more mechanistic insight/interpretability and generally require less training data than neural networks. Multiple studies have found that decision trees are highly effective in classifying Parkinson's versus control patients based on accelerometer and gyroscope data. Using data from a Microsoft Band smartwatch, Rigas et al. used decision trees to achieve a tremor detection accuracy of 94% with a 0.01% false positive rate99. Aich et al. showed that a decision tree trained on gait characteristics such as step time and length, stride time and length, and walking speed distinguished Parkinson's patients from healthy controls with an accuracy of ~88%, sensitivity of ~93%, and specificity of ~91%, outperforming k-nearest neighbors (KNN), support vector machines (SVM), and Naïve Bayes100. The design choices in these studies were conducive to using decision trees, as there were multiple quantitative variables (e.g., stride length) with specific cut-offs (e.g., stride length <1.2 m) that informed certain diagnoses. Decision trees also enabled researchers to quantitatively determine which feature(s) (e.g., tremor frequency) from the data were most important in determining final classifications, thus strengthening the link between data analysis and understanding of disease.
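A minimal decision-tree sketch of this idea, on invented gait feature distributions: the split thresholds the tree learns are analogous to the stride-length cut-offs mentioned above, and `feature_importances_` provides the interpretability the text describes.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(2)

# Invented gait features per subject: [stride length (m), walking speed (m/s)]
pd = np.column_stack([rng.normal(1.0, 0.1, 50), rng.normal(0.9, 0.1, 50)])
ctrl = np.column_stack([rng.normal(1.3, 0.1, 50), rng.normal(1.2, 0.1, 50)])
X = np.vstack([pd, ctrl])
y = np.array([1] * 50 + [0] * 50)  # 1 = PD-like, 0 = control-like

tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
# The learned split thresholds act like clinical cut-offs (e.g., stride
# length below ~1.15 m), and feature_importances_ quantifies which
# measurement drives the classification.
print(tree.feature_importances_)
print(tree.score(X, y))
```

Keeping the tree shallow (`max_depth=2`) keeps the learned rules readable, which is precisely the interpretability advantage over black-box models.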

While decision trees can be effective, they can also overfit training data, limiting their generalizability. Therefore, many groups have found success using bagged decision trees, a technique that trains multiple trees on subsets of the training data and then aggregates their results. Bagged decision trees can be particularly useful for mitigating the overfitting that can result from analyzing relatively small datasets. Kostikis et al. used data from 25 patients with PD and 20 healthy controls and found that bagged decision trees trained on tremor features achieved an AUC of 0.94, higher than any other algorithm they tested (e.g., logistic regression, SVM, AdaBoost)101. In a study of 20 patients with PD, bagged trees showed between 95 and 98% accuracy in classifying patients according to the MDS-UPDRS 0,1,2 scheme when using tremor data from motion sensors rather than accelerometers or gyroscopes102.
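A hedged sketch of bagging on a small, noisy synthetic dataset (the cohort size loosely echoes the small studies cited above; the features and labels are invented):

```python
import numpy as np
from sklearn.ensemble import BaggingClassifier

rng = np.random.default_rng(3)

# Small, noisy synthetic cohort: 45 subjects, 5 tremor features with
# partial class separation.
X = np.vstack([rng.normal(0.0, 1.0, (25, 5)), rng.normal(1.0, 1.0, (20, 5))])
y = np.array([0] * 25 + [1] * 20)

# BaggingClassifier's default base estimator is a decision tree; each tree
# is trained on a bootstrap resample and the votes are aggregated, which
# damps the variance (overfitting) of any single deep tree.
bag = BaggingClassifier(n_estimators=50, random_state=0).fit(X, y)
print(bag.score(X, y))
```

Because each tree sees a different bootstrap sample, no single tree's idiosyncratic splits dominate the aggregated prediction, which is why bagging tends to generalize better than one unpruned tree on small datasets.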

Results continued to be strong with a variant of bagged decision trees known as random forests (RF), which can improve accuracy and further reduce overfitting, at the cost of longer training times. RF performed better than logistic regression on features from gait analysis, sway tests, and timed up-and-go tasks when classifying between progressive supranuclear palsy and Parkinson's, and was also useful in estimating clinical scores of dyskinesia103. At the same time, researchers have found success with another variation of decision trees known as boosted trees, with gradient tree boosting outperforming a long short-term memory neural network when estimating UPDRS-III scores based on motion sensor data from the wrist and ankle104.

To further improve algorithm efficiency and reduce computational cost, researchers have leveraged feature selection techniques in combination with established machine learning algorithms. Feature selection is particularly important in the design of studies that evaluate multiple ML algorithms to identify the top performers or that train algorithms on different datasets105,106. Feature selection is also commonly used to improve algorithm performance. When used in conjunction with feature selection techniques such as recursive feature elimination, RF achieved a classification accuracy of 96% when grading gait abnormalities of PD patients on and off medication71. Another, SVM-based, type of feature selection was useful in achieving high RF performance when classifying PD versus non-PD patients, resulting in an accuracy of 97%, sensitivity of 100%, and specificity of 94%. In general, many different feature selection techniques have been shown to be useful with multiple ML algorithms32,106–108. SVMs have been shown to perform well both with and without feature selection before model training32,106.
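One way recursive feature elimination can be paired with a random forest is sketched below, on synthetic data in which only two of ten candidate features carry signal. The feature counts and sample size are illustrative assumptions, not those of the cited studies.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import RFE

rng = np.random.default_rng(4)

# 10 candidate features; only the first two actually carry signal,
# mimicking a redundant sensor-derived feature set.
n = 120
informative = rng.normal(size=(n, 2))
noise = rng.normal(size=(n, 8))
X = np.hstack([informative, noise])
y = (informative[:, 0] + informative[:, 1] > 0).astype(int)

# Recursive feature elimination repeatedly drops the feature the forest
# ranks least important, refitting until the requested number remains.
selector = RFE(RandomForestClassifier(n_estimators=100, random_state=0),
               n_features_to_select=2).fit(X, y)
print(np.flatnonzero(selector.support_))  # indices of the retained features
```

Pruning the noise features before final model training reduces both computational cost and the chance that the classifier latches onto spurious correlations.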

Feature analysis, however, does not stop with feature selection. Post-hoc feature importance calculations can help explain why specific models work the way they do, providing insight relevant to the clinical applications of a model. Rehman et al. built multiple partial least squares discriminant analysis models using subsets of gait features measured in patients with PD and healthy controls, and used feature importance metrics to identify that step velocity, step length, and gait regularity, among others, were the most influential features in the model. This type of analysis is particularly beneficial, as it can improve clinical decision-making independent of machine learning models by providing clinicians with more nuanced signs/symptoms of early disease manifestation or disease progression109. Similar analyses of gait abnormalities were conducted by Mirelman et al., who stratified patients by PD disease progression and found that different features were more important in differentiating between the various stages of PD110. For example, as PD progressed, features related to more challenging activities such as turning became more important for patient classification, though Mirelman et al. found that this increase in importance occurred at earlier stages of disease than one would normally expect. Similar analyses have been reported by additional groups investigating gait and other symptoms of PD104,111,112.
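Permutation importance is one common way to perform this kind of post-hoc analysis; a sketch follows, with synthetic features standing in for step velocity, step length, and gait regularity (the feature names and data are invented for illustration).

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance

rng = np.random.default_rng(5)

# Invented features: [step velocity, step length, gait regularity, noise];
# only the first two determine the (synthetic) label.
n = 150
X = rng.normal(size=(n, 4))
y = (X[:, 0] - X[:, 1] > 0).astype(int)

model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

# Permutation importance: shuffle one feature at a time and measure how far
# the score drops -- a model-agnostic, post-hoc view of which inputs matter.
result = permutation_importance(model, X, y, n_repeats=10, random_state=0)
print(result.importances_mean)
```

Because the technique only requires re-scoring the fitted model, it works with any classifier, which makes it well suited to comparing feature rankings across the heterogeneous models used in this literature.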

While the choice of ML algorithm can be partially informed by the type of data, the size of the study, etc., some papers have shown that the accuracy of a machine learning model depends on the type of tremor being evaluated, further highlighting the inherent trial-and-error nature of ML study design. Jeon et al. found that while decision trees were most accurate when classifying patients based on resting tremor with mental stress and intention tremor, resting tremor classification alone was most accurate with a polynomial SVM and postural tremor classification was most accurate with KNN113. In the same vein, multiple groups have found that KNN classifiers using time- and frequency-domain tremor data are highly effective in Parkinson's versus control classification80,112,114. Finally, Butt et al. and Bazgir et al. both found in 2018 that Naïve Bayes outperformed other tested algorithms when classifying Parkinson's tremor using motion and accelerometer/gyroscope data, respectively115,116.

A few unsupervised learning algorithms have been developed for PD classification. Unsupervised learning can be useful when designing studies with large datasets that would be too cumbersome to label manually, a prerequisite for training supervised ML models. Unsupervised learning is also beneficial in exploratory analyses, providing structure and novel insights from large and diverse datasets. Zhan et al. developed a novel "Disease Severity Score Learning" algorithm that calculated a "mobile Parkinson disease score" (mPDS) based on 435 features from gait, finger tapping, and voice tests conducted using smartphones. mPDS scores correlated strongly with MDS-UPDRS part III, MDS-UPDRS total, and Hoehn and Yahr stage. This work represents ongoing efforts to create more objective measurements of Parkinson's disease progression that are not impacted by interrater variability72.

Artificial neural networks, developed to learn from large datasets, have recently been applied to PD symptom classification. Neural networks have multiple use cases but are most often applied to large datasets whose features must be combined through complex, non-linear relationships for classification or regression tasks. That being the case, neural networks typically require more training data than other ML algorithms and, as a consequence, are more computationally expensive. Though neural networks can be powerful tools, they tend to be more of a "black box", lacking the interpretability of other ML algorithms52,117,118. Even so, neural networks are among the most popular ML algorithms used today and have achieved strong performance when applied to diagnosing and monitoring PD.

Moon et al. used 48 gait and postural sway features collected from six inertial measurement units (IMUs) placed on patients' backs, upper extremities, and lower extremities to differentiate between PD and essential tremor. After testing multiple machine learning algorithms (e.g., SVM, KNN, neural network, logistic regression), the authors found that a neural network with a learning rate of 0.001 had the highest accuracy (0.89), precision (0.61), and F1-score (0.61)119. Moon et al.'s paper exemplifies a common machine learning design process, in which multiple algorithms are tested before one is selected, with specific hyperparameters (e.g., learning rate, number of hidden layers) also typically chosen by trial and error120,121. Veeraragavan et al. also used neural networks, but attempted two different tasks: classifying between PD and healthy patients based on gait, and classifying PD patients into Hoehn and Yahr clinical stages. Parkinson's versus healthy control classification was achieved with an accuracy of 97% using a single-hidden-layer network with 25 nodes, while classification into Hoehn and Yahr stages was accomplished with an accuracy of 87% using a single-hidden-layer network with 13 nodes122. These results suggest that neural networks are promising candidates for disease classification and staging.
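The trial-and-error learning-rate search described above might look roughly like the sketch below. The synthetic features stand in for IMU-derived measurements, and the candidate learning rates and hidden-layer size are arbitrary choices, not those of the cited study.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(8)

# Synthetic stand-in for the gait/postural-sway feature sets described
# above: two classes with partial separation across 10 features.
X = np.vstack([rng.normal(0.0, 1.0, (100, 10)), rng.normal(0.8, 1.0, (100, 10))])
y = np.array([0] * 100 + [1] * 100)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Candidate learning rates are tried one by one; in practice this trial
# and error is often automated with a grid or random search.
scores = {}
for lr in (0.0001, 0.001, 0.01):
    net = MLPClassifier(hidden_layer_sizes=(16,), learning_rate_init=lr,
                        max_iter=2000, random_state=0).fit(X_tr, y_tr)
    scores[lr] = net.score(X_te, y_te)
print(scores)
```

Holding out a test split, as here, is what keeps the hyperparameter comparison honest: each candidate is judged on data it never trained on.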

Early efforts to apply machine learning to PD tremor data used single-hidden-layer perceptron classifiers that took 30 higher-order statistical characteristics of tremor accelerometer data as inputs to differentiate between Parkinsonian, essential, and physiological tremor123. Such efforts essentially combined sophisticated feature extraction with relatively simple algorithm architecture for classification tasks. Other approaches, such as the dynamic neural network used by Roy et al., aimed to classify tremor as "mild", "moderate", or "severe" (based on UPDRS) using spectrum data from EMG and accelerometer measurements. Leveraging input features that required minimal pre-processing, such as accelerometer signal energy after lowpass filtering, Roy et al. achieved global classification error rates of less than 10%124. Others have reported success using neural networks trained on similar features that require little pre-processing125,126. Alterations to classical neural networks have also performed well. Oung et al. showed that extreme learning machines, neural networks that learn weights without backpropagation, achieved 91% classification accuracy when tremor and voice data were used as network inputs127.

Convolutional neural networks (CNNs) have recently played a large role in Parkinson's disease classification due to their ability to analyze image data directly. In many cases, this reduces the amount of feature extraction needed. For example, when using a CNN to analyze tremor data collected by accelerometers, researchers do not need to extract features such as frequency and amplitude, because the input to the CNN can simply be a processed version of the accelerometry trace itself. In 2020, Shi et al. used graphs of wavelet-transformed data (decomposing the data into a set of discrete oscillations called wavelets) from tri-axial accelerometers, gyroscopes, and magnetometers as inputs to a CNN to classify FoG and non-FoG episodes. Overall, the CNN displayed classification accuracy of ~89%, sensitivity of ~82%, and specificity of ~96%. The same study found that CNNs using raw time series data or Fourier-transformed data as inputs did not perform as well128, showing that researchers must carefully select pre-processing techniques when using CNNs, as this choice can significantly alter the algorithm's performance. Fourier-transformed inputs have nonetheless supported strong CNN-based tremor classification elsewhere. Kim et al., in 2018, reported ~85% accuracy when estimating UPDRS scores using a 3-layer CNN with a softmax classification final layer. Rather than extracting specific features from accelerometer data to use as CNN inputs, Kim et al. used a stacked 2D FFT image of the tri-axial accelerometer and gyroscope data129.
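The kind of stacked 2D FFT "image" used as CNN input can be sketched with plain NumPy. The window length, hop size, and simulated tremor signal below are illustrative assumptions, not the parameters of the cited study.

```python
import numpy as np

rng = np.random.default_rng(6)
fs = 100                     # hypothetical sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)

# Simulated tri-axial accelerometer channels with a 5 Hz tremor component.
channels = [np.sin(2 * np.pi * 5 * t + phase) + 0.3 * rng.standard_normal(t.size)
            for phase in (0.0, 1.0, 2.0)]

def fft_image(x, win=128, hop=64):
    """Short-time FFT magnitudes stacked into a 2D time-frequency image."""
    frames = np.asarray([x[i:i + win] for i in range(0, len(x) - win + 1, hop)])
    spec = np.abs(np.fft.rfft(frames * np.hanning(win), axis=1))
    return spec.T            # shape: (frequency bins, time frames)

# One spectrogram per axis, stacked into a multi-channel "image" that a
# CNN could consume directly, without hand-extracted tremor features.
image = np.stack([fft_image(c) for c in channels])
print(image.shape)
```

The resulting array has the same channels-by-height-by-width layout as an RGB image, which is what lets standard image-oriented CNN layers process sensor data unchanged.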

Researchers have experimented with various CNN architectures and structures as well. Pereira et al. compared CNNs with ImageNet or Cifar10 architectures to an optimum-path forest, a support vector machine with a radial basis function kernel, and Naïve Bayes, using data from 4 drawing tasks (e.g., spiral drawing) and 2 wrist movement tasks to distinguish Parkinson's from control patients based on tremor. Overall, the CNNs outperformed the other machine learning techniques with respect to classification accuracy both when using data from each task separately (single-assessment case) and when combining data from all tasks (combined-assessment case)130. Sigcha et al. in 2020 sought to model the time dependencies of FoG and used a novel structure combining a classical CNN with a long short-term memory (LSTM) recurrent neural network to classify FoG and non-FoG episodes. Using Fourier-transformed data from an IMU on patients' waists as input, the CNN-LSTM combination achieved an AUC of 0.939131.

CNNs have also been useful beyond classification tasks. In 2020, Ibrahim et al. used a CNN with perceptron to estimate the amplitude of future tremor at 10, 20, 50, and 100 millisecond time steps, with a prediction accuracy ranging from 90 to 97%132. Both traditional and convolutional neural networks will likely continue to be useful in machine learning-based analysis of PD symptoms.

Interplay between technology and computational techniques

The technology and computational techniques used to monitor PD motor symptoms have evolved concurrently. As technology improves, different computational techniques must be developed and optimized to handle increasing amounts of data collected by new devices. The same applies in reverse. As advancements are made in computation that enable researchers to ask and answer different questions, new technologies must be developed that can facilitate these new analyses.

The overarching, major change seen in the technology used to diagnose and monitor PD over the last ~50 years has been the transition from laboratory to home monitoring. This technological evolution has undoubtedly been accompanied by a shift in computational approaches. Fundamentally, the techniques used to analyze data collected in well-controlled laboratory settings must be different from those required to analyze data collected in real-world conditions. As such, the evolution of technology necessitated computational methods that could: (1) better denoise signals, (2) make predictions given large sets of structured data, and (3) make predictions given large sets of unstructured data.

In-lab diagnosis and monitoring of PD generates data with less noise than real-world monitoring, and this manifests in two ways. First, the signal itself contains less ambient noise: by using high-quality microphones or working in sound-treated rooms, for example, researchers can control for room noise when recording voice samples from patients with PD133,134. Second, data from most in-lab studies are “de-noised”/simplified by the structure built into these studies. Assessing gait abnormalities via the timed up-and-go test or quantifying tremor via circle-drawing tests produces highly consistent, uniform data, since every participant executes the same task(s) in the same way. This is not the case in real-world settings. As technology enabled real-world data collection, de-noising became one of the first priorities, addressed both through simple filtering77 and through data labeling (e.g., a smartwatch labeling whether a participant was running, swimming, or sleeping)135. Still, apart from this added functionality for handling noisy data, foundational computational techniques such as frequency analysis and statistical testing remained adequate.
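As a sketch of the kind of simple filtering mentioned above, a zero-phase Butterworth band-pass (here via scipy) can separate a tremor-band component from low-frequency drift and broadband noise. The 3–7 Hz band and all signal parameters are illustrative assumptions, not values from any cited study.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def bandpass(signal, fs, low=3.0, high=7.0, order=4):
    """Zero-phase Butterworth band-pass filter. The 3-7 Hz band is a
    common choice for parkinsonian rest tremor, used here purely for
    illustration."""
    nyq = 0.5 * fs
    b, a = butter(order, [low / nyq, high / nyq], btype="band")
    # filtfilt applies the filter forward and backward for zero phase shift
    return filtfilt(b, a, signal)

# Synthetic example: 5 Hz "tremor" buried in 0.5 Hz drift and broadband noise
fs = 100
t = np.arange(0, 10, 1.0 / fs)
rng = np.random.default_rng(1)
raw = (np.sin(2 * np.pi * 5 * t)            # tremor-band component
       + 2.0 * np.sin(2 * np.pi * 0.5 * t)  # slow postural drift
       + 0.5 * rng.standard_normal(t.size)) # broadband sensor noise
clean = bandpass(raw, fs)
```

After filtering, the dominant spectral peak of `clean` sits at the 5 Hz tremor component, while the large 0.5 Hz drift has been removed.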

The adoption of machine learning generally correlated with the ability to collect increasing amounts of data, which enabled researchers to ask new questions. The prime example is the adoption of smart devices. Previously, researchers had to ask participants to wear separate accelerometers, gyroscopes, heart rate monitors, and other sensors to collect varied types of data. Smart devices enabled device consolidation, improving ease of use for patients and therefore increasing the amount of data that could be collected. Moreover, smart devices made qualitative data easier to collect: instead of relying on patient diaries or recall from memory, app-based monitoring on phones or tablets allowed patients to more seamlessly report medication adherence, exercise levels, mood, and other qualitative information.

With access to greater volumes and varieties of data, researchers and clinicians began asking questions better suited to ML than to non-ML techniques. These questions fall broadly into two categories: (1) prediction and (2) classification. When investigating PD, researchers were interested in predicting symptom severity and disease progression while classifying patients for diagnostic and therapeutic purposes. ML algorithms were well suited to these tasks given their ability to leverage non-linearities and to handle large datasets efficiently. For example, neural networks enabled researchers to uncover complex, non-linear relationships between quantitative (e.g., tremor frequency) and qualitative (e.g., medication adherence) data to predict UPDRS scores, while support vector machines (SVMs) allowed for high-dimensional (>3 independent variables) classification. With smart devices providing vast amounts of data, researchers leveraged algorithms such as random forests, whose constituent trees can be trained in parallel, making classification and prediction on large datasets more efficient and insightful.
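The two algorithm families mentioned — SVMs for high-dimensional classification and random forests whose trees train in parallel — can be illustrated with scikit-learn on synthetic data. The 20-dimensional feature set here is a hypothetical stand-in for device-derived measures such as tremor frequency or stride-time variability, not a real PD dataset.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Synthetic stand-in for device-derived features; two classes
# (e.g., PD vs. control), 20 feature dimensions
X, y = make_classification(n_samples=400, n_features=20,
                           n_informative=8, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                          random_state=0)

# RBF-kernel SVM: non-linear decision boundary in high dimensions
svm = SVC(kernel="rbf").fit(X_tr, y_tr)

# Random forest: independent trees, built in parallel with n_jobs=-1
rf = RandomForestClassifier(n_estimators=100, n_jobs=-1,
                            random_state=0).fit(X_tr, y_tr)

svm_acc = svm.score(X_te, y_te)
rf_acc = rf.score(X_te, y_te)
```

On held-out data both models comfortably beat chance on this synthetic task; with real sensor features, the comparison would of course depend on feature quality and class balance.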

It is clear that the computational techniques and technology used to monitor PD have co-evolved over the years. As technology advances, new computational techniques will be required to take advantage of the technologies’ improved functionalities and vice versa.

Discussion

The technology used to monitor and quantify Parkinson’s motor symptoms has undergone a rapid transformation in the past few decades. Early monitoring began with in-lab devices such as needle-based EMG, transitioned to in-lab accelerometers/gyroscopes, then to wearable accelerometers/gyroscopes, and finally to phone and mobile & web application-based monitoring in patients’ homes. The shift from in-lab to in-home monitoring will enable physicians to make more data-driven decisions regarding patient management. Along the same lines, significant progress has been made with respect to the use of machine learning to classify and monitor Parkinson’s patients. Using data from multiple sources (e.g., wearable motion sensors, phone-based accelerometers, video cameras), researchers have designed both neural network and non-neural network-based machine learning algorithms to classify/categorize Parkinson’s patients across tremor, gait, bradykinesia, and dyskinesia. Further advancements in these algorithms will create more objective and quantitative ways for physicians to diagnose and manage patients with Parkinson’s.

As machine learning becomes more prevalent in medicine, regulators such as the Food and Drug Administration (FDA) are developing new protocols to assess the safety and efficacy of ML-based health technologies. The plan outlined by the FDA to improve evaluation of these technologies includes: (1) outlining “good machine learning practices”, (2) setting guidelines for algorithm transparency, (3) supporting research on algorithm evaluation and improvement, and (4) establishing guidelines on real-world data collection for initial approval and post-approval monitoring136. As this plan goes into action over the next few years, trial endpoints for diseases will likely remain established clinical metrics (e.g., UPDRS) rather than novel metrics generated by new ML-powered devices137,138. There seems to be, however, a future in which device-generated metrics replace or are used in conjunction with traditional clinical metrics. In the case of PD monitoring, the FDA’s approval of Great Lakes NeuroTechnologies’ KinesiaU device and provider portal to monitor motor symptoms of PD is a first step in that direction139. ML will undoubtedly play an increasingly large role in medicine, and the FDA’s actions to navigate this new healthcare environment should be carefully monitored by researchers in this field.

Digital PD monitoring has enabled an understanding of patients’ symptoms at a level of detail not seen before. Prior to the adoption of wearable and smart devices in this field, clinicians were blind to the manifestation of PD motor symptoms outside of the clinic (e.g., brushing teeth, exercising, driving). Device-based monitoring has also helped fill gaps left by sometimes inaccurate or incomplete patient diaries. However, many barriers remain to full clinical adoption of digital monitoring, including the cost of digital devices, the lack of secure and reliable pipelines for transferring data to physicians, and the varying technological proficiency of patients with PD140. These barriers can start to be overcome through: (1) public-private partnerships that lower the cost of digital devices for hospital systems to provide to their patients, (2) increased focus on data storage and retrieval infrastructure, and (3) patient education.

In the future, the greatest potential lies in a transition to truly continuous PD symptom monitoring, leveraging easy-to-use mobile applications on smart devices (e.g., smartphones, smartwatches) that integrate quantitative and qualitative (e.g., quality-of-life surveys) data so physicians can better understand a patient’s experience with Parkinson’s. Further development of these applications, along with live data transmission to and storage in the cloud, will enhance their usability and utility. Incorporating machine learning into these functionalities can then enable more objective disease staging/diagnosis by physicians and enhanced prediction of disease progression. However, much work remains in developing better disease biomarkers on which to train these machine learning algorithms. Reliable biomarkers must accurately identify symptoms of PD across patient populations and stages of disease, and may need to differ by context (e.g., tremor during driving is different from tremor while brushing teeth). Identifying the nuances of digital biomarkers will be essential to realizing the full potential of machine learning and high technology in the monitoring of Parkinson’s symptoms.

Methods

We queried the US National Library of Medicine PubMed database. Five compound search terms were used to query PubMed for machine learning and computational publications and clinical trials: “Parkinson’s” + SYMPTOM + (1) machine learning, (2) neural network, (3) quantification, (4) analysis, and (5) monitoring, where “SYMPTOM” was “tremor”, “gait”, “bradykinesia”, or “dyskinesia”. These queries returned 10,200 papers. Manuscripts about technology for monitoring PD symptoms were identified with the advanced search terms: ((automatic detection) OR (classification) OR (wearables) OR (digital health) OR (sensors)) AND “Parkinson’s” + SYMPTOM. These queries returned 2600 papers. Studies were de-duplicated and then excluded if they did not have full-text availability, use data from humans, or evaluate PD specifically. Book chapters, review articles, and “short communications” were also excluded. Titles and abstracts were reviewed before further assessing a sub-set of representative English-language papers, selected because they best characterized the machine learning and technology timelines that emerged from the literature.
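The compound search strategy above amounts to straightforward query-string construction: a Cartesian product of symptoms and computational terms for the first search, plus one technology-focused query per symptom for the second. The exact PubMed Boolean syntax below is an assumption for illustration; only the term combinations follow the text.

```python
from itertools import product

SYMPTOMS = ["tremor", "gait", "bradykinesia", "dyskinesia"]
ML_TERMS = ["machine learning", "neural network", "quantification",
            "analysis", "monitoring"]
TECH_TERMS = ("(automatic detection) OR (classification) OR (wearables) "
              "OR (digital health) OR (sensors)")

# Compound terms for the machine learning / computational search:
# 4 symptoms x 5 computational terms = 20 queries
ml_queries = [f'"Parkinson\'s" AND "{symptom}" AND "{term}"'
              for symptom, term in product(SYMPTOMS, ML_TERMS)]

# Advanced search terms for the technology-focused search: one per symptom
tech_queries = [f'({TECH_TERMS}) AND "Parkinson\'s" AND "{symptom}"'
                for symptom in SYMPTOMS]
```

Each string could then be submitted to PubMed (e.g., via its E-utilities API), with de-duplication and the exclusion criteria described above applied to the pooled results.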

Reporting summary

Further information on research design is available in the Nature Research Reporting Summary linked to this article.

Supplementary information

Supplementary Information (166.8KB, pdf)
Reporting Summary (983.2KB, pdf)

Author contributions

I.J.P. and A.K. devised the project. A.S.C. conducted the literature search with support from I.J.P. and A.K. A.S.C. wrote the manuscript with support from I.J.P. and A.K. All authors contributed to the final manuscript and approve of its contents.

Funding

Open Access funding provided by the National Institutes of Health (NIH).

Data availability

The data used to generate the figures and tables are publicly available to researchers through the National Library of Medicine. Additional inquiries may be directed to the corresponding author.

Competing interests

The authors declare no competing interests.

Footnotes

Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information

The online version contains supplementary material available at 10.1038/s41746-022-00568-y.

References

  • 1.Liang, T.-W. & Tarsy, D. In Up to Date (ed. Post, T. W.) (UpToDate, 2021).
  • 2.Powers, R. et al. Smartwatch inertial sensors continuously monitor real-world motor fluctuations in Parkinson’s disease. Sci. Transl. Med.13, eabd7865 (2021). [DOI] [PubMed]
  • 3.Rovini, E., Maremmani, C. & Cavallo, F. How wearable sensors can support Parkinson’s disease diagnosis and treatment: a systematic review. Front. Neurosci. 11, 555 (2017). [DOI] [PMC free article] [PubMed]
  • 4.Kovosi S, Freeman M. Administering medications for Parkinson’s disease on time. Nursing. 2011;41:66. doi: 10.1097/01.NURSE.0000394533.76028.32. [DOI] [PubMed] [Google Scholar]
  • 5.Grissinger M. Delayed administration and contraindicated drugs place hospitalized Parkinson’s disease patients at. Risk. P T. 2018;43:10–39. [PMC free article] [PubMed] [Google Scholar]
  • 6.Groiss SJ, Wojtecki L, Südmeyer M, Schnitzler A. Deep brain stimulation in Parkinson’s disease. Ther. Adv. Neurol. Disord. 2009;2:20–28. doi: 10.1177/1756285609339382. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 7.Movement Disorder Society Task Force on Rating Scales for Parkinson’s Disease. The unified Parkinson’s disease Rating Scale (UPDRS): status and recommendations. Mov. Disord. 2003;18:738–750. doi: 10.1002/mds.10473. [DOI] [PubMed] [Google Scholar]
  • 8.Goetz CG, et al. Movement Disorder Society-sponsored revision of the Unified Parkinson’s Disease Rating Scale (MDS-UPDRS): Process, format, and clinimetric testing plan. Mov. Disord. 2007;22:41–47. doi: 10.1002/mds.21198. [DOI] [PubMed] [Google Scholar]
  • 9.Louis ED, et al. Clinical correlates of action tremor in Parkinson disease. Arch. Neurol. 2001;58:1630. doi: 10.1001/archneur.58.10.1630. [DOI] [PubMed] [Google Scholar]
  • 10.Heldman DA, et al. The Modified Bradykinesia Rating Scale for Parkinson’s disease: reliability and comparison with kinematic measures. Mov. Disord. 2011;26:1859–1863. doi: 10.1002/mds.23740. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 11.Bathien N, Koutlidis RM, Rondot P. EMG patterns in abnormal involuntary movements induced by neuroleptics. J. Neurol. Neurosurg. Psychiatry. 1984;47:1002–1008. doi: 10.1136/jnnp.47.9.1002. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 12.Andrews CJ. Influence of dystonia on the response to long-term L-dopa therapy in Parkinson’s disease. J. Neurol. Neurosurg. Psychiatry. 1973;36:630–636. doi: 10.1136/jnnp.36.4.630. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 13.Milner-Brown HS, Fisher MA, Weiner WJ. Electrical properties of motor units in Parkinsonism and a possible relationship with bradykinesia. J. Neurol. Neurosurg. Psychiatry. 1979;42:35–41. doi: 10.1136/jnnp.42.1.35. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 14.Hacisalihzade SS, Albani C, Mansour M. Measuring parkinsonian symptoms with a tracking device. Comput. Methods Prog. Biomed. 1988;27:257–268. doi: 10.1016/0169-2607(88)90090-9. [DOI] [PubMed] [Google Scholar]
  • 15.Beuter A, de Geoffroy A, Cordo P. The measurement of tremor using simple laser systems. J. Neurosci. Methods. 1994;53:47–54. doi: 10.1016/0165-0270(94)90143-0. [DOI] [PubMed] [Google Scholar]
  • 16.Weller C, et al. Defining small differences in efficacy between anti-parkinsonian agents using gait analysis: a comparison of two controlled release formulations of levodopa/decarboxylase inhibitor. Br. J. Clin. Pharm. 1993;35:379–385. doi: 10.1111/j.1365-2125.1993.tb04154.x. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 17.O’Suilleabhain PE, Dewey RB. Validation for tremor quantification of an electromagnetic tracking device. Mov. Disord. 2001;16:265–271. doi: 10.1002/mds.1064. [DOI] [PubMed] [Google Scholar]
  • 18.Deuschl G, Lauk M, Timmer J. Tremor classification and tremor time series analysis. Chaos: Interdiscip. J. Nonlinear Sci. 1998;5:48. doi: 10.1063/1.166084. [DOI] [PubMed] [Google Scholar]
  • 19.Spyers-Ashby JM, Stokes MJ, Bain PG, Roberts SJ. Classification of normal and pathological tremors using a multidimensional electromagnetic system. Med. Eng. Phys. 1999;21:713–723. doi: 10.1016/s1350-4533(00)00004-7. [DOI] [PubMed] [Google Scholar]
  • 20.Rajaraman V, et al. A novel quantitative method for 3D measurement of Parkinsonian tremor. Clin. Neurophysiol. 2000;111:338–343. doi: 10.1016/s1388-2457(99)00230-8. [DOI] [PubMed] [Google Scholar]
  • 21.Hoff JI, van der Meer V, van Hilten JJ. Accuracy of objective ambulatory accelerometry in detecting motor complications in patients with Parkinson’s disease. Clin. Neuropharmacol. 2004;27:53–57. doi: 10.1097/00002826-200403000-00002. [DOI] [PubMed] [Google Scholar]
  • 22.Dunnewold RJW, et al. Ambulatory quantitative assessment of body position, bradykinesia, and hypokinesia in Parkinson’s disease. J. Clin. Neurophysiol. 1998;15:235–242. doi: 10.1097/00004691-199805000-00007. [DOI] [PubMed] [Google Scholar]
  • 23.Hoff JI, van den Plas AA, Wagemans EA, van Hilten JJ. Accelerometric assessment of levodopa-induced dyskinesias in Parkinson’s disease. Mov. Disord. 2001;16:58–61. doi: 10.1002/1531-8257(200101)16:1<58::aid-mds1018>3.0.co;2-9. [DOI] [PubMed] [Google Scholar]
  • 24.Dunnewold RJW, Jacobi CE, van Hilten JJ. Quantitative assessment of bradykinesia in patients with Parkinson’s disease. J. Neurosci. Methods. 1997;74:107–112. doi: 10.1016/s0165-0270(97)02254-1. [DOI] [PubMed] [Google Scholar]
  • 25.Salarian A, et al. Quantification of tremor and bradykinesia in Parkinson’s disease using a novel ambulatory monitoring system. IEEE Trans. Biomed. Eng. 2007;54:313–322. doi: 10.1109/TBME.2006.886670. [DOI] [PubMed] [Google Scholar]
  • 26.Mera TO, Heldman DA, Espay AJ, Payne M, Giuffrida JP. Feasibility of home-based automated Parkinson’s disease motor assessment. J. Neurosci. Methods. 2012;203:152–156. doi: 10.1016/j.jneumeth.2011.09.019. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 27.Heldman DA, et al. Automated motion sensor quantification of gait and lower extremity Bradykinesia. Conf. Proc. IEEE Eng. Med Biol. Soc. 2012;2012:1956–1959. doi: 10.1109/EMBC.2012.6346338. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 28.Phan, D., Horne, M., Pathirana, P. N. & Farzanehfar, P. Measurement of axial rigidity and postural instability using wearable sensors. Sensors (Basel)18, 495 (2018). [DOI] [PMC free article] [PubMed]
  • 29.Salarian A, et al. Analyzing 180° turns using an inertial system reveals early signs of progress in Parkinson’s Disease. Conf. Proc. IEEE Eng. Med Biol. Soc. 2009;2009:224–227. doi: 10.1109/IEMBS.2009.5333970. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 30.Moore ST, et al. Autonomous identification of freezing of gait in Parkinson’s disease from lower-body segmental accelerometry. J. Neuroeng. Rehabil. 2013;10:19. doi: 10.1186/1743-0003-10-19. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 31.Mancini M, et al. Measuring freezing of gait during daily-life: an open-source, wearable sensors approach. J. Neuroeng. Rehabil. 2021;18:1. doi: 10.1186/s12984-020-00774-3. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 32.Reches, T. et al. Using wearable sensors and machine learning to automatically detect freezing of gait during a FOG-Provoking test. Sensors (Basel)20, 4474 (2020). [DOI] [PMC free article] [PubMed]
  • 33.Tripoliti EE, et al. Automatic detection of freezing of gait events in patients with Parkinson’s disease. Comput. Methods Prog. Biomed. 2013;110:12–26. doi: 10.1016/j.cmpb.2012.10.016. [DOI] [PubMed] [Google Scholar]
  • 34.Zach H, et al. Identifying freezing of gait in Parkinson’s disease during freezing provoking tasks using waist-mounted accelerometry. Parkinsonism. Relat. Disord. 2015;21:1362–1366. doi: 10.1016/j.parkreldis.2015.09.051. [DOI] [PubMed] [Google Scholar]
  • 35.Manson A, et al. An ambulatory dyskinesia monitor. J. Neurol. Neurosurg. Psychiatry. 2000;68:196–201. doi: 10.1136/jnnp.68.2.196. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 36.Pulliam CL, et al. Continuous assessment of levodopa response in Parkinson’s disease using wearable motion sensors. IEEE Trans. Biomed. Eng. 2018;65:159–164. doi: 10.1109/TBME.2017.2697764. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 37.Rodríguez-Molinero, A. et al. Estimating dyskinesia severity in Parkinson’s disease by using a waist-worn sensor: concurrent validity study. Sci. Rep.9, 13434 (2019). [DOI] [PMC free article] [PubMed]
  • 38.Giovannoni G, van Schalkwyk J, Fritz V, Lees A. Bradykinesia akinesia inco-ordination test (BRAIN TEST): an objective computerised assessment of upper limb motor function. J. Neurol. Neurosurg. Psychiatry. 1999;67:624–629. doi: 10.1136/jnnp.67.5.624. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 39.Allen DP, et al. On the use of low-cost computer peripherals for the assessment of motor dysfunction in Parkinson’s disease—quantification of bradykinesia using target tracking tasks. IEEE Trans. Neural Syst. Rehabilitation Eng. 2007;15:286–294. doi: 10.1109/TNSRE.2007.897020. [DOI] [PubMed] [Google Scholar]
  • 40.Espay AJ, et al. At-home training with closed-loop augmented-reality cueing device for improving gait in patients with Parkinson’s disease. J. Rehabil. Res. Dev. 2010;47:573. doi: 10.1682/jrrd.2009.10.0165. [DOI] [PubMed] [Google Scholar]
  • 41.Bachlin M, et al. Wearable assistant for Parkinson’s disease patients with the freezing of gait symptom. IEEE Trans. Inf. Technol. Biomed. 2010;14:436–446. doi: 10.1109/TITB.2009.2036165. [DOI] [PubMed] [Google Scholar]
  • 42.Lee, A. et al. Can google glassTM technology improve freezing of gait in parkinsonism? A pilot study. Disabil. Rehabil. Assist. Technol. 1–11. 10.1080/17483107.2020.1849433 (2020). [DOI] [PubMed]
  • 43.Rao, A. S. et al. Quantifying drug induced dyskinesia in Parkinson’s disease patients using standardized videos. In: 2008 30th Annual International Conference of the IEEE Engineering in Medicine and Biology Society 1769–1772. 10.1109/IEMBS.2008.4649520 (2008). [DOI] [PubMed]
  • 44.van Hilten JJ, Middelkoop HA, Kerkhof GA, Roos RA. A new approach in the assessment of motor activity in Parkinson’s disease. J. Neurol. Neurosurg. Psychiatry. 1991;54:976–979. doi: 10.1136/jnnp.54.11.976. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 45.Burne JA, Hayes MW, Fung VSC, Yiannikas C, Boljevac D. The contribution of tremor studies to diagnosis of Parkinsonian and essential tremor: a statistical evaluation. J. Clin. Neurosci. 2002;9:237–242. doi: 10.1054/jocn.2001.1017. [DOI] [PubMed] [Google Scholar]
  • 46.Cole, B. T., Roy, S. H., Luca, C. J. D. & Nawab, S. H. Dynamic neural network detection of tremor and dyskinesia from wearable sensor data. In: 2010 Annual International Conference of the IEEE Engineering in Medicine and Biology 6062–6065. 10.1109/IEMBS.2010.5627618 (2010). [DOI] [PubMed]
  • 47.Tsipouras MG, et al. An automated methodology for levodopa-induced dyskinesia: assessment based on gyroscope and accelerometer signals. Artif. Intell. Med. 2012;55:127–135. doi: 10.1016/j.artmed.2012.03.003. [DOI] [PubMed] [Google Scholar]
  • 48.Papapetropoulos, S. et al. Objective quantification of neuromotor symptoms in Parkinson’s disease: implementation of a portable, computerized measurement tool. Parkinsons Dis.2010, (2010). [DOI] [PMC free article] [PubMed]
  • 49.Yang C-C, Hsu Y-L, Shih K-S, Lu J-M. Real-time gait cycle parameter recognition using a wearable accelerometry system. Sensors (Basel) 2011;11:7314–7326. doi: 10.3390/s110807314. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 50.Klucken, J. et al. Unbiased and mobile gait analysis detects motor impairment in Parkinson’s disease. PLoS ONE8, e56956 (2013). [DOI] [PMC free article] [PubMed]
  • 51.Marcante, A. et al. Foot pressure wearable sensors for freezing of gait detection in Parkinson’s disease. Sensors (Basel)21, 128 (2020). [DOI] [PMC free article] [PubMed]
  • 52.Mahadevan N, et al. Development of digital biomarkers for resting tremor and bradykinesia using a wrist-worn wearable device. npj Digital Med. 2020;3:1–12. doi: 10.1038/s41746-019-0217-7. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 53.Heldman DA, et al. Telehealth management of Parkinson’s disease using wearable Sensors: Exploratory Study. Digit Biomark. 2017;1:43–51. doi: 10.1159/000475801. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 54.Ferreira JJ, et al. Quantitative home-based assessment of Parkinson’s symptoms: the SENSE-PARK feasibility and usability study. BMC Neurol. 2015;15:89. doi: 10.1186/s12883-015-0343-z. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 55.Fisher JM, Hammerla NY, Rochester L, Andras P, Walker RW. Body-worn sensors in Parkinson’s disease: evaluating their acceptability to patients. Telemed. J. E Health. 2016;22:63–69. doi: 10.1089/tmj.2015.0026. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 56.Evers, L. J. et al. Real-life gait performance as a digital biomarker for motor fluctuations: the Parkinson@Home validation study. J. Med. Internet Res.22, e19068 (2020). [DOI] [PMC free article] [PubMed]
  • 57.Erb MK, et al. mHealth and wearable technology should replace motor diaries to track motor fluctuations in Parkinson’s disease. npj Digital Med. 2020;3:1–10. doi: 10.1038/s41746-019-0214-x. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 58.Chen B, et al. A web-based system for home monitoring of patients with Parkinson’s disease using wearable sensors. IEEE Trans. Biomed. Eng. 2011;58:831–836. doi: 10.1109/TBME.2010.2090044. [DOI] [PubMed] [Google Scholar]
  • 59.Cancela, J., Pastorino, M., Arredondo, M. T. & Hurtado, O. A telehealth system for Parkinson’s disease remote monitoring. The PERFORM approach. In: 2013 35th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC) 7492–7495. 10.1109/EMBC.2013.6611291 (2013). [DOI] [PubMed]
  • 60.Daneault, J.-F., Carignan, B., Codère, C. É., Sadikot, A. F. & Duval, C. Using a smart phone as a standalone platform for detection and monitoring of pathological tremors. Front. Hum. Neurosci. 6, 357(2013). [DOI] [PMC free article] [PubMed]
  • 61.Lo C, et al. Predicting motor, cognitive & functional impairment in Parkinson’s. Ann. Clin. Transl. Neurol. 2019;6:1498–1509. doi: 10.1002/acn3.50853. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 62.Kostikis, N., Hristu-Varsakelis, D., Arnaoutoglou, M., Kotsavasiloglou, C. & Baloyiannis, S. Towards remote evaluation of movement disorders via smartphones. In: 2011 Annual International Conference of the IEEE Engineering in Medicine and Biology Society 5240–5243. 10.1109/IEMBS.2011.6091296 (2011). [DOI] [PubMed]
  • 63.van Brummelen, E. M. J. et al. Quantification of tremor using consumer product accelerometry is feasible in patients with essential tremor and Parkinson’s disease: a comparative study. J. Clin. Mov. Disord.7, 4 (2020). [DOI] [PMC free article] [PubMed]
  • 64.Banaszkiewicz K, Rudzińska M, Bukowczan S, Izworski A, Szczudlik A. Spiral drawing time as a measure of bradykinesia. Neurol. Neurochir. Pol. 2009;43:16–21. [PubMed] [Google Scholar]
  • 65.Silva de Lima, A. L. et al. Feasibility of large-scale deployment of multiple wearable sensors in Parkinson’s disease. PLoS ONE12, e0189161 (2017). [DOI] [PMC free article] [PubMed]
  • 66.Prince J, Andreotti F, De Vos M. Multi-source ensemble learning for the remote prediction of Parkinson’s disease in the presence of source-wise missing data. IEEE Trans. Biomed. Eng. 2018;66:1402–1411. doi: 10.1109/TBME.2018.2873252. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 67.Daneault, J.-F. et al. Accelerometer data collected with a minimum set of wearable sensors from subjects with Parkinson’s disease. Scientific Data8, 48 (2021). [DOI] [PMC free article] [PubMed]
  • 68.Keijsers NLW, Horstink MWIM, Gielen SCAM. Automatic assessment of levodopa-induced dyskinesias in daily life by neural networks. Mov. Disord. 2003;18:70–80. doi: 10.1002/mds.10310. [DOI] [PubMed] [Google Scholar]
  • 69.Patel S, et al. Monitoring motor fluctuations in patients with Parkinson’s disease using wearable sensors. IEEE Trans. Inf. Technol. Biomed. 2009;13:864–873. doi: 10.1109/TITB.2009.2033471. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 70.Ghoraani B, Hssayeni MD, Bruack MM, Jimenez-Shahed J. Multilevel features for sensor-based assessment of motor fluctuation in Parkinson’s disease subjects. IEEE J. Biomed. Health Inf. 2020;24:1284–1295. doi: 10.1109/JBHI.2019.2943866. [DOI] [PubMed] [Google Scholar]
  • 71.Aich, S. et al. A supervised machine learning approach to detect the on/off state in Parkinson’s disease using wearable based gait signals. Diagnostics (Basel)10, 421 (2020). [DOI] [PMC free article] [PubMed]
  • 72.Zhan A, et al. Using smartphones and machine learning to quantify parkinson disease severity. JAMA Neurol. 2018;75:876–880. doi: 10.1001/jamaneurol.2018.0809. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 73.Pfister FMJ, et al. High-resolution motor state detection in Parkinson’s disease using convolutional neural networks. Sci. Rep. 2020;10:5860. doi: 10.1038/s41598-020-61789-3. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 74.Lu M, et al. Vision-based estimation of MDS-UPDRS gait scores for assessing Parkinson’s disease motor severity. Med. Image Comput. Comput. Assist Int. 2020;12263:637–647. doi: 10.1007/978-3-030-59716-0_61. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 75.Chen S-W, et al. Quantification and recognition of parkinsonian gait from monocular video imaging using kernel-based principal component analysis. Biomed. Eng. Online. 2011;10:99. doi: 10.1186/1475-925X-10-99. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 76.Bank PJM, Marinus J, Meskers CGM, de Groot JH, van Hilten JJ. Optical hand tracking: a novel technique for the assessment of bradykinesia in Parkinson’s disease. Mov. Disord. Clin. Pr. 2017;4:875–883. doi: 10.1002/mdc3.12536. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 77.Arora S, et al. Detecting and monitoring the symptoms of Parkinson’s disease using smartphones: a pilot study. Parkinsonism Relat. Disord. 2015;21:650–653. doi: 10.1016/j.parkreldis.2015.02.026. [DOI] [PubMed] [Google Scholar]
  • 78.Singh S, Xu W. Robust detection of Parkinson’s disease using harvested smartphone voice data: a telemedicine approach. Telemed. J. E Health. 2020;26:327–334. doi: 10.1089/tmj.2018.0271. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 79.Rusz J, et al. Smartphone allows capture of speech abnormalities associated with high risk of developing Parkinson’s disease. IEEE Trans. Neural Syst. Rehabilitation Eng. 2018;26:1495–1507. doi: 10.1109/TNSRE.2018.2851787. [DOI] [PubMed] [Google Scholar]
  • 80.Sajal M. S. R. et al. Telemonitoring Parkinson’s disease using machine learning by combining tremor and voice analysis. Brain Inform.7, 12 (2020). [DOI] [PMC free article] [PubMed]
  • 81.Collobert, R., Bengio, S. & Marithoz, J. Torch: A Modular Machine Learning Software Library (CiteSeerx, 2002).
  • 82.Rumelhart DE, Hinton GE, Williams RJ. Learning representations by back-propagating errors. Nature. 1986;323:533–536. [Google Scholar]
  • 83.Albers JW, Potvin AR, Tourtellotte WW, Pew RW, Stribley RF. Quantification of hand tremor in the clinical neurological examination. IEEE Trans. Biomed. Eng. 1973;20:27–37. doi: 10.1109/TBME.1973.324248. [DOI] [PubMed] [Google Scholar]

Associated Data

This section collects any data citations, data availability statements, or supplementary materials included in this article.

Supplementary Materials

Supplementary Information (166.8KB, pdf)
Reporting Summary (983.2KB, pdf)

Data Availability Statement

The data used to generate the figures and tables are publicly available to researchers through the US National Library of Medicine. Additional inquiries may be directed to the corresponding author.


Articles from NPJ Digital Medicine are provided here courtesy of Nature Publishing Group
