Author manuscript; available in PMC 2021 Apr 1.
Published in final edited form as: IEEE Sens. J., 2019 Dec 17; 20(7): 3777–3787. doi: 10.1109/JSEN.2019.2960320

Wearable Inertial Sensors for Range of Motion Assessment

Ashwin Rajkumar, Fabio Vulpi, Satish Reddy Bethi, Hassam Khan Wazir, Preeti Raghavan, Vikram Kapila
PMCID: PMC7202549  NIHMSID: NIHMS1573766  PMID: 32377175

Abstract

This paper presents the design and development of wearable inertial sensors (WIS) for real-time simultaneous triplanar motion capture of the upper extremity (UE). The sensors simultaneously capture UE range of motion (ROM) in the frontal, sagittal, and horizontal planes, which is critical to assess an individual's movement limitations and determine appropriate rehabilitative treatments. Off-the-shelf sensors and microcontrollers are used to develop the WIS system, which wirelessly streams real-time joint orientation for UE ROM measurement. Key developments include: 1) two novel approaches, using earth's gravity (EG approach) and magnetic field (EGM approach) as references, to correct misalignments in the orientation between the sensor and its housing to minimize measurement errors; 2) implementation of the joint coordinate system (JCS)-based method for triplanar ROM measurements for clinical use; and 3) an in-situ guided mounting technique for accurate sensor placement and alignment on the human body. The results 1) compare the computational time of the two orientation misalignment correction approaches (EG approach = 325.05 μs and EGM approach = 92.50 μs for five modules); 2) demonstrate the accuracy and repeatability of measurements from the WIS system (percent deviation of measured angle from applied angle is less than ±6.5% and percent coefficient of variation is less than 11%, indicating acceptable accuracy and repeatability, respectively); and 3) demonstrate the feasibility of using the WIS system within the JCS framework for providing anatomically-correct simultaneous triplanar ROM measurements of shoulder, elbow, and forearm movements during several upper limb exercises.

Keywords: motor assessment, rehabilitation, wearable inertial sensors, motion capture, range of motion, joint coordinate system, triplanar

I. Introduction

Arm movements are essential for performing many activities of daily living (ADL). Each activity has minimum range of motion (ROM) requirements for the various upper limb joints [1]. A joint or muscle injury, or a neurological event such as a stroke, spinal cord injury, or nerve damage, can affect the ROM of one or more joints and make ADL difficult to perform. The first step towards rehabilitation is to identify the limitations in ROM and develop strategies for restoring it. The second step is to track the recovery of ROM and change treatment strategies if necessary. The human arm exhibits seven degrees of freedom (three at the shoulder, one at the elbow, and three at the wrist), which require effective tools for triplanar measurements to assess recovery and track the improvement in motor capability.

In a clinical setting, ROM is measured using goniometers [2], inclinometers [3], video analysis software such as Dartfish [4], etc. However, these techniques suffer from high interobserver variability [3]. Although Dartfish has high interobserver agreement, it requires time-consuming extraction of ROM information from video capture. Moreover, scant research has considered integration of the aforementioned clinically used devices with the joint coordinate system (JCS), proposed by the International Society of Biomechanics (ISB), which provides a comprehensive method for the visualization of triplanar joint ROM [5], [6]. Sophisticated systems for motion capture (MOCAP), reviewed in Section II, are used in research but are not typically feasible to use in clinical environments.

Recent advancements in micro-electromechanical systems have led to the miniaturization of inertial measurement units (IMU) and magnetometer, accelerometer, and gyroscope (MARG) sensor arrays, collectively referred to as inertial sensors. These inertial sensors are widely used in varied applications (e.g., aerospace, automotive, sports medicine, etc.) for pose measurement and have revolutionized the ability to precisely track the position and orientation of a rigid body in 3D space. Moreover, the small footprint of such sensors permits their integration in everyday ubiquitous devices such as smartphones. With recent innovations in wireless communication protocols obviating the need for wired connectivity and allowing portability, new avenues for adoption of inertial sensors have opened in myriad wearable electronic devices such as smart watches. These inertial sensors, coupled with wireless connectivity, are also used for MOCAP applications [7]–[17]. However, low-cost IMU and MARG sensors are often prone to misalignment, orthogonality, and offset errors, which require correction methods to achieve accurate position and orientation measurements.

To address the limitations of existing clinical tools for upper extremity (UE) ROM assessment, we have developed a mechatronics-based wearable inertial sensor (WIS) system made from off-the-shelf components. Section II reviews the related works in MOCAP and correction techniques for orientation misalignment of inertial sensors. Section III describes the design and development of the mechatronic sensors and features of the WIS system. Section IV explains the correction techniques to account for orientation misalignments between the sensor and its housing. Section V describes the joint coordinate system (JCS) and its application to triplanar ROM computation with the WIS system. Section VI describes proof-of-concept results on a human subject. Section VII concludes the paper and suggests directions for future research.

II. Related Work

MOCAP is an interdisciplinary research topic that focuses on quantifying motion and enabling interaction in real and virtual environments. Commercially available MOCAP systems can be broadly classified into: (i) optical marker-based systems [18], (ii) electromagnetic position tracking systems [19], (iii) markerless optical systems [20], [21], and (iv) inertial sensing systems [7]–[17]. Marker-based optical MOCAP is the gold standard for tracking joint position and angular movement with high precision and accuracy [18]. Nevertheless, such systems require precise marker placement and expensive cameras, all of which are burdensome for clinical use. Furthermore, marker occlusion can occur during limb movements, making tracking difficult. Electromagnetic position tracking (e.g., by Ascension Technology Corp.) computes the position of body-worn electromagnetic sensors relative to a transmitter [19]. These systems avoid the use of multiple cameras and marker occlusion, but they are not easy to use for clinical purposes. Markerless optical systems such as the Kinect™ V2 (Microsoft Corp., Redmond, WA) are popular MOCAP devices for measuring joint positions in 3D space [20], [21]. However, the Kinect and other markerless video-analysis systems cannot make measurements in the horizontal plane, such as shoulder internal-external rotation and forearm pronation-supination, which are critical for ADL [1]. Furthermore, the Kinect cannot be used in noisy visual environments. Recent advancements in deep learning for markerless MOCAP using videography can reduce the human effort needed to track human and animal behavior [22], [23], but these approaches have the same limitations as other vision-based systems such as the Kinect and do not provide precise triplanar measurements for real-time applications.

A. MOCAP With Wearable Inertial Sensors

Inertial sensors refer to a family of sensors capable of measuring the pose of a rigid body in 3D space [7]. Commercially available inertial sensors for MOCAP (e.g., Opal, Xsens, etc.) are expensive due to their built-in calibration techniques, sensor fusion algorithms, offset correction techniques, and software support [8], [10]–[12], and are not suitable for translation to at-home use and clinical practice.

Development of sensor fusion algorithms for extracting orientation information from inertial sensors' raw accelerometer, gyroscope, and magnetometer data is presented in [13], [14]. These fusion algorithms have been considered in the context of a single inertial sensor but are yet to be explored in MOCAP applications that require simultaneous use of a network of multiple inertial sensors. Consumer-targeted inertial sensors can be used for numerous medical diagnostic applications, as discussed in [15]. However, for single-joint motion, the approach of [15] does not produce clinically usable triplanar measurements. A comparison of commercial MOCAP systems vs. consumer-grade inertial sensors for UE ROM measurements is presented in [17], and the results suggest that consumer-grade sensors can provide accuracy similar to commercial MOCAP systems. Finally, a state-of-the-art review [5] on using inertial sensors for MOCAP recommends JCS-based ROM reporting, as human motion consists of triplanar movements requiring simultaneous measurements about multiple axes.

B. Orientation Misalignment Correction

Inertial sensors require an initial calibration and correction for various misalignment errors to provide accurate measurements. Specifically, obtaining precise measurements from low-cost inertial sensors requires the following steps [24]: (i) calibration of the individual sensors of the IMU or MARG system; (ii) correction of the misalignment arising from the offset between the inertial sensor and the housing containing it; and (iii) anatomical calibration to account for the misalignment between the inertial sensor and the body segment on which it is mounted. To retrieve accurate measurements from individual sensors (i.e., accelerometer, gyroscope, and magnetometer), the calibration procedure corrects for the errors arising from: (i) scaling factors, (ii) cross-axis sensitivity, (iii) offsets and non-orthogonality in the three axes, etc. Comprehensive approaches for calibration of IMU sensors utilizing sensor error models for accelerometers and gyroscopes are presented in [25], [26]. Dynamic model-based adaptive control techniques to improve the performance of microgyroscopes are presented in [27]–[29]. Nonetheless, recent years have witnessed inertial sensor packages endowed with on-board microcontrollers to support self-calibration. For example, the BNO055 [30] offers simple experimental routines for calibration that can be performed by novice users, effectively obviating the need for the individual sensor calibration techniques of [25]–[29].

Sensor-housing offset arises when the sensor orientation does not align with the orientation of sensor housing. For such a case, effective orientation misalignment correction techniques need to be developed to reprocess the sensor measurement and align it with the housing to obtain accurate measurement. A rotation matrix-based orientation misalignment correction method using a calibration device is developed in [31], which yields unique results for a specific sensor and its housing and needs to be repeated for each sensor-housing pair. Alternatively, in this paper, we utilize a simpler and computationally efficient quaternion-based approach to develop two orientation misalignment correction methods for the inertial sensors.

Finally, anatomical calibration is essential for accurate measurement of joint angles from the human body. Alignment-free calibration of wearable inertial sensors for the human body has been examined by using prescribed motion sequences in the upper [9], [15] and lower [10] extremities. However, a limitation of such an approach is that the individual must initiate movements from a standard position, which may not be achievable for persons with movement limitations.

In this paper, we develop a WIS system for UE ROM assessment using off-the-shelf inertial sensors that wirelessly stream quaternion data for the absolute orientation of the sensors. Two sensor-to-housing orientation misalignment correction techniques are developed to use the quaternion measurements from the inertial sensors and retrieve absolute orientation of the housing. The JCS approach [5], [6] is utilized to compute the ROM data from the quaternion measurements obtained from the sensors worn by the subject. An in-situ data-driven technique for mounting and aligning the WIS on the human body is discussed for precise placement of the sensors, resulting in accurate measurement of ROM.

III. Mechatronic Design of Wearable Inertial Sensors

Individuals with movement limitations may have highly variable initial positions. Hence, to be truly applicable in a clinical setting, the sensing method should be able to measure joint ROM accurately from any initial position of the extremity. The objectives of this work are to (i) measure the initial pose of the arm from the absolute orientation of the body-worn sensors and (ii) measure the ROM simultaneously in the three planes at the shoulder and elbow. To achieve these objectives, the individual sensor worn on each body segment must sense the orientation relative to the previous body segment. Prior work shows that mounting the sensors at the distal end of the limb segment, away from the joint whose motion is being measured, increases the accuracy of joint angle measurement [15]. Moreover, wireless connectivity such as the Bluetooth low energy (BLE) protocol can eliminate the hassle of being tethered to a computer [19]. The WIS system consists of five wireless inertial sensors mounted on the body as shown in Fig. 1, where LF, RF, LA, and RA represent the sensors mounted on the left and right forearm and left and right arm, respectively, and B refers to the sensor mounted on the back. The sensors are mounted using straps and a belt.

Fig. 1. WIS modules pictorially represented on a human model in T-pose.

The WIS system employs off-the-shelf inertial sensors and microcontrollers to facilitate translation of the technology to clinical settings. The multi-module wearable sensor framework of this paper requires a star topology that enables wireless connectivity of multiple devices to a single host computer or smartphone interface. The Gazell protocol from Nordic Semiconductor is a common peer-to-peer star-topology network [32]. The RFduino microcontroller, which supports the BLE and Gazell protocols, was chosen for the design of the WIS system.

Several low-cost, consumer-targeted MARG sensors (e.g., BNO055, MPU9150, and X-NUCLEO) can serve as inertial sensors in the proposed WIS system [33]. While all three sensors can provide absolute orientation, the BNO055 has superior static and dynamic angular measurement stability over the MPU9150 and X-NUCLEO [33]. Moreover, the BNO055's on-chip sensor fusion and various operating modes were deemed to offer a high degree of flexibility for the WIS system over the MPU9150. Specifically, the BNO055 sensor can measure (i) absolute orientation relative to the earth's magnetic field and gravity or (ii) relative orientation from its initial start position, based on the selected operating mode. The absolute orientation signals from the sensor can be retrieved using the following operating modes: (i) compass mode, (ii) nine degrees of freedom fast magnetometer calibration off mode (NDOF_FMC_OFF), and (iii) nine degrees of freedom fast magnetometer calibration on mode (NDOF_FMC_ON). The NDOF modes require an initial calibration of the three constituent sensors (three-axis accelerometer, magnetometer, and gyroscope) for streaming absolute orientation. In any operating mode, the absolute or relative orientation output of the BNO055 is obtainable as quaternions or Euler angles. Salient features of the BNO055 sensor are delineated in Fig. 2.

Fig. 2. BNO055 sensor operating modes.

The five wearable inertial sensors wirelessly connect to a USB host tethered to a computer to stream the quaternion data corresponding to each sensor's absolute orientation. Moreover, wearable 3D-printed housings were designed for hosting the sensors and mounting them on the human body. Fig. 3(a) shows a schematic representation of the printed circuit board (PCB) developed, and Fig. 3(b) shows the 3D-printed housing for the PCB. The quaternion data for absolute orientation are obtained from each BNO055 through I2C communication, and a packet of at most 18 bytes, containing each sensor's quaternion and its corresponding identifier, is created for streaming through the Gazell protocol to the computer; a sketch of host-side unpacking follows Fig. 3.

Fig. 3. (a) Block diagram of the electronics connected in the PCB and (b) 3D-printed housing for the PCB.
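As an illustration of the host-side parsing, the MATLAB sketch below unpacks one sensor record. The paper does not specify the byte layout, so this is an assumption: we take the BNO055's native quaternion format (four little-endian int16 components, with 2^14 LSB per unit) preceded by a one-byte identifier, i.e., 9 bytes per sensor, so that two such records fit within the stated 18-byte maximum.

```matlab
% Hypothetical packet layout (our assumption, not the paper's specification):
% [id, qw_lo, qw_hi, qx_lo, qx_hi, qy_lo, qy_hi, qz_lo, qz_hi] per sensor.
packet = uint8([3, 0, 64, 0, 0, 0, 0, 0, 0]);   % example: sensor 3, q = (1, 0, 0, 0)

sensorId = packet(1);                           % which WIS module sent this record
raw = double(typecast(packet(2:9), 'int16'));   % four signed 16-bit components
q = raw / 2^14;                                 % BNO055 scaling: 1 = 2^14 LSB
% Note: typecast uses the host's native byte order; a little-endian host is assumed.
```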

Euler angle, quaternion, and axis/angle representations are the commonly used methods to describe the absolute orientation of a rigid body [34]. Tait-Bryan angles [35], a subset of Euler angles, utilize three angles about the axes of the world coordinate frame to describe the rotation of the body. However, utilizing Euler angles to describe rigid body rotations often results in gimbal lock and singularity problems. Moreover, the computational simplicity of quaternions (four elements) vs. rotation matrices (nine elements) suggests the use of quaternions for describing rotations. A brief review of quaternions is included below for completeness.

A quaternion is a four-tuple representation of the orientation of a coordinate frame in 3D space. A quaternion describing the rotation of a coordinate frame given by the axis/angle representation $(\hat{k}, \phi)$, where $\hat{k} = [k_x\ k_y\ k_z]^T$, is characterized below.

$q_{\hat{k}}(\phi) := \left(\cos\tfrac{\phi}{2}\quad k_x\sin\tfrac{\phi}{2}\quad k_y\sin\tfrac{\phi}{2}\quad k_z\sin\tfrac{\phi}{2}\right)$ (1)

A quaternion $q := (q_w\ q_x\ q_y\ q_z)$ consists of a scalar $q_w$ and a vector $Q := [q_x\ q_y\ q_z]^T$. Throughout this paper, we denote vectors using uppercase letters, such as $Q$, and quaternions using lowercase letters, such as $q$. Moreover, the accents "$\hat{\ }$" and "$\tilde{\ }$" mark vectors and quaternions represented in the world frame and the sensor frame, respectively. Consistent with the notation in prior literature [13], [15], we use "$\otimes$" and "$*$" to denote the quaternion product and conjugate, respectively.
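For concreteness, minimal MATLAB implementations of these operations are shown below; later sections' sketches reuse them. This is our illustration, not the authors' released code; quaternions are stored as rows $[q_w\ q_x\ q_y\ q_z]$, and each function is assumed to be on the MATLAB path (e.g., in its own file).

```matlab
function r = qmult(p, q)
% Quaternion (Hamilton) product p (x) q; inputs are rows [qw qx qy qz].
r = [p(1)*q(1) - p(2)*q(2) - p(3)*q(3) - p(4)*q(4), ...
     p(1)*q(2) + p(2)*q(1) + p(3)*q(4) - p(4)*q(3), ...
     p(1)*q(3) - p(2)*q(4) + p(3)*q(1) + p(4)*q(2), ...
     p(1)*q(4) + p(2)*q(3) - p(3)*q(2) + p(4)*q(1)];
end

function r = qconj(q)
% Quaternion conjugate q*.
r = [q(1), -q(2), -q(3), -q(4)];
end

function q = qaxang(k, phi)
% Eq. (1): quaternion for a rotation of phi about the unit axis k = [kx ky kz].
q = [cos(phi/2), sin(phi/2) * k(:).'];
end

function w = qrotvec(q, v)
% Express a vector v, given in the frame of q, in the world frame:
% the pattern later used in Eq. (2), w = V(q (x) (0, v) (x) q*).
p = qmult(qmult(q, [0, v(:).']), qconj(q));
w = p(2:4);
end
```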

IV. Sensor-Housing Misalignment Correction

The BNO055 sensor measures the absolute orientation of the sensor relative to the world coordinate frame ($F_W$), whose $\hat{Z}_W$-axis is anti-parallel to gravity and whose $\hat{Y}_W$-axis points towards the earth's magnetic north. Preliminary measurements revealed that, after soldering the BNO055 to the PCB and placing it inside the housing, the BNO055 sensor's coordinate frame ($F_S$) was not aligned with the coordinate frame of the 3D-printed housing ($F_H$). To correct the orientation misalignment between the two frames, we utilize $F_W$ as a reference to develop two software signal processing methods, based on (i) earth's gravity (EG approach) and (ii) earth's gravity and magnetic field (EGM approach), to transform $F_S$ and align it with $F_H$.

A. Earth’s Gravity-Based Misalignment Correction

The sensor fusion algorithm embedded in the BNO055 MARG sensor is based on the principle that whenever a rotational axis of the sensor is aligned anti-parallel to the earth's gravity vector (i.e., along $\hat{Z}_W$), the angular rotation measurements about the other two axes are 0°. We utilize the direction of the vector $\hat{Z}_W$, anti-parallel to gravity, as a reference for correcting the BNO055 orientation $F_S$ to obtain $F_W$. A flat surface was created by using a smartphone's accelerometer such that the phone's screen is normal to gravity. The 3D-printed housing was placed on this surface (see Fig. 4) with its $Z_H$-axis ($\hat{Z}_H$) pointing upward and thus parallel to $\hat{Z}_W$. As seen in Fig. 4, the accelerometer measurements for the X- and Y-axes of the smartphone are zero, indicating that the gravity vector points inward, normal to the smartphone's screen, and $\hat{Z}_W$ points outward.

Fig. 4. Smartphone-based flat surface normal to gravity.

The $Z_S$-axis of the sensor, expressed in the sensor frame, is given by $\tilde{Z}_S = [0\ 0\ 1]^T$, and $\tilde{q}(\tilde{Z}_S) = (0\ \tilde{Z}_S^T)$ denotes the pure quaternion corresponding to $\tilde{Z}_S$. Now, we express the $\tilde{Z}_S$ vector in $F_W$ as a quaternion as shown below.

$\hat{q}_{Z_S} = \hat{q}_S \otimes \tilde{q}(\tilde{Z}_S) \otimes \hat{q}_S^*$ (2)

Using (2), we can now extract the $Z_S$-axis of the sensor in $F_W$ as $\hat{Z}_S = \hat{V}(\hat{q}_{Z_S})$. In (2), $\hat{q}_S$ denotes the quaternion measurement obtained from the BNO055 sensor, i.e., its orientation relative to $F_W$. Following (2) to obtain $\hat{Z}_S$, ideally we expect $\hat{Z}_S$ to align with the direction of $\hat{Z}_W$ (or $\hat{Z}_H$), which is given by $\hat{Z}_W = [0\ 0\ 1]^T$. However, human errors cause unavoidable misalignment between the orientations of the sensor ($\hat{Z}_S$) and the housing ($\hat{Z}_H$). This results in a non-zero angle γ between $\hat{Z}_S$ and $\hat{Z}_W$, which can be computed by using the dot product below.

$\gamma = \cos^{-1}\!\left(\hat{Z}_S \cdot \hat{Z}_W\right)$ (3)

A software rotation through γ needs to be performed to align $\hat{Z}_S$ with $\hat{Z}_H$, and this rotation must be performed about a vector normal to both $\hat{Z}_S$ and $\hat{Z}_W$. Thus, we determine a vector $\hat{P}$, normal to $\hat{Z}_S$ and $\hat{Z}_W$, computed in $F_W$ as shown below.

$\hat{P} = \hat{Z}_W \times \hat{Z}_S$ (4)

Using $\hat{P}$, we compute $\tilde{P}$ by expressing it in $F_S$; it is found to be constant throughout an entire 360° rotation of the housing about $\hat{Z}_H$, and the obtained $\tilde{P}$ is retained for further processing. Aligning the $Z_S$-axis to the $Z_H$-axis requires rotating $F_S$ by γ about the $\hat{P}$ axis. Hence, the pure quaternion of $\tilde{P}$ is expressed in $F_W$ as $\hat{q}(\hat{P}_S)$ as shown below.

$\hat{q}(\hat{P}_S) = \hat{q}_S \otimes \tilde{q}(\tilde{P}_S) \otimes \hat{q}_S^*$ (5)

Next, the quaternion $\hat{q}_{\hat{P}}(\gamma)$, describing a rotation in $F_W$ of angle γ around $\hat{P}_S = [x\ y\ z]^T$, is computed using (1). Finally, for the updated sensor coordinate frame ($F_{S'}$), (6) computes $\hat{q}_{S'}$, whose $\hat{Z}_{S'}$ is aligned with $\hat{Z}_H$.

$\hat{q}_{S'} = \hat{q}_{\hat{P}}^*(\gamma) \otimes \hat{q}_S$ (6)

This rotation is pictorially represented in Fig. 5(a).

Fig. 5. Pictorial representation of the steps in earth's gravity-based misalignment correction. (a) Rotation of γ about $\hat{P}_S$ and (b) rotation of α about $\hat{Z}_{S'}$.

Next, the housing is placed with its $X_H$-axis ($\hat{X}_H$) aligned with $\hat{Z}_W$, and the corresponding angle α between $\hat{X}_{S'}$ and $\hat{Z}_W$ is determined (see Fig. 5(b)). The $Z_{S'}$-axis of the sensor in $F_W$, i.e., $\hat{V}(\hat{q}_{Z_{S'}}) = [x\ y\ z]^T$, is computed as in (2) using $\hat{q}_{S'}$. The quaternion for rotating the sensor about the $\hat{Z}_{S'}$ axis through an angle α is given by $\hat{q}_{\hat{Z}_{S'}}^*(\alpha)$ and is computed using (1). The new quaternion $\hat{q}_H$ represents the absolute orientation of the housing and is obtained from (7) below.

$\hat{q}_H = \hat{q}_{\hat{Z}_{S'}}^*(\alpha) \otimes \hat{q}_{S'}$ (7)

Finally, the EG approach is validated by placing the sensor housing on the phone's screen with its $Y_H$-axis pointing upward; the angle β between the $\hat{Y}_H$-axis and the $\hat{Z}_W$-axis is computed for verification. Table I lists the α, γ, and β angles obtained for the five WIS modules. A flow chart of the steps involved in the EG-based orientation misalignment correction is shown in Fig. 6, and an illustrative code sketch follows it.

TABLE I.

WIS Module Housing Angles Relative to Gravity (in Degrees)

WIS module    α        γ        β (min.)    β (max.)
LA            3.43     2.88     0.0         1.92
RA            4.94     2.66     0.0         3.55
LF            2.20     2.85     0.0         3.15
RF            11.16    1.32     0.0         2.51
B             7.61     1.96     0.0         2.87

Fig. 6. Flow chart showing the steps involved in gravity-based orientation misalignment correction.
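To make the procedure concrete, the following MATLAB sketch traces Eqs. (2)-(7), reusing the helper functions of Section III. It is our illustration under stated assumptions, not the authors' implementation; in particular, the sign handling of the angles extracted via acos is omitted.

```matlab
% Illustrative EG misalignment correction.
% qS1: measurement with the housing Z_H-axis up; qS2: measurement after
% re-placing the housing with its X_H-axis up. All quaternions are rows.
Zw = [0 0 1];                              % world Z-axis, anti-parallel to gravity

Zs = qrotvec(qS1, [0 0 1]);                % Eq. (2): sensor Z-axis in the world frame
gamma = acos(dot(Zs, Zw));                 % Eq. (3): misalignment angle
P = cross(Zw, Zs);  P = P / norm(P);       % Eq. (4): rotation axis normal to both
Pt = qrotvec(qconj(qS1), P);               % retained sensor-frame axis (the P-tilde)

% For any measurement qS, re-express Pt in the world frame (Eq. (5)) and
% apply the gamma rotation (Eq. (6)); shown here for the second placement.
Pw  = qrotvec(qS2, Pt);
qSp = qmult(qconj(qaxang(Pw, gamma)), qS2);

Xsp = qrotvec(qSp, [1 0 0]);               % corrected sensor X-axis in the world frame
alpha = acos(dot(Xsp, Zw));                % residual offset about Z_S'
Zsp = qrotvec(qSp, [0 0 1]);
qH = qmult(qconj(qaxang(Zsp, alpha)), qSp);   % Eq. (7): housing orientation
```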

B. Earth’s Gravity and Magnetic Field-Based Correction

A smartphone screen is not a suitable flat surface for orientation correction with the EGM approach, since the electromagnetic waves that smartphones use for communication disturb the magnetometer measurements. Hence, we created a wooden platform with adjustable screws to perform the correction; the calibrated platform is shown in Fig. 7. Here, the inward-pointing gravity vector is normal to the wooden platform and $\hat{Z}_W$ points outward. A marking is made on the wooden platform along the magnetic-north direction indicated by the smartphone's compass, representing $\hat{Y}_W$.

Fig. 7. Calibrated wooden platform used for EGM-based orientation misalignment correction.

From the principle of the BNO055's sensor fusion algorithm, the measured absolute orientation of the sensor is relative to $F_W$. Specifically, when the sensor is oriented such that its $Z_S$-axis is along $\hat{Z}_W$ and its $Y_S$-axis is aligned with $\hat{Y}_W$, the sensor outputs zero yaw, pitch, and roll. We utilize this property and align $F_H$ such that its $Y_H$-axis is parallel to the earth's magnetic field $\hat{Y}_W$ and its $Z_H$-axis is parallel to $\hat{Z}_W$ (i.e., $F_H$ and $F_W$ are now aligned). Ideally, we expect $F_S$ also to align with $F_W$. However, due to misalignment between $F_H$ and $F_S$, $\hat{q}_S \neq (1\ 0\ 0\ 0)$. Now, a quaternion measurement of the sensor orientation is obtained (in $F_W$) and its conjugate is saved as $\hat{q}_S^*$. It can be shown that $\tilde{q}_H = \hat{q}_S^*$, where $\tilde{q}_H$ denotes the quaternion of the housing represented in $F_S$. Next, the housing orientation $\tilde{q}_H$ is expressed in $F_W$ using $\hat{q}_H = \hat{q}_S \otimes \tilde{q}_H \otimes \hat{q}_S^*$. Now, the sensor misalignment is corrected by applying the rotation $\hat{q}_H$ to the sensor measurement using $\hat{q}_{S'} = \hat{q}_H \otimes \hat{q}_S$. A 90° rotation about the $Y_H$-axis aligns the $X_H$-axis with $\hat{Z}_W$. However, if $F_H$ is not coincident with $F_S$, the resulting $X_{S'}$-axis and $\hat{Z}_W$ will have a non-zero static offset angle θ, which can be computed as below.

$\theta = \cos^{-1}\!\left(\hat{X}_{S'} \cdot \hat{Z}_W\right)$ (8)

The angle θ is the misalignment due to the error in aligning the $Y_H$-axis to $\hat{Y}_W$. We now revert the housing to its earlier position, where the $Z_H$-axis is parallel to $\hat{Z}_W$, and rotate the housing by the angle θ to align the $Y_H$-axis with $\hat{Y}_W$. The sensor data at this stage provide the alignment of the housing with $F_W$. Thus, as above, a quaternion measurement of the sensor orientation is obtained and its conjugate $\hat{q}_S^*$ is saved as $\tilde{q}_H$. The steps of the correction procedure are delineated in the block diagram of Fig. 8, and a code sketch follows Table II. The effectiveness of the algorithm is validated by performing rotations of 90° about the housing's principal axes. The measured rotation angles for the principal axes of each WIS module's housing are reported in Table II.

Fig. 8. Flow chart showing the gravity and magnetic field-based orientation misalignment correction.

TABLE II.

WIS Module 90° Angular Rotation Results After Correction

WIS module    X_H-axis    Y_H-axis    Z_H-axis
LF            89.76°      89.78°      90.92°
RF            89.72°      89.18°      90.41°
LA            90.03°      90.07°      89.96°
RA            89.61°      88.45°      87.50°
B             90.08°      89.96°      89.83°
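For concreteness, a minimal sketch of the resulting runtime correction, reusing the helper functions of Section III, is shown below. This is our illustration, not the authors' implementation; the θ-refinement step simply repeats the same capture after re-aligning the housing.

```matlab
% Illustrative EGM correction. qScal: a single sensor measurement taken
% with the housing aligned to F_W (Z_H up, Y_H toward magnetic north).
qHtilde = qconj(qScal);     % housing orientation expressed in the sensor frame

% For any later raw sample qS, one quaternion product recovers the housing
% orientation. This equals qH_world (x) qS from the text, since
% qS (x) qHtilde (x) qS* (x) qS = qS (x) qHtilde.
qH = qmult(qS, qHtilde);
```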

V. Joint Coordinate System (JCS)

Quaternions are an effective representation for rotation and computation in 3D space; however, they are rarely used by therapists and clinicians to characterize ROM measurements. The JCS is a standard reporting method proposed by the ISB for computing human joint angles [5], [6]. Furthermore, reporting results using a single standard allows transparent communication between researchers and clinicians. The JCS method uses the proximal coordinate frame as a reference to define the joint angle of the distal coordinate frame. We adopt the method proposed in [5] for computing the joint angles of the UE. The shoulder joint angles use the thorax coordinate frame as the reference, and the elbow joint angles use the shoulder coordinate frame as the reference. The coordinate frames and corresponding relative joint angles are described from a starting neutral pose (NP) as shown in Fig. 9.

Fig. 9. Subject wearing WIS modules in neutral pose.

In the JCS implementation of the WIS system of this paper, the back sensor module B is used as a reference for the LA and RA sensor modules to compute the shoulder joint angles. Similarly, the LA and RA sensor modules are used as references for the LF and RF sensor modules, respectively, to compute the elbow and forearm movements. For the shoulder angle computation, an initial reference is needed for the back inertial sensor module at NP. To do so, two quaternions $q_{RB}^{ref}$ and $q_{LB}^{ref}$ are created as shown in (9) below.

$q_{RB}^{ref} = q_{LB}^{ref} = \hat{q}_{\hat{Z}_B}\!\left(\tfrac{\pi}{2}\right) \otimes \hat{q}_B$ (9)

The sign convention of the shoulder joint angle measurements is defined as extension (−) and flexion (+), adduction (−) and abduction (+), and external (−) and internal (+) rotation. The axes shown in Fig. 9 for $\hat{q}_{LA}$, $\hat{q}_{RA}$, $q_{LB}^{ref}$, and $q_{RB}^{ref}$ are rotated by 180° to achieve this sign convention. All the rotated coordinate frames are pictorially represented in Fig. 9. The quaternions representing the rotation of the shoulder relative to the back WIS module are extracted using (10) and (11) for the left and right sides, where $\bar{(\cdot)}$ denotes the quaternion of an aforementioned rotated coordinate frame.

$q_{LS} = \left(\bar{q}_{LB}^{\,ref}\right)^{*} \otimes \bar{q}_{LA}$ (10)
$q_{RS} = \left(\bar{q}_{RB}^{\,ref}\right)^{*} \otimes \bar{q}_{RA}$ (11)

The Y-X-Y′ Euler angle convention is used in [5] to obtain the shoulder joint angles. Since the orientation of the LA and RA WIS modules differs from [5], the Y-Z-Y′ Euler angle convention is adopted here. The joint angles are computed from $q_{LS}$ and $q_{RS}$ using MATLAB's built-in quat2angle command, which returns angles $\theta_Y$, $\theta_Z$, and $\theta_{Y'}$; these yield the rotation in the shoulder plane ($\theta_Y$), the shoulder elevation ($\theta_Z$), and the shoulder internal-external rotation ($\theta_Y + \theta_{Y'}$), respectively. Shoulder elevation $\theta_Z$ refers to shoulder flexion-extension (in the sagittal plane) when $\theta_Y \approx 90°$ and to shoulder abduction-adduction (in the frontal plane) when $\theta_Y \approx 0°$.

The JCS implementation for measuring elbow rotation requires the use of the left arm (LA) and right arm (RA) inertial sensors as references, i.e., $q_{LA}^{ref}$ and $q_{RA}^{ref}$, respectively, which are computed as below.

$q_{LA}^{ref} = \hat{q}_{\hat{Y}_{LA}}\!\left(\tfrac{\pi}{2}\right) \otimes \hat{q}_{LA}$ (12)
$q_{RA}^{ref} = \hat{q}_{\hat{Y}_{RA}}\!\left(\tfrac{\pi}{2}\right) \otimes \hat{q}_{RA}$ (13)

The sign convention for the elbow and forearm measurements is defined as extension (−) and flexion (+), and supination (−) and pronation (+). As above, the axes shown in Fig. 9 for the coordinate frames $q_{RA}^{ref}$, $q_{LA}^{ref}$, $q_{LF}$, and $q_{RF}$ are rotated by 180° to achieve this sign convention. The relative quaternions representing the left ($q_{LE}$) and right ($q_{RE}$) elbow joint angles are computed as below.

$q_{LE} = \left(\bar{q}_{LA}^{\,ref}\right)^{*} \otimes \bar{q}_{LF}$ (14)
$q_{RE} = \left(\bar{q}_{RA}^{\,ref}\right)^{*} \otimes \bar{q}_{RF}$ (15)

Next, as in [5], the Z-X-Y Euler angle convention is used to obtain the left and right elbow joint angles by applying the quat2angle MATLAB command to $q_{LE}$ and $q_{RE}$, respectively. The quat2angle command returns angles $\theta_Z$, $\theta_X$, and $\theta_Y$, which indicate the elbow flexion-extension, carrying, and pronation-supination angles, respectively. The carrying angle is the angle between the humerus in the upper arm and the ulna in the forearm, which ranges from 8° to 20° [36], [37].
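The following sketch traces Eqs. (9), (10), (12), and (14) for the left side. It is our illustration, reusing the helpers of Section III and the Aerospace Toolbox's quat2angle; the 180° sign-convention rotations of Fig. 9 are assumed to be already folded into qB, qLA, and qLF.

```matlab
% Illustrative left-side JCS computation (not the authors' released code).
% qB, qLA, qLF: corrected housing quaternions of the B, LA, and LF modules.

ZB = qrotvec(qB, [0 0 1]);                 % Z-axis of module B in the world frame
qLBref = qmult(qaxang(ZB, pi/2), qB);      % Eq. (9): back reference at neutral pose

qLS = qmult(qconj(qLBref), qLA);           % Eq. (10): shoulder relative quaternion
[thY, thZ, thY2] = quat2angle(qLS, 'YZY'); % Y-Z-Y' decomposition (radians)
% thY: shoulder plane; thZ: elevation; thY + thY2: internal-external rotation

YLA = qrotvec(qLA, [0 1 0]);               % Y-axis of module LA in the world frame
qLAref = qmult(qaxang(YLA, pi/2), qLA);    % Eq. (12): arm reference
qLE = qmult(qconj(qLAref), qLF);           % Eq. (14): elbow relative quaternion
[flex, carry, prosup] = quat2angle(qLE, 'ZXY'); % flexion, carrying, pro-supination
```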

A. WIS Mounting and Alignment

Mounting the sensors at the distal end of the limb segment reduces most measurement errors. For example, the forearm sensors (LF and RF) are placed just proximal to the wrist joint to produce acceptable results for elbow rotation. However, even when the arm sensors (LA and RA) are placed just proximal to the elbow joint, they are prone to erroneous measurements of internal-external rotation at the shoulder due to skin movement. Thus, correct mounting of the WIS modules is critical for accurate measurement of joint ROM. Inertial sensors have previously been calibrated by using a standard initial position and a prescribed motion to correct for mounting uncertainties [10], [15]. However, patients with motor deficits may not be able to achieve these initial positions or perform prescribed movements to produce the suggested joint-to-sensor transformation. Hence, as an alternative, we developed an in-situ solution for accurate placement of sensors that is applicable to patients with real-world movement constraints. Specifically, the sensors LA, RA, LF, and RF are placed at their corresponding distal joint segments as shown in Fig. 9. The carrying angle at the elbow joints and the internal-external rotation at the shoulder joints are displayed in real time while the sensors are mounted. The sensors are placed correctly when the displayed carrying angle is accurate for the subject's gender (within 8°–20°) and the internal-external rotations of the LA and RA sensors read zero. This directed real-time mounting strategy permits correct positioning of the sensors without requiring any specific initial position or prescribed movements, and it does not require training in MOCAP.

VI. Experimental Validation

A. Experimental Setup

As evidenced in Section V, the JCS approach utilizes the relative measurements between two WIS modules for computing the joint angles of the shoulder and elbow. Before conducting experimental measurements with a human subject using the WIS system, we first validated the accuracy of the relative angles between the WIS modules by creating an experimental setup. Specifically, a 12-inch 360° clinical goniometer (Elite Medical Instruments, Orange County, CA) was mounted on a flat table to create a rotating platform (i.e., turntable) for testing the measurement accuracy of the WIS modules. Next, four WIS modules (LA, RA, LF, and RF) were mounted on the moving arm of the goniometer, and the WIS module B was fixed on the table parallel to the 0° start position of the other four WIS modules as shown in Fig. 10. A MATLAB user interface was created for data acquisition and visualization of the WIS modules' relative angles as shown in Fig. 11. To validate the angular measurement stability of the BNO055 reported in [33], we examined the temporal variability of sensor measurements with an arbitrary fixed pose and with dynamic changes to it. Specifically, the relative orientations of the LF, RF, LA, and RA sensors vs. the B sensor were measured for 300 sec. for both fixed (0°) and changing orientations (−90°, 0°, 90°, 180°). The resulting measurements exhibited a stable response with no drift or deviations. Next, for each angular measurement, the movable arm of the goniometer was rotated manually from the 0° start position to a pre-determined target angle for ten trials. To test the measurement accuracy of a WIS module about each of its three axes of rotation, the module was placed on the turntable with the axis under test normal to the turntable. In this manner, the sensor data from each axis of the WIS modules were measured for various angular positions applied on the goniometer.

Fig. 10. Turntable for measuring angles from WIS modules.

Fig. 11. MATLAB interface for sampling relative angles.

B. Angular Accuracy Testing

The moving arm of the goniometer was manually rotated from the 0° start position, in intervals of 20°, to various angular positions ranging between ±80°. The angular orientations of the WIS sensor modules (LA, RA, LF, and RF) relative to WIS sensor module B were computed using two methods: (i) the vector projection method for the EG approach and (ii) the Euler angle method for the EGM approach. Both angular computation methods are applicable to measurements obtained from either the EG or EGM approach; the pairing here is only for illustration.

1). Vector Projection Method:

In the vector projection method, we utilized the EG approach to obtain the orientation of the housing from the sensor measurements. Each axis $\hat{\Omega}_\ell$, where ℓ ∈ {LA, RA, LF, RF} and Ω ∈ {X, Y, Z}, of sensor module ℓ was aligned with $\hat{X}_B$ in the start position. Then, for each Ω ∈ {X, Y, Z}, the rotation of sensor module ℓ about the axis normal to the turntable was computed by projecting $\hat{\Omega}_\ell$ on the $X_B$-$Z_B$ plane of the WIS B module. For example, in Fig. 10, $\hat{X}_\ell \parallel \hat{X}_B$ in the start position, and the angular rotation of sensor module ℓ about its $\hat{Y}_\ell$ axis was computed by projecting $\hat{X}_\ell$ on the $X_B$-$Z_B$ plane. The angular rotation $\psi_{Rel}$ of WIS module ℓ relative to WIS module B was computed by using the atan2 function as shown below.

$\psi_{Rel} = \operatorname{atan2}\!\left(\hat{V}(\hat{q}_{\hat{\Omega}_\ell}) \cdot \hat{V}(\hat{q}_{\hat{Z}_B}),\; \hat{V}(\hat{q}_{\hat{\Omega}_\ell}) \cdot \hat{V}(\hat{q}_{\hat{X}_B})\right)$ (16)

The vectors required to compute $\psi_{Rel}$ in (16) were obtained from the corresponding $\hat{q}_{(\cdot)}$ using (2). The procedure, consisting of 10 trials for each angle between ±80° at 20° intervals, was repeated for each axis of the WIS module; a single-sample sketch follows.
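For one sample, the computation of Eq. (16) reduces to two projections onto module B's axes, as the sketch below illustrates (our code, reusing the Section III helpers; Om is the tested axis of module ℓ in its own frame, e.g., [1 0 0] for an X-axis test).

```matlab
% Illustrative evaluation of Eq. (16) for a single sample (EG-corrected data).
Om  = [1 0 0];                    % tested axis of module l, in its own frame
OmW = qrotvec(qL, Om);            % tested axis expressed in the world frame
XB  = qrotvec(qB, [1 0 0]);       % X-axis of module B in the world frame
ZB  = qrotvec(qB, [0 0 1]);       % Z-axis of module B in the world frame
psiRel = atan2(dot(OmW, ZB), dot(OmW, XB)) * 180/pi;   % relative angle (deg)
```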

2). Euler Angle Method:

In the Euler angle method, we utilized the EGM approach to obtain the orientation of the housing from the sensor measurements. A procedure similar to that outlined above was repeated; however, the relative angles were computed using the relative quaternion $q_{Rel}$ between WIS module B and each WIS module ℓ attached to the moving arm of the turntable, as below.

$q_{Rel} = \hat{q}_B^{\,*} \otimes \hat{q}_\ell$ (17)

Furthermore, the quat2angle MATLAB command was used to extract the relative angle $R_{angle}$ for the tested axis using a Tait-Bryan angle sequence in which the tested axis is the last axis of the sequence; e.g., a Z-axis test can utilize the X-Y-Z or Y-X-Z sequence. The procedure, consisting of 10 trials for each angle between ±80° at 20° intervals, was repeated for each axis of the WIS modules; a sketch follows.
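A sketch of this computation for a Z-axis test is shown below (our code, assuming the Aerospace Toolbox's quat2angle, which accepts [qw qx qy qz] rows and returns radians).

```matlab
% Illustrative evaluation of Eq. (17) for a Z-axis test (EGM-corrected data).
qRel = qmult(qconj(qB), qL);               % Eq. (17): module l relative to module B
[~, ~, Rangle] = quat2angle(qRel, 'XYZ');  % tested axis placed last in the sequence
RangleDeg = Rangle * 180/pi;               % convert from radians to degrees
```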

C. Results and Discussion

The computational times of the EG and EGM orientation misalignment correction methods were measured using the MATLAB commands tic and toc, and the results are presented in Table III. The results indicate that the EGM approach, requiring four quaternion products, is more computationally efficient than the EG approach, which requires eight quaternion products.

TABLE III.

Computation Time for Different Misalignment Correction Techniques

Technique       One WIS module (μs)    Five WIS modules (μs)
EG approach     65.01                  325.05
EGM approach    18.50                  92.50

A MATLAB routine was developed to obtain the positive and negative peaks of the time-series WIS module data using the findpeaks command. The peaks represent the measured angle $\psi_M$ and were compared with the applied angle $\psi_A$ on the goniometer. The coefficient of determination (R²) and the root-mean-square error (RMSE) between $\psi_A$ and $\psi_M$ for each WIS module, using the two methods, are presented in Table IV.

TABLE IV.

Coefficient of Determination and RMSE Testing Results From EG and EGM Approaches

                      EG approach            EGM approach
WIS Module    Axis    R²       RMSE (°)      R²       RMSE (°)
LA            X       0.998    2.128         0.999    1.857
              Y       0.999    1.507         0.998    2.168
              Z       0.992    1.658         0.998    1.958
LF            X       0.999    1.080         0.999    1.003
              Y       0.998    2.064         0.999    1.235
              Z       0.998    2.011         0.999    0.483
RA            X       0.997    3.312         0.998    2.032
              Y       0.997    2.926         0.999    1.583
              Z       0.998    2.250         0.999    1.773
RF            X       0.999    0.771         0.999    0.854
              Y       0.998    2.296         0.999    1.190
              Z       0.999    0.759         0.999    1.035

The data indicate an excellent correlation between the measured and applied angles. Furthermore, the high correlations indicate that the housing's coordinate frame, as computed from the sensor's coordinate frame, was sufficiently accurate to measure ROM.

The accuracy and repeatability of sensor measurements are key parameters describing the operating constraints of any measurement system. The accuracy $a_\ell$ of WIS module ℓ, expressed as the average percentage deviation from the applied angle, is given by $a_\ell := \pm\frac{1}{n}\sum_{i=1}^{n}\frac{|\psi_{M_\ell i} - \psi_{A_\ell i}|}{|\psi_{A_\ell i}|} \times 100$, where n = 10 trials × 9 angles × 3 axes. The repeatability of WIS module ℓ is expressed as the coefficient of variation $CV_\ell := \max_{j=1,\dots,m}\left(\frac{\sigma_j \times 100}{|\mu_j|}\right)$, where m = 9 angles × 3 axes and $\mu_j$ and $\sigma_j$ are the mean and standard deviation, respectively, of $\psi_{M_\ell}$ over ten trials. The computed values of $a_\ell$ and $CV_\ell$ for each WIS module's relative angle are presented in Table V, and an illustrative computation follows the table. The results indicate that the relative angles obtained from the sensor modules are accurate within ±6.5% of the applied angle, and the small values of $CV_\ell$ indicate that the sensors produce repeatable results.

TABLE V.

Accuracy and Repeatability of the WIS Module Measured Angles

WIS Module    a_ℓ          CV_ℓ
LA            ±6.028%      10.51%
LF            ±4.343%      7.01%
RA            ±6.337%      7.63%
RF            ±4.584%      7.34%
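As an illustration, the statistics of Table V can be computed as in the sketch below (our code; psiM is assumed to be a 10-trial-by-27-condition matrix of peak angles extracted with findpeaks, and psiA the matching applied goniometer angles).

```matlab
% Illustrative computation of the accuracy and repeatability statistics.
% psiM: nTrials x mConditions measured peak angles (from findpeaks);
% psiA: matching applied goniometer angles (same size).
a  = mean(abs(psiM(:) - psiA(:)) ./ abs(psiA(:))) * 100;   % percent deviation
CV = max(std(psiM, 0, 1) * 100 ./ abs(mean(psiM, 1)));     % worst-case percent CV
```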

Having used the goniometer-based turntable described above to validate the relative measurements produced by the WIS modules, we next utilized the WIS modules for JCS-based ROM measurements. Specifically, the WIS modules were mounted on a healthy human subject as shown in Fig. 9. The subject was asked to perform simple ROM exercises in the following order: (i) shoulder flexion-extension, (ii) shoulder abduction-adduction, (iii) elbow flexion-extension, (iv) forearm pronation-supination, and (v) shoulder internal-external rotation. The JCS method was used to compute joint angle measurements from the data obtained from the shoulder (LA, RA), elbow (LF, RF), and back (B) sensors. The JCS-based triplanar motion for the shoulder is shown in Fig. 12 (top panel), where the shoulder-plane angle is ≈ 90° when the shoulder is flexing and extending in the sagittal plane and ≈ 0° when the shoulder is abducting and adducting in the frontal plane. Furthermore, during abduction-adduction, it is anatomically infeasible to move the shoulder beyond 90° without external rotation (which occurs in the horizontal plane). Similarly, note the elbow and forearm movements in Fig. 12 (bottom panel). These data illustrate that integrating the JCS technique with our WIS modules provides comprehensive information about joint motion in all three planes simultaneously, which is valuable for understanding movement limitations in patients.

Fig. 12. JCS left upper limb movements at the shoulder (top panel) and the elbow and forearm (bottom panel) during (i) shoulder flexion-extension, (ii) shoulder abduction-adduction, (iii) elbow flexion-extension, (iv) forearm pronation-supination, and (v) shoulder internal-external rotation.

VII. Conclusion

In this paper, we presented a mechatronic approach to design and develop a WIS system for triplanar upper extremity ROM assessment. Two software-based signal processing methods were introduced to correct the orientation misalignment between the sensor and its housing. The WIS module measurements were benchmarked against a goniometer on a turntable for repeated measurements and the results show acceptable agreement between measurements in all axes. Furthermore, the experimental measurements were analyzed for accuracy and reliability, and indicate acceptable tolerance limits for rehabilitative applications. Next, the clinically accepted JCS-based ROM assessment technique was integrated with the WIS system for ease of use by rehabilitation clinicians and translation to clinical practice. The results illustrate simultaneous availability of all joint angles to enable clinicians to identify movement restrictions accurately and tailor treatment effectively.

There are several limitations to the work presented in this paper. First, in-house desktop milling machines were used to machine the WIS module PCBs, yielding a quick turnaround time but a large manufacturing footprint. Second, the software signal processing, data acquisition, and data analysis algorithms are currently all implemented in MATLAB, which is unsuitable for translation of the WIS system to patients' homes and clinical practice. Third, the feasibility of using the WIS system under the JCS framework for ROM assessment was examined with only a single healthy subject.

Future work will address several of the aforementioned limitations. By leveraging state-of-the-art manufacturing capabilities, the PCB design of the WIS module can be reduced in size, improving its form factor, comfort level, and wearability. We will conduct a formal study on user experience related to such an updated WIS module design. We are currently developing an exergame framework that integrates the WIS system in the Unity3D environment and eliminates the need for commercial software tools. Our exergame environment will consist of two human models: (i) an animated virtual coach to instruct the users in performing ROM exercises and (ii) a patient model that simulates the user's movements retrieved from the sensor measurements. We also envision an instructor interface for intuitive visualization and comparison between the animated virtual coach instruction and the patient ROM data to facilitate patient performance assessment and feedback. With such an interface, therapists and clinicians will be able to tailor individualized treatment for the patients. We will perform additional user studies for further validation of the WIS system.

In our prior research, we have demonstrated the use of BLE-based devices for interfacing with smartphone applications [38]. In a similar vein, Unity3D-based applications are compatible for deployment on smartphone interfaces, facilitating the development of smartphone-connected WIS modules for patient rehabilitation and ROM assessment. In prior research, we have also demonstrated the ability to utilize mechatronic approaches for creating low-cost, reproducible prototypes of a grasp rehabilitator [39]. In a related study, we reproduced six copies of the grasp rehabilitator of [39] and utilized these devices within a telemedicine framework to remotely assess grasp performance [40] and therapy compliance [41] in patients with multiple sclerosis. In future work, we will adopt a similar approach to use small-footprint, reproduced versions of the WIS modules for ROM assessment of patients in clinical and telemedicine settings to generate clinically relevant efficacy, validation, and compliance data for these devices.

Acknowledgements

The authors sincerely thank the reviewers for their valuable feedback that helped improve the manuscript.

Work supported in part by the National Science Foundation under DRK-12 Grant DRL-1417769, RET Site Grant EEC-1542286, and ITEST Grant DRL-1614085; the NY Space Grant Consortium under Grant 48240-7887; and a Translation of Rehabilitation Engineering Advances and Technology (TREAT) grant under NIH P2CHD086841.

Biographies

Ashwin Rajkumar received the B.Tech. degree in mechanical engineering from NIT-Trichy, India, in 2011 and the M.S. degree in mechanical engineering from NYU Tandon in 2015. He is a doctoral student in mechanical engineering at NYU Tandon.

Fabio Vulpi received the B.S. degree in mechanical engineering from Polytechnic Institute of Bari, Italy, in 2017 and the M.S. degree in mechanical engineering from NYU Tandon and Polytechnic Institute of Bari in 2019.

Satish Reddy Bethi received the B.Tech. degree in mechanical engineering from CVR College of Engineering, India, in 2018. He is a master’s student in mechatronics and robotics at NYU Tandon.

Hassam Khan Wazir received the B.Eng. degree in electrical and communication engineering from Institut Teknologi Brunei in 2014 and the M.S. degree in mechatronics and robotics from NYU Tandon in 2018. He is a doctoral student in mechanical engineering at NYU Tandon.

Preeti Raghavan, MD received her medical degree from Rajah Muthiah Medical College in India in 1997 and completed her residency in Physical Medicine and Rehabilitation from Albert Einstein College of Medicine, Bronx, NY in 2002.

She is presently the Sheikh Khalifa Stroke Institute Endowed Chair and Director of Recovery and Rehabilitation at the Sheikh Khalifa Stroke Institute, Johns Hopkins University School of Medicine. Her research interests are in motor control, rehabilitation engineering, and stroke rehabilitation.

Vikram Kapila received the B.Tech. degree in production engineering and management from REC-Calicut, India, in 1988, the M.S. degree in mechanical engineering from Florida Tech. in 1993, and the doctoral degree in aerospace engineering from Georgia Tech. in 1996.

He has been on the faculty of mechanical and aerospace engineering at NYU Tandon (formerly Polytechnic University) since 1996. His research interests are in mechatronics and robotics with applications to human-robot interaction, healthcare, and education.

Footnotes

Disclosures

Drs. Preeti Raghavan and Vikram Kapila have patented technology for a Game-Based Sensorimotor Rehabilitator through New York University.

References

[1] Gates DH et al., "Range of motion requirements for upper-limb activities of daily living," Am. J. Occup. Ther., vol. 70, no. 1, pp. 7001350010p1–10, Jan. 2016.
[2] Gajdosik RL and Bohannon RW, "Clinical measurement of range of motion: Review of goniometry emphasizing reliability and validity," Phys. Ther., vol. 67, no. 12, pp. 1867–1872, Dec. 1987.
[3] De Winter AF et al., "Inter-observer reproducibility of measurements of range of motion in patients with shoulder pain using a digital inclinometer," BMC Musculoskelet. Disord., vol. 5, no. 18, pp. 1–8, Jun. 2004.
[4] MacDermid J et al., "An analysis of functional shoulder movements during task performance using Dartfish movement analysis software," Int. J. Shoulder Surg., vol. 8, no. 1, pp. 1–9, Jan.–Mar. 2014.
[5] Wu G et al., "ISB recommendation on definitions of joint coordinate systems of various joints for the reporting of human joint motion – Part II: Shoulder, elbow, wrist and hand," J. Biomech., vol. 38, no. 5, pp. 981–992, May 2005.
[6] Wu G et al., "ISB recommendation on definitions of joint coordinate system of various joints for the reporting of human joint motion – Part I: Ankle, hip, and spine," J. Biomech., vol. 35, no. 4, pp. 543–548, Apr. 2002.
[7] Iosa M et al., "Wearable inertial sensors for human movement analysis," Expert Rev. Med. Devices, vol. 13, no. 7, pp. 641–659, Apr. 2016.
[8] El-Gohary M and McNames J, "Shoulder and elbow joint angle tracking with inertial sensors," IEEE Trans. Biomed. Eng., vol. 59, no. 9, pp. 2635–2641, Jun. 2012.
[9] Karunarathne MS et al., "An adaptive orientation misalignment calibration method for shoulder movements using inertial sensors: A feasibility study," in Proc. 4th Int. Symp. Bioelectron. Bioinformatics (ISBB), 2015, pp. 99–102.
[10] Palermo E et al., "Experimental evaluation of accuracy and repeatability of a novel body-to-sensor calibration procedure for inertial sensor-based gait analysis," Measurement, vol. 52, no. 1, pp. 145–155, Aug. 2014.
[11] Cuesta-Vargas AI et al., "The use of inertial sensors system for human motion analysis," Phys. Ther. Rev., vol. 15, no. 6, pp. 462–473, Dec. 2010.
[12] Kim M and Lee D, "Wearable inertial sensor based parametric calibration of lower-limb kinematics," Sens. Actuators A Phys., vol. 265, pp. 280–296, Oct. 2017.
[13] Madgwick SOH et al., "Estimation of IMU and MARG orientation using a gradient descent algorithm," in Proc. IEEE Int. Conf. Rehabilitation Robotics, 2011, pp. 1–7.
[14] Bachmann ER et al., "Orientation tracking for humans and robots using inertial sensors," in Proc. Int. Symp. Computational Intelligence in Robotics and Automation, 1999, pp. 187–194.
[15] Chen X, Human Motion Analysis with Wearable Inertial Sensors, Ph.D. dissertation, Univ. of Tennessee, 2013.
[16] Lopez-Nava IH and Munoz-Melendez A, "Wearable inertial sensors for human motion analysis: A review," IEEE Sens. J., vol. 16, no. 22, pp. 7821–7834, Nov. 2016.
[17] Hsu YL et al., "A wearable inertial-sensing-based body sensor network for shoulder range of motion assessment," in Proc. Int. Conf. Orange Technol., 2013, pp. 328–331.
[18] Windolf M et al., "Systematic accuracy and precision analysis of video motion capturing systems: exemplified on the Vicon-460 system," J. Biomech., vol. 41, no. 12, pp. 2776–2780, 2008.
[19] Hardwick DD, "Scapular and humeral movement patterns of people with stroke during range of motion exercises," J. Neurol. Phys. Ther., vol. 35, no. 1, pp. 18–25, Mar. 2011.
[20] Knippenberg E et al., "Markerless motion capture systems as training device in neurological rehabilitation: A systematic review of their use, application, target population and efficacy," J. Neuroeng. Rehabil., vol. 14, no. 1, pp. 1–11, Dec. 2017.
[21] Lee SH et al., "Measurement of shoulder range of motion in patients with adhesive capsulitis using a Kinect," PLoS One, vol. 10, no. 6, p. e0129398, Jun. 2015.
[22] Mathis A et al., "DeepLabCut: markerless pose estimation of user-defined body parts with deep learning," Nat. Neurosci., vol. 21, no. 9, pp. 1281–1289, Aug. 2018.
[23] Andriluka M et al., "2D human pose estimation: New benchmark and state of the art analysis," in Proc. IEEE Conf. Comput. Vis. Pattern Recognit., 2014, pp. 3686–3693.
[24] Bonnet S et al., "Calibration methods for inertial and magnetic sensors," Sens. Actuators A Phys., vol. 156, no. 2, pp. 302–311, Dec. 2009.
[25] Lv J et al., "A method of low-cost IMU calibration and alignment," in Proc. Int. Symp. Syst. Integr., 2017, pp. 373–378.
[26] Rohac J et al., "Calibration of low-cost triaxial inertial sensors," IEEE Instrum. Meas. Mag., vol. 18, no. 6, pp. 32–38, Nov. 2015.
[27] Fei J and Feng Z, "Adaptive fuzzy super-twisting sliding mode control for microgyroscope," Complexity, vol. 2019, 2019.
[28] Fang Y et al., "Adaptive backstepping design of a microgyroscope," Micromachines, vol. 9, no. 7, 2018.
[29] Fei J and Liang X, "Adaptive backstepping fuzzy neural network fractional-order control of microgyroscope using a nonsingular terminal sliding mode controller," Complexity, vol. 2018, 2018.
[30] Bosch Sensortec GmbH, BNO055 Intelligent 9-Axis Absolute Orientation Sensor, 2016. [Online]. Available: https://www.boschsensortec.com [Accessed: 08-Jul-2019].
[31] Vcelak J et al., "AMR navigation systems and methods of their calibration," Sens. Actuators A Phys., vol. 123–124, pp. 122–128, Sep. 2005.
[32] Nordic Semiconductor, Gazell Link Layer User Guide. [Online]. Available: www.infocenter.nordicsemi.com/ [Accessed: 08-Jul-2019].
[33] Lin Z et al., "An experimental performance evaluation of the orientation accuracy of four nine-axis MEMS motion sensors," in Proc. Int. Conf. Enterp. Syst. Ind. Digit., 2017, pp. 185–189.
[34] Spong M et al., Robot Modeling and Control. New York: Wiley, 2006.
[35] Diebel J, "Representing attitude: Euler angles, unit quaternions, and rotation vectors," Stanford Univ., Stanford, CA, USA, Tech. Rep., 2006.
[36] An KN et al., "Carrying angle of the human elbow joint," J. Orthop. Res., vol. 1, no. 4, pp. 369–378, 1983.
[37] Paraskevas G et al., "Study of the carrying angle of the human elbow joint in full extension: A morphometric analysis," Surg. Radiol. Anat., vol. 26, no. 1, pp. 19–23, Feb. 2004.
[38] Raj Kumar A et al., "Wearable smart glasses for assessment of eye contact behavior in children with autism," in Proc. Design of Medical Devices Conf., 2019, p. V001T09A006.
[39] Raj Kumar A et al., "Grasp rehabilitator: A mechatronic approach," in Proc. Design of Medical Devices Conf., 2019, p. V001T03A007.
[40] Feinberg C et al., "Remotely supervised transcranial direct current stimulation (RS-tDCS) paired with a hand exercise program to improve manual dexterity in progressive multiple sclerosis: A randomized sham controlled trial," Neurology, vol. 92, no. 15 Supplement, P5.6-009, 2019.
[41] Malik M et al., "Upper extremity telerehabilitation for progressive multiple sclerosis," Neurology, 2020, under review.
