Abstract
Motion tracking based on commercial inertial measurement units (IMUs) has been widely studied in recent years as it is a cost-effective enabling technology for applications in which motion tracking based on optical technologies is unsuitable. This measurement method has a high impact on human performance assessment and human-robot interaction. IMU motion tracking systems are indeed self-contained and wearable, allowing for long-lasting tracking of the user's motion in situated environments. After a survey on IMU-based human tracking, five techniques for motion reconstruction were selected and compared in the reconstruction of a human arm motion. The IMU-based estimation was matched against the Vicon marker-based motion tracking system, considered as ground truth. Results show that all but one of the selected models perform similarly (about 35 mm average position estimation error).
Keywords: kinematics, sensor fusion, motion tracking, inertial measurement units
1. Introduction
In recent years, the development of sensing technologies and sensor signal processing techniques paved the way for the use of wearable sensors to monitor human status and performance. These developments resulted in the need for managing such networks efficiently, as explained by Fortino et al. in [1]. Wearable body sensor networks (BSN) are nowadays used in several applications, including healthcare, ergonomics, sport and entertainment (see [2] for a review on the topic). A field that has benefited from the research on BSN is motion tracking.
Motion tracking has received the attention and the effort of generations of researchers. There are several techniques that allow for motion reconstruction based on different information sources. One of the biggest challenges in motion tracking is obtaining an accurate estimation with non-invasive sensors and an unrestricted workspace. In recent years, a new generation of inertial measurement units (IMUs) based on micro-electro-mechanical systems (MEMS) technology has given new impetus to motion tracking research. These devices are cost-effective and can be successfully used for accurate, non-invasive and portable motion tracking. The great interest in these devices is mainly motivated by the fact that they overcome many issues raised by optical systems and mechanical trackers. IMUs indeed do not suffer from occlusions and have a theoretically unlimited workspace compared to optical motion tracking systems, and, despite the accuracy of mechanical trackers, IMUs are much more affordable and far less intrusive.
Inertial-unit-based motion tracking has been used for navigation for decades. Initially developed for the attitude estimation of aerial vehicles (see [3,4]), it is nowadays used for tracking other unmanned vehicles (see [5,6,7,8]). In recent years, IMUs have often been used to track human motion, thus becoming an enabling technology for several applications which include localization, human-robot interaction, rehabilitation and ergonomics. This development is also witnessed by the rise of companies that sell IMUs and IMU-based systems, e.g., Invensense (Invensense, San Jose, CA, USA), Trivisio (Trivisio, Trier, Germany), Microstrain (Lord Microstrain, Williston, VT, USA) and Xsens (Xsens Technologies B.V., Enschede, The Netherlands), and by the number of start-ups that target IMU-based systems. The products that they sell often include attitude reconstruction, which is provided as output to the user, or even full body motion reconstruction.
IMUs are typically composed of accelerometers and gyroscopes. Their signals are used in different manners according to the application, as will be explained in Section 2 (as an example see [9]). In most cases IMUs are used to reconstruct the pose, or at least either the position or the orientation, of the body they are attached to. The naive use of IMUs is the integration of the sensors’ signals over time to estimate velocity, position and orientation. Since both accelerometer and gyroscope measurements suffer from time-varying biases and noise, this approach leads to a quick drift of the estimation, which becomes unreliable after a few seconds. Therefore, researchers started investigating both algorithmic and hardware solutions to solve the drift issue. In many cases IMUs are equipped with a three-axis magnetometer (e.g., [10,11,12,13]); we refer to these sensors as mIMUs. The magnetometer measures the local (earth) magnetic field that is used as an earth-fixed reference for the current estimation of the IMU orientation. Other solutions include exploiting ultrasonic sensors [14], GPS [15], ultra-wideband (UWB) radios [16], cameras [17], and magnetic fields generated by actuated coils [18].
Motivated by the variety of approaches to IMU-based human motion tracking (IHMT), the goal of this article is to introduce the reader to IHMT. In the first part (Section 2), the article introduces the main issues of IHMT and then presents a survey of the methods that have been used so far to tackle the IHMT problem. In the second part (Section 3), the article includes a tutorial section which explains in detail five selected methods for upper limb tracking. This part aims both at making concrete some of the main issues presented in the survey and at letting the reader become familiar with IHMT methods. These methods are finally compared to each other in Section 4, which also concludes the presented work.
2. IMU-Based Human Motion Tracking
2.1. Reviews on Wearable Motion Tracking
In recent decades, emerging technologies allowed for a huge step forward in human motion tracking. Exoskeletons, vision-based systems as well as motion capture based on inertial systems have become commonly used, firstly in laboratory settings and nowadays in everyday life. Several reviews described human motion capture under different perspectives, with a focus on the application [19] and/or on technical aspects [20]. For example, Patel et al. [21] propose a review of wearable sensors for human monitoring in which great emphasis is laid on applications and on the enabling technology. Their survey moves from sensing technology, including motion capture based on inertial sensors, to applications, including health monitoring, wellness and safety. Similarly, Shull et al. [22] review wearable sensing systems applied to gait analysis in clinical settings. They group methods according to the sensors that are used, subject populations and measured parameters. In a recent review paper [23], Gravina et al. discuss issues and advantages of body sensor networks, and then focus on their applications to human activity recognition. Wong et al. [24] review applications of wearable sensors to biomechanics. Differently from [21], they focus on the devices and the sensors that are used for motion tracking. Moreover, they explain advantages and disadvantages of the different methods. The recent review [9] focuses specifically on wearable inertial sensors. The authors analyze several medical applications of wearable inertial motion tracking, including gait analysis, stabilometry, instrumented clinical tests, upper body mobility assessment, daily-life activity monitoring and tremor assessment, and for each application they report the methods proposed to tackle it. Interestingly, the selection of the applications provides a grouping of methods that reflects different complexity levels in using IMU sensors: for example, stabilometry requires simpler algorithms and fewer sensors than upper body mobility assessment. Harle et al. [25] provide a technical review of the issues arising and the methods used in IMU-based pedestrian localization. Similarly, Yang et al. [26] target localization, reviewing sensors as well as methods with their respective sources of errors. In [27] the focus is on walking speed estimation. Sabatini [28] proposes a review of (m)IMU-based tracking systems for 3D attitude estimation, focusing on the technical aspects of IHMT methods. In particular, sensor fusion techniques and related issues are explained in detail, including techniques to estimate and tune filter parameters. In [29] the authors compare six algorithms for the estimation of a smartphone’s attitude. The goal of their analysis was to select the algorithm most suited for pedestrian localization even when the magnetometer’s signal is disturbed. They test such algorithms indoors while artificially distorting the magnetic field by means of magnets. Although the authors claim that two of the selected methods perform better than the others, the reported average orientation errors differ by less than half of the error’s standard deviation.
The survey presented in the following subsections has a different scope and target with respect to previously published surveys: both applications and technical aspects are taken into account. Moreover, it explicitly encompasses the evolution of algorithms from the estimate of one rigid body pose to full body tracking. All reported methods are listed and characterized in the Appendix A (see Table A1). The table summarizes relevant information related to application, target, kinematic representation, sensor fusion technique and validation of each method and may help the reader in the following sections.
2.2. Introductory Concepts to IHMT Methods
In the late 1990s, technological advancements made inertial systems a candidate alternative to optical ones for online human motion capture. Moving from the findings in aerial vehicle navigation and accelerometry techniques, researchers started tackling the problem of human motion tracking based on (m)IMUs [30,31]. The major issues here are: how to represent and constrain human limb kinematics; how to fuse measurements from multiple sensors to track the limbs with minimal drift, also considering erroneous measurements, e.g., due to magnetic disturbances; and how to relate technical sensor frames to (anatomical) body segment frames through calibration when several (m)IMUs are involved. Moreover, procedures and measures for assessing and comparing different methods are of major interest for evaluation purposes. In the following, these major aspects and preliminary concepts of IHMT are described in more detail in order to prepare the reader for Section 2.3, which provides the IHMT survey.
2.2.1. Kinematics and Constraints
All reported techniques used for inertial body motion tracking assume that human limbs are rigid bodies. Therefore, from the point of view of kinematics, IHMT reduces to determining the attitude and/or the position of these limbs. When several limbs are involved, a kinematic chain can be modeled. The first multi-limb models used this kinematic chain in a second step, after estimating the attitude of each limb separately (e.g., [10,32]). However, the kinematic chain can be better exploited by providing joint constraints that can be added to the sensor fusion algorithm to make the estimation more consistent with human motion.
Kinematics Representation
While the position of a limb in space is typically represented through a Cartesian frame, several possibilities are proposed in the literature to represent its orientation. Euler angles are a common choice since they have an intuitive physical meaning, which is the case of the roll-pitch-yaw representation of a vehicle attitude (e.g., [15,33,34,35]) or the identification of the roll-pitch-yaw angles with the anatomical degrees of freedom (DoFs) of human limbs (e.g., [36,37]). The major drawback of such a solution (and of all three-parameter attitude representations) is the existence of singularities that may occur in certain configurations, as in the case of the gimbal lock. This is why many methods use quaternions for the estimation (e.g., [38,39,40,41,42]). Quaternions allow for a computationally efficient and singularity-free attitude representation and are often used for the attitude representation of a single body. However, besides their non-minimal rotation parametrization, they do not provide direct access to anatomical or functional angles. Factorized quaternion algorithms [43,44] decompose attitude quaternions in order to identify physically meaningful rotations (e.g., about the articulations’ axes), thus allowing for a simpler implementation of constraints. Kinematic chains are indeed often parametrized either by means of Euler angles (e.g., [45]) or using the Denavit-Hartenberg (DH) convention [46] (e.g., [47,48,49]) to represent the relative joint angles.
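To make the gimbal lock issue tangible, the following minimal Python sketch (using SciPy's Rotation class, chosen here purely for illustration) shows that, at a pitch of 90°, two different yaw/roll combinations of a ZYX Euler parametrization describe exactly the same orientation, while the quaternion of that orientation remains unambiguous:

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

# Two different yaw-pitch-roll triplets, both with pitch = 90 deg (gimbal lock).
r1 = R.from_euler('ZYX', [30, 90, 10], degrees=True)
r2 = R.from_euler('ZYX', [40, 90, 20], degrees=True)

# At the singularity yaw and roll are no longer independent: the two triplets
# describe the very same orientation.
print(np.allclose(r1.as_matrix(), r2.as_matrix()))   # True

# The quaternion of that orientation is unique (up to sign), i.e., the
# representation itself has no singular configuration.
print(r1.as_quat())
print(r2.as_quat())
```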
Constraints
Kinematic constraints play a fundamental role in the whole estimation process, as they can prevent the relative displacement of the body segments from drifting over time. Kinematic constraints are sometimes embedded in the sensor fusion algorithm to provide more consistent solutions (e.g., [45,47,50,51]). In other cases the constraints are applied after the sensor fusion algorithm has provided the attitude estimation (e.g., [39]). Since the estimated quantities are often random variables, applying limits to those variables in a consistent way is a delicate issue. This is shown by Simon in [52], where different approaches to solve the issue are reviewed. In [44] quaternions are used to represent the attitudes of the human arm limbs. Anatomical constraints such as joint angle limits and limitations of the limb motions are implemented by posing the attitude estimation as an optimization problem in which the estimated attitudes have to respect the constraints and, at the same time, optimize the consistency with the accelerometer measurements. Also in [45,47,48,49,50,53,54], the elbow is constrained to a reduced number of DoFs.
In contrast to the kinematic chain model, free segments models have been proposed in [51,55]. These representations keep some of the anatomical constraints as hard constraints, e.g., the connectivity between successive limbs [55], while others are relaxed (implemented as soft constraints) in order to reduce the effects of errors related to their implementation. For example, the elbow is not a perfect hinge joint as its axis is not fixed when the ulna moves with respect to the humerus [56]. Moreover, localizing the elbow axes is a further source of error [48]. On the one hand, detrimental effects of such model errors may be mitigated by the free segments approach, on the other hand this may lead to unwanted behaviours, such as measuring elbow abduction, which is not physically plausible.
2.2.2. Sensor Fusion Technique
Signals gathered from accelerometers, gyroscopes, magnetometers and other sensors need suitable sensor fusion techniques to derive useful information about the attitudes and poses of the limbs. Note that sensor-to-limb calibration parameters are assumed known here; calibration methods are addressed below. In most of the sensor fusion methods the unknown variables (e.g., Euler angles) are estimated in a discrete setting at successive time steps, based on the previous time step estimation and the current time step measurements. Two main approaches are adopted for sensor fusion: complementary filters (CF) and Kalman filters (KF). More complex approaches include particle filters (PF) and optimization-based approaches, which have recently become suitable for online IHMT.
Complementary filters exploit the different frequency spectra of gyroscope, accelerometer and magnetometer signals. Many of the methods that exploit the CF approach (e.g., [10,30,57,58,59,60]) apply the following steps: the accelerometer signal is used to estimate the acceleration due to gravity in the sensor frame. This and the magnetometer signal are then used to obtain a “low frequency” estimate of the sensor’s attitude. At the same time, an estimate of this attitude is dynamically calculated from the gyroscope measurement. These two estimates are then fused in the complementary filter. Some methods assume that the body acceleration is negligible, thus modeling the accelerometer signal as a noisy measurement of the acceleration due to gravity (e.g., [30,60]). In other methods the body acceleration is calculated and removed from the accelerometer signal (e.g., [58]). The acceleration due to gravity and the measured local earth magnetic field are then used to estimate the pose of the sensor in an earth-fixed frame (world frame). Some methods (e.g., [10]) simply implement the TRIAD algorithm [61] or TRIAD-like algorithms to reconstruct the attitude with respect to the world frame. Others [30,59,60] use more complex optimization algorithms to find the attitude as the best fit of the different measurements.
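To fix ideas, the following Python sketch illustrates the complementary filtering principle on the simplest possible case, a tilt (roll/pitch) estimate; it is a generic illustration under the assumption that the accelerometer measures mostly gravity, not a reimplementation of any of the cited filters, and heading would additionally require a magnetometer:

```python
import numpy as np

def tilt_complementary_filter(acc, gyro, dt, alpha=0.98):
    """Fuse accelerometer and gyroscope samples into roll/pitch estimates.

    acc:   (N, 3) accelerometer samples, assumed to measure mostly gravity [m/s^2]
    gyro:  (N, 3) gyroscope samples [rad/s]
    alpha: blending weight of the gyroscope ("high frequency") path
    """
    n = acc.shape[0]
    roll, pitch = np.zeros(n), np.zeros(n)
    # initialise from the first accelerometer sample ("low frequency" estimate)
    roll[0] = np.arctan2(acc[0, 1], acc[0, 2])
    pitch[0] = np.arctan2(-acc[0, 0], np.hypot(acc[0, 1], acc[0, 2]))
    for k in range(1, n):
        # "high frequency" path: integrate the gyroscope (small-angle approximation)
        roll_g = roll[k - 1] + gyro[k, 0] * dt
        pitch_g = pitch[k - 1] + gyro[k, 1] * dt
        # "low frequency" path: tilt from the measured gravity direction
        roll_a = np.arctan2(acc[k, 1], acc[k, 2])
        pitch_a = np.arctan2(-acc[k, 0], np.hypot(acc[k, 1], acc[k, 2]))
        # complementary blend of the two estimates
        roll[k] = alpha * roll_g + (1 - alpha) * roll_a
        pitch[k] = alpha * pitch_g + (1 - alpha) * pitch_a
    return roll, pitch
```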
The most widespread sensor fusion techniques are Kalman filters [62]. There are many cases in which a linear KF suffices for sensor fusion (e.g., [32,63,64,65]). In most cases, nonlinear equations require a manipulation of the KF. The Extended Kalman Filter (EKF) is the most immediate solution that has been adopted to use the KF approach with nonlinearities (e.g., [38,40,66,67,68,69]). Alternatively, Unscented Kalman Filters (UKF) are used in [47,49,53] as they provide a more accurate estimation of probability density functions (PDF) under nonlinear transformations. The method proposed in [70] uses unscented transformations, but implements constraints through probabilistic graphical models. Particle filters have been used in [39,42] to further improve and generalize the representation of PDFs.
Compared to the EKF, the UKF improves the estimation of the transformed probability density function. Moreover, the UKF keeps a convenient complexity when compared to the PF. Conversely, the PF allows dropping the hypothesis of Gaussian distributed random variables, thus permitting a more accurate PDF estimation. Comparisons between the EKF and the UKF provided conflicting results [71,72,73]. The most recent work highlights that the performance is highly influenced by the application, describing the UKF as more robust to initialization issues whereas the EKF is more computationally efficient [74]. The few comparisons that were found between the PF and the UKF are far from IMU-based motion tracking applications and are not reported.
Recent improvements in computational power have made optimization approaches attractive for IHMT. The methods presented in [44,51,55] show optimization-based approaches that allow for both sensor fusion and the implementation of constraints. Optimization approaches make it easier to take into account large time windows for the estimate. However, a good compromise has to be found between accuracy and speed of the algorithm [51] to allow for online IHMT. These latter methods are really promising because they have proven highly flexible in adding, removing, loosening or strengthening constraints, as well as in finding a compromise between accuracy and computational burden.
2.2.3. IHMT Common Issues
There are three main issues that recur in IHMT: how to reduce the estimate’s drift, how to handle magnetic disturbances, and how to calibrate the system.
Drift
A very first approach to IHMT was based on the inertial navigation system (INS) strapdown integration of gyroscope measurements, which was inherited from the navigation of aerial vehicles. Though adapted to follow the dynamics of a human, this solution cannot be used alone as the estimate quickly drifts. Many methods focus mainly on reducing drift. One solution is fusing the INS or INS-like estimate with a quasi-static one, as is done in many CF-based approaches (e.g., [58,75], see Section 2.2.2). Since drift is mainly due to gyroscope bias, a second solution is to include the bias in the estimation and to account for it [50,75,76,77,78]. A third solution exploits constraints from the kinematic chain to avoid a drifting attitude estimate of one limb with respect to the others [41,45,47,49,50,53,58,66,67,77,79]. A further solution is used mainly in lower limb tracking and exploits contacts of the feet with the ground [38,69,78,80]. When the foot is in contact with the ground its velocity is almost null. This information can be used to reset the velocity (zero velocity update, ZUPT) and, when moving on flat ground, to also reset the height of the foot with respect to the ground. These techniques greatly reduce drift, as demonstrated in many of the aforementioned methods.
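A minimal sketch of the ZUPT idea is given below; it assumes that the accelerations have already been rotated into the world frame and that a stance detector is available, and it is meant as an illustration of the principle rather than of any specific cited implementation:

```python
import numpy as np

def strapdown_with_zupt(acc_world, stance, dt, g=np.array([0.0, 0.0, 9.81])):
    """Velocity/position integration with zero-velocity updates (ZUPT).

    acc_world: (N, 3) accelerometer samples already rotated into the world frame,
               so that a resting sensor reads approximately (0, 0, +9.81)
    stance:    (N,) boolean array from a stance detector, True when the foot is
               assumed to be on the ground
    """
    n = acc_world.shape[0]
    vel = np.zeros((n, 3))
    pos = np.zeros((n, 3))
    for k in range(1, n):
        vel[k] = vel[k - 1] + (acc_world[k] - g) * dt   # remove gravity, integrate
        if stance[k]:
            vel[k] = 0.0          # ZUPT: the foot velocity is ~0 during stance
        pos[k] = pos[k - 1] + vel[k] * dt
        if stance[k]:
            pos[k, 2] = 0.0       # on flat ground the foot height can also be reset
    return vel, pos
```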
Magnetic Disturbances
Many of the aforementioned methods rely on magnetometers. Although they are a valuable aid in providing an absolute orientation reference, their signals are easily distorted by the presence of ferromagnetic materials in the vicinity of the sensor. Distortion effects are typically classified as hard and soft iron interferences (e.g., [38,81]), which are related respectively to permanently magnetized objects and to objects that are magnetized only when an external field is applied. Hard iron effects cause an offset of the earth magnetic field whereas soft iron effects cause a distortion. If the magnetic environment does not change, these effects can be corrected through internal sensor calibration, which is out of the scope of this article. However, for dealing with a changing magnetic field, caused either by a changing environment or by translational motion of the sensor in an environment with an inhomogeneous magnetic field, several solutions have been proposed. The simplest solution is to establish a policy to decide when the magnetometer signal is reliable. This can be done by thresholding its magnitude (e.g., [82,83]). Another common solution is limiting the contribution of the magnetometer measurement to the heading variable (e.g., [45]) or to two components (e.g., [59]). A more sophisticated solution is model-based estimation of the disturbance; e.g., in [84] the magnetic field direction is estimated simultaneously with the sensor orientation. Another approach is proposed in [85]. Under the assumption that the magnetic field is constant for a given period, the authors take the magnetometer measurement at the beginning of the period as a reference. They then use the error with respect to this reference at each time step to update the error state estimate in their Kalman filter. A survey of methods to handle earth magnetic field disturbances is proposed by Ligorio and Sabatini in [86].
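As an illustration of the simplest policy mentioned above (thresholding the magnitude), a sketch could look as follows; the expected field strength, the tolerance and the apply_heading_correction call are placeholders, not values or functions taken from the cited works:

```python
import numpy as np

def magnetometer_is_reliable(mag, expected_norm=50.0, tol=0.2):
    """Simple gating policy: accept a magnetometer sample only if its magnitude
    stays close to the expected local field strength (values in microtesla here
    are purely illustrative)."""
    return abs(np.linalg.norm(mag) - expected_norm) <= tol * expected_norm

# Typical use inside a fusion loop: skip the heading correction when the
# sample is deemed unreliable (apply_heading_correction is a hypothetical hook).
# if magnetometer_is_reliable(mag_sample):
#     apply_heading_correction(mag_sample)
```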
Calibration
All IMU-based motion reconstruction algorithms require some parameters to be provided.
A subset of these parameters defines the orientations (and sometimes the positions) of the IMU frames with respect to the tracked body segments they are attached to. In most of the cases these parameters are assumed to be known: the IMU frame is supposed to be physically aligned to the body frame (e.g., [50,60,87]). In other cases these parameters are obtained by means of a calibration procedure that is carried out at the beginning of the capturing session (e.g., [47]). Another subset is related to the dimensions of the human body: human limb lengths are typically either measured (e.g., [87]), taken from anthropometric models (e.g., [64]) or calculated by means of calibration procedures (e.g., [64,88]). In contrast to the IMU-to-segment orientations and positions, there is no need for online estimation of human limb lengths as they can safely be assumed constant during tracking.
Several calibration procedures were proposed to obtain IMU and limb parameters when tracking humans. The most typical procedure requires the human to rest in the neutral pose (N-pose), that is, standing still with the arms hanging vertically alongside the trunk (e.g., [39,45,47,89]). Another widespread calibration pose is the T-pose, where the user is standing still keeping the arms horizontal in the frontal plane [47,64,90,91]. In [45] the user is asked to lean forward to define an earth-fixed reference frame. In [14] the user is required to assume a rest pose before each motion. Besides the static poses, functional calibration methods require the user to perform rotations around different joint axes in order to better align the body segment frames with anatomical axes (e.g., [54,92,93]).
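To make the static-pose idea concrete, the sketch below estimates the sensor-to-segment tilt from an accelerometer reading averaged over an N-pose; it assumes the sensor measures only gravity during the pose, leaves the heading about the vertical undetermined (a second pose, a functional movement or a magnetometer would be needed for that), and is not taken from any of the cited procedures:

```python
import numpy as np

def sensor_to_segment_tilt(acc_static, g_segment=np.array([0.0, 0.0, -1.0])):
    """Estimate the sensor-to-segment rotation (up to heading) from a static pose.

    acc_static: mean accelerometer reading while the subject holds the pose,
                assumed to measure only gravity
    g_segment:  gravity direction expected in the segment frame during that pose
    Returns a rotation matrix mapping the measured gravity direction onto g_segment.
    """
    a = -acc_static / np.linalg.norm(acc_static)   # gravity direction in sensor frame
    b = g_segment / np.linalg.norm(g_segment)
    v = np.cross(a, b)
    c = float(np.dot(a, b))                        # degenerate if a is opposite to b
    vx = np.array([[0, -v[2], v[1]],
                   [v[2], 0, -v[0]],
                   [-v[1], v[0], 0]])
    # Rodrigues-style formula rotating a onto b; heading about gravity stays free
    return np.eye(3) + vx + vx @ vx / (1.0 + c)
```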
In a clinical setting, it is particularly important to obtain anatomically interpretable joint angles and, hence, accurate IMU-to-segment orientations. In [48,94] calibration procedures comprising IMU placement protocols, static poses and functional movements are proposed for identifying the knee and elbow flexion/extension axes and the forearm pronation/supination axes, thus improving the estimation of the anatomical joint angles. A simpler calibration procedure based on two static poses (standing and sitting or lying) is proposed and evaluated in [89]. Picerno et al. [37] propose a specific rig equipped with an IMU for IMU-to-segment orientation calibration based on anatomical landmarks. For this, the rig endpoints have to be manually placed on anatomical landmarks; the approach is applied to the thigh and the shank.
The effects of errors in the different calibration parameters on the limb orientation estimation errors have been recently investigated in [51] demonstrating a clear dominant role of the sensor-to-segment orientations compared to the positions and limb lengths.
2.2.4. Methods’ Assessment
The assessment of motion tracking methods often relies on ground truth data: the estimated trajectories are compared to ground truth trajectories using typical metrics, such as the root mean square error (RMSE) (e.g., [30,36,50,84,88,95,96,97]) or correlation coefficients (e.g., [49,53,58,87,88,96,98]). Two performance measures are mainly used for the evaluation: the drift and the accuracy of the target reconstruction, either based on the position of a reference point or on the orientation of a rigid body.
Drift assessment requires relatively long trials. In [36] the proposed algorithm is evaluated with respect to its drift dependency over time. In [87] the drift of the wrist position estimate is calculated and reported for one circular and one square trajectory. Luinge and Veltink [95] report attitude estimation drifts obtained from a strapdown integration of the measured angular velocities as compared to a sensor fusion algorithm using a KF. In many papers validation trials last longer than a few seconds (e.g., [10,31,38,75,92]), so that it is possible to at least qualitatively assess drift by visual inspection of the RMSE over time. Other works (e.g., [10,32,80]) do not report long assessment trials, making it difficult to evaluate drift.
Accuracy assessment requires ground truth data that are at least as accurate as the method is expected to be. Single body attitude/position estimation can exploit very reliable ground truth data, such as those gathered from tilt tables (e.g., [30,43,97]). Since the IMU can be aligned very accurately to these devices, the error introduced by the evaluation device is limited and often much smaller than the estimation error. In many applications related to human motion tracking, the positions of anatomical landmarks are of interest. For this, marker-based optical motion capture (OMC) has become the gold standard (e.g., [39,40,47,48,50,55,66,77,79,91,95,99]). OMC permits evaluating both attitude and position tracking. In the first case the main source of error resides in the alignment of the coordinate frame calculated from the OMC data with respect to the model estimation frame. The second case requires estimating the parameters (e.g., the segment lengths) needed to calculate the reference point positions from the IMU data. Since it is not possible to measure these parameters exactly, they represent an additional source of error.
Other types of reference data were also used. One example is the work of Zhu et al. [32], in which the authors constrain the hand to follow a straight line and then check that the reconstructed trajectory is straight. Other mechanical platforms and robots were used in [42,57,75,76,81] to provide ground truth data.
As an alternative to tilt tables, Picerno et al. [100] used a method for assessing the orientation estimation accuracy in which mIMUs are attached to a rigid plate that is oriented in 12 different ways. They use the RMSE of the reconstructed orientation angles with respect to the known plate poses as the accuracy metric. Devices from Xsens have also been used as providers of ground truth data. Robert-Lachaine et al. recently published [91] a comparison of MVN [64] and optical motion tracking performance when using such systems either with proprietary kinematic models or when estimating angles derived from ISB (International Society of Biomechanics) recommendations. The MVN suite was also used by Pons-Moll et al. in [41] and by Taunyazov et al. in [90], whereas the attitude estimate provided by the MTx IMU units was used by Brigante et al. in [68] and by Lee et al. in [44] to validate their methods.
It is worth noting that evaluation results always depend on the assessment method, while a great variety of such methods is currently used. Hence, it is difficult to make a fair comparison between the results reported in different publications. The summary Table A1 takes this into account by reporting the assessment methods along with the results.
2.3. Survey of IHMT Methods
This section provides a survey of IHMT methods categorized by the targeted body parts. This categorization allows following the historical development of IHMT methods. Indeed, starting to work on methods for a specific target (e.g., the upper limbs) and then refining them before moving to other targets is a pattern found in many research teams. Note that the presentation of the different methods combines the solutions introduced in Section 2.2.
2.3.1. Generic Limb Orientation
Limb pose tracking has been tackled by means of (m)IMUs since the end of the last century. The works of Bachmann [30] and Marin [31] pioneered the field, targeting orientation tracking of human limbs and robotic links by using mIMUs for computer graphics applications. The former proposed a quaternion-based attitude tracking method that updates the attitude quaternion by means of the gyroscope measurement and corrects it based on a “low frequency” estimate from the accelerometer and magnetometer measurements. Differently from [30], the second model assumes a time decay of the limbs’ angular velocities. This assumption is suitable for human motion, as humans cannot maintain an average non-zero magnitude of their limbs’ accelerations for long time periods. The same group further investigated this matter with different approaches: Marin et al. in [31] moved to using an EKF to fuse a gyroscope-based quaternion attitude estimate with the estimate obtained from the accelerometer and magnetometer signals through an optimization procedure. Yun et al. tackle the problem of limb attitude estimation in a similar way with two variants: in [97] the authors take into account a decay of human limb acceleration, whereas in [43] they adopt a factorized quaternion approach to limit the use of the magnetometer measurements to heading estimation. Both methods replace optimization with the QUEST (QUaternion ESTimator) algorithm [4] to determine the attitude from the accelerometer and magnetometer measurements. Hol et al. propose an alternative approach to IHMT for pose estimation of one limb based on UWB [101]. They develop a sensor composed of a 6-axis IMU and a UWB transmitter, whose pose estimation is the goal of the algorithm. First, UWB measurements are modeled considering the transmission time as an unknown, whereas gyroscope and accelerometer models are based on the kinematics of the sensor and include time-varying biases. Finally, an EKF is set up to estimate the pose of the sensor. Kok et al. in [102] extend this method. They propose a tightly coupled approach to fuse UWB and IMU measurements to obtain a set of variables which includes the poses of human limbs. A novel two-step method for the calibration of the UWB setup is first proposed to obtain the positions and time offsets of the UWB receivers and transmitter, as well as the parameters of an asymmetric probability distribution used to model the UWB measurements. The obtained UWB measurements are then used to set up an optimization problem which includes the IMU measurements to estimate the poses of human limbs. In [10] the upper limb posture is estimated using a CF which fuses accelerometer and magnetometer signals based on the TRIAD algorithm to reconstruct the attitude of each limb. Two nonlinear CFs are proposed in [75,76] to fuse accelerometer, magnetometer and gyroscope measurements into an attitude quaternion estimation. The authors define an orientation error and demonstrate by means of a Lyapunov stability analysis that the proposed filters enforce the convergence of the defined error to zero. The method of Madgwick et al. [59] is also based on a CF and includes two variants. The former uses inertial signals only, while the latter also uses magnetometer measurements. The method with magnetometers is described in Section 2.2.2. It exploits an earth-fixed frame to reconstruct the IMU’s orientation as a quaternion.
The relation of its time derivative with the angular velocity allows the authors to use the gyroscope measurements for the CF “high frequency” estimate of the quaternion. The “low frequency” estimate is obtained from an optimization procedure in which the goal is to align vectors measured in the sensor frame with their known counterparts in the earth-fixed frame. This second part can be adapted depending on the availability of measurements, e.g., the acceleration due to gravity in the case of an IMU and, in addition, the local magnetic field in the case of an mIMU. Finally, the method of To and Mahfouz [42] tries to improve the quaternion attitude estimation by using von Mises-Fisher and Bingham densities in a PF that provides the attitude quaternion based on the IMU signals.
2.3.2. Lower Limbs Tracking
IMU-based lower limbs tracking has been tackled for several purposes. In some cases it has been used for gait analysis [36,37,79,93], in other cases only parts of the lower limbs were targeted, mainly for monitoring purposes in medical settings [94,96] or rehabilitation [44,59]. In [88] a system for tracking shank and thigh orientation in the sagittal plane is presented. The authors use two IMUs (both the accelerometers and the gyroscopes were biaxial) attached to these body segments. They perform direct integration with updates according to the difference between the detected acceleration and the acceleration due to gravity.
The work in [63] aims instead at estimating the knee flexion/extension angle based on IMUs attached to the user’s thigh and shank. The authors use KFs to estimate the IMUs’ attitudes and model the knee as a hinge joint to obtain the flexion/extension angle, assuming the orientations of the sensors with respect to the knee joint to be known. Similarly, in [98] the target is knee angle estimation, and the IMU poses with respect to the knee rotation axis are supposed to be known or determined through a calibration procedure. Favre et al. show an application of a similar approach to knee ligament injury monitoring [92,96]. The same authors further developed their method to overcome calibration issues. In [103] they propose a functional calibration procedure to obtain clinically relevant joint angles. The importance of calibration (see Section 2.2.3) for measurements in clinical settings is further witnessed by the works of Picerno et al. [37] and Cutti et al. [48,94], who developed calibration procedures to map mIMU-based 3D kinematics reconstruction to anatomical landmarks. Knee angle estimation based on two IMUs on the thigh and shank is also the target of Seel et al. in [93]. They propose a calibration procedure that allows obtaining the knee joint position and the knee flexion/extension axis in the sensors’ frames. Based on this, they propose two magnetometer-free joint angle estimation methods. The first method exploits IMU orientation estimation to obtain the knee angle as the orientation difference about the knee axis. The second method directly exploits the hinge joint assumption to obtain the knee angle by integrating the difference of the angular speeds with respect to the knee axis. Finally, drift is removed by an acceleration-based joint angle estimation. They test their method against ground truth from an OMC system by mounting IMUs and optical markers on both human and prosthetic legs.
Lower limb reconstruction has often been used to aid localization during locomotion. Examples include the methods presented in [65,104]. The first exploits the detection of contacts and a lower limb biomechanical model to correct acceleration and velocity errors. Localization is then obtained by integration of the linear velocity. The second implements KFs to estimate limb orientations from mIMU signals. The KFs estimate the IMU biases and the errors of the limb orientation quaternions, which are then used to correct the orientation estimate from the INS. The method also implements ZUPT and an adaptive weighting of accelerometer and magnetometer signals to mitigate the detrimental effects of linear acceleration and magnetic field disturbances. Estimates from the left and right legs are finally merged by a KF to obtain the pelvis displacement.
Joukov et al. propose to use five IMUs to track locomotion for gait analysis [79]. They use two kinematic models to describe the support and the swing leg. The first connects the foot to the ground by means of hinge joints (stance leg), whereas the latter connects the waist to the ground by means of three prismatic and three hinge joints. IMU data are fused in an EKF whose states comprise the joint variables and their time derivatives. The method is validated on ten cycles, but only knee joint angles are reported.
Zihajehzadeh and Park in [105] propose a method that substitutes the magnetometer with UWB. They use 7 IMUs attached to the feet, shanks, thighs and pelvis, as well as 3 UWB tags attached to the feet and pelvis, to reconstruct the lower limb motion and localize the pelvis. Interestingly, they exploit the robustness of the limbs’ inclination estimate to remove yaw estimation drift. Their method starts from a first KF (tilt KF) that estimates the inclination of the seven segments based on accelerometer and gyroscope signals. They then use UWB in a second KF whose outputs are the feet positions and the yaw of the feet and pelvis. This output and the tilt KF output are finally used to estimate the yaw of the shanks and thighs. They obtained good results (orientation error below 5° and position error below 5 cm) in walking, jumping and ascending validation trials.
2.3.3. Upper Limbs Tracking
More than a decade ago, Luinge and Veltink in [95] exploited IMUs to track the orientation of the upper limbs by modelling the accelerometer and gyroscope measurements as a function of attitude, biases and noises and using a KF to estimate orientation errors and biases based on these models. This method was applied in [54] for tracking the relative orientation of the forearm with respect to the upper arm. Other works from these and associated researchers addressed magnetic disturbance handling [84] and extending the method to full body motion tracking [64].
Mihelj [50] used IMUs to track human arm motion in a rehabilitation task. In this task the user’s hand was firmly fixed to a robot and the known hand pose was used to complement the IMU information. mIMUs were also used by Jung et al. in [67] to track the motion of the trunk and the upper limbs. The upper limbs were modeled as two four-DoF serial kinematic chains connected to the trunk, while the trunk itself had three rotational joints with respect to the pelvis.
Bleser et al. proposed in [45] a novel method for upper limb tracking that exploits an egocentric camera and markers to aid the mIMU-based estimation. The topic was then further investigated, addressing motion tracking algorithms for general kinematic chains [66], the effects of different model calibration errors and biomechanical model representations on the segment orientation estimation accuracy [51] (studied based on arm motions), simultaneous motion and IMU-to-segment calibration estimation [106], as well as low-cost full body sensor suits [107]. Targeted applications include ergonomics in industrial manufacturing [108] and rehabilitation [109].
Peppoloni proposed in [47] an mIMU-based method for arm tracking, modeling each shoulder and elbow with five DoFs and using a UKF to fuse the mIMU data. In [70], the same group proposed a method where the UKF was replaced by a probabilistic graphical model approach. The method takes into account the constraints provided by the kinematic chain model and implements a message passing approach to estimate the joint angles. Considered applications include ergonomics [110], robot teleoperation [111] and rehabilitation.
Particle filters were used by Zhang et al. [39] to fuse inertial and magnetometer measurements for estimating the elbow flexion/extension angles. The same authors had previously worked on a UKF-based method presented in [49].
The upper limb tracking approach of El Gohary [53,77] exploits IMU measurements fused in a UKF. The method was eventually improved [78] by including IMU biases and ZUPTs to limit drift.
Taunyazov et al. [90] adopted a simpler approach to track the upper limbs. Their method relies on one IMU mounted on the upper arm and a simple mechanical tracker equipped with a potentiometer to measure the elbow’s rotation angle.
Finally, upper limb pose estimation is the goal of the methods analyzed in Section 3, i.e., [32,45,47,58,97], which are all based on mIMUs and exploit different calibration and sensor fusion techniques.
2.3.4. Full Body Motion Tracking
Some of the aforementioned methods were extended to full body motion tracking. Works from professor Veltink’s group led to the development of a commercially available inertial body tracking system based on a body suit with 17 mIMUs [64] (the Xsens MVN system). The motion reconstruction algorithm also benefits from the work of Schepers, Roetenberg and Slycke on the exploitation of disturbed magnetic field signals [99,112].
Vlasic et al. [14] developed a full body suit equipped with 18 IMUs and eight ultrasonic sources. The IMUs were equipped with microphones so that the received signals provided a reference to avoid the drift that would occur when purely integrating accelerometer and gyroscope measurements.
The full body tracking method of Pons-Moll et al. [41] is mainly based on camera images. The limbs poses are inferred from the video information within a set of possible poses. This set is reduced thanks to the orientation cues from IMUs mounted on the body.
A linear CF is proposed in [58] in which the orientation quaternion of each limb is obtained as a linear combination of an estimate based on the gyroscope measurements and one based on the accelerometer and magnetometer measurements.
Miezal et al. [66] exploited an EKF to develop a general framework for motion tracking of arbitrary kinematic chains based on mIMUs.
An interesting probabilistic method has been developed by Kok et al. in collaboration with XSens [55]. Instead of using a recursive filter, joint angles are estimated from IMU measurements using constrained optimization. Here, constraints from the biomechanical model and from assumptions about the average acceleration over time are included into the cost function as both hard and soft constraints. Moreover, errors due to sensor shortcomings and soft tissue artefacts are modelled by incorporating appropriate noise terms. The maximum a posteriori estimate is obtained in an offline process using an infeasible start Gauss Newton method to solve the weighted least squares problem. Recently, Miezal et al. [51] proposed a variation of Kok’s offline method to enable online constrained optimization using a sliding window approach.
Multiple limbs and full body suits have been applied to several fields. In the sports field, for example, Ruffaldi et al. in [113] use IMUs to analyze rowing performance by estimating the rower’s motion based on five mIMUs. Measurements from the rowing simulator hardware (oars and seat) aid the overall estimate. Supej et al. developed a full body suit based on Xsens MTx sensors to track ski performance [114]. In [57] Miller et al. address remote robot control through IMU-based motion tracking. YostLabs (Portsmouth, OH, USA), formerly YEI Technology, distributes a full body IMU-based motion tracking system applied to computer graphics and virtual reality (PrioVR). The system enables computer game players to control virtual characters through their own motions (see YEI Technology, http://www.yeitechnology.com/).
3. Selected Methods
This second part of the article is in the form of a tutorial and provides more details on five methods which have been selected in order to span the different areas identified in Section 2. These methods differ concerning the sensor fusion technique, using either a CF, KF, EKF or UKF. They also differ concerning the sensors that are used: all of them exploit IMUs, but magnetometer signals are not always used and one method requires a visual reference for tracking human upper limbs. They also differ regarding the kinematic models: some use Euler angles, some use the DH convention and others use quaternions. Moreover, they differ in how the constraints of the kinematic chain are considered. Finally, they differ in how the parameters of the algorithms are set. In the following, these methods are briefly recalled; more details can be found in the cited papers.
The following notation will be used. The i-th accelerometer signal will be $a_i$, the gyroscope’s will be $\omega_i$ and the magnetometer’s will be $m_i$. Vector $\ddot{p}$ will denote the linear acceleration of a point. The earth magnetic field and the gravity vectors will be respectively $b$ and $g$. A left superscript, as in ${}^{l}v$, will specify that the vector $v$ is written in the reference system $l$. $I_n$ will be the size $n$ identity matrix, $0_{n \times m}$ an $n$ by $m$ null matrix, and $T$ is the sample time. The quaternion $q_i$ will represent the attitude of the i-th body.
3.1. Method 1
The first method, described in [32], is suitable for the reconstruction of an arbitrary kinematic chain, given that each link is provided with a nine-axis mIMU. Figure 1 shows the block diagram that summarizes this method. Given two consecutive links in the kinematic chain, namely $i$ and $i+1$, each one provided with a frame, the authors represent the orientation between the two frames by an axis-angle representation with axis $n$ and rotation angle $\theta$, from which the corresponding rotation matrix can be obtained.
The rates of change of the gravity vector ${}^{i}g$ and of the earth magnetic field vector ${}^{i}b$ expressed in the i-th frame can be calculated as
${}^{i}\dot{g} = -S({}^{i}\omega)\,{}^{i}g, \qquad {}^{i}\dot{b} = -S({}^{i}\omega)\,{}^{i}b$  (1)
where ${}^{i}\omega$ is the angular velocity of the frame and $S(v)$ is the skew-symmetric matrix of vector $v$
$S(v) = \begin{bmatrix} 0 & -v_z & v_y \\ v_z & 0 & -v_x \\ -v_y & v_x & 0 \end{bmatrix}$  (2)
It is worth noting that the gyroscope measurement is treated here as a known control input, thus not taking into account the gyroscope measurement noise.
Under the assumption of slow motion (or that the linear acceleration is known) and that the i-th sensor frame is aligned with the i-th link frame, ${}^{i}g$, ${}^{i}\omega$ and ${}^{i}b$ are approximately the output of the i-th sensor, i.e.,
$a_i \simeq {}^{i}g, \qquad \omega_i \simeq {}^{i}\omega, \qquad m_i \simeq {}^{i}b$  (3)
The authors hence propose to use a KF for each segment in which the state is
$x_i = \begin{bmatrix} {}^{i}g \\ {}^{i}b \end{bmatrix}$  (4)
The process model between steps $j$ and $j+1$ is derived from Equation (1):
$x_i(j+1) = \left( I_6 - T \begin{bmatrix} S(\omega_i) & 0_{3\times3} \\ 0_{3\times3} & S(\omega_i) \end{bmatrix} \right) x_i(j) + w_j$  (5)
where $w_j$ is white noise. According to Equation (3) the measurement model is hence
$z_i(j) = I_6\, x_i(j) + v_j$  (6)
where $z_i = [a_i^T\; m_i^T]^T$ and $v_j$ is the white measurement noise.
The KF estimates of ${}^{i}g$ and ${}^{i}b$ feed the QUEST algorithm to calculate the attitude quaternion $q_i$. The resulting quaternion is converted to a rotation matrix $R_i$ that feeds the homogeneous matrices
$A_i = \begin{bmatrix} R_i & p_i \\ 0_{1\times3} & 1 \end{bmatrix}$  (7)
in which $p_i$ is the position of the origin of frame $i$ in frame $i-1$. These matrices are then recursively applied from the chain root up to the desired point to obtain its position in the root frame.
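The recursive application of the homogeneous matrices of Equation (7) amounts to a standard forward-kinematics product; a generic sketch (not the authors' code) is:

```python
import numpy as np

def chain_position(rotations, offsets):
    """Recursively apply the homogeneous matrices of Equation (7) from the chain
    root to the last link and return the position of its frame origin in the
    root frame.

    rotations: list of 3x3 rotation matrices (frame i expressed in frame i-1)
    offsets:   list of 3-vectors, position of the origin of frame i in frame i-1
    """
    A = np.eye(4)
    for R, p in zip(rotations, offsets):
        Ai = np.eye(4)
        Ai[:3, :3] = R
        Ai[:3, 3] = p
        A = A @ Ai
    return A[:3, 3]
```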
3.2. Method 2
In the method proposed by Yun et al. [97], each limb is supposed to be independent of the others and equipped with a nine-axis mIMU sensor. Figure 2 shows the block diagram that summarizes this method. The attitude of the i-th limb with respect to the root frame is represented by a quaternion $q_i$. Under the assumption that the linear acceleration of the human limbs is negligible with respect to gravity, and that the mIMU axes are aligned with the limb ones, $q_i$ is initially estimated by means of the QUEST algorithm fed by equally weighted accelerometer and magnetometer signals, thus obtaining $\tilde{q}_i$. To compensate for the dynamic effect of the linear acceleration, the authors estimate the rate of change of $q_i$ based on the link angular velocity $\omega_i$, also measured by the mIMU:
$\dot{q}_i = \frac{1}{2}\,\Xi(q_i)\,\omega_i$  (8)
where $\Xi(q_i)$ is the matrix representation of the quaternion:
$\Xi(q) = \begin{bmatrix} -q_1 & -q_2 & -q_3 \\ q_0 & -q_3 & q_2 \\ q_3 & q_0 & -q_1 \\ -q_2 & q_1 & q_0 \end{bmatrix}$  (9)
The authors also assume that the human limb acceleration is bounded and averages to zero over a certain amount of time; hence they propose to model the angular velocity as exponentially decaying over time:
$\dot{\omega}_i = -\frac{1}{\tau}\,\omega_i$  (10)
where $\tau$ is a parameter of the algorithm that determines the time horizon within which the acceleration averages to zero. These two methods for estimating $q_i$ are fused by means of an EKF in which the state vector is
$x_i = \begin{bmatrix} \omega_i \\ q_i \end{bmatrix}$  (11)
The process model between steps $j$ and $j+1$ is derived from Equations (8) and (10):
$x_i(j+1) = \Phi_j\, x_i(j) + w_j$  (12)
where $w_j$ is white noise in a 7-dimensional space and
$\Phi_j = \begin{bmatrix} e^{-T/\tau} I_3 & 0_{3\times4} \\ \frac{T}{2}\,\Xi(q_i(j)) & I_4 \end{bmatrix}$  (13)
The measurement model is the identity, as the measurement vector stacks the gyroscope reading and the QUEST estimate, i.e., $z_i = [\omega_i^T\; \tilde{q}_i^T]^T$. The final estimate of $q_i$ is then used to reconstruct the pose of the links composing the human arm.
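A minimal sketch of one prediction step, consistent with our reading of Equations (8)-(12) and assuming a scalar-first quaternion convention (this is an illustration, not the authors' implementation), is:

```python
import numpy as np

def xi(q):
    """4x3 matrix representation of the (scalar-first) quaternion q, cf. Equation (9)."""
    q0, q1, q2, q3 = q
    return np.array([[-q1, -q2, -q3],
                     [ q0, -q3,  q2],
                     [ q3,  q0, -q1],
                     [-q2,  q1,  q0]])

def predict(q, w, dt, tau):
    """One prediction step in the spirit of Method 2: first-order integration of the
    quaternion kinematics, Equation (8), and exponential decay of the angular
    velocity, Equation (10)."""
    q_new = q + 0.5 * dt * xi(q) @ w
    q_new /= np.linalg.norm(q_new)     # keep a unit quaternion
    w_new = np.exp(-dt / tau) * w      # decaying angular velocity
    return q_new, w_new
```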
3.3. Method 3
The third method is presented in [58]. The attitude of the i-th limb with respect to the root is represented by the quaternion $q_i$, and a mIMU sensor is assumed to be attached to each moving limb. The authors propose two versions of their method. In the first version, called pure, the linear acceleration is neglected. In the second version, called perfect, the authors model the human body as a kinematic chain, which allows them to calculate the linear acceleration of each frame. Figure 3 summarizes the two versions of the method. The authors assume that the mIMU axes are aligned with the limb frames, so that no sensor-to-segment rotation is needed, and they suppose to know all the parameters that are required to define the kinematic chain.
In both versions the authors propose a complementary filter in which the “high frequency” estimation of $q_i$, namely $q_i^{\omega}$, is obtained from the limb angular velocity as in Equation (8). The “low frequency” estimation of $q_i$, namely $q_i^{am}$, is obtained from $a_i$ and $m_i$ by means of the QUEST algorithm. Given the estimation $\hat{q}_i(j)$ at time step $j$, the proposed CF computes
$\hat{q}_i(j+1) = k\, q_i^{am}(j+1) + (1-k)\, q_i^{\omega}(j+1)$  (14)
where $q_i^{\omega}(j+1)$ is obtained by integrating Equation (8) starting from $\hat{q}_i(j)$, and $k$ is a parameter that allows tuning the filter.
The pure and the perfect filters differ in the gravity estimate ${}^{i}\hat{g}$ that is provided to the algorithm:
${}^{i}\hat{g} = \begin{cases} a_i & \text{(pure)} \\ a_i - \ddot{p}_i & \text{(perfect)} \end{cases}$  (15)
The authors associate a hierarchical model tree with the human kinematic chain so that one limb is the root and every other limb $i$ has its parent $p$. Given the angular velocity $\omega_i$ of the i-th limb, the linear acceleration $\ddot{p}_i$ is
$\ddot{p}_i = \ddot{p}_p + S(\dot{\omega}_i)\, r_i + S(\omega_i)\,S(\omega_i)\, r_i$  (16)
where $\ddot{p}_p$ is the linear acceleration of the i-th limb’s parent, $S(\cdot)$ is defined in Equation (2), and $r_i$ is the position of the i-th frame origin in the parent frame.
Finally, $\hat{q}_i$ is used to reconstruct each limb’s pose according to the defined kinematic chain.
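The acceleration propagation of Equation (16) is standard rigid-body kinematics; a sketch of how the "perfect" variant could compute it (assuming all vectors are expressed in a common frame, which the original paper handles with its own frame conventions) is:

```python
import numpy as np

def skew(v):
    """Skew-symmetric matrix of a 3-vector, as in Equation (2)."""
    return np.array([[0, -v[2], v[1]],
                     [v[2], 0, -v[0]],
                     [-v[1], v[0], 0]])

def child_acceleration(a_parent, w, w_dot, r):
    """Rigid-body propagation of the linear acceleration from a parent frame to a
    child frame attached to it, in the spirit of Equation (16). All vectors are
    assumed to be expressed in one common frame."""
    return a_parent + skew(w_dot) @ r + skew(w) @ (skew(w) @ r)
```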
3.4. Method 4
The fourth selected method [45] has two main innovations with respect to the previous ones. First, it embeds the kinematic constraint equations in the sensor fusion filter. Second, it proposes a visual reference to aid magnetometers under severe magnetic disturbances. This method aims at tracking the upper body (but can be extended to the full body) by means of five mIMUs attached to the chest, the upper arms and the forearms. The mIMU on the chest is also provided with a camera (CmIMU) that tracks the markers placed on the user’s wrists. The authors consider a five degree of freedom (DoF) model for each arm and also model the shoulder motion accounting for the scapulohumeral rhythm [115]. The resulting kinematic chain is rooted in the chest and organized as a hierarchical tree.
The method is based on three loosely coupled EKFs: the first returns the trunk orientation given the mIMU signals, while the latter two estimate the shoulder (three DoFs) and elbow (two DoFs) joint angles of each arm based on the mIMU signals, the trunk orientation and the wrist position obtained from the camera image. Figure 4 summarizes the components of the EKF that estimates the arm motion given the trunk orientation.
The state of these EKFs stacks the joint angles with their first and second time derivatives, $X = [\theta^T\; \dot{\theta}^T\; \ddot{\theta}^T]^T$; the process model is linear and assumes a constant angular acceleration between two time steps $j$ and $j+1$, thus having for the i-th angle
$\begin{bmatrix} \theta_i(j+1) \\ \dot{\theta}_i(j+1) \\ \ddot{\theta}_i(j+1) \end{bmatrix} = \begin{bmatrix} 1 & T & T^2/2 \\ 0 & 1 & T \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} \theta_i(j) \\ \dot{\theta}_i(j) \\ \ddot{\theta}_i(j) \end{bmatrix} + w_j$  (17)
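Under our reading of Equation (17), i.e., a per-angle state stacking the angle with its first and second time derivatives, the process matrix can be sketched as:

```python
import numpy as np

def angle_transition(T):
    """Discrete transition for one joint angle under constant angular acceleration,
    with per-angle state [theta, theta_dot, theta_ddot], cf. Equation (17)."""
    return np.array([[1.0, T, 0.5 * T ** 2],
                     [0.0, 1.0, T],
                     [0.0, 0.0, 1.0]])

# The full process matrix stacks one such block per joint angle, e.g. for a
# five-DoF arm sampled at 100 Hz:
F = np.kron(np.eye(5), angle_transition(0.01))
```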
The authors then propose a calibration procedure to relate the state to the available measurements. Assuming that the mIMUs sit on the frames of the limbs, the orientation of each mIMU frame with respect to the related link frame is represented by a rotation matrix that is obtained through this procedure, as is the position of the CmIMU with respect to the shoulder joint center. The other link lengths are gathered from anthropometric tables. Given the state $X$, the orientation of each mIMU with respect to the root can be computed through the kinematic chain, and the mIMU measurements are modeled as
$\omega_i = {}^{s_i}\omega + n_{\omega}, \qquad a_i = {}^{s_i}\ddot{p}_i - {}^{s_i}g + n_a, \qquad m_i = {}^{s_i}b + n_m$  (18)
where the left superscript $s_i$ denotes quantities expressed in the i-th mIMU frame, $\ddot{p}_i$ is the acceleration of the link hosting the i-th mIMU (computed through the kinematic chain), and $n_{\omega}$, $n_a$, $n_m$ are white noises. The magnetometer equation is only partially used and reduced to the heading direction. A further measurement equation relates the wrist position predicted by the model to the wrist position estimated from the camera:
${}^{c}p_w = {}^{c}\hat{p}_w(X) + n_c$  (19)
where ${}^{c}\hat{p}_w(X)$ is the wrist position predicted in the camera frame and $n_c$ is white noise. The measurement equations are then grouped as
$z = h(X) + n$  (20)
which is linearized to obtain the observation matrix
$H_j = \left.\dfrac{\partial h}{\partial X}\right|_{X=\hat{X}(j)}$  (21)
This method directly provides the poses of all limbs.
3.5. Method 5
The last selected method [47] has two main differences with respect to the previous ones. First, it does not rely on the linearization of nonlinear equations, but exploits the unscented transformation to cope with non-linearities. Second, it refines the kinematics of the upper body. The method aims at upper limb motion tracking. A mIMU is attached to each of the clavicle, the upper arm and the forearm. Taking the chest as the root, a seven-DoF hierarchical kinematic model of each arm is developed according to the Denavit-Hartenberg convention.
The sensor fusion technique of this method is an Unscented Kalman Filter in which the state vector $X$ again stacks the joint angles with their first and second time derivatives, and the process model is the same as Equation (17).
The authors propose a calibration procedure to gather the parameters needed to relate the state $X$ to the measurements. The orientation of each mIMU frame with respect to the related link frame is represented by a rotation matrix, whereas its translation with respect to the parent frame is measured, thus obtaining the homogeneous matrix that fully refers the mIMU frame to its parent’s one. Given that the s-th mIMU is attached to the i-th frame, whose parent frame is $p$, the measurement model predicts the accelerometer, gyroscope and magnetometer signals from the state through the kinematic chain and the sensor-to-parent transform:
$z_s = \begin{bmatrix} a_s \\ \omega_s \\ m_s \end{bmatrix} = h_s(X) + n_s$  (22)
where the sensor-to-parent transform is composed of ${}^{s}R_p$, the rotation matrix from the parent frame to the sensor frame, and ${}^{s}t_p$, the position of the sensor frame relative to the parent frame expressed in the sensor frame. The measurement equations are then grouped as in Equation (20). In this case the function $h$ is not linearized, but it is used in the unscented transformation that provides the measurement estimation based on the state prediction. As for Method 4, the state already provides the pose of each limb.
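For completeness, a generic unscented transform (an illustrative textbook form, not the specific parametrization used in [47]) can be sketched as follows; the UKF uses it to propagate the state distribution through the nonlinear measurement model without linearization:

```python
import numpy as np

def unscented_transform(mean, cov, f, kappa=1.0):
    """Propagate a Gaussian (mean, cov) through a nonlinear function f using sigma
    points; this is the basic building block the UKF relies on instead of
    linearizing the measurement model."""
    n = mean.size
    S = np.linalg.cholesky((n + kappa) * cov)
    sigma = [mean] + [mean + S[:, i] for i in range(n)] + [mean - S[:, i] for i in range(n)]
    w = np.full(2 * n + 1, 1.0 / (2.0 * (n + kappa)))
    w[0] = kappa / (n + kappa)
    Y = np.array([f(s) for s in sigma])
    y_mean = w @ Y
    y_cov = sum(wi * np.outer(yi - y_mean, yi - y_mean) for wi, yi in zip(w, Y))
    return y_mean, y_cov
```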
4. Comparison
4.1. Experimental Setup
The selected methods were compared to each other using OMC. Ground truth data were obtained from the Vicon (OMG plc, Oxford, UK) OMC system while tracking a healthy 28-year-old male who was equipped with the mIMUs (Colibri mIMUs from Trivisio Prototyping GmbH, sampled at 100 Hz), the CmIMU (Firefly MV color camera from PointGrey with a diagonal field of view of 140 deg, sampled at 12.5 Hz, hardware synchronized with the mIMUs), and markers on the anatomical landmarks. After holding the N-pose and the T-pose as a calibration procedure, he was asked to perform several movements involving one functional degree of freedom at a time, namely elbow flexion/extension, forearm pronation/supination, shoulder flexion/extension, shoulder abduction/adduction, and shoulder internal rotation. The participant gave his informed consent for inclusion before he participated in the study. The study was conducted in accordance with the Declaration of Helsinki, and the protocol was approved by the Ethics Committee of Scuola Superiore Sant’Anna (Delibera n. 1/2017). The setup of the experiment is shown in Figure 5.
The joint motion reconstruction from optical data is based on the following kinematic model: the chest was considered as a steady rigid body. The shoulder was modeled as a spherical joint, so that the humerus has three rotational DoFs with respect to the chest. The forearm was considered to have two DoFs with respect to the humerus, i.e., the flexion/extension and pronation/supination functional DoFs. The Vicon Nexus® software (Oxford Metrics, Oxford, UK) allowed us to use this kinematic model for offline adjustment of the marker positions. The marker on the acromion served to capture the shoulder joint center, which is assumed to lie a fixed offset below the acromion.
Each sequence lasted at least 10 s. Wrist position was used to assess the methods. Moreover, each sequence of movements includes a pair of repetitions performed at higher speed to test how methods’ performance varies as the linear acceleration increases.
Captured data include the marker positions in the Vicon reference system, the mIMU signals in the respective sensor reference systems, and the images gathered by the CmIMU in the camera reference system. All the data streams were synchronized. IMU and OMC data were manually synchronized by exploiting the transitions from static postures to motion. This synchronization method may introduce a time misalignment of up to 3 samples (variations in the data are sufficiently clear to identify the onset of motion), which means that the maximum misalignment cannot exceed 30 ms. The dataset used for the comparison is available in Zenodo (https://zenodo.org/) and can be found through the digital object identifier (DOI) of this paper.
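The onset-based synchronization can be illustrated with a simple threshold rule: find the first sample at which a motion magnitude (e.g., the gyroscope norm or the marker speed) stays above a threshold for a short time, and use the difference between the onsets of the two streams as the lag. The sketch below uses synthetic signals and arbitrary thresholds; in the experiment the alignment was done manually, so this is only one possible automated counterpart.

```python
import numpy as np

def motion_onset(magnitude, fs, threshold, min_duration=0.05):
    """Index of the first sample whose magnitude stays above `threshold`
    for at least `min_duration` seconds (static-to-motion transition)."""
    run = int(round(min_duration * fs))
    above = magnitude > threshold
    for i in range(len(above) - run + 1):
        if above[i:i + run].all():
            return i
    raise ValueError("no motion onset found")

# Synthetic example at 100 Hz: motion starts at ~1.50 s (gyro) and ~1.53 s (markers)
rng = np.random.default_rng(0)
t = np.arange(0.0, 5.0, 0.01)
gyro_norm = np.where(t > 1.50, 0.8, 0.02) + 0.01 * rng.standard_normal(t.size)
marker_speed = np.where(t > 1.53, 0.30, 0.005) + 0.002 * rng.standard_normal(t.size)
lag = motion_onset(marker_speed, 100, 0.10) - motion_onset(gyro_norm, 100, 0.20)
print(f"estimated lag: {lag} samples")
```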
4.1.1. Data Alignment for the Comparison
The comparison of the estimated positions (joints and end effectors) against the OMC-based ones requires both to be represented in a common reference system. OMC data are available in a global frame, hereafter denoted {G}, that is defined during the OMC system calibration. Since the chest frame, hereafter denoted {B}, was the root for the IMU-based estimation, all the estimated data were available in {B}. Having decided to represent all the body frames in {G}, the rigid transformation between the global frame {G} and the root IMU frame {B} is sufficient for the comparison. The homogeneous transformation matrix

${}^{G}\mathbf{T}_{B} = \begin{bmatrix} {}^{G}\mathbf{R}_{B} & {}^{G}\mathbf{t}_{B} \\ \mathbf{0}^{\top} & 1 \end{bmatrix}$ (23)
represents such a transformation, where ${}^{G}\mathbf{R}_{B}$ is the rotation matrix that aligns the axes of {B} with those of {G}, and ${}^{G}\mathbf{t}_{B}$ is the position vector of the origin of {B} expressed in {G}. Since during data capturing there was not enough information to calculate ${}^{G}\mathbf{T}_{B}$, it was decided to estimate it for each method. Let $\tilde{X}=\{\tilde{\mathbf{x}}_i\}_{i=1}^{n}$ be a set of n optically captured positions and $\tilde{Y}=\{\tilde{\mathbf{y}}_i\}_{i=1}^{n}$ the corresponding estimated positions. Let then $\tilde{\mathbf{x}}_0$ and $\tilde{\mathbf{y}}_0$ be the captured and reconstructed positions in the reference configuration, i.e., the N-pose for the present evaluation. If we consider two new sets of samples, namely X and Y, obtained from $\tilde{X}$ and $\tilde{Y}$ as

$\mathbf{x}_i = \tilde{\mathbf{x}}_i - \tilde{\mathbf{x}}_0, \qquad \mathbf{y}_i = \tilde{\mathbf{y}}_i - \tilde{\mathbf{y}}_0, \qquad i = 1, \dots, n,$ (24)
then
$\mathbf{x}_i \simeq {}^{G}\mathbf{R}_{B}\,\mathbf{y}_i.$ (25)
The rotation matrix ${}^{G}\mathbf{R}_{B}$ is calculated to minimize the reconstruction error; for any method m and reference r, the quality Q can then be computed as

$Q_{m,r} = \frac{1}{n}\sum_{i=1}^{n}\left\| \mathbf{x}_i - {}^{G}\mathbf{R}_{B}\,\mathbf{y}_i \right\|.$ (26)
Finally, the samples set obtained as
$\hat{Y} = \left\{ {}^{G}\mathbf{R}_{B}\,\mathbf{y}_i \right\}_{i=1}^{n}$ (27)
can be compared against X.
This method underestimates the absolute error of each method, but it provides a fair comparison between the methods. The quality of a set of body tracking techniques can then be evaluated against the OMC dataset by comparing the joint and end-effector points with the reference method.
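One standard way to compute a rotation that minimizes this kind of reconstruction error is the SVD-based (Kabsch) solution. The sketch below applies it to the zero-referenced sets X and Y defined above; it is an illustrative choice, not necessarily the procedure used by the authors.

```python
import numpy as np

def best_rotation(X, Y):
    """Rotation R minimizing sum_i ||x_i - R y_i||^2 for matched,
    zero-referenced point sets X and Y of shape (n, 3) (Kabsch/SVD)."""
    U, _, Vt = np.linalg.svd(Y.T @ X)
    d = np.sign(np.linalg.det(Vt.T @ U.T))          # guard against reflections
    return Vt.T @ np.diag([1.0, 1.0, d]) @ U.T

def mean_residual(X, Y, R):
    """Average Euclidean residual after rotating Y onto X (cf. Equation (26))."""
    return np.mean(np.linalg.norm(X - Y @ R.T, axis=1))

# Synthetic check: Y is X seen through a known rotation about the z axis
rng = np.random.default_rng(0)
a = 0.4
R_true = np.array([[np.cos(a), -np.sin(a), 0.0],
                   [np.sin(a),  np.cos(a), 0.0],
                   [0.0,        0.0,       1.0]])
X = rng.standard_normal((200, 3))
Y = X @ R_true                                       # y_i = R_true.T x_i, so x_i = R_true y_i
R_hat = best_rotation(X, Y)
print(np.allclose(R_hat, R_true), mean_residual(X, Y, R_hat))
```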
4.1.2. Performance Indices
Aligned mIMU and OMC data are then used to calculate the performance measures that we introduced in Section 2.2.4 and hence compare the algorithms. Given two random variables X and Z each sampled with N samples, the following indices will be used for this purpose:
- Accuracy:
$E = \frac{1}{N}\sum_{i=1}^{N}\left\| \mathbf{x}_i - \mathbf{z}_i \right\|$ (28)
- Correlation:
$C = \frac{\operatorname{cov}(X,Z)}{\sigma_X\,\sigma_Z}$ (29)
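A minimal sketch of how the two indices can be computed from aligned position samples follows, assuming E is the average Euclidean distance between matched samples and C the Pearson correlation coefficient, i.e., the generic forms given above; the synthetic trajectories are purely illustrative.

```python
import numpy as np

def accuracy_E(X, Z):
    """Average Euclidean distance between matched position samples of shape (N, 3)."""
    return np.mean(np.linalg.norm(X - Z, axis=1))

def correlation_C(x, z):
    """Pearson correlation coefficient between two scalar time series."""
    return np.corrcoef(x, z)[0, 1]

# Example on synthetic wrist trajectories (values in metres)
rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 1000)
X = np.column_stack([np.sin(t), np.cos(t), 0.1 * t])    # "ground truth"
Z = X + 0.03 * rng.standard_normal(X.shape)             # "estimate"
print(f"E = {1000.0 * accuracy_E(X, Z):.1f} mm, C = {correlation_C(X[:, 0], Z[:, 0]):.3f}")
```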
4.2. Experimental Results
Data gathered from the mIMUs provided the input for the methods reported in Section 3 to reconstruct the arm kinematics. The parameters of each method’s filter were selected to optimize the method performance in terms of stability and accuracy. To enable the comparison of the methods, OMC and mIMU-based position estimation were aligned according to the method reported in Section 4.1.1. Figure 6 shows how the mIMU-based data are first translated to match OMC data in N-pose and then rotated to obtain the best alignment with OMC data.
Figure 7 refers to one of the functional motions and shows how the error E (see Equation (28)) evolves over time.
After these examples we report the values of the error and of the correlation scored by the different methods. For the comparison of the methods, three functional movements were selected, namely the elbow flexion/extension and two of the shoulder movements. The first movement allows us to assess how the methods behave when only one joint of the kinematic chain, and hence only one mIMU, moves. The latter two involve the motion of two mIMUs. With respect to the elbow motion, in the shoulder motions the estimation provided by the methods that use the kinematic chain is likely to differ more from the other methods’ estimations. The averages of E and C over the three trials are reported in Table 1.
Table 1.
  | E |   | S |   | S |  
---|---|---|---|---|---|---
Method | E (mm) | C | E (mm) | C | E (mm) | C |
1 | 38.8 | 0.86 | 108.9 | 0.46 | 66.0 | 0.66 |
2 | 89.2 | 0.77 | 121.4 | 0.86 | 243.8 | 0.36 |
3-pure | 45.7 | 0.84 | 122.86 | 0.46 | 156.0 | 0.59
3-perfect | 59.7 | 0.84 | 100.4 | 0.50 | 272.2 | 0.60
4 | 75.7 | 0.91 | 82.7 | 0.86 | 86.0 | 0.73 |
5 | 89.2 | 0.77 | 214.4 | 0.89 | 125.4 | 0.66 |
To obtain a more detailed insight into the methods’ performance, one of the functional motions was further studied. It was divided into nine cycles: cycles 1–7 were carried out at a slower speed, whereas cycles 8 and 9 were carried out at a higher speed. Figure 8 shows how the error E is distributed within the cycles of the trial, whereas Table 2 reports the averages of E and C (see Equations (28) and (29)) for the same cycles.
Table 2.
  |   | Method |   |   |   |   |  
---|---|---|---|---|---|---|---
Index | Cycle | 1 | 2 | 3-Pure | 3-Perfect | 4 | 5
E (mm) | 1 | 43.7 | 102.8 | 53.8 | 73.4 | 72.3 | 51.3 |
2 | 49.9 | 100.3 | 64.2 | 79.9 | 90.7 | 44.8 | |
3 | 46.6 | 103.4 | 63.1 | 84.0 | 98.3 | 49.4 | |
4 | 37.7 | 102.8 | 53.3 | 68.1 | 92.1 | 47.7 | |
5 | 35.3 | 97.1 | 44.5 | 57.6 | 79.0 | 59.6 | |
6 | 49.4 | 102.8 | 51.3 | 64.1 | 85.7 | 89.6 | |
7 | 50.4 | 108.4 | 45.9 | 57.5 | 90.5 | 101.3 | |
8 | 38.9 | 100.2 | 44.7 | 56.3 | 76.0 | 97.8 | |
9 | 23.6 | 60.9 | 27.8 | 36.5 | 59.0 | 136.9 | |
C | 1 | 0.94 | 0.73 | 0.96 | 0.92 | 0.94 | 0.93 |
2 | 0.94 | 0.77 | 0.91 | 0.87 | 0.92 | 0.85 | |
3 | 0.96 | 0.75 | 0.91 | 0.82 | 0.95 | 0.81 | |
4 | 0.95 | 0.72 | 0.96 | 0.92 | 0.96 | 0.76 | |
5 | 0.95 | 0.78 | 0.93 | 0.89 | 0.94 | 0.87 | |
6 | 0.93 | 0.81 | 0.95 | 0.92 | 0.92 | 0.82 | |
7 | 0.83 | 0.77 | 0.84 | 0.85 | 0.91 | 0.76 | |
8 | 0.88 | 0.74 | 0.86 | 0.89 | 0.95 | 0.71 | |
9 | 0.87 | 0.86 | 0.92 | 0.93 | 0.95 | 0.94 |
4.3. Discussion
The methods can now be compared according to the indices proposed in Section 4.1.2. Before comparing the methods, we note from Figure 7 that for all the methods the error varies periodically with time. This periodic error may be due to residual error in the mIMU–OMC data alignment. However, each of the methods may have other sources of error related to biomechanical constraints: the lack of kinematic constraints (methods 1, 2 and 3-pure) and too rigid constraints (methods 3-perfect, 4 and 5) are both plausible causes of periodic errors in the estimation of a periodic motion.
4.3.1. Accuracy
The first measure that we proposed is accuracy, as quantified by E: the lower E, the more accurate the method. Accuracy is needed when an absolute measure of the position is required, for example in the analysis of motion or to provide force rendering when interacting with a virtual environment. From Table 1 we see that method 1 is generally the most accurate, with method 4 being comparable. We note that the accuracy gap between methods 1 and 4 is smaller for the two shoulder movements, and that method 4 is even more accurate than method 1 in one of them. This partially supports the finding that imposing a kinematic chain in the motion estimation improves the estimation when the measurements of mIMUs on different links are affected by the same joint variable. This hypothesis is further supported when methods 1, 2 and 3-pure are compared with methods 3-perfect, 4 and 5: differently from the latter, the former do not take the kinematics of the arm into account. The latter group has a lower accuracy in the elbow motion, but a better accuracy in the shoulder motions (except for method 3-perfect in one of them).
4.3.2. Correlation
Correlation C is the second measure considered. Correlation indicates whether the estimated position follows the actual pattern of the performed movement. A good correlation of the estimated human motion with the real movement suffices, for instance, to teleoperate a remote robot. In fact, in this case the operator motion has to be mapped to the robot kinematics anyway, and what matters is that this map does not vary over time. In other words, even if accuracy is poor, the human who teleoperates the robot can easily adapt the motion of his arm to control the robot, as long as there is a good correlation between the performed motion and the method’s estimate. When correlation is poor, the human has to adapt the motion of the arm and possibly perform unnatural motions to control the robot (e.g., activate more DoFs to obtain a simple elbow motion). From the point of view of correlation, Table 1 shows that the best performing methods are 4 and 5.
4.3.3. Fast Motion
One of the differences between the methods is the use of the kinematics of the human arm, in particular of the linear acceleration of the limbs. A reasonable hypothesis is that methods 3-perfect, 4 and 5 perform better than the others when the motion of the limbs is fast and the speed changes quickly, since this condition should make the linear acceleration of the limbs play a bigger role. In our case, this role is enhanced by mounting the mIMUs far from the parent joint. However, looking at Figure 8 and at Table 2, there is no significant difference in accuracy between the slow cycles (1 to 7) and the fast cycles (8 and 9). Similarly, Table 2 shows that correlation does not improve from cycles 1–7 to cycles 8–9 for methods 1, 2, 3-perfect and 5, whereas there is a small improvement for method 4. These results suggest that the linear acceleration of the limbs plays a minor role with respect to gravity, as assumed by several models. However, this aspect should be further investigated with specific motions in which gravity plays a minor role in identifying the limbs’ orientation.
4.3.4. Sources of Error
As a final remark of the discussion, the accuracy and the correlation that were obtained are generally comparable to or worse than those reported in the literature. However, apart from possible suboptimal tuning of the methods’ parameters, possible sources of error that can explain our results include the following:
Knowledge of human parameters (e.g., arm length). This source of error can be minimized by including the human parameters in the estimation, e.g., [51].
Body-to-mIMU calibration. Although the calibration procedure that we carried out suffices to determine the orientation of the mIMUs, the position of the mIMUs with respect to their parent links is still subject to assumptions. The effects of this source of error can also be reduced by a proper calibration and by taking the sensor poses into account in the sensor fusion technique.
Time alignment of OMC data with mIMU data. OMC and mIMU-based data are manually aligned based on a known motion starting from a steady condition. However, the effects of this misalignment are much smaller than the errors we have reported.
Preprocessing of data. Here we tested only the reported algorithms, without considering possible pre-filtering of the mIMU data. For example, a hard-iron magnetometer calibration would make it possible to handle data corrupted by a distorted magnetic field; a minimal sketch of such a calibration is given below.
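As an example of such preprocessing, the sketch below estimates a hard-iron (constant offset) magnetometer calibration by least-squares sphere fitting of raw samples collected while rotating the sensor. This is a generic illustration under assumed, synthetic data and is not a procedure applied in this study.

```python
import numpy as np

def hard_iron_offset(mag):
    """Least-squares sphere fit of raw magnetometer samples (n, 3):
    returns the estimated hard-iron offset (sphere centre) and field magnitude."""
    # ||m - b||^2 = r^2  ->  2 m.b + (r^2 - ||b||^2) = ||m||^2, linear in (b, c)
    A = np.hstack([2.0 * mag, np.ones((mag.shape[0], 1))])
    y = np.sum(mag**2, axis=1)
    sol, *_ = np.linalg.lstsq(A, y, rcond=None)
    offset = sol[:3]
    radius = np.sqrt(sol[3] + offset @ offset)
    return offset, radius

# Synthetic samples: field of constant magnitude shifted by a known hard-iron offset
rng = np.random.default_rng(1)
true_offset = np.array([0.12, -0.05, 0.30])
directions = rng.standard_normal((500, 3))
directions /= np.linalg.norm(directions, axis=1, keepdims=True)
samples = true_offset + 0.48 * directions + 0.005 * rng.standard_normal((500, 3))
offset, radius = hard_iron_offset(samples)
print(offset, radius)   # offset ~ [0.12, -0.05, 0.30], radius ~ 0.48
```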
5. Conclusions
After introducing the reader to the main issues of IHMT, relevant methods from the literature were reported. The analysis of the literature revealed that several approaches perform similarly for IHMT; however, optimization-based methods seem to have the potential to bring substantial improvements. Methods and solutions for lower limb tracking, such as ZUPT, have not been widely applied to full body tracking yet, although they are often capable of accurate estimation even during long walking trials. Therefore, their combination with their upper limb counterparts may improve accuracy and reduce drift of full body IHMT also when walking.
Five methods that span the different techniques used for IMU data sensor fusion were presented and analysed in depth, and an evaluation of these methods was proposed based on accuracy and correlation with OMC data. Results showed that method 1 is the best performing in terms of accuracy, followed by method 4, which is the best in terms of correlation. We hence advise using method 1 for attitude estimation and navigation purposes, whereas we consider method 4 the best choice for robot teleoperation. The analysis of motion speed provided limited insight, possibly because of the choice of movements, which make gravity play a dominant role in limb attitude estimation.
Appendix A. Summary of Presented Methods
A summary of the methods presented in Section 2 is reported in Table A1. Methods are sorted in chronological order and described according to the following categories:
Ref: reference to the article where the method is described.
Year: publication year.
Body: this category describes which part(s) of the body are targeted by the method.
Application: some methods are connected by the authors to one or more applications. Most of the time this link is made explicit in the introduction. This category reports the application(s) extracted from the article. They span from computer graphics to robotics and medical applications.
Target: this category describes the physical quantities that are targeted by the method in relation to the body parts to be tracked. For example, when the body parts to be tracked are upper limbs, the arm’s orientation and position are possible targets. The word “attitude” refers to methods that do not target a specific part of the human body but a generic rigid body.
Focus: most of the cited articles focus on the description of the IHMT method. However, some articles focus on the assessment of some methods, or on their application; in these cases fewer or no details about the IHMT method are reported. Calibration and handling of magnetic disturbances are also the focus of some of the articles reported in Section 2.
Sensors: list of the sensors required to apply the method described in the article. Each sensor is preceded by the number of units that are used.
Kinematics and Constraints: This category lists the kinematic representation(s) that was (were) chosen in the method and the biomechanical constraints that were used. Orientation is often represented as a quaternion or a rotation matrix which is a function of three independent variables. Exponential maps have also been used to represent orientation. When a kinematic chain is used, position and orientation may be obtained through the joint variables. This is the case when “kinematic chain” is reported without other kinematic representation specifications (e.g., “quaternion orientation”). “Kinematic chain” also indicates that joints are used either to impose constraints to motion or to reconstruct limbs’ pose. This category reports also the use of free segments modelling. Other constraints specifications (e.g., hinge, soft or hard constraints) are specified for some methods. This category is linked to Section 2.2.1.
Parameters: the kinematic chain, the constraints and the sensor fusion technique require some parameters to be set and tuned. These parameters are grouped and reported in this category. Lengths (e.g., arm length), orientations (e.g., the mIMU’s orientation with respect to the link it is attached to) and Kalman Filter parameters (e.g., covariance matrices) are parameters that many models require.
Sensor Fusion Technique: The mathematical method(s) that was (were) used is reported in this category. Kalman Filters, Complementary Filters and Optimization are the most popular possibilities that are reported. However, some methods use other tools such as inertial navigation systems and probabilistic graphical models. This category is linked to Section 2.2.2.
Calibration: when one or more calibration procedures are required to obtain the parameters needed for the method, these are summarized here. Most of the methods include a static calibration, whereas others require more complex procedures which include the motion of specific articulations. This category is linked to Section 2.2.3.
Validation: validation methods are reported in this category; the number of participants and the source of ground truth data are reported.
Measure: quantitative validation of the methods requires the definition of variables to be assessed. In many cases these variables replicate the target of the method; in other cases they are a subset of the variables needed for tracking the declared targets.
RMSE: root mean square error of the variables reported as measures for the validation when compared to ground truth data. Angular variables are reported in degrees, whereas positions are reported in millimetres. Drift is reported in metres per second.
Correlation: Correlation of estimated variables with ground truth data as reported in the referenced article.
The Notes column completes the description of each method with a few details specific to the method. Some recurring formulas are worth mentioning:
Sens align link refers to the assumption that a sensor’s frame is supposed to be aligned with the frame of the body it is attached to;
Sens sitting link is used when a (m)IMU is attached to a human limb. In this case the limb is supposed to be a beam and the formula refers to the assumption that the sensor’s frame origin is a point of such a beam;
Bias est refers to models in which sensors’ biases are among the variables that are estimated;
Lin acc noise refers to methods in which the linear acceleration of the sensor is considered to be noise;
Show plot refers to methods for which RMSE and/or correlation are not reported as numbers in the text or in tables, but a plot of such quantities is available.
Appendix A.1. Description of Abbreviations
In Table A1 some abbreviations are used to make the table more compact. These abbreviations are listed according to the previously defined categories and described in Table A2:
Table A1.
Ref. | Year | Body | Application | Target | Focus | Sensors | Kinematics & Constraints | Parameters | Sensor Fusion Technique | Calibration | Validation | Measure | RMSE | Correlation | Notes |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
[30] | 1999 | attitude | computer graphics | link ori | method | 1 acc, 1 gyro, 1 mag | quat ori | CF gain | CF, GN opt | no | 1, tilt table | roll | 1.0 | - | initial condition study, 120 s valid trial, lin acc negl |
[31] | 2001 | attitude | computer graphics | link ori | method | 1 acc, 1 gyro, 1 mag | quat ori | CF gain | CF, GN opt | - | 1, tilt table | quat compar | - | - | initial condition study, 25 s valid trial |
[10] | 2004 | attitude | computer graphics | link ori | method | 1 acc, 1 gyro, 1 mag | rot mat | CF gain | CF | - | 1, robot EE | roll, pitch, yaw | - | 0.7–0.87 | drift, 12s valid trial, omni phantom |
[32] | 2004 | upper limbs | - |
|
method | 2 acc, 2 gyro, 2 mag | rot mat | link len, KF params | KF, QUEST | - | 1, hor line vert line | wrist pos, shoulder abd, elbow fle | - | - | lin acc negl, sens align link, sens sitting link, 25s valid trial, show plot |
[57] | 2004 | full body | teleoperation | link ori | application | 14 acc, 14 gyro, 14 mag | kinematic chain, quat ori | CF gain | CF like | - | robot EE | - | - | - | sens sitting link, plot traj, valid teleop robot |
[95] | 2005 | upper limbs, lower limbs | medical | link ori | method | 1 acc, 1 gyro | rot mat | KF params | KF | static | 2, OMC |
|
|
- | drift modeled, sens align link |
[84] | 2005 | attitude | - | link ori | mag comp | 1 acc, 1 gyro, 1 mag | rot mat | KF params | KF, lin acc err, mag err | static | 1, box, 1, OMC |
|
|
- | linear acc noise, |
[75] | 2005 | attitude | - | link ori | method | 1 acc, 1 gyro, 1 mag | quat ori | CF gain | CF | - | mech platform | link ori | - | - | bias est, 60s trial, show plot |
[36] | 2005 | lower limbs | medical: locomotion |
|
method | 1 acc, 1 gyro | rot mat | - | acc double int |
|
1, OMC |
|
|
- | sens sitting link, sens align link, 4 s localization valid tasks |
[88] | 2006 | lower limbs | - | shank ori | method | 2 2d acc, 1 gyro | knee hinge | CF gain | CF like | static, n pose like | 8, OMC |
|
|
0.999 | motion limited sagittal |
[96] | 2006 | lower limbs | medical: knee function analysis post cruciate ligament lesion |
|
application | 2 gyro | - | - | gyro int | - | 5, US |
|
|
- | target ROM, 30m walk valid trial |
[97] | 2006 | attitude | computer graphics | link ori | method | 1 acc, 1 gyro, 1 mag | quat ori | KF params | EKF, QUEST | - | 1, tilt table |
|
2.0–9.0 | - | initial condition study, 25 s valid trial |
[50] | 2006 | upper limbs | computer graphics |
|
method | 1 2d acc, 1 1d gyro, mech track wrist pos | kinematic chain, quat ori | link len, KF params | KF, GN opt | - | 1, OMC |
|
<OMC prec | - | sens align link, sens sitting link, bias est |
[38] | 2006 | attitude | medical | link ori | method | 1 acc, 1 gyro, 1 mag | quat ori | KF params | EKF | - | 1, OMC |
|
|
- | adaptive covariance, bias est, ZUPT like sens inline calib, 120s free movements valid trial |
[54] | 2007 | upper limbs | medical: monitoring, neuromuscular disorders | elbow angle | method | 2 acc, 2 gyro | elbow hinge | KF params | KF | dynamic | 1, OMC | elbow fle | elbow fle 8–25 | - | static plus pro mov calib, variation R, 130s daily activities valid trial |
[99] | 2007 | full body | - |
|
method |
|
rot mat | link len, KF params | INS, EKF | - | 6, OMC |
|
|
- | sens align link, sens sitting link, 30s walking valid trial |
[14] | 2007 | full body | computer graphics, sport |
|
method | 18 acc, 18 gyro, 18 US | kinematic chain, quat ori | link len, KF params | EKF | static, rest pose | 1, OMC |
|
|
- | sens sitting link, 30s valid trials, drift observed |
[92] | 2008 | lower limbs | medical |
|
calibration | 2 gyro | - | - | gyro int | static, n pose, dynamic hip abd | 10, MAG |
|
|
|
30m walk valid trial |
[37] | 2008 | lower limbs | medical: gait |
|
calibration | 2 acc, 2 gyro, 2 mag | - | - | - | static | 6, OMC |
|
|
- | calibration from 6 participants |
[48] | 2008 | upper limbs | medical |
|
calibration | 4 acc, 4 gyro, 4 mag | - | - | - | static | 1, OMC |
|
0.2–3.2 | - | |
[43] | 2008 | attitude | - | link ori | method | 1 acc, 1 mag | quat ori | FQA | - | 1, tilt table | roll, pitch, yaw | - | - | sens align to link | |
[76] | 2008 | attitude | - | link ori | method | 1 acc, 1 gyro, 1 mag | quat ori | CF gain | CF | - | robot EE | link ori | - | - | bias est, 60 s valid trial, show plot |
[63] | 2009 | lower limbs | - |
|
method | 2 acc, 2 gyro | knee hinge | KF params | KF | static, n pose like | 7, OMC |
|
|
- | |
[98] | 2009 | lower limbs | - |
|
method | 4 acc, 4 gyro | knee hinge | rot acc | - | 8, OMC |
|
|
0.91 | limited motion to 80 deg, low speed valid trial | |
[103] | 2009 | lower limbs | medical |
|
calibration | 2 gyro | - | - | gyro int | dynamic | 8, MAG |
|
|
|
|
[64] | 2009 | full body | computer graphics |
|
calibration, application | 17 acc, 17 gyro, 17 mag |
|
link len, KF params | KF | static, t pose, dynamic, axis rot, closed loop calib | - | - | - | - | three steps calib, closed loop calib |
[94] | 2010 | lower limbs | medical: monitoring cerebral palsy |
|
calibration | 8 acc, 8 gyro, 8 mag | - | - | - | static | 9, OMC, 2, manual |
|
1.4-1.8 | - | manual measurement therapist |
[58] | 2010 | full body | computer graphics |
|
method | 9 acc, 9 gyro, 9 mag | kinematic chain, quat ori | link len, sens pos, CF gain | CF, lin acc err | - | sim sens meas |
|
|
0.939 0.999 | sens align link, sens sitting link, walking gait, running gait |
[112] | 2010 | pose | - |
|
mag comp |
|
rot mat | - | INS, EKF, mag sto model | - | static pos | - | - | - | |
[114] | 2010 | full body | sport |
|
application | 16 acc, 16 gyro, 16 mag | - | - | - | - | 2, GNSS |
|
|
- | 35 s pendulum valid trial, entire ski race |
[67] | 2010 | upper limbs | - |
|
method | 6 acc, 6 gyro, 6 mag | kinematic chain |
|
NR opt, inv kin | - | 1, OMC | wrist pos | wrist pos 5 | - | sens sitting link, sens align link, 180s valid trial, lin acc negl |
[69] | 2010 | pose | localization |
|
method | 1 acc, 1 gyro, 1 mag | rot mat | KF params | EKF, INS | - | 1, known path | foot pos | foot pos 450–1350 | - | bias est, lin acc est, ZUPT, ZARU, HDR, 100s valid trial 125m |
[87] | 2010 | upper limbs | - |
|
method | 2 acc, 2 gyro, 2 mag | kinematic chain, rot mat | KF params, link len | KF, CF like | - | 8, OMC |
|
|
|
sens sitting link, sens align link, const lin acc, const ang vel, sens reloc, 30s square and circle valid trials, 100 s daily activities valid trials |
[80] | 2010 | full body | - | link ori | method | - | - | KF params | KF | - | 1, OMC | - | - | - | lowest point alg, const height ground, lin acc noise, show plot, drift visible, ZUPT |
[18] | 2010 | full body | - |
|
method | 1 acc, 1 gyro, 1 mag, mag coil | rot mat | link len, KF params | INS, EKF | - | 6, OMC |
|
|
- | sens align link, sens sitting link |
[45] | 2011 | upper limbs | industrial assembly |
|
method | 5 acc, 5 gyro, 5 mag, Camera marker | kinematic chain, rot mat |
|
EKF |
|
1, OMC | wrist pos | - | - | show plot, visual insp drift, 40 s valid trial |
[41] | 2011 | full body | computer graphics, sport |
|
method | 10 acc, 10 gyro, 10 mag |
|
link len, OPT params | OPT, VMF dist | static | 1, MVN | 5 links ori | 7.3 | - | sens sitting link, 20 s valid trials |
[59] | 2011 | attitude | medical | link ori | method | 1 acc, 1 gyro, 1 mag | quat ori | CF gain | CF like, OPT | - | 1, OMC |
|
|
- | sens sitting link, sens align link, average over 860 s valid trials |
[40] | 2011 | attitude | medical | link ori | method | 1 acc, 1 gyro, 1 mag | quat ori | KF params | EKF | - | 1, OMC |
|
|
- | sens sitting link, sens align link, adaptive covariance, bias est, 20 s valid trial, lin acc negl |
[68] | 2011 | full body | medical |
|
application | 1 acc, 1 gyro, 1 mag | quat ori | KF params, link len | EKF | static | 1, MTX |
|
|
- | sens sitting link, sens align link, adaptive covariance, bias est, 20 s valid trial, lin acc negl |
[49] | 2011 | upper limbs | - |
|
method | 2 acc, 2 gyro, 2 mag | kinematic chain, rot mat | KF params, link len | UKF | static, n pose | 1, OMC |
|
|
|
DH, sens sitting link, sens fixed to limit soft tissue effect, lin acc negl |
[39] | 2011 | upper limbs | computer graphics |
|
method | 2 acc, 2 gyro, 2 mag |
|
PF params, link len | PF | static, n pose | 1, OMC |
|
|
|
bias est, lin acc negl, sens sitting link, sens fixed to limit soft tissue effect |
[53] | 2011 | upper limbs | - |
|
method | 2 acc, 2 gyro, 2 mag | rot mat |
|
UKF | - | 1, OMC |
|
- |
|
DH, sens sitting link, sens align link, 5 s anat movements valid trials |
[100] | 2011 | full body | - | link ori | assessment | 9 acc, 9 gyro, 9 mag | quat ori | - | - | static, 12 poses | 1, OMC |
|
|
- | inter MIMU error, intra MIMU error, static valid trial, MTX proprietary KF |
[77] | 2012 | upper limbs | - |
|
method | 2 acc, 2 gyro, 2 mag | kinematic chain, rot mat | link len, KF params | UKF | static, n pose | 8, OMC |
|
|
|
DH, sens align link, sens sitting link, bias est, calib remove gyro bias, 12 s functional movements valid trials, 12 s daily activities trials
[81] | 2012 | full body | - |
|
mag comp | 1 acc, 1 gyro, 1 mag | - | - | OPT | - | mech platform |
|
|
- | |
[44] | 2012 | upper limbs | medical: rehabilitation |
|
method | 2 acc | quat ori | opt params | NR opt | static | 1, MTX |
|
|
- | motion limited sagittal, lin acc negl, sens sitting link, sens align link, static poses calib, 40 s trial sagittal plane |
[66] | 2013 | upper limbs |
|
method | n acc, n gyro, n mag | kinematc chain, rot mat | KF params, link len | EKF |
|
1, OMC | hand pos |
|
- | DH, any kinematic chain | |
[104] | 2013 | lower limbs | - | link ori, pelvis pos | method | 7 acc, 7 gyro, 7 mag | kinematic chain, quat ori | KF params, link len | KF | - | 1, OMC | pelvis pos |
|
- | sens align link, sens sitting link, ZUPT, 20 s hopping valid trial, walking valid trial |
[107] | 2013 | full body |
|
application | 21 acc, 21 gyro, 21 mag | kinematc chain, rot mat | KF params, link len | EKF |
|
MIMUs comparison | - | - | - | no head | |
[108] | 2013 | upper limbs | ergonomics | link ori, link pos | application | 21 acc, 21 gyro, 21 mag, 2 goniometers | kinematic chain, rot mat | KF params, link len | EKF |
|
12 experts | execution time, RULA class freq | - | - | - |
[47] | 2013 | upper limbs | - |
|
method | 3 acc, 3 gyro, 3 mag | kinematic chain, rot mat |
|
UKF | static, n pose, t pose | 1, OMC |
|
|
|
DH, 160s functional movements valid trials |
[65] | 2013 | lower limbs | localization, training | method | 1 acc, 1 gyro, 1 mag, 1 pressure | kinematic chain, rot mat | KF params, link len, sens pos | KF | static, three walk step poses | 1, OMC, known path |
|
|
- | sens sitting link, >40s walking valid trial, >40s jogging valid trial, lin acc noise | |
[42] | 2013 | full body | medical | link ori | method | 1 acc, 1 gyro, 1 mag | quat ori | PF params | PF, VMF dist | static | 1, OMC, robot EE |
|
|
- | bias est, sens sitting links, init GN opt acc mag meas |
[60] | 2013 | attitude | - | link ori | method | 1 acc, 1 gyro, 1 mag | quat ori | CF gain, opt params | CF, GN opt | static, dynamic | 1, OMC |
|
|
- | adaptive gain CF, bias from calib, bias est, 1000s valid trials, variation mag disturbance static, lin acc negl |
[93] | 2014 | lower limbs | medical: gait |
|
method | 6 acc, 6 gyro | knee hinge | CF gain | CF like | dynamic | 1, OMC |
|
|
- | walk 10m straight valid trial |
[79] | 2014 | lower limbs | medical:gait, rehabilitation |
|
method | 5 acc, 5 gyro | kinematic chain | KF params | EKF | - | 5, OMC |
|
|
- | 2 full cycles valid trial |
[70] | 2014 | upper limbs | - |
|
method | 2 acc, 2 gyro, 2 mag | kinematic chain | U transform, link len, sens pos | PGM | static, n pose, t pose | 1, OMC |
|
|
|
DH, 160s functional movements valid trials |
[55] | 2014 | lower limbs | - |
|
method | 17 acc, 17 gyro, 17 mag |
|
opt steps, link len | OPT | unspecified | 1, OMC | knee ori | - | - | bias est, lin acc negl, covmat allanvar, no sens to hinge and acc model, no real time, 37s walking valid trial, show plot
[89] | 2014 | lower limbs | - |
|
calibration | 7 acc | rot mat | - | TRIAD like | static, n pose, seat pose | 10, OMC |
|
|
|
lin acc negl |
[109] | 2015 | full body | medical: physical activity monitoring |
|
application | - | kinematic chain, rot mat | KF params, link len | EKF | static, n pose, back bent | - | - | - | - | |
[78] | 2015 | upper limbs | - |
|
method | 2 acc, 2 gyro, 2 mag | kinematic chain, rot mat | KF params, link len | UKF | static, n pose | 1, mech |
|
|
- | DH, sens align link, sens sitting link, bias est, calib remove gyro bias, ZUPT reduce gyro bias, joint limit, mech synch crosscorrelation, 120s funct mov valid trial, 120 s norm tasks valid trial |
[113] | 2015 | full body | sport |
|
method, application | 5 acc, 5 gyro, 5 mag, 5 enc | kinematic chain, rot mat | KF params, link len | UKF | static | 1, OMC |
|
|
|
40s rowing valid trial |
[110] | 2016 | upper limbs | ergonomics |
|
application | 3 acc, 3 gyro, 3 mag, EMG | kinematic chain | KF params, link len | UKF | static, n pose, t pose | 10, manual vs auto | - | - | - | |
[90] | 2016 | upper limbs | computer graphics |
|
method | 1 acc, 1 gyro, 1 mag, mech track elbow fle | quat ori | KF params, link len | UKF | static, t pose | 1, XsensMVN |
|
- | - | lin acc negl, sens align link, sens sitting link, 5.5 s valid trial, show plot |
[86] | 2016 | - | - | link ori | mag comp | 1 acc, 1 gyro, 1 mag | quat ori | KF params | KF | - | 1, OMC | quat ori | quat ori6 | - | 180s walking valid trial, similar to [32] |
[51] | 2016 | upper limbs | - |
|
method | 3 acc, 3 gyro, 3 mag |
|
|
EKF, OPT | static |
|
|
|
- | DH, link len est, sens ori est, motion speed |
[91] | 2016 | full body | - |
|
assessment | 12 acc, 12 gyro, 12 mag | - | link len | - | static | 1, OMC |
|
|
|
complex vs simple task valid trials, 1920s manual handling valid trial, error due to biomechanics, total err, ISB kinematic model, MVN kinematic model |
[85] | 2016 | attitude | - | link ori | method | 1 acc, 1 gyro, 1 mag | quat ori | KF params | EKF | - | 4, OMC |
|
|
- | bias est, lin acc negl, 120 s texting walking valid trial, 120s swinging walking valid trial, 780 s unsupervised walking valid trial |
[105] | 2016 | lower limbs | - | link ori, pelvis pos | method | 7 acc, 7 gyro, 3 UWB | kinematic chain, rot mat | KF params, link len | KF | - | 1, OMC |
|
|
- | ZUPT, 100s walking valid trial, 100s jumping valid trial, 100s ascending valid trial |
Table A2.
Abbreviation | Full Name | Categories | Description |
---|---|---|---|
ori | orientation | Target, Kinematics & Constraints, Measure, RMSE, Correlation | orientation of a rigid body |
pos | position | Target, Calibration, Parameters, Validation, Measure, RMSE, Correlation, notes | position of a point of a rigid body |
fle | flexion/extension | Target, Measure, RMSE, Correlation, Sensors, | anatomical term of motion |
abd | abduction/adduction | Target, Measure, RMSE, Correlation, Sensors, | anatomical term of motion |
rot | rotation | Target, Kinematics & Constraints, Measure, Sensor Fusion Technique, Calibration, RMSE, Correlation | rotation related either to rotation about an axis or rotation matrix |
pro | pronation/supination | Target, Measure, RMSE, Correlation, notes | |
ret | retraction/protraction | Target, Measure, RMSE | scapular retraction/protraction |
ele | elevation/depression | Target, Measure, RMSE | scapular elevation/depression |
mag | magnetometer | Focus, Sensors, Sensor Fusion Technique, Measure, notes | referred to either 3 axis (unless otherwise specified) magnetometer or its signal |
comp | compensation | Focus | referred to compensation of magnetic field distortions |
acc | accelerometer | Sensors | referred to either 3 axis (unless otherwise specified) accelerometer or its signal |
gyro | gyroscope | Sensors, Sensor Fusion technique, notes | 3 axis (unless otherwise specified) gyroscope |
xd | - | Sensors | x axes sensor (e.g., 2d acc means biaxial accelerometer) |
mech | mechanical | Sensors, Validation, notes | mechanical is usually referred to either trackers or rigs for validation |
biomech | biomechanical | notes | - |
track | tracker | Sensors | - |
US | ultrasound | Sensors, Validation | ultrasound sensor or motion tracking system based on ultrasound |
exp | exponential | Kinematics & Constraints | exponential maps representation |
seg | segment | Kinematics & Constraints | referred to free segments representation |
CF | Complementary Filter | Parameters, Sensor Fusion Technique, notes | - |
len | length | Parameters, Measures, RMSE, notes | length of human limbs or robotic links |
KF | Kalman Filter | Parameters, Sensor Fusion Technique | - |
EKF | Extended Kalman Filter | Parameters, Sensor Fusion Technique | - |
UKF | Unscented Kalman Filter | Parameters, Sensor Fusion Technique | - |
PF | Particle Filter | Parameters, Sensor Fusion Technique | - |
PGM | probabilistic graphical models | Sensor Fusion Technique | -
opt/OPT | optimization | Parameters, Sensor Fusion Technique, notes | - |
params | parameters | Parameters | - |
sens | sensor(s) | Parameters, Validation, notes | Typically referred to position and orientation of the sensor with respect to the link it is attached to. Otherwise referred to simulated measurements of a virtual sensor |
U | unscented | Parameters | - |
INS | inertial navigation system | Sensor Fusion Technique | navigation system based on signals from accelerometers and gyroscopes aimed at estimating position, velocity and orientation of a rigid body. It is typically referred to navigation of aerial vehicles |
GN | Gauss-Newton | referred to Gauss-Newton optimization | |
QUEST | Quaternion estimator algorithm | Sensor Fusion Technique | - |
err | error | Sensor Fusion Technique, Measure, RMSE | typically error is quantitatively defined as difference of a variable with respect to a reference |
int | integration | Sensor Fusion Technique | referred to integration of gyroscope’s or accelerometer’s signal |
FQA | Factorized Quaternion Algorithm | Sensor Fusion Technique | - |
lin acc | linear acceleration | Sensor Fusion Technique, notes | acceleration of a point in space |
sto | stochastic | Sensor Fusion Technique | - |
NR | Newton-Raphson | Sensor Fusion Technique | referred to Newton-Raphson algorithm used for optimization |
MAG | - | Sensors, | Motion tracking system based on magnetic field measurement |
VMF dist | Von Mises-Fisher distribution | Sensor Fusion Technique, notes | - |
calib | calibration | Calibration, notes | - |
traj | trajectory | Calibration, Measure, RMSE, notes | trajectory of a point in space |
EMG | electromyography | Sensors | array of surface electromyography sensors |
quat | quaternion | Kinematics & Constraints, Measure, RMSE, | - |
coord sys | coordinate system | Kinematics & Constraints | - |
EE | end effector | Validation | end effector of a robot |
hor | horizontal | Validation | - |
ver | vertical | Validation | - |
OMC | optical motion capture | Validation, RMSE | OMC is referred to tracking of points in space by means of optical motion capture. Tracking of these points is often used to compute orientation of rigid bodies |
sim | simulated | Validation | referred to simulated measurements of virtual sensors |
GNSS | global navigation satellite system | Validation | provider of ground truth data in outdoor motion capture sessions
meas | measurement | Validation, notes | - |
compar | comparison | Measure | - |
incli | inclination | Measure, RMSE | deviation with respect to a given direction |
sta | static | Measure, RMSE | - |
dyn | dynamic | Measure, RMSE | - |
RULA | rapid upper limb assessment | Measure | method of ergonomic assessment based on articular motion and forces exerted during an activity |
freq | frequency | Measure | number of occurrence of a risk class in RULA evaluation |
twi | twist | Measure, RMSE | referred to wrist motion |
valid | validation | notes | referred to validation trials carried out for the validation of the method, typically preceded by their duration |
negl | neglected | notes | referred to a variable that is neglected in a method, typically linear acceleration |
align | aligned | notes | used in the formula sens align link to indicate the assumption that a sensor’s frame is supposed to be aligned to the frame of the body it is attached to |
teleop | teleoperation | notes | - |
ROM | range of motion | notes | - |
est | estimation | notes | - |
alg | algorithm | notes | - |
const | constant | notes | - |
ZARU | zero angular rate update | notes | technique to reduce drift based on detection of steady orientation |
HDR | heuristic heading reduction | notes | technique that exploits straight paths to improve localization estimate |
reloc | relocation | notes | referred to relocation of sensors |
ang | angle or angular | Correlation, notes | - |
insp | inspection | notes | referred to visual inspection |
DH | Denavit-Hartenberg | notes | standard to define kinematic chains |
synch | synchronization | notes | - |
ISB | International Society of Biomechanics | notes | used to refer to the standard proposed by ISB to define frames attached to human limbs to define their pose and motion |
UWB | ultra wide band | Sensors | - |
Conflicts of Interest
The authors declare no conflict of interest. The founding sponsors had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript, and in the decision to publish the results.
References
- 1.Fortino G., Giannantonio R., Gravina R., Kuryloski P., Jafari R. Enabling effective programming and flexible management of efficient body sensor network applications. IEEE Trans. Hum.-Mach. Syst. 2013;43:115–133. doi: 10.1109/TSMCC.2012.2215852. [DOI] [Google Scholar]
- 2.Chen M., Gonzalez S., Vasilakos A., Cao H., Leung V.C. Body area networks: A survey. Mob. Netw. Appl. 2011;16:171–193. doi: 10.1007/s11036-010-0260-8. [DOI] [Google Scholar]
- 3.Lefferts E.J., Markley F.L., Shuster M.D. Kalman filtering for spacecraft attitude estimation. J. Guid. Control Dyn. 1982;5:417–429. doi: 10.2514/3.56190. [DOI] [Google Scholar]
- 4.Shuster M.D., Oh S.D. Three-axis attitude determination from vector observations. J. Guid. Control Dyn. 1981;4:70–77. doi: 10.2514/3.19717. [DOI] [Google Scholar]
- 5.Choi S., Do J., Hwang B., Lee J. Static attitude control for underwater robots using multiple ballast tanks. IEEJ Trans. Electr. Electron. Eng. 2014;9:S49–S55. doi: 10.1002/tee.22032. [DOI] [Google Scholar]
- 6.Rossi A., Pasquali M., Pastore M. Performance analysis of an inertial navigation algorithm with DVL auto-calibration for underwater vehicle; Proceedings of the 2014 DGON Inertial Sensors and Systems Symposium (ISS); Karlsruhe, Germany. 16–17 September 2014; pp. 1–19. [Google Scholar]
- 7.Brown A.K., Lu Y. Performance test results of an integrated GPS/MEMS inertial navigation package; Proceedings of the 17th International Technical Meeting of the Satellite Division of The Institute of Navigation (ION GNSS 2004); Long Beach, CA, USA. 21–24 September 2004; pp. 21–24. [Google Scholar]
- 8.Choi J.H., Oh S.H., Kim H.S., Lee Y.W. Design of Multi-Sensor-Based Open Architecture Integrated Navigation System for Localization of UGV. J. Position Navig. Timing. 2012;1:35–43. doi: 10.11003/JKGS.2012.1.1.035. [DOI] [Google Scholar]
- 9.Iosa M., Picerno P., Paolucci S., Morone G. Wearable inertial sensors for human movement analysis. Expert Rev. Med. Devices. 2016;13:641–659. doi: 10.1080/17434440.2016.1198694. [DOI] [PubMed] [Google Scholar]
- 10.Gallagher A., Matsuoka Y., Ang W.T. An efficient real-time human posture tracking algorithm using low-cost inertial and magnetic sensors; Proceedings of the 2004 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS); Sendai, Japan. 28 September–2 October 2004; pp. 2967–2972. [Google Scholar]
- 11.Gebre-Egziabher D., Elkaim G.H., Powell J., Parkinson B.W. A gyro-free quaternion-based attitude determination system suitable for implementation using low cost sensors; Proceedings of the 2000 IEEE Position Location and Navigation Symposium; San Diego, CA, USA. 13–16 March 2000; pp. 185–192. [Google Scholar]
- 12.Ang W.T., Khosla P.K., Riviere C.N. Kalman filtering for real-time orientation tracking of handheld microsurgical instrument; Proceedings of the 2004 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS); Sendai, Japan. 28 September–2 October 2004; pp. 2574–2580. [Google Scholar]
- 13.Suh Y.S. Orientation estimation using a quaternion-based indirect Kalman filter with adaptive estimation of external acceleration. IEEE Trans. Instrum. Meas. 2010;59:3296–3305. doi: 10.1109/TIM.2010.2047157. [DOI] [Google Scholar]
- 14.Vlasic D., Adelsberger R., Vannucci G., Barnwell J., Gross M., Matusik W., Popović J. Practical motion capture in everyday surroundings. ACM Trans. Graph. 2007;26:35. doi: 10.1145/1276377.1276421. [DOI] [Google Scholar]
- 15.Schall G., Wagner D., Reitmayr G., Taichmann E., Wieser M., Schmalstieg D., Hofmann-Wellenhof B. Global pose estimation using multi-sensor fusion for outdoor augmented reality; Proceedings of the 8th IEEE International Symposium on Mixed and Augmented Reality (ISMAR 2009); Orlando, FL, USA. 19–22 October 2009; pp. 153–162. [Google Scholar]
- 16.Corrales J.A., Candelas F., Torres F. Hybrid tracking of human operators using IMU/UWB data fusion by a Kalman filter; Proceedings of the 2008 3rd ACM/IEEE International Conference on Human-Robot Interaction (HRI); Amsterdam, The Netherlands. 12–15 March 2008; pp. 193–200. [Google Scholar]
- 17.Tao Y., Hu H., Zhou H. Integration of vision and inertial sensors for 3D arm motion tracking in home-based rehabilitation. Int. J. Robot. Res. 2007;26:607–624. doi: 10.1177/0278364907079278. [DOI] [Google Scholar]
- 18.Schepers H.M., Roetenberg D., Veltink P.H. Ambulatory human motion tracking by fusion of inertial and magnetic sensing with adaptive actuation. Med. Biol. Eng. Comput. 2010;48:27–37. doi: 10.1007/s11517-009-0562-9. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 19.Ahmad N., Ghazilla R.A.R., Khairi N.M., Kasi V. Reviews on various inertial measurement unit (IMU) sensor applications. Int. J. Signal Proc. Syst. 2013;1:256–262. doi: 10.12720/ijsps.1.2.256-262. [DOI] [Google Scholar]
- 20.Buke A., Gaoli F., Yongcai W., Lei S., Zhiqi Y. Healthcare algorithms by wearable inertial sensors: A survey. China Commun. 2015;12:1–12. doi: 10.1109/CC.2015.7114054. [DOI] [Google Scholar]
- 21.Patel S., Park H., Bonato P., Chan L., Rodgers M. A review of wearable sensors and systems with application in rehabilitation. J. Neuroeng. Rehabil. 2012;9:21. doi: 10.1186/1743-0003-9-21. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 22.Shull P.B., Jirattigalachote W., Hunt M.A., Cutkosky M.R., Delp S.L. Quantified self and human movement: A review on the clinical impact of wearable sensing and feedback for gait analysis and intervention. Gait Posture. 2014;40:11–19. doi: 10.1016/j.gaitpost.2014.03.189. [DOI] [PubMed] [Google Scholar]
- 23.Gravina R., Alinia P., Ghasemzadeh H., Fortino G. Multi-sensor fusion in body sensor networks: State-of-the-art and research challenges. Inf. Fusion. 2017;35:68–80. doi: 10.1016/j.inffus.2016.09.005. [DOI] [Google Scholar]
- 24.Wong C., Zhang Z.Q., Lo B., Yang G.Z. Wearable sensing for solid biomechanics: A review. IEEE Sens. J. 2015;15:2747–2760. doi: 10.1109/JSEN.2015.2393883. [DOI] [Google Scholar]
- 25.Harle R. A survey of indoor inertial positioning systems for pedestrians. IEEE Commun. Surv. Tutor. 2013;15:1281–1293. doi: 10.1109/SURV.2012.121912.00075. [DOI] [Google Scholar]
- 26.Yang Z., Wu C., Zhou Z., Zhang X., Wang X., Liu Y. Mobility increases localizability: A survey on wireless indoor localization using inertial sensors. ACM Comput. Surv. 2015;47:1–34. doi: 10.1145/2676430. [DOI] [Google Scholar]
- 27.Yang S., Li Q. Inertial sensor-based methods in walking speed estimation: A systematic review. Sensors. 2012;12:6102–6116. doi: 10.3390/s120506102. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 28.Sabatini A.M. Estimating three-dimensional orientation of human body parts by inertial/magnetic sensing. Sensors. 2011;11:1489–1525. doi: 10.3390/s110201489. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 29.Michel T., Fourati H., Geneves P., Layaïda N. A comparative analysis of attitude estimation for pedestrian navigation with smartphones; Proceedings of the 2015 International Conference on Indoor Positioning and Indoor Navigation (IPIN); Banff, AB, Canada. 13–16 October 2015; pp. 1–10. [Google Scholar]
- 30.Bachmann E., Duman I., Usta U., McGhee R., Yun X., Zyda M. Orientation tracking for humans and robots using inertial sensors; Proceedings of the 1999 IEEE International Symposium on Computational Intelligence in Robotics and Automation (CIRA ’99); Monterey, CA, USA. 8–9 November 1999; pp. 187–194. [Google Scholar]
- 31.Marins J.L., Yun X., Bachmann E.R., McGhee R.B., Zyda M.J. An extended Kalman filter for quaternion-based orientation estimation using MARG sensors; Proceedings of the 2001 IEEE/RSJ International Conference on Intelligent Robots and Systems; Maui, HI, USA. 29 October–3 November 2001; pp. 2003–2011. [Google Scholar]
- 32.Zhu R., Zhou Z. A real-time articulated human motion tracking using tri-axis inertial/magnetic sensors package. IEEE Trans. Neural Syst. Rehabil. Eng. 2004;12:295–302. doi: 10.1109/TNSRE.2004.827825. [DOI] [PubMed] [Google Scholar]
- 33.Rogers R.M. Weapon IMU transfer alignment using aircraft position from actual flight tests; Proceedings of the IEEE 1996 Position Location and Navigation Symposium; Atlanta, GA, USA. 22–25 April 1996; pp. 328–335. [Google Scholar]
- 34.Qi H., Moore J.B. Direct Kalman filtering approach for GPS/INS integration. IEEE Trans. Aerosp. Electron. Syst. 2002;38:687–693. [Google Scholar]
- 35.Wang J.J., Wang J., Sinclair D., Watts L. A neural network and Kalman filter hybrid approach for GPS/INS integration; Proceedings of the 12th IAIN World Congress, 2006 International Symposium on GPS/GNSS; Jeju, Korea. 18–20 October 2006; pp. 18–20. [Google Scholar]
- 36.Giansanti D., Maccioni G., Macellari V. The development and test of a device for the reconstruction of 3-D position and orientation by means of a kinematic sensor assembly with rate gyroscopes and accelerometers. IEEE Trans. Biomed. Eng. 2005;52:1271–1277. doi: 10.1109/TBME.2005.847404. [DOI] [PubMed] [Google Scholar]
- 37.Picerno P., Cereatti A., Cappozzo A. Joint kinematics estimate using wearable inertial and magnetic sensing modules. Gait Posture. 2008;28:588–595. doi: 10.1016/j.gaitpost.2008.04.003. [DOI] [PubMed] [Google Scholar]
- 38.Sabatini A.M. Quaternion-based extended Kalman filter for determining orientation by inertial and magnetic sensing. IEEE Trans. Biomed. Eng. 2006;53:1346–1356. doi: 10.1109/TBME.2006.875664. [DOI] [PubMed] [Google Scholar]
- 39.Zhang Z.Q., Wu J.K. A novel hierarchical information fusion method for three-dimensional upper limb motion estimation. IEEE Trans. Instrum. Meas. 2011;60:3709–3719. doi: 10.1109/TIM.2011.2135070. [DOI] [Google Scholar]
- 40.Lin Z., Zecca M., Sessa S., Bartolomeo L., Ishii H., Takanishi A. Development of the wireless ultra-miniaturized inertial measurement unit WB-4: Preliminary performance evaluation; Proceedings of the 2011 Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC); Boston, MA, USA. 30 August–3 September 2011; pp. 6927–6930. [DOI] [PubMed] [Google Scholar]
- 41.Pons-Moll G., Baak A., Gall J., Leal-Taixe L., Muller M., Seidel H., Rosenhahn B. Outdoor human motion capture using inverse kinematics and von mises-fisher sampling; Proceedings of the 2011 IEEE International Conference on Computer Vision (ICCV); Barcelona, Spain. 6–13 November 2011; pp. 1243–1250. [Google Scholar]
- 42.To G., Mahfouz M.R. Quaternionic Attitude Estimation for Robotic and Human Motion Tracking Using Sequential Monte Carlo Methods with von Mises-Fisher and Non Uniform Densities Simulations. IEEE Trans. Biomed. Eng. 2013;60:3046–3059. doi: 10.1109/TBME.2013.2262636. [DOI] [PubMed] [Google Scholar]
- 43.Yun X., Bachmann E.R., McGhee R.B. A simplified quaternion-based algorithm for orientation estimation from earth gravity and magnetic field measurements. IEEE Trans. Instrum. Meas. 2008;57:638–650. [Google Scholar]
- 44.Lee G.X., Low K.S. A Factorized quaternion approach to determine the arm motions using triaxial accelerometers with anatomical and sensor constraints. IEEE Trans. Instrum. Meas. 2012;61:1793–1802. doi: 10.1109/TIM.2011.2181884. [DOI] [Google Scholar]
- 45.Bleser G., Hendeby G., Miezal M. Using egocentric vision to achieve robust inertial body tracking under magnetic disturbances; Proceedings of the 2011 10th IEEE International Symposium on Mixed and Augmented Reality (ISMAR); Basel, Switzerland. 26–29 October 2011; pp. 103–109. [Google Scholar]
- 46.Denavit J. A kinematic notation for lower-pair mechanisms based on matrices. Trans. ASME J. Appl. Mech. 1955;22:215–221. [Google Scholar]
- 47.Peppoloni L., Filippeschi A., Ruffaldi E., Avizzano C.A. A novel 7 degrees of freedom model for upper limb kinematic reconstruction based on wearable sensors; Proceedings of the 2013 IEEE 11th International Symposium on Intelligent Systems and Informatics (SISY); Subotica, Serbia. 26–28 September 2013; pp. 105–110. [Google Scholar]
- 48.Cutti A.G., Giovanardi A., Rocchi L., Davalli A., Sacchetti R. Ambulatory measurement of shoulder and elbow kinematics through inertial and magnetic sensors. Med. Biol. Eng. Comput. 2008;46:169–178. doi: 10.1007/s11517-007-0296-5. [DOI] [PubMed] [Google Scholar]
- 49.Zhang Z.Q., Wong W.C., Wu J.K. Ubiquitous human upper-limb motion estimation using wearable sensors. IEEE Trans. Inf. Technol. Biomed. 2011;15:513–521. doi: 10.1109/TITB.2011.2159122. [DOI] [PubMed] [Google Scholar]
- 50.Mihelj M. Inverse kinematics of human arm based on multisensor data integration. J. Intell. Robot. Syst. 2006;47:139–153. doi: 10.1007/s10846-006-9079-8. [DOI] [Google Scholar]
- 51.Miezal M., Taetz B., Bleser G. On Inertial Body Tracking in the Presence of Model Calibration Errors. Sensors. 2016;16:1132. doi: 10.3390/s16071132. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 52.Simon D. Kalman filtering with state constraints: A survey of linear and nonlinear algorithms. IET Control Theory Appl. 2010;4:1303–1318. doi: 10.1049/iet-cta.2009.0032. [DOI] [Google Scholar]
- 53.El-Gohary M., Holmstrom L., Huisinga J., King E., McNames J., Horak F. Upper limb joint angle tracking with inertial sensors; Proceedings of the 2011 Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC); Boston, MA, USA. 30 August–3 September 2011; pp. 5629–5632. [DOI] [PubMed] [Google Scholar]
- 54.Luinge H.J., Veltink P.H., Baten C.T. Ambulatory measurement of arm orientation. J. Biomech. 2007;40:78–85. doi: 10.1016/j.jbiomech.2005.11.011. [DOI] [PubMed] [Google Scholar]
- 55.Kok M., Hol J.D., Schoen T.B. An optimization-based approach to human body motion capture using inertial sensors; Proceedings of the 19th World Congress of the International Federation of Automatic Control (IFAC); Cape Town, South Africa. 24–29 August 2014. [Google Scholar]
- 56.Ericson A., Arndt A., Stark A., Wretenberg P., Lundberg A. Variation in the position and orientation of the elbow flexion axis. Bone Jt. J. 2003;85:538–544. doi: 10.1302/0301-620X.85B4.13925. [DOI] [PubMed] [Google Scholar]
- 57.Miller N., Jenkins O.C., Kallmann M., Mataric M.J. Motion capture from inertial sensing for untethered humanoid teleoperation; Proceedings of the 2004 4th IEEE/RAS International Conference on Humanoid Robots; Santa Monica, CA, USA. 10–12 November 2004; pp. 547–565. [Google Scholar]
- 58.Young A.D. Use of body model constraints to improve accuracy of inertial motion capture; Proceedings of the 2010 International Conference on Body Sensor Networks (BSN); Singapore. 7–9 June 2010; pp. 180–186. [Google Scholar]
- 59.Madgwick S.O., Harrison A.J., Vaidyanathan R. Estimation of IMU and MARG orientation using a gradient descent algorithm; Proceedings of the 2011 IEEE International Conference on Rehabilitation Robotics; Zurich, Switzerland. 29 June–1 July 2011; pp. 1–7. [DOI] [PubMed] [Google Scholar]
- 60.Tian Y., Wei H., Tan J. An adaptive-gain complementary filter for real-time human motion tracking with marg sensors in free-living environments. IEEE Trans. Neural Syst. Rehabil. Eng. 2013;21:254–264. doi: 10.1109/TNSRE.2012.2205706. [DOI] [PubMed] [Google Scholar]
- 61.Black H.D. A passive system for determining the attitude of a satellite. AIAA J. 1964;2:1350–1351. doi: 10.2514/3.2555. [DOI] [Google Scholar]
- 62.Kalman R.E. A new approach to linear filtering and prediction problems. J. Basic Eng. 1960;82:35–45. doi: 10.1115/1.3662552. [DOI] [Google Scholar]
- 63.Cooper G., Sheret I., McMillian L., Siliverdis K., Sha N., Hodgins D., Kenney L., Howard D. Inertial sensor-based knee flexion/extension angle estimation. J. Biomech. 2009;42:2678–2685. doi: 10.1016/j.jbiomech.2009.08.004. [DOI] [PubMed] [Google Scholar]
- 64.Roetenberg D., Luinge H., Slycke P. Xsens MVN: Full 6 DOF Human Motion Tracking Using Miniature Inertial Sensors. Xsens Motion Technologies BV; Enschede, The Netherlands: 2009. [Google Scholar]
- 65.Yuan Q., Chen I., Caus A., et al. Human velocity tracking and localization using 3 IMU sensors; Proceedings of the 2013 6th IEEE Conference on Robotics, Automation and Mechatronics (RAM); Manila, Philippines. 12–15 November 2013; pp. 25–30. [Google Scholar]
- 66.Miezal M., Bleser G., Schmitz N., Stricker D. A generic approach to inertial tracking of arbitrary kinematic chains; Proceedings of the 8th International Conference on Body Area Networks; Boston, MA, USA. 30 September– 2 October, 2013; pp. 189–192. [Google Scholar]
- 67.Jung Y., Kang D., Kim J. Upper body motion tracking with inertial sensors; Proceedings of the 2010 IEEE International Conference on Robotics and Biomimetics (ROBIO); Tianjin, China. 14–18 December 2010; pp. 1746–1751. [Google Scholar]
- 68.Brigante C.M., Abbate N., Basile A., Faulisi A.C., Sessa S. Towards miniaturization of a MEMS-based wearable motion capture system. IEEE Trans. Ind. Electron. 2011;58:3234–3241. doi: 10.1109/TIE.2011.2148671. [DOI] [Google Scholar]
- 69.Jimenez A.R., Seco F., Prieto J.C., Guevara J. Indoor Pedestrian Navigation using an INS/EKF framework for Yaw Drift Reduction and a Foot-mounted IMU; Proceedings of the 2010 7th Workshop on Positioning Navigation and Communication (WPNC); Dresden, Germany. 11–12 March 2010; pp. 135–143. [Google Scholar]
- 70.Ruffaldi E., Peppoloni L., Filippeschi A., Avizzano C. A novel approach to motion tracking with wearable sensors based on Probabilistic Graphical Models; Proceedings of the IEEE International Conference onRobotics and Automation (ICRA); Hong Kong, China. 31 May–7 June 2014; pp. 1247–1252. [Google Scholar]
- 71.Laviola J.J. A comparison of unscented and extended Kalman filtering for estimating quaternion motion; Proceedings of the 2003 IEEE American Control Conference; Denver, CO, USA. 4–6 June 2003; pp. 2435–2440. [Google Scholar]
- 72.Giannitrapani A., Ceccarelli N., Scortecci F., Garulli A. Comparison of EKF and UKF for spacecraft localization via angle measurements. IEEE Trans. Aerosp. Electron. Syst. 2011;47:75–84. doi: 10.1109/TAES.2011.5705660. [DOI] [Google Scholar]
- 73.Hong-de D., Shao-wu D., Yuan-cai C., Guang-bin W. Performance comparison of EKF/UKF/CKF for the tracking of ballistic target. TELKOMNIKA Indones. J. Electr. Eng. 2012;10:1692–1699. doi: 10.11591/telkomnika.v10i7.1564. [DOI] [Google Scholar]
- 74.Rhudy M., Gu Y., Gross J., Gururajan S., Napolitano M.R. Sensitivity Analysis of Extended and Unscented Kalman Filters for Attitude Estimation. J. Aerosp. Inf. Syst. 2013;10:131–143. doi: 10.2514/1.54899. [DOI] [Google Scholar]
- 75.Mahony R., Hamel T., Pflimlin J.M. Complementary filter design on the special orthogonal group SO (3); Proceedings of the 44th IEEE Conference on Decision and Control, 2005 European Control Conference, (CDC-ECC’05); Seville, Spain. 15 December 2005; pp. 1477–1484. [Google Scholar]
- 76.Mahony R., Hamel T., Pflimlin J.M. Nonlinear complementary filters on the special orthogonal group. IEEE Trans. Autom. Control. 2008;53:1203–1218. doi: 10.1109/TAC.2008.923738.
- 77.El-Gohary M., McNames J. Shoulder and elbow joint angle tracking with inertial sensors. IEEE Trans. Biomed. Eng. 2012;59:2635–2641. doi: 10.1109/TBME.2012.2208750.
- 78.El-Gohary M., McNames J. Human joint angle estimation with inertial sensors and validation with a robot arm. IEEE Trans. Biomed. Eng. 2015;62:1759–1767. doi: 10.1109/TBME.2015.2403368.
- 79.Joukov V., Karg M., Kulic D. Online tracking of the lower body joint angles using IMUs for gait rehabilitation; Proceedings of the 2014 36th Annual International Conference of the IEEE Engineering in Medicine and Biology Society; San Francisco, CA, USA. 1–5 September 2014; pp. 2310–2313.
- 80.Young A. From posture to motion: The challenge for real time wireless inertial motion capture; Proceedings of the Fifth International Conference on Body Area Networks; Corfu Island, Greece. 10–12 September 2010; pp. 131–137.
- 81.Salehi S., Mostofi N., Bleser G. A practical in-field magnetometer calibration method for IMUs; Proceedings of the IROS Workshop on Cognitive Assistive Systems: Closing the Action-Perception Loop; Algarve, Portugal. 7 October 2012; pp. 39–44.
- 82.Harada T., Mori T., Sato T. Development of a tiny orientation estimation device to operate under motion and magnetic disturbance. Int. J. Robot. Res. 2007;26:547–559. doi: 10.1177/0278364907079272.
- 83.Lee J.K., Park E.J. Minimum-order Kalman filter with vector selector for accurate estimation of human body orientation. IEEE Trans. Robot. 2009;25:1196–1201.
- 84.Roetenberg D., Luinge H.J., Baten C.T., Veltink P.H. Compensation of magnetic disturbances improves inertial and magnetic sensing of human body segment orientation. IEEE Trans. Neural Syst. Rehabil. Eng. 2005;13:395–405. doi: 10.1109/TNSRE.2005.847353.
- 85.Combettes C., Renaudin V. Delay Kalman filter to estimate the attitude of a mobile object with indoor magnetic field gradients. Micromachines. 2016;7:79. doi: 10.3390/mi7050079.
- 86.Ligorio G., Sabatini A.M. Dealing with magnetic disturbances in human motion capture: A survey of techniques. Micromachines. 2016;7:43. doi: 10.3390/mi7030043.
- 87.Zhou H., Hu H. Reducing drifts in the inertial measurements of wrist and elbow positions. IEEE Trans. Instrum. Meas. 2010;59:575–585. doi: 10.1109/TIM.2009.2025065.
- 88.Dejnabadi H., Jolles B.M., Casanova E., Fua P., Aminian K. Estimation and visualization of sagittal kinematics of lower limbs orientation using body-fixed sensors. IEEE Trans. Biomed. Eng. 2006;53:1385–1393. doi: 10.1109/TBME.2006.873678.
- 89.Palermo E., Rossi S., Marini F., Patanè F., Cappa P. Experimental evaluation of accuracy and repeatability of a novel body-to-sensor calibration procedure for inertial sensor-based gait analysis. Measurement. 2014;52:145–155. doi: 10.1016/j.measurement.2014.03.004.
- 90.Taunyazov T., Omarali B., Shintemirov A. A novel low-cost 4-DOF wireless human arm motion tracker; Proceedings of the 2016 6th IEEE International Conference on Biomedical Robotics and Biomechatronics (BioRob); Singapore. 26–29 June 2016; pp. 157–162.
- 91.Robert-Lachaine X., Mecheri H., Larue C., Plamondon A. Validation of inertial measurement units with an optoelectronic system for whole-body motion analysis. Med. Biol. Eng. Comput. 2017;55:609–619. doi: 10.1007/s11517-016-1537-2.
- 92.Favre J., Jolles B., Aissaoui R., Aminian K. Ambulatory measurement of 3D knee joint angle. J. Biomech. 2008;41:1029–1035. doi: 10.1016/j.jbiomech.2007.12.003.
- 93.Seel T., Raisch J., Schauer T. IMU-based joint angle measurement for gait analysis. Sensors. 2014;14:6891–6909. doi: 10.3390/s140406891.
- 94.Cutti A.G., Ferrari A., Garofalo P., Raggi M., Cappello A., Ferrari A. ‘Outwalk’: A protocol for clinical gait analysis based on inertial and magnetic sensors. Med. Biol. Eng. Comput. 2010;48:17–25. doi: 10.1007/s11517-009-0545-x.
- 95.Luinge H.J., Veltink P.H. Measuring orientation of human body segments using miniature gyroscopes and accelerometers. Med. Biol. Eng. Comput. 2005;43:273–282. doi: 10.1007/BF02345966.
- 96.Favre J., Luthi F., Jolles B., Siegrist O., Najafi B., Aminian K. A new ambulatory system for comparative evaluation of the three-dimensional knee kinematics, applied to anterior cruciate ligament injuries. Knee Surg. Sports Traumatol. Arthrosc. 2006;14:592–604. doi: 10.1007/s00167-005-0023-4.
- 97.Yun X., Bachmann E.R. Design, implementation, and experimental results of a quaternion-based Kalman filter for human body motion tracking. IEEE Trans. Robot. 2006;22:1216–1227. doi: 10.1109/TRO.2006.886270.
- 98.Liu K., Liu T., Shibata K., Inoue Y. Ambulatory measurement and analysis of the lower limb 3D posture using wearable sensor system; Proceedings of the 2009 International Conference on Mechatronics and Automation; Changchun, China. 9–12 August 2009; pp. 3065–3069.
- 99.Roetenberg D., Slycke P.J., Veltink P.H. Ambulatory position and orientation tracking fusing magnetic and inertial sensing. IEEE Trans. Biomed. Eng. 2007;54:883–890. doi: 10.1109/TBME.2006.889184.
- 100.Picerno P., Cereatti A., Cappozzo A. A spot check for assessing static orientation consistency of inertial and magnetic sensing units. Gait Posture. 2011;33:373–378. doi: 10.1016/j.gaitpost.2010.12.006.
- 101.Hol J.D., Dijkstra F., Luinge H., Schön T.B. Tightly coupled UWB/IMU pose estimation; Proceedings of the IEEE International Conference on Ultra-Wideband (ICUWB 2009); Vancouver, BC, Canada. 9–11 September 2009; pp. 688–692.
- 102.Kok M., Hol J.D., Schön T.B. Indoor positioning using ultrawideband and inertial measurements. IEEE Trans. Veh. Technol. 2015;64:1293–1303. doi: 10.1109/TVT.2015.2396640.
- 103.Favre J., Aissaoui R., Jolles B., de Guise J., Aminian K. Functional calibration procedure for 3D knee joint angle description using inertial sensors. J. Biomech. 2009;42:2330–2335. doi: 10.1016/j.jbiomech.2009.06.025.
- 104.Meng X., Zhang Z.Q., Wu J.K., Wong W.C. Hierarchical information fusion for global displacement estimation in microsensor motion capture. IEEE Trans. Biomed. Eng. 2013;60:2052–2063. doi: 10.1109/TBME.2013.2248085.
- 105.Zihajehzadeh S., Park E.J. A novel biomechanical model-aided IMU/UWB fusion for magnetometer-free lower body motion capture. IEEE Trans. Syst. Man Cybern. Syst. 2016;47:927–938. doi: 10.1109/TSMC.2016.2521823.
- 106.Taetz B., Bleser G., Miezal M. Towards self-calibrating inertial body motion capture; Proceedings of the 19th International Conference on Information Fusion; Heidelberg, Germany. 5–8 July 2016.
- 107.Salehi S., Bleser G., Schmitz N., Stricker D. A low-cost and light-weight motion tracking suit; Proceedings of the 2013 IEEE 10th International Conference on Ubiquitous Intelligence and Computing, and 10th International Conference on Autonomic and Trusted Computing (UIC/ATC); Vietri sul Mare, Italy. 18–21 December 2013; pp. 474–479.
- 108.Vignais N., Miezal M., Bleser G., Mura K., Gorecky D., Marin F. Innovative system for real-time ergonomic feedback in industrial manufacturing. Appl. Ergon. 2013;44:566–574. doi: 10.1016/j.apergo.2012.11.008.
- 109.Bleser G., Steffen D., Reiss A., Weber M., Hendeby G., Fradet L. Personalized physical activity monitoring using wearable sensors. In: Smart Health. Springer; Berlin, Germany: 2015; pp. 99–124.
- 110.Peppoloni L., Filippeschi A., Ruffaldi E., Avizzano C. A novel wearable system for the online assessment of risk for biomechanical load in repetitive efforts. Int. J. Ind. Ergon. 2016;52:1–11. doi: 10.1016/j.ergon.2015.07.002.
- 111.Peppoloni L., Brizzi F., Avizzano C.A., Ruffaldi E. Immersive ROS-integrated framework for robot teleoperation; Proceedings of the 2015 IEEE Symposium on 3D User Interfaces (3DUI); Arles, France. 23–24 March 2015; pp. 177–178.
- 112.Schepers H., Veltink P. Stochastic magnetic measurement model for relative position and orientation estimation. Meas. Sci. Technol. 2010;21:065801. doi: 10.1088/0957-0233/21/6/065801.
- 113.Ruffaldi E., Peppoloni L., Filippeschi A. Sensor fusion for complex articulated body tracking applied in rowing. Proc. Inst. Mech. Eng. Part P J. Sports Eng. Technol. 2015;229. doi: 10.1177/1754337115583199.
- 114.Supej M. 3D measurements of alpine skiing with an inertial sensor motion capture suit and GNSS RTK system. J. Sports Sci. 2010;28:759–769. doi: 10.1080/02640411003716934.
- 115.Klopčar N., Lenarčič J. Bilateral and unilateral shoulder girdle kinematics during humeral elevation. Clin. Biomech. 2006;21:S20–S26. doi: 10.1016/j.clinbiomech.2005.09.009.