Abstract
The article describes a dataset of gait measures acquired to validate the use of wearable sensors in gait analysis, since their measurements can be compared with those provided by a stereophotogrammetric system. The comparison with a gold standard in gait analysis makes the dataset useful for the development, testing, and validation of algorithms for estimating gait parameters.
The dataset contains measurements simultaneously acquired by the wearable sensors and the stereophotogrammetric system during an acquisition campaign performed on 5 healthy subjects (2 females and 3 males, aged between 25 and 35 years). During the acquisition campaign, the subjects carried out a motion task wearing the wearable sensors and the reflective markers of the stereophotogrammetric system. In particular, on each foot the subjects wore a wearable sensor on the instep and a reflective marker on the heel, the first metatarsal head, the fifth metatarsal head, and above the sensor. During the motion task, each subject walked over an 11-meter long walkway following a predefined path. The 5 subjects involved in the acquisition campaign performed 3 repetitions of the motion task, for a total of 15 trials in which the measures collected by the wearable sensors and the stereophotogrammetric system can be compared.
Keywords: Gait analysis, Stride event detection, Wearable sensors, Stereophotogrammetry
Specifications Table
| Subject | Electrical and Electronic Engineering |
| Specific subject area | Wearable sensor for gait analysis |
| Type of data | CSV files containing data acquired by the wearable sensors; text files containing data acquired by the optoelectronic system |
| How data were acquired | Two wearable sensors (NGIMU, X-io Technologies Limited, United Kingdom), each including a triaxial accelerometer, gyroscope, and magnetometer. One optoelectronic system (Elite, BTS-Bioengineering, Italy) composed of 6 infrared cameras, 8 reflective markers, and an acquisition unit. |
| Data format | Raw data from the wearable sensors; pre-processed data from the optoelectronic system |
| Parameters for data collection | Data were acquired from 5 healthy subjects (3 males and 2 females), aged between 25 and 35 years, without any musculoskeletal or neurological disease that could affect gait. Before the acquisitions, each subject signed a written informed consent to participate in the acquisition campaign. |
| Description of data collection | On each foot, the subjects wore the wearable sensor on the instep and 4 reflective markers on the heel, first metatarsal head, fifth metatarsal head, and above the sensor. While the wearable system and the optoelectronic system acquired simultaneously, each subject carried out a predetermined movement to synchronize the two systems and then started walking over the walkway. The subject started from the midpoint of the walkway, walked to one end, turned around, and retraced the entire walkway in the opposite direction. Each subject repeated this motion task 3 times. |
| Data source location | Movement analysis Laboratory, Department of Information Engineering, Università Politecnica delle Marche, Ancona, Italy |
| Data accessibility | With the article |
| Related research article | Paola Pierleoni, Alberto Belli, Lorenzo Palma, Marco Mercuri, Federica Verdini, Sandro Fioretti, Sebastian Madgwick, and Federica Pinti, “Validation of a gait analysis algorithm for wearable sensors”, IEEE 2019 International Conference on Sensing and Instrumentation in IoT Era (ISSI), DOI 10.1109/ISSI47111.2019.9043647 [1] |
Value of the Data
- Data are useful to validate the use of wearable sensors in gait analysis;
- Researchers and developers can benefit from these data to propose new systems for stride event detection based on wearable sensors;
- Data can be used as a benchmark for evaluating the performance of different algorithms for the detection of stride events from data acquired by wearable sensors;
- The additional value of these data is that they allow a comparison between simultaneous acquisitions of wearable sensors and an optoelectronic system, which is the gold standard for gait analysis.
1. Data
The proposed dataset contains data collected during an acquisition campaign conducted to validate wearable sensor measurements in gait analysis.
The dataset consists of a main folder called GaitAnalysisData with 5 subfolders (IDx, where x = 1, 2, …, 5 indicates the subject ID) containing the acquisitions of each subject involved in the acquisition campaign. Each subfolder is organized into the IMU_Data and Stereophotogrammetry_Data folders.
The IMU_Data folder contains the acquisitions of the wearable sensors, divided into three folders holding the data collected during each repetition of the path carried out by the subject. Each of these folders contains the RX and LX subfolders, in which the acquisitions of the wearable sensors on the right and left instep are respectively stored. Each subfolder contains a sensor.csv file structured in 11 columns:
- Time series (s): acquisition time (fs = 128 Hz);
- Gyroscope X (deg/s): angular velocity on the x axis of the gyroscope;
- Gyroscope Y (deg/s): angular velocity on the y axis of the gyroscope;
- Gyroscope Z (deg/s): angular velocity on the z axis of the gyroscope;
- Accelerometer X (g): acceleration on the x axis of the accelerometer;
- Accelerometer Y (g): acceleration on the y axis of the accelerometer;
- Accelerometer Z (g): acceleration on the z axis of the accelerometer;
- Magnetometer X (uT): magnetic field on the x axis of the magnetometer;
- Magnetometer Y (uT): magnetic field on the y axis of the magnetometer;
- Magnetometer Z (uT): magnetic field on the z axis of the magnetometer;
- Barometer (hPa): barometric pressure.
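As an illustration, a sensor.csv trial can be parsed with the Python standard library alone. The sketch below builds two synthetic rows mimicking the column layout described above (the values and helper names are invented for the example, not part of the dataset):

```python
import csv
import io
import math

# Two synthetic samples in the sensor.csv column layout (fs = 128 Hz).
# Real files are stored under IDx/IMU_Data/<trial>/RX|LX/sensor.csv.
SAMPLE = """Time series (s),Gyroscope X (deg/s),Gyroscope Y (deg/s),Gyroscope Z (deg/s),Accelerometer X (g),Accelerometer Y (g),Accelerometer Z (g),Magnetometer X (uT),Magnetometer Y (uT),Magnetometer Z (uT),Barometer (hPa)
0.000000,1.2,-0.5,0.3,0.01,0.02,1.00,20.1,-5.3,40.2,1013.2
0.007813,1.4,-0.6,0.2,0.02,0.01,0.99,20.0,-5.2,40.1,1013.2
"""

def load_imu(fileobj):
    """Parse a sensor.csv stream into a list of dicts with float values."""
    reader = csv.DictReader(fileobj)
    return [{key: float(value) for key, value in row.items()} for row in reader]

def accel_magnitude(row):
    """Resultant acceleration (g) from the three accelerometer axes."""
    return math.sqrt(sum(row[f"Accelerometer {axis} (g)"] ** 2 for axis in "XYZ"))

rows = load_imu(io.StringIO(SAMPLE))
print(len(rows), round(accel_magnitude(rows[0]), 3))
```

To read an actual trial, replace the `io.StringIO` object with an open file handle for the corresponding sensor.csv.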
The Stereophotogrammetry_Data folder contains the acquisitions of the optoelectronic system, divided into three folders holding the data collected during each repetition of the path carried out by the subject. Each folder contains a .txt file in which the displacement signals of each reflective marker of the optoelectronic system are saved. The .txt files are structured in 25 columns:
- Time series (ms): acquisition time (fs = 100 Hz);
- RICCAdx_X (mm): displacement on the x axis of the right heel;
- RICCAdx_Y (mm): displacement on the y axis of the right heel;
- RICCAdx_Z (mm): displacement on the z axis of the right heel;
- RICVMdx_X (mm): displacement on the x axis of the right fifth metatarsal head;
- RICVMdx_Y (mm): displacement on the y axis of the right fifth metatarsal head;
- RICVMdx_Z (mm): displacement on the z axis of the right fifth metatarsal head;
- RICFMdx_X (mm): displacement on the x axis of the right first metatarsal head;
- RICFMdx_Y (mm): displacement on the y axis of the right first metatarsal head;
- RICFMdx_Z (mm): displacement on the z axis of the right first metatarsal head;
- RICCAsx_X (mm): displacement on the x axis of the left heel;
- RICCAsx_Y (mm): displacement on the y axis of the left heel;
- RICCAsx_Z (mm): displacement on the z axis of the left heel;
- RICVMsx_X (mm): displacement on the x axis of the left fifth metatarsal head;
- RICVMsx_Y (mm): displacement on the y axis of the left fifth metatarsal head;
- RICVMsx_Z (mm): displacement on the z axis of the left fifth metatarsal head;
- RICFMsx_X (mm): displacement on the x axis of the left first metatarsal head;
- RICFMsx_Y (mm): displacement on the y axis of the left first metatarsal head;
- RICFMsx_Z (mm): displacement on the z axis of the left first metatarsal head;
- RICIMUdx_X (mm): displacement on the x axis of the wearable sensor on the right foot;
- RICIMUdx_Y (mm): displacement on the y axis of the wearable sensor on the right foot;
- RICIMUdx_Z (mm): displacement on the z axis of the wearable sensor on the right foot;
- RICIMUsx_X (mm): displacement on the x axis of the wearable sensor on the left foot;
- RICIMUsx_Y (mm): displacement on the y axis of the wearable sensor on the left foot;
- RICIMUsx_Z (mm): displacement on the z axis of the wearable sensor on the left foot.
2. Experimental design, materials, and methods
The proposed dataset contains measurements of the wearable sensors and the stereophotogrammetric system collected during an acquisition campaign undertaken to validate a gait analysis algorithm based on wearable sensor acquisitions. The data were acquired following the protocol defined in our previous study [1], which presents a gait analysis algorithm for the detection of stride events. The orientation of the foot is calculated using the Attitude and Heading Reference System (AHRS) proposed by Madgwick [2], which has an accuracy comparable to the Kalman filter but was specifically developed for real-time solutions with limited computing resources [3]. Following the study of Ahmadi et al. [4], a threshold is applied to the acceleration signal to detect the stance and swing phases, which allows a complete evaluation of gait parameters [5].
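The threshold rule for separating stance and swing can be sketched in a few lines. This is a simplified illustration only: the threshold value and the synthetic signal are invented for the example, and the actual algorithm in [1] also relies on the foot orientation estimated by the AHRS filter:

```python
def detect_phases(accel_mag, threshold=0.05):
    """Label each sample as stance or swing.

    Illustrative rule: during stance the foot is quasi-static, so the
    resultant acceleration stays close to 1 g; samples deviating from
    1 g by more than the threshold are labelled as swing.
    """
    return ["stance" if abs(a - 1.0) < threshold else "swing" for a in accel_mag]

# Synthetic resultant acceleration (g): flat at ~1 g, then a swing burst.
signal = [1.00, 1.01, 0.99, 1.30, 1.60, 0.70, 1.02, 1.00]
print(detect_phases(signal))
```

Stride events (heel strike, toe off) then correspond to the transitions between consecutive stance and swing labels.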
The protocol also defines the guidelines and the exact acquisition procedure to provide simultaneous measurements by the wearable sensors and the stereophotogrammetric system. The subjects involved in the acquisition campaign wore the wearable sensors and the reflective markers of the stereophotogrammetric system. On each foot, the wearable sensor was positioned on the instep and the 4 reflective markers were positioned at appropriate anatomical landmarks. During the motion task of the acquisition campaign, each subject completed a specific path over an 11-meter long walkway. In the motion task, the subject was instructed to:
- Stand at the midpoint of the walkway and lift the heels three times in order to synchronize the wearable sensors and the optoelectronic system;
- Walk from the midpoint of the walkway to its end;
- Turn around and retrace the entire walkway in the opposite direction, arriving at its beginning.
After a pause of about 20 s, the motion task was repeated, for a total of 3 trials per subject.
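The heel-lift synchronization event can be located, for instance, as the first sample exceeding a threshold in each stream; the resulting index offset gives a common time origin for the two recordings (in practice, after resampling to reconcile the 128 Hz and 100 Hz rates). Function names, signals, and thresholds below are illustrative, not part of the published protocol:

```python
def sync_event_index(signal, threshold):
    """Index of the first sample exceeding the threshold, taken as the
    first of the three heel lifts. Illustrative: any peak-detection
    method could serve the same purpose."""
    for i, value in enumerate(signal):
        if value > threshold:
            return i
    return None

# Synthetic vertical signals from the two systems, offset by 2 samples:
# resultant acceleration (g) from the IMU and heel-marker height (mm).
imu_accel = [0.0, 0.1, 0.0, 2.5, 0.2, 2.4, 0.1, 2.6, 0.0]
marker_z  = [0.0, 0.0, 0.0, 0.0, 0.0, 30.0, 1.0, 29.0, 0.5]

offset = sync_event_index(marker_z, 10.0) - sync_event_index(imu_accel, 1.0)
print(offset)  # 2
```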
Declaration of Competing Interest
The authors declare no competing interests.
Acknowledgment
The authors would like to thank the Movement Analysis Laboratory of the Department of Information Engineering for its contribution to the data acquisition and research activities.
Footnotes
Supplementary material associated with this article can be found, in the online version, at doi:10.1016/j.dib.2020.105918.
References
1. Pierleoni P., Belli A., Palma L., Mercuri M., Verdini F., Fioretti S., Madgwick S.O., Pinti F. Validation of a gait analysis algorithm for wearable sensors. International Conference on Sensing and Instrumentation in IoT Era (ISSI), IEEE; 2019.
2. Madgwick S.O., Harrison A.J., Vaidyanathan R. Estimation of IMU and MARG orientation using a gradient descent algorithm. IEEE International Conference on Rehabilitation Robotics, IEEE; 2011. pp. 1–7.
3. Cavallo A., Cirillo A., Cirillo P., De Maria G., Falco P., Natale C., Pirozzi S. Experimental comparison of sensor fusion algorithms for attitude estimation. IFAC Proceedings Volumes. 2014;47(3):7585–7591.
4. Ahmadi A., Destelle F., Unzueta L., Monaghan D.S., Linaza M.T., Moran K., O'Connor. 3D human gait reconstruction and monitoring using body-worn inertial sensors and kinematic modeling. IEEE Sens. J. 2016;16(24):8823–8831.
5. Li G., Liu T., Yi J. Wearable sensor system for detecting gait parameters of abnormal gaits: a feasibility study. IEEE Sens. J. 2018;18(10):4234–4241.