Table 6. Overview of publicly available human motion datasets and their characteristics.
| Dataset | Provided by | No. of Participants | Parameters | Approach | MoCap System Details | Suggested Application |
|---|---|---|---|---|---|---|
| UI–PRMD [125] | University of Idaho | 10 | Locations and angular orientations of the body joints | Vision-based | Vicon optical trackers; Kinect cameras | Monitoring rehabilitation exercises |
| KIMORE [126] | Marche Polytechnic University | 78 | Joint locations | Vision-based | Kinect cameras | Detection of motor dysfunction |
| M. Capecci et al. [127] | Marche Polytechnic University | 7 | Joint locations | Vision-based | Kinect v1 | Evaluation of karate moves |
| Daily and sports activities data set [128] | Bilkent University | 8 | Inertial data | Sensor-based | Inertial sensors (25 Hz sampling frequency) | Activity recognition |
| Human activity recognition using smartphones data set [129] | University of Genoa | 30 | Inertial data | Sensor-based | Smartphone (Samsung Galaxy S II) | Activity recognition |
| MoVi dataset [130] | York University | 90 | Camera images, joint locations, inertial data | Vision-based and sensor-based | 15 motion-capture cameras (Qualisys Oqus 300 and 310); 2 stationary RGB cameras (Grasshopper2); 2 hand-held cameras (iPhone 7); 17 IMU sensors (Noitom Neuron Edition V2) | Motion recognition |
| Gait in aging and disease database [131] | PhysioBank | 15 | Stride interval | Sensor-based | Force-sensitive resistors | Normal gait and Parkinson's disease analysis |
| MIT database [132] | MIT | 24 | View, time | Vision-based | Sony Handycam | Gait recognition |
| Georgia Tech database [133] | Georgia Tech | 20 | View, time, distance | Vision-based | – | Gait recognition |
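Several of the sensor-based corpora in Table 6 can be read with only a few lines of code. The following minimal sketch illustrates this for the human activity recognition using smartphones data set [129]; it is not part of any of the cited works and assumes the archive has been downloaded from the UCI Machine Learning Repository and extracted into a local `UCI HAR Dataset/` folder, with the file names following the layout documented alongside the dataset.

```python
# Minimal sketch (assumed local layout): load the pre-computed feature matrix
# and activity labels of the UCI HAR smartphone dataset [129].
from pathlib import Path

import pandas as pd

root = Path("UCI HAR Dataset")  # assumed extraction path

# 561 accelerometer/gyroscope features per 2.56 s window, whitespace-delimited
X_train = pd.read_csv(root / "train" / "X_train.txt", sep=r"\s+", header=None)
y_train = pd.read_csv(root / "train" / "y_train.txt", header=None).squeeze("columns")

# Map the numeric activity codes to their names (e.g., WALKING, SITTING)
labels = pd.read_csv(root / "activity_labels.txt", sep=r"\s+", header=None,
                     index_col=0).squeeze("columns")
y_train_named = y_train.map(labels)

print(X_train.shape)                 # training partition: samples x 561 features
print(y_train_named.value_counts())  # class distribution over the six activities
```

The same pattern applies to the test partition (`test/X_test.txt`, `test/y_test.txt`); the raw inertial signals, if needed, are stored separately under the `Inertial Signals/` subfolders.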