Sensors. 2024 Oct 30;24(21):6986. doi: 10.3390/s24216986

Table 5.

Comparison of relevant research methods for human–robot interaction.

| Reference Papers | Sensory Approach Method | Fusion | Algorithm | Hardware | Number of Postures/Positions | Pros | Accuracy | Cons |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Q. Hu et al., 2021 [20] | 1024 pressure sensors | Yes | HOG, SVM, and CNN | Arduino Nano and CPU | 6 | <400 ms sampling and processing | 86.94% to 91.24% | Contact approach |
| Matar et al., 2020 [37] | 1728 FSR sensors | Yes | HOG + LBP, FFANN | CPU | 4 | Health monitoring | 97% | Requires a large number of sensors |
| R. Tapwal et al., 2023 [38] | Two flex force sensors | Yes | K-means | Arduino Uno and CPU | 4 | Health monitoring | ~99.3% | Consumes 17.5 W; contact approach |
| D. Hu et al., 2024 [39] | 32 piezoelectric sensors | Yes | S3CNN | N/A | 4 | Effectively detects nuanced pressure disturbances | 93.0% | Not applicable |
| Y. Tanaka et al., 2020 [40] | Camera | No | Amygdala model | FPGA | — | Interaction with subject based on face recognition | >90% | Not applicable |
| T. Kim et al., 2024 [41] | Multiple sensors and camera | Yes | DFS, 3D routes, vision algorithms | Intel NUC and NVIDIA Jetson Xavier | — | Multi-floor service and mapping | N/A | Costlier to implement |
| Proposed | 10 ultrasonic sensors | Yes | Human localization, triangulation, and navigation algorithm | FPGA | — | Parallel computing; <200 ns sampling and computation; adaptive localization-based robot services | 98.4% | PR flow will be preferred in future usage |
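The proposed system in the last row localizes a person by triangulating ultrasonic range measurements. As a rough illustration of that idea only (not the authors' FPGA implementation), the sketch below solves a linearized least-squares trilateration problem; the sensor coordinates, the target position, and the `trilaterate` helper are hypothetical and chosen purely for the example.

```python
import numpy as np

def trilaterate(anchors, distances):
    """Least-squares 2-D trilateration.

    anchors   : (N, 2) known sensor positions, N >= 3
    distances : (N,)   measured ranges from each sensor to the target
    Returns the estimated (x, y) position of the target.
    """
    anchors = np.asarray(anchors, dtype=float)
    d = np.asarray(distances, dtype=float)
    # Subtract the first range equation from the others to remove the
    # quadratic terms, leaving a linear system A @ [x, y] = b:
    #   2(xi - x0)x + 2(yi - y0)y = d0^2 - di^2 + (xi^2 + yi^2) - (x0^2 + y0^2)
    A = 2.0 * (anchors[1:] - anchors[0])
    b = (d[0] ** 2 - d[1:] ** 2
         + np.sum(anchors[1:] ** 2, axis=1)
         - np.sum(anchors[0] ** 2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

# Hypothetical layout: three wall-mounted ultrasonic sensors, person at (1.2, 0.8)
sensors = [(0.0, 0.0), (3.0, 0.0), (0.0, 3.0)]
person = np.array([1.2, 0.8])
ranges = [np.linalg.norm(person - np.array(s)) for s in sensors]
print(trilaterate(sensors, ranges))  # approximately [1.2, 0.8]
```

With three or more sensors the linearized system is (over)determined, so a least-squares solve is used; noisy range readings then shift the estimate gracefully instead of making the problem unsolvable.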