Sensors. 2023 Aug 29;23(17):7505. doi: 10.3390/s23177505

Table 9. Literature results (continued, Part 4 of 5): summary table of papers reporting objective, analysis methods, and obtained results.

| Author(s) | Objective | Analysis Methods | Results |
| --- | --- | --- | --- |
| Park et al., 2018 [97] | Detect the location and direction of the driver's phone, as well as in-car activities such as walking towards the vehicle, standing near the vehicle while opening a door, and starting the engine | Analysis of electromagnetic field (EMF) fluctuations | The driver's phone was identified with an 83–93% true positive rate and a 90–91% true negative rate |
| Paruchuri and Kumar, 2015 [63] | Detect smartphone location and distinguish drivers from passengers | Image comparison (angle difference) with reference images to localise the smartphone (driver's seat vs. passenger seats), based on the distance between images | 15 out of 38 images were registered incorrectly |
| Punay et al., 2018 [98] | Focus on a safer driving experience by providing an Android application for non-distracted driving | Threshold-based, i.e., the system detects whether the speed exceeds a certain threshold | N/A; prototype only |
| Qi et al., 2019a [99] | Detect in-car human activity, such as chatting, and contextual information (clear vs. crowded) based on vehicle dynamics (braking and turning) | Convolutional neural network (CNN) applied to audio | Average accuracy of 90% across 7 different activities |
| Qi et al., 2019b [100] | Classify driving events, such as turning, braking, and lane changes, using sensor data, while cameras and microphones identify objects in the front view and blind spots and estimate head position | Deep learning inference (Nvidia TensorRT) | Average event detection accuracy of 90% |
| Rachmadi et al., 2021 [101] | Present a driver abnormal behaviour classification system | Enhanced multi-layer perceptron (MLP) | 97.5% accuracy and 45 ms processing time |
| Shabeer and Wahidabanu, 2012 [62] | Detect driver phone calls | Threshold cutoff on the received RF signal | N/A |
| Singh et al., 2014 [102] | Blind spot vehicle detection | Two approaches: intensity variation and contour matching | Detects and alerts the driver with 87% accuracy |
| Song et al., 2016 [64] | Detect driver phone calls | Threshold-based similarity against a voice feature model | TPR is over 98% for 3 evaluated passenger positions, over 90% under noise, 80% when three people are talking, and 67% when four people are talking |
| Torres et al., 2019 [103] | Use data from smartphone sensors to distinguish between driver and passenger when reading a message in a vehicle | Machine learning (various models): three eager learners (SVM, DT, LR), three ensemble learners (RF, ADM, GBM), and one deep learning model (CNN) | Evaluated on accuracy, precision, recall, F1, and Kappa; the CNN and GBM models performed best |
| Tortora et al., 2023 [104] | Develop an Android application to detect distracted driving behaviour | Distraction score based on different distraction activities and detection methods | Application presentation (no KPIs) |
| Tselentis et al., 2021 [105] | Driving behaviour analysis using smartphone sensors to provide driver safety scores and driver clustering | K-means driver clustering based on events computed per trip, such as phone use, speeding, and harsh braking (illustrated in the clustering sketch after this table) | Descriptive statistics; definition of driver characteristics for each cluster (moderate, unstable, cautious drivers) |
| Wang et al., 2016 [106] | Present an approach based on smartphone sensing of vehicle dynamics to determine driver phone use | Signal processing: centripetal acceleration computed from smartphone sensors is compared to that measured by a simple plug-in reference module (illustrated in the centripetal-acceleration sketch after this table) | The approach achieves close to 90% accuracy with less than 3% FPR |
| Vasey et al., 2018 [107] | Driver emotional arousal detection | Machine learning classifiers (decision tree, SVM, NN) | N/A; concept only |
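
As a rough illustration of the kind of driver clustering reported for Tselentis et al. [105], the sketch below groups drivers by per-trip event features using scikit-learn's K-means. The feature set, the normalisation (events per 100 km), and the choice of three clusters are assumptions made for illustration only and do not reproduce the authors' exact pipeline.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical per-driver feature matrix: one row per driver, with event rates
# aggregated over that driver's trips (assumed normalisation: events per 100 km).
# Columns: phone-use events, speeding events, harsh-braking events.
features = np.array([
    [0.5, 1.0, 0.8],
    [2.0, 4.5, 3.0],
    [6.0, 9.0, 7.5],
    [0.7, 1.2, 0.6],
    [2.2, 4.0, 3.3],
    [5.5, 8.7, 8.0],
])

# Standardise the features so no single event type dominates the distance metric.
X = StandardScaler().fit_transform(features)

# Cluster the drivers into three groups (e.g. cautious, moderate, unstable).
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
print(kmeans.labels_)           # cluster assignment per driver
print(kmeans.cluster_centers_)  # centroids in standardised feature space
```

The cluster labels themselves carry no semantics; as in [105], descriptive statistics of each cluster's centroid would be needed to name the groups (moderate, unstable, cautious).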
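
To illustrate the vehicle-dynamics comparison described for Wang et al. [106], the following is a minimal sketch of computing centripetal acceleration as a_c = v · ω (speed times yaw rate) from phone-side measurements and comparing it to a reference module. The function names, the synthetic data, and the simple mean-difference decision rule are assumptions for illustration; the actual features and decision logic in [106] differ.

```python
import numpy as np

def centripetal_acceleration(speed_mps: np.ndarray, yaw_rate_rps: np.ndarray) -> np.ndarray:
    """Centripetal acceleration during a turn: a_c = v * omega.

    speed_mps    -- vehicle speed in m/s (e.g. from GPS or OBD)
    yaw_rate_rps -- yaw rate in rad/s (e.g. from a gyroscope)
    """
    return speed_mps * yaw_rate_rps

def phone_offset_detected(phone_ac: np.ndarray, reference_ac: np.ndarray,
                          threshold: float = 0.15) -> bool:
    """Hypothetical decision rule: a consistent difference between the centripetal
    acceleration seen by the phone and by a reference module at a known position
    suggests the phone sits at a different lateral position in the vehicle."""
    mean_diff = float(np.mean(phone_ac - reference_ac))
    return abs(mean_diff) > threshold  # threshold value is illustrative only

# Synthetic turn data: 10 samples at ~43 km/h with increasing yaw rate.
speed = np.full(10, 12.0)
yaw_rate = np.linspace(0.10, 0.30, 10)

# A small speed offset mimics the phone being mounted away from the reference position.
phone_ac = centripetal_acceleration(speed + 0.5, yaw_rate)
ref_ac = centripetal_acceleration(speed, yaw_rate)
print(phone_offset_detected(phone_ac, ref_ac))
```

In practice, [106] reports aggregating such comparisons over multiple turns to reach close to 90% accuracy with a low false positive rate; the single-threshold check above only shows the underlying a_c = v · ω computation.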