Sensors. 2023 Aug 29;23(17):7505. doi: 10.3390/s23177505

Table 7.

Literature results (continued, Part 2 of 5): summary table of papers, reporting each paper's objective, analysis methods, and obtained results.

| Author(s) | Objective | Analysis Methods | Results |
| --- | --- | --- | --- |
| Dua et al., 2019a [49] | Detect and assess driver attention using the front camera of a windscreen-mounted smartphone | Neural networks: CNNs and GRUs | The driver’s attention rating had an overall agreement of 0.87 with the ratings of 5 human annotators |
| Dua et al., 2019b [49] | Identify driver distraction based on facial characteristics (head position, eye gaze, eye closure, and yawning) | CNN for generic features combined with a GRU (CNN + GRU) | The automatically generated rating had an overall agreement of 88% with the ratings of 5 human annotators; the attention-based model outperformed the AUTORATE model by 10% in accuracy on the extended dataset |
| Eraqi et al., 2019 [50] | Detect 10 different types of driver distraction (including talking to passengers, phone calls, and texting) | Deep learning; ensemble of convolutional neural networks | New public dataset; distraction detected with 90% accuracy |
| Gelmini et al., 2020 [46] | Assess driving-style risk based on speeding, longitudinal acceleration, lateral acceleration, and smartphone use while driving | Thresholds used for profiling drivers and detecting smartphone usage | Reported median phone usage; no accuracy indicators used |
| He et al., 2014 [80] | Determine the seat-level location of smartphones in a vehicle to identify who is sitting where | Signal processing: reference-frame transformation, event detection, left/right identification, front/back identification | Position accuracy between 70% and 90% (best case) |
| Hong et al., 2014 [81] | Detect a person’s driving behaviour via an Android-based in-vehicle sensor platform | Machine learning (Naïve Bayes classifier) | Average model accuracy was 90.5% with all three sensors and 66.7% with the smartphone alone |
| Janveja et al., 2020 [52] | Introduce a smartphone-based system to detect driver fatigue and distraction (mirror-scanning behaviour) in low-light conditions | For distraction detection, statistics on whether the driver scans their mirrors at least once every 10 s throughout the drive (see the first sketch after this table) | NIR LED setup: 93.8% accuracy in detecting driver distraction |
| Jiao et al., 2021 [82] | Recognise actions of distracted drivers | Hybrid deep learning model: OpenPose, K-means, LSTM | Accuracy depends on the processing step (up to 92%) |
| Johnson et al., 2011 [83] | Detect and classify driving events, such as left/right manoeuvres, turns, lane changes, device removal, and excessive speed and braking | Manoeuvre classification with the DTW algorithm (see the second sketch after this table) | U-turns correctly identified 23% of the time (accelerometer only), 46% (gyroscope only), and 77% (combined sensors); 97% of aggressive events correctly identified |
| Kapoor et al., 2020 [48] | Provide a real-time driver distraction detection system that detects distracting tasks in driver images | Convolutional neural networks (CNNs) | Accuracy for 4 classes (e.g., calling or texting on a cell phone) reaches 98–100% when fine-tuned on datasets such as the State Farm Distracted Driver Dataset |
| Kashevnik et al., 2021 [5] | Provide an audio-visual speech recognition corpus for use in speech recognition for driver monitoring systems | Corpus creation; development of a smartphone app | Corpus (audio-visual speech database with a list of phrases in Russian; 20 participants) |
| Khurana and Goel, 2020 [84] | Detect smartphone use by drivers using in-device cameras | Random forest classifiers (machine learning models) for 2 scenarios: (a) docked, (b) in-hand | Approximately 90% accuracy in distinguishing between driver and passenger; cannot collect data for phones in the handheld position |
| Koukoumidis et al., 2011 [85] | Detect traffic lights using the smartphone camera and predict their timing | Machine learning (Support Vector Regression) | Traffic-signal detection accuracy of 87.6% and 92.2%; schedule-prediction error of 0.66 s for pre-timed signals and 2.45 s for traffic-adaptive signals |
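
To make the mirror-scanning rule from Janveja et al. [52] concrete, the following is a minimal Python sketch of that check, assuming timestamped mirror-scan detections are already available. The 10 s threshold comes from the table row; the function name, the interval representation, and the example timestamps are illustrative assumptions, not the authors' implementation.

```python
MAX_GAP_S = 10.0  # maximum allowed interval between mirror scans (per [52])

def distraction_intervals(scan_times, drive_start, drive_end, max_gap=MAX_GAP_S):
    """Return (start, end) intervals where no mirror scan occurred for > max_gap.

    scan_times: timestamps (seconds) at which a mirror scan was detected.
    drive_start, drive_end: timestamps bounding the drive.
    """
    events = [drive_start] + sorted(scan_times) + [drive_end]
    flagged = []
    for t0, t1 in zip(events, events[1:]):
        if t1 - t0 > max_gap:  # gap longer than the allowed scan interval
            flagged.append((t0, t1))
    return flagged

# Example: scans detected at 3 s, 9 s, and 31 s of a 40 s drive.
print(distraction_intervals([3.0, 9.0, 31.0], drive_start=0.0, drive_end=40.0))
# -> [(9.0, 31.0)]  # a 22 s stretch without mirror scanning is flagged
```

In the paper's terms, a drive with no flagged interval would count as continuous mirror scanning.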
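
Similarly, the manoeuvre classification of Johnson et al. [83] relies on Dynamic Time Warping (DTW). The sketch below implements the standard dynamic-programming DTW distance and a nearest-template classifier; the template shapes, sequence values, and labels are made-up placeholders rather than data from the paper.

```python
def dtw_distance(a, b):
    """Classic dynamic-programming DTW distance between two 1-D sequences."""
    n, m = len(a), len(b)
    INF = float("inf")
    # D[i][j] = minimal cumulative cost of aligning a[:i] with b[:j]
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i][j] = cost + min(D[i - 1][j],      # insertion
                                 D[i][j - 1],      # deletion
                                 D[i - 1][j - 1])  # match
    return D[n][m]

# Classify an observed sensor trace by its nearest template manoeuvre
# (placeholder gyroscope-like sequences, not data from [83]).
templates = {"u_turn": [0, 1, 2, 2, 1, 0], "lane_change": [0, 1, 0, -1, 0]}
observed = [0, 1, 2, 1, 0]
label = min(templates, key=lambda k: dtw_distance(observed, templates[k]))
print(label)  # -> "u_turn"
```

DTW's warping step is what lets manoeuvres of different durations (e.g., a slow versus fast U-turn) match the same template, which is why [83] combine it with multiple sensors to raise recognition rates.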