Algorithm 2 Our Sensor Fusion Emotion Recognition (SFER) Algorithm.
Require: Variable identifying the driver (Driver ID): ID
   List of defined emotional state categories for the driver: C
   The valence and arousal levels of the driver recognized from facial expressions at time t: $(\hat{V}^t, \hat{A}^t)$
   The measured EDA response of the driver at time t: $E^t$
Ensure: The recognized real emotion of the driver at time t: $\hat{Y}^t$
   Initialize:
      Let the sensor fusion emotion recognition model be SFER_Model
      Let SFER_Model be a DNN
      Let the final hidden layer of SFER_Model be a fully connected layer with n units and the softmax function as its activation function
      $\hat{V}^{ID} = [\,],\ \hat{A}^{ID} = [\,],\ E^{ID} = [\,]$
   if accumulated data for the driver with ID exists then
      Load $\hat{V}^{ID}, \hat{A}^{ID}, E^{ID}$
   end if
   while driving continues do
      # Accumulate data per individual for personalization
      $\hat{V}^{ID}.\mathrm{insert}(\hat{V}^t)$,  $\hat{A}^{ID}.\mathrm{insert}(\hat{A}^t)$,  $E^{ID}.\mathrm{insert}(E^t)$
      # Mean subtraction
      $(\hat{V}^t, \hat{A}^t, E^t) = \big(\hat{V}^t - \mathrm{mean}(\hat{V}^{ID}),\ \hat{A}^t - \mathrm{mean}(\hat{A}^{ID}),\ E^t - \mathrm{mean}(E^{ID})\big)$
      # Normalization
      ${X^t}^{T} = \left(\dfrac{\hat{V}^t - \min(\hat{V}^{ID})}{\max(\hat{V}^{ID}) - \min(\hat{V}^{ID})},\ \dfrac{\hat{A}^t - \min(\hat{A}^{ID})}{\max(\hat{A}^{ID}) - \min(\hat{A}^{ID})},\ \dfrac{E^t - \min(E^{ID})}{\max(E^{ID}) - \min(E^{ID})}\right)$
      $\hat{Y}^t_{\mathbb{1}} = \mathrm{SFER\_Model}(X^t)$
      $\hat{Y}^t = C[\operatorname{argmax}(\hat{Y}^t_{\mathbb{1}})]$
   end while
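
The following Python sketch is a minimal, hypothetical rendering of one pass through the loop in Algorithm 2, not the authors' implementation. The emotion categories in C, the network weights, and the helper names (sfer_model, center_scale, recognize) are placeholders chosen for illustration; the stand-in network only mirrors the stated structure, i.e., a fully connected output layer of n units with a softmax activation, and the preprocessing follows the algorithm literally (mean subtraction, then min-max scaling over the driver's accumulated history).

```python
import numpy as np

C = ["neutral", "happy", "angry", "sad"]          # assumed category list (placeholder)
n = len(C)

# Stand-in for the trained SFER_Model: one hidden layer plus a fully
# connected output layer of n units with a softmax activation.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(3, 16)), np.zeros(16)   # untrained placeholder weights
W2, b2 = rng.normal(size=(16, n)), np.zeros(n)

def sfer_model(x):
    h = np.tanh(x @ W1 + b1)                      # hidden layer
    logits = h @ W2 + b2                          # fully connected layer with n units
    exp = np.exp(logits - logits.max())
    return exp / exp.sum()                        # softmax over the n categories

# Accumulated per-driver histories (V^ID, A^ID, E^ID in the algorithm).
V_ID, A_ID, E_ID = [], [], []

def center_scale(x, hist):
    # Mean subtraction followed by min-max normalization over the history,
    # applied as written in Algorithm 2.
    x = x - np.mean(hist)
    lo, hi = np.min(hist), np.max(hist)
    return (x - lo) / (hi - lo) if hi > lo else 0.0

def recognize(v_t, a_t, e_t):
    """One pass of the while-loop body for a single time step t."""
    # Accumulate data per individual for personalization
    V_ID.append(v_t); A_ID.append(a_t); E_ID.append(e_t)

    # Build the normalized input X^t from valence, arousal, and EDA
    x_t = np.array([center_scale(v_t, V_ID),
                    center_scale(a_t, A_ID),
                    center_scale(e_t, E_ID)])

    probs = sfer_model(x_t)                       # probability vector over C
    return C[int(np.argmax(probs))]               # Y^t = C[argmax(...)]

# Example call for one time step (values are arbitrary):
print(recognize(v_t=0.3, a_t=-0.1, e_t=4.2))
```

With a single accumulated sample the min-max denominator collapses, so the sketch falls back to 0.0 for that feature; the algorithm as printed does not specify how this cold-start case is handled.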