
Table 2.

Evaluation criteria used in the papers on deep learning and neural networks

Formula | Method Name | Evaluation Method
$\mathrm{ACC} = \frac{TP + TN}{TP + TN + FP + FN}$ | Accuracy (ACC) | The ability of a measurement to match the actual (true) value of the quantity being measured [42].
$\mathrm{Precision} = \frac{TP}{TP + FP}$ | Precision (P) | Precision is the most critical performance measure in this field for evaluating the model; it focuses on false positive (FP) predictions [42].
$\mathrm{Recall} = \frac{TP}{TP + FN}$ | Recall or Sensitivity | Recall is usually used to evaluate a model with a focus on false negatives (FN); it shows how many of the actual positive samples are correctly identified among all positive examples.
$F_1 = \frac{2 \cdot \mathrm{precision} \cdot \mathrm{recall}}{\mathrm{precision} + \mathrm{recall}}$ | F1 score or F-measure | Harmonic mean of precision and recall; the best value is one and the worst is zero.
ROC curve | AUC and ROC (Receiver Operating Characteristic) | Based on the ROC curve, the area under the curve (AUC) gives an overall performance measure and indicates the model's ability to distinguish between classes; the higher the AUC, the better the model [63].
$\mathrm{MAP} = \frac{1}{N}\sum_{K=1}^{N} AP_K$ | MAP (Mean Average Precision) | A score widely used in ranking problems [44]; $AP_K$ is the average precision for class $K$, and $N$ is the number of classes [1].
$\mathrm{MRR} = \frac{1}{N}\sum_{i=1}^{N} \frac{1}{\mathrm{rank}_i}$ | Mean Reciprocal Rank (MRR) | A statistical criterion for evaluating any process that produces a list of possible answers to a sample of queries, ordered by probability of correctness [76].
LR | Learning Rate | The learning rate is a configurable hyperparameter used in training neural networks; it takes a positive value, mostly between 0.0 and 1.0 [46].
$\mathrm{MSE} = \frac{1}{N}\sum (Y - \hat{Y})^2$ | Mean Squared Error (MSE) | In statistics, the MSE measures the average of the squares of the errors, that is, the average squared difference between the estimated values and the actual values [16, 20].
$\mathrm{RMSE} = \sqrt{\frac{1}{N}\sum (Y - \hat{Y})^2}$ | Root Mean Squared Error (RMSE) | Root Mean Square Error (RMSE) is the standard deviation of the residuals (prediction errors) [1, 16].
$\mathrm{MAE} = \frac{1}{N}\sum |Y - \hat{Y}|$ | Mean Absolute Error (MAE) | Mean Absolute Error (MAE) is a common metric used to measure accuracy for continuous variables [43].
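To make the formulas in the table concrete, the following is a minimal NumPy sketch (not taken from the reviewed papers; the function and variable names are illustrative) that computes the classification metrics (accuracy, precision, recall, F1), the regression errors (MSE, RMSE, MAE), and MRR directly from their definitions.

```python
import numpy as np

def classification_metrics(y_true, y_pred):
    """Accuracy, precision, recall, and F1 from binary labels (0/1)."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    tp = np.sum((y_pred == 1) & (y_true == 1))
    tn = np.sum((y_pred == 0) & (y_true == 0))
    fp = np.sum((y_pred == 1) & (y_true == 0))
    fn = np.sum((y_pred == 0) & (y_true == 1))
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if (precision + recall) else 0.0)
    return {"accuracy": accuracy, "precision": precision,
            "recall": recall, "f1": f1}

def regression_errors(y_true, y_pred):
    """MSE, RMSE, and MAE between actual and predicted values."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    mse = np.mean((y_true - y_pred) ** 2)
    return {"mse": mse,
            "rmse": np.sqrt(mse),
            "mae": np.mean(np.abs(y_true - y_pred))}

def mean_reciprocal_rank(ranks_of_correct_answer):
    """MRR: element i is the 1-based rank of the correct answer for query i."""
    ranks = np.asarray(ranks_of_correct_answer, dtype=float)
    return float(np.mean(1.0 / ranks))

if __name__ == "__main__":
    print(classification_metrics([1, 0, 1, 1, 0], [1, 0, 0, 1, 1]))
    print(regression_errors([3.0, 2.5, 4.0], [2.8, 2.7, 3.5]))
    print(mean_reciprocal_rank([1, 3, 2]))  # (1 + 1/3 + 1/2) / 3
```

In practice, libraries such as scikit-learn expose equivalent functions (e.g., accuracy_score, precision_score, mean_squared_error); the sketch above is only meant to mirror the table's definitions.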