Table 3.
SN | Performance Metric | Description |
---|---|---|
1 | Accuracy | It is defined as the ratio of correct predictions to all predictions made: (TP + TN)/(TP + TN + FP + FN). |
2 | Sensitivity or Recall | It is defined as the proportion of actual positives that are correctly predicted: TP/(TP + FN). |
3 | Specificity | It is defined as the proportion of actual negatives that are correctly predicted: TN/(TN + FP). |
4 | Precision | It is defined as the number of correct positive results divided by the number of positive results predicted by the classifier: TP/(TP + FP). |
5 | F1-Score | It is defined as the harmonic mean of precision and recall: 2 × (Precision × Recall)/(Precision + Recall). |
6 | Area under ROC curve (AUC) | It is a probabilistic measure that quantifies how well the model distinguishes between classes. |
7 | Kaplan-Meier Curve | It is the visual representation of the function that shows the probability of an event at a respective time interval. |
8 | Mean Absolute Error (MAE) | It is defined as the average of the absolute differences between the ground-truth values y and the values ŷ predicted by the regression model: (1/N) Σ|y − ŷ|. |
9 | Mean Square Error (MSE) | It is defined as the average of the squared differences between the target value y and the predicted value ŷ of the regression model: (1/N) Σ(y − ŷ)². |
10 | R2 (R-Squared) | It is defined as a statistical measure of fit that indicates how much of the total variation of the dependent variable is explained by the independent variables in the regression model. |
Where TP—true positive; TN—true negative; FP—false positive; FN—false negative; y and ŷ are the target and predicted values, respectively; N represents the total number of samples.
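As a minimal sketch of how the tabulated metrics follow from the TP/TN/FP/FN counts and the y/ŷ residuals, the classification and regression measures can be computed directly with NumPy (function names here are illustrative, not from the source):

```python
import numpy as np

def confusion_counts(y_true, y_pred):
    """Return TP, TN, FP, FN for binary labels in {0, 1}."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    tp = int(np.sum((y_true == 1) & (y_pred == 1)))
    tn = int(np.sum((y_true == 0) & (y_pred == 0)))
    fp = int(np.sum((y_true == 0) & (y_pred == 1)))
    fn = int(np.sum((y_true == 1) & (y_pred == 0)))
    return tp, tn, fp, fn

def classification_metrics(y_true, y_pred):
    """Accuracy, recall (sensitivity), specificity, precision, F1."""
    tp, tn, fp, fn = confusion_counts(y_true, y_pred)
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    recall = tp / (tp + fn)            # sensitivity: TP / (TP + FN)
    specificity = tn / (tn + fp)       # TN / (TN + FP)
    precision = tp / (tp + fp)         # TP / (TP + FP)
    f1 = 2 * precision * recall / (precision + recall)  # harmonic mean
    return {"accuracy": accuracy, "recall": recall,
            "specificity": specificity, "precision": precision, "f1": f1}

def regression_metrics(y_true, y_pred):
    """MAE, MSE, and R-squared from ground truth y and predictions y-hat."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    mae = float(np.mean(np.abs(y_true - y_pred)))
    mse = float(np.mean((y_true - y_pred) ** 2))
    ss_res = np.sum((y_true - y_pred) ** 2)              # residual variation
    ss_tot = np.sum((y_true - np.mean(y_true)) ** 2)     # total variation
    r2 = float(1.0 - ss_res / ss_tot)
    return {"mae": mae, "mse": mse, "r2": r2}
```

For example, `classification_metrics([1, 1, 0, 0], [1, 0, 0, 1])` gives TP = TN = FP = FN = 1, so every listed classification metric evaluates to 0.5. The AUC and Kaplan-Meier curve are omitted here because they require predicted scores and time-to-event data rather than hard labels.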