Table 2. ROC AUC and PR AUC (with 95% confidence intervals) of each CAD product evaluated.
Developer (software name‡, version) | ROC AUC (95% CI) | PR AUC (95% CI)
---|---|---
**Abnormality scores obtained by FIT** | |
Qure.ai (qXR v3) | 0.82 (0.79–0.86) | 0.41 (0.33–0.50) |
Delft Imaging (CAD4TB v7) | 0.82 (0.78–0.85) | 0.39 (0.31–0.47) |
DeepTek (Genki v2) | 0.78 (0.75–0.82) | 0.28 (0.22–0.34) |
**Abnormality scores provided by the CAD company** | |
Lunit (INSIGHT CXR v3.1.0.0) | 0.82 (0.79–0.86) | 0.44 (0.35–0.54) |
JF Healthcare (JF CXR-1 v3.0) | 0.77 (0.73–0.81) | 0.28 (0.22–0.35) |
InferVision (InferRead DR Chest v1.0.0.0) | 0.76 (0.72–0.80) | 0.29 (0.22–0.36) |
OXIPIT (ChestEye v1) | 0.73 (0.69–0.77) | 0.23 (0.18–0.28) |
Artelus (T-Xnet v1) | 0.70 (0.66–0.74) | 0.23 (0.17–0.29) |
EPCON (XrayAME v1) | 0.66 (0.61–0.71) | 0.23 (0.17–0.28) |
COTO (v1) | 0.66 (0.61–0.71) | 0.22 (0.17–0.28) |
SemanticMD (v1) | 0.53 (0.48–0.58) | 0.14 (0.10–0.17) |
Dr CADx (v0.1) | 0.50 (0.45–0.55) | 0.13 (0.10–0.16) |
ROC AUC, area under the receiver operating characteristic curve; PR AUC, area under the precision-recall curve.
‡Software name omitted where none was available.
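Both metrics above can be computed from per-image abnormality scores and binary reference labels. The sketch below is illustrative only: the arrays are hypothetical, `average_precision_score` serves as the customary estimator of PR AUC, and a percentile bootstrap stands in for whatever CI method the study actually used.

```python
import numpy as np
from sklearn.metrics import roc_auc_score, average_precision_score

rng = np.random.default_rng(0)

def auc_with_ci(y_true, scores, metric, n_boot=2000, alpha=0.05):
    """Point estimate plus a percentile-bootstrap (1 - alpha) CI for a ranking metric."""
    y_true, scores = np.asarray(y_true), np.asarray(scores)
    n = len(y_true)
    stats = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)               # resample images with replacement
        if y_true[idx].min() == y_true[idx].max():
            continue                              # skip resamples containing a single class
        stats.append(metric(y_true[idx], scores[idx]))
    lo, hi = np.quantile(stats, [alpha / 2, 1 - alpha / 2])
    return metric(y_true, scores), lo, hi

# Hypothetical inputs: y = binary reference label per x-ray, s = CAD abnormality score.
y = rng.integers(0, 2, 500)
s = 0.3 * y + rng.random(500)

for name, metric in [("ROC AUC", roc_auc_score), ("PR AUC", average_precision_score)]:
    est, lo, hi = auc_with_ci(y, s, metric)
    print(f"{name}: {est:.2f} ({lo:.2f}-{hi:.2f})")
```

Note that the no-skill baseline of a PR curve equals the prevalence of positives in the sample, whereas the ROC baseline is fixed at 0.5, which is why the PR AUCs in Table 2 sit well below the corresponding ROC AUCs.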