Table 2.
| # | Challenge | Importance agreement | Category |
|---|---|---|---|
| 1 | Problems related to reproducibility/generalizability | 51.4% (18) | Problems related to radiomics studies |
| 2 | Problems related to uncertainty/trustability of models [expert proposal] | 40.0% (14) | Problems related to radiomics pipelines |
| 3 | Lacking workflow integration [expert proposal] | 31.4% (11) | Problems related to radiomics pipelines |
| 4 | Lack of evidence gained by prospective evaluation [expert proposal] | 28.6% (10) | Problems related to radiomics studies |
| 5 | Legal and privacy problems | 28.6% (10) | Problems related to data sharing |
| 6 | Problems related to use of routine data | 25.7% (9) | Problems related to radiomics studies |
| 7 | Lack of quality-ensuring guidelines for reviewers (and editors) [expert proposal] | 20.0% (7) | Lack of guidelines |
| 8 | Lack of standardized computation methods | 17.1% (6) | Lack of standardization |
| 9 | Lack of homogeneous evaluation criteria | 11.4% (4) | Lack of standardization |
| 10 | Problems related to image acquisition | 11.4% (4) | Problems related to radiomics pipelines |
Column “#” displays the final importance rank. Column “Importance agreement” displays the percentage of experts who selected the challenge as important, with the absolute number of selections in parentheses. Challenges proposed by the expert panel in round 3 are marked “[expert proposal]”.
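The reported agreement percentages and selection counts are mutually consistent with a voting panel of 35 experts (e.g. 18/35 ≈ 51.4%). A minimal sketch that cross-checks each row under that assumption (the panel size n = 35 is inferred from the ratios, not stated in the table):

```python
# Cross-check: does count / n, rounded to one decimal, reproduce the
# reported "Importance agreement" percentage for every row?
counts = {1: 18, 2: 14, 3: 11, 4: 10, 5: 10,
          6: 9, 7: 7, 8: 6, 9: 4, 10: 4}
reported = {1: 51.4, 2: 40.0, 3: 31.4, 4: 28.6, 5: 28.6,
            6: 25.7, 7: 20.0, 8: 17.1, 9: 11.4, 10: 11.4}
n = 35  # assumed panel size, inferred from the count/percentage ratios

for rank, count in counts.items():
    pct = round(100 * count / n, 1)
    assert pct == reported[rank], (rank, pct, reported[rank])

print(f"all 10 rows consistent with n = {n}")
```

Every row passes, which supports reading the parenthesized numbers as absolute selection counts out of the same fixed panel.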