Table 2.
Challenges for AI application in diabetes care and how they may be overcome with future development
Challenge | Description | Mitigating strategies |
---|---|---|
Data quality control | data quality may have the following problems: (1) poor quality of the data themselves, (2) poor quality of the data labels, and (3) insufficient data. | ensure the quality of data used in the training process |
 | AI may amplify implicit bias and discrimination if trained on data that reflect existing health-care disparities | train AI algorithms on fair datasets that include and accurately represent the social, environmental, and economic factors that influence health |
Poor technology design | the initial versions of most AI systems are often challenging to navigate | understand the needs of end users (for example, patients and providers) |
 | many EHR vendors did not follow basic usability principles | develop software and applications with input from end users |
 | patients reported a lack of confidence with technology, as well as frustration with the design features and navigation of commercially available mobile applications | use an iterative design process |
Lack of clinical integration | application of AI systems in the real world may lead to many unintended outcomes | develop AI algorithms that can be integrated into current clinical and digital workflows |
 | experts may struggle to develop trust in AI systems | provide explainability analyses of AI systems |
 | AI systems could also be perceived as encroaching on clinicians’ professional role | design AI to support clinicians’ decision-making rather than to deliver a competing diagnosis on its own |
Privacy concerns | implementing data privacy and security assurances is an overriding issue for the future of AI in medicine, since hacking is a pervasive problem worldwide | |
Non-adherence | user adherence, which is affected by convenience, user experience, and the true benefits the technology delivers, is crucial to the real-world effectiveness of AI applications | |
Imperfection of laws and regulations | AI in medicine raises legal and regulatory challenges regarding medical negligence attributed to complex decision-support systems | |
AI, artificial intelligence; EHR, electronic health record.