Abstract
The development of artificial intelligence (AI) and machine learning (ML)-based systems in medicine is growing, and these systems are being used for disease diagnosis, drug development, and treatment personalization. Some of these systems are designed to perform activities that demand human cognitive function. However, the use of these systems in routine care by patients and caregivers lags behind expectations. This paper reviews several challenges that healthcare systems face and the obstacles to integrating digital systems into routine care. It focuses on integrating digital systems with human physicians and describes second-generation AI systems designed to move closer to biology and reduce complexity, augmenting but not replacing physicians to improve patient outcomes. The constrained disorder principle (CDP) defines complex biological systems by their degree of regulated variability. This paper describes the CDP-based second-generation AI platform, which underlies the Digital Pill: a system that humanizes AI by moving closer to human biology and using the inherent variability of biological systems to improve outcomes. The system augments physicians, assisting them in decision-making to improve patients' responses and adherence, but does not replace healthcare providers. It restores the efficacy of chronic drugs and improves adherence while generating data-driven therapeutic regimens. While AI can substitute for many medical activities, it is unlikely to replace human physicians. Human doctors will continue serving patients with capabilities augmented by AI. The described co-piloting model better reflects biological pathways and assists physicians in providing better care.
Keywords: digital health, artificial intelligence, robots, medical care, future medicine
1. Introduction
Computer-controlled machines are being developed to perform activities that demand human cognitive function. Artificial intelligence (AI) and machine learning (ML)-based digital health systems for use in healthcare are improving rapidly. The potential uses of AI in healthcare include disease diagnosis, drug development, treatment personalization, and gene editing [1]. This progress has led companies and physicians to consider the possibility that, at some point, physicians will be replaced by robots. Vinod Khosla, the legendary Silicon Valley investor, argued that machines would replace 80 percent of doctors in the future of healthcare [2]. It was suggested that machines could replace doctors because much professional work can be fragmented into routine, process-based parts that require little judgment, creativity, or empathy [3]. Technological singularity (TS) refers to a point at which AI surpasses human intelligence and assumes full autonomy and responsibility for decision-making. TS in healthcare suggests replacing medical practitioners with AI-enabled robots and peripheral systems [4].
However, the use of AI-based systems in routine care by patients and caregivers lags behind expectations.
This paper reviews several challenges healthcare systems face and the obstacles to integrating digital systems into routine patient care. The constrained disorder principle (CDP) defines complex biological systems by their inherent variability [5]. The paper focuses on integrating digital systems with human physicians. It describes CDP-based second-generation AI systems designed to move closer to biology and reduce complexity, aiding but not replacing physicians to improve patient outcomes [6].
2. Currently Used AI Systems in Healthcare
Machine learning (ML) and natural language processing (NLP) are two common areas of AI. In ML, the system gathers and examines structured data such as genetic traits and diagnostic images. NLP involves extracting information from unstructured data such as the free text of electronic medical records (EMRs) [7]. Both can assist in clinical decisions. ML systems improve their algorithms automatically from data and cognitive inputs without being explicitly programmed for each task [8]. They learn patterns from large amounts of information and use them for prediction [8,9].
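As a rough illustration of these two data routes, the minimal sketch below (with invented toy notes and labels, using scikit-learn) turns unstructured EMR-style text into structured word counts and then lets a simple ML classifier learn a predictive pattern from them; it is a conceptual example, not a clinical tool.

```python
# Minimal sketch, not a clinical tool: NLP turns unstructured notes into
# structured features, and an ML classifier learns a pattern from them.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

notes = [
    "shortness of breath and ankle swelling",   # hypothetical EMR free text
    "chest pain radiating to the left arm",
    "routine follow-up, no complaints",
    "mild seasonal allergies, otherwise well",
]
labels = [1, 1, 0, 0]  # toy flag: 1 = cardiology referral made, 0 = none

vectorizer = CountVectorizer()                  # unstructured text -> structured counts
X = vectorizer.fit_transform(notes)
model = LogisticRegression().fit(X, labels)     # ML learns a pattern from structured data

new_note = ["new onset shortness of breath"]
print(model.predict(vectorizer.transform(new_note)))  # decision-support suggestion only
```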
Knowledge-based systems are designed to simulate human expertise by asking experts to describe the methods they use to solve problems [10]. Data from wearable and non-wearable biosensors, physical sensors, and EMRs provide sources of big data, which serve as tools to improve the accuracy and efficiency of diagnosis and treatment [8,11,12,13]. Precision medicine-based AI systems consider individual subjects and their genomic variations, contributing factors, age, gender, ethnicity, and emotional state, and tailor interventions accordingly [12,14].
AI-based systems can gather and analyze large amounts of data and draw conclusions, which is challenging for human physicians [13]. They can help predict disease, prepare long-term plans for patient care, improve care accuracy, reduce side effects, and personalize treatments [15]. AI systems use algorithms for data processing using several methods, including artificial neural networks (ANNs), fuzzy expert systems (FESs), evolutionary calculations (ECs), and hybrid intelligent systems (HISs) [4].
Some theories suggest that ANNs mimic human thinking through data collection, analysis of examples of answers to previous problems, and the use of learning algorithms, based on which a response is designed [15]. The type of data being processed governs the value of the results. A deep neural network (DNN) is an ANN that comprises more layers, enabling better predictions from data [16]; its performance benefits from larger training datasets. Deep learning (DL) enables complex correlations to be extracted by ML algorithms. Compared with earlier networks of three to five layers, DL uses ANNs with more than ten layers, whose architecture is built from layered computational operations on tensors. An FES encodes professional knowledge in a particular area to mimic an expert's response. EC is a method inspired by biological evolution that searches for solutions. An HIS is a synergistic model that combines several of these approaches [17,18,19].
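To make the "layers and tensors" wording concrete, here is a minimal NumPy sketch (random, untrained weights and a hypothetical 12-layer architecture, purely for illustration) of how a deep network is simply a stack of tensor operations between input features and an output score.

```python
# Illustrative only: an untrained, randomly initialized "deep" network.
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

# Hypothetical 12-layer network mapping 8 input features to one risk score.
layer_sizes = [8] + [16] * 11 + [1]
weights = [rng.normal(scale=0.3, size=(m, n))
           for m, n in zip(layer_sizes[:-1], layer_sizes[1:])]

def forward(x):
    for w in weights[:-1]:
        x = relu(x @ w)                                # hidden layers: tensor ops + nonlinearity
    return 1 / (1 + np.exp(-(x @ weights[-1])))        # sigmoid output in [0, 1]

patient_features = rng.normal(size=(1, 8))             # toy, unlabeled input
print(forward(patient_features))
```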
AI and robots are used in numerous areas to replace the human workforce and to assist human performance [20]. Systems are divided into assistive and autonomous. Assistive systems improve performance through data collection and analysis, enabling tasks to be carried out under human supervision [21]. Examples are robots integrated into prosthetics and implants, robotic assistants for patient support and rehabilitation, and robots that help healthcare staff with their jobs [22,23]. Autonomous systems are designed to react to real-world situations; they function with minimal human interaction and make decisions. Examples are digital receptionists and autonomous implanted devices [24].
Speech recognition and other interface systems can control robotic platforms. Examples are simulations of social interaction, cognitive stimulation, and health assessment [25,26]. AI platforms collect and analyze data for object identification, language processing, facial recognition, and improving diagnosis [27,28]. In surgery, robotics can undertake pre-programmed tasks under a surgeon's control. Surgeons can integrate pre-operative data with real-time operating parameters to improve outcomes [29,30]. Semi-active systems enable surgeons to supplement the pre-programmed component, whereas master–slave systems lack autonomous elements and depend entirely on the surgeon's actions [31].
AI-based monitoring approaches support healthcare providers in diagnosing symptoms and providing prognosis [28,32]. ML can detect abnormalities using contextual factors, such as object, time, location, and duration, to support decisions on the appropriate action [33]. It predicts the risk of postoperative complications with better accuracy than conventional approaches [34]. ML can improve cardiovascular risk prediction by capturing complex interactions between risk factors [35].
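The toy sketch below (fully synthetic data with invented effect sizes) illustrates why capturing interactions matters: when the outcome is driven by two risk factors acting together, an additive logistic model misestimates the doubly exposed group, whereas adding an interaction term restores calibration.

```python
# Synthetic illustration of "complex interactions between risk factors".
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 20000
smoking = rng.integers(0, 2, n)
high_bp = rng.integers(0, 2, n)
risk = np.where((smoking == 1) & (high_bp == 1), 0.45, 0.05)  # invented joint effect
y = rng.random(n) < risk

X_main = np.column_stack([smoking, high_bp])
X_full = np.column_stack([smoking, high_bp, smoking * high_bp])  # add interaction term

cells = np.array([[0, 0], [1, 0], [0, 1], [1, 1]])
m_main = LogisticRegression().fit(X_main, y)
m_full = LogisticRegression().fit(X_full, y)
print("main effects only:", m_main.predict_proba(cells)[:, 1].round(2))
print("with interaction :",
      m_full.predict_proba(np.column_stack([cells, cells[:, 0] * cells[:, 1]]))[:, 1].round(2))
print("observed rates   :", [y[(smoking == a) & (high_bp == b)].mean().round(2) for a, b in cells])
```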
Computer vision (CV) involves image processing, pattern recognition, and response [28,36]. It can be used in radiology and pathology for image processing [4]. Supervised cancer detection implies that the machine has "learned" a mapping from a labeled dataset; unsupervised learning is essentially clustering of unlabeled data. Supervised learning can be used to predict diagnosis and prognosis in sepsis when fed large datasets; reinforcement learning is sometimes described as a hybrid combining elements of the other two [37,38]. AI algorithm-based grading systems can assess the eye fundus photographs of diabetic patients [39].
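The following minimal sketch (toy synthetic "image-derived" features, invented for illustration) contrasts the two modes: supervised learning fits a labeled mapping it can reuse on a new case, while unsupervised learning only groups unlabeled cases into clusters.

```python
# Toy contrast between supervised classification and unsupervised clustering.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
benign = rng.normal(loc=0.0, scale=1.0, size=(100, 5))      # invented feature vectors
malignant = rng.normal(loc=2.5, scale=1.0, size=(100, 5))
X = np.vstack([benign, malignant])
y = np.array([0] * 100 + [1] * 100)                         # labels available -> supervised

clf = RandomForestClassifier(random_state=0).fit(X, y)      # learns the labeled mapping
print("supervised prediction:", clf.predict(rng.normal(2.5, 1.0, (1, 5))))

clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)  # no labels -> clustering
print("unsupervised cluster sizes:", np.bincount(clusters))
```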
While these systems offer numerous benefits, they pose challenges for end users and healthcare systems.
3. The Challenges Faced by Current AI Systems: Low Engagement by Patients and Physicians
The digital health market has grown to some 350,000 products: apps, wearables, and telemedicine devices [40,41]. The market reached USD 116.39 billion in 2019 and is expected to exceed USD 833.44 billion by 2027 [42]. The AI healthcare market is projected to grow at a compound annual rate of roughly 40% [20]. AI-based systems are predicted to reduce annual US healthcare costs by USD 150 billion by 2026. Savings are expected from changing the healthcare model from reactive to proactive, centered on health management rather than treatments, and from reducing doctor visits and hospitalizations [20]. While the AI field is growing rapidly, multiple challenges prevent it from expanding further. Patients, physicians, and healthcare authorities show a low level of engagement [43,44].
Healthcare systems are becoming proactive following the P5 concept, which stands for personalized, predictive, participatory, preventive, and precision medicine [45]. This patient-centered model sets the basis for AI platforms that assist in diagnosis and treatment decision-making [8]. However, these systems are associated with numerous challenges, such as privacy, ethical principles, safety, security, and various biases, which may result in harmful decisions [8].
There is a tendency among healthcare professionals to resist or ignore new technologies. There are numerous reasons for the lack of engagement, such as perceived threats to professional status, privacy and autonomy concerns, and ethical and legal issues of responsibility [46,47,48]. The low engagement with digital systems has led to the "valley of death of AI" or "AI winter" [8]. Machines are expected to perform repetitive, routine tasks in foreseeable environments and to collaborate with humans on activities in uncontrolled situations [49,50,51]. However, there are no validation data for the functions machines can perform. It is thus not always clear how a robot can assist healthcare providers and how forecasts of outcomes can change managed care [52,53].
The quality and reliability of digital health devices and sensor inputs still need to be determined [54]. Algorithms still underperform in unusual cases of treatment resistance and side effects and where there are no previous examples to build on [55]. Digital mammography has a sensitivity of 84% for detecting breast cancers and is yet to be improved [56,57].
AI technologies have achieved notable results in predicting sepsis and cardiovascular risk, in monitoring vital parameters in intensive care units, and in robotics. However, their use in real-world settings is still limited [8]. Precision patient management is outcome-driven and aims for optimal outcome prediction, which is challenging to achieve using current AI-based methods [58]. Computers can search more datasets faster than humans, but speed and efficiency do not always translate into accuracy [59]. Humans still perform many tasks and play an essential role in determining AI-based systems' course of operation [60]. Only a few AI tools have achieved regulatory clearance, often based on their performance in limited studies [61].
Explainability is an essential tool for justifying AI-based decisions. It can help validate predictions, improve models, and provide new insights into the problem, leading to more reliable AI systems [62]. Explainability is a feature that enables a reconstruction of why an AI system produced a certain decision [63]. AI-based predictions are not always explainable, in contrast to rule-based systems, and this may conceal errors or biases. The lack of explainability makes it hard for physicians to judge the reliability of the AI output [60]. Explainable AI comprises the strategies and methodologies used to construct AI systems whose outputs and predictions end users can understand and interpret [62]. The increasing use of opaque AI applications in high-stakes fields, especially healthcare, has increased the need for clarity and explainability, given the potentially high-impact consequences of incorrect AI predictions in such critical sectors. The effective integration of AI models in healthcare depends on these models being both explainable and interpretable. Gaining the trust of healthcare professionals requires AI applications to be transparent about their decision-making processes and underlying logic [64]. The challenges of accountability, responsibility, and liability for harm caused by AI remain unresolved: it is unclear who is responsible, the programmer, manufacturer, end user, AI/robotic system, or dataset provider. The European Parliament's Resolution assigned the responsibility to the human factor, including the developer, manufacturer, owner, or operator [65].
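One widely used post hoc explainability technique is permutation importance, sketched below on toy synthetic data with hypothetical feature names: the drop in model accuracy when a single input is scrambled indicates how much the prediction relies on that input, giving the clinician something concrete to sanity-check.

```python
# Toy permutation-importance sketch; data and feature names are invented.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000
age = rng.normal(60, 10, n)
bp = rng.normal(130, 15, n)
noise = rng.normal(0, 1, n)                           # deliberately irrelevant input
y = (0.06 * (age - 60) + 0.04 * (bp - 130) + rng.normal(0, 1, n)) > 0

X = np.column_stack([age, bp, noise])
names = ["age", "blood_pressure", "noise_feature"]
model = LogisticRegression(max_iter=1000).fit(X, y)
baseline = model.score(X, y)

for j, name in enumerate(names):
    X_perm = X.copy()
    X_perm[:, j] = rng.permutation(X_perm[:, j])      # break the link between feature j and outcome
    print(f"{name}: accuracy drop {baseline - model.score(X_perm, y):.3f}")
```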
The P5 model for healthcare is used to develop individualized care and involves multiple factors: socio-economic background, gender, ethnicity, and education. However, it carries numerous potential biases against particular population groups. Training datasets face the challenge of under-representing subgroups, and essential variables are distributed differently across subgroups. Examples are cardiovascular disease and Parkinson's disease, which progress differently by gender. This results in unintended and undesirable discrimination bias [66,67]. Implementations of AI can account for ethnicity, gender, and other differences, generating more personalized results and a "desirable bias" [68].
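A minimal sketch of a routine bias check follows (fully synthetic data, invented markers and group labels): when one subgroup is under-represented and its disease is driven by a different variable, a model fitted to the pooled data performs worse for that subgroup, which only becomes visible when performance is reported per subgroup.

```python
# Synthetic subgroup-bias check; groups, markers, and effects are invented.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n_a, n_b = 5000, 500                                  # group B is under-represented
group = np.array([0] * n_a + [1] * n_b)
marker_1 = rng.normal(size=group.size)
marker_2 = rng.normal(size=group.size)
# invented assumption: marker_1 drives disease in group A, marker_2 in group B
logit = np.where(group == 0, 2.0 * marker_1, 2.0 * marker_2)
y = rng.random(group.size) < 1 / (1 + np.exp(-logit))

X = np.column_stack([marker_1, marker_2])
model = LogisticRegression().fit(X, y)                # trained on the pooled, imbalanced data
scores = model.predict_proba(X)[:, 1]

for g, name in [(0, "group A"), (1, "group B")]:
    mask = group == g
    print(name, "AUC:", round(roc_auc_score(y[mask], scores[mask]), 2))
```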
The costs associated with using newer versions of AI systems are expected to be economic burdens on health services. They also require higher computing power, which is not always available [8].
Among the many challenges healthcare systems face is low adherence to care: roughly 40% of patients do not adhere to their regimens [69]. Many patients go through more than one line of treatment in search of an optimal response and still fail to achieve it. A third to half of patients with chronic diseases suffer from a partial or complete loss of effect of chronic therapies, a significant cause of loss of adherence to chronic therapies [70]. Low adherence and loss of response in patients with chronic diseases are substantial burdens on health systems, leading to increased admissions and the need for expensive drugs, which are not always better.
The lack of patients' adherence to AI-based apps is a significant challenge [71]. The low engagement of patients and physicians with currently used digital health apps precludes their use as a solution for the loss of efficacy of, and low adherence to, chronic therapies [60,72,73]. Engagement rates with digital health apps remain low even as their availability increases; use is rarely sustained, and many users do not adhere to these apps as intended. Studies suggest that 80 percent of subjects exposed to digital health interventions engage only at a minimal level: they log into the app once and do not use it consistently long-term [74]. It is estimated that less than 5% of patients show long-term adherence to apps [75,76]. Ethical dilemmas between AI systems and patients pose an additional challenge [77]. A recent study based on multiple interviews documented marked variability in patients' perspectives on AI systems; there was no single pattern regarding patients' opinions on the introduction of robots or the functionalities and tasks that robots can perform [71]. Patients accept and value care automation while looking for reciprocity in robotics [78].
These observations raise the question of whether robots can replace medical care providers.
4. Why Will Robots Not Replace Physicians?
Multiple studies have thoroughly analyzed the challenge of replacing caregivers with AI-based systems. AI is expected to enhance human clinicians' efficiency, improve diagnostic and treatment accuracy, and reduce costs, but at the risk of eliminating the emotional contact of the patient–doctor relationship [79,80]. Empathy is the ability to understand and share feelings, which develops from self-awareness and interactions with other members of society [81]. Empathy cannot be replaced and is a challenge for AI-based interactions, which Artificial Empathy (AE) attempts to resolve. AI-enabled, empathetic mobile app technologies have shown some promise in dealing with this challenge [82,83].
Although AI and robotics perform well, human surveillance is still essential [60]. Human physicians have a non-linear working method that adapts to ever-changing conditions and quickly evolving situations, which is difficult to teach to a computer [20,59]. This is a significant challenge for ML-based technologies. Beyond following pre-defined guidelines, diagnosis and treatment require creativity and problem-solving skills that algorithms can never have. Patients vary, and no case is the same. This implies a need for a human physician to decide whether to use a data-derived means to diagnose or to decide on a therapeutic regimen [84].
The human brain is complex and can handle data at a vast scale, so there is little point in developing an AI that merely attempts to imitate what the brain already does well. This implies that while AI-based systems can assist and guide physicians, they may need to leave the complex analysis and decisions to humans [85]. There are tasks that algorithms and robots cannot complete without humans in the loop. A human physician must constantly review the data analysis conducted and the conclusions drawn by the AI system [85].
There will always be tasks where humans are faster, more reliable, and cheaper. Human physicians rely on more than just data for medical decision-making. Computer scientists find that decisions are influenced by physicians' "gut feelings", which cannot be translated into algorithms [86,87]. This implies that collaboration between humans and technology should be the ultimate goal.
5. Integrating the Human Brain with a Computer: The Future Physician
The practice of medicine is changing and is expected to keep changing as AI methods improve [13]. Although technology has come a long way, the focus of advancing technology should be on augmenting physicians' workflows rather than replacing them. Human–machine collaborations perform better than either alone [61,86,87]. The role of AI in medicine is not about replacing physicians; instead, it is about optimizing and improving what they already achieve [13,88].
Intelligent digital technologies require competent professionals; it is not technology versus humans, as complex technologies demand skilled users [89]. It is worthwhile for a program to assist with repetitive, data-based tasks. Internists spend a large proportion of their working hours on paperwork that can be automated [90]. Decision-making, on the other hand, needs to be supervised by a human physician. The analysis of large amounts of data to improve diagnostic approaches is limited by humans' cognitive and sensory abilities, which are expected to be enhanced by AI [28].
AI can improve patient outcomes by 30% to 40% while reducing treatment costs by up to 50% [13,91]. It can improve efficiency and decrease costs by shifting human labor to more complex tasks, identifying workflow optimization strategies, reducing medical waste via better coordination, eliminating over-treatment or low-value care, automating administrative and highly repetitive processes, and allowing the physician to focus on actual care [59,92,93].
Autonomous machines can augment medical professionals, improving the quality of care [4]. In the art of medicine of the future, it is essential to know when to use AI and when we should let natural intelligence work undisturbed [94]. AI and humans form a bio-techno-social system in which each participant contributes something and benefits [95,96]. AI-based systems will support and augment physicians’ skills and are unlikely to replace the traditional physician–patient relationship [4,94,97]. Augmentation means that AI can assist and allow humans to function better. Human clinicians will focus on tasks based on uniquely human skills like big-picture integration, empathy, and persuasion [28,98].
When deep learning predictions were combined with a human pathologist's review, tumor-detection scores increased significantly, and the human error rate decreased [99,100]. AI is expected to become part of routine radiology work and to improve accuracy [101]. Before accessing an actual doctor, AI bots can determine whether specific symptoms warrant a conversation with a physician [102]. Patients are asked a series of questions, each based on the previous response, and specific actions are recommended. Medical professionals review these answers for accuracy. Under certain conditions, a referral to a physician with the appropriate level of urgency is provided [103].
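A minimal rule-based sketch of such a triage flow is shown below; the symptoms, questions, and urgency rules are all invented for illustration, and the output is explicitly a suggestion routed to a physician for review rather than an autonomous decision.

```python
# Invented triage rules, for illustration only; output requires physician review.
def triage(answers):
    # answers: dict of patient responses collected by the bot, one question at a time
    if answers.get("chest_pain"):
        urgency = "urgent" if answers.get("pain_at_rest") else "physician within 24 h"
    elif answers.get("fever_days", 0) >= 3:
        urgency = "physician this week"
    else:
        urgency = "self-care advice"
    return {"recommendation": urgency, "physician_review_required": True}

print(triage({"chest_pain": True, "pain_at_rest": False}))
```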
AI can also augment patients. Neuroprosthetics help or improve a patient's nervous system on both the input and output sides via electrical stimulation to overcome neurological deficits [104]. Brain–machine interfaces are platforms in which a subject's intended, voluntary, goal-directed wishes are learned and stored as the user trains an AI-based controller, which translates them into actions [105].
Nevertheless, these AI systems are not designed for and cannot replace physicians. There will always be cases where a physician’s judgment is needed to define undefined variables or to use variables related to the host, the disease, or the environment that are uncontrollable or hard to quantify or formulate into the algorithm.
A significant challenge in medicine is dealing with uncertainties and decision-making under dynamic circumstances, where the input and output variables are uncontrolled [106]. While this gives the human factor an advantage, AI systems using Markov Logic Networks can handle uncertainty and domain knowledge modeling [28]. Uncertainty modeling is essential for various medical needs, such as monitoring incomplete and unpredictable patient activities [28]. These models are necessary for activity recognition, modeling of activities, and decision-making, but they always require human physician supervision.
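As a stand-in illustration of uncertainty-aware decision support (not a Markov Logic Network, and with an invented deferral threshold), the sketch below reports the predictive entropy of a model's output and defers to the physician when the prediction is close to a coin flip.

```python
# Toy uncertainty-aware support: defer to the physician when entropy is high.
import numpy as np

def predictive_entropy(p):
    p = np.clip(p, 1e-9, 1 - 1e-9)
    return -(p * np.log2(p) + (1 - p) * np.log2(1 - p))

def support(prob_disease, defer_above=0.9):            # threshold is an invented example
    h = predictive_entropy(prob_disease)
    if h > defer_above:                                 # close to a coin flip -> escalate
        return f"uncertain (entropy {h:.2f} bits): refer to physician"
    return f"suggest {'treat' if prob_disease > 0.5 else 'monitor'} (entropy {h:.2f} bits)"

for p in [0.03, 0.48, 0.97]:
    print(p, "->", support(p))
```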
6. The Constrained Disorder Principle Defines Complex Biological Systems
Variability characterizes biological systems at all levels [107,108,109,110,111]. It is well documented for the genome, cellular functions, and whole organs [112,113,114,115,116,117,118,119,120,121,122,123,124,125,126,127]. Examples are the variability in heart rate, blood pressure, gait, breathing, and brain function [122,123,124,125,126,127]. The constrained disorder principle (CDP) defines living organisms by their inherent variability, which is regulated within dynamic borders [5]. Per the CDP, disease states evolve from a loss of variability or from variability that escapes its regulated borders [128].
The CDP implies that variability is mandatory for the normal function of complex systems. It is part of their adaptability and flexibility, which is required for proper function under continuous internal and external perturbations [5,129].
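A minimal numerical sketch of this idea follows (all numbers invented): a "healthy" signal fluctuates randomly but stays within regulated borders, while "disease-like" states show either a loss of variability or variability that escapes the bounds.

```python
# Invented numbers; a conceptual illustration of constrained disorder, not physiology.
import numpy as np

rng = np.random.default_rng(0)
t, lo, hi, baseline = 500, 55, 95, 72                                   # toy heart-rate borders

healthy = np.clip(baseline + np.cumsum(rng.normal(0, 0.8, t)), lo, hi)  # variable, but constrained
rigid = baseline + rng.normal(0, 0.05, t)                               # variability is lost
runaway = baseline + np.cumsum(rng.normal(0, 3.0, t))                   # variability without borders

for name, hr in [("healthy", healthy), ("rigid", rigid), ("runaway", runaway)]:
    outside = np.mean((hr < lo) | (hr > hi))
    print(f"{name:8s} sd={hr.std():5.1f}  fraction outside borders={outside:.2f}")
```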
7. CDP-Based Second-Generation AI System Augmenting Physicians
CDP-based second-generation AI systems overcome several of the first-generation systems' challenges, with the understanding that these platforms are meant to augment, not replace, physicians [6]. Second-generation systems are designed to utilize the inherent variability of complex biological processes and thereby better reflect these processes [112,113,114,115,116,117,118,119,120,121,130]. This co-piloting model of using AI systems by healthcare providers implies augmenting physicians and assisting them in providing better care.
The CDP-based second-generation AI systems incorporate random patterns into therapeutic regimens and are outcome-driven [6]. Clinical decisions involve numerous complexities that may not have been captured adequately in previous experience or in big data. Second-generation systems evolve from the n = 1 concept, recognizing that each patient is different [6,70,128,131]. They are based on the notion that a subject is never identical to the mean, which is a significant challenge for current big-data-based decision systems [4,132].
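The sketch below (invented drug, dose range, and variability weights) illustrates the general idea of such a regimen: rather than a fixed daily dose, a schedule with regulated, personalized variability is proposed, kept strictly within the approved dosing range and submitted for physician review.

```python
# Conceptual sketch of a variability-based regimen; all values are invented.
import numpy as np

rng = np.random.default_rng(42)

def variable_regimen(baseline_mg, low_mg, high_mg, days=28, variability=0.2):
    """Bounded random variation around the baseline dose (a constrained-disorder pattern)."""
    doses = baseline_mg * (1 + rng.uniform(-variability, variability, days))
    return np.clip(np.round(doses, 1), low_mg, high_mg)

regimen = variable_regimen(baseline_mg=40, low_mg=20, high_mg=60)
print("first week of proposed doses (mg):", regimen[:7])
print("mean dose:", regimen.mean().round(1), "- requires physician approval before use")
```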
AI systems must be adaptive as transformed health ecosystems evolve rapidly and must continuously adapt diagnosis and treatment for each subject. The CDP-based second-generation AI adapts to unforeseen scenarios and provides solutions accounting for biological processes’ dynamicity while personalizing the treatment [6]. AI systems must be context-aware to infer the user’s current activity state and the environment’s characteristics. They also need to be interoperable and to be able to exchange data and knowledge with numerous platforms [8,133].
Second-generation systems are designed to assist decision-making and augment physicians, not replace them. They provide a co-piloting model of an integrated precision medicine solution comprising a patient management tool adapted to the user and connected to data sources. The system offers a method for generating insightful datasets to optimize future diagnostic and therapeutic analysis [6]. See Table 1.
Table 1.
Potential strengths of second-generation artificial intelligence systems versus first-generation ones.
The Digital Pill is a drug regulated by a patent-protected, CDP-based second-generation AI system. It is designed to improve drug efficacy by introducing personalized variability into therapeutic regimens [132]. Data-driven variability in dosing and administration times is generated within the approved range. The Digital Pill is simple for patients and physicians to use and increases product loyalty and adherence, ensuring a long-term sustainable effect [131]. The Digital Pill has been suggested as a means to overcome drug resistance and loss of adherence in numerous chronic diseases [121,134,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154,155,156,157,158,159,160,161,162,163].
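Purely as an illustration of the outcome-driven, closed-loop idea described here, and emphatically not the proprietary Digital Pill algorithm, the sketch below adjusts the degree of regulated dose variability for the next treatment cycle from a measured response score, always within the approved range and subject to physician sign-off; every number and rule is invented.

```python
# Hypothetical closed-loop sketch; NOT the actual, patented Digital Pill algorithm.
import numpy as np

rng = np.random.default_rng(7)

def next_cycle(baseline_mg, variability, measured_response, target_response,
               low_mg=20, high_mg=60, days=28):
    # invented heuristic: widen regulated variability when response falls short
    if measured_response < target_response:
        variability = min(variability + 0.05, 0.30)
    else:
        variability = max(variability - 0.05, 0.05)
    doses = np.clip(baseline_mg * (1 + rng.uniform(-variability, variability, days)),
                    low_mg, high_mg)
    return variability, doses

variability = 0.10
for cycle, response in enumerate([0.4, 0.5, 0.8], start=1):   # toy response scores vs target 0.7
    variability, doses = next_cycle(40, variability, response, target_response=0.7)
    print(f"cycle {cycle}: variability +/-{variability:.0%}, "
          f"mean dose {doses.mean():.1f} mg (physician-approved)")
```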
Using the system in patients with chronic heart failure and diuretic resistance improved clinical and laboratory parameters, reduced side effects, and minimized heart failure-related emergency room admissions and hospitalizations [153]. The platform stabilized disease progression and improved clinical symptoms in patients with multiple sclerosis. The Digital Pill overcame analgesic tolerance, improving outcomes while reducing side effects in patients with chronic pain [164]. Clinical improvement was also shown in patients with Gaucher disease, a genetic disorder, using the system [160]. Additionally, the Digital Pill helped overcome drug resistance in tumors, improving clinical, radiological, and serum biomarkers in patients with cancer [165]. The system was also shown to slow aging processes [166].
8. Adding Value to All Players of the Healthcare System: Reducing Complexity
The CDP-based, second-generation AI Digital Pill is a companion tool for various healthcare system stakeholders that enables the delivery of precision medicine, streamlines communications, and manages data. These systems generate AI-powered decision-making companions, reducing complexity and adding value for all players in the healthcare system. For physicians and other healthcare providers, the system assists decision-making in generating treatment regimens that improve outcomes and adherence. For patients, it improves drug response and reduces side effects. Because these systems are outcome-based, they benefit patients where they most need an improved clinical response, which helps ensure engagement. The systems reduce costs to payers while acting as a market disruptor for pharma companies [6,70,131].
The system improves communications with patients and caregivers and provides a data collection and analysis tool to inform better decision-making and improve efficiency and excellence in managing resources. It enables data collection and performance reporting, supporting service standardization and quality assurance. The Digital Pill can serve as a platform for developing a digital twin for better prediction and risk management and for selecting ideal therapies in a subject-tailored way. It is also a tool for managing decentralized clinical trials, enabling drug products to be combined with a tool that enhances the benefit of chronic therapies [6,70,131].
The Digital Pill provides solutions to some of the significant challenges in healthcare by overcoming the lack of patient engagement in digital health adoption. It ensures adoption by improving outcomes so that patients view the app as part of their treatment, and it reduces uncertainty about the opportunities and risks of digital systems. The system allows pharma companies to use existing drugs without the need to develop expensive new products, avoids additional regulatory barriers, reduces costs to the health system, and increases pharma revenues [131].
9. Summary
Advances in AI are not just about technology versus humans. AI could substitute for many medical activities; however, it is unlikely to reach a stage where it can fully substitute for human physicians. Human doctors will continue serving patients with capabilities augmented by AI. The CDP-based, second-generation AI Digital Pill is designed to augment physicians' skills through a co-piloting model, assisting them in decision-making to improve patients' responses. The system better reflects biological processes by implementing variability signatures in the therapeutic regimens it generates. It humanizes AI by moving closer to human biology and using biological systems' inherent variability to improve outcomes. The system is a physician's companion, not a replacement for healthcare providers. It restores the efficacy of chronic drugs and improves adherence while generating data-driven therapeutic regimens.
Abbreviations
AI: artificial intelligence; ML: machine learning; TS: technological singularity; CDP: constrained disorder principle; NLP: natural language processing; ANN: artificial neural network; FES: fuzzy expert system; EC: evolutionary calculation; HIS: hybrid intelligent system; DNN: deep neural network; DL: deep learning.
Conflicts of Interest
The author declares no conflicts of interest.
Funding Statement
This research received no external funding.
Footnotes
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
References
- 1.Tran B.X., Vu G.T., Ha G.H., Vuong Q.-H., Ho M.-T., Vuong T.-T., La V.-P., Ho M.-T., Nghiem K.-C.P., Nguyen H.L.T. Global evolution of research in artificial intelligence in health and medicine: A bibliometric study. J. Clin. Med. 2019;8:360. doi: 10.3390/jcm8030360. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 2.Gold J.R., Bajo V.M. Insult-induced adaptive plasticity of the auditory system. Front. Neurosci. 2014;8:110. doi: 10.3389/fnins.2014.00110. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 3.Susskind R., Susskind D. Technology will replace many doctors, lawyers, and other professionals. Harvard Business Review. Oct 11, 2016.
- 4.Shuaib A., Arian H., Shuaib A. The Increasing Role of Artificial Intelligence in Health Care: Will Robots Replace Doctors in the Future? Int. J. Gen. Med. 2020;13:891–896. doi: 10.2147/IJGM.S268093. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 5.Ilan Y. The constrained disorder principle defines living organisms and provides a method for correcting disturbed biological systems. Comput. Struct. Biotechnol. J. 2022;20:6087–6096. doi: 10.1016/j.csbj.2022.11.015. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 6.Ilan Y. Second-Generation Digital Health Platforms: Placing the Patient at the Center and Focusing on Clinical Outcomes. Front. Digit. Health. 2020;2:569178. doi: 10.3389/fdgth.2020.569178. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 7.Solez K., Bernier A., Crichton J., Graves H., Kuttikat P., Lockwood R., Marovitz W.F., Monroe D., Pallen M., Pandya S. Bridging the gap between the technological singularity and medicine: Highlighting a course on technology and the future of medicine. Glob. J. Health Sci. 2013;5:112. doi: 10.5539/gjhs.v5n6p112. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 8.Denecke K., Baudoin C.R. A Review of Artificial Intelligence and Robotics in Transformed Health Ecosystems. Front. Med. 2022;9:795957. doi: 10.3389/fmed.2022.795957. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 9.Olshannikova E., Ometov A., Koucheryavy Y., Olsson T. Visualizing Big Data with augmented and virtual reality: Challenges and research agenda. J. Big Data. 2015;2:22. doi: 10.1186/s40537-015-0031-2. [DOI] [Google Scholar]
- 10.Steels L., López de Mantaras R. The Barcelona declaration for the proper development and usage of artificial intelligence in Europe. AI Commun. 2018;31:485–494. doi: 10.3233/AIC-180607. [DOI] [Google Scholar]
- 11.Kim J., Campbell A.S., de Ávila B.E.-F., Wang J. Wearable biosensors for healthcare monitoring. Nat. Biotechnol. 2019;37:389–406. doi: 10.1038/s41587-019-0045-y. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 12.Nam K.H., Kim D.H., Choi B.K., Han I.H. Internet of things, digital biomarker, and artificial intelligence in spine: Current and future perspectives. Neurospine. 2019;16:705. doi: 10.14245/ns.1938388.194. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 13.Ahuja A.S. The impact of artificial intelligence in medicine on the future role of the physician. PeerJ. 2019;7:e7702. doi: 10.7717/peerj.7702. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 14.Yu K.-H., Beam A.L., Kohane I.S. Artificial intelligence in healthcare. Nat. Biomed. Eng. 2018;2:719–731. doi: 10.1038/s41551-018-0305-z. [DOI] [PubMed] [Google Scholar]
- 15.Zador A.M. A critique of pure learning and what artificial neural networks can learn from animal brains. Nat. Commun. 2019;10:3770. doi: 10.1038/s41467-019-11786-6. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 16.Sarker I.H. Deep Learning: A Comprehensive Overview on Techniques, Taxonomy, Applications and Research Directions. SN Comput. Sci. 2021;2:420. doi: 10.1007/s42979-021-00815-1. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 17.Koopialipoor M., Jahed Armaghani D., Hedayat A., Marto A., Gordan B. Applying various hybrid intelligent systems to evaluate and predict slope stability under static and dynamic conditions. Soft Comput. 2019;23:5913–5929. doi: 10.1007/s00500-018-3253-3. [DOI] [Google Scholar]
- 18.Vlamou E., Papadopoulos B. Fuzzy logic systems and medical applications. AIMS Neurosci. 2019;6:266–272. doi: 10.3934/Neuroscience.2019.4.266. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 19.Abiodun O.I., Jantan A., Omolara A.E., Dada K.V., Mohamed N.A., Arshad H. State-of-the-art in artificial neural network applications: A survey. Heliyon. 2018;4:e00938. doi: 10.1016/j.heliyon.2018.e00938. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 20.Bohr A., Memarzadeh K. Artificial Intelligence in Healthcare. Academic Press; Cambridge, MA, USA: 2020. The rise of artificial intelligence in healthcare applications; pp. 25–60. [DOI] [Google Scholar]
- 21.Tao R., Ocampo R., Fong J., Soleymani A., Tavakoli M. Modeling and Emulating a Physiotherapist’s Role in Robot-Assisted Rehabilitation. Adv. Intell. Syst. 2020;2:1900181. doi: 10.1002/aisy.201900181. [DOI] [Google Scholar]
- 22.Feizi N., Tavakoli M., Patel R.V., Atashzar S.F. Robotics and AI for Teleoperation, Tele-Assessment, and Tele-Training for Surgery in the Era of COVID-19: Existing Challenges, and Future Vision. Front. Robot. AI. 2021;8:610677. doi: 10.3389/frobt.2021.610677. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 23.Łukasik S., Tobis S., Kropińska S., Suwalska A. Role of Assistive Robots in the Care of Older People: Survey Study Among Medical and Nursing Students. J. Med. Internet Res. 2020;22:e18003. doi: 10.2196/18003. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 24.Tavakoli M., Carriere J., Torabi A. Robotics, smart wearable technologies, and autonomous intelligent systems for healthcare during the COVID-19 pandemic: An analysis of the state of the art and future vision. Adv. Intell. Syst. 2020;2:2000071. doi: 10.1002/aisy.202000071. [DOI] [Google Scholar]
- 25.Lima M.R., Wairagkar M., Natarajan N., Vaitheswaran S., Vaidyanathan R. Robotic Telemedicine for Mental Health: A Multimodal Approach to Improve Human-Robot Engagement. Front. Robot. AI. 2021;8:618866. doi: 10.3389/frobt.2021.618866. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 26.Rogowski A. Scenario-Based Programming of Voice-Controlled Medical Robotic Systems. Sensors. 2022;22:9520. doi: 10.3390/s22239520. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 27.Koumakis L., Chatzaki C., Kazantzaki E., Maniadi E., Tsiknakis M. Dementia care frameworks and assistive technologies for their implementation: A review. IEEE Rev. Biomed. Eng. 2019;12:4–18. doi: 10.1109/RBME.2019.2892614. [DOI] [PubMed] [Google Scholar]
- 28.Ahmad R. Reviewing the relationship between machines and radiology: The application of artificial intelligence. Acta Radiol. Open. 2021;10:2058460121990296. doi: 10.1177/2058460121990296. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 29.Mathis M.R., Dubovoy T.Z., Caldwell M.D., Engoren M.C. Making Sense of Big Data to Improve Perioperative Care: Learning Health Systems and the Multicenter Perioperative Outcomes Group. J. Cardiothorac. Vasc. Anesth. 2020;34:582–585. doi: 10.1053/j.jvca.2019.11.012. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 30.Wagner M., Bihlmaier A., Kenngott H.G., Mietkowski P., Scheikl P.M., Bodenstedt S., Schiepe-Tiska A., Vetter J., Nickel F., Speidel S., et al. A learning robot for cognitive camera control in minimally invasive surgery. Surg. Endosc. 2021;35:5365–5374. doi: 10.1007/s00464-021-08509-8. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 31.Mezger U., Jendrewski C., Bartels M. Navigation in surgery. Langenbeck’s Arch. Surg. 2013;398:501–514. doi: 10.1007/s00423-013-1059-4. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 32.Nef T., Urwyler P., Büchler M., Tarnanas I., Stucki R., Cazzoli D., Müri R., Mosimann U. Evaluation of three state-of-the-art classifiers for recognition of activities of daily living from smart home ambient data. Sensors. 2012;15:11725–11740. doi: 10.3390/s150511725. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 33.Gupta N., Gupta S.K., Pathak R.K., Jain V., Rashidi P., Suri J.S. Human activity recognition in artificial intelligence framework: A narrative review. Artif. Intell. Rev. 2022;55:4755–4808. doi: 10.1007/s10462-021-10116-x. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 34.Stam W.T., Goedknegt L.K., Ingwersen E.W., Schoonmade L.J., Bruns E.R.J., Daams F. The prediction of surgical complications using artificial intelligence in patients undergoing major abdominal surgery: A systematic review. Surgery. 2022;171:1014–1021. doi: 10.1016/j.surg.2021.10.002. [DOI] [PubMed] [Google Scholar]
- 35.Pal M., Parija S., Panda G., Dhama K., Mohapatra R.K. Risk prediction of cardiovascular disease using machine learning classifiers. Open Med. 2022;17:1100–1113. doi: 10.1515/med-2022-0508. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 36.Gao J., Yang Y., Lin P., Park D.S. Computer vision in healthcare applications. J. Healthc. Eng. 2018;2018:5157020. doi: 10.1155/2018/5157020. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 37.Ching T., Himmelstein D.S., Beaulieu-Jones B.K., Kalinin A.A., Do B.T., Way G.P., Ferrero E., Agapow P.M., Zietz M., Hoffman M.M., et al. Opportunities and obstacles for deep learning in biology and medicine. J. R. Soc. Interface. 2018;15:20170387. doi: 10.1098/rsif.2017.0387. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 38.Jovel J., Greiner R. An Introduction to Machine Learning Approaches for Biomedical Research. Front. Med. 2021;8:771607. doi: 10.3389/fmed.2021.771607. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 39.Aruni G., Amit G., Dasgupta P. New surgical robots on the horizon and the potential role of artificial intelligence. Investig. Clin. Urol. 2018;59:221–222. doi: 10.4111/icu.2018.59.4.221. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 40.Ronquillo Y., Meyers A., Korvek S.J. StatPearls. StatPearls Publishing LLC.; Treasure Island, FL, USA: 2022. Digital Health. [PubMed] [Google Scholar]
- 41.Cummins N., Schuller B.W. Five Crucial Challenges in Digital Health. Front. Digit. Health. 2020;2:536203. doi: 10.3389/fdgth.2020.536203. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 42.Korteling J.E.H., van de Boer-Visschedijk G.C., Blankendaal R.A.M., Boonekamp R.C., Eikelboom A.R. Human- versus Artificial Intelligence. Front. Artif. Intell. 2021;4:622364. doi: 10.3389/frai.2021.622364. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 43.Davenport T., Kalakota R. The potential for artificial intelligence in healthcare. Future Healthc. J. 2019;6:94–98. doi: 10.7861/futurehosp.6-2-94. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 44.Hazarika I. Artificial intelligence: Opportunities and implications for the health workforce. Int. Health. 2020;12:241–245. doi: 10.1093/inthealth/ihaa007. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 45.Longo U.G., Carnevale A., Massaroni C., Lo Presti D., Berton A., Candela V., Schena E., Denaro V. Personalized, Predictive, Participatory, Precision, and Preventive (P5) Medicine in Rotator Cuff Tears. J. Pers. Med. 2021;11:255. doi: 10.3390/jpm11040255. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 46.Walter Z., Lopez M.S. Physician acceptance of information technologies: Role of perceived threat to professional autonomy. Decis. Support Syst. 2008;46:206–215. doi: 10.1016/j.dss.2008.06.004. [DOI] [Google Scholar]
- 47.Price W.N., Cohen I.G. Privacy in the age of medical big data. Nat. Med. 2019;25:37–43. doi: 10.1038/s41591-018-0272-7. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 48.Lamanna C., Byrne L. Should artificial intelligence augment medical decision making? The case for an autonomy algorithm. AMA J. Ethics. 2018;20:902–910. doi: 10.1001/amajethics.2018.902. [DOI] [PubMed] [Google Scholar]
- 49.Lin P., Abney K., Jenkins R. Robot Ethics 2.0: From Autonomous Cars to Artificial Intelligence. Oxford University Press; Oxford, UK: 2017. [Google Scholar]
- 50.Schaal S. The new robotics—Towards human-centered machines. HFSP J. 2007;1:115–126. doi: 10.2976/1.2748612. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 51.Vallès-Peris N., Domènech M. Roboticists’ imaginaries of robots for care: The radical imaginary as a tool for an ethical discussion. Eng. Stud. 2020;12:157–176. doi: 10.1080/19378629.2020.1821695. [DOI] [Google Scholar]
- 52.Topol E.J. High-performance medicine: The convergence of human and artificial intelligence. Nat. Med. 2019;25:44–56. doi: 10.1038/s41591-018-0300-7. [DOI] [PubMed] [Google Scholar]
- 53.Chin-Yee B., Upshur R. Three problems with big data and artificial intelligence in medicine. Perspect. Biol. Med. 2019;62:237–256. doi: 10.1353/pbm.2019.0012. [DOI] [PubMed] [Google Scholar]
- 54.Poitras I., Dupuis F., Bielmann M., Campeau-Lecours A., Mercier C., Bouyer L.J., Roy J.-S. Validity and reliability of wearable sensors for joint angle estimation: A systematic review. Sensors. 2019;19:1555. doi: 10.3390/s19071555. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 55.Paul D., Sanap G., Shenoy S., Kalyane D., Kalia K., Tekade R.K. Artificial intelligence in drug discovery and development. Drug Discov. Today. 2021;26:80–93. doi: 10.1016/j.drudis.2020.10.010. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 56.Geras K.J., Mann R.M., Moy L. Artificial Intelligence for Mammography and Digital Breast Tomosynthesis: Current Concepts and Future Perspectives. Radiology. 2019;293:246–259. doi: 10.1148/radiol.2019182627. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 57.Jairam M.P., Ha R. A review of artificial intelligence in mammography. Clin. Imaging. 2022;88:36–44. doi: 10.1016/j.clinimag.2022.05.005. [DOI] [PubMed] [Google Scholar]
- 58.Johnson K.B., Wei W.Q., Weeraratne D., Frisse M.E., Misulis K., Rhee K., Zhao J., Snowdon J.L. Precision Medicine, AI, and the Future of Personalized Health Care. Clin. Transl. Sci. 2021;14:86–93. doi: 10.1111/cts.12884. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 59.Chung H.J., Levens D. c-myc expression: Keep the noise down! Mol. Cells. 2005;20:157–166. doi: 10.1016/S1016-8478(23)13212-2. [DOI] [PubMed] [Google Scholar]
- 60.Győrffy Z., Radó N., Mesko B. Digitally engaged physicians about the digital health transition. PLoS ONE. 2020;15:e0238658. doi: 10.1371/journal.pone.0238658. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 61.Langlotz C.P. Will Artificial Intelligence Replace Radiologists? Radiology. Artif. Intell. 2019;1:e190058. doi: 10.1148/ryai.2019190058. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 62.Sadeghi Z., Alizadehsani R., Cifci M.A., Kausar S., Rehman R., Mahanta P., Bora P.K., Almasri A., Alkhawaldeh R.S., Hussain S., et al. A review of Explainable Artificial Intelligence in healthcare. Comput. Electr. Eng. 2024;118:109370. doi: 10.1016/j.compeleceng.2024.109370. [DOI] [Google Scholar]
- 63.Amann J., Blasimme A., Vayena E., Frey D., Madai V.I. Explainability for artificial intelligence in healthcare: A multidisciplinary perspective. BMC Med. Inform. Decis. Mak. 2020;20:310. doi: 10.1186/s12911-020-01332-6. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 64.Saranya A., Subhashini R. A systematic review of Explainable Artificial Intelligence models and applications: Recent developments and future trends. Decis. Anal. J. 2023;7:100230. doi: 10.1016/j.dajour.2023.100230. [DOI] [Google Scholar]
- 65.Langman S., Capicotto N., Maddahi Y., Zareinia K. Roboethics principles and policies in Europe and North America. SN Appl. Sci. 2021;3:857. doi: 10.1007/s42452-021-04853-5. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 66.Miller I.N., Cronin-Golomb A. Gender differences in Parkinson’s disease: Clinical characteristics and cognition. Mov. Disord. 2010;25:2695–2703. doi: 10.1002/mds.23388. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 67.Cirillo D., Catuara-Solarz S., Morey C., Guney E., Subirats L., Mellino S., Gigante A., Valencia A., Rementeria M.J., Chadha A.S. Sex and gender differences and biases in artificial intelligence for biomedicine and healthcare. npj Digit. Med. 2020;3:81. doi: 10.1038/s41746-020-0288-5. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 68.Wolff R.F., Moons K.G., Riley R.D., Whiting P.F., Westwood M., Collins G.S., Reitsma J.B., Kleijnen J., Mallett S., PROBAST Group PROBAST: A tool to assess the risk of bias and applicability of prediction model studies. Ann. Intern. Med. 2019;170:51–58. doi: 10.7326/M18-1376. [DOI] [PubMed] [Google Scholar]
- 69.Fernandez-Lazaro C.I., García-González J.M., Adams D.P., Fernandez-Lazaro D., Mielgo-Ayuso J., Caballero-Garcia A., Moreno Racionero F., Córdova A., Miron-Canelo J.A. Adherence to treatment and related factors among patients with chronic conditions in primary care: A cross-sectional study. BMC Fam. Pract. 2019;20:132. doi: 10.1186/s12875-019-1019-3. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 70.Ilan Y. Overcoming Compensatory Mechanisms toward Chronic Drug Administration to Ensure Long-Term, Sustainable Beneficial Effects. Mol. Ther. Methods Clin. Dev. 2020;18:335–344. doi: 10.1016/j.omtm.2020.06.006. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 71.Vallès-Peris N., Barat-Auleda O., Domènech M. Robots in Healthcare? What Patients Say. Int. J. Environ. Res. Public Health. 2021;18:9933. doi: 10.3390/ijerph18189933. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 72.Birnbaum F., Lewis D., Rosen R.K., Ranney M.L. Patient engagement and the design of digital health. Acad. Emerg. Med. Off. J. Soc. Acad. Emerg. Med. 2015;22:754–756. doi: 10.1111/acem.12692. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 73.Mathews S.C., McShea M.J., Hanley C.L., Ravitz A., Labrique A.B., Cohen A.B. Digital health: A path to validation. Npj Digit. Med. 2019;2:38. doi: 10.1038/s41746-019-0111-3. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 74.Jakob R., Harperink S., Rudolf A.M., Fleisch E., Haug S., Mair J.L., Salamanca-Sanabria A., Kowatsch T. Factors Influencing Adherence to mHealth Apps for Prevention or Management of Noncommunicable Diseases: Systematic Review. J. Med. Internet Res. 2022;24:e35371. doi: 10.2196/35371. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 75.Hesser H. Estimating causal effects of internet interventions in the context of nonadherence. Internet Interv. 2020;21:100346. doi: 10.1016/j.invent.2020.100346. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 76.Meyerowitz-Katz G., Ravi S., Arnolda L., Feng X., Maberly G., Astell-Burt T. Rates of Attrition and Dropout in App-Based Interventions for Chronic Disease: Systematic Review and Meta-Analysis. J. Med. Internet Res. 2020;22:e20283. doi: 10.2196/20283. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 77.Blease C., Kaptchuk T.J., Bernstein M.H., Mandl K.D., Halamka J.D., DesRoches C.M. Artificial intelligence and the future of primary care: Exploratory qualitative study of UK general practitioners’ views. J. Med. Internet Res. 2019;21:e12802. doi: 10.2196/12802. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 78.van Wynsberghe A. Social robots and the risks to reciprocity. AI Soc. 2022;37:479–485. doi: 10.1007/s00146-021-01207-y. [DOI] [Google Scholar]
- 79.Loh E. Medicine and the rise of the robots: A qualitative review of recent advances of artificial intelligence in health. BMJ Lead. 2018;2:59–63. doi: 10.1136/leader-2018-000071. [DOI] [Google Scholar]
- 80.Gilvary C., Madhukar N., Elkhader J., Elemento O. The missing pieces of artificial intelligence in medicine. Trends Pharmacol. Sci. 2019;40:555–564. doi: 10.1016/j.tips.2019.06.001. [DOI] [PubMed] [Google Scholar]
- 81.Bringsjord S. Ethical robots: The future can heed us. AI Soc. 2008;22:539–550. doi: 10.1007/s00146-007-0090-9. [DOI] [Google Scholar]
- 82.da Costa P.G., Leon D. Reviewing the Concept of Technological Singularities: How Can It Explain Human Evolution? NanoEthics. 2019;13:119–130. doi: 10.1007/s11569-019-00339-2. [DOI] [Google Scholar]
- 83.Portnoff A.-Y., Soupizet J.-F. Artificial intelligence: Opportunities and risks. Futuribles. 2018;426:5–26. doi: 10.3917/futur.426.0005. [DOI] [Google Scholar]
- 84.Cook C.E., Décary S. Higher order thinking about differential diagnosis. Braz. J. Phys. Ther. 2020;24:1–7. doi: 10.1016/j.bjpt.2019.01.010. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 85.Foadi N., Varghese J. Digital competence—A Key Competence for Todays and Future Physicians. J. Eur. CME. 2022;11:2015200. doi: 10.1080/21614083.2021.2015200. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 86.Bien N., Rajpurkar P., Ball R.L., Irvin J., Park A., Jones E., Bereket M., Patel B.N., Yeom K.W., Shpanskaya K. Deep-learning-assisted diagnosis for knee magnetic resonance imaging: Development and retrospective validation of MRNet. PLoS Med. 2018;15:e1002699. doi: 10.1371/journal.pmed.1002699. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 87.Bankowitz R.A., McNeil M.A., Challinor S.M., Parker R.C., Kapoor W.N., Miller R.A. A computer-assisted medical diagnostic consultation service: Implementation and prospective evaluation of a prototype. Ann. Intern. Med. 1989;110:824–832. doi: 10.7326/0003-4819-110-10-824. [DOI] [PubMed] [Google Scholar]
- 88.Amisha, Malik P., Pathania M., Rathaur V.K. Overview of artificial intelligence in medicine. J. Fam. Med. Prim. Care. 2019;8:2328–2331. doi: 10.4103/jfmpc.jfmpc_440_19. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 89.Konttila J., Siira H., Kyngäs H., Lahtinen M., Elo S., Kääriäinen M., Kaakinen P., Oikarinen A., Yamakawa M., Fukui S., et al. Healthcare professionals’ competence in digitalisation: A systematic review. J. Clin. Nurs. 2019;28:745–761. doi: 10.1111/jocn.14710. [DOI] [PubMed] [Google Scholar]
- 90.Toscano F., O’Donnell E., Broderick J.E., May M., Tucker P., Unruh M.A., Messina G., Casalino L.P. How Physicians Spend Their Work Time: An Ecological Momentary Assessment. J. Gen. Intern. Med. 2020;35:3166–3172. doi: 10.1007/s11606-020-06087-4. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 91. Salcedo J., Rosales M., Kim J.S., Nuno D., Suen S.C., Chang A.H. Cost-effectiveness of artificial intelligence monitoring for active tuberculosis treatment: A modeling study. PLoS ONE. 2021;16:e0254950. doi: 10.1371/journal.pone.0254950.
- 92. Ahmed Z., Mohamed K., Zeeshan S., Dong X. Artificial intelligence with multi-functional machine learning platform development for better healthcare and precision medicine. Database J. Biol. Databases Curation. 2020;2020:baaa010. doi: 10.1093/database/baaa010.
- 93. Rissanen M. Translational health technology and system schemes: Enhancing the dynamics of health informatics. Health Inf. Sci. Syst. 2020;8:39. doi: 10.1007/s13755-020-00133-5.
- 94. Bhattad P.B., Jain V. Artificial Intelligence in Modern Medicine—The Evolving Necessity of the Present and Role in Transforming the Future of Medical Care. Cureus. 2020;12:e8041. doi: 10.7759/cureus.8041.
- 95. Leenes R., Palmerini E., Koops B.-J., Bertolini A., Salvini P., Lucivero F. Regulatory challenges of robotics: Some guidelines for addressing legal and ethical issues. Law Innov. Technol. 2017;9:1–44. doi: 10.1080/17579961.2017.1304921.
- 96. Brailas A. Psychotherapy in the era of artificial intelligence: Therapist panoptes. Homo Virtualis. 2019;2:68–78. doi: 10.12681/homvir.20197.
- 97. Arnold M.H. Teasing out Artificial Intelligence in Medicine: An Ethical Critique of Artificial Intelligence and Machine Learning in Medicine. J. Bioethical Inq. 2021;18:121–139. doi: 10.1007/s11673-020-10080-1.
- 98. Becker A. Artificial intelligence in medicine: What is it doing for us today? Health Policy Technol. 2019;8:198–205. doi: 10.1016/j.hlpt.2019.03.004.
- 99. He Y., Zhao H., Wong S.T.C. Deep learning powers cancer diagnosis in digital pathology. Comput. Med. Imaging Graph. Off. J. Comput. Med. Imaging Soc. 2021;88:101820. doi: 10.1016/j.compmedimag.2020.101820.
- 100. Cui M., Zhang D.Y. Artificial intelligence and computational pathology. Lab. Investig. 2021;101:412–422. doi: 10.1038/s41374-020-00514-0.
- 101. European Society of Radiology (ESR). What the radiologist should know about artificial intelligence—An ESR white paper. Insights Imaging. 2019;10:44. doi: 10.1186/s13244-019-0738-2.
- 102. Palanica A., Flaschner P., Thommandram A., Li M., Fossat Y. Physicians’ Perceptions of Chatbots in Health Care: Cross-Sectional Web-Based Survey. J. Med. Internet Res. 2019;21:e12887. doi: 10.2196/12887.
- 103. Lee D., Yoon S.N. Application of Artificial Intelligence-Based Technologies in the Healthcare Industry: Opportunities and Challenges. Int. J. Environ. Res. Public Health. 2021;18:271. doi: 10.3390/ijerph18010271.
- 104. Warwick K. Neuroengineering and neuroprosthetics. Brain Neurosci. Adv. 2018;2:2398212818817499. doi: 10.1177/2398212818817499.
- 105. Musk E. An integrated brain-machine interface platform with thousands of channels. J. Med. Internet Res. 2019;21:e16194. doi: 10.2196/16194.
- 106. Helou M.A., DiazGranados D., Ryan M.S., Cyrus J.W. Uncertainty in Decision Making in Medicine: A Scoping Review and Thematic Analysis of Conceptual Models. Acad. Med. J. Assoc. Am. Med. Coll. 2020;95:157–165. doi: 10.1097/ACM.0000000000002902.
- 107. Finn E.H., Misteli T. Molecular basis and biological function of variability in spatial genome organization. Science. 2019;365:eaaw9498. doi: 10.1126/science.aaw9498.
- 108. Chiera M., Cerritelli F., Casini A., Barsotti N., Boschiero D., Cavigioli F., Corti C.G., Manzotti A. Heart Rate Variability in the Perinatal Period: A Critical and Conceptual Review. Front. Neurosci. 2020;14:561186. doi: 10.3389/fnins.2020.561186.
- 109. Forte G., Favieri F., Casagrande M. Heart Rate Variability and Cognitive Function: A Systematic Review. Front. Neurosci. 2019;13:710. doi: 10.3389/fnins.2019.00710.
- 110. Mitchison T., Kirschner M. Dynamic instability of microtubule growth. Nature. 1984;312:237–242. doi: 10.1038/312237a0.
- 111. Kirschner M.W., Mitchison T. Microtubule dynamics. Nature. 1986;324:621. doi: 10.1038/324621a0.
- 112. Ilan Y. Overcoming randomness does not rule out the importance of inherent randomness for functionality. J. Biosci. 2019;44:132. doi: 10.1007/s12038-019-9958-3.
- 113. Ilan Y. Generating randomness: Making the most out of disordering a false order into a real one. J. Transl. Med. 2019;17:49. doi: 10.1186/s12967-019-1798-2.
- 114. Ilan Y. Advanced Tailored Randomness: A Novel Approach for Improving the Efficacy of Biological Systems. J. Comput. Biol. 2020;27:20–29. doi: 10.1089/cmb.2019.0231.
- 115. Ilan Y. Order Through Disorder: The Characteristic Variability of Systems. Front. Cell Dev. Biol. 2020;8:186. doi: 10.3389/fcell.2020.00186.
- 116. El-Haj M., Kanovitch D., Ilan Y. Personalized inherent randomness of the immune system is manifested by an individualized response to immune triggers and immunomodulatory therapies: A novel platform for designing personalized immunotherapies. Immunol. Res. 2019;67:337–347. doi: 10.1007/s12026-019-09101-y.
- 117. Ilan Y. Randomness in microtubule dynamics: An error that requires correction or an inherent plasticity required for normal cellular function? Cell Biol. Int. 2019;43:739–748. doi: 10.1002/cbin.11157.
- 118. Ilan Y. Microtubules: From understanding their dynamics to using them as potential therapeutic targets. J. Cell. Physiol. 2019;234:7923–7937. doi: 10.1002/jcp.27978.
- 119. Ilan-Ber T., Ilan Y. The role of microtubules in the immune system and as potential targets for gut-based immunotherapy. Mol. Immunol. 2019;111:73–82. doi: 10.1016/j.molimm.2019.04.014.
- 120. Forkosh E., Kenig A., Ilan Y. Introducing variability in targeting the microtubules: Review of current mechanisms and future directions in colchicine therapy. Pharmacol. Res. Perspect. 2020;8:e00616. doi: 10.1002/prp2.616.
- 121. Ilan Y. beta-Glycosphingolipids as Mediators of Both Inflammation and Immune Tolerance: A Manifestation of Randomness in Biological Systems. Front. Immunol. 2019;10:1143. doi: 10.3389/fimmu.2019.01143.
- 122. Schutte A.E., Kollias A., Stergiou G.S. Blood pressure and its variability: Classic and novel measurement techniques. Nat. Rev. Cardiol. 2022;19:643–654. doi: 10.1038/s41569-022-00690-0.
- 123. van den Bosch O.F.C., Alvarez-Jimenez R., de Grooth H.J., Girbes A.R.J., Loer S.A. Breathing variability-implications for anaesthesiology and intensive care. Crit. Care. 2021;25:280. doi: 10.1186/s13054-021-03716-0.
- 124. Boripuntakul S., Kamnardsiri T., Lord S.R., Maiarin S., Worakul P., Sungkarat S. Gait variability during abrupt slow and fast speed transitions in older adults with mild cognitive impairment. PLoS ONE. 2022;17:e0276658. doi: 10.1371/journal.pone.0276658.
- 125. Genon S., Eickhoff S.B., Kharabian S. Linking interindividual variability in brain structure to behaviour. Nat. Rev. Neurosci. 2022;23:307–318. doi: 10.1038/s41583-022-00584-7.
- 126. Saha S., Baumert M. Intra- and Inter-subject Variability in EEG-Based Sensorimotor Brain Computer Interface: A Review. Front. Comput. Neurosci. 2019;13:87. doi: 10.3389/fncom.2019.00087.
- 127. Crawford L., Mills E., Meylakh N., Macey P.M., Macefield V.G., Henderson L.A. Brain activity changes associated with pain perception variability. Cereb. Cortex. 2022;33:4145–4155. doi: 10.1093/cercor/bhac332.
- 128. Ilan Y. Next-Generation Personalized Medicine: Implementation of Variability Patterns for Overcoming Drug Resistance in Chronic Diseases. J. Pers. Med. 2022;12:1303. doi: 10.3390/jpm12081303.
- 129. Ilan Y. Constrained disorder principle-based variability is fundamental for biological processes: Beyond biological relativity and physiological regulatory networks. Prog. Biophys. Mol. Biol. 2023;180–181:37–48. doi: 10.1016/j.pbiomolbio.2023.04.003.
- 130. Ilan Y. Microtubules as a potential platform for energy transfer in biological systems: A target for implementing individualized, dynamic variability patterns to improve organ function. Mol. Cell. Biochem. 2022;478:375–392. doi: 10.1007/s11010-022-04513-1.
- 131. Ilan Y. Improving Global Healthcare and Reducing Costs Using Second-Generation Artificial Intelligence-Based Digital Pills: A Market Disruptor. Int. J. Environ. Res. Public Health. 2021;18:811. doi: 10.3390/ijerph18020811.
- 132. Speelman C.P., McGann M. How Mean is the Mean? Front. Psychol. 2013;4:451. doi: 10.3389/fpsyg.2013.00451.
- 133. Montani S., Striani M. Artificial intelligence in clinical decision support: A focused literature survey. Yearb. Med. Inform. 2019;28:120–127. doi: 10.1055/s-0039-1677911.
- 134. Kessler A., Weksler-Zangen S., Ilan Y. Role of the Immune System and the Circadian Rhythm in the Pathogenesis of Chronic Pancreatitis: Establishing a Personalized Signature for Improving the Effect of Immunotherapies for Chronic Pancreatitis. Pancreas. 2020;49:1024–1032. doi: 10.1097/MPA.0000000000001626.
- 135. Ishay Y., Kolben Y., Kessler A., Ilan Y. Role of circadian rhythm and autonomic nervous system in liver function: A hypothetical basis for improving the management of hepatic encephalopathy. Am. J. Physiol. Gastrointest. Liver Physiol. 2021;321:G400–G412. doi: 10.1152/ajpgi.00186.2021.
- 136. Kolben Y., Weksler-Zangen S., Ilan Y. Adropin as a potential mediator of the metabolic system-autonomic nervous system-chronobiology axis: Implementing a personalized signature-based platform for chronotherapy. Obes. Rev. 2021;22:e13108. doi: 10.1111/obr.13108.
- 137. Kenig A., Kolben Y., Asleh R., Amir O., Ilan Y. Improving Diuretic Response in Heart Failure by Implementing a Patient-Tailored Variability and Chronotherapy-Guided Algorithm. Front. Cardiovasc. Med. 2021;8:695547. doi: 10.3389/fcvm.2021.695547.
- 138. Azmanov H., Ross E.L., Ilan Y. Establishment of an Individualized Chronotherapy, Autonomic Nervous System, and Variability-Based Dynamic Platform for Overcoming the Loss of Response to Analgesics. Pain Physician. 2021;24:243–252.
- 139. Potruch A., Khoury S.T., Ilan Y. The role of chronobiology in drug-resistance epilepsy: The potential use of a variability and chronotherapy-based individualized platform for improving the response to anti-seizure drugs. Seizure. 2020;80:201–211. doi: 10.1016/j.seizure.2020.06.032.
- 140. Isahy Y., Ilan Y. Improving the long-term response to antidepressants by establishing an individualized platform based on variability and chronotherapy. Int. J. Clin. Pharmacol. Ther. 2021;59:768–774. doi: 10.5414/CP204000.
- 141. Khoury T., Ilan Y. Introducing Patterns of Variability for Overcoming Compensatory Adaptation of the Immune System to Immunomodulatory Agents: A Novel Method for Improving Clinical Response to Anti-TNF Therapies. Front. Immunol. 2019;10:2726. doi: 10.3389/fimmu.2019.02726.
- 142. Khoury T., Ilan Y. Platform introducing individually tailored variability in nerve stimulations and dietary regimen to prevent weight regain following weight loss in patients with obesity. Obes. Res. Clin. Pract. 2021;15:114–123. doi: 10.1016/j.orcp.2021.02.003.
- 143. Kenig A., Ilan Y. A Personalized Signature and Chronotherapy-Based Platform for Improving the Efficacy of Sepsis Treatment. Front. Physiol. 2019;10:1542. doi: 10.3389/fphys.2019.01542.
- 144. Ilan Y. Why targeting the microbiome is not so successful: Can randomness overcome the adaptation that occurs following gut manipulation? Clin. Exp. Gastroenterol. 2019;12:209–217. doi: 10.2147/CEG.S203823.
- 145. Gelman R., Bayatra A., Kessler A., Schwartz A., Ilan Y. Targeting SARS-CoV-2 receptors as a means for reducing infectivity and improving antiviral and immune response: An algorithm-based method for overcoming resistance to antiviral agents. Emerg. Microbes Infect. 2020;9:1397–1406. doi: 10.1080/22221751.2020.1776161.
- 146. Ishay Y., Potruch A., Schwartz A., Berg M., Jamil K., Agus S., Ilan Y. A digital health platform for assisting the diagnosis and monitoring of COVID-19 progression: An adjuvant approach for augmenting the antiviral response and mitigating the immune-mediated target organ damage. Biomed. Pharmacother. 2021;143:112228. doi: 10.1016/j.biopha.2021.112228.
- 147. Ilan Y., Spigelman Z. Establishing patient-tailored variability-based paradigms for anti-cancer therapy: Using the inherent trajectories which underlie cancer for overcoming drug resistance. Cancer Treat. Res. Commun. 2020;25:100240. doi: 10.1016/j.ctarc.2020.100240.
- 148. Hurvitz N., Azmanov H., Kesler A., Ilan Y. Establishing a second-generation artificial intelligence-based system for improving diagnosis, treatment, and monitoring of patients with rare diseases. Eur. J. Hum. Genet. 2021;29:1485–1490. doi: 10.1038/s41431-021-00928-4.
- 149. Ilan Y. Digital Medical Cannabis as Market Differentiator: Second-Generation Artificial Intelligence Systems to Improve Response. Front. Med. 2021;8:788777. doi: 10.3389/fmed.2021.788777.
- 150. Gelman R., Berg M., Ilan Y. A Subject-Tailored Variability-Based Platform for Overcoming the Plateau Effect in Sports Training: A Narrative Review. Int. J. Environ. Res. Public Health. 2022;19:1722. doi: 10.3390/ijerph19031722.
- 151. Azmanov H., Bayatra A., Ilan Y. Digital Analgesic Comprising a Second-Generation Digital Health System: Increasing Effectiveness by Optimizing the Dosing and Minimizing Side Effects. J. Pain Res. 2022;15:1051–1060. doi: 10.2147/JPR.S356319.
- 152. Hurvitz N., Elkhateeb N., Sigawi T., Rinsky-Halivni L., Ilan Y. Improving the effectiveness of anti-aging modalities by using the constrained disorder principle-based management algorithms. Front. Aging. 2022;3:1044038. doi: 10.3389/fragi.2022.1044038.
- 153. Gelman R., Hurvitz N., Nesserat R., Kolben Y., Nachman D., Jamil K., Agus S., Asleh R., Amir O., Berg M., et al. A second-generation artificial intelligence-based therapeutic regimen improves diuretic resistance in heart failure: Results of a feasibility open-labeled clinical trial. Biomed. Pharmacother. 2023;161:114334. doi: 10.1016/j.biopha.2023.114334.
- 154. Kolben Y., Azmanov H., Gelman R., Dror D., Ilan Y. Using chronobiology-based second-generation artificial intelligence digital system for overcoming antimicrobial drug resistance in chronic infections. Ann. Med. 2023;55:311–318. doi: 10.1080/07853890.2022.2163053.
- 155. Lehmann H., Arkadir D., Ilan Y. Methods for Improving Brain-Computer Interface: Using A Brain-Directed Adjuvant and A Second-Generation Artificial Intelligence System to Enhance Information Streaming and Effectiveness of Stimuli. Int. J. Appl. Biol. Pharm. Technol. 2023;14:42–52. doi: 10.26502/ijabpt.202124.
- 156. Adar O., Hollander A., Ilan Y. The Constrained Disorder Principle Accounts for the Variability That Characterizes Breathing: A Method for Treating Chronic Respiratory Diseases and Improving Mechanical Ventilation. Adv. Respir. Med. 2023;91:350–367. doi: 10.3390/arm91050028.
- 157. Ilan Y. The Constrained Disorder Principle Accounts for The Structure and Function of Water as An Ultimate Biosensor and Bioreactor in Biological Systems. Int. J. Appl. Biol. Pharm. Technol. 2023;14:31–41. doi: 10.26502/ijabpt.202123.
- 158. Sigawi T., Hamtzany O., Shakargy J.D., Ilan Y. The Constrained Disorder Principle May Account for Consciousness. Brain Sci. 2024;14:209. doi: 10.3390/brainsci14030209.
- 159. Ilan Y. Special Issue “Computer-Aided Drug Discovery and Treatment”. Int. J. Mol. Sci. 2024;25:2683. doi: 10.3390/ijms25052683.
- 160. Hurvitz N., Dinur T., Revel-Vilk S., Agus S., Berg M., Zimran A., Ilan Y. A Feasibility Open-Labeled Clinical Trial Using a Second-Generation Artificial-Intelligence-Based Therapeutic Regimen in Patients with Gaucher Disease Treated with Enzyme Replacement Therapy. J. Clin. Med. 2024;13:3325. doi: 10.3390/jcm13113325.
- 161. Ilan Y. Free Will as Defined by the Constrained Disorder Principle: A Restricted, Mandatory, Personalized, Regulated Process for Decision-Making. Integr. Psychol. Behav. Sci. 2024:1–33. doi: 10.1007/s12124-024-09853-9.
- 162. Ilan Y. The Constrained Disorder Principle Defines Mitochondrial Variability and Provides A Platform for A Novel Mechanism for Improved Functionality of Complex Systems. Fortune J. Health Sci. 2024;7:338–347. doi: 10.26502/fjhs.194.
- 163. Sigawi T., Israeli A., Ilan Y. Harnessing Variability Signatures and Biological Noise May Enhance Immunotherapies’ Efficacy and Act as Novel Biomarkers for Diagnosing and Monitoring Immune-Associated Disorders. Immunotargets Ther. 2024;13:525–539. doi: 10.2147/ITT.S477841.
- 164. Sigawi T., Lehmann H., Hurvitz N., Ilan Y. Constrained Disorder Principle-Based Second-Generation Algorithms Implement Quantified Variability Signatures to Improve the Function of Complex Systems. J. Bioinform. Syst. Biol. 2023;6:82–89. doi: 10.26502/jbsb.5107051.
- 165. Sigawi T., Gelman R., Maimon O., Yossef A., Hemed N., Agus S., Berg M., Ilan Y., Popovtzer A. Improving the response to lenvatinib in partial responders using a Constrained-Disorder-Principle-based second-generation artificial intelligence-therapeutic regimen: A proof-of-concept open-labeled clinical trial. Front. Oncol. 2024;14:1426426. doi: 10.3389/fonc.2024.1426426.
- 166. Ilan Y. Variability in exercise is linked to improved age-related dysfunctions: A potential role for the constrained-disorder principle-based second-generation artificial intelligence system. Res. Sq. 2023. doi: 10.21203/rs.3.rs-3671709/v1.