Table 3. Participants' agreement with statements on the use of artificial intelligence (AI) in medicine (N=771).
Statement | Strongly disagree, n (%) | Disagree, n (%) | Neutral, n (%) | Agree, n (%) | Strongly agree, n (%) |
---|---|---|---|---|---|
I think that the use of AI brings benefits for the patient | 41 (5.3) | 46 (6.0) | 192 (24.9) | 354 (45.9) | 138 (17.9) |
Doctors will play a less important role in the therapy of patients in the future | 150 (19.5) | 179 (23.2) | 189 (24.5) | 199 (25.8) | 54 (7.0) |
Through the use of AI, there will be fewer treatment errors in the future | 57 (7.4) | 99 (12.8) | 270 (35.0) | 261 (33.9) | 84 (10.9) |
AI should not be used in medicine as a matter of principle | 56 (7.3) | 91 (11.8) | 225 (29.2) | 241 (31.3) | 158 (20.5) |
Doctors are becoming too dependent on computer systems | 41 (5.3) | 95 (12.3) | 201 (26.1) | 313 (40.6) | 121 (15.7) |
The testing of AI before it is used on patients should be carried out by an independent body (e.g., the government, the ministry of health, or similar) | 42 (5.4) | 34 (4.4) | 113 (14.7) | 237 (30.7) | 345 (44.7) |
I would trust the assessment of an AI more than the assessment of a doctor | 148 (19.2) | 217 (28.1) | 242 (31.4) | 125 (16.2) | 39 (5.1) |
Doctors know too little about AI to use it on patients | 71 (9.2) | 131 (17.0) | 315 (40.9) | 191 (24.8) | 63 (8.2) |
If a patient has been harmed, a doctor should be held responsible for not following the recommendations of AI | 119 (15.4) | 156 (20.2) | 263 (34.1) | 173 (22.4) | 60 (7.8) |
The influence of AI on medical treatment scares me | 74 (9.6) | 126 (16.3) | 269 (34.9) | 220 (28.5) | 82 (10.6) |
The use of AI prevents doctors from learning to make their own correct judgement of the patient | 82 (10.6) | 112 (14.5) | 285 (37.0) | 215 (27.9) | 77 (10.0) |
If AI predicts a low chance of survival for the patient, doctors will not fight for that patient's life as much as before | 156 (20.2) | 135 (17.5) | 238 (30.9) | 183 (23.7) | 59 (7.7) |
The use of AI is changing the demands of the medical profession | 62 (8.0) | 101 (13.1) | 248 (32.2) | 265 (34.4) | 95 (12.3) |
I would like my personal medical treatment to be supported by AI | 104 (13.5) | 142 (18.4) | 284 (36.8) | 187 (24.3) | 54 (7.0) |
I would make my anonymous patient data available for noncommercial research (universities, hospitals, etc.) if this could improve future patient care | 92 (11.9) | 70 (9.1) | 197 (25.6) | 247 (32.0) | 165 (21.4) |
AI-based decision support systems for doctors should only be used for patient care if their benefit has been scientifically proven | 52 (6.7) | 73 (9.5) | 261 (33.9) | 276 (35.8) | 109 (14.1) |
I am more afraid of a technical malfunction of AI than of a wrong decision by a doctor | 60 (7.8) | 79 (10.2) | 244 (31.6) | 261 (33.9) | 127 (16.5) |
I am not worried about the security of my data | 131 (17.0) | 131 (17.0) | 225 (29.2) | 214 (27.8) | 70 (9.1) |
By using AI, doctors will again have more time for the patient | 71 (9.2) | 110 (14.3) | 284 (36.8) | 228 (29.6) | 78 (10.1) |
A doctor should always have the final control over diagnosis and therapy | 47 (6.1) | 45 (5.8) | 161 (20.9) | 228 (29.6) | 290 (37.6) |
I am worried that AI-based systems could be manipulated from the outside (terrorists, hackers, ...) | 56 (7.3) | 92 (11.9) | 220 (28.5) | 236 (30.6) | 167 (21.7) |
The use of AI impairs the doctor-patient relationship | 59 (7.7) | 107 (13.9) | 256 (33.2) | 218 (28.3) | 131 (17.0) |
The use of AI is an effective instrument against the overload of doctors and the shortage of doctors | 49 (6.4) | 81 (10.5) | 255 (33.1) | 287 (37.2) | 99 (12.8) |
I would like my doctor to override the recommendations of AI if he comes to a different conclusion based on his experience or knowledge | 40 (5.2) | 74 (9.6) | 235 (30.5) | 270 (35.0) | 152 (19.7) |
The use of AI will reduce the workload of doctors | 56 (7.3) | 74 (9.6) | 223 (28.9) | 297 (38.5) | 121 (15.7) |
AI: artificial intelligence.