The future is perhaps more uncertain than ever. Put the crises in healthcare and cost of living to one side, and focus your minds on artificial intelligence. Admittedly, the world of ChatGPT and large language models can be hard to make sense of, but it is something we all must do.
These technologies are transforming our world at warp speed, and the article we publish this month will already be yesterday’s news, albeit still worth reading.1 Large language models are set to either destroy the world or solve all our problems – depending on your perspective. The truth, as ever, probably lies somewhere between those two extremes.
At this stage in their evolution, large language models are producing more questions than answers, but the fact is that, although some people would prefer to see further development banned, large language models are here to stay. On that basis, a smarter response is to learn how to work with large language models and use them to best advantage.
Large language models and other forms of artificial intelligence might help with analysis and interpretation of research findings, for example, more quickly producing a research report for publication.2 They might help improve critical appraisal and understanding of published research.3 They might help us appreciate the public health harms of industry.4 They might help medical professionals better manage their finances.5 There is a great deal that large language models might help us do, but their value in these roles is yet to be properly evaluated, and they remain dependent on the “training” and inputs delivered by humans.
That said, while it seems unlikely that large language models can effectively replace humans in delivering healthcare, they might take over some labour-intensive tasks. Large language models work quickly, but the “human touch” that goes into clinical decision making is something that they find hard to replicate.
The problem, therefore, may be the unpredictability and inconsistency of humans – traits that machines find hard to mimic. Indeed, these traits make us human. Humans, unlike machines, are also able to take responsibility and be held accountable, which is one reason why machines are unlikely to replace humans in clinical decision making.
The same logic applies to articles submitted to medical journals. Journals are already receiving manuscripts that are generated by artificial intelligence, and it seems futile to fight the future. But the role of artificial intelligence may be undeclared. Our policy, in line with COPE and other emerging guidance, is that artificial intelligence can be used to generate manuscripts but the contribution of AI must be transparent. Tell us how and why artificial intelligence was used, and to what extent. Nonetheless, artificial intelligence cannot be an author of a manuscript for the simple reason that authorship comes with responsibility for the integrity of the work. Forms of artificial intelligence that are applicable to scientific publishing are sophisticated puppets, and the puppet masters pulling the strings – for noble or malign intent – are human, all too human.
References
- 1. Thirunavukarasu AJ. Large language models will not replace healthcare professionals: curbing popular fears and hype. J R Soc Med 2023; 116: 181–182. DOI: 10.1177/01410768231173123
- 2. Cornforth F, Webber L, Kerr G, et al. Impact of COVID-19 vaccination on COVID-19 hospital admissions in England during 2021: an observational study. J R Soc Med 2023; 116: 167–176. DOI: 10.1177/01410768231157017
- 3. Oxman AD, Chalmers I and Dahlgren A. Key concepts for informed health choices. 3.1: evidence should be relevant. J R Soc Med 2023; 116: 177–180. DOI: 10.1177/01410768221140768
- 4. Ashton J. The globalisation of the chemical industry and the public health scandal of Forever Chemicals. J R Soc Med 2023; 116: 183–184. DOI: 10.1177/01410768231172305
- 5. Medisauskaite A, Viney R, Rich A, et al. Financial difficulty in the medical profession. J R Soc Med 2023; 116: 160–166. DOI: 10.1177/01410768231172151
