Artificial Intelligence (AI) is now available throughout the world and is rapidly influencing many aspects of society. As with other computer-based technologies, AI may be a double-edged sword with both positive and negative consequences. The use of AI in the generation of scientific knowledge and its role in the publication process raise genuine concerns. Since AI can produce text that could serve as scholarship, it is important to understand its strengths, weaknesses and limitations. In this issue of Cell Stress and Chaperones, Dr. Thiago Gomes Heck provides a letter to the editor discussing “an experiment” in which AI (ChatGPT) was asked for information about HSP70 and also asked to write an essay. He reports that AI provided some basic facts about HSP70 and was capable of linking the information to several aspects of health and disease, but was limited in its ability to produce an essay. Since AI is in its early stages, it is hard to predict where it will lead and what may ultimately happen. In view of this, it would be wise to stay ahead of the curve by understanding the limitations of any AI-generated material and by learning how to avoid its incorrect use. I encourage you to read Dr. Heck’s letter (https://doi.org/10.1007/s12192-023-01340-1) and to consider what the future of AI may hold for academic writing.