We read with great interest the article by Olender et al.1 reporting a novel platform for synthesizing realistic intravascular images. In this article, the authors illustrate the various benefits of medical image synthesis by showing realistic optical coherence tomography (OCT) images generated with conditional generative adversarial networks (GANs). We agree in many respects with the perspective presented by Dr Olender and colleagues, as we have also reported on the generation of cardiac intravascular images using GANs.2 However, based on the difficulties we have experienced, we would like to pose some questions to the authors.
The first question is how the validity of generated images should be evaluated. Because GANs are unsupervised learning algorithms, there is no absolute measure of the quality of image synthesis.3 The visual Turing test can be used to evaluate the realism of an image,4 but it does not guarantee that the image is pathologically correct. If the purpose of the artificial intelligence (AI) model is to see how the image changes with the addition of auxiliary variables, supervised image synthesis, as we have previously reported,5 would be a more straightforward approach.
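To make this evaluation gap concrete, the minimal sketch below (our illustration, not the authors' pipeline) computes full-reference metrics that are only defined when a paired ground-truth image exists, as in supervised synthesis; the helper name paired_scores and the stand-in arrays are hypothetical, and scikit-image is assumed to be available.

```python
# A minimal sketch (our illustration, not the authors' pipeline): full-reference
# metrics such as SSIM and PSNR are only defined when a paired ground-truth image
# exists, which is the case in supervised synthesis but not for unpaired GAN output.
import numpy as np
from skimage.metrics import peak_signal_noise_ratio, structural_similarity


def paired_scores(generated: np.ndarray, reference: np.ndarray) -> dict:
    """Compute full-reference quality scores against a paired ground-truth frame."""
    data_range = float(reference.max() - reference.min())
    return {
        "ssim": structural_similarity(reference, generated, data_range=data_range),
        "psnr": peak_signal_noise_ratio(reference, generated, data_range=data_range),
    }


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    reference = rng.random((256, 256)).astype(np.float32)  # stand-in for a real OCT frame
    generated = reference + 0.05 * rng.standard_normal((256, 256)).astype(np.float32)
    print(paired_scores(generated, reference))
    # For unpaired GAN output no such reference exists; only distribution-level
    # proxies (e.g. Frechet Inception Distance) or reader studies are available,
    # and neither certifies that the synthesized pathology is correct.
```

In other words, without a paired reference one can only ask "does this look realistic?", not "is this the correct image for this patient?", which is why we favour supervised synthesis when the goal is to study the effect of auxiliary variables.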
Our next question concerns the scalability of the authors’ method. Considering the differences in physical properties and image resolution between intravascular ultrasound and OCT, image translation between them is not an easy task and would require a very large amount of data to ensure pathological consistency. Nevertheless, given the similarity of the scanning geometries of the two modalities, it may be relatively feasible, as described by the authors. It would therefore be interesting to see whether translation is possible across more dissimilar modalities, such as coronary angioscopy. Finally, we would like to ask the authors’ opinion about the risk of misdiagnosis caused by AI-generated fake images. In generative models trained with an adversarial loss, there is a concern that the features of the generated image are strongly affected by inherent biases in the training data.6
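One way this concern could be operationalized is sketched below; this is a hypothetical audit step (not part of the authors' or our published pipelines), and the helper hallucination_map and its threshold are illustrative assumptions. The idea is to flag regions where a translated image departs strongly from a paired reference, since the hallucinated features described by Cohen et al.6 are local and can be missed by global realism scores or reader studies.

```python
# A hypothetical audit step (not part of the authors' or our published pipelines):
# flag regions where a translated image disagrees with a paired reference, since
# hallucinated features are local and can be missed by global realism scores.
import numpy as np
from skimage.metrics import structural_similarity


def hallucination_map(translated: np.ndarray, reference: np.ndarray,
                      threshold: float = 0.5) -> np.ndarray:
    """Return a boolean mask marking low local-SSIM regions (candidate hallucinations)."""
    data_range = float(reference.max() - reference.min())
    # full=True returns the per-pixel SSIM image alongside the mean score
    _, ssim_map = structural_similarity(reference, translated,
                                        data_range=data_range, full=True)
    return ssim_map < threshold
```

Any flagged region would of course still require expert review, and such a check presupposes that some paired data are retained for auditing, which again argues for keeping a supervised validation set alongside any adversarially trained model.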
Like many researchers, we believe that deep learning will become essential in the cardiovascular field. However, before such technology can be used in clinical practice, the above concerns need to be addressed.
Conflict of interest: none declared.
Data availability
The data underlying this article are available in the article.
References
- 1. Olender ML, de la Torre Hernández JM, Athanasiou LS, Nezami FR, Edelman ER. Artificial intelligence to generate medical images: augmenting the cardiologist’s visual clinical workflow. Eur Heart J Digit Health 2021;2:539–544.
- 2. Miyoshi T, Higaki A, Kawakami H, Yamaguchi O. Automated interpretation of the coronary angioscopy with deep convolutional neural networks. Open Heart 2020;7:e001177. doi:10.1136/openhrt-2019-001177.
- 3. Yi X, Walia E, Babyn P. Generative adversarial network in medical imaging: a review. Med Image Anal 2019;58:101552.
- 4. Geman D, Geman S, Hallonquist N, Younes L. Visual Turing test for computer vision systems. Proc Natl Acad Sci USA 2015;112:3618–3623.
- 5. Higaki A, Inoue K, Kinoshita M, Ikeda S, Yamaguchi O. Reconstruction of apical 2-chamber view from apical 4- and long-axis views on echocardiogram using machine learning―pilot study with deep generative modeling. Circ Rep 2019;1:197. doi:10.1253/circrep.CR-19-0011.
- 6. Cohen JP, Luck M, Honari S. Distribution matching losses can hallucinate features in medical image translation. In: International Conference on Medical Image Computing and Computer-Assisted Intervention. Cham (Switzerland): Springer; 2018, pp. 529–536.