Abstract
Decision-making in fertility care is on the cusp of a significant frameshift. Online tools that integrate artificial intelligence into decision-making across all aspects of ART are rapidly emerging. These tools have the potential to improve outcomes and to shift decision-making from one based on traditional provider-centric assessments toward a hybrid triad of expertise, evidence, and AI-driven data analytics. We can look forward to a time when AI will be the third part of a provider’s toolbox, complementing expertise and the medical literature to enable ever more accurate predictions and outcomes in ART. In their fully integrated format, these tools will be part of a digital fertility ecosystem of analytics embedded within an EMR. To date, the impact of AI on ART outcomes is inconclusive. No prospective studies have shown clear-cut benefit or cost reductions over current practices, but we are very early in the process of developing and evaluating these tools. We owe it to ourselves to begin examining these AI-driven analytics and to develop a very clear idea of where we can and should go before we roll these tools into clinical care. Thoughtful scrutiny is essential lest we find ourselves trying to modulate and modify after these tools have entered our clinics and patient care. The purpose of this commentary is to highlight the evolution and impact AI has had in other fields relevant to the fertility sector and to describe a vision for applications within ART that could improve outcomes, reduce costs, and positively impact clinical care.
Keywords: Artificial intelligence, Personalized medicine, Decision support systems, Image analysis, Electronic medical records, Convolutional neural networks, Deep learning, Feature engineering
Introduction
The collection and analysis of data from the clinic and embryology lab to generate impressions and predict outcomes remains the overarching driver in assisted reproductive technology (ART). Advancements from ovarian stimulation protocols to preimplantation genetic testing (PGT) have led to improvements in live birth rates from 6% in the early years to age-stratified rates as high as 60% per transfer [1]. For the past 50 years, the ART decision-making behind these improvements has been a clinician-centric process. This paradigm has been informed by two key principles: the training and experience of a provider, commonly referred to as expertise, and evidence described in the relevant literature, commonly referred to as evidence-based medicine (EBM). This iterative process of data collection and interpretation based on these two key elements is about to undergo significant change with the advent of artificial intelligence (AI) in clinical care. Artificial intelligence will soon be the third tool to complement this framework and shift the decision process from a provider-centric model to a data-centric, quantitative model for personalized care.
Since the 1990s, computer decision tools and analytics have improved decision-making and assessments across a wide swath of clinical care, ranging from radiology to pathology to dermatology. More specialties are evaluating how these tools can support their practices. Several of these software innovations are relevant to ART, with the potential to improve communication, outcomes, and cost efficiencies. The aggregate end point in ART will be an integrated digital ecosystem that leverages AI to create a start-to-finish suite of software applications, beginning with a personalized plan and outcome estimates and extending through treatment and image analytics in the embryology lab. The purpose of this commentary is to explore the vision of how AI can complement traditional paradigms for decision-making in critical areas of the ART experience and suggest what the future could look like within this digital environment. In the very near future, we will incorporate computer and data science to move ART to a new level. Four areas will be explored: the impact of AI on personalized care, decision support systems to supplement day-to-day decision-making, image analysis, and embedded predictive analytics in next-generation electronic health or medical records (EMR). Descriptions of the technological aspects of these tools are thoughtfully reviewed in two recent publications and will not be repeated here [2, 3]. Instead, this commentary will step away from the discussion of AI software and the underlying algorithms and attempt to frame and answer two questions: How can we incorporate AI into what we do to positively impact patient care? And what could the intersection of computer science and clinical care look like in ART?
Developing a digital ecosystem for ART
Personalized care, ART, and AI: the power to predict
The initial suggestion that clinical decisions could have a quantitative metric beyond simple impressions was made in 1960 [4]. Early, forward-thinking applications of this concept to IVF using computer analytics had variable success and minimal clinical uptake [5–7]. Lest judgment be passed on these early studies and computer technologies in ART, early models were constrained by limited datasets, older platforms, and, in some cases, a lack of validation in settings outside the clinic in which the model was developed [8]. None has been incorporated into an EMR or entered routine clinical care. With the advent of personalized medicine and rapidly evolving software, the opportunity now exists to leverage these tools into a better clinical experience for provider and patient.
Though genomic data has been promoted as the foundation of personalized care, these tools will not be relevant on a large scale in the near future. But personalized medicine in fertility need not await the definition of relevant genes. We are now empowered by datasets and AI platforms to move effectively toward a personalized strategy. A basic database familiar to any practice includes maternal and paternal age; testing for anti-Müllerian, follicle stimulating, and luteinizing hormones; ultrasound imaging for antral follicle counts; body mass index; and sperm profiles. Though fertility has a multitude of operative factors impacting outcome, these data points offer sufficient detail to formulate a personalized plan if deeply analyzed with AI. Robust models are now within reach to allow probability estimates stratified by treatment option [9]. This predictive platform will include, in a single document, side-by-side comparisons of treatment options and outcomes over several cycles, the risk of multiples, and AI-driven protocol recommendations for ovarian stimulation. In addition to generating this document at the start of the fertility journey, AI tools will include analytics to adjust predictions throughout treatment in a continuous feedback loop of data collection, assessment, and prediction, enabling a machine learning model to yield revised predictions within the context of a patient’s unique profile.
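As a minimal sketch of how such treatment-stratified probability estimates could be generated from the basic intake dataset described above, the snippet below trains one gradient-boosted classifier per treatment option on a clinic’s historical cycles. The column names, the CSV export, and the choice of model are illustrative assumptions, not a description of any validated commercial tool.

```python
# Illustrative sketch only: a per-treatment outcome model trained on a clinic's
# historical cycles. Column names and the CSV file are hypothetical assumptions.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

FEATURES = ["maternal_age", "paternal_age", "amh", "fsh", "lh",
            "antral_follicle_count", "bmi", "total_motile_sperm"]

def train_outcome_model(cycles: pd.DataFrame, treatment: str) -> GradientBoostingClassifier:
    """Fit a live-birth probability model for one treatment option (e.g., 'IUI' or 'IVF')."""
    subset = cycles[cycles["treatment"] == treatment]
    X_train, X_test, y_train, y_test = train_test_split(
        subset[FEATURES], subset["live_birth"], test_size=0.2, random_state=0)
    model = GradientBoostingClassifier().fit(X_train, y_train)
    print(f"{treatment}: held-out accuracy {model.score(X_test, y_test):.2f}")
    return model

if __name__ == "__main__":
    cycles = pd.read_csv("historical_cycles.csv")           # hypothetical export
    models = {tx: train_outcome_model(cycles, tx) for tx in ["IUI", "IVF"]}
    new_patient = cycles[FEATURES].iloc[[0]]                 # stand-in intake profile
    for tx, model in models.items():                         # side-by-side estimates
        print(tx, round(model.predict_proba(new_patient)[0, 1], 2))
```

In practice, the same trained models could be re-queried after each monitoring visit so that the probabilities displayed to the patient reflect the feedback loop described above.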
The vision is to deliver care with greater precision and personalization. Figures 1 and 2 illustrate what we can expect from an AI-driven system. A typical scenario starts with a patient who has lab studies drawn on provider recommendation or secured through a variety of direct-to-consumer home tests. In this AI-driven system, a set of options is generated for review and discussion prior to a consultation to evaluate options. In the two examples characterized in the figures, treatment and outcomes are based on IUI and IVF but could be drafted for any provider-preferred options. Near- and far-term plans can be drafted and immediate treatment initiated. The difference between this and current practice is a transition to a data-driven and informed assessment to supplement provider-centric care. The final tool could be available on any platform, from desktop to tablet to smartphone, as patients transition to digital formats as the expected modus operandi for their care. In short, data from the clinic and lab, ultrasound images, lifestyle, social determinants, demographic and epidemiological details, and outcomes from earlier interventions (if any) can be used to generate a list of options offered in a hierarchy of effectiveness and a decision flow chart to enable a personalized care plan that is “predictive, preventive, personalized, and participatory.”
Optimizing the IVF process: decision support systems
Decision support systems (DSS) are examples of early applications of AI with a positive impact on care [10, 11]. A DSS is a predictive algorithm for making automated recommendations to guide clinical decision-making. It may be independent or part of an EMR. In clinical care, DSS have entered the framework to assist in prescribing, diagnosing, and guiding care decisions [12, 13]. DSS offer an opportunity in ART to progress from decisions based on clinical impressions and expertise to something more compositionally rich and integrated. Such a platform in IVF, for example, could offer recommendations at critical points in the treatment plan, adjusted success probabilities based on data generated during ovarian stimulation, and a more systematic recording of the process that calls out unusual or outlier lab results for expert reevaluation. In one application of a DSS to IVF, an algorithm was trained to enhance the decision-making of the clinical team during ovarian stimulation, suggesting next steps and follow-up and identifying the ideal time to trigger for oocyte retrieval [14]. If proven in a broader clinical context, these systems will supplement a provider’s assessments and expertise in management and, using a machine learning model, enable more refined stimulation protocols and cycle management with continued use.
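To make the idea concrete, the sketch below shows the shape of a stimulation-day recommendation. It is not the published system cited above [14]; the follicle-size and estradiol thresholds and the decision logic are assumptions chosen purely for illustration and would require clinical validation.

```python
# Illustrative sketch of a stimulation-day recommendation; thresholds and the
# decision logic are assumptions for demonstration, not a validated protocol.
from dataclasses import dataclass
from typing import List

@dataclass
class MonitoringVisit:
    cycle_day: int
    estradiol_pg_ml: float
    follicle_sizes_mm: List[float]   # lead-follicle diameters on ultrasound

def recommend(visit: MonitoringVisit, current_dose_iu: int) -> str:
    mature = sum(1 for f in visit.follicle_sizes_mm if f >= 17)
    if mature >= 3:                                   # enough lead follicles
        return "Recommend trigger tonight; schedule retrieval in ~36 h"
    if visit.estradiol_pg_ml < 200 and visit.cycle_day >= 6:
        return f"Consider increasing dose above {current_dose_iu} IU; flag for provider review"
    return f"Continue {current_dose_iu} IU; repeat labs and ultrasound in 2 days"

print(recommend(MonitoringVisit(8, 950.0, [18.0, 17.5, 16.0, 12.0]), 150))
```

A production DSS would replace these hand-written rules with a model trained on prior cycles, but the input (monitoring data) and output (a reviewable recommendation) would look much the same.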
DSS could be incorporated into future EMRs for immediate assessment, issuing a recommendation to a provider for approval or rejection. Figures 3 and 4 illustrate an example of a potential DSS application in IVF. In this hypothetical setting, a recommendation is made by the algorithm and forwarded to a provider using a HIPAA-compliant platform and a smartphone display of the DSS recommendation. In this case, the system suggested continuing the dose of rec FSH 150 IU × 3 days. The provider may opt to approve. When linked to an automated and secure texting platform, the system is programmed to send prewritten instructions via text or Twitter without any intermediaries such as nursing teams, bypassing the need for triangulation from provider to nurse to patient. If the provider disagrees, the smartphone app can migrate to a more extensive database for review and adjustment (if any) of the recommendation.
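The approve-or-override loop described above can be sketched in a few lines. The message transport here is a placeholder function, not any real HIPAA-compliant messaging service, and the patient identifier and instruction text are hypothetical.

```python
# Sketch of the approve/override loop; send_secure_text() is a placeholder for
# whatever HIPAA-compliant messaging service a clinic actually uses.
from dataclasses import dataclass

@dataclass
class Recommendation:
    patient_id: str
    text: str                      # e.g., "Continue rec FSH 150 IU x 3 days"

def send_secure_text(patient_id: str, message: str) -> None:
    print(f"[secure message -> {patient_id}] {message}")    # placeholder transport

def handle_provider_response(rec: Recommendation, approved: bool) -> None:
    if approved:
        # Prewritten instructions go straight to the patient, no triangulation.
        send_secure_text(rec.patient_id, rec.text)
    else:
        # Route back to the full chart/DSS view for manual adjustment.
        print(f"Recommendation for {rec.patient_id} opened for provider edit")

handle_provider_response(Recommendation("pt-001", "Continue rec FSH 150 IU x 3 days"), approved=True)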
In addition to clinical care in the example described above, future AI applications and support systems in ART will extend to IVF operations. Operational tools for clinic management and workflow include automated scheduling for appointments and retrievals during IVF cycles. For example, an algorithm could be trained to identify the single day for monitoring during ovarian stimulation based on patient profile. In addition, the algorithm could offer at trigger several potential days for retrieval, enabling level-loading of retrievals across several days without compromising success. These two examples add transparency to the IVF calendar and enable crosstalk among clinical care, lab, and administration for efficient scheduling. Intersecting operational and clinical venues should help the workforce maintain adequate resources, secure timely access to supply chains, and anticipate demand and needs based on cycle volume.
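The level-loading idea can be illustrated with a simple greedy assignment: each patient has a short list of clinically acceptable retrieval days, and the scheduler places her on the least-loaded one. The daily capacity and the example patient lists below are assumptions for demonstration.

```python
# Sketch of level-loading retrievals: each patient has a few clinically
# acceptable days, and we greedily pick the least-loaded one. The capacity of
# six retrievals per day is an illustrative assumption.
from collections import defaultdict
from typing import Dict, List

def assign_retrievals(candidates: Dict[str, List[str]], daily_capacity: int = 6) -> Dict[str, str]:
    load: Dict[str, int] = defaultdict(int)
    schedule: Dict[str, str] = {}
    for patient, acceptable_days in candidates.items():
        day = min(acceptable_days, key=lambda d: load[d])   # least-loaded acceptable day
        if load[day] >= daily_capacity:
            raise RuntimeError(f"No capacity for {patient}; flag for manual scheduling")
        load[day] += 1
        schedule[patient] = day
    return schedule

print(assign_retrievals({
    "pt-001": ["Mon", "Tue"],
    "pt-002": ["Mon"],
    "pt-003": ["Mon", "Tue", "Wed"],
}))
```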
Extracting image attributes to guide decision-making: image analysis tools
Software systems for assessing fine-grained image attributes and generating classifications are recent additions to the catalogue of image analysis programs with clinical application [15]. These systems have rapidly evolved into sensitive tools in several spheres of clinical care [16]. Convolutional neural networks have enabled the transition of image interpretation from qualitative assessments by expert observers to AI-supported scoring algorithms. These systems can be adapted to a variety of tasks where objective and quantitative assessments of images may improve outcomes and reduce costs. Systems are in place to detect attributes, patterns, and textures in regions of interest within an image and to create catalogues of normal and abnormal images for comparison. Algorithms to sort the images can then be written using these annotated archives. These systems are in place across a spectrum of specialties that use images for diagnosis and treatment planning [17–21]. Two applications of image analysis for ART are oocyte and embryo assessments.
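For orientation, a convolutional neural network of the kind referenced above can be very small in outline. The toy network below scores a single grayscale microscopy image into two quality classes; the architecture, input size, and class count are assumptions for illustration, not any published embryology model.

```python
# Minimal illustrative CNN for scoring a single microscopy image; the
# architecture, image size, and two-class output are assumptions, not a
# published embryology model.
import torch
import torch.nn as nn

class TinyEmbryologyCNN(nn.Module):
    def __init__(self, n_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 56 * 56, n_classes)   # for 224x224 input

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        return self.classifier(x.flatten(1))

model = TinyEmbryologyCNN()
scores = torch.softmax(model(torch.randn(1, 1, 224, 224)), dim=1)   # dummy grayscale image
print(scores)   # probability per quality class
```

Trained on an annotated archive of normal and abnormal images, the same structure, scaled up, is what underlies the clinical systems cited above.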
Oocyte assessments and image analytics: at the start of the embryology journey
Previous attempts to assess oocyte quality have relied on a labor-intensive, manual process evaluating attributes such as the zona pellucida, cytoplasm, polar body, and perivitelline space [22–24]. None of these studies has conclusively proven that any feature constellation can predict success. Lack of accuracy aside, the labor-intensive scrutiny in this model precludes daily application in a busy lab and constrains uptake. Specialized software to enable automated grading of oocytes is a logical next-phase extension of image analysis platforms to ART, streamlining workflow and improving outcomes.
Early software applications predictive of which oocytes would be most likely to result in live births were largely mathematical models and served as the basis for subsequent clinical study of oocyte image analytics [25, 26]. Much has changed since these early descriptions [27]. Future AI-driven image analytics will be able to identify which oocytes to freeze in donor and fertility preservation programs or to inseminate for IVF and, in combination with blastocyst morphology, add insight into embryo selection to identify those with the highest likelihood of implantation and perhaps those that should (or should not) be biopsied. Definition of the attributes associated with oocytes most likely to succeed in the journey to blastocyst and implantation would spare the labor-intensive and costly process of sperm injection or insemination and advanced culture; guide decisions regarding the number of oocytes needed in frozen inventory to maximize family building options; or predict outcome with donor oocytes. In their fully integrated form, these models will automatically and almost instantaneously identify favorable oocyte attributes (e.g., pixel distribution or neighborhoods in the ooplasm or cellular symmetry) but also may identify predictive attribute patterns not apparent to visual inspection. These software applications could be secure, cloud based, and accessed regardless of lab location, or attached to an in-house hardware system.
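The "pixel distribution in the ooplasm" idea above is essentially feature engineering. A crude, hedged sketch is shown below: simple pixel statistics from a central ooplasm crop feed a plain classifier. The crop coordinates, the features, and the randomly generated stand-in images and outcomes are all assumptions; none of this is a validated oocyte predictor.

```python
# Illustrative feature-engineering sketch: crude pixel statistics from an
# ooplasm region feed a simple classifier. The crop coordinates, features,
# and labels are assumptions, not validated oocyte predictors.
import numpy as np
from sklearn.linear_model import LogisticRegression

def ooplasm_features(image: np.ndarray) -> np.ndarray:
    h, w = image.shape
    crop = image[h // 4: 3 * h // 4, w // 4: 3 * w // 4]    # rough central ooplasm crop
    hist, _ = np.histogram(crop, bins=8, range=(0, 255), density=True)
    return np.concatenate([[crop.mean(), crop.std()], hist])

rng = np.random.default_rng(0)
images = rng.integers(0, 256, size=(40, 128, 128))           # stand-in oocyte images
labels = rng.integers(0, 2, size=40)                         # stand-in blastocyst outcomes
X = np.array([ooplasm_features(img) for img in images])
clf = LogisticRegression(max_iter=1000).fit(X, labels)
print("P(blastocyst) for first oocyte:", round(clf.predict_proba(X[:1])[0, 1], 2))
```

A deep learning system would learn such features directly from the pixels rather than from hand-crafted statistics, which is precisely why it may surface patterns not apparent to visual inspection.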
In a recent study published in abstract form only, a machine learning-enabled image analysis of oocytes predicted fertilization (91%) and blastocyst formation (63%) with greater accuracy than a team of embryologists using standard scoring systems [28]. The bottom-line impact of this technology will be to identify the best oocytes for insemination or freezing, leading to improved clinical efficiencies, cost reductions in the embryology lab, and an opportunity to normalize embryology decisions across practices. Such AI analytics of oocytes will be complementary to video monitoring and can be incorporated into video platforms for an integrated system of assessment. One study suggested that oocyte assessment may be more powerful than video monitoring in predicting blastocyst formation [29].
Embryo assessments and image analytics: at the final end point of the journey
The idea that embryo morphology is predictive of implantation potential is the cornerstone of IVF success: the better embryologists are at scoring, the better the outcomes. Conventional assessment of embryo quality through static, daily measurements using light microscopy is complicated and requires skill and time [30]. Earlier studies of blastocyst scoring systems used hand-annotated assessments of blastocyst features such as cellular symmetry, fragmentation, and the inner cell mass. Though a numerical assignment is made, the process remains subjective and may be impacted by factors common to any repetitive technique of observation, such as fatigue and inter- and intra-observer variability. Future systems in the embryology lab will include software to automatically and reliably rank embryos, regardless of the number of cycles assessed (no fatigue factor), and to identify which should or should not be biopsied. These tools, when matched to a reliable system of oocyte identification, could result in cost and time savings.
Early trials in computer applications to embryo assessments used more basic tools than the analytics and convolutional neural networks now available [31]. Newer software has enabled analytics applied to single microscopy images to assess quality and predict outcome from cleavage stage embryos to blastocysts. Future systems for embryo selection will assign an AI-generated score to each embryo to assist in the selection process. Two recent studies are notable. One publication described a two-phase study using proprietary software and a relatively low number of images (approximately 9000) for training and challenge to predict blastocyst viability. The algorithm predicted viability (sensitivity) at 74% and non-viability at 65% [32]. In a study of a deep learning-based neural network for embryo assessment, the network demonstrated 90% accuracy in choosing the best quality blastocysts and outperformed trained embryologists in predicting implantation potential [33]. Video morphokinetics were an early entry into software-supported embryo assessments, using video systems to identify key events associated with high implantation rates. Cost and variable improvements in pregnancy rates have limited greater uptake, with clear regional preferences in evidence [34, 35]. These systems may have a role in combination with static image assessments using the analytics noted above in an integrated platform to identify the best embryos.
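At the cohort level, the downstream use of such an AI-generated score is straightforward: rank the embryos, pick the transfer candidate, and flag low scorers that may not justify biopsy. The scores and the 0.30 biopsy threshold below are purely illustrative assumptions.

```python
# Sketch of cohort-level use of an AI viability score: rank embryos, pick the
# transfer candidate, and flag low scorers to skip biopsy. Scores and the 0.30
# biopsy threshold are illustrative assumptions.
from typing import Dict, List, Tuple

def rank_embryos(scores: Dict[str, float], biopsy_floor: float = 0.30) -> Tuple[List[str], List[str]]:
    ranked = sorted(scores, key=scores.get, reverse=True)
    biopsy_candidates = [e for e in ranked if scores[e] >= biopsy_floor]
    return ranked, biopsy_candidates

cohort = {"E1": 0.82, "E2": 0.55, "E3": 0.22, "E4": 0.74}    # model-predicted viability
ranked, to_biopsy = rank_embryos(cohort)
print("Transfer priority:", ranked)
print("Consider for biopsy:", to_biopsy)
```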
In an ideal setting, an image analysis tool would offer guidance to supplement the assessments of the embryology team in determining which embryo to transfer; reduce the need for PGT; and, when indicated, identify those embryos with a high likelihood of abnormalities, thus limiting the number biopsied. Extending these concepts, an AI-trained model may be able to identify and extract a set of features that reliably identifies an embryo at risk for aneuploidy and reduces the need for PGT. A final system for image analysis in ART would incorporate the attributes cited in Table 1. In addition to possible improved outcomes and workflow, the ease of image capture through use of smartphones could enable these tools to be deployed to regions of high need and low penetration of skilled clinicians, embryologists, and technicians, thus expanding care to neglected populations regardless of location [36].
Table 1. Attributes of a final system for image analysis in ART

A robust image analysis system for quantifying egg, pronuclear, and blastocyst morphology
° A systematic recording of the process and opportunities for comparison within and among patients
° Definitions of attributes or ensembles across all three structures positively and negatively associated with outcomes

A robust prediction phase and deep learning platform using the morphology of:
° Oocytes, pronuclear embryos, and blastocysts to predict outcomes
° Insight should be provided as early as possible to minimize the number of oocytes inseminated, blastocysts frozen, or biopsied

A set of predictive tools to enhance decision-making regarding best use of oocytes and embryos:
° Allowing success-probability estimation attached to each oocyte and blastocyst
° Calling out unusual morphology and edge cases for inspection by experts

A full system allowing storage of images, features, and outcomes with easy interface by experts and data sharing as needed within and across electronic medical record systems
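The last attribute in Table 1, storage of images, features, and outcomes in a shareable form, can be sketched as a minimal record structure. The field names below are assumptions chosen to mirror the table, not any vendor’s schema.

```python
# Sketch of a minimal record tying an image to its extracted features and the
# eventual outcome, per the Table 1 attributes. Field names are assumptions.
from dataclasses import dataclass, field
from typing import Dict, Optional

@dataclass
class LabImageRecord:
    patient_id: str
    stage: str                                  # "oocyte", "pronuclear", "blastocyst"
    image_uri: str                              # pointer to secure image store
    features: Dict[str, float] = field(default_factory=dict)   # extracted attributes
    predicted_success: Optional[float] = None   # model probability, if scored
    outcome: Optional[str] = None               # filled in later, e.g., "live_birth"
    flagged_for_review: bool = False            # unusual morphology / edge case

record = LabImageRecord("pt-001", "blastocyst", "s3://lab-images/pt-001/d5_01.png",
                        features={"icm_grade": 1.0, "symmetry": 0.9},
                        predicted_success=0.64)
print(record)
```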
Redesigning electronic medical records: embedded analytics
The EMR is the linchpin for transitioning to a digital ecosystem in ART at scale. A cognitive redesign of the EMR in fertility care will require a step away from the idea that the EMR is a dedicated record-keeping tool toward a more expansive view of it as a robust data management and analytical system. This frameshift would also include a system for decision-making and prediction to inform decisions in real time with immediate connectivity to patients. EMRs within healthcare delivery systems are leveraging this potential and illustrate the extraordinary opportunity at the intersection of EMRs, databases, and analytics [37]. There are two relevant takeaways for ART from these more global observations. First is the vision that EMRs of the future will have embedded analytics to interpret data and, through machine learning, improve predictions as more data is collected and analyzed in an ever-refined forward movement to assure the best outcome. Second is a redefinition in fertility care of the term data. Data in this sense extends across all information types: unstructured text sources, digital clinical records, ultrasound images, and the entire panoply of images captured in the embryology lab.
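The "improve predictions as more data is collected" loop can be sketched with an incrementally updated model: each batch of completed cycles recorded in the EMR refreshes the predictor. The use of scikit-learn's SGDClassifier with partial_fit is a stand-in for whatever an EMR vendor would actually embed, and the simulated monthly exports below are placeholder data.

```python
# Sketch of the embedded-analytics feedback loop: the prediction model is
# updated as each batch of completed cycles lands in the EMR. The use of
# SGDClassifier.partial_fit is a stand-in, not any vendor's implementation.
import numpy as np
from sklearn.linear_model import SGDClassifier

model = SGDClassifier(loss="log_loss")            # logistic-regression-style learner
classes = np.array([0, 1])                        # no live birth / live birth

def update_with_new_cycles(X_new: np.ndarray, y_new: np.ndarray) -> None:
    model.partial_fit(X_new, y_new, classes=classes)   # incremental learning step

rng = np.random.default_rng(1)
for month in range(3):                            # simulate monthly EMR exports
    X_batch = rng.normal(size=(50, 8))            # stand-in cycle features
    y_batch = rng.integers(0, 2, size=50)
    update_with_new_cycles(X_batch, y_batch)
    print(f"Month {month}: refreshed predictions", model.predict_proba(X_batch[:1]).round(2))
```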
Achieving these goals will require continued efforts to redefine the EMR, work already underway within many systems available in the fertility space. Machine learning will be integrated within an EMR system, and data models standardized so that similar products work in a similar fashion across a variety of platforms and information silos. The contemporary EMR will also have capabilities for the immediate transfer of information and clinical recommendations in real time, with maximum security, to mobile devices [38, 39]. These potentials will require upgrades to realize a strategic plan that understands the communication needs of a contemporary patient population who rely on the anytime, anywhere profile of mobile tools. Such mobile project development will require privacy and security compliance and insulation against cyber threats. The population of prospective patients entering the fertility care pipeline bring with them expectations tied intimately to mobile devices. These patients fall into the category of digital natives, those who look to mobile devices as the go-to tool for communication (as opposed to digital immigrants, who came into the technology late with a background in more traditional tools like dial-up telephones) [40].
Summary
There is a temptation at the crossroads of science, technology, and patient care to think of one tool as a solution for many problems. Though solo breakthroughs are rare, we can say this: disruptive technology to revamp the entire decision-making process in ART is at hand. Conventional practices will change dramatically as we move toward these tools for more precise predictions and best practice recommendations for patient care. The best way to assure that end point is through development sensitive to the needs of stakeholders in ART: clinicians, embryologists, nursing teams, administrators, and industry collaborators. Taking care of patients remains a nuanced enterprise: clinical decisions in ART are made in the context of a patient’s personal, emotional, and clinical needs with equal emphasis. Providers will remain preeminent in this domain. Several years ago, the prediction was made that AI would supplant radiologists, dermatologists, and pathologists [41]. None of this ever happened. If anything, these algorithms have not only enhanced the ability to deliver quality care but also expanded the reach of these diagnostic tools across greater demographics, regardless of geolocation.
The greatest challenge to AI in these healthcare domains rests in part with the technology but also with assuring its effectiveness and cognitive design for easy uptake in daily clinical practice. The true impact of these technologies will be fully appreciated when they are incorporated into a seamless system capable of data analysis across the clinical and lab spectrum, regardless of EMR, to improve what we do, how we communicate, and how we manage workflow. We are at a point early enough in their deployment that we can design how these tools will be integrated into ART and have in place checks and balances for their effective and thoughtful use.
We have been here before. A 1937 editorial, “Conception in a Watch Glass,” offered this prescient insight into IVF: Truly it seems as if the forge were being warmed and another link may be welded into the chain [42]. Here we are again. Good things await as we evaluate AI in fertility care and forge another small link in the chain. Patiently and thoughtfully.
Acknowledgements
The author gratefully acknowledges Jeremy Stimson and Jay Yoo for assistance with the images and illustrations.
Author contribution
The content was written entirely by the sole author.
Code availability
Not applicable.
Declarations
Conflict of interest
The author declares no competing interests.
Footnotes
Publisher’s note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
References
1. Cohen J, Trounson A, Dawson K, Jones H, Hazekamp J, Nygren K-G, Hamberger L. The early days of IVF outside the UK. Hum Reprod Update. 2005;11:439–460. doi: 10.1093/humupd/dmi016.
2. Hajirasouliha I, Elemento O. Precision medicine and artificial intelligence: overview and relevance to reproductive medicine. Fertil Steril. 2020;114:908–913. doi: 10.1016/j.fertnstert.2020.09.156.
3. Hickman C, Aishubbar H, Chambost J, Jacque C, Pena C-A, Drakeley A, Freour T. Data sharing: using blockchain and decentralized data technologies to unlock the potential of artificial intelligence: what can assisted reproduction learn from other areas of medicine? Fertil Steril. 2020;114:927–933. doi: 10.1016/j.fertnstert.2020.09.160.
4. Hoffman PJ. The paramorphic representation of clinical judgment. Psychol Bull. 1960;57:116–131. doi: 10.1037/h0047807.
5. Jurisica I, Myleopoulos J, Glasgow J, Shapiro H, Casper RF. Case based reasoning in IVF: prediction and knowledge mining. Artif Intell Med. 1998;12:1–24. doi: 10.1016/S0933-3657(97)00037-7.
6. Custers IM, Steures P, van der Steeg JW, Van Dessel TJHM, Bernardus RE, Bourdrez P, Koks CAM, Riedikh WJ, Burggraaff LM, van der Veen F, Mol BWJ. External validation of a prediction model for an ongoing pregnancy after IUI. Fertil Steril. 2007;88:425–431. doi: 10.1016/j.fertnstert.2006.12.007.
7. Kaufman SJ, Eastaugh JJ, Snowden S, Smye SW, Sharma V. The application of neural networks in predicting the outcome of in-vitro fertilization. Hum Reprod. 1997;12:1454–1457. doi: 10.1093/humrep/12.7.1454.
8. Guvenir HA, Misiil G, Dibaz S, Ozedegirmenci O, Demir B, Dilbaz B. Estimating the chance of success in IVF treatment using a ranking algorithm. Med Biol Eng Comput. 2015;53:911–920. doi: 10.1007/s11517-015-1299-2.
9. Pencina MJ, Goldstein BA, D'Agostino R. Prediction models: development, evaluation, and clinical application. N Engl J Med. 2020;382:1583–1586. doi: 10.1056/NEJMp2000589.
10. Garg AX, Adhikari NKJ, Rosas-Areliano MP, Devereaux PJ, Beyene J, Sam J, Haynes RB. Effects of computerized clinical decision support systems on practitioner performance and patient outcomes. J Amer Med Assoc. 2005;293:1223–1238. doi: 10.1001/jama.293.10.1223.
11. Medic G, Kleiss MK, Atallah L, Weichert J, Panda S, Postma M, El-Kerdi N. Evidence-based clinical decision support systems for the prediction and detection of three disease states in critical care: a systematic literature review. F1000Res. 2019;8:1772. doi: 10.12688/f1000research.20498.1.
12. Dissanayake PI, Colicchio TK, Cimino JJ. Using clinical reasoning ontologies to make smarter clinical decision support systems: a systematic review and data synthesis. J Am Med Inform Assoc. 2020;27:159–174. doi: 10.1093/jamia/ocz169.
13. Dolan JO, Veezie PJ, Russ AJ. Development and initial evaluation of a treatment decision dashboard. BMC Med Inform Decis Mak. 2013;13:51–57. doi: 10.1186/1472-6947-13-51.
14. Letterie GS, Mac Donald AW. A computer decision support system for day to day management of ovarian stimulation during in vitro fertilization. Fertil Steril. 2020;114:1026–1031. doi: 10.1016/j.fertnstert.2020.06.006.
15. Shen D, Wu G, Suk H-I. Deep learning in medical image analysis. Annu Rev Biomed Eng. 2017;19:221–248. doi: 10.1146/annurev-bioeng-071516-044442.
16. Greenspan H, van Ginneken B, Summers RM. Guest editorial: deep learning in medical imaging: overview and future promise of an exciting new technique. IEEE Trans Med Imaging. 2016;35:1153–1159. doi: 10.1109/TMI.2016.2553401.
17. De Fauw J, Ledsam JR, Romera-Paredes B, et al. Clinically applicable deep learning for diagnosis and referral in retinal disease. Nat Med. 2018;24:1342–1350. doi: 10.1038/s41591-018-0107-6.
18. Gulshan V, Peng L, Coram M, Stumpe MC, Wu D, Narayanaswamy A, Venugopalan S, Widner K, Madams T, Cuadros J, Kim R, Raman R, Nelson PC, Mega JL, Webster DR. Development and validation of a deep learning algorithm for detection of diabetic retinopathy in retinal fundus photographs. J Amer Med Assoc. 2016;316:2402–2410. doi: 10.1001/jama.2016.17216.
19. Cheung DS, Lim G. Prediction of cardiovascular risk factors from retinal fundus photographs via deep learning. Nat Biomed Eng.
20. Esteva A, Kuprel B, Novoa R, et al. Dermatologist-level classification of skin cancer with deep neural networks. Nature. 2017;542:115–118. doi: 10.1038/nature21056.
21. Coudray N, Santiago Ocampo P, Narula N, Snuderl M, Fenyö D, Moreira AL, Razavian N, Tsirigos A. Classification and mutation prediction from non-small cell lung cancer histopathology images using deep learning. Nat Med. 2018;24:1559–1567. doi: 10.1038/s41591-018-0177-5.
22. Lazzaroni-Tealdi E, Barad DH, Albertini DF, Yu Y, Kushnir VA, Russell H, Wu Y-G. Oocyte scoring enhances embryo scoring in predicting pregnancy chances with IVF where it counts most. PLoS One. 2015;10:e0143632. doi: 10.1371/journal.pone.0143632.
23. Rienzi L, Vajta G, Ubaldi F. Predictive value of oocyte morphology in human IVF: a systematic review of the literature. Hum Reprod Update. 2011;17:34–45.
24. Rienzi L, Ubaldi FM, Iacobelli M, Minasi MG, Romano S, Ferrero S, Baroni E, Litwicka K, Greco E. Significance of metaphase II human oocyte morphology on ICSI outcome. Fertil Steril. 2008;90:1692–1700. doi: 10.1016/j.fertnstert.2007.09.024.
25. Aragon J, Gonzalez AL, Yufera A. Applying image processing to in vitro human oocytes characteristics. In: Image Processing: Methods, Applications and Challenges. 2012. pp. 1–17.
26. Basile TM, Caponetti L, Castellano G, Sforza G. A texture-based image processing approach for the description of human oocyte cytoplasm. IEEE Trans Instrum Meas. 2010;59:2591–2601. doi: 10.1109/TIM.2010.2057552.
27. Bakas P, Bolaris S, Pantou A, Pantos K, Koutsilieris M. Are computational applications the “crystal ball” in the IVF laboratory? The evolution from mathematics to artificial intelligence. J Assist Reprod Genet. 2018;35:1545–1557. doi: 10.1007/s10815-018-1266-6.
28. Nayot D, Meriano J, Casper R, Krivoi A. An oocyte assessment tool using machine learning: predicting blastocyst development based on a single image of an oocyte. Presented at ESHRE 2020. Accessed at: https://futurefertility.com/eshre-2020-abstract-ff/
29. Faramarzi A, Khalili MA, Ashourzadeh S. Oocyte morphology and embryo morphokinetics in an intra-cytoplasmic sperm injection programme. Is there a relationship? Zygote. 2017;25:190–196. doi: 10.1017/S0967199417000041.
30. Braude P. Are the best embryos being selected? Reprod BioMed Online. 2013;27:644–653. doi: 10.1016/j.rbmo.2013.08.009.
31. Filho ES, Noble JA, Poli M, Griffiths T, Emerson G, Wells D. A method for semi-automatic grading of human blastocyst microscope images. Hum Reprod. 2012;27:2641–2648. doi: 10.1093/humrep/des219.
32. VerMilyea M, Hall JMM, Diakiw M, Johnston A, Nguyen T, Perugini D, Miller A, Picou A, Murphy AP, Perugini M. Development of an artificial intelligence-based assessment model for prediction of embryo viability using static images captured by optical light microscopy during IVF. Hum Reprod. 2020;35:770–784. doi: 10.1093/humrep/deaa013.
33. Bormann C, Kanakasabapathy M, Thirumalaraju P, Gupta R, Pooniwala R, Kandula H, Hariton E, Souter I, Dimitriadis I, Ramirez L, Curchoe C, Swain J, Boehnlein L, Shafiee H. Performance of a deep learning based neural network in the selection of human blastocysts for implantation. eLife. 2020;9:e55301. doi: 10.7554/eLife.55301.
34. Armstrong S, Bhide P, Jordan V, Pacey A, Marjoribanks J, Farquhar C. Time-lapse systems for embryo incubation and assessment in assisted reproduction. Cochrane Database of Systematic Reviews. 29 May 2019.
35. Racowsky C, Kovacs P, Martins WP. A critical appraisal of time-lapse imaging for embryo selection: where are we and where do we need to go? J Assist Reprod Genet. 2015;32:1025–1030. doi: 10.1007/s10815-015-0510-6.
36. Nogueira M, Guilherme VB, Pronunciate M, dos Santos PH, Bezerra da Silva D, Rocha J. Artificial intelligence-based grading of bovine blastocyst digital images: direct capture with juxtaposed lenses of smartphone camera and stereomicroscope ocular lens. Sensors. 2018;18:4440. doi: 10.3390/s18124440.
37. Matheny ME, Whicher D, Israni ST. Artificial intelligence in health care: a report from the National Academy of Medicine. J Amer Med Assoc. 2020;323:509–510. doi: 10.1001/jama.2019.21579.
38. Helou S, Abou-Khalil V, Yamamoto G, Kondoh E, Tamura H, Hiragi S, Sugiyama O, Okamoto K, Nambu M, Kuroda T. Prioritizing features to redesign in an EMR system. Stud Health Technol Inform. 2019;264:1213–1217. doi: 10.3233/SHTI190419.
39. Ventola CL. Mobile devices and apps for health care professionals: uses and benefits. Pharm Ther. 2014;39:356–364.
40. Dingli A, Seychell D. Who are the digital natives? In: The New Digital Natives. Berlin: Springer; 2015. pp. 9–22.
41. Shah NR. Healthcare in 2030: will artificial intelligence replace physicians? Ann Intern Med. 2019;170:407–408. doi: 10.7326/M19-0344.
42. Anon. Editorial: Conception in a Watch Glass. N Engl J Med. October 21, 1937.
Data Availability Statement
Not applicable.