We were pleased to read the December 2020 editorial in this journal [1], “Research, reuse, repeat.” The editorial extols the benefits of making tools (methods, data, and code) available to others, who can then replicate results and build on previously developed work. Here, we highlight three lines of activity in the FDA Center for Devices and Radiological Health (CDRH) that embody the spirit of the editorial. These activities include research, resources, and collaborations that enable assessment of the safety and efficacy of FDA-regulated products, including products enabled by artificial intelligence and machine learning (AI/ML).
Firstly, the FDA performs scientific research. Alongside other research activities across the FDA, the Office of Science and Engineering Laboratories (OSEL), the research arm of CDRH, focuses on approximately 20 program areas covering a wide range of medical device technologies. OSEL scientists are charged with conducting regulatory research to accelerate product development and bring products to patients as rapidly as possible. The results of these efforts are disseminated in multiple ways, including through peer-reviewed publications and by incorporating research findings into regulatory resources.
Secondly, the FDA shares resources. CDRH actively seeks to disseminate information and tools that support the application of regulatory science through publications, guidance documents, public summaries of authorized products, and open tools and datasets. The knowledge base of CDRH is its corpus of guidance documents, which provide current thinking on policy and regulatory issues. They include technical and conceptual frameworks, as well as methods to guide the robust development and reproducible assessment of regulated products. For medical imaging software as a medical device (SaMD) [2], two recent guidance documents describe a regulatory framework specifically for evaluating computer-aided detection (CADe) algorithms in radiology: one pertains to the information needed for a premarket notification (also known as a 510(k) submission) [3], and the other provides additional information on assessing a device’s clinical performance in the hands of clinicians [4]. Another important FDA information resource is the collection of Medical Device Databases, which includes decision summaries for certain types of authorized devices, with information on the methods and results relied upon for the regulatory decision. Examples for SaMD in radiology include computer-aided diagnosis (CADx) software for lesions suspicious for cancer [5], [6], computer-aided detection and diagnosis software (CADe + CADx) [6], [7], and computer-aided triage and notification software (CADt) [6], [8]. More recently, the FDA rendered its first authorization for an H&E digital pathology slide scanner [6], [9] and for an AI-based SaMD that uses that scanner’s images to help pathologists detect prostate cancer [6], [10]. To proactively support emerging technologies, OSEL has recently compiled results from its research activities into a Catalog of Regulatory Science Tools to Help Assess New Medical Devices.
This catalog supplements guidance documents and recognized standards by providing a list of newly developed methods, phantoms, and computational models and simulations. For example, to address reader variability in multi-reader, multi-case (MRMC) imaging studies, iMRMC is a publicly available statistical software package that supports the assessment of the clinical performance of imaging devices and related AI/ML. iMRMC-related dataset repositories hold supplementary materials from published articles, allowing others to explore the data and code through worked examples.
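To illustrate the kind of endpoint an MRMC analysis assesses, the sketch below computes a reader-averaged empirical AUC from simulated reading data. This is a minimal, self-contained Python illustration, not the iMRMC package itself; the reader names, case counts, and score model are invented for the example, and a full MRMC analysis (as in iMRMC) would additionally estimate the variance of the average across both the reader and case populations.

```python
import random

def empirical_auc(pos_scores, neg_scores):
    """Empirical AUC via the Mann-Whitney U statistic: the fraction of
    (diseased, non-diseased) case pairs in which the diseased case
    scores higher (ties count as half)."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

# Hypothetical MRMC-style data: each reader scores every case, and each
# case has a known truth state (1 = diseased, 0 = non-diseased).
random.seed(0)
readers = ["reader1", "reader2", "reader3"]
cases = [("case%02d" % i, i % 2) for i in range(20)]
scores = {r: {c: random.gauss(truth, 1.0) for c, truth in cases}
          for r in readers}

# The primary MRMC endpoint: AUC averaged over readers.
per_reader_auc = []
for r in readers:
    pos = [scores[r][c] for c, truth in cases if truth == 1]
    neg = [scores[r][c] for c, truth in cases if truth == 0]
    per_reader_auc.append(empirical_auc(pos, neg))
reader_averaged_auc = sum(per_reader_auc) / len(per_reader_auc)
```

Averaging over readers captures the fact that the device is evaluated in the hands of a population of clinicians, not a single expert, which is why both reader and case variability matter in the study design.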
Thirdly, the FDA collaborates with stakeholders. The Medical Device Development Tool (MDDT) program is a formal CDRH program that encourages stakeholders to propose and pursue FDA qualification of specific tools through a voluntary process. Such tools facilitate device development or evaluation, reduce early risk in product development, and provide a means for collecting the information necessary for a regulatory submission. Examples include patient-reported outcome instruments, physical and computational models and simulations of devices and biology, biomarkers to monitor the effectiveness of therapies, and AI/ML validation datasets. For example, the high-throughput truthing (HTT) project has submitted a proposal for a validation dataset for the evaluation of tumor-infiltrating lymphocytes in digital histopathology images of breast cancer biopsies [11]. The HTT project has publicly shared its pilot study data and MDDT submission materials. In addition to the data, the public repository includes scripts to explore and analyze the data. As the HTT project progresses, anyone can adapt the approach to validate other algorithms and biomarkers, whether or not the HTT project successfully qualifies the data as an MDDT. The HTT project has benefited from a diverse group of collaborators, including several organizations that provide expertise during project development and feedback on deliverables. Similarly, CDRH builds and leverages external expertise through collaborations. CDRH participates in collaborative communities, believing that they can help inform and generate regulatory science solutions of public health importance by bringing together diverse stakeholders to develop tools, methods, and data. This is of particular importance as AI/ML penetrates many technologies. For example, members of the Pathology Innovation Collaborative Community, including chair J. Lennerz, provided a practical tool for navigating FDA guidance documents related to AI/ML [12, Fig. 1].
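A validation dataset built from multiple expert annotators typically needs a reference standard and a measure of annotator agreement before it can be used to evaluate algorithms. The sketch below shows one common approach, majority voting with percent agreement, on invented labels; the annotator names, ROI identifiers, and label values are hypothetical and do not come from the HTT project’s data or its analysis scripts.

```python
from collections import Counter

# Hypothetical annotations: each pathologist labels the same regions of
# interest (ROIs). Names and labels are illustrative only.
annotations = {
    "pathologist_A": {"roi1": "TILs", "roi2": "no_TILs", "roi3": "TILs"},
    "pathologist_B": {"roi1": "TILs", "roi2": "TILs", "roi3": "TILs"},
    "pathologist_C": {"roi1": "TILs", "roi2": "no_TILs", "roi3": "no_TILs"},
}

def majority_reference(annotations):
    """Majority-vote reference label for each ROI."""
    rois = next(iter(annotations.values())).keys()
    return {roi: Counter(a[roi] for a in annotations.values())
                 .most_common(1)[0][0]
            for roi in rois}

def percent_agreement(annotations, reference):
    """Fraction of individual labels that match the reference."""
    total = sum(len(a) for a in annotations.values())
    matches = sum(1 for a in annotations.values()
                  for roi, label in a.items() if label == reference[roi])
    return matches / total

reference = majority_reference(annotations)
agreement = percent_agreement(annotations, reference)
```

In practice a qualification submission would justify the choice of reference standard and agreement statistic; majority voting is only one option, and chance-corrected measures (e.g., kappa statistics) are often preferred.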
Another example of collaborative efforts is the Medical Device Innovation Consortium (MDIC). MDIC convenes three collaborative communities and has initiated a digital health working group to “complement FDA’s efforts to develop an innovative regulatory pathway for software.”
Collectively, these activities demonstrate the FDA’s commitment to communicating and conducting collaborative regulatory science: the science of developing tools, standards, and approaches to reproducibly assess the safety, efficacy, quality, and performance of all FDA-regulated products. The FDA is actively engaging researchers, industry, patients, and other stakeholders to identify the needs of industry and the public, and to tackle emerging challenges in the pre-competitive space. We invite you to reach out to the FDA or to join a collaborative community.
Footnotes
Competing interest statement:
The authors do not have any competing interests to declare.
Contributor Information
Brandon D. Gallas, FDA Center for Devices and Radiological Health, Office of Science and Engineering Laboratories, Division of Imaging, Diagnostics, and Software Reliability.
Aldo Badano, FDA Center for Devices and Radiological Health, Office of Science and Engineering Laboratories, Division of Imaging, Diagnostics, and Software Reliability.
Sarah Dudgeon, Yale New Haven Hospital, Center for Outcomes Research and Evaluation; Yale University, Biological and Biomedical Sciences.
Katherine Elfer, FDA Center for Devices and Radiological Health, Office of Science and Engineering Laboratories, Division of Imaging, Diagnostics, and Software Reliability.
Victor Garcia, FDA Center for Devices and Radiological Health, Office of Science and Engineering Laboratories, Division of Imaging, Diagnostics, and Software Reliability.
Jochen K. Lennerz, Massachusetts General Hospital/Harvard Medical School, Department of Pathology, Center for Integrated Diagnostics, Boston, MA.
Kyle Myers, FDA, Retired.
Nicholas Petrick, FDA Center for Devices and Radiological Health, Office of Science and Engineering Laboratories, Division of Imaging, Diagnostics, and Software Reliability.
Ed Margerrison, FDA Center for Devices and Radiological Health, Office of Science and Engineering Laboratories.
References
- [1]. “Research, reuse, repeat,” Nat Mach Intell, vol. 2, no. 12, p. 729, Dec. 2020, doi: 10.1038/s42256-020-00277-9.
- [2]. FDA CDRH, “Software as a Medical Device (SAMD): Clinical Evaluation.” FDA, 2017. Accessed: Jan. 05, 2018. [Online]. Available: https://www.fda.gov/media/100714/download
- [3]. FDA CDRH, “Guidance for Industry and FDA Staff - Computer-Assisted Detection Devices Applied to Radiology Images and Radiology Device Data - Premarket Notification [510(k)] Submissions.” FDA, 2012. Accessed: Apr. 21, 2020. [Online]. Available: https://www.fda.gov/media/77635/download
- [4]. FDA CDRH, “Guidance for Industry and FDA Staff - Clinical Performance Assessment: Considerations for Computer-Assisted Detection Devices Applied to Radiology Images and Radiology Device Data in Premarket Notification [510(k)] Submissions.” FDA, 2020. Accessed: Aug. 05, 2020. [Online]. Available: https://www.fda.gov/media/77642/download
- [5]. FDA CDRH, “Decision Summary for QuantX, DEN170022.” 2017. Accessed: Sep. 20, 2021. [Online]. Available: http://www.accessdata.fda.gov/cdrh_docs/reviews/DEN170022.pdf
- [6]. Note, “The mention of commercial products, their sources, or their use in connection with material reported herein is not to be construed as either an actual or implied endorsement of such products by the Department of Health and Human Services.”
- [7]. FDA CDRH, “Decision Summary for OsteoDetect, DEN180005.” 2018. Accessed: Sep. 20, 2021. [Online]. Available: https://www.accessdata.fda.gov/cdrh_docs/pdf18/DEN180005.pdf
- [8]. FDA CDRH, “Decision Summary for ContaCT, DEN170073.” 2017. Accessed: Sep. 20, 2021. [Online]. Available: https://www.accessdata.fda.gov/cdrh_docs/reviews/DEN170073.pdf
- [9]. FDA CDRH, “De Novo Request Evaluation of Automatic Class III Designation for Philips IntelliSite Pathology Solution (PIPS): Decision Summary.” 2017. Accessed: May 17, 2018. [Online]. Available: https://www.accessdata.fda.gov/cdrh_docs/reviews/DEN160056.pdf
- [10]. FDA CDRH, “Decision Summary for Paige Prostate, DEN200080.” 2021. Accessed: Sep. 24, 2021. [Online]. Available: https://www.accessdata.fda.gov/cdrh_docs/pdf20/DEN200080.pdf
- [11]. Dudgeon, S. et al., “A pathologist-annotated dataset for validating artificial intelligence: A project description and pilot study,” J Pathol Inform, vol. 12, no. 1, p. 45, 2021, doi: 10.4103/jpi.jpi_83_20.
- [12]. Marble, H. et al., “A Regulatory Science Initiative to Harmonize and Standardize Digital Pathology and Machine Learning Processes to Speed up Clinical Innovation to Patients,” J Pathol Inform, accepted 2020.