Journal of the American Medical Informatics Association: JAMIA. 2018 Jan 5;25(5):458–464. doi: 10.1093/jamia/ocx150

Interactive or static reports to guide clinical interpretation of cancer genomics

Stacy W Gray 1,2,#, Jeffrey Gagan 3,4, Ethan Cerami 5, Angel M Cronin 6, Hajime Uno 4,6, Nelly Oliver 6, Carol Lowenstein 6, Ruth Lederman 6, Anna Revette 6, Aaron Suarez 7, Charlotte Lee 4, Jordan Bryan 7, Lynette Sholl 3,4, Eliezer M Van Allen 4,6,7,#
PMCID: PMC6018970  PMID: 29315417

Abstract

Objective

Misinterpretation of complex genomic data presents a major challenge in the implementation of precision oncology. We sought to determine whether interactive genomic reports with embedded clinician education and optimized data visualization improved genomic data interpretation.

Materials and Methods

We conducted a randomized, vignette-based survey study to determine whether exposure to interactive reports for a somatic gene panel, as compared to static reports, improves physicians’ genomic comprehension and report-related satisfaction (overall scores calculated across 3 vignettes, range 0–18 and 1–4, respectively, higher score corresponding with improved endpoints).

Results

One hundred and five physicians at a tertiary cancer center participated (29% participation rate): 67% medical, 20% pediatric, 7% radiation, and 7% surgical oncology; 37% female. Prior to viewing the case-based vignettes, 34% of the physicians reported difficulty making treatment recommendations based on the standard static report. After vignette/report exposure, physicians’ overall comprehension scores did not differ by report type (mean score: interactive 11.6 vs static 10.5, difference = 1.1, 95% CI, −0.3, 2.5, P = .13). However, physicians exposed to the interactive report were more likely to correctly assess sequencing quality (P < .001) and understand when reports needed to be interpreted with caution (eg, low tumor purity; P = .02). Overall satisfaction scores were higher in the interactive group (mean score 2.5 vs 2.1, difference = 0.4, 95% CI, 0.2, 0.7, P = .001).

Discussion and Conclusion

Interactive genomic reports may improve physicians’ ability to accurately assess genomic data and increase report-related satisfaction. Additional research in users’ genomic needs and efforts to integrate interactive reports into electronic health records may facilitate the implementation of precision oncology.

Keywords: cancer, genetics, implementation, pathology

BACKGROUND AND SIGNIFICANCE

The increased availability and decreased cost of tumor genomic profiling promise to provide oncologists with a clear path to “precision medicine.”1 The goal of precision medicine is to leverage an understanding of alterations in somatic (tumor) and germline DNA to identify therapies matched to a patient’s molecular profile. Genomic data can also reveal prognostic, diagnostic, and cancer risk information that can shape care, and there are now many academic and large (>100) commercial somatic gene panels currently being used for clinical decision-making.2–4

Despite the promise of precision medicine, clinical utility studies of somatic panel testing have shown modest impact and highlight implementation obstacles.5–7 One major barrier to effective genetic testing implementation is provider knowledge gaps.8–13 Low levels of genetic knowledge/confidence have been associated with lower utilization of genetic testing.14,15 In order to address these gaps, professional organizations such as the American Society for Clinical Oncology (ASCO), the Association for Molecular Pathology (AMP), and the American Medical Association (AMA) have intensified efforts to train providers in genomics (ASCO tumor boards, ASCO pre–annual meeting genomic courses, AMP webinars, AMA somatic resource: https://cme.ama-assn.org/Activity/4652654/Detail.aspx). Moreover, many institutions have developed “molecular tumor boards” to review cases in real time.7,16–20 Others have developed genomic knowledge banks such as MyCancerGenome,21 OncoKB,22 IntOGen,23 and CIViC24 to aid in the curation and interpretation of data.

Many solutions, however, are constrained by a lack of scalability, exist outside of providers’ clinical workflow, and fail to integrate into the electronic health record.25 One potential solution is to provide clinicians with education at the point of care through genomic testing laboratory reports. Most lab reports already contain annotated information from molecular pathologists, and they include technical information, such as coverage and sequencing quality, which is essential for interpretation.26,27 Reporting of complex genomic data to date has typically required a significant reduction of these data into static documents that eliminate information inherent in genomic analyses.28 Furthermore, static genomic reports lack the decision support that may be needed to empower physicians to effectively utilize genomic data as part of evidence-based decision-making.29

To address some of these limitations, we developed a web-based interactive genomic report that includes modern information technology features (eg, improved data visualization and embedded clinician-directed education). We hypothesized that physicians who were exposed to the interactive report would demonstrate increased comprehension of patients’ genomic data and be more satisfied with the report than physicians exposed to a traditional static genomic report. We then conducted a randomized, vignette-based survey study, in the context of a longitudinal study,8 to determine whether exposure to the interactive genomic report, as compared to the static report, improves physicians’ comprehension of the reported data and their report-related satisfaction.

MATERIALS AND METHODS

Study population

We re-surveyed all faculty members who participated in the baseline survey8 and who provide clinical care to cancer patients at the Dana-Farber Cancer Institute (DFCI) and the Brigham and Women’s Hospital. In addition, we included pediatric oncologists, whose patients are now eligible for testing, and new faculty. Questions pertained to a representative somatic next-generation sequencing panel–based assay that queries exonic mutations in a set of cancer genes (OncoPanel).2,30 We recruited participants between May and November 2016. The study was approved by the DFCI Institutional Review Board (DFCI #16-101).

Survey instrument

The survey instrument contained questions related to the use of OncoPanel testing, case-based vignettes, genomic confidence, and sociodemographic/practice characteristics. We developed 3 patient vignettes in which the application of cancer genomics could be used to facilitate clinical decision-making (metastatic breast and lung cancer, metastatic melanoma). Each vignette included (1) a patient description, (2) a table on disease-relevant genomic alterations and their clinical significance, (3) a mock genomic report (static or interactive format), and (4) multiple-choice questions. We included the table on disease-relevant genomic alterations and clinical significance to ensure that participants did not need prior knowledge of the specific alterations to answer the questions. This approach allowed us to provide participants with the answers to the vignette-based questions within the vignette itself, as long as they could find the relevant data on the report.

The vignette-based question format was similar across all cases. We queried participants about (1) sample quality, (2) sequencing quality, (3) treatment recommendations, (4) copy number interpretation, (5) the likelihood that a particular alteration was somatic or germline, (6) factors that could lead to a false negative report (eg, low tumor purity or low depth of coverage), and (7) confidence in report interpretation. To evaluate specific domains, we included information (1) consistent with a germline alteration in the lung case, (2) on low-quality sequencing in the melanoma case, and (3) on low tumor purity in the breast case. We measured report satisfaction with a modified Scheuner scale.31 We also included an open-ended question to obtain participants’ feedback on the reports.

We conducted cognitive pretesting of a draft instrument with medical, radiation, and pediatric oncologists, and surgeons (n = 4). We then refined and finalized the survey. The survey took approximately 30 min and was administered online via DatStat Illume 6.1 (DatStat Inc., Seattle, WA, USA). The survey instrument is available as Supplementary Materials 1.

Study procedures

Participants were randomized into 2 groups following stratification by specialty: experimental (interactive report) vs traditional (static report), with a 1:1 allocation ratio. SWG and EMV sent potentially eligible physicians an electronic letter that contained study details and a survey link. Furthermore, we randomized the vignette order to minimize potential order-effect bias. At the end of the survey, we offered physicians who were randomized to view the static report the option to view the interactive report and provide feedback. We sent electronic reminders to nonresponders 2 and 4 weeks after initial contact. SWG and EMV called or e-mailed all nonresponders. Physicians were offered a $100 gift card as an incentive.
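For illustration only, the minimal Python sketch below shows one way to implement a stratified 1:1 allocation with a randomized vignette order; the field names, specialty labels, and vignette names are hypothetical, and this is not the authors' actual allocation code.

```python
# Minimal sketch (not the authors' code): stratified 1:1 randomization of
# physicians to the interactive vs static report, stratified by specialty,
# with a randomized vignette order per participant.
import random
from collections import defaultdict

def randomize(physicians, seed=2016):
    """physicians: list of dicts with hypothetical 'id' and 'specialty' keys."""
    rng = random.Random(seed)
    strata = defaultdict(list)
    for p in physicians:
        strata[p["specialty"]].append(p)

    assignments = {}
    for members in strata.values():
        rng.shuffle(members)
        for i, p in enumerate(members):
            # Alternating after a shuffle keeps allocation close to 1:1 within each stratum.
            arm = "interactive" if i % 2 == 0 else "static"
            # Vignette order is randomized to minimize order-effect bias.
            order = rng.sample(["breast", "lung", "melanoma"], k=3)
            assignments[p["id"]] = {"arm": arm, "vignette_order": order}
    return assignments

# Example use with hypothetical physicians
cohort = [{"id": i, "specialty": s} for i, s in enumerate(
    ["medical", "medical", "pediatric", "radiation", "surgical", "medical"])]
print(randomize(cohort))
```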

Genomics report design and implementation

We used the OncoPanel assay report2 as our static report (Supplementary Materials 2). The OncoPanel report also served as the source of content for the interactive report; we presented the same genomic information and explanatory text in both formats. All reports were generated for the study and contained fictional accession numbers and specimen collection dates. We further refined the interactive report through an iterative feedback process with trainees to optimize information organization, add hover-over buttons with embedded education (eg, the potential importance of allelic fraction, the interpretation of tumor purity), and add visual cues that allow rapid assessment of quality features (available in Supplementary Materials 3). The interactive report was developed in AngularJS 1.5.32

Statistical analysis

We summarized physician characteristics and survey responses descriptively. We evaluated differences in physician characteristics between responders and nonresponders, and between randomization groups, by Fisher’s exact and Wilcoxon rank-sum tests. We analyzed the open-ended data to identify themes in content.

The primary outcome was overall comprehension of genomic findings, defined as the sum of the correct responses to 18 items (6 items for each of the 3 vignettes, range 0–18). In the primary analysis, we scored physician nonresponse as incorrect (ie, assuming that failure to select a response on a comprehension question meant that the physician was unsure of the correct response). We also conducted secondary analyses using a complete case analysis in the subset of physicians who completed all 18 comprehension items. Furthermore, we calculated domain-specific comprehension for the 6 domains included in each of the vignettes, defined as the sum of the correct responses to the domain-specific items (range 0–3). The secondary outcome was physician-reported satisfaction. Responses to 16 items were scored on a 1–4 point Likert scale, with higher scores representing greater satisfaction; we averaged responses to determine the overall satisfaction score. We evaluated mean differences in overall comprehension and overall satisfaction between randomization groups with a t-test, and differences in domain-specific comprehension with a Wilcoxon rank-sum test. Statistical analyses were performed using Stata version 13.2 (StataCorp, College Station, TX, USA).
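As a concrete illustration of the scoring rules and group comparisons described above, the sketch below expresses them in Python; this is illustrative only (the authors used Stata), and the data layout and example values are assumptions.

```python
# Minimal sketch of the scoring rules and group comparisons (illustrative only;
# the authors used Stata, and the data layout here is assumed).
import numpy as np
from scipy import stats

def overall_comprehension(responses):
    """responses: 18 entries of True/False/None; nonresponse is scored as incorrect."""
    return sum(1 for r in responses if r is True)    # range 0-18

def overall_satisfaction(items):
    """items: 16 Likert ratings (1-4); the overall score is the item average."""
    return float(np.mean(items))                     # range 1-4

# Hypothetical per-physician overall comprehension scores for the two arms
interactive = np.array([12, 14, 11, 9, 13, 12, 10])
static = np.array([10, 11, 9, 12, 8, 10, 11])

# Overall comprehension and satisfaction: compared with a two-sample t-test
t_stat, p_ttest = stats.ttest_ind(interactive, static)

# Domain-specific comprehension (range 0-3): compared with the Wilcoxon rank-sum test
interactive_domain = np.array([3, 2, 3, 1, 2, 3, 2])
static_domain = np.array([1, 2, 1, 2, 1, 2, 1])
z_stat, p_ranksum = stats.ranksums(interactive_domain, static_domain)

print(p_ttest, p_ranksum)
```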

When designing the study, we estimated that we would recruit 320 eligible physicians, with an expected participation rate of 50%, for a target of 160 participating physicians. This study design had 80% power to detect an effect size of 0.446 with a 2-sided type I error rate of 5%. We ultimately recruited 358 eligible physicians, of whom 103 completed the items specific to the primary outcome. Updating this power calculation with the actual number of participants included in the analysis of the primary outcome (56 static and 47 interactive), the observed sample had 62% power to detect an effect size of 0.45, and 80% power to detect an effect size of 0.56.
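The power figures quoted above can be reproduced approximately with standard software. The sketch below uses Python's statsmodels for a two-sample t-test power calculation; the tool choice is an assumption (the paper does not state how the calculation was done), but the inputs are those reported in the text.

```python
# Illustrative reproduction of the reported power calculations
# (two-sample t-test, two-sided alpha = 0.05).
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()

# Design stage: 160 participants (80 per arm) give ~80% power for an effect size of 0.446.
power_planned = analysis.power(effect_size=0.446, nobs1=80, alpha=0.05, ratio=1.0)

# Observed sample: 56 static vs 47 interactive gives ~62% power for an effect size of 0.45.
power_observed = analysis.power(effect_size=0.45, nobs1=56, alpha=0.05, ratio=47 / 56)

# Effect size detectable with 80% power in the observed sample (~0.56).
detectable = analysis.solve_power(nobs1=56, alpha=0.05, power=0.80, ratio=47 / 56)

print(round(power_planned, 2), round(power_observed, 2), round(detectable, 2))
```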

RESULTS

Study population

We randomized 358 eligible physicians, of whom 105 participated (29%) (Supplementary Figure 4). The participation rate was slightly higher among physicians randomized to the static report (57/177 = 32%) than those randomized to the interactive report (48/181 = 27%). Medical oncologists had the highest participation rate (36%), followed by pediatric oncologists (33%), radiation oncologists (24%), and surgeons (10%). Participants were more likely to have graduated medical school more recently (P < .001), and among the subset of eligible physicians who participated in the baseline survey,8 we observed that those who participated in the current study reported higher levels of confidence in knowledge about genomics (P = .03) (Supplementary Table 1). For example, 42% of participants vs 23% of nonparticipants in the current study reported in 2011 that they were “very confident” in their knowledge about genomics. Among the 105 participants, there were no statistically significant differences in physician characteristics by study arm (Table 1).

Table 1. Characteristics of participants by study arm^a

Characteristics    Standard (N = 57)    Interactive (N = 48)    P-value**
Gender .68
 Male 34 (61) 30 (65)
 Female 22 (39) 16 (35)
 Nonresponse 1 2
Years since medical school graduation .39
 0–5 1 (2) 1 (2)
 6–10 15 (26) 13 (28)
 11–15 12 (21) 15 (33)
 16–20 14 (25) 7 (15)
 21–25 4 (7) 3 (7)
 26–30 2 (4) 0 (0)
 31–35 2 (4) 2 (4)
 36–40 5 (9) 3 (7)
 >40 2 (4) 2 (4)
 Nonresponse 0 2
Department .72
 Medical oncology 38 (67) 32 (67)
 Pediatric oncology 13 (23) 8 (17)
 Radiation oncology 3 (5) 4 (8)
 Surgery 3 (5) 4 (8)
Confidence in knowledge about genomics .12
 Not confident at all 2 (4) 1 (2)
 Not very confident 5 (9) 8 (17)
 Somewhat confident 34 (61) 19 (40)
 Very confident 15 (27) 20 (42)
 Nonresponse 1 0
Confidence in ability to explain genomic concepts to patients .10
 Not confident at all 0 (0) 0 (0)
 Not very confident 3 (5) 7 (15)
 Somewhat confident 30 (54) 17 (35)
 Very confident 23 (41) 24 (50)
 Nonresponse 1 0
Confidence in ability to make treatment recommendations based on genomic information .64
 Not confident at all 1 (2) 1 (2)
 Not very confident 10 (18) 12 (25)
 Somewhat confident 28 (50) 18 (38)
 Very confident 17 (30) 17 (35)
 Nonresponse 1 0
Principal investigator in clinical trials research? 1.00
 No 29 (51) 25 (52)
 Yes 28 (49) 23 (48)
Number of newly diagnosed patients seen for treatment or evaluation each month .10
 Median (interquartile range) 8 (3–15) 10 (6–20)

^a Item nonresponse was 7% for new patient volume and <3% for all other items.

**P-values were determined by Wilcoxon rank-sum test (years since medical school graduation, new patient volume) and Fisher’s exact test (all other items).

Reported use of OncoPanel

There was wide variability in OncoPanel use, with physicians reporting that a median of 30% of their patients had testing (interquartile range, 10%–50%; range, 1%–100%). The majority of physicians (>50%) reported that multiple factors “sometimes” or “often” played a role in a decision to not use Tier 1 (well-established clinical utility) or Tier 2 (clinical utility in some contexts; eg, clinical trial eligibility or a drug approved by the Food and Drug Administration for a different tumor type) test results to inform treatment recommendations (Figure 1). Notably, 34% reported that they often or sometimes found it difficult to make treatment recommendations based on OncoPanel results.

Figure 1. Physicians' reasons for not using genomic test results to inform treatment recommendations. Categories of responses are described in the figure.

Impact on genomics comprehension

Two physicians did not respond to any of these items and were excluded from the comprehension analyses. Nonresponse for the remaining 103 physicians was minimal (<7% for each of the 18 comprehension items, with 86 physicians responding to all 18 items). In the primary analysis, the physicians’ overall comprehension scores did not differ significantly by report type (mean score: interactive 11.6 vs static 10.5, difference = 1.1, 95% CI, −0.3, 2.5, P = .13, Figure 2A). Similar results were obtained when the analysis was limited to the physicians who responded to all items (mean score: interactive 12.2 vs static 11.2, difference = 1.0, 95% CI, −0.4, 2.4, P = .13).

Figure 2. Physicians’ comprehension scores. (A) Overall: This score is defined as the sum of the correct responses to 18 items (6 items for each of the 3 vignettes; range 0–18). (B) Domain-specific: This score is defined as the sum of the correct responses to the domain-specific items from each of the 3 vignettes (range 0–3). Std: standard report; Int: interactive report. Higher scores correspond to better comprehension.

In secondary analyses for domain-specific comprehension, however, physicians who viewed the interactive report were more likely to correctly assess tumor purity (P = .02) and sequencing quality (P < .001) and understand when reports needed to be interpreted with caution (P = .02) (Figure 2B).

Physician satisfaction with reports

Overall satisfaction scores were significantly higher in the interactive group than the static group (mean score 2.5 vs 2.1, difference = 0.4, 95% CI, 0.2, 0.7, P = .001; Figure 3). However, when asked about ease of understanding the test results presented in the genomic report, one-quarter of physicians in both groups responded “not at all easy” (across all vignettes: 27% interactive and 28% static) and nearly one-half responded “somewhat easy” (42% interactive and 53% static). In open-ended comments, providers had differing opinions about the type of genomic information that they found usable (eg, some wanted technical/raw data while others wanted simplified reports and a summary) or suggested additional report functionality (eg, links to external databases, clinical trials information, and institutional pathways), and many reported a need for additional provider-directed genomic education (Supplementary Table 2).

Figure 3. Physicians’ average satisfaction score. Responses to 16 items were scored on a 1–4 point Likert scale, with higher scores representing greater satisfaction. The overall satisfaction score was defined as the average of the 16 items.

In aggregate, 85% of physicians reported that existing resources are inadequate to support genomic testing in clinical practice. The need for additional support for providers and patients was endorsed by 88% and 68% of physicians, respectively. Among physicians who desired additional provider support, electronic reports with decision support (66%) and a genomic consult service (53%) were most highly endorsed (Figure 4). Among physicians who desired additional patient support, “patient-friendly” versions of the report (81%) and increased availability of genetic counselors for individual patient sessions (76%) were most highly endorsed (Supplementary Figure 5).

Figure 4. Physicians’ attitudes about provider genomic support that would be helpful. Categories of responses are described in the figure.

DISCUSSION

In this study, physicians participated in a vignette-based survey study in which they were randomized to view either a novel interactive genomic report or a traditional static report. In our primary analyses, we found that mean comprehension scores did not significantly differ between groups. However, in our exploratory analyses, we found that physicians’ ability to correctly assess sequencing quality and tumor purity and understand when reports needed to be interpreted with caution was significantly higher for those who were exposed to the interactive reports. Furthermore, report-related satisfaction was higher among physicians who viewed the interactive report. Our findings suggest that an interactive interface may be beneficial in guiding physicians in genomic interpretation and confirm that there remains a major need to improve current genomic care.

There are several possible explanations for the lack of an observed difference in overall comprehension scores between groups. One explanation is that the overall comprehension score did not measure a single construct and that clinically active physicians have greater knowledge in some areas (eg, interpreting copy number variation) than others (eg, assessing sequencing quality). Another explanation is that there may be an actual difference in comprehension by group but our sample size was too small to detect it. The fixed physician sample size at DFCI, combined with a relatively low participation rate, limits our ability to find small to moderate differences in overall comprehension. Alternatively, it is also possible that the proposed interactive report is not, on its own, sufficient to improve comprehension. Indeed, only one-third of physicians endorsed the interactive report as easy to understand, and the average score was only 10–12 out of 18.

In contrast to overall comprehension, we found that the average report-related satisfaction score was higher among physicians who were exposed to the interactive report. Given that one of our goals for the interactive report was to provide data to physicians in a clear and minimalist interface, and to allow providers to tailor their viewing experience by using the hover-over functionality, the satisfaction findings are promising. Our open-ended data suggest that providers may have different needs, with some preferring a simplified report and others preferring raw or technical data. Future web-based reports may need to have an increased number of interactive features in order to allow clinicians to further tailor their use depending on their needs.33

Our findings add to a growing body of literature that investigates the effect of new strategies for genomic reporting. Williams and colleagues34 have demonstrated that patient and provider reports for Mendelian genetic disorders can affect clinician satisfaction and facilitate provider-patient communication. Other investigators have developed reports for whole-exome or whole-genome sequencing that include features such as a succinct summary of genomics findings, written for a nongenetic specialist audience, and information about the technical limitations of whole-exome or whole-genome sequencing.35,36

Although large panel testing is increasingly being incorporated into care, prior research has demonstrated that genomic results can be difficult to interpret.8,9 For example, generalists and nongenetic specialists may under- or overinterpret genomic information, which can lead to inaccurate diagnoses and misinformed counseling.37–39 We found that 84% of surveyed physicians endorsed the need for additional provider support in the form of electronic reports with embedded decision support, genomic consult services, and the ability to obtain physician-to-physician “curbsides.” Physicians also endorsed additional genomic support for patients, including a patient-friendly genomics report. In service to these ideas, we have incorporated a version of the interactive physician-directed report described here into our MatchMiner clinical trial interface (http://matchminer.org). Furthermore, our group is developing patient-facing genomic reports that will be integrated into our information technology platform. Despite this progress, the benefits and limitations of a variety of genomic interventions need further study.

Strengths of our study include the creation of a web-based interactive format for genomic reporting and the evaluation of reports through a randomized experiment. Our study has limitations that are also worth noting. First, we had a relatively low participation rate. Although nonparticipation is a known issue with physician surveys,40 the rate was lower than that of our baseline survey (61%). This finding is striking, because we did not offer incentives at baseline and did offer an incentive for this work. One reason for the lower response rate is that physicians might not appreciate being “tested.” This hypothesis is supported by the formative work for our baseline survey, in which physicians suggested eliminating knowledge items because of concerns that “testing” would lower response rates. Furthermore, physicians who had lower genomic confidence at baseline were less likely to respond to the current survey. Given that assessments of physicians’ knowledge are needed to develop robust physician education, enhanced provider participation in future studies is needed. However, despite the relatively low participation rate, randomization was successful and we achieved balance across the 2 arms (interactive vs static), thus strengthening confidence in our findings. Second, the standard report was based on reports from our institution and may not be representative of other genomic reports. Third, we studied physicians at a single institution, and our findings might not be generalizable.

CONCLUSION

In summary, we found that physicians’ overall comprehension scores did not differ by report type. However, in exploratory analyses, we found that interactive genomic reports facilitated cancer physicians’ ability to correctly interpret technical aspects of the reports, such as sequencing quality and tumor purity. Furthermore, physicians who were exposed to the interactive reports had higher report-related satisfaction. Taken together, our findings suggest that innovations in genomic reporting hold the potential to decrease providers’ sequencing-related knowledge gaps and improve accurate genomic interpretation. Additional work is needed to determine whether dynamic reports are helpful for broader provider populations and to integrate interactive reports into the electronic health record. Ultimately, in order to fulfill the promise of precision medicine, point-of-care interventions will be needed to increase providers’ confidence in their ability to use genomic data.

COMPETING INTERESTS

SWG, JG, EC, AMC, HU, NO, CL, RL, AR, AS, CL, JB: none to report. LS: consultant for Research to Practice. EMV is a consultant/advisor for Genome Medical, Tango Therapeutics, and Novartis; receives research funding from Novartis and Bristol Myers Squibb; and has equity in Genome Medical, Tango Therapeutics, and Syapse.

FUNDING

This work was supported by a DFCI Medical Oncology grant (EMV), National Institutes of Health grants U01HG006492 and K08CA188615 (EMV), the American Cancer Society 120529-MRSG-11-006-01-CPPB (SWG), and the Agency for Healthcare Research and Quality R21HS024984 (SWG).

CONTRIBUTORS

SWG, JG, EC, AMC, HU, NO, CL, RL, AR, AS, CL, JB, LS, and EMV contributed to the conception and design of the project, survey execution, survey data collection, and interpretation of results. SWG, JG, EC, AS, CL, JB, and EMV contributed to the design and coding of the web-based report. SWG, NO, CL, RL, AR, and EMV contributed to operations related to the survey instrument and deployment. AMC and HU performed statistical analyses. All authors contributed to drafting the manuscript and approved the manuscript.

SUPPLEMENTARY MATERIAL

Supplementary material is available at Journal of the American Medical Informatics Association online.


REFERENCES

1. Garraway LA. Genomics-driven oncology: framework for an emerging paradigm. J Clin Oncol. 2013;31(15):1806–14.
2. Sholl LM, Do K, Shivdasani P, et al. Institutional implementation of clinical tumor profiling on an unselected cancer population. JCI Insight. 2016;1:e87062.
3. Zehir A, Benayed R, Shah RH, et al. Mutational landscape of metastatic cancer revealed from prospective clinical sequencing of 10,000 patients. Nat Med. 2017;23:703–13.
4. Gagan J, Van Allen EM. Next-generation sequencing to guide cancer therapy. Genome Med. 2015;7:80.
5. Le Tourneau C, Delord JP, Goncalves A, et al. Molecularly targeted therapy based on tumour molecular profiling versus conventional therapy for advanced cancer (SHIVA): a multicentre, open-label, proof-of-concept, randomised, controlled phase 2 trial. Lancet Oncol. 2015;16:1324–34.
6. Massard C, Michiels S, Ferte C, et al. High-throughput genomics and clinical outcome in hard-to-treat advanced cancers: results of the MOSCATO 01 trial. Cancer Discov. 2017;7:586–95.
7. Ghazani AA, Oliver NM, St Pierre JP, et al. Assigning clinical meaning to somatic and germ-line whole-exome sequencing data in a prospective cancer precision medicine study. Genet Med. 2017;19:787–95.
8. Gray SW, Hicks-Courant K, Cronin A, et al. Physicians’ attitudes about multiplex tumor genomic testing. J Clin Oncol. 2014;32:1317–23.
9. Selkirk CG, Weissman SM, Anderson A, et al. Physicians’ preparedness for integration of genomic and pharmacogenetic testing into practice within a major healthcare system. Genet Test Mol Biomarkers. 2013;17:219–25.
10. Miller FA, Krueger P, Christensen RJ, et al. Postal survey of physicians and laboratories: practices and perceptions of molecular oncology testing. BMC Health Serv Res. 2009;9:131.
11. Douma KF, Smets EM, Allain DC. Non-genetic health professionals’ attitude towards, knowledge of and skills in discussing and ordering genetic testing for hereditary cancer. Fam Cancer. 2016;15:341–50.
12. Johnson LM, Valdez JM, Quinn EA, et al. Integrating next-generation sequencing into pediatric oncology practice: an assessment of physician confidence and understanding of clinical genomics. Cancer. 2017;123:2352–59.
13. Cohen B, Roth M, Marron JM, et al. Pediatric oncology provider views on performing a biopsy of solid tumors in children with relapsed or refractory disease for the purpose of genomic profiling. Ann Surg Oncol. 2016;23:990–97.
14. Doksum T, Bernhardt BA, Holtzman NA. Does knowledge about the genetics of breast cancer differ between nongeneticist physicians who do or do not discuss or order BRCA testing? Genet Med. 2003;5:99–105.
15. Wideroff L, Freedman AN, Olson L, et al. Physician use of genetic testing for cancer susceptibility: results of a national survey. Cancer Epidemiol Biomarkers Prev. 2003;12:295–303.
16. Bryce AH, Egan JB, Borad MJ, et al. Experience with precision genomics and tumor board indicates frequent target identification, but barriers to delivery. Oncotarget. 2017;8:27145–54.
17. Roychowdhury S, Iyer MK, Robinson DR, et al. Personalized oncology through integrative high-throughput sequencing: a pilot study. Sci Transl Med. 2011;3:111ra121.
18. Parsons DW, Roy A, Yang Y, et al. Diagnostic yield of clinical tumor and germline whole-exome sequencing for children with solid tumors. JAMA Oncol. 2016 Jan 28. doi: 10.1001/jamaoncol.2015.5699. [Epub ahead of print].
19. McGraw SA, Garber J, Janne PA, et al. The fuzzy world of precision medicine: deliberations of a precision medicine tumor board. Per Med. 2017;14:37–50.
20. Parker BA, Schwaederle M, Scur MD, et al. Breast cancer experience of the molecular tumor board at the University of California, San Diego Moores Cancer Center. J Oncol Pract. 2015;11:442–49.
21. Yeh P, Chen H, Andrews J, et al. DNA-Mutation Inventory to Refine and Enhance Cancer Treatment (DIRECT): a catalog of clinically relevant cancer mutations to enable genome-directed anticancer therapy. Clin Cancer Res. 2013;19:1894–901.
22. Chakravarty D, Gao J, Phillips S, et al. OncoKB: a precision oncology knowledge base. JCO Precision Oncol. 2017;1:1–16.
23. Gonzalez-Perez A, Mustonen V, Reva B, et al. Computational approaches to identify functional genetic variants in cancer genomes. Nat Methods. 2013;10:723–29.
24. Griffith M, Spies NC, Krysiak K, et al. CIViC is a community knowledgebase for expert crowdsourcing the clinical interpretation of variants in cancer. Nat Genet. 2017;49:170–74.
25. Shirts BH, Salama JS, Aronson SJ, et al. CSER and eMERGE: current and potential state of the display of genetic information in the electronic health record. J Am Med Inform Assoc. 2015;22:1231–42.
26. Sims D, Sudbery I, Ilott NE, et al. Sequencing depth and coverage: key considerations in genomic analyses. Nat Rev Genet. 2014;15:121–32.
27. Nielsen R, Paul JS, Albrechtsen A, et al. Genotype and SNP calling from next-generation sequencing data. Nat Rev Genet. 2011;12:443–51.
28. Tarczy-Hornoch P, Amendola L, Aronson SJ, et al. A survey of informatics approaches to whole-exome and whole-genome clinical reporting in the electronic health record. Genet Med. 2013;15:824–32.
29. Berg JS, Amendola LM, Eng C, et al. Processes and preliminary outputs for identification of actionable genes as incidental findings in genomic sequence data in the Clinical Sequencing Exploratory Research Consortium. Genet Med. 2013;15:860–67.
30. Garcia EP, Minkovsky A, Jia Y, et al. Validation of OncoPanel: a targeted next-generation sequencing assay for the detection of somatic variants in cancer. Arch Pathol Lab Med. 2017;141:751–58.
31. Scheuner MT, Edelen MO, Hilborne LH, et al. Effective communication of molecular genetic test results to primary care providers. Genet Med. 2013;15:444–49.
32. Hevery M, Abrons A. AngularJS, version 1.6. 2016. https://angularjs.org/.
33. Gray SW, Park ER, Najita J, et al. Oncologists’ and cancer patients’ views on whole-exome sequencing and incidental findings: results from the CanSeq study. Genet Med. 2016;18:1011–19.
34. Williams JL, Rahm AK, Stuckey H, et al. Enhancing genomic laboratory reports: a qualitative analysis of provider review. Am J Med Genet A. 2016;170A:1134–41.
35. McLaughlin HM, Ceyhan-Birsoy O, Christensen KD, et al. A systematic approach to the reporting of medically relevant findings from whole genome sequencing. BMC Med Genet. 2014;15:134.
36. Dorschner MO, Amendola LM, Shirts BH, et al. Refining the structure and content of clinical genomic reports. Am J Med Genet C Semin Med Genet. 2014;166C:85–92.
37. Krier JB, Kalia SS, Green RC. Genomic sequencing in clinical practice: applications, challenges, and opportunities. Dialogues Clin Neurosci. 2016;18:299–312.
38. Christensen KD, Vassy JL, Jamal L, et al. Are physicians prepared for whole genome sequencing? A qualitative analysis. Clin Genet. 2016;89:228–34.
39. Vassy JL, Bates DW, Murray MF. Appropriateness: a key to enabling the use of genomics in clinical practice? Am J Med. 2016;129:551–53.
40. Martins Y, Lederman RI, Lowenstein CL, et al. Increasing response rates from physicians in oncology research: a structured literature review and data from a recent physician survey. Br J Cancer. 2012;106:1021–26.
