Journal of the American Medical Informatics Association: JAMIA. 2019 Nov 7;27(2):308–314. doi: 10.1093/jamia/ocz190

Best practices for data visualization: creating and evaluating a report for an evidence-based fall prevention program

Srijesa Khasnabish 1, Zoe Burns 1, Madeline Couch 1, Mary Mullin 1, Randall Newmark 2, Patricia C Dykes 1,3,4
PMCID: PMC7647241  PMID: 31697326

Abstract

This case report applied principles from the data visualization (DV) literature and feedback from nurses to develop an effective report to display adherence with an evidence-based fall prevention program. We tested the usability of the original and revised reports using a Health Information Technology Usability Evaluation Scale (Health-ITUES) customized for this project. Items were rated on a 5-point Likert scale from strongly disagree (1) to strongly agree (5). The literature emphasized that the ideal display maximizes the information communicated, minimizes the cognitive effort involved in interpretation, and uses the appropriate display type (eg, bar versus line graph). Semi-structured nurse interviews emphasized the value of simplified reports and meaningful data. The mean (standard deviation [SD]) Health-ITUES score for the original report was 3.86 (0.19) and increased to 4.29 (0.11) for the revised report (Mann-Whitney U test, z = −12.25, P <0.001). Lessons learned from this study can inform report development for clinicians in implementation science.

Keywords: data visualization, evidence-based, fall prevention, health-ITUES, usability

INTRODUCTION

In today’s healthcare environment, data are continuously being collected at the patient, clinician, and organizational levels.1,2 Data visualization (DV) techniques struggle to keep pace with the rate of data collection.3,4 Data pertaining to patient risk status5,6 and outcome measures7 are constantly presented to clinicians. Previous research has focused on the use of technologies and dashboards8 to identify patterns in electronic health record (EHR) data,9 but there is a lack of research on best practices for DV of clinician adherence with quality improvement initiatives.

DV is defined by Stephen Few as “the graphical display of abstract information for 2 purposes: sense making and communication.”10 Effective data displays are crafted based on the message the creators intend to communicate and consideration of the best means to display variables.11,12 Poor-quality reports can lead clinicians to overlook patterns in performance and miss opportunities for improvement.13 Reports that leverage the innate human capacity for pattern recognition, among other best practices for data display, could be valuable to clinicians and prompt positive practice change.

Previous research shows that tailoring reports to the end users’ knowledge and skills can reduce the cognitive burden associated with report comprehension,14,15 which is often a barrier to the implementation of evidence-based practices.15–17 Furthermore, a systematic literature review by Wu et al concluded that future work was needed to create DV frameworks that can be applied broadly and validated in healthcare.7

To address this knowledge gap, this study used a literature review to consolidate best practices from cognitive science and computer science. Lessons learned were applied to data displays for clinician end users and validated with them. DV principles that emerged in the literature were applied to refine reports originally created to display nurse adherence with an evidence-based fall prevention program, Fall TIPS (Tailoring Interventions for Patient Safety). Fall TIPS demonstrated a 25% reduction in falls in a randomized controlled trial18 and has over a decade of evidence to support its use in acute care settings.19–24 Fall TIPS protocol adherence is measured using the Fall TIPS Audit Tool (FTAT); audit results are the basis for the Fall TIPS Monthly Reports (FTMR). The FTMR focuses on process measures rather than outcome measures because the outcome measures (falls and fall-related injuries) are rare events and often require additional time to collect, process, and share with staff. Furthermore, the need for FTMR improvement was emphasized by nurse feedback at practice committee meetings. The Institute for Healthcare Improvement’s Framework for Spread emphasizes the importance of continuous monitoring and feedback related to the implementation of new initiatives;25 thus, identifying optimal strategies for depicting adherence is in line with improving the quality of healthcare.

OBJECTIVE

The objectives of this study were: 1) to identify best practices for DV through a systematic literature review, 2) to apply these principles and collect semi-structured feedback from nurses to iteratively refine the FTMR, and 3) to evaluate FTMR usability. By harnessing best practice standards established in the literature as well as qualitative and quantitative feedback, we sought to establish guiding principles for creating reports for clinician use in implementation science projects and clinical practice. The methods used were tailored to these objectives.

MATERIALS AND METHODS

This study is a part of a 3-year project to evaluate the generalizability and spread of an evidence-based fall prevention toolkit: Fall TIPS.26 The protocol was approved by the Partners HealthCare human subjects’ committee. Participants included nurses working on general medical and surgical units at a large academic medical center located in the northeastern USA.

Literature review

With the assistance of a medical librarian, we searched the literature published between 1940 and 2019 to identify best practices for communication of quantitative data via visual display and principles for effective DV for clinicians (see Supplementary Material, Figure S1 for the list of databases, search terms, inclusion/exclusion criteria, and PRISMA diagram).27 Literature review results directed FTMR improvement.

Usability testing

The original FTMR (Figure 1) was created by the research team and used for 6 months in 1 large academic medical center. Qualitative and quantitative feedback from nurses was collected from April–July 2018 (original report) and August–September 2018 (revised report) and applied to improve the FTMR. The Research Computing Core (RCC) at our hospital modified the reports.

Figure 1. Original Fall TIPS Monthly Report (FTMR). This figure shows the original FTMR, prior to implementing any modifications. It was developed by the research team using Microsoft PowerPoint.

Qualitative data collection

Two authors adapted Few’s 6 requirements for effective data display11,28,29 into questions specific to the FTMR and used them to guide semi-structured nurse interviews in individual or group settings based on availability (Table 1). Participation was voluntary; nurses were not required to have a DV background to participate. Basic content analysis methods30 were used to interpret qualitative feedback using a 2-person consensus approach for identifying and organizing themes.

Table 1.

Few’s requirements for data displays adapted into a questionnaire to assess Fall TIPS Monthly Reports

Number  Question
1. Does the display clearly indicate how the values relate to one another?
  • How do the 3 audit questions relate?
  • How does your unit/service compare to others?
  • How does your unit compare to the aggregate data?
2. Does it make it easy to compare the quantities?
  • How easy is it to interpret the bar graph (number of audits)?
3. Are the ranked-order values easily recognizable?
  • Is it easy to see which unit is doing the best?
  • Is it easy to see which units are meeting the 5 audits/month target?
4. Is it clear how the information display should be used?
  • Are the takeaways clear?
  • Is it clear how this can be used to provide targeted feedback?

Nurses from medical, surgical, or combined medical-surgical units were interviewed; participants included Fall TIPS champions (FTCs), nurse leaders, and staff nurses. The FTCs completed the Fall TIPS audits for their respective units and disseminated the reports among staff. Feedback was collected from independent groups of nurses in 2 phases: once based on the original FTMR and once based on the revised FTMR. Interviews were semi-structured and not transcribed.

Abbreviations: FTC, Fall TIPS champion; FTMR, Fall TIPS Monthly Reports; TIPS, Tailoring Interventions for Patient Safety.

Quantitative data collection

Nurses completed a Health Information Technology Usability Evaluation Scale (Health-ITUES) survey. The 20-item Health-ITUES is a validated tool designed to be customizable for use outside its original context.31 Each item is based on task-specific concepts using the Technology Acceptance Model32: “Using [system] is useful in [task].” The Health-ITUES addresses 4 usability factors: quality of work life, perceived usefulness, perceived ease of use, and user control.32 Participants responded on a 5-point Likert scale from 1 (strongly disagree) to 5 (strongly agree); a Mann-Whitney U test was performed in SPSS to compare scores between the original and revised reports.33 Two researchers modified 15 items to ask about FTMR use and eliminated 5 items that were redundant or out of scope (see Supplementary Material, Table S2). Because the Health-ITUES was adapted for Fall TIPS, Cronbach’s alpha was calculated to reassess its reliability.
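The statistical analysis was performed in SPSS; purely as an illustration of the statistics named above, the minimal Python sketch below computes Cronbach’s alpha for a 15-item scale and a Mann-Whitney U test on per-respondent scores. The data and variable names are hypothetical placeholders, not study data.

```python
# Illustrative sketch only (the study used SPSS): Cronbach's alpha for the
# 15 adapted Health-ITUES items and a Mann-Whitney U test comparing groups.
# All values below are hypothetical placeholders, not study data.
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(0)
original_items = rng.integers(1, 6, size=(79, 15))  # 79 respondents x 15 Likert items (1-5)
revised_items = rng.integers(1, 6, size=(72, 15))   # 72 respondents x 15 Likert items (1-5)

def cronbach_alpha(items: np.ndarray) -> float:
    """alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances.sum() / total_variance)

# Per-respondent mean score on the 1-5 scale, compared across independent groups
original_scores = original_items.mean(axis=1)
revised_scores = revised_items.mean(axis=1)
u_stat, p_value = mannwhitneyu(original_scores, revised_scores, alternative="two-sided")

print(f"Cronbach's alpha (original) = {cronbach_alpha(original_items):.2f}")
print(f"Cronbach's alpha (revised)  = {cronbach_alpha(revised_items):.2f}")
print(f"Mann-Whitney U = {u_stat:.1f}, P = {p_value:.3f}")
```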

RESULTS

Literature review

After duplicates were removed, 956 articles were assessed for eligibility by title and abstract. Because 791 articles did not meet the established inclusion criteria, 165 were retained for full-text review. Two researchers identified 54 publications as relevant to the study (references 8,10–12,14,15,28,29,34–79; Supplementary Material, Figure S1). Lessons learned from the literature aligned with the qualitative themes (Table 2).

Table 2.

Changes to the Fall TIPS monthly report were rooted in the qualitative themes and lessons learned from the literature

Change to Fall TIPS Monthly Report | Theme | Subtheme | Principle from literature

1. Rotated the bars on the graph from vertical to horizontal and integrated the “goal” into the bar graph (denoted by a red line rather than a separate box on the report)
  Theme: Simplified Reports. Subtheme: Ease of Comprehension.
  Principle: The best data displays take advantage of one’s ability to visually process what is seen with limited thinking power, thereby reducing the cognitive burden involved in data interpretation.10,12,15,41,51,59,76 Color should be used conservatively.8,46,64,75

2. Ordered units from the highest to the lowest number of submitted audits
  Theme: Simplified Reports. Subtheme: Ease of Comprehension.
  Principle: Making the message stand out is vital to creating an effective visual display, which is easily perceived and remembered.15,41,51,62

3. Wrote out the full wording of the Fall TIPS Audit Tool questions
  Theme: Simplified Reports. Subtheme: Ease of Comprehension.
  Principle: In designing all components of a visual display, it is best practice to simplify while providing sufficient context to orient the viewer through legends, titles, and axis labels.60,73,77,80

4. Changed the term “aggregate” to “average”
  Theme: Simplified Reports. Subtheme: Optimizing Visualization.
  Principle: A data display is effective when it accurately communicates as much information to the audience as directly as possible.10,34,35,40,57

5. Incorporated “aggregate results” into the “Patient Engagement Audit Results” table
  Theme: Simplified Reports. Subtheme: Optimizing Visualization.
  Principle: Choosing the right visual display to represent data is vital in the clear communication of a message to the target audience.35,38,72

6. Added the “number of audits” to the “Patient Engagement Audits” table
  Themes: Simplified Reports; Meaningful Data. Subthemes: Optimizing Visualization; Leveraging Numeracy.
  Principles: Repetition is an important component of visual literacy,42,70 and interactions between different types of display reduce cognitive effort and foster easy interpretation.42,45 Data tables are particularly useful when trying to convey precise numbers or compare specific values.10,28,29,44,66,71 There is variability in numeracy and graph literacy among clinicians, but in today’s healthcare environment clinicians are more often exposed to numbers than graphs.14

7. Refined the criteria for calculating the top unit
  Theme: Meaningful Data. Subtheme: Ensuring Accuracy of Metrics.
  Principle: Clinicians are concerned with their performance with respect to goals, which need to be clearly communicated to them.16,17,68

8. Clarified target adherence
  Theme: Meaningful Data. Subtheme: Goal Clarification.

9. Eliminated the top champion metric
  Theme: Meaningful Data. Subtheme: Goal Clarification.
  Principle: Experts emphasize the importance of a high data–ink ratio, that is, ensuring that the ink on a report represents meaningful data.75,81

Usability testing results

Qualitative results

Staff interviews (original FTMR n = 79, revised FTMR n = 72, total n = 151) lasted 5–15 minutes depending on group size. Some nurses were interviewed in pre-existing practice committee meetings (original = 40, revised = 32), while others were interviewed individually or in groups of 2–3 nurses (original = 39, revised = 40). Interviews were conducted until saturation was reached. Two themes emerged emphasizing that clinicians prefer simplified reports (theme 1) that depict meaningful data (theme 2). Subthemes highlighted the importance of easy comprehension and optimizing visualization (theme 1) and the accuracy of the data, goal clarification, and numeracy (theme 2). When evaluating the revised FTMR (Figure 2), suggestions emerged related to optimizing visualization (minor changes related to wording and formatting). This implies that concerns related to FTMR were addressed by the changes made to the original report.

Figure 2. Revised Fall TIPS Monthly Report (FTMR). Revisions of the FTMR included: 1) rotating the bars on the graph from vertical to horizontal, 2) ordering the units based on those that submitted the highest number to lowest number of audits, 3) writing out the full wording of the Fall TIPS Audit Tool questions, 4) changing “aggregate” to “average,” 5) incorporating “aggregate results” into the “Patient Engagement Audit Results” table, 6) adding “number of audits” to the “Patient Engagement Audit Results” table, 7) refining the criteria for calculating the top unit, and 8) clearly communicating Fall TIPS Audit Tool target adherence.
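To illustrate revisions 1 and 2 and the goal line, the sketch below (using hypothetical unit names and audit counts, not study data) draws audits per unit as horizontal bars sorted from highest to lowest with matplotlib, with the 5-audits-per-month target shown as a red reference line rather than a separate box.

```python
# Illustrative sketch of the revised FTMR bar graph style: horizontal bars,
# units ordered by audit count, and the monthly target shown as a red line.
# Unit names and counts are hypothetical, not study data.
import matplotlib.pyplot as plt

audits_per_unit = {"Unit A": 12, "Unit B": 9, "Unit C": 7, "Unit D": 5, "Unit E": 3}
target = 5  # audits per month

# Sort units from the highest to the lowest number of submitted audits
units, counts = zip(*sorted(audits_per_unit.items(), key=lambda kv: kv[1], reverse=True))

fig, ax = plt.subplots(figsize=(6, 3))
ax.barh(units, counts, color="steelblue")
ax.axvline(target, color="red", linewidth=2, label=f"Target: {target} audits/month")
ax.invert_yaxis()  # place the highest-performing unit at the top
ax.set_xlabel("Number of audits submitted")
ax.set_title("Patient Engagement Audits by Unit")
ax.legend(loc="lower right")
fig.tight_layout()
plt.show()
```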

Quantitative results

A total of 151 nurses from medical and surgical units completed the Health-ITUES survey to evaluate the FTMR (original n = 79, revised n = 72). A reliability analysis using Cronbach’s alpha showed that the survey adapted for Fall TIPS use is reliable (original = 0.96, revised = 0.94). The mean (SD) score based on the survey was 3.86 (0.19) for the original FTMR and increased to 4.29 (0.11) for the revised FTMR (Mann-Whitney U test, z = −12.25, P <0.001). The mean score for all 15 items increased from the original to the revised FTMR (Figure 3; see Supplementary Material, Table S2 for item-level statistics).

Figure 3. Health-ITUES Scores by Factor for Original versus Revised Fall TIPS Monthly Reports. Improvements in Health-ITUES scores were observed across all 4 usability factors: quality of work life, perceived usefulness, perceived ease of use, and user control.

DISCUSSION

The changes made to improve the original FTMR emerged in the nurse interviews and were further supported by the literature. These modifications improved visualization but did not change the main metrics communicated in the reports (number of audits and FTAT adherence). DV principles that exist in cognitive science and computer science also apply to clinicians, but the main clinician-specific point that emerged was a strong preference for goal-related metrics. Nurses value reports in which information can be found quickly and the visualization techniques facilitate easy comparison between quantities. The changes made to the FTMR to promote ease of comprehension adhere to Kosslyn’s Psychological Principles of Effective Graphs38,80 and reduce cognitive workload. Simple changes, such as writing out the full questions, avoiding jargon and symbols, integrating information with graphs, and limiting the use of gridlines, also increase comprehension.38,42,54,72 Comprehension improves when the data representation helps viewers cluster information8,57,70 and recognize patterns. Improvements made to the Patient Engagement Audits table in the FTMR are in line with the idea that our visual processing capabilities allow us to better retain information that stands out from the norm. FTMR evaluation findings also highlight the benefit of communicating important metrics in 2 areas of a report. Reports that include both numerical and graphical displays are beneficial because they cater to the DV needs of clinicians with varying levels of comfort with numbers. Reports that leverage viewers’ innate ability for pattern recognition and encourage the eye to connect different components of the report are most effective.12,42,79

Nurses had positive perceptions of FTMR usefulness. Nurses were aware of how to use these reports in discussions with their staff to identify opportunities for improvement. However, the lack of a routine method for dissemination was a barrier to use. Even if validated reports exist, their impact is limited unless the report is widely viewed and used for continuous quality improvement.35 A system for dissemination is a crucial facilitator for report usefulness. To facilitate dissemination, we collaborated with RCC to automate FTMR generation.
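The automation itself was implemented by the RCC; as a conceptual sketch only, under assumed file and column names (not the RCC implementation), the following Python example aggregates hypothetical audit records by unit for one month and writes a simple adherence summary that could feed a monthly report.

```python
# Conceptual sketch of automated monthly report generation (not the RCC system).
# Assumes a hypothetical CSV of audit records with columns: unit, audit_date, adherent (0/1).
import pandas as pd

def build_monthly_summary(audit_csv: str, month: str) -> pd.DataFrame:
    """Summarize audit counts and adherence per unit for one month (eg, '2018-08')."""
    audits = pd.read_csv(audit_csv, parse_dates=["audit_date"])
    target_month = pd.Period(month, freq="M")
    this_month = audits[audits["audit_date"].dt.to_period("M") == target_month]
    summary = (
        this_month.groupby("unit")
        .agg(num_audits=("adherent", "size"), adherence_rate=("adherent", "mean"))
        .sort_values("num_audits", ascending=False)
    )
    summary["met_target"] = summary["num_audits"] >= 5  # 5 audits/month target
    return summary

if __name__ == "__main__":
    report = build_monthly_summary("fall_tips_audits.csv", "2018-08")
    report.to_csv("ftmr_summary_2018-08.csv")
    print(report)
```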

Following best practices for DV reduces cognitive workload.14,15 Given that modifications were made based on best practices identified in the literature and nurses communicated positive perceptions of usefulness, a reduction in cognitive workload is expected. Best practices include the focus on facilitating the ease of comprehension, clarifying goals, ensuring metrics accuracy, leveraging clinicians’ numeracy and graphical literacy, and optimizing visualization.

The clinician-specific best practices for DV identified in this work can be leveraged to create effective reports that communicate feedback on process measures to clinicians, thus potentially improving the quality of care.

Limitations

Limitations of this study include that the FTMR was evaluated with nurses at 1 academic medical center and participants had varying levels of Fall TIPS exposure. Interviews were not transcribed. Given the preexisting system for FTMR dissemination, it was not feasible to have a control group for the qualitative portion of this study. The cognitive workload associated with the FTMR was not directly measured in this study.

Future work

Recommendations for future iterations of the FTMR include displaying temporal trends. Nurses suggested adding a “days since last fall” metric to emphasize the correlation between Fall TIPS adherence and falls. Next steps also include using the questionnaire based on Few’s requirements and the Health-ITUES of Yen et al to evaluate FTMR perceptions and usability at other hospitals.

CONCLUSION

Through a literature review and qualitative and quantitative evaluation of the FTMR, best practices for DV were identified. The novelty of this work is that these best practices were validated with clinicians through refinement of the FTMR. Reports for clinicians need to be quick and easy to comprehend to be used effectively as a tool to disseminate feedback. These reports must contain accurate data and clarify goals. To maximize the benefit of the reports, a systematic approach to dissemination must be in place. If a report meets these criteria, it is more likely to reduce the attentive effort associated with understanding the report and to foster positive perceptions of usefulness among clinicians. The lessons learned can be applied to developing reports for continuous monitoring and feedback regarding the implementation progress of other programs for clinicians.

FUNDING

This work was supported by the Agency for Healthcare Research and Quality (AHRQ) Grant Number 1R18HS025128-01. The content is solely the authors’ responsibility and does not represent official AHRQ views.

AUTHOR CONTRIBUTIONS

SK, ZB, and PCD designed the study, interpreted the data, and drafted the manuscript. SK, ZB, MC, and MM acquired data. RN created the automated reports. All authors revised and approved the final manuscript.

Supplementary Material

ocz190_Supplementary_Files

ACKNOWLEDGMENTS

The authors would like to thank the clinicians who participated in the staff interviews and Health-ITUES surveys and Jacqueline Cellini (medical librarian at Countway Library of Medicine).

Conflict of Interest statement

None declared.

REFERENCES

  • 1. Goldenberg JN. The breadth and burden of data collection in clinical practice. Neurol Clin Pract 2016; 6 (1): 81–6. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 2. Murdoch TB, Detsky AS.. The inevitable application of big data to health care. JAMA 2013; 309 (13): 1351–2. [DOI] [PubMed] [Google Scholar]
  • 3. Caban JJ, Gotz D.. Visual analytics in healthcare–opportunities and research challenges. J Am Med Inform Assoc: JAMIA 2015; 22 (2): 260–2. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 4. Gotz D, Borland D.. Data-driven healthcare: challenges and opportunities for interactive visualization. IEEE Comput Grap Appl 2016; 36 (3): 90–6. [DOI] [PubMed] [Google Scholar]
  • 5. Klimov D, Shknevsky A, Shahar Y.. Exploration of patterns predicting renal damage in patients with diabetes type II using a visual temporal analysis laboratory. J Am Med Inform Assoc: JAMIA 2015; 22 (2): 275–89. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 6. Powsner SM, Tufte ER.. Graphical summary of patient status. Lancet 1994; 344 (8919): 386–9. [DOI] [PubMed] [Google Scholar]
  • 7. Wu DTY, Chen AT, Manning JD.. Evaluating visual analytics for health informatics applications: a systematic review from the American Medical Informatics Association Visual Analytics Working Group Task Force on Evaluation. J Am Med Inform Assoc 2019; 26 (4): 314–23. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 8. Ratwani RM, Trafton JG, Boehm-Davis DA.. Thinking graphically: connecting vision and cognition during graph comprehension. J Exp Psychol 2008; 14 (1): 36–49. [DOI] [PubMed] [Google Scholar]
  • 9. West VL, Borland D, Hammond WE.. Innovative information visualization of electronic health record data: a systematic review. J Am Med Inform Assoc: JAMIA 2015; 22 (2): 330–9. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 10. Few S. Data Visualization for Human Perception. Secondary Data Visualization for Human Perception; 2005. https://www.interaction-design.org/literature/book/the-encyclopedia-of-human-computer-interaction-2nd-ed/data-visualization-for-human-perception. Accessed June 21, 2019.
  • 11. Few S. Effectively Communicating Numbers: Selecting the Best Means and Manner of Display [White Paper]. Secondary Effectively Communicating Numbers: Selecting the Best means and manner of Display [White Paper]; 2005. http://www.perceptualedge.com/articles/Whitepapers/Communicating_Numbers.pdf. Accessed June 21, 2019.
  • 12. Cleveland W, McGill R.. Graphical perception: theory, experimentation, and application to the development of graphical methods. J Am Stat Assoc 1984; 79 (387): 531–54. [Google Scholar]
  • 13. Little K. Improving the Visual Display of Data. Improvement Stories Institute for Healthcare Improvement. Boston: Institute for Healthcare Improvement; 2019. http://www.ihi.org/resources/Pages/ImprovementStories/Improvingthevisualdisplayofdata.aspx. Accessed October 31, 2019.
  • 14. Dowding D, Merrill JA, Onorato N, Barrón Y, Rosati RJ, Russell D.. The impact of home care nurses' numeracy and graph literacy on comprehension of visual display information: implications for dashboard design. J Am Med Inform Assoc: JAMIA 2018; 25 (2): 175–82. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 15. Gilger M. Addressing Information Display Weaknesses for Situational Awareness. New Jersey: IEEE; 2006.
  • 16. Grol R, Grimshaw J.. From best evidence to best practice: effective implementation of change in patients' care. Lancet 2003; 362 (9391): 1225–30. [DOI] [PubMed] [Google Scholar]
  • 17. Oxman AD. Evidence-Based Practice in Primary Care. 2nd ed. London: BMJ Books; 2001. [Google Scholar]
  • 18. Dykes PC, Carroll DL, Hurley A, et al. Fall prevention in acute care hospitals: a randomized trial. JAMA 2010; 304 (17): 1912–8. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 19. Abbott PA, Foster J, Marin Hde F, Dykes PC.. Complexity and the science of implementation in health IT–knowledge gaps and future visions. Int J Med Inform 2014; 83 (7): e12–22. [DOI] [PubMed] [Google Scholar]
  • 20. Dykes PC, Carroll DL, Hurley A. Fall TIPS: strategies to promote adoption and use of a fall prevention toolkit. In: AMIA Annual Symposium Proceedings; November 14, 2009; San Francisco. [PMC free article] [PubMed]
  • 21. Dykes PC, Carroll DL, Hurley AC, Benoit A, Middleton B.. Why do patients in acute care hospitals fall? Can falls be prevented? J Nurs Adm 2009; 39 (6): 299–304. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 22. Dykes PC, Duckworth M, Cunningham S, et al. Pilot testing fall TIPS (Tailoring Interventions for Patient Safety): a patient-centered fall prevention toolkit. Jt Comm J Qual Patient Saf 2017; 43 (8): 403–13. [DOI] [PubMed] [Google Scholar]
  • 23. Dykes PC, Eh IC, Soukup JR, Chang F, Lipsitz S. A case control study to improve accuracy of an electronic fall prevention toolkit. In: AMIA Annual Symposium Proceedings. [PMC free article] [PubMed]
  • 24. Zuyev L, Benoit AN, Chang FY, Dykes PC.. Tailored prevention of inpatient falls: development and usability testing of the fall TIPS toolkit. Comput Inform Nurs: CIN 2011; 29 (2): 93–100. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 25. Massoud MR, Nielsen GA, Nolan K, Schall MW, Sevin C.. A Framework for Spread: From Local Improvements to System-Wide Change. Cambridge, MA; 2006. [Google Scholar]
  • 26. Dykes PC, Khasnabish S, Marketing CS. Fall TIPS website. Secondary Fall TIPS Website; 2018. www.falltips.org. Accessed June 21, 2019.
  • 27. Moher D, Liberati A, Tetzlaff J, Altman DG, Group P.. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. PLoS Med 2009; 6 (7): e1000097.. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 28. Rougier NP, Droettboom M, Bourne PE.. Ten simple rules for better figures. PLoS Comput Biol 2014; 10 (9): e1003833.. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 29. Few S. Designing effective tables and graphs. Secondary designing effective tables and graphs; 2004. http://www.perceptualedge.com/images/Effective_Chart_Design.pdf. Accessed June 21, 2019.
  • 30. Miles MB, Huberman AM.. Qualitative Data Analysis: An Expanded Sourcebook. 2nd ed Thousand Oaks, CA: Sage; 1994. [Google Scholar]
  • 31. Yen PY, Sousa KH, Bakken S.. Examining construct and predictive validity of the Health-IT Usability Evaluation Scale: confirmatory factor analysis and structural equation modeling results. J Am Med Inform Assoc: JAMIA 2014; 21 (e2): e241–8. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 32. Yen PY, Wantland D, Bakken S. Development of a customizable health IT usability evaluation scale. In: AMIA Annual Symposium Proceedings; November 13, 2010; Washington DC. [PMC free article] [PubMed]
  • 33. SPSS [computer program]. Version 24. Armonk, NY: IBM Corporation; 2016. [Google Scholar]
  • 34. Agrawala MD. Smart depiction for visual communication. IEEE Comput Grap Appl 2005; 25 (3): 20–1. [Google Scholar]
  • 35. Alverson CY, Yamamoto SH.. Educational decision making with visual data and graphical interpretation: assessing the effects of user preference and accuracy. Sage Open 2016; 6 (4): 215824401667829. [Google Scholar]
  • 36. Ancker JS, Senathirajah Y, Kukafka R, et al. Design features of graphs in health risk communication: a systematic review. J Am Med Inform Assoc 2006; 13 (6): 608–18. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 37. Arcia A, Woollen J, Bakken S.. A systematic method for exploring data attributes in preparation for designing tailored infographics of patient reported outcomes. EGEMS (Washington, DC) 2018; 6 (1): 2. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 38. Asada Y, Abel H, Skedgel C, Warner G.. On effective graphic communication of health inequality: considerations for health policy researchers. Milbank Q 2017; 95 (4): 801–35. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 39. Backonja U, Chi NC, Choi Y, et al. Visualization approaches to support healthy aging: a systematic review. JHI 2016; 23 (3): 600.. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 40. Bauer DT, Guerlain S, Brown PJ.. The design and evaluation of a graphical display for laboratory data. J Am Med Inform Assoc 2010; 17 (4): 416–24. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 41. Bliss M, Dalto J.. INFOGRAPHICS for safety communication. Prof Saf 2018; 63 (12): 16–7. [Google Scholar]
  • 42. Brasher PM, Brant RF.. Pictures worthy of a thousand words. Can J Anesth/J Can Anesth 2010; 57 (11): 961–5. [DOI] [PubMed] [Google Scholar]
  • 43. Brewer NT, Gilkey MB, Lillie SE, Hesse BW, Sheridan SL.. Tables or bar graphs? Presenting test results in electronic medical records. Med Decis Making 2012; 32 (4): 545–53. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 44. Chittaro L. Visualization of patient data at different temporal granularities on mobile devices. In: Proceedings of the Working Conference on Advanced Visual Interfaces; May 23–26, 2006; Venezia, Italy: ACM. [Google Scholar]
  • 45. Damman OC, De Jong A, Hibbard JH, Timmermans DR.. Making comparative performance information more comprehensible: an experimental evaluation of the impact of formats on consumer understanding. BMJ Qual Saf 2016; 25 (11): 860–9. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 46. Davis C, McNair A, Brigic A, et al. Optimising methods for communicating survival data to patients undergoing cancer surgery. Eur J Cancer 2010; 46 (18): 3192–9. [DOI] [PubMed] [Google Scholar]
  • 47. Dowding D, Randell R, Gardner P, et al. Dashboards for improving patient care: review of the literature. Int J Med Inform 2015; 84 (2): 87–100. [DOI] [PubMed] [Google Scholar]
  • 48. Dowding DW, Russell D, Jonas K, et al. Does level of numeracy and graph literacy impact comprehension of quality targets? Findings from a survey of home care nurses. In: AMIA Annual Symposium Proceedings; April 16, 2018; Washington DC 2017: 635–40. [PMC free article] [PubMed]
  • 49. Effken J, Loeb R, Johnson K, Johnson S, Reyna V.. Using cognitive work analysis to design clinical displays. Studies in Health Technology and Informatics 2001; 84 (Pt 1): 127–31. [PubMed] [Google Scholar]
  • 50. Few S. Visual Business Intelligence for Enlightening Analysis and Communication - Examples; 2004. http://www.perceptualedge.com/examples.php. Accessed June 21, 2019.
  • 51. Few S. Now You See It: Simple Visualization Techniques for Quantitative Analysis. Oakland, CA: Analytics Press; 2009. [Google Scholar]
  • 52. Few S. Why Do We Visualize Quantitative Data? Secondary Why Do We Visualize Quantitative Data? 2014. https://www.perceptualedge.com/blog/?p=1897. Accessed June 21, 2019.
  • 53. Gaissmaier W, Wegwarth O, Skopec D, Müller A-S, Broschinski S, Politi MC.. Numbers can be worth a thousand pictures: individual differences in understanding graphical and numerical representations of health-related information. Health Psychol 2012; 31 (3): 286–96. [DOI] [PubMed] [Google Scholar]
  • 54. Galesic M, Garcia-Retamero R.. Graph literacy: a cross-cultural comparison. Med Decis Mak 2011; 31 (3): 444–57. [DOI] [PubMed] [Google Scholar]
  • 55. Gardner SA. Telling your story: using dashboards and infographics for data visualization. Comput Libr 2016; 36 (3): 4–7. [Google Scholar]
  • 56. Gerteis M, Gerteis JS, Newman D, Koepke C.. Testing consumers' comprehension of quality measures using alternative reporting formats. Health Care Financ Rev 2007; 28 (3): 31–45. [PMC free article] [PubMed] [Google Scholar]
  • 57. Harold J, Lorenzoni I, Shipley TF, Coventry KR.. Cognitive and psychological science insights to improve climate change data visualization. Nature Clim Change 2016; 6 (12): 1080–9. [Google Scholar]
  • 58. Heer J, Bostock M.. Crowdsourcing graphical perception: using mechanical Turk to assess visualization design In: proceedings of the SIGCHI Conference on Human Factors in Computing Systems; April 10–15, 2010; Atlanta, Georgia: ACM. [Google Scholar]
  • 59. Heer J, Bostock M, Ogievetsky V.. A tour through the visualization zoo: a survey of powerful visualization techniques, from the obvious to the obscure. Commun ACM 2010; 8 (5): 59–67. [Google Scholar]
  • 60. Hegarty M. The cognitive science of visual-spatial displays: implications for design. Top Cogn Sci 2011; 3 (3): 446–74. [DOI] [PubMed] [Google Scholar]
  • 61. Hildon Z, Allwood D, Black N.. Impact of format and content of visual display of data on comprehension, choice and preference: a systematic review. Int J Qual Health Care 2012; 24 (1): 55–64. [DOI] [PubMed] [Google Scholar]
  • 62. Holmquist LE. Evaluating the comprehension of ambient displays In: CHI '04 Extended Abstracts on Human Factors in Computing Systems; April 24–29, 2004; Vienna, Austria: ACM. [Google Scholar]
  • 63. Jeffs L, Beswick S, Lo J, Lai Y, Chhun A, Campbell H.. Insights from staff nurses and managers on unit-specific nursing performance dashboards: a qualitative study. BMJ Qual Saf 2014; 23 (12): 1001–6. [DOI] [PubMed] [Google Scholar]
  • 64. Kang X. The effect of color on short-term memory in information visualization In: proceedings of the 9th International Symposium on Visual Information Communication and Interaction; September 24–26, 2016; Dallas, TX: ACM. [Google Scholar]
  • 65. Keller C, Kreuzmair C, Leins-Hess R, Siegrist M.. Numeric and graphic risk information processing of high and low numerates in the intuitive and deliberative decision modes: an eye-tracker study. Judgm Decis Mak 2014; 9 (5): 420–32. [Google Scholar]
  • 66. Kilmer L. More than just pretty pictures. Provider (Washington, D.C.) 2016; 42 (9): 434. [PubMed] [Google Scholar]
  • 67. Lopez KD, Wilkie DJ, Yao Y, et al. Nurses' numeracy and graphical literacy: informing studies of clinical decision support interfaces. J Nurs Care Qual 2016; 31 (2): 124–30. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 68. Maheshwari D, Janssen M.. Dashboards for supporting organizational development: principles for the design and development of public sector performance dashboards In: proceedings of the 8th International Conference on Theory and Practice of Electronic Governance; October 27–30, 2014; Guimaraes, Portugal: ACM. [Google Scholar]
  • 69. Martinez R, Ordunez P, Soliz PN, Ballesteros MF.. Data visualisation in surveillance for injury prevention and control: conceptual bases and case studies. Inj Prev 2016; 22 (Suppl 1): i27–33. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 70. Mixer SJ, McFarland MR, McInnis LA.. Visual literacy in the online environment. Nurs Clin North Am 2008; 43 (4): 575–82. [DOI] [PubMed] [Google Scholar]
  • 71. Monsen KA, Peterson JJ, Mathiason MA, et al. Data visualization techniques to showcase nursing care quality. Comput Inform Nurs: CIN 2015; 33 (10): 417–26. [DOI] [PubMed] [Google Scholar]
  • 72. O'Brien S, Lauer C.. Testing the susceptibility of users to deceptive data visualizations when paired with explanatory text In: proceedings of the 36th ACM International Conference on the Design of Communication; August 3–5, 2018; Milwaukee, WI: ACM. [Google Scholar]
  • 73. Puhan MA, ter Riet G, Eichler K, Steurer J, Bachmann LM.. More medical journals should inform their contributors about three key principles of graph construction. J Clin Epidemiol 2006; 59 (10): 1017.e1–22. [DOI] [PubMed] [Google Scholar]
  • 74. Shah P, Freedman EG.. Bar and line graph comprehension: an interaction of top-down and bottom-up processes. Top Cogn Sci 2011; 3 (3): 560–78. [DOI] [PubMed] [Google Scholar]
  • 75. Shah P, Hoeffner J.. Review of graph comprehension research: Implications for instruction. Educ Psychol Rev 2002; 14 (1): 47–69. [Google Scholar]
  • 76. Shneiderman B. The eyes have it: a task by data type taxonomy for information visualizations. In: Proceedings of the 1996 IEEE Symposium on Visual Languages; September 3–6, 1996.
  • 77. Sperandei S. The pits and falls of graphical presentation. Biochem Med 2014; 24 (3): 311–20. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 78. Vogel SE, Keller C, Koschutnig K, et al. The neural correlates of health risk perception in individuals with low and high numeracy. ZDM Math Educ 2016; 48 (3): 337–50. [Google Scholar]
  • 79. Ware C. Information Visualization: Perception for Design. 2nd ed. San Francisco: Elsevier; 2004. [Google Scholar]
  • 80. Kosslyn SM. Graph Design for the Eye and Mind. New York: Oxford University Press; 2006. [Google Scholar]
  • 81. Tufte ER. The Visual Display of Quantitative Information. Cheshire, CT: Graphics Press; 1983. [Google Scholar]
