Abstract
From 2000 to 2020, more than 28,000 North Carolina (NC) residents died from drug overdose. In response, the NC Department of Health and Human Services worked with community partners to develop an Opioid and Substance Use Action Plan (OSUAP), now in its third iteration. The NC OSUAP data dashboard brings together data on 15 public health indicators and 16 local actions across 8 strategies. We share innovations in design, data structures, user tasks, and visual elements over five years of dashboard development.
Keywords: Data visualization, public health surveillance, public health informatics, policy
Introduction
From 2000 to 2020, more than 28,000 North Carolina (NC) residents died from drug overdose. The NC Department of Health and Human Services (NC DHHS) developed an Opioid and Substance Use Action Plan (OSUAP) with community and state partners, first released in June 2017 and updated twice since (see Supplement: OSUAP 3.0 Slide Deck). A public data dashboard tracks plan progress.
In recent years, many states have released overdose data portals [1-3]. Informed by five years of feedback and nearly 100,000 visits, the NC portal now includes innovations that few other overdose dashboards employ, such as tracking local policy adoption.
Our techniques, data, and metadata structures may be useful to other agencies combining disparate data sources, including policy status, and visualizing them together. Design elements, diagrams, and tools are shared as online supplements.
Intervention
County-level metrics comprised 15 overdose-related indicators, such as overdose deaths, overdose emergency department visits, and the percent of overdoses involving illicit substances like fentanyl, as well as social drivers like the percent of unemployed adults and calls for housing and homelessness assistance. County-level program data included the local implementation status of 16 overdose prevention local actions, like prescription drug drop boxes and the distribution of fentanyl test strips, naloxone, and sterile syringes. These metrics and program implementation status indicators were organized under an 8-strategy framework (see Figure 1 and online supplements: Technical Notes, Local Actions). These data were presented in an interactive dashboard across ten pages designed for specific user tasks.
Figure 1. Strategy, Indicator, and Local Action Matrix for NC Opioid and Substance Use Action Plan 3.0.
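The matrix organization above can be sketched as a simple nested mapping from strategy to its tracked indicators and local actions. This is a hypothetical illustration: the strategy names and entries below are examples drawn from the text, not the complete OSUAP 3.0 framework.

```python
# Illustrative sketch of the strategy -> indicator / local-action matrix.
# Names are examples from the text, not the actual OSUAP 3.0 entries.
framework = {
    "Prevention": {
        "indicators": ["overdose deaths", "overdose ED visits"],
        "local_actions": ["prescription drug drop boxes"],
    },
    "Harm Reduction": {
        "indicators": ["% overdoses involving illicit substances"],
        "local_actions": [
            "fentanyl test strip distribution",
            "naloxone distribution",
            "sterile syringe distribution",
        ],
    },
}

def actions_for(strategy: str) -> list[str]:
    """Return the local actions tracked under a given strategy."""
    return framework[strategy]["local_actions"]
```

A structure like this lets dashboard pages enumerate indicators and actions per strategy from one source of truth rather than hard-coding each page.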
Implementation
Data collection and dashboard development were led by the Injury & Violence Prevention Branch (IVPB) within the NC DHHS Division of Public Health (DPH). Other NC DHHS divisions and branches and community organizations were essential data partners, including the NC Division of Mental Health, United Way 211, and the NC Harm Reduction Coalition (see supplemental technical notes for data owners).
The data dashboard project was led by two staff, one from state and one from academic public health, working on the project part-time among their other priorities. Their collaboration leaned on shared experiences across epidemiology, data science, informatics, database design, and data visualization. This pair coordinated individual data stewards to maintain the metric data and local policy table.
A Secure File Transfer Protocol (SFTP) site centralized data collection for most data partners, replacing earlier disjointed email chains with (public) data attachments. Dashboard designs were workshopped on whiteboards and iterated using LucidChart (see Supplement: OSUAP-OAP Whiteboards, Lucid Diagrams).
The frontend transitioned from R Shiny (see Supplements: OAP 1.0 RShiny Code; OAP 1.0 Screenshot) to Tableau (see Supplement: OSUAP 3.0 Screenshots), trading custom functionality for wider familiarity. R was retained for data harmonization due to its efficient data structures and spatial analysis capacities (see Supplement: OSUAP 3.0 R Code). A county-to-region crosswalk table programmatically regionalizes indicators and builds custom region polygons. Rates based on small counts were smoothed in R for improved presentation. Data suppression masked small counts for some datasets while still enabling regional trend tracking.
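The crosswalk-driven regionalization described above can be illustrated with a minimal sketch: sum county counts and populations up to regions via a lookup table, then compute regional rates. The authors did this in R; the Python sketch below is an assumption-laden analogue, and the county names, region labels, counts, and populations are invented for illustration.

```python
# Minimal sketch of crosswalk-driven regionalization (illustrative data).
crosswalk = {"Wake": "Region 1", "Durham": "Region 1", "Mecklenburg": "Region 2"}
county_counts = {"Wake": 4, "Durham": 3, "Mecklenburg": 8}          # e.g., monthly counts
county_pop = {"Wake": 1_100_000, "Durham": 320_000, "Mecklenburg": 1_100_000}

def regionalize(counts, pop, xwalk):
    """Sum county counts and populations to regions, then compute
    rates per 100,000 residents."""
    agg = {}
    for county, n in counts.items():
        region = xwalk[county]
        c, p = agg.get(region, (0, 0))
        agg[region] = (c + n, p + pop[county])
    return {r: 100_000 * c / p for r, (c, p) in agg.items()}
```

Aggregating counts before computing rates is also what allows small, suppressed county counts to still contribute to an unsuppressed regional trend.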
We used common language wherever possible, leaving jargon and details to a technical notes appendix. An executive summary narrative was human-authored (not programmatically generated), then programmatically refreshed and integrated into the dashboard quarterly. Major updates were announced with DHHS press releases. Recorded user demonstrations were embedded in the dashboard and given live during statewide gatherings.
Place, Time, and Persons
Monthly indicator tracking (of overdose mortality) began in 2000; some indicators begin later or are updated less frequently because their data sources were either not yet available or not available at monthly frequency. To make these differences in data availability and lag times visible, all graph x (time) axes begin in 2000. Year-to-date values were calculated from monthly data when available, and all data were also commonly shared as annual counts and rates. Details of data lags and harmonization efforts were documented in the technical notes (see Supplement: Technical Notes).
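A year-to-date rollup from monthly counts, as described above, can be sketched in a few lines. This is an illustrative Python sketch (the project's pipeline used R), assuming input rows of `(year, month, count)` sorted by date; the field layout is an assumption, not the actual data format.

```python
def year_to_date(monthly):
    """Given [(year, month, count), ...] sorted chronologically, return
    [(year, month, ytd_count), ...], restarting the sum each January."""
    out, running, cur_year = [], 0, None
    for year, month, count in monthly:
        if year != cur_year:          # new calendar year: reset the sum
            running, cur_year = 0, year
        running += count
        out.append((year, month, running))
    return out
```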
Residence-based definitions were used, e.g., emergency department visits were constrained to NC residents and county assignments made by patient residence, not emergency department location.
Drug overdose death rates were stratified by combined race-ethnicity groups and presented as five-year rates, demonstrating disparities in both rate magnitude and trends.
Purpose
The dashboard was primarily designed for the state and local health departments, community groups, nonprofits, coalitions, and policy-makers tracking and acting to reduce overdoses in NC. Secondary audiences were media, the general public, and researchers.
These user tasks were prioritized and often common across these audiences: tracking trends and severity of the overdose epidemic within a jurisdiction across many metrics, understanding which local responses were being employed, and comparing trends and local responses between jurisdictions.
The initial plan included state actions, like Medicaid expansion, which could not be locally implemented. Locals requested tracking and integration of local actions, including naloxone distribution, syringe services programs, and pre-arrest diversion programs (see Figure 1 for full list).
Evaluation
Feedback has been constructive and positive. Current, local data on overdose metrics and response strategies meet the call for timely, granular, actionable public health information [4]. Users provided feedback during prototyping, regular meetings, and directly by email. Evaluation drove innovations designed to reduce confusion and meet unanticipated user needs.
Visualization feedback included a preference for annual and year-to-date indicators over quarterly indicators and a need for printable formats. Phone-capable visualizations received early praise and were improved upon.
Indicator feedback made measures easier to comprehend. Measures were made more person-centric, e.g., rates of people prescribed opioids instead of rates of pill prescriptions. Complex measures based on multiple provider episodes and morphine milligram equivalents were dropped (though they could still be obtained from the original data provider). Opioid-specific definitions were expanded to all-drug overdoses. Overdose measures were joined by social drivers of health, including housing and unemployment metrics, drawn from data sources like United Way 211 calls for government resources and the US Bureau of Labor Statistics (BLS).
New visualizations compared one location to peer counties, region, or the state. Dashboard and data models were redesigned to work for common multi-county regions.
The portal includes overdose disparities in death rates, but many other indicators have uncommunicated disparities. Demographic groups are often dataset-specific, requiring metadata efforts to track and harmonize them. Such efforts may reveal disparities that are currently invisible (e.g., without data on sexual orientation or housing status, those disparities cannot be calculated). Race-ethnicity specifically, and data generation processes in general, are socially constructed and change over time; harmonization takes ongoing vigilance.
Because regular feedback and presentations provided an abundance of improvement ideas, formal feedback and dashboard scoring tools were not used, though they may be useful in future iterations [5].
Sustainability
Informatics infrastructure [6] is still a common obstacle in sustainable dashboard development. Unlike many states, the US DHHS, and the CDC [7], NC does not yet have an open data portal for public health data; we share underlying data using a little-documented Tableau CSV export feature [8] and manually post data updates monthly.
A new indicator data format with separate metadata proved key to efficient updates. Metadata was used in data processing and in visual presentation. Metadata tables were not completely normalized; the strategy-indicator row format balanced ease of human maintenance with machine readability.
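The strategy-indicator row format described above can be illustrated as a denormalized metadata table that drives both processing (e.g., suppression rules) and presentation (labels, units). The column names and threshold values below are assumptions for illustration, not the actual schema.

```python
# Illustrative strategy-indicator metadata rows (column names assumed).
# One human-maintainable row per indicator; machine-readable enough to
# drive both data processing and visual presentation.
metadata = [
    {"strategy": "Prevention", "indicator": "overdose_deaths",
     "label": "Drug Overdose Deaths", "unit": "rate per 100,000",
     "suppress_below": 10, "update_frequency": "monthly"},
    {"strategy": "Harm Reduction", "indicator": "naloxone_kits",
     "label": "Naloxone Kits Distributed", "unit": "count",
     "suppress_below": 0, "update_frequency": "quarterly"},
]

def display_value(indicator, value, meta=metadata):
    """Apply the suppression and labeling rules from the metadata row."""
    row = next(m for m in meta if m["indicator"] == indicator)
    if value < row["suppress_below"]:
        return f"{row['label']}: suppressed"
    return f"{row['label']}: {value} ({row['unit']})"
```

Keeping rules like `suppress_below` in metadata rather than in per-indicator code is one way such a format makes updates efficient: adding an indicator becomes adding a row.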
Data governance rules [9] can break down single-data-source silo walls to enable accessible, multi-data-source landscapes. Holistic rulesets give response teams better context within which to act. Joining indicator data with current local policy status, as we have, enables decision-makers to select remaining actions from convenient policy menus.
Each of over a dozen data silos is centrally harmonized after being prepared by its own team, processes, and data platforms. Only some of these processes are scriptable and automatable. Putting automation tools, not just statistical scripting, in the hands of epidemiologists (i.e., SAS and R tasks running unattended on a schedule) enables more timely dashboards. In our case, many state data partners lack the infrastructure to schedule and automate SFTP data pushes. Given the breadth of public health topics and reports, there may be hundreds or even thousands of data updates still unscheduled annually. Automation lets epidemiologists focus on quality assurance and situational awareness rather than pressing buttons every day, week, month, and year.
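One common pattern for the unattended scheduled runs described above is a thin wrapper script that a scheduler (cron, or Windows Task Scheduler) invokes, which in turn runs the statistical job non-interactively and logs the outcome. The sketch below is hypothetical: the script names, paths, and crontab entry are assumptions, not the project's actual setup.

```python
# Hypothetical wrapper for unattended scheduled runs: a scheduler invokes
# this script, which runs an R job non-interactively and logs the result.
import datetime
import subprocess
import sys

def build_command(script_path):
    """Build a non-interactive R invocation suitable for scheduled runs."""
    return ["Rscript", "--vanilla", script_path]

def run_job(script_path, log=sys.stderr):
    """Run the job, log start time and exit status, return success."""
    cmd = build_command(script_path)
    started = datetime.datetime.now().isoformat(timespec="seconds")
    result = subprocess.run(cmd, capture_output=True, text=True)
    print(f"{started} exit={result.returncode} cmd={' '.join(cmd)}", file=log)
    return result.returncode == 0

# Example crontab entry (assumed paths): run monthly on the 1st at 06:00
#   0 6 1 * * /usr/bin/python3 /opt/dashboard/run_job.py harmonize_indicators.R
```

The logging step is what shifts the epidemiologist's role from button pressing to reviewing exit statuses and output quality after each scheduled run.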
Harmonized indicator and metadata structures enable faster, less error-prone dashboard design and maintenance. Sharing pre-aggregated indicator data is easier than providing separate access to original data sources to reproduce common indicators externally. We have expanded this indicator harmonization effort across more injury content areas and know of similar efforts across public health topics.
Supplementary Material
Public Health Significance.
Our portal has been used by state, local, and community decision-makers to understand and respond to the evolving overdose epidemic, submit grants using local data, and strategize how to spend incoming opioid settlement funds. Design elements and lessons learned have permeated our other data dashboards, including dashboards on alcohol, motor vehicle crashes, and violent death. We hope to continue to better respond to the data needs of the overdose epidemic with subsequent versions.
Implications for Policy and Practice.
Data dashboards (including those tracking overdoses) designed for public health situational awareness can track both traditional metrics and the local status of recommended policy implementation.
Sustainable dashboard maintenance requires putting scheduled automation, not just scripting, tools in the hands of practicing epidemiologists.
Harmonizing the data structure of metrics simplifies subsequent steps of data processing pipelines, easing visualization of metrics drawn from disparate data silos.
Acknowledgments
This work was supported by an award from the Centers for Disease Control and Prevention, National Center for Injury Prevention and Control to the North Carolina Division of Public Health (Overdose Data to Action, cooperative agreement #5NU17CE925024-02-00). Authors from the University of North Carolina at Chapel Hill were funded through a subcontract under this grant (contract #5118396). The contents of this manuscript are those of the authors and do not necessarily represent the official views of, nor an endorsement, by NC DHHS, CDC/HHS, or the U.S. Government. Authors have no other acknowledgements or conflicts of interest to disclose.
References
1. Marshall BDL, Yedinak JL, Goyer J, Green TC, Koziol JA, Alexander-Scott N. Development of a Statewide, Publicly Accessible Drug Overdose Surveillance and Information System. Am J Public Health. 2017;107(11):1760–1763. doi:10.2105/AJPH.2017.304007
2. Goldstick J, Ballesteros A, Flannagan C, Roche J, Schmidt C, Cunningham RM. Michigan system for opioid overdose surveillance. Inj Prev. 2021;27(5):500–505. doi:10.1136/injuryprev-2020-043882
3. Anderson J, Demeter N, Pasquires M, Wirtz S. Using the CA Opioid Overdose Surveillance Dashboard to track opioid overdose deaths. OJPHI. 2019;11(1). doi:10.5210/ojphi.v11i1.9938
4. Wang YC, DeSalvo K. Timely, Granular, and Actionable: Informatics in the Public Health 3.0 Era. Am J Public Health. 2018;108(7):930–934. doi:10.2105/AJPH.2018.304406
5. Ising A, Waller A, Frerichs L. Evaluation of an Emergency Department Visit Data Mental Health Dashboard. J Public Health Manag Pract. 2023;29(3):369–376. doi:10.1097/PHH.0000000000001727
6. Fliss MD, Cox ME, Dorris SW, Austin AE. Timely Overdose Death Reporting Is Challenging but We Must Do Better. Am J Public Health. 2021;111(7):3. doi:10.2105/AJPH.2021.306332
7. Centers for Disease Control and Prevention. Data | Centers for Disease Control and Prevention. Accessed July 8, 2022. https://data.cdc.gov/
8. Kriebel A. The Greatest Tableau Tip EVER: Exporting Made Simple! Published August 14, 2018. Accessed July 8, 2022. https://www.vizwiz.com/2014/03/the-greatest-tableau-tip-ever-exporting.html
9. Proescholdbell S, Geary S, Tenenbaum JD. Data Governance and the Need for Organization-Wide Guidance to Enable and Facilitate Data Sharing: Lessons Learned From North Carolina. J Public Health Manag Pract. 2022; published ahead of print. doi:10.1097/PHH.0000000000001553