Hawai'i Journal of Medicine & Public Health
2018 Oct;77(10):246–250.

Assessing the Accuracy of Physician Self-disclosed PID Reporting: A Comparison of Data from a Physician Survey and Actual PID Case Reports from a State Surveillance System

Misty Y Pacheco, Alan R Katz
PMCID: PMC6176267  PMID: 30324002

Abstract

Pelvic inflammatory disease (PID) is a state-mandated notifiable disease in Hawai‘i. A survey assessing PID reporting to the Hawai‘i Department of Health (HDOH) PID surveillance system was administered to physicians in Hawai‘i in April 2012. To measure the accuracy of self-disclosed PID reporting, data from the survey were compared to HDOH PID surveillance system case reports. Concordance between the two data sources was assessed using Cohen's kappa statistic. We first linked data by physician name. An adjusted kappa was also calculated to minimize prevalence and bias effects. A second analysis linked data according to physician name or practice setting. In the name-based analysis, the HDOH PID surveillance database successfully matched only 10 of 118 physicians (8.5%) who self-disclosed reporting a PID case. Only “slight agreement” (κ=0.09, 95% confidence interval [CI]: 0.02–0.16) was demonstrated between the two databases. The prevalence-adjusted, bias-adjusted kappa demonstrated “moderate agreement” (κ=0.53, 95% CI: 0.45–0.60). In the second (name- or practice setting-based) analysis, 77 physicians with linkages were found in the HDOH surveillance database, reflecting “moderate agreement” (κ=0.52, 95% CI: 0.43–0.61). Our findings provide evidence that individual physicians are submerging their case reports into group practice/HMO aggregate reports rather than reporting individually as legally mandated, thereby compromising the quality of PID surveillance.

Keywords: Concordance, Pelvic inflammatory disease, surveillance

Introduction

Pelvic Inflammatory Disease (PID) is a spectrum of inflammatory disorders of the upper female reproductive tract. Many cases are related to sexually transmitted bacteria, most notably Neisseria gonorrhoeae and Chlamydia trachomatis. These bacteria can move from the cervix to the uterus, fallopian tubes, and ovaries.1 Serious consequences, such as chronic pelvic pain, ectopic pregnancy, and infertility, can occur. Prompt detection of PID and treatment with antibiotics are important to prevent further damage. However, early detection is difficult because symptoms of PID are often mild, vague, or unrecognized, and any damage that has already occurred is irreversible.1

Sexually transmitted disease (STD)-related infertility is a major health issue in the United States (US). In 1998, the Centers for Disease Control and Prevention (CDC) collaborated with the Office of Population Affairs of the Department of Health and Human Services to create the Infertility Prevention Project (IPP), an effort to prevent STD-related infertility.2 Infertility itself is difficult to diagnose and track on a population level. “PID is a more proximal outcome associated with STD-related infertility, making it a more suitable marker in measuring successes in reducing infertility.”3 Hence, accurate surveillance of PID occurrence is an important public health objective. The CDC estimates that approximately one million women in the United States are diagnosed with PID annually.4 Although PID is a serious disorder, it is “notifiable” in only 19 states, as well as the District of Columbia and Puerto Rico. These states and territories legally require reporting of PID diagnoses; however, reporting is not consistent or complete.5 PID is a state-mandated notifiable disease in Hawai‘i. The Hawai‘i Department of Health (HDOH) has collected PID case reports as a component of its infectious disease surveillance activities since the late 1980s in conjunction with the CDC's IPP, and PID became an officially notifiable disease in 2001. Data obtained from surveillance are disseminated to public health entities to plan and evaluate programs, develop policy, and appropriately allocate resources. More specifically, a surveillance system is important to describe trends and define the natural history of a disease, detect epidemics, reveal disease occurrence-pattern details, track changes in health practices, and evaluate control and prevention measures. The HDOH administers a “passive” surveillance system, which places the responsibility for reporting PID on the diagnosing physician.
Hawai‘i physicians who practice in inpatient settings (hospitals, emergency departments), health maintenance organizations (HMOs), or physician groups can submit their case report forms under the facility or physician group/HMO. A physician can either complete the case report form themselves or have another designated person complete and submit the form on their behalf. However, the state statute asks for the diagnosing physician's name on the case report form. Therefore, if the form is not submitted under the diagnosing physician's name, it will be found in the database but without the diagnosing physician's name.

A study was conducted to analyze the severity of the problem of under-reporting of PID in Hawai‘i.6 Data on PID hospitalizations from 2007–2010 were extracted from the Hawai‘i Health Information Corporation (HHIC) database (using International Classification of Diseases, 9th revision, Clinical Modification [ICD-9] codes to identify PID). Diagnosed PID hospitalized cases were identified and compared to data from the HDOH PID surveillance system for the same time period. From 2007–2010, a total of 240 unique cases of PID were reported through the HDOH PID surveillance system. During the same period, 828 unique cases of PID were diagnosed in Hawai‘i hospitals. This analysis confirmed that PID is under-reported in the state of Hawai‘i.

To further assess threats to the accuracy of PID surveillance in Hawai‘i, a survey was administered to physicians in Hawai‘i.8 One of the questions asked whether they had reported a case of PID in the previous 11 years (since 2000). Research on the validity of self-reported data from surveys suggests that misreporting is a common source of error, especially when questions are sensitive in nature.7 Question sensitivity is a concern when disclosure could be threatening or is socially undesirable. Threats of disclosure refer to concern about one's answer becoming known to a third party and/or leading to negative consequences.7 Answers given about an attitude or behavior that deviates from the norm are considered to be socially undesirable.7 Since PID reporting is mandated in the State of Hawai‘i, questions relating to PID reporting may be perceived as sensitive. A threat of disclosure exists because if a PID case is not reported, the State of Hawai‘i could impose a fine on the physician. In addition, abiding by the law and reporting disease, which benefits the health of the public, could very well be considered a social norm. A strategy used by researchers to address the accuracy of self-reported data involving sensitive survey questions is to triangulate the self-reported data with data from another source, such as a database, register, or administrative data set.8 Data from these sources provide a validity check for the self-reported data. A study from Quebec, Canada, examined the validity of survey data on the use of mental health services, since the topic is sensitive.8 Data from a community health survey were linked to a government-managed health services register. The study revealed that 75% of people did not report their use of mental health services. The authors concluded that social desirability was a factor in the observed discordance between the two data sources.
Bertalli et al (2011) linked electronic records from a national blood donation service to self-reported donation history to examine the agreement between the two. There was 87% concordance between the data sources and a high level of agreement (kappa statistic: 0.74).9 Similar to reporting diseases for public health benefit, giving blood to possibly save a life is socially desirable.

The main objective of this study was to examine the accuracy of physicians' self-reported PID data by measuring the concordance between physicians' self-reported data in a survey and the HDOH PID surveillance data. By assessing the accuracy of self-reported data against the surveillance system, we could also assess adherence to the state reporting regulations. This is important because it could provide further insight into what factors may be threatening the accuracy of PID case reporting. Two separate analyses were done in this study. In the first analysis, data linkage was by individual physician name. To address the issue of physicians submitting case reports by facility/provider group/HMO, a second analysis was also done, which linked data by physician name or practice setting.

Methods

Analysis I

Survey Study:

A PID-reporting survey developed by the San Francisco Department of Public Health was modified for this study.10 The revised one-page survey was piloted in a convenience sample of five physicians practicing family medicine, obstetrics and gynecology, and internal medicine to ensure clarity and validity. The final survey took approximately 5–10 minutes to complete.

The HDOH uses a list compiled by the Hawai‘i Department of Commerce and Consumer Affairs (DCCA), comprising physicians from specific specialties, to conduct mailings. Important notifications on disease outbreaks or updated treatment guidelines are sent to physicians on this list.

For this study, we used the DCCA list, which included all licensed obstetrician/gynecologists, internal medicine physicians, family practice physicians, pediatricians, and emergency medicine physicians in the state of Hawai‘i with a local Hawai‘i address (N=1,202).

A packet, which included the survey, a stamped addressed return envelope, and a cover letter endorsing the study, was mailed to each physician in April 2012. The endorsement letter was signed by HDOH and the presidents of the Hawai‘i chapters of the targeted medical specialty organizations (American College of Obstetricians and Gynecologists, American Academy of Family Physicians, and American Academy of Pediatrics). The letter also stated that their confidential and voluntary response was very important to the scientific validity of this study. One week later, a reminder/thank-you postcard was mailed to each physician. Non-respondents were sent a second packet two weeks after the postcard.

The University of Hawai‘i Committee on Human Subjects as well as the HDOH Institutional Review Board (IRB) approved this research.

The survey asked physicians if they reported any cases of PID to the HDOH in the previous 11 years (January 1, 2000 through December 31, 2011), and if so, they were asked to give the number of diagnosed cases (they could approximate if unsure). One of the demographic questions on the survey asked physicians to identify their practice setting. The practice settings were categorized into inpatient (private hospital, public hospital), HMO/outpatient (private practice, community health center, HMO), and other (military, other).

HDOH PID Surveillance System Linkage:

Physicians' self-reported information was compared to HDOH PID surveillance data over the same time period (2000 through 2011). According to Hawai‘i State law, once a physician diagnoses a case of PID, they have three business days to complete the case report form and submit it to the HDOH via phone, mail, or fax. The reporting form is available online at the HDOH website or can be ordered (at no charge) by fax. Once the form is received, the information from the form is entered into a database.11

For confidentiality, HDOH linked the physician names from the survey to the surveillance database. Each physician was assigned a unique ID number. Each physician who self-reported that he/she had reported a case of PID from 2000 through 2011 was compared with the surveillance database to determine concordance.

Data Analysis:

Analysis was conducted using StatXact version 4.0.1 (Cytel Software, Cambridge, MA). Concordance between the two data sources was assessed in 2 x 2 contingency tables using Cohen's kappa (κ) statistic. The κ statistic indicates the level of agreement between two sources beyond that attributable to chance alone.12

The magnitude of the kappa is affected by a prevalence or bias effect. The prevalence index is reflected in the difference between the cells of agreement in the 2 x 2 contingency table (the “a” and “d” cells) [See Tables 1–3]. If the prevalence index is high, chance agreement is high, resulting in a low kappa. The extent to which two sources disagree on the proportion of cases (positive or negative) is reflected in the difference between the cells of disagreement, “b” and “c,” and is called the bias effect. When kappa is small, the effect of bias is greater.13 To determine whether any prevalence or bias effects exist, the prevalence index and bias index were calculated by taking the absolute value of the difference of the paired cells over “n.” The adjusted kappa was calculated and is presented alongside the obtained value of kappa to show the possible effects of prevalence and bias. The adjusted kappa is obtained by computing the average of the two concordant cells and the average of the two discordant cells of the 2 x 2 contingency table, then substituting those averages for the actual values in the respective cells.13 The kappa coefficient that is then calculated with these new values is called the PABAK (prevalence-adjusted, bias-adjusted kappa).
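As an illustration of these calculations (a sketch, not the StatXact procedure actually used), all four statistics can be computed directly from the cells a, b, c, d of the 2 x 2 table; the counts used here are the name-based counts from Table 1.

```python
def agreement_stats(a, b, c, d):
    """Cohen's kappa, prevalence index, bias index, and PABAK
    for a 2 x 2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    po = (a + d) / n                                     # observed agreement
    # chance agreement expected from the marginal totals
    pe = ((a + b) * (a + c) + (c + d) * (b + d)) / n**2
    kappa = (po - pe) / (1 - pe)                         # Cohen's kappa
    prevalence_index = abs(a - d) / n
    bias_index = abs(b - c) / n
    # averaging the concordant and discordant cells makes pe = 0.5,
    # so PABAK simplifies to 2*po - 1
    pabak = 2 * po - 1
    return kappa, prevalence_index, bias_index, pabak

# Table 1 counts: a=10, b=108, c=7, d=361
kappa, prev, bias, pabak = agreement_stats(10, 108, 7, 361)
print(round(kappa, 2), round(prev, 2), round(bias, 2), round(pabak, 2))
# 0.09 0.72 0.21 0.53
```

These values reproduce the unadjusted kappa, prevalence index, bias index, and PABAK reported in the Results for analysis I.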

Table 1.

Name-based PID Case Reports Comparing Physician Self-report with Hawai‘i Department of Health (HDOH) Database (κ= 0.09, 95% confidence interval: 0.02–0.16, P<.001)

                         HDOH Database
                     Found    Not found    Total
Self-report
  Reported              10       108        118
  Did not report         7       361        368
Total                   17       469        486
Table 3.

Name-based or Practice Setting-based PID Case Reports Comparing Physician Self-report with Hawai‘i Department of Health Database (κ= 0.52, 95% confidence interval: 0.43–0.61, P<.001)

                         HDOH Database
                     Found    Not found    Total
Self-report
  Reported              77        41        118
  Did not report        46       322        368
Total                  123       363        486

The number of cases of PID that a physician self-reported was then compared to the number of actual cases reported by that physician found in the HDOH PID surveillance database.

Analysis II

Survey Study:

Data for analysis II were collected from the same survey as described in analysis I. In addition to physician name-based reports, the practice setting variable was also used in this analysis.

HDOH PID Surveillance System Linkage:

Self-reported data from the survey study were compared to the HDOH PID surveillance database described in analysis I. Practice setting (inpatient, HMO/outpatient, other) was also a component of the HDOH PID surveillance database. In analysis II, HDOH identified PID case reports from 2000–2011 were compared by physician name and practice setting.

Data Analysis:

The same data analysis conducted in analysis I was done with the data in analysis II. However, in analysis II, in addition to focusing on linked named-based reports, we attempted to link non-name-based reports with practice settings.

Results

Of the total 1,202 surveys that were mailed, 140 (11.6%) were determined to be ineligible, including 58 that were returned unopened because of a change of address, 31 associated with physicians who were retired, 17 related to physicians no longer at that specific facility, 12 associated with physicians no longer practicing medicine, 11 duplicates, 8 from respondents who said that the survey did not apply to them, and 3 sent to physicians who were deceased. The remaining 1,062 were deliverable, of which 486 were returned completed, for a response rate of 45.8%. Comparing the 486 respondents against the 576 non-respondents, there was a significant association between the groups by specialty (χ2=26.819, df=4, n=1,062, P=.001), with the highest response rate among OBGYNs.

Of the 486 physicians who returned a completed survey, 118 (24.3%) answered that they had reported at least one case of PID to the HDOH from 2000 to 2011. A total of 652 unique cases of PID were reported to the HDOH surveillance database during this same time period. The HDOH PID surveillance database successfully matched 10 physicians (8.5%) who self-disclosed that they reported a case of PID to the HDOH during this interval. Concordant reporters were mostly OBGYNs from private practice (outpatient) settings. A little less than half of these physicians (4 of the 10) self-reported the same number of PID cases as were found associated with them in the database.

Analysis I

Of 368 physicians who self-reported that they did not report a case of PID to the HDOH from 2000–2011, seven (1.9%) were found in the HDOH PID surveillance database as having reported a case. These physicians were considered to be the discordant group of physician reporters. Again, the majority of these physicians were OBGYNs from private practice (outpatient) settings.

Table 1 is a cross-tabulation of the number of reported cases of PID during 2000 to 2011 from both data sources. There was “slight agreement” (κ=0.09, 95% confidence interval [CI]: 0.02–0.16, P<.001) between the name-based self-reported survey data and the HDOH PID surveillance database.11 The prevalence index was calculated to be 0.72 and the bias index, 0.21. The adjusted κ, or PABAK, was 0.53, 95% CI: 0.45–0.60, P<.001 (Table 2), which is considered to be “moderate agreement.”12

Table 2.

Name-based PID Case Reports Comparing Physician Self-report with Hawai‘i Department of Health Database with Cell Frequencies Adjusted to Minimize Prevalence and Bias Effects, Giving a Prevalence-adjusted, Bias-adjusted Kappa (PABAK) (κ= 0.53, 95% confidence interval: 0.45–0.60, P<.001)

                         HDOH Database
                     Found    Not found    Total
Self-report
  Reported             185        58        243
  Did not report        57       186        243
Total                  242       244        486

Analysis II

Of the remaining 108 physicians who completed the survey and identified themselves as having reported a case of PID from 2000–2011, 64 physicians practiced in inpatient, HMO/outpatient, or other group settings: 12 were from inpatient settings, 25 were from HMO/outpatient settings, and 27 were in the “other” group, which includes military, prison, and other group facilities. Aside from the concordant (n=10) and discordant (n=7) physicians who had name-based matches to reported PID cases in the HDOH PID surveillance database, an additional 42 inpatient, 41 HMO/outpatient, and 23 “other” reporting entities (106 total) were identified by practice setting only in the HDOH PID surveillance database from 2000–2011. It is possible that 60 of the physicians who self-reported that they reported a case of PID from 2000–2011 (12 inpatient, 25 HMO/outpatient, and 23 “other,” with the 27 “other” physicians capped at the 23 “other” entities available in the database) are counted in the database as one of those entities. If this is correct, a total of 77 physicians (17 name-based concordant and discordant physicians already found in the database plus 60 physicians who reported [theoretically] under an inpatient, HMO, or “other” entity) would have been found in the database. These 77 physicians would now be in the “a” cell, and 46 entities (106 minus 60) would be in the “c” cell, resulting in a kappa statistic of 0.52, 95% CI: 0.43–0.61, P<.001, reflecting a moderate level of agreement, virtually the same level of agreement as was found with the PABAK calculation (Table 3).
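As an arithmetic cross-check of the reconstruction above (an illustrative sketch, not the StatXact analysis itself), Cohen's kappa can be recomputed directly from the reconstructed cells of Table 3:

```python
# Reconstructed 2 x 2 cells for analysis II (Table 3):
# a = 17 name-based matches + 60 setting-based matches = 77
# c = 106 setting-based entities - 60 matched = 46
a, b, c, d = 77, 41, 46, 322
n = a + b + c + d                                     # 486 respondents
po = (a + d) / n                                      # observed agreement
pe = ((a + b) * (a + c) + (c + d) * (b + d)) / n**2   # chance agreement
kappa = (po - pe) / (1 - pe)
print(round(kappa, 2))
# 0.52
```

The result matches the reported name- or practice setting-based kappa of 0.52.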

Discussion

The purpose of this study was to examine concordance between physicians' self-reported data in a survey and the HDOH PID surveillance data. The results of analysis I indicated poor agreement between the physicians' self-reported data and the HDOH PID surveillance database when looking only at name-based reporting. However, a possible explanation for this low level of concordance is that case reports from physicians practicing in inpatient settings (hospitals, emergency departments), HMOs, or physician groups are being submitted to the HDOH surveillance system under the facility or physician group/HMO, as suggested by analysis II. When we calculate the PABAK, the level of agreement is substantially higher.

If the diagnosing physician or clinician is filling out the form using their facility or physician group name instead of their own name, or if a designated person from the physician group is reporting all cases seen without identifying the diagnosing physician, this potentially poses a number of problems. If the HDOH needs to follow up on a certain case for epidemiologic and control purposes, it would need to call the facility, HMO, or physician group, which would in turn need to retrieve the medical record or case report for that patient, search for the identity of the diagnosing physician, and try to contact that physician. This inefficiency could be eliminated if the physician's name were already on the form. Also, without the identity of the physician, it would be difficult to identify a “best practice” group of physicians who are reporting PID, or a group of non-reporting physicians to target. Conducting studies to analyze physician attributes associated with reporting and not reporting PID would likewise be challenging. Furthermore, if different strategies for improving reporting among physicians were implemented, such as reimbursement of screening costs by insurance carriers, as is done for chlamydia screening by a main insurance carrier in Hawai‘i, the physician's name would need to be on the reporting form.14 There is a space on the current PID reporting form for the name of the diagnosing physician; apparently it is not being completed correctly. The HDOH may want to have a mechanism or policy in place whereby a reporting form that is not fully completed is returned or not accepted.

The PABAK showed a substantially higher level of agreement than the non-adjusted value of kappa. Because both the concordant cells (“a” and “d”) and the discordant cells (“b” and “c”) were asymmetrical, the prevalence and bias indices were high, resulting in a low kappa value. The magnitude of the kappa is affected by the prevalence of the attribute under consideration.12 If this attribute is rare, the kappa statistic alone may not be a valid measurement of agreement, and a prevalence effect may exist.13 In this study, the attribute is PID reporting. The PABAK was virtually identical to the kappa value calculated by estimating concordance using either name-based or practice setting-based data. Previous analysis of PID case reports established that PID reporting is incomplete.6 Furthermore, many barriers, such as the issues surrounding the diagnosis of PID (no gold standard, unclear PID definition, unclear diagnostic guidelines), deter physicians from diagnosing PID, making PID reporting a rare attribute.15 The much higher PABAK of 0.53, compared to the unadjusted kappa of 0.09, coupled with the rarity of PID reporting, supports the possibility of prevalence or bias effects.

There are substantial limitations to this study. Two study limitations, threat of disclosure and social desirability, were explained at the beginning of this paper. Some consequences of these limitations include responders not completing the survey (which would result in a low response rate), and responders skipping the sensitive question(s) altogether or not answering truthfully.7 With any survey gathering historical self-reported data, there is potential for recall or memory bias.15 We asked physicians to disclose whether they had reported a PID case to the HDOH during an 11-year period. Physicians may not remember whether or not they reported PID within this time period. Also, some physicians who did report PID may no longer be practicing medicine in Hawai‘i or may be deceased. Furthermore, according to Hawai‘i Revised Statutes Section 622-58, medical records need only be retained for seven years, so physicians who reported a case of PID more than 7 years ago would not be able to access that data through their records.16 The results may not be generalizable outside of Hawai‘i, and as the study included an 11-year period, the findings may not reflect current reporting practices in Hawai‘i. Another major limitation is the HDOH surveillance database itself. Previous studies have shown that the database is incomplete due to under-reporting and/or non-reporting, so we were already working with a very small and limited sample.6 Evidence that case reports are being submitted by groups (eg, HMOs, physician groups, or hospitals) rather than individual physicians not only helps to explain the compromised kappa, but highlights an area where intervention may improve the accuracy of the current PID surveillance system.

Conclusion

Despite these limitations, the two different analyses, as well as the additional calculation of the PABAK, suggest discordance. Further research is needed to address this systematic issue. It is also possible that a physician completed a case report form that was never submitted because submission was perceived to be another staff member's responsibility, or because disease reporting may be handled by another physician, department, or health professional. The main problem that needs to be addressed is that individual physicians are submerging their reports into group practice/HMO aggregate reports rather than reporting individually; this fails to follow the legal mandate and compromises PID surveillance. To improve the accuracy of PID reporting, future studies could be conducted with physicians and key informants to obtain more information about the PID reporting process within their facilities.

Acknowledgement

This article is from Misty Pacheco's Doctor of Public Health dissertation and has not been previously published nor is it being considered for publication elsewhere. The authors acknowledge Maria Veneranda Lee for her assistance with record examination, critical review of our manuscript, and thoughtful comments.

Conflict of Interest

None of the authors identify a conflict of interest.

References

  • 1.Centers for Disease Control and Prevention, author. Sexually transmitted diseases treatment guidelines, 2015. MMWR Recommendations and Reports. 2015;64:78–82. [Google Scholar]
  • 2.Chlamydia screening among sexually active young female enrollees of health plans—United States, 2000–2007. Centers for Disease and Control and Prevention website; [February 6, 2018]. http://www.cdc.gov/mmwr/preview/mmwrhtml/mm5814a2.htm. [Google Scholar]
  • 3.Stephens SC, Bernstein KT, Koh RP, Klausner JD, Philip SS. Can case reports be used to identify trends in pelvic inflammatory disease? San Francisco 2004–2009. Sexually Transmitted Diseases. 2010;38(1):8–11. doi: 10.1097/OLQ.0b013e3181e9afb1. [DOI] [PubMed] [Google Scholar]
  • 4.Self-study modules for clinicians - pelvic inflammatory disease. Centers for Disease Control and Prevention website; [June 25, 2016]. http://www2a.cdc.gov/stdtraining/self-study/pid/pid_epidemiology_self_study_from_cdc.html. Updated October 2014. [Google Scholar]
  • 5.Ratelle S, et al. Predictive value of clinical diagnostic codes for the CDC case definition of pelvic inflammatory disease (PID) Sexually Transmitted Diseases. 2003;30:866–870. doi: 10.1097/01.OLQ.0000087945.08303.38. [DOI] [PubMed] [Google Scholar]
  • 6.Pacheco M, Sentell T, Katz AR. Under-reporting of pelvic inflammatory disease in Hawaii: a comparison of state surveillance and hospitalization data. Journal of Community Health. 2013;39:336–338. doi: 10.1007/s10900-013-9766-x. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 7.Tourangeau R, Yan T. Sensitive Questions in Surveys. Psychological Bulletin. 2007;133(5):859–883. doi: 10.1037/0033-2909.133.5.859. [DOI] [PubMed] [Google Scholar]
  • 8.Drapeau A, Boyer R, Diallo FB. Discrepancies between survey and administrative data on the use of mental health services in the general population: findings from a study conducted in Quebec. BMC Public Health. 2011;11:837. doi: 10.1186/1471-2458-11-837. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 9.Bertalli NA, Allen KJ, McLaren, et al. A comparison of self-reported and record-linked blood donation history in an Australian cohort. Transfusion. 2011;51(10):2189–2198. doi: 10.1111/j.1537-2995.2011.03141.x. [DOI] [PubMed] [Google Scholar]
  • 10.Bernstein K. Statewide efforts to better understand PID surveillance; Paper presented at the National STD 2010 Annual Conference; March 10, 2010; Atlanta, GA. [February 6, 2018]. https://cdc.confex.com/cdc/std2010/webprogram/Paper21602.html. [Google Scholar]
  • 11.Disease Reporting. Hawaii State Department of Health website; [February 6, 2018]. http://health.hawaii.gov/harmreduction/disease-reporting/. Updated August 2013. [Google Scholar]
  • 12.Viera AJ, Garrett JM. Understanding interobserver agreement: the kappa statistic. Family Medicine. 2005;37:360–363. [PubMed] [Google Scholar]
  • 13.Sim J, Wright CC. The kappa statistic in reliability studies: use, interpretation, and sample size requirements. Physical Therapy. 2005;85:257–268. [PubMed] [Google Scholar]
  • 14.McGrath CM, et al. Chlamydia screening of adolescent females: A survey of providers in Hawaii. Journal of Community Health. 2011;36:274–280. doi: 10.1007/s10900-010-9308-8. [DOI] [PubMed] [Google Scholar]
  • 15.Pacheco M, et al. Physician survey assessing pelvic inflammatory disease knowledge and attitudes to identify diagnosing and reporting barriers. Women's Health Issues. 2015;26:27–33. doi: 10.1016/j.whi.2015.07.013. [DOI] [PubMed] [Google Scholar]
  • 16.Hawaii Revised Statutes, Section 622-58: Retention of Medical Records. [April 13, 2018]. http://health.hawaii.gov/harmreduction/disease-reporting/

