Published in final edited form as: Subst Abus. 2018 May 4;39(2):218–224. doi: 10.1080/08897077.2018.1449175

Physicians report adopting safer opioid prescribing behaviors after academic detailing intervention

Mary Jo Larson 1, Cheryl Browne 2, Ruslan V Nikitin 3, Nikki R Wooten 4, Sarah Ball 5, Rachel Sayko Adams 6, Kelly Barth 7

Abstract

Background

This study evaluated an educational intervention intended to increase physicians’ use of patient prescription history information from the state prescription monitoring program (PMP) and their adoption of clinical behaviors consistent with opioid prescription guidelines to reduce patient risk.

Methods

Physician volunteers (n=87) in community practices and Veterans Administration medical settings in South Carolina received an office-based, individualized, educational intervention (Academic Detailing) from a trained pharmacist who promoted three key messages about safer opioid prescribing. Physicians were registered for the state PMP, guided through retrieving patient information from the PMP, and given patient-centered materials. Physicians consented to completing web surveys; 68 (78%) completed follow-up surveys a median of 12.2 weeks post-intervention.

Results

Of 43 respondents who did not use the PMP before the intervention, 83% adopted PMP use. Self-reports also revealed a significant increase in frequency of the following behaviors: 1) using patient report information from the PMP, 2) using a standardized scale to monitor pain intensity and interference with daily functioning, and 3) issuing orders for urine toxicology screens for patients maintained long-term on opioids.

Conclusions

The intervention was effective in promoting physician adoption of prescribing behaviors intended to reduce risks associated with prescription opioids. The self-report findings of this study should be confirmed by analysis of data on the number of queries submitted to the state’s PMP. The present study suggests that a single academic detailing visit may be an effective tool for increasing physicians’ voluntary registration for a state PMP and their utilization of its data on patients’ prescription histories.

Keywords: academic detailing, opioid prescribing, primary care, evaluation

INTRODUCTION

Opioid fatalities have increased in parallel with rapid increases in opioid prescribing. Over 259 million opioid prescriptions were written in 2012, with opioid prescriptions per capita increasing by 7.3% from 2007 to 2012.1 In a retrospective study of state-based prescription monitoring program (PMP) data, nearly 51% of controlled substance prescriptions were written for individuals who received prescriptions from multiple physicians and pharmacies within a 12-month period.2

A number of safe opioid prescribing initiatives have been launched to curb the opioid epidemic, including an emphasis on utilization of state-based PMPs.3,1,4 Prescribers are now advised to obtain patient information from the PMP to review patients’ receipt of controlled substances from other prescribers, and to communicate with patients about overdose and other adverse reactions and risks.5–7 Some evidence suggests that states with increased adoption of PMPs have reduced Schedule II prescribing and prescription opioid-related morbidity and mortality;8–11 however, findings are inconsistent.12,13 Reviewing PMP patient data is now regarded as a best practice to enhance the quality of pain management care.14,15 While an increasing number of states mandate that all prescribers participate (i.e., register or query data), in most of the US, PMP participation remains voluntary and underutilized.16 A study of 420 nationally representative physicians found that 72% were aware of their state’s PMP but only 53% reported utilizing it.17 In addition to lack of awareness,18 reasons for not utilizing PMPs include difficulty accessing the PMP and the time required to query the database, as well as the perception that PMP information is not pertinent or timely.17,19–21

Prescriber education has an important role in increasing effective utilization of PMPs.22 Academic Detailing is one promising approach to delivering opioid prescribing education to physicians.23 Academic Detailing provides in-office, interactive encounters with a clinical consultant who is trained to assess the physician’s prescribing concerns and provide new information on recommended practices and tools.24,25,26–31

We hypothesized that Academic Detailing would be an effective educational intervention to advance safer opioid prescribing, as it addresses both skills and motivation to use patient data from the PMP. A prevention project with primary care providers in South Carolina was undertaken by academic researchers, pharmacists, and the SC Department of Health and Environmental Control/Bureau of Drug Control (SC DHEC). We targeted providers who served military service members and Veterans, as these populations are frequently prescribed opioids for acute and chronic pain and were identified by the National Institute on Drug Abuse as a priority group for prevention.7,32–37

This paper addresses two research questions: 1) Did the academic detailing intervention change physicians’ prescribing behaviors to be more consistent with clinical guidelines? Specifically, did physicians report increased use of patient data from the PMP and of other safe opioid clinical practices or tools when prescribing opioids? 2) Which participant and intervention characteristics were associated with adoption of PMP use? These findings are important for designing more effective interventions that encourage uptake of patient data from PMPs and promote further adoption of behaviors consistent with opioid prescribing guidelines.

MATERIALS AND METHODS

Study context and setting

South Carolina (SC) was selected because of the team’s prior analysis of opioid prescriptions from the PMP and because it has large military and veteran populations.38 In 2008, South Carolina ranked 10th highest in the number of pain prescriptions per capita,39,40 and in fiscal year 2014, 4.2 million opioid prescriptions were dispensed to 1.2 million patients.41 The SC Prescription Monitoring Act (HB 3803) established the SC PMP in 2008. It is an electronic, rapid-turnaround, patient prescription database of all retail and outpatient hospital pharmacy dispensing of Schedule II-IV controlled substances. Physicians querying a specific patient are immediately provided a 1-year profile showing the number of controlled prescriptions and prescribers, a listing of each controlled prescription, the dispensary location, days’ supply, and prescriber names.16 Local VA pharmacies were authorized to contribute data by the 2012 Consolidated Appropriations Act.42 In 2015, prescriber registration and patient queries were entirely voluntary. Approximately 22% of prescribers were registered users, and utilization was described as “infrequent”.40 Between October 2014 and July 2015, the number of new physicians who registered each month to use the PMP typically was between 52 and 60.

Study design

As a proof of concept, the study used a single-group, pre-post comparison design, with feasibility assessment as a critical component. Self-report measures were collected by survey before the intervention and again 4 to 38 weeks after the intervention. The Brandeis University and University of South Carolina Institutional Review Boards approved all study human subjects procedures. The National Institute on Drug Abuse reviewed the data safety management protocol, and the clinical trial registration number was NCT02210936. The study received written permission from the local VAMC Medical Director to recruit Veterans Affairs Medical Center (VAMC) and affiliated Community-Based Outpatient Clinic (CBOC) providers. To encourage participation, the project agreed to mask the identity of the participating VA facilities when reporting results.

Sample recruitment and delivery of the academic detailing intervention

Physicians were eligible if they: (a) prescribed to military service members, dependents, or veterans (i.e., TRICARE beneficiaries) or were employees of VA clinics; (b) reported prescribing Schedule II opioids for the treatment of non-cancer pain; and (c) provided written informed consent and release of their PMP and TRICARE provider data. Oncologists, surgeons, and non-physician community prescribers who were not licensed in SC to prescribe Schedule II opioids were excluded.

A convenience sample of South Carolina community physicians was recruited from May to November 2015 from 14 towns within 3 hours’ distance of the project’s Academic Detailing team members. The majority of physicians were located in Richland County, home to VA clinics and an Army base and known to have high opioid use indicators. Recruitment letters and a one-page fact sheet about the CME-eligible training were sent to 316 unique office-based community physicians, followed by email (n=36), fax (n=227), and follow-up phone calls (n=143). AD team members also contacted physicians with whom they had a prior relationship. Permission to contact physicians in VA clinics was facilitated by the Research Director and managed by two internal champions, the Clinical Pharmacy Specialist for Pain Management and the Director of the Pain Clinic. Both were advocates for PMP registration and utilization and knowledgeable about VA protocol for pain management with opioids. The collaborating VAMC was an early adopter of participation in the SC PMP. VA physicians were recruited within all VAMC primary care clinics and 3 of 5 local CBOCs with support from their medical directors. All participating physicians received up to 2.0 AMA PRA Category 1 Credits™, based on the number of contact minutes for the on-site intervention visit, approved and granted by the University of South Carolina School of Medicine – Palmetto Health Continuing Medical Education Organization.

Three primary messages, delivered through interactive intervention visits and supporting print materials, comprised the “S.O.S.” framework: “Share a patient provider agreement,” “Optimize patient treatment using a multi-dimensional rating scale,” and “Screen for appropriate opioid use and the continued need for opioid therapy.” The intervention development process is described in Barth et al.43 Prior to the intervention visit, the research assistant sent participant information obtained by web-based survey to the state’s PMP director, who pre-registered the physician for a PMP account. While some physicians had previously registered as PMP users, most required a new account and password because lack of use had led to account de-activation. Four pharmacist AD consultants were trained on the Academic Detailing visit protocol by the co-investigator PharmD who had, jointly with the SC PMP director, developed a streamlined PMP account registration process. Excluding partial applications and those screened out, we received baseline data from 93 consenting physicians. Indicators of adherence to the intervention visit protocol and calculation of CME credits were based on a project AD visit form completed within 2 days after the visit. The web-administered physician survey was re-administered at follow-up, with a response rate of 78.1% (68/87 respondents). The median time to follow-up was 12.2 weeks.

Survey Measures

The pre-intervention survey and screening instrument asked physicians to self-report demographics, number of patients treated for pain and frequency of opioid prescribing, attitudes about the information obtained from the PMP, current PMP registration and utilization status, and frequency of other risk mitigation practices, with survey items drawn from seven published surveys.26,44–49 The follow-up survey repeated several assessment items and included items about the perceived usefulness and quality of the intervention materials and visit.

There were two primary outcome measures: 1) Feasibility was assessed by the number of new PMP accounts and/or reactivation of inactive accounts, measured from visit forms completed by the academic detailer and judged relative to our goal of 100 participants; 2) Effectiveness of the intervention was assessed by physician utilization of PMP patient report data when prescribing opioids. A priori we hypothesized that a majority of participating physicians would report more frequent PMP use after the intervention visit. The primary PMP use measure was physician’s self-report of a patient prescription report query in the past 30 days, with four response options: 1) I have not made any requests or reviewed any reports, 2) I rely on other professionals in my practice to request patient reports (i.e., pharmacist or delegate), 3) I occasionally submit requests but did not submit a request in the past month, and 4) I submitted one or more requests in the past month. PMP adoption was defined as a physician who did not use the PMP pre-intervention but who used the PMP at follow-up.

Secondary effectiveness outcomes were assessed by self-report of PMP use on a frequency rating scale from 1 (never) to 5 (always) and four other behaviors consistent with intervention ‘S.O.S’ messages. Physicians also reported whether they performed the behavior more regularly after the intervention visit than before the visit.

Analysis

We compared PMP use status at pre- and post-intervention using the McNemar chi-square test for paired nominal data and performed chi-square tests of each potential covariate with PMP adoption. A logistic regression was performed to assess the combined effects of the two physician characteristics that influenced PMP adoption. The 18 respondents who at pre-intervention were already self-users (i.e., reported PMP use in the past month) were excluded from this analysis. To examine the effect of the intervention on other prescribing behaviors, pre- and post-intervention frequency ratings were compared by paired-samples t-tests. All analyses used p < 0.05 as the threshold for significance.
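To make the analytic approach concrete, a minimal sketch follows (in Python with scipy and statsmodels; the study's own statistical software is not reported here, so this is illustrative only). The 2x2 table reproduces the collapsed pre/post PMP-use counts reported below in Table 1; the frequency-rating arrays are placeholder values, not study data.

```python
# Minimal sketch (NOT the study's analysis code) of the paired comparisons
# described in the Analysis section, using scipy and statsmodels.
import numpy as np
from scipy import stats
from statsmodels.stats.contingency_tables import mcnemar

# Paired PMP-use status, collapsing "self-use" and "rely on others" into "use":
# rows = pre-intervention, columns = follow-up (counts reconstructed from Table 1).
#                  follow-up: use   follow-up: non-use
table = np.array([[25,              0],    # pre-intervention use (18 self-use + 7 rely)
                  [35,              8]])   # pre-intervention non-use (43 total)

# McNemar chi-square test for paired nominal data (continuity corrected).
mc = mcnemar(table, exact=False, correction=True)
print(f"McNemar chi2 = {mc.statistic:.1f}, p = {mc.pvalue:.2g}")

# Paired-samples t-test comparing pre vs. post 1-5 frequency ratings
# (placeholder values only, not the study data).
pre = np.array([3, 2, 4, 3, 5, 2, 3, 4, 3, 2])
post = np.array([4, 3, 4, 4, 5, 3, 4, 5, 3, 3])
t, p = stats.ttest_rel(post, pre)
print(f"paired t = {t:.2f}, p = {p:.3f}")
```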

RESULTS

One-half (n=34) of the 68 follow-up respondents were women. The sample was racially/ethnically diverse: 38 were white/non-Hispanic, 11 were Black/non-Hispanic, eight were Asian, and seven were Hispanic. The practice setting was almost evenly divided between the two settings: 35 physicians in Practice Setting A and 33 physicians in Practice Setting B (settings are masked to protect confidentiality). One-half of the respondents (n=34) reported being registered with the PMP prior to the intervention. Regarding opioid prescription history, the majority reported that they treated more than 40 non-cancer patients for chronic pain, and more than half reported that they “never” or “seldom” prescribed opioids for patients. The characteristics of the follow-up sample were very similar to those of the full sample of 93 consenting physicians.43

Feasibility

We successfully delivered the academic detailing visit to 87 physicians (93.5% contact rate) between September 23 and November 20, 2015, and newly registered or re-activated 79 accounts (85% of the 93 consented physicians). The intervention was not delivered to five physicians who completed the application too late for the PMP Director to set up a new account. The median visit lengths were 57.5 and 60.0 minutes for the two practice settings, longer than the anticipated visit length of 30 minutes. While the majority of visits were made to a single physician, 31 percent were attended by 2 or 3 physicians, and 13 percent were attended by other clinical or administrative staff. Academic detailers recorded the topic areas covered in response to physicians’ questions. These topics included appropriate use of patient prescription data obtained from the PMP, optimizing care using the multi-dimensional assessment tool, use of the informed consent process, and screening for possible addiction or drug diversion. Additional information on implementation of the intervention is detailed elsewhere.43

Effectiveness of the Academic Detailing Intervention

Use of PMP

On the pre-intervention survey, the majority of respondents (63%) reported not using the PMP to obtain patient prescription histories in the past month (i.e., PMP non-use), while about one-quarter (26%) reported past month PMP use (i.e., PMP self-use). Seven respondents (10%) reported that other clinicians had requested PMP reports for them (i.e., rely) (Table 1).

Table 1:

Percent of South Carolina study physicians (n=68) who consulted PMP patient history records in the past 30 days

                                 Post-intervention survey self-report (a)
Pre-intervention
survey self-report               No           Rely on Others   Yes            Total
No                               8 (17%)      13 (31%)         22 (52%)       43 (63%)
Rely on Others                   0 (0%)       3 (43%)          4 (57%)        7 (10%)
Yes                              0 (0%)       0 (0%)           18 (100%)      18 (26%)
Total                            8 (10%)      16 (24%)         44 (66%)       68 (100%)

Values are N (percent).

Abbreviations: N = number of respondents; PMP = prescription monitoring program

(a) Median time from the academic detailing intervention to return of the post-intervention survey was 12.2 weeks.

At follow-up, only 10% reported PMP non-use, 66% reported PMP self-use, and 24% reported relying on others. Of most interest are the 43 respondents who had not used the PMP in the month before the intervention visit. At follow-up 52% of this group reported PMP self-use, 31% relied on others, and only 17% reported PMP non-use. Thus, 83% of physicians who were non-users at pre-intervention were PMP adopters at follow-up either by querying the PMP database or reviewing reports obtained by other clinicians. In addition, 4 of the 7 physicians who relied on others at baseline changed to PMP self-use at follow-up (Table 1). For statistical comparisons, self-use and rely groups were collapsed into a “PMP use” group; 37% were in this group pre-intervention and 88% were in this group at follow-up, a statistically significant increase (McNemar, p<.001). We also created a conservative definition of PMP status for sensitivity analysis, in which physicians who relied on others were defined as “PMP non-users”. Using the conservative definition, 27% were in the PMP use group pre-intervention and 65% were in the PMP use group at follow-up, again a significant increase (McNemar, p<.001).

Table 2 presents the results of two additional items that assessed frequency of consulting the PMP patient report using a frequency rating scale of 1 to 5 (never to always). The mean frequency rating significantly increased from pre-intervention to follow-up (3.2 vs 3.8, p < .001). In addition, 72% of physicians reported that they used the PMP patient reports more frequently after the intervention visit than before the visit.

Table 2:

South Carolina study physicians’ self-report of frequency of consulting the PMP patient history records in past 30 days (n=68)

Measure                                                      Pre-Intervention   Post-Intervention (a)
Mean frequency rating, scale 1–5 (b)                         3.2                3.8 (p < 0.001)
Response = “Usually” or “Always”                             n=34, 50%          n=48, 71% (p < 0.001)
Consult the PMP more after the visit than before the
visit (asked at follow-up only), response = yes              n.a.               72%

Abbreviations: n= number of respondents; n.a. = not applicable; PMP = prescription monitoring program

(a) Median time from the academic detailing intervention to return of the post-intervention survey was 12.2 weeks.
(b) Labels for each scale numeral: 1 = never; 2 = seldom; 3 = about half the time; 4 = usually; 5 = always.

Covariates of PMP adoption

Chi-square tests showed that two covariates had significant effects on PMP adoption, using the conservative definition (defining relying on others as PMP non-use). Practice Setting A (n=22) physicians were more likely than those at Practice Setting B (n=28) to be PMP adopters (77% vs 32%, respectively, χ2=10.1, p=.002). Further, physicians who had practiced for 10 or fewer years (n=18) were more likely than physicians who had practiced for 11 or more years (n=32) to be PMP adopters (83% vs 34%, respectively, χ2=11.1, p=.001).

The following variables were not associated with PMP adoption: number of patients currently treated for pain (>40 vs 40 or fewer), gender, continuing education in pain management (any vs none), race/ethnicity (white, non-Hispanic vs other), and length of the visit (in minutes). Although an overall chi-square test indicated a significant effect of which of the five pharmacists conducted the intervention visit, post-hoc tests revealed that no individual academic detailer differed significantly from the others in association with PMP adoption.

A logistic regression examined the combined effects of practice setting and number of years practiced on PMP adoption. The logistic regression model was statistically significant, χ2(2) = 20.9, p < .001, explained 46.0% (Nagelkerke R2) of the variance in adoption, and correctly classified 80% of cases. Among physicians who were PMP non-users at pre-intervention, physicians in Practice Setting A were 8.0 times more likely (95% CI=1.9–34.5) to adopt PMP use at follow-up than physicians in Practice Setting B. Further, among physicians who were PMP non-users at pre-intervention, those with 10 or fewer years of practice were 10.6 times more likely (95% CI=2.1–52.9) than physicians with 11 or more years of practice to adopt PMP use at follow-up.
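The adjusted odds ratios above come from exponentiating logistic regression coefficients. The sketch below shows one way such a model could be fit; the data are synthetic and the variable names (setting_a, ten_or_fewer_years, adopted_pmp) are illustrative assumptions, not the study dataset or its statistical software.

```python
# Minimal sketch (synthetic data, illustrative variable names): fitting a
# logistic regression of PMP adoption on two binary physician characteristics
# and converting coefficients to odds ratios with 95% confidence intervals.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200  # synthetic sample; the study analysis included 50 pre-intervention non-users

# Synthetic binary predictors: practice setting (1 = Setting A) and
# years in practice (1 = 10 or fewer years).
df = pd.DataFrame({
    "setting_a": rng.integers(0, 2, n),
    "ten_or_fewer_years": rng.integers(0, 2, n),
})
# Synthetic outcome loosely reflecting higher adoption for both predictors.
logit = -1.0 + 1.8 * df["setting_a"] + 2.0 * df["ten_or_fewer_years"]
df["adopted_pmp"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

# Fit the logistic regression and exponentiate coefficients to odds ratios.
X = sm.add_constant(df[["setting_a", "ten_or_fewer_years"]])
model = sm.Logit(df["adopted_pmp"], X).fit(disp=0)
odds_ratios = np.exp(model.params)
conf_int = np.exp(model.conf_int())  # 95% CI on the odds-ratio scale
print(pd.concat([odds_ratios.rename("OR"), conf_int], axis=1))
```

Exponentiating the coefficient confidence limits is what yields the 95% CIs reported alongside each adjusted odds ratio.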

Changes in Other Prescribing Behaviors

The percent of physicians who reported they usually or always used a specific multi-dimensional pain assessment scale increased from 44% to 57% (p = 0.028), and the mean frequency rating also increased significantly at follow-up (p < .001) (Table 3). Regarding orders for urine toxicology as a risk mitigation strategy, more than two-thirds of physicians responded that they usually or always ordered a urine toxicology screen for new patients or for patients they maintained on opioids, but the proportions did not change significantly. The mean frequency rating did not change on the item about new patients (p = .53), but increased significantly for patients the physician was maintaining on opioids (mean 3.8 to 4.1, p = .04). Half of physicians (50%) reported using urine toxicology screens more frequently at follow-up for new patients, and 46% reported doing so for patients maintained on opioids. Finally, the mean frequency ratings for two other S.O.S. behaviors, assessing patients on opioids for aberrant behaviors and using a written patient treatment agreement, were high before the visit and did not increase.

Table 3:

South Carolina study physicians’ self-reported frequency of using prescribing approaches consistent with opioid prescribing guidelines (n=68)

Measure                                                      Pre-Intervention   Post-Intervention (a)

Use a specific scale (e.g., PEG) to assess level of pain, quality of life, functioning, general activity
  Mean frequency rating, scale 1–5 (b)                       2.9 (SD 1.5)       3.5 (SD 1.3) (p < .001)
  Frequency response = “Usually” or “Always”                 n=30, 44%          n=39, 57% (p = 0.028)
  Response = more frequently after the visit than before     n.a.               38%

Assess aberrant opioid behaviors (e.g., early refills, lost prescriptions, taking meds in ways not prescribed)
  Mean frequency rating, scale 1–5 (b)                       4.6 (SD 0.7)       4.4 (SD 0.8) (p = 0.13)
  Frequency response = “Usually” or “Always”                 n=63, 93%          n=64, 94% (p = 0.658)
  Response = more frequently after the visit than before     n.a.               59%

Conduct a urine toxicology screen when starting opioid treatment for a new patient
  Mean frequency rating, scale 1–5 (b)                       3.7 (SD 1.4)       3.9 (SD 1.3) (p = 0.12)
  Frequency response = “Usually” or “Always”                 n=44, 65%          n=50, 74% (p = 0.53)
  Response = more frequently after the visit than before     n.a.               50%

Conduct annual urine toxicology screens for chronic, non-cancer patients maintained on opioids
  Mean frequency rating, scale 1–5 (b)                       3.8                4.1 (p = 0.04)
  Frequency response = “Usually” or “Always”                 n=47, 69%          n=53, 78%
  Response = more frequently after the visit than before     n.a.               46%

Use a written patient treatment agreement or opioid contract to communicate expectations
  Mean frequency rating, scale 1–5 (b)                       4.2                4.2 (p = 0.89)
  Frequency response = “Usually” or “Always”                 n=56, 82%          n=56, 82%
  Response = more frequently after the visit than before     n.a.               52%

Abbreviations: PEG = PEG assessment tool50; n.a. = not applicable; n = number of respondents

(a) Median time from the academic detailing intervention to return of the post-intervention survey was 12.2 weeks.
(b) Labels for each scale numeral: 1 = never; 2 = seldom; 3 = about half the time; 4 = usually; 5 = always.

DISCUSSION

The evidence from this pilot study shows that a single visit with a trained academic detailer was effective in changing some, but not all, physician opioid prescribing behaviors and in substantially increasing utilization of the PMP. A majority (83%) of the 43 physicians who had neither obtained PMP patient reports on their own nor relied on staff members to do so in the 30 days before the intervention reported past 30-day PMP use at follow-up. In addition, more than one-half of the physicians who reported relying on others to query the PMP at pre-intervention reported at follow-up having queried the PMP themselves in the past 30 days. Thus, facilitated registration, demonstration, and practice of submitting queries to the PMP promoted PMP adoption, an important component of safe opioid prescribing. Further, physicians who had previously registered for the PMP increased their use of the PMP. We also learned that many physicians who reported prior PMP registration had had their accounts administratively de-activated because of inactivity and required new registration during the intervention.

Regarding feasibility, the intervention succeeded in registering or reactivating 79 physicians with the PMP over a 3-month period, at a time when registration was voluntary and overall PMP use was low. This accomplishment was tempered by several limitations of the intervention: 1) we did not meet our goal of delivering the intervention to 100 physicians; 2) we did not recruit physicians employed at a military hospital; and 3) three physicians who participated in the intervention visit were not registered during the visit because of delayed receipt of necessary information. Although these findings were based on self-report, responses across several related items indicated increased frequency of PMP utilization after the visit.

This study also provides evidence that the intervention influenced some, but not all, of our secondary outcomes. The academic detailing package left with the physician included an example assessment tool, the PEG,50 for community physicians and a VA multidimensional pain instrument for VA physicians. We speculate that providing a specific instrument, rather than generic advice alone, contributed to the increase in utilization. For two of the S.O.S. behaviors (assessing patients for aberrant opioid behaviors and using a written patient treatment agreement), high pre-intervention use rates limited the opportunity for the intervention to effect change; this demonstrates the value of knowing pre-intervention behaviors when designing key messages. In sum, these findings revealed changes in clinical behaviors measured several weeks after the intervention, based on a high response rate (78%) among physicians in busy practices. To our knowledge, this is the first academic detailing study with physicians serving military and veteran populations.

This study also has several limitations. First, the outcomes are based on self-report data rather than direct observation or record review; physicians’ desire to appear competent may bias their self-reports. We are currently analyzing encrypted PMP data on the number of physician patient report queries and on patterns of opioid prescribing, with which we will attempt to corroborate these self-reported findings. Second, the limited resources of this study prohibited recruitment of a control group of physicians; thus, we cannot attribute the self-reported changes to the intervention, and it is possible that other changes in SC or at the VAMC contributed to these findings. Third, study participants were volunteers drawn from limited locations in SC and may be more motivated to adopt recommended clinical practices than other physicians. Study findings do not generalize to non-studied regions or health care settings, such as military treatment facilities, health maintenance organizations, or other states with different PMP programs.

In future studies, we aim to collaborate with institutional partners who can provide a list of physicians to approach and contribute administrative data during a planning phase to characterize patient case mix (e.g., number of patients treated for pain) and current prescribing behaviors. These planning data would facilitate random assignment of physicians and permit calculation of an intervention refusal rate.

In conclusion, the present study provides evidence that a single academic detailing visit appears to increase adoption of guideline-consistent behavior among a group of physicians not mandated to register or utilize the PMP.

Acknowledgements

The authors wish to thank Dr. Crystal Endsley, Dr. Sue Haddock, and Ms. Jill Bonkowske at the William Jennings Bryan Dorn Veterans Affairs Medical Center for their generous professional support of this initiative. We thank the 93 physicians who volunteered to participate in this study. We are grateful to Peter Georgantopoulos, Shanada Adams, and David Rodriguez for their critical role in project recruitment. This project would not have been possible without the tireless advice and commitment of Ms. Christie Frick, RPh and Ms. Tracie Paschall at the South Carolina Department of Health and Environmental Control, Prescription Monitoring Program. The University of South Carolina School of Medicine – Palmetto Health Continuing Medical Education (CME) Organization suggested important changes to improve the project’s protocol and provided CME credit as part of the study. Eve Reider, PhD and Aria Crump, ScD of the National Institute on Drug Abuse provided helpful guidance throughout the project.

Funding/Support:

This study was funded as a pilot study by National Institute on Drug Abuse (NIDA) grant R34 DA037039. Dr. Wooten acknowledges the support of a NIDA Mentored Research Scientist Development Award (K01 DA037412). The authors are responsible for the content, and the views expressed herein do not represent the views of collaborating organizations, including the South Carolina Department of Health and Environmental Control, the Veterans Health Administration, the National Institutes of Health, or any other public or private entity. The funding organizations had no role in the design and conduct of the study; the collection, management, analysis, and interpretation of the data; the preparation, review, or approval of the manuscript; or the decision to submit the manuscript for publication.


Footnotes

Other disclosures:

Nikki R. Wooten is a lieutenant colonel in the U.S. Army Reserve but was not involved in this study as part of her official military duties. No authors have conflicts of interest.

We agree to the copyright and author rights statement.

Ethical approval:

Review and approval of human subjects procedures was received from the Brandeis Committee for Protection of Human Subjects on May 4, 2014, with annual continuation approval, and from the University of South Carolina on May 9, 2014. The Clinical Trials identifier was NCT02210936.

Contributor Information

Mary Jo Larson, Institute for Behavioral Health, Heller School for Social Policy and Management, Brandeis University, Waltham MA.

Cheryl Browne, Somerville, MA.

Ruslan V. Nikitin, Institute for Behavioral Health, Heller School for Social Policy and Management, Brandeis University, Waltham MA.

Nikki R. Wooten, College of Social Work, University of South Carolina, Columbia SC. Dr. Wooten is also a lieutenant colonel in the U. S. Army Reserve.

Sarah Ball, Division of General Internal Medicine and Geriatrics, College of Medicine, Medical University of South Carolina, Charleston, SC.

Rachel Sayko Adams, Institute for Behavioral Health, Heller School for Social Policy and Management, Brandeis University, Waltham MA.

Kelly Barth, Department of Psychiatry and Behavioral Sciences, College of Medicine, Medical University of South Carolina, Charleston, SC.

REFERENCES

1. U.S. Centers for Disease Control and Prevention (CDC). CDC Guideline for Prescribing Opioids for Chronic Pain — United States, 2016. Morbidity and Mortality Weekly Report (MMWR). 2016. http://www.cdc.gov/mmwr/volumes/65/rr/rr6501e1.htm. Accessed July 10, 2016.
2. Lev R, Lee O, Petro S, et al. Who is prescribing controlled medications to patients who die of prescription drug abuse? The American journal of emergency medicine. 2016;34(1):30–35.
3. U.S. Department of Health and Human Services (HHS), Office of the Surgeon General. Facing Addiction in America: The Surgeon General’s Report on Alcohol, Drugs, and Health. Washington, DC: HHS; 2016.
4. Dowell D, Haegerich TM, Chou R. CDC Guideline for Prescribing Opioids for Chronic Pain-United States, 2016. JAMA. 2016:E1–E22.
5. Haffajee RL, Jena AB, Weiner SG. Mandatory use of prescription drug monitoring programs. JAMA. 2015;313(9):891–892.
6. The Pew Charitable Trusts. Prescription Drug Monitoring Programs: Evidence-based practices to optimize prescriber use. 2016.
7. Department of Veterans Affairs - Department of Defense. VA/DoD Clinical Practice Guideline for the Management of Opioid Therapy for Chronic Pain - Clinician Summary. 2017. https://www.healthquality.va.gov/guidelines/Pain/cot/. Accessed April 14, 2017.
8. Delcher C, Wagenaar AC, Goldberger BA, Cook RL, Maldonado-Molina MM. Abrupt decline in oxycodone-caused mortality after implementation of Florida’s Prescription Drug Monitoring Program. Drug Alcohol Depend. 2015;150:63–68.
9. Paulozzi LJ, Kilbourne EM, Desai HA. Prescription drug monitoring programs and death rates from drug overdose. Pain Med. 2011;12(5):747–754.
10. Reifler LM, Droz D, Bailey JE, et al. Do prescription monitoring programs impact state trends in opioid abuse/misuse? Pain Med. 2012;13(3):434–442.
11. Bao Y, Pan Y, Taylor A, et al. Prescription Drug Monitoring Programs Are Associated With Sustained Reductions In Opioid Prescribing By Physicians. Health Aff (Millwood). 2016;35(6):1045–1051.
12. Haegerich TM, Paulozzi LJ, Manns BJ, Jones CM. What we know, and don’t know, about the impact of state policy and systems-level interventions on prescription drug overdose. Drug Alcohol Depend. 2014;145:34–47.
13. Wen H, Schackman BR, Aden B, Bao Y. States With Prescription Drug Monitoring Mandates Saw A Reduction In Opioids Prescribed To Medicaid Enrollees. Health Aff (Millwood). 2017;36(4):733–741.
14. Dowell D, Haegerich TM, Chou R. CDC Guideline for Prescribing Opioids for Chronic Pain-United States, 2016. JAMA. 2016.
15. Chou R. 2009 Clinical Guidelines from the American Pain Society and the American Academy of Pain Medicine on the use of chronic opioid therapy in chronic noncancer pain: what are the key messages for clinical practice? Pol Arch Med Wewn. 2009;119(7–8):469–477.
16. Manasco AT, Griggs C, Leeds R, et al. Characteristics of state prescription drug monitoring programs: a state-by-state survey. Pharmacoepidemiol Drug Saf. 2016;25(7):847–851.
17. Rutkow L, Turner L, Lucas E, Hwang C, Alexander GC. Most primary care physicians are aware of prescription drug monitoring programs, but many find the data difficult to access. Health Aff (Millwood). 2015;34(3):484–492.
18. Lin D, Lucas E, Murimi IB, et al. Physician attitudes and experiences with Maryland’s prescription drug monitoring program (PDMP). Addiction. 2016.
19. National Safety Council. Prescription Nation 2016: Addressing America’s drug epidemic. 2016.
20. Poon SJ, Greenwood-Ericksen MB, Gish RE, et al. Usability of the Massachusetts Prescription Drug Monitoring Program in the Emergency Department: A Mixed-methods Study. Academic Emergency Medicine: official journal of the Society for Academic Emergency Medicine. 2016;23(4):406–414.
21. Smith RJ, Kilaru AS, Perrone J, et al. How, Why, and for Whom Do Emergency Medicine Providers Use Prescription Drug Monitoring Programs? Pain Medicine. 2015;16(6):1122–1131.
22. Carnevale Associates, LLC. Leveraging Prescription Drug Monitoring Programs to Reduce Drug Use and its Damaging Consequences. Gaithersburg, MD; 2011.
23. Avorn J. Academic Detailing: “Marketing” the Best Evidence to Clinicians. JAMA. 2017;317(4):361–362.
24. Clow PW, Dunst CJ, Trivette CM, Hamby DW. Educational Outreach (Academic Detailing) and Physician Prescribing Practices. 2005.
25. Lu CY, Ross-Degnan D, Soumerai SB, Pearson SA. Interventions designed to improve the quality and efficiency of medication use in managed care: a critical review of the literature, 2001–2007. BMC Health Serv Res. 2008;8:75.
26. Hartung DM, Hamer A, Middleton L, Haxby D, Fagnan LJ. A pilot study evaluating alternative approaches of academic detailing in rural family practice clinics. BMC family practice. 2012;13:129.
27. Figueiras A, Sastre I, Gestal-Otero JJ. Effectiveness of educational interventions on the improvement of drug prescription in primary care: a critical literature review. J Eval Clin Pract. 2001;7(2):223–241.
28. O’Brien MA, Rogers S, Jamtvedt G, et al. Educational outreach visits: effects on professional practice and health care outcomes. Cochrane Database Syst Rev. 2007;(4):CD000409.
29. Bloom BS. Effects of continuing medical education on improving physician clinical care and patient health: a review of systematic reviews. Int J Technol Assess Health Care. 2005;21(3):380–385.
30. Avorn J, Soumerai SB. Improving drug-therapy decisions through educational outreach. A randomized controlled trial of academically based “detailing”. N Engl J Med. 1983;308(24):1457–1463.
31. May FW, Rowett DS, Gilbert AL, McNeece JI, Hurley E. Outcomes of an educational-outreach service for community medical practitioners: non-steroidal anti-inflammatory drugs. Med J Aust. 1999;170(10):471–474.
32. Department of Veterans Affairs, Office of Inspector General. Healthcare Inspection: VA Patterns of Dispensing Take-Home Opioids and Monitoring Patients on Opioid Therapy. Washington, DC: Office of Healthcare Inspections; 2014. Report #14-00895-163.
33. Macey TA, Morasco BJ, Duckart JP, Dobscha SK. Patterns and correlates of prescription opioid use in OEF/OIF veterans with chronic noncancer pain. Pain Med. 2011;12(10):1502–1509.
34. Seal KH, Shi Y, Cohen G, et al. Association of mental health disorders with prescription opioids and high-risk opioid use in US veterans of Iraq and Afghanistan. JAMA. 2012;307(9):940–947.
35. Army Suicide Prevention Task Force. Army Health Promotion, Risk Reduction, Suicide Prevention Report. 2010.
36. Jonas WB, Schoomaker EB. Pain and opioids in the military: we must do better. JAMA internal medicine. 2014;174(8):1402–1403.
37. Dobscha SK, Morasco BJ, Duckart JP, Macey T, Deyo RA. Correlates of prescription opioid initiation and long-term opioid use in veterans with persistent pain. The Clinical journal of pain. 2013;29(2):102–108.
38. Office of the Deputy Assistant Secretary of Defense (Military Community and Family Policy). 2014 Demographics: Profile of the Military Community. 2014.
39. Paulozzi LJ, Mack KA, Hockenberry JM. Variation among states in prescribing of opioid pain relievers and benzodiazepines - United States, 2012. Journal of safety research. 2014;51:125–129.
40. Maley PJ. South Carolina Lacks a Statewide Prescription Drug Abuse Strategy, Case #2012-223. Office of the Inspector General; 2013.
41. South Carolina Governor’s Prescription Drug Abuse Prevention Council. State plan to prevent and treat prescription drug abuse. South Carolina Office of the Governor; 2014.
42. Federal Register. Disclosures To Participate in State Prescription Drug Monitoring Programs. 38 CFR Part 1, RIN 2900-AO45. Department of Veterans Affairs; 2013.
43. Barth KS, Ball S, Adams RS, et al. Development and Feasibility of an Academic Detailing Intervention to Improve Prescription Drug Monitoring Program Use Among Physicians. J Contin Educ Health Prof. 2017;37(2):98–105.
44. Thomas CP, Kim M, Kelleher SJ, et al. Early experience with electronic prescribing of controlled substances in a community setting. J Am Med Inform Assoc. 2013;20(e1):e44–e51.
45. Thomas CP, Kim M, Nikitin RV, Kreiner P, Clark TW, Carrow GM. Prescriber response to unsolicited prescription drug monitoring program reports in Massachusetts. Pharmacoepidemiol Drug Saf. 2014.
46. Barrett K, Watson A. Physician perspectives on a pilot prescription monitoring program. J Pain Palliat Care Pharmacother. 2005;19(3):5–13.
47. Grahmann PH, Jackson KC 2nd, Lipman AG. Clinician beliefs about opioid use and barriers in chronic nonmalignant pain. J Pain Palliat Care Pharmacother. 2004;18(2):7–28.
48. Green CR, Wheeler JR, Marchant B, LaPorte F, Guerrero E. Analysis of the physician variable in pain management. Pain Med. 2001;2(4):317–327.
49. Holliday S, Magin P, Dunbabin J, et al. An evaluation of the prescription of opioids for chronic nonmalignant pain by Australian general practitioners. Pain Med. 2013;14(1):62–74.
50. Krebs EE, Lorenz KA, Bair MJ, et al. Development and initial validation of the PEG, a three-item scale assessing pain intensity and interference. J Gen Intern Med. 2009;24(6):733–738.
