Abstract
Objective
To develop a legal research protocol for identifying various measures of prescription drug monitoring program (PDMP) start dates, apply the protocol to create a useable PDMP database, and test whether the different legal databases that are meant to contain the same information produce divergent results when used in an illustrative empirical exercise.
Data sources
Original research from state statutes, regulations, policy statements, and interviews; alternative PDMP data from the National Alliance for Model State Drug Laws and Prescription Drug Abuse Policy System; claims from a 40 percent random sample of Medicare beneficiaries, 2006‐2014.
Study design
Collaborative research effort among a group of lawyers to develop the protocol. Legal research to produce an original database of the dates on which state PDMP laws (a) were enacted, (b) became operational, and (c) required a query before prescribing controlled substances. Descriptive analyses characterize differences in dates of enactment, operation, and must‐query requirements. Regression analyses estimate, for each beneficiary annually, whether any opioid prescription was received in a calendar year, among other measures. Estimates were conducted on the under‐age‐65 and full Medicare populations.
Data collection/extraction methods
PDMP legal databases were linked to annual Medicare claims.
Principal findings
Our original database differs from commonly used, publicly available data. The outcomes tested depend on the measure of PDMP date used and differ by data source. Must‐query laws show the largest effects among all the laws tested.
Conclusions
Data choices likely have had large consequences for study results and may explain contradictory outcomes in prior research. Researchers must understand and report the protocols for the dates used in PDMP research to ensure that results are internally consistent and verifiable.
Keywords: legal epidemiology, Medicare, opioids, policy evaluation, prescribing behavior, regulation
What this study adds.
Shows that commonly used databases of enactment and operational dates for PDMP laws differ markedly, despite purportedly reporting the same information.
Provides a rigorous legal research protocol for identifying various measures of PDMP start dates and applies the protocol to generate a PDMP database of various start dates.
Uses the database to demonstrate that results of PDMP studies differ because of the underlying legal date used.
What is known about this topic.
Prior studies show contradictory results on whether PDMPs affect opioid fills.
Studies on mandatory PDMPs are most likely to demonstrate effects.
1. INTRODUCTION
Opioid‐related deaths have been increasing for several decades, contributing more to the rise in mortality between 2014 and 2016 than any other cause. 1 , 2 , 3 Although the root causes of opioid fatalities are complex, high levels of opioid prescribing have contributed to preventable harms. The federal government has promoted, 4 and nearly all states have created, state‐level databases that track controlled substance prescriptions (Prescription Drug Monitoring Programs, or PDMPs).
Despite much research assessing whether PDMPs have reduced inappropriate prescribing, dispensing, and resultant harms, the answer remains uncertain. Many studies have been inconclusive, and results among them are often contradictory. 5 , 6 That studies consider different outcomes or populations or that only some account for contemporaneous policy interventions may explain disparate findings. In addition, studies consider specific characteristics of PDMPs—such as whether providers are mandated to use them, 7 , 8 the breadth of drugs included, and the frequency of data updates 9 —to different degrees. Here, we offer an additional explanation for these inconsistent results.
We focus on the key variable for assessment of PDMPs, the dates on which the PDMP or PDMP characteristics went into effect. Nearly all published PDMP studies rely on a few, publicly available databases for these dates, sometimes supplemented with interviews of state officials. Different sources often report strikingly divergent dates for the same or similar measure, including measures as foundational as whether a state had a PDMP in a given year. In addition, the methods used to construct the datasets are often unavailable, hindering the comparison of studies and the determination of which dataset is best suited to answer a particular policy question.
We seek to provide clarity by addressing the foundational legal research involved in PDMP analyses. We identified the major decisions necessary to generate an internally consistent PDMP database, developed research protocols incorporating those decisions, and applied them to create a new PDMP dataset. The application of our protocols generated data importantly different from those found in other sources. Finally, as an illustration of the importance of these decisions and the resulting data, we perform identical analyses with our data and, alternately, data from two, publicly available sources and generate divergent results.
2. METHODS
2.1. Legal research protocol and PDMP databases
Our team created a detailed protocol to identify the month and year that a state PDMP law was first enacted, when it became operational, and when, if ever, the law required prescribers to check it before prescribing a controlled substance (Appendix S1). We then applied that protocol to construct an original database (Table 1). Detailed reports listing all sources of dates are available online; in those documents, we include some information regarding legislation passed after we ended our analyses in July 2019. 10 If we could not identify a start date for a very early PDMP, we coded the date as pre‐1990, long before the typical PDMP research sample. We use the term "law" to include statutes, regulations, and subregulatory measures that have the force of law for the purposes of a PDMP start date.
TABLE 1.
PDMP legislation and operation dates
| Jurisdiction | (1) Enactment | (2) Enactment: funding contingent | (3) Enactment: electronic | (4) Modern system operational | (5) Prescriber must‐query |
|---|---|---|---|---|---|
| Alabama | Nov‐05 | Aug‐04 | Apr‐06 | Mar‐17 | |
| Alaska | Sep‐08 | Jan‐12 | Jun‐18 | ||
| Arizona | Sep‐07 | Dec‐08 | Oct‐17 | ||
| Arkansas | Mar‐13 | Jul‐11 | May‐13 | Aug‐17 | |
| California | Pre‐1990 | Jan‐05 | Sep‐09 | Oct‐18 | |
| Colorado | Jun‐05 | Feb‐08 | |||
| Connecticut | Oct‐06 | Jul‐08 | Oct‐15 | ||
| Delaware | Sep‐11 | Jul‐10 | Aug‐12 | ||
| DC | Feb‐14 | Oct‐16 | |||
| Florida | Dec‐10 | Oct‐11 | Jul‐18 | ||
| Georgia | Jul‐11 | Jul‐11 | May‐13 | Jul‐18 | |
| Hawaii | Pre‐1990 | Dec‐96 | Feb‐12 | Jul‐18 | |
| Idaho | Pre‐1990 | Apr‐00 | Apr‐08 | ||
| Illinois | Pre‐1990 | Apr‐00 | Dec‐09 | Jan‐18 | |
| Indiana | Pre‐1990 | Jul‐07 | Jul‐07 | ||
| Iowa | May‐06 | May‐06 | Mar‐09 | Jul‐18 | |
| Kansas | Jul‐08 | Apr‐11 | |||
| Kentucky | Jul‐98 | Jul‐99 | Jul‐12 | ||
| Louisiana | Jul‐06 | Jan‐09 | Aug‐14 | ||
| Maine | Jan‐04 | Jan‐04 | Jan‐05 | Jan‐17 | |
| Maryland | Oct‐11 | Oct‐11 | Dec‐13 | Jul‐18 | |
| Massachusetts | Dec‐92 | Feb‐13 | Jan‐11 | Dec‐14 | |
| Michigan | Pre‐1990 | Jan‐02 | Jan‐03 | Jun‐18 | |
| Minnesota | Jan‐09 | Jul‐07 | Apr‐10 | ||
| Mississippi | Jun‐06 | Jul‐08 | |||
| Missouri | Jul‐17 | ||||
| Montana | Jul‐11 | Oct‐12 | |||
| Nebraska | Aug‐11 | Jan‐17 | |||
| Nevada | Jan‐96 | Feb‐11 | Oct‐15 | ||
| New Hampshire | Jun‐12 | Oct‐14 | May‐16 | ||
| New Jersey | Aug‐09 | Jan‐12 | Nov‐15 | ||
| New Mexico | Jul‐04 | Aug‐05 | Feb‐13 | ||
| New York | Pre‐1990 | Oct‐06 | Jun‐13 | Aug‐13 | |
| North Carolina | Jan‐06 | Jul‐07 | Jun‐17 | ||
| North Dakota | Dec‐06 | Apr‐05 | Oct‐08 | Jan‐18 | |
| Ohio | May‐05 | Oct‐06 | Apr‐15 | ||
| Oklahoma | Jan‐91 | Jul‐06 | Nov‐15 | ||
| Oregon | Jul‐09 | Sep‐11 | |||
| Pennsylvania | Pre‐1990 | Jun‐15 | Aug‐16 | Jun‐15 | |
| Rhode Island | Pre‐1990 | Aug‐95 | Sep‐12 | Jun‐16 | |
| South Carolina | Jun‐06 | Feb‐08 | May‐17 | ||
| South Dakota | Mar‐10 | Mar‐12 | |||
| Tennessee | Jan‐03 | Jan‐10 | Apr‐13 | ||
| Texas | Aug‐81 | Sep‐99 | Aug‐12 | Sep‐19 | |
| Utah | Jul‐95 | Jan‐06 | May‐18 | ||
| Vermont | Jun‐08 | May‐06 | Jan‐09 | Nov‐13 | |
| Virginia | Sep‐03 | Jun‐06 | Jul‐15 | ||
| Washington | Aug‐11 | Jul‐07 | Jan‐12 | ||
| West Virginia | May‐96 | Sep‐02 | May‐13 | Jun‐12 | |
| Wisconsin | Jun‐10 | Jun‐13 | Apr‐17 | ||
| Wyoming | Jul‐03 | Jul‐13 |
“Enactment” is the date when a state required a dispenser or prescriber to report a written or filled prescription, including paper submissions. For statutes requiring the state to secure funding before mandating reporting, the date is when funding was secured. “Enactment: Funding Contingent” is the date on which the statute requiring funding to be secured before mandating reporting was passed. “Enactment: Electronic” is the date on which the state enacted a law creating a modern, electronic PDMP. “Modern System Operational” is the date on which PDMP data became accessible to any user (eg, physician, pharmacist, or member of law enforcement) authorized by state law to receive it. “Prescriber Must‐Query” is the date on which a law mandated that prescribers check the database before prescribing a listed opioid. All dates were determined according to the detailed research protocol described in Appendix S1. The dates in columns 1‐4 are current through July 2019, and in column 5 through February 2020.
A lawyer‐librarian, with the assistance of law students, created a preliminary database using electronic and paper legal sources. Four of the authors, all lawyers, then discussed each enactment and operational date, evaluating the supporting evidence in the case of uncertain dates. In many cases, we then conducted additional research, including consulting regulations available only in paper form and phoning state authorities to clarify legislative processes and PDMP laws. To investigate and resolve any discrepancies, we next compared our results to dates used by other researchers, including but not limited to the sources discussed above. We determined the final dates found in Table 1 by consensus.
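For analysis, the month‐year dates in Table 1 must be converted into machine‐readable values. The sketch below is a minimal, hypothetical illustration of one way to do so in Python with pandas; the column names, the example rows ("State A", "State B"), and the choice of a December 1989 sentinel for "pre‐1990" are our own illustrative assumptions, not part of the published protocol.

```python
# Hypothetical sketch of how the Table 1 dates could be encoded for analysis.
# Column names, example values, and the pre-1990 sentinel are illustrative
# choices, not part of the published protocol (see Appendix S1).
import pandas as pd

records = [
    # jurisdiction, enactment, enactment_electronic, operational, must_query
    ("State A", "Nov-05",   "Apr-06", "Mar-09", None),
    ("State B", "pre-1990", "Jan-05", "Sep-09", "Oct-18"),
]

df = pd.DataFrame(
    records,
    columns=["jurisdiction", "enactment", "enactment_electronic",
             "operational", "must_query"],
)

def to_month(value):
    """Parse 'Mon-YY' strings; code 'pre-1990' to an early sentinel month."""
    if value is None:
        return pd.NaT
    if value == "pre-1990":
        return pd.Timestamp("1989-12-01")
    return pd.to_datetime(value, format="%b-%y")

for col in ["enactment", "enactment_electronic", "operational", "must_query"]:
    df[col] = df[col].map(to_month)

print(df)
```

Coding all start dates to the first day of the reported month keeps the database consistent when it is later merged with annual or monthly claims data.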
2.1.1. Enactment
Defining and identifying a PDMP enactment date are more challenging than they might appear. Most laws simply authorize the creation of a PDMP and set out general requirements for it. However, some laws authorize creation of a PDMP only if funding is secured. Still other laws state a date by which the PDMP must be created.
We define “enactment” as the date on which a bill, regulation, or administrative action became law that required dispensers or prescribers to send information regarding written or dispensed prescriptions (as physical copies via mail or fax, or as electronic data) to an authority responsible for compiling prescription information. We coded dates according to the particular enactment rules of each state (ie, when a legislature passes a bill, a governor signs a bill, or a specified number of days or months after a governor signs a bill). For state statutes that require the establishment of a PDMP by a future specified date, we list the date by which the PDMP is required to exist and also provide a second column in Table 1 with the effective date of the statute itself.
This definition includes older, multiple prescription systems, which required that a physical copy of the prescription be forwarded to a state agency. Although these older programs did not have the capabilities of a modern electronic system, we include them because they represent the fact that state authorities were paying attention to controlled substance prescribing and that such attention could have influenced prescribing or dispensing decisions. In fact, there is evidence that at least some of these early programs influenced prescribing, leading to reductions in the prescription of some controlled substances. 11 , 12
Nonetheless, researchers should distinguish between modern electronic systems and paper‐based triplicate programs, such as those in New York (1970s) and California (1930s). Therefore, we also report when states with paper programs switched to electronic programs.
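The branching rules described in this subsection can be restated compactly in code. The sketch below is a hypothetical, simplified restatement of those rules for a single statute; the dataclass fields and function names are invented for illustration, and Appendix S1 remains the authoritative statement of the protocol.

```python
# Hypothetical restatement of the enactment-date decision rules described
# above. Field and function names are illustrative; Appendix S1 governs.
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class PdmpStatute:
    effective_date: date                      # when the statute took effect under state rules
    funding_contingent: bool = False          # reporting mandate contingent on securing funding
    funding_secured_date: Optional[date] = None
    required_by_date: Optional[date] = None   # statutory deadline by which the PDMP must exist

def enactment_date(statute: PdmpStatute) -> date:
    """Primary 'Enactment' date (Table 1, column 1)."""
    if statute.funding_contingent:
        # Reporting only became mandatory once funding was secured.
        if statute.funding_secured_date is None:
            raise ValueError("funding never secured; no enactment date")
        return statute.funding_secured_date
    if statute.required_by_date is not None:
        # Statute sets a future date by which the PDMP must exist.
        return statute.required_by_date
    return statute.effective_date

def secondary_enactment_date(statute: PdmpStatute) -> Optional[date]:
    """Second column: effective date of the contingent (or deadline) statute itself."""
    if statute.funding_contingent or statute.required_by_date is not None:
        return statute.effective_date
    return None

# Example: a funding-contingent statute effective mid-2006, funding secured early 2008.
s = PdmpStatute(effective_date=date(2006, 7, 1),
                funding_contingent=True,
                funding_secured_date=date(2008, 2, 1))
print(enactment_date(s), secondary_enactment_date(s))
```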
2.1.2. Operational
We identify the operational date as the month and year that PDMP data first became accessible electronically (ie, not via phone or fax) to any party authorized to access it (eg, a physician or pharmacist). Although some states operated pilot programs allowing access for a limited set of users, we report the date at which the full program became operational.
2.1.3. Must‐query laws
The scope of must‐query laws varies widely. They may cover different parties (eg, pharmacists, prescribers, or both), settings (eg, pain clinics), substances (eg, listed by schedule), patients (eg, all patients, those receiving opioid agonist treatment, or those receiving nonpalliative care), or circumstances (eg, each prescription, the first prescription, upon reasonable suspicion of misuse, or if the patient is unknown to the prescriber). We identify the “must‐query” date as when a state law required a prescriber to check the PDMP before prescribing any category of controlled substance likely to contain prescription opioids. We exclude laws that apply only to limited settings such as pain clinics, but we include those with narrow exemptions, such as for prescribing to patients with cancer for palliative treatment.
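As with the enactment rules, the must‐query inclusion criteria can be summarized as a simple predicate. The sketch below is a hypothetical restatement with invented field names, intended only to make the inclusion and exclusion logic explicit; it is not the protocol itself.

```python
# Hypothetical predicate restating the must-query inclusion rule described
# above; field names are invented for illustration, and Appendix S1 governs.
from dataclasses import dataclass

@dataclass
class QueryMandate:
    applies_to_prescribers: bool          # pharmacist-only mandates do not count
    covers_opioid_schedules: bool         # covers a drug category likely to include opioids
    limited_to_settings: bool = False     # e.g., applies only in pain clinics
    narrow_exemptions: tuple = ()         # e.g., ("palliative cancer care",)

def counts_as_must_query(m: QueryMandate) -> bool:
    """True if the mandate meets the paper's 'prescriber must-query' definition."""
    if not (m.applies_to_prescribers and m.covers_opioid_schedules):
        return False
    if m.limited_to_settings:
        return False
    # Narrow exemptions (such as palliative cancer care) do not disqualify a law.
    return True

print(counts_as_must_query(QueryMandate(True, True, narrow_exemptions=("palliative cancer care",))))  # True
print(counts_as_must_query(QueryMandate(True, True, limited_to_settings=True)))                        # False
```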
2.1.4. Publicly available databases
Our comparison data are from two databases commonly cited in PDMP research: the National Alliance for Model State Drug Laws (NAMSDL) 13 , 14 , 15 and the Prescription Drug Abuse Policy System (PDAPS). 6 , 16 Although some scholars cite data from the Brandeis University PDMP Training and Technical Assistance Center (TTAC), 15 we do not compare our findings to TTAC data because they consist of only the year of enactment and because PDAPS stated that it relied on TTAC data for its “Implementation Dates” dataset 16 during the relevant time period.
Appendix S2 reports the PDMP enactment, operational (general and user access), and must‐query dates from databases on the websites of NAMSDL and PDAPS that were publicly available as of April 2018, as well as “must‐query” dates from PDAPS as of July 2019. As of April 2018, NAMSDL reported that its PDMP “research is conducted using nationwide legal database software, individual state legislative websites and direct communications with state PDMP representatives,” without further explanation of sources. 17 When we initially compiled the dates of enactment, operation, and user access from PDAPS’ PDMP Implementation Dates page, PDAPS reported that they “were compiled through contact with PDMP administrators from each state program by Brandeis’ PDMP” TTAC; since then, PDAPS has made a more detailed research protocol publicly available, but the dates on the Implementation Dates page have remained substantially the same (the few updates appear to be rounding differences of one day). 18
3. EMPIRICAL ILLUSTRATION
We perform multiple identical analyses on the same sample, changing only the PDMP variable, to assess whether different measures of PDMPs derived from our legal research and existing public databases yield different estimates of the relationship between PDMPs and a measure of opioid use. The PDMP variable consists, alternately, of our data on PDMP dates (legislative enactment, operational, or must‐query date) and then of analogous data from the sources described above. Because this illustration is meant to determine whether our legal data are different in important ways from other sources rather than produce substantive conclusions regarding PDMP effectiveness, we only summarize the sample data and empirical approach employed. Readers may find more detail in our previous work on which these analyses are based and in Appendix S2.
In brief, patient data are a 40 percent random sample of Medicare beneficiaries from 2006 through 2014 enrolled in Medicare fee‐for‐service (Parts A, B, and D) in a calendar year. We exclude patients with evidence of advanced cancer, hospice care, or end‐stage renal disease. In our main specification, the outcome is whether any opioid prescription is filled in a calendar year in a sample of Medicare beneficiaries under age 65, most of whom are disabled workers receiving federal disability insurance, a group particularly likely to use opioids and to suffer from related harms. 19 , 20 We estimate the probability of receiving an opioid using ordinary least squares (OLS) regressions controlling for year effects, patient state of residence, age, clinical diagnoses, and Medicaid enrollment (see Appendix). We also control for whether a state had adopted seven other types of opioid regulation and cannabis laws in that year.
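For concreteness, the sketch below shows a minimal version of such a linear probability model in Python using statsmodels, with state and year indicators and standard errors clustered by state. The input file and all variable names are hypothetical placeholders; the actual sample construction, covariates, and PDMP date sources are described in Appendix S2.

```python
# Minimal sketch of a linear probability model of any opioid fill, assuming a
# hypothetical beneficiary-year panel; variable names are placeholders only.
import pandas as pd
import statsmodels.formula.api as smf

# One row per beneficiary-year, with an indicator for whether the state's PDMP
# measure (from a given date source) was in effect in that year.
panel = pd.read_csv("beneficiary_year_panel.csv")  # hypothetical file

formula = (
    "any_opioid_fill ~ pdmp_in_effect"       # swap in enactment/operational/must-query indicator
    " + C(state) + C(year)"                   # state and year fixed effects
    " + C(age_group) + female + black + other_nonwhite + hispanic"
    " + hcc_count + depression + low_income_subsidy + medicaid"
    " + other_opioid_laws + cannabis_law"
)

model = smf.ols(formula, data=panel).fit(
    cov_type="cluster", cov_kwds={"groups": panel["state"]}  # errors clustered by state
)
print(model.params["pdmp_in_effect"], model.bse["pdmp_in_effect"])
```

Rerunning the same model while changing only the source of the `pdmp_in_effect` indicator (our dates, NAMSDL, or PDAPS) is what generates the comparisons reported below.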
4. RESULTS
We find that different measures of PDMPs—for example, enactment of any PDMP versus enactment of an electronic PDMP—generate vastly different dates. Moreover, we find that our protocol and the other sources produce widely varying dates for analogous measures of PDMPs commonly used in research (see Appendix S2, Tables S1 and S2). For example, PDAPS lists the enactment date of California as 1938 (when the paper system started), NAMSDL codes it as 2003, and, depending on our measure (enactment or electronic system enactment), we code it as pre‐1990 or as 2005.
Because rigorous studies tend to identify the effects of PDMPs from changes in PDMP laws within a state (ie, using state fixed effects), these differences in dates of PDMP enactment or operation mean that estimates using one set of dates will be identified from experiences in a different set of states than analyses using another set of dates. For example, as can be seen in Table 1 (Column 4), we report dates representing when a modern system became operational for 49 states. Corresponding data are available for 37 states in the NAMSDL database on user access (Appendix S2, Table 1, Column 6) and for 38 states in the PDAPS database on user access (Appendix S2, Table 1, Column 3). Not only does the comprehensiveness of datasets matter, but differences in methodology also affect which states are included in a sample. For example, we find that 43 states have user access dates that began within 2006‐2014, the sample period for our empirical work, whereas the corresponding numbers are 29 states from PDAPS and 28 from NAMSDL.
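The coverage comparison just described amounts to counting, for each source, the states whose user‐access (operational) date falls inside the study window. A minimal, hypothetical sketch, with an invented file layout and column names:

```python
# Hypothetical sketch: count states whose user-access/operational date falls
# within the 2006-2014 study window, by date source. File and column names
# are illustrative placeholders.
import pandas as pd

# Long-format file: one row per (state, source) with the operational date.
dates = pd.read_csv("pdmp_dates_by_source.csv",
                    parse_dates=["operational_date"])

start, end = pd.Timestamp("2006-01-01"), pd.Timestamp("2014-12-31")
in_window = dates[dates["operational_date"].between(start, end)]

print(in_window.groupby("source")["state"].nunique())  # e.g., authors vs PDAPS vs NAMSDL
```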
Our empirical illustration further demonstrates the importance of these results for examining the potential effects of PDMPs. We find that whether, and the extent to which, PDMP laws are correlated with changes in various measures of opioid prescriptions filled depends both on the measure of PDMP used (ie, legislation enacted, PDMP operational, or must‐query) and on the source of data for any given measure. In our main analyses, we find that different measures (legislation enacted, PDMP operational, and must‐query) generate different results. This can be seen most clearly in Figure 1, which shows estimates grouped by each measure. In addition, although the “must‐query” measure yields the most robust results in terms of reductions in the probability of having any opioid prescriptions filled, the magnitude of that effect varies depending on the data source; our dates and PDAPS’ dates produce similar (if imprecisely measured) results, whereas NAMSDL’s measures produce an estimated effect that is larger in magnitude.
FIGURE 1.

Any opioid prescription received in calendar year, Medicare beneficiaries under age 65, percentage point change. Note: Mean percentage of beneficiaries under age 65 filling any prescription for opioids = 45.28 (SD 49.78). Authors’ calculations using Medicare Part D claims data for person‐years from 2006 to 2014, based on a linear probability model of whether an individual Medicare beneficiary filled one or more opioid prescriptions in the year. All models control for age (30‐39, 40‐49, 50‐59, 60‐64), black race, other nonwhite race, Hispanic ethnicity, female sex, the number of hierarchical condition categories used to adjust payments to Medicare Part D drug plans (illness severity), diagnosis of depression, whether the beneficiary received any low‐income subsidy (a proxy for poverty), Medicaid enrollment, indicators for each state, indicators for each year, state‐level cannabis laws, and other state opioid laws. All regressions adjust for correlation of errors within states using Huber‐White sandwich estimators.
Across a broad set of outcomes (Appendix S2), few correlations between opioid‐related outcomes and the various measures of PDMP enactment or implementation are statistically significant at conventional levels. However, there are very large differences in the point estimates depending on the data source. For example, based on the mean annual daily morphine equivalent in prescriptions filled among the Medicare population under 65 during our study period, 20.6 mg, the NAMSDL and PDAPS measures produce results about five times larger than the statistically insignificant results using our measures. And although the negative relationship between must‐query provisions and filling prescriptions at more than four pharmacies is confirmed across data sources, the magnitude of that effect varies by a factor of two (see Appendix S2, Tables S3‐S6 and Figures S1‐S7 for regression results and related figures).
5. DISCUSSION
Creating consistent and reliable measures of PDMP laws is difficult, as can be inferred both from our lengthy research protocol and from the differences in dates published by commonly used public sources that aim to capture the same measure. When state policy interventions such as PDMPs contain many different potential characteristics, all of which evolve over time, it can be difficult to create consistent measures even within a single state. The growth of legal epidemiology, a field devoted to the appropriate use of law for research purposes, may help alleviate the problem. 21
Our empirical exercise illustrates that the choice of data source can have important consequences for findings. Using one source can mean excluding a large state, whereas using another source would not, making results more or less generalizable depending on the data source. That we find some large differences in quantitative estimates when we vary only the source of legal data in otherwise identical models recommends care in choosing a data source.
Researchers will continue to rely on publicly available databases as sources for dates in PDMP studies. Indeed, using third party sources can help protect against unconscious researcher bias. It is crucial that such sources make available detailed information on key definitions and how relevant dates were determined, as some of this information became available only recently. Furthermore, they should explain how inevitable conflicts in primary sources were resolved. Other researchers may make different and equally reasonable choices regarding how to code legal data or whether to use third party sources. Nonetheless, protocols need to be implemented consistently and transparently regardless of the data source, and limitations of any given decision must be acknowledged.
If policymakers rely on overstated estimates of ineffective programs, they will waste scarce resources and focus attention on the wrong solution, allowing people to continue suffering. But if they rely on understated estimates of effective programs, they will unnecessarily slow action to treat people in need. Accurately estimating the effects of real‐world policies is almost always hard to do precisely. 22 But it is very important to do it as well as possible.
Since responsible policymakers need to act without certainty, particularly when addressing crises like this one, access to transparent research methods is critical. There is too much at stake to do otherwise.
Supporting information
Supplementary Material
Appendix S1
Appendix S2
ACKNOWLEDGMENT
Joint Acknowledgment/Disclosure Statement: The authors thank Chris Auld for helpful comments and Allison Borsheim and Joshua Parson for research assistance. Horwitz thanks workshop participants at the Berkeley Law School; Canadian Health Economists Study Group; Centre for Law and Economics, ETH Zurich; School of Public Health, Penn State; and School of Public Policy, University of Southern California. Supported by grants (P01AG019783 and U01AG046830) from the National Institute on Aging.
Corey Davis received approximately $450 in consulting fees from PDAPS in 2014‐15 and served on the PDAPS expert advisory committee, without compensation, in 2016‐2018.
Horwitz JR, Davis C, McClelland L, Fordon R, Meara E. The importance of data source in prescription drug monitoring program research. Health Serv Res. 2021;56:268‐274. doi:10.1111/1475-6773.13548
REFERENCES
- 1. Sandoe E, Fry CE, Frank RG. Policy levers that states can use to improve opioid addiction treatment and address the opioid epidemic. Health Aff Blog. 2018. https://www.healthaffairs.org/do/10.1377/hblog20180927.51221/full/. Accessed August 14, 2020.
- 2. Hedegaard H, Warner M, Minino A. Drug Overdose Deaths in the United States, 1999–2016. NCHS Data Brief No. 294, December 2017. https://www.cdc.gov/nchs/products/databriefs/db294.htm. Accessed August 14, 2020.
- 3. Murphy SL, Xu J, Kochanek KD, Arias E. Mortality in the United States, 2017. NCHS Data Brief No. 328, November 2018. https://www.cdc.gov/nchs/products/databriefs/db328.htm. Accessed August 14, 2020.
- 4. CDC. What States Need to Know about PDMPs. https://www.cdc.gov/drugoverdose/pdmp/states.html. Accessed July 10, 2019.
- 5. Fink DS, Schleimer JP, Sarvet A, et al. Association between prescription drug monitoring programs and nonfatal and fatal drug overdoses: a systematic review. Ann Intern Med. 2018;168(11):783‐790.
- 6. Davis CS. Commentary on Pardo (2017) and Moyo et al. (2017): much still unknown about prescription drug monitoring programs. Addiction. 2017;112(10):1797‐1798.
- 7. Buchmueller TC, Carey C. The effect of prescription drug monitoring programs on opioid utilization in Medicare. Am Econ J Econ Policy. 2017;10(1):77‐112.
- 8. Meara E, Horwitz JR, Powell W, et al. State legal restrictions and prescription‐opioid use among disabled adults. N Engl J Med. 2016;375(1):44‐53.
- 9. Patrick SW, Fry CE, Jones TF, Buntin MB. Implementation of prescription drug monitoring programs associated with reductions in opioid‐related death rates. Health Aff. 2016;35(7):1324‐1332.
- 10. https://www.dropbox.com/s/2wry51uf4n0j5bw/Enacted_Operational_Sourcesdocx?dl=0; https://www.dropbox.com/s/w7weea53hivz9x1/Must_Query_sources_2019.docx?dl=0. Accessed August 14, 2020.
- 11. Sigler KA, Guernsey BG, Ingrim NB, et al. Effect of a triplicate prescription law on prescribing of Schedule II drugs. Am J Health Syst Pharm. 1984;41(1):108‐111.
- 12. Weintraub M, Singh S, Byrne L, Maharaj K, Guttmacher L. Consequences of the 1989 New York State triplicate benzodiazepine prescription regulations. JAMA. 1991;266(17):2392‐2397.
- 13. Bao Y, Pan Y, Taylor A, et al. Prescription drug monitoring programs are associated with sustained reductions in opioid prescribing by physicians. Health Aff. 2016;35(6):1045‐1051.
- 14. Deyo RA, Irvine JM, Millet LM, et al. Measures such as interstate cooperation would improve the efficacy of programs to track controlled drug prescriptions. Health Aff. 2013;32(3):603‐613.
- 15. Dowell D, Zhang K, Noonan RK, Hockenberry JM. Mandatory provider review and pain clinic laws reduce the amounts of opioids prescribed and overdose death rates. Health Aff. 2016;35(10):1876‐1883.
- 16. Prescription Drug Abuse Policy System. PDMP Implementation Dates. http://pdaps.org/datasets/pdmp‐implementation‐dates. Accessed July 31, 2019.
- 17. National Alliance for Model State Drug Laws. PDMP Dates of Operation. December 2014. https://web.archive.org/web/20180419233955/https://namsdl.org/library/580225E9‐E469‐AFA9‐50E7579C1D738E71. Accessed August 14, 2020.
- 18. Prescription Drug Abuse Policy System. PDMP Implementation Dates. http://pdaps.org/datasets/pdmp‐implementation‐dates. Accessed July 31, 2019.
- 19. Morden NE, Munson JC, Colla CH, et al. Prescription opioid use among disabled Medicare beneficiaries: intensity, trends and regional variation. Med Care. 2014;52(9):852‐859.
- 20. Axeen S. Trends in opioid use and prescribing in Medicare, 2006–2012. Health Serv Res. 2019;53(5):3309‐3328.
- 21. Burris S, Ashe M, Levin D, Penn M, Larkin M. A transdisciplinary approach to public health law: the emerging practice of legal epidemiology. Annu Rev Public Health. 2016;37:135‐148.
- 22. Ruhm CJ. Shackling the identification police? Southern Econ J. 2019;85(4):1016‐1026.