Implementation Science Communications
2025 Jul 26;6:77. doi: 10.1186/s43058-025-00763-4

Efficiency in health care: connecting economic evaluations with implementation objectives

Todd H Wagner 1,2, Ties Hoomans 3, Ramzi G Salloum 4, Douglas E Levy 5
PMCID: PMC12296606  PMID: 40713845

Abstract

Introduction

Economic evaluations are helpful for efficient resource use. This paper aims to clarify the relationship between economic evaluation methods and two types of health care efficiency, aiding implementation scientists in selecting the appropriate approach for their research.

Methods

We clarify the connection between cost-effectiveness analysis (CEA) and allocative efficiency, and explain how budget impact analysis (BIA) more closely connects with productive efficiency. We also discuss other methods that researchers can use to analyze an organization's productive efficiency, given increasing pressure for health care organizations to be efficient.

Results

Allocative efficiency seeks to maximize social welfare through optimal resource distribution. Productive efficiency focuses on an organization’s ability to maximize its output given its resource constraints. CEA, particularly when incorporating a societal perspective, assesses allocative efficiency. BIA, which often has a short time horizon and more focused perspective, assesses productive efficiency. When organizational leaders ask implementation scientists for an economic evaluation, it is important to determine whether they want a CEA or a BIA, given they answer different questions, often employing different methods. We also present other methods for measuring efficiency and causes of inefficiency stemming from fixed costs, scale, scope, regulations, labor, and decision-making.

Conclusions

Implementation scientists must recognize that CEA and BIA serve distinct purposes and are not interchangeable. Choosing the right economic evaluation tool is crucial for answering specific research questions and building research teams. Future implementation work will also need to measure efficiency so that it is sustainable.

Keywords: Efficiency, Productivity, Waste, Costs, Economies of scale, Behavioral economics


Contributions to the literature.

  • Implementation scientists seek to conduct economic evaluations to support learning health care systems.

  • Economic evaluations, such as cost-effectiveness analysis and budget impact analysis, address two different aspects of efficiency, so understanding efficiency is helpful for choosing the right method.

  • We highlight how efficiency analysis may help sustain implementation strategies.

Introduction

Leaders in health care delivery organizations must make decisions and set priorities to improve care and outcomes with limited budgets. Not only do they need to invest in the right services, but they must also consider the resources needed for implementation and practice change. Wise decisions can help make the organization more efficient, while poor choices can create inefficiencies that persist for years. For example, investing in a new diagnostic tool without adequate training and workflow integration may limit its impact and waste resources. Similarly, prematurely scaling a remote patient monitoring program in primary care without addressing barriers to adoption and maintenance may result in low effectiveness and limited reach [1].

To navigate complex decisions and build learning health care systems [2], organizational leaders are increasingly asking implementation scientists to conduct economic evaluations to ensure efficient resource use. There are many types of economic evaluations, and each informs a different aspect of efficiency. Although defining efficiency sounds simple, anyone who ventures into the efficiency literature will be confronted with substantial technical details and jargon, with the term used in different, and sometimes seemingly conflicting contexts [3]. The objective of this paper is to clarify these connections and to help implementation scientists select the right method for their economic evaluation.

The link between economic evaluations and efficiency

Health care efficiency falls into two main types: allocative efficiency—distributing resources in a way that best achieves societal goals, such as population health or social welfare, and productive efficiency—an organization’s ability to maximize the production of its outputs from available inputs and technology. Allocative efficiency is important for national policy makers, such as the National Health Service in the United Kingdom or the Centers for Medicare and Medicaid Services in the United States, while hospital directors and clinic managers may be more focused on productive efficiency. As we discuss below, allocative and productive efficiency relate to different economic evaluations.

Allocative efficiency and cost-effectiveness analysis

Cost-effectiveness analysis (CEA) is the best-known tool for conducting economic evaluations in health and medicine. In the years after Weinstein and Stason’s [4] landmark paper published almost fifty years ago, experts recommended conducting CEA from a societal perspective with a long-term time horizon [5, 6]. Although comprehensive societal analyses are rare due to data limitations [7], this approach provides information on whether a new treatment or program efficiently maximizes social welfare [8]. In theory, social welfare encompasses a society’s health, estimated by aggregating quality of life and social well-being across individuals. Efficiency in this context is referred to as allocative efficiency; put simply, it addresses the question: does the allocation of resources maximize social welfare? Fig. 1, adapted from Chandra et al. [9], shows the relationship between the resources invested in health on the X axis and the amount of health those resources produce on the Y axis. This figure is referred to as the health care production possibilities frontier. Points A-G all represent different interventions. Interventions A-E all generate gains in health but at an added cost. A cost-effectiveness analysis would show that interventions F and G are not cost-effective, relative to other treatment options. Societies can choose how much to invest in health care, and higher investments yield more health, but there is a tradeoff because investments in health consume more resources that could be invested elsewhere (e.g., education or roads) to generate social welfare.

Fig. 1.

Fig. 1

Allocative efficiency and production possibilities frontier. Caption: points A-G represent interventions. The production possibility frontier shows what health could be produced with different interventions. Interventions A-E all improve health but require more resources. Interventions F and G are not cost-effective compared to alternatives. Health in this context is population health, what economists refer to as social welfare

In Fig. 1, the Y axis represents health or idealized wellness. We do not have a perfect measure of health. Quality adjusted life years (QALYs) and disability adjusted life years (DALYs) are widely used, but even these methods have challenges and limitations [10, 11]. Production above the curve is not possible, given current technology and medical knowledge. Innovations provide opportunities to produce more health outputs, but often at an additional cost. For example, Chimeric Antigen Receptor T-Cell therapy has provided a new way to treat acute lymphoblastic leukemia, but the cost of the drug can exceed USD 450,000. Not surprisingly, many ask, “is the new innovation worth it?” CEA uses the incremental cost-effectiveness ratio to compute the additional gains in health relative to the additional resources required to achieve them.

At any point on the production possibilities frontier shown in Fig. 1, the slope of the curve represents the rate of spending relative to the outputs produced. Moving from points A to B requires about the same resources as moving from D to E, but moving from D to E produces substantially fewer gains in health than moving from A to B. The slope at point E is nearly zero. This is referred to as “the flat of the curve” [12]. In cancer screening, low-cost strategies, such as patient reminders, may result in 70–80% of the population getting screened. But more expensive strategies are usually needed to improve screening rates further. It is also possible to overinvest in health, as shown on the far right of the curve, a phenomenon Grady and Redberg [13] refer to as when “Less is More.” While most experts would recommend not investing in technologies or programs on the flat of the curve, each government can set its own willingness to pay for gains in health. This is the willingness to pay threshold that commonly appears in CEA papers [14].
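As a concrete sketch, the incremental cost-effectiveness ratio (ICER) divides the additional cost of an intervention by the additional health it produces, and the result is compared against a willingness-to-pay threshold. The figures below are hypothetical, not taken from the paper:

```python
def icer(cost_new, cost_old, effect_new, effect_old):
    """Incremental cost-effectiveness ratio: extra cost per extra unit of health."""
    return (cost_new - cost_old) / (effect_new - effect_old)

# Hypothetical therapy: $450,000 vs $100,000 usual care; 3.5 vs 2.0 QALYs
ratio = icer(450_000, 100_000, 3.5, 2.0)  # extra dollars per extra QALY

# Compare against an assumed willingness-to-pay threshold of $100,000/QALY
cost_effective = ratio <= 100_000
```

With these assumed inputs the ratio is roughly $233,000 per QALY, so the therapy would not be deemed cost-effective at that threshold.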

CEAs that take a societal perspective (i.e., assessing the health of nations) may not answer questions posed by health care organizations. Individual organizations generally have narrower interests within their programmatic priorities and budgetary constraints. The Second Panel on Cost-Effectiveness in Health and Medicine acknowledged the need for CEA models with shorter time horizons and more limited perspectives [15]. Such modifications may be appealing to organizational decision makers, but as we discuss next, these models may not be measuring allocative efficiency and may instead be measuring productive efficiency.

Productive efficiency and budget impact analysis

Health care organizations provide health care services to patients by investing in staff, supplies, space, and equipment— what economists generically refer to as labor and capital. Each organization has its own priorities and objectives, whether that is maximizing profit or providing as many health services as possible [16]. Organizations are efficient when they maximize the quantity of outputs they produce given the resources or inputs used, along with the available technology and knowledge [17]. No health care organization strives to be inefficient because it implies that its production process is not as good as it could be, and the organization has the potential to produce more output with the same inputs. Efficiency, when used in this context, is referred to as productive or technical efficiency.

Again, we can use the production possibilities frontier to represent productive efficiency. Figure 2 looks similar in most regards to Fig. 1, with some important changes. First, the Y axis has changed from social welfare to the organization’s specific objective(s). Rarely are individual organizations trying to maximize the health of society. Next, each point in Fig. 2 represents an organization that has allocated its resources on staff, facilities, equipment, and supplies to produce outputs [17]. An organization that invests a greater share of its resources into staff will have fewer resources to allocate to the other inputs. There is no single allocation rule that works equally well for every organization. Primary care clinics generally devote a greater share of their resources to labor, whereas hospitals generally invest a greater proportion of their resources into capital, such as scanners, ventilators, or surgical suites [18]. Finally, we introduce inefficient organizations. Efficient organizations (points A, C and F) are producing on the production possibilities frontier (see Fig. 2) [17]. Conversely, inefficient organizations are producing “off the frontier”; organizations represented by B, D, and E could produce more products and services with the same inputs. Efficient organizations can create goods and services that have lower costs than inefficient organizations, all else being equal. Budget impact analysis (BIA) is a common economic evaluation that can help an organization become more productively efficient [19–22].

Fig. 2.

Fig. 2

Productive efficiency. Caption: points A-F are health care organizations. The production possibility frontier shows the health that different organizations are producing relative to what could be produced if they were productively efficient. The Y axis is health as measured by the organization’s objective function

Connecting allocative efficiency and productive efficiency

Organizational leaders are motivated to optimize their organization’s productive efficiency because it affects outcomes they care about. However, their decisions may not be maximizing social welfare. In perfectly competitive markets, productive and allocative efficiency are linked through what Adam Smith referred to as the invisible hand of the market. Perfectly competitive markets are driven by consumer demand, and organizations respond by providing the services consumers want. Thus, perfectly competitive markets provide signals to organizations on what they should produce to maximize social welfare [23]. In a competitive market, this process happens continuously over time. Thus, organizations must continually strive to be efficient [24].

The restaurant industry in most cities, such as New York, is one example where there are tight connections between productive and allocative efficiency. Consumers are well informed through online reviews, and there are few barriers for a restaurant to enter or exit the market. Consumers can easily judge value, and they stop going to restaurants where the cost/quality tradeoff is poor relative to the competition. Implementation science is thus not common in perfectly competitive markets like this one, where businesses that fail to respond to market signals simply go out of business.

It is well-established, however, that health care markets are stubborn exceptions to perfect competition [25]. Without a well-functioning market, the tight connection between productive efficiency and allocative efficiency is lost. A health care organization can minimize its costs of production, but still produce too much specialty care or overinvest in technologies that may have little societal value [26]. For example, many hospitals in the US continue to add intensive care beds, even though up to half of beds are not being used by patients needing life sustaining treatment [27, 28].

Unlike restaurants, a health care organization can survive and be sustainable in the long-term, despite being inefficient. To maximize social welfare in health care, two general conditions must hold. First, health care organizations must be productively efficient. Second, health care organizations must be providing services that maximize allocative efficiency. Implementation scientists can use implementation strategies to inform, and sometimes persuade, organizations to produce services that maximize allocative efficiency.

Types of economic evaluations

Cost-effectiveness analysis for allocative efficiency

Researchers interested in allocative efficiency will gravitate towards a CEA using a decision model. Decision models, such as decision trees, Markov models and discrete event simulations, can be built to inform decision makers about the likely best course of action to maximize allocative efficiency. A simple decision tree is shown in Fig. 3 with a hypothetical illness that has two treatment options: medical management or surgery. The decision tree is made up of decision nodes (blue squares), chance nodes (orange circles), and terminal nodes (green triangles). For a CEA, the decision tree should be populated with information on chance node probabilities, treatment costs, and outcomes at the terminal nodes. With this information, the optimal decision can be determined. The gold standard CEA, sometimes referred to as cost-utility analysis, is a decision model that includes lifetime costs and benefits using a societal perspective [5, 6, 15]. These models account for discounting and inflation, based on the time horizon [21]. Societal costs reflect all the resources used by all parties affected by the intervention, not just the healthcare system or a specific payer; this includes treatment costs, patient and caregiver costs, and productivity losses [15]. Some countries use CEA to ensure that resources are efficiently allocated to services that maximize social welfare. For example, CEAs are a key element of policymaking in the United Kingdom through the National Institute for Health and Care Excellence (NICE).
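The rollback logic of a simple two-arm decision tree like the one in Fig. 3 can be sketched in a few lines: compute the probability-weighted cost and outcome at each chance node, then compare the arms. All probabilities, costs, and QALY values below are illustrative assumptions, not figures from the paper:

```python
def expected(branches):
    """Expected (cost, QALYs) at a chance node: probability-weighted sums
    over branches, each given as (probability, cost, QALYs)."""
    cost = sum(p * c for p, c, q in branches)
    qaly = sum(p * q for p, c, q in branches)
    return cost, qaly

# Hypothetical arm 1: medical management
medical = expected([(0.7, 5_000, 9.0),    # responds to medication
                    (0.3, 20_000, 6.0)])  # fails, requires escalation

# Hypothetical arm 2: surgery
surgery = expected([(0.9, 30_000, 9.5),   # successful operation
                    (0.1, 60_000, 5.0)])  # complication

# Incremental cost per QALY of surgery vs. medical management
delta_cost = surgery[0] - medical[0]
delta_qaly = surgery[1] - medical[1]
icer = delta_cost / delta_qaly
```

A full model would add discounting and probabilistic sensitivity analysis, but the core computation at each node is exactly this weighted average.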

Fig. 3.

Fig. 3

A hypothetical decision tree

Budget impact analysis for productive efficiency

Implementation scientists interested in productive efficiency will be less interested in a CEA and more interested in other approaches, where the results can guide short-term decisions. While BIA has grown in popularity, helping an organization optimize its productive efficiency may lead to decisions that are incongruous with allocative efficiency [29]. Ideally a BIA would be accompanied by a CEA to highlight when productive efficiency differs from allocative efficiency. If this is not feasible to do formally (e.g., because of a lack of resources), then investigators should discuss the time horizon used and how the results might change if a different time horizon were used.

Measuring BIA and productive inefficiency

While a BIA can be estimated using a decision model, examining productive inefficiencies can be accomplished through a range of other methods. Among health economists, it is common to use administrative data with causal inference methods to estimate the effect of a program implementation on economic endpoints. For example, Gujral and colleagues [30] used a difference-in-differences analysis to evaluate a program that distributed tablet computers to patients with mental health needs and limited access to care. Another example is Daniels et al.’s [31] evaluation of a multidisciplinary pain clinic that was implemented in two hospitals. Both examples employ statistical analyses of administrative health care cost and utilization data to estimate budgetary impact.
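The difference-in-differences logic behind evaluations like Gujral and colleagues’ can be sketched with synthetic data: the estimate is the pre-to-post change in the treated group minus the same change in a comparison group. The groups and cost figures below are invented for illustration:

```python
def did(treated_pre, treated_post, control_pre, control_post):
    """Difference-in-differences estimate from four samples of costs:
    (post - pre) change for treated minus (post - pre) change for controls."""
    mean = lambda xs: sum(xs) / len(xs)
    treated_change = mean(treated_post) - mean(treated_pre)
    control_change = mean(control_post) - mean(control_pre)
    return treated_change - control_change

# Synthetic per-patient quarterly costs (USD), before and after a program
effect = did(treated_pre=[1000, 1200, 1100],
             treated_post=[900, 950, 850],
             control_pre=[1000, 1050, 950],
             control_post=[1020, 1080, 960])
# A negative estimate suggests the program reduced costs relative to controls
```

Real analyses would use a regression with covariates, clustered standard errors, and checks of the parallel-trends assumption; the arithmetic above is only the core identification idea.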

Economists have also pioneered other approaches to measuring an organization’s productive efficiency using output maximization or cost minimization models (see Table 1). These models should include implementation efforts and costs, as noted by the i in Table 1. The output maximization model examines the quantity of outputs that an organization produces relative to the inputs it uses. Estimating the output maximization model frequently requires the use of data envelopment analysis (DEA), a mathematical programming technique that computes relative efficiency scores [32]. Prior and Solà [33] used DEA to examine the efficiency of hospitals, whereas Tran et al. [34] used DEA to measure primary care clinic efficiency. The cost minimization model encapsulates all the organization’s inputs in a single metric: the organization’s operating costs. Analytically, one estimates a regression model known as a “stochastic frontier analysis” in which total operating cost is the dependent variable and the independent variables are the input prices (i.e., the local cost of labor and capital) and the organization’s outputs. A range of regression models have been used to estimate the stochastic frontier, from a simple ratio of costs per service to the more flexible translog model [35]. Readers may refer to examples from Romley et al. [36], Gu et al. [37], and Hall [38].

Table 1.

Two common approaches for measuring productive efficiency

Output maximization: Output = f(w, y, r, i, u), where
  • Output is the quantity of outputs produced
  • w is a vector of input labor quantities (e.g., full-time equivalent employees)
  • y is a vector of input capital quantities (e.g., number of bed days, equipment, and supplies)
  • r is a vector of regulatory factors, such as hospital accreditation or internal staff policies
  • i is a vector of implementation activities undertaken by an organization for implementation and practice change.a
  • u is a vector of patient differences (e.g., age, gender, clinical risk, social risk)

Cost minimization: Cost = f(p, o, r, i, u), where
  • Cost is the organization’s total operating cost
  • p is a vector of input prices (i.e., cost of labor and capital) at the local-market level; most models assume these prices are set by market forces
  • o is a vector of outputs, such as the number of outpatient visits and inpatient stays produced
  • r is a vector of regulatory factors
  • i is a vector of implementation activities undertaken by an organization for implementation and practice change.a
  • u is a vector of patient differences (e.g., age, gender, clinical risk, social risk)

aIn markets that are perfectly competitive, the market provides signals to direct quality improvement and implementation activities can be assumed to be zero. However, in health care and education the markets do not provide these signals and implementation costs must be included
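As a rough sketch of the output maximization approach, the toy example below computes input-oriented DEA efficiency scores with a linear program, for three hypothetical clinics with one input (staff FTEs) and one output (visits). Real DEA analyses involve many inputs, outputs, and case-mix adjusters; all data here are assumptions:

```python
import numpy as np
from scipy.optimize import linprog

def dea_efficiency(X, Y, k):
    """Input-oriented CCR DEA score for unit k.
    X: (n_units, n_inputs) input quantities; Y: (n_units, n_outputs) outputs.
    Minimize theta s.t. a weighted combination of peers uses <= theta * unit k's
    inputs while producing >= unit k's outputs."""
    n, m = X.shape
    _, s = Y.shape
    c = np.zeros(n + 1)
    c[0] = 1.0                                   # minimize theta
    A_ub, b_ub = [], []
    for i in range(m):                           # inputs: sum(l_j x_ji) <= theta x_ki
        A_ub.append(np.concatenate(([-X[k, i]], X[:, i])))
        b_ub.append(0.0)
    for r in range(s):                           # outputs: sum(l_j y_jr) >= y_kr
        A_ub.append(np.concatenate(([0.0], -Y[:, r])))
        b_ub.append(-Y[k, r])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (n + 1))
    return res.fun  # 1.0 = on the frontier; < 1.0 = inefficient

# Hypothetical clinics: FTEs (input) producing outpatient visits (output)
X = np.array([[10.0], [20.0], [30.0]])
Y = np.array([[100.0], [150.0], [300.0]])
scores = [dea_efficiency(X, Y, k) for k in range(len(X))]
# Clinics 0 and 2 produce 10 visits/FTE (the frontier); clinic 1 only 7.5
```

With one input and one output, the score is simply each clinic’s visits-per-FTE ratio relative to the best performer; the linear program generalizes the same idea to many inputs and outputs.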

Other economic evaluations

Readers may also encounter return on investment (ROI) analysis and cost–benefit analysis (CBA). An ROI analysis is simply a ratio of benefits relative to investments. A CBA computes the difference between monetized benefits and costs (the net benefit). In both ROI and CBA, the benefits, including mortality effects, are expressed in monetary terms. Depending on the perspective (what benefits are measured) and the time horizon, ROIs and CBAs can end up looking much like a CEA or a BIA. The perspective and time horizon will determine whether the models are addressing questions that relate to allocative efficiency or productive efficiency.
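The two summary measures reduce to simple arithmetic once benefits are monetized; the program figures below are hypothetical:

```python
def roi(benefits, investment):
    """Return on investment: monetized benefits per dollar invested."""
    return benefits / investment

def net_benefit(benefits, costs):
    """Cost-benefit analysis summary: monetized benefits minus costs."""
    return benefits - costs

# Hypothetical program: $50,000 invested yields $120,000 in monetized benefits
program_roi = roi(120_000, 50_000)          # dollars returned per dollar invested
program_nb = net_benefit(120_000, 50_000)   # net benefit in dollars
```

Note that ROI is sometimes reported net of the investment (benefits minus investment, divided by investment); whichever convention is used should be stated, since the two differ by exactly 1.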

Causes of inefficiency

A BIA may identify inefficiencies, but further mechanistic information will be needed if organizational leaders want implementation scientists to help them become more efficient. Some inefficiencies may be easily identified and remedied with little or no external effort. This includes, for example, staffing allocations where some staff are waiting for patients. Conversely, other inefficiencies, such as bottlenecks in patient transfers, may be more difficult to identify, necessitating a comprehensive investigation. If organizational leaders are seeking ways to reduce productive inefficiency, then understanding the underlying mechanisms is critical. Below, we discuss common causes of inefficiencies. These causes might stem from organizational structures, employee decision making, or financing and regulatory factors external to the organization.

Capital investment and fixed costs

Total costs of running a health care organization include a combination of variable costs, such as labor and supplies, and capital investments, such as equipment and space. These capital investments are often referred to as a fixed cost because once a health care organization purchases a building or a piece of equipment, such as an MRI machine, the cost is fixed for its lifespan (e.g., 8–12 years for an MRI machine). If the organization is not using the MRI machine at full capacity, it can sell it and then invest the money in a more productive capacity. Major investments are often made based on projections, but inefficiencies arise when actual service use and costs differ from projected use and costs. To reduce inefficiencies due to excessive fixed costs, Adang and Wensing [29] proposed a fixed cost checklist combined with education so that organizational leaders could make informed decisions for the short and long term. Inefficiencies associated with fixed costs typically do not resolve on their own because it takes effort, and sometimes money, to convert the fixed asset into a variable cost. In the MRI example, the health care organization might seek to reduce costs by selling the MRI machine’s excess capacity to other health care organizations, or start imaging patients who would not necessarily benefit from the MRI (i.e., provider-induced demand [39, 40]). Inefficiencies from fixed costs can be resolved over time. Evaluations that take a long-term time horizon can estimate costs assuming that these inefficiencies are resolved. However, in evaluations with a short-term time horizon, which is more common in implementation science, two questions arise: what is the best strategy in the short term, given the variable costs, and what is the best course of action in the long term, when the fixed costs become variable?
Wagner [18] gives an example of inefficiencies in ICUs, noting that in the short term, the best strategy is to minimize variable costs, even if ICU beds remain unused. The best strategy in the long term is then to convert the unused beds into more productive capacity.
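The fixed-cost problem can be made concrete with a small utilization example: an amortized capital cost is spread over however many services the asset produces, so low utilization inflates the average cost per service. All figures below are assumptions for illustration:

```python
# Hypothetical MRI machine: amortized purchase and maintenance cost per year,
# plus a variable cost (staff time, supplies) per scan.
ANNUAL_FIXED = 300_000
VARIABLE_PER_SCAN = 150

def avg_cost_per_scan(scans_per_year):
    """Average cost per scan: fixed cost spread over volume, plus variable cost."""
    return ANNUAL_FIXED / scans_per_year + VARIABLE_PER_SCAN

low_use = avg_cost_per_scan(1_000)   # underused machine
high_use = avg_cost_per_scan(4_000)  # near-capacity machine
```

With these assumed figures, the underused machine costs $450 per scan versus $225 near capacity, which is why unused fixed capacity creates pressure to sell excess capacity or, less desirably, to induce demand.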

Scale and scope

Inefficiencies may be linked to organizational size. Small health care organizations often encounter inefficiencies because they lack the economies of scale achieved by larger organizations. Efficiency gains from economies of scale happen when expanding the scale of production reduces the average unit cost. This may occur, for example, when two small clinics merge, thus combining their administrative and human resources costs, and the combined clinic is less expensive to operate than the combined cost of the two individual clinics. Alternatively, small institutions may improve efficiency by purchasing certain services externally rather than providing them internally. Another commonly described efficiency is economies of scope, which arise when producing multiple services together costs less than producing them separately, for example when a health care organization reorganizes and its total cost of operation falls [41]. In health care, creating multispecialty group practices has been linked with breaking down “silos” and creating efficiencies [42].

Rules and regulations

Organizations must adhere to external regulations, but they can also create their own internal rules. Separating these two related concepts is important. In terms of productive efficiency, external regulations increase the organization’s cost of production relative to no regulations (see for example, Keeler and Sing [43]). In terms of allocative efficiency, regulations may enhance social welfare even if they decrease an organization’s productive efficiency. For example, imagine that two health care organizations sought to increase their productive efficiency by merging and gaining economies of scale. However, provider consolidation has been linked to higher prices because it reduces the competition across providers [44–46]. Therefore, regulations preventing the merger may improve allocative efficiency while hindering productive efficiency [47, 48]. Assessing those tradeoffs is often the focus of government regulators, like the Federal Trade Commission.

Researchers often assume regulations are fixed. Erhun and colleagues [49] compared the costs of heart surgery in the US with those in India. They noted that major savings could be possible if U.S. health care providers reconsidered or de-implemented many of the regulations that make care in the US so expensive relative to India. Prime examples would be regulations that no longer improve allocative efficiency (add societal value). One example is the change in the U.S. Food and Drug Administration’s regulations affecting hearing aids, in which the agency determined that consumers can buy hearing aids over the counter without needing an audiology exam. Researchers should consider the possible effects of changing or de-implementing policies and regulations that do not improve allocative efficiency. While public policies may be unlikely to change without considerable effort through voting, lobbying, and other avenues of political persuasion, changing regulations may be an appropriate implementation target. Organizational policies and rules can also cause inefficiencies, but they may be more amenable to change. Recent research examined inefficiencies from outdated pre-operative testing requirements [50]. Educating organizational leaders about the deleterious effects of low-value requirements may be effective at removing them.

Labor

Inefficiencies sometimes result from an organization’s suboptimal mix of labor inputs (e.g., doctor, nurse, nursing assistant). For example, organizations might experience inefficiencies when choosing among different types of labor (e.g., advance practice nurses and doctors) [51], or in how they deploy staff. Process maps can help identify staffing bottlenecks and inefficiencies. For example, Dilts and colleagues [52] created process maps to understand the efficiency of initiating clinical trials at their research facility. They mapped the actors, actions and time involved in the process, mirroring the Action, Actor, Context, Target, Time (AACTT) model described by Presseau and colleagues [53]. They used these data in collaboration with leadership to reduce delays in the process. The process maps highlighted which actions were creating bottlenecks and delays in the process. Fernandez and colleagues [54] describe a six-step approach for using process maps to identify targets for improvement and potential strategies that might effect change. Labor inefficiencies may also stem from patient no-shows, which reduce productivity; no-shows have been mitigated with patient reminders, overbooking, or no-show fees [55, 56].

Employee / practitioner decision making

Employee decisions can significantly impact efficiency. Researchers have analyzed decisions over employees’ shifts, noting how the decisions change when faced with a constraint. Freedman and colleagues [57] found that under unexpected time pressures, a provider will take “shortcuts” such as reducing the number of diagnoses recorded during a visit, increasing their use of follow-up care (i.e., pushing care onto future visits), and providing less preventive care. Similarly, Neprash and Barnett [58] found that providers were more likely to prescribe opiates later in their shifts, and Chan [59] found that emergency room physicians spent less time with patients near the end of their shift, resulting in higher total costs because these patients used more hospital resources. The provider’s decisions may be rational to them, but they create inefficiencies for the organization. Eliminating inefficiencies stemming from suboptimal decision-making may involve developing and testing strategies that have emerged from the field of behavioral economics, such as nudges and clinical reminders [60, 61]. Behavioral economics is a newer area of research, and there is much to be learned about how to best adapt nudges for specific contexts and avoid unintended problems like information overload or employee burnout.

Payment and financing systems

Different payment and reimbursement mechanisms will alter the incentives for improving efficiency. In the U.S., prospective payment was introduced as a way to encourage efficiency by incentivizing providers to minimize the cost of the care they produced, but hospitals responded by providing more intensive services of questionable value that received higher reimbursement [62]. Capitated payment, a payment mechanism in which health care systems are paid a fixed amount per member per month, encourages efficiencies, at the risk of limiting appropriate services [63]. Implementation scientists should consider financing when suggesting strategies for improving efficiency because organizational leaders may not be receptive, especially if the strategy seeks to reduce care that is profitable to the organization. Ultimately, if the payment system does not incentivize allocative efficiency, policy change may be necessary [64].

Efficient implementation and sustainability

Health care markets do not provide health care organizations with signals on where they need to improve. Filling that gap is the vision behind learning health care systems [2, 65], wherein implementation science fulfills a critical role of helping organizations improve their productive and allocative efficiency. A likely challenge for implementation science is developing methods for assessing efficiency more continuously. A recent study [66] provided insights on what this might look like in the future.

Pluta and colleagues [66] describe an implementation trial that tested the efficient deployment of strategies for smoking cessation, with impact measured as the number of smokers reached and the number of smokers who quit. In brief, the U.S. National Cancer Institute (NCI) funded 52 NCI-designated cancer centers through the Cancer Center Cessation Initiative (C3I) to increase smoking cessation in oncology settings. The centers were allowed to invest the funding in different smoking cessation programs. They could invest in programs with greater reach, though these programs were typically less effective. Alternatively, they could invest in more intensive programs, such as those employing in-person counseling, which are typically more effective but reach fewer people. Pluta and colleagues measured efficiency with an output maximization model using DEA. The results showed wide variability across sites in reach and effectiveness, as well as variation in the resources each site used. They highlight one site as most efficient in using its resources to reach the greatest number of people while also generating the highest number of patients who successfully quit smoking.
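To make the output maximization approach concrete, an output-oriented DEA model can be solved as a small linear program per site. The sketch below is a minimal illustration, not the implementation used by Pluta and colleagues: the four sites, the inputs (counselor FTE, budget), the outputs (reach, quits), and the constant-returns-to-scale assumption are all hypothetical choices for this example.

```python
import numpy as np
from scipy.optimize import linprog

def dea_output_efficiency(X, Y):
    """Output-oriented, constant-returns-to-scale DEA, one LP per site.

    X: (n_sites, n_inputs) resources consumed; Y: (n_sites, n_outputs) produced.
    Returns phi per site: phi == 1 lies on the frontier; phi > 1 means all of
    the site's outputs could be scaled up by phi using no additional inputs.
    """
    X, Y = np.asarray(X, float), np.asarray(Y, float)
    n, m = X.shape
    s = Y.shape[1]
    scores = []
    for o in range(n):
        # Decision variables: [phi, lambda_1..lambda_n]; maximize phi.
        c = np.zeros(n + 1)
        c[0] = -1.0
        A_ub, b_ub = [], []
        for i in range(m):   # inputs: sum_j lambda_j * x_ij <= x_io
            A_ub.append(np.r_[0.0, X[:, i]])
            b_ub.append(X[o, i])
        for r in range(s):   # outputs: phi * y_ro <= sum_j lambda_j * y_rj
            A_ub.append(np.r_[Y[o, r], -Y[:, r]])
            b_ub.append(0.0)
        res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                      bounds=[(None, None)] + [(0, None)] * n)
        scores.append(res.x[0])
    return np.array(scores)

# Hypothetical data: 4 sites, inputs = (counselor FTE, budget in $1000s),
# outputs = (smokers reached, smokers who quit).
X = [[2, 50], [4, 80], [3, 60], [6, 120]]
Y = [[200, 30], [500, 60], [250, 35], [450, 50]]
print(np.round(dea_output_efficiency(X, Y), 2))
```

In this toy data, the second site anchors the frontier (it produces the most reach and quits per unit of input), while the last site could, in principle, expand its outputs substantially with the resources it already uses.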

Efficiency analysis, whether based on DEA or stochastic frontier analysis, enables each site to learn from its own performance and from its peers. Sites can reflect on the degree to which their selected strategies are operating efficiently, as indicated by being on the frontier in Fig. 2, or inefficiently. Inefficient sites could seek ways to reduce their costs and become more efficient. In this ongoing quest for efficiency, each site will need to decide when to stop pursuing one strategy and pivot to another; if the site’s smoking cessation program is operating on the flat of the curve, for example, the program could stop and pivot. There are no easy litmus tests to guide these decisions, but efficiency is essential to sustaining evidence-based practices.

Measuring efficiency using a DEA or stochastic frontier model has advantages over alternative approaches that necessitate creating and updating a CEA model [67]. One option is to assess outputs and inputs at regular intervals, allowing the site to learn by comparing this month to prior months and by comparing themselves to their peers. Imagine the study conducted by Pluta et al., where the implementation costs and implementation outcomes were collected monthly. Sites could use this information to make improvements and guide their implementation strategy. If reach declined, the team could consider investing in alternatives strategies or focus on a different outcome, like effectiveness. The main limitation, however, with these efficiency models is causal identification. [68, 69] “Efficient” sites may appear inefficient if the analytical model is incorrectly specified (e.g., an omitted variable). This may be mitigated by following organizations over time with machine learning techniques that can model efficient production more accurately [70].
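One simple way to see how frontier-based monitoring could work with monthly input and output data is corrected OLS (COLS), a deterministic cousin of stochastic frontier analysis: fit an average production function by least squares, then shift its intercept so it envelops the data from above. The example below uses simulated site-month data; the Cobb-Douglas coefficients and the one-sided inefficiency term are illustrative assumptions, not estimates from any study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated site-month panel: log input (staff hours) and log output (reach).
n = 60
log_x = rng.uniform(3, 6, n)
log_y = 1.0 + 0.8 * log_x - rng.exponential(0.15, n)  # one-sided shortfall

# Corrected OLS: fit the average production function, then shift the
# intercept up by the largest residual so the line bounds all observations.
D = np.column_stack([np.ones(n), log_x])
beta, *_ = np.linalg.lstsq(D, log_y, rcond=None)
resid = log_y - D @ beta
frontier_intercept = beta[0] + resid.max()

# Technical efficiency per observation lies in (0, 1]; 1 is the benchmark.
te = np.exp(resid - resid.max())
print(f"median technical efficiency: {np.median(te):.2f}")
```

A site tracking these scores month over month could watch for drift away from its own frontier, though, as noted above, misspecification (an omitted input, for instance) would masquerade as inefficiency here too.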

Conclusion

Health care leaders often rely on economic evaluations for insights. An organizational leader may say they want a CEA, but if what they really want to know is whether an implementation strategy will save money in the next year, the CEA’s focus on allocative efficiency is unlikely to answer their question. A budget impact analysis, focused on productive efficiency, is more likely to do so. Because CEA and BIA address different aspects of efficiency, the tools are not interchangeable. Implementation scientists should therefore seek clarity when designing economic evaluations to ensure their methods address the underlying question.

As described earlier, economic evaluations can be conducted using decision models and observational data analyses. Both methods have strengths and weaknesses that make them complementary. Decision models require information on chance probabilities and costs, which may come from the literature or national databases and may not be relevant to the organization facing the decision. Similarly, showing with a regression model that investments in telehealth reduce ED visits may be relevant to the organizations in the sample, but the results may not generalize to organizations outside it. Both methods require specialized skills, and differentiating the two may help implementation scientists who are seeking collaborators, a well-known challenge [71, 72]. Implementation scientists wanting to conduct a CEA may have more success searching for experts in decision sciences and operations research. Conversely, if they are interested in measuring budget impact using administrative data, they may find more interest among health economists.

Economic evaluations should include the costs of implementation, which often need to be estimated using micro-costing methods. When estimating implementation costs, researchers need to decide how precisely to estimate costs and the unit of analysis. More precision incurs substantially higher measurement burden [73, 74]. In addition, in economic evaluations where the patient is the unit of analysis, researchers will need to identify the number of patients who were exposed to the implementation strategy. This may require further data collection efforts. Productive efficiency models, on the other hand, often focus on the site or clinic as the unit of analysis, and this reduces the micro-costing burden.

In health care, like other markets that do not exhibit perfect competition, efficiency gains may raise questions about equity. As an example, a narrow focus on maximizing the number of patients screened for colorectal cancer (or minimizing the cost of reaching a particular number of patients eligible for colorectal cancer screening) may yield strategies that exacerbate health inequities by focusing on the easiest-to-reach patients, leaving patients from marginalized populations at a disadvantage. Thus, the issue of equity and fairness cannot be easily avoided when seeking to improve efficiency [75]. This phenomenon can be countered by explicitly identifying equity as a goal. For example, distributional CEA explicitly includes equity [76], and examining treatment effect heterogeneity can inform equity questions in observational analysis [77].

This paper is not meant to be an exhaustive review of efficiency theory and methods. Readers interested in deeper discussions on theory and measurement will likely find Sickles and Zelenyuk’s [17] and Greene’s [78] books of interest. There are also important limitations when conducting efficiency analysis, especially with small samples and causal identification, so efficiency analysis should not be viewed as a universal solution.

In summary, implementation science has developed into a robust interdisciplinary field. While there are challenges to implementation scientists collaborating with health economists and decision scientists, a common understanding of efficiency concepts and methods can help bridge existing gaps. Health economists and decision scientists have several powerful tools that may be useful, but these tools are not interchangeable, in part because they view efficiency problems through different lenses. We believe a greater understanding of efficiency will help focus research designs and help support ongoing collaborations.

Acknowledgements

We would like to thank Louise Russell, Austin Frakt, Enola Proctor, Mark Bounthavong, Steve Asch, Mark McGovern, Cathy Battaglia, Russ Glasgow, Andrea Eisman, Matt Maciejewski, and Tiffany Radcliff for comments on earlier versions. This paper was presented at the Queensland University of Technology and the Northwestern University Prevention Science & Methodology Group (PSMG), co-hosted with the Center for Dissemination and Implementation Science at Stanford (C-DIAS).

Authors’ contributions

All authors (THW, TH, RS, and DL) were involved in writing, editing, and revising this paper.

Funding

Todd Wagner was funded in part by a VA Research Career Scientist Award (RCS-17-154) and Center for Dissemination and Implementation at Stanford (P50DA054072). Ramzi Salloum was supported in part by the University of Florida Clinical and Translational Science Institute (UL1TR001427). Douglas Levy was supported in part by the Implementation Science Center for Cancer Control Equity at the Harvard T.H. Chan School of Public Health (P50CA244433).

Data availability

Not applicable.

Declarations

Ethics approval and consent to participate

Not applicable.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Footnotes

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

References

  • 1. Hailu R, Sousa J, Tang M, Mehrotra A, Uscher-Pines L. Challenges and facilitators in implementing remote patient monitoring programs in primary care. J Gen Intern Med. 2024;39(13):2471–7.
  • 2. Smith M, Saunders R, Stuckhardt L, McGinnis JM. Best Care at Lower Cost: The Path to Continuously Learning Health Care in America. Washington, D.C.: The National Academies Press; 2012.
  • 3. Burgess JF Jr. Innovation and efficiency in health care: does anyone really know what they mean? Health Syst. 2012;1(1):7–12.
  • 4. Weinstein MC, Stason WB. Foundations of cost-effectiveness analysis for health and medical practices. N Engl J Med. 1977;296(13):716–21.
  • 5. Gold MR, Siegel JE, Russell LB, Weinstein MC. Cost-Effectiveness in Health and Medicine. Oxford: Oxford University Press; 1996.
  • 6. Drummond MF, Sculpher MJ, Claxton K, Stoddart GL, Torrance GW. Methods for the Economic Evaluation of Health Care Programmes. 4th ed. Oxford: Oxford University Press; 2015.
  • 7. Neumann PJ, Kamal-Bahl S. Should value frameworks take a “Societal Perspective”? Health Aff Forefr. Available from: https://www.healthaffairs.org/do/10.1377/forefront.20170906.061833/full/.
  • 8. Garber AM, Phelps CE. Economic foundations of cost-effectiveness analysis. J Health Econ. 1997;16(1):1–31.
  • 9. Chandra A, Jena AB, Skinner JS. The pragmatist’s guide to comparative effectiveness research. J Econ Perspect. 2011;25(2):27–46.
  • 10. Lakdawalla DN, Phelps CE. Health technology assessment with diminishing returns to health: the Generalized Risk-Adjusted Cost-Effectiveness (GRACE) approach. Value Health. 2021;24(2):244–9.
  • 11. Gold MR, Stevenson D, Fryback DG. HALYS and QALYS and DALYS, oh my: similarities and differences in summary measures of population health. Annu Rev Public Health. 2002;23:115–34.
  • 12. Fuchs VR. More variation in use of care, more flat-of-the-curve medicine. Health Aff (Millwood). 2004;Suppl Variation:VAR104–7. 10.1377/hlthaff.var.104.
  • 13. Grady D, Redberg RF. Less is more: how less health care can result in better health. Arch Intern Med. 2010;170(9):749–50.
  • 14. Neumann PJ, Kim DD. Cost-effectiveness thresholds used by study authors, 1990–2021. JAMA. 2023;329(15):1312–4.
  • 15. Neumann PJ, Sanders GD, Russell LB, Siegel JE, Ganiats TG. Cost-Effectiveness in Health and Medicine. 2nd ed. Oxford: Oxford University Press; 2016.
  • 16. Newhouse JP. Toward a theory of nonprofit institutions: an economic model of a hospital. Am Econ Rev. 1970;60:64–74.
  • 17. Sickles RC, Zelenyuk V. Measurement of Productivity and Efficiency. Cambridge: Cambridge University Press; 2019.
  • 18. Wagner TH. Rethinking how we measure costs in implementation research. J Gen Intern Med. 2020;35(2):870–4.
  • 19. Wagner TH, Dopp AR, Gold HT. Estimating downstream budget impacts in implementation research. Med Decis Making. 2020;40(8):968–77.
  • 20. Mauskopf JA, Sullivan SD, Annemans L. Principles of good practice for budget impact analysis: report of the ISPOR task force on good research practices – budget impact analysis. Value Health. 2007;10:336–47.
  • 21. Sullivan SD, Mauskopf JA, Augustovski F, et al. Budget impact analysis-principles of good practice: report of the ISPOR 2012 budget impact analysis good practice II task force. Value Health. 2014;17(1):5–14.
  • 22. Smith NR, Levy DE. Budget impact analysis for implementation decision making, planning, and financing. Transl Behav Med. 2024;14(1):54–9.
  • 23. Stiglitz JE. Pareto optimality and competition. J Finance. 1981;36(2):235–51.
  • 24. Bator FM. The anatomy of market failure. Q J Econ. 1958;72(3):351–79.
  • 25. Arrow KJ. Uncertainty and the welfare economics of medical care. Am Econ Rev. 1963;53(5):941–73.
  • 26. Devers KJ, Brewster LR, Casalino LP. Changes in hospital competitive strategy: a new medical arms race? Health Serv Res. 2003;38(1p2):447–69.
  • 27. Chen LM, Render M, Sales A, Kennedy EH, Wiitala W, Hofer TP. Intensive care unit admitting patterns in the Veterans Affairs health care system. Arch Intern Med. 2012;172(16):1220–6.
  • 28. Chang DW, Dacosta D, Shapiro MF. Priority levels in medical intensive care at an academic public hospital. JAMA Intern Med. 2017;177(2):280–1.
  • 29. Adang EM, Wensing M. Economic barriers to implementation of innovations in health care: is the long run-short run efficiency discrepancy a paradox? Health Policy. 2008;88(2–3):236–42.
  • 30. Gujral K, Van Campen J, Jacobs J, Kimerling R, Blonigen D, Zulman DM. Mental health service use, suicide behavior, and emergency department visits among rural US veterans who received video-enabled tablets during the COVID-19 pandemic. JAMA Netw Open. 2022;5(4):e226250.
  • 31. Daniels SI, Cave S, Wagner TH, et al. Implementation, intervention, and downstream costs for implementation of a multidisciplinary complex pain clinic in the Veterans Health Administration. Health Serv Res. 2024;1475–6773.14345.
  • 32. Martín-Gamboa M, Iribarren D. Chapter 16 - Coupled life cycle thinking and data envelopment analysis for quantitative sustainability improvement. In: Ren J, editor. Methods in Sustainability Science. Elsevier; 2021. p. 295–320. Available from: https://www.sciencedirect.com/science/article/pii/B9780128239872000039. Cited 2022 Aug 30.
  • 33. Prior D, Solà M. Technical efficiency and economies of diversification in health care. Health Care Manag Sci. 2000;3(4):299–307.
  • 34. Tran LD, Wagner TH, Shekelle P, Nelson KM, Fihn SD, Newberry S, Ghai I, Curtis I, Rubenstein LV. Assessing and improving productivity in primary care: proof of concept results for a novel value-based metric. J Gen Intern Med. 2024;39(12):2317–23. 10.1007/s11606-024-08710-0.
  • 35. Hancock D. Bank profitability, interest rates, and monetary policy. J Money Credit Bank. 1985;17(2):189–202.
  • 36. Romley JA, Goldman D, Sood N, Dunn A. Productivity growth in treating a major chronic health condition. Bureau of Economic Analysis; 2020. Available from: https://healthpolicy.usc.edu/wp-content/uploads/2021/08/WP61-Romley-et-al_5.14.2020.pdf. Cited 2025 Feb 7.
  • 37. Gu J, Sood N, Dunn A, Romley J. Productivity growth of skilled nursing facilities in the treatment of post-acute-care-intensive conditions. PLoS One. 2019;14(4):e0215876.
  • 38. Hall AE. Estimating regression-based medical care expenditure indexes for medicare advantage enrollees. Forum Health Econ Policy. 2016;19(2):261–97.
  • 39. Labelle R, Stoddart G, Rice T. A re-examination of the meaning and importance of supplier-induced demand. J Health Econ. 1994;13(3):347–68.
  • 40. Dranove D, Wehner P. Physician-induced demand for childbirths. J Health Econ. 1994;13(1):61–73.
  • 41. Panzar JC, Willig RD. Economies of scope. Am Econ Rev. 1981;71(2):268–72.
  • 42. Hillson SD, Feldman R, Wingert TD. Economies of scope and payment for physician services. Med Care. 1992;822–31.
  • 43. Keeler T, Sing JS. Hospital costs and excess bed capacity: a statistical analysis. Rev Econ Stat. 1996;78(3):470–81.
  • 44. Austin DR, Baker LC. Less physician practice competition is associated with higher prices paid for common procedures. Health Aff Millwood. 2015;34(10):1753–60.
  • 45. Baker LC, Bundorf MK, Royalty AB, Levin Z. Physician practice competition and prices paid by private insurers for office visits. JAMA. 2014;312(16):1653–62.
  • 46. Rochlin DH, Rizk NM, Matros E, Wagner TH, Sheckter CC. Commercial price variation for breast reconstruction in the era of price transparency. JAMA Surg. 2023;158(2):152–60. 10.1001/jamasurg.2022.6402.
  • 47. Gaynor M, Mostashari F, Ginsburg PB. Making health care markets work: competition policy for health care. JAMA. 2017;317(13):1313–4.
  • 48. Propper C. Competition in health care: lessons from the English experience. Health Econ Policy Law. 2018;13(3–4):492–508.
  • 49. Erhun F, Kaplan RS, Narayanan VG, et al. Are cost advantages from a modern Indian hospital transferable to the United States? Am Heart J. 2020;224:148–55.
  • 50. Mudumbai SC, Pershing S, Bowe T, et al. Variability and costs of low-value preoperative testing for cataract surgery within the Veterans Health Administration. JAMA Netw Open. 2021;4(5):e217470.
  • 51. Chan DC, Chen Y. The productivity of professions: evidence from the emergency department. National Bureau of Economic Research; 2022. Available from: https://www.nber.org/papers/w30608.
  • 52. Dilts DM, Sandler AB. Invisible barriers to clinical trials: the impact of structural, infrastructural, and procedural barriers to opening oncology clinical trials. J Clin Oncol. 2006;24(28):4545–52.
  • 53. Presseau J, McCleary N, Lorencatto F, Patey AM, Grimshaw JM, Francis JJ. Action, actor, context, target, time (AACTT): a framework for specifying behaviour. Implement Sci. 2019;14(1):1–13.
  • 54. Fernandez ME, Ten Hoor GA, Van Lieshout S, et al. Implementation mapping: using intervention mapping to develop implementation strategies. Front Public Health. 2019;7:158.
  • 55. LaGanga LR, Lawrence SR. Clinic overbooking to improve patient access and increase provider productivity. Decis Sci. 2007;38(2):251–76.
  • 56. Parikh A, Gupta K, Wilson AC, Fields K, Cosgrove NM, Kostis JB. The effectiveness of outpatient appointment reminder systems in reducing no-show rates. Am J Med. 2010;123(6):542–8.
  • 57. Freedman S, Golberstein E, Huang T-Y, Satin DJ, Smith LB. Docs with their eyes on the clock? The effect of time pressures on primary care productivity. J Health Econ. 2021;77:102442.
  • 58. Neprash HT, Barnett ML. Association of primary care clinic appointment time with opioid prescribing. JAMA Netw Open. 2019;2(8):e1910373.
  • 59. Chan DC. The efficiency of slacking off: evidence from the emergency department. Econometrica. 2018;86(3):997–1030.
  • 60. McConnell KJ, Lindrooth RC, Wholey DR, Maddox TM, Bloom N. Management practices and the quality of care in cardiac units. JAMA Intern Med. 2013;173(8):684–92.
  • 61. McConnell KJ, Lindrooth RC, Wholey DR, Maddox TM, Bloom N. Modern management practices and hospital admissions. Health Econ. 2016;25(4):470–85.
  • 62. Shin E. Hospital responses to price shocks under the prospective payment system. Health Econ. 2019;28(2):245–60.
  • 63. Robinson JC. Theory and practice in the design of physician payment incentives. Milbank Q. 2001;79(2):149–77.
  • 64. Dopp AR, Kerns SEU, Panattoni L, et al. Translating economic evaluations into financing strategies for implementing evidence-based practices. Implement Sci. 2021;16(1):66.
  • 65. Kilbourne AM, Elwy AR, Sales AE, Atkins D. Accelerating research impact in a learning health care system: VA’s quality enhancement research initiative in the choice act era. Med Care. 2017;55 Suppl 7 Suppl 1:S4–12.
  • 66. Pluta K, Hohl SD, D’Angelo H, et al. Data envelopment analysis to evaluate the efficiency of tobacco treatment programs in the NCI moonshot cancer center cessation initiative. Implement Sci Commun. 2023;4(1):50.
  • 67. Curran GM, Landes SJ, McBain SA, et al. Reflections on 10 years of effectiveness-implementation hybrid studies. Front Health Serv. 2022;2. Available from: https://www.frontiersin.org/journals/health-services/articles/10.3389/frhs.2022.1053496/full.
  • 68. Newhouse JP. Frontier estimation: how useful a tool for health economics? J Health Econ. 1994;13(3):317–22.
  • 69. Skinner J. What do stochastic frontier cost functions tell us about inefficiency? J Health Econ. 1994;13(3):323–8.
  • 70. Zelenyuk V. Productivity analysis: roots, foundations, trends and perspectives. J Product Anal. 2023;60(3):229–47.
  • 71. Barnett ML, Dopp AR, Klein C, Ettner SL, Powell BJ, Saldana L. Collaborating with health economists to advance implementation science: a qualitative study. Implement Sci Commun. 2020;1(1):1–11.
  • 72. Barnett ML, Stadnick NA, Proctor EK, Dopp AR, Saldana L. Moving beyond aim three: a need for a transdisciplinary approach to build capacity for economic evaluations in implementation science. Implement Sci Commun. 2021;2(1):133.
  • 73. Wagner TH, Yoon J, Jacobs JC, et al. Estimating costs of an implementation intervention. Med Decis Making. 2020;40(8):959–67.
  • 74. Liu C-F, Rubenstein LV, Kirchner JE, et al. Organizational cost of quality improvement for depression care. Health Serv Res. 2009;44(1):225–44.
  • 75. Reinhardt UE. Reflections on the meaning of efficiency: can efficiency be separated from equity. Yale Pol Rev. 1992;10:302.
  • 76. Asaria M, Griffin S, Cookson R. Distributional cost-effectiveness analysis: a tutorial. Med Decis Making. 2016;36(1):8–19.
  • 77. Angrist JD. Treatment effect heterogeneity in theory and practice. Econ J. 2004;114(494):C52–83.
  • 78. Greene WH. Econometric Analysis. 8th ed. Upper Saddle River, NJ: Prentice-Hall; 2017.
