BMJ Simulation & Technology Enhanced Learning. 2018 Mar 23;4(2):95–96. doi: 10.1136/bmjstel-2017-000214

Cost–benefit analysis of healthcare professional education: report of a conference workshop

Kieran Walsh 1, Cindy Noben 2, Simon Gregory 3, Wee Shiong Lim 4, Chris Green 5, Trudie Roberts 6, Stephen Maloney 7, Dragan Ilic 7, George Rivers 8, Scott Reeves 9
PMCID: PMC8936711  PMID: 35515893

Introduction

The following is the report of a workshop presented at AMEE 2016 (27–31 August 2016, Barcelona, Spain). The workshop was attended by 40 leaders in undergraduate and postgraduate healthcare professional education.

The corresponding author opened the workshop with an introduction to the different potential cost analyses in healthcare professional education, with a special emphasis on cost–benefit analysis. The delegates then worked in small groups on the following themes.

Costing planned or hypothetical educational programmes

The delegates conducted mock costings of planned or hypothetical educational programmes. The following themes emerged during their discussions.

The cost of personnel was assumed to represent the biggest cost in most educational programmes.1 This holds even for programmes that use technology-enhanced learning, because the personnel costs of creating and administering such programmes far outweigh software costs in most cases. However, it can be difficult to decide which personnel should be included in the costings. Most delegates erred on the side of including all personnel connected with the educational project, even those with only a remote connection. This was felt to be probably best practice, and it was certainly considered best practice to be explicit about which personnel were included and excluded. Most education providers were thought to have financial models in place that anticipate tangible financial costs (usually with the aim of generating a small surplus). Just as important as the number of staff involved is the amount of staff time dedicated to the programme: the time spent delivering lectures or workshops and the time spent preparing for them. These staff costs can be identified from such models, but with the caveat that the costs that education providers stipulate may not represent the true cost. A worked sketch follows.
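The sketch below illustrates the kind of mock personnel costing the delegates performed. It is a minimal sketch only: the roles, hours and hourly rates are hypothetical figures invented for the example, not data from the workshop.

```python
# Minimal sketch of a personnel costing for a planned programme.
# All roles, hours and hourly rates are hypothetical illustrations.

personnel = [
    # (role, delivery hours, preparation hours, hourly rate in GBP)
    ("Lead lecturer",        40,  80, 90.0),
    ("Workshop facilitator", 60,  30, 60.0),
    ("E-learning developer",  0, 120, 55.0),  # builds the online materials
    ("Administrator",         0, 100, 30.0),  # only remotely connected, but included
]

total = 0.0
for role, delivery, preparation, rate in personnel:
    cost = (delivery + preparation) * rate
    total += cost
    print(f"{role:22s} £{cost:10,.2f}")

print(f"{'Total personnel cost':22s} £{total:10,.2f}")
```

Listing each role explicitly, including those with only a remote connection to the project, makes the inclusion and exclusion decisions visible, which is the best-practice point the delegates raised.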

There was considerable discussion about what constitutes a sunk cost. A fixed or variable cost that has already been incurred and cannot be recovered is known as a sunk (or retrospective) cost. A medical school building that was built 50 years ago is a sunk cost; so is simulation equipment that was bought only 5 years ago. Both are one-time expenses that cannot be recovered once spent. Even though the simulation equipment is a sunk cost, there might still be ongoing costs related to consumables, for example, £5000 per year to stock a small simulation suite.
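The decision rule discussed above can be sketched as follows: a forward-looking costing excludes sunk items but keeps recurring ones such as consumables. The cost items below reuse the examples from the text; the building and equipment amounts are hypothetical.

```python
# Sketch: separating sunk costs from ongoing costs in a
# forward-looking costing. Amounts are hypothetical except the
# £5,000 consumables figure from the text.

costs = [
    # (item, amount in GBP, already incurred and unrecoverable?)
    ("Medical school building (built 50 years ago)", 2_000_000, True),
    ("Simulation equipment (bought 5 years ago)",      150_000, True),
    ("Simulation suite consumables (per year)",          5_000, False),
]

# Sunk costs are excluded; only prospective costs enter the analysis.
prospective = sum(amount for item, amount, sunk in costs if not sunk)
print(f"Annual prospective cost: £{prospective:,}")  # -> £5,000
```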

There was also much discussion about set-up costs versus business-as-usual costs. Set-up costs are those involved in creating the course for the first time, such as the creation of course materials or assessments. These need to be created only once and can be reused in subsequent years, after which they become part of business-as-usual costs. Other business-as-usual costs include staff salaries that must be paid every year. Set-up costs are usually higher, which can make an educational programme appear most expensive in its first year; subsequent years may then appear to cost less than they actually do. Best practice is to consolidate initial costs with subsequent costs to give annual costs over a stated period.2 This approach spreads the costs over time and so makes them more palatable for commissioners or funders of educational programmes (see the sketch below). Some delegates felt that they did not have the expertise for this, and there was consensus that economic expertise is a prerequisite for studies of this kind.
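The consolidation recommended above amounts to spreading the one-off set-up cost over the stated period and adding the recurring business-as-usual cost. The figures below are hypothetical.

```python
# Sketch: annualising one-off set-up costs over a stated period.
# All figures are hypothetical.

setup_cost = 60_000      # course materials and assessments, created once
recurring_cost = 40_000  # annual staff salaries and other running costs
period_years = 5         # stated period over which costs are consolidated

annual_cost = setup_cost / period_years + recurring_cost
print(f"Annualised cost over {period_years} years: £{annual_cost:,.0f}")
# -> £52,000 per year, rather than £100,000 in year 1 and £40,000 thereafter
```

A fuller treatment would also discount future costs to present values, which is one reason the delegates felt economic expertise was a prerequisite.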

Defining and costing the benefits of planned or hypothetical educational programmes

Defining the benefits of the educational programmes was by no means straightforward; costing the benefits was felt to be even more challenging. The benefits discussed ranged from learner satisfaction with a course, to improved knowledge and skills, to changed behaviour and improved performance. These resonated with Kirkpatrick’s educational outcome levels.3 4 It was felt that clinical, organisational, patient and public health outcomes might also reasonably be included. However, it might be difficult to link some of these outcomes to the education.

Attempting to cost certain benefits was challenging. For example, delegates thought learner satisfaction important but struggled to place a monetary value on it. Similarly, the achievement of better communication skills or reflective learning skills was felt to be important and yet could not be linked to a pecuniary value. However, certain outcomes or benefits could be costed, including improved clinical outcomes such as fewer pressure ulcers or falls. Some of these outcomes have associated tariffs, which allows a costed outcome to be defined quickly. For example, if a fall on the ward costs an average of £2000, then each fall prevented saves £2000 (see the sketch below). Costing benefits that lack associated tariffs was expected to be cumbersome and slow.
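The tariff-based costing above reduces to a simple multiplication. In this sketch the £2000 average cost per fall is the figure used in the workshop example; the number of falls prevented is hypothetical.

```python
# Sketch: costing a benefit that has an associated tariff.
# The £2,000 per-fall tariff comes from the workshop example;
# the number of falls prevented is hypothetical.

tariff_per_fall = 2_000   # average cost of a fall on the ward (GBP)
falls_prevented = 12      # hypothetical annual effect of the programme

annual_saving = tariff_per_fall * falls_prevented
print(f"Costed benefit: £{annual_saving:,} per year")  # -> £24,000
```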

Linking the programmes and their costs to the outcomes

Cost–benefit analysis enables the comparison of the costs and benefits of interventions with different outcomes. It also enables the comparison of ‘the costs and benefits of widely differing types of interventions in completely different areas’. Most delegates at the workshop used cost–benefit analysis in its simplest form: to evaluate whether a particular programme on its own has benefits that exceed its costs (sketched below).5
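In this simplest form, the analysis asks whether the monetised benefits exceed the costs. A minimal sketch follows; the figures are hypothetical, carried over from the earlier sketches.

```python
# Sketch of cost-benefit analysis in its simplest form: does a single
# programme's monetised benefit exceed its cost? Figures are hypothetical.

annual_cost = 52_000               # e.g. the annualised cost sketched earlier
annual_benefit = 24_000 + 35_000   # e.g. falls prevented plus other costed outcomes

net_benefit = annual_benefit - annual_cost
benefit_cost_ratio = annual_benefit / annual_cost

print(f"Net benefit: £{net_benefit:,}")                 # positive -> benefits exceed costs
print(f"Benefit-cost ratio: {benefit_cost_ratio:.2f}")  # > 1 -> benefits exceed costs
```

Note that this requires every benefit to be expressed in monetary terms, which is exactly the difficulty the delegates identified with satisfaction and communication skills.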

The delegates discussed how best to use cost–benefit analysis to link the programmes and their costs to their outcomes. Most delegates felt this was a challenge, fundamentally because of the difficulty of demonstrating causation. The issues in this regard are the same as in any other attempt to prove causality and include the effects of testing, attrition, maturation and regression to the mean. These problems were felt to be relevant to any benefit analysis rather than specific to cost analyses.

Some delegates discussed more complex forms of comparison, such as integrating cost–benefit analyses with realist evaluation to theorise how similar educational mechanisms lead to different costs and benefits depending on context. Cost–benefit analysis requires that both costs and benefits be expressed in the same monetary unit, which many delegates found challenging.

Many delegates asked about other forms of cost analyses that might help them address these problems. Other forms raised included cost-effectiveness, cost–utility and cost–feasibility analyses. There was agreement that more education was needed about the various types of analyses and their relative advantages and disadvantages in different circumstances. There was also agreement that the healthcare professional education community could do more to include economists, who would have much to offer such analyses.

Concluding remarks

Even though the purpose of the workshop was not to model cost–feasibility analyses, some delegates found that the exercise enabled them to evaluate whether or not their planned programme was feasible (that is, whether it can be afforded given its likely budgetary requirements). In performing this version of a cost–feasibility analysis, delegates also acknowledged the limitations of the approach. It was agreed that its major limitation is that it does not help decide whether an approach is worthwhile in and of itself, as it does not ‘look at the effectiveness or utility of an intervention or the benefits that may be associated with it’.5 A sketch of such a feasibility check follows.
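The feasibility check described above reduces to a comparison of projected cost against the available budget; as the quotation makes clear, it deliberately says nothing about benefits. The figures below are hypothetical.

```python
# Sketch of a cost-feasibility check: can the programme be afforded at all?
# It compares projected cost with the available budget and deliberately
# ignores benefits. Figures are hypothetical.

projected_annual_cost = 52_000
available_budget = 45_000

if projected_annual_cost <= available_budget:
    print("Feasible within budget")
else:
    shortfall = projected_annual_cost - available_budget
    print(f"Not feasible: £{shortfall:,} over budget")
```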

Some delegates discussed the potential unwanted effects of cost analyses. While all agreed that the ultimate aim is to provide high-quality education at the lowest possible cost, there was concern that educators could feel forced into choosing the lowest-cost option and so sacrifice quality. All concurred that cost analyses should guide decision-making but should not necessarily make the decisions for us. Sometimes high-quality education is associated with high costs, and payers must accept this.

Conducting cost–benefit analyses will be essential if the medical education community is to achieve the ultimate goal of low-cost and high-value medical education. But this will not be achievable unless we listen to the practical problems that educators have in conducting such analyses and to the methods that they might develop to overcome these problems. The purpose of this paper is to share these challenges and solutions with the wider medical education community.

Footnotes

Contributors: KW helped to design the workshop and to acquire and interpret the data. He wrote the first draft and has approved the final version. CN made a substantial contribution to the analysis and interpretation of the data. She revised the paper critically for important intellectual content and has approved the final version. SG made a substantial contribution to the analysis and interpretation of the data. He revised the paper critically for important intellectual content and has approved the final version. WSL made a substantial contribution to the analysis and interpretation of the data. He revised the paper critically for important intellectual content and has approved the final version. CG made a substantial contribution to the analysis and interpretation of the data. He revised the paper critically for important intellectual content and has approved the final version. TR made a substantial contribution to the analysis and interpretation of the data. She revised the paper critically for important intellectual content and has approved the final version. SM helped to conceive and design the work. He made a substantial contribution to the analysis and interpretation of the data. He revised the paper critically for important intellectual content and has approved the final version. DI helped to conceive and design the work. He made a substantial contribution to the analysis and interpretation of the data. He revised the paper critically for important intellectual content and has approved the final version. GR helped to conceive and design the work. He made a substantial contribution to the analysis and interpretation of the data. He revised the paper critically for important intellectual content and has approved the final version. SR helped to conceive and design the work. He made a substantial contribution to the analysis and interpretation of the data. He revised the paper critically for important intellectual content and has approved the final version.

Competing interests: None declared.

Patient consent: Detail has been removed from this case description/these case descriptions to ensure anonymity. The editors and reviewers have seen the detailed information available and are satisfied that the information backs up the case the authors are making.

Provenance and peer review: Not commissioned; externally peer reviewed.

Correction notice: This paper has been amended since it was published Online First. Owing to a scripting error, some of the publisher names in the references were replaced with ‘BMJ Publishing Group’. This only affected the full text version, not the PDF. We have since corrected these errors and the correct publishers have been inserted into the references.

References

1. Levin HM, McEwan PJ. Cost-effectiveness analysis: methods and applications. Vol 4. Sage, 2001.
2. Walsh K. Cost and value analyses in medical education: common errors to avoid. Br J Hosp Med 2014;75:290–1. doi:10.12968/hmed.2014.75.5.290
3. Morrison J. Evaluation. BMJ 2003;326:385.
4. Institute of Medicine (IOM). Measuring the impact of interprofessional education on collaborative practice and patient outcomes. Washington, DC: The National Academies Press, 2015.
5. Walsh K, Levin H, Jaye P, et al. Cost analyses approaches in medical education: there are no simple solutions. Med Educ 2013;47:962–8. doi:10.1111/medu.12214
