Abstract
Objective To compare two models of revalidation for general practitioners.
Design Randomised comparison of two revalidation models.
Setting Primary care in Tayside, Scotland.
Participants 66 Tayside general practitioners (principals and non-principals), 53 of whom completed the revalidation folders.
Interventions Two revalidation models: a minimum criterion based model with revalidation as the primary purpose, and an educational outcome model with emphasis on combining revalidation with continuing professional development.
Main outcome measures Feasibility and acceptability of each approach and effect on doctors' continuing professional development. Whether a summative judgment could be made on the completed folders and whether either model would allow patient groups to have confidence in the revalidation process.
Results The criterion model was preferred by general practitioners. For both models doctors reported making changes to their practice and felt a positive effect on their continuing professional development. Summative assessment of the folders showed reasonable inter-rater reliability.
Conclusions The criterion model provides a practical and acceptable model for general practitioners to use when preparing for revalidation.
Introduction
In the United Kingdom, a doctor's licence to practise is secured by registration with the General Medical Council. Periodic revalidation, to start in spring 2005, will be the regular demonstration by doctors that they remain fit to practise, and the process by which a doctor's licence is maintained.1 The UK approach will be to link revalidation with continuing professional development through annual appraisal or an independent route, with both requiring doctors to provide information showing that their workplace activities meet the standards for “fitness to practise.” Doctors whose submissions are either absent or below fitness to practise standards will undergo GMC performance assessment. Their registration will then depend on their satisfying the GMC fitness to practise procedures.2
Background
However, although the GMC's role is to set the professional standards against which general practitioners will be revalidated, it cannot prescribe either what information needs to be collected or how the process should work. As a result, general practitioners lack a clear guide or model for showing that their practice is satisfactory for revalidation purposes. To gain insight into this problem, we recruited general practitioners in Tayside, Scotland, to develop, implement, and evaluate two models for revalidation. The models were a minimum criterion based model, with revalidation as the primary purpose, and an educational outcome model, with emphasis on combining revalidation with continuing professional development.
We were interested to find out the effect of each approach on the doctors' continuing professional development, whether either of these models could be used as a basis for a summative judgment, the feasibility and acceptability of the process to general practitioners, and whether either would allow patient groups to have confidence in the revalidation process.
The study ran between November 2000 and January 2003. When this study started the aims of revalidation were to ensure public confidence in doctors,3 promote maintenance of competence and continuing professional development, and detect poor performance.4 The objective of detecting poor performance caused some concern to study participants, who had difficulty reconciling this with the formative developmental objectives of demonstrating fitness to practise by means of a portfolio. Issues around introduction of revalidation to the profession are outlined in our companion paper on bmj.com, which details the development and implementation of the two revalidation models.
Participants and methods
This study involved three phases. The development phase (from November 2000 to August 2001) involved recruitment of participants, development of revalidation models, and desktop publishing of the models. The implementation phase (September 2001 to June 2002) covered the implementation of the models. The evaluation phase (July 2002 to January 2003) comprised evaluation of the process and assessment of the completed models.
Participants
All 340 general practitioners (principals and non-principals) registered on the databases of Tayside Primary Health Care Trust, the local faculty of the Royal College of General Practitioners, and the GP Postgraduate Unit were invited by letter to take part in the two year study. We set up evening meetings to explain the study and recruit participants. The two models were developed by two groups of volunteer general practitioners and “key stakeholders” (representatives from Tayside Health Council (patient representatives), the Royal College of General Practitioners, each of Tayside's three local health care cooperatives, the local non-principals group, and secondary care).
The revalidation models
Criterion model—For this model we used a method suggested by the Royal College of General Practitioners and based on Good Medical Practice for General Practitioners.5 We grouped the unacceptable attributes of a general practitioner under the seven headings of good medical practice, and for each group of unacceptable attributes we created a positive criterion statement. Information that general practitioners could collect in their portfolios was then decided, along with the pass point (standard) if it could be specified (see example in box 1).
Educational outcome model—On the basis of the Dundee outcome model,6 we modified the 12 outcomes determined for medical practice to reflect the specific tasks and competencies of general practice. Conceptually, this model looks at the tasks that a doctor does (technical intelligences), the deeper understanding needed for those tasks (intellectual intelligences), and the professionalism of the doctor (personal intelligences). Within each outcome, we specified broad statements of required practices as “givens.” Information to be collected by general practitioners for their folders was specified and standards determined (see box 2).
For both models, standardised forms for structured presentation of the information were agreed and made available on the postgraduate department website.7 Full details of the development and content of the models are given in our accompanying paper on bmj.com.
Data collection
The study generated two types of data: assessment data from completed revalidation folders and evaluation data from participants.
Assessment data
Completed folders were anonymised. We then added two quality control folders that contained inadequate evidence of fitness to practise in order to test the robustness of the assessment process. Each folder was assessed separately by two general practitioners in the same implementation group (peer assessment). They assessed the folders as pass, problematic but pass, or fail. We included the category “problematic but pass” to aid the decision making process, so that assessors could record that information was thin but met the minimum standards to pass revalidation. A second assessment was made by an “external assessment group” comprising a patient's representative, a senior doctor nominated by the study group, and a doctor working in medical education. This group assessed a sample of the completed folders, any problem folders, and two quality control folders (12 in total). They assessed the folders as pass or fail. In addition, the GMC technical group assessed 10 folders, comprising a quality control folder, a problem folder, and a random sample of the other folders.
Box 1: Example of a criterion statement, information to be collected, and standard created for criterion revalidation model
Unacceptable attributes (GMC heading of good clinical care)
Doctor has no way of organising care for long term problems or for prevention
Criterion statement
Doctor provides continuing care for chronic medical problems
Information to be collected
Doctor provides a management plan illustrating the care of a patient with chronic disease
Standard
Protocol completed and referenced to local or national standards
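The structure of a criterion entry (heading, attribute, criterion statement, information, standard) can be made concrete with a short sketch. The code below is purely illustrative: it is not a tool used in the study, and the record type and field names are our own invention.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Criterion:
    """One entry of the criterion model: a group of unacceptable
    attributes recast as a positive criterion statement, plus the
    information a doctor collects and the pass point (standard),
    where one could be specified."""
    gmp_heading: str                # one of the seven headings of good medical practice
    unacceptable_attribute: str
    criterion_statement: str
    information_to_collect: str
    standard: Optional[str] = None  # None where no pass point could be specified

# The example from box 1 expressed in this structure
chronic_care = Criterion(
    gmp_heading="Good clinical care",
    unacceptable_attribute="Doctor has no way of organising care for "
                           "long term problems or for prevention",
    criterion_statement="Doctor provides continuing care for chronic medical problems",
    information_to_collect="Doctor provides a management plan illustrating "
                           "the care of a patient with chronic disease",
    standard="Protocol completed and referenced to local or national standards",
)
```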
Box 2: Example of givens, information to be collected, and standard created for educational outcome revalidation model
Givens for outcome 1—clinical care (history taking)
Doctor is able to elicit adequate clinical details to formulate a diagnosis
Ensures no serious condition is missed
Considers social and psychological factors
Makes effective use of time
Information to be collected
Observation of five consultations with a colleague or
Case report from a consultation or
Patient satisfaction survey
Standard
Forms for choice of information completed
Evaluation data
We collected data from participants at each stage of the study using feedback forms, structured piloted questionnaires with open questions and closed responses on a 5 point Likert scale, and semi-structured interviews. The results, a mixture of quantitative and qualitative data, allowed exploration of doctors' perceptions of the two models and the difficulties they encountered, comparison between the models, and estimates of the time taken to complete them. We analysed the quantitative data using SPSS and analysed the qualitative data for content to develop major themes.8
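As a minimal illustration of how closed Likert responses can be summarised (the study itself used SPSS for this), the sketch below tabulates invented responses to a single 5 point item. Ordinal data of this kind are usually described with frequency counts and medians rather than means.

```python
from collections import Counter
from statistics import median

# Invented responses to one 5 point Likert item
# (1 = strongly disagree ... 5 = strongly agree); not study data.
responses = [4, 5, 3, 4, 2, 5, 4, 3, 4, 5]

counts = Counter(responses)
print("distribution:", {point: counts.get(point, 0) for point in range(1, 6)})
print("median:", median(responses))  # median suits ordinal Likert data
```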
Results
A total of 72 general practitioners indicated initial interest, and 45 volunteered to develop the two models. A further 24 indicated interest in piloting the completed models. Sixty six doctors started the implementation phase: two dropped out at the start for no known reason, and three dropped out because of serious personal illness, giving a total working sample of 61. Eight failed to hand in folders and did not provide feedback as to their reasons for dropping out. Of the 13 doctors who dropped out, 10 had been involved with the study since the start (that is, had developed the models), and three had joined the study at the implementation phase. The remaining 53 handed in completed folders for peer and external assessment. Table 1 shows the demographics of the study groups.
Table 1. Demographics of the study groups. Values are numbers of doctors

| | Criterion model (n=24) | Outcome model (n=29) | New recruits only (n=24)* |
|---|---|---|---|
| Men:women | 17:7 | 20:9 | 18:6 |
| Principal:non-principal | 22:2 | 28:1 | 21:3 |
| Years in practice: | | | |
| 1-9 | 8 | 12 | 10 |
| 10-19 | 12 | 6 | 8 |
| 20-29 | 4 | 11 | 4 |
| ≥30 | 0 | 0 | 2 |
| Type of practice: | | | |
| Urban | 8 | 13 | 9 |
| Rural | 6 | 7 | 4 |
| Mixed | 10 | 9 | 10 |
| GP trainers | 7 | 6 | 4 |

*Doctors who had not participated in developing the revalidation models.
The time taken to complete the folders varied widely, from less than 20 hours to more than 40, with the educational outcome model requiring slightly more time (table 2).
Table 2. Time taken to complete the revalidation folders. Values are numbers (percentages) of doctors

| Completion time (hours) | Criterion model (n=24) | Outcome model (n=29) |
|---|---|---|
| <20 | 0 | 2 (7) |
| 21-30 | 12 (50) | 7 (24) |
| 31-40 | 5 (21) | 10 (34) |
| ≥40 | 7 (29) | 10 (34) |
Assessment of completed folders
Peer assessment of the 55 folders showed a good degree of inter-rater reliability (κ = 0.66). The assessment made by the external group on 12 folders showed moderate agreement with the peer assessment (κ = 0.43). Of the 12 folders that were marked three times (once by each of two peer assessors and once by the external group), two were marked as doubtful in two of the three assessments. The quality control folders were identified as substandard in five of their six assessments (each folder had two peer and one external group assessments). The GMC technical group confirmed that, of the 10 folders they sampled, nine provided sufficient information to support revalidation and one (a quality control folder) contained inadequate information.
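For readers unfamiliar with κ, it is observed agreement corrected for the agreement expected by chance from each rater's marginal frequencies: κ = (p_o − p_e)/(1 − p_e). The sketch below computes it from first principles on invented ratings using the study's three peer assessment categories; it is an illustration of the statistic only, not the study's analysis.

```python
from collections import Counter

def cohen_kappa(rater_a, rater_b):
    """Cohen's kappa: agreement between two raters corrected for chance.
    kappa = (p_o - p_e) / (1 - p_e)."""
    n = len(rater_a)
    # Observed proportion of items on which the two raters agree
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement expected from each rater's marginal frequencies
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in set(rater_a) | set(rater_b)) / n**2
    return (p_o - p_e) / (1 - p_e)

# Invented ratings for six folders, using the peer assessment categories
ratings_1 = ["pass", "pass", "problematic but pass", "pass", "fail", "pass"]
ratings_2 = ["pass", "problematic but pass", "problematic but pass", "pass", "fail", "pass"]
print(f"kappa = {cohen_kappa(ratings_1, ratings_2):.2f}")  # 0.71 for these invented data
```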
Evaluation data
Doctors who had helped to develop the models valued the protected time to work with their colleagues and patient groups. Those who developed the simpler criterion model found the tasks and processes clear but had difficulty in trying to define and measure good practice. Doctors who developed the educational outcome model enjoyed the learning and educational theory but felt that more guidance, leadership, and facilitation were needed. There were reported changes in practice with updating of medical bags, improved record keeping, and increased audit activity.
The study design allowed collection of data from three groups of doctors implementing the models (table 3). All who developed a model found it easy to follow when implementing it. However, those who, after having developed the criterion model, switched to implementation of the more complex educational outcome model found it difficult to grasp, and the new recruits found both models complex to understand.
Table 3. Implementation of the models by subgroups of general practitioners

| Subgroups of GPs | Criterion model (n=24 completed) | Outcome model (n=29 completed) |
|---|---|---|
| GPs who had developed the same model | 10 (5 completed) | 10 (9 completed) |
| GPs who had developed the other model | 11 (9 completed) | 11 (9 completed) |
| New recruits for implementation | 12 (10 completed) | 12 (11 completed) |
Despite their reservations and early confusion, once the doctors began collecting information for inclusion in their folders, feedback at progress meetings indicated that most found the task straightforward and relatively easy. The doctors had a range of information available for inclusion in their folders (table 4). Though all doctors collected a wide variety of data to illustrate their practice, they had clear views and preferences about the types of information. Observation of practice, clinical audit, and analysis of prescribing data were considered the least feasible information to collect: the need to involve partners and to obtain data for prescribing and audit activities required both planning and effort. Observation of practice, patient satisfaction surveys, and peer surveys were the least acceptable data to collect: each involves external opinions of a doctor's practice. When asked which information provided the most robust evidence of their performance, participants rated observation of practice, medical records, and referral letters highest.
Table 4. Types of information included in doctors' completed folders. Values are numbers (percentages) of doctors

| Types of information | Criterion model (n=24) | Outcome model (n=29) |
|---|---|---|
| “External data”: | | |
| Observation | 15 (63) | 13 (45) |
| Patient satisfaction survey9 | 22 (92) | 27 (93) |
| Peer review10 | 21 (88) | 23 (79) |
| Other information: | | |
| Significant event analysis | 20 (83) | 24 (83) |
| Clinical audit | 23 (96) | 26 (90) |
| Analysis of prescribing | 20 (83) | 26 (90) |
| Referral letter analysis | 23 (96) | 29 (100) |
| Teamwork account | 9 (38) | 26 (90) |
| Medical record analysis | 21 (88) | 28 (97) |
| Case report | 16 (67) | 28 (97) |
| Management plan | 23 (96) | 17 (59) |
| Reflective diary | 0 | 1 (3) |
Non-principal general practitioners had problems getting staff to cooperate with information gathering and felt marginalised by a practice based emphasis in the data. Lack of access to prescribing data and difficulty in audit were also key issues for non-principals.
Participants felt that the educational outcome model involved more work and effort than should be required for revalidation purposes, but also found it enjoyable. Almost half of the doctors involved with either developing or implementing this model indicated that they would be interested in developing it further, to diploma or masters degree level.
At completion of the folders, though no difference was reported in ease of understanding the models, we found differences in ease of implementing the models and their acceptability, with the criterion model being favoured (table 5).
Table 5. Doctors' ratings of the models on completion of their folders. Values are numbers of doctors

| | Criterion model (n=24) | Outcome model (n=29) |
|---|---|---|
| Ease of use: | | |
| Difficult or borderline | 10 | 20 |
| Easy or very easy | 14 | 9 |
| Acceptability: | | |
| Not acceptable or borderline | 1 | 11 |
| Acceptable or very acceptable | 23 | 18 |
Doctors reported that both models had a positive effect in encouraging their continuing professional development. Although we have no hard evidence of changes in their practice, most doctors completed their folders using a wide variety of data that included patient and peer surveys and observation of their practice by colleagues (table 4). Such educational activities are rarely presented for postgraduate accreditation in Tayside. The choices of information collected in the doctors' folders were similar for both models.
Discussion
In this study we developed two models of revalidation for general practitioners. These were acceptable to doctors and achieved the aims of encouraging continuing professional development, detecting poor performance, and assuring patients that doctors successfully completing the process are competent practitioners. The models provided a structure which allowed general practitioners to demonstrate their fitness to practise by selecting from a menu of information choices. The preferred criterion model was found to be feasible and acceptable to general practitioners.
Potential limitations of study
The study group was recruited from those who volunteered on a single invitation to all general practitioners in Tayside. However, this group was generally representative as it included 20% of the general practitioners in the region from differing backgrounds, including many not traditionally seen at postgraduate educational or research meetings.
The timescale of this project was tight, with doctors completing their folders over nine months. This contrasts with the five year revalidation cycle. As a result of the time pressure, any comparison between a complex and a simple model is likely to favour the simple one.
Lessons learnt
The high completion rate may be explained by three factors. Both models were developed by general practitioners (plus key stakeholders), enhancing ownership.11 Considerable support was given during implementation, with meetings, examples of evidence, standard forms available on the internet, and independent organisation of patient and peer surveys. The models were based on Good Medical Practice for General Practitioners,12 with attributes grouped under the seven headings of good medical practice, making the rationale for providing evidence clear to participants.
Summative assessment of portfolio work, though used in undergraduate education,13 has been problematic in postgraduate general practice.14 In our study, assessment of both models showed good inter-rater reliability. This suggests that if doctors collect information about their practice in the standardised format used in these models and are assessed against clear criteria and standards, then decisions on whether to recommend revalidation can be made with some confidence. Although the models allowed participants a choice of what information to include in their folders, each choice clearly defined what should be included and what standard was acceptable.15
We attempted to be inclusive of non-principals, a growing group of general practitioners who are often neglected in educational matters.16 However, the problems they encountered show that more flexibility in folder content is needed.
Both revalidation models had a positive effect in encouraging continuing professional development. As the educational model had generated more interest in education and learning at the development phase, we wondered whether it would encourage more reflective practice. One postulated measure of increased critical reflection by doctors was the use of the “external” data (observation of practice, peer review, and patient satisfaction surveys), but we found no differences between the models in such use.
Doctors who had developed the criterion model and then changed to the educational model, and new recruits to both models, initially needed time and support to grasp the concepts but were then able to gather information for their folders without problems. This suggests that, for those using either the appraisal or independent route for revalidation, peer or educational support will be required.
Further developments
The simpler criterion model was the preferred choice in our study, and this model has been used to inform development of the Scottish revalidation folder.17 The Scottish revalidation folder has now been distributed to all GP principals in Scotland and is suitable for use either as part of the Scottish appraisal process or the independent revalidation route. A separate revalidation toolkit incorporates work from this study and offers practical guidance to doctors preparing for revalidation. Both are available on the RCGP Scotland website.18 Developments based on this study have resulted in general practitioners having a practical and acceptable model to use when preparing for revalidation.
What is already known on this topic
UK doctors' professional standards of practice are made explicit in Good Medical Practice
Folders of evidence will be used to show doctors' competence, but no validated models exist for how to carry out revalidation by this method
Portfolio assessments have been developed for undergraduates, but they have poor reliability for postgraduates because of the varied nature of their content
What this study adds
The study developed two revalidation models: a criterion model, with revalidation as the primary purpose, and an educational outcome model, which combined revalidation with continuing professional development
The summative assessment of the folders was reasonably reliable
The simpler criterion model was preferred by participating doctors and has informed development of the Scottish revalidation folder
Supplementary Material
A companion paper giving details of the development and implementation of the two revalidation models is on bmj.com
We thank Miriam Friedman Ben-David and Jennifer Laidlaw of the Centre for Medical Education, University of Dundee, for methodological support and Peter Donnan of Tayside Centre for General Practice, University of Dundee, for statistical advice.
Contributors: DB was the principal investigator, who conceived and developed the original idea, led the study, and prepared the manuscript. KP coordinated the study, organised the questionnaires, carried out the interviews, and contributed to analysis. RR contributed to all stages of study, including analysis, and contributed to manuscript preparation. DS helped develop the original idea, advised on the study throughout, sat on the expert group, and prepared the final manuscript. RH helped with initial methodological design and supported the project throughout. DB is guarantor for the study.
Funding: Main funding was from NHS Education for Scotland with supplementary funding from Tayside Primary Care Trust and Tayside Centre for General Practice Postgraduate Funds.
Competing interests: None declared.
Ethical approval: Not required.
References
1. General Medical Council. A licence to practise and revalidation. London: GMC, 2003.
2. General Medical Council. When your professional performance is questioned. London: GMC, 1997.
3. Buckley G. Revalidation is the answer. BMJ 1999;319:1145-6.
4. Southgate L, Pringle M. Revalidation in the United Kingdom: general principles based on experience in general practice. BMJ 1999;319:1180-3.
5. General Practitioner Committee, Royal College of General Practitioners. A methodology for recommending revalidation for the general practitioner. London: RCGP, 2000.
6. Harden RM, Crosby JR, Davis MH, Friedman M. From competency to meta-competency: a model for the specification of learning outcomes. Med Teach 1999;21:546-52.
7. Tayside revalidation pilot. e-proformas. http://www.dundee.ac.uk/generalpractice/postgraduate/E-proformas.htm (accessed 3 Mar 2004).
8. Bowling A. Research methods in health. Maidenhead: Open University Press, 2000.
9. Grogan S, Conner M, Willits D, Norman P. Development of a questionnaire to measure patients' satisfaction with general practitioners' services. Br J Gen Pract 1995;45:525-9.
10. Ramsey PG, Wenrich M, Carline J, Inui T, Larson E, LoGerfo J. Use of peer ratings to evaluate physician performance. JAMA 1993;269:1655-60.
11. Spiegal N, Murphy E, Kinmonth AL, Ross F, Bain J, Coates R. Managing change in general practice: a step by step guide. BMJ 1992;304:231-4.
12. Royal College of General Practitioners, General Practitioner Committee. Good medical practice for general practitioners. London: RCGP, 2002.
13. Friedman Ben-David M, Davis MH, Harden RM, Howie PW, Ker J, Pippard MJ. AMEE medical education guide No 24: portfolios as a method of student assessment. Med Teach 2001;23:535-51.
14. Pitts J, Coles C, Thomas P. Educational portfolios in the assessment of general practice trainers: reliability of assessors. Med Educ 1999;33:515-20.
15. Dauphinee WD, Case S, Fabb W, McAvoy P, Saunders N, Wakeford R. Standard setting for recertification. In: Newble D, Jolly B, Wakeford R, eds. The certification and recertification of doctors. Cambridge: Cambridge University Press, 1994:201-15.
16. Chambers R, Fieldhouse R, O'Connell S. GP non-principals' education: let's improve access for our flexible friends. Br J Gen Pract 1998;48:1551-2.
17. RCGP Scotland, BMA Scotland, NHS Education for Scotland. Revalidation folder: doctors working in clinical general practice in Scotland. Edinburgh: RCGP Scotland, 2003.
18. RCGP Scotland. www.rcgp-scotland.org.uk/ (accessed 20 Feb 2004).