Abstract
Objectives
To gain insight into current methods and practices for the assessment of competences during rheumatology training, and to explore the underlying priorities and rationales for competence assessment.
Methods
We used a qualitative approach through online focus groups (FGs) of rheumatology trainers and trainees, separately. The study included five countries—Denmark, the Netherlands, Slovenia, Spain and the United Kingdom. A summary of current practices of assessment of competences was developed, modified and validated by the FGs based on an independent response to a questionnaire. A prioritising method (9 Diamond technique) was then used to identify and justify key assessment priorities.
Results
Overall, 26 participants (12 trainers, 14 trainees) participated in nine online FGs (2 per country, Slovenia 1 joint), totalling 12 hours of online discussion. Strong nationally (the Netherlands, UK) or institutionally (Spain, Slovenia, Denmark) standardised approaches were described. Most groups identified providing frequent formative feedback to trainees for developmental purposes as the highest priority. Most discussions identified a need for improvement, particularly in developing streamlined approaches to portfolios that remain close to clinical practice, protecting time for quality observation and feedback, and adopting systematic approaches to incorporating teamwork and professionalism into assessment systems.
Conclusion
This paper presents a clearer picture of the current practice on the assessment of competences in rheumatology in five European countries and the underlying rationale of trainers’ and trainees’ priorities. This work will inform EULAR Points-to-Consider for the assessment of competences in rheumatology training across Europe.
Keywords: Autoimmunity, Early rheumatoid arthritis, Rheumatoid arthritis, Synovitis, Treatment, Sjögren’s syndrome, T cells, Chondrocalcinosis, Gout, Health services research, Synovial fluid, Ankylosing spondylitis, Spondyloarthritis, Outcomes research, Epidemiology
INTRODUCTION
A rheumatologist is defined as a physician who has received further training in the diagnosis (detection) and treatment of musculoskeletal disorders and systemic autoimmune conditions, commonly referred to as rheumatic and musculoskeletal diseases.1 2 Rheumatology is recognised as a specialty or sub-specialty in most of the European League Against Rheumatism (EULAR) countries.3 However, the scope of rheumatology practice varies across countries.4 6 Indeed, in some countries, rheumatologists focus on inflammatory joint and connective tissue diseases whereas in others, rheumatology covers a broader scope, including soft tissue lesions, fibromyalgia and rehabilitation.5
In order to become a rheumatologist, trainees must successfully complete a rheumatology training programme.2 6 Both the content and the assessments within these programmes are regulated by national authorities. Some initiatives aiming at harmonising training across countries of the European Union (EU) exist. The European Union of Medical Specialists, a professional body of representatives from medical specialities from the EU member states, has developed a general European curriculum with the competences to be achieved at the completion of training, including theoretical and clinical knowledge, practical skills and non-clinical competences.2 7
Over the past years, in addition to this European curriculum, efforts have been made to gain insights and provide an in-depth analysis of the differences and similarities in national curricula and assessment methods across EULAR countries.6 8 In a prior study, a questionnaire was answered by young rheumatologists and trainees to assess the acquisition of competences during the training and other information not clearly stated in the curricula (eg, assessment of competences).6 8 9 Interestingly, while this approach provided useful information on some of the differences and similarities on training across countries, data on the assessment of competences were incomplete and several limitations hampered the interpretation of the findings.6 8 A further attempt was made to gather information from a principal investigator (PI) per country with a short questionnaire with open questions. This further highlighted the difficulty in obtaining useful and reliable information from a single person. Inquiring into assessments during rheumatology training leads to answers reflecting a personal experience and perception; therefore, a more comprehensive evaluation, obtained from different sources, is needed before a full picture can be obtained. For this reason, we decided on a qualitative approach in a representative selection of European countries.
The present study aimed at gathering information and in-depth views on the assessment methods of competences in rheumatology and the experiences around them, as well as underlying priorities for competence assessment through focus groups (FGs).
This qualitative approach will ultimately inform EULAR Points-to-Consider on the assessment of competences in rheumatology across Europe.
METHODS
Focus groups
FGs across different European countries were run online to gain insights into assessment methods.
Countries were selected in order to provide a geographical spread across Europe and to represent different educational contexts: larger and smaller countries, localised and centralised approaches to assessment. In order to be included, countries had to have (a) a national regulatory document for both curriculum and assessment methods, (b) a portfolio and (c) a structured framework for feedback closely related to the curriculum. Additionally, a minimum of one country per geographical area (East, North and South Europe) was included to give a spread of contexts. Eleven countries fulfilled all criteria; for feasibility, five countries were finally included: Denmark, the Netherlands, Spain, Slovenia and the United Kingdom.
Information on each country, used to check the above-mentioned eligibility criteria, was obtained from a PI in each country through a questionnaire seeking a general description of the country’s assessment methods. In qualitative research, this approach, ‘purposive sampling’, is used to ensure that those participating in the discussion are likely to have the experience needed to contribute to the study. The sample is not random and is small; it can only contribute to the understanding of that particular population, but it is considered likely to share experience and characteristics with other very similar groups.
Current assessment methods and practice
The PI of each included country was responsible for two tasks. First, to complete a questionnaire, for which they could receive help from their local team and/or head of the unit for maximum accuracy (online supplementary text S1). Second, to identify FG participants from their country through their personal or institutional network, ideally comprising four trainers and four trainees. Participation was voluntary and anonymised. Trainee and trainer FGs were run separately to avoid pressure between the groups.
Participants were sent preparatory material to review before the FG explaining the process and guiding the preparation for the discussion. This included a summary of the aims and methods of the project and an overall description of the country’s training and assessment methods previously developed by the PI through the above-mentioned questionnaire (online supplementary files 1 and 2).
The FGs were conducted online in English, moderated by an experienced qualitative researcher in medical education (CH) and assisted by a rheumatologist (AN). The FGs were audio-recorded through the Zoom software, and selected quotes were later used to illustrate the findings. First, the participants were introduced; then the PI’s responses to the questionnaire were shared on screen during the discussion. The account of practice in each country was discussed and amended in detail until a full account was agreed upon by all participants. This allowed the incorporation of the perspectives of experienced trainers and trainees from different centres within each country, thereby ensuring more reliable data. The following aspects were discussed during the FGs: portfolio, formative feedback, summative assessments, clinical practice and skills, professionalism, trainer certification, knowledge tests and national standards.
Priorities for assessment from trainees and trainers
In order to gain insights into FG participants’ views on assessment methods, a prioritising technique known as the 9 Diamond method10 11 was used to identify key assessment priorities and their justifications. The 9 Diamond technique is commonly used to stimulate discussion in face-to-face educational settings where the underlying values and beliefs about a topic have a strong bearing on priorities in professional practice. It has also been employed in educational research settings as a semi-structured framework for discussing the rationale behind complex choices in a time-effective manner.10 A possible limitation is that the statements may not produce the optimal or expected response from the participants, a risk for any interview or discussion-based method of qualitative research. The method also relies on the skill of the person leading the discussion to ensure that participants have the best opportunity to voice their thoughts. The discussion leader has over 30 years of experience in leading professional discussions of this type, 15 of which have been in clinical education.
In qualitative research, the subjective experience and opinions of the selected group are the subject of study and so the discussion was specifically targeted to participants’ own experience within their own setting. The trainers were all experts in their field and so their experience was extremely relevant. The trainees were not yet experts in their field nor able to fully judge the role any assessment would play in their future career; however, they were experts in their own current experience of being assessed in training. Trainee perception of the assessment regime is an important aspect in the overall effectiveness of the programme and can assist in evolving practice for future trainees.
Participants were provided with a set of nine statements about assessing competences in rheumatology training (table 1).
Table 1.
[Nine statements were presented to participants, unprioritised, as shown to the FGs; the original coloured table is not reproduced in this version. The shorthand labels used for the statements in tables 4 and 5 are: regular feedback; service, timely; effective in clinic; portfolio driven; regular skills; prevent failure; functions in teams; knowledge tests; professional attitudes emerge.]
Each participant was asked to rank these statements into top, bottom and middle three priorities, giving reasons. This process stimulated discussion between participants, until the group was able to reach a consensus agreement on the priority order. Statements were framed to prompt discussion on the underlying values and beliefs related to nine key areas of competence assessment.
These statements were developed by the medical educator (CH) based on general principles, the medical education literature and a systematic literature review on the assessment of competences.12 Quotations were systematically collected from this part of the discussion for each group. Participants were finally asked to specify any aspect which had, in their view, been omitted, both to ensure the comprehensiveness of the final picture and to assess whether data saturation had been reached.
For the aggregated analysis, a statement scored 1 if ranked as the first choice, with the score increasing by one per position down to 9 for the last statement. These scores were calculated for each FG and then summed across groups; thus, the lowest total indicated the most popular choice.
Results were analysed by country and by participant group: trainers and trainees. The findings were ordered and presented using colours and statements to demonstrate the variety of decisions across the groupings. The recordings from this section of the discussion were used to provide key quotations, illustrating the themes discussed and the justifications given within the groups for each competence area.
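The scoring described above amounts to a simple rank-sum aggregation. The following minimal sketch illustrates the arithmetic; the statement labels and rankings used here are hypothetical examples, not actual study data:

```python
# Illustrative sketch of the 9 Diamond score aggregation described above.
# The rankings below are hypothetical, for illustration only.

def aggregate_priorities(rankings):
    """Sum rank positions (1 = first choice, ..., n = last choice) per
    statement across focus groups; the lowest total is the highest priority."""
    totals = {}
    for ranking in rankings:  # one ordered list of statements per focus group
        for position, statement in enumerate(ranking, start=1):
            totals[statement] = totals.get(statement, 0) + position
    # Order statements from most to least popular (lowest total first)
    return sorted(totals.items(), key=lambda item: item[1])

# Two hypothetical focus-group rankings of three statements
fg1 = ["Regular feedback", "Portfolio driven", "Knowledge tests"]
fg2 = ["Regular feedback", "Knowledge tests", "Portfolio driven"]

# "Regular feedback" has the lowest total (1 + 1 = 2), so it ranks first
print(aggregate_priorities([fg1, fg2]))
```

A tie in totals (as between the other two statements here) would be broken by the order in which statements were first encountered, since Python's sort is stable; the study itself reports aggregate orderings per group without describing a tie-breaking rule.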
RESULTS
Current assessment methods and practice
In total, 29 volunteers (15 trainers and 14 trainees) participated in nine FGs. Table 2 summarises demographic data of the FG participants.
Table 2.
| | Trainees (n=12) N (%) | Trainers (n=15) N (%) |
|---|---|---|
| Female gender | 7 (58) | 9 (60) |
| Academic centre | 9 (75) | 7 (47) |
| Country: Slovenia | 2 (17) | 1 (7) |
| Country: Spain | 4 (33) | 4 (27) |
| Country: the Netherlands | 3 (25) | 4 (27) |
| Country: UK | 4 (33) | 4 (27) |
| Country: Denmark | 3 (25) | 2 (13) |
Two online FGs composed of three to four persons per group were conducted in each country, except Slovenia, where only one FG was performed due to a lower number of participants. The 12 hours of online discussion resulted in over 15 000 words. A summary for each of the five countries’ assessment systems is available in table 3.
Table 3.

Portfolio
- The Netherlands: mandatory electronic portfolio; entrustable professional activities†; portfolio incorporates the CanMeds framework*
- Spain: mandatory portfolio (resident’s book); checklist of the main competences
- Slovenia: electronic portfolio; checklist of rotations, mandatory procedures and description of the final licencing exam
- Denmark: electronic portfolio; checklist of learning objectives, formal assessment and approval of competencies
- United Kingdom: mandatory electronic portfolio; portfolio incorporates the CanMeds framework*

Formative feedback
- The Netherlands: meeting with the supervisor every 3–6 months; strong culture of work-based assessment; personal reflection on progress and learning goals
- Spain: meeting with the supervisor every 3 months; yearly assessment
- Slovenia: no formal feedback procedure; work-based assessment; yearly assessment
- Denmark: regular evaluation discussions before the “Final discussion”; the plan uses the CanMeds* roles
- United Kingdom: meeting with the supervisor at least 3 times a year; case-based discussions

Quotes
- ‘You learn most from people, not from the portfolio. It doesn’t drive me at all, but I use it to order my thoughts.’ (The Netherlands, Trainee)
- ‘Feedback on clinical learning is the most important thing for trainees.’ (Spain, Trainee)
- ‘Reviewing with the resident what is going well and what is problematic and how to improve or overcome difficulties is very important.’ (Spain, Trainer)
- ‘I have so many other things to be done first and then I’ll think about it.’ (Slovenia, Trainee)
- ‘Feedback is really vital to a trainee so you know what to keep doing and what you need to stop doing or improve on.’ (Slovenia, Trainee)
- ‘The portfolio is the way to make everybody see the minimum standards.’ (Denmark, Trainee)
- ‘The portfolio provides the framework for a common standard that can be applied nationally (…) and includes the curriculum which is crucial.’ (UK, Trainer)

Summative assessment
- The Netherlands: decision based on portfolio and supervisor’s assessment; EULAR course is mandatory; no publication requirement; no national exams
- Spain: annual assessment comprising a mini-CEX 2–3 times a year; voluntary OSCE; no publication requirement; no national exam
- Slovenia: interview at the end of each rotation; national exam (final licencing exam) with practical, oral and theoretical questions, taken before three examiners (two rheumatologists and one from a different specialty)
- Denmark: decision based on portfolio and supervisors’ assessment; no publication requirement; no national exams
- United Kingdom: national exam (Specialty Certificate Examination)

Clinical practice and skills
- The Netherlands: entrustable professional activities†; short practical assessment
- Spain: mini-CEX; voluntary OSCE
- Slovenia: workplace-based assessments
- Denmark: competency cards
- United Kingdom: workplace-based assessments; Direct Observation of Procedural Skills

Quotes
- ‘Skills are important, but always in clinical context.’ (The Netherlands, Trainee)
- ‘Some test of theoretical understanding is good but an OSCE is a much better approach.’ (Spain, Trainer)
- ‘It was 10 years ago, and I still remember it!’ (Slovenia, Trainer)
- ‘I focus on acquiring the appropriate competency to a high level in the clinical setting. I didn’t realise it was summative.’ (Denmark, Trainee)

Professionalism
- The Netherlands: multisource feedback
- Spain: none
- Slovenia: none
- Denmark: multisource feedback
- United Kingdom: multisource feedback

Knowledge tests
- The Netherlands: none
- Spain: none
- Slovenia: none
- Denmark: final oral knowledge test
- United Kingdom: none

Quotes
- ‘Knowledge tests are useful as a way of measuring your progress but they are not the best way at this level; you need to show you can apply your knowledge in clinical practice.’ (The Netherlands, Trainee)
- ‘Knowledge tests are not a good way to judge between trainees at this level.’ (Spain, Trainer)
- ‘I don’t think professional behaviours would just emerge alone. We have to teach it. Poor models don’t teach professionalism.’ (Spain, Trainer)
- ‘If you did something unprofessional, then somebody would discuss it with you.’ (Slovenia, Trainee)
- ‘Everyone judges your professionalism on a daily basis, so do you need it assessing as well formally? I don’t think so.’ (UK, Trainee)
- ‘We love the professional attitude, that’s vital!’ (UK, Trainer)

National standards
- The Netherlands: national portfolio; central body checking standards across the country; national audits for traineeships organised by the Dutch central organisation
- Spain: central body checking standards across the country
- Slovenia: no national standard
- Denmark: assessment is similar across the country; regulation by the Danish Society of Rheumatology
- United Kingdom: all summative points are specified centrally by the Royal College; training is supported nationally

Trainers’ certification
- The Netherlands: University Teaching Qualification course recommended but not mandatory; supervisors should do faculty development in the university hospitals
- Spain: supervisors should do faculty development in the university hospitals; each mentor should have a maximum of four trainees
- Slovenia: each main mentor must earn at least 50 CME points each year; a teacher should have a maximum of three trainees; no formal certification to become a trainer
- Denmark: courses in supervision are mandatory for all physicians who train trainees; train-the-trainer course
- United Kingdom: all supervisors must receive a minimum training in teaching and assessing; trainers are reviewed on a regular cycle and registered within their region
*CanMeds is an educational framework that describes the abilities physicians require to effectively meet the healthcare needs of the people they serve. It is the basis for the educational and practice standards of the Royal College of Physicians and Surgeons of Canada.
†An entrustable professional activity is a key task that an individual can be trusted to perform in a given healthcare context, once sufficient competence has been demonstrated.
CME, Continuing Medical Education; mini-CEX, mini Clinical Evaluation Exercise; OSCE, Objective Structured Clinical Examination.
Overall, some sort of portfolio was used in every included country; this was commonly seen as useful (especially by trainers), but time-consuming (especially by trainees).
“The portfolio provides the framework for a common standard that can be applied nationally (…) and includes the curriculum which is crucial. (UK Trainer)”
“I have so many other things to be done first and then I’ll think about it. (Slovenia Trainee)”
Its positive aspects stemmed from the framework it provided for the overall training and assessments. Formative feedback was felt to be essential by both trainers and trainees; the timing of the assessments was country-specific, ranging from a 3-monthly basis (Denmark, the Netherlands) to a yearly basis (Slovenia, Spain).
“Feedback on clinical learning is the most important thing for trainees. (Spanish Trainee)”
The Slovenian oral final examination, held over 1 or 2 days in a clinical setting, was felt by participants to be very stressful. Professionalism was formally assessed in three countries (Denmark, the Netherlands, UK) through multisource feedback and was highlighted as important by participants. Mandatory courses for trainers in teaching methods took place in two countries, Denmark and the UK.
9 Diamond priority ordering results
Table 4 provides an aggregated summary of participants’ priorities, overall and separately for trainees and trainers.
Table 4.
| Order | Joint (n=26) | All trainees (n=14) | All trainers (n=12) |
|---|---|---|---|
| 1 | Regular feedback | Regular feedback | Regular feedback |
| 2 | Service, timely | Service, timely | Portfolio driven |
| 3 | Effective in clinic | Effective in clinic | Service, timely |
| 4 | Portfolio driven | Regular skills | Effective in clinic |
| 5 | Regular skills | Prevent failure | Functions in teams |
| 6 | Prevent failure | Functions in teams | Regular skills |
| 7 | Functions in teams | Portfolio driven | Prevent failure |
| 8 | Knowledge tests | Knowledge tests | Knowledge tests |
| 9 | Professional attitudes emerge | Professional attitudes emerge | Professional attitudes emerge |
Providing regular feedback to trainees and the need to achieve a balance between service provision and protected time for training were rated highly by both groups. Knowledge tests during postgraduate training were rated in the lowest priorities by most of the respondents.
Despite overall high agreement, there were differences between trainers and trainees. Trainees were generally much less keen on portfolios than trainers. Trainees were also more concerned than trainers with interventions to support trainees at risk of failure and with demonstrating effectiveness in a clinical setting.
Differences were also observed across countries (table 5). All countries agreed in giving a high priority to providing regular feedback. Denmark and the Netherlands seemed less concerned with trainees at risk of failure, whereas Slovenian participants seemed to be keener on knowledge tests. UK, Spanish and Slovenian trainees were less keen on portfolios, highlighting their administrative burden. On the other hand, trainers were very keen on them irrespective of their country of origin.
Table 5.
| Order | The Netherlands (n=5) | UK (n=7) | Spain (n=6) | Slovenia (n=4) | Denmark (n=4) |
|---|---|---|---|---|---|
| 1 | Functions in teams | Regular feedback | Regular feedback | Regular feedback | Regular feedback |
| 2 | Regular feedback | Service, timely | Service, timely | Service, timely | Portfolio driven |
| 3 | Portfolio driven | Prevent failure | Effective in clinic | Effective in clinic | Effective in clinic |
| 4 | Effective in clinic | Regular skills | Regular skills | Regular skills | Regular skills |
| 5 | Service, timely | Portfolio driven | Portfolio driven | Knowledge tests | Service, timely |
| 6 | Prevent failure | Functions in teams | Functions in teams | Prevent failure | Functions in teams |
| 7 | Regular skills | Effective in clinic | Prevent failure | Portfolio driven | Prevent failure |
| 8 | Knowledge tests | Professional attitudes emerge | Knowledge tests | Functions in teams | Knowledge tests |
| 9 | Professional attitudes emerge | Knowledge tests | Professional attitudes emerge | Professional attitudes emerge | Professional attitudes emerge |
Beyond the prespecified discussion items, the Dutch groups (both trainees and trainers) suggested further aspects that would be important to their competence assessment: research, resilience and reducing the administrative burden within healthcare. None of the other countries made additional proposals, suggesting that the key issues were well covered by the nine statements.
DISCUSSION
This work is, to our knowledge, the first qualitative study to gain insights into competence assessments in rheumatology across Europe. Online FGs are an innovative methodology and proved a feasible way to involve discussants across a wide geographical area. Through this study, we highlighted interesting differences between countries in current strategies and methods for the assessment of competences. While a portfolio was mandatory in all five countries, mostly in electronic format, other assessment types, such as assessment of professionalism through multisource feedback, occur in a structured manner in only some countries. Portfolios were supported by trainers but were also felt to be burdensome by trainees, sometimes reduced to an ineffective, time-consuming checklist of requirements. Participants particularly valued portfolios which provided a framework to integrate all the required aspects of performance, often derived from the CanMeds approach. Where components were closely linked to the curriculum and included non-clinical competences, such as professionalism and teamwork, portfolios were seen as particularly useful and valid. Trainees commented on their current experience while compiling their portfolios and understandably expressed their current challenges and frustrations, which would be expected to become more balanced in retrospect.
Structured feedback took place regularly in all five countries but with a variable frequency. Providing feedback was highly valued by both trainers and trainees. One main difficulty, described by many FG participants, was the lack of protected time for giving and receiving feedback, which limits its feasibility. Indeed, giving feedback is a skilled educational task, particularly when related to attitudes and professionalism.13 Many trainers and trainees described how they valued or would welcome opportunities to develop their skills in a constructive and consistent approach, where feedback received regularly and in a positive manner could help catalyse changes in performance and motivation.
A prolonged oral test in a clinical setting was performed in only one country, and knowledge tests were generally felt to be neither effective nor desirable as key assessment methods at this stage in training. Indeed, there is evidence that summative oral exams should not be strongly relied on to assess competences and are not supported in the literature across medical education as the main approach to assessment for postgraduate education.14 15 In addition, knowledge tests might also be inappropriate at this level as assessment in advanced stages of training focuses on the acquisition of complex competences integrating knowledge, skills and attitudes.16
Interestingly, this work provides a summary of current practices; the initial document provided by each country’s PI evolved with the input of FG participants. This allowed us to obtain a more comprehensive view of the current assessment strategies within specific countries. Indeed, reflections on assessment practices are not fully objective and involve perception and judgement. Moreover, recent reviews highlight personal and environmental factors influencing the way trainees perceive the developmental value of assessment, with self-motivation also reported as an important driver of feedback seeking.17 18 This may explain why information gathered in previous attempts8 was felt to be incomplete and unreliable. By using a qualitative approach, we were able to combine and report in-depth views of several individuals from different positions and countries.
In the second part of this work, we used a novel technique, the 9 Diamond methodology, to identify underlying beliefs and attitudes likely to influence how the assessment of competences takes place. Through this work, it was possible to identify general priorities in the assessment of competences in rheumatology across Europe. This work has the potential to help some countries develop their approach to assessing competences in rheumatology training and avoid pitfalls. One of the main priorities of FG participants, especially trainees, was the identification of trainees at risk of failure and support for their progression. Signs of failure should be monitored and addressed in a timely fashion, so that a variety of appropriate solutions can be offered. While the importance of regular skill assessment was discussed, the particular case of technical procedural skills, such as joint aspiration, was not specifically addressed, which could be seen as a gap.
Some limitations of the study must be considered. Qualitative research often relies on purposive sampling. By its nature, the sample was predisposed towards participants with an interest in education and current interaction with EULAR, and so may well be biased towards those with a fuller understanding of and commitment towards the assessment of competences and current best practice. However, the FG participants, through their engagement, are also likely to have critical views, as well as positive ones, and are therefore suitable for such a qualitative study. It is likely that there are other examples of good practice either nationally, institutionally or locally that have not been captured by this approach, but they were considered out of scope of this project.

Online FGs present unique challenges. The group size must remain small, but on some occasions, one or two participants were unavailable at the last minute, making the number of participants potentially low. However, the overall sample size and methods of data collection, building a comprehensive account of practice with contributions by different groups (trainers and trainees) and in different stages, provide a significant sample for a qualitative study of this type. In addition, general agreement in priority ordering across groups suggests that variation in FG sizes has not significantly affected the results.

One limitation of the 9 Diamond method lies in the framing of the statements. It is important to formulate statements in a way that provokes discussion and leads to disagreement. This strategy was used in particular for the professionalism statement, which was worded in a negative manner. Despite leading to subsequent discussion, this particular statement ended up appearing low in the priority order, precisely because it was felt to be very important. This illustrates the fact that the order of priorities might not fully reflect importance in a linear way.
Traditional qualitative data (ie, quotations from transcriptions of the discussion) clearly demonstrated that assessing professionalism was felt to be extremely important and have a high priority in training. Of the 11 countries fulfilling our initial selection criteria, the inclusion of five countries allowed data saturation, while ensuring feasibility. Despite this limited number of countries, very few further aspects were mentioned by participants in response to prompts about what else needed to be included. Although we strove for representativity of different assessment systems and cultures, given the wide heterogeneity in training programmes across the 41 EULAR countries, selection bias could have hampered retrieving other useful insights or best practices.
In conclusion, we identified current practice in the assessment of competences across five countries, incorporating the views of expert trainers and current trainees. Additionally, priorities and underlying beliefs about the assessment of competences were identified. Together, these provide a rich and coherent picture on the assessment of competences in rheumatology training across European countries, which will inform the EULAR Points-to-Consider for the assessment of competences in rheumatology training.
Key messages.
What is already known about this subject
Providing medical education in rheumatology involves a challenging balance between delivering training on time and to high standards while also delivering service.
What does this study add
Providing frequent formative feedback to trainees for developmental purposes is perceived as the highest priority for both trainers and trainees in rheumatology.
Portfolios, considered useful by both trainers and trainees, are seen as time-consuming, particularly by trainees, and as requiring streamlined approaches to remain close to clinical practice.
How might this impact on clinical practice or future developments
Through focus groups, a good insight into practices and preferences around the assessment of competences in rheumatology was gathered from European trainees (fellows, residents) and trainers.
These insights will help further harmonise assessment practices for rheumatology trainees across Europe.
Acknowledgments
We acknowledge national PIs from the selected countries (Tue Kragstrup, Diego Benavent, Marloes van Onna, Blaž Burja and Md Yuzaiful Md Yusof) for their help in identifying the focus group participants. We thank all focus group participants for their time and contribution to this project.
Footnotes
Twitter: Aurélie Najm @AurelieRheumo.
Contributors: AN, AA, SR, FS and CH and the Working Group have contributed to the design of the study. AN and CH have organised and analysed the data of the focus groups. AN, AA, SR, FS, CH and the Working Group have contributed to the drafting of the manuscript and have approved the final version to be submitted.
Funding: This project was supported by a EULAR grant (EDU043).
Competing interests: None declared.
Patient consent for publication: Not required.
Data sharing statement: Data are available upon reasonable request.
Provenance and peer review: Not commissioned; externally peer reviewed.
Contributor Information
Working Group on Training in Rheumatology across Europe:
Ledio Collaku, Paul Studenic, Samir Mehmedagić, Russka Shumnalieva, Ivan Padjen, Ladislav Senolt, Tue Kragstrup, Liis Puis, Laura Kuusalo, Claire Daien, Peter Korsten, Antonis Fanouriakis, Mangel Zsolt, Richard Conway, Abid Awisat, Maddalena Larosa, Mira Merashli, Julija Zepa, Goda Seskute, Snezana M. Perchinkova, Cecilia Mercieca, Victoria Sadovici-Bobeica, Marloes van Onna, Brigitte Michelsen, Olga Brzezińska, Alexandre Sepriano, Ana Maria Gherghe, Anton Povzun, Ivan Jeremic, Ulrika Ursinyova, Blaž Burja, Diego Benavent, Aikaterina Chatzidionysiou, Kim Lauper, Yesim Ozguler, Yuzaiful Yusof, and Olena Zimba
Supplementary Materials
rmdopen-2020-001183supp001.pdf
rmdopen-2020-001183supp002.pdf