Abstract
Due to the multi-dimensional nature of professionalism, no single assessment modality has been shown to reliably assess it. This review describes some of the popular tools used to assess professionalism, with a view to formulating a framework for the assessment of professionalism in medicine.
In December 2015, the online research databases of MEDLINE, the Educational Resources Information Center (ERIC), Elton Bryson Stephens Company (EBSCO), SCOPUS, OVID and PsycINFO were searched for full-text English language articles published during 2000 to 2015. MeSH terms “professionalism” AND “duty” AND “assessment” OR “professionalism behavioural” AND “professionalism–cognitive” were used. Research articles that assessed professionalism across medical fields along with other areas of competencies were included. A final list of 35 articles was selected for this review.
Several assessment tools are available for assessing professionalism, including, but not limited to, the mini clinical evaluation exercise, standardised direct observation of procedural skills, the professionalism mini-evaluation exercise, multi-source feedback with 360-degree evaluation, and case-based discussion. Because professionalism is a complex construct, a single assessment strategy is unlikely to measure it adequately. Since every assessment tool has its own weaknesses, triangulation involving multiple tools can compensate for the shortcomings associated with any single approach. Assessment of professionalism necessitates a combination of modalities at the individual, interpersonal, societal, and institutional levels and should be accompanied by feedback and motivational reflection that will, in turn, lead to behaviour and identity formation. The assessment of professionalism in medicine should meet the criteria of validity, reliability, feasibility and acceptability. Educators are urged to enhance the depth and quality of assessment instruments in existing medical curricula to ensure the validity and reliability of assessment tools for professionalism.
Keywords: Assessment, Ethics, Mini-CEX, OSCE, Professionalism behavioural, SDOPS
Introduction
Professionalism is a multi-dimensional construct that has been shown to vary across educational, regional and cultural contexts [1]. There is growing evidence that professionalism in medicine plays a potential role in developing professional excellence in medical students and physicians [2,3]. However, the absence of a comprehensive and universally accepted definition of professionalism has limited its operationalisation [4]. There has been a proliferation of definitions of professionalism [5–7], perhaps following escalating community concerns and public disquiet over clinical incompetence and unprofessional behaviours by the medical fraternity [8]. This urges educators and policy makers to develop effective teaching and assessment strategies for professionalism that can be conveniently embedded into medical curricula. “Professionalism needs to be assessed if it is to be viewed as both positive and relevant” [2]. Assessment of professionalism is a key dimension of the General Medical Council (GMC) recommended guidelines for undergraduate curricula and of all four domains for appraisal as well as re-validation [9]. The literature has provided simple outcome measures of teaching and learning experiences with a positive impact on attitudes towards professionalism and professional behaviour in medicine [10,11]. Nevertheless, these findings fail to establish the validity and reliability of the assessment modalities used to assess professionalism in medicine.
This review elaborates on the effectiveness of the currently popular tools that can be applied for assessing professionalism in medicine. The gaps in assessment strategies are also highlighted in an attempt to advance knowledge about how to effectively assess professionalism in the medical field.
Research Design
Data sources
In December 2015, two authors independently searched the online research databases of MEDLINE, the Educational Resources Information Center (ERIC), Elton Bryson Stephens Company (EBSCO), SCOPUS, OVID and PsycINFO for full-text English language articles published during 2000 to 2015. MeSH terms used in this search included “professionalism” AND “duty” AND “ethics” AND “assessment” OR “professionalism–behavioural” AND “professionalism–cognitive”. Furthermore, a manual search was conducted on the reference lists of retrieved articles and literature reviews. Only articles pertaining to medical education and providing empirical evidence were selected. Editorials, short communications, opinions, letters to the editor, abstracts and descriptive papers without original data were excluded. Research articles that assessed professionalism across medical fields along with other areas of competencies were included. The reviewers pooled their findings and generated a single list of selected articles.
Data Analysis and Results
The initial search retrieved 498 articles. The manual search of reference lists yielded another 49 references. Of these 547 articles, 310 were excluded as they did not meet the selection criteria. During further detailed review of abstracts, 190 publications were excluded. Finally, 35 articles were selected for this literature review [Table/Fig-1].
[Table/Fig-1]:
Schematic presentation of the selection of studies on strategies for assessing professionalism in medicine.
General principles of assessment in medicine
In medical education, competencies are task-based performances that a qualified medical professional should be able to perform successfully [12]. Medical competence encompasses a mix of measurable constructs such as knowledge, skills, problem-solving and attitudes [13]. van Mook et al., have argued that assessment of medical competence should measure performance in everyday practice [14]. Four fundamental criteria of assessment should be met: validity, reliability, acceptability and feasibility [15]. The weightage and application of each instrument need to be balanced depending on the context and objective of assessment. In high-stakes examinations, reliability may be given priority in the choice of assessment modality [16]. In formative assessments, where the final decision is based on triangulation of a variety of assessments, the educational impact of assessment can be favoured at the expense of reliability [17].
The assessment of surgical competence is even more complex and challenging, as it involves a combination of assessment tools to measure surgical knowledge, skills, and attitudes. Guraya et al., gathered the preferred modes of surgical education and training from participants at a state-of-the-art laparoscopic surgical training centre, the Advanced International Minimally Invasive Surgery Academy (AIMS) in Milan, Italy, and concluded that surgical trainees preferred a blend of training and assessment modes, including hands-on training and training by skilled tutors [18]. This blend of preferred learning styles, rather than a single stand-alone entity, has been well documented in several studies [19,20]. A very popular assessment tool, the Objective Structured Clinical Examination (OSCE), has the potential to objectively assess knowledge, skills and practice in the medical field with a fair degree of validity and reliability [21]. However, examiner bias has been shown to negatively influence the desired outcome.
The value of measuring professional behaviour in medicine
The main challenge of assessing professionalism is its multi-dimensional construct, which requires a diversity of approaches for valid and reliable assessment [1]. The cornerstone of the assessment of professionalism is the measurement of professional behaviour, as this domain is acknowledged to reflect the underlying cognitive, attitudinal, and personality characteristics of professionalism. However, from the perspective of socio-cognitive psychology, attitudes are considered poor predictors of behaviour, particularly in the presence of strong external constraints and social pressure [22]. “An individual’s behaviour is more likely to be influenced by situational and contextual phenomena arising during learning and practice than by their underlying attitudes” [23]. Ignoring this contextual background has the potential to unfairly label physicians and medical students as ‘unprofessional’ [24]. These situationally-dependent characteristics must be given due attention when assessing the context-driven nature of professional behaviours.
While selecting a mode of assessment, it seems imperative to be aware of its main purposes: to provide feedback that enables students to improve, and to measure the achievement of course learning objectives [25]. Failure to assess the core values of professionalism can deliver conflicting messages to all stakeholders, including students and practising clinicians, which may undermine the benefits of teaching and assessing professionalism [26]. The process of assessing professionalism provides a deep insight into learners’ knowledge and understanding of their professional competency, honesty and confidentiality with patients, improved quality of care, scientific knowledge, professional responsibilities, and trust [27].
The assessment of professionalism in medicine
No single instrument can be employed to assess every competency pertinent to the complex multi-dimensional construct of professionalism. A blend of instruments and triangulation, therefore, should be used [28]. A range of assessment tools, as suggested by Goldie, is shown in [Table/Fig-2] [1].
[Table/Fig-2]:
The range of modalities for assessing professionalism.
Assessment tool | Types of assessment tool
---|---
Observed clinical encounters | Mini-CEX; professionalism mini-evaluation exercise; standardised direct observation assessment tool
Collated views of co-workers | 360-degree evaluation
Records of incidents of professional lapses | Incident reporting form; critical incident reports
Simulations | Ethical dilemmas with patient simulations; OSCE
Paper-based tests | Defining issues test; objective structured video examination; critical incident report
Patient surveys | Patient assessment questionnaire; simulated patient rating scales; humanism scale
Global observer ratings | Global rating form; professionalism assessment instrument; Amsterdam attitudes and communication scale
Self-administered rating scales | Time management inquiry form; pharmacy professionalism instrument; cross-cultural adaptability inventory; cultural competence self-assessment questionnaire; interpersonal reactivity index
The five most popular assessment modalities currently used to measure professionalism in medicine are elaborated in the following section.
1. Mini-Clinical Evaluation Exercise (mini-CEX)
An assessor observes a trainee–patient interaction in a hospital; such clinical encounters last about 15 minutes, during which the trainee is expected to take a focused history and/or perform a physical examination within the defined time period [29]. The trainee then proposes a working diagnosis and management plan, and the performance is rated using a structured evaluation form. At the end, constructive feedback is provided. Assessors use a nine-point Likert scale ranging from ‘unsatisfactory’ to ‘superior’, which yields six domain-specific ratings and one final rating of clinical competence. Over the course of a year, the trainee undertakes about six such assessments, with a different assessor for each session [30].
2. Standardised Direct Observation of Procedural Skills (SDOPS)
SDOPS is perceived to be an effective and reliable learning and assessment strategy that enhances trainees’ workplace-based assessment and performance [31]. SDOPS assessments are customised to be conveniently incorporated into the daily practice of trainees and assessors, which speaks to the feasibility and effectiveness of this assessment tool [32]. During SDOPS, the assessor observes a trainee performing a procedure in a real-time environment. The event is followed by face-to-face feedback, and the trainee receives a scored evaluation of his/her clinical and procedural skills [33].
3. Professionalism mini-evaluation exercise (P-MEX)
The development of the P-MEX was based upon the mini-CEX [34], and the reliability and validity of this assessment instrument have been found to be reasonably fair [35]. The items included in the P-MEX were originally generated at workshops held at McGill University. The P-MEX assesses the clinical skills of residents: the examiner rates residents on their performance in professionalism, clinical reasoning, interviewing, physical examination, counselling, unit organisational skills, and efficiency using a nine-point rating scale [36].
4. Multi source feedback and 360 degree evaluation
The 360-degree tool consists of a set of questions that explore an individual’s knowledge of professionalism, communication skills, interpersonal style, leadership, and teamwork [37]. Evaluation of performance from a single source, such as a supervisor, subordinate, or patient, can carry inherent inaccuracies and can be affected by bias through the “halo and horn” effect [38]. To circumvent this potential element of undue leniency and unfairness in assessment, 360-degree feedback gathers feedback from several sources with varying roles in an individual’s work environment, including peers, supervisors, and subordinates [39]. This carries the promise of a comprehensive global assessment of performance while mitigating bias due to race, sex and culture [40]. The resulting dossier of information has been conveniently and effectively used as a framework for professional development and to track employee progress in a given institution [41].
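The aggregation step behind multi-source feedback can be sketched in code. The following is a minimal illustration only, not taken from any cited instrument: the rater groups, domains and Likert scores are hypothetical, and each group's ratings are averaged first so that no single (possibly biased) group dominates the combined result.

```python
from statistics import mean

# Hypothetical ratings of one physician on a 1-9 Likert scale,
# collected from three rater groups across two domains.
ratings = {
    "peers":        {"professionalism": [7, 8, 6], "teamwork": [8, 7, 9]},
    "supervisors":  {"professionalism": [6, 7],    "teamwork": [7, 8]},
    "subordinates": {"professionalism": [8, 8, 7], "teamwork": [9, 8, 8]},
}

def aggregate(ratings):
    """Average within each rater group, then across groups, per domain."""
    domains = {d for group in ratings.values() for d in group}
    summary = {}
    for d in domains:
        group_means = [mean(group[d]) for group in ratings.values() if d in group]
        summary[d] = round(mean(group_means), 2)
    return summary

print(aggregate(ratings))  # per-domain summary scores
```

A real 360-degree instrument would of course anonymise responses and may weight rater groups differently; this sketch only shows why group-wise averaging dampens the “halo and horn” effect of any single source.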
5. Case based discussion (CbD)
During the CbD, the trainee is given the opportunity to select the timing, the records, and even the assessor [42]. A few days before the assessment, the trainee chooses a case that falls well within the goals and learning objectives of the curriculum; the case is then discussed using focused questions designed to elicit responses for a comprehensive assessment of the knowledge, skills, attitudes and behaviours relevant to all domains of professionalism. At the end of the discussion, the assessor rates the performance and, finally, provides constructive feedback. On average, trainees are assessed six times during the year. A working plan for a typical CbD encompasses planning the case, the main event of discussion and evaluation, and feedback [43].
Traditionally, Miller’s pyramid is regarded as a benchmark in the assessment of clinical skills, performance and competencies [44]. [Table/Fig-3] illustrates a schematic algorithm using the range of assessment strategies that can be conveniently applied for assessing professionalism in medicine.
[Table/Fig-3]:
The range of assessment tools that can be used for the assessment of professionalism in medicine in terms of Miller’s pyramid.
Some models for assessing professional behaviour in medicine
A study by van Luijk et al., has demonstrated three main sets of required domains for assessing professional behaviours [45]:
- Reliability of the situation (intra-observer), assessors, and assessment modalities.
- Validity of the situation, judges (qualified), and rating scales (consequential validity).
- Acceptability by the students and institutions.
On the same note, Veloski et al., have argued that “the assessment of professional behaviour should meet the criteria of validity, reliability, feasibility and acceptability” [46].
Professional performance is context specific, and student performance will vary from case to case, which necessitates broader assessment across a range of contexts [47]. This demands a wide range of assessment tools across the curriculum with rigorous attention to their validity and reliability. Furthermore, it is vital to collate these findings and observations over time and triangulate them carefully before a holistic judgement can be passed [48]. Such an approach signals the use of portfolios collating indicators of performance, such as attendance records, multi-source feedback, performance in OSCEs, and reflective writing on critical incidents, that can help capture a sketch of a learner’s professional performance across different contexts. This can be prompted by embracing interactions at the individual, interpersonal, societal, and institutional levels.
The School of Medicine and Health at Durham University developed a conscientiousness index that explored whether students’ performance in completing specific tasks across several domains could be used to assess their professional behaviours [49]. The domains included attendance, timely submission of class work, and participation in evaluation and research. Satisfactory accomplishment of a task was awarded one conscientiousness point. Each student’s point score was compared with a grade given by a faculty member, and a significantly positive correlation was reported between the conscientiousness index points score and the faculty grade. “The conscientiousness index has also been found to correlate with student peers’ estimation of their professionalism” [50].
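The correlation behind such a comparison can be illustrated with a small sketch. The data below are invented for demonstration only (they are not from the Durham study); the function simply computes a Pearson correlation between hypothetical conscientiousness points and faculty grades.

```python
from statistics import mean, pstdev

# Hypothetical data: conscientiousness points earned by five students
# and the professionalism grade (1-5) each received from a faculty member.
index_points  = [12, 18, 9, 20, 15]
faculty_grade = [3, 4, 2, 5, 4]

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    mx, my = mean(x), mean(y)
    cov = mean((a - mx) * (b - my) for a, b in zip(x, y))
    return cov / (pstdev(x) * pstdev(y))

r = pearson(index_points, faculty_grade)
print(round(r, 3))  # close to +1 indicates strong positive correlation
```

In practice a study of this kind would also report statistical significance and sample size; the sketch only shows the core calculation underlying the reported positive correlation.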
Academics at Groningen University introduced peer assessment within small-group work to improve students’ professional behaviours [51]. The peer assessment employed three domains: communication, task performance and personal performance. The students were divided into control and peer-assessment groups, with assessments conducted once per trimester over a period of two years. Both groups were assessed by tutors and the scores were compared. Significantly enhanced performance was reported in the task performance and personal performance domains of the intervention group. The study inferred that peer assessment had a positive impact on the professional behaviours of students who were more accustomed to the complex learning environment of undergraduate medicine [51].
The University of Michigan developed a self-assessment questionnaire for measuring the professionalism of surgical residents [52]. The instrument contained 15 attributes of professionalism, and respondents were asked to choose one statement from a menu of seven options for each attribute. All statements had scale extremes, such as undesirable examples, for every attribute. Although this instrument was found to be effective and feasible for measuring surgical residents’ professional behaviours, it failed to show reliability for assessment [52].
Recommendations for assessing professionalism in medicine
- Assessment in multiple settings can help determine the accurate level of an individual’s professionalism and can identify context-specific views and knowledge-based skills relevant to learners. Thorough and systematic assessment of professionalism mandates the involvement of several assessors, the use of more than one assessment method, and assessment in different settings [35]. A body of evidence-based literature has indicated that the engagement of multiple assessors offers several perspectives, thus enhancing the breadth of assessment [53–55]. As such, this engagement of multiple assessors enhances the reliability and effectiveness of assessment. Each assessment strategy has merits and demerits; rating forms are considered relatively easy to use, but are plagued by the ‘halo’ or ‘horns’ effect [56]. The OSCE may overcome this shortcoming but needs a huge investment of cost, time, and resources [57]. Consequently, triangulation employing more than one assessment tool may help cover the shortcomings associated with any single approach.
- “The assessment of medical student professionalism is often delayed until clerkship rotations, however, research indicates that it is both desirable and possible to begin assessing student professionalism during the first year of medical school” [58]. The initial use of this assessment should be formative where the students are invited to present their perspective and this information is used to provide feedback and guide remediation.
- Persistent patterns of unprofessional behaviour, despite remediation, may be a strong impetus for dismissal [59].
- Practical strategies for assessing physician professionalism include 360-degree evaluation, SDOPS, mini-CEX, OSCE, CbD and a cognitive assessment of professionalism.
- Faculty development programs in medical schools carry great promise in fostering the knowledge and teaching skills of medical faculty, which will, in turn, be reflected in the practice of future doctors [60].
Conclusion
As professionalism is context based, each institution should develop an explicit definition of professionalism that resonates with societal norms and the core elements of professionalism. A judicious combination of assessment methods needs to be incorporated in a longitudinal manner, using a stepwise approach, for assessing professionalism in medicine.
Financial or Other Competing Interests
None.
References
- [1].Goldie J. Assessment of professionalism: A consolidation of current thinking. Medical Teacher. 2013;35(2):e952–e6. doi: 10.3109/0142159X.2012.714888.
- [2].Cox M, Irby DM, Stern DT, Papadakis M. The developing physician—becoming a professional. New England Journal of Medicine. 2006;355(17):1794–99. doi: 10.1056/NEJMra054783.
- [3].Wald HS, Anthony D, Hutchinson TA, Liben S, Smilovitch M, Donato AA. Professional identity formation in medical education for humanistic, resilient physicians: Pedagogic strategies for bridging theory to practice. Academic Medicine. 2015;90(6):753–60. doi: 10.1097/ACM.0000000000000725.
- [4].van Mook WN, van Luijk SJ, O’Sullivan H, Wass V, Zwaveling JH, Schuwirth LW, et al. The concepts of professionalism and professional behaviour: Conflicts in both definition and learning outcomes. European Journal of Internal Medicine. 2009;20(4):e85–e9. doi: 10.1016/j.ejim.2008.10.006.
- [5].Van Zanten M, Boulet JR, Norcini JJ, McKinley D. Using a standardised patient assessment to measure professional attributes. Medical Education. 2005;39(1):20–29. doi: 10.1111/j.1365-2929.2004.02029.x.
- [6].Murray E, Gruppen L, Catton P, Hays R, Woolliscroft JO. The accountability of clinical education: its definition and assessment. Medical Education. 2000;34(10):871–79. doi: 10.1046/j.1365-2923.2000.00757.x.
- [7].ten Cate TJ. Summative assessment of medical students in the affective domain. Medical Teacher. 2000;22(1):40–43.
- [8].Parker M. Assessing professionalism: theory and practice. Medical Teacher. 2006;28(5):399–403. doi: 10.1080/01421590600625619.
- [9].Tallentire VR, Smith SE, Wylde K, Cameron HS. Are medical graduates ready to face the challenges of Foundation training? Postgraduate Medical Journal. 2011;87(1031):590–95. doi: 10.1136/pgmj.2010.115659.
- [10].Cruess RL, Cruess SR, Boudreau JD, Snell L, Steinert Y. A schematic representation of the professional identity formation and socialization of medical students and residents: A guide for medical educators. Academic Medicine. 2015;90(6):718–25. doi: 10.1097/ACM.0000000000000700.
- [11].Wong A, Trollope-Kumar K. Reflections: An inquiry into medical students’ professional identity formation. Medical Education. 2014;48(5):489–501. doi: 10.1111/medu.12382.
- [12].Tichelaar J, van Kan C, van Unen RJ, Schneider AJ, van Agtmael MA, de Vries TP, et al. The effect of different levels of realism of context learning on the prescribing competencies of medical students during the clinical clerkship in internal medicine: an exploratory study. European Journal of Clinical Pharmacology. 2015;71(2):237–42. doi: 10.1007/s00228-014-1790-y.
- [13].Holden MD, Buck E, Luk J, Ambriz F, Boisaubin EV, Clark MA, et al. Professional identity formation: Creating a longitudinal framework through TIME (transformation in medical education). Academic Medicine. 2015;90(6):761–67. doi: 10.1097/ACM.0000000000000719.
- [14].van Mook WN, van Luijk SJ, O’Sullivan H, Wass V, Schuwirth LW, van der Vleuten CP. General considerations regarding assessment of professional behaviour. European Journal of Internal Medicine. 2009;20(4):e90–e5. doi: 10.1016/j.ejim.2008.11.011.
- [15].Todsen T, Tolsgaard MG, Olsen BH, Henriksen BM, Hillingsø JG, Konge L, et al. Reliable and valid assessment of point-of-care ultrasonography. Annals of Surgery. 2015;261(2):309–15. doi: 10.1097/SLA.0000000000000552.
- [16].Van Der Vleuten CP. The assessment of professional competence: developments, research and practical implications. Advances in Health Sciences Education. 1996;1(1):41–67. doi: 10.1007/BF00596229.
- [17].Jonsson A, Svingby G. The use of scoring rubrics: Reliability, validity and educational consequences. Educational Research Review. 2007;2(2):130–44.
- [18].Guraya SY, Forgione A, Gianluca S, Pugliese R. The mapping of preferred resources for surgical education: Perceptions of surgical trainees at the Advanced International Minimally Invasive Surgery Academy (AIMS), Milan, Italy. Journal of Taibah University Medical Sciences. 2015;10(4):396–404.
- [19].Khan AK, Khan KR, Bashir Z, Hanif A. Learning style preferences among students of medical and dental colleges. Advances in Health Professions Education. 2015;1(1).
- [20].Guraya SS, Guraya SY, Habib FA, Khoshhal KI. Learning styles of medical students at Taibah University: Trends and implications. 2014;19(12):1155. doi: 10.4103/1735-1995.150455.
- [21].Guraya S, Alzobydi A, Salman S. Objective structured clinical examination: Examiners’ bias and recommendations to improve its reliability. J Med Med Sci. 2010;1(7):269–72.
- [22].Rees CE, Knight LV. Viewpoint: The trouble with assessing students’ professionalism: Theoretical insights from sociocognitive psychology. Academic Medicine. 2007;82(1):46–50. doi: 10.1097/01.ACM.0000249931.85609.05.
- [23].Rees CE, Knight LV. Banning, detection, attribution and reaction: The role of assessors in constructing students’ unprofessional behaviours. Medical Education. 2008;42(2):125–27. doi: 10.1111/j.1365-2923.2007.02930.x.
- [24].Hodges BD, Ginsburg S, Cruess R, Cruess S, Delport R, Hafferty F, et al. Assessment of professionalism: Recommendations from the Ottawa 2010 Conference. Medical Teacher. 2011;33(5):354–63. doi: 10.3109/0142159X.2011.577300.
- [25].O’Sullivan H, van Mook W, Fewtrell R, Wass V. Integrating professionalism into the curriculum: AMEE Guide No. 61. Medical Teacher. 2012;34(2):e64–e77. doi: 10.3109/0142159X.2012.655610.
- [26].Goold SD, Stern DT. Ethics and professionalism: what does a resident need to learn? The American Journal of Bioethics. 2006;6(4):9–17. doi: 10.1080/15265160600755409.
- [27].Kung JW, Slanetz PJ, Huang GC, Eisenberg RL. Reflective practice: Assessing its effectiveness to teach professionalism in a radiology residency. Academic Radiology. 2015;22(10):1280–86. doi: 10.1016/j.acra.2014.12.025.
- [28].Van Der Vleuten CP, Schuwirth LW. Assessing professional competence: from methods to programmes. Medical Education. 2005;39(3):309–17. doi: 10.1111/j.1365-2929.2005.02094.x.
- [29].Guraya SY. Workplace-based assessment; applications and educational impact. Malaysian Journal of Medical Sciences. 2015;22(6):5–10.
- [30].Vaughan B, Moore K. The mini Clinical Evaluation Exercise (mini-CEX) in a pre-registration osteopathy program: Exploring aspects of its validity. International Journal of Osteopathic Medicine. 2016;19:61–72.
- [31].Miller A, Archer J. Impact of workplace based assessment on doctors’ education and performance: a systematic review. BMJ. 2010;341:c5064. doi: 10.1136/bmj.c5064.
- [32].Morris A, Hewitt J, Roberts C. Practical experience of using directly observed procedures, mini clinical evaluation examinations, and peer observation in pre-registration house officer (FY1) trainees. Postgraduate Medical Journal. 2006;82(966):285–88. doi: 10.1136/pgmj.2005.040477.
- [33].Barton JR, Corbett S, van der Vleuten CP, Programme EBCS. The validity and reliability of a Direct Observation of Procedural Skills assessment tool: assessing colonoscopic skills of senior endoscopists. Gastrointestinal Endoscopy. 2012;75(3):591–97. doi: 10.1016/j.gie.2011.09.053.
- [34].Norcini JJ, Blank LL, Arnold GK, Kimball HR. The mini-CEX (clinical evaluation exercise): a preliminary investigation. Annals of Internal Medicine. 1995;123(10):795–99. doi: 10.7326/0003-4819-123-10-199511150-00008.
- [35].Cruess R, McIlroy JH, Cruess S, Ginsburg S, Steinert Y. The professionalism mini-evaluation exercise: A preliminary investigation. Academic Medicine. 2006;81(10):S74–S8. doi: 10.1097/00001888-200610001-00019.
- [36].Tsugawa Y, Ohbu S, Cruess R, Cruess S, Okubo T, Takahashi O, et al. Introducing the Professionalism Mini-Evaluation Exercise (P-MEX) in Japan: results from a multicenter, cross-sectional study. Academic Medicine. 2011;86(8):1026–31. doi: 10.1097/ACM.0b013e3182222ba0.
- [37].Nurudeen SM, Kwakye G, Berry WR, Chaikof EL, Lillemoe KD, Millham F, et al. Can 360-degree reviews help surgeons? Evaluation of multisource feedback for surgeons in a multi-institutional quality improvement project. Journal of the American College of Surgeons. 2015;221(4):837–44. doi: 10.1016/j.jamcollsurg.2015.06.017.
- [38].Bettenhausen KL, Fedor DB. Peer and upward appraisals: a comparison of their benefits and problems. Group & Organization Management. 1997;22(2):236–63.
- [39].Donnon T, Al Ansari A, Al Alawi S, Violato C. The reliability, validity, and feasibility of multisource feedback physician assessment: a systematic review. Academic Medicine. 2014;89(3):511–16. doi: 10.1097/ACM.0000000000000147.
- [40].Drew G. A “360” degree view for individual leadership development. Journal of Management Development. 2009;28(7):581–92.
- [41].Sargeant J, Bruce D, Campbell CM. Practicing physicians’ needs for assessment and feedback as part of professional development. Journal of Continuing Education in the Health Professions. 2013;33:S54–S62. doi: 10.1002/chp.21202.
- [42].Aamodt A, Plaza E. Case-based reasoning: Foundational issues, methodological variations, and system approaches. AI Communications. 1994;7(1):39–59.
- [43].Kolodner J. Case-based reasoning. Morgan Kaufmann; 2014.
- [44].Miller GE. The assessment of clinical skills/competence/performance. Academic Medicine. 1990;65(9):S63–67. doi: 10.1097/00001888-199009000-00045.
- [45].van Luijk SJ, Smits J, Wolfhagen I, Perquin MLF. Assessing professional behaviour and the role of academic advice at the Maastricht Medical School. Medical Teacher. 2000;22(2):168–72.
- [46].Veloski JJ, Fields SK, Boex JR, Blank LL. Measuring professionalism: a review of studies with instruments reported in the literature between 1982 and 2002. Academic Medicine. 2005;80(4):366–70. doi: 10.1097/00001888-200504000-00014.
- [47].Stern DT, Frohna AZ, Gruppen LD. The prediction of professional behaviour. Medical Education. 2005;39(1):75–82. doi: 10.1111/j.1365-2929.2004.02035.x.
- [48].O’Sullivan H, Van Mook W, Fewtrell R, Wass V. Integrating professionalism into the curriculum. Medical Teacher. 2012;34(2):155–57. doi: 10.3109/0142159X.2011.595600.
- [49].McLachlan JC, Finn G, Macnaughton J. The conscientiousness index: a novel tool to explore students’ professionalism. Academic Medicine. 2009;84(5):559–65. doi: 10.1097/ACM.0b013e31819fb7ff.
- [50].Finn G, Sawdon M, Clipsham L, McLachlan J. Peer estimation of lack of professionalism correlates with low Conscientiousness Index scores. Medical Education. 2009;43(10):960–67. doi: 10.1111/j.1365-2923.2009.03453.x.
- [51].Schönrock-Adema J, Heijne-Penninga M, Van Duijn MA, Geertsma J, Cohen-Schotanus J. Assessment of professional behaviour in undergraduate medical education: peer assessment enhances performance. Medical Education. 2007;41(9):836–42. doi: 10.1111/j.1365-2923.2007.02817.x.
- [52].Minter RM, Gruppen LD, Napolitano KS, Gauger PG. Gender differences in the self-assessment of surgical residents. The American Journal of Surgery. 2005;189(6):647–50. doi: 10.1016/j.amjsurg.2004.11.035.
- [53].Kasule OH. Medical professionalism and professional organizations. Journal of Taibah University Medical Sciences. 2013;8(3):137–41.
- [54].Arnold L, Shue CK, Kritt B, Ginsburg S, Stern DT. Medical students’ views on peer assessment of professionalism. Journal of General Internal Medicine. 2005;20(9):819–24. doi: 10.1111/j.1525-1497.2005.0162.x.
- [55].Aldughaither SK, Almazyiad MA, Alsultan SA, Al Masaud AO, Alddakkan ARS, Alyahya BM, et al. Student perspectives on a course on medical ethics in Saudi Arabia. Journal of Taibah University Medical Sciences. 2012;7(2):113–17.
- [56].Gray JD. Global rating scales in residency education. Academic Medicine. 1996;71(1):S55–63. doi: 10.1097/00001888-199601000-00043.
- [57].Harden RM. Misconceptions and the OSCE. Medical Teacher. 2015;37(7):608–10. doi: 10.3109/0142159X.2015.1042443.
- [58].Papadakis MA, Loeser H, Healy K. Early detection and evaluation of professionalism deficiencies in medical students: one school’s approach. Academic Medicine. 2001;76(11):1100–06. doi: 10.1097/00001888-200111000-00010.
- [59].Papadakis MA, Hodgson CS, Teherani A, Kohatsu ND. Unprofessional behaviour in medical school is associated with subsequent disciplinary action by a state medical board. Academic Medicine. 2004;79(3):244–49. doi: 10.1097/00001888-200403000-00011.
- [60].Al-Mohaimeed AA. Medical faculty development: Perceptions about teachers’ characteristics. Journal of Taibah University Medical Sciences. 2015;10(4):405–10.