Abstract
Introduction
The script concordance test (SCT) is one of the best tools for evaluating clinical reasoning in ill-defined clinical situations. The aim of this study was to demonstrate the application of the SCT in otolaryngology residency training.
Methods
A 20-item otolaryngology SCT containing 60 questions was administered to 26 otolaryngology residents. The test was prepared by two otolaryngologists familiar with medical education, and the questions were validated by an expert panel of 9 academic otolaryngologists. Test reliability was assessed with Cronbach's alpha, and the Pearson correlation test was used to compare SCT scores with MCQ scores.
Results
The mean scores were 68.4±5.8 (out of 100) for residents and 78.2±6.4 (out of 100) for experts. There was a significant difference between the two scores (p<0.005). Cronbach's alpha was 0.80.
Conclusion
The SCT is a reliable tool to evaluate clinical reasoning in otolaryngology residents. It should be included in otolaryngology residency training.
Keywords: Training, Otolaryngology, Validity, Reliability
Introduction
Clinical reasoning is one of the most important skills physicians use to solve ill-defined problems (1, 2). Students and residents should therefore be trained to develop strong clinical reasoning skills.
Clinical reasoning enables physicians to choose appropriate diagnostic tools and treatment options in various situations. It is a process by which clinical data from a medical problem are combined with the physician's previous knowledge and experience to manage uncertain clinical situations, and it is an important determinant of a physician's competence. Medical education experts believe that clinical reasoning should be taught to medical students and residents. Several tests have been introduced to assess clinical reasoning, including the key feature (KF) test, the script concordance test (SCT), clinical reasoning problems (CRP), and comprehensive integrative puzzles (CIP). These tests are prepared and validated by experts in their fields (1).
In otolaryngology, as in other clinical fields, physicians are challenged by new and alternative diagnostic and treatment options in what are called ill-defined problems, that is, problems for which more than one acceptable solution exists (1).
According to the script theory of cognitive psychology, scripts are networks of knowledge in our minds. Clinicians use scripts to judge diagnostic and treatment options in uncertain and ill-defined cases (3).
Scripts emerge from the beginning of medical school and are refined during postgraduate training through clinical experience (4). The script concordance test (SCT) is one of the best tools to assess clinical reasoning; it complements other assessment tools but cannot replace them (5). Each SCT item presents a clinical situation (hypothesis) in the domain of diagnosis, treatment, or investigation in its first part. In the second part, examinees are confronted with new information and asked to rate the effect of the new finding on the hypothesis using a five-point Likert scale. Scoring reflects the degree of concordance between the examinee's responses and those of an expert panel (6, 7).
Several studies have examined the validity and reliability of the SCT in radiation oncology (1), urology (8), rheumatology (2), and family medicine (9, 10), but no study has yet been conducted in otolaryngology. The aim of this study was to evaluate the usefulness of the SCT for assessing clinical reasoning in otolaryngology residents.
Methods
The participants were 26 senior (3rd and 4th year) residents of the otolaryngology department of Shiraz University of Medical Sciences. Each examinee received information about the goals and format of the examination before the test, and a sample test was demonstrated to the examinees beforehand. Items and questions were constructed by two otolaryngologists familiar with medical education, in cooperation with the Education and Development Center (EDC) of Shiraz University. The expert panel consisted of 9 board-certified otolaryngologists from our department and another qualified center; they also took the exam so that their scores could be compared with those of the residents.
The test consisted of 20 items and 60 questions in the field of otolaryngology and was designed for residency training. The items were drawn from cases commonly encountered in otolaryngology, and these cases incorporate uncertainty in diagnosis and management (Appendix 1).
SCT scoring
For scoring, we used the formula 1/(1+x), where x is the distance between the selected answer and the modal (correct) answer; on a five-point Likert scale, x ranges from 0 for the modal answer to 4 for the most distant answer. This is an innovative method advised by Bland et al. and used by Amini et al. (5), and it appears to be more accurate than other scoring methods (11).
The maximum score for an answer was 1, awarded when the examinee chose the answer selected by the majority of the panel; the other panel choices received partial credit. Average MCQ scores from 4 annual and periodic examinations were collected for comparison with SCT scores to test for any correlation.
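To make the scoring rule concrete, the following is a minimal Python sketch of the 1/(1+x) rule described above. The answer coding (-2 to +2, as in Appendix 1), the panel data, and the function name are illustrative assumptions, not the study's actual implementation.

```python
# Minimal sketch of the 1/(1+x) SCT scoring rule described above.
# Assumptions (not from the study's implementation): answers are coded on the
# five-point Likert scale -2..+2 (as in Appendix 1), and x is the distance from
# the panel's modal answer, so the modal answer earns the maximum score of 1.
from collections import Counter

def score_question(resident_answer: int, panel_answers: list[int]) -> float:
    """Return 1/(1+x), where x = |resident answer - modal panel answer|."""
    modal_answer = Counter(panel_answers).most_common(1)[0][0]
    x = abs(resident_answer - modal_answer)
    return 1 / (1 + x)

# Hypothetical example: the panel mostly chose +1, the resident chose +2.
panel = [1, 1, 0, 1, 2, 1, -1, 1, 1]   # 9 panel members, as in this study
print(score_question(2, panel))        # 1/(1+1) = 0.5
print(score_question(1, panel))        # modal answer earns full credit, 1.0
```

A resident's total SCT score would then be obtained, for example, by summing these per-question credits across all 60 questions and rescaling to a score out of 100.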
Otolaryngology experts also validated the content of the questions; these experts were not the same otolaryngologists who constructed the exam.
Test reliability was calculated with the Cronbach's alpha coefficient; a coefficient ≥0.80 indicates good reliability. Item analysis was performed using the Whitney and Sabers method to detect problematic questions. Data analysis was done with SPSS software, version 11.0.
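As an illustration of the reliability and item statistics reported below, here is a minimal sketch assuming a hypothetical examinee-by-question score matrix. It uses the standard Cronbach's alpha formula and conventional difficulty and item-total correlation computations, not the Whitney and Sabers procedure or SPSS output itself.

```python
# Sketch of the reliability and item statistics on a hypothetical
# examinee-by-question matrix (rows = examinees, columns = questions,
# each cell a per-question credit between 0 and 1).
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha: k/(k-1) * (1 - sum(item variances) / variance(total))."""
    k = scores.shape[1]
    item_var = scores.var(axis=0, ddof=1)
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var.sum() / total_var)

def item_statistics(scores: np.ndarray):
    """Item difficulty (mean credit) and corrected item-total correlation."""
    difficulty = scores.mean(axis=0)
    totals = scores.sum(axis=1)
    item_total = np.array([np.corrcoef(scores[:, j], totals - scores[:, j])[0, 1]
                           for j in range(scores.shape[1])])
    return difficulty, item_total

# Hypothetical data: 26 examinees x 60 questions with partial credit.
rng = np.random.default_rng(0)
scores = rng.uniform(0.0, 1.0, size=(26, 60))
print(round(cronbach_alpha(scores), 2))
difficulty, item_total = item_statistics(scores)
# Flag questions outside the 0.30-0.80 difficulty window used in this study.
print(np.where((difficulty < 0.30) | (difficulty > 0.80))[0])
```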
Appendix 1.
Case description: a 50-year-old patient with stage II laryngeal carcinoma who underwent radiotherapy. Three months after termination of radiotherapy, the patient was referred with tumor recurrence.

| If you were considering | And then you find | Your recommendation becomes |
|---|---|---|
| Supracricoid laryngectomy | COPD | -2 -1 0 +1 +2 |
| Laser surgery | Chronic bronchitis | -2 -1 0 +1 +2 |
| Total laryngectomy | Thyroid cartilage involvement | -2 -1 0 +1 +2 |

Scoring key: -2: strongly contraindicated; -1: contraindicated; 0: neither more nor less indicated (it does not change your mind); +1: indicated; +2: strongly indicated.
Results
Twenty-six residents and 9 experts participated in the study. Although there was no time limit, all participants completed the test in less than an hour. The mean scores were 68.4±5.8 out of 100 (range 52.8-77.6) for residents and 78.2±6.4 out of 100 (range 62.3-91.7) for the panel.
Item analysis was performed, and two questions (questions 3 and 11) were excluded because of poor item difficulty; an acceptable level of difficulty was between 0.30 and 0.80. The final score was then calculated. The results of the item analysis are shown in Table 1.
Table 1. Item analysis results

| | q1 | q2 | q3 | q4 | q5 | q6 | q7 | q8 | q9 | q10 |
|---|---|---|---|---|---|---|---|---|---|---|
| Item difficulty level | 0.52 | 0.60 | 0.15 | 0.44 | 0.57 | 0.49 | 0.43 | 0.40 | 0.49 | 0.42 |
| Item-total correlation | 0.46 | 0.46 | 0.12 | 0.30 | 0.33 | 0.31 | 0.30 | 0.42 | 0.40 | 0.30 |

| | q11 | q12 | q13 | q14 | q15 | q16 | q17 | q18 | q19 | q20 |
|---|---|---|---|---|---|---|---|---|---|---|
| Item difficulty level | 0.2 | 0.51 | 0.42 | 0.47 | 0.52 | 0.57 | 0.48 | 0.44 | 0.49 | 0.51 |
| Item-total correlation | 0.1 | 0.30 | 0.31 | 0.40 | 0.32 | 0.30 | 0.51 | 0.34 | 0.30 | 0.46 |
There was a significant difference between the mean scores of the residents and the panel (P<0.05). The Cronbach's alpha coefficient for test reliability was 0.80. The correlation between SCT and MCQ scores is shown in Table 2; the correlation was significant at the 0.05 level (2-tailed).
Table 2. Correlation between MCQ and SCT scores

| Pearson correlation | MCQ | SCT |
|---|---|---|
| MCQ: correlation coefficient | 1 | 0.458* |
| MCQ: Sig. (2-tailed) | | 0.019 |
| SCT: correlation coefficient | 0.458* | 1 |
| SCT: Sig. (2-tailed) | 0.019 | |

*Correlation is significant at the 0.05 level (2-tailed).
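For illustration, the following is a minimal sketch of how the two-tailed Pearson correlation in Table 2 could be computed; the score vectors below are hypothetical placeholders, not the study data.

```python
# Sketch of the Pearson correlation reported in Table 2, using
# hypothetical per-resident scores (the real data are not reproduced here).
from scipy import stats

sct_scores = [68.1, 72.4, 61.0, 70.2, 75.3, 66.8, 59.5, 73.0]   # hypothetical
mcq_scores = [15.2, 16.8, 14.1, 16.0, 17.5, 15.0, 13.8, 16.9]   # hypothetical

r, p_two_tailed = stats.pearsonr(sct_scores, mcq_scores)
print(f"Pearson r = {r:.3f}, two-tailed p = {p_two_tailed:.3f}")
```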
All 26 residents and 9 experts signed an informed consent form for the examination. All 26 residents volunteered to take the exam; they were informed that the test was not mandatory and that their scores would not be used for any certification.
Discussion
Physicians need knowledge, experience, and clinical reasoning to solve diagnostic and treatment problems. The script concordance test (SCT) is one of the best tools to evaluate clinical reasoning ability. Clinical reasoning is based on data interpretation, which requires linking previous and new data in an organized manner (7).
With an increasing body of research from other centers, clinical reasoning tests, including the SCT, may be applied routinely in the future.
In this study, we constructed an SCT for otolaryngology residents for the first time to assess its reliability and validity as a potentially valuable tool for the future. The test appeared reliable, with a Cronbach's alpha coefficient of 0.80. Other researchers have used the test in various fields with pre- and postgraduate students. Lambert et al. administered the test to radiation oncology students and residents, with a resulting Cronbach's alpha of 0.90 (1). Mathieu et al. used an online version of the SCT for 5th-year medical students in rheumatology, with a Cronbach's alpha of 0.82 (2). Lemay et al. used the SCT with pediatric students and residents, with an acceptable reliability coefficient of 0.74 (12). Amini et al. used the SCT with top-ranked medical students, with a reliability coefficient of 0.78 (5).
The expert panel had more clinical experience and achieved a higher mean SCT score than the residents, which is compatible with the findings of other studies (1, 2). The SCT may therefore serve as a reflection of clinical experience in pre- and postgraduate medical groups, especially during residency training.
Caire et al. used the SCT as a training tool for neurosurgery interns by applying it as a self-assessment examination that reflected clinical experience (13, 14). Humbert et al. also indicated that the SCT may assess clinical experience in examinees; their study reported a satisfactory Cronbach's alpha of 0.73, which is close to our result (15).
There was a positive correlation between MCQ and SCT scores in our study. This suggests that examinees with good MCQ scores had well-organized knowledge that, together with experience, supported their clinical reasoning.
The major limitation of this study was the lack of familiarity of the panel and examinees with the examination format, despite the pretest instruction; other studies have mentioned this problem (1, 2). Another limitation of our study, and of similar research, is the small number of participants and panel members. Multicenter studies are needed for better assessment of the test.
Conclusion
Our study showed that the script concordance test is a reliable tool to assess clinical reasoning in otolaryngology residents. Additional data from further studies are needed before the test can be recommended for widespread use in residency training.
Acknowledgment
The authors would like to thank Dr. Nasrin Shokrpour at the Center for Development of Clinical Research of Nemazee Hospital for editorial assistance.
Conflict of interest: None declared.
References
1. Lambert C, Gagnon R, Nguyen D, Charlin B. The script concordance test in radiation oncology: validation study of a new tool to assess clinical reasoning. Radiation Oncology. 2009;4:7. doi: 10.1186/1748-717X-4-7.
2. Mathieu S, Couderc M, Glace B, Tournadre A, Malochet-Guinamand S, Pereira B, et al. Construction and utilization of a script concordance test as an assessment tool for DCEM3 (5th year) medical students in rheumatology. BMC Medical Education. 2013;13:166. doi: 10.1186/1472-6920-13-166.
3. Amini M, Moghadami M, Kojuri J, Abbasi H, Abadi AA, Molaee NA, et al. An innovative method to assess clinical reasoning skills: Clinical reasoning tests in the second national medical science Olympiad in Iran. BMC Research Notes. 2011;4:418. doi: 10.1186/1756-0500-4-418.
4. Amini M, Kojuri J, Karimian Z, Lotfi F, Moghadami M, Dehghani M, et al. Talents for future: Report of the second national medical science Olympiad in Islamic Republic of Iran. Iranian Red Crescent Medical Journal. 2011;2011(6):377-81.
5. Charlin B, Roy L, Brailovsky C, Goulet F, Van der Vleuten C. The Script Concordance test: a tool to assess the reflective clinician. Teaching and Learning in Medicine. 2000;12(4):189-95. doi: 10.1207/S15328015TLM1204_5.
6. Fournier JP, Demeester A, Charlin B. Script concordance tests: guidelines for construction. BMC Medical Informatics and Decision Making. 2008;8:18. doi: 10.1186/1472-6947-8-18.
7. Charlin B, Boshuizen HP, Custers EJ, Feltovich PJ. Scripts and clinical reasoning. Medical Education. 2007;41(12):1178-84. doi: 10.1111/j.1365-2923.2007.02924.x.
8. Sibert L, Darmoni SJ, Dahamna B, Hellot MF, Weber J, Charlin B. Online clinical reasoning assessment with Script Concordance test in urology: results of a French pilot study. BMC Medical Education. 2006;6:45. doi: 10.1186/1472-6920-6-45.
9. Gagnon R, Charlin B, Coletti M, Sauve E, Van der Vleuten C. Assessment in the context of uncertainty: how many members are needed on the panel of reference of a script concordance test? Medical Education. 2005;39(3):284-91. doi: 10.1111/j.1365-2929.2005.02092.x.
10. Charlin B, Van der Vleuten C. Standardized assessment of reasoning in contexts of uncertainty: the script concordance approach. Evaluation & the Health Professions. 2004;27(3):304-19. doi: 10.1177/0163278704267043.
11. Bland AC, Kreiter CD, Gordon JA. The psychometric properties of five scoring methods applied to the script concordance test. Academic Medicine. 2005;80(4):395-9. doi: 10.1097/00001888-200504000-00019.
12. Lemay JF, Donnon T, Charlin B. The reliability and validity of a paediatric script concordance test with medical students, paediatric residents and experienced paediatricians. Canadian Medical Education Journal. 2010;1(2):e89-95.
13. Caire F, Sol JC, Charlin B, Isidori P, Moreau JJ. Le test de concordance de script comme outil d'evaluation formative des internes en neurochirurgie [The script concordance test as a formative assessment tool for neurosurgery residents]. Pedagogie Medicale. 2004;5:87-94. doi: 10.1016/s0028-3770(04)98309-9.
14. Caire F, Marin B, Cuny E. Utilisation du test de concordance de script au cours du deuxieme cycle des etudes medicales [Use of the script concordance test during the second cycle of medical studies]. Pedagogie Medicale. 2011;12(1):29-36.
15. Humbert AJ, Johnson MT, Miech E, Friedberg F, Grackin JA, Seidman PA. Assessment of clinical reasoning: a script concordance test designed for pre-clinical medical students. Med Teach. 2011;33(6):472-7. doi: 10.3109/0142159X.2010.531157.