PLoS One. 2020 Mar 26;15(3):e0230792. doi: 10.1371/journal.pone.0230792

Relationships between objective structured clinical examination, computer-based testing, and clinical clerkship performance in Japanese medical students

Nobuyasu Komasawa*, Fumio Terasaki, Takashi Nakano, Ryo Kawata
Editor: Conor Gilligan
PMCID: PMC7098585  PMID: 32214357

Abstract

Background

It is unclear how comprehensive evaluations conducted prior to clinical clerkships (CC), such as the objective structured clinical examination (OSCE) and computer-based testing (CBT), reflect the performance of medical students in CC. Here we retrospectively analyzed correlations between OSCE and CBT scores and CC performance.

Methods

Ethical approval was obtained from our institutional review board. We analyzed correlations between OSCE and CBT scores and CC performance in 94 medical students who took the OSCE and CBT in 2017 when they were 4th year students, and who participated in the basic CC in 2018 when they were 5th year students.

Results

Total scores for OSCE and CBT were significantly correlated with CC performance (P<0.001, each). More specifically, medical interview and chest examination components of the OSCE were significantly correlated with CC performance (P = 0.001, each), while the remaining five components of the OSCE were not.

Conclusion

Our findings suggest that the OSCE and CBT play important roles in predicting CC performance in the Japanese medical education context. Among the OSCE components, the medical interview and chest examination appear to be important for predicting CC performance.

Introduction

Seamless medical education, in which students gradually acquire professional abilities from their undergraduate years through their postgraduate training, is important from the perspective of outcome-based education [1]. To achieve this goal, effective clinical training methods are needed that allow a smooth transition from undergraduate medical education to basic skill acquisition as a postgraduate [2].

In Japan, clinical clerkships (CCs) form the basis of clinical training. In contrast to conventional clinical training, which involves only observation and no practice, CCs have students participate as members of a medical team and perform actual medical procedures and care. The range of medical procedures students are allowed to perform is defined, and procedures are carried out under the supervision of an instructing doctor [3]. This enables students to acquire practical clinical skills; in this regard, students are required to have a sense of identity and personal responsibility [4]. Clinical training throughout the various departments of a hospital is carried out in the form of CCs, which are driven by curricula for diagnosis and treatment [5].

Assuring the basic clinical competency of medical students prior to CCs is essential from a medical safety perspective. To validate this competency, the objective structured clinical examination (OSCE) and computer-based testing (CBT) were introduced in 2005 as standardized tests organized by the Common Achievement Tests Organization (CATO) [http://www.cato.umin.jp/]. The OSCE evaluates clinical and communication skills using simulated patients and simulators [6][7], while the CBT evaluates basic clinical knowledge. Both examinations are mandatory for 4th year students in Japanese medical schools. Medical students are recognized by the Association of Japanese Medical Colleges as "student doctors" once they pass both examinations; after this certification, they can participate in CCs.

Although previous studies have examined CC performance using the mini-clinical evaluation exercise (mini-CEX) [8][9], the relationship between the OSCE or CBT and CC performance has not been fully validated. Furthermore, no study has evaluated which skill components measured in the OSCE reflect student performance in CCs in the Japanese medical education context.

We therefore decided to evaluate the relationships between OSCE components or the CBT and CC performance in the Japanese medical education context. Accordingly, the present study retrospectively analyzed correlations of the various components of the OSCE and the CBT with CC performance.

Materials and methods

Ethical considerations

This study was approved by the Research Ethics Committee of Osaka Medical College (No. 2806). All data were fully anonymized before we accessed them, and the research ethics committee waived the requirement for informed consent.

Study population

As at other medical schools in Japan, medical students at Osaka Medical College take the OSCE and CBT in their 4th year and participate in CCs in their 5th and 6th years. Before taking the OSCE and CBT, students complete all basic and clinical medicine lectures as well as simulation-based skills training. Once they complete their CCs, medical students take the graduation examination. From 2020, CATO will formally introduce a post-CC OSCE to evaluate the clinical skills cultivated during CCs (Fig 1).

Fig 1. Schematic summarizing relationships between objective structured clinical examination (OSCE), computer-based testing (CBT), and clinical clerkships (CCs) at Osaka Medical College.


Subjects of the present study were medical students of Osaka Medical College who were 4th year students in 2017 and 5th year students in 2018. We excluded students who did not advance to 5th year status in 2018.

Study measures

OSCE content and evaluation

The OSCE evaluates various aspects of clinical competency and covers the following seven themes: medical interview, head and neck examination, chest examination, abdominal examination, neurological examination, emergency response, and basic clinical technique. The OSCE is carried out across seven stations, with one station dedicated to a 10-min medical interview and the remaining six stations to physical examinations and basic skills at 5 min each. Within the allotted 5 or 10 minutes, students perform core clinical skills such as the medical interview and physical examinations [10].

In the present study, student performance was evaluated by two examiners using a checklist; the score for each OSCE component was the average of the scores assigned by the two examiners. Examiners rate communication, medical safety, and consultation skills according to the checklist. For standardization, the examiners underwent approximately three hours of evaluator training based on a common text and video provided by CATO. Each student was examined at all seven skill stations, and the total score was calculated as the average of the seven station scores. The examination also includes strict identity checks, in which students' names and ID numbers are validated. Examiners from other universities are routinely invited to validate internal evaluations during the OSCE.
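To make the scoring concrete, the following is a minimal sketch of the calculation described above, using hypothetical scores; it illustrates the two-examiner and seven-station averaging scheme and is not the software actually used in the study.

```python
# Minimal sketch of the OSCE scoring described above (hypothetical data).
STATIONS = [
    "medical interview", "head and neck", "chest", "abdominal",
    "neurological", "emergency response", "basic technique",
]

def station_score(examiner_a: float, examiner_b: float) -> float:
    """Score for one station: average of the two examiners' checklist scores."""
    return (examiner_a + examiner_b) / 2

def total_osce_score(scores: dict) -> float:
    """Total OSCE score: average of the seven station scores."""
    return sum(station_score(a, b) for a, b in scores.values()) / len(scores)

# Hypothetical student: the two examiners give 80 and 84 at every station.
example = {s: (80.0, 84.0) for s in STATIONS}
print(total_osce_score(example))  # 82.0
```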

CBT content and evaluation

The CBT consists of multiple-choice questions and extended matching items; students answer 320 questions on basic clinical knowledge over the course of six hours. Scoring was based on 240 questions whose difficulty and discriminating power had been validated against pooled data from past administrations; the remaining 80 questions were trial items not used for evaluation. The questions are standardized and pretested by CATO. The CBT covers the clinical disciplines and related basic medicine knowledge. CBT scores are machine-calculated, and the percentage of correct answers (the scoring rate) was used for evaluation.
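The scoring rate can be sketched as follows, based on our reading of the description above (the function and item indexing are hypothetical illustrations, not CATO's actual implementation): only the 240 pretested items contribute to the reported percentage.

```python
# Hypothetical CBT scoring: 320 items answered, 240 scored, 80 trial items ignored.
def cbt_scoring_rate(correct: list, scored_items: list) -> float:
    """Percentage of correct answers among the scored items only."""
    hits = sum(1 for i in scored_items if correct[i])
    return 100.0 * hits / len(scored_items)

# Example: a student answers the first 200 of the 240 scored items correctly.
answers = [True] * 200 + [False] * 120           # 320 items in total
print(cbt_scoring_rate(answers, list(range(240))))  # ~83.3
```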

Clinical clerkship (CC) content and evaluation

Medical students participate in a basic CC during their 5th year. The basic CC involves rotations through all clinical departments of the hospital over the course of 32 weeks, with each rotation spanning about one to two weeks. Once students complete the basic CC, they select a discipline they wish to study for a further 14 weeks (Fig 1).

During the CCs, supervising (teaching) doctors in each department evaluate the clinical skills of students using an evaluation sheet based on the mini-CEX and Direct Observation of Procedural Skills (DOPS) [11][12]. The accomplishment score consists of a 5-point evaluation sheet covering 16 items (80%), a subjective evaluation by the organizer of each department (10%), and a written report (10%) (Fig 2).

Fig 2. Contents of the clinical clerkship (CC) evaluation at our college.


The accomplishment score consists of a 5-point evaluation sheet covering 16 items (80%), a subjective evaluation by the organizer (10%), and a written report (10%).

Scores for each CC are collected by the medical education center and are used to calculate an average score. In this study, we used the basic CC (32 weeks) score, since all medical students are required to participate in the basic CC.
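For illustration, the accomplishment score for a single rotation can be expressed as the weighted composite described above. This is a sketch with assumed field names and an assumed rescaling of the 16-item sheet to a 0-100 scale; the actual sheet is summarized in Fig 2.

```python
# Hypothetical CC composite: 16 items on a 5-point scale (80%),
# organizer's subjective evaluation (10%), and written report (10%),
# with the subjective and report marks assumed to be on a 0-100 scale.
def cc_score(item_ratings: list, subjective: float, report: float) -> float:
    assert len(item_ratings) == 16 and all(1 <= r <= 5 for r in item_ratings)
    sheet = 100.0 * sum(item_ratings) / (5 * len(item_ratings))  # rescale to 0-100
    return 0.8 * sheet + 0.1 * subjective + 0.1 * report

# Example: all 16 items rated 4/5, subjective 85, report 90.
print(cc_score([4] * 16, 85.0, 90.0))  # 81.5
```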

Statistical analysis

Statistical analyses were performed using JMP® 11 (SAS Institute Inc., Cary, NC, USA). Correlations were assessed using Pearson's correlation test. Data are presented as mean ± SD. P < 0.05 was considered statistically significant.
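The analysis is a standard Pearson correlation. A minimal equivalent in Python (scipy in place of JMP; the data here are simulated at the study's sample size, with the mean and SD of the total OSCE score taken from Table 1 and the linear relationship invented for illustration) is:

```python
# Pearson's correlation test, as in the study but on simulated data (n = 94).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
osce_total = rng.normal(87.5, 3.9, 94)             # mean/SD as in Table 1
cc = 0.2 * osce_total + rng.normal(60.0, 2.0, 94)  # hypothetical relation

r, p = stats.pearsonr(osce_total, cc)
print(f"R = {r:.3f}, R^2 = {r * r:.3f}, P = {p:.4g}")  # significant if P < 0.05
```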

Results

We analyzed the scores of 94 medical students who participated in the OSCE and CBT in 2017 and in the basic CC in 2018. As shown in Table 1, medical students generally achieved scores of approximately 80% to 90% on the OSCE, CBT, and CC.

Table 1. Scores for objective structured clinical examination (OSCE), computer-based testing (CBT), and clinical clerkship (CC).

Component                        Average    SD
Medical interview                80.3       9.9
Head and neck examination        92.2       6.6
Chest examination                81.8       10.8
Abdominal examination            94.3       4.9
Neurological examination         90.8       7.6
Emergency response               90.6       6.6
Basic technique                  82.4       5.8
Total OSCE score                 87.5       3.9
Computer-based testing (CBT)     80.1       7.5
Clinical clerkship (CC)          78.0       2.1

Correlations of OSCE and CBT scores with CC scores

Correlations of OSCE and CBT scores with CC scores are shown in Table 2. Total scores for the OSCE and CBT were significantly correlated with CC scores (P < 0.001 each). When analyzed by OSCE component, medical interview and chest examination scores were significantly correlated with CC scores (P = 0.001 each), while the remaining five component scores were not.

Table 2. Correlations of objective structured clinical examination (OSCE) and computer-based testing scores with clinical clerkship (CC) scores.

Component                        R        R²       P
Medical interview                0        0.075    0.001*
Head and neck examination        0.059    0.004    0.570
Chest examination                0.329    0.108    0.001*
Abdominal examination            0.166    0.028    0.109
Neurological examination         0.158    0.025    0.130
Emergency response               0.124    0.015    0.234
Basic technique                  0.161    0.026    0.122
Total OSCE score                 0.377    0.141    <0.001*
Computer-based testing (CBT)     0.346    0.120    <0.001*

*P<0.05.

Correlations between OSCE and CBT scores

We evaluated correlations between OSCE and CBT scores and found no significant correlation between them; neither the total OSCE score nor any individual component score was significantly correlated with the CBT score (Table 3).

Table 3. Correlations between scores of objective structured clinical examination (OSCE) components and computer-based testing (CBT).

Component                        R         R²       P
Medical interview                -0.045    0.021    0.667
Head and neck examination        0.0618    0.004    0.560
Chest examination                0.179     0.032    0.085
Abdominal examination            -0.021    0.0005   0.841
Neurological examination         -0.059    0.003    0.579
Emergency response               0.044     0.002    0.672
Basic technique                  0.04      0.002    0.702
Total OSCE score                 0.063     0.004    0.549

Discussion

Our study showed that total scores for the OSCE and CBT were significantly correlated with CC scores, suggesting that the OSCE and CBT can be effective indicators of CC performance in Japanese medical education. In the component-level analysis, medical interview and chest examination scores were significantly correlated with CC scores.

Physical examinations and medical interviews are essential skills, and the information they yield is important for diagnosis and treatment during CCs [13][14]. In clinical settings, it is not rare for physical findings to be overlooked or evaluated incorrectly. Incorrect assessment of physical findings can lead to diagnostic errors, which may result in adverse outcomes for patients [15][16]. Accordingly, from the perspectives of clinical competency and outcome-based education, assuring the quality of both the technical and non-technical skills of medical students before CCs is essential.

In the present study, total scores for the OSCE and CBT showed significant correlations with CC performance, as reflected in CC scores. These data support the OSCE and CBT as measures for assuring competency prior to participation in CCs. Interestingly, no component of the OSCE was significantly correlated with the total CBT score. This suggests that the competencies evaluated by the OSCE and CBT differ, and that a combination of both may provide a better picture of the competency of medical students prior to CCs.

When OSCE components were considered individually, the medical interview and chest examination components were significantly correlated with CC performance, while the remaining five components were not. One potential explanation is that, of the seven OSCE components, medical interviews and chest examinations are performed most often during CCs. Thus, focusing training on these skills may contribute to better CC performance [17][18].

In contrast, components other than the medical interview and chest examination were not significantly correlated with CC performance. One possible reason is the lack of opportunities to use such skills: for example, medical students are not permitted to perform emergency responses such as advanced life support on their own [19][20]. Because students are expected to acquire these basic clinical competencies only after certification as medical doctors, educational methods to compensate for this gap are warranted [21].

To overcome this problem, we believe that simulation-based education (SBE) can be a powerful means of compensating for the lack of opportunities to exercise these skills [22]. As SBE methods have been developed and are widely used for acquiring both technical and non-technical skills in medical education [23][24], a combination of SBE and CC could maximize the competency of medical students. For example, medical students can rehearse, on a simulator, a resuscitation they observed in the emergency ward. Such a combination could enhance CC performance.

Medical educators are expected to improve CC programs by incorporating SBE methods to compensate for infrequently practiced clinical skills [25]. They can also use SBE for formative assessment to improve teaching and learning in clinical settings.

This study has a number of limitations worth noting. First, as data were obtained from a single institution, our findings may not be generalizable to other medical schools [26][27]; however, our results likely apply to medical schools in Japan, given the core medical curriculum adopted throughout the country. Second, CBT scores were generally high with little variation, which may have biased the correlation analysis. Third, to perform the correlation analysis more accurately, we excluded students who did not progress to the 5th year, as the content of the CC may differ from year to year; this exclusion may itself have introduced bias. Fourth, the total OSCE score was calculated as the simple average of the seven stations, which may lack statistical justification. Lastly, we evaluated only the overall CC score. Correlations between the OSCE and CBT and specific aspects of the CC may provide further insight into how these instruments relate to actual medical practice by medical students. In this regard, it will be interesting to evaluate the relationship between CC performance and post-CC OSCE scores once the post-CC OSCE is implemented in the 2020 curriculum year.

In conclusion, our findings suggest that the OSCE and CBT play important roles in predicting CC performance. Among the OSCE components, the medical interview and chest examination were particularly relevant for predicting CC performance.

Supporting information

S1 Data

(XLSX)

Data Availability

The data underlying the results presented in the study are available in the attached Supporting Information (S1 Data).

Funding Statement

The authors have no affiliation with any manufacturer of any device described in the manuscript and declare no financial interest in relation to the material described in the manuscript. Financial support for the study was provided by Osaka Medical College which had no role in study design, data collection and analysis, publication decisions, or manuscript preparation.

References

  • 1.Ellaway RH, Graves L, Cummings BA (2016) Dimensions of integration, continuity and longitudinality in clinical clerkships. Med Educ 50:912–921. 10.1111/medu.13038 [DOI] [PubMed] [Google Scholar]
  • 2.Hudson JN, Poncelet AN, Weston KM, Bushnell JA, Farmer EA (2017) Longitudinal integrated clerkships. Med Teach 39:7–13. 10.1080/0142159X.2017.1245855 [DOI] [PubMed] [Google Scholar]
  • 3.Takahashi N, Aomatsu M, Saiki T, Otani T, Ban N (2018) Listen to the outpatient: qualitative explanatory study on medical students' recognition of outpatients' narratives in combined ambulatory clerkship and peer role-play. BMC Med Educ 18:229 10.1186/s12909-018-1336-6 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 4.Murata K, Sakuma M, Seki S, Morimoto T (2014) Public attitudes toward practice by medical students: a nationwide survey in Japan. Teach Learn Med 26:335–343. 10.1080/10401334.2014.945030 [DOI] [PubMed] [Google Scholar]
  • 5.Okayama M, Kajii E (2011) Does community-based education increase students' motivation to practice community health care?—a cross sectional study. BMC Med Educ 11:19 10.1186/1472-6920-11-19 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 6.Tagawa M, Imanaka H (2010) Reflection and self-directed and group learning improve OSCE scores. Clin Teach 7:266–270. 10.1111/j.1743-498X.2010.00377.x [DOI] [PubMed] [Google Scholar]
  • 7.Ishikawa H, Hashimoto H, Kinoshita M, Fujimori S, Shimizu T, Yano E (2006) Evaluating medical students' non-verbal communication during the objective structured clinical examination. Med Educ 40:1180–1187. 10.1111/j.1365-2929.2006.02628.x [DOI] [PubMed] [Google Scholar]
  • 8.Humphrey-Murto S, Côté M, Pugh D, Wood TJ (2018) Assessing the Validity of a Multidisciplinary Mini-Clinical Evaluation Exercise. Teach Learn Med 30:152–161. 10.1080/10401334.2017.1387553 [DOI] [PubMed] [Google Scholar]
  • 9.Rogausch A, Beyeler C, Montagne S, Jucker-Kupper P, Berendonk C, Huwendiek S, et al. (2015) The influence of students' prior clinical skills and context characteristics on mini-CEX scores in clerkships–a multilevel analysis. BMC Med Educ 15:208. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 10.Tagawa M, Imanaka H (2010) Reflection and self-directed and group learning improve OSCE scores. Clin Teach 7:266–270. 10.1111/j.1743-498X.2010.00377.x [DOI] [PubMed] [Google Scholar]
  • 11.Lörwald AC, Lahner FM, Mooser B, Perrig M, Widmer MK, Greif R, et al. (2019) Influences on the implementation of Mini-CEX and DOPS for postgraduate medical trainees' learning: A grounded theory study. Med Teach 41:448–456. 10.1080/0142159X.2018.1497784 [DOI] [PubMed] [Google Scholar]
  • 12.Lörwald AC, Lahner FM, Nouns ZM, Berendonk C, Norcini J, Greif R, et al. (2018) The educational impact of Mini-Clinical Evaluation Exercise (Mini-CEX) and Direct Observation of Procedural Skills (DOPS) and its association with implementation: A systematic review and meta-analysis. PLoS One 13:e0198009 10.1371/journal.pone.0198009 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 13.Bowen JL (2006) Educational strategies to promote clinical diagnostic reasoning. N Engl J Med 355:2217–2225. 10.1056/NEJMra054782 [DOI] [PubMed] [Google Scholar]
  • 14.Nomura S, Tanigawa N, Kinoshita Y, Tomoda K (2015) Trialing a new clinical clerkship record in Japanese clinical training. Adv Med Educ Pract 6:563–565. 10.2147/AMEP.S90295 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 15.Shikino K, Ikusaka M, Ohira Y, Miyahara M, Suzuki S, Hirukawa M, et al. (2015) Influence of predicting the diagnosis from history on the accuracy of physical examination. Adv Med Educ Pract 6:143–148. 10.2147/AMEP.S77315 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 16.Kirkman MA, Sevdalis N, Arora S, Baker P, Vincent C, Ahmed M (2015). The outcomes of recent patient safety education interventions for trainee physicians and medical students: a systematic review. BMJ Open 5:e007705 10.1136/bmjopen-2015-007705 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 17.Kassam A, Cowan M, Donnon T (2016) An objective structured clinical exam to measure intrinsic CanMEDS roles. Med Educ Online 21:31085 10.3402/meo.v21.31085 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 18.Eftekhar H, Labaf A, Anvari P, Jamali A, Sheybaee-Moghaddam F (2012) Association of the pre-internship objective structured clinical examination in final year medical students with comprehensive written examinations. Med Educ Online 17:15958. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 19.Couto LB, Durand MT, Wolff ACD, Restini CBA, Faria M Jr, Romão GS, et al. (2019) Formative assessment scores in tutorial sessions correlates with OSCE and progress testing scores in a PBL medical curriculum. Med Educ Online 24:1560862 10.1080/10872981.2018.1560862 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 20.Dong T, Saguil A, Artino AR Jr, Gilliland WR, Waechter DM, Lopreaito J, et al. (2012) Relationship between OSCE scores and other typical medical school performance indicators: a 5-year cohort study. Mil Med 177:44–46. 10.7205/milmed-d-12-00237 [DOI] [PubMed] [Google Scholar]
  • 21.Mukohara K, Kitamura K, Wakabayashi H, Abe K, Sato J, Ban N (2004) Evaluation of a communication skills seminar for students in a Japanese medical school: a non-randomized controlled study. BMC Med Educ 4:24 10.1186/1472-6920-4-24 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 22.Hough J, Levan D, Steele M, Kelly K, Dalton M (2019) Simulation-based education improves student self-efficacy in physiotherapy assessment and management of paediatric patients. BMC Med Educ 19:463. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 23.Offiah G, Ekpotu LP, Murphy S, Kane D, Gordon A, O'Sullivan M, et al. (2019) Evaluation of medical student retention of clinical skills following simulation training. BMC Med Educ 19:263 10.1186/s12909-019-1663-2 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 24.Dong T, Zahn C, Saguil A, Swygert KA, Yoon M, Servey J, et al. (2017) The Associations Between Clerkship Objective Structured Clinical Examination (OSCE) Grades and Subsequent Performance. Teach Learn Med 29:280–285. 10.1080/10401334.2017.1279057 [DOI] [PubMed] [Google Scholar]
  • 25.Schiekirka S, Reinhardt D, Heim S, Fabry G, Pukrop T, Anders S, et al. (2012) Student perceptions of evaluation in undergraduate medical education: a qualitative study from one medical school. BMC Med Educ 12:45 10.1186/1472-6920-12-45 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 26.Näpänkangas R, Karaharju-Suvanto T, Pyörälä E, Harila V, Ollila P, Lähdesmäki R, et al. (2016) Can the results of the OSCE predict the results of clinical assessment in dental education? Eur J Dent Educ 20:3–8. 10.1111/eje.12126 [DOI] [PubMed] [Google Scholar]
  • 27.Sahu PK, Chattu VK, Rewatkar A, Sakhamuri S (2019) Best practices to impart clinical skills during preclinical years of medical curriculum. J Educ Health Promot 8:57 10.4103/jehp.jehp_354_18 [DOI] [PMC free article] [PubMed] [Google Scholar]

Decision Letter 0

Conor Gilligan

20 Jan 2020

PONE-D-19-29629

Relationships between Objective Structured Clinical Examination, Computer-based Testing, and Clinical Clerkship Performance in Japanese Medical Students

PLOS ONE

Dear Dr Komasawa,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

We would appreciate receiving your revised manuscript by Mar 05 2020 11:59PM. When you are ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter.

To enhance the reproducibility of your results, we recommend that if applicable you deposit your laboratory protocols in protocols.io, where a protocol can be assigned its own identifier (DOI) such that it can be cited independently in the future. For instructions see: http://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols

Please include the following items when submitting your revised manuscript:

  • A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). This letter should be uploaded as separate file and labeled 'Response to Reviewers'.

  • A marked-up copy of your manuscript that highlights changes made to the original version. This file should be uploaded as separate file and labeled 'Revised Manuscript with Track Changes'.

  • An unmarked version of your revised paper without tracked changes. This file should be uploaded as separate file and labeled 'Manuscript'.

Please note while forming your response, if your article is accepted, you may have the opportunity to make the peer review history publicly available. The record will include editor decision letters (with reviews) and your responses to reviewer comments. If eligible, we will contact you to opt in or out.

We look forward to receiving your revised manuscript.

Kind regards,

Conor Gilligan

Academic Editor

PLOS ONE

Additional Editor Comments (if provided):

This paper addresses an important topic in medical education but some further detail and justification is needed to strengthen the paper and ensure it makes a valuable contribution. Both reviewers make helpful points which you should take into consideration in preparing a revised version. In particular, I agree that the study needs clearer justification and I would think that further discussion of the theories associated with competency-based assessment and the challenges of teaching and assessment in clinical settings would be an important addition. Also, please provide further detail on the methods including the assessment measures and scoring systems. The discussion will need to change to reflect these changes to other sections.

Journal Requirements:

When submitting your revision, we need you to address these additional requirements:

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at http://www.plosone.org/attachments/PLOSOne_formatting_sample_main_body.pdf and http://www.plosone.org/attachments/PLOSOne_formatting_sample_title_authors_affiliations.pdf

2. In ethics statement in the manuscript and in the online submission form, please provide additional information about the records used in your retrospective study. Specifically, please ensure that you have discussed whether all data were fully anonymized before you accessed them and/or whether the IRB or ethics committee waived the requirement for informed consent. If students provided informed written consent to have data from their medical records used in research, please include this information.

3. Thank you for stating the following in the Declaration of Interests Section of your manuscript:

"Financial support for the study was provided by Osaka Medical College which had no role in study design,

data collection and analysis, publication decisions, or manuscript preparation."

We note that you have provided funding information that is not currently declared in your Funding Statement. However, funding information should not appear in the Acknowledgments section or other areas of your manuscript. We will only publish funding information present in the Funding Statement section of the online submission form.

Please remove any funding-related text from the manuscript and let us know how you would like to update your Funding Statement. Currently, your Funding Statement reads as follows:

"The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript."

4. Please include captions for your Supporting Information files at the end of your manuscript, and update any in-text citations to match accordingly. Please see our Supporting Information guidelines for more information: http://journals.plos.org/plosone/s/supporting-information.

[Note: HTML markup is below. Please do not edit.]

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes

Reviewer #2: Yes

**********

2. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes

Reviewer #2: Yes

**********

3. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes

Reviewer #2: Yes

**********

4. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes

Reviewer #2: Yes

**********

5. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: Thank you for the opportunity to review this paper. It is an interesting piece of work; however, there are issues that need to be addressed and clarified by the authors in different parts of the manuscript.

Introduction

The introduction lacks in depth discussion and information about the relationship between OSCE, written examination and Mini-CEX. The authors have provided the organizational context in the introduction but have failed to highlight what is already known in the literature about the relationship between OSCEs, written examination and Mini-CEX. Some examples of published literature include

• Susan Humphrey-Murto, Mylène Côté, Debra Pugh & Timothy J. Wood (2018) Assessing the Validity of a Multidisciplinary Mini-Clinical Evaluation Exercise, Teaching and Learning in Medicine, 30:2, 152-161, DOI: 10.1080/10401334.2017.1387553

• Rogausch A, Beyeler C, Montagne S, Jucker-Kupper P, Berendonk C, Huwendiek S, Gemperli A, Himmel W. The influence of students’ prior clinical skills and context characteristics on mini-CEX scores in clerkships–a multilevel analysis. BMC medical education. 2015 Dec;15(1):208.

Given that there is published evidence highlighting the relationship between the above variables, the authors need to identify the gap and state the justification for the study.

In addition, some of the references listed, such as references one (1) to four (4), do not reflect the statements written by the authors in the first paragraph. Please use appropriate references.

Methods

Study measures

OSCE content and evaluation

The authors stated that seven aspects were covered in the OSCE. Could the authors state the clinical disciplines that were examined? Was it only one discipline or was the OSCE conducted in all disciplines (Internal Medicine, Pediatrics, Obs and Gyn, Surgery, Family medicine, Emergency, and Anesthesiology)?

How was the overall score calculated?

CBT Content and Evaluation

What content was covered? Were the clinical disciplines assessed? If so, please state it.

Clinical clerkship content and evaluation

The authors have stated that an evaluation sheet based on Mini-CEX and DOPS was used. Could the authors include a copy of the evaluation sheet?

In addition, the authors have stated that the basic clinical clerkship was conducted across all disciplines. How were the average and overall scores for the mini-CEX calculated, as well as the scores for the different clinical disciplines? Was the form adapted for the different clinical disciplines? If so, please provide the details.

Statistical analysis

The statistical analysis should be updated based on the information provided above related to the methods.

Results

As stated in the methods section, the authors need to provide further information on how the reported average scores were calculated for the OSCE, CBT, and CC. This applies to all sections of the results. Given the above issues, the results reported by the authors cannot be verified.

Discussion

Given the concerns raised about the methodology and results, the discussion needs to be re-written to align with the updated information.

However, the first paragraph in the discussion is a repetition of the first paragraph in the introduction. The authors need to consider how to present the findings of the study in relation to existing evidence.

Other issues

Some references listed in text do not reflect the statements written by the authors. It is important for the authors to use appropriate references.

Reviewer #2: This manuscript is very well written. It looks at the relationship between the objective structured clinical examination, computer-based testing, and clinical clerkship performance in Japanese medical students. The transition from medical student to practising doctor is a very topical issue in medical education at the moment, and hence this manuscript is very welcome.

Overall, the paper has been written very well with good statistical rigour; however, there are a few clarifications that are needed.

1. It would be good to know what teaching was done prior to the test. Did all 94 students attend the session? Was it mandatory? Was it right before the test, or was there a lag period? This was not clear in the manuscript and would have a significant impact on the results. Please clarify.

2. What elements of the examination (head and neck exam, neurological examination, etc.) were expected in the five-minute OSCE period? So, specifically for the chest exam, was it just the precordium that was examined in the five minutes? For the neurological exam, were all aspects of sensation, proprioception, and reflexes for the upper and lower limbs done in the five minutes? Please clarify.

3. Did the two examiners' scores correlate? An average was used. Was examiner training provided to ensure consistency?

4. Typo in paragraph 2 of study measures. 'the examination also strictly checks the identification of students.....'

5. For the computer-based testing, were the 320 questions standard tested beforehand? If so, was this to the standard of a fourth-year or a fifth-year medical student?

6. In addition, clarification should be provided as to what aspects of clinical knowledge were tested. Was it all subjects, or just the acute medicine and acute surgery applied during the clinical clerkship?

7. It looks like the mini-clinical evaluation exercise and the direct observation of procedural skills assessment tools were graded for correlation purposes. Could the grading/mark sheets be provided, please? Traditionally these assessments are used for feedback purposes and are not usually graded.

Many thanks.

**********

6. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No

Reviewer #2: Yes: Gozie Offiah

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files to be viewed.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email us at figures@plos.org. Please note that Supporting Information files do not need this step.

PLoS One. 2020 Mar 26;15(3):e0230792. doi: 10.1371/journal.pone.0230792.r002

Author response to Decision Letter 0


13 Feb 2020

To the Editor (Prof. Conor Gilligan, M.D.)

Editor comments: This paper addresses an important topic in medical education but some further detail and justification is needed to strengthen the paper and ensure it makes a valuable contribution. Both reviewers make helpful points which you should take into consideration in preparing a revised version. In particular, I agree that the study needs clearer justification and I would think that further discussion of the theories associated with competency-based assessment and the challenges of teaching and assessment in clinical settings would be an important addition. Also, please provide further detail on the methods including the assessment measures and scoring systems. The discussion will need to change to reflect these changes to other sections.

Thank you for appreciating the clinical importance regarding our article.

We revised the manuscript as faithfully as possible according to the insightful comments of the reviewers, especially regarding the justification and the discussion of the theory of competency-based assessment and the challenges of teaching and assessment. We believe that simulation-based education can be a solution for competency-based assessment and for resolving the problems associated with teaching and learning in clinical settings.

We believe that the quality of our article improved significantly by their valuable comments.

To Reviewer 1

Thank you for appreciating the educational significance of our article.

1. Reviewer comments; Introduction

The introduction lacks in depth discussion and information about the relationship between OSCE, written examination and Mini-CEX. The authors have provided the organizational context in the introduction but have failed to highlight what is already known in the literature about the relationship between OSCEs, written examination and Mini-CEX. Some examples of published literature include

• Susan Humphrey-Murto, Mylène Côté, Debra Pugh & Timothy J. Wood (2018) Assessing the Validity of a Multidisciplinary Mini-Clinical Evaluation Exercise, Teaching and Learning in Medicine, 30:2, 152-161, DOI: 10.1080/10401334.2017.1387553

• Rogausch A, Beyeler C, Montagne S, Jucker-Kupper P, Berendonk C, Huwendiek S, Gemperli A, Himmel W. The influence of students’ prior clinical skills and context characteristics on mini-CEX scores in clerkships–a multilevel analysis. BMC medical education. 2015 Dec;15(1):208.

Given that there is published evidence highlighting the relationship between the above variables, the authors need to identify the gap and state the justification for the study.

In addition, some of the references listed, such as references one (1) to four (4), do not reflect the statements written by the authors in the first paragraph. Please use appropriate references.

Thank you for constructive comments regarding our article. According to the suggestion of Reviewer#1, we re-wrote the introduction part for justification. We also changed the references to appropriate ones.

2. Reviewer comments; Methods

Study measures

OSCE content and evaluation

The authors stated that seven aspects were covered in the OSCE. Could the authors state the clinical disciplines that were examined? Was it only one discipline or was the OSCE conducted in all disciplines (Internal Medicine, Pediatrics, Obs and Gyn, Surgery, Family medicine, Emergency, and Anesthesiology)?

How was the overall score calculated?

CBT Content and Evaluation

What content was covered? Were the clinical disciplines assessed? If so, please state it.

Clinical clerkship content and evaluation

The authors have stated that an evaluation sheet based on Mini-CEX and DOPS was used. Could the authors include a copy of the evaluation sheet?

In addition, the authors have stated that the basic clinical clerkship was conducted across all disciplines. How were the average and overall scores for the mini-CEX calculated, as well as the scores for the different clinical disciplines? Was the form adapted for the different clinical disciplines? If so, please provide the details.

Statistical analysis

The statistical analysis should be updated based on the information provided above related to the methods.

Thank you for constructive comments regarding our article. According to the suggestion of Reviewer#1, we changed our manuscript accordingly.

1. We apologize for the unclear expression. We performed a basic OSCE that is common to all disciplines. The seven components evaluated in the OSCE were the medical interview, chest examination, abdominal examination, head and neck examination, neurological examination, emergency response, and basic technique. We added this information clearly.

2. The overall score was calculated as the average of the seven OSCE stations.

3. The CBT included basic clinical disciplines and related basic medicine knowledge.

4. We also added a summary of the clinical clerkship evaluation sheet as Figure 2. We designed this sheet so that scoring and evaluation can be performed consistently.

3. Reviewer comments; Results

As stated in the methods section, the authors need to provide further information on how the reported average scores were calculated for the OSCE, CBT, and CC. This applies to all sections of the results. Given the above issues, the results reported by the authors cannot be verified.

Thank you for constructive comments regarding our article. According to the suggestion of Reviewer#1, we clarified the average score calculation method. We also re-checked our statistical methods and confirmed the results.

4. Reviewer comments; Discussion

Given the concerns raised about the methodology and results, the discussion needs to be re-written to align with the updated information.

However, the first paragraph in the discussion is a repetition of the first paragraph in the introduction. The authors need to consider how to present the findings of the study in relation to existing evidence.

Thank you for constructive comments regarding our article. According to the suggestion of Reviewer#1, we re-wrote the discussion part.

We sincerely appreciate your insightful suggestions regarding our article. We believe that our manuscript has been significantly improved by your valuable comments.

To Reviewer 2

Thank you very much for appreciating the educational significance of our article.

We all agree with all your insightful comments and revised the manuscript accordingly.

1. Reviewer comment; It would be good to know what teaching was done prior to the test. Did all 94 students attend the session? Was it mandatory? Was it right before the test, or was there a lag period? This was not clear in the manuscript and would have a significant impact on the results. Please clarify.

Thank you for constructive comments regarding our article. According to the suggestion of Reviewer#2, we added the concise information of the OSCE evaluation in the introduction and method part.

2. Reviewer comment; What elements of the examination (head and neck exam, neurological examination, etc.) were expected in the five-minute OSCE period? So, specifically for the chest exam, was it just the precordium that was examined in the five minutes? For the neurological exam, were all aspects of sensation, proprioception, and reflexes for the upper and lower limbs done in the five minutes? Please clarify.

Thank you for constructive comments regarding our article. According to the suggestion of Reviewer#2, we added more information about the OSCE. The scenario theme was randomly selected, and medical students performed core examination skills in each part. We added this in the method part.

3. Reviewer comment; Did the two examiners' scores correlate? An average was used. Was examiner training provided to ensure consistency?

Thank you for constructive comments regarding our article. We confirmed that the two examiners' scores were correlated. Also, we provide three hours of evaluator training before the OSCE. According to the suggestion of Reviewer#2, we added this information about examiner training for the OSCE.

4. Reviewer comment; Typo in paragraph 2 of study measures. 'the examination also strictly checks the identification of students.....'

Thank you for constructive comments regarding our article. According to the suggestion of Reviewer#2, we corrected the expression.

5. Reviewer comment; For the computer-based testing, were the 320 questions standard tested beforehand? If so, was this to the standard of a fourth-year or a fifth-year medical student?

Thank you for constructive comments regarding our article. The standard is set for 4th-year students. According to the suggestion of Reviewer#2, we added this in the introduction part.

6. Reviewer comment; In addition, clarification should be provided as to what aspects of clinical knowledge were tested. Was it all subjects, or just the acute medicine and acute surgery applied during the clinical clerkship?

Thank you for constructive comments regarding our article. The CBT includes basic clinical knowledge in all disciplines. According to the suggestion of Reviewer#2, we added this information in the method part.

7. Reviewer comment; It looks like the mini-clinical evaluation exercise and the direct observation of procedural skills assessment tools were graded for correlation purposes. Could the grading/mark sheets be provided, please? Traditionally these assessments are used for feedback purposes and are not usually graded.

Thank you for constructive comments regarding our article. According to the suggestion of Reviewer#2, we added the clinical clerkship grading sheet as Figure 2. However, the OSCE evaluation sheet is subject to restrictions imposed by the Common Achievement Tests Organization in Japan. We added the OSCE evaluation standards in the methods part as far as possible, with citation.

We sincerely appreciate your insightful suggestions regarding our article. We believe that our manuscript has been significantly improved by your valuable comments.

Attachment

Submitted filename: response to reviewers.doc

Decision Letter 1

Conor Gilligan

2 Mar 2020

PONE-D-19-29629R1

Relationships between Objective Structured Clinical Examination, Computer-based Testing, and Clinical Clerkship Performance in Japanese Medical Students

PLOS ONE

Dear Dr Komasawa,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

We would appreciate receiving your revised manuscript by Apr 16 2020 11:59PM. When you are ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter.

To enhance the reproducibility of your results, we recommend that if applicable you deposit your laboratory protocols in protocols.io, where a protocol can be assigned its own identifier (DOI) such that it can be cited independently in the future. For instructions see: http://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols

Please include the following items when submitting your revised manuscript:

  • A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). This letter should be uploaded as separate file and labeled 'Response to Reviewers'.

  • A marked-up copy of your manuscript that highlights changes made to the original version. This file should be uploaded as separate file and labeled 'Revised Manuscript with Track Changes'.

  • An unmarked version of your revised paper without tracked changes. This file should be uploaded as separate file and labeled 'Manuscript'.

Please note while forming your response, if your article is accepted, you may have the opportunity to make the peer review history publicly available. The record will include editor decision letters (with reviews) and your responses to reviewer comments. If eligible, we will contact you to opt in or out.

We look forward to receiving your revised manuscript.

Kind regards,

Conor Gilligan

Academic Editor

PLOS ONE

Additional Editor Comments (if provided):

The authors have attempted to address the reviewers' comments but in many cases have done so superficially, resulting in a paper that continues to lack clarity.

1. The conclusion section of the abstract needs to make a stronger ‘so what?’ point.

2. Reviewer 1 suggested a deeper exploration of the topic in the introduction and I do not feel that this has been addressed.

3. On page 3 – ‘assuring basic clinical competency is essential’ – are the authors referring to achieving this prior to CC?

4. On page 4, the authors have added some detail, but the writing is unclear and includes repetition. The paragraph starting ‘The OSCE…’ needs to be revised.

5. Page 5 please delete ‘We have discussed that’ and begin this sentence with ‘All data’

6. Why were students who did not progress excluded? Might this have introduced bias? Also, I would expect that this group might increase the variance in the findings which would be statistically helpful.

7. On page 6 – what is the statistical justification for having 7 stations – is there evidence that this provides sufficient data for assessment?

8. Also on page 6 the bracketed ‘(i.e chest….’) is not needed as this information has already been provided.

9. Was the training ‘based on common text’ purely written preparation? Is there any assessor standardisation?

10. I am surprised at such high scores, particularly on the CBT – is item analysis available and how is the standard set for this examination? There is very limited variance across all scores – this should be addressed as a limitation in making judgments about the correlations or lack thereof.

11. On page 11, second paragraph there is repetition – the discussion in general is repetitive and offers limited analysis of the findings. I encourage the authors to explore the potential implications more deeply to clarify a ‘so what?’ message.

12. The English grammar at the bottom of page 11 is clumsy and needs revision.

[Note: HTML markup is below. Please do not edit.]

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation.

Reviewer #2: All comments have been addressed

**********

2. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #2: Yes

**********

3. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #2: Yes

**********

4. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #2: Yes

**********

5. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #2: Yes

**********

6. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #2: Comments:

Revisions well received and efforts made to address the initial concerns raised, but not quite there yet for me; hence, accept with minor changes.

The in-depth discussions are useful for this paper to add rigour.

Minor – Typos and grammar need to be corrected.

For example -

Abstract: Although a few studies examined the efficacy of CC

Ethical consideration: rephrase the sentence – It was agreed that as all data were fully anonymised, the ethics committee waived the requirement for informed consent.

References 23 and 24 are duplicates; a thorough review of the references is needed to ensure references in the bibliography are adequately represented in the body of the text.

Overall - satisfied, but the manuscript needs to be checked for grammatical and typographical errors and references. Thanks.

**********

7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #2: Yes: Gozie Offiah

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files to be viewed.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email us at figures@plos.org. Please note that Supporting Information files do not need this step.

PLoS One. 2020 Mar 26;15(3):e0230792. doi: 10.1371/journal.pone.0230792.r004

Author response to Decision Letter 1


3 Mar 2020

To the Editor (Prof. Conor Gilligan, M.D.)

Thank you for appreciating the clinical importance of our article.

We have revised the manuscript faithfully according to the insightful comments of the editors as far as possible. We also deeply apologize for the insufficient revision in the previous version.

We believe that the quality of our article has improved significantly thanks to your valuable comments.

1. Editor comments: The conclusion section of the abstract needs to make a stronger ‘so what?’ point.

Thank you very much for your insightful comments. According to your suggestion, we have strengthened the conclusion section of the abstract.

2. Editor comments: Reviewer 1 suggested a deeper exploration of the topic in the introduction and I do not feel that this has been addressed.

Thank you very much for your insightful comments. We deeply apologize that the introduction in the former revision did not explore the topic deeply enough. According to your suggestion, we have added further explanation of the justification for this study.

3. Editor comments: On page 3 – ‘assuring basic clinical competency is essential’ – are the authors referring to achieving this prior to CC?

Thank you very much for your insightful comments. We apologize for the unclear expression. According to your suggestion, we have corrected the sentence to clarify that this refers to competency prior to CC.

4. Editor comments: On page 4, the authors have added some detail, but the writing is unclear and includes repetition. The paragraph starting ‘The OSCE…’ needs to be revised.

Thank you very much for your insightful comments. According to your suggestion, we have restructured the paragraph.

5. Editor comments: Page 5 please delete ‘We have discussed that’ and begin this sentence with ‘All data’

Thank you very much for your insightful comments. According to your suggestion, we have corrected the sentence.

6. Editor comments: Why were students who did not progress excluded? Might this have introduced bias? Also, I would expect that this group might increase the variance in the findings which would be statistically helpful.

Thank you very much for your insightful comments. We considered that the content of the clinical clerkship may differ from year to year. We therefore excluded repeat-year students so that the correlation analysis could be performed more accurately. According to your suggestion, we have added this point to the limitations section.
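For illustration only, a minimal sketch of this type of correlation analysis is given below. It assumes a pandas DataFrame with hypothetical column names (osce_total, cbt_total, cc_performance, repeat_year) and a hypothetical filename; the study's actual data layout and statistical software are not specified here.

    # Minimal sketch of the correlation analysis described above (illustrative).
    # Assumptions: column names and filename are hypothetical; the authors'
    # actual dataset, variable names, and software may differ.
    import pandas as pd
    from scipy import stats

    df = pd.read_excel("s1_data.xlsx")       # hypothetical name for the S1 Data supplement
    df = df[~df["repeat_year"]]              # exclude repeat-year students

    for predictor in ["osce_total", "cbt_total"]:
        r, p = stats.pearsonr(df[predictor], df["cc_performance"])
        print(f"{predictor}: r = {r:.2f}, P = {p:.4f}")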

7. Editor comments: On page 6 – what is the statistical justification for having 7 stations – is there evidence that this provides sufficient data for assessment?

Thank you very much for your insightful comments. The seven-station format is the conventional method in our OSCE evaluation. We agree with your point, and according to your suggestion, we have added this to the limitations section.
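As background to the question of whether seven stations yield sufficient data, one common check is the internal-consistency reliability of the total score across stations. The sketch below computes Cronbach's alpha on entirely hypothetical station scores; it is not drawn from the study's data, and the resulting value depends on the data supplied.

    # Cronbach's alpha across OSCE stations (illustrative, hypothetical data).
    import numpy as np

    rng = np.random.default_rng(1)
    stations = rng.normal(80, 5, size=(94, 7))   # 94 students x 7 stations (hypothetical)

    k = stations.shape[1]
    sum_item_var = stations.var(axis=0, ddof=1).sum()   # sum of per-station variances
    total_var = stations.sum(axis=1).var(ddof=1)        # variance of total scores
    alpha = (k / (k - 1)) * (1 - sum_item_var / total_var)
    print(f"Cronbach's alpha = {alpha:.2f}")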

8. Editor comments: Also on page 6 the bracketed ‘(i.e chest….’) is not needed as this information has already been provided.

Thank you very much for your insightful comments. According to your suggestion, we have deleted the bracketed expression.

9. Editor comments: Was the training ‘based on common text’ purely written preparation? Is there any assessor standardisation?

Thank you very much for your insightful comments. The training included standardization and also used video. According to your suggestion, we have added this information to the methods section.

10. Editor comments: I am surprised at such high scores, particularly on the CBT – is item analysis available and how is the standard set for this examination? There is very limited variance across all scores – this should be addressed as a limitation in making judgments about the correlations or lack thereof.

Thank you very much for your insightful comments. According to your suggestion, we have added information about the CBT to the limitations section. We have also added information about the CBT to the methods section to show that quality management is performed for this test.
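For context on what an item analysis of a test such as the CBT typically involves, the sketch below computes classical item difficulty (proportion correct) and discrimination (item versus rest-of-test correlation) on a hypothetical 0/1 response matrix; it does not use the study's actual CBT data.

    # Classical item analysis (illustrative, hypothetical response matrix).
    import numpy as np

    rng = np.random.default_rng(0)
    responses = rng.integers(0, 2, size=(94, 20))   # 94 students x 20 items (hypothetical)

    total = responses.sum(axis=1)
    difficulty = responses.mean(axis=0)             # proportion correct per item
    discrimination = np.array([                     # item vs. rest-of-test correlation
        np.corrcoef(responses[:, j], total - responses[:, j])[0, 1]
        for j in range(responses.shape[1])
    ])
    print("difficulty:", difficulty.round(2))
    print("discrimination:", discrimination.round(2))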

11. Editor comments: On page 11, second paragraph there is repetition – the discussion in general is repetitive and offers limited analysis of the findings. I encourage the authors to explore the potential implications more deeply to clarify a ‘so what?’ message.

Thank you very much for your insightful comments. According to your suggestion, we have removed the repetition, added more ‘so what?’ messages, and restructured the paragraphs. We now emphasize that the combination of simulation-based education (SBE) and the clinical environment can enhance CC performance, using emergency training as a clarifying example.

12. Editor comments: The English grammar at the bottom of page 11 is clumsy and needs revision.

Thank you very much for your insightful comments. According to your suggestion, we have corrected the English grammar as far as possible.

We also performed a thorough spelling and grammar check again and corrected several parts, and we checked the references.

We sincerely appreciate your insightful suggestions regarding our article.

We also apologize for any inconvenience our article may have caused.

We believe that our manuscript has been significantly improved by your valuable comments.

Decision Letter 2

Conor Gilligan

10 Mar 2020

Relationships between Objective Structured Clinical Examination, Computer-based Testing, and Clinical Clerkship Performance in Japanese Medical Students

PONE-D-19-29629R2

Dear Dr. Komasawa,

We are pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it complies with all outstanding technical requirements.

Within one week, you will receive an e-mail containing information on the amendments required prior to publication. When all required modifications have been addressed, you will receive a formal acceptance letter and your manuscript will proceed to our production department and be scheduled for publication.

Shortly after the formal acceptance letter is sent, an invoice for payment will follow. To ensure an efficient production and billing process, please log into Editorial Manager at https://www.editorialmanager.com/pone/, click the "Update My Information" link at the top of the page, and update your user information. If you have any billing related questions, please contact our Author Billing department directly at authorbilling@plos.org.

If your institution or institutions have a press office, please notify them about your upcoming paper to enable them to help maximize its impact. If they will be preparing press materials for this manuscript, you must inform our press team as soon as possible and no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

With kind regards,

Conor Gilligan

Academic Editor

PLOS ONE

Additional Editor Comments (optional):

You have addressed the concerns adequately and I feel that the paper can now be accepted.


Acceptance letter

Conor Gilligan

12 Mar 2020

PONE-D-19-29629R2

Relationships between Objective Structured Clinical Examination, Computer-based Testing, and Clinical Clerkship Performance in Japanese Medical Students

Dear Dr. Komasawa:

I am pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now with our production department.

If your institution or institutions have a press office, please notify them about your upcoming paper at this point, to enable them to help maximize its impact. If they will be preparing press materials for this manuscript, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information please contact onepress@plos.org.

For any other questions or concerns, please email plosone@plos.org.

Thank you for submitting your work to PLOS ONE.

With kind regards,

PLOS ONE Editorial Office Staff

on behalf of

Dr. Conor Gilligan

Academic Editor

PLOS ONE

Associated Data

    This section collects any data citations, data availability statements, or supplementary materials included in this article.

    Supplementary Materials

    S1 Data

    (XLSX)

    Attachment

    Submitted filename: response to reviewers.doc

    Data Availability Statement

    The data underlying the results presented in the study are available in the attached supplemental data (S1 Data).

