PLoS One. 2024 May 23;19(5):e0303534. doi: 10.1371/journal.pone.0303534

Educational climate of a pathology residency program at a tertiary care hospital

Zafar Ali 1, Hashaam Bin Ghafoor 2, Muhammad Nasir Ayub Khan 3, Muslim Atiq 4, Saira Akhlaq 5,*
Editor: Ritesh G Menezes
PMCID: PMC11115291  PMID: 38781154

Abstract

Evaluating the educational climate (EC) is imperative for ensuring postgraduate trainees' competencies and the quality of residency training programs. This cross-sectional study assessed the EC experienced by pathology postgraduate residents (PGRs) during their training in the pathology residency program at a tertiary care institution in Islamabad. EC scores were measured using the Dutch Residency Educational Climate Test (D-RECT) questionnaire. D-RECT scores were analyzed with descriptive statistics, and means were compared across groups defined by years of training to evaluate EC scores and locate where the differences lay. Among the FCPS-II pathology residents, most were female (94.4%), and the mean age was 28.11±2.91 years. Positive mean scores (M≥3.6) were observed for all D-RECT subscales except feedback, whose mean fell below the 3.6 threshold (M = 3.19). Analysis of Variance (ANOVA) showed a significant difference in EC scores across training-year groups (p = 0.016). The largest difference was between the less-than-two-years and greater-than-two-years groups (p = 0.027), followed by the difference between the equal-to-two-years and greater-than-two-years groups (p = 0.052). Overall, positive EC scores were observed in the pathology residency program. Targeted interventions are nevertheless needed to raise feedback scores and address the observed differences in EC scores by years of training.

Introduction

Educational Climate (EC) is the pivot around which the other four focal areas of medical education (i.e., curriculum, environment, quality, and change) are discussed [1]. The Educational Environment (EE) may be described as an environment shaped by the physical environment (i.e., safety, comfort, food), the emotional environment (i.e., feedback quality and security), and the educational environment (i.e., participation, relevance, and education planning) [2, 3]. Experiences within an EC are directly or indirectly linked with the valuable outcomes of student achievement, satisfaction, and success [1]. During postgraduate (PG) training programs, EC is an essential marker of quality in PG medical education [4–6], and the successful translation of a curriculum into an EE depends upon an adequate EC [7–9]. Enhancing the latter in medical education also depends upon establishing a continuous feedback system, especially in emerging nations and new training institutions [9]. Hence, establishing a feedback system within the EC of a histopathology residency program is necessary to achieve positive outcomes through an EE that improves learning [7–9], the well-being of PGRs [10–12], professionalism [13], learning satisfaction [14], and the competency of health care professionals (HCPs) [9]. The EC of PG programs highlights the environment in which PGRs learn about the context and shared identification of approaches, techniques, and practices [15]. By contrast, the achievement of excellence through a PGR training program may be hindered by an obtrusive clinical EE that leads to fatigue, exhaustion, and burnout among PGRs, and ultimately to dropout [16].

Measuring the learning environment (LE) offers a reference point for the didactic institution [17]. The measurement process helps answer trainers' and trainees' questions about the quality of the learning experience. Pathology training is considered distinctive because trainees have limited exposure to several laboratory sections during the house job and MBBS. The pathology residency program is divided into subspecialties such as chemistry, histopathology, immunology, hematology, and microbiology, and each sub-specialty has its own unique EC. To meet the increasing demand for pathology training, valuable programs are needed on a large scale, as any change or development requires recognition of the available teaching models, boosts PGRs' education, and enhances their experience. Pathology graduates and employers agree that existing pathology residency training remains inadequate in preparing residents who are competent in clinical chemistry during their professional careers, primarily because residents place less value on learning clinical chemistry and because of a lack of directors who supervise and manage clinical chemistry laboratories [18]. The indifference of teaching staff or residents is an essential obstacle to teaching clinical chemistry to residents [18]. As described by residents, an unsupportive EE and ineffective teaching techniques are the main drawbacks [18].

The EC has been assessed using the D-RECT in various local and international studies [19]. However, these studies involved different disciplines [3, 19–21], and the nature of the EC depends upon the nature of the setting, i.e., each unique specialty and sub-specialty [1, 18, 22, 23]. The current study therefore assessed the EC of a PG pathology residency program by administering the D-RECT. It was conducted in view of the need for future research in more focused areas that examines individual disciplines to increase the likelihood of ruling out subject-specific differences [24], particularly since the same climate index can be used even with different stakeholders [7].

Materials and methods

A cross-sectional study, approved by the ethics committee of Shifa Tameer-e-Millat University, was conducted at Shifa International Hospital (Islamabad, Pakistan) to assess the EC scores of pathology residents by years of training. All pathology PGRs participated voluntarily, and written informed consent was obtained following the Shifa International Hospital protocol. Data were collected from June 6 to June 30, 2021. During this period, the D-RECT instrument (fifty items grouped into eleven subscales) was used to evaluate EC scores among trainees, and demographic variables such as age, sex, specialty area, and years of training were also recorded.

Data were collected through a paper-format questionnaire and stored in locked folders for data safety and security; the paper-format data were then entered into electronic files using SPSS 21.0. Descriptive analysis included the calculation of the mean (M), standard deviation (SD), frequency (f), and percentage (%) for the overall D-RECT score and each subscale. An average score below 3.6 on any subscale was considered a negative perception.
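As a minimal sketch of this analysis step, the fragment below shows how the subscale means, standard deviations, and the 3.6 negative-perception threshold could be computed in Python; the original analysis was performed in SPSS, and the column names and demonstration values here are illustrative placeholders, not the variables in S1 Dataset.

```python
# Hypothetical re-implementation of the descriptive analysis (the study used SPSS 21.0).
# Column names and demo values are placeholders, not the variables in S1 Dataset.
import pandas as pd

NEGATIVE_THRESHOLD = 3.6  # a subscale mean below this is treated as a negative perception


def summarize_subscales(df: pd.DataFrame, subscale_cols: list) -> pd.DataFrame:
    """Return mean, SD, and a negative-perception flag for each D-RECT subscale."""
    summary = pd.DataFrame({
        "mean": df[subscale_cols].mean(),
        "sd": df[subscale_cols].std(ddof=1),  # sample standard deviation
    })
    summary["negative_perception"] = summary["mean"] < NEGATIVE_THRESHOLD
    return summary.round(2)


# Illustrative usage with made-up scores for two subscales:
demo = pd.DataFrame({
    "supervision": [4.6, 4.4, 4.7, 4.5],
    "feedback": [3.1, 3.3, 3.0, 3.4],
})
print(summarize_subscales(demo, ["supervision", "feedback"]))
```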

Results

The response rate was 100% (18/18). The complete dataset used in the analysis is available in S1 Dataset. Considering the age categories above and below 30 years, 83.3% (15) of the study participants were ≤30 years old and 16.7% (3) were >30 years. The average age was 28.11±2.908 years, and PGRs included more females (94.4%) than males (5.6%). PGRs in the first two years of training were greater in number (66.6%) than PGRs in the last three years (33.4%). The histopathology department had the highest number of PGRs at 55.5% (10), while 33.3% (6), 5.6% (1), and 5.6% (1) were working in the hematology, immunology, and chemical pathology departments, respectively.

Overall, positive experiences (M > 3.6) were reflected in the EC of the pathology residency program as measured by the D-RECT subscales, except for the feedback subscale (M < 3.6), as shown in Table 1.

Table 1. Descriptive statistics for educational climate (D-RECT subscales).

Subscales Agree Neutral Disagree Mean±SD
Supervision 17 (94.4%) 1 (5.6%) 0 (0%) 4.56±0.457
Coaching and assessment 18 (100%) 0 (0%) 0 (0%) 4.45±0.429
Feedback 8 (44.4%) 10 (55.6%) 0 (0%) 3.19±0.458
Teamwork 18 (100%) 0 (0%) 0 (0%) 4.43±0.427
Peer collaboration 17 (94.4%) 1 (5.6%) 0 (0%) 4.54±0.606
Professional relations between attendings 18 (100%) 0 (0%) 0 (0%) 4.41±0.465
Work is adapted to residents’ competence 18 (100%) 0 (0%) 0 (0%) 4.10±0.536
Attendings’ role 18 (100%) 0 (0%) 0 (0%) 4.58±0.506
Formal education 18 (100%) 0 (0%) 0 (0%) 4.19±0.504
Specialty Tutor’s role 18 (100%) 0 (0%) 0 (0%) 4.37±0.497
Patient sign-out 18 (100%) 0 (0%) 0 (0%) 4.22±0.535
Overall D-RECT 18 (100%) 0 (0%) 0 (0%) 4.39±0.385

Comparison of means

ANOVA

Data for the overall EC scores were normally distributed (overall p > 0.05); the normality test results are provided in S1 Table. An overall summary of the mean EC scores across three groups was calculated. Because only one participant fell in the created Groups 4 and 5, these groups were merged into Group 3. The final groups contained equal numbers of participants: Group 1 (less than 2), Group 2 (equal to 2), and Group 3 (greater than 2). The category "less than 2" refers to PGRs in the 1st year, "equal to 2" refers to PGRs in the 2nd year, and "greater than 2" refers to PGRs in the 3rd, 4th, and 5th years. A summary of means across the groups is presented in Table 2.

Table 2. Descriptive of variances in EC scores across groups.
Descriptive statistics: Total EC Scores
Group N Mean Std. Deviation Std. Error 95% CI Lower Bound 95% CI Upper Bound Minimum Maximum
Less than 2 6 4.59 .455 .186 4.12 5.07 4 5
Equal to 2 6 4.53 .273 .112 4.24 4.82 4 5
Greater than 2 6 4.05 .088 .036 3.95 4.14 4 4
Total 18 4.39 .385 .091 4.20 4.58 4 5

One-way ANOVA was conducted to identify whether there were any differences between and within groups. A significant difference in EC scores was observed across the groups (p = .016), as shown in Table 3.

Table 3. ANOVA-variations in EC scores by groups.
Total EC Scores
Sum of Squares df Mean Square F Sig.
Between Groups 1.073 2 .536 5.557 .016a
Within Groups 1.448 15 .097
Total 2.521 17

aThe mean difference is significant at the level of 0.05.

Multiple comparisons

Following the ANOVA, multiple comparisons were conducted to identify where the differences were located. A difference was highlighted between the 1st year of the pathology program and the 3rd year of residency and above, and similarly between the 2nd year of the pathology residency program and the 3rd year of residency and above, in EC scores (Table 4).

Table 4. Multiple comparisons.
Dependent Variable: Total EC scores
(I) Years (J) Years Mean Difference (I-J) Std. Error Sig. 95% CI Lower Bound 95% CI Upper Bound
Less than 2 Equal to 2 .063 .179 .940 -.42 .55
Less than 2 Greater than 2 .547a .179 .027 .06 1.03
Equal to 2 Greater than 2 .483b .179 .052 .00 .97

aThe mean difference is significant at the 0.05 level.

bThe mean difference is significant at the 0.05 level.
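The group comparison reported in Tables 2–4 was carried out in SPSS. As a rough sketch of how the same sequence of steps (normality check, one-way ANOVA, and pairwise post-hoc comparisons) could be reproduced, the Python fragment below uses placeholder scores rather than the values in S1 Dataset and applies Tukey's HSD, which is an assumption since the manuscript does not name its specific post-hoc procedure.

```python
# Hypothetical reproduction of the group comparison (the study used SPSS).
# Scores are placeholders, not the values in S1 Dataset; Tukey's HSD is used here as
# one common post-hoc choice because the manuscript does not name its procedure.
import numpy as np
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

group_labels = ["Less than 2"] * 6 + ["Equal to 2"] * 6 + ["Greater than 2"] * 6
scores = np.array([
    4.6, 4.2, 5.0, 4.8, 4.1, 4.9,   # year 1 (placeholder values)
    4.5, 4.3, 4.8, 4.2, 4.7, 4.7,   # year 2
    4.0, 4.1, 4.0, 4.1, 3.9, 4.2,   # years 3-5 combined
])

# Rough normality check on the overall scores (the paper reports Kolmogorov-Smirnov; S1 Table).
print(stats.kstest(stats.zscore(scores), "norm"))

# One-way ANOVA across the three training-year groups (compare with Table 3).
g1, g2, g3 = scores[:6], scores[6:12], scores[12:]
print(stats.f_oneway(g1, g2, g3))

# Pairwise comparisons to locate where the differences lie (compare with Table 4).
print(pairwise_tukeyhsd(scores, group_labels, alpha=0.05))
```

This is not a substitute for the original SPSS output; it only illustrates the analysis steps described in the Results.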

Discussion

The EC is a crucial variable that must be monitored to guarantee the caliber of instruction in a healthcare setting that offers an EE for PGR training. Most PGRs in the pathology residency program were female (94.4%) and ≤30 years old (83.3%), and two-thirds of the study participants (66.6%) were in the first and second years of the pathology residency program.

Each subscale of the D-RECT may be considered a separate variable or domain for discussion of its unique implications. Supervision is one such subscale and is an essential component of any training program; constant supervision of pathology residents by hospital teaching staff is needed to provide patients with safe, effective, high-quality care. PGRs' scores for supervision were high (4.56 ± 0.457), with a large percentage (94.4%) of study participants agreeing. These scores were comparatively better than those reported among PGRs from multiple specialties (3.75 ± 1.49) [20] and those reported during shift changes (3.93 ± 0.36) in a study conducted in the Netherlands that included residents from multiple specialty training programs [15]. Despite the prevalence of positive perceptions of varying magnitude in different populations, negative perceptions have also been observed in some studies, specifically about supervision among Saudi psychiatry residents (2.83 ± 0.83) [4] and among residents of the Saudi emergency department in Riyadh (3.30 ± 1.17) [23]. Therefore, identifying the mechanisms that produce differences in the magnitude of positive scores, as well as the factors that lead to lower supervision scores, may help address weaknesses within educational systems and facilitate building on the strengths of solid educational systems as models.

Coaching and assessment play a vital role that cannot be overlooked. A positive perception of coaching and assessment was prevalent (4.45 ± 0.429) among all study participants (100%), similar to the positive perceptions (3.92 ± 0.32) observed in medical residents in the Philippines [25]. Despite the positive scores in the current study and elsewhere, some studies report lower scores (2.60 ± 0.73, 3.24 ± 0.35, and 3.08 ± 1.17) for perceptions of coaching and assessment among Saudi psychiatry residents, 45 multi-specialty residents in the Netherlands, and Saudi emergency residents, respectively [4, 15, 23]. Future studies may therefore identify and compare the factors leading to these differences in scores; such an approach may help in adopting the mechanisms associated with higher scores for coaching and assessment.

Feedback is a significant factor that enhances pathology residents' learning in several ways: it updates learners regarding their deficiencies and progress, advises them of observed training needs and available resources, assists them during learning, and encourages them to engage in adequate learning activities. Despite the well-established effectiveness of feedback on learning outcomes when communicated appropriately, feedback scores have repeatedly been reported on the lower end [4, 15, 20]. The negative perception of feedback (3.19 ± 0.458) in the current study re-affirms the previously reported low feedback scores (2.00 ± 0.85, 3.24 ± 0.45, and 2.75 ± 0.54, respectively) [4, 15, 20]. This trend reflects the need for further concentration on feedback to improve the patient-care capacity of PGRs. Additionally, the significant differences in mean EC scores by years of training signal a need to identify the differential pathways through which these differences arise. For example, even within the low range of feedback scores, scores were slightly higher for a single specialty assessed at multiple sites; this could perhaps explain the slightly better feedback scores observed when scores are assessed collectively across all specialties of PG medical training, even though those scores remained low, i.e., below 4 (3.75 ± 1.49) [20].

Similarly, feedback scores remain low even when the primary purpose of a study is not to measure feedback itself but to assess its effect; for example, the effect of the LE on faculty's teaching performance was assessed through residents' perceptions, with a feedback score of 3.24 ± 0.45 [15]. When feedback scores are considered item by item, scores for generalized feedback to residents appear comparatively better (2.80 ± 1.18) than those for feedback in a structured format (2.36 ± 1.19), even though structured feedback is provided more regularly (2.39 ± 1.22); in all three reported forms of feedback, scores remained consistently low, i.e., below 4 [23]. Future research may therefore focus on the needed forms of feedback, such as structured rather than generalized feedback, and on identifying mechanisms that nurture the promotion of structured feedback within educational systems, thereby facilitating the building of a feedback system in educational settings using this proposed design as a model. The development and implementation of interventions based on conceptual models can improve how feedback is communicated to and interpreted by PGRs, which requires the integration of evidence-based methodologies to disseminate structured feedback upon which the learner can reflect.

For PGRs' learning, teamwork is considered a significant tool that helps modify learners' attitudes and increases their efficiency, yet perceptions of teamwork have been reported differently in the literature. In the present study they were mainly positive (4.43 ± 0.427), as in a study of Saudi emergency residents in Riyadh (4.05 ± 0.97) [23], whereas predominantly negative perceptions (2.81 ± 0.86) have been reported among Saudi psychiatry residents [4]. A large majority of study participants (94.4%) reported high scores for peer collaboration (4.54 ± 0.606), consistent with the literature: 82% of participants reported a positive perception (4.07 ± 2.02) in a study of residents from different disciplines at the Army Medical Hospital in Rawalpindi, Pakistan [20]. A low score for perceptions of peer collaboration was reported in a French population, where the score fell just below 3.6 (3.54 ± 0.90) [26]. The differing prevalence of positive and negative teamwork scores suggests that future interventions may be planned and implemented to enhance teamwork among PGRs.

A positive perception of professional relations between the attendings (4.41 ± 0.465; Table 1) was found among all participants (100%), similar to the positive perception scores (3.61 ± 0.67) in a study in which residents evaluated the effect of the learning climate on faculty's teaching performance [15], and in contrast to the finding in psychiatry residents, among whom negative perceptions (2.71 ± 0.95) were prevalent [4]. "Work is adapted to the residents' competence" (4.10 ± 0.536) was also positively perceived by all pathology residents (100%), despite mixed findings in the literature, where this variable has been perceived positively (4.06 ± 0.28) among internal medicine residents in the Philippines [25] and negatively in a study of residents in medical and allied as well as surgical and allied PGR training at one institution in Pakistan [20]. Perceptions of "professional relations between the attendings" and "work adapted to the residents' competence" are two variables that may or may not influence the quality of the EC in any healthcare setting; consistent efforts should nevertheless be made to improve the scores for these variables.

Consultants play a vital role in developing residents' positive perceptions of the EC. A positive perception of the attendings' role (4.58 ± 0.506) was prevalent among all residents (100%); this finding is consistent with the positive perception (above 3.6) reported elsewhere [15] and contrary to the findings in Saudi psychiatry residents, among whom a negative perception of the consultant's role prevailed (2.71 ± 0.86) [4]. An overall trend of positive experiences was likewise observed among 100% of pathology residents regarding formal education (4.19 ± 0.504), similar to the positive perception of formal education (M > 3.6) among residents from multiple PGR training programs in Pakistan [20]. Thus, whether through the active role of consultants or through formal education, consultants may be viewed within the context of formal education, which would help enhance their capacity in formal teaching.

The role of teachers in improving the EC should never be underestimated, and successful learning can be attained through the active role of specialty teachers. A positive perception of the specialty tutor's role (4.37 ± 0.497) was prevalent among 100% of participants, similar to the positive perception (3.83 ± 0.32) reported in a validity assessment of the D-RECT tool in a non-western context [25]. Despite the positive perceptions reported in several studies, negative perceptions (3.21 ± 1.02) have also been reported in a study that covered multiple specialties rather than a single one [20], indicating a need for further exploration of specialty teachers' roles.

Views about patient sign-out are important because residents' perceptions could improve with an upgraded patient sign-out system. This consideration arises from the high scores for patient sign-out (4.22 ± 0.535) among all residents (100%), similar to the positive scores (3.82 ± 0.46) reported during shift changes in a Dutch study that included residents from multiple specialty training programs [15]. Scores in the current study were nevertheless higher, suggesting that our patient sign-out system is perceived favorably. Despite largely positive findings in the literature, a negative perception of patient sign-out has also been reported (3.50 ± 1.15) when it was assessed across multiple specialties, which may have reduced the representative sample from each specialty [20]. These differing findings provide grounds for further improvement of patient sign-out systems in specific settings.

Overall, a positive perception (>3.6) of the EC subscales of the D-RECT questionnaire was found in almost all the PGRs working in the four departments (histopathology, chemical pathology, immunology, hematology) of the pathology residency program. A positive perception of the overall EC score (4.39 ± 0.385) was observed, corresponding to the overall positive score for clinical learning (3.85 ± 0.29) in Filipino medical residents [25]. However, a negative perception of the EC (3.16 ± 0.92) has also been reported in a study of Moroccan residents in which a French translation of the D-RECT was psychometrically evaluated [26]. Despite the positive overall EC score, variations in scores by years of training (p ≤ 0.05) were observed; these presumably reflect changing perceptions of PGRs about the attributes of the EC as they transition into the senior years. Different challenges and milestones in different years of training may influence perceptions of the EC. In the senior years (third to fifth), residents may feel added responsibility with less time available for each commitment, relatively less supervised task performance, and increasing complexity of the competencies that must be mastered during residency training [27]. The observed differences in EC scores by years of training may therefore reflect that in the earlier years of residency (first and second), residents are more enthusiastic about their professional futures, invest consistent time in training milestones, and adjust to relatively simpler competencies to be mastered.

Additionally, the significant differences in mean EC scores by years of training in our study contrast with the differences in scores by years of training assessed in an emergency department [23]. However, there are methodological differences in how the differences were calculated: in the emergency department study [23], scores were determined individually for each D-RECT construct and then compared by years of training, and emergency residency training lasts only four years. In our study, we calculated the average mean score for the overall D-RECT and then compared the scores by year.

Furthermore, in our study the pathology residency program lasts five years, and we condensed these years of training into three groups. We conducted ANOVA to assess whether any significant differences could be identified, together with a post-hoc analysis through multiple comparisons to see where the differences were located, since identifying differences in perceptions of EC scores reflects a need to devise mechanisms to address them.

A limitation of our study is its small sample size. Enrolling additional PGRs was not possible because all PGRs in the program at the time of the study participated; all provided voluntary informed consent, and the response rate was 100%. To address this limitation, the study may be considered a pilot and, in the future, may be replicated in the same setting with a different design, i.e., a longitudinal study design. A strength of our study is its focus on the pathology residency program. Considering the diverse needs of the EC in each unique setting, it was essential to assess the needs within our own EC; to achieve this, the study was designed with methodological rigor to identify differences within our EC, which was necessary for quality assurance in our setting. We categorized the groups not only by the number of participants but also by considering the maturity levels of residents at each level and the complexity of the competencies to be met at different levels.

Furthermore, the lower numbers in the senior years that required us to combine groups may also reflect the program's evolution, with decreasing numbers in the senior years and increasing numbers in the junior years. Future longitudinal studies may assess the changing perceptions of PGRs in pathology residency training over time in each group to highlight the dynamics related to feedback scores. Implementing interventions to address low feedback scores in educational settings requires a comprehensive plan, such as train-the-trainer workshops for supervisors: short training sessions in which supervisors practice giving feedback in a simulated setting have been shown to increase the quality of their feedback [28]. Updating departmental policies to address the identified needs would thus help improve feedback scores and how feedback is perceived at the receivers' end.

Conclusions

Variations in EC scores were observed by years of training. Despite the observed positive scores for overall EC, variations by years of training reflect the need to identify the factors that lead to these variations. Such an approach would help address the differences in EC scores by implementing targeted interventions. It is expected that feedback scores for pathology trainees can be improved by following the same approach.

Supporting information

S1 Dataset. Dataset used in this study.

This is the data file for the data on EC scores in a pathology residency program at a tertiary care hospital.

(CSV)

S1 Table. Test of normality Kolmogorov-Smirnov.

This is the table for the test of normality that was conducted to confirm if data was normally distributed.

(DOCX)


Acknowledgments

The research study was initially conducted as part of the Master's in Health Professions Education (MHPE) thesis at Shifa Tameer-e-Millat University.

Data Availability

Data has been attached as a Supporting Information file.

Funding Statement

The author(s) received no specific funding for this work.

References

1. Genn JM. AMEE Medical Education Guide No 23 (Part 1): Curriculum, environment, climate, quality and change in medical education-a unifying perspective. Medical Teacher. 2001; 23: 337–44. doi: 10.1080/01421590120063330
2. Altemani AH, Merghani TH. The quality of the educational environment in a medical college in Saudi Arabia. International Journal of Medical Education. 2017; 8: 128. doi: 10.5116/ijme.58ce.55d2
3. Roff S, McAleer S. What is educational climate? Medical Teacher. 2001; 23: 333–4. doi: 10.1080/01421590120063312
4. Alshomrani AT, AlHadi AN. Learning environment of the Saudi psychiatry board training program. Saudi Medical Journal. 2017; 38(6): 629–35. doi: 10.15537/smj.2017.6.18164
5. Nasca TJ, Philibert I, Brigham T, Flynn TC. The next GME accreditation system—rationale and benefits. New England Journal of Medicine. 2012; 366: 1051–6. doi: 10.1056/NEJMsr1200117
6. WFME. Postgraduate medical education: WFME global standards for quality improvement. Copenhagen (DK): World Federation for Medical Education; 2003.
7. Bassaw B, Roff S, McAleer S, Roopnarinesingh S, De Lisle J, Teelucksingh S, et al. Students' perspectives on the educational environment, Faculty of Medical Sciences, Trinidad. Medical Teacher. 2003; 25: 522–6. doi: 10.1080/0142159031000137409
8. Genn JM. AMEE Medical Education Guide No. 23 (Part 2): Curriculum, environment, climate, quality and change in medical education—a unifying perspective. Medical Teacher. 2001; 23: 445–54. doi: 10.1080/01421590120075661
9. Roff S. The Dundee Ready Educational Environment Measure (DREEM)—a generic instrument for measuring students' perceptions of undergraduate health professions curricula. Medical Teacher. 2005; 27: 322–5. doi: 10.1080/01421590500151054
10. Hobgood C, Hevia A, Tamayo-Sarver JH, Weiner B, Riviello R. The influence of the causes and contexts of medical errors on emergency medicine residents' responses to their errors: an exploration. Academic Medicine. 2005; 80: 758–64. doi: 10.1097/00001888-200508000-00012
11. Mareiniss DP. Decreasing GME training stress to foster residents' professionalism. Academic Medicine. 2004; 79: 825–31. doi: 10.1097/00001888-200409000-00003
12. Tsai JC, Chen CS, Sun IF, Liu KM, Lai CS. Clinical learning environment measurement for medical trainees at transitions: relations with socio-cultural factors and mental distress. BMC Medical Education. 2014; 14: 226. doi: 10.1186/1472-6920-14-226
13. Brown J, Chapman T, Graham D. Becoming a new doctor: a learning or survival exercise? Medical Education. 2007; 41: 653–60. doi: 10.1111/j.1365-2929.2007.02785.x
14. Silkens ME, Arah OA, Scherpbier AJ, Heineman MJ, Lombarts KM. Focus on quality: investigating residents' learning climate perceptions. PLoS One. 2016; 11: e0147108. doi: 10.1371/journal.pone.0147108
15. Lombarts KM, Heineman MJ, Scherpbier AJ, Arah OA. Effect of the learning climate of residency programs on faculty's teaching performance as evaluated by residents. PLoS One. 2014; 9: e86512. doi: 10.1371/journal.pone.0086512
16. Yousaf MJ, Yasmeen R, Khan MA, Qamar K. Perceptions of postgraduate residents regarding clinical educational environment by using the post graduate hospital education environment measure (PHEEM) inventory. Pakistan Armed Forces Medical Journal. 2017; 67(6): 914–18.
17. Maudsley RF. Role models and the learning environment: essential elements in effective medical education. Academic Medicine. 2001; 76: 432–4. doi: 10.1097/00001888-200105000-00011
18. Haidari M, Yared M, Olano JP, Alexander CB, Powell SZ. Attitudes and beliefs of pathology residents regarding the subspecialty of clinical chemistry: results of a survey. Archives of Pathology & Laboratory Medicine. 2017; 141(2): 203–8. doi: 10.5858/arpa.2015-0547-OA
19. Domínguez LC, Silkens MEWM, Sanabria A. The Dutch residency educational climate test: construct and concurrent validation in Spanish language. International Journal of Medical Education. 2019; 10: 138–148. doi: 10.5116/ijme.5d0c.bff7
20. Amin MS, Iqbal U, Shukr I. Residency educational climate in a Pakistani postgraduate medical institute. Pakistan Armed Forces Medical Journal. 2016; 66(4): 606–12.
21. Jafri L, Siddiqui I, Khan AH, Tariq M, Effendi MU, Naseem A, et al. Fostering teaching-learning through workplace based assessment in postgraduate chemical pathology residency program using virtual learning environment. BMC Medical Education. 2020; 20(1): 1–2. doi: 10.1186/s12909-020-02299-8
22. Stephen Black-Schaffer W, Robboy SJ, Gross DJ, Crawford JM, Johnson K, Austin M, et al. Evidence-based alignment of pathology residency with practice II: findings and implications. Academic Pathology. 2021; 8. doi: 10.1177/23742895211002816
23. Alsalamah M, Al-Madani A. Training climate of the Saudi emergency medicine program in Riyadh. Journal of Health Informatics in Developing Countries. 2020; 14(1): 1–11.
24. Bennett D, Dornan T, Bergin C, Horgan M. Postgraduate training in Ireland: expectations and experience. Irish Journal of Medical Science. 2014; 183(4). doi: 10.1007/s11845-013-1060-5
25. Pacifico JL, van der Vleuten CPM, Muijtjens AMM, Sana EA, Heeneman S. Cross-validation of a learning climate instrument in a non-western postgraduate clinical environment. BMC Medical Education. 2018; 18: 22. doi: 10.1186/s12909-018-1127-0
26. Majbar MA, Majbar Y, Benkabbou A, Amrani L, Bougtab A, Mohsine M, et al. Validation of the French translation of the Dutch residency educational climate test. BMC Medical Education. 2020; 20: 338. doi: 10.1186/s12909-020-02249-4
27. The Accreditation Council for Graduate Medical Education. Pathology Milestones. Available from: pathologymilestones.pdf (acgme.org). Accessed November 14, 2023.
28. Renting N, Jaarsma D, Borleffs JCC, et al. Effectiveness of supervisor training on quality of feedback to internal medicine residents: a controlled longitudinal multicenter study. BMJ Open. 2023; 13: e076946. doi: 10.1136/bmjopen-2023-076946

Decision Letter 0

Ritesh G Menezes

25 Oct 2023

PONE-D-23-26192 Educational Climate of a Pathology Residency Program at a Tertiary Care Hospital PLOS ONE

Dear Dr. Akhlaq,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

Please submit your revised manuscript by Dec 09 2023 11:59PM. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:

  • A 'Response to Reviewers' letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labelled 'Response to Reviewers'.

  • A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labelled 'Revised Manuscript with Track Changes'.

  • An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labelled 'Manuscript'.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: https://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols.

We look forward to receiving your revised manuscript.

Kind regards,

Prof. Ritesh G. Menezes, M.B.B.S., M.D., Diplomate N.B.

Academic Editor

PLOS ONE

Journal Requirements:

When submitting your revision, we need you to address these additional requirements.

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at 

https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and 

https://journals.plos.org/plosone/s/file?id=ba62/PLOSOne_formatting_sample_title_authors_affiliations.pdf

2. We note that you have indicated that data from this study are available upon request. PLOS only allows data to be available upon request if there are legal or ethical restrictions on sharing data publicly. For information on unacceptable data access restrictions, please see http://journals.plos.org/plosone/s/data-availability#loc-unacceptable-data-access-restrictions. 

In your revised cover letter, please address the following prompts:

a) If there are ethical or legal restrictions on sharing a de-identified data set, please explain them in detail (e.g., data contain potentially identifying or sensitive patient information) and who has imposed them (e.g., an ethics committee). Please also provide contact information for a data access committee, ethics committee, or other institutional body to which data requests may be sent.

b) If there are no restrictions, please upload the minimal anonymized data set necessary to replicate your study findings as either Supporting Information files or to a stable, public repository and provide us with the relevant URLs, DOIs, or accession numbers. Please see http://www.bmj.com/content/340/bmj.c181.long for guidelines on how to de-identify and prepare clinical data for publication. For a list of acceptable repositories, please see http://journals.plos.org/plosone/s/data-availability#loc-recommended-repositories.

We will update your Data Availability statement on your behalf to reflect the information you provide.


Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Partly

Reviewer #2: Partly

Reviewer #3: Partly

**********

2. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: I Don't Know

Reviewer #2: I Don't Know

Reviewer #3: Yes

**********

3. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: No

Reviewer #2: Yes

Reviewer #3: Yes

**********

4. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes

Reviewer #2: Yes

Reviewer #3: Yes

**********

5. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: Overall Review:

This paper presents an interesting study assessing the educational climate (EC) experiences of pathology residents at a hospital in Islamabad, Pakistan using the D-RECT questionnaire. The study finds overall positive perceptions of the EC, except for lower scores on the feedback subscale. There are also significant differences in EC scores by years of training. The paper is well-written and adds to the literature on EC in graduate medical education. I have some suggestions to further strengthen the manuscript:

Minor recommendations:

- In the discussion, expand on possible reasons for the lower feedback scores and differences by training year. Compare to any similar findings in the literature.

- Incomplete phrase (Negative perceptions about supervision have been observed in a study by?)

- Conclusion could be strengthened by providing specific recommendations based on your findings, beyond just suggesting further study of the factors influencing scores.

- Carefully proofread for any typos, grammar issues, or awkward wording.

- Make sure references are in correct journal format.

- Consider adding any limitations of the study design or sample.

Overall, the study makes a valuable contribution to understanding the EC in this residency program. Addressing the suggestions above would further improve the quality and impact of the manuscript.

Based on my review, I did not notice any major scientific or methodological mistakes in the paper. The study design and use of the D-RECT questionnaire seem appropriate. Here are some minor language/writing issues I identified:

Some awkward phrasing, such as: "Despite the prevalence of positive perceptions with varying degrees of magnitude in different populations, prevalence of negative perceptions has also been observed in some studies." This could be smoothed out and clarified.

Typos - "infers" should be "implies", wellbeing of PGRs (extra spaces) etc. Carefully proofread.

Need to be consistent with abbreviations. For example, "post-graduate" is written out sometimes and abbreviated "PG" other times. Pick one format.

Uses words that sound too casual/conversational in academic writing like "bulk" and "infers"

Overall, the writing is pretty clear and understandable. Just needs minor editing for typos, consistency, and using more formal academic style. The paper would benefit from careful proofreading and having someone else review the writing. But I did not see any major scientific, methodological, or statistical issues from my review.

Reviewer #2: The idea is interesting and worth exploring. There are a few things which require further clarification. Firstly, you need to expand on the groups which you compared between (on what basis were these groups created?). Secondly, it would be interesting to see your interpretation for the differences in perceptions and experiences based on your context. I would also be interested to learn of the impact this study has on your program (any policy changes?). Lastly, there is some use of informal language, a few grammatical errors, and some missing information (specifically the name of a study is missing which was alluded to in the discussion).

Reviewer #3: The article is well written. Here are a few comments:

P9, HCP: what does this abbreviation stand for?

P10, inadequate in preparing residents to play their significant role: What is their significant role, could you be more specific?

P 11, results, first two lines: Review numbers (15: young, 15: >30)

P 11, were female: repetitive

P 12, table, Peer collaboration: missing number in neutral

P 15, in a study by However: missing information

Coaching and assessment vs feedback:

In my understanding, coaching and assessment are part of feedback. You cannot coach without giving feedback. I find it difficult that the residents have positive perspective regarding coaching and assessment and at the same time they report the opposite in feedback!

I also have concerns regarding the low number of residents particularly in pathology disciplines other than histopathology and predominantly female participants, which is acknowledged by the authors. I would advise to involve more numbers to be more representative.

**********

6. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No

Reviewer #2: No

Reviewer #3: No

**********

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.

PLoS One. 2024 May 23;19(5):e0303534. doi: 10.1371/journal.pone.0303534.r002

Author response to Decision Letter 0


15 Nov 2023

In response to academic editor:

This study is not a laboratory protocol. Therefore, I do not need to use those links.

In response to comments regarding editorial requirements:

I have followed the links that describe the PLOS ONE formatting guidelines for the title page, main manuscript, heading fonts, and references and adopted them accordingly. If anything minor remains, let me know and I will adhere accordingly.

Reviewer #1: Overall Review:

Minor recommendations:

- In the discussion, expand on possible reasons for the lower feedback scores and differences by training year. Compare to any similar findings in the literature.

- Incomplete phrase (Negative perceptions about supervision have been observed in a study by?)

My answer: I have included possible reasons for variations in feedback scores in lines 174-183.

I have addressed the issue of incomplete phrases.

- Conclusion could be strengthened by providing specific recommendations based on your findings, beyond just suggesting further study of the factors influencing scores.

My answer: I have provided recommendations to address low feedback scores through implementation of interventions based on conceptual models. The exact sentences are in the discussion section in line 212-225 and lines 258-268.

- Carefully proofread for any typos, grammar issues, or awkward wording.

My answer- I have tried to address this issue to the best of my knowledge through re-reading multiple times.

- Make sure references are in correct journal format.

My answer- I have followed the PLOS ONE template now.

- Consider adding any limitations of the study design or sample.

My answer- I have included limitations now as a part of discussion.

Here are some minor language/writing issues I identified:

Some awkward phrasing, such as: "Despite the prevalence of positive perceptions with varying degrees of magnitude in different populations, prevalence of negative perceptions has also been observed in some studies." This could be smoothed out and clarified.

My answer- I have re-phrased this sentence. I have almost re-written most of the discussion section by considering the recommendations of one of the reviewers.

Typos - "infers" should be "implies", wellbeing of PGRs (extra spaces) etc. Carefully proofread.

My answer- I have replaced “infers” with “implies”, removed extra spaces before wellbeing of PGRs.

Need to be consistent with abbreviations. For example, "post-graduate" is written out sometimes and abbreviated "PG" other times. Pick one format.

My answer- I have now written post-graduate (PG) in the beginning and then followed with PG throughout the paper.

Uses words that sound too casual/conversational in academic writing like "bulk" and "infers"

My answer- Addressed.

Overall, the writing is pretty clear and understandable. Just needs minor editing for typos, consistency, and using more formal academic style. The paper would benefit from careful proofreading and having someone else review the writing. But I did not see any major scientific, methodological, or statistical issues from my review.

Reviewer #2: The idea is interesting and worth exploring. There are a few things which require further clarification.

Firstly, you need to expand on the groups which you compared between (on what basis were these groups created?).

My answer- I have now explained in the manuscript that the pathology residency is a 5-year program. Participants in the last three years were comparatively fewer in number. Therefore, I have condensed participants from 5 years of training into three groups for the purpose of the study. Group 1 is less than 2, which means year 1; Group 2 is equal to 2, which means year 2; and Group 3, which means more than 2, corresponds to trainees in years 3, 4 and 5.

Secondly, it would be interesting to see your interpretation for the differences in perceptions and experiences based on your context.

My answer- I have now included the interpretation for the differences in the opening paragraph of the Discussion section.

I would also be interested to learn of the impact this study has on your program (any policy changes?).

My answer- I have now included answer here and a sentence in the discussion section of the manuscript. I believe the differences in the perceptions and experiences relate to the different challenges associated with the milestones to be achieved at each level. I have included that in the manuscript now. Even though policy making has not been updated yet, this evidence can be used to advocate for the needed changes through updated policy. I have made the adjustments accordingly in lines 266-267.

Lastly, there is some use of informal language, a few grammatical errors, and some missing information (specifically the name of a study is missing which was alluded to in the discussion).

My Answer- I have tried to replace the names of authors and years with descriptions about the studies to which they are referred.

Reviewer #3: The article is well written. Here are a few comments:

P9, HCP: what is this abbreviation stans for?

My answer- HCP stands for “Health Care Professionals”. I have included it in the manuscript as well.

P10, inadequate in preparing residents to play their significant role: What is their significant role, could you be more specific?

My answer- I have addressed it in the manuscript. This phrase was awkwardly written. I have re-phrased it. The lines are now as: Pathology graduates and employers agree that existing pathology residency training remains inadequate in preparing the residents who are competent in clinical chemistry during their professional careers primarily due to less value placed by residents on learning clinical chemistry and lack of directors who supervise and manage labs of clinical chemistry [18].

P 11, results, first two lines: Review numbers (15: young, 15: >30)

My answer- I have edited this part. It meant a large majority of participants (15) were either less than or equal to 30 years of age, and a smaller number (3) were older than 30 years.

P 11, were female: repetitive

My answer- Deleted repetitive words.

P 12, table, Peer collaboration: missing number in neutral

My answer- Addressed. It was 1 participant.

P 15, in a study by However: missing information

Coaching and assessment vs feedback:

In my understanding, coaching and assessment are part of feedback. You cannot coach without giving feedback. I find it difficult that the residents have positive perspective regarding coaching and assessment and at the same time they report the opposite in feedback!

My answer- I agree that coaching and assessment are part of the feedback process. However, on the D-RECT scale, they are measured as separate constructs; in our data the coaching and assessment scores are high while the feedback scores are low. I have mostly re-written the entire discussion part as per the recommendations of one of the reviewers to elaborate and focus more on the construct of feedback, as this is an important finding that is consistent throughout the literature, and efforts need to be taken to improve feedback scores rather than just assessing the reasoning behind differences in these scores.

I also have concerns regarding the low number of residents particularly in pathology disciplines other than histopathology and predominantly female participants, which is acknowledged by the authors. I would advise to involve more numbers to be more representative.

My answer- I agree that the numbers of the study are not generalizable. However, this is the limitation of the study as it was only focused on pathology residency program in one point in time. As the response rate is 100%, there was no option to enroll any more residents at that time. I agree that these are alarming numbers. However, in a lower middle-income country like Pakistan, where availability and disbursement of funds is a major issue; Pathology residency program is at a disadvantaged end. Therefore, it may be considered as a pilot study if due to lack of generalizability issue, there is a publication concern. However, in future longitudinal study designs may be implemented to ensure larger sample sizes in residency programs that have fewer number of residents. Sub-specialties of pathology that have even fewer number of residents would require a separate publication to discuss low enrollment numbers in different sub-specialties of pathology.

Considering females as major representatives is alarming. However, considering the income prospects after pathology residency program in Pakistan, most of the males do not opt Pathology residency as a first choice. This is also dependent upon the lack of equipment needed to run the clinical chemistry labs. However, a separate comprehensive study would be needed to discuss these aspects.

Attachment

Submitted filename: Responses to Reviewers.docx

pone.0303534.s003.docx (18.7KB, docx)

Decision Letter 1

Ritesh G Menezes

2 Jan 2024

PONE-D-23-26192R1 Educational climate of a pathology residency program at a tertiary care hospital PLOS ONE

Dear Dr. Akhlaq,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

Please submit your revised manuscript by Feb 16 2024 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:

  • A Response to Reviewers' letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labeled 'Response to Reviewers'.

  • A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'.

  • An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: https://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols.

We look forward to receiving your revised manuscript.

Kind regards,

Prof. Ritesh G. Menezes, M.B.B.S., M.D., Diplomate N.B.

Academic Editor

PLOS ONE


Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation.

Reviewer #1: All comments have been addressed

Reviewer #2: (No Response)

Reviewer #3: All comments have been addressed

**********

2. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes

Reviewer #2: Partly

Reviewer #3: No

**********

3. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes

Reviewer #2: Yes

Reviewer #3: I Don't Know

**********

4. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: (No Response)

Reviewer #2: Yes

Reviewer #3: Yes

**********

5. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes

Reviewer #2: No

Reviewer #3: Yes

**********

6. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: (No Response)

Reviewer #2: Thank you for addressing my previous comments. There are, however, still a number of grammatical and typographical errors and use of informal language (e.g. "it's" [line 66]; "Study was conducted" instead of "The study was conducted" [line 103]; writing the words "less than" and also adding the symbol [line 171]). I believe you need to thoroughly edit the manuscript.

One of the biggest issues I have now with the review is the discussion. The originally submitted discussion was more specific and offered a rich argument. The revised discussion was too general and at times very difficult to follow. I believe this generalization took away from the strength of the original discussion.

There were also some repetitive points made (e.g. the point about positive perception except in feedback [lines 228-232] and the point about the inability to enroll additional participants).

Reviewer #3: (No Response)

**********

7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No

Reviewer #2: No

Reviewer #3: No

**********

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.

PLoS One. 2024 May 23;19(5):e0303534. doi: 10.1371/journal.pone.0303534.r004

Author response to Decision Letter 1


2 Feb 2024

In reviewing the comments by the reviewers, I found only the comments by Reviewer #2. Therefore, responses to Reviewer #2 have been included in this document.

Additionally, the Academic Editor's comment was to deposit my laboratory protocol in protocols.io. Even though my study is now beyond the protocol stage, I have registered with protocols.io and entered the abstract of this manuscript, without the results section, to reserve a DOI for it. The reserved DOI for this manuscript is:

DOI: dx.doi.org/10.17504/protocols.io.n2bvj3wk5lk5/v1 (Private link for reviewers: https://www.protocols.io/private/E5424EF9C00211EEAABF0A58A9FEAC02 to be removed before publication.)

6. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: (No Response)

Reviewer #2: Thank you for addressing my previous comments. There are, however, still a number of grammatical and typographical errors and use of informal language (e.g. "it's" [line 66]; "Study was conducted" instead of "The study was conducted" [line 103]; writing the words "less than" and also adding the symbol [line 171]). I believe you need to thoroughly edit the manuscript.

Thank you for your comprehensive feedback. First, I addressed these specific issues in lines 66, 103, and 171. Then, after adjusting the content for the other feedback points, especially with regard to the discussion section, I sent the finalized manuscript to an international professional editor for editing into standard English, so as to remove any grammatical and typographical errors and informal language.

One of the biggest issues I have now with the review is the discussion. The originally submitted discussion was more specific and offered a rich argument. The revised discussion was too general and at times very difficult to follow. I believe this generalization took away from the strength of the original discussion.

Two of the other reviewers took issue with the discussion covering all domains of the D-RECT rather than focusing on the statistically significant findings and their implications. Therefore, considering your comment about the discussion, I have restored the discussion structure and the detailed description of each D-RECT domain from the first draft, which you felt added strength to the discussion. In addition, I have retained the content requested by the other reviewers with respect to feedback and to variations in EC scores by years of training. In this way, I have tried, to the best of my abilities, to honour the feedback of all three reviewers in the final discussion section of the current manuscript version.

There were also some repetitive points made (e.g. the point about positive perception except in feedback [lines 228-232] and the point about the inability to enroll additional participants).

I have deleted all the repetitive points. Yes, there were two points in particular where the need for a longitudinal study to enroll a larger sample, given the small sample size of the current cross-sectional study, was repeated. The point is that the study was completed at one point in time. It was not possible to enroll any more residents at that time, as the response rate was 100% and new residents would only be hired into the residency program in the following year. The researcher had to complete the study within one year, as it formed part of a Master's thesis. This does not mean that additional participants can never be enrolled; however, for that purpose a longitudinal study would be more feasible.

In some qualitative studies with executive leaders, the number of participants is much smaller than a sample drawn from the general population. The same principle applies here: as we targeted only pathology residents at one tertiary care hospital, it was not possible to enroll any more participants. In other words, there were no further eligible participants, because our study was not multidisciplinary; it aimed only at analyzing the educational climate within the context of the pathology residency program.

Reviewer #3: (No Response)

Attachment

Submitted filename: Response to Reviewers.docx

pone.0303534.s004.docx (15.1KB, docx)

Decision Letter 2

Ritesh G Menezes

26 Apr 2024

Educational climate of a pathology residency program at a tertiary care hospital

PONE-D-23-26192R2

Dear Dr. Akhlaq,

We’re pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements.

Within one week, you’ll receive an e-mail detailing the required amendments. When these have been addressed, you’ll receive a formal acceptance letter and your manuscript will be scheduled for publication.

An invoice will be generated when your article is formally accepted. Please note, if your institution has a publishing partnership with PLOS and your article meets the relevant criteria, all or part of your publication costs will be covered. Please make sure your user information is up-to-date by logging into Editorial Manager at Editorial Manager® and clicking the ‘Update My Information' link at the top of the page. If you have any questions relating to publication charges, please contact our Author Billing department directly at authorbilling@plos.org.

If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they’ll be preparing press materials, please inform our press team as soon as possible -- no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

Kind regards,

Prof. Ritesh G. Menezes, M.B.B.S., M.D., Diplomate N.B.

Academic Editor

PLOS ONE

Additional Editor Comments: Kindly address the minor corrections suggested by one of the reviewers at the time of proof corrections or preferably when an opportunity to consider technical corrections is provided by the editorial office.

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation.

Reviewer #4: All comments have been addressed

**********

2. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #4: Yes

**********

3. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #4: Yes

**********

4. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #4: Yes

**********

5. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #4: Yes

**********

6. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #4: The authors have revised the manuscript well. Some suggestions are as under:

1. Page 3, Line 69- Please write [2][3] as [2,3]. Kindly do this throughout the manuscript.

2. Page 3, Line 76- Please expand PGRs as it is used here for the first time.

3. Page 3, Lines 76-79: Reference 10 is cited after references 11-13, 14, and 15. Please rectify; reference 10 should follow reference 9. Accordingly, the reference order and list need to be corrected.

**********

7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #4: No

**********

Acceptance letter

Ritesh G Menezes

13 May 2024

PONE-D-23-26192R2

PLOS ONE

Dear Dr. Akhlaq,

I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now being handed over to our production team.

At this stage, our production department will prepare your paper for publication. This includes ensuring the following:

* All references, tables, and figures are properly cited

* All relevant supporting information is included in the manuscript submission,

* There are no issues that prevent the paper from being properly typeset

If revisions are needed, the production department will contact you directly to resolve them. If no revisions are needed, you will receive an email when the publication date has been set. At this time, we do not offer pre-publication proofs to authors during production of the accepted work. Please keep in mind that we are working through a large volume of accepted articles, so please give us a few weeks to review your paper and let you know the next and final steps.

Lastly, if your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

If we can help with anything else, please email us at customercare@plos.org.

Thank you for submitting your work to PLOS ONE and supporting open access.

Kind regards,

PLOS ONE Editorial Office Staff

on behalf of

Professor Ritesh G. Menezes

Academic Editor

PLOS ONE

Associated Data

    This section collects any data citations, data availability statements, or supplementary materials included in this article.

    Supplementary Materials

    S1 Dataset. Dataset used in this study.

    This is the data file for the data on EC scores in a pathology residency program at a tertiary care hospital.

    (CSV)

    pone.0303534.s001.csv (4.1KB, csv)
    S1 Table. Test of normality Kolmogorov-Smirnov.

    This is the table for the test of normality that was conducted to confirm if data was normally distributed.

    (DOCX)

    pone.0303534.s002.docx (14.2KB, docx)
    Attachment

    Submitted filename: Responses to Reviewers.docx

    pone.0303534.s003.docx (18.7KB, docx)
    Attachment

    Submitted filename: Response to Reviewers.docx

    pone.0303534.s004.docx (15.1KB, docx)

    Data Availability Statement

    Data has been attached as a Supporting Information file.

