PLOS One. 2023 Jul 12;18(7):e0287660. doi: 10.1371/journal.pone.0287660

Knowledge and motivations of training in peer review: An international cross-sectional survey

Jessie V Willis 1,2,*, Janina Ramos 1,3, Kelly D Cobey 4,5, Jeremy Y Ng 1, Hassan Khan 1,5, Marc A Albert 6,7, Mohsen Alayche 1,2, David Moher 1,5
Editor: Suhad Daher-Nashif
PMCID: PMC10337866  PMID: 37436973

Abstract

Background

Despite having a crucial role in scholarly publishing, peer reviewers do not typically require any training. The purpose of this study was to conduct an international survey on the current perceptions and motivations of researchers regarding peer review training.

Methods

A cross-sectional online survey was conducted of biomedical researchers. A total of 2000 corresponding authors from 100 randomly selected medical journals were invited via email. Quantitative items were reported using frequencies and percentages or means and SE, as appropriate. A thematic content analysis was conducted for qualitative items, in which two researchers independently assigned codes to the responses for each written-text question and subsequently grouped the codes into themes. A descriptive definition of each category was then created, and unique themes, as well as the number and frequency of codes within each theme, were reported.

Results

A total of 186 participants completed the survey, of which 14 were excluded. The majority of participants indicated they were men (n = 97 of 170, 57.1%), independent researchers (n = 108 of 172, 62.8%), and primarily affiliated with an academic organization (n = 103 of 170, 62.8%). A total of 144 of 171 participants (84.2%) indicated they had never received formal training in peer review. Most participants (n = 128, 75.7%) agreed, 41 (32.0%) of them strongly, that peer reviewers should receive formal training in peer review prior to acting as a peer reviewer. The most preferred training formats were online courses, online lectures, and online modules. Most respondents (n = 111 of 147, 75.5%) stated that difficulty finding and/or accessing training was a barrier to completing training in peer review.

Conclusion

Despite being desired, most biomedical researchers have not received formal training in peer review and indicated that training was difficult to access or not available.

Introduction

Peer review is the predominant quality control measure for scientific publishing regardless of country or discipline [1–3]. Peer review refers to the process by which “peers” are selected to assess the validity and quality of submitted manuscripts for publication [4]. Responsibilities of peer reviewers typically include providing constructive feedback to the authors of the manuscript and sometimes recommendations to journal editors [5, 6].

Despite its foothold in scholarly publishing, peer review is not a standardized process and lacks uniform guidelines [7–10]. Different scholarly publishers have different requirements and responsibilities for their peer reviewers, and peer review data is not always made public [11]. Some publishers provide guidelines and training for their peer review process; however, a 2012 study found that only 35% of selected journals provided online instructions for their peer reviewers [12, 13].

It is therefore understandable that many potential peer reviewers feel inadequately trained to peer review. This is especially true for early career researchers; a recent survey showed that 60% of those under 36 years of age felt there was a lack of guidance on how to review papers [14]. Additional studies have shown that training is highly desired by academics [15–17]. In a 2018 survey by Publons, 88% of survey respondents felt training would have a positive impact on the efficacy of peer review. Despite this, 39% of respondents had never received training and 35.8% had self-trained by reading academic literature. Most respondents believed that training should be provided by scholarly publishers or journals, and 45.5% believed that it should be a practical online course [18].

Unfortunately, the effectiveness of peer review training has been studied only via small-scale studies on non-online methods (e.g., workshops) with limited evidence of any benefit [19–22]. Our group was unable to identify any randomized controlled trials regarding how the electronic delivery of peer review guidelines has impacted the knowledge of potential peer reviewers.

In the present study we conducted a large-scale, online survey to provide an up-to-date perspective of international biomedical researchers’ views on peer review training. We focused on biomedical researchers as this is our content area and the needs and perspectives of researchers related to peer review may differ by discipline.

Methods

Transparency statement

Ethics approval was obtained from the Ottawa Health Science Network Research Ethics Board (OHSN-REB Protocol Number 20220237-01H). Participants were provided with a consent form prior to entering the survey and consent was presumed if they completed the survey. The study protocol was registered to the Open Science Framework (OSF) prior to data analysis (https://osf.io/wgxc2/) [23]. Text for this manuscript was drawn directly in reference to the registered protocol on OSF. Anonymous study data and any analytical code were shared publicly using the OSF and study findings were reported in a preprint and open access publication.

Study design

We conducted a cross-sectional online survey of biomedical researchers. The CHERRIES reporting guidelines were used to inform the reporting of our findings [24].

Participant sampling framework

We identified a random sample of international biomedical researchers who are actively publishing in peer-reviewed medical journals. We used the Scopus source list to randomly select 100 biomedical journals. The Scopus list was restricted to those journals with an All Science Journal Classification (ASJC) code of ‘Medicine’ and those that specified the journal was ‘active’ at the time of searching (November 2021). We excluded journals that indicated that they only published articles in a language other than English. Using the RAND function in Excel, we then randomly selected 100 journals from this list. Subsequently, we visited each of the randomly selected journal websites and extracted the corresponding authors from the last 20 published research articles. Corresponding author email extraction was completed on December 9, 2021. In instances where the journal was not open access and we did not have access via our academic institution, we replaced the journal with another randomly selected journal. We also replaced any journals which had non-functioning links. A total of 26 journals were replaced for not being in English (n = 7), not being active after 2020 (n = 8), having a broken link (n = 4), not being open access (n = 3), or listing no corresponding author emails (n = 4). We have used this broad approach to sampling successfully in previous research [25]. This approach enabled us to identify a population of 2000 randomly selected researchers to invite to our survey.
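The sampling procedure can be sketched in code. This is an illustrative reconstruction only, not the authors' actual workflow (they used the RAND function in Excel); the journal list and seed below are invented placeholders.

```python
import random

# Hypothetical source list standing in for the active, English-language
# Scopus 'Medicine' journals available at the time of searching.
source_list = [f"journal_{i:04d}" for i in range(5000)]

rng = random.Random(2021)  # fixed seed for reproducibility (assumption)
selected_journals = rng.sample(source_list, 100)  # 100 random journals

# Corresponding authors of the last 20 published research articles per
# journal: 100 journals x 20 authors = 2000 invitees.
invitees = [(journal, f"author_{journal}_{k}")
            for journal in selected_journals
            for k in range(20)]

print(len(selected_journals), len(invitees))  # 100 2000
```

In practice, journals failing the inclusion criteria (non-English, inactive, broken link, paywalled, no emails) would be swapped for a fresh random draw, as the paragraph above describes.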

Survey

The survey was purposefully built for this study and was administered using SurveyMonkey software (https://www.surveymonkey.ca/r/7B2JYR6). This was a closed survey and thus only available to participants invited via our sampling framework. We emailed the identified sample population a recruitment script with a link to the survey. Participation in the survey was voluntary and all data were completely anonymized. The survey was sent on May 23, 2022. We sent all participants reminder emails to complete our survey one week (May 30, 2022) and two weeks (June 6, 2022) after the original invitation. The survey was closed after three weeks (June 13, 2022).

The survey contained 37 questions: 1–10 were demographic questions about the participant, 11–15 regarded level of experience with peer review, 16–23 were opinion-based questions about peer review training, 24–33 were for respondents with experience running peer review from a journal perspective, and 34–37 were open-ended questions with comment boxes. Thirty-three of the questions were quantitative and four were qualitative. The survey questions were presented in blocks based on content and question type. The survey used adaptive questioning, whereby certain questions appeared based on the participants’ previous responses. The full list of survey questions can be found in S1 File.

The survey was created in SurveyMonkey by two authors (JVW, JR). All survey questions were reviewed and piloted by four researchers (HK, JYN, KDC, DM) and two invited researchers outside of the author list. The average time to complete the survey was estimated to be 15 minutes by pilot testers. All questions were optional and could be skipped. We offered participants the option to report their email to be entered into a draw to win one of three $100 Amazon Gift Cards. Email addresses were stored separately from the study data.

Data analysis

We used SPSS Statistics and Microsoft Excel for data analysis. We reported the overall response rate based on the number of individuals who completed our survey from the sample identified, as well as the survey completion rate (i.e., the number of people who viewed our survey and completed it). We excluded participants from data analysis if they did not complete 80% or more of the survey. We reported quantitative items using frequencies and percentages or means and SE, as appropriate. For qualitative items, we conducted a thematic content analysis of responses in Excel. For this, two researchers (JR, MAA) independently assigned codes to the responses for each written-text question. Codes were then discussed and iteratively updated until the two researchers reached consensus on the codes that best reflected the data. Following this, individual codes were independently grouped into themes by the two reviewers and finalized by consensus. We then created a descriptive definition of each category. We reported the number of unique themes and the number and frequency of codes within each theme.

Results

Protocol amendments

Survey roll-out was changed from four weeks to three weeks due to time constraints. Minor revisions were made to the survey questions, recruitment and reminder emails and consent form.

Participants

Demographics

A total of 186 participants completed the survey of the 2000 researchers invited (9.3%). There were 107 (5.4%) instances in which the email could not be delivered and 32 (1.6%) in which the participant indicated (including via auto-reply) an inability to be reached or to participate. As these accounted for less than 10% of invited participants, no changes were made to the recruitment strategy. A flowchart detailing these instances can be found in S1 Table.

The average completion rate was 92% and it took, on average, 13 minutes to complete the survey. Fourteen responses were excluded for having fewer than 80% of questions answered; thus, the final included number was 172. A total of 97 of 170 respondents (57.1%) identified as men. The survey received responses from 48 different countries, with the greatest representation from the United States (n = 41, 24.0%), the United Kingdom (n = 13, 7.6%), and India (n = 13, 7.6%). The majority of respondents identified as independent researchers, defined as assistant, associate, or full professors (n = 108 of 172, 62.8%), were primarily affiliated with an academic organization (n = 103 of 170, 62.8%), and had published more than 21 peer-reviewed articles (n = 106 of 172, 61.6%). Full demographics are described in Table 1.

Table 1. Demographic data.
Frequency Percent
Age 18–24 1 0.6
25–34 30 17.4
35–44 60 34.9
45–54 35 20.3
55–64 26 15.1
65+ 19 11.0
Total 171 99.4
Gender Man 97 56.4
Woman 73 42.4
Total 170 98.8
Occupation and/or Position Other (please specify) 22 12.8
Master’s student 10 5.8
PhD student 12 7.0
Post-doctoral fellow 14 8.1
Independent researcher (e.g., assistant/associate/full professor) 108 62.8
Research support staff (e.g., research assistant, research coordinator) 6 3.5
Total 172 100.0
Primary research interest Other (please specify) 58 33.7
Clinical 82 47.7
Pre-clinical ("Basic science") 30 17.4
Total 170 98.8
Institution Other (please specify) 9 5.2
University/college 103 59.9
Research institute 4 2.3
Healthcare institution (e.g., medical centre, hospital) 42 24.4
Private sector (e.g., pharmaceutical company) 4 2.3
Not-for-profit 1 0.6
Government organization 7 4.1
Total 170 98.8
Scholarly publishing experience < 1 year 1 0.6
1–5 years 38 22.1
6–10 years 44 25.6
11–15 years 29 16.9
16–20 years 13 7.6
21+ years 47 27.3
Total 172 100.0
Number of peer reviewed articles published to date < 2 5 2.9
3–5 16 9.3
6–10 22 12.8
11–20 23 13.4
21–50 36 20.9
51+ 70 40.7
Total 172 100.0

Experience with peer review and peer review training

In total, 144 of 171 participants (84.2%) had never received formal training in peer review. The majority answered that their primary institution did not offer peer review training (n = 108, 63.2%) or that they did not know of any training offered (n = 48, 28.1%). For those (n = 26) who had received peer review training, the most common training formats were in-person lectures (n = 12, 44.4%), online lectures (n = 10, 37.0%), or online courses of at least 6 sessions (n = 10, 37.0%). Most of the training received was provided by an academic organization (n = 18, 66.7%). Less than half (40.7%) of participants indicated the training was completed over 5 years ago.

For their first time performing peer review, 88 of 166 (53.0%) participants felt either very unprepared (10.8%), unprepared (24.1%), or slightly unprepared (18.1%). Highlighted responses about peer review and peer review training are shown in Table 2. A complete table of responses can be found in S2 Table.

Table 2. Experience with peer review and peer review training.
Frequency Percent
How many articles have you peer reviewed in the last 12 months? 0 7 4.1
1–3 41 23.8
4–6 38 22.1
6–10 23 13.4
>10 58 33.7
I have never been a peer reviewer 4 2.3
Total 171 99.4
For how many years have you been active as a manuscript peer reviewer? < 1 year 11 6.4
1–5 years 59 34.3
6–10 years 43 25.0
11–15 years 15 8.7
16–20 years 13 7.6
21 + years 28 16.3
Total 169 98.3
Have you completed any formal training in peer review? Yes 26 15.1
No 144 83.7
Unsure 1 0.6
Total 171 99.4
Does the primary institution you are affiliated with offer formal training in peer review? Yes, and I have completed it 10 5.8
Yes, but I have not completed it 5 2.9
No 108 62.8
Unsure/don’t know 48 27.9
Total 171 99.4
Type of formal peer review training completed Frequency Percent
Online lecture 10 16.4
Online course (at least 6 sessions) 10 16.4
In-person lecture 12 19.7
In-person half day workshop 2 3.3
In-person full day workshop 7 11.5
Shadowing a mentor/ghost-writing 4 6.6
Self-selected reading material 7 11.5
Online resource/module 8 13.1
Other 1 1.6
Total 61 100.0
Who provided peer review training that was completed Journal 4 12.1
Publisher (of multiple journals) 6 18.2
University/college 18 54.5
Private company 2 6.1
Unsure/don’t know 2 6.1
Other 1 3.0
Total 33 100.0
Peer review training provided by institution Online lecture 6 20.7
Online course (at least 6 sessions) 2 6.9
In-person lecture 3 10.3
Half day workshop 3 10.3
Full day workshop 5 17.2
Shadowing a mentor/ghost-writing 2 6.9
Self-selected reading material 2 6.9
Online resource/modules 4 13.8
Unsure/don’t know 1 3.4
Other 1 3.4
Total 29 100.0

Opinion-based questions

General statements on peer review and peer review training

Participants rated their agreement with statements related to peer review and peer review training on a 7-point scale from strongly disagree to strongly agree. A graph of the responses is depicted in Fig 1.

Fig 1. Participant agreement with statements based on overall experiences with peer review in the last 12 months.


Notable findings included that 148 respondents (86.5%) either strongly agreed or agreed that peer review is important for ensuring the quality and integrity of scholarly communication. One hundred and sixteen (69.5%) strongly agreed or agreed that their experience as a peer reviewer had been positive. Seventy-six (45.2%) strongly agreed or agreed that there is a lack of knowledge and understanding of how to properly conduct peer review. Ninety-nine (58.9%) strongly agreed or agreed that peer reviewers should receive formal training in peer review prior to acting as a peer reviewer for journals. Eighty-six (50.9%) strongly disagreed or disagreed that there were appropriate incentives in place to motivate them to engage in peer review.

Desired training topics, organizations and formats

These questions required participants to rank their responses in order from most to least preferred. Based on average rank position, a score was given to each response (maximum score based on the number of ranked items). A higher score corresponds to a higher ranking (i.e., more preferred). A graph of the response scores can be found in Fig 2.

Fig 2. Ranking of preferred topics, training formats, funding providers, and creating organizations.


Score calculated by average rank placement. A higher score indicates a more preferred and highly ranked response.

The topics that participants were most interested in were appraisal of study design and methodology, appraisal of the research question, and appraisal of statistics. The most desired training formats were all online, including online courses (at least 6 sessions) and online lectures. Academic institutions and scholarly publishers/journals were similarly ranked as the preferred organizations to develop peer review training. Scholarly publishers were additionally ranked as the preferred funders.
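Rank-based scores of the kind described above can be computed by awarding each item points equal to the number of ranked items minus its position (first place scores highest) and averaging across respondents. This is one plausible scheme consistent with the description, not necessarily the exact formula used in the study; the response data below are invented for illustration.

```python
from collections import defaultdict

def rank_scores(rankings):
    """Average positional score per item; a higher score = more preferred.

    rankings: one list per respondent, ordered most -> least preferred.
    An item at 0-based position p in a ranking of n items scores n - p.
    """
    totals = defaultdict(float)
    counts = defaultdict(int)
    for order in rankings:
        n = len(order)
        for pos, item in enumerate(order):
            totals[item] += n - pos
            counts[item] += 1
    return {item: totals[item] / counts[item] for item in totals}

# Invented example responses ranking three training formats.
responses = [
    ["online course", "online lecture", "in-person workshop"],
    ["online lecture", "online course", "in-person workshop"],
    ["online course", "in-person workshop", "online lecture"],
]
scores = rank_scores(responses)
# "online course" is ranked first most often, so it gets the top score.
```

Averaging (rather than summing) keeps scores comparable when not every respondent ranks every item.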

Journal-specific questions

Participants were only able to answer these questions if they indicated they worked or volunteered for a journal that publishes peer-reviewed articles. A total of 80 respondents were included in this section.

In total, 55 of 80 participants (68.8%) indicated that the journal they were affiliated with did not explicitly require peer review training for their reviewers. Eight (10.0%) indicated that it was required and provided by the journal internally, while two (2.5%) indicated that it was required but externally delivered. The most common format of training required was an online course and/or lecture. Required training length varied from 1–5 hours to 20+ hours.

Only 10 of 80 (10.0%) participants indicated that the journal assessed peer review reports of new reviewers; however, the majority (n = 51, 63.8%) indicated they were unsure or did not know. Twenty-one (26.6%) provided reporting guidelines to reviewers as part of the peer review assessment process.

Qualitative questions

There were a total of 503 comment responses to the four open-ended questions. Suggestions on how to improve the quality of peer review, other than training, included clearer standards and expectations (n = 36), improved incentives (n = 32), increased feedback and oversight (n = 28), and improved guidance (n = 19). Barriers to engaging in peer review training included difficulty finding or accessing training, including a lack of time or the cost of training offered (n = 111). Desired incentives included recognition (n = 39) and financial incentives (n = 35). Please see Table 3 for a list of themes and definitions. For the last question, which asked for any further comments, responders typically highlighted topics already listed, such as suggestions on how to improve the process (n = 17), suggestions for the implementation of training (n = 16), and the need for incentives (n = 10). The full thematic content analysis can be found in S2 File.

Table 3. Summary of qualitative survey responses.
Item Number of responses Themes (examples) Frequency of themes
Other than training, how do you believe the quality of peer review could be improved? 152 Improve incentives (official recognition for peer reviewers; monetary incentives; recognition of peer review training for providers; training provided by journals)
Description: Includes comments that refer to the need to implement more/better incentives for potential reviewers.
32
Standardization of process (standardization using templates and/or checklists; standardization through benchmarking; mandatory training certification)
Description: Includes comments that refer to strategies for standardizing the peer review process to promote structure and consistency.
11
Clear standards and expectations for peer reviewers (improved invitation system; focus reviewers toward areas of expertise; provide software for quality checking)
Description: Includes comments that emphasize the need to focus on selecting appropriate peer reviewers and have clearly defined expectations for said reviewers.
36
Improved guidance (mentoring by experienced reviewers; educate reviewers about best practices; provide example reviews from past reviewers; provide written and/or video guidance)
Description: Includes comments that focus on the importance of providing reviewers with various forms of guidance and/or examples to reference throughout the peer review process.
19
Feedback and oversight (better oversight from editors; receiving feedback on peer review comments; collaboration between reviewers during peer review; post-hoc assessment of peer review comments)
Description: Includes comments that mention the need for improved feedback mechanisms and better oversight from journal editors.
28
Self-improvement (personal studies; frequent practice and engagement; individual improvement through increased experience)
Description: Includes comments that mention the need to encourage reviewers to improve their peer review skills through practice and personal study.
9
Transparency of peer review (open peer review; make comments available to all reviewers for comparison; anonymous peer review)
Description: Includes comments that mention the need for open peer review or the need to maintain reviewer anonymity.
10
What barriers do you face in engaging in peer review training? 147 Complexity/inconsistency of peer review requirements (differences in specific journal requirements for reviews or training; lack of consensus)
Description: Includes comments that refer to the lack of clarity/consistency in peer review expectations across journals.
8
Difficult to find and/or access training (lack of time; cost of training; availability of training; lack of proper software; internet limitations; lack of international access; unaware of training)
Description: Includes comments that refer to challenges regarding the availability and/or accessibility of peer review training. For example, lack of access to training may be because courses are not offered, or because busy schedules prevent individuals from completing courses offered at a given time.
111
Lack of personal incentive (unwilling attitude; lack of statistics knowledge; lack of recognition; unclear benefit for reviewer; training is monotonous)
Description: Includes comments that refer to a lack of internal motivation or individual incentives for potential reviewers.
15
Lack of value placed on peer review/training (not prioritized by institutes; not sought out; not necessary with expertise)
Description: Includes comments that refer to the lack of value that is placed on peer review training, or peer review in general. This lack of value can be at either the individual or institutional level.
8
Other (none)
Description: Includes other comments that did not clearly fit into one of the previous themes. For example, some respondents reported that they did not face any barriers to peer review training.
5
What would incentivize you to obtain additional training in peer review best practices? 136 Financial incentives (discount for future publications; free training; payment for reviewers; financial support via funding; no monetary incentives ever)
Description: Includes comments that refer to various forms of financial incentives for potential peer reviewers, either directly related to training or related to peer review more broadly. A minority of respondents also indicated that financial incentives should never be used for peer review.
35
Recognition of training and/or peer review (official recognition of training; certification; community commitment; official recognition through research metrics; official recognition from institution; official recognition from journals)
Description: Includes comments that refer to the need to provide improved institutional/community level recognition to peer reviewers. This can either be recognizing that they completed peer review training, or by recognizing their subsequent peer review work.
39
Improve feasibility of attending training (time manageable training; accessible training)
Description: Includes comments that refer to the need to make peer review training accessible and feasible to accomplish while also attending to other professional/personal duties.
11
Translation into the peer review process (professional requirements for peer review training; high quality training; tangible payoff to training; non-academic benefits for physicians)
Description: Includes comments that refer to the need for peer review training to be of high quality and clearly beneficial for those who complete it.
27
Other (personal interest; none)
Description: Includes comments that did not clearly fit into one of the previous themes. These comments were from participants who either did not mention any incentives or explicitly stated that no incentives were needed.
24

Discussion

One hundred and eighty-six respondents completed our international survey on peer review training. Among respondents, the vast majority indicated they have never received formal training in peer review. A lack of training could therefore explain the less-than-optimal reporting quality of biomedical research [26] and the inability of reviewers to detect major errors and deficiencies in publications [17].

Limitations of our study included a lower-than-anticipated response rate. However, this is not out of line with other online surveys, particularly those conducted during the COVID-19 pandemic. In addition, most respondents to our study were well-established researchers, which was likely a result of our sampling method. Therefore, whether non-responders and early career researchers would respond similarly is unknown.

A majority of respondents either strongly agreed or agreed that peer reviewers should receive formal training. They also indicated that their preference was to receive such training online, such as through an online course or lecture. This differs from our recently conducted systematic review, which revealed that few online training options exist and that, of those that do, most consist of a brief one-hour lecture [27]. There appears to be a need to close this disconnect between what peer reviewers want and what is available.

In addition, most respondents indicated difficulty in finding and accessing training in peer review, including not being able to find the time to justify engaging in training. Furthermore, most participants indicated that neither their primary institution nor the journal they worked with required or provided training in peer review. Despite this, respondents indicated that scholarly publishers, journals, or universities are best positioned to provide this training. Given that academic institutions are training the next generation of researchers, it is surprising that a researcher’s training does not include such a fundamental part of research. Universities would likely have the expertise and resources to provide such training, given this is often where editors and editorial boards reside. As for journals and scholarly publishers, they may be less inclined to require training given the existing challenges of finding peer reviewers. While lowering this barrier is potentially important, it needs to be balanced against the potential risk of an unhelpful review and/or missed flaws.

Even if training were available, there may not be enough incentive for peer reviewers, as many respondents in the survey indicated a lack of time or personal benefit. More than a third of respondents reported that recognition for undergoing training or for peer reviewing would incentivize them to obtain additional training in peer review. Discussion surrounding incentivizing peer review is part of a broader discourse to move away from the metric of publications toward a more societal perspective and to reward behaviours that strengthen research integrity [28]. The Declaration on Research Assessment (DORA), the Coalition for Advancing Research Assessment (CoARA), and others are now advocating for formally including such activities as part of research(er) assessment [29, 30].

Scholarly peer review plays a crucial role in ensuring the quality of research manuscripts before they are published. Training peer reviewers could enhance reviewer expertise, establish guidelines and standards, and improve review quality. In addition, training could cover important topics such as mitigating bias and publication fraud. We implore stakeholders in peer review to focus future efforts in creating an open and accessible training program in peer review.

Supporting information

S1 File. Complete list of survey questions delivered through SurveyMonkey.

(PDF)

S2 File. Full thematic content analysis data.

(XLSX)

S1 Table. Flowchart of instances of failure of email delivery to intended survey participants.

(PDF)

S2 Table. Full survey response data.

(PDF)

Data Availability

The study protocol was registered to the Open Science Framework (OSF) prior to data analysis (https://osf.io/wgxc2/) [23]. Text for this manuscript was drawn directly in reference to the registered protocol on OSF. Anonymous study data and any analytical code were shared publicly using the OSF and study findings were reported in a preprint and open access publication.

Funding Statement

The author(s) received no specific funding for this work.

References

  • 1. Tennant JP, Dugan JM, Graziotin D, Jacques DC, Waldner F, Mietchen D, et al. A multi-disciplinary perspective on emergent and future innovations in peer review. F1000Research. 2017;6:1151. doi: 10.12688/f1000research.12037.3
  • 2. Burnham JC. The evolution of editorial peer review. JAMA. 1990;263(10):1323.
  • 3. Nicholas D, Watkinson A, Jamali HR, Herman E, Tenopir C, Volentine R, et al. Peer review: still king in the digital age. Learned Publishing. 2015;28(1):15–21.
  • 4. Rowland F. The peer-review process. Learned Publishing. 2002;15(4):247–58.
  • 5. Glonti K, Cauchi D, Cobo E, Boutron I, Moher D, Hren D. A scoping review on the roles and tasks of peer reviewers in the manuscript review process in biomedical journals. BMC Medicine. 2019;17(1). doi: 10.1186/s12916-019-1347-0
  • 6. Glonti K, Boutron I, Moher D, Hren D. Journal editors’ perspectives on the roles and tasks of peer reviewers in biomedical journals: a qualitative study. BMJ Open. 2019;9(11). doi: 10.1136/bmjopen-2019-033421
  • 7. Kelly J, Sadeghieh T, Adeli K. Peer review in scientific publications: benefits, critiques, and a survival guide. EJIFCC. 2014;25(3):227–43.
  • 8. Smith R. Peer review: a flawed process at the heart of science and journals. Journal of the Royal Society of Medicine. 2006;99(4):178–82. doi: 10.1177/014107680609900414
  • 9. Horbach SP, Halffman W. The changing forms and expectations of peer review. Research Integrity and Peer Review. 2018;3(1).
  • 10. Superchi C, González JA, Solà I, Cobo E, Hren D, Boutron I. Tools used to assess the quality of peer review reports: a methodological systematic review. BMC Medical Research Methodology. 2019;19(1). doi: 10.1186/s12874-019-0688-x
  • 11. Squazzoni F, Grimaldo F, Marušić A. Publishing: journals could share peer-review data. Nature. 2017;546(7658):352. doi: 10.1038/546352a
  • 12. Song E, Ang L, Park J-Y, Jun E-Y, Kim KH, Jun J, et al. A scoping review on biomedical journal peer review guides for reviewers. PLOS ONE. 2021;16(5). doi: 10.1371/journal.pone.0251440
  • 13. Hirst A, Altman DG. Are peer reviewers encouraged to use reporting guidelines? A survey of 116 health research journals. PLoS ONE. 2012;7(4). doi: 10.1371/journal.pone.0035621
  • 14. Mulligan A, Hall L, Raphael E. Peer review in a changing world: an international study measuring the attitudes of researchers. Journal of the American Society for Information Science and Technology. 2012;64(1):132–61.
  • 15. Ho RC-M, Mak K-K, Tao R, Lu Y, Day JR, Pan F. Views on the peer review system of biomedical journals: an online survey of academics from high-ranking universities. BMC Medical Research Methodology. 2013;13(1). doi: 10.1186/1471-2288-13-74
  • 16. Benos DJ, Bashari E, Chaves JM, Gaggar A, Kapoor N, LaFrance M, et al. The ups and downs of peer review. Adv Physiol Educ. 2007;31(2):145–52. doi: 10.1152/advan.00104.2006
  • 17. Patel J. Why training and specialization is needed for peer review: a case study of peer review for randomized controlled trials. BMC Medicine. 2014;12(1). doi: 10.1186/s12916-014-0128-z
  • 18. Publons. Publons’ Global State of Peer Review 2018. 2018. https://publons.com/static/Publons-Global-State-Of-Peer-Review-2018.pdf
  • 19. Bruce R, Chauvin A, Trinquart L, Ravaud P, Boutron I. Impact of interventions to improve the quality of peer review of biomedical journals: a systematic review and meta-analysis. BMC Medicine. 2016;14(1). doi: 10.1186/s12916-016-0631-5
  • 20. Schroter S, Black N, Evans S, et al. Effects of training on quality of peer review: randomised controlled trial. BMJ. 2004;328(7441):673. doi: 10.1136/bmj.38023.700775.AE
  • 21. Callaham ML, Wears RL, Waeckerle JF. Effect of attendance at a training session on peer reviewer quality and performance. Ann Emerg Med. 1998;32:318–22.
  • 22. Galipeau J, Moher D, Campbell C, Hendry P, Cameron DW, Palepu A, et al. A systematic review highlights a knowledge gap regarding the effectiveness of health-related training programs in journalology. Journal of Clinical Epidemiology. 2015;68(3):257–65. doi: 10.1016/j.jclinepi.2014.09.024
  • 23. Willis JV, Ramos J, Cobey KD, et al. Knowledge and motivations of training in peer review: a protocol for an international cross-sectional survey. Epub ahead of print 2022 Sep 3. doi: 10.17605/OSF.IO/WGXC2
  • 24. Eysenbach G. Improving the quality of Web surveys: the Checklist for Reporting Results of Internet E-Surveys (CHERRIES). J Med Internet Res. 2004;6(3):e34. doi: 10.2196/jmir.6.3.e34
  • 25. Cobey KD, Monfaredi Z, Poole E, Proulx L, Fergusson D, Moher D. Editors-in-chief perceptions of patients as (co)authors on publications and the acceptability of ICMJE authorship criteria: a cross-sectional survey. Res Involv Engagem. 2021;7(1):39. doi: 10.1186/s40900-021-00290-1
  • 26. Kleinert S, Horton R. How should medical science change? Lancet. 2014;383(9913):197–8. doi: 10.1016/S0140-6736(13)62678-1
  • 27. Willis JV, Cobey KD, Ramos J. Online training in manuscript peer review: a systematic review. 2022.
  • 28. Moher D, Bouter L, Kleinert S, Glasziou P, Sham MH, Barbour V, et al. The Hong Kong Principles for assessing researchers: fostering research integrity. PLoS Biol. 2020;18(7):e3000737. doi: 10.1371/journal.pbio.3000737
  • 29. American Society for Cell Biology. DORA: Declaration on Research Assessment [Internet]. Available from: http://www.ascb.org/dora/
  • 30. Coalition for Advancing Research Assessment (CoARA). Available from: https://coara.eu/

Decision Letter 0

Suhad Daher-Nashif

7 Feb 2023

PONE-D-22-31384

Knowledge and motivations of training in peer review: an international cross-sectional survey

PLOS ONE

Dear Dr. Willis,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

Please submit your revised manuscript by Mar 24 2023 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:

  • A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labeled 'Response to Reviewers'.

  • A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'.

  • An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: https://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols.

We look forward to receiving your revised manuscript.

Kind regards,

Suhad Daher-Nashif, MSc., PhD

Academic Editor

PLOS ONE

Journal Requirements:

When submitting your revision, we need you to address these additional requirements.

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at

https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and

https://journals.plos.org/plosone/s/file?id=ba62/PLOSOne_formatting_sample_title_authors_affiliations.pdf

2. Please amend your current ethics statement to address the following concerns:

a) Did participants provide their written or verbal informed consent to participate in this study?

b) If consent was verbal, please explain i) why written consent was not obtained, ii) how you documented participant consent, and iii) whether the ethics committees/IRB approved this consent procedure.

3. Your ethics statement should only appear in the Methods section of your manuscript. If your ethics statement is written in any section besides the Methods, please delete it from any other section.

Additional Editor Comments:

Dear Dr. Willis,

We've now received the reviewers' comments and recommendations.  

Kindly address each comment in a table, revise your manuscript accordingly with highlighting the changes in red, and re-submit a modified version.

Warmest regards,

Dr Suhad Daher-Nashif

[Note: HTML markup is below. Please do not edit.]

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Partly

Reviewer #2: Yes

**********

2. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes

Reviewer #2: Yes

**********

3. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes

Reviewer #2: Yes

**********

4. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes

Reviewer #2: Yes

**********

5. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: Thank you for the opportunity to review this article on the current perceptions of biomedical researchers on peer review. However, whilst it is an interesting read and provides an overview of current perceptions on this topic, in my opinion, there are a number of issues with the article that need addressing, particularly with the discussion. The article feels mismatched in the aims and discussion – with the aim stated as describing an updated perspective on current perceptions of biomedical researchers on peer review; and the discussion reading more as an opinion piece focussing mainly on triangulating findings from this study with a different pre-print review by the same authors as well as recommendations for professionalising peer review. I am not disagreeing with what is being said but it feels that the current findings are being somewhat overstated and I would welcome more interpretation of the survey findings themselves. I have some specific comments for consideration that I believe could improve the manuscript:

Abstract.

Sentence - Most respondents (n = 108, 62.8%) were independent researchers of an academic organization (n = 103, 62.8%) with greater than 21 peer-reviewed articles published (n = 106, 61.6%). This sentence is confusing and it is not clear what the numbers are referring to.

Introduction

Although it is clear why one discipline is being explored, it would be helpful to understand why the field of biomedical research was chosen as this is not explicitly described in the introduction.

Methods.

It would be helpful to know how many potential journals were in the original list, and how many were replaced. Could this be added as a flow chart? Also, it would be helpful to understand why inclusion criteria (e.g. being open access or having access) were not applied to the journal list before randomisation happened (presumably this is due to the number of journals that there were).

The discussion highlights that the findings from this study match closely to the findings from a pre-print review. Can the authors provide further details on how the survey was developed, including whether the responses to the questionnaire were informed by the findings of this review?

Can authors also reflect on how they mitigated against any potential biases in developing responses to the closed questions or coding of data in the open questions.

Results

Again, this sentence is not clear - The majority (n = 108, 62.8%) were independent researchers defined as assistant, associate or full professors of an academic organization (n = 103, 62.8%) with greater than 21 peer-reviewed articles published (n = 106, 61.6%).

Sentence - For the 27 participants that had received peer review training, … Should this be 26?

Table 2 does not show all of the responses described in the section ‘experience with peer review and peer review training’. It should be made clear that all data can be found in supplemental material and that Table 2 does not present all data.

Figure 2 is very confusing. Some description on how to read this graph would be beneficial.

Sentence - Eight (10.0%) indicated that it was required and provided by the journal internally, while two (2.5%) indicated that it was required by externally delivered. Typo: should this be "but" not "by"?

The ‘qualitative section’ does not adequately present the open question data. None of the themes arising from the open questions are described. A paragraph/summary of the themes would be beneficial, especially if the authors want to only present the top 3 themes by frequency in the table, although I would recommend expanding table 3 to include all of the themes. This table also needs a heading.

The table of themes and frequencies needs further detail explaining that the numbers refer to the frequencies overall and for specific themes.

Discussion

Much of the discussion is given over to mapping a potential path to professionalising peer review. Whilst I agree in principle with much of what is said, it feels out of place here and the findings from the current study being overstated. In particular, the recommendations from paragraph 6 (starting Third…) do not seem to be based on the findings at all.

Furthermore, potential biases and increases in online participation from the pandemic have not been addressed. For example, it is suggested that online training is highly desired. However, the questionnaire responses did not provide the same options for online and face to face and so it might be that lectures and courses are preferred over workshops regardless of whether they are online or face to face.

In addition, the authors cite that a large barrier to receiving training was the limited availability and accessibility of training material. In the context of the other data this reads that there are limited training courses; however, the data show that over 50% of responses were due to the researcher not having the time to engage in these courses. This could perhaps be broken down and expanded on.

It is not clear why Appendix 1 is provided. I would recommend this is removed.

Also Supplementary material 2 and 3 show the same information.

Reviewer #2: Well-written article. The survey questionnaire addressed pertinent issues and the analysis is appropriately performed. The discussion is concise and to the point.

The low response rate has been admitted as a limitation. However, the geographical distribution is biased naturally towards countries with lots of publications. Perhaps the picture is worse in less developed regions.

**********

6. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No

Reviewer #2: No

**********

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.

PLoS One. 2023 Jul 12;18(7):e0287660. doi: 10.1371/journal.pone.0287660.r002

Author response to Decision Letter 0


7 Apr 2023

(Please see response to reviewer doc for colour-coded formatting).

Journal Requirements:

When submitting your revision, we need you to address these additional requirements.

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at

https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and

https://journals.plos.org/plosone/s/file?id=ba62/PLOSOne_formatting_sample_title_authors_affiliations.pdf

This has been done.

2. Please amend your current ethics statement to address the following concerns:

a) Did participants provide their written or verbal informed consent to participate in this study?

b) If consent was verbal, please explain i) why written consent was not obtained, ii) how you documented participant consent, and iii) whether the ethics committees/IRB approved this consent procedure.

This has been done.

3. Your ethics statement should only appear in the Methods section of your manuscript. If your ethics statement is written in any section besides the Methods, please delete it from any other section.

The ethics statement only appears in the Methods section.

Reviewer #1: Thank you for the opportunity to review this article on the current perceptions of biomedical researchers on peer review. However, whilst it is an interesting read and provides an overview of current perceptions on this topic, in my opinion, there are a number of issues with the article that need addressing, particularly with the discussion. The article feels mismatched in the aims and discussion – with the aim stated as describing an updated perspective on current perceptions of biomedical researchers on peer review; and the discussion reading more as an opinion piece focussing mainly on triangulating findings from this study with a different pre-print review by the same authors as well as recommendations for professionalising peer review. I am not disagreeing with what is being said but it feels that the current findings are being somewhat overstated and I would welcome more interpretation of the survey findings themselves. I have some specific comments for consideration that I believe could improve the manuscript:

We thank the reviewer for their insightful comments. We have provided individual responses below. We have made large changes to the discussion based on these recommendations.

Abstract.

Sentence - Most respondents (n = 108, 62.8%) were independent researchers of an academic organization (n = 103, 62.8%) with greater than 21 peer-reviewed articles published (n = 106, 61.6%). This sentence is confusing and it is not clear what the numbers are referring to.

This has been changed to “The majority of participants indicated they were men (n = 97 of 170, 57.1%), independent researchers (n = 108 of 172, 62.8%), and primarily affiliated with an academic organization (n = 103 of 170, 62.8%).”

Introduction

Although it is clear why one discipline is being explored, it would be helpful to understand why the field of biomedical research was chosen as this is not explicitly described in the introduction.

We chose biomedical research as this is our content area. The authorship team are at various stages of medical training and/or are faculty members in faculties of medicine, so survey participants in biomedicine may find us credible in this discipline. Similarly, some members of the authorship team have a long history of examining the quality of reporting of biomedical research, which is related to the conduct of peer review. We have added a brief explanation to the Introduction section.

Methods.

It would be helpful to know how many potential journals were in the original list, and how many were replaced. Could this be added as a flow chart? Also, it would be helpful to understand why inclusion criteria (e.g. being open access or having access) were not applied to the journal list before randomisation happened (presumably this is due the to the number of journals that there were).

We have added a breakdown of the reasons for replacement. A total of 26 journals were excluded. We do not believe a flowchart is necessary as there was only a single level of exclusion. There is no way on Scopus to filter the journals by open access, or by access via our institution specifically, prior to randomization; it was therefore more feasible to filter them out afterwards.

  • No corresponding author name or email listed (n = 4)

  • Journal not in English (n = 7)

  • No journal website provided from Scopus link/broken link (n = 4)

  • Journal link dead/inactive after 2020/incorrect (n = 8)

  • Journal is subscription-based and could not be accessed via uOttawa (n = 3)

We have added this to the Methods section.

The discussion highlights that the findings from this study match closely to the findings from a pre-print review. Can the authors provide further details on how the survey was developed, including whether the responses to the questionnaire were informed by the findings of this review?

The survey was not informed directly by the findings of this review as they were developed concurrently. As stated in the Methods section: The survey was purpose built for this study and administered using SurveyMonkey (https://www.surveymonkey.ca/r/7B2JYR6) software.

Can authors also reflect on how they mitigated against any potential biases in developing responses to the closed questions or coding of data in the open questions.

For coding of data in the open questions, assignment of codes was done independently by two researchers. Following this, individual codes were independently grouped into themes. At both stages, this was finalized through discussion between the two researchers until consensus was reached. This is explained under the “Data Analysis” section of the manuscript.

We are unsure what is meant by mitigating against potential biases for the closed questions.

Results

Again, this sentence is not clear - The majority (n = 108, 62.8%) were independent researchers defined as assistant, associate or full professors of an academic organization (n = 103, 62.8%) with greater than 21 peer-reviewed articles published (n = 106, 61.6%).

This has been changed to “The majority of respondents identified as independent researchers defined as assistant, associate, or full professors (n = 108 of 172, 62.8%), were primarily affiliated with an academic organization (n = 103 of 170, 62.8%), and had published more than 21 peer-reviewed articles (n = 106 of 172, 61.6%).”

Sentence - For the 27 participants that had received peer review training, … Should this be 26?

Thank you for noticing this mistake. This has been corrected.

Table 2 does not show all of the responses described in the section ‘experience with peer review and peer review training’. It should be made clear that all data can be found in supplemental material and that Table 2 does not present all data.

This has been added.

Figure 2 is very confusing. Some description on how to read this graph would be beneficial.

We have reworked this graph and provided a description.

Sentence Eight (10.0%) indicated that it was required and provided by the journal internally, while two (2.5%) indicated that it was required by externally delivered. Typo on but not by?

This typo has been corrected. It should have been “but” not “by”.

The ‘qualitative section’ does not adequately present the open question data. None of the themes arising from the open questions are described. A paragraph/summary of the themes would be beneficial, especially if the authors want to only present the top 3 themes by frequency in the table, although I would recommend expanding table 3 to include all the themes. This table also needs a heading.

The table of themes and frequencies needs further detail explaining that the numbers refer to the frequencies overall and for specific themes.

We have significantly expanded on the Qualitative section and reworked the table. Definitions of each theme are provided in the table.

Discussion

Much of the discussion is given over to mapping a potential path to professionalising peer review. Whilst I agree in principle with much of what is said, it feels out of place here and the findings from the current study being overstated. In particular, the recommendations from paragraph 6 (starting Third…) do not seem to be based on the findings at all.

We have reworked/rewritten the discussion section to address this concern.

Furthermore, potential biases and increases in online participation from the pandemic have not been addressed. For example, it is suggested that online training is highly desired. However, the questionnaire responses did not provide the same options for online and face to face and so it might be that lectures and courses are preferred over workshops regardless of whether they are online or face to face.

We are unclear on this point. As illustrated in Figure 2, online and in-person options were provided for the question “Which training format would you most prefer?”.

In addition, the authors cite that a large barrier to receiving training was the limited availability and accessibility of training material. In the context of the other data this reads that there are limited training courses, however, the data show that over 50% of responses were due to the researcher not having the time to engage in these courses. This could perhaps be broken down and expanded on.

We have made the definition of the theme clearer in the table, the Qualitative section, and the Discussion section, and we have expanded on this point in both the Qualitative and Discussion sections.

It is not clear why Appendix 1 is provided. I would recommend this is removed.

Also Supplementary material 2 and 3 show the same information.

Appendix 1 and Supplementary material 3 have been removed.

Reviewer #2: Well-written article. The survey questionnaire addressed pertinent issues and the analysis is appropriately performed. The discussion is concise and to the point.

The low response rate has been admitted as a limitation. However, the geographical distribution is biased naturally towards countries with lots of publications. Perhaps the picture is worse in less developed regions.

Thank you.

Attachment

Submitted filename: Response to Reviewers.docx

Decision Letter 1

Suhad Daher-Nashif

26 Apr 2023

PONE-D-22-31384R1

Knowledge and motivations of training in peer review: an international cross-sectional survey

PLOS ONE

Dear Dr. Willis,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

Please submit your revised manuscript by Jun 09 2023 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:

  • A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labeled 'Response to Reviewers'.

  • A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'.

  • An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: https://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols.

We look forward to receiving your revised manuscript.

Kind regards,

Suhad Daher-Nashif, MSc., PhD

Academic Editor

PLOS ONE

Journal Requirements:

Please review your reference list to ensure that it is complete and correct. If you have cited papers that have been retracted, please include the rationale for doing so in the manuscript text, or remove these references and replace them with relevant current references. Any changes to the reference list should be mentioned in the rebuttal letter that accompanies your revised manuscript. If you need to cite a retracted article, indicate the article’s retracted status in the References list and also include a citation and full reference for the retraction notice.

Additional Editor Comments:

Thank you for submitting your revised manuscript to PLOS ONE. You addressed most of the comments, but there is still a need for minor revisions in order to be able to make a final decision on your manuscript.

These comments were made by one of the reviewers:

Figure 2 is now easier to read; however, the description incorrectly identifies the most desired training formats as all online. According to the new figure 2, the online resources/modules were ranked 5th after in person half day workshop and in person lecture.

"The most desired training formats were all online, including online lectures, online courses (at least 6 sessions), online lectures, and online resources or modules."

The discussion now relates much more to the study findings but ends quite abruptly. I wonder whether a concluding statement would be beneficial; however, I do appreciate that this may be personal preference.

Appendix 2 needs to be renamed Appendix 1; similar for supplemental 4 to 3.


Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation.

Reviewer #1: (No Response)

Reviewer #2: All comments have been addressed

**********

2. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes

Reviewer #2: Yes

**********

3. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes

Reviewer #2: Yes

**********

4. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes

Reviewer #2: Yes

**********

5. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes

Reviewer #2: Yes

**********

6. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: Thank you for responding to all of the comments. A couple of very minor points.

Figure 2 is now easier to read; however, the description incorrectly identifies the most desired training formats as all online. According to the new figure 2, the online resources/modules were ranked 5th after in person half day workshop and in person lecture.

"The most desired training formats were all online, including online lectures, online courses (at least 6 sessions),

online lectures, and online resources or modules."

The discussion now relates much more to the study findings but ends quite abruptly. I wonder whether a concluding statement would be beneficial; however, I do appreciate that this may be personal preference.

Appendix 2 needs to be renamed Appendix 1; similar for supplemental 4 to 3.

Reviewer #2: Thank you for addressing all the responses. I have no concerns about the revised version. I am happy with this version.

**********

7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No

Reviewer #2: No

**********

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.


Author response to Decision Letter 1


6 Jun 2023

We thank the reviewer for their follow-up comments. We have edited the statement and removed online resources and modules. We have added a concluding paragraph.

Appendices and supplemental materials have been renamed and removed from text where they do not exist anymore.

Attachment

Submitted filename: Response to Reviewers 2.docx

Decision Letter 2

Suhad Daher-Nashif

12 Jun 2023

Knowledge and motivations of training in peer review: an international cross-sectional survey

PONE-D-22-31384R2

Dear Dr. Willis,

We’re pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements.

Within one week, you’ll receive an e-mail detailing the required amendments. When these have been addressed, you’ll receive a formal acceptance letter and your manuscript will be scheduled for publication.

An invoice for payment will follow shortly after the formal acceptance. To ensure an efficient process, please log into Editorial Manager at http://www.editorialmanager.com/pone/, click the 'Update My Information' link at the top of the page, and double check that your user information is up-to-date. If you have any billing related questions, please contact our Author Billing department directly at authorbilling@plos.org.

If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they’ll be preparing press materials, please inform our press team as soon as possible -- no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

Kind regards,

Suhad Daher-Nashif, MSc., PhD

Academic Editor

PLOS ONE

Acceptance letter

Suhad Daher-Nashif

19 Jun 2023

PONE-D-22-31384R2

Knowledge and motivations of training in peer review: an international cross-sectional survey

Dear Dr. Willis:

I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now with our production department.

If your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information please contact onepress@plos.org.

If we can help with anything else, please email us at plosone@plos.org.

Thank you for submitting your work to PLOS ONE and supporting open access.

Kind regards,

PLOS ONE Editorial Office Staff

on behalf of

Dr. Suhad Daher-Nashif

Academic Editor

PLOS ONE

Associated Data

    This section collects any data citations, data availability statements, or supplementary materials included in this article.

    Supplementary Materials

    S1 File. Complete list of survey questions delivered through SurveyMonkey.

    (PDF)

    S2 File. Full thematic content analysis data.

    (XLSX)

    S1 Table. Flowchart of instances of failure of email delivery to intended survey participants.

    (PDF)

    S2 Table. Full survey response data.

    (PDF)

    Attachment

    Submitted filename: Response to Reviewers.docx

    Attachment

    Submitted filename: Response to Reviewers 2.docx

    Data Availability Statement

    The study protocol was registered to the Open Science Framework (OSF) prior to data analysis (https://osf.io/wgxc2/) [23]. Text for this manuscript was drawn directly in reference to the registered protocol on OSF. Anonymous study data and any analytical code were shared publicly using the OSF, and study findings were reported in a preprint and open access publication.

