Royal Society Open Science. 2023 Feb 1;10(2):221093. doi: 10.1098/rsos.221093

Exploring enablers and barriers to implementing the Transparency and Openness Promotion Guidelines: a theory-based survey of journal editors

Kevin Naaman 1,2, Sean Grant 3,4, Sina Kianersi 1,5, Lauren Supplee 6, Beate Henschel 1, Evan Mayo-Wilson 1,7
PMCID: PMC9890101  PMID: 36756061

Abstract

The Transparency and Openness Promotion (TOP) Guidelines provide a framework to help journals develop open science policies. Theories of behaviour change can guide understanding of why journals do (not) implement open science policies and the development of interventions to improve these policies. In this study, we used the Theoretical Domains Framework to survey 88 journal editors on their capability, opportunity and motivation to implement TOP. Likert-scale questions assessed editor support for TOP, and enablers and barriers to implementing TOP. A qualitative question asked editors to provide reflections on their ratings. Most participating editors supported adopting TOP at their journal (71%) and perceived other editors in their discipline to support adopting TOP (57%). Most editors (93%) agreed their roles include maintaining policies that reflect current best practices. However, most editors (74%) did not see implementing TOP as a high priority compared with other editorial responsibilities. Qualitative responses expressed structural barriers to implementing TOP (e.g. lack of time, resources and authority to implement changes) and varying support for TOP depending on study type, open science standard, and level of implementation. We discuss how these findings could inform the development of theoretically guided interventions to increase open science policies, procedures and practices.

Keywords: behaviour change, journal editors, reproducibility, open science, TOP Guidelines, transparency

1. Introduction

Journals in the behavioural, social and health sciences often publish articles with results that cannot be reproduced [1-6]. Irreproducibility and false findings in the published literature might be explained partly by common detrimental research practices associated with opaque and closed research workflows [7-13]. Journal policies (i.e. ‘instructions to authors') that promote transparent and open science could reduce these detrimental research practices [14-17].

1.1. The Transparency and Openness Promotion Guidelines

The Transparency and Openness Promotion (TOP) Guidelines are a prominent framework to help journals develop and implement clear policies regarding open science [18]. As described in box 1, TOP comprises eight standards on transparency (design and analysis reporting guidelines), reproducibility (data, code and materials sharing), prospective registration (study and analysis plan preregistration), and rewarding researchers for engaging in open science (conducting replications, and citing data, code and materials). Journals might not mention an open science practice or merely encourage authors to implement the open science practice in their journal policies (Level 0). Journals that adopt TOP can: require that authors disclose whether (or not) they used an open science practice (Level 1), require that authors use an open science practice (Level 2), or require that the journal verify the transparency and reproducibility of authors' research (Level 3). Thus, the lowest threshold for adopting TOP is implementing at least one standard at Level 1 (e.g. requiring authors to disclose whether or not their data are publicly available). Journal implementation of TOP standards is a target behaviour for many influential initiatives and organizations working to increase the credibility of published results [19-21].

Box 1. The Transparency and Openness Promotion (TOP) Guidelines .

Standards

  • Citation Standards: Citation of datasets in the text and reference sections of manuscripts

  • Data Transparency: Public availability and sharing of datasets

  • Analytic Methods (Code) Transparency: Public availability and sharing of analytical (statistical) code

  • Research Materials Transparency: Public availability and sharing of other research materials

  • Design and Analysis Transparency: Transparent reporting of study design and analysis

  • Study Preregistration: Specification of study details prior to conducting the study

  • Analysis Plan Preregistration: Specification of analytical details prior to conducting the study

  • Replication: Encourages publication of replication studies

Levels of Implementation

  • Level 0 (Not Implemented): The journal encourages an open science practice or says nothing about the open science practice.

  • Level 1 (Disclosure): Published manuscripts disclose whether or not the study incorporated the open science practice.

  • Level 2 (Requirement): A study must incorporate the open science practice for the manuscript to be published.

  • Level 3 (Verification): The journal (or another independent third party) verifies that the study appropriately incorporated the open science practice according to journal standards.

Adapted from previously published grids [18].
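To make the adoption threshold concrete, the sketch below represents one hypothetical journal's policy as a vector of implementation levels (0–3) for the eight standards in box 1 and checks whether it meets the lowest threshold for adopting TOP (at least one standard at Level 1 or higher). It is written in R, the language used for our analyses; the standard names and example levels are illustrative only.

```r
# Hypothetical journal policy: implementation level (0-3) for each TOP standard
policy <- c(
  citation             = 1,  # Level 1: disclosure
  data_transparency    = 2,  # Level 2: requirement
  code_transparency    = 0,
  materials            = 0,
  design_and_analysis  = 1,
  study_prereg         = 0,
  analysis_plan_prereg = 0,
  replication          = 0
)

# Lowest threshold used in this study: at least one standard at Level 1 or higher
adopts_top <- function(levels) any(levels >= 1)

adopts_top(policy)  # TRUE for this hypothetical journal
```

A simple sum of these eight levels is not the TOP Factor shown later in figure 3, which ranges from 0 to 29 and is defined separately.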

Efforts to promote TOP implementation have seen mixed results. The Center for Open Science—which led the development and coordinates implementation of TOP—lists over 5000 signatories on their website (https://www.cos.io/initiatives/top-guidelines). Signatories include individual journals and societies, as well as large publishers of multiple journals, who have expressed ‘interest in the guidelines and commit to conducting a review within a year of the standards and levels of adoption'. This nominal approval of TOP is enabled by the growing proportion of scientists who practice open science [22] and support the specific practices in TOP, such as registration [23], replication [24], transparent reporting [25], data sharing [26-32], materials sharing [33] and code sharing [22,34,35]. In addition, journal editors increasingly support data sharing [33,36-41], and major funders are implementing requirements for data sharing [42-45] and study registration [46,47].

Becoming a TOP signatory does not always translate to implementation. In a database tracking TOP implementation, the modal journal does not implement any open science policies, and most standards in TOP are not implemented by most journals (https://www.topfactor.org/). Several independent assessments have also found low levels of TOP implementation [48-53]. Potential barriers include disinclinations toward prescriptive guidelines generally, disagreement with TOP specifically, scepticism about the outcomes of TOP implementation, time and effort required, and perceptions that TOP is not implemented or valued by peers [24,25,27,31,54-62]. Yet scant research has systematically investigated enablers and barriers to TOP implementation in a manner that would inform interventions to increase its uptake.

1.2. Using behaviour change theory to promote TOP implementation

Because TOP implementation is a behaviour, theories of behaviour change can guide research to understand why journals do or do not have open science policies [63]. Intervention development should draw upon explicit theories and approaches for identifying hypothesized pathways from candidate intervention techniques to desired behaviour changes [64-66]. For example, the Behaviour Change Wheel (BCW) provides systematic guidance on developing behaviour change interventions, based on a broad range of multidisciplinary frameworks (figure 1) [67]. The BCW is centred on the ‘COM-B' theoretical model of behaviour, which posits that the likelihood an individual will engage in a behaviour is affected by their capability, opportunity and motivation to enact that behaviour. The COM-B maps onto the Theoretical Domains Framework (TDF), which divides capability, opportunity and motivation into 14 component theoretical constructs representing potential enablers and barriers to behaviour change [68]. Formative research can use the TDF to inform theoretically guided behaviour change interventions by identifying enablers of and barriers to behaviour change. Researchers can then use the BCW to link these enablers and barriers to specific behaviour change techniques [69]. The TDF and BCW have been applied to a diverse range of behaviours, including researcher use of open science practices [70,71]; however, research has not yet used this approach to promote journal implementation of open science policies.
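As a rough illustration of how the 14 TDF domains surveyed here (box 2) relate to the three COM-B sources of behaviour, the following R snippet builds a lookup table consistent with the C/O/M labels in figure 5; the grouping is the conventional TDF-to-COM-B mapping and is shown only for orientation.

```r
# Conventional grouping of the 14 TDF domains (survey item labels) under COM-B
tdf_com_b <- data.frame(
  domain = c("Knowledge", "Skills", "Memory processes", "Behavioural regulation",
             "Social influences", "Environment",
             "Professional identity", "Beliefs in capabilities", "Optimism",
             "Intentions", "Goals", "Consequences", "Reinforcement", "Emotion"),
  source = c(rep("Capability", 4), rep("Opportunity", 2), rep("Motivation", 8))
)

table(tdf_com_b$source)  # 4 Capability, 2 Opportunity, 8 Motivation domains
```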

Figure 1.

Behaviour Change Wheel. Reproduced under a CC BY 2.0 license from Michie et al. [67].

1.3. Objective

In this formative study, we sought to explore possible enablers and barriers to TOP adoption by asking editors to complete a survey that we developed using prominent theory and previous questionnaires from implementation science. We did not seek to test any prespecified hypotheses, and we did not register a protocol for the study.

2. Methods

We used the TDF to develop an online, mixed-methods questionnaire on enablers and barriers to journal implementation of TOP. The survey, invitations and reminders are available on Open Science Framework [72]. From 15 March 2021 to 26 April 2021, we surveyed editors of journals that publish influential intervention research about their (i) perceived and actual support for TOP and (ii) capability, opportunity and motivation to implement TOP. Our research data [73], code and materials (e.g. survey, emails) are available on Open Science Framework [74].

2.1. Sampling procedures

In a previous study, we identified 10 federal evidence clearinghouses that rate the quality of evidence concerning the effectiveness of social interventions [75]. We then identified 339 journals that published at least one intervention report that a clearinghouse used to give its highest rating for quality of evidence [76]. Most eligible journals were categorized in Journal Citation Reports as social sciences, psychiatry/psychology, clinical medicine or multidisciplinary [48]. For this study, we excluded one journal that ceased publication in 2020 and one journal for which we could not find any editor contact details.

We sent 337 editors an email invitation via Qualtrics (https://qualtrics.com/) to participate in our online survey. Our invitation email described the purpose of the study and included a unique link to the survey for each editor (electronic supplementary material). Our emails also informed editors that, upon completing the survey, they would be directed to individualized reports describing their journals' current implementation of TOP and changes they could make to increase implementation [77]. If an editor did not respond to the first invitation, we sent up to two reminder emails. Because emails sent through Qualtrics might be identified as ‘spam', we used a university email account to send reminders approximately one and two weeks after the first invitation. We found seven journals with an editor-in-chief on extended leave (e.g. sabbatical) or no longer affiliated with the journal; for these journals, we contacted the next editor listed on each journal's editorial board.

2.2. Data collection

We designed our questionnaire based on previous questionnaires from implementation science using the TDF [78-83]. The first page of the survey provided editors with a brief overview of the eight modular standards in TOP and their levels of implementation. On the next page, we presented questions for editors to rate on a five-point Likert scale ranging from ‘strongly disagree' to ‘strongly agree' (box 2). The first two questions asked whether editors support adopting TOP at their journal and whether other editors in their discipline support adopting TOP. We defined adopting TOP as implementing at least one of the eight open science standards at Level 1 (Disclosure) or higher. The remaining questions assessed enablers and barriers to implementing TOP based on 14 constructs in the TDF [68]. The final page asked editors for any feedback and reflections about their responses.

Box 2. Survey Questions.

Part 1: Support for adoption of TOP Guidelines (Likert-Scale)

  • Actual support: As editor, I support adoption of the TOP Guidelines at <insert journal name>.

  • Perceived support: Other editors in my discipline support adoption of the TOP Guidelines at their respective journals.

Part 2: Enablers and barriers to implementing the TOP Guidelines (Likert-Scale)

  • Knowledge: I am familiar with the content and objectives of the TOP Guidelines.

  • Cognitive and interpersonal skills (Skills): I have the necessary skills to adopt the TOP Guidelines at <insert journal name>.

  • Memory, attention and decision processes (Memory processes): When managing a manuscript at <insert journal name>, it is easy for me to remember the specific requirements in our ‘instructions to authors' that I am supposed to enforce.

  • Behavioural regulation: I have a clear plan of how I could promote changes to ‘instructions to authors' at <insert journal name>, if I wanted to do so.

  • Social influences: Colleagues whose opinion I value would approve of <insert journal name> adopting the TOP Guidelines.

  • Environmental context and resources (Environment): <insert journal name> has the necessary editorial systems and tools to adopt the TOP Guidelines.

  • Social/professional role and identity (Professional identity): It is part of my role as editor at <insert journal name> to maintain ‘instructions for authors' that reflect current best practices.

  • Beliefs about capabilities (Beliefs in capabilities): I am confident that, if I wanted, I would be capable of leading the adoption of the TOP Guidelines at <insert journal name>.

  • Optimism: When <insert journal name> adopts new ‘instructions for authors', I usually expect positive outcomes.

  • Intentions: I intend to promote the adoption of the TOP Guidelines at <insert journal name> in the next year.

  • Goals: Compared with other editorial tasks, adopting the TOP Guidelines at <insert journal name> is a higher priority on my agenda.

  • Beliefs about consequences (Consequences): Adoption of the TOP Guidelines would benefit <insert journal name>.

  • Reinforcement: Whenever I promote changes to the ‘instructions for authors' at <insert journal name>, I receive positive recognition from colleagues who are important to me.

  • Emotion: I generally do not feel nervous or anxious about promoting adoption of the TOP Guidelines at <insert journal name>.

Part 3: Reflections (Qualitative)

  • We welcome any reflections on your responses and feedback below.

2.3. Data analysis

We analysed the 16 Likert-scale questions by counting the number of responses in each of the response categories. We visualized these results using bar charts. We narratively combined percentages for ‘strongly agree' with ‘somewhat agree' and ‘strongly disagree' with ‘somewhat disagree'. To explore potential sources of heterogeneity, we stratified the proportions of editors who agreed and disagreed with each item by whether or not their journals were listed as TOP signatories on the Center for Open Science website [84]. To explore potential non-response bias, we examined whether TOP implementation [48] and bibliometric characteristics [85,86] differed between journals of participating and non-participating editors by comparing measures of central tendency and dispersion, visualizing density distributions and histograms, and conducting Welch two-sample t-tests. For data cleaning and visualizations, we used the tidyverse [87], likert [88], ggpubr [89] and table1 [90] packages in R 4.0.1 [92] via RStudio [91]. We cleaned and processed TOP implementation data and bibliometric characteristics using the pandas [93,94] and NumPy [95] packages in Python 3.7.6 [96]. Lastly, we analysed written reflections by grouping comments into shared topics and creating topic summaries [97].
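A minimal sketch of this analysis in R, assuming hypothetical data frames `responses` (one row per participating editor, Likert items `item_*` coded 1–5) and `journals` (one row per eligible journal with a `top_factor` score and a `participated` indicator); these object and column names are illustrative and do not come from our released code.

```r
library(dplyr)

# Collapse five response options into disagree / neutral / agree,
# mirroring how 'strongly' and 'somewhat' responses are combined in the text
collapse_likert <- function(x) {
  cut(x, breaks = c(0, 2, 3, 5), labels = c("disagree", "neutral", "agree"))
}

# Proportion of editors agreeing with each item
item_summary <- responses %>%
  mutate(across(starts_with("item_"), collapse_likert)) %>%
  summarise(across(starts_with("item_"), ~ mean(.x == "agree", na.rm = TRUE)))

# Non-response bias check: Welch two-sample t-test comparing TOP implementation
# between journals of participating and non-participating editors
t.test(top_factor ~ participated, data = journals)
```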

3. Results

Of 337 eligible editors, we recruited 88 (26%) to participate in our survey (figure 2). Of invited editors, 62% (209/337) did not open the link, 11% (38/337) opened the survey but did not complete any questions, and 1% (2/337) declined to participate by emailing us. Of participating editors, 87% (77/88) answered all 16 Likert-scale questions. We did not identify evidence of non-response bias between participating and non-participating editors on TOP implementation or bibliometric characteristics of their journals based on our statistical analyses (S1) or inspections of visualized density distributions (figures 3 and 4; electronic supplementary material, figures S2–S7).

Figure 2.

Flowchart of journal editor participation in our survey.

Figure 3.

Journal TOP implementation and JCR rankings by survey participation status. The TOP Factor measures the extent that journals have implemented the TOP Guidelines into their editorial policies and has a range from 0 to 29. Our analyses of JCR metrics are based on the 2019 data. As defined by JCR [98], Total Cites is ‘the total number of times that a journal has been cited by all journals included in the database in the JCR year' (i.e. 2019). 2-year Impact Factor is ‘defined as all citations to the journal in the current JCR year to items published in the previous 2 years, divided by the total number of scholarly items (these comprise articles, reviews and proceedings papers) published in the journal in the previous two years'. The 5-year Impact Factor is ‘the average number of times articles from the journal published in the past five years have been cited in the JCR year. It is calculated by dividing the number of citations in the JCR year by the total number of articles in the five previous years'. The Article Influence Score ‘is calculated by multiplying the Eigenfactor Score by 0.01 and dividing by the number of articles in the journal, normalized as a fraction of all articles in all publications…The Eigenfactor Score calculation is based on the number of times articles from the journal published in the past five years have been cited in the JCR year, but it also considers which journals have contributed these citations so that highly cited journals will influence the network more than lesser cited journals’.
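For readers unfamiliar with these metrics, the 2-year Impact Factor definition quoted above reduces to a single division; the R function below restates it with invented citation counts.

```r
# 2-year Impact Factor: citations in the JCR year to items published in the
# previous two years, divided by the scholarly items published in those years
impact_factor_2yr <- function(cites_to_prev_2yrs, items_prev_2yrs) {
  cites_to_prev_2yrs / items_prev_2yrs
}

impact_factor_2yr(cites_to_prev_2yrs = 450, items_prev_2yrs = 150)  # 3.0 (illustrative)
```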

Figure 4.

Journal SCImago rankings by survey participation status. Our analyses of SCImago metrics are based on the 2020 data. As defined by SCImago [99], Journal Rank is the ‘average number of weighted citations received in the selected year by the documents published in the selected journal in the three previous years'. H Index is ‘the journal's number of articles (h) that have received at least h citations'. Total docs (3 years) is the number of ‘published documents in the three previous years'. Citable docs (3 years) is the ‘number of citable documents published by a journal in the three previous years'. Total cites (3 years) is the ‘number of citations received in the selected year by a journal to the documents published in the three previous years'. Cites per doc (2 years) is the ‘average citations per document in a 2-year period. It is computed considering the number of citations received by a journal in the current year to the documents published in the two previous years'.
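The h-index quoted above is also easy to compute directly; this sketch takes a vector of per-article citation counts (invented for illustration) and returns the largest h such that h articles have at least h citations each.

```r
# h-index: largest h such that h articles have received at least h citations
h_index <- function(citations) {
  sorted <- sort(citations, decreasing = TRUE)
  sum(sorted >= seq_along(sorted))
}

h_index(c(12, 9, 7, 5, 4, 2, 1, 0))  # 4: four articles each have >= 4 citations
```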

3.1. Quantitative findings

As shown in figure 5a, most participating editors support adopting TOP at their journals (71%; 62/87) and perceive that other editors in their discipline support adopting TOP at their journals (57%; 49/86). As shown in figure 5b, the degree to which editors perceived TDF domains as enablers or barriers varied (electronic supplementary material, tables S8 and S9 include the number of editors who responded to each item).

Figure 5.

Results of quantitative rating questions. Theoretical Domains Framework sources of behaviour: C = Capability; O = Opportunity; M = Motivation. Number of responses per question ranged from 79 to 87. In Part (b), ‘agree' indicates that the source of behaviour is an enabler of TOP adoption and ‘disagree' that the source of behaviour is a barrier to TOP adoption. Professional identity = Social/professional role and identity. Memory processes = Memory, attention, and decision processes. Beliefs in capabilities = Beliefs about capabilities. Skills = Cognitive and interpersonal skills. Consequences = Beliefs about consequences. Environment = Environmental context and resources.

Most respondents were editors of journals that were not signatories of TOP (73%; 64/88). We did not identify important differences in support for TOP, or in perceptions of enablers and barriers, when comparing signatories and non-signatories (figure 6).

Figure 6.

Quantitative Results Stratified by TOP Signatory Status. ‘Yes' indicates TOP signatory journals and ‘No' indicates non-signatories. ‘Agree' indicates that the source of behaviour is an enabler of TOP adoption and ‘disagree' that the source of behaviour is a barrier to TOP adoption. Theoretical Domains Framework sources of behaviour: C = Capability; O = Opportunity; M = Motivation. Number of responses per question ranged from 21 to 24 for TOP signatories and from 58 to 63 for non-signatories. The Center for Open Science stopped adding journal names to their list of signatories at the end of 2020. We collected data in March 2021, so our analysis of heterogeneity may have included more journals in the TOP signatory group if the Center for Open Science had continued updating their list of signatories. Professional identity = Social/professional role and identity. Memory processes = Memory, attention, and decision processes. Beliefs in capabilities = Beliefs about capabilities. Skills = Cognitive and interpersonal skills. Consequences = Beliefs about consequences. Environment = Environmental context and resources.

3.1.1. Capability

Most editors agreed that they have the capability to implement TOP: 78% (63/81) believe it is easy to enforce journal policies once enacted, 68% (55/81) that they have both the skills and knowledge to implement TOP and 62% (50/81) that they have a clear plan of how they could promote changes to their journal policies if desired.

3.1.2. Opportunity

Slightly more than half of editors agreed that they have the opportunity to implement TOP: 57% (46/81) believe that they have the necessary editorial systems and tools and 55% (44/80) that colleagues whose opinion they value would approve of them implementing TOP.

3.1.3. Motivation

The majority of editors agreed that part of their role is to maintain journal policies that reflect current best practices (93%; 75/81). Most editors also had confidence in their ability to facilitate TOP implementation at their journal (74%; 60/81), expected positive outcomes from journal implementation of new policies (74%; 59/80), and believed that implementing TOP would benefit their journal (64%; 52/81). Slightly more than half (52%; 42/81) were not nervous or anxious about promoting TOP. However, 74% (60/81) did not see TOP as a high priority compared with other editorial tasks, 65% (51/79) did not receive positive recognition from colleagues who are important to them when changing journal policies, and 58% (47/81) did not intend to implement TOP in the next year.

3.2. Qualitative findings

We identified several topics in written reflections (electronic supplementary material, box S10), which we received from 19% (17/88) of editors.

3.2.1. Overall support

Editors indicated overall support for open science policies and for TOP specifically, including editors of journals that had already implemented TOP or planned to do so: ‘We are in the process of announcing guidelines that include many of the TOP transparency guidelines'.

3.2.2. Differences by context and discipline

Several editors indicated varying support for TOP depending on context and discipline: ‘The TOP guidelines try to be “one size fits all”. They do not’. For example, support for TOP can vary by the type of studies to which it is being applied: ‘The main challenge is that we are eclectic with respect to the types of studies we accept. I am 100% behind the adoption of TOP for trials, but that is not all that we do. Standards for other types of studies are less well-developed. That does not mean we shouldn't do it across study designs, but as an editor I cannot be vague about requirements'. In addition, support for TOP can vary by open science practice: ‘I agree with adopting some of the TOP guidelines but not with adopting all of them'. Support for TOP also can vary by level of implementation: ‘Being transparent about whether a particular article does this [use an open science practice] is an easy choice…[but] another issue to consider is that…there is always a risk that enough information can be gathered from different sources to identify individuals'.

3.2.3. Enablers and barriers

Other topics provided elaboration on enablers and barriers examined in the questionnaire. For example, one editor explained why TOP is a low priority: ‘TOP guidelines are not a very high priority concern relative to daily priorities for running the journal…At this moment our focus has to be on daily operations and in the long run, having a diversity statement that contains measurable objectives. TOP was not even on my list until this survey'. Editors also expressed concerns about TOP implementation leading to increased demands, especially for more stringent levels of implementation: ‘moving to what TOP Guidelines define as Level 3 requires time and somehow a cultural change in our specific authors…This is something that has to be done gradually, introducing step by step new requirements for our authors'. Several editors also spoke to their organizational context, namely the role of the editorial board, sponsoring society and publisher: ‘we don't have the authority to implement these guidelines; it would go through other channels (the journal committee, the executive committee)'. The kind of influence that these other stakeholders have may vary by discipline, business model of the journal and publisher capacity: ‘We are changing publishers next year, which will make it much easier to adopt the guidelines'. Lastly, several editors said that they needed more knowledge about and skills in implementing TOP to consider it further: ‘I am new to much of this but think it potentially important and will look into it more’.

4. Discussion

We found that most participating editors support implementing at least one of the eight open science standards at Level 1 or higher, and they perceive that other editors also support some level of adoption. Consistent with previous studies [22,54,57], perceived peer support was lower than actual support, suggesting that editors might be unaware of how much community norms have shifted recently in favour of TOP. Our survey also identified several potential enablers that are linked to theories of behaviour change. Most notably, editors perceive their roles to include maintaining policies that reflect current best practices. Other enablers included the ease of enforcing journal policies once enacted, editor confidence that they could facilitate TOP implementation at their journal if they so desired, and optimism that changes in journal policies lead to positive outcomes. Conversely, we identified several barriers to implementing TOP related to motivation, the most substantial being competing priorities, lack of intention to promote TOP and limited familiarity with TOP. Qualitative responses suggest that factors outside the direct control of journal editors—i.e. limited time, resources and authority to implement changes—may be important determinants of these motivational barriers. In addition to elaborating on these enablers and barriers examined in the questionnaire, qualitative responses also indicated that editor support can vary by study type, open science standard and level of implementation. We did not identify systematic differences in enablers and barriers based on TOP signatory status.

4.1. Theoretically informed interventions to promote TOP implementation

Our findings have implications for the development of interventions to increase open science using best practices for translational research [100]. Intervention development approaches like the BCW can be used to operationally define journal adoption of TOP as a targeted behaviour for intervention. Combined with the TDF, the BCW can then be used to identify intervention techniques that address enablers and barriers to increasing the prevalence of the target behaviour (i.e. TOP adoption) among the population of interest (i.e. journals) [64,65,101,102]. For example, our findings suggest motivation is an important barrier to TOP implementation, particularly what the TDF classifies as ‘reflective motivation' represented by goals and professional identity. According to the BCW approach, techniques that target reflective motivation might include working with editors to set goals for behaviours to be achieved (e.g. at least one TOP standard implemented at Level 1). Additionally, interventions might help editors identify positive outcomes of these behaviours (e.g. increased visibility of publicly available datasets). Journal progress toward these goals could then be reviewed periodically by examining changes in the desired behaviours and outcomes. As another example from the BCW approach, techniques like personalized feedback could provide journal editors with data on TOP implementation over time, drawing attention to discrepancies between current implementation and agreed goals, as well as to TOP implementation at peer journals. Given our finding that professional identity is a substantial enabler, this feedback could also instruct editors on how to implement TOP, offer solutions for overcoming factors that might impede TOP implementation, and provide contact details for groups that can provide practical support on implementing TOP (e.g. the Center for Open Science). In the light of qualitative findings that these motivational barriers are driven by factors outside of direct journal editor control, information on advocacy with journal publishers and societies might also be beneficial to include, as the power and resources to make changes to journal policies often depend on approval from these authorities. Specifically, the BCW approach suggests that official guidelines on open science standards endorsed by these authorities, and support services offered by them, would support the aforementioned interventions [67].
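As a concrete, non-prescriptive illustration of this linking step, the snippet below pairs the main motivational barriers identified in our survey with the candidate techniques discussed in this section; formal linking would follow the behaviour change technique taxonomy [69], and these pairings are examples rather than a validated intervention specification.

```r
# Illustrative barrier-to-technique pairings (examples only)
intervention_map <- data.frame(
  barrier   = c("Goals: TOP is a low priority",
                "Intentions: no plan to adopt TOP in the next year",
                "Knowledge: limited familiarity with TOP"),
  technique = c("Goal setting with periodic review of agreed targets",
                "Personalized feedback on implementation vs. peer journals",
                "Instruction and practical support (e.g. Center for Open Science)")
)
intervention_map
```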

Our findings also indicate that interventions to implement open science journal policies should target publishers and manuscript submission systems. That is, we found that many editors do not plan to implement open science policies, and they report time as an important barrier. In an analysis related to this survey, we previously found that 335 eligible journals were affiliated with 86 publishers and 33 manuscript submission systems, and the majority of journals used the submission systems ScholarOne and Editorial Manager [103]. Working with publishers to change default options in widely used submission systems could be a scalable approach to increasing TOP implementation. Even in the absence of stringent policies, these changes could prompt authors to see open science practices as more normative. These changes would also allow submission systems to capture structured data about open science practices, which might facilitate indexing of transparency information (e.g. in databases such as PubMed) and support automated surveillance and meta-research concerning TOP [104]. Our results suggest that authors and editors might support such changes if they save time and effort.
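One hypothetical shape such structured data could take is sketched below as an R list of disclosure fields a submission system might collect; the field names and options are invented for illustration and do not describe any existing ScholarOne or Editorial Manager feature.

```r
# Hypothetical structured open science disclosure fields at submission
open_science_disclosure <- list(
  data_availability = list(status = c("public repository", "available on request",
                                      "not available"),
                           repository_doi = NA_character_),
  code_availability = list(status = c("public repository", "available on request",
                                      "not available"),
                           repository_doi = NA_character_),
  preregistration   = list(registered = c("yes", "no"),
                           registry_id = NA_character_)
)
```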

4.2. Limitations

Our study has several limitations. Firstly, because we did not ask detailed questions about support for each specific TOP standard, some editors found our questions to be overly broad, and thus difficult to answer. To encourage participation, we intentionally designed a brief questionnaire and used the lowest threshold possible for adopting TOP (i.e. Level 1 implementation of at least one open science standard). Although a more detailed survey might have further clarified specific enablers and barriers to implementing specific TOP standards, we might have received fewer complete responses. Secondly, our questionnaire has not undergone psychometric validation. Although we organized our narrative around the three components of the COM-B model, we did not assess their reliability, so we focused our interpretation on the individual items. Future iterations of this questionnaire could be compared with our results and with other studies that assessed determinants of open research practices [70].

Our results might generalize to journals like those that participated in this study. Eligible journals had to publish studies of social and behavioural interventions, so our results might not generalize to disciplines and journals that do not publish this type of research. If editors who support TOP were most likely to participate, then our results might overestimate actual support, and they might overestimate the difference between perceived and actual support. Evidence that non-participating editors did not open the survey—choosing to ignore our unsolicited email invitation—suggests that non-response bias could be limited. We also did not find evidence that current open science policies or journal characteristics differed between journals whose editors participated and those whose editors did not participate, further reducing concerns about non-response bias.

5. Conclusion

We found support for the TOP Guidelines among editors of journals publishing influential intervention research. Quantitative findings identified enablers and barriers to implementing TOP that are linked to domains and constructs from theories of behaviour change. Qualitative responses elaborated on quantitative findings and further indicated that editor support can vary by study type, open science standard and level of implementation. Our findings can be used to develop theoretically informed and scalable interventions that aim to facilitate journal implementation of open science policies. Based on the BCW approach, these interventions include goal setting, action planning, monitoring and feedback, and instruction on and support in implementing TOP.

Acknowledgments

We thank the following people for their involvement with the development and implementation of the Shiny application: Lilian Golzarri-Arroyo, Stephanie Dickinson and Hiroki Naganobori from the Biostatistics Consulting Center (BCC) in the Department of Epidemiology and Biostatistics at Indiana University School of Public Health-Bloomington and Scott Michael and Ben Fulton from Research Technologies, Indiana University. We thank David Mellor for his help conceiving the study.

Ethics

This study was reviewed by the Institutional Review Board (IRB) at Indiana University and determined to be exempt human subjects research (IRB#10201).

Data accessibility

De-identified data, statistical code for generating results and our survey instruments are available on the Open Science Framework (https://osf.io/j6ykx/).

Supplementary material is available online [105].

Authors' contributions

K.N.: data curation, formal analysis, investigation, methodology, software, visualization, writing—original draft, writing—review and editing; S.G.: conceptualization, funding acquisition, investigation, methodology, project administration, resources, supervision, writing—original draft, writing—review and editing; S.K.: data curation, formal analysis, visualization, writing—review and editing; L.S.: writing—review and editing; B.H.: formal analysis, validation, visualization, writing—review and editing; E.M.W.: conceptualization, funding acquisition, investigation, methodology, project administration, resources, supervision, writing—original draft, writing—review and editing.

All authors gave final approval for publication and agreed to be held accountable for the work performed therein.

Conflict of interest declaration

S.G. received honoraria from the Office of Planning, Research, and Evaluation (Administration for Children and Families, US Department of Health and Human Services) for speaking at their 2019 meeting on ‘Methods for Promoting Open Science in Social Policy Research'. S.G. is a Senior Research Fellow for the International Initiative for Impact Evaluation, which includes remuneration for advising on their research transparency policy. E.M.W. is on the TOP Coordinating Committee. S.G. and E.M.W. are co-developers of the TOP Statement. All other author(s) declare that there were no conflicts of interest with respect to the authorship or the publication of this article.

Funding

This work was funded by a grant from Arnold Ventures. The funders were not involved in the conduct of the study, manuscript preparation or the decision to submit the manuscript for publication.

References

  • 1.Baker M. 2016. 1,500 scientists lift the lid on reproducibility. Nature 533, 452-454. ( 10.1038/533452a) [DOI] [PubMed] [Google Scholar]
  • 2.Open Science Collaboration. 2015. Estimating the reproducibility of psychological science. Science 349, aac4716. ( 10.1126/science.aac4716) [DOI] [PubMed] [Google Scholar]
  • 3.Klein RA, Ratliff KA, Vianello M, Adams RB, Bahník Š, Bernstein MJ, Bocian K, Nosek BA. 2014. Investigating variation in replicability: a ‘many labs’ replication project. Soc. Psychol. 45, 142-152. ( 10.1027/1864-9335/a000178) [DOI] [Google Scholar]
  • 4.Camerer CF, Dreber A, Forsell E, Ho TH, Huber J, Johannesson M, Almenberg J, Wu H. 2016. Evaluating replicability of laboratory experiments in economics. Science 351, 1433-1436. ( 10.1126/science.aaf0918) [DOI] [PubMed] [Google Scholar]
  • 5.Camerer CF, Dreber A, Holzmeister F, Ho TH, Huber J, Kirchler M, Nave G, Wu H. 2018. Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015. Nat. Hum. Behav. 2, 637-644. ( 10.1038/s41562-018-0399-z) [DOI] [PubMed] [Google Scholar]
  • 6.Chang A, Li P. 2015. Is economics research replicable? Sixty published papers from thirteen journals say ‘usually not’. Finance and Economics Discussion Series 2015-083. Washington, D.C.: Board of Governors of the Federal Reserve System. See https://www.federalreserve.gov/econresdata/feds/2015/files/2015083pap.pdf [Google Scholar]
  • 7.Goodman SN, Fanelli D, Ioannidis JP. 2016. What does research reproducibility mean? Sci. Transl. Med. 8, 341ps312. ( 10.1126/scitranslmed.aaf5027) [DOI] [PubMed] [Google Scholar]
  • 8.John LK, Loewenstein G, Prelec D. 2012. Measuring the prevalence of questionable research practices with incentives for truth telling. Psychol. Sci. 23, 524-532. ( 10.1177/0956797611430953) [DOI] [PubMed] [Google Scholar]
  • 9.Dwan K, et al. 2008. Systematic review of the empirical evidence of study publication bias and outcome reporting bias. PLoS ONE 3, e3081. ( 10.1371/journal.pone.0003081) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 10.Kerr NL. 1998. HARKing: Hypothesizing after the results are known. Pers. Soc. Psychol. Rev. 2, 196-217. ( 10.1207/s15327957pspr0203_4) [DOI] [PubMed] [Google Scholar]
  • 11.Simonsohn U, Nelson LD, Simmons JP. 2014. P-curve: a key to the file-drawer. J. Exp. Psychol. Gen. 143, 534-547. ( 10.1037/a0033242) [DOI] [PubMed] [Google Scholar]
  • 12.Wicherts JM, Veldkamp CL, Augusteijn HE, Bakker M, Van Aert R, Van Assen MA. 2016. Degrees of freedom in planning, running, analyzing, and reporting psychological studies: a checklist to avoid p-hacking. Front. Psychol. 7, 1832. ( 10.3389/fpsyg.2016.01832) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 13.Simmons JP, Nelson LD, Simonsohn U. 2011. False-positive psychology: undisclosed flexibility in data collection and analysis allows presenting anything as significant. Psychol. Sci. 22, 1359-1366. ( 10.1177/0956797611417632) [DOI] [PubMed] [Google Scholar]
  • 14.Munafò MR, et al. 2017. A manifesto for reproducible science. Nat. Hum. Behav. 1, 0021. ( 10.1038/s41562-016-0021) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 15.Miguel E, et al. 2014. Promoting transparency in social science research. Science 343, 30-31. ( 10.1126/science.1245317) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 16.Laitin DD, et al. 2021. Reporting all results efficiently: a RARE proposal to open up the file drawer. Proc. Natl Acad. Sci. USA. 118, e2106178118. ( 10.1073/pnas.2106178118) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 17.Marusic A, Wager E, Utrobicic A, Rothstein HR, Sambunjak D. 2016. Interventions to prevent misconduct and promote integrity in research and publication. Cochrane Database of Syst. Rev. 4, MR000038. ( 10.1002/14651858.MR000038.pub2) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 18.Nosek BA, et al. 2015. Promoting an open research culture. Science 348, 1422-1425. ( 10.1126/science.aab2374) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 19.National Academies of Sciences, Engineering, and Medicine; Policy and Global Affairs; Board on Research Data and Information; Committee on Toward an Open Science Enterprise. 2018. Open science by design: realizing a vision for 21st century research. Washington, DC: The National Academies Press. [PubMed] [Google Scholar]
  • 20.National Academies of Sciences, Engineering, and Medicine. 2019. Reproducibility and replicability in science. Washington, DC: The National Academies Press. [PubMed] [Google Scholar]
  • 21.McNutt M. 2016. Taking up TOP. Science 348, 1147. ( 10.1126/science.aag2359) [DOI] [PubMed] [Google Scholar]
  • 22.Christensen G, Wang Z, Levy Paluck E, Swanson N, Birke D, Miguel E, Littman R. 2020. Open science practices are on the rise: The State of Social Science (3S) Survey. UC Berkeley: Center for Effective Global Action. See https://escholarship.org/uc/item/0hx0207r [DOI] [PMC free article] [PubMed]
  • 23.Svirsky MA. 2020. Editorial: preregistration and open science practices in hearing science and audiology: the time has come. Ear Hear. 41, 1-2. ( 10.1097/aud.0000000000000817) [DOI] [PubMed] [Google Scholar]
  • 24.Agnoli F, Fraser H, Singleton Thorn F, Fidler F. 2021. Australian and Italian Psychologists' view of replication. Adv. Methods Pract. Psychol. Sci. 4, 25152459211039218. ( 10.1177/25152459211039218) [DOI] [Google Scholar]
  • 25.Prager R, Gagnon L, Bowdridge J, Unni RR, McGrath TA, Cobey K, Bossuyt PM, McInnes MDF. 2021. Barriers to reporting guideline adherence in point-of-care ultrasound research: a cross-sectional survey of authors and journal editors. BMJ Evid. Based Med. 26, 188-189. ( 10.1136/bmjebm-2020-111604) [DOI] [PubMed] [Google Scholar]
  • 26.Kim Y, Stanton JM. 2016. Institutional and individual factors affecting scientists’ data-sharing behaviors: a multilevel analysis. J. Assoc. Inf. Sci. Technol. 67, 776-799. ( 10.1002/asi.23424) [DOI] [Google Scholar]
  • 27.Kim Y, Zhang P. 2015. Understanding data sharing behaviors of STEM researchers: the roles of attitudes, norms, and data repositories. Libr. Inf. Sci. Res. 37, 189-200. ( 10.1016/j.lisr.2015.04.006) [DOI] [Google Scholar]
  • 28.Anagnostou P, Capocasa M, Milia N, Sanna E, Battaggia C, Luzi D, Destro Bisol G. 2015. When data sharing gets close to 100%: what human paleogenetics can teach the open science movement. PLoS ONE 10, e0121409. ( 10.1371/journal.pone.0121409) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 29.Tenopir C, Dalton ED, Allard S, Frame M, Pjesivac I, Birch B, Pollock D, Dorsett K. 2015. Changes in data sharing and data reuse practices and perceptions among scientists worldwide. PLoS ONE 10, e0134826. ( 10.1371/journal.pone.0134826) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 30.Tenopir C, Allard S, Douglass K, Aydinoglu AU, Wu L, Read E, Manoff M, Frame M. 2011. Data sharing by scientists: practices and perceptions. PLoS ONE 6, e21101. ( 10.1371/journal.pone.0021101) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 31.Toribio-Flórez D, Anneser L, deOliveira-Lopes FN, Pallandt M, Tunn I, Windel H. 2020. Where do early career researchers stand on open science practices? A survey within the Max Planck Society. Front. Res. Metr. Anal. 5, 586992. ( 10.3389/frma.2020.586992) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 32.Baždarić K, Vrkić I, Arh E, Mavrinac M, Gligora Marković M, Bilić-Zulle L, Stojanovski J, Malički M. 2021. Attitudes and practices of open data, preprinting, and peer-review—a cross sectional study on Croatian scientists. PLoS ONE 16, e0244529. ( 10.1371/journal.pone.0244529) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 33.Lindsay DS. 2017. Sharing data and materials in psychological science. Psychol. Sci. 28, 699-702. ( 10.1177/0956797617704015) [DOI] [PubMed] [Google Scholar]
  • 34.Code share. 2014. Nature 514, 536. ( 10.1038/514536a) [DOI] [PubMed] [Google Scholar]
  • 35.Easterbrook SM. 2014. Open code for open science? Nat. Geosci. 7, 779-781. ( 10.1038/ngeo2283) [DOI] [Google Scholar]
  • 36.Bloom T, Ganley E, Winker M. 2014. Data access for the open access literature: PLOS's Data Policy. PLoS Biol. 12, e1001797. ( 10.1371/journal.pbio.1001797) [DOI] [Google Scholar]
  • 37.Grahe J. 2021. The necessity of data transparency to publish. J. Soc. Psychol. 161, 1-4. ( 10.1080/00224545.2020.1847950) [DOI] [PubMed] [Google Scholar]
  • 38.2014. Journals unite for reproducibility. Nature 515, 7. ( 10.1038/515007a) [DOI] [PubMed] [Google Scholar]
  • 39.Political Science Journal Editors. 2015. Data access and research transparency (DA-RT): A joint statement by Political Science Journal Editors. Eur. J. Polit. Res. 54, 411. ( 10.1111/1475-6765.12103) [DOI] [Google Scholar]
  • 40.Simmons LW. 2017. Guidelines for transparency and openness (TOP). Behav. Ecol. 28, 347. ( 10.1093/beheco/arx019) [DOI] [Google Scholar]
  • 41.Verdier JM, Collins SL. 2017. BioScience Signs TOP Guidelines. BioScience 67, 871. ( 10.1093/biosci/bix123) [DOI] [Google Scholar]
  • 42.National Institutes of Health. 2020. Final NIH Policy for Data Management and Sharing. See https://grants.nih.gov/grants/guide/notice-files/NOT-OD-21-013.html
  • 43.National Science Foundation. 2010. Scientists Seeking NSF Funding Will Soon Be Required to Submit Data Management Plans. See https://www.nsf.gov/news/news_summ.jsp?cntn_id=116928
  • 44.Wellcome. 2017. Data, software and materials management and sharing policy. See https://wellcome.org/grant-funding/guidance/data-software-materials-management-and-sharing-policy
  • 45.Collins FS, Tabak LA. 2014. Policy: NIH plans to enhance reproducibility. Nature. 505, 612-613. ( 10.1038/505612a) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 46.National Institutes of Health, Department of Health and Human Services. 2016. Clinical Trials Registration and Results Information Submission. 2016-22129 [Internet]. See https://www.federalregister.gov/documents/2016/09/21/2016-22129/clinical-trials-registration-and-results-information-submission
  • 47.Hudson KL, Lauer MS, Collins FS. 2016. Toward a new era of trust and transparency in clinical trials. JAMA 316, 1353-1354. ( 10.1001/jama.2016.14668) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 48.Grant S, Mayo-Wilson E, Kianersi S, Naaman K, Henschel B. 2022. Implementation of the Transparency and Openness Promotion Guidelines at Journals Publishing Social and Behavioral Intervention Research [Preprint]. MetaArXiv. ( 10.31222/osf.io/f9ptg) [DOI]
  • 49.Cashin AG, Bagg MK, Richards GC, Toomey E, McAuley JH, Lee H. 2021. Limited engagement with transparent and open science standards in the policies of pain journals: a cross-sectional evaluation. BMJ Evid. Based Med. 26, 313-319. ( 10.1136/bmjebm-2019-111296) [DOI] [PubMed] [Google Scholar]
  • 50.Hansford HJ, Cashin AG, Wewege MA, Ferraro MC, McAuley JH, Jones MD. 2021. Evaluation of journal policies to increase promotion of transparency and openness in sport science research. Arthroscopy 37, 3223-3225. ( 10.1016/j.arthro.2021.09.005) [DOI] [PubMed] [Google Scholar]
  • 51.Spitschan M, Schmidt MH, Blume C. 2021. Principles of open, transparent and reproducible science in author guidelines of sleep research and chronobiology journals. Wellcome Open Res. 5, 172. ( 10.12688/wellcomeopenres.16111.2) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 52.Schroeder SR, Gaeta L, El Amin M, Chow J, Borders JC. 2022. Evaluating research transparency and openness in communication sciences and disorders journals. PsyArXiv. ( 10.31234/osf.io/dy5zs) [DOI] [PubMed] [Google Scholar]
  • 53.Gardener AD, Hick EJ, Jacklin C, Tan G, Cashin AG, Lee H, Nunan D, Toomey EC, Richards GC. 2022. Open science and conflict of interest policies of medical and health sciences journals before and during the COVID-19 pandemic: a repeat cross-sectional study: Open science policies of medical journals. JRSM Open. 13, 20542704221132139. ( 10.1177/20542704221132139) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 54.Anderson MS, Martinson BC, De Vries R. 2007. Normative dissonance in science: results from a national survey of US scientists. J. Empir. Res. Hum. Res. Ethics 2, 3-14. ( 10.1525/jer.2007.2.4.3) [DOI] [PubMed] [Google Scholar]
  • 55.Fecher B, Friesike S, Hebing M, Linek S. 2017. A reputation economy: how individual reward considerations trump systemic arguments for open access to data. Palgrave Commun. 3, 17051. ( 10.1057/palcomms.2017.51) [DOI] [Google Scholar]
  • 56.Fraser H, Barnett A, Parker TH, Fidler F. 2020. The role of replication studies in ecology. Ecol. Evol. 10, 5197-5207. ( 10.1002/ece3.6330) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 57.Houtkoop BL, Chambers C, Macleod M, Bishop DVM, Nichols TE, Wagenmakers E-J. 2018. Data sharing in psychology: a survey on barriers and preconditions. Adv. Methods Pract. Psychol. Sci. 1, 70-85. ( 10.1177/2515245917751886) [DOI] [Google Scholar]
  • 58.Stuart D, Baynes G, Hrynaszkiewicz I, Allin K, Penny D, Lucraft M, Astell M. 2018. Whitepaper: Practical challenges for researchers in data sharing. Springer Nature. ( 10.6084/m9.figshare.5975011) [DOI]
  • 59.Saenen B, Morais R, Gaillard V, Borrell-Damián L. 2019. Research Assessment in the Transition to Open Science: 2019 EUA Open Science and Access Survey Results. European University Association. See https://eua.eu/resources/publications/888:research-assessment-in-the-transition-to-open-science.html
  • 60.Washburn AN, et al. 2018. Why do some psychology researchers resist adopting proposed reforms to research practices? A description of researchers' rationales. Adv. Methods Pract. Psychol. Sci. 1, 166-173. ( 10.1177/2515245918757427) [DOI] [Google Scholar]
  • 61.Lash TL. 2015. Declining the transparency and openness promotion guidelines. Epidemiology 26, 779-780. ( 10.1097/EDE.0000000000000382) [DOI] [PubMed] [Google Scholar]
  • 62.Lash TL. 2022. Getting over TOP. Epidemiology 33, 1-6. ( 10.1097/EDE.0000000000001424) [DOI] [PubMed] [Google Scholar]
  • 63.Norris E, O'Connor DB. 2019. Science as behaviour: using a behaviour change approach to increase uptake of open science. Psychol. Health. 34, 1397-1406. ( 10.1080/08870446.2019.1679373) [DOI] [PubMed] [Google Scholar]
  • 64.Craig P, Dieppe P, Macintyre S, Michie S, Nazareth I, Petticrew M. 2008. Developing and evaluating complex interventions: the new Medical Research Council guidance. Br. Med. J. 337, a1655. ( 10.1136/bmj.a1655) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 65.Czajkowski SM, et al. 2015. From ideas to efficacy: The ORBIT model for developing behavioral treatments for chronic diseases. Health Psychol. 34, 971-982. ( 10.1037/hea0000161) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 66.O'Cathain A, Croot L, Duncan E, Rousseau N, Sworn K, Turner KM, Yardley L, Hoddinott P. 2019. Guidance on how to develop complex interventions to improve health and healthcare. BMJ Open. 9, e029954. ( 10.1136/bmjopen-2019-029954) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 67.Michie S, van Stralen MM, West R. 2011. The behaviour change wheel: a new method for characterising and designing behaviour change interventions. Implement. Sci. 6, 42. ( 10.1186/1748-5908-6-42) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 68.Atkins L, et al. 2017. A guide to using the Theoretical Domains Framework of behaviour change to investigate implementation problems. Implement. Sci. 12, 77. ( 10.1186/s13012-017-0605-9) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 69.Michie S, Richardson M, Johnston M, Abraham C, Francis J, Hardeman W, Eccles MP, Cane J, Wood CE. 2013. The behavior change technique taxonomy (v1) of 93 hierarchically clustered techniques: building an international consensus for the reporting of behavior change interventions. Ann. Behav. Med. 46, 81-95. ( 10.1007/s12160-013-9486-6) [DOI] [PubMed] [Google Scholar]
  • 70.Norris E, Clark K, Munafo MR, Jay C, Baldwin J, Lautarescu A, Pedder H, Page M, Pennington CR. 2022. Awareness of and engagement with Open Research behaviours: Development of the Brief Open Research Survey (BORS) with the UK Reproducibility Network. MetaArXiv. ( 10.31222/osf.io/w48yh) [DOI] [Google Scholar]
  • 71.Osborne C, Norris E. 2022. Pre-registration as behaviour: developing an evidence-based intervention specification to increase pre-registration uptake by researchers using the Behaviour Change Wheel. Cogent. Psychol. 9, 2066304. ( 10.1080/23311908.2022.2066304) [DOI] [Google Scholar]
  • 72.Naaman K, Grant S, Kianersi S, Supplee L, Henschel B, Mayo-Wilson E. 2021. Research Materials. ( 10.17605/OSF.IO/ZMS8B) [DOI] [PMC free article] [PubMed]
  • 73.Mayo-Wilson E, Grant S, Kianersi S, Naaman K. 2022. TRUST Data Sets and Files. ( 10.17605/OSF.IO/3REB6) [DOI]
  • 74.Naaman K, Grant S, Kianersi S, Supplee L, Henschel B, Mayo-Wilson E. 2022. Journal Editor Survey. ( 10.17605/OSF.IO/J6YKX) [DOI] [PMC free article] [PubMed]
  • 75.Mayo-Wilson E, Grant S, Supplee L. 2021. Clearinghouse standards of evidence on the transparency and reproducibility of intervention evaluations. Prev. Sci. 23, 774-786. ( 10.1007/s11121-021-01284-x) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 76.Mayo-Wilson E, Grant S, Supplee L, Kianersi S, Amin A, DeHaven A, Mellor D. 2021. Evaluating implementation of the Transparency and Openness Promotion (TOP) guidelines: the TRUST process for rating journal policies, procedures, and practices. Res. Integr. Peer. Rev. 6, 9. ( 10.1186/s41073-021-00112-8) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 77.Naaman K, Grant S, Kianersi S, Mayo-Wilson E. 2022. Shiny Application. ( 10.17605/OSF.IO/UQD3C) [DOI]
  • 78.Huijg JM, Gebhardt WA, Crone MR, Dusseldorp E, Presseau J. 2014. Discriminant content validity of a theoretical domains framework questionnaire for use in implementation research. Implement. Sci. 9, 11. ( 10.1186/1748-5908-9-11) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 79.Huijg JM, Gebhardt WA, Dusseldorp E, Verheijden MW, van der Zouwe N, Middelkoop BJ, Crone MR. 2014. Measuring determinants of implementation behavior: psychometric properties of a questionnaire based on the theoretical domains framework. Implement. Sci. 9, 33. ( 10.1186/1748-5908-9-33) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 80.Morgan JI, Curcuruto M, Steer M, Bazzoli A. 2021. Implementing the theoretical domains framework in occupational safety: development of the safety behaviour change questionnaire. Safety Sci. 136, 105135. ( 10.1016/j.ssci.2020.105135) [DOI] [Google Scholar]
  • 81.Seward K, Wolfenden L, Wiggers J, Finch M, Wyse R, Oldmeadow C, Presseau J, Clinton-McHarg T, Yoong SL. 2017. Measuring implementation behaviour of menu guidelines in the childcare setting: confirmatory factor analysis of a theoretical domains framework questionnaire (TDFQ). Int. J. Behav. Nutr. Phys. Act. 14, 1-9. ( 10.1186/s12966-017-0499-6) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 82.Skoien W, Page K, Parsonage W, Ashover S, Milburn T, Cullen L. 2016. Use of the theoretical domains framework to evaluate factors driving successful implementation of the Accelerated Chest pain Risk Evaluation (ACRE) project. Implement. Sci. 11, 1-11. ( 10.1186/s13012-016-0500-9) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 83.Smith JD, Corace KM, MacDonald TK, Fabrigar LR, Saedi A, Chaplin A, MacFarlane S, Valickis D, Garber GE. 2019. Application of the theoretical domains framework to identify factors that influence hand hygiene compliance in long-term care. J. Hosp. Infect. 101, 393-398. ( 10.1016/j.jhin.2018.12.014) [DOI] [PubMed] [Google Scholar]
  • 84.Center for Open Science n.d. The Standards. See https://www.cos.io/initiatives/top-guidelines
  • 85.Clarivate Analytics. 2019. Journal Citation Reports®. Thomson Reuters, 2021. See https://jcr.clarivate.com/
  • 86.Scimago Journal & Country Rank. n.d. SJR: Scimago Journal & Country Rank [Portal]. Retrieved 2020 from http://www.scimagojr.com
  • 87.Wickham H, et al. 2019. Welcome to the Tidyverse. J. Open Source Softw. 4, 1686. ( 10.21105/joss.01686) [DOI] [Google Scholar]
  • 88.Bryer J, Speerschneider K. 2016. Likert: analysis and visualization Likert items. R package version 1.3.5. See https://CRAN.R-project.org/package=likert [Google Scholar]
  • 89.Kassambara A. 2020. Ggpubr: ‘ggplot2’ based publication ready plots. R package version 0.4.0. See https://CRAN.R-project.org/package=ggpubr. [Google Scholar]
  • 90.Rich B. 2021. table1: Tables of Descriptive Statistics in HTML. R package version 1.4.2. See: https://CRAN.R-project.org/package=table1
  • 91.RStudio Team. 2020. RStudio: integrated development environment for R. Version 1.3.959. Boston, MA: RStudio, PBC. See http://www.rstudio.com/ [Google Scholar]
  • 92.R Core Team. 2020. R: a language and environment for statistical computing. Version 4.0.1. Vienna, Austria: R Foundation for Statistical Computing. See https://www.R-project.org/ [Google Scholar]
  • 93.McKinney W. 2010. Data structures for statistical computing in Python. Proc. of the 9th Python in Science Conf. 43, 51-56. [Google Scholar]
  • 94.Reback J, et al. 2021. pandas-dev/pandas: Pandas. 1.3.3. ( 10.5281/zenodo.3509134) [DOI]
  • 95.Harris CR, et al. 2020. Array programming with NumPy. Nature 585, 357-362. ( 10.1038/s41586-020-2649-2) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 96.van Rossum G, Drake FL. 2009. Python 3 reference manual. Scotts Valley, CA: CreateSpace. [Google Scholar]
  • 97.Braun V, Clarke V. 2021. Thematic analysis: a practical guide. London, UK: Sage Publishing. [Google Scholar]
  • 98.Clarivate n.d. Glossary. See https://jcr.help.clarivate.com/Content/glossary.htm
  • 99.Scimago Journal & Country Rank. n.d. Help. See https://www.scimagojr.com/help.php
  • 100.Hardwicke TE, Serghiou S, Janiaud P, Danchev V, Crüwell S, Goodman SN, Ioannidis JPA. 2020. Calibrating the scientific ecosystem through meta-research. Annu. Rev. Stat. Appl. 7, 11-37. ( 10.1146/annurev-statistics-031219-041104) [DOI] [Google Scholar]
  • 101.Michie S, Carey RN, Johnston M, Rothman AJ, De Bruin M, Kelly MP, Connell LE. 2018. From theory-inspired to theory-based interventions: a protocol for developing and testing a methodology for linking behaviour change techniques to theoretical mechanisms of action. Ann. Behav. Med. 52, 501-512. ( 10.1007/s12160-016-9816-6) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 102.Hallingberg B, et al. 2018. Exploratory studies to decide whether and how to proceed with full-scale evaluations of public health interventions: a systematic review of guidance. Pilot Feasibility Stud. 4, 104. ( 10.1186/s40814-018-0290-8) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 103.Kianersi S, et al. 2022. Evaluating implementation of the Transparency and Openness Promotion Guidelines: Reliability of instruments to assess journal policies, procedures, and practices [Preprint]. MetaArXiv. ( 10.31222/osf.io/9ba3s) [DOI] [Google Scholar]
  • 104.Serghiou S, Contopoulos-Ioannidis DG, Boyack KW, Riedel N, Wallach JD, Ioannidis JPA. 2021. Assessment of transparency indicators across the biomedical literature: How open is open? PLoS Biol. 19, e3001107. ( 10.1371/journal.pbio.3001107) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 105.Naaman K, Grant S, Kianersi S, Supplee L, Henschel B, Mayo-Wilson E. 2023. Exploring enablers and barriers to implementing the Transparency and Openness Promotion (TOP) Guidelines: a theory-based survey of journal editors. Figshare. ( 10.6084/m9.figshare.c.6403446) [DOI] [PMC free article] [PubMed]


