Abstract
Registered reports are a publication format that involves peer reviewing studies both before and after carrying out research procedures. Although registered reports were originally developed to combat challenges in quantitative and confirmatory study designs, today registered reports are also available for qualitative and exploratory work. This article provides a brief primer that aims to help researchers choose, design, and evaluate registered reports that are driven by qualitative methods.
Keywords: guidelines, open science, transparency, qualitative research, registered reports
Introduction
Registered reports (“RRs”, Chambers & Tzavella, 2022) are a modification of the scientific publication process that aims to shift publication decisions away from the nature of a study’s results and towards rigorous conceptualization and design. In so doing, RRs improve transparency and the timeliness of peer review. The FORRT (2022) glossary defines RRs as a two-stage process, where peer review occurs at study design (Stage 1) and, if ‘in-principle accepted’ (IPA), at study completion (Stage 2). While RRs were originally developed for research following the hypothetico-deductive method (Chambers, 2013), today they are also open to exploratory and qualitative studies (Branney et al., 2022). In fact, some qualitative RRs are already emerging (e.g., Stage 1: Topor et al., 2022) and at least two have already been published (Stage 2: Karhulahti et al., 2022; Xiao, 2022).
As RRs are likely to be unfamiliar to those using qualitative methods, this is a brief primer on choosing, designing, and evaluating RRs for qualitative research. This primer is principally for people who design and conduct research using qualitative methods, although it may also be useful for editors and for undergraduate and graduate students. As a rule of thumb, we propose the following: the more qualitative researchers can commit to research decisions before the study, the more they will benefit from using the RR format (Figure 1). In other words, not all qualitative designs benefit equally from the RR format, but some may benefit significantly.
Figure 1. The benefits of registered reports (RRs) increase with researchers’ ability to commit to decisions in advance.
Choosing RRs for qualitative research
Frontloading peer review
Choosing to carry out a research project in the RR format allows one to receive peer review feedback in the design phase. This can be especially helpful for early career researchers and for teams with limited qualitative expertise. Feedback alone can be a good reason for choosing the RR format if the study timeline allows. Projects carried out under severe time pressure should consider carefully whether to use the RR format, as thorough review of the study design usually takes time (see Evans et al., 2023). On the other hand, RRs submitted via Peer Community in Registered Reports (PCI RR) can schedule their review to save time, or even start generating data before completing Stage 1 by lowering the reported control level (PCI RR, 2022).
Negotiation…
…with fellow researchers and stakeholders
In frontloading the design, an RR gives legitimacy to researchers and stakeholders spending time transparently negotiating the study design, including its uncertainties and delayed decision making. This can mean that the design is collaborative and that each person involved has agency in decision making. For example, in Button et al.’s (2020) model for consortium-based empirical undergraduate dissertations, the academics collaboratively write the study protocol and main hypothesis; once the new academic year commences, the dissertation students join the consortium and collaboratively develop the protocol further, with each student being responsible for adding one secondary hypothesis. While qualitative studies are unlikely to have testable hypotheses, this grassroots training model can be used in qualitative RRs to negotiate better research questions and study designs in ways that give those involved agency.
…with journals or review platforms
The RR format allows researchers to negotiate research decisions with editors before carrying out the study. As journals and review platforms have many explicit and implicit policies regarding the studies they publish, using the RR format allows authors to pursue an agreement with them at Stage 1. This can save valuable time and resources compared to traditional publication processes, which sometimes produce month- or year-long (desk) rejection loops.
Transparency
Due to the more flexible nature of decision-making in qualitative (vs. quantitative) research, authors may need to apply for permission to make changes between Stage 1 and Stage 2. Although this is generally against the philosophy of hypothesis-testing RRs and increases editorial labor, it is also a robust means of documenting the expected and unexpected events in the research process.
Designing qualitative RRs
Analysis
Methods of analysis should be described in enough detail that editors and reviewers understand what will be done (Stage 1) and what was done (Stage 2). Researchers should also clearly explain how they plan to report the data and results. In qualitative studies, methods tend to include flexible elements (see Haven & Van Grootel, 2019). Such elements should be identified and planned for; for example, if researchers cannot decide between multiple alternative analyses, rules of decision making can be stated instead (e.g., if X, then we do Y). If biases are of epistemological concern, teams can also apply masked analyses (Dutilh et al., 2021).
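To illustrate, the following minimal sketch (in Python) shows one way such “if X, then Y” decision rules could be written down alongside a Stage 1 protocol so that reviewers can later check which branch was taken. The conditions, thresholds, and actions are hypothetical illustrations, not rules recommended by this primer or used in any of the cited RRs.

```python
# Hypothetical pre-specified decision rules ("if X, then we do Y") that could
# accompany a Stage 1 protocol. All contents are illustrative assumptions.
DECISION_RULES = [
    ("if fewer than 15 participants complete both interview rounds",
     "we analyse the first-round interviews only"),
    ("if 15 or more participants complete both interview rounds",
     "we add a comparison of first- and second-round accounts"),
    ("if a participant withdraws consent after analysis has begun",
     "we remove their data and document the removal in the Stage 2 report"),
]

# Render the rules as sentences that can be pasted into the protocol
# and checked against at Stage 2.
for condition, action in DECISION_RULES:
    print(f"{condition.capitalize()}, then {action}.")
```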
Data access & stewardship
Qualitative research can include a wide range of different data types such as audio, images, videos, and written text. All data types are suitable for RRs. Qualitative data are likely to present ‘legitimate sensitivities’ to participants’ privacy (Branney et al., 2019), even through the ‘innocent collection of details’ (Branney et al., 2017). Therefore, it is important to outline and negotiate data sharing plans at an early stage, and RRs facilitate this process (Karhulahti, 2022a). Two current trends contribute to the importance of these efforts.
The first trend is the digitization of research, together with data protection legislation, which means that data can have a short life (Fang et al., 2013) and can become impossible to access without clear consent and data sharing agreements between all organizations involved (in contrast, for example, to reading through letters or documents in a filing cabinet, where researchers held only a single copy of their data yet still shared it; see Craig & Reese, 1973). Organizations, for example, may be unable to grant access to the archaeologists of the future. The second trend is the range of data sharing and open data policies of research bodies, such as funders and professional bodies (see Riley et al., 2019). Since 2013, the UK Economic and Social Research Council has had a policy that data should be available for reuse within three months of grant completion (ESRC, 2013). In turn, the British Psychological Society has an ‘as open as possible; as closed as necessary’ position statement on open data (BPS, 2020).
It is less common for qualitative studies, especially in human research, to share data openly. Nonetheless, data sharing is also possible for qualitative studies and should be encouraged whenever possible (for an overview, see DuBois et al., 2018). The Qualitative Data Repository was developed for precisely this purpose, but authors can also consult national and international repositories for assistance with sharing procedures. Anonymization (or pseudonymization), consent procedures, and controlling data reuse are preferably carried out in collaboration with archives and experts (Karhulahti, 2022b). Clear reasons should be provided if data are non-shareable. Notably, some qualitative data are not collected but rather generated (e.g., interviews), in which case the researchers’ own involvement should be recognized, for instance, via positionality statements.
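As a small, hedged illustration of one piece of such stewardship, the Python sketch below shows how interview transcripts might be pseudonymized before sharing by replacing names with participant codes held in a separate, securely stored key. The names, codes, and function are hypothetical; any real procedure should be agreed with archives, ethics committees, and data protection experts, as noted above.

```python
import re

# Hypothetical researcher-maintained key, stored separately from shared data.
PSEUDONYM_KEY = {
    "Alice Smith": "P01",
    "Bob Jones": "P02",
}

def pseudonymize(text: str, key: dict) -> str:
    """Replace each known name with its participant code."""
    for name, code in key.items():
        text = re.sub(re.escape(name), code, text)
    return text

transcript = "Interviewer: Alice Smith, could you describe a typical day?"
print(pseudonymize(transcript, PSEUDONYM_KEY))
# Output: "Interviewer: P01, could you describe a typical day?"
```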
Ethics approval and IRBs
Different countries and universities apply different processes for reviewing research ethics, and each Stage 1 submission should communicate how it meets the standards of the relevant country or institution. Where ethics approval is needed, it should generally be applied for before Stage 1 submission, but parallel and post-IPA approvals are also negotiable in the case of inflexible committees (see Figure 3 in Chambers & Tzavella, 2022; also PCI RR, 2022). Authors can discuss the process with journal (or PCI RR) representatives when applying for approval, being explicit to each party about the amendments recommended by the other. If an ethics approval sets critical limits on a planned design, authors can also consult RR representatives before submission and report those limits in their Stage 1 proposal to minimize misunderstandings in peer review.
Hypotheses
It is uncommon (but not impossible) for qualitative research designs to test hypotheses. On the other hand, it is also possible to set hypotheses without testing them. These “qualitative hypotheses” (QHs) can be used in a similar way as positionality statements, i.e., to report the team’s prior beliefs and hypothetical biases, which can affect the study design and its procedures. So far, at least two RRs (Karhulahti et al., 2022; Topor et al., 2022) have used qualitative hypotheses. Unlike positionality statements, QHs are based on previous data, literature, and theory, which have influenced the study design and may influence data interpretation, thus being akin to ‘priors’ in Bayesian statistical designs (see Andrews & Baguley, 2017).
Positionality
Because qualitative research is often highly interpretive and reflective, it is important to disclose the position(s) from which interpretations and reflections are made. It may be useful to have separate positionality statements for data generation (e.g., interviews) and analysis (e.g., coding). The APA Journal Article Reporting Standards for qualitative research, for example, recommend describing the researchers, how their perspectives were managed in relation to the methodological integrity of the data collection and analysis, and their understanding of the conclusions (APA, 2020; Levitt et al., 2018).
Research questions
As qualitative research is usually nonconfirmatory, one of the most important parts of study design is the formulation of useful research questions. Specific types of qualitative data and analyses are often suitable for answering only specific types of research questions, so authors should carefully assess these relationships in their design. In general, good research questions ensure that the produced answers will contribute to the field.
Sample (size)
There are no universal sampling rules for qualitative studies, but samples should always be justified. Different justifications apply to different types of data and methods. When justifying the nature and size of a sample, authors might also apply, for example, saturation, where the analytic process defines the sample during data generation (e.g., Low, 2019). In such cases, however, it can still be useful to preregister estimates, which facilitates editorial work and increases transparency.
Evaluating qualitative RRs
Evaluating cost-benefit (Stage 1)
Because successful RRs receive in-principle acceptance after Stage 1, one of the distinctive features of evaluating them is assessing whether the study is worth carrying out. Importantly, this is not the same as “impact”, but rather the degree to which a study can contribute, given its resources. The value of the expected findings needs to be assessed against current knowledge: do the findings have the potential to produce useful contributions to knowledge or improve the understanding of a phenomenon? In most cases, this means carefully evaluating the research questions and their match with the generated data as well as the applied analyses. Because many authors are pre-committed to asking specific research questions (e.g., for funding bodies), even small contributions can be worth pursuing. Primarily, reviewers should help authors maximize the contribution potential of their resources.
Evaluating interpretations (Stage 2)
As authors must clearly spell out at Stage 1 how they will report the data and results, reviewers should start by assessing whether the plan was followed and, if not, whether the deviations are justified. A key difference between qualitative and other RRs is the former’s strong affiliation with interpretive and reflective methodology. This means that reviewers have less control over the production of results compared to, for example, statistically driven studies. Because interpretations may not always be fully reproducible (recall positionality), reviewers should balance their assessment across ensuring that authors communicate their interpretations clearly, back up the interpretations with data, and draw reasonable conclusions from the analysis (see Josselson, 2004).
Evaluating language (Stage 1 & 2)
Qualitative studies, which are typically exploratory by nature, should rarely use confirmatory language. Likewise, as qualitative studies less commonly reach high generalizability, language should be used with care when discussing the findings in general contexts. Reviewers should pay special attention to the language of discussion and conclusions at Stage 2.
Evaluating data/materials (Stage 1 & 2)
Peer reviewers should generally be provided secure access to data at Stage 2. Materials used or produced in the analysis should be included as an appendix to the shared data whenever possible; for instance, coding manuals and documentation for establishing reliability/trust should always be shared (when applicable). That said, data sharing is not a binary question: usually parts of the data can be shared whereas other parts (e.g., those with potential personal identifiers) cannot (Syed, 2022). These dimensions must be assessed case by case. Agreeing upon a clear plan for data and material sharing at Stage 1 and assessing adherence to it at Stage 2 are an important part of the evaluation process.
Use cases and discussion
Registered reports are becoming increasingly popular across fields, and many of their benefits also apply to qualitative methods. So far, our own experiences as authors and evaluators (Topor et al., 2022; Xiao, 2022) serve as practical use cases that have demonstrated the potential of registered reports for qualitative methods. As authors (Karhulahti et al., 2022), we witnessed a significant improvement of our research design at Stage 1 thanks to valuable review feedback. The registered report format did not conflict with the flexibility characteristic of qualitative research; we were also allowed to make non-registered choices, such as reporting themes through a case format that turned out to be optimal after analysis. As evaluators (Karhulahti et al., 2022; Topor et al., 2022; Xiao, 2022), we witnessed exceptional motivation to support the authors, because feedback at Stage 1 can have a fundamental positive effect on the study design and the quality of the research; this cannot happen to the same degree in publication formats that are reviewed only after the study has been carried out. In sum, based on our experience, registered reports allow authors and evaluators to “play in the same team,” of which the present primer is also a practical example: authors and their evaluators have teamed up for a follow-up collaboration.
We hope this brief primer is helpful for authors, editors, and reviewers involved in qualitative registered reports, and we look forward to updating these recommendations along with our accumulating experience and knowledge.
Ethics and consent
Ethical approval and consent were not required.
Funding Statement
This project has received funding from the European Research Council (ERC) under the European Union’s Horizon Europe research and innovation programme (grant agreement No 101042052); and the Academy of Finland (312397).
The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.
Data availability
No data are associated with this article.
References
- Andrews M, Baguley T: Bayesian data analysis. In: The Cambridge encyclopedia of child development. Edited by B. Hopkins, E. Geangu and S. Linkenauger. Cambridge: Cambridge University Press, 2017; 165–169. 10.1017/9781316216491.030
- APA: APA Style JARS: Journal Article Reporting Standards. JARS–Qual: Table 1. Information Recommended for Inclusion in Manuscripts That Report Primary Qualitative Research. American Psychological Association, 2020.
- Branney P, Brooks J, Kilby L, et al.: Three Steps to Open Science for Qualitative Research in Psychology. PsyArXiv. 2022.
- Branney P, Reid K, Frost N, et al.: A context-consent meta-framework for designing open (qualitative) data studies. Qualitative Research in Psychology. 2019;16(3):483–502. 10.1080/14780887.2019.1605477
- Branney P, Woolhouse M, Reid K: The ‘innocent collection of details’ and journal requests to make qualitative datasets public post-consent: Open access data, potential author response and thoughts for future studies. QMiP Bulletin. 2017;23:19–23.
- British Psychological Society: Position statement: Open data. British Psychological Society, 2020.
- Button KS, Chambers CD, Lawrence N, et al.: Grassroots Training for Reproducible Science: A Consortium-Based Approach to the Empirical Dissertation. Psychology Learning and Teaching. 2020;19(1):77–90. 10.1177/1475725719857659
- Chambers CD: Registered reports: a new publishing initiative at Cortex. Cortex. 2013;49(3):609–610. 10.1016/j.cortex.2012.12.016
- Chambers CD, Tzavella L: The past, present and future of Registered Reports. Nat Hum Behav. 2022;6(1):29–42. 10.1038/s41562-021-01193-7
- Craig JR, Reese SC: Retention of raw data: A problem revisited. American Psychologist. 1973;28(8):723. 10.1037/h0035667
- DuBois JM, Strait M, Walsh H: Is it time to share qualitative research data? Qual Psychol. 2018;5(3):380–393. 10.1037/qup0000076
- Dutilh G, Sarafoglou A, Wagenmakers EJ: Flexible yet fair: Blinding analyses in experimental psychology. Synthese. 2021;198(Suppl 23):5745–5772. 10.1007/s11229-019-02456-7
- ESRC [Economic and Social Research Council]: ESRC open access to research outputs. ESRC, London, 2013.
- Evans TR, Branney P, Clements A, et al.: Improving evidence-based practice through preregistration of applied research: Barriers and recommendations. Account Res. 2023;30(2):88–108. 10.1080/08989621.2021.1969233
- Fang X, Liu Sheng OR, Goes P: When Is the Right Time to Refresh Knowledge Discovered from Data? Operations Research. 2013;61(1):32–44. 10.1287/opre.1120.1148
- FORRT [Framework for Open and Reproducible Research Training]: Registered Reports. Glossary, 2022.
- Haven TL, Van Grootel DL: Preregistering qualitative research. Account Res. 2019;26(3):229–244. 10.1080/08989621.2019.1580147
- Josselson R: The hermeneutics of faith and the hermeneutics of suspicion. Narrative Inquiry. 2004;14(1):1–28. 10.1075/ni.14.1.01jos
- Karhulahti VM: Registered reports for qualitative research. Nat Hum Behav. 2022a;6(1):4–5. 10.1038/s41562-021-01265-8
- Karhulahti VM: Reasons for Qualitative Psychologists to Share Human Data. Br J Soc Psychol. 2022b. 10.1111/bjso.12573
- Karhulahti VM, Siutila M, Vahlo J, et al.: Phenomenological Strands for Gaming Disorder and Esports Play: A Qualitative Registered Report. Collabra: Psychology. 2022;8(1):38819. 10.1525/collabra.38819
- Levitt HM, Bamberg M, Creswell JW, et al.: Journal article reporting standards for qualitative primary, qualitative meta-analytic, and mixed methods research in psychology: The APA Publications and Communications Board task force report. Am Psychol. 2018;73(1):26–46. 10.1037/amp0000151
- Low J: A pragmatic definition of the concept of theoretical saturation. Sociological Focus. 2019;52(2):131–139. 10.1080/00380237.2018.1544514
- PCI RR [Peer Community in Registered Reports]: Author Guidelines. 2022.
- Riley S, Brooks J, Goodman S, et al.: Celebrations amongst challenges: Considering the past, present and future of the qualitative methods in psychology section of the British Psychology Society. Qualitative Research in Psychology. 2019;16(3):464–482. 10.1080/14780887.2019.1605275
- Syed M: Three myths about open science that just won’t die. PsyArXiv. 2022. 10.31234/osf.io/w8xs2
- Topor M, Armstrong G, Gentle J: Through the lens of Developmental Coordination Disorder (DCD): experiences of a late diagnosis. In principle acceptance of Version 4 by Peer Community in Registered Reports. 2022.
- Xiao L: Breaking Ban: Belgium’s ineffective gambling law regulation of video game loot boxes. Stage 2 Registered Report. Acceptance of Version 2 by Peer Community in Registered Reports. 2022. 10.31219/osf.io/hnd7w