Version Changes
Revised. Amendments from Version 1
In this second version of the article, several small revisions have been made based on the feedback of two reviewers. In particular, we have removed Figure 1, which did not add new information, and revised several sentences to improve factual accuracy. Additionally, we have updated the qualitative terminology and ensured that the examples are not limited to particular qualitative traditions. Finally, we have clarified and expanded the use cases section.
Abstract
Registered reports are a publication format in which studies are peer reviewed both before and after the research procedures are carried out. Although registered reports were originally developed to combat challenges in quantitative and confirmatory study designs, they are today also available for qualitative and exploratory work. This article provides a brief primer that aims to help researchers choose, design, and evaluate registered reports driven by qualitative methods.
Keywords: guidelines, open science, transparency, qualitative research, registered reports
Introduction
Registered reports (“RRs”; Chambers & Tzavella, 2022) are a modification of the scientific publication process that aims to shift publication decisions away from the nature of a study’s results and towards rigorous conceptualization and design. In so doing, RRs improve transparency and the timeliness of peer review. The FORRT (2022) glossary defines RRs as a two-stage process, where peer review occurs at study design (Stage 1) and, if ‘in-principle accepted’ (IPA), again at study completion (Stage 2). While RRs were originally developed for research following the hypothetico-deductive method (Chambers, 2013), today they are also open to exploratory and qualitative studies (Branney et al., 2022). In fact, some qualitative RRs are already emerging (e.g., Stage 1: Topor et al., 2022) and at least two have already been published (Stage 2: Karhulahti et al., 2022; Xiao, 2022).
As RRs are likely to be unfamiliar to those using qualitative methods, this article is a brief primer on choosing, designing, and evaluating RRs for qualitative research. It is principally for people who design and conduct research using qualitative methods, although it may also be useful for editors and for undergraduate and graduate students. As a rule of thumb, we propose the following: the more qualitative researchers can commit to research decisions before the study, the more benefits they will reap from using the RR format. In other words, not all qualitative designs benefit equally from the RR format, but some may benefit significantly.
Choosing RRs for qualitative research
Frontloading peer review
Choosing to carry out a research project in the RR format allows one to receive peer review feedback in the design phase. This feedback alone can be a good reason for choosing the RR format if the study timeline allows. Projects carried out under severe time pressure should carefully consider whether they wish to use the RR format, as reviewing the study design usually takes time (see Evans et al., 2023). On the other hand, RRs submitted via Peer Community in Registered Reports (PCI RR) can schedule their review to save time, or even start generating data before completing Stage 1 by lowering the study’s reported level of bias control (PCI RR, 2022).
Negotiation…
…with fellow researchers and stakeholders
In frontloading the design, an RR gives legitimacy to researchers and stakeholders spending time transparently negotiating the study design, including uncertainties and delayed decision making. This could mean that the design is collaborative and each person involved has agency in decision making. For example, in Button et al.’s (2020) model for consortium-based empirical undergraduate dissertations, the academics collaboratively write the study protocol and main research question; once the new academic year commences, the dissertation students join the consortium and collaboratively develop the protocol further. This process, developed in the context of quantitative research, is just as applicable to qualitative research. Qualitative RRs allow editors, reviewers, and authors to negotiate research questions and the study design, empowering all those involved.
…with journal or review platform
The RR format allows authors to negotiate research decisions with editors before carrying out the study. As journals and review platforms have many explicit and implicit policies regarding the studies they publish, the RR format allows pursuing an agreement with them at Stage 1. This can save valuable time and resources compared to traditional publication processes, which sometimes produce months- or even years-long (desk) rejection loops. Again, this process was developed with quantitative methods in mind, but it is equally relevant to qualitative methods.
Transparency
Due to the flexible nature of decision-making in qualitative research, authors may often need to apply for permission to make changes between Stage 1 and Stage 2. Although such changes may also be needed when problems are encountered in hypothesis-testing RRs, in qualitative RRs this process is a natural means of documenting the expected and unexpected events in the research process.
Designing qualitative RRs
Analysis
Methods of analysis should be described in such detail that editors and reviewers understand what will be done (Stage 1) and what was done (Stage 2). Researchers should also clearly explain how they plan to report the data and findings. In qualitative studies, methods tend to include flexible elements (see Haven & Van Grootel, 2019). Such elements should be identified and planned for; for instance, if researchers cannot decide between multiple alternative analyses, rules of decision making can be stated instead (e.g., if X, then we do Y). If biases are of epistemological concern, it is also possible for teams to apply masked analyses (Dutilh et al., 2021).
Data access & stewardship
Qualitative research can include a wide range of data types, such as audio, images, videos, and written text. All data types are suitable for RRs. Qualitative data are likely to present ‘legitimate sensitivities’ to participants’ privacy (Branney et al., 2019), even through the ‘innocent collection of details’ (Branney et al., 2017). Therefore, it is important to outline and negotiate data sharing plans at an early stage, and RRs facilitate this process (Karhulahti, 2022a). Two current trends contribute to the importance of these efforts.
The first trend is the digitization of research, together with data protection legislation, which means that data can have a short life (Fang et al., 2013) and may be impossible to access without clear consent and data sharing agreements between all organizations involved (in contrast to, for example, letters or documents in a filing cabinet, where researchers had only a single copy of their data but could still share it; see Craig & Reese, 1973). Organizations, for example, may be unable to grant access to archaeologists of the future. The second trend is the range of data sharing and open data policies of research bodies, such as funders and professional bodies (see Riley et al., 2019). From 2013, the UK Economic and Social Research Council had a policy that data should be available for reuse within three months of grant completion (ESRC, 2013). In turn, the British Psychological Society has an ‘as open as possible; as closed as necessary’ position statement on open data (BPS, 2020).
Due to the various pragmatic difficulties of anonymizing qualitative datasets and the significant amount of labor typically involved in such processes, it is less common for qualitative studies to share data openly, especially in human research. On the other hand, some qualitative approaches, such as conversation analysis, have a long open data tradition (see Joyce et al., 2022), and data sharing is also possible for other types of qualitative studies (for an overview, see DuBois et al., 2018). The Qualitative Data Repository was developed for precisely this purpose, but authors can also consult national and international repositories for assistance with sharing procedures. Anonymization (or pseudonymization), consent procedures, and controlling data reuse are preferably carried out in collaboration with archives and experts (Karhulahti, 2022b). Clear reasons should be provided if data are non-shareable. Notably, some qualitative data are not collected but rather generated (e.g., interviews), in which case the researchers’ own involvement should be recognized, for instance, via positionality statements.
Ethics approval and IRBs
Different countries and universities apply different processes for reviewing research ethics, and each Stage 1 submission should communicate how it meets the standards of the relevant country or institution. When needed, ethics approval for RRs should generally be obtained before Stage 1 submission, but parallel and post-IPA approvals are also negotiable in case the committee is inflexible (see Figure 3 in Chambers & Tzavella, 2022; also PCI RR, 2022). Authors can discuss with journal (or review platform) representatives while applying for approval, being explicit with each party about the amendments recommended by the other process. If an ethics approval sets critical limits on a planned design, authors can also consult RR representatives before submission and report those limits in their Stage 1 proposal to minimize misunderstandings in peer review.
Hypotheses
It is uncommon (but not impossible) for qualitative research designs to test hypotheses. On the other hand, it is also possible to set hypotheses without testing them. These “qualitative hypotheses” (QHs) can be used in a similar way as positionality statements, i.e., to report the team’s prior beliefs and hypothetical biases, which can affect the study design and its procedures. So far, at least two RRs (Karhulahti et al., 2022; Topor et al., 2022) have used qualitative hypotheses. Unlike positionality statements, QHs are based on previous data, literature, and theory, which have influenced the study design and may influence data interpretation, thus being akin to ‘priors’ in Bayesian statistical designs (see Andrews & Baguley, 2017).
Positionality
Because qualitative research is often highly interpretive and reflective, it is important to disclose the position(s) from which the interpretations and reflections are made. It may be useful to have separate positionality statements for data generation (e.g., interviews) and analysis (e.g., coding). The APA Journal Article Reporting Standards for qualitative research, for example, recommend describing the researchers, how their perspectives contributed to the methodological integrity of the data collection and analysis, and their understanding of the conclusions (APA, 2020; Levitt et al., 2018).
Research questions
As qualitative research is usually nonconfirmatory, one of the most important parts of study design is the formation of useful research questions. Specific types of qualitative data and analyses are often suitable for producing answers only to specific types of research questions, so authors should carefully assess these relationships in their design. In general, good research questions ensure that the produced answers will contribute to the field.
Sample (size) and participants
There are no universal sampling rules for qualitative studies, but the selected participants or other data should always be justified. Different justifications apply to different types of data and methods. When justifying the nature and size of a sample, authors might also apply, for example, saturation, where the analytic process defines the sample during data generation (e.g., Low, 2019). In such cases, however, it can still be useful to preregister sample estimates, which facilitate editorial work and increase transparency.
Evaluating qualitative RRs
Evaluating cost-benefit (Stage 1)
Successful RRs receive in-principle acceptance after Stage 1, which means that one of the special features in evaluating them is assessing whether the study is worth carrying out (in contrast to evaluating completed work for which authors seek a publisher). Importantly, this is not the same as “impact”, but rather the degree to which a study can contribute, given the available resources. The value of the expected findings needs to be assessed against current knowledge: e.g., do the findings have the potential to produce useful contributions to knowledge or improve the understanding of a phenomenon? In most cases, this means carefully evaluating the research questions and their match with the generated data as well as the applied analyses. Many authors are committed to asking specific research questions (e.g., for funding bodies, community partnerships, etc.), for which even small contributions can be worth pursuing. Primarily, reviewers should help authors to maximize the contribution potential of their resources.
Evaluating interpretations (Stage 2)
As authors must clearly spell out at Stage 1 how they will report the data and results, reviewers should start by assessing whether the plan was followed and, if not, whether the deviations were pre-approved or otherwise justified. A key difference between qualitative and other RRs is the former’s strong affiliation with interpretative and reflective methodology. This means that reviewers have less control over the production of findings compared to, e.g., statistically driven studies. Because interpretations may not always be fully reproducible (recall positionality), reviewers balance their assessment between ensuring that authors communicate their interpretations clearly, back up the interpretations with data, and draw reasonable conclusions from the analysis (see Josselson, 2004).
Evaluating language (Stage 1 & 2)
Qualitative studies, which are typically exploratory by nature, should rarely use confirmatory language. Likewise, as qualitative studies less commonly reach high generalizability, or even seek to generalize at all, language should be used with care when discussing the findings in general contexts. Reviewers should pay special attention to the language of discussion and conclusions at Stage 2.
Evaluating data/materials (Stage 1 & 2)
Following the BPS principle of ‘as open as possible, as closed as necessary’ (BPS, 2020), peer reviewers should generally be provided secure access to data at Stage 2. Materials used or produced in the analysis should be included as an appendix to the shared data whenever possible; for instance, coding manuals and documentation for establishing reliability/trust should always be shared (when applicable). Supplemental materials can allow for additional reporting of qualitative results and findings that may not fit within the confines of a journal article. Importantly, data sharing is not a binary question: usually parts of the data can be shared whereas other parts (e.g., those with potential personal identifiers) cannot (Syed, 2022). These dimensions must be assessed case by case, and researchers may apply the FAIR principles as a useful guide in thinking about how data are shared, not shared, or ‘stewarded’ (see ‘Step 3: opening up your (meta)data’ in Branney et al., 2022). Agreeing upon a clear plan for data and material sharing at Stage 1 and assessing it at Stage 2 is an important part of the evaluation process.
Use cases and discussion
Registered reports are becoming increasingly popular across fields, and many of their benefits also apply to qualitative methods. So far, our own experiences as authors and evaluators serve as practical use cases, which have demonstrated the potential of registered reports for qualitative methods. The present primer, in fact, was originally inspired by the dialogues we had during two RR review processes: in the first, PB served as a reviewer of a manuscript on which VMK/MSI were authors (Karhulahti et al., 2022); in the second, MSY served as a reviewer and VMK as a recommender (Topor et al., 2022). Afterwards, these review processes led us to collectively discuss the limits and possibilities of qualitative RRs.
As authors (Karhulahti et al., 2022), we witnessed a significant improvement of our phenomenologically motivated research design at Stage 1 due to valuable review feedback. For example, the feedback introduced us to new literature and perspectives, which further helped us to plan analysis and data sharing procedures, consider various reflexive elements in the process, and better distinguish our interpretive phenomenological approach from epistemologically different phenomenologies. Moreover, the feedback also improved the longitudinal design by replacing unnecessary components (e.g., contextualizing interviews) with more focused member checking, a change that we had not identified ourselves due to the heavy time pressure set by the funding scheme (Box 1). Overall, the registered report format did not conflict with the flexibility characteristic of qualitative research, and we were also allowed to make non-registered choices, such as reporting themes through a case format that turned out to be optimal after the final analysis.
Box 1.
The study was funded as part of a project that ran from 1 January 2021 to 31 December 2022. Funding had been received to run a longitudinal qualitative study in which the experiences of both healthy and treatment-seeking videogame players were followed for 12 months, with two interview rounds one year apart. Because of the double interview design, two separate manuscripts were planned. We utilized the “programmatic” feature at PCI RR, which allows registering and reviewing multiple analysis plans via one Stage 1 manuscript (in our case: two rounds of interviews one year apart). The manuscript was submitted to PCI RR on June 16 (2021) and three reviews were received on July 16. After one more review round, the Stage 1 plan received in-principle acceptance on September 24. Because of the fixed project timeline, we decided to start the first round of interviews before in-principle acceptance in order to ensure that our follow-up a year later remained possible within the project deadline. This meant lowering the reported level of bias control (we had generated some data before the Stage 1 review was completed); we were not worried about maintaining the highest level because pre-study bias control plays a less meaningful role in exploratory qualitative research anyway. The first Stage 2 outcome was published a year later (September 21, 2022), and the longitudinal follow-up Stage 2 is currently in review.
As evaluators ( Karhulahti et al., 2022; Topor et al., 2022; Xiao, 2022), we witnessed exceptional motivation to support the authors because feedback at Stage 1 can have a fundamental positive effect on the study design and the quality of the research—this cannot happen to the same degree in other publication formats, which are reviewed after the study has already been carried out. A key manifestation of this process is working with study authors to align their research questions with their methods and analysis plans to ensure a coherent, informative final product. In sum, based on our experience, registered reports allow authors and evaluators to “play in the same team,” of which the present primer is also a practical example: authors and their evaluators having teamed up for follow-up collaboration.
We hope this brief primer is helpful for authors, editors, and reviewers involved in qualitative registered reports, and we look forward to updating these recommendations along with our accumulating experience and knowledge.
Ethics and consent
Ethical approval and consent were not required.
Funding Statement
This project has received funding from the European Research Council (ERC) under the European Union’s Horizon Europe research and innovation programme (grant agreement No 101042052); and the Academy of Finland (312397).
The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.
[version 2; peer review: 2 approved]
Data availability
No data are associated with this article.
References
- Andrews M, Baguley T: Bayesian data analysis. In: The Cambridge Encyclopedia of Child Development. Edited by B. Hopkins, E. Geangu and S. Linkenauger. Cambridge: Cambridge University Press, 2017; 165–169. 10.1017/9781316216491.030
- APA: APA Style JARS: Journal Article Reporting Standards. JARS–Qual: Table 1. Information Recommended for Inclusion in Manuscripts That Report Primary Qualitative Research. American Psychological Association, 2020.
- Branney P, Brooks J, Kilby L, et al.: Three Steps to Open Science for Qualitative Research in Psychology. PsyArXiv, 2022.
- Branney P, Reid K, Frost N, et al.: A context-consent meta-framework for designing open (qualitative) data studies. Qual Res Psychol. 2019; 16(3): 483–502. 10.1080/14780887.2019.1605477
- Branney P, Woolhouse M, Reid K: The ‘innocent collection of details’ and journal requests to make qualitative datasets public post-consent: Open access data, potential author response and thoughts for future studies. QMiP Bulletin. 2017; 23: 19–23.
- British Psychological Society: Position statement: Open data. British Psychological Society, 2020.
- Button KS, Chambers CD, Lawrence N, et al.: Grassroots Training for Reproducible Science: A Consortium-Based Approach to the Empirical Dissertation. Psychology Learning and Teaching. 2020; 19(1): 77–90. 10.1177/1475725719857659
- Chambers CD: Registered reports: a new publishing initiative at Cortex. Cortex. 2013; 49(3): 609–610. 10.1016/j.cortex.2012.12.016
- Chambers CD, Tzavella L: The past, present and future of Registered Reports. Nat Hum Behav. 2022; 6(1): 29–42. 10.1038/s41562-021-01193-7
- Craig JR, Reese SC: Retention of raw data: A problem revisited. American Psychologist. 1973; 28(8): 723. 10.1037/h0035667
- DuBois JM, Strait M, Walsh H: Is it time to share qualitative research data? Qual Psychol. 2018; 5(3): 380–393. 10.1037/qup0000076
- Dutilh G, Sarafoglou A, Wagenmakers EJ: Flexible yet fair: Blinding analyses in experimental psychology. Synthese. 2021; 198(Suppl 23): 5745–5772. 10.1007/s11229-019-02456-7
- ESRC [Economic and Social Research Council]: ESRC open access to research outputs. ESRC, London, 2013.
- Evans TR, Branney P, Clements A, et al.: Improving evidence-based practice through preregistration of applied research: Barriers and recommendations. Account Res. 2023; 30(2): 88–108. 10.1080/08989621.2021.1969233
- Fang X, Liu Sheng OR, Goes P: When Is the Right Time to Refresh Knowledge Discovered from Data? Operations Research. 2013; 61(1): 32–44. 10.1287/opre.1120.1148
- FORRT [Framework for Open and Reproducible Research Training]: Registered Reports. Glossary, 2022.
- Haven TL, Van Grootel DL: Preregistering qualitative research. Account Res. 2019; 26(3): 229–244. 10.1080/08989621.2019.1580147
- Josselson R: The hermeneutics of faith and the hermeneutics of suspicion. Narrative Inquiry. 2004; 14(1): 1–28. 10.1075/ni.14.1.01jos
- Joyce J, Douglass T, Benwell B, et al.: Should we share qualitative data? Epistemological and practical insights from conversation analysis. Int J Soc Res Methodol. 2022; 1–15. 10.1080/13645579.2022.2087851
- Karhulahti VM: Registered reports for qualitative research. Nat Hum Behav. 2022a; 6(1): 4–5. 10.1038/s41562-021-01265-8
- Karhulahti VM: Reasons for Qualitative Psychologists to Share Human Data. Br J Soc Psychol. 2022b. 10.1111/bjso.12573
- Karhulahti VM, Siutila M, Vahlo J, et al.: Phenomenological Strands for Gaming Disorder and Esports Play: A Qualitative Registered Report. Collabra: Psychology. 2022; 8(1): 38819. 10.1525/collabra.38819
- Levitt HM, Bamberg M, Creswell JW, et al.: Journal article reporting standards for qualitative primary, qualitative meta-analytic, and mixed methods research in psychology: The APA Publications and Communications Board task force report. Am Psychol. 2018; 73(1): 26–46. 10.1037/amp0000151
- Low J: A pragmatic definition of the concept of theoretical saturation. Sociological Focus. 2019; 52(2): 131–139. 10.1080/00380237.2018.1544514
- PCI RR [Peer Community in Registered Reports]: Author Guidelines. 2022.
- Riley S, Brooks J, Goodman S, et al.: Celebrations amongst challenges: Considering the past, present and future of the qualitative methods in psychology section of the British Psychological Society. Qualitative Research in Psychology. 2019; 16(3): 464–482. 10.1080/14780887.2019.1605275
- Syed M: Three myths about open science that just won’t die. PsyArXiv. 2022. 10.31234/osf.io/w8xs2
- Topor M, Armstrong G, Gentle J: Through the lens of Developmental Coordination Disorder (DCD): experiences of a late diagnosis. In-principle acceptance of Version 4 by Peer Community in Registered Reports. 2022.
- Xiao L: Breaking Ban: Belgium’s ineffective gambling law regulation of video game loot boxes. Stage 2 Registered Report. Acceptance of Version 2 by Peer Community in Registered Reports. 2022. 10.31219/osf.io/hnd7w