Abstract
Open-science reforms, which aim to increase the credibility and accessibility of research, have the potential to benefit the research base in special education, as well as practice and policy informed by that research base. Awareness of open science is increasing among special education researchers. However, relatively few researchers in the field have experience using multiple open-science practices, and few practical guidelines or resources have been tailored to special education researchers to support their exploration and adoption of open science. In this paper, we describe and provide guidelines and resources for applying five core open-science practices—preregistration, registered reports, data sharing, materials sharing, and open-access publishing—in special education research.
Special education has a long and rich tradition of using scientific research to inform practice and policy. Indeed, contemporary evidence-based reforms are premised on the notion that scientific research yields valid, credible evidence that—when aligned with (a) goals and values of students and families, and (b) expertise and resources of educators—can and should serve as a basis for practice (Cook & Odom, 2013). However, a significant gap exists between research and practice, severely limiting the impact of research in the field. To bridge the research-to-practice gap in special education, Carnine (1997) posited that, among other things, research evidence needs to be trustworthy (i.e., credible) and accessible.
A critical indicator of credible research is the rigor with which studies are conducted and reported. To guide the conduct and reporting of rigorous special education research, the special education research community has developed quality indicators for different research designs (e.g., Council for Exceptional Children, 2014; Odom et al., 2005). However, as in other fields (e.g., Fraser et al., 2018; John et al., 2012), many education researchers report engaging in questionable research practices that (a) are not directly addressed in extant quality indicators, and (b) can undermine the credibility of research findings (Makel et al., 2021). For example, some education researchers reported engaging in p-hacking (i.e., trying different analytic strategies until a significant p-value is obtained), hypothesizing after results are known (HARKing), selective outcome reporting (i.e., reporting only analyses for which desired findings were obtained), and data peeking (i.e., deciding whether to collect more data after examining data); all of which may inflate study effects. Moreover, evidence suggests publication bias (i.e., over-representation of studies with positive effects in the published literature) is present in the education and special education research bases (Gage et al., 2017; Polanin et al., 2016), which may bias the results of research syntheses and meta-analyses. As such, it appears efforts to increase credibility of research in special education are warranted.
With regard to research accessibility, most published studies lie behind paywalls, inaccessible to individuals who are (a) not affiliated with an institution (e.g., a university) that subscribes to the publisher, and (b) unable or unwilling to pay to download individual articles (Piwowar et al., 2018). Inaccessibility constrains application and impact of research; educators cannot base policy and practice on research they are unable to access. Additionally, open access is seldom provided to other products of research, such as data and materials. Charging for access to research findings and restricting access to study data and materials is inconsistent with scientific norms of universalism and communism (see Merton, 1973/1942).
Open science, which involves making all aspects of the research enterprise as open and transparent as possible, has been suggested as a means to increase the credibility and accessibility of research in special education (Cook et al., 2018). Adelson et al. (2019) posited three levels at which special education researchers can engage in open science: awareness, exploration, and adoption. Although awareness of open science is growing among special education researchers, it is only a first step. Exploration and adoption of open science practices are needed to realize their potential for improving the credibility and accessibility of research. Clear guidelines and resources for using open science are important for helping special education researchers transition from awareness to exploration and adoption of open-science practices. Yet, there are few practical resources or guidelines tailored to special education researchers (Adelson et al., 2019). Also, whereas open-science reforms have targeted group quantitative research designs, scholars have called for applying open and transparent practices in other research designs commonly used in special education, such as single-case (Cook, Johnson, et al., 2021) and qualitative (Trainor & Graue, 2014) research.
Thus, our purpose is to provide guidelines and resources for applying five core open-science practices—preregistration, registered reports, data sharing, materials sharing, and open-access publishing—in special education research, including in single-case and qualitative research. In the following sections, we briefly describe each core open practice, note their primary benefits and limitations, indicate applications to single-case and qualitative research, and provide guidelines and resources for their implementation. Key steps for implementing each open practice are summarized in Table 1, with Figure 1 illustrating when the steps occur in the research process.
Table 1.
Steps for Conducting Five Core Open-Science Practices
| Preregistering a Study | Registered Reports | Data Sharing | Materials Sharing | Posting a Preprint |
|---|---|---|---|---|
| 1. Select a study registry, create an account, and review how-to guides available on website. | 1. Identify a journal that accepts a registered report. Authors can ask editors of other journals if they will consider registered reports. | 1. Check informed consent for language that would prohibit data sharing. If you are unsure, contact your institutional review board. | 1. Choose materials to make open. | 1. Review preprint policies of target journals and funding agency (if relevant) before conducting study. Obtain agreement to post manuscript as a preprint from co-authors. |
| 2. Complete each section of the protocol. Write in future tense and invite co-investigators to contribute. Provide additional information, as relevant, in the open-ended areas at the end of each section. | 2. Write up complete and prospective introduction and method sections, and cover letter. Stage-1 manuscripts should clearly report research questions and accompanying hypotheses, a power analysis (as relevant), and reproducible methods. The cover letter should justify the importance of the study regardless of findings and study feasibility. | 2. Clean and de-identify data. Check for data entry mistakes and create summary scores if appropriate. Remove potentially identifying data, and check for crosstabs of small cells. Some variables may need to be recoded to avoid possible identification of participants. | 2. Check any existing copyright permissions for language that would prohibit sharing, and investigate copyright and intellectual property restrictions specific to your country, university, and/or funding agency. If you modified others’ materials that are licensed to permit sharing and adaptation, properly cite and attribute ownership when sharing. | 2. Identify a preprint repository and create an account. List of preprint repositories to consider: https://asapbio.org/preprint-servers |
| 3. Publish the preregistration and update as needed. Updates are common and should be expected. All updates should be completed before analyzing data. | 3. Submit for stage-1 review. If study has a set start date (e.g., beginning of school year), make sure to allot time for multiple rounds of review before start date. Respond to editor and reviewer feedback until in-principle acceptance of the stage-1 manuscript is granted. | 3. Create meta-data. Describe the aims and procedures of the project, as well as measures used. Include a codebook with names, labels, recoding strategies, missing value indicators, etc. | 3. Format and clean materials so they are easily reused by others. Include directions so others can replicate use of materials. Save files in a format that can be opened by others without restriction. | 3. Select a copyright license. Most researchers use CC BY. Preprint licensing information from ASAPbio: asapbio.org/licensing-faq Creative Commons license information: https://creativecommons.org/ |
| 4. Use the preregistration to guide one’s study, provide preregistration to reviewers and editor, and obtain a preregistration badge if publishing in a journal that provides open-science badges. | 4. Preregister (optional) and conduct study. Make sure to adhere to approved study plans. Contact editor if substantive deviations are needed. | 4. Select appropriate data repository where data will be easily discoverable by one’s community. | 4. Determine and select the appropriate level of copyright for your materials. Many researchers use CC BY and CC BY-NC. | 4. Post preprint. This is usually done in conjunction with a submission to a journal, but (depending on journal policy) authors can post a preprint before submission, after review, or after acceptance. |
| 5. Clearly identify and report any deviations from original plan and any exploratory analyses conducted in final research report. | 5. Write up and submit completed study for stage-2 review. Note and justify any deviations from approved study plans, and clearly identify any added exploratory analyses. | 5. Upload data to repository. Select appropriate license and access level for the data. Add the data set with its DOI as a product to one’s vitae. | 5. Determine where materials will be readily discoverable by your community. Materials can be shared on a data repository or the online supplemental materials option many journals offer. Consider which level of access to provide if options exist. | 5. Share preprint on personal website, institutional repositories, and/or social media (as allowed by journal policy). |
|  |  |  | 6. Upload materials to selected repository or journal’s online supplemental option during manuscript submission. Add the material with its DOI as a product to one’s vitae. | 6. Update preprint after acceptance with journal name and DOI. As allowed by journal policy, upload postprint of peer-reviewed version of the manuscript. |
Note: DOI = digital object identifier. CC BY = Creative Commons Attribution 4.0 International Public License. CC BY-NC = Creative Commons Attribution-NonCommercial 4.0 International Public License. © by Cook, Fleming, Hart, Lane, Therrien, & Wilson (2021) under a Creative Commons Attribution-Noncommercial 4.0 International License (CC BY-NC). DOI: 10.17605/OSF.IO/N35ZY.
Figure 1. Steps for Open Practices Before, During, and After a Study.

© by Cook, Fleming, Hart, Lane, Therrien, & Wilson (2021) under a Creative Commons Attribution-Noncommercial 4.0 International License (CC BY-NC). DOI: 10.17605/OSF.IO/N35ZY.
Open Practices
Preregistration
Preregistration involves researchers publicly posting study plans (e.g., research questions and hypotheses, data analysis, independent and dependent variables, sampling) before beginning to conduct a study (Gehlbach & Robinson, 2018; Nosek et al., 2019). Typically, preregistrations are posted on a searchable, independent registry where they can be freely accessed by editors, reviewers, and other research consumers. If and when changes to research plans occur, preregistrations can be updated and a rationale for the changes provided. An example of a preregistered study in the special education literature is Gesel and Lemons’ (2020) examination of different schedules of curriculum-based measurement, which includes a link to the preregistration in the article.
Benefits and Limitations
Nosek et al. (2019) noted three primary benefits of preregistration. First, by making study plans transparent, preregistration provides a record of which analyses are planned a priori (i.e., hypothesis testing or confirmatory analyses) and which are post hoc (i.e., exploratory analyses). Second, by enabling comparisons between preregistered study plans and reported methods and findings, preregistration makes questionable research practices such as p-hacking, selective outcome reporting, data-peeking, and HARKing more readily discoverable, thereby discouraging their use. Finally, preregistration can help combat publication bias by making all planned studies discoverable regardless of whether the study was published.
A key concern with preregistration is that it might diminish or devalue exploratory analysis. However, as DeHaven (2017) noted, researchers are encouraged to conduct and report exploratory analyses in preregistered studies, but should clearly report them as exploratory and separate from preregistered analyses. Another limitation is simply that preregistration is time-consuming and can be difficult (Nosek et al., 2019). Finally, preregistration only works to (a) combat publication bias if researchers search for and include preregistered but unpublished research in their research syntheses, and (b) reduce questionable research practices if editors, reviewers, and other research consumers examine and hold researchers accountable for discrepancies between study plans and research reports. Yet, Claesen et al. (2019) reported that most published, preregistered studies they examined contained unexplained deviations from preregistered plans.
Applications to Single-Case and Qualitative Research
Although preregistration has been most commonly applied with group quantitative studies, researchers can also preregister other research designs, including single-case design, qualitative research, meta-analyses, and systematic reviews (see Johnson & Cook, 2019; Haven & Van Grootel, 2019). Although single-case and qualitative studies often involve making decisions after data collection has begun (e.g., deciding to introduce an intervention according to participant baseline responding, determining questions to ask in a focus group depending on participants’ previous responses), researchers can preregister the process and criteria for making decisions and update the preregistration when decisions are made.
Guidelines
Before starting the preregistration process, it is important to plan out in detail the critical elements of one’s study (e.g., research questions and hypotheses, sampling plan, independent variables, dependent variables, study procedures, data analyses). After developing a research plan, researchers should select a registry and respond to the prompts and questions as completely as possible. If researchers do not know one or more aspects of a study in advance, they can preregister the criteria by which they will make such determinations. For example, a decision tree for determining which statistical analysis will be used depending on whether assumptions are met can be preregistered (Kiyonaga & Scimeca, 2019). Once the decision is made, it can be described in an update to the preregistration.
Two registries commonly used by special education researchers are the Open Science Framework (OSF; https://osf.io/prereg/) and the Registry of Efficacy and Effectiveness Studies (REES; www.sreereg.org). OSF is a multi-disciplinary registry with multiple templates for preregistering different types of studies. For example, OSF’s standard template, which can be used for group experiments, observational studies, and meta-analyses, asks researchers to report the study design, sampling plan, study variables, and analysis plan. The standard template provides a variety of multiple-choice and short answer questions for researchers to complete, provides examples, and prompts researchers to explain and justify their plans. OSF also provides preregistration templates for qualitative research and secondary data analysis, and allows researchers to save blinded copies of preregistrations, which can be cited in manuscripts submitted for blind peer review. A preregistration guide (OSF, n.d.) and a how-to video (Mellor et al., 2017) are also provided.
REES is designed for preregistering causal inference studies (i.e., randomized trials, quasi-experimental designs, regression discontinuity designs, and single-case designs) in education and related fields. When preregistering a study on REES, researchers respond to a series of questions and prompts in each of eight areas (i.e., general study information, description of study, research questions, study design, sample characteristics, outcomes, analysis plan, and additional materials). For example, when preregistering a multiple-baseline single-case design, researchers specify type of multiple-baseline design (across participants, places, materials, or behaviors), start of baseline (concurrent or non-concurrent, and for how many cases), type of participants (students, teachers, principals, schools, other), and total number of participants. Prompts for preregistering a study on REES are straightforward, with definitions and examples provided. A user guide (REES, n.d.b), a how-to video (REES, 2020), and a checklist of required information for each design (REES, n.d.a) are provided.
Once completed, preregistrations are time-stamped, assigned a unique digital object identifier (DOI), and made public. The published preregistration is permanent, but researchers can and should post updates describing and providing a rationale for any changes to preregistered plans. For example, researchers may need to deviate from preregistered plans if collected data do not meet assumptions for a planned analysis or if a school site requests a change to a planned intervention. Transparency is paramount and changes should be completed before data analysis is conducted. Researchers can update their preregistration on REES by navigating to “My Registry” and clicking “Update”. This will create a time-stamped copy of the updated registration that will also be made public. Researchers are provided additional space to justify or explain changes. When updating a preregistration on OSF, researchers must add a Transparent Changes Document (https://osf.io/yrvcg/) as a supplemental file on the overview page of the public registration. This document should note the changes to the study, rationale for the changes, and how the changes potentially affect the outcome of the study.
Registered Reports
Registered reports apply the principles of preregistration to the peer-review process (Chambers, 2019; Cook, Maggin, & Robertson, 2021). In registered reports, authors write introduction and prospective method sections before conducting a study, and submit these study plans to a journal for peer review, which is conducted in two stages. Stage-1 review is focused on the importance of the research questions and rigor of the proposed methods. It concludes with the proposed study either being rejected or granted in-principle acceptance. After researchers conduct, write up, and resubmit the study, stage-2 review involves reviewers checking to ensure (a) approved plans are adhered to, or any modifications are reported and justified, and (b) study findings are appropriately reported and discussed. A study cannot be rejected at stage 2 because of the direction or perceived interestingness of findings. An example of a registered report in the special education literature is Doabler et al.’s (2021) experiment examining the effects of a science program.
Benefits and Limitations
Registered reports are designed to improve the transparency, quality, and credibility of research in at least three ways. First, reviewers of registered reports are able to provide constructive feedback before the study is conducted, when authors can make adjustments to their study. Second, registered reports reduce questionable research practices such as p-hacking by not only detailing research plans prior to conducting a study, as in preregistration, but also by devoting a stage of review to ensure that plans are adhered to. Finally, registered reports help combat publication bias. Because the decision to accept a study in principle is made before results are known, reviewer and editor decisions are not influenced by the direction of study findings. Indeed, engaging in unplanned, questionable research practices to obtain positive findings is one of the only ways for a study with in-principle acceptance not to be published. Null findings are, in fact, significantly more likely to be reported in registered reports compared to traditional studies (Scheel et al., 2020).
The same potential limitations noted for preregistrations apply to registered reports. Additional concerns for registered reports are that (a) planning for and writing up stage-1 manuscripts and (b) conducting two stages of review entail increased time and effort for researchers, reviewers, and editors. Moreover, not all research is appropriate for registered reports. For example, authors are unlikely to be able to write a prospective introduction and method section suitable for review for purely exploratory research.
Applications to Single-Case and Qualitative Research
Most published registered reports have used group quantitative designs. For example, all seven studies in Reich et al.’s (2020) special issue in AERA Open on registered reports in education used group quantitative designs. Yet, any study that can be preregistered, including single-case design, qualitative research, meta-analyses, and systematic reviews, can be a registered report.
Guidelines
First, researchers must select a study that is appropriate for registered reports (e.g., a study with primarily predetermined methods). It is also important to allot sufficient time for the stage-1 review process when conducting school-based studies with fixed timelines, to ensure that study plans are accepted in principle before the study must begin. Researchers must also identify a journal that accepts registered reports. Only a few journals in special education currently accept registered reports as a regular submission option (e.g., Exceptional Children, Gifted Child Quarterly; see https://www.cos.io/our-services/registered-reports for a list of journals accepting registered reports). We encourage interested authors to ask editors of other journals if they would accept a registered report submission to pilot the process.
Stage-1 submissions should include complete, but prospective, introduction and method sections. Journals accepting registered reports have their own guidelines for stage-1 submissions that should be adhered to. Researchers need to clearly report (a) well-justified research questions and accompanying hypotheses, (b) a power analysis to justify the proposed sample (as relevant), and (c) reproducible methods in their stage-1 manuscript. A cover letter in which authors describe the feasibility of conducting the study and the importance of study findings regardless of direction and magnitude often accompanies stage-1 submissions. For elements of the study that cannot be fully determined in advance, Kiyonaga and Scimeca (2019) recommended using decision trees in stage-1 manuscripts to clarify criteria for determining how the study will be conducted.
The stage-1 submission and review process is similar to typical peer review. When and if in-principle acceptance is granted, authors should preregister the accepted study plans, and then proceed to conduct and write up the study. Researchers should notify the editor if substantive deviations from approved plans will occur. Any deviations from accepted study plans need to be clearly noted and justified in the final manuscript. Any exploratory analyses conducted should be clearly reported as such. Stage-2 review typically is conducted by the same reviewers as in stage 1, and focuses on whether accepted study plans were followed and deviations are noted and justified. See Kiyonaga and Scimeca (2019) for practical considerations and Center for Open Science (n.d.) for responses to frequently asked questions regarding registered reports.
Data Sharing
Data sharing is the practice of making raw, yet curated, data available for others to examine and use, preferably in a data repository. Although researchers often share just the subset of data used in a publication (e.g., Conroy & Sutherland, 2018), data from entire projects can also be shared (e.g., Quint, 2016). Researchers with federal funding often are mandated to share their data, typically with a deadline after the main findings of the project are published (e.g., Institute of Education Sciences, n.d.). Data sharing should include metadata, which typically includes a codebook or data dictionary listing information such as variable names, variable labels, questionnaire items, and scoring rules that allow others to find, understand, and use the shared data appropriately (Day, 2005). In addition to a codebook, metadata should also include details such as the project’s aims, information about the sample, general information about the measures, study design and data collection procedures (e.g., CONSORT diagram, data entry procedures), data missingness (values used, reasons for missingness), and any other information someone unfamiliar with the project and data users should know (see Logan et al., 2021).
Benefits and Limitations
Sharing data expands the possibilities of what can be done with a data set, including allowing independent reanalysis and reproduction of published findings to assess their trustworthiness. In addition, allowing more researchers to analyze a data set promotes (a) examination of novel research questions and (b) application of diverse perspectives and techniques to robust research questions (Vision, 2010). Sharing data in a repository also allows data sets to be assigned a DOI and become a citable and reportable product for the data sharer. Furthermore, publications with shared data are cited more often than publications without (Piwowar & Vision, 2013). Given the typically resource-intensive nature of data collection in special education, open data also benefits other researchers by democratizing access to data and allowing them to analyze otherwise inaccessible data (Fleming et al., 2021; Mangul et al., 2019).
However, sharing high quality data takes time and expertise: data must be checked, cleaned, and fully de-identified. For many investigators, these processes may fall outside of their expertise and necessitate a change in workflow. Resources are available to aid with sharing data (see van Dijk et al., 2021) and, for those who write grants, we recommend budgeting time for staff with expertise in data sharing. Additionally, not all data can be shared (e.g., highly sensitive data that cannot be fully de-identified).
Applications to Single-Case and Qualitative Research
Data often can be shared in single-case and qualitative research. Researchers conducting qualitative research can use specialized data repositories for sharing their data (i.e., the Qualitative Data Repository [https://qdr.syr.edu/] for text-based data, and Databrary [www.databrary.org] for video-based data). Both repositories offer services to help curate and document data. Although single-case studies typically display values for dependent variables in graphs, greater transparency and accuracy can be achieved by openly sharing the actual values for each data point. Additionally, single-case researchers can share data related to procedural fidelity, intervention intensity, reliability of the dependent variable(s), social validity, and visual analysis (e.g., trend and stability calculation results); as well as their code for calculating effect sizes. Because single-case and qualitative studies often have few participants, researchers should take extra care de-identifying their data to protect participants’ identities.
Guidelines
The first step to sharing data is to check participants’ informed consent. Under the new Common Rule for the Protection of Human Subjects (U.S. Department of Health and Human Services, n.d.), de-identified data are not considered human subjects data if researchers did not explicitly state what would happen to participants’ data. Thus, if data sharing is not mentioned in the informed consent, it may be possible to share the data. We recommend checking with one’s IRB for guidance. If informed consent indicates data will not be shared, researchers can request a waiver of consent from the IRB for sharing de-identified data. Researchers planning to share their data should design consent agreements and research protocols to facilitate data sharing. As appropriate, we recommend using language permitting sharing of de-identified data in informed consents (see Shero and Hart, 2020, for templates for informed consent that facilitate data sharing).
The second step is cleaning and de-identifying the data. This involves checking for data-entry mistakes (e.g., out-of-range values), creating any necessary summary scores (e.g., sum scores, developmental scale scores), and checking for identifiable variables (e.g., names, addresses) and cells (e.g., labelling of low-incidence disabilities). Although most available guidance on data de-identification involves Health Insurance Portability and Accountability Act (HIPAA)-related data in medical settings (see U.S. Department of Health and Human Services, 2012), a general rule of thumb is to remove obvious identifiers (such as name, exact date of birth, and address), and then consider how common a participant’s knowable characteristics are in the region where the data were collected. For example, if a participant is a child with a rare genetic disorder, that child is likely identifiable when coupled with the setting of the study, and caution is warranted in sharing those data. However, a participant who is a male with a learning disability in a study conducted in a large city is likely not identifiable from these variables, which can therefore be shared. See Edwards and Schatschneider (2020a, b) for guides to de-identifying education data and Meyer (2018, pp. 133-135) for other considerations when de-identifying data.
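For researchers who manage their data programmatically, the de-identification checks described above (removing direct identifiers and flagging small crosstab cells) can be sketched in a few lines. The following is a minimal, hypothetical illustration: the field names, the example records, and the minimum cell size are assumptions for demonstration, not recommendations from the literature.

```python
# Hypothetical sketch of basic de-identification checks.
# Field names ("name", "dob", etc.) and min_cell are illustrative assumptions.
from collections import Counter

DIRECT_IDENTIFIERS = {"name", "dob", "address"}  # obvious identifiers to drop


def deidentify(records, quasi_identifiers, min_cell=5):
    """Drop direct identifiers; flag small cells on quasi-identifier crosstabs."""
    cleaned = [{k: v for k, v in r.items() if k not in DIRECT_IDENTIFIERS}
               for r in records]
    # Cross-tabulate quasi-identifiers; combinations observed for fewer than
    # min_cell participants may allow re-identification and warrant review.
    cells = Counter(tuple(r[q] for q in quasi_identifiers) for r in cleaned)
    flagged = [combo for combo, n in cells.items() if n < min_cell]
    return cleaned, flagged


records = [
    {"name": "A", "dob": "2010-01-01", "gender": "male",
     "disability": "learning disability"},
    {"name": "B", "dob": "2011-02-02", "gender": "female",
     "disability": "rare genetic disorder"},
]
cleaned, flagged = deidentify(records, ["gender", "disability"], min_cell=2)
```

Flagged combinations would then be reviewed by hand (e.g., recoded into broader categories) rather than shared as-is.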
The third step is creating metadata. Researchers need to create documentation providing information about the study, the sample, protocols, and measures, as well as create a data dictionary that details missing data values, variable names, variable labels, recoding strategies, questionnaire items, scoring logic, and the like. The fourth step is selecting an appropriate data repository. There are many data repositories, including domain-general (e.g., Inter-university Consortium for Political and Social Research [ICPSR], OSF) and domain-specific (e.g., LDbase, Databrary) repositories, and researchers should consider where the data will be most easily discoverable by their research community. It is important that the repository provides a DOI for the data and the option to assign a copyright license to the data. Most repositories give researchers the option of an Open Data Commons (https://opendatacommons.org/) or Creative Commons (https://creativecommons.org/) license, which tells data users how they can use and cite the data and data products.
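In practice, a data dictionary can be as simple as a table with one row per variable. The sketch below, using hypothetical variable names, labels, and missing-value codes (none drawn from an actual study), writes such a codebook to CSV so it can be uploaded alongside the data set.

```python
# Minimal, hypothetical data dictionary (codebook); every entry is illustrative.
import csv
import io

codebook = [
    {"variable": "read_pre", "label": "Reading fluency, pretest (WCPM)",
     "type": "numeric", "missing": "-99 = absent at testing"},
    {"variable": "iep", "label": "Student has an IEP",
     "type": "categorical", "values": "0 = no; 1 = yes"},
]

# Write the codebook to CSV text; restval fills columns a row does not use.
buf = io.StringIO()
fields = ["variable", "label", "type", "values", "missing"]
writer = csv.DictWriter(buf, fieldnames=fields, restval="")
writer.writeheader()
writer.writerows(codebook)
csv_text = buf.getvalue()
```

The resulting file would be uploaded to the repository together with the project-level documentation described above.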
The final step is to upload the data and metadata into the data repository and set an access level, if applicable. Some data repositories allow researchers to choose whether the data will be posted openly or be available only by managed access, which requires data reusers to apply for permission to access the data. This may be a good choice for special education researchers who have sensitive data or other concerns; for example, they can require a data reuser to have an active IRB protocol before accessing the data. For more information on open data, the Open Data Handbook (https://opendatahandbook.org/) hosts a wealth of information. Other resources to help researchers prepare their data to be shared include Gilmore et al. (2018) and ICPSR (n.d.).
Materials Sharing
Materials sharing, or open materials, involves researchers sharing study materials by posting them in a data repository (e.g., figshare, https://figshare.com; OSF, www.osf.io) or in a journal’s online supplemental repository alongside an article. A creator’s copyright is established as soon as a creative work is shared in a fixed and tangible form (US Copyright Office, 2019). However, copyright owners can add copyright licenses to provide permission for others to reuse, adapt, and redistribute their work in specific ways. Researchers can share many types of study materials using copyright licenses, including intervention protocols, survey instruments, researcher-developed dependent measures, and treatment-integrity checklists. Recent examples of authors sharing study materials openly in special education include Bettini et al. (2020), who shared their survey, additional demographic data, and additional results for structural equation models, and Kozleski et al. (2020), who shared interview protocols and coding templates.
Benefits and Limitations
Making study materials open is beneficial for multiple stakeholders. By sharing study materials, researchers increase their impact by allowing others to use and build upon their work. These products can be listed on authors’ CVs and can garner citations. Open materials also provide benefits to the field and other special education stakeholders by (a) facilitating replication of previous studies, which is essential to establish evidence-based practices in special education, and (b) providing researchers, practitioners, parents, and other stakeholders access to free, research-validated materials to use with students.
Copyright concerns are the main obstacle to materials sharing. Authors cannot share material, such as a published norm-referenced assessment, for which they do not own the copyright, even if they used the materials in a study. Yet, they can share study material, such as a treatment-integrity protocol, that they developed for the project and own (original work is under the creator’s copyright once it is shared in a fixed and tangible form; US Copyright Office, 2019). Unfortunately, determining ownership is not always straightforward. For example, it may be unclear whether and how one can share adapted versions of someone else's copyrighted materials, materials developed as part of one's professional duties at a university, and materials developed from work funded by external sources (e.g., grants). Additionally, like other open practices, materials sharing requires time and planning (van Dijk et al., 2021).
Applications to Single-Case and Qualitative Research
Researchers conducting single-case studies can share materials such as observation coding forms, intervention protocols, intervention materials, social validity measures, and personnel training protocols. Qualitative researchers can share materials such as interview and probing-question protocols, field notes/observation data collection forms, positionality and reflexivity statements, data-analytic strategies and detailed descriptions of the data-analytic process, and deductive or inductive codebooks with supportive interview quotes or field notes/observations. The process for sharing materials is the same regardless of study design.
Guidelines
First, decide what materials to share. We encourage authors to share as many relevant materials as possible and to provide detailed descriptions in the manuscript of any materials they are unable to share. Second, check for any potential copyright or intellectual-property restrictions that might preclude legally sharing the material. We suggest researchers check with their funding agencies, employers, and university copyright lawyers (often employed through a university’s library system) regarding sharing researcher-created materials. When seeking to share others’ materials they have adapted, authors must ensure the materials have a license permitting sharing and adaptations. Any materials created from others’ open work should be properly attributed, cited, and licensed according to the permissions granted by the original creator. Some materials are licensed by their original authors in a manner that allows adaptations but does not permit sharing of the adapted materials (i.e., the original authors retain exclusive rights to redistribute their materials). In these instances, we recommend sharing a detailed account of the adaptations made to the original materials, with proper attribution to the original authors.
Third, format materials to facilitate easy access and reuse. Open materials should be saved and shared in an easily accessible format such as .txt or .html that allows reusers to edit the materials without the need for atypical software or expertise in editing (Hilton et al., 2010). Authors should also provide relevant instructions for using shared materials. For example, authors might include a coding guide alongside a shared observational protocol, or the implementation steps and procedures for a training manual.
Fourth, determine and apply a copyright license for sharing in consultation with everyone who played a role in creating the material. Creative Commons (https://creativecommons.org) is a widely used provider of public copyright licenses for sharing content with varying degrees of openness. Two popular licenses are CC BY and CC BY-NC. The CC BY license allows others to reuse, modify, edit, build upon, and distribute shared materials as long as attribution is given. The CC BY-NC license restricts such reuse to non-commercial purposes. Once selected and shared, licensing options on materials cannot be changed, so care should be taken when selecting the license. The license should be indicated clearly on the shared materials in a machine-readable format. Creative Commons provides copyright license images that can be copied and pasted into the shared document or included as a watermark. Alternatively, authors can indicate the license within a header or footer by using the copyright symbol (©) followed by their names, the year, and the license (e.g., CC BY-NC). See Figure 1 and Table 1, which are shared on OSF, for examples.
Fifth, decide whether to share materials on a data repository or as an online supplement to an article. We recommend selecting the option that makes the material most accessible to one’s target audience. For example, if a researcher is sharing an intervention protocol intended primarily for practitioners, and they are publishing in a practitioner journal read by many teachers, sharing through the journal may be the best choice. Alternatively, if the article is being published in a research journal read by few practitioners, a data repository may be advisable. Note that some repositories allow sharers to select the level of availability of their shared materials (e.g., only shared after the owner approves an application), which may be an attractive option. Finally, upload one’s materials with the chosen licensing level, and add the shared materials and associated DOI, if available, to one’s CV.
Open-Access Publishing
OA publishing is an umbrella term for approaches that provide free access to published research, which is often behind paywalls and inaccessible to many stakeholders. There are multiple OA-publishing models. Gold and hybrid OA publishing both make articles immediately and permanently open. In gold OA, the article is published in an open journal in which all content is OA. Hybrid OA articles are published in traditional journals in which most articles are behind a paywall, but authors make their specific article freely available on the journal’s website. Authors typically retain the copyright and pay an article-processing charge to the journal to cover publishing costs in these models. With bronze OA, publishers select specific articles to make freely accessible (e.g., an article from a special issue). However, bronze-OA articles are not licensed as OA; as such, they may be placed behind a paywall at any time, and research consumers are restricted in how they can reuse and adapt content. Green OA refers to authors self-archiving their own manuscripts to online repositories. Both preprints and postprints are examples of green OA. Preprints and postprints are author-formatted versions of a paper; preprints have not been peer reviewed, whereas postprints have.
Benefits and Limitations
A fundamental benefit of OA publishing is democratization of access to research and scholarship (Fleming et al., 2021). Providing special education practitioners, for example, access to research can help bridge the gap between research and practice. Preprints can also speed dissemination of research by allowing authors to immediately share their research on preprint servers, personal websites, social media, or institutional repositories without having to wait for what can sometimes be a lengthy process of peer review. OA publishing is also associated with increased impact of research; articles posted as preprints before publication receive more citations, downloads, and social media attention (Fu & Hughey, 2019; Piwowar et al., 2018). Finally, preprints can help combat publication bias by providing a forum for dissemination of studies that might not otherwise be published (e.g., studies with null results).
In terms of limitations, gold and hybrid OA can be costly for researchers, as the average article-processing charge for special education journals is just below $3,000 (Fleming & Cook, 2021). It is also important to recognize preprints are not typically peer reviewed. Thus, seriously flawed and misleading research can be posted as preprints, and research consumers should critically evaluate preprinted research for potential bias. Additionally, preprints can unmask the identities of study authors, thereby potentially reducing the pool of available blind peer reviewers if and when the paper is submitted for publication (Fleming et al., 2021).
Applications to Single-Case and Qualitative Research
Single-case and qualitative research, as well as any other type of scholarship, can be made openly accessible through gold, hybrid, bronze, and green OA. OA operates in the same way regardless of the type of research.
Guidelines
Journal OA policies vary. For example, some journals do not accept submissions that have been preprinted, and policies differ on whether, where, and when papers can be postprinted (Fleming & Cook, 2021). Researchers should review OA policies of journals to which they may submit a manuscript as early as possible in the research process. Most journals with academic publishers have posted OA information on their websites. For some small or independent journals, researchers may need to contact editors to obtain OA policies.
As with shared data and materials, authors must choose a copyright license when publishing OA. For all OA models except bronze, authors should select a copyright license that clarifies how others can reuse and adapt the work. For preprints and postprints, preprint repositories often require selection of a copyright license; similarly, for gold and hybrid OA, authors typically choose a copyright license as part of the publication process.
For green OA, authors must select a repository. Repositories can be general (e.g., OSF Preprints, www.osf.io/preprints/) or discipline specific (e.g., EdArXiv, www.edarxiv.org). Researchers typically must create an account at the selected repository, select a copyright license, and upload the manuscript. When submitting to EdArXiv, for example, authors upload their unformatted manuscript (typically a .pdf file) and answer a series of questions. EdArXiv requests the following information about submissions: title, abstract, copyright license, keywords, discipline and subdiscipline(s), author(s), conflict-of-interest statement, supplemental materials, a link to the study’s publicly available data (if applicable), a link to the study’s preregistration (if applicable), and the DOI of the corresponding article if the manuscript is already published.
Preprint repositories typically do not peer review manuscripts, but often screen them for appropriateness (e.g., content aligns with the focus of discipline-specific repositories) before posting. If and when a preprinted manuscript is submitted and accepted for publication in a journal, authors should add an updated version of the paper with an author note indicating acceptance and specifying the journal. After publication of a paper, authors should update the preprint with the corresponding citation and link the DOIs of the preprint and published paper. If the journal in which an article is published allows postprints, authors may update a preprint with a postprint to reflect changes made in peer review. Note, however, that many journals impose an embargo period of one or two years before authors can postprint a manuscript accepted for publication.
As an example of how this process can work, two of the authors concurrently submitted a recent manuscript to EdArXiv (Fleming & Cook, 2020) and to a journal for publication. Because the journal allowed postprints without an embargo period, we updated our preprint following each round of peer review. Following acceptance, we uploaded the accepted version of the manuscript with an author note indicating the journal in which the paper was accepted. Researchers can also share preprints and postprints on social media, personal websites, and institutional repositories (Laakso, 2014). See Fleming and Cook (2021) for a review of OA policies of special education journals, and Sherpa Romeo (https://v2.sherpa.ac.uk/romeo/) to search journal OA policies.
Conclusion
Although not without limitations, open-science reforms have the potential to increase transparency, credibility, and impact of special education research (Cook et al., 2018). Changing researcher behaviors to increase the application of open science among special education researchers will require manipulation of antecedents (e.g., journal policies) and consequences (e.g., rewards in the tenure and promotion process; see Norris & O’Connor, 2019). Our hope is that the guidelines and resources for engaging in core open-science practices provided in this article will serve as one antecedent for increasing special education researchers’ exploration and adoption of open science. As more special education researchers adopt open practices, norms in the field may shift and standards for rigorous research may be expanded to include open practices, providing other powerful antecedents for engaging in open science (Mellor, 2021).
References
- Adelson JL, Barton EE, Bradshaw CP, Bryant BR, Bryant DP, Cook BG, Coyne MD, … Troia GA (2019, February 18). A roadmap for transparent research in special education and related disciplines. 10.31219/osf.io/sqfy3
- Bettini E, Cumming MM, O’Brien KM, Brunsting NC, Ragunathan M, Sutton R, & Chopra A (2020). Predicting special educators’ intent to continue teaching students with emotional or behavioral disorders in self-contained settings. Exceptional Children, 86(2), 209–228. 10.1177/0014402919873556
- Carnine D (1997). Bridging the research-to-practice gap. Exceptional Children, 63(4), 513–521. 10.1177/001440299706300406
- Center for Open Science. (n.d.). Frequently asked questions. https://www.cos.io/our-services/registered-reports
- Chambers C (2019). What’s next for registered reports? Nature, 573, 187–189. 10.1038/d41586-019-02674-6
- Claesen A, Gomes S, Tuerlinckx F, & Vanpaemel W (2019, May 9). Preregistration: Comparing dream to reality. 10.31234/osf.io/d8wex
- Conroy M, & Sutherland K (2018, December 18). Prevention and treatment of problem behaviors in young children: Clinical implications from a randomized controlled trial of BEST in CLASS. Inter-university Consortium for Political and Social Research. 10.3886/E107827V1
- Cook BG, Johnson AH, Maggin DM, Therrien WJ, Barton EE, Lloyd JW, Reichow B, Talbott E, & Travers JC (2021). Open science and single-case design research. Remedial and Special Education. Advance online publication. 10.1177/0741932521996452
- Cook BG, Lloyd JW, Mellor D, Nosek BA, & Therrien WJ (2018). Promoting open science to increase the trustworthiness of evidence in special education. Exceptional Children, 85(1), 104–118. 10.1177/0014402918793138
- Cook BG, Maggin DM, & Robertson RE (2021). Registered reports in special education: Introduction to the special series. Remedial and Special Education. Advance online publication. 10.1177/0741932521996459
- Cook BG, & Odom SL (2013). Evidence-based practices and implementation science in special education. Exceptional Children, 79(3), 135–144. 10.1177/001440291307900201
- Day M (2005). Metadata. In Ross S & Day M (Eds.), DCC digital curation manual. http://www.dcc.ac.uk/resources/curation-reference-manual/completed-chapters/metadata
- De Haven A (2017, May 23). Preregistration: A plan, not a prison. https://www.cos.io/blog/preregistration-plan-not-prison
- Doabler CT, Therrien WJ, Longhi MA, Roberts G, Hess K, Maddox SA, … Toprac P (in press). Efficacy of a second-grade science program: Increasing science outcomes for all learners. Remedial and Special Education.
- Edwards A, & Schatschneider C (2020a). 5 things to check for data de-identification. https://venngage.net/ps/5p6yjaAGTSs/new-5-things-to-check-for-data-deidentification
- Edwards A, & Schatschneider C (2020b). De-identification guide. figshare. 10.6084/m9.figshare.13228664.v1
- Fleming JI, & Cook BG (2020, May 28). Open access in special education: A review of journal and publisher policies. 10.1177/0741932521996461
- Fleming JI, & Cook BG (2021). Open access in special education: A review of journal and publisher policies. Remedial and Special Education. Advance online publication. 10.1177/0741932521996461
- Fleming JI, Wilson SE, Hart SA, Therrien WJ, & Cook BG (2021). Open accessibility in education research: Enhancing the credibility, equity, impact, and efficiency of research. Educational Psychologist, 56(2). Advance online publication. 10.1080/00461520.2021.1897593
- Fraser H, Parker T, Nakagawa S, Barnett A, & Fidler F (2018). Questionable research practices in ecology and evolution. PloS one, 13(7), Article e0200303. 10.1371/journal.pone.0200303
- Fu DY, & Hughey JJ (2019). Releasing a preprint is associated with more attention and citations for the peer-reviewed article. Elife, 8, e52646. 10.7554/eLife.52646
- Gage NA, Cook BG, & Reichow B (2017). Publication bias in special education meta-analyses. Exceptional Children, 83(4), 428–445. 10.1177/0014402917691016
- Gehlbach H, & Robinson CD (2018). Mitigating illusory results through preregistration in education. Journal of Research on Educational Effectiveness, 11(2), 296–315. 10.1080/19345747.2017.1387950
- Gesel SA, & Lemons CJ (2020). Comparing schedules of progress monitoring using curriculum-based measurement in reading: A replication study. Exceptional Children, 87(1), 92–112. 10.1177/0014402920924845
- Gilmore RO, Kennedy JL, & Adolph KE (2018). Practical solutions for sharing data and materials from psychological research. Advances in Methods and Practices in Psychological Science, 1(1), 121–130. 10.1177/2515245917746500
- Haven TL, & Van Grootel DL (2019). Preregistering qualitative research. Accountability in Research, 26(3), 229–244. 10.1080/08989621.2019.1580147
- Hilton III J, Wiley D, Stein J, & Johnson A (2010). The four ‘R’s of openness and ALMS analysis: Frameworks for open educational resources. Open Learning: The Journal of Open, Distance and e-Learning, 25(1), 37–44. 10.1080/02680510903482132
- Institute for Education Sciences. (n.d.). Resources for researchers: Implementation guide for public access to research data. https://ies.ed.gov/funding/datasharing_implementation.asp
- Inter-university Consortium for Political and Social Research (ICPSR). (n.d.). Guide to social science data preparation and archiving: Best practice throughout the data life (6th ed.). https://www.icpsr.umich.edu/web/pages/deposit/guide/
- John LK, Loewenstein G, & Prelec D (2012). Measuring the prevalence of questionable research practices with incentives for truth telling. Psychological Science, 23(5), 524–532. 10.1177/0956797611430953
- Johnson AH, & Cook BG (2019). Preregistration in single-case design research. Exceptional Children, 86(1), 95–112. 10.1177/0014402919868529
- Kaplan RM, & Irvin VL (2015). Likelihood of null effects of large NHLBI clinical trials has increased over time. PloS one, 10(8), Article e132382. 10.1371/journal.pone.0132382
- Kiyonaga A, & Scimeca JM (2019). Practical considerations for navigating registered reports. Trends in Neurosciences, 42(9), 568–572. 10.1016/j.tins.2019.07.003
- Kozleski EB, Hunt P, Mortier K, Stepaniuk I, Fleming D, Balasubramanian L, Leu G, & Munandar V (2020). What peers, educators, and principals say: The social validity of inclusive, comprehensive literacy instruction. Exceptional Children. 10.1177/0014402920969184
- Laakso M (2014). Green open access policies of scholarly journal publishers: A study of what, when, and where self-archiving is allowed. Scientometrics, 99(2), 475–494. 10.1007/s11192-013-1205-3
- Logan JAR, Hart SA, & Schatschneider C (2021). Data sharing in education science. AERA Open. 10.1177/23328584211006475
- Makel MC, Hodges J, Cook BG, & Plucker JA (2021). Both questionable and open research practices are prevalent in education research. Educational Researcher. Advance online publication. 10.3102/0013189X211001356
- Makel MC, Plucker JA, Freeman J, Lombardi A, Simonsen B, & Coyne M (2016). Replication of special education research: Necessary but far too rare. Remedial and Special Education, 37(4), 205–212. 10.1177/0741932516646083
- Mangul S, Martin LS, Langmead B, Sanchez-Galan JE, Toma I, Hormozdiari F, Pevzner P, & Eskin E (2019). How bioinformatics and open data can boost basic science in countries and universities with limited resources. Nature Biotechnology, 37(3), 324–326. 10.1038/s41587-019-0053-y
- Mellor D, Soderberg C, & DeHaven A (2017, January 5). Preregistration on the Open Science Framework [Video]. YouTube. https://www.youtube.com/watch?v=EnKkGO3OM9c
- Merton RK (1973) [1942]. The normative structure of science. In Merton RK (Ed.), The sociology of science: Theoretical and empirical investigations (pp. 267–278). University of Chicago Press.
- Meyer MN (2018). Practical tips for ethical data sharing. Advances in Methods and Practices in Psychological Science, 1(1), 131–144. 10.1177/2515245917747656
- Norris E, & O’Connor DB (2019). Science as behaviour: Using a behaviour change approach to increase uptake of open science. Psychology & Health, 34(12), 1397–1406. 10.1080/08870446.2019.1679373
- Nosek BA, Beck ED, Campbell L, Flake JK, Hardwicke TE, Mellor DT, van’t Veer AE, & Vazire S (2019). Preregistration is hard, and worthwhile. Trends in Cognitive Sciences, 23(10), 815–818. 10.1016/j.tics.2019.07.009
- Open Science Collaboration. (2015). Estimating the reproducibility of psychological science. Science, 349(6251), aac4716. 10.1126/science.aac4716
- Open Science Framework. (n.d.). Registrations. https://help.osf.io/hc/en-us/categories/360001550953-Registrations
- Piwowar H, Priem J, Larivière V, Alperin JP, Matthias L, Norlander B, … Haustein S (2018). The state of OA: A large-scale analysis of the prevalence and impact of open access articles. PeerJ, 6, e4375. 10.7717/peerj.4375
- Piwowar HA, & Vision TJ (2013). Data reuse and the open data citation advantage. PeerJ, 1, e175. 10.7717/peerj.175
- Polanin JR, Tanner-Smith EE, & Hennessy EA (2016). Estimating the difference between published and unpublished effect sizes: A meta-review. Review of Educational Research, 86(1), 207–236. 10.3102/0034654315582067
- Quint J (2016, September 12). Impacts and implementation of the i3-funded scale-up of Success for All. Inter-university Consortium for Political and Social Research [distributor]. 10.3886/ICPSR36387.v1
- Registry of Efficacy and Effectiveness Studies. (2020, June 11). Getting started in REES [Video]. YouTube. https://www.youtube.com/watch?v=OXODz7QZYmE
- Registry of Efficacy and Effectiveness Studies. (n.d.a). Checklist of required information for a registry entry. https://sreereg.icpsr.umich.edu/sreereg/checklist
- Registry of Efficacy and Effectiveness Studies. (n.d.b). Registry of Efficacy and Effectiveness Studies. https://sreereg.icpsr.umich.edu/sreereg/userguide
- Reich J, Gehlbach H, & Albers CJ (2020). “Like upgrading from a typewriter to a computer”: Registered reports in education research. AERA Open, 6(2), 1–6. 10.1177/2332858420917640
- Schäfer T, & Schwarz MA (2019). The meaningfulness of effect sizes in psychological research: Differences between sub-disciplines and the impact of potential biases. Frontiers in Psychology, 10, 813. 10.3389/fpsyg.2019.00813
- Scheel AM, Schijen M, & Lakens D (2020, February 5). An excess of positive results: Comparing the standard psychology literature with Registered Reports. 10.31234/osf.io/p6e9c
- Shero J, & Hart SA (2020). Informed consent template (Version 1). figshare. 10.6084/m9.figshare.13218773.v1
- Trainor AA, & Graue E (2014). Evaluating rigor in qualitative methodology and research dissemination. Remedial and Special Education, 35(5), 267–274. 10.1177/0741932514528100
- U.S. Copyright Office. (2019, December). Copyright basics. https://www.copyright.gov/circs/circ01.pdf
- U.S. Department of Health and Human Services. (n.d.). Federal policy for the protection of human subjects (‘common rule’). https://www.hhs.gov/ohrp/regulations-and-policy/regulations/common-rule/index.html
- US Department of Health and Human Services. (2012). Guidance regarding methods for de-identification of protected health information in accordance with the Health Insurance Portability and Accountability Act (HIPAA) Privacy Rule. https://www.hhs.gov/hipaa/for-professionals/privacy/special-topics/de-identification/index.htm
- van Dijk W, Schatschneider C, & Hart SA (2021). Open science in education science. Journal of Learning Disabilities, 54(2), 139–152. 10.1177/0022219420945267
