Integrative Medicine Research. 2025 Aug 19;15(1):101229. doi: 10.1016/j.imr.2025.101229

Data sharing in acupuncture meta-analyses: Associations with journal policies and practical considerations

Jaerang Park a,b,1, Inhu Bae a,c,1, Seaun Ryu a, Myungsun Kim a, Heejung Bang d, Jiyoon Won e, Hyangsook Lee a,c,
PMCID: PMC12423402  PMID: 40950803

Abstract

Background

Data sharing can reduce research waste, enable researchers to avoid duplicating efforts, and allow resources to be effectively directed towards addressing new clinical questions. This study aimed to evaluate data sharing practices and identify associated factors in acupuncture meta-analyses.

Methods

A PubMed search identified meta-analyses of any type of acupuncture (April 2022 to December 2023). Journal guidelines were classified by data sharing policies, and their associations with data availability statements (DASs) and data availability were examined using chi-squared tests or generalised estimating equations analyses.

Results

Of 3713 studies, 300 were included. Articles published in journals with data sharing policies were more likely to include DASs compared to those without (75.8 % vs. 21.7 %, p < 0.001). DASs were more frequently present when journals mandated sharing rather than merely recommended it (94.6 % vs. 59.2 %, p < 0.001). While no significant association was found between the presence of DASs or sharing policies and data availability, articles from mandating journals had higher odds of data provision than those from recommending journals (OR 1.58, 95 % CI [1.11, 2.25]). Non-Complementary and Alternative Medicine (CAM) journal articles outperformed those in CAM journals in DAS inclusion (79.1 % vs. 49.3 %, p < 0.001), though data accessibility was comparable (71.6 % vs. 69.3 %, p = 0.826). Impact factor was not significantly associated with any aspect of data sharing practices (all p > 0.05).

Conclusions

Mandatory journal data sharing policies were associated with more frequent inclusion of DASs and provision of raw data, but neither a policy nor a DAS alone ensured reusable datasets. Mandatory policies paired with adequate training and support may help improve transparency, promote reusability and reproducibility of results, and reduce research waste.

Keywords: Data sharing, Meta-analysis, Acupuncture, Transparency, Reproducibility

1. Introduction

Transparency and reproducibility are widely recognised features of scientific research,1 ensuring that the results are not merely due to chance.2 While we expect these important features to be guaranteed in published reports, evidence suggests that this is not always the case across various disciplines.3, 4, 5 In clinical research in particular, efforts have been made to enhance complete and transparent reporting in order to ensure the reliability and applicability of study findings, such as the development of reporting guidelines and the implementation of prospective protocol registrations.6, 7, 8

Among the various strategies to improve transparency and thereby increase the reproducibility of study results, data sharing is one viable option. Robust data sharing can reduce research waste, enable researchers to avoid duplicating efforts, and allow resources to be effectively directed towards addressing new clinical questions.9 Many journals have incorporated data sharing policies into their submission guidelines, generally asking authors to share raw data, analysis code, and supplementary files.10,11 Some journals mandate data sharing, while others simply encourage or recommend it. However, responsible data sharing remains far from routine practice.12 In 2018, the International Committee of Medical Journal Editors (ICMJE) introduced a clinical trial data sharing policy.11 Nevertheless, subsequent analysis found poor compliance: among 334 papers that declared data sharing (published between July 2018 and April 2020), only two actually made their data publicly available.13 Similarly, in a recent survey involving 292 Complementary/Alternative and Integrative Medicine (CAM) researchers, only 23 % of respondents shared raw data associated with their studies.14

While data sharing has been adopted in clinical trials,11 it has not yet been universally implemented in systematic reviews and meta-analyses. Given the overarching significance of systematic reviews and meta-analyses in clinical decision-making in evidence-based practice and in guiding future research, investigating data sharing practices in systematic reviews is just as critical as focusing on data sharing in individual trials.

Despite the growing number of systematic reviews of acupuncture,15 there is still a lack of knowledge on transparency and reproducibility in this area. Several studies have examined the reporting quality of acupuncture research using Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA),16,17 Consolidated Standards of Reporting Trials (CONSORT) statement,18 and Standards for Reporting Interventions in Clinical Trials of Acupuncture (STRICTA) checklist.19 However, only one study protocol so far has been published to investigate data sharing in acupuncture randomised controlled trials (RCTs),20 and to the best of our knowledge, none have specifically examined data sharing in systematic reviews and meta-analyses of acupuncture. This study therefore aimed to analyse how and to what extent currently published systematic reviews and meta-analyses on acupuncture share their data and what factors are associated with such practice.

2. Methods

2.1. Search strategies

PubMed was searched on November 11, 2023 for eligible meta-analyses published between September 1, 2022 and August 31, 2023. Because this initial search yielded too few records to reach the target of 300 articles, the search was rerun on January 20, 2024 and extended to publications between April 1, 2022 and December 27, 2023. Our full search strategy is provided in Supplement 1.

2.2. Eligibility criteria

Studies meeting the following criteria were included:

  • Systematic reviews with meta-analyses including either RCTs or non-RCTs of acupuncture for any conditions/diseases in humans;

  • Studies published between April 1, 2022 and December 27, 2023 in English and indexed in PubMed;

  • References of all included studies in the systematic review were available;

  • Systematic reviews presenting at least one pairwise meta-analysis with effect estimates from at least two studies (binary or continuous data), but excluding reviews for which no clinical data were available;

  • Network meta-analyses were considered if they met the above criteria;

  • Acupuncture treatments included manual acupuncture, ear acupuncture, electroacupuncture, laser acupuncture, embedded acupuncture, fire needling, warm needling, or pharmacopuncture; and other techniques (e.g., transcutaneous electrical nerve stimulation) were excluded;

  • Systematic reviews including multiple interventions were allowed if they analysed acupuncture intervention(s) satisfying the criteria above; and

  • No restrictions on control types and outcome measures were imposed.

2.3. Study screening and selection

Search results were imported to Covidence for screening after duplicates were removed. Covidence is a web-based collaboration software platform that streamlines the production of systematic and other literature reviews.21 Two independent reviewers (SR and MK) screened the titles and abstracts of retrieved articles followed by full text reviews for inclusion.

A total of 300 articles of systematic reviews with meta-analyses were targeted for analysis. This sample size was considered sufficient to estimate the percentage of reviews reporting a particular practice (e.g., reporting a full search strategy for at least one database) with a maximum margin of error of 6 %, assuming a prevalence of 50 % (i.e., 1.96 × √(0.5 × (1 − 0.5)/300) ≈ 0.057); for a prevalence of less (or greater) than 50 %, the margin of error was smaller.22
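
For illustration, this margin-of-error calculation can be reproduced in R with the standard normal-approximation formula (a minimal sketch; the 20 % prevalence in the second line is an arbitrary example, not a value from the study):

```r
# Margin of error for a proportion estimated from n = 300 reviews, using the
# normal approximation and the most conservative prevalence of 50 %.
n <- 300
p <- 0.5
1.96 * sqrt(p * (1 - p) / n)       # ~0.057, i.e., roughly a 6 % margin of error
1.96 * sqrt(0.2 * (1 - 0.2) / n)   # ~0.045: smaller when prevalence moves away from 50 %
```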

2.4. Extraction of general characteristics of the included studies

Two independent researchers (SR and MK) extracted the characteristics of the included systematic reviews with meta-analyses: first author, year of publication, the country where the review was conducted based on the first author’s affiliation(s), journal name, Journal Citation Reports (JCR) category (i.e., CAM vs. non-CAM), and journal impact factor (IF). Population, intervention, comparison, and outcome elements were also tabulated: disease/condition and International Classification of Diseases, 11th Revision (ICD-11) code, acupuncture intervention, and primary outcome. For ICD-11 coding, symptoms and signs for acupuncture treatment and underlying diseases/conditions (if relevant) were coded separately. Any discrepancy was resolved through discussion with the corresponding author (HL).

2.5. Assessment of data sharing

2.5.1. Data sharing by individual articles

To assess the data sharing statuses of the included studies, each study was coded as either “Yes” or “No” based on whether a Data Availability Statement (DAS) was included in the publication. Then, data sharing methods described in DASs were categorised as follows: “link” if data were openly available in a public repository (e.g., Open Science Framework (OSF)); “supplementary file” if data were provided as supplementary files or appendices; “available upon request to author(s)” if explicitly stated; and “within the article” if it was stated that data were included within the article.

To evaluate actual data availability in the published articles, the provision of raw data files, analysis code, or review materials was examined. If any of the data, analysis code, or review materials necessary to reproduce at least one pairwise meta-analysis of the primary outcome were available, the article was categorised as “Yes”; if such data were inaccessible or unavailable, it was categorised as “No”. Raw data files, analysis codes, and review materials were categorised as follows:

  • Data: any original/primary materials or extracted raw data, needed to reproduce or replicate the meta-analysis (i.e., unprocessed extracted data);

  • Analysis code: analysis code, together with the name of the statistical package used (e.g., R or Stata), needed to reproduce or replicate the meta-analysis; and

  • Review materials: all other materials created during the review or needed for reproducibility, except raw data files and analysis codes.

An article coded “No” for actual data sharing to reproduce pairwise meta-analyses may still share review materials, such as a citation list of all excluded studies. Therefore, we assessed all three types of data availability in each article. For articles that used the Cochrane Review Manager (RevMan) programme, the review materials needed to reproduce pairwise meta-analyses were considered available, because the software automatically displays such data, even if the authors did not explicitly intend to share them.

2.5.2. Data sharing by journal groups

Journal submission guidelines were reviewed to determine data sharing specifications. Journals with identical submission guidelines under the same publisher were grouped together for analysis, but those with different guidelines were classified separately. The two reviewers (JP and either SR or IB) collected information about data sharing policies from journal instructions for authors. Journal groups were classified as “Yes” (specified data sharing policy) or “No” (no policy), with “Yes” groups further categorised as “Mandatory” (required data sharing) or “Recommending” (encouraged or recommended data sharing).

At a journal group level, we further examined whether the journal policy specified the types of data to be shared: journal groups were classified as “Yes” if they specified sharing of any of the following, i.e., data, analysis code, or review materials, or as “No” if there was no such policy.

2.5.3. Association of data sharing policies and data sharing statuses

The association of DAS provision in the manuscripts (Yes vs. No) with the following journal level factors was analysed:

  • The presence of a data sharing policy (Yes vs. No); and

  • The level of data sharing requirements (Mandatory vs. Recommending).

Since the inclusion of a DAS in the manuscript may not necessarily guarantee actual data sharing, we also analysed the association of data availability (Yes vs. No) with the following factors:

  • The presence of a data sharing policy (Yes vs. No) at the journal group level;

  • The level of data sharing requirements (Mandatory vs. Recommending) at the journal group level; and

  • Inclusion of a DAS (Yes vs. No) at the individual article level.

We also examined whether explicitly specifying the types of data to be shared in the journal policy (Specified vs. Unspecified) at the journal group level affected data availability. For this analysis, we coded ‘Yes’ if any type of data was provided, even if at least one pairwise meta-analysis was not deemed reproducible, and ‘No’ if no data were available.

2.6. Data sharing in CAM vs. non-CAM journals

CAM journal articles have been criticised for a lack of high-quality clinical trial reports and biased conclusions.23 To evaluate whether CAM journals have different data sharing statuses compared with non-CAM journals, all journals were categorised as either CAM or non-CAM based on JCR categories. We then examined whether articles published in CAM and non-CAM journals differed in the presence of data sharing policies, the inclusion of DASs, and data availability.

2.7. Data sharing and journal IF

With journals without IFs excluded, we analysed the association between journal IF and the following:

  • The presence of a data sharing policy at the individual journal level;

  • The level of data sharing requirements at the individual journal level;

  • The presence of a DAS at the individual article level; and

  • Data availability at the individual article level.

2.8. Statistical analyses

All statistical analyses were performed using R (R Core Team, version 4.4.1). To summarise general characteristics of the included systematic reviews, descriptive statistics were used: frequency and percentage (%) for categorical data and mean with standard deviation (SD) or median with interquartile range for continuous data. At the individual article level, chi-squared tests were conducted to assess associations among journals’ data sharing policies (Yes vs. No), level of data sharing requirements (Mandatory vs. Recommending), DAS provision (Yes vs. No), data availability (Yes vs. No), and journal type (CAM vs. non-CAM).

At the journal group level, to account for the clustering of journals with shared submission guidelines under the same publisher, the Generalised Estimating Equations (GEE) method was used, implemented with the ‘geeglm’ function from the ‘geepack’ package. For the relationship between journal IF and data sharing practice, simple logistic regression models were used. Given that CAM journals usually have low IFs, the analysis unit for IF was set at 0.1 in the regression models. Results were reported as odds ratios (ORs) with 95 % confidence intervals (CIs). All raw data files and R codes are available at https://osf.io/utdxw/.
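
As a rough illustration of how these models might be specified, the sketch below uses synthetic data; the variable names, the exchangeable working correlation structure, and the data layout are assumptions for demonstration only, and the authors’ actual R code is available at the OSF link above.

```r
library(geepack)

# Minimal sketch with a synthetic data set (all variable names and values are
# hypothetical); the real analysis data and code are on the authors' OSF page.
set.seed(1)
dat <- data.frame(
  journal_group  = rep(1:10, each = 5),                    # clustering variable
  policy         = rep(c("Yes", "No"), times = c(40, 10)), # journal-group policy
  data_available = rbinom(50, 1, 0.7),                     # 1 = data available
  impact_factor  = runif(50, 0.3, 8.7)
)

# Journal-group-level GEE: data availability by policy presence, with articles
# clustered within journal groups (exchangeable working correlation assumed).
fit_gee <- geeglm(data_available ~ policy, id = journal_group, data = dat,
                  family = binomial, corstr = "exchangeable")

# Odds ratio and 95 % CI from the robust (sandwich) standard errors
est <- coef(summary(fit_gee))["policyYes", ]
exp(est$Estimate + c(OR = 0, lower = -1.96, upper = 1.96) * est$Std.err)

# Simple logistic regression of data availability on journal impact factor,
# rescaled so the OR corresponds to a 0.1-unit increase in IF.
fit_if <- glm(data_available ~ I(impact_factor / 0.1), data = dat,
              family = binomial)
exp(coef(fit_if))
```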

3. Results

3.1. Literature searching

The initial PubMed search, targeting eligible systematic reviews with meta-analyses published from 01/09/2022 to 31/08/2023, did not retrieve the target sample of 300 meta-analyses meeting the inclusion criteria. Consequently, the search period was expanded to April 1, 2022 – December 27, 2023, resulting in 3713 records. Of these, 358 duplicates were removed, and 2918 records were found to be irrelevant after screening titles and abstracts. Of the 437 studies assessed in full text, 137 were excluded, leaving a total of 300 studies for the final analysis (Fig. 1).

Fig. 1. PRISMA flow diagram of literature searching. *Journal group refers to journals that share common submission guidelines.

3.2. General characteristics of the included studies

Among the 300 included reviews, 178 (59 %) were published in 2023 and 122 (41 %) in 2022. Most reviews were conducted in China (n = 238, 79 %), followed by Korea (n = 20, 7 %) and Taiwan (n = 10, 3 %), with the remaining 32 (11 %) from 17 other countries. The included reviews were published across 106 distinct journals, with Medicine (Baltimore) contributing the most (n = 46, 15 %).

Diseases/conditions in the included reviews were coded using ICD-11. The most common category for target symptoms for acupuncture treatment was “Symptoms, signs or clinical findings, not elsewhere classified” (n = 62, 21 %), while the most prevalent category for underlying conditions was “Diseases of the nervous system” (n = 69, 23 %).

Over half of the reviews investigated multiple types of acupuncture (n = 168, 56 %), while the others only included a single type (n = 132, 44 %). Manual acupuncture (n = 179) and electroacupuncture (n = 167) were the most frequently studied (Table 1).

Table 1.

General characteristics of the included studies.

Item Frequency (n (%))
Year of publication
 2022 122/300 (40.7)
 2023 178/300 (59.3)
Country of corresponding author(s)
 China 238/300 (79.3)
 Korea 20/300 (6.7)
 Taiwan 10/300 (3.3)
 Others 32/300 (10.7)
Journals
 Medicine (Baltimore) 46/300 (15.3)
 Frontiers in Neurology 17/300 (5.7)
 Evidence-Based Complementary and Alternative Medicine 16/300 (5.3)
 Complementary Therapies in Medicine 14/300 (4.7)
 Frontiers of Medicine 14/300 (4.7)
 Frontiers in Neuroscience 13/300 (4.3)
 Others 180/300 (60.0)
ICD-11*: target symptoms and signs for acupuncture treatment
 Symptoms, signs or clinical findings, not elsewhere classified 62/300 (20.7)
 Mental, behavioural or neurodevelopmental disorders 39/300 (13.0)
 Diseases of the nervous system 39/300 (13.0)
 Diseases of the digestive system 30/300 (10.0)
 Diseases of the genitourinary system 29/300 (9.7)
 Diseases of the musculoskeletal system or connective tissue 21/300 (7.0)
 Sleep-wake disorders 18/300 (6.0)
 Endocrine, nutritional or metabolic diseases 14/300 (4.7)
 Diseases of the circulatory system 10/300 (3.3)
 Others 38/300 (12.7)
ICD-11: underlying diseases
 Diseases of the nervous system 69/300 (23.0)
 Diseases of the genitourinary system 38/300 (12.7)
 Diseases of the musculoskeletal system or connective tissue 28/300 (9.3)
 Mental, behavioural or neurodevelopmental disorders 25/300 (8.3)
 Neoplasms 24/300 (8.0)
 Symptoms, signs or clinical findings, not elsewhere classified 23/300 (7.7)
 Diseases of the digestive system 22/300 (7.3)
 Endocrine, nutritional or metabolic diseases 19/300 (6.3)
 Others 52/300 (17.3)
Type of acupuncture interventions investigated (multiple choices allowed)
 Manual acupuncture 179/300
 Electroacupuncture 167/300
 Ear acupuncture 55/300
 Pharmacopuncture 22/300
 Laser acupuncture 22/300
 Others 167/300
Types of acupuncture interventions investigated
 Single 132/300 (44.0)
 Multiple 168/300 (56.0)

ICD-11, International Classification of Diseases, 11th Revision.

3.3. Assessment of data sharing

3.3.1. Data sharing policies and data sharing levels in journal groups

Grouping the 106 distinct journals by identical data sharing policies in their submission guidelines resulted in 54 journal groups. Of these, 40 groups (74.1 %) specified data sharing policies, while 14 (25.9 %) did not. Within the 40 journal groups with data sharing policies, 8 (20.0 %) required data sharing and 32 (80.0 %) encouraged or recommended it. Data sharing methods stated in the DASs of the included meta-analyses varied greatly, with embedding data within the article being the most common (Supplement 2).

3.3.2. Association of data sharing policies and presence of a DAS

The majority of the 300 included meta-analyses (277, 92.3 %) were published in journals with established data sharing policies; among them, 130 (46.9 %) were in journals mandating data sharing, while 147 (53.1 %) were in journals that merely recommended it. Of the 300 meta-analyses, 215 (71.7 %) included DASs in the manuscripts, while 85 (28.3 %) did not. DAS inclusion was more common in journals with mandatory policies (94.6 %, 123/130) than in those with only recommendations (59.2 %, 87/147). Among the 23 articles published in journals without data sharing policies, only 21.7 % (5/23) provided a DAS in the manuscripts (Table 2).

Table 2.

Association of journals’ data sharing policies, DASs and data availability status in the articles.

Data sharing policy: Yes, 277 (92.3)
 Data sharing level: Mandatory, 130 (46.9)
  DAS* present: Yes, 123 (94.6); data availability: Yes 93 (75.6), No 30 (24.4)
  DAS present: No, 7 (5.4); data availability: Yes 5 (71.4), No 2 (28.6)
 Data sharing level: Recommending, 147 (53.1)
  DAS present: Yes, 87 (59.2); data availability: Yes 63 (72.4), No 24 (27.6)
  DAS present: No, 60 (40.8); data availability: Yes 34 (56.7), No 26 (43.3)
Data sharing policy: No, 23 (7.7)
 DAS present: Yes, 5 (21.7); data availability: Yes 3 (60.0), No 2 (40.0)
 DAS present: No, 18 (78.3); data availability: Yes 15 (83.3), No 3 (16.7)

DAS, data availability statement.

Meta-analysis articles published in journals with data sharing policies were more likely to include DASs in the manuscripts (75.8 %) than those published in journals without such policies (21.7 %) (χ² = 27.98, df = 1, p < 0.001). Furthermore, journals with mandatory policies were associated with a higher rate of DAS inclusion (94.6 %) than those that merely recommended sharing (59.2 %) (χ² = 45.32, df = 1, p < 0.001).

In an analysis of 300 articles across 54 journal groups, the odds of omitting a DAS in meta-analyses published in journals without data sharing policies were 11 times higher than in journals with such policies (OR 11.28, 95 % CI [2.79, 45.59], p < 0.001). Moreover, mandatory policies were associated with approximately 12 times higher odds of including a DAS compared to recommending policies (OR 12.10, 95 % CI [3.70, 39.70], p < 0.001).
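
For reference, the article-level chi-squared test can be reproduced from the crude counts in Table 2; the odds ratios above come from GEE models that account for clustering within journal groups, so the crude odds ratio below is only an approximate cross-check (a minimal sketch in R):

```r
# 2 x 2 table from Table 2: DAS present/absent by journal data sharing policy.
das_by_policy <- matrix(c(210, 5,     # DAS present: policy Yes, policy No
                          67, 18),    # DAS absent:  policy Yes, policy No
                        nrow = 2, byrow = TRUE,
                        dimnames = list(DAS = c("Yes", "No"),
                                        Policy = c("Yes", "No")))
chisq.test(das_by_policy)             # X-squared ~ 28, df = 1, p < 0.001

# Crude odds of omitting a DAS, journals without vs. with a policy
(18 / 5) / (67 / 210)                 # ~ 11.3, close to the GEE estimate above
```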

3.3.3. Association of data sharing policies and data sharing statuses

Within the 215 papers that included DASs in the manuscripts, data availability was confirmed in 159 papers (74.0 %). Among the 85 papers without DASs, data availability was identified in 54 papers (63.5 %) (Table 2). There was no significant difference in data availability according to whether a DAS was stated in the manuscript (74.0 % vs. 63.5 %, χ² = 2.73, df = 1, p = 0.10; Supplement 3).

In 54 journal groups, the association between the presence of data-sharing policies and data availability was examined using the GEE analysis. While no significant relationship between data sharing policies and data availability was observed (OR 0.66, 95 % CI [0.27, 1.59], p = 0.36), there was a 58 % increase in the odds of data availability in journals with mandatory data sharing policies compared to those with recommended data sharing policies (OR 1.58, 95 % CI [1.11, 2.25], p = 0.012).

When shared data were differentiated into data, analysis code, and review materials, at least one type was available in 269 manuscripts (89.7 %), while no data were provided in 31 manuscripts (10.3 %). Among the 269 articles, only one shared all three types of data (Fig. 2A). Review materials in individual articles included search strategies, reporting guideline checklists, additional tables or figures, details of included studies, risk of bias/quality assessments, and lists of excluded studies (Supplement 4). Among the 40 journal groups specifying a data sharing policy, 25 (62.5 %) differentiated data, analysis code, and materials sharing, while 15 (37.5 %) did not (Fig. 2B). Articles in journal groups whose policies specified the types of data to be shared were more likely to provide data, analysis code, or review materials (91.9 % vs. 81.5 %; χ² = 5.22, df = 1, p = 0.02).

Fig. 2. Differentiated data sharing policies and types of data shared. (A) in published articles; (B) in journal groups.

3.3.4. Data sharing in CAM vs. non-CAM journals

Of 300 meta-analyses included, 25 % (n = 75) were published in 17 CAM journals and the remaining 75 % (n = 225) were published in 89 non-CAM journals (Supplement 5). There was no significant difference in the adoption of data sharing policies between CAM and non-CAM journals (χ² = 3.11, df = 1, p = 0.08).

In CAM journals, 37 articles (49.3 %) included DASs in the manuscripts, while non-CAM journals displayed a significantly higher prevalence of DASs (n = 178, 79.1 %) (χ² = 23.12, df = 1, p < 0.001). However, data availability was similar between the two groups: 69.3 % (n = 52) of CAM articles vs. 71.6 % (n = 161) of non-CAM articles (χ² = 0.05, df = 1, p = 0.826) (Table 3).

Table 3.

Association of DASs & data availability in CAM vs non-CAM journals’ articles.

CAM* 75/300 (25.0)
 DAS present: Yes 37/75 (49.3), No 38/75 (50.7)
 Data availability: Yes 52/75 (69.3), No 23/75 (30.7)
Non-CAM 225/300 (75.0)
 DAS present: Yes 178/225 (79.1), No 47/225 (20.9)
 Data availability: Yes 161/225 (71.6), No 64/225 (28.4)

CAM, Complementary/Alternative and Integrative Medicine;

DAS, data availability statement.

3.3.5. Data sharing and journal IF

Of the 106 journals included, only 70 had an IF, ranging from 0.3 to 8.7 (mean 3.5). Whether a journal had an IF was not significantly associated with data sharing practice (all p > 0.1). Among journals with an IF, there was no significant association between journal IF and any of the following: data sharing policy presence (OR 0.96, 95 % CI [0.91, 1.02], p = 0.19), level of data sharing requirements (OR 0.97, 95 % CI [0.95, 1.00], p = 0.07), inclusion of a DAS in the manuscript (OR 0.98, 95 % CI [0.94, 1.01], p = 0.18), and data availability (OR 1.01, 95 % CI [0.98, 1.03], p = 0.50).

4. Discussion

4.1. Summary of main findings

This analysis of 300 acupuncture meta-analyses revealed that articles were more likely to include DASs when journals had data sharing policies (75.8 % vs. 21.7 %), and DAS provision was significantly higher in journals mandating data sharing (94.6 %) than in those recommending it (59.2 %). Although DAS presence alone did not guarantee data availability, mandatory policies were associated with 58 % higher odds of data availability compared with recommending policies. Non-CAM journal articles outperformed those in CAM journals in DAS inclusion (79.1 % vs. 49.3 %), though data accessibility was comparable (71.6 % vs. 69.3 %). Meanwhile, journal IF was not significantly associated with any aspect of data sharing practices.

4.2. Interpretation of the results in comparison of previous studies

The prevalence of DASs in published manuscripts has reportedly varied across disciplines,24,25 journals, and study contexts depending on the scope and methods of analysis, e.g., 8 % in meta-research studies of medicine and health sciences,26 19.5 % in general biomedical literature,27 34 % in clinical studies in cardiology,28 98 % of full research reports from BMC open-access journals,29 and 72 % in the present results. Among factors that may have contributed to this considerable variation,30,31 implementation of data sharing policies by journals can be critical. Previous studies have demonstrated that data sharing policies, such as those mandated by the Public Library of Science (PLoS) and BioMed Central (BMC), significantly increased the inclusion of DASs.32,33 Another study reported that adoption of any data sharing policies increased the likelihood of data availability online by 25 times compared to the absence of such policies.34 These findings are largely consistent with the present results, which highlight that data sharing policies are strongly associated with a higher likelihood of including DASs in the manuscripts (OR 11.28, 95 % CI [2.79, 45.59]).

Mandatory data sharing policies implemented by journals have been shown to significantly increase DAS prevalence in manuscripts.29,32,35 Our study also supports this finding: articles published in journals with mandatory policies were more likely to include DASs than those in journals with recommending policies (OR 12.10, 95 % CI [3.70, 39.70]). However, DAS inclusion in published articles does not always translate into actual data accessibility. A recent meta-analysis of medical and health research studies on data sharing showed that while 8 % of the included studies declared data availability, only 2 % actually provided data. In addition, while articles published in journals with a mandatory open data sharing policy have been reported to provide raw data files more frequently, the reusability and reproducibility of the provided raw data remained low.26,35 In our analysis, neither the presence of a journal data sharing policy nor the inclusion of a DAS was significantly associated with increased data availability. Only mandatory data sharing policies significantly improved data availability (OR 1.58, 95 % CI [1.11, 2.25]), suggesting that simply implementing policies or including DASs may not be sufficient to ensure raw data accessibility.26,29 One possible interpretation for the lack of a significant association is the high proportion of included meta-analyses that used RevMan (64.7 %), which may overestimate actual data availability regardless of the authors’ intention to share. Alternatively, researchers’ awareness of and attitudes toward data sharing may have improved over time (e.g., between 201334 and the present study in 2024). Nonetheless, our results are in line with previous studies, suggesting that mandatory data sharing policies are more effective than recommending policies in increasing actual data availability.34

Regarding factors affecting data sharing practices, the presence and level of data sharing policies have been reported to vary across journals in various disciplines, and journals with higher IFs were more likely to adopt such policies.36 We explored the differences between CAM and non-CAM journal reports and the associations between journal IFs and data sharing practices in our analysis. While the reporting quality of acupuncture trials and meta-analyses has been extensively criticised,37, 38, 39 studies that have specifically investigated data sharing status are sparse. Also, the data sharing status of meta-analyses published in CAM journals has not yet been compared with that of their counterparts. Except for the higher prevalence of DASs in non-CAM journals, no significant differences were found between CAM and non-CAM journals regarding data sharing policies and data availability. Plausible explanations include the following: the difference may lie not in whether policies exist, but in how strictly they are enforced during the editorial process and/or how closely authors adhere to them. It is also possible that varying levels of familiarity with open science practices across research communities contributed to the observed gap.

Unlike previous studies that reported positive associations between journal IFs and data sharing practices,30,40,41 our analysis found no such association. This discrepancy may be due to different classification methods for data sharing policies: some studies classified data sharing policies as weak/strong/no policy,40 whereas this study categorised the presence of data sharing policies as yes/no and the data sharing level as recommending/mandatory. Also, steadily improving data sharing practices26,32 may have attenuated the association between data sharing and IF.

4.3. Strengths and limitations of this study

While many studies have explored the transparency and reproducibility of meta-analyses across medical disciplines,30,42, 43, 44, 45, 46, 47, 48 none has specifically examined data sharing practices in acupuncture meta-analyses. Some studies have assessed the reporting quality of acupuncture systematic reviews and data sharing levels in trials.16,17,20,38,39 However, research on the data sharing status of meta-analyses on acupuncture is still lacking. This study is therefore the first assessment of data sharing practices in acupuncture meta-analysis reports. It also provides the first comparative analysis of data sharing policy levels (mandatory vs. recommended) and their potential influence on the provision of DASs and data availability. This comparison underscores the critical role of journal editorial enforcement of mandatory data sharing policies in promoting reporting transparency.

However, given the complex issues associated with data sharing from both the researcher’s and the journal/publisher’s perspectives, one may argue that simply introducing a mandatory data sharing policy is not sufficient to substantially improve data sharing practices. Sharing data often entails additional costs for authors or journals, such as expenses related to maintaining and operating data repositories.49 Researchers have also expressed concerns that data sharing may lead to a loss of originality, as similar studies could be published earlier using shared data.50 Additionally, there are concerns about potential breaches of privacy.24 All these findings suggest that data sharing is influenced by a complex interplay of various factors. In this context, it is noteworthy that the proportion of articles sharing any type of data was significantly higher in journal groups with policies that specified the type of data to be shared. Previous studies have mostly focused on whether data sharing policies are mandatory or recommending, while relatively few have examined the specificity of such policies, including the types of data required. Beyond simply mandating data sharing, more detailed and structured data sharing policies may effectively promote data sharing by authors.

Although engagement with transparency initiatives has been relatively limited among CAM researchers, this trend may reflect structural barriers, such as chronic underfunding, limited institutional support and financial incentives, and historically distinct research traditions.14,51 Our study, however, did not find any significant discrepancy in data sharing practices between CAM and non-CAM journals. Such evaluation can be extended to different medical research fields, contributing to a more nuanced understanding of reporting transparency across various scientific domains.

This study also has several limitations. Firstly, the dynamic nature of journal data sharing policies posed a significant challenge, as journals’ policies appeared to change frequently during the analysis. Such changes might have affected the evaluation of adherence to policies in certain articles: some may have complied with the policies in place at the time of submission, but the policies had changed by the time of publication. Furthermore, even when data sharing policies are in place, enforcement may vary among journals, for example in whether data sharing is a condition of publication. In addition, as authors’ perceptions of data sharing and journal policies change over time, the results may differ if more recent articles were included.

Secondly, this study focused only on systematic reviews indexed in PubMed and written in English. Consequently, included articles in this study may reflect a limited portion of acupuncture systematic reviews and may not be fully representative. The results might have been different if the dataset had included Chinese databases with a higher proportion of CAM-related literature.

Thirdly, although many studies have reported on data sharing practices in various disciplines,24,28,30 the binary classification of journals into CAM and non-CAM may have limited the generalisability of the study findings. Although our analysis was based on the JCR category, many journals in practice exhibit a hybrid or overlapping scope. Nonetheless, considering that most journals tend to publish articles consistent with their designated subject categories, and that these classifications are regularly updated, this approach can still be considered valid.52

4.4. Implications for future research

While it is considered best practice when data are openly available in a public repository, future research on reporting transparency should focus on analysing different data sharing methods to find out which ones are more effective for making raw data easier to access, reuse and reproduce. In addition, it would be of interest to look into whether different types of data, i.e., raw data files, analysis codes, or review materials, are shared to a similar extent. This could provide valuable insights for systematic reviewers and journal editors when developing meta-analysis protocols or evaluating manuscripts. Furthermore, examining whether transparency differs according to the reporting and methodological quality of systematic reviews could enhance our understanding of transparency in the field of data sharing. In addition to evaluating authors’ behaviour and data sharing status, future studies should also investigate underlying factors and motivations, such as funders’ policies or the costs associated with repository use. Beyond simply mandating data sharing, pairing mandates with adequate training and tools (e.g., checklists, templates, repository workflows, and editorial checks) in line with Findability, Accessibility, Interoperability, and Reusability (FAIR) guiding principles may improve appropriate data sharing practices and the practical reusability of shared data.53 Comparative studies of implementation supports could identify approaches that outperform mandates alone.

Last but not least, reporting completeness and reproducibility are as important as the transparency achieved through data sharing. Previous reports have raised concerns that the reusability and reproducibility of the raw data provided in articles published in journals with open data sharing policies are far from satisfactory.35 Building on our results, future studies could examine whether the data shared by authors are genuinely accessible and reusable, ultimately supporting the reproducibility of published articles. A more nuanced, multi-dimensional investigation into the factors affecting data availability and actual reproducibility is warranted to enhance the value of scientific works.

4.5. Conclusion

Mandatory data sharing policies in journals were associated with more frequent inclusion of DASs and raw data files, but neither a policy nor a DAS alone guaranteed accessible or reusable datasets. To avoid low-quality or unusable deposits in the absence of adequate training or tools, mandatory policy interventions should be paired with practical support such as repository guidance, templates, quality checks, and incentives, as framed by the FAIR guiding principles. Together, these measures may help improve transparency in acupuncture evidence synthesis, reduce research waste, and promote reliability and public trust.

Author contributions

Jaerang Park: Investigation, Writing – original draft, Formal analysis. Inhu Bae: Investigation, Validation, Formal analysis, Writing – original draft. Seaun Ryu, Myungsun Kim: Investigation, Validation. Heejung Bang: Formal analysis, Writing - review & editing. Jiyoon Won: Validation, Formal analysis. Hyangsook Lee: Conceptualization, Funding acquisition, Project administration, Writing – review & editing, and Supervision.

Conflict of interest

JW is an editorial board member of this journal but JW's editorial board membership had no bearing on the editorial decision. The authors declare that they have no conflicts of interest.


Funding

This work was supported by grant RS-2023-00278131 from the National Research Foundation of Korea funded by the Korean Ministry of Science and Information and Communication Technology. Hyangsook Lee was supported by Kyung Hee University during her sabbatical and this manuscript is based in part on Jaerang Park’s master’s thesis.

Ethical statement

No ethics approval was needed for this work.

Data availability

All raw data files and R codes are available at https://osf.io/utdxw/.

Declaration of generative AI and AI-assisted technologies in the writing process

During the preparation of this manuscript, the author(s) used ChatGPT 4o in order to rephrase English sentences and verify English grammar, as none of the authors of this paper are native English speakers. After using this tool/service, the authors reviewed and edited the content as needed and take full responsibility for the content of the publication.

Acknowledgments

We thank anonymous reviewers for their critical comments which we believe greatly helped improve the manuscript.

Footnotes

Supplementary material associated with this article can be found, in the online version, at doi:10.1016/j.imr.2025.101229.

Appendix. Supplementary materials

mmc1.pdf (150.8KB, pdf)

References

  • 1. Nosek B.A., Alter G., Banks G.C., et al. Promoting an open research culture. Science. 2015;348(6242):1422–1425. doi: 10.1126/science.aab2374.
  • 2. Page M.J., Moher D., Fidler F.M., et al. The REPRISE project: protocol for an evaluation of REProducibility and replicability In syntheses of evidence. Syst Rev. 2021;10(1):112. doi: 10.1186/s13643-021-01670-0.
  • 3. Errington T.M., Denis A., Perfito N., Iorns E., Nosek B.A. Challenges for assessing replicability in preclinical cancer biology. Elife. 2021;10. doi: 10.7554/eLife.67995.
  • 4. Ioannidis J.P., Munafò M.R., Fusar-Poli P., Nosek B.A., David S.P. Publication and other reporting biases in cognitive sciences: detection, prevalence, and prevention. Trends Cogn Sci. 2014;18(5):235–241. doi: 10.1016/j.tics.2014.02.010.
  • 5. John L.K., Loewenstein G., Prelec D. Measuring the prevalence of questionable research practices with incentives for truth telling. Psychol Sci. 2012;23(5):524–532. doi: 10.1177/0956797611430953.
  • 6. Stevens A., Shamseer L., Weinstein E., et al. Relation of completeness of reporting of health research to journals' endorsement of reporting guidelines: systematic review. BMJ. 2014;348:g3804. doi: 10.1136/bmj.g3804.
  • 7. Sideri S., Papageorgiou S.N., Eliades T. Registration in the international prospective register of systematic reviews (PROSPERO) of systematic review protocols was associated with increased review quality. J Clin Epidemiol. 2018;100:103–110. doi: 10.1016/j.jclinepi.2018.01.003.
  • 8. Schulz K.F., Altman D.G., Moher D., CONSORT Group. CONSORT 2010 statement: updated guidelines for reporting parallel group randomised trials. BMJ. 2010;340:c332. doi: 10.1136/bmj.c332.
  • 9. Wang S.V., Pottegård A. Building transparency and reproducibility into the practice of pharmacoepidemiology and outcomes research. Am J Epidemiol. 2024;193(11):1625–1631. doi: 10.1093/aje/kwae087.
  • 10. Tan A.C., Webster A.C., Libesman S., et al. Data sharing policies across health research globally: cross-sectional meta-research study. Res Synth Methods. 2024;15(6):1060–1071. doi: 10.1002/jrsm.1757.
  • 11. Taichman D.B., Sahni P., Pinborg A., et al. Data sharing statements for clinical trials: a requirement of the International Committee of Medical Journal Editors. JAMA. 2017;317(24):2491–2492. doi: 10.1001/jama.2017.6514.
  • 12. Tenopir C., Rice N.M., Allard S., et al. Data sharing, management, use, and reuse: practices and perceptions of scientists worldwide. PLoS One. 2020;15(3). doi: 10.1371/journal.pone.0229003.
  • 13. Danchev V., Min Y., Borghi J., Baiocchi M., Ioannidis J.P.A. Evaluation of data sharing after implementation of the International Committee of Medical Journal Editors data sharing statement requirement. JAMA Netw Open. 2021;4(1):e2033972. doi: 10.1001/jamanetworkopen.2020.33972.
  • 14. Ng J.Y., Lin B.X., Kreuder L., Cramer H., Moher D. Open science practices among authors published in complementary, alternative, and integrative medicine journals: an international, cross-sectional survey. Medicine (Baltimore). 2024;103(44). doi: 10.1097/MD.0000000000040259.
  • 15. Lu L., Zhang Y., Tang X., et al. Evidence on acupuncture therapies is underused in clinical practice and health policy. BMJ. 2022;376. doi: 10.1136/bmj-2021-067475.
  • 16. Liu Y., Zhang R., Huang J., et al. Reporting quality of systematic reviews/meta-analyses of acupuncture. PLoS One. 2014;9(11). doi: 10.1371/journal.pone.0113172.
  • 17. Qin C., Ma H., Mandizadza O.O., Xu X., Ji C. Reporting quality of meta-analyses in acupuncture: investigating adherence to the PRISMA statement. Medicine (Baltimore). 2024;103(39). doi: 10.1097/md.0000000000039933.
  • 18. Qin C., Deng S., Li B., et al. How is the quality of randomized controlled trials (RCTs) for acupuncture treatment of post-stroke aphasia? A report quality assessment. PLoS One. 2024;19(10). doi: 10.1371/journal.pone.0308704.
  • 19. Lu T., Lu C., Li H., et al. The reporting quality and risk of bias of randomized controlled trials of acupuncture for migraine: methodological study based on STRICTA and RoB 2.0. Complement Ther Med. 2020;52:102433. doi: 10.1016/j.ctim.2020.102433.
  • 20. Duan Y., Xu Z., Li X., et al. Reporting and data-sharing level of acupuncture randomised controlled trials: a cross-sectional study protocol. BMJ Open. 2023;13. doi: 10.1136/bmjopen-2022-070545.
  • 21. Covidence systematic review software [online]. Melbourne, Australia: Veritas Health Innovation; 2014. https://www.covidence.org. Accessed 20 Dec 2023.
  • 22. Nguyen P.Y., Kanukula R., McKenzie J.E., et al. Changing patterns in reporting and sharing of review data in systematic reviews with meta-analysis of the effects of interventions: cross sectional meta-research study. BMJ. 2022;379. doi: 10.1136/bmj-2022-072428.
  • 23. Schmidt K., Pittler M., Ernst E. A profile of journals of complementary and alternative medicine. Swiss Med Wkly. 2001;131(39–40):588–591. doi: 10.57187/smw.2001.09818.
  • 24. Tedersoo L., Kungas R., Oras E., et al. Data sharing practices and data availability upon request differ across scientific disciplines. Sci Data. 2021;8(1):192. doi: 10.1038/s41597-021-00981-0.
  • 25. Page M.J., Nguyen P.Y., Hamilton D.G., et al. Data and code availability statements in systematic reviews of interventions were often missing or inaccurate: a content analysis. J Clin Epidemiol. 2022;147:1–10. doi: 10.1016/j.jclinepi.2022.03.003.
  • 26. Hamilton D.G., Hong K., Fraser H., Rowhani-Farid A., Fidler F., Page M.J. Prevalence and predictors of data and code sharing in the medical and health sciences: systematic review with meta-analysis of individual participant data. BMJ. 2023;382. doi: 10.1136/bmj-2023-075767.
  • 27. Serghiou S., Contopoulos-Ioannidis D.G., Boyack K.W., Riedel N., Wallach J.D., Ioannidis J.P.A. Assessment of transparency indicators across the biomedical literature: how open is open? PLoS Biol. 2021;19(3). doi: 10.1371/journal.pbio.3001107.
  • 28. Archer D., Barks N., Chaudhry M., et al. Data sharing statements: impact of journal policies across clinical research disciplines. Eur Heart J. 2025. doi: 10.1093/eurheartj/ehaf359.
  • 29. Gabelica M., Bojcic R., Puljak L. Many researchers were not compliant with their published data sharing statement: a mixed-methods study. J Clin Epidemiol. 2022;150:33–41. doi: 10.1016/j.jclinepi.2022.05.019.
  • 30. Moore J., Nguyen K., Dennis B., et al. Assessing the prevalence, quality and compliance of data-sharing statements in gastroenterology publications: a cross-sectional analysis. BMJ Open. 2025;15(3). doi: 10.1136/bmjopen-2024-092490.
  • 31. Zhang J., Liu Y., Thabane L., et al. Journal requirement for data sharing statements in clinical trials: a cross-sectional study. J Clin Epidemiol. 2024;172. doi: 10.1016/j.jclinepi.2024.111405.
  • 32. Federer L.M., Belter C.W., Joubert D.J., et al. Data sharing in PLOS ONE: an analysis of Data Availability statements. PLoS One. 2018;13(5). doi: 10.1371/journal.pone.0194768.
  • 33. Colavizza G., Hrynaszkiewicz I., Staden I., Whitaker K., McGillivray B. The citation advantage of linking publications to research data. PLoS One. 2020;15(4). doi: 10.1371/journal.pone.0230416.
  • 34. Vines T.H., Andrew R.L., Bock D.G., et al. Mandated data archiving greatly improves access to research data. FASEB J. 2013;27(4):1304–1308. doi: 10.1096/fj.12-218164.
  • 35. Hardwicke T.E., Mathur M.B., MacDonald K., et al. Data availability, reusability, and analytic reproducibility: evaluating the impact of a mandatory open data policy at the journal Cognition. R Soc Open Sci. 2018;5(8). doi: 10.1098/rsos.180448.
  • 36. Resnik D.B., Morales M., Landrum R., et al. Effect of impact factor and discipline on journal data sharing policies. Account Res. 2019;26(3):139–156. doi: 10.1080/08989621.2019.1591277.
  • 37. Gagnier J.J., DeMelo J., Boon H., Rochon P., Bombardier C. Quality of reporting of randomized controlled trials of herbal medicine interventions. Am J Med. 2006;119(9):800.e1–800.e11. doi: 10.1016/j.amjmed.2006.02.006.
  • 38. Wang Y., Chen N., Guo K., et al. Reporting and methodological quality of acupuncture network meta-analyses could be improved: an evidence mapping. J Clin Epidemiol. 2023;153:1–12. doi: 10.1016/j.jclinepi.2022.11.004.
  • 39. Lu L.M., He J., Zeng J.C., Liao M.X., Jia C., Pan H.H. Impact evaluation of CONSORT and STRICTA guidelines on reporting quality for randomized controlled trials of acupuncture conducted in China. Chin J Integr Med. 2017;23(1):10–17. doi: 10.1007/s11655-016-2451-z.
  • 40. Kim J., Kim S., Cho H.M., Chang J.H., Kim S.Y. Data sharing policies of journals in life, health, and physical sciences indexed in Journal Citation Reports. PeerJ. 2020;8:e9924. doi: 10.7717/peerj.9924.
  • 41. Stodden V., Guo P., Ma Z. Toward reproducible computational research: an empirical analysis of data and code policy adoption by journals. PLoS One. 2013;8(6). doi: 10.1371/journal.pone.0067111.
  • 42. Cullis P.S., Gudlaugsdottir K., Andrews J. A systematic review of the quality of conduct and reporting of systematic reviews and meta-analyses in paediatric surgery. PLoS One. 2017;12(4). doi: 10.1371/journal.pone.0175213.
  • 43. Peters J.P., Hooft L., Grolman W., Stegeman I. Reporting quality of systematic reviews and meta-analyses of otorhinolaryngologic articles based on the PRISMA statement. PLoS One. 2015;10(8). doi: 10.1371/journal.pone.0136540.
  • 44. Wayant C., Page M.J., Vassar M. Evaluation of reproducible research practices in oncology systematic reviews with meta-analyses referenced by National Comprehensive Cancer Network guidelines. JAMA Oncol. 2019;5(11):1550–1555. doi: 10.1001/jamaoncol.2019.2564.
  • 45. Zhu Y., Fan L., Zhang H., et al. Is the best evidence good enough: quality assessment and factor analysis of meta-analyses on depression. PLoS One. 2016;11(6). doi: 10.1371/journal.pone.0157808.
  • 46. López-Nicolás R., López-López J.A., Rubio-Aparicio M., Sánchez-Meca J. A meta-review of transparency and reproducibility-related reporting practices in published meta-analyses on clinical psychological interventions (2000–2020). Behav Res Methods. 2022;54(1):334–349. doi: 10.3758/s13428-021-01644-z.
  • 47. Koffel J.B., Rethlefsen M.L. Reproducibility of search strategies is poor in systematic reviews published in high-impact pediatrics, cardiology and surgery journals: a cross-sectional study. PLoS One. 2016;11(9). doi: 10.1371/journal.pone.0163309.
  • 48. Polanin J.R., Hennessy E.A., Tsuji S. Transparency and reproducibility of meta-analyses in psychology: a meta-review. Perspect Psychol Sci. 2020;15(4):1026–1041. doi: 10.1177/1745691620906416.
  • 49. Goodhill G.J. Open access: practical costs of data sharing. Nature. 2014;509(7498):33. doi: 10.1038/509033b.
  • 50. Schmidt B., Gemeinholzer B., Treloar A. Open data in global environmental research: the Belmont Forum's open data survey. PLoS One. 2016;11(1). doi: 10.1371/journal.pone.0146695.
  • 51. Ng J.Y., Wieland L.S., Lee M.S., et al. Open science practices in traditional, complementary, and integrative medicine research: a path to enhanced transparency and collaboration. Integr Med Res. 2024;13(2). doi: 10.1016/j.imr.2024.101047.
  • 52. Clarivate Analytics. Clarivate unveils the 2025 Journal Citation Reports [online]. https://clarivate.com/news/clarivate-unveils-the-2025-journal-citation-reports/. Published June 18, 2025. Accessed July 14, 2025.
  • 53. Wilkinson M.D., Dumontier M., Aalbersberg I.J., et al. The FAIR Guiding Principles for scientific data management and stewardship. Sci Data. 2016;3. doi: 10.1038/sdata.2016.18.
