Abstract
Bibliometric analysis has gained popularity as a quantitative research methodology for evaluating scholarly productivity and identifying trends within specific research areas. However, there are currently no established reporting guidelines for bibliometric studies. The present study aimed to investigate the reporting practices of bibliometric research related to health and medicine based on a guideline, “Preferred Reporting Items for Bibliometric Analysis (PRIBA)”, proposed in this study. The Science Citation Index Expanded of the Web of Science was used to identify the top 100 articles with the highest normalized citation counts per year. The search was conducted on April 9, 2022, using the search topic “bibliometric” and including publications from 2019 to 2021. The results substantiated the need for a standardized reporting guideline for bibliometric research. Specifically, of the 25 items proposed in the PRIBA, only five were consistently reported across all articles examined. A further 11 items were reported by at least 80% of the articles, while the remaining nine were reported by less than 80%. In conclusion, our findings suggest that the reporting practices of bibliometric studies in the field of health and medicine are in need of improvement. Future research should be conducted to refine the PRIBA guidelines.
Keywords: Reporting guidelines, Bibliometrics, Scientometrics, Scientific publications
1. Introduction
Bibliometric analysis, which was first introduced by Alan Pritchard in 1969 [1], has gained popularity as a quantitative research methodology in recent years. This is partly due to the increasing availability and accessibility of analytical software packages [2] and electronic indexing databases [[3], [4], [5], [6]]. According to Pritchard’s original definition, bibliometrics can be defined as “the application of mathematical and statistical methods to books and other media of communication” [1]. A related term, “scientometrics”, was also coined by Nalimov and Mulchenko in 1969. Although the two terms are often used interchangeably, scientometrics is typically used to refer to the quantitative study of science and technology. Nacke introduced the term “informetrics” in 1978 as a general term for scientometrics and bibliometrics [7]. However, it is not commonly used in medical literature titles. Of the three terms, only bibliometrics has been recognized as a Medical Subject Headings (MeSH) term for indexing articles in PubMed. Therefore, in this study, the term bibliometrics will refer to research based on the quantitative analysis of scientific publication output.
Bibliometric studies have gained popularity as a methodology for quantitatively assessing scholarly productivity and identifying trends in various disciplines. This approach is prevalent not only within the specialized fields of library and information science, but also in other professional fields [8,9], particularly in health sciences [10]. Bibliometric studies can provide quantitative summaries of a given research field or topic from both performance and network perspectives. Performance analysis involves measuring productivity through counting, rankings, and citations, while network analysis reveals intellectual connections using techniques such as co-authorship and co-citation analysis [11]. These analyses can be conducted on several levels, including individual-level (authors), journal-level, institution-level, and international-level. These measures can be referred to as bibliometric indicators. While numerous conceivable indicators exist [12], they must be measurable with a high degree of validity for practical use. A recent review offers an overview of various techniques and indicators of bibliometric analysis [13].
Comprehensive and transparent reporting is essential to improve the reliability, interpretability, and value of research findings. To this end, various reporting guidelines have been developed for a wide range of healthcare research methodologies. The EQUATOR (Enhancing the QUAlity and Transparency Of health Research) Network [14], an international initiative that promotes accurate reporting of health research studies, defines a reporting guideline as “a checklist, flow diagram, or structured text to guide authors in reporting a specific type of research, developed using explicit methodology”. Currently, there are over 90 reporting guidelines listed on the EQUATOR Network registry [15], covering various types of research, such as randomized trials (CONSORT), observational studies (STROBE), systematic reviews (PRISMA), case reports (CARE), qualitative research (SRQR), diagnostic and prognostic studies (STARD), quality improvement studies (SQUIRE), economic evaluations (CHEERS), animal preclinical studies (ARRIVE), study protocols (SPIRIT), and clinical practice guidelines (AGREE). However, there are currently no reporting guidelines for studies based on bibliometric analysis.
A recent study examined pertinent records indexed in the Web of Science Core Collection from 1965 to 2019 and identified an “uncontainable” pattern of dynamic growth in bibliometrics as a research methodology. Between 2000 and 2014, the number of published papers increased 12-fold, expanding across all knowledge domains, particularly in the life sciences and biomedicine. The study’s author advocated for strengthening the teaching of bibliometrics as a research tool and for rigorous review before accepting such studies [16]. Given the widespread interest in using bibliometrics for research evaluation, the number of publications using bibliometric analysis is expected to continue to increase in the medical and natural sciences [17]. As early as 1996, Glänzel identified the need for standardization in bibliometric research and technology to improve the reliability and validity of bibliometric results [18]. Other authors have also called for more standardization to promote progress in bibliometric research [19]. In addition, standardization is needed to improve interoperability among bibliometric studies and to ensure that research data can meet the guiding principles of findability, accessibility, interoperability, and reusability outlined by the FAIR principles [20]. Therefore, the aim of this study was to investigate the current reporting practices of health and medicine-related bibliometric studies based on a proposed guideline called the “Preferred Reporting Items for Bibliometric Analysis (PRIBA)”. PRIBA was adapted and extended from the items used in the PRISMA statement to fit the setting of bibliometric studies.
2. Methods
2.1. Source of bibliometric data and search strategy
The Web of Science (WoS) Core Collection database (Clarivate Analytics, Philadelphia, PA, USA) was used to identify articles using bibliometric analysis. The WoS Core Collection is a comprehensive platform encompassing over 21,000 peer-reviewed scholarly journals across more than 250 disciplines in the sciences, social sciences, and humanities. It is one of the most comprehensive database platforms available for determining citation counts and is widely used in bibliometric studies [21,22]. The Science Citation Index Expanded (SCI-Expanded, 1900 to present) of the WoS Core Collection online database was searched using the topic “bibliometric” (based on the WoS field tag TS). The search was conducted on April 9, 2022, and the publication years were limited to 2019 to 2021, based on the WoS field tag PY.
In addition, to ensure proper interpretation of the results, publication language was restricted to English, based on the WoS field tag LA. Only original articles, based on the WoS field tag DT, were included in this study. As the focus of the present study was on health and medicine-related articles, the following WoS categories were used: “surgery” or “public environmental occupational health” or “health care sciences services” or “medicine general internal” or “orthopedics” or “oncology” or “medicine research experimental” or “dentistry oral surgery medicine” or “pharmacology pharmacy” or “nursing” or “medical informatics” or “psychiatry” or “neurosciences” or “integrative complementary medicine” or “health policy services” or “pediatrics” or “immunology” or “ophthalmology” or “rheumatology” or “endocrinology metabolism” or “emergency medicine” or “engineering biomedical” or “gastroenterology hepatology” or “infectious diseases” or “obstetrics gynecology” or “sport sciences” or “urology nephrology” or “cardiac cardiovascular systems” or “otorhinolaryngology” or “tropical medicine” or “dermatology”. WoS categories with fewer than 10 articles published from 2019 to 2021 were excluded.
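As an illustration, the filters above (topic, publication years, language, document type, and subject categories) can be composed into a single WoS advanced-search string. This is a hedged sketch, not the authors' exact query; the WC field tag for subject categories and the clause syntax follow common WoS advanced-search conventions and are assumptions here.

```python
# Illustrative sketch of composing a WoS advanced-search string from the
# field tags named in the text: TS = topic, PY = publication year,
# LA = language, DT = document type, and (assumed) WC = WoS category.

CATEGORIES = [
    "surgery",
    "public environmental occupational health",
    "medicine general internal",
    # ... the remaining health and medicine categories listed in the text
]

def build_wos_query(topic: str, years: tuple, categories: list) -> str:
    """Combine field-tag clauses with Boolean AND/OR into one query string."""
    wc_clause = " OR ".join(f'"{c}"' for c in categories)
    return (
        f'TS=("{topic}") AND PY=({years[0]}-{years[1]}) '
        f'AND LA=(English) AND DT=(Article) AND WC=({wc_clause})'
    )

query = build_wos_query("bibliometric", (2019, 2021), CATEGORIES)
print(query)
```

Reporting the fully assembled string (rather than only the keyword) is what item 7 of the PRIBA asks for, since it makes the retrieval reproducible.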
Furthermore, only publishers that had published at least 10 articles between 2019 and 2021 were considered. The full records of the resulting articles were exported from WoS as a plain text file and imported into bibliometrix 3.0 (Naples, Italy) [23]. Biblioshiny, a web interface for bibliometrix, was used to rank articles by mean citation counts per year. The full texts of the top 100 articles with the highest normalized citation counts per year were downloaded [24]. Their reporting practices were then evaluated against the reporting guideline proposed in this study, the “Preferred Reporting Items for Bibliometric Analysis (PRIBA)”.
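The ranking step can be sketched in a few lines. This minimal Python example assumes a simple per-year normalization (total citations divided by the number of years since publication, inclusive of the publication year); the exact normalization applied by biblioshiny may differ, and the article data are hypothetical.

```python
# Ranking articles by citations normalized per year: a hedged sketch of
# the selection criterion described above, using hypothetical records.

SEARCH_YEAR = 2022  # the year the search was conducted

articles = [
    {"title": "A", "year": 2019, "citations": 90},
    {"title": "B", "year": 2021, "citations": 40},
    {"title": "C", "year": 2020, "citations": 50},
]

def citations_per_year(article: dict, current_year: int = SEARCH_YEAR) -> float:
    """Total citations divided by years since publication (inclusive)."""
    age = max(current_year - article["year"] + 1, 1)
    return article["citations"] / age

top = sorted(articles, key=citations_per_year, reverse=True)
# A: 90/4 = 22.5, B: 40/2 = 20.0, C: 50/3 ≈ 16.7
print([a["title"] for a in top])  # ['A', 'B', 'C']
```

Normalizing by publication age prevents the ranking from being dominated by older papers that have simply had more time to accumulate citations.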
2.2. Proposed guidelines for bibliometric studies
Table 1 presents the PRIBA, which includes 25 items proposed for use when reporting bibliometric research. The guidelines consist of seven main sections identical to those in the PRISMA 2020 Checklist, namely title, abstract, introduction, methods, results, discussion, and other information. The individual items were modified and expanded from those in the PRISMA 2020 Checklist to fit bibliometric studies.
Table 1.
The proposed items in the PRIBA to be used for reporting bibliometric studies, adapted from the PRISMA Checklist (http://prismastatement.org/PRISMAStatement/Checklist.aspx).
| Section and topic | Item # | Proposed item to be used in bibliometric research |
|---|---|---|
| Title | | |
| Title | 1a | Identify the study as a bibliometric analysis. |
| | 1b | Indicate the coverage period. |
| Abstract | | |
| Abstract | 2a | Provide an explicit statement of objective(s). |
| | 2b | Specify the data sources. |
| | 2c | Specify the coverage period. |
| | 2d | Provide results for main outcomes. |
| | 2e | Provide an overall interpretation of the results and implications. |
| Introduction | | |
| Rationale | 3 | Describe the rationale for the study. |
| Objectives | 4 | Provide an explicit statement of objective(s). |
| Methods | | |
| Data source | 5a | Specify the database(s) or other data sources searched. |
| | 5b | Describe the characteristics of the data source. |
| | 5c | Specify the date when the search was conducted. |
| Eligibility criteria | 6 | Specify the inclusion and exclusion criteria, such as language, article types, and coverage period. |
| Search strategy | 7 | Specify the search strategy and keywords used. |
| Bibliometric indicators | 8 | Describe the bibliometric indicators used. |
| Analytical software | 9 | Specify the software package(s) used and the settings selected. |
| Results | | |
| Search process | 10 | Describe the results of the search and selection processes, and use a flow diagram if necessary. |
| Bibliometric indicators | 11 | Describe the results of bibliometric indicators, including quantity, performance, and structural indicators. |
| Figures | 12 | Prepare figures with an adequate resolution for online and print readability. |
| Discussion | | |
| Conclusions | 13 | Summarize key results with reference to study objective(s). |
| Limitations | 14 | Discuss any limitations and impact of potential bias. |
| Interpretation | 15 | Interpret the results in the context of background knowledge. |
| Other information | | |
| Support | 16 | Describe the sources of financial or non-financial support and the role of funders or sponsors. |
| Conflicts of interest | 17 | Declare any competing interests of the author(s). |
| Data availability | 18 | Specify if data are publicly available and the route of access. |
3. Results
3.1. Search strategy
Fig. 1 shows the study flowchart. A total of 9476 publications were initially identified in the WoS Core Collection online database using the keyword “bibliometric”. Of these, 47% were published over the three years from 2019 to 2021. After applying further restrictions on language and article type, 2681 articles were retained. The selection was further refined to include only articles related to health and medicine, in categories with at least 10 articles published between 2019 and 2021, resulting in 974 articles. These articles were then filtered to retain only journal publishers with at least 10 articles from 2019 to 2021, resulting in 819 articles. The top 100 articles, based on normalized citation counts [25], were selected for further analysis (Supplementary Table S1). These 100 articles were published across 61 different journals, with Clarivate Analytics Journal Citation Reports impact factors ranging from 0.69 to 79.32 (weighted median 3.39). The International Journal of Environmental Research and Public Health published the most papers (16 articles), followed by the Journal of Medical Internet Research (5 articles).
Fig. 1.
Study flowchart.
3.2. Reporting characteristics: title, abstract, and introduction
Fig. 2 shows the relative percentage of the 25 proposed items outlined in Table 1 among the 100 articles analyzed. Regarding item 1a, 73% of the articles explicitly identified themselves as a bibliometric analysis in the title. In the present study, the term “scientometric analysis” was regarded as synonymous with bibliometric analysis. However, “citation analysis” and “altmetric analysis” were not, because “altmetrics” refers to a distinct subtype of bibliometrics [26,27] in which the unit of analysis shifts from scholarly journals to social activities and interactions on blogging platforms and web-based reference managers, such as Mendeley or CiteULike. For item 1b, only 24% of the articles included the coverage period in the title.
Fig. 2.
Distribution of the 25 proposed items in the top 100 cited health and medicine-related bibliometric studies from 2019 to 2021.
For the five items of the abstract, all articles provided an explicit statement of the objective (item 2a), and almost all articles provided results for the main outcomes (item 2d) and an overall interpretation of the results (item 2e). However, only 88% specified the data sources (item 2b) and 69% specified the coverage period (item 2c). In terms of the introduction section, all articles in the sample provided a description of the rationale of the study (item 3) and an explicit statement of the objective (item 4).
3.3. Reporting characteristics: methods
For the seven items in the methods section, it was found that all articles specified the search strategy and keywords used (item 7), and most articles (99%) clearly specified the main source of data (item 5a). In addition, the majority of articles (94%) provided descriptions of the bibliometric indicators employed (item 8) as well as the software packages used (item 9). However, only a quarter (24%) of the articles included a description of the database’s characteristics, such as the number of journals and records indexed at the time of the search (item 5b). Moreover, only 62% of the articles explicitly indicated the date when the search was conducted (item 5c), and 75% of them mentioned the inclusion and exclusion criteria, such as language, article types, and the search period covered (item 6).
3.4. Reporting characteristics: results
For the three items of the results section, all articles described the results of bibliometric indicators (item 11). However, only 60% of the articles provided a clear description of the results of the search and selection processes, either in text or via a flow diagram (item 10). A few articles described the sample size at various stages of the search process in the methods section rather than the results section. Of the 100 articles, five did not include any figures. Of the remaining 95 articles, 92% had figures with adequate resolution (item 12). However, the legibility of figure lettering was adversely affected when figures were resized into a single column in articles of a two-column format.
3.5. Reporting characteristics: discussion and other information
For the six items of the discussion and other information sections, almost all articles clearly summarized key results with reference to the study objective (item 13) and provided an interpretation of the results in the context of existing knowledge (item 15). However, not all articles mentioned the limitations of the study (86%) (item 14). Most articles provided a statement of competing interests (item 17), and 79% indicated whether the study had financial or non-financial support (item 16). However, as not all journals required a data availability statement, only 28% of the studies specified whether the data used were publicly available (item 18).
4. Discussion
This study aimed to examine the reporting practices of the top 100 bibliometric studies in the health and medicine field, from 2019 to 2021, based on normalized citation counts. Despite the recent rapid proliferation of studies based on bibliometric methodologies, no reporting guideline is available for this type of research. Therefore, this study proposed a new reporting guideline called the “Preferred Reporting Items for Bibliometric Analysis (PRIBA)”, which was adapted from the well-established PRISMA Checklist.
Overall, the results of the study highlighted the need for a reporting guideline for bibliometric research. Notably, more than 25 years have passed since the “Bibliometric Standards” workshop in 1995, which highlighted the importance of developing conceptual and technical standards for bibliometric research and technology [28]. A checklist for transparent reporting would enhance the reliability and interpretability of bibliometric research findings and contribute to the overall credibility and transparency of the scientific literature in this field.
Among the 25 proposed items, five were reported by all articles. Three of them related to the study objective (items 2a and 4) and rationale (item 3), one to the provision of the search strategy and keywords (item 7), and one to the description of the results of bibliometric indicators (item 11). This result is expected because these items are the essential core elements of all bibliometric studies. It should be noted that, as the number of bibliometric studies on similar topics continues to increase, a clear and compelling rationale for conducting a new bibliometric study must be provided.
In addition, 11 items were reported by at least 80% of the articles. Three of them were essential elements of the abstract (items 2b, 2d, and 2e). Three of them were essential elements of the methods section, which included the name of the database used (item 5a), the bibliometric indicators used (item 8), and the software package used for the analysis (item 9). Nevertheless, it is worth noting that while most articles mentioned the main database used, the sub-database was often not specified, which can impact the reliability and interpretability of the findings. For example, the WoS Core Collection consists of various online indexing databases, which specifically cover journals, books, or conference proceedings in different disciplines. Therefore, the choice of sub-database should be explicitly mentioned [29]. One item was the legibility of figures in the results section (item 12), three items were important elements of the discussion section (items 13, 14, and 15), and one item in the other information section was the declaration of competing interests (item 17). Most articles discussed and interpreted the findings in the context of background information with relevant supporting references (item 15).
There were articles with a discussion section that simply reiterated the results and did not provide insight into future research directions [24]. Given the proliferation of bibliometric studies, it is important for authors to check for newly published studies with similar or identical topics when writing the discussion section. If such studies exist, their findings should be compared and contrasted.
For the remaining nine items, the proportion of reporting ranged from 24% to 79%. First, in the title and the abstract, 73% of the articles clearly indicated that they were bibliometric studies, often in a subtitle (item 1a). However, only 24% of the articles included the coverage period in the title (item 1b), and only 69% mentioned the coverage period in the abstract (item 2c). As the number of studies based on bibliometric analysis is rapidly growing, it is advisable to include the coverage period in the title to facilitate assessing the relevance of future studies and meta-research, such as systematic reviews of bibliometric studies. Additionally, simply indicating the time span of the analysis, such as “20 years”, is not informative without a beginning or ending date.
Second, in the methods section, only 24% of the articles provided background information on the database used (item 5b). While major bibliographic databases, such as the WoS, Scopus, and PubMed, may be familiar to most readers, there are also established specialized bibliographic databases [30] and newer databases [31] that may not be as well known. In addition, the coverage of these databases may change over time, and it is essential to describe the coverage with a relevant source of information. The suitability of the chosen database for addressing the research objective should also be supported by relevant references, such as up-to-date comparisons between different databases [22,[32], [33], [34]]. Merely stating that a particular database is popular and comprehensive is insufficient.
Furthermore, only 62% of the articles specified the date when the search was conducted (item 5c). As databases are regularly updated, the retrieval date can affect the number of citations retrieved. Therefore, it is desirable to conduct the data retrieval within a short period of time, such as a single day.
As mentioned above, while all articles included the search strategy and keywords (item 7), only 75% clearly stated the inclusion and exclusion criteria, such as language, article types, and coverage period (item 6). As the validity and reliability of the results of a bibliometric analysis depend on the data collection methods, a systematic approach is essential to ensure reproducibility [35]. Both explicit search terms and search strings, which combine search terms with Boolean operators and relevant wildcard symbols, should be given. Moreover, the coverage period should be clearly indicated, and vague statements, such as “up to the date of search”, should be avoided. Contemporary evaluative bibliometrics has been criticized as inadequate for handling multiple authorship [36]. Therefore, the normalization method used for publications with two or more authors should be reported [37].
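One common normalization for multi-authored publications is fractional counting, in which each of a paper's n authors receives 1/n credit. A minimal sketch with hypothetical author data (fractional counting is only one of the normalization methods the text alludes to):

```python
# Fractional counting: each paper contributes 1/n credit to each of its
# n authors. The author lists below are hypothetical.
from collections import defaultdict

papers = [
    ["Alice", "Bob"],           # each author receives 1/2
    ["Alice", "Bob", "Carol"],  # each author receives 1/3
    ["Alice"],                  # sole author receives full credit
]

def fractional_counts(author_lists):
    """Return each author's fractional publication count."""
    credit = defaultdict(float)
    for authors in author_lists:
        share = 1 / len(authors)
        for author in authors:
            credit[author] += share
    return dict(credit)

print(fractional_counts(papers))
# Alice: 1/2 + 1/3 + 1 ≈ 1.83; Bob: 1/2 + 1/3 ≈ 0.83; Carol: 1/3 ≈ 0.33
```

Whichever scheme is used (full counting, fractional counting, or first-author counting), naming it explicitly in the methods section is what makes productivity rankings comparable across studies.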
Third, in the results section, 40% of the articles did not provide a clear description of the results of the search and selection processes (item 10). Many articles only presented the final sample size without mentioning how many articles were excluded and the reasons for their exclusion. A flow diagram may be used to outline the key elements of a bibliometric study, such as the database used, search terms, search string, inclusion and exclusion criteria, analytical and visualization software, and bibliometric indicators. Regarding the types of bibliometric indicators, bibliometric studies should typically provide the following types of results: (1) quantity indicators that measure the productivity of different research constituents, such as authors, institutions, and countries; (2) performance indicators that measure the quality of different research constituents, based on publication- and citation-related metrics; and (3) structural indicators that measure connections between publications, authors, institutions, or countries, based on citation analysis, co-citation analysis, co-authorship analysis, co-word analysis, and bibliographic coupling [13,38,39].
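To make the second indicator type concrete, the widely used h-index (a performance indicator combining publication and citation counts) can be computed as follows; the citation counts are hypothetical:

```python
# The h-index: the largest h such that at least h of a researcher's
# papers have at least h citations each.

def h_index(citations):
    """Return the h-index for a list of per-paper citation counts."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank  # at least `rank` papers have >= `rank` citations
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3]))  # 4: four papers with at least 4 citations each
print(h_index([25, 8, 5, 3, 3]))  # 3
```

Quantity indicators reduce to simple counts per constituent, while structural indicators require building a network (e.g., co-authorship edges) and are typically delegated to tools such as bibliometrix or VOSviewer.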
Fourth, in the other information section, 79% of the articles described the sources of financial or non-financial support (item 16), while only 28% included a data availability statement (item 18). These figures largely reflect the absence of journal requirements for such information. A similar situation was found in a study of systematic reviews of interventions, where only 29% of 300 systematic reviews with meta-analysis reported a data availability statement [40]. Currently, many journals do not provide a specific section for authors to disclose this type of information. However, this is expected to change as data sharing becomes increasingly recognized as an integral part of the scientific publishing process [41,42], facilitating verification, validation, reanalysis, or inclusion of the data in future meta-research.
This study has several limitations that need to be taken into consideration. First, the 100 articles analyzed were published during the coronavirus disease 2019 (COVID-19) pandemic, resulting in a higher proportion of COVID-19-related articles (17%). Therefore, caution is needed when generalizing the results to bibliometric studies published before the pandemic. Second, the three-year period analyzed might not be sufficient to accurately determine research influence. However, given the relatively recent popularity of bibliometric analysis, only the most recent articles were included. A similar issue exists in bibliometric studies of COVID-19. One option is to analyze a random sample of articles instead of relying on citation rankings. Further studies based on such an approach should be conducted after establishing guidelines for bibliometric studies. Third, the quality of the articles was not appraised. Therefore, satisfying all items outlined in our study should not be viewed as a guarantee of research quality. Instead, our checklist should be seen as a guide to help researchers write more complete and transparent bibliometric studies [43]. For research assessment, other guidelines, such as the Leiden Manifesto (http://www.leidenmanifesto.org/) [44] and the San Francisco Declaration on Research Assessment (DORA; https://sfdora.org/) [45], are available for evaluative purposes. Both of these guidelines highlight the importance of rigor in bibliometric methodology and provide a framework for ensuring that such methodologies are used appropriately. Fourth, the development of the proposed items in the PRIBA was not based on a structured approach as outlined by Moher et al. [46]. Nevertheless, this study provides a starting point for establishing reporting guidelines for bibliometric studies. More rigorous evaluation of its methodology, such as an expert-consensus process, should be conducted to refine the guidelines.
In conclusion, this study highlights the need to improve the reporting of bibliometric studies, based on an analysis of the top 100 studies by normalized citation counts between 2019 and 2021. The proposed PRIBA reporting guidelines identify several areas that require attention. It is hoped that this study will raise awareness of the need for standardized reporting guidelines for bibliometric studies, which can enhance the clarity and reproducibility of such studies. Journal editors and reviewers can use the PRIBA as a guide to the minimum set of items needed when evaluating a bibliometric study.
Author contribution statement
Malcolm Koo: Conceived and designed the experiments; Performed the experiments; Analyzed and interpreted the data; Contributed reagents, materials, analysis tools or data; Wrote the paper.
Shih-Chun Lin: Conceived and designed the experiments; Analyzed and interpreted the data.
Data availability statement
Data will be made available on request.
Declaration of competing interest
The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.
Footnotes
Supplementary data to this article can be found online at https://doi.org/10.1016/j.heliyon.2023.e16780.
References
- 1.Pritchard A. Statistical bibliography or bibliometrics. J. Doc. 1969;25:348–349. [Google Scholar]
- 2.Moral-Muñoz J.A., Herrera-Viedma E., Santisteban-Espejo A., Cobo M.J. Software tools for conducting bibliometric analysis in science: an up-to-date review. Prof. Inf. 2020;29 doi: 10.3145/epi.2020.ene.03. [DOI] [Google Scholar]
- 3.Martin-Martin A., Thelwall M., Orduna-Malea E., Delgado Lopez-Cozar E. Google Scholar, Microsoft Academic, Scopus, Dimensions, Web of Science, and OpenCitations' COCI: a multidisciplinary comparison of coverage via citations. Scientometrics. 2021;126(1):871–906. doi: 10.1007/s11192-020-03690-4. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 4.Birkle C., Pendlebury D.A., Schnell J., Adams J. Web of Science as a data source for research on scientific and scholarly activity. Quant. Sci. Stud. 2020;1(1):363–376. doi: 10.1162/qss_a_00018. [DOI] [Google Scholar]
- 5.Baas JS M., Plume A., Côté G., Karimi R. Scopus as a curated, high-quality bibliometric data source for academic research in quantitative science studies. Quant. Sci. Stud. 2020;1(1):377–386. doi: 10.1162/qss_a_00019. [DOI] [Google Scholar]
- 6.Herzog C., Hook D., Konkiel S. Dimensions: bringing down barriers between scientometricians and data. Quant. Sci. Stud. 2020;1(1):387–395. doi: 10.1162/qss_a_00020. [DOI] [Google Scholar]
- 7.Hood W.W., Wilson C.S. The literature of bibliometrics, scientometrics, and informetrics. Scientometrics. 2001;52(2):291–314. doi: 10.1023/a:1017919924342. [DOI] [Google Scholar]
- 8.Ellegaard O. The application of bibliometric analysis: disciplinary and user aspects. Scientometrics. 2018;116:181–202. doi: 10.1007/s11192-018-2765-z. [DOI] [Google Scholar]
- 9.Jonkers K., Derrick G.E. The bibliometric bandwagon: characteristics of bibliometric articles outside the field literature. J. Am. Soc. Inf. Sci. Technol. 2012;63(4):829–836. doi: 10.1002/asi.22620. [DOI] [Google Scholar]
- 10.Kokol P., Blazun Vosner H., Zavrsnik J. Application of bibliometrics in medicine: a historical bibliometrics analysis. Health Inf. Libr. J. 2021;38(2):125–138. doi: 10.1111/hir.12295. [DOI] [PubMed] [Google Scholar]
- 11.Romanelli J.P., Goncalves M.C.P., de Abreu Pestana L.F., Soares J.A.H., Boschi R.S., Andrade D.F. Four challenges when conducting bibliometric reviews and how to deal with them. Environ. Sci. Pollut. Res. Int. 2021;28(43):60448–60458. doi: 10.1007/s11356-021-16420-x. [DOI] [PubMed] [Google Scholar]
- 12.Wildgaard L., Schneider J.W., Larsen B. A review of the characteristics of 108 author-level bibliometric indicators. Scientometrics. 2014;101(1):125–158. doi: 10.1007/s11192-014-1423-3. [DOI] [Google Scholar]
- 13.Donthu D., Kumar S., Mukherjee D., Pandey N., Lim W.M. How to conduct a bibliometric analysis: an overview and guidelines. J. Bus. Res. 2021;133:285–296. doi: 10.1016/j.jbusres.2021.04.070. [DOI] [Google Scholar]
- 14.Enhancing the QUAlity and transparency of health research. The EQUATOR Network. https://www.equator-network.org/about-us/] Available online:
- 15.Simera I., Moher D., Hoey J., Schulz K.F., Altman D.G. A catalogue of reporting guidelines for health research. Eur. J. Clin. Invest. 2010;40(1):35–53. doi: 10.1111/j.1365-2362.2009.02234.x. [DOI] [PubMed] [Google Scholar]
- 16.González-Alcaide G. Bibliometric studies outside the information science and library science field: uncontainable or uncontrollable? Scientometrics. 2021;126:6837–6870. doi: 10.1007/s11192-021-04061-3. [DOI] [Google Scholar]
- 17.Larivière V. The decade of metrics? Examining the evolution of metrics within and outside LIS. Bull. Am. Soc. Inf. Sci. Technol. 2012;38(6):12–17. doi: 10.1002/bult.2012.1720380605. [DOI] [Google Scholar]
- 18.Glänzel W. The need for standards in bibliometric research and technology. Scientometrics. 1996;35:167–176. doi: 10.1007/BF02018475. [DOI] [Google Scholar]
- 19.Rousseau R. Lack of standardisation in informetric research. Comments on “Power laws of research output. Evidence for journals of economics” by Matthias Sutter and Martin G. Kocher. Scientometrics. 2002;55:317–327. doi: 10.1023/A:1019675909829. [DOI] [Google Scholar]
- 20.Wilkinson M.D., Dumontier M., Aalbersberg I.J., Appleton G., Axton M., Baak A., Blomberg N., Boiten J.W., da Silva Santos L.B., Bourne P.E., et al. The FAIR Guiding Principles for scientific data management and stewardship. Sci. Data. 2016;3:160018. doi: 10.1038/sdata.2016.18. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 21.Li K., Rollins J., Yan E. Web of Science use in published research and review papers 1997-2017: a selective, dynamic, cross-domain, content-based analysis. Scientometrics. 2018;115(1):1–20. doi: 10.1007/s11192-017-2622-5. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 22.Pranckutė R. Web of Science (WoS) and Scopus: the titans of bibliographic information in today's academic world. Publications. 2021;9(1):12. doi: 10.3390/publications9010012. [DOI] [Google Scholar]
- 23.Aria M., Cuccurullo C. bibliometrix: an R-tool for comprehensive science mapping analysis. J. Inf. 2017;11:959–975. doi: 10.1016/j.joi.2017.08.007. [DOI] [Google Scholar]
- 24.Szomszor M., Adams J., Fry R., Gebert C., Pendlebury D.A., Potter R.W.K., Rogers G. Interpreting bibliometric data. Front. Res. Metr. Anal. 2020;5:628703. doi: 10.3389/frma.2020.628703. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 25.Agarwal A., Durairajanayagam D., Tatagari S., Esteves S.C., Harlev A., Henkel R., Roychoudhury S., Homa S., Puchalt N.G., Ramasamy R., et al. Bibliometrics: tracking research impact by selecting the appropriate metrics. Asian J. Androl. 2016;18(2):296–309. doi: 10.4103/1008-682X.171582. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 26.Dardas L.A., Woodward A., Scott J., Xu H., Sawair F.A. Measuring the social impact of nursing research: an insight into altmetrics. J. Adv. Nurs. 2019;75(7):1394–1405. doi: 10.1111/jan.13921. [DOI] [PubMed] [Google Scholar]
- 27.Brigham T.J. An introduction to altmetrics. Med. Ref. Serv. Q. 2014;33(4):438–447. doi: 10.1080/02763869.2014.957093. [DOI] [PubMed] [Google Scholar]
- 28.Glänzel W., Katz S., Moed H., Schoepflin U. Proceedings of the workshop on “Bibliometric Standards”, Rosary College, River Forest, Illinois (USA). Scientometrics. 1996;35(2):165–166. [Google Scholar]
- 29.Liu W. The data source of this study is web of science core collection? Not enough. Scientometrics. 2019;121:1815–1824. doi: 10.1007/s11192-019-03238-1. [DOI] [Google Scholar]
- 30.Gasparyan A.Y., Yessirkepov M., Voronov A.A., Trukhachev V.I., Kostyukova E.I., Gerasimov A.N., Kitas G.D. Specialist bibliographic databases. J. Kor. Med. Sci. 2016;31(5):660–673. doi: 10.3346/jkms.2016.31.5.660. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 31.Thelwall M. Dimensions: a competitor to Scopus and the web of science? J. Inf. 2018;12(2):430–435. doi: 10.1016/j.joi.2018.03.006. [DOI] [Google Scholar]
- 32.Falagas M.E., Pitsouni E.I., Malietzis G.A., Pappas G. Comparison of PubMed, Scopus, Web of Science, and Google Scholar: strengths and weaknesses. Faseb. J. 2008;22(2):338–342. doi: 10.1096/fj.07-9492LSF. [DOI] [PubMed] [Google Scholar]
- 33.Kulkarni A.V., Aziz B., Shams I., Busse J.W. Comparisons of citations in Web of Science, Scopus, and Google Scholar for articles published in general medical journals. JAMA. 2009;302(10):1092–1096. doi: 10.1001/jama.2009.1307. [DOI] [PubMed] [Google Scholar]
- 34.De Groote S.L., Raszewski R. Coverage of Google Scholar, Scopus, and Web of Science: a case study of the h-index in nursing. Nurs. Outlook. 2012;60(6):391–400. doi: 10.1016/j.outlook.2012.04.007. [DOI] [PubMed] [Google Scholar]
- 35.Livoreil B., Glanville J., Haddaway N.R., Bayliss H., Bethel A., de Lachapelle F.F., Robalino S., Savilaakso S., Zhou W., Petrokofsky G., et al. Systematic searching for environmental evidence using multiple tools and sources. Environ. Evid. 2017;6(1):1–14. doi: 10.1186/s13750-017-0099-6. [DOI] [Google Scholar]
- 36.Põder E. What is wrong with the current evaluative bibliometrics? Front. Res. Metr. Anal. 2021;6:824518. doi: 10.3389/frma.2021.824518. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 37.Wallin J.A. Bibliometric methods: pitfalls and possibilities. Basic Clin. Pharmacol. Toxicol. 2005;97(5):261–275. doi: 10.1111/j.1742-7843.2005.pto_139.x. [DOI] [PubMed] [Google Scholar]
- 38.Durieux V., Gevenois P.A. Bibliometric indicators: quality measurements of scientific publication. Radiology. 2010;255(2):342–351. doi: 10.1148/radiol.09090626. [DOI] [PubMed] [Google Scholar]
- 39.Župič I., Čater T. Bibliometric methods in management and organization. Organ. Res. Methods. 2015;18(3):429–472. doi: 10.1177/1094428114562629. [DOI] [Google Scholar]
- 40.Page M.J., Nguyen P.Y., Hamilton D.G., Haddaway N.R., Kanukula R., Moher D., McKenzie J.E. Data and code availability statements in systematic reviews of interventions were often missing or inaccurate: a content analysis. J. Clin. Epidemiol. 2022;147:1–10. doi: 10.1016/j.jclinepi.2022.03.003. [DOI] [PubMed] [Google Scholar]
- 41.Peccoud J. Data sharing policies: share well and you shall be rewarded. Synth. Biol. (Oxf.) 2021;6(1):ysab028. doi: 10.1093/synbio/ysab028. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 42.Woods H.B., Pinfield S. Incentivising research data sharing: a scoping review. Wellcome Open Res. 2021;6:355. doi: 10.12688/wellcomeopenres.17286.2. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 43.Bennett C., Khangura S., Brehaut J.C., Graham I.D., Moher D., Potter B.K., Grimshaw J.M. Reporting guidelines for survey research: an analysis of published guidance and reporting practices. PLoS Med. 2011;8(8):e1001069. doi: 10.1371/journal.pmed.1001069. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 44.Hicks D., Wouters P., Waltman L., de Rijcke S., Rafols I. Bibliometrics: the Leiden Manifesto for research metrics. Nature. 2015;520(7548):429–431. doi: 10.1038/520429a. [DOI] [PubMed] [Google Scholar]
- 45.Hoppeler H. The San Francisco declaration on research assessment. J. Exp. Biol. 2013;216(Pt 12):2163–2164. doi: 10.1242/jeb.090779. [DOI] [PubMed] [Google Scholar]
- 46.Moher D., Schulz K.F., Simera I., Altman D.G. Guidance for developers of health research reporting guidelines. PLoS Med. 2010;7(2):e1000217. doi: 10.1371/journal.pmed.1000217. [DOI] [PMC free article] [PubMed] [Google Scholar]
Data Availability Statement
Data will be made available on request.


