PLOS One. 2021 Apr 8;16(4):e0249879. doi: 10.1371/journal.pone.0249879

Journal article publishing in the social sciences and humanities: A comparison of Web of Science coverage for five European countries

Michal Petr 1,*, Tim C E Engels 2, Emanuel Kulczycki 3, Marta Dušková 4, Raf Guns 2, Monika Sieberová 1, Gunnar Sivertsen 5
Editor: Lutz Bornmann
PMCID: PMC8031415  PMID: 33831115

Abstract

This study compares publication pattern dynamics in the social sciences and humanities in five European countries. Three are Central and Eastern European countries that share a similar cultural and political heritage (the Czech Republic, Slovakia, and Poland). The other two are Flanders (Belgium) and Norway, representing Western Europe and the Nordics, respectively. We analysed 449,409 publications from 2013–2016 and found that, despite persisting differences between the two groups of countries across all disciplines, publication patterns in the Central and Eastern European countries are becoming more similar to those in their Western and Nordic counterparts. Articles from the Central and Eastern European countries are increasingly published in journals indexed in Web of Science and also in journals with the highest citation impacts. There are, however, clear differences between social science and humanities disciplines, which need to be considered in research evaluation and science policy.

Introduction

Since roughly the 1980s, bibliometric methods have become increasingly anchored in science policy and research evaluation. In this context, bibliometric research refocused from supporting sociological theories to research evaluation topics: science mapping, systemic effects of evaluation procedures, developing new indicators, or studying publication patterns [1]. The common attribute of recent bibliometric scholarship is the use of publication metadata as the primary data source. The major international bibliographic database Web of Science (WoS) has been used in research on publishing and citation characteristics and on changes in publication patterns in the social sciences and humanities (SSH), as summarized by Nederhof [2] and recently by Franssen and Wouters [1]. In 2006, Archambault et al. [3] discussed the limits of WoS for any comparative analysis of SSH output due to the language-related bias of the database. As documented by Franssen and Wouters [1], the recent establishment of comprehensive national databases has opened a new direction in bibliometric studies of the SSH, allowing for a more complete picture of publication practices by covering more publication types, sources, and locally relevant research than the highly selective international databases WoS and Scopus do [4]. However, national databases are relatively new, and their differing scope and comprehensiveness limit the research that can be based on them.

In this descriptive study, drawing on the latest bibliometric research approaches and on results that use national databases as a main data source, we aim to expand knowledge about publication patterns in SSH journal publishing, more specifically about an aspect that national evaluation systems in several countries treat as a sign of good performance: the ranking of journals. We focus on Central and Eastern European (CEE) countries, which comprise a still-neglected region in terms of bibliometrics. Although some research encompassing several CEE countries has been conducted, the overall situation regarding SSH output in this region has not been fully grasped. A recent comparative study of eight European countries [5] documented changes and differences in publication patterns in terms of the proportions of publication types and publication languages across European countries and disciplines. The authors of the study argued that these differences are often due to the countries’ cultural and historical backgrounds. Similarly, Kozak et al. [6] concluded that in terms of international collaboration, number of articles, and citation impact, publication practices and the intensity of change therein differ between individual Eastern European post-communist states, noting that the number of articles had increased the most in the Czech Republic and Poland.

The evaluation systems in the Czech Republic [7] and Poland [8] favour journal articles indexed in Web of Science (WoS) and Scopus over other journal articles, books, and other types of outputs. However, SSH researchers in the Czech Republic still commonly believe that publishing in WoS journals can at times be too challenging; whether real or perceived, the reasons they give include research limited to topics of local relevance, language barriers, and a lack of journals in the researchers’ fields [9,10]. Some researchers in other CEE countries hold similar opinions [11,12]. The local versus international dilemma obviously also applies to journals’ publishing strategies [13,14].

Jurajda et al. [15] analysed the performance of SSH disciplines in post-communist countries through journal-level Article Influence Scores and concluded that performance in these countries lags behind performance in the West. A few years earlier, Vanecek [16] reached a similar conclusion based on his finding that there had been no change in the quality of publications by Czech authors based on the average impact factor (IF) of journals published in all disciplines. Several questions then emerge: are journal-level indicators used in national evaluation systems suitable for studying the developments in SSH publication output or even for evaluating these disciplines at the national level? Furthermore, is the mindset among CEE researchers still defensible when comparing publication patterns in SSH disciplines with patterns in other countries?

In this study, we investigate whether the changes and differences in publication patterns between CEE countries also affect the coverage of SSH articles in WoS and, within WoS, the proportion of articles in journals ranked Q1 and Q2 by IF within their subject categories.

We pose the following two research questions:

  • RQ1: Have SSH publication patterns in CEE countries changed in favour of WoS-indexed articles?

  • RQ2: Have publication patterns changed in favour of articles published in journals ranked Q1 and Q2 in the Journal Citation Reports (JCR) based on IF?

Alongside each research question, we also ask whether there are differences between SSH disciplines. Several previous studies have investigated SSH coverage in major international databases [17–20], but these studies examined only Nordic or Western countries. Here, we will focus on the Czech Republic, Slovakia, and Poland, countries representing the CEE region, because of their similar cultural and political heritage. We will compare them with Flanders (Belgium) and Norway, representing Western and Nordic countries, respectively. The previously mentioned studies by Kozak et al. [6] and Jurajda et al. [15] used only WoS as a data source. Here, we will go beyond WoS coverage and analyse the publication output reported in national databases for all SSH disciplines. Furthermore, we will investigate the share of journal articles in total peer-reviewed publications registered in national bibliographic databases to obtain a broader picture of publishing practices.

Full coverage of scholarly publication channels in WoS is virtually impossible in SSH disciplines. Publication patterns, channels, and types are much more heterogeneous in SSH disciplines than in science, technology, and medicine (STM). Even though Clarivate Analytics (formerly Thomson Reuters) began indexing more journals published in national languages in 2006, its coverage of SSH publications is still limited [4,18,21]. The Emerging Sources Citation Index (ESCI) was launched in 2015 as a relatively new part of the WoS Core Collection, with backfiles dating to 2005; as of October 2018, it covered 7,743 journals. The ESCI supplies the Core Collection with SSH journals in local languages and journals of regional scope. The selection process is similar to that for journals indexed in the JCR and in the Arts & Humanities Citation Index (AHCI). Like AHCI-indexed journals, ESCI journals do not receive an IF, but they are included in our coverage analysis.

In this study, we consider IF to be an indication of a journal’s citation impact. Following this rationale, we take a journal’s quartile ranking in the JCR to be an indication of the journal’s quality requirements, selectivity, and scholarly impact; journals with higher rankings (Q1 and Q2) are considered more demanding. In this paper, we do not discuss whether IF is an appropriate indicator for evaluating research; we use it only at the journal level, for which it was designed. At the same time, we do not claim that WoS-indexed journals are better than non-WoS-indexed journals. Our selection of indicators reflects the fact that all five countries in our study make use of bibliometric indicators in their national funding and evaluation systems. This common trait represents just one part of the greater diversity of evaluation systems across Europe [22]. In Poland, journals with an IF are given more weight in the Polish Journal Ranking, which is a key element in the national performance-based research funding system (PRFS) [23]. In the past, the Czech PRFS used a formula comprising IF-based journal rankings to calculate publication points. Since 2017, however, calculating the number and proportion of articles in each quartile of journal rankings based on the Article Influence Score (AIS) has been one of five modules used in the Czech Republic’s PRFS for assessing publication performance in each research field [24]. A similar system is applied in Slovakia [25]. In addition, we direct the reader to dedicated comparisons of performance-based research funding systems in the EU [26,27].

Although coverage of SSH publications is still limited, which creates unnecessary tensions between SSH disciplines with different degrees of coverage in the databases, the presence of publications in Scopus or WoS has increasingly become an SSH research evaluation criterion [28]. The use of journal-based indicators or journal rankings at the national level can influence the publication behaviour of individuals and institutions [13,29–31]. Vanholsbeeck et al. [32] have demonstrated that even if researchers are critical of how research quality is defined in such evaluation and funding systems, these systems still influence their publishing priorities and research interests. A previous study focused on Flanders and Norway revealed that the design of evaluation and funding systems affects the intensity of focus on WoS coverage and WoS-based indicators [18]. The Czech Republic, Poland, Slovakia, Flanders, and Norway all have a strong focus on WoS coverage, and we therefore theorize that this focus could influence national-level publication patterns in these countries.

Data and methods

In this study we use data about peer-reviewed publications from 2013 to 2016 that we collected from national databases in the Czech Republic (RIV), Poland (PBN), Slovakia (CREPČ), Flanders (VABB-SHW), and Norway (NSI). The authors of previous studies have described the need to use national databases as essential data sources for SSH-related analyses, as well as the data collection methodologies, definitions of publication types, and inclusion criteria in the different countries [5,33,34].

We should note a few important points before proceeding. In the case of Flanders, the VABB-SHW database covers only the Flemish region, not the whole country of Belgium. As for Slovakia, we work with only a 3-year window because data collection started in 2014 (Table 1). For the purposes of this study, we use only data classified in three publication types: a) journal articles, b) monographs and edited books, and c) book chapters and conference proceedings. Details about each country’s database and publication types were published in an earlier report [33]. Because national databases evolved in different historical and political settings and are unique in terms of scope, coverage, structure of publication types, and data availability, we must consider the context of the data sources when interpreting the results. Differences in the comprehensiveness, data structure, scope, and covered period of the national databases limit the extent to which the data can be reliably compared. Since we endeavour to study all SSH disciplines, we have adopted the limited timeframe of 2013–2016 for which data are available across all databases, with the exception of the Slovak data (2014–2016).

Table 1. The total volume of all types of SSH publications in national databases.

Discipline CZE (RIV) SLO (CREPČ) POL (PBN) NOR (NSI) FLA (VABB-SHW)
Psychology 2,099 n/a 7,539 2,625 2,675
Economics and business 6,770 13,973 73,119 4,105 2,864
Educational sciences 8,260 7,004 19,285 4,037 1,186
Sociology 3,345 n/a 12,807 2,155 2,057
Law 6,617 5,659 36,479 1,820 4,987
Political science 11,389 n/a 14,868 2,198 1,389
Social and economic geography 518 n/a n/a 1,754 1,026
Media and communication 917 n/a 3,604 875 770
Other social sciences 1,262 5,737 7,960 2,808 411
History and archaeology 10,786 2,137 26,190 2,120 1,647
Languages and literature 7,690 n/a 47,258 3,175 3,345
Philosophy, ethics, and religion 3,981 n/a 17,381 2,309 2,219
Arts 5,662 479 5,449 1,090 1,108
Other humanities 16 11,306 5,099 734 390
All 69,312 46,295 277,038 31,805 15,811
Period 2013–2016 2014–2016 2013–2016 2013–2016 2013–2016

CZE Czech Republic, SLO Slovakia, POL Poland, NOR Norway, FLA Flanders.

Disciplines are ordered as they appear in the Frascati classification scheme.

Publications are counted using full counting, with the same weight given to all publication types and affiliated countries. Our classification of publications was adopted from the categories included in the OECD’s Frascati Field of Research and Development (FORD) classification scheme [35]. Unfortunately, some disciplines in the Slovak national database, CREPČ, are defined too broadly and could therefore be classified only as “Other social sciences” or “Other humanities”.
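For illustration, the aggregation of the FORD disciplines listed in Table 1 into the two groups analysed in the Results can be expressed as a short sketch. This is an illustrative Python snippet, not code from our processing pipeline; the function name and the plain-text discipline labels are assumptions made for readability.

```python
# Illustrative grouping of Frascati (FORD) disciplines into the two aggregated
# SSH groups used in the Results; labels follow Table 1. Full counting means
# each publication record contributes once, with equal weight, regardless of
# publication type or the number of affiliated countries.

SOCIAL_SCIENCES = {
    "Psychology", "Economics and business", "Educational sciences", "Sociology",
    "Law", "Political science", "Social and economic geography",
    "Media and communication", "Other social sciences",
}
HUMANITIES = {
    "History and archaeology", "Languages and literature",
    "Philosophy, ethics, and religion", "Arts", "Other humanities",
}

def aggregate_group(ford_discipline: str) -> str:
    """Map a FORD discipline label to one of the two aggregated SSH groups."""
    if ford_discipline in SOCIAL_SCIENCES:
        return "social sciences"
    if ford_discipline in HUMANITIES:
        return "humanities"
    raise ValueError(f"Not an SSH discipline: {ford_discipline}")
```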

We aggregated the data on two levels:

  • Level 1: all peer-reviewed publications. To analyse the article share, we used the total counts of all peer-reviewed journal articles, monographs and edited books, and book chapters and conference proceedings collected from national databases, which totalled 449,409 publications.

  • Level 2: journal articles. To analyse coverage and to link articles with information about journal indexation, we used the full list of journal articles, including basic bibliographic information, FORD classification [29], and digital identifiers (UT WoS, DOI), from every country under analysis. We examined a total of 194,876 articles. For this study, Clarivate Analytics provided us with lists of journals covered by WoS in 2013–2016 that incorporated information about journals’ inclusion in WoS citation indexes (SCI-E, SSCI, AHCI, and ESCI) for every year in the studied period. In some cases, these data indicated that a journal was associated with more than one citation index. Because we study coverage and “quality signs” from the perspective of national evaluation system methodologies, we decided to assign just one index to each journal, using the following decision-making strategy (a simplified sketch follows below this list). We prioritised searching in the SCI-E and SSCI (i.e., journals indexed in the JCR); if a journal was not in the JCR, we searched for it first in the AHCI and then in the ESCI. We assigned a journal to the AHCI if it was present in both the AHCI and the SCI-E/SSCI but did not have an IF. If a journal with an IF was present in both the SCI-E/SSCI and the AHCI, we did not duplicate its publications but counted them once as present in the JCR (having an IF). As a second step, we assigned a quartile ranking (Q1–Q4) to journals indexed in the JCR. For this purpose, we worked with the quartile ranking valid in the year a given article was published. For journals assigned multiple subject categories, we used the best-performing quartile ranking. Finally, we matched the journal indexation information with the list of articles containing each journal’s ISSN, e-ISSN, title, and other article-level identifiers (DOI, UT WoS). This procedure allowed us to determine which journals mentioned in the list of articles from national databases were indexed in the year the given articles were published. We considered all articles published in a given year as covered if the journal was present in any WoS citation index in that year. A total of 3.6% of all WoS-indexed articles from national databases were assigned to the Book Citation Indexes (BKCI-S, BKCI-SSH), the Conference Proceedings Citation Indexes (CPCI-S, CPCI-SSH), or the category “n/a”, which stands for WoS-indexed articles in journals indexed in the SCI-E or SSCI but without an IF assigned in a given year. For some purposes (e.g. Figs 5 and 6), we merged these articles into the category “other” (BKCI-S, BKCI-SSH, CPCI-S, CPCI-SSH, “n/a”).
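The decision-making strategy above can be summarised in a minimal sketch, assuming hypothetical per-journal, per-year flags for membership in each citation index; it mirrors the priority order described in the text rather than reproducing our actual matching code.

```python
# A minimal sketch of the single-index assignment and quartile selection
# described above. Field names are hypothetical; "quartiles" holds the JCR
# quartiles of all subject categories assigned to the journal in that year.

from dataclasses import dataclass, field
from typing import Optional

@dataclass
class JournalYear:
    in_scie_ssci: bool = False   # indexed in SCI-E or SSCI in that year
    in_ahci: bool = False        # indexed in AHCI in that year
    in_esci: bool = False        # indexed in ESCI in that year
    has_if: bool = False         # an Impact Factor was assigned in that year
    quartiles: list = field(default_factory=list)  # e.g. ["Q2", "Q1"]

def assign_index(j: JournalYear) -> str:
    """Assign exactly one citation index per journal and year."""
    if j.in_scie_ssci and j.has_if:
        return "JCR"      # SCI-E/SSCI journals with an IF count once, as JCR
    if j.in_ahci:
        return "AHCI"     # includes journals in both AHCI and SCI-E/SSCI without an IF
    if j.in_esci:
        return "ESCI"
    if j.in_scie_ssci:
        return "n/a"      # SCI-E/SSCI but no IF assigned in that year
    return "other"        # e.g. records assigned to book or proceedings indexes

def best_quartile(j: JournalYear) -> Optional[str]:
    """For JCR journals with several subject categories, keep the best quartile."""
    if assign_index(j) != "JCR" or not j.quartiles:
        return None
    return min(j.quartiles)  # "Q1" < "Q2" < "Q3" < "Q4" lexicographically
```

In this sketch, a journal present in both the SCI-E/SSCI and the AHCI is counted once: as JCR if it has an IF, otherwise as AHCI.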

Fig 5. Proportion of articles in WoS by citation indexes–social sciences.

Fig 6. Proportion of articles in WoS by citation indexes–humanities.

We conducted the following analyses to answer our research questions. First, we determined the article share. By article share we mean the percentage of peer-reviewed journal articles out of the total number of all peer-reviewed SSH publications (articles, books, book chapters, proceedings) registered in national databases. This variable relates only to patterns in the proportion of publication types, regardless of indexation in international databases. Second, we determined the percentage of SSH journal articles indexed in WoS out of all articles registered in national databases (coverage). Third, we calculated the proportions of SSH journal articles in the WoS citation indexes assigned to the source journals. For those included in the JCR, we further distinguished four quartiles derived from IF-based subject category rankings. In many countries, it is customary to refer to quartiles in evaluation-related procedures, and quartiles are perceived as a sign of quality. Therefore, in this paper we seek to demonstrate, on the one hand, whether coverage is changing and, on the other, whether the perceived quality of the journals in which articles are published is changing. Thus, we focus on calculating the percentage of articles in Q1 and Q2 journals. We assume that in SSH fields the first two quartiles of the JCR ranking represent good performance and prestige [28]. In the Results section, we do not describe disciplines where the total number of WoS-indexed articles per country per year does not reach 50 articles, because coverage analysis and finer-grained differentiation of citation databases within WoS do not seem justified for such small units. In the Czech Republic, such “small” fields are Law, Social and economic geography, and Media and communication; in Poland, Social and economic geography, Media and communication, and Arts. In Slovakia, only the fields of Economics and business, Educational sciences, and History and archaeology reached the threshold of 50 articles, due to the limited classification scheme in place there. In Norway and Flanders, there were no fields that produced fewer than 50 articles per year.
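As a hedged sketch of these calculations, assuming a pandas DataFrame with hypothetical column names rather than our actual analysis code, the three indicators for one country, discipline, and year could be computed as follows.

```python
import pandas as pd

def indicators(pubs: pd.DataFrame) -> dict:
    """Compute article share, WoS coverage and Q1+Q2 share for one
    country/discipline/year.

    Assumed (hypothetical) columns: "pub_type", "wos_index" (None when the
    journal is not WoS-indexed in that year) and "quartile" (None outside JCR).
    """
    articles = pubs[pubs["pub_type"] == "journal article"]
    wos_articles = articles[articles["wos_index"].notna()]
    q1q2 = wos_articles[wos_articles["quartile"].isin(["Q1", "Q2"])]
    return {
        # article share: journal articles / all peer-reviewed publications
        "article_share": len(articles) / len(pubs) if len(pubs) else None,
        # coverage: WoS-indexed articles / all journal articles
        "wos_coverage": len(wos_articles) / len(articles) if len(articles) else None,
        # Q1+Q2 share among WoS-indexed articles, reported only when the unit
        # reaches the 50-article threshold described above
        "q1q2_share": (len(q1q2) / len(wos_articles)
                       if len(wos_articles) >= 50 else None),
    }
```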

Results

We present the results for the two aggregated groups of SSH disciplines in the following forms: the overall article share in the total number of peer-reviewed publications, the coverage of articles in WoS, and their distribution across citation indexes in the WoS Core Collection.

The journal article share in total publications

Out of the five studied countries, Flanders and Norway generally have the highest article share in overall peer-reviewed publications in both the social sciences and the humanities. In the social sciences, the article share in CEE countries does not exceed 50%, while in Flanders and Norway the share is roughly 60–70% (Fig 1). The journal article share is on the rise in the Czech Republic and Norway and is relatively stable in Slovakia. In Poland and Flanders, the journal article share decreased; remarkably, the decline in Poland followed a significant gain in 2013 [5]. In the humanities (Fig 2), the overall journal article share is lower than in the social sciences, and the overall differences between countries seem less pronounced. The article share in these fields is mostly stable or slightly increasing. Only in Poland, in nearly all disciplines, is the number of articles as well as the article share stable or on the decline. At the level of individual disciplines, fluctuations seem somewhat greater in the humanities than in the social sciences (S1 and S2 Tables). Overall, differences are more apparent between disciplines than between countries, while year-to-year fluctuations are clearly visible. This analysis serves as important context for analysing coverage and representation in citation indexes.

Fig 1. Article share–social sciences.

Fig 2. Article share–humanities.

Coverage of journal articles in WoS

Figs 3 and 4 depict developments in the coverage of SSH articles in WoS in 2013–2016. In CEE countries there was a clear increase in the percentage of WoS-covered social sciences articles in 2013–2016 (in the Czech Republic from 17.5% to 26.0%, in Slovakia from 10.4% to 17.4%, and in Poland from 6.9% to 10.5%; see Fig 3), although absolute values are much lower than in Norway and Flanders, where the proportion of articles in WoS-covered journals is fairly stable at around 65% in Norway and fluctuates around 60% in Flanders. In the humanities (Fig 4), coverage is generally lower, and changes are occurring more swiftly than in the social sciences. Fluctuations are more apparent in the humanities than in the social sciences (S3 and S4 Tables).

Fig 3. Coverage of journal articles in WoS–social sciences.

Fig 4. Coverage of journal articles in WoS–humanities.

The distribution of journal articles in WoS citation indexes

To reveal the extent to which SSH scholars in each country publish in journals in each citation index, Figs 5 and 6 show the overall proportion of articles in each citation index in each country across all years of the analysis, divided into a) JCR-indexed journals (with IF)–Q1+Q2; b) JCR-indexed journals (with IF)–Q3+Q4; c) AHCI; d) ESCI; and e) other (BKCI-S, BKCI-SSH, CPCI-S, CPCI-SSH, n/a). Further, Figs 7 and 8 show the percentage of articles published in Q1 and Q2 journals in WoS. In addition, S1–S10 Figs show the distribution of journal articles in citation indexes by country, and S5 Table contains data for each discipline.

Fig 7. The percentage of articles in Q1+Q2 journals in WoS–social sciences.

Fig 8. The percentage of articles in Q1+Q2 journals in WoS–humanities.

Our analysis revealed that disciplines across the social sciences have similar characteristics in each country. Fig 5 shows that in the Czech Republic the percentage of social science articles published in Q1 and Q2 journals grew overall (24.2% in 2016), as did the percentage of articles in ESCI-indexed journals, whereas the percentage of articles in Q3 and Q4 journals decreased (Table 2). A similar trend can be observed in the social sciences in Slovakia, Poland, and Norway (S5 Table). The data from Flanders indicate a rather stable trend throughout the observed period. The proportion of articles in journals with an IF overall and, within this subset, in Q1 and Q2 journals is nonetheless much higher in Flanders and Norway than in the CEE countries. A substantial increase in the percentage of articles published in ESCI journals was recorded in Slovakia (from 29.9% in 2014 to 50.3% in 2016) (Table 2); an increase was also apparent in the Czech Republic and Norway (S1–S5 Figs).

Table 2. Distribution of articles in WoS citation indexes–social sciences.

year JCR Q1+Q2 JCR Q3+Q4 AHCI ESCI Other
CZE # % # % # % # % # %
2013 150 19.6% 335 43.8% 5 0.7% 222 29.0% 53 6.9%
2014 233 27.7% 348 41.4% 7 0.8% 204 24.3% 48 5.7%
2015 278 26.0% 395 37.0% 8 0.7% 344 32.2% 43 4.0%
2016 276 24.1% 394 34.5% 14 1.2% 425 37.2% 34 3.0%
SLO
2013 n/a n/a n/a n/a n/a n/a n/a n/a n/a n/a
2014 53 18.4% 121 42.0% 7 2.4% 86 29.9% 21 7.3%
2015 66 20.6% 118 36.9% 5 1.6% 123 38.4% 8 2.5%
2016 100 20.0% 139 27.7% 6 1.2% 252 50.3% 4 0.8%
POL
2013 331 24.5% 476 35.2% 13 1.0% 469 34.7% 64 4.7%
2014 416 25.6% 445 27.3% 14 0.9% 641 39.4% 112 6.9%
2015 542 26.4% 568 27.6% 16 0.8% 795 38.7% 135 6.6%
2016 598 30.6% 549 28.1% 22 1.1% 696 35.6% 88 4.5%
NOR
2013 996 48.1% 508 24.5% 4 0.2% 461 22.3% 101 4.9%
2014 1,081 49.0% 546 24.8% 0 0.0% 488 22.1% 91 4.1%
2015 1,143 48.3% 509 21.5% 0 0.0% 569 24.0% 147 6.2%
2016 1,272 49.1% 534 20.6% 5 0.2% 674 26.0% 107 4.1%
FLA
2013 886 58.7% 359 23.8% 5 0.3% 215 14.2% 45 3.0%
2014 942 58.0% 377 23.2% 8 0.5% 283 17.4% 14 0.9%
2015 955 61.5% 311 20.0% 8 0.5% 273 17.6% 7 0.5%
2016 1,069 61.0% 405 23.1% 11 0.6% 259 14.8% 9 0.5%

CZE Czech Republic, SLO Slovakia, POL Poland, NOR Norway, FLA Flanders.

Humanities articles show a different distribution pattern across citation indexes than social sciences articles, and only moderate changes have occurred in these distribution patterns. Fig 6 depicts the percentage of articles in influential journals in each country. The distribution of articles published in all WoS-indexed journals for each country is presented in S6–S10 Figs. The percentage of articles published in journals with an IF increased in all countries (S6–S10 Figs) except Slovakia; Flanders also experienced a decrease in 2016. Fig 6 further illustrates the importance of the AHCI for the humanities. One must take into account that performance in the humanities is not measured only by publications in journals with an IF, particularly top-tier journals. Table 3 shows that changes in the proportion of articles in Q1 and Q2 journals in the humanities seem less important than the vital presence of articles in AHCI-indexed journals. The article share of AHCI-indexed journals, however, decreased to the benefit of ESCI-indexed (all countries except Poland) and JCR-indexed (all countries) journals, a shift most obvious in Flanders. Slovakia has the lowest share of journal articles with an IF (8.9% in 2016), whereas articles in other journals, mostly ESCI-indexed ones, dominate (about 91.1% in 2016, Table 3). Norway is rather stable, with some progress towards more articles in journals with an IF and more ESCI-indexed articles (S9 Fig).

Table 3. Distribution of articles in WoS citation indexes–humanities.

year JCR Q1+Q2 JCR Q3+Q4 AHCI ESCI Other
CZE # % # % # % # % # %
2013 39 8.4% 53 11.4% 225 48.4% 103 22.2% 45 9.7%
2014 54 10.2% 41 7.8% 273 51.6% 101 19.1% 60 11.3%
2015 69 13.0% 40 7.5% 236 44.5% 123 23.2% 62 11.7%
2016 72 13.0% 69 12.5% 225 40.7% 149 26.9% 38 6.9%
SLO
2013 n/a n/a n/a n/a n/a n/a n/a n/a n/a n/a
2014 8 4.5% 3 1.7% 81 46.3% 47 27.7% 34 19.8%
2015 10 5.3% 10 5.3% 72 38.8% 61 32.4% 33 18.1%
2016 8 3.4% 13 5.5% 72 33.5% 124 53.0% 11 4.7%
POL
2013 67 8.9% 65 8.6% 233 30.8% 361 47.7% 31 4.1%
2014 76 10.2% 90 12.1% 227 30.4% 327 43.8% 26 3.5%
2015 100 12.0% 83 9.9% 235 28.1% 405 48.4% 13 1.6%
2016 102 13.9% 96 13.1% 211 28.7% 297 40.5% 28 3.8%
NOR
2013 57 14.2% 63 15.7% 156 38.9% 79 19.7% 46 11.5%
2014 70 17.6% 67 16.9% 136 34.3% 88 22.2% 36 9.1%
2015 81 16.4% 72 14.5% 169 34.1% 109 22.0% 64 12.9%
2016 85 17.3% 88 17.9% 188 38.2% 122 24.8% 9 1.8%
FLA
2013 85 14.5% 96 16.3% 274 46.6% 88 15.0% 45 7.7%
2014 106 17.9% 95 16.0% 260 43.9% 97 16.4% 34 5.7%
2015 98 20.1% 71 14.5% 176 36.1% 118 24.2% 25 5.1%
2016 97 15.5% 118 18.8% 262 41.9% 147 23.5% 2 0.3%

CZE Czech Republic, SLO Slovakia, POL Poland, NOR Norway, FLA Flanders.

Regional journals in the Czech Republic, Slovakia, and Poland

Because coverage in WoS in 2013–2016 may have increased due to the addition of new regional journals [17], we analysed the proportion of regional journal articles in each CEE country in both the social sciences and the humanities. By regional we mean articles registered in a country’s national database that were published in journals from that same country. For Czech and Slovak journal articles we count both Czech and Slovak journals, due to the similarity of the two languages and the countries’ shared history. Fig 9 shows that although coverage in the Czech Republic increased in both the social sciences and the humanities, the share of articles in regional journals decreased, even though five journals (those with more than 10 articles in 2013–2016) were newly added to WoS during the observed period. A similar trend can be observed in Slovakia (Fig 10), while Poland is the only country where the article share in regional journals remains stable in the social sciences (Fig 11); in this case, there are eight newly indexed Polish journals within the 4.3% WoS-indexed article share. In the humanities, the article share in Polish journals decreased.
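The regional-journal rule used for Figs 9–11 can be expressed as a minimal sketch; the ISO-style country codes and field names are assumptions made for illustration.

```python
# A minimal sketch of the "regional journal" rule described above.

CZ_SK = {"CZE", "SVK"}  # Czech and Slovak journals are pooled for both countries

def is_regional(database_country: str, journal_country: str) -> bool:
    """True if an article registered in `database_country` appeared in a
    journal published in the same country (with CZ/SK pooled)."""
    if database_country in CZ_SK:
        return journal_country in CZ_SK
    return journal_country == database_country
```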

Fig 9. WoS coverage and regional journal article share in the Czech Republic.

Fig 10. WoS coverage and regional journal article share in Slovakia.

Fig 11. WoS coverage and regional journal article share in Poland.

The aggregated characteristics of countries and SSH disciplines

Our results show a gap between Flanders and Norway on the one hand and CEE countries on the other in all three publication variables we studied: article share in the total production of peer-reviewed publications, coverage of articles in WoS, and percentage of articles in Q1 and Q2 journals. In most SSH disciplines, Norway and especially Flanders display higher and relatively stable values for all indicators. Interestingly, in the CEE region, when we compare the Czech Republic and Poland, we discover opposite trends. In the Czech Republic the article share is stable or slightly growing, compared to Poland, where the production of articles in all SSH disciplines decreased in 2013–2016 after a significant increase in previous years. The Czech Republic also has a relatively higher article share in most SSH disciplines than other CEE countries, except in Psychology and Economics and business, where Poland dominates. The Czech Republic also has higher coverage and article representation in Q1 and Q2 journals than Poland, at least in fields where articles appear to be a considerable publication channel (Psychology, Economics and business). The situation in Poland is somewhat different: coverage is not rising (apart from Psychology) but remains rather stable or has slightly decreased, yet at the same time the percentage of articles in prestigious journals is increasing. Data from Slovakia were analysed less rigorously due to the limited comparability of the Slovak national database’s classification system and the reduced timeframe of available data. Fluctuations can be attributed to small publication numbers in some disciplines, not only in CEE countries but in some cases also in Flanders and Norway (Law, Media and communication, Arts).

In the field of Psychology, journal articles comprise a usual and stable publication channel (in Norway and Flanders the article share was 80–90%). Most of these articles were published in WoS-indexed journals (in Norway and Flanders coverage in WoS was 80–90%), and a large majority were published in prestigious journals with an IF (in Norway and Flanders 60–80% of articles were published in Q1+Q2 journals). In this regard, Psychology is rather unusual among SSH disciplines. Elsewhere, certainly in the Czech Republic, journal articles were not always the dominant publication type due to different publication patterns, but, as our data show, this situation is changing. In CEE countries, Psychology has the highest article share and coverage in WoS among all SSH disciplines (Czech Republic 46.7%, Poland 46.2% in 2016; S1 and S3 Tables). In these two countries, publishing trends in Psychology indicate greater coverage in WoS journals and a greater number of articles published in prestigious journals (Czech articles in Q1+Q2 journals rose from 34.1% in 2013 to 42.3% in 2016; Polish articles, from 35.9% to 50.5%). In Poland, Psychology is the only discipline where the coverage of journal articles in WoS exceeded 10% (46.2% in 2016).

The coverage of articles in Economics and business, Political science, and Media and communication shared similar traits in all five countries. In Norway and Flanders, WoS-indexed articles are a stable publication channel, and coverage there was approximately 20% higher than in the Czech Republic, where coverage was increasing. The percentage of WoS-indexed articles in Poland remained below 10%. In this group, Slovakia keeps records only for Economics and business, where the data indicate some growth in article share within the limited three-year window (S1 Table). The growing coverage of Czech articles in Economics and business journals (approx. 47% in 2016) was accompanied by a growing percentage of articles in Q1 and Q2 journals (approx. 26.7% in 2016) and a growing percentage of articles in ESCI journals (approx. 30% in 2016, S5 Table). Growing representation in the ESCI could also be seen in Poland and Slovakia, whereas Norway and Flanders published more in JCR journals (S5 Table). The graphs for Media and communication depict a similar profile; the number of WoS-covered articles, however, is too small to conduct further analyses.

In general, the article share in Educational sciences and Law was lower, as was WoS coverage. The countries included in this analysis share similar publication profiles in these disciplines. In both fields, the article share was significantly higher in Norway (Educational sciences 54.8%, Law 40% in 2016) and Flanders (Educational sciences 60.9%, Law 63.4% in 2016). In Slovakia and Poland, the article share was rather low (around 20–30% in both Educational sciences and Law). In the Czech Republic, the article share in Law was relatively higher (46.8% in 2016). Coverage was significantly higher in Norway (Educational sciences 46.4%, Law 25.6% in 2016) and Flanders (Educational sciences 71.4%, Law 15.2% in 2016) than in CEE countries, where coverage in Educational sciences fluctuated between 5 and 15% and coverage in Law remained consistently below 5%. Significantly, CEE countries produced more ESCI-indexed articles than Flanders and Norway. In the Czech Republic and Poland, the share of articles published in influential Q1 and Q2 journals also increased slightly (S5 Table).

Social and economic geography was well represented only in Norway and Flanders. Slovakia and Poland do not classify this discipline in their databases at all, and in the Czech Republic we found only a few articles.

In the humanities, there were more similarities across disciplines than in the social sciences. For example, data from the fields of History and archaeology, Languages and literature, and Arts demonstrate that publication patterns in the humanities seem to be firmly established. Coverage in WoS was modest in all CEE countries but was greater in Flanders and Norway. Coverage in CEE countries was around 5–15% and was quite stable in 2013–2016. In Norway, however, there was a dynamic change in coverage in the Arts, which jumped from 33.8% in 2013 to 51.6% in 2016. Fluctuations in article share were larger as well. While it is true that differences between countries in article share in the humanities are not as obvious as in the social sciences, the article share in History and archaeology in Flanders and in Arts in Norway was significantly higher than in other countries (around 70% and 60%, respectively).

Publishing patterns in Philosophy, ethics, and religion stand apart from the patterns described above, as in this field the article share was roughly the same across all countries. Also, the percentage of WoS-covered articles from the Czech Republic (34.2% in 2016) approached coverage in Norway (38.4% in 2016) and Flanders (45.5%, after a decrease in 2016).

Discussion

This study deepens understanding of current changes in WoS coverage, of the increase in articles published in influential SSH journals in three CEE countries, and of the general increase in the use of journal articles as a means of communicating SSH research results. Our objectives were to describe the current situation with regard to the coverage of SSH articles in WoS and to identify possible trends towards greater coverage in WoS and towards publishing more in prestigious journals (Q1 and Q2 in the ranking by IF) among those articles already covered in WoS.

Kulczycki et al. [5] concluded that publication patterns in 2011–2014 were rather stable in Western and Northern Europe and underwent significant changes in Central and Eastern Europe. Our current findings for 2013–2016 show that this conclusion applies not only to article share, but also to coverage in WoS and representation in citation indexes and, by extension, in prestigious journals based on IF. Publication patterns differed not only between the social sciences and humanities, but also between individual disciplines and between countries within the same discipline.

Some disciplines place more importance than others on journal articles as a publication channel and seek to increase the coverage of articles in WoS and the percentage of articles in prestigious journals. This is well documented across all countries in the fields of Psychology and Economics and business. In some countries, other disciplines share these traits, but the national data here vary heavily. In some countries, the article share was significantly higher, which may be related to locally specific publishing priorities in the field (in Norway in Sociology and Arts, in Flanders in Educational sciences, Law, and Media and communication). Except in Philosophy, ethics, and religion, where the article share was roughly equal across all analysed countries, there were striking differences between the CEE countries and the Western/Nordic countries across all disciplines for all indicators analysed in this paper. That being said, the differences are becoming smaller. The current situation in the Czech Republic, Poland, and Slovakia demonstrates that the number of articles indexed in WoS has clearly increased, as has the percentage of articles in influential journals, though the figures do not reach those of Flanders and Norway. In Flanders, the rapid increase of WoS-indexed journals in the previous period has been described by Engels et al. [17] and Ossenblok et al. [18]. The increase in WoS coverage described here may be due to some extent to the addition of new journals, which is especially relevant in non-English-speaking countries. In this respect, Ossenblok et al. [18] showed, using the examples of Flanders and Norway, that adding a few journals could lead to a substantial difference in WoS coverage for some of the analysed disciplines. Regarding the countries in this research, we hypothesise that the PRFSs in each country could influence journal policy by making indexation in WoS a priority so as to benefit from favourable publishing conditions for national authors. Macháček and Srholec [36] studied the inclusion of local journals in Scopus in several European countries, defining a local journal as one in which at least 33% of articles are written by domestic authors. They found a considerably higher number of Scopus-indexed local journals published in CEE countries than in Western countries; for the Czech Republic, this conclusion holds whether the domestic-author threshold is set at 10%, 33%, or even 66%. Although Macháček and Srholec conducted their research on Scopus-indexed journals, their results indicate that specific journal policies were likely stimulated by local PRFSs. In contrast to Macháček and Srholec, we did not carry out an analysis of comparable quality and granularity for WoS; similarly, we did not study the addition of new journals at the level of disciplines. However, the results of our simple comparison of the percentage of articles in regional journals and their coverage in WoS (Figs 9–11) in two research areas (the social sciences and the humanities) show that this share decreased even though coverage increased. Therefore, the impact of publishing in regional journals may not have been significant in the observed period.

It may be useful to take into consideration not only differences and developments in article coverage between STM and the SSH, but also between individual SSH disciplines or clusters of disciplines when applying bibliometrics and determining methods for assessing research impact in evaluation systems. Such considerations correspond with De Filippo and Sanz-Casado’s [37] findings for three social science disciplines based on an analysis of publication activity, collaboration, impact, and visibility using both traditional and alternative metrics.

Differences in publication patterns are not only related to the publication practices in a given discipline; they are also rooted in differences in scholarly traditions across countries. The gap between CEE countries and Western/Nordic ones is influenced, among other things, by cultural and historical heritage, and more specifically by scholars’ ability and willingness to communicate in a foreign language [5]. National PRFSs often provide incentives for adopting different practices; for example, in the Czech Republic, the effect of the PRFS on the strategic behaviour of researchers is well documented [7,12,38,39]. Generally speaking, until 2017 this evaluation system attributed greater weight, quantified as points, to influential WoS- and Scopus-indexed journal articles. Although SSH monographs and other peer-reviewed publications (such as edited volumes and chapters) were taken into account, they were not particularly lucrative [7,16]. Because in the Czech Republic the PRFS had a fundamental impact on the core annual funding of research organisations, it provided incentives for the adoption of new publishing habits. This was especially true in SSH disciplines, in which producing many non-WoS outputs or outputs of mediocre quality was a favourable strategy for ensuring departmental funding [7].

In light of these facts, several research findings deserve further comment. Although publishing in journals is the expected method of communicating results in Economics and business [5], the journal article share in this field is low in the Czech Republic in comparison with Poland. In contrast, we observed a massive number of articles published in a few Czech-based economics journals and an inordinate number of conference proceedings. Neither of these publication types comprises the core of publishing activities in most SSH disciplines [39]. However, the Czech evaluation methodology changed significantly in 2017. By eliminating the straightforward linking of scores (“points”) received for individual publications with money, the new evaluation system strives to avoid stimulating unwanted publishing behaviour. Furthermore, in the case of Poland, the straightforward preference for articles in international journals in national and institutional contexts [23] may have resulted in the increased WoS coverage of articles in many SSH disciplines that this study revealed, although such changes have happened slowly. These observations raise a question: does the change in coverage correspond to researchers’ efforts to reflect intradisciplinary changes and/or to the push to submit (regional) journals for inclusion in WoS to increase the chances of receiving more money based on the funding mechanisms outlined in national PRFSs? Although from our own experience we know that both options can be valid for some disciplines, our present results (Figs 9–11) show that publishing in domestic journals (regardless of the percentage of domestic authors published in these journals) does not seem to drive the increasing coverage. Empirical research is needed to answer this question properly. Although relevant to the results of the present research, studying the formative effects of each country’s PRFS on publication strategies was not a goal of this paper. Earlier bibliometric literature on the effects of PRFSs exists [18,29,31,40,41] and illustrates that PRFSs may influence the publication patterns of scholars in various countries and disciplines.

In this study, we tackled several limitations related to national databases, which were identified and conceptualised in previous studies [34,42,43]. One of the most important limitations of the study lies in the cross-country comparison of data on scientific disciplines delineated by different classification methods: cognitive (applied in the databases of the Czech Republic, Norway, and Slovakia) or organisational (applied in the databases of Poland and Flanders). The analysis and theoretical framework presented by Guns et al. [42] warn against cross-country comparisons such as the present one. According to Guns et al., 73% of Flemish humanities articles, organisationally defined, are published in humanities journals, whereas this ratio is only 59% for the social sciences. Given the objectives of this study, however, we cannot work solely with cognitive classifications derived from the journal classification in WoS, as this could make it difficult to understand how each discipline in each country identifies itself through a set of publications (e.g. an archaeological paper published in an anthropological journal, or an article authored by psychologists published in a neuroscience journal). Thus, one of the possible contributions of this study is highlighting the ability of disciplines to publish research in outlets that are prestigious or influential (based on IF), even though the journals in which the research is published may be classified differently in WoS than the articles themselves are in the national database. In the Czech system, each publication reported for evaluation is included in the national bibliographic database (RIV) with a cognitive classification determined by the author based on the research topic, but within the evaluation process the bibliometric analysis is conducted at the level of the WoS categories assigned to journals.

Another limitation is rooted in the indexation of some journals in multiple databases, foremost those indexed in both the AHCI and the SCI-E/SSCI. This implies a risk of bias regarding the extent to which the SSH are covered in each individual citation index. Presence in multiple citation indexes does not affect article share or overall coverage. In terms of article representation in citation indexes, we decided to link every journal in each year with just one citation index, preferring the SCI-E/SSCI over the AHCI, as one of our research questions focuses on changes in the number of articles published in top-tier journals.

In this study, we found differences in the representation of articles in WoS across SSH disciplines and across countries within these disciplines. Research evaluation should respect the diversity of disciplines and not apply mechanisms punishing typical publication patterns in certain fields [28]. English-language journals are overrepresented in WoS. Hence, countries or disciplines publishing more in English are far more likely to be well represented in WoS. This partly explains the lower coverage of humanities publications. In the Czech national evaluation system implemented after 2017, performance in each discipline and in each research organisation is seen as a mark of “quality” reflecting the distribution of articles in quartiles derived from rankings based on the Article Influence Score. In this respect, the Czech PRFS exposes SSH disciplines to the new challenge of being assessed through publication performance in influential journals. Based on the results of the first two years of the annual evaluation, members of expert panels commenting on the assessment criticised the “apparently” low quality of research in SSH disciplines (unpublished working reports). Previous research also argued that performance in post-communist countries still does not meet the level of that in Western European countries [15,16], but these studies were based solely on the limited view of journal-level indicators. Nevertheless, we should not neglect other publishing activities, for example, AHCI-indexed articles, when interpreting publication performance in the humanities. We argue that the patterns should be seen from a much broader viewpoint that takes into account different contexts and starting points. Assessment results based on a single or inappropriate indicator without understanding the discipline- and country-specific publication patterns and recent developments of disciplines might negatively shape research policies or even the public view of research activities. More specifically, evaluation systems should not punish SSH disciplines on the basis of inappropriate or incomplete measures. Our results offer a basis for a more contextualised interpretation of changes in SSH disciplines.

Differences in publication patterns stem from different national cultural and political backgrounds. Disciplines do not have universal characteristics across all countries. This does not mean, however, that these patterns should be preserved. We do not claim that an article in a WoS-indexed journal is the most desirable publication type in all disciplines, including SSH ones. However, we argue that SSH publishing habits in CEE countries do not need to be maintained on the basis of misplaced or outdated arguments and assumptions. The idea of protecting local excellence, especially in CEE countries, should not be confused with rigidity, the avoidance of international resources, and the national isolation of SSH disciplines. SSH research in post-communist countries has much to say in international forums, especially in the social sciences. Nevertheless, our comparison shows that the potential for international visibility and impact, understood here as the extent to which SSH disciplines use WoS-indexed journals as a publication channel, can be improved in comparison with Western European and Nordic countries. Locally relevant research published in national languages is especially important for SSH disciplines, but preserving local topics and the isolation of SSH disciplines may limit their development in national and international contexts.

Conclusions

Our study shows that, despite some fluctuations, publication patterns in the social sciences and humanities in CEE countries are changing towards broader representation in WoS, particularly in journals with greater influence in terms of citation impact. Despite the clear differences between the CEE countries studied and the two Western/Nordic countries across all disciplines, publication patterns in CEE countries are becoming more similar to those in Western and Nordic countries, albeit with different intensity. In the Czech Republic, the growing relative article share in general publication patterns is often accompanied by increased coverage and publication in prestigious journals. In Poland, the article share in all humanities disciplines decreased, but a greater percentage of articles was published in more influential journals. There are nonetheless considerable dissimilarities in the dynamics between countries and between individual SSH disciplines. Hence, in bibliometrics, research evaluation, and science policy we suggest taking into account the differences not only between the social sciences and humanities but also between the disciplines within them.

Supporting information

S1 Fig. Proportion of articles in WoS by citation indexes.

Czech Republic–social sciences.

(TIF)

S2 Fig. Proportion of articles in WoS by citation indexes.

Slovakia–social sciences.

(TIF)

S3 Fig. Proportion of articles in WoS by citation indexes.

Poland–social sciences.

(TIF)

S4 Fig. Proportion of articles in WoS by citation indexes.

Norway–social sciences.

(TIF)

S5 Fig. Proportion of articles in WoS by citation indexes.

Flanders–social sciences.

(TIF)

S6 Fig. Proportion of articles in WoS by citation indexes.

Czech Republic–humanities.

(TIF)

S7 Fig. Proportion of articles in WoS by citation indexes.

Slovakia–humanities.

(TIF)

S8 Fig. Proportion of articles in WoS by citation indexes.

Poland–humanities.

(TIF)

S9 Fig. Proportion of articles in WoS by citation indexes.

Norway–humanities.

(TIF)

S10 Fig. Proportion of articles in WoS by citation indexes.

Flanders–humanities.

(TIF)

S1 Table. Article share–social sciences.

(DOCX)

S2 Table. Article share–humanities.

(DOCX)

S3 Table. Coverage of journal articles in WoS–social sciences.

(DOCX)

S4 Table. Coverage of journal articles in WoS–humanities.

(DOCX)

S5 Table. Proportion of articles in WoS by citation indexes.

(DOCX)

Acknowledgments

The authors wish to thank Clarivate Analytics for providing the list of journals related to citation indexes in the Core Collection, and reviewers for their valuable comments.

Data Availability

The data underlying this study are publicly available at: https://doi.org/10.5281/zenodo.4060376.

Funding Statement

The work was supported by the COST Action CA15137 “European Network for Research Evaluation in the Social Sciences and the Humanities” (ENRESSH) through an STSM grant to Michal Petr. The work of Raf Guns and T.C.E. Engels is supported by the Flemish Government through its funding of the Flemish Centre for Research & Development Monitoring (ECOOM); the work of Gunnar Sivertsen was supported by the Research Council of Norway, Grant 256223 FORINNPOL; and the work of Emanuel Kulczycki was supported by the National Science Centre in Poland, Grant UMO-2017/26/E/HS2/00019. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

References

1. Franssen T, Wouters P. Science and its significant other: Representing the humanities in bibliometric scholarship. Journal of the Association for Information Science and Technology. 2019;70:1124–1137. doi: 10.1002/asi.24206.
2. Nederhof A. Bibliometric monitoring of research performance in the Social Sciences and the Humanities: A Review. Scientometrics. 2006;66:81–100. doi: 10.1007/s11192-006-0007-2.
3. Archambault E, Vignola‐Gagne E, Cote G, Lariviere V, Gingras Y. Benchmarking scientific output in the social sciences and humanities: The limits of existing databases. Scientometrics. 2006;68(3):329–342. doi: 10.1007/s11192-006-0115-z.
4. Aksnes DW, Sivertsen G. A Criteria-based Assessment of the Coverage of Scopus and Web of Science. Journal of Data and Information Science. 2019;4(1):1–21. doi: 10.2478/jdis-2019-0001.
5. Kulczycki E, Engels TCE, Pölönen J, Bruun K, Dušková M, Guns R, et al. Publication patterns in the social sciences and humanities: evidence from eight European countries. Scientometrics. 2018;116(1):463–486. doi: 10.1007/s11192-018-2711-0.
6. Kozak M, Bornmann L, Leydesdorff L. How have the Eastern European countries of the former Warsaw Pact developed since 1990? A bibliometric study. Scientometrics. 2015;102(2):1101–1117. doi: 10.1007/s11192-014-1439-8.
7. Good B, Vermeulen N, Tiefenthaler B, Arnold E. Counting quality? The Czech performance-based research funding system. Research Evaluation. 2015;24(2):91–105. doi: 10.1093/reseval/rvu035.
8. Korytkowski P, Kulczycki E. Examining how country-level science policy shapes publication patterns: the case of Poland. Scientometrics. 2019;119(3):1519–1543. doi: 10.1007/s11192-019-03092-1.
9. Linkova M. Unable to resist: Researchers’ responses to research assessment in the Czech Republic. Human Affairs. 2014;24(1):78–88. doi: 10.2478/s13374-014-0207-z.
10. Šima K. Evidence in Czech research evaluation policy: measured and contested. Evidence & Policy: A Journal of Research, Debate and Practice. 2017;13(1):81–95.
11. Kulczycki E, Rozkosz EA, Drabek A. Internationalization of Polish journals in the social sciences and humanities: Transformative role of the research evaluation system. Canadian Journal of Sociology. 2019;44(1):9–38.
12. Pajić D. Globalization of the social sciences in Eastern Europe: genuine breakthrough or a slippery slope of the research evaluation practice? Scientometrics. 2015;102(3):2131–2150. doi: 10.1007/s11192-014-1510-5.
13. Skovajsa M. Místo Sociologického časopisu/Czech Sociological Review mezi sociologickými časopisy podle bibliometrických indikátorů. Úvaha nepříliš jubilejní. Sociologický časopis/Czech Sociological Review. 2014;50(5):759–778.
14. Toth J. “U.S. journals can afford to remain regional, but we can not.” Author distribution-based internationality of Eastern European communication journals. KOME − An International Journal of Pure Communication Inquiry. 2018;6(2):1–15. doi: 10.17646/KOME.2018.21.
15. Jurajda Š, Kozubek S, Münich D, Škoda S. Scientific publication performance in post-communist countries: still lagging far behind. Scientometrics. 2017;112(1):315–328. doi: 10.1007/s11192-017-2389-8.
16. Vanecek J. The effect of performance-based research funding on output of R&D results in the Czech Republic. Scientometrics. 2014;98(1):657–681. doi: 10.1007/s11192-013-1061-1.
17. Engels TCE, Ossenblok TLB, Spruyt EHJ. Changing publication patterns in the Social Sciences and Humanities, 2000–2009. Scientometrics. 2012;93(2):373–390. doi: 10.1007/s11192-012-0680-2.
18. Ossenblok TLB, Engels TCE, Sivertsen G. The representation of the social sciences and humanities in the Web of Science—a comparison of publication patterns and incentive structures in Flanders and Norway (2005–9). Research Evaluation. 2012;21(4):280–290. doi: 10.1093/reseval/rvs019.
19. Sivertsen G, Larsen B. Comprehensive bibliographic coverage of the social sciences and humanities in a citation index: an empirical analysis of the potential. Scientometrics. 2012;91(2):567–575. doi: 10.1007/s11192-011-0615-3.
20. Sivertsen G. Scholarly publication patterns in the social sciences and humanities and their coverage in Scopus and Web of Science. In: Noyons E, editor. Proceedings of the science and technology indicators conference 2014 Leiden; 2014. pp. 598–604.
21. Mongeon P, Paul-Hus A. The journal coverage of Web of Science and Scopus: a comparative analysis. Scientometrics. 2016;106(1):213–228. doi: 10.1007/s11192-015-1765-5.
22. Ochsner M, Kulczycki E, Gedutis A. The Diversity of European Research Evaluation Systems. In: Costas R, Franssen T, Yegros-Yegros A, editors. 23rd International Conference on Science and Technology Indicators; 2018. pp. 1235–1241.
23. Kulczycki E, Rozkosz EA. Does an expert-based evaluation allow us to go beyond the Impact Factor? Experiences from building a ranking of national journals in Poland. Scientometrics. 2017;111(1):417–442. doi: 10.1007/s11192-017-2261-x.
24. Office of the Government of the Czech Republic. Methodology for Evaluating Research Organisations and Research, Development and Innovation Purpose-tied Aid Programmes. Prague: Office of the Government of the Czech Republic; 2018. Available from: https://www.vyzkum.cz/FrontClanek.aspx?idsekce=799796.
25. Jonkers K, Zacharewicz T. Research Performance Based Funding Systems: A Comparative Assessment. Luxembourg: Publications Office of the European Union; 2016. EUR 27837 EN. doi: 10.2791/70120.
26. Debackere K, Arnold E, Sivertsen G, Spaapen J, Sturn D. Performance-Based Funding of University Research. Publications Office of the European Union; 2018. doi: 10.2777/644014.
27. Zacharewicz T, Lepori B, Reale E, Jonkers K. Performance-based research funding in EU Member States—A comparative assessment. Science and Public Policy; 2018. doi: 10.1093/scipol/scy041.
28. Sivertsen G. Patterns of internationalization and criteria for research assessment in the social sciences and humanities. Scientometrics. 2016;107(2):357–368. doi: 10.1007/s11192-016-1845-1.
29. Butler L. What happens when funding is linked to publication counts? In: Moed H, Glänzel W, Schmoch U, editors. Handbook of Quantitative Science and Technology Research; 2005. pp. 389–405.
30. Butler L. Explaining Australia’s Increased Share of ISI Publications–the Effects of a Funding Formula Based on Publication Counts. Research Policy. 2003;32(1):143–155. doi: 10.1016/S0048-7333(02)00007-0.
31. Moed HF. UK Research Assessment Exercises: Informed Judgments on Research Quality or Quantity? Scientometrics. 2008;74(1):153–161. doi: 10.1007/s11192-008-0108-1.
32. Vanholsbeeck M, Demetriou T, Girkontaite A, Istenic Starcic A, Keiski V, Kulczycki E, et al. Senior academics as key negotiators in the implementation of impact policies in the social sciences and humanities. fteval Journal for Science and Technology Policy Evaluation. 2019;48:72–79. Available from: https://www.fteval.at/content/home/journal/aktuelles/19_07_2019_ausgabe_48/Journal48_WEB_V2.pdf.
33. Sīle L, Guns R, Sivertsen G, Engels TCE. European databases and repositories for Social Sciences and Humanities research output. Antwerp, Belgium: ECOOM & ENRESSH; 2017. doi: 10.1007/s00484-017-1401-6.
34. Sīle L, Pölönen J, Sivertsen G, Guns R, Engels TCE, Arefiev P, et al. Comprehensiveness of national bibliographic databases for social sciences and humanities: Findings from a European survey. Research Evaluation. 2018;27(4):310–322. doi: 10.1093/reseval/rvy016.
35. Organisation for Economic Co-operation and Development. Frascati Manual 2015: Guidelines for Collecting and Reporting Data on Research and Experimental Development. Paris: OECD Publishing; 2015. doi: 10.1787/9789264239012-en.
36. Macháček V, Srholec M. Local journals in Scopus. Study 17/2017. Prague: Economics Institute of the Czech Academy of Sciences; 2017. Available from: https://idea.cerge-ei.cz/files/IDEA_Studie_17_2017_Mistni_casopisy_ve_Scopusu/mobile/index.html#p=4.
37. De Filippo D, Sanz-Casado E. Bibliometric and Altmetric Analysis of Three Social Science Disciplines. Frontiers in Research Metrics and Analytics. 2018;3:34. doi: 10.3389/frma.2018.00034.
38. Linková M, Stöckelová T. Public accountability and the politicisation of science: The peculiar journey of Czech research assessment. Science and Public Policy. 2012;39(5):618–629. doi: 10.1093/scipol/scs039.
39. Vanecek J, Pecha O. Fast growth of the number of proceedings papers in atypical fields in the Czech Republic is a likely consequence of the national performance-based research funding system. Research Evaluation. 2020;29(3):245–262. doi: 10.1093/reseval/rvaa005.
40. Aagaard K, Bloch C, Schneider JW. Impacts of Performance-Based Research Funding Systems: The Case of the Norwegian Publication Indicator. Research Evaluation. 2015;24(2):106–117. doi: 10.1093/reseval/rvv003.
41. Hammarfelt B, De Rijcke S. Accountability in Context: Effects of Research Evaluation Systems on Publication Practices, Disciplinary Norms and Individual Working Routines in the Faculty of Arts at Uppsala University. Research Evaluation. 2015;24(1):63–77. doi: 10.1093/reseval/rvu029.
42. Guns R, Sīle L, Eykens J, Verleysen FT, Engels TCE. A comparison of cognitive and organisational classification of publications in the social sciences and humanities. Scientometrics. 2018;116(2):1093–1111. doi: 10.1007/s11192-018-2775-x.
43. Sīle L. Entanglement of bibliographic database content and data collection practices: Rethinking data integration using findings from a European study. Procedia Computer Science. 2019;146:201–207. doi: 10.1016/j.procs.2019.01.094.

Decision Letter 0

Lutz Bornmann

22 Jun 2020

PONE-D-20-14121

Coverage of journal articles in social sciences and humanities in Web of Science and their representation in citation indexes: a comparison of five European countries

PLOS ONE

Dear Dr. Petr,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process. Please submit your revised manuscript by Aug 06 2020 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:

  • A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labeled 'Response to Reviewers'.

  • A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'.

  • An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: http://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols

We look forward to receiving your revised manuscript.

Kind regards,

Lutz Bornmann

Academic Editor

PLOS ONE

Journal Requirements:

When submitting your revision, we need you to address these additional requirements.

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at

https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and

https://journals.plos.org/plosone/s/file?id=ba62/PLOSOne_formatting_sample_title_authors_affiliations.pdf

2. Please ensure that you include a title page within your main document. We do appreciate that you have a title page document uploaded as a separate file, however, as per our author guidelines (http://journals.plos.org/plosone/s/submission-guidelines#loc-title-page) we do require this to be part of the manuscript file itself and not uploaded separately.

Could you therefore please include the title page into the beginning of your manuscript file itself, listing all authors and affiliations.

3. We note that you have indicated that data from this study are available upon request. PLOS only allows data to be available upon request if there are legal or ethical restrictions on sharing data publicly. For more information on unacceptable data access restrictions, please see http://journals.plos.org/plosone/s/data-availability#loc-unacceptable-data-access-restrictions.

In your revised cover letter, please address the following prompts:

a) If there are ethical or legal restrictions on sharing a de-identified data set, please explain them in detail (e.g., data contain potentially sensitive information, data are owned by a third-party organization, etc.) and who has imposed them (e.g., an ethics committee). Please also provide contact information for a data access committee, ethics committee, or other institutional body to which data requests may be sent.

b) If there are no restrictions, please upload the minimal anonymized data set necessary to replicate your study findings as either Supporting Information files or to a stable, public repository and provide us with the relevant URLs, DOIs, or accession numbers. For a list of acceptable repositories, please see http://journals.plos.org/plosone/s/data-availability#loc-recommended-repositories.

We will update your Data Availability statement on your behalf to reflect the information you provide.

4. Please upload a copy of Figures 10 and 11, to which you refer in your text on pages 17, 23 and 27. If any figure is no longer to be included as part of the submission please remove all reference to it within the text.

5. Please include a copy of Table 1 which you refer to in your text on page 19.

6. Please include captions for your Supporting Information files at the end of your manuscript, and update any in-text citations to match accordingly. Please see our Supporting Information guidelines for more information: http://journals.plos.org/plosone/s/supporting-information


Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes

Reviewer #2: Partly

**********

2. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes

Reviewer #2: N/A

**********

3. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes

Reviewer #2: No

**********

4. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes

Reviewer #2: Yes

**********

5. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: This paper compares comprehensive national records of papers in social sciences and humanities with WoS lists of such papers. Trends over time in Flanders and Norway are compared against Poland, the Czech Republic and Slovakia. The paper is solid and I have just a few questions. The text does go on at some length and should be shortened. There were pages and pages of discussion when there really wasn’t that much to discuss; results are just repeated multiple times. The formatting of the figures made it impossible to consider them in the review. The figures came after the text with no captions associated with them. I don’t know whether this was journal submission requirements or author choice; either way I ended up just ignoring the figures.

On page 4 the way ESCI is described makes one wonder why the journals aren’t just included in WoS. Maybe improve explanation.

I very much liked the interpretation of the IF, very well conceived.

P. 5 “revealing publication patterns of the coverage and ranking of journals in international comparison”

Confusing phrasing, rework the sentence.

p. 5 “A publication in a WoS indexed journal is in most SSH disciplines still seen as a success of its kind and also as an indication of higher quality in publication profile [23]. If this is true, then the ranking of journals might play a role in this process.”

What process? And if what is true? That WoS is seen as higher quality? If it might not be true that it is seen this way, then why did you say it?

Table 1 – numbers in tables should be right justified and ,’s should be used to demarcate 1000’s. And how are the rows in this table ordered? Largest to smallest would be a good order.

“share of articles” sounds like the wrong phrase. It should probably be “article share”. “Share of articles” sounds like the % of articles that are X. You want articles as a % of X.

Table 3 Can you not spell out the country names? And where is the data for Norway?

p. 18 You need to use the field name in the first sentence or two discussing field data in every section. Do not rely on the section heading. Sometimes the sentences here can read like they refer to all the data.

p. 23 “ this leads to the same threshold whether” - meaning unclear, rephrase

Reviewer #2: - Data for four (or even three) subsequent single years are insufficient to base conclusions on concerning publication dynamics, let alone trends.

- The analysis of top half IFs vs. bottom half IFs is crude; Subject Category-normalized IFs would be preferable

- Differences between (groups of) countries should be tested statistically where possible

- The decision concerning journals classified in multiple citation indexes to include these in SCI-E and SSCI if possible, and not partly to A&HCI when relevant, might bias findings for Social Sciences as well as for Humanities (p. 8). Preferably, this should be changed; if not, this potential bias should be discussed.

The paper offers an interesting comparison between 3 West Slavic central European countries and two non-English Germanic ones in North West Europe, concerning their journal article output in SSH. The study of SSH output has a considerable history that, unfortunately, has not been given due consideration. Not much is said about the three elephants in the room: English (the dominant language in scholarly publishing), German (an important language in central and north-western Europe), and Russian (the most important Slavic scholarly language in many fields). Especially in the humanities, English is less often used than in most (social) sciences, as opposed to German and Russian. Perhaps the most interesting aspect of the paper is the difference in evaluation and reward systems ("PRFS") and in particular the way this relates to adaptive publication strategies of scholars. However, the various PRFSs are described too summarily to allow a full appreciation of the findings of the study. Finally, although the importance of books for PRFSs is mentioned, the present study is not able to present much data on these. It is not made clear how well the Book Citation Indexes cover monographs and edited volumes.

Major revisions are recommended to meet the concerns raised above.

Typos / mistakes: Table 2 should be replaced by Table S2. CZE in Table 2 should be NOR on p. 11.

**********

6. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No

Reviewer #2: Yes: Anton J. Nederhof

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.

PLoS One. 2021 Apr 8;16(4):e0249879. doi: 10.1371/journal.pone.0249879.r002

Author response to Decision Letter 0


5 Oct 2020

I have also responded in a more readable way in the attached file, “response to reviewers”.

Dear PLOS ONE Editors,

First of all, we would like to thank both reviewers for their very useful and challenging comments. We struggled, however, to implement all of the requested changes, as some of them pull in opposite directions. Whereas Reviewer #1 requested shortening the text (which we consider very useful), implementing all of the changes requested by Reviewer #2 would make the manuscript much longer. Some of Reviewer #2's suggestions, however relevant in general, go far beyond the intended focus of this study, change its scope, and would result in an entirely new manuscript. We therefore decided to give priority to reducing the text while keeping our original research questions, concept, focus, and methodology.

Apart from these concept-changing requests, we made all of the requested revisions. Above all, we shortened and reformulated the lengthy Results and Discussion, as Reviewer #1 suggested. At the same time, we added a few paragraphs to the manuscript addressing Reviewer #2's concerns, which we consider very useful and relevant. We believe that we have found the right balance and reasonably consolidated the Results and Discussion. We also paid increased attention to improving the overall English to make the text clear.

We also uploaded the version with tracked changes. However, this version only shows our changes to the manuscript and does not contain the changes made during proofreading. The clean version with accepted changes is the final version of the revised manuscript.

Below we have copied the editor's comments and both reviewers' comments.

Our responses are marked in colour and set off with increased paragraph indentation, like this text.

***

Editor's comments

When submitting your revision, we need you to address these additional requirements.

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at

https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and https://journals.plos.org/plosone/s/file?id=ba62/PLOSOne_formatting_sample_title_authors_affiliations.pdf

2. Please ensure that you include a title page within your main document. We do appreciate that you have a title page document uploaded as a separate file, however, as per our author guidelines (http://journals.plos.org/plosone/s/submission-guidelines#loc-title-page) we do require this to be part of the manuscript file itself and not uploaded separately.

Could you therefore please include the title page into the beginning of your manuscript file itself, listing all authors and affiliations.

We included the title page into the main document as required.

3. We note that you have indicated that data from this study are available upon request. PLOS only allows data to be available upon request if there are legal or ethical restrictions on sharing data publicly. For more information on unacceptable data access restrictions, please see http://journals.plos.org/plosone/s/data-availability#loc-unacceptable-data-access-restrictions.

In your revised cover letter, please address the following prompts:

a) If there are ethical or legal restrictions on sharing a de-identified data set, please explain them in detail (e.g., data contain potentially sensitive information, data are owned by a third-party organization, etc.) and who has imposed them (e.g., an ethics committee). Please also provide contact information for a data access committee, ethics committee, or other institutional body to which data requests may be sent.

b) If there are no restrictions, please upload the minimal anonymized data set necessary to replicate your study findings as either Supporting Information files or to a stable, public repository and provide us with the relevant URLs, DOIs, or accession numbers. For a list of acceptable repositories, please see http://journals.plos.org/plosone/s/data-availability#loc-recommended-repositories.

We will update your Data Availability statement on your behalf to reflect the information you provide.

We uploaded the data in the repository.

4. Please upload a copy of Figures 10 and 11, to which you refer in your text on pages 17, 23 and 27. If any figure is no longer to be included as part of the submission please remove all reference to it within the text.

We fixed references and figures.

5. Please include a copy of Table 1 which you refer to in your text on page 19.

Table 1 was at the very beginning of the manuscript.

6. Please include captions for your Supporting Information files at the end of your manuscript, and update any in-text citations to match accordingly. Please see our Supporting Information guidelines for more information: http://journals.plos.org/plosone/s/supporting-information

We included captions at the end of the manuscript and checked in-text references.

Reviewers' comments

Reviewer #1

This paper compares comprehensive national records of papers in social sciences and humanities with WoS lists of such papers. Trends over time in Flanders and Norway are compared against Poland, the Czech Republic and Slovakia. The paper is solid and I have just a few questions. The text does go on at some length and should be shortened. There were pages and pages of Discussion when there really wasn’t that much to discuss; results are just repeated multiple times.

Thank you for this suggestion. We shortened the text, mainly in the Results section. Please see the revised manuscript.

The formatting of the figures made it impossible to consider them in the review. The figures came after the text with no captions associated with them. I don’t know whether this was journal submission requirements or author choice; either way I ended up just ignoring the figures.

Graphs were uploaded as separate files, according to PLoS One requirements. We fixed all in-text references and included graphs in the text.

On page 4 the way ESCI is described makes one wonder why the journals aren’t just included in WoS. Maybe improve explanation.

P. 5 “revealing publication patterns of the coverage and ranking of journals in international comparison”

Confusing phrasing, rework the sentence.

p. 5 “A publication in a WoS indexed journal is in most SSH disciplines still seen as a success of its kind and also as an indication of higher quality in publication profile [23]. If this is true, then the ranking of journals might play a role in this process.”

What process? And if what is true? That WoS is seen as higher quality? If it might not be true that it is seen this way, then why did you say it?

We considered how the Introduction could be expressed more clearly, given these questions from Reviewer #1. In the end, we edited the Introduction from the start and improved the explanations.

Table 1 – numbers in tables should be right justified and ,’s should be used to demarcate 1000’s. And how are the rows in this table ordered? Largest to smallest would be a good order.

We updated all tables as suggested and added the following sentence below Table 1: “Disciplines are ordered as they appear in the Frascati classification scheme.”

“share of articles” sounds like the wrong phrase. It should probably be “article share” Share of articles sounds like the % of articles that are X. You want articles as a % of X.

Thank you. With our proofreader, we rephrased “share of articles” to “article share” and improved overall English.

Table 3 Can you not spell out the country names? And where is the data for Norway?

We added a legend with the country names below the tables. The data for Norway were present; the table had mistakenly been split across two pages. We have fixed this.

p. 18 You need to use the field name in the first sentence or two discussing field data in every section. Do not rely on the section heading. Sometimes the sentences here can read like they refer to all the data.

In line with the first comment on the lengthy Results, we have reformulated the whole section. Specifically, we deleted the headings, reduced the text describing each discipline, and replaced it with a description of the findings on publication patterns. Please see the manuscript.

p. 23 “ this leads to the same threshold whether” - meaning unclear, rephrase

We rephrased an unclear sentence.

Reviewer #2

- Data for four (or even three) subsequent single years are insufficient to base conclusions on concerning publication dynamics, let alone trends.

National databases are heterogeneous in scope, coverage, structure of publication types, and availability of data. They are also shaped by unique historical and political backgrounds. From that point of view, comparing data from databases with different data structures and working conditions is highly problematic. We decided to analyse data for the timeframe 2013–2016 because this was the timeframe for which data were available across all databases. In some databases, data are collected retrospectively with some delay, and expanding the timeframe is problematic when we need to align all countries.

In this study, the data from the national databases had to be post-processed to make them mutually comparable and to link them to information on whether each journal was indexed in each citation index in each year of the study. This procedure can only partly be automated; a considerable part of the work has to be done manually (mostly linking ISSNs and titles to the WoS journal lists and verifying the coverage of journals in the given years). The limited timeframe is therefore, to a certain extent, balanced by the high accuracy of the data, which required several months of data cleaning in the background.
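To make the linking step concrete, the following is a minimal sketch, not the authors' actual pipeline, of how records from a national database might be matched to yearly WoS journal lists by ISSN, with a normalised-title fallback. The record fields and the `wos_lists` structure are hypothetical simplifications, and the unmatched cases are exactly those that would still require manual checking.

```python
def normalise_title(title):
    """Lower-case a journal title and keep only alphanumerics for rough matching."""
    return "".join(ch for ch in title.lower() if ch.isalnum())

def link_to_wos(records, wos_lists):
    """Attach a WoS citation-index label to each national-database record.

    records   -- list of dicts with 'issn', 'journal_title', 'year' (hypothetical fields)
    wos_lists -- {year: {'by_issn': {issn: index}, 'by_title': {norm_title: index}}}
    Returns copies of the records with an added 'wos_index' key
    (None means not matched, i.e. not covered or in need of manual review).
    """
    linked = []
    for rec in records:
        year_list = wos_lists.get(rec["year"], {})
        index = year_list.get("by_issn", {}).get(rec.get("issn"))
        if index is None:  # fall back to a normalised title match
            index = year_list.get("by_title", {}).get(normalise_title(rec["journal_title"]))
        linked.append({**rec, "wos_index": index})
    return linked
```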

Following the reviewer's recommendation, we managed to add the year 2016 for the Flanders data to align the timeframe with the other countries. For Slovakia, the database contains no data older than 2014. Even with four years of data, we can see overall trends and differences between countries, at least in disciplines with a sufficient number of publications. Even in the pre-revision version of the manuscript, we did not base conclusions on disciplines with a small publication base and low coverage.

We have added the following explanation in the Data and Methods section: “National databases are unique as for the scope, coverage, structure of publication types and availability of the data. Databases have also different historical and political background. From that point of view, comparing data from databases is highly problematic. Since we endeavour to study all SSH disciplines, we accepted the limited timeframe 2013–2016 for which the data was available across all databases.”

- The analysis of top half IFs vs. bottom half IFs is crude; Subject Category-normalized IFs would be preferable

Since quartiles are calculated on the basis of the IF ranking within each WoS category, they already provide a crude subject normalization. We use quartiles because they are an understandable way to show the position of a journal within its WoS category. Also, in many countries it is customary to refer to quartiles in evaluation-related procedures, and quartiles are perceived there as a sign of “quality”. The idea of this article was therefore to compare the proportions of quartile zones across countries and to show, on the one hand, whether the coverage is changing and, on the other hand, whether the perceived “quality” of journals is changing. We explain this directly in the Introduction and also in the Data and Methods section. We are of the opinion that the IF itself or any other finer-grained metric would convey a different kind of information and would be less meaningful for the purpose of this study.

We have rephrased the Introduction as follows:

“In this study, we consider IF to be an indication of a journal’s citation impact. Following this rationale, we take a journal’s quartile ranking in the JCR to be an indication of the journal’s quality requirements, selectivity, and scholarly impact. Those with higher rankings (Q1 and Q2) are more demanding journals. In this paper, we do not discuss whether IF is an appropriate indicator for evaluating research. We only use it at the journal level, which it was designed for. At the same time, we do not claim that WoS-indexed journals are better than non-WoS-indexed journals. Our selection of indicators reflects the fact that all five countries in our study make use of bibliometric indicators in their national funding and evaluation systems. This common trait in these countries represents just one part of the greater diversity of evaluation systems across Europe.”

We have added the following part to the Data and methods section:

“In many countries, it is customary to refer to quartiles in evaluation-related procedures. Quartiles are perceived as a sign of quality. Therefore, in this paper we seek to demonstrate on one hand if coverage is changing and on the other if the perceived quality of the journals in which articles are published is changing.”
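As a rough illustration of the quartile logic discussed in this response, the sketch below ranks the journals of a single WoS Subject Category by IF and splits the ranking into quarters. This is a simplified stand-in rather than the JCR's own procedure, whose handling of ties and edge cases may differ in detail.

```python
import math

def assign_quartiles(journals_with_if):
    """Assign Q1-Q4 by IF rank within one WoS Subject Category for one year.

    journals_with_if -- dict {journal_name: impact_factor} for a single category
    Returns {journal_name: 'Q1'..'Q4'}, where Q1 holds the top quarter by IF.
    """
    ranked = sorted(journals_with_if, key=journals_with_if.get, reverse=True)
    n = len(ranked)
    return {journal: f"Q{math.ceil(4 * rank / n)}"
            for rank, journal in enumerate(ranked, start=1)}

# Example: in a category of 8 journals, the two highest-IF journals fall in Q1.
print(assign_quartiles({"A": 3.2, "B": 2.9, "C": 2.1, "D": 1.8,
                        "E": 1.5, "F": 1.1, "G": 0.7, "H": 0.4}))
```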

- Differences between (groups of) countries should be tested statistically where possible

Since we work with simple descriptive statistics, we do not consider statistical testing necessary in this specific situation. It is not clear what the benefit of statistical testing would be for the present study, in which we show simple article counts and shares without combining further dimensions in a single figure.

- The decision concerning journals classified in multiple citation indexes to include these in SCI-E and SSCI if possible, and not partly to A&HCI when relevant, might bias findings for Social Sciences as well as for Humanities (p. 8). Preferably, this should be changed; if not, this potential bias should be discussed.

For the article share and coverage, the presence of a journal in multiple citation indexes does not play a role. For the representation in citation indexes, we counted only one citation index per journal, following the decision strategy explained in the manuscript: priority was given to JCR indexation (with an IF), then we searched the AHCI, and finally the ESCI. The rationale for this decision was the point of view of national evaluation systems and the perception of journal “prestige” or “quality”.

We examined the extent to which journals are indexed in both the AHCI and the SCI-E or SSCI and added text on this to the Data and Methods section. We found that 4.4% of all WoS-indexed articles have multiple assignments to the AHCI and the SCI-E/SSCI; among AHCI-indexed articles, the share with multiple assignments is 27.6%. This is a very interesting phenomenon, but it goes beyond the scope of this study. Changing the counting method in the way the Reviewer suggests would cause substantial changes to the master dataset, including new, time-consuming cleaning and completion of the data. Looking at journals with multiple citation indexes assigned, we found many specific situations that require manual decisions, and a deeper conceptualization of what multiple indexing means for performance in the SSH would also be needed. Although the share of journals with multiple indexes is considerable, we think that treating multiple indexes as separate values at the journal level would introduce potential inaccuracies and would add rather little value for these particular research questions.

We added the following sentence in the Data and Methods section.

“We also assigned a journal to the AHCI if it was present in both the AHCI and the SCI-E/SSCI but did not have an IF.”

Then we discussed this as a limitation in the Discussion as follows:

“Another limitation is rooted in the indexation of some journals in multiple databases, foremost those indexed both in the AHCI and the SCI-E/SSCI. This implies a risk of bias regarding the extent to which the SSH are covered in each individual citation index. Presence in multiple citation indexes does not affect article share and overall coverage. In terms of article representation in citation indexes, we decided to link every journal in each year with just one citation index, preferring the SCI-E/SSCI over the AHCI, as one of our research questions focuses on changes in the number of articles published in top-tier journals.”
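A minimal sketch of this priority rule, written out here for clarity rather than taken from the authors' code; the index labels and function arguments are hypothetical simplifications of the decision strategy described in this response.

```python
def assign_single_index(indexes, has_impact_factor):
    """Collapse a journal's (possibly multiple) WoS citation indexes to one label.

    indexes           -- set of index labels the journal appears in, e.g. {"SSCI", "AHCI"}
    has_impact_factor -- True if the journal has an IF in the JCR for that year
    Priority: SCI-E/SSCI with an IF, then AHCI, then ESCI, otherwise not WoS-indexed.
    """
    if has_impact_factor and indexes & {"SCI-E", "SSCI"}:
        return "JCR (SCI-E/SSCI)"
    if "AHCI" in indexes:
        return "AHCI"
    if "ESCI" in indexes:
        return "ESCI"
    return "not in WoS"

# A journal indexed in both the SSCI and the AHCI but without an IF is counted under the AHCI.
print(assign_single_index({"SSCI", "AHCI"}, has_impact_factor=False))  # -> "AHCI"
```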

The paper offers an interesting comparison between 3 West Slavic central European countries and two non-English Germanic ones in North West Europe, concerning their journal article output in SSH. The study of SSH output has a considerable history that, unfortunately, has not been given due consideration. Not much is said about the three elephants in the room: English (the dominant language in scholarly publishing), German (an important language in central and north-western Europe), and Russian (the most important Slavic scholarly language in many fields). Especially in the humanities, English is less often used than in most (social) sciences, as opposed to German and Russian.

Thank you for this suggestion. We have added the following part to the Discussion:

“English-language journals are overrepresented in WoS. Hence, countries or disciplines publishing more in English are far more likely to be well represented in WoS. This partly explains the lower coverage of humanities publications.”

Kulczycki et al. (2020), https://asistdl.onlinelibrary.wiley.com/doi/full/10.1002/asi.24336, found that the importance of Russian and German in journal publishing in CEE countries is nowadays fairly limited, accounting for, respectively, 0.4 to 1.0% and 1.7 to 2.3% of SSH articles. In other words, both languages are historically important in some of the countries under scrutiny but no longer play a major role. For this reason, we do not specifically mention them in the article.

Perhaps the most interesting aspect of the paper is the difference in evaluation and reward systems ("PRFS") and in particular the way this relates to adaptive publication strategies of scholars. However, the various PRFSs are described too summarily to allow a full appreciation of the findings of the study.

Thank you for this relevant suggestion. This is an aspect that we, too, are usually intensely interested in. However, studying the relationships between PRFSs and publication strategies directly was not the goal of this paper. It would predictably take a lot of space, especially when describing in detail how publications are discounted in those models, whereas Reviewer #1 asked for the text to be shortened. Rather than describe each PRFS in detail, we therefore edited the Introduction and added references to recently published research, as follows.

Debackere, K., Arnold, E., Sivertsen, G., Spaapen, J., & Sturn, D. (2018). Performance-Based Funding of University Research. Publications Office of the European Union. https://doi.org/10.2777/644014.

Zacharewicz, T., Lepori, B., Reale, E., & Jonkers, K. (2018). Performance-based research funding in EU Member States—A comparative assessment. Science and Public Policy. https://doi.org/10.1093/scipol/scy041

Finally, although the importance of books for PRFSs is mentioned, the present study is not able to present much data on these. It is not made clear how well the Book Citation Indexes cover monographs and edited volumes.

We agree that books are crucial for PRFSs too. We did not provide any data on this because the main point of the paper was to study article share, coverage, and presence in the WoS Core Collection citation indexes, with the finer-grained distinction of quartiles within the JCR, across all SSH disciplines in the Frascati classification. It is not possible to cover all topics related to all publication patterns in the SSH. The coverage of the BCI in the SSH is very limited, as documented by Aksnes and Sivertsen (2019), which is also referenced in the manuscript.

Major revisions are recommended to meet the concerns raised above.

Typos / mistakes: Table 2 should be replaced by Table S2. CZE in Table 2 should be NOR on p. 11.

We fixed these mistakes.

Attachment

Submitted filename: PLoSOne_MichalPetr_response to reviewers.docx

Decision Letter 1

Lutz Bornmann

17 Nov 2020

PONE-D-20-14121R1

Journal article publishing in the social sciences and humanities: a comparison of Web of Science coverage for five European countries

PLOS ONE

Dear Dr. Petr,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process. One reviewer is still very critical and recommended rejecting the manuscript. Thus, you should address the reviewer's points very carefully and comprehensively. Please submit your revised manuscript by Jan 01 2021 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:

  • A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labeled 'Response to Reviewers'.

  • A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'.

  • An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: http://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols

We look forward to receiving your revised manuscript.

Kind regards,

Lutz Bornmann

Academic Editor

PLOS ONE


Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation.

Reviewer #1: All comments have been addressed

Reviewer #2: (No Response)

**********

2. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes

Reviewer #2: No

**********

3. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: N/A

Reviewer #2: No

**********

4. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: No

Reviewer #2: Yes

**********

5. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes

Reviewer #2: Yes

**********

6. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: (No Response)

Reviewer #2: It is laudable that the authors have provided additional data for Flanders for 2016, but a four-year period is still way too short to detect a trend. It might be possible to test for differences between, for instance, early and late years, but unfortunately, the authors decline to do this. This makes the paper less suitable for PLOS ONE.

A minor point is that the paper still does not pay proper attention to earlier studies dealing with publication changes in SSH.

Concerning the use of quartiles to measure journal impact: a small increase in the IF of a journal may raise it to another quartile; worse, a journal's decrease in IF may be less than that of one or more journals originally ranked in a higher quartile, leading to its promotion to a higher quartile. Using more fine-grained impact measures helps to detect and prevent such statistical problems. This point has not been addressed properly in the revised paper.

The paper's choice not to assign journals with an IF partly to the AHCI when they are indexed both in the AHCI and in other citation indexes considerably limits the paper's relevance for the humanities, as does, possibly to a lesser extent, the exclusion of BCI items.

Regrettably, the effect of changes in PRFS (reward and evaluation systems) on changes in publication patterns is still not more fully addressed. This detracts from the potential value of the paper.

Although the revised paper deals satisfactorily with a few points that were raised in the original comments, it fails to address several important points.

**********

7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No

Reviewer #2: Yes: Anton J. Nederhof

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.

PLoS One. 2021 Apr 8;16(4):e0249879. doi: 10.1371/journal.pone.0249879.r004

Author response to Decision Letter 1


16 Feb 2021

Revised cover letter to: Petr, M et al., Journal article publishing in the social sciences and humanities: a comparison of Web of Science coverage for five European countries.

Dear PLOS ONE Editors,

We are thankful for Reviewer #2's comments, which inspired us to make some further revisions and explain some critical points in the manuscript. We also added some extra references addressing significant topics raised by Reviewer #2. We tried to reflect Reviewer #2's concerns; however, we still feel that making all of the changes requested by Reviewer #2 would entirely change the article's scope. For instance, Reviewer #2 asked us to address the effects of changes in PRFS on changes in publication patterns. We feel that addressing this point would make the text too extensive in scope and length on the one hand, and on the other hand too general to address these points to the extent they deserve. Further, Reviewer #2 requested replacing the IF quartiles with a more fine-grained impact measure. We want to remark that the use of IF quartiles in our article is entirely intentional, as several national evaluation systems use exactly this level for assessing performance. Nevertheless, we addressed several topics in the article, provided some extra explanations, and added references to document Reviewer #2's points.

Reviewer #2 finally says: “Although the revised paper deals satisfactorily with a few points that were raised in the original comments, it fails to address several important points.”

In this article, we do not aim for comprehensiveness; the added value lies mainly in comparing the data from national databases from the perspective of how evaluation systems perceive publication performance. We do not dispute the overall relevance and importance of the recommendations that we did not implement. However, we still consider that some of them (mentioned above) go far beyond the paper's current scope and would change it significantly. Therefore, we prefer to focus on the topics we set out at the beginning of the research, as stated in the article.

With full respect for the experience of Reviewer #2, we would like to thank the reviewer for these beneficial recommendations, which we will be happy to address in future research projects. We kindly ask the Editor to take into account in his decision our argumentation, which we have carefully considered and still regard as reasonable.

Below we have copied Reviewer #2's comments.

We have also replaced the figures with PACE-processed versions, as requested by the Editor.

Reviewers' comments

Reviewer #2: It is laudable that the authors have provided additional data for Flanders for 2016, but a four-year period is still way too short to detect a trend. It might be possible to test for differences between, for instance, early and late years, but unfortunately, the authors decline to do this. This makes the paper less suitable for PLOS ONE.

A minor point is that the paper still does not pay proper attention to earlier studies dealing with publication changes in SSH.

Response: National bibliographic databases are relatively new data sources, and we benefit from the fact that they cover the full national output. WoS has been used as the main data source in all periods of bibliometric research on publication patterns, but national databases are nowadays used as an emerging data source with great potential for analysing publication patterns beyond WoS. However, national bibliographic databases have limitations, such as differences in comprehensiveness, data structure, scope, covered period, etc. These limitations, which affect, for example, the period analysed in our article, delimit the territory within which our data can be reliably compared. Expanding this period is problematic. Using only Web of Science data would make the work several times easier and would allow us to extend the timeframe of the dataset easily, but we decided instead to strengthen the focus on referencing the relevant literature and on describing these differences. We added the following text and a couple of relevant references to the very beginning of the Introduction:

From ca the 1980s, bibliometric methods became increasingly anchored in science policy and research evaluation. In this context, bibliometric research refocused from supporting sociological theories to research evaluation topics: science mapping, systemic effects of evaluation procedures, developing new indicators or studying publication patterns [1]. The common attribute of the recent era in bibliometric scholarship is using publication metadata as the primary data source. Major international bibliographic database Web of Science (WoS) has been used in research dealing with publishing and citation characteristics and patterns changes in the social sciences and humanities (SSH), as summarized by Nederhof [2] and recently by Franssen and Wouters [1]. In 2006, Archambault et al. [3] discussed the limits of WoS for any comparative analysis in terms of SSH output due to the language-related database bias. As documented by Franssen and Wouters [1], the recent establishment of comprehensive national databases has opened a new direction of bibliometric studies of SSH allowing for a complete picture of the publication practices by covering more publication types, sources, and research with local relevance than highly selective international databases WoS and Scopus do [4]. However, national databases are relatively new and limit the research by having a different scope and comprehensiveness. In this descriptive study, drawing on the latest bibliometric research approaches and results using national databases as a main data source, we aim to expand knowledge about publication patterns in journal publishing in SSH. More specifically, on an aspect of performance perceived by national evaluation systems in several countries as a sign of good performance, i.e. ranking of journals. [1] Franssen T, Wouters P. Science and its significant other: Representing the humanities in bibliometric scholarship. Journal of the Association for Information Science and Technology. 2019;70:1124-1137. doi: 10.1002/asi.24206.

[2] Nederhof A. Bibliometric monitoring of research performance in the Social Sciences and the Humanities: A Review. Scientometrics. 2006;66:81–100. doi: 10.1007/s11192-006-0007-2.

[3] Archambault E, Vignola‐Gagne E, Cote G, Lariviere V, Gingras Y. Benchmarking scientific output in the social sciences and humanities: The limits of existing databases. Scientometrics. 2006;68(3):329–342. doi: 10.1007/s11192-006-0115-z.

As for trends, we describe the developments in the Results section of the manuscript but conclude with more general observations that do not depend entirely on the trends: the comparison of countries, the main differences, and the relevant background.

Reviewer #2: Concerning the use of quartiles to measure journal impact: a small increase in the IF of a journal may raise it to another quartile; worse, a journal's decrease in IF may be less than that of one or more journals originally ranked in a higher quartile, leading to its promotion to a higher quartile. Using more fine-grained impact measures helps to detect and prevent such statistical problems. This point has not been addressed properly in the revised paper.

Response: Although we fully agree with the above-mentioned behaviour of the indicators, we would like to point out that it was not the paper's aim to evaluate journals, assess disciplines, or build a comprehensive picture of the SSH. We conducted a descriptive analysis of the state of affairs, expanding the current picture of the SSH using national databases as a data source. Our goal is to detect publication patterns in the specific area that is used for evaluating research in several national evaluation systems. Therefore, dealing with small changes in IF that influence a journal's quartile is not entirely relevant if researchers perceive Q1 journals as somehow special. For instance, in the Czech Republic, the evaluation methodology ranks journals in IF quartiles strictly according to the journal's IF in the year of publication, without paying attention to the influence of small differences in IF values on the borderline between two quartiles. In the present article, we show the distribution of journals across citation indexes, and more specifically across IF quartiles, in the way national systems use quartiles for evaluating research. We translated this logic, with all the imperfections of the approximation methods used in national evaluation. We believe that we have explained this intention satisfactorily in the manuscript.

Reviewer #2: The paper's choice not to include journals with an IF partly in AHCI when these are indexed both in AHCI and in other citation indexes considerably limits the paper's relevance for the humanities, as does, possibly to a lesser extent, the exclusion of BCI items.

Response: The paper's main aim is to analyze the coverage of articles, which is clearly stated in the title and the manuscript. The decision not to duplicate items in each country's dataset is, we believe, suitable for the aim we established (see also above). We consider our deduplication strategy for journals assigned to several citation indexes appropriate for determining a discipline's performance in line with the method used by the evaluation systems. Including journals partly in AHCI when these are also indexed in JCR is also possible and relevant in some cases. Nevertheless, we believe that the humanities are not underrepresented in the results in any manner, as we count these articles preferentially in the JCR-indexed category, in accordance with the assumption that this category performs better in the evaluation system. Also, for the comparison of countries in terms of the presence of articles in JCR-indexed journals, showing these duplicates in AHCI does not play a role.
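
As an illustration of the deduplication strategy just described, the following Python sketch counts each article once, in the highest-precedence citation index its journal belongs to, with the JCR indexes placed before AHCI; the index labels and the exact precedence order are assumptions made for this example, not the authors' actual classification code.

# Assumed precedence: IF-bearing (JCR) indexes first, then AHCI, then ESCI.
PRECEDENCE = ["SSCI", "SCIE", "AHCI", "ESCI"]

def classify(indexes):
    """Return the single category an article is counted in, or 'non-WoS'."""
    for idx in PRECEDENCE:
        if idx in indexes:
            return idx
    return "non-WoS"

print(classify({"AHCI", "SSCI"}))  # SSCI -- counted as JCR-indexed, not duplicated in AHCI
print(classify({"AHCI"}))          # AHCI
print(classify(set()))             # non-WoS

Under such a rule, an article in a journal indexed in both AHCI and SSCI appears only in the JCR-indexed category, which matches the preference expressed in the response above.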

Reviewer #2: Regrettably, the effect of changes in PRFS (reward and evaluation systems) on changes in publication patterns is still not addressed more fully. This detracts from the potential value of the paper.

Response: We agree that an analysis of the effects of PRFSs is an important topic deserving extensive discussion, but we argue that including such an analysis would require a lot of explanation and would make the article lengthy. Instead, we added to the Discussion (p. 27) several references to earlier studies of the effects of PRFSs (which demonstrated how complicated it is to trace such effects), some of which also offer a more general discussion of the topic:

Butler L. What happens when funding is linked to publication counts? In: Moed H, Glänzel W and Schmoch U (eds). Handbook of Quantitative Science and Technology Research; 2005. pp. 389–405.

Butler L. Explaining Australia's Increased Share of ISI Publications – the Effects of a Funding Formula Based on Publication Counts. Research Policy. 2003;32(1):143–155. doi: 10.1016/S0048-7333(02)00007-0.

Aagaard K, Bloch C, Schneider JW. Impacts of Performance-Based Research Funding Systems: The Case of the Norwegian Publication Indicator. Research Evaluation. 2015;24(2):106–117. doi: 10.1093/reseval/rvv003.

Moed HF. UK Research Assessment Exercises: Informed Judgments on Research Quality or Quantity? Scientometrics. 2008;74(1): 153–161. doi: 10.1007/s11192-008-0108-1.

Hammarfelt B, De Rijcke S. Accountability in Context: Effects of Research Evaluation Systems on Publication Practices, Disciplinary Norms and Individual Working Routines in the Faculty of Arts at Uppsala University. Research Evaluation. 2015;24(1):63–77. doi: 10.1093/reseval/rvu029.

Reviewer #2: Although the revised paper deals satisfactorily with a few points that were raised in the original comments, it fails to address several important points.

Response: We do not aim for comprehensiveness; the added value lies mainly in comparing the data from national databases from the perspective of how evaluation systems perceive publication performance. We do not dispute the overall relevance and importance of the recommendations that we did not implement. However, we maintain that implementing some of them (e.g. the effect of changes in PRFSs) is far beyond the paper's current scope and would change it significantly. We feel that addressing several of the recommended points would, on the one hand, make this text too extensive in scope and length and, on the other hand, too general to address these points to the extent they deserve. Therefore, we prefer to focus on the topics we set out at the beginning of the research, as stated in the article.

With full respect for the expertise of Reviewer #2, we would like to thank you for these very useful recommendations, which we will be happy to address in future research projects.

Attachment

Submitted filename: Response to Reviewers.docx

Decision Letter 2

Lutz Bornmann

29 Mar 2021

Journal article publishing in the social sciences and humanities: a comparison of Web of Science coverage for five European countries

PONE-D-20-14121R2

Dear Dr. Petr,

We’re pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements.

Within one week, you’ll receive an e-mail detailing the required amendments. When these have been addressed, you’ll receive a formal acceptance letter and your manuscript will be scheduled for publication.

An invoice for payment will follow shortly after the formal acceptance. To ensure an efficient process, please log into Editorial Manager at http://www.editorialmanager.com/pone/, click the 'Update My Information' link at the top of the page, and double check that your user information is up-to-date. If you have any billing related questions, please contact our Author Billing department directly at authorbilling@plos.org.

If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they’ll be preparing press materials, please inform our press team as soon as possible -- no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

Kind regards,

Lutz Bornmann

Academic Editor

PLOS ONE

Additional Editor Comments (optional):

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation.

Reviewer #2: All comments have been addressed

**********

2. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #2: (No Response)

**********

3. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #2: (No Response)

**********

4. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #2: (No Response)

**********

5. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #2: (No Response)

**********

6. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #2: (No Response)

**********

7. PLOS authors have the option to publish the peer review history of their article. If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #2: Yes: Anton J. Nederhof

Acceptance letter

Lutz Bornmann

1 Apr 2021

PONE-D-20-14121R2

Journal article publishing in the social sciences and humanities: a comparison of Web of Science coverage for five European countries

Dear Dr. Petr:

I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now with our production department.

If your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information please contact onepress@plos.org.

If we can help with anything else, please email us at plosone@plos.org.

Thank you for submitting your work to PLOS ONE and supporting open access.

Kind regards,

PLOS ONE Editorial Office Staff

on behalf of

Dr. Lutz Bornmann

Academic Editor

PLOS ONE

Associated Data

    This section collects any data citations, data availability statements, or supplementary materials included in this article.

    Supplementary Materials

    S1 Fig. Proportion of articles in WoS by citation indexes.

    Czech Republic–social sciences.

    (TIF)

    S2 Fig. Proportion of articles in WoS by citation indexes.

    Slovakia–social sciences.

    (TIF)

    S3 Fig. Proportion of articles in WoS by citation indexes.

    Poland–social sciences.

    (TIF)

    S4 Fig. Proportion of articles in WoS by citation indexes.

    Norway–social sciences.

    (TIF)

    S5 Fig. Proportion of articles in WoS by citation indexes.

    Flanders–social sciences.

    (TIF)

    S6 Fig. Proportion of articles in WoS by citation indexes.

    Czech Republic–humanities.

    (TIF)

    S7 Fig. Proportion of articles in WoS by citation indexes.

    Slovakia–humanities.

    (TIF)

    S8 Fig. Proportion of articles in WoS by citation indexes.

    Poland–humanities.

    (TIF)

    S9 Fig. Proportion of articles in WoS by citation indexes.

    Norway–humanities.

    (TIF)

    S10 Fig. Proportion of articles in WoS by citation indexes.

    Flanders–humanities.

    (TIF)

    S1 Table. Article share–social sciences.

    (DOCX)

    S2 Table. Article share–humanities.

    (DOCX)

    S3 Table. Coverage of journal articles in WoS–social sciences.

    (DOCX)

    S4 Table. Coverage of journal articles in WoS–humanities.

    (DOCX)

    S5 Table. Proportion of articles in WoS by citation indexes.

    (DOCX)

    Attachment

    Submitted filename: PLoSOne_MichalPetr_response to reviewers.docx

    Attachment

    Submitted filename: Response to Reviewers.docx

    Data Availability Statement

    The data underlying this study are publicly available at: https://doi.org/10.5281/zenodo.4060376.


