PLOS One. 2021 Nov 4;16(11):e0259453. doi: 10.1371/journal.pone.0259453

Do extraordinary science and technology scientists balance their publishing and patenting activities?

Yu-Wei Chang 1,2, Dar-Zen Chen 3, Mu-Hsuan Huang 1,*
Editor: Alberto Baccini
PMCID: PMC8568124  PMID: 34735508

Abstract

This study investigated whether 12 scientists who had received both the National Medal of Science and the National Medal of Technology and Innovation balanced their publishing and patenting activities. The results demonstrated that although these scientists were recognized for their contributions to science and technology, the majority of recipients were not prolific researchers, and some were not influential. Notably, one scientist had not been granted a single patent. This indicated that scientific and technological contributions may not necessarily correspond with influential scientific publications and patents. Moreover, only two scientists had filed for patents before publishing, and they also invested more time developing technological inventions. Most recipients were science- or technology-oriented scientists. Few scientists balanced their publishing and patenting activities and demonstrated excellent research and technology performance.

Introduction

Scientists tend to demonstrate discoveries in research and invention by recording them in scientific publications and patents, respectively. Scientific publications and patents not only provide opportunities for scientists to establish their scientific and technological influence but also serve as a proxy through which researchers can explore scientific and technological activities and the relationship between them [1–6]. However, most scientists disseminate their discoveries through scientific publications, and not all scientists in fields related to technology contribute through inventions or are interested in producing patents [7]. Consequently, few scientists contribute to the production of both scientific publications and patents. Because scientists’ research results both lead to and follow technological development, and because researchers produce scientific publications and patents to demonstrate their influence, we hypothesized that scientists engaging in both scientific and technological activities (hereinafter referred to as S&T scientists) produce both scientific publications and patents. Compared with research targeting scientists who only produce scientific publications, few studies have focused on S&T scientists [8–10], limiting our knowledge of their publishing and patenting activities.

Considerable time and effort are involved in publishing and patenting. Nevertheless, scientists who produced both scientific publications and patents have been reported to have higher research performance than other scientists within the field [11, 12]. Scientists’ research performance is mainly determined by productivity (number of publications) and influence (number of citations received by publications), which are quantified using bibliometric indicators. The scientific publication productivity of S&T scientists is typically higher than their patent productivity because patenting requirements are more rigorous than those for publishing. However, the difference in productivity between publishing and patenting for individual S&T scientists remains unclear. Given the differences between disciplines and even across specializations within the same discipline [13], comparing scientists in terms of productivity was not the focus of this study. Instead, we investigated whether S&T scientists had a balanced ratio of the number of papers published to patents awarded. Regarding influence measured by citation counts, the skewness of citation counts received by scientific publications and patents [14] prompts another question regarding the difference in influence between scientific publications and patents for individual S&T scientists. In addition to indicators related to the quantity of scientific publications and patents as well as the citation counts they receive, the h-index, which combines the number of scientific publications and patents and their citation counts, has been widely used in the research evaluation context to elucidate scientists’ research performance [14].

Considering the uneven distribution of scientific productivity among scientists, we focused on eminent scientists. Compared with ordinary scientists, who account for the majority of researchers, the relatively small group of eminent scientists is expected to show smaller differences in productivity among its members. Moreover, eminent scientists are more prolific or more influential than ordinary scientists [15, 16]. Therefore, eminent S&T scientists were assumed to have more scientific publications and patents, making them suitable for exploring the differences in research performance between publishing and patenting.

The use of bibliometric indicators to identify distinguished scientists has been controversial. Therefore, to investigate the research performance of extraordinary S&T scientists, we identified scientists who have been granted two distinguished awards that separately emphasize contributions to science and to technology. Because these awards closely match the profile of eminent S&T scientists, we targeted scientists who have received both the National Medal of Science (NMS) and the National Medal of Technology and Innovation (NMTI). As of 2019, only 12 individuals have been awarded both the NMS and NMTI. The low number of winners indicates that this select group has been responsible for extraordinary contributions to science and technology. Therefore, they are qualified to serve as representative research subjects because they demonstrate a great deal of influence in the fields of science and technology.

Because eminent S&T scientists were considered in this study, further hypotheses were proposed on the basis of our general impression of scientists with scientific and technological contributions [15–18]. In addition to producing both scientific publications and patents and having higher scientific publication productivity than patent productivity, these excellent scientists were expected to have influential scientific and technological outputs, with at least one scientific publication and one patent by each scientist being highly cited by other scientists. If these contentions could be demonstrated to be true, that would indicate these scientists have produced scientific papers and patents and have demonstrated their contributions to science and technology. Moreover, they were expected to have a high number of scientific publications and a balanced ratio of publishing to patenting activity. Notably, the biggest difference between this study and previous studies is that individuals’ publishing and patenting activities were both covered, and the relationship between the publishing and patenting activities of the same scientist was monitored. We regarded an individual scientist as having demonstrated a balance in publishing and patenting activities if the ratio of the h-index value for scientific publications to the h-index value for patents was balanced. The h-index, which measures both productivity and influence, was used to reflect the difference in research performance between scientific and technological activities.

The 12 scientists considered in this study have been recognized for their excellent contributions to science and technology. Thus, we determined whether they balanced their publishing and patenting and also investigated the influence of their publishing and patenting activities. This study addressed the following three research questions:

  • Do the 12 recipients with contributions to science and technology have similar ratios of scientific articles to patents?

  • Are the scientific articles and patents produced by the 12 recipients highly cited?

  • Do the 12 recipients have similar h-index ratios for scientific articles and patents?

Literature review

Scientists with scientific publications and patents

Among numerous channels through which scientists can disseminate their ideas, scientific publications and patents are common types of intellectual outputs. However, the majority of researchers publish their research results in only scientific publications [19]. Not all scientists can produce patents. Moreover, the field of academic research affects the cost and opportunities for researchers to be involved in industrial research and filing patents. Scientists within the same field demonstrated different levels of research output. For instance, materials science researchers conducting application-oriented research are in a better position to file patents than scientists conducting basic research [20]. This observation was consistent with the findings of Bonaccorsi and Thoma [11], who studied the research productivity of scientists in nano science and technology.

Although scientists can present the same concept in both a scientific paper and a patent, the emphasis placed on the value of science and technology affects the types of output and the approaches adopted for intellectual output [21]. Calderini et al. [20] found that patents filed by materials science researchers tend to be derived from their ideas published in journals with medium and high impact factors. This study revealed an increase in scientific publications and a decrease in filing patents. Chang et al. [22] analyzed patent–paper pairs of scientific papers and U.S. patents in the field of fuel cells, and they determined that patents were filed and approved before the publication year of the scientific papers within patent–paper pairs. The competitive relationship between science and technology implies that individual scientists do not exhibit the same productivity with respect to generating scientific papers and patents.

Higher research productivity and research influence were determined to be linked to researchers with scientific publications and patents in specific fields. Meyer [8] investigated differences in research performance between scientists with scientific publications and patents and scientists without patents in the fields of nanoscience and nanotechnology. Scientists producing scientific publications and patents demonstrated higher productivity and research influence than other scientists did. Bonaccorsi and Thoma [11], in a study on research productivity in the field of nano science and technology, demonstrated that two-thirds of U.S. patents were invented by scientists with at least one scientific publication or a patent. Klitkou and Gulbrandsen [12] observed that researchers affiliated with universities in Norway who had published both scientific publications and patents in the field of life science had higher research productivity than researchers without patents. These findings were consistent with the claim by Magerman et al. [9] and Grimm and Jaenicke [10] that researchers involved in patenting would not reduce their research performance as measured by scientific publications.

Influence and contribution

Research output is the principal method by which researchers demonstrate their influence and contribution. Scientific contribution was ranked as the second most important criterion in the evaluation of scientific papers, after research originality [23]. This indicates that the value of scientific papers can be demonstrated through their possible contribution to science. Similarly, patents reflect technological innovation and contribution. Scientometric researchers widely use citation counts as a proxy for scientific and technological influence [24–26] and also apply this bibliometric approach to assess the scientific contributions of individual scientists [27, 28]. This methodology indicates that research influence is not distinguished from scientific contribution. Although influence is not equivalent to contribution, the concept of influence is often mixed with that of contribution [29].

From the bibliometric perspective, the number of citations received by scientific publications is used to demonstrate various characteristics embedded in scientific publications, such as research quality and influence [30, 31]. However, opponents of the use of citation indicators for measuring research quality regard the limitations of citation indicators as barriers to quantifying the complex concept of research quality, which is mainly characterized by soundness, originality, scientific value, and societal value [31]. Although influence is regarded as having a closer relationship with citation counts than with research quality [32], complex citing behavior that involves diverse motives has resulted in some researchers being unconvinced that citation-related indicators are appropriate for assessing research influence [14, 30].

Several researchers have disputed that citation measures can be used to assess researchers’ contributions to science [30]. Martin and Irvine [32] defined contribution to science as scientific progress and stated that the actual influence of a publication is highly associated with the concept of scientific progress. They also argued that papers with a substantial actual influence were those providing major contributions to scientific knowledge, and influence cannot be measured directly by citation counts. The bibliometric approach simplifies the notion of scientific influence and contribution, and thus peer review, a traditional approach, remains a common means for selecting recipients of awards and honors [3335]. Traditionally, personal testimonials of the influence that a specific scientist has had on other researchers were the only manner in which to demonstrate scientific contributions. Despite the existence of peer review bias among reviewers, it is still considered the optimal method for assessing scientific contributions [23]. Nonetheless, some researchers support the citation count method; for instance, over half of the scientists interviewed for Aksnes’s study [29] agreed that citation counts received by papers reflect their scientific contributions and value. However, we did not identify any awards that planned to incorporate citation counts into the award requirements.

Because of differences in the characteristics of the bibliometric and peer review approaches, numerous studies have attempted to examine the correlation between peer review and citation measures. Several studies have demonstrated a positive correlation between peer review and citation measures [36, 37], whereas other studies have reported a weak correlation [31, 38]. Baccini and Nicolao [31] observed a low degree of agreement between the peer review and bibliometric methods in grading journal articles in 13 of 14 fields. Abramo, D’Angelo, and Reale [39] did not claim that bibliometric indicators were superior to expert judgment, but they concluded that the bibliometric approach was more reliable than peer review for predicting the future scholarly influence of scientific publications. These findings indicate that citation counts are a possible indicator of influence or contribution, and thus, the bibliometric approach was adopted as one indicator in the present study.

Altmetrics is a method for assessing the social effects of publications using social media or other Internet resources and can be employed for a more complete overview of the effect of publications. Therefore, it has become an area of interest; the increasing number of studies investigating the correlation between Altmetric scores and citation counts reflects a growing interest [40, 41]. Studies have reported a significant positive correlation between Altmetric scores and citation numbers [42, 43], whereas other studies have reported a weak correlation [44]. Altmetric scores should be regarded as a supplementary assessment measure of the influence of publications, but Altmetric data are only available for studies published after social media platforms were established in 2011 [45, 46]. Therefore, the public influence of studies published before 2011 cannot be assessed using Internet-based analysis. The majority of the scientific papers from the 12 subjects of the present study were published before 2011; therefore, we did not include Altmetric scores in our analysis.

Methodology

Scientist selection

Both the NMS and the NMTI have been awarded to 12 researchers, who were selected as the subjects of this study. The NMS and NMTI were established by the U.S. Congress in 1959 and 1980 and were first awarded in 1962 and 1985, respectively. The NMS honors scientists who have contributed to science and related areas and who have influenced industry, education, or the country. The NMTI honors scientists who have greatly influenced and contributed to the American economy, social welfare, or the environment through technology commercialization and innovation. As of 2019, only 12 scientists have received both the NMS and NMTI. NMS and NMTI candidates must be American citizens; therefore, these awards are not international in scope, but they are highly prestigious national awards. Thus, recipients of both the NMS and the NMTI constitute an appropriate sample for investigating the relationship between science and technology at the individual level.

Table 1 lists the ages and institutions of the 12 scientists who received the NMS and NMTI. Subjects were aged 46–89 years when receiving the NMS and 53–85 years when receiving the NMTI. Most recipients received the NMS first and the NMTI later; only three subjects received the NMTI before the NMS. The length of time between receipt of the two awards was 1–23 years, and seven scientists were awarded the two medals within 8 years of each other. The youngest subject was 70 years old in 2018, when data were collected for this study. This indicated that at the time of this study the 12 scientists were of retirement age and thus at the end of their academic careers. Therefore, the scientific papers and patents collected for each recipient represented almost the entirety of their professional productivity.

Table 1. Ages and institutions of the 12 scientists who received an NMS and NMTI.

No. Name Affiliation NMS Field (NMS) Age (NMS) NMTI Field (NMTI) Age (NMTI)
1 Robert S. Langer Massachusetts Institute of Technology 2006 Engineering 58 2011 Medicine 63
2 Jan D. Achenbach Northwestern University 2005 Engineering 70 2003 Aerospace 68
3 Herbert W. Boyer University of California, San Francisco 1990 Biological Sciences 54 1989 Medicine 53
4 Nick Holonyak, Jr. University of Illinois 1990 Engineering 62 2002 Electronics 74
5 Arnold O. Beckman California Institute of Technology 1989 Physical Sciences 89 1988 Aerospace 88
6 Stanley N. Cohen Stanford University 1988 Biological Sciences 53 1989 Medicine 54
7 Paul C. Lauterbur University of Illinois 1987 Physical Sciences 58 1988 Medicine 59
8 Robert N. Noyce Intel Corporation 1979 Engineering 52 1987 Computer Science 60
9 Carl Djerassi Stanford University 1973 Physical Sciences 50 1991 Environment 68
10 Harold E. Edgerton Massachusetts Institute of Technology 1973 Engineering 70 1988 Electronics 85
11 Jack St. Clair Kilby Texas Instruments 1969 Engineering 46 1990 Hardware 67
12 Clarence L. Johnson Lockheed Corporation 1965 Engineering 55 1988 Aerospace 78

Data collection

Personal data including age, education, professional experience, and honors and awards were obtained from websites and social media. This assisted in identifying the papers and patents produced by the 12 recipients. The research papers and patents produced were the focus of this study and served as proxies for scientific and technological results, which are highly associated with professional output. Research papers published by each scientist were retrieved from the Scopus database, the largest multidisciplinary citation index database. The journal coverage of Scopus is much larger than that of Web of Science (WoS), another multidisciplinary citation index database. Therefore, to collect as complete a list as possible of the publications of each scientist over the course of their career, Scopus was used as the source database for published research papers. Only articles were defined as research papers in this study; nonresearch papers such as interviews, editorial materials, and book reviews were excluded. The latest year of publication was 2018. Regarding patents, because all 12 subjects were Americans and U.S. patent information was the only available data source, U.S. patents granted to the 12 subjects were retrieved from Google Patents. For the purposes of this study, the latest year in which a U.S. patent was granted was also 2018. The names of the 12 scientists were the key information for searching bibliographic records of research papers and U.S. patents. Author and inventor affiliations were examined against their professional experience to identify the relevant research output.
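To make the article-screening step concrete, the following is a minimal sketch in Python with pandas that filters a hypothetical Scopus export down to research articles published through 2018; the file name and column labels ('Document Type', 'Year') are assumptions for illustration, not a documented Scopus schema.

```python
import pandas as pd

# Hypothetical Scopus CSV export for one scientist; the file name and column
# labels are assumptions for illustration, not a documented Scopus schema.
records = pd.read_csv('scopus_export_scientist01.csv')

# Keep research articles only and drop nonresearch document types
articles = records[records['Document Type'].eq('Article')]

# Restrict to the study window (latest publication year considered: 2018)
articles = articles[articles['Year'] <= 2018]

print(len(records), 'records retrieved;', len(articles), 'retained as research articles')
```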

The bibliographic records of research articles provided by the Scopus database include title, author names, author affiliations, journal source, volume and number, pages, publication year, abstracts, and the number of citations of the paper. These data were downloaded on January 29, 2019. Because the cumulative number of citations received by each paper is updated to a date close to the end of 2018, the number of citations received by each paper closely reflects their influence as of the end of 2018. As mentioned in the preceding paragraph, Scopus covers more journal titles than WoS. This helped us collect a greater proportion of the articles published by the 12 scientists. However, Scopus does not provide the citation figures for years before 1970. For articles published before 1970 and that received citations before 1970, their annual number of citations for that period was collected from WoS if that citation data were available. The citation data collection was conducted on the premise that articles published before 1970 are indexed by WoS. Therefore, for articles published before 1970, their annual citation counts were collected from WoS (before 1970) and Scopus (after 1970). For example, the total number of citations received by an article published in 1968 was calculated from the number of citations recorded in 1969 in WoS and the number of citations recorded during 1970–2018 in Scopus. This prevented an underestimation of the number of citations received by articles published before 1970. Regarding the bibliographic records of patents, patent numbers, titles, abstracts, inventor names, and grant dates were collected in February 2019. Because the number of patent-related citations is not provided in the U.S. Patent and Trademark Office (USPTO) database, we calculated the number of citations received on the basis of the references for all patents. The list of references does not change after the patents are granted. Therefore, 2018 is the latest year for which the number of patent citations is recorded.
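The rule for combining pre-1970 WoS citation counts with Scopus counts from 1970 onward can be sketched as follows; the per-year dictionaries are hypothetical stand-ins for exported citation data, and the example record mirrors the 1968 article described above.

```python
def total_citations(pub_year, wos_by_year, scopus_by_year, last_year=2018):
    """Combine pre-1970 WoS citation counts with Scopus counts from 1970 onward.

    wos_by_year and scopus_by_year map citation year -> citations recorded in
    that database for a single article (hypothetical, pre-exported data).
    """
    total = 0
    for year in range(pub_year, last_year + 1):
        if year < 1970:
            total += wos_by_year.get(year, 0)    # Scopus provides no pre-1970 counts
        else:
            total += scopus_by_year.get(year, 0)
    return total

# Example: an article published in 1968, cited twice in 1969 according to WoS,
# with further citations recorded by Scopus from 1970 onward.
wos = {1969: 2}
scopus = {1970: 3, 1985: 1, 2000: 4}
print(total_citations(1968, wos, scopus))  # 2 + 8 = 10
```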

Data processing

To improve precision, an intensive manual task was performed to examine the bibliographic records of papers and patents. First, the titles of papers that were categorized as articles were examined, followed by papers of fewer than five pages without references. Second, the full texts were examined and some were excluded because they were the incorrect type; for instance, interviews, lectures, or editorial letters.

Several bibliometric indicators were used to measure and compare the differences in productivity and influence evident in articles and patents. Regarding research productivity, considering the differences in the lengths of individual professional careers, we calculated the average productivity per year for each scientist. Researchers who favor collaborative research tend to have higher productivity; therefore, apart from full counting, fractional counting was also used to calculate productivity. If an article was written by n authors, each author was credited with one article under full counting and with 1/n of an article under fractional counting. Regarding indicators related to influence, and as frequently done by researchers, citation counts were employed to represent influence and contribution over time [47]. The number of citations added per year was also tracked to assess variations in influence over time, with particular attention paid to the most highly cited article and patent of each scientist. The subjects were all senior scientists who had made considerable contributions to their fields throughout their careers. Therefore, the h-index, which incorporates both productivity and influence, was used as a reference indicator of their overall research performance [48, 49].
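As a concrete illustration of these productivity and influence indicators, the sketch below computes full and fractional article counts, average productivity per year, and the h-index from a hypothetical list of (publication year, author count, citation count) records; the input format is illustrative, not the study's actual data layout.

```python
from typing import List, Tuple

def productivity_and_h(records: List[Tuple[int, int, int]]):
    """records: one (publication_year, number_of_authors, citations) tuple per article."""
    full = len(records)                                    # full counting: 1 per article
    fractional = sum(1.0 / n for _, n, _ in records)       # fractional counting: 1/n per article
    years = [y for y, _, _ in records]
    career_years = max(years) - min(years) + 1             # span of publishing years
    avg_per_year = full / career_years                     # average productivity per year

    # h-index: the largest h such that h articles each received at least h citations
    cites = sorted((c for _, _, c in records), reverse=True)
    h = sum(1 for rank, c in enumerate(cites, start=1) if c >= rank)
    return full, fractional, avg_per_year, h

# Hypothetical example with four articles
example = [(1990, 2, 50), (1992, 4, 10), (1995, 1, 3), (1995, 3, 1)]
print(productivity_and_h(example))  # (4, 2.083..., 0.667, 3)
```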

Results

Contributions acknowledged by NMS and NMTI

Table 2 presents the scientific and technological contributions for which the 12 scientists were awarded the NMS and NMTI; data were obtained from the NMS and NMTI websites. Every recipient earned both medals for the same or similar contributions. This indicates that an identical idea can be developed in scientific research or applied in technological areas. For instance, Lauterbur received the NMS in the field of physical sciences in 1987 for nuclear magnetic resonance research, and he received the NMTI the following year in the field of medicine for his invention of nuclear magnetic resonance. Furthermore, Lauterbur’s educational background in chemistry did not limit his scientific and technological contributions to his field. Seven recipients had engineering expertise; the others had educational backgrounds in medicine, chemistry, biology, or physics. Eight were practice-oriented academics. Two recipients did not work in industry, and another two did not teach in universities.

Table 2. Contributions, backgrounds, and job experiences of the 12 recipients.

No. Name Contributions for earning awards Education backgrounds Job experiences
1 Langer Polymeric controlled drug release systems Chemical engineering MIT
2 Achenbach Wave propagation/ ultrasonic methods Aeronautics and astronautics Northwestern Univ.
3 Boyer Recombinant-DNA technology Biology; Chemistry; Biotechnology Stanford Univ.; Univ. of California; Genentech
4 Holonyak Light-emitting diodes (LEDs) Electronic engineering General Electric Co.; Univ. of Illinois at Urbana-Champaign
5 Beckman Development of analytical instrumentation Chemical engineering; Physical chemistry; Photochemistry California Institute of Technology; National Inking Appliance Co.
6 Cohen Recombinant-DNA technology Medicine Stanford Univ.
7 Lauterbur Nuclear magnetic resonance Chemistry Univ. of New York at Stony Brook; Univ. of Illinois at Urbana-Champaign
8 Noyce Integrated circuit Physical electronics Intel Corp.; Fairchild Semiconductor Corp.
9 Djerassi Oral contraceptives / Insect control products Chemistry Syntex Co.; Stanford Univ.; Wayne State Univ.
10 Edgerton Stroboscopic photography Electronic engineering Edgerton, Germeshausen, and Grier, Inc.; MIT
11 Kilby Integrated circuit Electronic engineering Texas Instruments
12 Johnson Design of aircraft Aeronautical engineering Lockheed

Awards

Fig 1 shows the number of awards, including the NMS and NMTI, received by each scientist in specific years. In addition to the NMS and NMTI, each scientist earned several other scholarly awards during their professional career. The thin vertical line at zero on the horizontal axis represents the year the NMS was awarded, and the bold red vertical line represents the year the NMTI was awarded. Except for Edgerton, whose last award was the NMTI, all the scientists received other awards both before and after the NMS and NMTI. This means that their professional achievements and contributions were recognized several times by peers. The receipt of several distinguished awards reduces doubts regarding award committee bias. Except for Langer, who received as many as 79 awards, the other 11 scientists received 8–26 awards. Their achievements in terms of the number of awards received may be attributed to the Matthew effect: success breeds success [50]. Among the 12 scientists, nine earned other major awards and prizes in medicine, biotechnology, chemistry, and physics, such as the Albert Lasker Basic Medical Research Award, the Dickson Prize in Medicine, the Charles Stark Draper Prize for Engineering, the Harvey Prize, the Shaw Prize, and the Wolf Prize; receipt of these awards was determined by checking the distinguished awards listed on the website of Harvard Medical School [51]. Two scientists were Nobel Laureates (Kilby and Lauterbur); the blue vertical line in Fig 1 marks the year in which each received the Nobel Prize. Kilby and Lauterbur received a relatively modest number of awards, at 18 and 19, respectively. The two scientists with a relatively low number of awards both specialized in aeronautics (Achenbach and Beckman), which is a relatively small field. Furthermore, Boyer and Cohen had built a close collaborative relationship because of their shared research interests, and they received a similar number of awards.

Fig 1. Number of awards per year.


Research and technology productivity

Table 3 lists figures related to the productivity and influence of articles indexed by Scopus and patents granted by the USPTO for each recipient. Djerassi had the largest combined number of articles and patents. Excluding one recipient (Achenbach) who had not filed any patents, Boyer had the highest ratio of articles to patents (28.33), followed by Lauterbur (24.00). The higher the ratio is, the larger the difference between article productivity and patent productivity is. Scientists with more balanced productivity in research articles and patents were those with ratios of approximately 1. Only four scientists had an article-to-patent ratio equal to or below 1, which indicated that they were not typical academics who dedicated themselves to producing scientific articles; this is because having patents granted is more challenging than having articles published. When using fractional counting, Djerassi still had the highest total productivity in articles and patents. Regarding the ratios of article production to patent production, Lauterbur came in first place (32.50) and Boyer ranked second (16.67). Only the ranks of the top two scientists changed when two different methods were used for counting total productivity. Although no threshold was set for high productivity, the differences in individual productivity of articles and patents and average productivity per year signaled that the majority were not prolific authors and inventors. Moreover, we observed a substantial difference in productivity among the scientists. To account for the difference in productivity between disciplines, two scientists in the field of biomedicine who frequently collaborated, Boyer and Cohen, were taken as examples. Cohen produced a much larger number of articles (423 vs. 85) and patents (29 vs. 3) than Boyer produced.

Table 3. Ratios of articles to patents.

Name A/P A/P (F) Time (A/P) Ave Pro (A/P) Ave C (A/P) Ave HC (A/P) H index (A/P)
Achenbach -- -- -- -- -- -- --
(377/0) (184.3/0) (56/0) (6.73/0) (22.9/0) (10.2/0) (51/0)
Boyer 28.33 16.67 9.00 3.15 0.93 4.89 12.00
(85/3) (25/1.5) (27/3) (3.15/1) (152.9/164.3) (81.6/16.7) (36/2)
Lauterbur 24.00 32.50 13.33 1.92 6.73 33.97 10.50
(100/4) (45.5/1.4) (40/3) (2.5/1.3) (87.5/13) (42.9/1.3) (42/4)
Cohen 14.59 11.80 2.71 5.36 3.04 8.43 6.87
(423/29) (146.3/12.4) (57/21) (7.4/1.4) (101.7/33.5) (68.5/8.1) (103/15)
Djerassi 9.78 8.07 2.71 3.61 11.22 40.44 9.56
(998/102) (337.4/41.8) (57/21) (17.5/4.9) (39.27/3.5) (19.6/0.5) (86/9)
Holonyak 8.80 4.46 1.90 4.64 0.89 2.45 1.96
(537/61) (125.2/28.1) (59/31) (9.1/1.9) (26.3/29.7) (9.1/3.7) (53/27)
Langer 2.44 2.08 1.19 2.06 1.49 8.45 1.76
(1171/479) (255.1/108.3) (44/37) (26.6/12.9) (123.3/82.7) (297.6/35.2) (194/110)
Edgerton 1.70 1.03 2.20 0.77 0.86 1.00 1.30
(56/33) (31.6/30.8) (33/15) (1.7/2.2) (8.3/9.6) (0.6/0.6) (13/10)
Beckman 1.00 0.48 1.22 0.81 2.36 1.80 1.29
(14/14) (5.4/11.2) (11/9) (1.3/1.6) (22.4/9.5) (0.9/0.5) (9/7)
Noyce 0.35 0.14 1.00 0.36 12.98 10.61 0.50
(6/17) (2.1/14.8) (6/6) (1/2.8) (320.8/24.7) (29.6/2.8) (5/10)
Johnson 0.27 0.10 0.33 0.86 3.09 5.21 0.40
(7/26) (2.5/24.2) (6/18) (1.2/1.4) (26.6/8.6) (2.3/0.4) (4/10)
Kilby 0.03 0.04 0.08 0.45 2.18 0.74 0.05
(2/58) (1.5/37.7) (2/26) (1/2.2) (64.0/29.4) (3.0/4.0) (1/21)

Notes: A refers to articles; P refers to patents; Time refers to the cumulative years of publishing articles / producing patents; Ave Pro refers to the mean number of articles/patents per year; Ave C refers to the mean citation counts per article/patent; Ave HC refers to the mean citations received per year by the most highly cited article/patent.
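For readers who wish to reproduce the ratio columns of Table 3, the following sketch shows the underlying arithmetic for one scientist; the dictionary-based input format and the use of distinct active years for the Time indicator are assumptions for illustration.

```python
def ratio_indicators(articles, patents):
    """articles / patents: lists of dicts with 'year', 'contributors', and
    'citations' for one scientist (hypothetical input format)."""
    def summarize(items):
        count = len(items)
        fractional = sum(1.0 / it['contributors'] for it in items)
        active_years = len({it['year'] for it in items})     # years with at least one output
        avg_citations = sum(it['citations'] for it in items) / count
        return count, fractional, active_years, avg_citations

    a_n, a_f, a_t, a_c = summarize(articles)
    p_n, p_f, p_t, p_c = summarize(patents)
    return {
        'A/P': a_n / p_n,            # full-count productivity ratio
        'A/P (F)': a_f / p_f,        # fractional-count productivity ratio
        'Time (A/P)': a_t / p_t,     # ratio of cumulative active years
        'AveC (A/P)': a_c / p_c,     # ratio of mean citations per article/patent
    }

# Hypothetical usage for one scientist with two articles and one patent
arts = [{'year': 1980, 'contributors': 2, 'citations': 40},
        {'year': 1982, 'contributors': 3, 'citations': 12}]
pats = [{'year': 1985, 'contributors': 2, 'citations': 5}]
print(ratio_indicators(arts, pats))
# {'A/P': 2.0, 'A/P (F)': 1.67, 'Time (A/P)': 2.0, 'AveC (A/P)': 5.2}
```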

Notably, the total numbers of articles and patents for each scientist represent their cumulative productivity over careers that extended beyond the normal retirement age (65 years). We were surprised by the high productivity in both articles and patents of Langer and by the low productivity in articles and patents of Beckman, Noyce, and Johnson. To assess productivity over time, the average productivity per year was examined for each scientist. Five scientists published over five articles per year, and five scientists were granted at least two patents per year. Only two scientists achieved high mean productivity in both articles and patents. Furthermore, the five scientists with a low average-productivity ratio (AvePro [A/P] ≤ 1.00) focused on the production of inventions rather than articles.

Excluding the one scientist without any U.S. patents, eight scientists continued publishing and filing patents for over 30 years. However, the actual cumulative number of years spent publishing articles and being granted patents (Time [A/P]) varied between subjects (Table 3). The Time (A/P) ratios revealed that only three scientists had invested more time in patenting than in publishing, with ratios ranging between 0 and 0.7. This implies that the majority of scientists devoted more time to publishing. However, the displayed ratios may underestimate the time spent on patenting, considering that filing a patent is more difficult than publishing an article because of the differences between the two processes. As expected, the majority of recipients had higher productivity in articles than in patents. Moreover, the findings demonstrated that only one scientist began publishing articles after having been a patent holder for 26 years. Another subject published his first article and was granted his first U.S. patent in the same year.

Fig 2 illustrates the number of articles and patents produced in specific years. Variations in productivity each year demonstrate each subject’s productivity pattern. Moreover, the starting year indicates when individual scientists first became involved in publishing or patenting. The thin vertical line at zero on the horizontal axis refers to the year they received the NMS, and the bold vertical line indicates the year they received the NMTI. If a specific scientist received the NMTI five years after receiving the NMS, the year of the NMTI was labeled 5. Another reference point was set with a vertical line topped with a round dot, which refers to the year when a scientist was 40 years old. Regarding the plotted lines, the solid black line represents the number of articles per year and the dotted red line refers to the annual number of U.S. patents. Fig 2 also exhibits the duration of time invested in publishing and patenting for each scientist. Some scientists started publishing articles before being granted patents or invested more time producing articles and thus obtained a higher number of articles; these article-oriented scientists are Achenbach, Boyer, Holonyak, Lauterbur, and Cohen. Other scientists were patent-oriented and thus spent more time on invention and received a higher number of patents; Kilby and Johnson are such scientists. The remaining scientists fell between being article-oriented and patent-oriented.

Fig 2. Numbers of articles and patents by year for individual recipients.


Academic and technological influence

The mean number of citations was used as an indicator of the influence of articles and patents; the mean number of citations per article or patent is presented in Table 3. The ratio of the mean number of citations per article to the mean number of citations per patent indicates that Noyce (12.98) had the largest relative research influence despite publishing only six articles. Edgerton achieved the greatest relative technological influence, with the lowest ratio of mean citations per article to mean citations per patent (Ave C [A/P]) (0.86). Although Holonyak and Edgerton produced fewer patents than articles, their average technological influence per patent was higher than their average scientific influence per article. The influence of scientists may not be evident in the mean number of citations per article or patent because of the large variation in citation counts across individual articles and patents. Therefore, we further investigated the articles and patents with the highest citation counts. Regarding the average number of citations per year, a larger range of ratios of the most cited article to the most cited patent (AveHC [A/P]) was observed (0.74–40.44) than the range for AveC (A/P) (0.86–12.98).

To estimate the level of influence of the most-cited article for each scientist, we compared its citation count with those of articles in the same subject field published in journals covered by WoS. For instance, for one subject, the article with the highest number of citations was published in a chemistry journal; therefore, we examined all articles published in chemistry journals covered by WoS. In terms of citation counts, the most highly cited article of each of 10 scientists ranked within the top 0.1% of highly cited articles. We considered the top 0.1% of articles with the highest number of citations to be the threshold for highly cited articles because a study by Rodriguez-Navarro (2011) on the citation counts of papers by Nobel Laureates reported that high citation levels are a useful indicator for identifying influential articles. Overall, 10 of the 12 scientists published at least one highly influential article in journals belonging to their fields.
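The top 0.1% threshold can be computed as a simple percentile over the citation distribution of the relevant subject field; the sketch below assumes the field-level citation counts have already been exported from WoS, and the toy distribution is purely illustrative.

```python
import numpy as np

def is_top_tenth_of_percent(article_citations, field_citations):
    """Return True if an article's citation count places it in the top 0.1%
    of the citation distribution of its subject field (field_citations is a
    hypothetical array of citation counts for all WoS-indexed field articles)."""
    threshold = np.percentile(field_citations, 99.9)   # 99.9th percentile = top 0.1% cutoff
    return article_citations >= threshold

# Toy example: a heavily skewed citation distribution for 100,000 field articles
rng = np.random.default_rng(0)
field = rng.negative_binomial(1, 0.05, size=100_000)
print(is_top_tenth_of_percent(500, field))
```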

Fig 3 illustrates the changes in the cumulative number of citations received by all articles and patents per year, as well as the changes in the annual number of citations received by the most highly cited article and patent. The thin black line refers to the average number of citations received by all articles in a specific year, and the bold solid black line represents the average citations received by all patents in a specific year. Because of the large difference in the number of citations received by articles and patents, the citation counts were transformed to a base-10 logarithmic scale to compress the wide range of values. This allowed us to plot the changes in the influence of articles and patents per year in the same subfigure. Regarding the most highly cited article and patent, the dotted blue line reflects the number of citations received per year by the most highly cited article, and the bold dotted red line reflects the number of citations received per year by the most highly cited patent. Because some scientists published only a limited number of articles with a low number of citations, we further searched whether they had published other types of scientific publications with higher citation counts. We determined that two scientists (Achenbach and Edgerton) published books that have received a larger number of citations than their articles, according to the citation records of WoS and Scopus. Variations in the number of citations received per year by the most highly cited book were plotted with a green line. Excluding the scientist (Achenbach) who had not filed any patents, three of the remaining scientists had a greater influence from articles than from patents.
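The base-10 log scaling described above can be reproduced with a few lines of matplotlib; the yearly citation series below are hypothetical placeholders rather than the study's data.

```python
import numpy as np
import matplotlib.pyplot as plt

years = np.arange(1970, 2019)
# Hypothetical yearly citation counts for one scientist's articles and patents
article_cites = np.cumsum(np.random.poisson(40, size=years.size))
patent_cites = np.cumsum(np.random.poisson(3, size=years.size))

fig, ax = plt.subplots()
# The base-10 log transform compresses the wide gap between article and patent counts
ax.plot(years, np.log10(article_cites + 1), 'k-', linewidth=1, label='articles (cumulative)')
ax.plot(years, np.log10(patent_cites + 1), 'k-', linewidth=3, label='patents (cumulative)')
ax.set_xlabel('Year')
ax.set_ylabel('log10(cumulative citations + 1)')
ax.legend()
plt.show()
```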

Fig 3. Changes in the number of citations by year and type.


The h-index combines time, productivity, and influence and was used to reflect the longitudinal research performance of senior scientists. As displayed in Table 3, the ratios of the article h-index to the patent h-index demonstrated that Boyer had the highest ratio (12.00). Higher ratios appear to be related to larger differences between article and patent production. Table 4 presents the correlations between pairs of indicators. Each pair among the h-index (A/P), A/P, A/P(F), and Time (A/P) indicators exhibited a significant and strong correlation (correlation coefficients > 0.8). A medium correlation was observed between the h-index (A/P) and AveHC (A/P), with a correlation coefficient of 0.636. Fig 4 illustrates the three ratios of articles to patents by scientist and the ages at which each scientist began publishing and patenting. Each scientist was assigned an exclusive color to label their scores for the five characteristics, and the 12 scientists’ data are provided in descending order according to the h-index (A/P) value. The exception is gray-blue, which is used for two scientists sharing the same Time (A/P) value. Moreover, one scientist (Achenbach) did not produce patents; therefore, his ratios of articles to patents could not be calculated, and no association among the three ratios and the age of publishing the first article is presented for him in Fig 4. Fig 4 illustrates that the majority of these scientists invested more time in publishing and yielded a higher ratio of h-index (articles) to h-index (patents). Moreover, most scientists started publishing before they started obtaining patents: before the age of 30, ten scientists had published their first scientific papers, whereas only two scientists had obtained their first patents. The differences in the ratios of articles to patents, including A/P, Time (A/P), and h-index (A/P), revealed that the ratio of patenting to publishing was heavily skewed among these 12 scientists.

Table 4. Correlations between pairs of indicators.

h-index (A/P) A/P A/P(F) Time (A/P) AveC (A/P) AveHC (A/P) AvePro (A/P)
H index (A/P) 1 0.926** 0.842** 0.813** 0.172 0.636* 0.542
A/P 1 0.891** 0.889** -0.039 0.397 0.549
A/P(F) 1 0.961** 0.117 0.594 0.361
Time (A/P) 1 0.062 0.497 0.204
AveC (A/P) 1 0.680* -0.124
AveHC (A/P) 1 0.222
AvePro (A/P) 1

Notes:

** p < .01,

* p < .05.
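A correlation table such as Table 4 can be generated with scipy's Pearson correlation; the small data frame below uses a subset of the ratio values reported in Table 3 as a stand-in for the full indicator matrix.

```python
import pandas as pd
from scipy.stats import pearsonr

# Hypothetical indicator matrix: one row per scientist, using a subset of the
# ratio values reported in Table 3 as a stand-in for the full data
data = pd.DataFrame({
    'h-index (A/P)': [12.00, 10.50, 6.87, 9.56, 1.96, 1.76],
    'A/P':           [28.33, 24.00, 14.59, 9.78, 8.80, 2.44],
    'Time (A/P)':    [9.00, 13.33, 2.71, 2.71, 1.90, 1.19],
})

cols = list(data.columns)
for i, a in enumerate(cols):
    for b in cols[i + 1:]:
        r, p = pearsonr(data[a], data[b])            # Pearson r and two-sided p-value
        flag = '**' if p < 0.01 else '*' if p < 0.05 else ''
        print(f'{a} vs {b}: r = {r:.3f}{flag} (p = {p:.3f})')
```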

Fig 4. Differences in three ratios of articles to patents by scientist and the ages at which each scientist began publishing and patenting.


A considerable gap among the 12 scientists in productivity and in the influence of their articles and patents was observed over a long-term period covering the end of their professional careers. Because the h-index is suitable for measuring the productivity of senior scientists and the influence of the articles and patents that they produced, we used the h-index values for articles and patents to represent the differences between scientists in Fig 5 and divided them into three types according to their position in Fig 5. Only one scientist (Langer) was categorized as type A, featuring a high h-index value for both articles and patents. A substantial difference in the h-index values of articles and patents was observed between Langer and the other scientists. Although the other 11 scientists were concentrated on the left side, the three scientists (Cohen, Djerassi, and Holonyak) closest to the center of Fig 5 were classified as type B because they were separated from the other eight scientists in the bottom left corner. The remaining eight scientists were classified as type C. The requirements for separating type-B scientists from type-C scientists were generated according to the positions of the three type-B scientists, who were located in the zone with an h-index score for articles larger than 50% and an h-index score for patents larger than 10%.

Fig 5. Distribution of values of the article and patent h-index for each scientist.


Discussion and conclusions

The 12 subjects investigated in this study are the only scientists to have received both the NMS, which emphasizes contribution to science, and the NMTI, which emphasizes contribution to technology. Receiving both of these prestigious awards is a rare occurrence, and the 12 recipients have proven their extraordinary professional achievements and contributions to science and technology. Therefore, we are not surprised that they received other noteworthy awards before and after the NMS and NMTI as evidence of their peers’ recognition of their professional achievements. Eminent scientists can begin receiving honors early in their careers [52]. The numerous distinguished awards that the 12 recipients earned throughout their professional careers initially raised our expectation of observing high research performance. Although we had no initial estimate of the number of articles and patents they produced or of the number of citations received by their articles and patents, the 12 eminent scientists were expected to have a considerable number of articles and patents and to have influential articles and patents. Given that comparing research performance between scientists in different fields is discouraged, this study focused on the differences in research performance between articles and patents for each scientist. We assessed whether eminent scientists balanced their scientific and technological activities and demonstrated a similar influence in these two activities by using ratio-related indicators to compare article and patent productivity and their citation counts. Moreover, changes in productivity and influence by year were tracked across these scientists’ whole professional careers to obtain more detail regarding their scientific and technological outputs.

Although we observed an effect of career length on the amount of research output and on the citations received by papers and patents, and therefore collected articles and patents spanning the recipients’ professional careers for longitudinal analysis, the findings of this paper revealed that the 12 recipients with substantial contributions to science and technology varied considerably in productivity and influence. Although age had no strong association with their research performance [53], not all of these extraordinary scientists had high research performance; some were neither prolific nor influential. This is consistent with the finding of Chang et al. [13], who revealed that the research performance of 50 biological scientists whose contributions to science were recognized by the NMS and a fellowship from the American Academy of Arts and Sciences varied at the individual level. No clear relationship was observed between research performance and scientific contribution.

Although the diverse research performance of excellent scientists identified by previous studies may parallel our findings, we could not calculate the ratio of articles to patents for all of the scientists. Considering the difference in requirements between publishing and patenting, we anticipated that the ratios of articles to patents would be greater than one for all 12 scientists, but we could not speculate on the maximum of these ratios. The ratios of articles to patents for individual scientists ranged from 0.03 to 28.33, indicating that we failed to predict some scientists’ minimum productivity. In particular, unexpected findings include one recipient not having been granted a U.S. patent and some recipients having published a relatively limited number of research articles indexed by the two large interdisciplinary databases, WoS and Scopus. This indicates the possibility that scientists’ technological contributions may not be reflected in patents. This may also be true of the types of professional output that demonstrate scientific contribution. This type of scientist is not uncommon and is not limited to the 12 recipients of the NMS and NMTI. For example, Nikola Tesla contributed to the design of the modern alternating current electricity supply system, but he did not publish scientific papers; nevertheless, he was recognized for his scientific contributions in the fields of electrical and mechanical engineering through his inventions [54].

Eleven recipients had published scholarly articles and held patented inventions. We further determined that the contributions for which they were awarded the NMS and NMTI were associated with their articles and patents. Excluding one subject (Achenbach) who did not hold a patent for an invention, all the subjects are now listed on the website of the National Inventors Hall of Fame. The website of the National Inventors Hall of Fame provided the specific U.S. patent number of their most influential patented inventions and their biographical information, which summarized the technological contributions that were consistent with contributions that led to them receiving the NMTI. Moreover, each scientist earned their NMS and NMTI from the same contribution or similar contributions. This indicates a blurring of the boundary between science and technology contributions. Moreover, for each recipient, some scientific articles and patents were topic related. The cumulative nature of intellectual endeavors means no single paper or patent fully represented their influence and contribution.

Compared with research productivity, research influence has been revealed to have a stronger association with scientific contribution [13]. Although influence is not equivalent to contribution, these concepts are frequently confused because of the association between them. Therefore, in this study, citation-based counts were used to measure scientific and technological influence. The skewness of citation counts led us to expect few influential articles and patents, but we anticipated observing at least one highly cited article and one highly cited patent from each eminent scientist. In particular, the Matthew effect seems to be a natural consequence of the several prestigious awards earned by the 12 scientists; these awards then boost their scientific and technological influence. The productivity and influence by year (Figs 2 and 3) and the years in which the 12 scientists received their awards (Fig 1) indicated that some factors hamper identification of an association between the Matthew effect and the citation counts received by year. One factor is that the ever-increasing number of articles and patents enhances the likelihood of each scientist receiving citations; consequently, we could not determine the positive impact of awards on the increase in citation counts or on the number of new citations received by scientists. Another factor is the decline in citation counts observed after some scientists received several awards. Despite 10 scientists publishing at least one top 0.1% highly cited article, two scientists did not publish such highly cited articles. Therefore, substantial scientific contributions could not be identified from high citation levels alone.

Some awards also emphasize the importance of the research’s impact on practice and society [55]. As scientists are also expected to contribute to society, such as through solving human problems, other types of influence associated with contributions, such as societal effects, have started to be considered [56]. Altmetric scores are an emerging indicator for a type of societal influence other than the research influence, but they cannot be applied to scientists who published the majority of their papers before the emergence of Altmetrics. Measuring or presenting the social effect of individual researchers is still a challenge [55]. Therefore, although the 12 scientists have been recognized for their contribution to science and technology in terms of the NMS and NMTI, their contribution to society cannot be fully reflected in research performance measured by bibliometric indicators.

The various contributions resulting from discoveries cannot simply be measured through bibliometric indicators [57, 58]. Therefore, peer review mechanisms must remain the principal approach to assessing influences and contributions [59]. Furthermore, scientists capture different types of value, meeting their various needs for engagement in knowledge production and dissemination [20]. When the societal contributions made by scientists are noticed, some scientists are willing to engage further with society. However, the reality is that researchers’ choices for knowledge dissemination are primarily affected by monetary incentives focused on the productivity and influence that can be easily measured through bibliometric indicators. Therefore, other systems such as peer review must exist to measure scientists’ contributions and achievements from other viewpoints.

On the basis of the current findings, we posit that scientists’ contributions to technology may not be reflected in inventions (patents). The gap between bibliometric indicators and the peer review system in indicating scientific contribution was confirmed among eminent scientists with contributions to science and technology. To emphasize the differences in professional productivity and in scholarly and technological influence, these scientists were divided into three types according to the h-index values for articles and patents. The type A scientist, Langer, exemplified a scientist with a substantial contribution to science and technology, as evident in his high productivity in articles and patents and the high influence of that output. Moreover, Langer received numerous awards. The large differences in the h-index values for articles and patents between Langer and the other 11 scientists indicate that Langer cannot be considered a typical example of a scientist with exceptional contributions to humanity; he was an outlier compared with the other scientists. In terms of the ratios of h-index (articles) to h-index (patents), Langer (type A), Holonyak (type B), Edgerton (type C), and Beckman (type C) had ratios close to 1, ranging between 1.29 and 1.96. They had a higher degree of balance between publishing and patenting activities. However, only Langer and Holonyak balanced their publishing and patenting activities while demonstrating excellent research and technology performance. Most of the examined individuals were science- or technology-oriented scientists.

The principal limitation of this study was the small number of subjects. However, the scientists investigated in this study had been acknowledged for exceptional contributions to science and technology, with the NMS and NMTI used as proxies for such contributions. Scientists who have made substantial contributions to both science and technology are rare, and we did not identify any other awards that could be used to increase the sample size. Despite the small sample size, this study contributes by identifying scientists with substantial contributions to science and technology and by exploring their research performance through their articles and patents. We also examined the awards and honors that they had obtained to support the rationale for selecting these 12 scientists as subjects. Moreover, a longitudinal analysis reduced the effect of time. The results of this study demonstrate that scientists with substantial contributions may not be productive or influential from the perspective of bibliometric measures. Although studies have focused on the research performance of Nobel laureates and recipients of other awards [60–62], the biggest contribution of this study is the observation that only a few eminent S&T scientists balanced their publishing and patenting activities with high research performance in both. Scientists’ contribution to science is not limited to their research; moreover, scientists’ contribution to technology may not be evident from their inventions.

Data Availability

All relevant data are available from the Zenodo database. DOI: http://doi.org/10.5281/zenodo.5115346 Publication date: July 20, 2021 License (for files): Creative Commons Attribution 4.0 International.

Funding Statement

This work was financially supported by the Center for Research in Econometric Theory and Applications (Grant no. 108L900204) from The Featured Areas Research Center Program within the framework of the Higher Education Sprout Project by the Ministry of Education (MOE) in Taiwan, and by the Ministry of Science and Technology (MOST), Taiwan, under Grant No. MOST 109-2634-F-002-045- and MOST 108-2410-H-002-219-. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

References

1. Verbeek A, Debackere K, Luwel M, Andries P, Zimmermann E, Deleus F. Linking science to technology: using bibliographic references in patents to build linkage schemes. Scientometrics. 2002;54(3): 399–420.
2. Glänzel W, Meyer M. Patents cited in the scientific literature: an exploratory study of 'reverse' citation relations. Scientometrics. 2003;58(2): 415–428.
3. Doi H, Heeren A, Maurage P. Scientific activity is a better predictor of Nobel award chances than dietary habits and economic factors. PLoS ONE. 2014;9(3):e92612. doi: 10.1371/journal.pone.0092612
4. Huang MH, Yang HW, Chen DZ. Increasing science and technology linkage in fuel cells: a cross citation analysis of papers and patents. Journal of Informetrics. 2015;9(2): 237–249.
5. Qi Y, Zhu N, Zhai Y, Ding Y. The mutually beneficial relationship of patents and scientific literature: topic evolution in nanoscience. Scientometrics. 2018;115(2): 893–911.
6. Sun XL, Ding K. Identifying and tracking scientific and technological knowledge memes from citation networks of publications and patents. Scientometrics. 2018;116(3): 1735–1748.
7. Baldini N. University patenting: patterns of faculty motivations. Technology Analysis and Strategic Management. 2011;23(2): 103–121.
8. Meyer M. Are patenting scientists the better scholars? an exploratory comparison of inventor-authors with their non-inventing peers in nano-science and technology. Research Policy. 2006;35: 1646–1662.
9. Magerman T, van Looy B, Debackere K. Does involvement in patenting jeopardize one’s academic footprint? an analysis of patent-paper pairs in biotechnology. Research Policy. 2015;44(9): 1702–1713.
10. Grimm HM, Jaenicke J. Testing the causal relationship between academic patenting and scientific publishing in Germany: crowding-out or reinforcement? Journal of Technology Transfer. 2015;40: 512–535.
11. Bonaccorsi A, Thoma G. Institutional complementarity and inventive performance in nano science and technology. Research Policy. 2007;36(6): 813–831.
12. Klitkou A, Gulbrandsen M. The relationship between academic patenting and scientific publishing in Norway. Scientometrics. 2009;82(1): 93–108.
13. Chang YW, Chen DZ, Huang MH. Discovering types of research performance of scientists with significant contributions. Scientometrics. 2020;124(2): 1529–1552.
14. Aksnes DW, Langfeldt L, Wouters P. Citations, citation indicators, and research quality: an overview of basic concepts and theories. Sage Open. 2019;9(1). doi: 10.1177/2158244019829575
15. Garfield E, Welljams-Dorof A. Of Nobel class: a citation perspective on high impact research authors. Theoretical Medicine. 1992;13(2): 117–135. doi: 10.1007/BF02163625
16. Kademani BS, Kalyane VL, Kumar V, Mohan L. Nobel laureates: their publication productivity, collaboration and authorship status. Scientometrics. 2005;62(2): 261–268.
17. Li J, Yin Y, Fortunato S, Wang D. Scientific elite revisited: patterns of productivity, collaboration, authorship and impact. Journal of the Royal Society of Interface. 2020;17(165):e20200135. doi: 10.1098/rsif.2020.0135
18. Gingras Y, Wallace ML. Why it has become more difficult to predict Nobel Prize winners: a bibliometric analysis of nominees and winners of the chemistry and physics prizes (1901–2007). Scientometrics. 2010;82(2): 401–412.
19. Murray F, Stern S. Do formal intellectual property rights hinder the free flow of scientific knowledge? an empirical test of the anti-commons hypothesis. Journal of Economic Behavior & Organization. 2007;63(4): 648–687.
20. Calderini M, Franzoni C, Vezzulli A. If star scientists do not patent: the effect of productivity, basicness and impact on the decision to patent in the academic world. Research Policy. 2007;36(3): 303–319.
21. Beck S, Mahdad M, Beukel K, Poetz M. The value of scientific knowledge dissemination for scientists-a value capture perspective. Publications. 2019;7(3). doi: 10.3390/publications7030054
22. Chang YW, Yang HW, Huang MH. Interaction between science and technology in the field of fuel cells based on patent paper analysis. Electronic Library. 2017;35(1): 152–166.
23. Nedić O, Dekanski A. Priority criteria in peer review of scientific articles. Scientometrics. 2016;107(1): 15–26.
24. Tartari V, Perkmann M, Salter A. In good company: the influence of peers on industry engagement by academic scientists. Research Policy. 2014;43(7): 1189–1203.
25. Hohberger J. Does it pay to stand on the shoulders of giants? an analysis of the inventions of star inventors in the biotechnology sector. Research Policy. 2016;45(3): 682–698.
26. Trippl M. Scientific mobility and knowledge transfer at the interregional and intraregional level. Regional Studies. 2013;47(10): 1653–1667.
27. Klaić B. The use of scientometric parameters for the evaluation of scientific contributions. Collegium Antropologicum. 1999;23(2): 751–770.
28. Mondal D, Raychoudhury N, Sarkhel JK. Scientific contribution of Professor Mahalanobis: a bio-bibliometric study. Current Science. 2018;115(8): 1470–1476.
29. Aksnes DW. Citation rates and perceptions of scientific contribution. Journal of the American Society for Information Science and Technology. 2006;57(2): 169–185.
30. MacRoberts MH, MacRoberts BR. The mismeasure of science: citation analysis. Journal of the Association for Information Science and Technology. 2018;69(3): 474–482.
31. Baccini A, De Nicolao G. Do they agree? Bibliometric evaluation versus informed peer review in the Italian research assessment exercise. Scientometrics. 2016;108(3): 1651–1671.
32. Martin BR, Irvine J. Assessing basic research: some partial indicators of scientific progress in radio astronomy. Research Policy. 1983;12: 61–90.
33. Dickson Prize in Medicine. [cited 1 July 2021]. http://www.dicksonprize.pitt.edu/nomination/
34. Fields Medal. International Mathematical Union, 2021. [cited 1 July 2021]. https://www.mathunion.org/imu-awards/fields-medal
35. Charles Stark Draper Prize for Engineering. National Academy of Engineering. [cited 1 July 2021]. https://www.nae.edu/20681/DraperPrize
36. So CYK. Citation ranking versus expert judgment in evaluating communication scholars: effects of research specialty size and individual prominence. Scientometrics. 1998;41(3): 325–333.
37. Jirschitzka J, Oeberst A, Göllner R, Cress U. Inter-rater reliability and validity of peer reviews in an interdisciplinary field. Scientometrics. 2017;113(2): 1059–1092.
38. Ortega JL. Are peer-review activities related to reviewer bibliometric performance? a scientometric analysis of Publons. Scientometrics. 2017;112(2): 947–962.
39. Abramo G, D’Angelo CA, Reale E. Peer review versus bibliometrics: which method better predicts the scholarly impact of publications? Scientometrics. 2019;121(1): 537–554.
40. Thelwall M, Haustein S, Larivière V, Sugimoto CR. Do Altmetrics work? Twitter and ten other social web services. PLoS ONE. 2013;8(5):e64841. doi: 10.1371/journal.pone.0064841
41. Pulido CM, Redondo-Sama G, Sordé-Martí T, Flecha R. Social impact in social media: a new method to evaluate the social impact of research. PLoS ONE. 2018;13(8):e0203117. doi: 10.1371/journal.pone.0203117
42. Smith ZL, Chiang AL, Bowman D, Wallace MB. Longitudinal relationship between social media activity and article citations in the journal Gastrointestinal Endoscopy. Gastrointestinal Endoscopy. 2019;90(1): 77–83. doi: 10.1016/j.gie.2019.03.028
43. Mullins CH, Boyd CJ, Corey BL. Examining the correlation between Altmetric score and citations in the general surgery literature. Journal of Surgical Research. 2020;248: 159–164. doi: 10.1016/j.jss.2019.11.008
44. Ortega JL. Relationship between Altmetric and bibliometric indicators across academic social sites: the case of CSIC’s members. Journal of Informetrics. 2015;9(1): 39–49.
45. Haustein S. Grand challenges in altmetrics: heterogeneity, data quality and dependencies. Scientometrics. 2016;108(1): 413–423.
46. Gasparyan AY, Yessirkepov M, Duisenova A, Trukhachev VI, Kostyukova EI, Kitas GD. Researcher and author impact metrics: variety, value, and context. Journal of Korean Medical Science. 2018;33(18):e139. doi: 10.3346/jkms.2018.33.e139
47. Riikonen P, Vihinen M. National research contributions: a case study on Finnish biomedical research. Scientometrics. 2008;77(2): 207–222.
48. Wagner CS, Horlings E, Whetsell TA, Mattsson P, Nordqvist K. Do Nobel Laureates create prize-winning networks? an analysis of collaborative research in physiology or medicine. PLoS ONE. 2015;10(7):e0134164. doi: 10.1371/journal.pone.0134164
49. Ayaz S, Masood N. Comparison of researchers’ impact indices. PLoS ONE. 2020;15(5):e0233765. doi: 10.1371/journal.pone.0233765
50. Chan HF, Gleeson L, Torgler B. Awards before and after the Nobel Prize: a Matthew effect and/or a ticket to one’s own funeral? Research Evaluation. 2014;23(3): 210–220.
51. Harvard Medical School Office of Research. Lists of distinguished awards and prizes. 2020. [cited 1 July 2021]. https://distinguishedawards.hms.harvard.edu/listawards
52. Pickett CL. The increasing importance of fellowships and career development awards in the careers of early-stage biomedical academic researchers. PLoS ONE. 2019;14(10):e0223876. doi: 10.1371/journal.pone.0223876
53. Gingras Y, Larivière V, Macaluso B, Robitaille JP. The effects of aging on researchers’ publication and citation patterns. PLoS ONE. 2008;3(12):e4048. doi: 10.1371/journal.pone.0004048
54. Rajic D. Inventive level as a basis for the assessment of scientific contribution of inventors. FME Transactions. 2019;47(1): 76–82.
55. Wolf B, Lindenthal T, Szerencsits M, Holbrook JB, Heß J. Evaluating research beyond scientific impact: how to include criteria for productive interactions and impact on practice and society. GAIA. 2013;22(2): 104–114.
56. Olsen KA, Malizia A. Counting research ⇒ directing research. The hazard of using simple metrics to evaluate scientific contributions. an EU experience. Journal of Electronic Publishing. 2017;20(1):e3. doi: 10.3998/3336451.0020.102
57. Bornmann L, Wallon G, Ledin A. Does the committee peer review select the best applicants for funding? an investigation of the selection process for two European molecular biology organization programmes. PLoS ONE. 2008;3(10):e3480. doi: 10.1371/journal.pone.0003480
58. Mazloumian A, Eom YH, Helbing D, Lozano S, Fortunato S. How citation boosts promote scientific paradigm shifts and Nobel Prizes. PLoS ONE. 2011;6(5):e18975. doi: 10.1371/journal.pone.0018975
59. Gallo SA, Carpenter AS, Irwin D, McPartland CD, Travis J, Reynders S, et al. The validation of peer review through research impact measures and the implications for funding strategies. PLoS ONE. 2014;9(9):e106474. doi: 10.1371/journal.pone.0106474
60. Rodriguez-Navarro A. Measuring research excellence: number of Nobel Prize achievements versus conventional bibliometric indicators. Journal of Documentation. 2011;67(4): 582–600.
61. Liu X, Yu M, Chen DZ, Huang MH. Tracking research performance before and after receiving the Cheung Kong Scholars award: a case study of recipients in 2005. Research Evaluation. 2018;27(4): 367–379.
62. Ibrahim B. The role of Egyptian State Awards in changing researchers’ performance in the science and technology sector. Research Evaluation. 2020;29(2): 171–190.

Decision Letter 0

Alberto Baccini

18 Jun 2021

PONE-D-21-10702

Do extraordinary science and technology scientists balance their publishing and patenting activities?

PLOS ONE

Dear Dr. Huang,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process. I also recommend following the minor points raised by Reviewer #1.

The two reviewers are critical of your paper and agree on the necessity to better organize the Introduction and the Discussion and Conclusion sections. The figures need to be revised, at least by introducing suitable and explanatory captions. Please clarify the availability status of the data: you declared that they are not available and that "all relevant data are within the manuscript". Consider carefully the data availability policy adopted by PLOS ONE (https://journals.plos.org/plosone/s/data-availability). I also recommend evaluating the minor points highlighted by Reviewer #1.

Please submit your revised manuscript by Jul 24 2021 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:

  • A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labeled 'Response to Reviewers'.

  • A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'.

  • An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: http://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols.

We look forward to receiving your revised manuscript.

Kind regards,

Alberto Baccini, Ph.D.

Academic Editor

PLOS ONE

Journal Requirements:

When submitting your revision, we need you to address these additional requirements.

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at

https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and

https://journals.plos.org/plosone/s/file?id=ba62/PLOSOne_formatting_sample_title_authors_affiliations.pdf

2. Please consider changing the title so as to meet our title format requirement (https://journals.plos.org/plosone/s/submission-guidelines). In particular, the title should be "Specific, descriptive, concise, and comprehensible to readers outside the field" and in this case it is not informative and specific about your study's scope and methodology.

3. Please ensure that you include a title page within your main document. We do appreciate that you have a title page document uploaded as a separate file; however, as per our author guidelines (http://journals.plos.org/plosone/s/submission-guidelines#loc-title-page) we do require this to be part of the manuscript file itself and not uploaded separately.

Could you therefore please include the title page into the beginning of your manuscript file itself, listing all authors and affiliations.

[Note: HTML markup is below. Please do not edit.]

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes

Reviewer #2: No

**********

2. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes

Reviewer #2: Yes

**********

3. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: No

Reviewer #2: No

**********

4. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: No

Reviewer #2: Yes

**********

5. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: This study investigates the scientific and technological performance of the 12 scientists that were awarded both the NMS medal – an American prize recognizing scientific excellence – and the NMTI medal – an American prize given for outstanding technological contribution – in order to better understand the relationship between scientific publishing and patent activity of extraordinary scientists. The analysis of the bibliometric and patenting profiles of the 12 scientists shows that scientific and technological contributions that were awarded with prestigious prizes by the scientific community were not always matched by outstanding bibliometric and patenting performances. In this regard, the study is an interesting contribution to the debate on the capacity of bibliometric measures to capture “research excellence” and possibly to that on peer review vs. bibliometrics.

In my opinion, the study is a valuable contribution that deserves publication. However, I would suggest to the authors some revisions to improve clarity and readability of the paper.

First, I suggest the authors to better specify the focus of the paper and how it is related to the research questions. I was puzzled by the fact that the study mentions many different issues, including the relationship between science, technology, and academia (rows 15-27), the bibliometric and patenting profiles of outstanding scientists (rows 39-40), and if productivity, citation, and patent metrics are reliable indicators of scientific/technological excellence (rows 158-167), In my opinion, the main result of the paper is that these indicators are not reliable, as the outstanding scientists examined display highly diversified citation and patent profiles (rows 475-477, 531-532, and 552-553). I see that the two other topics are somehow related to this result, but I think that a more concise and clear presentation of the real focus of the paper would help the reader to follow the argument.

In particular, I suggest to better organize the Introduction and the Discussion and Conclusion sections, as some paragraphs seem to be scarcely connected with the main argument. For instance, the paragraph on Nobel laureates (rows 28-38) interrupts the presentation of the rationale of the study and it seems to me not relevant for the discussion, at least at this point of the paper. Similarly, the paragraph on the evaluation of societal effects (rows 520-530) seems to me scarcely related to the focus of the study.

The methodology applied is sound and well-tailored to the research questions. Quantitative analyses are well performed, and results support the paper’s claims. However, some figures may be improved:

Fig. 1: In the Lauterbur facet there is a blue line, does it mean something? If this is the case, please add the explanation.

Fig. 4 should be explained more extensively: what does the color code mean, for instance? What is “n” in the last row on the left? I suggest the author at least to expand the caption of this figure.

Fig. 5: it is not clear the rationale for the three types of scientists. Why were these ranges chosen to delimit type-B from type-C scientists? I suggest also to indicate the type by a color to improve the readability of the figure.

The authors cited mostly the appropriate literature, even if some key references that would perfectly fit in their argument are lacking. In particular, the sections on the relationship between citations and research quality should be integrated with a discussion of Aksnes et al. (2019) (DOI: 10.1177/2158244019829575) and similar contributions. Other references are suggested in the attached list of minor comments.

Lastly, I would suggest the authors get editing help from someone with full professional proficiency in English, as the meaning of some phrases and sentences in the manuscript is not always clear (see the list of minor comments attached).

Minor revisions and typos:

Row 19. “Notably” does not seem to me the right word. Maybe “However”?

Rows 28-38: I do not understand why this paragraph on Nobel laureates is placed here. It interrupts the argument. I would remove it.

Row 35: laureates_and

Row 42: “in sample <of> scientists”

Row 59: I am not sure that “assumption” is the right word here. Maybe “hypothesis” is better?

Row 89: Not sure “demonstrated” is the right word here. I suggest the authors to rephrase the entire sentence.

Row 127: “one scientific publication of a patent” should be “one scientific publication or a patent”.

Row 139-140: The meaning of this sentence is not clear. I suggest the authors to rephrase it.

Rows 143-145: I suggest the authors to integrate the discussion by commenting on Aksnes et al. (2019) (10.1177/2158244019829575), that is an in-depth review of the relationship between research quality and citations.

Row 162: I suggest the authors to add reference to Baccini and De Nicolao (2016) (10.1007/s11192-016-1929-y), a relevant study for this debate.

Row 209: I am not sure that the word “intelligence” is right here.

Row 229: “until” should be replaced with “before”, if I get the meaning of the sentence right.

Rows 231-232: It is not very clear how citations were calculated. Is it from the difference between the two?

Rows 287-288: Accumulation of prizes may be explained also by the Matthew Effect. I suggest the authors to briefly mention this alternative explanation.

Row 351: “he had been for a”

Row 483: I would replace “originate from” with “be reflected in” or “result in”.

Rows 485-490: The meaning of this paragraph is not clear to me. I suggest the authors to rephrase it and elaborate a bit on the relationship between inventions and discoveries to clarify the reasoning presented in this paragraph.

Ref n. 1: “National inVocation systems” should be changed in “National innovation systems”

Ref 25 “orreinforcement” should be replaced with “or reinforcement”

Ref. 32 I think it is better to cite e.g., MacRoberts and MacRoberts (2018) (10.1002/asi.23970)

Reviewer #2: The paper summarizes the careers of 12 American scientists who might be considered as "excellent" performers both in science and technology. The authors try to understand what data about these scientists' publication and patenting activities can tell us about the career choices of successful scientists, and specifically about the relation between (academic) science and technology. Despite the small sample, the work is strictly based on a quantitative approach.

While I do appreciate the effort to better understand the sociology of science from a scientometric perspective, I believe the paper suffers from three main problems.

First, the aims of the paper are unclear. The introduction (p. 4) spells out two very narrow and specific research questions (if these 12 people authored the same number of articles and patents, and if they are highly cited), and a very generic one (if their contributions are "demonstrated" by bibliometric indicators). The authors do not attempt to motivate why these questions are relevant. Then, the analysis only answers to the first two questions (see my point 2 concerning the third question), and the conclusions depart from all three questions, stating that the authors "have demonstrated that bibliometrics indicators are not a reliable measure of the value of discoveries in improving humanity" (pp. 23-24).

Second, the paper suffers from confusion in the use of terms and concepts. In some cases, this could be a language problem that the authors could possibly solve by asking a native speaker to proofread the manuscript. This is how I personally (and I myself am not a native speaker) interpret some odd statements such as "The research papers and patents produced ... served as proxies of scientific and technological intelligence" (pp. 9-10) or "The low number of winners indicates that this select group has been responsible for extraordinary contributions" (p. 3).

Other examples could be made, but my point is that quite obviously this is not just a language issue. Throughout the paper the authors use "contribution" and "influence" interchangeably (for example in pages 1, and 2), then clearly express the fact that they are not synonymous, but then e.g. on pag. 11 they write again that "citation counts were employed to represent [both] influence and contribution over time". Then for example on pag. 23 they deny this again ("Influence is frequently confused with contribution. In this study, only citation-based counts were used to measure and present scientific and technological influence"). To add to the confusion, sometimes they further add the notions of "performance" and "research originality" (beside the already recalled "intelligence", which I believe to be just a mistake). Perhaps, if the authors clarified better what they aim to measure or document, the statistical analysis too could be more in-depth and precise (see point below).

Third, the data analysis is rather superficial, and not geared at answering the third (larger) research question. For example, because of variability of citations across a same author's papers, the authors only consider the most highly cited paper for each author. Further, for years before 1970 they use citation data from Web of Science, and after 1970 from Scopus, without any consideration of how this might affect their results. The authors do not distinguish between citations received before and after the official recognition of the award(s) and therefore cannot tell us if impact (what they call "influence") is due to visibility and/or how it is affected by it.

More in general, I do not have precise ideas about how to demonstrate if "contribution" is measured by citation counts, so I am sorry I cannot offer constructive criticisms, but certainly the current analysis does not attempt to answer this question at all. Probably, totally different data would be required (including some measurable evidence of "contribution") and I would personally suggest to the authors to rather change the research question.

Concerning the empirical part, however, my biggest concern is that the authors draw general implications from a very (small and) select sample: both geographically (only US authors) and in terms of impact. They only write (p. 8) that this is an "appropriate" sample for a study of this kind, without much more reflection on this crucial methodological choice.

On the whole, I am sorry to say I believe these three main problems make the manuscript unsuitable for publication.

**********

6. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: Yes: Eugenio Petrovich

Reviewer #2: Yes: Carlo D'Ippoliti

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.

PLoS One. 2021 Nov 4;16(11):e0259453. doi: 10.1371/journal.pone.0259453.r002

Author response to Decision Letter 0


24 Jul 2021

Thank you for the constructive comments on our manuscript. We have revised the paper based on the reviewers’ comments and recommendations. Revisions of the manuscript are highlighted in yellow. In addition, the point-by-point responses to the reviewers’ comments are appended below.

Reviewer #1:

1. First, I suggest the authors to better specify the focus of the paper and how it is related to the research questions. I was puzzled by the fact that the study mentions many different issues, including the relationship between science, technology, and academia (rows 15-27), the bibliometric and patenting profiles of outstanding scientists (rows 39-40), and if productivity, citation, and patent metrics are reliable indicators of scientific/technological excellence (rows 158-167), In my opinion, the main result of the paper is that these indicators are not reliable, as the outstanding scientists examined display highly diversified citation and patent profiles (rows 475-477, 531-532, and 552-553). I see that the two other topics are somehow related to this result, but I think that a more concise and clear presentation of the real focus of the paper would help the reader to follow the argument. In particular, I suggest to better organize the Introduction and the Discussion and Conclusion sections, as some paragraphs seem to be scarcely connected with the main argument. For instance, the paragraph on Nobel laureates (rows 28-38) interrupts the presentation of the rationale of the study and it seems to me not relevant for the discussion, at least at this point of the paper. Similarly, the paragraph on the evaluation of societal effects (rows 520-530) seems to me scarcely related to the focus of the study.

Response:

Thank you for your constructive suggestions and comments. We have reorganized and revised the Introduction and the Discussion and Conclusions sections. In the Introduction section, we have revised the third research question and have deleted disconnected sentences including original lines 15–40 to clarify and strengthen statements and arguments. Regarding the Discussion and Conclusions section, in addition to deleting the irrelevant statements that you mentioned, we have added new references and made considerable revisions to improve and clarify the statements related to the discussion and conclusion.

2. Fig. 1: In the Lauterbur facet there is a blue line, does it mean something? If this is the case, please add the explanation.

Response:

We have added an explanation for the blue vertical line in two scientists’ data to the description of Fig. 1. The blue vertical line indicated the year when the scientist was awarded the Nobel Prize.

3. Fig. 4 should be explained more extensively: what does the color code mean, for instance? What is “n” in the last row on the left? I suggest the author at least to expand the caption of this figure.

Response:

We have revised Fig 4 to enhance the visual effect and reader comprehension. More details on Fig 4 have been added to the corresponding paragraph. In addition, we have expanded the caption of Fig 4.

4. Fig. 5: it is not clear the rationale for the three types of scientists. Why were these ranges chosen to delimit type-B from type-C scientists? I suggest also to indicate the type by a color to improve the readability of the figure.

Response:

We have added statements to clarify the rationale for three types of scientists. We have also revised Fig 5 to improve its visual effect, including the use of colors for scientists of different types.

5. The authors cited mostly the appropriate literature, even if some key references that would perfectly fit in their argument are lacking. In particular, the sections on the relationship between citations and research quality should be integrated with a discussion of Aksnes et al. (2019) (DOI: 10.1177/2158244019829575) and similar contributions. Other references are suggested in the attached list of minor comments.

Response:

We have added the three articles that you suggested and other references to the list of references and incorporated them into the statements and discussions related to citations, research quality, influence, and contributions.

6. Lastly, I would suggest the authors get editing help from someone with full professional proficiency in English, as the meaning of some phrases and sentences in the manuscript is not always clear (see the list of minor comments attached).

Response:

The manuscript has undergone professional editing.

7. Minor revisions and typos:

Row 19. “Notably” does not seem to me the right word. Maybe “However”?

Response:

We have replaced “Notably” with “However.”

Rows 28-38: I do not understand why this paragraph on Nobel laureates is placed here. It interrupts the argument. I would remove it.

Response:

We have deleted this paragraph.

Row 42: “in sample scientists”

Response:

We have deleted the word “sample.”

Row 59: I am not sure that “assumption” is the right word here. Maybe “hypothesis” is better?

Response:

We have replaced “assumption” with “hypothesis.”

Row 89: Not sure “demonstrated” is the right word here. I suggest the authors to rephrase the entire sentence.

Response:

We have revised the third research question.

Row 139-140: The meaning of this sentence is not clear. I suggest the authors to rephrase it.

Response:

We have rephrased this sentence.

Rows 143-145: I suggest the authors to integrate the discussion by commenting on Aksnes et al. (2019) (10.1177/2158244019829575), that is an in-depth review of the relationship between research quality and citations.

Row 162: I suggest the authors to add reference to Baccini and De Nicolao (2016) (10.1007/s11192-016-1929-y), a relevant study for this debate.

Ref. 32 I think it is better to cite e.g., MacRoberts and MacRoberts (2018) (10.1002/asi.23970)

Response:

We have added the three references as suggested and integrated them into the Literature Review and the Discussion and Conclusions sections.

Row 209: I am not sure that the word “intelligence” is right here.

Response:

We have replaced “intelligence” with “output.”

Row 229: “until” should be replaced with “before”, if a get the meaning of the sentence right.

Response:

We have replaced “until” with “before.”

Rows 231-232: It is not very clear how citations were calculated. Is it from the difference between the two?

Response:

We have rephrased this sentence to enhance clarity.

Rows 287-288: Accumulation of prizes may be explained also by the Matthew Effect. I suggest the authors to briefly mention this alternative explanation.

Response:

The relevance of the Matthew effect to our study has been addressed in the Discussion section.

Row 35: laureates_and

Row 127: “one scientific publication of a patent” should be “one scientific publication or a patent”.

Row 351: “he had been for a”

Ref 25 “orreinforcement” should be replaced with “or reinforcement”

Response:

We have examined the whole paper and corrected all typos.

Row 483: I would replace “originate from” with “be reflected in” or “result in”.

Response:

We have replaced “originate from” with “be reflected in.”

Rows 485-490: The meaning of this paragraph is not clear to me. I suggest the authors to rephrase it and elaborate a bit on the relationship between inventions and discoveries to clarify the reasoning presented in this paragraph.

Response:

We have rephrased the relevant text in the Discussion and Conclusions section.

Ref n. 1: “National inVocation systems” should be changed in “National innovation systems”

Response:

We have deleted this reference.

Reviewer #2:

1. First, the aims of the paper are unclear. The introduction (p. 4) spells out two very narrow and specific research questions (if these 12 people authored the same number of articles and patents, and if they are highly cited), and a very generic one (if their contributions are "demonstrated" by bibliometric indicators). The authors do not attempt to motivate why these questions are relevant. Then, the analysis only answers to the first two questions (see my point 2 concerning the third question), and the conclusions depart from all three questions, stating that the authors "have demonstrated that bibliometrics indicators are not a reliable measure of the value of discoveries in improving humanity" (pp. 23-24).

Response:

Thank you for your comments and suggestions. We have rephrased and reorganized the Introduction section to clarify the purpose and research questions of this paper. In particular, we have revised the third research question and strengthened the statements related to the three research questions to enhance their relevance. In addition, we have revised the Discussion and Conclusions section to ensure that the discussion is connected to the research questions.

2. Second, the paper suffers from confusion in the use of terms and concepts. In some cases, this could be a language problem that the authors could possibly solve by asking a native speaker to proofread the manuscript. This is how I personally (and I myself am not a native speaker) interpret some odd statements such as "The research papers and patents produced ... served as proxies of scientific and technological intelligence" (pp. 9-10) or "The low number of winners indicates that this select group has been responsible for extraordinary contributions" (p. 3). Other examples could be made, but my point is that quite obviously this is not just a language issue.

Throughout the paper the authors use "contribution" and "influence" interchangeably (for example in pages 1, and 2), then clearly express the fact that they are not synonymous, but then e.g. on pag. 11 they write again that "citation counts were employed to represent [both] influence and contribution over time". Then for example on pag. 23 they deny this again ("Influence is frequently confused with contribution. In this study, only citation-based counts were used to measure and present scientific and technological influence"). To add to the confusion, sometimes they further add the notions of "performance" and "research originality" (beside the already recalled "intelligence", which I believe to be just a mistake). Perhaps, if the authors clarified better what they aim to measure or document, the statistical analysis too could be more in-depth and precise (see point below).

Response:

We have used an English editing service to improve our writing quality. Regarding the meaning of some concepts, such as influence, contribution, performance, and research originality, we have examined the full paper and revised related statements. In particular, we have strengthened the rationale for and what we analyzed through our research questions.

3. Third, the data analysis is rather superficial, and not geared at answering the third (larger) research question. For example, because of variability of citations across a same author's papers, the authors only consider the most highly cited paper for each author.

Response:

We have revised the third research question and enhanced the connection between the three research questions.

4. Further, for years before 1970 the use citation data from Web of Science, and after 1970 from Scopus, without any consideration about how this might affect their results.

Response:

We have provided further information regarding the citation data collected from Web of Science (WoS) and Scopus. Scopus covers more journals than WoS does, which helped us collect the maximum number of articles published by the 12 scientists. However, Scopus does not provide annual citation counts for the years before 1970. Thus, for articles that were published before 1970 and also received citations before 1970, the total number of citations would be undercounted. To address this problem, we verified whether articles published before 1970 were indexed in WoS and had annual citation counts for the years before 1970. Accordingly, for articles published before 1970, annual citation counts were collected from WoS (for the years before 1970) and from Scopus (from 1970 onward).
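As a minimal sketch of the merging step described in this response, assuming each source has already been reduced to a per-year citation count for a given article (the variable names and example values are hypothetical, not the study’s data):

def merge_annual_citations(wos_counts, scopus_counts, cutoff_year=1970):
    # Combine per-year citation counts: WoS before the cutoff, Scopus from the cutoff onward.
    merged = {}
    for year, count in wos_counts.items():
        if year < cutoff_year:
            merged[year] = count
    for year, count in scopus_counts.items():
        if year >= cutoff_year:
            merged[year] = count
    return merged

# Hypothetical annual counts for one pre-1970 article.
wos = {1965: 2, 1968: 5, 1971: 9}      # only the pre-1970 values are used
scopus = {1970: 4, 1975: 11, 1980: 7}  # used from 1970 onward

annual = merge_annual_citations(wos, scopus)
total_citations = sum(annual.values())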

5. The authors do not distinguish between citations received before and after the official recognition of the award(s) and therefore cannot tell us if impact (what they call "influence") is due to visibility and/or how is it affected by it.

Response:

We have recorded citation counts received by year and illustrated the data in Fig 3. In addition, Fig 1 presents the years when each scientist received awards. With reference to Figs 1 and 3, we can observe the changes in citation counts before and after awards. Statements related to citation counts and awards have been added to the Discussion and Conclusions section.

6. More in general, I do not have precise ideas about how to demonstrate if "contribution" is measured by citation counts, so I am sorry I cannot offer constructive criticisms, but certainly the current analysis does not attempt to answer this question at all. Probably, totally different data would be required (including some measurable evidence of "contribution") and I would personally suggest to the authors to rather change the research question.

Response:

We have expanded and clarified statements related to citation counts, influence, and contribution as well as our research motives, and we have revised the third research question. As mentioned in our responses to questions 1–3, the description of what we measured for each research question has been expanded and clarified.

7. Concerning the empirical part, however, my biggest concern is that the authors draw general implications from a very (small and) select sample: both geographically (only US authors) and in terms of impact. They only write (p. 8) that this is an "appropriate" sample for a study of this kind, without much more reflection on this crucial methodological choice.

Response:

Regarding your biggest concern, we have explained why only 12 subjects were targeted in the study in our Methodology section. To identify scientists with substantial contributions to science and technology, we considered those that had received two specific distinguished awards, with one award emphasizing scientific contribution and the other focusing on technological contribution. We have expanded and clarified our rationale and method used for targeting the 12 scientists.

Attachment

Submitted filename: 04-reviewers-comments_plos-one.docx

Decision Letter 1

Alberto Baccini

7 Sep 2021

PONE-D-21-10702R1

Do extraordinary science and technology scientists balance their publishing and patenting activities?

PLOS ONE

Dear Dr. Huang,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

I think that the paper can be accepted for publication after you have carried out the corrections suggested by the two reviewers. Please consider carefully the minor corrections indicated by both reviewers, and feel free to simplify the Introduction according to the comment of the first reviewer.

Please submit your revised manuscript by Oct 22 2021 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:

  • A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labeled 'Response to Reviewers'.

  • A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'.

  • An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: http://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols.

We look forward to receiving your revised manuscript.

Kind regards,

Alberto Baccini, Ph.D.

Academic Editor

PLOS ONE

Journal Requirements:

1. Please change the title so as to meet our title format requirement (https://journals.plos.org/plosone/s/submission-guidelines). In particular, the title should be "Specific, descriptive, concise, and comprehensible to readers outside the field" and in this case it is not informative and specific about your study's scope, methodology, and findings.

Please review your reference list to ensure that it is complete and correct. If you have cited papers that have been retracted, please include the rationale for doing so in the manuscript text, or remove these references and replace them with relevant current references. Any changes to the reference list should be mentioned in the rebuttal letter that accompanies your revised manuscript. If you need to cite a retracted article, indicate the article’s retracted status in the References list and also include a citation and full reference for the retraction notice.

Additional Editor Comments (if provided):

[Note: HTML markup is below. Please do not edit.]

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation.

Reviewer #1: (No Response)

Reviewer #2: All comments have been addressed

**********

2. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes

Reviewer #2: Yes

**********

3. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes

Reviewer #2: Yes

**********

4. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes

Reviewer #2: (No Response)

**********

5. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes

Reviewer #2: Yes

**********

6. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: The new version of the manuscript is clearer and better organized compared to the first one. The paper’s focus is now clearly explained, and the research questions plainly spelled out.

However, I still have some troubles with the Introduction, which, in its present form, is too long, with paragraphs that are not always mutually well connected. My suggestion to the authors is to shorten it and remove several details that are already provided in the following sections of the papers (e.g., the debate on the reliability of citation counts for measuring influence is better presented in the Literature Review section, the history of the NMS and NMTI can be left to the Methodology section without weighing the Introduction down, and so on). I would suggest the authors to limit the Introduction to the presentation of the rationale and relevance of their research questions, leaving the context and other details to the Literature Review section.

Some repetitions may also be reduced to enhance readability throughout the paper (e.g., the fact that filing patents is more challenging than publishing articles is repeated at least three times in different points of the paper).

Lastly, there are still few typos that should be corrected:

Abstract: row 30: there is a missing “s” after “scientist”.

Row 86: it seems the verb is missing in this sentence.

Row 249: “scientist” should be turned to plural.

Reviewer #2: The authors addressed the most important concerns I and the other referees had raised. There remains a difference in points of views, which should not prevent publication of the manuscript, of course.

A final, minor remark: please clarify and streamline throughout the date(s) at which the data are updated. It seems to me, 2018 (for sample selection) and 2019 (for citations data), but this should be more clearly reported. Please note that on pag. 11 (row 252) you write "as of 2004": what does that mean, is that a typo?

**********

7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: Yes: Eugenio Petrovich

Reviewer #2: Yes: Carlo D'Ippoliti

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.

PLoS One. 2021 Nov 4;16(11):e0259453. doi: 10.1371/journal.pone.0259453.r004

Author response to Decision Letter 1


8 Oct 2021

We appreciate the time and efforts that the reviewers have dedicated to providing valuable feedback on our manuscript. We have revised the paper based on the reviewers’ comments. The revisions are highlighted in yellow in the manuscript. Here is a point-by-point response to the reviewers’ comments and concerns.

Reviewer #1:

Comment 1: The new version of the manuscript is clearer and better organized compared to the first one. The paper’s focus is now clearly explained, and the research questions plainly spelled out. However, I still have some troubles with the Introduction, which, in its present form, is too long, with paragraphs that are not always mutually well connected. My suggestion to the authors is to shorten it and remove several details that are already provided in the following sections of the papers (e.g., the debate on the reliability of citation counts for measuring influence is better presented in the Literature Review section, the history of the NMS and NMTI can be left to the Methodology section without weighing the Introduction down, and so on). I would suggest the authors to limit the Introduction to the presentation of the rationale and relevance of their research questions, leaving the context and other details to the Literature Review section.

Response: We agree with this comment. We have shortened the Introduction by moving some statements to other sections, leaving it clearer and more concise.

Comment 2: Some repetitions may also be reduced to enhance readability throughout the paper (e.g., the fact that filing patents is more challenging than publishing articles is repeated at least three times in different points of the paper).

Response: We have deleted the repetitive statements that you pointed out. We have also examined the whole paper and removed other repetitions to enhance readability. Thank you for your suggestion.

Comment 3: There are still a few typos that should be corrected: Abstract: row 30: there is a missing “s” after “scientist”. Row 86: it seems the verb is missing in this sentence. Row 249: “scientist” should be turned to plural.

Response: We have corrected the typos that you pointed out and have also checked the whole paper for any remaining typos.

Reviewer #2:

Comment 1: The authors addressed the most important concerns I and the other referees had raised. There remains a difference in points of view, which should not prevent publication of the manuscript, of course. A final, minor remark: please clarify and streamline throughout the date(s) at which the data are updated. It seems to me, 2018 (for sample selection) and 2019 (for citation data), but this should be more clearly reported. Please note that on page 11 (row 252) you write "as of 2004": what does that mean, is that a typo?

Response: We have corrected the typo that you identified on page 11; the phrase should read "as of 2019". Thank you for pointing this out. We have also examined all the dates in the paper and confirmed that they are correct.

We have indicated that 2018 is the latest year of the published papers and patents used in the analysis. The bibliographic records of the papers, including the number of citations for each paper, were downloaded on January 29, 2019. Because the cumulative citation count for each paper was updated close to the end of 2018, the number of citations received by each paper closely reflects its influence as of the end of 2018.

For the patents, we have also elaborated on the relevant statements to clarify the dates. The bibliographic records of the patents were collected in February 2019. Because the USPTO database does not provide citation counts for patents, we calculated the number of citations each patent received on the basis of the reference lists of all patents. A patent's reference list does not change after the patent is granted, so 2018 is the latest year for which patent citations are recorded.

Attachment

Submitted filename: 03-Response-to-reviewers-comments_plos-one_r2.docx

Decision Letter 2

Alberto Baccini

20 Oct 2021

Do extraordinary science and technology scientists balance their publishing and patenting activities?

PONE-D-21-10702R2

Dear Dr. Huang,

We’re pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements.

Within one week, you’ll receive an e-mail detailing the required amendments. When these have been addressed, you’ll receive a formal acceptance letter and your manuscript will be scheduled for publication.

An invoice for payment will follow shortly after the formal acceptance. To ensure an efficient process, please log into Editorial Manager at http://www.editorialmanager.com/pone/, click the 'Update My Information' link at the top of the page, and double check that your user information is up-to-date. If you have any billing related questions, please contact our Author Billing department directly at authorbilling@plos.org.

If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they’ll be preparing press materials, please inform our press team as soon as possible -- no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

Kind regards,

Alberto Baccini, Ph.D.

Academic Editor

PLOS ONE

Additional Editor Comments (optional):

Reviewers' comments:

Acceptance letter

Alberto Baccini

25 Oct 2021

PONE-D-21-10702R2

  Do extraordinary science and technology scientists balance their publishing and patenting activities?

Dear Dr. Huang:

I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now with our production department.

If your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

If we can help with anything else, please email us at plosone@plos.org.

Thank you for submitting your work to PLOS ONE and supporting open access.

Kind regards,

PLOS ONE Editorial Office Staff

on behalf of

Prof. Alberto Baccini

Academic Editor

PLOS ONE

Associated Data

    This section collects any data citations, data availability statements, or supplementary materials included in this article.

    Supplementary Materials

    Attachment

    Submitted filename: 04-reviewers-comments_plos-one.docx

    Attachment

    Submitted filename: 03-Response-to-reviewers-comments_plos-one_r2.docx

    Data Availability Statement

    All relevant data are available from the Zenodo database. DOI: http://doi.org/10.5281/zenodo.5115346 Publication date: July 20, 2021 License (for files): Creative Commons Attribution 4.0 International.

