PLOS ONE. 2021 Mar 10;16(3):e0247553. doi: 10.1371/journal.pone.0247553

What cancer research makes the news? A quantitative analysis of online news stories that mention cancer studies

Laura Moorhead 1,*, Melinda Krakow 2, Lauren Maggio 3
Editor: Cindy Sing Bik Ngai
PMCID: PMC7946182  PMID: 33690639

Abstract

Journalists’ health and science reporting aids the public’s direct access to research through the inclusion of hyperlinks leading to original studies in peer-reviewed journals. While this effort supports the US-government mandate that research be made widely available, little is known about what research journalists share with the public. This cross-sectional exploratory study characterises US-government-funded research on cancer that appeared most frequently in news coverage and how that coverage varied by cancer type, disease incidence and mortality rates. The subject of analysis was 11436 research articles (published in 2016) on cancer funded by the US government and 642 news stories mentioning at least one of these articles. Based on Altmetric data, researchers identified articles via PubMed and characterised each based on the news media attention received online. Only 1.88% (n = 213) of the research articles reporting US government-funded cancer research received at least one mention in an online news publication. This contrasts with previous research, which found that 16.8% (n = 1925) of articles received mention by online mass media publications. Of the 13 most common cancers in the US, 12 were the subject of at least one news mention; only urinary and bladder cancer received no mention. Traditional news sources included significantly more mentions of research on common cancers than digital-native news sources. However, a general discrepancy exists between cancers prominent in news sources and those with the highest mortality rate. For instance, lung cancer accounted for the most deaths annually, while melanoma accounted for 56% fewer annual deaths; yet journalists cited research regarding these cancers nearly equally. Additionally, breast cancer received the greatest coverage per estimated annual death, while pancreatic cancer received the least coverage per death. Findings demonstrated a continued misalignment between prevalent cancers and cancers mentioned in online news media. Additionally, cancer control and prevention received less coverage from journalists than other stages of the cancer continuum, highlighting a continued underrepresentation of prevention-focused research. Results revealed a need for further scholarship regarding the role of journalists in research dissemination.

Introduction

The news media are a crucial source of public information that can spur people into action and shape their health beliefs and behaviors [1–6]. Journalists reporting on health and science often include hyperlinks leading to original studies in peer-reviewed journals. These studies are typically funded by the US government, the world’s largest funder of cancer research [7], and come with a mandate that research be made publicly available [8,9].

The relatively new phenomenon of direct public access to research [10,11] is relevant to understanding the characteristics of journal articles that are most covered by journalists [1]. It influences how the public, including government officials, policy-makers, health communications professionals, and physicians [12–14], accesses and understands research about cancer [15–17], the most reported-on disease in US news [18]. Yet, little is known about the alignment between the characteristics of US federally funded research articles and the ways journalists report on them.

Research has generally focused on health news coverage [19–21] in relation to what information people seek [22,23], disease mongering [24–26], scientific uncertainty [26,27] and exaggeration [26–29] or inaccuracy [24] of information. Typically, studies include either a broad mix of mass media (e.g., press releases, newspapers, magazines, radio, television, and the Internet) [30] or a narrow focus on legacy media, typically top-tier traditional print news publications with a national online presence (e.g., Los Angeles Times, New York Times, Washington Post) [31]. Neither media type adequately reflects the growing landscape of digital-native news organizations (e.g., Breitbart News Network, Buzzfeed, Politico, Vox) that publish only online and remain largely outside the attention of researchers [3,27,32,33]. These news publications push the boundaries of conventional journalism [3] and are part of an emerging newscape in which content is shared exponentially on the Internet and across social media platforms. When researchers have considered these news publications, they have typically been framed as secondary to traditional legacy outlets in terms of importance and quality [34–36]. However, there is a need for scholarship both “to capture the diversity” of the field and to recognize the diversity of news and audiences found online.

Audiences for digital-native publications differ from those of legacy media [37,38]. Pew found that 28% of the public typically relied on non-traditional news providers [38]. Notably, younger people increasingly viewed digital-native news rather than legacy media [39]. The further growth of digital-native news has been spurred by the public’s increasing distrust of the news media [40]. Additionally, audiences of digital-native and local publications, in particular, tended to be from communities in which cancer mortality remained disproportionately high [41]. These audiences were often less wealthy and educated [42], as well as older and with a higher proportion of people of color [41].

From a public health perspective, reaching these audiences with understandable and accurate medical science is crucial. There is a need to examine journalistic digital-native news and popular online niche and local news publications independent of other forms of mass media and alongside more traditional news publications. For this paper, “journalistic” media is defined as the activity of professional reporters and editors working for news publications to gather, assess and present news and information to help people become more knowledgeable and able to make better decisions [43]. This stands in contrast to mass media, which also include non-profit and governmental news organizations, news aggregators and press release news wires among other content providers. (As discussed under Methods, this paper relied on Pew Research Center’s 2016 State of the Media report to define a publication as journalistic).

Journalistic news stories are among the most common ways people access health information [44,45]. Through their news choices and online clicks, people largely control how they attain information, including scientific research. Among US adults, 37% say they have used news stories with scientific information to help them make decisions about everyday life [45]. Yet, journalists act as gatekeepers, finding, selecting and framing what health information is mentioned and then amplified in news stories [46]. The role of gatekeeper is double-edged and not free from influence (e.g., journal venue, press releases, authors’ institutional status and scientific impact) [28,46,47]. Journalists can highlight and omit important health research that is relevant to the public. They also make choices that can privilege a type of research. For instance, researchers reported that journalists generally mentioned studies published in the highest impact journals; however, within these publications, journalists opted for studies lower on the hierarchy of research design (i.e., observational studies rather than randomized controlled trials) [31,48]. With the coronavirus pandemic, journalists have been referencing more preprints (i.e., research that has not gone through the peer-review process) [49]. The influence of press releases on journalists and their coverage [48–50] is well established, as is journalists’ desire to select studies that appeal to their audiences based on a mix of factors outside public or consumer health (notably, novelty and timeliness) [51].

Professional journalism practices and norms also influence the inclusion of links to research in news stories, an increasingly common practice [52,53]. For some journalists, links represent valuable additional sourcing [54] and credibility markers [53]; for others, they raise concerns over reader understanding, research literacy, and numeracy, as well as loss of readership and, in turn, advertising revenue to a publication [53]. Scholars attribute the pattern of traditional or legacy news sites directing readers to external links less often than digital-native news sites to these norms and practices, particularly the socialization of journalists within traditional professional journalism [55–62]. In a quantitative content analysis of stories from two prominent Belgian digital-native health news sites, researchers found that journalists linked to research from, on average, 30.9% of published articles, which is consistent with top US publications [62]. However, the frequency of these links varied by publication (33.9% versus 15.6%) [52,60,63–65]. In a content analysis of 270 blog posts and news-site articles (912 links), Coddington [60] found that news sites had the highest percentage of heavily linked posts (i.e., six or more links; 22%), though they were also most likely to have no links at all (36%), suggesting variance in linking practices among journalists [60]. Additionally, Coddington found that professional journalists do not typically link to external sites (9% of links versus 91% for internal links, i.e., those that point to a publication’s own work), a finding aligned with previous research [62]. However, linking to research (classed as reference and fact-based information) was second only to linking to mainstream media content [60]. As Coddington explained, “news sites’ links open the door to a valuable contextual resource for curious readers, but they reinforce a strict perimeter on the realm of acceptable discourse on public issues” [60].

Scientists work to influence journalists as a way to expand science communication to the lay public [66]. They see journalists as a conduit for increasing the visibility of their work and recognize a professional responsibility to respond to journalists [66]. Scholars have suggested that issues between scientists and journalists (e.g., concerns over inaccurate reporting and misinformation, exaggeration, source selection, presentation of preprint research and bias) are gaining attention in an effort to improve the public’s understanding of the scientific process [49,67]. Notably, Peters et al. [68] found that scientists and journalists interact more frequently and with less friction than previously reported. Yet, the researchers [66] also found that scientific communities have continued to regulate much of the news media contact with their members through norms that can be at odds with the influences and goals of public information departments. Scientists have a strategic view toward journalists and the marketing communications teams at both their institutions and the journals in which they publish [66]. For instance, scientists have recognized the value journalists offer in terms of citation advantage for studies mentioned in influential news publications [69,70]. They have also employed blogs and social media strategies for their research and scholarly practice (e.g., discourse, collaboration, recruitment), knowledge translation, dissemination, and education [71]. These strategies have often been visible to journalists and used to attract the attention of science journalists [72,73].

Findings from this study offer stakeholders (public health and communications professionals, as well as health professionals and journalists) a view into how online mentions of peer-reviewed journal articles funded by the US government appear in the news media. This study comes at a crucial time in light of current debates around science, a growing mistrust of the news media and concerns over fake news (i.e., rumors and falsehoods). While scientists are generally trusted as sources of information, their expertise is often met with public skepticism [74]. For instance, half or fewer Americans see science as inclusive of the best available evidence, and they often doubt scientific consensus on issues considered to be generally agreed upon (i.e., effects of vaccines, causes of climate change, effects of eating genetically modified food) [74,75]. The National Science Foundation (NSF) Science and Engineering Indicators found that only about 31% of people in the US have a “clear understanding” of what constitutes a scientific study [75]. Journalists have historically been viewed as sense makers who help the public understand scientific research. However, a report on public trust revealed that 42% of people in the US distrusted the media and 63% of people globally did not know how to tell quality journalism from fake news [76]. Additionally, 59% of people reported that it was becoming more difficult to tell if news had been produced by a reputable news organisation [76]. As such, our findings could facilitate future education (e.g., media and science literacy), dissemination and funding initiatives while describing the broad mix of news media now used by the public.

This study lays the groundwork for future research that explores how online news media could be better incorporated into dissemination processes and knowledge translation strategies. Understanding the professional practices and processes of how cancer research moves from scientists/researchers and marketing/communications professionals to journalists and then to the public is key for developing ways to more directly connect lay people with emerging scientific research in understandable and valuable ways.

Spectrum of online news

Researchers’ reliance on top-tier traditional news media with a national readership may be partly a result of practicality and commercial business models rather than the public’s news and information habits. Databases of journalistic news content such as LexisNexis and ProQuest can exclude popular digital-native news sites (e.g., Breitbart News Network, Buzzfeed, Mic), as well as popular niche sites (e.g., Business Insider, Bustle, Hello Giggles), which are typically considered outside mainstream journalism. These databases are also relatively static and do not capture the fluid nature of the news media landscape; they primarily act as archives of news stories and may not record all updates or changes to a story.

As an alternative to LexisNexis and ProQuest, researchers have come to rely on Google News, an online aggregator of more than 4500 sources [44,77]. However, Google does not publicly release the sources included in Google News, and not all content from news sites, notably those behind paywalls, is accessible through it [77,78]. Thus, transparent and replicable research using Google News can be challenging.

Altmetric LLC’s database of more than 5000 global media sources is another alternative for accessing a broad mix of news sources. These sources are manually curated and are updated in real time through posts mentioning research, collected via Application Programming Interfaces (APIs) and Really Simple Syndication (RSS) feeds, a standardized format that allows online publishers to distribute content. Altmetric is particularly appropriate for scholarship considering the intersection of peer-reviewed research and journalists’ portrayal of that research in online news [79]. Unlike Google News and ProQuest, Altmetric LLC was designed to identify, track and retrieve the contexts (e.g., news articles, Twitter, Facebook) in which research is mentioned [79]. Altmetric gathers news mentions of journal articles, including date and time of publication, from within media stories using unique identifying links to publications (i.e., hyperlinks or publication identifiers, such as Digital Object Identifiers, DOIs). Altmetric also scans the text of media stories and uses natural language processing (NLP) techniques to amass study information (i.e., author names, journal titles, and study timeframes), which constitutes a “mention” [79].
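To make this concrete, the sketch below shows how a single article’s Altmetric record, including its news-mention count, might be retrieved by DOI. This is a minimal illustration rather than code from the present study; the endpoint format and the cited_by_msm_count field reflect Altmetric’s public Details Page API and should be verified against current documentation.

```python
import requests

# Hedged sketch: look up one article in Altmetric's public API by DOI.
# The DOI below (this paper's own) is used only as a placeholder example.
DOI = "10.1371/journal.pone.0247553"

resp = requests.get(f"https://api.altmetric.com/v1/doi/{DOI}", timeout=30)
if resp.status_code == 200:
    record = resp.json()
    # 'cited_by_msm_count' is Altmetric's tally of mass media mentions; a
    # study like this one would still need to filter those mentions down to
    # the specific journalistic outlets of interest.
    print(record.get("title"))
    print("News mentions tracked:", record.get("cited_by_msm_count", 0))
elif resp.status_code == 404:
    print("No Altmetric record (no tracked mentions) for this DOI.")
```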

Altmetric sources include well-known legacy news publications (e.g., Los Angeles Times, New York Times, Washington Post), as well as digital-native publications (e.g., The Daily Beast, Huffington Post, Vox), popular niche sites (e.g., Dailyhunt, Politico, The Verge) and aggregation sites (e.g., EurekAlert, National Interest, Science Daily). Journalistic news media publications compose only a portion of Altmetric’s source list, which also includes blogs, governmental documents, press release services and wire services. Altmetric is not without limitations, and like Google News, it does not publicly release a full list of sources; however, the company regularly shares its database and source list with researchers. Additionally, its list of media sources is not systematic, as source selection is based on manually curated news outlets, with content made available via third-party providers and RSS feeds.

This study characterises and analyses journalistic news stories that mention cancer research. It explores the intertwining hierarchy among an online collection of 86 traditional and digital-native news publications and their role in disseminating cancer research funded by the US government to the public. The study extends the work of Maggio et al. that identified patterns and frequencies of how 200264 scholarly studies about cancer circulated through mass media (e.g., press releases from news wires, non-profit and governmental media organisations, news aggregators and journalistic news organisations) [30]. Maggio et al. found that the frequency of journal articles about specific cancer types was misaligned with US rates of cancer burden (i.e., incidence and mortality) [30]. However, the researchers did not examine the alignment of journal articles specifically with journalistic news coverage, a popular [1214,80] and more direct way the public accesses health information. This study addresses that gap.

Materials and methods

This cross-sectional exploratory study builds on methods established by Maggio et al. [30], as a sub-analysis of the larger study, which examined scientific journal articles about cancer published in 2016 with US government funding sources. The present study 1) identifies, examines and characterizes how the number of mentions of research in online journalistic news sources varies by cancer type and publication; and 2) compares the disease incidence and mortality rates with the amount of research published for each cancer type and with the amount of news media attention each receives.

To facilitate transparency and replication of the methods, the project’s complete data set and data management and analysis files are publicly accessible at https://zenodo.org/record/4075712 (see SPSS subfolder for project data and syntax files).

Inclusion and exclusion criteria

We utilized Maggio et al.’s data set of journal articles published in 2016 to allow comparisons to findings regarding mass media coverage of cancer research [30]. By searching PubMed, the researchers identified content about cancer (e.g., peer-reviewed papers, press releases, news stories, editorials, letters to the editor) via a query using Entrez Programming Utilities [30,81]. Maggio et al. included cancer articles as classified by National Library of Medicine (NLM) indexing. Within the search results, the researchers also identified articles regarding the 13 most frequently diagnosed cancer types in the United States (excluding non-melanoma skin cancers). The National Cancer Institute (NCI) identifies common cancer types as those with 40,000 or more cases reported annually [82]. These cancers, in order of estimated deaths, are lung, colon and rectum, pancreas, breast, liver, prostate, leukaemia, non-Hodgkin’s lymphoma, bladder, kidney, endometrium, melanoma and thyroid.
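As an illustration of this retrieval step, the sketch below runs a PubMed E-utilities (esearch) query for cancer-indexed articles published in 2016. It is a minimal sketch under stated assumptions: the query term is illustrative, does not reproduce Maggio et al.’s exact search string, and omits the US-funding restriction applied in that study.

```python
import requests

# Hedged sketch of a PubMed E-utilities search for cancer articles from 2016.
# The term below is illustrative; it is not the original study's query, and
# the US government funding restriction is handled separately in that work.
EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"
params = {
    "db": "pubmed",
    "term": "neoplasms[MeSH Terms] AND 2016[pdat]",
    "retmax": 10000,          # esearch returns at most 10000 IDs per request
    "retmode": "json",
}
resp = requests.get(EUTILS, params=params, timeout=60)
resp.raise_for_status()
pmids = resp.json()["esearchresult"]["idlist"]
print(f"{len(pmids)} PubMed identifiers returned for the illustrative query")
```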

Each journal article included citation metadata as provided by NLM (e.g., journal title, funding information, medical subject headings, or MeSH). Maggio et al. [30] categorized articles indexed with MeSH terms for a common cancer or respective child term as being about that particular cancer. For instance, articles indexed with “Leukemia, Myeloid” were classed with the parent term “Leukemia.” Each cancer type mentioned in an article counted once toward its parent term. Articles without at least one of the 13 most prevalent cancers were classed as “other.” Using the Altmetric database on 2 March 2018, Maggio et al. [30] searched each document’s PubMed identifier and extracted those that received media attention through “mentions.” Altmetric characterized journal article “mentions” as the presence of either a link to a journal article or a phrase referencing a journal article in one of the 5000 sources included in its database.
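A simplified version of this coding step is sketched below: each article’s MeSH headings are rolled up to the common cancer types, with child terms counting toward their parent term and unmatched articles classed as “other.” The child-to-parent pairs shown are illustrative assumptions, not the full MeSH tree used in the original coding.

```python
# Hedged sketch of the parent-term coding described above. The mapping is a
# small illustrative subset, not the complete set of MeSH terms and children.
COMMON_CANCER_PARENTS = {
    "Breast Neoplasms": "Breast",
    "Lung Neoplasms": "Lung",
    "Melanoma": "Melanoma",
    "Leukemia": "Leukemia",
    "Leukemia, Myeloid": "Leukemia",      # child term counts toward its parent
    "Colorectal Neoplasms": "Colon and rectal",
}

def code_article(mesh_terms):
    """Return the set of common cancer types an article is coded under;
    articles matching none of the common cancers are classed as 'other'."""
    cancers = {COMMON_CANCER_PARENTS[t] for t in mesh_terms
               if t in COMMON_CANCER_PARENTS}
    return cancers or {"other"}

print(code_article(["Leukemia, Myeloid", "Humans"]))   # {'Leukemia'}
print(code_article(["Carcinoma, Hepatocellular"]))     # {'other'}
```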

Codes and definitions

To generate a broad list of top US online journalistic media organizations, we combined two lists from Pew Research Center’s 2016 State of the Media report. One list included the top 50 newspapers by average Sunday circulation and online presence [83]. The second list included the most popular digital-native sites, 36 in total [83]. We combined these lists and then used them to filter out non-journalistic news media sources from Maggio et al.’s [30] full data set, which contained 2805 media organizations, including non-journalistic media organizations (e.g., blogs, public relations and governmental agencies). This yielded a data set with only online journalistic media sources. A journal article was coded as having a journalistic mention if it was cited in a news story published by at least one of the 86 journalistic news organizations included in the combined list from Pew. (For the names of the news media sources, see Pew Research Center [83]). Use of the combined list allowed us to generate results in a reproducible manner that can be re-examined for other years of publication. The combined lists made up 3.1% of the total number of media organizations contained in the Altmetric data set (86/2805). Other journalistic news outlets are part of the Altmetric data set; however, they are not part of this study. Additionally, we examined journalistic news coverage of scientific articles across annual estimated incidence and mortality totals for common types of cancer (i.e., defined by the National Cancer Institute as cancers with an estimated incidence of 40,000 or more cases per year) [84,85].
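The filtering step can be summarized with the short sketch below, which keeps only mentions from outlets on the combined Pew list and flags articles with at least one journalistic mention. File and column names are hypothetical placeholders, not the study’s actual files.

```python
import pandas as pd

# Hedged sketch: restrict Altmetric mention records to the 86 journalistic
# outlets on the combined Pew lists. File and column names are placeholders.
mentions = pd.read_csv("altmetric_mentions.csv")        # one row per mention
pew_outlets = set(pd.read_csv("pew_combined_list.csv")["outlet_name"])

journalistic = mentions[mentions["outlet_name"].isin(pew_outlets)]

# Code each journal article (by PubMed ID) for its number of journalistic
# mentions; presence of a mention means the count is one or greater.
per_article = journalistic.groupby("pmid").size().rename("n_mentions")
print(per_article.head())
```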

Statistical analysis

Journal articles were coded for the presence or absence of at least one Altmetric-defined mention in an online news story from the list of journalistic news outlets described above, as well as the total number of news mentions per article. We calculated descriptive statistics to provide a baseline understanding of the frequency and proportions of journal articles that received news mentions, as well as to identify potential differences in the types of cancers covered by news sources. Subsequently, we examined the characteristics of news organizations (e.g., news type, frequency) that provided coverage of these journal articles.

Two-tailed chi square analyses were conducted to compare coverage of the top 13 cancers across types of online news media sources (i.e., top traditional news sites versus top digital-native sites). An alpha level of .05 was the threshold for statistical significance. Analyses were conducted using Microsoft Excel 365 ProPlus (counts and figure creation) and SPSS version 21 (descriptives and chi-square tests).
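The comparison can be reconstructed from the counts reported in the Results (367 mentions in traditional sources and 275 in digital-native sources, of which 127 and 71, respectively, involved a common cancer type). The sketch below is a minimal reconstruction using scipy rather than SPSS, without the Yates continuity correction.

```python
from scipy.stats import chi2_contingency

# Hedged sketch: 2x2 table reconstructed from counts reported in the Results.
table = [
    [127, 367 - 127],   # traditional sources: common-cancer vs. other mentions
    [71, 275 - 71],     # digital-native sources: common-cancer vs. other mentions
]
chi2, p, dof, _ = chi2_contingency(table, correction=False)
print(f"chi2({dof}) = {chi2:.3f}, p = {p:.3f}")   # approx. chi2(1) = 5.69, p = .017
```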

Results

As noted above, these analyses utilized a predefined data set from a previous study to facilitate direct comparisons with published research [30]. The data set included 11436 articles in PubMed published in 2016 that met predefined search and inclusion criteria, which included a cancer-related MeSH term (i.e., neoplasm) and at least one reported US funding source.

Within the original sample described above, 1.88% (n = 213) of journal articles received at least one online news mention (i.e., instance of online news coverage). Of the 213 articles that received online news mentions, the median number of mentions per article was 1 (mean = 3; range: 1–23, SD = 3.8). Over half (n = 118, 55%) received one mention, 14% (n = 29) received two, and 31% (n = 66) had three or more mentions, for a total of 642 mentions across all 213 journal articles.

News mentions included journal articles from 96 scientific journals. News coverage most frequently cited research from JAMA (11 journal articles cited across 75 news mentions), JAMA Internal Medicine (7 articles, 50 mentions) and New England Journal of Medicine (8 articles, 49 mentions). The distribution of news mentions across journals varied in terms of both the total number of news mentions and the number of articles that received mentions. In other words, multiple news mentions could reflect a single journal article with extensive news coverage or multiple journal articles with lesser coverage per article. Thus, the average number of news mentions per journal article ranged from 1 (e.g., a journal with one journal article receiving one mention) up to 14 (Journal of Psychopharmacology), representing qualitatively distinct patterns of news coverage. Frequency of news mentions and cited journal articles among the top 10 scientific journals is presented in Table 1.

Table 1. Top 10 research journals featuring the most US government-funded cancer research articles with news mentions in 2016.

Rank Journal Journal impact factor Number of news mentions of cancer research articles in 2016 Number of journal articles with 1 or more news mentions Average news mentions per cancer research journal article
1 JAMA 45.540 75 11 6.8
2 JAMA Int Med 18.652 50 7 7.1
3 N Engl J Med 74.699 49 8 6.1
4 Lancet 60.392 47 6 7.8
5 Science 41.845 34 5 6.8
6 Nature 42.778 33 6 5.5
7 Cancer 6.102 32 12 2.7
8 J Psychopharmacol 4.738 28 2 14
9 Proc Nat Acad Sci 9.412 26 14 1.9
10 BMJ 30.223 22 2 11

We conducted an exploratory analysis of a subset of news mentions (n = 236 news stories from all journal articles included in Table 4) to illustrate the content included in Altmetric’s categorization of the story as a “mention.” Briefly, these news stories included both stories focused on summaries of the journal article as well as mentions of the journal article as part of a news story on a broader topic. Of these news stories, 59.5% represented original content written by a journalist at the news outlet, while 40.5% of news stories originated from a wire service such as the Associated Press and United Press International (UPI). Eighty-two percent of news stories mentioned the name of the journal, such as JAMA or BMJ, and 74.9% mentioned the name of at least one co-author of the journal article. Most (69.0%) stories included a hyperlink to the specific journal article or its abstract on the journal’s website.

Table 4. Top journal articles by news mentions.

Rank Journal Journal impact factor Title Number of news mentions
1 Lancet 60.392 Does physical activity attenuate, or even eliminate, the detrimental association of sitting time with mortality? A harmonised meta-analysis of data from more than 1 million men and women. 23*a
2 Nature 42.778 Substantial contribution of extrinsic risk factors to cancer development 18*
2 Science 41.845 The phenotypic legacy of admixture between modern humans and Neanderthals 18*a
2 JAMA 45.54 CDC guidelines for prescribing opioids for chronic pain 18*a
3 BMJ 30.223 Overspending driven by oversized single dose vials of cancer drugs 17
4 N Engl J Med 74.699 Fulminant myocarditis with combination immune checkpoint blockade 16*
5 JAMA 45.54 Comparison of Site of Death, Health Care Utilization, and Hospital Expenditures for Patients Dying With Cancer in 7 Developed Countries 15*
5 J Psychopharmacol 4.738 Psilocybin produces substantial and sustained decreases in depression and anxiety in patients with life-threatening cancer: A randomized double-blind trial. 15*
6 JAMA Int Med 18.652 Association of Specific Dietary Fats With Total and Cause-Specific Mortality. 14*
7 Cancer 5.742 Effects of marital status and economic resources on survival after cancer: A population-based study. 13*a
7 J Psychopharmacol 4.738 Rapid and sustained symptom reduction following psilocybin treatment for anxiety and depression in patients with life-threatening cancer: a randomized controlled trial 13*a
7 N Engl J Med 74.699 Regression of Glioblastoma after Chimeric Antigen Receptor T-Cell Therapy. 13
8 JAMA 45.54 US Spending on Personal Health Care and Public Health, 1996–2013. 12*
9 Science 41.845 Mutational signatures associated with tobacco smoking in human cancer. 11a
10 J Clin Onc 32.956 Financial Insolvency as a Risk Factor for Early Mortality Among Patients With Cancer. 10
10 JAMA 45.54 US County-Level Trends in Mortality Rates for Major Causes of Death, 1980–2014. 10

* The journal issued a press release about the research.

a The journal article also ranked in the top 10 articles by mass media mentions as analyzed by Maggio, et al [30].

All 13 of the most common cancers were included in the sample of journal articles. However, only 12 were the subject of at least one news mention (Table 2); articles addressing urinary and bladder cancer were not included in any identified news stories. Overall, 31% (n = 198) of the 642 news mentions included research addressing at least one of these common types of cancer. The majority of news mentions included research on only one cancer site, such as breast cancer or lung cancer (n = 187). Additionally, eight news mentions addressed two top cancers and three news mentions addressed three or more of these cancers. Frequency of news mentions differed across the common cancers, with breast cancer (n = 54), lung cancer (n = 35) and melanoma (n = 33) being the subject of the most news mentions (Table 2). We also examined how mentions differed across estimated deaths for each common cancer type (Fig 1). Among cancers with at least one mention, these ratios ranged from 760.56 to 8618. With the exception of urinary and bladder cancers, which did not receive any mentions, the lowest death-to-mention ratio was observed for breast cancer (ratio = 760.56, indicating the greatest coverage per estimated death), and the highest ratio was observed for pancreatic cancer (ratio = 8618, indicating the least coverage per estimated death).
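For clarity, the death-to-mention ratios reported here and plotted in Fig 1 are simply the estimated 2017 deaths divided by the 2016 news mention counts in Table 2, as in the brief sketch below.

```python
# Hedged sketch: compute deaths-per-mention ratios from the Table 2 estimates.
table2 = {
    "Breast":     {"deaths_2017": 41070, "news_mentions": 54},
    "Pancreatic": {"deaths_2017": 43090, "news_mentions": 5},
}
for cancer, row in table2.items():
    ratio = row["deaths_2017"] / row["news_mentions"]
    print(f"{cancer}: {ratio:.2f} estimated deaths per news mention")
# Breast: 760.56 (greatest coverage per death); Pancreatic: 8618.00 (least)
```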

Table 2. Common cancer types covered by journal articles in 2016 resulting from US government funds in relation to news and mass media mentions, number of estimated new cases in 2017 and estimated deaths.

Cancer type No. of estimated new cases in 2017 No. of estimated deaths in 2017 No. of journal articlesa n = 11436 (% of total sample) No. of total online news mentions
Breast 255190 41070 1284 (11.2) 54
Lung 222500 155870 630 (5.5) 35
Melanoma 87110 87110 302 (2.6) 33
Colon and rectal 135430 50260 535 (4.7) 28
Prostate 161360 26730 586 (5.1) 23
Leukemia 62130 24500 544 (4.8) 17
Liver 40710 28920 302 (2.6) 8
Pancreatic 53670 43090 309 (2.7) 5
Endometrial 61380 10920 77 (0.7) 4
Kidney 63990 14400 106 (0.9) 4
Non-Hodgkin’s lymphoma 72240 20140 170 (1.9) 3
Thyroid 56870 2010 71 (0.6) 1
Urinary/Bladder 79030 16870 68 (0.6) 0

*Articles may be counted multiple times if they include two or more cancers.

a Totals reported in this column are originally reported in Maggio et al., Table 7 [30].

Fig 1. Ratio of annual estimated deaths in 2017 to number of online news mentions in 2016 for common types of cancer.


We also examined whether coverage of common cancer types differed by news source (traditional versus digital native). There was a significant association between news source type and mention of research on at least one common cancer type (n = 642; χ2 = 5.690, df = 1, p = .017), with traditional news sources including more mentions of research on common cancer types (n = 127) than digital-native news sources (n = 71). Across 642 news mentions, 57.2% (n = 367) appeared in traditional online news sources, and 42.8% (n = 275) appeared in digital-native news sources. In terms of the distribution of news coverage, the 642 online news mentions appeared across 53 online news sources, with an average of 12 mentions per online news source (median = 5 mentions, range = 1–92). In total, 34 (68%) Pew-ranked newspapers mentioned at least one journal article from the sample, and 19 out of 36 sources (52.8%) from the Pew digital-native list mentioned at least one journal article. Five news sources accounted for approximately half of these mentions (see Table 3): Philadelphia Inquirer (n = 92), Business Insider (n = 70), Washington Post (n = 59), Huffington Post (n = 54), and New York Times (n = 50). The most frequently cited journal article, “Does physical activity attenuate, or even eliminate, the detrimental association of sitting time with mortality? A harmonised meta-analysis of data from more than 1 million men and women” (published in Lancet [30,86]), appeared in 23 online news mentions distributed across 16 news sources.

Yet, news sources (see Table 3) varied by audience size and reach. For instance, Philadelphia Inquirer (Philly.com) was the top news source by cancer research mentions, but had only 1.8 million monthly unique visitors (MUVs), making it the lowest-ranked publication in terms of audience (St. Louis Post-Dispatch had 2 million MUVs and Breitbart News Network had 7.2 million MUVs). In contrast, the other top sources by cancer research mentions reached far larger audiences: Huffington Post (68 million MUVs), Business Insider (75 million MUVs), New York Times (88 million MUVs) and Washington Post (90 million MUVs). The online news media landscape, based on the top 10 news sources by cancer research mentions, was divided equally between traditional and digital-native news. This is notable, as there were more traditional than digital-native news sites in this sample (i.e., 50 versus 36, respectively). Of the five traditional news media publications in the top 10, three were local publications. See Table 4 for a list of the top 20 journal articles by number of news mentions.

Table 3. Top 10 news sources by cancer research mentions in 2016.

Rank News source Type of news source Monthly unique visitors, estimated* Cancer research mentions in 2016
1 Philadelphia Inquirer (Philly.com) Traditional local news 1.8 million 92
2 Business Insider Digital-native news 75 million 70
3 Washington Post Traditional national news 90 million 59
4 Huffington Post Digital-native news 68 million 54
5 New York Times Traditional national news 88 million 50
6 Breitbart News Network Digital-native news 7.2 million 39
7 International Business Times Digital-native news 21 million 27
8 St. Louis Post-Dispatch Traditional local news 2 million 21
9 Houston Chronicle Traditional local news 15 million 20
10 Quartz Digital-native news 22 million 15

* Monthly unique visitors represent the number of distinct individuals requesting pages from a website during a given period; a unique visitor may visit a site multiple times within that same time frame, but will only count as one unique visitor [87–89].

Discussion

Journalistic news stories are a primary way the public accesses research [1–6]. In this study, only one to two (1.88) scientific journal articles out of every 100 mentioning US-funded cancer research received any online news attention. This is in contrast to the one out of every six journal articles (also reporting US-government funding) that received mass media attention, more broadly defined [30]. Our findings suggest that a great deal of cancer research goes unreported by both the online news media and the mass media, with journalists, in particular, acting as gatekeepers for getting taxpayer-financed research to the general public. This is consistent with previous research finding that only a small proportion of available journal articles received coverage by the news media [90–92].

Findings revealed that journalists’ coverage of cancer research mapped closely to the common cancer types covered by journal articles resulting from US government funds. However, neither that coverage nor the underlying body of articles aligned well with the cancers known to have the highest actual burden (i.e., incidence or mortality), limiting public access to research on those cancers. This misalignment is notable, as prevention- and detection-focused research continues to be underrepresented in popular online news media.

A strength of this research is that it includes traditional (e.g., Los Angeles Times, New York Times, Washington Post) and local news sites (e.g., Philadelphia Inquirer, St. Louis-Post Dispatch, Houston Chronicle) alongside both digital-native news (e.g., Breitbart News, Buzzfeed, Vox) and niche news sites (e.g., Business Insider, International Business Times, Vice). When considered together, these sites offer a more complete view into the varied journalistic publications that report on cancer research in an evolving news media landscape. Findings highlight the importance of viewing news media penetration both holistically and by news media source.

Our study points to the continued dominance of traditional news organizations in cancer research coverage, but a dominance increasingly encroached upon by digital-native sites. While traditional news publications cited journal articles significantly more often than digital-native and niche publications, the latter were well represented among the top news sources. Five news sources accounted for approximately half of all journal article mentions (Philadelphia Inquirer, Business Insider, Washington Post, Huffington Post, New York Times), with two of these top sources being digital-native news sites (Business Insider and Huffington Post). Of the next five top sources (ranks 6 through 10), three were digital-native news sites.

The increasing popularity of digital-native news sites and their coverage of cancer studies suggest that researchers’ tendency to focus on traditional news media may no longer suffice, particularly with the continued decline of traditional news organizations in terms of audience and financial stability [93]. Future research opportunities include exploring constraints that may affect the citation of research by news organizations. For instance, are digital-native and/or niche news organizations more restricted in their access to peer-reviewed research (e.g., less likely to gain early access to research from press departments of academic publishers that target major national news publications, or less likely to pay costly fees to access academic journals), or are they simply catering to the content desires of their audiences? Additionally, do professional practices at the different publications affect research coverage (e.g., one publication type hires more reporters with advanced science or health degrees)?

The use of Altmetric allowed this study to include local publications, an often overlooked news source for health information. The top-ranked Philadelphia Inquirer stood out for its higher number of cancer research mentions (92 mentions in 2016, compared to the second-ranked Business Insider with 70). While multiple reasons likely contributed to The Philadelphia Inquirer’s high number of stories, a tendency to run licensed or wire stories (i.e., news copy sent out by news agencies to subscribers) may have influenced its rank. This tendency was shared by the Breitbart News Network, which appeared to run almost exclusively licensed stories mentioning cancer research. This is in contrast to the Washington Post and New York Times (ranked third and fifth, respectively), which typically ran articles based on original reporting. These findings lay the groundwork for researchers to further explore the influence of news publications relying on licensed and news wire stories. Such an approach may suggest a consolidation of primary sources and a further culling of cancer research by the news media. Local publications are considered trustworthy and crucial providers of journalism in their communities [41]; however, the breakdown between news stories mentioning cancer research written by local reporters and those written by national and wire-service reporters is unclear. Documenting this breakdown would likely aid in reaching particular communities with cancer research and in clarifying whether local coverage is in fact a local curation of national news, which could suggest a lack of resources (e.g., a lack of access to journal articles or a shortage of health and science reporters at the local level). Additionally, public health and communication professionals and others interested in reaching local communities may benefit from our findings, which showed local publications to have a strong editorial interest in health research. A consideration of the size and characteristics of an audience could raise important questions for dissemination initiatives. For instance, is a mention in the New York Times of equivalent value to a mention in the St. Louis Post-Dispatch? While audience size suggests a difference in value, such a metric may not adequately take into account audience demographics, geography and community health needs.

As the results revealed, journalists and their news publications largely coalesced in terms of what cancer research they presented to the public, with 45% of journal articles receiving two or more mentions (for a total of 642 mentions across all 213 journal articles). Journalists also shared a tendency to report on research involving observational studies rather than RCTs, a finding aligned with previous studies [31,48]. This convergence on what is newsworthy suggests that journalists, through professional practices (e.g., reporting, sourcing) and a shared value of top-tier brands, were largely united in what cancer research they considered relevant or of interest to the public.

Journalists, in particular, could benefit from greater context for their coverage of cancer research (e.g., how their research mentions map onto cancer incidence and mortality rates). Articles addressing urinary and bladder cancer were not included in any identified news stories, despite that cancer being sixth in terms of new cases among the 13 most common cancers included in US-funded journal articles. (Among these cancers, urinary and bladder cancer ranks 10th in terms of estimated deaths.) However, breast cancer and melanoma were more heavily represented in news coverage relative to incidence and mortality rates, consistent with previous research [30]. Although melanoma is not among the top three cancers based on incidence or mortality, it ranked third based on news mentions, displacing prostate cancer. This contrasts with Maggio et al.’s [30] ranking of scientific articles about common cancer types by media mentions, in which breast, colon, lung, and prostate topped the list, followed by melanoma. The number of news mentions for melanoma was on par with that of lung cancer, despite lung cancer being responsible for the most cancer deaths in the US each year. The reasons why melanoma may be mentioned in the news more frequently are unclear. However, melanoma may be discussed more because of seasonal coverage and sun-safety and anti-tanning campaigns. Also, former President Jimmy Carter announced in August 2015 that he had been diagnosed with metastatic melanoma, possibly leading journalists to pursue this cancer type more [94]. The influence of notable people and celebrities on the coverage of cancer by news publications is well reported in the literature [95]. Still, more research is needed into how journalists decide what types of cancer research are included in their reporting. For instance, when considering research for coverage, do journalists take into account cancer morbidity rates, or do they rely more on other aspects (i.e., timeliness, celebrity news hook, journal venue or impact factor, press releases, scientists’ institutional status or influence)?

Journalists most often mentioned research from journals with strong branding and marketing efforts, consistent with previous research [28]. For instance, the JAMA Network dominated the news mentions, with five of the top 20 articles. Of the 20 articles that ranked highest by news mentions, only five were not also associated with a press release, though they may still have benefitted from promotional efforts by a journal. This aligns with Maggio et al.’s [30] finding that journals engaged in outreach and dissemination efforts received the most mass media coverage. It also raises a question: are media professionals at risk of missing important research worthy of coverage because they hew to the notoriety and marketing efforts of certain journals? More research is needed into how marketing affects journalists’ coverage of cancer research and whether they are truly accessing the research (as opposed to working off an abstract or another media mention).

Journalists also appeared to be in agreement over mentioning journal articles that covered novel, sensational or surprising topics. In such cases, the inclusion of cancer research seemed to be of secondary importance. For instance, four of the top 20 journal articles by news mentions had impact factors below 6 (the median impact factor among the top articles was 45.159). The research topics from these four studies (use of LSD, pasta consumption, and marital status effects on survival after cancer) were likely selected by journalists for their novelty rather than their public health importance. Additionally, two journals stand out for their high average of news mentions per journal article (a sign of intense news media attention): Journal of Psychopharmacology and BMJ. The former had two articles that garnered an average of 14 unique news mentions each, while the latter had two with an average of 11 unique news mentions each. The Journal of Psychopharmacology ran the following two articles: “Psilocybin produces substantial and sustained decreases in depression and anxiety in patients with life-threatening cancer: A randomized double-blind trial” [96] and “Rapid and sustained symptom reduction following psilocybin treatment for anxiety and depression in patients with life-threatening cancer: a randomized controlled trial” [97]. Journalists recast the titles of these articles into catchier headlines highlighting the novelty of the research. New York Times published two stories with the headlines “A Dose of a Hallucinogen From a ‘Magic Mushroom’ and Then Lasting Peace” and “LSD to Cure Depression? Not So Fast.” Huffington Post ran “Psychedelic Mushrooms And LSD Are Among The Safest Recreational Drugs, Survey Finds.” The notable reach of these journal articles was likely due to their mention of a hallucinogenic alkaloid found in certain mushrooms, rather than their mention of cancer research. In the case of the top BMJ article, titled “Overspending driven by oversized single dose vials of cancer drugs” [98], journalists mentioned the article in 17 stories. The Boston Globe published “Study: $3B will be wasted on unused portion of cancer drugs” and The Philadelphia Inquirer ran “Overfilled cancer drug packs waste nearly $3 billion a year.” The topic of public health finance and its appeal to taxpayers likely contributed to the journal article’s reach, again, rather than its mention of cancer research.

The top journal articles based on news mentions often aligned with the top journal articles based on mass media mentions [30]. For instance, the top articles by news media and mass media mentions included five of the same journal articles, all of which were promoted with a press release. The overall top article, from the Lancet and about physical activity, garnered 462 mass media mentions but only 23 news mentions. This highlights the winnowing effect of journalists, but also the reach of a journal article that both taps into an almost universal health topic (e.g., physical activity) and offers an accessible title (i.e., Does physical activity attenuate, or even eliminate, the detrimental association of sitting time with mortality?). Two of the other shared articles, also with accessible titles, “CDC Guidelines for Prescribing Opioids for Chronic Pain” and “The phenotypic legacy of admixture between modern humans and Neanderthals,” highlight the value of tapping into a larger national conversation (i.e., the US opioid crisis and the Neanderthal ancestry reports promoted by companies such as 23andMe).

Limitations

The findings of this study must be considered within the context of its limitations. Media mentions do not necessarily equate to actual readers. Nor does this study consider the accuracy and quality of news stories mentioning cancer research. Additionally, Altmetric’s data set does not include all news publications, nor is it vetted externally. While PubMed actively indexes the US government funding for studies, authors are ultimately responsible for reporting funding. Also, the presented method includes only journal articles indexed in MEDLINE. Thus, we may have inadvertently omitted relevant studies.

Conclusions

Our findings map a formative landscape of the dissemination of federally funded cancer research across a spectrum of online news media organizations, including traditional and local news publications, as well as often overlooked digital-native news sources. This study shows that the coverage of cancer research by journalists differs from that generated by other, mass media news producers. Results revealed which news stories are most popular with the public online, as well as which journals and journal articles most typically appear in these stories. Findings highlight a continued misalignment between prevalent cancers and cancers highlighted in online news media, as well as a tendency by journalists to report on cancer research with a particularly surprising, entertaining or sensationalist bent.

This study has implications for funder groups (e.g., NCI, NIH) that might benefit from a replicable method for tracking characteristics from their research portfolio that appear in a broad mix of journalistic online news media. Findings can be used as a benchmark to evaluate future funding initiatives, dissemination efforts and knowledge transmission strategies. Public health communication and public relations professionals might also benefit from an examination of which characteristics of journal articles are most mentioned by journalists, as well as why cancer control and prevention receive less coverage from journalists than other cancer continuum stages.

A focus on journalistic news stories is crucial because it offers a more accurate view of health news penetration; previous research included mass media mentions and may not accurately represent the general public’s cancer news consumption or the changing online newscape. The role of journalist as gatekeeper is well established [46,99]. But less understood, and still needing additional research, is how journalists find, access, understand and use cancer research and how their news media organizations (e.g., traditional versus digital native, local versus national) and their professional practices (e.g., access to research through subscription or free databases, use of abstracts or full text, press briefings arranged by journals, direct contact with scientists) affect those behaviors [46].

As Brownson et al. [71] explained, ineffective dissemination contributes to a gap between the discovery of public health knowledge and its application in both policy development and the daily lives of people. Dissemination efforts may miss opportunities to consider what research reaches journalists and how they, in turn, decide what is worthy of coverage for a particular audience. While the media, collectively, are recognized as a key channel for knowledge dissemination [1–6], online digital-native news publications have received little attention, with researchers more focused on traditional legacy news publications. As such, a strength of this research is its broad mix of online news organizations. As Maggio et al. reported [34], a broad inclusion of news sources is crucial as traditional news media are no longer the only or the primary sources of health information for the public. Future research should consider the intertwining relationship between scientists and journalists, particularly within social media and in consideration of the promotion, coverage and framing of prevalent cancers alongside actual public cancer burden.

Acknowledgments

The authors would like to thank Asura Enkhbayar and Juan Pablo Alperin for their methodological expertise associated with this paper.

Disclaimer

The views expressed in this article are those of the authors and do not necessarily reflect the official policy or position of the Uniformed Services University of the Health Sciences, the Department of Defense, the National Cancer Institute, or the US Government.

Data Availability

To facilitate transparency and replication of our methods, we have made our computer code and the project’s complete data set publicly accessible at https://zenodo.org/record/4448259#.YCQ32NhKg2w.

Funding Statement

The author(s) received no specific funding for this work.

References

1. Stryker JE, Emmons KM, Viswanath K. Uncovering differences across the cancer control continuum: a comparison of ethnic and mainstream cancer newspaper stories. Prev Med. 2007;44: 20–25. doi: 10.1016/j.ypmed.2006.07.012
2. Jensen JD, Moriarty CM, Hurley RJ, Stryker JE. Making sense of cancer news coverage trends: a comparison of three comprehensive content analyses. J Health Commun. 2010;15: 136–151. doi: 10.1080/10810730903528025
3. Fishman J, Ten Have T, Casarett D. Cancer and the media: how does the news report on treatment and outcomes? Arch Intern Med. 2010;170: 515–518. doi: 10.1001/archinternmed.2010.11
4. Trends and Facts on Online News | State of the News Media. In: Pew Research Center’s Journalism Project [Internet]. 2019 [cited 20 May 2020]. Available: https://www.journalism.org/fact-sheet/digital-news/
5. Health News Coverage in the U.S. Media. In: Pew Research Center’s Journalism Project [Internet]. 24 Nov 2008 [cited 20 May 2020]. Available: https://www.journalism.org/2008/11/24/health-news-coverage-in-the-u-s-media/
6. Setting the Agenda. In: Google Books [Internet]. [cited 20 May 2020]. Available: https://books.google.com/books/about/Setting_the_Agenda.html?id=oN2PKXMJYjkC
7. About NCI—Overview and Mission. In: National Cancer Institute [Internet]. 2015 [cited 20 May 2020]. Available: https://www.cancer.gov/about-nci/overview
8. [No title]. [cited 20 May 2020]. Available: http://publicaccess.nih.gov/public_access_policy_implications_2012.pdf
9. NIH to begin enforcing public-access policy in April. PsycEXTRA Dataset. 2008. doi: 10.1037/e458982008-005
10. [No title]. [cited 20 May 2020]. Available: http://publicaccess.nih.gov/public_access_policy_implications_2012.pdf
11. NIH to begin enforcing public-access policy in April. PsycEXTRA Dataset. 2008. doi: 10.1037/e458982008-005
12. Brodie M, Hamel EC, Altman DE, Blendon RJ, Benson JM. Health news and the American public, 1996–2002. J Health Polit Policy Law. 2003;28: 927–950. doi: 10.1215/03616878-28-5-927
13. Slater MD, Long M, Bettinghaus EP, Reineke JB. News coverage of cancer in the United States: a national sample of newspapers, television, and magazines. J Health Commun. 2008;13: 523–537. doi: 10.1080/10810730802279571
14. Chen X, Siu LL. Impact of the media and the internet on oncology: survey of cancer patients and oncologists in Canada. J Clin Oncol. 2001;19: 4291–4297. doi: 10.1200/JCO.2001.19.23.4291
15. Moorhead LL, Holzmeyer C, Maggio LA, Steinberg RM, Willinsky J. In an Age of Open Access to Research Policies: Physician and Public Health NGO Staff Research Use and Policy Awareness. PLOS ONE. 2015: e0129708. doi: 10.1371/journal.pone.0129708
16. Maggio LA, Moorhead LL, Willinsky JM. Qualitative study of physicians’ varied uses of biomedical research in the USA. BMJ Open. 2016: e012846. doi: 10.1136/bmjopen-2016-012846
17. Kealey E, Berkman CS. The relationship between health information sources and mental models of cancer: findings from the 2005 Health Information National Trends Survey. J Health Commun. 2010;15 Suppl 3: 236–251. doi: 10.1080/10810730.2010.522693
18. Sadasivam RS, Kinney RL, Lemon SC, Shimada SL, Allison JJ, Houston TK. Internet health information seeking is a team sport: analysis of the Pew Internet Survey. Int J Med Inform. 2013;82: 193–200. doi: 10.1016/j.ijmedinf.2012.09.008
19. Smith KC, Niederdeppe J, Blake KD, Cappella JN. Advancing cancer control research in an emerging news media environment. J Natl Cancer Inst Monogr. 2013;2013: 175–181. doi: 10.1093/jncimonographs/lgt023
20. Sumner P, Vivian-Griffiths S, Boivin J, Williams A, Venetis CA, Davies A, et al. The association between exaggeration in health related science news and academic press releases: retrospective observational study. BMJ. 2014;349: g7015. doi: 10.1136/bmj.g7015
21. Caburnay CA, Kreuter MW, Cameron G, Luke DA, Cohen EL, McDaniels L, et al. Black newspapers as a tool for cancer education in African American communities. Ethn Dis. 2008;18: 488–495.
22. Schultz SK. How Americans are Getting News and Information in the 21st Century. 2009. doi: 10.21236/ada539880
23. Niederdeppe J, Frosch DL, Hornik RC. Cancer news coverage and information seeking. J Health Commun. 2008;13: 181–199. doi: 10.1080/10810730701854110
24. Brechman JM, Lee C-J, Cappella JN. Distorting Genetic Research About Cancer: From Bench Science to Press Release to Published News. Journal of Communication. 2011: 496–513. doi: 10.1111/j.1460-2466.2011.01550.x
25. Woloshin S, Schwartz LM. Giving legs to restless legs: a case study of how the media helps make people sick. PLoS Med. 2006;3: e170. doi: 10.1371/journal.pmed.0030170
26. Jensen JD. Scientific Uncertainty in News Coverage of Cancer Research: Effects of Hedging on Scientists and Journalists Credibility. Human Communication Research. 2008: 347–369. doi: 10.1111/j.1468-2958.2008.00324.x
27. Kua E, Reder M, Grossel MJ. Science in the News: A Study of Reporting Genomics. Public Understanding of Science. 2004: 309–322. doi: 10.1177/0963662504045539
28. Woloshin S, Schwartz LM, Casella SL, Kennedy AT, Larson RJ. Press Releases by Academic Medical Centers: Not So Academic? Annals of Internal Medicine. 2009: 613. doi: 10.7326/0003-4819-150-9-200905050-00007
29. Schwartz LM, Woloshin S, Baczek L. Media coverage of scientific meetings: too much, too soon? JAMA. 2002;287: 2859–2863. doi: 10.1001/jama.287.21.2859
30. Maggio LA, Ratcliff CL, Krakow M, Moorhead LL, Enkhbayar A, Alperin JP. Making headlines: an analysis of US government-funded cancer research mentioned in online media. BMJ Open. 2019;9. doi: 10.1136/bmjopen-2018-025783
31. Lai WYY, Lane T, Jones A. Sources and coverage of medical news on front pages of US newspapers. PLoS One. 2009;4: e6856. doi: 10.1371/journal.pone.0006856
32. Hurley RJ, Tewksbury D. News Aggregation and Content Differences in Online Cancer News. Journal of Broadcasting & Electronic Media. 2012: 132–149. doi: 10.1080/08838151.2011.648681
  • 33.Lee AM, Chyi HI. The Rise of Online News Aggregators: Consumption and Competition. International Journal on Media Management. 2015. pp. 3–24. 10.1080/14241277.2014.997383 [DOI] [Google Scholar]
  • 34.Bakker P. Aggregation, content farms and Huffinization. Journalism Practice. 2012. pp. 627–637. 10.1080/17512786.2012.667266 [DOI] [Google Scholar]
  • 35.Deuze M, Witschge T. Beyond journalism: Theorizing the transformation of journalism. Journalism: Theory, Practice & Criticism. 2018. pp. 165–181. 10.1177/1464884916688550 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 36.Witschge T, Anderson CW, Domingo D, Hermida A. Dealing with the mess (we made): Unraveling hybridity, normativity, and complexity in journalism studies. Journalism. 2019. pp. 651–659. 10.1177/1464884918760669 [DOI] [Google Scholar]
  • 37.Antunovic D, Parsons P, Cooke TR. “Checking” and googling: Stages of news consumption among young adults. Journalism. 2018. pp. 632–648. 10.1177/1464884916663625 [DOI] [Google Scholar]
  • 38.For Local News, Americans Embrace Digital but Still Want Strong Community Connection. In: Pew Research Center’s Journalism Project [Internet]. 26 Mar 2019 [cited 20 May 2020]. Available: https://www.journalism.org/2019/03/26/for-local-news-americans-embrace-digital-but-still-want-strong-community-connection/.
  • 39.Bakker TP, de Vreese CH. Good News for the Future? Young People, Internet Use, and Political Participation. Communication Research. 2011. pp. 451–470. 10.1177/0093650210381738 [DOI] [Google Scholar]
  • 40.Tsfati Y. Online News Exposure and Trust in the Mainstream Media: Exploring Possible Associations. American Behavioral Scientist. 2010. pp. 22–42. 10.1177/0002764210376309 [DOI] [Google Scholar]
  • 41.Demographics of People Interested in Local News. In: Pew Research Center’s Journalism Project [Internet]. 14 Aug 2019 [cited 24 Nov 2019]. Available: https://www.journalism.org/2019/08/14/older-americans-black-adults-and-americans-with-less-education-more-interested-in-local-news/.
  • 42.[No title]. [cited 20 May 2020]. Available: http://www.newsmediaalliance.org/wp-content/uploads/2015/07/NAA-Facts-Figures-Logic-2015_v3.pdf.
  • 43.What is journalism? Definition and meaning of the craft. In: American Press Institute [Internet]. [cited 20 May 2020]. Available: https://www.americanpressinstitute.org/journalism-essentials/what-is-journalism/.
  • 44.Lloyd L, Kechagias D, Skiena S. Lydia: A System for Large-Scale News Analysis. String Processing and Information Retrieval. 2005. pp. 161–166. 10.1007/11575832_18 [DOI] [Google Scholar]
  • 45.Americans see medicine and health, food- and nutrition-focused articles as helpful in their everyday life decisions. In: Pew Research Center’s Journalism Project [Internet]. 2017 [cited 24 Nov 2019]. Available: https://www.journalism.org/2017/09/20/science-news-and-information-today/pj_2017-09-20_science-and-news_2-04/.
  • 46.Gesualdo N, Weber MS, Yanovitzky I. Journalists as Knowledge Brokers. Journalism Studies. 2019. pp. 1–17. 10.1080/1461670x.2019.1632734 [DOI] [Google Scholar]
  • 47.Singer JB. Transmission creep. Journalism Studies. 2018. pp. 209–226. 10.1080/1461670x.2016.1186498 [DOI] [Google Scholar]
  • 48.Selvaraj S, Borkar DS, Prasad V. Media coverage of medical journals: do the best articles make the news? PLoS One. 2014;9: e85355. 10.1371/journal.pone.0085355 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 49.Fleerackers A, Riedlinger M, Moorhead L, Ahmed R, Alperin JP. Communicating Scientific Uncertainty in an Age of COVID-19: An Investigation into the Use of Preprints by Digital Media Outlets. Health Commun. 2021; 1–13. [DOI] [PubMed] [Google Scholar]
  • 50.Schwartz LM, Woloshin S, Andrews A, Stukel TA. Influence of medical journal press releases on the quality of associated newspaper coverage: retrospective cohort study. BMJ. 2012;344: d8164. 10.1136/bmj.d8164 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 51.Molek-Kozakowska K. Communicating environmental science beyond academia: Stylistic patterns of newsworthiness in popular science journalism. Discourse & Communication. 2017. pp. 69–88. 10.1177/1750481316683294 [DOI] [Google Scholar]
  • 52.Li X. Internet Newspapers: The Making of a Mainstream Medium. Routledge; 2013. [Google Scholar]
  • 53.Stroobant J, Raeymaeckers K. Hypertextuality in net-native health news: A quantitative content analysis of hyperlinks and where they lead to. Journal of Applied Journalism & Media Studies. 2019;8: 367–385. [Google Scholar]
  • 54.Karlsson M, Sjøvaag H. Rethinking Research Methods in an Age of Digital Journalism. Routledge; 2018. [Google Scholar]
  • 55.Cui X, Liu Y. How does online news curate linked sources? A content analysis of three online news media. Journalism. 2017;18: 852–870. [Google Scholar]
  • 56.Coddington M. Building frames link by link: The linking practices of blogs and news sites. International Journal of Communication. 2012;6: 20. [Google Scholar]
  • 57.Cui X, Liu Y. How does online news curate linked sources? A content analysis of three online news media. Journalism. 2017;18: 852–870. [Google Scholar]
  • 58.Stroobant J. Finding the news and mapping the links: a case study of hypertextuality in Dutch-language health news websites. Information, Communication & Society. 2019;22: 2138–2155. [Google Scholar]
  • 59.Stray J. Linking by the numbers: how news organizations are using links (or not). Nieman Journalism Lab. 2010;10. [Google Scholar]
  • 60.Coddington M. Building frames link by link: The linking practices of blogs and news sites. Int J Commun Syst. 2012;6: 20. [Google Scholar]
  • 61.Cui X, Liu Y. How does online news curate linked sources? A content analysis of three online news media. Journalism. 2017;18: 852–870. [Google Scholar]
  • 62.Stray J. Linking by the numbers: how news organizations are using links (or not). Nieman Journalism Lab. 2010;10. [Google Scholar]
  • 63.Larsson AO. Staying in or going out? Assessing the linking practices of Swedish online newspapers. Journalism Practice. 2013;7: 738–754. [Google Scholar]
  • 64.Stroobant J. Finding the news and mapping the links: a case study of hypertextuality in Dutch-language health news websites. Inf Commun Soc. 2019;22: 2138–2155. [Google Scholar]
  • 65.Stroobant J, De Dobbelaer R, Raeymaeckers K. Tracing the Sources. Journalism Practice. 2018. pp. 344–361. 10.1080/17512786.2017.1294027 [DOI] [Google Scholar]
  • 66.Peters HP. Gap between science and media revisited: Scientists as public communicators. Proceedings of the National Academy of Sciences. 2013. pp. 14102–14109. 10.1073/pnas.1212745110 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 67.Liskauskas S, Ribeiro MD, Vasconcelos SM. Changing times for science and the public: Science journalists’ roles for the responsible communication of science. EMBO Rep. 2019;20. 10.15252/embr.201947906 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 68.Peters HP, Brossard D, de Cheveigne S, Dunwoody S, Kallfass M, Miller S, et al. SCIENCE COMMUNICATION: Interactions with the Mass Media. Science. 2008. pp. 204–205. 10.1126/science.1157780 [DOI] [PubMed] [Google Scholar]
  • 69.Phillips DP, Kanter EJ, Bednarczyk B, Tastad PL. Importance of the lay press in the transmission of medical knowledge to the scientific community. N Engl J Med. 1991;325: 1180–1183. 10.1056/NEJM199110173251620 [DOI] [PubMed] [Google Scholar]
  • 70.Dumas-Mallet E, Garenne A, Boraud T, Gonon F. Does newspapers coverage influence the citations count of scientific publications? An analysis of biomedical studies. Scientometrics. 2020. pp. 413–427. 10.1007/s11192-020-03380-1 [DOI] [Google Scholar]
  • 71.Dong JK, Saunders C, Wachira BW, Thoma B, Chan TM. Social media and the modern scientist: a research primer on social media-based research, dissemination, and sharing. African Journal of Emergency Medicine. 2020. 10.1016/j.afjem.2020.04.005 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 72.Collins K, Shiffman D, Rock J. How Are Scientists Using Social Media in the Workplace? PLoS One. 2016;11: e0162680. 10.1371/journal.pone.0162680 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 73.Van Eperen L, Marincola FM. How scientists use social media to communicate their research. J Transl Med. 2011;9: 199. 10.1186/1479-5876-9-199 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 74.Mixed Messages about Public Trust in Science. 4 Dec 2017 [cited 9 Oct 2020]. Available: https://issues.org/real-numbers-mixed-messages-about-public-trust-in-science/.
  • 75.Liskauskas S, Ribeiro MD, Vasconcelos SM. Changing times for science and the public: Science journalists’ roles for the responsible communication of science. EMBO Rep. 2019;20. 10.15252/embr.201947906 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 76.Website. [cited 9 Oct 2020]. Available: https://www.edelman.com/sites/g/files/aatuss191/files/2018-10/2018_Edelman_Trust_Barometer_Global_Report_FEB.pdf.
  • 77.Mahoney LM, Tang T, Ji K, Ulrich-Schad J. The Digital Distribution of Public Health News Surrounding the Human Papillomavirus Vaccination: A Longitudinal Infodemiology Study. JMIR Public Health and Surveillance. 2015. p. e2. 10.2196/publichealth.3310 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 78.Weaver DA, Bimber B. Finding News Stories: A Comparison of Searches Using Lexisnexis and Google News. Journalism & Mass Communication Quarterly. 2008. pp. 515–530. 10.1177/107769900808500303 [DOI] [Google Scholar]
  • 79.Galligan F, Dyas-Correia S. Altmetrics: Rethinking the Way We Measure. Serials Review. 2013. pp. 56–61. 10.1080/00987913.2013.10765486 [DOI] [Google Scholar]
  • 80.Brownson RC, Eyler AA, Harris JK, Moore JB, Tabak RG. Getting the Word Out: New Approaches for Disseminating Public Health Science. J Public Health Manag Pract. 2018;24: 102–111. 10.1097/PHH.0000000000000673 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 81.Sayers E. E-utilities Quick Start. Entrez Programming Utilities Help [Internet]. National Center for Biotechnology Information (US); 2018. [Google Scholar]
  • 82.Common Cancer Types. In: National Cancer Institute [Internet]. 2015 [cited 20 May 2020]. Available: https://www.cancer.gov/types/common-cancers.
  • 83.Methodology: State of the News Media. In: Pew Research Center’s Journalism Project [Internet]. 23 Jul 2019 [cited 24 Nov 2019]. Available: https://www.journalism.org/2019/07/23/state-of-the-news-media-methodology/.
  • 84.Common Cancer Types. 2015 [cited 9 Oct 2020]. Available: https://www.cancer.gov/types/common-cancers.
  • 85.Common Cancer Types. 2015 [cited 9 Oct 2020]. Available: https://www.cancer.gov/types/common-cancers.
  • 86.Ekelund U, Steene-Johannessen J, Brown WJ, Fagerland MW, Owen N, Powell KE, et al. Does physical activity attenuate, or even eliminate, the detrimental association of sitting time with mortality? A harmonised meta-analysis of data from more than 1 million men and women. The Lancet. 2016. pp. 1302–1310. 10.1016/S0140-6736(16)30370-1 [DOI] [PubMed] [Google Scholar]
  • 87.[No title]. [cited 9 Oct 2020]. Available: http://media.philly.com/storage/MediaKit.pdf.
  • 88.Ruiz M. 2018 was a milestone year for Insider Inc. In: Business Insider [Internet]. 5 Feb 2019 [cited 9 Oct 2020]. Available: https://www.businessinsider.com/2018-milestone-year-insider-inc-2019-2.
  • 89.Delaney KJ. Thank you, readers: Quartz is turning five years old. Here’s what comes next. In: Quartz [Internet]. 6 Sep 2017 [cited 9 Oct 2020]. Available: https://qz.com/1069982/thank-you-readers-quartz-is-turning-five-years-old-heres-what-comes-next/.
  • 90.Matthias L, Fleerackers A, Alperin JP. Framing science: How opioid research is presented in online news media. Available: https://osf.io/k6hxn/download.
  • 91.Barnhurst KG. The form of reports on US newspaper internet sites, an update. Journalism Studies. 2010;11: 555–566. [Google Scholar]
  • 92.Steensen S. Online journalism and the promises of new technology: A critical review and look ahead. Journalism studies. 2011;12: 311–327. [Google Scholar]
  • 93.Abramson J. Sustaining quality journalism. Daedalus. 2010. pp. 39–44. 10.1162/daed.2010.139.2.39 [DOI] [Google Scholar]
  • 94.How Should Clinicians Respond When Patients Are Influenced by Celebrities’ Cancer Stories? AMA Journal of Ethics. 2018. pp. E1075–1081. 10.1001/amajethics.2018.1075 [DOI] [PubMed] [Google Scholar]
  • 95.Celebrity Health Narratives and the Public Health. In: Google Books [Internet]. [cited 20 May 2020]. Available: https://books.google.com/books/about/Celebrity_Health_Narratives_and_the_Publ.html?id=igJJCgAAQBAJ.
  • 96.Ross S, Bossis A, Guss J, Agin-Liebes G, Malone T, Cohen B, et al. Rapid and sustained symptom reduction following psilocybin treatment for anxiety and depression in patients with life-threatening cancer: a randomized controlled trial. Journal of Psychopharmacology. 2016. pp. 1165–1180. 10.1177/0269881116675512 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 97.Griffiths RR, Johnson MW, Carducci MA, Umbricht A, Richards WA, Richards BD, et al. Psilocybin produces substantial and sustained decreases in depression and anxiety in patients with life-threatening cancer: A randomized double-blind trial. Journal of Psychopharmacology. 2016. pp. 1181–1197. 10.1177/0269881116675513 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 98.Bach PB, Conti RM, Muller RJ, Schnorr GC, Saltz LB. Overspending driven by oversized single dose vials of cancer drugs. BMJ. 2016. p. i788. 10.1136/bmj.i788 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 99.Schudson M. The sociology of news production. Media, Culture & Society. 1989. pp. 263–282. 10.1177/016344389011003002 [DOI] [Google Scholar]

Decision Letter 0

Cindy Sing Bik Ngai

27 Aug 2020

PONE-D-20-15200

What cancer research makes the news?

A quantitative analysis of online news stories that mention cancer studies

PLOS ONE

Dear Dr. Moorhead,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

Please submit your revised manuscript by Oct 10 2020 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:

  • A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labeled 'Response to Reviewers'.

  • A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'.

  • An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: http://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols

We look forward to receiving your revised manuscript.

Kind regards,

Cindy Sing-bik Ngai

Academic Editor

PLOS ONE

Journal Requirements:

When submitting your revision, we need you to address these additional requirements.

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at

https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and

https://journals.plos.org/plosone/s/file?id=ba62/PLOSOne_formatting_sample_title_authors_affiliations.pdf

2. We noted in your submission details that a portion of your manuscript may have been presented or published elsewhere.

"See related manuscript, published in BMJ Open in 2018, uploaded with this submission. The data set from this previous manuscript was used for this paper; however, the effort does not constitute dual publication, as this paper explores a subset of data not highlighted in the previous study. "

Please clarify whether this publication was peer-reviewed and formally published. If this work was previously peer-reviewed and published, please provide in the cover letter the reason that this work does not constitute dual publication and should be included in the current manuscript.

3. Thank you for stating the following in the Competing Interests section:

"The authors have declared that no competing interests exist."

We note that one or more of the authors are employed by a commercial company: independent researcher.

3.1. Please provide an amended Funding Statement declaring this commercial affiliation, as well as a statement regarding the Role of Funders in your study. If the funding organization did not play a role in the study design, data collection and analysis, decision to publish, or preparation of the manuscript and only provided financial support in the form of authors' salaries and/or research materials, please review your statements relating to the author contributions, and ensure you have specifically and accurately indicated the role(s) that these authors had in your study. You can update author roles in the Author Contributions section of the online submission form.

Please also include the following statement within your amended Funding Statement.

“The funder provided support in the form of salaries for authors [insert relevant initials], but did not have any additional role in the study design, data collection and analysis, decision to publish, or preparation of the manuscript. The specific roles of these authors are articulated in the ‘author contributions’ section.”

If your commercial affiliation did play a role in your study, please state and explain this role within your updated Funding Statement.

3.2. Please also provide an updated Competing Interests Statement declaring this commercial affiliation along with any other relevant declarations relating to employment, consultancy, patents, products in development, or marketed products, etc.  

Within your Competing Interests Statement, please confirm that this commercial affiliation does not alter your adherence to all PLOS ONE policies on sharing data and materials by including the following statement: "This does not alter our adherence to  PLOS ONE policies on sharing data and materials.” (as detailed online in our guide for authors http://journals.plos.org/plosone/s/competing-interests) . If this adherence statement is not accurate and  there are restrictions on sharing of data and/or materials, please state these. Please note that we cannot proceed with consideration of your article until this information has been declared.

Please include both an updated Funding Statement and Competing Interests Statement in your cover letter. We will change the online submission form on your behalf.


Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Partly

Reviewer #2: Partly

Reviewer #3: Yes

**********

2. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes

Reviewer #2: No

Reviewer #3: Yes

**********

3. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes

Reviewer #2: Yes

Reviewer #3: Yes

**********

4. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes

Reviewer #2: Yes

Reviewer #3: Yes

**********

5. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: This paper can be considered a subsection of a previously published paper that presented an overview of Altmetric mentions of US-funded papers on cancer research. In this case, the authors focus on news stories. The paper is pretty straightforward and has no technical complications. While the motivation of the paper is of interest, the authors do not draw much from the science communication literature on scientists’ interest in and motivations for communicating with journalists. There is a vast stream of literature on this which would greatly enrich both the introduction and the discussion.

The analysis the authors make is quite superficial, without delving into motivations or external factors that may affect being mentioned in news media (e.g., journal venue, press releases, authors’ institutional status, authors’ influence), and I would say this is more of an exploratory paper than anything else.

So in general I find it quite poor, as it does not delve much into the richness of the data they have, nor does it go beyond a basic descriptive analysis.

Beyond that, there are two specific sentences the authors write that I do not agree with and that should be modified if the paper is accepted for publication.

- They indicate that part of their novelty lies in journalists' interest in funded cancer research and in using Altmetric.com as a 'better source' than others because it includes online news media. While this may be partly true, the authors ignore in the text two important limitations of this source: 1) the list of news media is quite arbitrary, and the link the authors provided no longer refers to the list of news media from Altmetric.com; this should be updated. 2) News media mentions are identified by hyperlinks to papers, which does not always happen when research is reported in news media. This may especially affect traditional media, which have a lower online presence and may not include hyperlinks to scientific papers, hence the differences in the results.

- On page 4, paragraph 3, the authors state the following: 'Increasingly, researchers utilize Altmetric’s database of more than 2500 global media sources'. There is no evidence of this whatsoever and no references are given.

Reviewer #2: Review of PONE-D-20-15200

This is a mostly descriptive paper about coverage of cancer topics in the media; the topic of this paper is important and timely. The introduction and discussion are interesting. However, the methods and results were underdeveloped. This might be addressed by clarifying some of the definitions and how variables were measured/coded. Perhaps providing a few examples of what was coded as a mention, or adding a list of mentions for a couple of the top mentioned papers would help. I would recommend removing the chi-squared analyses and, instead, creating some visuals that demonstrate the relative differences for incidence/death/mentions by cancer type and media type. Attaching one example of how this might look. I think a set of graphs would be a lot more powerful than the lists of numbers currently included.

Some information about the size and characteristics of the audience of the different outlets could add to the understanding of the reach of the different types of cancer information. Is a mention in the NYT equivalent in audience reach to a mention in the St. Louis Post-Dispatch, for example? These numbers may be tricky to get, but there are likely estimates of audience or market share.

In addition, given the current political climate and description of many of these outlets as fake news and growing public distrust of science, it seems like the inclusion of science in a broad spectrum of media outlets is extremely important. Some discussion of these topics could be useful.

Finally, there was some discussion of how journalists find science to report on and, from the lists shown in the paper, it seems that journal impact factor/visibility is probably a big part of it. Academics and academic institutions have been more visible and active on social media in recent years, which could influence the reporting of science if academics/academic institutions share science this way and “tag” journalists or journalistic outlets. (https://www.sciencedirect.com/science/article/pii/S2211419X2030029X)

Other possible edits:

- First two sentences of last paragraph in the abstract are confusing, reword to clarify.

- The files included are the data and data collection files, but the data management and analysis files are not available at the currently provided link. Including the data is great, but the paper is not reproducible without the statistical code as well. Use of Microsoft Excel can be problematic for reproducibility (see https://www.washingtonpost.com/news/wonk/wp/2016/08/26/an-alarming-number-of-scientific-papers-contain-excel-errors/ and Ziemann M, Eren Y, El-Osta A. Gene name errors are widespread in the scientific literature. Genome Biology. 2016 Dec;17(1):1-3).

- Chi-squared can only find associations, not the direction of association, so this sentence needs to be re-worded or an analysis of the standardized residuals should be included to support the finding: “Traditional news sources included significantly more mentions of research on common cancer types (n = 240) compared to news mentions across digital native news sources (n = 204; X² = 5.690, df = 1, p = .017).” One suggestion for rewording would be, “There was a significant association between news source type and mention of common cancer type research (n = 204; X² = 5.690, df = 1, p = .017) with traditional news sources including more mentions of research on common cancer types (n = 240) compared to news mentions across digital native news sources.” It is a subtle distinction, but important given how chi-squared is computed. Following up with standardized residuals to determine which of the frequencies in the chi-squared were much different from expected would strengthen the results section and perhaps provide the authors and readers with additional insights.

- Including the IQR in addition to the range would be helpful in understanding the data. Or, as suggested above, including the statistical code so that interested readers could examine the distribution of the mentions per online news source.

- The standard deviation being higher than the mean, along with the range being so wide for number of mentions, suggests that this distribution is skewed and the median should be reported instead. It looks like the median is 1, so the mean of 3 is definitely exaggerating the central tendency.

- In table 1 it might be useful to add some sort of mentions/death or death/mention metric; it takes some work as the table currently is formatted to understand that, for example, pancreatic cancer is woefully under-reported given the amount of death (more than breast cancer! I had no idea.) Or, alternatively, a visual that compares the mortality rank and publicity rank or something similar so that this disconnect between incidence/mortality and publicity are more clear. …as a journalist might say, it seems like the authors have buried the lead.

- The column headings on Table 2 are really confusing; please clarify. Also, add a date range for the articles to the title of this table or to the “Total news mentions” column heading.

***Table 1 data to make graph (put in a CSV to use the R code below)***

cancer,incidence,deaths,num.articles,mentions
Breast,255190,41070,1284,54
Lung,222500,155870,630,35
Melanoma,87110,87110,302,33
Colon and rectal,135430,50260,535,28
Prostate,161360,26730,586,23
Leukemia,62130,24500,544,17
Liver,40710,28920,302,8
Pancreatic,53670,43090,309,5
Endometrial,61380,10920,77,4
Kidney,63990,14400,106,4
Non-Hodgkin's,72240,20140,170,3
Thyroid,56870,2010,71,1
Urinary/Bladder,79030,16870,68,0

***R code for graph***

# open data
table1 <- read.csv("table1.csv")

# load tidyverse for graphing
library(package = "tidyverse")

# make graph of mentions & deaths
longer <- table1 %>%
  mutate(deathsInThousands = deaths / 1000) %>%
  pivot_longer(cols = c("deathsInThousands", "mentions"),
               names_to = "metric",
               values_to = "freqNum") %>%
  drop_na() %>%
  mutate(metric = as.factor(metric))

longer %>%
  ggplot(aes(x = reorder(cancer, freqNum), y = freqNum, fill = metric)) +
  geom_col(position = "dodge") +
  coord_flip() +
  theme_minimal() +
  labs(y = "Frequency", x = "Cancer type")

Reviewer #3: Interesting article and approach! Here are some questions or suggestions overall:

Introduction:

1. It wasn't completely clear what "journalistic media" meant. A more concrete definition within the Introduction would be helpful as you dig down into the Methods, perhaps in lieu of the one you provided (e.g., includes both print and online sources). For example, in the paragraph beginning with: "As Maggio et al.’s [41] full data set included a broad collection of news media organizations, we filtered out non-journalistic news media sources from the data set, leaving only online news media sources." What is a non-journalistic media source? What exactly was filtered out? A clearer definition (perhaps with examples) would be helpful.

Methods:

1. In general, a little more detail or clarity about your process with Altmetric would be helpful, particularly for those who have never used the platform before. For example, you write: "The combined lists composed 3.1% of the total Altmetric data set (86/2805)." I'm not sure what the 2805 is referring to or how that number was obtained.

2. Similarly, you describe coding articles for the presence of a mention. As someone who is unfamiliar with Altmetric, was coding an automated process, or was this done manually? If the latter, more information about how this was done would be helpful.

3. Extensive detail is provided regarding how the media-related data were obtained. However, a brief mention of where incidence/mortality data were derived from would be helpful, too, since that’s a major aim of the paper.

4. It's nice to have all coding for this project publicly available!

Results:

1. Minor issue but Table 2 is referenced in text before Table 1.

2. Table 1 was particularly interesting, but little reporting or discussion of it was presented in the text. If part of the goal of this paper is to highlight discrepancies between morbidity/mortality and news coverage, I might highlight some "standouts" in the text. For example, lung cancer is responsible for ~150,000 deaths annually and received 35 online mentions, while melanoma (responsible for half as many deaths) received nearly identical coverage. A greater discussion of these points would serve to support your overall study aim.

Discussion:

1. The overall organization of the Discussion section made it a bit difficult to follow at times. In its current form, it seems to jump around, and it’s difficult to see how the findings of the present study fit with other relevant research. The structure proposed in the BMJ (Docherty & Smith, 1999) may be useful, and I encourage the authors to consider using it in this paper. doi: https://doi.org/10.1136/bmj.318.7193.1224

2. There are portions of the Discussion section that would be better placed within the Results. (Example: First paragraph under the "Growing divide: traditional vs. digital-native news" heading)

3. You mention that certain journal article topics (e.g., drugs or financing) are prioritized; it would be interesting to see this situated within the context of previous research as well. I would assume that this is common practice, but is that something unique to this study, or is it aligned with previous research on the topic?

4. Something that's missing from this section is a discussion of how this work relates to intentional dissemination efforts from researchers. Within the Introduction, you write: "Findings could facilitate future dissemination and funding initiatives... This study lays the groundwork for future research that explores how online news media could be better incorporated into dissemination processes and knowledge translation strategies." You also cite the 2018 Brownson article but don't discuss it or any related articles within this section. More consideration within the Discussion is warranted, as it seems to be a logical "next step" for this type of work.

**********

6. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No

Reviewer #2: Yes: Jenine Harris

Reviewer #3: No

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.

PLoS One. 2021 Mar 10;16(3):e0247553. doi: 10.1371/journal.pone.0247553.r002

Author response to Decision Letter 0


15 Oct 2020

[[Please see the uploaded letter for a better formatted version of the following.]]

Dear editors,

We thank you and the reviewers for your feedback and suggestions regarding our manuscript entitled, "What cancer research makes the news? A quantitative analysis of online news stories that mention cancer studies" (Manuscript PONE-D-20-15200) by Moorhead, Krakow and Maggio. We are pleased to resubmit our manuscript with revisions based on this feedback. In the table below, we provide a description of how we addressed each reviewer comment. These revisions are also noted within the document using track changes.

Editor comments

Author response

Edit location

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at

https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and

https://journals.plos.org/plosone/s/file?id=ba62/PLOSOne_formatting_sample_title_authors_affiliations.pdf

Thank you. We have verified that we have done this.

NA

2. We noted in your submission details that a portion of your manuscript may have been presented or published elsewhere.

"See related manuscript, published in BMJ Open in 2018, uploaded with this submission. The data set from this previous manuscript was used for this paper; however, the effort does not constitute dual publication, as this paper explores a subset of data not highlighted in the previous study. "

Please clarify whether this publication was peer-reviewed and formally published. If this work was previously peer-reviewed and published, please provide in the cover letter the reason that this work does not constitute dual publication and should be included in the current manuscript.

Again, this paper explores a subset of data not examined in the previous study; as such it does not constitute dual publication. The previous paper was peer-reviewed and published in BMJ Open. The citation is as follows:

Maggio, L. A., Ratcliff, C. L., Krakow, M., Moorhead, L. L., Enkhbayar, A., & Alperin, J. P. (2019). Making headlines: an analysis of US government-funded cancer research mentioned in online media. BMJ open, 9(2), e025783.

NA

3.1. Please provide an amended Funding Statement declaring this commercial affiliation, as well as a statement regarding the Role of Funders in your study. If the funding organization did not play a role in the study design, data collection and analysis, decision to publish, or preparation of the manuscript and only provided financial support in the form of authors' salaries and/or research materials, please review your statements relating to the author contributions, and ensure you have specifically and accurately indicated the role(s) that these authors had in your study. You can update author roles in the Author Contributions section of the online submission form.

Please also include the following statement within your amended Funding Statement.

“The funder provided support in the form of salaries for authors [insert relevant initials], but did not have any additional role in the study design, data collection and analysis, decision to publish, or preparation of the manuscript. The specific roles of these authors are articulated in the ‘author contributions’ section.”

If your commercial affiliation did play a role in your study, please state and explain this role within your updated Funding Statement.

3.2. Please also provide an updated Competing Interests Statement declaring this commercial affiliation along with any other relevant declarations relating to employment, consultancy, patents, products in development, or marketed products, etc.

Within your Competing Interests Statement, please confirm that this commercial affiliation does not alter your adherence to all PLOS ONE policies on sharing data and materials by including the following statement: "This does not alter our adherence to PLOS ONE policies on sharing data and materials.” (as detailed online in our guide for authors http://journals.plos.org/plosone/s/competing-interests) . If this adherence statement is not accurate and there are restrictions on sharing of data and/or materials, please state these. Please note that we cannot proceed with consideration of your article until this information has been declared.

Please include both an updated Funding Statement and Competing Interests Statement in your cover letter. We will change the online submission form on your behalf.

We have no funding and/or competing interests to declare. Apologies for any confusion on this front.

NA

Reviewers’ Responses to Questions

1. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Partly

Reviewer #2: Partly

Reviewer #3: Yes

Throughout the manuscript, we have worked to improve the support of our conclusions. These changes include clarifying variable definitions/codes and methodological description and expanding our statistical analyses to address reviewer comments.

Throughout manuscript

2. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes

Reviewer #2: No

Reviewer #3: Yes

We have updated our statistical analyses to address comments raised by Reviewer 2.

Results section, pp. 4–11.

3. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes

Reviewer #2: Yes

Reviewer #3: Yes

To facilitate transparency and replication of the methods, the project’s complete data set and data management and analysis files are publicly accessible at https://doi.org/10.5281/zenodo.1306984.

p. 5

4. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes

Reviewer #2: Yes

Reviewer #3: Yes

NA

NA

Reviewer Comments to the Author

Reviewer #1

This paper can be considered a subsection of a previously published paper that presented an overview of Altmetric mentions of US-funded papers on cancer research. In this case, the authors focus on news stories. The paper is pretty straightforward and has no technical complications. While the motivation of the paper is of interest, the authors do not draw much from the science communication literature on scientists’ interest in and motivations for communicating with journalists. There is a vast stream of literature on this which would greatly enrich both the introduction and the discussion.

The analysis the authors make is quite superficial, without delving into motivations or external factors that may affect being mentioned in news media (e.g., journal venue, press releases, authors’ institutional status, authors’ influence), and I would say this is more of an exploratory paper than anything else.

Thank you for the suggestion. We have incorporated additional information from scientific communication literature regarding scientists communicating with journalists.

We agree that this paper is primarily exploratory, as there is a gap in existing literature about Altmetric mentions specific to US-funded papers on cancer research by journalists. However, with this revision, we have worked to provide greater description of the methods and analyses in order to provide a solid foundation for future research building off of these exploratory findings.

pp. 2–3, 14

Beyond that, there are two specific sentences the authors write that I do not agree with and that should be modified if the paper is accepted for publication.

- They indicate that part of their novelty lies in journalists' interest in funded cancer research and in using Altmetric.com as a 'better source' than others because it includes online news media. While this may be partly true, the authors ignore in the text two important limitations of this source: 1) the list of news media is quite arbitrary, and the link the authors provided no longer refers to the list of news media from Altmetric.com; this should be updated. 2) News media mentions are identified by hyperlinks to papers, which does not always happen when research is reported in news media. This may especially affect traditional media, which have a lower online presence and may not include hyperlinks to scientific papers, hence the differences in the results.

We have addressed the issues with these two specific sentences in our revision. We have revised and expanded the section on Altmetric.com, including how Altmetric uses both hyperlinks and text phrases to identify mentions of scientific papers.

pp. 4–5

- On page 4, paragraph 3, the authors state the following: 'Increasingly, researchers utilize Altmetric’s database of more than 2500 global media sources'. There is no evidence of this whatsoever and no references are given.

This issue has been addressed through the revision of this paragraph. Additional citations have been added, along with six lines regarding the limitations of using Altmetric’s database.

pp. 4–5

Reviewer #2

This is a mostly descriptive paper about coverage of cancer topics in the media; the topic of this paper is important and timely. The introduction and discussion are interesting. However, the methods and results were underdeveloped. This might be addressed by clarifying some of the definitions and how variables were measured/coded. Perhaps providing a few examples of what was coded as a mention, or adding a list of mentions for a couple of the top mentioned papers would help. I would recommend removing the chi-squared analyses and, instead, creating some visuals that demonstrate the relative differences for incidence/death/mentions by cancer type and media type. Attaching one example of how this might look. I think a set of graphs would be a lot more powerful than the lists of numbers currently included.

Thank you for the suggestion to help clarify our definitions and coding methods for the reader as well as illustrate our analyses more clearly. To address this comment, we have first expanded the description of our analytic variables in the methods section, including more description of Altmetric’s definition of a media mention, as well as a definition and reference for common types of cancers.

We have also added Figure 1 to provide a visualization of relative differences in the ratio of estimated deaths per news mention across all common types of cancer. We provide a brief description to accompany this figure on pages 6-7 as well, which now reads:

“We also examined how mentions differed across estimated deaths for each common cancer type (Figure 1). These ratios ranged from 0 to 8618. With the exception of urinary and bladder cancers, which did not receive any mentions, the lowest death-to-mention ratio was observed for breast cancer (ratio = 760.56, indicating greater coverage per estimated death), and the highest ratio was observed for pancreatic cancer (ratio = 8618, indicating the least coverage per estimated death).”

pp. 6–7, 4–12
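A minimal R sketch reproducing the quoted ratio calculation, using only the estimated deaths and news mention counts reported for breast and pancreatic cancer in Table 1; this is an illustration rather than the study code, as the analysis itself was carried out in Excel.

# Illustration only: death-to-mention ratios for the two extremes quoted above.
deaths   <- c(breast = 41070, pancreatic = 43090)
mentions <- c(breast = 54, pancreatic = 5)
round(deaths / mentions, 2)  # breast ~760.56, pancreatic 8618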

Some information about the size and characteristics of the audience of the different outlets could add to the understanding of the reach of the different types of cancer information. Is a mention in the NYT equivalent in audience reach to a mention in the St. Louis Post-Dispatch, for example? These numbers may be tricky to get, but there are likely estimates of audience or market share.

Thank you for this suggestion. We have added a column to Table 3 (p. 9) that incorporates the estimated monthly unique visitors for each publication, as an indicator of audience size and characteristics. Additionally, the Discussion (pp. 12–15) now includes reference to the differences between local and national news, as well as the changing digital media landscape.

pp. 9, 12–15

In addition, given the current political climate and description of many of these outlets as fake news and growing public distrust of science, it seems like the inclusion of science in a broad spectrum of media outlets is extremely important. Some discussion of these topics could be useful.

We appreciate this suggestion and have now incorporated this into both the Introduction (p. 4) and the Discussion (pp. 12–15).

pp. 12–15

Finally, there was some discussion of how journalists find science to report on and, from the lists shown in the paper, it seems that journal impact factor/visibility is probably a big part of it. Academics and academic institutions have been more visible and active on social media in recent years, which could influence the reporting of science if academics/academic institutions share science this way and “tag” journalists or journalistic outlets. (https://www.sciencedirect.com/science/article/pii/S2211419X2030029X)

Thank you for this point. Table 2 (p. 9) includes the impact factor for each journal, and it is considered in the Discussion (p. 14).

pp. 9, 14

Other possible edits:

- First two sentences of last paragraph in the abstract are confusing, reword to clarify.

We have rewritten these two sentences to address the reviewer’s concern.

p. 1

- The files included are the data and data collection files, but the data management and analysis files are not available at the currently provided link. Including the data is great, but the paper is not reproducible without the statistical code as well. Use of Microsoft Excel can be problematic for reproducibility (see https://www.washingtonpost.com/news/wonk/wp/2016/08/26/an-alarming-number-of-scientific-papers-contain-excel-errors/ and Ziemann M, Eren Y, El-Osta A. Gene name errors are widespread in the scientific literature. Genome Biology. 2016 Dec;17(1):1-3).

We have addressed this: https://zenodo.org/record/4075712

NA

- Chi-squared can only find associations, not the direction of association, so this sentence needs to be re-worded or an analysis of the standardized residuals should be included to support the finding: “Traditional news sources included significantly more mentions of research on common cancer types (n = 240) compared to news mentions across digital native news sources (n = 204; X² = 5.690, df = 1, p = .017).” One suggestion for rewording would be, “There was a significant association between news source type and mention of common cancer type research (n = 204; X² = 5.690, df = 1, p = .017) with traditional news sources including more mentions of research on common cancer types (n = 240) compared to news mentions across digital native news sources.” It is a subtle distinction, but important given how chi-squared is computed. Following up with standardized residuals to determine which of the frequencies in the chi-squared were much different from expected would strengthen the results section and perhaps provide the authors and readers with additional insights.

Thank you to the reviewer for noting this important distinction regarding chi-squared tests and providing a suggested revision to offer a clearer interpretation of the results, given the limitations of this statistical test. We have adopted the revised wording suggested by the reviewer, and the lines on page 7 now read:

“There was a significant association between news source type and mention of common cancer type research (n = 204; X² = 5.690, df = 1, p = .017) with traditional news sources including more mentions of research on common cancer types (n = 240) compared to news mentions across digital native news sources.”

p. 7
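A minimal R sketch of the standardized-residual follow-up the reviewer describes; only the 240 and 204 counts come from the manuscript, and the second row of the table is a hypothetical placeholder added solely to form a complete contingency table.

# Hypothetical sketch of a chi-squared test followed by standardized residuals.
tab <- matrix(c(240, 204,   # mentions of common-cancer research (from the manuscript)
                 60,  90),  # other mentions (hypothetical placeholder values)
              nrow = 2, byrow = TRUE,
              dimnames = list(topic = c("common cancers", "other"),
                              source = c("traditional", "digital native")))
test <- chisq.test(tab)
test$statistic  # chi-squared value
test$stdres     # cells with |standardized residual| greater than about 2 drive the association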

- Including the IQR in addition to the range would be helpful in understanding the data. Or, as suggested above, including the statistical code so that interested readers could examine the distribution of the mentions per online news source.

Following the suggestion of the reviewers, we have included a copy of our statistical code and dataset for interested readers to examine.

NA

- The standard deviation being higher than the mean, along with the range being so wide for number of mentions, suggests that this distribution is skewed and the median should be reported instead. It looks like the median is 1, so the mean of 3 is definitely exaggerating the central tendency.

Thank you for this suggestion. To more accurately represent the distribution of mentions, we have updated this section to report the median, with the mean and range included in parentheses. This sentence on page 6 now reads:

“Of the 213 articles that received online news mentions, the median number of mentions per article was 1 (mean = 3; range: 1-23, SD=3.8).”

p. 6
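
For illustration, a small hypothetical example (not the study data) shows why the median is the better summary when the standard deviation exceeds the mean:

# Hypothetical skewed mention counts, for illustration only
mentions <- c(rep(1, 150), rep(2, 40), rep(5, 15), rep(23, 8))
median(mentions)  # 1
mean(mentions)    # pulled above the median by a few highly covered articles
sd(mentions)      # exceeds the mean, signalling a long right tail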

- In table 1 it might be useful to add some sort of mentions/death or death/mention metric; as the table is currently formatted, it takes some work to understand that, for example, pancreatic cancer is woefully under-reported given the amount of death (more than breast cancer! I had no idea). Or, alternatively, a visual that compares the mortality rank and publicity rank or something similar, so that this disconnect between incidence/mortality and publicity is clearer. …as a journalist might say, it seems like the authors have buried the lead.

We appreciate the reviewer’s suggestion to include a visual to illustrate the ratio of estimated annual deaths to news mentions. The suggestion of this metric is indeed helpful in demonstrating the variation in coverage across types of cancers, given their mortality estimates. To address this, we have added the following text on pages 6–7:

“We also examined how mentions differed across estimated deaths for each common cancer type (Figure 1). These ratios ranged from 0 to 8618. With the exception of urinary and bladder cancers, which did not receive any mentions, the lowest death-to-mention ratio was observed for breast cancer (ratio = 760.56, indicating greater coverage per estimated death), and the highest ratio was observed for pancreatic cancer (ratio = 8618, indicating the least coverage per estimated death).”

We have also added Figure 1 to provide a visualization of these ratios as well.

pp. 6–7
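
For illustration, the two extreme ratios quoted above follow directly from the Table 1 figures (41,070 breast cancer deaths over 54 mentions; 43,090 pancreatic cancer deaths over 5 mentions); a quick check in R:

# Death-to-mention ratios for the two extremes quoted above, using the Table 1 figures
deaths   <- c(breast = 41070, pancreatic = 43090)
mentions <- c(breast = 54, pancreatic = 5)
deaths / mentions  # approx. 760.56 for breast and 8618 for pancreatic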

- The column headings on Table 2 are really confusing; please clarify. Also, add a date range for the articles to the title of this table or to the “Total news mentions” column heading.

Thank you for this suggestion. We have revised the column headings for columns 3–5 for greater clarity and also added the year to the table’s title.

pp. 7–12

*** Table 1 data to make graph (put in a CSV to use the R code below) ***

cancer            incidence  deaths   num.articles  mentions
Breast            255190     41070    1284          54
Lung              222500     155870   630           35
Melanoma          87110      87110    302           33
Colon and rectal  135430     50260    535           28
Prostate          161360     26730    586           23
Leukemia          62130      24500    544           17
Liver             40710      28920    302           8
Pancreatic        53670      43090    309           5
Endometrial       61380      10920    77            4
Kidney            63990      14400    106           4
Non-Hodgkin’s     72240      20140    170           3
Thyroid           56870      2010     71            1
Urinary/Bladder   79030      16870    68            0

*** R code for graph ***

# open data
table1 <- read.csv("table1.csv")

# load tidyverse for graphing
library(package = "tidyverse")

# make graph of mentions & deaths
longer <- table1 %>%
  mutate(deathsInThousands = deaths / 1000) %>%
  pivot_longer(cols = c("deathsInThousands", "mentions"),
               names_to = "metric",
               values_to = "freqNum") %>%
  drop_na() %>%
  mutate(metric = as.factor(metric))

longer %>%
  ggplot(aes(x = reorder(cancer, freqNum), y = freqNum, fill = metric)) +
  geom_col(position = "dodge") +
  coord_flip() +
  theme_minimal() +
  labs(y = "Frequency", x = "Cancer type")

Thank you for sharing this code with us. Although the present analysis was not conducted in R, it is quite helpful for our future work. In the spirit of the reviewer’s comment, we have used Excel to generate Figure 1, which compares estimated deaths per news mention across common cancer types.

NA

Reviewer #3

Interesting article and approach!

Thank you for your comment.

1. It wasn't completely clear what "journalistic media" meant. A more concrete definition within the Introduction would be helpful as you dig down into the Methods, perhaps in lieu of the one you provided (e.g., includes both print and online sources). For example, in the paragraph beginning with: "As Maggio et al.’s [41] full data set included a broad collection of news media organizations, we filtered out non-journalistic news media sources from the data set, leaving only online news media sources." What is a non-journalistic media source? What exactly was filtered out? A clearer definition (perhaps with examples) would be helpful.

We added a more concrete definition in the Introduction (p. 3) and clarified in the Methods section that we relied on Pew Research Center’s 2016 State of the Media Report for our list of journalistic media (p. 6).

pp. 3, 6

Methods:

1. In general, a little more detail or clarity about your process with Altmetric would be helpful, particularly for those who have never used the platform before. For example, you write: "The combined lists composed 3.1% of the total Altmetric data set (86/2805)." I'm not sure what the 2805 is referring to or how that number was obtained.

Thank you for suggesting this clarification. We have expanded this description to clarify that the 2805 refers to the total number of media organizations included in the Altmetric data, which we then narrowed down to a focused list of 86 journalistic news organizations for the present analyses. This section on page 6 now reads:

“We combined these lists and then used them to filter out non-journalistic news media sources from Maggio et al.’s [41] full data set, which included 2805 media organizations, among them non-journalistic sources (e.g., blogs, public relations and governmental agencies). This left a data set containing only online journalistic media sources. A journal article was coded as having a journalistic mention if it was cited in a news story published by at least one of the 86 journalistic news organizations included in these two combined lists from Pew. (For the names of the news media sources, see Pew Research Center [45]). Use of these combined lists allowed us to generate results in a reproducible manner that can be re-examined for other years of publication. The combined lists comprised 3.1% of the total number of media organizations contained in the Altmetric data set (86/2805).”

p. 6
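
For illustration, the filtering step described in this passage can be sketched in R as below; the file and column names are assumptions, and the study's own processing was not carried out in R.

# Hypothetical sketch: restrict Altmetric mentions to the 86 journalistic outlets
library(dplyr)
altmetric   <- read.csv("altmetric_mentions.csv")        # mentions across 2805 media organizations
pew_outlets <- read.csv("pew_journalistic_outlets.csv")  # 86 outlets from the two Pew lists
journalistic <- altmetric %>%
  filter(outlet %in% pew_outlets$outlet)
# an article is coded as having a journalistic mention if it appears here at least once
mentioned_articles <- unique(journalistic$article_doi)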

2. Similarly, you describe coding articles for the presence of a mention. As someone who is unfamiliar with Altmetric, was coding an automated process, or was this done manually? If the latter, more information about how this was done would be helpful.

We have addressed this concern in our revision. We have revised and expanded the section on Altmetric.com, including how Altmetric codes its news sources.

pp. 4–5

3. Extensive detail is provided regarding how the media-related data were obtained. However, a brief mention of where incidence/mortality data were derived from would be helpful, too, since that's a major aim of the paper.

Thank you for this suggestion. On page 6, we have added wording to mention where data on cancer incidence and mortality were obtained and added a citation to the Common Types of Cancer list provided by the National Cancer Institute. This section now reads:

“Additionally, we examined journalistic news coverage of scientific articles across common types of cancer (i.e., defined by the National Cancer Institute as cancers with an estimated incidence of 40,000 or more cases per year), which included breast, lung, melanoma, colorectal, prostate, leukemia, liver, pancreatic, endometrial, kidney, Non-Hodgkin’s lymphoma, thyroid, and urinary/bladder cancers."

p. 6

4. It's nice to have all coding for this project publicly available!

Thank you. We agree.

Results:

1. Minor issue but Table 2 is referenced in text before Table 1.

We have addressed this.

pp. 7–9

2. Table 1 was particularly interesting, but little reporting or discussion of it was presented in the text. If part of the goal of this paper is to highlight discrepancies between morbidity/mortality and news coverage, I might highlight some "standouts" in the text. For example, lung cancer is responsible for ~150,000 deaths annually and received 35 online mentions, while melanoma (responsible for half as many deaths) received nearly identical coverage. A greater discussion of these points would serve to support your overall study aim.

Thank you for this suggestion. We have expanded our discussion of Table 1 in the Discussion and Conclusion. We have also incorporated this point into the Abstract.

pp. 1, 4–15

Discussion:

1. The overall organization of the Discussion section made it a bit difficult to follow at times. In its current form, it seems to jump around, and it’s difficult to see how the findings of the present study fit with other relevant research. The structure proposed in the BMJ (Docherty & Smith, 1999) may be useful, and I encourage the authors to consider using it in this paper. doi: https://doi.org/10.1136/bmj.318.7193.1224

Thank you for pointing us to this paper. You will see that we incorporated the suggested structure into the discussion. In the Track Changes version, you can see the suggested structure via subheads, which were removed in the “clean” version.

pp. 12–16

2. There are portions of the Discussion section that would be better placed within the Results. (Example: First paragraph under the "Growing divide: traditional vs. digital-native news" heading)

We moved that initial paragraph and several other portions of the Discussion section into the Results section.

p. 6

3. You mention that certain journal article topics (e.g., drugs or financing) are prioritized; it would be interesting to see that situated within the context of previous research as well. I would assume that this is common practice, but is that something unique to this study, or is it aligned with previous research on the topic?

We have tried to address this in the literature section (i.e., Introduction).

pp. 1–6

4. Something that's missing from this section is a discussion of how this work relates to intentional dissemination efforts from researchers. Within the Introduction, you write: "Findings could facilitate future dissemination and funding initiatives... This study lays the groundwork for future research that explores how online news media could be better incorporated into dissemination processes and knowledge translation strategies." You also cite the 2018 Brownson article but don't discuss it or any related articles within this section. More consideration within the Discussion is warranted, as it seems to be a logical "next step" for this type of work.

Thank you for the suggestion. We have incorporated additional information from scientific communication literature regarding scientists communicating with journalists. We have included Brownson (2018) in the text and by name in the Conclusion. We have also worked to better address dissemination processes and knowledge translation strategies in the Conclusion.

pp. 2–3, 14–16

We greatly appreciate your further review of our manuscript and hope that the revisions as described above and in the revised paper meet your expectations. Please do not hesitate to contact me with any questions or additional requests.

Best regards,

Dr. Laura Moorhead

Assistant Professor, Journalism

Attachment

Submitted filename: PLOS ONE_ Revision table [PONE-D-20-15200].pdf

Decision Letter 1

Cindy Sing Bik Ngai

4 Dec 2020

PONE-D-20-15200R1

What cancer research makes the news? A quantitative analysis of online news stories that mention cancer studies

PLOS ONE

Dear Dr. Moorhead,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

Specifically:

1) Please address the issues raised by the reviewers on data collection, statistical method and visual presentation; 

2) Please add a data availability statement to the manuscript (see https://journals.plos.org/plosone/s/data-availability) and make sure that the original data are accessible;

3) Please make sure that all the necessary supplementary information is provided and correctly referenced in the manuscript (please carefully read https://journals.plos.org/plosone/s/submission-guidelines and https://journals.plos.org/plosone/s/supporting-information).

Please submit your revised manuscript by Jan 18 2021 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:

  • A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labeled 'Response to Reviewers'.

  • A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'.

  • An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: http://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols

We look forward to receiving your revised manuscript.

Kind regards,

Cindy Sing Bik Ngai

Academic Editor

PLOS ONE


Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation.

Reviewer #1: (No Response)

Reviewer #2: All comments have been addressed

Reviewer #3: All comments have been addressed

**********

2. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: No

Reviewer #2: Yes

Reviewer #3: Yes

**********

3. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes

Reviewer #2: I Don't Know

Reviewer #3: Yes

**********

4. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes

Reviewer #2: Yes

Reviewer #3: Yes

**********

5. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes

Reviewer #2: Yes

Reviewer #3: Yes

**********

6. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: I'd like to thank the authors for the efforts made in addressing my comments. However, I still have strong concerns about the conclusion the authors provide, namely that US-funded cancer research is barely mentioned in the news media. I do not think the authors really show this; rather, based on a very sensitive and error-prone method, that of using hyperlinks to papers (a rare way of reporting research findings in news media), they are not able to identify many mentions of US-funded research. I think this is a very important distinction, as it places the problem on the fact that we do not have tools appropriate for what the authors aim to do.

The authors work with very low numbers, and therefore I believe that the data could be further explored in order to address: 1) how is Altmetric.com retrieving the mentions? and 2) are these findings a reflection of what is happening (rare mentions of US-funded cancer research) or a limitation of the tool the authors are using?

For the first issue, it would be enough to go into the actual news stories mentioning the papers and look at them (at least a sample, though the data set is small enough to be manageable), then report how the stories link to research and the context in which they do so. For the second, a potential idea would be to search a news media database (the authors already mention two) for one of the more common cancers in a specific outlet (e.g., the NY Times or any other), see how much is retrieved and how much deals with research, to find how they report research findings. If you find consistent evidence of them reporting in the way Altmetric.com identifies scientific literature, then you have some evidence on the robustness of your findings, even if it is anecdotal.

Reviewer #2: - I'm still not convinced the chi-squared is worth keeping; consider removing. It adds very little, if anything, to the work and without examining standardized residuals it's hard to know what a significant chi-squared even means in this context. If the authors decide to keep it, please make sure the numbers in the sentences are clear and correct (what are the 204 and 240?) and the assumptions for chi-squared are met.

- Regarding Figure 1, the ratios are a good idea, although the Figure then loses the sense of how serious each cancer is in terms of causing death. I'm including an Excel version of the graph I sent in R in the previous review, with the following adjustments from what was included in the paper:

(1) flip the coordinates so that the labels are easier to read

(2) order the bars by height

(3) add some way of knowing how serious each cancer is (either a second bar showing deaths for each cancer or some kind of labeling)

- I can't figure out where the code is for this paper from the zenodo link that was sent; what is saved there appears to be mostly stuff from the prior paper?

Reviewer #3: (No Response)

**********

7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: Yes: Nicolas Robinson-Garcia

Reviewer #2: Yes: Jenine K Harris

Reviewer #3: No

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.

PLoS One. 2021 Mar 10;16(3):e0247553. doi: 10.1371/journal.pone.0247553.r004

Author response to Decision Letter 1


3 Feb 2021

Dear editors,

Again, we thank you and the reviewers for your feedback and suggestions regarding our manuscript entitled, "What cancer research makes the news? A quantitative analysis of online news stories that mention cancer studies" (Manuscript PONE-D-20-15200) by Moorhead, Krakow and Maggio. We are pleased to resubmit our manuscript with revisions based on this feedback. In the attached response to reviewers, we provide a description of how we addressed each reviewer comment. These revisions are also noted within the document using track changes.

We greatly appreciate your further review of our manuscript and hope that the revisions as described above and in the revised paper meet your expectations. Please do not hesitate to contact me with any questions or additional requests.

Best regards,

Dr. Laura Moorhead

Assistant Professor, Journalism

Attachment

Submitted filename: PLOS ONE Response to Reviewers PONE-D-20-15200.pdf

Decision Letter 2

Cindy Sing Bik Ngai

10 Feb 2021

What cancer research makes the news? A quantitative analysis of online news stories that mention cancer studies

PONE-D-20-15200R2

Dear Dr. Moorhead,

We’re pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements.

Within one week, you’ll receive an e-mail detailing the required amendments. When these have been addressed, you’ll receive a formal acceptance letter and your manuscript will be scheduled for publication.

An invoice for payment will follow shortly after the formal acceptance. To ensure an efficient process, please log into Editorial Manager at http://www.editorialmanager.com/pone/, click the 'Update My Information' link at the top of the page, and double check that your user information is up-to-date. If you have any billing related questions, please contact our Author Billing department directly at authorbilling@plos.org.

If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they’ll be preparing press materials, please inform our press team as soon as possible -- no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

Kind regards,

Cindy Sing Bik Ngai

Academic Editor

PLOS ONE

Additional Editor Comments (optional):

Reviewers' comments:

Acceptance letter

Cindy Sing Bik Ngai

16 Feb 2021

PONE-D-20-15200R2

What cancer research makes the news? A quantitative analysis of online news stories that mention cancer studies

Dear Dr. Moorhead:

I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now with our production department.

If your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information please contact onepress@plos.org.

If we can help with anything else, please email us at plosone@plos.org.

Thank you for submitting your work to PLOS ONE and supporting open access.

Kind regards,

PLOS ONE Editorial Office Staff

on behalf of

Dr. Cindy Sing Bik Ngai

Academic Editor

PLOS ONE

Associated Data

    This section collects any data citations, data availability statements, or supplementary materials included in this article.

    Supplementary Materials

    Attachment

    Submitted filename: PLOS ONE_ Revision table [PONE-D-20-15200].pdf

    Attachment

    Submitted filename: PLOS ONE Response to Reviewers PONE-D-20-15200.pdf

    Data Availability Statement

    To facilitate transparency and replication of our methods, we have made our computer code and the project’s complete data set publicly accessible at https://zenodo.org/record/4448259#.YCQ32NhKg2w.


    Articles from PLoS ONE are provided here courtesy of PLOS
