F1000Research. 2015 Sep 24;4:409. Originally published 2015 Jul 28. [Version 2] doi: 10.12688/f1000research.6858.2

What’s in a Name? Exploring the Nomenclature of Science Communication in the UK

Sam Illingworth 1,a, James Redfern 1, Steve Millington 2, Sam Gray 3
PMCID: PMC4582756  PMID: 26448860

Version Changes

Revised. Amendments from Version 1

Following on from the reviewers’ comments, a number of changes have been made to the document, which we believe help to frame the study in a far more rigorous light. The main issue that the reviewers picked up on was that the analysis of the data was not rigorous enough, and that word clouds were not a suitable tool for assessing the results of the survey. As such, a far more detailed analysis has been undertaken, using the qualitative analysis software NVivo. An open coding approach was adopted, and the responses were categorised until descriptive saturation was reached. The word clouds have now been replaced with colour-coded tables that outline the thematic analysis and coding of the responses to each of the questions in the survey. Another issue that has been addressed is that of the sampling and the demographics that were recorded in this study. The sampling strategy is now discussed, and the limitations of the sample size and the effects that this has on the findings of the study are better contextualised. It is also acknowledged that the recording of further demographic information was an oversight, and is something that should be addressed in future studies. The conclusions have also been tightened up, and the issue of how a consistent nomenclature will be useful in terms of the UK’s Research Excellence Framework (REF) and the proposed Teaching Excellence Framework (TEF) has also been discussed.

Abstract

This study, via a consideration of the literature, and a limited survey of active science communicators, presents concise and workable definitions for science outreach, public engagement, widening participation, and knowledge exchange, in a UK context. 

Sixty-six per cent of participants agreed that their definitions of outreach, public engagement, and widening participation aligned with those of their colleagues, whilst 64% felt that their personal definitions matched those of their institute. However, closer inspection of the responses to the open-ended questions found that the respondents often differed in their use of the nomenclature. In particular, the respondents found it difficult to define knowledge exchange in this context. It is hoped that this initial study will form the foundation of future work in this area, and that it will help to further develop the debate regarding the need for a consistent nomenclature across science communication.

Keywords: Science Communication, Public Engagement, Outreach, Widening Participation, Knowledge Exchange

Introduction

Burns et al. (2003, pp. 183) define science communication as:

  • “…the use of appropriate skills, media, activities, and dialogue to produce one or more of the following personal responses to science (the AEIOU vowel analogy): Awareness, Enjoyment, Interest, Opinion-forming, and Understanding.”

This robust definition covers most aspects of communicating science to the public under a number of different guises. Where things start to get complicated is in the semantics regarding the different types of science communication, and in their appropriate use and classification.

Across UK institutions, science communication is often referred to using a variety of different terms, amongst them science outreach, public engagement, widening participation, and knowledge exchange; but what do these terms actually mean? As well as institutional biases towards the ‘correct use’ of these terminologies, there exist personal nuances in their interpretation, which often depend upon the role of the person in question and how they perceive science communication to fit into their research and teaching practices, and beyond.

It can be argued that these definitions are simply a matter of semantics, but with science communication becoming more prevalent in grant applications and income generation (see e.g. Research Councils UK’s ‘Pathways to Impact’ guidance [ http://www.rcuk.ac.uk/ke/impacts/]), it is important for there to be consistency in what is a developing field. The advent of ‘Science 2.0’ (see e.g. Nattkemper, 2012) and what it entails is also an important driver behind having a clear and consistent nomenclature associated with science communication. Science 2.0 proposes a systemic change in the modus operandi of doing research and organising science, in which science communication will play a key part. With potentially large pots of money available in future grants, under specific terms and conditions, there needs to be a consistent terminology that can be drawn upon by the academic community and beyond.

According to the European Commission public consultation into Science 2.0 ( http://ec.europa.eu/research/consultations/science-2.0/background.pdf), something else that requires careful consideration is “the need to develop researcher and researcher reward schemes that reflect this (new) approach”. With potential reward schemes attributed to science communication activities, as well as the creation of new job positions to fill these roles, it is important for all concerned to ensure that the language used in science communication is consistent.

This study, therefore, begins by discussing some of the definitions for science outreach, public engagement, widening participation, and knowledge exchange in the UK, derived from the common usage of these terms in the literature, and from the experiences of the authors. It then compares these definitions with the results of a survey of active science communicators from across the UK, and comments on the similarities and differences between the two, before identifying some suggestions for future nomenclature definitions within the field. The purpose of this study is to act as an initial scoping exercise, to begin to investigate and attempt to define a consistent set of nomenclature for use in science communication across the UK, and to act as a building block for further study and future debate.

Literature analysis

The term ‘science outreach’ has been commonplace in the research literature since the early 1990s, at which time the number of research articles on science communication started to increase. Many of these early articles describe science outreach as a school- or education-linked activity, whereby academics engage with different groups of people such as the general public, students and teachers (see e.g. Greenler et al., 1993; Kelter et al., 1992). The term science outreach, which included activities such as mentoring, tutoring, giving presentations, supporting teachers and involvement with after-school clubs and summer schools, continued to become synonymous with school-related activity into the 2000s (e.g. Andrews et al., 2005; Krasny, 2005). More recently, Ecklund et al. (2012) suggested that scientists involved in science outreach are often engaged in some type of activity involving school-aged children, demonstrating that the connection between school-related activity and science outreach remains strong.

Although much of the literature using the term ‘science outreach’ is based on work carried out in North America, this definition is similar to that used in the UK. Many organisations in the UK that fund science communication (e.g. the Royal Society, the Royal Society of Chemistry, the Society of Biology, and the Wellcome Trust) use the term science outreach when explicitly discussing science communication with school children.

Although this link to school activity is present in the UK, there is some overlap with other commonly used science communication terms, in particular public engagement, with some science communication practitioners using both terms together, e.g. ‘schools outreach and public engagement’.

In recent years there has been a shift from the deficit model of the ‘Public Understanding of Science’ towards a dialogue-based approach, which can be referred to as ‘Public Engagement with Science and Technology’ ( Schäfer, 2009, and references therein).

Public engagement can be thought of as a way to restore public trust in science, by developing a two-way dialogue between the general public and the scientific community ( Wynne, 2006). Public engagement can foster global communication, enable shared experiences and methodology, standardize strategy, and generate shared viewpoints ( Cohen et al., 2008). Furthermore, it can be defined as a deliberative process, promoted in both academic and policy circles, as a potential means to build public trust in risk decisions and decision-makers ( Petts, 2008). With regards to policy makers, public engagement can be viewed as both relevant and useful in a regulatory context (see e.g. O'Doherty & Hawkins, 2010), with the results of public discussions with scientists being a worthwhile process in scientific development ( Jones, 2007).

Recent years have seen increasing encouragement by research institutions and funding bodies for scientists to actively engage with the public, who ultimately finance their work ( Bowler et al., 2012), and whilst many research institutions now have dedicated resources for public engagement activities, such activities are not yet considered essential ( Neresini & Bucchi, 2011). It is also unclear as to whether the institutional approach to public engagement is to focus on engaging with the public to promote their research and raise understanding, or if it is to open up a two-way dialogue in order to get their opinion on scientific research and protocol, especially in relation to potential political and ethical ‘hot potatoes’, e.g. geoengineering ( Parkhill & Pidgeon, 2011) and nanotechnology ( Jones, 2007).

The American National Center for Media Engagement ( http://mediaengagedev.org/engagement/why-engage/difference-between-outreach-and-engagement) defines outreach as “a mechanism for delivering value-added content”, whereas engagement means “collaboratively addressing community concerns.” This would seem to be consistent with the UK-centric arguments that have been laid out above, i.e. that outreach is a means of educating the general public (in particular school children), whereas public engagement involves a two-way dialogue in which the general public can offer advice and opinions as to the current state of scientific research. This approach to defining public engagement as something different from outreach is corroborated by Holliman et al. (2009, pp. 56) who state that:

  • “There is a heterogeneous community of practice operating in the space between what can be characterized as deficit-informed ‘science outreach’—aimed primarily at increasing scientific literacy—and dialogue-informed ‘public engagement’ seeking to foster productive exchanges between scientists and other stakeholders (including members of the public).”

However, there still appears to be some uncertainty as to the difference between these approaches, and also to potential overlaps with regards to audiences; it is also unclear as to whether these definitions are consistent at an institutional level. As Rowe & Frewer (2005, pp. 251) remark:

  • “Imprecise definition of key terms in the ‘public participation’ domain have hindered the conduct of good research and militated against the development and implementation of effective participation practices.”

Concerns about “access” to higher education began to emerge alongside the expansion of the university sector in the latter part of the 19th century, but a research agenda on differential access only began to emerge following the recommendations of the Robbins Committee in 1963 to expand university attendance ( Kettley, 2007). These concerns resurfaced in the 1990s when the divide between universities and polytechnics ended, ultimately leading to a commitment by the 1997 Labour Government to again expand the sector by tackling barriers to higher education. Consequently, Labour established the Office for Fair Access (OFFA).

Widening Participation involves interventions targeted at social groups under-represented in Higher Education (HE), in order to encourage them to attend university. According to the OFFA ( http://www.offa.org.uk/) this includes:

  • Students from disadvantaged backgrounds

  • Students with disabilities

  • Students from some ethnic minority backgrounds

  • Care leavers

  • Part-time and mature students

With graduates benefitting from higher level skills, knowledge, and access to the networks that are necessary to find higher paid work, the affordances of higher education are clear. Assuming disadvantaged social groups are afforded the same opportunities of access to employment through their university education, widening participation can help to reduce social exclusion. It is not surprising, therefore, that the New Labour government largely reshaped the UK HE landscape in alignment with this ambition, with activity co-ordinated through Aimhigher, Lifelong Learning Networks, and the National Academy for Gifted and Talented Youth (see e.g. Frost, 2005).

However, the institutional landscape has since changed, with a greater onus now on universities to independently deliver these objectives. In addition, university widening participation activity has come under greater scrutiny by the Higher Education Funding Council for England (HEFCE), whereby universities opting to charge over £6k annual tuition fees must also agree to Fair Access Agreements ( McCaig & Adnett, 2009).

In practice, widening participation aligns with the Pipeline or Learner Pathway model (see e.g. Clewell & Villegas, 1999); involving interventions designed to raise awareness and expectations of HE at various points within a learner’s education. With the emphasis on social mobility, widening participation focuses largely on targeting younger students from disadvantaged backgrounds utilising quantitative measures of poverty and deprivation, for example, the Index of Multiple Deprivation ( Deas et al., 2003) and the eligibility for Free School Meals datasets.

In addition, widening participation can also be thought of as a consideration of the student lifecycle beyond pre-entry and transition, to include university curriculum design, student support and employability. This follows concern that students from disadvantaged backgrounds perceive universities as ivory towers, i.e. places that are beyond their reach and not for people like themselves (see e.g. Mangan et al., 2010). It is important, therefore, to consider the impact of traditional university practices or institutional culture, not only on access, but also on the retention and progression of students from non-traditional backgrounds through HE.

Various conceptualisations of knowledge exchange have been present in UK higher education discourse since the late 1990s, when the Higher Education Reach Out to Business and the Community (HEROBC) initiative first emerged. HEROBC was part of the so-called ‘third stream’ of funding, designed to sit alongside institutions’ teaching and research activities, and to provide funds for universities and colleges to pursue interactions with business and the wider community. At the time these interactions were exclusively centred on knowledge and/or technology transfer (rather than exchange), with the purpose of HEROBC being to develop the capacity and capability for knowledge transfer between Higher Education Institutions (HEIs) and other sectors. Typical activities that were funded through HEROBC included skills matching between university and business, and the provision of gateways to enable business to access university expertise and employability initiatives.

In 2001, HEROBC evolved into the Higher Education Innovation Fund (HEIF), which focussed on funding activities designed to increase the capability of universities “to respond to the needs of business, especially in instances that would lead to identifiable economic benefits” ( HEFCE, 2005, pp. 5). HEIF has since featured in four separate funding rounds, with explicit reference to knowledge exchange (rather than knowledge transfer) first emerging as a prominent part of the December 2003 call for HEIF-2.

HEFCE, through the annual Higher Education Business and Community Interaction Survey (HEBCIS), now leads the categorisation of knowledge exchange activities. HEBCIS requires universities to report expenditure across various knowledge exchange categories, including contract research, consultancy, CPD, business start-up and employability programmes. As university HEIF allocations are tied to levels of expenditure reported through HEBCIS, this exercise has been a major influence on what UK universities prioritise, resource and define in terms of knowledge exchange.

Despite the focus on expenditure, the important social, cultural and community role that universities play in wider society has not been entirely ignored. Influential voices have emerged around these concepts, most notably Professor David Watson (former Vice-Chancellor of the University of Brighton), who has been a champion of this societal agenda and the role that universities have to play within it, focusing on “civic and community” partnerships ( Watson, 2007).

Watson’s conceptualisation of knowledge exchange is rooted in a more engaged ‘two-way’ relationship between universities and external partners, and sets out a much broader notion of knowledge transfer and knowledge exchange. John Goddard, Emeritus Professor of Regional Development Studies at Newcastle University, UK, has also commented on the position of universities as powerful engines of local and regional economic growth (see e.g. Goddard, 2009).

The most recent HEFCE definition states that knowledge exchange “refers to HEIs’ engagement with businesses, public and third sector services, the community and wider public” ( http://www.hefce.ac.uk/glossary/#letterK). This more explicit referencing of engagement within the knowledge exchange landscape has largely come about through a subtle yet important shift in funding council priorities, prefaced, for example, by the Beacons for Public Engagement initiative (2008–2012), and leading towards the uptake of the impact agenda within the UK’s Research Excellence Framework (REF).

Survey

In order to assess the current opinion relating to the definitions of outreach, public engagement, widening participation, and knowledge exchange in UK HEIs, a survey was conducted that asked participants to relate their understanding of science communication nomenclature.

The survey was conducted using Bristol Online Surveys ( https://www.survey.bris.ac.uk/), and comprised eight questions delivered with a mixed-methods approach (i.e. qualitative and quantitative questions). The focus was to evaluate the participants’ views on what constituted outreach, public engagement, widening participation, and knowledge exchange. It also aimed to assess whether or not the participants felt that their own opinions aligned with those of their colleagues and their institution. A copy of the questionnaire can be found in the supplementary materials section of the article.

Estimating the number of active science communicators in the UK is beyond the scope of this study. However, given that this study aimed to provide an initial scoping exercise into the thoughts and consistency of active science communicators across the UK, and taking into account the limited time frame and zero budget, an ideal sample size of between 50 and 100 participants was chosen for the survey. Given the limitations in budget (which also precluded an interviewing/focus group approach), a convenience sampling strategy was adopted, in which the survey was advertised using the ‘psci-comm’ mailing list hosted by JISCMail, as well as through the Twitter accounts of the authors, all of whom are active participants in UK science communication networks across the Twittersphere. The target audience were people who identified themselves as active UK science communicators, which is why this sampling strategy was adopted. This study was carried out according to the British Educational Research Association’s (BERA) ethical guidelines for educational research, with all of the data in this study fully anonymised.

Results & discussion

Answers to science communication questionnaire

These are the responses to the questionnaire that was used in this study to assess practitioners’ definitions of nomenclature in relation to science communication.


In total, 47 people participated in the survey during the allocated time frame of one month, and all bar one of them stated that they currently participated in outreach, public engagement, or widening participation events at their institute or company. Of the actively involved participants, 44 were located solely in the UK, one in the Netherlands, and one was based in both the Netherlands and the UK. For the purposes of the analysis, only the 45 participants who were based in the UK, and who stated that they currently participated in science communication activities at their institute or company, were selected. As well as falling short of the intended sample size, it is acknowledged that the sample size of this survey is far too small to be able to make any generalisations about the nomenclature of science communication in a UK context. However, the responses will be able to help in the development of a potential framework, which can then be further discussed amongst the wider science communication community. It is also envisaged that this will help to foster the debate in terms of the importance of a standardised nomenclature for use across the UK science communication community and beyond. Given that the psci-comm mailing list contains several hundred active UK science communicators, and that between them the authors have several hundred Twitter followers who identify themselves as UK-based science communicators, it is disappointing that more people were not able to participate in the survey, but we believe that the number of responses is still sufficient for the purposes of this study.

Figure 1 shows the results of the survey in relation to how the participants’ personal definitions differed from those of their colleagues and their institutes.

Figure 1. Stacked columns showing how participants’ personal definitions differed from those of their colleagues and institutes/companies.


Thirty-one of the forty-five participants (~69%) agreed that their definitions of outreach, public engagement, and widening participation aligned with those of their colleagues, whilst 30 (~67%) felt that their personal definitions matched those of their institute. Whilst the sample size is too small to draw any general conclusions about the consistency of science communication nomenclature across the UK, it is interesting to note that the majority of participants think that their definitions of outreach, widening participation, and public engagement match those of their colleagues and their institutes.

What is not clear from Figure 1 is if there actually is any agreement between the participants’ personal definitions of outreach, widening participation, and public engagement. As such, in addition to the questions regarding how the participants felt their definitions matched those of their colleagues and institutes, the survey also contained the following questions, which aimed to further explore how these different aspects of science communication are defined in the UK:

  • How would you define outreach?

  • How would you define public engagement?

  • How would you define widening participation?

  • How is knowledge exchange related to outreach, public engagement and widening participation?

From the responses to these open-ended questions, the qualitative analysis tool NVivo was used to perform a thematic analysis. The different themes that were selected for each of the questions, along with the corresponding coding frequencies, are shown in Figure 2–Figure 5. To begin with, an open coding approach was taken, in which a number of major categories for each of the questions were deduced from the participants’ responses. These categories were then further investigated, including any potential overlaps. Following on from this initial open coding approach, the responses were re-examined in order to confirm that the major categories (and the concepts that these represented) were an accurate portrayal of the text. This stage was also used to explore how the categories and their concepts were potentially related. This methodology was applied to each of the questions, and continued until descriptive saturation was reached, i.e. until no further codes, categories or themes were found to be emerging from the analysis of the data.
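As a minimal illustration of the final tallying step only (and not of the authors’ NVivo workflow), the Python sketch below shows how hand-coded responses might be counted per category to produce the frequencies reported in the colour-coded tables; the category names follow Figure 2, and the example coded responses are invented for the purposes of the sketch.

    from collections import Counter

    # Hypothetical hand-coded responses: each set holds the major categories
    # assigned to one participant's free-text answer (names follow Figure 2).
    coded_responses = [
        {"Schools", "Going Out"},
        {"Schools", "Non-academic audience"},
        {"Communicating Research", "Encourage University attendance"},
        {"Unsure"},
    ]

    # Tally how many responses were coded against each category, and express
    # each count as a share of all responses, as in the colour-coded tables.
    frequencies = Counter(code for codes in coded_responses for code in codes)
    for category, count in sorted(frequencies.items()):
        share = 100 * count / len(coded_responses)
        print(f"{category}: {count} (~{share:.0f}%)")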

Figure 2. Major categories for ‘definitions of outreach’ and frequencies; listed in alphabetical order and colour coded according to frequency.

Figure 2.

Figure 5. Major categories for ‘how knowledge exchange is related to outreach, public engagement and widening participation?’ and frequencies; listed in alphabetical order and colour coded according to frequency.


From the thematic analysis and coding of the responses to the definition of outreach, there are eight major categories, which are displayed in Figure 2, the context for which is given below (there is some overlap between categories):

  • 1. Encourage University attendance – These were responses that defined one purpose of outreach as encouraging participants to attend university, although not necessarily the university that was conducting the outreach activities.

  • 2. Bringing In – These were responses that explicitly talked about bringing participants into the institute or research environment in order to conduct outreach activities.

  • 3. Communicating Research – Responses that talked about the use of outreach as a way of advertising and communicating the research of the institute.

  • 4. Going Out – These were responses that associated outreach with activities that took place outside of their institute.

  • 5. Non-academic audience – Those responses that mentioned how outreach was for a general non-academic audience.

  • 6. One-directional – These were responses that noted how outreach activities tend to be one-directional in their approach and delivery.

  • 7. Schools – Responses that described how outreach activities were for school children and young people.

  • 8. Unsure – Respondents who were unsure what was meant by the term outreach.

Regarding the responses to the definition of outreach ( Figure 2), what is immediately noticeable is that nearly half of the respondents made a direct association between science outreach and school education, and that schools and school children were their target audience. A smaller percentage (31%) of respondents felt that such outreach was for non-academic audiences.

Thirteen respondents (~29%) felt that outreach takes place outside the university campus or research institute (in any location), whereas only six (~13%) noted the converse, i.e. that outreach activities are, and should be, carried out within the institute. One participant made the observation that:

  • “‘Outreach’ has negative connotations - it implies that the institution is doing all the work by ‘reaching out’ to an external group... I would define outreach as being the education-based programme of a large institution (such as a museum) where people are ‘brought in’ as opposed to coming as visitors of their own accord.”

Could it be, therefore, that the term outreach actually has negative implications for institutes such as museums, which require an influx of people to interact with their mainly on-site activities? Further probing would be needed to test this hypothesis, wherein the participants’ responses would also need to be compared to their institutional roles.

Interestingly, some respondents explicitly discussed the connection between outreach and public engagement. One respondent stated that they believed outreach was “more educational than public engagement”, whilst two respondents believed outreach to be a subsection of public engagement, but did not elaborate on how the terms were differentiated in practice. Two of the participants made explicit reference to outreach being more of a one-way form of communication, with one of the participants noting that outreach is: “more one-way focused, having a scientist talk to a non-expert, not necessarily in a two-way conversation.”

In terms of the actual purpose of outreach, only sixteen (~36%) of the participants mentioned this explicitly. Nine (20%) of them stated that the purpose of outreach was to inspire audiences to pursue study in further education (although not necessarily at the institute conducting the outreach), and the same number stated that the purpose of outreach was to communicate the research that was carried out by the institute; two of the participants mentioned both of these categories as a rationale for conducting outreach.

Only two (~4%) of the respondents were unsure what was meant by outreach, with one of these participants stating that: “our organisation doesn't have an agreed definition of these terms.” Overall then, the sample size is far too small to generalise in terms of absolute definitions, but from those that were surveyed, there appears to be considerable disagreement about where outreach should take place and what its ultimate purpose is, with more agreement on the target audience, i.e. non-academics, with a particular focus on school children and young people. This is in keeping with the literature review, which also discussed how many organisations in the UK that fund science communication use science outreach as the terminology when explicitly discussing science communication with school children.

From the coding of the responses to the definition of public engagement, there are four distinct categories, which are displayed in Figure 3, the context for which is given below (there is some overlap between categories):

Figure 3. Major categories for ‘definitions of public engagement’ and frequencies; listed in alphabetical order and colour coded according to frequency.


  • 1. General Public – Respondents who thought that public engagement was any activity that was aimed at the general public.

  • 2. Promoting Research – Responses that talked about the use of public engagement as a way of promoting the research of the institute.

  • 3. Two-Way – These were responses that noted how public engagement activities tend to be two-directional in their approach and delivery.

  • 4. Unsure – Respondents who were unsure what was meant by the term public engagement.

From these responses, there were clearly two main themes: whom the public engagement was for, and what the purpose of the public engagement was. Some of the respondents talked about both (17), whereas others talked only about the audience (21), and a smaller number (6) talked only about the purpose. In terms of the audience, the vast majority of the respondents that commented on the audience (37 out of 38) explicitly mentioned that public engagement should be for the general public. The remaining respondent noted that public engagement is for: “audiences not associated with schools and colleges,” which would seem to be in direct contrast to the audiences more readily associated with outreach. However, further questioning would be needed in order to ascertain if the other respondents also classified the general public as that which excludes school children. The whole notion of what constitutes the general public requires further investigation, as there was no clear set of categories as defined by the responses, with almost all of the respondents referring to simply “the public” or “the general public,” without going into detail as to what was meant by that.

The respondents seem to mainly agree (19 out of 23) that the purpose of public engagement is to engage in a two-way dialogue with the public. Some of the sample responses that matched this definition included: “Public engagement can be a two-way process, with academics learning and incorporating feedback from the public”, “Public Engagement is ideally a two-way process, by which information is shared between two different groups”, and “Activities in which members of public audiences communicate with specialists in a way that has the potential to influence the specialists' activities.” The other responses regarding the purpose of public engagement (4 out of 23) were related to the promotion of the research and/or the institute itself.

Only one of the respondents was unsure what was meant by public engagement, with that participant stating that:

  • “I have no idea what the difference (between public engagement and outreach) is supposed to be.”

From the responses of those that were surveyed, the most popular responses were that public engagement is for an audience of the general public, and that its purpose is to engage in a two-way discussion. Again, this is similar to the UK-centric view that was discussed in the literature review, that public engagement involves a two-way dialogue with the general public.

From the coding of the responses to the definition of widening participation, there are nine distinct categories, which are displayed in Figure 4, the context for which is given below (there is some overlap between categories):

Figure 4. Major categories for ‘definitions of widening participation’ and frequencies; listed in alphabetical order and colour coded according to frequency.


  • 1. Atypical groups – Responses that mentioned the involvement of atypical groups.

  • 2. Disadvantaged groups – Responses that mentioned the involvement of disadvantaged groups.

  • 3. Encourage higher education – Respondents who thought that widening participation was any activity designed to encourage the participants to continue their studies into higher education.

  • 4. Hard-to-reach groups – Responses that talked about the use of widening participation to interact with traditionally hard-to-reach groups.

  • 5. Not recruitment – Respondents who felt that the purpose of widening participation was explicitly not as a recruitment drive for their institute.

  • 6. Recruitment – Respondents who felt that the purpose of widening participation was as a recruitment drive for their institute.

  • 7. Schools – Responses that talked about engaging with school children and young people.

  • 8. Universal target group – Responses that talked about a general universal target group.

  • 9. Unsure – Respondents who were unsure what was meant by the term widening participation.

In terms of the purpose of widening participation, there was only one clear category, with 14 of the respondents (~31%) believing that it involved activities that encouraged students to continue their schooling into higher education. Within these responses, three of the participants explicitly stated that this involved a recruitment drive for the university, whereas two of the participants stated that they believed that it was not a recruitment drive. For future studies it would be interesting to investigate this dichotomy further.

The survey reveals less clarity, however, in terms of identifying specific target groups (some respondents referred to one or more target groups):

  • Twelve respondents simply referred to a universal target group e.g. society, the public or simply engaging more people.

  • Nine referred to groups that were under-represented or hard-to-reach, but most were unable to provide specific examples. Only one respondent, for example, referred specifically to minority ethnic status, only one to gender, and another referred specifically to disability.

  • Nine understood widening participation in terms of relating specifically to schools or younger people.

  • Eight referred to atypical social groups, i.e. groups who would not normally or traditionally attend university.

  • Six referred to groups experiencing some form of disadvantage.

  • Three understood widening participation in terms of targeting geographical areas.

A large number of responses refer to atypical social groups, people who express certain characteristics that appear to be defined against some notion of what constitutes a normal student, for example “those with different cultural attitudes and ideals.” This finding chimes with the concern that university staff continue to understand diversity in a way that reproduces the notion of universities as places for some normalized subject, defined against an atypical Other. Only one respondent, for instance, referred to widening participation in terms of curriculum support/design for widening participation of students already in HE.

Two respondents claimed to have not heard of the term widening participation before, whereas two expressed a rather cynical perception that widening participation had become “hijacked by university recruitment agendas”, and another respondent referred to widening participation as “the annoying habit of targeting minority groups.”

The number of respondents was too small to generalise in terms of absolute definitions, but from those that were surveyed, it appears that the purpose of widening participation is to encourage students to continue their education into higher education, which is in keeping with the definitions discussed in the literature review. Exactly which audiences should be and are being targeted is less obvious from the responses, and any future study should look to further investigate why this is the case, and to what extent it is determined by institutional protocol and/or personal preference.

From the coding of the responses to the definition of knowledge exchange, there are four distinct categories, which are displayed in Figure 5, the context for which is given below (there is some overlap between categories):

  • 1. Industry – Respondents that mentioned the involvement of industry.

  • 2. Share best practices – Respondents who thought that knowledge exchange involved the sharing of best practices.

  • 3. Support – Responses that talked about how knowledge exchange involved the support of outreach, public engagement, and/or widening participation activities.

  • 4. Unsure – Respondents who were unsure how knowledge exchange is related to outreach, public engagement and widening participation.

What is immediately clear from Figure 5 is that there is no consistent definition or understanding of knowledge exchange amongst respondents. There are instead four major categories to the responses, which represent a reasonably broad range of concepts, definitions and views (or lack thereof) of knowledge exchange.

One of the most popular responses (~27%) was that knowledge exchange exists to support outreach, public engagement, and/or widening participation activities. The responses coded in this category ranged from simple expressions such as “it aids it”, to more verbose explanations, including:

  • “Knowledge exchange is an important part of all of the above, however they all require a lot more than just passing on knowledge, you have to also pass on your enthusiasm and do this in an enjoyable and engaging manner, to achieve good Outreach, PE or WP.”

This idea of exchanging information and expertise is related to the most popular response to this question (~33%), which was that knowledge exchange is a way of sharing best practices, be it between the expert and the audience in a two-way dialogue (in a similar manner to public engagement), or because it:

  • “Allows external stakeholders to influence our activities but also allows us to share expertise with or influence them.”

Nine of the respondents (20%) identified knowledge exchange as related to interactions between universities and industry, leading to increased economic activities. Given the nature of knowledge exchange that was laid out in the literature review, it was perhaps surprising to see that only one of the respondents explicitly mentioned HEIF, noting that:

  • “For the first time under the last HEIF round universities were asked to provide Knowledge Exchange Strategies to indicate how they would allocate their HEIF resource.”

Eleven of the respondents (~24%) were unsure of how knowledge exchange was related to outreach, public engagement and widening participation, either because they had “never come across the term 'knowledge exchange'”, or because they were unsure how the terms were all related. Again, whilst the sample size of this survey was too small to make any definite statements, of all the questions that were posed by this survey, it was the issue of knowledge exchange that was responsible for the greatest amount of misunderstanding.

Conclusion

Perhaps the most noticeable result from this study is that the open-ended responses to the survey resulted in a wide range of definitions of outreach, public engagement, widening participation and knowledge exchange amongst the participants, despite the quantitative data indicating that the majority of the respondents felt that their definitions of outreach, widening participation, and public engagement agreed with those of their colleagues. This would seem to indicate that further communication is required both within and between institutes to ensure a level of consistency amongst science communicators, although this specific question will require further study.

The lack of demographic data means that it is not possible to comment in terms of the different roles within the UK science communication community. However, based on the current literature, and the results of this study, the following broad definitions are offered for each of the four considered topics:

Outreach: a one-way discourse, in which scientists communicate their research to the general public, with particular focus on school children and young people.

Public Engagement: a two-way dialogue, in which scientists converse with members of the general public in a mutually beneficial manner.

Widening Participation: any activity that engages with social groups under-represented in HE, in order to encourage them to attend university.

Knowledge Exchange: any activity that involves engagement with businesses, public and third sector services, the community and the wider public, which involves the sharing of best practice, and which can be monitored for funding purposes.

It is acknowledged that there is still some overlap between these definitions, for example a school assembly given by a university researcher at a local school might well be classed as being an outreach, widening participation, and knowledge exchange activity. In such instances it is important to consider the context of these classifications. In this example, the researcher’s faculty might classify the activity as outreach, the university’s widening participation team (or equivalent) may catalogue it as a widening participation activity, and the knowledge exchange offices (or equivalent) could acknowledge it in their records for HEFCE.

It is important for science communicators to consider the context in which their activity takes place, because depending on its classification, the activity may be eligible for different amounts of funding from different areas of resource. This consideration of context is especially important when applying for external funding, where science communicators will be expected to outline the specific area(s) in which their activity can be categorised. These activities are also extremely important in terms of determining the pathways to impact of future REFs, and whilst widening participation tends to align with teaching outcomes, rather than research, it should be acknowledged that widening participation will likely become part of the Teaching Excellence Framework – an outcome-based model that the UK government proposes to use to evaluate the quality of teaching.

The results of the survey also indicate that the respondents were less comfortable defining terminology around knowledge exchange than they were around outreach, public engagement and widening participation. The job titles and functions of respondents may be an important factor here, and further work is needed to confirm this. A future study is planned which also aims to assess how the different perceptions of science communication nomenclature break down according to stakeholders. For example, the ways in which an academic, a museum and a learned society view these definitions might be very different. An international study, with a much larger target audience, is also required so as to assess differences in perceptions of the science communication lexicon between countries, both those traditionally associated with the field and those that are not. This study should also present participants with an open-ended question to define any further terms within the science communication vernacular which they believe to be important, and why.

This study, via a consideration of the literature, and a survey of science communicators, has presented concise and workable definitions for outreach, public engagement, widening participation and knowledge exchange. However, as with all names it is important that the people using them feel comfortable with them, and also that there is at least some form of consistency within the field (and beyond) as to their usage. This consistency will only come about by communication both within and between institutions, and this study aims to act as a starting point for such conversations, with planned future work aiming to further explore the perceptions of science communication and its nomenclature amongst a much wider target audience.

Data availability

The data referenced by this article are under copyright with the following copyright statement: Copyright: © 2015 Illingworth S et al.

Data associated with the article are available under the terms of the Creative Commons Zero "No rights reserved" data waiver (CC0 1.0 Public domain dedication). http://creativecommons.org/publicdomain/zero/1.0/

F1000Research: Dataset 1. Answers to science communication questionnaire, 10.5256/f1000research.6858.d97179 ( Illingworth et al., 2015).

Acknowledgments

We gratefully acknowledge the participation of all of the people who took the time to fill in the survey on science communication practices, and would like to thank the reviewers for constructive comments in the development of this manuscript.

Funding Statement

The authors confirm that this research was conducted without a grant.

[version 2; referees: 3 approved]

Supplementary Material

Science communication questionnaire.

This is the questionnaire that was used in this study to assess practitioners’ definitions of nomenclature in relation to science communication.

References

  1. Andrews E, Weaver A, Hanley D, et al.: Scientists and public outreach: Participation, motivations, and impediments. Journal of Geoscience Education. 2005;53(3):281–293.
  2. Bowler MT, Buchanan-Smith HM, Whiten A: Assessing public engagement with science in a university primate research centre in a national zoo. PLoS One. 2012;7(4):e34505. 10.1371/journal.pone.0034505
  3. Burns TW, O’Connor DJ, Stocklmayer SM: Science communication: A contemporary definition. Public Underst Sci. 2003;12(2):183–202. 10.1177/09636625030122004
  4. Clewell BC, Villegas AM: Creating a nontraditional pipeline for urban teachers: The pathways to teaching careers model. J Negro Educ. 1999;68(3):306–317. 10.2307/2668103
  5. Cohen ER, Masum H, Berndtson K, et al.: Public engagement on global health challenges. BMC Public Health. 2008;8:168. 10.1186/1471-2458-8-168
  6. Deas I, Robson B, Wong C, et al.: Measuring neighbourhood deprivation: A critique of the Index of Multiple Deprivation. Environ Plann C Gov Policy. 2003;21(6):883–903. 10.1068/c0240
  7. Ecklund EH, James SA, Lincoln AE: How academic biologists and physicists view science outreach. PLoS One. 2012;7(5):e36240. 10.1371/journal.pone.0036240
  8. Frost P: The CTY summer school model: Evolvement, adaptation and extrapolation at the National Academy for Gifted and Talented Youth (England). High Ability Studies. 2005;16(1):137–153. 10.1080/13598130500115379
  9. Goddard J: Reinventing the Civic University. 2009.
  10. Greenler RG, Lasca NP Jr, Brooks AS, et al.: The Science Bag™ at the University of Wisconsin–Milwaukee: A successful forum for science outreach. Am J Phys. 1993;61(4):326–329. 10.1119/1.17263
  11. HEFCE: Higher Education Innovation Fund round 3: Invitation and guidance for institutional plans and competitive bids. 2005.
  12. Holliman R, Whitelegg L, Scanlon E, et al.: Investigating science communication in the information age: Implications for public engagement and popular media. Oxford University Press, 2009.
  13. Illingworth S, Redfern J, Millington S, et al.: Dataset 1 in: What’s in a Name? Exploring the Nomenclature of Science Communication in the UK. F1000Research. 2015. 10.5256/f1000research.6858.d97179
  14. Jones R: What have we learned from public engagement? Nat Nanotechnol. 2007;2(5):262–263. 10.1038/nnano.2007.123
  15. Kelter P, Hughes K, Murphy A: Science Outreach for the 1990s. Sch Sci Math. 1992;92(7):365–369. 10.1111/j.1949-8594.1992.tb15610.x
  16. Kettley N: The past, present and future of widening participation research. Brit J Sociol Educ. 2007;28(3):333–347. 10.1080/01425690701252531
  17. Krasny ME: University K–12 Science Outreach Programs: How Can We Reach a Broad Audience? Bioscience. 2005;55(4):350–359. 10.1641/0006-3568(2005)055[0350:uksoph]2.0.co;2
  18. Mangan J, Hughes A, Davies P, et al.: Fair access, achievement and geography: Explaining the association between social class and students’ choice of university. Stud High Educ. 2010;35(3):335–350. 10.1080/03075070903131610
  19. McCaig C, Adnett N: English universities, additional fee income and access agreements: Their impact on widening participation and fair access. Brit J Educ Stud. 2009;57(1):18–36. 10.1111/j.1467-8527.2009.00428.x
  20. Nattkemper TW: Are we ready for science 2.0? Paper presented at KMIS 2012 – Proceedings of the International Conference on Knowledge Management and Information Sharing. 2012.
  21. Neresini F, Bucchi M: Which indicators for the new public engagement activities? An exploratory study of European research institutions. Public Underst Sci. 2011;20(1):64–79. 10.1177/0963662510388363
  22. O’Doherty KC, Hawkins A: Structuring public engagement for effective input in policy development on human tissue biobanking. Public Health Genomics. 2010;13(4):197–206. 10.1159/000279621
  23. Parkhill K, Pidgeon N: Public engagement on geoengineering research: preliminary report on the SPICE deliberative workshops. Understanding Risk Working Paper (2011–11). 2011;29.
  24. Petts J: Public engagement to build trust: False hopes? J Risk Res. 2008;11(6):821–835. 10.1080/13669870701715592
  25. Rowe G, Frewer LJ: A typology of public engagement mechanisms. Sci Technol Hum Val. 2005;30(2):251–290. 10.1177/0162243904271724
  26. Schäfer MS: From public understanding to public engagement: An empirical assessment of changes in science coverage. Sci Commun. 2009;30(4):475–505. 10.1177/1075547008326943
  27. Watson D: Managing civic and community engagement. McGraw-Hill International. 2007.
  28. Wynne B: Public engagement as a means of restoring public trust in science – hitting the notes, but missing the music? Community Genet. 2006;9(3):211–220. 10.1159/000092659
F1000Res. 2015 Oct 7. doi: 10.5256/f1000research.7640.r10466

Referee response for version 2

Cornelia Lawson 1

The authors have provided a thorough review addressing all reviewer comments. The new version provides a much clearer analysis and robust conclusions. Shortcomings are addressed and I very much hope the authors will be able to further pursue their research to answer some of the questions that have remained.

I have read this submission. I believe that I have an appropriate level of expertise to confirm that it is of an acceptable scientific standard.

F1000Res. 2015 Sep 29. doi: 10.5256/f1000research.7640.r10465

Referee response for version 2

Sarah R Davies 1

The authors have thoroughly responded to the reviewers' comments, undertaking a more robust analysis of the data and offering more nuanced reflections on both the data and the wider literature. As a result it's a much stronger and more interesting paper. I very much enjoyed the opportunity to engage with it.

I have read this submission. I believe that I have an appropriate level of expertise to confirm that it is of an acceptable scientific standard.

F1000Res. 2015 Aug 19. doi: 10.5256/f1000research.7380.r9682

Referee response for version 1

Cornelia Lawson 1

The article looks at a series of, perhaps, interchangeable terms used in the context of science communication and knowledge exchange and how they are understood by those working in academia. It provides an informative and brief literature review and concise definitions for each of the terms: outreach, public engagement, widening participation and knowledge exchange, and presents results of an original survey of science communicators.

The research question is important and timely as concepts of science communication and knowledge exchange merge and the impact agenda gains in importance in the UK and elsewhere. However, I have some major concerns regarding the empirical analysis. The sampling framework and the sample are not sufficiently described, making it difficult to interpret the results. In particular, at several points the authors mention science communicators but do not specify whether these were the target population. They also make no estimate as to how many people were reached and which sectors they work in. It is therefore not clear whether the final sample can be considered representative (as is claimed in making use of the phrase 'fair cross-section').

The sample is very small and one is left wondering whether interviews with a similar number of respondents would not have enabled a better analysis and insights. The final sample also includes employees of universities and of companies but it is not clear how they are distributed and whether answers differ between the two. This is of importance as some of your conclusions seem to assume a homogeneous institutional environment, i.e. an academic work environment. Moreover, we do not know whether respondents are academics, administrators, or professional science communicators. Not collecting such information is a major flaw.

The use of word clouds to analyse your data strikes me as very odd. The word clouds add no information whatsoever and I would suggest removing them. First of all because they can only highlight word frequency, which they do not do very accurately (e.g. they do not look for word stems), and because they do not allow the reader to know the actual frequency or importance of words but only their relative frequency and importance. Other packages such as NVivo should be used, as they allow the user to identify relationships between words and to deduce linkages between the different answers provided.

Alternatively a comparison of definitions found in the literature and definitions provided by respondents would provide more interesting results and allow some important insights into how concepts are diffusing.

A few final remarks on the literature review: the article does not specify when public engagement and widening participation were introduced as concepts and/or policies in the UK, and I think this could be added. The impact agenda within the UK's REF is mentioned towards the end of the review, and from there it seems that outreach, public engagement and knowledge exchange are part of this agenda, while widening participation is not. Perhaps this could be commented on here and in the conclusion.

I have read this submission. I believe that I have an appropriate level of expertise to confirm that it is of an acceptable scientific standard, however I have significant reservations, as outlined above.

F1000Res. 2015 Sep 14.
Samuel Illingworth 1

Thank you for your helpful comments, which we have taken on board, and which have been used to improve the overall quality of the paper. 

Regarding the sampling framework and the sample size, we agree that these were not discussed in enough depth, but, as noted in the response to the previous comments, this has now been rectified and is discussed in the main body of the paper. Any claims that the paper makes in terms of generalisations have also been re-addressed, as we agree that, given the sample size, such generalisations are not appropriate.

We also agree that not collecting further demographic data in terms of job titles and roles was an oversight, and as discussed in the responses to the previous comments, and also in the paper itself, this is something that we intend to do in future studies, using the findings of this study as a starting point for further investigation. 

We accept that the use of word clouds to illustrate the key points in the data was at best inconclusive and at worst potentially misleading. However, this has now been addressed, with a more in-depth analysis using NVivo now provided, which allows for a more considered examination in terms of the relative frequency and importance of the responses. 

Regarding the introduction of these terminologies in terms of policies, concerns about “access” to higher education began to emerge alongside the expansion of the university sector in the latter part of the 19th century, but a research agenda on differential access only began to emerge following the recommendations of the Robbins Committee in 1963 to expand university attendance. These concerns resurfaced in the 1990s when the divide between universities and polytechnics ended, ultimately leading to a commitment by the 1997 Labour Government to again expand the sector by tackling barriers to higher education. Consequently, Labour established OFFA. This has now been commented on in the literature survey.

In terms of the relationship between widening participation and REF, widening participation tends to align with teaching outcomes, rather than research. However, it is accepted that there is a blurring of the boundaries, e.g. a research-focused public engagement event may well inspire someone to attend university who wouldn't otherwise have done so. In addition, it is important to note that widening participation will likely become part of the Teaching Excellence Framework, an outcome-based model that the UK government proposes to use to evaluate the quality of teaching. This has also been commented on in the conclusions, as suggested.

F1000Res. 2015 Aug 17. doi: 10.5256/f1000research.7380.r9685

Referee response for version 1

Sarah R Davies 1

This is an interesting topic, and one that is ripe for study. As the authors point out, there is a growing amount of activity in and funding for various kinds of science communication (they point to the phenomenon of 'Science 2.0', but they may also like to think about how this relates to scholarship on Mode 2 knowledge production - e.g. see Etzkowitz & Leydesdorff (2000)). There is definitely lots going on, a mass of different terms for related activities, and a degree of slippage between different terminologies.

As the authors themselves point out, though, there are some problems with their methodology, and I also feel that the analysis and discussion would benefit from a fuller engagement with the literature. I'll take the latter point first.

To start with, I think the authors need to make a clearer case for their focus on the terms that their analysis is based on (that's rather clumsily put - but I hope my point will become clear). Their starting point, unreferenced, is that "Science communication is often analogously and interchangeably referred to as science outreach, public engagement, widening participation, and/or knowledge exchange". Yes - but many other terms are also used, including dialogue, PUS, scientific literacy, tech transfer, third stream activity, and co-enquiry. Why were these terms selected (and indeed why not 'science communication' itself)? This sense of arbitrariness is heightened by a literature analysis that bounces around in a somewhat haphazard way. The authors ignore, for instance, work that has attempted to define some of these terms (Burns et al. (2003), whom they cite at the very start of the paper, offer an extended discussion of terms they see as similarly related to science communication, including scientific literacy and PUS; the work of the NCCPE - http://www.publicengagement.ac.uk - is also a valuable resource for seeing how different forms of science communication have been framed). By interpreting the 'participation' that the science communication literature often refers to (e.g. in the Rowe & Frewer article they cite) as being about 'widening participation' initiatives, they also make a bridge to an area of university outreach and external relations that is infrequently discussed in the main body of science communication research (wrongly so, I think - but that's another discussion); rather, the term 'participation' tends to be used in the tradition of deliberative theory and public participation in science research and policy (a randomly chosen example would be Bogner (2012)). All this is to say that things like 'public engagement' and 'widening participation' actually have rather different histories, and different communities around them - so it doesn't surprise me, for instance, that some respondents hadn't heard of widening participation initiatives. These are likely to be organised quite differently within university systems. I was also surprised that the authors didn't spend more time considering other empirical work on practitioner definitions of these terms or of science communication generally. Some of my work has treated this (not that it's necessarily essential reading - but see, e.g., Davies (2013)), but Jason Chilvers, John C Besley, and Kevin Burchell, among others, have also published analyses of how scientists and other practitioners of public communication tend to define and understand it (e.g. Chilvers (2008); also Besley (2010)).

So - in short, I think both the framing of the study, and the analysis and discussion, would benefit from a more thorough engagement with the qualitative literature that has built up around the meaning and practice of science communication. I also have some comments about the methodology and analysis - largely to do with the need for more explanation as to what the former was, and why it was chosen. For instance, the authors note that the survey was "advertised via email, social media accounts, and the ‘psci-comm’ mailing list"; earlier, they say that the study aimed to understand opinion on definitions of the four terms "in UK HEIs". What, exactly, is the target participant group? Everyone working in UK HEIs? A certain subset of this population, those who are interested in science communication? The HEIs themselves, as institutions (i.e. as organisations with particular brands)? How was the sampling strategy (the distribution of the survey) designed to reach the desired population? In terms of the survey advertisement via "email and social media accounts": email to whom, and why, and which social media accounts (and why - but you get the picture...)? The PSCI-COMM list is distributed to a large group of those interested or working in or researching science communication, in the UK but also internationally. Is the target population therefore 'science communicators'? (In practice almost all respondents had participated in science communication in some way, so perhaps so. But this needs to be clear.) [Apologies - I've just re-read your comment in the conclusion that respondents only being active science communicators is a weakness of the study. But in that case, if you're interested in everyone working in UK HEIs from chancellors to cleaners, you need to justify why you used a survey methodology, and why you thought your sampling strategy would reach everyone.]

I would also like to know more about how the analysis, and particularly the qualitative analysis, was carried out. Using word clouds is a rather basic means of analysis, as it tells us only how often a word is cited, not the context in which it is used or the meaning that is attached to it; discussion of the qualitative responses is therefore important. Were these coded in some way? How were themes robustly identified? The data evidently did not reach saturation (to use a term from grounded theory), as the authors emphasise the diversity in the definitions given. Do you think you needed a larger sample size, or is there so much interpretative flexibility in these terms that saturation would never be reached?

The authors do note, in the conclusion, that the sample size is a weakness of the study, and I would agree with this. As far as I can tell you were also not able to identify the role or situation of your respondents, only whether they had participated in science communication activities in the past. This, to me, also weakens the results significantly - or at least represents a lost opportunity. There are big differences between different kinds of communities within universities and as participants in communication (e.g. scientists, outreach officers, admin staff, PR teams, tech transfer offices...). It seems unlikely that these communities would have homogeneous definitions of the terms in question - or even have heard of them all (as your findings suggest).

I want to close - and I do apologise for the lengthy review - with a broader point. The issues that you touch upon raise some fascinating questions. I would love to know more, for instance, about the relations between individuals and teams in universities working on 'widening participation' and on public engagement-type projects, and I would be very curious to know if academic staff give different kinds of definitions of these terms to those who organise science communication on a professional basis. In this regard I would encourage you to look beyond finding the 'right', or even commonly used, definitions of particular terms. Alan Irwin has talked about 'third order' studies of science communication, which explore how different terms are mobilised by different groups, and the kinds of effects that this has (see Irwin (2008) In: Bucchi M and Trench B (eds), Handbook of Public Communication of Science and Technology, London and New York: Routledge). Your study suggests interesting ways to explore how very different meanings can be applied to the same terms - it would be great to hear more, in the future, about how these different meanings are made to matter in particular contexts (such as, to go back to the very beginning, moves towards Mode 2 and entrepreneurial universities). I wish you all the best with this future research.

I have read this submission. I believe that I have an appropriate level of expertise to confirm that it is of an acceptable scientific standard, however I have significant reservations, as outlined above.

F1000Res. 2015 Sep 14.
Samuel Illingworth 1

Thank you for your comments, which we found extremely useful, and which have helped us to reshape the paper into a more effective and considered study. 

The comment regarding the choice of outreach, public engagement, widening participation and knowledge exchange, and why these terms were selected, is a pertinent one. However, we believe that in the context of UK institutions, these are the phrases that are most readily used in relation to science communication. PUS and scientific literacy are terms that are, in our experience, used more in an American and European context, whereas tech transfer and third stream activity would arguably be covered by knowledge exchange. However, we agree that the wording in the text was a little strong, and it has now been changed. We have also noted in the conclusions that future studies should present participants with an open-ended question asking them to define any further terms within the science communication lexicon which they believe to be important, and why.

Regarding the somewhat haphazard structure of the literature analysis, we think that the literature we have engaged with illustrates that there is no concise definition of the terms we have chosen, and as such it serves its purpose as a useful introduction and rationale for this study. It is understood that the selected terms have different historical and academic contexts, but we believe that the literature we have engaged with also illustrates this point.

As noted in the response to the previous reviewer’s comments, we have now addressed the issue of the sample size and the sampling strategy, and accept that not doing so in the previous version of this paper was an oversight. It should also have been made more apparent that only the responses from UK-based participants were included in the analysis, and this has now been corrected in the text. In addition, only the participants who reported being active in science communication were included in the analysis; again, this has now been made clearer in the text.

We agree that Wordle was not the most academically rigorous way of presenting our data, and that word frequencies could be potentially misleading. As such the word clouds have been replaced with tables that outline the major themes for the thematic analysis, and the frequencies associated with them. A far more detailed description of how this thematic analysis was carried out using NVivo is also given. Through an open coding approach, descriptive saturation was reached, and this is also now discussed in detail in the text. 

Again, we acknowledge that the lack of sufficient demographic data reduces the potential depth of the analysis, representing a lost opportunity that we hope to address in future studies. This will include, as noted in the conclusions, investigating how different stakeholders define different aspects of science communication, and how and why this changes depending on job title, both within and between institutes.

Thank you for your suggestions regarding future studies, especially your comments regarding how definitions of the nomenclature may change depending on job role. This has now been further discussed in the conclusions on future work, where a broader study with more demographic data will allow such questions to be targeted. We agree that there may well be no ‘right’ definition of particular terms, but having broad and workable descriptions might well help improve communication between the different practitioners of science communication.

F1000Res. 2015 Aug 10. doi: 10.5256/f1000research.7380.r9683

Referee response for version 1

Paige Brown Jarreau 1

Summary: A re-analysis of the data and a significant re-write are advised.

The authors have provided a fine, if brief, literature overview of definitions of science outreach, public engagement, widening participation and knowledge exchange. This is also a timely topic, as the need for evidence-based science communication has grown in today's complex ecosystem of new science media and outreach efforts on the part of scientific institutions.

However, there are major flaws in this paper as it is currently written and presented. The sample is extremely small for an online survey - this sample would have been more appropriate for in-depth interviews with each respondent. This is acceptable as long as the audience/population is clearly defined. Are those who answered the survey mostly active scientists, or mostly dedicated science communicators? This could make a big difference to participants’ definitions of science communication, outreach, public engagement, widening participation and knowledge exchange.

Why only 47 respondents? The ‘psci-comm’ mailing list serves a much larger number of people than this (and the authors also don’t make any estimate of how many people they reached with their survey). How long was data collected for? The authors should more clearly spell out their sampling goals and procedures.

“These results suggest that in a still emergent field, the participants of this survey are likely to be the driving influence behind the definition of science communication at an institutional level.” How so? Please make this statement, and how/why this conclusion was reached, more clear.

“If the participants that took part in this survey represent a fair cross-section of people working in science outreach, public engagement, and widening participation across the UK then it is somewhat alarming that such a significant proportion of them feel as though the fundamental basis on which their work is founded lacks such clemency in its definitions.” The authors should be VERY careful in making any statement regarding the implications of this survey representing a fair cross-section of people working in science outreach, etc., as per my comments on the sampling procedure above. The authors don’t present any data on the basic demographics of the respondents. This was a critical oversight. 

In the analysis of survey responses, I am not under the impression that Wordle is an academically rigorous method for mapping word and concept frequency. Other qualitative analysis tools such as AtlasTi would have been preferable, even if figures were created in Wordle for demonstration purposes only. Wordle images, while visually appealing, make it very difficult to quickly gauge relative word frequency.

The authors’ methodology in qualitatively analyzing, coding and interpreting the survey responses is very unclear, if not missing altogether. How were the survey responses approached during data analysis? Were the authors primarily looking for responses that corresponded with how the scientific literature defines concepts of outreach, engagement, etc., or were novel or conflicting definitions also coded and analyzed (e.g. open vs. closed coding)? From the description of survey response analysis, it is not clear if the data analysis reached saturation, or what theory or framework guided the textual analysis.

It would be very useful to know the demographic information of those respondents who provided alternative definitions of outreach and engagement, and of those respondents who were unable to define widening participation, etc. Are these scientists? Are these professional science communicators? This type of basic information would enrich the meaning of these results. The authors state only in their conclusion section that the respondents were “active science communicators.” What type of science communicators? How many were involved in academic institutions? I think this type of information is paramount to the authors’ interpretation of their findings, and the lack of this data is a significant weakness in this paper.

What insights did we glean from the authors’ survey of 47 science communicators regarding the definitions and terminology surrounding science outreach, public engagement, widening participation and knowledge exchange beyond what the authors presented in their literature review? What insights to the field and definition of science communication were uniquely contributed by this study? In this the authors are also unclear. It would have been useful for the authors to compare and contrast at greater length their respondents’ definitions of these concepts to published scientific literature definitions.

I have read this submission. I believe that I have an appropriate level of expertise to confirm that it is of an acceptable scientific standard, however I have significant reservations, as outlined above.

F1000Res. 2015 Sep 14.
Samuel Illingworth 1

Thank you for your comments, which were very insightful, and which we have used to address these flaws, thereby helping to improve the academic rigour of the paper. 

Estimating the number of active science communicators in the UK is beyond the scope of this study. However, given that this study aims to provide an initial scoping exercise into the thoughts and consistency of active science communicators across the UK, and taking into account the limited time frame and zero budget of this study, an ideal sample size of between 50 and 100 participants was chosen for the survey. Given the limitations in budget (which also precluded a more in-depth interviewing / focus group approach), a convenience sampling strategy was adopted, in which the survey was advertised using the ‘psci-comm’ mailing list hosted by JISCMail, as well as through the Twitter accounts of the authors, all of whom are active participants in science communication networks across the Twittersphere. The target audience was people who identified themselves as being active UK science communicators, which is why this particular mailing list was adopted. Given that the psci-comm mailing list contains several hundred active science communicators, and that between them the authors have several hundred Twitter followers who identify themselves as UK-based science communicators, it is disappointing that more people were not able to participate in the survey, but we believe that the number of responses is still sufficient for the purposes of this study. This has now been made clearer in the text.

Regarding the two statements relating to science communication at an institutional level and the cross-sections of science communicators that were represented, we agree that it was inappropriate to make such generalisations, and so these have been removed from the text. 

We agree that Wordle was not the most academically rigorous way of presenting our data, and that word frequencies could be potentially misleading. As such the word clouds have been replaced with tables that outline the major themes for the thematic analysis, and the frequencies associated with them. A far more detailed description of how this thematic analysis was carried out using NVivo is also given. Through an open coding approach, descriptive saturation was reached, and this is also now discussed in detail in the text. 

In terms of the lack of demographic data collected for this survey, we realise that this was an oversight, and we have commented on this in the text. However, we still believe that this study is worthwhile, and that it presents a relatively concise set of definitions that can be used for further discussions.

F1000Res. 2015 Aug 6. doi: 10.5256/f1000research.7380.r9684

Referee response for version 1

Laura Fogg Rogers 1

This article attempts to establish common definitions for interchangeable terms used in science communication through a literature review and survey of science communicators. It is a timely article, as this is a long-running debate in both academic and practitioner circles, and relates to discussion in the literature about funding, incentivising and rewarding science communication. More particularly, it links directly to efforts to define the ‘impact’ of science communication through processes such as the Research Excellence Framework and the Public Engagement with Research agenda. The article is well grounded in this literature and refers to many of the current debates in this field.

The article uses survey results to move forward the debate on these definitions, attempting to find consensus amongst science communication practitioners. However, there were only 47 participants involved in the survey, which is a low sample size. The authors acknowledge this point, however, and also note that it may be useful to repeat the survey with researchers and professionals who don’t consider themselves science communicators.

Despite these limitations, the survey results indicate that while 66% of respondents state that their definitions match those of colleagues, the resulting qualitative data indicates wide differentiation. These results are presented through Wordle diagrams. While these are visually interesting, further qualitative thematic analysis, along with matrix analyses for coding frequency, could also have added to the results. Percentages are given for word frequency, but a methodology for how this was obtained would be useful.

The resulting concluding definitions are a useful addition to the field, but still produce considerable overlap between activities. The article notes that further work is needed, and I would concur that this is a topic ripe for more in depth research.

I have read this submission. I believe that I have an appropriate level of expertise to confirm that it is of an acceptable scientific standard.

F1000Res. 2015 Sep 14.
Samuel Illingworth 1

Thank you for your comments, which are greatly appreciated, and which we have used to improve both the content and presentation of our study. 

As you have pointed out, we acknowledge that the number of participants in this survey is small; the purpose of this study was to act as an initial scoping exercise amongst science communicators, to try to determine how they would summarise the terms outreach, public engagement, widening participation and knowledge exchange within the umbrella term of ‘science communication’. The idea for this paper was to begin to further develop the conversation that needs to take place regarding how science communicators use the terminology in their fields, initially on a UK-wide basis.

Given that limited demographic data was collected in this study (which was, in hindsight, an oversight), the Wordle diagrams were simply a tool to indicate the frequency with which certain words occurred. However, it was not at all clear from the original paper how this had been done, and further thematic analysis has now been carried out using NVivo, with the text being revised to reflect this. The word clouds have also been replaced with tables that outline the major themes from the thematic analysis, and the frequencies associated with them.

Associated Data

    This section collects any data citations, data availability statements, or supplementary materials included in this article.

    Data Citations

    1. Illingworth S, Redfern J, Millington S, et al.: Dataset 1 in: What’s in a Name? Exploring the Nomenclature of Science Communication in the UK. F1000Research. 2015. Data Source

    Supplementary Materials

    Answers to science communication questionnaire

    These are the responses to the questionnaire that was used in this study to assess practitioners’ definitions of nomenclature in relation to science communication.

    Copyright: © 2015 Illingworth S et al.

    Data associated with the article are available under the terms of the Creative Commons Zero "No rights reserved" data waiver (CC0 1.0 Public domain dedication).

    Data Availability Statement

    The data referenced by this article are under copyright with the following copyright statement: Copyright: © 2015 Illingworth S et al.

    Data associated with the article are available under the terms of the Creative Commons Zero "No rights reserved" data waiver (CC0 1.0 Public domain dedication). http://creativecommons.org/publicdomain/zero/1.0/

    F1000Research: Dataset 1. Answers to science communication questionnaire, 10.5256/f1000research.6858.d97179 ( Illingworth et al., 2015).

