Proceedings of the Royal Society B: Biological Sciences
. 2022 Aug 10;289(1980):20221077. doi: 10.1098/rspb.2022.1077

Promoting scientific literacy in evolution through citizen science

Miriam Brandt 1, Quentin Groom 2, Alexandra Magro 3,4, Dusan Misevic 5,6, Claire L Narraway 7, Till Bruckermann 8,9, Anna Beniermann 10, Tom Børsen 11, Josefa González 12, Sofie Meeus 2, Helen E Roy 13, Xana Sá-Pinto 14, Jorge Roberto Torres 15, Tania Jenkins 16
PMCID: PMC9363982  PMID: 35946159

Abstract

Evolutionary understanding is central to biology. It is also an essential prerequisite to understanding and making informed decisions about societal issues such as climate change. Yet, evolution is generally poorly understood by civil society and many misconceptions exist. Citizen science, which has been increasing in popularity as a means to gather new data and promote scientific literacy, is one strategy through which people could learn about evolution. However, despite the potential for citizen science to promote evolution learning opportunities, very few projects implement them. In this paper, we make the case for incorporating evolution education into citizen science, define key learning goals, and suggest opportunities for designing and evaluating projects in order to promote scientific literacy in evolution.

Keywords: evaluation, evolution misconceptions, education, learning, public participation in scientific research

1. Introduction

In a society fundamentally shaped by science and technology, scientific literacy is crucial in order to respond in a meaningful way to issues that pervade daily life and political actions. One scientific field for which this ‘everyday working knowledge of science’ [1] is particularly important is evolution. Evolutionary processes shape all aspects of the natural world [2], and many of the complex global challenges humanity is facing, such as human health (e.g. zoonotic diseases [3], antibiotic resistance [4], human microbiome [5]), food security [6] and biodiversity loss [7] are based on evolutionary processes. Furthermore, evolution has been applied in many fields outside biology, e.g. forensics [8], software development [9] and architecture [10]. Limited understanding of evolution can profoundly impair one's ability to make rational decisions on societal issues [11]. For instance, the COVID-19 pandemic has demonstrated that evolution impacts our daily lives: a genetic sequence inherited from Neanderthals increases the odds of hospitalization [12] and SARS-CoV-2's evolutionary history and ongoing evolution informs vaccine development [13].

Despite its importance, evolution is generally poorly understood [14] and is not always accepted by the public [15]. Understanding evolution requires more than just the learning of ‘facts’—promotion of scientific literacy in evolution is necessary. Scientific literacy involves being able to explain phenomena scientifically, to evaluate and design scientific inquiry, and to interpret data and evidence scientifically [16]. These require knowledge about the content of science (content knowledge), an understanding of scientific methods (procedural knowledge) and insights into how scientific knowledge is created (epistemic knowledge) [17]. In addition, the ability to use scientific knowledge and reasoning in different situations (knowledge application) is required [18].

Citizen science (CS), defined here as the participation of non-professional scientists in research, is a suitable tool for increasing scientific literacy [19]. Indeed, CS projects provide an excellent context for learning: they are often rooted in real-life contexts, present cognitive challenges, and offer participation in hands-on scientific tasks [20]. These aspects are generally acknowledged as being essential ingredients of active learning [21], suggesting CS can achieve educational impacts. Unfortunately, its learning dimension is underexplored [22], and evidence for learning outcomes is scant [19,23,24].

Despite the centrality of evolution to biology, very few biology CS projects frame their activities in an evolutionary context. For example, of the 1603 projects on the CS platform SciStarter (https://scistarter.org/, as of June 2022), 672 are in ‘ecology and environment’, while only 14 mention evolution. We consider this to be a missed opportunity for promoting scientific literacy in evolution.

Here, we define different types of learning outcomes, describe challenges in promoting scientific literacy in evolution through CS, give recommendations, provide guidelines on how to design for learning, and evaluate the outcomes. While we focus on CS in evolution, many of our recommendations on creating and evaluating learning opportunities are more generally applicable to other fields of biology and CS more broadly. For example, learning opportunities for epistemic knowledge can be included in any CS project, regardless of the investigated topic (see section ‘Creating learning opportunities in citizen science projects’). In addition, evaluation of learning outcomes in CS projects is currently rare. The suite of instruments that we suggest (see section ‘Evaluating learning outcomes in citizen science projects') could be applied to any CS project looking to assess participant learning. To effectively apply our recommendations to other fields, researchers need to be clear about the learning outcomes they want to achieve and consider why participants might be interested in their project.

2. Key learning goals for scientific literacy in evolution

Including an educational dimension in a CS project requires being clear about its scientific goals and possible learning outcomes. Right from the beginning, aligning these outcomes with project goals and educational opportunities in the design is essential [24]. To increase scientific literacy, four key learning goals are crucial: content knowledge, procedural knowledge, epistemic knowledge and knowledge application (table 1). For the many other worthwhile outcomes of CS projects, such as those relating to behaviour, interest, self-efficacy and motivation, we refer the reader to other frameworks [24,25]. Next, we explore the importance of the four learning goals in the context of evolution.

Table 1.

Examples covered by the four learning goals.

content knowledge: phenotypic variation; heritability of traits; selective pressure; adaptation

procedural knowledge: observing variability within a population; recording changes in a certain trait over time; aligning DNA sequences; formulating hypotheses and designing studies

epistemic knowledge: the meaning of considering evolution a ‘theory’; understanding that scientific knowledge is constantly changing through the addition of new evidence; understanding that science is embedded in society and influenced by cultural norms

knowledge application: understanding, being able to discuss and/or making informed decisions about issues such as the emergence of new SARS-CoV-2 strains and the impact of COVID-19 vaccines; the importance of crop biodiversity for food security; the impact of invasive species

(a) . Content knowledge

Developing a good understanding of evolution and using evolutionary knowledge to explain biological scenarios requires a grasp of key concepts. Evolutionary theory rests on a network of foundational disciplines ranging from genetics to ecology and geology. Thus, understanding evolution requires synthesis and coordination of multiple perspectives, which is a challenge for learning and teaching [26]. This starts with understanding key concepts, such as ‘adaptation’, ‘variation’ and ‘selective pressure’, and words like ‘theory’ or ‘fitness’ (see ‘Communication issues'), in order to structure the acquired knowledge [27].

(b) . Procedural knowledge

Within CS projects, participants may be more familiar with certain types of procedural knowledge such as species identification, whereas they may be less familiar with others, such as analysing data and discussing evidence [28]. Procedural knowledge is important in the context of evolution because many evolutionary processes cannot be directly observed or subjected to experimentation, either because they took place in the past, and/or because they occur over large temporal and spatial scales, which may hinder understanding [29].

(c) . Epistemic knowledge

CS projects may also constitute a way to increase public understanding of the nature of science, that is, the characteristics of scientific knowledge and the way it is produced [30]. Research results are initially uncertain, can be contradictory, and are not definitive. Interpreting research results appropriately therefore requires a differentiated view of findings, from new and still uncertain results to accepted facts. This is especially pertinent with regard to evolution, as scientific debate over new results on evolutionary mechanisms is sometimes interpreted as disagreement within the scientific community on whether or not evolution happens [31]. Indeed, it has been shown that understanding the nature of science increases students' acceptance of evolution [32].

(d) . Knowledge application

Scientific literacy in evolution is required for citizens to understand how the world works as a system and inform decisions regarding global challenges [3,33]. It is therefore important that they are able to apply evolutionary knowledge learned in projects to other situations [34].

Although CS projects may promote learning across all four dimensions of scientific literacy, it is unlikely to be possible to address them all equally well. Which learning goals can realistically be achieved depends on the specific topic, methodology and project set-up. We will elaborate on how to create learning opportunities on evolution, after considering some important barriers to learning about evolution.

3. Barriers to learning evolution

Identifying barriers to learning evolution is essential in order to design for learning. Here, we describe three types of barriers: misconceptions, conflicts with established culture and values, and communication issues.

(a) . Misconceptions about evolution

A key challenge for scientists trying to increase scientific literacy in evolution is that important details of evolution by natural selection are often misinterpreted. For example, many people are not aware that mutations are random and have a range of effects, that the potential for adaptability is not unbounded, or that ‘survival of the fittest’ refers to how organisms compare with each other rather than to some absolute fitness metric. Indeed, misconceptions are frequent and widespread across different demographic groups, including young students, teachers and the general public [35–37].

Evolution involves concepts that are abstract or counterintuitive, such as probability, randomness and the vast spatial and temporal scales over which evolution occurs [38,39]. In addition, understanding evolution requires linking a number of complex concepts, and misconceptions about any one of them will impact the understanding of the others [40].

Misconceptions exist even among those who accept evolution [41] and are remarkably resilient to instruction [42]. Additionally, they can be context dependent: students may provide correct explanations for trait gain in one organism but fail to transfer that explanation to another species [43].

(b) . Conflicting culture and values

Educational approaches that focus on increasing knowledge about evolution might fail if they conflict with the culture and values of participants [16]. As public attitudes toward evolution are sometimes negative [44], these attitudes should be considered a key factor when implementing projects on evolution. Probably the most persistent example of such a conflict is that between religion and acceptance of evolution [45]. This conflict predominantly arises between evolution and some denominations of Christianity and Islam [38], and its extent is highly country dependent [46].

Acceptance of evolution is also influenced by the total number of years spent in education [38], understanding of the nature of science [47], attitudes towards science [48], knowledge/understanding of evolution [49], and gross domestic product per capita [50]. Additionally, there is still debate about the relationship between acceptance and actual understanding of evolution, with conflicting evidence for a strong positive correlation [51], a weak positive relationship [36,37,48], or no correlation at all [52].

(c) . Communication issues

Effective communication in CS projects is challenging as scientists are predominately trained to communicate using specialized terminology. Moreover, some evolutionary terminology has different meanings in the scientific community and in colloquial language [53]. For example, ‘evolution’ is used colloquially to mean ‘change over time’, stripping it of its scientific meaning [54]. Similarly, colloquially, ‘theory’ is something unproven [55], and ‘selection’ implies a conscious selector [56]. Finally, translation between different languages may introduce an additional layer of ambiguity. In Romance languages there is no word for ‘fit’, and in Serbian and French ‘fitness’ is often translated as ‘adaptive value’, which could unintentionally imply an adaptationist view.

4. Creating learning opportunities in CS projects with a focus on evolution

Despite the huge potential for CS to achieve learning goals [19], this dimension is often underexploited [22]. One indirect way of achieving learning is to raise the level of participation that the project offers [57]. However, offering higher levels of engagement, such as the additional opportunity to analyse data, does not necessarily increase the learning outcomes [58]. Therefore, to achieve broader educational impacts, increasing the level of engagement will not suffice. When CS project initiators decide to include a learning dimension, their efforts will yield better results if learning goals and opportunities are clearly defined from the outset. We now consider how existing projects have designed learning opportunities in evolution, focusing on the learning dimensions defined above. Table 2 provides suggestions on how to promote learning opportunities in evolution in CS projects, which researchers can choose from and tailor to the goals and circumstances of the project (e.g. resources and expertise of the team).

Table 2.

Examples of opportunities to promote learning on evolution in CS projects. The selection of measures implemented will depend on the goals and circumstances of the project.

curriculum-based activities (implementation example: implement activities with school classes)
- consider collaborating with teachers and education researchers [59]
- align educational activities with national curricula to make them attractive for educators [60]
- identify the requirements and expectations of teachers and students [60], perhaps with the help of a logic model [61]

co-design of the project (implementation example: involve participants in developing research questions, study design, data analysis and/or communication)
- consider co-design to broaden learning opportunities for epistemic knowledge and knowledge application [62–64]
- implement learning activities prior to or during co-creation processes [20], so participants can contribute meaningfully
- allow and value contributions from multiple experiences and backgrounds to enhance learning and ownership [65]
- engage participants in the design of outreach strategies [66] to promote positive attitudes

data collection, data analysis and understanding the nature of science (implementation example: provide training resources to underpin data collection, data analysis and background context)
- explicitly teach participants about the steps of scientific inquiry [67]
- combine teaching the necessary skills with (i) evolutionary background to provide conceptual context [20] and (ii) explaining the value of rigorous data collection and analysis [68]
- encourage participant feedback to improve and develop the study methods [69]
- give participants the opportunity to engage in different tasks [70]

gamification (implementation example: gamify evolutionary content and/or participation, e.g. achievement badges)
- use gamification to sustain participant interest and to motivate people who are not intrinsically motivated to participate in learning opportunities [71,72]
- use gamification of participation to help participants develop a feeling of self-efficacy [73]
- be careful not to oversimplify information about evolution in games, as this may generate misconceptions [74]

communicating with participants (implementation examples: uni-directional communication such as emails, social media, websites and field guides, as well as dialogue/social interactions online or in person at formal or informal meetings)
- engage in active public relations work [75]
- acknowledge participants' contributions, as this helps to maintain their interest [64,76]
- show respect for the differing cultural, religious and educational backgrounds of participants [32]
- share data, results and information on how the data are used to evaluate potential evolutionary explanations [77,78]
- invest in creating social interactions, as these promote learning and positive attitudes towards science [79]
- refer participants to other projects in evolution to keep them engaged and increase learning outcomes [80]
- make content more accessible by explaining real-world relevance [20] and through storytelling [81]
- use clear language: be careful when using terms that have different meanings colloquially [55]

promoting peer-to-peer participant communication (implementation examples: narrative story-telling by participants such as photo diaries, online communication such as social media and blogs, and formal and informal meetings)
- have participants communicate knowledge from long-term memory, as this active application increases learning [82]
- reflect with participants on their peer-to-peer communication to avoid the spread of misconceptions
- discuss with participants which points they communicate, including relevant background [83]
- encourage more advanced participants to teach beginners (near-peer teaching) to benefit learning for both [84]
- support critical thinking by encouraging participants to discuss how their findings build evolutionary knowledge [77]

(a) . Creating learning opportunities for content and procedural knowledge

Simply presenting concepts or theories, and describing the scientific methods applied to evolutionary research, cannot be assumed to automatically increase citizen scientists' understanding of evolution [85]. To foster content and procedural knowledge, projects should provide active learning situations, supported by educational resources adapted to the misconceptions, cultures and values of different groups. This occurred in ‘Evolution MegaLab’ [86], which mobilized 6461 registered participants to survey colour morphs of banded snails to map the effects of climate change. Communication resources explaining the evolutionary background of morph variation were adapted to different target audiences, and participants received immediate feedback on their results. As a result, the project helped participants grasp the notion that evolution can be observed directly.

Likewise, in the ‘1000 Gardens’ project [87], 2492 registered participants engaged in an artificial selection experiment that provided data on the performance of soya bean genotypes at different latitudes. The theoretical background was explained in the context of the broader experimental design, and participants performed a small part of the experiment in their own garden. At the end of the project, the results and conclusions were shared with participants [88].

Such hands-on involvement also contributes to the acquisition of skills and methods relevant to studying evolution (procedural knowledge). For instance, in ‘Melanogaster: catch the Fly!’ [89], participants (about 320 school students to date) have the opportunity to learn about bioinformatics and use these tools to analyse evolution at the genomic level.

(b) . Creating learning opportunities for epistemic knowledge

While participants may gain increased content and procedural knowledge, there is no consensus in the literature on whether this leads to an increased understanding of the nature of science [90] or influences people's acceptance of evolution [32]. Participants grasp major aspects of the nature of science more easily when they conduct experiments [91]. However, this may not be enough [92], and resources specifically designed to address distinct components of the nature of science are needed. The ‘Pieris project’ [93], with participants from 30 US states and 32 countries, examines how organisms respond to environmental change, provides information about the diversity of methods employed to infer the history of cabbage white butterfly populations, and presents the empirical evidence supporting inferences on the history of invasion. Furthermore, it addresses the question of how to deal with uncertainty, illustrating that science is open to revision in the light of new evidence.

(c) . Creating learning opportunities to foster knowledge application

To achieve a larger impact on scientific literacy, projects with a focus on evolution should empower participants to apply acquired knowledge to new situations by highlighting its broader relevance and encouraging further engagement with other projects or communities. Many projects include blogs, or are connected to social platforms, fostering interaction with a broad spectrum of perspectives beyond the project's central subject [94]. ‘SquirrelMapper’ [95], a project that examines rapid adaptation to a changing environment in eastern grey squirrels, and which has amassed approximately 25 000 participants, goes even further. It gives citizen scientists the opportunity to apply their acquired knowledge to another CS project regarding the management of grey squirrels in cities, promoting engagement with other sectors of society.

(d) . Designing learning opportunities to address misconceptions

The first step in dealing with misconceptions is to anticipate them [96]. The KAEVO 2.0 instrument [36] can be used by CS projects to assess knowledge and misconceptions about evolution [97]. Then, rather than simply communicating facts, projects need to encourage participants to exercise critical thinking [32]. Thus, project initiators should give participants the opportunity to test their prior knowledge by offering situations that challenge likely misconceptions [96]. As misconceptions are tenacious, it is important to revisit them frequently and to assess the validity of participants' understanding (including by self-assessment). Social interactions that give space for conflicting viewpoints and communication, in addition to being beneficial for learning, also help to overcome misconceptions [98]. As such, it is useful for initiators to implement an array of approaches to improve interaction and to offer choices that accommodate participants' differences. This could also increase engagement and fidelity, which reinforce learning [99].

5. Evaluating learning outcomes in evolution in citizen science projects

Designing to promote scientific literacy is not sufficient on its own, as design alone does not guarantee uptake by participants. For instance, if learning opportunities are not pitched at the right level they may not be used, since both overtaxing participants and demanding too little of them are discouraging [100]. To find out whether approaches are effective, we need to assess the learning outcomes achieved.

Although there are opportunities for learning in CS, the evidence of learning outcomes, especially with respect to scientific literacy, is sparse [23,24]. For example, in a non-exhaustive literature search of SciStarter, Google Scholar and Web of Science, we identified 58 CS projects on evolution, 38 of which (65%) claimed to have a learning outcome. Of those, only 10 (26%) actually evaluated it. Of the five projects described above as providing learning opportunities (Evolution MegaLab, 1000 Gardens, Melanogaster: catch the Fly!, Pieris and SquirrelMapper), two evaluated learning outcomes (JR Torres, J Gibbs 2022, personal communication).
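As a sanity check, the percentages above can be reproduced from the reported counts with a few lines of arithmetic (a minimal sketch; the variable names are ours, the counts are those from the literature search described above):

```python
# Counts from the non-exhaustive literature search described above.
identified = 58   # CS projects on evolution
claimed = 38      # projects claiming a learning outcome
evaluated = 10    # projects that actually evaluated that outcome

# Shares, reported to one decimal place.
print(f"claiming a learning outcome: {claimed / identified:.1%}")  # 65.5%
print(f"evaluating that outcome:     {evaluated / claimed:.1%}")   # 26.3%
```

Note that the denominators differ: the 26% figure is a share of the 38 projects claiming a learning outcome, not of all 58 identified projects.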

Most CS projects aiming to promote participants' scientific literacy tend to only measure content knowledge [101]. However, a number of methods and instruments to evaluate the other learning outcomes exist (table 3), as well as a shared framework to measure individual learning outcomes from participation [24]. The selection of tools used will depend on the resources available for evaluation and the skillset of the project team, which could be augmented by interdisciplinary collaboration (e.g. with education scientists).

Table 3.

Examples of measurement instruments and approaches to evaluate dimensions of scientific literacy. The selection of measurement instruments used will depend on the goals and circumstances of the project.

content knowledge (a)
- Assessing COntextual Reasoning about Natural Selection (ACORNS) [102]: understanding of natural selection and adaptive change
- Concept Inventory of Natural Selection (CINS) [103]: natural selection
- Knowledge About EVOlution 2.0 (KAEVO 2.0) [36,97]: several micro- and macro-evolutionary concepts

procedural knowledge
- assessing experimental design [104]: planning a scientific study and sampling design
- FOrmal Reasoning Test (FORT) [105]: scientific reasoning abilities
- Scientific Reasoning Scale (SRS) [106]: abilities for evaluating scientific findings
- participant observation [107]: group processes in knowledge production

epistemic knowledge
- Connotative Aspects of Epistemological Beliefs (CAEB) [108]: epistemological beliefs
- Views of Nature of Scientific Inquiry (NOSI views) [109]: understanding the nature of scientific inquiry
- Student Understanding of Science and Scientific Inquiry (SUSSI) [110]: understanding science and scientific inquiry
- Views About Scientific Inquiry (VASI) [111]: understanding scientific inquiry
- Views of Nature of Science (VNOS) [30]: understanding the nature of science

knowledge application
- Quantitative Assessment of Socio-Scientific Reasoning (QuASSR) [112]: socioscientific reasoning
- participant observation [107]: application of acquired knowledge in discussions

(a) For a full review of instruments that measure understanding of evolution, see [38,113].

(a) . Recommendations for choosing and designing evaluation instruments

When selecting evaluation instruments, three key aspects need to be considered:

(i) . Depth and type of evaluation

Evaluations can be quantitative, administered as closed questionnaires (e.g. self-report scales or tests [114]), or qualitative, using open questionnaires, semi-structured interviews [115], participant observation [107], focus groups, photo diaries or the study of narratives [116].
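For the quantitative route, closed questionnaires are typically scored by summing Likert-type items, with negatively worded items reverse-coded so that higher scores consistently indicate the targeted construct. The sketch below is illustrative only; the item structure and the `reverse_items` set are hypothetical and not taken from any of the instruments cited here:

```python
def score_questionnaire(answers, reverse_items, scale_max=5):
    """Sum Likert responses (1..scale_max) for one participant.

    answers: the participant's responses, in item order
    reverse_items: indices of negatively worded items (hypothetical here)
    """
    total = 0
    for i, a in enumerate(answers):
        # Reverse-code negatively worded items: on a 1-5 scale, 1 <-> 5, 2 <-> 4.
        total += (scale_max + 1 - a) if i in reverse_items else a
    return total

# Four items; the third (index 2) is negatively worded, so disagreement
# there indicates a favourable view and is reverse-coded.
print(score_questionnaire([5, 4, 2, 5], reverse_items={2}))  # 18
```

Scoring rules like this must be fixed before data collection, since changing them afterwards invalidates comparisons across participants.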

(ii) . Applicability to the study population

In quantitative evaluation, instruments are designed, applied and validated for particular study populations and therefore may not be directly transferable. If no prior validation exists for the study population, a small pilot is recommended before the start of the project [115].
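One common check in such a pilot is internal-consistency reliability, e.g. Cronbach's alpha recomputed on responses from the new study population before committing to the instrument. A minimal sketch (the pilot data below are invented for illustration):

```python
import statistics

def cronbach_alpha(responses):
    """Cronbach's alpha; rows = respondents, columns = items."""
    k = len(responses[0])  # number of items
    item_vars = [statistics.pvariance([row[i] for row in responses])
                 for i in range(k)]  # variance of each item
    total_var = statistics.pvariance([sum(row) for row in responses])
    # Population variance is used throughout; the n/(n-1) factors cancel.
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# Invented pilot responses (4 respondents x 3 items, 1-5 scale).
pilot = [[4, 4, 5], [2, 3, 2], [5, 4, 5], [1, 2, 1]]
print(round(cronbach_alpha(pilot), 2))  # 0.95
```

A low alpha in the pilot would suggest the items do not hang together for this population and that the instrument needs adaptation before full deployment.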

(iii) . Communicating evaluation goals and process

It is necessary to explain to participants the importance of evaluation and its requirements. Keep the measures as short as possible, and focus on the dimensions of scientific literacy your project targets. Goals must be made clear from the start and codes of ethics followed [117]. Co-evaluation, where project participants are involved in designing the project evaluation strategy, can be a useful tool to overcome participation barriers [101].

6. Balancing scientific goals with designing for learning and evaluation: challenges and benefits

Including a learning dimension in a CS project might be seen as a trade-off against the primary interests of the project initiator, namely achieving scientific outcomes and academic excellence [118]. Furthermore, project initiators often lack the knowledge, incentives and resources to design for learning [19]. Yet, including learning opportunities can provide tangible benefits. Learning is an important factor in participants' continuing motivation [119], which in turn strongly affects data quality and quantity, as well as the project's societal impact through participants' willingness to advocate for the topic [120,121].

Achieving learning outcomes can lead to societal impacts, which are increasingly recognized as central in research policy [122] and an important goal of academic researchers [123]. Many policymakers and funding agencies are already requiring CS projects to design and assess their learning outcomes [57], and this request is likely to be met by increasing financial support. For example, the SquirrelMapper project initiators were equally interested in the educational and biological dimensions of the project and developed the educational aspect for 10 years without funding. The project now has major funding for both dimensions, which are advanced simultaneously by an interdisciplinary team (J Gibbs 2022, personal communication). As such, clear benefits exist of designing for and evaluating learning outcomes.

Interdisciplinary collaborations can also contribute to solving the dilemma of having to divert resources to aspects that project initiators may not see as focal. Hence, collaboration between evolutionary biologists and education scientists/educators is suggested from the beginning of the project [124], resulting in a win–win situation. Indeed, for education researchers it may be scientifically rewarding to apply their expertise to this new learning context. However, interdisciplinary work requires open-mindedness, empathy, trust, transparency of different objectives, and an effort to develop mutual understanding [125] to create synergies between the different perspectives, values and norms involved.

7. Conclusion

In this paper, we argue that there is great potential for CS as a tool for evolution education. However, CS is not fully exploited as a research or educational tool by evolutionary biologists. Many projects either have no explicit learning goals, or if they do, it is often assumed that learning will happen by default when people participate in project activities. In reality, a positive effect on scientific literacy in evolution can only be achieved if projects are purposely designed and evaluated for learning outcomes. For this, we would like to encourage evolutionary biologists to develop CS projects in evolution, and actively engage with education scientists/educators who can contribute expertise on increasing scientific literacy in evolution.

Acknowledgements

We would like to thank the participants of the meeting ‘Citizen science as a tool for education and promotion of scientific literacy in evolution’, January 2020, who contributed ideas to this paper. This article is based upon work from COST Actions ‘CA17127 EuroScitizen’, ‘CA17122 Alien CSI’ and ‘CA15212 Citizen Science’, supported by COST (European Cooperation in Science and Technology; www.cost.eu).

Data accessibility

This article has no additional data.

Authors' contributions

M.B.: project administration, writing—original draft, writing—review and editing; Q.G.: conceptualization, funding acquisition, writing—original draft, writing—review and editing; A.M.: writing—original draft, writing—review and editing; D.M.: conceptualization, project administration, writing—original draft, writing—review and editing; C.L.N.: project administration, writing—original draft, writing—review and editing; T.B.: writing—original draft, writing—review and editing; A.B.: writing—review and editing; T.B.: writing—review and editing; J.G.: writing—review and editing; S.M.: writing—review and editing; H.E.R.: writing—review and editing; X.S.-P.: writing—review and editing; J.R.T.: writing—review and editing; T.J.: conceptualization, funding acquisition, project administration, writing—original draft, writing—review and editing.

All authors gave final approval for publication and agreed to be held accountable for the work performed therein.

Conflict of interest declaration

We declare we have no competing interests.

Funding

X.S.-P. is funded through FCT – Fundação para a Ciência e a Tecnologia, I.P., in the scope of the framework contract foreseen in the numbers 4, 5 and 6 of the article 23, of the Decree-Law 57/2016, of August 29, changed by Law 57/2017, of 19 July. A.M. benefited from funding by the laboratory ‘Evolution et Diversité Biologique’ as part of the ‘Laboratoires d'Excellence’ LABEX TULIP (ANR-10-LABX-41) and LABEX CEBA (ANR-10-LABX-25-01).

References

  • 1.Ayala FJ. 2013. Scientific literacy and the teaching of evolution. Ludus vitalis XXI, 231-237. [Google Scholar]
  • 2.Mindell DP. 2007. The evolving world: evolution in everyday life. Cambridge MA: Harvard University Press. [Google Scholar]
  • 3.Carroll SP, Jørgensen PS, Kinnison MT, Bergstrom CT, Denison RF, Gluckman P, Smith TB, Strauss SY, Tabashnik BE. 2014. Applying evolutionary biology to address global challenges. Science 346, 1-31. ( 10.1126/science.1245993) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 4.Davies J, Davies D. 2010. Origins and evolution of antibiotic resistance. Microbiol. Mol. Biol. Rev. 74, 417-433. ( 10.1128/MMBR.00016-10) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 5.Costello EK, Stagaman K, Dethlefsen L, Bohannan BJM, Relman DA. 2012. The application of ecological theory toward an understanding of the human microbiome. Science 336, 1255-1262. ( 10.1126/science.1224203) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 6.Kingsbury N. 2009. Hybrid: The history and science of plant breeding. Chicago: University of Chicago Press. [Google Scholar]
  • 7.Radchuk V, et al. 2019. Adaptive responses of animals to climate change are most likely insufficient. Nat. Commun. 10, 1-14. ( 10.1038/s41467-019-10924-4) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 8.Triggs CM, Buckleton JS. 2002. Logical implications of applying the principles of population genetics to the interpretation of DNA profiling evidence. Forensic Sci. Int. 128, 108-114. ( 10.1016/s0379-0738(02)00168-8) [DOI] [PubMed] [Google Scholar]
  • 9.Chiong R, Weise T, Michalewicz Z. 2011. Variants of evolutionary algorithms for real-world applications. New York, NY: Springer-Verlag. [Google Scholar]
  • 10.Byrne J, Fenton M, Hemberg E, McDermott J, O'Neill M, Shotton E, Nally C. 2011. Combining structural analysis and multi-objective criteria for evolutionary architectural design. In Applications of evolutionary computation: EvoApplications 2011: EvoCOMNET, EvoFIN, EvoHOT, EvoMUSART, EvoSTIM, and EvoTRANSLOG, Torino, Italy. ( 10.1007/978-3-642-20520-0_21) [DOI] [Google Scholar]
  • 11.Yacoubian HA. 2018. Scientific literacy for democratic decision-making. Int. J. Sci. Educ. 40, 308-327. ( 10.1080/09500693.2017.1420266) [DOI] [Google Scholar]
  • 12.Zeberg H, Pääbo S. 2020. The major genetic risk factor for severe COVID-19 is inherited from Neanderthals. Nature 587, 610-612. ( 10.1038/s41586-020-2818-3) [DOI] [PubMed] [Google Scholar]
  • 13.Cohen J. 2020. With record-setting speed, vaccinemakers take their first shots at the new coronavirus. Science. See: https://www.science.org/content/article/record-setting-speed-vaccine-makers-take-their-first-shots-new-coronavirus. [Google Scholar]
  • 14.Kampourakis K. 2014. Understanding evolution. Cambridge, NY: Cambridge University Press. [Google Scholar]
  • 15.Miller JD, Scott EC, Okamoto S. 2006. Public acceptance of evolution. Science 313, 765-766. [DOI] [PubMed] [Google Scholar]
  • 16.OECD. 2019. PISA 2018 assessment and analytical framework. Paris, France: OECD Publishing. ( 10.1787/b25efab8-en) [DOI]
  • 17.Kampa N, Köller O. 2016. German national proficiency scales in biology: internal structure, relations to general cognitive abilities and verbal skills. Sci. Educ. 100, 903-922. ( 10.1002/sce.21227) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 18.Roberts DA. 2007. Scientific literacy/Science literacy. In Handbook of research on science education (eds Abell SK, Lederman NG), pp. 729-780. Mahwah, NJ: Lawrence Erlbaum. [Google Scholar]
  • 19.Bonney R, Phillips TB, Ballard HL, Enck JW. 2016. Can citizen science enhance public understanding of science? Public Underst. Sci. 25, 2-16. ( 10.1177/0963662515607406) [DOI] [PubMed] [Google Scholar]
  • 20.National Academies of Sciences, Engineering, and Medicine; Division of Behavioral and Social Sciences and Education; Board on Science Education; Committee on Designing Citizen Science to Support Science Learning. 2018. Learning through citizen science: enhancing opportunities by design. Washington, DC: National Academies Press. [PubMed] [Google Scholar]
  • 21.Pedaste M, Mäeots M, Siiman LA, de Jong T, van Riesen SAN, Kamp ET, Manoli CC, Zacharia ZC, Tsourlidaki E.. 2015. Phases of inquiry-based learning: definitions and the inquiry cycle. Educ. Res. Rev. 14, 47-61. ( 10.1016/j.edurev.2015.02.003) [DOI] [Google Scholar]
  • 22.Ruiz-Mallén I, Riboli-Sasco L, Ribrault C, Heras M, Laguna D, Perié L. 2016. Citizen science: toward transformative learning. Sci. Commun. 38, 523-534. ( 10.1177/1075547016642241) [DOI] [Google Scholar]
  • 23.Bela G, et al. 2016. Learning and the transformative potential of citizen science. Conserv. Biol. 30, 990-999. ( 10.1111/cobi.12762) [DOI] [PubMed] [Google Scholar]
  • 24.Phillips T, Porticella N, Constas M, Bonney R. 2018. A Framework for articulating and measuring individual learning outcomes from participation in citizen science. Citizen Sci.: Theory Practice 3, 3. ( 10.5334/cstp.126) [DOI] [Google Scholar]
  • 25.Kieslinger B, Schäfer T, Heigl F, Dörler D, Richter A, Bonn A. 2018. Evaluating citizen science: towards an open framework. In Citizen science—innovation in open science, society and policy, pp. 81-95. London, UK: UCL Press. [Google Scholar]
  • 26.Catley KM. 2006. Darwin's missing link—a novel paradigm for evolution education. Sci. Educ. 90, 767-783. ( 10.1002/sce.20152) [DOI] [Google Scholar]
  • 27.Opfer JE, Nehm RH, Ha M. 2012. Cognitive foundations for science assessment design: knowing what students know about evolution. J. Res. Sci. Teach. 49, 744-777. ( 10.1002/tea.21028) [DOI] [Google Scholar]
  • 28.Stylinski CD, Peterman K, Phillips T, Linhart J, Becker-Klein R. 2019. Assessing science inquiry skills of citizen science volunteers: snapshot of the field. Int. J. Sci. Educ. Part B 10, 77-92. ( 10.1080/21548455.2020.1719288) [DOI] [Google Scholar]
  • 29.Jones MG, Taylor A, Minogue J, Broadwell B, Wiebe E, Carter G. 2007. Understanding scale: powers of ten. J. Sci. Educ. Technol. 16, 191-202. ( 10.1007/s10956-006-9034-2) [DOI] [Google Scholar]
  • 30.Lederman NG, Abd-El-Khalick F, Bell RL, Schwartz RS. 2002. Views of Nature of Science Questionnaire: toward valid and meaningful assessment of learners' conceptions of nature of science. J. Res. Sci. Teach. 39, 497-521. ( 10.1002/tea.10034) [DOI] [Google Scholar]
  • 31.Nieminen P, Mustonen AM. 2014. Argumentation and fallacies in creationist writings against evolutionary theory. Evol.: Educ. Outreach 7, 1-14. ( 10.1186/s12052-014-0011-6) [DOI] [Google Scholar]
  • 32.Dunk RDP, et al. 2019. Evolution education is a complex landscape. Nat. Ecol. Evol. 3, 327-329. ( 10.1038/s41559-019-0802-9) [DOI] [PubMed] [Google Scholar]
  • 33.Jørgensen PS, Folke C, Carroll SP. 2019. Evolution in the Anthropocene: informing governance and policy. Annu. Rev. Ecol. Evol. Syst. 50, 527-546. ( 10.1146/annurev-ecolsys-110218-024621) [DOI] [Google Scholar]
  • 34.Bull JJ, Wichman HA. 2001. Applied evolution. Annu. Rev. Ecol. Syst. 32, 183-217. ( 10.1146/annurev.ecolsys.32.081501.114020) [DOI] [Google Scholar]
  • 35.Bruckermann T, Fiedler D, Harms U. 2021. Identifying precursory concepts in evolution during early childhood—a systematic literature review. Stud. Sci. Educ. 57, 85-127. ( 10.1080/03057267.2020.1792678) [DOI] [Google Scholar]
  • 36.Kuschmierz P, Beniermann A, Graf D. 2020. Development and evaluation of the knowledge about evolution 2.0 instrument (KAEVO 2.0). Int. J. Sci. Educ. 42, 2601-2629. ( 10.1080/09500693.2020.1822561) [DOI] [Google Scholar]
  • 37.Kuschmierz P, et al. 2021. European first-year university students accept evolution but lack substantial knowledge about it: a standardized European cross-country assessment. Evol.: Educ. Outreach 14, 1-22. ( 10.1186/s12052-021-00158-8) [DOI] [Google Scholar]
  • 38.Kuschmierz P, Meneganzin A, Pinxten R, Pievani T, Cvetković D, Mavrikaki E, Graf D, Beniermann A. 2020. Towards common ground in measuring acceptance of evolution and knowledge about evolution across Europe: a systematic review of the state of research. Evol.: Educ. Outreach 13, 1-24. ( 10.1186/s12052-020-00132-w) [DOI] [Google Scholar]
  • 39.Fiedler D, Sbeglia GC, Nehm RH, Harms U. 2019. How strongly does statistical reasoning influence knowledge and acceptance of evolution? J. Res. Sci. Teach. 56, 1183-1206. ( 10.1002/tea.21547) [DOI] [Google Scholar]
  • 40.Sá-Pinto X, et al. 2021. Development and validation of a framework for the assessment of school curricula on the presence of evolutionary concepts (FACE). Evol.: Educ. Outreach 14, 1-27. ( 10.1186/s12052-021-00142-2) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 41.Jakobi SR. 2010. ‘Little monkeys on the grass…’ How people for and against evolution fail to understand the theory of evolution. Evol.: Educ. Outreach 3, 416-419. ( 10.1007/s12052-010-0214-4) [DOI] [Google Scholar]
  • 42.Shtulman A. 2006. Qualitative differences between naïve and scientific theories of evolution. Cognit. Psychol. 52, 170-194. ( 10.1016/j.cogpsych.2005.10.001) [DOI] [PubMed] [Google Scholar]
  • 43.Nehm RH. 2018. Evolution. In Teaching biology in schools: global research, issues, and trends (eds Kampourakis K, Reiss MJ). London, UK: Routledge. [Google Scholar]
  • 44.Deniz H, Borgerding LA. 2018. Evolution education around the globe. Cham, Switzerland: Springer. [Google Scholar]
  • 45.Barnes ME, Dunlop HM, Sinatra GM, Hendrix TM, Zheng Y, Brownell SE. 2020. ‘Accepting evolution means you can't believe in god’: atheistic perceptions of evolution among college biology students. CBE Life Sci. Educ. 19, 1-13. ( 10.1187/CBE.19-05-0106) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 46.Pew Research Center. 2020. Biotechnology research viewed with caution globally, but most support gene editing for babies to treat disease. See https://www.pewresearch.org/science/2020/12/10/biotechnology-research-viewed-with-caution-globally-but-most-support-gene-editing-for-babies-to-treat-disease/.
  • 47.Graf D, Soran H.. 2010. Einstellung und Wissen von Lehramtsstudierenden zur Evolution—ein Vergleich zwischen Deutschland und der Türkei. In Evolutionstheorie—Akzeptanz und Vermittlung im europäischen Vergleich, pp. 141-161. Berlin, Germany: Springer. [Google Scholar]
  • 48.Großschedl J, Konnemann C, Basel N. 2014. Pre-service biology teachers' acceptance of evolutionary theory and their preference for its teaching. Evol.: Educ. Outreach 7, 1-16. ( 10.1186/s12052-014-0018-z) [DOI] [Google Scholar]
  • 49.Ha M, Haury DL, Nehm RH. 2012. Feeling of certainty: uncovering a missing link between knowledge and acceptance of evolution. J. Res. Sci. Teach. 49, 95-121. ( 10.1002/tea.20449) [DOI] [Google Scholar]
  • 50.Heddy BC, Nadelson LS. 2012. A global perspective of the variables associated with acceptance of evolution. Evol.: Educ. Outreach 5, 412-418. ( 10.1007/s12052-012-0423-0) [DOI] [Google Scholar]
  • 51.Trani R. 2004. I won't teach evolution; it's against my religion. Am. Biol. Teach. 66, 419-427. ( 10.2307/4451708) [DOI] [Google Scholar]
  • 52.Sinatra GM, Southerland SA, McConaughy F, Demastes JW. 2003. Intentions and beliefs in students' understanding and acceptance of biological evolution. J. Res. Sci. Teach. 40, 510-528. ( 10.1002/tea.10087) [DOI] [Google Scholar]
  • 53.Somerville RC, Hassol SJ. 2011. Enhancing the communication of climate change science. In AGU Fall Meeting Abstracts, abstract GC24A-02. [Google Scholar]
  • 54.Narraway CL, Davis O, Lowell S, Lythgoe K, Turner JS, Marshall S. 2020. Biotic analogies for self-organising cities. Environ. Plan. B: Urban Anal. City Sci. 47, 268-286. ( 10.1177/2399808319882730) [DOI] [Google Scholar]
  • 55.Gregory TR. 2008. Evolution as fact, theory, and path. Evol.: Educ. Outreach 1, 46-52. ( 10.1007/s12052-007-0001-z) [DOI] [Google Scholar]
  • 56.Rector MA, Nehm RH, Pearl D. 2013. Learning the language of evolution: lexical ambiguity and word meaning in student explanations. Res. Sci. Educ. 43, 1107-1133. ( 10.1007/s11165-012-9296-z) [DOI] [Google Scholar]
  • 57.Aristeidou M, Herodotou C. 2020. Online citizen science: a systematic review of effects on learning and scientific literacy. Citiz. Sci.: Theory Pract. 5, 1-12. ( 10.5334/cstp.224) [DOI] [Google Scholar]
  • 58.Greving H, et al. 2022. Improving attitudes and knowledge in a citizen science project on urban bat ecology. Ecol. Soc. 27, 24. ( 10.5751/ES-13272-270224) [DOI] [Google Scholar]
  • 59.Shein PP, Tsai C-Y. 2015. Impact of a scientist–teacher collaborative model on students, teachers, and scientists. Int. J. Sci. Educ. 37, 2147-2169. ( 10.1080/09500693.2015.1068465) [DOI] [Google Scholar]
  • 60.Zoellick B, Nelson SJ, Schauffler M. 2012. Participatory science and education: bringing both views into focus. Front. Ecol. Environ. 10, 310-313. ( 10.1890/110277) [DOI] [Google Scholar]
  • 61.Bruckermann T, et al. 2019. Learning opportunities and outcomes in citizen science: a heuristic model for design and evaluation. In Electronic Proceedings of the European Science Education Research Association. [Google Scholar]
  • 62.Bruckermann T, Stillfried M, Straka TM, Harms U. 2022. Citizen science projects require agreement: a Delphi study to identify which knowledge on urban ecology is considered relevant from scientists' and citizens’ perspectives. Int. J. Sci. Educ. B: Commun. Public Engagem. 12, 75-92. ( 10.1080/21548455.2022.2028925) [DOI] [Google Scholar]
  • 63.Lehr JL, McCallie E, Davies SR, Caron BR, Gammon B, Duensing S. 2007. The value of ‘dialogue events’ as sites of learning: an exploration of research and evaluation frameworks. Int. J. Sci. Educ. 29, 1467-1487. ( 10.1080/09500690701494092) [DOI] [Google Scholar]
  • 64.Bonney R, McCallie E, Phillips T. 2009. Public participation in scientific research: defining the field and assessing its potential for informal science education. A CAISE Inquiry Group Report.
  • 65.Bang M, Medin D. 2010. Cultural processes in science education: supporting the navigation of multiple epistemologies. Sci. Educ. 94, 1008-1026. ( 10.1002/sce.20392) [DOI] [Google Scholar]
  • 66.Fonseca CA, Sá-Pinto X, Dinis HA, Vasconcelos R. 2021. Shooting skinks for good: producing a movie improves attitudes towards a threatened species. Sci. Total Environ. 791, 148356. ( 10.1016/j.scitotenv.2021.148356) [DOI] [PubMed] [Google Scholar]
  • 67.Duschl RA, Grandy R. 2013. Two views about explicitly teaching nature of science. Sci. Educ. 22, 2109-2139. ( 10.1007/s11191-012-9539-4) [DOI] [Google Scholar]
  • 68.Ballard HL, Dixon CGH, Harris EM. 2017. Youth-focused citizen science: examining the role of environmental science learning and agency for conservation. Biol. Conserv. 208, 65-75. ( 10.1016/j.biocon.2016.05.024) [DOI] [Google Scholar]
  • 69.Kellman PJ, Massey CM, Son JY. 2010. Perceptual learning modules in mathematics: enhancing students' pattern recognition, structure extraction, and fluency. Topics Cogn. Sci. 2, 285-305. ( 10.1111/j.1756-8765.2009.01053.x) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 70.Prather EE, Cormier S, Wallace CS, Lintott C, Raddick MJ, Smith A. 2013. Measuring the conceptual understandings of citizen scientists participating in Zooniverse projects: a first approach. Astron. Educ. Rev. 12, 1-14. ( 10.3847/aer2013002) [DOI] [Google Scholar]
  • 71.Eveleigh A, Jennett C, Lynn S, Cox AL. 2013. ‘I want to be a captain! I want to be a captain!’: gamification in the old weather citizen science project. In Gamification '13: Proceedings of the First International Conference on Gameful Design, Research, and Applications, October 2013, Stratford, Ontario, Canada, pp. 79-82. ( 10.1145/2583008.2583019) [DOI] [Google Scholar]
  • 72.Dichev C, Dicheva D. 2017. Gamifying education: what is known, what is believed and what remains uncertain: a critical review. Int. J. Educ. Technol. High. Educ. 14, 1-36. ( 10.1186/s41239-017-0042-5) [DOI] [Google Scholar]
  • 73.Hidi S, Ann Renninger K. 2006. The four-phase model of interest development. Educat. Psychol. 41, 111-127. ( 10.1207/s15326985ep4102_4) [DOI] [Google Scholar]
  • 74.Reiss MJ, Romero B. 2013. Beliefs and the value of evidence: how should creationism and intelligent design be dealt with in the classroom? In Communication and engagement with science and technology: issues and dilemmas, a reader in science communication (eds Gilbert J, Stocklmayer S), pp. 148-161. Routledge Taylor & Francis Group. [Google Scholar]
  • 75.Masters K, Oh EY, Cox J, Simmons B, Lintott C, Graham G, Greenhill A, Holmes K. 2016. Science learning via participation in online citizen science. J. Sci. Commun. 15, A07. ( 10.22323/2.15030207) [DOI] [Google Scholar]
  • 76.Ballard HL, Harris EM, Dixon CGH. 2017. Science identity and agency in community and citizen science: evidence & potential. See http://sites.nationalacademies.org/cs/groups/dbassesite/documents/webpage/dbasse_189606.pdf.
  • 77.Kuhn D. 1999. A developmental model of critical thinking. Educ. Res. 28, 16-26. [Google Scholar]
  • 78.Fernandez-Gimenez ME, Ballard HL, Sturtevant VE. 2008. Adaptive management and social learning in collaborative and community-based monitoring: a study of five community-based forestry organizations in the western USA. Ecol. Soc. 13, 4. ( 10.5751/ES-02400-130204) [DOI] [Google Scholar]
  • 79.Price CA, Lee H-S. 2013. Changes in participants' scientific attitudes and epistemological beliefs during an astronomical citizen science project. J. Res. Sci. Teach. 50, 773-801. ( 10.1002/tea.21090) [DOI] [Google Scholar]
  • 80.Bransford JD, Schwartz DL. 1999. Rethinking transfer: a simple proposal with multiple implications. Rev. Res. Educ. 24, 61-100. ( 10.3102/0091732x024001061) [DOI] [Google Scholar]
  • 81.Kromka SM, Goodboy AK. 2019. Classroom storytelling: using instructor narratives to increase student recall, affect, and attention. Commun. Educ. 68, 20-43. ( 10.1080/03634523.2018.1529330) [DOI] [Google Scholar]
  • 82.Karpicke JD, Blunt JR. 2011. Retrieval practice produces more learning than elaborative studying with concept mapping. Science 331, 772-775. ( 10.1126/science.1199327) [DOI] [PubMed] [Google Scholar]
  • 83.Kellman PJ, Garrigan P. 2009. Perceptual learning and human expertise. Phys. Life Rev. 6, 53-84. ( 10.1016/j.plrev.2008.12.001) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 84.ten Cate O, van de Vorst I, van den Broek S. 2012. Academic achievement of students tutored by near-peers. Int. J. Med. Educ. 3, 6-13. ( 10.5116/ijme.4f0c.9ed2) [DOI] [Google Scholar]
  • 85.Kahan DM. 2017. ‘Ordinary science intelligence’: a science-comprehension measure for study of risk and science communication, with notes on evolution and climate change. J. Risk Res. 20, 995-1016. ( 10.1080/13669877.2016.1148067) [DOI] [Google Scholar]
  • 86.Worthington JP, Silvertown J, Cook L, Cameron R, Dodd M, Greenwood RM, Mcconway K, Skelton P. 2012. Evolution MegaLab: a case study in citizen science methods. Methods Ecol. Evol. 3, 303-309. ( 10.1111/j.2041-210X.2011.00164.x) [DOI] [Google Scholar]
  • 87.Taifun-Tofu GmbH, Universität Hohenheim. 1000 Gärten. See http://www.1000gaerten.de/index.php?id=2 (accessed 18 February 2021).
  • 88.Würschum T, Leiser WL, Jähne F, Bachteler K, Miersch M, Hahn V. 2019. The soybean experiment ‘1000 Gardens’: a case study of citizen science for research, education, and beyond. Theor. Appl. Genet. 132, 617-626. ( 10.1007/s00122-018-3134-2) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 89.González J, Torres R. 2021. Melanogaster: catch the fly! See https://melanogaster.eu/about-us/?lang=en (accessed 18 February 2021).
  • 90.Brossard D, Lewenstein B, Bonney R. 2005. Scientific knowledge and attitude change: the impact of a citizen science project. Int J. Sci. Educ. 27, 1099-1121. ( 10.1080/09500690500069483) [DOI] [Google Scholar]
  • 91.Paul J, Lederman NG, Groß J. 2016. Learning experimentation through science fairs. Int. J. Sci. Educ. 38, 2367-2387. ( 10.1080/09500693.2016.1243272) [DOI] [Google Scholar]
  • 92.Lederman NG. 2019. Contextualizing the relationship between nature of scientific knowledge and scientific inquiry: implications for curriculum and classroom practice. Sci. Educ. 28, 249-267. ( 10.1007/s11191-019-00030-8) [DOI] [Google Scholar]
  • 93.Ryan SF. The Pieris Project. See http://www.pierisproject.org (accessed on 18 February 2021).
  • 94.Abu Amsha O, Schneider DK, Fernandez-Marquez JL, da Costa J, Fuchs B, Kloetzer L. 2016. Data analytics in citizen cyberscience: evaluating participant learning and engagement with analytics. Hum. Comput. 3, 69-97. ( 10.15346/hc.v3i1.5) [DOI] [Google Scholar]
  • 95.Cosentino B, Gibbs J. SquirrelMapper. See http://squirrelmapper.org/index.html (accessed 18 February 2021).
  • 96.Gooding J, Metz B. 2011. From misconceptions to conceptual change: tips for identifying and overcoming students' misconceptions. Sci. Teacher 78, 34-37. [Google Scholar]
  • 97.Beniermann A, et al. 2021. Evolution Education Questionnaire on Acceptance and Knowledge (EEQ): standardised and ready-to-use protocols to measure acceptance of evolution and knowledge about evolution in an international context. COST: European cooperation in science & technology. Zenodo. ( 10.5281/zenodo.4554742) [DOI]
  • 98.Häkkinen P, Järvelä S, Mäkitalo-Siegl K, Ahonen A, Näykki P, Valtonen T. 2017. Preparing teacher-students for twenty-first-century learning practices (PREP 21): a framework for enhancing collaborative problem-solving and strategic learning skills. Teach. Teach.: Theory Pract. 23, 25-41. ( 10.1080/13540602.2016.1203772) [DOI] [Google Scholar]
  • 99.Weidlich J, Bastiaens TJ. 2017. Explaining social presence and the quality of online learning with the SIPS model. Comput. Hum. Behav. 72, 479-487. ( 10.1016/j.chb.2017.03.016) [DOI] [Google Scholar]
  • 100.Edwards R, McDonnell D, Simpson I, Wilson A. 2017. Educational backgrounds, project design, and inquiry learning in citizen science. In Citizen inquiry (eds Herodotou C, Sharples M, Scanlon E). London, UK: Routledge. [Google Scholar]
  • 101.Schaefer T, Kieslinger B, Brandt M, Hummer P, Land-Zandstra A, Sieber A.. 2021. Evaluation in citizen science: the art of tracing a moving target. In The science of citizen science (eds Vohland K, Land-Zandstra A, Ceccaroni L, Lemmens R, Perlló J, Ponti M, Samson R, Wagenknecht K), pp. 495-516. Cham, Switzerland: Springer. [Google Scholar]
  • 102.Nehm RH, Beggrow EP, Opfer JE, Ha M. 2012. Reasoning about natural selection: diagnosing contextual competency using the ACORNS instrument. Am. Biol. Teach. 74, 92-98. ( 10.1525/abt.2012.74.2.6) [DOI] [Google Scholar]
  • 103.Anderson DL, Fisher KM, Norman GJ. 2002. Development and evaluation of the conceptual inventory of natural selection. J. Res. Sci. Teach. 39, 952-978. ( 10.1002/tea.10053) [DOI] [Google Scholar]
  • 104.Crall AW, Jordan R, Holfelder K, Newman GJ, Graham J, Waller DM. 2013. The impacts of an invasive species citizen science training program on participant attitudes, behavior, and science literacy. Public Underst. Sci. 22, 745-764. ( 10.1177/0963662511434894) [DOI] [PubMed] [Google Scholar]
  • 105.Kalinowski ST, Willoughby S. 2019. Development and validation of a scientific (formal) reasoning test for college students. J. Res. Sci. Teach. 56, 1269-1284. ( 10.1002/tea.21555) [DOI] [Google Scholar]
  • 106.Drummond C, Fischhoff B. 2017. Development and validation of the Scientific Reasoning Scale. J. Behav. Decis. Mak. 30, 26-38. ( 10.1002/bdm.1906) [DOI] [Google Scholar]
  • 107.Jorgensen DL. 2015. Participant observation. Emerging trends in the social and behavioural sciences. New York, NY: John Wiley. [Google Scholar]
  • 108.Stahl E, Bromme R. 2007. The CAEB: an instrument for measuring connotative aspects of epistemological beliefs. Learn. Instruct. 17, 773-785. ( 10.1016/j.learninstruc.2007.09.016) [DOI] [Google Scholar]
  • 109.Nehring A. 2020. Naïve and informed views on the nature of scientific inquiry in large-scale assessments: two sides of the same coin or different currencies? J. Res. Sci. Teach. 57, 510-535. ( 10.1002/tea.21598) [DOI] [Google Scholar]
  • 110.Liang LL, Chen S, Chen X, Kaya ON, Adams AD, Macklin M, Ebenezer J. 2008. Assessing preservice elementary teachers’ views on the nature of scientific knowledge: a dual-response instrument. Asia-Pacific Forum Sci. Learn. Teach. 9, 1-20. [Google Scholar]
  • 111.Lederman JS, Lederman NG, Bartos SA, Bartels SL, Meyer AA, Schwartz RS. 2014. Meaningful assessment of learners' understandings about scientific inquiry—the views about scientific inquiry (VASI) questionnaire. J. Res. Sci. Teach. 51, 65-83. ( 10.1002/tea.21125) [DOI] [Google Scholar]
  • 112.Romine WL, Sadler TD, Kinslow AT. 2017. Assessment of scientific literacy: development and validation of the Quantitative Assessment of Socio-Scientific Reasoning (QuASSR). J. Res. Sci. Teach. 54, 274-295. ( 10.1002/tea.21368) [DOI] [Google Scholar]
  • 113.Mead LS, Kohn C, Warwick A, Schwartz K. 2019. Applying measurement standards to evolution education assessment instruments. Evol.: Educ. Outreach 12, 1-4. ( 10.1186/s12052-019-0097-y) [DOI] [Google Scholar]
  • 114.Peter M, Diekötter T, Kremer K. 2019. Participant outcomes of biodiversity citizen science projects: a systematic literature review. Sustainability (Switzerland) 11, 1-18. ( 10.3390/su11102780) [DOI] [Google Scholar]
  • 115.Cohen R, Kincaid D, Childs KE. 2007. Measuring school-wide positive behavior support implementation: development and validation of the benchmarks quality. J. Posit. Behav. Interv. 9, 203-213. ( 10.1177/10983007070090040301) [DOI] [Google Scholar]
  • 116.Constant N, Roberts L. 2017. Narratives as a mode of research evaluation in citizen science: understanding broader science communication impacts. J. Sci. Commun. 16, 1-18. ( 10.22323/2.16040203) [DOI] [Google Scholar]
  • 117.BERA. 2018. Ethical guidelines for educational research. London, UK: British Educational Research Association. [Google Scholar]
  • 118.Sickler J, Cherry TM, Allee L, Smyth RR, Losey J. 2014. Scientific value and educational goals: balancing priorities and increasing adult engagement in a citizen science project. Appl. Environ. Educ. Commun. 13, 109-119. ( 10.1080/1533015X.2014.947051) [DOI] [Google Scholar]
  • 119.Jennett C, et al. 2016. Motivations, learning and creativity in online citizen science. J. Sci. Commun. 15, A05. [Google Scholar]
  • 120.Agnello G, Vercammen A, Knight AT. 2022. Understanding citizen scientists’ willingness to invest in, and advocate for, conservation. Biol. Conserv. 265, 109422. ( 10.1016/j.biocon.2021.109422) [DOI] [Google Scholar]
  • 121.Land-Zandstra A, Agnello G, Gültekin YS.. 2021. Participants in citizen science. Cham, Switzerland: Springer International Publishing. [Google Scholar]
  • 122.Sivertsen G, Meijer I. 2020. Normal versus extraordinary societal impact: how to understand, evaluate, and improve research activities in their relations to society? Res. Eval. 29, 66-70. ( 10.1093/RESEVAL/RVZ032) [DOI] [Google Scholar]
  • 123.Fecher B, Hebing M. 2021. How do researchers approach societal impact? PLoS ONE 16, e0254006. ( 10.1371/journal.pone.0254006) [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 124.Crain R, Cooper C, Dickinson JL. 2014. Citizen science: a tool for integrating studies of human and natural systems. Ann. Rev. Environ. Resour. 39, 641-665. ( 10.1146/annurev-environ-030713-154609) [DOI] [Google Scholar]
  • 125.Schneider F, Giger M, Harari N, Moser S, Oberlack C, Providoli I, Schmid L, Tribaldos T, Zimmermann A. 2019. Transdisciplinary co-production of knowledge and sustainability transformations: three generic mechanisms of impact generation. Environ. Sci. Policy 102, 26-35. ( 10.1016/j.envsci.2019.08.017) [DOI] [Google Scholar]
