Public participation in scientific projects is flourishing globally as part of projects labeled “citizen science” (CS). Already, a number of professional networks for CS stakeholders have been founded, for example, the US-based Citizen Science Association, the European Citizen Science Association, and the Australian Citizen Science Association.
As citizen science (CS) continues to grow, researchers and participants should move toward a shared understanding of what the practice is, what it is not, and what criteria CS projects must fulfill to ensure high-quality participatory research. Image credit: David Cutler (artist).
But what exactly qualifies as CS? It is interpreted in various ways (1) and takes different forms with different degrees of participation (2). In fact, the label CS is currently assigned to research activities either by project principal investigators (PIs) themselves or by research funding agencies. Against this backdrop, critical observers of CS, such as Guerrini et al. (3), have drawn attention to important legal and ethical issues including intellectual property and scientific integrity. Similarly, Vayena and Tasioulas (4) note the importance of protecting the interests of research participants in biomedical participant-led research, and Buyx et al. (5) note the need for a solidarity-based practice of CS to fully exploit its potential, making “every participant a PI.”
In light of the rapid growth of CS, present concerns, and calls for further improving the value of CS, we see several issues for policymakers, funding agencies, and citizens. Specifically, we believe that researchers and participants should move toward a shared understanding of what CS is, what it is not, and what criteria CS projects must fulfill to ensure high-quality participatory research (6). Establishing criteria will help ensure that CS projects are rigorous, help the field flourish, and, where applicable, encourage policymakers to take CS project data and results seriously.
Democratizing Science
Politicians throughout Europe see CS as an important element of the push toward the “democratization of knowledge production” (7) and toward heightening the societal relevance of publicly funded research (8). A recent historical reflection on CS traces the term to a participatory turn in science policy and supports the claim that CS can democratize science by turning it from a closed into an open activity (9). For example, the European Commission is currently investigating the potential of CS as an input for environmental policymaking in the Knowledge Innovation Project and is supporting CS in its research funding programs (e.g., Citizens’ Observatories, Responsible Research and Innovation) (10). Taking this agenda into account, funding agencies have started to promote CS with tailored programs, such as the European Horizon 2020 “Science with and for Society” program (11). But policymakers and other stakeholders seeking to inform policies have difficulty verifying the reliability of CS data. Indeed, any project with public participation (with or without a scientific rationale and participating scientists) can be labeled CS.
Because no generally accepted definition of CS exists, national and international platforms listing CS projects face the challenge of accommodating considerable heterogeneity. Furthermore, on a practical level, the huge number and diversity of CS projects make it virtually impossible for interested citizens to evaluate the quality of any given project when deciding whether or not to participate, and willingness to take part may suffer as a result. Privacy issues and the potential abuse of personal data are further concerns. Clearly, there must be a trusting relationship between researchers and citizen scientists. Web platforms listing CS projects therefore need some degree of standardization to help ensure the high quality of the projects they list.
The absence of an international definition seems to be accepted by many stakeholders working in the field of CS because it allows for methodological innovation (1, 12–14). Although we embrace methodological creativity, we argue that there should be a minimum set of quality criteria following an international definition of CS. This would allow for more effective sharing of results (e.g., data) and methods (e.g., tools) across the globe and across communities. Such an exchange can only be achieved through some degree of standardization and, thus, protection of both the scientific endeavor and the interests of citizens and society.
The Value of Definitions
Why pursue a definition? What does a definition actually do? Generally speaking, any definition reduces complexity by providing a framework as well as the vocabulary to grasp the relevant ideas. A definition makes its subject not only explicit but also accessible. Notwithstanding the dynamic nature of science and participatory forms of innovation, we advocate for a definition of CS based on an interdisciplinary consensus we achieved in Austria. Furthermore, we seek to make this definition the basis for international minimum-quality CS standards.
In 2014, we established a national network for CS to connect stakeholders in Austria, promote CS projects to the public, and foster CS as a scientific method. Over a 3-year period, the number of projects listed on the associated web platform (www.citizen-science.at) increased from 5 projects (focusing on biodiversity research) to 54 projects (covering scientific disciplines ranging from ecology to linguistics to medicine). Two of us (F.H. and D.D.), in our role as platform coordinators, decided which projects to include on the platform based on our respective expertise. We evaluated the project descriptions to determine whether a project seemed scientifically sound and whether it involved citizens in knowledge production.
However, this process was neither as transparent nor as objective as it should have been. Additionally, the two coordinators are trained ecologists and lack expertise in the growing variety of disciplines from which applicants for project listing hail, such as the social sciences and the humanities. Aware of these limitations, we recognized the need for an open catalog of criteria, one that is sensitive to different scientific fields and allows a fair and thorough assessment of the projects listed.
Within 1 year, 22 members of the CS network Austria, working on CS-related projects at 17 different institutions, developed a set of criteria that project leaders would be asked to comply with in order for their projects to be listed on the Austrian CS web platform. Scientists, members of funding agencies, and policy advisors collaborated in this development. During the process, the platform coordinators organized three workshops, several online discussions, and an open consultation with the general public.
The catalog is based on the 10 principles of CS (15) and the Vienna Principles on Open Scholarly Communication (16), which together set out general principles on the scientific rigor of CS projects, on how to collaborate with citizens in research projects, and on how to publish scientific results openly. The current version of the criteria catalog is now applied to all projects listed on the platform. This catalog is, to the best of our knowledge, the first of its kind worldwide and could serve as a basis for further standardization of quality criteria for CS (17).
It is, however, a living document. We are constantly collecting feedback on the catalog at national and international events to update and improve the criteria, seeking to remain open to innovation and to new and emerging forms of CS while still helping to ensure the rigor of CS endeavors and the interests of participants. Version 1.1 of the catalog covers seven areas of assessment comprising 20 criteria: what is not CS; scientific standards; collaboration; open science; communication; ethics; and data management. A brief explanation of each follows:
1. What Is Not Citizen Science
To remain as open as possible to different concepts and disciplines, we created criteria that exclude projects that are not CS rather than criteria that prescribe what CS must look like. For example, opinion polls and data collection on participants are not considered CS. We do not exclude projects based on the research expertise or professional background of the project leader; that is, project leaders need not have PhDs in science for their project to be classified as CS.
2. Scientific Standards
Three criteria probe the scientific rigor of a project: the scientific questions asked or hypotheses tested; the methods applied; and the rationale for generating new knowledge or developing new methods. We aspired to develop criteria that apply to projects from all scientific disciplines, ranging from the natural sciences to the social sciences and the humanities. Critics might suggest that these criteria are too vague or too harsh for their respective disciplines, but they were developed by a group hailing from a variety of disciplines, including art sciences, ecology, historical sciences, geography, science communication, climatology, educational sciences, computer sciences, political sciences, and data sciences.
3. Collaboration
Five criteria categorize the design of the collaboration between participants and project leaders. These criteria assess, for example, the active involvement of citizen scientists in the research process and the added value of the collaboration for all people involved in the project. In the case of Project Roadkill, for example, that added value is safer roads for citizens and data points for scientists.
4. Open Science
We require that all data and results of a given CS project be published open access, provided there are no legal or ethical barriers to doing so. In our view, this is an important step toward increased transparency and trust in CS projects.
5. Communication
Communication is an essential part of any successful CS project. Therefore, our criteria require transparency from all CS projects to encourage dialogue among different interest groups. Project Roadkill, for example, aims to protect animals on roads. The project team provides information on the project website about the project’s aims, its exact methods, and how citizens can participate (www.roadkill.at/en). Additionally, participants and interested citizens can contact the project team via diverse communication channels such as email, Instagram, Twitter, or directly via the Spotteron Roadkill app. These channels are crucial to the project’s success in two ways. First, citizens can get answers to specific questions, for example, regarding data collection or technical issues. Second, by communicating with involved citizens, the scientific coordinator can follow up on inconsistent data with the contributors and thus improve overall data quality.
6. Ethics
Collaboration among all people involved in a CS project requires compliance with ethical standards, inclusiveness, and clear information on data policy and governance, as well as informed consent from project participants.
7. Data Management
Finally, a data management plan must be established prior to data collection to ensure that projects carefully and comprehensively describe how collected data are stored, secured, accessed, and eventually deleted after the project ends.
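To make the structure of such a catalog concrete, the minimal sketch below shows one hypothetical way a listing platform could encode the assessment areas and their criteria as a machine-readable checklist. The seven area names follow the catalog described above, but everything else (the type names, the sample criteria, and the checking logic) is an illustrative assumption, not the actual system behind www.citizen-science.at.

```python
# A minimal, hypothetical sketch of a criteria catalog encoded as a
# machine-readable checklist. The seven area names follow the catalog
# described above; the type names, sample criteria, and checking logic
# are illustrative assumptions, not the actual system behind
# www.citizen-science.at.
from dataclasses import dataclass


@dataclass
class Criterion:
    area: str          # one of the seven assessment areas
    text: str          # the requirement a project must meet
    met: bool = False  # set by reviewers while assessing a project


AREAS = [
    "What is not citizen science",
    "Scientific standards",
    "Collaboration",
    "Open science",
    "Communication",
    "Ethics",
    "Data management",
]


def example_checklist() -> list[Criterion]:
    """Return a small, illustrative subset of a 20-criterion catalog."""
    return [
        Criterion("Scientific standards",
                  "States a research question or tests a hypothesis"),
        Criterion("Collaboration",
                  "Citizens are actively involved in the research process"),
        Criterion("Open science",
                  "Data and results are published open access, absent "
                  "legal or ethical barriers"),
        Criterion("Ethics",
                  "Participants give informed consent"),
        Criterion("Data management",
                  "A data management plan exists before data collection"),
    ]


def unmet(checklist: list[Criterion]) -> list[Criterion]:
    """List criteria a project still fails, e.g., as feedback to applicants."""
    return [c for c in checklist if not c.met]


if __name__ == "__main__":
    for c in unmet(example_checklist()):
        print(f"Unmet ({c.area}): {c.text}")
```

Encoding the criteria this way would, for instance, let a platform present unmet criteria to applicants as structured feedback rather than as an opaque yes-or-no listing decision.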
Early experiences applying these criteria affirm their role as a quality filter and assessment tool. We are, however, aware of the challenges and possible restrictions of the catalog. Its content may not be applicable everywhere because of regional differences and local practices. For example, the White Paper on Citizen Science by Sanz et al. (18) considers projects to be CS when participants provide facilities for researchers, such as smartphone computing power, and a number of national platforms share a similar understanding. Our catalog, conversely, would exclude such projects because the participants only provide resources and are not actively involved in any of the scientific activities. Furthermore, establishing minimum standards may also require a management process, for example an international council, that would need to be carefully designed to adapt these standards to future CS developments. This may not be desirable if the process becomes bogged down in bureaucracy.
Therefore, we advocate a process that is practical and manageable for the CS community. Notwithstanding these potential challenges, at the very least, transparent criteria would enable community hosts (e.g., web platforms, CS networks, CS associations) to assess the compliance of CS projects with minimum standards, regardless of how much the projects vary in form and substance.
At a time when data come from multiple sources and citizens appear to be increasingly distrustful of science (19–21), projects that help members of the general public gain insight into the scientific process must be based on high-quality standards. Questionable methods could mean losing the trust of participants and eroding the belief that scientific pursuits are generally a public good. Thus, we believe that transparent criteria for CS projects can lead to more trust in science in general.
Quality standards could also provide funding agencies with concrete indications of what they should expect from CS projects. For example, the current draft of the European Framework Program for 2021–2027 does not include funding opportunities explicitly dedicated to CS (22), threatening to reduce the visibility of CS projects in Europe and possibly putting them at a competitive disadvantage. Highlighting approaches to CS that emphasize scientific rigor could help this research gain stature among funders.
Over time, CS will greatly benefit from a standardization process. Policymakers in Europe are still reluctant to use data generated in CS projects for the purposes of decision making. Minimum standards and a definition would improve the credibility of CS efforts. In addition, recognition of CS by public authorities would enable and foster civic empowerment by involving citizens in policy-relevant processes (23). Our current catalog of minimum quality criteria could be the basis for an international declaration, a joint effort by CS associations, funding agencies, and policymakers. Indeed, we believe CS projects, practitioners, and participants would all benefit as a result.
CS has amazing potential as an innovative approach to data gathering and experimental design, as well as an educational and outreach tool. Let’s make sure that future CS projects have sufficient rigor to earn the respect of participants, scientists, and policymakers.
Acknowledgments
We thank all working group members and members of the public who contributed to the Quality Criteria for Citizen Science Projects on Österreich forscht.
Footnotes
The authors declare no conflict of interest.
Any opinions, findings, conclusions, or recommendations expressed in this work are those of the authors and have not been endorsed by the National Academy of Sciences.
References
1. Eitzel MV, et al. Citizen science terminology matters: Exploring key terms. Citiz Sci. 2017;2:1–20.
2. Schäfer T, Kieslinger B. Supporting emerging forms of citizen science: A plea for diversity, creativity and social innovation. J Sci Commun. 2016;15:Y02.
3. Guerrini CJ, Majumder MA, Lewellyn MJ, McGuire AL. Citizen science, public policy. Science. 2018;361:134–136. doi:10.1126/science.aar8379.
4. Vayena E, Tasioulas J. Adapting standards: Ethical oversight of participant-led health research. PLoS Med. 2013;10:e1001402. doi:10.1371/journal.pmed.1001402.
5. Buyx A, Del Savio L, Prainsack B, Völzke H. Every participant is a PI. Citizen science and participatory governance in population studies. Int J Epidemiol. 2017;46:377–384. doi:10.1093/ije/dyw204.
6. Heigl F, Dörler D. Public participation: Time for a definition of citizen science. Nature. 2017;551:168. doi:10.1038/d41586-017-05745-8.
7. Irwin A. Citizen Science: A Study of People, Expertise and Sustainable Development. Routledge Chapman & Hall; Abingdon, United Kingdom: 1995.
8. Bio Innovation Service. Citizen science for environmental policy: Development of an EU-wide inventory and analysis of selected practices. 2018. Available at https://publications.europa.eu/en/publication-detail/-/publication/842b73e3-fc30-11e8-a96d-01aa75ed71a1/language-en. Accessed February 27, 2019.
9. Strasser BJ, Baudry J, Mahr D, Sanchez G, Tancoigne E. “Citizen science”? Rethinking science and public participation. 2018. Available at https://archive-ouverte.unige.ch/unige:100156. Accessed February 27, 2019.
10. Hecker S, et al. Innovation in citizen science: Perspectives on science-policy advances. Citiz Sci. 2018;3:4.
11. European Commission. EN Horizon 2020 Work Programme 2018–2020. 2017. Available at ec.europa.eu/research/participants/data/ref/h2020/wp/2018-2020/main/h2020-wp1820-swfs_en.pdf. Accessed February 27, 2019.
12. Cooper C. Links and distinctions among citizenship, science, and citizen science. A response to “The Future of Citizen Science.” Democr Educ. 2012;20:Article 13.
13. Wiggins A, Crowston K. From conservation to crowdsourcing: A typology of citizen science. In: Proceedings of the 2011 44th Hawaii International Conference on System Sciences. IEEE Computer Society; Washington, DC: 2011. pp 1–10.
14. Friesen J, Rodríguez-Sinobas L. Advanced Tools for Integrated Water Resources Management. Academic Press; Cambridge, MA: 2018.
15. European Citizen Science Association. Ten principles of citizen science. 2015. Available at ecsa.citizen-science.net/sites/default/files/ecsa_ten_principles_of_citizen_science.pdf. Accessed February 27, 2019.
16. Kraker P, et al. The Vienna Principles: A Vision for Scholarly Communication in the 21st Century. Zenodo; Geneva, Switzerland: 2016.
17. Heigl F, et al. Quality criteria for citizen science projects on Österreich forscht: Version 1.1. 2018. Available at https://www.researchgate.net/publication/326143643_Quality_Criteria_for_Citizen_Science_Projects_on_Osterreich_forscht_Version_11. Accessed February 27, 2019.
18. Sanz FS, Holocher-Ertl T, Kieslinger B, García FS, Silva CG. White paper on Citizen Science for Europe. 2014. Available at https://ec.europa.eu/futurium/en/content/white-paper-citizen-science. Accessed February 27, 2019.
19. Rynes SL, Colbert AE, O’Boyle EH. When the “best available evidence” doesn’t win: How doubts about science and scientists threaten the future of evidence-based management. J Manage. 2018;44:2995–3010.
20. Motta M. The enduring effect of scientific interest on trust in climate scientists in the United States. Nat Clim Chang. 2018;8:485.
21. Makarovs K, Achterberg P. Science to the people: A 32-nation survey. Public Underst Sci. 2018;27:876–896. doi:10.1177/0963662517754047.
22. European Commission. Annexes to the Proposal for a Decision of the European Parliament and of the Council on establishing the specific programme implementing Horizon Europe – the Framework Programme for Research and Innovation. 2018. Available at https://ec.europa.eu/commission/sites/beta-political/files/budget-may2018-horizon-europe-decision_en.pdf. Accessed March 13, 2019.
23. Turrini T, Dörler D, Richter A, Heigl F, Bonn A. The threefold potential of environmental citizen science: Generating knowledge, creating learning opportunities and enabling civic participation. Biol Conserv. 2018;225:176–186.

