F1000Res. 2017 Jun 13;6:ELIXIR-876. [Version 1] doi: 10.12688/f1000research.11407.1

Four simple recommendations to encourage best practices in research software

Rafael C Jiménez 1,a, Mateusz Kuzak 2,b, Monther Alhamdoosh 3, Michelle Barker 4, Bérénice Batut 5, Mikael Borg 6, Salvador Capella-Gutierrez 7, Neil Chue Hong 8, Martin Cook 1, Manuel Corpas 9, Madison Flannery 10, Leyla Garcia 11, Josep Ll Gelpí 12,13, Simon Gladman 10, Carole Goble 14, Montserrat González Ferreiro 11, Alejandra Gonzalez-Beltran 15, Philippa C Griffin 10, Björn Grüning 5, Jonas Hagberg 6, Petr Holub 16, Rob Hooft 17, Jon Ison 18, Daniel S Katz 19,20,21,22, Brane Leskošek 23, Federico López Gómez 1, Luis J Oliveira 24, David Mellor 25, Rowland Mosbergen 26, Nicola Mulder 27, Yasset Perez-Riverol 11, Robert Pergl 28, Horst Pichler 29, Bernard Pope 10, Ferran Sanz 30, Maria V Schneider 10, Victoria Stodden 20, Radosław Suchecki 31, Radka Svobodová Vařeková 32,33, Harry-Anton Talvik 34, Ilian Todorov 35, Andrew Treloar 36, Sonika Tyagi 10,37, Maarten van Gompel 38, Daniel Vaughan 11, Allegra Via 39, Xiaochuan Wang 40, Nathan S Watson-Haigh 31, Steve Crouch 41,c
PMCID: PMC5490478  PMID: 28751965

Abstract

Scientific research relies on computer software, yet software is not always developed following practices that ensure its quality and sustainability. This manuscript does not aim to propose new software development best practices, but rather to provide simple recommendations that encourage the adoption of existing best practices. Software development best practices promote better quality software, and better quality software improves the reproducibility and reusability of research. These recommendations are designed around Open Source values, and provide practical suggestions that contribute to making research software and its source code more discoverable, reusable and transparent. This manuscript is aimed at developers, but also at organisations, projects, journals and funders that can increase the quality and sustainability of research software by encouraging the adoption of these recommendations.

Keywords: Open Source, code, software, guidelines, best practices, recommendations, Open Science, quality, sustainability, FAIR

Introduction

New discoveries in modern science are underpinned by automated data generation, processing and analysis: in other words, they rely on software. Software, particularly in the context of research, is not only a means to an end, but is also a collective intellectual product and a fundamental asset for building scientific knowledge. More than 90% of scientists acknowledge software is important for their own research and around 70% say their research would not be feasible without it ( Hannay et al., 2009; Hettrick et al., 2016).

Scientists are not just users of software; they are also prime producers ( Goble, 2014). 90% of scientists developing software are primarily self-taught and lack exposure and incentives to adopt software development practices that are widespread in the broader field of software engineering ( Wilson et al., 2014). As a result, software produced for research does not always meet the standards that would ensure its quality and sustainability, affecting the reproducibility and reusability of research ( Crouch et al., 2013).

Open Source Software (OSS) is software with source code that anyone can inspect, modify and enhance. OSS development is used by organisations and projects to improve accessibility, reproduction, transparency and innovation in scientific research ( Mulgan et al., 2005; Nosek et al., 2015). OSS not only increases discoverability and visibility, but it also engages developer and user communities, provides recognition for contributors, and builds trust among users ( McKiernan et al., 2016). OSS development significantly contributes to the reproducibility of results generated by the software and facilitates software reusability and improvement ( Ince et al., 2012; Perez-Riverol et al., 2014). Opening code to the public is also an opportunity for developers to showcase their work, so it becomes an incentive for adoption of software development best practices ( Leprevost et al., 2014). Thus, OSS can be used as a vehicle to promote the quality and sustainability of software, leading to the delivery of better research.

This manuscript describes a core set of OSS recommendations to improve the quality and sustainability of research software. It does not propose new software development best practices, but rather provides easy-to-implement recommendations that encourage adoption of existing best practices. These recommendations do not aim to describe in detail how to develop software, but rather lay out practical suggestions on top of Open Source values that go towards making research software and its source code more discoverable, reusable and transparent.

The OSS recommendations should be applied in conjunction with existing, complementary guidelines such as best practices, manifestos and principles that describe more specific procedures for developing and managing software. Some of these complementary guidelines relate to version control, code review, automated testing, code formatting, documentation, citation and usability ( Artaza et al., 2016; DagstuhlEAS, 2017; Gilb, 1988; Leprevost et al., 2014; List et al., 2017; Perez-Riverol et al., 2016; Prlić & Procter, 2012; Smith et al., 2016; Wilson et al., 2014; Wilson et al., 2016).

This manuscript also aims to encourage projects, journals, funders and organisations to both endorse the recommendations and to drive compliance through their software policies. The recommendations are accompanied by a list of arguments addressing common questions and fears raised by the research community when considering open sourcing software.

In this manuscript, software is broadly defined to include command line software, graphical user interfaces, desktop and mobile applications, web-based services, application program interfaces (APIs) and infrastructure scripts that help to run services.

Target audience

Our target audience includes leaders and managers of organisations and projects, journal editorial bodies, and funding agencies concerned with the provision of products and services relying on the development of open research software. We want to provide these stakeholders with a simple approach to drive the development of better software. Though these OSS recommendations have mostly been developed within, and received feedback from, the life science community, the document and its recommendations apply to all research fields.

Strategies to increase software quality usually target software developers, focusing on training and adoption of best practices ( Wilson et al., 2014). This approach can yield good results, but requires a significant effort as well as personal commitment from developers ( Wilson, 2014). For an organisation employing scientists and developers with different sets of programming skills and responsibilities, it is not easy to endorse specific best practices or define a broad range of training needs. It is easier to endorse a set of basic recommendations that are simple to monitor, simple to comply with, and which drive the adoption of best practices and reveal training needs. The OSS recommendations aim to create awareness, encourage developers to be more conscious of best practices, and make them more willing to collaborate and request support. The recommendations define broad guidelines, giving developers freedom to choose how to implement specific best practices.

In terms of the adoption of these recommendations, we see endorsement as the first step: that is, agreeing to support the OSS recommendations without a formal process for implementation. Promotion is a second step: that is, actively publicising and incentivising the OSS recommendations within the organisation as well as globally. Compliance is the third step: to formally implement them within the organisation, with ongoing monitoring and public reporting if possible. To facilitate progress, we propose that organisations, projects, journals, as well as funding agencies include these OSS recommendations as part of their policies relating to the development and publication of software.

Open Source Software is not just adopted by non-profit organisations, but also by commercial companies as a business model ( Popp, 2015). Therefore, we encourage not only publicly funded projects but also for-profit entities to adopt OSS and support these recommendations.

Recommendations

1. Make source code publicly accessible from day one

Develop source code in a publicly accessible, version-controlled repository (e.g., GitHub or Bitbucket) from the beginning of the project; a minimal sketch of this workflow follows the list below. The longer a project is run in a closed manner, the harder it is to open it later ( Fogel, 2005). Opening code and exposing the software development life cycle publicly from day one:

  • Promotes trust in the software and broader project

  • Facilitates the discovery of existing software development projects

  • Provides a historical public record of contributions from the start of the project and helps to track recognition

  • Encourages contributions from the community

  • Increases opportunities for collaboration and reuse

  • Exposes work for community evaluation, suggestions and validation

  • Increases transparency through community scrutiny

  • Encourages developers to think about and showcase good coding practices

  • Facilitates reproducibility of scientific results generated by all prior versions of the software

  • Encourages developers to provide documentation, including a detailed user manual and clear in-code comments
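
For illustration, a minimal sketch of this workflow in Python (driving the git command line via subprocess) is shown below. It assumes git is installed, that a README.md and LICENSE file already exist in the working directory, and that an empty public repository has been created at the placeholder URL; adapt the details to your own hosting service.

```python
"""Minimal sketch: publish a project's history from its first commit.

Assumptions (illustrative only): git is installed, README.md and LICENSE
already exist, and an empty public repository has been created at the
placeholder URL below.
"""
import subprocess


def git(*args):
    # Run a git command and raise an error if it fails.
    subprocess.run(["git", *args], check=True)


git("init")
git("add", "README.md", "LICENSE")            # start with documentation and a licence
git("commit", "-m", "Initial public commit")
git("branch", "-M", "main")                   # name the default branch explicitly
git("remote", "add", "origin",
    "https://github.com/your-org/your-project.git")  # placeholder remote URL
git("push", "-u", "origin", "main")
```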

Some common doubts and questions about making software Open Source are discussed in the Supplementary File S1, “Fears of open sourcing and some ways to handle them”.

2. Make software easy to discover by providing software metadata via a popular community registry

Facilitate discoverability of the software project and its source code by registering metadata related to the software in a popular community registry. Metadata might include information like the source code location, contributors, licence, version, identifier, references and how to cite the software. Metadata registration:

  • Increases the visibility of the project, the software, its use, its successes, its references, and its contributors

  • Provides easy access for software packagers to deploy your software, thus increasing visibility

  • Encourages software providers to think about the metadata that describes software as well as how to expose such metadata

  • Helps to expose the software metadata in a machine readable format via the community registry

  • Increases the chances of collaboration, reuse, and improvement

Examples of community registries of software metadata are bio.tools ( Ison et al., 2016), biojs.io ( Corpas et al., 2014; Gómez et al., 2013) and OMICtools ( Henry et al., 2014) in the life sciences, and DataCite ( Brase, n.d.) as a generic metadata registry for software as well as data.
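
For illustration, the Python sketch below writes a small machine-readable description file whose field names loosely follow the CodeMeta vocabulary; the project name, repository URL, identifier and author are placeholders, and individual registries may expect different or additional fields.

```python
"""Minimal sketch: record software metadata in a machine-readable file.

Field names loosely follow the CodeMeta vocabulary; all values are
placeholders for illustration only.
"""
import json

metadata = {
    "@context": "https://doi.org/10.5063/schema/codemeta-2.0",
    "@type": "SoftwareSourceCode",
    "name": "your-project",                       # placeholder project name
    "version": "0.1.0",
    "license": "https://spdx.org/licenses/MIT",   # match the repository licence
    "codeRepository": "https://github.com/your-org/your-project",
    "identifier": "doi-of-an-archived-release",   # placeholder, e.g. a Zenodo DOI
    "author": [
        {"@type": "Person", "givenName": "Ada", "familyName": "Lovelace"}
    ],
}

# Write the metadata next to the source code so registries and packagers can pick it up.
with open("codemeta.json", "w") as handle:
    json.dump(metadata, handle, indent=2)
```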

3. Adopt a licence and comply with the licence of third-party dependencies

Adopt a suitable Open Source licence to clarify how to use, modify and redistribute the source code under defined terms and conditions. Define the licence in a publicly accessible source code repository, and ensure the software complies with the licences of all third party dependencies. Providing a licence:

  • Clarifies the responsibilities and rights placed on third parties wishing to use, copy, redistribute, modify and/or reuse your source code

  • Enables using the code in jurisdictions where “code with no licence” means it cannot be used at all

  • Protects the software’s intellectual property

  • Provides a model for long-term sustainability by enabling legally well-founded contributions and reuse

We advise choosing an OSI-approved Open Source licence unless your institution or project requires a different licence. Websites like “ Choose an open source license” provide guidelines to help users select an OSI-approved Open Source licence. Organisations like OSS Watch also provide advice on how to keep track of the licences of software dependencies. For reusability reasons, we also advise authors to disclose any patents and pending patent applications known to them that affect the software.
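
As a small illustration of keeping track of third-party licences, the Python sketch below lists the licence metadata declared by the packages installed in the current environment. It covers only Python dependencies and trusts whatever each package declares, so it is a starting point for a licence review rather than a compliance check.

```python
"""Minimal sketch: list the declared licences of installed Python packages.

This only inspects the current environment and relies on the metadata each
package declares; it is not a substitute for a proper licence review.
"""
from importlib.metadata import distributions

for dist in sorted(distributions(), key=lambda d: (d.metadata["Name"] or "").lower()):
    name = dist.metadata["Name"] or "unknown package"
    licence = dist.metadata["License"] or "unknown"
    # Trove classifiers often state the licence more clearly than the License field.
    classifiers = [c for c in (dist.metadata.get_all("Classifier") or [])
                   if c.startswith("License ::")]
    note = f" ({'; '.join(classifiers)})" if classifiers else ""
    print(f"{name}: {licence}{note}")
```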

4. Define clear and transparent contribution, governance and communication processes

Open sourcing your software does not mean the software has to be developed in a publicly collaborative manner. Although such collaboration is desirable, the OSS recommendations do not mandate a strategy for collaborating with the developer community. However, projects should be clear about how contributions can be made and incorporated, by having a transparent governance model and communication channels. Clarity on the project structure, as well as its communication channels and ways to contribute:

  • Increases transparency on how the project and the software are managed

  • Helps to define responsibilities and how decisions are made in the software project

  • Helps the community know how to collaborate, communicate and contribute to the project

For instance, the Galaxy project’s website describes the team’s structure, how to be part of the community, and their communication channels.
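
For illustration, the Python sketch below scaffolds a placeholder CONTRIBUTING file of the kind a project might publish alongside its source code; the headings and suggested channels are illustrative and should be replaced with the project's actual governance model and communication channels.

```python
"""Minimal sketch: scaffold a placeholder CONTRIBUTING file.

The headings and channels below are illustrative only; adapt them to the
project's real governance model and communication channels.
"""
from pathlib import Path

CONTRIBUTING_TEMPLATE = """\
# Contributing

## How decisions are made
Describe the project's governance model here
(e.g. maintainers, steering committee, decision process).

## How to contribute
Describe how changes are proposed, reviewed and merged
(e.g. issue tracker, pull requests, code review expectations).

## Where to talk to us
List the project's communication channels
(e.g. mailing list, chat, forum).
"""

Path("CONTRIBUTING.md").write_text(CONTRIBUTING_TEMPLATE)
print("Wrote CONTRIBUTING.md; adapt the placeholder sections to your project.")
```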

Alignment with FAIR data principles

The FAIR Guiding Principles for scientific data management and stewardship provide recommendations on how to make research data findable, accessible, interoperable and reusable (FAIR) ( Wilkinson et al., 2016). While the FAIR principles were originally designed for data, they are sufficiently general that their high level concepts can be applied to any digital object including software. Though not all the recommendations from the FAIR data principles directly apply to software, there is good alignment between the OSS recommendations and the FAIR data principles (see Table 1).

Table 1. Comparison between the OSS recommendations and the FAIR data principles ( Wilkinson et al., 2016).

To be Findable: F1. (meta)data are assigned a globally unique and persistent identifier; F2. data are described with rich metadata (defined by R1 below); F3. metadata clearly and explicitly include the identifier of the data it describes; F4. (meta)data are registered or indexed in a searchable resource.
OSS recommendations: “R2. Make software easy to discover by providing software metadata via a popular community registry” aligns with the Findability principle, helping to increase visibility and helping software providers to think about how to describe software metadata (versions, identifiers, contributors, citations, etc.).

To be Accessible: A1. (meta)data are retrievable by their identifier using a standardized communications protocol; A1.1 the protocol is open, free, and universally implementable; A1.2 the protocol allows for an authentication and authorization procedure, where necessary; A2. metadata are accessible, even when the data are no longer available.
OSS recommendations: “R1. Make source code publicly accessible from day one” focuses on openness, including accessibility. The FAIR Accessibility principle, by contrast, allows for access-restricted data, e.g. for privacy reasons. Since such reasons do not apply to software, the OSS recommendations direct towards openness instead, supporting open science to the maximum extent.

To be Interoperable: I1. (meta)data use a formal, accessible, shared, and broadly applicable language for knowledge representation; I2. (meta)data use vocabularies that follow FAIR principles; I3. (meta)data include qualified references to other (meta)data.
OSS recommendations: The OSS recommendations do not aim to address software interoperability directly, but they contribute to a more homogeneous description of software by encouraging software providers to register software metadata in registries that provide specific metadata guidelines.

To be Reusable: R1. meta(data) are richly described with a plurality of accurate and relevant attributes; R1.1. (meta)data are released with a clear and accessible data usage license; R1.2. (meta)data are associated with detailed provenance; R1.3. (meta)data meet domain-relevant community standards.
OSS recommendations: “R3. Adopt a licence and comply with the licence of third-party dependencies” aligns with the Reusability principle, helping to define to what extent the source code can be used and reused by the community, as standalone software or as part of other software. Open availability of tools and libraries working with data formats can be a great help in making data interoperable: e.g. reuse of the same tools to read and write data can prevent subtle interoperability problems. Reproducibility of experiments and reuse of data is facilitated by the open availability of the associated software, which is part of the provenance. All of the OSS recommendations thereby facilitate data Reusability.

There are also distinctions between the OSS recommendations and the FAIR data principles. The FAIR data principles place specific emphasis on machine-readability: the ability of machines to automatically find and use data. This emphasis is not present in the OSS recommendations, which expect machine-readable software metadata to be made available via software registries. The OSS recommendations are less granular and aim to enhance understanding and uptake of best practices; they were also designed with measurability in mind. The FAIR data principles do not yet have such built-in quantification: FAIR metrics are a separate effort under development, led by the Dutch Techcentre for Life Sciences ( Eijssen et al., 2016).

The community registries can play an important role in making software metadata FAIR by capturing, assigning and exposing software metadata following a standard knowledge representation and controlled vocabularies that are relevant for domain-specific communities. Thus we expect the community registries to provide guidelines on how to provide software metadata following the FAIR Guiding Principles ( Wilkinson et al., 2016).

Conclusion

The OSS recommendations aim to encourage the adoption of best practices and thus help to develop better software for better research. These recommendations are designed as practical ways to make research software and its source code more discoverable, reusable and transparent, with the objective of improving its quality and sustainability. Unlike many software development best practices tailored for software developers, the OSS recommendations target a wider audience, particularly research funders, research institutions, journals, group leaders, and managers of projects producing research software. The adoption of these recommendations offers these stakeholders a simple mechanism to promote the development of better software, and offers developers an opportunity to improve and showcase their software development skills.

Acknowledgments

The authors wish to thank all the supporters of the OSS recommendations.

The OSS recommendations presented in this manuscript have been open for discussion for more than a year. This allowed them to be developed by a wide range of stakeholders, including developers, managers, researchers, funders, project coordinators and anybody else concerned with the production of quality software for research. We also organised several workshops and presented this work at several meetings to engage more stakeholders, collect feedback and refine the recommendations. For further information about the OSS recommendations, please visit the following site: https://SoftDev4Research.github.io/recommendations/

Funding Statement

This work was partially supported by ELIXIR-EXCELERATE and CORBEL. ELIXIR-EXCELERATE and CORBEL are funded by the European Commission within the Research Infrastructures programme of Horizon 2020, grant agreement numbers 676559 and 654248. The European workshops were supported by ELIXIR and organised in collaboration with the Software Sustainability Institute and Netherlands eScience Center. The workshop in Australia was supported by EMBL-ABR via its main funders: The University of Melbourne and Bioplatforms Australia.

[version 1; referees: 3 approved]

Supplementary material

Supplementary File 1: ‘Fears of open sourcing and some ways to handle them’. In this appendix we describe some of the common fears related to open sourcing software, and some ways to handle them.

References

  1. Artaza H, Chue Hong N, Manuel C, et al. : Top 10 metrics for life science software good practices [version 1; referees: 2 approved]. F1000Res. 2016;5: pii: ELIXIR-2000. 10.12688/f1000research.9206.1 [DOI] [PMC free article] [PubMed] [Google Scholar]
  2. Brase J: Datacite - A Global Registration Agency for Research Data. SSRN Electronic Journal.n.d. 10.2139/ssrn.1639998 [DOI] [Google Scholar]
  3. Corpas M, Jimenez R, Carbon SJ, et al. : BioJS: an open source standard for biological visualisation – its status in 2014 [version 1; referees: 2 approved]. F1000Res. 2014;3:55. 10.12688/f1000research.3-55.v1 [DOI] [PMC free article] [PubMed] [Google Scholar]
  4. Crouch S, Chue Hong N, Hettrick S, et al. : The Software Sustainability Institute: Changing Research Software Attitudes and Practices. Computing in Science & Engineering. 2013;15(6):74–80. 10.1109/MCSE.2013.133 [DOI] [Google Scholar]
  5. DagstuhlEAS: DagstuhlEAS/draft-Manifesto. GitHub. 2017. Accessed January 5. Reference Source [Google Scholar]
  6. Eijssen L, Evelo C, Kok R, et al. : The Dutch Techcentre for Life Sciences: Enabling data-intensive life science research in the Netherlands [version 2; referees: 2 approved, 1 approved with reservations]. F1000Res. 2016;4:33. 10.12688/f1000research.6009.2 [DOI] [PMC free article] [PubMed] [Google Scholar]
  7. Fogel K: Producing Open Source Software: How to Run a Successful Free Software Project.O’Reilly Media, Inc.2005. Reference Source [Google Scholar]
  8. Gilb T: Principles of Software Engineering Management.Addison-Wesley Professional,1988. Reference Source [Google Scholar]
  9. Goble C: Better Software, Better Research. IEEE Internet Computing. 2014;18(5):4–8. 10.1109/MIC.2014.88 [DOI] [Google Scholar]
  10. Gómez J, García LJ, Salazar GA, et al. : BioJS: An Open Source JavaScript Framework for Biological Data Visualization. Bioinformatics. 2013;29(8):1103–4. 10.1093/bioinformatics/btt100 [DOI] [PMC free article] [PubMed] [Google Scholar]
  11. Hannay JE, MacLeod C, Singer J, et al. : How Do Scientists Develop and Use Scientific Software? In: 2009 ICSE Workshop on Software Engineering for Computational Science and Engineering. 2009. 10.1109/secse.2009.5069155 [DOI] [Google Scholar]
  12. Henry VJ, Bandrowski AE, Pepin AS, et al. : OMICtools: An Informative Directory for Multi-Omic Data Analysis. Database (Oxford). 2014;2014: pii: bau069. 10.1093/database/bau069 [DOI] [PMC free article] [PubMed] [Google Scholar]
  13. Hettrick S, Antonioletti M, Carr L, et al. : UK Research Software Survey 2014. Accessed November 27, 2016. 10.5281/zenodo.14809 [DOI] [Google Scholar]
  14. Ince DC, Hatton L, Graham-Cumming J: The Case for Open Computer Programs. Nature. 2012;482(7386):485–88. 10.1038/nature10836 [DOI] [PubMed] [Google Scholar]
  15. Ison J, Rapacki K, Ménager H, et al. : Tools and Data Services Registry: A Community Effort to Document Bioinformatics Resources. Nucleic Acids Res. 2016;44(D1):D38–47. 10.1093/nar/gkv1116 [DOI] [PMC free article] [PubMed] [Google Scholar]
  16. Leprevost Fda V, Barbosar VC, Francisco EL, et al. : On best practices in the development of bioinformatics software. Front Genet. 2014;5:199. 10.3389/fgene.2014.00199 [DOI] [PMC free article] [PubMed] [Google Scholar]
  17. List M, Ebert P, Albrecht F: Ten Simple Rules for Developing Usable Software in Computational Biology. PLoS Comput Biol. 2017;13(1):e1005265. 10.1371/journal.pcbi.1005265 [DOI] [PMC free article] [PubMed] [Google Scholar]
  18. McKiernan EC, Bourne PE, Brown CT, et al. : How open science helps researchers succeed. eLife. 2016;5: pii: e16800. 10.7554/elife.16800 [DOI] [PMC free article] [PubMed] [Google Scholar]
  19. Mulgan G, Steinberg T, Salem O: Wide Open: Open Source Methods and Their Future Potential.Demos Medical Publishing.2005. Reference Source [Google Scholar]
  20. Nosek BA, Alter G, Banks GC, et al. : SCIENTIFIC STANDARDS. Promoting an open research culture. Science. American Association for the Advancement of Science,2015;348(6242):1422–25. 10.1126/science.aab2374 [DOI] [PMC free article] [PubMed] [Google Scholar]
  21. Perez-Riverol Y, Gatto L, Wang R, et al. : Ten Simple Rules for Taking Advantage of Git and GitHub. PLoS Comput Biol. 2016;12(7):e1004947. 10.1371/journal.pcbi.1004947 [DOI] [PMC free article] [PubMed] [Google Scholar]
  22. Perez-Riverol Y, Wang R, Hermjakob H, et al. : Open source libraries and frameworks for mass spectrometry based proteomics: a developer's perspective. Biochim Biophys Acta. 2014;1844(1 Pt A):63–76. 10.1016/j.bbapap.2013.02.032 [DOI] [PMC free article] [PubMed] [Google Scholar]
  23. Popp KM: Best Practices for Commercial Use of Open Source Software: Business Models, Processes and Tools for Managing Open Source Software. BoD – Books on Demand. 2015. Reference Source [Google Scholar]
  24. Prlić A, Procter JB: Ten simple rules for the open development of scientific software. PLoS Comput Biol. 2012;8(12):e1002802. 10.1371/journal.pcbi.1002802 [DOI] [PMC free article] [PubMed] [Google Scholar]
  25. Smith AM, Katz DS, Niemeyer KE, et al. : Software Citation Principles. PeerJ Comput Sci. 2016;2:e86 10.7717/peerj-cs.86 [DOI] [Google Scholar]
  26. Wilkinson MD, Dumontier M, Aalbersberg IJ, et al. : The FAIR Guiding Principles for scientific data management and stewardship. Sci Data. 2016;3:160018. 10.1038/sdata.2016.18 [DOI] [PMC free article] [PubMed] [Google Scholar]
  27. Wilson G, Aruliah DA, Brown CT, et al. : Best practices for scientific computing. PLoS Biol. 2014;12(1):e1001745. 10.1371/journal.pbio.1001745 [DOI] [PMC free article] [PubMed] [Google Scholar]
  28. Wilson G, Bryan J, Cranston K, et al. : Good Enough Practices in Scientific Computing.2016. Reference Source [DOI] [PMC free article] [PubMed] [Google Scholar]
  29. Wilson G: Software Carpentry: lessons learned [version 1; referees: 3 approved]. F1000Res. 2014;3:62. 10.12688/f1000research.3-62.v1 [DOI] [PMC free article] [PubMed] [Google Scholar]
F1000Res. 2017 Jul 10. doi: 10.5256/f1000research.12314.r23464

Referee response for version 1

Stefanie Betz 1

The article advocates research software openness, presenting four recommendations to improve research software visibility, reusability, and transparency. I really like the article and I think it is important to open research software. Please find below my feedback. (Overall, I agree with the comments of Milad Miladi):

  • In my opinion, the title does not reflect the content of the article. The focus is on adopting OSS and supporting the provided four recommendations to help to develop better software for better research. Currently, only the second part (supporting the provided four recommendations to help to develop better software for better research) is reflected in the title not the part about OSS.

  • I am not sure everybody is familiar with the open registry platforms. Thus, some more information regarding them or a link to background information would be nice.

  • I think recommendation four should include documentation.

I have read this submission. I believe that I have an appropriate level of expertise to confirm that it is of an acceptable scientific standard.

F1000Res. 2017 Jun 26. doi: 10.5256/f1000research.12314.r23467

Referee response for version 1

Greg (Gregory V) Wilson 1

The article presents focused, well-argued advocacy for improving software development practices in the sciences. None of the recommendations will be surprising to those already involved in open science, but as only a small minority of researchers actually do them, it is worth presenting them forcefully and succinctly. I would recommend shortening the introductory material (sections "Introduction" and "Target Audience"), but that is a minor point.

I have read this submission. I believe that I have an appropriate level of expertise to confirm that it is of an acceptable scientific standard.

F1000Res. 2017 Jun 26. doi: 10.5256/f1000research.12314.r23468

Referee response for version 1

Roberto Di Cosmo 1,2

This article presents four simple recommendations that may improve the overall quality and visibility of research software. This reviewer agrees with the basic principles set forth by the authors, and hopes they will be widely shared and adopted at least for software that is expected to last longer than the time it takes for the corresponding research paper to be accepted and/or presented.

I have read this submission. I believe that I have an appropriate level of expertise to confirm that it is of an acceptable scientific standard.
