Abstract
The contemporary context for open science is characterised by growing international policy momentum toward more transparent, inclusive, and collaborative scientific practices that build on the transformative potential of openness to enhance research equity, reproducibility, and collaboration. However, the shift from aspiration to implementation remains uneven and contested. Persistent structural challenges (e.g. misaligned incentives, disparities in infrastructure, and uneven access to funding and training) risk reinforcing rather than dismantling existing inequalities. The rapid proliferation of artificial intelligence (AI) technologies, particularly generative AI (GenAI), raises new concerns about epistemic authority, research integrity, and the ownership of knowledge. While AI could accelerate discovery and widen access to scientific information, it may also amplify exclusionary dynamics or undermine the transparency that open science aims to promote. This commentary describes how the unfolding integration of AI into research practices intersects with the goals and challenges of open science, particularly in relation to equity and inclusion. I surface tensions between open science and research equity, examining how changing technological landscapes intersect with long-standing socioeconomic and structural barriers. The paper argues for renewed attention to the normative foundations of open science, particularly regarding intellectual property, inclusive infrastructure, and equitable participation.
Introduction
Science has always relied on epistemological openness to function effectively. The tradition of peer review, the dissemination of research findings, and the collaborative nature of scientific inquiry all rest on principles of transparency and accountability. This openness allows scientific claims to be scrutinised, replicated, and refined by the broader community, facilitating the testing and consolidation of empirical knowledge. Openness in science should be understood both as a technical standard and as a principle of just and inclusive knowledge production. Scientific openness also fosters collective progress by enabling researchers to build on one another's work. In this sense, openness is not an optional value but a foundational condition for scientific credibility, cumulative knowledge, and public trust. However, the historical trend towards openness in science is not linear, and scientific institutions and communities have often been, or become, ‘closed’ in their practice [1, 2].
Possibilities for inclusive science continue to evolve. The world is becoming more networked, offering greater coordination of infrastructure (institutional repositories, digital object identifiers (DOIs), Open Researcher and Contributor ID (ORCID) accounts, metadata). Training in open scientific practices is becoming more common, increasing rates of open access publication, open source code sharing, and open data. Open science is supported by many significant policies, including the UNESCO Recommendation on Open Science [3]; Plan S [4]; the European Open Science Cloud (EOSC) [5]; the San Francisco Declaration on Research Assessment [6]; and, most recently, the UNESCO Dubai Declaration on open educational resources as digital public goods [7]. Such initiatives anticipate an entire open science ecosystem in which the values of equity, accountability, and participation are embedded at the core of research systems.
UNESCO [3] recognises that realising open science means navigating complex and interconnected political, environmental and socio-economic challenges. The benefits of open science, and the ability to participate in it, are not equally distributed. A recent scoping review [8] found that greater openness does not in itself result in increased capacity or inclusion: the capacity to participate in open science varies greatly by geography, demographics, economic advantage and institutional culture. There is an evident tension between the normative aspirations of open science and the structural inequalities that persist across scientific communities. Against this backdrop, the recent and dizzying rise of AI technologies, and particularly GenAI, has added divisiveness and complexity [9]. This commentary considers how the unfolding integration of AI into research practices intersects with the goals and challenges of open science, particularly in relation to equity and inclusion.
Main text
Open science as democratisation of scientific process
Open science is often framed as a means to democratise the scientific process by expanding who can access, participate in, and benefit from research. Open access publishing allows research outputs to be read by anyone with an internet connection, helping to bridge knowledge divides. Open data and open code enable the reanalysis, replication, and creative reuse of research materials, thereby decentralising expertise and allowing for new insights and applications. Open peer review and participatory research methods, including citizen science [9], offer pathways for non-traditional contributors to influence scientific agendas and validate knowledge claims. Machine learning can also improve accessibility and support universal design, for example through automated captioning and translation.
From the perspective of epistemic democratisation, there is an opacity and concentration of power inherent in many AI systems. At the point of use, GenAI services are still rapidly proliferating. However, most advanced AI models are developed by a small number of well-resourced corporations and institutions, often using proprietary data, closed algorithms, and resource-intensive infrastructure that are inaccessible to most researchers, particularly those in the Global South [10]. Access to cutting-edge tools and the ability to shape research agendas are thus increasingly centralised, in contrast to the open science vision of democratised science. The extent to which these services truly support equitable practice remains debated [11], and it is unclear how AI companies exploit data uploaded to their platforms.
The behaviour of AI companies is driven by strategic attempts to capture market share and may embody forms of techno-colonialism [12]. However, AI technologies could be developed and implemented in other ways. Alternative AI models that prioritise transparency, inclusivity, and community governance could offer a more democratic approach to technological development [13]. Open-source and community-led AI initiatives align more closely with the values of openness and epistemic justice. Such models promote transparency by providing access to training data sources, model architectures, and development processes, allowing for scrutiny and reproducibility [14]. These approaches could also foster participation in open science by welcoming contributions from a diverse global community, including researchers from underrepresented regions, interdisciplinary teams, and civil society actors. Some projects foreground ethical considerations such as multilingual representation, bias mitigation, and, crucially, equitable access to computational resources. BLOOM, for example, was developed through a collaborative, international process explicitly designed to challenge the dominance of Anglophone and Western-centric AI models [15]. However, such services are not the norm.
AI, open science and research integrity
Transparency and reproducibility are cornerstones of scientific integrity [16, 17]. Peer review has traditionally functioned as a safeguard against misinformation and poor methodology, though the traditional model faces challenges [18, 19]. One of the most pressing challenges facing scholarly publishing today is the exponential increase in research article submissions, raising concerns about research quality, sustainability, and integrity. The ongoing shift from subscription-based access to article processing charges (APCs) in open access publishing incentivises publishers to explore new business models: journals and platforms reliant on APCs often benefit financially from higher volumes of accepted papers. This puts pressure on the research infrastructure. While GenAI has been highlighted as a possible source of efficiency for peer review [20], GenAI tools are also making it increasingly easy for bad actors to quickly create content, shape it into a reasonable approximation of an academic contribution, and submit it for peer review. This exacerbates the problem, results in the retraction of published papers [21], and undermines trust.
Academic publishers continue to aggressively defend their market dominance while open alternatives continue to proliferate [22, 23]. While academic publishing remains highly profitable, the academics and other researchers who contribute intellectual property and labour rarely see such benefits. APCs are also fundamentally inequitable since they discriminate against those with fewer resources. There is ongoing debate about the move towards more equitable models, such as Diamond Open Access [24], which could improve integrity by mitigating commercial interests.
Interdisciplinary collaboration is particularly critical in AI-driven research. Many of the most pressing scientific questions today require expertise from multiple domains: biologists working with data scientists, engineers collaborating with climate researchers, policymakers engaging with social scientists, and so on. Open science practices make such collaborations more feasible by ensuring that data, methodologies, and findings are freely available for collective analysis. AI tools can also provide ways to coordinate and integrate multidisciplinary research, though they raise questions about research integrity, such as a lack of authorial oversight when using writing tools or when summarising domain-specific content.
While open science is often praised for fostering data transparency, sharing, and reuse, Leonelli [25] warns that a simplistic focus on populating a universal commons can inadvertently reinforce epistemic injustice, marginalising diverse research cultures and undervaluing non-mainstream methods. She argues instead for conceptualising open science as “judicious connection” between diverse systems of practice and research cultures. Such approaches emphasise ongoing reflection, contextualisation, and community-building rather than treating open access publication as a sufficient condition for open science.
Open access publishing plays a crucial role in broadening knowledge dissemination, making research findings available to a wider audience. AI tools that purport to help researchers navigate the vast amount of literature being produced, such as search engines and literature summary and synthesis tools, can potentially play an important role, but greater attention needs to be paid to their effects on the integrity of research practices [26].
Open science for equity and inclusion
The call for “judicious connection” [25] in open science is also relevant across issues of equity, diversity and inclusion. Science and academia remain fundamentally inequitable, dominated by wealthier institutions from the Global North. Historically, access to scientific literature has been restricted by paywalls, APCs, institutional limitations, and opaque data-sharing practices. English has become the de facto language of global science, privileging native speakers and creating barriers for researchers working in underrepresented or low-resource languages. This linguistic dominance limits the visibility and perceived legitimacy of non-English research, marginalising localised knowledge and reinforcing patterns of epistemic exclusion. Researchers who must translate or adapt their work for international publication face additional labour and cost burdens, which further exacerbate inequities in dissemination and recognition.
Although open science encourages data sharing, preprints, and collaborative methods, these practices often go unrewarded in career evaluations that continue to privilege high-impact journal publications. These barriers disproportionately impact researchers in low- and middle-income countries, independent scholars, and those outside traditional academic institutions [27, 28]. While the platformisation of citizen science [9, 29] has expanded opportunities for public participation in scientific research, such opportunities are not equally distributed.
Can AI support greater equity, diversity and inclusion (EDI) in open science? The answer depends less on the technology itself and more on how it is designed, governed, and integrated into research ecosystems. AI has the technological potential to increase access to knowledge, reduce technical burdens, and enable new forms of participation. AI-assisted translation and language refinement tools could enhance accessibility, allowing researchers to communicate their findings across linguistic and cultural boundaries with greater ease and offering a meaningful step toward a more multilingual and inclusive open science ecosystem. However, EDI considerations are often neglected in system development [30] and tend to be reduced to matters of technical accessibility rather than socio-economic or political considerations [31]. This has led to calls for more socially responsible system design [32] and a need to address structural barriers more directly [33].
There is a growing recognition of the need to balance global alignment with local sensitivity [25]. Open science must be adaptable to diverse epistemologies, knowledge systems, and community needs. An illustrative challenge is the potential conflict between the imperative to share knowledge as widely as possible and traditional or indigenous knowledge systems, which may have their own protocols for sharing [34, 35]. In such cases, EDI is respected by refraining from the attempt to make all forms of knowledge as widely available as possible.
Outlook
Openness is a fundamental principle that underpins the scientific ecosystem. By strengthening transparency, accessibility, and impact, open science enhances the methodological, epistemological and ethical dimensions of research. In the age of AI, maintaining openness is more crucial than ever for ensuring rigour, reproducibility, and public trust. The outlook for open science within the global research community is one of both growing momentum and increasing complexity. Legal frameworks have not kept pace with technological progress [7], which means that members of research ecosystems must reflect ethically on their practice and their contributions to open science.
There is an ongoing need for knowledge transfer services that can support dialogue and make participation possible for a wider range of stakeholders. AI can potentially build capacity that supports equitable participation [8], but there are significant challenges in avoiding algorithmic discrimination [36]. There is also heightened awareness of the risks posed by unrestricted access to increasing amounts of data, including data misuse and issues of research security. As a result, open science is evolving into a more nuanced set of practices with complex legal, ethical and epistemological obligations. The challenge ahead lies in developing inclusive, interoperable frameworks that support global collaboration while safeguarding research integrity, privacy, and trust, and in expanding EDI by addressing structural as well as technical barriers to participation. International collaboration is essential, but it must be equitable and not just a way for Global North actors to export unethical practices [37] under the banner of being “open”.
Acknowledgements
N/A
Abbreviations
- AI: Artificial Intelligence
- APCs: Article Processing Charges
- DOIs: Digital Object Identifiers
- EDI: Equity, Diversity and Inclusion
- EOSC: European Open Science Cloud
- GenAI: Generative Artificial Intelligence
- ORCID: Open Researcher and Contributor ID
Author contributions
N/A.
Author information
Robert Farrow is Senior Research Fellow at the Institute of Educational Technology, The Open University (UK) where he leads the research programme Learning in an Open World. He is the co-director of the Global OER Graduate Network; the co-editor of the Journal for Interactive Media in Education; and the co-lead for the ICDE Open Education Network.
Funding
N/A.
Availability of data and materials
No datasets were generated or analysed during the current study.
Declarations
Ethics approval and consent to participate
N/A.
Consent for publication
N/A.
Competing interests
The author declares no competing interests.
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
References
- 1. David PA. The historical origins of “open science”: an essay on patronage, reputation and common agency contracting in the scientific revolution. Capitalism and Society [Internet]. 2008;3(2). 10.2202/1932-0213.1040
- 2. Peter S, Deimann M. On the role of openness in education: a historical reconstruction. Open Prax. 2013;5(1):7. 10.5944/openpraxis.5.1.23
- 3. UNESCO. UNESCO Recommendation on Open Science [Internet]. 2021. Available from: https://www.unesco.org/en/open-science/about
- 4. cOAlition S. Why Plan S? [Internet]. Available from: https://www.coalition-s.org/why-plan-s/
- 5. European Commission. European Open Science Cloud (EOSC) [Internet]. Available from: https://research-and-innovation.ec.europa.eu/strategy/strategy-research-and-innovation/our-digital-future/open-science/european-open-science-cloud-eosc_en
- 6. DORA. San Francisco Declaration on Research Assessment [Internet]. 2017. Available from: https://sfdora.org/read/
- 7. UNESCO. Dubai Declaration on Open Educational Resources (OER): digital public goods and emerging technologies for equitable and inclusive access to knowledge [Internet]. 2025. Available from: https://unesdoc.unesco.org/ark:/48223/pf0000392271.locale=en
- 8. Ross-Hellauer T, Reichmann S, Cole NL, Fessl A, Klebel T, Pontika N. Dynamics of cumulative advantage and threats to equity in open science: a scoping review. R Soc Open Sci. 2022. 10.1098/rsos.211032
- 9. Fraisl D, Hager G, Bedessem B, Gold M, Hsing PY, Danielsen F, et al. Citizen science in environmental and ecological sciences. Nat Rev Methods Primers. 2022;2(1):1–20. 10.1038/s43586-022-00144-4
- 10. Bozkurt A, Xiao J, Farrow R, Bai JYH, Nerantzi C, Moore S, et al. The manifesto for teaching and learning in a time of generative AI: a critical collective stance to better navigate the future. Open Praxis. 2024;16(4):487–513. 10.55982/openpraxis.16.4.777
- 11. Widder DG, West S, Whittaker M. Open (for business): big tech, concentrated power, and the political economy of open AI [Internet]. Social Science Research Network; 2023. Available from: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4543807
- 12. Tacheva J, Ramasubramanian S. AI empire: unraveling the interlocking systems of oppression in generative AI’s global order. Big Data Soc. 2023;10(2). 10.1177/20539517231219241
- 13. Farrow R. The possibilities and limits of XAI in education: a socio-technical perspective. Learn Media Technol. 2023:1–14. 10.1080/17439884.2023.2185630
- 14. Liesenfeld A, Dingemanse M. Rethinking open source generative AI: open-washing and the EU AI Act. In: Proceedings of FAccT ’24; 2024 Jun 3–6; Rio de Janeiro, Brazil. 2024. 10.1145/3630106.3659005
- 15. Voiceflow. What’s BLOOM and why is it democratizing AI? [Internet]. 2025 [cited 2025 Jul 2]. Available from: https://www.voiceflow.com/blog/bloom-ai
- 16. Resnik DB, Shamoo AE. Reproducibility and research integrity. Account Res. 2016;24(2):116–23. Available from: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5244822/
- 17. Moher D, Bouter L, Kleinert S, Glasziou P, Sham MH, Barbour V, et al. The Hong Kong Principles for assessing researchers: fostering research integrity. PLoS Biol. 2020;18(7):e3000737. 10.1371/journal.pbio.3000737
- 18. Mollaki V. Death of a reviewer or death of peer review integrity? The challenges of using AI tools in peer reviewing and the need to go beyond publishing policies. Res Ethics. 2024;20(2):239. 10.1177/17470161231224552
- 19. Drozdz JA, Ladomery MR. The peer review process: past, present, and future. Br J Biomed Sci. 2024;81:12054. Available from: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11215012/
- 20. Saad A, Jenko N, Ariyaratne S, Birch N, Iyengar KP, Davies AM, et al. Exploring the potential of ChatGPT in the peer review process: an observational study. Diabetes Metab Syndr Clin Res Rev. 2024;18(2):102946. 10.1016/j.dsx.2024.102946
- 21. Nguyen MH, Vuong QH. Artificial intelligence and retracted science. AI Soc. 2024. 10.1007/s00146-024-02090-z
- 22. Larivière V, Haustein S, Mongeon P. The oligopoly of academic publishers in the digital era. PLoS ONE. 2015;10(6):e0127502. 10.1371/journal.pone.0127502
- 23. Logullo P, de Beyer JA, Kirtley S, Schlüssel MM, Collins GS. Open access journal publication in health and medical research and open science: benefits, challenges and limitations. BMJ Evid Based Med. 2023;29(4). Available from: https://ebm.bmj.com/content/early/2023/09/28/bmjebm-2022-112126
- 24. Dufour Q, Pontille D, Torny D. Supporting diamond open access journals. Nord J Libr Inf Stud. 2023;4(2):35–55. 10.7146/njlis.v4i2.140344
- 25. Leonelli S. Philosophy of open science. Cambridge: Cambridge University Press; 2023. 10.1017/9781009416368
- 26. Azevedo L, Mallinson DJ, Wang J, Robles P, Best E. AI policies, equity, and morality and the implications for faculty in higher education. Public Integr. 2024:1–16. 10.1080/10999922.2024.2414957
- 27. Pineda P, Mishra S. The semantics of diversity in higher education: differences between the Global North and Global South. High Educ. 2022. 10.1007/s10734-022-00870-4
- 28. Nobes A, Harris S. Open access in low- and middle-income countries: attitudes and experiences of researchers. Emerald Open Res. 2023;1(3). 10.1108/EOR-03-2023-0006
- 29. Aristeidou M, Herodotou C. Online citizen science: a systematic review of effects on learning and scientific literacy. Citiz Sci Theory Pract. 2020. 10.5334/cstp.224
- 30. Shams RA, Zowghi D, Bano M. AI and the quest for diversity and inclusion: a systematic literature review. AI Ethics. 2023. 10.1007/s43681-023-00362-w
- 31. Chi N, Lurie E, Mulligan DK. Reconfiguring diversity and inclusion for AI ethics. In: Proceedings of the 2021 AAAI/ACM Conference on AI, Ethics, and Society; 2021. 10.1145/3461702.3462622
- 32. Güven Ç, Alishahi A, Brighton H, Nápoles G, Olier JS, Šafář M, et al. AI in support of diversity and inclusion. arXiv; 2025. 10.48550/arXiv.2501.09534
- 33. Bossu C, Iniesto F. Equity, diversity, and inclusion in open education. In: Advances in interdisciplinary research. Routledge; 2025. p. 71–9. 10.4324/9781003625391-7
- 34. Farrow R. Cosmopolitics and the commons. Education Ouverte et Libre – Open Education. 2025;(3). 10.52612/journals/eol-oe.2025.e1632
- 35. Callison C, Ludbrook A, Owen V, Nayyer K. Engaging respectfully with indigenous knowledges. KULA: Knowledge Creation Dissemin Preserv Stud. 2021;5(1). 10.18357/kula.146
- 36. Idowu JA. Debiasing education algorithms. Int J Artif Intell Educ. 2024. 10.1007/s40593-023-00389-4
- 37. Saleh S, Sambakunsi H, Nyirenda D, Kumwenda M, Mortimer K, Chinouya M. Participant compensation in global health research: a case study. Int Health. 2020;12(6):524–32. 10.1093/inthealth/ihaa064
