editorial
. 2024 Jun 4;27(15):2587–2599. doi: 10.1080/1369118X.2024.2353782

Sphere transgressions: reflecting on the risks of Big Tech expansionism

Marthe Stevens, Steven R. Kraaijeveld, Tamar Sharon
PMCID: PMC11716645  PMID: 39802962

ABSTRACT

The rapid expansion of Big Tech companies into various societal domains (e.g., health, education, and agriculture) over the past decade has led to increasing concerns among governments, regulators, scholars, and civil society. While existing theoretical frameworks—often revolving around privacy and data protection, or market and platform power—have shed light on important aspects of Big Tech expansionism, there are other risks that these frameworks cannot fully capture. In response, this editorial proposes an alternative theoretical framework based on the notion of sphere transgressions, which draws on political philosopher Michael Walzer's theory of justice. The editorial not only introduces the sphere transgressions framework, but also highlights its potential to generate novel research questions. Furthermore, it introduces eight articles and one commentary by an interdisciplinary group of scholars who critically examine Big Tech expansionism, either from the perspective of different societal spheres or by engaging with the sphere transgressions framework itself.

Introduction

The past decade has witnessed the rapid expansion of Big Tech into societal domains where these companies have not traditionally operated (Birch & Bronson, 2022; Sharon, 2016; Taylor, 2021; Van Dijck et al., 2019). In health and medicine, for example, large tech corporations have been developing software and wearables for remote clinical studies (Apple, 2015), devices for home medical surveillance (Olsen, 2021), artificial intelligence (AI) systems for diagnostics and disease prediction (Vincent, 2018), digital biomarkers (Verily, 2023), digital proximity tracing for pandemic management (Barber, 2020), electronic health record management (Microsoft, 2022a), health insurance initiatives (Farr, 2020), and funding schemes for biomedical research (Palmer, 2021).

The increased presence of large tech corporations can be noted in other societal domains as well. In education, they are making their mark with e-learning platforms that seek to personalize learning and facilitate remote teaching, and with the expansion of their cloud infrastructure into education, such as Google Classroom (Google, 2014), Apple’s ‘Schoolwork’ (Apple, 2018) and AWS for education (AWS, 2024). In other sectors, large tech corporations have developed their own news and media channels such as Facebook News (Clegg, 2021), have expanded their reach in the transportation sector with autonomous cars and ships (High, 2022), and have become closely involved in public administration (Apple, 2022), agriculture (Grant, 2020), finances (Apple, 2019), science (NSF, 2021), and law (Microsoft, 2022b).

This expansionism is not limited to the traditional ‘Big Five’ (i.e., Alphabet, Apple, Meta, Amazon, and Microsoft). Palantir, for instance, is expanding into health and medicine as well as humanitarian aid and law (Artificial Lawyer, 2022; Greenwood, 2019; Mahdawi, 2022; Martin, 2023; Milmo, 2022), while Huawei has entered the financial and agricultural sectors (Fernandes, 2023; Huawei, 2021). Nor is this expansionism limited to the United States, where most of these companies originate. Increasingly, Big Tech is moving into various societal domains in Europe (Maganza, 2016), South America (Menezes, 2023), Africa (Gerber, 2023), and Asia (Choudhary, 2023). The sectorial and global reach of Big Tech expansionism is unprecedented and has been carefully documented by our research group in the open repository and digital tool Sphere Transgression Watch (Stevens et al., 2022).

Existing frameworks for analyzing Big Tech expansionism

Big Tech expansionism is a growing concern for governments, regulators, scholars and civil society groups (e.g., European Commission, 2022; Moore & Tambini, 2022; Taylor, 2021). Different frameworks have been developed and employed to analyze this phenomenon, among which we can distinguish two dominant ones: (1) privacy and data protection, and (2) market and platform power.1

First, privacy and data protection concerns have figured prominently in discussions about digital innovation and Big Tech. In the scholarly literature, much of the critical work on Big Tech has focused on the importance of privacy as a value, and on the far-ranging impacts of (digital) surveillance and privacy breaches on important concepts like autonomy, democracy, and human rights (e.g., non-discrimination). A host of conceptualizations has ensued, such as contextual privacy (Nissenbaum, 2010), privacy as power (Véliz, 2020), the social value of privacy (Roessler & Mokrosinska, 2013), and privacy as civil inattention (Sharon & Koops, 2021). Concurrently, rules and regulations to protect personal data (e.g., the EU’s General Data Protection Regulation) and privacy-enhancing design solutions (e.g., Verheul & Jacobs, 2017) have been developed to mitigate privacy and data protection harms.

However, Big Tech expansionism often takes place in ways that are compliant with privacy and data protection law (Veale, 2020; López Solano et al., 2022; Sharon and Gellert, this issue). A prime example of this is the privacy-friendly API for digital contact tracing that Google and Apple developed during the COVID-19 pandemic, which allowed these companies to play a role in public health and pandemic management (Sharon, 2021a; Taylor, 2021). Indeed, the activities of companies in new societal domains are not always predicated on a business model, such as data collection for targeted advertising, that handles data in privacy-unfriendly ways. In this context, a focus on privacy compliance may enable rather than hinder expansionism (Sharon, 2021a). Moreover, while privacy is undoubtedly a core value of liberal democracies (Lever, 2015) that must be protected, privacy and data protection concerns offer a relatively narrow framing of the risks raised by Big Tech expansionism. This framing does not allow us to see, for instance, that the involvement of these companies can lead to undue influence in new societal sectors, or that these sectors may be transformed in line with Big Tech’s own values and interests, which may run counter to public values and interests.

A second approach for understanding Big Tech expansionism, which we might call a market or platform power approach, has been to frame this expansionism in terms of the market dominance of Big Tech, and more generally in terms of the commodification and assetization of data that are generated in non-market areas, such as health or education (e.g., Birch, 2020; Birch & Bronson, 2022; Cohen, 2019; Couldry & Meijas, 2019; Prainsack, 2020; Zuboff, 2019). Particularly influential in foregrounding commodification concerns have been studies about infrastructural and platform power (e.g., Bridges, 2021; Gürses & Dobbe, 2020; Poel et al., 2021; Taylor, 2021; Van Dijck et al., 2019). These studies typically center on changes in social life due to a growing reliance on platforms and infrastructures owned and controlled by Big Tech. These studies have exposed the material conditions of infrastructural lock-ins and have highlighted unequal power distributions, changing economic processes, and new dependencies on Big Tech for the provision of public services.

However, while this focus on the political economy of data and the market dominance of Big Tech is valuable for moving beyond a relatively narrow focus on privacy, it can obscure the many effects that digitalization itself – separate from or notwithstanding commodification – has on the practices and values of different societal domains. Scholars have, for instance, shown that the digitalization of patient records has fundamentally changed the organization of healthcare and the work of medical professionals, who now need to collect and process different types of patient data irrespective of concerns about the possible commodification of these data (Berg, 1998; Wallenburg et al., 2019). Hence, while a focus on commodification and assetization can provide an explanation for why Big Tech actors succeed in moving into new sectors due to their market power, it cannot entirely account for Big Tech expansionism, which is also uniquely related to the digital expertise and know-how of tech actors (Sharon, 2021b). Institutional actors and domain experts in societal domains undergoing digitalization are keen to collaborate with tech actors not merely because of the latter's market dominance or capital but also, more specifically, because of their digital expertise (Adomako & Nguyen, 2023).

In addition, while online platforms are undoubtedly an important means by which large tech corporations have concentrated their power, platformization is only one aspect of their expansionism. Indeed, Big Tech has also steadily infiltrated more traditional sectors of the ‘offline’ world. For example, tech corporations in education no longer merely supply edtech infrastructure. They also increasingly have a say in the content of education and in how students learn. They wield influence, for instance, through ‘Google Reference Schools’ certificates conferred on schools recognized by Google for their ‘outstanding’ use of Google edtech, or through training programs for educators about how to make the most of Google tools as they ‘redefine learning through the use of technology’ (Google, 2024). In this context, the focus on platforms appears to be too narrow. A fuller account of the risks of Big Tech expansionism is needed.

Big Tech expansionism as sphere transgressions

To capture more fully what is at stake in Big Tech expansionism, we propose an alternative framework – sphere transgressions – that builds on previous work (Sharon, 2021a, 2021b, 2022; Stevens et al., 2022). This framework of sphere transgressions draws on political philosopher Michael Walzer’s theory of justice and the idea of complex equality developed in his book, Spheres of Justice (1983). According to Walzer, societies consist of different spheres of social life that are distinguished by a main good or cluster of goods internal to those spheres. The goods in question have social meanings, from which criteria of proper distribution can be derived. Thus, each sphere is to be regulated by its own principle of just distribution that is based on the shared meanings of the social goods that are distributed within it. Medical care, for example, is a sphere whose internal good is health and whose appropriate distributive principle is need. But need is not an appropriate distributive principle in the economic sphere, where free exchange acts as the main principle of distribution of goods, or in the sphere of office, where hiring procedures should be merit-based. Conversely, in the sphere of politics, the appropriate principles of distribution are neither need nor merit, but rather equal citizenship and the capacity to persuade.

In Walzer’s theory of complex equality, inequalities are not unjust if they result from a distribution of goods that is based on the appropriate distributive principle within respective spheres. Some people receive more medical care than others because their need is greater; some people will have more money than others as a result of free exchange; some people will have more political power based on their capacity to persuade. Injustice emerges when such inequalities transfer across spheres and when goods from one sphere – money, political power, family relations, and so on – also confer advantages in other spheres. In this way, these goods become what Walzer calls a ‘dominant’ good: a good that allows those ‘who have it, because they have it, [to] command a wide range of other goods’ (1983, p. 10).

According to Walzer, when a dominant good within one sphere allows one to command goods in other spheres, this constitutes a ‘sphere transgression’. Sphere transgressions occur, for example, when money can buy political power or love, or when family relations unduly determine educational opportunities, or when political power grants access to better medical care. According to Walzer, spheres should remain relatively autonomous, and justice is about protecting the relative autonomy of different societal spheres.

Writing in the 1980s, Walzer could not have incorporated novel developments like digitalization and the internet – let alone platforms and Big Tech – into his analysis. Yet, given the prominence and power of digitalization today, we contend that Walzer's sphere ontology should be updated to include a so-called sphere of the digital. The main good of this sphere might be said to be digital products and expertise, while its main principle of distribution is a form of merit. Finally, the main actors commanding this good within the sphere are Big Tech corporations that have accrued an important advantage in the sphere of the digital. This advantage is not unjust per se as long as it aligns with the principle of merit. One could argue that these companies have, at least initially, worked hard to develop and refine digital products and expertise – from hardware and software to cloud infrastructure and data analytics – and that they have become very successful at it.2

In a society that is increasingly undergoing digitalization and in which digital technologies are frequently offered as solutions to a vast range of societal problems, from personnel shortages in health (WHO, 2021) and education (Baker et al., 2019) to reducing the ecological footprint of farming (Dicks et al., 2019), digital products and expertise become a highly coveted – or what Walzer calls a dominant – good. As a result, the (legitimate) advantage that tech corporations have accrued in the sphere of the digital is converted into (illegitimate) advantages in other societal spheres, offering these corporations access to new spheres and increasing their dominance across society. It is from this perspective in particular that we argue that Big Tech expansionism into new societal spheres should be understood as sphere transgressions.

We are not the first scholars to extend Walzer's ideas to information and digital technology. Of note is the early use of Walzer’s work by Van den Hoven (1997)3 and Nissenbaum (2010) to argue that information generated in one sphere (or what Nissenbaum calls ‘context’) should not carry over into other spheres, as this leads to privacy breaches. However, these authors primarily use Walzer’s framework to track the movement of data between different domains and focus on the privacy harms that transgressions can incur. Instead, we argue for the value of this framework for identifying how digital products and expertise are becoming dominant goods in society and how this facilitates transgressions by actors (Big Tech) whose influence was initially contained within their own sphere (‘the digital sphere’), but who have since moved into new societal spheres, thus creating new risks and types of harms.

One might ask, can one really speak of a sphere of the digital in a way that is similar to a sphere of health, education, politics, or the market? Would it not be more accurate to speak of a digital overlay (or underlay) that permeates all of society, cutting across spheres, rather than to identify a separate societal sphere that is characterized by its own goods and principle of distribution? These are pertinent theoretical questions concerning the concept of spheres and the nature of technology (as neutral tool or value-laden agent), which cannot fully be addressed here. Instead, we appeal to the heuristic value of the notion of a separate sphere of the digital, which allows us to apply Walzer’s theory of complex equality to two novel phenomena – digitalization and Big Tech expansionism – and to bring these developments together in a novel and meaningful way. Crucially, our framework allows us to articulate and foreground emerging risks and focal points that become visible when Big Tech expansionism is seen through the lens of sphere transgressions – risks beyond privacy violations or inadequate data protection, and even risks beyond market and platform power concerns. Understanding Big Tech expansionism in terms of sphere transgressions thus sets the stage for a new research agenda, as we describe below.

A new research agenda

First, sphere transgressions are specifically a matter of justice, domination, and (illegitimate) power. If actors use advantages that they have accrued in their original sphere of operation in order to gain a dominant position in other societal spheres, then this means that such newfound power is illegitimate according to the sphere transgressions framework. In the case of Big Tech moving into new societal spheres, the question of legitimacy is paramount: these companies do not necessarily have the domain expertise (medical, legal, journalistic, pedagogical, etc.) that should be at the center of decision-making and service provision in any given sphere. Furthermore, and especially in the case of public sectors, Big Tech companies are not accountable in the ways that public sector actors are (Taylor, 2021). The framework of sphere transgressions offers an explanation as to why Big Tech expansionism is illegitimate that is closely related to the types of goods and regulatory principles of the spheres into which they enter.

Second, the idea that some goods become dominant across societal spheres in the Walzerian sense should draw attention to the conditions under which those goods became dominant in particular social constellations and historical periods. Importantly, digital products and expertise are not an increasingly dominant good today only because Big Tech is actively promoting them; it is also a desideratum of spheres, of the traditional dominant actors in spheres, and of the policymakers who decide how investments in spheres are distributed. Indeed, the search for digital solutions in sectors like health and education has not only been initiated by Big Tech; it is also construed more broadly as a response to pressing societal needs (e.g., Baker et al., 2019; European Commission, 2018), just as the search for digital solutions developed by the private sector is a response to political-economic discourse about inefficient public sector digitalization (Mazzucato, 2018; Collington, 2022). In other words, it takes two to tango (Lanzing et al., 2021) – or, in this case, to produce a transgression. The demand-side of the dynamic warrants as much scrutiny as the supply-side.

Third, understanding Big Tech expansionism in terms of sphere transgressions suggests that not only marketization, but also digitalization, threatens the integrity and autonomy of societal spheres – even if the two forces tend to work together today, as is certainly the case for Big Tech. For Walzer (1983), as for many other social theorists (e.g., Sandel, 2012; Satz, 2010; Anderson, 2008; Radin, 1989), the main threat to equality in liberal societies is the encroachment of the market into other spheres, which threatens to transform or corrupt valued goods. This critique is, in various ways, echoed in critical data scholarship on platform power as discussed above. The sphere transgressions framework suggests that digitalization can also have a transformative and, arguably, corrosive effect on the goods and activities in other societal spheres.

This is because technology is never just an inert instrument utilized by people to achieve a given task, as has long been recognized by STS scholars and philosophers of technology (Akrich, 1992; Latour, 1992; Morrow, 2014; Miller, 2020; Winner, 1980). It is not value-neutral; it is already part of, rather than external to, the normative order. Moreover, technology can also be characterized by a certain logic and set of values, namely of increased efficiency, standardization, and control (e.g., Berg, 1998; Ellul, 1954; Marcuse, 1941). This means that transgressions from the sphere of the digital into other societal spheres can lead to the dominance of Big Tech actors in new spheres and across society, as well as the importation of a foreign logic which, applied to goods such as education, healthcare, agriculture, social welfare, and others, may transform those goods and ultimately the societal spheres themselves. The sphere transgression framework suggests that we can and should think of what digitalization does to other goods and spheres of social life, just as we are accustomed to thinking about, for instance, the ramifications of commodification.

Finally, the sphere transgressions framework entails a shift in thinking about what needs protection, as we transition to increasingly digital societies. Protection is necessary not only for data subjects and their fundamental rights (e.g., as safeguarded by data protection regulation), or consumers and fair market practices (e.g., as protected by competition law). Protection is also needed for societal spheres and their sphere-specific values, goods, forms of expertise, and unique ends.

Overview of special issue

To examine Big Tech expansionism more extensively in relation to sphere transgressions, this special issue unites a multidisciplinary group of scholars who critically engage with the growing influence of Big Tech in one of two general ways.4

The first type of contribution is ‘sphere-based,’ meaning that questions about Big Tech expansionism are explored from within different spheres and areas of expertise (e.g., health, law, education, etc.). Authors raise questions such as: How do the values and expertise imported by Big Tech clash with domain-specific values and expertise? Does this contribute to redefining the nature and aims of specific spheres and thus to reshaping them? What new dependencies are being created?

Kerssens and Van Dijck analyze how platformization and infrastructuralization within the Dutch educational system led to the merging of local and national public sectors into a transnational and global digital market. They focus on the adaptive learning application Bingel to demonstrate how sphere transgressions are conducive to data accumulation across national markets and sectors into transnational and global data infrastructures.

Lanzing uses the concept of sphere transgressions to provide a normative interpretation and critique of Palantir's expansion into the health domain. She argues that critiques based on data protection fail to capture the risks and harms of such expansion, that risks of no public returns, dominance, and new dependencies must be anticipated, and that Palantir's expansion into the sphere of health is a particularly pernicious case.

Wehrens, Wallenburg and Oldenhof reflect on the role of the (qualitative) social scientist embedded in consortia seeking to contribute to technological advancement in healthcare and argue that the sphere transgression framework can generate new challenges and opportunities for researchers to investigate and evaluate such transgressions in situ.

The second type of contribution involves engagement with the sphere transgressions framework itself. Authors explore questions such as: How might we best understand transgressions by Big Tech? How might the concept of sphere transgressions translate to different global regions? Given the risks raised by sphere transgressions, can regulation that is currently being developed (e.g., by the European Commission) adequately capture such risks?

Ortolani engages with the framework by introducing the notion of ‘counter-transgression’ in the case of legal dispute resolution. While transgressions by tech actors into this sphere can certainly be discerned, for example in the form of online dispute resolution platforms, he shows that the sphere of law is simultaneously counter-transgressing into the sphere of digital goods, injecting legal expertise and values into the functioning of tech companies, for example in the form of Facebook’s Oversight Board. For Ortolani, this raises the question of potential positive effects of transgressions between spheres.

Sharon and Gellert argue that when Big Tech expansionism is understood in terms of sphere transgressions, the EU's recent series of proposals to contain such expansionism fails to address three specific risks beyond privacy and data protection risks: non-equitable returns to the public sector; the reshaping of sectors in line with the interests of technology firms; and new dependencies on technology firms for the provision of basic goods.

López Solano and Castañeda focus on the case of IDEMIA. This French security company has developed a privatized infrastructure for identification and authentication services used by the Colombian National Civic Registry. They engage with the concept of sphere transgressions to question how and when certain forms of transgressions are possible between the public and private sectors.

Stevens presents interviews with medical researchers using Apple's ResearchKit in the Netherlands and the United States, which reveal that researchers are not merely passive recipients of sphere transgressions; they respond to Big Tech's initiatives in a variety of ways. Drawing on work by Michel De Certeau (1984), she shows that researchers do not simply welcome or resist Apple's ResearchKit – they also ‘make do’ using various tactics. Stevens argues that thinking in terms of tactics helps to identify needs and interests crucial to researchers and the sphere of medical research.

Oldenhof, Kersing and van Zoonen highlight an institutional void within digital welfare states, where legal, ethical, and quality procedures are insufficient to address current digital challenges. Based on two case studies, they show how sphere transgressions worsen citizen vulnerabilities within this void, proposing strategies to rectify the institutional void and underscoring the need to heed ‘soft signals’ to mitigate future adverse effects.

Finally, Taylor, Martin, de Souza, and López-Solano provide a commentary on Europe's experiences with the technology sector during the COVID-19 pandemic. Inspired by the sphere transgressions framework, they develop the notion of sector transgressions in their Global Data Justice project's report and highlight important implications of the rapid expansion of commercial technological power for governance across several sectors.

Acknowledgements

We would like to thank all the participants of the Sphere Transgressions Workshops in 2022 and the other members of the Digital Good team, Lotje Siffels, Andrew S. Hoffman and Marjolein Lanzing, for insightful discussions on Big Tech expansionism and the sphere transgressions framework.

Biographies

Marthe Stevens is Assistant Professor at the Interdisciplinary Hub on Digitalization and Society (iHub) and affiliated with the Department of Ethics and Political Philosophy at Radboud University (the Netherlands). Marthe studies the ethical and societal impacts of new technological innovations, mainly in education and healthcare. She specializes in embedded ethics and seeks to integrate ethical thinking into innovation trajectories using insights from the Philosophy of Technology, Science and Technology Studies and Critical Data Studies.

Steven R. Kraaijeveld is a lecturer and researcher at Amsterdam University Medical Centers and an associate fellow at the Research Consortium on the Ethics of Socially Disruptive Technologies. His current research focuses on ethical and philosophical questions surrounding public health and technology.

Tamar Sharon is Professor of Philosophy, Digitalization and Society, Chair of the Department of Ethics and Political Philosophy and Co-Director of the Interdisciplinary Hub for Digitalization and Society (iHub) at Radboud University, Nijmegen. Her research explores how the increasing digitalization of society destabilizes public values and norms, and how best to protect them. She is a member of the European Commission’s European Group on Ethics in Science and New Technologies.

Correction Statement

This article has been corrected with minor changes. These changes do not impact the academic content of the article.

Funding Statement

This work was supported by the European Research Council [grant number: 804985].

Notes

1. Our aim is not to offer a systematic review of previous research but to provide an overview of some of the key approaches to Big Tech expansionism in the literature, which helps contextualize those adopted in this special issue.

2. This is not to say that the success of tech corporations is or has always been only merit-based. It has of course also been the result of opaque, manipulative, and unlawful relationships with customers (especially maleficent data practices), as well as dubious or anti-competitive market practices, such as subsidizing (initially) unprofitable business models or buying up smaller companies to eliminate competition (e.g., Reich et al., 2021). The point is that the success of Big Tech in the sphere of the digital can be seen – and is seen by many – as primarily merit-based, and thus in accordance with the sphere’s appropriate principle of distribution.

3. See, also, Nagenborg (2009).

4. Nearly all authors participated in workshops surrounding the launch of the Sphere Transgression Watch digital tool, organized at Radboud University, the Netherlands, in June 2022 and November 2022.

Disclosure statement

No potential conflict of interest was reported by the author(s).

References

  1. Adomako, S., & Nguyen, N. P. (2023). Digitalization, inter-organizational collaboration, and technology transfer. The Journal of Technology Transfer, 1–27. 10.1007/s10961-023-10031-z [DOI] [Google Scholar]
  2. Akrich, M. (1992). The De-scription of technical objects. In Bijker W., & Law J. (Eds.), Shaping technology/building society (pp. 205–244). MIT Press. [Google Scholar]
  3. Anderson, E. (2008). The ethical limitations of the market. Economics & Philosophy, 6(2), 179–205. 10.1017/S0266267100001218 [DOI] [Google Scholar]
  4. Apple . (2015, March 9). Apple introduces ResearchKit, giving medical researchers the tools to revolutionize medical studies. Apple Newsroom. https://www.apple.com/newsroom/2015/03/09Apple-Introduces-ResearchKit-Giving-Medical-Researchers-the-Tools-to-Revolutionize-Medical-Studies/.
  5. Apple . (2018, June 26). Apple’s free Schoolwork app now available for teachers. Apple Newsroom. https://www.apple.com/newsroom/2018/06/apples-free-schoolwork-app-now-available-for-teachers/.
  6. Apple . (2019, March 25). Introducing Apple Card, a new kind of credit card created by Apple. Apple Newsroom. https://www.apple.com/newsroom/2019/03/introducing-apple-card-a-new-kind-of-credit-card-created-by-apple/.
  7. Apple . (2022, March 23). Apple launches the first driver’s license and state ID in Wallet with Arizona. Apple Newsroom. https://www.apple.com/newsroom/2022/03/apple-launches-the-first-drivers-license-and-state-id-in-wallet-with-arizona/.
  8. Artificial Lawyer . (2022, April 29). With Palantir Foundry Hence Aims to Improve Legal Buying. https://www.artificiallawyer.com/2022/04/29/with-palantir-foundry-hence-aims-to-improve-legal-buying/.
  9. AWS. (2024). AWS cloud computing for education. AWS. https://aws.amazon.com/education/?wwps-cards.sort-by=item.additionalFields.sortDate&wwps-cards.sort-order=desc.
  10. Baker, T., Smith, L., & Anissa, N. (2019). Educ-AI-tion rebooted? Exploring the future of artificial intelligence in schools and colleges. Nesta. https://www.nesta.org.uk/report/education-rebooted/.
  11. Barber, G. (2020, September 1). Google and Apple change tactics on contact tracing tech. Wired. https://www.wired.com/story/google-apple-change-tactics-contact-tracing-tech/.
  12. Berg, M. (1998). Medical work and the computer-based patient record: A sociological perspective. Methods of Information in Medicine, 37(3), 294–301. 10.1055/s-0038-1634535 [DOI] [PubMed] [Google Scholar]
  13. Birch, K. (2020). Automated neoliberalism? The digital organisation of markets in technoscientific capitalism. New Formations, 100, 10–27. 10.3898/NewF:100-101.02.2020 [DOI] [Google Scholar]
  14. Birch, K., & Bronson, K. (2022). Big tech. Science as Culture, 31(1), 1–14. 10.1080/09505431.2022.2036118 [DOI] [Google Scholar]
  15. Bridges, L. (2021). Infrastructural obfuscation: Unpacking the carceral logics of the Ring surveillant assemblage. Information, Communication & Society, 24(6), 830–849. 10.1080/1369118X.2021.1909097 [DOI] [Google Scholar]
  16. Choudhary, L. (2023, August 24). Google.org invests $1 m to train NGOs in Asia on AI, cybersecurity. Tech in Asia. https://www.techinasia.com/google-org-invests-us1m-to-train-ngos-in-asia-on-ai-cybersecurity.
  17. Clegg, N. (2021, January 26). Facebook News will help sustain quality journalism. Facebook Newsroom.
  18. Cohen, J. (2019). Between truth and power: The legal constructions of information capitalism. Oxford University Press. [Google Scholar]
  19. Collington, R. (2022). Disrupting the welfare state? Digitalisation and the retrenchment of public sector capacity. New Political Economy, 27(2), 312–328. 10.1080/13563467.2021.1952559 [DOI] [Google Scholar]
  20. Couldry, N., & Mejias, U. (2019). The costs of connection: How data is colonizing human life and appropriating it for capitalism. Stanford University Press. [Google Scholar]
  21. De Certeau, M. (1984). The practice of everyday life (S. Rendall, Trans.). University of California Press. [Google Scholar]
  22. Dicks, L. V., Rose, D. C., Ang, F., Aston, S., Birch, A., Boatman, N., Bowles, E., Chadwick, D., Dinsdale, A., Durham, S., Elliot, J., Firbank, L., Humphreys, S., Jarvis, P., Jones, D., Kindred, D., Knight, S., Lee, M., Leifert, C., … Sutherland, W. (2019). What agricultural practices are most likely to deliver “sustainable intensification” in the UK? Food and Energy Security, 8(1), e00148. 10.1002/fes3.148 [DOI] [Google Scholar]
  23. Ellul, J. (1954). The technological society. In Perspectives on the computer revolution (pp. 415–429). [Google Scholar]
  24. European Commission . (2018). Communication on enabling the digital transformation of health and care in the Digital Single Market; empowering citizens and building a healthier society . European Commission. https://digital-strategy.ec.europa.eu/en/library/communication-enabling-digital-transformation-health-and-care-digital-single-market-empowering [Google Scholar]
  25. European Commission . (2022). Proposal for a regulation of the European parliament and of The council on contestable and fair markets in the digital sector (digital markets act) . European Commission. https://eur-lex.europa.eu/legal-content/en/TXT/?uri=COM%3A2020%3A825%3AFIN [Google Scholar]
  26. Farr, C. (2020, August 25). Alphabet’s Verily enters stop loss insurance market. CNBC. https://www.cnbc.com/2020/08/25/alphabet-verily-enters-stop-loss-insurance-market.html.
  27. Fernandes, D. (2023, September 8). Huawei mobile services & mobily Pay to enhance digital payments in Saudi arabia. IBS Intelligence. https://ibsintelligence.com/ibsi-news/huawei-mobile-services-mobily-pay-to-enhance-digital-payments-in-saudi-arabia/.
  28. Gerber, J. (2023, July 5). Meta, TikTok, Google agree to help IEC combat election disinformation. News 24. https://www.news24.com/news24/politics/government/meta-tiktok-google-agree-to-help-iec-combat-election-disinformation-20230705.
  29. Google . (2014, May 6). Previewing a new Classroom. Google blog. https://blog.google/outreach-initiatives/education/previewing-new-classroom/.
  30. Google . (2024). Google for Education. Google. https://edudirectory.withgoogle.com.
  31. Grant, E. (2020, October 12). Mineral: Bringing the era of computational agriculture to life. X Blog. https://blog.x.company/mineral-bringing-the-era-of-computational-agriculture-to-life-427bca6bd56a.
  32. Greenwood, F. (2019, February 13). Why Humanitarians are Worried About Palantir’s New Partnership With the U.N. Slate. https://slate.com/technology/2019/02/palantir-un-world-food-programme-data-humanitarians.html.
  33. Gürses, S., & Dobbe, R. (2020, February 18). Programmable infrastructures. TUDelft. https://www.tudelft.nl/tbm/programmable-infrastructures.
  34. High, R. (2022, June 6). The mayflower autonomous ship Has reached North America: Why this pioneering transatlantic voyage matters for the advancement of AI and automation technology across every industry. IBM Newsroom. https://newsroom.ibm.com/The-Mayflower-Autonomous-Ship-Has-Reached-North-America.
  35. Huawei . (2021, March 30). Huawei works with partners to facilitate smart agriculture in Switzerland. Huawei News & Events. https://www.huawei.com/en/news/2021/3/5g-sunrise-upc-huawei.
  36. Lanzing, M., Lievevrouw, E., & Siffels, L. (2021). It takes two to techno-tango: An analysis of a close embrace between google/apple and the EU in fighting the pandemic through contact tracing apps. Science as Culture, 31(1), 136–148. 10.1080/09505431.2021.1999403 [DOI] [Google Scholar]
  37. Latour, B. (1992). Where are the missing masses? The sociology of a few mundane artifacts. In Bijker W., & Law J. (Eds.), Shaping technology/building society: Studies in sociotechnical change (pp. 225–258). MIT Press. [Google Scholar]
  38. Lever, A. (2015). Privacy, democracy, and freedom of expression. In Roessler B., & Mokrosinska D. (Eds.), Social dimensions of privacy: Interdisciplinary perspectives. Cambridge University Press. [Google Scholar]
  39. López Solano, J., Martin, A., Ohai, F., de Souza, S. P., & Taylor, L. (2022). Digital disruption or crisis capitalism? Technology, power and the pandemic. Global Data Justice, 10.26116/gdj-euaifund [DOI] [Google Scholar]
  40. Maganza, F. (2016, April 13). Supporting digital inclusion for 1M French citizens. Google Company News. https://blog.google/around-the-globe/google-europe/supporting-digital-inclusion-for-1/.
  41. Mahdawi, A. (2022, June 14). Palantir, the all-seeing US tech company, could soon have the data of millions of NHS patients. My response? Yikes! The Guardian. https://www.theguardian.com/commentisfree/2022/jun/14/palantir-the-all-seeing-us-tech-company-could-soon-have-the-data-of-millions-of-nhs-patients-my-response-yikes.
  42. Marcuse, H. (1941). Some social implications of modern technology. Studies in Philosophy and Social Science, 9(3), 414–439. [Google Scholar]
  43. Martin, A. (2023). Aidwashing surveillance: Critiquing the corporate exploitation of humanitarian crises. Surveillance & Society, 21(1), 96–102. 10.24908/ss.v21i1.16266 [DOI] [Google Scholar]
  44. Mazzucato, M. (2018). The value of everything. Penguin Books. [Google Scholar]
  45. Menezes, F. Z. (2023). Tech roundup: Digital addresses reaching more Brazilian favelas. The Brazilian Report. https://brazilian.report/tech/2023/09/25/google-expands-digital-address-project/.
  46. Microsoft . (2022a, March 15). Microsoft expands healthcare cloud strategy with new solutions and capabilities across data, AI and clinician experiences. Microsoft News Center.
  47. Microsoft . (2022b, January 27). Microsoft and Wolters Kluwer Legal & Regulatory partner to explore AI-driven legal workflows. Microsoft Blogs. https://blogs.microsoft.com/ai-for-business/ai-powered-legal-workflows/.
  48. Miller, B. (2020). Is technology value-neutral? Science, Technology, & Human Values, 46(1), 53–80. 10.1177/0162243919900965 [DOI] [Google Scholar]
  49. Milmo, D. (2022, June 21). Palantir: Trump-backer’s data firm that wants a big NHS deal. The Guardian. https://www.theguardian.com/society/2022/jun/21/palantir-concerns-over-data-firm-poised-to-be-operating-system-of-nhs.
  50. Moore, M., & Tambini, D. (2022). Regulating big tech: Policy responses to digital dominance. Oxford University Press. [Google Scholar]
  51. Morrow, D. R. (2014). When technologies makes good people do bad things: Another argument against the value-neutrality of technologies. Science and Engineering Ethics, 20(2), 329–343. 10.1007/s11948-013-9464-1 [DOI] [PubMed] [Google Scholar]
  52. Nagenborg, M. (2009). Designing spheres of informational justice. Ethics and Information Technology, 11(3), 175–179. 10.1007/s10676-009-9200-3 [DOI] [Google Scholar]
  53. Nissenbaum, H. (2010). Privacy in context: Technology, policy, and the integrity of social life. Stanford University Press. [Google Scholar]
  54. NSF . (2021). NSF program on fairness in artificial intelligence in collaboration with Amazon. National Science Foundation. https://new.nsf.gov/funding/opportunities/nsf-program-fairness-artificial-intelligence [Google Scholar]
  55. Olsen, E. (2021, December 7). Amazon partners for fall-detection tech on newly released Alexa Together service. Mobile Health News. https://www.mobihealthnews.com/news/amazon-partners-fall-detection-tech-newly-released-alexa-together-service.
  56. Palmer, K. (2021, December 7). ‘A multi-decade journey’: Chan Zuckerberg Initiative invests another $3.4 billion to advance scientific tools. STAT. https://www.statnews.com/2021/12/07/chan-zuckerberg-biohub-initiative-health/?utm_source=STATNewsletters&utm_campaign=59fc746afa-health_tech_COPY_01&utm_medium=email&utm_term=0_8cab1d7961-59fc746afa-153416126.
  57. Poell, T., Nieborg, D., & Duffy, B. E. (2021). Platforms and cultural production. Wiley. [Google Scholar]
  58. Prainsack, B. (2020). The political economy of digital data: Introduction to the special issue. Policy Studies, 41(5), 439–446. 10.1080/01442872.2020.1723519 [DOI] [Google Scholar]
  59. Radin, M. J. (1989). Justice and the market domain. American Society for Political and Legal Philosophy, 31, 165–197. [Google Scholar]
  60. Reich, R., Sahami, M., & Weinstein, J. M. (2021). System error: Where big tech went wrong and how we can reboot. HarperCollins Publishers. [Google Scholar]
  61. Roessler, B., & Mokrosinska, D. (2013). Privacy and social interaction. Philosophy & Social Criticism, 39(8), 771–791. 10.1177/0191453713494968 [DOI] [Google Scholar]
  62. Sandel, M. (2012). What money can't buy: The moral limits of markets. Farrar, Straus and Giroux. [Google Scholar]
  63. Satz, D. (2010). Why some things should not be for sale: The moral limits of markets. Oxford University Press. [Google Scholar]
  64. Sharon, T. (2016). The Googlization of health research: From disruptive innovation to disruptive ethics. Personalized Medicine, 13(6), 10.2217/pme-2016-0057 [DOI] [PubMed] [Google Scholar]
  65. Sharon, T. (2021a). Blind-sided by privacy? Digital contact tracing, the Apple/Google API and big tech's newfound role as global health policy makers. Ethics and Information Technology, 23(Suppl 1), 45–57. 10.1007/s10676-020-09547-x [DOI] [PMC free article] [PubMed] [Google Scholar]
  66. Sharon, T. (2021b). From hostile worlds to multiple spheres: Towards a normative pragmatics of justice for the googlization of health. Medicine, Health Care and Philosophy, 315–327. 10.1007/s11019-021-10006-7 [DOI] [PMC free article] [PubMed] [Google Scholar]
  67. Sharon, T. (2022, February 2). Beyond privacy: there are wider issues at stake over Big Tech in medicine. Open Democracy. https://www.opendemocracy.net/en/technology-and-democracy/beyond-privacy-there-are-wider-issues-at-stake-over-big-tech-in-medicine/.
  68. Sharon, T., & Koops, B. J. (2021). The ethics of inattention: Revitalising civil inattention as a privacy-protecting mechanism in public spaces. Ethics and Information Technology, 23(3), 331–343. 10.1007/s10676-020-09575-7 [DOI] [PMC free article] [PubMed] [Google Scholar]
  69. Stevens, M., Sharon, T., van Gastel, B., Hoffman, A. S., Kraaijeveld, S. R., & Siffels, L. (2022). Sphere Transgression Watch. Distributed by iHub, http://www.sphere-transgression-watch.org.
  70. Taylor, L. (2021). Public actors without public values: Legitimacy, domination and the regulation of the technology sector. Philosophy & Technology, 34(4), 897–922. 10.1007/s13347-020-00441-4 [DOI] [PMC free article] [PubMed] [Google Scholar]
  71. Van den Hoven, M. J. (1997). Privacy and the varieties of moral wrong-doing in an information age. ACM SIGCAS Computers and Society, 27(3), 33–37. 10.1145/270858.270868 [DOI] [Google Scholar]
  72. Van Dijck, J., Poell, T., & de Waal, M. (2019). The platform society: Public values in a connective world. Oxford University Press. [Google Scholar]
  73. Veale, M. (2020, July 1). Privacy is not the problem with the Apple-Google contact-tracing toolkit. The Guardian. https://www.theguardian.com/commentisfree/2020/jul/01/apple-google-contact-tracing-app-tech-giant-digital-rights.
  74. Véliz, C. (2020). Privacy is power: Why and how you should take back control of your data. Bantam Press. [Google Scholar]
  75. Verheul, E., & Jacobs, B. (2017). Polymorphic encryption and pseudonymisation in identity management and medical research. NAW, 5(18), 168–172. [Google Scholar]
  76. Verily . (2023, October 2). How digital biomarkers could transform evidence generation. Verily. https://verily.com/perspectives/innovating-healthcare-digital-biomarkers.
  77. Vincent, J. (2018, August 13). DeepMind’s AI can detect over 50 eye diseases as accurately as a doctor. The Verge. https://www.theverge.com/2018/8/13/17670156/deepmind-ai-eye-disease-doctor-moorfields.
  78. Wallenburg, I., & Bal, R. (2019). The gaming healthcare practitioner: How practices of datafication and gamification reconfigure care. Health Informatics Journal, 25(3), 549–557. 10.1177/1460458218796608 [DOI] [PMC free article] [PubMed] [Google Scholar]
  79. Walzer, M. (1983). Spheres of justice: A defense of pluralism and equality. Basic Books. [Google Scholar]
  80. WHO . (2021). Global strategy on digital health 2020-2025 . World Health Organization. https://iris.who.int/bitstream/handle/10665/344249/9789240020924-eng.pdf [Google Scholar]
  81. Winner, L. (1980). Do artifacts have politics? Daedalus, 109(1), 121–136. [Google Scholar]
  82. Zuboff, S. (2019). The age of surveillance capitalism: The fight for a human future at the new frontier of power. Public Affairs. [Google Scholar]

Articles from Information, Communication and Society are provided here courtesy of Taylor & Francis