Antipode. 2019 Oct 22;52(1):270–290. doi: 10.1111/anti.12583

A Crisis of Opportunity: Market‐Making, Big Data, and the Consolidation of Migration as Risk

Linnet Taylor, Fran Meissner
PMCID: PMC7006806  PMID: 32063659

Abstract

Crisis narratives surrounding Europe’s 2015 migration influx fuelled demands for new ways of tracking, mapping and predicting human mobility. We explore how market opportunities for technology firms and data analytics start‐ups created by the EU’s Global Approach to Migration led to solutionistic approaches to compiling and analysing migration statistics. We show that initiatives such as the rebranding of existing platforms and services as migration prediction systems are consolidating policy conceptualisations of migration as risk. Despite the promise of greater granularity, this “big data approach” cannot offer greater certainty about who is on the move and why. Instead such approaches are ill‐suited to understanding the complex dynamics of migration and to offering protection to vulnerable people. The marketisation of migration statistics through big data offers a key case for advancing progressive approaches to both migration statistics and global data justice.

Keywords: big data, migration statistics, refugee crisis, solutionism, data governance, autonomy of migration


It would be naive to think that there is a firewall between commercial surveillance and government surveillance. There is not. (Sue Halpern [2016], in The New York Review of Books, on the [mis]direction of public attention after the Snowden revelations)

He told us that the files must be carefully taken care of, that they were not files, they were human lives, worthy of respect. (Roberto Kozak’s description of files of missing people under the Pinochet regime [quoted in MacAskill and Franklin 2016])

Introduction

From the first tabulating machine used to process census data in the 1890s (Hollerith 1894) to their current digitisation, population records have enhanced state powers both to include and to exclude (Broeders 2009). From the 2000s onward the increasingly global adoption of technologies including the internet, mobile phones and satellite sensing has created new layers of data with which people on the move can be counted and sorted. Moreover, data analytic techniques which emerged in the private sector over the same period make it possible to combine and analyse large, unstructured datasets in new ways. This includes what is termed “nowcasting”, or the digital analysis of dynamic phenomena in, or close to, real time. Different sources such as satellite data, mobile phone location records, video content from the web and geotagged social media posts can now be linked and analysed using analytics services such as Palantir’s Gotham (Lev‐Ram 2016) or Microsoft’s Azure (Microsoft 2017). The assumption behind policy demands for more and better data is that if human mobility can be made more predictable, it also becomes more controllable (IOM and McKinsey & Company 2018). In this paper we contest this presumed link by exploring how the increasing marketisation of migration statistics and the uncritical adoption of big data analytics in this field are diverting attention from pressing political questions about how to govern both data and migration.

We are witnessing a datafication of migration statistics where our “perception of reality … becomes more technologically and statistically mediated” (Broeders and Dijstelbloem 2016:242) as large‐scale machine‐readable, “born‐digital” data are used to reflect patterns of human mobility (Taylor 2016a). In a new twist, calls have been made at both the EU and the UN level for the use of big data analytics to track migration flows, explicitly for policy purposes (Rango and Vespe 2017; UN Global Pulse 2017), perhaps most notably the call for “big data for migration” (BD4M) (IOM and European Commission 2017). Applying such analytics has serious implications for the way migration is observed and the kinds of policy interventions that can be justified.

For‐profit actors (entrepreneurs, start‐ups and large tech firms) increasingly shape the production of migration statistics. By consolidating a conceptualisation of migration as risk, this subcontracting of the process of defining migrants’ opportunities and futures distances the state from its responsibility to protect those within its borders, and distances migrants from opportunities to demand justice.

Methodologically, the entry of data analytics firms into this policy space creates space for solutionism (Morozov 2013), where developers first create technological systems and then look for problems to solve. Moreover, as we show, the application of big data analytics to migration statistics involves particular formalisations and proxies for making sense of migration—most importantly “data doubles” (Haggerty and Ericson 2000), or aggregations of people’s digital traces that then come to represent those people for purposes of policy intervention. We will argue that this process of producing new migration statistics through migrant data doubles is context‐ and theory‐free, that it lacks domain knowledge and that it relies heavily on biased preconceptions that are informed by, and perpetuate, long‐standing power hierarchies. The emergent combination of actors and technological procedures constitutes an entrenched socio‐technical discourse that cements the conceptualisation of migration as risk, with knock‐on effects for contextual and situational configurations of diverse migrant flows. This process positions the destination or transit state, rather than the migrant, as the recipient of risk. We will explore how this reconfigured risk becomes a justification for a state of emergency (Pasquale 2013) which creates a market for technical solutions, and how it leads to an ontological claim that “the migrant” has an identity that can be known, described and used to “render uncertain futures knowable and actionable” (Scheel 2013:593), ignoring subjectivity and autonomy (Scheel 2013).

The process we chart—the empowerment of commercial actors to construct migrants through statistics—does not only alter how migration statistics are being created. It matters for thinking about radical politics in times of big data. On the policy level, while businesses (and particularly technology firms) have usually pushed for more liberal immigration policies, we demonstrate in this paper how the desire to compete in a new market for migration statistics is causing big tech firms to invent ways to make hyper‐restrictive policies more effective, consolidating what Landau (2019) has referred to as a “defensive assemblage of coercive controls”. The marketisation of migration statistics is also a powerful statement about migrant autonomy: it is predicated on the notion that migrants’ choices can be predicted, constrained and channelled by technologically produced knowledge, again in support of a restrictive policy vision. It is aligned with the notion that any migration generates a state of exception (Agamben 2005) which justifies treating migrants as separate from the rest of society—a political disempowerment and separation that is supported by technical and statistical means. Our empirical analysis explores these political implications and asks how we can challenge the technology‐ and market‐driven normalisation of this state of exception.

The paper offers a critical appraisal of new approaches to digitally categorising and surveilling people on the move. Its main contribution is to bring together discussions of datafication, politics and power from the field of Critical Data Studies (Burns et al. 2017) with literature on human mobility as risk. We thereby offer a critical analysis of how migration is being re‐problematised at the interface of big data, European crisis management and anti‐migration politics. We offer three main insights: first, we link the current accelerated uptake of big‐data migration statistics to processes of turning emergency or crisis narratives into market opportunities. Second, with novel empirical examples, we show how this process has allowed a solutionistic approach that formalises migration as risk and treats the resulting statistics as a commercial product. Third, by drawing on parallel challenges identified in what has been termed superdiversity we explore the tension between claims of granularity and precision in big data analytics and actually existing social complexity. We thus highlight the parallel tension between creating statistics and making responsible, contextually sensitive policy decisions.

The empirical basis for this paper comes from two consecutive research projects investigating the use of big‐data analytics in the migration and humanitarian fields. These projects involved fieldwork, observation and interviews conducted over the period 2012–2019, comprising around 200 formal and informal interviews conducted at events relevant to the “responsible data” movement, during participation in advisory groups and public discussions on data analytics in these fields, and scans of the media over the same period. One specific case of a collaboration between the Dutch statistical authorities and a data analytics consultancy—chosen because it is an early example of the intersecting commercial markets for data, analytics and policy recommendations created by EU policy—is explored in more depth using interviews.

Background: A Crisis of Many Kinds

As Hannah Arendt (1951:286) said of the 1930s refugee wave across Europe: “After 1935 … so‐called ‘economic immigrants’ and other groups … mixed with the waves of refugees into a tangle that never again could be unravelled”. As Sigona (2018) notes, the increased global migration flows towards Europe which peaked in 2015 similarly challenged authorities’ ability to define and differentiate people on the move. While these increased migration flows have repeatedly been portrayed in terms of a “migration crisis”, this narrative ignores the intricate and manifold processes that led up to and followed changes in migration patterns (New Keywords Collective 2016; Sigona 2018). A “spectacle of numbers” has been noted to have played a particularly relevant role in portraying a situation that had got out of hand and required harsh measures to (re‐)gain control by managing migration (New Keywords Collective 2016; Sigona 2015). It is thus not surprising that in concert with these developments we also saw an accelerated uptake of, and calls for the use of, new data sources, often explicitly with the aim of sorting, ordering and controlling migration towards Europe.

To situate the emerging involvement of technology firms in answering the policy priorities of controlling migration and contributing to the pressures of classification, it is necessary to clearly delineate, from a political and institutional perspective, how developments in 2015 came to be framed in crisis and disaster terms. First, the increased migration flows were experienced as a political threat by some European states, particularly those with eastern or southern borders. These states saw the influx as a foreign relations crisis driven by “coercive engineered migration” (Greenhill 2016), a phenomenon where aggressive states “directly create, or threaten to create, cross‐border population movements” in order to influence other states. The changed movement patterns were simultaneously framed as a resource crisis by receiving states such as Greece and Italy but also Germany, Sweden, Hungary and Austria (Eurostat 2016). In these states, many amongst the native population resisted the idea of taking in large numbers of refugees at a time when austerity policies were still in force to deal with the aftermath of the recession. While states such as Germany and Sweden initially supported settlement and framed their efforts in terms of fostering a “refugee dividend”, the debate turned; as Hansen (2018:134) notes, hardliners increasingly argued that “the growth accrued [as a result of refugee settlement] is obviously the wrong, ‘artificial’ type of growth. By the same token, and in line with Brussels’ policy, refugees are now seen by many in both Sweden and Germany as the wrong type of migrants, not easily matched with labour markets”. Meanwhile, humanitarian and migration management organisations such as UNHCR, the Red Cross and the IOM predominantly identified this period as a logistical and funding crisis. These crises bore little relation to the more visceral one experienced by the migrants themselves, who often faced treacherous conditions in seeking a legitimate place of safety.

These different perceptions of crisis collided in the public sphere with shocking images of migrants’ suffering, adding to the pressure on EU policymakers. Actors in the technology sector offered, in place of this complex nexus of problems, a simpler alternative—a crisis of collecting and ordering information (Polach 2015)—the simple answer to which would be to improve predictive capabilities, define migrant trajectories, and exert control long before individuals arrived in Europe (Rango and Vespe 2017).

A big data analysis of migration flows involves a computational formalisation of central notions. What characterises a migrant, what forms of mobility are of interest to the modelling process, and what constitutes risk must all be formalised through proxies using available variables in the datasets chosen. Unlike in government statistics, where metadata are required to explain how concepts such as “migrant” are formalised, the definition of “a migrant” for the private sector is flexible and policy‐dependent. Any analysis performed by commercial firms will, therefore, use a proxy related to the risk category that appears to be most profitable (i.e. whichever type of risk maps best onto that of the policymakers who are the perceived market for the company’s analysis). This aligns with Andersson’s (2014:79) analysis of the EU border authorities’ vision of risk from migration as “not just the anticipation of danger … [but also] the source of potential profits … bundled into pockets, routes, flows, and vulnerabilities and assigned to police forces and external investors”. For these “external investors”, “a migrant” may become formalised as “someone who tweets about a particular place”, “someone who is observed moving from place X to place Y, in a group, during a given period”, or any number of other possible proxies. This process aligns with Haggerty and Ericson’s (2000) notion of the creation of a “data double”, a digital representation of a person for the purposes of action and intervention. This doubling of migrants into digital objects for analysis can also be seen as part of a dynamic where the analyst’s idea of “the migrant” must be mutually constituted with states’ ideas of “risk” and “orderly migration” to allow for a sustained for‐profit production of migration statistics. It is a process that heavily depends on the employed logics of making sense of migration and on how people on the move ought to be classified.
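To make this notion of proxy‐based formalisation concrete, the minimal sketch below shows how an analyst might collapse a person’s digital traces into a “migrant” data double. Everything in it is a hypothetical illustration devised for this paper: the field names, keyword list and thresholds are invented, and the sketch does not reproduce the method of any vendor or agency discussed here.

from dataclasses import dataclass

# Hypothetical records standing in for geotagged social media posts.
# All field names and values are invented for illustration only.
@dataclass
class Post:
    user_id: str
    text: str
    regions_posted_from: int  # distinct coarse locations attached to the account's posts

# An arbitrary keyword list used as a proxy for "intent to migrate":
# the choice of words already encodes a theory of who counts as a migrant.
MIGRATION_KEYWORDS = {"border", "crossing", "smuggler", "asylum", "route"}

def as_data_double(posts: list[Post]) -> dict:
    """Collapse one person's posts into a handful of proxy scores that then
    stand in for that person in any downstream risk model."""
    text = " ".join(p.text.lower() for p in posts)
    intent_proxy = sum(word in text for word in MIGRATION_KEYWORDS)
    mobility_proxy = max(p.regions_posted_from for p in posts)
    return {
        "user_id": posts[0].user_id,
        "intent_proxy": intent_proxy,      # words standing in for intent
        "mobility_proxy": mobility_proxy,  # movement between posting locations standing in for migration
        "flagged_as_risk": intent_proxy >= 2 and mobility_proxy >= 2,  # arbitrary cut-offs
    }

if __name__ == "__main__":
    posts = [
        Post("u1", "Looking for a safe route to the border", 3),
        Post("u1", "Asking about asylum procedures in Europe", 3),
    ]
    print(as_data_double(posts))

The point of the sketch is not its output but its design choices: once the keywords and cut‐offs are fixed, the resulting data double can circulate as a tradeable object while the assumptions baked into it become invisible and difficult to contest.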

New data layers drawn upon in this process alter migration statistics by defining people in transit as migrants and allowing researchers to identify and visualise movements with much greater accuracy, sometimes in real time (Taylor 2016a). This goes against a traditional perspective in which people were defined as migrants only once they crossed an international border (IOM 2018), and it adds to an increased blurring of who is a migrant and thus should be subject to migration control. The externalisation of the EU’s borders (Andersson 2014), combined with a “migration management” approach (Ghosh 2007) which foregrounds sorting and categorising migrants into more and less desirable types which can then be identified and directed using digital bordering technologies, lends itself to outsourcing to technology experts. In particular, it lends itself to a vision of data as solution, where any problem can be solved by more, and more granular, data. As a result, the predictions and recommendations such an approach produces can be viewed as unreliable. Nevertheless, these data practices are being used to differentiate people, their propensities to move, and desirable and undesirable migrants (Hansen 2018). The effect of this is that a political suspicion of migrants is being hardwired into a datafied migration control that reinforces the “ideological and material architectures of illegality” (Nelson 2019).

Big data analytics, applied to migration, tends to produce modes of mapping that are colonial in the way they proxy, sort and categorise. They thus perpetuate unequal and exclusive power structures, and at the same time they become harder to criticise when data analytics promise the ability to predict and optimise migration and its implications. This is the link we set out to question. Why becomes clearer when we highlight that the sorting mechanisms debated in this paper are complexly entangled with, and closely linked to, migration‐driven re‐configurations of diversity (Hall 2017; Meissner 2018). Those reconfigurations have been debated in terms of superdiversity. While superdiversity is a malleable notion used differently by different researchers, in its analytically more potent variants the notion holds that the implications of migration cannot be reduced to simple accounts and that migration‐driven diversity is certainly not amenable to simple (policy) solutions (Vertovec 2017) such as those promised by for‐profit tech companies.

Importantly, while processes of differentiation used to be aimed at localised migration‐driven diversities, newer data sources shift the governance of migration, and with it a diversification through the classifying of people as various types of migrants, outside the territory of destination states. Marketisation not only of data but of the power to define is changing the debate on why we need migration statistics in times of accelerated migration. This resembles the problem addressed by critiques of “disaster capitalism” (Duffield 2016; Klein et al. 2008) and the “Schmayekian emergency” (Mirowski 2013; Pasquale 2013), namely a crisis which simultaneously creates opportunities for profit for already‐powerful actors in a particular sector. The term combines the 1930s political theorist Schmitt’s contention that the ability to define an emergency is the true test of state power (Schmitt 2015) with the philosophy of free‐market economist Hayek, whose work is foundational to neoliberal thought. Hayek (2001) famously contended that a free market was the only way to preserve the conditions for a free society. Just as the US conglomerate Halliburton and various military contractors found rich opportunities in the chaos that followed the Iraq war of 2003, data analytics firms identified huge commercial opportunities in the combination of the migration influx of 2015 and the availability of new data sources.

To compete as vendors of migration statistics, data analytics firms promised policymakers large‐scale yet granular analytics that could provide directions for existing policy priorities based on risk management and control. This could be achieved by breaking down the complexities of observed migration flows by analysing them through commercially available digital traces and reports. This approach assumes that migration flows are an exception that contradicts and threatens a presumed normal situation where the risk of unregulated migration is not present. It positions data analytics vendors in partnership with state authorities in defining and operationalising ideas of normality and exception. This phenomenon maps not only onto Schmitt’s claim that the power to define normality and exception is key to state sovereignty, but also onto Hayek’s claim that it is desirable to have the market play a role in defining what is normal. Bigo (2014:220) describes the border control assemblage as “the locus of practices of sovereignty and exception”, and it is this locus of exception that offered a perfect opportunity for the technology sector to claim expertise and find new markets for its products.

The particular configuration of actors necessary to do this work happened to become available around the time of the escalation of the Syrian civil war. At that time links had already been established between the field of humanitarian agencies and big‐data players. As we note in the next sections, the timing and pre‐existing links not only initiated a marketisation of migration statistics, but also created space for a solutionistic approach to producing them.

Creating the Conditions for Datafication

The “crisis” framing of migration in the public debate—and its political effects—provided the demand for new forms of migration data. But the supply was generated by increasing closeness between large technology firms and the humanitarian innovation community. After a number of academic and policy proof‐of‐concept studies involving migration tracking (see, for an overview, Taylor 2016a), European authorities became increasingly interested in the potential of new data sources for tracking and estimating migration flows (Rango and Vespe 2017). In contrast to static administrative data collected at borders, these data were emitted as people used their phones, passed under satellites, used the internet and social media or paid for goods (Taylor and Schroeder 2015). They were dynamic and continually updating, they could be gathered in real time, and they reflected snapshots of behaviour, discourse and location rather than an organised encounter between migrants and governmental authorities.

The field of “humanitarian innovation” (Betts et al. 2012) was a fertile environment for the datafication of migration. The humanitarian field’s engagement with private‐sector data sources gained visibility in the wake of the 2010 Haiti earthquake due to high‐profile epidemiological studies using mobile phone location data (Bengtsson et al. 2011; Pindolia et al. 2012; Wesolowski et al. 2014). By 2015 the community was hosting debates around new sensor‐based tracking methods, international data sharing standards and the use of social media to understand humanitarian needs (Taylor 2016b). Tracking migration flows from the countries involved in the Arab Spring, however, offered new analytical possibilities because of the higher proportion of those migrants who used social media to communicate with their networks. Social media, smartphones, satellites, drones as well as state and journalistic reporting together therefore offered unprecedentedly detailed but highly variable real‐time data about people’s movements and activities.

The regulation of migration has arguably become an industry that is in part justified by claiming that migration can be ordered and made predictable (Meissner 2018), despite empirical evidence to the contrary (Boswell 2011). The aim of ordering migration, or of representing it in a more granular way, helps explain the lure of big data migration statistics as they are currently being adopted. Migration scholars, by contrast, aim to embrace and recognise the complex social phenomena surrounding migration; those committed to the field are aware that their analytic efforts cannot provide tools to remove the relational complexities and serendipity inherent in the dynamics of migration. This is an important caveat that puts a different spin on assessing the market value of acquiring more, and more invasively detailed, migration data at any cost. It calls, in turn, for being critical and vigilant about the uses and abuses of data in a multi‐stakeholder migration data market (Meissner 2019).

As Arendt, quoted earlier, elucidates, the production of migration statistics has always been deeply problematic. The primary basis for analysing global migration flows used to be a relatively simple and imperfect data matrix (Henning and Hovy 2011). Forecasting capacity was understood to be limited, contributing to calls for more and better data. However, the tenuousness of the link between categorising and controlling migrants has been ignored by the framers of policy: as Czaika and De Haas (2013) note, migration patterns do not necessarily respond to policy interventions in expected ways (see also Massey 1999). Models based on neoclassical migration theory tend to be criticised for focusing on limited sets of causal factors, making a big‐data approach potentially worthwhile by providing more complex explorations. The potential granularity of big data is one of its most prominent selling points in relation to making sense of migration dynamics. This granularity, according to business analysts (Laney 2001), is made possible through the variety of data sources, the velocity at which data are generated, and the sheer volume of data that can be analysed. It has further been shown that we have witnessed an increase in the speed, spread and scale of processes of diversification (Meissner and Vertovec 2015). Greater granularity could help answer concerns over how change and emergence shape the ways migration becomes socially relevant—yet as Black (2003) reminds us, how “problems” of migration are defined and debated matters to how we make sense of migration as a social phenomenon.

Scholars, including Beduschi (2018), have suggested that big data analytics could enable benevolent states and humanitarian organisations to identify needs and protect migrants from danger. However, this relies on centring the idea that migration constitutes risk to the migrant rather than conceptualising the migrant as a risk to the possible destination state. A migrant‐focused framing would have to address migrants as agents who cannot be stripped of their rights to exercise various kinds of power. Very different assumptions are operationalised when migration itself is addressed as a problem to be solved by the market. For example, McKinsey’s collaboration with the IOM (IOM and McKinsey & Company 2018) outlines “how migration data can deliver real‐life benefits for migrants and governments” by addressing migration as an economic phenomenon, where migrants are a resource to be arranged and shaped for maximum efficiency and productivity. Produced after the peak migration flows of the Arab Spring, this economic vision interacts productively with the crisis framing driving “big data for migration statistics” (IOM and European Commission 2017), where international migration is conceptualised as a crisis‐driven problem that can be solved, rather than as a “normal process” involving a diversity of flows and motivations. This purposeful intersection of crisis with the possibility of productivity has the effect of masking the politics involved in regulating migration and the politics of difference that shape migration‐driven diversity. As such datafication leaves little room for the contestation of narratives that in essence reproduce and perpetuate colonial power structures. Instead, by offering to solve a problem, data analytic efforts become even more reductive than the use of neoclassical migration theory.

Formalisation of the Problem as the Entry Point for Commercial Data Players

Politically, the idea of predicting migration through big data can be seen as one response to the wholesale failure of the European Commission’s Global Approach to Migration and Mobility (GAMM). The policy that preceded it, the Global Approach to Migration (articulated in 2005), aimed to build up a preventive infrastructure of law and border enforcement and intelligence that was then elaborated further by the GAMM policy in 2011, which targeted a broader picture including mobility. Within this broader landscape of mobility, GAMM aimed for greater legibility (Scott 1998), dividing and ordering the incoming flows of people into legitimate and illegitimate, migrant and mobile, in relation to the EU’s policy priorities. In relation to those priorities, GAMM calls for “a geographically comprehensive approach” to controlling irregular migration (European Commission 2011:4), specifically a “risk‐based approach” with “greater use of modern technology” (European Commission 2011:7). This progressive linking of risk, sorting and data technologies laid the groundwork for big data approaches to enter the policy discourse after 2011: the EC’s statement notes that “events have shown how quickly a section of the external border considered as low risk can … become subject to critical migratory pressure” (European Commission 2011:7). The policy discourse then calls on big data technologies as the solution to this complex problem of risk: through technology, GAMM could form an approach that “incorporates recognition of the volatility and elusiveness of migratory movements and adapts itself accordingly” (Casas‐Cortes et al. 2015).

Due in part to the extraordinary numbers migrating as a result of the Arab Spring, GAMM has failed to order the increasing mixing and unpredictability of migration into Europe. The “big data for migration” (BD4M) approach declared by the EC in 2017 (IOM and European Commission 2017) can be seen as one response to the challenge of making migrants more legible. Focusing on groups of political interest rather than the territory of the border, it externalises the border to any site of mobility—or discussion of mobility—on the part of groups considered potential migrants. As such migration can be cast as a problem of information: collecting the right data to enable control through categorisation.

What constitutes the “right data” is a methodological question that analytics vendors are answering in different ways. One available approach might be termed an “N=all” approach, where huge computing power is applied to all the data that might be relevant to a question, and emergent patterns are the output. Another is a sampling approach, drawing on big data sources such as Twitter and then applying those in a more focused way to a predetermined question. The first has led in the direction of social media analysis to identify smuggling networks, to anti‐trafficking cooperation and to the dynamic surveillance of irregularised migration. One salient example comes from a meeting on humanitarian data in 2015, where a senior Microsoft executive explained that the company was making proposals to European police and government authorities to use its new “Azure” environment to analyse, for indications of their future mobility plans, the large quantities of social media and video being posted by Syrians waiting at the Hungarian border.

A refugee camp is essentially something you can treat as a prison [for analytical purposes]. People will eventually come out, and the aim is then to track them, to know what they are doing, and to prevent them from offending again. (Microsoft Europe executive, 23 October 2015)

By proposing to formalise the refugee camp as a prison to fit the capabilities of the software available, such an analysis would also formalise mobility, and migrants, as inherently risky. Agre (2004) describes computer scientists as “ontologists” whose “work consists of stretching whatever discourse they find upon the ontological grid that is provided by their particular design methodology”. The definition of “risk”, in Microsoft’s proposal, is formalised as presence in a camp, with the intent to move. This approach stands in contrast to an Autonomy of Migration perspective (Moulier Boutang 1998). AoM emphasises that migrants make situated and responsive decisions that are not visible to policymakers and that themselves help constitute the border through struggle and contestation (Casas‐Cortes et al. 2015). As such they add to the situational configurations of migration‐driven diversity that have been referred to in terms of superdiversity (Meissner 2015). The two are incompatible: the deterministic approach of computer science assumes that migrants are inherently risky and that their discourse represents stable intent, while an autonomy/superdiversity perspective would maintain that migrant intent is in a constant process of formation in relation to context. What thinking about superdiversity adds is an emphasis placed on considering the intersections of migrant autonomy with modes of differentiating migrants inherent in migration control (Hall 2017; Meissner 2018). Brutal migration regimes are in the process of becoming hardwired into the rationales and motivations of doing migration statistics (Heide Uhl 2017) and present a challenge to radical politics to criticise and imagine alternatives to those processes.

Formalisation is the main offer of a data analytic approach, and it maps conveniently onto the problem articulated by the GAMM of migration flows as chaotic, disordered and in need of sorting. In the same vein its successor statements, the Big Data for Migration plan adopted by the EC in 2018 (Rango and Vespe 2017) and the intensively debated Global Compact for Safe, Orderly and Regular Migration (hereafter: Global Compact) (United Nations 2018), are based on the idea that big data‐driven analysis can lead to a reduction in irregularity. The aim of accessing more extensive and dynamic data sources fostered a market for particular commercial actors, i.e. those with access to behavioural data and the capacity to analyse it. By 2015 independent business analysts were making the case that big tech companies could sell their services to government to access and analyse data about migration:

This could perhaps be a great opportunity for a concerted effort of Big Blue (IBM), SAP, Big Four (Deloitte, EY, KPMG, PwC) and other Analytics/Big Data purveyors to come up with comprehensive capabilities to create holistic reports and list of recommendations based on existing internal and external data sets for EU leaders who then could start making fact‐based decisions to deal with this crisis. (Polach 2015)

The claim that commercial analytics of “external data” combined with the formalisation of a “migration problem” will inevitably produce more fact‐based findings than existing data practices found fertile ground in the objectives created by the GAMM, BD4M and Global Compact statements. By early 2017, migration policy researchers were recounting clear signalling by policymakers that experimental, incomplete and proof‐of‐concept proposals from the private sector were welcome:

… one of the policymakers participating in the workshop mentioned some of the key knowledge gaps in migration—including regular migration flows and migrants’ characteristics, migrant smuggling trends and migration forecasting—stating that having data characterized by a large margin of error would still be better than having no data at all. (Rango and Vespe 2017:9)

The next section will explore a particular proof‐of‐concept project where the different modes of working and epistemologies analysed above became salient.

Public and Private Practices

As well as formalising the problem, rather than surfacing emergent dynamics, big data analytics also involves sampling from big data sources to answer a particular question, thus closing off other possibly relevant avenues of inquiry. This characteristic is demonstrated by a proof‐of‐concept collaboration between the Dutch statistical agency (CBS) and a UK data analytics consultancy, CGI (van der Sangen and van Sandijk 2018). Starting in 2017, CGI proposed a group of proof‐of‐concept projects, one of which was a migration forecasting project based on a mix of satellite and social media data on Syrians. The question the project asked was whether social media activity and satellite images of groups moving were correlated.

The study was based on historical time series data from Syria during 2015. According to a collaborator from the statistics agency, the study was constrained in the variety of data it could access, eventually using mainly Twitter data because they were openly accessible. It was also constrained by cultural and linguistic expertise: the project relied on one Arabic‐speaking team member and other Arabic‐speakers this team member recruited from his own networks. The project had no Syrians involved who could understand the contextual meaning of what was posted. The team encountered problems where subjects posted on social media in Syria’s other languages, for which they had no translators.

Thus the project was based on a small subsample of the refugee population of interest: Arabic‐speaking Syrians who made posts on Twitter about migration. It was not possible to assume that this group was representative of the larger population, creating unknown bias and error in terms of standard statistical methods. However, internal validity in relation to the dataset was still possible, and the project proceeded on that basis. The proof of concept conducted, therefore, consisted of answering the question of whether Syrian Arabic posts on Twitter seemingly related to intent to migrate correlated with human mobility as observed by satellite. This narrowness was not problematic as long as the underlying question was about operationalising a particular formalisation of risk and reducing the complexity of patterns, rather than about the more difficult task of understanding the complexity of migration flows.
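For readers unfamiliar with what such a proof of concept involves computationally, the sketch below shows the bare logic of the exercise: correlating a time series of keyword‐flagged posts with a time series of satellite‐observed group movements. The numbers, the keyword proxy and the use of a plain Pearson correlation are assumptions made for illustration only; they are not drawn from the CBS/CGI project itself.

import statistics

# Hypothetical weekly time series (invented numbers, for illustration only):
# posts_flagged[i]    -- posts in week i matching migration-related keywords
# groups_observed[i]  -- groups seen moving in satellite imagery in week i
posts_flagged = [12, 18, 25, 40, 38, 55, 60, 52]
groups_observed = [3, 4, 6, 9, 8, 12, 14, 11]

def pearson(x: list[float], y: list[float]) -> float:
    """Pearson correlation coefficient between two equal-length series."""
    mean_x, mean_y = statistics.fmean(x), statistics.fmean(y)
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    var_x = sum((a - mean_x) ** 2 for a in x)
    var_y = sum((b - mean_y) ** 2 for b in y)
    return cov / (var_x * var_y) ** 0.5

r = pearson(posts_flagged, groups_observed)
print(f"Correlation between keyword-flagged posts and observed movement: r = {r:.2f}")
# Internal validity only: a high r says nothing about representativeness,
# since only Arabic-speaking Twitter users are captured and the keyword
# proxy fixes "intent to migrate" in advance.

Even a strong correlation in such an exercise demonstrates only consistency with the proxies chosen in advance; it cannot speak to the majority of people on the move who never posted at all.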

A comparison can be made between this process and the neoclassical approach to migration: reducing the reasons for, and processes of, migration to the purely economic is fundamentally incompatible with the autonomy approach, which looks at known unknowns—how migrants are reasoning about their situation, and their understanding of opportunity. The formalisation involved in sentiment analysis adds a new layer to this: the analysis attempts to formalise “intent to migrate” by using particular words as proxies for intent, but in doing so it also formalises the notion of intent in a way that is as reductive as a purely economic analysis.

The project used very general proxies: movement of people from a particular group (Arabic speakers) in a particular direction (away from particularly turbulent zones toward the EU). However, the model was also designed around migrants as a political and security risk, and the computational system built to run it excluded other interpretations. Just as Agre (2004) describes, in the friction between computational and social scientific ontologies, the overall problem easily becomes subordinated to a smaller problem of systematic accuracy in relation to the data available. Thus one possible definition of success becomes that the system “will not capture the entire meaning of the original discourse, and will distort many aspects of the meaning that it does capture, but the relationship between discourse and artefact is systematic nonetheless” (Agre 2004:28).

In line with Agre’s analysis, execution of the project soon became conflated with analytical success. Although the statistics bureau’s project manager was clear that “the results of the study indicate that the relevant information on social media is too limited to use as a reliable source for insights into refugee flow developments”,1 the work found an audience in Frontex, the IOM, the European Asylum Support Office and SatCen, the satellite provider working with Frontex. The commercial firm involved was eager to continue the work, whereas the government statistics bureau did not attempt to do so:

The project was together with CGI, a commercial company. It is of course in their interest to try and make a product that they can commercially sell to different clients … For CBS however this works differently. As a governmental organization we make mandatory statistics and additionally do commissioned work. We have only very limited resources to invest on our own. In this case we didn’t find a potential client or other type of funding to continue the project, so we stopped. (Bob van den Berg, Netherlands Central Statistics Bureau, 19 February 2019)

The idea of migration analysis as a “product we can sell” makes a powerful appeal to colonial logics of knowledge and property. Quijano’s critique of the colonial claim to rationality draws important parallels between knowledge and property where “property, like knowledge, is a relation between people for the purpose of something, not a relation between an individual and something” (Quijano 2007:173). The data doubles created by data analytics are tradeable because of the effort and skill invested in constructing them. As such, the risky migrant is more valuable to the private sector than the complex, human, autonomous one.

Just as with the packaging of the colonial subject into castes or ethnic groups, once the “migrant” data double is packaged—often using proprietary data and methods—it is not possible to reverse‐engineer or change it. The categories firms use to collect those data, and the ontologies people use when they post on social media, are not visible at the point where the data are analysed and packaged: the analyst cannot ask the Syrian what she meant by the words in her tweet but instead must create a model for what certain words mean. Where the assumption about a word’s meaning is wrong, the meaning of the whole dataset becomes skewed, as words are repeated thousands, sometimes millions of times. Foundational myths about the predominance of trafficking and smuggling, such as “that migration is becoming the preserve of international criminal gangs, or that most or all asylum‐seekers are 'bogus'” (Black 2003:41), are reified and fixed by commercial big data analytics. The inferences used as part of the analysis thus become difficult, if not impossible, for migrants or those speaking on their behalf to contest.

What Questions are We Asking of Migration Data?

More detailed and granular data may create more moral and political problems than they solve. As we have argued above, the accelerated entry of big‐data players to the field of migration statistics was driven by narratives about crisis and a tone that demanded quick and simple fixes without having to address deep‐seated differences in political positioning and perceived and practical responsibilities. The analysis thus far indicates two central points in relation to contemporary patterns of migration and modes of migration governance that are also important for how we think about related questions of data governance. First, it may be possible to model complex social phenomena, but a complex model does not make the social phenomenon being researched less complex. It also does not substitute for theoretical positioning—and it can certainly not, in and of itself, be given the power to define a “normal” or “emergency” situation in a Schmittian sense. Such an approach entails a level of homogenising that generates asymmetric power over data subjects through practices of categorisation and othering. Second, policy interventions more often than not add to, rather than simplify, the complexity of migration. What is obscured by “now‐casting” migration is that the exertion of power within migration regimes shapes what is modelled (Hall 2017). This expands on why we have argued that big data approaches, in following a Schmayekian market logic, currently lead to reductionist analyses and to ossifying categories of difference rather than to highlighting superdiversity and the complexities of migration processes.2

Scholars working in a de‐colonial tradition have been critical of an overly individualised approach to categorising migrants as this is feared to potentially reify long‐standing categories of difference rather than help contest them (e.g. Ndhlovu 2015). Such criticisms, applied to the emergence of externalised migration statistics, can help make sense of why formalising and trading migrant data doubles adds little value in terms of addressing the types of crisis singled out earlier in this paper. Instead it furthers the worrying process of producing an ever larger pool of suspended people whose lives are actively governed and surveilled, curtailing the ability of those presumed to be migrants to escape and contest juridico‐political categorisation and enumeration (Oelgemöller 2011).

As Schinkel (2017, 2018) notes, the migrant as problem and risk is entrenched in policy and scholarly debates because of the predominance of an organicist notion of the social in social scientific approaches to migration. In those debates migrants and their descendants are always already outside of the social. An organicist notion of the social also maps onto the way big‐data players conceptualise migration. As in earlier use cases in epidemiology, for which tools had already been developed and needed to be monetised, migration is likened to a disease—effectively, anyone emitting data in ways that suggest potential movement across European borders is considered a threat to the status quo. Migration is thus treated like an illness that can be forced into remission through coercive interventions, but that is best treated through early warning mechanisms and prevention efforts—the organicist logic always already excludes even before those on the move have arrived anywhere in particular. It also demands a constant data stream to monitor for signs of potential movement. This is solutionism in action, the problem defined and formalised to fit the technology rather than the other way around.

With big data analytics in play, the question of why we need new migration statistics becomes more pertinent precisely because inferences drawn from those formalisations have social consequences. Arguments that big data may help migrants (Beduschi 2018) tend to neglect the demand coming from migration management policies, which makes such analytics commercially viable. Yet if we seek to better make sense of the emergent properties of migration—not as risk but as a social process—it becomes necessary to understand how policies shape and alter migration. Looking at ways of studying social transformations through the analysis of migration‐driven diversification can help excavate more clearly the tension at play. It is clear that there are different approaches to the analysis of diversification. Diversity can be thought about in terms of categories of difference alone—for example the study of migration‐driven diversity used to focus on whether it mattered how many origin groups came together in a particular social constellation. Diversity can also be studied as a manifestation of social complexity by pointing to the emergent properties that variegated movement patterns, motivations and the regulation of migration contribute to. The first approach can become somewhat more nuanced by moving on from ethno‐focality and by expanding the list of aspects that differentiate migrants. This approach does not account for how, when and why the charted categories become salient and how they socially matter. To make these more complexity‐sensitive arguments it is necessary to link the analysis of superdiversity to an engaged debate that borrows from and develops different theoretical standpoints.

The challenges entailed in thinking about superdiversity and social transformations in light of diversification are very similar to those of uncritically accepting new players into the migration statistics game: complex models do not do away with social complexity, and policies define goals but rarely produce fully predictable outcomes. The spread, speed and scale of diversification need to be emplaced within complex interconnections. Such efforts cannot claim to make subjects knowable, but at most they can help to rethink why categories of difference matter, how they are constructed and how they are perpetuated. There is an important difference between using data to define and limit, and exploring data to acknowledge and analytically make accessible the continuously changing character of any social category.

Despite the appeal of more malleable categorisations through data‐driven analysis, this also necessitates a more concerted debate on data governance because of the opacity of the production of those categories. This stands in an awkward dialectic with the black box that constitutes intersectionality in an interconnected world—the problem that we often do not know what additional social processes need to be taken into account in how we make sense of social dynamics (Lykke 2011). Less category‐centred analytics may help by providing richer explanations of processes of differentiation. Our empirical examples show that instead, if today’s under‐regulated data market continues, there will be significant opacity rather than transparency in the way migrant data doubles are created and used. This contributes to the effective use of processes of defining legitimate mobilities as a form of territorial (neo‐colonial) strategy (cf. Vigneswaran 2013). Thus one task for research is to surface how categories of difference and sameness are being produced, in order to uncover how people are being targeted at the macro and micro level, and whether this can be mapped onto vulnerability and need for protection or not. In this way we may be able to shed light on how and why granular data are valuable: not because they allow for claiming to know and being able to contain migration—to make it risk free—but because they show that claiming such knowledge sets analytical efforts multiple steps backward. The data double is only a reassembled taxonomy of difference categories. In contrast to people, data doubles can much more easily be presented as predictable objects that inform policy decisions. Migration is thus more easily moved outside of the social world that those policies are aimed at, into the realm of scientific problems. Interventions are automatically pointed towards those presumed to be on the move—instead of being thought and theorised in terms of the entangled relational connections that policymakers need to face (Landau 2019).

Conclusion

The commercialisation of migration statistics raises not only questions about how technology affects politics and policy, but also about the interaction between politics, policy, power and the development of particular systems and analytical strategies such as those described in the empirical part of this paper—method, if market‐driven, becomes politics. We have explored the process of mutual shaping between technology and policy: an existing policy vision creates a market for technologies which then shape the world to fit that policy vision and make its enforcement possible. This dynamic is particular to the involvement of commercial actors: where policy interacts with the data analytics market, a field is created for firms to compete for contracts based on how closely they can adapt and develop analytical techniques to policy objectives. This provides a self‐reinforcing cycle of categorisation and control according to a risk‐based vision of migration, where data doubles are populated with data, formalised and packaged for sale in accordance with that vision. This stands in contrast to the traditional statistical procedures, which (although often demand‐driven) tend to be modular, discursive and thus more open to dis‐ and re‐assemblage in other configurations. For statistical agencies, the risky migrant can just as easily be framed as the at‐risk migrant, and the asylum‐seeker as the subject of domestic economic and social policy.

Analytically tracing the increasing marketisation of migration data illuminates the big‐data statistical project as a whole: it prompts us to ask why we need migration statistics, and what would constitute justice in the production of migration data. The kind of statistics marketed as “big data” tend to be designed to facilitate particular actions, in line with both profit‐making and the production and reproduction of the illegality and deviance of those not perceived as profitable. The proxies we described are ultimately used to flatten multiple realities into a simple threat/no‐threat binary crucial for narratives of control. It is also possible to foreground other possible objectives. The superdiversity lens we employ, for example, offers a dialogue between different policy targets. While the 3 v’s of big data may map neatly onto the much‐noted speed, spread and scale of migration and its implications, these 3 s’s of superdiversity should not be thought about without reference to the 3 p’s: policy, politics and power. Activating this link surfaces the different questions that can be asked of the statistical subject, opening up multiple possible policy trajectories of how to achieve a particular vision of society, rather than closing them down to a single index of risk.

We have identified the notion of the emergency as a foundational logic driving the market dynamics we describe. Urgency and crisis, in the “big data for migration” market, drive the availability of funding, partners and a policy audience. The migrant data double is valorised and monetised in relation to crisis narratives. Those narratives also legitimise the state’s framing of migration as risk. Besides making a tradeable product, the computational formalisation of the migrant into a data double also has real and direct effects on migrants. Despite being an analytical construct without grounding in the lived experience of migration or migrant identity, the data double, through its policy valence, creates a new, non‐territorial transit zone where migrants can be effectively suspended and reinvented (Oelgemöller 2011), influencing who can enter where, and how they can legitimately move or settle.

This is a political problem, not one of data protection, privacy or data ethics. It is not identifiability that is at stake, but its opposite: intervention without understanding of the target. It is here that a superdiversity perspective could act as a brake on a spiral of crisis and othering on the part of states: rather than wholesale approaches to analysing migration flows, the new data sources could be used in coordination with existing data and theory to better understand the complexity of those flows, and to formulate policy responses that address migrants primarily as people within state territory, on the model of Floridi’s (2016) invocation of dignity in the face of datafication, and Wachter and Mittelstadt’s (2019) posited “right to reasonable inference”.

The use of migration data is a hard case which the project of global data justice (Taylor 2017) must address if it is to provide a meaningful framework for the progressive governance of data. Global data justice aims, primarily, to reconcile datafication’s operations on the global scale with people’s subjective needs, and to understand what constitutes the legitimate use of data. A progressive politics of data for migration, in this framing, asks how migrant autonomy and representation can be located within datafication practices, and what sort of legitimacy we should look for in relation to the collection and use of data on migrants. This presents two targets for critical interrogation and action: first, the design and implementation of the digital systems used to sort and categorise, which “automate inequality” (Eubanks 2018) and draw attention away from the complex and entangled problems of poverty and vulnerability in favour of uniform structures of classification and control. Second, the rhetorical and administrative separation of migration and migrants from domestic society, which similarly pushes both debate and resistance toward issues of national and economic security rather than equality and poverty alleviation. If we address data as a way of exploring diversity rather than as a tool for control, migrants can be seen as part of society and their autonomy and representation potentially become policy concerns.

Any migration policy based on a vision of risk aims to address problems that derive from an anxiety over a loss of control and certainty (Appadurai 2016). Paths established by centuries‐old core–periphery dynamics of colonialism and cemented by invited migration based on colonial ties may be closed off by policy. Yet those paths leave their traces in migrant imaginaries. For the human mobility we analyse in this paper, war and political insecurity are the proximate causes but colonial history and global inequalities form important underlying gravitational forces. Changing economic and political configurations are gradually shifting migration flows in much less predictable ways than currently available data might suggest. For states worried about migration as a risk, the policy focus might better be placed on acknowledging and making graspable the complexity of flows, rather than seeking data‐driven control and securitisation.

Acknowledgements

We thank Aaron Martin for his substantial contribution to data collection for the empirical component of this paper, and Darshan Vigneswaran for his feedback. We also thank reviewers and editors at Antipode for their collegial and helpful feedback, and Mariana Cardella Hermida for her translation of the abstract. On Linnet Taylor’s part, this paper was written with funding from the “Global Data Justice” project, which received funding from the European Research Council (grant agreement no. 757247). Fran Meissner’s work was supported by the European Union’s Marie Skłodowska‐Curie Actions (grant agreement no. 707124).

Endnotes

1. Interview with Bob van den Berg, CBS Netherlands, 19 February 2019.

2. For an alternative example of using social media data to question the possibility of certainty, see http://arewehuman.iksv.org/exhibition/it-is-obvious-from-the-map/

Contributor Information

Linnet Taylor, Email: L.E.M.Taylor@tilburguniversity.edu.

Fran Meissner, Email: Fran.Meissner@tudelft.nl.

References

1. Agamben G (2005) State of Exception. Chicago: University of Chicago Press
2. Agre P E (2004) Internet research: For and against. In Consalvo M, Baym N, Hunsinger J, Jensen K B, Logie J, Murero M and Shade L R (eds) Internet Research Annual: Selected Papers from the Association of Internet Researchers Conferences 2000–2002, Volume 1 (pp 25–36). New York: Peter Lang
3. Andersson R (2014) Illegality, Inc. Oakland: University of California Press
4. Appadurai A (2016) Aspirational maps: On migrant narratives and imagined future citizenship. Eurozine 19 February https://www.eurozine.com/aspirational-maps/ (last accessed 25 September 2019)
5. Arendt H (1951) The Origins of Totalitarianism. Boston: Houghton Mifflin Harcourt
6. Beduschi A (2018) The big data of international migration: Opportunities and challenges for states under international human rights law. Georgetown Journal of International Law 49(4):981–1071
7. Bengtsson L, Lu X, Thorson A, Garfield R and von Schreeb J (2011) Improved response to disasters and outbreaks by tracking population movements with mobile phone network data: A post‐earthquake geospatial study in Haiti. PLoS Medicine 8(8):1–9
8. Betts A, Bloom L and Omata N (2012) “Humanitarian Innovation and Refugee Protection.” Working Paper No. 85, Refugee Studies Centre, University of Oxford
9. Bigo D (2014) The (in)securitization practices of the three universes of EU border control: Military/Navy—border guards/police—database analysts. Security Dialogue 45(3):209–225
10. Black R (2003) Breaking the convention: Researching the “illegal” migration of refugees to Europe. Antipode 35(1):34–54
11. Boswell C (2011) Migration control and narratives of steering. British Journal of Politics and International Relations 13(1):12–25
12. Broeders D (2009) Breaking Down Anonymity: Digital Surveillance of Irregular Migrants in Germany and the Netherlands. Amsterdam: Amsterdam University Press
13. Broeders D and Dijstelbloem H (2016) The datafication of mobility and migration management: The mediating state and its consequences. In van der Ploeg I and Pridmore J (eds) Digitizing Identities: Doing Identity in a Networked World (pp 242–260). New York: Routledge
14. Burns R, Dalton C M and Thatcher J E (2017) Critical data, critical technology in theory and practice. The Professional Geographer 70(1):126–128
15. Casas‐Cortes M, Cobarrubias S and Pickles J (2015) Riding routes and itinerant borders: Autonomy of migration and border externalization. Antipode 47(4):894–914
16. Czaika M and De Haas H (2013) The effectiveness of immigration policies. Population and Development Review 39(3):487–508
17. Duffield M (2016) The resilience of the ruins: Towards a critique of digital humanitarianism. Resilience 4(3):147–165
18. Eubanks V (2018) Automating Inequality: How High‐Tech Tools Profile, Police, and Punish the Poor. New York: St. Martin’s Press
19. European Commission (2011) “Communication from the Commission to the European Parliament, the Council, the Economic and Social Committee, and the Committee of the Regions: Communication On Migration.” COM(2011) 248
20. Eurostat (2016) “Asylum in the EU Member States—Record Number of Over 1.2 Million First Time Asylum Seekers Registered in 2015—Syrians, Afghans and Iraqis: Top Citizenships.” STAT/16/581
21. Floridi L (2016) On human dignity as a foundation for the right to privacy. Philosophy and Technology 29(4):307–312
22. Ghosh B (2007) Managing migration: Towards the missing regime? In Pécoud A and de Guchteneire P (eds) Migration Without Borders: Essays on the Free Movement of People (pp 97–118). New York: Berghahn
23. Greenhill K M (2016) Open arms behind barred doors: Fear, hypocrisy, and policy schizophrenia in the European migration crisis. European Law Journal 22(3):317–332
24. Haggerty K D and Ericson R V (2000) The surveillant assemblage. British Journal of Sociology 51(4):605–622
25. Hall S M (2017) Mooring “super‐diversity” to a brutal migration milieu. Ethnic and Racial Studies 40(9):1562–1573
26. Halpern S (2016) They have, right now, another you. The New York Review of Books 22 December https://www.nybooks.com/articles/2016/12/22/they-have-right-now-another-you/ (last accessed 25 September 2019)
27. Hansen P (2018) Asylum or austerity? The “refugee crisis” and the Keynesian interlude. European Political Science 17(1):128–139
28. Hayek F A (2001 [1944]) The Road to Serfdom. London: Routledge
29. Heide Uhl B (2017) Assumptions built into code: Datafication, human trafficking, and human rights—a troubled relationship? In Piotrowicz R W, Rijken C and Heide Uhl B (eds) Routledge Handbook of Human Trafficking (pp 407–416). London: Routledge
30. Henning S and Hovy B (2011) Data sets on international migration. International Migration Review 45(4):980–985
31. Hollerith H (1894) The electrical tabulating machine. Journal of the Royal Statistical Society 57(4):678–689
32. IOM (2018) “Migrant.” Key Migration Terms, International Organization for Migration https://www.iom.int/key-migration-terms#Migrant (last accessed 13 February 2019)
33. IOM and European Commission (2017) “Big Data for Migration Alliance (BD4M): Harnessing the Potential of New Data Sources and Innovative Methodologies for Migration.” IOM Global Migration Data Analysis Centre and European Commission Knowledge Centre on Migration and Demography https://bluehub.jrc.ec.europa.eu/bigdata4migration/uploads/attachments/cjio8ha96019f99zvr9xq6025-big-data-for-migration-alliance-concept-note.pdf (last accessed 25 January 2019)
34. IOM and McKinsey & Company (2018) “More Than Numbers: How Migration Data can Deliver Real‐Life Benefits for Migrants and Governments.” IOM Global Migration Data Analysis Centre and McKinsey & Company
35. Klein N, Smith N and Patrick C (2008) The shock doctrine: A discussion. Environment and Planning D: Society and Space 26(4):582–595
36. Landau L B (2019) A chronotope of containment development: Europe’s migrant crisis and Africa’s reterritorialisation. Antipode 51(1):169–186
37. Laney D (2001) “3D Data Management: Controlling Data Volume, Velocity, and Variety.” https://blogs.gartner.com/doug-laney/files/2012/01/ad949-3D-Data-Management-Controlling-Data-Volume-Velocity-and-Variety.pdf (last accessed 25 January 2019)
38. Lev‐Ram M (2016) Palantir connects the dots with big data. Fortune 9 March http://fortune.com/palantir-big-data-analysis/ (last accessed 25 September 2019)
39. Lykke N (2011) Intersectional analysis: Black box or useful critical feminist thinking technology? In Lutz H, Herrera Vivar M T and Supik L (eds) Framing Intersectionality: Debates on a Multi‐Faceted Concept in Gender Studies (pp 207–220). Farnham: Ashgate
40. MacAskill E and Franklin J (2016) Latin America’s Schindler: A forgotten hero of the 20th century. The Guardian 14 December https://www.theguardian.com/world/2016/dec/14/roberto-kozak-chile-latin-america-schindler (last accessed 25 September 2019)
41. Massey D S (1999) International migration at the dawn of the 21st century: The role of the state. Population and Development Review 25(2):303–322
42. Meissner F (2015) Migration in migration‐related diversity? The nexus between superdiversity and migration studies. Ethnic and Racial Studies 38(4):556–567
43. Meissner F (2018) Legal status diversity: Regulating to control and everyday contingencies. Journal of Ethnic and Migration Studies 44(2):287–306
44. Meissner F (2019) Of straw figures and multi‐stakeholder monitoring: A response to Willem Schinkel. Comparative Migration Studies 10.1186/s40878-019-0121-y
45. Meissner F and Vertovec S (2015) Comparing super‐diversity. Ethnic and Racial Studies 38(4):541–555
46. Microsoft (2017) “Real‐Time Twitter Sentiment Analysis in Azure Stream Analytics.” https://docs.microsoft.com/en-us/azure/stream-analytics/stream-analytics-twitter-sentiment-analysis-trends (last accessed 25 January 2019)
47. Mirowski P (2013) Never Let a Serious Crisis Go to Waste: How Neoliberalism Survived the Financial Meltdown. New York: Verso
48. Morozov E (2013) To Save Everything, Click Here: The Folly of Technological Solutionism. New York: PublicAffairs
49. Moulier Boutang Y (1998) De l’esclavage au salariat: Économie historique du salariat bridé. Paris: Presses universitaires de France
50. Ndhlovu F (2015) A decolonial critique of diaspora identity theories and the notion of superdiversity. Diaspora Studies 9(1):28–40
51. Nelson L (2019) Illegality. In Antipode Editorial Collective (eds) Keywords in Radical Geography: Antipode at 50 (pp 151–154). Oxford: Wiley
52. New Keywords Collective (2016) “Europe/Crisis: New Keywords of ‘the Crisis’ in and of ‘Europe’.” Near Futures Online, Zone Books http://nearfuturesonline.org/europecrisis-new-keywords-of-crisis-in-and-of-europe/ (last accessed 21 January 2019)
53. Oelgemöller C (2011) “Transit” and “suspension”: Migration management or the metamorphosis of asylum‐seekers into “illegal” immigrants. Journal of Ethnic and Migration Studies 37(3):407–424
54. Pasquale F (2013) Schmayek’s shutdown. Balkinization 16 October https://balkin.blogspot.nl/2013/10/schmayeks-shutdown.html (last accessed 25 January 2019)
55. Pindolia D K, Garcia A J, Wesolowski A, Smith D L, Buckee C O, Noor A M, Snow R W and Tatem A J (2012) Human movement data for malaria control and elimination strategic planning. Malaria Journal 10.1186/1475-2875-11-205
56. Polach B (2015) Refugee crisis and the EU: Will big data/analytics provide some answers? Innovation Enterprise https://channels.theinnovationenterprise.com/articles/refugee-crisis-and-eu-will-big-data-analytics-provide-some-of-the-answers (last accessed 23 August 2018)
57. Quijano A (2007) Coloniality and modernity/rationality. Cultural Studies 21(2/3):168–178
58. Rango M and Vespe M (2017) “Big Data and Alternative Data Sources on Migration: From Case‐Studies to Policy Support.” Summary Report, European Commission Joint Research Centre
59. Scheel S (2013) Autonomy of migration despite its securitisation? Facing the terms and conditions of biometric rebordering. Millennium 41(3):575–600
60. Schinkel W (2017) Imagined Societies. Cambridge: Cambridge University Press
61. Schinkel W (2018) Against “immigrant integration”: For an end to neocolonial knowledge production. Comparative Migration Studies 10.1186/s40878-018-0095-1
62. Schmitt C (2015 [1934]) Politische Theologie. Berlin: Duncker and Humblot
63. Scott J C (1998) Seeing Like a State: How Certain Schemes to Improve the Human Condition Have Failed. New Haven: Yale University Press
64. Sigona N (2015) Seeing double? How the EU miscounts migrants arriving at its borders. The Conversation 16 October https://theconversation.com/seeing-double-how-the-eu-miscounts-migrants-arriving-at-its-borders-49242 (last accessed 25 January 2019)
65. Sigona N (2018) The contested politics of naming in Europe’s “refugee crisis”. Ethnic and Racial Studies 41(3):456–460
66. Taylor L (2016a) No place to hide? The ethics and analytics of tracking mobility using mobile phone data. Environment and Planning D: Society and Space 34(2):319–336
67. Taylor L (2016b) The ethics of big data as a public good: Which public? Whose good? Philosophical Transactions of the Royal Society A 10.1098/rsta.2016.0126
68. Taylor L (2017) What is data justice? The case for connecting digital rights and freedoms globally. Big Data and Society 10.1177/2053951717736335
69. Taylor L and Schroeder R (2015) Is bigger better? The emergence of big data as a tool for international development policy. GeoJournal 80(4):503–518
70. UN Global Pulse (2017) “Social Media and Forced Displacement: Big Data Analytics and Machine‐Learning.” UN Global Pulse and UNHCR Innovation Service
71. United Nations (2018) “Global Compact for Safe, Orderly and Regular Migration: Intergovernmentally Negotiated and Agreed Outcomes.”
72. van der Sangen M and van Sandijk J (2018) Mapping migrant flows with satellites. Statistics Netherlands (CBS) 26 February https://www.cbs.nl/en-gb/corporate/2018/09/mapping-migrant-flows-with-satellites (last accessed 25 January 2019)
73. Vertovec S (2017) Mooring, migration milieus, and complex explanations. Ethnic and Racial Studies 40(9):1574–1581
74. Vigneswaran D (2013) Territory, Migration, and the Evolution of the International System. Basingstoke: Palgrave Macmillan
75. Wachter S and Mittelstadt B (2019) A right to reasonable inferences: Re‐thinking data protection law in the age of big data and AI. Columbia Business Law Review 2019(2):494–620
76. Wesolowski A, Buckee C O, Bengtsson L, Wetter E, Lu X and Tatem A J (2014) Containing the Ebola outbreak: The potential and challenge of mobile network data. PLoS Currents 29 September http://currents.plos.org/outbreaks/index.html%253Fp=42561.html (last accessed 25 September 2019)
