The research project sheds light on the inherent violence and shortcomings of today’s emerging paradigm of computational capital (Beller 2017), shaped by machinic intelligence and automation. Within this increasingly complex arrangement of human and non-human actors, it raises questions of responsibility, legitimacy, truth politics and legal frameworks. The project investigates the specific characteristics of camps, more precisely of refugee camps, as places that fulfil a unique function within today’s geopolitical landscape. The camp is conceptualised in relation to its use as a biopolitical laboratory for the testing of experimental technologies (biometrics) and its availability as a source of cheap labour: biometric data-work (Case Study 1) (1) and the online outsourcing of programming work to refugees (Case Study 2) (2). Biometrics are regarded as an aspect of algorithmic governance.
As a historically young dispositif, algorithmic governance acts through the indirect steering and control of human and non-human actors by computational norms (Pasquinelli 2017). The project illustrates how new forms of value extraction and knowledge production are integral to the way computational capital reproduces and deepens inequality and social difference. Moreover, it intends to show how computational capital’s machines, codes and modes of governance are built on colonialism, imperialism, violence and war.
The UNHCR cooperates with the company IrisGuard, which supplies the technology for iris recognition in refugee camps. Since 2013, everyone arriving at the Jordanian Zaatari or Azraq camp must register their irises. IrisGuard has installed around 300 registration sites worldwide and scanned more than 2.4 million refugees. The iris template, a kind of eye barcode produced automatically, is then uploaded to a cloud server. The UNHCR’s database can potentially track, tag, monitor and predict not only refugees’ consumer behaviour but also their movement (3). This mode of data mining is effectively compulsory, since food and relief aid are largely distributed through cash-based assistance: scanned irises now replace cash, and people pay with their eyes in camp supermarkets run by large chains such as Tazweed.
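The payment flow described above can be sketched schematically. The following is a purely illustrative simplification, not IrisGuard’s or the UNHCR’s actual system: all names, the registry data structure and the matching threshold are hypothetical, and the iris code is modelled as a short bit string matched by Hamming distance.

```python
# Hypothetical sketch of iris-based cash assistance: a scanned code is
# matched against an enrolment registry by Hamming distance, and a
# successful match debits the person's assistance account.
# Everything here is illustrative; real iris templates are far larger
# and matching is proprietary.

REGISTRY = {
    "person-001": ("1011001110100101", 50.0),  # (enrolled iris code, balance)
}

def hamming(a: str, b: str) -> int:
    """Number of differing bits between two equal-length bit strings."""
    return sum(x != y for x, y in zip(a, b))

def pay_with_eye(scanned_code: str, amount: float, threshold: int = 3):
    """Find the closest enrolled code; debit the account if it matches."""
    best_id, (best_code, balance) = min(
        REGISTRY.items(), key=lambda kv: hamming(scanned_code, kv[1][0])
    )
    if hamming(scanned_code, best_code) > threshold or balance < amount:
        return None  # no sufficiently close match, or insufficient balance
    REGISTRY[best_id] = (best_code, balance - amount)
    return best_id

# A scan with one flipped bit still matches and debits the account:
buyer = pay_with_eye("1011001110100111", 12.5)
```

The sketch makes the political point of the passage concrete: the body itself becomes the payment credential, and every purchase necessarily passes through, and is recorded in, a central registry.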
The humanitarian rationale of urgency (“something must be done”) allows the UNHCR to conduct political, medical and policing tests (Jacobsen 2015). Historically, new technologies and experiments have always been tested and carried out on minorities or on groups perceived as inferior. Camps thus serve as political-juridical grey areas, characterised by extraterritoriality, regimes of exception and marginalisation (Agier 2011). This research proposes to investigate how refugees of the global peripheries become experimental, precarious populations in vast Labcamps in which statistical, algorithmic and biometric technologies choreograph their performance.
How do individuals come to perform a new form of work, one that is informal, unpaid and precarious in the sense of Immaterial Labour (Lazzarato 1996), and in which their data-hybrid bodies play a fundamental role? The precariousness of living in the camp, the exploitation of refugees in the Labcamp as objects of experimentation and as producers of identity and data work, and their control via algorithmic governance turn the camp into a site of production, application and absorption of globalised computational capital. On all these levels a new form of work is in progress: work that generates surplus value from scarce resources and vulnerable populations.
Methodologically, the project combines two approaches: it excavates the media histories of refugee camps through a governmental-forensic discourse analysis and through fieldwork on site. It adopts methods of the research group Forensic Architecture (FA) in order to locate and question the legitimacy and responsibility of the resources that enable people, artefacts, data and media to exercise agency. This approach makes it possible to visualise the entanglements between camp, labour and technology by challenging the dispositif of algorithmic governance and algorithmic work. It aims to reverse the forensic gaze of state agencies and powerful institutions such as the UNHCR, with the objective of addressing questions of algorithmic accountability (Schuppli 2014).
1. Ariana Dongus, Christina zur Nedden, ZEIT ONLINE: https://www.zeit.de/digital/datenschutz/2017-12/biometrie-fluechtlinge-cpams-iris-erkennung-zwang, accessed 15.01.2018.
2. Ariana Dongus, Christina zur Nedden, FRANKFURTER ALLGEMEINE SONNTAGSZEITUNG: http://www.faz.net/aktuell/wirtschaft/re-coded-bildet-fluechtlinge-im-irak-zu-programmierern-aus-14993418.html, accessed 15.01.2018.
3. The enrolment of Rohingya refugees in Bangladesh is but one example of the perils that UNHCR’s Biometric Identity Management System imposes on vulnerable populations. See: Zara Rahman, IRIN NEWS, https://www.irinnews.org/opinion/2017/10/23/irresponsible-data-risks-registering-rohingya, accessed 15.12.2017, or Elise Thomas, WIRED UK, http://www.wired.co.uk/article/united-nations-refugees-biometric-database-rohingya-myanmar-bangladesh, accessed 15.03.2018.