KIM features an essay by digital artist and scholar Hiba Ali. The essay is part of a series hosted on our blog to stimulate discussions on the effects of AI on society.


Constant Tracking

A recently announced moratorium on face recognition technology by IBM, Microsoft, and Amazon came on the heels of the global anti-racist uprising for Black lives. But this one-year ban will do no “good,” especially when police already have access to such technology and are using it on protesters. It is a surface-level political tactic aimed at fortifying a false public image of (oxymoronic) “good” corporations when, in fact, these corporations perpetuate classism, racism, U.S. imperialism, and sexism. What is needed is a ban on facial recognition and carceral surveillance technologies. This call, echoed by Sarah T. Hamid, William Jamal Richardson, Melinda Sebastian, and Timnit Gebru, builds upon previous work by Color Coded LA, the Ida B. Wells Just Data Lab, Mijente, and Stop LAPD Spying, and leads me to conclude that public agencies and private companies have partnered to create a business that uses the lives of U.S. Black and brown communities for carceral and surveillance purposes. As Melinda Sebastian has pointed out, almost all of the algorithms we use to measure bias and “fairness” come from “Google and Microsoft, or some group funded by one of those two.”

As I type this article, I am being surveilled by a Windows device. I have not allowed the device to conduct this type of data collection, but it does so anyway as part of the “Basic” mode of device protection. The “Basic” mode of Telemetry, the surveillance software that comes built into every device running the Windows 10 operating system, is not basic at all: its documentation includes an enormous section (about 30,000 words) detailing all of the extensions, events, and information that can be collected. Telemetry insists I forsake my privacy, the contents of my computer, my keyboard input, and my usage of programs, all for “safety.” When a document is being scanned for data indexing, I am not alerted. When I walk down a street in the city and cameras take photos of my face, I am not alerted. The continued usurping of our data has built a totalizing form of surveillance in which our bodies and actions are coded into surveillance infrastructure that hardens racism and sexism, part of the “technological redlining” that Safiya Umoja Noble describes in her book Algorithms of Oppression.


Histories of Surveillance and Carceral Infrastructure

The lifeblood of contemporary carceral surveillance and technology frameworks flows from U.S. histories of enslavement, empire, and the enclosure of Black and Indigenous people. As Ruha Benjamin defines it in her book Race After Technology (2019), the “New Jim Code” is modeled after the Jim Crow laws, the system of laws mandating racial segregation from 1876 to 1965 in which “legal codes, social codes, and building codes intersected to keep people separate and unequal”; the term also links to Michelle Alexander’s book The New Jim Crow: Mass Incarceration in the Age of Colorblindness (2010). Through technology, Jim Crow’s codes are updated as the “New Jim Code,” maintained and enforced through algorithms that sort, organize, and adapt them to a digital structure. As Ethan Ciel has noted, in March 1713 the Common Council of the City of New York, the precursor to today’s City Council, approved “A Law for Regulating Negro & Indian Slaves in the Night Time.” It was a “lantern law”: it required any enslaved person older than 14 to carry “A Lanthorn and lighted Candle in it…as the light thereof may be plainly seen.”

Simone Browne, in her book Dark Matters: On the Surveillance of Blackness (2015), states that throughout history “you see light being used as a disciplinary and surveillance tactic.” Browne notes that lantern laws also dehumanized enslaved people in a broader sense, turning them into human streetlights: “The lantern… kind of incorporated black and indigenous people into the public infrastructure of lighting.” This lighting infrastructure as surveillance was updated in 2016, when floodlights were installed as part of Mayor Bill de Blasio’s 2014 plan to make neighborhoods and housing developments safer. The bright lights were aimed at New York City Housing Authority (NYCHA) developments, public-housing units throughout the city available to low- and moderate-income New Yorkers, whose predominantly Black and Latino neighborhoods are already replete with surveillance cameras.


Using Fear and the False Premise of Safety

As Wendy Hui Kyong Chun observes in her book Control and Freedom: Power and Paranoia in the Age of Fiber Optics (2006), our data leaks; nothing is ever truly secure in a digital system. Flashy commercial campaigns promote false notions of safety, innovation, and convenience, all exchanged for our privacy. Our private data is an endless site of theft and pilfering for corporations; even our faces and gaits are weaponized against us. We have seen how images of U.S. protesters posted on social media during the uprising have been used to arrest them. Joy Buolamwini, founder of the Algorithmic Justice League, has testified before the U.S. Congress on the ways in which facial recognition software continuously “misreads” queer and darker-skinned people. The carceral logic of facial recognition systems is not an accident: the software does not merely misrecognize these people, it misclassifies and, in turn, misidentifies them. In an era of surveillance capitalism, the “default” of facial recognition technologies is to be used against people for corporate profit, with disregard for the violence they unleash on our material realities.


Conclusion

In the U.S., Portland, San Francisco, Jackson, and Boston have so far banned facial recognition, while other cities consider similar legislation; these cities serve as examples of the steps toward abolishing carceral logic. Technology’s infrastructure in the hands of authoritarian, fascist regimes is always going to be used in abusive ways. We should advocate for the total banning of facial recognition systems and their use as part of carceral technologies. We can refer to online protest guides that instruct us in methodologies of sousveillance, workarounds, and fugitivity. As raised by Charlene Carruthers, Mariame Kaba, Mikki Kendall, and Ruth Wilson Gilmore, we need a reparative framework to address and abolish the resource extraction central to the enterprise of surveillance and carceral infrastructure: the actual mining of materials to make these technologies, the import of U.S. military tactics into domestic policing as part of the “green to blue” pipeline, and the constant expropriation of data. The road to abolishing the carceral technology industrial complex is long; the work of Sarah T. Hamid instructs us to work against the corporate grain, to work slowly and safely, and to “live in friction.” We root ourselves in movements of abolition and care by strengthening people’s sovereignty and autonomy in our communities. When data is linked to carceral modes of being, it is bound to repeat the logic it is fed. Banning facial recognition systems will be effective only when it occurs in tandem with abolishing the police and dismantling the carceral state.


→ Watch the 360° video “workers liberation as environmental justice” (2020) here.