Night terrors. I couldn’t remember the last time I had them before reading Gregoire Chamayou’s account of the US military’s use of drone strikes. Chamayou’s book chronicles two modes of operation that mark weaponised drones: the “pattern-of-life analysis” and the “kill box”. In the first mode, through a combination of advanced surveillance techniques, people in targeted regions have their spatial, temporal and social activities digitally mapped based on their “regularities”. Any deviation from one’s normal pattern of behaviour then triggers suspicion, and if such behaviour corresponds to a pre-identified “signature” of militant activity, one is liable to be fired upon by “pilots” sitting in front of screens thousands of kilometres away. In the second mode, weaponised drones operate by demarcating target zones into discrete digital cubes called “kill boxes”, within which their remote operators can unleash Hellfire missiles. Within such a frame, the lifeworlds of millions are transformed into a virtual hunting ground, with human beings as algorithmic prey.
While apocalyptic warnings by prominent figures like Stephen Hawking on artificial intelligence-based killing machines have attracted some media attention, for millions living in Pakistan, Afghanistan and Yemen, that future is already here.
In what sort of world is such a means of killing thinkable?
I turned up to work cranky after a night of interrupted sleep, rueing the day I began reading about drone strikes. As a matter of routine, I launched the Learning Management System (LMS) that I, like many university teachers, am reliant upon to cope with administering students in their hundreds. Maybe I was in a daze, but as the LMS dashboard loaded that morning, inviting me to act on students who had been “inactive” and “at risk”, it brought to mind Heidegger’s point about the “essence of technology” – that it “enframes” the world and people as calculable, measurable and manipulable. The conjunction of the institutional pressures of large classes, and consequently the “need” to administer these with the LMS, invited me to see my students not as inhabiting diverse lifeworlds that I might encounter through dialogue, but as already knowable as “good” or “bad”, “successful” or “at risk”, and to act on them based on symptomatic patterns in their behaviour.
Perhaps my visceral reaction to the way the world and people are seen through weaponised drones came not from its alienness, but from its familiarity.