The use of drones in combat is a practice that has consistently attracted attention over the past decade, whether in the form of a celebration of technology's potential for removing humans from harm, or a sort of fascinated disquiet with the ways in which their use reconfigures ideas about warfare that are centuries old. These are all well-rehearsed elsewhere. What struck me recently was the way this technology is being represented as it moves from the domain of war to civilian life. In particular, I've been thinking about the capabilities of drones that make them distinct, that are peculiar to them. I think that these are predominantly where the moral queasiness that accompanies talk about drones comes from, and I think these capabilities are exactly what are being jettisoned as they become part of everyday life.
I should say now that a detailed analysis of what a "drone" is would quickly have to start seeing them as one particular extension of a network, with operators in one location, governance authorities in another, manufacturers and providers in yet others, all linked materially (ordnance, fibre optics, warehouses, office buildings, maintenance equipment, satellites) and immaterially (radio waves, laser targeting, guilt, obligation, trauma), and indeed combat is networked to such an extent now that the question of where the bounds of a belligerent entity or assemblage exist is philosophically challenging. To a degree, then, the idea of "drones" is a construction, perhaps predominantly a civilian construction, a way of synecdochically grouping together elements of a complex assemblage of technologies and practices for which the Reaper or Predator act as figurehead. So when I talk about "drones" I am simplifying, or at least making use of a construction that is not quite the same thing as the technologies a military contractor or engineer might be referring to when they discuss air support or an "unmanned aerial vehicle". But it's that idea of drones that is moving from the military to civilian, and that's what I'm interested in.
I think that what is distinctive about the idea of drones, then, is:
- their capacity to act on the surface of the earth from the air (most often ballistically)
- their autonomy in making decisions, most controversially regarding the use of lethal force but more generally in categorising the input they receive (a threat? a person? a missile silo?)
- their independence from the ground during long hours aloft (of course this is linked to their autonomy, but I'm highlighting this separately as I think it's a distinct source of the uncanny nature people ascribe to them)
- the asymmetry of the surveillance relationship (it's not simply that they spy, because observation has always been part of warfare, but that the people they see in such amazing detail have so limited a view of them)
- their association with the highest levels of military decision-making (seen in the descriptions, for example, of "Obama's drone war" and the early concerns with the chain of command associated with their use)
- the distance (spatial and cultural) between their operators and those who suffer their effects.
These effects, of course, are what give rise to the well-documented ethical queasiness that is associated with their use. Despite being described as a military technology, drones have a huge impact on civilian life in the regions in which they are used, not simply at the moments when they kill people and damage property but through the threat of such harm. In the aftermath of an attack, survivors are reluctant to aid victims. People are reluctant to hold weddings or funerals. Communities fray. The psychological effects of being constantly under threat are immense: the drones can be heard, in some regions continuously, frequently unseen, with hugely damaging consequences.
When [children] hear the drones, they get really scared, and they can hear them all the time so they're always fearful that the drone is going to attack them…
(The quote is from page 86 of a report from Stanford and NYU, "Living Under Drones" – another detailed report, from Columbia, is "The Civilian Impact of Drones".) Children suffer particularly from living in regions subject to regular drone attacks, through the psychological trauma of being in constant fear, through not being distinguished from adult enemy combatants, and, where schools have been destroyed or the journey is too dangerous, through losing opportunities for education.
Overwhelmingly these are abstract issues for the people I know and the media I read. In the context of Europe, or the 'global North', or the 'developed world', drones are used in other countries, not where we live. These issues affect the people 'over there', alongside all the other issues that come with poverty and insecurity and having the wrong kind of history.
But they are increasingly visible over here, too. Drones, or an idea of them, are moving into civilian life. They have gone from being a shorthand for 'the future', an upgraded jetpack, to a feature of consumer and education arenas, and along the way their most troublesome aspects have been elided. Now, drones are toys for photographers and enthusiasts, tools for governing authorities to monitor land use, or vehicles for educators to promote STEM subjects and teach engineering. Their use for surveillance remains troubling for some, but if this is the only questionable element remaining of the original idea of "drones" then it is straightforward to subsume those concerns within the broader surveillance debate. The danger to civil liberties is removed and abstracted, to sit alongside traffic cameras and data protection, leaving an appealing edginess around what is left: expensive model aircraft for middle-class hobbyists, a specialised technological front in existing power struggles between observers (police, paparazzi) and the observed (citizens, celebrities). In sectors that infrequently concern themselves with wider critical or political questions, such as engineering or agriculture, drones are simply a practical new tool for managing space, and ethical questions tend not to be brought up.
These examples are hardly an exhaustive survey of the uses to which civil society might put drone technology, of course. But they illustrate that in moving from the military to the civilian, the elements that belong uniquely to the idea of drones, and that remain the most ethically problematic aspects of their use in military conflict, have been obscured. Their capacity to act on the surface of the earth from the air, their autonomy, their independence from the ground during long hours aloft, the distance (spatial and cultural) between their operators and those who suffer their effects: none of these feature in the stories being told about civilian drones. Instead they are a toy, a project for garage "makers", a new legislative arena alongside mobile phones and media piracy, a tool for industrial agriculture and environmental researchers, a novel twist on the existing struggles between state and citizenry over space.
Technology has always travelled from a military to a civilian context. Drones will continue to make their way from the desert to the high street, and along the way undoubted social benefits will result from the use of whatever they turn into. Charting local impacts of climate change, managing land and water more effectively: these are positive things and I don't want to argue that they shouldn't be making use of technologies that were developed from drones. What I want to suggest is that it is in poor taste to claim the same technologies that prevent children attending school in one country as an asset for improving STEM education in another, or to describe toys for leisured photographers in the same terms as the devices that prevent societies celebrating marriages in open spaces, or for estate agents to raise the price of one piece of land using technologies that reduced another to rubble. These new uses of drone technology ought to be developed with a fuller recognition of their provenance. It should be easier to see the ways in which these new 'drones' differ from the original military drones, the ways in which they fail to inflict the same damaging impacts on the world. Turning swords into ploughshares is a noble thing to do, but it seems dishonest to pretend they were never swords.
Given the cultural imperative to continue to use the word 'drone' to stand for a wide array of technologies that might otherwise perhaps be thought of as 'remote controlled aerial photography', and assuming that this will be accompanied by the sort of elision I describe above, what sort of response can be made? I was talking with a friend of mine, Pete Bennett, about this, and I want to share one suggestion from our conversation: a language for children to use to program drones themselves, laying bare the capabilities of the device through the choices made available in the language. A LOGO for drones.
Using a language that reveals all of the capacities of a drone rather than the subset employed by consumer technologies referred to as drones would make remembering their original purpose easier, which in turn would go some way towards ensuring that young people designing activities for drones have the ethical and political issues that necessarily accompany the use of 'drone' technology visible and in front of them, demanding a response. Choosing not to use or to adapt a particular capacity becomes then a conscious choice, a deliberate re-purposing of a military tool for positive social ends: conversely, making use of capacities designed for belligerence or oppression would require the deliberate introduction of these values into civil space. The process of normalising this ethically problematic technology and turning it into something with social value can become something young people are able to participate in knowingly, rather than unthinkingly.
I'm using LOGO as a model for a few reasons: it also relates to an external device (the turtle), it's a good example of using comprehensible commands to introduce programming concepts, and it's possible to introduce new routines and actions through the learn command. It would be straightforward for people to dive into writing practical actions for a drone, focussing on the ethics of their plans or their positive outcomes, rather than being bogged down in technical issues that allow them to ignore the bigger questions raised by their actions.
Maybe it would look something like this:
patrol (route home)
  if (not safelist)
    if threat alert (sms)
    fire (UV marker)
  return data backup

learn patrol (path)
  speed 60
  start path
  start loop yes

learn path (route home)
  waypoint 1 [lat:long]
  waypoint 2 [lat:long]
  waypoint 3 [lat:long]
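To make the idea a little more concrete, here is a minimal sketch, in Python, of how an interpreter for such a language might work. The command names (waypoint, speed, fire) and the class structure are my invention for illustration, not a real system; the point is that learn defines new routines out of existing ones, and every capability is a named primitive the programmer must explicitly invoke.

```python
# A minimal sketch of a "drone LOGO" interpreter. All names here
# (DroneLogo, waypoint, speed, fire) are invented for illustration.

class DroneLogo:
    # Each capability is a distinct, named primitive: using one
    # (e.g. "fire") is a visible, deliberate choice in the program.
    PRIMITIVES = {"waypoint", "speed", "fire", "camera"}

    def __init__(self):
        self.procedures = {}  # routines defined via learn()
        self.log = []         # actions that would drive the hardware

    def learn(self, name, body):
        """Define a new routine, as LOGO's `to` (here `learn`) does."""
        self.procedures[name] = body

    def run(self, name):
        """Execute a routine, expanding user-defined routines recursively."""
        for line in self.procedures[name]:
            cmd, *args = line.split()
            if cmd in self.procedures:
                self.run(cmd)                 # a routine the user defined
            elif cmd in self.PRIMITIVES:
                self.log.append((cmd, args))  # real hardware would act here
            else:
                raise ValueError(f"unknown command: {cmd}")

demo = DroneLogo()
demo.learn("path_home", ["waypoint 51.45 -2.59", "waypoint 51.46 -2.58"])
demo.learn("patrol", ["speed 60", "path_home"])
demo.run("patrol")
```

Because the interpreter refuses anything outside its primitive set, the full repertoire of the device, benign or otherwise, stays in plain sight: adding a capability means adding its name to the language.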
Ideally, this drone LOGO would be run on hardware that was customisable, perhaps a regular chassis and basic core with space to plug in modules for different capabilities: taking video, using infrared imaging, dropping packages, firing lasers, sampling the atmosphere, monitoring noise levels, broadcasting data or anything else that might be hacked together from (say) Arduino boards and slotted into the standard format.
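That plug-in arrangement might be sketched like this, again in Python with invented names: the chassis core has no capabilities of its own, and only gains one when a module is explicitly slotted in.

```python
# Sketch of a modular chassis: capabilities live in slot-in modules,
# so fitting one is an explicit decision. All names are invented.

class Module:
    name = "base"
    def activate(self):
        raise NotImplementedError

class CameraModule(Module):
    name = "camera"
    def activate(self):
        return "capturing video"

class NoiseMonitorModule(Module):
    name = "noise"
    def activate(self):
        return "sampling sound levels"

class Chassis:
    def __init__(self):
        self.slots = {}  # the bare core: no capabilities until fitted

    def plug_in(self, module):
        self.slots[module.name] = module

    def use(self, name):
        if name not in self.slots:
            raise RuntimeError(f"no '{name}' module fitted")
        return self.slots[name].activate()

drone = Chassis()
drone.plug_in(CameraModule())
```

Asking the chassis to use a capability that has not been fitted simply fails, which mirrors the pedagogical point: what a drone can do is exactly the sum of what someone chose to give it.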
What sort of thing could be done? The only way to answer this would be to develop it with a group of young people, working with them to build a language that meant something to them, and learning about the priorities that lead them to address particular areas. But perhaps they might be concerned about the safety of their walk home (as hinted at in the example above), or interested in learning who actually uses the recycling bin, or worried about traffic and pollution outside their school, or curious about seasonal changes in tree cover. Or perhaps they might want to stalk pupils from another school, or film fights, or harass dog owners, or vandalise the police station, or any number of socially undesirable things. No-one's perfect. The challenge would be for them to ask the proper questions about their actions, and to see that even superficially positive actions like patrolling routes to school raise questions about other people's right to privacy and the impact of unilateral action on trust or social cohesion.
One of the reasons, I think, that drones hold such fascination is that they are a site where powerful and important narratives collide: geopolitical inequalities, social justice, security, understanding networked agency and the ethics of having robots kill on our behalf. These are all bigger issues than this project can properly wrestle with. But I think it would be a good place to start in encouraging young people to see themselves in the young people whose lives have been so blighted by drones, and in helping them to develop the critical stance towards programming and technology that will help them, later, to design tools that make the world better.