The Disposition of Drones

At the U.S. Army’s drone training facility, a team of artists examines the relations between perception, technology, and power.

Close view of camera and sensing instruments on the Gray Eagle UAS. [Kit Abate]

When civilians talk about military drones, we often imagine the aerial view, the remote-controlled camera, the fuzzy targets on a screen. Operations on the ground are considerably more varied, and harder to picture. The first thing to understand is that the image of a lone gamer-pilot with his finger on the button is a myth. Drones are frequently controlled by multiple teams sharing a common interface, in conditions that may be noisy or dusty or windy. Agency is embedded in networks of military bodies, buildings, vehicles, communications and information apparatus; shaped by human interaction and interpretation; mediated by images, interfaces, infrastructures; all under the influence of larger cultural systems. When we can’t see this, we aren’t really talking about drones at all.

Another problem: we don’t agree on what to call them. Military personnel use the technical acronyms UAV (unmanned aerial vehicles) and UAS (unmanned aerial systems), and they dislike the widespread use of “drone,” which implies that human oversight has been usurped by the machine. 1 Critics say, yes, that’s the point. (For what it’s worth, I use the terms interchangeably.) These battles over symbolic representation obscure deeper reckonings about the role of humans in increasingly automated military operations.

So let’s get closer to ground. The world’s largest reconnaissance drone training facility is run by the U.S. Army at Fort Huachuca, in the southern Arizona desert, not far from the Mexican border. 2 About 700 soldiers and marines train here at a time, and some 3,000 drone operators and technicians graduate every year. 3 In 2016, toward the end of the Obama administration, I brought a team of artists to Fort Huachuca to observe and sketch, in conjunction with my project Incendiary Traces, an art and research initiative that considers relationships between perception, technology, and power. 4

The trainees are learning to operate General Atomics’ Gray Eagle, a successor to the Predator drone. It looks like a glider, with very long wings and a 28-foot body, and it can fly for over 27 hours, up to 25,000 feet, at speeds approaching 175 miles per hour, carrying over 1,000 pounds of equipment, including four Hellfire missiles. 5 It’s possible to operate the Gray Eagle from a few thousand miles away, but the satellite communications are expensive and glitchy. 6 So unlike Air Force pilots who control drones on the other side of the world, Gray Eagle operators work close to the field of engagement, in small, cramped, mobile control stations connected with ground troops in the combat zone and imagery analysts in a nearby control room. Personnel in all three locations share oversight of the drone operations.

Sketch of a Gray Eagle UAS in the hangar at Fort Huachuca. [Rossitza Todorova]

Ground Control Station training simulators. [Joseph Bates]

Some of the training missions at Fort Huachuca are computer simulations; others involve flying actual drones around the base. UAV operators work in pairs, with one person controlling the flight and the other handling the cameras, missiles, and remote sensors, known collectively as the payload. We were able to observe a large room with about fifteen pairs of simulation stations. One soldier, the flight controller, practices navigating a mountainous landscape, using pre-recorded video footage shot by drones in the local Arizona desert, as well as flight charts and vehicle data. The payload operator has a separate interface that shows the same background footage overlaid with simulated people, market stalls, cars, and other details. The payload operator can use lidar, synthetic-aperture radar, and infrared sensors to identify, locate, and follow enemy bodies, vehicles, buildings, and equipment, as well as to attack them.

Everything is rendered in low resolution; people appear as 2D cutouts in a 3D space. Their high key color and crisp edges contrast with the murky aerial video of a dry, brown, open space. The two visual environments, simulation and setting, are crudely integrated. Digital chunks representing foreign vehicles move along a blurry, algorithmically compressed vision of American asphalt. In real-world combat, operators will surveil people and places using the same interface. The subjects on screen are secondary to the operation of the equipment at hand.

In Extrastatecraft, Keller Easterling argues that information technologies and infrastructure spaces have a “disposition,” that is, built-in potential uses that are distinct from stated goals but can have political consequences. That’s true in many domains. Engineers, governments, journalists, designers, and artists all create and encode spaces with dispositions: “accidental, covert, or stubborn forms of power — political chemistries and temperaments of aggression, submission, or violence — hiding in the folds.” 7 Symbolic representations designed for one purpose can convey unstated meanings, and can even produce biased outcomes or perspectives that belie claims to neutrality. Abstractions, by nature, prioritize certain categories and units of information over others, and they reflect the values of their makers. My team observes with the flexible, imprecise tools of drawing and conceptual art. Drone operators surveil with mechanized, automated tools and military intention. We each conjure our own vision of the battlespace.

Screenshot of drone operator training simulation with market and missile. [Hillary Mushkin]

Flying over Simulated Market, watercolor. [Hillary Mushkin]

Outside, on an airfield next to the simulation building, trainees practice flight and payload operation with real drones. Ground Control Stations (GCS) house the operations systems in situ. The GCS architecture draws our attention to the position of the human bodies that operate these superhuman instruments. The dark, cramped mobile control stations are just large enough to contain seats, computers, monitors, and related equipment for three operators, and a loud power generator. The operators wear headsets to communicate with each other and with the ground troops, air traffic control, and commanders. There is frequently crosstalk on the headset.

In a nearby control hub known as the Tactical Operations Center (TOC), commanders share space with imagery analysts who work with the team in the GCS to collect geospatial information from the camera payloads. On Gray Eagles the most common reconnaissance payloads are an electro-optical camera, for color vision in daylight; an infrared camera, which reveals heat in black and white; a sparkle laser spotlight that is visible only with night vision goggles; and a laser designator used in guiding missiles. While the technologies extend human perception, they are susceptible to atmospheric disturbances, obstacles on the ground, and inclement weather.

The imagery analysts learn to identify what the Army calls “essential elements of information” (EEIs), military-speak for clues. To find recently buried IEDs, for example, the analysts will look for disturbed earth, thermal traces, and footprints in the desert. They’re also tasked with “making a product” that describes, or perhaps sells, the “intelligence value” of gathered information to other team members. The “products” can take the form of a 3D simulation that combines map, GIS, and lidar data from their aircraft. While military personnel speak of these packages like commodities, I think of them as being related to the sociological concept of “drawing as.” Janet Vertesi uses that term to describe the iterative process of collecting scientific data on Mars, where teams of experts interact with and massage data-based images, collectively shape and discuss interpretations, give narrative meaning to data, and thus create knowledge. The term provocatively associates scientific and technocratic ways of seeing and knowing with the human, subjective art of drawing. 8

Hillary Mushkin sketching in Gray Eagle hangar at Fort Huachuca. [Kevin Ashu]

Flying over Simulated Mosque, pencil on paper. [Hillary Mushkin]

A similarly iterative process occurs between UAV operators and ground troops on the front lines, who are known to the operators as “customers.” They share a remote computer interface, and the ground troops can even take control of the payload camera as needed, using a mobile device like the One System Remote Video Terminal (OSRVT), a heavy-duty tablet. 9 These wearable, networked mobile devices are designed to be used in a multi-band communications network by people in situations of extreme mental and physical stress. For example, when soldiers are in an active battle, the military’s rules of engagement require them to maintain a line of sight from the moment the target is identified through the attack. If the target disappears from sight, even momentarily, the soldier must re-identify that person before shooting. With a shared Gray Eagle view, ground troops can maintain sight in varied terrain while drone operators launch missiles per the “customers’” orders. 10

These team-operated machines help the military develop and maintain “situational awareness.” 11 In the decades since the fields of cybernetics, artificial intelligence, and human-computer interaction began to take shape after World War II, relationships between automated systems and human awareness have developed a life of their own, so to speak. 12 When framed as an engineering challenge, “situational awareness” is defined as the ability of systems operators in a control room to perceive, understand, and predict possible futures, and to make decisions in a complex, dynamic environment. Aerospace engineer Mica Endsley, who later became U.S. Air Force Chief Scientist, proposed, in her influential 1995 paper “Toward a Theory of Situation Awareness,” that information and interfaces should shape human operators’ perception and understanding of a situation so that they can act in relatively predictable ways that accord with system goals. In this view, human thought and awareness are instrumentalized functions of the system. 13 Personal subjectivity is harnessed to increase the system’s power. This approach follows a cybernetic philosophy of human-machine interaction developed in aerospace defense systems research during World War II. Norbert Wiener and his colleagues modeled human behavior as a predictable, mathematical function, thereby encouraging an approach to systems design in which operator reactions are a mechanistic function of a larger system. 14

When you have soldiers taking control of UAVs in the battlefield through devices like the OSRVT, individual human vulnerability becomes a fundamental factor in the design of interfaces. Imagine a civilian professional working from a mobile office, with a backpack containing a phone, earbuds, charger cables, and laptop, using the wifi or cell service available in most urban settings and charging batteries at a cafe or airport. Soldiers on the ground need all this in heavy-duty, shock-proof housing, including secure data and radio connections and a reliable mobile power source. Likewise, the touchscreen interface is designed to be used under stress, with large buttons and simple, distinct gestures. In this way, the human body is deeply engaged in the unmanned system.

Under the hood of a Gray Eagle. [Chris Vena]

Ground Control Station. [Kevin Ashu]

Let’s return now to Easterling’s concept of “disposition” to sharpen the frame. The U.S. military develops and uses specialized technologies and techniques to gain “situational awareness,” and to “command” and “control” the landscape. Not surprisingly, given that lives are at stake, military personnel rely on technology and quantified visualization to impose a sense of objectivity and minimize human error. They understand conflict zones through information systems designed by humans in accordance with military goals, values, inclinations, and occlusions. How does unspoken bias get into these systems? Goals are framed. Categories of data are identified as relevant. Then data is collected, organized, and translated algorithmically and graphically to establish pathways for humans to use the software to serve system goals. 15 That information is delivered through an interface, and then people (i.e. system operators) look at the representations of data, interpret them, discuss their interpretations, and interact with the representations both individually and cooperatively. In Vertesi’s iterative, discursive process of “drawing as,” humans and machines are in dialogue, analyses are stated and refined, beliefs are established and acted upon. Command and control and situation awareness are designed, or “drawn.” The world is abstracted for the sake of military efficiency. 16

Once these abstractions are encoded in software and hardware, they seem to overtake human interpretation of the environments they abstract. And yet the Army sees value in thinking like an artist or designer — forming and communicating knowledge through iteration. 17 Drone operation involves the no-tech, entirely human action of interpreting what is seen on screen and heard in the headsets, and then articulating it quickly, clearly, and succinctly in words and images. In fact, our hosts made a point of refuting the stereotype of drone operators as anti-social video gamers; they joked that artists like us were ideal recruits, as we are professionally trained to observe situations, record environments, and analyze images.

We also grapple with problems of representation, not least by questioning the accuracy, truth, and authority of images. Here at Fort Huachuca, we were staging a performance of situational awareness in the most absurdly retrograde way – mapping 3D space to 2D with traditional perspectival conventions. I used watercolors for their fluid messiness; others used pencil and marker with varying line widths. Those would be liabilities in military mapping. Still other colleagues devised conceptual rules and systems à la Fluxus and John Cage, such as making a list of all numbers spoken in a presentation. My team registered messy, imprecise, subjective, incomplete versions of the landscape as conflict zone, in contrast to the aesthetics of “command and control” that predominate on a military site. We aimed to blur the lines between remote sensing, computer processing, and the human perception of complex, dynamic landscapes, while calling attention to the role of human subjectivity and interpretation in gaining “situational awareness.”

When we conceive of drones as automated and remotely controlled, we obscure the fact that humans are both sitting in the driver’s seat and designing it. UAVs are further entangled in larger social, economic, and political systems developed by humans over centuries. Systems develop in accordance with goals; and military goals, like those of artists and designers, engineers, government agencies, journalists, activists, scholars, poets, and everyone else, arise from their makers’ values and beliefs. Those of us who seek to understand the world are constantly producing our own abstractions, omitting countless factors from consideration. When we recognize that, we may be better able to comprehend how infinite, messy, and incomplete the search to understand and know the world can be.

  1. Jay Stanley, “‘Drones’ vs ‘UAVs’ — What’s Behind A Name?,” ACLU Free Future, May 20, 2013.
  2. See this helpful chart by CI Geography for a visualization of the U.S. military drone fleet. In 2014, the Army took over drone operations supporting Army ground troops from the U.S. Air Force, and since then it has conducted most of its training at Fort Huachuca. Air Force pilots continue to be trained at other sites.
  3. Email interview with Angela Camara, Fort Huachuca Public Affairs Officer. According to the U.S. Army careers page, UAS operators complete 10 weeks of basic training and 23 weeks of specialized instruction.
  4. For more, see the Incendiary Traces website and my earlier feature, Hillary Mushkin, “Reconnaissance: Inside the Panopticon,” Places Journal, February 2016.
  5. See U.S. Army Acquisition Support Center, MQ-1C Gray Eagle Unmanned Aircraft System (UAS), and General Atomics, Gray Eagle UAS.
  6. Satellite uplinks cost nearly $10,000 per hour, whereas closer range data links don’t require satellite use and have a better signal.
  7. Keller Easterling, Extrastatecraft: The Power of Infrastructure Space (London: Verso, 2014), 73. See also Easterling, “Zone: The Spatial Softwares of Extrastatecraft,” Places Journal, June 2012.
  8. Janet Vertesi, “Drawing as: Distinctions and Disambiguation in Digital Images of Mars,” in Catelijne Coopmans, Janet Vertesi, Michael E. Lynch, and Steve Woolgar, eds., Representation in Scientific Practice Revisited (Cambridge: MIT Press, 2014).
  9. OSRVT is developed by Textron Systems, a multibillion dollar company headquartered in Rhode Island whose better-known products include Bell helicopters, Cessna planes, and Arctic Cat snowmobiles. These kinds of devices are developed by small companies as well, including the veteran-owned Black Diamond Advanced Technology, based in Arizona, a state which is home to a large UAV industry ranging from global aerospace defense leaders General Atomics (makers of the Gray Eagle, Predator, and Reaper military drones) to Unmanned Vehicle University, which supports UAS study all the way through the doctorate level.
  10. Interview with Richard Stone, Vice President of Strategic Planning and Director of User Experience and Human Systems Engineering for Black Diamond Advanced Technology.
  11. The jargon “situational awareness” has developed from the earlier “situation awareness.” They mean the same thing, but “situational” is more popular now in military circles. General Atomics even uses the slogan “Leading the Situational Awareness Revolution” in its promotional materials. See George Orwell’s classic essay “Politics and the English Language” (discussed in Stanley, op. cit.) for more on how military jargon and obfuscation serve political ends. Orwell’s essay was originally published in Horizon, Vol. 13, No. 76 (1946), 252-65, and is frequently anthologized.
  12. For a summary of the historical evolution of these relationships, see Jutta Weber and Lucy Suchman, “Human–machine autonomies,” in Nehal Bhuta, Susanne Beck, Robin Geiss, Hin-Yan Liu, and Claus Kress, eds., Autonomous Weapons Systems: Law, Ethics, Policy (Cambridge: Cambridge University Press, 2016), 75-102; and N. Katherine Hayles, “Cybernetics,” in W. J. T. Mitchell and Mark B. N. Hansen, eds., Critical Terms for Media Studies (Chicago: University of Chicago Press, 2010). In this journal, Shannon Mattern describes recent civilian examples in Mattern, “Mapping’s Intelligent Agents,” Places Journal, September 2017.
  13. Mica Endsley, “Toward a Theory of Situation Awareness in Dynamic Systems,” Human Factors 37 (1995), 32-64.
  14. Orit Halpern, Beautiful Data: A History of Vision and Reason since 1945 (Durham, North Carolina: Duke University Press, 2015). This discussion is covered in more detail in my chapter in the forthcoming book Control Room: Nodes in the Network City, Eds. Simon Marvin and Andrés Luque-Ayala (London: Routledge, 2018), which is an expansion of my essay on Mexico City’s C4i4 published in Places, “Reconnaissance,” op. cit.
  15. This process is not unique to the military, of course.
  16. I discuss this further in my chapter in Control Room, op. cit.
  17. See Major Harry Jones, “Army Field Manual 5-0: The Operations Process,” included in the MoMA exhibition Design and Violence (2013), and earlier discussion in Roger Martin, “Design Thinking Comes to the U.S. Army,” Design Observer, May 2010.
Hillary Mushkin, “The Disposition of Drones,” Places Journal, February 2018. Accessed 18 Mar 2018.
