Two years ago, when a wildcard candidate won the U.S. presidential election, many people blamed the upset on a failure of recognition. Trump voters, this story goes, didn’t see a future for themselves in a deindustrialized economy. They didn’t see their values reflected in government and didn’t see their interests represented on issues like immigration and trade and the environment. 1 They mistook the showman for a dealmaker, a leader, an authentic voice. Shocked by the rise of Trumpism, liberals had to confront their own failures of recognition. They rushed to read stories about the “left behind,” the white working class, whose truths were “unspoken” and histories “untold.” 2 Eventually, they got around to examining the misogyny, racism, and repression in their own institutions and workplaces. Recognition went by the name of reckoning, as if seeing a problem were the same as dealing with it.
All this is to say: failures to see and be seen are central to the American political crisis, which is marked by rising nationalism and isolationism and the fracturing of shared values. And that brings me to my true subject, the hardening of American borders and the spread of new technologies of recognition and identification that are changing the way we appear to one another. Amazon has been selling facial “Rekognition” software to law enforcement agencies. Google collaborated with the defense department on Project Maven, which uses artificial intelligence to analyze drone imagery (a technology that might someday be used to “surveil entire cities”). 3 In a time of resurgent fascism, it is disturbing to see these powerful companies join longtime actors like Palantir in supporting military, policing, and border control operations. 4 But there is resistance, too. This summer, Google employees got the company to drop its Pentagon contract, and across the tech industry we see protests, resignations, and collective actions organized by workers who are beginning to understand their own moral responsibility and agency. 5 They are recognizing the politics of recognition.
Historians, philosophers, and scientists who study recognition have long focused on the social operations of the city. 6 In the marketplace and in the street, we recognize others by acknowledging their identity and humanity, and we are recognized in return; as Hannah Arendt puts it, we “make [our] appearance explicitly.” For Arendt, the polis is “not the city-state in its physical location,” but rather an “organization of the people as it arises out of acting and speaking together … no matter where they happen to be.” 7 In practice, recognition is mediated by technology, law, citizenship, property relations, and other institutional structures. Nancy Fraser argues further that recognition is a question of social status; thus, a politics of recognition must be “aimed at overcoming subordination by establishing the misrecognized party as a full member of society.” 8
In Trump’s America, these politics start not in the city but at the border. How is recognition operationalized in this pervasively surveillant, artificially intelligent terrain? To answer that question, we must first train ourselves to recognize the border itself, which is not a fence or a wall but a security apparatus whose powers extend deep into the interior. Some 21,000 border patrol agents operate in a 100-mile border zone encircling the country, setting up checkpoints and questioning travelers. 9 In its fullest sense, “the border” also includes the infrastructure of deportation: the archipelago of immigration courts and detention facilities, and the bureaucratic and extralegal mechanisms of enforcing citizenship. Here we recognize the Americans born in South Texas whose passports have been denied or revoked by the State Department. 10 We recognize people whisked away by plainclothes enforcers who do not identify themselves as Immigration and Customs Enforcement agents. 11 We recognize the 12,800 “unaccompanied minors” held in converted Walmart buildings and tent camps, and the tens of thousands more living with sponsors as they await immigration hearings. 12
June 17, 2018, was a turning point in the recognition of this border apparatus. 13 Under the Trump administration’s “zero tolerance” policy, thousands of migrant children had been separated from their families at the border so the adults could be criminally prosecuted and deported. 14 Six weeks after this policy took effect, reporters were finally allowed inside a large detention center in McAllen, Texas, where adults and children were separated for processing. The government tried to control the narrative by prohibiting journalists from interviewing detainees or taking pictures; instead, U.S. Customs and Border Protection released its own images of children held in chain-link kennels, with emergency blankets and sleeping mats on a concrete floor. This followed the Department of Health and Human Services’ release of images of a youth shelter in Brownsville, Texas, that showed the underage charges well cared for, with clean beds and food and teddy bears and video games. 15 Some news outlets chose not to publish these photos, which were essentially state propaganda, 16 but many did, and so we saw the backs of the boys’ heads as they lined up to get food or see the doctor. Since they were minors, their faces were obscured, so that their privacy could be protected even as their human rights were denied. Unrecognized as individuals, these bodies functioned as mannequins, anonymously activating the space, embodying the norms of compliance, decorum, and modesty — as if to suggest that the government itself had acted moderately and decently here.
That weekend, Congressman Beto O’Rourke and hundreds of activists marched to the port of entry at Tornillo, Texas, to protest a tent camp under construction along the Rio Grande. Reuters published aerial photos showing that teenage boys were already detained there, even as new tents appeared overnight. 17 Getty Images now offers a collection of these photos, which you can license for $499 each. 18 From up high, the facility could be mistaken for an oil refinery with a grid of storage tanks. Zooming in, we recognize utility trailers and cream-colored tents outfitted with air conditioners to combat the 100-degree heat. We see patrol vehicles, communications antennae, staff in lime-green shirts, migrant boys lined up in columns. Peeking past a portico, we spy a sanctioned moment of abandon — a football game — and on the periphery, where rolls of sod stretch partway across a rectangular plot, we see a field-in-becoming, perhaps a new recreational space. The drone-view reveals all the low, sprawling infrastructural components, “articulating an authority that is distributed and elusive.” 19 We can ponder the question of who is responsible for this site without ever recognizing: it’s us.
Yet the images also invite us to consider, in comparison, our freedom. We see a farmer tending crops in a nearby field, a seemingly casual act of care. The normalcy of his actions and the freedom of his movement — which mirror our own — underscore the privileges denied on the other side of the fence. The Tornillo tent camp was supposed to be temporary, but in September HHS announced that the facility would triple in capacity, to 3,800 beds, and would remain open at least through the end of the year. 20

The wide circulation of the McAllen, Brownsville, and Tornillo images, as well as an audio recording leaked to ProPublica, 21 fed an incendiary public response. Suddenly, the horror was recognized. Before the end of the week, Trump signed an executive order undoing his own policy of family separation. 22 Soon, a federal judge ordered the administration to reunite migrant children with their families. At the court’s deadline in late July, 1,442 children had been reunified with parents in ICE custody, and another 378 had been released to sponsors or discharged in other “appropriate circumstances.” More than 700 children were still detained, in many cases because their parents had been deported. (As of mid-September, HHS had reunified or discharged 2,251 children and was still holding 182 whose status is monitored by the court.) 23 Over the summer, we witnessed the emergence of a new genre of documentary: handheld videos of tearful family reunions. Many of the children appeared dazed or disoriented, and some seemed not to know their parents — the ultimate misrecognition. As this article goes to press, in late September, the Trump administration has proposed that it be allowed to detain migrant families indefinitely. 24 We still know little about what happens inside the more than 100 shelters for unaccompanied children who arrive at the border alone.
Target Recognition
No one has a full picture of what’s going on here. Not the news media, the lawyers, the advocates, the families. Not the government itself, which is disconcerting. We expect one part of the body to know what the others are doing — we want self-recognition — but the response to the court order revealed that the Department of Homeland Security had no plan for returning these children to their parents. There was no central database of family separations. Records were missing and, in some cases, destroyed. 25 The border apparatus could not be marshaled to comply with the court order, because it was not designed to recognize the humanity of its subjects. It was designed to recognize targets.
Along the U.S.-Mexico border, and especially at ports of entry, recognition is mediated by multiple technologies, including passport controls, license plate readers, facial recognition software, drone surveillance, day and infrared cameras, motion sensors, radar systems, and more. 26 Officials are talking again about a “virtual wall,” and private companies are testing lidar surveillance (the technology that enables self-driving cars to identify and track nearby objects). 27 Documents suggest that the drones will be getting smaller and smarter, too. 28 Simply by moving toward the border, a human subject becomes a target, a visually identifiable or electromagnetically locatable object. Border surveillance recognizes unsanctioned advances and unwanted subjects — while validating the cohort who “belong.” The recognition and exclusion of “illegals” in turn signals validation, inclusion, to those among the nationalist right who regard immigration as a cultural and economic threat. That validation may ultimately be more significant than Trump’s long-promised “wall.”
In congressional testimony, border officials disguise operational capabilities in a fog of military euphemisms and star-wars acronyms, like “Vehicle and Dismount Exploitation Radar (VADER),” which is described as “a side-looking airborne radar that detects illegal border crossers and relays their positions to field agents, while simultaneously capturing terrain change detection information.” Such terms of recognition establish a particular way of seeing. People are discussed as “movements” or “flows”; or, simply, “items of interest.” 29 Those interesting items can be recognized as volumes or heat sources, which are visible at night, in fog, and amidst heavy vegetation. 30 They are remotely detected, identified, geolocated, and “interdicted” as necessary. Media scholar Lisa Parks has observed that border imaging technologies produce ambiguous “spectral subjects,” which “take on the biophysical contours of a human body while [their] surface appearance” — including their gender and race — “remains invisible and [their] identity unknown.” 31 But that may not always be the case. CBP has signaled an interest in individualizing targets, and last year it invited contractors to develop equipment for recognizing facial features, clothing, and colors. 32
Facial recognition is relatively new to land borders but has already been implemented at several airports, and Republicans in Congress are keen to roll it out widely. The Intercept reported last year that “the tipping point for facial recognition may be right around the corner,” as dozens of firms vie for government contracts. 33 This summer, border officials at the port of entry at Anzalduas, Texas, tested a new technology bluntly named the Vehicle Face System, which captures photos of drivers through windshields and compares them with images in government databases. The system relies on new plenoptic cameras, developed at Oak Ridge National Laboratory, which “use an array of sensors to capture as much light information as possible,” through multiple focal lengths, allowing them to perceive depth despite any glare or tint introduced by the windshield. 34
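Setting aside the exotic optics, the matching step such a system performs is conceptually simple: reduce each captured face to a feature vector and compare it against a gallery of database photos. What follows is a minimal, hypothetical sketch of that generic comparison; the embeddings, gallery records, and threshold are stand-ins, and nothing here reflects CBP’s or Oak Ridge’s actual software.

```python
# Illustrative sketch only: a generic face-matching step of the kind a system
# like the Vehicle Face System might perform after its cameras capture an image.
# The embeddings, gallery, and threshold below are all hypothetical.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Compare two face "embeddings" (fixed-length feature vectors)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_against_gallery(probe: np.ndarray,
                          gallery: dict[str, np.ndarray],
                          threshold: float = 0.75) -> list[tuple[str, float]]:
    """Return gallery identities whose similarity to the probe exceeds the threshold.

    In a real deployment the gallery would be a government photo database and the
    embeddings would come from a trained face-recognition model; here random
    vectors stand in for that pipeline.
    """
    scores = {name: cosine_similarity(probe, emb) for name, emb in gallery.items()}
    return sorted(
        [(name, s) for name, s in scores.items() if s >= threshold],
        key=lambda pair: pair[1],
        reverse=True,
    )

# Toy usage: random vectors stand in for embeddings extracted from photos.
rng = np.random.default_rng(0)
gallery = {f"record_{i}": rng.normal(size=128) for i in range(1000)}
probe = gallery["record_42"] + rng.normal(scale=0.1, size=128)  # a noisy re-capture
print(match_against_gallery(probe, gallery)[:3])
```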
Political scientist Matthew Longo has observed that modern borders “cannot merely be ‘tall’” — with great walls, soaring drones, and high towers — “they must also be ‘wide’ and ‘layered.’” 35 In remote areas along the U.S.-Mexico border, where there are no fences or surveillance towers, border patrol agents drag tires to smooth the dirt so they can see footprints. 36 People are tracked like animals. Elsewhere, the panoptical network includes ground and imaging sensors that detect movement; fixed and mobile surveillance towers; and Tethered Aerostat Radar Systems, or big blimps that track unauthorized low-flying aircraft, including drones used by drug smugglers. At ports of entry, vehicles are screened by X-rays and dogs that sniff for food, drugs, and currency. Agriculture specialists inspect passenger luggage, truck beds, train cars, and shipping containers for invasive species and other “actionable pests.” 37 So many multisensory modes of recognition: visual, thermal, and olfactory; textual and morphological; human and animal and machinic.

Networks and Algorithms of Recognition
All these technologies have to be understood in the context of the ideological regime that controls them. 38 The evidence made public in the week of June 17 caused many Americans to see family separations at the border as a monstrous violation of human rights. But that awareness grew earlier, with a series of smaller recognitions. There was White House Chief of Staff John Kelly’s interview with National Public Radio on May 11 (“the children will be taken care of — put into foster care or whatever”); President Trump’s notorious riff at an immigration policy roundtable on May 16 (“they’re not people; they’re animals”); Senator Jeff Merkley’s viral video of his failed attempt to enter the Brownsville facility on June 3; and a report in The Washington Post, on June 9, that a Honduran father had committed suicide after the government took his three-year-old child. Two days later, Michelle Goldberg retold those stories in her New York Times column, “First They Came for the Migrants,” and asked her readers to recognize the “moral enormity” of what was happening at the border. “We still talk about American fascism as a looming threat, something that could happen if we’re not vigilant,” she wrote. “But for undocumented immigrants, it’s already here.” 39
Simultaneously, Wired reviewed a new venture by Palmer Luckey, the ebullient tech entrepreneur best known for developing the Oculus Rift virtual reality headset. After leaving Facebook in 2017, the Trump-supporting libertarian joined forces with some ex-Palantir executives, secured financing from Peter Thiel’s Founders Fund, and launched Anduril Industries. (It’s a Lord of the Rings reference, naturally.) The company is developing a surveillance system called Lattice, featuring 32-foot towers equipped with radar and a camera kludged together with a laser from a cosmetic hair-removal device. At night, the laser shoots out a beam that acts like a long-distance flash, to create night-vision on the cheap. This imaging technology is paired with artificial intelligence that can purportedly distinguish humans from other moving objects within a two-mile radius. As Wired’s Steven Levy reports, “The company taught its software to recognize the patterns of a person on the move, allowing it to avoid the expensive zoom lenses and thermal sensors used in competing systems.” During a ten-week trial run in Texas, Lattice “helped customs agents catch 55 unauthorized border crossers” and seize 982 pounds of marijuana. 40
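What would it mean, mechanically, to “recognize the patterns of a person on the move”? Anduril’s models are proprietary, but the task can be caricatured in a few lines: take a tracked object’s measured attributes and force it into a coarse category. The thresholds below are invented for illustration and bear no relation to Lattice’s actual logic.

```python
# A deliberately crude illustration of motion-based target classification:
# label a tracked object by its speed and apparent size. Real systems presumably
# use learned models over imagery and radar; these thresholds are invented.
from dataclasses import dataclass

@dataclass
class Track:
    speed_mph: float   # average speed of the tracked object
    height_m: float    # estimated height from the camera or radar return

def classify(track: Track) -> str:
    """Return a coarse label: 'vehicle', 'person', 'animal', or 'unknown'."""
    if track.speed_mph > 25:
        return "vehicle"
    if 1.2 <= track.height_m <= 2.2 and track.speed_mph <= 10:
        return "person"
    if track.height_m < 1.2 and track.speed_mph <= 25:
        return "animal"
    return "unknown"

print(classify(Track(speed_mph=3.0, height_m=1.7)))   # -> person
print(classify(Track(speed_mph=45.0, height_m=1.5)))  # -> vehicle
```

Even in this toy form, the design choice is legible: everything that moves must resolve to one of a handful of labels.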
Longo’s research shows that border management is becoming increasingly datalogical. The recognition of humans involves sophisticated data management and analysis, including sharing among DHS sites and coordination with other law enforcement and intelligence databases. 41 As border officials have testified, “Some of the most important advancements in … situational awareness are in the area of data integration and exploitation.” 42 Palantir’s Analytical Framework for Intelligence integrates and analyzes personal data — biographical information, addresses, fingerprints and other bodily markers, social networks, travel itineraries, immigration records — from various federal, state, and local law enforcement databases (including a Bush-era registry of visitors from predominantly Muslim countries that was later suspended). The ACLU reasonably speculates that other data sources include aviation and border watch lists; the biometric databases of the Homeland Security Office, which contain fingerprints, iris scans, palm prints, and facial images; and the Automated Targeting System, a cargo-tracking system, later expanded to people, that involves assigning a risk assessment score to everyone who crosses the border. 43 Again we see so many modes of recognition, so many data extracted from their contexts of collection and mashed together! (And speaking of decontextualized interpretation: border patrol agents have the right to search your electronic devices, scan your social media accounts, and study your search history, too.) 44
Customs and Border Protection shares access to Palantir’s centralized system with other agencies, notably ICE’s Office of Enforcement and Removal Operations. 45 According to Edward Hasbrouck, an independent researcher and civil liberties advocate, “When Trump uses the term ‘extreme vetting,’ AFI is the black-box system of profiling algorithms that he’s talking about.” 46 These algorithms could be used to generate risk assessments and predict future behaviors, and ultimately to determine whether someone is permitted to cross the border or is pushed back across it. 47
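In the abstract, such a pipeline reduces to two moves: merge whatever records the databases hold about a person, then collapse them into a score. The sketch below is schematic only; every field, weight, and source is invented, and it stands in for neither AFI nor the Automated Targeting System.

```python
# Schematic sketch of a profiling pipeline of the general kind described above:
# merge records about one traveler from several databases, then reduce them to a
# single "risk score." Every field, weight, and data source here is invented.
from typing import Any

def integrate(record_sets: list[dict[str, Any]]) -> dict[str, Any]:
    """Flatten records keyed to the same traveler into one profile (later sources win)."""
    profile: dict[str, Any] = {}
    for record in record_sets:
        profile.update(record)
    return profile

def risk_score(profile: dict[str, Any], weights: dict[str, float]) -> float:
    """Weighted sum over whichever flagged attributes appear in the profile."""
    return sum(w for key, w in weights.items() if profile.get(key))

# Hypothetical inputs from separate systems (travel records, immigration files, watch lists).
traveler = integrate([
    {"name": "example traveler", "itinerary_flag": True},
    {"visa_overstay": False, "watchlist_near_match": True},
])
weights = {"itinerary_flag": 0.25, "visa_overstay": 0.75, "watchlist_near_match": 0.5}
print(risk_score(traveler, weights))  # -> 0.75
```

The opacity lies precisely in that reduction: the person being scored never learns which flags fired or how they were weighted.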
Advances in biometrics are also transforming recognition. As Longo puts it, border-crossers become “pixelated subjects … composed of so many data points that the individual is no longer a meaningful category.” 48 Last year, a company called Biometric Intelligence and Identification Technologies offered law enforcement agencies a free three-year trial of its iris recognition software. The 31 sheriffs representing every county along the U.S.-Mexico border voted to “adopt tools that will capture, catalogue, and compare individuals’ iris data, for use both in jails and out on patrol.” 49 Senior vice president John Leonard acknowledged that his company’s software would help Trump’s efforts, “going after all these illegals.” How does one recognize an “illegal” eye? Agents take a high-resolution image of an individual’s iris, then create an individualized iris template that describes 240 unique characteristics (compared to 40 to 60 for fingerprints), which can be compared to other records in the company’s private database. A Texas sheriff told The Intercept, “It’s not unusual for people caught illegally from Mexico to give fake names and dates of birth. But it doesn’t matter what you use if we have your features — your iris, your fingerprints.” 50
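The vendor describes its templates in terms of 240 characteristics; in the research literature, iris matching is more commonly described as a comparison of binary “iris codes,” declared a match when the fraction of disagreeing bits falls below a threshold. The sketch below follows that textbook account, with conventional rather than proprietary parameters.

```python
# Minimal sketch of iris-template matching as commonly described in the biometrics
# literature: templates are binary codes, and two codes are declared a match if
# their normalized Hamming distance falls below a threshold. The 2048-bit length
# and 0.32 threshold are conventional textbook values, not the vendor's parameters.
import numpy as np

def hamming_distance(code_a: np.ndarray, code_b: np.ndarray) -> float:
    """Fraction of bits that disagree between two binary iris codes."""
    return float(np.count_nonzero(code_a != code_b)) / code_a.size

def is_match(code_a: np.ndarray, code_b: np.ndarray, threshold: float = 0.32) -> bool:
    return hamming_distance(code_a, code_b) < threshold

rng = np.random.default_rng(1)
enrolled = rng.integers(0, 2, size=2048, dtype=np.uint8)   # template on file
noisy_recapture = enrolled.copy()
flip = rng.random(2048) < 0.1                               # simulate 10% acquisition noise
noisy_recapture[flip] ^= 1
stranger = rng.integers(0, 2, size=2048, dtype=np.uint8)

print(is_match(enrolled, noisy_recapture))  # True: same eye, despite the noise
print(is_match(enrolled, stranger))         # False: unrelated eye (~0.5 distance)
```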
Names and dates and autobiographies are fallible sources of recognition. Yet the body’s data, we’re led to believe, can’t lie. It represents identity indexically.

Subjectivities of Recognition
Researchers like Kelly Gates, Shoshana Amielle Magnet, and Simone Browne argue otherwise. Their studies show that biometric technologies not only fail to live up to promises of objectivity and reliability, but in fact continue a long history of misrecognizing women, people of color, and people with disabilities. 51 Data from white, able, male bodies often constitute the “training set” for these machines, and that tends to establish those demographic qualities as the standards against which other bodies and subjectivities are measured. Louise Amoore claims that the rise of “biometric borders” results in “the exercise of biopower such that the body itself is inscribed with, and demarcates, a continual crossing of multiple encoded borders — social, legal, gendered, racialized, and so on.” 52
Increasingly, the power to recognize humans at the border is shaped by startup tech companies whose rosters are overwhelmingly white and male. In this context, Luckey’s exuberant confidence (satirized by Haley Joel Osment in the HBO series Silicon Valley) is not just goofy but dangerous. Lattice is an opportunity for Anduril to get its foot in the door at Homeland Security, where it can lay out the mesh infrastructure to support the company’s vision for an augmented reality of “geographic near-omniscience.” Building upon his past work on Oculus, and drawing inspiration from bro-culture pillars like Lord of the Rings and Iron Man’s Stark Industries, Luckey aims to merge VR with surveillance tools to create what Wired describes as “a digital wall that is not a barrier so much as a web of all-seeing eyes, with intelligence to know what it sees.” 53 Anduril’s digital model of the border zone will recognize each “item of interest,” with a quantifiable degree of certainty, as either a person, an animal, a vehicle, or vegetation. Few other modes of being matter in this Borderland Ontology.
In Border as Method, Sandro Mezzadra and Brett Neilson argue that the border is “an epistemological device, which is at work whenever a distinction between subject and object is established.” 54 If that’s the case, Anduril is a method for recognizing these taxonomic distinctions — for determining what items are interesting and what should be done about them. Intercept, arrest, deport, sanction, welcome, ignore. If the company’s technology is widely adopted, we’ll be relying on Luckey’s lattice to determine, as Mezzadra and Neilson put it, “the relation of action to knowledge in a situation where many different knowledge regimes and practices come into conflict.”
We should think more intentionally about the differences among these knowledge regimes. How does one’s recognition of the border, and one’s understanding of the function it serves, inform the actions one takes there? Mexican and American border agents recognize different directional politics — since heading north is coded differently than heading south — and engage their clients accordingly. A refugee seeking asylum approaches a checkpoint with greater trepidation than a border-zone resident who once passed with relative ease between Nogales, Arizona, and Nogales, Mexico — or a rancher whose cattle pay no mind to the boundary. Similarly, DHS recognizes migrants as a different sort of “problem,” requiring a different set of actions, than HHS’s Office of Refugee Resettlement.
And we should consider how conflicting knowledge regimes affect the millions of immigrants in the 100-mile zone that constitutes the U.S. border. Nearly two-thirds of Americans reside here, which means they live inside a material landscape of surveillance, an ambient “lattice” of suspicion. While federal authorities are not permitted to conduct searches without “reasonable suspicion,” the ACLU claims that border patrol agents in this zone “routinely ignore or misunderstand the limits of their legal authority.” 55 As Mezzadra and Neilson would say, the authorities’ mistaken knowledge shapes their actions, potentially transforming any human subject within the zone into a criminal object.
Meanwhile, people living along the demarcated border have to deal with direct threats to their property and freedom of movement. The wall doesn’t always follow the political border; it sometimes dips into American or Mexican territory, creating existential and logistical dilemmas for residents whose addresses and passports declare one national identity, but who inhabit another bureaucratic landscape. 56 On the south bank of the Rio Grande, for example, the wall traps Americans, on American soil, on the Mexican side of the barrier. 57 My colleague Miriam Ticktin has pointed out that the animal openings on the border fence are 8.5 × 11 inches — the size of a sheet of paper. 58 This, to me, is more than a coincidence. The dimensions of that gap remind us that the border’s sorting apparatus isn’t just made of iron, barbed wire, and concrete. It’s also built out of bureaucracy: passports, paperwork, and protocols; forms and files, databases and algorithms.
That wall — the most primitively analog of border technologies — is the operative image writ large: it visually communicates the presence of the border condition and all the epistemological and ontological shifts and protocols it entails. The wall is not the border, but it seems particularly appealing to Trump as a pre-linguistic proclamation of power and control, a tyrannical speech act in concrete. It’s also a tool of interpellation: depending upon our own status, we might recognize ourselves in the wall’s summons (home is just over there!) or repudiation (you’re not welcome here).
Walls, like so many other border technologies, are tools of propaganda, meant to compel a recognition of rulers’ power to rule. They entice us to misrecognize their height and strength as an index of sovereignty, safety, certainty — a line that distinguishes us from them. Drones are similarly performative border technologies. “Despite evidence that drones have been an extremely expensive and highly ineffective means of securing the US-Mexico border,” Stephen Graham writes, “myths of total vision and absolute panopticism permeate the politics of drone deployment.” 59 Border technologies make for great photo-ops. And we citizens risk misrecognizing such spectacle as evidence of actual governance, of rightful authority.
A border — whether materialized in steel or patrolled and “performed” by various state apparatuses and sensing technologies — means different things to, and calls for different responses from, the different beings who encounter it. 60 How do they recognize the border, if at all, and the politics it embodies? How do they recognize one another vis-à-vis the border? What ways of living together, of appearing before one another (à la Arendt) are effected by the border dispositif? To counter isolationist nationalism in the borderlands, activists will need to fight on all fronts, opposing unjust detentions and deportations, new material constructions, and new lattices of surveillance. But they will also need to confront the semiotics of the wall. What other ways might we recognize the border? How might we repurpose imaging technologies to help us see the intimate and global dynamics of border recognition? And how might we translate that knowledge into new modes of action?
Josh Begley, Best of Luck with the Wall (2016). [Field of Vision]
Here I submit the example of Josh Begley’s 2016 Best of Luck With the Wall, a video composed of 200,000 stitched-together satellite images tracking the route of Trump’s proposed border wall. Satellite imagery, despite its military origins and alienating distance, can actually help us recognize our own humanity in this often brutal terrain. From on high, we see how, in some places, the border sorts out urban conditions and populations; it divides prosperous, idyllic communities from desolate, crime-plagued cities. Elsewhere, the border is a seemingly accidental divide between two connected communities, their people sorted only by an arbitrary national identity. In the most remote, uninhabited places, the border functions almost as land art. Begley’s 30,000-mile view establishes a critical distance and yet still allows us to affectively recognize — to witness — local conditions. And the glitches and gashes in the imagery itself — the sharp changes in contrast and fidelity and color — demonstrate the composite nature of this monolithic enterprise: both the wall and its map are sutured together from thousands of individual pieces, each situated in place.
From above, we recognize that despite all the techniques and technologies marshaled to reify it, to spectacularize it, to render it sublime, the border is in large part just an epistemology performed and materialized, a way of seeing and sorting that’s also, simultaneously, a mode of governing — one that has incited wars, launched expeditions and crusades, secured the fate of kings and kingdoms, divided families, altered evolution, and provoked actions that have, for millennia, reshaped humanity and the planet we live on.
Daryl Meador, excerpt from Heroica Matamoros: A Border City in Three Landscapes (2018). [Daryl Meador]
And that brings me to a final object lesson, which proposes a new way of responding to the border, an embodied transnationalism. Daryl Meador, a former thesis advisee who is now a graduate student at New York University, made a film about riding with the Doble Ruedas (Double Wheel) bicycle collective in Matamoros, Mexico, across the Rio Grande from Brownsville, Texas. 61 Every week Doble Ruedas organizes inclusive, communally oriented rides throughout the city. They ride slowly, over potholed streets, giving everyone a chance to lead the pack. And every once in a while, someone yells out, “aburrido!” — boring! — thereby transforming this contested city, often sensationalized and demonized, into just another everyday place. Daryl’s film and video footage aims to capture these analog affective geographies. She uses her Bolex and GoPro to recognize the power of the cyclists’ self-determined mobility in a terrain supposedly defined by restricted movement — and to show how a sensorial encounter that exceeds the visual can reframe borders, which are so prone to over-representation and misrecognition.