Mission Control: A History of the Urban Dashboard

NASA Mission Control Center
Mission Control Center, Houston, 1965. [NASA]

We know what rocket science looks like in the movies: a windowless bunker filled with blinking consoles, swivel chairs, and shirt-sleeved men in headsets nonchalantly relaying updates from “Houston” to outer space. Lately, that vision of Mission Control has taken over City Hall. “NASA meets Copacabana,” proclaimed the New York Times, hailing Rio de Janeiro’s Operations Center as a “potentially lucrative experiment that could shape the future of cities around the world.” The Times photographed an IBM executive in front of a seemingly endless wall of screens integrating data from 30 city agencies, including transit video, rainfall patterns, crime statistics, car accidents, power failures, and more. 1

Futuristic control rooms have proliferated in dozens of global cities. Baltimore has its CitiStat Room, where department heads stand at a podium before a wall of screens and account for their units’ performance. 2 The Mayor’s office in London’s City Hall features a 4×3 array of iPads mounted in a wooden panel, which seems an almost parodic, Terry Gilliam-esque take on the Brazilian Ops Center. Meanwhile, British Prime Minister David Cameron commissioned an iPad app – the “No. 10 Dashboard” (a reference to his residence at 10 Downing Street) – which gives him access to financial, housing, employment, and public opinion data. As The Guardian reported, “the prime minister said that he could run government remotely from his smartphone.” 3

Rio Operations Center
Rio Operations Center, 2012. [IBM]

This is the age of Dashboard Governance, heralded by gurus like Stephen Few, founder of the “visual business intelligence” and “sensemaking” consultancy Perceptual Edge, who defines the dashboard as a “visual display of the most important information needed to achieve one or more objectives; consolidated and arranged on a single screen so the information can be monitored at a glance.” A well-designed dashboard, he says — one that makes proper use of bullet graphs, sparklines, and other visualization techniques informed by the “brain science” of aesthetics and cognition — can afford its users not only a perceptual edge, but a performance edge, too. 4 The ideal display offers a big-picture view of what is happening in real time, along with information on historical trends, so that users can divine the how and why and redirect future action. As David Nettleton emphasizes, the dashboard’s utility extends beyond monitoring “the current situation”; it also “allows a manager to … make provisions, and take appropriate actions.” 5
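Few’s “single screen, at a glance” principle reduces, in practice, to collapsing each metric into a compact status and trend. A minimal sketch of that reduction, assuming invented thresholds and data (this illustrates the idea, not any product Few describes):

```python
# Minimal sketch of a dashboard "widget": one key performance indicator
# reduced to a value, a target, and an at-a-glance status, plus a
# sparkline for the historical trend. All names and thresholds here
# are hypothetical illustrations, not a real dashboard API.

def widget_status(value, target, tolerance=0.05):
    """Classify a KPI the way a bullet graph's qualitative bands do."""
    if value >= target:
        return "good"
    if value >= target * (1 - tolerance):
        return "warning"
    return "bad"

def sparkline(history, ticks="▁▂▃▄▅▆▇"):
    """Render a metric's history as a one-line Unicode sparkline."""
    lo, hi = min(history), max(history)
    span = (hi - lo) or 1
    return "".join(ticks[int((v - lo) / span * (len(ticks) - 1))]
                   for v in history)

print(widget_status(value=92, target=100))  # → "bad": well below target
print(sparkline([3, 5, 2, 8, 6, 9]))
```

The point of the sketch is how much it discards: a continuous stream of measurements becomes one word and seven glyphs, which is precisely the compression that lets a manager “monitor at a glance.”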

Juice Software, KnowNow, Rapt … the names conjured up visions of an Omniscient Singularity fueled by data, hubris, and Adderall.

In 2006, when Few published the first edition of his Information Dashboard Design manual, folks were just starting to recognize the potential of situated media. Design critic John Thackara foretold an emerging market for “global spreadsheets” (his term for data displays) that could monitor the energy use of individual buildings or the ecological footprint of entire cities and regions. Thackara identified a host of dashboard players already on the scene — companies like Juice Software, KnowNow, Rapt, Arzoon, ClosedloopSolutions, SeeBeyond, and CrossWorlds — whose names conjured up visions of an Omniscient Singularity fueled by data, hubris, and Adderall. 6

By now we know to interpret the branding conceits of tech startups with amused skepticism, but those names reflect a recognition that dashboard designers are in the business of translating perception into performance, epistemology into ontology. 7 They don’t merely seek to display information about a system but to generate insights that human analysts use to change that system — to render it more efficient or sustainable or profitable, depending upon which qualities are valued. The prevalence and accessibility of data are changing the way we see our cities, in ways that we can see more clearly when we examine the history of the urban dashboard.

Bloomberg Terminal, 2009. [Ryuzo Masunaga/Bloomberg]

From Bloomberg Terminals to Bloomberg’s New York

Data displays often mimic the dashboard instrumentation of cars or airplanes. Where in a car you’d find indicators for speed, oil, and fuel levels, here you’ll find widgets representing your business’s “key performance indicators”: cash flow, stocks, inventory, and so forth. Bloomberg terminals, which debuted in 1982, allowed finance professionals to customize their multi-screen displays with windows offering real-time and historical data regarding equities, fixed-income securities, and derivatives, along with financial news feeds and current events (because social uprisings and natural disasters have economic consequences, too), and messaging windows, where traders could provide context for the data scrolling across their screens. Over the last three decades, the terminals have increased in complexity. As in a flight cockpit, the Bloomberg systems involve custom input devices: a specialized keyboard with color-coded keys for various kinds of shares, securities, markets, and indices; and the B-UNIT® portable scanner that can biometrically authenticate users on any computer or mobile device. The Bloomberg dashboard is no longer locked into the iconic two-screen display; traders can now access the dashboard “environment” on a variety of devices, just as David Cameron can presumably govern a nation via BlackBerry.

The Enron scandal incited a cultural shift … Chief Information Officers finally embraced the dashboard’s panoptic view.

The widespread adoption of the Bloomberg terminal notwithstanding, it took a while for dashboards to catch on in the corporate world. Stephen Few reports that during much of the ’80s and ’90s, large companies focused on amassing data, without carefully considering which indicators were meaningful or how they should be analyzed. He argues that the 2001 Enron scandal incited a cultural shift. Recognizing the role of data in corporate accountability and ethics, the Chief Information Officers of major companies finally embraced the dashboard’s panoptic view. I’d add another reason: before dashboards could diffuse into the zeitgeist, we needed a recognized field of data science and a cultural receptivity to data-driven methodologies and modes of assessment.

The dashboard market now extends far beyond the corporate world. In 1994, New York City police commissioner William Bratton adapted former officer Jack Maple’s analog crime maps to create the CompStat model of aggregating and mapping crime statistics. Around the same time, the administrators of Charlotte, North Carolina, borrowed a business idea — Robert Kaplan’s and David Norton’s performance-measurement framework known as the “Balanced Scorecard” — and began tracking performance in five “focus areas” defined by the City Council: housing and neighborhood development, community safety, transportation, economic development, and the environment. Atlanta followed Charlotte’s example in creating its own city dashboard. 8

NYPD Real Time Crime Center
Real Time Crime Center, New York City. [via NYC Police Foundation]

In 1999, Baltimore mayor Martin O’Malley, confronting a crippling crime rate and high taxes, designed CitiStat, “an internal process of using metrics to create accountability within his government.” (This rhetoric of data-tested internal “accountability” is prevalent in early dashboard development efforts.) 9 The project turned to face the public in 2003, when Baltimore launched a website of city operational statistics, which inspired DCStat (2005), Maryland’s StateStat (2007), and NYCStat (2008). 10 Since then, myriad other states and metro areas — driven by a “new managerialist” approach to urban governance, committed to “benchmarking” their performance against other regions, and obligated to demonstrate compliance with sustainability agendas — have developed their own dashboards. 11

The Open Michigan Mi Dashboard is typical of these efforts. The state website presents data on education, health and wellness, infrastructure, “talent” (employment, innovation), public safety, energy and environment, financial health, and seniors. You (or “Mi”) can monitor the state’s performance through a side-by-side comparison of “prior” and “current” data, punctuated with a thumbs-up or thumbs-down icon indicating the state’s “progress” on each metric. Another click reveals a graph of annual trends and a citation for the data source, but little detail about how the data are actually derived. How the public is supposed to use this information is an open question.
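The prior/current comparison behind those thumbs-up and thumbs-down icons is, at bottom, a simple rule. A hedged sketch of that logic (the metric names and “higher is better” flags below are invented for illustration; the state does not publish its rules in this form):

```python
# Sketch of a Mi Dashboard-style prior-vs-current "progress" icon.
# Metric names and directionality flags are invented examples.

METRIC_DIRECTION = {
    "high_school_graduation_rate": True,   # higher is better
    "violent_crime_rate": False,           # lower is better
}

def progress_icon(metric, prior, current):
    """Return the icon for a metric, given which direction counts as progress."""
    higher_is_better = METRIC_DIRECTION[metric]
    improved = (current > prior) if higher_is_better else (current < prior)
    return "thumbs-up" if improved else "thumbs-down"

print(progress_icon("high_school_graduation_rate", prior=74.3, current=77.0))
print(progress_icon("violent_crime_rate", prior=457, current=481))
```

Note that everything contestable — which metrics count, which direction is “progress,” how the numbers were derived — lives outside the two lines of comparison logic, which is exactly the article’s point about what the interface conceals.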

OpenMi Dashboard
Mi Dashboard. [Open Michigan]

Some early dashboard projects have already been abandoned, and others have gone on hiatus while they await technical upgrades. The now-dormant LIVE Singapore! project, a collaboration of MIT’s Senseable City Lab and the Singapore-MIT Alliance for Research and Technology (SMART), was intended to be an “open platform” for the collection, combination, and distribution of real-time data, and a “toolbox” that developer communities could use to build their own civic applications. 12 The rise of smartphones and apps has influenced a new wave of projects that seek not just to visualize data but to give us something to do with it, or layer on top of it.

Over the past several years, a group of European cities has been collaborating on the development of urbanAPI, which proposes to help planners engage citizens in making decisions about urban development. Boston’s Citizens Connect has more modest aspirations: it allows residents to report potholes, damaged signs, and graffiti. Many projects have scaled back their “built-in” civic engagement aspirations even further. Citizens’ agency is limited to accessing data, perhaps customizing the dashboard interface and thereby determining which sources are prioritized, and supplying some of that data passively (often unwittingly) via their mobile devices or social media participation. If third parties wish to use the data represented on these platforms in order to develop their own applications, they’re free to do so — but the platforms themselves involve few, if any, active participation features.

In 2012, London launched an “alpha” prototype of the City Dashboard that powers the mayor’s wall of iPads. 13 Created by the Bartlett Centre for Advanced Spatial Analysis at University College London, and funded by the government through the National e-Infrastructure for Social Simulation, the web-based platform features live information on weather, air quality, train status, and surface transit congestion, as well as local news. 14 Data provided by city agencies are supplemented by CASA’s own sensors (and, presumably, by London’s vast network of CCTV cameras). In aggregate, these sources are meant to convey the “pulse” of London. Other urban cadences are incorporated via social media trends, including tweets from city media outlets and universities, along with a “happiness index” based on an “affect analysis” of London’s social media users. 15 The CASA platform has also been deployed in other UK cities, from Glasgow to Brighton.
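An “affect analysis” of this kind is often, at its simplest, a lexicon lookup averaged over a stream of posts. A toy sketch under that assumption — the two-entry lexicon and sample posts are invented, and CASA’s actual method is certainly more sophisticated:

```python
# Toy sketch of a lexicon-based "happiness index" over social media text.
# The two-entry lexicon and sample posts are invented; real affect
# analysis uses far larger wordlists, weighting, and negation handling.

AFFECT = {"happy": 1.0, "delayed": -1.0}

def happiness_index(posts):
    """Average affect score across posts that contain any scored word."""
    scores = []
    for post in posts:
        hits = [AFFECT[w] for w in post.lower().split() if w in AFFECT]
        if hits:
            scores.append(sum(hits) / len(hits))
    return sum(scores) / len(scores) if scores else 0.0

tweets = ["so happy the sun is out", "tube delayed again"]
print(happiness_index(tweets))  # → 0.0: one positive and one negative post
```

Even this toy makes the epistemological stakes visible: the city’s “mood” is whatever the chosen wordlist can see, and posts that match no scored word simply vanish from the index.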

City Dashboard, London
City Dashboard, London. [Bartlett Centre for Advanced Spatial Analysis]

By now these dashboard launches are so common that we begin to see patterns. Dublin’s dashboard, released just last fall by the Programmable City project and the All-Island Research Observatory at Maynooth University, integrates data from numerous sources — Dublin City Council, the regional data-sharing initiative Dublinked, the Central Statistics Office, Eurostat, and various government departments — and presents it via real-time and historical data visualizations and interactive maps. The platform is intended to help its audiences — citizens, public employees, and businesses — with their own decision-making and “evidence-informed analysis,” and to encourage the independent development of visualizations and applications. 16

Urban dashboard projects embody a variety of competing ideologies.

Such projects embody a variety of competing ideologies. They open up data to public consumption and use. They render a city’s infrastructures visible and make tangible, or in some way comprehensible, various hard-to-grasp aspects of urban quality-of-life, including environmental metrics and, in the case of the happiness index, perhaps even mental health. Yet at the same time these platforms often cultivate a top-down, technocratic vision that, as Paolo Ciuccarelli and colleagues argue, “can be problematic, especially if matters such as the active engagement of all the stakeholders involved in designing, operating, and controlling these dashboards are not properly addressed.” 17 What’s more, these urban dashboards perpetuate the fetishization of data as a “monetizable” resource and a positivist epistemological unit — and they run the risk of framing the city as a mere aggregate of variables that can be measured and “optimized” to produce an efficient or normative system. 18

John Nott Sartorius, A Horse and Carriage in a Landscape.

A History of Cockpits and Control

The dashboard as “frame” — of human agency, of epistemologies and ideologies, of the entities or systems it operationalizes through its various indicators — has a history that extends back much farther than ’80s-era stock brokerage desks and ’90s crime maps. Likewise, the dashboard’s relation to the city and the region — to space in general — predates this century’s interactive maps and apps. The term dashboard, first used in 1846, originally referred to the board or leather apron on the front of a vehicle that kept horse hooves and wheels from splashing mud into the interior. Only in 1990, according to the Oxford English Dictionary, did the term come to denote a “screen giving a graphical summary of various types of information, typically used to give an overview of (part of) a business organization.” The acknowledged partiality of the dashboard’s rendering might make us wonder what is bracketed out. Why, all the mud of course! All the dirty (un-“cleaned”) data, the variables that have nothing to do with key performance (however it’s defined), the parts that don’t lend themselves to quantification and visualization. All the insight that doesn’t accommodate tidy operationalization and air-tight widgetization: that’s what the dashboard screens out.

All the insight that doesn’t accommodate tidy operationalization and air-tight widgetization: that’s what the dashboard screens out.

Among the very pragmatic reasons that particular forces, resources, and variables have historically thwarted widgetization is that we simply lacked the means to regulate their use and measure them. The history of the dashboard, then, is simultaneously a history of precision measurement, statistics, instrument manufacturing, and engineering — electrical, mechanical, and particularly control engineering. 19 Consider the dashboard of the Model T Ford. In 1908, the standard package consisted solely of an ammeter, an instrument that measured electrical current, although you could pay extra for a speedometer. You cranked the engine to start it (by 1919 you could pay more to add an electric starter), and once the engine was running, you turned the ignition switch from “battery” to “magneto.” There was no fuel gauge until 1909; before then, you dipped a stick in the fuel tank to test your levels. Water gushing from the radiator, an indicator you hoped not to see, was your “engine temperature warning system.” As new means of measurement emerged, new gauges and displays appeared.

Dashboard in an early Model T Ford. [Flickr/Commons]
The lone dashboard instrument in an early Model T Ford. [Flickr/Commons]

And then things began to evolve in the opposite direction: as more and more mechanical operations were automated, the dashboard evolved to relay their functioning symbolically, rather than indexically. By the mid-’50s, the oil gauge on most models was replaced by a warning, or “idiot,” light. The driver needed only a binary signal: either (1) things are running smoothly; or (2) something’s wrong; panic! 20 The “Maintenance Required” light came to indicate a whole host of black-boxed measurements. The dashboard thus progressively simplified the information relayed to the driver, as much of the hard intellectual and physical labor of driving was now done by the car itself.
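The “Maintenance Required” reduction — many black-boxed measurements collapsed into one binary signal — can be sketched in a few lines. The sensor names and thresholds below are invented for illustration:

```python
# Sketch of the "idiot light" reduction: many internal measurements
# collapsed to a single binary warning. Sensor names and thresholds
# are invented for illustration, not drawn from any real vehicle.

THRESHOLDS = {
    "oil_pressure_psi": lambda v: v >= 20,    # ok if at least 20 psi
    "coolant_temp_c":   lambda v: v <= 110,   # ok if not overheating
    "battery_volts":    lambda v: v >= 12.0,  # ok if charged
}

def maintenance_light(readings):
    """True (light on) if any reading fails its check. The driver never
    learns which one -- each measurement stays black-boxed."""
    return any(not ok(readings[name]) for name, ok in THRESHOLDS.items())

healthy = {"oil_pressure_psi": 35, "coolant_temp_c": 90, "battery_volts": 13.8}
overheating = dict(healthy, coolant_temp_c=118)
print(maintenance_light(healthy))      # → False: everything within bounds
print(maintenance_light(overheating))  # → True: something's wrong; panic!
```

The interface deliberately throws away the diagnosis along with the mud: the binary light is legible precisely because it is uninformative.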

Dashboard design in today’s automobiles is driven primarily by aesthetics. It’s currently fashionable to give the driver lots of information — most of which has little impact on her driving behavior — so she feels in control of this powerful machine. Most “key performance indicators” have little to do with the driver’s relationship to the car. Just as important are her relationships to (1) the gas tank, (2) her Bluetooth-linked iPhone, and (3) the state trooper’s radar gun. 21 While some “high-performance” automobiles are designed to make drivers feel like they’re piloting a fighter jet, the dashboard drama is primarily for show. It serves both to market the car and to cultivate the identity and agency of the driver: this assemblage of displays requires a new literacy in the language and aesthetics of the interface, which constitutes its own form of symbolic, if not mechanical, mastery.

In an actual fighter jet, of course, all those gauges play a more essential operational role. As Frederick Teichmann wrote, in his 1942 Airplane Design Manual, “All control systems terminate in the cockpit; all operational and navigational instruments are located here; all decisions regarding the flight of the airplane, with … very few exceptions … are determined here.” 22 Up through the late ’20s or early ’30s, however, pilots had few instruments to consult. World War I pilots, according to Branden Hookway, were “expected to rely almost solely on unmediated visual data and ‘natural instinct’ for navigation, landing, and target sighting”; navigation depended on a mixture of “dead reckoning (estimating one’s position using log entries, compass, map, etc., in absence of observation) and pilotage (following known landmarks directly observed from the air).” 23 And while some instruments — altimeter, airspeed indicator, hand-bearing compass drift sight, course and direction calculator, and oil pressure and fuel gauges — had become available by the war’s end, they were often inaccurate and illegible, and most pilots continued to fly by instinct and direct sight.

North American F-100D cockpit
Cockpit of a North American F-100D jet fighter, 1956. [U.S. Air Force]

Throughout the 1920s, research funded by the military and by instrument manufacturers like Sperry sought to make “instrument flying” more viable. By 1928, Teichmann writes, pilots were flying faster, more complicated planes and could no longer “trust their own senses at high altitudes or in fogs or in cross-country flights or in blind flying”:

They must rely, for safety’s sake, almost entirely on radio communication, radio beacons, range compass findings, gyroscopic compasses, automatic pilots, turn and bank indicators, and at least twenty-five or more other dials and gadgets essential to the safe operation of the airplane in all kinds of weather. 24

In short, they came to depend on the dashboard for their survival. The instrumentation of piloting represented a new step in automation, according to Jones and Watson, authors of Digital Signal Processing. For the first time, automated processes began “replacing sensory and cognitive processes as well as manipulative processes.” 25 Dashboards manifested the “perceptual edge” of machines over their human operators.

Still, the dashboard and its user had to evolve in response to one another. The increasing complexity of the flight dashboard necessitated advanced training for pilots — particularly through new flight simulators — and new research on cockpit design. 26 Hookway argues that recognizing the cockpit-as-interface led to the systematized design of flight instrumentation that would streamline the flow of information. Meanwhile, recognizing the cockpit-as-environment meant that designers had to attend to the “physiological and psychological needs of pilot and aircrew,” which were shaped by the cramped quarters, noise, cold temperatures, and reduced atmospheric pressure of the plane. 27 Military applications also frequently required communication and coordination among pilots, co-pilots, navigators, bomb operators, and other crew members, each of whom relied on his own set of instruments. 28

Plotting table at RAF Uxbridge, headquarters of No. 11 Group. [Daniel Stirland]

The Control Room as Immersive Dashboard

Before long, the cockpit grew too large for the plane:

Phone lines linked controllers to the various airfields, which communicated with individual planes by high-frequency radio. A special red hotline went directly to Fighter Command headquarters at Bentley Priory. Plotters hovered around the situation map. … A vast electric tableau, glowing in a bewildering array of colored lights and numbers, spanned the wall opposite the viewing cabin like a movie curtain. On this totalizator, or tote board, controllers could see at a glance the pertinent operational details — latest weather, heights of the balloon barrage layer guarding key cities, and most especially, fighter status.

That was the Control Room of No. 11 Group of the RAF Fighter Command, at Uxbridge, England, in September 1940, as described by Robert Buderi in his book on the history of radar. 29 The increasing instrumentation of flight and other military operations, and the adoption of these instrumental control strategies by government and business, led to the creation of immersive environments of mosaic displays, switchboards, and dashboards — from Churchill’s War Rooms to the Space Age’s mythologized mission control.

The push-button changed the way we started our cars, summoned our servants, dialed our phones, manufactured our Space Sprockets, and waged our wars.

In the early 1970s, under Salvador Allende, Chile attempted to implement Project Cybersyn, a cybernetics-informed decision-support system for managing the nation’s economy. The hexagonal “Opsroom” was its intellectual and managerial hub, where leaders could access data, make decisions, and transmit advice to companies and financial institutions via telex. 30 Four of the room’s six walls offered space for “dashboards.” 31 One featured four “datafeed” screens housed in fiberglass cabinets. Using a button console on their chair armrests, administrators could control which datafeed was displayed — graphs of production capacities, economic charts, photos of factories, and so forth. It was a proud moment for the humble push-button — that primary means of offering binary input into our dashboards — which, in the course of a century, changed the way we started our cars, summoned our servants, dialed our phones, manufactured our Space Sprockets, and (demonstrating its profound ethical implications) waged our wars. Media historian Till Heilmann, who is investigating the push-button as an integral element in the history of digital technology, argues that pushing buttons — a practice that he traces back to operation of the electric telegraph (but which might go back farther, to the design of musical instruments) — is among the most important “cultural techniques” of the industrial and post-industrial ages. 32

Cybersyn Ops Room
Cybersyn Ops Room, Chile, 1972. [Gui Bonsiepe]

Another of the Opsroom’s walls featured two screens with algedonic alerts: red lights that blinked with increasing frequency to reflect the escalating urgency of problems in the system. On yet another wall, Cybersyn architect Stafford Beer installed a display for his Viable System Model, which helped “participants remember the cybernetic principles that supposedly guided their decision-making processes.” 33 The final “data” wall featured a large metal surface, covered with fabric, on which users could rearrange magnetic icons that represented components of the economy. The magnets offered an explicit means of analog visualization and play, yet even the seemingly interactive “datafeed” screens were more analog than they appeared. Although the screens resembled flat-panel LCDs, they were actually illuminated from the rear by slide projectors behind the walls. The slides themselves were handmade and photographed. The room’s futuristic Gestalt — conveyed by those streamlined dashboards, with their implication of low-barrier-to-entry, push-button agency — was a fantasy. “Maintaining this [high-tech] illusion,” Eden Medina observes, “required a tremendous amount of human labor” behind the screens. 34
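The algedonic alert mapped urgency onto blink frequency: the graver the problem, the faster the red light flashed. A minimal sketch of that mapping, assuming an invented scale (Cybersyn’s actual lamps were driven by analog hardware, not code):

```python
# Sketch of an algedonic alert: blink frequency rises with urgency.
# The scale (a flash every 2 s at urgency 1, halving per level) is
# invented for illustration; Cybersyn's lamps were analog hardware.

def blink_interval_seconds(urgency):
    """Seconds between flashes; halves each time urgency steps up."""
    if urgency < 1:
        return None  # no problem, no flashing
    base = 2.0       # a leisurely flash at the lowest urgency
    return base / (2 ** (urgency - 1))

for level in range(1, 5):
    print(level, blink_interval_seconds(level))  # 2.0, 1.0, 0.5, 0.25
```

Like the idiot light, the algedonic signal trades diagnostic detail for immediacy: the administrator knows how urgently to act before knowing what the problem is.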

Screen interfaces embody in their architectures particular ways of thinking and particular power structures, which we must critically analyze.

Cybersyn’s lessons have filtered down through the years to inform the design of more recent control rooms. In a 2001 edited volume on control room design, various authors advocated for the simultaneous consideration of human-computer interaction and human cognition and ergonomics. They addressed the importance of discerning when it’s appropriate to display “raw” data sets and when to employ various forms of data visualization. They advocated for dashboarded environments designed to minimize human error, maximize users’ “situation awareness” and vigilance, facilitate teamwork, and cultivate “trust” between humans and machines. 35

We might read a particular ideology in the design of Baltimore’s CitiStat room, which forces department managers to stand before the data that are both literally and methodologically behind their operations. The stage direction reassures us that it is those officials’ job to tame the streams of data — to contextualize this information so that it can be marshaled as evidence of “progress.” The screen interfaces themselves — those “control rooms in a box,” we might say — embody in their architectures particular ways of thinking and particular power structures, which we must critically analyze if we’re using these structures as proxies for our urban operations. 36

Model T race car
C.J. Smith and a Model T race car after the New York to Seattle Transcontinental Endurance Race, 1909. [via The Henry Ford]

Critical Mud: Structuring and Sanitizing the Dashboard

Now that dashboards — and the epistemologies and politics they emblematize — have proliferated so widely, across such diverse fields, we need to consider how they frame our vision, what “mud” they bracket out, and how the widgetized screen-image of our cities and regions reflects or refracts the often-dirty reality. In an earlier article for Places, I outlined a rubric for critically analyzing urban interfaces. Here, I’ll summarize some key points and highlight issues that are particularly pertinent to urban dashboards:

First, the dashboard is an epistemological and methodological pastiche. It represents the many ways a governing entity can define what variables are important (and, by extension, what’s not important) and the various methods of “operationalizing” those variables and gathering data. Of course, whatever is not readily operationalizable or measurable is simply bracketed out. A city’s chosen “key performance indicators,” as Rob Kitchin and colleagues observe, “become normalized as a de facto civic epistemology through which a public administration is measured and performance is communicated.” 37

The dashboard also embodies the many ways of rendering that data representable, contextualizable, and intelligible to a target audience that likely has only a limited understanding of how the data are derived. 38 Hookway notes that “the history of the interface” — or, in our case, the dashboard — is also a “history of intelligences … it delimits the boundary condition across which intelligences are brought into a common expression so as to be tested, demonstrated, reconciled, and distributed.” 39 On our urban dashboards we might see a satellite weather map next to a heat map of road traffic, next to a ticker of city expenditures, next to a word-cloud “mood index” drawing on residents’ Twitter and Facebook updates. This juxtaposition represents a tremendous variety of lenses on the city, each with its own operational logic, aesthetic, and politics. Viewers can scan across data streams, zoom out to get the big picture, zoom in to capture detail; and this flexibility, as Kitchin and colleagues write, improves “a user’s ‘span of control’ over a large repository of voluminous, varied and quickly transitioning data … without the need for specialist analytics skills.” 40 However, while the dashboard’s streamlined displays and push-button inputs may lower barriers to entry for users, the dashboard frame — designed, we must recall, to keep out the mud — also does little to educate those users about where the data come from, or about the politics of information visualization and knowledge production. 41

Dublin Dashboard
One view of the Dublin Dashboard: bike availability, parking capacity, and travel time.

In turn, those representational logics and politics structure the agency and subjectivity of the dashboard’s users. These tools do not merely define the roles of the user — e.g. passive or active data-provider, data monitor, data hacker, app builder, user-of-data-in-citizen-led-urban-redevelopment — they also construct her as an urban subject and define, in part, how she conceives of, relates to, and inhabits her city. Thus, the system also embodies a kind of ontology: it defines what the city is and isn’t, by choosing how to represent its parts. If a city is understood as the sum of its component widgets — weather plus crime statistics plus energy usage plus employment data — residents have an impoverished sense of how they can act as urban subjects. Citizens may be encouraged to use a city’s open data, to build layers on top of the dashboard, to develop their own applications; but even these applications, if they’re to be functional, have to adhere to the dashboard’s protocols.

If the city is understood as the sum of its component widgets, residents have an impoverished sense of how they can act as urban subjects.

For the dashboard’s governing users, the system shapes decision-making and promotes data-driven approaches to leadership. As we noted earlier, dashboards are intended not merely to allow officials to monitor performance and ensure “accountability,” but also to make predictions and projections — and then to change the system in order to render the city more sustainable or profitable or efficient. As Kitchin and colleagues propose, dashboards allow for macro, longitudinal views of a city’s operations and offer an “evidence base far superior to anecdote.” 42

The risk here is that the dashboard’s seeming comprehensiveness and seamlessness suggest that we can “govern by BlackBerry” — or “fly by instrument” — alone. Such instrumental approaches (given most officials’ disinclination to reflect on their own methods) can foster the fetishization and reification of data, and open the door to analytical error and logical fallacy. 43 As Adam Greenfield explains:

Correlation isn’t causation, but that’s a nicety that may be lost on a mayor or a municipal administration that wants to be seen as vigorously proactive. If fires disproportionately seem to break out in neighborhoods where lots of poor people live, hey, why not simply clear the poor people out and take credit for doing something about fire? After all, the city dashboard you’ve just invested tens of millions of dollars in made it very clear that neighborhoods that had the one invariably had the other. But maybe there was some underlying, unaddressed factor that generated both fires and the concentration of poverty. (If this example strikes you as a tendentious fabulation, or a case of reductio ad absurdum, trust me: the literature of operations research is replete with highly consequential decisions made on grounds just this shoddy.) 44

Cities are messy, complex systems, and we can’t understand them without the methodological and epistemological mud. Given that much of what we perceive on our urban dashboards is sanitized, decontextualized, and necessarily partial, we have to wonder, too, about the political and ethical implications of this framing: what ideals of “openness” and “accountability” and “participation” are represented by the sterilized quasi-transparency of the dashboard?

Getting Back to the Dirt

Contrast the dashboard’s panoptic view of the city with that of another urban dashboard from the late 19th century, when the term was still used primarily to refer to mud shields. The Outlook Tower in Edinburgh, Scotland, began in the 1850s as an observatory with a camera obscura on the top floor. Patrick Geddes, Scottish polymath and town planner, bought the building in 1892 and transformed it into a “place of outlook and … a type-museum which would serve not only as a key to a better understanding of Edinburgh and its region, but as a help towards the formation of clearer ideas of the city’s relation to the world at large.” 45 This “sociological laboratory” — which Anthony Townsend, in Smart Cities, describes as a “Victorian precursor” to Rio’s digital dashboard — embodied Geddes’s commitment to the methods of observation and the civic survey, and his conviction that one must understand a place within its regional and historical contexts. 46 Here, I’ll quote at length from two historical journal articles, not only because they provide an eloquent explication of Geddes’s pedagogical philosophy and urban ideology, but also because their rhetoric provides such stark contrast to the functionalist, Silicon Valley lingo typically used to talk about urban dashboards today.

Outlook Tower. [from Patrick Geddes, Cities in Evolution, 1915]
Outlook Tower. [from Patrick Geddes, Cities in Evolution, 1915]

The tower’s visitors were instructed to begin at the top, in the camera obscura, where they encountered projections of familiar city scenes — “every variety of modern life,” from the slums to the seats of authority — and where they could not “fail to be impressed with the relation of social conditions to topography,” as Charles Zueblin reported in 1899, in The American Journal of Sociology. The camera obscura, he wrote, “combines for the sociologist the advantages of the astronomical observatory and the microscopical laboratory. One sees both near and distant things.” Continuing:

One has a wider field of view than can be enjoyed by the naked eye, and at the same time finds more beautiful landscape thrown on the table by the elimination of some of the discordant rays of light. One sees at once with the scientist’s and the artist’s eye. The great purpose of the camera obscura is to teach right methods of observation, to unite the aesthetic pleasure and artistic appreciation with which observation begins, and which should be habitual before any scientific analysis is entered upon, with the scientific attitude to which every analysis should return. 47

This apparatus offers both a macro view and the opportunity to “zoom in” on the details, which is a feature of interactive digital dashboards, too. But here that change in scale is informed by an aesthetic sensibility, and an awareness of the implications of the scalar shift.

“On the Terrace Roof,” according to a 1906 exhibition review, “one has again an opportunity of surveying the Edinburgh Region, but in the light of day and in the open air” — and, Zueblin notes, “with a deeper appreciation because of the significance given to the panorama by its previous concentration” in the camera obscura:

Here the observer has forced upon him various aspects of the world around him; weather conditions, the configuration of the landscape, the varying aspect of the gardens as the seasons pass, our relation to the sun with its time implications, the consideration of direction of orientation, etc. 48

Descending the floors, visitors encountered exhibitions — charts, plans, maps, models, photos, sketches, etc. — that situated them within their spatial contexts at increasing scale: first the archaeology and historical evolution of Edinburgh; then the topography, history, and social conditions of Scotland; then the empire, with an alcove for the United States; Europe; and, finally, the Earth. (Zueblin admits that this last part of the exhibition, which in 1899 lacked the great globe that Geddes hoped to install, was underdeveloped.) Along the way, visitors came across various scientific instruments and conventions — a telescope, a small meteorological station, a set of surveying instruments, geological diagrams — that demonstrated how one gained insight into space at various scales.

“The ascent of the tower provides one with a cyclopaedia,” Zueblin observes, “the descent, a laboratory. … In the basement we find the results, not only of the processes carried on above, but also classifications of the arts and sciences, from Aristotle or Bacon to Comte and Spencer, and we incidentally have light thrown on the intellectual development of the presiding genius here.” 49 The building thus embodied various modes of understanding; it was a map of intellectual history.

At the same time, the tower gave shape to Geddes’s synthetic pedagogy: one that began with the present day and dug deeper into history, and one that started at home and extended outward into the region, the globe, and perhaps even the galaxy. The Tower impressed upon its visitors a recognition that, in order to “understand adequately his region,” they needed to integrate insights from various fields of specialization — biology, meteorology, astronomy, history, geology — yes, even the study of the mud and rocks thrown into the vehicle. 50

Today’s urban dashboards fail to promote a similarly rich experiential, multidisciplinary pedagogy and epistemology. The Outlook Tower was both a dashboard and its own epistemological demystifier — as well as a catapult to launch its users out into the urban landscape itself. It demonstrated that “to use results intelligently the geographer must have some knowledge of how they are obtained” — where the data come from. 51 The lesson here is that we can’t know our cities merely through a screen. From time to time, we also need to fly by sight, fiddle with exploding radiators, and tramp around in the mud.

Author's Note

Thanks to Rob Kitchin for proposing that I explore the history of dashboards. A version of this essay will appear in the forthcoming book Understanding Spatial Media (Sage Publications), edited by Kitchin, Tracey P. Lauriault, and Matthew W. Wilson. Thanks, too, to my spectacular research assistant Steve Taylor, and to my colleagues and friends Julia Foulkes and Aleksandra Wagner for their comments on earlier versions of this paper.

Notes
  1. Natasha Singer, “Mission Control, Built for Cities,” The New York Times, March 3, 2012. See also, Shannon Mattern, “Interfacing Urban Intelligence,” Places Journal, April 2014.
  2. City of Baltimore, “CitiStat/Process/Take a Tour”; Tom Pelton, “Running the City by the Numbers,” The Baltimore Sun, July 14, 2002.
  3. Steve O’Hear, “Well, What Do You Know: The UK Prime Minister’s iPad ‘App’ Is Real. We Have Details,” TechCrunch, November 7, 2012; Samuel Gibbs, “David Cameron: I can manage the country on my BlackBerry,” The Guardian, August 21, 2014; Alice Newton, “The Number 10 Dashboard,” Action 4 Case Study: Digital Capability Across Departments [policy paper], U.K. Cabinet Office, March 24, 2014.
  4. Stephen Few, “Dashboard Confusion,” Intelligent Enterprise, March 20, 2004; Few, Information Dashboard Design: Displaying Data for At-a-Glance Monitoring, 2nd ed. (Burlingame, CA: Analytics Press, 2013).
  5. David Nettleton, Commercial Data Mining: Processing, Analysis and Modeling for Predictive Analytics Projects (Waltham, MA: Elsevier, 2014), 80. See also Nils H. Rasmussen, Manish Bansal, and Claire Y. Chen, Business Dashboards: A Visual Catalog for Design and Deployment (John Wiley & Sons, 2009).
  6. John Thackara, In the Bubble: Designing in a Complex World (Cambridge, MA: MIT Press, 2006), 169.
  7. See Orit Halpern’s Beautiful Data: A History of Vision and Reason Since 1945 (Durham, NC: Duke University Press, 2015) for a provocative discussion on the historical relationships between the perception of data, the conception of rationality, and the performance of governance.
  8. Koen Pauwels, It’s Not the Size of the Data, It’s How You Use It: Smarter Marketing with Analytics and Dashboards (American Management Association, 2014), 27-29.
  9. Joshua Tauberer, “History of the Movement,” Open Government Data: The Book, 2nd ed. (self-published, 2014).
  10. See Robert D. Behn, “What All Mayors Would Like to Know about Baltimore’s CitiStat Performance Strategy” (IBM Center for The Business of Government: 2007).
  11. Rob Kitchin, Tracey P. Lauriault, and Gavin McArdle, “Knowing and Governing Cities Through Urban Indicators, City Benchmarking, and Real-Time Dashboards,” Regional Studies, Regional Science 2:1 (2015), 6-28.
  12. Here we see the dashboard mingling with metaphors like platform. For a discussion of these buzzwords, see Shannon Mattern, “Library as Infrastructure,” Places Journal, June 2014.
  13. See also the Amsterdam city dashboard, which is currently offline while the nonprofit media lab that hosts the site migrates its servers and imports new datasets.
  14. See also the London Assembly’s “Smart London Plan,” December 2013.
  15. Oliver O’Brien, “City Dashboard,” Suprageography, April 23, 2012.
  16. Rob Kitchin, “Dublin Dashboard Launch,” Programmable City, September 11, 2014.
  17. Paolo Ciuccarelli, Giorgia Lupi, and Luca Simeone, Visualizing the Data City: Social Media as a Source of Knowledge for Urban Planning and Management (New York: Springer 2014), 1-2.
  18. See Shannon Mattern, “Methodolatry and the Art of Measure,” Places Journal, November 2013, and “Interfacing Urban Intelligence,” op cit.
  19. See Kenelm Edgcumbe, Industrial Electrical Measuring Instruments, 2nd ed. (New York: D. Van Nostrand Company, 1918); Stuart Bennett, A History of Control Engineering, 1800 – 1930 (Herts, UK: Peter Peregrinus, 1979); and Bennett, A History of Control Engineering, 1930 – 1955 (Herts, UK: Peter Peregrinus, 1993). Frederik Nebeker identifies a host of other technological and scientific developments that played an integral part in electronics’ — and dashboards’ — backstory: electron tubes, particularly the thyratron voltage regulator, which allowed for precision control; imaging devices and graphic user interfaces, which allowed for capturing and visualizing those increasingly precise measurements; the telegraph, the wireless, military intelligence and cryptography; gyroscopic control and sound ranging; radar; and, certainly not least, calculating machines and binary computing. See Nebeker, Dawn of the Electronic Age: Electrical Technologies in the Shaping of the Modern World, 1914 to 1945 (Hoboken, NJ: John Wiley & Sons, 2009). Plus, the new field of control engineering emerged to regulate and investigate the confluence of these various mechanical, electrical, fluid, financial, communication, and physiological systems. Control is premised on the principle of feedback, which is typically linked to cybernetic theory. But as mechanical engineer Otto Mayr and engineering historian David A. Mindell both argue, feedback — a core operating principle of the dashboard — has a much deeper history. Decades before World War II, “engineers in a variety of settings” — Mindell cites the U.S. Navy Bureau of Ordnance, the Sperry Gyroscope Company, Bell Telephone Labs, and Vannevar Bush’s lab at MIT — “developed ideas and technologies of feedback, control, communications, and computing.” See Mindell, Between Human and Machine: Feedback, Control and Computing Before Cybernetics (Baltimore: Johns Hopkins University Press, 2002), 8.
This work wasn’t informed by user-focused psychological and ergonomic research, as was the case in much of the concurrent cockpit research, but was based, rather, on an “ideal type that engineers created (consciously or unconsciously) as they designed machinery.” Mayr digs back even farther: he traces the feedback loop back to Ktesibios’s water clock in Alexandria, Egypt, in the 3rd century B.C. and Philon’s self-refilling oil lamp from 200 B.C. The emergence of steam-pressure regulators in the 18th century, and the growing popularity of automata and other feedback devices, Mayr suggests, were due not only to the development of new technology, but also to people’s acceptance of machinery and systems — including the system of free enterprise — as autonomous. See Mayr, The Origins of Feedback Control (Cambridge, MA: MIT Press, 1970). The new technology, in order to gain traction within the culture, had to be accompanied by an epistemological shift and the evolution of new cultural techniques for producing and using that technology. Similarly, as noted earlier, we had to wait for more systematic means of handling data, more thoughtful methodologies for generating and analyzing it, and more cultural imperatives for producing it and monitoring it, before the dashboard could proliferate.
  20. Michael Berger, The Automobile in American History and Culture (Westport, CT: Greenwood Press, 2001), 240.
  21. My colleague Aleksandra Wagner suggested that, given how much of the driving task is now automated, our contemporary dashboards — with their increasing number of self-referential displays — might be transforming drive-time into an opportunity for us to explore our relationships to ourselves.
  22. Frederick K. Teichmann, Airplane Design Manual (New York: Pitman Publishing, 1942), 106. The design of a cockpit must consider the optimal arrangements of a variety of items, Teichmann wrote, many of which are accessed via — or need to be in close proximity to — the dashboard: windshield outline and construction; angles and field of vision; instruments and their location; power plant controls and their location; the pilot’s and co-pilot’s seats; the primary control systems; the brake systems; hydraulic controls for brakes, flaps, tabs, etc.; automatic-pilot equipment; radio equipment; lighting; heating and ventilation; de-icing equipment and controls; oxygen equipment; accessibility and emergency exits (108-9).
  23. Branden Hookway, “Cockpit,” in Beatriz Colomina, Annmarie Brennan, and Jeannie Kim, Eds., Cold War Hothouses (New York: Princeton Architectural Press, 2004), 38-9.
  24. Teichmann, 122. Frederik Nebeker concurs: “A pilot relied on many instruments to monitor the engines and the flight of the plane. There were selsyn systems that showed the fuel level and the movements made by landing flaps and landing gear. … There were four types of gyros: horizon indicator, direction indicator, horizontal control, and directional control.” Dawn of the Electronic Age: Electrical Technologies in the Shaping of the Modern World, 1914 to 1945 (Hoboken, NJ: John Wiley & Sons, 2009), 382-3.
  25. B. Jones & J. D. McK. Watson, Eds., Digital Signal Processing: Principles, Devices and Applications (Peregrinus, 1990), 1.
  26. According to Teichmann, the military and aircraft manufacturers attempted to standardize the placement of instruments on the dashboard: “The primary flight group is immediately in front of the pilot and near the top of the panel. This group consists of the Sperry turn indicator and the Sperry flight indicator both on the same level. Below these is the secondary flight group consisting of the airspeed, bank and turn indicator combined, and the rate of climb instruments. To the left, either in the same row or as close as possible, the sensitive altimeter is located. In addition it is customary to locate the magnetic and radio compasses as conveniently close to the other flight instruments as possible. The engine instruments are usually grouped in the same general pattern, depending on their number” (123).
  27. Hookway, “Cockpit,” 41-42.
  28. See Arthur Joseph Hughes, History of Air Navigation (London: Allen & Unwin, 1946).
  29. Robert Buderi, The Invention that Changed the World: How a Small Group of Radar Pioneers Won the Second World War and Launched a Technological Revolution (New York: Simon & Schuster, 1996): 95.
  30. For more on the Opsroom, see Eden Medina, Cybernetic Revolutionaries: Technology and Politics in Allende’s Chile (Cambridge, MA: MIT Press, 2011), 118.
  31. One wall was the entrance, with an adjacent wardrobe, and the wall immediately to the right offered access to a small kitchen.
  32. Till A. Heilmann, “Buttons and Fingers: Our ‘Digital Condition,’” paper presented at Media in Transition 7, Unstable Platforms; the Promise and Perils of Transition, at MIT on May 5, 2011. Rachel Plotnick has also traced the cultural construction and reception of that cultural technique from the 1880s to the 1920s; she argues that educators, tinkerers, appliance manufacturers, and the electrical industry had different stakes in how the public engaged in button-pushing. Without intuitive button-pushing interfaces, new-fangled electrical appliances could be intimidating, stoking consumers’ confusion over or fears of electricity. An easy-to-use, non-threatening, button-based interface, however, communicated to the modern user that he was “in control of his situation, body, and technology.” See Plotnick, “At the Interface: The Case of the Electric Push Button, 1880–1923,” Technology and Culture 53:4 (October 2012), 830. Yet if button-pushing were too effortless, consumers would take electrical workers’ labor and expertise for granted and settle into the role of passive, complacent, oblivious consumers. Thus, electrical engineers focused on “rendering buttons ‘strange’ and unfamiliar rather than a taken-for-granted, invisible device” (Plotnick, 837).
  33. Medina, 121.
  34. Medina, 124.
  35. Jan Noyes and Matthew Bransby, Eds., People in Control: Human Factors in Control Room Design (London: Institution of Electrical Engineers, 2001).
  36. Mattern, “Interfacing Urban Intelligence.”
  37. Kitchin, Lauriault, and McArdle, 7.
  38. See the work of Edward Tufte, as well as Johanna Drucker, Graphesis: Visual Forms of Knowledge Production (Cambridge, MA: Harvard University Press / metaLAB, 2014).
  39. Branden Hookway, Interface (Cambridge, MA: MIT Press, 2014), 134.
  40. Kitchin, Lauriault, and McArdle, 11.
  41. Kitchin et al. argue for the need to document “data lineage” — to highlight data’s provenance, provide metadata, reveal levels of error, etc. (22). See also Jamie Bartlett and Nathaniel Tkacz, “Keeping an Eye on the Dashboard,” Demos Quarterly, October 24, 2014.
  42. Kitchin, Lauriault, and McArdle, 24.
  43. Or, conversely, it promotes the intentional employment of muddy methodology in the pursuit of desirable data. John Eterno, Arvind Verma, and Eli B. Silverman have reported on the manipulation of crime statistics: reclassifying (“downgrading”) crimes and “gaming the numbers” so as to achieve management’s and city government’s crime reduction goals. In New York, they write, “otherwise ethical men were driven to cook the books on major crimes to keep the Compstat gods appeased.” See Eterno, Verma, and Silverman, “Police Manipulation of Crime Reporting: Insiders’ Revelations,” Justice Quarterly (2014).
  44. Adam Greenfield, “Two Recent Interviews,” personal blog, February 24, 2014.
  45. “A Geographical Exhibition at the Outlook Tower, Edinburgh,” The Geographical Teacher 3:6 (Autumn 1906), 268.
  46. Anthony Townsend, Smart Cities: Big Data, Civic Hackers, and the Quest for a New Utopia (New York: W.W. Norton, 2013), 97.
  47. Charles Zueblin, “The World’s First Sociological Laboratory,” The American Journal of Sociology 4:5 (March 1899), 585-88.
  48. “A Geographical Exhibition,” 269; Zueblin, 588.
  49. Zueblin, 584-85.
  50. “A Geographical Exhibition,” 269.
  51. Ibid.
Cite
Shannon Mattern, “Mission Control: A History of the Urban Dashboard,” Places Journal, March 2015. Accessed 28 Jul 2016.