How often do you think about the mediated space between the weather forecast and the reality of climate change? Behind the day-glo radar maps and adorably abstract sun and cloud icons are vast amounts of data feeding atmospheric models that inform not only how we dress for the day, but how we prepare for droughts and superstorms.
The climate archive gets wilder and dirtier the deeper you go. Ice cores, boreholes, sediments, pollens, tree rings, corals, and other samples of the geologic field become documents.
Weather data flow through neural nets and populate massive data centers, but they also reside in refrigerators and polystyrene tubes. And the climate archive (like most archives) gets wilder and dirtier the deeper you go. To survey the past 150 years or so, climate researchers can use instrument readings from ships and weather stations, but to understand global patterns across deep time, they must turn to proxies: ice cores, boreholes, lake and ocean sediments, pollens, tree rings, corals, stalactites and stalagmites, and other natural features that index climatic events. 1 The geologic field itself, and strategically selected samples of it, become archival documents, in the same way that, for Suzanne Briet, “the photographs and the catalogues of stars, the stones in a museum of mineralogy, and the animals that are catalogued and shown in a zoo” are documents. 2
Yet those data and documents are not structured uniformly. Species collections, core samples, and medieval manuscripts can all help researchers understand the changing climate, but they are subject to widely varying protocols of collection, preservation, and access. Some state geologic surveys house their rocks in “museums,” while others maintain “sample libraries,” “reference collections,” or “repositories.” Differences in terminology reflect different conventions of thought and practice.
In the geosciences, there’s a long tradition of regarding the Earth itself, the terrestrial field, as an archive. Talk about big data. David Sepkoski shows how the invention of stratigraphy in the early 19th century helped naturalists see the Earth “as having a deep history, which could be ‘read’ in the succession of fossils embedded in the strata of [its] crust.” 3 As early as 1766, chemist Torbern Olof Bergman described fossils as “medallions of a sort … laid down on the originating earth surface, whose layers are archives older than all [human] annals, and which appropriately investigated give much light on the natural history of this our dwelling place.” 4 Over the next century, this metaphor multiplied across layers of abstraction. First there was the Earth as archive; then fossil records and specimen collections, visual representations of those collections, textual catalogs, and, eventually, databases. 5
In the geosciences, there’s a long tradition of regarding the Earth itself, the terrestrial field, as an archive.
Today, we have thousands of literal “earth archives” — collections of ice, rock, sediment, soil, and other geologic specimens useful for climate research. But they are not always managed by professional archivists. Geologist and information scientist Sarah Ramdeen acknowledges that state repositories are committed to “curation,” but she laments their uneven expertise in collection management (which sometimes involves thinning out materials), metadata, user services, and long-term care and storage. “Concerns over sustainability and interoperability are high,” she writes. 6 Meanwhile, Eira Tansey, an archivist who works with more traditional media, regrets her field’s reluctance to look beyond its borders, to the larger natural environment in which it operates: “We archivists think of records as something created by, for, and about humans. … Anything from the natural world may be considered data, and historically we have tended to leave data to other professions.” 7
So who, or what, will unite the core repositories, marine sample collections, manuscript libraries, and digital archives? These institutions have more in common than their “bigness.” They’re all fighting against instability — against material deterioration, shaky funding, evolving professional standards, and unpredictable threats like power outages and malfunctioning equipment. More to the point, they are all operating in a world of rising temperatures and sea levels. Geo-archives have always faced the reality of environmental change; some contain chunks of icebergs that no longer exist and water samples from lakes that have evaporated. Now, media-based archivists are starting to focus on resilience, too. “We need to completely rethink how to integrate climate change adaptation into our existing work,” Tansey says, “from appraisal to processing to preservation,” because simply collecting documentation of environmental instability is not enough.
These institutions have more in common than their ‘bigness.’ They’re all fighting against instability … and facing the reality of environmental change.
It’s time for curators — of cores, codes, and codices; manuscripts, machine-learning models, and mud — to think across their collections, recognizing their shared commitment to resilience and social responsibility. Together, they can honor the variety of “natural” and textual documents that yield critical data about geological, climatic, and related cultural processes — and that bring those processes to life for patrons. They can develop protocols and standards so that materials are widely accessible and useful. They can advocate collectively for funding and other forms of support. And they can strengthen their respective fields by posing big questions — not only about the bigness of their datasets, but also about the breadth of their fields of operation, and the existential challenges they can help us face.
This past summer, a giant iceberg broke off Antarctica’s Larsen C ice shelf, a “calving” event that was reported in the media (by some of the same outlets that track tech product releases and celebrity news) as a symbol of imminent climate change. For those who had watched the fracture advance in satellite images over three years, the grand finale felt uncomfortably normal. One wonders how that God’s eye view, and all the hardware and software that enables it, is entangled in such terrestrial transformations.
Ice core repositories are unquestionably alluring. We might say they are the charismatic mega-terra of geo-archives.
The North and South Poles once felt like the ends of the earth. Now they are familiar territory, not only for the energy companies and political regimes negotiating the terms of a new Cold War, but also for the many scholars, designers, and artists gripped by a new planetary consciousness, fascinated by the regions’ cryptic histories, or simply drawn to extreme environments. 8 Of course, ice has long been a fact of life and a way of knowing for the indigenous people, whalers, and scientific and military researchers who live at the poles. As climate change accelerates, we’re all bound to get a lot more familiar with polar geography. Sverker Sörlin argues that we are living through a “cryo-historical moment” after “more than a century of attempts to enroll ice as an object of study in a range of science fields.” 9
In 1966, at Camp Century, an underground ice-city in Greenland, researchers from the U.S. Army Cold Regions Research and Engineering Laboratory drilled a core 1,387 meters down to bedrock. Their findings revolutionized climate science: “The Century ice core yielded a unique and powerful window to the Earth’s past … providing unbroken physical access to more than 100,000 years of history.” At the CRREL facility in Hanover, New Hampshire, the deep core “was transformed into the most detailed database of the Earth’s climatic history in existence.” 10 The earth archive became big data.
Ice core repositories are unquestionably alluring. We might say they are the charismatic mega-terra of geo-archives. (Or megafonds, for the archive nerds.) 11 As snow accumulates over millennia, it’s compressed into layers of ice that capture particulates, dissolved chemicals, organic matter, and air bubbles from various ages. The cores — long cylinders extracted from ice sheets and glaciers in Antarctica, Greenland, North America, and other glaciated regions — can be read layer by layer, as a “sort of almanac,” 12 an index of weather and climate conditions present at the time of each layer’s formation. Researchers can track precipitation, temperature, wind patterns, atmospheric composition (notably, greenhouse gas concentrations), and volcanic and solar activity. 13 (Even dust, that eternally invasive presence in the archive, is here an integral part of the frozen document.) 14 The U.S. National Ice Core Laboratory, in Denver, describes its 19,000 meters of refrigerated stock as “frozen time capsules” of climatic history. 15 As D. Graham Burnett notes, the cores also constitute an “archive of life,” of pollens, spores, and ancient microorganisms “that haven’t been seen on the planet in eons.” 16
Core extraction requires tremendous coordination and stamina, as described by geologist Richard Alley. Transport is logistically complex, too, with the ice carried on “cold decks” during flight, protected in specially insulated boxes, and then hauled in freezer trucks to cold storage. 17 After the cores arrive at the repository and achieve thermal equilibrium, they are racked, inventoried, and sliced up for analysis. 18 Some analytic procedures, like crushing a sample in a vacuum in order to study its gas composition, are “consumptive,” resulting in the sample’s destruction.
Despite this embrace of methodological “lossiness,” there is something paradoxical about the whole enterprise. Much ice core research is dedicated (if only implicitly) to mitigating climate change and slowing the melting of ice sheets and glaciers. But these operations require a tremendous amount of energy at every stage, from the refrigerated transport triathlon to the storage facilities with their redundant compressors and generators. Even then, the collections are vulnerable to loss. This past spring, refrigeration chillers at the Canadian Ice Core Archive shut down after only five months of operation (and not long after electric pumps failed at Svalbard, Norway, flooding the Global Seed Vault). The monitoring system that should have alerted officials also failed. As a consequence of this double melt-down, the brand-new facility at the University of Alberta lost 13 percent of its samples, which had been recently transferred from Ottawa’s Ice Core Research Laboratory, where they were “orphaned due to budget cuts at Natural Resources Canada.” 19 Even state-of-the-art, carefully monitored repositories can’t withstand the vagaries of government financing, extreme weather, and network failure.
Many archives and conservation efforts fall into a trap where the act of building the collection becomes the end in itself.
“Freezing ice cores to study climate change is a practice saturated with ironies,” Joanna Radin and Emma Kowal observe. “The ability to preserve samples of glacial ice requires energy intensive forms of preservation in order to demonstrate how fossil-fuel-dependent capitalist societies have contributed to climate change.” 20 We can draw parallels to other types of frozen storage, such as blood banks, seed banks, and frozen gametes of endangered species, as highlighted in a beautifully illustrated New York Times Magazine article on “arks of the apocalypse.” 21 As Radin and Kowal demonstrate, the “cryopolitics” of “freezing or suspending life in anticipation of future salvation” can be tricky. When we invest in these arks (or perhaps we should call them refrigerated container ships) do we provide false reassurance that we can ride out the storm? Do we obscure the failures of “carbon-based capitalism” and limit our obligations to act in the present? 22 Many archives and conservation efforts are susceptible to what Fernando Vidal and Nélia Dias call an “endangerment sensibility”; they fall into a trap where the act of building the collection becomes the end in itself, which is “mainly remedial, recuperative, therapeutic, even palliative.” 23
And sometimes those arks of apocalyptic idealism spring a leak: “power outages, faulty backup generators, fires, floods, earthquakes, contamination, liquid-nitrogen shortages, war, theft, neglect.” 24 All our geologic fields, and their archived proxies, must contend with sudden and unpredictable disaster.
Some repositories, meanwhile, are meant to be soggy. The Lamont-Doherty Core Repository at Columbia University contains tens of thousands of sediment cores and dredge and “grab” samples from every major ocean and sea, as well as lakes, bogs, rivers, marshes, and peatlands. 25 According to marine geologist Chris Goldfinger, such sediment collections — including the one at his own institution, Oregon State University — are “like a library of the earth.” 26 It’s a slow-growing collection: sediments accumulate on the sea floor at the rate of a couple centimeters per thousand years. Yet that big-and-slow data set offers clues to understanding oceanic “dead zones” and factors contributing to climate change. The distribution of microorganisms’ shells can signal changes in ocean currents and species migration, and the presence of particular oxygen isotopes can reveal the rate at which carbon is reaching the ocean floor or how much water is locked up in land-based ice sheets at a given time.
Sediments accumulating on the sea floor at the rate of a couple centimeters per thousand years constitute ‘a library of the earth.’
Meanwhile, turbidites, accumulations of marine sediment that are moved around by earthquakes or landslides, can help researchers piece together seismic histories — which, in turn, explain recent events like the severing of transoceanic telecommunications cables. Ice-rafted detritus registers the calving of icebergs, which leave a pebbly trail as they melt at sea. Fossilized bacteria and pollen offer clues to the origins and evolution of life. Abrupt boundaries mark the beginnings and ends of glacial periods, which can be pinned down with radiocarbon dating. Cores have also yielded evidence for reversals in the Earth’s magnetic field and have shown that glacial cycles are related to shifts in the shape of the Earth’s orbit, its axis tilt, or its wobble on that axis. 27
When the Lamont Geological Observatory was founded, in 1949, director Maurice Ewing had been collecting sediment cores for two years. Most geologists at the time believed the seabed to be a stable ground that received a steady rain of surface particles and dead microorganisms. Ewing’s cores, however, showed significant variability in sediment deposits, which suggested the influence of dynamic underwater forces. Rusty Lotti Bond reports that Ewing developed equipment to speed up the coring process, surmising that if his team “gathered as many cores as possible from as many places as possible, a pattern would emerge.” He then “decided to establish a library of cores so that researchers, now or in the future, could have specimens to study — with a fresh perspective or more sophisticated instruments, perhaps — to discover something new.” 28 Over the years, the scope of that inquiry has grown. As coral reefs decline around the world, the LDCR is now building a collection of coral samples and cores. Corals cover a shorter timespan than sediment cores, but their growth rings allow for more precise dating.
The oldest materials in the LDCR index 130 million years of geologic history. 130 million years! 29 Making sense of these dataforms requires collaboration among geochemists, mineralogists, oceanographers, geophysicists, imaging experts, and a variety of other specialists — including, I would emphasize, curators. Nichole Anest, the current curator, told me that most patrons today are geoscience researchers, but there are business applications too. AT&T consulted samples before laying fiber-optic cable across the Atlantic and Pacific. Oil companies funded the Observatory’s early cruises and sometimes still request samples, although they now have their own proprietary core repositories. 30 Again, we can’t ignore the irony: the methods used to prospect for fossil fuels that formed over the course of millennia are now used to piece together climatic histories across those millennia, and often to advocate for a future less dependent on fossil fuels.
When a marine sediment core enters the LDCR facility in Palisades, New York, it is cut into 1.5-meter lengths, split longitudinally, and marked with plastic tabs every 10 centimeters. The investigator and date are recorded, and the core is assigned a code based on the ship, cruise, and “leg” or “station” number. One of these longitudinal sections, the “archive half,” is kept intact; it is photographed and described in terms of texture, color, structure, and composition. The other section, the “working half,” is sliced up as researchers request samples. 31
In the early days, cores at the LDCR were marked with thumbtacks and stored in steel trays, with the archive halves wrapped in plastic. They rusted and dried out, which made them unsuitable for new analytical methods in geochemistry. Since 1984, cores have been refrigerated and kept moist in polystyrene containers called D-tubes. At the USGS Woods Hole Coastal and Marine Science Center on Cape Cod, researchers keep the tubes in climate-controlled vans known collectively as the “Freezer Farm.” Brian Buczkowski and Sarah Kelsey of Woods Hole explain that it is essential to preserve the “physical integrity” of the samples since the form and composition are their critical data. 32
Columbia’s LDCR is one of about a dozen such repositories worldwide. Anest believes it’s the largest academic core repository in the world (soon to be rivalled by Oregon State, which is building a new facility to house an acquisition of sediment cores from the Southern Ocean). 33 How data and documents are shared among these repositories is critical to marine science. At LDCR, many of the historical core logs and ship logs have been digitized, and all of the core data are shared in public databases (as a condition of funding from the National Science Foundation). Web tools allow researchers to virtually suture the archived segments together, to see an image of the full core with all its associated metadata.
Seemingly trivial administrative debates can raise epistemological questions no one ever thought to ask.
Kerstin Lehnert, a geochemist at the Observatory and director of the Interdisciplinary Earth Data Alliance, has been a pioneering figure in developing cyberinfrastructures. 34 She initiated the International Geo Sample Number, a global unique identifier for geologic and environmental specimens (similar to the digital object identifier, or DOI, for online content). It is designed to remedy inconsistent naming conventions that have produced confusion. “We have 70 samples named ML-12,” she told me, “from all over the planet.” The IGSN enables researchers to create distinct identifiers and build links between the samples, the data acquired from them, and the publications using that data. Researchers in other fields, including biodiversity and material science, and the National Institute for Standards and Technology, have all expressed interest in adopting or adapting the IGSN.
And yet, as with any attempt at standardization, there has been some resistance to Lehnert’s efforts. 35 A few major institutions have insisted that their legacy standards be adopted universally. As Geoffrey Bowker and Susan Leigh Star note, in their classic study of classification systems, “the spread or enforcement of categories and standards” inevitably involves “negotiations, organizational processes, and conflict.” 36 Yet that conflict sometimes yields great insight. Seemingly trivial administrative debates can raise epistemological questions no one ever thought to ask. “We just had a meeting in Australia,” Lehnert recalled, “where we talked about what a ‘sample’ is.”
Even as the digital infrastructure is improved, the material archive — the original cores — will always be essential. The ability to replicate results is fundamental to the scientific process, and samples often need to be studied again as science advances. “We would never be able to measure everything when the core comes out of the ocean, then dump it back in, assuming we’ve extracted all its information,” Lehnert said. There are always new methods and instruments, higher resolutions, emerging subfields that focus on new properties. Just a few years ago, one of the scanners in the lab “made a huge upgrade in its detecting capabilities,” Anest said. Some of their researchers have partnered with art authenticators and conservators at the Metropolitan Museum of Art, which has its own battery of unique instruments. As science historian Lorraine Daston observes, “No one knows in advance what questions future historians or climatologists will pose and what traces from the present (and whatever of the past has already been preserved) will be needed to answer them”; scientific archives will have to be “reconfigured to serve new lines of inquiry, over and over again.” 37
While geoscientists recognize that the material integrity of sediment cores is essential to their archival function, administrative records like ship logs have been treated differently. The data are imported into online databases, but the historical form and aesthetics of those data aren’t always shared. Yet, as historians of paperwork would note, the material form of those administrative documents is significant, as it allows us to study how categories, standards, and bureaucratic procedures structure the crew’s operations — or how the crew’s practices exceed the paperwork’s (and the bureaucracy’s) boundaries. 38 Anest observed that many administrative forms have annotations in the margin or on the back side. In fields like archaeology, recording media — photographs, maps, field notes, and so forth — are often viewed as part of the profession’s material culture. As Jennifer Baird and Lesley McFadyen argue, archaeology’s textual archive “exists as a site of translation between the material past encountered during excavation and the production of archaeological knowledge as an intellectual exercise.” 39 Here the curators and managers of geoscientific collections could learn from manuscript and rare book librarians and audio-visual archivists, who have long acknowledged the importance of their collections’ materiality.
Geological repositories affiliated with larger institutions can benefit from interdisciplinary partnerships. For example, Columbia University’s archives contain, in the Central Files, materials about Lamont-Doherty’s founding, funding, facilities, and administration; accounts of expeditions; research notes on Ewing; the papers of key faculty; and architectural drawings. At Ohio State University, geoscientists at the Byrd Polar and Climate Research Center have partnered with the university libraries to create a special archival collection containing the historical papers (letters, diaries, photographs, reports, expeditionary records), oral histories, and artifacts (medals, furs, etc.) of explorers and scientists involved in polar research. 40
While money pours into the digital infrastructures of Big Data, the material proxies underlying that data rely on precarious funding and the heroic efforts of a skeleton staff.
Most geoscientific curators simply don’t have the resources to become manuscript librarians or digitization specialists, too. In addition to supporting faculty and student research and fielding inquiries from around the world, Anest and her colleagues oversee a vibrant outreach program that includes teaching kits for local classrooms. They face a mound of residuals — leftover samples returned to the repository — that now await cataloging, plus filing cabinets, notebooks, drawers of photos, and shelves full of seismic tapes that await digitization. A few years ago, the Observatory hosted a “media archaeological” data rescue competition, in which sixteen international teams submitted proposals to “save data in danger of dying within old floppy disks, tape drives, or paper archives.” 41 Many projects depend on the idiosyncratic institutional knowledge of specific staff members. 42 When Anest started working at the repository two decades ago, “there were seven full-time staff; now there are two.”
Those challenges will sound familiar to many archivists. 43 While governments, corporations, and foundations are pouring money and attention into the digital infrastructures of Big Data, the material proxies underlying and generating much of that data often rely on precarious funding and the heroic efforts of a skeleton staff. What will happen to federal support under a new administration that shows blatant disregard for science and open hostility to climate research? Who will advocate for the value of such relatively uncharismatic archives: mud and rocks on a shelf?
Soil Samples and Rock Cores
Dirt is perhaps even less charismatic. While anthropologist Mary Douglas famously defined “dirt” as “matter out of place,” some dirt — much like dust — becomes data in the archive. Soil repositories were originally conceived as a way to improve agricultural production, but today they are used in climate research, pollution monitoring, construction and engineering projects, and even criminal forensics. 44 Researchers test properties like soil depth, stoniness, wetness, variability, and vegetation cover, and by sampling soil from the same location over decades they can demonstrate the impacts of particular land uses and other environmental changes. The Rothamsted Sample Archive, north of London, contains samples of crops, soil, fertilizers, and manures dating back to 1843, which were used to demonstrate a rise in dioxins that lasted until the 1970s, followed by a sharp decline corresponding with new environmental regulations. Likewise, soils sampled at the Hubbard Brook Experimental Forest in New Hampshire contributed to the scientific discovery of acid rain. 45
As archaeologist Vance Holliday writes:
Soils are indicators of the nature and history of the physical and human landscape; they record the impact of human activity, they are a source of food and fuel, and they reflect the environment and record the passage of time. Soils also affect the nature of the cultural record left to archaeologists. They are reservoirs for artifacts and other traces of human activity. 46
Soil studies can also project future impacts. For example, certain soils are better than others at sinking greenhouse gases or binding pollutants to slow the release of toxins. The U.S. Army Corps of Engineers recently collected soil samples on the U.S.-Mexico border as part of tests to determine the most suitable construction material for Trump’s border wall. If the wall is built, it will dramatically change the way land is used by people, animals, and plants on both sides. 47
Many soil repositories are the product of multiple collections converging over the years, each with its own purposes and protocols. Others rely on donations, which may be collected by people with limited training. Curators try to standardize the processes for documenting old samples and (ideally) collecting more. New samples can be dried, frozen, or refrigerated, but many organizations, including the U.S. National Ecological Observatory, prefer to air-dry and sieve their samples, as dried soils will keep for decades and are not vulnerable to power outages and thaws. 48 Samples are described in terms of their site, morphology, chemistry, and other factors; then transferred to labeled containers and kept away from light. Some archives also maintain soil “profiles” or “monuments” — vertical sections that are preserved (typically with lacquer) to maintain their original texture and structure. 49
As we’ve seen, retrofitting old archival materials is a challenge. Managers at the Australian National Soil Archive note that “legacy soil data” must be “transcribed from paper copies of internal reports, ASCII text, Fortran-based non-relational flat-file database files or spreadsheets” — often tedious work that requires significant time and institutional knowledge. 50 When Val Stanley joined the Wisconsin Geological and Natural History Survey two years ago, as Samples and Laboratories Manager, she faced her own forensic challenge. She discovered giant wooden boxes containing seventeen cores of undocumented iron formation that were over a hundred years old and likely of significant research value. They were “dark data,” undiscoverable within the collection, until Stanley’s team reconstructed metadata using old core logs, master’s theses from the early 20th century, and old tax assessments of the Cahoon mine. Now those cores are being used in research projects exploring the relationship between this Precambrian formation and groundwater in southern Wisconsin. 51
We need to ask: What do we keep, how do we take care of it, and how do we describe it?
Rock cores are lower maintenance than ice and sediments, Stanley says, because climate control is not as critical. 52 (She previously worked in a lacustrine sediment core repository and now works at Oregon’s ice core repository.) But that doesn’t mean archival practices should be lax. Together with colleagues at other state repositories, she advocates for higher standards of accession, description, and preservation, including adoption of Lehnert’s IGSN. Too many donated rock cores are simply labeled “Hole One,” she said. And she wants to join forces with archivists and librarians in developing best practices and improving online access. Inspired by Ramdeen’s work, she poses three basic questions: “What do we keep, how do we take care of it, and how do we describe it?”
Some soils and rocks are enlisted for special operations in high-security archives. Since the 1980s, the U.S. federal government has maintained a repository of rock cores and samples about a hundred miles north of Las Vegas, on the Nevada National Security Site. While officials acknowledge the collection’s value “for researchers investigating the hydrology of arid areas,” their primary interest is in the fractured volcanic rock’s potential for “contaminant migration” — that is, the spread of radionuclides. 53 When the longstanding effort to build the Yucca Mountain Nuclear Waste Repository was shut down by the Obama administration in 2010, the climate control was turned off, and custody of the geologic cores and rock samples was transferred to a different branch of the Department of Energy.
What we see here is an effort to “read the archival rocks” in order to legitimate a geoengineering project that would dramatically reshape the geologic and cultural field for millennia. Geographer and landscape architect Seth Denizen laments that geoscientists “seem to be more often summoned to review evidence at the scene of a crime than to record the annals” of deep time. “The expertise that is called upon is the epistemological power to make matter speak. What do the rocks say?” 54 Now the Trump administration is signaling that they may restart the Yucca Mountain project, depending on the testimony of those rocks.
At Yucca Mountain, we see an effort to ‘read the archival rocks’ in order to legitimate a geoengineering project that would dramatically reshape the geologic and cultural field.
Nuclear waste repositories are an extreme example, but even more mundane soil archives are forced to reckon with the fact that modern soils are shaped by human activity. 55 As Denizen explains, “Things like trash, construction debris, coal ash, dredged sediments, petroleum contamination, green lawns, decomposing bodies, and rock ballast not only alter the formation of soil, but themselves form soil bodies, and in this respect are taxonomically indistinguishable from soil.” 56 Official soil maps, reflecting the prevailing classification, emphasize morphology (size, shape, chemistry) over genetics (where the soil comes from). They simply don’t know what to do with soils made through human activity. Denizen calls this a “large hole in the USDA soil taxonomy.” 57 Whereas the Unified Soil Classification System focuses on practical applications — for instance, whether a given soil is well suited for a highway or a factory farm — Denizen proposes a taxonomy that asks “the old ontological question: ‘what is soil?’” 58 His prompt echoes many we’ve heard before: what’s a document, what’s a record? These seemingly small questions about administrative classification can spark larger philosophical debates.
Curating a Future Earth
When environmental records began disappearing from government websites in the early days of the Trump administration, librarians and archivists joined forces with scientists and educators to save the data. 59 But digital preservation is only one front in the fight to make cultural institutions more resilient. Archives, libraries, museums, and historical sites also have to protect their material records from climatic forces. Tansey recommends that organizations take a sober “actuarial look” at the insurability of their collections. Further, they should reassess workflows and priorities, and perhaps even their appraisals, as they decide what’s worth acquiring and preserving. 60
Social justice is also fundamental to resilience. The overwhelmingly white ranks of librarians and archivists need to listen closely to the vulnerable populations they serve, who face “barriers in accessing the types of records needed to substantiate their claims of environmental injustices, or have difficulty getting those in power to take seriously the evidence and documentation their communities have gathered together in the absence of official records.” 61 Many are working to increase diversity in their fields and to promote climate, science, and information literacies, empowering the communities they serve to develop their own resilient practices.
Further, cultural institutions need to conserve energy at all levels, from copy machines and book delivery services to climate-controlled storage facilities and data management. Archivist Ben Goldman notes that digital preservation standards “routinely lead to the duplication of the same content across multiple objects.” 62 Integrity, high resolution, and redundancy — which are institutionalized through mantras like “lots of copies keep stuff safe!” (LOCKSS) — ultimately require lots of energy. Goldman says archivists should set “acceptable levels of mutability” and degrees of “lossiness,” embracing the standard of “graceful degradation” advanced by Bethany Nowviskie, director of the Digital Library Federation. 63 Archivist Rick Prelinger concurs: “If archives are to ride the rising waves, it won’t be as arks fully caulked to repel leaks, but as permeable wetlands capable of assimilating ebbs and flows.” 64
And those who ride in leaky boats should probably know a bit about maintenance. Librarians and archivists need to avoid fetishizing innovation and instead “find nobility in activities like metadata enhancement, project maintenance, and forward migration” (the updating of old formats). 65 They have much to teach, and to learn from, the geoscientists who are building a more robust cyberinfrastructure. 66 Since its founding in 2005, the National Geological and Geophysical Data Preservation Program, an arm of the USGS, has provided federal funding for the preservation of material samples and data, and for the development of a national catalog and collective standards. 67 Researchers now have access to EarthCube’s Internet of Samples in the Earth Sciences, its forum on “Collaboration and Cyberinfrastructure for Paleogeoscience,” the paleoclimate data platform LinkedEarth, and the System for Earth Sample Registration, which distributes the IGSN (International Geo Sample Number). All depend on government funding and a supportive administration in the state house or White House.
In classifying and indexing samples of ice, rock, soil, and sediment, we acknowledge the Earth as a vast geo-informatic construct. It is both geology and data, ontology and epistemology. Yet unlike many Big Data operations, which live in the Cloud, this “Linked Earth” is also resolutely material — muddy, icy, soggy. Those material properties are essential to the Earth’s ability to document its own past, and to the human ability to predict, prepare for, and even redirect its future. Curating — that is, caring for — both the geological archive and the terrestrial field from which it is drawn must be an organized, concerted effort that involves scientists, librarians, archivists, network engineers, and other technical specialists. And efforts to make that data accessible and usable, to publicize the insights it yields, would benefit from stronger collaboration with cartographers, designers, artists, and other media-makers. 68 New archival processes, research methods, and creative practices represent the making of a new field of knowledge-production and curation. What we call this new field matters less than how we operate within it, probing and navigating uncertain anthropogenic terrains.