Toward a digital resilience

Practice Bridge
  • Dawn J. Wright
    • Environmental Systems Research Institute, Redlands, California, United States
    • College of Earth, Ocean, and Atmospheric Sciences, Oregon State University, Corvallis, Oregon, United States
DOI 10.12952/journal.elementa.000082


As we contend with human impacts on the biosphere, there is rightfully a great emphasis now on community adaptation and resilience to climate change. Recent innovations in information technologies and analyses are helping communities to become more resilient. However, not often discussed in this vein is a path toward digital resilience. If mapping and information tools are to help communities, it stands to reason that they must be resilient themselves, as must the data on which they are based. In other words, digital tools can help make communities resilient by providing data, evidence-based advice on community decisions, and more, but the resilience of the tools themselves can also be an issue. Digital resilience means that, to the greatest extent possible, data and tools should be freely accessible, interchangeable, operational, of high quality, and up-to-date so that they can help give rise to the resilience of the communities or other entities using them. Given the speed at which humans are altering the biosphere, the usefulness and effectiveness of these technologies must keep pace. This article reviews and recommends three fundamental digital practices, particularly from the standpoint of geospatial data and for community resilience and policy-making. These are: (1) create and implement a culture that consistently shares not only data, but also workflows and use cases for the data, especially within maps and geographic information systems (GIS); (2) use maps and other visuals to tell compelling stories that many different kinds of audiences will understand and remember; and (3) be more open to different kinds of partnerships to reduce project costs, yield better results, and foster public awareness and behavioral change.


Barnosky et al. (2015) point out that one of the grand challenges for both science and society is solving six intertwined and vexing problems: human population growth and overconsumption, pollution, disease spillovers, human-caused climate disruption, ecosystem destruction, and extinction. A Scientific Consensus Statement for Maintaining Humanity’s Life Support Systems in the 21st Century (Barnosky et al., 2014a, 2014b) identifies many applicable solutions, including working alongside the engineering community (e.g., civil and construction engineering, environmental engineering, energy resources engineering, materials science, mechanical, industrial, and manufacturing engineering, and the like). For example, these collaborations are leading to the creation and deployment of new solar, wind, and hydro fuel cells, new monitoring systems for drought, new mapping systems that alert citizens to climate-induced hazards, and more.

This paper focuses more on the digital realm of software engineering, within the broader world of information technology and its accompanying practices. The 2015 American Association for the Advancement of Science (AAAS) symposium “Avoiding Collapse: Human Impacts on the Biosphere” emphasized that various innovations in digital data collection, analysis, and visualization now allow for a synthesis of environmental information so that we may track and understand human impacts at global to local scales. Analysis of such data provides new ways to identify macro-scale patterns and processes through long time periods. Indeed we are surely living in an unprecedented era of regional to global scale observation and simulation of the Earth, as exemplified by scientific observatories funded by the National Science Foundation, such as: the Critical Zone Observatories, which look at fluxes of water, carbon, sediments, and nutrients across natural watershed and human land use boundaries; the EarthScope program, which investigates the structure of the Earth’s crust, the strain of earthquake faults, and the activity of volcanoes by way of a vast network of sensors scattered across the entire North American continent; the Ocean Observatories Initiative, which is deploying a similar network of sensors around the world to measure physical, chemical, biological, and geological variables throughout the depths of the ocean down to the seafloor; and the Global Earth Observation System of Systems, with its focus on space-borne, airborne, and in situ sensors. All of these programs are constructing the appropriate data management architecture and decision-support systems in parallel. Indeed, we now find ourselves inhabiting a “Digital Earth” composed of digital technologies, from satellites to wristwatches, that monitor, map, model, and manage (Annoni et al., 2011; Dobson, 2000; Foresman, 2008; Goodchild, 2007; Goodchild et al., 2012; National Research Council, 2010).

These big science programs produce the so-called “big data.” Big data is defined in Gantz and Reinsel (2012) as “a new generation of technologies and architectures, designed to economically extract value from very large volumes of a wide variety of data by enabling high-velocity capture, discovery, and/or analysis.” Big data, with the three main characteristics of volume, velocity, and variety, is in turn leading to a new science paradigm, a new data science that deals with, among many issues, the inundation of data from satellites, sensors, and other measuring systems and the issues associated with those large data sets (Seife, 2015; Wright A, 2014). Indeed we are seeing the fruition of ideas expressed by Hey et al. (2009), which posits a new paradigm of scientific discovery beyond the existing three paradigms of empiricism, analysis, and simulation, to a fourth in which insight is discovered through the manipulation and exploration of large data sets (i.e., both volume and variety). The impediments to further development are not only technological but also conceptual. For example, the lack of complete understanding about the nature of data in both space and time (i.e., both velocity and variety) continues to obstruct solutions to its manipulation in digital forms. But with recent advances in mapping technologies such as geographic information systems or GIS (as an example), scientists are now able to apply high-resolution optical and acoustic imaging techniques that span an incredible range of mapping scales, from km to cm (e.g., Chase et al., 2012; Costa et al., 2014; Dick et al., 2014; Dolan and Lucieer, 2014; Ferrini et al., 2008; Galparsoro et al., 2010; Makowski et al., 2015; Sen et al., 2013).

This is not only about a fourth paradigm of scientific discovery, but a fourth paradigm of government, where Pitroda (2013) predicts that the future of democratic governance lies not only in the pillars of the executive, legislative, and judicial, but also in a fourth pillar of information. Indeed governments around the world are adopting principles of open data in government, in which data gathered at the taxpayers’ expense are made freely available for both access and reuse, to help foster the transparency and accountability of government as it addresses a wide range of societal challenges, including adaptation and resilience to climate change (Publications Office of the European Union, 2015). These data include local, state, and national boundaries, information about land ownership, the heights and depths of land and water bodies, the paths of streams and drainage areas, and maps of all transportation corridors, urban and rural (witness the recent bipartisan Geospatial Data Reform Act introduced in the US Senate). In some cases, powerful interests actively seek to keep digital data and information technologies out of the reach of groups of people (Wright et al., 2009). This state of affairs suggests that we need to understand how spatial knowledge is shaped by identity, power, and socio-economic status, and how spatial data handling is socially and politically mediated, particularly with the emergence of crowd-sourcing and citizen science (Connors et al., 2011; Elwood, 2007; Goodchild, 2007).

A very positive circumstance, however, is that there exists an ever-growing catalog of data portals (i.e., single points of electronic access to data and the maps and tools that use them) along with information technology tools to monitor, track, and report events and day-to-day operations across a network of people within an organization (see, for example, the US federal government data portal Data.Gov). A significant proportion of these are map-based, given the incredible power of maps to communicate, persuade, inspire, promote understanding, and elicit action (e.g., Gale, 2013; Harrower, 2015; Wood, 1992). The demand for maps has never been greater, whether for finding directions or for looking at city services, deliveries, movements of people and vehicles, weather events, social events, and social media. We are clearly in a new digital world order. And in this new world order the same digital mapping technologies used for science (for understanding how the Earth works) are also helping communities in a more practical way to gain resilience against one or more of the six intertwined problems discussed in Barnosky et al. (2015). These range from monitoring fire, drought, or flooding to mapping the relevant insurance zones for each. They include tracking economic collapse or health epidemics, finding available drinking water, alerting us to temperature and precipitation changes, determining landscape vulnerability for land managers, monitoring air quality, even identifying the suitability of a position on one’s roof for installing solar panels. However, often not discussed is a path toward digital resilience. If digital mapping and information tools are to help communities, it stands to reason that they must embody some resilience themselves if they are to continue to be effective.

Digital resilience

Resilience in general refers to the capacity to deal effectively with change and even threats, to recover quickly from challenges or difficulties, even to withstand stress and catastrophe. Holling (1996) lays out two distinct definitions of resilience, further amplified in Walker and Salt (2006). The first involves the common conception of the ability to recover quickly to a prior desired state (aka “engineering resilience”). The second involves whether a system retains the capacity to recover to a prior desired state at all, including its capacity to absorb disturbance and still retain essentially the same (desired) structure and function (aka “ecological resilience”). I argue that these definitions of resilience can and should apply to digital data and systems as well, meaning that for the most part they should be free, accessible, interchangeable, operational, and up-to-date; hence resilient.

Investments in digital data continue to rise (especially with the emergence of the open data movement), and data portals have proven effective in providing electronic access to data and information (e.g., Aditya and Kraak, 2006; Diamond, 2013; Mossbauer et al., 2012). Gantz and Reinsel (2012) predict that by 2020, a third of the data in the digital universe (more than 13 zettabytes, where one zettabyte is one million times the word count of books, monographs, and journals in the world’s largest library; Johnston, 2012; Kirk, 2012) will have tremendous societal value, but only if tagged and analyzed. As such, there continue to be huge efforts in structuring and properly characterizing or tagging data, a practice commonly known as generating “metadata.” However, there is more that can be considered. A first principle is that making data, metadata, and all manner of computer code “available” via portals is no longer good enough. Gahegan and Adams (2014) point out the very important difference between what data portals provide and what the user actually needs, especially if seeking to serve many categories of users. Indeed, it is not only a question of how the data are shared openly, but how to facilitate the reuse of data for a range of appropriate scenarios that make sense for the data in that context (e.g., temperature and precipitation data can also be used to derive cooling degree days, mapped out as the energy needed to cool a home or business in a specific region). Three recommendations toward a digital resilience are shared below. These recommendations, while far from exhaustive, are meant to engender initial thought and discussion, especially where resilience to the grand challenges of society discussed in Barnosky et al. (2015) is concerned.
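To make the reuse scenario above concrete, the brief sketch below derives cooling degree days from daily temperature readings. The function name and the conventional 65°F base temperature are illustrative assumptions, not details taken from any of the portals cited here:

```python
def cooling_degree_days(daily_highs, daily_lows, base=65.0):
    """Derive cooling degree days (CDD) from raw daily temperatures (deg F).

    Each day contributes max(0, mean(high, low) - base); summing over a
    period gives a proxy for the energy needed to cool a home or business.
    """
    total = 0.0
    for high, low in zip(daily_highs, daily_lows):
        mean_temp = (high + low) / 2.0
        total += max(0.0, mean_temp - base)
    return total

# Three summer days with daily mean temperatures of 75, 80, and 62 deg F
print(cooling_degree_days([85, 90, 70], [65, 70, 54]))  # 25.0
```

Mapping such a derived value by region is one example of turning openly shared raw data into a decision-ready product.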

Share not only data, but workflows and use cases

In making our data open to access, we need to be more open about what we are doing with the data. In other words, we need to share more of the workflows done with the data (i.e., the actual steps taken in an analysis or preparation of a map from initiation to completion), and further amplify them in use cases (scenarios or vignettes showing behind the scenes how and why data were used for a particular analysis or map, with an emphasis on a practical, real-world outcome to achieve the user’s goal). This is especially true if wishing to maintain scientific rigor, where repeatability of an experiment, approach, or algorithm is a hallmark. Can someone reconstruct and verify the rigor of an approach, and hence the correctness of a conclusion? Can someone replicate the workflow? In other words, can someone reconstruct and understand the scientific process that was undertaken? Garijo et al. (2014) define a workflow simply as a template specifying the set of methods or tasks needed to carry out a scientific or computational experiment (Figure 1). Providing a workflow supports reimplementation or application by others, thus validating approaches and models (Börner and Scharnhorst, 2009; Weber et al., 2015). And given the provision of a workflow within a data portal, the best datasets, tools, and results are amplified not only for research and education, but also for practice by governments, non-profits, and other organizations seeking to solve societal problems. In addition, the long-term provenance and preservation of both the data and workflow are critically important for the life cycle of the scientific process. On-the-fly, dynamic calculations and approaches have their place, but scientists often want and need to keep the data at each stage of a workflow.

Wright (2016a) provides an example scientific data workflow package that is being shared openly and freely within the global data and map portal known as ArcGIS Online. In a similar vein, Wright (2016b) contains Python scripts and documentation for building and configuring a workflow to handle imagery from the Landsat 8 satellite.
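As a hypothetical illustration (not the actual Wright, 2016b scripts), a workflow can be represented as an ordered template of named steps that records provenance and keeps the data at each stage, echoing the points above about repeatability and preservation. All class, step, and variable names here are invented for the sketch:

```python
import json
from datetime import datetime, timezone

class Workflow:
    """A workflow as an ordered template of named steps, recording
    provenance and retaining the intermediate result of each stage."""

    def __init__(self, name):
        self.name = name
        self.steps = []          # (step_name, function) pairs
        self.provenance = []     # metadata logged per stage
        self.stages = {}         # intermediate data, kept for reuse

    def add_step(self, step_name, func):
        self.steps.append((step_name, func))
        return self              # allow chaining

    def run(self, data):
        for step_name, func in self.steps:
            data = func(data)
            self.stages[step_name] = data          # keep each stage
            self.provenance.append({
                "step": step_name,
                "ran_at": datetime.now(timezone.utc).isoformat(),
            })
        return data

# Hypothetical stages for a toy list of raw reflectance values
wf = (Workflow("toy-imagery")
      .add_step("mask_nodata", lambda px: [v for v in px if v >= 0])
      .add_step("scale", lambda px: [v / 10000 for v in px]))
result = wf.run([120, -9999, 450])
print(result)                              # [0.012, 0.045]
print(json.dumps(wf.provenance, indent=2))
```

Because each stage and its provenance are retained as plain data, the whole object can be published alongside the datasets in a portal, letting others replicate or audit the process step by step.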

The workflow is brought to life as part of a use case that simply tells the story of the correct or most effective way of using that workflow (aka a best practice). This includes how the data used by the workflow may interoperate (or be interchangeable) across a range of formats and on whatever devices or platforms are chosen, especially by way of international standards. This is not unlike a file or program that works both on a personal computer running Windows and on a Macintosh, a digital device that plugs into a home stereo but will also plug into and work in one’s car, or a smart phone that can handle calls made from other parts of the world in many different languages. Also desirable in a use case is a discussion of interoperability in terms of modeling frameworks and how best to decouple data from models so that they may be used for multiple purposes (along the continuum from data to information to knowledge). The use case should also describe how the data and workflow might integrate with a host of additional scientific tools and libraries, or how the workflow may “crosswalk” among several related approaches (e.g., Gallagher et al., 2015). In the scientific realm, but also for policy-makers and other non-specialists seeking to benefit from scientific data, well-written and complete use cases can be tremendously helpful. If the reader of a use case is able to understand what is happening (e.g., what is going on behind the scenes that produced a certain map or output), this will engender trust in the workflow and hence in the results.
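One concrete way data become interchangeable is serialization to an open standard. The sketch below, using invented station data, writes point records to GeoJSON (an IETF standard, RFC 7946) that most web mapping tools and GIS packages can read:

```python
import json

def to_geojson(records):
    """Serialize (lon, lat, properties) records to a GeoJSON
    FeatureCollection, an open standard (IETF RFC 7946) readable by
    most mapping tools regardless of platform."""
    return {
        "type": "FeatureCollection",
        "features": [
            {
                "type": "Feature",
                "geometry": {"type": "Point", "coordinates": [lon, lat]},
                "properties": props,
            }
            for lon, lat, props in records
        ],
    }

# Hypothetical coastal monitoring station off Oregon
stations = [(-124.1, 44.6, {"name": "Newport buoy", "sst_c": 11.2})]
print(json.dumps(to_geojson(stations), indent=2))
```

Publishing derived products in such standard formats, rather than only in a tool-specific form, is one small but practical contribution to the interoperability a use case should document.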

Figure 2 depicts the sharing of both workflows and use cases as part of a dissemination strategy for a new global ecological land units (ELU) map, aimed at landscape ecologists, resource managers, land use planners, and the general public. The project is ongoing as a public-private partnership between the Environmental Systems Research Institute and the US Geological Survey, and was officially commissioned by the Group on Earth Observations (GEO) and its Global Earth Observation System of Systems (GEOSS) Task EC-01-C1, as key outcomes of the GEO Biodiversity Observation Network and the GEO Ecosystems Initiative (Group on Earth Observations, 2005; Sayre et al., 2007). A global ecological marine units (EMU) map (in progress, Sayre et al., 2015) and a global ecological freshwater units map have also been commissioned. The ELU is a massive biophysical stratification of the planet at the finest spatial resolution yet attempted (250 m), producing a first-ever map of distinct physical environments and their associated land cover, in a delineation of ecologically meaningful regions that is both classification-neutral and data-driven (Sayre et al., 2014; Stockton, 2015). The intent is to provide scientific support for planning and management (including as an important variable for GIS geodesign models and apps), to enable understanding of impacts to ecosystems from climate change and other disturbances, and for the valuation of ecosystem services. In this way, it fulfills one of the main recommendations of the White House President’s Council of Advisors on Science and Technology report on sustainable environmental capital (Executive Office of the President of the United States of America, 2011).
A new ELU white paper (Frye et al., 2015) provides further guidance to use case testers throughout the user community (i.e., anyone, whether researchers, students, resource managers, park rangers, or Congressional staffers, wishing to work with a local or regional subset of the ELU map to assess its effectiveness in that region), offering the associated conceptual and technical support pro bono as they download the data, the workflow, and the initial use cases, and prepare their own.
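The stratification idea behind the ELU, in which each distinct combination of input layers defines an ecological unit, can be illustrated with a toy sketch. This is a simplification for intuition only, not the USGS/Esri production method, and the layer names and class values are invented:

```python
def stratify(layers):
    """Combine co-registered categorical layers cell by cell: each
    distinct combination of input classes defines one ecological unit.
    The result is classification-neutral: unit ids are assigned in the
    order combinations are discovered in the data, not from a scheme."""
    units = {}                        # combination -> unit id
    grid = []
    for cells in zip(*layers):        # one tuple of classes per cell
        if cells not in units:
            units[cells] = len(units) + 1
        grid.append(units[cells])
    return grid, units

# Toy 1-D "rasters": bioclimate, landform, and land cover per cell
bioclimate = ["cool", "cool", "warm", "warm"]
landform   = ["hills", "plains", "plains", "plains"]
landcover  = ["forest", "crops", "crops", "crops"]

grid, units = stratify([bioclimate, landform, landcover])
print(grid)   # [1, 2, 3, 3]
```

At the ELU's global 250 m resolution the same idea applies over billions of cells, which is why sharing the workflow and its use cases matters as much as sharing the resulting map.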

Be willing to tell stories, and of many kinds

Barnosky et al. (2015) point out an important two-pronged challenge in academia. The first is in training environmental and physical scientists to communicate issues in ways that truly resonate (especially in concert with, and learning from colleagues in the social sciences and humanities). As Caldas et al. (2015) point out, this is largely a matter of culture, “both a property of the individual, and a property of the social context in which individuals exist… but still an important variable mediating the relationship between humans and the natural environment.” The second challenge is in taking the knowledge developed within academia writ large and transmitting it into mainstream society in ways that elicit significant action. One way to accomplish both is through the medium of storytelling.

Gottschall (2012) contends that the last several decades of studies in psychology have shown repeatedly how story affects the human mind, and how attitudes, fears, hopes, and values are strongly influenced by story. Scientists are often encouraged not to publish their work until it constitutes a complete “story.” In the prior section a use case was categorized as a type of story, a “story” of best practice. However, there are different modes of story and storytelling. Scientists need to invert their customary mode and progression of communication into one that a policy-maker (or journalist) will readily receive and understand (for a complete treatment of the concept and method, see Baron, 2010). Scientists want to explain how the world works by way of copious background information, an overview of prior studies, detailed methods, results, and discussion before getting to the final take-home message. But policy-makers need this inverted: they need scientists to lead with the information that bears on the decision at hand (e.g., whether or not to establish a new protected area in their jurisdiction, or which specific ecosystem service to consider for a new management plan or regulation in that jurisdiction), sometimes in near real-time. A scientist can inform that decision by telling a viable story. As we know, journalists are always in search of a good “story.”

Returning to the realm of digital data and information, there is a relatively new medium called the “story map.” A story map shares not only data, but photos, videos, even sounds, all within the framework of a digital map, and tells a specific and compelling story by way of that map. Story maps are created via web map applications that provide the user with sophisticated cartographic functionality that does not require advanced training in cartography or GIS, and are usually coupled with the web-accessible data needed to tell the story. They also allow the user to leverage their own data (including their workflows and use cases) in new ways to inform, educate, and inspire decision-makers on a wide variety of issues (Wright DJ, 2014, 2015a). Barnosky et al. (2015) provide the example of a story map focusing on how threats to human life support systems currently play out across the United States. Figure 3 shows the opening page of a story map featured in Esri and Jaggard (2014) for the Smithsonian Institution that “reveals the scope of humanity’s influence on Earth—and the innovations aiming to create a more sustainable future.” Other examples specifically for scientific, resource management, and governance applications, including climate resilience, are described in Wright DJ (2014) and Wolfe (2015). A video within Wright (2016c) provides an instructional resource to aid in designing and deploying story maps. It provides further explanation about what story maps are, how a map becomes a story through its layers of data and information, and step-by-step instructions for constructing them.

Be continually open to partnerships

Climate science, resilience studies, and ecology are squarely in the realm of academia and government agencies, but it will be critical to partner with industry as well. The private sector is already working on solving the six intertwined problems discussed in Barnosky et al. (2015), and is often looking to create and share knowledge toward effective action and in partnership with academia or government. Many are entering into a “culture of resilience” not only as part of the values or worldview of a particular company, but because it is also good business. To quote former Maryland Governor Martin O’Malley and many, many others: “cooperation is the new competition” (see also Lowery, 2014; Lowitt, 2013). Hence, academics in particular should not be afraid to investigate partnerships with the private sector.

Such public-private partnerships are successful when based upon a holistic strategy that addresses specific community or social needs in the context of sustainable socio-ecological systems (Hoegh-Guldberg et al., 2013). For example, in June 2013, President Obama announced the Climate Action Plan and within that the Climate Data Initiative, which encourages innovators from the private sector and the general public to convey data on climate change risks and impacts in compelling and useful ways that help citizens, businesses, and communities make smart choices in the face of climate change (Wright, 2015a). Similarly, NOAA, in an attempt to explore more sustainable models for increasing the amount of open data made available via the cloud, while minimizing the great cost of doing so, has created cooperative research and development agreements with Amazon, Microsoft, Google, IBM, and the Open Cloud Consortium. These industry partners have in turn formed data alliances with smaller companies such as AccuWeather, Esri, and PlanetOS to extend the public-private partnership even further. The current strategy in part mirrors the recommendation of Hoegh-Guldberg et al. (2013) to “optimize the yield of common goods utilized, minimize the cost to the public of the activity through the leveraging of opportunities and assets and, incentivize responsible behavior in a transparent and synergistic fashion such that it results in long-term sustainability.” On a much smaller scale, the new Research Data Alliance is fostering public-private partnerships focusing on data use, data quality, and the adoption of data sharing approaches and tools, with the aim of keeping these resources available and useful over the long term.

These are examples of free and open data stimulating new businesses and new research directions. A sample research issue: how best to keep the data free, of high quality, accessible, interchangeable, operational, up-to-date, and hence resilient, while keeping the businesses that generated the data sustained as well? A challenge will be engaging the rest of the private sector to participate in partnerships such as these, with the knowledge that they can, in fact, market these approaches, while still benefiting society at large and serving a public good.


Communities around the world are facing increasing challenges from natural and man-made disasters. Leaders of these communities seek to anticipate future trends and enact policies that will support rapid response during times of need. Whether they face challenges such as drought or flooding, economic collapse, or health epidemics, communities that are resilient are using digital information technologies (such as GIS) to prepare ahead of time, to operate effectively during events, and to recover quickly.

While such digital information technologies can provide innovations that better connect cities, governments, and private organizations together in assessing their risk exposure and increasing their overall resilience, the emphasis of this article has been to lay out three major recommendations for ensuring the resilience of the digital resources themselves. Given the speed at which humans are altering the biosphere, we need such a digital resilience so as not to miss the opportunities for forecasting detrimental outcomes in time to avoid them. Global data needs will continue to grow, and will be met as the “digital Earth” expands, especially by way of real-time sensors. As such, data portals will continue to proliferate.

But this is not just about eyeballs on a map or on a series of numbers; it is about coupling the appropriate data with workflows and use cases to make the data useful for the right audience (Wright, 2015b). The concepts of workflows and use cases are hardly new, but explicitly sharing these resources within data portals, even as digital objects that can be characterized with metadata, downloaded just like datasets, or even mapped out spatially, is the point of emphasis here. It is also about telling compelling stories to effectively communicate scientific results and to transform scientific data into actionable information that people can use in their decision cycles. To use the example of extreme weather (Wright, 2015b), this can be critical for short-term decisions (e.g., get in the storm shelter now), medium term (e.g., evacuate), or long term (e.g., infrastructure planning as communities recover from a hurricane or tornado).

As emphasized in the AAAS symposium that prompted this Elementa Special Feature, the time is now for governments, communities, non-profits, the private sector, and universities to go beyond mere exploration and discussion of ideas to actually using these technological tools in an approach that is digitally resilient, with an aim toward rapidly prototyping and delivering repeatable solutions that all can use to help guide the planet toward a more resilient future… before time runs out.