I was recently asked a provocative question by a student that I want to share with you today: “What happens when data fails?”

I took a second to gather my thoughts in order to give an answer that would both challenge and engage. My mind was racing during the four seconds it took for me to say: “That is a very good question.”

I raced through a number of possible responses in my mind and thought about answering his question from a technical perspective, which is to discuss what happens when data systems fail or crash.

I thought about answering his question from a human perspective, which is to discuss what happens when the data solutions we deploy don’t work to help the intended audience. I thought about Cambridge Analytica and Facebook and how data privacy violations and the misuse of data can be seen as “data failing.”

At the end of my four seconds, my answer hit me like a ton of bricks and I said: “I saw what happens when data fails while working with New York City in the aftermath of 9/11.”

What I discussed then was how the inability to share data with those who need it, when they need it, causes serious challenges during both man-made and natural catastrophes.

I used New York City as an example because I worked on issues like these during my tenure at the city. And, as a native New Yorker with family still living there, these issues are very important to me.

Data failed New York City on September 11, 2001.

The lack of data sharing leading up to 9/11 is outlined in the 9/11 commission report: “The U.S. government has access to a vast amount of information. But it has a weak system for processing and using what it has.”

The system of “need to know” should be replaced by a system of “need to share.” There are many anecdotal reports of emergency responders on the ground struggling to get the data they needed to do their jobs quickly and effectively.

Data also failed during and after 2012’s Hurricane Sandy. The Sandy After Action Report explicitly stated that “the city should significantly improve its collection and synthesis of data on the provision of essential services throughout the city, including power, gas, and telecommunications.”

In 2013, the United States District Court for the Southern District of New York found the city liable for failing to provide people with disabilities meaningful access to its emergency preparedness programs and services. As a result of that decision, the city was required to create a Post-Emergency Canvassing Operation (PECO) plan to rapidly survey households after a disaster and identify the critical needs of people with disabilities.

During a canvassing operation to survey the households, canvassers were required to go door-to-door carrying a mobile survey tool to input resource requests and refer those requests to appropriate partners for resolution. Resource requests included but were not limited to: food, water, electricity, medical care and durable medical equipment.

But the city did not have a readily available database of people with disabilities.

The problem wasn’t that the data didn’t exist; the problem was that data was spread across multiple agencies in disparate databases with no chance of it ever being shared or smartly integrated for use during an emergency.

Data also failed during the summer of 2015, when a terrible outbreak of Legionnaires’ disease in New York caused 12 deaths and infected 112 people.

Ultimately, city health inspectors and first responders found that the bacteria were spreading via cooling towers. This was difficult to discern at the time because the city did not have a database of cooling tower locations.

As this example demonstrates, it’s the “unknown unknowns” that will hurt you.

The examples that I mentioned above have either been rectified or are being worked on.

I think these examples are relatable to many of us because these scenarios of data “failing” New Yorkers are something that happens across every government agency, everywhere.

So what are we doing about it?

I am seeing governments hiring Chief Data Officers and data scientists so the right people are at the helm to ensure that data doesn’t fail again in serious situations.

I have the pleasure of knowing and working with many data professionals in government and I am comfortable knowing that federal, state and local government – in the United States, Australia and all around the world – are headed in the right direction to make sure that data doesn’t fail us in a catastrophic way ever again.


This article – and others from Amen Ra Mashariki – are published as part of the GovLoop Featured Contributor program.

About the Author

Dr Amen Ra Mashariki
Global director, World Resources Institute
Dr Amen Ra Mashariki is the Global Director of the Data Lab at the World Resources Institute, where he combines data analysis with cutting-edge technology to tackle global challenges.
