Transforming Earth Sciences Big Data into Actionable Products for Disaster Applications Part 2


The Disaster Life Cycle Cluster coordinates efforts among data providers, managers and developers of disaster response systems and tools, and end-user communities within ESIP. We have developed a working definition of the disaster life cycle phases. One cluster goal is to work with user communities to identify the information model and the observations/outputs needed to enhance their decision-making.

This session will address cluster goals to identify, promote, and showcase trusted, authoritative data sources and products for disaster decision-making. We will explore ideas for capturing requirements and processes for managing trusted data sources. We have an invited presentation by Dr. Jim Morentz discussing experience with XchangeCore and SpotOnResponse within FEMA as well as the California Earthquake Clearinghouse. ESIP has adopted the Open Science Framework, and we may consider how this shared workspace tool might be used to manage dataset evaluations. We may also leverage ideas from the data maturity matrix under development by the ESIP Data Stewardship Cluster. This will lead to a workshop proposal to engage decision makers in discussions on the availability of relevant Earth observation data and products and to better understand their information needs throughout the disaster life cycle.

“Experience with XchangeCore and SpotOnResponse” presented by Dr. Jim Morentz/JWMorentz LLC

“Data Driven Decision Making for Disasters – A Proposed ESIP Workshop” discussion led by Dave Jones/StormCenter Communications

Discussion Topics
Evolve plans for our proposed Fall 2016 ESIP-sponsored workshop, “Data-Driven Decision Making for Disasters: A Workshop for Operational Decision Making”

DRAFT: “Data-Driven Decision Making for Disasters” workshop topics and ideas:

Hurricane Sandy: What Has Changed to Improve Readiness, Responsiveness and Resilience?
Use cases focusing on what has improved since Sandy: readiness, responsiveness, resilience

Readiness: What has improved since Sandy regarding the public and private sectors’ preparedness when a threat occurs? (e.g., the NWS providing impact-based decision support services, interpreting its products to communicate what the impacts of its forecasts will be)
Responsiveness: What data products are available to support response actions, such as restoring power as rapidly as possible? (e.g., satellite imagery for damage assessment, aerial imaging and LIDAR flights, social media scraping, drone overflights)
Resilience: What datasets, model outputs, etc., are available to help improve the resilience of organizations and communities? Datasets may not be environmentally focused at all (e.g., the position of critical infrastructure that may be affected by future development planning activities).

Readiness: NWS Product Improvements - Impact-based (Presentation by NWS Leadership)
Responsiveness: Damage Assessment (Post disaster assessment of damage in utility service areas) (NASA or DHS/FEMA representative)
Resilience: Rebuilding, planning, importance of connecting disparate organizations 


Thanks to all who contributed to our “Data-Driven Decision Making” definition exercise! Definitions of “Data-Driven Decision Making”:
Contributors: Ross, Sean, Maggi, Karen, Dave, Chung-Lin, Brian, David, Dan
Data-Driven Decision Making –
Providing appropriate information
For decision makers to enable situational awareness, where
Appropriate = Standardized, Simplified, Easy Access
Data-Driven Decision Making –
A process where key managers (the decision makers) utilize various data assets and inputs from SMEs/staff to form courses of action (CoAs), then weigh the pros and cons of each CoA to whittle the list down to two or three. War-gaming takes place for each selected CoA, and ultimately a decision is made based on the outcome. The decision rests on the quality and quantity of the data that is provided.
Data-Driven Decision Making –
Filling the data gaps so that you don’t have to guess. Finding out about unknown unknowns.
Data-Driven Decision Making –
A knowledge-based, value-added “bundle” of data, provided in time to support making adequate decisions in a timely manner.
Data-Driven Decision Making –
Integrated data and information in formats easily recognized by the decision makers; easily incorporated into the decision makers’ Common Operating Picture; relevant to the decision makers’ task at hand.
Data-Driven Decision Making –
Allocating and directing resources based on a wider frame of context and information outside the immediate organizational “bubble” – whether human or from a range of sensors that operate on different scales and “see” different aspects of the environment. Essentially, expanding the range and scale of vision of the people coordinating response to situations, for the best possible outcome.
Data-Driven Decision Making –
Having awareness of a threat via a use case, then thinking through the use of data (any data) that can accelerate SA (situational awareness) and decision making.
What data is available to me?
How timely is it?
How can I access it and use it?
How can I use data to plan?
Let data lead my decision
Data-Driven Decision Making –
There can be good and bad Data-Driven Decision Making. Good DDDM is when the data are expeditiously available and clearly connected to the problem at hand, and when it is done in conjunction with appropriate value judgments (for example, a return-on-“investment” judgment that prioritizes saving certain assets, such as hospitals, over others, such as office buildings).
Bad DDDM is when the data are improperly interpreted, or applied erroneously to a situation for which the data are poorly suited or insufficiently representative of the phenomena that need attention. Using data without making the underlying value assumptions explicit is also a case of bad DDDM.
Data-Driven Decision Making –
DDDM for disaster response = using existing data, re-tasked satellite impact data, and airborne platforms; integrating those data using a few selected templates, where a template may be:
Template1 = {population data gridded + raster image of target area + other stuff}
Template2 = {population data gridded + infrastructure vector data + other stuff}
DDDM for disaster planning = using forecast info with uncertainties to populate a risk matrix, e.g., F1 thru F4 x I1 thru I4, where
F1 thru F4 = forecast of forcing (e.g., precipitation levels, wind strength, height of storm surge), where F1 is, say, 0 mm/hr < precip < 30 mm/hr, F2 = 30 mm/hr < precip < 60 mm/hr, etc., with probabilities; and I1 = 0 ft < river level < 3 ft, I2 = 3 ft < river level < 6 ft, etc., with probabilities. The Fx and Ix values are all model outputs; the models are re-run with increasing frequency as the disaster looks more likely.
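The F × I risk matrix described above can be sketched in a few lines of code. This is a minimal illustrative sketch only: the bin bounds and probabilities below are invented placeholders, not values from the session, and forcing and impact probabilities are treated as independent purely to keep the example simple.

```python
# Hypothetical sketch of the F1-F4 x I1-I4 risk matrix for disaster planning.
# All bin ranges and probabilities are illustrative assumptions.

forcing = {  # precipitation forecast bins (mm/hr) with probabilities
    "F1": {"range": (0, 30),    "prob": 0.50},
    "F2": {"range": (30, 60),   "prob": 0.30},
    "F3": {"range": (60, 90),   "prob": 0.15},
    "F4": {"range": (90, None), "prob": 0.05},  # None = open-ended upper bound
}

impact = {  # river-level impact bins (ft) with probabilities
    "I1": {"range": (0, 3),    "prob": 0.60},
    "I2": {"range": (3, 6),    "prob": 0.25},
    "I3": {"range": (6, 9),    "prob": 0.10},
    "I4": {"range": (9, None), "prob": 0.05},
}

def risk_matrix(forcing, impact):
    """Joint-probability matrix over forcing x impact bins.
    Independence is assumed here only for illustration; real model
    outputs would supply the joint probabilities directly."""
    return {
        (f, i): forcing[f]["prob"] * impact[i]["prob"]
        for f in forcing
        for i in impact
    }

matrix = risk_matrix(forcing, impact)

# As forecasts update, the matrix would be recomputed with re-run model
# outputs; high-probability cells drive planning priority.
most_likely_cell = max(matrix, key=matrix.get)
```

As the text notes, the models are re-run with increasing frequency as the disaster becomes more likely, so in practice this matrix would be regenerated each cycle with fresh probabilities.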

Moe, K.; Transforming Earth Sciences Big Data into Actionable Products for Disaster Applications Part 2; 2016 ESIP Summer Meeting; ESIP Commons, March 2016.