
National Situational Awareness Predictive Analytics (SAPA)
MOTF/Risk Analytics Training
July 27-31, 2015
Jesse Rozelle
FEMA Region VIII GIS Coordinator
FEMA Modeling Task Force SME
[email protected]
National SAPA
How do we create consistent national situational awareness predictive analytics?
[email protected]
[email protected]
National SAPA
Steps toward standardization
1) Define SAPA operational requirements
2) Define standard processes based on hazard, scale, and required turnaround time
3) Establish a framework for successful practices
4) Strengthen our analytic credibility
[email protected]
[email protected]
Defining SAPA Operational Requirements
Who is our audience?
FEMA? State, county, and local partners? All of the above?
[email protected]
[email protected]
Defining Situational Awareness Requirements
What analytics do we need? Some examples
Primary List:
• Hazard Extent
• Population Impacts
• Building Impacts
• CIKR (Critical Infrastructure/Key Resources) Impacts
• Transportation Impacts
Secondary List:
• Mass Care Requirements
• NFIP Impacts/Coverage
• Other?
We can create a very long list of potential impact analytics, but should we first agree on a basic list, then an extended one?
[email protected]
[email protected]
Defining SAPA Operational Requirements
What is our goal?
• Situational awareness?
• Expediting declarations?
• Expediting IA rental assistance?
• Expediting response resources?
• Public outreach and risk communication?
• Projecting long-term economic impacts?
[email protected]
[email protected]
What does SAPA Look Like? Product Formats
• GeoPlatform Viewer
• Spreadsheets
• Static Maps
• Operations Dashboard Viewer
[email protected]
[email protected]
Define Standard Processes Based on Hazard,
Scale, and Turnaround Time
Develop SAPA SOPs for each hazard, for both large scale and small scale events.
SOPs should cover basic and advanced analytic component lists.
[email protected]
[email protected]
Scope of the Event/Turnaround Time
First, what scale of SAPA is appropriate (and possible) for a given event? What is the extent of the disaster?
• Is our product time dependent? (most likely)
• Is the scale of a request realistic within the given time frame?
• The answer for deriving analytics for Minot and for Hurricane Sandy will be very different
[email protected]
[email protected]
Is our disaster striking a small, rural town? Minot, ND 2011
[email protected]
[email protected]
Is our disaster causing impacts across a state? Colorado Floods of 2013
[email protected]
[email protected]
Is our disaster causing devastating impacts across the entire east coast? Hurricane Sandy
[email protected]
[email protected]
Depending on the scale and phase of an event and the time available for completion, the analysis method will vary
• The goal is always to provide the most detailed analytics possible in an acceptable timeframe
• Managing expectations for what can be provided is critical
• The larger the extent of an event, the lower the detail of the analytics possible
• The shorter the turnaround time allowed, the lower the detail of the analytics possible
• Finding a middle ground is key
[email protected]
[email protected]
Weather Watches/Warnings, Pre-Activation Example
[email protected]
[email protected]
Imminent Flooding Event Sample Request:
Your RRCC is activated for a flooding response, flood gages show major to historic flooding, and the extent of impacts is still unknown. Some questions you'll want to ponder before beginning your analysis:
• How many communities could be affected based on USGS stream gages? One community, or 50? 100? Multiple states?
• How soon do you need to provide estimates? Two hours, two days, or two weeks?
• Do you want to provide basic impact analytics (number of households/people affected, CIKR)? Or do you want detailed economic impact projections?
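To make that triage concrete, here is a minimal sketch that maps the three questions above to an analysis approach. The function name, thresholds, and approach labels are entirely hypothetical, not FEMA policy:

```python
# Hypothetical triage helper: maps the scoping questions above to an
# analysis approach. All thresholds are illustrative, not FEMA policy.

def triage(communities_affected: int, hours_available: float,
           detailed_economics: bool) -> str:
    """Pick an analysis approach from event extent and turnaround time."""
    if hours_available <= 4:
        return "DFIRM-based exposure only"                   # fast, coarse
    if detailed_economics:
        return "event-based inundation + economic impacts"   # slow, detailed
    if communities_affected > 50:
        return "statewide/regional exposure rollup"
    return "custom inundation mapping"

print(triage(communities_affected=3, hours_available=2, detailed_economics=False))
```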
[email protected]
[email protected]
Small Scale Event – One Community (2 FTE)
• 1-3 hours – DFIRM-based exposure
• 1 week – custom inundation mapping
• 2-3 weeks – event-based inundation mapping, economic impacts, and percent damage per structure
• Custom inundation mapping availability is dependent on access to hydrologist support and high resolution terrain data (LIDAR)
[email protected]
[email protected]
Statewide Event – Regional (3-4 FTE)
• 1 day – DFIRM-based exposure
• 2-3 months – custom inundation mapping
• 3-6 months – event-based inundation mapping, economic impacts, and percent damage per structure
• Custom inundation mapping availability is dependent on access to hydrologist support and high resolution terrain data (LIDAR)
[email protected]
[email protected]
National Level 1 Event – Full MOTF Activation (8 FTE)
• 1 day – SLOSH/NHC advisory-based exposure, updated daily
• 1 week – high water mark-based event inundation mapping
• 3 weeks – final custom storm surge inundation mapping for the entire event
• 3 months – event-based inundation mapping, full suite of impacts to all sectors and programs
• 1 year – event-based inundation mapping, full suite of impacts to all sectors and programs
[email protected]
[email protected]
Other Federal Agencies Are the Authority on
Estimating the Hazard
• Always defer to the authoritative science agency for estimating the extent and severity of each hazard
• USGS, NOAA, NWS, SPC, NHC
• These agencies are the authority on hazard extent, but they do not estimate impacts (SPC is beginning to deliver very rough impact information)
• FEMA is the lead on impacts
• Flood inundation mapping is currently unsolved
[email protected]
[email protected]
Common Pitfalls When Estimating the Hazard in
Hazus
The level of accuracy you get out of Hazus depends on the accuracy of your inputs, the time spent setting up your model, and SME experience with Hazus prior to the request. Garbage in, garbage out; quality in, quality out. The following are common mistakes that can lead to inaccurate analytics when using Hazus. Hazus can estimate the hazard when OFA data isn't available, but its real value lies in loss estimation.
• Creating your own earthquake scenarios in the Hazus earthquake model vs. using USGS ShakeMaps
• Using the Hazus level 1 flood methodology – inundation mapping for an event is very challenging
• Using Hurrevac wind field data from outdated advisories when using the Hazus hurricane wind model
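On the first pitfall: before hand-building an earthquake scenario, it is worth checking whether USGS has already published an authoritative ShakeMap. A minimal sketch, assuming network access and using USGS's documented public GeoJSON summary feed (the loop logic is illustrative):

```python
# Minimal sketch: check USGS's public feed for recent significant events
# that already have an authoritative ShakeMap, rather than hand-building
# a Hazus scenario. FEED is the documented USGS GeoJSON summary feed.
import json
import urllib.request

FEED = ("https://earthquake.usgs.gov/earthquakes/feed/v1.0/"
        "summary/significant_week.geojson")

with urllib.request.urlopen(FEED) as resp:
    events = json.load(resp)["features"]

for ev in events:
    props = ev["properties"]
    # "types" is a comma-separated string of available product types.
    has_shakemap = "shakemap" in (props.get("types") or "")
    print(f"{props['title']}: ShakeMap available: {has_shakemap}")
```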
[email protected]
[email protected]
Hazus Flood Model Time Investment/Limitations
There are limitations to all three Hazus models, but the flood model limitations are the most prominent
• A Hazus level 1 flood analysis for one county takes approximately 2 days to run full H&H (hydrology and hydraulics) with 10 meter NED
• Detailed site specific Hazus flood estimates for one county take about 2 weeks
• DFIRM exposure estimation for one county takes about 1 hour
[email protected]
[email protected]
Hazus Level 1 Flood Methodology
• LIDAR-derived level 1 flood hazard methodology is very time consuming, if it works at all; 30/10 meter DEM-derived flood hazard is low resolution
• Overestimation of losses with the level 1 flood model's aggregated general building stock; this is minimized with the dasymetric inventory
• Time required from start to completion? Approximately 2 days of processing per county, assuming no problem reaches or unresolvable errors
• These issues are being addressed in the Hazus modernization effort, but for now the level 1 methodology is not applicable for response
[email protected]
[email protected]
National Hazus Level 1 2009 Archive
• National Hazus level 1 analysis run in 2009 for every county in the US; offers a means to skip level 1 hydrology
• Lower level of confidence; intended for general county-by-county risk
• Utilized 30 meter resolution terrain data, outdated hydrologic and hydraulic methodology, and a homogeneous-distribution aggregated building stock loss estimation methodology
• Average AAL reported by NWS (verify): $6-8B
• Hazus AAL from the national study: $60B, roughly an order of magnitude higher
• Used year 2000 census data vs. 2010
[email protected]
[email protected]
Analyzing Flood Losses Using FEMA’s Hazus Flood Model
Aggregated vs. Site Specific Building Losses
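A toy contrast showing why the two approaches diverge. The depth-damage values and structures below are hypothetical; Hazus's actual depth-damage functions vary by occupancy, foundation type, and number of stories:

```python
# Illustrative contrast between aggregated and site-specific loss estimation.
# The ddf() table and structure values are made up for this sketch.

def ddf(depth_ft: float) -> float:
    """Toy depth-damage function: percent damage for a given flood depth."""
    table = [(0, 0.0), (1, 0.10), (3, 0.25), (6, 0.45), (10, 0.60)]
    pct = 0.0
    for depth, damage in table:
        if depth_ft >= depth:
            pct = damage
    return pct

# Site-specific: each structure gets its own depth and replacement value.
structures = [
    {"depth_ft": 0.5, "value": 200_000},
    {"depth_ft": 4.0, "value": 250_000},
    {"depth_ft": 9.0, "value": 180_000},
]
site_loss = sum(ddf(s["depth_ft"]) * s["value"] for s in structures)

# Aggregated: one representative depth applied to the block's total stock.
avg_depth = sum(s["depth_ft"] for s in structures) / len(structures)
agg_loss = ddf(avg_depth) * sum(s["value"] for s in structures)

print(f"site-specific loss: ${site_loss:,.0f}")  # captures per-structure variation
print(f"aggregated loss:    ${agg_loss:,.0f}")   # can over- or under-estimate
```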
[email protected]
[email protected]
Establishing a Framework for Successful Analytical
Processes
1) Build out an org chart of SMEs
2) Train and certify those SMEs; invest in new SMEs
3) Establish QA/QC processes
[email protected]
[email protected]
Who is providing SAPA and when?
Build out an organizational chart for contributors to SAPA
• Include names, titles, and organizational location; not just titles
• Establish roles/responsibilities for regional staff, MOTF, AGTS, EAD, HQ Recovery
• Build out a chart for regional and national hypothetical events; not just level 1, but levels 2 and 3
• Plan strategically for how this would look in the future (NHAP/institutionalizing the MOTF and other staffing initiatives), and how it would look next week
[email protected]
[email protected]
How do we standardize SAPA?
Develop New SMEs, and Maintain Current SMEs
• Identify key players for the future, and for next week
• Make sure they're qualified
• Train them
• Train them again
• Not just Hazus training!
• Focus on this in steady state, not only during response
• Certify and credential SMEs
• GISPs, ArcGIS Professional certification, Hazus certification
[email protected]
[email protected]
Establish Credibility in our Analytics
Consistency
• Establish predefined products we'll provide as a cadre
• Establish predefined product formats we'll provide as a cadre (GeoPlatform viewers, ArcGIS Operations Dashboards, static PDF/JPG map templates)
• Delivery – upon activation, don't wait for requests; implement our SOPs
[email protected]
[email protected]
Establish Credibility in our Analytics
Transparency
• Clearly document your loss estimation methods so they can be provided to our customers
• This includes data sources, impact estimation methodologies, and how recently your information was generated
• The "more details" page in the FEMA GeoPlatform provides a great place to do so
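As one illustration of what that documentation could capture, here is a hypothetical methodology record; the field names are made up, not a FEMA or GeoPlatform schema:

```python
# Hypothetical methodology record for a GeoPlatform "more details" page.
# All field names are illustrative; adapt to whatever template your region uses.
methodology = {
    "product": "Flood exposure estimates, County X",
    "generated": "2015-07-28T14:00Z",
    "hazard_source": "USGS stream gages / NWS forecast inundation",
    "exposure_method": "DFIRM-based exposure (structures intersecting the SFHA)",
    "inventory": "Hazus general building stock, 2010 census",
    "known_limitations": "Aggregated inventory; no depth-damage applied",
    "poc": "[email protected]",
}
```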
[email protected]
[email protected]
Establish Credibility in our Analytics
Validity
• Are you prepared to stand behind your analytics?
• Are you willing to publicly put your name on it?
• Are you willing to answer press inquiries?
• Are you documenting your methodology in detail in GeoPlatform?
• Are you practicing proper QA/QC on your numbers before they go out?
[email protected]
[email protected]
QA/QC processes are crucial! A second set of eyes on your results before
distributing can often catch easily identified issues.
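Some of those easily identified issues can even be caught automatically. A minimal sketch of pre-release sanity checks a reviewer might run; the field names and tolerance are hypothetical, not a FEMA schema:

```python
# Hypothetical pre-release sanity checks on a table of county-level results.
# Field names and the 1% reconciliation tolerance are illustrative.
import math

def qa_checks(rows: list[dict], state_total_loss: float) -> list[str]:
    """Return a list of human-readable QA/QC failures (empty list = pass)."""
    issues = []
    for r in rows:
        if r["loss_usd"] < 0:
            issues.append(f"{r['county']}: negative loss")
        if r["pop_affected"] > r["pop_total"]:
            issues.append(f"{r['county']}: affected population exceeds total")
    county_sum = sum(r["loss_usd"] for r in rows)
    if not math.isclose(county_sum, state_total_loss, rel_tol=0.01):
        issues.append("county losses do not reconcile with the state total")
    return issues

rows = [{"county": "County X", "loss_usd": 4_500_000,
         "pop_affected": 1_200, "pop_total": 40_000}]
print(qa_checks(rows, state_total_loss=4_500_000))  # [] means checks pass
```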
[email protected]
[email protected]
Questions?
[email protected]
[email protected]