Aerospace technology

Prediction of Computational Quality
for Aerospace Applications
Michael J. Hemsch, James M. Luckring, Joseph H. Morrison
NASA Langley Research Center
Elements of Predictability Workshop
November 13-14, 2003
Johns Hopkins University
Outline
• Breakdown of the problem (again) with a
slight twist.
• The issue for most of aerospace is that
non-computationalists are doing the
applications computations.
• What are they doing now? What can we
do to help?
Breakdown of tasks

Experimentation:
• Off-line: traceability to standards (calibration of instruments)
• Off-line: random error characterization using standard artifacts (measuring the measurement system)
• Off-line: systematic error characterization (discrimination testing of the measurement system)
• QA checks against the above measurements during customer testing → process output of interest

Computation:
• Off-line: traceable operational definition of the process (verifying that the coding is correct)
• Off-line: characterization of process variation using standard problems (measuring the computational process; see the sketch below)
• Off-line: systematic error characterization (model-to-model and model-to-reality discrimination)
• QA checks against the above measurements during computation for the customer (solution verification) → process output of interest
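To make the "characterization of process variation using standard problems" item concrete, here is a minimal sketch of a statistical-process-control calculation. The data, the control_limits helper, and its constants are illustrative assumptions, not material from the slides: the idea is simply to derive control limits from repeated runs of one standard case.

```python
import statistics

def control_limits(values):
    """Individuals control-chart limits from repeated runs of a
    standard problem (Shewhart moving-range method, subgroups of 2)."""
    center = statistics.fmean(values)
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    sigma_hat = statistics.fmean(moving_ranges) / 1.128  # d2 for n = 2
    return center - 3 * sigma_hat, center, center + 3 * sigma_hat

# Hypothetical drag coefficients from repeated runs of one standard
# problem (different users, grids, code versions).
cd_runs = [0.02834, 0.02851, 0.02847, 0.02829, 0.02856, 0.02841]
lcl, center, ucl = control_limits(cd_runs)
print(f"center = {center:.5f}, control limits = [{lcl:.5f}, {ucl:.5f}]")
```

A later result falling outside these limits signals assignable-cause variation in the computational process for that class of problem.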
The key question for applications:
“How is the applications person going to
convince the decision maker that the
computational process is good enough?”
Our tentative answer, based on observing aero engineers trying to use
CFD on real-life design problems, is that it is the quantitative
explanatory force of an approach that creates acceptance.
• How can quantitative "explanatory force" be provided?
• Break this down into two questions:
– How do I know that I am predicting the right physics at the
right place in the inference space?
– How accurate are my results if I do have the right physics at
the right place in the inference space? (One way to quantify the
numerical-error part is sketched below.)
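For the second question, the discretization-error component is commonly quantified with a grid-convergence study. The sketch below is a minimal illustration with invented lift-coefficient values and hypothetical helper names (observed_order, richardson_extrapolate); it assumes monotone convergence on three systematically refined grids.

```python
import math

def observed_order(f_coarse, f_medium, f_fine, r):
    """Observed order of accuracy from three grid levels with a
    constant refinement ratio r (assumes monotone convergence)."""
    return math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)

def richardson_extrapolate(f_medium, f_fine, r, p):
    """Estimate of the grid-converged value from the two finest grids."""
    return f_fine + (f_fine - f_medium) / (r**p - 1)

# Hypothetical lift coefficients on coarse, medium, and fine grids
# with a grid refinement ratio of 2.
f3, f2, f1, r = 0.6102, 0.6048, 0.6035, 2.0
p = observed_order(f3, f2, f1, r)
f_star = richardson_extrapolate(f2, f1, r, p)
print(f"observed order p = {p:.2f}, extrapolated value = {f_star:.4f}")
```

The gap between the fine-grid value and the extrapolated one serves as a discretization-error estimate to report with the prediction; the model-form question still has to be addressed separately.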
[Figure: Airfoil Stall Classification]
[Figure: Boundaries Among Stall Types]
• The applications person needs a process that can be
– Controlled
– Evaluated
– Improved
(i.e., a predictable process)
Creating a predictable process …

Controllable input (assignable-cause variation): geometry, flight conditions, etc.
→ Process →
Predicted coefficients, flow features, etc.

Uncontrolled input from the environment also enters the process
(variation that we have to live with, e.g. numerics, parameter
uncertainty, model-form uncertainty, users).
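To make the diagram concrete, the sketch below propagates one uncontrolled input through a toy stand-in for the process by Monte Carlo sampling. The process function and every number in it are invented for illustration; the point is only the separation of fixed controllable inputs from sampled uncontrolled ones.

```python
import random
import statistics

def process(alpha_deg, model_param):
    """Toy stand-in for the computational process: maps a controllable
    input (angle of attack) and an uncertain model parameter to a
    predicted lift coefficient. Purely illustrative."""
    return 0.11 * alpha_deg + 0.5 * (model_param - 0.09)

random.seed(1)
alpha = 4.0  # controllable input, held fixed (deg)
# Uncontrolled input: an uncertain turbulence-model parameter.
samples = [process(alpha, random.gauss(0.09, 0.005))
           for _ in range(10_000)]
mean = statistics.fmean(samples)
spread = statistics.stdev(samples)
print(f"predicted CL = {mean:.4f} +/- {2 * spread:.4f} (2-sigma)")
```

The resulting spread is the process variation the applications person has to report alongside the prediction.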
Critical levels of attainment for a predictable process
• A defined set of steps
• Stable and replicable
• Measurable
• Improvable
What it takes to have an impact ...
• Historically, practitioners have created their
designs (and the disciplines they work in) with
very little reference to researchers.
• Practitioners who are successfully using aero
computations already know what it takes to
convince a risk taker.
• If we want to have an impact on practitioners,
we will have to build on what they are already
doing.
What it takes to have an impact ...
• Good questions:
– Are researchers going to be an integral
part of the applications uncertainty
quantification process or are we going
to be irrelevant?
– What specific impact on practitioners
do I want to have with a particular
project?
– What process/product improvement am
I expecting from that project?
What it takes to have an impact ...
• We can greatly improve, systematize and
generalize the process that practitioners are
successfully using right now.
• The key watchwords for applications are:
– practicality, as in mission analysis and design
– alacrity, as in "I want to use it right now."
– impact, as in "Will my customer buy in?" and "Am I
willing to bet my career (and my life) on my
prediction?"
Actions
• Establish working groups like the AIAA Drag
Prediction Workshop (DPW)
– Select a small number of focus problems
– Use those problems
» to demonstrate the prediction uncertainty strategies
» to find out just how tough this problem really is
• For right now …
– Run multiple codes, different grid types, multiple models, etc.
(see the sketch after this list)
– Work data sets that fully capture the physics of the
application problem of interest.
– Develop process best practices and find ways to control and
evaluate them.
– Develop experiments to determine our ability to predict
uncertainty and to predict the domain boundaries where the
physics changes.
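For the "run multiple codes, different grid types, multiple models" item above, one simple way to turn ensemble scatter into a number is a robust location-and-spread summary. The sketch below uses invented drag values and a made-up labeling scheme; it is not DPW data, just an illustration of summarizing a collective of predictions.

```python
import statistics

# Hypothetical drag counts (1 count = 0.0001 CD) for one test case
# from an ensemble of codes, grid types, and turbulence models.
predictions = {
    "codeA-structured-SA":    285.1,
    "codeA-structured-SST":   288.4,
    "codeB-unstructured-SA":  283.7,
    "codeB-unstructured-SST": 290.2,
    "codeC-structured-SA":    286.9,
    "codeC-cartesian-SST":    292.5,
}

values = sorted(predictions.values())
median = statistics.median(values)
# Robust spread: median absolute deviation, scaled so it is comparable
# to a standard deviation for roughly normal scatter.
mad = statistics.median(abs(v - median) for v in values)
sigma_robust = 1.4826 * mad
print(f"ensemble median = {median:.1f} counts, "
      f"robust scatter ~ +/- {2 * sigma_robust:.1f} counts (2-sigma)")
```

Outliers relative to this band point at assignable causes (a grid, a model, a user) worth investigating before the collective is used to set expectations for a customer.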
Breakout Questions/Issues
1. Defining predictability in the context of the application
2. The logical or physical reasons for lack of predictability
3. Possibility of isolating the reducible uncertainties with a view
to dealing with them (either propagating them or reducing them)
4. The role of experimental evidence in understanding and
controlling predictability
5. The possibility of gathering experimental evidence
6. The role that modeling plays in limiting predictability
7. Minimum requisite attributes of predictive models
8. The role played by temporal and spatial scales, and possible
mitigating actions and models