Uncertainty, Assumptions and Sensitivity Analyses

Day 2 Session 1
Overview of practicals and end of week
presentations
Practical Exercises and End of
Week Presentations/Exams
This year's workshop aims to give you a greater capacity to:
1. Review an assessment paper and gain an understanding of its
strengths and weaknesses
2. Extract and interpret key information from stock assessment
papers to assist in fisheries management decision making
3. Gain confidence to discuss stock assessment with your
colleagues and peers in informal and formal fora.
We aim to do this through:
1. Theory sessions (acquiring the knowledge)
2. Practical sessions (applying the knowledge)
3. Group presentations (communicating your knowledge) – for
those who choose to do a presentation
The practicals will be used to systematically compile your end-of-week
presentations.
Practical Exercises and End of
Week Presentations
Through the practicals you will work in groups to:
1. Review model assumptions, uncertainty and fit.
a. Including devising questions for the scientists
2. Extract and interpret key management related information
3. Consider the implications of the assessment outcomes for
regional and national fisheries
4. Review and consider the implications (for the stocks) of the
Conservation and Management Measures that the
Commission has agreed to.
At the end of the week you will either:
A. Combine and present your findings to the group as a whole,
to demonstrate your understanding of all these issues.
B. Undertake an exam which will test your understanding of
these issues (particularly as they relate to your assessment)
Practical Exercises and End of
Week Presentations
The groups are:
1. Tom, Elaine, Lilis – Bigeye tuna (2009)
2. Aketa, Terry, Thomas and Steve – Skipjack tuna (2008)
3. Vanessa, Hau, Tony, Netani – Albacore tuna (2009)
The Commission has requested SPC revise the Bigeye tuna and
Skipjack tuna assessments in 2010, hence focusing on the
current assessments will prepare you well for understanding
the revised assessments to be presented in August
You can download these assessments from:
http://www.spc.int/oceanfish/html/sam/saw/2010/index.htm
Day 2 Session 2
How confident can we be in using the
outputs from an assessment to inform
fisheries management decision making
processes (Part 1)?
(A discussion of model fit, assumptions,
uncertainty, sensitivity analyses)
Overview
• Introduction
• Uncertainty
• Assumptions
• Structural changes to models
• Sensitivity analyses
• Model fit and maximum likelihood
Purpose – To get you to think critically about the assessments
and to increase your confidence to discuss and critically review
the SC and other stock assessment papers.
Session outputs – 1. Draft presentation section on
BET/ALB/SKJ model assumptions, sensitivity analyses,
uncertainty and model fit. 2. Draft questions for SC5.
Introduction
When you pick up and read a new stock assessment paper (e.g. at
SC), it is very important that you are able to determine whether that
assessment warrants confidence in using its outputs to assist you in
your fisheries management decision making processes.
In other words:
Can we use this assessment to help us make decisions?
It is a critical question, but how do you go about answering it? How
do you “Assess the assessment”?
There are a number of steps you can take to do this, but prior to doing
that it is very important that we discuss and understand a very critical
concept in stock assessment modelling:
Uncertainty
Model uncertainty
What is “uncertainty”? Some definitions:

• “The incompleteness of knowledge about the states and
processes in nature”
E.g., the true value or values of natural mortality, movement
rates and so on in the population.

• “The estimated amount by which an observed or calculated
value may differ from the true value”
E.g., confidence or credibility intervals, posterior
distributions, etc.

• “Lack of perfect knowledge of many factors that affect
stock assessments, estimation of biological reference points
and management”
E.g., cumulative error in reference points, variation in reference
points between subsequent model runs in a sensitivity
analysis, etc.
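The second definition describes uncertainty as an interval around an estimate. As a generic statistical sketch (not part of any SPC assessment), a bootstrap can turn a handful of observations into a confidence interval; the CPUE values below are invented for illustration.

```python
import random

random.seed(1)

# Hypothetical CPUE observations (fish per 100 hooks) -- made-up numbers
cpue = [12.1, 9.8, 14.3, 11.0, 10.5, 13.7, 8.9, 12.6, 11.8, 10.2]

def mean(xs):
    return sum(xs) / len(xs)

# Bootstrap: resample the data with replacement many times and
# recompute the statistic to approximate its sampling distribution.
boot_means = []
for _ in range(5000):
    sample = [random.choice(cpue) for _ in cpue]
    boot_means.append(mean(sample))

boot_means.sort()
lo = boot_means[int(0.025 * len(boot_means))]   # 2.5th percentile
hi = boot_means[int(0.975 * len(boot_means))]   # 97.5th percentile

print(f"point estimate: {mean(cpue):.2f}")
print(f"95% CI: ({lo:.2f}, {hi:.2f})")
```

Reporting the interval rather than the point estimate alone is precisely what the second definition asks of an assessment.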
Model uncertainty
Why do we need to consider uncertainty in assessment models?

Stock assessments contain uncertainty. Why? Because....

1. We do not have perfect understanding of natural systems (e.g. fish
populations) and the interacting processes that drive them.
2. We cannot collect data on such systems without some error.
3. Nature is stochastic (natural processes can vary randomly in a
manner that is difficult to predict accurately).

A model is only a mathematical/statistical representation of reality, and
how close a representation it is depends on our capacity to minimise
or express a number of different sources of error......
Model uncertainty
Some sources of model uncertainty:

• Measurement error
Variation or bias in our observed sample quantities (e.g. catches,
effort, sizes, tag reporting rates)

• Model structural error
Misspecification of the population model structure or of assumed model
parameters (e.g. SRR, M at age, growth parameters) due to a lack
of understanding of the underlying dynamics of the system being
considered.

• Process error
Additional variation in particular model processes or data sources
unaccounted for by sample variance estimates. Can arise from
natural variability (e.g. variation in recruitment for environmental
reasons).

• Estimation error
Variance in model estimates or reference points due to the
accumulated effects of any of the preceding causes.
Ref: http://www.fao.org/docrep/005/y1958e/y1958e03.htm
Model uncertainty
What are the implications of uncertainty for interpretation of
stock assessments?

1. Sources of uncertainty in a stock assessment do exactly that: they
make any estimated measure (point estimate), or output, derived
from the stock assessment less certain (i.e. less accurate or
precise than we might ideally wish for).
2. Does uncertainty in model estimates make the model “wrong”? No!
Uncertainty is inevitable (“nature is stochastic, our knowledge is
imperfect”). What can be considered “wrong” is an assessment that
does not explore, present estimates of, or discuss its uncertainty,
so that managers are explicitly aware of it.
3. How uncertainties in the data (“measurement error”), in the model
specification (“model and process error”), and in the model fitting
process (“estimation error”) are dealt with within the modelling
process, and how they are presented and discussed afterwards, is
very important for managers' ability to gain confidence in the use of
the assessment outputs.
Uncertainty
Why must we consider uncertainty?

Fully exploring and expressing uncertainty in an assessment model leads,
paradoxically, to greater certainty for managers regarding the most
appropriate management action to take to meet their management objective.

Example: the same stock status might be presented as
• a point estimate only,
• a point estimate with confidence intervals, or
• a set of structural sensitivity analyses.

Management responses might differ accordingly, e.g.
• Allow an increase in fishing effort
• Hold fishing effort at the current level
• Reduce fishing effort
So, how do you assess a
stock assessment model?
There are five key questions you should ask yourself:

1. Assumptions
What are the assumptions made by the assessment model?
2. Model structure
What structural changes have been made since the last stock
assessment?
3. Sensitivity analysis
Has a sensitivity analysis been undertaken to test the importance
and effect of each assumption or structural change?
4. Goodness of fit
How well does the model (or models) fit the data?
5. Uncertainty
Overall, how well has uncertainty been incorporated, represented
or discussed within the stock assessment?
ASSUMPTIONS
What are assumptions?
Definition of “Assumption”

A proposition or idea that is treated, for the sake of a given
discussion, as if it were known to be true (i.e. is taken for
granted).

This can be in the presence or absence of evidence to
support that assumption and therefore, by its very nature, an
assumption can be wrong!

Preferably, an assumption should be based upon data or
information indicating that it is more likely to be true than
any alternative assumption.
However, this is not always possible.
Assumptions
Why are assumptions made in stock assessments?
In stock assessment modelling, there are numerous instances
where scientists do not have exact measures of parameters,
or cannot guarantee the model structure (i.e. setup) used is
correct, and are forced to make assumptions about
parameter values and model structure, and program those
assumptions into the model.
For example: In the 2009 yellowfin tuna assessment, while
selectivity varies by gear type, an assumption is made that
selectivity does not vary over time…….do you believe that
selectivity of purse seine and longline has not varied over
time? Why or why not?
Assumptions
Assumptions are unavoidable!
What is critical is that those assumptions are explicitly stated and
recognised within the assessment paper (so we are all aware of them)
and if possible, sensitivity analyses run to test how important those
assumptions are to the management advice coming from the
assessment.
What is a sensitivity analysis?
It is undertaken by re-running the model with a different assumption
(a single structural or parameter value change), to determine whether
changing that assumption results in different advice to the fishery
managers (i.e. a different conclusion about stock status).

If it does, then clearly that assumption forms a critical source of
uncertainty in the model; if it doesn't, then it is less important.

i.e. How sensitive is the model to the assumption?
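The re-run logic can be sketched with a toy yield-per-recruit model. This is only a hedged illustration of the idea, not the actual MULTIFAN-CL procedure used for WCPO tunas, and every parameter value below is invented.

```python
import math

def yield_per_recruit(F, M):
    """Equilibrium yield per recruit for a toy age-structured stock.

    Fish recruit at age 1 and are fully selected from age 3; weight at
    age comes from a cubic of von Bertalanffy length. All parameter
    values are invented for illustration.
    """
    Linf, k, t0 = 100.0, 0.3, 0.0          # made-up VB growth parameters
    ypr, N = 0.0, 1.0                      # start with one recruit
    for age in range(1, 21):
        sel = 1.0 if age >= 3 else 0.0     # knife-edge selectivity
        Z = M + sel * F                    # total mortality rate
        length = Linf * (1 - math.exp(-k * (age - t0)))
        weight = 1e-5 * length ** 3        # length-weight relation
        # Baranov catch equation: catch from this age class over the year
        catch = N * (sel * F / Z) * (1 - math.exp(-Z))
        ypr += catch * weight
        N *= math.exp(-Z)                  # survivors to the next age
    return ypr

# The "sensitivity analysis": re-run with alternative assumed values of
# natural mortality M and see whether the advice (here, the fishing
# mortality that maximises yield per recruit) changes.
F_grid = [i / 100 for i in range(0, 151)]
for M in (0.2, 0.4, 0.6):
    best_F = max(F_grid, key=lambda F: yield_per_recruit(F, M))
    print(f"M = {M}: F maximising YPR ~ {best_F:.2f}")
```

If the recommended F shifts materially between the runs, the assumed value of M is a critical source of uncertainty; if it barely moves, it is less important.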
Assumptions
Types of assumption
Assumptions made within stock assessment models can be (mostly)
grouped into three categories, being those which relate to:
1. Biological parameters within the model
2. Fisheries data estimates and partitioning
3. Statistical components of the model
In the past we have used the terms structural assumptions and
biological assumptions, but because these overlap somewhat, it is
perhaps better to think in terms of the three categories listed
above.
Assumptions
Assumptions relating to biological parameters (examples):
Biological assumptions relate to the assumed values of biological
parameters or relationships expressed within the model. For example:
1. Natural mortality (M) – the scientists may assume that M varies by
age group but does not vary over time; a particular age-independent
value of M might also be assumed.
2. Maturity – it is often assumed that the maturity ogive does not vary
between areas nor over time.
3. Recruitment – the relationship between R and spawning biomass is
assumed to be weak/strong.
4. Growth – growth follows a VBGF (von Bertalanffy) curve.
5. Movement – the probability of capturing a tagged fish is the same as
that of capturing an untagged fish, with mixing of tagged and untagged
fish complete by time x. Tag reporting rates are constant over time.
Often assumptions relate to assumed invariance of parameters with
respect to size, age, time or area…..how likely do you think this is?
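The growth assumption in point 4 can be written out explicitly. This is a generic sketch of the von Bertalanffy growth function; the parameter values are invented for illustration, not taken from any tuna assessment.

```python
import math

def vbgf_length(age, L_inf, k, t0):
    """von Bertalanffy growth function: expected length at a given age.

    L_inf -- asymptotic length; k -- growth coefficient;
    t0 -- theoretical age at length zero.
    """
    return L_inf * (1.0 - math.exp(-k * (age - t0)))

# Illustrative (made-up) parameters: cm, per year, years
L_inf, k, t0 = 150.0, 0.25, -0.5

for age in range(0, 11, 2):
    print(f"age {age:2d}: {vbgf_length(age, L_inf, k, t0):6.1f} cm")
```

Note what the assumption rules out: a single curve, fixed over time and area, with no individual variation; questioning any of those restrictions is exactly the kind of query worth putting to the assessment scientists.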
Examples of assumptions made in YFT SA?
Assumptions relating to fisheries data and statistical
estimation of parameters:
In general, assumptions are made that the data collected from the
fishery relating to catch, effort and fish sizes are representative of the
actual catches, fishing effort etc. For example:
1. Size data – the available size data are representative of the relevant
fisheries (e.g. PH/ID)
2. Catch data – JP observer and unloading data are representative
of JP fisheries in the various model regions. The standard deviation
is assumed to be very small.
3. Effort data – e.g. in past assessments, for the PH/ID fisheries effort
was assumed proportional to catch (but the variance was set high to
compensate for failure of this assumption)
4. Spatial structure (e.g. regions) – the spatial structure chosen
adequately separates fisheries into units with distinct catchability
and selectivity characteristics etc.
What assumptions are made in SAs? (YFT, 2009)
Table 2 – Assumptions, which, interpreted, mean:

• The observed (reported) catches are very accurate.
• Probability distributions of the L-F data are normally distributed.
• Probability distributions of the W-F data are normally distributed
(a statistical assumption).
• The probability of a tag recapture being reported is the same in
every region.
• Tagged fish will have mixed randomly throughout the model region
within 6 months.
• Spawning can occur throughout the year. Recruitment estimates rely
predominantly on the size and CPUE data, although a SRR is
specified in the model. Recruitment can occur anywhere.
• The model estimates growth of the smallest fish with full freedom,
but estimates of growth of older fish are constrained to follow a VBGF.
• The size selectivity of the different fishing gears (LL, PS, PL etc.) has
never changed over time and, within the LL and PS gear types, is
assumed to be the same between model regions.
• Any changes in catchability over time for LL have been accounted for
by the CPUE standardisation. The model is allowed to estimate
catchability variation for the other fisheries (including seasonal and
annual variation).
• Natural mortality varies by age but not over time.
• Fish movement patterns are the same regardless of age, and vary
between quarters but not over years.
Examples of assumptions made
in SAs
Table 2 (starting page 40) of the 2009 Yellowfin assessment lists the
main structural assumptions for that assessment…...please open your
YFT paper to page 40.
Take 15 minutes to read through these and make notes in relation to
the following questions:
1. Which of these assumptions might be wrong?
2. How or why might they be wrong?
3. Can you think of a way scientists might be able to independently
test any of these assumptions?
4. Are you aware of any other assumptions made in the assessment
which are not stated in the table?
What assumptions are made in SAs? (BET,
2009)
What assumptions are made in SAs?
(ALB, 2009)
What assumptions are made in SAs?
(ALB, 2008)
What are the impacts of
assumptions?
Positive
• Simplifies the model (easier to run and interpret)
• Reduces the number of parameters to be estimated
• Reduces the amount of data required
Negative
• Risk of making an incorrect assumption
• Risk of major impact on the conclusions and outcomes
of an assessment
Discussion of assumptions
made in the Bigeye, Albacore
and Skipjack Assessments
With your discussion group, and referring to your group's chosen
assessment paper, undertake the following exercise and discussions:
1. Read and familiarise yourselves with the executive summary of the
assessment paper, so that you have a general feel for the
assessment and its main conclusions
2. Examine the table of assumptions in the assessment paper. With
your group, discuss and identify at least 4 assumptions that you
feel could be incorrect or are the most uncertain.
a. Explain your reasoning why.
b. How would you query this at the Scientific Committee? For each
assumption, draft a question or comment you would like to put
to the assessment scientists.
c. Can you think of any way in which the assumption could be
tested outside the model?
Assumptions
How can we test the importance of different assumptions to
the “end” management advice?
Given that there are potentially large risks to fisheries management
decision making associated with making wrong assumptions
within stock assessment models, how can we determine which
assumptions hold the greatest risk? The least risk?
In general, the best approach is to test your assumptions via either:
a. Specific data analyses outside the model
b. Sensitivity analyses using the model
The key point is to look at the assumptions and identify those which
you feel are either unreasonable (if any) or will have the greatest
impact on the key management outputs, and either ask for
clarification/justification of those assumptions or suggest
sensitivity analyses be undertaken.
STRUCTURAL CHANGES
Structural Changes
What is meant by “structural change”?
A stock assessment for a given species will tend to change and evolve over
time, as the scientists find ways of better representing the fish population
and fishery through the model.
The WCPO tuna assessments nearly always incorporate a suite of structural
changes to the assessment model between each assessment cycle.
Structural change made to a stock assessment model typically involves
changes to either:
1. Data and data partitioning
2. Assumptions regarding key parameters and population or fishery
processes (e.g. Stock recruitment relationship chosen, growth curve etc)
Structural Changes
What is meant by “structural change”?
Data and data partitioning
1. New data added (sometimes defining a new fishery)
2. Data removed (found to be erroneous)
3. Data grouping changed (fisheries split or joined)

Often model structure will change over time as new data become available,
new fisheries start up, or new information regarding appropriate changes
to model structure becomes available.
Structural Changes
What is meant by “structural change”?
Such changes have the potential to impact model fit and key model
outputs, so it is important to identify what changes have been
made and their impact on parameter estimates, model fit, and end
management relevant outputs (e.g. reference points).
For example: The 2009 YFT Assessment
A significant number of structural changes were made to that
assessment (relative to the previous assessment) and these
changes had significant impacts on biological estimates (biomass
etc).
Structural Changes – YFT 2009
Structural Changes – YFT 2009
Fix steepness of SRR at 0.75
Structural Changes – YFT 2009
Revised natural mortality at age
Structural Changes – YFT 2009
Revised maturity ogive
Structural Changes – YFT 2009
Revised Purse Seine Catches
Structural Changes – YFT 2009
Revised Standardised
CPUE series
Discussion of structural changes
made in the Bigeye, Albacore and
Skipjack Assessments
With your discussion group, and referring to your group's chosen
assessment paper, take 20 minutes to undertake the following exercise
and discussions:
1. Identify in the paper text where it summarises structural changes
made to the model, relative to the previous model. Cut and paste
the relevant graphics that describe 3 to 4 of these changes, into a
powerpoint presentation, providing a very brief text description of
these changes on each slide.
2. Come back to the workshop and describe (verbally) to the other
participants the (selected) changes made in your assessment.
SENSITIVITY ANALYSES
What is the purpose of
undertaking sensitivity analyses?
• In stock assessment modelling, if there is either:
a) some uncertainty pertaining to a particular parameter
value set or estimated within the model, or to an
assumption made in the model, or
b) a structural change to the model (e.g. due to new fishery
data becoming available, fisheries being split, or new estimates of
biological parameters or relationships, etc.)..
…..then scientists will typically undertake what is called a “sensitivity
analysis”.
What is the purpose of
undertaking sensitivity analyses?
In case a) the sensitivity analysis might involve re-running the assessment
with both higher and lower values of the uncertain parameter, or re-running
it under a slightly different assumption.
In case b) the sensitivity analysis might involve running the model both
with and without the structural change.
The scientists and managers can then look at the difference in model fit
(between the old and new model), and at the impact of the changes upon
the biological reference points (BRPs) and the scientific advice provided
to the fisheries managers.
What is the purpose of undertaking
sensitivity analyses?
If there are no significant changes to model fit or BRPs, it might be deduced
that while there is uncertainty around the parameter value or assumption, it
does not influence the end advice to fisheries managers.
That is, the outputs and conclusions of the stock assessment are not greatly
impacted by uncertainty in the level of this variable.
However, in some instances the reference points and management advice are
impacted by such changes.
How would you interpret stock status from this plot?
This is critical information for managers when considering how to use assessment
outputs in their decision making.
What is the purpose of undertaking
sensitivity analyses?
With respect to data-related structural changes, such as splitting
or combining fisheries, such changes are typically only made if
there is some evidence that the change will improve the fit of the
model.
As such, if these types of change do not improve model fit, the
scientists may revert to the previous model which did not
have those changes.
Therefore, sensitivity analyses are used to test assumptions and
changes to model structure and data inputs.
Sensitivity analyses (e.g. MLS 2006)
• For example, how do different starting values of M influence BRPs?
• Outputs are compared both qualitatively and quantitatively.

[Figure: equilibrium yield (mt, 0–3,500) plotted against the effort
multiplier (Fmult, 0.0–5.0) for five model runs: low steepness, M=0.4;
high steepness, M=0.4; low steepness, M=0.2; low steepness, M=0.6; and
low steepness, low k]

• How much impact are these different analyses likely to have on
management advice?
Sensitivity analyses – Final outputs
• RPs from different analyses are presented

Table 1. Estimates of management quantities for the various model options. The highlighted
rows are ratios of comparable quantities at the same point in time (black shading) and ratios of
comparable equilibrium quantities (gray shading). A tilde (~) marks an equilibrium quantity.
Model options: (1) M=0.4, uninformative prior on steepness; (2) M=0.2, uninformative prior
on steepness; (3) M=0.6, uninformative prior on steepness; (4) M=0.4, informative prior on
steepness; (5) M=0.4, low k, uninformative prior on steepness.

| Management quantity     | Units      | (1)    | (2)    | (3)    | (4)    | (5)    |
| ~YFcurrent              | t per year | 2,590  | 2,202  | 2,776  | 2,844  | 2,537  |
| ~YFMSY (or MSY)         | t per year | 2,610  | 2,622  | 2,918  | 3,003  | 2,555  |
| ~B0                     | t          | 31,300 | 36,660 | 29,910 | 22,640 | 33,390 |
| ~BFcurrent              | t          | 12,000 | 6,524  | 17,140 | 11,360 | 16,830 |
| ~BMSY                   | t          | 13,800 | 12,970 | 13,890 | 8,831  | 15,610 |
| ~SB0                    | t          | 27,300 | 34,810 | 24,500 | 20,100 | 26,650 |
| ~SBFcurrent             | t          | 9,300  | 5,373  | 12,970 | 8,973  | 11,570 |
| ~SBMSY                  | t          | 10,900 | 11,450 | 10,200 | 6,568  | 10,550 |
| Bcurrent                | t          | 9,700  | 5,576  | 12,970 | 7,924  | 13,811 |
| SBcurrent               | t          | 7,400  | 4,489  | 9,463  | 6,055  | 8,692  |
| Bcurrent,F=0            | t          | 18,400 | 23,778 | 18,409 | 16,557 | 22,219 |
| Bcurrent / ~B0          | -          | 0.31   | 0.15   | 0.43   | 0.35   | 0.41   |
| Bcurrent / ~BFcurrent   | -          | 0.81   | 0.85   | 0.76   | 0.70   | 0.82   |
| Bcurrent / ~BMSY        | -          | 0.70   | 0.43   | 0.93   | 0.90   | 0.88   |
| Bcurrent / Bcurrent,F=0 | -          | 0.53   | 0.23   | 0.70   | 0.48   | 0.62   |
| SBcurrent / ~SB0        | -          | 0.27   | 0.13   | 0.39   | 0.30   | 0.33   |
| SBcurrent / ~SBFcurrent | -          | 0.80   | 0.84   | 0.73   | 0.67   | 0.75   |
| SBcurrent / ~SBMSY      | -          | 0.68   | 0.39   | 0.93   | 0.92   | 0.82   |
| ~BFcurrent / ~B0        | -          | 0.38   | 0.18   | 0.57   | 0.50   | 0.50   |
| ~SBFcurrent / ~SB0      | -          | 0.34   | 0.15   | 0.53   | 0.45   | 0.43   |
| ~BMSY / ~B0             | -          | 0.44   | 0.35   | 0.46   | 0.39   | 0.47   |
| ~SBMSY / ~SB0           | -          | 0.40   | 0.33   | 0.42   | 0.33   | 0.40   |
| FMSY                    | -          | 0.19   | 0.20   | 0.21   | 0.34   | 0.16   |
| Fcurrent / FMSY         | -          | 1.25   | 2.50   | 0.63   | 0.50   | 0.83   |
| ~BFcurrent / ~BMSY      | -          | 0.87   | 0.50   | 1.23   | 1.29   | 1.08   |
| ~SBFcurrent / ~SBMSY    | -          | 0.85   | 0.47   | 1.27   | 1.37   | 1.10   |
| ~YFcurrent / MSY        | -          | 0.99   | 0.84   | 0.95   | 0.95   | 0.99   |
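The Fcurrent/FMSY row is the one managers read first: a ratio above 1 indicates overfishing. A minimal sketch of that reading, using the table's own ratio values (the run labels are shorthand for the model option descriptions):

```python
# Fcurrent/FMSY from Table 1 (MLS 2006 sensitivity runs); a ratio
# above 1.0 means current fishing mortality exceeds FMSY (overfishing).
runs = {
    "M=0.4, uninformative steepness": 1.25,
    "M=0.2, uninformative steepness": 2.50,
    "M=0.6, uninformative steepness": 0.63,
    "M=0.4, informative steepness": 0.50,
    "M=0.4, low k": 0.83,
}

for name, ratio in runs.items():
    status = "overfishing" if ratio > 1.0 else "no overfishing"
    print(f"{name}: F/FMSY = {ratio:.2f} -> {status}")
```

Two of the five runs indicate overfishing and three do not, which is exactly why the assumed value of M is a critical source of uncertainty in this assessment.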
Sensitivity analyses
Sensitivity analyses help identify priority research areas
Sensitivity analyses also help indicate the areas of the
assessment where more information would result in a more robust
assessment.
For example, if different M values had a significant impact on the
outcome of an assessment, then more research into estimating
M would be recommended.
If different M values had little impact on the outcome of an
assessment, then more research into M would provide little
improvement to the assessment; other areas could be considered
for improvement.
This could not be clearly decided unless sensitivity analyses were
undertaken.
Sensitivity analyses
Current example: 2009 yellowfin tuna assessment
The 2009 YFT assessment ran numerous sensitivity analyses to test
the effects of structural changes to the model, of parameter
uncertainties, and of various assumptions within the model, on
BRPs and management outputs.
Sensitivity Analyses from 2009 YFT
Sensitivity Analyses from 2009 YFT
Very significant changes can occur to model estimates of
biological parameters as a result of structural changes to
the models
Running sensitivity analyses across all changes can help
determine which changes have the most significant effect
on model outputs
Overall, of the key structural changes made to the model,
it is the increased purse seine catch that has the
biggest impact upon lowering the biomass estimate. The
other changes had relatively little effect. Assumptions
regarding whether the model should rely on the size or
CPUE data are also important
Sensitivity Analyses from 2009 YFT
Differences in fishing impacts
between different sensitivity
analyses

None of the assumptions or
changes explored (here) had a
very large effect on the expected
depletion levels in the fishery.
Later sensitivity analyses
investigating different levels of
steepness (of the stock
recruitment relationship) did.
Sensitivity Analyses from 2009 YFT
Higher Fmult with
higher steepness
of SRR.
Higher MSY with
higher steepness
of the SRR.
Sensitivity Analyses from 2009 YFT
So what do the
results of all these
sensitivity analyses
tell us about the
status of the
stock??
How do we interpret
plots like this?
Which factor
appears to have the
greatest effect upon
the estimates of
stock status?
Sensitivity Analyses from 2009 YFT
How likely is it that overfishing is
occurring or that the stock is overfished?
Discussion of sensitivity analyses in
the Bigeye and Albacore Assessments
Discuss and provide answers to the following questions:
1. From your assessment papers, describe (in general terms, without
listing each one) the types of structural sensitivity analyses
undertaken (e.g. new data, new fisheries, etc.)
2. Which of these changes appear to have had the largest impact on
model estimates of key parameters (biomass, recruitment etc) and
on the biological reference points?
3. Look at the full range of sensitivity analyses undertaken to determine
which parameters appeared to have the greatest influence on stock
status and BRP estimates. Identify which of the biological
assumptions or uncertainties had the greatest influence, and which
of the data uncertainties had the greatest influence.
4. What can you conclude from those analyses regarding the condition
of the stock?