Quality Assurance of the FARMSCOPER Model

Phase 1 report: Review of FARMSCOPER documentation against model
evaluation criteria.
Defra Contract Ref: SCF0113
3 April 2015 (revised 11 May 2015)
Robert Willows and Paul Whitehead
Water Resource Associates
SUMMARY
FARMSCOPER represents the integration of a substantial body of agricultural research, data and
component models that have previously been used in the development and analysis of government
policy. It represents a considerable achievement.
We have reviewed the available FARMSCOPER documentation against USEPA guidance on good
practice in model development and evaluation (USEPA, 2009). We find good evidence that the
development and implementation of FARMSCOPER meets many aspects of this guidance. In reaching
this conclusion we make allowance for FARMSCOPER not being intended for use in a regulatory
context.
The majority of FARMSCOPER documentation has been generated as project reports as part of the
model development process. In addition there are other reports (e.g. Newell-Price et al. 2012)
containing important source data and methods. These reports are of a high standard: they contain
much of the information that the USEPA (2009) recommend should be available on model purpose,
design and development, quality assurance and performance. However, the reports describe the
staged development of FARMSCOPER and hence the information is somewhat fragmented. They
don't provide a definitive set of documentation to which users or other interested parties could
refer.
FARMSCOPER and its many component models are published in the scientific literature. This
provides evidence of peer review and scientific pedigree.
The model description provided for users1 is commendably succinct. In our view it provides naive
users with a good introduction to FARMSCOPER. But it does not provide the comprehensive
documentation that users should have available to inform their use of FARMSCOPER and aid their
interpretation of results. The published literature is widely scattered and is unlikely to be read by the
majority of users.
FARMSCOPER is freely available, yet the range of intended applications is somewhat opaque. We
recommend that Defra clarify the decision context(s) in which FARMSCOPER may be used. If FARMSCOPER is
primarily a tool for communicating the current evidence base on farm pollutant losses, the
effectiveness of mitigation measures, their costs and co-benefits, then the current documentation is
probably adequate.
If, however, FARMSCOPER is to be relied upon to provide advice on which decision-makers (e.g.
farmers, catchment managers, policy-makers) will rely and act, then we would recommend
that a more complete set of model documentation should be developed. A model description should
be informed by the decision context(s) to which the framework may be applied. It should include a
more complete account of the component models, by FARMSCOPER workbook, and a more
comprehensive evaluation of the consequences of assumptions and other uncertainties on
FARMSCOPER outputs, taking into account specific decision contexts. The resulting evaluation and
documentation would help inform a peer review of the FARMSCOPER system. Members of the
project team, together with suitable independent external experts, could undertake this review. It
should also be published.

1 ADAS (2014a), available for download from the FARMSCOPER pages of the ADAS website.
The model user guide should be expanded to provide guidance on how the software should or
should not be used for specific types of advice or decision making, including the use and
interpretation of uncertainty analyses. This should cross reference the model documentation. The
documentation should be updated as new evidence is assembled as part of a model development
plan.
Further, specific recommendations are made within this report.
Contents

SUMMARY
1. INTRODUCTION
2. THE DEVELOPMENT AND EVALUATION OF ENVIRONMENTAL MODELS
2.1 Decision context: Problem and user specification
2.2 Model Development
2.3 Model Evaluation
2.3.1 Peer Review
2.3.2 Quality Assurance
2.3.3 Model Corroboration, sensitivity and uncertainty analysis
2.4 Model Application or User Documentation
3. REVIEW OF FARMSCOPER DOCUMENTATION AGAINST SELECTED EVALUATION CRITERIA
3.1 Decision context: problem, users and user specification
3.2 Model development
3.3 Evidence of peer review
3.4 Quality assurance, corroboration and uncertainty
3.5 User documentation
REFERENCES
1. INTRODUCTION
A wide variety of models are developed within the environmental sciences to synthesize
understanding of complex natural systems and to aid decision makers. These models provide a basis
for analysis, prediction and hypothesis testing, for forecasting, and ultimately for the provision of
scientifically based, rational advice.
A model can be defined as "simplification of reality that is constructed to gain insights into select
attributes of a particular physical, biological, economic, or social system" (NRC, 2007). Alternatively,
Beven (2009) describes a model as "A set of constructs, derived from explicit assumptions, about
how a system responds to specified inputs. Quantitative models are normally expressed as sets of
assumptions and mathematical equations and implemented as computer codes". Hence,
computational models include measurable variables, numerical inputs ("data"), and theoretical or
empirical mathematical relationships and supporting assumptions, to provide quantitative outputs
(NRC, 2007).
FARMSCOPER is described as "a decision support tool that allows the assessment of the cost and
effectiveness of mitigation methods against multiple pollutants and multiple targets" (Gooday et al.,
2014). FARMSCOPER is also variously presented as a model or model framework. In FARMSCOPER
the outputs of interest, resulting from farm activities, are:
• the loss of a range of potential pollutants2 (e.g. nitrate, phosphorus, ammonia, nitrous oxide,
FIOs);
• the change in some other potential environmental benefits (e.g. biodiversity, soil structure,
energy use); and
• farm costs and production.
In particular the model calculates the effect of farm scale mitigation options on these outputs, taking
account of the nature of the farm, soils and climate. The ability to determine optimal sets of
mitigation options to reduce pollutant loss from amongst the wide number of potential available
measures is an important feature of the model. FARMSCOPER 3 allows calculations to be undertaken
on populations of farms, of variable types, within user-defined areas such as catchments or River
Basin Districts (RBDs).
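The kind of farm-scale calculation described above can be illustrated with a deliberately simplified sketch. This is not FARMSCOPER's actual implementation: the pollutants, baseline losses, mitigation methods and effectiveness values below are all invented for illustration, and the assumption that method effects combine multiplicatively and independently is ours.

```python
# Illustrative only: a toy calculation of the kind a farm-scale mitigation
# tool performs. All numbers are invented; FARMSCOPER's actual equations,
# coefficients and workbook structure differ.

# Hypothetical baseline annual losses for one farm (kg/ha).
baseline = {"nitrate": 30.0, "phosphorus": 1.2, "ammonia": 15.0}

# Hypothetical mitigation methods: fractional reduction per pollutant.
methods = {
    "cover_crops":   {"nitrate": 0.20, "phosphorus": 0.05, "ammonia": 0.00},
    "buffer_strips": {"nitrate": 0.05, "phosphorus": 0.30, "ammonia": 0.00},
}

def mitigated_losses(baseline, methods, selected):
    """Apply the selected methods multiplicatively, assuming independent effects."""
    losses = {}
    for pollutant, load in baseline.items():
        remaining = 1.0
        for name in selected:
            remaining *= 1.0 - methods[name].get(pollutant, 0.0)
        losses[pollutant] = load * remaining
    return losses

result = mitigated_losses(baseline, methods, ["cover_crops", "buffer_strips"])
for pollutant, load in sorted(result.items()):
    print(f"{pollutant}: {load:.2f} kg/ha")
```

Searching over subsets of methods for the cheapest combination meeting a loss target would then be the optimisation step the text refers to.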
Hence it is clear that FARMSCOPER is a model, sensu NRC (2007). However, we prefer the term
"model framework" because FARMSCOPER represents the results of a very considerable body of
empirical data, analyses of data, expert judgement and, importantly, the outputs of a large number
of other models including, for example, PSYCHIC (Davison et al., 2008; Strömqvist et al., 2008),
NEAP-N (Lord and Anthony, 2000) and SPREADS (Gibbons et al., 2004). Generally it is the output(s) of
these models, rather than the models themselves, that have been incorporated within FARMSCOPER
and on which the Excel spreadsheet application undertakes calculations specified by the user.
2 We generally use the term "pollutant" in this report, rather than "potential pollutant", but
recognise that in many circumstances there may be little or no environmental harm associated with
the levels of losses from individual farms.
All models are approximate descriptions of real systems, include data of variable or uncertain
quality, and make assumptions that may not always be met (wholly or in part) by the particular
application. Hence all models are imperfect. Naive users may ask whether a model has been
validated. However, modelling best practice has recognised that model validation is not compatible
with the scientific method (e.g. Oreskes et al., 1994). Instead models are regarded as imperfect
tools, which, based on careful evaluation and an informed judgement, may be accepted as useful
(Beven, 2009). Therefore a key question for the user of any model, or anyone affected by decisions
based on its use, is the extent to which the model or model framework can be considered to be
sufficiently reliable and trustworthy.
The USEPA have developed guidance for the effective development, evaluation, and use of models
in environmental decision-making (USEPA, 2009). It recommends best practice to help model users,
who may not be involved in model development, form judgements regarding the strengths and
limitations of a model and hence "determine when a model, despite its uncertainties, can be
appropriately used to inform a decision". Specifically, it recommends that model developers and
users:
• subject their model to credible, objective peer review;
• assess the quality of the data3 they use;
• corroborate their model by evaluating the degree to which it corresponds to the system
being modelled; and
• perform sensitivity and uncertainty analyses.
Here we aim to determine the extent to which FARMSCOPER (version 3) meets the
recommendations of the USEPA guidance on the development, evaluation and application of
environmental models. We do this from a user perspective, that is, based on the evidence available
within the published documentation.
It is important to note that the USEPA guidance has a particular focus on models used to inform
regulatory decisions, that is, decisions that may require justification in a court of law. FARMSCOPER
is not being used in a regulatory context, so we should not expect it to meet fully all the
recommendations of the USEPA (2009) guidance. Model evaluation is a relatively new discipline and
we note that few models, including those currently used in a regulatory context in the UK, have been
subject to such systematic scrutiny.
Following the USEPA guidance (see Box 1), the report is divided into three sections. Section 2
provides a short description of the general process of developing model-based decision tools, the
main steps involved and issues that need to be considered. Section 3 provides a review of the
documentation available to users of FARMSCOPER against the main recommendations of the USEPA
(2009) guidance on the documentation of environmental models, where it is considered appropriate
for the FARMSCOPER application.
Part 2 of this report will be based on undertaking a number of runs of FARMSCOPER. This will
contribute to the quality assurance of the model framework and its software implementation.
3 Data can be taken to mean the output of other models.
2. THE DEVELOPMENT AND EVALUATION OF ENVIRONMENTAL MODELS
Following the USEPA guidance (see Box 1), the development of decision-making tools can be
represented by a four-phase process that is likely to be iterative, namely:
• Problem specification - aims to determine the decision context, i.e. who will use the
model application, what outputs are required, and how they will be presented and used.
• Model development - covers agreement on the underlying science and conceptual
model, its mathematical representation and its implementation in software.
• Model evaluation - covers the testing of the model, quality assurance of the
implementation, and its corroboration with independent data (often termed "validation"
and "verification").
• Model application - the use of the model in decision-making.
2.1 Decision context: Problem and user specification
The first step in problem identification is to identify what questions the model needs to answer.
"Problem identification will be most successful if it involves all parties who would be involved in
model development and use (i.e., model developers, intended users, and decision makers)" (USEPA,
2009).
2.2 Model Development
Having clearly defined the decision context, the USEPA guidance recommends that the project team
should:
• develop a written statement of modelling objectives that includes:
  o the state variables of concern,
  o the stressors driving those state variables,
  o appropriate temporal and spatial scales, and
  o the degree of model accuracy and precision needed;
• identify the scope and type of model required (subject to constraints of data, cost, etc.);
• agree the model's "domain of application".
Once an outline model specification has been developed the guidance suggests that the project
team should seek an appropriate modelling framework that meets the project needs. The project
team will need to consider several issues, including:
• Is the underpinning science (including peer-reviewed theory, equations and algorithms)
of sufficient quality and relevance?
• What level of model complexity is appropriate to the problem?
• Do the quality and quantity of available data support the choice of model?
• Does the model structure reflect all the relevant inputs and outputs identified in
the conceptual model?
• Has relevant model code been developed?
  o If so, has the model been evaluated? Has the code been verified?
  o If not, suitable code will need to be developed and quality assured.
Box 1. Basic Steps in the Process of Modelling for Environmental Decision Making. Adapted
from Box 2 of USEPA (2009), following NRC (2007).

Step 1. Problem identification and specification: to determine the right decision-relevant
questions and establish modelling objectives.
Modelling issues:
• Definition of model purpose
  o Goal
  o Decisions to be supported
  o Predictions to be made
• Specification of modelling context
  o Scale (spatial and temporal)
  o Application and domain
  o User community
  o Required inputs
  o Desired output
  o Evaluation criteria

Step 2. Model development: to develop the conceptual model that reflects the underlying
science of the processes being modelled, and develop the necessary mathematical
representation of that science and encode these mathematical expressions in a computer
program.
Modelling issues:
• Conceptual model formulation
  o Type of model (e.g. dynamic, static, stochastic, deterministic)
  o State variables represented
  o Level of process detail
  o Scientific foundations
  o Assumptions
• Computational model development
  o Algorithms
  o Mathematical/computational methods
  o Inputs
  o Hardware platforms and software infrastructure
  o User interface
  o Calibration/parameter determination
  o Documentation

Step 3. Model evaluation: to test that the model expressions have been encoded correctly
into the computer program and test the model outputs by comparing them with empirical
data.
Modelling issues:
• Model testing and revision
  o Theoretical corroboration
  o Model components verification
  o Empirical corroboration (independent data)
  o Sensitivity analysis
  o Uncertainty analysis
  o Robustness determination
  o Comparison to evaluation criteria set during formulation
An important aspect of model development is defining the domain of the model application. This
includes aspects of scale (e.g. "farm" v "national"), levels of aggregation or averaging of data in
defining variables (e.g. farm "types"), and the choice of processes and conditions to represent within
that domain. The latter include the transport and transformation processes relevant to the
policy/management objectives and the important time and space scales inherent in the transport
and transformation processes within that domain (in comparison to the time and space scales of the
problem objectives), and any peculiar conditions of the domain that will affect model selection or
new model construction.
The result of these considerations should be (i) documentation justifying the model development
choice(s) and (ii) a model development project plan that should explicitly recognise the need for
quality assurance (see section 2.3.2 for more information on quality assurance).
2.3 Model Evaluation
The goal of model evaluation is to ensure the integrity, utility and objectivity of the model.
Information gathered during model evaluation allows decision makers to be better positioned to
formulate decisions and policies that take into account all relevant model issues and concerns.
Model evaluation provides information to help answer four main questions (Beck, 2002):
• How have the principles of sound science been addressed during model development?
• How is the choice of model supported by the quantity and quality of available data?
• How closely does the model approximate the real system of interest?
• How well does the model perform the specified task while meeting the objectives set by the
project quality assurance plan?
So evaluation addresses the soundness of the science underlying a model, the quality and quantity
of available data and whether key model assumptions are identified and justified. It includes an
evaluation of how well the model agrees with observed conditions, through consideration of
predictive skill that may include the assessment of reliability, bias and accuracy. These assessments
are usually based on data used in the development and building of the model and on independent
data used in model corroboration or "validation". We note that models can't be validated in a
scientific sense (see Oreskes et al., 1994; Beven, 2009; USEPA, 2009): all models have limited
predictive skill and can be falsified. Instead, there is a need to understand whether models give
answers that are sufficiently reliable as to be useful. It is only through the process of model
evaluation that decision makers can form a judgement as to whether the use of a model for a given
application is acceptable. This applies both to the general case ("can the model be used to assess the
loss of pollutants from farms in catchments?") and the specific instance ("should the model be used
to assess the loss of pollutants from this farm in this catchment?").
USEPA (2009) recommend that an evaluation process include the following components: (i) credible,
objective peer review; (ii) QA project planning and data quality assessment; (iii) qualitative and/or
quantitative model corroboration; and (iv) use of sensitivity and uncertainty analyses.
Furthermore, for models used for regulatory purposes the USEPA (2009) recommends that
evaluation should continue throughout the life of a model.
For models that may be used to help determine policy, or models being considered for future
regulatory use, or that may be used to make decisions that have significant economic or
environmental consequences, there should be an evaluation plan. This should:
• describe the model and its intended uses;
• describe the relationship of the model to data, including the data for both inputs and
corroboration;
• describe how such (new) data and other sources of information will be used to assess
the ability of the model to meet its intended uses;
• describe all the elements of the evaluation plan by using an outline or diagram that
shows how the elements relate to the model's life cycle;
• identify the factors or events that might trigger the need for major model revisions or
that might prompt users to seek an alternative model (these can be fairly broad and
qualitative); and
• identify the responsibilities, accountabilities, and resources needed to ensure
implementation of the evaluation plan.
2.3.1 Peer Review
The purpose of peer review is:
• to evaluate whether the assumptions, methods, and conclusions derived from
environmental models are based on sound scientific principles; and
• to check the scientific appropriateness of a model for informing a specific decision or
types of decision.
Questions that are relevant to the peer review of models are provided in Box 2.
2.3.2 Quality Assurance
Quality assurance is an important aspect of model development, implementation and use. Model
evaluation specifically provides the quality assurance of the science and data underpinning the
model. It is an attribute of models that is meaningful only within the context of a specific model
application. Determining whether a model serves its intended purpose involves in-depth discussion
between model developers and the users responsible for applying the model to a particular
problem.
A quality assurance project plan helps ensure that a model provides useful outputs that inform the
decision-maker's problem.
There are two key aspects:
• a data quality assessment, to ensure that the model has been developed according to the
principles of sound science; and
• an assessment to ensure that the completed model and implemented code perform the
specified task, within the accepted limitations of the model and supporting data.
Evaluating the degree to which a modelling project has met its QA objectives is often a function of
the external peer review process. Further guidance is provided in Box 3.
2.3.3 Model Corroboration, sensitivity and uncertainty analysis
Model corroboration is the process used for evaluating the degree to which a model corresponds to
reality. It includes the use of both quantitative and qualitative methods.
Quantitative model corroboration uses statistics to estimate the mismatch between the model and
real-world measurements. A particular emphasis will be on outputs of direct relevance to the
decision-making process the model is designed to inform. The extent and rigour of the analyses
undertaken depend on the type and purpose of the model application. For models whose
application may have significant (e.g. financial or economic) consequences, more extensive and
rigorous analyses may be needed, requiring the collection and/or use of data not directly used as
part of model development. The aim is often to understand the frequency of, and circumstances
leading to, false positive or false negative outcomes.
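As a minimal sketch of what such quantitative corroboration involves (the paired modelled and observed values below are invented; real corroboration would use independent monitoring data), common summary statistics include mean bias, root-mean-square error and, where a decision threshold applies, counts of false positives and false negatives:

```python
import math

# Invented paired values: modelled vs observed pollutant loss (kg/ha)
# for six hypothetical farms. Real corroboration uses monitoring data.
modelled = [28.0, 14.5, 40.2, 9.8, 22.1, 31.0]
observed = [23.9, 16.0, 36.8, 12.1, 20.5, 34.2]

n = len(modelled)
errors = [m - o for m, o in zip(modelled, observed)]

bias = sum(errors) / n                            # mean error: + means over-prediction
rmse = math.sqrt(sum(e * e for e in errors) / n)  # typical error magnitude

# With a decision threshold (e.g. a loss limit triggering action),
# count cases where the model would mislead the decision.
threshold = 25.0
false_pos = sum(1 for m, o in zip(modelled, observed) if m > threshold >= o)
false_neg = sum(1 for m, o in zip(modelled, observed) if m <= threshold < o)

print(f"bias = {bias:+.2f}, RMSE = {rmse:.2f}")
print(f"false positives = {false_pos}, false negatives = {false_neg}")
```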
Sensitivity and uncertainty analysis can also be used to understand how well the model represents
the underlying system, particularly in extreme circumstances. Sensitivity analysis reveals how a
model's outputs change in response to changes in its inputs, and helps identify the most
important sources of uncertainty in environmental models (Saltelli et al., 2000; Saltelli, 2002). In
contrast, uncertainty analysis investigates how the lack of knowledge about a certain population
(e.g. of farms), or about the true value of model parameters (e.g. a loss coefficient), affects model
outputs. The results of these analyses are important if model users are to understand fully the
confidence they can place in modelling results. Emphasis is placed on the effective communication
of the results of these analyses (e.g. USEPA, 2009).
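The distinction can be illustrated with a toy loss model (the model form and all values below are invented for illustration, not taken from FARMSCOPER): one-at-a-time perturbation shows how strongly the output responds to each input, while Monte Carlo sampling propagates an assumed distribution for an uncertain parameter into a spread of outputs.

```python
import random
import statistics

def loss(coefficient, rate, area):
    """Toy export model: pollutant loss = loss coefficient * application rate * area."""
    return coefficient * rate * area

base = {"coefficient": 0.3, "rate": 100.0, "area": 50.0}
base_out = loss(**base)

# Sensitivity analysis: perturb each input by +10% in turn and record
# the fractional change in output (this toy model is linear, so each is +10%).
for name in base:
    perturbed = dict(base, **{name: base[name] * 1.1})
    change = (loss(**perturbed) - base_out) / base_out
    print(f"+10% in {name}: output changes by {change:+.1%}")

# Uncertainty analysis: sample the uncertain loss coefficient from an
# assumed distribution and report the resulting spread in the output.
random.seed(1)
samples = [loss(random.gauss(0.3, 0.05), base["rate"], base["area"])
           for _ in range(10_000)]
print(f"mean = {statistics.mean(samples):.0f}, stdev = {statistics.stdev(samples):.0f}")
```

In a nonlinear model the one-at-a-time changes would differ between inputs, which is what makes the ranking of uncertainty sources informative.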
Box 2: Elements of External Peer Review for Environmental Regulatory Models, adapted from
Box D1, Appendix D, USEPA (2009).
• Model purpose/objectives
  o What is the regulatory context in which the model will be used and what broad
scientific question is the model intended to answer?
  o What is the model's application niche?
  o What are the model's strengths and weaknesses?
• Major Defining and Limiting Considerations
  o Which processes are characterized by the model?
  o What are the important temporal and spatial scales?
  o What is the level of aggregation?
• Theoretical Basis for the Model (formulating the basis for problem solution)
  o What algorithms are used within the model and how were they derived?
  o What is the method of solution?
  o What are the shortcomings of the modeling approach?
• Parameter Estimation
  o What methods and data were used for parameter estimation?
  o What methods were used to estimate parameters for which there were no data?
  o What are the boundary conditions and are they appropriate?
• Data Quality/Quantity
  o See Box 3.
• Questions related to model design include:
  o What data were utilized in the design of the model?
  o How can the adequacy of the data be defined taking into account the regulatory
objectives of the model?
• Questions related to model application include:
  o To what extent are these data available and what are the key data gaps?
  o Do additional data need to be collected and for what purpose?
• Key Assumptions
  o What are the key assumptions?
  o What is the basis for each key assumption and what is the range of possible
alternatives?
  o How sensitive is the model to modifying key assumptions?
• Model Performance Measures
  o What criteria have been used to assess model performance?
  o Did the databases used in the performance evaluation provide an adequate test
of the model?
  o How does the model perform relative to other models in this application niche?
• Model Documentation and Users Guide
  o Does the documentation cover model applicability and limitations, data input,
and interpretation of results?
• Retrospective
  o Does the model satisfy its intended scientific and regulatory objectives?
  o How robust are the model predictions?
  o How well does the model output quantify the overall uncertainty?
Box 3. Quality Assurance Planning and Data Acceptance Criteria, adapted from Box D3, Appendix D,
USEPA (2009). See also USEPA (2000, 2002).
A Quality Assurance plan should address four issues regarding information on how non-direct
measurements are acquired and used on the project (USEPA 2002):
• The need and intended use of each type of data or information.
• How the data will be identified or acquired, the expected sources of these data, and if and when
they need to be revised or updated.
• The method of determining the underlying quality of the data.
• The criteria established for determining whether the quality of a given data set is acceptable for
use on the project.
Acceptance criteria for individual data values generally address issues such as the following:
Representativeness: Were the data collected from a population sufficiently similar to the population
of interest and the model-specified population boundaries? Were the sampling and analytical
methods used to generate the collected data acceptable to this project? How will potentially
confounding effects in the data (e.g., season, time of day, location, and scale incompatibilities) be
addressed so that these effects do not unduly impact the model output?
Bias: Would any characteristics of the dataset directly impact the model output (e.g., unduly high or
low process rates)? For example, has bias in analysis results been documented? Is there sufficient
information to estimate and correct bias? If using data to develop probabilistic distributions, are
there adequate data in the upper and lower extremes of the tails to allow for unbiased probabilistic
estimates?
Precision: How is the spread in the results estimated? Is the estimate of variability sufficiently small
to meet the uncertainty objectives of the modeling project (e.g., adequate to provide a frequency of
distribution)?
Qualifiers: Have the data been evaluated in a manner that permits logical decisions on the data’s
applicability to the current project? Is the system of qualifying or flagging data adequately
documented to allow data from different sources to be used on the same project (e.g., distinguish
actual measurements from estimated values, note differences in detection limits)?
Statistical representation: Is the data summarization process clear and sufficiently consistent with
the goals of this project? For example, is the choice of temporal or spatial averages, of other
simplifying statistics for chosen summary variables or classifiers (e.g. farm or soil types), or of
statistically transformed rather than unaltered measurement values, appropriate? Ideally, processing and
transformation equations will be made available so that their underlying assumptions can be
evaluated against the objectives of the current project.
2.4 Model Application or User Documentation
The use of a model in a decision process requires that the model application be transparent to:
• the users of the model application,
• those making decisions based on its use, and
• those affected by those decisions.
This is accomplished by providing written documentation to external parties. Consideration should
be given to the needs of both technical audiences (consultancies, university scientists) and, in a
more accessible style, to less technical audiences such as those who may be affected by decisions
(perhaps members of the public). The USEPA (2009) provide guidance on the scope and content of this
documentation (see Box 4). It will be substantially based upon documentation generated as part of
the development and evaluation of the model (see sections 2.1-2.3).
Model code should be properly documented, ensuring that the purpose of each section of code or
object is clear.
The model developers will also need to provide guidance to users:
• describing how to use the model software;
• explaining what considerations may need to be given to different applications (e.g. decision
contexts, or particular cases), which may include the choice of parameter values; and
• aiding the interpretation of model results.
This guidance is likely to be provided through a mix of:
• software user interface design, including input and output (results and diagnostics);
• written documentation and help files; and
• training materials and example model applications or case studies.
Box 4: Recommended elements for model documentation, adapted from Box 11 USEPA (2009).
1. Management Objectives
 Scope of problem
 Technical objectives that result from management objectives
 Level of analysis needed
 Level of confidence needed
2. Conceptual Model
 System boundaries (spatial and temporal domain)
 Important time and length scales
 Key processes
 System characteristics
 Source description
 Available data sources (quality and quantity)
 Data gaps
 Data collection programs (quality and quantity)
 Mathematical model
 Important assumptions
3. Choice of Technical Approach
 Rationale for approach in context of management objectives and conceptual model
 Reliability and acceptability of approach
 Important assumptions
4. Parameter Estimation
 Data used for parameter estimation
 Rationale for estimates in the absence of data
 Reliability of parameter estimates
5. Uncertainty/Error
 Error/uncertainty in inputs, initial conditions, and boundary conditions
 Error/uncertainty in pollutant loadings
 Error/uncertainty in specification of environment
 Structural errors in methodology (e.g. effects of aggregation or simplification)
6. Results
 Tables of all parameter values used for analysis
 Tables or graphs of all results used in support of management objectives or conclusions
 Accuracy of results
7. Conclusions of analysis in relationship to management objectives
8. Recommendations for additional analysis, if necessary.
3. REVIEW OF FARMSCOPER DOCUMENTATION AGAINST SELECTED EVALUATION CRITERIA
The extent of model evaluation needs to be appropriate to the decision context of the model at
hand. FARMSCOPER is being used primarily in a non-regulatory, advisory role, or as a tool to
promote greater awareness of the underlying evidence base to a range of stakeholders. As such
Defra need to form a judgement as to whether there is sufficient evidence that FARMSCOPER, and
its component models, have been subject to an adequate evaluation process.
If, for example, FARMSCOPER were to be used directly to target costly and possibly controversial
control strategies then many aspects of the modelling may come under close scrutiny, perhaps via
judicial review. A more detailed model evaluation would be necessary in these circumstances and
Defra would be advised to commission a greater depth of external peer review, to provide more
complete documentation of the findings of the model evaluation process and be prepared to answer
questions that will arise about the FARMSCOPER model framework and its component models.
In the sections that follow we note where the available documentation shows evidence that the
development of FARMSCOPER, its evaluation and its documentation meet USEPA guidelines. We do
this based principally on the criteria and questions identified in Boxes 2-4. We emphasize that the
USEPA (2009) guidance is developed to apply to regulatory models and that a lower level of evidence
may be acceptable to Defra for a non-regulatory model such as FARMSCOPER.
3.1 Decision context: Problem, users and user specification
FARMSCOPER has been developed by ADAS principally for Defra Water Quality Division, as the
customer, to provide a framework for the assessment of the potential to reduce agricultural
emissions of pollutants to air and water at the farm scale. The development was under a series of
projects:
 WQ0106: Cost and Effectiveness of Policy Instruments (see Anthony, 2006; Gooday and Anthony, 2010);
 WT0706CSF: Benefits and Pollution Swapping (Chadwick et al., 2006);
 SFF0601: Business as Usual III; and
 WT0743CSF: Evaluating the Extent of Agricultural Phosphorus Losses across Wales (Anthony et al., 2008).
FARMSCOPER built on the assessment of agricultural pollutant loss and the effectiveness of
mitigation options developed in project WQ0106 (see Newell-Price et al., 2009; 2011), providing a
formal framework by which pollutant losses at the farm scale, and the impact of single or
in-combination mitigation methods thereon, could be assessed using methodologies developed under
project WT0743CSF.
The principal drivers for reducing farm losses of pollutants are legislative: Climate Change Act (2008)
(greenhouse gas emissions), Gothenburg Protocol and National Emission Ceilings Directive
(ammonia), EU Nitrates Directive (91/676/EEC) (nitrates), Drinking Water Directive (98/83/EC) (FIOs,
pesticides), Water Framework Directive (2000/60/EC) (phosphorus, FIOs, ammonium), EU
Freshwater Fish Directive (78/659/EC) (suspended sediments), and the Bathing Waters Directive
(2006/7/EC) (FIOs).
Version 3 of FARMSCOPER has been extended4 to allow:
 the impact of mitigation options applied to multiple farms to be assessed at the catchment, River Basin District or National scale;
 a wider range of farm emissions of potential pollutants (i.e. FIOs, CO2) and benefits/disbenefits (farm production, energy use, soil carbon) to be considered;
 updated information on prior rates of mitigation uptake to be provided;
 the benefits/disbenefits associated with the uptake of additional mitigation measures by farms to be considered; and
 the costs associated with mitigation options to be assessed.
This resulted in two additional modules within FARMSCOPER ver. 3 (the "Cost" and "Upscaling"
workbooks).
There clearly would have been discussion of modelling objectives between DEFRA, ADAS and
possibly other partners during the development and review of these projects. Gooday et al. (2014a)
note that a driver for the development of the upscaling tool in version 3 was a desire amongst users
to apply the farm-based version of FARMSCOPER to multiple farms in different catchments. It is not
unusual for modelling tools to develop iteratively and for user requirements not to be fully identified
at the outset.
It is to be expected, however, that a farm scale tool and a national/regional scale tool will have
different uses (i.e. address different problems), users (e.g. farm advisors v. policy advisors) and user
requirements. Meeting multiple uses and users may entail compromises in design and predictive
skill. Nevertheless, having a single common framework for analysing the effectiveness of both local
(farm) scale and National scale farm management has benefits if that model framework is accepted
as being based on robust science, sound data and modelling principles and practice.
The USEPA suggest that a "relatively simple, plain English problem identification statement" should
be provided. In our judgement Gooday et al. (2014b) provide a clear if succinct description of the
purpose of FARMSCOPER that focuses on the policy maker. They write (loc. cit., Introduction, p.5)
that the aim of FARMSCOPER is "to allow the assessment of the cost and effectiveness of mitigation
methods against multiple pollutants and multiple targets. By including an estimation of the cost of
implementation, Farmscoper can be used for assessing the ability for the cost of implementation to
be funded by the farming industry itself or offset by funding or subsidies from the government or
other sources. The tool is thus ideal for analysis of government policy, and is currently the only tool
available for such analysis which considers impacts on multiple pollutants to water, green house
gases, air quality and biodiversity."
The ADAS download web page5 describes FARMSCOPER as "a decision support tool that can be used
to assess diffuse agricultural pollutant loads on a farm and quantify the impacts of farm mitigation
methods on these pollutants. It also determines potential additional consequences of mitigation
method implementation for biodiversity, water use and energy use".
4
For detailed objectives see Gooday et al. (2014a), section 1.1, p 6.
5
http://www.adas.uk/Services/Service/farmscoper-397
Gooday et al. (2014a) state "The tool has been designed ... in order to aid policy makers in England
and Wales in the assessment of mitigation methods for use in action programmes and management
plans, and to guide catchment officers in the provision of funding (e.g. capital grants) and advice to
farmers aimed at tackling harmful diffuse pollution."
FARMSCOPER represents the integration of a substantial body of agricultural research, data and
component models that have previously been used in the development and analysis of government
policy. In the published documentation FARMSCOPER is presented primarily as a policy analysis tool.
Presumably it could be used to replace the need for bespoke advice from the applied agricultural
research community to policy makers, at least in the initial phase of policy formulation. If so, the
particular circumstances in which it should/should not be used should, in our view, be better
identified, at least in outline.
FARMSCOPER is freely available to download from the ADAS website. Gooday et al. (2014a) note6
that previous versions of the FARMSCOPER tool have been used by a wide variety of users
("government organisations, research institutes, consultancies, levy bodies and other agricultural
organisations") for a variety of purposes, including the assessment of policy options such as
agri-environment schemes, the prioritisation of catchment management plans, and the assessment
of "pollutant footprints" and the mitigation potential of individual farms or groups of farms.
FARMSCOPER has also been demonstrated to small groups of farmers and farm advisors in two
workshops. The aim of these was to better gauge its usefulness or acceptability to these
communities (Gooday et al., 2014b). These workshops could appear to represent a post-hoc
consultation with users and stakeholders, as opposed to the prior consultation recommended by
USEPA (2009). We expect there may have been prior consultation but the evidence is not present
within the available documentation. However, the workshops can also be seen as part of an iterative
process by which the uses and applications of FARMSCOPER are refined.
The workshops may also have been intended to aid communication and transparency of the
evidence-base supporting Government policy. The inference is that one intended use of the
FARMSCOPER software is as a vehicle for communication of the underlying scientific evidence,
helping to build understanding and confidence in agricultural and farm management within the
farming sector. An extended user community may include farmers and their advisors, Environment
Agency staff, River Trusts, NGOs, catchment managers and officers and their advisors, and, of
course, ADAS and other consultancies working on behalf of any of those.
It may be that FARMSCOPER is seen as an advisory tool. But advice contributes to decisions and the
provision of advice is often subject to adherence to professional standards or indeed regulation. If
FARMSCOPER were to be used for making decisions regarding pollution mitigation on farms in order
to achieve catchment management objectives, we would recommend that the problem
identification statement should be written to reflect this.
In summary, there is documentary evidence that there has been discussion between ADAS (the
model developers), Defra and at least some potential model users. The problem identification
statement recommended by USEPA (2009) is present within the introduction of various publications
6
Section 7, first paragraph, p. 77
(e.g. Gooday and Anthony, 2010; Gooday et al., 2014a,b; ADAS, 2014a). However, it is not entirely
clear, beyond a narrow range of Defra-family Water Quality Policy advisors, that the use of
FARMSCOPER for particular types of decision problem, by particular categories of users, has been
fully defined as part of its development and implementation.
We recommend that a more explicit and extensive problem identification statement be developed
by Defra in partnership with the model developers and potential user community. This should better
identify (i) the principal decision context(s) in which FARMSCOPER may be applied and, for each, (ii)
the likely practitioners (e.g. Defra, Environment Agency, River Trusts, farm advisors) and their roles.
We would also recommend that these different decision contexts and decision problems should be
reflected in the model evaluation and other supporting user documentation.
3.2 Model development
In some environmental model applications, such as catchment management, there may be a range
of pre-existing models representing transport and transformation processes, with different spatial,
temporal and process representations, that may provide suitable, overarching model
frameworks. For example, it is conceivable that a number of alternative catchment hydrological
models could have been considered as a basis of the transport of pollutants from farms. In the case
of the FARMSCOPER model framework, however, the approach taken is rather different. The
requirement was (i) to select from a number of existing, essentially pollutant-focussed models, that
would meet the needs of the project requirement in terms of the pollutants and supporting data at
farm scale for each farm type and (ii) to assemble these models (or the outputs of these models),
and supporting data, within an integrated package. This is an example of a "Major Defining and
Limiting Consideration" that would be considered in any external peer review (see Box 2).
Further evidence that "major defining and limiting considerations" have been addressed is
documented within Gooday et al. (2010, 2014a,b). These include aspects of scale (e.g. "farm" v
"national") and farm types, and the choice of processes, including: the transport processes relevant
to the policy/management objectives; the important time and space scales inherent in the transport
and transformation processes within the domain, in comparison to the time and space scales of the
problem objectives; and the specific conditions of the domain and pollutant properties that will
affect model selection or new model construction.
There is clear evidence that the domain of application (see Box 2) has been carefully considered as
part of the selection and development of FARMSCOPER. It would be a surprise if it were not! The
"Pollutant Loss Co-ordinate system" provides the overarching conceptual model to FARMSCOPER
(see Table 4.1 of Gooday et al., 2014a; Section 2.1 and Table 1 of Gooday et al., 2014b; also ADAS
2014a). This provides a summary of the pollutants of interest, spatial and temporal scales, sources,
pathways and other aspects of the model domain7. There is perhaps a greater emphasis placed on
the pollutants, the influence of farm types, mitigation methods and their associated costs, rather
than on other important variables (USEPA (2009) use the term "stressors") that might affect
pollutant loss, such as more accurate data on rainfall, or data on slope, geology or soil
7
For what is such a fundamental component of the FARMSCOPER model we note that, from a users perspective, its
explanation is spread across the available documents. The most complete explanations are found by linking the description
in Gooday and Anthony (2010) to the definitions that are provided in the FARMSCOPER workbook spreadsheets.
characteristics. The principal reason for this is that the effects of these state variables may be
incorporated within the component models used to derive the average coefficients used within
FARMSCOPER.
It is worth noting that the component models8, that provide the loss coefficients used within the
FARMSCOPER framework, can be considered as algorithms (see Box 2 - "Theoretical Basis for the
Model"). However, because the component models are run outside FARMSCOPER these loss
coefficients can equally well be described as parameters within FARMSCOPER (see Box 2 - "Parameter Estimation"). However, the principal requirement is that the source of these values,
their scientific pedigree and reliability, should be understood and documented. Gooday et al.
(2014a,b) suggest one of the advantages of the FARMSCOPER framework is that a sophisticated user
can replace these coefficients with their preferred model or its outputs. They note, however, that
this facility has not been used to the best of their knowledge, and it is not clear whether this was
part of the agreed user-specification.
The literature describing the component models used to provide the loss coefficients provided in the
FARMSCOPER software is referenced throughout the documentation (Gooday and Anthony, 2010;
Gooday et al., 2014a,b; ADAS, 2014a) and even within the FARMSCOPER workbook help sheets. One
weakness of the documentation is the requirement placed on each user to return to the wider,
scattered published literature to gain complete descriptions of the scientific basis of the component
models. Without doing so they are unlikely to gain a deeper appreciation of the capabilities and
limitations of the component models and hence of the overall FARMSCOPER framework. The level of
model description provided in the original publications is, in our view, rather variable. Furthermore,
the authors of the original papers did not anticipate the use of the models within an application such
as FARMSCOPER at the time of publication. Some of the publications are now quite dated, the
models may have been subject to revisions, and better performing models may be available. Users
of FARMSCOPER would benefit from having fuller (but succinct) descriptions of the capability,
purpose and limitations of the underlying component models, any revisions implemented since the
last, major publication, the level of external peer review, availability of model code and
documentation, etc. The descriptions provided in ADAS (2014a) do not (and were not intended to)
provide this level of documentation.
Assumptions are an inherent feature of all forms of analysis and decision-making and are a critical
feature of model building and use (Beven, 2009). The identification of assumptions, in particular
those that are unlikely to be justified in a real world application and to which model outputs may be
sensitive, is a key part of USEPA guidance (see Box 2, "Key Assumptions"). We note that Sections 2
and 3 of Gooday et al. (2014a), which describe the development of the Cost and Upscaling
workbooks, consistently and commendably identify the key assumptions associated with each
component model. The explicit treatment of assumptions included in the Cost workbook is a
particularly effective device for communicating assumptions to the model user. Detailed
assumptions about farm structure and practice on the modelled farm systems are provided in
Newell-Price et al. (2011) and users of the model would be advised to be familiar with this report.
8
The component models are succinctly catalogued in ADAS (2014a).
FARMSCOPER includes an algorithm (NSGA-II, see Deb et al., 2000 and 2002) to find the best
combinations of pollutant mitigation measures, in the form of a set of non-dominated solutions
(known as Pareto-optimal solutions), under conditions where the decision maker is trying to satisfy
multiple objectives, such as minimising costs while maximising the reduced loss of multiple
pollutants, together with other objectives such as possible biodiversity gain. While the key reference
describing the basis of the algorithm and its relevance to this sort of application is provided, the
actual version and source of the NSGA-II code implemented within FARMSCOPER does not appear
to be identified9.
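The idea of Pareto-optimality that NSGA-II relies on can be illustrated with a minimal, brute-force sketch. This is illustrative only and is not the NSGA-II code used in FARMSCOPER; the portfolio scores below are hypothetical.

```python
# Illustrative sketch of a Pareto (non-dominated) filter, not FARMSCOPER's
# NSGA-II implementation. Each candidate portfolio of mitigation measures is
# scored as (cost, nitrate reduction, phosphorus reduction): cost is
# minimised, the reductions are maximised.

def dominates(a, b):
    """True if portfolio a is at least as good as b on every objective
    and strictly better on at least one (lower cost, higher reductions)."""
    cost_a, *gains_a = a
    cost_b, *gains_b = b
    no_worse = cost_a <= cost_b and all(ga >= gb for ga, gb in zip(gains_a, gains_b))
    strictly_better = cost_a < cost_b or any(ga > gb for ga, gb in zip(gains_a, gains_b))
    return no_worse and strictly_better

def pareto_front(candidates):
    """Return the non-dominated (Pareto-optimal) candidates."""
    return [c for c in candidates
            if not any(dominates(other, c) for other in candidates if other != c)]

# Hypothetical scores: (cost per ha, % nitrate reduction, % phosphorus reduction)
portfolios = [(120, 30, 10), (80, 25, 12), (200, 30, 11), (80, 20, 5)]
front = pareto_front(portfolios)
```

In this example the last portfolio is dominated (the second is no more costly and reduces both pollutants further), so it drops out of the front; the remaining three are incomparable trade-offs, which is exactly the kind of set NSGA-II is designed to approximate efficiently.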
It is likely that the selection of component models was in many cases constrained by the empirical
nature of many of the models (i.e. their reliance on farm-scale data), by a lack of alternative choices
or by other practical considerations, such as the availability of code or ease of access by ADAS.
However, it is desirable that alternative options should be considered and, even in the absence of choice, a
reasoned justification for the use of each component model should be documented.
In summary, we would recommend that the FARMSCOPER documentation should include, in
addition to citations to the published literature, a more complete description of each of the component
models, along the lines indicated by USEPA (2009). This should include sufficient detail to allow
identification of model version, parameter values, the nature and pedigree of data used for input,
calibration or parameter estimation, especially where there have been modifications subsequent to
any published description of the model. It should include an inventory of key assumptions and, as a
minimum, some qualitative evaluation of the strengths and weaknesses of the model for different
uses, including uncertainty in the model outputs/predictions at relevant scales of interest.
3.3 Evidence of Peer Review
The principal evidence that FARMSCOPER has been subject to external peer review is through the
publication of a description of FARMSCOPER and its application within a leading scientific journal
(i.e. Gooday et al., 2014b). The majority of the component models and methods used within
FARMSCOPER have been published in the external literature and hence are very likely to have been
subject to an external peer review process10. Gooday et al. (2014a) does, in particular, provide
comprehensive pointers to the published literature relevant to each component model. Hence
Defra can conclude with some confidence that the assumptions and methods used within
FARMSCOPER, and the conclusions derived from it and its component models, are based on sound
scientific principles.
The existence of publications does not necessarily mean, however, that the models
themselves, or the choice of models and their suitability for the agreed purpose, would enjoy complete
scientific consensus. A more complete evaluation of FARMSCOPER would include a critical review of
the component models and their limitations.
Gooday et al. (2014 a, b) provide justification for the scientific appropriateness of FARMSCOPER for
informing well defined decision problems. We assume that independent credible scientists and
user representatives on the Defra project team will also have provided peer review of the Defrafunded project(s) that led directly to the development of FARMSCOPER.
9
We assume it is that provided by MATLAB (2009), see Seshandri (2010) for a description.
10
An apparent exception is Anthony and Morrow (2011)
However, we are not aware that the results of any peer review process have themselves been
published. USEPA guidance would suggest that there should be a separate report that documents
the critical scientific review of FARMSCOPER and its component models. This review should include
the methodological choices made, the scientific strengths and weaknesses of the model(s), including
hidden assumptions and the consequences if they are not justified (see Box 2 for further details).
The review should also include an assessment as to the extent to which the completed application
(FARMSCOPER) met the user-agreed problem specification and we recommend that this assessment
should be documented.
Members of the project team, supplemented to advantage by external scientific experts, could
undertake a critical review, and we recommend that the review should be published. We would
regard this as essential if the model was to be directly used to inform policy choices, perhaps
through the running of multiple sets of farm management, or if it were to be used in a more
regulatory context.
3.4 Quality Assurance, Corroboration and Uncertainty
ADAS operate a comprehensive in-house quality management system (QMS) and are ISO 9001 and
ISO 14001 accredited. USEPA (2009) guidance suggests that there should be evidence of a quality
assurance plan for FARMSCOPER, i.e. documentation to describe how quality management systems
and procedures have been applied in development of the model. The Quality Assurance plan should
describe the procedures adopted to quality assure the key steps involved in the development and
construction of the FARMSCOPER model and for the documentation and verification of model code
(in the case of FARMSCOPER that would include Excel macros). The quality assurance plan should
include criteria for the acceptable quality of given datasets (including loss coefficients and cost
estimates). Clearly supporting models and empirical datasets should, ideally, have the benefit of
existing quality assurance statements, to which reference should also be made in the model
documentation.
Gaining a good understanding of the reliability of model predictions or forecasts is not straightforward (Box 2 - "Model Performance Measures"). As a model based on extensive empirical datasets,
the reliability of FARMSCOPER is likely to depend quite strongly on the quality and
representativeness of those data. In the absence of information on variances or errors or bias
associated with these data, and with their associated descriptive functions, it is difficult to form a
view on the reliability of the outputs in particular circumstances. Likewise, the absence of
independent datasets makes it difficult to evaluate model performance through corroboration.
The performance of some of the component models may have been subject to extensive testing,
sensitivity and/or uncertainty analysis. Corroboration may have been assessed either through
standard goodness-of-fit measures to existing data or based on external datasets. Such assessments
can improve understanding and confidence in model outputs and should be reported. But because
the FARMSCOPER documentation relies on citations to the original papers, it is not clear from the
documentation whether a component model has been subject to some form of evaluation. Hence,
the user gains no insight regarding the reliability of different components of FARMSCOPER and
hence whether some FARMSCOPER outputs (e.g. relating to particular pollutants, pathways or
mitigation measures) are relatively more or less trustworthy.
The authors have designed FARMSCOPER to give national totals of pollutant loss that agree, as far as
is possible, with the national estimates11 and emphasize that estimates at smaller scales (catchment,
farm) are likely to be less reliable (Gooday et al., 2014a). This is attributed to the absence of accurate
data describing a particular catchment or farm of interest. It is, however, also a consequence of the
aggregation of sample data to different farm types (as well as different soil types and climate zones).
An assessment of the effect of variation within farm types on FARMSCOPER outputs is not reported
in the available documentation. Similarly, variation and co-variation in the effectiveness of
mitigation measures on pollutant loss, and uncertainty in the cost estimates associated with
mitigation measures, are not reported.
The importance of key assumptions has already been discussed and we have noted good evidence of
the identification of key assumptions (Gooday et al., 2014a). However, there is less emphasis on
evaluating the extent to which they are justified (in terms of the supporting scientific evidence base)
or, if they are not in particular circumstances, what may be the consequences for model outputs,
their interpretation or use. There is some consideration of how the accuracy of results may vary
from National scale to local scale, but this could be given greater emphasis in the presentation of
model results.
All these issues can be addressed qualitatively by the systematic evaluation of model components by
well-informed experts, or quantitatively by the application of sensitivity or uncertainty analysis.
Those with access to the component models and data underlying FARMSCOPER would be best
placed to identify (for example) upper and lower bounds on model inputs and parameter values.
As noted earlier, FARMSCOPER represents the integration of a substantial body of agricultural
research, data and component models and, as the developers recognise (Gooday and Anthony, 2010,
Section 6.2, p. 42), all parts of the framework have associated uncertainty. FARMSCOPER allows for
some effects of uncertainty to be assessed by the user. Hence, Gooday and Anthony (2010) describe
the derivation of uncertainty ranges associated with the effectiveness of mitigation measures for
reducing pollutant losses (see Table 5.3, Table 8) and for the rates of prior implementation of
mitigation measures (Section 5.4, Table 10), based on expert judgement. Likewise, uncertainty
ranges associated with mitigation measure cost estimates are described in Section 2.1.5 of Gooday
et al. (2014a), although it is not clear how these particular ranges have been derived from the
literature sources.
Sensitivity analysis can also be applied to the baseline pollutant losses12 included within
FARMSCOPER (also soil carbon, energy use and production). Default values are provided13 which the
user is able to alter.
The ranges are treated as bounds on rectangular distributions ("areas of equal possibility", Gooday
and Anthony, 2010, p. 43) and the uncertainty assessment is conventionally based on Latin Hypercube
11
Of course these national estimates are largely based on the same datasets.
12
We note the unnecessary accuracy associated with the coefficients within the FARMSCOPER database!
13
In the "Confidence Ranges" sheet of the FARMSCOPER Evaluate workbook
sampling. For simplicity the developers choose to assume no correlation in the variation in the
baseline pollutant losses between different pathways, whereas the effect of each mitigation
measure is assumed to act uniformly (correlation = 1.0) across all affected pollutants and pathways.
There is an implicit assumption that these are the most important sources of uncertainty governing
pollutant losses and the effectiveness of mitigation options to control those losses. We would
expect, and recommend, a more complete review of the sources of uncertainty in FARMSCOPER to
support the treatment that has been implemented. For example, it is not clear how the
default values for the uncertainty in baseline pollutant losses, soil carbon, energy use and
production have been agreed upon, why they should be +/- 25%, or why they should be the same for
each pollutant and pathway. It is not clear whether these ranges represent primarily environmental
variation (e.g. within or between farms, or perhaps between rainfall years) that might be reduced or
resolved by a user with more complete information in a particular case, or more fundamental
uncertainty in the underlying models, or both. While the user is encouraged to alter these values,
there is no guidance on what values might be reasonable in particular circumstances, or what the
outcome may be.
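The sampling scheme described above can be sketched generically. This is a minimal illustration of Latin Hypercube sampling from rectangular (uniform) ranges; the bounds and the +/-25% baseline below are hypothetical examples, and this is not FARMSCOPER's own implementation.

```python
# A minimal sketch of Latin Hypercube sampling from rectangular (uniform)
# uncertainty ranges, as described by Gooday and Anthony (2010).
# Illustrative only: not FARMSCOPER code, and the loss values are invented.
import random

def latin_hypercube(bounds, n_samples, seed=0):
    """bounds: list of (low, high) per variable. Returns n_samples points;
    each variable is stratified into n_samples equal-probability intervals,
    with one draw per interval, shuffled independently per variable."""
    rng = random.Random(seed)
    columns = []
    for low, high in bounds:
        width = (high - low) / n_samples
        draws = [low + (i + rng.random()) * width for i in range(n_samples)]
        rng.shuffle(draws)  # decorrelate the variables, as assumed in the text
        columns.append(draws)
    return list(zip(*columns))

# e.g. a hypothetical baseline nitrate loss of 40 kg/ha with a +/-25% range,
# sampled jointly with an uncorrelated phosphorus loss range
samples = latin_hypercube([(30.0, 50.0), (0.9, 1.5)], n_samples=10)
```

The shuffle step implements the developers' simplifying assumption of zero correlation between pathways; sampling correlated losses would require a different construction.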
We also note the use of "expert judgement" to provide important and necessary information and
estimates where empirical data are lacking. There is a certain lack of consistency and detail in the
reporting of the process and basis of these judgements (for example in the uncertainty values
associated with the loss coefficients v. the cost coefficients). We assume the expert judgement was
provided by members of the ADAS project team, perhaps in discussion with other ADAS colleagues.
There are established protocols for undertaking expert elicitation (e.g. Morgan & Henrion, 1990;
Slottje et al., 2008; Refsgaard et al., 2007) and it is preferable to be explicit as to the nature and
source of the experts and their expertise.
We do not wish to be too critical: putting together FARMSCOPER represents a significant
achievement, constrained by practical considerations. However, following the USEPA guidance
would make clearer the reasons behind the current design and implementation of FARMSCOPER.
3.5 User Documentation
The documentation specifically provided for users of FARMSCOPER, and distributed with the model
files, comprises:
 a short description of FARMSCOPER (ADAS, 2014a)14;
 an installation and quickstart guide (ADAS, 2014b)7; and
 a help sheet included within each component FARMSCOPER workbook.
The short description (ADAS, 2014a) provides the user with a succinct synthesis of the information
available within the Defra project reports (e.g. Anthony, 2006; Gooday & Anthony, 2010; Newell-Price et al., 2011; Gooday et al., 2014a) and in Gooday et al. (2014b). The main purpose of the
14
The currently available version can be downloaded from http://www.adas.uk/Services/Service/farmscoper-397
document, however, seems to be to provide a naive15 user with sufficient knowledge of the model to
make use of the software.
Section 2 introduces the source apportionment co-ordinate system that provides the underlying
conceptual model for FARMSCOPER (derived from Lemunyon & Gilbert, 1993). While the meaning
of most of the co-ordinates used within the conceptual model will be clear to the majority of users,
the meaning or utility of the "Type" co-ordinate is not explained and may puzzle some. The definition
of each term used within the co-ordinate system is to be found in the help section of each
FARMSCOPER workbook. We would recommend that the definitions are included within the user
documentation and model description.
Section 3 describes the default farm systems used within FARMSCOPER, based on the established
‘Robust Farm Types'. Sections 4-7 describe the modelling of potential farm pollutant emissions,
Section 8 the modelling of soil carbon, and Section 9 on-farm energy use and hence CO2 emissions.
Sections 10-11 describe the estimation of production and indicators of biodiversity, water use and
soil quality. Section 12 describes how mitigation impacts are calculated, and the important
assumption that the effect of measures applied in combination accumulates multiplicatively
(diminishing returns).
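The multiplicative accumulation assumption can be illustrated with a short sketch. Each measure is taken to act on the pollutant load remaining after the preceding measures, so combined reductions fall short of the additive total; the efficacy values below are invented for illustration and are not FARMSCOPER parameters.

```python
from functools import reduce

def combined_reduction(efficacies):
    """Combine fractional pollutant reductions multiplicatively:
    each measure acts only on the load left by the previous ones."""
    remaining = reduce(lambda r, e: r * (1.0 - e), efficacies, 1.0)
    return 1.0 - remaining

# Two hypothetical measures reducing a pollutant load by 30% and 40%:
# combined reduction is 1 - (0.7 * 0.6) = 0.58, not the additive 0.70.
print(combined_reduction([0.3, 0.4]))
```

The gap between 0.58 and 0.70 is the "diminishing returns" referred to in Section 12 of the model description.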
Sections 13-14 describe the calculation of mitigation costs and prior rates of implementation. We
particularly like the comprehensive approach to identifying assumptions in the calculation of the
cost coefficients within the cost workbook: the assumptions made are immediately available to
inform the user of FARMSCOPER. As noted earlier, the description of this component of the model
in Gooday et al. (2014a) is very effective.
Section 15 introduces the NSGA-II optimisation algorithm (Deb et al., 2000, 2002). We think that
most users will find Section 15 baffling; a better description is provided by Gooday & Anthony (2010).
We are not convinced that the text provided in Section 15 is sufficient to explain to the average user
what is being optimised, or why. The importance of the weightings applied to pollutants in a
multi-criteria optimisation is not introduced, so it is not clear what users will make of the statement
that "it is desirable to identify optimal combination of methods without the relative weighting of the
importance of the different pollutants". We would recommend a greater emphasis on the purpose of
undertaking the optimisation and the importance of weightings in determining the outcome of the
process, and a reduced emphasis on how genetic algorithms work.
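The point about weightings can be made concrete. A weighted-sum objective commits the user to a relative importance of the pollutants up front, whereas the non-domination test at the heart of NSGA-II does not: it retains every option that no other option beats on all objectives at once. A minimal sketch of that test follows; the cost and pollutant values are hypothetical, not FARMSCOPER output.

```python
def dominates(a, b):
    """True if option a is at least as good as b on every objective
    (all objectives here are minimised) and strictly better on one."""
    return (all(x <= y for x, y in zip(a, b)) and
            any(x < y for x, y in zip(a, b)))

def pareto_front(options):
    """Options not dominated by any other: no pollutant weightings needed."""
    return [a for a in options
            if not any(dominates(b, a) for b in options if b is not a)]

# Hypothetical (cost, nitrate loss, phosphorus loss) for four measure sets:
options = [(100, 5.0, 2.0), (120, 4.0, 2.5), (90, 6.0, 3.0), (150, 5.5, 2.8)]
# The last option is dominated (worse than the first on every objective);
# the other three represent genuine trade-offs and survive.
print(pareto_front(options))
```

Choosing a single option from the surviving front is where the user's weighting of the pollutants re-enters, which is why we think the documentation should foreground it.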
Section 16 introduces the sensitivity analysis that can be applied within Farmscoper to the possible
ranges of pollutant loss and the effect of mitigation measures on those losses. The potential
advantages (to the user) of undertaking sensitivity analysis are not explained. Uncertainties would
be better communicated to model users if they were presented in terms of particular applications of
the model or decision contexts, for example, when considering the effect of a set of mitigation
measures on an average farm of a particular type (a national scale or RBD scale assessment), or a
particular instance of a farm of that type (a local scale assessment).
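One way such uncertainties could be presented for a particular decision context is a simple Monte Carlo sketch, sampling both the baseline loss and the mitigation efficacy from their plausible ranges and reporting an interval rather than a point estimate. The ranges below are invented for illustration and are not FARMSCOPER values.

```python
import random

random.seed(1)  # reproducible illustration

def residual_loss_samples(n=10_000):
    """Sample the residual pollutant loss after mitigation, drawing the
    baseline loss and the measure's efficacy from hypothetical ranges."""
    samples = []
    for _ in range(n):
        baseline = random.uniform(8.0, 12.0)   # kg/ha, hypothetical range
        efficacy = random.uniform(0.2, 0.5)    # fractional reduction
        samples.append(baseline * (1.0 - efficacy))
    return samples

losses = sorted(residual_loss_samples())
# Report a central estimate and a 90% interval rather than a single number:
print(f"median {losses[len(losses) // 2]:.1f} kg/ha, "
      f"90% interval {losses[len(losses) // 20]:.1f}-"
      f"{losses[-len(losses) // 20]:.1f} kg/ha")
```

Presented this way, a user assessing an average farm of a given type can see how much of the interval reflects between-farm variation, while a user assessing a single farm can see the consequence of not knowing that farm's baseline loss precisely.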
Section 17 describes the calculation of environmental benefits and, finally, Section 18 describes the
upscaling tool.
15
Sensu "new" or "unfamiliar".
The user description (ADAS, 2014a) certainly has its strengths: it is concise and provides the user with
comprehensive pointers to the data sources, supporting models and published literature. Together
with the Quick start guide (ADAS, 2014b) and the "Help" sheets within FARMSCOPER it is likely to
provide the naive user with sufficient information to start using FARMSCOPER. However, it does not
(and does not aim to) provide the user with any depth of insight into the complexities of
FARMSCOPER, or highlight key assumptions. Nor does it provide any guidance or warnings as to the
sensible limits of application of FARMSCOPER within any particular context or use.
In addition to the brief user description, we would recommend more comprehensive, user-focussed
documentation describing the purpose, pedigree, assumptions, limitations and use of each
FARMSCOPER workbook. Sections 2 and 3 of Gooday et al. (2014a), which describe the development
of the Cost and Upscale workbooks, provide a good model for such documentation.
References
ADAS (2014a). Farmscoper Version 3 - May 2014 Description. ADAS, Wolverhampton. pp. 23.
ADAS (2014b). Farmscoper Version 3 - May 2014 Installation. ADAS, Wolverhampton. pp. 6.
Anthony, S. (2006). Cost effectiveness of policy instruments for reducing diffuse agricultural pollution.
Part 1 Farm-scale modelling. Defra projects WQ0106 and ES02025, Final Report, ADAS UK Ltd, pp.
44.
Anthony, S.G. and Morrow, K. (2011). Prototype farm scale indicator budget model. Defra project
WQ0111, Final Report, ADAS UK Ltd, pp. ??
Anthony, S., Turner, T., Roberts, A., Harris, D., Hawley, J., Collins, A., Withers, P. (2008). Evaluating
the extent of agricultural phosphorus losses across Wales. Defra project WT0743CSF, Final Report,
ADAS UK Ltd, 281pp.
Anthony, S., Duethman, D., Gooday, R., Harris, D., Newell-Price, P., Chadwick, D., Misselbrook, T.
(2009). Quantitative Assessment of Scenarios for Managing Trade-Off between the Economic
Performance of Agriculture and the Environment and Between Different Environmental Media. Defra
Project WQ0106 (Module 6), Final report, ADAS UK Ltd, 95 pp.
Beck, B. (2002). Model evaluation and performance. In: A.H. El-Shaarawi and W.W. Piegorsch, eds.
Encyclopedia of Environmetrics. Chichester: John Wiley & Sons.
Beven, K. (2009). Environmental modelling: An uncertain future? Routledge, Abingdon. pp. xvii + 310.
Chadwick, D., Chambers, B., Harris, D., Crabtree, R. (2006). Benefits and pollution swapping:
Crosscutting issues for catchment sensitive farming policy. Final report for Defra project WT0706,
29pp + Appendices.
Collins, A., Stromqvist, J., Davison, P., Lord, E. (2007). Appraisal of phosphorus and sediment transfer
in three pilot areas identified for catchment sensitive farming initiative in England – application of
the prototype PSYCHIC model. Soil Use and Management, 23, 117-132.
Davison, P., Withers, P., Lord, E., Betson, M., Stromqvist, J. (2008). PSYCHIC – A process based model
of phosphorus and sediment mobilisation and delivery within agricultural catchments. Part 1 –
Model description and parameterisation. Journal of Hydrology, 350, 290-302.
Deb, K., Agrawal, S., Pratap, A. & Meyarivan, T. (2000). A fast elitist non-dominated sorting genetic
algorithm for multi-objective optimisation : NSGA-II. Kanpur Genetic Algorithms Laboratory, India,
Report No. 200001, 11 pp.
Deb, K., Pratap, A., Agarwal, S., Meyarivan, T. (2002). A Fast and Elitist Multiobjective Genetic
Algorithm: NSGA-II. IEEE Transactions on Evolutionary Computation, 6(2), 182-197.
Gibbons, M. M., Anthony, S. G., Smith, K. A. (2004). SPREADS – A system for controlling the costs and
efficiency of manure and slurry spreading on farms. In: Bernal, M. A., Moral, R., Clemente, R. &
Paredes, C. [eds]. Proceedings of the 11th International Conference of the FAO ESCORENA on the
recycling of agriculture, municipal and industrial residues in agriculture (RAMIRAN 2004), Murcia,
Spain, pp. 349–352.
Gooday, R.D., Anthony, S.G. (2010). Mitigation Method-Centric Framework for Evaluating
Cost-Effectiveness. Defra Project WQ0106 (Module 3), Final report, ADAS UK Ltd.
Gooday, R.D., Anthony, SG., Durrant, C., Harris, D., Lee, D., Metcalfe, P., Newell-Price, P., Turner, A.
(2014a). Farmscoper Extension. Defra Project SCF0104, Final report, ADAS UK Ltd. 83 pp.
Gooday, R. D., Anthony S.G., Chadwick, D.R., Newell-Price P., Harris D., Duethmann D., Fish R., Collins
A.L., Winter M. (2014b). Modelling the cost-effectiveness of mitigation methods for multiple
pollutants at farm scale. Science of the Total Environment, 468-469, 1198-1209.
Lemunyon, J.L., Gilbert, R.G. (1993). The concept and need for a phosphorus assessment tool. J.
Prod. Agric., 6, 483-486.
MATLAB (2009). NSGA-II: A multi-objective optimization algorithm.
http://www.mathworks.com/matlabcentral/fileexchange/10429-nsga-ii-a-multi-objective-optimization-algorithm/content/NSGA-II/html/non_domination_sort_mod.html
Morgan, M.G., Henrion M. (1990) Uncertainty: A Guide to Dealing with Uncertainty in Quantitative
Risk and Policy Analysis. Cambridge University Press, USA.
NRC (2007). Models in Environmental Regulatory Decision Making. National Research Council
Committee on Models in the Regulatory Decision Process, Board on Environmental Studies and
Toxicology, Division on Earth and Life Studies. Washington, D.C. National Academies Press.
Newell Price, J.P., Harris, D., Taylor, M., Williams, J.R., Anthony, S.G., Chadwick, D.R., Chambers, B.J.,
Duethmann, D., Gooday, R.D., Lord, E.I., Misselbrook, T.H., Smith, K.A. (2009). User Manual-‘ALL’. An
Inventory of Methods and their effects on Diffuse Water Pollution, Greenhouse Gas Emissions and
Ammonia Emissions from Agriculture. Defra project WQ0106. Work Package 5. Final Report.
Newell-Price, J. P., Harris, D., Taylor, M., Williams, J., Anthony, S., Deuthmann, D., Gooday, R., Lord,
E., Chambers, B., Chadwick, D., Misselbrook, T. (2011). An Inventory of Mitigation Methods and
Guide to their Effects on Diffuse Water Pollution, Greenhouse Gas Emissions and Ammonia Emissions
from Agriculture. Defra project WQ0106, Final Report, 162pp.
Oreskes, N., Shrader-Frechette, K., Belitz, K. (1994). Verification, validation and confirmation of
numerical models in the earth sciences. Science, 263: 641-646.
Refsgaard, J.C., Van der Sluijs, J.P., Højberg, H.J., Vanrolleghem, P.A. (2007). Uncertainty in the
environmental modeling process: A framework and guidance. Environmental Modelling & Software,
22: 1543-56.
Saltelli, A., Chan, K., Scott, M. (2000). Sensitivity Analysis. New York: John Wiley and Sons.
Saltelli, A. (2002). Sensitivity analysis for importance assessment. Risk Analysis 22: 579-590.
Seshadri, A. (2010). A fast elitist multiobjective genetic algorithm: NSGA-II.
https://church.cs.virginia.edu/genprog/images/2/2f/Nsga_ii.pdf
Slottje, P., van der Sluijs, J.P., Knol, A.B. (2008). Expert Elicitation: Methodological suggestions for its
use in environmental health impact assessments. RIVM Letter report 630004001/2008.
http://www.nusap.net/downloads/reports/Expert_Elicitation.pdf
Stromqvist, J., Collins, A., Davison, P., Lord, E. (2008). PSYCHIC – a process based model of
phosphorus and sediment transfers within agricultural catchments. Part 2 – A preliminary
evaluation. Journal of Hydrology, 350, 303-316.
USEPA (2000). Guidance for Data Quality Assessment. EPA QA/G- 9. U.S. Environmental Protection
Agency, Washington, D.C.
USEPA (2002). Guidance on Environmental Data Verification and Data Validation. EPA QA/G-8. U.S.
Environmental Protection Agency, Washington, D.C.
USEPA (2009). Guidance on the Development, Evaluation, and Application of Environmental Models.
EPA/100/K-09/003 . Council for Regulatory Environmental Modeling, U.S. Environmental Protection
Agency, Washington, DC. vii+90 pp.