
Session 853
Extending Organizational Capacity and Capability
to Evaluate Federal Environmental Research Programs
Developing a Framework that Integrates
Program Design, Management, Accountability & Evaluation
Dale Pahl* and Emma Norland
U.S. EPA Office of Research and Development
October 29, 2005
2005 Joint Conference: Crossing Borders, Crossing Boundaries
Canada Evaluation Society & American Evaluation Association
*Corresponding author contact: [email protected]
Presentation Focus
• Question: How can we develop a framework that . . .
  – Integrates program design, management, accountability, and evaluation?
  – Responds to OMB guidance about the Research & Development Investment Criteria?
  – Communicates clearly—to evaluators, clients, and external stakeholders—about the program’s environmental research and outcomes?
Presentation Focus
• Response: Articulating the program theory for EPA’s environmental research creates a logical framework that . . .
  – Integrates program design, management, accountability, & evaluation;
  – Engages research managers, clients, scientists, and stakeholders across the program’s scope and lifetime; and
  – Enables independent expert panels to evaluate evidence about program relevance, quality, performance, and leadership—with client input
Communicating Program Theory for Environmental Research
Helps Integrate Design, Management, Accountability, & Evaluation
Logic model: Clients → Research Program → Outcomes and Environmental Results
• Research Program (program managers have direct control): Resources → Research Topics & Activities → Research Outputs
• Effective transfer (program managers have direct influence): research outputs reach specific organizations & individuals (clients), whose decisions & actions are the short-term outcomes
• Outcomes and environmental results (agencies have indirect impact): Short-Term Outcomes → Intermediate Outcomes → Long-Term Outcomes, tied to strategic goals & objectives and the mission
• Clients use research (short-term outcomes) … e.g., to make environmental decisions
Communicating Program Theory for Environmental Research
Helps Integrate Design, Management, Accountability, & Evaluation
Programs are designed from RIGHT to LEFT (analysis runs right to left; synthesis runs left to right):
Resources → Research Topics & Activities → Key Research Outputs → Specific Clients → Short-Term Outcomes (intended changes in decisions or actions by specific clients) → Intermediate Outcomes (e.g., improved environmental quality, reduced human exposure) → Long-Term Outcomes (e.g., improved human & ecosystem health)
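The right-to-left idea can be read as a simple backward-chaining exercise: define the long-term outcome first, then ask at each step what must precede it. Below is a minimal, hypothetical Python sketch of that reading; the Stage class and the example entries are illustrative assumptions, not part of EPA’s framework, though the stage names follow the logic model above.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Stage:
    name: str                        # logic-model stage, e.g. "Long-Term Outcomes"
    content: str                     # what the program intends at this stage
    feeds: Optional["Stage"] = None  # the stage this one leads to (left to right)

# Design right to left: start with the outcome, then define what must precede it.
long_term    = Stage("Long-Term Outcomes", "Improved human & ecosystem health")
intermediate = Stage("Intermediate Outcomes", "Improved environmental quality, reduced exposure", feeds=long_term)
short_term   = Stage("Short-Term Outcomes", "Intended changes in client decisions or actions", feeds=intermediate)
clients      = Stage("Specific Clients", "Organizations & individuals who use the research", feeds=short_term)
outputs      = Stage("Key Research Outputs", "Publications, methods, models", feeds=clients)
activities   = Stage("Research Topics & Activities", "Priority topics and projects", feeds=outputs)
resources    = Stage("Resources", "Funding, staff, partnerships", feeds=activities)

# Reading the chain left to right reproduces the logic model.
stage = resources
while stage:
    print(f"{stage.name}: {stage.content}")
    stage = stage.feeds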
Environmental Outcomes, Risk Assessment, & Accountability
Adapted from Presentations to EPA’s Board of Scientific Counselors by Hugh Tilson, Larry Cupitt, and John Vandenberg
Source Emissions → Fate and Transport → Ambient Conditions → Exposure and Dose → Early Signs of Effects → Health Impacts

• Risk assessment helps identify & prioritize scientific questions & knowledge gaps across a program’s environmental outcomes
• Risk assessment is essential to help:
  – Decide whether or not to take regulatory action . . . Is there an environmental hazard?
  – Decide what actions are most effective . . . What actions do we take to protect human health?
  – Understand how to evaluate the effectiveness of our decisions . . . Were we effective?
Environmental Outcomes, Risk Assessment, & Accountability
Adapted from Risk Assessment in the Federal Government: Managing the Process (NRC, 1983); 1997 Update to ORD’s Strategic Plan (EPA, 1997); and OIG-ORD Presentation to EPA’s Deputy
Administrator (Pahl & Norland, March 2002)
EXPOSURE: Pollution Sources / Emissions → Transport & Transformation → Environmental Concentrations → Exposures → Internal Dose → Biological Effect → Adverse Health Effect

RISK ASSESSMENT: Hazard Identification; Exposure Assessment; Dose-Response Assessment (exposure-dose relationships); Health Assessment; Risk Characterization

RISK MANAGEMENT: Risk Management Options, weighed with social, economic, & political factors; legal considerations; and ecosystem and human health considerations

1. Environmental Decisions and Regulations
2. Implementation Decisions: Managers make decisions about how to implement, comply with & enforce regulations or how to remedy environmental problems.
3. Accountability: Developing and measuring appropriate environmental indicators demonstrates whether environmental decisions result in improved human and environmental health.
Short-Term Outcomes:
A Critical Link between Research & Impact
Limitations or gaps in knowledge, attitudes, and skills & abilities needed to respond to the environmental problem
→ Research Topics & Activities
→ Research Outputs (e.g., dissemination thru publications; guideline manuals & training)
→ Transfer to Specific Clients
→ Short-Term Outcomes (research contributions to environmental decisions): intended changes in decisions or actions by specific clients
→ Intermediate Outcomes (e.g., improved environmental quality, reduced human exposure)
→ Long-Term Outcomes (e.g., improved human & ecosystem health)
Short-Term Outcomes:
A Critical Link between Research & Impact
• Short-term outcomes are achieved when key research contributions are transferred to, and used by, intended clients
• Research creates improved knowledge and applications for risk assessment, environmental decisions, regulatory decisions, and accountability:
  – Setting or revising standards—formal EPA rulemaking
  – Implementing standards—e.g., Regions or states develop plans to comply with standards, restore ecosystems, or manage environmental exposure & risk
  – Applying environmental indicators to “measure” progress toward achieving regulations and environmental outcomes—e.g., assessing whether legislation & regulations have the intended environmental impact
____
See, for example: Strategic Research Plan for Particulate Matter, Air Quality Subcommittee of the Committee on the Environment and Natural Resources (CENR), December 2002; Science to Support Rulemaking, EPA Office of Inspector General (OIG) Report 2003-P-00003, November 15, 2002; and Air Quality Management in the United States, National Research Council, 2004.
Communicating Program Theory for Environmental Research
Helps Integrate Design, Management, Accountability, & Evaluation
Programs are designed from RIGHT to LEFT (analysis runs right to left; synthesis runs left to right):
Resources → Research Topics & Activities → Key Research Outputs → Specific Clients → Short-Term Outcomes (intended changes in decisions or actions by specific clients) → Intermediate Outcomes (e.g., improved environmental quality, reduced human exposure) → Long-Term Outcomes (e.g., improved human & ecosystem health)
Organizing Research, Topics, & Activities
• Both applied¹ and use-inspired basic research² contribute to regulatory decisions and improved understanding about environmental outcomes:
  – Applied¹ environmental research is targeted at understanding and solving particular environmental problems.
  – Basic² environmental research elucidates problems that involve complex environmental processes and nonlinear systems with multiple causes and effects.
  – EPA’s basic research is “use-inspired” basic research²—targeted to answering questions about complex problems that cut across EPA’s programs to ensure that decisions are based on a foundation of sound science.
• Typically, the distinction between these types of research is not clear-cut.
__________
¹ EPA’s applied and basic research programs implement recommendations from the National Research Council in Building a Foundation for Sound Environmental Decisions (NRC, 1997).
² Stokes, D.E. “Renewing the Compact between Science and Government,” in 1995 Forum Proceedings, Vannevar Bush II—Science for the 21st Century, pages 15-32. Sigma Xi, 1995.
Organizing Research, Topics, & Activities
Adapted from National Research Council, Building a Foundation for Sound Environmental Decisions (NRC, 1997).
Problem-Driven Research (with feedback: re-evaluate priorities regularly):
  – Identify existing and emerging issues for a specific problem
  – Use risk assessment to rank issues and pinpoint largest uncertainties
  – Narrow EPA focus based on client needs and recognition of what others are doing
  – Identify research topics that improve understanding, reduce uncertainties, and develop client applications
Core Research:
  – Elucidation of complex environmental processes
  – Development of tools
  – Collection of data
  – Select projects based on broad applicability, relevance to EPA, and scientific merit
Organizing Research, Topics, & Activities
• Question: How do we organize and “measure” research when there are no objective methods to:
  – Measure new knowledge as it develops?
  – Manage the pace at which research progresses?
  – Measure research quality and impact?
Organizing Research, Topics, & Activities
• Response:
  – Organize the research with priority research topics
  – Assess research progress and priorities with periodic meetings
  – Convene independent expert panels to evaluate the improved knowledge and its applications with indicators that span a program’s scope & lifetime
__________
¹ For example, see Averch, H.A., “The Systematic Use of Expert Judgment,” pages 294-295 in Handbook of Practical Program Evaluation, J.S. Wholey, H.P. Hatry, and K.E. Newcomer (eds.), San Francisco: Jossey-Bass, 1994.
² The Government Performance and Results Act: 1997 Governmentwide Implementation Will Be Uneven. U.S. GAO/GGD (1997).
Developing Indicators Across a Program’s Scope & Lifetime
Potential indicators, each rated against the evaluation criteria of quality, relevance, performance, and scientific leadership (very useful / useful / limited usefulness):
  – Peer Review (multi-year research plan)
  – Peer Review (topic / project / activity)
  – Bibliometric Analysis (publications)
  – Client Feedback (client use of research)
  – Independent Expert Review (including client feedback)
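Read as data, this slide is a small indicator-by-criterion rating matrix. The Python sketch below is a hypothetical illustration only: the cell ratings are placeholders rather than the slide’s actual ratings, and the helper function is not part of any EPA tool.

# Hypothetical sketch of an indicator-by-criterion rating matrix.
# The rating values below are placeholders, not the slide's actual ratings.
RATINGS = {"very useful": 3, "useful": 2, "limited usefulness": 1}

matrix = {
    "Peer Review (multi-year research plan)": {
        "quality": "very useful", "relevance": "useful",
        "performance": "limited usefulness", "leadership": "useful"},
    "Bibliometric Analysis (publications)": {
        "quality": "useful", "relevance": "limited usefulness",
        "performance": "useful", "leadership": "useful"},
    "Independent Expert Review (including client feedback)": {
        "quality": "very useful", "relevance": "very useful",
        "performance": "very useful", "leadership": "very useful"},
}

def rank_indicators(criterion):
    """Sort indicators from most to least useful for one evaluation criterion."""
    return sorted(matrix, key=lambda ind: RATINGS[matrix[ind][criterion]], reverse=True)

print(rank_indicators("performance"))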
The Importance of Independent Expert Evaluation
• The systematic use of independent expert judgment is important¹,² when:
  – It is difficult to measure program progress or outputs (e.g., advances in research knowledge) needed to achieve outcomes
  – Multidisciplinary expertise is needed to evaluate scientific progress that responds to research topics and scientific questions
  – It is difficult to determine when outcomes can be attributed to the program
  – Agency programs involve research, regulation, or external partners such as state agencies
• These criteria illustrate why independent expert review is important for evaluating EPA’s research programs
_____
¹ For example, see Averch, H.A., “The Systematic Use of Expert Judgment,” pages 294-295 in Handbook of Practical Program Evaluation, J.S. Wholey, H.P. Hatry, and K.E. Newcomer (eds.), San Francisco: Jossey-Bass, 1994.
² The Government Performance and Results Act: 1997 Governmentwide Implementation Will Be Uneven. U.S. GAO/GGD (1997).
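To make the panel-evaluation step concrete, here is a hypothetical sketch of how ratings from an independent expert panel might be rolled up across the four criteria. The 1-5 scale, the reviewer labels, and the averaging rule are assumptions for illustration; they are not drawn from EPA or GAO guidance, and a real panel would also record narrative findings.

from statistics import mean

CRITERIA = ("relevance", "quality", "performance", "leadership")

# Hypothetical panel ratings on a 1-5 scale; reviewers and scores are illustrative only.
panel_ratings = {
    "Reviewer A": {"relevance": 4, "quality": 5, "performance": 3, "leadership": 4},
    "Reviewer B": {"relevance": 5, "quality": 4, "performance": 4, "leadership": 4},
    "Reviewer C": {"relevance": 4, "quality": 4, "performance": 3, "leadership": 5},
}

def summarize(ratings):
    """Average each criterion across panel members (one simple aggregation choice)."""
    return {c: round(mean(r[c] for r in ratings.values()), 2) for c in CRITERIA}

print(summarize(panel_ratings))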
Summary:
• EPA follows a systematic approach¹,² to organize, integrate, synthesize, and evaluate research that informs environmental decisions by specific clients:
  – Develop an integrated risk assessment framework to synthesize available information about a specific environmental problem
  – Assess client needs, knowledge gaps, research questions, and uncertainties
  – Create research topics needed to develop knowledge & evaluate evidence related to the gaps, questions, and uncertainties
  – Monitor progress in implementing the research topics and in applying the improved knowledge base
  – Evaluate the improved knowledge base and client applications
• The next slide illustrates how program theory creates a framework that integrates these objectives . . .
__________
¹ For an example of EPA’s systematic approach, see Strategic Research Plan for Particulate Matter, Air Quality Subcommittee of the Committee on the Environment and Natural Resources (CENR), December 2002.
² For additional background information, see Risk Assessment in the Federal Government: Managing the Process, National Research Council, 1983; and Environmental Research and Development: Strengthening the Federal Infrastructure, Carnegie Commission on Science, Technology and Government, 1992.
Programs are designed from RIGHT to LEFT: WHY? WHAT? HOW? (the right questions, the right science)

Evidence and indicators across the logic model:

Resources & Inputs:
  – A risk-based assessment of knowledge gaps & needs related to EPA’s strategic goals—including strategic guidance from independent advisors such as the NAS
  – Research Coordination Teams assess and prioritize client needs for research to inform policy, decision-making, and accountability
  – Research Coordination Teams select priority topics or questions to organize the research that leads to outcomes
  – Partnerships provide FTE or $ to support priority research topics linked to EPA outcomes & strategic goals

Research Topics & Activities:
  – The number, sequence, and distribution of research projects / activities that respond to the priority research topics
  – The number of research projects / activities completed during the past 5 years that respond to the priority topics
  – The program ensures high quality research with merit-based competitive awards (e.g., through grant mechanisms) that respond to the priority research topics
  – The program sponsors periodic meetings to assess ongoing research progress and priorities

Research Outputs:
  – Peer-reviewed publications that respond to the priority research topics, as measured by bibliometric analysis
  – The information value of the publications (for example, in developing new knowledge or in reducing uncertainty), as measured by bibliometric analysis

Specific Clients:
  – The program integrates new knowledge and key research contributions for application at key decision-points (e.g., regulations) by specific clients
  – The program collaborates with clients to transfer & demonstrate key research contributions
  – Research Coordination Teams assess client feedback on key research contributions

Short-Term Outcomes:
  – Client use of research knowledge and applications at key decision-points
  – The information-value or decision-value of the research knowledge and applications used by clients

Environmental Outcomes, Strategic Goals, and Long-Term Outcomes:
  – The information-value or decision-value of the research knowledge or applications for understanding complex environmental processes
  – The information-value or decision-value of the research knowledge for developing indicators of public health or ecosystem health
  – The information-value or decision-value of the research knowledge or applications for measuring environmental progress, developing risk mitigation approaches, or evaluating the effectiveness of regulations
  – The information-value or decision-value of the research knowledge or applications for helping EPA to achieve its strategic goals
  – The relevance of the program’s outcomes to EPA’s mission and legislative mandates
  – Progress to achieve outcomes is measured with a small number of long-term goals and measures

Scientific leadership combines several elements: (1) The program has developed priority research topics that help EPA achieve its strategic goals; (2) The program has developed partnerships that support & coordinate research which focuses on one or more of these topics; (3) Bibliometric analysis indicates that the program has made significant contributions that respond to the priority research topics; and (4) The program’s principal investigators are recognized leaders in their research disciplines.
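Several of the cells above rest on simple counts: research projects or peer-reviewed publications per priority topic over a fixed window. A minimal Python sketch of that kind of tally is shown below; the record layout, field names, and example entries are hypothetical, not EPA data.

from collections import Counter

# Hypothetical publication records; fields and values are illustrative only, not EPA data.
publications = [
    {"topic": "particulate matter exposure", "year": 2003, "peer_reviewed": True},
    {"topic": "particulate matter exposure", "year": 2004, "peer_reviewed": True},
    {"topic": "ecosystem indicators", "year": 1999, "peer_reviewed": True},
]

def outputs_per_topic(records, since_year):
    """Count peer-reviewed publications per priority topic from since_year onward."""
    return dict(Counter(
        r["topic"] for r in records
        if r["peer_reviewed"] and r["year"] >= since_year
    ))

# e.g., outputs completed during the past five years, as of this 2005 presentation
print(outputs_per_topic(publications, since_year=2000))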
Evidence about program relevance, quality, performance, and leadership—evaluated by independent expert panels