M&E in the GEF
Aaron Zazueta
GEF Evaluation Office
Expanded Constituency Workshop
Dalat, Vietnam - April 2011
 Results-based management (RBM),
and Monitoring and evaluation (M&E)
in GEF-5
 M&E policy for GEF-5
 M&E Minimum Requirements
 Involvement of focal points
 Questions on NCSA evaluation
 Evaluation planning for GEF-5
 Results-based management: setting goals
and objectives, monitoring, learning, and
decision making
 Evaluation is a “reality check” on RBM
 RBM, especially monitoring, tells whether
the organization is “on track”
 Evaluation can tell whether the
organization is “on the right track”
Two overarching objectives:
Promote accountability for the achievement of
GEF objectives through the assessment of
results, effectiveness, processes, and
performance of the partners involved in GEF
activities.
Promote learning, feedback, and knowledge
sharing on results and lessons learned among
the GEF and its partners as a basis for decision
making on policies, strategies, program
management, programs, and projects; and to
improve knowledge and performance.
 Reference to GEF Results-based Management (RBM)
 Strengthened knowledge sharing and learning
 Clarification of roles and responsibilities
 Stronger role for GEF Operational Focal Points in M&E
 Inclusion of programs and jointly implemented
projects
 Baseline data for M&E to be established by CEO
endorsement
 New Minimum Requirement on engagement of GEF
Operational Focal Points in project and program M&E
activities
Diagram: the GEF results framework. Institutional level (top-down): global environmental benefits (GEB) impacts, GEF strategic goals, and focal area goals. Operating level (bottom-up): project objectives, outputs, and outcomes contributing to focal area objectives.
Diagram: the project and program cycle. Design (LFA/results framework, M&E plan); implementation (monitoring of progress, with midpoint course correction as needed); evaluation (terminal evaluations, lessons learned); and management, monitoring, and learning (lessons learned and good practices).
Adapted from the World Bank’s Results Focus in Country Assistance Strategies, July 2005, p. 13

M&E contributes to knowledge building
and organizational improvement
 Findings and lessons should be accessible to target
audiences in a user-friendly way
 Evaluation reports should be subject to a dynamic
dissemination strategy
Knowledge sharing enables partners to
capitalize on lessons learned from experiences
Purpose of KM in the GEF:
 Promotion of a culture of learning
 Application of lessons learned
 Feedback to new activities
Diagram: M&E roles and reporting flows among GEF partners.
 GEF Council: annual evaluation reports, the Overall Performance Study (to the Assembly), the annual work program and budget, and evaluation management responses
 GEF Evaluation Office: corporate evaluations and independent project and program evaluations
 Agency evaluation units: project and program evaluations
 GEF Secretariat: results-based management, programming documents and indicators, and the Annual Monitoring Report
 Agency GEF coordination units: project documents with M&E plans, project and program implementation reports, and Agency portfolio reports
 GEF projects and programs: project and program monitoring documentation and terminal evaluations
Diagram: roles in GEF M&E. The Council provides the enabling environment (the M&E Policy); the GEF Evaluation Office provides oversight, working with evaluation partners; STAP provides advice; other actors shown are the GEF Secretariat, the GEF Agencies, partner countries, NGOs, the private sector, and communities.
A management response is required for all
evaluations and performance reports presented to
the GEF Council by the GEF EO
GEF Council takes into account both the evaluation
and the management response when taking a
decision
GEF EO reports on implementation of decisions
annually (Management Action Record)
In the case of Country Portfolio Evaluations,
countries have the opportunity to provide their
perspective to the Council as well
Design of M&E Plans
 Concrete and fully budgeted M&E plan by CEO
endorsement for FSP and CEO approval for MSP
 SMART indicators
 Projects should align with GEF focal area results
frameworks
 Baseline data for M&E by CEO endorsement
 Mid Term Reviews (where required or foreseen)
and Terminal Evaluations included in plan
 Organizational set-up and budget for M&E
Implementation of M&E Plans
Project/program monitoring and supervision will
include execution of the M&E plan:
 SMART indicators for process/implementation
 SMART indicators for results
 Baseline for the project fully established and data
compiled to review progress
 Organizational set-up for M&E is operational and
its budget is spent as planned
Project/Program Evaluations:
 All full-sized projects and programs will be
evaluated at the end of implementation.
 Evaluations should:
 Be independent of project management, or be reviewed by the GEF Agency
evaluation unit
 Apply the evaluation norms and standards of the GEF Agency
 Assess, as a minimum, outputs and outcomes, likelihood of
sustainability, and compliance with M&E minimum requirements 1 & 2
 Contain: data on the evaluation itself (including TORs), basic project
data, and lessons
 Be sent to the GEF EO within 12 months of completion of the
project/program
Guidelines for evaluating MSPs/EAs will be developed
Engagement of Operational Focal Points
 M&E plans should include how OFPs will be engaged
 OFPs to be informed of M&E activities, including Mid Term Reviews and Terminal Evaluations, and to receive drafts for comment as well as final reports
 OFPs invited to contribute to the management response (where applicable)
 GEF Agencies keep track of the application of this requirement in their GEF-financed projects and programs
 Keep track of GEF support at the national level.
 Keep stakeholders informed of and consulted on the plans,
implementation, and results of GEF activities in the country.
 Disseminate M&E information, promoting use of evaluation
recommendations and lessons learned.
 Assist the Evaluation Office, as the first point of entry into a
country:
 identify major relevant stakeholders,
 coordinate meetings,
 assist with agendas,
 coordinate country responses to these evaluations.
GEF-5 Cross-cutting capacity development
strategy:
 Fifth component: enhancing capacities to monitor and
evaluate environmental impacts and trends
 This should be identified as a priority in the NCSA
capacity development action plan
 The capacity development plan should be formulated as
a medium-sized project, or integrated into a broader
proposal formulated as an MSP or FSP; if an MSP, it
should have 1:1 co-funding
 Development of regional partnerships could be
considered
 Funding from $44m set-aside for capacity development
Consolidation and strengthening of the four
streams of evaluative evidence:
 Country Portfolio Evaluations: up to 15 during GEF-5
 Impact Evaluations: International Waters, Climate
Change and other focal areas
 Performance Evaluations: APR continued and
strengthened as well as independent process reviews
 Thematic Evaluations: focal area strategies and
adaptation
These streams of evaluative evidence will enable
a timely OPS5 for which less additional work
should be needed than for OPS4
 Verification and ratings of outcome and progress
toward impact
 Coverage of the reform process: GEF project cycle and
modalities, direct access, STAR, paragraph 28
 Increased attention to the catalytic role of the GEF
 Trends in ownership and country drivenness
 Trends in global environmental problems and
relevance of the GEF to the conventions
 More in-depth look at the focal area strategies,
including sustainable forestry management
 Better understanding of the longer term impact of the
GEF
 Support to NCSAs was one of the approaches to implementing
the GEF capacity development strategy and the guidance to the
GEF from the UN conventions
 NCSAs aimed to identify country-level priorities and needs
for capacity development to address global environmental
issues, through a holistic, long-term, country-driven and
country-led approach
 As of August 2010:
 153 NCSAs approved ($28.7 million), 119 completed (UNDP: 76%;
UNEP: 23%; WB: 1%)
 23 approved second phases to implement NCSAs recommendations
(more in GEF5)
 Global Support Programme for NCSA (completed)
 Evaluation under preparation, report expected for the
November 2011 GEF Council
Issues that will be tackled in the NCSA evaluation and that can
be raised with us at this meeting:
 To what extent have NCSAs been relevant to your country’s
needs and priorities? Have they been relevant to support
the implementation of conventions?
 What was the process of NCSA preparation? Who
participated?
 What are the main achievements and results of the NCSAs?
 Was capacity development improved during the
implementation of NCSAs? Any specific examples?
 What is the sustainability of the capacity developed? Any
specific examples?
 Other issues to be included?
Discussion
Q&A on the new GEF M&E Policy
Any comments on the issues that will be
tackled in the NCSA evaluation? Other issues
to be included?
Thank you!