Operationalizing CLA

Illustrative results include:
improved M&E function, better able to support mission and partner decision making
and learning
coordination and collaboration for greater synergy and effectiveness
mission staff, partners and others enabled to take a more analytic and adaptive
approach to keeping development interventions relevant, optimized within the larger
context, and more impactful
Illustrative interventions include:
collaborating opportunities
– across DO teams for intersectoral perspectives to inform sectoral work
– between USAID and its implementing partners, to build trust that will enable joint learning
– among implementing partners, for peer assist, activity synergy
– with external stakeholders for better coordination, learning and overall relationship building
help partners coordinate and collaborate by:
o mapping who’s doing what where
o developing a collaboration plan
o helping to facilitate/catalyze coordination and collaboration
work more closely with government counterparts and the local social science
community
strengthen participation in joint donor sector reviews to exercise more meaningful
influence
GIS: map USAID and other investments against demographic, economic, health and other
data; share with partners, other donors, and government (see the sketch after this list)
relationship facilitation training and implementation
influence planning with DO teams and partners
expertise locator system
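A minimal sketch of the GIS overlay described above, assuming Python with geopandas; the file names and column names are hypothetical, not references to any mission system:

    # Sketch only: join activity locations to district-level demographic data
    # and summarize funding by district and sector for sharing with partners.
    import geopandas as gpd

    districts = gpd.read_file("districts_demographics.shp")   # hypothetical polygons: population, poverty, health indicators
    activities = gpd.read_file("usaid_activities.geojson")    # hypothetical points: activity, sector, funding_usd

    # Attach district attributes to each activity location with a spatial join.
    joined = gpd.sjoin(activities, districts.to_crs(activities.crs),
                       how="left", predicate="within")

    # Roll up investments by district and sector into a shareable table.
    summary = (joined.groupby(["district_name", "sector"])["funding_usd"]
               .sum()
               .reset_index())
    summary.to_csv("investments_by_district_sector.csv", index=False)

The resulting table, along with the underlying map layers, is the kind of product that could be shared with partners, other donors and government counterparts as described above.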
learning opportunities
strengthen the feedback from monitoring and evaluation so that findings are
systematically used in planning activities and projects
identify and fill knowledge gaps: syntheses of existing research, new research/special
studies, knowledge exchange, expert panels/advisory groups
access STTA advisory services of sector experts, local leaders in government,
academia/social science, civil society, etc., for
o overall program strategy and implementation
o specific thematic issues
o tracking of game changers and other broad trends that affect USAID
bring partners together for discussions to ground-truth and elaborate on the data and
analysis that come out of the M&E system
help with refining development hypotheses and results frameworks, iteratively over the
course of CDCS implementation
develop and disseminate game-changer bulletins; facilitate discussion of findings and
implications for strategy and program implementation
conduct and share exit interviews with IPs from closed-out projects
articulating technical learning agendas: knowledge gaps, hypothesis testing, impact
evaluation
active monitoring of direct interventions, indirect influencing, game changers
testing development hypotheses and theories of change; impact evaluations
structured peer assist (and training for peer assist)
informal and tacit knowledge exchange
continuity: in-briefing for new staff led by FSNs; mission programmatic history lite
delivered through interviews and other digestible formats; exit interviews with
departing IPs and mission staff
revised portfolio review aimed at project (not activity) level, focused on learning as well
as accountability
big picture reflections – large meetings, smaller intensive review working meetings
exchanges/rotations, internal to the mission (e.g., A&A staff on rotation with DO team)
and external (e.g., mission-to-mission)
support for feedback/learning loops – plan, align incentives, seed, monitor, encourage,
assess for practical implications, etc.
capturing and sharing any of these activities and their results with other missions and
partners
develop a knowledge transfer and coaching plan to build mission capacity in facilitating
collaborating, learning and adapting*
adapting opportunities
develop a process for rolling work plans – how to create and implement them, and
align M&E activities with them
synthesize and analyze findings, and determine what changes need to be made
adaptation workshops/meetings following portfolio reviews, big picture reflections, and
sudden game-changing events
support to the program office on aligning mission processes (such as PMPs, portfolio
reviews) and developing language on learning for assistance agreements, etc., to support
collaboration, rolling work plans, adaptive management and other features of an
operationalized CLA methodology
specific M&E opportunities
assist mission staff and partners to understand the rationale for and role of different
types of monitoring and evaluation activities to inform selection of M&E approach,
scoping, and implementation of findings
produce quality scopes for evaluations, and quality evaluations
work with DO teams to define quality, appropriate PMPs that capture the kinds of data
collection and analysis needed, given the portfolio and the rolling nature of work
planning, to support learning and adaptation
track and analyze trend data
respond to data calls quickly with quality information, create reports according to
mission needs
support initiative-specific and initiative-level data calls; have the data and analysis
coalesce around the initiatives to show results at that level
PRS – more analysis should come out of the system; both analysis and raw data are
needed for a discussion about whether the data support the analysis
slice the data and the analysis of it by sector to enable stronger participation and
greater influence in joint donor sector reviews
support cost-benefit analysis with: (i) expertise in CBA, (ii) local knowledge, and (iii)
labor power to do the associated legwork (a worked sketch follows this list)
enhance qualitative data collection and analysis; complement the data collection and
analysis with collaborative discussions among USAID staff and local partners to help
everyone understand the data collection choices, process and outputs, and the data
analysis; and to engage everyone in filling in the parts of the story that the data don’t
tell
understand and respond to the needs of AOTRs and IPs
overall, think through how to improve the mission’s approach to using the findings from
monitoring and evaluation in planning and implementing activities and projects
monitor environmental compliance
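A minimal sketch of the arithmetic behind the cost-benefit analysis support noted above; all figures and the discount rate are illustrative assumptions, not program data:

    # Sketch only: discount illustrative annual benefit and cost streams to
    # present value and compare them. Every number here is an assumption.
    def npv(flows, rate):
        """Net present value of annual cash flows, year 0 first."""
        return sum(f / (1 + rate) ** t for t, f in enumerate(flows))

    benefits = [0, 120_000, 150_000, 150_000, 150_000]   # illustrative benefit stream
    costs = [300_000, 40_000, 40_000, 40_000, 40_000]    # illustrative cost stream
    rate = 0.10                                           # assumed discount rate

    net_benefit = npv(benefits, rate) - npv(costs, rate)
    bc_ratio = npv(benefits, rate) / npv(costs, rate)
    print(f"NPV of net benefits: {net_benefit:,.0f}")
    print(f"Benefit-cost ratio: {bc_ratio:.2f}")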
* In parallel, develop a plan, phased over the five years, to build internal mission capacity in
these areas in order to expand the human resources available to support this work, and to
meet more of these needs in-house.
January 19, 2012