Strategy and Learning: Threading It Together with Kim Goll

Connecting Evaluation, Strategy, and Learning: Threading It Together
with Kim Goll
Featuring Joelle Cook, Jennifer Li Shen, and Paul Harder
San Diego Room
February 4, 2016
3:00pm – 5:15pm
Agenda
• Welcome and Introductions – Kim Goll
• Case 1 Presentation and Discussion – Jennifer Li Shen
• Case 2 Presentation and Discussion – Paul Harder
• Break
• Case 3 Presentation and Discussion – Joelle Cook
• Debrief and Closing – Kim Goll
The Elephant
[Diagram: the evaluation “elephant,” whose parts are Context and Purpose, Learning Questions, Audience, Approach and Methods, Measures, Timeframe, Resources, Interpretation and Insights, and Reporting and Storytelling, framed by four questions: 1) Why, 2) What, 3) Who, and 4) How]
Case 1: Going “Beyond the Test” While Embedding Learning and Performance Management
The Why
• To assist with developing a growth strategy, make the case for success, learn “what matters most” for performance/accountability, and manage to organizational priorities
The What
• Learning questions: What is our unique “way” and target market, and can we afford it? How do we hold ourselves accountable beyond the API?
• Timing: 1st process in ’08, then ’12; now in implementation year 3; 10-year impact statement with 3–5 year goals, annual cycle
• Measures: board dashboard, 15 indicators; management level, 30–35; iterative
The Who
• Joint board–staff taskforce; funder looking for growth; board engaged regularly; targeted consultant support; school community, teachers
• Now, performance managed by the CEO, COO & CAO; reporting to board/public
The How
• As part of strategy formation, created an impact statement, scenarios, measures, and targets; management team drilled down with site leaders
• Annual cycle with check-ins, communications; embedded capacity
Key Takeaways
• A journey of organizational change; builds engagement, alignment, culture, prioritization, learning, accountability; takes resources, time, patience, commitment
• Make it a discipline; it’s iterative…think 1.0, 2.0, and 3.0 versions of the dashboard
Case 1: Context
A local, charter school organization that serves a historically underserved
community in Central Los Angeles was considering how best to grow amidst its
early successes in quality education and significant changes within the broader
education landscape. With funding for its first planning effort in 2008, a joint
taskforce of staff and board leaders set about defining a tangible vision for impact
and definition of success to go “beyond the test.” An explicit goal was to align the
growing organization and school sites, while pursuing growth with modest
resourcing.
The evaluation and learning effort included developing a theory of change and
measures of intermediate and ultimate outcomes as an integral part of the strategy.
Later, in 2012, the organization underwent a plan and dashboard “refresh” effort,
since the opportunities and funding in public education had changed
tremendously. Blue Garnet was retained as a planning consultant first in 2008 and
then again in 2012. The charter school is now in Implementation Year 3 of its
second plan, and serves nearly 3,500 students annually through seven school sites.
[Images: snapshot of dashboard; sharing theory of change at a partner forum]
Case 1: Discussion Questions
1. As a funder, have you funded this type of effort? As a nonprofit
leader, have you received funding for this type of effort? Can you
share some of the challenges that may have arisen related to
differing expectations or unexpected results?
2. What strategic question is your organization/program facing?
What are you trying to learn?
3. Have you considered how evaluation and learning can serve as a
mutual accountability tool? In what ways can it?
4. In what ways, and at what points, have you engaged key
stakeholders in the process of defining impact, evaluating
success, and sharing results?
Case 2: Developmental Evaluation of
a Youth Service Program
The Why
• Organization wants to demonstrate impact to its biggest funder
• Organization wants to promote its two-tier model to the field
• Funder wants to better understand how the model works
The What
• Funder asks the evaluator for a developmental evaluation in three stages: theory of change, process evaluation, and outcome evaluation
The Who
• Organization
• Network members and advisors
• Funder
• Evaluator
The How
• Process: facilitated real-time learning, collaborative theory of change
• Data collection: interviews, focus groups, secondary data
Key Takeaways
• Organizational learning has organizational impact
• Need for flexibility in focus and methods
• Benefit of good planning and process evaluation
Case 2: Context
The organization is a youth-serving organization that helps young people transition to
successful employment. While it provides some limited services and training using its own
small staff, its main strategy has been to use its network of other providers to deliver a
more comprehensive set of supports to youth. The organization’s goals are to help
low-income youth transition successfully to employment and maintain that employment,
to support employers in being better able to work with young people, and to build the
capacity of its network member organizations. The network model is a core element of the
organization’s brand.
The organization’s largest funder asked Harder+Company Community Research to conduct
a three-stage project using a developmental evaluation framework. Starting with the
development of a theory of change, the project moved into a process evaluation and then
finally into an assessment of impacts. Developmental evaluation is an approach that
focuses on real-time learning for emerging programs and avoids formal measurement in
the formative stages. It is a learning approach that encourages insight and reflection rather
than premature accountability.
Case 2: Discussion Questions
1) What are ways to partner with your funder in
learning and assessment?
2) How can an evaluation evolve to reflect the most
important questions?
3) How can learning and assessment help you to
promote your unique model?
Case 3: Now What? How a Developmental Evaluation
Informed Strategy for a Province-Wide Initiative
The Why
• To assist AFWI leadership and its stakeholders in understanding progress to date and in making strategic decisions going forward
The What
• 8-month, two-phase timeline
• Evaluation questions and outcomes at multiple levels (individual, organizational, and system)
The Who
• Staff, board, advisory committee
• Grantees, partners, stakeholders, and the field
The How
• Developmental evaluation approach
• Document review, surveys, interviews, reflective practice sessions, and ripple effects mapping
Key Takeaways
• Evaluation data supports strategic decision making
• Knowing a lot about a few things is more valuable than knowing a little about a lot of things
Case 3: Context
The Palix Foundation established the Alberta Family Wellness Initiative (AFWI) in 2009 as
a platform to invest in improving the health and wellness of children and families in the
province. This initiative is based on the understanding that there is a link between early
life experiences and brain development, which subsequently contributes to health and
wellness outcomes throughout life. Given the cross-sector and multi-disciplinary nature
of what AFWI is aiming to achieve, the initiative was set up as a knowledge mobilization
effort to engage and catalyze relationships across stakeholders from science, policy, and
practice domains. The initiative deployed a number of strategies, including convening,
informing, educating, and creating engagement across diverse stakeholders from
academia, health, human services, justice, and education. It also supported and facilitated
the understanding and application of this knowledge to catalyze system-level, integrated
change in policy, service provision, and on-the-ground practice, rooted in cross-sector
collaboration for the ultimate benefit of children and families. FSG conducted an
interim, developmental evaluation from November 2013 to June 2014.
Case 3: Discussion Questions
1) Think about upcoming strategy discussions that you will
have within your own organization. How might you use
evaluation to inform those discussions?
2) Think about the vignettes created as part of the evaluation.
What elements of your work might you want to explore
using this smaller-scale approach?
3) The evaluation used participatory methods. How might your
organization use these or other participatory methods to
more intentionally engage grantees or other partners in
collective sense-making?