CAPC BB Evaluation Panel Presentation

“If you don’t know where you are going, how are you gonna know when you get there?”
— Yogi Berra
Common Elements of Strong Program Evaluation
THEORY OF CHANGE: Cause/effect relationship of program
LOGIC MODEL: Identifies key program components and
their relationship to program outcomes
PERFORMANCE MEASUREMENT &
IMPACT EVALUATION: Differences between the two
Theory of Change Main Elements
If you deliver the intervention as planned, it will bring about a
measurable change/outcome in the community/participants in
relation to your identified community need.
Community Need/Problem (statistics) → Specific Intervention (supports) → Intended Outcome (cause/effect)
LOGIC MODEL
INPUTS: What we invest
ACTIVITIES: What we do
OUTPUTS: Direct products from program activities
EVIDENCE OF CHANGE/OUTCOMES
Short-Term Outcomes (one year): changes in knowledge, skills, attitudes
Medium-Term Outcomes (three years): changes in behavior or action resulting from new knowledge
Long-Term Outcomes (ten years): meaningful changes in condition
Birth & Beyond Evaluation Practices
A home visitation program at 9 Family Resource Centers serving parents at risk of
child abuse and neglect to prevent entry/re-entry into Child Protective Services
 Common Outcomes
 Sacramento County Commitment to CPS Evaluation
 External Evaluator from the outset
 Shared Database across all 9 sites
 Monthly Data Review distributed program wide
 Quarterly and Annual Reports
 Program Improvement Based on Data & Outcomes
Birth & Beyond Lessons Learned
 Integrated evidence-based practices into the model
 Intervention must align with assessments
 Analyze attrition of those served and ways to improve
 Member training aligns with intervention
 Since 2008, serving higher % of at-risk families
 Data facilitates peer to peer learning
 Track/monitor short-term outcomes to ensure meeting medium-term outcomes
 Expanded services by adding one FRC (based on data)
Birth & Beyond Evaluation Challenge
Impact Evaluation: strives for high rigor with scientifically-based research
design comparing people exposed to an intervention (“experimental”
group) to people not exposed (“control” or “comparison” group)
How do you design an impact evaluation that does not deny services to
parents at risk of child abuse & neglect?
 Technical assistance from NORC
 Work in partnership with Sacramento County Child Protective Services to
enhance the B&B evaluation by conducting a quasi-experimental study
 Select a matched sample of families who have been referred to CPS during
the same time as Birth & Beyond families and did NOT receive Birth &
Beyond services (“control” or “comparison” group)
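The matched-sample step above can be sketched as a simple nearest-neighbor match. This is a minimal illustration only: the record layout, field names, and matching covariates (referral year, child age, prior reports) are hypothetical assumptions, not the variables NORC or B&B actually used.

```python
def match_comparison(served, candidates,
                     keys=("referral_year", "child_age", "prior_reports")):
    """For each served family, pick the as-yet-unmatched candidate with the
    smallest total absolute difference on the matching covariates."""
    pool = list(candidates)
    matches = {}
    for fam in served:
        # Closest candidate by summed absolute covariate distance.
        best = min(pool, key=lambda c: sum(abs(fam[k] - c[k]) for k in keys))
        pool.remove(best)  # match without replacement
        matches[fam["id"]] = best["id"]
    return matches

served = [{"id": "S1", "referral_year": 2011, "child_age": 2, "prior_reports": 1}]
candidates = [
    {"id": "C1", "referral_year": 2011, "child_age": 3, "prior_reports": 1},
    {"id": "C2", "referral_year": 2013, "child_age": 7, "prior_reports": 4},
]
print(match_comparison(served, candidates))  # C1 is the closer match
```

A production design would match within the same referral window and check covariate balance afterward; this sketch only shows the core pairing idea.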
2010–2013 Child Protective Services
Pre-, During, and Post-Program
Percentage of CPS Reports by Most Severe Disposition (n=1,943)
[Chart: distribution of dispositions (None, Unfound/Unknown, Inconclusive, Substantiated) across the Pre-Program, During, and Post-Program periods.]
20 Year Child Death Review Team Report
Child Abuse and Neglect (CAN) Homicides
Sacramento County Resident Deaths
Rolling Five Year Average of Rates
[Chart: child death rate per 100,000 children, plotted as rolling five-year averages for windows from 90-95 through 05-09.]
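The rolling five-year averaging used in the chart above can be sketched as follows; the annual rates here are illustrative placeholders, not the report's actual figures.

```python
# Illustrative annual rates per 100,000 children (placeholder values).
annual_rates = {
    1990: 3.5, 1991: 3.4, 1992: 3.3, 1993: 3.2,
    1994: 3.0, 1995: 2.8, 1996: 2.6,
}

def rolling_five_year(rates):
    """Average each complete consecutive five-year window of annual rates,
    keyed by a 'YY-YY' window label like those on the chart's x-axis."""
    return {
        f"{start % 100:02d}-{(start + 4) % 100:02d}":
            round(sum(rates[y] for y in range(start, start + 5)) / 5, 2)
        for start in sorted(rates)
        if all(y in rates for y in range(start, start + 5))
    }

print(rolling_five_year(annual_rates))
# {'90-94': 3.28, '91-95': 3.14, '92-96': 2.98}
```

Rolling averages smooth year-to-year noise in rare events such as CAN homicides, which is why the report presents five-year windows rather than single-year rates.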
PERFORMANCE MEASUREMENT & IMPACT EVALUATION
Performance Measurement
Provides a view of how program is functioning
Does not “prove” that an intervention caused an outcome
Strives for high quality data, striking a balance between rigor & staff time
Focuses on shorter term changes, observed within a year
Impact Evaluation
Strives for high rigor with scientifically-based research designs
Seeks to prove a causal relationship: the program is the specific cause
of improvements within the target population
Evidence that intervention causes the outcomes
Focuses on longer term changes as well as short term outcomes