
IMPLEMENTATION AND PROCESS EVALUATION
PBAF 526
Today:
• Recap last week
  • Unanswered question…
  • Next week: bring in a picture with program theory and evaluation questions
• Overview of Implementation and Process Evaluation
  • PATH in Jamaica
• Monitoring and Performance Management
  • Zinc in Nepal
Process and Implementation Evaluation
• Focused on key “what are we doing” questions (pp. 170-171, Rossi; pp. 320-322, Patton)
• What activities are taking place?
• How are those activities being delivered?
• Which clients are getting which services? When? Are the neediest clients getting the services they need?
• How’s the quality of services?
• Are staffing and resources appropriate?
• How do the activities match what was planned?
• How do they vary across case workers, program sites, or over time? Is that appropriate?
• Program design is the basis for designing the process evaluation
• Consider the program process theory
• Look for processes critical to causal links
• A flow chart of program services is key
• Expect program adaptation over time (that’s how managers are successful!)
• Listen to the experiences of those at the front line (clients, staff), as well as those outside (collaborators, support functions)
Process and Implementation Evaluation
• Qualitative methods: interviews, focus groups, site visits, observations
  • Provide insider and evaluator perceptions of what’s happening, what is working well, and what isn’t
  • Clients
  • Staff
  • Collaborators
• Quantitative methods: surveys, administrative databases for the program or related institutions, external monitoring data
  • Provide assessments of who is receiving services, what kinds of services are being delivered, over what time periods, and at what cost (a minimal sketch follows below)
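As a rough illustration of this kind of quantitative tabulation, here is a minimal pandas sketch assuming a hypothetical administrative extract, services.csv, with client, service-type, date, and cost columns; all file and column names are illustrative, not from any real program database.

```python
import pandas as pd

# Hypothetical administrative extract: one row per service delivered.
# Assumed columns: client_id, service_type, service_date, cost
services = pd.read_csv("services.csv", parse_dates=["service_date"])

# Who is receiving which services, and at what cost?
by_service = services.groupby("service_type").agg(
    clients_served=("client_id", "nunique"),
    deliveries=("client_id", "size"),
    total_cost=("cost", "sum"),
)

# Over what time periods? Delivery volume by calendar quarter.
by_quarter = services.groupby(
    services["service_date"].dt.to_period("Q")
).size()

print(by_service)
print(by_quarter)
```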
Jamaica’s PATH Evaluation
• Logic Model
• Targeting Evaluation
• Implementation Evaluation
• Impact Evaluation
• Regression Discontinuity Design
• Results
Jamaica’s PATH Evaluation (p. 11)
Jamaica’s PATH Evaluation: Logic Model
Jamaica’s PATH Evaluation: Targeting Evaluation
• Did benefits go to the target population of those most in need?
• Data sources:
  • Random sample of poor households (pre-PATH)
  • Random sample of PATH-eligible applicants
  • Data from the administrative database
• Most recipients are poor; 25% are in extreme poverty; a few are well-off
• Coverage of the program may be only 20% of the poor (after the conditionality rules), but there is funding for double that
Jamaica’s PATH Evaluation: Implementation Evaluation
• Was PATH implemented as planned?
  • Eligibility process, benefit adequacy, determining compliance with conditions, benefit receipt, satisfaction
• Data sources:
  • Two site visits in 5 randomly selected parishes
  • Focus groups with recipients
  • Interviews with local staff, schools, health clinics, post offices
Jamaica’s PATH Evaluation: Implementation Evaluation
• Recipients and others believe benefits may not always be well targeted
• Recipients find the process of determining eligibility difficult and slow
• Monitoring of compliance with the healthcare and school-attendance conditions is problematic
• Benefits are generally adequate, but are sometimes delivered late
Jamaica’s PATH Evaluation: Impact Evaluation
• What were the effects of PATH?
  • Human capital: health care use, school attendance
• Data sources:
  • Administrative data for participants and the comparison group
  • Surveys: baseline and 1.5 years later (pre- and post-program)
• Compares random samples of participants with applicants who were not eligible, before and after the program (and controls for differences with regression)
Jamaica’s PATH Evaluation: Impact Evaluation
• Participants come from those just under the eligibility threshold
• The control group comes from those just over the threshold
• Regression discontinuity controls for the eligibility score and other characteristics in a multivariate regression (see the sketch below)
Source: http://www.socialresearchmethods.net/kb/quasird.php
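A minimal sketch of a sharp regression discontinuity estimate of this kind in Python (statsmodels), with simulated data standing in for the PATH applicant records; all variable names and numbers here are illustrative, not from the actual evaluation.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical applicant-level data; none of these names or values
# come from the PATH evaluation itself.
rng = np.random.default_rng(0)
n = 2000
score = rng.uniform(0, 100, n)           # eligibility (poverty) score
cutoff = 50                              # program eligibility threshold
treated = (score < cutoff).astype(int)   # eligible if score is below cutoff
# Simulated outcome: attendance varies smoothly with the score,
# plus a jump at the cutoff for participants.
attendance = 70 + 0.1 * score + 2.0 * treated + rng.normal(0, 5, n)

df = pd.DataFrame({
    "attendance": attendance,
    "treated": treated,
    "score_centered": score - cutoff,    # center the score at the cutoff
})

# Sharp RD: regress the outcome on treatment and the centered running
# variable, letting the slope differ on each side of the cutoff.
model = smf.ols(
    "attendance ~ treated + score_centered + treated:score_centered",
    data=df,
).fit()
print(model.summary().tables[1])
```

The coefficient on treated estimates the jump in the outcome at the cutoff, which is the program effect under the RD assumptions.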
Jamaica’s PATH Evaluation: Impact Evaluation
Jamaica’s PATH Evaluation
• Evidence of successful implementation
• Eligibility determination and monitoring of compliance might be improved
• Evidence of small but significant effects on school attendance and health care use (for kids)
Monitoring and Performance Measurement
• Takes process evaluation to real time:
  • What are we doing?
  • How is it being delivered?
  • To whom?
  • In what time frame?
• Success depends on administrative data capacity
• Must be tied to management processes to be sustainable
• Example: GMAP
  http://www.accountability.wa.gov/reports/workfirst/20071017/WorkFirstReport.pdf
Introducing Zinc in Nepal
• What were the key program activities?
• What are the critical causal links necessary for the success of the program?
• How was implementation evaluated?
• [What were the results of the implementation and impact evaluation?]
Introducing Zinc in Nepal: Results
• Increased zinc use from 0.4% to 15.4% in 6 months
• Zinc was usually used correctly
• Greatly increased knowledge of zinc among parents and providers
• Local manufacturers developed low-cost products and successfully distributed them
Introducing Zinc in Nepal: Lessons for Other Programs
• Efforts are most effective when they target both the public and private sectors
• Local manufacturers can produce products if there is market potential (a total market strategy), but profit margins matter
• Mass media was successful in creating demand and building knowledge, including about correct usage
• Training and media together changed provider behaviors