
Programme Evaluation for Policy Analysis
Mike Brewer, 19 October 2011
www.pepa.ac.uk
PEPA is based at the IFS and CEMMAP
© Institute for Fiscal Studies
Programme Evaluation for Policy Analysis: overview
• Part of the ESRC-funded National Centre for Research Methods
• PEPA is about ways to do, and ways to get the most out of, “programme evaluation”: estimating the causal impact of government policies (although the methods can often generalise)
– Estimating the counterfactual
– Characterising the uncertainty
– Generalising and synthesising
• Beneficiaries
– those who do programme evaluation
– those who commission or design evaluations, or make decisions based on the results of evaluations
© Institute for Fiscal Studies
1. Step change in conduct of programme evaluation
• Training courses, workshops, on-line resources
• Research
– doing inference more accurately
– social networks and policy interventions
– estimating bounds on the true impact
• Substantive research projects as exemplars
© Institute for Fiscal Studies
2. Maximise the value of programme evaluation
• Research
– Combining behavioural models with the results of evaluations
– Comparing RCTs with non-experimental approaches
– Synthesising results
• Training courses, especially for
– Those who commission evaluations and interpret the results of evaluations (link with Cross-Government Evaluation Group)
© Institute for Fiscal Studies
PEPA: who we are
• Professor Richard Blundell, UCL & IFS
• Professor Mike Brewer, University of Essex & IFS (Director)
• Professor Andrew Chesher, UCL & IFS
• Dr Monica Costa Dias, IFS (Deputy Director)
• Dr Thomas Crossley, Cambridge & IFS
• Professor Lorraine Dearden, Institute of Education & IFS
• Dr Hamish Low, Cambridge & IFS
• Professor Imran Rasul, UCL & IFS
• Dr Barbara Sianesi, IFS
• Department for Work and Pensions is a partner
www.pepa.ac.uk
© Institute for Fiscal Studies
Thoughts on evidence provision
• Pilots often not designed with a focus on answering “what works?”
• Funding “what works?” research seen as government’s responsibility
• Data
• Limitations of “what works” evidence
© Institute for Fiscal Studies
Spare slides on PEPA
© Institute for Fiscal Studies
PEPA: overview
Director: Mike Brewer
0. Core programme evaluation skills
1. Are RCTs worth it? – Barbara Sianesi, Jeremy Lise
2. Inference – Thomas Crossley, Mike Brewer, Marcos Hernandez, John Ham
3. Control functions and evidence synthesis – Richard Blundell, Adam Rosen, Monica Costa Dias, Andrew Chesher
4. Structural dynamic models – Hamish Low, Monica Costa Dias, Costas Meghir
5. Social networks – Imran Rasul, Marcos Hernandez
© Institute for Fiscal Studies
PEPA: research questions
1. Are RCTs worth it?
– Can non-experimental methods replicate the results of RCTs?
– How can we combine results from RCTs with models of labour market behaviour?
– How do GE effects alter the estimated impact of training programmes?
2. Inference
– Correct inference and power calculations where data have a multilevel structure and serially-correlated shocks? (see the sketch below)
– Correct inference when policy impacts are complex functions of estimated parameters?
– Impact of time-limited in-work benefits on job retention?
3. Control functions and evidence synthesis
– Can we weaken the control function approach to estimate bounds?
– Link between control functions and structural or behavioural models?
– How are lessons from multiple evaluations best synthesised?
4. Structural dynamic models
– How best to use ex post evaluations in ex ante analysis?
– How are education decisions affected by welfare-to-work programmes?
– How do life-cycle time limits on welfare receipt affect behaviour?
5. Social networks
– How best to collect data on social networks?
– How is the impact of policy affected by the social networks within and between treated and control groups?
– Can social networks explain heterogeneity in the impact of a health intervention?
© Institute for Fiscal Studies
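Note: strand 2’s first question above concerns power calculations when evaluation data have a multilevel (clustered) structure. Purely as an illustration of why clustering matters, and not as a preview of PEPA’s own manual or software tools, the sketch below applies the standard design-effect adjustment to a textbook two-arm power calculation; every parameter value is a hypothetical assumption.

```python
# Minimal sketch (not PEPA's tools): sample size for a cluster-randomised
# evaluation, using the standard design-effect adjustment for
# intra-cluster correlation. All numbers are hypothetical.
from scipy.stats import norm

alpha = 0.05        # two-sided significance level
power = 0.80        # target power
effect = 0.20       # minimum detectable effect, in outcome units
sd = 1.0            # outcome standard deviation
m = 25              # individuals per cluster (e.g. per office)
icc = 0.05          # intra-cluster correlation of the outcome

z_alpha = norm.ppf(1 - alpha / 2)
z_beta = norm.ppf(power)

# Sample size per arm if individuals were randomised independently
n_individual = 2 * (z_alpha + z_beta) ** 2 * sd ** 2 / effect ** 2

# Design effect: clustering inflates the required sample
deff = 1 + (m - 1) * icc
n_clustered = n_individual * deff

print(f"per-arm n, individual randomisation: {n_individual:.0f}")
print(f"design effect: {deff:.2f}")
print(f"per-arm n, cluster randomisation:    {n_clustered:.0f}")
print(f"clusters per arm: {n_clustered / m:.0f}")
```

Serially correlated shocks across repeated observations complicate the calculation further, which is part of what the inference strand sets out to address.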
PEPA: training and capacity building
0. Core programme evaluation skills
– Core course in evaluation methods
– Courses for designers and users of evaluations
– How-to guide for PS matching (see the sketch below)
1. Are RCTs worth it?
– Course on estimating “search models”
– Workshop on using “search models”
– Workshop on value of RCTs
2. Inference
– Course, manual and software tools on power calculations
– Courses, survey, manual & software for better inference
– Workshop and courses on using survivor models for policy evaluation
3. Control functions and evidence synthesis
– Course and workshop on control functions
– Course on bounds in policy evaluation
4. Structural dynamic models
– Courses and resources for building dynamic behavioural models
– Workshop on dynamic behavioural models and policy evaluation
5. Social networks
– Survey and courses on using data on social networks
– Survey and courses on collecting data on social networks
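Note: one strand-0 deliverable above is a how-to guide for PS (propensity score) matching. As a rough illustration of what that technique involves, and not a preview of the guide itself, here is a minimal one-to-one nearest-neighbour matching sketch on simulated data; the data-generating process, covariates and choice of Python libraries are assumptions made purely for the example.

```python
# Minimal illustrative sketch of propensity-score (PS) matching on
# simulated data: estimate the score, match each treated unit to the
# nearest control on that score, and average the outcome differences.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)
n = 2000
x = rng.normal(size=(n, 2))                      # observed covariates
p_treat = 1 / (1 + np.exp(-(0.5 * x[:, 0] - 0.25 * x[:, 1])))
d = rng.binomial(1, p_treat)                     # treatment indicator
y = 1.0 * d + x[:, 0] + rng.normal(size=n)       # true effect = 1.0

# 1. Estimate the propensity score P(D=1 | X)
ps = LogisticRegression().fit(x, d).predict_proba(x)[:, 1]

# 2. For each treated unit, find the nearest control on the score
treated = ps[d == 1].reshape(-1, 1)
controls = ps[d == 0].reshape(-1, 1)
nn = NearestNeighbors(n_neighbors=1).fit(controls)
_, idx = nn.kneighbors(treated)

# 3. Average treated-minus-matched-control outcome differences (ATT)
att = np.mean(y[d == 1] - y[d == 0][idx.ravel()])
print(f"matched estimate of the ATT: {att:.2f}  (true effect 1.0)")
```

Matching estimators of this kind are also the natural non-experimental comparator in strand 1’s question of whether non-experimental methods can replicate the results of RCTs.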
© Institute for Fiscal Studies