Meta-analysis and benefit-cost analysis
Or, "Show me the money!"

Meta-analysis
The value of an evaluation is always limited by its methodological constraints, and findings will always be subject to uncertainty:
● Was the sample representative?
● Was the control/comparison group truly equivalent?
● Were measurements accurate?
● Are the findings generalizable?
Meta-analysis helps to address these issues by aggregating the findings of other evaluations.

Meta-analysis
● Is critical to identifying "what works" – our collective knowledge of whether specific interventions are successful in achieving desired outcomes
● Statistically combines outcome measurements across rigorous evaluations to generate more valid and reliable predictions of program impacts

Meta-analysis: general approach
1. Identify prior impact evaluations of the specified program
2. Screen the evaluations to identify the subset that used rigorous and valid measures
3. Statistically aggregate the impact measurements of the selected evaluations to produce 'effect size' estimates and standard errors

It's a wonderful world
● This is highly complex and time-consuming work
● Fortunately, several groups are already doing it and publishing results:
  – "What works" clearinghouses
  – The Cochrane and Campbell academic collaboratives

What are clearinghouses?
● Purpose is to identify "what works"
● Review and summarize rigorous evaluations of interventions
● Assign ratings based on evidence (e.g., model, promising, mixed effects)
● Use slightly different methodologies, criteria, and terminology
● Policy area specific:
  – What Works Clearinghouse = Education
  – CrimeSolutions.gov = Criminal Justice

Clearinghouses
● Results are available online and are a significant resource to evaluators
● Give you ready access to prior evaluations so you can borrow from their designs and data collection instruments
● Let you readily compare your results to those from other rigorous evaluations
● Can supplement your formative evaluations with published meta-analysis results to better inform stakeholders about the program

It gets even better
● The Results First Clearinghouse Database contains information from 8 clearinghouses
● Organized by policy area, with over 1,400 interventions rated; provides links to each program's page
● Reconciles the clearinghouses' rating systems with traffic light colors

Results First Clearinghouse Database: reconciled ratings
Broad tier of evidence | Rating definition
Highest rated          | Research with the highest level of rigor shows a statistically significant positive impact (Strong Evidence)
Second-highest rated   | Research with a high level of rigor shows a positive impact (Good Evidence)
No effects             | No evidence of impact
Mixed effects          | Evidence differs on effectiveness: at least one study shows an outcome had a positive effect while another shows the same outcome had a negative effect
Negative effects       | Evidence of a negative impact
(Each tier is also assigned a traffic light rating color.)

Results First Clearinghouse Database: let's try it
http://www.pewtrusts.org/en/research-and-analysis/issue-briefs/2014/09/results-first-clearinghouse-database

Cost-effectiveness and benefit-cost analysis
● Extensions of evaluation that compare the cost of programs to the benefits that they achieve
● Enable evaluators to address two critical questions:
  – Is a program worth funding – does it generate enough benefits to outweigh its costs?
  – Which alternative would generate the most benefits per dollar invested?
● Both types of studies are growing in popularity

Why important?
An evaluation finds that students in a new smoking prevention program are 10% less likely to smoke than those in the comparison group:
● Control group: 30% smoked
● Treatment group: 27% smoked
● Program costs $10,000 per person
Is it worth funding? What is the value of the 10% reduction in smoking, and is it more than the program cost?
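A minimal sketch of the break-even logic for this example, in Python. The 30% and 27% smoking rates and the $10,000 per-person cost come from the slide above; the monetized value of averting one smoker is a hypothetical placeholder you would normally take from the literature.

```python
# Break-even sketch for the smoking prevention example (hypothetical value noted below).

control_rate = 0.30       # smoking rate in the control group (from the slide)
treatment_rate = 0.27     # smoking rate in the treatment group (from the slide)
cost_per_person = 10_000  # program cost per participant (from the slide)

# A 3 percentage-point drop means about 3 smokers averted per 100 participants served.
smokers_averted_per_person = control_rate - treatment_rate  # ~0.03

# Cost per smoker averted -- the cost-effectiveness view of the question.
cost_per_smoker_averted = cost_per_person / smokers_averted_per_person
print(f"Cost per smoker averted: ${cost_per_smoker_averted:,.0f}")  # ~$333,333

# Benefit-cost view: the program pays off only if the monetized value of averting
# one smoker (health care savings, productivity, etc.) exceeds that figure.
value_per_smoker_averted = 200_000  # HYPOTHETICAL monetized value, for illustration only
roi = (value_per_smoker_averted * smokers_averted_per_person) / cost_per_person
print(f"Benefits per $1 of cost: ${roi:.2f}")  # $0.60 at this assumed value -> not worth funding
```

This is exactly the gap the rest of the section addresses: cost-effectiveness analysis stops at "dollars per smoker averted," while benefit-cost analysis monetizes the outcome and computes the return.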
Cost-effectiveness analysis
● Compares the costs of programs to the units of effectiveness they achieve (outputs/outcomes):
  – Dollars per life saved (seat belts)
  – Dollars per polio case prevented (vaccines)

Benefit-cost analysis
● Goes further: puts a dollar value on the outcomes that a program achieves and compares it to program costs
● Reported as a return-on-investment ratio – the value of outcomes achieved for every dollar spent on the program
● Example: a 3:1 ROI for drug courts means $3 in benefits achieved for each $1 invested in the program

General approach
1. Decide the scope of the analysis – what costs and benefits are important?
2. Identify and measure costs and benefits
3. Project costs and benefits over time
4. Monetize costs and benefits
5. Discount dollars to present value
6. Compute CBA/CEA ratios
7. Conduct sensitivity analyses

Cost-effectiveness or benefit-cost?
● Cost-effectiveness is best when it is difficult to put a dollar value on program outcomes and when a single outcome is of primary importance
● Benefit-cost is best when you can monetize outcomes and you care about multiple elements of program impacts

Decide the scope of the analysis
● You can't measure everything
● Decide who and what has "standing" and is important enough to measure
● Example: juvenile justice
  – Include costs: direct program costs
  – Include benefits: reduced recidivism (avoided state and social costs); improved high school graduation rate for participants (income)
  – Exclude: indirect costs of crime on economic activity; intergenerational impacts

Measure costs and impacts
● How: your friend the impact evaluation – or, better yet, the findings of LOTS of impact evaluations (meta-analysis)
● Must decide how far to track costs; generally focus on major cost drivers
● Measure total, average, and marginal costs (the cost to serve one additional client)
● Measure the outcomes that have "standing," both tangible and intangible ones where possible

Project costs over time
● Many costs and benefits occur over a period of time:
  – Increased income of high school graduates
  – Costs of treating diabetes over a lifetime
● Develop projections for when costs and benefits occur (from long-term impact evaluations)
● Importance: critical to deciding which programs will achieve a positive ROI within specified time periods

Monetize costs and benefits
● Must determine what costs to include
● Must determine what outcomes are worth in monetary terms
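To make steps 3 and 4 of the general approach concrete, here is a rough sketch that turns a hypothetical dropout-prevention impact into a ten-year earnings benefit stream alongside its cost stream. Every number below is an illustrative placeholder, not an estimate from any evaluation; discounting and ratio computation come later in the section.

```python
# Sketch of projecting (step 3) and monetizing (step 4) a benefit stream.
# All inputs are ILLUSTRATIVE placeholders.

YEARS = 10                       # projection horizon
extra_graduates_per_100 = 8      # hypothetical impact: extra graduates per 100 participants
earnings_gain_per_grad = 2_500   # hypothetical extra annual earnings per graduate
program_cost_per_person = 1_200  # hypothetical direct program cost per participant

# Project when benefits occur: a flat annual earnings gain, per 100 participants served.
benefit_stream = [extra_graduates_per_100 * earnings_gain_per_grad for _ in range(YEARS)]

# Monetize costs: all program costs assumed to fall in the first year.
cost_stream = [program_cost_per_person * 100] + [0] * (YEARS - 1)

print("Year-by-year (per 100 participants):")
for year, (b, c) in enumerate(zip(benefit_stream, cost_stream), start=1):
    print(f"  Year {year:2d}: benefits ${b:>8,.0f}   costs ${c:>8,.0f}")
```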
How can you assess costs?
● Fortunately, there are reliable estimates for many things that are difficult to measure directly:
  – Cost of crime victimization
  – Value of a college degree
● Other values can be estimated through willingness-to-pay studies:
  – Value of clean parks; wilderness protection
● Many values have already been estimated by economists (your friend, Google)

Discount to present value
● Costs and benefits that occur in the future have a lower economic value than those that occur immediately – that's why we charge interest on loans
● Must discount the value of future costs and benefits to 'present value' by applying a discount rate (the time value of money)

Compute CER/ROI
● Cost-effectiveness ratio: present value of costs / units of effectiveness
  – Expressed as, e.g., "dollars per dropout prevented"
● Cost-benefit ratio: net present value of benefits / net present value of costs
  – Can be expressed as net benefits per participant ("Taxpayers and society receive $15,481 for each person who completes the dropout prevention program") OR as a return-on-investment ratio ("$5.87 in benefits for each $1 invested")

Sensitivity analysis
● How risky is the investment in the program?
● All results rest on projections and assumptions that are subject to uncertainty (error)
● Sensitivity analysis determines how much the results would change if the assumptions were varied
● Typically done with Monte Carlo analysis, which runs the analysis many times under varying assumptions

Things to keep in mind
● BCA and CEA estimate the economic value of investments in programs
● BUT economic efficiency isn't the only thing we care about: equity, civil rights, and due process are also important but are not measured by BCA/CEA
● Should recognize this caveat when reporting results

Fortunately, we have the technology…
● Sophisticated benefit-cost models have been developed to examine programs in many social policy areas
● Based on program impact estimates from meta-analyses
● Readily customizable to specific jurisdictions by specifying local program costs and population cohorts
● Have monetization and sensitivity analysis built in

Example: Meta-analysis of Functional Family Therapy
[Chart: recidivism rate (%) over 15 follow-up years, with the program vs. without the program (actual baseline); recidivism rates reduced by 16%.]
Source: Based on Washington data

CBA of Functional Family Therapy
Outcomes from participation      | Benefits | Main source of benefits
Reduced crime                    | $20,740  | Lower state & victim costs
Increased high school graduation | $8,220   | Increased earnings
Reduced health care costs        | $66      | Lower public costs
Total benefits                   | $29,026  |
Cost                             | $3,406   |
Net present value                | $25,620  |
Benefits per dollar of cost      | $8.52    |
Source: Based on Washington data

The Killer App
● BCA and CEA become incredibly useful when you can provide comparative assessments of alternative ways to achieve a goal
● Such as ranking criminal justice program alternatives by their return on investment (think of a Consumer Reports ranking)

Comparison of CJ Programs
Program                                                | Costs  | Benefits | Benefit-to-cost ratio
Correctional education                                 | $1,180 | $21,720  | $18.40
Vocational education                                   | $1,645 | $19,594  | $11.91
Correctional industries                                | $1,485 | $6,818   | $4.59
Drug courts                                            | $4,951 | $15,361  | $3.10
Intensive supervision (surveillance only)              | $4,305 | -$1,139  | -$0.26
Juvenile justice programs:
Aggression Replacement Training (within institutions)  | $1,575 | $16,827  | $10.68
Functional Family Therapy (probation)                  | $3,406 | $29,026  | $8.52
Drug courts                                            | $3,275 | $8,110   | $2.48
Multidimensional Treatment Foster Care                 | $8,232 | $20,065  | $2.44
Scared Straight                                        | $67    | -$12,319 | -$183.87
Source: Based on Washington data
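Using the per-participant cost and benefit figures from the table above, a few lines of Python can recompute the ratios and produce the Consumer Reports-style ranking. Only a handful of the programs are included to keep the sketch short; the figures are taken directly from the table (based on Washington data).

```python
# Rank programs by benefit-to-cost ratio, using (cost, benefits) per participant
# from the comparison table above.

programs = {
    "Correctional education":                (1_180, 21_720),
    "Vocational education":                  (1_645, 19_594),
    "Aggression Replacement Training":       (1_575, 16_827),
    "Functional Family Therapy (probation)": (3_406, 29_026),
    "Scared Straight":                       (67, -12_319),
}

# Sort by benefits per dollar of cost, highest first.
ranked = sorted(programs.items(), key=lambda kv: kv[1][1] / kv[1][0], reverse=True)

for name, (cost, benefits) in ranked:
    ratio = benefits / cost
    print(f"{name:<40} ${ratio:>8.2f} in benefits per $1 of cost")
```

A negative ratio, as with Scared Straight, means the program's measured impacts cost more than they return.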
Example: Results First
● A benefit-cost model that can assess a wide range of programs across criminal justice, child welfare, K-12 education, prevention, and health care
● Based on meta-analyses of rigorous evaluations
● Based on the work of the Washington State Institute for Public Policy
● Now used by 22 states and 7 local governments

[The remaining slides show example Results First model output, based on Washington data: overall results, cash flow analysis, benefits by perspective, taxpayer benefits by budget area, and taxpayer benefits by governmental level.]
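To give a feel for what sits behind output like the cash flow analysis above, here is a minimal, generic sketch of discounting a benefit stream to present value and adding a crude Monte Carlo sensitivity check. This is not the Results First or WSIPP model; the cost, benefit stream, discount rate, and uncertainty range are all hypothetical placeholders.

```python
import random

# Generic discounted cash-flow sketch (NOT the Results First/WSIPP model).
# All inputs below are HYPOTHETICAL placeholders.

discount_rate = 0.03         # assumed real discount rate
cost = 5_000                 # up-front per-participant cost (hypothetical)
benefit_stream = [600] * 10  # hypothetical annual benefits over 10 years

def present_value(stream, rate):
    """Discount a year-by-year stream back to year 0."""
    return sum(amount / (1 + rate) ** year for year, amount in enumerate(stream, start=1))

pv_benefits = present_value(benefit_stream, discount_rate)
print(f"Net present value: ${pv_benefits - cost:,.0f}")
print(f"Benefits per $1 of cost: ${pv_benefits / cost:.2f}")

# Crude Monte Carlo sensitivity: jitter the annual benefit by +/-30% and see how
# often the program still breaks even (real models vary many inputs at once).
random.seed(0)
trials = 10_000
wins = sum(
    present_value([b * random.uniform(0.7, 1.3) for b in benefit_stream], discount_rate) > cost
    for _ in range(trials)
)
print(f"Chance benefits exceed costs: {wins / trials:.0%}")
```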