TRANSCRIPT
Use of impact evaluation for operational research
PREM Week 2007
Arianna Legovini, Africa Impact Evaluation Initiative (AFTRL)
Paradigms
Project is a set of activities defined at time zero, designed to deliver expected results. Short of major upsets, we will try to stick to the script. The project will either deliver or not.
Project is a menu of options, some better than others, with a strategy to find out which are best. Activities might change over time. The project will deliver more by the end than it did at the beginning.
Evaluation
Old: retrospective
Look back and judge
New: prospective
Decide what we need to learn
Experiment with alternatives
Measure and inform
Adopt the best alternative
IE as OR can
Measure the effectiveness of alternatives (modes of delivery, packages, pricing schemes)
Provide rigorous evidence to modify project features over time (managing by results)
Inform future project designs
Prospective evaluation: Key steps
1. Identify policy questions
2. Design evaluation
3. Prepare data collection instruments
4. Collect baseline
5. Implement program in treatment areas
6. Collect follow-up
7. Analyze and feed back
8. Improve program and implement new
“Infrastructure of evaluation”
Not a one-shot study
An institutional framework linking data and analysis to the policy cycle
+ An analytical framework and data system for a sequence of learning
Institutional framework
Evaluation team:
M&E staff
IE specialists
Local researchers
Statisticians
Data collectors
Policy-makers
Program managers
[Diagram: learning cycle — program change → data → analysis → feedback → program change]
Start building team capacity
Provide training opportunities for the team and operational staff
Discuss and question policies, developmental hypotheses, and causal linkages; investigate alternatives
Develop evaluation questions
Develop evaluation design
Provide some time for internal discussion and agreement
Operational questions
Question the design choices of the operation
Ask whether equally likely alternatives should be considered
Think of which choices were made on hunches rather than solid evidence
Identify program features that are being changed and question the underlying rationale
Decision tree
[Figure: decision tree over time — branches compare a 20% subsidy vs. a 40% subsidy, and a $1 CFL vs. a $1.5 CFL]
Sequential learning
Use random trials to test alternative delivery mechanisms or packages
Focus on short-term outcomes
Develop the causal chain
Identify outcomes that change in the short term (e.g. take-up rates, use, adoption) and are likely to lead to higher-order outcomes
Time follow-up data collection 6, 12, and 18 months after exposure
Measure impact on short-term outcomes in the alternative treatments
Identify the best
Change the program to adopt the best alternative
Start with a new set of operational questions and trials
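The trial logic above can be sketched in a few lines: randomly assign units to alternative delivery mechanisms, measure a short-term outcome such as take-up, and adopt the arm that performs best. This is an illustrative simulation only — the arm names and "true" take-up rates below are hypothetical assumptions, not data from the presentation.

```python
import random

random.seed(0)

# Hypothetical trial: households are randomly assigned to one of three
# delivery mechanisms; the short-term outcome is take-up (yes/no).
ARMS = ["voucher", "door-to-door", "community meeting"]
# Illustrative "true" take-up probabilities (unknown in a real trial).
TRUE_TAKEUP = {"voucher": 0.35, "door-to-door": 0.50, "community meeting": 0.42}

def run_trial(n_per_arm=500):
    """Simulate one trial round; return the observed take-up rate per arm."""
    rates = {}
    for arm in ARMS:
        outcomes = [random.random() < TRUE_TAKEUP[arm] for _ in range(n_per_arm)]
        rates[arm] = sum(outcomes) / n_per_arm
    return rates

rates = run_trial()
best = max(rates, key=rates.get)
for arm in ARMS:
    print(f"{arm:18s} take-up = {rates[arm]:.1%}")
print("Adopt best alternative:", best)
```

In practice the adopted arm would then become the default, and a new round of operational questions and trials would start from that baseline — the "sequential" part of sequential learning.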
Example
Ethiopia's PRSP set a target for household electricity coverage (50%)
The electric company set out a ten-year strategy to connect rural towns
No subsidy for the last-mile connection
Will they achieve the target?
Experiment with alternative subsidy values (high, medium, low) to lower connection barriers
Measure the connection rate at each level of subsidy, including the one needed to achieve 50% household coverage
Results in 6-12 months
Adopt a subsidy policy for the program consistent with the target, OR change the target
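The Ethiopia experiment amounts to estimating a connection rate (with its uncertainty) at each subsidy level and checking which, if any, clears the 50% coverage target. A minimal sketch, assuming hypothetical "true" connection rates for the three arms and a standard normal-approximation confidence interval for a proportion:

```python
import math
import random

random.seed(1)

# Hypothetical experiment: towns are randomized to low/medium/high
# connection subsidies; the outcome is the household connection rate.
# The "true" rates below are illustrative assumptions, not real results.
TRUE_RATE = {"low": 0.30, "medium": 0.45, "high": 0.60}
TARGET = 0.50  # PRSP household coverage target

def evaluate(n_households=800):
    """Return per-arm (observed rate, 95% CI lower, 95% CI upper)."""
    results = {}
    for arm, p in TRUE_RATE.items():
        connected = sum(random.random() < p for _ in range(n_households))
        rate = connected / n_households
        # Normal-approximation 95% confidence interval for a proportion
        half = 1.96 * math.sqrt(rate * (1 - rate) / n_households)
        results[arm] = (rate, rate - half, rate + half)
    return results

for arm, (rate, lo, hi) in evaluate().items():
    verdict = "meets" if lo >= TARGET else "does not clearly meet"
    print(f"{arm:6s} subsidy: {rate:.1%} [{lo:.1%}, {hi:.1%}] -> {verdict} the 50% target")
```

If no arm's interval clears the target, the evidence points the other way: change the target, exactly as the slide concludes.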