Managing Evaluations for Consistently High Quality
American Evaluation Association Annual Conference - 2013
Molly Hageboeck
USAID M&E Projects Overseas Managed by MSI
Colombia
Uganda
Kenya
South Sudan
Ethiopia
Pakistan
Afghanistan
USAID M&E Projects Overseas Managed by MSI: Keys to Success
• Evaluations are projects – they can be managed
• Identify key intervention points – quality checkpoints
• Create tools for exerting quality control at the checkpoints
• Share the tools with clients and evaluation teams – a field handbook, and a new website MSI built for USAID's E3 Bureau to improve M&E that includes evaluation management tools
MSI Evaluation Management Checkpoints for USAID's Process
The Evaluation Management Process

Stage 1 – Decision to Evaluate to Issuance of SOW
• Decision to evaluate
• Evaluation Manager assigned
• Evaluation parameters defined (type, timing)
• Development partner input (as appropriate)
• Evaluation design/plan developed (USAID/initial version)
• Evaluation dissemination/utilization plan developed by USAID (initial version – includes list of what the evaluation team needs to provide to USAID)
• Design/plan reviewed/approved (USAID/initial version)
• SOW drafted
• SOW reviewed and approved (Quality Control checkpoint)
• Solicitation issued (if external evaluators are to be involved)

Stage 2 – Proposal Review to Approval for Data Collection to Begin
• Proposals reviewed/team selected
• Team inception report on performance monitoring findings (if required by SOW) (Quality Control checkpoint)
• Team planning meeting (TPM)
• Initial meetings with development partners
• Detailed evaluation design/plan developed/refined by team
• Evaluation design/plan (or modifications) approved (Quality Control checkpoint)
• Evaluation registered with USAID/Washington

Stage 3 – Support During Data Collection and Analysis
• Weekly status review with team against field work plan and schedule
• Troubleshooting as needed to assist the evaluation team in the field

Stage 4 – Initial Evaluation Results Briefing to Final Report
• Initial briefing (on completeness) of evaluation findings, conclusions and recommendations (Quality Control checkpoint)
• Approval to proceed to drafting the report (if approval is required by SOW)
• Submission of draft report
• Oral briefing on draft report (if required by SOW)
• Review of draft report – feedback to team (Quality Control checkpoint)
• Evaluation dissemination/utilization plan updated/expanded by USAID (final version)
• Submission of final report
• Review/acceptance of final report and other deliverables

Stage 5 – Dissemination of Final Report to Assessment of Evaluation Influence
• Dissemination of evaluation report and executive summary (per dissemination/utilization plan)
• Formal evaluation review meeting
• Evaluation review minutes disseminated
• Follow-up on implementation status of accepted recommendations (per dissemination/utilization plan)
• Follow-up on impact of evaluation (per utilization plan)
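For readers who think of such workflows in code, the gating logic behind these checkpoints can be sketched in a few lines. The Python below is a minimal, hypothetical illustration – the stage and checkpoint labels paraphrase the list above, and none of it represents an actual MSI or USAID system.

```python
# Hypothetical sketch (not an MSI or USAID tool): the five stages and
# their formal quality-control checkpoints as data, with a gate that
# blocks progression until every checkpoint in a stage is approved.

STAGE_CHECKPOINTS = {
    1: ["SOW reviewed and approved"],
    2: ["Inception report accepted", "Evaluation design/plan approved"],
    3: [],  # support stage: weekly reviews, no formal approval gate
    4: ["Initial results briefing passed", "Draft report review passed"],
    5: [],  # dissemination and follow-up
}

def may_advance(stage: int, approved: set) -> bool:
    """True only when all of the stage's checkpoints have been approved."""
    return all(cp in approved for cp in STAGE_CHECKPOINTS[stage])

approved = {"SOW reviewed and approved"}
print(may_advance(1, approved))  # True -> proceed to proposal review
print(may_advance(2, approved))  # False -> data collection stays blocked
```

The point of the sketch is the management claim on the "Keys to Success" slide: evaluations are projects, and a checkpoint either has a recorded approval or the work does not move forward.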
Quality Checkpoint 1: Evaluation Statement of Work (SOW)
Common Problems
• Management purpose is not clear/transparent
• Evaluation questions – too many, not matched to the purpose, not feasible
• There isn’t always an opportunity to comment on or negotiate the SOW
Solution: Help Your Clients Improve the SOWs They Prepare
MSI Checklist for Developing/Reviewing Evaluation SOWs
Built it in about 2000. Gave it to USAID in 2010.
Quality Checkpoint 2: Written Review of Existing Information Before Final Design
Common Problems
• Late receipt of project reports/performance data
• Team reviews often cursory – important data not extracted and shared
Solutions:
• Ask for reports when the SOW is issued.
• Develop/require a structured desk review product within a short time frame (see the sketch below)
MSI Desk Review Template – First Deliverable from Teams – Before Final Design
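As a concrete picture of what "a structured desk review product within a short time frame" could mean, here is a minimal, hypothetical Python sketch. The section names and the 10-day window are assumptions made for illustration, not MSI's actual template.

```python
from datetime import date, timedelta

# Hypothetical sketch: the desk review as the team's first deliverable,
# with required sections and a due date shortly after the SOW is issued.
# Section names and the 10-day window are invented for illustration.

REQUIRED_SECTIONS = [
    "project background",
    "performance data summary",
    "prior report findings worth sharing",
    "data gaps to fill during field work",
]

def deliverable_status(sections_done, sow_issued, today):
    """Report which sections are missing and whether the window has closed."""
    missing = [s for s in REQUIRED_SECTIONS if s not in sections_done]
    overdue = today > sow_issued + timedelta(days=10)
    return missing, overdue

missing, overdue = deliverable_status(
    {"project background"}, date(2013, 3, 1), date(2013, 3, 8))
print(missing)   # three sections still to extract from existing reports
print(overdue)   # False -- still inside the short time frame
```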
Quality Checkpoint 3: Final Evaluation Design/Plan Prior to Field Work
Common Problems
• The field team did not prepare the proposal-stage design – and may not follow it
• Teams too often start field work without a final design, a data collection and analysis plan (including a sampling plan), and all necessary instruments
Solution:
• Require a detailed evaluation design and formal review/approval, on a question-by-question basis, from the actual team – including all instruments – before they get the keys to the jeep.
• Provide teams with a structured format to get started
Example 1: "Getting to Answers" Matrix

Columns:
• Evaluation Questions
• Type of Answer Needed (Descriptive, Comparative (normative), or Cause-and-Effect)
• Data Collection Method(s)
• Data Source(s)
• Sampling or Selection Criteria
• Data Analysis Method(s)

Each numbered evaluation question (1, 2, …) gets its own row: the team marks the type of answer needed and fills in the remaining columns to show how that question will be answered.
MSI "Getting to Answers" Matrix
Built it in about 2005. Gave it to USAID in 2010.
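For illustration, the matrix translates naturally into a per-question record with a completeness check – the discipline Quality Checkpoint 3 enforces. This Python sketch is hypothetical: the column keys mirror the matrix above, but the example values and the check itself are assumptions.

```python
# Hypothetical sketch of one "Getting to Answers" row as a record, with
# the completeness check implied by Quality Checkpoint 3: no field work
# until every cell for every question is filled in. Values are invented.

COLUMNS = [
    "evaluation_question",
    "type_of_answer_needed",   # Descriptive, Comparative (normative), or Cause-and-Effect
    "data_collection_methods",
    "data_sources",
    "sampling_or_selection_criteria",
    "data_analysis_methods",
]

row = {
    "evaluation_question": "1. To what extent did training improve service quality?",
    "type_of_answer_needed": "Cause-and-Effect",
    "data_collection_methods": ["facility survey", "key informant interviews"],
    "data_sources": ["clinic staff", "district performance records"],
    "sampling_or_selection_criteria": "random sample of 30 facilities",
    "data_analysis_methods": ["pre/post comparison across facilities"],
}

def empty_cells(row):
    """Columns still blank for this question -- each blocks design approval."""
    return [c for c in COLUMNS if not row.get(c)]

print(empty_cells(row))  # [] -> this question is ready for approval
```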
Quality Checkpoint 4: Post-Field Work and Analysis, Pre-Draft Briefing
Common Problems
• Teams start writing before they work out a clear flow of findings, conclusions and recommendations grounded in their evaluation evidence.
• Many reports are not well supported by evidence.
• Many mix up findings, conclusions and recommendations – and confuse readers.
Solution:
• Require an oral briefing, in bullets, to ensure all questions have been addressed and the findings-conclusions-recommendations (F-C-R) flow is logical
• Block remaining LOE until this step is passed – the team may need to get more data before it writes.
Quality Checkpoint 5: Structured, Quality-Focused Review of Draft Report
Common Problems
• Clients tend to review draft evaluation reports on substantive grounds, often skipping over structural and professional quality aspects.
• Quality fine points may not get attention until the final stage – when all LOE has been spent.
• Or they remain missed until a meta-evaluation finds the flaws.
Solution:
• Evaluation quality review checklist – shared with teams the day they start, and with all members of draft report review teams
• Checklist-based feedback to the team – and repeat use of the checklist with the final report to verify that improvements have been made
MSI Checklist for Reviewing Evaluation Reports
Built it in about 2000. Gave it to USAID in 2010.
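The repeat-use idea – the same checklist applied to the draft and again to the final report – is easy to picture in code. The Python below is a minimal, hypothetical sketch; the checklist items paraphrase the problems named above and are not MSI's actual checklist.

```python
# Hypothetical sketch of checklist-based review: apply the same items to
# the draft and again to the final report, so reviewers can verify that
# flagged problems were actually fixed. Items are illustrative only.

CHECKLIST = [
    "every evaluation question is answered",
    "findings are supported by evidence",
    "findings, conclusions and recommendations are clearly distinguished",
    "report meets structural/professional quality standards",
]

def failed_items(passes):
    """Checklist items the report fails; these go back to the team."""
    return [item for item in CHECKLIST if not passes.get(item, False)]

draft_review = {CHECKLIST[0]: True, CHECKLIST[1]: False,
                CHECKLIST[2]: False, CHECKLIST[3]: True}
final_review = {item: True for item in CHECKLIST}

print(failed_items(draft_review))  # two items fed back with the draft
print(failed_items(final_review))  # [] -> improvements verified
```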
Current “News” on MSI’s Evaluation Management System
• Update of MSI Handbook for Field Teams is underway
• A recent meta-evaluation for USAID of 2009-2012 evaluations found problems that greater internal use of an evaluation management system and associated tools would have caught – and a recommendation to strengthen internal evaluation management practices in USAID has been provided.