ESCAP M&E SYSTEM

Monitoring and Evaluation System Overview and Evaluation Guidelines
ESCAP is the regional development arm of the United Nations and serves as the main economic and social development centre for the United Nations in Asia and the Pacific. Its mandate is to foster cooperation between its 53 members and 9 associate members. ESCAP provides the strategic link between global and country-level programmes and issues. It supports Governments of countries in the region in consolidating regional positions and advocates regional approaches to meeting the region's unique socio-economic challenges in a globalizing world. The ESCAP office is located in Bangkok, Thailand. Please visit the ESCAP website at www.unescap.org for further information.
The shaded areas of the map indicate ESCAP members and associate members.
ESCAP Monitoring and Evaluation System
CONTENTS
Page
ACRONYMS ............................................................................................................................................ iii
GLOSSARY ............................................................................................................................................. iv
INTRODUCTION .................................................................................................................................. 1
1. MONITORING AND EVALUATION SYSTEM
1.1 INTRODUCTION ...................................................................................................................... 3
    Purpose and Definition ................................................................................................................ 3
    M&E Responsibilities ................................................................................................................... 3
    Comparing Monitoring, Evaluation and Review ....................................................................... 4
1.2 M&E IN THE CONTEXT OF RESULTS-BASED MANAGEMENT ................................. 6
    Introduction to Results-Based Management ............................................................................... 6
    Results Framework ....................................................................................................................... 6
2. MONITORING FRAMEWORK
2.1 INTRODUCTION ....................................................................................................................... 9
    Definition ...................................................................................................................................... 9
    Use of Monitoring ........................................................................................................................ 9
    Roles and Responsibilities ............................................................................................................ 10
    Support Systems ........................................................................................................................... 10
2.2 PROGRAMME MONITORING................................................................................................ 12
    Annual Work Plan ....................................................................................................................... 14
    IMDIS: Outputs, Work Months, Accomplishment Accounts ................................................... 14
    IMDIS: Programme Performance Report ................................................................................... 16
    The Executive Secretary's Compact ............................................................................................ 17
2.3 PROJECT MONITORING ......................................................................................................... 17
    Project Document ......................................................................................................................... 18
    Project Progress Reports .............................................................................................................. 19
    Project Terminal Report ............................................................................................................. 20
2.4 FINANCIAL MONITORING .................................................................................................. 20
3. EVALUATION FRAMEWORK
3.1 INTRODUCTION ....................................................................................................................... 23
    Definition ...................................................................................................................................... 23
    Norms and Criteria ...................................................................................................................... 23
3.2 TYPES OF EVALUATIVE PROCESSES ................................................................................. 25
    External Evaluations .................................................................................................................... 26
    Internal Evaluations ..................................................................................................................... 27
CONTENTS (continued)
Page
    Purpose of Evaluation and Review ............................................................................................. 28
    Evaluation Process ....................................................................................................................... 28
3.3 ORGANIZATIONAL ROLES AND RESPONSIBILITIES ................................................... 29
3.4 PLANNING EVALUATIONS .................................................................................................. 30
    Evaluation Planning and Budgeting at the Organizational Level ........................................... 30
3.5 USING EVALUATION FINDINGS ........................................................................................ 32
    Preparation of Management Response and Actions ................................................................... 32
    Sharing of Evaluation Findings .................................................................................................. 33
    Follow-up and Promotion of Learning ........................................................................................ 34
ANNEXES
Annex I. List of Key Reference Materials ................................................................................ 37
Annex II. Subprogramme and Supporting Organizational Structure for the Biennium 2010-2011 ....................................................................................... 39
Annex III. Monitoring Fact Sheets and Tools ............................................................................ 41
Annex IV. Contents of the Evaluation Guidelines .................................................................... 43
ACRONYMS
AA Accomplishment account
ACABQ Advisory Committee on Administrative and Budgetary Questions
UNAPCAEM United Nations Asian and Pacific Centre for Agricultural Engineering and Machinery
APCTT Asian and Pacific Centre for Transfer of Technology
APCICT Asian and Pacific Training Centre for Information and Communication Technology for Development
ASD Administrative Services Division
AWP Annual work plan
CAPSA Centre for Alleviation of Poverty through Secondary Crops Development in Asia and the Pacific
DM Department of Management
EDM Executive direction and management
e-PAS Electronic Performance Appraisal System
EPOC ESCAP Pacific Operations Centre
ESCAP United Nations Economic and Social Commission for Asia and the Pacific
e-TC Electronic technical cooperation database
IMDIS Integrated Monitoring and Documentation Information System
IMIS Integrated Management Information System
IRFA IMIS Reporting Facility Application
JIU Joint Inspection Unit
M&E Monitoring and evaluation
OIOS Office of Internal Oversight Services
OPPBA Office of Programme Planning, Budget and Accounts
PMD Programme Management Division
PPR Programme performance report
PPBD Programme Planning and Budget Division
PWMP Project work and monitoring plan
RB Regular budget
RBM Results-based management
SIAP Statistical Institute for Asia and the Pacific
SSA Special service agreement
TC Technical Cooperation
UNDP United Nations Development Programme
UNEG United Nations Evaluation Group
XB Extrabudgetary
GLOSSARY
Offices away from Bangkok Regional institutions and subregional offices under the auspices of ESCAP
PME focal point Planning, monitoring and evaluation focal point
Subprogramme manager The head of the organizational unit with the overall coordinating responsibility for a subprogramme
Member States States Members of the United Nations
member States States members of ESCAP
INTRODUCTION
The present document has been prepared in consultation with staff of the United Nations Economic and Social Commission for Asia and the Pacific (ESCAP) and the regional commissions for Africa, Europe, Latin America and Western Asia. It responds to requests made by the Advisory Committee on Administrative and Budgetary Questions (ACABQ)1 of the General Assembly and the Office of Internal Oversight Services (OIOS) of the United Nations Secretariat.2
The purpose of the document is to (a) provide an overview of the monitoring and evaluation (M&E) system of ESCAP, including policies and guidelines, as well as fact sheets and tools; and (b) promote coherence in the regional commissions' approach to M&E in accordance with the recommendations of ACABQ.
The M&E system of ESCAP aims to facilitate:
More effective results-based management (RBM); M&E, in addition to planning and implementation, constitute a primary pillar of RBM. M&E thus play a critical role in steering ESCAP programmes and projects towards the achievement of development results;
Improved institutional learning through the identification of lessons and systematic follow-up. M&E help to ensure that lessons learned in implementation and delivery inform planning processes and strengthen the developmental contribution of ESCAP to member States;
Strengthened accountability vis-à-vis member States and other development partners within the United Nations and beyond. M&E promote transparency and participation throughout the RBM process and serve to display the results of the work of ESCAP to all stakeholders.
Monitoring and evaluation at ESCAP is governed by the regulations and rules of the United Nations Secretariat as put forth by the Secretary-General.3 In addition, the evaluation component of the M&E system is guided by the principles for evaluation developed by the United Nations Evaluation Group (UNEG).4
The M&E system of ESCAP mainstreams M&E in strategic planning and management of programmes and projects. Guidance on M&E is provided to explain requirements and responsibilities for M&E and how results from M&E are used to inform ongoing and future planning and implementation processes. This guidance is provided through a series of documents (as illustrated in figure 1):
M&E System Overview, which provides the overall framework for M&E within ESCAP;
1 See the first report of ACABQ on the proposed programme budget for the biennium 2006-2007, 2005 (A/60/7).
2 In addition to general guidance provided by OIOS to the entire Secretariat (see footnote 3 below), the OIOS Monitoring, Evaluation and Consulting Division, in its report on its inspection of results-based management (RBM) practices at the United Nations Economic and Social Commission for Asia and the Pacific (ESCAP), July 2007, issued specific recommendations on the M&E system of ESCAP.
3 Secretary-General's Bulletin, "Regulations and Rules Governing Programme Planning, the Programme Aspects of the Budget, the Monitoring of Implementation and the Methods of Evaluation", ST/SGB/2000/8, 19 April 2000.
4 United Nations Evaluation Group, "Standards for Evaluation in the UN System", April 2005, http://www.uneval.org; and "Norms for Evaluation in the UN System", April 2005, http://www.uneval.org.
Monitoring Guidelines, with detailed instructions on monitoring requirements;
Evaluation Guidelines, with detailed instructions on how to conduct evaluations at ESCAP.

Monitoring and evaluation guidelines are supported by fact sheets and tools. The fact sheets are subject to continuous updates. The latest versions are available on iSeek (http://iseek.un.org/webpgdept1028_79.asp?dept=1028).
Figure 1. Documents providing guidance on the M&E system of ESCAP

[The figure shows the Monitoring and Evaluation System Overview at the top, supported by the Monitoring Guidelines (procedures for programme performance monitoring and reporting through the use of IMDIS as issued by Headquarters) and the Evaluation Guidelines. Each set of guidelines is in turn supported by fact sheets (for each monitoring requirement or evaluation type) and tools (in support of the monitoring or evaluation process).]
The present document is the M&E System Overview and comprises the following chapters:
Chapter 1 introduces M&E, its role in RBM and the programme and project cycles;
Chapter 2 explains the monitoring framework for subprogrammes and projects;
Chapter 3 explains the evaluation framework of ESCAP.
1. MONITORING AND EVALUATION SYSTEM
The present chapter provides background information on monitoring and evaluation (M&E), including evaluative review, and places it in the context of results-based management (RBM) at ESCAP.
1.1 Introduction
Purpose and Definition
The overall purpose of M&E is to measure and assess performance in order to manage the achievement of results more effectively.
Monitoring is a continuous function that aims primarily to assess systematically the progress of an existing subprogramme or project towards the achievement of results.
Evaluation is a selective exercise, conducted in a formal and structured manner, to determine the relevance, efficiency, effectiveness and sustainability of a particular initiative. It may also be used, where appropriate, to determine the impact of that initiative and may cover subprogramme, project or thematic evaluations.5 Evaluation is used to improve the quality of the future programme of work of ESCAP.
M&E Responsibilities
The overall M&E responsibilities are defined by the organizational structure of the United Nations and the ESCAP secretariat and by the job descriptions of key staff members. The Commission is responsible for guidance and oversight of the work of the ESCAP secretariat.
ESCAP member States have a number of critical functions in relation to M&E:
Since the ESCAP secretariat is accountable to member States, it provides regular M&E reports to the Commission and its subsidiary bodies;
Member States may request the secretariat to report on procedural issues or to conduct specific evaluations;
By approving the strategic framework and programme budget, Member States also approve the results framework, which forms the basis for programme M&E;
By approving the programme budget, Member States also approve the biennial evaluation plan of ESCAP;
In response to M&E reports, member States may decide to introduce changes to the ESCAP programme of work in both substantive and procedural terms;
Member States are also responsible for monitoring and evaluating commitments that they themselves have made, for example in the context of ESCAP intergovernmental forums, including Commission resolutions.
5 Examples of thematic evaluations could include the evaluation of a cross-cutting issue (e.g. gender), a modality (e.g. surveys or committee meetings), or a fund (e.g. all projects funded by a particular donor).
The United Nations Department of Management (DM), through its Accountability and Oversight Support Service, and the Office of Internal Oversight Services (OIOS) are responsible for guidance, oversight, quality assurance and support of monitoring and evaluation processes within the United Nations Secretariat. Additionally, OIOS may manage external evaluations of ESCAP.
Further, relevant donors may review evaluation reports or be involved in acting upon the recommendations contained in those reports. In the case of joint programmes or projects, partner entities share M&E responsibilities with ESCAP.
At the level of the secretariat, the Executive Secretary of ESCAP is responsible for all activities undertaken by the ESCAP secretariat, which, in the context of M&E, means ensuring that monitoring information and the findings and recommendations of evaluations are used to promote learning, strengthen accountability at ESCAP and ultimately improve organizational effectiveness in delivering development results.
The primary responsibility for monitoring lies with the division chiefs and section chiefs as well as the heads of offices away from Bangkok. The Programme Management Division6 (PMD) is responsible for overall coordination and provides technical support and quality assurance. The Department of Management prescribes most of ESCAP's programme monitoring requirements for each biennium.
The primary responsibility for evaluation lies with PMD in order to ensure a maximum degree of impartiality and independence and to enhance the credibility of ESCAP vis-à-vis external stakeholders. Divisions provide support and advice in the planning, management and review of evaluations. They may also be requested to follow up on recommendations. Quality assurance is provided by OIOS, as appropriate. Evaluative reviews, or reviews, may be managed by divisions, including offices away from Bangkok. Evaluative reviews are implemented in line with the M&E System Overview and Evaluation Guidelines of ESCAP. Quality support and assurance is provided by PMD.
Each organizational unit responsible for delivering ESCAP's programme of work appoints planning, monitoring and evaluation (PME) focal points and assistants. This group of staff members serves as the anchor of M&E at ESCAP. PME focal points coordinate the monitoring and reporting activities for the subprogrammes under the purview of the division or office. In the context of evaluation, the PME focal points and assistants facilitate the formulation of ESCAP's biennial evaluation plan, provide guidance to their colleagues on evaluation during the design phase of programmes and projects, and coordinate the monitoring and reporting on follow-up to evaluations by their division or office.
Comparing Monitoring, Evaluation and Evaluative Review
There are clear linkages between monitoring, evaluation and evaluative review: monitoring can generate questions to be answered by evaluation and review; evaluation and review can point out new areas for monitoring. Evaluations and reviews draw heavily on data generated through monitoring during the programme or project cycle. Examples include baseline data, information on the subprogramme or project implementation process, and measurements of outcomes and results. Finally, managers use monitoring, review and evaluation results to manage programmes and projects. Different aspects of monitoring, evaluation and review, including related responsibilities, are compared in table 1.
6 See "Organization of the secretariat of the Economic and Social Commission for Asia and the Pacific" (ST/SGB/2005/11), 29 April 2005.
Table 1. Comparison between monitoring, evaluation and review at ESCAP

Purpose (primary purpose listed first)
  Monitoring: Determine if subprogrammes or projects are progressing according to plan
  Evaluation: External accountability to member States and donors; internal accountability; organizational learning
  Evaluative review: Organizational learning; internal accountability; external accountability

Responsibility
  Monitoring: Subprogramme managers
  Evaluation: ESCAP Programme Evaluation Officers
  Evaluative review: Division or office away from Bangkok

Use of findings
  Monitoring: Take corrective action to ensure that subprogramme and project objectives are met; ensure accountability to member States and donors
  Evaluation: Incorporate lessons learned in the strategic planning and decision-making process of ESCAP to improve future subprogrammes; ensure accountability to member States, donors and other development partners
  Evaluative review: Incorporate lessons learned in the planning and decision-making processes of ESCAP; mainstream an understanding of quality management across ESCAP; foster a culture of learning

Focus
  Monitoring: Outputs/activities; expected accomplishments; indicators of achievement
  Evaluation: Objectives (subprogramme); outcomes (project); results (themes)
  Evaluative review: Outcomes (project); results at different levels; internal processes

Deliverables
  Monitoring: Continuously updated information in the IMDIS and e-TC systems; output reports, work month reports, accomplishment accounts, programme performance report; project progress and terminal reports
  Evaluation: Evaluation reports with findings, lessons learned and recommendations
  Evaluative review: Review reports with findings, lessons learned and recommendations

Dissemination
  Monitoring: Division, project stakeholders and, at important milestones, member States
  Evaluation: ESCAP secretariat, donors and other stakeholders; Intranet; Internet
  Evaluative review: All relevant secretariat entities; Intranet

Quality assurance and support
  Monitoring: PMD; DM
  Evaluation: OIOS; UNEG; PMD; reference group
  Evaluative review: Internal peers; ESCAP Programme Evaluation Officers; reference group
1.2 M&E in the Context of Results-Based Management
Introduction to Results-Based Management
RBM is "a management approach aimed at changing the way organizations operate, with improving performance (achieving results) as the overriding orientation".7 In other words, the overriding organizational question relates to what results to achieve. Only then should the activities needed to achieve such results be determined.
ESCAP has introduced RBM with the aim of enhancing:
The organization's relevance by clarifying comparative advantages and sharpening focus;
Accountability and transparency to member States and donors through clearer programme and project objectives supported by a strong M&E system;
Ownership by all stakeholders through participatory planning, implementation and evaluation processes;
Effectiveness and efficiency of programmes and projects by interlinking planning, budgeting, financial management, administration, human resources, knowledge management and M&E, with a focus on results;
Organizational learning through evaluations and the sharing of lessons learned.
M&E together constitute a primary pillar of RBM. As such, M&E are integral components of the programme and project cycle. M&E are also key sources of knowledge (data, information, lessons and recommendations) that need to be captured by the ESCAP knowledge management system with a view to (a) defining and redefining the desired development results to be achieved by the organization and (b) strengthening the methods, processes or modalities through which such results are to be achieved. The extent to which evaluative knowledge can thus be used in managing for development results depends, critically, on the effectiveness with which knowledge itself is made available, disseminated, discussed and used.
Results Framework
In the context of RBM, each subprogramme and project is designed using a results framework.8 The results framework specifies the hierarchy of results: impact, outcome, expected accomplishments, and outputs/activities. Table 2 shows the results framework for subprogrammes and projects, which describes the four results levels, the performance indicators used to measure the results, and the role of M&E.
The two-year programme cycle at ESCAP and at all other departments of the United Nations Secretariat includes programme planning, the implementation of the programme through projects and activities, and M&E.9 It is important to note that M&E is not an individual time-dependent step in the programme cycle, but is carried out throughout the biennium. Moreover, both planning and implementation are subject to M&E (see figure 2).
7 United Nations Evaluation Group (UNEG), 2006. "The Role of Evaluation in Results-Based Management (RBM), draft report" (http://www.uneval.org).
8 Other terms used within ESCAP are "logical framework" and "logframe".
9 Projects may span more than one biennium.
Table 2. Results framework

Goal (subprogramme level)
  What member States achieve (4-8 year timeframe): benefit to ultimate target groups
  Description: Economic and social development
  Indicators (quantity and quality): Millennium Development Goals (MDGs); other relevant economic and social indicators
  M&E: Reported at global and regional conferences of the United Nations: periodic global and regional analysis of member States' statistics; member States' reports

Objective (subprogramme) / Outcome (project)
  What member States do (2-4 year timeframe): change in behaviour and benefits, utilization of capacities
  Description: Formulation and implementation of improved economic and social policies and programmes by member States, based, e.g., on commitments from global and regional conferences
  Indicators (quantity and quality):a Evidence of the number and quality of new or improved policies; evidence of the delivery of new or improved government goods and services
  M&E: Reported in the biennial PPR or other forums based on: feedback from Member States; reports of committees and special bodies; other subprogramme monitoring by divisions; external or internal evaluations

Expected accomplishments (subprogramme) / Outputs (project)
  What ESCAP achieves (1-2 year timeframe): human and technical capacities
  Description: Capacities or potentials created among member States' departments, organizations and individuals (systems and processes, knowledge and skills, attitudes and behaviour)
  Indicators (quality and quantity): Satisfaction with the number and quality of products and services provided by ESCAP; evidence, where appropriate, of initial follow-up and application of skills and knowledge by member States
  M&E: Reported in the biennial PPR based on: stakeholder questionnaires, etc.; reports of committees and special bodies; other subprogramme monitoring by divisions; evaluations; evaluative reviews

Outputs and activities (subprogramme) / Activities (project)
  What ESCAP does (ongoing): specific measures needed to achieve outputs
  Description: Delivery of products and services, e.g. servicing intergovernmental bodies; training/workshops; information/publications; specific research/analysis
  Indicators (quality and quantity):b Outputs/activities delivered by type, purpose, location, etc.; % progress against programme budget; % expenditure against budget
  M&E: Monitoring and reporting in IMDIS; project progress reports and terminal reports

Notes:
a In the context of the strategic framework, there is no requirement to formulate indicators at the level of the objective.
b Integrated Monitoring and Documentation Information System (IMDIS) indicators or quality criteria for outputs and activities. In the context of the programme of work/programme budget, there is no requirement to formulate
Figure 2. M&E in the ESCAP programme cycle

[The figure shows successive biennial cycles: the implementation periods 2008-2009, 2010-2011 and 2012-2013, the 2010-2011 programme budget, and the strategic frameworks and programme budgets for 2012-2013 and 2014-2015. M&E findings from each implementation period feed into the planning of subsequent strategic frameworks and programme budgets.]
Programme planning, through the preparation of the strategic framework and the programme budget, provides the basis for the implementation, monitoring and evaluation of the programme for each biennium.
The strategic framework establishes the strategic direction of United Nations departments, including ESCAP, for a given biennium (two-year work programme period), is planned two years ahead of the biennium in question, and is approved by the General Assembly approximately one year ahead of the biennium. It defines the expected accomplishments and indicators of achievement for each subprogramme (see table 2), against which subprogrammes are monitored and evaluated. It also outlines the overall objective of each subprogramme and describes a strategy through which the expected accomplishments will be achieved. ESCAP's subprogrammes and organizational structure are presented in annex II.
Based upon the approval of the strategic framework, the programme budget is prepared and is approved by the General Assembly in December of the year preceding the biennium. The programme budget constitutes a detailed biennial plan for each subprogramme, describing outputs to be delivered, external factors (assumptions made, such as the availability of extrabudgetary funding), and detailed financial and human resource requirements. The programme budget defines the outputs and activities that need to be delivered in order to achieve the results defined in the strategic framework. It thus provides a framework for ongoing monitoring of programme implementation and performance, including budgetary delivery.
Planning also takes place at the level of projects and other activities carried out as part of programme implementation at ESCAP. In the case of technical cooperation (TC) projects, including all extrabudgetary and some regular budget projects, the basis for implementation and M&E is provided by the Project Document. The Project Document is described in more detail in section 2.3.
2. MONITORING FRAMEWORK
The present chapter provides an introduction to monitoring (section 2.1). It defines the concept, explains its use and identifies the monitoring roles and responsibilities of different stakeholders. Furthermore, the monitoring requirements at the level of subprogrammes (section 2.2) and projects (section 2.3) are explained. For each monitoring requirement, a separate, more detailed fact sheet is available, which lists additional guidance and reference materials (see annex III).
2.1 Introduction
Definition
Monitoring is defined as "a continuing function that aims primarily to provide the management and main stakeholders of an ongoing intervention with early indications of progress, or lack thereof, in the achievement of results. An ongoing intervention might be a (sub)programme, project or other kind of support to an outcome."10
Activities include formal information collection and reporting requirements (e.g. conduct of surveys, analysis of statistical data, review of expenditure against budget, reporting in IMDIS and e-TC, preparation of mission reports) and more informal activities (observations, discussions with colleagues, scanning media reports, etc.). At regular intervals, monitored information can be used for a more formal review or assessment of subprogramme and project performance.
The conceptual and methodological framework for monitoring at ESCAP is provided by the Accountability and Oversight Support Service of the Department of Management. Monitoring, review and assessment elements described in the present chapter should be seen in the context of what OIOS categorizes as "mandatory self-assessment".11
Use of Monitoring
Monitoring is an ongoing management function that answers the question "Are things going according to plan?" It focuses on the implementation and financial progress of the subprogrammes and projects of ESCAP, placing greater emphasis on outputs and expected accomplishments than on outcomes or impacts. As such, monitoring plays a key role in ensuring the results-orientation of the programme of work.
Results from monitoring are used to:
Take corrective action, if required, to ensure that subprogramme and project objectives are met within a given budget and timeframe by comparing actual progress against what was planned.
10 UNDP, 2002. “Handbook on Monitoring and Evaluating for Results”, http://www.undp.org/eo/documents/HandBook/ME-HandBook.pdf
11 Office of Internal Oversight Services (OIOS), "A Guide to Using Evaluation in the United Nations Secretariat", June 2005, http://www.un.org/depts/oios/manage_results.pdf. OIOS defines "mandatory self-assessment" as "the assessment undertaken by all programme and subprogramme managers when reporting the results attained with respect to the expected accomplishment presented in the logical frameworks of the biennial programme budget documents."
ESCAP Monitoring and Evaluation System
- Division chiefs and section chiefs will use monitoring information to improve subprogramme management, and section chiefs and project officers will use monitoring information to improve project management;
- Enhance organizational learning by sharing findings and lessons learned with colleagues and management internally, and occasionally with donors, for example through a project progress report. This indirectly supports strategic and programme planning and decision-making at ESCAP (particularly by informing the planning process related to future strategic frameworks and programme budgets);
- Hold ESCAP accountable to member States and donors by justifying the efficient and effective use of funds and staff resources.
At the level of United Nations Headquarters, aggregated results from monitoring are used (a) to compile overall Secretariat reports on the delivery of the programme of work and (b) to compare the programme delivery of different Secretariat entities.
Roles and Responsibilities
Monitoring responsibilities are described in the job descriptions of relevant staff members, and specific monitoring tasks are included in their performance appraisals.
- Subprogramme managers, i.e. the chiefs of divisions or the heads of other organizational units with the overall coordinating responsibility for the subprogrammes of ESCAP (see annex II), carry the primary responsibility for programme monitoring. They approve monitoring reports;
- Planning, monitoring and evaluation (PME) focal points provide coordinated division-level support in the process of PME. Each organizational unit responsible for delivering ESCAP’s programme of work nominates one professional staff member as the PME focal point (plus one alternate) and one general service staff member as the PME assistant (plus one alternate). They play a coordinating role within each subprogramme on PME issues for purposes of (i) strengthening quality support through effective PME and (ii) facilitating coordination between PMD and the subprogrammes on PME issues;
- Section chiefs, heads of regional institutions, project officers and other staff members monitor the subprogrammes and projects in which they are involved or for which they are responsible;
- The Programme Management Division (PMD) plays a technical support and coordinating role and, together with the Department of Management, has a quality assurance function;
- The Department of Management provides quality assurance and support in monitoring the programme of work of ESCAP;
- OIOS monitors overall compliance with programme monitoring requirements.
In addition to the ESCAP secretariat, member States play a role in monitoring through the Commission and committees. Member States provide feedback on developments in their countries, follow-up to commitments made at relevant global and regional conferences and on the quality and direction of the work of ESCAP. Participants in expert group meetings and intergovernmental meetings also provide the secretariat with feedback through questionnaires and surveys. Such feedback can be reflected in biennial programme performance reports, in the strategic framework and through reviews of the conference structure of the Commission.
Support Systems
ESCAP uses a number of support systems that are relevant to M&E. These are briefly introduced below.
Integrated Monitoring and Documentation Information System
The Integrated Monitoring and Documentation Information System (IMDIS) is a Secretariat-wide, web-based information system for programme performance monitoring and reporting, including the preparation of the Secretary-General’s programme performance report. Staff members monitor and report on the results attained through the implementation of their programme of work during a given biennium. Each user can only update his or her areas of responsibility but can view the entire programme of work of the organization as well as individual subprogrammes. The intention is to promote accountability, transparency and information-sharing. Instructions for using the system are provided in the IMDIS User’s Guide.12
Integrated Management Information System
The Integrated Management Information System (IMIS) is used in the day-to-day administrative and financial management of projects and other activities. It is an online transaction processing and information system that is used within the United Nations for the administrative management of:
- Budget and finance, such as accounts payable and receivable, disbursements, expenditures and suballotments;
- Human resources, such as personnel details, insurance, payroll and recruitment;
- Support services, such as procurement, stock management, property management and travel.
Electronic Performance Appraisal System
The purpose of the Electronic Performance Appraisal System (e-PAS) is to improve the achievement of subprogramme results by optimizing the performance of all staff. This is done by evaluating the performance of staff members in order to assess efficiency, competency and integrity and to ensure compliance with Staff Regulations and Rules. E-PAS is used to facilitate and document this process electronically.
An e-PAS cycle involves the development of a plan at the start of the annual reporting period, a mid-term review and an end-of-cycle appraisal. Each staff member has to indicate that he or she has received a copy of the work plan for his or her unit. This is to ensure that the annual work plan of the division or office, and similar plans at the section level, are shared with staff members and reflected in their e-PAS work plans. The e-PAS thus seeks to ensure that the work of all staff members is geared towards the achievement of the expected accomplishments of the subprogrammes and, thereby, the organization’s overall results.
Electronic technical cooperation database
The electronic technical cooperation database (e-TC) was created to provide an overview of all technical cooperation projects funded through extrabudgetary resources.13 An overview of the system is provided in “E-TC, An Introduction” and more detailed guidance is provided in the “E-TC User Manual”.14 The purposes of e-TC are:
12 http://imdis.un.org/ under “Programme performance reporting portal”.
13 E-TC currently does not include Development Account projects or projects funded through the regular budget of ESCAP.
14 Both documents can be accessed through ESCAP's internal shared drive at P:\Programme Management\e-TC.
- To provide comprehensive and user-friendly budgetary and other financial information relevant to projects through a direct and regularly updated link to IMIS;
- To act as a database for substantive project-related information, facilitating comparative analyses of trends;
- To serve as an electronic depository for all important project information, including project documents, progress and terminal reports.
2.2 Programme Monitoring
The basis for the monitoring of subprogrammes at ESCAP is the programme budget, which includes the subprogramme results frameworks as well as detailed information on outputs, activities and required resources.
Ongoing monitoring and monitoring milestones
Monitoring of subprogramme activities is done on a regular basis to verify whether the delivery of outputs is going according to plan and is in accordance with the budget. This helps the staff members responsible for the delivery of specific outputs to make adjustments where they are needed.
In addition to ongoing programme monitoring, there are several monitoring and reporting requirements at set intervals. These are used by subprogramme managers, as well as PMD and the Headquarters’ Department of Management, to ascertain, at a higher level, whether the planned outputs are implemented in support of achieving the expected accomplishments and whether the subprogramme is on track to achieve its objective. They include the updating of an annual work plan, as well as other monitoring requirements that are reported in IMDIS, such as accomplishment accounts.
The monitoring deliverables for ESCAP subprogrammes are summarized in figure 3.
Figure 3. Programme monitoring deliverables during a biennium

- 6 months: outputs; work months
- 12 months: outputs; work months; accomplishment account*; IMDIS results information
- 18 months: outputs; work months; accomplishment account*; IMDIS results information
- 21 months: preliminary performance assessment* (see p. 16)
- 24 months: programme performance report* (see p. 16-17)

* should be prepared with the participation of all staff
In addition to the regular requirements of updating IMDIS monitoring records, attention should be paid to the following:
- Beginning of the biennium: An annual work plan should be prepared with the participation of all staff. The beginning of the biennium is also an opportunity to refine the programme performance assessment methodology. Baseline and target values for some indicators may need to be revised according to actual results achieved at the end of the previous biennium. It is also critical to review the methodology for measuring the indicators, including related data collection needs and verification methods, early in the biennium. A monitoring plan should be developed and the evaluation plan for the biennium should be revised if necessary.
- At each milestone, the annual work plan and the evaluation plan for each subprogramme should be updated.
- 12 month milestone: The mid-biennium point is a major milestone as, at this stage, there is still room for adjustment to ensure implementation of the subprogramme. At the 12 month milestone, subprogramme managers should undertake a thorough review of the implementation of the subprogramme, assess whether measures need to be taken to ensure attainment of its expected accomplishments, and identify any programme changes needed: additional outputs, reformulation, or termination for consideration by the Commission.
- 18 month milestone: As each department of the United Nations Secretariat is required to report a first draft of the ‘Highlights of programme results’ and a first summary of ‘Challenges, obstacles and unmet goals’ at the 18 month milestone, subprogramme managers are encouraged to identify, through a participatory process, the elements from their subprogramme to be featured in ESCAP’s draft of these two texts. Findings from evaluations or evaluative reviews relevant to the subprogramme should be used in the process.
Importance of staff participation
Active staff involvement in planning and monitoring will enhance ownership of the process and thus facilitate the achievement of results. Staff participation encourages the sharing of experiences, which strengthens organizational learning. Division and section chiefs, as well as heads of offices, should ensure active staff involvement at all stages of the programme cycle, but particularly during the preparation of the following:15
- Annual work plan (AWP): As staff members are responsible for the day-to-day delivery of outputs, they are best placed to determine how much time is needed for each activity and to set realistic deadlines. As such, the AWP preparation can be used as a basis for the e-PAS plan for individual staff members.
- Accomplishment accounts (AAs): Staff members can provide an update on the delivery of outputs, achievements against predefined indicators, constraints encountered and how these are managed, and lessons learned and suggestions for follow-up and improvement. This feedback should strengthen the overall delivery of the programme of work while contributing to the development of capacity for all staff members. It can also be used to update the AWP and for the review of staff performance in e-PAS.
15 Further guidance on the participatory process promoted in support of monitoring and reporting (termed mandatory self-evaluation by OIOS) is provided in the OIOS’s ‘Guide to Using Evaluation in the United Nations Secretariat’.
- Preliminary performance assessment (PPA) and programme performance report (PPR): Staff members can provide similar inputs as in the preparation of AAs. However, the PPA is even more important in that it effectively reflects on the entire programme cycle and is used to inform the planning process for future strategic frameworks, as well as to prepare the programme performance report, which is shared with Headquarters, member States and donors. The PPA and the PPR are thus also a time for reflecting on lessons learned, including from evaluations conducted during the biennium.
Consultation with the Executive Secretary
Division chiefs and other staff members concerned are required to meet with the Executive Secretary at the 12 month mark and the 21 month mark during the biennium to present their up-to-date AWP, their AA (12 months) and PPA (21 months), and to report on the lessons learned in the process of implementing the subprogramme. In line with their e-PAS requirements, they also report on how staff participation was ensured.
Annual Work Plan
The annual work plan translates the programme budget for each subprogramme into a plan of activities for a calendar year. The annual work plan is a tool that helps ensure the timely implementation of all outputs contained in the programme of work.
The programme budget lists the outputs, activities and resources required in order to achieve expected accomplishments. However, the document is not suitable for the day-to-day management of the work of a division, which requires much more detail, including information on projects and other activities that were not available at the time the programme budget was prepared.
On the basis of the programme budget, each division and office prepares a detailed annual work plan (AWP) at the beginning of the year. The AWP is a compilation of section work plans and the work plans of offices away from Bangkok, as appropriate. It includes all major goals and planned activities, whether they are funded through the regular budget or through extrabudgetary resources. It also includes monitoring and evaluation activities, which cover reporting in IMDIS (for example, outputs, accomplishment accounts) as well as planned evaluations and evaluative reviews.
The AWP serves as a management tool for division and section chiefs and heads of offices away from Bangkok, and it provides staff members with an overview of planned activities and timelines. The subprogramme managers update the plan regularly, at least every six months. The e-PAS work plans of all staff members, including managers, should be based on the AWP.
In accordance with e-PAS requirements, all divisions and offices must have an AWP and are encouraged to use Fact Sheet 1, which provides guidance on the development of the AWP, in combination with Monitoring Tool 1, which is a sample results-based annual work plan that can be adapted to the needs of the division or office.
See Fact Sheet 1
See Monitoring Tool 1 – Sample results-based annual work plan
IMDIS: Outputs, Work Months, Accomplishment Accounts
Every six months, subprogramme managers report on the progress of their subprogramme in IMDIS. This includes outputs, work months and accomplishment accounts.
Output reporting
Output reporting in IMDIS is required every six months and covers several categories that are also used in the programme budget and AWP, including: (1) substantive servicing of meetings; (2) parliamentary documentation; (3) expert groups, rapporteurs, depository services; (4) recurrent publications; (5) non-recurrent publications; (6) other substantive activities; (7) training courses, seminars and workshops; (8) fellowships and grants; and (9) field projects.
The purpose is to assess whether output delivery is aligned with the programme budget in terms of:
- Quantity: Number of workshops, meetings, publications, etc. delivered;
- Timeliness: Percentage of outputs completed in relation to the total number of outputs planned.
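As an illustrative sketch only (the output records and field names below are hypothetical, not part of the official IMDIS procedure), the two measures amount to a simple count per category and a completion percentage:

```python
# Hypothetical output records for one subprogramme (illustration only).
# "delivered" marks outputs completed within the reporting period.
outputs = [
    {"category": "workshops", "delivered": True},
    {"category": "recurrent publications", "delivered": True},
    {"category": "recurrent publications", "delivered": False},
    {"category": "field projects", "delivered": True},
]

# Quantity: number of outputs delivered, counted per category.
quantity = {}
for o in outputs:
    if o["delivered"]:
        quantity[o["category"]] = quantity.get(o["category"], 0) + 1

# Timeliness: outputs completed as a percentage of all outputs planned.
timeliness = 100 * sum(o["delivered"] for o in outputs) / len(outputs)

print(quantity)
print(timeliness)  # 75.0
```

Three of the four planned outputs were delivered, so the timeliness measure here is 75 per cent.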
Every six months, supporting documents for each output delivered, including meeting reports, publications issued or posted on ESCAP web pages, workshop attendance lists, and project progress/terminal reports, are submitted to PMD. The PME assistants usually coordinate the reporting in IMDIS and to PMD for the entire subprogramme.
See Fact Sheet 2
Work month reporting
Every six months, each subprogramme also reports the work months spent by each professional staff member or consultant on the delivery of planned outputs, irrespective of whether funding is received through the regular budget or extrabudgetary resources. The purpose is to account for the manner of allocating professional staff and consultants’ time within the subprogrammes. Work months are reported using a standard template.
See Fact Sheet 3
See Monitoring Tool 2 – Work Months Report
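A minimal sketch of the tallying involved (the entries and names below are hypothetical, not the standard template): work months are summed per staff member regardless of funding source.

```python
# Hypothetical work month entries: (staff member, funding source, months spent
# on planned outputs). Illustration only, not the ESCAP reporting template.
entries = [
    ("staff A", "regular budget", 4.0),
    ("staff A", "extrabudgetary", 2.0),
    ("staff B", "regular budget", 5.5),
]

# Total work months per staff member, irrespective of funding source.
totals = {}
for staff, _source, months in entries:
    totals[staff] = totals.get(staff, 0.0) + months

print(totals)  # {'staff A': 6.0, 'staff B': 5.5}
```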
Accomplishment accounts
Accomplishment accounts (AAs) are required at 12, 18 and 24 months of the programme cycle. Following the 12-month AA, each new AA is an updated version of the previous one. The purpose of AAs, as given by OIOS, is to provide “a summary of a specific subprogramme accomplishment that is based on data collected for the indicators of achievement and other relevant information that serves as the source of reporting on whether the relevant expected accomplishment was achieved.”16
AAs also report on the overall quality of outputs delivered.
Subprogramme managers prepare one AA for each expected accomplishment in a standard format covering, inter alia, the activities undertaken and outputs delivered, what was achieved, lessons learned, including from evaluations, and suggestions for improvement.
All staff participate in the preparation of AAs, which are reviewed by subprogramme managers, section chiefs and heads of offices before they are uploaded in IMDIS. Summary results information from the AAs is entered directly into a number of fields in IMDIS (results information),
16 Office of Internal Oversight Services (OIOS), “Glossary of Monitoring and Evaluation Terms”, August 2006, http://www.un.org/Depts/oios/mecd/mecd_glossary/index.htm
including the “statement of accomplishments/results achieved” and the “lessons learned/areas needing improvement”. The information is used to provide interim measurements of each indicator of achievement.
The AAs are used to prepare the preliminary performance assessment (PPA) at 21 months, and the programme performance report (PPR) at 24 months. The PPA and PPR are explained in the next section.
See Fact Sheets 4 and 5
See Monitoring Tool 3 – Sample Accomplishment Account
IMDIS: Programme Performance Report
The preparation of the programme performance report (PPR) consists of two steps:

- Preliminary performance assessment (PPA) at the 21 month milestone, which is effectively the preparation of the draft PPR;
- Programme performance report (PPR) at the end of the biennium.
Preliminary performance assessment
Each division and office away from Bangkok must undertake a preliminary performance assessment (PPA) during the 21st month of the programme cycle. The purpose of the PPA is to take stock of the subprogramme’s performance near the end of the two-year programme cycle. Effectively, it amounts to the preparation of a draft programme performance report and involves reviewing the accomplishment accounts, preparing a draft statement of accomplishment and a summary of lessons learned and areas identified for improvement, including from evaluations completed during the biennium, and reporting on the performance indicators. The PPA is used to inform the planning process for future strategic frameworks.
It is important that staff members who are involved in the day-to-day delivery of outputs actively participate in this assessment to provide feedback on what has been achieved, identify and share lessons learned and take part in determining next steps.
See Fact Sheet 6
Programme Performance Report
The programme performance report (PPR) is prepared by divisions and offices away from Bangkok as the final report of the programme cycle at 24 months. The purpose of the PPR is to inform member States about the progress and performance of each subprogramme, using resources allocated in the programme budget. The PPR highlights not only achievements but also constraints encountered, lessons learned and suggestions for improvement, and should be based, to the extent possible, on the findings of and management response to evaluations. The PPR is posted in IMDIS and consists of the following:
- Statement of accomplishment/results achieved;
- Lessons learned/areas needing improvement;
- Information on the extent to which performance indicators were met, providing a measure of whether expected accomplishments have been achieved;
- Completed outputs, work months and accomplishment accounts for the 24 month programme period;
- Relevant supporting documentation, such as project progress or terminal reports, evaluation reports, etc.
The subprogramme-level PPRs are also used to compile the PPR of the ESCAP programme as a whole. The Department of Management uses the report to compile the PPR of the United Nations Secretariat as a whole. The Secretary-General submits the United Nations-wide PPR to the General Assembly for review by Member States.
See Fact Sheet 7
The Executive Secretary’s Compact
The Compact is an agreement between each Head of Department of the United Nations Secretariat and the Secretary-General on the objectives and priorities of the Head of Department for a given calendar year. The Compact comprises a number of areas, including programme delivery, the implementation of other selected mandates, the achievement of selected programme priorities, the management of human and financial resources and the contribution to the broader interests of the United Nations.
No specific monitoring of the implementation of the Compact is needed, as the monitoring of the various elements of the Compact is mainstreamed in other monitoring activities, including that of the Human Resources Action Plan and the programme budget.
The preparation of the Compact report is also mainstreamed with the regular monitoring and reporting exercises at the 12 month and 24 month milestones, when PMD compiles performance information on each element of the Compact from across the ESCAP secretariat. The report on the assessment of Compact performance is submitted to the United Nations Deputy Secretary-General and effectively constitutes the Executive Secretary’s e-PAS.
2.3 Project Monitoring17
The basis for project monitoring is provided by the project document, which includes a logical framework and a project work and monitoring plan (PWMP).
Project officers are responsible for monitoring the progress of projects on a day-to-day basis. This could include, for example, tracking the preparation of workshops, funds committed and spent in IMIS, and the delivery of outputs by consultants and project partners. Through monitoring, project management obtains information that can help it decide what action to take to ensure that a project is implemented according to plan. Examples of corrective actions may include communicating with project staff and partners on the implementation of certain project activities, hiring consultants to assist with certain tasks, rescheduling a workshop, and revising the suballotment or project timelines.
In addition to continuous monitoring, there are also specific monitoring milestones, often combined with review and reporting, the purpose of which is to keep section and division chiefs, donors, project partners and other stakeholders informed about the project’s progress. These monitoring milestones are described below and illustrated in figure 4.
17 This section covers projects funded through extrabudgetary sources only.
Project Document
At the beginning of a project, following the internal ESCAP approval of the project profile as well as clearance by the donor(s), a detailed project document is prepared. This document forms the basis for the implementation, monitoring, reporting and evaluation of the project and is prepared using a standard template, including:
- Body of the document: This includes an executive summary as well as sections describing the situation analysis, logical framework, management arrangements and inputs of ESCAP and other collaborating partners;
- Annex 1: Logical framework;
- Annex 2: Project work and monitoring plan;
- Annex 3: Budget;
- Annex 4: e-TC summary page.
The annexes in particular constitute important monitoring and evaluation tools that are used during the project cycle. They are described below.
See Fact Sheet 8
Logical framework18
The full logical framework is attached, as annex 1, to the project document, after the project profile has been approved within ESCAP and by the donor. The purpose of the logical framework is to design and plan a project and to provide a basis for its implementation, monitoring, reporting and evaluation. It is prepared in a matrix format and includes the project goal, outcome, outputs and activities, as well as the indicators used to measure them.
18 Detailed guidance on the preparation of the logical framework is provided in the “UNESCAP Training Guide on Project Planning, Monitoring and Evaluation”, February 2004.
Figure 4. Project monitoring milestones

- Start of project: project document; allotment request and entry in IMIS
- Each June and December: update project work and monitoring plan (PWMP); update e-TC; project progress report
- End of project: project terminal report; project terminal evaluation; financial closure of project account
- Throughout: ongoing monitoring
The logical framework is normally not revised during the project cycle unless major changes are made to expected project results, project activities, timeline or budget. In such cases, a revision of the project document may be required.
See Fact Sheet 9
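For illustration only, the matrix structure described above can be thought of as nested records, pairing each level of results with its indicators. The field names below are assumptions for this sketch, not the ESCAP template:

```python
# Illustrative sketch of a logical framework as nested records.
# Each level of results is paired with the indicators used to measure it.
# Field names are hypothetical; placeholder text stands in for real content.
logframe = {
    "goal":    {"statement": "(goal statement)",    "indicators": ["(indicator)"]},
    "outcome": {"statement": "(outcome statement)", "indicators": ["(indicator)"]},
    "outputs": [
        {
            "statement": "(output statement)",
            "indicators": ["(indicator)"],
            "activities": ["(activity 1)", "(activity 2)"],
        },
    ],
}

# Monitoring and evaluation then check reported data against each level's indicators.
for level in ("goal", "outcome"):
    print(level, logframe[level]["indicators"])
```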
Project Work and Monitoring Plan
The project work and monitoring plan (PWMP) is similar to the annual work plan, but its focus is project-specific. It serves as a management tool for project officers, while providing section and division chiefs with a transparent overview of important activities and timelines. The e-PAS plans of project staff are therefore based on both plans.
The PWMP is prepared prior to the start of the project and attached, as annex 2, to the project document. The PWMP shows the activities to be carried out over the entire project period. The project officer updates the PWMP every six months in conjunction with the preparation of the project progress report.
See Fact Sheet 10
Budget
The project budget is attached, as annex 3, to the project document. It is used by project officers to plan and manage the project finances, and it provides division chiefs, PMD and donors with a transparent overview of budgeted project costs. The budget is prepared using an MS Excel template.
The budget is not normally revised during the project cycle unless major changes are made to expected project results, project activities, timeline or human and financial resources. In such cases, a revision of the project document may be required. The budget forms the basis for financial monitoring of projects. Allotments are requested before the start of the project and for each calendar year.
See Fact Sheets 11 and 13
Summary page for e-TC and updates
The purpose of the e-TC system is to provide a clear overview of TC projects. A one-page summary for e-TC is attached, as annex 4, to the project document. It currently includes general information: participating countries or regional groups, subjects (for example, water, transport), modalities (for example, advisory services, training, pilot projects), relevant MDGs (for example, reduce child mortality), and relevance to gender. Divisions post the information in e-TC and update it every six months. Project officers also post PDF versions of supporting documentation in e-TC, such as progress reports/terminal reports, revised project documents, budget revisions and/or PWMPs, etc. As the e-TC system is directly linked to IMIS, there is no need to insert or update financial and other information that is already in IMIS.
See Fact Sheet 12
Project Progress Reports
All projects require semi-annual project progress reports. Progress reports should include a financial statement for the reporting period. The purpose of these reports is to internally monitor and review whether the delivery of outputs of a given project is within planned timelines and the budget,
and to record lessons learned. Project management can then decide if corrective actions are needed and can update the project work and monitoring plan accordingly.
An additional purpose of the report is to provide section and division chiefs with feedback on the project’s progress. In turn, they can use the report in the monitoring of subprogramme performance through the accomplishment accounts (to which project progress reports can be attached) and the preliminary performance assessment. They can also use the report to provide project donors with feedback.
Project officers prepare the project progress report using a standard template based on the logical framework of the project. Each progress report is an expansion of the previous report, so that the last progress report covers the entire project duration.
See Fact Sheet 14
Project Terminal Report
Upon completion of a project, a terminal report is prepared. This report summarizes the outputs delivered for the entire project and provides a terminal financial statement, but it also highlights the lessons learned under the project and lists recommended follow-up actions.
The purpose of project terminal reports is to provide donors with feedback about the project’s overall performance. For large projects, and projects with a duration of more than two years, the report can be used as a starting point for a terminal evaluation of the project (see section 3.3). It is also an important report for monitoring subprogramme performance and should therefore be posted in IMDIS in support of accomplishment accounts.
See Fact Sheet 15
2.4 Financial Monitoring
Financial monitoring of the delivery of outputs takes place in IMIS. E-TC and the IMIS Reporting Facility Application (IRFA) reports provide project officers with the necessary information to monitor the financial delivery of projects.
Allotments in IMIS are separated by the source of funding. In this regard, four broad categories are normally understood:
Regular budget activities are described and budgeted for in the biennial programme budget.Expenditure against funds appropriated takes place continuously and budgetary performance isreviewed on a regular basis during the biennium. Particularly, a first budget performance reportis issued by Headquarters at the end of the first year and a second budget performance report,which is made available to Member States for their consideration of the proposed programmebudget for the following biennium, is prepared by individual departments in the second half ofthe second year (September or October). In preparation for this second budget performance,each division or office prepares an expenditure plan for the rest of the biennium;
Activities of the Regular Programme of Technical Cooperation are described at thesubprogramme level in the programme budget and respond to requests from member States forcatalytic advisory services and capacity development activities on a demand-driven basis.Monitoring for this category of funds is also included in the performance reports mentionedabove;
21
ESCAP Monitoring and Evaluation System
Extrabudgetary TC projects are briefly described in the programme budget but are financedthrough specific contributions from Member States, multilateral organizations or private organi-zations. The budget plan in the project document as well as the trust fund document, whichreflects the agreement reached between ESCAP and the donor on the use of the fund, providethe basis for financial monitoring. Divisions or offices review financial performance of TCprojects regularly and ensure that reports submitted to donors meet the requirements as per thetrust fund agreement;
Development Account activities are funded through a section of the United Nations regular budget, with final project approval granted by the General Assembly. Financial performance is reported to DESA in its capacity as the Programme Manager for the Development Account. A substantive and financial report, prepared by Divisions or offices, is submitted to DESA every January at the latest, covering the activities of the previous year.
3. EVALUATION FRAMEWORK
The present chapter provides an overview of evaluative processes at ESCAP, which include evaluations and evaluative reviews. It discusses the roles and responsibilities of different ESCAP stakeholders in the evaluation process, evaluation planning and budgeting, and the evaluation process itself. The “ESCAP Evaluation Guidelines”, supported by evaluation fact sheets and tools, provide more detailed information on how to manage evaluations (see annex IV).
3.1 Introduction
Definition
Evaluation in the ESCAP context is defined as: “a selective exercise that seeks to determine as systematically and objectively as possible the relevance, effectiveness, efficiency and sustainability of an ongoing or completed subprogramme, project or other initiative in light of its expected results. It encompasses its design, implementation and actual results to provide information that is credible and useful, enabling the incorporation of lessons learned into executive planning and decision-making.”19
Evaluation asks three questions: are we doing the right thing, are we doing it right, and are there better ways of achieving the expected results? Evaluations are a key element of results-based management (RBM).
OIOS distinguishes evaluations from other forms of assessment, such as monitoring, audit and appraisal.20,21 This evaluation framework covers what OIOS defines as “external evaluation” (by entities outside the ESCAP secretariat) and “internal evaluation” (by ESCAP).
All evaluations that are managed by ESCAP staff are considered “internal evaluations”. There are, however, two types of internal evaluations: “evaluations”, which are managed by the ESCAP Programme Evaluation Officers, and “evaluative reviews”, which are managed by divisions or Offices away from Bangkok.
Norms and Criteria
ESCAP seeks to uphold the norms and standards for evaluation developed by the United Nations Evaluation Group.22 These guiding principles have been adapted for ESCAP’s context, as seen below:
19 Adapted from: Joint Inspection Unit, United Nations, 2006, “Oversight Lacunae in the United Nations System” (available online at http://www.unjiu.org/data/reports/2006/en2006_2.pdf).
20 Different OIOS evaluation types are described in: Office of Internal Oversight Services (OIOS), “A Guide to Using Evaluation in the United Nations Secretariat”, June 2005 (available online at http://www.un.org/depts/oios/manage_results.pdf).
21 Evaluation questionnaires administered after expert group meetings and intergovernmental meetings are considered to be part of monitoring.
22 United Nations Evaluation Group (UNEG), “Norms and Standards for Evaluation in the UN System”, April 2005 (available online at http://www.uneval.org).
Intentionality: The scope, design and planning of evaluations should contribute to the generation of relevant, timely findings that meet the needs of stakeholders. It must be clear from the outset what the evaluation findings will be used for, i.e. for organizational learning to feed into future programmes and projects, accountability to member States and donors, or both;
Impartiality: The need for objectivity in the planning, design, team selection, execution and formulation of findings and recommendations, taking the views of relevant stakeholders into account;
Independence: Only external evaluations that are managed and conducted by organizations other than ESCAP can be considered truly independent. However, most evaluations of ESCAP’s work are managed by ESCAP staff. To maximize independence under these circumstances, evaluations that serve external accountability purposes are conducted by external consultants (evaluators) and managed by the ESCAP Programme Evaluation Officers. Evaluations (including evaluative reviews) that serve organizational learning purposes are, to the extent possible, conducted by external evaluators. Independence applies to evaluation managers as well as to evaluators. To avoid conflict of interest and undue pressure, evaluators must not have been responsible for the policy-setting, design or management of the subject of evaluation, nor expect to be in the near future;
Evaluability: Prior to undertaking a major evaluation requiring significant investment of resources, it is necessary to establish that it is technically possible to evaluate the initiative in question and that there is no major factor hindering the evaluation process, such as lack of independence, information, or clear intent of the subject to be evaluated;
Quality: The quality of the findings must be ensured through proper design, planning and implementation, and by preparing a complete and balanced report, which contains information that can be easily distilled into lessons and disseminated;
Competencies for evaluation: Evaluation staff should have formal job descriptions and performance criteria, as well as relevant competencies and skills to conduct evaluations and hire external evaluators;
Transparency and consultation: Transparency and consultation are necessary steps in all stages of the evaluation process to build ownership and facilitate consensus. Evaluation reports (including the terms of reference) should be available to major stakeholders and be public documents that are accessible and readable;
Ethics: Evaluators must have personal and professional integrity, must allow institutions and individuals to provide information confidentially and should verify their statements. They must be sensitive to the beliefs, manners and customs prevailing in a particular social and cultural environment; they should likewise be sensitive to and address issues of discrimination and gender inequality and should discreetly report wrongdoings if appropriate;
Follow-up to evaluations: Management is required to provide a response to the recommendations of evaluations, which, at ESCAP, should be included in the final evaluation report. Evaluation recommendations that have been accepted by management should be followed up systematically and the status of follow-up should be reviewed periodically;
Contribution to knowledge building: Evaluation findings and recommendations should be presented in such a way that they can be easily accessed, understood and implemented by target audiences. As such, they need to be relevant and appropriate, bearing in mind the capacity and opportunities of the target audiences to strengthen implementation processes and results. The sharing of evaluation reports should facilitate learning among stakeholders, including, where appropriate, other entities of the UN system.
In addition, although the focus will vary for some evaluations, the following standard evaluation criteria are applied in formulating the overall evaluation objectives and more detailed evaluation questions:23
Relevance: The appropriateness of objectives (of a theme or subprogramme) or outcomes (of a project) in terms of ESCAP’s priorities, Governments’ development strategies and priorities, and requirements of the target groups;
Efficiency: The extent to which human and financial resources were used in the best possible way to deliver activities and outputs, in coordination with other stakeholders;
Effectiveness:24 The extent to which the expected objectives (of a subprogramme or theme) or outcomes (of a project) have been achieved, and have resulted in changes and effects, positive and negative, planned and unforeseen, with respect to the target groups and other affected stakeholders;
Sustainability: The likelihood that the benefits of the subprogramme, theme or project will continue in the future.
Several additional criteria reflect United Nations commitments, for example: gender mainstreaming, the human rights-based approach, environmental sustainability, and “UN Coherence” (cooperation between different United Nations agencies).25 Where relevant, the evaluation should determine the extent to which these commitments have been incorporated in the design and implementation of a subprogramme, theme or project.
3.2 Types of Evaluative Processes
The nature and purpose of evaluations can vary considerably depending, for example, on their focus, timing or management. The different types of evaluative processes at ESCAP are introduced below and shown in figure 5. A more detailed comparison is presented in table 3, supported by the fact sheets of the Evaluation Guidelines.
At ESCAP, two main categories of evaluative processes are distinguished on the basis of who manages them:
External evaluations, which are managed and conducted by entities outside ESCAP;

Internal evaluations, which are managed by ESCAP staff:

o Evaluations, which are managed by the ESCAP Programme Evaluation Officers in PMD;

o Evaluative reviews, which may be managed by any division or Office away from Bangkok.

23 Only relevant criteria are included in the objectives. For example, if an evaluation is ‘forward looking’, such as an evaluation of capacity building mechanisms, then efficiency may not be an appropriate criterion.

24 ESCAP evaluations focus on objectives and outcomes, i.e., effects at the level of the immediate target groups of ESCAP, which are mainly the national Governments of the member States. Impact assessment (i.e., identifying effects at the level of ultimate beneficiaries, such as the poor and the disabled) would require significant resources and, in most cases, the impact cannot be attributed solely to ESCAP.

25 The additional criteria overlap to some degree with the standard evaluation criteria. Gender, the rights-based approach and environmental sustainability are also part of the effectiveness criterion, the involvement of priority countries is also part of the relevance criterion, and the “UN coherence” criterion could also be considered under the efficiency criterion.

Figure 5. Types of evaluative processes at ESCAP

[Diagram: evaluative processes divide into external evaluations (mandatory or discretionary) and internal evaluations, the latter comprising evaluations (thematic, subprogramme, project or other) and evaluative reviews (project, peer or other).]

External Evaluations

According to OIOS, the term “external evaluation” should be used strictly for evaluations that are managed and conducted by independent entities, such as the Joint Inspection Unit (JIU), which has a UN-wide mandate, or by the Office of Internal Oversight Services (OIOS) on the basis of its UN Secretariat-wide mandate. External evaluations are conducted by evaluators who are free of control or influence by those responsible for the design and implementation of the subject of the evaluation. This supports the main purpose of external evaluations, which is external accountability to donors, member States or other external stakeholders. External evaluations can be mandatory or discretionary.

Mandatory external evaluation

Mandatory external evaluations are requested by intergovernmental bodies such as the General Assembly, the Committee for Programme and Coordination, functional commissions, regional and sectoral intergovernmental bodies or other technical bodies. The primary purposes of mandatory external evaluations include oversight and support to decision-making at the intergovernmental level; their findings are, however, also highly useful for programme managers, who are often required to implement their recommendations and report back to the requesting intergovernmental body.

Discretionary external evaluation

Discretionary external evaluations are requested by programme managers and designed, managed and conducted by an external entity. The primary purpose of discretionary external evaluations is organizational learning on the basis of independent and objective assessments for improved performance; their findings may, however, also support decision-making and accountability at the intergovernmental level. In conjunction with the development of ESCAP’s biennial evaluation plan, the Executive Secretary can put forward suggestions for such evaluations.
Internal Evaluations
Internal evaluations are managed by ESCAP staff. They can be requested by the Commission or planned by the ESCAP secretariat, and as such be either mandatory or discretionary. Ad hoc evaluations may be conducted on the basis of emerging needs and priorities identified by member States or the secretariat.
ESCAP distinguishes between two types of internal evaluations, namely “evaluations” and “evaluative reviews”, as described below:
Evaluation
The term “evaluation” is utilized for evaluations that are managed by the ESCAP Programme Evaluation Officers. This requirement is introduced to strengthen the independence and impartiality of the evaluation process and its findings and recommendations, as the Evaluation Officers are not directly involved in the implementation or management of subprogrammes or projects. Evaluations have the purpose of supporting decision-making at the strategic management level and holding the secretariat accountable to member States and external stakeholders. Different categories of evaluations include:
Thematic: An evaluation focused on a cross-cutting theme, fund, sector, modality, or service;
Subprogramme: An evaluation that considers the effects of the total portfolio or major components of activities that are aimed at achieving a common set of results as set out in the strategic framework. The scope of a subprogramme evaluation could be the combined work of a division, a section, a subregional office or a regional institution, or the portfolio of technical cooperation activities implemented under the subprogramme;
Project: An evaluation that focuses on the achievement of the results outlined in the logical framework of a project, often within the context of a broader programme. Most often, project evaluations are planned when the project is developed and included in the project document and budget.
Other: Any other evaluative process for which it is deemed necessary that the process is managed by the Evaluation Officers to strengthen the independence of its findings.
Evaluative review
“Evaluative reviews” may be managed by any division or by any office away from Bangkok. A distinctive feature of an evaluative review is therefore that the programme/project implementer may also be the manager of the evaluation. Evaluative reviews have the primary purpose of fostering organizational learning. Different types of evaluative reviews include:
Project: Similar to project evaluations, a project evaluative review focuses on the achievement of the results outlined in the logical framework of a project. Project reviews are managed by the project implementer and typically conducted by external consultants. Project reviews are funded from the project budget;
Peer: A peer review can be managed by any division or office. Peer reviews are conducted by a group of nominated staff representatives (peers), and the managing office would extend secretarial support to the process. Members of the peer group evaluate organizational performance and practice relating to particular modalities, sets of activities, reports or procedures. Peer reviews are particularly useful in establishing quality standards for activities under review, mainstreaming an awareness of such quality standards and promoting a quality-oriented work culture. Reviews of the same subject may be conducted periodically to ensure continuity in organizational learning. External consultants may be contracted to provide specialist advice.
Other: Evaluative reviews may cover any topic. For example, an evaluative review of a cluster of linked activities or a delivery modality could be undertaken by a division, resulting in systematic documentation of lessons learned and the generation of consolidated conclusions and recommendations. All “other” evaluative reviews should, as any evaluative review, be conducted in accordance with ESCAP’s Evaluation Guidelines.
Purpose of Evaluation and Review
Organizational learning and accountability can be perceived as conflicting objectives. It is therefore important to involve stakeholders in defining the general and primary purpose of each evaluation in order to gain support both inside and outside the organization. This is critical for the success of evaluations.
Findings from evaluations, which are managed by the ESCAP Programme Evaluation Officers, are used to:
Enhance organizational learning based on past experience. This supports strategic planning and decision-making at ESCAP, particularly in the context of the strategic framework and programme budget. At the project level, it supports the design, planning and implementation of future projects;
Hold ESCAP accountable internally, and to member States, donors and other development partners, by demonstrating that financial and staff resources were used appropriately.
Reviews, which can be managed by any division or Office away from Bangkok, are used to:
Identify important lessons relating to a wide range of ESCAP initiatives and processes and to incorporate them in planning and decision-making;
Mainstream an understanding of quality management across ESCAP;

Foster a culture of learning.
Evaluation Process
Evaluations and most reviews, especially project reviews, are done in three stages: planning, implementation and use of findings. Each stage consists of several steps, as illustrated in figure 6. Each step is explained in detail in the separate ESCAP Evaluation Guidelines.
Figure 6. Stages in the evaluation and review process

PLANNING: 1. Prepare evaluation plan and budget; 2. Prepare terms of reference; 3. Establish evaluation team; 4. Schedule and organize evaluation.

IMPLEMENTING: 5. Conduct evaluation; 6. Prepare draft report; 7. Review draft report.

USING EVALUATION FINDINGS: 8. Prepare management response and actions; 9. Share evaluation findings; 10. Follow up and promote learning.
3.3 Organizational Roles and Responsibilities
The following organizational roles and responsibilities govern evaluation at ESCAP:
The Commission: Responsible for guidance and oversight of the work of the ESCAP secretariat. May request evaluations of ESCAP’s subprogrammes, projects or other activities through resolutions. Committees that are subsidiary to the Commission may recommend to the Commission the undertaking of an evaluation or evaluative review.
The Office of Internal Oversight Services: Responsible for guidance and oversight of evaluation processes within the United Nations Secretariat. May conduct mandatory or discretionary external evaluations of ESCAP. On an ad hoc basis, ESCAP may turn to OIOS (or other UN evaluation offices) for the provision of quality support for internal evaluations.
The Executive Secretary: Responsible for all activities undertaken by the ESCAP secretariat, which, in the context of evaluation, means approving the biennial Evaluation Plan, approving the management response to evaluations and in some cases also to evaluative reviews, and ensuring that evaluations and evaluative reviews are used to promote learning and strengthen accountability at ESCAP.
Senior Management: Senior managers at ESCAP play an important role in ensuring the use of evaluations. Through signing off on management responses and follow-up action plans, they commit to, and are thus held accountable for, the implementation of follow-up to evaluations. Division Chiefs and Heads of offices away from Bangkok, being responsible for ensuring that activities under their purview are subject to regular evaluative reviews, also play an important role in the formulation of ESCAP’s Evaluation Plan.
Evaluation Officers: A central responsibility of the ESCAP Programme Evaluation Officers is to extend quality assurance and support to evaluative reviews, including during the preparation of the terms of reference (TOR), the identification of evaluation consultants and the review of the evaluation report. In providing this service, the Evaluation Officers are concerned with assessing and improving the merit or the worth of evaluative activities and their adherence to UNEG and ESCAP norms, as outlined in section 3.1. The Evaluation Officers also coordinate the formulation and monitoring of ESCAP’s Evaluation Plan and the management response to all evaluations and evaluative reviews. For evaluations, they further take on the role of evaluation manager (see table 3).
PME focal points and assistants: Each operational division and the ESCAP Pacific Operations Centre (EPOC), in addition to the Office of the Executive Secretary and the Human Resources Management Section, Administrative Services Division (ASD), have appointed planning, monitoring and evaluation (PME) focal points and assistants. This group of staff members serves as the anchor of M&E at ESCAP. In the context of evaluation, the PME focal points and assistants facilitate the formulation of the biennial ESCAP Evaluation Plan, provide guidance to their colleagues on evaluation during the design phase of programmes and projects, and coordinate the monitoring and reporting on follow-up to evaluations by their division or office.
ESCAP staff: Staff from ESCAP divisions and Offices away from Bangkok support evaluation by providing inputs to ESCAP’s Evaluation Plan, organizing and participating in interviews conducted by evaluators, and reviewing and disseminating evaluation reports. They also share the organizational responsibility of ensuring the utility of evaluations by contributing to the implementation of follow-up actions for which their office is responsible. For specific evaluations or evaluative reviews, ESCAP staff members may take on the role of evaluation manager or be part of the evaluation team (see table 3).
3.4 Planning Evaluations
Evaluation Planning and Budgeting at the Organizational Level
The ESCAP evaluation plan is prepared every biennium and submitted to the United Nations Headquarters together with the programme budget. Figure 7 illustrates where the Evaluation Plan fits within the ESCAP programme cycle. Evaluations can feed into the development of the Strategic Framework when completed by mid-year of the last year of a biennium and can inform the implementation of the programme and projects during the next biennium. Hence, evaluations completed before October 2009 would inform the implementation of the programme of work for 2010-2011 and the development of the Strategic Framework for 2012-2013.

Table 3. Comparison between types of evaluative processes at ESCAP

WHY?

Main purpose (other purposes in italics):
External evaluations: external accountability to member States and donors; internal accountability; organizational learning.
Evaluations: external accountability to member States and donors; internal accountability; organizational learning.
Evaluative reviews: organizational learning; internal accountability; external accountability.

WHO?

Evaluation manager:
External evaluations: OIOS, JIU or other external parties.
Evaluations: ESCAP Programme Evaluation Officers.
Evaluative reviews: division or Office away from Bangkok.

Evaluation team:
External evaluations: external consultants, OIOS staff, JIU staff.
Evaluations: external consultants.
Evaluative reviews: external consultants, external peers, ESCAP staff (see table 4).

Quality assurance and support (by):
External evaluations: OIOS, JIU, other external parties.
Evaluations: OIOS, UNEG,a PMD, reference group.
Evaluative reviews: internal peers, ESCAP Programme Evaluation Officers, reference group.

HOW?

Management response (signed by):
External evaluations: Executive Secretary.
Evaluations: Executive Secretary, the Chief of PMD and heads of other relevant divisions or offices away from Bangkok.
Evaluative reviews: head of division or office managing the review, heads of other relevant organizational entities, and the Chief of PMD.b

Dissemination of evaluation findings (to):
External evaluations: United Nations Secretariat, external stakeholders.
Evaluations: ESCAP secretariat, United Nations Secretariat, external stakeholders.
Evaluative reviews: ESCAP secretariat.c

Follow-up to evaluation findings and recommendations (by):
External evaluations: ESCAP, other UN Secretariat entities.
Evaluations: Executive Secretary, all other relevant ESCAP staff.
Evaluative reviews: head of division or office managing the review, all other relevant ESCAP staff from divisions or offices that signed the management response.

Notes:
a OIOS or other United Nations Evaluation Group (UNEG) members may be consulted on an ad hoc basis.
b Divisions managing an evaluative review may request the Executive Secretary to sign the management response.
c Divisions managing an evaluative review may request that their evaluative review is issued for external distribution.

Figure 7. Evaluation within the ESCAP Programme Cycle

[Timeline of the 2008-2009 and 2010-2011 programme cycles: the 2010-2011 ESCAP Evaluation Plan is developed towards the end of 2008, and the 2012-2013 ESCAP Evaluation Plan is developed towards the end of 2010, so that evaluation of one programme feeds into the next.]

The purpose of the ESCAP Evaluation Plan is to identify, and budget for, evaluations and reviews in a transparent and consistent way, to seek endorsement from the relevant legislative bodies, and to provide an overview of planned evaluations so that all stakeholders involved can prepare adequately. The plan includes all larger evaluation initiatives to be carried out during the two-year programme cycle and covers external and internal evaluations, including selected project and peer reviews.

The formulation of the ESCAP Evaluation Plan is a strategic exercise in so far as eventual evaluation results may have a critical bearing on future organizational policies and implementation modalities. As such, it is important to identify issues and programme areas for in-depth analysis that are of particular importance and relevance to ESCAP at the time of planning.

It is also important to ensure that evaluations are completed in a timely manner so that they can feed into programme and project planning. An evaluation should be completed by the end of a biennium to ensure that it serves its intended purpose and the valuable information is used during the subsequent biennium.

Evaluations can be funded through various sources, but as a general rule:

External evaluations (both mandatory and discretionary) are budgeted by the external entities implementing the evaluation, such as OIOS. Various funding arrangements are conceivable (for example, donors, Member States, other Secretariat entities, ESCAP, etc.);
Thematic and subprogramme evaluations are budgeted centrally by ESCAP, using appropriate XB and/or RB resources. Subprogramme evaluations focused on regional institutions should generally be budgeted for in the respective institutional support accounts of regional institutions;
Project evaluations or project reviews should be covered by project funds and should be budgeted for during the project design phase;
Other types of evaluations and reviews, including peer reviews, are budgeted for as appropriate.
The biennial ESCAP Evaluation Plan also includes an estimate of staff time required to manage and support evaluations.
3.5 Using Evaluation Findings
Evaluations and evaluative reviews can only contribute to organizational learning if related findings and recommendations are disseminated, discussed and acted upon. ESCAP management plays an important role in this. ESCAP policy on the use of evaluation and review findings is explained below. The Evaluation Guidelines provide further detail.
Preparation of Management Response and Actions
The use of evaluations for accountability and organizational learning is facilitated through the development of a “management response” and a “follow-up action plan” addressing the findings and recommendations of each evaluation or evaluative review, which together make up the formal, written response from the organization.
ESCAP management assumes a critical leadership role in ensuring the use of evaluations, as they are involved in the formulation of management responses and follow-up action plans. ESCAP management is represented by the Executive Secretary of ESCAP in the case of external evaluations or evaluations managed by the ESCAP Programme Evaluation Officers. In the case of evaluative reviews, management is represented by the institutional entity that manages the review in question. Management responses include:
An overall response from the perspective of management on the evaluation or review and its results. This can include comments regarding the relevance and usefulness of the results. It may also highlight any differences in opinion with the evaluators while maintaining the independence of the evaluation findings;
A response to each individual recommendation, resulting in either acceptance (full or partial) or rejection of the recommendation. Additional comments may relate to broader implications for ESCAP, in particular as regards programme and project planning and implementation;
Evaluation follow-up actions, corresponding to accepted recommendations, including completion deadlines and the responsible implementing entity. In addition to actions resulting directly from the evaluation recommendations, additional longer-term, strategic or institutional-level actions may be included.
The ESCAP Programme Evaluation Officers coordinate the formulation of the management response by seeking inputs from key stakeholders, such as the Executive Secretary of ESCAP and heads of divisions or offices. The final management response is included in the published evaluation report. ESCAP management signs off on the report after the management response and follow-up actions have been included. In the case of evaluations managed by the ESCAP Programme Evaluation Officers, PMD issues the final evaluation report containing the management response. In the case of reviews, the respective management entity issues the final evaluation report, including the management response. Key differences in the management response to evaluations and evaluative reviews are outlined in table 4.
Table 4. Management response

Purpose:
Evaluations: provide management with an opportunity to comment on the evaluation; publicly commit to specific responses to recommendations contained in the evaluation.
Evaluative reviews: provide management with an opportunity to comment on the evaluation; internally commit to specific responses to recommendations contained in the review.

Management represented by:
Evaluations: Executive Secretary, PMD, operational divisions or Offices away from Bangkok directly affected.
Evaluative reviews: entity responsible for managing the review, i.e. Division or Office away from Bangkok, and PMD.

Response coordinated by:
Evaluations: ESCAP Programme Evaluation Officers, with inputs from relevant entities.
Evaluative reviews: ESCAP Programme Evaluation Officers, in consultation with the relevant Division(s) or Office(s) away from Bangkok.

Follow-up implemented by:
Evaluations: any entities specifically mentioned in the management response.
Evaluative reviews: any entities specifically mentioned in the management response.

Follow-up monitored by:
Evaluations: PMD and the PME focal points for the respective divisions or offices away from Bangkok that are involved in the follow-up action plan (with biannual reports to the Executive Secretary prepared by PMD).
Evaluative reviews: the PME focal points for the respective divisions or offices away from Bangkok that are involved in the follow-up action plan (with biannual reports to the Executive Secretary prepared by PMD).
Sharing of Evaluation Findings
To ensure transparency and promote organizational learning and accountability, evaluation findingsshould be disseminated in accordance with the following policy:
All reports of evaluations and evaluative reviews (including the management response) are made available internally, including on the ESCAP intranet, with the aim of enhancing transparency, ownership and internal accountability;
Internal briefing sessions are conducted for ESCAP management and staff to highlight importantevaluation findings and recommendations, particularly where they are of strategic importance.Such briefings may be given by the lead evaluator or relevant ESCAP staff members;
Reports of evaluations are disseminated to external stakeholders, such as member States anddonors, posted on IMDIS as evidence for accomplishment accounts, posted on other relevantelectronic databases, and posted on the ESCAP website to enhance transparency and externalaccountability;
Reports of evaluative reviews and other evaluative processes that focus primarily on organiza-tional learning are normally shared internally only. If external accountability is explicitlymentioned as a purpose for an evaluative review, dissemination to external stakeholders andthrough the ESCAP website may take place;
Reports that are mandated to be submitted to intergovernmental bodies (i.e. the Commission, Governing Councils, etc.) must be in the proper format, meeting editorial standards for pre-session documents. The document must include information on how to obtain a copy of the full report of the evaluation. If the management response is not finalized in time to be included in the pre-session document, the document should include a footnote containing (a) the date by which the full report will be finalized and (b) information on how to obtain a copy of the report at that time.
Follow-up and Promotion of Learning
Follow-up to recommendations and the promotion of learning are ensured through actions primarily by the Executive Secretary, heads of divisions and Offices away from Bangkok, and PMD. The actions include:
The Executive Secretary:
Ensuring that findings of strategic importance are considered and reflected in the organization’s overall direction and shared with relevant members of the UN system.
Ensuring that follow-up actions are undertaken by ESCAP management and staff by:
o Reviewing periodic status reports prepared by PMD and taking action as necessary;
o Including general or specific requirements in the e-PAS of relevant senior staff members that they implement evaluation follow-up actions on time. As the e-PAS process includes a mid-term review and an end-of-cycle appraisal, this provides the opportunity to revisit actions every six months.26
Division chiefs and heads of Offices away from Bangkok:
Ensuring that the findings from evaluations are shared and used for programme and project planning exercises;

Incorporating actions for which they are responsible in the Annual Work Plan of their division/office;

Ensuring that relevant actions are included in the work and monitoring plans of activities, projects and programmes implemented by their division/office;

Including general or specific requirements in the e-PAS of relevant staff members that they implement their assigned evaluation follow-up actions on time;

Monitoring and regularly updating the status of evaluation follow-up actions for which their division/office is responsible;

Ensuring that the status of evaluation follow-up actions is documented under item 7 of the accomplishment accounts, “Learning: lessons learnt to date and practical suggestions for improvement”.
Programme Management Division:
PMD is responsible for monitoring the implementation of evaluation follow-up actions by:
Developing and maintaining an IT tool for tracking the follow-up to evaluations and liaising with PME Focal Points to ensure that the tool is used;
26 It is important to note that, in this way, ESCAP staff members and management are held accountable on the basis of the follow-up actions they take or fail to take as a result of an evaluation, and not whether the evaluation resulted in positive or critical findings.
Liaising with the PME focal points to ensure that follow-up actions are regularly updated so that the status of the implementation of actions is continuously tracked;

Preparing, every six months, an update for the Executive Secretary that includes the status of actions by each division and office away from Bangkok and by evaluation, and a list of outstanding actions that are past the expected completion date.
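Purely for illustration, the tracking described above could be modelled along the following lines. This is a minimal sketch, not the actual ESCAP IT tool; the class name, field names and sample data are assumptions made for this example.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class FollowUpAction:
    """One follow-up action from a management response (illustrative fields)."""
    evaluation: str          # evaluation or evaluative review the action stems from
    responsible_entity: str  # division or office away from Bangkok
    description: str
    due: date                # expected completion date
    completed: bool = False

def outstanding(actions: list[FollowUpAction], today: date) -> list[FollowUpAction]:
    """Actions past their expected completion date, as listed in the biannual update."""
    return [a for a in actions if not a.completed and a.due < today]

# Hypothetical sample data: one overdue action, one completed action.
actions = [
    FollowUpAction("Subprogramme evaluation", "SDD", "Revise logframe", date(2010, 6, 30)),
    FollowUpAction("Project review", "TID", "Share lessons", date(2010, 3, 31), completed=True),
]
overdue = outstanding(actions, today=date(2010, 12, 1))
print([a.description for a in overdue])  # → ['Revise logframe']
```

A structure of this kind makes the biannual report to the Executive Secretary a simple query (filter by entity, by evaluation, or by overdue status) rather than a manual compilation.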
PMD also organizes workshops open to all staff, at least once a year, which aim to:
Share experiences in managing and conducting evaluations during the preceding period;

Review lessons learned from different evaluations and identify concrete areas in which such lessons can be applied;

Review the status of evaluation follow-up actions and agree on changes, as appropriate;

Assess successes and barriers in creating an effective evaluation system and culture at ESCAP, and identify what is needed to further improve ESCAP’s M&E System.
It is important that follow-up actions are incorporated into existing monitoring processes as much as possible, to minimize the time required to track their implementation.
ANNEXES
Annex I. List of Key Reference Materials
Secretary-General’s Bulletin
ST/SGB/2000/8, 19 April 2000, “Regulations and Rules Governing Programme Planning, the Programme Aspects of the Budget, the Monitoring of Implementation and the Methods of Evaluation”
Publications issued by the Office of Internal Oversight Services (OIOS)
IMDIS User’s Guide version 2.6, December 2003

Procedures for Programme Performance Monitoring and Reporting for the 2008-2009 Biennium through the Use of IMDIS, 18 December 2008

A Guide to Using Evaluation in the United Nations Secretariat, June 2005, http://www.un.org/depts/oios/manage_results.pdf

Proposals on the Strengthening and Monitoring of Programme Performance and Evaluation, April 2005, http://www.un.org/depts/oios/pages/other_oios_reports.html

Strengthening the role of evaluation and the application of evaluation findings on programme design, delivery and policy directives, April 2006, http://www.un.org/depts/oios/pages/other_oios_reports.html
Inspection on Results-based management (RBM) practices at the United Nations Economic and Social Commission for Asia and the Pacific (ESCAP), OIOS Inspection and Evaluation Division, July 2007, http://www.unescap.org/64/documents/Full-Report-on-the-inspection-of%20RBM-practices-ESCAP.pdf
Publications issued by the United Nations Evaluation Group (UNEG)
Standards for Evaluation in the UN System, April 2005, http://www.unevaluation.org/normsandstandards/
Norms for Evaluation in the UN System, April 2005, http://www.unevaluation.org/normsandstandards/
ESCAP Project and Programme Management Guide
The objective of the Project and Programme Management Guide (also called “Resource Guide”) is to provide ESCAP staff members with a clear set of policies and procedures for programme and project implementation. It is a web-based Guide that can be accessed through the UN Secretariat homepage (iSeek).27 The M&E System Overview is found in the “Monitoring and Evaluation” section.
27 iSeek homepage, via Quicklink “Inside ESCAP” – “Programme Management Division”, “Project and Programme Management Guide”, http://iseek.un.org/webpgdept1028_3.asp?dept=1028
Annex II. Subprogramme and Supporting Organizational Structure for the Biennium 2010-2011
The seven divisions of ESCAP, along with the offices away from Bangkok, correspond to the eight subprogrammes in the 2010-2011 Strategic Framework of ESCAP, as described below.
Relationships between subprogrammes, divisions and offices away from Bangkok
SUBPROGRAMME / ESCAP DIVISION / OFFICE AWAY FROM BANGKOK

1. Macroeconomic policy and inclusive development
   Division: Macroeconomic Policy and Development Division (MPDD)
   Office away from Bangkok: Centre for Alleviation of Poverty through Secondary Crops Development in Asia and the Pacific (CAPSA)

2. Trade and investment
   Division: Trade and Investment Division (TID)
   Offices away from Bangkok: United Nations Asian and Pacific Centre for Agricultural Engineering and Machinery (UNAPCAEM); Asian and Pacific Centre for Transfer of Technology (APCTT)

3. Transport
   Division: Transport Division (TD)
   Office away from Bangkok: None

4. Environment and development
   Division: Environment and Development Division (EDD)
   Office away from Bangkok: None

5. Information and communications technology and disaster risk reduction
   Division: Information and Communications Technology and Disaster Risk Reduction Division (IDD)
   Office away from Bangkok: Asian and Pacific Training Centre for Information and Communication Technology for Development (APCICT)

6. Social development
   Division: Social Development Division (SDD)
   Office away from Bangkok: None

7. Statistics
   Division: Statistics Division (SD)
   Office away from Bangkok: Statistical Institute for Asia and the Pacific (SIAP)

8. Subregional activities for development
   Division: Office of the Executive Secretary
   Offices away from Bangkok: EPOC (Subregional office for the Pacific); Subregional office for East and North-East Asia; Subregional office for North and Central Asia; Subregional office for South and South-West Asia
Annex III. Monitoring Fact Sheets and Tools
A separate, more detailed fact sheet with additional guidance and reference materials is available for each of the following monitoring requirements:
1. Annual work plan
2. Output reporting
3. Work months reporting
4. Accomplishment accounts
5. IMDIS Results Information
6. Preliminary performance assessment (PPA)
7. Programme performance report (PPR)
8. Project document
9. Logical framework
10. Project work and monitoring plan
11. Project budget
12. Summary page for e-TC and updates
13. Requests for allotments and revised allotments
14. Project progress report
15. Project terminal report
The fact sheets are subject to continuous updates. The latest versions are available on iSeek (http://iseek.un.org/webpgdept1028_79.asp?dept=1028).
The monitoring tools provide practical templates, which can be adapted to the needs of divisions or offices.
1. Sample results-based annual work plan
2. Sample work months report
3. Sample accomplishment account
Annex IV. Contents of the Evaluation Guidelines
The separate ESCAP Evaluation Guidelines provide more detailed guidance on how to conduct evaluations, as well as on the variations between different types of evaluations. The guidelines comprise the following chapters:
INTRODUCTION
Evaluation Guidelines in the context of ESCAP’s M&E System
Structure of the Evaluation Guidelines
Evaluation Tools
Evaluation Fact Sheets
1. EVALUATION AT ESCAP
1.1 Types of evaluative processes at ESCAP
1.2 Roles and responsibilities
2. PLANNING EVALUATIONS
2.0.1 Ensuring Evaluability
2.1 Step 1: Prepare Evaluation Plan and Budget
2.2 Step 2: Prepare the Terms of Reference for the Evaluation
2.3 Step 3: Establish the Evaluation Team
2.4 Step 4: Schedule and Organize the Evaluation
3. IMPLEMENTING EVALUATIONS
3.1 Step 5: Conduct the Evaluation
3.2 Step 6: Prepare the Draft Report
3.3 Step 7: Review the Draft Report
4. USING EVALUATION FINDINGS
4.1 Step 8: Prepare Management Response and Actions
4.2 Step 9: Share Evaluation Findings
4.3 Step 10: Follow-up and Promote Learning
Annex I. List of Key Reference Materials
Annex II. List of Evaluation Tools
Annex III. List of Evaluation Fact Sheets
Annex IV. United Nations Norms for Evaluation adapted for ESCAP
ESCAP Evaluation Guidelines
CONTENTS
ACRONYMS/GLOSSARY

INTRODUCTION

1. EVALUATION AT ESCAP

1.1 TYPES OF EVALUATIVE PROCESSES AT ESCAP
    1.1.1 External evaluations
    1.1.2 Internal evaluations

1.2 ROLES AND RESPONSIBILITIES
    1.2.1 Organizational roles and responsibilities
    1.2.2 Roles and responsibilities in evaluative processes
    1.2.3 Involving stakeholders in evaluation processes
2. PLANNING EVALUATIONS
2.0.1 Ensuring evaluability

2.1 STEP 1: PREPARE EVALUATION PLAN AND BUDGET
    2.1.1 ESCAP Evaluation Plan
    2.1.2 Budgeting for planned evaluations

2.2 STEP 2: PREPARE THE TERMS OF REFERENCE FOR THE EVALUATION
    2.2.1 Define the evaluation
    2.2.2 Outline an evaluation methodology
    2.2.3 Set the budget and completion date

2.3 STEP 3: ESTABLISH THE EVALUATION TEAM

2.4 STEP 4: SCHEDULE AND ORGANIZE THE EVALUATION
    2.4.1 Prepare an evaluation work plan
    2.4.2 Gather background documentation
    2.4.3 Brief the evaluation team
3. IMPLEMENTING EVALUATIONS
3.1 STEP 5: CONDUCT THE EVALUATION

3.2 STEP 6: PREPARE THE DRAFT REPORT

3.3 STEP 7: REVIEW THE DRAFT REPORT
    3.3.1 Review the draft report
    3.3.2 Prepare the final report
4. USING EVALUATION FINDINGS
4.1 STEP 8: PREPARE MANAGEMENT RESPONSE AND ACTIONS
    4.1.1 Management response
    4.1.2 Follow-up action plan
4.2 STEP 9: SHARE EVALUATION FINDINGS

4.3 STEP 10: FOLLOW-UP AND PROMOTE LEARNING
    4.3.1 The Executive Secretary
    4.3.2 Division chiefs and heads of offices away from Bangkok
    4.3.3 Programme Management Division
ANNEXES
Annex I. List of Key Reference Materials
Annex II. List of Evaluation Tools
Annex III. List of Evaluation Fact Sheets
Annex IV. United Nations Norms for Evaluation Adapted for ESCAP
ACRONYMS/ GLOSSARY
ASD Administrative Services Division
ESCAP United Nations Economic and Social Commission for Asia and the Pacific
EPOC ESCAP Pacific Operations Centre
GA General Assembly
IMDIS Integrated Monitoring and Documentation Information System
M&E Monitoring and evaluation
Offices away from Bangkok  Regional institutions and subregional offices under the auspices of ESCAP
OIOS Office of Internal Oversight Services
PME focal point Planning, monitoring and evaluation focal point
PMD Programme Management Division
PSC Programme Support Costs
RB Regular budget
RBM Results-based management
TOR Terms of reference
UNDP United Nations Development Programme
UNHQs United Nations Headquarters
XB Extrabudgetary
INTRODUCTION
Evaluation at ESCAP is governed by the regulations and rules of the United Nations Secretariat as put forth by the Secretary-General1 and guided by the principles for evaluation developed by the United Nations Evaluation Group.2 ESCAP’s Evaluation Guidelines operationalize these rules and principles by providing ESCAP staff members with practical guidance on how to manage and conduct evaluative processes. The present Guidelines have been designed as a stand-alone document to be used as a tool for guiding evaluation managers and other ESCAP staff members through a 10-step process of planning, managing and using the findings of an evaluation or evaluative review. The Guidelines apply equally to evaluations and evaluative reviews3, unless otherwise specified.
Evaluation Guidelines in the context of ESCAP’s M&E System
The Evaluation Guidelines form part of ESCAP’s Monitoring and Evaluation (M&E) System (see Figure 1). The M&E System Overview is a document that outlines the role of M&E in the context of results-based management. The document contains ESCAP’s evaluation framework, including its norms and criteria and the roles and responsibilities of different ESCAP stakeholders in the planning and budgeting of evaluations and in the evaluation process itself. ESCAP’s evaluation framework is operationalized in the present Guidelines.
Figure 1. Components of ESCAP’s M&E system

[Diagram: the Monitoring and Evaluation System Overview sits above the two pillars of the system. On the evaluation side, the Evaluation Guidelines are supported by Evaluation Tools (in support of the evaluation process) and Evaluation Fact Sheets (for each evaluation type). On the monitoring side, the Monitoring Guidelines (Procedures for Programme Performance Monitoring and Reporting through the use of IMDIS) are supported by Monitoring Tools (in support of the monitoring process) and Monitoring Fact Sheets (for each monitoring requirement).]
1 Secretary-General’s Bulletin, “Regulations and Rules Governing Programme Planning, the Programme Aspects of the Budget, the Monitoring of Implementation and the Methods of Evaluation”, ST/SGB/2000/8, 19 April 2000.
2 United Nations Evaluation Group (UNEG), “Norms and Standards for Evaluation in the UN System”, April 2005 (available online at http://www.uneval.org); also see Annex IV on how these are applied at ESCAP.
3 See pages 4-5 of the Evaluation Guidelines for a definition of ‘evaluation’ and ‘evaluative review’.
Structure of the Evaluation Guidelines
Chapter 1 of the Guidelines outlines the different types of evaluative processes at ESCAP and the related requirements, roles and responsibilities of ESCAP staff and other stakeholders.

Chapters 2 to 4 of the Guidelines are based on a 10-step evaluation process, divided into three stages, as shown below:4
Figure 2. Stages in the evaluation process

Chapter 2: PLANNING EVALUATIONS
1. Prepare evaluation plan and budget
2. Prepare terms of reference
3. Establish evaluation team
4. Schedule and organize evaluation

Chapter 3: IMPLEMENTING EVALUATIONS
5. Conduct evaluation
6. Prepare draft report
7. Review draft report

Chapter 4: USING EVALUATION FINDINGS
8. Prepare management response and actions
9. Share evaluation findings
10. Follow up and promote learning
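For illustration only, the three stages and ten steps could be captured in a simple ordered structure, for example as the basis for an evaluation checklist tool. This is a hypothetical sketch; the variable names and the lowercase stage labels are paraphrases introduced here, not part of ESCAP’s system.

```python
# Hypothetical sketch: the 10-step evaluation process as an ordered structure.
EVALUATION_PROCESS = {
    "Planning evaluations": [
        "Prepare evaluation plan and budget",
        "Prepare terms of reference",
        "Establish evaluation team",
        "Schedule and organize evaluation",
    ],
    "Implementing evaluations": [
        "Conduct evaluation",
        "Prepare draft report",
        "Review draft report",
    ],
    "Using evaluation findings": [
        "Prepare management response and actions",
        "Share evaluation findings",
        "Follow up and promote learning",
    ],
}

# Number the steps sequentially across the three stages, as in Figure 2.
steps = []
for stage, items in EVALUATION_PROCESS.items():
    for step in items:
        steps.append((len(steps) + 1, stage, step))

print(len(steps))   # → 10
print(steps[7])     # → (8, 'Using evaluation findings', 'Prepare management response and actions')
```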
Evaluation Tools
A set of “evaluation tools”, including checklists and templates, is provided separately to support the evaluation steps where necessary. Reference is made to the tools throughout the text, and a list of tools is included in Annex II.
Evaluation Fact Sheets
A set of “evaluation fact sheets” describes how the 10 evaluation steps are applied in different types of evaluative processes and outlines the related roles and responsibilities. The different types of evaluative processes are shown in Figure 3, and a list of fact sheets is provided in Annex III. The fact sheets are subject to continuous updates. The latest versions are available on iSeek (http://iseek.un.org/webpgdept1028_79.asp?dept=1028).
4 The evaluation approach adopted by ESCAP, shown in Figure 2, is based on evaluation guidelines used by other organizations, most importantly OIOS (2005), UNDP (2002) and DFID (2005) (see Annex I).
1. EVALUATION AT ESCAP
Evaluation in the ESCAP context is defined as a selective exercise that seeks to determine, as systematically and objectively as possible, the relevance, efficiency, effectiveness and sustainability of an ongoing or completed subprogramme, project, modality, theme or other initiative in light of its expected results. Evaluations encompass design, implementation and results to provide information that is credible and useful, enabling the incorporation of lessons learned into executive planning and decision-making. Evaluation asks three questions: Are we doing the right thing? Are we doing it right? And are there better ways of achieving the expected results? Evaluation is thus used to strengthen accountability and to foster institutional learning with a view to improving the quality of ongoing and future initiatives.
This chapter covers the different types of evaluative processes at ESCAP and the related requirements, roles and responsibilities of staff and other stakeholders.
1.1 Types of evaluative processes at ESCAP5
The categories of evaluative processes shown in Figure 3 below are distinguished on the basis of the genesis of the evaluation and on who manages the evaluation process.
Figure 3. Types of evaluative processes at ESCAP

[Diagram: evaluative processes, each of which may be mandatory or discretionary, are divided into external evaluations (project and other) and internal evaluations. Internal evaluations comprise evaluations (thematic, subprogramme, project and other) and evaluative reviews (project, peer and other).]

5 ESCAP’s definitions are aligned with those of the Office of Internal Oversight Services (OIOS) “Glossary of Monitoring and Evaluation Terms”; http://www.un.org/Depts/oios/mecd/mecd_glossary/index.htm.
1.1.1 External evaluations
External evaluations are managed and conducted by entities outside ESCAP, such as the Joint Inspection Unit (JIU), which has a UN-wide mandate, or the Office of Internal Oversight Services (OIOS), on the basis of its UN Secretariat-wide mandate. External evaluations can be mandatory or discretionary:
Mandatory external evaluation
Mandatory external evaluations are requested by intergovernmental bodies such as the General Assembly, the Committee for Programme and Coordination, functional commissions, regional and sectoral intergovernmental bodies or other technical bodies. The primary purposes of mandatory external evaluations include oversight and support to decision-making at the intergovernmental level; their findings are, however, also highly useful for programme managers, who are often required to implement their recommendations and report back to the requesting intergovernmental body.
Discretionary external evaluation
Discretionary external evaluations are requested by programme managers and designed, managed and conducted by an external entity. The primary purpose of discretionary external evaluations is organizational learning, on the basis of independent and objective assessments, for improved performance; their findings may, however, also support decision-making and accountability at the intergovernmental level. In conjunction with the development of ESCAP’s Evaluation Plan (see also section 2.1.2), the Executive Secretary can put forward suggestions for such evaluations.
1.1.2 Internal evaluations6
Internal evaluations are managed by ESCAP staff. They can be requested by the Commission or planned by the ESCAP secretariat, and as such be either mandatory or discretionary. Ad hoc evaluations may be conducted on the basis of emerging needs and priorities of member States or the secretariat.

ESCAP distinguishes between two types of internal evaluations, namely “evaluations” and “evaluative reviews”, as described below:
Evaluation
The term “evaluation” is utilized for evaluations that are managed by the Evaluation Officers in the Programme Management Division (PMD). This requirement is introduced to strengthen the independence and impartiality of the evaluation process and its findings and recommendations. Evaluations have the purpose of supporting decision-making at the strategic management level and holding the secretariat accountable to member States and external stakeholders. Different categories of evaluations include:
Thematic: An evaluation focused on a cross-cutting theme, fund, sector, modality or service;

Subprogramme: An evaluation that considers the effects of the total portfolio or major components of activities that are aimed at achieving a common set of results as set out in the strategic framework. The scope of a subprogramme evaluation could be the combined work of a division, a section, a subregional office or a regional institution, or the portfolio of technical cooperation activities implemented under the subprogramme;

Project: An evaluation that focuses on the achievement of the results outlined in the logical framework of a project, often within the context of a broader programme. Most often, project evaluations are planned when the project is developed and included in the project document and budget;

Other: Any other evaluative process for which it is deemed necessary that the process is managed by the Evaluation Officers of PMD to strengthen the independence of its findings.

6 The terminology used by OIOS, “internal evaluation or self-assessment”, also covers the programme performance assessments that ESCAP considers part of its monitoring framework; see ESCAP's M&E System Overview.
Evaluative review
“Evaluative reviews” may be managed by any division or by any office away from Bangkok. A distinctive feature of an evaluative review is therefore that the programme/project implementer may also be the manager of the evaluation. Evaluative reviews have the primary purpose of fostering organizational learning. Different types of evaluative reviews include:

Project: Similar to project evaluations, a project evaluative review focuses on the achievement of the results outlined in the logical framework of a project. Project reviews are managed by the project implementer and typically conducted by external consultants. Project reviews are funded from the project budget;

Peer: A peer review can be managed by any division or office. Peer reviews are conducted by a group of nominated staff representatives (peers), with the managing office extending secretarial support to the process. Members of the peer group evaluate organizational performance and practice relating to particular modalities, sets of activities, reports or procedures. Peer reviews are particularly useful in establishing quality standards for the activities under review, mainstreaming an awareness of such quality standards and promoting a quality-oriented work culture. Reviews of the same subject may be conducted periodically to ensure continuity in organizational learning. External consultants may be contracted to provide specialist advice;

Other: Evaluative reviews may cover any topic. For example, an evaluative review of a cluster of linked activities or a delivery modality could be undertaken by a division, resulting in systematic documentation of lessons learned and the generation of consolidated conclusions and recommendations. All “other” evaluative reviews should, like any evaluative review, be conducted in accordance with ESCAP’s Evaluation Guidelines.
1.2 Roles and responsibilities
1.2.1 Organizational roles and responsibilities
The following organizational roles and responsibilities govern evaluation at ESCAP (see also Table 1 and Evaluation Tool 8: Evaluation process checklist):

The Commission: Responsible for guidance and oversight of the work of the ESCAP secretariat. May request evaluations of ESCAP’s subprogrammes, projects or other activities through resolutions. Committees that are subsidiary to the Commission may recommend to the Commission the undertaking of an evaluation or evaluative review.

The Office of Internal Oversight Services: Responsible for guidance and oversight of evaluation processes within the United Nations Secretariat. May conduct mandatory or discretionary external evaluations of ESCAP. On an ad hoc basis, ESCAP may turn to OIOS (or other UN evaluation offices) for the provision of quality support for internal evaluations.
The Executive Secretary: Responsible for all activities undertaken by the ESCAP secretariat, which, in the context of evaluation, means approving the biennial Evaluation Plan, approving the management response to evaluations and in some cases also to evaluative reviews, and ensuring that evaluations and evaluative reviews are used to promote learning and strengthen accountability at ESCAP.
Senior Management: Senior managers at ESCAP play an important role in ensuring the use of evaluations. Through signing off on management responses and follow-up action plans, they commit to and are thus held accountable for the implementation of follow-up to evaluations. Division Chiefs and Heads of offices away from Bangkok, being responsible for ensuring that activities under their purview are subject to regular evaluative reviews, also play an important role in the formulation of ESCAP’s Evaluation Plan.

Evaluation Officers: A central responsibility of PMD’s Evaluation Officers is to extend quality assurance and support to evaluative reviews, including during the preparation of the terms of reference (TOR), the identification of evaluation consultants and the review of the evaluation report. In providing this service, the Evaluation Officers are concerned with assessing and improving the merit or the worth of evaluative activities and their adherence to UNEG and ESCAP norms.7 The Evaluation Officers also coordinate the formulation and monitoring of ESCAP’s Evaluation Plan and the management response to all evaluations and evaluative reviews. For evaluations, they further take on the role of evaluation manager (see Table 1).

PME focal points and assistants: Each operational division and the ESCAP Pacific Operations Centre (EPOC), in addition to the Office of the Executive Secretary and the Human Resources Management Section, Administrative Services Division (ASD), have appointed planning, monitoring and evaluation (PME) focal points and assistants. This group of staff members serves as the anchor of M&E at ESCAP. In the context of evaluation, the PME focal points and assistants facilitate the formulation of the biennial ESCAP Evaluation Plan, provide guidance to their colleagues on evaluation during the design phase of programmes and projects, and coordinate the monitoring and reporting on follow-up to evaluations by their division or office.

ESCAP staff: Staff from ESCAP divisions and offices away from Bangkok support evaluation by providing inputs to ESCAP’s Evaluation Plan, organizing and participating in interviews conducted by evaluators, and reviewing and disseminating evaluation reports. They also share the organizational responsibility of ensuring the utility of evaluations by contributing to the implementation of follow-up actions for which their office is responsible. For specific evaluations or evaluative reviews, ESCAP staff members may take on the role of evaluation manager or be part of the evaluation team (see Table 1).
1.2.2 Roles and responsibilities in evaluative processes
The following roles and tasks are relevant to all evaluative processes:
The evaluation manager: The primary role of the evaluation manager is to manage the evaluation process, rather than conduct the evaluation. Typical tasks of the evaluation manager include preparing the terms of reference, establishing the evaluation team, overseeing the review of the report, disseminating evaluation results and making other logistical arrangements. PMD’s Evaluation Officers manage evaluations under the supervision of the Chief of PMD. Evaluative reviews are managed by staff from any division (including staff from PMD) under the supervision of the relevant division chief or head of office. PMD Evaluation Officers provide support as requested to the management of evaluative reviews.

7 The norms for evaluation applied at ESCAP are outlined in Annex IV.
The evaluator or evaluation team: Conducts the evaluation through document reviews, interviews, surveys, meetings, site visits, etc. The team generally comprises one or more external consultants (see Table 1).
The management response: The management response is the formal, written response from ESCAP’s management to the findings and recommendations of an evaluation or an evaluative review. The management response is formulated jointly by the organizational entities that are responsible for or will be involved in the follow-up to the evaluation, and is signed by the relevant Chiefs as well as the Chief of PMD. For evaluations, the management response is also signed by the Executive Secretary.
1.2.3 Involving stakeholders in evaluation processes
Involving stakeholders before an evaluation starts, and keeping them informed about progress during the evaluation process, allows the stakeholders to explain their expectations of the evaluation and to raise related questions and concerns. This involvement is central to ensuring the support of stakeholders during the evaluation process and afterwards during the implementation of follow-up actions to the evaluation. Stakeholders of an evaluation should be identified in the TOR of the evaluation, and should ideally be involved in the preparation of the TOR.
One mechanism for ensuring the active involvement of stakeholders in an evaluation process is through the establishment of a reference group or expert panel. The reference group or expert panel can be formed in order to provide the evaluator or evaluation team with feedback from a technical and methodological perspective. Reference group members can include stakeholders and peers, both internal and external to the project and to ESCAP. The composition of the reference group is at the discretion of the evaluation manager.
A reference group performs a quasi-oversight function that helps ensure the transparency of the management process and generates a sense of ownership and participation among reference group members and the organization as a whole. While the selection of an evaluator is the responsibility of the evaluation manager, it is recommended to keep the reference group informed of the selection process to ensure that the selected evaluator is acceptable to all stakeholders.
Table 1 summarizes the above sections by outlining the different types of evaluative processes at ESCAP and the related organizational roles and responsibilities.
See Evaluation Tool 8: Evaluation process checklist
ESCAP Evaluation Guidelines
Table 1. Comparison between types of evaluative processes at ESCAP
WHY?
Main purpose (other purposes listed after the main purpose):
- External evaluations: External accountability to member States and donors; internal accountability; organizational learning.
- Internal evaluations: External accountability to member States and donors; internal accountability; organizational learning.
- Evaluative reviews: Organizational learning; internal accountability; external accountability.

WHO?
Evaluation manager:
- External evaluations: OIOS, JIU or other external parties.
- Internal evaluations: PMD’s Evaluation Officers.
- Evaluative reviews: Division or office away from Bangkok.

Evaluation team:
- External evaluations: External consultants, OIOS staff, JIU staff.
- Internal evaluations: External consultants, ESCAP staff.
- Evaluative reviews: External consultants, external peers (see Table 5).

Quality assurance and support (by):
- External evaluations: OIOS, JIU, other external parties.
- Internal evaluations: OIOS, UNEG (note a), PMD, reference group.
- Evaluative reviews: Internal peers, PMD’s Evaluation Officers, reference group.

HOW?
Management response (signed by):
- External evaluations: Executive Secretary.
- Internal evaluations: Executive Secretary, the Chief of PMD and heads of other relevant divisions or offices away from Bangkok.
- Evaluative reviews: Head of division or office managing the review, heads of other relevant organizational entities, and the Chief of PMD (note b).

Dissemination of evaluation findings (to):
- External evaluations: United Nations Secretariat, external stakeholders.
- Internal evaluations: ESCAP secretariat, United Nations Secretariat, external stakeholders.
- Evaluative reviews: ESCAP secretariat (note c).

Follow-up to evaluation findings and recommendations (by):
- External evaluations: ESCAP, other United Nations Secretariat entities.
- Internal evaluations: Executive Secretary, all other relevant ESCAP staff.
- Evaluative reviews: Head of division or office managing the review, all other relevant ESCAP staff from divisions or offices that signed the management response.

Notes:
a OIOS or other United Nations Evaluation Group (UNEG) members may be consulted on an ad hoc basis.
b Divisions managing an evaluative review may request the Executive Secretary to sign the management response.
c Divisions managing an evaluative review may request that their evaluative review is issued for external distribution.
2. PLANNING EVALUATIONS
Evaluations at ESCAP are listed in the biennial Evaluation Plan, which is developed as an organization-wide exercise around two years before its implementation. More immediately prior to the initiation of an evaluation, the evaluation process is planned by the evaluation manager. This chapter covers the development of the biennial ESCAP Evaluation Plan as well as the detailed planning for the conduct of individual evaluations.
2.0.1 Ensuring evaluability
Evaluability, i.e. the extent to which programmes, projects and other interventions can be evaluated in a reliable and credible manner, should be considered already at the planning stage. Unless considerations of evaluability are built into the design, an evaluation may eventually not be feasible. In addition to the development of logical frameworks for programmes and projects, options for data collection and the availability of baseline data should be considered during the design phase. M&E plans for programmes and projects at ESCAP are developed to support monitoring during implementation and evaluability at the mid-term or after finalization. Nevertheless, evaluability will need to be reassessed at the time of a planned evaluation because a project may have changed or altered its implementation strategy, for example in order to better address the needs of the target group. Such changes would make it difficult to evaluate against the original logical framework, and adjustments may accordingly have to be made.
2.1 Step 1: Prepare Evaluation Plan and Budget
2.1.1 ESCAP Evaluation Plan
The ESCAP Evaluation Plan is prepared every biennium and includes the evaluation initiatives planned to be carried out by the ESCAP secretariat during the two-year programme cycle, as well as the related resource requirements in terms of work months and cash. The Evaluation Plan is developed in conjunction with the formulation of ESCAP’s biennial programme budget and thus forms an integral part of the programme planning cycle.
Division Chiefs and other Programme Managers, in consultation with their staff, identify and propose evaluations and evaluative reviews for inclusion in the Evaluation Plan. PMD reviews the proposals in the context of overall ESCAP evaluation requirements and prepares the draft Evaluation Plan for review and approval by the Executive Secretary. Additionally, performance reviews or other types of assessments mandated by the Commission that will be conducted in accordance with ESCAP’s framework for evaluation are included in the Evaluation Plan.
The selection of what to evaluate is a critical exercise, as it determines the information that the organization will have at its disposal for strategic decision-making. The following should be considered:
1) The intended purpose and objective of each proposed evaluation;
2) The relative importance of the proposed subject for evaluation within the context of the strategic direction and priorities of ESCAP (pilot projects intended to be replicated, recommendations made by external partners, etc.);
3) Evaluations planned by partner governments or other organizations (to complement and avoid overlaps);
4) Resource requirements;
5) Evaluability.
2.1.2 Budgeting for planned evaluations
In addition to the purpose and objective of evaluations or evaluative reviews, their budget should be considered during the programme or project planning stage. Evaluations and evaluative reviews can be funded from various sources, depending on the type of evaluation, as explained in Table 2.
8 The term PSC refers to a cost recovery mechanism for “indirect costs” associated with the implementation of XB projects. “Indirect costs” refers to work that is undertaken by central administration and management entities (i.e. PMD and ASD) to support the implementation of projects.
Table 2. Budgeting for planned evaluations (note a)

Each evaluation type is listed with its source of funds (non-staff).

External evaluation
- Mandatory external evaluation: External resources
- Discretionary external evaluation: ESCAP RB or XB resources, external resources, or a mix

Internal evaluation
- Thematic or subprogramme: ESCAP RB or XB resources, including institutional support funds of regional institutions
- Project: Project funds, supplemented by pooled PSC resources as appropriate

Internal evaluative review
- Project: Project funds
- Peer reviews: ESCAP XB or RB resources

a The table identifies regular budget (RB) and extrabudgetary (XB) sources of funds for the contracting of consultants and the conduct of specific evaluation activities, including travel. The biennial ESCAP Evaluation Plan also includes an estimate of the staff time required to manage and support evaluations.
There are no specific budgetary requirements for project evaluations or project evaluative reviews; however, in general, five per cent of the operational project budget (i.e. net of Programme Support Costs (PSC)8) is a recommended amount. The project evaluation/evaluative review budget should be developed by considering the relative size of the project budget, the scope of the evaluation and any other criteria applied by the programme and project appraisal mechanisms at ESCAP. The Division Chief or Head of Office away from Bangkok, in coordination with the appraisal bodies, will determine the appropriate budget.
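To illustrate the arithmetic, the recommended amount could be sketched as follows. This is an illustrative sketch only: the budget figure and the PSC rate used here are assumptions for the example, not ESCAP policy, and `recommended_evaluation_budget` is a hypothetical helper, not part of any ESCAP system.

```python
# Illustrative sketch: deriving the recommended evaluation budget as
# five per cent of the operational project budget (i.e. net of PSC).
# The 13% PSC rate and the total budget figure are assumptions made
# for this example only.

def recommended_evaluation_budget(total_budget, psc_rate=0.13, share=0.05):
    """Five per cent of the operational budget, i.e. the total project
    budget with programme support costs (PSC) stripped out."""
    operational_budget = total_budget / (1 + psc_rate)
    return share * operational_budget

# Hypothetical project with a total budget of $565,000 including PSC.
budget = recommended_evaluation_budget(565_000)
print(f"Recommended evaluation budget: ${budget:,.0f}")
```

The key point the sketch captures is that the five per cent applies to the operational budget, so PSC must be removed from the total before the percentage is taken.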
2.2 Step 2: Prepare the Terms of Reference for the Evaluation
Terms of reference (TOR) are used to plan for an evaluation and also form the basis for contracts with external consultants. The evaluation manager prepares the TOR. It is important that stakeholders of the evaluation are involved in the preparation of the TOR to ensure that the evaluation meets stakeholder expectations, is not over-ambitious, and is sufficiently detailed for the evaluation team to carry out the evaluation. A stakeholder workshop can be organized to ensure that all stakeholders are aware of the evaluation and are able to provide input to the TOR.
The TOR should: (1) define the evaluation, (2) outline an evaluation methodology (design) as detailed below, and (3) set the budget and timeframe.
See Evaluation Tool 1: Evaluation TOR template
2.2.1 Define the evaluation
Purpose
Establish the evaluation purpose by answering the following three questions:
- Whom is the evaluation for? Is it for a particular donor or for member States? Or is it for ESCAP management or staff? Or both?
- Why is the evaluation being carried out? What triggered the evaluation? Is there a specific reason for the timing of the evaluation?
- How will the results be used? By being clear upfront as to how the results will be used (and sticking to this!) and with whom the evaluation will be shared, the evaluation manager can generate trust among all parties involved.
For example:
“The main purpose of the evaluation is to: (i) assess the performance of the project; (ii) derive lessons from implementation; and (iii) put forward recommendations for future interventions in the same sector area.”
“This evaluation is formative and forward-looking. Its purpose is to evaluate the operations and work plan for [programme or project] with a view to ascertaining how the [programme or project] can be strengthened to better serve the needs of members and associate members of ESCAP.”
“The evaluation will feed into the planned mid-term review of the joint MOU between [partner] and ESCAP, scheduled to be held during the third quarter of 2009. To serve this purpose, the findings and recommendations of the evaluation should provide guidance for the two institutions to further strengthen their partnership for the enhanced achievement of development results during the second half of the term of the MOU.”
Objectives
While the purpose clarifies why the evaluation is carried out, the objectives describe what the evaluation wants to illuminate. Table 3 lists the standard evaluation criteria and additional criteria that relate to United Nations commitments. These criteria can be used to formulate evaluation objectives. It is important that the relevant criteria are included in the objectives.
An example of typical objectives for a programme or project evaluation is:
- To assess the relevance, efficiency, effectiveness and sustainability of the project/programme/intervention;
- To assess the extent to which the design and implementation of the project/programme/intervention took into consideration cross-cutting United Nations commitments relating to gender/a rights-based approach/environmental sustainability/priority countries/working as one UN;
- To identify concrete recommendations for improvement.
Table 3. Evaluation criteria

Standard evaluation criteria
- Relevance: Appropriateness of objectives (of a theme or subprogramme) or outcomes (of a project) in terms of ESCAP’s priorities, Governments’ development strategies and priorities, and the requirements of the target groups.
- Efficiency: Extent to which human and financial resources were used in the best possible way to deliver activities and outputs, in coordination with other stakeholders.
- Effectiveness: Extent to which the expected objectives (of a subprogramme or theme) or outcomes (of a project) have been achieved, and have resulted in changes and effects, positive and negative, planned and unforeseen, with respect to the target groups and other affected stakeholders.
- Sustainability: Likelihood that the benefits of the subprogramme, theme or project will continue in the future.

Additional criteria reflecting United Nations commitments (notes a, b)
- UN coherence: Extent to which different United Nations agencies and other development partners operate in a coordinated and coherent way in the design and implementation of the subject of the evaluation. This could include utilization of structures in support of regional coordination, such as the Regional Coordination Mechanism (RCM) and its Thematic Working Groups (TWG), and ensuring coherent approaches with UN Country Teams through Non-resident Agency (NRA) approaches.
- Partnerships: The extent to which key stakeholders have been identified to be partners in the planning and delivery of a programme or intervention.
- Aid effectiveness (note c): In the context of the Paris Declaration and the Accra Agenda for Action (AAA), this refers to the streamlining and harmonization of operational practices surrounding aid delivery to developing countries to ensure enhanced aid effectiveness. This criterion also assesses the extent to which ESCAP has ensured that the programme or project is driven by the country or territory in which it is implemented or, in the regional context, by the member States, and the extent to which there is a focus on development results and mutual accountability in the design and implementation of the subject of the evaluation.
- Gender mainstreaming: Gender mainstreaming is one of the key strategies of UN-supported analysis and strategic planning. This criterion assesses the extent to which gender considerations have been incorporated in the design and implementation of the subject of the evaluation.
- Human rights-based approach: Extent to which a human rights-based approach (HRBA), an approach that mainstreams human rights principles throughout programming, has been utilized in the design and implementation of the subject of the evaluation.
- Environmental sustainability: Extent to which environmental sustainability considerations have been incorporated in the design and implementation of the subject of the evaluation.

Notes:
a The additional criteria overlap to some degree with the standard evaluation criteria. For example, the “one UN” criterion could also be considered under the efficiency criterion, the involvement of priority countries is also part of the relevance criterion, and gender is a cross-cutting issue that could be considered under each of the standard evaluation criteria.
b A few of the additional criteria (UN coherence, gender mainstreaming, HRBA and environmental sustainability) are based on United Nations Development Group principles; for more information visit the programming reference guide at: www.undg.org.
c The current principles for aid effectiveness are outlined on the OECD website: www.oecd.org/dac.
Scope
The scope of the evaluation describes what is included and what is not. The following should be considered in defining the scope:
- Description of the subject to be evaluated (project, subprogramme, theme) and what is to be excluded and included;
- The period covered by the evaluation, e.g. the past five years of a subprogramme;
- Geographical area, e.g. the South-East Asian countries targeted by a specific project;
- Stakeholders of the evaluation, such as ESCAP, United Nations agencies, member States, donors, government agencies and civil society/NGOs. For example, an evaluation of expert group meetings could include ESCAP staff members and participants in expert group meetings;
- Point of reference of the evaluation, i.e. what you are evaluating against. For subprogrammes and projects, a logical framework is developed as part of the planning process. However, a logical framework may not exist for cross-cutting issues (e.g. human rights) or approaches (e.g. capacity development), as the intent is for these cross-cutting issues to be mainstreamed throughout programmes and projects. It is useful to develop an evaluation logical framework (see Evaluation Tool 2), in particular for those programmes, projects or themes that do not have or did not develop a logical framework during the design phase. The evaluation manager can develop an evaluation logical framework and include it as part of the TOR, or it can be included as a task to be performed by the consultant(s) hired to conduct the evaluation;
- Evaluation questions that add more detail to each objective. For example, if one of the objectives is “to assess the relevance, efficiency, effectiveness and sustainability of the programme”, then specific questions should be asked under each criterion. The design of the evaluation will be based on the type of evaluation questions. In order to manage the size of the evaluation, it is recommended to limit the number of evaluation questions.
See Evaluation Tool 2: Sample evaluation logical framework model
See Evaluation Tool 3: Evaluation questions under evaluation criteria
Box 1. Impact
Impact, in the context of ESCAP’s work, refers to member States’ achievements in bringing about benefits for ultimate target groups (e.g. slum dwellers, the rural poor, small and medium-sized enterprises, etc.). Such benefits are linked, among others, to the Millennium Development Goals (MDGs), and the indicators used to measure benefits could include the proportion of people living on less than a dollar a day, the number of people living with HIV/AIDS, and the proportion of people with access to safe drinking water.
Apart from the difficulties and costs associated with measuring these indicators, evaluating ESCAP’s impact is challenging because of the difficulty of attributing observed changes to ESCAP’s work or isolating ESCAP’s contribution to measured impacts. Rather, ESCAP would seek to evaluate its contribution to the achievement of objectives (for subprogrammes) or outcomes (for projects). Further, ESCAP objectives or outcomes generally relate to the formulation and implementation of economic and social policies and programmes by member States. For these reasons, impact is not included in the list of standard ESCAP evaluation criteria presented in Table 3.
Limitations
Next, it is useful to identify limitations, or constraints, to the evaluation that could influence the evaluation team in answering the key evaluation questions. Limitations are typically linked to the following areas:
- Political, such as political sensitivities and the degree of interest/cooperation from member States;
- Organizational, including culture, support, managerial interest, knowledge and skills;
- Budget, time or resources, i.e. whether these are sufficient to conduct a rigorous evaluation;
- Data, which refers to the availability and quality of indicators, data and a baseline (which is especially important in assessing the achievement of outcomes);
- Attribution, which relates to how easy or difficult it will be to attribute observed changes to ESCAP.
It is also important to identify possible solutions for limitations, which should then be incorporated in the evaluation methodology or design, although it may not always be possible to develop a rigorous evaluation methodology and address all limitations. For example, in order to alleviate the political sensitivity surrounding an evaluation, the evaluation manager could engage stakeholders from the very beginning to ensure that all agree upon the topic under evaluation and the means or strategy for evaluating. You can find more ideas for addressing potential limitations in Evaluation Tool 4.
See Evaluation Tool 4: Common evaluation limitations
Deliverables
The main output of an evaluation is the standard evaluation report. However, this report may deviate from the standard structure. For example, in the case of a forward-looking evaluation, the report may need to include a concrete strategy for the future in addition to individual recommendations. There can also be other outputs that are required before, during or at the end of the evaluation, such as written comments on the TOR, a work plan, or a presentation to ESCAP staff members or management. Additionally, it is important to consider tailoring evaluation outputs to different target audiences. For example, policy-makers may not have time to read the full evaluation report, or may not want to carry the entire report around just for the executive summary, but they may have the time to read an evaluation brief that outlines the key findings, conclusions and recommendations. Also, evaluations mandated by the Commission must be submitted in a format that meets the requirements for pre-session documents.
See Evaluation Tool 5: Evaluation report template
2.2.2 Outline an evaluation methodology
The evaluation methodology, or design, describes the steps and activities that will be taken to answer the evaluation questions. The development of the evaluation methodology at ESCAP is based on standard models, which have been adapted for use at ESCAP, and consists of three steps:
1. Determining the design:
   a. Possibility of a before and after comparison;
   b. Possibility of utilizing a counterfactual;
2. Choosing information collection methods;
3. Determining method(s) of data analysis.
In the course of preparing the TOR, the evaluation manager will suggest a methodology. Once the evaluation methodology has been developed, it is essential to revisit the evaluation questions and ascertain that they can be answered through the chosen methodology.
The TOR should also describe limitations to the scope and methodology, based on remaining constraints, to ensure that the reader of the evaluation report can make a judgment on the validity of the findings, i.e. whether the method employed accurately measures what it is intended to measure. Common limitations are described in section 2.2.1 of the Guidelines and in Evaluation Tool 4.
While the TOR should provide a suggested methodology, this may need to be tentative and subject to the recommendations of the evaluator, who may be required to prepare an evaluation logical framework. The purpose of the evaluation logical framework is to provide a reference against which to conduct the evaluation. It reviews the expectations of the evaluation and takes into account actual budgetary allocations, activities implemented and constraints encountered. This exercise provides important leads as to the availability of data and which methodologies would be most appropriate.
See Evaluation Tool 2: Sample evaluation logical framework model
See Evaluation Tool 4: Common evaluation limitations
Box 2. The “Gold Standard” in evaluation
Evaluation literature refers to three primary types of evaluation designs to gather and analyze data: experimental, quasi-experimental and non-experimental. Experimental design is also referred to as the “Gold Standard” in the field of evaluation because it most closely resembles laboratory research, which is considered by many to be the most scientifically rigorous form of research. Experimental studies are the best design for attributing causality to the intervention of interest. An experimental study entails randomly assigning study participants to either a “control group” or an “experimental group” and ensuring enough participants for the outcome to be deemed statistically significant. Experimental studies are good at ruling out the possibility that something other than the intervention led to the observed outcomes because the evaluator is able to control the potential confounding factors.
In the development field, experimental designs are generally not feasible and are not deemed appropriate. ESCAP applies a combination of quasi-experimental and non-experimental designs. Quasi-experimental designs obtain measurements before (through the establishment of a baseline) and after an intervention, and may include a comparison group that is not targeted by the intervention. Non-experimental evaluation designs only take measurements in relation to the target group after the intervention.
1. Determining the design
a. Possibility of before and after comparison
In order to establish whether an intervention has brought about change, the situation before and after the implementation of the intervention must be compared. An example that could be applicable to ESCAP is a project in the Greater Mekong Subregion to assist national Governments in developing policies to increase rural households’ access to safe drinking water. In addition to reviewing existing policies after the completion of the project, the evaluation should also ascertain what policies existed before the project started. This is referred to as the ‘baseline’ and, ideally, it is established at the start of the project. If this type of method is to be employed, it should therefore be planned from the beginning of project implementation: the data will be more reliable and it will not be necessary to “reconstruct the baseline”.
b. Possibility of utilizing a counterfactual
If changes have been observed after the implementation of the intervention, it is important to determine whether the observed changes can be directly attributed to ESCAP’s contribution. One way of doing this is by exploring the “counterfactual”, which means asking what would have happened without ESCAP’s involvement. ESCAP’s contribution is determined with more certainty if it can be shown that a similar change did not take place for groups or countries that were not targeted by the intervention.
For many evaluations, it will be a challenge to ascertain this information, for two reasons: 1) as discussed previously, it is difficult to attribute a change directly to ESCAP’s involvement; and 2) it is difficult to compare the situations of countries because of their many different historical, political, social and economic conditions. As the work of ESCAP is carried out predominantly at the regional, subregional and national levels, it is not always easy to find suitable comparison groups: it is less complex to compare two neighbouring villages, one of which received aid and the other did not, than it is to compare two neighbouring countries or two subregions.
For these reasons it is advisable, in most cases, to utilize option (a) and plan a comparison of the situation before and after the intervention.
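The two design options can be sketched in a few lines of code. This is an illustrative calculation only: the function names and the drinking-water figures are invented for the example, and a real evaluation would rest on a far richer analysis than a single indicator.

```python
# Illustrative sketch: a before/after comparison for a target group,
# and, where a comparison group exists, a simple difference-in-
# differences as a rough proxy for the counterfactual.
# All figures are invented for this example.

def before_after_change(baseline, endline):
    """Change observed in one group between baseline and endline."""
    return endline - baseline

def difference_in_differences(target_baseline, target_endline,
                              comparison_baseline, comparison_endline):
    """Change in the target group net of the change in a comparison
    group that was not targeted by the intervention."""
    target_change = before_after_change(target_baseline, target_endline)
    comparison_change = before_after_change(comparison_baseline,
                                            comparison_endline)
    return target_change - comparison_change

# Hypothetical indicator: percentage of rural households with access
# to safe drinking water, at baseline and endline.
target_change = before_after_change(42.0, 58.0)                  # +16 points
net_change = difference_in_differences(42.0, 58.0, 40.0, 45.0)   # +11 points

print(f"Before/after change in target group: {target_change:+.1f} points")
print(f"Net change relative to comparison group: {net_change:+.1f} points")
```

The sketch makes the attribution point concrete: the simple before/after change (+16 points) overstates the contribution if part of the improvement (+5 points in the comparison group) would have happened anyway.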
2. Choosing information collection methods
The methodology and evaluation questions should guide the determination of the most appropriate method of data collection. Table 4 lists the information collection methods that are most relevant to evaluations at ESCAP, indicating the main advantages and disadvantages of each method. In most cases, a mix of qualitative and quantitative information will be used. For example, evaluators may first review project documentation and interview project staff to gain a broad understanding of the project (qualitative); then collect financial and other data in relation to the indicators in the logical framework (quantitative); and then conduct a survey or interviews among project partners and target groups (qualitative and quantitative).
The following considerations may help to determine which method of data collection would be appropriate:

- What information is already available and what needs to be collected?
- Which data collection method(s) will best answer the evaluation question(s)?
- What resources and time are available?
- What method will ensure stakeholder involvement?
- Can the validity (accuracy) and reliability (consistency in results using the same method) of the data be strengthened through a mixed qualitative/quantitative approach?
3. Determining method(s) of data analysis
Analysis and interpretation of the results is a critical exercise. Data analysis is the search for patterns and relationships in data and is guided by the evaluation questions. Many different means for analyzing qualitative and quantitative data exist. Whichever method is chosen, the evaluation manager and the reference group, if established, should work together with the evaluator(s) to place the findings within the context of the programme, region, organization, etc., identify possible explanations for unexpected results, and determine the conclusions that can be drawn from the data without unduly influencing the recommendations.
Bias, or the introduction of systematic error, is inevitable no matter which method of analysis is chosen. Triangulation is a recommended method for reducing bias. Triangulation is the use of multiple (three or more) types of data (quantitative and qualitative), data sources or methods of analysis to verify and substantiate an assessment. Essentially, triangulation provides a stronger assessment because it has been cross-checked or substantiated through multiple methods.
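As a minimal sketch of the idea, triangulation can be thought of as checking whether independent sources give a consistent picture of the same finding. The source names, figures, tolerance and the `triangulate` helper below are all invented for illustration and are not part of any ESCAP tool.

```python
# Illustrative sketch: cross-checking the same indicator against three
# independent sources before treating the finding as substantiated.
# Source names, figures and the 10% tolerance are invented examples.

def triangulate(estimates, tolerance=0.10):
    """Return True if all estimates of an indicator fall within
    `tolerance` (as a fraction) of their mean, i.e. the sources
    broadly agree."""
    mean = sum(estimates.values()) / len(estimates)
    return all(abs(value - mean) <= tolerance * mean
               for value in estimates.values())

# Hypothetical finding: share of trainees applying new skills,
# estimated from three different sources.
estimates = {
    "project reports": 0.62,
    "partner survey": 0.58,
    "site-visit interviews": 0.60,
}
print("Substantiated:", triangulate(estimates))
```

In practice triangulation is a matter of evaluator judgment rather than a mechanical threshold; the sketch only captures the underlying logic of requiring agreement across three or more sources.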
Table 4. Information collection methods most relevant to ESCAP
Review of documentation (made available to the evaluator or collected by the evaluator; see Table 7 for a list of documentation types)
Advantages: Inexpensive; fast; easy
Disadvantages: Limited to the documents available; difficult to verify the quality of information; leaves out tacit and informal knowledge

Interviews (ESCAP management/staff; other United Nations agencies; external stakeholders involved in or affected by the intervention)
Advantages: Broader view and context of the topic being evaluated; suitable for complex or sensitive topics; increased depth and detail
Disadvantages: Time-consuming (in arranging and conducting interviews); extrapolation and comparison of findings may be difficult; evaluator and interviewees must usually be in the same location for face-to-face interviews (video-conferences may be possible)

Focus group sessions (ESCAP management/staff, e.g. division chiefs, section staff or QAT; external stakeholders involved in or affected by the intervention)
Advantages: Faster and more cost-effective than individual interviews; group interaction
Disadvantages: Responses cannot easily be compared; inability to give views anonymously

Survey (written questionnaire; web-based questionnaire; telephone survey)
Advantages: Relatively inexpensive; ability to reach more stakeholders; summarizes findings in a clear, precise and reliable way; suitable for extrapolation, comparison of findings and replication
Disadvantages: Usefulness depends on the response rate; risk of losing subtle differences in responses due to finite classifications; difficult to verify the quality of information

Country/site visits (interviews; visits to locations)
Advantages: Same as interviews; ability to get feedback from sources close to the ultimate target group; suitable for projects whose results can only be verified visually
Disadvantages: Time-consuming (in arranging and conducting visits); expensive
Source: Based on Bamberger et al. (2006), Kusek and Rist (2004) and Danida (2001).
2.2.3 Set the budget and completion date
During the preparation of the ESCAP Evaluation Plan (Step 1), each planned evaluation has been budgeted and an indicative completion date has been established. In the preparation of the TOR, indicative timelines relating to the start of the evaluation, the submission of draft and final reports, and sign-off need to be identified. Moreover, it is important to prepare an indicative breakdown of the budget, e.g. consultants' time, travel costs and printing costs. Further details on tasks and timeframes for the evaluation team are provided in Section 2.4.
2.3 Step 3: Establish the Evaluation Team
It is important to uphold the United Nations Norms and Standards for evaluation by minimizing conflict of interest and maximizing the objectivity of the evaluation team.9 Within the ESCAP context, complete avoidance of conflicts of interest is only possible when OIOS, JIU or another external organization conducts an external evaluation of ESCAP. For internal evaluations, conflicts of interest can be minimized by ensuring that the evaluation team members, including the evaluation manager, are not involved with the management, policy or implementation of the subject of evaluation. Thus, the purpose of the evaluation should be carefully considered and should guide who is selected for the evaluation team.
Evaluators must have extensive experience in carrying out evaluations and technical knowledge of the topic that is being evaluated, as well as other specific expertise, such as country-specific knowledge, language skills and an understanding of ESCAP and the organizational context in which it operates. PMD is available to provide support in identifying suitable candidates.

The United Nations Standards for Evaluation in the UN System10 advise that work experience in the following areas is particularly important:

design and management of evaluation processes, including with multiple stakeholders;
survey design and implementation;
social science research;
programme/project/policy planning, monitoring and management.

It is also recommended to identify an evaluator with specialized experience, including data collection and analytical skills, in the following areas:

understanding of human rights-based approaches to programming;
understanding of gender considerations;
understanding of Results-based Management (RBM) principles;
logic modeling/logical framework analysis;
real-time, utilization-focused, joint, summative and formative evaluation;
quantitative and qualitative data collection and analysis;
rapid assessment procedures;
participatory approaches.
9 United Nations Evaluation Group (UNEG), "Norms and Standards for Evaluation in the UN System", April 2005 (available online at http://www.uneval.org).
10 United Nations Evaluation Group, “Standards for Evaluation in the UN System,” April 2005.
Additionally, personal skills in the following areas are important:
team work and cooperation;
capability to bring together diverse stakeholders;
communication;
strong drafting skills;
analytical skills;
negotiation skills;
language skills adapted to the region where the evaluation takes place.
Box 3. Resources for identifying an external evaluator
Disseminating the TOR for the evaluation through a list-serve or posting on the website of an evaluation association may increase the number of qualified applicants for the consultancy. A few of the relevant associations are listed below:
United Nations Evaluation Group: www.uneval.org/contactus
Organisation for Economic Cooperation and Development – Development Assistance Committee Network on Development Evaluation (OECD/DAC): www.oecd.org/dac/evaluationnetwork
International Development Evaluation Association (IDEAS): www.ideas-int.org/
International Program for Development Evaluation Training (IPDET): www.ipdet.org/
Monitoring and Evaluation News: www.mande.co.uk
Sri Lanka Evaluation Association: www.nsf.ac.lk/sleva/
Malaysian Evaluation Association: www.mes.org.my/
PMD can assist with the dissemination of TORs for evaluations and evaluative reviews through the above associations.
In general, evaluations at ESCAP are conducted by an external consultant for cost efficiency. However, depending on the complexity and available budget of the evaluation, an evaluation team may be hired, consisting of a lead evaluator and a number of evaluators. A lead evaluator assumes overall responsibility for carrying out the evaluation. This includes, among other activities, managing the work of the team, acting as a spokesperson for the team, ensuring the quality of interviews and information gathering, facilitating the preparation of the draft report, presenting the draft report, and producing the final report after comments have been received. Consideration should be given to the balance of evaluation and subject expertise when evaluation team members are selected. It is also important to consider the ability of the evaluator to work as a team member. Depending on the purpose and type of evaluation, the evaluation team can comprise external consultants, OIOS and/or ESCAP staff members, as shown in Table 1 (in Chapter 1).
In order to maximise objectivity and minimise conflict of interest, ESCAP staff members should consider the following key points when establishing an evaluation team:

In order to ensure that the management of the evaluation process is strictly separated from the conduct of the evaluation, the evaluation manager cannot be considered part of the evaluation team;
ESCAP staff members cannot be part of the evaluation team for external evaluations or for evaluations with external accountability as the main purpose;

ESCAP staff members cannot be part of the evaluation team for an evaluation of the work of the secretariat unit in which they work;

External consultants are contracted when specific expertise is required or when the main evaluation purpose is external accountability;

Staff members from divisions or offices away from Bangkok make up the team for peer reviews.

Various team compositions are possible for different types of internal evaluations, as described in Table 5.
Table 5. Evaluation team composition for internal evaluations
EVALUATION TYPE | TEAM COMPOSITION
Thematic, subprogramme or project evaluation (PMD is evaluation manager) | Consultants
Peer review | ESCAP staff
Project review* | ESCAP staff
*Note: ESCAP staff cannot be part of the project review team for a review of a project implemented by the organizational unit in which they work. Such reviews would be considered "self-assessments" and as such form part of ESCAP's monitoring framework.
2.4 Step 4: Schedule and Organize the Evaluation
An overview of arrangements to be made before an evaluation starts is provided below.
2.4.1 Prepare an evaluation work plan
An evaluation work plan is prepared, based on the tasks listed in the TOR. Table 6 outlines a minimum list of related tasks and indicative timeframes.
2.4.2 Gather background documentation
The type of documentation that is necessary for an evaluation team to conduct an evaluation varies with the type and topic of the evaluation. Table 7 contains documentation that is generally provided to the evaluation team.
2.4.3 Brief the evaluation team
For evaluations that are carried out by a larger evaluation team, whether consisting of consultants or staff members, it is recommended to organize a briefing session with the entire team. The briefing could cover the following:

Introduction of evaluation team members, particularly if they have not worked with each other before;
Background to the evaluation – ensure that team members understand the programme/project and organizational context;

The purpose, objectives, scope and outputs of the evaluation;
Potential limitations of the evaluation;
Evaluation methodology;
Proposed evaluation work plan, including roles and responsibilities of team members;
Available documentation;
Reporting requirements, as specified in the TOR.
Based on the outcome of this briefing session, it may be necessary to modify the methodology and/or time schedule.
Table 6. Evaluation tasks and indicative timeframes
TASK | RESPONSIBILITY | INDICATIVE TIMEFRAME
Gather background documents | Evaluation manager |
Brief evaluator/team | Evaluation manager |
Inception report: finalize methodology | Evaluation manager or evaluator/team | Prior to conducting the evaluation
Conduct the evaluation | Evaluator/team |
Submit draft evaluation report to the evaluation manager | Evaluator/team | Within one month after completing evaluation activities
Provide comments on draft evaluation report to evaluators | Relevant ESCAP staff, ESCAP management, PMD or OIOS (quality control), evaluation manager, and reference group (if established) | Within two weeks after receipt of draft evaluation report
Submit final evaluation report to the evaluation manager | Evaluation team | Within two weeks after receipt of comments
Finalize evaluation report (layout, editing) | Evaluation manager |
Sign off on evaluation report | Evaluator(s) |
Formulate management response for inclusion as an annex in the final evaluation report | ESCAP management, coordinated by evaluation manager | Within one month after receipt of final draft evaluation report
Sign off on management response | ESCAP management |
Share evaluation findings | Evaluation manager and ESCAP management | Within one month after the management response is signed off
Table 7. List of documentation to be made available to the evaluation team
GENERAL
Organizational/team diagram
Contact list of relevant ESCAP staff members, partners and other relevant stakeholders
Publications
Research papers
Promotional material (e.g. booklets, brochures, fact sheets, newsletters, posters, information kits)
Press releases
Meeting information (e.g. attendance lists, minutes/reports, agendas, handouts, evaluation questionnaire results)
Training materials
Mission reports
Budget, allotments and expenditures overview
Reports from previous evaluations
PROGRAMMES
Work programme, including results framework
Divisional or Sectional Work Plans
IMDIS reports (outputs, work months, Accomplishment Accounts)
Programme Performance Report (PPR)
PROJECTS
Project document, including the work and monitoring plan, logical framework, budget and e-TC summary
Relevant agreements (e.g. with the project donor)
Project revisions (if applicable)
Progress reports, including documents referred to in the report
Mid-term evaluation or evaluative review
Project terminal report, including documents referred to in the report
3. IMPLEMENTING EVALUATIONS
The implementation of evaluations is carried out by the evaluation team. The evaluation manager stays in touch with the evaluation team to provide assistance or clarification where needed, mediate in case any frictions arise, and ensure that the evaluation is carried out ethically and in accordance with the agreed methodology. If a reference group is established, the evaluation manager also ensures its involvement in the process. Implementation involves three steps, which are explained below.
3.1 Step 5: Conduct the evaluation
The evaluation team conducts the evaluation following the methodology described in the TOR and incorporating any changes that were agreed during the planning stage.

In general, it is not desirable to change evaluation activities during the course of the evaluation, because when a systematic approach is not followed, systematic error, or bias, may be introduced into the evaluation, compromising the findings. However, changes may be required in some instances, and the lead evaluator should consult with the evaluation manager about major changes. The following are examples from past evaluations:

The return rate for a survey questionnaire is very low. Therefore, the evaluators follow up with telephone interviews with selected stakeholders to whom a questionnaire was sent;

The countries to be visited by the evaluators, as identified in the TOR, are changed due to the unavailability of informants;

Additional stakeholders are interviewed in a certain country on the basis of recommendations from project partners interviewed in that country;

Group meetings with gender focal points are held in addition to interviews with individuals to consider the views of a wider group of people.

Depending on the evaluation, the evaluation team discusses the main findings with the evaluation manager or presents them to relevant ESCAP staff members towards the end of its visit to ESCAP. It is important that the evaluation manager safeguards the independence of the evaluators by being prepared to accept the findings, even when they differ from the programme or evaluation manager's perspective.
3.2 Step 6: Prepare the draft report
Throughout the evaluation process, the evaluation team will document findings and conclusions. Usually, the lead evaluator will organize and facilitate team meetings to discuss findings and conclusions and coordinate the preparation of a draft report. It is not uncommon for the evaluator to discuss findings with ESCAP staff members involved in the evaluation, for example to verify or clarify statements made during interviews or to provide further information where there are gaps. If a reference group has been established, meetings with the evaluator(s) will also serve this purpose.

The table of contents of the Evaluation or Evaluative Review Report is included in the TOR and usually follows the structure described in Table 8.
See Evaluation Tool 5: Evaluation report template
Table 8. Contents of the Evaluation Report
CONTENT | PAGES (estimate) | COMMENTS
Title page | 1 | Title and date of issuance; names of the evaluators; name of ESCAP or the division that commissioned the evaluation; web page address where the report can be found electronically
Management response | | To be completed by ESCAP management (see Chapter 4)
Acknowledgments | 1 | Prepared by the evaluation team
Table of contents | 1 | List of chapters, sections and annexes
List of acronyms | 1-2 | In alphabetical order; acronyms are written out in full the first time they are used in the report
Executive summary | 1-3 | Background of the evaluation (one paragraph); purpose and scope (one paragraph); methodology (one paragraph); main conclusions (one-sentence conclusions with a brief explanation if needed); recommendations (one-sentence recommendations with a brief explanation if needed); other comments or a concluding sentence
1. Introduction | 1-3 | 1.1 Background of the evaluation and the topic being evaluated; 1.2 Purpose, objectives and outputs; 1.3 Scope (including evaluation questions)
2. Methodology | 1-3 | 2.1 Description of the methodology: activities, timeframe, changes compared to the TOR, and reasons for selecting sample reports, countries, sites, case studies and interviewees as a representation of the topic being evaluated; 2.2 Limitations: limitations of the methodology and scope, and problems encountered
3. Findings | Varying length | 3.1 General: supporting information for the performance assessment and other assessment, if required; 3.2 Performance assessment: assessment against the relevant evaluation criteria (relevance, efficiency, effectiveness and sustainability); 3.3 Other assessment: assessment against relevant additional criteria (gender, rights-based approach, environmental sustainability, ESCAP priority countries and "one UN")
4. Conclusions | 1-4 | Main conclusions of the evaluation, both positive and negative, that follow logically from the findings; ratings table with ratings for standard evaluation and additional criteria and a brief justification (optional)
5. Recommendations | 1-4 | Recommendations based on the conclusions, which can be addressed to ESCAP management, ESCAP staff, donors and other relevant stakeholders
Annexes | | I. Management response with follow-up actions (to be completed by ESCAP management; see Chapter 4); II. Terms of reference; III. List of documents reviewed; IV. List of interviewees; other annexes as required (e.g. schedule of work undertaken by the evaluators, reports of meetings, interview summaries, questionnaires)
Source: Based on Department for International Development, "Guidance on Evaluation and Review for DFID Staff", July 2005 (available online at http://www.dfid.gov.uk/aboutDFID/performance/files/guidance-evaluation.pdf).
3.3 Step 7: Review the Draft Report
The process from the draft report to the final report takes place in two steps, as described below.
3.3.1 Review the draft report
The evaluation manager sends the draft report to the relevant division and office managers and to other programme or project staff for comments. Depending on the evaluation, the draft report may also be sent to external stakeholders for comments. Comments can focus on the conclusions and recommendations as well as on technical and methodological issues. It is the responsibility of the relevant programme or project officers to conduct a technical review, with inputs from other stakeholders, which covers the following questions:

Is the information in the report accurate? (i.e. check for factual errors)
Is the information in the report complete? (i.e. is there information lacking that could affect the conclusions?)
Are the recommendations relevant, objective and specific enough to be implemented?
For all evaluations, the evaluation manager, supported by the PMD Evaluation Officers, conducts a methodological review or quality check of the draft report. This review aims to ensure that the report and the drafting process meet a set of standard quality criteria (see Table 9).
The evaluation manager sends the compiled comments to the evaluation team.
See Evaluation Tool 6: Quality checklist for evaluation report
Table 9. Quality checklist used to review evaluation reports
The report meets the scope, purpose and objectives of the evaluation as stated in the TOR:
- The report is tailored to the information needs of ESCAP and/or other entities that commissioned the evaluation
- The report does not deviate from the scope outlined in the TOR
- The report can be used by ESCAP for the intended purpose as stated in the TOR
- The objectives, as outlined in the TOR, have been met, including: the assessment against the relevant performance criteria (relevance, efficiency, effectiveness, sustainability, etc.) is complete, i.e. the evaluation questions under each criterion have been answered

The report is structured logically:
- The report follows the table of contents outlined in the TOR and includes the relevant annexes

The evaluation methodology and its application are explained transparently and clearly:
- The evaluation methodology is clearly explained and has been applied throughout the evaluation process
- Amendments to the methodology compared to what was proposed in the TOR have been clearly explained
- The limitations of the evaluation methodology, including problems encountered during the conduct of the evaluation, and their implications for the validity of the findings and conclusions have been clearly explained

The findings and conclusions are credible:
- Relevant qualitative and/or quantitative sources of information have been considered
- The analysis is done rigorously: triangulation is employed (cross-checking of findings against other relevant sources) and cause-and-effect relationships are explained
- Findings are adequately substantiated, balanced and reliable
- The relative contributions of stakeholders to the results are explained
- Limitations are explained
- The conclusions derive from the findings and are clear

The recommendations are useful:
- The recommendations are clear and follow logically from the conclusions
- The recommendations are impartial
- The recommendations are realistic, concrete and actionable within a reasonable timeframe
- Recommendations for ESCAP are clearly within the mandate of ESCAP

The report is well written:
- The executive summary is brief but highlights the key findings, conclusions and recommendations
- The report uses consistent grammar and spelling (in accordance with UN rules)
- Main messages are clearly distinguished from the text
- The report is written in good English and is easy to read
- The subject of the evaluation (programme, project, other) is clearly described, including its logic model or results chain
- The stakeholders of the programme or project are clearly identified
3.3.2 Prepare the final report
The evaluation team adjusts the report based on the feedback provided and submits the final report to the evaluation manager. The evaluation manager ensures that the report is edited (in most cases only the executive summary is formally edited) and formatted properly. In the case of major edits, the evaluators should review the report once more to ensure that the content has not been affected. The evaluators then sign off on the report, after which no further changes may be made to it. For evaluative reviews, the evaluation manager then submits the report to PMD, whose Evaluation Officers coordinate the formulation of ESCAP's management response and follow-up action plan to its findings, conclusions and recommendations (see Chapter 4).

Depending on the evaluative process, the lead evaluator or the entire evaluation team may be invited to present the final report to ESCAP staff, management and/or other stakeholders. This presentation could be followed by questions and comments from participants and stakeholders. It can also be used to discuss a draft "management response".
4. USING EVALUATION FINDINGS
The third stage of an evaluation focuses on using the evaluation findings. This stage involves three steps, which are explained below.
4.1 Step 8: Prepare Management Response and Actions
The use of evaluations for accountability and organizational learning is facilitated through the development of a "management response" and a "follow-up action plan" addressing the findings and recommendations of each evaluation or evaluative review; together these make up the formal, written response from the organization (see Box 3 for an example). In this regard, it is critical that recommendations are relevant, objective and concrete enough for management to determine follow-up actions. It is the responsibility of the evaluation manager to ensure this as part of the quality review of the draft evaluation report (Step 7).

ESCAP management assumes a critical leadership role in ensuring the use of evaluations, as it is involved in the formulation of management responses and follow-up action plans. Through signing off on management responses and follow-up action plans, management commits to, and is thus accountable for, the implementation of follow-up to evaluations.11
For all evaluations and evaluative reviews, the process of formulating the management response and follow-up action plan is coordinated by PMD's Evaluation Officers in consultation with the evaluation manager and representatives from the organizational units that are expected to be responsible for, or otherwise involved in, following up on an evaluation or evaluative review. Upon the conclusion of the work of the evaluator or evaluation team, evaluative reviews must be submitted to PMD for this process to be initiated.

11 Table 1 (in Chapter 1) provides an overview of who represents ESCAP's management and is responsible for the management response in different types of evaluative processes. Evaluation Tool 8: Evaluation process checklist further details the evaluation process and the varying responsibilities.

Box 3. Management response (part 2) and follow-up action to a portfolio project evaluation (fictional)

1. Strategic recommendation for ESCAP: ESCAP should develop a partnership strategy, MOUs for partnerships, a partnership action plan and a monitoring mechanism. In order to achieve results, cooperation and synergy with other (specialized) institutions should be planned, negotiated, agreed upon and included in work plans, monitoring and evaluations.

Management response: We agree in principle. A comprehensive and results-oriented MOU model for partnerships has been developed since 2005, which needs to be further promoted within the secretariat as a tool for institutionalizing partnership with UN and non-UN organizations. In addition, the partnership strategy will be further sharpened during the process of revising the TC Strategy.

Follow-up actions:

Action | Completion date | Responsibility
a. Actively promote the use of the comprehensive MOU model in partnership development as an integral part of implementing the revised TC Strategy | After September 2008 | PMD
The timely development of the management response ensures that the evaluation recommendations and the corresponding follow-up actions remain relevant; thus, the management response should be completed within two months of the submission of the report to PMD.
See Evaluation Tool 7: Management response and follow-up action plan template
4.1.1 Management response
The management response consists of two parts and is inserted at the beginning of the evaluation report:

The first part provides an overall response from the perspective of ESCAP management on the evaluation and its results. This can include comments regarding the relevance and usefulness of the results. It may also highlight any differences of opinion with regard to the evaluation findings.

The second part provides a response from management to each individual recommendation, resulting in either (partial) acceptance or rejection of the recommendation. Additional comments may relate to broader implications for ESCAP, in particular in relation to programme and project planning and implementation.
See Evaluation Tool 5: Evaluation report template
4.1.2 Follow-up action plan
In conjunction with preparing the management response, evaluation follow-up actions are identified for each accepted recommendation. The expected completion dates and the responsible unit are stated for each follow-up action. The follow-up actions are included as Annex I of the evaluation report.

Many actions will relate directly to the topic being evaluated. For example, actions resulting from a project evaluation could be to narrow the focus in a second phase of the project; plan additional workshops; allocate additional staff to the project; or hold a meeting with the donors and other project partners to discuss a future project.

Some findings, conclusions or recommendations may indicate broader implications for ESCAP, which would lead to the identification of longer-term, strategic or institutional-level actions. Therefore, it is important that internal stakeholders are consulted during the development of the follow-up action plan and commit to the management response, where appropriate. Examples of key areas for such actions are:
Programme management. For example, an evaluation may find that:
- A certain modality is a very effective or cost-efficient way of achieving a programme objective;
- Certain existing target groups have no link to policy development in the countries that a programme or project is trying to influence;
- ESCAP could focus its work in areas where it has a clearer comparative advantage.
In such cases, future programmes across ESCAP need to consider the findings;
Project management. For example, changes to project planning and appraisal processes may be considered in order to address evaluation recommendations, or the standard instructions for preparing a project document may be revised on the basis of the findings of an evaluation;

Human resources. For example, evaluations may highlight the need to adjust recruitment practices, performance appraisal, or training.
See Evaluation Tool 5: Evaluation report template
4.2 Step 9: Share Evaluation Findings
The evaluation manager is responsible for finalizing the report for publication, including the incorporation of the final management response, the preparation of PDF files of the report and the executive summary and, if required, overseeing the printing of hard copy reports and commissioning the translation of the executive summary or the entire report.

It is important to note that the report is only finalized and issued after the management response and follow-up actions have been included in the report. A copy of the final report must be submitted to PMD.
Evaluation findings must be shared in accordance with the following guidelines:
All reports of evaluations and evaluative reviews (including the management response) are made available internally, including on the ESCAP intranet, with the aim of enhancing transparency, ownership and internal accountability;

Internal briefing sessions are conducted for ESCAP management and staff to highlight important evaluation findings and recommendations, particularly where they are of strategic importance. Such briefings may be given by the lead evaluator or relevant ESCAP staff members;

Reports of evaluations are disseminated to external stakeholders, such as member States and donors, posted on IMDIS as evidence for accomplishment accounts, posted on other relevant electronic databases, and posted on the ESCAP website to enhance transparency and external accountability;

Reports of evaluative reviews and other evaluative processes that focus primarily on organizational learning are normally shared internally only. If external accountability is explicitly mentioned as a purpose for an evaluative review, dissemination to external stakeholders and through the ESCAP website may take place;

Reports that are mandated to be submitted to intergovernmental bodies (i.e. the Commission, Governing Councils, etc.) must be in the proper format, meeting editorial standards for pre-session documents. The document must include information on how to obtain a copy of the full report of the evaluation. If the management response is not finalized in time to be included in the pre-session document, the document should include a footnote containing (a) the date by which the full report will be finalized and (b) information on how to obtain a copy of the report at that time.
4.3 Step 10: Follow-up and Promote Learning
The evaluation does not end with the dissemination of the findings. Recommendations can only lead to improvements in ESCAP’s work if learning from evaluations is promoted and the actions following from the recommendations are implemented. It is important that follow-up is incorporated into existing monitoring processes as much as possible to minimize the time required to track implementation.
ESCAP Evaluation Guidelines
4.3.1 The Executive Secretary
The Executive Secretary (ES) is responsible for overall leadership and oversight of the evaluation function at ESCAP and thus for ensuring that evaluations and evaluative reviews are used to strengthen accountability and promote learning at ESCAP. The ES promotes the use of internal evaluations at ESCAP by:
Ensuring that findings of strategic importance are considered and reflected in the organization’s overall direction and shared with relevant members of the UN system.
Ensuring that follow-up actions are undertaken by ESCAP management and staff by:
o Reviewing periodic status reports prepared by PMD and taking action as necessary;
o Including general or specific requirements in the e-PAS of relevant senior staff members that they implement evaluation follow-up actions in time. As the e-PAS process includes a mid-term review and an end-of-cycle appraisal, this provides the opportunity to revisit actions every six months.12
4.3.2 Division chiefs and heads of offices away from Bangkok
Division chiefs and heads of offices away from Bangkok are responsible for ensuring that follow-up actions under their purview are implemented in time. This is accomplished by:
Ensuring that the findings from evaluations are shared and used for programme and project planning exercises;
Incorporating actions for which they are responsible in the Annual Work Plan of their division/office;
Ensuring that relevant actions are included in the work and monitoring plans of activities, projects and programmes implemented by their division/office;
Including general or specific requirements in the e-PAS of relevant staff members that they implement their assigned evaluation follow-up actions in time;
Monitoring and regularly updating the status of evaluation follow-up actions for which their division/office is responsible;
Ensuring that the status of evaluation follow-up actions is documented under item 7 of the accomplishment accounts, “Learning: lessons learnt to date and practical suggestions for improvement”.
4.3.3 Programme Management Division
PMD is responsible for monitoring the implementation of evaluation follow-up actions by:
Developing and maintaining an IT tool for tracking the follow-up to evaluations and liaising with PME Focal Points to ensure that the tool is used;
Liaising with the PME focal points to ensure that follow-up actions are regularly updated so that the status of the implementation of actions is continuously tracked;
12 It is important to note that, in this way, ESCAP staff members and management are held accountable on the basis of the follow-up actions they take or fail to take as a result of an evaluation, and not on whether the evaluation resulted in positive or critical findings.
Preparing, every six months, an update for the Executive Secretary that includes the status of actions by each division and office away from Bangkok and by evaluation, and a list of outstanding actions that are past the expected completion date.
PMD also organizes workshops open to all staff, at least once a year, which aim to:
Share experiences in managing and conducting evaluations during the preceding period;
Review lessons learned from different evaluations and identify concrete areas in which such lessons can be applied;
Review the status of evaluation follow-up actions and agree on changes, as appropriate;
Assess successes and barriers in creating an effective evaluation system and culture at ESCAP, and identify what is needed to further improve ESCAP’s M&E System.
13 iSeek homepage, via Quicklink “Inside ESCAP” – “Programme Management Division”, “Project and Programme Management Guide”, http://iseek.un.org/webpgdept1028_3.asp?dept=1028
ANNEXES
Annex I. List of Key Reference Materials
Secretary-General’s Bulletin
ST/SGB/2000/8, 19 April 2000, “Regulations and Rules Governing Programme Planning, the Programme Aspects of the Budget, the Monitoring of Implementation and the Methods of Evaluation”
Publications by the Office of Internal Oversight Services (OIOS)
A Guide to Using Evaluation in the United Nations Secretariat, June 2005, http://www.un.org/depts/oios/manage_results.pdf
Proposals on the Strengthening and Monitoring of Programme Performance and Evaluation, April 2005, http://www.un.org/depts/oios/pages/other_oios_reports.html
Strengthening the role of evaluation and the application of evaluation findings on programme design, delivery and policy directives, April 2006, http://www.un.org/depts/oios/pages/other_oios_reports.html
Publications by the United Nations Evaluation Group (UNEG)
Standards for Evaluation in the UN System, April 2005, http://www.unevaluation.org/normsandstandards/
Norms for Evaluation in the UN System, April 2005, http://www.unevaluation.org/normsandstandards/
ESCAP Project and Programme Management Guide
The objective of the Project and Programme Management Guide (also called the “Resource Guide”) is to provide ESCAP staff members with a clear set of policies and procedures for programme and project implementation. It is a web-based Guide that can be accessed through the UN Secretariat homepage (iSeek).13 The Evaluation Guidelines, including templates, are found in the “Monitoring and Evaluation” section.
Published literature consulted during the preparation of the Evaluation Guidelines and tools
Bamberger et al., 2006. “Real World Evaluation: working under budget, time, data and political constraints”. Sage Publications, Inc., www.sagepublications.com
Denmark, Ministry of Foreign Affairs, Danida, 2001. “Evaluation Guidelines”, second edition, http://www.um.dk/en/menu/DevelopmentPolicy/Evaluations/
United Kingdom of Great Britain and Northern Ireland, Department for International Development (DFID), July 2005. “Guidance on Evaluation and Review for DFID Staff”.
Joint Inspection Unit, United Nations, 2006. “Oversight Lacunae in the United Nations System”
Kusek, J.Z. and Rist, R.C., May 2004. “Ten steps to a results-based monitoring and evaluation system: a handbook for development practitioners”. World Bank Publications, www.worldbank.org/publications
United Nations Development Programme (UNDP), 2002. “Handbook on Monitoring and Evaluating for Results”, www.undp.org/eo/methodologies.htm
Annex II. List of Evaluation Tools
The following Evaluation Tools are available for conducting evaluations:
Evaluation Tool 1: Evaluation terms of reference template
Evaluation Tool 2: Sample evaluation logical framework model
Evaluation Tool 3: Evaluation questions under evaluation criteria
Evaluation Tool 4: Common evaluation limitations
Evaluation Tool 5: Evaluation report template
Evaluation Tool 6: Quality checklist for evaluation report
Evaluation Tool 7: Management response and follow-up action plan template
Evaluation Tool 8: Evaluation process checklist
Annex III. List of Evaluation Fact Sheets
Evaluation Fact Sheets are available for the following different types of evaluation:
Evaluation Fact Sheet 1: Thematic Evaluation
These evaluations focus on a sector, fund, cross-cutting issue, modality, publication or service. They are managed by PMD Evaluation Officers and carried out by external consultants;
Evaluation Fact Sheet 2: Subprogramme Evaluation
These focus on entire subprogrammes or major components thereof, e.g. regional institutions or project clusters within a subprogramme. They are managed by PMD Evaluation Officers and carried out by external consultants;
Evaluation Fact Sheet 3: Project Evaluation
These focus on individual projects. They are managed by PMD Evaluation Officers and carried out by external consultants;
Evaluation Fact Sheet 4: Evaluative Review: Project Review
These reviews focus on individual projects and are managed by a division or an office away from Bangkok and conducted by external consultants or ESCAP staff not from the division/institution managing the review;
Evaluation Fact Sheet 5: Evaluative Review: Peer Review
Peer reviews can be of organizational performance and practice relating to particular modalities, sets of activities, reports or procedures. Peer reviews are managed by a division or an office away from Bangkok and conducted by a group of peers.
The fact sheets are subject to continuous updates. The latest versions are available on iSeek: (http://iseek.un.org/webpgdept1028_79.asp?dept=1028).
14 United Nations Evaluation Group (UNEG), “Norms and Standards for Evaluation in the UN System”, April 2005 (available online at http://www.uneval.org).
Annex IV. United Nations Norms for Evaluation adapted for ESCAP
ESCAP seeks to uphold the norms and standards for evaluation developed by the United Nations Evaluation Group.14 These guiding principles have been adapted for ESCAP’s context, as set out below:
Intentionality: The scope, design and planning of evaluations should contribute to the generation of relevant, timely findings that meet the needs of stakeholders. It must be clear from the outset what the evaluation findings will be used for, i.e. for organizational learning to feed into future programmes and projects, for accountability to member States and donors, or both;
Impartiality: Objectivity is required in the planning, design, team selection and execution of evaluations and in the formulation of findings and recommendations, taking the views of relevant stakeholders into account;
Independence: Only external evaluations that are managed and conducted by organizations other than ESCAP can be considered truly independent. However, most evaluations of ESCAP’s work are managed by ESCAP staff. To maximize independence under these circumstances, evaluations that serve external accountability purposes are managed by the PMD Evaluation Officers and conducted by external consultants (evaluators). Evaluations (including evaluative reviews) that serve organizational learning purposes are, to the extent possible, conducted by external evaluators. Independence applies to evaluation managers as well as to evaluators: to avoid conflicts of interest and undue pressure, evaluators must not have been responsible for the policy-setting, design or management of the subject of the evaluation, nor expect to be in the near future;
Evaluability: Prior to undertaking a major evaluation requiring a significant investment of resources, it is necessary to establish that it is technically possible to evaluate the initiative in question and that there is no major factor hindering the evaluation process, such as lack of independence, information or clear intent of the subject to be evaluated;
Quality: The quality of the findings must be ensured through proper design, planning and implementation, and by preparing a complete and balanced report which contains information that can be easily distilled into lessons and disseminated;
Competencies for evaluation: Evaluation staff should have formal job descriptions and performance criteria, as well as the relevant competencies and skills to conduct evaluations and hire external evaluators;
Transparency and consultation: Transparency and consultation are necessary at all stages of the evaluation process to build ownership and facilitate consensus. Evaluation reports (including the terms of reference) should be available to major stakeholders and be public documents that are accessible and readable;
Ethics: Evaluators must have personal and professional integrity, must allow institutions and individuals to provide information confidentially and should verify their statements. They must be sensitive to the beliefs, manners and customs prevailing in a particular social and cultural environment; they should likewise be sensitive to and address issues of discrimination and gender inequality, and should discreetly report wrongdoing where appropriate;
Follow-up to evaluations: Management is required to provide a response to the recommendations of evaluations, which, at ESCAP, should be included in the report as an annex. Evaluation recommendations that have been accepted by management should be followed up systematically, and the status of follow-up should be reviewed periodically;
Contribution to knowledge building: Evaluation findings and recommendations should be presented in such a way that they can be easily accessed, understood and implemented by target audiences. As such, they need to be relevant and appropriate, bearing in mind the capacity and opportunities of the target audiences to strengthen implementation processes and results. The sharing of evaluation reports should facilitate learning among stakeholders, including, where appropriate, other entities of the UN system.
CONTENTS
Page
FACT SHEETS
MONITORING FACT SHEETS
Fact Sheet 1. Annual Work Plan ............................................................................................. 1
Fact Sheet 2. Output Reporting ............................................................................................... 3
Fact Sheet 3. Work Months Reporting ................................................................................... 5
Fact Sheet 4. Accomplishment Accounts ............................................................................... 7
Fact Sheet 5. IMDIS Results Information .............................................................................. 9
Fact Sheet 6. Preliminary Performance Assessment (PPA) ................................................ 11
Fact Sheet 7. Programme Performance Report (PPR) ......................................................... 13
Fact Sheet 8. Project Document ............................................................................................... 15
Fact Sheet 9. Logical Framework ............................................................................................ 17
Fact Sheet 10. Project Work and Monitoring Plan .............................................................. 19
Fact Sheet 11. Project Budget ................................................................................................... 21
Fact Sheet 12. Summary Page for e-TC and Updates ........................................................ 23
Fact Sheet 13. Requests for Allotments and Revised Allotments ..................................... 25
Fact Sheet 14. Project Progress Report ................................................................................... 27
Fact Sheet 15. Project Terminal Report .................................................................................. 29
EVALUATION FACT SHEETS
Fact Sheet 1. Internal Evaluation: Thematic Evaluation ..................................................... 31
Fact Sheet 2. Internal Evaluation: Subprogramme Evaluation .......................................... 33
Fact Sheet 3. Internal Evaluation: Project Evaluation ......................................................... 35
Fact Sheet 4. Evaluative Review: Project Review ................................................................ 37
Fact Sheet 5. Evaluative Review: Peer Review .................................................................... 39
TOOLS
MONITORING TOOLS
Tool 1. Sample Results-based Annual Work Plan ............................................................... 41
Tool 2. Sample Work Months Report .................................................................................... 43
Tool 3. Sample Accomplishment Account ............................................................................. 49
EVALUATION TOOLS
Tool 1. Evaluation Terms of Reference Template ................................................................ 57
Tool 2. Sample Evaluation Logical Framework Model ...................................................... 69
Tool 3. Evaluation Questions under Evaluation Criteria ................................................... 71
Tool 4. Common Evaluation Limitations ............................................................................... 73
Tool 5. Evaluation Report Template ....................................................................................... 75
Tool 6. Quality Checklist for Evaluation Report .................................................................. 89
Tool 7. Management Response and Follow-up Action Plan Template ........................... 91
Tool 8. Evaluation Process Checklist ...................................................................................... 93
Focus Subprogramme, regional institution
Periodicity Every year; updated semi-annually
Purpose To provide:
a management tool for Division and Section Chiefs, and heads of Offices away from Bangkok
an overview of planned activities and timelines to staff members
a basis for the e-PAS work plans of all staff members, including managers.
Deliverable Annual Work Plan, which is effectively a compilation of Section Work Plans and work plans of regional institutions, as appropriate. It includes:
Expected accomplishments and intermediary results as reflected in the Strategic Framework
Major activities, irrespective of whether they are funded through the regular budget or through extra-budgetary resources
Monitoring activities, including benchmarks for reporting in IMDIS (e.g. outputs, Accomplishment Accounts)
Planned evaluations and evaluative reviews
Additional information typically includes the work programme output code, the account number (XB or RB), allocated budget, timeframe/deadlines and responsible staff members
Process At the start of the programme cycle, PMD enters the work programme for each subprogramme into IMDIS based on the Programme Budget for the biennium
Division Chief/Head of Office away from Bangkok prepares the draft AWP together with Section Chiefs, as appropriate
Participatory meetings are held with staff members to discuss the draft AWP (this can also take place at Section level)
The final AWP is prepared
The AWP is updated semi-annually, with active staff involvement
At 12 and 21 months, Division Chiefs and other staff concerned meet with the Executive Secretary to present and report on their AWP and AA/PPA (cf. Fact Sheets 4 and 6)
Responsibilities Overall responsibility: Division Chief/ Head of Office away from Bangkok
Initiation: Division Chief/ Head of Office away from Bangkok
Execution: Division Chief/ Head of Office away from Bangkok
Quality assurance: Division Chief/ Head of Office away from Bangkok
Further information
Sample Results-based Annual Work Plan (Monitoring Tool 1)
Last updated October 2009
MONITORING FACT SHEET 1
ANNUAL WORK PLAN (AWP)
Fact sheets are updated on a continuous basis. The latest versions are available on iSeek: (http://iseek.un.org/webpgdept1028_79.asp?dept=1028).
Focus Subprogrammes and Executive Direction and Management (EDM)
Periodicity Semi-annual (6, 12, 18 and 24 months of the programme cycle)
Purpose To report on output delivery against the Programme Budget in terms of:
Quantity: number of meetings, publications, technical materials, workshops, etc.
Timeliness: percentage completion of planned outputs
Deliverable For each subprogramme, one report with ten categories in IMDIS: (1) Substantive servicing of meetings; (2) Parliamentary documentation; (3) Expert groups, rapporteurs, depository services; (4) Recurrent publications; (5) Non-recurrent publications; (6) Other substantive activities; (7) Training courses, seminars and workshops; (8) Fellowships and grants; (9) Field projects
Supporting documentation for each output. For projects, this should include at least the project document, project revisions, work and monitoring plan and progress reports. Other supporting documents may include project terminal reports, project evaluation reports, meeting reports, lists of participants, publications, etc. For outputs available on the internet, a URL should be provided.
Process Prior to the start of each biennium and during the Programme Budget preparations, PMD enters the draft outputs into IMDIS. Upon approval of the work programme by the General Assembly, PMD makes revisions accordingly
At each monitoring milestone, PMD sends a reminder to divisions to report on outputs, work months, accomplishment accounts and IMDIS results information
At each monitoring milestone, each division completes an output report online in IMDIS. A hard copy of supporting documentation (together with accomplishment accounts and work months report) is submitted formally to PMD by each division chief and, as appropriate, head of office.
PMD reviews the submissions and sends an email to the Department of Management, UNHQ, to confirm the final status of the semi-annual reporting
DM reviews the submission in IMDIS and seeks further clarification as required. PMD coordinates ESCAP’s responses to DM with divisions and offices
Responsibilities Overall responsibility: Division Chief / Head of Office away from Bangkok
Initiation: PMD
Execution: Division Chief, Head of Office away from Bangkok, other staff; coordination by PME Assistants
Quality assurance: Division Chief/ Head of Office away from Bangkok / PMD / DM
Further information
IMDIS User’s Guide version 2.6, issued by OIOS, UNHQ, December 2003
PMD memorandum on output delivery, description of results and accomplishmentaccounts, 17 January 2005
PMD email circular to PME assistants dated 26 July 2007 on “Guideline for entering outputs under Technical Cooperation categories”
Advisory Notes No. 1 – Programme Performance Reporting Biennium 2008-2009, issued by Department of Management, UNHQ, 6 November 2008
MONITORING FACT SHEET 2
OUTPUT REPORTING
Advisory Notes No. 2 – Procedures for Programme Performance Monitoring and Reporting for the 2008-2009 Biennium through the Use of IMDIS, issued by Department of Management, UNHQ, 18 December 2008
Last updated October 2009
Focus Subprogrammes and Executive Direction and Management (EDM)
Periodicity At each monitoring milestone (6, 12, 18, 21 and 24 months of the programme cycle)
Purpose To account for the use of staff resources (professional staff and consultants)
Deliverable Overview in Excel format of actual work months spent on activities for each professional staff member or consultant against the outputs in the Programme Budget, irrespective of whether funding comes from the Regular Budget or extra-budgetary resources
Process At the start of the programme cycle, PMD prepares a work months Excel template for each division and, as appropriate, Office away from Bangkok (subprogramme) and EDM, listing outputs in the left column. The template has five worksheets: 1 – 6 months, 7 – 12 months, 13 – 18 months, 19 – 24 months, and a total for the two years
At each monitoring milestone, PMD sends a reminder to divisions to report on outputs, work months, accomplishment accounts and IMDIS results information
At each monitoring milestone, each division/office reports actual work months spent for each professional staff member / consultant. The right-hand column of the template automatically calculates the total work months spent per output. Divisions enter into IMDIS the total number of work months spent per output, and a hard copy of the spreadsheet (together with output documentation and accomplishment accounts) is submitted formally to PMD by the division chief/head of office.
PMD reviews the submissions and sends an email to the Department of Management, UNHQ, to confirm the final status of the semi-annual reporting
Notes:
Staff should include time spent on monitoring activities under IMDIS category (6) Other substantive activities
Responsibilities Overall responsibility: Division Chief and Head of Office away from Bangkok
Initiation: Division Chief / Head of Office away from Bangkok / PMD
Execution: Division Chief, Section Chief, Head of Office away from Bangkok, professional staff members, other staff; coordination by the PME Assistant
Quality assurance: Division Chief / Head of office / PMD / DM
Further information
Excel template issued by PMD on 29 June 2005
PMD memorandum on work months monitoring dated 29 June 2005
Advisory Notes No. 2 – Procedures for Programme Performance Monitoring and Reporting for the 2008-2009 Biennium through the Use of IMDIS, issued by the Department of Management, UNHQ, 18 December 2008, pages 32-33
Sample Work Months Report (Monitoring Tool 2)
Last updated October 2009
MONITORING FACT SHEET 3
WORK MONTHS REPORTING
Focus Subprogrammes and Executive Direction and Management (EDM)
Periodicity At each monitoring milestone of the programme cycle
Purpose To provide a summary of a specific subprogramme accomplishment, based on data collected for the indicators of achievement and other relevant information, which serves as the source for reporting on whether the relevant expected accomplishment was achieved.
Deliverable For each expected accomplishment, an accomplishment account (AA) containing the following:
Chapeau Indicators of achievement – from the approved Programme Budget
Section 1 Setting: rationale for the expected accomplishment, e.g. policy or programme framework
Section 2 End-users: who is targeted as the end-user of the products and services
Section 3 Intermediaries: partner agencies/organizations which support or participate in the delivery of products and services
Section 4 Challenges: any events or developments (social, political, economic or environmental) that pose a challenge to achieving the expected accomplishment
Section 5 Events/Actions: activities undertaken and outputs delivered
Section 6 Results: what was achieved in relation to the indicators of achievement
Section 7 Learning: lessons learnt to date and practical suggestions for improvement
Process Prior to the start of each biennium, PMD enters the expected accomplishments with indicators of achievement for each subprogramme. Upon approval of the work programme by the General Assembly, PMD makes revisions accordingly
Every six months, PMD sends a reminder to divisions / offices to report on outputs, work months, AA’s and IMDIS results information
At 12, 18 and 21 months, divisions / offices, at participatory meetings, discuss the quality of delivered outputs following the headings listed above
Based on the outcome of the discussions, divisions / offices prepare AA’s in MS Word and post them in IMDIS. A hard copy (together with output documentation and work months report) is submitted formally to PMD by the division chief / head of office
At 12 months (and at 21 months, in the context of PPA preparations), Division Chiefs and heads of offices, as well as other staff concerned, meet with the Executive Secretary to present and report on their AWP and AAs.
PMD reviews the submissions and sends an email to DM to confirm the final status of the semi-annual reporting
DM reviews the submission in IMDIS and seeks further clarification as required. PMD coordinates ESCAP’s responses to DM
Notes:
Results information in IMDIS (cf. Monitoring Fact Sheet 5) is generated from the AA’s
The AA at 18 months is used to prepare the Preliminary Performance Assessment (PPA) at 21 months (cf. Monitoring Fact Sheet 6)
The AA at 24 months is used to prepare the Programme Performance Report (PPR) (cf. Monitoring Fact Sheet 7)
MONITORING FACT SHEET 4
ACCOMPLISHMENT ACCOUNTS
Responsibilities Overall responsibility: Subprogramme manager
Initiation: PMD
Execution: Division Chief, Heads of Offices away from Bangkok, Section Chiefs, relevant professional staff; coordination by the PME focal point
Quality assurance: Subprogramme manager / PMD / DM
Further information
Templates for AA’s
IMDIS User’s Guide version 2.6, issued by OIOS, UNHQ, December 2003
Advisory Notes No. 1 – Programme Performance Reporting Biennium 2008-2009, issued by Department of Management, UNHQ, 6 November 2008
Advisory Notes No. 2 – Procedures for Programme Performance Monitoring and Reporting for the 2008-2009 Biennium through the Use of IMDIS, issued by Department of Management, UNHQ, 18 December 2008
Sample Accomplishment Account (Monitoring Tool 3)
Last updated October 2009
Focus Subprogrammes and Executive Direction and Management (EDM)
Periodicity 12, 18, 21 and 24 months of the programme cycle
Purpose To report on progress towards achieving the expected accomplishments of the overall subprogramme through measurements of the indicators of achievement and provision of supporting results information
Deliverable For each expected accomplishment, completion of the following IMDIS fields:
Statement of accomplishments/results achieved: A succinct version of the “Results” section of the accomplishment account (AA)
Lessons learned/areas needing improvement: A succinct version of the “Learning” section of the AA
For each indicator of achievement, completion of the following IMDIS fields:
Interim measurement: An interim measurement of the indicator, used as a benchmark to assess progress towards the target for the biennium. The measurement is to be updated with the frequency indicated by “periodicity” under the “methodology” section for each indicator of achievement (see below)
Description of results: A succinct version of the part of the “Results” section of the AA that relates to the indicator of achievement in question
Methodology: A set of IMDIS fields that, for each indicator of achievement, identifies data sources, data collection and verification methods and external factors that could distort measurements, identifies or creates presentation formats, and indicates the periodicity of measurements. It is the responsibility of each division/subprogramme to keep this section updated at all times
Process Prior to the start of each biennium, PMD enters the expected accomplishments with indicators of achievement for each subprogramme. Upon approval of the work programme by the General Assembly, PMD makes revisions accordingly
At each monitoring milestone, PMD sends a reminder to divisions / offices to report on outputs, work months, AA’s and IMDIS results information
At 12, 18 and 21 months, divisions / offices, at participatory meetings, discuss the quality of delivered outputs
Based on the outcome of the discussions, the AA’s for each subprogramme are prepared / reviewed and, based on the AA’s, IMDIS results information is updated
PMD reviews the submissions and sends an email to DM to confirm the final status of the semi-annual reporting
DM reviews the submission in IMDIS and seeks further clarification as required. PMD coordinates ESCAP’s responses to DM
Responsibilities Overall responsibility: Subprogramme manager
Initiation: PMD
Execution: Division Chiefs, Heads of Offices away from Bangkok, Section Chiefs, relevant professional staff; coordination by the PME focal point
Quality assurance: Subprogramme manager / PMD / DM
MONITORING FACT SHEET 5
IMDIS RESULTS INFORMATION
Further information
IMDIS User’s Guide version 2.6, issued by OIOS, UNHQ, December 2003
Advisory Notes No. 1 – Programme Performance Reporting Biennium 2008-2009, issued by Department of Management, UNHQ, 6 November 2008
Advisory Notes No. 2 – Procedures for Programme Performance Monitoring and Reporting for the 2008-2009 Biennium through the Use of IMDIS, issued by Department of Management, UNHQ, 18 December 2008
Last updated October 2009
Further information
Focus Subprogrammes and Executive Direction and Management (EDM)
Periodicity Month 21 of the programme cycle
Purpose To take stock of the subprogramme’s performance after almost completing the two-year programme cycle
To prepare for the formulation of the draft Programme Performance Report (PPR) at 24 months
To inform the Strategic Framework planning process which takes place during October and November (months 22 and 23) of the second year of the programme cycle
Deliverable The Preliminary Performance Assessment (PPA) is effectively a draft PPR covering 21 rather than 24 months of the biennium. The PPA exercise includes the following:
Obtaining an overview of the status of output delivery for the biennium
Reviewing the accomplishment account (AA) for each expected accomplishment
Updating IMDIS results information, including statements of accomplishment and lessons learned/areas needing improvement
Process Division Chief / Head of Office holds participatory meeting with staff to discuss the status of delivery of the programme of work for the biennium, including changes since the 18 months’ update of AAs and IMDIS results information
Division Chief / Head of Office, in consultation with Section Chiefs / Heads of regional institutions, approves the PPA. The updated AAs are submitted in hard copy to PMD, and updated results information is entered into IMDIS
Division Chiefs / Head of Office and other staff concerned meet with the Executive Secretary to present and report on their AWP and PPA.
PMD reviews and sends email to the Department of Management, UNHQ, to confirm the final status of the PPA
Department of Management, UNHQ, checks submission status in IMDIS and seeks further clarification if required. PMD coordinates responses to DM
Responsibilities Overall responsibility: Subprogramme manager
Initiation: PMD
Execution: Division Chiefs, Heads of Offices, Section Chiefs, relevant professional staff; coordination by the PME focal point
Quality assurance: Subprogramme manager / PMD / DM
Further information Programme Performance Reporting for 2004-2005 Advisory note No. 3: Preliminary programme performance assessment, 25 August 2004
Programme Performance Reporting for 2004-2005 Advisory note No. 4: Lessons learned from monitoring and reporting 2003 - 2004, 28 September 2004
PMD memorandum on output delivery, description of results and accomplishment accounts, 17 January 2005
Monitoring Fact Sheet 6 – Preliminary Performance Assessment
MONITORING FACT SHEET 6
PRELIMINARY PERFORMANCE ASSESSMENT (PPA)
Advisory Notes No. 1 – Programme Performance Reporting Biennium 2008-2009, issued by Department of Management, UNHQ, 6 November 2008
Advisory Notes No. 2 – Procedures for Programme Performance Monitoring and Reporting for the 2008-2009 Biennium through the Use of IMDIS, issued by Department of Management, UNHQ, 18 December 2008
Workshop Materials: a guide to writing meaningful and accurate accomplishment statements, 23 October 2003
Last updated October 2009
Focus Subprogrammes and Executive Direction and Management (EDM)
Periodicity Month 24 of the programme cycle
Purpose To inform member States about the performance of each subprogramme against the Programme of Work.
To compile the PPR of the ESCAP programme as a whole (this feeds into the PPR of the UN Secretariat as a whole). The Secretary-General submits the UN-wide PPR to the General Assembly for review by Member States.
Deliverable The PPR is effectively an update of the Preliminary Performance Assessment (PPA) covering 24 months. The PPR includes:
Final output reporting, including supporting documentation
Final work months reporting
Final accomplishment account for each expected accomplishment together with supporting documentation such as evaluation reports
Completed IMDIS results information (cf. Monitoring Fact Sheet 6)
The PPR for ESCAP summarizes the following for all subprogrammes combined:
Highlight of programme results
Challenges, obstacles and unmet goals
Legislative reviews, external and internal evaluations
Process Based on the Preliminary Performance Assessment (PPA) completed at 21 months
Divisions finalize the output report (programmed, reformulated, postponed and terminated), the work months report, and the accomplishment accounts for 24 months in IMDIS, and attach further supporting documentation as available
Subprogramme managers approve the PPR together with Section Chiefs / Heads of regional institutions, and send a signed hard copy of the statements of accomplishments to PMD
PMD reviews the PPRs, coordinates the formulation of the PPR for ESCAP and confirms the final status of the PPR to DM
OIOS checks submission status in IMDIS and seeks further clarification if required. PMD coordinates responses to DM
Responsibilities Overall responsibility: Subprogramme manager
Initiation: PMD
Execution: Division Chief, Heads of offices, Section Chiefs, relevant professional staff; coordination by the PME focal point
Quality assurance: Subprogramme manager / PMD / DM
Further information Programme Performance Reporting for 2004-2005 Advisory note No. 4: Lessons learned from monitoring and reporting 2003 – 2004, 28 September 2004
PMD memorandum on output delivery, description of results and accomplishment accounts, 17 January 2005
Monitoring Fact Sheet 7 – Programme Performance Report
MONITORING FACT SHEET 7
PROGRAMME PERFORMANCE REPORT (PPR)
Advisory Notes No. 1 – Programme Performance Reporting Biennium 2008-2009, issued by Department of Management, UNHQ, 6 November 2008
Advisory Notes No. 2 – Procedures for Programme Performance Monitoring and Reporting for the 2008-2009 Biennium through the Use of IMDIS, issued by Department of Management, UNHQ, 18 December 2008
Workshop Materials: a guide to writing meaningful and accurate accomplishment statements, 23 October 2003
Last updated October 2009
Focus Project
Periodicity During project planning stage, prepared after the concept note is endorsed by the SMT
Purpose To provide a basis for the implementation, monitoring, reporting and evaluation of the project. Note: development of project documents should normally take place only after the concept note has been endorsed by the SMT
Deliverable Project Document with:
Executive summary, situation analysis, results framework, management arrangements, and inputs of ESCAP and other collaborating partners
Annex 1: Logical Framework
Annex 2: Project Work and Monitoring Plan (PWMP)
Annex 3: Budget
Annex 4: Summary page for e-TC
Process Once a concept note is endorsed by the SMT, the concerned Division/Office initiates a planning process for the project design and prepares a project document based on the programme of work (and other relevant mandates), following discussion with key partners and stakeholders
PMO provides support
Division/Office submits the project document to QAT
QAT undertakes technical appraisal of the project documents based on established criteria
The concerned Division/Office makes revisions in line with the QAT appraisal, verified by the respective Chiefs, for the Executive Secretary’s consideration
PMO provides support for the finalization of the project documents, as required. The Chief, PMD also verifies the document prior to submission to the ES
The Executive Secretary (ES) approves and signs the finalized project document, representing executive approval of the project
PMD submits the approved project document to the donor, normally along with a draft trust fund agreement for consideration
When funding is secured, the concerned Division/Office should post the signed project document to e-TC and the Planning, Monitoring and Evaluation Assistant posts it on IMDIS
Project revisions, including extensions, can be processed by PMD on request by the implementing office, provided that the project concept remains the same (and the donor agrees)
The updating of the project work and monitoring plan and information in e-TC, and the allocation of budget allotments, take place continuously during the project’s implementation and reporting phase
MONITORING FACT SHEET 8
PROJECT DOCUMENT
Monitoring Fact Sheet 8 – Project Document
Responsibilities Overall responsibility: Division Chief
Initiation: Project Officer
Execution: Project Officer and Section Chief
Quality assurance: SMT, QAT and PMD
Project Document form: annotated and template
Annex 1 – Logical Framework: template and example
Annex 2 – Work and Monitoring Plan: template and example
Annex 3 – Budget: template and example
Standard salary costs for direct budgeting of P and G staff (http://iseek.un.org/webpgdept1028_10.asp?dept=1028)
Last updated March 2010
Further information
Focus Project
Periodicity During project planning stage as part of the Project Document
Purpose To design and plan a quality project that meets the needs of member States and other beneficiaries
To provide a basis for the implementation, monitoring and reporting (Project Work and Monitoring Plan, ongoing monitoring, Progress Report, Terminal Report) and evaluation (Terminal Evaluation) of the project
To facilitate communication within the project team and with management and donors, and organizational learning during the project’s implementation
Deliverable Logical Framework attached as Annex 1 to the Project Document, with general information about the project (title, subprogramme, target countries, date, duration), followed by a matrix with four columns:
First column: project goal and outcome, followed by outputs and with a list of activities under each output (this is the ‘hierarchy of results’)
Second column: SMART* indicators for the project goal, outcome and each of the outputs
Third column: means of verification, i.e. how the SMART indicators will be measured
Fourth column: important assumptions, i.e. factors or risks that are beyond the control of the project
Hierarchy of results    Indicator    Means of verification    Important assumptions
Project goal
Outcome
Outputs
Activities
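The four-column matrix maps naturally onto a small data structure. The following Python sketch is illustrative only; the class and field names are assumptions for this guide, not part of any ESCAP tool:

```python
from dataclasses import dataclass, field

@dataclass
class ResultRow:
    """One level of the hierarchy of results with its three companion columns."""
    statement: str               # entry in the 'hierarchy of results' column
    indicators: list             # SMART indicators for this level
    means_of_verification: list  # how each indicator will be measured
    assumptions: list            # factors or risks beyond the project's control

@dataclass
class LogicalFramework:
    title: str
    goal: ResultRow
    outcome: ResultRow
    outputs: list = field(default_factory=list)  # activities are listed under each output

# Hypothetical example content
logframe = LogicalFramework(
    title="Example project",
    goal=ResultRow("Improved regional cooperation",
                   ["number of joint initiatives"], ["partner survey"],
                   ["partners remain engaged"]),
    outcome=ResultRow("Strengthened national capacity",
                      ["number of trained officials applying new skills"],
                      ["follow-up questionnaire"], ["trained staff remain in post"]),
)
logframe.outputs.append(
    ResultRow("Training workshop delivered", ["number of participants"],
              ["attendance records"], ["funding released on time"]))
```

Keeping indicators, means of verification and assumptions attached to each level of the hierarchy makes it easy to check that every result has at least one indicator before the framework is annexed to the Project Document.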
Process Relevant staff conduct a situation analysis, involving the identification and analysis of (a) stakeholders (needs, interests, potentials and weaknesses); (b) problems; (c) objectives; and (d) alternative intervention strategies
Relevant staff develop the project strategy or plan for the selected intervention in the format of a Logical Framework
Relevant managers (Section Chief, Division Chief, Head of Office away from Bangkok) review the plan
The logical framework is attached as Annex 1 of the Project Document (see Project Document fact sheet for detailed process)
The logical framework can be revised during the project cycle, as part of a project document revision, if major changes are made to expected project results, project activities, timeline or budget and the necessary donor approval has been obtained
MONITORING FACT SHEET 9
LOGICAL FRAMEWORK
Monitoring Fact Sheet 9 – Logical Framework
* SMART – Specific, measurable, attainable, realistic, time-bound
Responsibilities Overall responsibility: Division Chief/Head of RI/Head of Office
Initiation: Project Officer
Execution: Project Officer and relevant supervisors
Quality assurance: SMT/QAT and PMD
Project Document form: annotated and template
Annex 1 – Logical Framework: template and example
Project Planning, Monitoring and Evaluation Training Guide (http://iseek.un.org/webpgdept1028_10.asp?dept=1028)
Last updated March 2010
Further information
Focus Project
Periodicity During planning stage as part of Project Document
Updated, as required, at the start of the project, semi-annually (as part of the Progress Report), and in the event of significant changes in the project implementation schedule and activities that require an allotment revision
Purpose Management tool for Project Officers (similar to the plans that exist at Section and Division level)
To provide Division Chiefs and PMD with a transparent overview of important activities and timelines
Deliverable Project Work and Monitoring Plan (PWMP) that shows, under each expected output, the activities to be carried out over the entire project period with timelines and milestones. It shows planned monitoring and evaluation activities separately, for example the preparation of Progress Reports or obtaining feedback from workshop participants. Outputs and activities are the same as listed in the Logical Framework of the project
Process Project Officer (or the relevant manager in case no Project Officer is appointed yet) prepares the plan using a standard template in Excel, whereby outputs/activities are listed from top to bottom and months are listed from left to right.
The relevant manager reviews the plan
Project Officer attaches the plan as Annex 2 of the Project Document (see Project Document fact sheet for detailed process)
Project Officer updates the plan, if and as required, every six months by indicating for each activity ‘not started’, ‘in progress’, or ‘completed’ and/or by updating the months and deadlines for the activities. The (updated) plan is attached to the Progress Report (see Project Document fact sheet for detailed process)
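The plan is essentially a grid of outputs/activities (rows) against months (columns) with a status flag per activity. A minimal sketch of the semi-annual status update, assuming this row layout (the activity names and the update function are invented for illustration; the three statuses follow the fact sheet):

```python
# Each row of the Project Work and Monitoring Plan: an activity under an output,
# the months in which it is scheduled, and its current status.
pwmp = [
    {"output": "Output 1", "activity": "Prepare training materials",
     "months": ["Jan", "Feb"], "status": "completed"},
    {"output": "Output 1", "activity": "Deliver workshop",
     "months": ["Mar"], "status": "in progress"},
    {"output": "M&E", "activity": "Prepare Progress Report",
     "months": ["Jul"], "status": "not started"},
]

def update_status(plan, activity, new_status):
    """Semi-annual update: set an activity's status before attaching the plan
    to the Progress Report. Only the three statuses in the fact sheet are allowed."""
    assert new_status in {"not started", "in progress", "completed"}
    for row in plan:
        if row["activity"] == activity:
            row["status"] = new_status

update_status(pwmp, "Deliver workshop", "completed")
```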
Responsibilities Overall responsibility: Division Chief/Head of RI/Head of Office
Initiation: Project Officer
Execution: Project Officer and relevant supervisor
Quality assurance: SMT/QAT and PMD
Project Document form: annotated and template
Annex 2 – Work and Monitoring Plan: template and example (http://iseek.un.org/webpgdept1028_10.asp?dept=1028)
Last updated March 2010
MONITORING FACT SHEET 10
PROJECT WORK AND MONITORING PLAN
Monitoring Fact Sheet 10 – Project Work and Monitoring Plan
Further information
Focus Project
Periodicity During project planning stage as part of Project Document
Purpose Financial management tool for Project Officers
To provide operational Divisions, PMD/SMT and donors with a transparent overview of budgeted project costs
Deliverable Budget in two formats attached as Annex 3 to the Project Document
Output-based budget: budget items are grouped under each output and associated activities. There is currently a proposal to integrate the output-based budget into the work and monitoring plan
Expenditure-based budget: budget items are grouped under each type of expenditure (personnel, subcontracts/grants, training, equipment, miscellaneous)
Both budgets provide the total budgeted amounts for the entire project with a breakdown per calendar year
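Because both formats are groupings of the same line items, their totals must reconcile. A sketch of the two views (the line items and amounts are invented for illustration; only the five expenditure categories follow the fact sheets):

```python
from collections import defaultdict

# Each budget line carries both groupings: the output it supports and its
# expenditure category (personnel, subcontracts/grants, training, equipment, miscellaneous).
lines = [
    {"output": "Output 1", "category": "personnel",          "year": 2010, "amount": 40000},
    {"output": "Output 1", "category": "training",           "year": 2010, "amount": 15000},
    {"output": "Output 2", "category": "subcontracts/grants", "year": 2011, "amount": 25000},
]

def group_total(budget_lines, key):
    """Sum the same line items under a chosen grouping key."""
    totals = defaultdict(int)
    for line in budget_lines:
        totals[line[key]] += line["amount"]
    return dict(totals)

output_based = group_total(lines, "output")         # output-based budget
expenditure_based = group_total(lines, "category")  # expenditure-based budget

# Both views must add up to the same project total
assert sum(output_based.values()) == sum(expenditure_based.values()) == 80000
```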
Process Project Officer (or the relevant manager in case no Project Officer is appointed yet) prepares the budget in both formats using a standard template in Excel
The relevant manager reviews the budget
Project Officer attaches the budget as Annex 3 of the Project Document (see Project Document fact sheet for detailed process)
The budget may be revised during the implementation phase, which may require approval by the donor
Responsibilities Overall responsibility: Division Chief/Head of RI/Head of Office
Initiation: Project Officer
Execution: Project Officer and relevant supervisor
Quality assurance: SMT, QAT and PMD
Project Document form: annotated and template
Annex 3 – Budget: template and example (http://iseek.un.org/webpgdept1028_10.asp?dept=1028)
Last updated March 2010
MONITORING FACT SHEET 11
PROJECT BUDGET
Monitoring Fact Sheet 11 – Project Budget
Further information
Focus Technical Cooperation (TC) projects
Periodicity e-TC summary page at start of project
Updated semi-annually
Purpose To provide a clear overview of projects in the e-TC system. This database can be used to carry out analyses of, for example, past year trends of projects by size, donor or country
Deliverable e-TC summary:
Budget data and timeframe
General information: responsible implementing Division, Institution or Office, participating countries or regional groups, subjects (e.g. water, transport) etc.
Modalities (e.g. advisory services, training, pilot projects)
Relevant MDGs (e.g. reduce child mortality)
Relevance to gender
Process Division/RI/Office (Project Officer, Section Chief or other) prepares the summary page for e-TC and includes it as Annex 4 of the Project Document (see Project Document fact sheet for detailed process)
Project Officer updates the information in e-TC every six months and submits supporting documents to PMD, such as the Project Document, Progress Reports, Terminal Report, reports resulting from the Terminal Evaluation of the project, and major publications of the project
Responsibilities Overall responsibility: Division Chief/Head of RI/Head of Office
Initiation: Project Officer
Execution: Project Officer and relevant supervisor
Quality assurance: PMD
e-TC An Introduction, 5 September 2003
e-TC User Manual (General User Version), March 2004.
Project Document form: annotated and template (http://iseek.un.org/webpgdept1028_10.asp?dept=1028)
Last updated March 2010
MONITORING FACT SHEET 12
SUMMARY PAGE FOR e-TC AND UPDATES
Monitoring Fact Sheet 12 – Summary Page for e-TC and Updates
Further information
Focus Technical Cooperation (TC) Projects funded through Extra-budgetary (XB) resources
Periodicity Allotments for XB projects are requested before the start of the project and, for continuing projects, in November for each following year
Revised allotments are also requested during the year if there are changes to the project schedule (work plan) or implementation strategy
Purpose Financial management tool for Project Officers (financial progress of their projects) and Managers (indicator of the amount of XB work to be delivered by the Division, Institution or Office)
To keep donors informed of progress of financial delivery
Deliverable Allotment with allocated funds for a particular year in five categories: 10 personnel, 20 subcontracts/grants, 30 training, 40 equipment, 50 miscellaneous. The allotment may also indicate funds for future years, if any (see limitations below)
Process Project Officer (or relevant manager in case no Project Officer is appointed yet) prepares the request for allotment or revised allotment using a standard template in Excel
Division Chief/Head of RI/Head of Office submits the allotment request to PMD
PMD reviews the request based on a set of criteria agreed with the donor (see limitations below) and approves the allotment in IMIS
Project Officer, through the certifying officer, commits expenditure and monitors against the allotted amounts in IMIS
Financial Services (ASD) issues a Financial Statement every six months of actual expenditures against the allotment for inclusion in the project’s Progress Report. (Financial figures are also available in e-TC and IMIS/IRFA)
Limitations:
1. Amount: Allotments are limited to funds that are actually received. It is not possible for the UN to “pre-finance”
2. Purpose: The categories and purpose of expenditures are limited to those approved in the project document
3. Time frame: The allotment is limited in time (generally one year until 31 December of the year, or less, as per the project document)
For each project, the above limitations are often further specified in the Trust Fund Agreement (TFA) signed with the donor in question. Copies of the TFA are available in PMD
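The three limitations amount to simple checks that a request should pass before an allotment is approved in IMIS. A hedged sketch of those checks (the function and field names are assumptions for illustration, not an ESCAP system):

```python
from datetime import date

# The five expenditure categories named in the fact sheets
APPROVED_CATEGORIES = {"10 personnel", "20 subcontracts/grants",
                       "30 training", "40 equipment", "50 miscellaneous"}

def check_allotment_request(requested, funds_received, categories, end_date, today):
    """Illustrative checks mirroring the three limitations in the fact sheet."""
    problems = []
    # 1. Amount: the UN cannot pre-finance; allotments are limited to funds received
    if requested > funds_received:
        problems.append("requested amount exceeds funds actually received")
    # 2. Purpose: expenditure categories must be among those approved in the project document
    if not categories <= APPROVED_CATEGORIES:
        problems.append("unapproved expenditure category")
    # 3. Time frame: generally limited to 31 December of the current year or earlier
    if end_date > date(today.year, 12, 31):
        problems.append("allotment extends beyond 31 December of the current year")
    return problems

issues = check_allotment_request(
    requested=50000, funds_received=40000,
    categories={"10 personnel", "30 training"},
    end_date=date(2010, 12, 31), today=date(2010, 3, 1))
# one problem reported: the requested amount exceeds the funds received
```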
MONITORING FACT SHEET 13
REQUESTS FOR ALLOTMENTS AND REVISED ALLOTMENTS*
* This fact sheet covers a request related to projects funded through XB sources only.
Monitoring Fact Sheet 13 – Request for Allotments and Revised Allotments
Responsibilities Overall responsibility: Division Chief/Head of RI/Head of Office
Initiation: Project Officer
Execution: Project Officer, relevant supervisor
Quality assurance: PMD
Further information Request for Issuance or Revision of Allotment form (http://iseek.un.org/webpgdept1028_10.asp?dept=1028)
Last updated March 2010
Monitoring Fact Sheet 14 – Project Progress Report
Focus Project
Periodicity Semi-annually (due on 31 July, covering January – June of the same year; and due on 31 January, covering July – December of the previous year)
Purpose To internally monitor and review whether the delivery of outputs of a given project is within planned timelines and budget, and record lessons learnt
To decide if corrective actions are needed and update the Project Work and Monitoring Plan accordingly
To inform donors of the status of project implementation and alert them to possible adjustments of activities and timeframe
To support monitoring of subprogramme performance
Deliverable Project Progress Report, including
Summary page with project management information (such as title, budget, team members, expected completion date, status of financial delivery, etc.)
Summary of progress: provides a brief assessment of progress made in achieving the outcome and goal of the project (outputs, indicators and extent to which outputs were achieved)
Comments, including lessons learnt: describes any changes made in implementing the project, lessons learnt, and other comments
Annex 1: Financial Statement, covering total allocation, previous year expenditure, and current year expenditure as of 30 June or 31 December
Annex 2: Certification of Delivery of Work Programme Outputs Relating to Technical Cooperation Project (for internal use only)
Documents available upon request: need to tick which of the following documents are available (i) summary evaluations of activities/publications, and (ii) media coverage / clippings
Process Project Officer prepares the Progress Report following a standard template
Relevant supervisor reviews Progress Report
Financial Services (ASD) issues a Financial Statement that is attached as Annex 1 to the Progress Report. Note: in exceptional cases, and as required by donors, e.g. ADB, EU, ESCAP may issue a tailor-made financial statement (drafted by the implementing office and approved by PMD)
Project Officer sends Progress Report to Division Chief/Head of RI/Head of Office and PMD for review and approval
PMD sends the Progress Report to the donor and posts it in e-TC
Project Officer sends a copy to Division Chief/Head of RI/Head of Office, the donor and the Planning, Monitoring and Evaluation Assistant so that the report can also be posted in IMDIS to support output reporting
Responsibilities Overall responsibility: Division Chief/Head of RI/Head of Office
Initiation: Project Officer
Execution: Project Officer, relevant supervisor, Financial Services (ASD)
Quality assurance: Division Chief/Head of Office/PMD
MONITORING FACT SHEET 14
PROJECT PROGRESS REPORT
Template: Project Progress Report
PMD memorandum on the revised progress and terminal report template, 10 November 2009 (including requirement to report on the delivery of project-related programme outputs) (http://iseek.un.org/webpgdept1028_10.asp?dept=1028)
Last updated March 2010
Further information
Focus Project
Periodicity At the end of the project, submitted within one month after the official end date of the project as per the approved project document, or a revision thereof.
Purpose To provide feedback to donors about the project’s overall performance
To support the Terminal Evaluation of large projects and projects that last over two years
To support monitoring of subprogramme performance
Closure of project account (financial closure)
Deliverable Project Terminal Report, including:
Summary page with project management information (such as title, budget, team members, completion date, etc.)
Project overall assessment, and summary of results (outputs, indicators and extent to which outputs were achieved)
Follow-up actions, comments, including lessons learnt: describes any changes made in implementing the project, lessons learnt, and other comments
Annex 1: Financial Statement (covering budget and actual expenditure for the entire project)
Annex 2: Certification of Delivery of Work Programme Outputs Relating to Technical Cooperation Project (for internal use only)
Documents available upon request: need to tick which of the following documents are available (i) summary evaluations of activities/publications and (ii) media coverage / clippings
Process Project Officer prepares the Terminal Report following a standard template
Section Chief/Head of RI/Head of Office reviews Terminal Report
Financial Services (ASD) issues a Terminal Financial Statement that is attached as Annex 1 to the Terminal Report. Note: in exceptional cases, and as required by donors, e.g. ADB, EU, ESCAP may issue a tailor-made financial statement (drafted by the implementing office and approved by PMD)
Project Officer sends Terminal Report to Division Chief/Head of RI/Head of Office and PMD for review and approval
PMD sends the Terminal Report to the donor and posts it in e-TC
Project Officer sends a copy to Division Chief/Head of RI/Head of Office and to the Planning, Monitoring and Evaluation Assistant so that the report can also be posted in IMDIS to support output reporting
Responsibilities Overall responsibility: Division Chief/Head of RI/Head of Office
Initiation: Project Officer
Execution: Project Officer, relevant supervisor, Financial Services (ASD)
Quality assurance: Division Chief / PMD
MONITORING FACT SHEET 15
PROJECT TERMINAL REPORT
Monitoring Fact Sheet 15 – Project Terminal Report
Project terminal report template
PMD memorandum on the revised progress and terminal report template, 10 November 2009 (including requirement to report on the delivery of project-related programme outputs) (http://iseek.un.org/webpgdept1028_10.asp?dept=1028)
Last updated March 2010
Further information
Purpose Organizational learning
To share findings with member States, donors and other external stakeholders
Accountability is not normally the main purpose of thematic evaluations
Focus Sectors, focusing on projects within a given sector, such as transport or energy
Funds, covering a cluster of projects and activities sponsored by a particular donor
Cross-cutting issues, such as gender, poverty eradication, or human rights
Modalities / methodological approaches, such as advocacy, or capacity development
Publications, such as the Economic and Social Survey of Asia and the Pacific
Services, such as administrative or programme support services provided by ASD and PMD
Budget Budgeted centrally by ESCAP, XB or RB funds
The budget size depends on the evaluation
Evaluation manager PMD
Quality assurance and support On an ad hoc basis, as requested by ESCAP:
OIOS
UNEG
Others
Step 1: Prepare evaluation plan and budget Thematic evaluations are ideally conducted within the first 18 months of the biennium to allow findings to be used for the preparation of the following Strategic Framework and Programme Budget
Step 2: Prepare terms of reference PMD Evaluation Officers prepare the terms of reference in conjunction with relevant Divisions and Offices away from Bangkok.
Evaluation criteria and additional criteria may not all be relevant
Step 3: Establish evaluation team PMD forms the team
The evaluation team normally consists of external consultants with evaluation experience and knowledge of the topic being evaluated
Step 4: Schedule and organize evaluation PMD schedules and organizes the evaluation
Step 5: Conduct evaluation The evaluation team conducts the evaluation
Step 6: Prepare draft report The Lead Evaluator prepares the draft report with input from other team members
Evaluation Fact Sheet 1 – Internal Evaluation: Thematic Evaluation
EVALUATION FACT SHEET 1
INTERNAL EVALUATION: Thematic Evaluation
Step 7: Review draft report Depending on the evaluation topic, a technical review is completed by internal stakeholders: Division Chiefs, Section Chiefs, Heads of Offices away from Bangkok, ESCAP staff, and/or PMD
PMD Evaluation Officers conduct the methodological review of the draft report
Step 8: Prepare management response PMD Evaluation Officer(s) will coordinate the management response (MR) by: a) requesting inputs to the management response from relevant divisions, institutions, offices; b) facilitating meetings, as required, with stakeholders to agree on an overall response to the evaluation.
The MR will be signed by the Chiefs of all Divisions, Institutions and Offices that have been involved in the formulation of the MR, and by the Executive Secretary.
The overall MR will be included as an insert at the beginning of the evaluation report. The detailed MR with follow-up actions will be included as an annex to the evaluation report.
Step 9: Share evaluation findings PMD issues the final evaluation report
The evaluation report will be posted on the ESCAP internet (external website) and intranet (internal website). The detailed MR with follow-up actions, expected completion dates and responsible units will be kept on record in PMD for monitoring purposes.
Other methods for sharing may be used in accordance with the Evaluation Guidelines; for example, PMD could host a briefing for staff on the findings of the evaluation
Step 10: Follow up and promote learning Incorporate actions for which the ES or other senior management is responsible in the annual work plan.
Share lessons of strategic importance with relevant members of the UN system.
Other methods for follow-up and learning, as suggested in the Guidelines
Previous thematic evaluations ESCAP’s capacity development approach – 2008
Economic and Social Survey of Asia and the Pacific – 2008
Last updated November 2009
Purpose External accountability to member States and donors
Organizational learning
Internal accountability
Focus Entire subprogramme or major components thereof: Divisions or Offices away from Bangkok
Budget Budgeted centrally, using appropriate XB and/or regular budget (RB) resources.
Subprogramme evaluations focused on regional institutions are treated as XB-funded projects and should be partly budgeted for in the respective institutional support accounts of regional institutions
Evaluation manager PMD
Quality assurance and support On an ad hoc basis, as requested by ESCAP:
OIOS
UNEG
Others
Step 1: Prepare evaluation plan and budget ESCAP aims to undertake up to two evaluations of subprogrammes or major components every biennium. Budget size depends on the evaluation
Step 2: Prepare terms of reference PMD prepares the terms of reference in conjunction with relevant Divisions or Offices away from Bangkok
Step 3: Establish evaluation team PMD and other organizational entities involved jointly agree on the selection criteria for the evaluator(s)
PMD appoints the evaluator(s)
The evaluation team generally consists of external consultants with evaluation experience and knowledge of the topic being evaluated
Step 4: Schedule and organize evaluation PMD schedules and organizes the evaluation
Step 5: Conduct evaluation The evaluation team conducts the evaluation
Step 6: Prepare draft report The Lead Evaluator prepares the draft report with input from other team members
Step 7: Review draft report Depending on the evaluation topic, a technical review is completed by internal stakeholders: Division Chiefs, Section Chiefs, Heads of Offices away from Bangkok, ESCAP staff, and/or PMD
PMD Evaluation Officers conduct the methodological review of the draft report
Evaluation Fact Sheet 2 – Internal Evaluation: Subprogramme Evaluation
EVALUATION FACT SHEET 2
INTERNAL EVALUATION: Subprogramme Evaluation
Step 8: Prepare PMD Evaluation Officer(s) will coordinate the management response (MR) by:management response a) requesting inputs to the management response from relevant divisions,
institutions, offices; b) facilitating meetings, as required, with stakeholdersto agree on an overall response to the evaluation
The MR will be signed by the Chief of all Divisions, Institutions and Officesthat have been involved in the formulation of the MR, and by the ExecutiveSecretary
The overall MR will be included as an insert at the beginning of theevaluation report. The detailed MR with follow-up actions will be included asan annex to the evaluation report
Step 9: Share PMD issues the final evaluation reportevaluation findings The Evaluation report will be posted on ESCAP internet (external website)
and intranet (internal website). The detailed MR with follow-up actions,expected completion dates and responsible units will be kept on record inPMD for monitoring purposes
Other methods for sharing in accordance with the Evaluation Guidelines, forexample PMD could host a briefing for staff on the findings of the evaluation
Step 10: Follow up and Incorporate actions for which the ES or other senior management is respon-promote learning sible in the annual work plan.
Share lessons of strategic importance with relevant members of the UNsystem
Other methods for follow-up and learning, as suggested in the Guidelines
Previous evaluations Evaluation of EPOC (thus relevant to subprogramme 8 subregional activitiesfor development) planned for 2010-2011
Last updated November 2009
Evaluation Fact Sheet 2 – Internal Evaluation: Subprogramme Evaluation
35
EVALUATION FACT SHEET 3
INTERNAL EVALUATION: Project Evaluation

Fact sheets are updated on a continuous basis. The latest versions are available on iSeek: (http://iseek.un.org/webpgdept1028_79.asp?dept=1028).

Purpose
External accountability to project donors and member States involved in or affected by the project
Organizational learning
Internal accountability

Focus
Individual projects
Project clusters

Budget
Project terminal and mid-term evaluations should be considered on a case-by-case basis.
In general, five percent of the operational project budget (i.e. net of Programme Support Costs (PSC)1) is a recommended amount for an evaluation.
The budget is determined by considering the relative size of the project budget, the scope of the evaluation and any other criteria applied by the project appraisal mechanisms at ESCAP.
Staff time for supervisory functions needs to be planned for.

Evaluation manager
PMD

Quality assurance and support
PMD
OIOS
UNEG
Others

Step 1: Prepare evaluation plan and budget
Project evaluations should be included in the project document, the annual work plan and the ESCAP Evaluation Plan.

Step 2: Prepare terms of reference
PMD prepares the terms of reference in conjunction with the relevant division or office away from Bangkok.

Step 3: Establish evaluation team
PMD agrees on the selection criteria for the evaluator(s) with the project implementing office(s).
PMD appoints the evaluator(s).
Normally only one evaluator is appointed to conduct the evaluation.
The evaluator is an external consultant with evaluation experience and knowledge of the topic being evaluated.
PMD and Division staff cannot be members of the team, in order to ensure the independence of the findings.

Step 4: Schedule and organize evaluation
PMD schedules and organizes the evaluation.

Step 5: Conduct evaluation
The evaluator conducts the evaluation.

Step 6: Prepare draft report
The evaluator prepares the draft report.

Step 7: Review draft report
The relevant Project Officer(s) and managers conduct a technical review of the draft report with inputs from relevant stakeholders.
The PMD Evaluation Officer(s) conducts the methodological review of the draft report.

Step 8: Prepare management response
PMD Evaluation Officer(s) will coordinate the management response (MR) by: a) requesting inputs to the management response from relevant staff and managers; b) facilitating meetings, as required, with stakeholders to agree on an overall response to the evaluation.
The MR will be signed by the Chiefs of all Divisions, Institutions and Offices that have been involved in the formulation of the MR, and by the Executive Secretary.
The overall MR will be included as an insert at the beginning of the evaluation report. The detailed MR with follow-up actions will be included as an annex to the evaluation report.

Step 9: Share evaluation findings
PMD issues the final evaluation report.
The evaluation report will be posted on the ESCAP internet (external website) and intranet (internal website). The detailed MR with follow-up actions, expected completion dates and responsible units will be kept on record in PMD for monitoring purposes.
Other methods for sharing in accordance with the Evaluation Guidelines; for example, PMD could host a briefing for staff on the findings of the evaluation.

Step 10: Follow up and promote learning
Incorporate actions for which the relevant project implementing office(s) is responsible in the annual work plan and the project work and monitoring plan.
Share lessons of strategic importance with relevant members of the UN system.
Other methods for follow-up and learning, as suggested in the Guidelines.

Previous evaluations
To date, no project evaluations have been completed in accordance with the Guidelines.

Last updated November 2009

1 The term PSC refers to a cost recovery mechanism for “indirect costs” associated with the implementation of projects. “Indirect costs” refer to work that is undertaken by central administration and management entities (i.e. PMD and ASD) to support the implementation of an extra-budgetary project.

Evaluation Fact Sheet 3 – Internal Evaluation: Project Evaluation
37
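The five-percent budget rule above can be illustrated with a short calculation. This is a sketch only: the 13 percent PSC rate used here is an assumed figure for illustration, not ESCAP policy, and the function name is invented.

```python
def recommended_evaluation_budget(total_project_budget, psc_rate=0.13):
    """Recommended evaluation budget: five percent of the operational
    budget, i.e. the total project budget net of PSC.

    psc_rate is an assumed, illustrative rate; the applicable PSC rate
    is set by the relevant UN cost-recovery policy, not by this sketch.
    """
    # Strip the PSC share charged on top of the operational budget.
    operational_budget = total_project_budget / (1 + psc_rate)
    return 0.05 * operational_budget

# A hypothetical $565,000 project with an assumed 13% PSC rate has an
# operational budget of $500,000, giving a recommended budget of $25,000.
print(round(recommended_evaluation_budget(565_000), 2))
```

Whether PSC is charged on top of the operational budget (as assumed here) or deducted from a gross total depends on the applicable cost-recovery arrangement, so the division step should be adapted accordingly.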
EVALUATION FACT SHEET 4
EVALUATIVE REVIEW: Evaluative Project Review

Fact sheets are updated on a continuous basis. The latest versions are available on iSeek: (http://iseek.un.org/webpgdept1028_79.asp?dept=1028).

Purpose
Organizational learning1

Focus
Individual projects
Project clusters

Budget
Project terminal and mid-term evaluations should be considered on a case-by-case basis.
In general, five percent of the operational project budget (i.e. net of Programme Support Costs (PSC)2) is a recommended amount for an evaluation.
The budget is determined by considering the relative size of the project budget, the scope of the evaluation and any other criteria applied by the project appraisal mechanisms at ESCAP.
Staff time for supervisory functions needs to be planned for.

Evaluative project review manager
Division or Office away from Bangkok

Quality assurance and support
Internal peers
PMD

Step 1: Prepare evaluative project review plan and budget
Project reviews should be included in the project document, the annual work plan of the relevant office(s) and the ESCAP Evaluation Plan.

Step 2: Prepare terms of reference
The Division or Office away from Bangkok prepares the terms of reference.

Step 3: Establish project review team
The Division or Office away from Bangkok agrees on the selection criteria for the evaluator(s) and appoints the evaluator(s).
Normally only one evaluator is appointed to conduct the evaluative project review.
ESCAP staff may not be part of the team for a review of a project implemented by the division, regional institution or office they work for.

Step 4: Schedule and organize evaluative project review
The review manager schedules and organizes the review.

Step 5: Conduct evaluative project review
The evaluator conducts the review.

Step 6: Prepare draft report
The evaluator prepares the draft report.

Step 7: Review draft report
The relevant Project Officer(s) and managers conduct a technical review of the draft report with inputs from relevant stakeholders.
The review manager conducts the methodological review of the draft report, with support from PMD.

Step 8: Prepare management response
In consultation with the division that managed the evaluative review, PMD Evaluation Officers will coordinate the MR to the evaluative review by: a) requesting inputs to the management response from relevant staff and managers; b) facilitating meetings, as required, with stakeholders to agree on an overall response to the evaluation.
The MR will be signed by the Chiefs of all Divisions, Institutions and Offices that have been involved in the formulation of the MR, and by the Chief of PMD. If specified in the TOR of the evaluative review (i.e. when the purpose of the evaluative review includes external accountability), the Executive Secretary will also sign the MR.

Step 9: Share review findings
The review manager issues the final evaluation report.
Review findings are shared within the ESCAP secretariat and posted on the internal website; where the purpose of the review is external accountability, they are also shared with other stakeholders, as appropriate.
Other methods for dissemination in accordance with the Evaluation Guidelines.

Step 10: Follow up and promote learning
In accordance with the Evaluation Guidelines, including:
Incorporating actions for which managers are responsible in their annual work plans. Similarly, staff responsible for projects may include relevant actions in project work and monitoring plans.
Updating the status of evaluation follow-up actions in a central intranet-based log.

Previous reviews
Trade and Investment Division: Forum for the Comprehensive Development of Indo-China (completed in 2009)

Last updated November 2009

1 The purpose of all evaluative reviews should be organizational learning, but some evaluative project reviews may also include accountability to external stakeholders.
2 The term PSC refers to a cost recovery mechanism for “indirect costs” associated with the implementation of projects. “Indirect costs” refer to work that is undertaken by central administration and management entities (i.e. PMD and ASD) to support the implementation of an extra-budgetary project.

Evaluation Fact Sheet 4 – Evaluative Review: Evaluative Project Review
39
EVALUATION FACT SHEET 5
EVALUATIVE REVIEW: Peer Review

Fact sheets are updated on a continuous basis. The latest versions are available on iSeek: (http://iseek.un.org/webpgdept1028_79.asp?dept=1028).

Purpose
Organizational learning1

Focus
Cross-cutting issues, such as gender mainstreaming in project planning and implementation
Modalities/methodological approaches, such as training, Accomplishment Accounts, expert group meetings, or project terminal reporting
Services, such as administrative or programme support services provided by ASD and PMD
Publications, such as the M&E Overview

Budget
A budget is only needed if external consultants are hired to provide specialist advice or to facilitate the exercise. In this case the review is budgeted centrally by ESCAP, from XB or RB funds, and the budget size depends on the review.

Evaluative review manager
Any division or Office away from Bangkok

Quality assurance and support
PMD and internal peers

Step 1: Prepare evaluative review plan and budget
Peer reviews are ideally conducted within the first 18 months of the biennium to allow findings to be used for the preparation of the next Strategic Framework and Programme Budget.
Staff time must be allocated for conducting peer group evaluations, but financial resources may not be necessary.

Step 2: Prepare terms of reference
The evaluative review manager prepares the TOR for discussion with the peer group.

Step 3: Establish peer evaluative review team
The evaluative review manager initiates the formation of a peer group as the review team, based on nominations from divisions and/or Offices away from Bangkok.
The peer group selects a Lead Evaluator from within the group.
External consultants may be contracted to provide specialist advice on the topic under review or to facilitate the review process.

Step 4: Schedule and organize peer evaluative review
The review manager schedules and organizes the review process in coordination with the peer group.

Step 5: Conduct peer evaluative review
The peer group conducts the peer evaluative review.

Step 6: Prepare draft report
The Lead Evaluator prepares the draft report with input from other peer group members.
Amendments to the evaluation report template [Evaluation Tool 5] may be required, depending on the focus.

Step 7: Review draft report
Depending on the topic, the draft report is reviewed by relevant Division Chiefs, Section Chiefs, Heads of Offices away from Bangkok, ESCAP staff and PMD.
The review manager conducts the methodological review of the draft report, with support from PMD Evaluation Officers.

Step 8: Prepare management response
In consultation with the division that managed the evaluative review, PMD Evaluation Officers will coordinate the MR to the evaluative review by: a) requesting inputs to the management response from relevant staff and managers; b) facilitating meetings, as required, with stakeholders to agree on an overall response to the evaluation.
The MR will be signed by the Chiefs of all Divisions, Institutions and Offices that have been involved in the formulation of the MR, and by the Chief of PMD. If specified in the TOR of the evaluative peer review (i.e. when the purpose of the evaluative review includes external accountability), the Executive Secretary will also sign the MR.

Step 9: Share evaluative peer review findings
The review manager issues the final report.
The evaluative peer review is shared within the ESCAP secretariat.
The purpose of the evaluative peer review is internal learning only, not external accountability. The report is therefore usually not disseminated to external stakeholders, because external dissemination could make organizational entities or staff members reluctant to participate in peer group reviews.
Internal briefing sessions for ESCAP management and staff may be conducted to highlight important evaluation findings and recommendations.
Other methods for dissemination in accordance with the Evaluation Guidelines.

Step 10: Follow up and promote learning
Update the status of evaluation follow-up actions in a central intranet-based log.
Reviews of the same subject may be conducted periodically to ensure continuity in organizational learning.
Other methods in accordance with the Evaluation Guidelines.

Previous reviews
Peer Review of ESCAP activities for the promotion of the green growth approach (completed in 2009)

Last updated November 2009

1 External and internal accountability are not the purpose of reviews that are managed by a peer group.

Evaluation Fact Sheet 5 – Evaluative Review: Peer Review
41
MONITORING TOOL 1
SAMPLE RESULTS-BASED WORK PLAN

The sample is an excerpt from the work plan of PMD for 2009-2010.

I. STRATEGIC PROGRAMME PLANNING AT ESCAP

GOAL: [The intended long-term result to which the division intends to contribute.] Strategic programme planning at ESCAP is driven by a comprehensive approach that meets the needs and priorities of member States, taking into account UN system-wide coherence, ESCAP’s comparative advantages and the need to balance analytical, normative and technical cooperation activities.

Strategic programme planning area: Programme formulation

Expected Results 2009-2010 [The intended short-to-medium-term results that the division will achieve in a given period and which will contribute towards the achievement of the goal.]:
An integrated single programming framework for the analytical, normative and technical cooperation components of the work of each subprogramme is developed.
The draft Strategic Framework for 2012-2013 reflects the priorities of member States of the Commission, promotes synergies with the work of other UN entities, and taps ESCAP’s multidisciplinary strengths.
The secretariat [technical cooperation] programme documents are responsive to the needs of member States, in line with work programme priorities and designed to deliver results.

Main Activities [The short-term activities that the division will accomplish and which should contribute towards the achievement of the expected results.]:
Finalize the updated TC Strategy.
Organize a series of programme planning workshops (TCS & PPBES).
Provide inputs and support to operational Divisions in the development of [technical cooperation] programme documents, including results frameworks and outputs for the strategic framework 2012-2013.

Monitoring Tool 1 – Sample results-based work plan

43
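The work-plan hierarchy above (goal, expected results, main activities) can be represented as a simple data structure. This is an illustrative sketch only; the class and field names are invented, not ESCAP terminology.

```python
from dataclasses import dataclass, field

@dataclass
class WorkPlanArea:
    """One area of a results-based work plan (illustrative schema).

    A goal is the long-term result; expected results are the
    short-to-medium-term results contributing to it; main activities
    are the short-term work contributing to the expected results.
    """
    goal: str
    expected_results: list[str] = field(default_factory=list)
    main_activities: list[str] = field(default_factory=list)

# Hypothetical entry modelled loosely on the sample above.
plan = WorkPlanArea(
    goal="Strategic programme planning meets member State priorities",
    expected_results=[
        "Integrated single programming framework developed",
        "Draft Strategic Framework 2012-2013 reflects member State priorities",
    ],
    main_activities=[
        "Finalize the updated TC Strategy",
        "Organize programme planning workshops (TCS & PPBES)",
    ],
)
```

Keeping the three levels as separate fields preserves the results chain (activities feed results, results feed the goal) that the work-plan template is built around.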
MONITORING TOOL 2
SAMPLE WORK MONTHS REPORT FOR THE PERIOD OF 1 JANUARY – 30 JUNE 2009

[Subprogramme 2: Statistics. The table records work months per staff member (columns 1-11, plus Others and Total), with P-RB, P-XB, C-RB and C-XB sub-columns, by programme element. This page covers substantive servicing of meetings (the Committee on Poverty Reduction, reformulated by the 64th Commission as the Committee on Statistics; sessions of the Commission) and parliamentary documentation (Commission reports on issues related to statistics; reports of the Statistical Institute of Asia and the Pacific (SIAP); reports on poverty statistics, reformulated as Committee on Statistics reports). The figures are not reproducible from the extracted text.]

Monitoring Tool 2 – Sample work months report
44
[Table continued: expert groups, rapporteurs and depository services (expert group meetings on data exchange and sharing technologies and on data and data analysis for the MDGs; SIAP Governing Council); recurrent publications (Statistical Yearbook for Asia and the Pacific); non-recurrent publications (Assessment of the Progress Made in Achieving the Millennium Development Goals); and other substantive activities (ESCAP web page on statistics, ad hoc statistical information, regional studies on the availability and quality of development indicators, the Statistical Indicators for Asia and the Pacific web database, and the statistical appendix of the Economic and Social Survey of Asia and the Pacific).]

Monitoring Tool 2 – Sample work months report
45
[Table continued: training materials on various aspects of official statistics; web-based knowledge-sharing facilities related to targeted training courses, seminars and workshops; and training courses, seminars and workshops (regional management seminar for heads of national statistical offices; regional workshop on coordination of statistical training activities; regional/subregional training courses on the collection, compilation, processing, analysis and dissemination of broad-based official statistics; workshops on best practices in applying ICT for population and housing censuses; workshops on economic and environmental statistics, including national accounts).]

Monitoring Tool 2 – Sample work months report
46
[Table continued: regional/subregional workshops on implementation of new social-economic classifications and economic frameworks; workshops on social statistics, including poverty statistics, health statistics and statistics on migration and gender issues, with special reference to the MDGs; national seminars/workshops/training courses on country-identified aspects of official statistics; seminars/workshops on the implementation of global guidelines for population and housing censuses; and workshops/training courses on improvement of disability measurement and statistics in support of the Biwako Millennium Framework and the Regional Census Programme.]

Monitoring Tool 2 – Sample work months report
47
[Table continued: field projects (country pilot tests of disability question sets based on the International Classification of Functioning, Disability and Health (ICF)); sub-totals; items outside the work programme (activities related to XB project formulation (bilateral and others), development account project formulation/implementation (Section 36), Section 22 activities, non-output activities unrelated to the work programme, vacancy, leave, redeployment, others); and totals per staff member and overall.]

Monitoring Tool 2 – Sample work months report
49
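A work months report of the kind sampled above is essentially a matrix of staff members against programme elements, with row, column and grand totals. The following sketch shows one way such totals can be computed; the element codes and figures are invented for illustration, not taken from the actual report.

```python
# Hypothetical work-months matrix: programme element -> staff -> months.
work_months = {
    "2-1-101 Committee on Statistics: plenary": {"Staff 1": 0.10, "Staff 2": 0.15},
    "2-2-101 Statistical Yearbook":             {"Staff 2": 0.25, "Staff 3": 1.50},
}

def work_month_totals(matrix):
    """Return per-element totals, per-staff totals and the grand total."""
    per_element = {elem: sum(cells.values()) for elem, cells in matrix.items()}
    per_staff = {}
    for cells in matrix.values():
        for staff, months in cells.items():
            per_staff[staff] = per_staff.get(staff, 0.0) + months
    grand_total = sum(per_element.values())
    return per_element, per_staff, grand_total
```

A consistency check between the per-element and per-staff totals (both must sum to the same grand total) is a cheap way to catch data-entry errors in a report like this.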
MONITORING TOOL 3
SAMPLE ACCOMPLISHMENT ACCOUNT

SUBPROGRAMME 2: STATISTICS
ACCOMPLISHMENT ACCOUNT
(PROGRAMME ASSESSMENT FOR 18 MONTHS)
1 JANUARY 2008 – 30 JUNE 2009
EXPECTED ACCOMPLISHMENT 1
Setting:
Socio-economic policies that affect people's daily lives are all too often based on assumptions or on inadequate, or even incorrect, information. This is because relevant data, owing to a lack of national statistical capacity, are often either lacking altogether or of poor quality and not comparable with those of other countries or over time. ESCAP resolution 62/10 highlights the urgent need to strengthen the capacity of developing countries in the Asia-Pacific region to regularly produce and disseminate official statistics for monitoring social, economic and environmental conditions and providing sound evidence bases for policy-making and evaluation.
The activities under this expected accomplishment are designed to strengthen national statistical systems by promoting the development and use of international statistical standards and providing related technical assistance. The promotion of international statistical standards refers to the implementation of existing standards and the development of official statistics in emerging fields where the overall methodology is undeveloped.

The subprogramme provides forums for the exchange of experiences and good practices and the formulation of common regional positions to be presented at international statistical forums, especially the United Nations Statistical Commission, the highest standard-setting authority for official statistics. International standards allow governments to communicate in a globalizing world and citizens to compare how their countries perform in relation to others and over time. This is an important source of information for enabling citizens to hold governments accountable for designing adequate policies and delivering the expected results.

The statistics subprogramme was further strengthened by the re-establishment of the ESCAP Committee on Statistics, which will give impetus to strategic dialogue on statistics development issues in the region and guide the statistics development programme of the secretariat.
End-Users:
The immediate target group is the producers of official statistics in member and associate member countries. The users of statistics are also an important group, as they generate the demand and requirements for the continuing improvement of existing official statistics and development in new areas.
Monitoring Tool 3 – Sample Accomplishment Account
50
Expected accomplishment 1: Increased national capacity in Asia and the Pacific, particularly the least developed countries, to provide data required for measuring progress towards achieving internationally agreed development goals

Indicator of achievement: Increased number of national statistical systems, benefiting from ESCAP and Statistical Institute for Asia and the Pacific assistance, that are able to provide data according to international statistical standards for measuring progress towards achieving national and internationally agreed development goals, including the Millennium Development Goals
Results: Progress was made towards increasing national capacity in the Asia-Pacific region to provide data on vulnerable groups, namely disabled persons and people working in the informal sector. The number of countries with two or more data points for at least two thirds of all MDG indicators (excluding ODA-related and agriculture support indicators, as they do not measure data availability in developing countries) increased from 20 countries in 2007 to 24 countries in mid-2009.
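The data-availability criterion used above can be expressed as a simple check. The sketch below is purely illustrative: the indicator names and counts are invented, and the actual MDG indicator list and exclusions are those defined by the UN inter-agency monitoring framework, not this code.

```python
def meets_availability_threshold(points_per_indicator, min_points=2, share=2 / 3):
    """True if the country reports at least `min_points` data points for
    at least `share` of the tracked indicators.

    Excluded indicators (e.g. ODA-related ones) should simply be
    omitted from the input dictionary.
    """
    counts = list(points_per_indicator.values())
    covered = sum(1 for n in counts if n >= min_points)
    return covered >= share * len(counts)

# Hypothetical country tracking three indicators:
country = {"poverty_rate": 3, "primary_enrolment": 2, "under_five_mortality": 1}
print(meets_availability_threshold(country))  # True: 2 of 3 indicators covered
```

Counting the countries for which this predicate holds, in 2007 and again in mid-2009, yields the 20-to-24 comparison reported above.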
The project on the informal sector and informal employment collected participating countries' data sets, helping to fill major gaps in official statistics. ESCAP has been able to play a major role in facilitating the sharing of experiences among project countries at various stages of survey implementation and data processing.
ESCAP's contribution to the development of international standards on disability measurement has been tangible and internationally visible. Several countries have indicated their intention of adding the WG short question set on disability to their next census round: Viet Nam included it in its 2009 Census; Mongolia pilot tested it in preparation for its 2010 Census; and Sri Lanka has expressed interest in including the questions in its 2011 Census. In terms of surveys, six participating countries successfully conducted a cognitive test of the extended question set in April 2009, which contributed to agreement on a final questionnaire that was pilot tested between July and September. Regional workshops have helped expand the pool of national experts on the subject.
Workshops on census data processing and on International Economic and Social Classifications, together with the Task Forces on International Merchandise Trade Statistics and on Statistics on International Trade in Services, have helped clarify definitions and other relevant statistical matters among international agencies.

An EGM on gender statistics helped initiate a dialogue between the NWMs and NSOs, laying the basis for a longer-term partnership devoted to promoting gender equality. The EGM also provided feedback on the relevance of indicators to measure violence against women being evaluated and compiled by the FoC group, which presented its recommendations at the fortieth session of the UN Statistical Commission in February 2009.

The results from a survey on the effectiveness of SIAP's training indicated that the knowledge and skills acquired during the courses had improved participants' and their organizations' performance over a wide range of subjects. The Institute has also helped improve statistical training networks with NSOs, international organizations and national training institutes, strengthening the effectiveness of training activities in official statistics in Asia and the Pacific.
Key Events/Actions:
During the reporting period, ESCAP produced various assessments of recent statistics developments in Asian and Pacific countries, as well as papers on the new ESCAP strategy for technical cooperation in statistics development and a plan towards the development of economic statistics for the Committee on Statistics. The re-scheduled session of the Committee on Statistics was held in Bangkok from 4 to 6 February 2009 and was attended by 33 delegations, including many at the highest Government Statistician level, some of whom were of ministerial rank. Eighteen United Nations bodies and intergovernmental agencies also attended the session.
Monitoring Tool 3 – Sample Accomplishment Account
The Committee reviewed major issues, including: statistical development, regional cooperation and capacity building, gender statistics, economic statistics, vital statistics, statistics for measuring the progress of society, and programme planning for ESCAP's work on statistics. In conjunction with the Committee session, an exhibition on the Asia-Pacific region's statistical achievements and a side event on coordinating support for statistics development in Asia-Pacific were organized. The Committee exhibition was also displayed in conjunction with the fortieth session of the United Nations Statistical Commission (New York, 24-27 February).
As part of the follow-up activities to the Committee session, ESCAP serviced the first meeting of its Bureau during the fortieth session of the Statistical Commission, producing the following documents: terms of reference for the Technical Advisory Group on Economic Statistics; and key lessons learned from, and suggestions for, the next session of the Committee. Following the recommendations made during the Committee, ESCAP initiated work on establishing a technical advisory group and developing a regional action plan for the improvement of economic statistics in Asia and the Pacific through the organization of an Expert Group Meeting in September. Regarding vital statistics, ESCAP organized a planning meeting among international agencies in May 2009, as well as an Expert Group Meeting to develop a Regional Programme for the Improvement of Vital Statistics in Asia-Pacific in September 2009.
ESCAP continued implementing two United Nations Development Account projects: “Interregional Cooperation on the Measurement of Informal Sector and Informal Employment” and “Improvement of Disability Measurement and Statistics in Support of the Biwako Millennium Framework of Action and the Regional Census Programme.” As part of the proposed ESCAP regional census programme, ESCAP continued its collaboration with UNSD in promoting effective use of ICT for census management and data collection and processing. It also supported UNSD with other ongoing initiatives in the revision of global classifications and standards. Jointly with SSD, ESCAP explored the needs for developing a regional programme to improve the production and use of gender statistics.
The interregional informal sector project is led by ESCAP and implemented together with ECLAC and ESCWA in Mongolia, the Philippines, Sri Lanka, Saint Lucia and Palestine. In April 2008, ESCAP participated in the ESCWA-organized National Workshop on Informal Employment and Informal Sector Data Collection: Strategy, Tools and Advocacy. During the workshop, a project implementation plan and draft questionnaires were prepared by the Palestine Central Bureau of Statistics (PCBS) in collaboration with the resource persons. ESCAP, through SIAP, organized the Workshop on Informal Employment and Informal Sector Data Collection II: Evaluation, Processing and Utilization of Data from '1-2' Surveys on 14-16 May 2008. During this workshop the IHSN Microdata Management Toolkit was introduced as a tool to document 1-2 Surveys, creating synergy between the secretariat's informal sector and microdata management projects. By the end of the workshop, an action plan was designed to facilitate data analysis, with tasks for the project countries and ESCAP.
During the reporting period, progress was made in data collection in all project countries. With the exception of Sri Lanka, which started data collection in October 2008 and will collect data for four quarters, the other four project countries have completed data collection activities. Data checking, editing and preparation of datasets for analysis are ongoing in project countries. Documentation of the 1-2 Survey using the IHSN Microdata Management Toolkit has been initiated by Mongolia, the Philippines and Saint Lucia.
ESCAP provided direct technical support to project countries in Asia and the Pacific and assistance to the other implementing agencies on data processing, analyses and compilation of national accounts using 1-2 Survey data. It conducted two missions, to the Philippines in March 2009 and to Mongolia in April 2009, to assess the quality of the processed data, to assist with the preparation of datasets for analysis and estimation, and to assist with the derivation of informal employment and informal sector estimates based on the 1-2 Survey data collected. ESCWA conducted a mission to Palestine to review progress, re-examine the methods adopted at different stages of field work and discuss the content of the country report. ECLAC has been working closely with Saint Lucia to finalize the country report and to prepare for their regional workshop.
The project guidelines on using 1-2 Survey data to estimate annual HUEM (household unincorporated enterprises with at least some market production) and informal sector value added, and on integrating these estimates into the compilation of national accounts, were drafted by the project consultant. The guidelines were presented and discussed at the “Workshop on Estimating HUEM and Informal Sector Value Added Using 1-2 Survey Data” held in Bangkok from 25 to 27 May 2009 with the participation of all project countries. This meeting was held back-to-back with the workshop on “Informal Employment and Informal Sector Data Analysis, Tabulations and Country Reports” (28-29 May 2009). While not part of the project, Viet Nam participated in the workshops and shared its experience of conducting the 1-2 Survey in Hanoi and Ho Chi Minh City. ADB also took part, having joined the project Steering Committee in February 2008, and is implementing similar data collection, based on ESCAP's experience, in Armenia, Bangladesh and Indonesia.
Under the project on improving disability measurement and statistics, which was developed in direct response to the regional and international calls for better disability statistics (i.e., the Biwako Millennium Framework of Action and the United Nations Convention on the Rights of Persons with Disabilities), ESCAP organized, together with the Washington Group on Disability Statistics, the World Bank and WHO, a 'Regional Workshop on Promoting Disability Data Collection through the 2010 Round of Population and Housing Censuses'. The workshop was attended by senior statisticians, health and disability professionals, and representatives of disabled people's organizations from more than 25 countries of the Asia-Pacific region, along with experts from UNFPA, UNESCO and other specialized agencies. To support the workshop, the Training Manual on Disability Statistics, produced in collaboration with WHO, was officially launched at the event. The regional workshop on censuses was used to select countries to take part in the pilot test of a proposed extended question set for survey data collection. Seven out of fourteen interested countries were chosen to be included in the project.
Under the guidance of the Steering Committee (SC), the secretariat put together a task-team with national experts who took part in the previous ESCAP/WHO project and project coordinators from the pilot test countries. The team contributed to the development of the extended question set, the first version of which was presented and reviewed during the 8th Meeting of the Washington Group on Disability Statistics, in October 2008 in Manila. The Manila seminar also served as the first meeting of the disability task-team, which devised the next steps of the project.
In February 2008, ESCAP organized a “Training on Cognitive and Pilot Testing in the Context of Disability Statistics” attended by Cambodia, Fiji, Kazakhstan, Maldives, Mongolia, the Philippines and Sri Lanka, all countries taking part in the project. This training was followed by a cognitive (qualitative) test of the first version of the extended question set of the survey on disability. The results of the cognitive test were reviewed by ESCAP and the Washington Group on Disability Statistics during separate meetings held in May and June 2009, on the basis of which a final questionnaire to be pilot tested was agreed upon. In-country enumerator trainings took place between July and September 2009 in Cambodia, Kazakhstan, Maldives, Mongolia, the Philippines and Sri Lanka, prior to the pilot testing of the Washington Group/ESCAP extended question set on disability for surveys.
As part of the subprogramme's efforts towards improving national statistical capacity to produce, disseminate and analyze gender statistics, an Expert Group Meeting on Gender Statistics and the Use of Violence against Women Indicators in support of the CEDAW and the Beijing Platform for Action (BPfA) was jointly organized by ESCAP from 1 to 3 October 2008. The meeting brought together senior officials from National Women's Machineries (NWMs) and National Statistical Offices (NSOs) representing the governments of Afghanistan, Bangladesh, Cambodia, India, the Philippines, Samoa and Thailand. Experts from the Korean Women's Development Institute, UNDP, UNIFEM, OHCHR and the Secretariat of the Pacific Community (SPC) also participated in the meeting as resource persons.
ESCAP organized in Bangkok three regional workshops in collaboration with UNSD, namely on (i) the Revision of the International Recommendations for International Merchandise Trade Statistics, 9-12 September 2008; (ii) Census Data Processing: Contemporary Technologies for Data Capture, Methodology and Practice of Data Editing, Documentation and Archiving, 15-19 September 2008; and (iii) International Economic and Social Classifications, 24-27 November 2008. ESCAP also hosted the Task Force on International Merchandise Trade Statistics and the Task Force on Statistics on International Trade in Services on 10-11 March 2009, as well as their Joint Session on 12 March 2009.
Under the UNSD development account project “Building Statistical Capacity in the Low Income Countries of South East Asia”, ESCAP contributed to the subregional UNSD/UNWTO workshop on “Tourism Statistics for South East Asian countries” held in Vientiane from 16 to 19 June 2009. ESCAP is also supporting UNSD in the implementation of the development account project “Strengthening statistical capacity in support of progress towards the Internationally Agreed Development Goals in countries of South Asia”, having contributed to the workshop on “Latest Information Technologies in Knowledge Transfer” held in Dhaka from 15 to 18 June 2009.
During the reporting period SIAP conducted 37 courses, training 802 participants from 60 countries: 12 regional and subregional courses, including the research-based training programme, 4 subregional training courses and 8 country courses/workshops. The outreach programme component was attended by 522 participants. The outreach programme has traditionally had a majority of male participants, but the ratio of male to female participants was approaching parity. The Institute expanded its TMA-based training courses from 5 to 8 during the 18-month reporting period, focusing on the areas of Fundamental Official Statistics; Application of ICT in the Production and Dissemination of Official Statistics; Data Analysis, Interpretation and Dissemination of Official Statistics; and Collection and Analysis of Official Economic Statistics for Central Asian countries.
SIAP conducted the seventh seminar for heads of NSOs concurrently with the Conference on “Reshaping Official Statistics” of the International Association for Official Statistics. The seminar for chief statisticians provided opportunities to exchange ideas, share experiences, and analyze the major success factors in managing major national statistical activities. The Institute also continued its efforts to meet the high demand for training in the Pacific. The fourth subregional Course on Statistics for Pacific Island Developing Countries was conducted in October/November 2008 in collaboration with the Secretariat of the Pacific Community.
SIAP's fourth Training Course in Analysis, Interpretation and Dissemination of Official Statistics focused on social statistics, making participants well conversant with the latest international standards in social statistics, with special attention to indicators for monitoring and evaluating the achievement of the MDGs and human development concerns. The Institute also conducted five in-country training workshops on the MDGs and the use of administrative data systems for statistical purposes in Mongolia, Nepal, Palau, Sri Lanka, and Viet Nam. Seven other country courses were conducted in Indonesia, the Islamic Republic of Iran, Mongolia, the Philippines, Sri Lanka, Tajikistan, and Viet Nam.
SIAP conducted five distance training courses on a trial basis: three courses on Stata and two on Basics of the System of National Accounts (SNA). They were supported by JICA and the host Government of Japan using the JICA-net service, which directly connects the JICA office in Tokyo with those in overseas countries through videoconference facilities.
At its fourth session, SIAP's Governing Council endorsed its strategic plan 2010-2014, while also endorsing the terms of reference for the evaluation of the Institute, which aims at examining how well SIAP is meeting its objectives in improving the collection, use and analysis of statistics in Asia and the Pacific. In April 2009, SIAP conducted a training needs survey directed at all NSOs in Asia and the Pacific to review their: (i) priorities for skills development in their organizations; (ii) priorities for existing courses offered by the Institute; (iii) how SIAP's work could be strengthened; and (iv) opportunities for the Institute to work in partnership with other organizations, including training institutes, in the region.
During its sixty-fifth session, the Commission adopted resolution 65/2 on regional technical cooperation and capacity building in statistics development in Asia and the Pacific. The resolution aims at strengthening statistical capacity in the Asia-Pacific region and encourages countries to continue and increase their financial support to the Statistical Institute for Asia and the Pacific.
Challenge: Despite wide acknowledgement of the importance of statistics, many developing countries are still struggling to establish adequate institutional arrangements and find sufficient resources for developing national statistical systems that comply with international statistical standards, and for efficient management and operational practices. Retention of qualified staff, keeping up with the ever-more
demanding international statistical standards, and making efficient use of information and communication technology in statistical and other operations are some other major challenges that national statistical offices are continuously struggling with. Even relatively developed and well-staffed statistical systems (which involve many statistical offices and units in various government departments) often find it difficult to plan and develop in a strategic and coordinated manner.
During 2008-2009, several major statistical standards, such as the 2008 System of National Accounts (SNA) and associated revisions of manuals and classifications, will come under implementation. The challenge for the subprogramme in this regard is two-fold: first, to be effective in promoting the use of international standards, and secondly, to ensure that the perspectives and needs of developing countries and areas of the Asia-Pacific region are considered in the process of the development and implementation of standards. The support for the 2010 Round of Population and Housing Censuses will continue, as the Round provides a major opportunity for many developing countries to obtain some of the most basic social and demographic data for many MDG indicators.
The regional statistical community has welcomed the re-establishment of the Committee on Statistics. The challenge is to foster conditions for ensuring the commitment of members and associate members to building the Committee as an effective and useful forum for the advancement of official statistics. There is a need for the Committee to find ways of functioning effectively within its terms of reference as an apex regional body for statistical standard development and coordination in statistics. One of its roles is reviewing the strategic framework and work programme of SIAP, which has a Governing Council of its own. This poses a new challenge for the two legislative bodies to work in a mutually supportive way in directing the statistical capacity building components of the subprogramme.
Learning: The Committee on Statistics highlighted the strong need of Asia-Pacific member States for their own regional forum on statistics and statistical development. The secretariat has to further strengthen its role in facilitating and promoting regional cooperation, and in articulating the views and positions of member States at the global level, especially by maintaining its substantive and coordination advantage. By working closely with Bureau members, the secretariat has made significant progress towards implementing the decisions and recommendations of the Committee on Statistics.
Data collected by countries participating in the informal sector project were received with significant errors and delays, making it difficult for the project to run according to plan. Feedback obtained during the workshop revealed that even closer monitoring of its implementation at the national level is required. Inter-agency collaboration for this project is proving to be very useful in achieving more sustainable and widespread results, as demonstrated by ESCAP's collaboration with ADB, which, based on this experience, is implementing a similar project in Armenia, Bangladesh and Indonesia.
The data processing experience from Sri Lanka, in particular, has shown that the questionnaire and survey design may need some adjustments in the future if the 1-2 Survey methodology continues to be implemented in countries. The issue of integrating Phase 1 of the survey with the Labour Force Survey, rather than administering it as a separate questionnaire module, needs serious consideration in order to avoid complications regarding mismatches of codes between the schedules, especially in countries where schedules are still manually coded.
The collaboration with the Washington Group, UNSD, UNECE, WHO and the Budapest Initiative has been close and productive, enhancing ESCAP's international visibility as an organization that takes an inclusive approach in its technical cooperation work. While moving from the traditional medical approach towards the social model view of disability has been challenging for policymakers and organizations working with persons with disabilities, this inclusive approach helped countries adopt the recommended question set for censuses.
UNSD delivers a large number of workshops globally in collaboration with regional commissions. While this collaboration functions well in practice, not all activities can be programmed in advance and some may not fall under the sectoral statistical priorities selected for regional commissions. Nevertheless, the hosting of such workshops complements ESCAP's ongoing technical cooperation activities.
The experience in organizing the EGM on violence against women indicators confirmed once again that ESCAP is in a position to influence international standards in statistically challenging areas. The Meeting also highlighted the need for data producers and users to communicate openly and understand each other's needs better in order to generate and disseminate useful data for national programmes.
SIAP conducted a training needs survey for NSOs in Asia and the Pacific in April 2009. Among the main issues addressed were: (i) priorities for skills development in their organizations; (ii) priorities for existing courses offered by the Institute; and (iii) how SIAP's work could be strengthened. The high response rate (78 per cent) indicates that demand for official statistics throughout the region is increasing as countries become more aware of the importance of official statistics in policy making and in monitoring and evaluating its impact. Many gaps have been identified and SIAP is working towards addressing these needs.
Evaluation Tool 1 – Evaluation Terms of Reference Template
Terms of Reference for the [Title of the Evaluation]
DRAFT / FINAL DRAFT / FINAL
[Month, year]
Prepared by: OIOS / ESCAP / Division / Office
EVALUATION TOOL 1
EVALUATION TERMS OF REFERENCE TEMPLATE
CONTENTS
Page
1. INTRODUCTION ................................................................................................................... 59
1.1 BACKGROUND OF THE EVALUATION ................................................................. 59
1.2 PURPOSE, OBJECTIVES AND DELIVERABLES ...................................................... 59
1.3 SCOPE ............................................................................................................................... 59
2. METHODOLOGY ................................................................................................................... 60
2.1 METHODOLOGY ............................................................................................................ 60
2.2 LIMITATIONS ................................................................................................................. 60
3. TIME REQUIREMENTS AND TIMELINES .................................................................... 61
3.1 TIME REQUIREMENTS ................................................................................................. 61
3.2 TIMELINES ...................................................................................................................... 61
ANNEXES ........................................................................................................................................ 63
ANNEX I. CONTENTS OF THE EVALUATION REPORT ................................................ 63
ANNEX II. QUALITY CRITERIA USED TO REVIEW EVALUATION REPORTS ......... 65
ANNEX III. [OTHER] .................................................................................................................... 67
TOR for the [Title Evaluation] – Draft/Final draft/Final month, year
1. INTRODUCTION
1.1 Background of the evaluation
[Intro sentence: This is the terms of reference report of the evaluation of (subject) that is to be conducted between (month – month, year)].
[Brief background to the subject under evaluation – for details refer to annexes if required]
1.2 Purpose, objectives and deliverables
The purpose of the evaluation is to …
[Should address:
Who is the evaluation for? Is it for a particular donor or for member States? Or is it for ESCAP management or staff? Or both?
Why is the evaluation carried out? What triggered the evaluation? Is there a specific reason for choosing the timing of the evaluation?
How will the results be used? By being clear upfront how the results will be used (and sticking to this!) the evaluation manager can generate trust amongst all parties involved, in particular amongst ESCAP staff.]
The evaluation objectives are to:
Objective 1
Etc.
The outputs of the evaluation include:
Evaluation report
Etc.
[Describe the dissemination of the evaluation report, e.g.: The evaluation report will be printed in hard copy for dissemination within the ESCAP Secretariat and to the donor, and published on ESCAP's website: http://www.unescap.org/pmd/evaluation-reports.asp]
1.3 Scope
[The scope narrows the focus of the evaluation, for example geographical coverage, time period or target groups to be included]
The scope of the evaluation is defined as:
# #
The evaluation questions [see Evaluation Tool 3 for guidance on evaluation questions] include: [max 10]
# #
2. METHODOLOGY
[This chapter describes the evaluation methodology and limitations of the evaluation].
2.1 Methodology
[Description of methodology, covering, for example:
Activities, data collection methods
Method of data analysis
Timeframe (e.g. 3-day country visits)
Reasons for selecting sample reports, countries, sites, case studies, and interviewed stakeholders as a representation of the topic being evaluated
Other]
2.2 Limitations
The limitations [see Evaluation Tool 4 for more guidance] of the evaluation include:
# #
3. TIME REQUIREMENTS AND TIMELINES
This chapter provides the timeframe and budget of the evaluation. [Complete/amend the table below as required]
3.1 Time Requirements
[Include a breakdown of the estimated number of days that the evaluator(s) will need to complete each evaluation task]
TASK | ESTIMATED TIME REQUIREMENT
Desk review | 3 days
Develop evaluation plan or framework | 3 days
Develop and implement survey questionnaire | 5 days
Mission to Bangkok | 7 days
Presentation of preliminary findings | 0.5 days
Draft report | 5 days
Final report | 2 days
TOTAL | 25.5 days
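When adapting this table, it is easy for the TOTAL row to drift out of sync with the task estimates. As a purely illustrative check (not part of the template), the day counts from the sample table above can be summed; the task names and values below are simply those shown in the table:

```python
# Illustrative check of the sample time-requirements table.
# Task names and day estimates are copied from the template above.
tasks = {
    "Desk review": 3.0,
    "Develop evaluation plan or framework": 3.0,
    "Develop and implement survey questionnaire": 5.0,
    "Mission to Bangkok": 7.0,
    "Presentation of preliminary findings": 0.5,
    "Draft report": 5.0,
    "Final report": 2.0,
}

total_days = sum(tasks.values())
print(total_days)  # 25.5, matching the TOTAL row
```

The same check applies after any task estimate is amended: recompute the sum and update the TOTAL row accordingly.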
3.2 Timelines
[A detailed description of all tasks related to the evaluation process with an indication of whoor what entity is responsible and the deadline for completion]
TASK | RESPONSIBILITY | WHEN (insert date)
Gather background documents | Evaluation manager |
Brief evaluator/team | Evaluation manager |
Inception Report: finalize methodology | Evaluation manager or Evaluator/team | Prior to conducting the evaluation
Conduct the evaluation | Evaluator/team |
Submit draft evaluation report to the evaluation manager | Evaluator/team | Within one month after completing evaluation activities
Provide comments on draft evaluation report to evaluators | Relevant ESCAP staff, ESCAP management, PMD or OIOS (quality control), evaluation manager, and reference group (if established) | Within two weeks after receipt of draft evaluation report
Submit final evaluation report to the evaluation manager | Evaluator/team | Within two weeks after receipt of comments
Finalize evaluation report (layout, editing) | Evaluation manager |
Sign off on evaluation report | Evaluator(s) |
Formulate management response for inclusion as an annex in the final evaluation report | ESCAP management, coordinated by evaluation manager | Within one month after receipt of final draft evaluation report
Sign off on management response | ESCAP management |
Share evaluation findings | Evaluation manager and ESCAP management | Within one month after the management response is signed off
ANNEXES
Annex I. Contents of the Evaluation Report
The evaluation report should follow the structure as outlined in the table below [amend subheadings and number of pages as required]
CONTENT | PAGES (estimate) | COMMENTS
Title page | 1 | Title, date of publication; names of the evaluators; name of ESCAP or division that commissioned the evaluation; web page address where the report can be found electronically
Acknowledgments | 1 | Prepared by the evaluation team
Table of contents | 1 | List of chapters, sections and annexes
List of acronyms | 1-2 | In alphabetical order; these are written out in full the first time they are used in the report
Management response | | To be inserted by ESCAP management
Executive summary | 1-3 | Background of the evaluation (one paragraph); purpose and scope (one paragraph); methodology (one paragraph); main conclusions (one-sentence conclusions with brief explanation if needed); recommendations (one-sentence recommendations with brief explanation if needed); other comments or concluding sentence
1. Introduction | 1-3 | 1.1 Background of the evaluation and the topic being evaluated; 1.2 Purpose, objectives and outputs; 1.3 Scope (including evaluation questions)
2. Methodology | 1-3 | 2.1 Description of methodology: activities, timeframe, changes compared to TOR, and reasons for selecting sample reports, countries, sites, case studies, and interviewees as a representation of the topic being evaluated; 2.2 Limitations: limitations of the methodology and scope and problems encountered
3. Findings | Varying length | 3.1 General: supporting information for the performance assessment and other assessment, if required; 3.2 Performance assessment: assessment against relevant evaluation criteria (relevance, efficiency, effectiveness and sustainability); 3.3 Other assessment: assessment against relevant additional criteria (gender, rights-based approach, environmental sustainability, ESCAP priority countries and “one UN”)
4. Conclusions | 1-4 | Main conclusions, both positive and negative, of the evaluation that follow logically from the findings; ratings table with ratings for standard evaluation and additional criteria and a brief justification (optional)
5. Recommendations | 1-4 | Recommendations based on the conclusions, which can be addressed to ESCAP management, ESCAP staff, donors and other relevant stakeholders
Annexes | | I. Management response (to be completed by ESCAP management); II. Terms of reference; III. List of documents reviewed; IV. List of interviewees; other annexes as required (e.g. schedule of work undertaken by the evaluators, reports of meetings, interview summaries, questionnaires)
Annex II. Quality criteria used to review Evaluation Reports
The draft and final draft evaluation reports will be assessed against the quality criteria listed below.
Quality Check | Description
The report meets the scope, purpose and objectives of the evaluation as stated in the TOR | The report is tailored to the information needs of ESCAP and/or other entities that commissioned the evaluation; the report does not deviate from the scope outlined in the TOR; the report can be used by ESCAP for the intended purpose as stated in the TOR; the objectives, as outlined in the TOR, have been met, including: the assessment against relevant performance criteria (relevance, efficiency, effectiveness, sustainability, etc.) is complete, i.e. evaluation questions under each criterion have been answered
The report is structured logically | The report follows the table of contents outlined in the TOR and includes the relevant annexes
The evaluation methodology and its application are explained transparently and clearly | The evaluation methodology is clearly explained and has been applied throughout the evaluation process; amendments to the methodology compared to what was proposed in the TOR have been clearly explained; the limitations of the evaluation methodology, including problems encountered during the conduct of the evaluation, and their implications for the validity of the findings and conclusions have been clearly explained
The findings and conclusions are credible | Relevant qualitative and/or quantitative sources of information have been considered; analysis is done rigorously: triangulation is employed (cross-checking of findings against other relevant sources) and cause-and-effect relationships are explained; findings are adequately substantiated, balanced and reliable; the relative contributions of stakeholders to the results are explained; limitations are explained; the conclusions derive from the findings and are clear
The recommendations are useful | The recommendations are clear and follow logically from the conclusions; the recommendations are impartial; recommendations are realistic, concrete and actionable within a reasonable timeframe; recommendations for ESCAP should be clearly within the mandate of ESCAP
The report is well written | The executive summary is brief but highlights the key findings, conclusions and recommendations; the report uses consistent grammar and spelling (in accordance with UN rules); main messages are clearly distinguished from the text; the report is written in good English and is easy to read; the subject of evaluation (programme, project, other) is clearly described, including its logic model or results chain; the stakeholders of the programme or project are clearly identified
Annex III. [Other]
[Insert as required]
Evaluation Tool 2 – Sample Evaluation Logical Framework Model
EVALUATION TOOL 2
SAMPLE EVALUATION LOGICAL FRAMEWORK MODEL

Columns: Criteria | Key Question | Sub-Questions | Indicators | Source of information | Methods | Assumptions

Criteria: Relevance
Key Question: Was the green growth project relevant to the priorities of participating countries?
Sub-Questions:
- Is green growth a policy priority of participating countries?
- Do the participating countries find the activities of the project useful for enhancing national capacity?
- To what extent has the project adjusted to the changing needs or priorities of participating countries?
Indicators:
- # of countries indicating that green growth is a policy priority
- # of countries indicating activities were useful
- Level of project flexibility as indicated by participants
Source of information: Stakeholders: policymakers, NGOs and external partners
Methods: Interviews; Survey/questionnaire
Assumptions: Representative sample; Staff turnover; High/low response rate

Criteria: Effectiveness
Key Question: To what extent has the green growth project increased the knowledge and awareness of policymakers regarding green growth strategies and policy options to address climate change issues through the green growth approach?
Sub-Questions:
- How many policymakers actively participated in the policy forums?
- How many policymakers participated in the capacity building programme for policymakers?
- Was there a change in knowledge or awareness? By how much?
- How much website activity has taken place?
Indicators:
- # of participants
- Change in level of knowledge or awareness of participants regarding green growth
- Website activity
Source of information: Project reports; Programme/capacity building reports; Participants; Website
Methods: Review of forum and capacity building reports; Interviews with activity participants; Before/After survey comparison
Assumptions: Data gathered from baseline; Availability of reports; Willingness to participate

Criteria: Efficiency
Key Question: Were the project activities (e.g. forum and capacity building programme) the most economically efficient way of achieving the stated outputs, outcomes and goals?
Sub-Questions:
- Do the outputs justify the cost of the activities?
- Did project activities duplicate other similar initiatives by the participating countries or external partners?
- What alternatives exist?
Indicators:
- # of similar projects/activities of stakeholders and/or external partners
- Cost vs. benefit
- # and type of alternative projects
Source of information: Budget reports; Participating countries/stakeholders; Documents on alternative projects
Methods: Survey; Cost/benefit analysis of reports; Content analysis; Document review of alternative projects
Assumptions: Accessibility of reports; Time and resources to conduct review; High/low response rate

Criteria: Sustainability
Key Question: What is the likelihood that the green growth approach will be sustained within the participating countries?
Sub-Questions:
- What support mechanisms are in place to ensure that the green growth capacity building efforts or the results from the project are sustained?
- What commitments to institutionalizing the green growth approach have been made by participating countries?
Indicators:
- # of institutions implementing green growth efforts in this area
- # of institutions with allocated funds/budget
- # of participants who have committed time/indirect resources
Source of information: Participating country policy reports; Stakeholders/participants; Media articles; External partners
Methods: Interviews; Document review; Focus groups; Media review
Assumptions: Sustained green growth policies will result in benefits (as opposed to an alternative approach); Participating countries are willing to provide this information
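Teams sometimes maintain an evaluation logical framework electronically rather than in a word-processing table. The guidelines prescribe no software or file format, so the sketch below is purely illustrative: it captures the row structure of the sample model (Evaluation Tool 2) in a small Python data structure whose field names mirror the column headings, with example values drawn from the "Relevance" row. The class and field names are this sketch's own choices, not part of the ESCAP M&E system.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class LogframeRow:
    """One row of an evaluation logical framework (illustrative only).

    Field names mirror the column headings of the sample model:
    Criteria, Key Question, Sub-Questions, Indicators,
    Source of information, Methods, Assumptions.
    """
    criterion: str
    key_question: str
    sub_questions: List[str] = field(default_factory=list)
    indicators: List[str] = field(default_factory=list)
    sources_of_information: List[str] = field(default_factory=list)
    methods: List[str] = field(default_factory=list)
    assumptions: List[str] = field(default_factory=list)

# Example: a simplified version of the "Relevance" row of the sample model
relevance = LogframeRow(
    criterion="Relevance",
    key_question=("Was the green growth project relevant to the "
                  "priorities of participating countries?"),
    sub_questions=["Is green growth a policy priority of participating countries?"],
    indicators=["# of countries indicating that green growth is a policy priority"],
    sources_of_information=["Stakeholders: policymakers, NGOs and external partners"],
    methods=["Interviews", "Survey/questionnaire"],
    assumptions=["Representative sample", "High/low response rate"],
)
```

A list of such rows (one per criterion) can then be exported to a spreadsheet or shared with an evaluation team; nothing in the guidelines depends on this representation.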
EVALUATION TOOL 3
EVALUATION QUESTIONS UNDER EVALUATION CRITERIA
Evaluation Tool 3 – Evaluation Questions Under Evaluation Criteria
Standard evaluation criteria¹
Relevance: Appropriateness of objectives (of a theme or subprogramme) or outcomes (of a project) in terms of ESCAP’s priorities, Governments’ development strategies and priorities, and requirements of the target groups.
- Was the evaluation topic aligned with relevant policies at the national/regional level?
- To what extent is the evaluation topic in line with ESCAP’s programme of work?
- Has the evaluation topic taken into account previous evaluation findings (if applicable)?
- To what extent does the evaluation topic take into account and build upon the comparative advantages and on-going activities of partner organizations or agencies?
- How is the relevance of the evaluation topic perceived at ESCAP?
- To what extent has the evaluation topic taken into account the priorities of the UNCT and national development planning processes?
- Do the stakeholders find the objectives and results useful?
- Do any changes need to be made in order to reflect potential new needs and/or priorities?
Efficiency: Extent to which human and financial resources were used in the best possible way to deliver activities and outputs, in coordination with other stakeholders.
- To what extent has the evaluation topic been delivered in a cost-effective way?
- How was the evaluation topic managed in terms of timeliness? How can time management be improved?
- To what extent did activities under evaluation involve stakeholders of the evaluation topic (e.g. project/subprogramme partners, civil society, multilateral and bilateral donors)?
- Can the objectives be met in a more efficient way?
Effectiveness: Extent to which the expected objectives (of a subprogramme or theme) or outcomes (of a project) have been achieved, and have resulted in changes and effects, positive and negative, planned and unforeseen, with respect to the target groups and other affected stakeholders.
- To what extent have (or will) the planned outputs be achieved?
- What is the likelihood that the project/programme will contribute to the planned outcomes?
- To what extent does ESCAP promote a clear and coherent approach towards the evaluation topic?
Sustainability: Likelihood that the benefits of the subprogramme, theme or project will continue in the future.
- To what extent can positive outcomes resulting from the programme/project be continued without ESCAP’s further involvement?
- To what extent are the outcomes replicable?
- To what extent has support from other stakeholders, UN partners, donors or other multilateral or national partners been obtained to take forward project outcomes?
Additional criteria reflecting United Nations commitments
UN Coherence: Extent to which different United Nations agencies and other development partners operate in a coordinated and coherent way in the design and implementation of the subject of the evaluation. This could include utilization of structures in support of regional coordination such as the Regional Coordination Mechanism (RCM) and its Thematic Working Groups (TWG) and ensuring coherent approaches with UN Country Teams through Non-resident Agency (NRA) approaches.
- To what extent were UN agencies involved in the design and implementation of the evaluation topic?
- To what extent do activities under evaluation promote partnership with other UN agencies?
- What was the effect or result of coordinated efforts?
¹ United Nations Evaluation Group (UNEG), “Standards for Evaluation in the UN System”, April 2005 (available online at http://www.uneval.org).
Partnerships: The extent to which key stakeholders have been identified to be partners in the planning and delivery of a programme or intervention.
- To what extent was a stakeholder analysis completed and utilized to ensure partnership development in the design phase of the programme/project?
- To what extent was duplication of services avoided due to the development of effective partnerships?
Aid Effectiveness: In the context of the Paris Declaration and the Accra Agenda for Action (AAA), this refers to the streamlining and harmonization of operational practices surrounding aid delivery to developing countries to ensure enhanced aid effectiveness. This criterion also assesses the extent to which ESCAP has ensured that the programme or project is driven by the country or territory in which it is implemented or, in the regional context, by the member States, and the extent to which there is a focus on development results and mutual accountability in the design and implementation of the subject of the evaluation.
- To what extent were the targeted Governments involved in the planning and implementation of the project?
- To what extent do project stakeholders feel that their project was driven by the National Government and/or other stakeholders?
- To what extent were the efforts of similar projects coordinated?
Gender: Gender mainstreaming is one of the key strategies of UN-supported analysis and strategic planning. This criterion assesses the extent to which gender considerations have been incorporated in the design and implementation of the subject of the evaluation.
- To what extent was gender integrated into the design and implementation of the evaluation topic?
- To what extent does the evaluation topic regularly and meaningfully report on gender concerns in reporting documents?
- To what extent is the sustainability of gender concerns assured?
Human rights-based approach (HRBA): Extent to which a human rights-based approach (HRBA), an approach that mainstreams human rights principles throughout programming, has been utilized in the design and implementation of the subject of the evaluation.
- To what extent was a HRBA integrated into the design and implementation of the evaluation topic?
- To what extent is the sustainability of human rights concerns assured?
Environmental sustainability: Extent to which environmental sustainability considerations have been incorporated in the design and implementation of the subject of the evaluation.
- To what extent was environmental sustainability integrated into the design and implementation of the evaluation topic?
- To what extent is the sustainability of environmental concerns assured?
EVALUATION TOOL 4
COMMON EVALUATION LIMITATIONS
The table below provides examples of common limitations encountered and potential means for addressing each limitation.
Political

Limitations:
- The evaluation topic is politically sensitive
- Challenges when evaluation findings determine future funding
- Political pressures on the selection of evaluation topic, scope and methodology
- Dealing with unrealistic expectations as to what the evaluation can achieve
- Reconciling divergent stakeholder information needs and expectations of the evaluation
- Difficulties in involving stakeholders in the evaluation planning
- Lack of incentive by stakeholders to participate in an evaluation
- Working with stakeholders with little experience with or understanding of evaluation

Potential means for addressing the limitations:
- Through active engagement of stakeholders, identify potential barriers to wider stakeholder involvement and discuss ways to overcome the challenges.
- Engage stakeholders from the very beginning to ensure that all agree upon the topic under evaluation and the means or strategy for evaluating.
- Involve a reference group of external partners who can help to ensure a credible topic, scope and methodology.
- Ensure that all language is in layman’s terms and that, when something is not understood, additional resources, such as sample evaluation reports, are provided.
- Involve stakeholders in designing the evaluation and reconstructing the baseline when needed.

Organizational

Limitations:
- Staff involved in a project/subprogramme have left
- Staff involved in the evaluation have limited experience with evaluations
- Evaluation fatigue or resistance from ESCAP management or staff whose input is required for the evaluation
- Evaluation manager, OIOS/PMD providing quality assurance, and stakeholders involved in the evaluation are based at different locations

Potential means for addressing the limitations:
- Ensure that organizational support mechanisms are in place, i.e. guidelines, tool-kits, training, and that time is allocated to supporting staff.
- Emphasize the importance of evaluation as a learning tool, not as a means for assessing individual performance.
- Ensure that senior-level management expresses support for the evaluation.
- Utilize web/satellite-based communication programs to ensure that all stakeholders involved with the evaluation can communicate effectively.

Budget, time and resources

Limitations:
- Balancing demand for detailed coverage with resource constraints
- Resources too limited to apply a rigorous evaluation methodology
- Timeframes to complete the evaluation do not fit realistic timeframes
- Not enough time for an adequate consultant selection process
- Multiple or competing tasks combined with limited resources to carry out the evaluation
- Pressure to produce results too soon or in time for certain decision points or meetings

Potential means for addressing the limitations:
- Ensure that the evaluation is not overly ambitious by identifying the purpose and intended use of the evaluation.
- Simplify the evaluation approach by minimizing the evaluation criteria and limiting the number of evaluation questions.
- In evaluation, “rigor” is not synonymous with “expensive”. Discuss ways for ensuring rigor through data collection and analysis techniques.
- Ask for an extension or more resources!

Data

Limitations:
- Problems of data quality, availability, reliability
- Lack of baseline data or information to determine if a change occurred
- Limited resources for data or information collection
- Indicator constraints (e.g. because the evaluation was not considered during the planning stage)
- Difficulties of getting stakeholders to respond to surveys when they do not receive any benefit
- Over-reliance on interviews with limited stakeholders, resulting in risk of bias

Potential means for addressing the limitations:
- Invest time during the development of the TOR to ensure that the data collection methods are appropriate.
- Try to reconstruct the baseline through alternative data collection methods. (However, if this is not possible, refrain from making statements that attribute a specific change or result directly to the project/programme and rather focus on the contribution of the project/programme towards that specific change or result.)
- Go back to the programme logic model and reconstruct the design.
- Discuss the limitations in the report and refrain from making generalized statements.
- Assure stakeholders that responses are confidential and will help improve the delivery of programmes/projects.
- In order to minimize bias and build strong validity, ensure that multiple methods of data collection are utilized so that information can be triangulated, or compared against each other.

Attribution / contribution

Limitations:
- Lack of comparison group to determine if change occurred in areas/countries where ESCAP was not involved
- Difficult to demonstrate ESCAP’s contribution with increasingly complex partnerships
- Difficult to demonstrate ESCAP’s contribution when there are many steps between ESCAP’s activities (e.g. capacity building workshops) and outcomes (e.g. policy change)

Potential means for addressing the limitations:
- Ensure from the beginning that the TOR is not too ambitious in terms of demonstrating specific behavioral change or impact.
- Utilize alternative sources of information to establish whether there was a change within the target group.
- Analyze ESCAP’s contribution towards results through multiple lenses (social, political, institutional, etc.).

Other

Limitations:
- Cultural
- Language

Potential means for addressing the limitations:
- Ensure that cultural sensitivity and language abilities of evaluators are considered when establishing the evaluation team.
[Title of the Evaluation]
DRAFT / FINAL DRAFT / FINAL
[Month, year]
Evaluators: [Name of evaluators, starting with the lead evaluator]
Commissioned by: OIOS / ESCAP / Division
Management response completed by: [Date]
[Web page address where the report can be found electronically]
Evaluation Tool 5 – Evaluation Report Template
EVALUATION TOOL 5
EVALUATION REPORT TEMPLATE
ACKNOWLEDGMENTS
[Insert acknowledgments if appropriate, no more than 1 page]
[Month, year]
[Names of all evaluators]
Title Evaluation – Draft/Final draft/Final month, year
CONTENTS
Page
ACKNOWLEDGMENTS .............................................................................................................. 76
LIST OF ACRONYMS .................................................................................................................. 78
MANAGEMENT RESPONSE ..................................................................................................... 79
EXECUTIVE SUMMARY .............................................................................................................. 80
1. INTRODUCTION .................................................................................................................. 81
1.1 BACKGROUND OF THE EVALUATION ................................................................. 81
1.2 PURPOSE, OBJECTIVES AND OUTPUTS ......................................................... 81
1.3 SCOPE ................................................................................................................... 81
2. METHODOLOGY ..................................................................................................................... 82
2.1 METHODOLOGY ................................................................................................. 82
2.2 LIMITATIONS ...................................................................................................... 82
3. FINDINGS ................................................................................................................................. 83
3.1 GENERAL .............................................................................................................. 83
3.2 PERFORMANCE ASSESSMENT ........................................................................ 83
3.2.1 Relevance .................................................................................................... 83
3.2.2 Efficiency .................................................................................................... 83
3.2.3 Effectiveness ............................................................................................... 83
3.2.4 Sustainability .............................................................................................. 83
3.3 OTHER ASSESSMENTS ...................................................................................... 84
3.3.1 UN System Coherence ............................................................................... 84
3.3.2 Gender Mainstreaming .............................................................................. 84
3.3.3 Human rights-based approach .................................................................. 84
3.3.4 Environmental sustainability .................................................................... 84
3.3.5 Other ........................................................................................................... 84
4. CONCLUSIONS ..................................................................................................................... 85
5. RECOMMENDATIONS ....................................................................................................... 86
ANNEXES ........................................................................................................................................ 87
ANNEX I. MANAGEMENT RESPONSE ......................................................................... 87
ANNEX II. TERMS OF REFERENCE .............................................................................. 88
ANNEX III. LIST OF DOCUMENTS REVIEWED ......................................................... 88
ANNEX IV. LIST OF INTERVIEWEES ........................................................................... 88
ANNEX V. ETC. ................................................................................................................. 88
LIST OF ACRONYMS
ESCAP United Nations Economic and Social Commission for Asia and the Pacific
MANAGEMENT RESPONSE
[This section provides the response by ESCAP management to the evaluation and includes a response to the overall evaluation and to the specific recommendations made. The management response that includes the follow-up action plan will be included as an annex to the evaluation report. To ensure that recommendations that have been accepted by ESCAP management are acted upon, an evaluation follow-up action plan with responsible units and expected completion dates is submitted separately to the PMD (see Evaluation Tool 7: Management Response template).]
Overall Management Response to the Evaluation
[To be inserted by ESCAP management after the content of the evaluation report is finalized]
Management Response to Recommendations
[To be inserted by ESCAP management after the content of the evaluation report is finalized]
RECOMMENDATIONS | MANAGEMENT RESPONSE
1.
2.
Etc.
EXECUTIVE SUMMARY
[Note: the executive summary should ideally be 1-2 pages and not longer than 3 pages]
[Intro sentence: This report details the findings of the evaluation of (subject) that was conducted between (month – month, year)]
[One sentence / paragraph background or context of the subject under evaluation]
[One sentence / paragraph description of the evaluation purpose and focus/scope]
[One sentence / paragraph description of the methodology]
[Main conclusions of the evaluation, as listed in the conclusion chapter of the report – an explanatory sentence or paragraph may be included if required]
[Main recommendations of the evaluation, as listed in the recommendations chapter of the report – an explanatory sentence or paragraph may be included if required]
[Other comments or concluding sentence as appropriate]
1. INTRODUCTION
[Intro sentence: This chapter describes the background of the evaluation, and the evaluation purpose, objectives, outputs and scope, as outlined in the terms of reference (TOR) of this evaluation].
1.1 Background of the evaluation
[Intro sentence: this is the draft/final draft/final report of the evaluation of (subject) that was conducted between (month - month, year)].
[The evaluation was conducted by (name evaluators and their relation to ESCAP, e.g. independent consultants, ESCAP staff)]
[Brief background to the subject under evaluation - for details refer to annexes if required]
1.2 Purpose, objectives and outputs
[The purpose of the evaluation as outlined in the TOR]
[The evaluation objectives are to:
- Objective 1
- Etc., as outlined in the TOR]
[The outputs of the evaluation include:
- Evaluation report
- Etc., as outlined in the TOR]
[Describe the dissemination of the evaluation report, e.g.: The evaluation report will be printed in hard copy for dissemination within the ESCAP Secretariat and to the donor, and published on ESCAP’s website: www.unescap.org/evaluation]
1.3 Scope
[The scope of the evaluation, including evaluation questions as outlined in the TOR]
2. METHODOLOGY
[Intro sentence: This chapter describes the implemented evaluation methodology and limitations of the evaluation].
2.1 Methodology
[Description of methodology, covering, for example:
- Activities, data collection methods
- Timeframe (e.g. 3-day country visits)
- Changes to the methodology compared to the TOR
- Reasons for selecting sample reports, countries, sites, case studies, and interviewed stakeholders as a representation of the topic being evaluated
- Other]
2.2 Limitations
[Description of the limitations of the evaluation and problems encountered during the evaluation, presented in bullet format]
[Describe the overall implications for the validity of the evaluation findings]
3. FINDINGS
[Intro sentence: This chapter provides the findings of the evaluation in accordance with the evaluation criteria and questions]
3.1 General
[The purpose of this section is to provide supporting information for the performance assessment and other assessments. This section is only to be included if required, and the heading title may be amended. An example is the description of the results framework and implementation process of a project, programme or modality]
3.2 Performance assessment
[Delete / insert subsections as applicable]
3.2.1 Relevance
[Intro sentence, amend as required: The assessment against the relevance criterion refers to the consistency of intended objectives (of a subprogramme or theme) or outcomes (of a project) with ESCAP’s priorities, governments’ development strategies and priorities and requirements of the target groups.]
[Description of findings]
3.2.2 Efficiency
[Intro sentence, amend as required: The assessment against the efficiency criterion refers to the extent to which human and financial resources were used in the best possible way to deliver activities and outputs, in coordination with other stakeholders.]
[Description of findings]
3.2.3 Effectiveness
[Intro sentence, amend as required: The assessment against the effectiveness criterion refers to the extent to which the expected objectives (of a subprogramme or theme) or outcomes (of a project) have been achieved, and have resulted in changes and effects, positive and negative, planned and unforeseen, with respect to the target groups and other affected stakeholders.]
[Description of findings]
3.2.4 Sustainability
[Intro sentence, amend as required: The assessment against the sustainability criterion refers to the likelihood that the positive effects of the subprogramme, theme or project will continue in the future.]
[Description of findings]
3.3 Other assessments
[Delete / insert subsections as applicable]
3.3.1 UN System Coherence
[Intro sentence, amend as required: The assessment against the ‘UN system coherence’ criterion refers to the extent to which different UN agencies and other development partners have been involved in the design and implementation of the subject of the evaluation.]
[Description of findings]
3.3.2 Gender Mainstreaming
[Intro sentence, amend as required: The assessment against the gender criterion refers to the extent to which gender considerations have been incorporated in the design and implementation of the subject of the evaluation.]
[Description of findings]
3.3.3 Human rights-based approach
[Intro sentence, amend as required: The assessment against this criterion refers to the extent to which a human rights-based approach (HRBA) has been incorporated in the design and implementation of the subject of the evaluation.]
[Description of findings]
3.3.4 Environmental sustainability
[Intro sentence, amend as required: The assessment against the environmental criterion refers to the extent to which environmental sustainability considerations have been incorporated in the design and implementation of the subject of the evaluation.]
[Description of findings]
3.3.5 Other
[Intro sentence]
[Description of findings]
4. CONCLUSIONS
[Intro sentence: This chapter provides the conclusions of the evaluation, including general conclusions and conclusions relating to the specific performance and other criteria]
[Intro sentence to the main conclusions: The main conclusions are as follows:]
[One sentence conclusion][One sentence / paragraph description]
[One sentence conclusion][One sentence / paragraph description]
Etc.
5. RECOMMENDATIONS
[Intro sentence: This chapter provides recommendations based on the conclusions of the evaluation]
[Provide one-sentence numbered recommendations, followed by a short explanation. Recommendations should be concrete and action-oriented. It is also possible to provide more specific actionable recommendations underneath each general recommendation]
[Recommendation 1: one sentence recommendation]
[One sentence / paragraph description or more specific recommendations]

[Recommendation 2: one sentence recommendation]
[One sentence / paragraph description or more specific recommendations]
Etc.
ANNEXES
Annex I. Management Response
Title of Evaluation
Signature Date
Executive Secretary (or other management entity as appropriate)
Division Chief or Head of Regional Institution (as appropriate)
Division Chief or Head of Regional Institution (as appropriate)
General Remarks by Management
Report Recommendation | Management Response | Follow-up Action
1.
2.
Etc.
Annex II. Terms of Reference
Annex III. List of Documents Reviewed
Annex IV. List of Interviewees
Annex V. Etc.
EVALUATION TOOL 6
QUALITY CHECKLIST FOR EVALUATION REPORT

Quality Check: The report meets the scope, purpose and objectives of the evaluation as stated in the TOR
- The report is tailored to the information needs of ESCAP and/or other entities that commissioned the evaluation
- The report does not deviate from the scope outlined in the TOR
- The report can be used by ESCAP for the intended purpose as stated in the TOR
- The objectives as outlined in the TOR have been met, including: the assessment against relevant performance criteria (relevance, efficiency, effectiveness, sustainability, etc.) is complete, i.e. evaluation questions under each criterion have been answered

Quality Check: The report is structured logically
- The report follows the table of contents outlined in the TOR and includes the relevant annexes

Quality Check: The evaluation methodology and its application are explained transparently and clearly
- The evaluation methodology is clearly explained and has been applied throughout the evaluation process
- Amendments to the methodology compared to what was proposed in the TOR have been clearly explained
- The limitations of the evaluation methodology, including problems encountered during the conduct of the evaluation, and their implications for the validity of the findings and conclusions have been clearly explained

Quality Check: The findings and conclusions are credible
- Relevant qualitative and/or quantitative sources of information have been considered
- Analysis is done rigorously: triangulation is employed (cross-checking of findings against other relevant sources); cause-and-effect relationships are explained
- Findings are adequately substantiated, balanced and reliable
- The relative contributions of stakeholders to the results are explained
- Limitations are explained
- The conclusions derive from the findings and are clear

Quality Check: The recommendations are useful
- The recommendations are clear and follow logically from the conclusions
- The recommendations are impartial
- Recommendations are realistic, concrete and actionable within a reasonable timeframe
- Recommendations for ESCAP are clearly within the mandate of ESCAP

Quality Check: The report is well written
- The executive summary is brief but highlights the key findings, conclusions and recommendations
- The report uses consistent grammar and spelling (in accordance with UN rules)
- Main messages are clearly distinguished from the text
- The report is written in good English and is easy to read
- The subject of evaluation (programme, project, other) is clearly described, including its logic model or results chain
- The stakeholders of the programme or project are clearly identified

Evaluation Tool 6 – Quality checklist for evaluation report
EVALUATION TOOL 7
MANAGEMENT RESPONSE AND FOLLOW-UP ACTION PLAN TEMPLATE
A. Management response template
The general remarks by management and a management response (MR) to each recommendation of the evaluation or evaluative review are inserted at the beginning of the evaluation report (see Evaluation Tool 5: Evaluation report template).
The MR template below, with follow-up actions, will be included as an annex to the evaluation report, and the detailed follow-up action plan with the responsible units and expected completion dates should be submitted to PMD (see template B below).
Evaluation Tool 7 – Management Response Template
Title of Evaluation
Signature Date
Executive Secretary (or other management entity as appropriate)
Division Chief or Head of Regional Institution (as appropriate)
Division Chief or Head of Regional Institution (as appropriate)
General Remarks by Management
Report Recommendation | Management Response | Follow-up Action
1.
2.
Etc.
B. Follow-up action plan template
See below for the detailed follow-up action plan that includes the responsible units and the expected completion date. This detailed follow-up action plan will be used for internal purposes and must be submitted to PMD with the final evaluation or evaluative review report.
Title of Evaluation
Signature Date
Executive Secretary (or other management entity as appropriate)
Division Chief or Head of Regional Institution (as appropriate)
Division Chief or Head of Regional Institution (as appropriate)
General Remarks by Management
Report Recommendation | Management Response | Follow-up Action | Lead Unit | Collaborating Units | Expected Completion Date
1.
2.
Etc.
EVALUATION TOOL 8
EVALUATION PROCESS CHECKLIST
Step | Process questions

1. Prepare the evaluation plan and budget
- The evaluability of the programme/project in question was considered during its planning stages.
- The evaluation/evaluative review budget was developed during the planning stages and the proper approval/appraisal mechanisms were utilized.
- PMD was informed of the evaluation/evaluative review for inclusion in the ESCAP Evaluation Plan.

2. Prepare the Terms of Reference
- The TOR followed the outline provided in the Evaluation Guidelines.
- The TOR was specific and clear about the purpose, scope, objectives and timeframe of the evaluation.
- Stakeholders of the evaluation were consulted on the TOR.
- The TOR specified the skill requirements for the evaluator/evaluation team members.

3. Establish the evaluation team
- The TOR was distributed widely to support the identification of qualified consultant(s).
- Experience in evaluation and specialized experience in the topic of the evaluation (for example, statistics or trade) were considered in a balanced manner when the evaluation team was established.
- Personal competencies, such as language skills and the ability to work with diverse stakeholders, were considered.

4. Schedule and organize the evaluation
- The evaluation work plan was developed, outlining specific tasks, person(s) responsible and indicative timeframes.
- The evaluation team was briefed by stakeholders of the evaluation.
- An inception report was developed, if necessary, outlining the methodology and any necessary changes to the TOR based on the briefing.

5. Conduct the evaluation
- The evaluation was conducted by the evaluator or evaluation team in accordance with the TOR.
- The evaluation manager supported the team by providing relevant documentation, contact information and time for consulting on the findings.

6. Prepare the draft report
- The draft report was developed in line with the suggested structure presented in the Evaluation Guidelines.
- The evaluator presented the draft report to stakeholders.

7. Review the draft report and prepare the final report
- A technical review was completed by relevant programme or project officers and other stakeholders.
- A methodological review, or quality check, was completed by the relevant division supported by the PMD Evaluation Officers.
- The stakeholders provided advice on factual errors and/or validated the information presented in the report.
- The final report was prepared by the evaluator/team on the basis of all the comments provided and submitted to the evaluation manager.
8. Prepare the management response (MR)
- The report was submitted to PMD to allow the Evaluation Officers to coordinate the formulation of the management response (MR).
- The PMD Evaluation Officer(s) requested inputs to the management response from the relevant division(s) or office(s) away from Bangkok.
- The PMD Evaluation Officer(s) facilitated meetings, as required, with stakeholders to agree on an overall response to the evaluation.
- The MR was signed by:
  - Evaluations: the Chiefs of Divisions and Heads of Offices that were involved in the formulation of the MR, and the Executive Secretary.
  - Evaluative reviews: the Chiefs of all Divisions and Heads of Offices that were involved in the formulation of the MR.
- The PMD Evaluation Officers submitted the final MR to the evaluation manager.
- The detailed MR with follow-up actions, expected completion dates and responsible units was kept on record in PMD for monitoring purposes.

9. Share evaluation findings
- The evaluation manager included the overall MR as an insert at the beginning of the evaluation report. The detailed MR with follow-up actions (but not expected completion dates, etc.) was included as an annex to the evaluation report.
- The final report was submitted to PMD.
- The evaluation or evaluative review report was entered by the evaluation manager into the IT tool that tracks follow-up to evaluations.
- The evaluation report was posted by PMD on the ESCAP internet (external website) and intranet (internal website).
- The evaluative review report was posted by PMD on the ESCAP intranet.
- Internal briefing sessions were conducted, as relevant, by the evaluator, evaluation manager, PMD’s Evaluation Officers or other ESCAP staff.

10. Follow up and promote learning
- Relevant ESCAP staff implemented various activities, as described in the Evaluation Guidelines, to ensure that the evaluation/evaluative review was used to strengthen accountability and promote learning.
- The evaluation manager ensured regular updates of the status of follow-up actions in the IT system developed for tracking follow-up to evaluations/reviews.
Printed in Bangkok, May 2010 – 500
United Nations publication
Sales No. E.10.II.F.11
Copyright © United Nations 2010
ISBN: 978-92-1-120605-0