TRANSCRIPT
Sub-Regional Workshop on Managing for Development Results
for the Caribbean Technological Consultancy Services Network's Cooperating Institutions
Accra Beach Hotel & Spa, Barbados
November 9-13, 2015
Handbook on “Programme and Project Thinking Tools -
Seven Simple Steps”
Centre for International Development & Training (CIDT) University of Wolverhampton
Telford Innovation Campus, TF2 9NT Telford, Shropshire
UK Tel: 00 44 (0)1902323219
www.wlv.ac.uk/cidt
Compiled and written by: Philip N. Dearden with assistance from colleagues in the Centre for International Development and Training (CIDT), University of Wolverhampton, UK.
Tel: 00 44 (0)1902 323219
Email: [email protected] and/or [email protected]
Web: www.wlv.ac.uk/cidt
Acknowledgements
Many thanks to Michel Thomas, Operations Officer, Caribbean Technological
Consultancy Services (CTCS) Caribbean Development Bank for his assistance in
sourcing background materials for the Case Study in this Handbook.
© CIDT 2015
All rights reserved
CIDT encourages the fair use of this material. Proper citation is requested
Version: MSMEs Caribbean Development Bank – October 2015
Managing for Development Results (MfDR) Handbook on “Programme and Project Thinking Tools”
CONTENTS
INTRODUCTION TO MANAGING FOR DEVELOPMENT RESULTS .................. 3
STEP 1 STAKEHOLDER ANALYSIS; WHO ARE WE?.................................... 11
1.1 Why do we involve others? .................................................................... 11
1.2 Who do we need to involve? .................................................................. 12
1.3 Undertaking a Stakeholder Analysis ....................................................... 13
1.4 A note on the WEMSME case study....................................................... 14
STEP 2 PROBLEM ANALYSIS; WHERE ARE WE NOW? ............................... 17
2.1 Identifying Problems and Possibilities (the current situation) .................... 17
2.2 Developing a Problem Tree ................................................................... 17
STEP 3 OBJECTIVES AND OPTIONS ANALYSIS; WHERE DO WE WANT TO BE? ............................................................................................................... 20
3.1 Looking forward .................................................................................... 20
3.2 Developing an Objectives/Vision Tree .................................................... 20
3.3 Choosing between options .................................................................... 22
3.4 Linking with the logframe ....................................................................... 23
STEP 4 OBJECTIVES DESIGN; HOW WILL WE GET THERE? ...................... 24
4.1 Identifying our objectives ....................................................................... 24
4.2 The Objectives Column in the Logical Framework ................................... 26
STEP 5: RISK MANAGEMENT; WHAT MAY STOP US GETTING THERE? ..... 29
5.1 Managing Risk...................................................................................... 29
5.2 The Key Questions ............................................................................... 30
5.3 Undertaking a Risk Analysis .................................................................. 30
5.4 The Assumptions Column in the Logframe ............................................. 33
STEP 6. HOW WILL WE KNOW IF WE’VE GOT THERE? ............................... 37
6.1 Laying the foundations for Monitoring, Review and Evaluation ................. 37
6.2 Terms and principles............................................................................. 37
6.3 Constructing indicators and targets ........................................................ 40
6.4 Types of Indicators ............................................................................... 41
6.5 Identifying the Data Sources, the evidence ............................................. 43
STEP 7: WORK & RESOURCE PLANNING; WHAT DO WE NEED TO GET THERE? ........................................................................................................ 50
7.1 Preparing a Project Work Plan ............................................................... 50
7.2 Preparing a Project Budget.................................................................... 50
8. CONCLUSIONS.......................................................................................... 53
8.1 Using the Logical Framework ................................................................ 55
8.2 Nesting the Framework ......................................................................... 55
8.3 Useful References ................................................................................ 55
APPENDIX A: GLOSSARY ............................................................................ 56
APPENDIX B: SOME MYTHS ABOUT RESULTS BASED MANAGEMENT ...... 65
APPENDIX C: GROWTH IN THE USE OF RESULTS BASED MANAGEMENT. 66
APPENDIX D: PROJECT MANAGEMENT ...................................................... 67
APPENDIX E: SUMMARY OF THE LOGICAL FRAMEWORK .......................... 71
APPENDIX F: STRENGTHS AND WEAKNESSES OF THE LOGFRAME ......... 73
APPENDIX G: CATEGORIES OF OUTPUTS................................................... 78
APPENDIX H: ASSESSING PROJECT PERFORMANCE ................................ 79
APPENDIX I: PORTFOLIOS OF LOGFRAMES ............................................... 85
APPENDIX J: NESTING OF LOGFRAMES ..................................................... 86
APPENDIX K: THE LOGFRAME AS A COMMUNICATION TOOL ................... 88
APPENDIX L: REPORTING USING THE LOGFRAME; AN EXAMPLE ............. 89
APPENDIX M: AN EXAMPLE OF A SIMPLE LOGFRAME ............................... 93
APPENDIX N: EXAMPLES OF LOGFRAMES ................................................. 94
APPENDIX O: ENGENDERING THE LOGICAL FRAMEWORK ......................117
APPENDIX P: USEFUL REFERENCES .........................................................119
APPENDIX Q: RESULTS FRAMEWORK - RWANDA: PROGRAMME TO SUPPORT GOOD GOVERNANCE (PSGG) ....................................................123
APPENDIX R: MONITORING, REVIEW AND EVALUATION (MRE) FRAMEWORK BASED UPON THE LOGFRAME’S INDICATORS...................134
APPENDIX S: NEW DFID LOGICAL FRAMEWORK - FOREST MARKET GOVERNANCE AND CLIMATE .....................................................................135
APPENDIX T: COMPARISONS BETWEEN TERMINOLOGIES OF DIFFERENT DONOR AGENCIES FOR RESULTS / LOGICAL FRAMEWORKS ..................141
INTRODUCTION TO MANAGING FOR DEVELOPMENT RESULTS

Managing for Development Results (MfDR)1 is a management strategy which aims to improve transparency, accountability and effectiveness through:
• defining realistic expected results (outputs, outcomes and impact);
• monitoring progress towards their achievement;
• integrating lessons learned into management decisions;
• reporting on performance and outcome evaluation.
MfDR focuses on the achievement of expected results.
The logical framework (logframe) approach (LFA) is a process that supports MfDR. It provides vital ‘thinking tools’ that strengthen analysis and design during formulation, implementation and evaluation, keeping attention on results throughout the management process rather than solely on inputs and activities.
The term Project Cycle Management (PCM) is used to describe the management activities, tools and decision-making procedures used during the life of a project. This includes key tasks, roles and responsibilities, key documents and decision options. After a little introductory theory, this handbook introduces a number of very practical Programme and Project “Thinking Tools”. These have evolved over several decades to support teams undertaking project work, usually within a developing organisational culture of MfDR.

Rationale for MfDR
Governments and their citizens require evidence-based information showing that the use of public funds for development provides good value for money. Donor agencies require that all projects and programmes use an MfDR approach in order to provide evidence that public funds have led to sustainable results, and will not provide funding to organisations which do not or cannot utilise MfDR effectively.
MfDR improves transparency and accountability through emphasising outcomes and higher level change and requiring evidence of change.
MfDR enables identification of what works and what does not work and places great emphasis on lesson learning to inform planning for improved outcomes.
MfDR can be applied at project/programme, country and corporate levels.

What do we mean by Results?
A result is a describable and measurable change in state that is derived from a cause and effect relationship. This cause and effect relationship can be illustrated as a results chain.
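A results chain can be sketched in a few lines of code. The sketch below is a minimal illustration only: the chain levels are the handbook's terms, while the example entries are hypothetical and not drawn from any real project.

```python
# The five levels of a results chain, read from cause to effect.
RESULTS_CHAIN = ["inputs", "activities", "outputs", "outcome", "impact"]

# Hypothetical example entries (illustrative only).
example = {
    "inputs": "trainers, budget and training materials",
    "activities": "run business-skills workshops",
    "outputs": "women entrepreneurs trained",
    "outcome": "participants' businesses earn higher incomes",
    "impact": "improved livelihoods in the district",
}

# Each adjacent pair expresses one cause-and-effect link in the chain.
for cause, effect in zip(RESULTS_CHAIN, RESULTS_CHAIN[1:]):
    print(f"IF {example[cause]} THEN {example[effect]}")
```

Reading adjacent levels as "if ... then ..." statements is one common way to check that each level plausibly leads to the next.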
1 See Appendix A for a full definition of MfDR and the associated Results Based Management (RBM).
Benefits of MfDR
The benefits of MfDR are numerous:
Planning: Planning emphasises outcomes and higher level change and the
management of inputs, activities and outputs to achieve them.
Ownership: Broad participation by stakeholders, partners and staff builds ownership and understanding of the desired change and the steps to achieve it.
Efficiency & Effectiveness: The systematic collection, analysis and assessment of data related to performance improves decision making and adjustments for improvement.
Communication: Facilitates and encourages better communication of
performance
Reporting: Provides a systematic framework for reporting on results.
Lesson Learning: An emphasis on lesson learning can be most productive
and motivating and really help improve all of the factors above.
Projects, Programmes and Portfolios
Organisations use MfDR to increase the effectiveness of their projects, programmes and portfolios.
A project is a set of activities aimed at achieving clearly specified objectives within a defined time period and with a defined budget.
A programme is a group of related projects managed in a coordinated way in
order to secure improved results.
A portfolio is all the projects and programmes managed by an organisation or department.
MfDR and the Logical Framework Approach
Managing for Development Results provides the key concepts and principles that ensure any development work is effective in producing the required outcomes. The Logical Framework (LogFrame) Approach (LFA) provides a set of tools to put MfDR into practice during the planning and implementation phases of development projects. It involves identifying:
• strategic elements (activities, outputs, outcome and impact) and their causal relationships;
• indicators and evidence to measure performance;
• assumptions and risks that may influence success and failure.
The LFA is very widely used and influential in international development work2.
Development agencies, national governments, multilateral and bilateral partners, and
non-government organisations, use the logframe approach and in many agencies it is
mandatory practice. Likewise Results Based Management3 has become very
widespread. See Box 1.
Box 1 - Growth in the Use of Results Based Management (RBM)4
Early 1990s: Growing perception that aid programmes were failing to produce results.
Mid 1990s: RBM reforms implemented by government agencies in Australia, Canada, the UK, the USA and the Nordic countries; the Canadian International Development Agency (CIDA) introduced its Policy Statement on RBM.
Late 1990s: The World Bank was one of the first multilateral organisations to adopt RBM; UNDP and WFP were the first UN organisations to use RBM.
2000: The MDGs embodied a results-based approach to development, increasing pressure on UN organisations, bilateral donors and multilateral banks to use RBM.
2004: The UN General Assembly approved 9 benchmarks to measure progress towards implementation of RBM.
2005: The Paris Declaration outlined 5 principles for making development aid more effective: ownership, alignment, harmonisation, results and mutual accountability.
2006: The UN launched a pilot initiative, ‘Delivering As One’, aimed at increasing the coherence, effectiveness and efficiency of joint UN operations through establishment of one UN Joint Office in each of eight countries.
2008: The Accra Agenda for Action was designed to strengthen and deepen implementation of the Paris Declaration (PD) and set the agenda for accelerated achievement of the PD.
2011: The Busan Partnership for Effective Development Cooperation led to a working partnership between the OECD and the UN.
Mid 2014: The UN ‘Delivering As One’ initiative had been adopted in 37 countries.
2 See Dearden P. N. and Kowalski R. (2003) Programme and Project Cycle Management (PPCM): Lessons from South and North. Development in Practice, Vol 13 (5). http://www.ingentaconnect.com/content/routledg/cdip
3 Some common myths about RBM are presented in Appendix B.
4 A fuller history of RBM is presented in Appendix C.
A project and/or a programme should make a difference – it should bring about a clear result or change. All projects are ‘one-off’ initiatives to tackle a specific
problem or need(s). Agencies who fund projects want to bring about a change. This is what they will look for in the project design and want to see summarized in the logframe.
It’s important to remember that the Logical Framework is a summary of the project. Each box in the 4-by-4 matrix represents a simple question. In this guide, we will not only be looking at the content of each box but we will also be asking questions about the interrelationships between the answers given in the boxes (see Figure 2).

Figure 2 - The basic “traditional” 4 x 4 Logical Framework: A Simple Set of Questions about the Programme or Project
The four columns are Objectives/Narrative Summary, Indicators, Data Sources and Assumptions; the four rows are Impact/Goal, Outcome/Purpose, Outputs and Activities.

Impact/Goal
• Narrative Summary: What is the longer term, higher level overall objective or improved situation to which the project will contribute?
• Indicators: What are the key quantitative or qualitative indicators related to the overall objective?
• Data Sources: What are the sources of information for these indicators?
• Assumptions: What are the factors and conditions required for longer term sustainability?

Outcome/Purpose
• Narrative Summary: What are the specific and immediate beneficial changes to be achieved by the project?
• Indicators: What are the indicators showing whether and to what extent the project’s specific objectives are achieved?
• Data Sources: What are the sources of data and information for these indicators?
• Assumptions: What are the factors and conditions, not under the direct control of the project, which are necessary to achieve these objectives?

Outputs
• Narrative Summary: What are the concrete outputs (products and/or services) that must be delivered to achieve the Outcome/Purpose?
• Indicators: What are the indicators to measure whether and to what extent the project achieves the envisaged results and effects?
• Data Sources: What are the sources of data and information for these indicators?
• Assumptions: What external factors and conditions must be realised to obtain the expected outputs and results on schedule?

Activities
• Narrative Summary: What are the key activities to be carried out, and in what sequence, in order to produce the expected outputs/results?
• Inputs/Means: What are the means required to implement these activities, e.g. personnel, equipment, training, studies, supplies, operational facilities, etc.?
• Data Sources: What are the sources of information about project progress?
• Assumptions: What pre-conditions are required before the project starts? What conditions outside of the project’s direct control have to be present for the implementation of the planned activities?
It’s also important to note that any logframe intended for submission to a donor ideally needs to be in the donor’s preferred Logical Framework “style”. Despite many years of donor harmonisation efforts, there is NOT yet harmonisation with regard to preferred formats or terminology. A useful guide to the preferred terminology used by different donors, banks and other international organisations is given in Appendix T. Clicking on the Symbaloo5 link will take you directly to each donor’s Guidance and/or Handbooks for developing a Logical Framework in their own preferred format and style.
One key thing to understand at the outset is who is responsible for delivering what. See Figure 3.

Figure 3 – Control and Accountability (adapted from IFAD, 2002)
The degree of management control increases down the framework, from Impact/Goal (least control) to Activities (most control), and project accountability follows the same pattern:
• Impact/Goal: what the project is contributing towards
• Outcome/Purpose: what, overall, the project can reasonably be held accountable for achieving
• Outputs and Activities: what is within the direct control of management
The project team is responsible for delivering the Project Outputs. Put another way, these Outputs and the associated Activities are the Terms of Reference for the project team. The project needs to do all it can to achieve the Project Outcome or Purpose. The project can only contribute to the Impact/Goal; it is not responsible for delivering it.
5 http://www.symbaloo.com/mix/guidelines
The Project and Programme Thinking Tools Approach
Project and programme management and planning can be difficult at the best of times. When the project or programme involves a whole range of partners and agencies, it can be more difficult still.6 The “Programme and Project Thinking Tools” introduced in this handbook have evolved over several decades to support teams undertaking “project” work.

The term ‘project’ can be confusing. In essence, a project is a set of activities aimed at achieving clearly specified objectives within a defined time period and with a defined budget. The “Project Thinking Tools” can be applied at different levels of planning and decision-making: they can be used with a relatively small project, a higher-level programme or indeed a whole organisation. In this handbook, the term ‘project’ is intended to include these higher levels.
The process of developing the key “thinking tool” - a logical framework (logframe) - for a project includes developing thorough and clear plans with key partners7. The logical framework can help to organise the thinking within the project and to guide the process, with built-in mechanisms for minimising risks and for monitoring, reviewing and evaluating progress. Completed logical frameworks form the basis of a project plan and can be used as a reference tool for on-going reporting. The thinking tool approach is divided into two phases: analysis and design.
The Project “Thinking Tool Approach”
• Stakeholder analysis – identify who has an interest and who needs to be involved
• Problem analysis – identify key problems, causes and opportunities; determine causes and effects
• Objectives analysis – identify solutions
• Options analysis – identify and apply criteria to agree a strategy
• Developing the logframe – define project structure, logic, risk and performance management
• Activity scheduling – set a workplan and assign responsibility
• Resourcing – determine human and material inputs
6 For more background on projects and project management, see Appendix D
7 For more information on the strengths and weaknesses of the logframe approach, see Appendix F
Put another way, the “Project Thinking Tool” process helps guide the planning of a journey from where we are now, HERE, to where we want to go, THERE. It works through 7 core questions. This guidebook devotes a chapter to each question.
1 - Who are ‘we’? Who has an interest? Who should be involved?
2 - Where are we now? What are the problems? What are the possibilities?
3 - Where do we want to be? What are the options? What are our objectives?
4 - How will we get there? What activities do we have to undertake?
5 - What may stop us getting there? What are the risks and how can we manage them? What assumptions are we making?
6 - How will we know if we’ve got there? What are our indicators and targets? What evidence do we need?
7 - What do we need to get there? What detailed activities and resources are needed?
Figure 1: The Programme or Project Cycle

The cycle links an organisation’s Vision/Mission, Strategic Objectives and Plans to individual project ideas, and runs broadly as follows:
• Project Idea and Concept Note: strategic fit, profile, funding, team, timing
• Approval to Design
• Design using the “Project Thinking Tools”: Stakeholder Analysis, Problem Analysis, Risk Analysis, Logical Framework, Communication
• Project Approval: project information, profile, activities, time and work plans, finance and budgets
• Approval to Implement
• Implementation, with Monitoring, six-monthly Project Supervision Reports and an Early Review
• Project Completion Report (PCR)
• Evaluation: internal and/or external
• Lesson Learning, feeding back into strategy and new project ideas
STEP 1 STAKEHOLDER ANALYSIS; WHO ARE WE?
1.1 Why do we involve others?
Involving key partners in the early stages of project planning helps ensure commitment and ownership. This can help minimise tensions later on, and it pools knowledge and experience, helping to ensure the plan is as robust as possible. In a multi-agency project this early involvement is vital.
Effective engagement is likely to result in:
Improved effectiveness of your project. There is likely to be a greater sense of ownership and agreement of the processes to achieve an objective. Responsiveness is enhanced; effort and inputs are more likely to be targeted at
perceived needs so that outputs from the project are used appropriately.
Improved efficiency. In other words project inputs and activities are more likely to result in outputs on time, of good quality and within budget if local knowledge and skills are tapped into and mistakes are avoided.
Improved sustainability and sustainable impact. More people are committed to
carrying on the activity after outside support has stopped. And active participation has helped develop skills and confidence and maintain infrastructure for the long term.
Improved transparency and accountability as more stakeholders are given information and decision-making power.
Improved equity is likely to result if all stakeholders’ needs, interests and abilities are taken into account.
(Cartoon panels: what the experts proposed; what the government department specified; the design after review by an advisory committee; the final compromise design agreed; the system actually installed; what the people really wanted!)

Participation can have some simple but very important benefits!8
8 The original of this cartoon was published about 30 years ago. We have been unable to trace the cartoonist but
we would very much like to acknowledge them.
Participation is likely to have many benefits. But it is not a guarantee of success. Achieving participation is not easy. There will be conflicting interests that come to the surface; managing conflict is likely to be an essential skill.
Participation can be time consuming. And it can be painful if it involves a change in practice; for example in the way institutions have ‘always done things’.
Working out who needs to be involved, and what their input and interest is likely to be, needs to be done as early as possible. It should also be repeated in the later stages of the project, to assess whether the original situation has changed and whether the involvement of groups is being adequately addressed.
1.2 Who do we need to involve?
Analysing the stakeholders who need to be involved is one of the most crucial elements of any multi-agency project planning. Stakeholder analysis is a useful tool or process for identifying stakeholder groups and describing the nature of their stake, roles and interests.
Doing a stakeholder analysis can help us to:
identify who we believe should be encouraged and helped to participate
identify winners and losers, those with rights, interests, resources, skills and abilities to take part or influence the course of a project
improve the project sensitivity to perceived needs of those affected
reduce or hopefully remove negative impacts on vulnerable and disadvantaged groups
enable useful alliances which can be built upon
identify and reduce risks; for example identifying areas of possible conflicts of interest and expectation between stakeholders so that real conflict is avoided before it happens
disaggregate groups with divergent interests.

Stakeholder analysis needs to be done with a variety of stakeholders to explore and verify perceptions by cross-reference.
Some potential groups you may want to consider are:
Users groups - people who use the resources or services in an area
Interest groups - people who have an interest in or opinion about or who can affect the use of a resource or service
Winners and losers
Beneficiaries
Intermediaries
Those involved in and excluded from the decision-making process.
Another useful way of thinking about stakeholders is to divide them into:
Primary stakeholders. (Often the WHY, or target population, of a project.) They are generally the vulnerable. They are the reason why the project is being planned. They are those who benefit from, or are adversely affected by, the project. They may be highly dependent on a resource, service or area (e.g. a neighbourhood, a health clinic) for their well-being. Usually they live in or very near the area in question. They often have few options when faced with change.

Secondary stakeholders. (Often the HOW of reaching the Primary Stakeholders.) These include all other people and institutions with a stake, interest or intermediary role in the resources or area being considered. Being secondary does not mean they are not important; some secondary stakeholders may be vital as means to meeting the interests of the primaries.
It may be helpful to identify Key Stakeholders; primary and secondary stakeholders
who need to play an important active part in the project for it to achieve its objectives. These are the agents of change. Some key stakeholders are ‘gatekeepers’ who, like it or not, it is necessary to involve; otherwise they may have the power to block the project.
NOTE: Other meanings of the terms Primary and Secondary are used in some organisations. For example, Primary may refer to those directly affected, Secondary to those indirectly affected. This interpretation has generally been replaced by that above in order to emphasise a poverty focus.
1.3 Undertaking a Stakeholder Analysis
There are many different tools to help us to think about our stakeholders. Which ones are used depends upon the questions that need to be addressed. This example is one way (but not the only way) of doing a stakeholder analysis. There are several steps:
1. List all possible stakeholders, that is, all those who are affected by the project or who can influence it in any way. Avoid using words like ‘the community’ or ‘the Local Authority’. Be more specific, for example, ‘12 to 14 year olds’ or the ‘Youth Service’.
2. Identify, as thoroughly as possible, each stakeholder’s interests (hidden or open) in relation to the potential project. Note that some stakeholders may have several interests. (See Figure 1a.)
3. Consider the potential impact of the project on the identified stakeholders. Will the project have a positive or negative impact on them? (Award it + or - or +/- or ?).
4. Decide which stakeholder groups should participate at what level and when during the project cycle (see Figure 1b). Remember you cannot work with all groups all of the time. Complete participation can lead to complete inertia!
There are many other ways of doing a stakeholder analysis and many other factors that could be considered.
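The four steps above can also be sketched as a small script. This is an illustration only: the stakeholder entries are abridged from the WEMSME example in Figure 1a, and the helper function is our own invention, not a standard tool.

```python
# Steps 1-3 of the stakeholder analysis as simple data:
# (stakeholder, interests, impact), where impact is "+", "-", "+/-" or "?".
stakeholders = [
    ("Women entrepreneurs", ["improved livelihoods", "regular incomes"], "+"),
    ("Short term money lenders", ["regular markets"], "-"),
    ("Bankers and financial institutions", ["income from loans"], "?"),
]

def needs_attention(entries):
    """Flag groups on whom the project's impact is negative or uncertain
    (step 3), so that their participation can be planned in step 4."""
    return [name for name, _interests, impact in entries
            if impact in ("-", "+/-", "?")]

# → ['Short term money lenders', 'Bankers and financial institutions']
print(needs_attention(stakeholders))
```

Keeping the analysis in a structured form like this makes it easy to repeat in later project stages and compare against the original assessment.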
1.4 A note on the WEMSME case study
Figures 1a and 1b below give an example of a Stakeholder Analysis.
Throughout this Handbook we have used one case study to illustrate the stages in the “Project Thinking Tool” approach. This will help you to see how the “thinking tools” link together.
The case study is based on a small project: the Women’s Empowerment through Micro, Small and Medium Enterprises (WEMSME) project. We have removed some of the detail to make it more useful as a training case study and have therefore made the context fictitious. The project is based in Eralc District in a small country called Trohs.
The Project involved the Eralc District Government, the Government of Trohs and the Development Partners (donors) working together to support the development of successful micro and small businesses.
Figure 1a The WEMSME Project case study: Example of an initial Stakeholder Analysis
(Impact on each stakeholder is shown in brackets: +, -, +/- or ?)

Primary stakeholders
1. Women entrepreneurs: improved livelihoods through increased income generating opportunities (+); regular incomes (+)
2. Men entrepreneurs: improved livelihoods through increased income generating opportunities (+); regular incomes (+)
3. Women employed in green growth industry production or processing: improved income opportunities; safe working conditions; fair, direct pay not to husbands (+)
4. Medium and larger companies: more production, added value, higher prices and more reliable income (+); gains that outweigh production, environmental and employment restrictions (?); competition from MSMEs (-)
5. Small producer groups / cooperatives: access to markets; economy of scale; voice (+/?)

Secondary stakeholders
6. Ministry of Food and Agriculture (MFA) district level field staff: long-term job prospects and opportunities for skills development (+); safety and security (?/-)
7. Provincial MFA Chiefs: access to budget and capacity building; support in decentralised planning; political capital (+)
8. MFA at national level: delivery on national and local objectives; extra resource and support to the Administration (+/?); lesson learning (+)
9. WEMSME Implementing Partner: income through project management; success in delivery of results; future work prospects; capacity building opportunities for staff (+); security and safety of staff (-/?)
10. WEMSME Project staff: long-term job prospects and opportunities for skills development (+); safety and security (?)
11. Trader Associations: access to high value niche markets; consistent and reliable supply (+)
12. Traders and Suppliers: access to markets; consistent and reliable supply and market (+)
13. Short term money lenders: regular markets (-)
14. Market and Economic Researchers, e.g. University staff: good quality research and publications; lesson learning (+)
15. Other Green Growth Suppliers: regular markets (?/-)
16. Bankers and Financial institutions: income from loans (+); achievement of loan lending targets (?)
17. International Labour Organisation (ILO): achievement of objectives (+); lesson learning (?)
18. Development Partners/Donors: achievement of Country Plan objectives (+)
Figure 1b The WEMSME Project Case Study: Example of an initial Summary Participation Matrix

Action (columns): Inform | Consult | Partnership | Manage/Control

Identification: International Labour Organisation (ILO); Development Partners/Donors; Ministry of Food and Agriculture (MFA)
Planning: Small scale growers; Women; Medium and Larger companies; Development Partners/Donors; Producer Groups; District MFA Chiefs; Traders Association; Market and Economic Researchers; District Teams; MFA
Implementing and Monitoring: Development Partners/Donors; MFA; ILO; Implementing Partner; Small scale producers; Children; Women; Provincial MFA Chiefs; Project staff; Larger Producer Groups; Market and Economic Researchers; District Teams; Project Steering Committee
Evaluation: ILO; Implementing Partner; Project staff; Development Partners/Donors; Producer Groups; Small scale producers; Children; Women; Medium and Large scale producers; Provincial MFA Chiefs; MFA; Project Steering Committee; External Evaluators
STEP 2 PROBLEM ANALYSIS; WHERE ARE WE NOW?
2.1 Identifying Problems and Possibilities (the current situation)
The first step has helped us to identify who needs to be involved, how and when, in the initial design phase. With the right stakeholders on board, the focus now turns to analysing the situation and prioritising the way forward, using situation and option analysis to understand the current circumstances and develop possible choices for the future. The purpose of these activities is to develop a relationship of mutual respect and agreement between key stakeholders and to reach a position of collective understanding of the underlying issues and problems so that they can move on to the next stage.
There is no single right way to do this, and there are a number of options for working through the process; you should judge for yourself the best route to fit the context. This stage will include analysis of previous studies, research or evaluation material, perhaps documents that have led you to this stage or documents from other organisations. There may also be notes from earlier meetings that can inform the process. The exercise usually needs to be repeated with different stakeholder groups; often very different pictures of the situation emerge.
2.2 Developing a Problem Tree
Developing a problem tree is one way of doing problem analysis. Essentially this involves mapping the focal problem against its causes and effects.
Figure 2a The Problem Tree
(Diagram: the Focal Problem sits in the centre, with EFFECTS above it and CAUSES below it.)
• Turning the focal problem into a positive statement gives the Outcome/Purpose or Impact/Goal for the intervention.
• Addressing the causes of the problem identifies Outputs and Activities.
• Addressing the effects identifies the indicators.
Depending on the group or the situation, there are two methods for developing a problem tree. Start with a blank sheet of flipchart paper, pens and 2” x 2” post-its (or small cards and tape).
Method 1: “Brainstorming”
This method can be more creative, but it is risky; you can get tangled up.
• Participants “brainstorm” issues around a problem(s) as yet unidentified. Each issue is recorded on a separate post-it. Don’t stop to think or question; just scatter the post-its on the flipchart.
• When ideas for issues dry up, identify and agree the focal problem. It is probably there on the flipchart, but may need rewording. Note that a problem is not the absence of a solution, but an existing negative state.
• Sort the remaining issues into causes and effects of the problem.
• Cluster the issues into smaller sub-groups of causes and effects, building the tree in the process. Tear up, re-word and add post-its as you go.
• Finish by drawing connecting lines to show the cause and effect relationships.
Method 2: Systematic
Better suited to the more systematic and methodical.
• Participants first debate and agree the focal problem. Write this on a post-it and place it in the middle of the flipchart.
• Now develop the direct causes (the first level below the focal problem) by asking ‘but why?’. Continue with 2nd, 3rd and 4th level causes, each time asking ‘but why?’.
• Repeat for the effects above the focal problem, this time asking ‘so what?’.
• Draw connecting lines to show the cause and effect relationships.
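Either method ends with a tree of statements linked by cause and effect. The tree itself is a flipchart exercise, but it can be captured afterwards as a simple data structure for the project record. A minimal Python sketch, using example statements from the WEMSME case study (the class and field names are our own, not part of the method):

```python
# Illustrative sketch only (not part of the workshop method): after the flipchart
# exercise, the agreed tree can be recorded as a simple data structure.

class Node:
    """One post-it: a statement plus its lower-level causes (or higher-level effects)."""
    def __init__(self, statement):
        self.statement = statement
        self.children = []

    def add(self, statement):
        """Answering 'but why?' (causes) or 'so what?' (effects) adds a child node."""
        child = Node(statement)
        self.children.append(child)
        return child

    def show(self, indent=0):
        print(" " * indent + self.statement)
        for child in self.children:
            child.show(indent + 2)

focal = "Lack of network of green growth MSMEs"

# Causes grow downwards from the focal problem by repeatedly asking 'but why?'
causes = Node(focal)
finance = causes.add("MSMEs cannot easily access finances")       # but why?
finance.add("Poor range of subsidised products and services")     # but why? (2nd level)

# Effects grow upwards from the focal problem by repeatedly asking 'so what?'
effects = Node(focal)
performance = effects.add("Weak performance of existing and new MSMEs")  # so what?
performance.add("Decreased incomes of MSME owners")                      # so what? (2nd level)

causes.show()
effects.show()
```

Recording the tree this way keeps the cause and effect links explicit, which is useful later when reversing statements into objectives (Step 3).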
Figure 2b The WEMSME case study: Example of a Problem Tree

Focal Problem: Lack of network of green growth MSMEs

CAUSES:
• Lack of school and college education in innovation and enterprise development
• Weak knowledge base on MSME drivers and barriers to successful investment in green growth
• Poor range of subsidised products and services available
• Lack of useful policies in place
• Weak state planning and budgets
• Incentives not favourable to private sector investment
• Environment in place for MSMEs to access green growth support services and finance is not conducive
• Weak ICT infrastructure
• Lack of investment
• Weak MSME research capacity, lesson learning and documentation
• MSMEs cannot easily access finances
• Lack of culture for the encouragement of innovation and entrepreneurship
• Lack of evidence-based policies
• Lack of training and dissemination of best practices
• Lack of early warning systems

EFFECTS:
• Unstable prices in local markets
• Increased disparity in income and opportunity in relation to gender, disability, vulnerability
• Weak performance of existing and new MSMEs
• Disillusionment with efforts towards good governance
• Weak innovation and cooperation
• Decreased incomes of MSME owners and their households
• Lack of supply of goods and services from MSMEs
• Weak internal management and functioning of MSMEs
• Failure to achieve national objectives
• Lack of job security
• Lack of knowledge on MSMEs
• Lost opportunities to district and national economy
• Increased poverty
STEP 3 OBJECTIVES AND OPTIONS ANALYSIS; WHERE DO WE WANT TO BE?
3.1 Looking forward
Having defined the problem that we are trying to tackle we now need to develop this into objectives that we can work towards.
Some facilitators and participants prefer to skip Step 2, the Problem Tree, and move directly to an Objectives or Vision Tree. Instead of looking back, they look forward; rather than thinking in terms of negatives, participants imagine a desired situation in the future (this Focal Objective is placed in the centre of the flipchart). What is needed to achieve that situation? (placed below the Focal Objective). What would result from achieving the situation? (placed above).
Going directly to an Objectives Tree can be particularly useful in a post-conflict context where participants find analysis of the problem painful.
3.2 Developing an Objectives/Vision Tree
This can be done by reformulating the elements of our problem tree into positive desirable conditions. Essentially the focal problem is “turned over” to become the key objective for addressing the problem. In logical framework terms it may be the Impact/Goal or Purpose; discussed in more detail later. (So in our example, the problem of ‘Lack of Network of MSMEs’ could simply become an objective of ‘Network of MSMEs established’). Below the focal problem, you can continue
this “reversing” for each of the causes listed to create further objectives.
Above, if the problem is addressed one would expect to see changes in the effects, so there will be useful ideas here for potential indicators of progress and identification of the benefits to be achieved.
Figure 3a The WEMSME case study: An Objectives Tree derived from a Problem Tree
3.3 Choosing between options
This has now given us a number of options for our objectives and the group needs to decide which ones to focus on (Options Analysis). You should agree the
criteria for assessing the various options. Key factors here could include:
• Degree of fit with macro objectives (the bigger picture)
• What other stakeholders are doing
• The experience and comparative advantage of your organisation and partners
• The expected benefits, and to whom they accrue
• The feasibility and probability of success
• Risks and assumptions, and who is carrying the risk
• Financial criteria: costs, cashflows, financial sustainability
• Social criteria: costs and benefits, gender issues, socio-cultural constraints; who carries the social costs?
• Environmental criteria: the environmental costs and gains
• Technical criteria: appropriateness, use of local resources, market factors
• Institutional criteria: capacity, capacity building, technical assistance
• Economic criteria: economic returns, cost effectiveness
When the criteria have been set, a decision as to which option to take can follow.
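One simple way to support that decision, once the criteria are agreed, is a weighted scoring matrix. The Python sketch below is illustrative only; the criteria names, weights and scores are invented for this example, and in practice the group agrees them together:

```python
# Hedged sketch of an options scoring matrix. All weights and scores below are
# hypothetical; the group sets its own criteria and ratings.

criteria_weights = {
    "fit with macro objectives": 3,
    "feasibility and probability of success": 2,
    "expected benefits": 2,
    "manageability of risk": 1,
}

# Hypothetical scores, 1 (poor) to 5 (strong), per option per criterion.
options = {
    "Improved access to finance": {
        "fit with macro objectives": 4,
        "feasibility and probability of success": 3,
        "expected benefits": 4,
        "manageability of risk": 3,
    },
    "Improved ICT infrastructure": {
        "fit with macro objectives": 3,
        "feasibility and probability of success": 2,
        "expected benefits": 3,
        "manageability of risk": 2,
    },
}

def weighted_total(scores):
    """Sum of (criterion weight x option score) across all criteria."""
    return sum(criteria_weights[c] * s for c, s in scores.items())

ranked = sorted(options, key=lambda name: weighted_total(options[name]), reverse=True)
for name in ranked:
    print(f"{name}: {weighted_total(options[name])}")
```

The scores themselves matter less than the discussion they prompt; the matrix simply makes the group’s reasoning explicit and comparable.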
Figure 3b The WEMSME case study: Options Analysis
What then happens to options which you decide NOT to address? (In the example in Figure 3b, it has been decided, for whatever reason, not to focus on progressive policies, a culture of encouraging innovation, and improved ICT infrastructure.)
It may be that these options are being addressed by others in parallel with your project (in which case there will be a need for dialogue with those involved). If no one will be addressing them, and these root causes of the original problem are serious, they remain risks to our planned project and will need to be managed. We will return to this later.
3.4 Linking with the logframe
Sometimes it is possible to link the chosen options from the objectives tree into the first ‘objectives’ column of the logframe, as shown in Figure 3c.
It does not always work as neatly as in the example! It depends on the complexity of the original problem, and on the time spent on the problem analysis and its level of detail. Sometimes the original core problem translates into the Purpose (as here), sometimes into the Impact/Goal. The point is that your problem and objectives trees are important source documents for ideas; there are no hard and fast rules. In the example, a major effect of the original problem (Lack of Network of Green Growth MSMEs) has been used as the basis for the Outcome/Purpose, giving the project a social poverty focus.
Figure 3c The WEMSME case study: Linking with the logframe objectives.
STEP 4 OBJECTIVES DESIGN; HOW WILL WE GET THERE?
4.1 Identifying our objectives
We have defined our problem and begun to consider our objectives. Remember the Problem Tree and Objectives Tree are important reference documents at this stage. Work through a simple step-by-step approach.
Stage 1 - Define the Impact or Goal
The Impact or Goal is the higher order objective, the longer term positive change that the project will contribute to. Use only one Impact statement.
Some progress towards the Impact should be measurable during the lifetime of the project. The Impact defines the overall “big picture” need or problem being addressed; it expresses the justification, the ‘Greater WHY’, of what is planned. E.g. Increased economic opportunities for women and men and greater investments in green growth.
Stage 2 - Define the Outcome or Purpose
The Outcome/Purpose (together with its associated indicators) describes
the short and medium-term positive effects of the project. The Purpose is also a justification, a WHY statement. It needs to be clearly defined so all key stakeholders know what the project is trying to achieve during its lifetime. E.g. Network of green growth Small and Medium sized Enterprises (MSMEs) established.
Have only one Outcome/Purpose. If you think you have more, then you
may need more than one logframe; or your multiple Purposes are in fact Purpose indicators of a single Purpose as yet unphrased; or they are lower order outputs.
The Outcome/Purpose should not be entirely deliverable, i.e. fully
within the project manager’s control. If it is deliverable, then it should be an Output. The Outcome/Purpose usually expresses the uptake or implementation or application by others of the project’s Outputs; hence it cannot be fully within managerial control. ‘You can take a horse to water, but you can’t make it drink’. The project may be ‘delivering’ the water, but it cannot control the behaviour of others outside the team (the horse). So we aim for the Outcome/Purpose to be achieved but this cannot be
guaranteed. It will depend on stakeholders’ actions and assumptions beyond the control of the project manager. The manager can best exert influence over Outcome/Purpose achievement by maximising the completeness of delivery of the Outputs and mitigating against risks to the project.
The ‘gap’ between Outputs and Outcome/Purpose represents ambition. How ambitious you are depends on the context, on the feasibility of what you are trying to do, and on the likelihood that others outside managerial control will change their behaviour. Don’t make the Purpose unrealistically remote from the Outputs; conversely, don’t set them so close that, in reality, more could be achieved. The Outcome is not simply a reformulation of the Outputs.
Whoever will be approving the project proposal should be focusing their challenge on, and seeking justification for, the causal link between Outputs and Outcome.
When setting the Outcome, avoid phrases like ‘by’ or ‘through’ or ‘in order to’ or ‘so that’. They are confusing and usually mean the Outcome
includes objectives at more than one level. This detail will more appropriately be in other boxes of the logframe (e.g. indicators).
Stage 3 - Describe the Outputs
The Outputs describe what the project will deliver in order to achieve the
Purpose. They are the results that the project must deliver. They can be thought of as the Terms of Reference or Components for project implementation, the deliverables in the control of the project manager.
Outputs are things (nouns) and usually include Human Capacity, Systems, Knowledge and Information, Infrastructure, Materials and Awareness. E.g. a) Effective linkages; b) Market-oriented evidence; c) A coherent plan, etc. For more details see Appendix G.
Typically there are between two and eight Outputs; any more than that and the logframe becomes over-complicated.
Stage 4 - Define the Activities
The Activities describe what actions will be undertaken to achieve each
output. Activities are about getting things done so use strong verbs. E.g. Establish… Develop…
Stage 5 - Test the Logic from the bottom to the top
When the four rows of column 1 have been drafted, the logic needs to be tested.
Use the IF/THEN test to check cause and effect. When the objectives hierarchy is read from the bottom up, it can be expressed as:
• If we do these Activities, then these Outputs will be delivered.
• If we deliver these Outputs, then this Outcome/Purpose will be achieved.
• If the Outcome/Purpose is achieved, then this will contribute to the Impact.
The IF/THEN logic can be further tested by applying the Necessary and Sufficient test. At each level, ask: are we doing enough, or too much, to deliver, achieve or contribute to the next level objective?
As you test the logic, you will be making assumptions about the causal linkages. We will be looking at this in more detail shortly.
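The bottom-up IF/THEN reading can be sketched in code as well as on a flipchart. The following Python sketch uses a structure of our own devising (not an official logframe format), with abbreviated entries from the WEMSME example, to print the reading used in Stage 5 and apply two of the rules from the text:

```python
# Minimal sketch: hold the four levels of Column 1 in a dictionary and print the
# bottom-up IF/THEN reading used to test the logic. Entries are shortened from
# the WEMSME case study; the structure itself is illustrative.

logframe = {
    "Impact": "Increased economic opportunities and greater investments in green growth",
    "Outcome": "Network of green growth MSMEs established",
    "Outputs": ["MSMEs access finances", "Enabling environment in place"],
    "Activities": ["Conduct baseline study", "Design subsidised financial instruments"],
}

def if_then_reading(lf):
    """Express the objectives hierarchy as IF/THEN statements, bottom to top."""
    outputs = "; ".join(lf["Outputs"])
    activities = "; ".join(lf["Activities"])
    return [
        f"IF we do these activities ({activities}), THEN these outputs will be delivered ({outputs}).",
        f"IF we deliver these outputs ({outputs}), THEN this Outcome will be achieved ({lf['Outcome']}).",
        f"IF the Outcome is achieved ({lf['Outcome']}), THEN this will contribute to the Impact ({lf['Impact']}).",
    ]

for line in if_then_reading(logframe):
    print(line)

# Structural checks reflecting rules stated in the text:
assert isinstance(logframe["Outcome"], str), "have only one Outcome/Purpose"
assert 2 <= len(logframe["Outputs"]) <= 8, "typically between two and eight Outputs"
```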
4.2 The Objectives Column in the Logical Framework
We put the objectives into the first column of the logical framework – the objectives column:
Figure 4a The Objectives Column

Column 1: Objectives | Column 2: Indicators/targets | Column 3: Data sources | Column 4: Assumptions

Goal/Impact: The higher order long-term development objective to which the project contributes. (The Greater Why?)
Purpose/Outcome: The specific and immediate beneficial changes achieved by the project. (The Why?)
Outputs: The deliverables of the project, or the terms of reference. (The What?)
Activities: The main activities that must be undertaken to deliver the outputs. (The How?)
Figure 4b The WEMSME case study: Column 1 - The Hierarchy of Objectives

Column 1: Objectives (Columns 2-4: Indicators/targets, Data sources, Assumptions)

Goal/Impact:
Increased economic opportunities for women and men and greater investments in green growth in Eralc District in Trohs.

Purpose/Outcome:
Network of green growth MSMEs established in Eralc District.

Outputs:
1. MSMEs access finances
2. Enabling environment in place for MSMEs to access green growth support services and finance
3. Knowledge base developed on MSME drivers and barriers to successful investment in green growth

Activities:
1.1 Conduct baseline study of MSMEs in Trohs
1.2 Carry out Training Needs Assessment
1.3 Identify and research MSME best practices
1.4 Develop a training programme
1.5 Disseminate best MSME practices in case studies and training workshops
1.6 Establish MSME network systems for access and exchange of information and learning
2.1 Design subsidised financial instruments to support MSME green growth investments with financial partners
2.2 Develop MSME diversification action plans with financial partners
2.3 Hold event where MSMEs pitch business plans to potential investors
2.4 Hold MSME business plan competition and make awards
3.1 Carry out robust analysis of the green product market place and market chains
3.2 Collect lessons learned from workshops and events
3.3 Develop and implement Knowledge Management and communications plan
3.4 Produce knowledge and research products with local University partners
Checklist – Objectives
Below is a simple checklist for checking the objectives in column 1 of the Logframe.
1. Do they answer
Goal/Impact Greater Why?
Purpose/Outcome Why?
Outputs What?
Activities How?
2. Does the logic work?
Vertical logic in Column 1: If… then…
Is it necessary and sufficient? (i.e. is too much or too little being proposed?)
3. Is there only one Purpose/Outcome?
4. Is the Purpose/Outcome clearly stated, avoiding phrases like ‘by’, ‘in order to’, ‘through’ and ‘so that’?
5. Is the Purpose/Outcome too remote from the Outputs?
6. Is the Purpose/Outcome more than just a reformulation of the Outputs?
7. Does the gap between Purpose/Outcome and Outputs show realistic ambition? Is it assessable? Is the causal link strong?
8. Are the Outputs deliverable?
9. Do we see Process as well as Product objectives?
10. Are the Outputs and Activities linked /cross-numbered?
STEP 5: RISK MANAGEMENT; WHAT MAY STOP US GETTING THERE?
5.1 Managing Risk
Risk is the potential for unwanted happenings impairing the achievement of our objectives. Every project involves risks. Risk assessment and management
are essential elements in business; likewise in development and community work.
If you talk to experienced development and/or community workers they will usually agree that when projects fail, it is not generally because the objectives were wrong but because insufficient time and thought were given to the risk factors, to what can go wrong with the plan and to the assumptions that are being made.
Worthwhile projects involve risk, sometimes very high risk. The important point is not necessarily to avoid risks but to plan for them: identify and assess them, and allocate time and other resources to manage them, for example by monitoring and mitigation.
So it is vital that risks are identified in planning and that a risk management plan is built into the overall design process and implementation management.
Development organisations are placing considerable emphasis on creating a risk culture: an awareness of, and competence in, risk management. There are a number of common perceptions blocking progress, and responses that can move good practice forward.
Figure 5a Perceptions and Responses in risk management

Perceptions blocking progress (poor practice) and responses (good practice):

• Poor practice: Risk analysis is seen as an ‘add-on’; it’s done mechanically because it’s a mandatory procedure.
Good practice: It should be an integral core of what we do. It should serve as a challenge function to interrogate our thinking.
• Poor practice: It’s seen as too difficult.
Good practice: It’s not difficult. It involves just a few basic questions.
• Poor practice: A long list of risks will impress.
Good practice: Strong analysis is needed to identify the few, key ‘mission critical’ risks, and then to design effective mitigatory measures.
• Poor practice: Once the Risk Analysis is done, it’s done and never revisited.
Good practice: It needs regular tracking and review.
• Poor practice: It’s just done internally.
Good practice: Potentially it’s a key tool for broader project ownership and political buy-in.
5.2 The Key Questions
Remember that other documents are likely to help in the identification of risks; e.g. the stakeholder analysis, the problem analysis etc. But once we have identified the risks, what are the key questions?

Figure 5b The Key Questions
What is / are the:
• IMPORTANCE? Depends mainly on:
- What is the HAZARD itself? Scale? Seriousness?
- What is the VULNERABILITY to the hazard? Of the poor? Of the project?
• PROBABILITY? The likelihood of it happening. What data is there? How reliable is the data?
• COSTS? Social? Financial? What are they and who bears them? The already vulnerable?
• GAINS? What are the gains from going ahead?
• MITIGATION? What can be done to improve any or all of the above?
5.3 Undertaking a Risk Analysis
Stage 1 Identify the risks. Brainstorm the risks using the draft Hierarchy of Objectives (Column 1). At each level ask the question: ‘What can stop us…?’ …doing these Activities, …delivering these Outputs, …achieving this Purpose, …contributing to this Impact/Goal? These are phrased as risks. Write each risk on a separate post-it and place it in Column 4; it does not matter at this stage at what level you place them.
On a separate sheet of flipchart paper, draw the table in Figure 5b overleaf. Transfer the risk post-its from Column 4 of the logframe to the left column of the new table.
Stage 2 Analyse and manage the risks. Then, as a group, discuss each risk in turn:
What is its likely importance (Im)? Write H, M or L; high, medium or low.
Risks and Assumptions: A Risk is a potential event or occurrence that could adversely affect achievement of the desired results. An Assumption is a necessary condition for the achievement of results at different levels. A risk is best not written simply as the negative of an assumption (e.g. Assumption = ‘inflation remains at a manageable level’; Risk = ‘hyperinflation’). It is useful to view assumptions as the conditions that remain after mitigatory measures have been put in place.
What is its likely probability (Pr)? Write H, M or L.
You may at this point decide to disregard insignificant risks from here on; those rated Low importance and Low probability.
Discuss and agree possible mitigatory measures; record these on the chart. In a few cases there will not be any, but even with so-called uncontrollable risks, some degree of mitigation is usually possible.
Even if mitigatory measures are successful, it is unlikely you can remove the risk completely. What ‘residual’ assumptions are you left with? Record these.
Example:
Highjacking is a risk in civil aviation. As a mitigatory measure, passengers are now subject to hand luggage and body searches. Even if done effectively, this does not remove the risk altogether; the Importance probably remains unchanged, but the Probability may be reduced from Medium to Low. You are left with a residual assumption that ‘With effective screening measures in place, highjacking will not happen’.
Figure 5b Risk analysis table
Risks Im9 Pr10 Mitigation Assumptions Highjacking of aircraft H M Airport security
screening of all passengers
With effective screening measures in place, highjacking will not happen
9 Importance
10 Probability
Transfer these to Column 4 of
the LF
Do these transfer to Column 1 and become
extra activities?
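The two stages above can also be kept as a simple register once the flipchart work is done. A hedged Python sketch: the first entry is the highjacking example from the text, while the Low/Low entry is hypothetical, added only to show the filtering step.

```python
# Hedged sketch of Stages 1 and 2: keep each risk with its Importance (Im) and
# Probability (Pr) rated H/M/L, drop insignificant Low/Low risks, then record
# mitigation and the residual assumption. The second entry is invented.

risks = [
    {"risk": "Highjacking of aircraft", "Im": "H", "Pr": "M",
     "mitigation": "Airport security screening of all passengers",
     "assumption": "With effective screening measures in place, highjacking will not happen"},
    {"risk": "Minor delays in printing training materials", "Im": "L", "Pr": "L",
     "mitigation": None, "assumption": None},  # hypothetical Low/Low example
]

# Stage 2: disregard risks that are Low importance AND Low probability.
significant = [r for r in risks if not (r["Im"] == "L" and r["Pr"] == "L")]

for r in significant:
    print(f"{r['risk']} (Im={r['Im']}, Pr={r['Pr']})")
    print(f"  Mitigation: {r['mitigation']}")
    print(f"  Residual assumption: {r['assumption']}")
```

Kept in this form, the register is easy to revisit for the regular tracking and review that Figure 5a calls for.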
Figure 5c The WEMSME case study: Managing the risks
(Table incomplete; for illustration only)

1. Risk: Deterioration of security situation disrupts project Activities, Outputs, Outcome/Purpose and Impact. (Im M, Pr M)
Mitigation: Ensure close liaison with security forces and District Chiefs; draw up security plan with attached budget; monitoring and regular review.
Assumption: The security situation does not deteriorate such that it disrupts project activities and results.
2. Risk: Benefits of the project are captured by men and/or other elites. (Im M, Pr M)
Mitigation: Ensure institutional representation of disadvantaged groups in a gender sensitive manner.
Assumption: Benefits of the project accrue to the vulnerable at community and household levels.
3. Risk: Markets are hard to penetrate and local markets become saturated. (Im M, Pr L)
Mitigation: Initial and on-going market research must be realistic, robust and gender sensitive.
Assumption: Local financial services are able to compete in meeting growing local demand in a gender sensitive manner.
4. Risk: Required financial inputs for MSMEs outside project control are not available. (Im L, Pr M)
Mitigation: Encourage diversity of service provision; strong collaboration with relevant partners; inclusion of partners in planning and capacity building.
Assumption: Key financial inputs are available to MSMEs in a gender sensitive manner.
5. Risk: Current social networks hinder the establishment of new essential linkages. (Im H, Pr L)
Mitigation: Thorough stakeholder analysis, involvement and ownership; implement communication strategy.
Assumption: Essential gender sensitive linkages between MSMEs and others in the market place can be fostered.
6. Risk: The on-going demands of financial standards are too alien to MSMEs. (Im H, Pr M)
Mitigation: Effective gender sensitive training and communication; clear and understood quality criteria.
Assumption: Market needs are understood and addressed in a gender sensitive manner.
7. Risk: The incentives to stay in business are not strong enough. (Im H, Pr M)
Mitigation: Parallel efforts within the enforcement and alternative business strategies.
Assumption: The incentives for MSMEs are strong enough.
5.4 The Assumptions Column in the Logframe
You have identified and analysed the risks, determined mitigatory measures and agreed what residual assumptions still hold. Transfer to your logframe as appropriate:
• Your mitigatory measures into Column 1, i.e. extra activities (or the measures may be reflected in the indicators in Column 2; we come to this later).
• Your residual assumptions into Column 4. These are conditions which could affect the success of the project; they are what remains after the mitigatory measures have been put in place.
Figure 5d The Assumptions Column

Column 4 (Assumptions), against each row of Column 1 (Objectives):
Impact/Goal: Important conditions needed in order to contribute to the Impact/Goal.
Purpose/Outcome: Important conditions needed in order to achieve the Purpose/Outcome.
Outputs: Important conditions needed to deliver the Outputs.
Activities: Important conditions needed to carry out the Activities; the pre-conditions.
By adding assumptions our logic is extended; check the logic with the IF AND THEN test:
IF the Pre-conditions hold, THEN the Activities will be carried out.
IF the Activities have been carried out, AND the Assumptions at Output level hold true, THEN the Outputs will be delivered.
IF the Outputs are delivered, AND the Assumptions at Purpose/Outcome level hold, THEN the Purpose/Outcome will be achieved.
IF the Purpose/Outcome has been achieved, AND the Assumptions at Impact level hold, THEN the project will contribute to the Impact/Goal.
Figure 5e The IF AND THEN logic
If we carry out these Activities (and the pre-conditions hold), then we will deliver these Outputs.
If we deliver these Outputs, and these conditions hold, then we should achieve this Purpose/Outcome.
If we achieve this Purpose/Outcome, and these conditions hold, then we should contribute to this Impact/Goal.
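The IF AND THEN chain can be walked mechanically once the levels and their assumptions are written down. A minimal Python sketch (the level and assumption wording is illustrative, not from the case study):

```python
# Sketch: the logframe's vertical logic written as data.
# Each step pairs the assumptions that must hold with the result they unlock.
chain = [
    (["the pre-conditions hold"], "the Activities will be carried out"),
    (["the assumptions at Output level hold"], "the Outputs will be delivered"),
    (["the assumptions at Purpose/Outcome level hold"], "the Purpose/Outcome will be achieved"),
    (["the assumptions at Impact level hold"], "the project will contribute to the Impact/Goal"),
]

def narrate(chain):
    """Spell out the IF AND THEN test: the first step has no prior result; later steps chain on it."""
    sentences = []
    previous_result = None
    for assumptions, result in chain:
        joined = " and ".join(assumptions)
        if previous_result is None:
            sentences.append(f"IF {joined}, THEN {result}.")
        else:
            sentences.append(f"IF {previous_result}, AND {joined}, THEN {result}.")
        previous_result = result
    return sentences

for sentence in narrate(chain):
    print(sentence)
```

Reading the output aloud is the same check as the IF AND THEN test above: any step that sounds implausible points at a missing assumption or a gap in the objectives column.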
Figure 5f The WEMSME case study: Column 4 - the key assumptions.
(Table incomplete; for illustration only.)

Impact / Goal: Increased economic opportunities for women and greater investments in green growth in Eralc District in Trohs.
Assumption: Project will be replicated on a national scale, thereby contributing to enhanced sector performance.

Outcome: Network of green growth MSMEs established.
Assumptions: Macro-economic outlook is favourable in Trohs and the region. The security situation in Trohs does not deteriorate such that it disrupts project activities and results. Total revenue impact will occur after 2 years of MSMEs being established.

Outputs:
1. MSMEs access finances
2. Enabling environment in place for MSMEs to access green growth support services and finance
3. Knowledge base developed on MSME drivers and barriers to successful investment in green growth
Assumptions: Finance will be available with improved MSME capacity to request it. Subsidised loan rates are affordable to project participants. Quality BDS providers can be identified to provide services at rates affordable to start-ups and young entrepreneurs. Project participants become an attractive lending target for financiers.

Indicative Activities:
1.1 Conduct baseline study of MSMEs in Trohs. 1.2 Carry out Training Needs Assessment. 1.3 Identify and research MSME best practices. 1.4 Develop a training programme. 1.5 Disseminate best MSME practices in case studies and training workshops. 1.6 Establish MSME network systems for access and exchange of information and learning.
2.1 Design subsidised financial instruments to support MSME green growth investments with financial partners. 2.2 Develop MSME diversification action plans with financial partners. 2.3 Hold event where MSMEs pitch business plans to potential investors. 2.4 Hold MSME business plan competition and make awards.
3.1 Carry out robust analysis of the green product market place and market chains. 3.2 Collect lessons learned from workshops and events. 3.3 Develop and implement Knowledge Management and communications plan. 3.4 Produce knowledge and research products with local University partners.
Checklist – Risks and Assumptions
1. Have all the important risks been identified? e.g. from the Stakeholder analysis, the Problem trees, etc.
2. Are the risks specific and clear? Or too vague?
3. Where risks are manageable, have they been managed?
4. Where possible, have mitigatory measures been included as Activities and Outputs? i.e. moved into Column 1?
5. Are the Assumptions at the right level?
6. Does the logic work?
Check the diagonal logic for Columns 1 and 4: IF this result is achieved, AND these assumptions hold, THEN the next level up follows.
Is it necessary and sufficient? Again, is enough being proposed, or too much?
7. Should the project proceed in view of the remaining assumptions? Or is there a KILLER risk that cannot be managed, of such high probability and impact, that it fundamentally undermines the project and forces you to stop and rethink the
whole project?
STEP 6. HOW WILL WE KNOW IF WE’VE GOT THERE?
6.1 Laying the foundations for Monitoring, Review and Evaluation
One of the key strengths of the logframe approach is that it forces the planning team to build into the design how the project will be monitored, reviewed and evaluated. The project is planning to deliver, achieve and contribute to a chain of results at different levels; these are the intended changes in development conditions resulting from the development project or programme.
Indicators are identified to show how we intend to measure change from the current baseline. Targets are set to be achieved by the end of the time period, together with milestones to measure progress along the way. The logframe
approach helps in addressing and reaching agreement on these issues early at the design stage. It helps to pinpoint the gaps and determine what needs to be done. It asks what data is needed now and in the future, and what data sources will be used, be they secondary, external, reliable and available, or primary,
internal and requiring budgeted data collection activities within the project.
An oft-quoted principle is 'if you can measure it, you can manage it'. The one may not inevitably follow the other, so we can qualify it as: 'if you can measure it, you are more likely to be able to manage it'. Or the reverse: 'if you can't measure it, you can't manage it'.
6.2 Terms and principles
The main confusion arises between Indicators and Targets. Indicators are the means by which change will be measured; targets are definite ends to be achieved. To take two examples:

Indicator: the proportion of the population with access to improved sanitation, urban and rural.
Target: halve, between 1990 and 2015, the proportion of people without sustainable access to basic sanitation.

Indicator: the proportion of girls achieving Grade 4.
Target: increase by 15% the proportion of girls achieving Grade 4 by month 36.
An Indicator is a quantitative and/or qualitative variable that allows the verification of changes produced by a development intervention relative to what was planned.
A Target is a specific level of performance that an intervention is projected to accomplish in a given time period.
Milestones are points in the lifetime of a project by which certain progress should have been made.
A Baseline is the situation prior to a development intervention, against which progress can be assessed or comparisons made.
The indicator shows how the change from the current situation will be measured.
An indicator is not something you achieve; you do, however, aim to achieve a target. A target is an endpoint: a Specific, Measurable, Achievable, Relevant and Time-bound endpoint. A target should be SMART; don't try to make an indicator SMART. And don't make the objectives in Column 1 of the logframe SMART; keep them as broad results.
It’s useful to think of milestones as interim or formative targets. Thus for the first example target above of halving by 2015 the proportion of people without sustainable access to basic sanitation, reductions of 35% by 2009 and 42% by
2012 would be milestones. They provide an early warning system and are the basis for monitoring the trajectory of change during the lifetime of the project.
A baseline is needed to identify a starting point and give a clear picture of the pre-existing situation. Without it, it is impossible to measure subsequent change and performance (Figure 6a). For example, without knowing the baseline, it would not be possible to assess whether or not there has been a '25% improvement in crop production'. Collecting baseline data clearly has a cost; but so does the lack of baseline data! The reliability and validity of existing, secondary data may be in doubt and there may not be enough of it. In that case, baseline studies will be needed before targets can be set and, generally, before approval for implementation can be given. In some circumstances it may be appropriate to carry out some baseline data collection and target-setting post-approval. It may be perfectly acceptable, indeed good practice, to state that some 'indicators and targets are to be developed with primary stakeholders in the first 6 months of the project.'
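The dependence on a baseline can be shown with one line of arithmetic. A minimal sketch, assuming illustrative yield figures (not project data):

```python
# Sketch: a claimed percentage improvement is only computable against a baseline.
# Yield figures are illustrative assumptions, not project data.
baseline_yield = 2.0   # tonnes per hectare before the intervention
endline_yield = 2.5    # tonnes per hectare at evaluation

improvement = (endline_yield - baseline_yield) / baseline_yield * 100
print(f"Improvement over baseline: {improvement:.0f}%")  # prints "Improvement over baseline: 25%"
```

Remove the baseline figure and the claim of a 25% improvement becomes unverifiable.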
Figure 6a: Baseline, targets and achievement (adapted from UNDG guidelines)
[Diagram showing Baseline, Current level of achievement, Commitment, Performance, Target and Achievement.]
Before looking at how indicators are constructed, some important points:
Who sets indicators and targets is fundamental, not only to ownership and transparency but also to the effectiveness of the measures chosen. Setting objectives, indicators and targets is a crucial opportunity for participatory design and management.
Indicators and targets should be disaggregated, for example by gender, ethnic group, age, or geographic area. Averages can hide disparities, particularly if large sample sizes are needed for statistical reliability.
Some indicators in every logframe should relate to standard or higher-level indicators. Most organisations seek to attribute and communicate their work towards a set of standard results or indicators (often closely aligned with the MDGs). Operations in-country will need to show linkage to national priorities; UN agencies to an UNDAF; etc. Projects that are part of a larger programme will need to show linkage of indicators upwards.
A variety of indicator and target types is more likely to be effective. The need for objective verification may mean that too much focus is given to the quantitative or to the simplistic at the expense of indicators that are harder to verify but which may better capture the essence of the change taking place. Managers sometimes need to be persuaded of the usefulness of qualitative data!
The fewer the indicators the better. Collect the minimum. Measuring change is costly so use as few indicators as possible. But there must be
indicators in sufficient number to measure the breadth of changes happening and to provide the triangulation (cross-checking) required.
The process in brief:
1. What is the intended result? (output, outcome, impact)
2. How will change be measured? Set key indicators.
3. Is the baseline data available? If not, is it possible to collect it? If yes, collect it; if no, choose different indicators.
4. Set milestones and targets to be achieved.
Throughout: are the right stakeholders involved in this process?
6.3 Constructing indicators and targets
Before looking at the process of constructing indicators and targets, the point is made again here: who should be involved in developing indicators and determining the target? ‘Insiders’ are much more likely to come up with original
and effective measures than ‘outsiders’.
Stage 1: Start by writing basic indicators as simple measures of change. They
are best written at this stage without elements of the baseline or target, without numbers or timeframe. For example:
a. Loan return rate
b. Immunization coverage
c. Community level representation on district councils
d. Fish catch
e. Rural households with livestock
Stage 2: Indicators need to be clear, measuring quality and quantity and, where appropriate, disaggregated and location-specific. So re-examine your basic
indicator to clarify your measure. The previous examples might develop into:
a. % loan return rate of men's and women's groups in 3 targeted districts
b. Proportion of one-year olds vaccinated against measles.
c. Number of women and men community representatives on district councils
d. Average weekly fish catch per legally certified boat
e. Proportion of female- and male-headed households in 3 pilot rural areas with livestock
Each variable in an indicator will need to be measurable and measured. So for an indicator such as 'Strengthened plan effectively implemented', what is meant by 'strengthened', 'effectively' or 'implemented'? Each of these terms will need to be clarified for this to become a usable, measurable indicator.
Stage 3: Now for each indicator ask:
i. Is the current situation, the baseline, known? If not, can the
baseline data be gathered now, cost-effectively?
ii. Will the necessary data be available when needed (during the intervention for milestones, and at the end for a target)?
If data is not, or will not be, available, you should reject the indicator and find some other way to measure change.
Stage 4: With the relevant baseline data to hand, determine milestones (at regular intervals during the project) and targets (at the end). For example:

a. % loan return rate of men's and women's groups in 3 targeted districts. Baseline F44:M24; 12 months F50:M40; 24 months F70:M60; target (3 years) F80:M70.
b. Proportion of one-year olds vaccinated against measles. Baseline 24%; 12 months 30%; 24 months 60%; target 85%.
c. Number of women and men community representatives on district councils. Baseline F0:M0; 12 months -; 24 months at least F2:M2; target at least F2:M2.
d. Average weekly fish catch per legally certified boat. Baseline 50kg; 12 months 50kg; 24 months 75kg; target 100kg.
e. Proportion of female- and male-headed households in 3 pilot rural areas with livestock. Baseline F24:M80; 12 months F36:M85; 24 months F60:M90; target F95:M95.
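Once a baseline, milestones and a target are fixed, checking whether a measured value is on course becomes mechanical. A minimal Python sketch using the measles-vaccination figures from the example above; the on-track rule (the measured value meets the value due at that month) is an illustrative assumption:

```python
# Sketch: one indicator's baseline, milestones and target, with a simple
# on-track check. Figures are from the measles-vaccination example row;
# the comparison rule is an illustrative assumption.
measles_coverage = {
    "indicator": "Proportion of one-year olds vaccinated against measles",
    "baseline": 24,                  # % at project start
    "milestones": {12: 30, 24: 60},  # expected % by month
    "target": (36, 85),              # (month, %) endpoint
}

def on_track(indicator, month, measured):
    """True if the measured value meets the milestone or target due at `month`."""
    expected = dict(indicator["milestones"])
    target_month, target_value = indicator["target"]
    expected[target_month] = target_value
    if month not in expected:
        raise ValueError(f"No milestone or target set for month {month}")
    return measured >= expected[month]

print(on_track(measles_coverage, 12, 33))  # 33% at month 12 meets the 30% milestone: True
print(on_track(measles_coverage, 24, 55))  # 55% at month 24 misses the 60% milestone: False
```

This is the early-warning role of milestones in code form: a False at month 24 prompts corrective action well before the end-of-project target is due.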
Stage 5: Check that your milestones and targets are SMART: Specific, Measurable, Achievable, Relevant and Time-bound.
To be useful, indicators need to have a number of characteristics. They need to be:
Specific; not vague and ambiguous; clear in terms of the quality and quantity of change sought; sensitive to change attributable to the project; disaggregated appropriately;
Measurable; the information can be collected, and will be available at
the time planned; cost-effective and proportionate
Achievable; realistic in the time and with the resources available; targets not just ‘made up’, without baseline or stakeholder ownership;
Relevant; substantial, necessary and sufficient; they relate to higher level indicators
Time-bound; milestones will together show progress is on-course;
targets are measurable within the lifetime of the project.
6.4 Types of Indicators
Binary Indicators
These simple Yes or No indicators are most common at Output and Activity levels; for example, 'Draft guidelines developed and submitted to Planning Committee'.
Direct and Indirect Indicators
Direct indicators are used for objectives that relate to directly observable change resulting from your activities and outputs; for example, tree cover from aerial photography as an indicator of deforestation. Proxy indicators measure change indirectly and may be used if results:
are not directly observable like the quality of life, organisational development or institutional capacity
are directly measurable only at high cost which is not justified
are measurable only after long periods of time beyond the life span of the project.
The number of lorries carrying timber out of the forest could be a proxy indicator of deforestation. But then there is uncertainty as to whether timber resources are being used or burned within the forest, are being taken out by means besides lorries, or on unsurveyed routes. So proxy indicators need to be used with care. But well-chosen proxies can be very powerful and cheap. Sampling for a certain river invertebrate can give a very clear picture of pollution levels. The price of a Big Mac has been used to assess the health of a currency or economy.
Qualitative and Quantitative Indicators
Quantitative indicators measure numerical values over time. Qualitative indicators measure changes not easily captured in numerical values, e.g. process-related improvements, perceptions, experiences, behaviour change, strengthened capacity. This is particularly relevant in gender and social aspects. Special effort and attention need to be given to devising qualitative indicators. A balance of indicators is needed that will capture the total picture of change. Rigid application of the steps and format outlined above can result in performance or change that is difficult to quantify not being considered or given value. We should not fail to measure changes just because they may be difficult to quantify or analyse. It is often possible, with care, to 'quantify' qualitative aspects; opinion polls and market surveys do it all the time. A citizen score card, for example, might collect public opinion data on public services. Whether the instrument is valid, crude or spurious will depend on the context and the way the information is collected, analysed and used.
Process and Product Indicators
It is important to measure not just what is being done but how it is being done; not just the 'products' resulting from an intervention, but also the 'processes'. Processes may be 'means', but with an underpinning capacity-building agenda those 'means' themselves become 'ends'. Focus on the processes will generally lead to better targeting of the activities at real problems and needs, better implementation and improved sustainability. At the outset of a process initiative it may be very difficult, and undesirable, to state the precise products of the initiative. Instead, outputs and activities may be devised for the first stage or year; later outputs and activities are then defined on the basis of the initiative's learning. Processes will therefore need more frequent monitoring. Product indicators may measure the technologies adopted, the training manual in print and disseminated, or the increase in income generated. Process indicators are usually more qualitative and will assess how the technologies were developed and adopted, how the manual was produced, how the income was generated, and who was involved. At least some of these indicators will be subjective. End-users
and participants may be asked to verify them, but the means of verification may still be less than fully objective.
6.5 Identifying the Data Sources, the evidence
Having set indicators, milestones and targets, what Data Sources or evidence will be used for each measure? This is a vital aspect of the initial planning that is often overlooked. Building in data sources at this stage will make the monitoring, review and evaluation of the project easier.
Column 3 of the logframe relates to the verification; indeed it is sometimes titled Means of Verification. It should be considered as you formulate your indicators and targets. So complete columns 2 and 3 at the same time.
A data source will almost invariably be documents; sometimes it may be films, DVDs, videos or audiotapes. The key point: a data source is not an activity, such as a survey or a stakeholder review. If an activity is required, and will be done and budgeted within the project, then it will be in Column 1 of the logframe. The output of that activity, the survey report or review report, will be the data source.
In specifying our Data Sources we need to ask a series of simple questions:
What evidence do we need?
Where will the evidence be located?
How are we going to collect it?
Is it available from existing sources? (e.g. progress reports, records, accounts, national or international statistics, etc)
Is special data gathering required? (e.g. special surveys)
Who is going to collect it? (e.g. the project team, consultants, stakeholders etc)
Who will pay for its collection?
When/how regularly should it be provided? (e.g. monthly, quarterly, annually)
How much data gathering (in terms of quantity and quality) is worthwhile?
Some typical Data Sources
Minutes of meetings and attendance lists
Stakeholder feedback, results of focus groups
Surveys and reports
Newspapers, radio and TV recordings, photographs, satellite imagery
National and international statistics
Project records, reviews and reports; external evaluation reports
Reports from participatory poverty assessment or rural/urban appraisal exercises
Be careful not to commit yourselves to measuring things that will be very expensive and time-consuming to measure. Go back to Column 2 if the indicators you have chosen are impractical to measure. You need to be practical! The CREAM approach can be used, and may well be useful in certain projects,11 i.e.:
Clear
Relevant
Economic
Adequate
Monitorable
The choice of indicator and review criteria depends on what stakeholders want to measure or the type of changes they want to better understand and assess.
In the process of completing Columns 2 and 3, you are likely to be adding activities and possibly an output to Column 1 relating to monitoring, review and lesson learning.
Figure 6b. Indicators and Verification
(Logframe columns: 1 Objectives; 2 Indicators / targets; 3 Data Sources; 4 Assumptions. Columns 2 and 3 by level:)
Impact / Goal: measures of the longer-term impact that the project contributed to; sources of data needed to verify the status of Impact/Goal level indicators.
Purpose/Outcome: measures of the outcome achieved from delivering the outputs; sources of data needed to verify the status of Outcome level indicators.
Outputs: measures of the delivery of the outputs; sources of data needed to verify the status of Output level indicators.
Activities: these measures are often milestones and may be presented in more detail in the project work plan; sources of data needed to verify the status of Activity level indicators.
A typical Monitoring Review and Evaluation Framework that can be developed from a logframe is presented in Annex P.
11 See Imas & Rist 2009, The Road to Results – Designing and Conducting Effective Development
Evaluation, The World Bank, p.117
Another way to look at indicators is by applying FABRIC to performance information12:
Focused on the organisation's aims and objectives;
Appropriate to, and useful for, the stakeholders who are likely to use it;
Balanced, giving a picture of what the organisation is doing;
Robust in order to withstand organisational changes or individuals leaving;
Integrated into the organisation;
Cost-Effective, balancing the benefits of the information against the costs.
Checklist – Indicators and Data Sources
1. Are the Targets and Milestones described in terms of Quality, Quantity and Time (QQT)?
2. Are the Indicators and Data Sources:
Relevant
Valid / Reliable
Measurable / verifiable
Cost-effective / proportionate?
3. Are the Indicators necessary and sufficient? Do they provide enough triangulation (cross checking)?
4. Are the Indicators varied enough?
Product and Process
Direct and Indirect
Formative, Summative and beyond
Qualitative and Quantitative
Cross-sectoral?
5. Who has set / will set the Indicators? How will indicators be owned?
6. Are the Data Sources
Already available
Set up where necessary within the project?
7. Is there a need for a baseline survey?
12 HM Treasury, Cabinet Office, National Audit Office, Audit Commission & Office for National Statistics 2001, Choosing the right FABRIC: a framework for performance information, London, http://www.nao.org.uk/report/choosing-the-right-fabric-3/
One possible layout of Indicators, Baselines, Milestones and Targets
(Logframe columns: Objectives; Indicators and Targets; Data sources; Assumptions. Within the Indicators and Targets column:)

Indicator: % loan return rate of men's and women's groups in 3 targeted districts.
Baseline: F44 M24. Milestone year 1: F50 M40. Milestone year 2: F70 M60. Target 2015: F80 M70.

This type of layout has now been slightly modified and used by the UK's Department for International Development (DFID) in their new Logical Framework template. See Appendix S for an example.
Figure 6c The Women's Empowerment Small and Medium Sized Enterprise (WEMSME) Project Case Study, Eralc District in Trohs: the complete logframe example. Timeframe: 4 years. Allocation: $2.4 million. (Indicators are by End of Project unless otherwise stated.)

Impact / Goal: Increased economic opportunities for women and greater investments in green growth in Eralc District in Trohs.
Indicators: Incremental contribution of the MSME sector to green growth in Trohs.
Data Sources: National Growth and Performance Statistics.
Assumptions: Project will be replicated on a national scale, thereby contributing to enhanced sector performance.

Outcome: Network of green growth MSMEs established.
Indicators: Increased % revenue growth of green growth supported businesses. Increased new jobs (x% for women and y% for men) created by MSMEs. Increased number of MSMEs (x% women and y% men) accessing loans from banks and financial institutions. Information shared through the MSME network leads to new opportunities.
Data Sources: Bi-annual surveys of MSME participants; bi-annual surveys of banks and financial institutions.
Assumptions: Macro-economic outlook is favourable in Trohs and the region. The security situation in Trohs does not deteriorate such that it disrupts project activities and results. Total revenue impact will occur after 2 years of MSMEs being established.

Output 1: MSMEs access finances.
Indicators: Increased number of green growth business plans developed. Increased number of green growth applications submitted by MSMEs approved for finance. Increased number of green growth entrepreneurs (women and men) with basic accounting systems and bank accounts.
Data Sources: Project progress reports; baseline and post-operations survey of entrepreneurs.
Assumptions: Finance will be available with improved MSME capacity to request it. Subsidised loan rates are affordable to project participants. Quality BDS providers can be identified to provide services at rates affordable to start-ups and young entrepreneurs. Project participants become an attractive lending target for financiers.

Output 2: Enabling environment in place for MSMEs to access green growth support services and finance.
Indicators: Increased number of green growth entrepreneurs (X% women and Y% men) accessing Business Development services. Increased number of green growth entrepreneurs (X% women and Y% men) linked to or pitching to potential investors. Increased number of financial instruments for green growth MSMEs available.
Data Sources: Economic and Social development adviser reports.

Output 3: Knowledge base developed on MSME drivers and barriers to successful investment in green growth.
Indicators: Ten successful green growth entrepreneurs (5 women and 5 men) showcased in training and events. Five research/knowledge/lesson products on green growth entrepreneurship shared with national and international practitioners.
Data Sources: Project reports; publicity records and newspaper survey; knowledge products.

Indicative Activities (Output 1): 1.1 Conduct baseline study of MSMEs in Trohs. 1.2 Carry out Training Needs Assessment. 1.3 Identify and research MSME best practices. 1.4 Develop a training programme. 1.5 Disseminate best MSME practices in case studies and training workshops. 1.6 Establish MSME network systems for access and exchange of information and learning.
Milestones: Baseline study completed by M2. TNA completed by M6. MSME research survey completed by M12. Results of research into best MSME practices disseminated as key messages by M18. Annual reviews report a range of new and innovative service provision. New network systems to reach vulnerable groups established by M24.
Data Sources: Quarterly reports; Inception report and MSME research meeting report; best MSME practices report; annual reviews.

Indicative Activities (Output 2): 2.1 Design subsidised financial instruments to support MSME green growth investments with financial partners. 2.2 Develop MSME diversification action plans with financial partners. 2.3 Hold event where MSMEs pitch business plans to potential investors. 2.4 Hold MSME business plan competition and make awards.
Milestones: Baseline study completed by M3 and reported in Inception report. Three district-level clusters, each of at least 12 MSME groups with a total of 120 MSMEs, supported by M6. Action plans in place by M12, with meetings at least quarterly thereafter. One diversification plan completed by M18 with action plan in operation. A further six similar district clusters established by M24; total number of groups 36, and farmers 360.
Data Sources: Quarterly reports; Inception report; review report and quarterly reports.

Indicative Activities (Output 3): 3.1 Carry out robust analysis of the green product market place and market chains. 3.2 Collect lessons learned from workshops and events. 3.3 Develop and implement Knowledge Management and communications plan. 3.4 Produce knowledge and research products with local University partners.
Milestones: Stakeholder mapping exercise completed by M3; ongoing data study thereafter. Market Analysis Report completed by M4. Training plan in place by M8; training ongoing thereafter. Training evaluation exercise undertaken annually. Lesson Learning Reviews completed by M12, with case studies and clear lessons derived. Communications plan agreed by District committee by M12. Communications plan published in local Entrepreneur's magazine by M14. Best practice briefings for a variety of audiences drafted and tested; first set by M18.
Data Sources: Baseline analysis report and quarterly reports; TNA report; Training Plan and reports; review report; synthesis report; best practice briefings and other materials.
Finally it’s necessary to check that the overall Logframe is “engendered”.
Appendix O can be used as a useful check list to ensure that gender has been carefully considered and questions about gender adequately answered.
STEP 7: WORK & RESOURCE PLANNING; WHAT DO WE NEED TO GET THERE?
7.1 Preparing a Project Work Plan
The activities listed in a logframe developed for approval prior to implementation will probably include indicative activity clusters or groups. A detailed work plan is generally clarified in the first few months of implementation, often called the Inception Phase. This is a very important time when stakeholder ownership is broadened and consolidated, when the overall plan is confirmed, when the necessary activities are worked out in detail and when the monitoring, review and evaluation needs and arrangements are finalised.
A common mistake is to include too much detail in the logframe. There is no need to list pages and pages of detailed activities. Typically these are set out in a separate Work plan or Gantt Chart, in general terms for the whole project
lifespan and in detail for the next 12 months. See Figure 7a for an example.
In a Gantt Chart each Output is listed together with its associated activities (sub-activities and/or indicators and milestones are sometimes used as well). Then some form of horizontal bar coding is given against a monthly (or sometimes weekly) calendar.
To this may be added other columns such as the identity of the staff who will do the activity, the proposed number of days, priority, a rough estimate of cost, etc. The beauty of the work plan in this form is that it is highly visual, relates back to the logical framework in a precise way, and can be used to give order and priority to inputs.
It is an opportunity to review the time scale and feasibility of the project activities, allocate responsibility for undertaking actions (or achieving indicators), and can also inform issues of cash flow. It is also a participatory tool that can be used with the project team to explore precisely the issues listed above. In this role it may begin as a timeline onto which indicators are placed (thus making them milestones), which in turn informs the timing of the actions to achieve them.
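As an illustration only, the mechanics of such a chart can be sketched in a few lines of Python. The activity codes, owners and timings below are hypothetical examples in the style of Figure 7a, not values prescribed by the handbook:

```python
# Illustrative sketch (assumed data): render a simple text Gantt chart from a
# list of logframe activities, one bar row per activity against a monthly calendar.

activities = [
    # (code, description, who, start_month, end_month) -- months are 1-based
    ("1.1", "Raise awareness of key stakeholders", "RT", 1, 3),
    ("1.2", "Establish Project Steering Committee", "TF", 2, 4),
    ("1.3", "Recruit / train core staff",           "PM", 1, 6),
]

def gantt(rows, months=12):
    """Return the chart as a list of text lines: a header, then one bar per activity."""
    header = f"{'ACTIVITY':<42}{'WHO':<5}" + "".join(f"{m:>3}" for m in range(1, months + 1))
    lines = [header]
    for code, desc, who, start, end in rows:
        # '#' marks months in which the activity is active, '.' marks idle months
        bar = "".join("  #" if start <= m <= end else "  ." for m in range(1, months + 1))
        lines.append(f"{code + ' ' + desc:<42}{who:<5}" + bar)
    return lines

for line in gantt(activities):
    print(line)
```

In practice the same structure extends naturally with the extra columns mentioned above (days, priority, estimated cost) by adding fields to each tuple.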
7.2 Preparing a Project Budget
Now the full budget needs to be prepared; Figure 7b gives an example. It is not essential, and not always possible, for the budget line headings to correlate fully with the logframe objective headings. For example, there could be one project vehicle partially used for the implementation of ALL project activities.
However, if costs can be accounted for against project activities and outputs, then value for money can be compared between the different Activities and Outputs. This will be very useful when the project is reviewed and perhaps further phases are planned and funded. In addition, if project expenditure can be reported against the logframe objectives, then expenditure on different aspects of the project becomes much more transparent for the interested, but intermittently involved, stakeholders.
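The arithmetic behind such a budget can be sketched as follows. The budget lines, unit costs and quantities are illustrative assumptions echoing Figure 7b, not figures prescribed by the handbook:

```python
# Illustrative sketch (assumed data): each budget line records its logframe
# activity, a unit cost and quantities per quarter, as in Figure 7b.
budget_lines = [
    # (activity, item, unit_cost, quantities for Q1..Q4)
    ("1.1", "Travel (km)",       0.2, [500, 500, 250, 250]),
    ("1.1", "Salaries (p-days)", 70,  [40, 40, 40, 40]),
]

def line_costs(unit_cost, quantities):
    """Cost per quarter for one budget line."""
    return [unit_cost * q for q in quantities]

def activity_totals(lines):
    """Total cost per logframe activity, so value for money can be
    compared between Activities and Outputs at review time."""
    totals = {}
    for activity, _item, unit_cost, quantities in lines:
        totals[activity] = totals.get(activity, 0) + sum(line_costs(unit_cost, quantities))
    return totals
```

For example, `line_costs(0.2, [500, 500, 250, 250])` reproduces the travel row of Figure 7b (100, 100, 50, 50; total 300), and `activity_totals` rolls the lines up to the activity level for reporting against the logframe.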
Figure 7a Example of a work plan / Gantt Chart (partial)
MONTH
ACTIVITY WHO? 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 etc.
1.1 Raise awareness of key stakeholders. RT
1.2 Establish Project Steering Committee (PSC) TF
1.3 Recruit / train core staff. PM
1.4 Initial stakeholder consultations. PM
1.5 Secure agreement on Inception Report. PM
2.1 Establish partnerships with existing institutions. TF
2.2 Review current socio-economic networks. TF/RT
2.3 Set up MSME interest groups. RT
2.4 Identify and build networks with diverse service providers.
3.1 Analyse market opportunities and standards. TF/PM
3.2 Conduct baseline and on-going study of MSME practices, productivity and production. PM
3.3 Review lessons from similar quality MSME efforts. PM
3.4 Establish information systems for on-going access, flow and exchange of information. PM
Etc. etc.
KEY Development Implement Self-review Annual Review
Figure 7b: A typical project budget based on a logframe
Activities / Inputs | Unit | Quantity per quarter (Q1 / Q2 / Q3 / Q4) | Cost per unit | Cost codes (Project / Govt) | Costs per quarter (Q1 / Q2 / Q3 / Q4) | Project total

1.1 Raise awareness of key stakeholders
Equipment: Computers | No. | 1 / - / - / - | 780 | E2 / A/1.5 | 780 / - / - / - | 780
Travel | Km | 500 / 500 / 250 / 250 | 0.2 | T1 / C/2.3 | 100 / 100 / 50 / 50 | 300
Non-fixed salaries and allowances | P days | 40 / 40 / 40 / 40 | 70 | S4 / B/4.3 | 2800 / 2800 / 2800 / 2800 | 11200
Consultancy support | P days | 14 / - / - / 14 | 300 | S3 / B/3.2 | 4200 / - / - / 4200 | 8400
Meeting costs | No. | 2 / 1 / 1 / 3 | 200 | P5 / F/4.2 | 400 / 200 / 200 / 600 | 1400
Communications | Lump | 2 / 2 / 1 / 1 | 100 | O3 / H/3.3 | 200 / 200 / 100 / 100 | 600
1.2 etc
1.3 etc
8. CONCLUSIONS
Checking the Logical Framework
You should now have a completed Logical Framework and it is worth going through it and checking it against this checklist13.
1 The Project has one clear Purpose/Outcome.
2 The Purpose/Outcome is not a reformulation of the outputs.
3 The Purpose/Outcome is outside the full managerial control of the project manager BUT the causal links between outputs and Purpose/Outcome are clear and strong.
4 The Purpose/Outcome is clearly stated and does not contain words like “by”, “so that” or “through”.
5 All the outputs are necessary for accomplishing the Purpose/Outcome.
6 The outputs are clearly stated.
7 The outputs are stated as results, with the noun preceding the verb.
8 The activities define the action strategy for accomplishing each output, led by strong verbs.
9 The impact / goal is clearly stated.
10 The if/then relationship between the Purpose/Outcome and goal is logical and does not miss important steps.
11 The assumptions at the activity level include pre-existing conditions.
12 The outputs plus the assumptions at Purpose/Outcome level produce the necessary and sufficient conditions for achieving the Purpose/Outcome.
13 The Purpose/Outcome plus assumptions at impact / goal level describe the critical conditions for substantively contributing to the goal.
14 The relationship between the inputs/resources and the activities is realistic.
15 The relationship between the activities and outputs is realistic.

13 Adapted from the Team up Project List.
16 The relationship between the outputs and the Purpose is realistic.
17 The vertical logic from activities, outputs, Purpose to goal is realistic as a whole.
18 The indicators at the Purpose level are independent from the outputs. They are not a summary of outputs but a measure of the Purpose-level change.
19 The Purpose indicators measure what is important.
20 The Purpose targets have quantity, quality and time measures.
21 The output targets are objectively verifiable in terms of quantity, quality and time, and are independent of the activities.
22 The impact / goal-level targets are verifiable in terms of quantity, quality and time.
23 The associated budget defines the resources and costs required for accomplishing the Purpose.
24 The Data Sources column identifies where the information for verifying each indicator will be found and who will be responsible for collecting it.
25 The activities identify any actions required for gathering data / evidence.
26 The outputs define the management responsibility of the Project.
27 When reviewing the Logical Framework, you can define the monitoring, review and evaluation plan for the Project.
28 The Purpose indicators measure sustainable change.
29 The output strategy includes a description of the project management systems.
30 The team designing the project are completely exhausted!
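For teams that track design reviews electronically, the checklist above can also be held as data so that unanswered items are reported automatically. This is only a sketch; the answer format and the subset of items shown are assumptions:

```python
# Illustrative sketch (assumed structure): the design checklist held as data,
# keyed by item number, so a review meeting can record which items are confirmed.
CHECKLIST = {
    1: "The Project has one clear Purpose/Outcome.",
    2: "The Purpose/Outcome is not a reformulation of the outputs.",
    5: "All the outputs are necessary for accomplishing the Purpose/Outcome.",
    # ... remaining items from the checklist above
}

def outstanding(answers):
    """Return the checklist items not yet confirmed (answer missing or not True)."""
    return [f"{n}. {text}" for n, text in sorted(CHECKLIST.items())
            if not answers.get(n)]

# A review where items 1 and 2 have been confirmed but item 5 has not:
review = {1: True, 2: True}
print(outstanding(review))
```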
8.1 Using the Logical Framework
The logical framework now provides a comprehensive and thorough project plan that all partners have been involved in and that has an inherent logic running through it. The logical framework is useful for a number of purposes:
Monitoring, Reviewing and Evaluating – Keeping track of the project, it forms a most useful monitoring, reporting and evaluation tool (See Appendix H for further details).
Communicating the details of what the project is about – Informing partners about the overall objectives of the project (See Appendix K for further details).
Reporting in brief. (See Appendix L for further details).
A commissioning tool – Section 8.2 explains how frameworks can be nested within each other – the overall Goals/Impacts can become Purposes/Outcomes which other organisations can be commissioned to deliver.
8.2 Nesting the Framework
One of the interesting things about logical frameworks is how they can be linked together and ‘nested’ within each other. Your organisation/group may have a number of different-level plans (for example an organisational plan, regional plans, team plans and individual plans within these). Theoretically, the objectives should feed down through these plans so that the Purpose of the high-level plan becomes the impact / goal for the subsequent plans, and this process continues as objectives become more and more specialised. See Appendices I and J for further details.
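The cascade described above can be sketched in code. The plan names and objective statements below are hypothetical illustrations, not drawn from the handbook:

```python
# Illustrative sketch (hypothetical data): nesting logframes so that the
# Purpose of a higher-level plan becomes the Goal of each subsidiary plan.

def nest(parent, child_purpose, child_outputs):
    """Derive a subsidiary logframe whose Goal is the parent's Purpose."""
    return {
        "goal": parent["purpose"],   # the parent Purpose cascades down one level
        "purpose": child_purpose,
        "outputs": child_outputs,
    }

org_plan = {"goal": "Reduced rural poverty",
            "purpose": "MSME incomes increased",
            "outputs": ["Clusters formed", "Markets analysed"]}

regional_plan = nest(org_plan, "MSME clusters operating in Region X",
                     ["Interest groups set up", "Service providers networked"])
```

Repeating `nest` on `regional_plan` would produce team-level plans, with objectives becoming more specialised at each level, as described above.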
8.3 Useful References
References to all the donor agency logframe handbooks and guidance sheets are given in Appendix P. Key references for the development of logframes are given
below.
Dearden P.N., Jones S. and Sartorius, R. (2003) Tools for Development: A Guide for Personnel Involved in Development. Department for International Development, London. (pp 144). “Tools for Development”
Asian Development Bank Guidelines for Preparing a Design and Monitoring Framework (DMF) (2006) www.adb.org/Documents/guidelines/guidelines-preparing-dmf/guidelines-preparing-dmf.pdf
AusAID guides www.ausaid.gov.au/ausguide/default.cfm
EuropeAid guides http://europa.eu.int/comm/europeaid/qsm/project_en.htm and http://europa.eu.int/comm/europeaid/qsm/documents/pcm_manual_2004_en.pdf
SIDA guide http://www.sida.se/shared/jsp/download.jsp?f=SIDA1489en_web.pdf&a=2379
APPENDIX A: GLOSSARY

The following terms and definitions are applied in the evaluation of global and regional partnership projects and programmes14. Many of these terms and their respective definitions are based on the Organisation for Economic Co-Operation and Development (OECD)/Development Assistance Committee (DAC) Glossary of Key Terms in Evaluation and Results-Based Management of 2002.

Accountability: In the evaluation context, Accountability refers to the results and effects of
a development intervention, not the funding or legal responsibility. Accountability in
development refers to the obligation to demonstrate that work has been conducted in
compliance with agreed rules and standards or to report fairly and accurately on performance
results with regard to clearly defined roles, responsibilities and performance expectations of
partners in the use of resources.
Activity: Actions taken or work performed through which inputs—such as funds, technical
assistance, and other types of resources—are mobilised to produce specific outputs.
Aid Effectiveness at Country Level: Aid Effectiveness is indicated by Ownership;
Harmonisation; Alignment and Mutual Accountability.
Appraisal: An overall assessment of the relevance, feasibility and potential sustainability of
a development intervention prior to a decision of funding. The purpose of appraisal is to
enable decision-makers to decide whether the activity represents an appropriate use of
resources.
Appropriateness of processes: Appropriateness of processes is a criterion for
examining whether: processes have been followed that would ensure the relevance of
policies and the effectiveness of results; approaches and concrete efforts were made to
tackle particular issues identified; there was coordination with other donors and international
organisations; consultations took place with the recipient countries; the implementation
system was sufficient; and processes were followed to regularly monitor the implementation
status.
Assumptions: Hypotheses about factors, risks or conditions which could affect the progress
or success of a development intervention. Assumptions are made explicit in theory-based
evaluations where evaluation tracks systematically the anticipated results chain.
Attribution: The ascription of a causal link between observed (or expected) changes and a specific intervention. This involves a comparison of net outcomes/impacts caused by an intervention with gross outcomes/impacts. It refers to that which is to be credited for the observed changes or results achieved. It represents the extent to which observed development effects can be attributed to a specific intervention or to the
performance of one or more partners taking into account other interventions (anticipated and
unanticipated) confounding factors, or external shocks. Formal attribution, which is the
separation of the MDBs’ role from that of other internal or external players, is extremely
difficult because of the multiplicity of factors that affect development outcomes and impacts.
Audit: An independent, objective assurance activity designed to add value and improve an
organization‘s operations. It helps an organisation accomplish its objectives by bringing a
systematic, disciplined approach to assess and improve the effectiveness of risk
management, control and governance processes. A distinction is made between regularity (financial) auditing, which focuses on compliance with applicable statutes and regulations, and performance auditing, which is concerned with relevance, economy, efficiency and effectiveness. Internal auditing provides an assessment of internal controls undertaken by a unit reporting to management, while external auditing is conducted by an independent organization.

14 From Caribbean Development Bank Performance Assessment System (PAS), Volume 1: Public Sector Investment Lending and Technical Assistance.
Bank Performance: The extent to which services provided by the Bank ensured quality at entry of the intervention and supported effective implementation through appropriate supervision (including ensuring adequate transition arrangements for regular operation of supported activities after completion of the intervention), towards the achievement of development outcomes.
Base-Line Study: An analysis describing the situation prior to a development intervention,
against which progress can be assessed or compared.
Benchmark: Reference point or standard against which performance or achievements can
be assessed.
Beneficiaries: Individuals, groups or organisations, whether targeted or not, that benefit
directly or indirectly, from the development intervention.
Borrower Performance: The extent to which the borrower/client (including government and
implementing agency or agencies) ensured quality of preparation and implementation, and
complied with covenants/conditionality and agreements, towards the achievement of
development outcomes.
Client Satisfaction: Comparison of outputs (goods or services) with client expectations.
Cluster Evaluation: An evaluation of a set of related activities, projects and/or programs.
Conclusion: Conclusions point out the factors of success or failure of the evaluated
intervention, with special attention paid to the intended and unintended results and impacts,
and more generally to any other strength or weakness. A conclusion draws on data
collection and analyses undertaken, through a transparent chain of arguments.
Contribution: Assessment of results to determine whether an MDB has made a contribution
to key results or outcomes, that is both plausible and meaningful and, identifying the main
drivers of the outcomes. A plausible association of MDB assistance with development results
can be assessed by: characterizing the role played by the MDB in the sector or thematic
domain (i.e. lead MDB); examining the policies and actions of other major development
partners for consistency with those of the MDB; and examining evidence that the main
outcomes were not achieved primarily due to exogenous events.
Cost-Effectiveness: Comparison of the outcomes/impacts of an intervention with their
costs.
Counterfactual: The situation or conditions which hypothetically may prevail for individuals, groups or organisations were there no development intervention (i.e. the “Without Intervention” scenario).
CREAM Indicators of good performance: These should be Clear (precise and
unambiguous), Relevant (appropriate to the subject at hand), Economic (available at a
reasonable cost), Adequate (sufficient to assess performance), and Monitorable (amenable
to independent validation).
Country Programme Evaluation: Evaluation of one or more donor’s or agency’s portfolio of
development interventions, and the assistance strategy behind them in a partner country.
Data Collection Tools: Methodologies used to identify information sources and collect
information during an evaluation. Examples are informal surveys, direct and participatory
observation, community interviews, focus groups, expert opinion, case studies and literature
search.
Descriptor: A word or phrase (key) used to categorise records in a database so that all
records containing the key can be retrieved together.
Development Intervention: An instrument for partner (donor and non-donor) support aimed
to promote development. Examples are policy advice, projects, and programmes.
Development Objective (DO): The intended impact contributing to physical, financial,
institutional, social, environmental, or other benefits to a society, community, or group of
people through one or more development interventions.
Development Programme: This is a time-bound intervention (country programme/strategy
evaluation) involving multiple activities that may cut across sectors, themes and/or
geographic areas.
Effect: Intended or unintended change, results, outcomes due directly or indirectly to an
intervention.
Effectiveness: The extent to which the development intervention’s objectives were
achieved or are expected to be achieved, taking into account their relative importance. The
process of determining effectiveness involves the comparison of the actual results of an
intervention with planned/expected results. Effectiveness is also used as an aggregate
measure of merit/worth, i.e. the extent to which an activity/intervention has attained, or is
expected to attain, its major relevant objectives efficiently, in a sustainable manner and with
a positive institutional development impact.
Effectiveness of Results: Effectiveness of results assesses to what degree the initial
targets have been achieved. To assess effectiveness, therefore, indicators are needed for
gauging the respective results of activities at the input, output, and outcome levels.
Efficiency: A measure of how economically resources/inputs (funds, expertise, time etc.)
are converted to outputs. Comparison of the outputs (good and services) of an intervention
with their costs.
Evaluability: Extent to which an activity or a program can be evaluated in a reliable and
credible manner. Evaluability assessment calls for the early review of a proposed activity in
order to ascertain whether its objectives are adequately defined and its results verifiable.
Evaluation: The systematic and objective assessment of an on-going or completed project,
programme, strategy or policy, its design, implementation and results. The aim is to
determine the relevance and fulfillment of objectives, development efficiency, effectiveness
(Effectiveness), impact and sustainability by comparing expected results with actual results.
An evaluation should provide information that is credible and useful, enabling the
incorporation of lessons learned into the decision making process of both recipients and
donors. Evaluation also refers to the process of determining the worth or significance of an
activity, policy, project or programme. It is an assessment, as systematic and objective as
possible, of a planned, ongoing or completed development intervention. In some instances,
evaluation involves the definition of appropriate standards, examination of performance
against those standards, an assessment of actual and expected results and the identification
of relevant lessons.
Evaluation Ethics: The evaluation process shows sensitivity to gender, beliefs, manners
and customs of all stakeholders and is undertaken with integrity and honesty. The rights and
welfare of participants in the evaluation are protected. Anonymity and confidentiality of
individual informants should be protected when requested and/or as required by law.
Evaluation Methodology: The evaluation methodology is clarified by defining what criteria
and what analytical approaches will be used to assess the evaluation subjects. The
evaluation framework, which lucidly organises such key points as the evaluation
perspectives, evaluation criteria, and sources of information, provides the basis of discussion
for determining the evaluation methodology. The method of analysis used in evaluation may
be either qualitative or quantitative. Quantitative methods, such as cost benefit analysis and
econometric analysis, are possible when quantitative targets have been set and a wealth of
data is available.
Evaluation Standards: As an assessment of planning, implementation, and results,
evaluation needs to have standards. Five of these are widely used - Relevance,
Effectiveness, Efficiency, Impacts and Sustainability. These criteria do not need to be
regarded as a set. Individual criteria can be selected according to the purpose and object of
an evaluation, and it should be remembered that there are other criteria besides these five.
Ex-ante Evaluation: An evaluation that is performed before implementation of a
development intervention. Ex-ante evaluation may also refer to appraisal of a development
intervention or assessment of quality at entry of a development intervention.
Ex-post Evaluation: Evaluation of a development intervention after it has been completed.
Its function is to identify factors of success or failure; assess the sustainability of results and
impacts; and provide information, conclusions and recommendations that may inform other
interventions or future interventions in the same sector/country. It may be undertaken
directly after or long after completion of implementation of a development intervention.
Formative Evaluation: Evaluation intended to improve performance, most often conducted
during the implementation phase of projects or programmes. Formative/process evaluations
may also be conducted to determine compliance, legal requirements or as part of a larger
evaluation initiative.
Goal: The higher-order DO to which a development intervention is intended to contribute.
Impact Assessment: An assessment of an intervention’s impact is a comparison of the measures before and after implementation of the intervention: it compares the situation before the intervention is implemented (using baseline data) with the results achieved from implementing the intervention. It is equivalent to the “With” and “Without” (counterfactual) scenarios applied in project appraisal analysis.
Impacts: Positive and negative, primary and secondary, long-term effects, both intended
and unintended, produced directly or indirectly by a development intervention. Impacts are
the kinds of organizational, community, or system level changes expected to result from
intervention activities and which might include improved conditions, increased capacity,
and/or changes in the policy arena.
Independent Evaluation: An evaluation carried out by entities/persons free of the control of
those responsible for the design and implementation of an intervention.
Independence of evaluators’ vis-à-vis stakeholders: The evaluation report indicates
the degree of independence of the evaluators from the policy, operations and management
function of the commissioning agent, implementers and beneficiaries. Possible conflicts of
interest are addressed openly and honestly. The evaluation team is able to work freely and
without interference. It is assured of cooperation and access to all relevant information. The
evaluation report indicates any obstruction which may have impacted on the process of
evaluation.
Indicator: Quantitative or qualitative factor or variable that provides a simple and reliable
means to measure achievement, to reflect the changes connected to an intervention, or to
help assess the performance of a development actor.
Inputs: The financial, human, and material resources used for the development intervention.
Institutional Development Impact: The extent to which an intervention improves or
weakens the ability of a country or region to make more efficient, equitable, and sustainable
use of its human, financial, and natural resources e.g.: through (a) better definition, stability,
transparency, enforceability, and predictability of institutional arrangements, or (b) better
alignment of the organization’s mission and capacity with its mandate, which derives from
these institutional arrangements. Such impacts can include the intended and unintended
effects of an action.
Intervention: a capital project, programme loan, policy-based financing or technical
assistance loan or grant financed by the Bank.
Intervention under Implementation: A capital project loan/policy-based loan for which the Loan Agreement has been signed, and which remains under implementation until the loan is fully disbursed or cancelled, or until the relevant project completion report has been prepared.
Intervention in Operation: An intervention is in operation when the loan is fully disbursed
or the undisbursed balance is cancelled. A project remains in operation until the loan is fully
repaid.
Lesson Learned/Lesson of Experience: Generalizations based on evaluation experiences with interventions or policies that abstract from the specific circumstances to broader situations.
Frequently, lessons highlight strengths or weaknesses in preparation, design, and
implementation that affect performance, outcome and impact.
Loan Effectiveness: A loan becomes effective when conditions precedent to first disbursement are satisfied.
Logical Framework (Log frame): A Log frame is a management tool used to improve the
design of interventions, most often at the project level. It involves identifying strategic
elements (inputs, outputs, outcomes, impact) and their causal relationships, indicators, and
the assumptions or risks that may influence success and failure. It thus facilitates planning,
execution, and evaluation of a development intervention.
Logic Model: A logic model is a technical tool for summarizing all relevant information
related to development assistance or a program or project. Logic models are usually
presented in a matrix covering such categories as objectives/results, inputs, indicators (or
objectively verifiable indicators), means of verification, and assumptions/risks. Various types
of logic models have been designed for different purposes; there is no “correct” format.
Managing for Development Results (MfDR): MfDR is a management strategy that focuses
on the development performance and sustainable improvements in country outcomes. It
provides a coherent framework for development effectiveness in which performance
information is used for improved decision-making; and it includes practical tools for strategic
planning, risk management, progress monitoring, and outcome evaluation.
Meta-Evaluation: The term is used for evaluations designed to aggregate findings from a
series of evaluations. It can also be used to denote the evaluation of an evaluation to judge
its quality and/or assess the performance of the evaluators.
MfDR Principles: 1. Focusing the dialogue on results at all phases of the development
process. 2. Aligning programming, monitoring, and evaluation with results. 3. Keeping
measurement and reporting simple. 4. Managing for, not by results. 5. Using results
information for learning and decision-making.
Millennium Development Goals (MDGs): In 2000, in a key effort to promote more effective
development, 189 UN member countries agreed to work toward reduction of global poverty
and improved sustainable development. These global aims are reflected in the eight MDGs,
with their 18 targets and 48 performance indicators. The MDGs provide specific, measurable
targets that are gradually being adapted at the country level as the basis for country
outcomes and then monitored over time to help gauge progress.
Monitoring: A continuing function that uses systematic collection of data on specified
indicators to provide management and the main stakeholders of an ongoing development
intervention with indications of the extent of progress and achievement of objectives and
progress in the use of allocated funds.
OECD/DAC Evaluation Criteria: Relevance - the extent to which the objectives of a
development intervention are consistent with beneficiaries’ requirements, country needs,
global priorities and partners’ and donors’ policies; Effectiveness -the extent to which the
development intervention’s objectives were achieved, or are expected to be achieved, taking
into account their relative importance; Efficiency - a measure of how economically
resources/inputs (funds, expertise, time etc.) are converted to results; Impacts - positive and
negative, primary and secondary long-term effects produced by a development intervention,
directly or indirectly, intended or unintended; and Sustainability - the continuation of the main
benefits from a development intervention after major development assistance has been
completed; the probability of continued long-term benefits; and the resilience to risk of the net
benefit flows over time.
Outcomes: The likely or achieved short-term and medium-term effects of an intervention’s
outputs or the extent to which an intervention’s major relevant objectives were achieved, or
are expected to be achieved efficiently. Outcomes are the observable behavioral,
institutional, and societal changes that take place over 3 to 10 years, usually as the result of
coordinated short-term investments in individual and organizational capacity building for key
development stakeholders (such as national governments, civil society, and the private
sector).
Outputs: The direct results of the activities of an intervention. Outputs are the products,
capital goods, and services that result from a development intervention; and they may also
include changes resulting from the intervention that are relevant to the achievement of
outcomes. Outputs are usually described in terms of size and scope of the services or
products delivered or produced by the intervention. They indicate whether or not an
intervention was delivered to the intended audiences at the intended level. An intervention
output, for example, might include the number of classes taught, meetings held, materials
distributed, program participation rates, or total service delivery hours.
Partners: Individuals, groups and organizations that collaborate to achieve mutually agreed
upon objectives. The concept connotes shared goals, common responsibilities for outcomes,
distinct accountabilities and reciprocal obligations. Partnerships include governments, civil
societies, non-governmental organisations, universities, professional and business
associations, multilateral organisations, private companies and investment consortiums.
Participatory Evaluation: Evaluation method in which representatives of agencies and
stakeholders (including beneficiaries) work together in designing, carrying out and
interpreting an evaluation.
Partnership: Partnership can be defined as a collaborative relationship between entities to
work toward shared objectives through a mutually agreed division of labor. At the country
level, this means engaging under government leadership with national stakeholders and
external partners (including international development agencies) in developing,
implementing, and monitoring a country’s own development strategy.
Performance: The degree to which a development intervention or a development partner
operates according to specific criteria/standards/guidelines or achieves results in accordance
with stated goals and plans.
Performance Indicator: A variable that allows the verification of changes in the
development intervention or shows results relative to what was planned.
Performance Measurement: A system for assessing performance of development
interventions against stated goals.
Performance Monitoring: A continuous process of collecting and analysing data to
compare how well a development intervention (project, programme or policy) is being
implemented against expected results.
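The comparison at the heart of performance monitoring (actual implementation against expected results) can be sketched in a few lines of code. This is an illustrative sketch only; the indicator values and period labels are invented, not drawn from any CDB or MfDR system:

```python
# Performance-monitoring sketch: compare actual values of an indicator
# against planned cumulative targets over time. All figures are hypothetical.
planned = {"2013": 50, "2014": 100, "2015": 150}   # cumulative targets
actual = {"2013": 45, "2014": 105, "2015": 130}    # data actually collected

def variance_report(planned, actual):
    """Percent achievement of each period's target (100 = on target)."""
    return {
        period: round(100 * actual[period] / planned[period])
        for period in planned
    }

print(variance_report(planned, actual))
```

A report like this only supports, and never replaces, management judgement about why an intervention is ahead of or behind its targets.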
Process Evaluation: A process evaluation evaluates the internal dynamics of implementing
organizations, their policy instruments, their service delivery mechanisms, their management
practices, and the linkages among these.
Productivity: Comparison of the outputs (goods and services) of an intervention with
physical inputs.
Programme Evaluation: Evaluation of a set of interventions, marshaled to attain specific
global, regional, country, or sector DOs.
Project Implementation Start-up: An intervention enters the portfolio on the date the Loan
Agreement is signed.
Project or Programme Objective: The intended physical, financial, institutional, social,
environmental or other development results to which a project or programme is expected to
contribute.
Project Under Implementation: An intervention for which the Loan Agreement has been
signed, and under implementation until the loan is fully disbursed, cancelled or until the
relevant project completion report has been prepared.
Purpose: The publicly stated objectives of the development project or programme.
Quality: Comparison of the characteristics/value of the outputs of an intervention to
technical standards.
Quality Assurance: Encompasses any activity that is concerned with assessing and
improving the merit or worth of a development intervention or its compliance with given
standards. Quality assurance may also refer to the assessment of the quality of a portfolio
and its development effectiveness.
Recommendations: Proposals aimed at enhancing the effectiveness, quality or efficiency of
a development intervention; at redesigning the objectives; and/or at the reallocation of
resources. Recommendations are actionable proposals and should be linked to
conclusions.
Relevance: Relationship/link of the objectives of an intervention to broader country or
development agency goals. The extent to which the objectives of a development intervention
are consistent with beneficiaries’ requirements, country-needs, global priorities and partners’
and donors’ policies. Relevance also indicates whether the objectives of an intervention or
its design are still appropriate given changed circumstances.
Reliability: Consistency or dependability of data and evaluation judgements, with reference
to the quality of the instruments, procedures and analyses used to collect and interpret
evaluation data.
Results: The output, outcome, or impact (intended or unintended, positive and negative) of
a development intervention.
Results Chain: The causal sequence for a development intervention that stipulates the
necessary sequence to achieve desired objectives, beginning with inputs, moving through
activities and outputs, and culminating in outcomes, impacts, and feedback.
Results Framework: The program logic that explains how the DO is to be achieved,
including causal relationships and underlying assumptions.
Results-Based Management (RBM): A management strategy focusing on performance
and achievement of outputs, outcomes and impacts. It provides the management
frameworks and tools for strategic planning, risk management, performance monitoring, and
evaluation. Its main purposes are to improve organizational learning and to fulfill
accountability obligations through performance reporting.
The following is the logic flow in RBM:

Resources (Inputs) → Activities → Outputs → Short-term Outcomes → Medium-term Outcomes → Long-term Outcomes → Long-term Impacts

Each stage of the chain answers a guiding question: HOW should this be implemented? (Resources and Activities); WHAT should be produced? (Outputs); WHAT outcomes do we expect from this investment, and HOW are the outputs used? (Outcomes); and WHY should we do this? (Impacts).
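For readers who find it helpful to see the chain written out explicitly, the flow can be sketched as a simple ordered structure. The level names follow the RBM logic flow; the representation itself is an illustrative sketch, not part of any RBM standard:

```python
# Illustrative sketch of the RBM results chain. Each level is paired with
# the guiding question it answers in the logic flow.
RESULTS_CHAIN = [
    ("Resources (Inputs)", "HOW should this be implemented?"),
    ("Activities", "HOW should this be implemented?"),
    ("Outputs", "WHAT should be produced?"),
    ("Short-term Outcomes", "WHAT outcomes do we expect? (HOW are outputs used?)"),
    ("Medium-term Outcomes", "WHAT outcomes do we expect? (HOW are outputs used?)"),
    ("Long-term Outcomes", "WHAT outcomes do we expect? (HOW are outputs used?)"),
    ("Long-term Impacts", "WHY should we do this?"),
]

def chain_summary(chain):
    """Render the causal sequence as a single arrow-separated line."""
    return " -> ".join(level for level, _question in chain)

print(chain_summary(RESULTS_CHAIN))
```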
Review: An assessment of the performance of an intervention, periodically or on an ad hoc
basis. Reviews are usually less comprehensive and/or in-depth than evaluations. They tend
to emphasize operational aspects.
Risk to Development Outcome (RDO): The risk, at the time of evaluation, that
development outcomes or expected outcomes will not be realised or maintained. Some
RDOs are internal or specific to an intervention; and are primarily related to the suitability of
the intervention’s design to its operating environment. Other RDOs arise from factors outside
the intervention (e.g. price changes at the country level, technological advances at the global level). The
impact on outcomes of a change in the operating environment depends on the severity and
nature of the change, as well as the adaptability (or lack thereof) of the design of the
intervention to withstand that change.
Risk analysis: An analysis or an assessment of factors (called assumptions in the Log
frame) that affect or are likely to affect the successful achievement of an intervention’s
objectives. It is a detailed examination of the potential unwanted and negative consequences
to human life, health, property, or the environment posed by development interventions; a
systematic process to provide information regarding undesirable consequences; and the
process of quantification of the probabilities and expected impacts for identified risks.
Categories of risk include: technical (where innovative technology and systems are involved);
financial (robustness of financial flows and financial viability); economic (both at country and
global level); social (strength of stakeholder support and/or mitigation of any negative social
impacts); political (volatility of stability of political situation); environmental (both positive and
negative impacts); government ownership/commitment (continuation of supportive policies
and any budgetary provisions); other stakeholder ownership (private sector, civil society);
institutional support (from cooperating agencies/institutions, other development partners,
issues related to legal/legislative framework); governance; and exposure to natural hazards.
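The quantification step described above, estimating the probability and expected impact of each identified risk, can be sketched as a simple scoring exercise. The 1-5 scales, the threshold and the example risks below are assumptions for illustration only, not an MfDR convention:

```python
# Illustrative risk-scoring sketch: probability x seriousness on assumed
# 1-5 scales. High-scoring risks are flagged for mitigatory measures
# in the project design.
def risk_score(probability, seriousness):
    """Both inputs on a 1-5 scale; a higher product means higher priority."""
    assert 1 <= probability <= 5 and 1 <= seriousness <= 5
    return probability * seriousness

# Hypothetical example risks, drawn from the categories listed above.
risks = {
    "technical: innovative system fails": (2, 4),
    "financial: funding flows interrupted": (3, 5),
    "environmental: exposure to natural hazards": (2, 3),
}

# Assumed threshold of 10 for "needs explicit mitigation".
needs_mitigation = sorted(
    name for name, (p, s) in risks.items() if risk_score(p, s) >= 10
)
```

Scoring is only a prompt for discussion; the judgement about which mitigatory measures to build into Column 1 of the logframe remains with the project team.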
Sector: Includes development activities commonly grouped together for the purpose of
public action (such as health, education, agriculture, transport etc.).
Sector Program Evaluation: Evaluation of a cluster of development interventions in a
sector within one country or across countries, all of which contribute to the achievement of a
specific development goal.
Self-Evaluation: An evaluation by those who are entrusted with the design and delivery of a
development intervention.
SMART Outcomes and Impacts: Targeted outcomes and impacts should be Specific,
Measurable, Action-oriented, Realistic and Time-bound.
Stakeholders: Agencies, organisations, groups or individuals who have a direct or indirect
interest in the development intervention or its evaluation.
Summative Evaluation: A study conducted at the end of an intervention (or a phase of that
intervention) to determine the extent to which anticipated outcomes were produced.
Summative evaluation is intended to provide information about the worth of the program.
Sustainability: The continuation of benefits from a development intervention after major
development assistance has been completed. It is also the probability of continued long-
term benefits and the resilience to risk of the net benefit flows over time.
Target Group: The specific individuals or organisations for whose benefit the development
intervention is undertaken.
Terms of Reference: Written document presenting the purpose and scope of the
evaluation, the method to be used, the standard against which performance is to be
assessed or analyses are to be conducted, the resources and time allocated, and reporting
requirements. Two other terms are sometimes used with the same meaning: ‘scope of
work’ and ‘evaluation mandate’.
Thematic Evaluation: Evaluation of a selection of development interventions, all of which
address a specific development priority that cuts across countries, regions, and sectors.
Triangulation: The use of three or more theories, sources or types of information to verify
and substantiate an assessment. By combining multiple data sources, methods, analyses or
theories, evaluators seek to overcome the bias that comes from single informants, single
methods, single observer or single theory studies.
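As a sketch of the idea, combining three or more independent estimates of the same indicator dampens the influence of any single biased source. The sources and figures below are invented purely for illustration:

```python
# Triangulation sketch: three independent estimates of the same indicator
# (e.g. a household survey, administrative records, key-informant interviews).
# Taking the median means one biased source cannot dominate the assessment.
import statistics

estimates = {
    "household survey": 62.0,       # hypothetical values
    "administrative records": 58.0,
    "key informants": 90.0,         # an outlier from a single informant
}

triangulated = statistics.median(estimates.values())
print(triangulated)
```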
Validity: The extent to which the data collection strategies and instruments measure what
they purport to measure.
Value for Money: Value for money (VFM) is about striking the best balance between the
“three E’s” − economy, efficiency and effectiveness. It is not a tool or a method, but a way of
thinking about using resources well: the optimum combination of whole-life cost and quality
(or fitness for purpose) to meet the user’s requirement. It can be assessed using the criteria
of economy, efficiency and effectiveness – ‘the optimal use of resources to achieve intended
outcomes’.
The Good Practice Guide for Public Sector procurement guidance for public entities defines
value for money as: “Using resources effectively, economically, and without waste, with due
regard for the total costs and benefits of an arrangement, and its contribution to the
outcomes the entity is trying to achieve.” It adds that the principle of value for money when
procuring goods or services does not necessarily mean selecting the lowest price, but rather
the best possible outcome for the total cost of ownership (or whole-of-life cost); and that
value for money is achieved by selecting the most appropriate procurement method for the
risk and value of the procurement, and not necessarily by using a competitive tender.
A range of assessment tools is available to provide analysis and information to support
good judgement in decision making:
- Comparison with other activities: compare the costs of the activity, or the prices/rates charged for specific budget items, with those in other activities with comparable outcomes and context.
- Programme logic analysis: assesses the internal logic of the programme and the fit between outcomes, outputs, inputs and budget.
- Cost-effectiveness analysis: considers proposed outcomes and outputs, and analyses the approach and budget and whether there are alternative ways of delivering the required outcome.
- Financial viability or cost-benefit analysis: analyses the income and expenditure of the enterprise/service and the case for its viability.
- Economic cost-benefit analysis: assigns dollar values to all costs and benefits of an activity to produce a measure of the net monetary benefit (or cost) of the activity that can be compared or ranked against other activities.
- Opportunity cost analysis: the value of what could be done with the same resources if they were spent elsewhere.
- Multi-criteria ranking: procurement evaluation that ranks options against several criteria.
- Quality assurance processes and tools: appraisal and peer review, monitoring and review to provide evidence of progress in achieving development outcomes.
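Several of the tools listed above reduce to comparing cost per unit of outcome across alternative ways of delivering the same result. A minimal sketch, with invented figures, shows why value for money is not simply the lowest total price:

```python
# Cost-effectiveness sketch: compare alternative delivery options by cost
# per unit of outcome. The options and figures are hypothetical.
activities = {
    "mobile training units": {"total_cost": 120_000, "trainees": 400},
    "centralised workshops": {"total_cost": 90_000, "trainees": 250},
}

def cost_per_outcome(activity):
    """Unit cost of the outcome actually delivered."""
    return activity["total_cost"] / activity["trainees"]

# Best value for money here is NOT the cheapest option overall:
# the pricier option delivers each trainee at a lower unit cost.
best = min(activities, key=lambda name: cost_per_outcome(activities[name]))
print(best)
```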
APPENDIX B: SOME MYTHS ABOUT RESULTS BASED MANAGEMENT

1. When an organisation has decided to adopt RBM it has to demonstrate results to stakeholders as soon as possible.
Experience shows that it takes time, often up to ten years, to fully establish and implement a performance measurement and management system. It takes even longer to see the higher-level results. It is better to follow a structured step-by-step approach with clear milestones to ensure buy-in and ownership from staff, beneficiaries and intermediaries. It takes time to develop strategic plans and instruments for results measurement, to monitor results data long enough to establish trends and to judge performance vis-à-vis targets, and to evolve new reporting and decision-making processes in which performance information is used.

2. RBM seems to involve a lot of extra work that we cannot possibly afford.
The process of making changes requires its own resources, but the cost of implementation is temporary and better viewed as an investment. As the change begins to take place and RBM becomes a part of the organization, of its business processes and of its culture, these costs will go down. In the long term RBM has the potential to reduce costs by streamlining procedures and processes and introducing simple management tools. It is important to note that many organisations are already doing much of the work RBM requires – developing project plans, collecting performance data and reporting on performance. RBM is more about connecting and streamlining these activities to make sure they are all efficiently aligned to support the results that the organisation needs to achieve. Collecting performance data may be expensive, but not using data to understand and improve performance is far more expensive.

3. RBM is only for technical delivery, not for other service units.
RBM applies to any work function in a development organization and can demonstrate: i) its development results (through project/programme achievement) and ii) its organizational management results (through efficiency and effectiveness in using its human, financial and information resources). While technical projects demonstrate CDB development results, RBM in each organizational unit illustrates organizational management results. Therefore administrative or service units (e.g. human resources, finance, procurement, building management) also need to define the results they are striving to achieve and the strategy for achieving these results. They need to systematically plan, measure and manage their work to ensure that they remain focused on results and do all they can to maximize results. Traditionally, administrative functions have been viewed as processes to be carried out in accordance with prescribed procedures and with limited emphasis on results. But administrative functions also face the same pressures as any other function to demonstrate their value and justify the resources they utilize.

4. We cannot be held accountable for results over which we have little control.
RBM does contribute to greater accountability for results throughout the organization. Translating that accountability into employee performance appraisal systems is not a mechanical process, however. For RBM to work effectively, employees must feel comfortable discussing their performance, setting targets and measuring improvement. Holding employees directly accountable for achieving a defined outcome is often not only unrealistic – because many different factors may have contributed to that outcome – but can also discourage the kind of open and honest approach to planning and performance measurement that is so essential to RBM. A better approach is to hold employees accountable for: i) influencing outcomes (not achieving outcomes) and ii) managing for results – applying RBM processes and principles diligently and skillfully to understand performance and adjust strategies and operations to improve performance.
APPENDIX C: GROWTH IN THE USE OF RESULTS BASED MANAGEMENT
Early 1990s
Many countries carried out extensive public sector reforms to become more results-oriented and effective in response to social and political pressures. They also faced ‘aid fatigue’, as there was a growing public perception that aid programmes were failing to produce significant development results.

Mid 1990s
Management reforms focused on accountability, performance and results were applied by government agencies in Australia, Canada, New Zealand, the Nordic countries, the UK and the USA. In 1996 the Canadian International Development Agency (CIDA) introduced its Policy Statement on Results-Based Management.

Late 1990s
Many bilateral agencies formally adopted results-based management. The World Bank was one of the first multilateral organisations to endorse the approach. In 1997 Kofi Annan, the UN Secretary-General, proposed results-based budgeting (RBB) to replace programme budgeting in order to make the Organisation more effective. In 1999 the United Nations Development Programme (UNDP) and the UN World Food Programme (WFP) became the first UN organisations to use RBM.

2000-2005
Adopted in 2000, the Millennium Development Goals (MDGs) embodied the results-based approach to development by identifying a set of goals and measurable targets with specific dates for achievement and performance indicators to measure progress. This increased pressure on UN organisations, bilateral donors and multilateral banks to demonstrate their commitment to achieving the goals in a harmonised manner. In 2004 the UN General Assembly approved nine benchmarks to measure progress towards effective implementation of RBM and to harmonise RBM terminology and approach across the UN. In 2005 the Paris Declaration put in place a series of implementation measures and established a monitoring system to evaluate progress. It outlined five fundamental principles for making development aid more effective: ownership, alignment, harmonisation, results and mutual accountability.

2006-2010
In 2006 the UN launched a pilot initiative, ‘Delivering As One’, aimed at increasing the coherence, effectiveness and efficiency of UN operations through the establishment of one UN Joint Office in each of eight countries. In 2008 the Accra Agenda for Action was designed to strengthen and deepen implementation of the Paris Declaration. It evaluated the progress of change and set the agenda for accelerated achievement of the Paris targets.

2011-2014
In 2011 the Busan Partnership for Effective Development Cooperation led to a proposal for a working partnership between the OECD and the UN, which has since been set up. By 2014 the ‘Delivering As One’ initiative had been adopted in 37 countries.
APPENDIX D: PROJECT MANAGEMENT
What is a project? A project can be defined as ‘a series of activities aimed at bringing about clearly specified objectives within a defined time period and with a defined budget’15.
Another definition of a project might be ‘a temporary organisation that is needed to produce a unique and defined Purpose or result at a pre-specified time using predetermined resources.’16
A project should have a number of features:
- a finite, defined life cycle
- defined and measurable results
- a set of activities to achieve those results
- defined stakeholders
- an organisational structure with clear roles and responsibilities for management, coordination and implementation
- a defined amount of resources and
- a monitoring, review and evaluation system.

Within the business context, emphasis is placed on the need for a project to be created and implemented according to a specified business case. In the development context this may not be considered relevant; but it is. Omit the word ‘business’ and the message is clear and useful: a project needs to have a specified case. It needs to be based on a clear rationale and logic; it must be ‘defendable’ at all stages when it comes under scrutiny.

By its very nature, a project is temporary, set up for a specific purpose. When the expected results have been achieved, it will be disbanded. So projects should be distinguished from ongoing organisational structures, processes and operations, which have no clear life cycle. These organisational aspects may well of course provide key support functions to projects, but those aspects do not come within the remit of the project team; where needed they are in effect services bought in by the project. (One individual can of course have more than one role, one of which may be long-term and ongoing within the organisation, another temporary within a project.)

Within the development context there are many different types of project, different in purpose, scope and scale, and this can lead to confusion. In essence a project is any planned initiative that is intended to bring about beneficial change in a nation, community, institution or organisation. It has boundaries that are determined by its objectives, resources and time span. A ‘project’ is typically a free-standing entity, relatively small in budget, short in duration and delivered by its own implementation unit. Or it may be an endeavour with a multi-million dollar budget and a timeframe stretching to a decade. But the same term is sometimes confusingly used
15 EU (2004) Aid Delivery Methods. Volume 1: Project Cycle Management Guidelines, available at ec.europa.eu/comm/europeaid/reports/pcm_guidelines_2004_en.pdf
16 This definition comes from PRINCE2, a project management method established by the UK Office of Government Commerce (OGC), which has become a standard used extensively by the UK government and is also widely used and recognised internationally. OGC (2005) Managing Successful Projects with PRINCE2.
also for large and complex initiatives embedded within still larger programmes, with rolling time-frames and involving multiple partners. The term is sometimes also used for the development of an element of policy. These notes are about project planning; but remember essentially the same principles, processes and tools can also be applied in programme planning.
Weaknesses of the project approach
‘Classical’ projects in the development context have come in for much, usually highly justified, criticism; for example:
- ‘Outsider’ (usually donor) controlled priorities and systems
- Not aligned with national priorities
- Little local ownership; not responsive to real needs; weak implementation, accountability and sustainability
- Not addressing holistic, cross-sectoral issues; the management language of projects is full of metaphors, exacerbating the tendency to think and work in ‘boxes’ or ‘silos’
- Fragmented and disjointed effort (sometimes in opposite directions)
- Perverse incentives (e.g. well-funded ‘capacity building’ projects can de-skill other key actors such as government departments)
- High transaction costs; excessive demands on the time of national government offices; poorly harmonised planning and reporting systems
- Bias in spending; tied aid.
But all these issues are not unique to projects; many can apply equally to other aid approaches. And they have not meant that projects have disappeared. In non-state work, such as civil society (e.g. NGOs, charities) and the private sector, projects remain a key aid modality. And projects remain within state work, but the nature and ownership of those projects and the funding mechanisms behind them have changed and are continuing to change.
What is the Project Manager’s Role?
Every project requires management. Someone should be setting objectives, allocating resources, delegating responsibility and monitoring performance in order to keep the project on track.
Of course, as in any management situation, the style that the manager adopts can vary from a very authoritarian, vanguard leader with a hands-on approach, through to a consultative, delegating manager who is one step back from the action, to a democratic, developer manager who facilitates others to achieve. We would advocate the latter.
As a project manager you are key to the success of the project. To be effective you must be able to:
- Lead and/or coordinate a team of skilled individuals
- Communicate with everyone involved with the project
- Motivate the project team, stakeholders and contractors
- Negotiate effective solutions to the various conflicts that may arise between the needs of the project and its stakeholders
- Identify the risks to the project and limit their effects upon its success
- Use a variety of basic project management tools and techniques
- Maintain a good sense of humour at all times!
Do however please remember:
Tools such as stakeholder and problem analysis are not a substitute for professional judgement; they are complementary to it!
What is Project Cycle Management (PCM)?
The term Project Cycle Management (or PCM as it is sometimes called) is used to describe the management activities, tools and decision-making procedures used during the life of the project. This includes key tasks, roles and responsibilities, key documents and decision options.
The objective of PCM is to provide a standard framework in which projects are developed, implemented and evaluated. The concept of a cycle ensures that lessons from the different experiences of the project are learned and factored into new projects, programmes and policy.
The use of PCM tools and decision making procedures helps to ensure that:
- Projects are relevant to agreed strategic objectives
- Key stakeholders are involved at the important stages of the project
- Projects are relevant to the real problems of target groups/beneficiaries
- Project objectives are feasible and can be realistically achieved
- Project successes can be measured and verified
- Benefits generated by projects are likely to be sustainable
- Decision-making is well informed at each stage through easily understood project design and management materials.
The Project Cycle
There is no “correct” or “ideal” project cycle. Different organisations develop their own project cycle according to their own needs, requirements and operating environment.
A typical Project Cycle is shown in Figure A (over). It is interesting to compare it with the cycle in the Introduction.
Throughout the entire cycle a process of reflection is encouraged to ensure that LESSON learning is at the heart of the process, enabling adjustment to activities,
indicators of success, appreciation of risks and the focus of achievements.
FIGURE A: THE PROJECT CYCLE
[Figure: a cycle of stages (STRATEGIC OBJECTIVES, IDENTIFICATION, CLEARANCE, DESIGN, APPROVAL, IMPLEMENTATION, COMPLETION, EVALUATION) with LESSON LEARNING at its centre, annotated with the tools and documents used through the cycle: Requests, Local Objectives, Evaluation Studies, Project Concepts, Feasibility Studies, Stakeholder Analysis, Problem Analysis, Risk Analysis, Logical Framework Analysis, Project Document, Terms of Reference, Participatory Management, Reporting, Monitoring, Activity-to-Output Reviews, Output-to-Outcome Reviews and Project Completion Reports.]
APPENDIX E: SUMMARY OF THE LOGICAL FRAMEWORK

Start here (NOT with the Activities!)

Prior Steps: Use appropriate and proportionate processes before starting on the logframe itself, e.g. stakeholder, problem, objectives and options analyses.

[The logframe matrix has four columns - Objectives | Indicators / Targets | Data sources | Assumptions - and four rows: Impact, Purpose/Outcome, Outputs, Activities.]

Step 1 - Define the Impact / Goal. To what national or sector level priorities are we contributing? What long-term benefits on the lives of the poor will happen partly as a result of the project? Several interventions may share a common Goal.

Step 2 - Define the Purpose/Outcome. What immediate change do we want to achieve? Why is the intervention needed? How will others change their behaviour as a result of the use, uptake or implementation of the Outputs? How will development conditions improve on completion of the Outputs? Limit the Purpose/Outcome to one succinct statement.

Step 3 - Define the Outputs. What will be the measurable end results of the planned activities? What products or services will the project be directly responsible for, given the necessary resources?

Step 4 - Define the Activities. What actually needs to be done to achieve the Outputs? This is a summary (not a detailed workplan) showing what needs to be done to accomplish each Output.

Step 5 - Check the vertical logic back up Column 1. Apply the If/then test to check cause and effect. If the listed Activities are carried out, then will the stated Output result? Is what is planned necessary and sufficient? Are we planning to do too much or too little? And so on up Column 1.

Step 6 - Define the assumptions at each level. Do a robust risk analysis to determine the Assumptions in the project design. At each level, identify risks by asking what can stop success. For each risk, evaluate its seriousness and probability, and identify mitigatory measures. Manage the risks by adding mitigatory measures planned within the project to Column 1 (mainly as Activities, possibly as an Output). The conditions that remain are the Assumptions in Column 4. Avoid mixing Assumptions and Risks.
- Step 6a - Pre-conditions: What conditions need to be in place for the Activities to be done successfully?
- Step 6b - Activity-to-Output conditions: With the Activities completed, what conditions are needed to deliver the Outputs?
- Step 6c - Output-to-Purpose/Outcome conditions: With the Outputs delivered, what conditions are needed to achieve the Purpose?
- Step 6d - Purpose/Outcome-to-Impact conditions: With the Purpose achieved, what conditions are needed to contribute to the Impact / Goal?

Step 7 - Re-check the design logic, e.g. if the conditions are in place and we do the activities, will we deliver the Outputs? And so on up Columns 1 and 4. Move on to Step 8 overleaf.
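The vertical-logic walk in Steps 5-7 can be sketched in code. This is a minimal illustrative sketch in Python (the handbook prescribes no code, so the language, names and structure here are all our assumptions): a logframe held as a dictionary keyed by level, and a helper that generates the if/then question to ask for each adjacent pair of levels.

```python
# Illustrative sketch only: a logframe as a simple data structure, with
# the Step 5/7 if/then check made explicit. Level names follow the matrix;
# the objective and assumption strings are placeholders, not a real design.

LEVELS = ["Activities", "Outputs", "Purpose", "Impact"]  # bottom to top

logframe = {
    "Activities": {"objective": "Summary of what will be done",
                   "assumptions": ["Activity-to-Output conditions hold"]},
    "Outputs":    {"objective": "Products and services delivered",
                   "assumptions": ["Output-to-Purpose conditions hold"]},
    "Purpose":    {"objective": "Immediate change achieved",
                   "assumptions": ["Purpose-to-Impact conditions hold"]},
    "Impact":     {"objective": "Long-term benefits for the poor",
                   "assumptions": []},  # nothing above Impact to deliver
}

def vertical_logic_questions(frame):
    """Generate the if/then question for each level pair, bottom to top."""
    questions = []
    for lower, upper in zip(LEVELS, LEVELS[1:]):
        conditions = "; ".join(frame[lower]["assumptions"]) or "no conditions"
        questions.append(
            f"If the {lower} are achieved and the assumptions hold "
            f"({conditions}), then will the {upper} result?"
        )
    return questions

for q in vertical_logic_questions(logframe):
    print(q)
```

The point of the sketch is simply that the check is mechanical once the design is explicit: each level plus its assumptions should be necessary and sufficient for the level above.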
Step 8 - Define the Performance Indicators and Data Sources / Evidence. Complete both columns together.

Indicators are means; Targets are ends. Start by defining Indicators; only set Targets when there is enough baseline data and stakeholder ownership. Set Indicators and Targets in terms of Quality, Quantity and Time. Evidence is usually in the form of documents and outputs from data collection. Some reliable sources may already be available. Include data collection planned and resourced in the project as Activities in Column 1.

Step 8a - Impact indicators / targets: What will indicate the impact changes that are happening / will happen to which the project has contributed? Include changes that will happen during the lifetime of the project, even if only early signs.
Impact data sources: What evidence will be used to report on Impact changes? Who will collect it and when?

Step 8b - Purpose indicators / targets: At the end of the project, what will indicate whether the Purpose has been achieved? This is the key box when the project is evaluated on completion.
Purpose data sources: What evidence will be used to report on Purpose changes? Who will collect it and when?

Step 8c - Output indicators / targets: What will indicate whether the Outputs have been delivered? What will show whether completed Outputs are beginning to achieve the Purpose? These indicators / targets define the terms of reference for the project.
Output data sources: What evidence will be used to report on Output delivery? Who will collect it and when?

Step 8d - Activity indicators / targets: What will indicate whether the activities have been successful? What milestones could show whether successful Activities are delivering the Outputs? A summary of the project inputs and budget will also be one (but not the only) entry here.
Activity data sources: What evidence will be used to report on the completion of Activities? Who will collect it and when? A summary of the project accounts will be one (but not the only) entry here.

Do not include too much detail in the logframe. A detailed workplan and budget will follow as separate, attached documents.
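Step 8's guidance - define Indicators in terms of Quality, Quantity and Time, and set numeric Targets only once baseline data exists - can be sketched as a simple record. This Python sketch is purely illustrative: the field names, the `target_ready` rule and the example values are our assumptions, not handbook prescriptions.

```python
# Illustrative sketch: an indicator with explicit Quality, Quantity and
# Time fields, plus a flag for whether baseline data has been collected.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Indicator:
    statement: str            # what is measured
    quality: str              # the standard expected
    quantity: Optional[int]   # the numeric target, if one has been set
    time: str                 # by when
    baseline_known: bool = False

    def target_ready(self) -> bool:
        """A numeric target should only be set once a baseline exists."""
        return self.quantity is None or self.baseline_known

# Hypothetical example (the statement and numbers are invented):
ind = Indicator(
    statement="MSMEs completing an advisory engagement",
    quality="at least one full engagement completed",
    quantity=50,
    time="by end of Year 2",
    baseline_known=False,
)
print(ind.target_ready())  # False: a target was set without baseline data
```

A design review could run such a check over every indicator in the matrix before targets are locked into the project document.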
APPENDIX F: STRENGTHS AND WEAKNESSES OF THE LOGFRAME
INTRODUCTION
The logical framework (logframe) approach (LFA) is a process and tool (more accurately a 'basket of tools') for use throughout the project and programme cycle (17) to help strengthen analysis and design during formulation, implementation, evaluation and audit. It involves identifying strategic elements (activities, outputs, Purpose and impact) and their causal relationships, indicators and evidence to measure performance, and the assumptions and risks that may influence success and failure. The logframe approach includes a set of interlocking concepts to guide and structure an iterative process of analysis, design and management. In this paper we distinguish between that process and the documented product of that process, the logical framework matrix. A quality process is vital if a useful and effective product is to be generated. The approach is essentially a way of thinking, a mentality. In some contexts the matrix product is less important than the process; indeed a matrix may not be needed.

The approach has become very widely employed and influential, especially, but not exclusively, in international development work. Many development agencies, including national governments, multilateral and bilateral partners, and non-government organisations, use the logframe approach in one of its variants. In many agencies, and for a variety of reasons, it has become mandatory practice.

Aid effectiveness commitments, most recently in the 2005 Paris Declaration (18) agreed by most partners in the development community, set out clear progress indicators, including for harmonisation of procedures in shared analysis, design and results-oriented frameworks. This work is still, as the webpages say, 'under construction'. Already we are seeing much more consensus on terminology (e.g. in OECD (19) and UNDG (20) glossaries). Similarly there is more uniformity amongst agencies in the format of logical frameworks than there was a decade ago. Complete uniformity is unlikely to be achievable or indeed desirable; frameworks are needed for different outcomes, so a general design framework will differ from one specifically intended to show detailed results monitoring arrangements. The important thing is that the frameworks help, not hinder, communication, and that users can see how frameworks for different outcomes link one to another within an overall results-based management system.

The logframe approach, proponents argue, is a simple process that helps:
- organise thinking;
- relate activities and investment to expected results;
- set out performance indicators;
- allocate responsibilities;
- communicate information on the project concisely and unambiguously.

(17) The LFA can be applied at different levels with small projects, a higher-level programme or indeed a whole organisation. In this paper, the term 'project' is intended to include all levels.
(18) http://www.oecd.org/dataoecd/11/41/34428351.pdf
(19) http://www.oecd.org/dataoecd/29/21/2754804.pdf
There are, however, limitations to the logframe approach. In the current debate, it is not easy to separate weaknesses that may be inherent in the tool itself from the poor application of that tool. Some feel it is essentially a good tool, but one that is often badly applied. The 'good servant, bad master' theme is deepened by the frequent use of the logframe as a rigid and inflexible tool for central, hierarchical control. Some opponents go further and reject the approach itself on the grounds that it is reductionist and simplistic, that it exacerbates power imbalances between funder, intermediary and beneficiary, and that it is 'western-centric'.

Perhaps the most valid, but not altogether satisfactory, justification for widening the use of the LFA is that 'something is better than nothing'. An approach has to be used, ultimately to report progress against expenditure, and if there is widespread consensus on one approach, all the better. Some who criticise the LFA as a planning tool are actually comparing it with not planning. Most of us would rather not plan; but not planning rarely results in effective and efficient operation.

Many lessons have been learnt over the last twenty years as regards LFA best practice; examples of enlightened and rewarding application in a variety of contexts are now common. The LFA will only be beneficial if it is used in a thoughtful way such that it influences project identification and design from the start, rather than only being added at the end.

The logframe matrix itself should be a product and summary of thorough and systematic situation analysis and cannot be a substitute for this. As such it must be embedded in a wider process; before work on the logframe matrix starts, there needs to be analysis of who should be involved and how. This in turn will lead to more effective appraisal of the context (be it social, technical, environmental, economic, institutional, gender etc.), of the problem to be addressed, of the vision sought, and strategic analysis of the alternative ways forward.
STRENGTHS OF THE LOGICAL FRAMEWORK APPROACH
The major strengths of the logframe approach are:
It brings together in one place a statement of all key elements of the project or programme.
Having all the key components of a project or programme presented in a systematic, concise and coherent way helps you clarify and demonstrate the logic of how the initiative will work. This can be particularly helpful when communicating between partners and when there is a change of personnel.
It fosters good situation analysis and project design that responds to real problems and real needs.
It systematizes thinking. It can help ensure that the fundamental questions are asked and that cause and effect relationships are identified. Problems are analysed in a systematic way and logical sequence. It guides you in identifying the inter-related key elements that constitute a well-planned project. It highlights linkages between project elements and important external factors.

(20) http://www.undg.org/documents/2485-Results-Based_Management_Terminology_-_Final_version.doc
It encourages robust risk management.
It systematically requires risks to be identified and assessed and mitigatory measures to be factored into the design. It informs the ultimate decision to approve the plan for implementation in the light of remaining assumptions.
It anticipates implementation.
The logframe approach helps in the setting up of activity and input schedules with clear anticipated outcomes. Likewise the use of logframes can help ensure continuity of approach if any original project staff move or are replaced.
It sets up a framework for monitoring and evaluation where anticipated and actual results can be compared.
By having objectives and indicators of success clearly stated before the project starts, the approach helps you set up a framework for monitoring and evaluation. It is notoriously difficult to evaluate projects retrospectively if the original objectives are not clearly stated. It helps to reveal where baseline information is lacking and what needs to be done to rectify this. The approach can help clarify the relationships that underlie judgements about the likely efficiency and effectiveness of projects; likewise it can help identify the main factors related to the success of the project.
It is easy to learn and use.
Effective training in the basics of the logframe approach can be given in a few days. Opportunities are then needed to apply and consolidate learning with follow-up support through mentoring, networking and further training. A key group of staff can become an effective resource team in a short period of time.
It does not add time or effort to project design and management, but reduces it.
Like many other design and management tools the logframe approach has to be learnt before it can be effectively used. Once learnt however, it will save time. Of course, if it is being compared with not doing essential analysis and design work, then it takes longer; but ‘not doing’ is not an option.
It enhances communication.
The approach facilitates common terminology, understanding, purpose and ownership within and between partners. Several logframes can interrelate; they can nest together as a portfolio of initiatives working towards a common vision. In a powerful way this can help individuals and teams understand the whole of which they are a part; it helps them to see the bigger picture.
It can be used as a basis for a reporting and overall performance assessment system.
The monitoring and evaluation elements of the logframe can be used to develop a format for reporting clearly and succinctly against objectives and indicators and for success scoring. Scores in turn can be collated across a portfolio to give an assessment of overall performance and organisational and developmental effectiveness.
WEAKNESSES OF THE LOGICAL FRAMEWORK APPROACH

Some significant limitations of the LF approach are:

It is not a substitute for other technical, economic, social and environmental analyses. It cannot replace the use of professionally qualified and experienced staff.
It can help project design, implementation and evaluation, but clearly does not do away with the need for other project tools especially those related to technical, economic, social and environmental analyses. Likewise the approach does not replace the need for professional expertise and experience and judgement.
It can be used as a means of rigid, top-down hierarchical control.
Rigidity in project administration and management can sometimes arise when logframe objectives, targets and external factors specified during design are used as a straitjacket. The LF matrix should not be set in concrete, never to be altered to fit changing circumstances. There needs to be the expectation that key elements will be re-evaluated and adjusted through regular project reviews.
The logframe process might be carried out mechanistically as a bureaucratic box-filling exercise.
This is a common abuse of the tool. The individual at their desk or in their hotel room mechanistically filling in the matrix ‘because that’s what the procedures say’ is the antithesis of the approach. In its extreme the approach becomes a fetish rather than an aid.
The process requires strong facilitation skills to ensure real participation by appropriate stakeholders.
Undertaking the logframe process with the active participation of appropriate stakeholders in decision-making is not easy. Facilitating, for example, illiterate primary stakeholders effectively through the process requires considerable skill.
The logframe is simplistic and reductionist.
It over-relies conceptually on linear cause and effect chains. Life is not like that. As a result, the logframe can miss out essential details and nuances.
The whole language and culture of the logframe can be alien.
The jargon can be intimidating. In some cultures (organisational and national) the logframe can be very alien. Concepts and terminology do not always easily translate into other cultures and languages. The objectives-driven nature of the logframe does not always transfer well across cultural boundaries. Unless precautions are taken the LFA can discriminate and exclude.
The logframe approach is western-centric.
This continues to be a hotly debated issue. Some opponents see the approach as a manifestation of western hegemony and globalisation.
IN CONCLUSION
The logframe is not a panacea. However, used sensitively, it is a powerful approach that can result in greater effectiveness, efficiency and inclusion. Developing a logframe with real participation can have a very positive impact. Fresh thinking is needed, customised to each context, perhaps in some contexts to the extent of not using the matrix itself and just working with the questions therein. The LFA's wide adoption suggests that, on balance, its strengths outweigh its limitations; some disagree. Users need, however, to be well aware of the weaknesses and potential abuses and misuses of the approach. The LFA must be used flexibly, with eyes open to its limitations and pitfalls.
APPENDIX G: CATEGORIES OF OUTPUTS

Human Capacity
- Specific individuals or groups able to do specific tasks: to identify needs; to research; to develop policy

Systems
- For Administration
- For Management
- For Handling Information
- Procedures and guidelines
- For Research
- For Monitoring and Evaluation
- For Promotion and dissemination
- For Procurement and Contracting
- For Reporting
- For Human Resource Management

Knowledge and Information
- Lessons learned
- Product and Process
- Policy initiatives

Infrastructure
- Clinics, classrooms, computers etc.

Materials
- Research publications
- Extension materials
- Grey literature
- Training materials / curricula
- Broadcasts
- Websites
- Databases
- Documented procedures
- Product and Process

Awareness of various audiences
- Users
- Policy makers
- Other researchers in the region and internationally
- Donor community
- Secondary stakeholders
APPENDIX H: ASSESSING PROJECT PERFORMANCE
Why assess project performance?

We need to demonstrate project performance so that we can more effectively manage the outputs and outcomes of what we do and direct our effort where it will have the greatest impact. Project performance assessment traditionally involved monitoring and evaluation with a focus on assessing inputs and implementation processes. The trend today is to broaden assessment to include many elements that together contribute to a particular development outcome and impact. So, depending on the context, assessment may be needed of, for example, outputs, partnerships, coordination, brokering, policy advice, advocacy and dialogue.
[Figure: performance assessment framework. Why? - learning, accountability, decision making. How? - evaluative exercises: monitoring; review, evaluation and impact assessment. Of what? - areas of focus: projects and programmes, strategies and policies, partnerships, SDGs. Underpinned by capacity building for performance.]
The main reasons for performance assessment are to:
- Enhance organisational and development learning: to help our understanding of why particular activities have been more or less successful, in order to improve performance
- Be accountable to clients, beneficiaries, donors and taxpayers for the use of resources; and thereby to
- Ensure informed decision-making.

An underpinning rationale is capacity building for improving performance.
Monitoring, Review, Evaluation and Impact Assessment

The use of these terms varies between organisations. Be aware that when talking with others, they may use different words, or the same words may mean different things. A common interpretation is:

Monitoring: the systematic collection and analysis, on a regular basis, of data for checking performance. This is usually done internally to assess how inputs are being used, whether and how well activities are being completed, and whether outputs are being delivered as planned. Monitoring focuses in particular on efficiency, the use of resources. Key data sources for monitoring will typically be internal documents such as monthly/quarterly reports, work and travel logs, training records, minutes of meetings etc.

Review: an assessment of performance periodically or on an ad hoc basis, perhaps annually or at the end of a phase. It usually involves insiders working with outsiders; implementers with administrators and other stakeholders. Review focuses in particular on effectiveness, relevance and immediate impact. It assesses whether the activities have delivered the outputs planned and the purposes of those outputs; in other words, whether there is indication that the outputs are contributing to the purpose of the intervention. Early reviews are sometimes called Activity-to-Output Reviews, later ones Output-to-Purpose Reviews. 'Review' is sometimes used synonymously with 'evaluation'; review is a form of evaluation. Key data sources for review will typically be both internal and external documents, such as half-yearly or annual reports, a report from a stakeholder participatory review event, data collection documents, consultants' reports etc.

Evaluation: in many organisations a general term used to include review. Other organisations use it in the more specific sense of a systematic and comprehensive assessment of an on-going or completed initiative. Evaluations are usually carried out by outsiders (to enhance objective accountability) but may involve insiders also (to enhance lesson learning). Evaluations focus on the relevance, effectiveness, efficiency, impact and sustainability of a project or programme. Evaluations are often carried out to assess and synthesise several initiatives together on a thematic, sector or programme basis. Key data sources for evaluation will be both internal and external. They may include review reports, commissioned study reports, national and international statistics, impact assessment reports etc.

Impact assessment: a form of evaluation that tries to differentiate changes that can be attributed to a project/programme from other external factors that may have contributed. Those changes may be intended or unintended. Impact assessment tries to assess what has happened as a result of the intervention and what may have happened without it.
It is clear, then, that monitoring, review and evaluation form a continuum with no clear boundaries. With that caveat, the following comparison offers some general differences.
When is it done?
- Monitoring: continuous, throughout the life of an initiative
- Review: occasional, mid-way or at the end of a phase or initiative
- Evaluation: infrequent, during, at the end or beyond the end of an initiative

Why is it done?
- Monitoring: to assess whether an initiative is on track and make adjustments
- Review: to reflect on and explain performance; to learn and share lessons; to hold managers accountable
- Evaluation: to reflect on and explain performance; to learn and share lessons, often at a programme, thematic or sector, rather than project, level; to hold managers accountable; to assess impact in relation to external factors and contributions and attributions to change

What is measured?
- Monitoring: mainly efficiency, the processes of the work - inputs, activities, outputs, conditions and assumptions
- Review: the effectiveness, relevance and immediate impact of the initiative and the achievement of Purpose
- Evaluation: the efficiency, effectiveness, relevance, impact and sustainability of the work and the achievement of objectives; it examines with and without scenarios

Who is involved?
- Monitoring: generally only insiders
- Review: may involve outsiders and insiders; generally initiated by the project/programme team
- Evaluation: usually involves outsiders but perhaps also insiders; often initiated by an Evaluation Office in the same agency or by another agency altogether

What sources of information are used?
- Monitoring: typically internal documents such as monthly/quarterly reports, work and travel logs, training records, minutes of meetings etc.
- Review: both internal and external documents, such as half-yearly or annual reports, a report from a stakeholder participatory review event, data collection documents, consultants' reports etc.
- Evaluation: both internal and external, including review reports, consultants' reports, national and international statistics, impact assessment reports etc.

Who uses the results?
- Monitoring: managers and staff are the main users of the information gathered
- Review: many people use the information, e.g. managers, staff, donors, beneficiaries
- Evaluation: many people use the information, e.g. managers, staff, donors, beneficiaries and other audiences

How are the results used?
- Monitoring: decision-making results in minor corrective changes
- Review: decision-making may result in changes in policies, strategy and future work
- Evaluation: decision-making may result in major changes in policies, strategy and future work
M&E criteria

It is crucial to plan an M&E system from the outset, e.g. when doing an organisational strategic plan or when planning an initiative. A system is needed that will examine progress against agreed performance indicators and that will address core criteria and questions (based on the DAC criteria):

- Relevance: Does the organisation or initiative address the needs? Is it consistent with the policies and priorities of the major stakeholders, especially, where relevant, of the client country? To what extent is it compatible with other efforts? Does it complement, duplicate or compete?
- Efficiency: Are we using the available resources wisely and well? How do outputs achieved relate to inputs used?
- Effectiveness: Are the desired objectives being achieved at Outcome/Purpose and Impact/Goal level? Does it add value to what others are doing? To what extent are partners maximising their comparative advantage?
- Impact: What changes, positive and negative, have occurred, and are these attributable to the initiative?
- Sustainability: Will the Outcome and impacts be sustained after external support has ended? Will the activities, outputs, structures and processes established be sustained?
Performance Scoring

Some organisations use scoring systems as an integral part of the monitoring and review process to rate aspects of performance; for example, the likelihood that the Outputs and Outcome of the project will succeed (or have succeeded, depending on when the scoring is done), or the level of risk that threatens the achievement of success. Annual scoring can provide important data for accountability, learning and decision-making. With care it may be possible for scores to be aggregated across a programme, sector or office to provide an overall picture of success and value for money. The quality of scoring is clearly a key issue; bad data generates bad conclusions. The system has to be applied consistently and robustly, involving relevant stakeholders and partners.
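The aggregation idea above can be sketched in a few lines. This Python sketch is illustrative only: the 1-5 scale, the budget weighting and the example figures are our assumptions; agencies use different scales and weighting rules.

```python
# Hedged sketch: aggregating annual performance scores across a portfolio.
# Assumes a 1-5 scale (1 = best) and weighting by project budget, both of
# which are illustrative choices, not a prescribed method.

def portfolio_score(projects):
    """Budget-weighted average of project scores."""
    total_budget = sum(p["budget"] for p in projects)
    return sum(p["score"] * p["budget"] for p in projects) / total_budget

# Invented example data:
portfolio = [
    {"name": "Project A", "score": 2, "budget": 400_000},
    {"name": "Project B", "score": 3, "budget": 100_000},
]
print(portfolio_score(portfolio))  # 2.2
```

Weighting by budget is one way to keep a small, poorly scoring project from dominating the portfolio picture; an unweighted mean, or weighting by beneficiary numbers, would be equally defensible depending on the reporting purpose.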
Preparing for review or evaluation

This will probably have been set out in the project document, logframe and workplan. Even so, these exercises often take implementers by surprise. Some steps:

Clarify Scope and Timing
Start planning typically 6-9 months before the event, especially if it is to involve independent evaluators or senior officials; their diaries are likely to be full.
Involve Partners and Stakeholders
This may be straightforward, or it may be a delicate operation. Present the exercise positively, emphasising the opportunity to work together in assessing progress, to support joint learning, to account for resources used and to improve overall effort.
But recognise fears and discuss them openly. Seek an organisational culture where the discovery of mistakes and failures is accepted as an opportunity to improve rather than to blame and to condemn.
Agree the Terms of Reference
Good ToRs are critical. Typically these will include:
i. Objectives: Why the evaluation is being undertaken. A brief description of what is to be evaluated; project status; key partners and stakeholders; changes in context; previous evaluations.
ii. Scope: The issues, areas and timeframe the evaluation will cover; some key evaluation questions.
iii. Implementation: Composition and areas of expertise of the team; leadership and management; methodology and approach; field visits; phases and scheduling.
iv. Products: Findings, recommendations, lessons, performance scoring; local discussion and feedback; debriefing; report drafts and editing process; the final report - content, scope, length, language, deadlines, dissemination.
v. Background: More detailed information about the context; reference documents etc.

Plan and Implement any special surveys that may be needed
Fresh primary data may be needed, or an analysis of documentation.

Plan for any special requirements
For example, translation of key documents.
Quality Standards for Evaluation
- Utility: meeting the information needs of the intended users, and therefore relevant and timely
- Accuracy: using valid, reliable and relevant information
- Independence: impartial, objective, and independent from the process concerned with policy-making, and the delivery and management of development assistance
- Credibility: depends on the skill and experience of the evaluators, and on the transparency and inclusiveness of the evaluation process (credible evaluations also require accuracy and independence)
- Propriety: conducted legally, ethically, and with due regard for the welfare of those involved in the evaluation, as well as those affected by its results.
Where to go for further information
- World Bank Evaluation: http://www.worldbank.org/evaluation/
- FAO: http://www.fao.org/pbe/pbee/en/224/index.html and http://www.fao.org/docs/eims/upload/160705/auto-evaluation_guide.pdf
- IFAD: http://www.ifad.org/evaluation/guide/
- EU Guidelines: http://europa.eu.int/comm/europeaid/evaluation/methods/guidelines_en.pdf
- OECD and DAC: http://www.oecd.org/pages/0,2966,en_35038640_35039563_1_1_1_1_1,00.html
- UNDP Evaluation Office: http://www.undp.org/eo/
- UN Evaluation Forum: http://www.uneval.org/
- International Development Evaluation Association: http://www.ideas-int.org/
APPENDIX I: PORTFOLIOS OF LOGFRAMES

[Diagram: INSTITUTION (Mandate, Mission, Objectives) contains PROGRAMMES (Themes, Sectors, Regions), which contain PROJECTS.]

The logframe approach can help to communicate, organise, manage and focus a portfolio:
- To improve horizontal and vertical communication
- To standardise planning and design
- To monitor and evaluate performance at all levels
- To provide a logical focus.

For the individual involved in such an organisation, being able to 'see the whole' can be important in motivation and ownership.
APPENDIX J: NESTING OF LOGFRAMES

HIV/AIDS in Sub-Saharan Africa

Note how each level's Purpose reappears, in nearly identical wording, as an Output of the level above it; this is what links the logframes together.

Department or Country Assistance Plan level
- Impact / Goal: Poverty reduced in Sub-Saharan Africa.
- Purpose: Progress towards Millennium Development Goals in 16 key countries.
- Outputs:
  1. Government-led health programmes within poverty reduction strategy developed and implemented focussing on MDGs.
  2. Government-led education programmes in Poverty Reduction Strategy developed and implemented focussing on MDGs.
  3. Better economic and political governance.
  4. Sustained improvement in climate for foreign investment, local private sector development and market access for the poor.

Sector level, e.g. health sector in Nkonia
- Impact / Goal: Progress towards health MDGs in Nkonia.
- Purpose: Government-led health programme within Poverty Reduction Strategy developed and implemented focussing on MDGs.
- Outputs:
  1. National Strategic Health Policy developed and implemented.
  2. A model for Family Medicine.
  3. Integration of Ministry of Health with Social Security systems.
  4. Research, monitoring and impact assessment systems agreed and in place.
  5. Skills developed in contracting private services.

Programme level, e.g. sexual and reproductive health in Ekim State
- Impact / Goal: State health programme implemented successfully, contributing to sexual and reproductive health MDGs.
- Purpose: Sexual and Reproductive Health policy developed and implemented in Ekim State.
- Outputs:
  1. Improved enabling and policy environment.
  2. Capacity of partner institutions developed.
  3. Youth-friendly services accessible to female and male adolescents.
  4. Schools able to deliver Life Planning Education effectively.
  5. etc.

Project level, e.g. Life Planning Education in Marivi Districts
- Impact / Goal: Improved sexual and reproductive health status in Marivi through successful implementation of State Sexual and Reproductive Health policy.
- Purpose: Schools effectively delivering Life Planning Education.
- Outputs:
  1. Partner consensus and plan for way forward.
  2. Improved methods of control identified.
  3. Schools with resourced Action Plans developed with community.
  4. Cadre of teacher trainers in place.
  5. Materials and curriculum developed.
  6. Core of teachers trained in each school.
  7. etc.
Weed research in semi-arid areas

Department level
- Impact / Goal: Poverty reduced. Economic growth. National environmental problems mitigated.
- Purpose: Productive capacity of crop sector enhanced on an economically and environmentally sustainable basis.
- Outputs:
  1. Research outputs disseminated and implemented.
  2. Policy development strategy successfully implemented.
  3. Successful operations strategy in place.

Sector strategy level, e.g. research strategy
- Impact / Goal: Productive capacity of smallholder cropping sector enhanced on an economically and environmentally sustainable basis.
- Purpose: Research outputs disseminated and implemented.
- Outputs:
  1. Key researchable constraints removed.
  2. Research programmes successfully operational.

Programme level, e.g. semi-arid systems programme
- Impact / Goal: Research outputs relating to semi-arid systems disseminated and implemented.
- Purpose: Research programmes successfully operational.
- Outputs:
  1. Impact of weeds on the crop production cycle minimised.
  2. Impact of pests on production of sorghum and millet based systems minimised.
  3. Impact of pests on cotton production minimised.

Project level, e.g. Control of Striga Weed Project
- Impact / Goal: Research programmes relating to semi-arid systems successfully operational.
- Purpose: Impact of Striga on the crop production cycle minimised.
- Outputs:
  1. Dynamics of sorghum/Striga communities better understood and incorporated in crop management strategies.
  2. Improved methods of control developed and promoted.
  3. etc.
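The nesting rule behind these examples (a lower level's Purpose should reappear among the Outputs of the level above) can even be checked mechanically when logframes are held electronically. A minimal sketch, assuming a simple list-of-strings representation of our own devising:

```python
# Illustrative check of logframe nesting: a lower level's Purpose should
# match (here: by normalised text) an Output of the level above it.
def normalise(text: str) -> str:
    """Lower-case and collapse whitespace so formatting differences don't matter."""
    return " ".join(text.lower().split())

def is_nested(parent_outputs: list[str], child_purpose: str) -> bool:
    target = normalise(child_purpose)
    return any(normalise(out) == target for out in parent_outputs)

programme_outputs = [
    "Impact of weeds on the crop production cycle minimised",
    "Impact of pests on production of sorghum and millet based systems minimised",
    "Impact of pests on cotton production minimised",
]

# In the handbook the project Purpose narrows "weeds" to "Striga", so an
# exact-text check like this is only a first-pass consistency test; real
# nesting reviews still need human judgment.
print(is_nested(programme_outputs,
                "Impact of weeds on the crop production cycle minimised"))
```

A failed check flags a portfolio whose objective hierarchy has drifted apart, which is exactly the situation the nesting diagrams are meant to prevent.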
APPENDIX K: THE LOGFRAME AS A COMMUNICATION TOOL
The logical framework is an important communication tool. It can help us to explain to our project partners and other stakeholders what we are doing and why. It can help us prepare reports for sponsors and other key stakeholders. This can be achieved by taking:
A step-by-step presentation approach (21)

1. Impact / Goal: "The overall goal is to ............."
2. Purpose: "In order to contribute to this goal we in this project will ............"
3. Outputs: "We will achieve this objective by taking direct responsibility for ............"
4. Activities: "Let me describe our strategy in more detail. We believe that if we .............."
5. Activity level Assumptions: "and if .........."
6. Output level Indicators: "we will achieve our targets of ............."
7. Purpose Indicators: "In addition to reaching these targets, several other things must happen if we are to achieve our major objective of ............"
8. Output level Assumptions: "These other factors, outside our direct control, include ........."
9. Purpose level Assumptions: "We believe that if we can achieve our major objective, we will contribute to our overall goal. This contribution is, however, affected by factors outside of this project. These include ......... All of these factors taken together will be sufficient to realise this goal. The strategy we propose is an important and cost-effective step towards that end."
10. Evidence: "We propose that our performance be monitored and assessed in the following way ..........."

(21) Adapted from the original Team Up Project Checklist.
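Because the walkthrough follows the logframe so mechanically, its opening lines can even be generated from a logframe held as data. A sketch, assuming a simple dictionary layout of our own invention (field names and the example content are illustrative, not a standard):

```python
# Illustrative: generate the opening of the step-by-step presentation
# from a logframe dictionary. The field names are our own convention.
def presentation_opening(logframe: dict) -> list[str]:
    outputs = ", ".join(logframe["outputs"])
    return [
        f'The overall goal is to {logframe["goal"]}.',
        f'In order to contribute to this goal we in this project will {logframe["purpose"]}.',
        f'We will achieve this objective by taking direct responsibility for {outputs}.',
    ]

# Example content loosely based on the swing project in Appendix M.
swing = {
    "goal": "build an integrated community with happy kids and adults",
    "purpose": "ensure kids have fun, are busy and safe",
    "outputs": ["community capacity to maintain the swing",
                "a safe, well-built swing"],
}
for line in presentation_opening(swing):
    print(line)
```

The remaining steps (assumptions, indicators, evidence) would follow the same pattern, each pulling the next cell of the matrix into a sentence.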
APPENDIX L: REPORTING USING THE LOGFRAME; AN EXAMPLE

The next four tables give an example of a typical reporting format based on the logframe, at different objective levels and at different times during the project cycle. The first two columns of each table are cut and pasted from the logframe. Development organisations have committed themselves to move towards uniform reporting procedures and formats; until that happens, formats will vary.

PROGRESS/MONITORING REPORT
COUNTRY: ............ PROJECT TITLE: ............ PERIOD COVERED: ............
CODE: ............ DATE PREPARED: ............ PREPARED BY: ............

Columns: PROJECT STRUCTURE | INDICATORS OF ACHIEVEMENT | PROGRESS | COMMENTS AND RECOMMENDATIONS | RATING*
- PROJECT STRUCTURE: ACTIVITIES (insert activities and inputs from the logical framework).
- INDICATORS OF ACHIEVEMENT: insert indicators from the logical framework.
- PROGRESS: provide a report against each activity and input.
- COMMENTS AND RECOMMENDATIONS: provide comments against each activity and input, plus recommendations where appropriate. Comment on the extent to which the assumptions are being met.

* Rating key:
1 - Likely to be completely achieved
2 - Likely to be largely achieved
3 - Likely to be partially achieved
4 - Only likely to be achieved to a very limited extent
5 - Unlikely to be achieved
x - Too early to judge the extent of achievement
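Where progress reports are compiled or aggregated electronically, the rating key above maps naturally onto a simple lookup table. A sketch, assuming the codes shown (the function name is our own):

```python
# Illustrative lookup for the progress-report rating key used in Appendix L.
RATING_KEY = {
    "1": "Likely to be completely achieved",
    "2": "Likely to be largely achieved",
    "3": "Likely to be partially achieved",
    "4": "Only likely to be achieved to a very limited extent",
    "5": "Unlikely to be achieved",
    "x": "Too early to judge the extent of achievement",
}

def describe_rating(code: str) -> str:
    """Return the narrative meaning of a rating code, tolerating case/whitespace."""
    try:
        return RATING_KEY[code.strip().lower()]
    except KeyError:
        raise ValueError(f"Unknown rating code: {code!r}")

print(describe_rating("2"))  # Likely to be largely achieved
```

Rejecting unknown codes at entry (rather than silently recording them) keeps aggregated ratings comparable across reports.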
PROGRESS/MONITORING REPORT
COUNTRY: ............ PROJECT TITLE: ............ PERIOD COVERED: ............
CODE: ............ DATE PREPARED: ............ PREPARED BY: ............

Columns: PROJECT STRUCTURE | INDICATORS OF ACHIEVEMENT | PROGRESS | COMMENTS AND RECOMMENDATIONS | RATING*
- PROJECT STRUCTURE: OUTPUTS (insert outputs from the logical framework; insert also details of any unexpected outputs).
- INDICATORS OF ACHIEVEMENT: insert indicators from the logical framework.
- PROGRESS: provide a report against each output indicator.
- COMMENTS AND RECOMMENDATIONS: provide comments against each output indicator, plus recommendations where appropriate. Comment on the extent to which the assumptions are being met.
MONITORING/OUTPUT TO PURPOSE REVIEW REPORT
COUNTRY: ............ PROJECT TITLE: ............ PERIOD COVERED: ............
CODE: ............ DATE PREPARED: ............ PREPARED BY: ............

Columns: PROJECT STRUCTURE | INDICATORS OF ACHIEVEMENT | PROGRESS | COMMENTS AND RECOMMENDATIONS | RATING*
- PROJECT STRUCTURE: OUTCOME (insert the Outcome from the logical framework).
- INDICATORS OF ACHIEVEMENT: insert indicators from the logical framework.
- PROGRESS: provide a report against each Outcome indicator.
- COMMENTS AND RECOMMENDATIONS: provide comments against each indicator, plus recommendations where appropriate. Comment on the extent to which the assumptions are being met.
OUTCOME TO IMPACT / GOAL REVIEW REPORT
COUNTRY: ............ PROJECT TITLE: ............ PERIOD COVERED: ............
CODE: ............ DATE PREPARED: ............ PREPARED BY: ............

Columns: PROJECT STRUCTURE | INDICATORS OF ACHIEVEMENT | PROGRESS | COMMENTS AND RECOMMENDATIONS | RATING*
- PROJECT STRUCTURE: IMPACT / GOAL (insert the impact from the logical framework).
- INDICATORS OF ACHIEVEMENT: insert indicators from the logical framework.
- PROGRESS: provide a report against each impact indicator.
- COMMENTS AND RECOMMENDATIONS: provide comments against each indicator, plus recommendations where appropriate. Comment on the extent to which the assumptions are being met.
APPENDIX M: AN EXAMPLE OF A SIMPLE LOGFRAME

Project Title: Community Recreation Facilities for Kids - Building a Swing!

Impact / Goal: Integrated community with happy kids and adults
- Indicators / Targets: Number of stressed families decreases by 50%. Other communities adopt similar ideas.
- Data sources: Reports from village clinic and counsellors. Newspaper articles.
- Assumptions: Birth rate continues.

Purpose / Outcome: Kids have fun, are busy and safe
- Indicators / Targets: 60% of local young kids use the swing safely at least once a month by end of year 2. Kids' opinion on life in the village improved by end of year 2.
- Data sources: User survey. Participatory evaluation with the kids.
- Assumptions: Safe recreation leads to happiness and community integration. Facilities don't create conflict.

Output 1: Capacity within community to manage the building and long-term maintenance of the swing
- Indicators / Targets: 6-monthly meetings after completion with more than 5 members. Swing maintained and in use over a minimum 5-year period.
- Data sources: Minutes of meetings. Maintenance and annual safety inspection records.
- Assumptions: People see the benefit of it. Easy maintenance.

Output 2: A safe, well-built swing
- Indicators / Targets: Swing completed and in use in 12 months. Minimal number of accidents. Few repairs needed.
- Data sources: Safety certificate on completion. Accident records (bruises, minor cuts and hospitalisation). Maintenance log.
- Assumptions: No vandalism. Kids like and use it. Kids don't fight.

Activities:
1.1 Establish community committee and undertake lobbying required. Indicators: planning team set up by x; committee chosen by x; monthly meetings during planning and building phase with more than 8 members. Data sources: minutes of meetings; attendance records. Assumptions: enthusiasm and participation maintained; football club will give up a small amount of land for the swing.
1.2 Set budget. Indicators: budget. Data sources: accounts. Assumptions: low inflation.
1.3 Raise funds. Indicators: enough money raised by x. Data sources: income/receipts. Assumptions: sufficient funds raised.
1.4 Set up systems for maintenance. Indicators: rota agreed amongst parents to maintain swing by x. Data sources: quarterly rota pinned on library notice board.
2.1 Consult kids. Indicators: ideas generated and incorporated in design. Data sources: plan discussed with designers.
2.2 Design it. Indicators: designed by x. Data sources: design in hand.
2.3 Get planning permission. Indicators: planning permission by x. Data sources: permit in hand. Assumptions: permission given.
2.4 Commission builder. Indicators: tenders issued by x; contract awarded by x. Data sources: documentation. Assumptions: building firm reliable and capable.
2.5 Build it. Indicators: completion by x. Data sources: documentation.
2.6 Test it. Indicators: tested by builders by x. Data sources: verbal report.
2.7 Safety inspection on completion. Indicators: inspection by x. Data sources: certificate in hand.
2.8 Carry out user survey and participatory evaluation with the kids. Indicators: survey carried out by x. Data sources: findings displayed in public library.
APPENDIX N: EXAMPLES OF LOGFRAMES

Jamaica All Age Schools Project (JAASP) Logical Framework (22)

Goal: Improved lifetime opportunities for poorer rural children.
- Indicators: Increased number of children from poor communities finding employment or accessing higher levels of education.
- Assumptions: Growth and/or stability of the economy.

Purpose: Better education for children from poor, rural communities.
- Indicators (by the end of project): At least 60% of the students reading at or above grade 4 level. 30% increase in scores attained in core subjects at Grade 6 and 9 levels. School attendance at 90%. At least 98% of the students completing 9 years of schooling. 10% increase in pupils progressing to secondary school.
- Verification: National Education Statistics. Student Assessment Unit data.
- Assumptions: Government of Jamaica remains committed to poverty reduction through investment in education. Jamaican economy provides employment opportunities and other initiatives. Access to upper secondary places and skills training available.

(22) Dearden, P.N. (2000) Report on Project Cycle Management and Logframe Review Workshop, Jamaica All Age School Project (JAASP), Jamaica, October 2000. Department for International Development (DFID) and University of Wolverhampton, UK.
Output 1: Improved community/school participation.
- Indicators: 80% of school boards and Parent Teacher Associations (PTAs) operating effectively by year 2. 50% reduction in school vandalism by year 3. 85% attendance of boys and girls by year 3. Active participation of the community in supporting the curriculum and infrastructure by year 3. One community school meeting per term. PTAs in 48 schools conduct programmes including some of the following: adult education; nutrition and welfare; homework clubs; literacy support; extra-curricular activities; schools maintenance; skills sharing. School development plan includes community involvement section.
- Verification: Minutes of board meetings. Census documents. Inspection reports. Logbooks and attendance records. Community profile conducted. Principals' reports. Reports from Education Officers and Regional Education Officers (REOs). Dialogue with school community. Community feedback.
- Assumptions: Interest of community members. Parents have resources to contribute. Principals and staff will be receptive/committed to full community participation. Co-operation from other agencies.

Output 2: Improved school management.
- Indicators: School Development Plan prepared in all project schools by year 1. School Development Plans (SDPs) implemented effectively in all schools by year 2. Comprehensive and effective SDPs developed and approved by stakeholders in all project schools by March 2001. All principals trained by Ministry of Education and Culture (MOEC) in staff instructional management by year 2. 80% of principals use improved management techniques by end of year 3. Principals organise/conduct one cluster workshop per term for staff training.
- Verification: Plans submitted to project manager. REOs' records and reports. Plans reviewed by regional officers and Technical Assistants. TEO monitoring reports. Education officer reports. Senior Education Officer records. Workshop reports. Participants' and facilitators' reports.
- Assumptions: Training and support is sufficient to enable schools to formulate and implement plans, and monitoring systems are effective. Principals and other school personnel respond positively. Principals and other school personnel do not feel threatened by change/full community involvement in school development planning.
Output 3: Improved quality of teaching and learning, with a special emphasis on literacy and numeracy.
- Indicators: All teachers trained in the new curriculum by year 3. Classrooms in project schools are more learner-centred (interactive/activity-based/participatory) by year 3. Through support from in-school, cluster-based resource persons, teachers demonstrate increased confidence. All teachers demonstrating observable mastery of the methodologies of the Revised Primary Curriculum. All teachers using interactive teaching with a focus on literacy by year 2. All Grade 1 teachers trained to ensure smooth transition from Basic Schools by year 3. Teachers employ appropriate strategies to meet the needs of children with exceptionalities by year 2. Teachers trained and demonstrating ability to identify students with exceptionalities by project mid-term. At least 30% increase in attainment levels in Grade 1 readiness, Grade 3 diagnostic, Grade 4 literacy, and Grade 6 and Grade 9 Junior High exams by year 3 of project.
- Verification: Programme documentation. Course registers and records. Baseline and monitoring reports. Education officer reports. Student perceptions. Panel reports. Stakeholder perceptions. Perceptions of education officers, principals and teachers. Workshop reports and evaluations. Self-evaluations. Assessment records.
- Assumptions: Availability, capacity and willingness of teachers to participate in training. Teachers will implement new strategies. Central and regional monitoring and support systems are in place. Adequate and suitable infrastructure and public services in place to support learning. Parental support. Methodologies/curriculum appropriate to the needs/level of learners. Attendance level sufficient to take advantage of improved teaching and learning environment. Students with exceptionalities are recognised and addressed. Students have sufficient nutritional levels to accommodate learning.
Output 4: Regional and national systems strengthened to provide training and support for improved teaching and learning.
- Indicators: Systematic Regional Education Officers' plans for INSET provision to remote schools effectively implemented by year 2. Effective learning support in schools by year 2. Effective guidance and counselling in every project school by year 2. 50% of teachers of Grades 7-9 trained in ROSE curriculum and methodology by year 3. Systematically organised school-based staff development in all schools by year 2. In-service training strategy at regional and national levels in place by end of year 1, implemented in pilot schools in year 2. Officers able to effectively facilitate teacher development by project mid-term. Appropriate material and equipment in use in resource centres by year 2. Regional Development Plan indicating planned activities (e.g. in-service training, staff development, data collection and management) for the respective regions.
- Verification: Education Office reports. Staff development plans in School Development Plans. Course registers. Education Officer monitoring reports. Programme documentation. National data. TEO reports. Reports from EOs, teachers and principals. Internal assessment and Grade 3 diagnostic test. Workshop reports and evaluation sheets. Handover documents and regional office inventories.
- Assumptions: Availability of officers for ongoing training. Resource centres appropriately equipped and utilised. In-house personnel have technical skills to operate multimedia equipment. Availability and willingness of persons to be involved in special needs training. Sufficient Ministry of Education and Culture capacity to provide and train at least one guidance counsellor in each cluster.
Output 5: Appropriate levels of teaching and learning resources provided to meet curriculum needs.
- Indicators: Books and equipment being used effectively by year 2. Teaching resources in school before start of school year. Individual access to textbooks and other resources. Learning resources developed from local materials and utilised.
- Verification: Observation reports by TEOs. Logbook records. Reports from parents. TEO reports.
- Assumptions: Efficient procurement and delivery system. Learning and teaching resources used effectively. Appropriate material is available and accessible.
Output 6: Minor rehabilitation works identified and carried out (through school development planning process).
- Indicators: Work identified, costed and approved by March of year 1. Work satisfactorily completed by mid-year 2.
- Verification: Building reports. School Development Plan. Estimate of expenditure. Building official approval. Building Officer's inspection report. TEO reports. Community feedback.
- Assumptions: Work carried out is on time and adequately supervised to maintain quality. Work plan will be within the financial budget. Positive political support.

Output 7: Increased capacity for lessons learnt to be transferred and sustained.
- Indicators: 10% of non-project schools utilising the best practices at end of project. Action research in all regions following project guidelines. Participation by all stakeholders in mid-term review.
- Verification: Baseline data and research results. Monitoring reports. Panel reports/classroom observation. Documentation from action research projects. Review reports.
- Assumptions: A culture of learning will develop. Documentation will take place. Findings will be disseminated to all stakeholders.
Lesotho Public Financial Management - Logical Framework (23)

Programme title: Public Sector Improvement and Reform Programme (PSIRP), Public Financial Management (PFM) Component

Goal: Public finances effectively managed and targeted towards improved development.
- Indicators: Achievement of Poverty Reduction Strategy targets.

Purpose: Strong PFM systems and processes started to be implemented, led by clear, long-term Government of Lesotho (GoL) priorities.
- Indicators: 1. Cabinet leads strong PFM oversight through: a new Finance Act; commitment to an integrated capital and recurrent budget; commitment to macro- and medium-term planning. 2. PAC discharges its oversight function, as evidenced by: hearings held on schedule with Accounting Officers challenged; PAC reports with clear recommendations on measures to be taken.
- Assumptions: 1. Political will to target budgetary resources released by improved PFM to meet objectives of the GoL Poverty Reduction Strategy (PRS). 2. PRS and macro- and medium-term plans set out clear targets and strategies for poverty reduction, in line with National Vision 2020. 3. The parallel and complementary reforms arising from PSIRP are achieved.

(23) Dearden, P.N. (2005) Government of Lesotho Public Sector Improvement and Reform Programme, Public Financial Management (PFM) Component, Logical Framework and Project Cycle Management Training, Inception Workshop, 27 June - 1 July 2005. Department for International Development South Africa (DFIDSA) and Centre for International Development and Training (CIDT), University of Wolverhampton, UK.
Output 1: Integrated planning and budgeting processes implemented.
- Indicators: 1. Comprehensive MTEF documents: 3 line-ministry MTEF budget documents by April 2006; 10 ministry MTEF budget documents by April 2007; 20 ministry MTEF budget documents by April 2008. 2. Sensitisation and training programmes increase by end of 2005.
- Verification: MTEF documents. Training reports.
- Assumptions: EU CBEP project focus is able to support proposed budgetary reforms. Sufficient GoL commitment to an integrated approach to planning and budgeting. Outstanding completion of the Planning/Finance merger does not disrupt the integration of planning and budgeting processes.
Output 2: Modern integrated accounting, revenue and expenditure management systems introduced.
- Indicators: 1. New IFMIS and HRMS, and supporting ICT framework, procured and implemented in 8 pilot sites by April 2007. 2. IFMIS implemented in all ministries by May 2008. 3. New Treasury structure established and operational in MoFDP, MAUs and Sub-Accountancies from April 2007. 4. Internal Audit ministry-based Audit Units/Teams established in MoFDP and 5 ministries by March 2007. 5. 9 Internal Audit Committees operating in MoFDP and 8 pilot ministries by March 2007. 6. Report on professional development options by March 2006. 7. Training on IFMIS commences July 2006. 8. New Finance Act submitted to Attorney General by October 2007.
- Verification: Government Gazette. Approved structures by MPs. Minutes. Existence of established positions. Training reports. Finance Act. Options report.
- Assumptions: Backlog of 1996-97 to 2000-01 Public Accounts is addressed early in the Programme. EU is able to finance IFMIS/ICT/HRMS. EU procurement regulations can accommodate the implementation schedule. IFMIS implementation schedule is realistic with regard to budget cycle dates. MoFDP able to recruit additional staff required. Existing Treasury/MAU/Sub-Accountancy staff are not resistant to change. Sufficient funding available to support the whole Programme. Long-term professional development programmes in Lesotho can be supported.
Output 3: Strengthened macroeconomic analysis and forecasting linked to fiscal policy, budget performance and monitoring.
- Indicators: 1. Macroeconomic model developed and approved by the Hon. Minister by August 2007. 2. Medium-term forecasts produced by July 2007. 3. Fiscal policy reports progressively from the 2007/8 budget year.
- Verification: Signed endorsement letter. Medium Term Report. Fiscal Policy Report.
- Assumptions: Reporting and reliability improvements under IFMIS/accounting reforms able to support fiscal performance reporting requirements. Ministry MIS capable of providing financial and physical progress information.
Output 4: Effective Cabinet participation in the budget.
- Indicators: 1. Cabinet receives Budget Framework Papers from September 2005. 2. Draft MTEF submitted to Cabinet annually from February 2006. 3. Financial and output performance reports submitted to Cabinet quarterly from July 2007.
- Verification: Cabinet memoranda.
- Assumptions: Sufficient Cabinet support for the MTEF approach. Reporting and reliability improvements under IFMIS/accounting reforms able to support fiscal performance reporting requirements.
Output 5: Role of independent oversight strengthened.
- Indicators: 1. PAC reports by June every year from 2007. 2. New Audit Act prepared and submitted to Attorney General by December 2005. 3. Strategic plan of Audit Office in place by December 2005. 4. Action plans and measures taken to address the backlog in Public Accounts by July 2006. 5. Performance Audit Manual prepared and staff trained by July 2006. 6. Financial Audit Manual prepared and staff trained by July 2007. 7. Professional audit training and development programmes developed and agreed by July 2006. 8. Options developed to establish professional public sector auditing training in Lesotho by May 2008.
- Verification: PAC reports. Audit Act. Signed copy of covering letter to Attorney General. Copy of approved Strategic Plan. Audit manuals. Training registers. Training and development programmes. Options paper.
- Assumptions: PAC members are supportive of change. GoL can provide adequate accommodation and support resources. Sufficient Cabinet and Parliament support for PAC reforms. Backlog in Public Accounts can be addressed. New Audit Act is passed before the end of the project. Support from the Auditor General's Office for the reform programme is adequate. Sufficient qualified staff available.
Output 6: Procurement systems modernised.
- Indicators: 1. Standard procurement documentation and revised procurement thresholds in place by December 2005. 2. Professional training and development programme developed and training commenced from January 2006. 3. New legislative provisions for government procurement drafted by July 2007. 4. New GoL Procurement System operational from May 2008. 5. Long-term professional procurement training capability in Lesotho established by May 2008.
- Verification: Procurement documents. Training reports. Draft legislation. Government Gazette.
- Assumptions: Support from ministries for the new Procurement System. Sufficient numbers of GoL procurement staff, and availability of new staff, to support the new system. Sufficient interest in procurement training. Sufficient internal audit capacity to ensure effective system operation. Creation of PRB not supported by GoL. Funding not available to provide long-term availability of professional procurement training.
Illegal Logging: Tackling the Underlying Governance, Policy and Market Failures Programme - Logical Framework (24)

SUPERGOAL: Realise the potential of forests to reduce poverty.

GOAL: Policies, processes and institutions that promote sustainable and equitable use of forests in the interests of the poor.
- Indicators: Improved governance of national and international institutions (rules, procedures, norms). More responsible markets.
- Verification: Records of wider representation and accountability mechanisms. Adoption of industry codes of conduct; greater demand for legal products.
- Assumptions: Forests are important in the livelihoods of poor people.

PURPOSE: Facilitate reforms by national and international institutions to address the governance, policy and market failures that cause and sustain illegal logging and associated trade.
- Indicators: 1. Policy that is informed by objective evidence. 2. National, regional and international policy processes that learn from each other. 3. More markets that discriminate against illegally harvested products.
- Verification: 1. National policy statements. 2. Proceedings of policy processes. 3. Changes to procurement policies.
- Assumptions: An equitable trading system requires governments and the trade in major consuming countries to take action against illegally logged timber.
OUTPUTS
1 Improved understanding of causes, scale and solutions to
illegal logging and associated trade.
1.1 Estimates of the nature, scale and impacts of
illegal logging in selected countries documented.
1.1 Monitoring reports, trade
statistics.
Improved understanding
facilitates policy and institutional reforms.
Need to simplify defining legality risks
compromising pro-poor legislative reform.
1.2 Key drivers of illegal logging – poor
governance, weak enforcement and market factors – analysed.
1.2 Studies on corruption, weak
enforcement, market pressures
1.3 Impacts of illegal
logging and enforcement actions on poor analysed.
1.3 Country-specific
research studies
24 Dearden, P.N. Mahony, D. and Jordan, G. ,2006, Illegal Logging – Tackling the Underlying Governance, Policy
and Market Failures Programme. Output to Purpose Review (OPR), January 2006, Department for International Development. (DFID) London and Centre for International Development and Training (CIDT), University of
Wolverhampton, UK.
1.4 Policy reform and countermeasures taken in producer and consumer countries and at the international level result from improved understanding gained.
1.4 Legislative/policy reform and actions in producer and consumer countries and in international trade.
2. Effective communication and advocacy that maintains political will and the momentum for change and widens the base of support for action.
2.1 Continuing UK ministerial-level engagement.
2.1 Ministerial participation in and speeches to relevant fora and meetings. Answers to PQs.
2.2 Effective communications within DFID maintain awareness and utilise links with other programmes.
2.2 Intranet up to date. Regional and country offices informed. Inter-divisional exchange of information.
2.3 Relevant news items and other external communications are tracked and appropriate responses made.
2.3 Press releases and responses.
2.4 Industry groups and companies adopt purchase policies that seek to eliminate trade in illegal timber.
2.4 Actions by trade associations and individual companies.
2.5 Effective communications with stakeholders and the wider public maintained, ensuring wide understanding of issues and actions being taken to address them.
2.5 Media reports, consultations, NGO newsletters, website.
3. Coherent UK, EU and G8 policy and regulatory framework.
3.1 Consistent policy statements and actions within Whitehall.
3.1 Minutes of inter-departmental meetings, joint policy papers, ministerial statements, answers to PQs.
DEFRA granted resources to implement CPET.
3.2 EU FLEGT Action Plan adopted, regulation passed and successful negotiations with Partner Countries.
3.2 Enactment of regulation; FPAs signed.
Sectoral and member states’ concerns are overcome. Potential Partner Countries willing to sign FPAs. UK Customs granted resources to enforce EU regulation.
3.3 Collaborative or supportive actions taken by Japan.
3.3 Programme documents, policy statements, measures taken.
Japanese concerns about competitiveness understood and addressed.
3.4 Collaborative or supportive actions taken by the US.
3.4 Programme documents, policy statements, measures taken.
US concerns about reciprocity are overcome and do not delay progress on intergovernmental measures.
3.5 Collaborative or supportive actions taken by China.
3.5 Programme documents, policy statements, measures taken.
China becomes receptive to market signals and diplomacy.
4. Development of tools and systems to tackle illegal logging and associated trade.
4.1 Technical monitoring, auditing and chain-of-custody solutions further developed, evaluated and, where appropriate, utilised.
4.1 Technical reports.
4.2 Appropriate independent monitoring programmes adopted and, where appropriate, supported.
4.2 Reports.
4.3 Tools and systems contribute to strengthened inter-agency co-operation at national, regional and international levels.
4.3 Donor and other programmes.
Key producer countries’ willingness to adopt efficient systems achieved.
5. Regional policy processes that lay the foundations for delivering reforms.
5.1 Effective participation of civil society and the private sector, as well as governments, in FLEG and similar initiatives.
5.1 Proceedings of policy processes and views of participants.
5.2 Constraints to poor people’s legal access to and management of forest resources are on the agenda at regional processes.
5.2 Proceedings of FLEG and other regional processes.
5.3 East Asia FLEG process gains and maintains momentum.
5.3 Proceedings of FLEG Task Force meetings and Working Group actions.
Regional partners motivated to take the process forward.
5.4 Malaysia and Singapore take actions within or independently of FLEG.
5.4 Countries’ policy statements and participation in bilateral or multilateral actions.
5.5 AFLEG process post-ministerial actions initiated and momentum maintained.
5.5 Proceedings of national multi-stakeholder discussions.
Broad scope of the AFLEG declaration does not divert attention from illegal logging issues.
5.6 Latin American and North Eurasian FLEG processes launched and lead donors supported.
5.6 Press reports and NGO newsletters.
International promoters have capacity to initiate additional processes.
5.7 Evidence of active links and learning between regions.
5.7 Cross-participation in meetings.
Sufficient progress is made to provide useful lessons.
ACTIVITIES
1.1 Review of reports, co-operation in original research where appropriate
1.2 Review of reports, co-operation in original research where appropriate
1.3 Review of reports, co-operation in original research where appropriate
1.4 Targeted support to enforcement action and governance reform.
2.1 Preparation of briefing materials. Regular meetings with ministers. Involvement of ministers in suitable events
2.2 Maintain internal web site. Participation where possible in cross-linked programmes.
2.3 Tracking of media reports and related activities. Follow-up actions.
2.4 Secondment to TTF. Support to industry action where appropriate.
2.5 Maintain web site and stakeholder consultations. Information to civil society on specific issues.
3.1 Servicing the Inter-departmental Whitehall Group, the Inter-Departmental Working Group, the Timber Buyers’ Group and the UK Forest Partnership.
3.2 Participating in and supporting actions aimed at implementing the EU FLEGT programme.
3.3 Regular communications with Japan to share lessons on promoting coherent domestic and international policies on procurement, trade policy, illegal logging and governance reforms. Continued attendance at AFP. Co-operation on activities in Indonesia.
3.4 Regular communications with involved US officials, through G8 and other fora. Support to the US on Latin America and N. Eurasia FLEG where appropriate.
3.5 Identify and follow through on opportunities to engage with China.
4.1 Support to development and evaluation of monitoring, auditing and tracking systems, including support to EU FLEGT partnerships.
4.2 Support to operation of monitoring, auditing and tracking systems, where appropriate.
4.3 Support to use of tools and systems that support inter-agency co-operation, both regionally and internationally.
5.1 Support to civil society involvement in promoting actions under regional FLEGs
5.2 Reports on poor people’s access and management opportunities prepared for FLEG and other regional fora.
5.3 Continued selective support to and participation in East Asia FLEG
5.4 Dialogue and other actions to encourage Malaysia and Singapore to participate in tackling illegal timber trade.
5.5 Continued selective support to and participation in AFLEG
5.6 Participate in Latin America and N. Eurasia FLEG where useful support can be offered.
5.7 Support visits of participants from FLEG processes to observe and offer insights to other FLEGs.
Livelihoods and Forestry Programme - Nepal
Project Name: Livelihoods and Forestry Programme
Country: Nepal
Total Cost: £18.67 million
Narrative Summary
Objectively Verifiable Indicators
Means/Sources of Verification
Assumptions
GOAL
Reduced vulnerability and improved livelihoods for poor and excluded rural people.
1. Diversified livelihood choices.
2. Ability of rural communities to recover from environmental and social shocks enhanced.
3. Reduced rural poverty.
4. Increased GDP from the forestry sector.
5. Average assets value of rural poor & excluded (P&E) households increased.
6. Increased access of P&E to government productive services e.g., agriculture, livestock, forest, irrigation, finance and marketing.
Baseline studies in PY1 and 2 (Eastern and Western areas).
District profiles in PY3/4 (in the Terai and Mid-Western area).
Follow-up preliminary impact assessment in PY5 in Eastern and Western areas and full evaluation in PY10 in the Eastern, Western, Terai and Mid-Western areas.
Independent study reports.
PURPOSE
Assets of rural communities enhanced through more equitable, efficient and sustainable use of forest/natural resources.
By EoP:
1. % of forest user groups25 which independently implement (i) active forest management26 and (ii) socio-economic development plans increases from 6% (2003) to 60% in hills, from 31% (2004) to 50% in mid west and from 35% (2004) to 50% in Terai.
2. % of poor and excluded FUG members who claim their rights to natural resources in an organised way increases from 31% (2006) to 60%27.
3. At least 40% of the economically poor user group member households report increased income because of their membership of user groups.
4. At least 30% of P&E user group members report greater access to livelihood choices (e.g., education, health, credit, livestock and emergency support).
1 FUG assessment report, Output to Purpose Review (OPR)/Annual Review report, independent study reports.
2 FUG/LFP progress reports, FUG assessment report, OPR/Annual Review report, impact reports.
3 Impact monitoring report, OPR/Annual Review report.
4 Impact monitoring report, OPR/Annual Review report, FUG annual reports.
5 FUG Assessment
There will be an acceptable risk environment, particularly in Terai, for having physical access to the forest and VDCs/districts.
The rate of conflict-affected migration in and out of LFP areas remains stable at the current (2007) rate.
Expected reforms in the forestry sector are inclusive-sensitive and informed by field realities and experience.
DFID remains engaged in Nepal, providing predictable
25 Includes CFUGs, PLMGs, Soil groups, Leasehold groups, CFMGs, etc.
26 Active forest management means planning based on the potential of the forest and its resources, and its implementation (maximising that potential).
27 The % of P&E organised in groups will be recorded as the achievement against the indicator.
5. User satisfaction score achieved by forestry sector service providers on their technical28 support increases from (i) 66% to 75% for DFO, (ii) 18% to 40% for DSCO and (iii) to 80% for F/UGs and their networks, Local Resource Persons, and Animation Programme Manager/partner NGOs.
6. The average fund mobilised (leverage) by the FUGs is at least equal to the total amount of funds invested by LFP29.
7. % of (i) ethnic group30 members of FUG/Cs who participate in meetings increases from 31% (2003) to 60% in hills, 64% (2005) to 75% in mid west and 18% (2005) to 40% in Terai; (ii) women from 33% (2003) to 60% in hills, 54% (2005) to 70% in mid west and 49% (2005) to 60% in Terai; and (iii) poor to 50% in all areas.
8. % of FUGs spending at least 35% of their fund on P&E provisions increases from 6% (2004) to 40% in hills, 18% to 40% in mid west and 10% to 25% in Terai31.
report and Impact monitoring reports, assets tracking/well-being record, Output to Purpose Review (OPR) report.
6 FUG reports, DFO reports, and District Progress reports.
7 Baseline report, FUG records, OPR/Annual Review report, FUG assessment reports.
8 Baseline reports, FUG assessment reports, OPR/Annual Review reports.
development support through appropriate aid development instruments.
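The purpose-level indicators above all follow the same baseline-to-target pattern ("increases from X% (year) to Y% by EoP"). As an illustrative sketch only (not part of the handbook's method), progress against such a target can be read as the share of the baseline-to-target distance covered; the "current" survey value below is hypothetical:

```python
# Illustrative sketch: tracking a logframe indicator of the form
# "increases from baseline% (year) to target% by End of Programme".
# Baseline and target are taken from indicator 1 above (hills: 6% -> 60%);
# the "current" mid-term value is a hypothetical figure for illustration.

def progress_to_target(baseline: float, target: float, current: float) -> float:
    """Share of the baseline-to-target distance achieved (0.0 = baseline, 1.0 = target)."""
    if target == baseline:
        raise ValueError("target must differ from baseline")
    return (current - baseline) / (target - baseline)

# Hills FUGs independently implementing active forest management plans:
baseline, target = 6.0, 60.0   # % in 2003 -> % at End of Programme
current = 33.0                 # hypothetical mid-term survey value

share = progress_to_target(baseline, target, current)
print(f"{share:.0%} of the way from {baseline}% to {target}%")  # 50% of the way
```

The same calculation applies unchanged to the other percentage indicators in this matrix; only the baseline and target figures differ by area.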
OUTPUT 01 Forest managers32 enabled to responsively manage and utilize forest resources
BY EOP33
1. % of potential community forest estate under a defined management system increases from 27% (2003) to 50% in
1. District Progress
reports, District level data from DFO, MFSC,
The present policies that allow access to forest resources by
28 Technical support includes advisory services required by FUGs, such as technical, social, institutional and coordination-related services.
29 The leveraged amount will be calculated on annual expenditure; it is not the cumulative total for the programme period.
30 Disadvantaged janajati and caste group people and religious minorities will be considered while analysing information against this indicator.
31 The baseline values indicate the percentage of FUGs which spent 20% or more of their resources on pro-poor provisions.
32 The term “forest managers” denotes forest group members of any forest regime, and all forest management-related service providers.
33 EOP/EOC = End of Programme or End of Component.
to sustainably maximize the multiple benefits
hills, 30% (2005) to 50% in mid west and from 5% (2005) to 25% in Terai.
2. Out of the total potential public and institutional land in the Terai, 10% will be under a defined management system with regeneration of forest.
3. % of FUG members who report improvement in (i) availability of forest products increases from 82% (2003) to 90% in hills, 47% (2005) to 60% in Terai and 78% (2005) to 85% in mid west, and (ii) wildlife/water condition from 75% (2003) to 85% in hills, 63% (2005) to 75% in mid west and 26% (2005) to 35% in Terai.
4. % of FUGs involved in NTFP management increases from 9% (2003) in hills, 31% (2005) in mid west and 26% (2005) in Terai to 50% in all LFP areas.
5. Number of FUG-based forest enterprises increased from 12 (2003) in hills, 52 (2005) in mid west and 59 (2005) in Terai by at least five times.
6. In all LFP districts, Operation Plans (OPs) are amended on time (no OP backlog) with technically improved34 OPs and constitutions.
GIS maps.
2. District progress reports, Annual Review.
3. Baseline study reports, impact monitoring reports, FUG assessment reports, District Progress reports.
4. FUG Assessment, impact monitoring, LFP progress reports.
5. FUG database, case studies, records from DFO/LFP/NGO and independent study reports, FUG assessment reports.
6. FUG assessment, copy of OPs, FUG monitoring report, progress reports.
the community continue.
Appropriate means for registering public and institutional land to communities is determined.
The forest sector policy will be favourable to promote forest-based enterprises and markets.
34 Technically improved OPs will have supervised inventory and management prescriptions.
OUTPUT 02 Poor and excluded groups enabled to participate in and benefit from the forestry sector
BY EOP
1. All the new and amended operational plans (OPs) and constitutions have at least three P&E equitable provisions (one each for participation, forest and other resource distribution).
2. % of the total FUGs who implement at least three P&E equitable provisions increases from 1.25% (2003) in hills, 3.5% (2004) in mid west and 3.8% (2004) in Terai to 20% (one each related to participation, forest and other resource allocation).
3. At least 50% of economically poor FUG members access income-generating opportunities.
4. At least (i) 50% women, (ii) 15% Dalits (both male and female), (iii) 30% disadvantaged ethnic group members (both male and female) and (iv) 15% poor represented in executive committees of FUGs.
5. At least (i) 33% women and (ii) 33% Dalits or disadvantaged ethnic group members (both male and female) represented in key decision-making positions of FUG executive committees.
6. At least 60% of poor and excluded households access benefits generated from forestry groups and their resources (e.g., paid employment, educational benefits, quick-impact and community development, credit facility, skill development training, land allocation, emergency fund, etc.).
1. FUG Assessment reports, FUG constitution review report, independent study reports.
2. FUG assessment reports, FUG documents review and independent studies.
3. FUG progress reports, District Progress reports, FUG Assessment reports.
4. FUG assessment reports.
5. FUG Assessment reports, District Progress reports, reports from LFP partner institutions, independent study reports.
6. District Progress reports, FUG assessment reports, DFO progress reports, FUG progress reports, independent study reports.
There will be continued respect and support for working inclusively in the new political context.
LFP partners will have a favourable policy and operating environment to implement their activities.
MFSC will approve and implement the Gender and Social Inclusion (GSI) strategy.
OUTPUT 03 Capacity within and coordination amongst institutions strengthened for forestry sector development and enhanced livelihoods.
1. All LFP districts have multi-stakeholder fora with a secretariat functioning as the principal district-level forest sector planning, coordination and monitoring mechanism.
2. In LFP districts, village-level multi-stakeholder fora engaged in forestry sector activities (i.e. networks) established in at least 50% of VDCs in hills and mid west, and 25% of VDCs in Terai.
3. All multi-stakeholder fora include gender and social inclusion aspects in their decisions, plans and monitoring.
4. % of (i) women staff in LFP and its partner institutions increases from 21% (2006) to 33%; and (ii) staff from excluded groups (both women and men from Dalits and disadvantaged ethnic groups) from 37% (2006) to 45%.
5. All District Forest Offices and key partners will target their interventions in proportion to the base population35 of different social groups (women, Dalits and disadvantaged groups) in LFP districts.
6. Up to 15 MSc and 30 BSc scholarships provided to MFSC staff.
1. LFP/DFO/Network progress reports.
2. LFP/DFO/Network progress reports.
3. Training reports, progress reports.
4. Review report, progress report and gender audit reports.
5. Copies of DFO and partners’ plans, gender audit reports.
6. LFP financial records, nominations by MFSC, annual and progress reports.
MFSC and MLD will have consensus on decentralisation strategies and federal state structure.
A politically accepted governance mechanism will be in place at district and national levels.
DFCC, VFCC and forest user group networks will work positively with user groups and stakeholders.
35 Base population will be defined by the information available from the Central Bureau of Statistics (CBS/GoN); the figures are taken as context data for proportionate services and representation.
OUTPUT 04 Innovative, inclusive and conflict-sensitive approaches shared to inform forest sector planning and policies
1. At least one new (innovative) initiative (i.e., in forest management/NTFP/agro-forestry/public land, safe and effective development/pro-poor and excluded growth, scholarship package, alternative energy, High Altitude Forest Management, forest certification, etc.) tested per year.
2. LFP strategy on communication developed and implemented, shared with Programme Management Committee (PMC/MFSC) members, LFP partners (e.g. DFOs and forestry sector networks), DFID and a wider audience.
3. At least one effective practice paper/strategy/approach developed, implemented and shared (e.g. on climate change, peace building, SFM, second-generation issues in forestry and the importance of disaggregated monitoring information) per year.
1. Progress reports, documents of innovative practices, Annual Review and independent reviews.
2. Copy of publication, progress reports, annual review report, meeting minutes, responses from people receiving publications and communications, website feedback.
3. Copies of strategy, progress reports, Annual Review report, sharing reports, meeting minutes.
There will be a continued favourable political and policy environment that supports the development and testing of innovative ideas.
OUTPUT 05 National-level forest sector capacity and response to field reality strengthened
1. At least one MFSC field-responsive policy/strategy/guideline developed/revised per year.
2. At least one research/study paper to inform forestry sector improvements produced every year.
By EOP
3. P&E F/UG representatives participate in all policy formulation, contributing to the development of field-responsive and P&E-sensitive policies and guidelines.
4. A prototype for a gender and social inclusion-sensitive monitoring system of MFSC in place.
5. Groundwork for forestry sector reform started in line with the changed context.
1. Copies of policy/strategy/guidelines, progress reports, Annual Review, workshop/meeting reports.
2. Copies of the research publication.
3. Copies of policy/strategy/guideline.
4. MFSC set of monitoring questionnaire and checklist, FUG database, Annual Review, progress reports.
5. Copy of forestry sector reform papers and plans.
All partners committed to adopt and implement GoN Policies, Guidelines and Strategies.
MFSC continues the consultative policy formulation process.
Livelihoods and Forestry Programme - Key activities by Outputs
1. Output 01
1.1. Group formation (CFUG, PLMG, PFUG, CFMUG…)
1.2. Constitutions/OP preparation/amendments/forest management plan preparation
1.3. Forest nursery establishment and forest/NTFP species seedling production activities
1.4. Soil and water conservation activities, e.g., trail improvement, water resources protection, on-farm conservation, irrigation canals, landslide protection…
1.5. Government-controlled/community-managed forest-related activities, e.g., plan preparation, silviculture operations, fire line management, fuel wood depots, thinning and pruning, etc.
1.6. Demo plot support (establishment and management)
1.7. Forest/NTFP species plantation and post-plantation activities
1.8. Forest protection support/forest management support
1.9. DFO/forest managers training, exposure visits, awareness campaigns
1.10. Forest/watershed/soil conservation/public land/agro-forestry/NTFP/alternative energy management training/workshops for the users
1.11. Forest user group planning and review workshops
1.12. PPSI/GPSE sensitisation training/exposure for forest managers and monitoring system development
1.13. Pond management within forest areas
1.14. Forest/agro/livestock-based enterprise development and management activities
1.15. Forest product marketing support
1.16. Awards (best FUGs, quiz, etc.)
1.17. DFO/DSCO support for resource centre management, field equipment, etc.
1.18. Conflict resolution meetings, training, workshops, etc.
1.19. Research related to scientific forest management
1.20. B.Sc./M.Sc. scholarship support
1.21. Climate change/global warming-related activities (e.g., sample inventory preparation)
2. Output 02
2.1. P&E identification activities (e.g., well-being ranking)
2.2. Income-generating activities (forest-based and non-forest-based) and revolving fund provisions
2.3. Support in P&E-sensitive policy formulation and FUG planning
2.4. Animation/social mobilisation activities
2.5. Education support for P&E children
2.6. Emergency fund/humanitarian support
2.7. Small health and sanitation activities targeting P&E
2.8. Land allocation (CF and public land)
2.9. P&E exposure visits
2.10. P&E skills enhancement, capacity-building training/workshops and scholarship support
2.11. Issue-based sub-group formation and related support
2.12. Tole-level processes and group strengthening
2.13. Small infrastructure support (irrigation, drinking water, etc., focusing on P&E)
2.14. Research related to P&E issues
2.15. NRM classes targeted to women and P&E
2.16. P&E-specific support under the Local Initiative Fund (LIF)
3. Output 03
3.1. Network formation and strengthening
3.2. VFCC/DFCC strengthening support
3.3. Awareness raising on climate change, global warming and the Kyoto Protocol
3.4. Orientation on peace-sensitive development
3.5. Different-level forest coordination meetings
3.6. DFO/DSCO office support for resource centre, equipment, stationery, etc.
3.7. Institutional development training and workshops for service providers
3.8. Celebration of environment day, etc.
3.9. Inter-group conflict resolution (e.g., boundary)
3.10. Institutional strengthening support to networks, user groups, etc. (organisational analysis, training, workshops and materials)
3.11. Review and planning workshops with stakeholders and networks
3.12. Collaborative activities
3.13. Monitoring and evaluation activities (FUG monitoring and categorisation, field visits, impact monitoring, progress monitoring, etc., and related training/workshops)
4. Output 04
4.1. Strategy development
4.2. Publication of best practices
4.3. Thematic workshops/interactions
4.4. Piloting/testing of different approaches and initiatives
4.5. Central-level support to networks and federations (civil society groups)
4.6. Policy work through participation in different task forces
4.7. Capacity building/training on planning and monitoring
4.8. Publication/dissemination of LFP effective practices
4.9. Implementation of communication action plan
5. Output 05
5.1. Central-level support to MFSC on policy/strategies/systems and guideline development/strengthening (e.g. PLMG policy, CF guidelines…)
5.2. Joint action with civil society networks
5.3. Contribution to developing and implementing the Gender and Social Inclusion Strategy
5.4. Contribution to the forestry sector review; study on the forest sector’s contribution to GDP
5.5. P&E support in participating in policy debate
5.6. Policy review (audit)
5.7. Contribution to research/studies by MFSC and its subsidiaries
5.8. M&E system strengthening support/database management support
5.9. Communication and extension activities
APPENDIX O: ENGENDERING THE LOGICAL FRAMEWORK
The logframe should incorporate an awareness of the social relations that are intrinsic to project implementation, monitoring, and evaluation. In this regard, two common assumptions must be critiqued.
1. That participatory projects benefit both women and men, and
2. That women are generally a homogeneous social group.
More than three decades of gender analysis in research and development work informs us that neither of these assumptions is true. The task is to converge gender analysis and the logframe to improve gender equity in projects.
An engendered logical framework requires that the process of planning a project, as well as each component of the logical framework matrix, be seen through a “gender lens.” This lens is informed by gender analysis, which is a methodology to investigate the socially constructed differences between men and women, and between women themselves (Moser 1993; Goetz 1997). These differences determine the extent to which men and women vary in their access to and control over resources and encounter different constraints and opportunities in society, whether it is at the level of the household, community, or state. Established patterns of gender inequality and inequity can be exposed, explored, and addressed through gender analysis.
Key Questions for Engendering the Logical Framework
Objectives/Narrative Summary
Indicators Data Sources Assumptions
Goal/Impact
Do gender relations in any way influence the project goal?
What measures can verify achievement of the gender-responsive goal?
Are the data for verifying the goal sex-disaggregated and analyzed in terms of gender?
What gender analysis tools will be used (e.g., in impact assessment)?
What are the important external factors necessary for sustaining the gender-responsive goal/Impact?
Outcome/Purpose Does the project have gender-responsive objective(s)?
What measures can verify achievement of the gender-responsive objective(s)?
Are the data for verifying the project purpose/outcome sex-disaggregated and analyzed in terms of gender? What gender analysis tools will be used?
What are the important external factors necessary for sustaining the gender-responsive objective(s)?
Outputs
Is the distribution of benefits taking gender roles and relations into account?
What measures can verify whether project benefits accrue to women as well as men, and the different types of women engaged in or affected by the project?
Are the data for verifying project outputs sex-disaggregated and analyzed in terms of gender? What gender analysis tools will be used (e.g., in participatory field evaluations)?
What are the important external factors necessary for achieving project benefits (specifically, benefits for women)?
Activities Are gender issues clarified in the implementation of the project (e.g., in workplans)?
Inputs What goods and services do project beneficiaries contribute to the project? Are contributions from women as well as men
accounted for? Are external inputs accounting for women’s access to and control over these inputs?
Are the data for verifying project activities sex-disaggregated and analyzed in terms of gender? What gender analysis tools will be used (e.g., in monitoring the activities)?
What are the important external factors necessary for achieving the activities and especially ensuring the continued engagement of men and women
participants in the project?
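As a minimal sketch (not part of the original handbook), the key questions above can be carried into project records as a simple per-row checklist, so that each logframe row notes whether it has been reviewed through the gender lens. All field and class names here are illustrative assumptions, not an established schema:

```python
# Illustrative sketch: a logframe row with an attached gender-lens checklist.
# The structure mirrors the four logframe columns plus the key questions
# in the table above; every name is an assumption for illustration only.
from dataclasses import dataclass, field

@dataclass
class GenderLens:
    gender_responsive_objective: bool = False   # objective reflects gender relations?
    sex_disaggregated_data: bool = False        # data sources split by sex?
    gender_analysis_tool_named: bool = False    # e.g. impact assessment, field evaluation
    external_factors_reviewed: bool = False     # assumptions checked for gender effects

@dataclass
class LogframeRow:
    level: str                                  # "Goal", "Purpose", "Output", "Activity", "Input"
    narrative_summary: str
    indicators: list = field(default_factory=list)
    sources: list = field(default_factory=list)
    assumptions: list = field(default_factory=list)
    lens: GenderLens = field(default_factory=GenderLens)

    def engendered(self) -> bool:
        """True once every gender-lens question for this row has been addressed."""
        return all(vars(self.lens).values())

row = LogframeRow(
    level="Output",
    narrative_summary="Poor and excluded groups enabled to participate in "
                      "and benefit from the forestry sector",
)
row.lens.gender_responsive_objective = True
row.lens.sex_disaggregated_data = True
print(row.engendered())  # False until all four questions are addressed
```

The point of the sketch is only that the gender lens attaches to every row of the matrix, not just the goal: a row is "engendered" when all four questions have an answer, which echoes the column-by-column layout of the table above.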
APPENDIX P: USEFUL REFERENCES
Stakeholder analysis: http://www1.worldbank.org/publicsector/anticorrupt/PoliticalEconomy/PREMNote95.pdf - excellent World Bank paper on stakeholder analysis in reform processes
http://www1.worldbank.org/publicsector/politicaleconomy/November3Seminar/Stakehlder%20Readings/SAGuidelines.pdf - interesting guidelines for doing SA (over-complex and quantitative?)
http://www.stsc.hill.af.mil/crosstalk/2000/12/smith.html - a good journal article
http://www.phrplus.org/Pubs/hts3.pdf - stakeholder analysis in health reform
http://www.policy-powertools.org/index.html - tools for SA in natural resource management
Logical Frameworks
Dearden, P.N., Jones, S. and Sartorius, R. (2003) Tools for Development: A Guide for Personnel Involved in Development. Department for International Development, London. 144 pp.
Dearden, P.N. and Kowalski, R. (2003) Programme and Project Cycle Management (PPCM): Lessons from South and North. Development in Practice, 13(5). http://www.ingentaconnect.com/content/routledg/cdip
Dearden, P.N. (2005) An Introduction to Multi Agency Planning Using the Logical Framework Approach. 0-19+ Partnerships and Centre for International Development and Training, University of Wolverhampton. http://www2.wlv.ac.uk/webteam/international/cidt/cidt_Multi_Agency_Planning.pdf
European Commission (2004) Aid Delivery Methods: Project Cycle Management Guidelines. European Commission, March 2004.
European Commission (2012) ROM Handbook: Results-Oriented Monitoring. European Commission, April 2012.
IAEA (2010) Designing IAEA Technical Cooperation Projects Using the Logical Framework Approach: A Quick Reference Guide.
IAEA (OIOS) (2003) Guides for Programme and Project Evaluation.
IFAD (2002) Managing for Impact in Rural Development: A Guide for Project M&E. Rome.
OECD (2010) Glossary of Key Terms in Evaluation and Results Based Management.
UNDG (2011) Results-Based Management Handbook: Harmonizing RBM Concepts and Approaches for Improved Development Results at Country Level. October 2011.
UNDP (2009) Handbook on Planning, Monitoring and Evaluating for Development Results.
WFP, Monitoring and Evaluation Guidelines.
World Bank (2005) Self-Assessment in Managing for Results: Conducting Self-Assessment for Development Practitioners.
World Bank (2004) Ten Steps to a Results-Based Monitoring and Evaluation System.
Indicator Sources (at high level, outcome/impact)
Millennium Development Goals Indicators. The data, definitions, methodologies and sources for more than 60 indicators to measure progress towards the Millennium Development Goals: http://mdgs.un.org/unsd/mdg/Default.aspx
Gives access to all national statistical agencies: http://www.ssb.no/lenker/
Overview of the available statistical databases within the UN: http://unstats.un.org/unsd/databases.htm
UNICEF Statistics. Economic and social indicators for 195 countries, with special emphasis on the living conditions of children: http://www.unicef.org/statistics/index_step1.php
UN Human Settlements Programme. Key indicators for cities and regions: http://www.devinfo.info/urbaninfo/
World Development Data Query. The World Bank’s database, containing 54 different indicators for 206 countries: http://ddp-ext.worldbank.org/ext/DDPQQ/member.do?method=getMembers
Gender statistics and indicators: http://genderstats.worldbank.org/home.asp
United Nations Environment Programme (UNEP): http://geodata.grid.unep.ch/
An online statistical data resource of selected demographic and health indicators gathered from various sources for several countries of the world: http://dolphn.aimglobalhealth.org/
Social indicators covering a wide range of subject-matter fields: http://unstats.un.org/unsd/demographic/products/socind/default.htm
WHO Statistical Information System (WHOSIS). Interactive database bringing together core health statistics: http://www.who.int/whosis/en/
Global and internationally comparable statistics on education, science, technology, culture and communication: http://www.uis.unesco.org/ev.php?ID=2867_201&ID2=DO_TOPIC
International Labour Organization. Core labour statistics and estimates for over 200 countries: http://www.ilo.org/global/What_we_do/Statistics/lang--en/index.htm
UNCTAD/WTO International Trade Centre. Presents trade and market profiles (Country Map) based on trade statistics that benchmark national trade performance and provide indicators of export supply and import demand: http://www.intracen.org/menus/countries.htm
Food and Agriculture Organization of the United Nations (FAO). Data relating to food and agriculture: http://faostat.fao.org/
Transparency International. Seeks to provide reliable quantitative diagnostic tools on levels of transparency and corruption at the global and local levels: http://www.transparency.org/policy_research/surveys_indices
Gender and Logical Frameworks
Beck, T. and Stelcner, M. (1997) Guide to Gender-Sensitive Indicators (Quebec: Canadian International Development Agency).
Goetz, A.M. (1997) Introduction: getting institutions right for women in development, in: A.M. Goetz (Ed.) Getting Institutions Right for Women in Development, ch. 1 (London: Zed Books).
International Service for National Agricultural Research (ISNAR) (1997) Gender Analysis for Management of Research in Agriculture and Natural Resources, a training module (The Hague: ISNAR).
Locke, C. and Okali, C. (1999) Analysing changing gender relations: methodological challenges for gender planning. Development in Practice, 9(3): 274–286.
MacDonald, M., Sprenger, E. and Dubel, I. (1997) Gender and Organizational Change: Bridging the Gap between Policy and Practice (The Netherlands: Royal Tropical Institute).
Moser, C.O.N. (1993) Gender Planning and Development: Theory, Practice and Training (London: Routledge).
United Nations Development Programme (UNDP) (1998) Uganda Human Development Report 1998 (Kampala: UNDP).
US Agency for International Development (USAID) (1994) Genesys Training Resource Material (Washington: USAID Office of Women in Development).
Mkenda-Mugittu, V. (2003) Measuring the invisibles: gender mainstreaming and monitoring experience from a dairy development project in Tanzania. Development in Practice, 13(5): 459–473.
Gender database providing a set of innovative measures to quantify inequalities between men and women: http://www.oecd.org/document/23/0,3343,en_2649_33935_36225815_
Theories of Change
The Community Builder's Approach to Theory of Change: A Practical Guide to Theory Development, by Andrea A. Anderson. Washington, D.C.: The Aspen Institute, 2005.
http://www.aspeninstitute.org/sites/default/files/content/docs/rcc/rcccommbuildersapproach.pdf
ActKnowledge and related community site Theory of Change Online:
http://www.actknowledge.org/
Theory of Change Online Community: http://www.theoryofchange.org/
Theory of Change online software tool: http://www.theoryofchange.org/toco-software/
http://betterevaluation.org/evaluation-components/define
CARE International’s theory of change guidance and resources
Conceptual and practical information and guidance about how to approach theory of change. http://p-shift.care2share.wikispaces.net/Theory+of+Change+Guidance#Resources
Cheyanne Church and Mark M. Rogers: Designing for Results: Integrating Monitoring and Evaluation in Conflict Transformation Programs (2006), Search for Common Ground
http://www.sfcg.org/programmes/ilr/ilt_manualpage.html
Also on http://www.stabilisationunit.gov.uk/stabilisation-and-conflict-resources/thematic/doc_details/111- d
Donor Committee on Enterprise Development Standard
http://www.enterprise-development.org/page/measuring-and-reporting-results
Paul Duignan, Outcomes Planning and DoView Software
http://www.outcomescentral.org/
DoView software: http://www.doview.com/
Funnell, S. and Rogers, P. (2011) Purposeful Program Theory: Effective Use of Theories of Change and Logic Models. San Francisco: Jossey-Bass.
HIVOS Resource Portal on Theory of Change
http://www.hivos.net/Hivos-Knowledge-Programme/Themes/Theory-of-Change/Background-initiative/About-the-Theory-of-Change-resource-portal
HIVOS/UNDP Method and Facilitation Guide: “Theory of Change: A Thinking-Action Approach to Navigate in the Complexity of Social Change Processes”, Iñigo Retolaza, 2011.
http://www.democraticdialoguenetwork.org/documents/view.pl?s=13;ss=;t=;f_id=1811
Logic model guidance from the Kellogg Foundation
http://www.wkkf.org/knowledge-center/resources/2006/02/WK-Kellogg-Foundation-Logic-Model-Development-Guide.aspx
http://www.keystoneaccountability.org/resources/guides
Magenta Book: UK Treasury Department
http://www.hm-treasury.gov.uk/data_magentabook_index.htm
MANDE News: Theories of Change resources
http://mandenews.blogspot.co.uk/2012/03/modular-theories-of-change-means-of.html
Participatory Impact Pathways Analysis (PIPA) Wiki!
http://boru.pbworks.com/w/page/13774903/FrontPage
The PIPA Wiki is a resource of the Water and Food Challenge Fund, CGIAR, which also has resources on developing a theory of change-based M&E system:
http://monitoring.cpwf.info/background/theory-of-change
http://www.researchtoaction.org/theory-of-change-useful-resources/
University of Wisconsin, Evaluation Extension Service
http://www.uwex.edu/ces/pdande/evaluation/evallogicmodel.html
World Wildlife Fund, Programme Standards and Tools
http://wwf.panda.org/what_we_do/how_we_work/programme_standards/
Care International UK: http://www.careinternational.org.uk/research-centre/conflict-and-peacebuilding/227-guidance-for-designing-monitoring-and-evaluating-peacebuilding-projects-using-theories-of-change
APPENDIX Q: RESULTS FRAMEWORK - RWANDA: PROGRAMME TO SUPPORT GOOD GOVERNANCE (PSGG)
This comprehensive Results Framework is from a four-year Programme to Support Good Governance (PSGG) in Rwanda. The programme was funded by DFID and managed by UNDP. The results framework was developed in a participatory manner during the first 18 months of the programme.
Reference:
Programme to Strengthen Good Governance (PSGG) DFID/UNDP Rwanda. DFID Output to Purpose Review Report. Philip N. Dearden and Herman Masahara, 27 May 2010.
---------------------------------------------------------------------------------------------------
Programme to Support Good Governance
A Joint UNDP / DFID Initiative
Performance Monitoring and Evaluation Plan
2008 – 2012
Program to Strengthen Good Governance Strategic Results Framework
[The framework appears as a one-page diagram in the original; its levels are summarised below.]
PSGG Vision: A nation where constitutionally mandated institutions take a lead role in promoting national reconciliation, social peace and poverty reduction by acting as agents of good governance and empowering citizens to participate in these key societal matters.
PSGG Goal: Good governance is a critical element in the achievement of the Millennium Development Goals and Rwanda Vision 2020, including a principal anchor for promoting the EDPRS’ pro-poor growth and Umurenge objectives.
PSGG Purpose: Strengthened constitutionally mandated institutions increase state accountability, responsiveness and transparency in an engendered way.
PSGG Outcomes: Each of the six supported institutions effectively discharges its mandate in an engendered way: the National Parliament; the National Women’s Council; the National Human Rights Commission; the Unity and Reconciliation Commission; the Ombudsman’s Office; and the High Council of the Press.
PSGG Outputs (for each institution): key areas of good governance capacity built; enabling good governance policy, legal and institutional context.
Partner Activities (varying by institution): strategic plan / M&E plan refined or prepared; Parliament radio; study tours; legislative / policy training; research capacity developed; civic education; Ingando education; laws reformed or new ones passed; conflict management; staff training / TA; communications and media campaigns; training for media owners and journalists; civic education on corruption; corruption studies; computerisation of voter registration; and civic electoral education.
Rwanda Programme to Support Good Governance
Performance Management Data Sheet PSGG Purpose / Strategic Objective : Strengthened Constitutionally Mandated Institutions Increase State Accountability, Responsiveness
& Transparency in an Engendered Way
Result to be Measured: Purpose / Strategic Objective Level: Strengthened Constitutionally Mandated Institutions Increase State
Accountability, Responsiveness & Transparency in an Engendered Way
Indicator SO 1 (Formulation): JGA Indicator on civil liberties – indicator 21: IREX Media Sustainability Index ‘Freedom Of Speech’ sub-index
Unit Of Measure: Percentage change (improvement) in the rating by IREX, in terms of the IREX scores
Source Of Data: Data will be collected from IREX on a yearly basis
Indicator Description: This is one of the Joint Governance Assessment indicators that PSGG will measure at the Purpose level; the MHC will also measure it in its own PMEP.
Expected Performance Over MTSP Duration YEAR PLANNED ACTUAL
Methodology Comments: The MSI assesses five objectives in shaping a
successful media system:
1. Legal and social norms protect and promote free speech and access to public information,
2. Journalism meets professional standards of quality,
3. Multiple news sources provide citizens with reliable and objective news,
4. Independent media are well-managed businesses, allowing editorial
independence, and
5. Supporting institutions function in the professional interests of Independent media.
Each of these objectives is assessed using 7-9 sub-indicators, each measured on a score of 0-4. The indicator scores are based on the votes of a panel of 10 local journalists and on IREX staff assessments. The Rwanda survey was first conducted in 2006-7 (shown here) and will be updated annually. IREX collects the data on an annual basis to establish, among other areas, the performance of the media and the nature of the environment in which that media operates. IREX carries out its research on the media based on five objectives: freedom of speech, professional journalism, plurality of news sources, business management, and supporting institutions.
2008 Baseline: 2.29
2009 Target: 2.49
2010 Target: 2.79
2011 Target: 3.19
2012 Target: 3.69
Supplementary Information / Indicator Description: IREX is the International Research & Exchanges Board. Under its Media Sustainability Index (MSI) Africa, the media and the environment in which it operates are assessed. Rwanda is rated by IREX against a set of five media objectives, among them freedom of speech, plurality of news sources and supporting institutions. The latest IREX rating is for 2006-2007. The overall score for Rwanda on the five objectives is 2.29, which reflects fairly strong scores in Objectives 1 and 3 (freedom of speech and plurality of news). On the lower end, Objectives 4 and 5 (business management and supporting institutions) scored just above and below a 2, respectively. http://www.irex.org/programs/MSI_Africa/rwanda.asp
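The MSI scoring method described above (7-9 sub-indicators per objective, each scored 0-4, averaged into five objective scores and then into one overall country score) can be sketched as follows; the sub-indicator values shown are hypothetical illustrations, not real IREX panel data.

```python
# Hedged sketch of an MSI-style score calculation.
# All sub-indicator values below are hypothetical, not actual IREX data.

def objective_score(sub_indicator_scores):
    """Average the 7-9 sub-indicator scores (each 0-4) for one objective."""
    return sum(sub_indicator_scores) / len(sub_indicator_scores)

def overall_score(objectives):
    """Average the five objective scores into one overall country score."""
    return sum(objective_score(s) for s in objectives) / len(objectives)

# Five objectives, each with illustrative sub-indicator scores (0-4).
objectives = [
    [3, 3, 2, 3, 2, 3, 2],     # 1. free speech / access to public information
    [2, 2, 2, 3, 2, 2, 2, 2],  # 2. professional journalism
    [3, 2, 3, 3, 2, 3, 2],     # 3. plurality of news sources
    [2, 1, 2, 2, 2, 2, 1],     # 4. business management
    [2, 2, 1, 2, 2, 2, 2],     # 5. supporting institutions
]

print(round(overall_score(objectives), 2))
```

A target such as "2.49 by 2009" is then simply a planned increase in this averaged score against the 2008 baseline of 2.29.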
Rwanda Programme to Support Good Governance
Performance Management Data Sheet PSGG Purpose / Strategic Objective : Strengthened Constitutionally Mandated Institutions Increase State Accountability, Responsiveness
& Transparency in an Engendered Way
Result to be Measured: Purpose / Strategic Objective Level: Strengthened Constitutionally Mandated Institutions Increase State
Accountability, Responsiveness & Transparency in an Engendered Way
Indicator SO 2 (Formulation): Percent of population that has directly experienced a corrupt activity (Comprehensive survey of incidence
of corruption)
Unit Of Measure: Percent of respondents who have directly experienced a case of corruption
Source Of Data: Ombudsman Office Survey or PSGG level Survey
Indicator Description: This is one of the Joint Governance Assessment indicators that PSGG will measure at the Purpose level; the OO will also measure it in its own PMEP.
Expected Performance Over MTSP Duration YEAR PLANNED ACTUAL
Methodology Comments: Survey of a sample of Rwandan citizens to determine how many of them have direct experience of at least one corrupt act. JGA = stratified sample including different socio-economic and occupational groups. Data will be collected at least every two years, through a survey undertaken directly by PSGG as part of its consolidated survey exercise (including other IP opinion questions); by the OO itself; or by the JGA, as this is one of its core indicators.
2008 Baseline:
2009 Target:
2010 Target:
2011 Target:
2012 Target:
Supplementary Information / Indicator Description: This is a JGA indicator that has been modified to better define what is being measured. We are measuring the incidence of corruption through the responses of individuals who have directly experienced at least one corrupt act. The actual JGA indicator reads: Comprehensive survey of incidence of corruption. Concerned PSGG / OO staff felt that it needed refining, which is why it was changed.
Rwanda Programme to Support Good Governance
Performance Management Data Sheet PSGG Purpose / Strategic Objective: Strengthened Constitutionally Mandated Institutions Increase State Accountability, Responsiveness
& Transparency in an Engendered Way
Result to be Measured: Purpose / Strategic Objective Level: Strengthened Constitutionally Mandated Institutions Increase State
Accountability, Responsiveness & Transparency in an Engendered Way
Indicator SO 3 (Formulation): The number of complaints of injustice forwarded by OO to the concerned justice institutions
disaggregated by institution (e.g., police, tribunals, prosecutor general, District Council)
Unit Of Measure: Number of complaints forwarded to concerned institutions
Source Of Data: Ombudsman Office Audit Reports or JGA assuming that it changes the concerned indicator
Indicator Description: Receiving complaints relating to corruption and forwarding them to justice institutions enhances public confidence in the fight against corruption. This is a modified JGA indicator; see the discussion below for the reason it was changed.
Expected Performance Over MTSP Duration YEAR PLANNED ACTUAL
Methodology Comments: Corruption cases treated are forwarded to judicial institutions, a follow-up system is put in place, and citizens are assisted if necessary. The current database only collects non-disaggregated data; the OO will need to manually disaggregate the data by the institution to which each complaint is forwarded.
2008 Baseline:
2009 Target:
2010 Target:
2011 Target:
2012 Target:
Supplementary Information / Indicator Description: The
JGA indicator to which this indicator relates reads: No. of successful
prosecutions as a % of cases reported to police and/or ombudsman. It
is not being used in this PMEP because the OO has no control over: 1) whether the concerned institution will actually prosecute the cases brought to it; or, 2) its competence or ability to win cases. Accountability should be direct in terms of what the OO and other IPs can actually achieve.
Rwanda Programme to Support Good Governance
Performance Management Data Sheet PSGG Purpose / Strategic Objective: Strengthened Constitutionally Mandated Institutions Increase State Accountability, Responsiveness
& Transparency in an Engendered Way
Result to be Measured: Purpose / Strategic Objective Level: Strengthened Constitutionally Mandated Institutions Increase State
Accountability, Responsiveness & Transparency in an Engendered Way
Indicator SO 4 (Formulation): The number of commission hearings in which executive officials are requested to appear and respond to commission members’ questions disaggregated by Senate and Chamber of Deputies and Committee jurisdiction
Unit Of Measure: Number of commission hearings in which Executive Branch officials are called before, and appear before, a concerned Parliamentary Commission
Source Of Data: Committees Secretariat and/or Joint Governance Assessment
Indicator Description: This indicator captures the Committees’ activity in overseeing Executive performance and spending. One of the Parliament’s key roles is holding the executive accountable (oversight). As stipulated in the Organic Law, Committees may review and investigate the implementation of policies and the use of public funds. This is one of the JGA indicators that the PSGG will follow and report on, at both the Parliamentary IP level and the PSGG level. The JGA indicator reads: Number of times Ministers get called to parliament.
Expected Performance Over MTSP Duration YEAR PLANNED ACTUAL
Methodology Comments: The Project Coordinator will collaborate with the committee secretariat to capture the data; this will be done semi-annually. It is also possible that the JGA will measure this indicator and provide PSGG with the results.
2008 Baseline: CoD - 5; Sen -
2009 Target: CoD - 10; Sen -
2010 Target:
2011 Target:
2012 Target:
Supplementary Information / Indicator Description:
Rwanda Programme to Support Good Governance
Performance Management Data Sheet PSGG Purpose / Strategic Objective: Strengthened Constitutionally Mandated Institutions Increase State Accountability, Responsiveness
& Transparency in an Engendered Way
Result to be Measured: Purpose / Strategic Objective Level: Strengthened Constitutionally Mandated Institutions Increase State
Accountability, Responsiveness & Transparency in an Engendered Way
Indicator SO 5 (Formulation): Equality of all Rwandans (men and women) reflected by ensuring that women are granted at least 30 percent of positions in decision making organs disaggregated by Senator / Deputies, Cabinet Members and Judges
Unit Of Measure: Percent of Women by Institution in positions of Power (Deputies, Senators, Cabinet, Judges)
Source Of Data: Parliament (Senate and Chamber of Deputies), Cabinet and Ministry of Justice
Indicator Description: This is one of the JGA indicators that will be tracked and reported on by the PSGG, both at the Programme level and by the Parliament. The Programme will only track Parliament’s compliance, as this is the only concerned institution being supported by PSGG. The JGA indicator reads: The Percent of Women in Positions of Power. The indicator assesses whether state institutions are moving towards compliance with the 30% constitutional requirement of having women in decision-making positions. The constitution mandates the Senate to oversee the implementation of the fundamental principles of the constitution, one of which is the 30% gender parity within institutions. The project will assess the implementation of 30% gender parity within Parliament.
Expected Performance Over MTSP Duration YEAR PLANNED ACTUAL
Methodology Comments: Data will be collected annually with 2008 as the
baseline. The Project Coordinator will collect the data; and / or JGA will
provide this information
2008 Baseline: 50 percent
2009 Target: 30 percent
2010 Target:
2011 Target:
2012 Target:
Supplementary Information / Indicator Description:
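The compliance check this indicator describes (share of women in each decision-making organ against the 30% constitutional threshold) can be sketched as below; the institution names and counts are hypothetical illustrations, not actual Rwandan figures.

```python
# Hedged sketch: checking the 30% constitutional requirement for women's
# representation. All counts below are hypothetical, not actual data.

THRESHOLD = 30.0  # constitutional minimum, in percent

def percent_women(women, total):
    """Share of positions held by women, as a percentage."""
    return 100.0 * women / total

# (women, total positions) per institution -- illustrative only.
institutions = {
    "Chamber of Deputies": (40, 80),
    "Senate": (9, 26),
    "Cabinet": (7, 25),
}

for name, (women, total) in institutions.items():
    share = percent_women(women, total)
    status = "meets" if share >= THRESHOLD else "below"
    print(f"{name}: {share:.1f}% ({status} 30% requirement)")
```

Reporting the share per institution, rather than one pooled figure, matches the indicator's requirement to disaggregate by Senate, Chamber of Deputies, Cabinet and judiciary.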
Rwanda Programme to Support Good Governance
Performance Management Data Sheet PSGG Purpose / Strategic Objective: Strengthened Constitutionally Mandated Institutions Increase State Accountability, Responsiveness
& Transparency in an Engendered Way
Result to be Measured: Purpose / Strategic Objective Level: Strengthened Constitutionally Mandated Institutions Increase State
Accountability, Responsiveness & Transparency in an Engendered Way
Indicator SO 6 (Formulation): Number of human rights cases reported to NHRC and the proportion of these that get resolved (disaggregated by length of resolution, types, age and sex).
Unit Of Measure: Number of files related to human rights violation which have been managed and cleared up by the Commission
Source Of Data: NHRC Annual reports
Indicator Description: There are actually two measures here: first, the number of human rights cases reported to NHRC; and, second, of the number submitted, the proportion that actually get resolved by the Commission. This is actually a complex indicator, as not all cases submitted to NHRC fall within its criteria, and in many instances the cases are forwarded on to the competent government organ.
Expected Performance Over MTSP Duration YEAR PLANNED ACTUAL
Methodology Comments:
- Individual case files are registered according to category of violation and date of receipt
- On a yearly basis, files are reviewed to determine whether they have been resolved or not
- Based on this information, the percentage or proportion of cases received that have been resolved is calculated
2008 Baseline:
2009 Target:
2010 Target:
2011 Target:
2012 Target:
Supplementary Information / Indicator Description:
Recording human rights violations by social and economic category allows analysts to research which categories of violation occur and whose rights have been violated.
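The yearly review calculation described in the methodology (register each case by category, mark it resolved or not, then compute the proportion resolved) can be sketched as follows; the case records and category names are hypothetical illustrations, not NHRC data.

```python
# Hedged sketch of the yearly case-review calculation.
# Case records and categories below are hypothetical, not NHRC data.

from collections import defaultdict

cases = [
    {"category": "detention", "resolved": True},
    {"category": "detention", "resolved": False},
    {"category": "property", "resolved": True},
    {"category": "property", "resolved": True},
    {"category": "labour", "resolved": False},
]

def resolution_rates(case_records):
    """Return (overall % resolved, per-category % resolved)."""
    totals, resolved = defaultdict(int), defaultdict(int)
    for case in case_records:
        totals[case["category"]] += 1
        resolved[case["category"]] += case["resolved"]  # True counts as 1
    per_category = {c: 100 * resolved[c] / totals[c] for c in totals}
    overall = 100 * sum(resolved.values()) / len(case_records)
    return overall, per_category

overall, by_category = resolution_rates(cases)
print(overall, by_category)
```

Extending each record with length of resolution, type, age and sex would support the fuller disaggregation the indicator asks for.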
Rwanda Programme to Support Good Governance
Performance Management Data Sheet PSGG Purpose / Strategic Objective: Strengthened Constitutionally Mandated Institutions Increase State Accountability, Responsiveness
& Transparency in an Engendered Way
Result to be Measured: Purpose / Strategic Objective Level: Strengthened Constitutionally Mandated Institutions Increase State
Accountability, Responsiveness & Transparency in an Engendered Way
Indicator SO 7 (Formulation): Number of reports required under UN Human Rights instruments to which Rwanda is a signatory that are compiled and reported to treaty reporting bodies in a timely manner
Unit Of Measure: Number of reports required under UN Human Rights instruments to which Rwanda is a signatory that are compiled and reported to treaty reporting bodies in a timely manner.
Source Of Data: Annual reports
Indicator Description: Reports written in accordance with legal obligations demonstrate that the authorities are concerned about the applicability of human rights as provided for in the existing legal framework. The more diligently Rwanda ratifies and domesticates new treaties and conventions related to human rights, the better the rule of law is assured.
Expected Performance Over MTSP Duration YEAR PLANNED ACTUAL
Methodology Comments:
- Make a list of the reports to be written according to legal obligations (internal and international)
- Indicate the reports actually written and disseminated
- Calculate the proportion of reports written and disseminated, compared with the reports required
2008 Baseline:
2009 Target: Number of issued treaties; number of ratified treaties
2010 Target: Number of issued treaties; number of ratified treaties
2011 Target: Number of issued treaties; number of ratified treaties
2012 Target: Number of issued treaties; number of ratified treaties
Supplementary Information / Indicator Description:
Rwanda Programme to Support Good Governance
Performance Management Data Sheet PSGG Purpose / Strategic Objective: Strengthened Constitutionally Mandated Institutions Increase State Accountability, Responsiveness
& Transparency in an Engendered Way
Result to be Measured: Purpose / Strategic Objective Level: Strengthened Constitutionally Mandated Institutions Increase State
Accountability, Responsiveness & Transparency in an Engendered Way
Indicator SO 8 (Formulation): Measures of trust and reconciliation increase among Rwandans, disaggregated by neighbours, community institutions and selected public bodies (measures of trust and reconciliation taken from the Joint Governance Assessment)
Unit Of Measure: Percentage of the population
Source Of Data: Opinion surveys / Undertake perceptions surveys of trust in neighbors, community institutions and selected public bodies
Indicator Description: This indicator assesses the level of trust among Rwandans and their trust towards their different institutions (community and Government institutions). From the level of trust identified for the year 2008, an increase of at least 5% in 2010, and of at least 10% in 2012, compared to the reference year (2008), is expected.
Expected Performance Over MTSP Duration YEAR PLANNED ACTUAL
Methodology Comments: There are indications/measures which show positive change in trust levels and reconciliation among Rwandans, but no data have yet been collected according to standard criteria. A study being done by the NURC, with support from the IRC, will show current data on Rwandans’ level of trust at different levels; these will provide the baseline data for this indicator. A similar data collection study will be done every two years so as to show the level of trust and reconciliation achieved by Rwandans. The task of collecting data related to this indicator will be done by the NURC research service.
2008 Baseline: NA
2009 Target:
2010 Target: +5%
2011 Target:
2012 Target: +10%
Supplementary Information / Indicator Description:
Rwanda Programme to Support Good Governance
Performance Management Data Sheet PSGG
Purpose / Strategic Objective: Strengthened Constitutionally Mandated Institutions Increase State Accountability, Responsiveness & Transparency in an Engendered Way
Result to be Measured (Purpose / Strategic Objective level): Strengthened Constitutionally Mandated Institutions Increase State Accountability, Responsiveness & Transparency in an Engendered Way
Indicator SO 9 (Formulation): The percentage of the population who perceive that targeted institutions have contributed to improved accountability, transparency and responsiveness, particularly by the executive, disaggregated by PSGG-supported constitutionally mandated institution
Unit Of Measure: Percentage of the population that believes constitutionally mandated institutions have contributed to good governance
Source Of Data: Opinion surveys / undertake perception surveys of trust in neighbours, community institutions and selected public bodies
Indicator Description: PSGG is providing significant funding over a strategic period to six major institutions whose mandates contribute to improved
accountability, transparency and responsiveness with a particular focus on the Executive branch of government. The best way to gauge the impact that
these institutions are having is through a perception survey of Rwandans as to their views on each of these six institutions and their effectiveness in
promoting good governance.
Methodology Comments: This indicator will be measured every two years after a baseline measurement in 2009. It is as close to an impact-level indicator as the PMEP will have among any of the six individual PMEPs for the institutions concerned. It will be undertaken along with other opinion surveys that have been developed, both in this overarching Purpose-level PMEP and in the individual IP PMEPs.

Expected Performance Over MTSP Duration (Year / Planned / Actual):
2008: -
2009: Baseline: NA
2010: Planned 50%
2011: -
2012: Planned 75%
Supplementary Information / Indicator Description: This single indicator, disaggregated by constitutionally mandated institution, will be able to provide a broad measure of the performance of the programme and of each of its individual IPs.
APPENDIX R: MONITORING, REVIEW AND EVALUATION (MRE) FRAMEWORK BASED UPON THE LOGFRAME’S INDICATORS
The framework takes the form of a blank table, one row per indicator, with the following column headings:

- Indicator (cut and pasted from the Logical Framework)
- Measures: how are measures defined and what information should be collected?
- For what purpose is the information to be collected?
- Who needs the information?
- What are the data collection methods?
- When and how often should collection occur?
- Who is responsible for (a) development of procedures and instruments, and (b) carrying out the collection?
- How much does it cost to collect the information?
- Who is responsible for reviewing and verifying the process?
- How will the information be reported?
- Comments

The numbered rows (1, 2, 3, 4 and onwards) are left blank for completion.
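Because the MRE framework is completed row by row, one indicator per row, it can also be held as a structured record. A minimal sketch in Python; the field names and example values are illustrative assumptions, not part of the handbook:

```python
# Hedged illustration only: one row of the Appendix R MRE framework held
# as a structured record, so a team can fill it in per logframe indicator.
from dataclasses import dataclass

@dataclass
class MREFrameworkRow:
    indicator: str                    # cut and pasted from the Logical Framework
    measures: str = ""                # how measures are defined; what to collect
    purpose: str = ""                 # why the information is collected
    who_needs_it: str = ""            # audience for the information
    collection_methods: str = ""      # data collection methods
    timing: str = ""                  # when and how often collection occurs
    responsible_design: str = ""      # (a) procedures and instruments
    responsible_collection: str = ""  # (b) carrying out the collection
    cost: str = ""                    # cost of collecting the information
    verifier: str = ""                # who reviews and verifies the process
    reporting: str = ""               # how the information will be reported
    comments: str = ""

# Hypothetical example row:
row = MREFrameworkRow(
    indicator="Level of trust among Rwandans (Indicator SO 8)",
    collection_methods="Biennial perception survey",
    timing="Every two years from the 2008 baseline",
)
print(row.indicator)
```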
APPENDIX S: NEW DFID LOGICAL FRAMEWORK - FOREST GOVERNANCE, MARKETS AND CLIMATE. PROJECT TITLE: Forest Governance, Markets and Climate (FGMC) (version 22 November 2010)
IMPACT (the global change we wish to contribute to): Improved management of forests for poverty reduction, biodiversity conservation, and climate protection.

Indicator G.1: Deforestation rate in developing countries
Baseline (2010): 13 million hectares a year (average 2000-2010)
Target (2020): 10 million hectares a year (average 2010-2020)
Assumptions: Reduced deforestation and reduced illegal logging contribute significantly to reduced carbon emissions and reduced biodiversity loss.
Source: FAO Global Forest Resources Assessment (FRA) 2010, 2015 and 2020; Millennium Development Goal (MDG) 7, Indicator 7.1: Proportion of land area covered by forest (UNEP/FAO)
Indicator G.2: Value from forests (jobs, timber, ecosystem services, including carbon) accruing to forest-dependent people in developing countries
Baseline: Baseline international data for tropical forest countries
Source: "The Economics of Ecosystems and Biodiversity" (TEEB), 2010; Rights and Resources Initiative (RRI); Centre for International Forestry Research (CIFOR); Ecosystem Marketplace website
OUTCOME (what will change and who will benefit): Governance and market reforms that reduce the illegal use of forest resources and benefit poor people

Indicator P.1 (outcome measure): Forest policy and governance performance
Baseline (2010): (a) 12% of 5 countries rated good; (b) baseline for further 5 countries
2015: (a) 50% of 5 countries rated good; (b) 12% of 5 countries rated good
Target (2020): (a) 75% of 5 countries rated good; (b) 50% of 5 countries rated good
Assumptions: Clearer, less contested tenure underpins the maintenance of public goods from forests.
Source: Index score against 12 policy response indicators, based on the index developed in the independent Chatham House assessment "Illegal logging and related trade: Indicators," July 2010, aligned with the Common Forest Governance Assessment Framework (World Bank/FAO); similar assessments commissioned 2012, 2015, 2017, 2019
Indicator P.2 (outcome measure): Quantity [and value] of illegally sourced timber and other commodities
Baseline (2010): (a) Timber - 100 million m3 (global), value ($X); (b) baseline data produced for other commodities (OCs)
2015: (a) Timber - further 80 million m3 (global); (b) OCs - further $X
Target (2020): (a) Timber - further 60 million m3 (global); (b) OCs - further $X
Source: Independent assessment by Chatham House "Illegal logging and related trade: Indicators," July 2010; similar assessments commissioned 2012, 2015, 2017, 2019; for other commodities (OCs), data commissioned from Chatham House or an alternative
Indicator P.3 (outcome measure): Area and value of forests under non-State rights, including local communities
Baseline (2008): 26%
2011: Update baseline
2015: 32%
Target (2020): 38%
Source: Baseline: "From Exclusion to Ownership?", p.7, 2008, Rights and Resources Initiative (RRI) (percentage of forest estate not administered by government, based on the 25 countries, out of the 30 most forested, with complete tenure data for 2002 and 2008); follow-up studies; qualitative assessments
INPUTS (£):
DFID: 250,000,000
defra: [X]
Other: Rest of EU 700,000,000; USA, China, Japan & others 2,850,000,000
Total: 3,800,000,000 (estimated)
DFID share: 7%
INPUTS (HR): DFID 1.5 Full Time Equivalents: 2 x 0.5 FTE Senior Forestry Advisers; 0.5 FTE Project Officer
OUTPUT 1 (specific direct deliverable of the programme): Engagement by multiple stakeholders increased and sustained in targeted producer and processing countries.

Indicator 1.1: FLEGT countries with operational Legality Assurance Systems
Baseline (2010): 0 countries
2011: 1 country
2015: 5 countries
Target (2020): 10 countries
Assumptions: Legality assurance provides sufficient guarantee to inspire consumer and client confidence.
Source: EU FLEGT Action Plan progress reports; independent FLEGT Voluntary Partnership Agreement (VPA) monitoring reports
Indicator 1.2: FLEGT countries with poverty and governance impact monitoring in place
Baseline (2010): 0 countries
2011: 1 country
Target (2015): 10 countries
Source: Civil society monitoring; independent VPA poverty impact monitoring reports (including gender impact); national M&E systems where available; FLEGT Joint Implementation Committee (JIC) reports
Impact weighting: 50%

Indicator 1.3: FLEGT countries with effective multi-stakeholder institutions
Baseline (2010): 4 countries
2015: 7 countries
Target (2020): 10 countries
Source: Civil society monitoring; independent VPA monitoring reports; commissioned reports (Asian Barometer; Latinobarometro; Afrobarometer). Includes forest oversight by institutions such as FLEGT joint implementation committees, parliamentary committees, academic bodies, local NGOs, private sector associations, public demonstrations and political party debates. Includes gender assessment of institutional effectiveness.
Risk rating: Medium (20%)

INPUTS (£), OUTPUT 1: 185,000,000
38. FLEGT = Forest Law Enforcement, Governance and Trade
OUTPUT 2: Public policies and private business standards that tackle trade in timber and other commodities from illegal forest practices

Indicator 2.1: EU and other consumer countries with effective public procurement and/or import due diligence systems [in place for timber and other commodities]
Baseline (2010): (a) Timber - 8 countries; (b) OCs - X countries
2015: (a) Total 20 countries; (b) Total 3 countries
Target (2020): (a) Total 33 countries (footnote 39); (b) Total 5 countries
Assumptions: Public and voluntary standards are specified by enforceable indicators; verification or audit of standards and of compliance is sufficient and meets essential requirements.
Source: Periodic progress reports by the European Commission; Chatham House (?)
Indicator 2.2: % value of tropical timber and other commodity markets compliant with voluntary legal and/or sustainability standards
Baseline (2010): (a) Timber - 1%; (b) OCs - oil palm 5%
2015: (a) Timber - 5%; (b) OCs - 10%
Target (2020): (a) Timber - 15%; (b) OCs - 15%
Source: UN Food and Agriculture Organisation (FAO); World Wide Fund for Nature (WWF); Forest Footprint Disclosure (FDD) reports; certifying body reports and websites, e.g. Forest Stewardship Council (FSC) http://www.fsc.org, Programme for the Endorsement of Forest Certification (PEFC) http://www.pefc.org/, Malaysian Timber Certification Council (MTCC) http://www.mtcc.com.my/, Roundtable on Sustainable Palm Oil (RSPO) http://www.rspo.org/, National Federations of Oil Palm http://www.fedepalma.org/statistics, Round Table on Responsible Soy (RTRS) http://www.responsiblesoy.org/; FAO FRA http://www.fao.org/forestry/fra/en/
Impact weighting: 20%

Indicator 2.3: Number of influential financial investors that specify and report on compliance with standards
Baseline (2010): 8 financial investors
2015: 11 financial investors
Target (2020): 15 financial investors
Source: Annual reports of financial investors, sovereign funds and banks (e.g. WB, IFC, IADB, AfDB, HSBC, CalPERS, ABN Amro, Norway Sovereign Fund); Forest Footprint Disclosure (FDD) reports
Risk rating: Medium (20%)

DFID INPUTS (£), OUTPUT 2: 5,000,000

39. EU27, Norway, Switzerland, Japan, Australia, New Zealand, USA
OUTPUT 3: Increased knowledge and momentum for change

Indicator 3.1: Public awareness of illegal logging and other commodities that drive deforestation
Baseline (2010): 1,500 media coverage items
2015: Further 5,000 media coverage items
Target (2020): Further 7,000 media coverage items
Source: Chatham House "Illegal logging and related trade: Indicators," July 2010 (includes a public awareness survey); similar assessments commissioned 2012, 2015, 2017, 2019

Impact weighting: 20%

Indicator 3.2: Visits to the [Chatham House] illegal forest commodity information website
Baseline (2009/10): 141,583 visits; average 388 visits per day; 106,279 absolute unique visitors; each visitor viewed an average of 2 pages
2018/19: 160,000 visits; 500 visits per day; each visitor viewing 2 pages
Target (2020): At least 5 additional franchised sites
Source: Web administrator, http://www.illegal-logging.info/
Risk rating: Low (5%)

INPUTS (£), OUTPUT 3: 50,000,000
OUTPUT 4: Coherence between programmes on forests and deforestation at national and international levels

Indicator 4.1: Number of international and bilateral initiatives on Reducing Emissions from Deforestation and forest Degradation (REDD+) that take account of evidence [lessons] from FLEGT
Baseline (2010): 2 REDD initiatives
2015: 11 REDD initiatives
Target (2020): 11 REDD initiatives
Source: REDD project reports (e.g. Forest Investment Programme (FIP), Forest Carbon Partnership Facility (FCPF), UN-REDD; bilateral REDD by USA, UK, Australia, Germany, Japan, Norway, The Netherlands); CIFOR global REDD monitoring; Rights and Resources Initiative (RRI) reports; World Resources Institute (WRI) reports

Impact weighting: 10%

Indicator 4.2: Number of producer countries coordinating consistent FLEGT, REDD+ and similar initiatives
Baseline (2010): 0 countries
2015: 5 countries
Target (2020): 10 countries
Source:
Risk rating: Medium/High (40%)

INPUTS (£), OUTPUT 4: 5,000,000
DFID INPUTS (£), FACILITATION, MONITORING AND EVALUATION (M&E): 5,000,000
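The impact weightings in this logframe (50%, 20%, 20% and 10% across Outputs 1-4) sum to 100%, so they can roll per-output assessments up into a single programme-level figure. A sketch under that assumption; the 1-5 scoring scale and the example scores are hypothetical, not from the logframe:

```python
# Hedged illustration only: weighted roll-up of per-output scores using
# the FGMC logframe's impact weightings for Outputs 1-4.
IMPACT_WEIGHTINGS = {"Output 1": 50, "Output 2": 20, "Output 3": 20, "Output 4": 10}

def overall_output_score(scores: dict) -> float:
    """Weighted average of per-output scores; weightings must sum to 100."""
    assert sum(IMPACT_WEIGHTINGS.values()) == 100
    return sum(scores[output] * weight for output, weight in IMPACT_WEIGHTINGS.items()) / 100

# Hypothetical review scores on an assumed 1-5 scale:
example = {"Output 1": 4, "Output 2": 3, "Output 3": 3, "Output 4": 2}
print(overall_output_score(example))  # 3.4
```

Because Output 1 carries half the weight, its score dominates the overall figure, which is exactly the signal the weighting column is meant to send.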
APPENDIX T: COMPARISONS BETWEEN TERMINOLOGIES OF DIFFERENT DONOR AGENCIES FOR RESULTS / LOGICAL FRAMEWORKS
Column headings (generic terms): Ultimate Impact / End Outcomes / Intermediate Outcomes / Outputs / Interventions
Needs-based terms: Higher Consequence / Specific Problem / Cause / Solution / Process / Inputs

Each agency's terminology is listed from highest to lowest level:

DFID [40]: Impact / Outcome / Outputs / Activities
FAO: Impact / Outcome / Outputs / Activities
UNDP [41]: Impact / Outcome / Outputs / Activities
ADB [42]: Impact / Outcome / Outputs / Activities
CARE: Impact / Effects / Outputs / Activities / Inputs
PC/LogFrame [43]: Goal / Purpose / Outputs / Activities
USAID Results Framework [44]: Goal Statement / Development Objectives / Project Purpose / Project Outputs
USAID Logframe [45]: Final Goal / Strategic Goal/Objective / Intermediate Results / Activities
DANIDA [46]: Impact / Outcome / Outputs / Activities / Inputs
CIDA [47]: Ultimate Outcome / Outcomes / Outputs / Activities / Inputs
GTZ [48]: Indirect Results / Direct Results (Outcomes) / Use of Outputs / Outputs / Activities / Inputs
40. DFID/UKAid: Guidelines on using the revised logical framework (2011)
41. UNDP: Handbook on Planning, Monitoring and Evaluating for Development Results (2009)
42. ADB: Design and Monitoring Framework Guidelines (2007) http://www.adb.org/documents/guidelines-preparing-design-and-monitoring-framework
43. PC/LogFrame (tm): 1988-1992 TEAM technologies, Inc.
44. USAID: ADS Chapter 203 (3.2) Assessing and Learning (2012) (there is some inconsistency of terminology within this document)
45. USAID: The Logical Framework Approach to Portfolio Design, Review and Evaluation in A.I.D.: Genesis, Impact, Problems and Opportunities. CDIE, 1987
46. DANIDA: Danish Development Cooperation in a Results Perspective (2011)
47. CIDA: Presentation of the amended key results-based management terms and definitions (2008), Performance Management Division
48. GTZ: Results Based Monitoring (2008)
European Union [49]: Overall Objective(s) / Project Purpose / Results / Activities
NORAD [50]: Impact / Outcomes / Outputs / Activities / Inputs
UNHCR [51]: Goal(s) / Objectives / Project Objective / Outputs / Activities / Input/Resources
World Bank [52]: CAS Goal / Program Purpose / Project Development Objective / Outputs / Component Activities / Inputs
AusAID [53]: Goal/Impact / Purpose/Outcome / Component Objectives (Intermediate Results) / Outputs / Work Programme (optional)
JICA [54]: Overall Goal / Project Purpose / Outputs / Activities / Inputs
Tearfund [55]: Goal / Purpose / Outputs / Activities / Inputs
This table has been referred to as "The Rosetta Stone of Logical Frameworks". It was originally compiled by Jim Rugh for CARE International and InterAction's Evaluation Interest Group.
Links to many of these documents and others can be found on this Symbaloo page, developed by Patt Flett and Philip Dearden in December 2012:
http://www.symbaloo.com/mix/guidelines
49. EU: Aid Delivery Methods: Project Cycle Management Guidelines (2004)
50. NORAD: Results Management in Norwegian Development Cooperation (2008)
51. UNHCR: Project Planning in UNHCR: A Practical Guide on the Use of Objectives, Outputs and Indicators for UNHCR Staff and Implementing Partners. Ver 2, March 2002
52. World Bank: Country Assistance Strategy (CAS) in The LogFrame Handbook (2000)
53. AusAID: 3.3 Logical Framework Approach (2005)
54. JICA: Evaluation Methods (2004)
55. Tearfund: Roots 5 Project Cycle Management (2009)
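Because the table maps many vocabularies onto one results chain, it lends itself to a simple lookup. A minimal sketch covering a few of the agencies above; the generic level names used as keys, and the `translate` helper, are illustrative assumptions:

```python
# Hedged illustration only: a small lookup based on Appendix T's "Rosetta
# Stone", translating generic results-chain levels into agency terms.
# Entries follow the table, from ultimate impact down to interventions.
TERMINOLOGY = {
    "DFID": {"impact": "Impact", "outcome": "Outcome", "output": "Outputs", "intervention": "Activities"},
    "UNDP": {"impact": "Impact", "outcome": "Outcome", "output": "Outputs", "intervention": "Activities"},
    "CARE": {"impact": "Impact", "outcome": "Effects", "output": "Outputs", "intervention": "Activities"},
    "EU":   {"impact": "Overall Objective(s)", "outcome": "Project Purpose", "output": "Results", "intervention": "Activities"},
    "JICA": {"impact": "Overall Goal", "outcome": "Project Purpose", "output": "Outputs", "intervention": "Activities"},
}

def translate(level: str, from_agency: str, to_agency: str) -> str:
    """Return another agency's term for the same results-chain level."""
    assert level in TERMINOLOGY[from_agency]  # level must exist for both agencies
    return TERMINOLOGY[to_agency][level]

print(translate("outcome", "DFID", "EU"))  # Project Purpose
```

A DFID "Outcome" and an EU "Project Purpose" occupy the same slot in the chain, which is the whole point Rugh's table makes.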