1
Government Wide Monitoring and Evaluation System
Ronette Engela, The Presidency
Conrad Barberton, National Treasury
Sieraag de Klerk, Stats SA
Zandile Nkonyana, DPLG
Henk Serfontein, DPSA
2
Content of presentation
• Overview of the GWM&E framework
• Programme Performance Information
• Evaluations
• Census and Surveys: the National Statistical System
• Derived and Transversal Information Systems
• DPLG Local Government System
• DPSA Systems
3
Mandate: Presidency
In 2005 Cabinet approved an implementation plan to develop a monitoring and evaluation system for use across government.
To encompass:
• Monitoring: implementation, effectiveness, validation
• Evaluation: impact and process evaluation
• Early warning: proactively identify blockages
• Verification: validates the integrity of data
• Data collection: using existing capacities
• Analysis: research-driven assessments
• Reporting: appropriate and customised to target groups
The centre of government needs to provide clear policies and frameworks.
4
• A composite system that draws its data from contributory systems
• An emerging system, built up over time with consistent and sustained participation by all stakeholders
Phased implementation through three work streams:
• Policy and Standards (lead agency: Presidency)
• Reporting and Databases (lead agency: DPSA)
• Capacity Building (lead agency: SAMDI)
Partner agencies across the work streams include OPSC, Stats SA, National Treasury, Presidency, DPLG and DPSA.
5
Are we still on track?
• Missed a number of deadlines
• Information needed to inform the next phase of development not in place
• Audit of reporting requirements and M&E systems in government late
• P&P shifted to Policy and Standards
• Independent development of systems: nearly all provinces
• Lack of own capacity: many consultants in sectors
• Misconception: 'System' (IT) vs 'Framework'
Reporting labyrinth: a plethora of data. For example, a provincial department reports to:
• Own department and executive authority
• National concurrent department
• Premier's Office
• National Treasury
• Presidency
6
Entry points
• Executive reporting
• Evidence-based decision making for: resource allocation; policy refinement
• Extensive executive interest
• Support government implementation focus
7
GWM&E System: data terrains
• Census and Survey Information
• Registers and Admin data
• Evaluations
• Programme Performance Information
8
GWM&E System: policy platform
The Policy Framework spans all four data terrains:
• Census and Survey Information
• Registers and Admin data
• Evaluations
• Programme Performance Information
9
GWM&E Framework: Programme Performance Information
[Diagram: departments' Programme Performance Information (PPI), Evaluations (E) and Census & Survey (C&S) data feeding derived information systems (e.g. DPLG, DWAF) and transversal systems (PERSAL, BAS)]
10
GWM&E System: Executive reports
[Diagram: the data terrains (Census and Survey Information; Registers & Admin data; Evaluations; PPI) and departmental, derived and transversal systems feeding the Programme of Action (PoA) and National Indicators]
11
12
[Diagram: GWM&E data terrains and systems, with the relevant area of responsibility highlighted]
13
Evaluation
Need for detailed policy evaluations at a number of different levels:
• Departmental-led reviews and evaluations of policies and programmes
• Sectoral reviews
• Broad, cross-cutting reviews led by the centre of government
Monitoring and evaluation have different purposes.
14
Programme Performance Information
Conrad Barberton, National Treasury
15
Mandate: Treasury
Constitution: S.215 and S.216 (budget and expenditure management)
PFMA & MFMA
Ensure information on inputs, outputs and outcomes underpins planning, budgeting, implementation management and accountability reporting, to promote transparency and expenditure control.
16
[Diagram: Treasury's area of responsibility: Programme Performance Information, flowing from programme managers and line managers to the departmental executive and the executive authority, alongside evaluations, census & survey data, derived information systems and transversal systems]
17
Framework for Managing Performance Information: key performance concepts
• INPUTS: the resources that contribute to the production and delivery of outputs ("What do we use to do the work?")
• ACTIVITIES: the processes or actions that use a range of inputs to produce the desired outputs and ultimately outcomes ("What do we do?")
• OUTPUTS: the final products, or goods and services produced for delivery ("What do we produce or deliver?")
• OUTCOMES: the medium-term results for specific beneficiaries that are the consequence of achieving specific outputs ("What do we wish to achieve?")
• IMPACTS: the developmental results of achieving specific outcomes ("What do we aim to change?")
Plan, budget, implement and monitor at the input, activity and output levels; manage towards achieving outcomes and impacts.
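To make the key performance concepts concrete, here is a minimal sketch in Python (purely illustrative: the class names, fields and the housing figures are our own invention, not part of the framework) of how a programme's indicators might be organised by results-chain level:

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import List, Optional

class Level(Enum):
    """The five results-chain levels, with their guiding questions."""
    INPUTS = "What do we use to do the work?"
    ACTIVITIES = "What do we do?"
    OUTPUTS = "What do we produce or deliver?"
    OUTCOMES = "What do we wish to achieve?"
    IMPACTS = "What do we aim to change?"

@dataclass
class Indicator:
    name: str
    level: Level
    target: float
    actual: Optional[float] = None  # filled in as monitoring data arrives

@dataclass
class Programme:
    name: str
    indicators: List[Indicator] = field(default_factory=list)

    def at_level(self, level: Level) -> List[Indicator]:
        """Select indicators at one results-chain level for reporting."""
        return [i for i in self.indicators if i.level == level]

# Hypothetical example: a housing programme
prog = Programme("Housing delivery")
prog.indicators += [
    Indicator("Budget spent (R million)", Level.INPUTS, 500.0),
    Indicator("Houses completed", Level.OUTPUTS, 10_000),
    Indicator("Households adequately housed", Level.OUTCOMES, 9_000),
]
print([i.name for i in prog.at_level(Level.OUTPUTS)])
```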
18
Framework for Managing Performance Information: performance indicators
Indicators attach to every level of the results chain (inputs, activities, outputs, outcomes, impacts). Three types:
• Direct indicators: data gathered mainly by management information systems. Types of direct indicators: quantity, quality, cost/price, timeliness, start and end times, distribution, adequacy, accessibility.
• Opinion-based indicators: data gathered through surveys.
• Relationship indicators (economy, efficiency, effectiveness, equity): calculated using a combination of direct indicators and other data.
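As an illustration of how relationship indicators are calculated from direct indicators, here is a small sketch; the formulas below are common working definitions shown for illustration only, and the figures are hypothetical:

```python
# Illustrative working definitions only; not formulas prescribed by the
# GWM&E framework. Each relationship indicator combines direct indicators.

def economy(actual_input_cost: float, planned_input_cost: float) -> float:
    """Economy: actual input cost relative to plan (< 1 means under budget)."""
    return actual_input_cost / planned_input_cost

def efficiency(outputs: float, input_cost: float) -> float:
    """Efficiency: outputs produced per unit of input cost."""
    return outputs / input_cost

def effectiveness(actual_outcome: float, target_outcome: float) -> float:
    """Effectiveness: share of the targeted outcome actually achieved."""
    return actual_outcome / target_outcome

# Hypothetical figures for one programme:
print(economy(480e6, 500e6))         # 0.96: 96% of the planned budget spent
print(efficiency(9_500, 480e6))      # houses completed per rand spent
print(effectiveness(8_800, 9_000))   # ~0.98 of the outcome target achieved
```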
19
Next steps: Treasury (2)
• Link to individual performance agreements of line managers and the HOD
• Ensure Programme Performance Information is appropriately used for planning, budgeting and management purposes:
  - set performance standards and targets prior to the start of each year
  - review performance and take management action
  - evaluate performance at the end of a service delivery period
• PPI manual per sector: extensive consultation to determine the needs of different users
20
National Statistical System
Akiiki Kahimbaara, Stats SA
21
[Diagram: GWM&E data terrains with Stats SA's area of responsibility highlighted: Census and Survey Information, and Registers and Admin data]
22
Mandate: Stats SA
The Statistics Act (No. 6 of 1999); January 2002 Cabinet Lekgotla; State of the Nation Addresses 2004 and 2005
• Section 14(6)(a)-(c): the Statistician-General may advise an organ of state on the application of quality criteria and standards
• Section 14(7)(a)-(b): grants the Statistician-General the power to designate statistics produced by other organs of state as official statistics
• Section 14(8)(a)-(b): authorises the Statistician-General to comment on the quality of national statistics produced by another organ of state, and to publish such other department's statistics
23
Problem
[Diagram: demand for information from user groups and indicator categories, supplied by Stats SA (official statistics) and other producers such as departments and CSOs (statistics of unknown quality)]
• Insufficient supply of quality information
• Uncertain quality: poor comparability, isolated producers, no shared standards
• Insufficient statistical skills
24
Proposed solution
[Diagram: Stats SA coordinates other producers (departments, CSOs, etc.) within a regulatory environment, with feedback loops to the demand side (user groups and indicator categories)]
Regulatory environment:
• Statistics law
• Governance structure
• Quality standards
• Advocacy programme
• Code of conduct
Outcome: adequate information; reliable information (quality, sustainable); sufficient skills
25
Proposed new approach
Use official statistics to ensure quality:
• SASQAF (quality framework): draft available
• Framework of international best practice
• UN Principles of Official Statistics
Maintain the decentralised system of statistical production.
Transform all national statistics into official statistics.
26
Programme: NSS
Transform existing departmental data (registers & surveys) into sustainable sources of official statistics:
• More use of administrative data than of surveys (sustainability and cost)
• Agreements and collaboration between Stats SA and individual departments
• Joint working party between Stats SA and each of the departments to improve quality
27
Next steps: Stats SA
Working with individual departments: access registers or datasets and map them against indicators in the compendium (a minimal mapping sketch follows below).
• Identify gaps: which of the compendium's indicators should be coming from the department
• Assess the quality of registers or datasets for usability
• Suggest improvements
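The sketch below shows the mapping step, assuming a hypothetical compendium and register; all indicator and field names are invented for illustration:

```python
# Hypothetical compendium: indicator name -> the data terrain expected
# to supply it. All names are invented for illustration.
compendium = {
    "households_with_water": "Registers and Admin data",
    "immunisation_coverage": "Registers and Admin data",
    "unemployment_rate": "Census and Survey Information",
}

# Fields actually available in one department's register:
register_fields = {"households_with_water", "immunisation_coverage"}

covered = {name for name in compendium if name in register_fields}
gaps = set(compendium) - covered  # indicators the department cannot yet supply

print("Covered:", sorted(covered))
print("Gaps:", sorted(gaps))  # here: ['unemployment_rate']
```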
28
Next steps: Stats SA (2)
Build statistical capacity in departments:
• Encourage departments to establish statistical capacity, as part of existing M&E units or as components in their own right
• Allocate sufficient resources: part of MTEF budgeting, for sustainability
• Implement statistical training programmes with support from Stats SA
Audit departments for statistical capacity.
29
Concerns
Inadequate appreciation of the practice of managing for results:
• Measurement of performance and "objective" information for planning and decision-making not a priority
• Current practice of existing M&E units lacks measurement
Circumlocutory behaviour of public officers (multiple formalities):
• Waiting for approval (all the way to the minister!): inadequate delegation of authority
• Lack of institutional memory in government departments (change the head: start afresh)
• Operational silos: protecting one's turf
• Cover-ups: fear of exposure
30
Government Wide Monitoring and Evaluation System: Derived and Transversal Information Systems
Zandile Nkonyane, DPLG
31
Content of presentation
• Mandate
• Responsibility
• Problem statement
• Proposed strategy
• Achievements to date
• Next steps
32
GWM&E Framework: Programme Performance Information
[Diagram: departments' Programme Performance Information, Evaluations and Census & Survey data feeding derived information systems and transversal systems]
33
Mandate: DPLG
Constitution Chapters 3 & 7; MSA, MFA
Functions of DPLG:
• Develop national policies and legislation with regard to provinces and local government, and monitor their implementation
• Support provinces and local government in fulfilling their constitutional and legal mandates
34
Responsibility: DPLG
DPLG Monitoring, Reporting & Evaluation
• Developing and implementing an integrated monitoring, reporting and evaluation system (dplg)
• Providing leadership and support to local government for the successful implementation of the GWM&E framework
• Developing M&E capacity in provincial and local government for enhanced reporting on the implementation of the 5-Year Local Government Strategic Agenda (2006–2011)
35
Responsibility: DPLG (2)
Five KPAs of the 5-Year Local Government Strategic Agenda:
• Municipal Transformation and Organisational Development
• Basic Service Delivery
• Local Economic Development
• Municipal Financial Viability and Management
• Good Governance and Public Participation
DPLG will develop and implement a system for assessing local government service delivery.
36
Problem
Varying degrees of understanding, and of M&E capacity and readiness:
• Lack of an integrated approach to local government M&E across the three spheres
• Lack of appropriate M&E reporting structures to monitor local government service delivery
DPLG did a readiness assessment of provincial departments of local government to do MRE at local government level.
37
Proposed strategy
Standardise the approach to monitoring, reporting and evaluating:
• Institutionalise M&E systems within appropriate structures in the IGR framework
• Develop a capacity building strategy
DPLG did an assessment of the business processes that inform the M&E system.
38
Types of local government indicators
[Chart: percentage of indicators per category (input, activities, output, outcome, impact), comparing the 500 indicators on local government initially collated with the 150 retained now]
39
Integrated Local Government M&E Management
A shared understanding of indicator development across all three spheres of government.
[Diagram: indicators cascading through the GWM&E System across the national (GPOA), provincial (PGDS), district and local (IDP) levels]
40
Achievements to date
• Readiness assessment of provincial departments of local government
• Draft Local Government M&E framework, aligned to GWM&E
• National Local Government M&E forum established
• Collaboration with SAMDI and NT TAU for capacity building
41
Next steps
• Finalise the Local Government M&E framework
• Design and implement a local government monitoring system that extracts information from the multiple sector departments that deal with local government, based on an impact model
• Capacity building for provinces and municipalities
42
Derived system: DPSA
Henk Serfontein, DPSA
43
Mandate: DPSA
Public Service Act:
• Responsible for public service transformation
• Custodian of public management frameworks
• Performance and knowledge management
• Service delivery improvement
Co-chair of the Governance and Administration Cluster.
Co-chair of the GWM&E Task Team.
Aim: increase public service effectiveness and improve governance.
44
GWM&E System: Executive reports
[Diagram: GWM&E data terrains and systems feeding the PoA and National Indicators, with DPSA's area of responsibility highlighted: the PM Watch and the Cabinet Lekgotla Report]
45
Responsibility: DPSA
Lead agency on the Databases and Reporting work stream.
Two major initiatives currently in development (see the sketch below):
• Public Management Watch: extracts HR and budget expenditure data; identifies vulnerable departments
• HR Utilisation Report: derived from departmental annual reports; focuses on how well HR is used
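A purely illustrative sketch of the Public Management Watch idea (the department names, thresholds and figures are hypothetical, not the actual PM Watch rules): combine an HR feed and an expenditure feed, and flag departments that breach simple thresholds.

```python
# Hypothetical data and thresholds, for illustration only. In practice the
# HR feed would come from a system such as PERSAL and the expenditure feed
# from a system such as BAS.
hr_feed = {"Health": 0.28, "Education": 0.12}     # vacancy rate per department
spend_feed = {"Health": 0.31, "Education": 0.48}  # share of budget spent at mid-year

VACANCY_LIMIT = 0.20   # flag if more than 20% of posts are vacant
MIDYEAR_FLOOR = 0.35   # flag if under 35% of the budget is spent by mid-year

for dept in hr_feed:
    flags = []
    if hr_feed[dept] > VACANCY_LIMIT:
        flags.append("high vacancy rate")
    if spend_feed[dept] < MIDYEAR_FLOOR:
        flags.append("slow spending")
    if flags:
        print(f"{dept}: vulnerable ({', '.join(flags)})")
```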
46
New development
Creation of a Year-end Report for the bi-annual Cabinet makgotla:
• Will draw from various GWM&E sources
• To provide useful, practical updates
• Executive-level information
• High strategic overview
47
Challenges: DPSA
Task: improving public management outside financial and expenditure issues.
Linking the performance management and measurement systems needs dedicated attention:
• DPSA M&E capacity overstretched (APRM etc.)
• Building public participation systems a key issue
• Must enhance GWM&E stakeholder relations
• Overcoming formalistic compliance by sectors and provinces
48
Next steps: DPSA
Priority: Work stream 2 objectives (databases and reporting)
• Streamline the reporting burden
• Compile a Master list of reporting requirements
• Undertake consultations with systems users
• Focus on improving understanding of their needs
• Develop the database architecture
49
Conclusion
Concerns:
• Need to incorporate public opinions / imbizo processes
• Public entities and constitutional bodies
• Conceptual clarity: IGR
• Key role players involved
• Convergence around indicators
Data quality will improve with public attention and utilisation.
Steady does it.
Myth: "at the press of a button".
Systems can alert us to a problem; political will is needed to deal with it.
50
Ke ya leboga. Ke a leboha. Ke a leboga. Ngiyabonga. Ndiyabulela. Ngiyathokoza. Ngiyabonga. Inkomu. Ndi khou livhuha. Dankie. Thank you.
("Thank you" in South Africa's eleven official languages)