TRANSCRIPT
Results-Based Budgeting:
The Chilean Experience
Paula Darville
Head of Management Control Division Budget Office, Ministry of Finance, Chile
November 2012
PERFORMANCE BUDGETING
• Different performance budgeting models exist.
• In Chile:
• Performance information is used in the decision-making process to improve the efficiency of public expenditure.
• There is no direct (mechanical) link between performance and funding.
• It is used together with other categories of information:
• Political priorities
• Financial restrictions
• Presented to Congress along with the Budget Law.
EVALUATION AND MANAGEMENT CONTROL SYSTEM
• Chile has implemented PB through a Management Control System.
• The system's objective is to provide performance information and to introduce practices that improve the quality of public expenditure. In particular:
• Improving resource allocation
• Improving the use of resources
• Improving transparency
Monitoring and Follow up
• Performance Indicators (1993)
• Strategic Definitions (2001)
Evaluation
• Ex ante (2000)
• Ex post: – Evaluation of Public Programs (1997)
– Impact Evaluation (2001)
– Agency Evaluation (2002)
– Evaluation of New Programs (2009)
Institutional Wage Incentives Mechanisms
• Management Improvement Program (1998)
• Incentive to Physicians (2003)
• Institutional Efficiency Goals (2007)
PERFORMANCE INDICATORS
• Started 1993 as a pilot initiative.
• Since 2001 they have been part of budget preparation (public agencies present their indicators and goals during budget preparation).
• Aimed at showing how a government organization is performing over time.
• They measure performance along different:
– Dimensions (effectiveness, efficiency, economy, service quality)
– Delivery levels (process, output, outcome)
• Supported by strategic definitions and the management improvement program (PMG).
• Disclosure policy: Congress and general public.
PERFORMANCE INDICATORS

| | 2001 | 2002 | 2003 | 2004 | 2005 | 2006 | 2007 | 2008 | 2009 | 2010 | 2011 | 2012 | 2013 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Number of agencies | 72 | 109 | 111 | 132 | 133 | 136 | 139 | 142 | 150 | 150 | 153 | 154 | 154 |
| Number of indicators | 275 | 537 | 1,039 | 1,684 | 1,588 | 1,552 | 1,445 | 1,443 | 1,504 | 1,274 | 1,197 | 1,221 | 1,035 |
| Average indicators per agency | 3.8 | 4.9 | 9.4 | 12.8 | 11.9 | 11.4 | 10.4 | 10.2 | 10 | 8.6 | 7.8 | 7.9 | 6.7 |
| % of output/outcome indicators | - | - | - | 70% | 75% | 81% | 89% | 89% | 84% | 90% | 90% | 91% | 93% |
| % of indicators evaluated | 59% | 73% | 92% | 94% | 98% | 100% | 100% | 100% | 100% | 100% | 100% | - | - |
| Average % of achievement | 80% | 69% | 76% | 86% | 88% | 88% | 93% | 90% | 93% | 95% | 93% | - | - |
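As a sanity check (not part of the slides), the table's "average indicators per agency" row is simply the indicator count divided by the agency count. A minimal sketch using the 2001 and 2013 columns:

```python
# Derive "average indicators per agency" from the two count rows of the table.
# Values are taken directly from the 2001 and 2013 columns above.
agencies = {2001: 72, 2013: 154}
indicators = {2001: 275, 2013: 1035}

for year in sorted(agencies):
    avg = indicators[year] / agencies[year]
    print(year, round(avg, 1))  # 2001 -> 3.8, 2013 -> 6.7, matching the table
```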
PROGRAM EVALUATION
• Assesses ongoing programs against their stated aims and expected results.
• Requirements: relevance, independence, timeliness, transparency
• Started 1997.
• Budget-related.
• Programs selected with Congress.
• Performed by independent evaluators selected by public tendering (panels of experts or universities and consulting firms).
• Evaluators with authority to request information, commission studies.
• Counterparts in the ministries/agencies are the units in charge of the programs.
• Programs are classified, according to results of evaluation, in performance categories.
• Reports to Budget, full disclosure to Congress and the public.
• Commitments to incorporate recommendations from the evaluation.
• Followed by formal agreements between Budget and Executing Unit.
TYPES OF EVALUATION
• Evaluation of Public Programs
– Review consistency in design, execution and reporting.
– Based on logical framework methodology.
– Performed by panels of 3 independent experts, selected by public tendering.
– Final reports in 6 months.
– Directly connected to the budget cycle.
• Impact evaluations
– Assess program effectiveness on basis of impact measures.
– Methodology includes extensive data collection and more sophisticated evaluation techniques (cost-benefit analysis, cost-effectiveness analysis), with control groups.
– Performed by consulting firms, universities, selected by public tendering.
– Final reports in 1 to 1.5 years.
TYPES OF EVALUATION
• Agency Evaluation
– Assess the consistency of a ministry's/agency's portfolio of programs in relation to the agency's mission and objectives.
– Search for duplications, inconsistencies, opportunities to generate synergies and savings.
• Evaluation of New Programs
- Design of the evaluation at the beginning of each new program.
- Establishing control groups for the evaluation, based on randomized trials whenever possible.
- Establishing an international advisory committee to periodically review and assess the process of evaluation.
PROGRAM EVALUATIONS: 1997-2012
( ) = in progress

| Evaluation lines | 1997-1999 | 2000-2005 | 2006 | 2007 | 2008 | 2009 | 2010 | 2011 | 2012 | Total |
|---|---|---|---|---|---|---|---|---|---|---|
| Evaluation of Public Programs (EPG) | 80 | 94 | 13 | 14 | 16 | 20 | 18 | 10 | 19 | 284 |
| Agency Evaluations (ECG) | - | 19 | 2 | 4 | 7 | 5 | 2 (3) | (6) | 0 | 48 |
| Impact Evaluations (EI) | - | 30 | 7 | 14 | 12 | 8 | 12 (2) | (9) | (11) | 105 |
| Evaluation of New Programs (EPN) | - | - | - | - | - | 4 (1) | 4 | (5) | 0 | 14 |
| Total programs/agencies | 80 | 143 | 22 | 32 | 35 | 38 | 41 | 30 | 30 | 451 |
• During 2006-2012, evaluations covered 56% of the total budget.
PROGRAM EVALUATION
Effects of the 2012 program evaluations (EPG - Impact)

| Performance category | Budget 2012 (US$) | Budget Proposal 2013 (US$) | Variation (US$) | Variation (%) |
|---|---|---|---|---|
| Good performance | 101,445,829 | 113,100,083 | 11,654,254 | 11% |
| Adequate performance | 628,482,808 | 794,036,917 | 165,554,109 | 26% |
| Underperformance (*) | 122,825,167 | 111,952,317 | -10,872,851 | -9% |
| No demonstrated results | 31,633,530 | 32,356,526 | 722,996 | 2% |
| Total | 884,387,335 | 1,051,445,843 | 167,058,508 | 18% |

(*) Excluding the scholarship program.
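The variation columns are derived from the two budget columns. A minimal sketch reproducing them from the table's figures (category names as on the slide; small rounding differences against the slide are possible, since the source data presumably carries more precision):

```python
# Reproduce the "Variation" columns of the program-evaluation budget table.
# All figures in US$, taken from the table above.
budget_2012 = {
    "Good performance": 101_445_829,
    "Adequate performance": 628_482_808,
    "Underperformance": 122_825_167,
    "No demonstrated results": 31_633_530,
}
proposal_2013 = {
    "Good performance": 113_100_083,
    "Adequate performance": 794_036_917,
    "Underperformance": 111_952_317,
    "No demonstrated results": 32_356_526,
}

for category, b2012 in budget_2012.items():
    variation = proposal_2013[category] - b2012       # absolute change in US$
    pct = round(100 * variation / b2012)              # change as % of 2012 budget
    print(f"{category}: {variation:+,} ({pct:+d}%)")
```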
Evaluated programs and institutions
Compliance with corporate agreements: 1999-2012

| Ministry | Achieved | Partially achieved | Not achieved | Transferred | Total commitments valid by June 2012 |
|---|---|---|---|---|---|
| Ministry of Agriculture | 98% | 2% | 0% | - | 645 |
| Ministry of National Defense | 98% | 1% | 1% | - | 154 |
| Ministry of Economy | 97% | 2% | 1% | - | 336 |
| Ministry of Education | 96% | 3% | 1% | - | 933 |
| Ministry of Foreign Affairs | 99% | 1% | 0% | - | 168 |
| Ministry of Finance | 100% | 0% | 0% | - | 107 |
| Ministry of Internal Affairs | 90% | 5% | 4% | 1% | 333 |
| Ministry of Justice | 97% | 2% | 1% | - | 270 |
| Ministry of Planning | 95% | 4% | 1% | 1% | 619 |
| Ministry of Mining | 100% | 0% | 0% | - | 49 |
| Ministry of Public Works | 90% | 5% | 5% | - | 151 |
| Ministry of Health | 92% | 7% | 1% | - | 151 |
| Government General Secretariat Ministry | 92% | 0% | 0% | 8% | 273 |
| Presidential General Secretariat Ministry | 100% | 0% | 0% | - | 31 |
| Ministry of Labor and Social Security | 95% | 1% | 0% | 4% | 277 |
| Ministry of Transport and Telecommunications | 99% | 1% | 0% | - | 92 |
| Ministry of Housing | 89% | 8% | 3% | - | 244 |
| Ministry of National Assets | 73% | 20% | 7% | - | 41 |
| Ministry of Energy | 100% | 0% | 0% | - | 33 |
| Ministry of Natural Resources | 91% | 6% | 3% | - | 89 |
| TOTAL | 95% | 3% | 1% | 1% | 4,996 |
Ex Ante EVALUATION
• Started in 2000 with the preparation of the 2001 Budget.
• Standard format to submit public programs for funding (2005-2009).
• Gathers information relevant to budget preparation (diagnosis, potential and target population, targeting and selection criteria, aims, expected results, and costs, among others), based on the logical framework.
• Two-step process:
• Ex ante evaluation: new and reformulated programs must submit their proposals for evaluation through a standard form. The Ministry of Social Development assesses the "social programs" and the Budget Office the rest. Every program receives a positive or negative recommendation on its design.
• "E" form: budget requests for new and existing programs must be submitted on a standard form addressing the pertinence of financing such proposals.
EVALUATION 2011-2014
• Gradually moving towards getting information that supports the budget preparation.
• Focusing primarily on performance in the dimensions of effectiveness and efficiency.
• Corporate agreements focused on performance indicators, with follow-up once a year.
• Strengthening ex ante evaluation.
MANAGEMENT IMPROVEMENT PROGRAM (PMG)
• The achievement of management objectives is tied to a monetary incentive for all employees of the public institution (Law 19,553, 1998).
• The incentive percentages for institutional performance, 2007-2010, are:

| % of achievement | 2007* | 2008 | 2009 | 2010 |
|---|---|---|---|---|
| 90%-100% | 5.70% | 6.30% | 7.00% | 7.60% |
| 75%-89% | 2.85% | 3.15% | 3.50% | 3.80% |
| < 75% | 0% | 0% | 0% | 0% |

* Before 2007 the incentive percentages were 5% and 2.5%.
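The incentive rule above has a simple structure: the top achievement band pays the year's full rate, the middle band pays exactly half of it, and below 75% nothing is paid. A minimal sketch of that rule (function name is illustrative, not from the source):

```python
# Sketch of the PMG institutional-performance bonus rule (Law 19,553),
# using the 2007-2010 rates from the table above.
RATES = {2007: 5.70, 2008: 6.30, 2009: 7.00, 2010: 7.60}

def incentive_pct(achievement_pct: float, year: int) -> float:
    """Return the institutional-performance bonus as a % of wages."""
    if achievement_pct >= 90:
        return RATES[year]       # top band: 90%-100% achievement
    if achievement_pct >= 75:
        return RATES[year] / 2   # middle band (75%-89%) pays half the top rate
    return 0.0                   # below 75%: no bonus

print(incentive_pct(92, 2010))  # 7.6
```

Note that the middle-band figures in the table (2.85%, 3.15%, 3.50%, 3.80%) are exactly half the top-band figures, which is what the sketch encodes.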
PMG in 2010
• Six strategic areas: (i) Human Resources; (ii) Customer Service; (iii) Management, Planning and Control; (iv) Financial Management; (v) Gender Focus; and (vi) Quality Management System; and 13 systems (each with objectives and stages of development).
• The average weight of Management, Planning and Control was 10%.

| Area | Systems |
|---|---|
| Human Resources | 1. Training; 2. Hygiene, Security and Improved Work Environment; 3. Performance Evaluation |
| Customer Service | 4. Integral Customer Service; 5. e-Government; 6. Information Security |
| Management, Planning and Control | 7. Planning / Management Control; 8. Internal Audits; 9. Territorial Management |
| Financial Management | 10. Government Procurement; 11. Financial Accounting |
| Gender Focus | 12. Gender Focus |
| Quality Management System | 13. Quality Management System (ISO) |
Compliance Statistics: 1998-2011
[Chart: % of agencies receiving 100% of the bonus, by year] 1998: 83%; 1999: 94%; 2000: 92%; 2001: 66%; 2002: 79%; 2003: 75%; 2004: 75%; 2005: 85%; 2006: 78%; 2007: 81%; 2008: 87%; 2009: 92%; 2010: 93%; 2011: 95%.
PMG: 2011-2013
• Gradually moving towards an economic incentive focused primarily (ultimately exclusively) on compliance with institutional performance targets.
• Focusing primarily on outcomes, outputs, and service quality (core business).
• Weight of the monitoring system: 50% (2011); 60% (2012); 80% (2013).
• Voluntary incorporation of a quality management system.
• External evaluation.
• Changes in evaluation mechanisms.
Thank you