TRANSCRIPT
Bell, Browne, Molnar & Delicate Consulting 1
Business Planning and Evaluation
Putting the pieces together!
Who We Are… BBMD Consulting
• Focus on Performance Measurement and Program Evaluation
• Extensive Performance Measurement and Program Evaluation experience with federal, provincial and municipal governments (currently supporting the Performance Measurement efforts of Canadian and Ontario Government Departments)
Please visit www.bbmd.ca for details
Evaluation – Definition and Purpose
3.1 In the Government of Canada, evaluation is the systematic collection and analysis of evidence on the outcomes of programs to make judgments about their relevance, performance and alternative ways to deliver them or to achieve the same results.
3.2 Evaluation provides Canadians, Parliamentarians, Ministers, central agencies and deputy heads an evidence‐based, neutral assessment of the value for money, i.e. relevance and performance, of federal government programs. Evaluation:
– supports accountability to Parliament and Canadians by helping the government to credibly report on the results achieved with resources invested in programs;
– informs government decisions on resource allocation and reallocation by:
• supporting strategic reviews of existing program spending, to help Ministers understand the ongoing relevance and performance of existing programs;
• providing objective information to help Ministers understand how new spending proposals fit with existing programs, identify synergies and avoid wasteful duplication;
– supports deputy heads in managing for results by informing them about whether their programs are producing the outcomes that they were designed to produce, at an affordable cost; and,
– supports policy and program improvements by helping to identify lessons learned and best practices.
Reference: Policy on Evaluation, April 1, 2009
Some Drivers of Measurement
The Treasury Board of Canada's 1997 report, "Getting Government Right: Governing for Canadians," began the fundamental shift to a results-based management agenda.
"Results for Canadians – A Management Framework for the Government of Canada" (March 30, 2000) intensified the push towards a results-based management culture in the federal public sector.
A new Treasury Board of Canada Evaluation Policy was implemented in March 2009.
1. Brief History of PM
American Government example
www.whitehouse.gov/omb/expectmore/
Strategic Roadmap

Where are we now?
• Mission and Values: broad statement of the organization's mission; core values and actions to achieve the mission; staff and clients/stakeholders
• SWOT Analysis: internal and external assessment; client/stakeholder analysis; quality assessment; strategic issues

Where do we want to be?
• Vision: image of your organization's desired future
• Outcomes and Outputs: outcomes (direct, shared and ultimate); specific, measurable outputs to achieve outcomes

How do we get there?
• Strategies and Action Plans: gap analysis; detailed action/work plans; resource allocation

How do we measure our progress?
• Performance Measures: performance indicators; methods to measure program performance; accountability and responsibility

Where do we go from here?
• Monitor, Evaluate & Report on Performance Data: systems to monitor progress; compile performance information; provide information for resource allocation
• Continuous Improvement: products and services better, faster, cheaper
• Innovation: develop and implement new products and services to meet new and emerging demands
Why Outcome-Based Performance Measurement?
3. PM and Planning/Reporting Responsibilities
Public Sector Performance Measurement Framework

The framework links three program perspectives:

• Strategic Direction (what we should be doing): mission, change objectives and initiatives, priority setting and strategies, and resource allocation and investment decisions, guided by citizen focus, managing for results, public service values and responsible spending, and balancing sustaining the business with changing the business.

• Program Delivery (what we do): enablers and inputs, comprising financial, human and structural capital (effective resource management, a sustainable workforce, leadership and values, an enabling work environment, employee satisfaction, productivity/retention), feed PROCESSES/ACTIVITIES (the HOW), which produce OUTPUTS (the WHAT) for CLIENTS/STAKEHOLDERS (the WHO) to achieve PROGRAM OUTCOMES (the WHY).

• Performance Measurement (how well we are doing): efficiency (cost, time, quantity, quality); effectiveness (benefits/consequences/impacts, behavioural change); and client satisfaction and reach.
Federal Government Example

Implemented the MRRS (Management, Resources and Results Structure), a reporting hierarchy of:
• Strategic Outcome
• Program Activities
• Sub-Activity Level
• Sub-Sub-Activity Level
Whole of Government Framework
Performance Measures (HRM Scorecard, July 2005, Managing for Results)
Strategic Outcomes and their Supporting Outcomes:

Safe Communities
• Timely and appropriate emergency response
• Citizens feel safe
• Buildings, properties and infrastructure in HRM are safe, healthy and well maintained
• Reasonable amount spent to maintain buildings, properties and infrastructure

Healthy, Sustainable, Vibrant Communities
• HRM is a desirable and attractive place to work, learn, play and live
• People and goods can move easily throughout the municipality
• Preservation of the environment
• Attraction of new businesses; retention and growth of existing businesses
• Development is appropriately planned and fiscally sustainable
• Tax burden is balanced and competitive

Excellence in Governance
• Citizens feel they are making a difference/participating in the future direction of HRM
• Citizens are satisfied that the HRM vision and priorities have been implemented
• Citizens believe HRM works with other levels of government to improve service delivery
• Citizens are confident in the governance and management of HRM
• HRM has sound financial management practices

Excellence in Service Delivery
• Customers are satisfied with the level of services received from HRM
• Customers are satisfied with the quality of services received from Council and staff
• Citizens feel the municipal services they receive are worth the taxes they pay
Performance Measures (Department of Management and Budget, Fairfax County, Virginia, 2005)

Input: value of resources used to produce an output. Examples: dollars budgeted/spent; staff hours used.

Output: quantity or number of units produced. Outputs are activity oriented, measurable, and usually under managerial control. Examples: eligibility interviews conducted; library books checked out; children immunized; prisoners boarded; purchase orders issued.

Efficiency: inputs used per unit of output. Examples: cost per appraisal; plans reviewed per reviewer.

Service Quality: degree to which customers are satisfied with a program, or how accurately or timely a service is provided. Examples: percent of respondents satisfied with service; error rate per data-entry operator; frequency of repeat repairs; average days to address a facility work order.

Outcome: qualitative consequences associated with a program/service, i.e., the ultimate benefit to the customer. External forces can sometimes limit managerial control. Outcomes focus on the ultimate "why" of providing a service. Examples: reduction in fire deaths/injuries; percent of job trainees who hold a job for more than six months; percent of juveniles not reconvicted within 12 months; adoption/redemption rate of impounded animals.
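The efficiency and service-quality measures above are simple ratios, which can be sketched in a few lines of code. This is an illustrative sketch only; the figures are invented, not drawn from the Fairfax County examples.

```python
# Hypothetical appraisal program; all figures are invented for illustration.

def efficiency(input_cost: float, outputs: int) -> float:
    """Inputs used per unit of output, e.g. cost per appraisal."""
    return input_cost / outputs

def service_quality(satisfied: int, respondents: int) -> float:
    """Percent of survey respondents satisfied with the service."""
    return 100.0 * satisfied / respondents

dollars_spent = 250_000.0   # Input: value of resources used
appraisals_done = 1_000     # Output: units produced

print(f"Efficiency: ${efficiency(dollars_spent, appraisals_done):.2f} per appraisal")
print(f"Service quality: {service_quality(430, 500):.1f}% satisfied")
```

Outcome measures, by contrast, usually cannot be computed from operational data alone, since they depend on effects outside managerial control.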
Core Issues to be Addressed in Evaluation
• To address value for money, evaluations will be required to assess all core issues identified below (as appropriate, departments may choose to address additional issues in their evaluations).
Relevance
Issue #1: Continued Need for program
Assessment of the extent to which the program continues to address a demonstrable need and is responsive to the needs of Canadians
Issue #2: Alignment with Government Priorities
Assessment of the linkages between program objectives and (i) federal government priorities and (ii) departmental strategic outcomes
Issue #3: Alignment with Federal Roles and Responsibilities
Assessment of the role and responsibilities for the federal government in delivering the program
Performance (effectiveness, efficiency and economy)
Issue #4: Achievement of Expected Outcomes
Assessment of progress toward expected outcomes (incl. immediate, intermediate and ultimate outcomes) with reference to performance targets and program reach, and program design, including the linkage and contribution of outputs to outcomes
Issue #5: Demonstration of Efficiency and Economy
Assessment of resource utilization in relation to the production of outputs and progress toward expected outcomes
Reference: Directive on the Evaluation Function, April 1, 2009
Evaluation and Performance Measurement
• Formal evaluations generally consist of two types:
• Summative
• Formative
• They are generally undertaken on a cyclical basis (midway through and at the end of a program's funding agreement, usually at 2 and 5 years)
• Performance measurement is an ongoing, operational collection of performance information used to modify business operations
• Performance measurement is an integral part of the evaluation lifecycle and feeds the cyclical evaluations
The Performance Continuum: Stages to Development & Implementation

Source: Exhibit R, MAF Guide, Treasury Board Secretariat, August 2001

• Performance Measurement Understanding
• Program Profile – Logic Model – Outcomes – Key Evaluation Issues
• Develop Performance Measures & Indicators
• Establish Appropriate Data Collection Strategy
• Develop/Revise Information Systems & Data Processing
• Measure & Report against Performance Information
• Formative Evaluation – Management Issues
• Summative Evaluation – Fundamental Program Issues
• Review/Assess/Modify Performance Measurement System
Planning‐Evaluation Cycle
Reference: Research Methods Knowledge Base, http://www.socialresearchmethods.net/kb/index.php
The Basis of Measurement… the Logic Model
The causal or logical relationship between activities and outputs and the outcomes of a given program, policy or initiative that they are intended to deliver. Also known as a results chain.
4. PM Concepts
Inputs → Activities → Outputs → Direct Outcomes → Shared Outcomes → Ultimate Outcomes → Strategic Outcomes

The organization's control is highest over inputs, activities and outputs and diminishes to progressively lower influence over the successive outcome levels; everything from outputs onward constitutes RESULTS.
The Framework for Evaluation

As a framework for evaluation, the Logic Model prompts organizations to examine all four components:
1. The clarity and thoroughness of the plans and priorities and how closely these align with the organization’s mission and vision
2. The effectiveness of the processes, including the adequacy of resources, the efficiency with which they are leveraged, and the extent to which the optimal stakeholders are engaged
3. The extent to which specific and measurable outputs and outcomes are generated in the near and middle term, according to defined timelines
4. The extent to which specific and measurable impacts are generated in the longer term and the extent to which these truly realize the vision and mission of the organization
Definitions
• The organization needs a common vocabulary for measurement terms and concepts.
• This means providing specific definitions of:
– Activities
– Outputs
– 3 levels of outcomes
• The objective is to have all participants "singing from the same song sheet": a common understanding and interpretation of terms.
Definition: Activities
• Activities = what we do
• Describes a collection of functions (actions, jobs, tasks)
• Activities require resources ($ and people)
• For example: policy development, partnership management, research, manufacturing, advocacy.
Definition: Outputs
• Outputs = what we produce
• Describes the direct products and services generated through activities
• Outputs are usually tangible and concrete
• For example: the policy itself, partnership agreements, research reports, manufactured product, changes to legislation resulting from advocacy.
Definition: Outcomes
• Outcomes = why we do it
• Describes the effects, benefits or consequences achieved through the delivery of various activities and their outputs
• Outcomes are differentiated by the degree of control the organization has over the achievement of the outcome and the degree of influence the organization has with other intermediaries in achieving the outcomes.
Types of Outcomes
Categorized according to the degree of influence, as follows:

Direct Outcomes: first-level effects of, or immediate response to, the outputs, e.g., changes in degree of customer satisfaction.

Shared Outcomes: second order of outcomes; benefits and changes in behaviour, decisions, policies and social action attributable to outputs, demonstrating that program objectives are being met, e.g., increased employability as a result of a training program.
Ultimate Outcomes: the ultimate or long-term consequences for human, economic, civic or environmental benefit, to which the organization and/or government contributes.

All of which feed the…

Strategic Outcomes: the long-term and enduring organizational and/or government-wide benefits or goals to which the Project, Program or Organization is contributing.
Principles of Good Outcome Statements
1. Outcomes are noun‐based desired end states
2. Directional outcome statements do not belong in logic models or outcome statements. Direction is achieved through targets not outcome statements
3. Avoid self-serving statements, since these do not describe the end state that is actually desired
4. Clear cause and effect linkage to the next level of outcomes
5. No wiggle words that are subject to interpretation (e.g. effective, efficient).
6. SIMPLE: outcomes described in a way that helps stakeholders, the public and others relate to the desired state being described
What changes in the world result from…

Inputs → Activities → Outputs → Direct Outcomes → Shared Outcomes → Ultimate Outcomes → Strategic Outcomes

• Direct Outcomes: knowledge and attitudes (awareness, understanding, skills; perceptions, acceptance)
• Shared Outcomes: behaviours (involvement, compliance, action)
• Ultimate Outcomes: early/mid-term effects (societal change, socio-economic benefits)
• Strategic Outcomes: long-term effects (long-term and enduring organizational and/or government-wide benefits or goals)

Social marketing plays a role across the knowledge, attitude and behaviour stages.
Indicators, Targets & Standards
• Indicators measure progress, providing information on how well we are doing. For example:
– Average response time for calls of greatest urgency
– % of emergency calls to Fire responded to within 5 minutes
• Targets are specific performance goals tied to indicators, against which actual performance will be compared. For example:
– 10-minute response time for calls of greatest urgency
– 90% of emergency calls to Fire responded to within 5 minutes
• Standards are similar to targets, and refer to a service level to which an organization is prepared to commit. For example:
– 30 minutes or it's free
– 10% refund if a similar product is advertised at a lower price
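The indicator-versus-target comparison described above can be sketched in code. All indicator names, targets and actual values here are hypothetical, and treating some indicators as "lower is better" is one possible design choice.

```python
# Hypothetical fire-service indicators and targets (invented figures).
targets = {
    "avg_response_minutes": 10.0,     # target: 10-minute response time
    "pct_calls_within_5_min": 90.0,   # target: 90% within 5 minutes
}
actuals = {
    "avg_response_minutes": 11.5,
    "pct_calls_within_5_min": 92.3,
}

# For time-based indicators lower is better; for percentages higher is better.
lower_is_better = {"avg_response_minutes"}

def met(name: str) -> bool:
    """True if actual performance meets or beats the target for this indicator."""
    if name in lower_is_better:
        return actuals[name] <= targets[name]
    return actuals[name] >= targets[name]

for name in targets:
    status = "met" if met(name) else "missed"
    print(f"{name}: actual {actuals[name]} vs target {targets[name]} -> {status}")
```

A standard could be handled the same way, with the added step of triggering the committed remedy (refund, free service) when it is missed.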
How to Determine “the Critical Few” Characteristics of Evaluation
• Credible – provide a credible, independent view
• Specific ‐ eliminate ambiguity, show relevance to the expected outcome
• Linked ‐ clear (cause and effect) linkages to other indicators
• Reliable – scientifically and statistically sound, provide an appropriate degree of accuracy. Measure the same thing across time (allowing comparisons) and for different groups/regions
• Available – data are easily accessible or there is a low level of effort to collect and analyze
• Cost‐effective – the costs for collection are aligned with the overall utility of the indicators
• Understandable – data can be easily grasped by various audiences
Presentation Formats
There are no hard and fast rules concerning the format for reports – the presentation will be dictated by the indicators and the management information needs.
• It should seldom be the case that one looks at an individual element of an evaluation in isolation. A balanced approach is more useful to get the full picture of the performance of an organization
• The balanced approach is also likened to an airplane cockpit. Rather than giving pilots just one "trouble light" (the plane is either okay or not okay), a bank of gauges is supplied so that the pilot may see trends and where trouble is developing
Citizen Perspective

• Citizen as Client (public services): timeliness, accessibility, reliability, responsiveness, fairness
• Citizen as Taxpayer (public resources): cost, efficiency, effectiveness
• Citizen as Beneficiary (public benefits): health & safety, economic well-being, fairness & equity, public security
For More Information or Services, Contact Us…
(613) 562‐3468
www.bbmd.ca
Annex – Supporting Information
• International approaches to Performance Measurement
• Evaluation Standards
• Performance Information Assessment Grid
• Data Analysis Techniques
• Performance Frameworks ‐ tying it all together
Approaches to Measurement

• Operating and Financial Review (www.opsi.gov.uk/): United Kingdom
• Nouvelles Régulations Économiques (www.legifrance.gouv.fr): France
• Australian legislation for public sector organisations (www.apsc.gov.au/publications02/performancemanagement.htm): Australia
• ASX Principles of Good Corporate Governance and Best Practice Recommendations (www.asx.com.au/supervision/pdf/asxcgc_marked_amended_principles_021106.pdf): Australia
• MERITUM Guidelines – Measuring Intangibles to Understand and Improve Innovation Management (www.nfrcsr.org/international/practical_tools‐toolkits/): European Commission (6 countries)
• Keizai Doyukai (www.doyukai.or.jp/en/): Japan
• The Global Reporting Initiative (www.globalreporting.org/Home): United Nations
• Danish guidelines on intellectual capital reporting (www.pnbukh.dk/site/files/pdf_filer/Intellectual_Capital_Statements_‐_The_New_Guideline.pdf): Denmark
Performance Information Assessment Grid

Priority: degree of importance/relevance in terms of measuring efficiency ("doing things well") and/or effectiveness ("doing the right things/having an impact")
• High (3): essential in order to measure efficiency and/or effectiveness
• Medium (2): useful in measuring efficiency and/or effectiveness
• Low (1): of little use in measuring efficiency and/or effectiveness

Data Accessibility: the degree to which data are readily available, either manually and/or through an automated system
• High (3): indicator data are immediately available
• Medium (2): the appropriate data are not immediately available through an automated or manual system but could be gathered relatively easily
• Low (1): the data to support this indicator are not currently available or accessible and will be difficult to access

Level of Effort: an estimate of the level of effort required to report reliably against the indicator on a regular basis
• Low (3): minimal effort to gather, analyze, interpret and report reliably on the indicator
• Medium (2): a moderate degree of effort is required to report reliably on the indicator
• High (1): very resource intensive to report reliably on the indicator
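One way to apply the grid is to score each candidate indicator on the three dimensions (3 is best on each, per the scales above) and rank by total, keeping "the critical few" with the highest scores. A minimal sketch, with invented indicators and scores:

```python
# Hypothetical candidate indicators scored as
# (priority, data_accessibility, level_of_effort); all values invented.
candidates = {
    "% emergency calls answered within 5 min": (3, 2, 2),
    "citizen satisfaction with snow clearing":  (2, 1, 1),
    "cost per building inspection":             (3, 2, 3),
}

# Rank by total score, highest first.
ranked = sorted(candidates.items(), key=lambda kv: sum(kv[1]), reverse=True)
for name, scores in ranked:
    print(f"{sum(scores)}  {name}  {scores}")
```

In practice the totals would feed a discussion, not a mechanical cut-off; an indicator with a low accessibility score, for instance, may still be worth a data-collection investment if its priority is high.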
Data Analysis Techniques
Qualitative assessment
• Narrative review
• Content analysis, quantifying qualitative data
• Identifying and verifying emergent themes
• Grounded theory
• Flow diagrams
Data Analysis Techniques
Quantitative assessment
• Descriptive statistics (frequencies, means, etc.)
• Multiple regression and analysis of variance
• Meta-analysis
• Trend analysis
• Structural equation modeling
• Cost-effectiveness analysis, case costing, financial analyses, etc.
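As a minimal illustration of the first and fourth techniques listed (descriptive statistics and trend analysis), the following computes a mean, a standard deviation, and a least-squares trend slope over invented quarterly performance data, using only the Python standard library:

```python
import statistics

# % of service requests closed on time, by quarter (invented figures).
on_time = [78.0, 81.5, 80.0, 84.5, 86.0]

print("mean:", statistics.mean(on_time))              # central tendency
print("stdev:", round(statistics.stdev(on_time), 2))  # spread

# Least-squares slope over the quarter index: a positive slope
# suggests an improving trend.
x = list(range(len(on_time)))
x_bar, y_bar = statistics.mean(x), statistics.mean(on_time)
num = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, on_time))
den = sum((xi - x_bar) ** 2 for xi in x)
slope = num / den
print("trend (percentage points per quarter):", round(slope, 2))
```

The heavier techniques on the list (multiple regression, structural equation modeling, meta-analysis) would typically use a statistical package rather than hand-rolled formulas.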