
1

Linking Performance Measures to Benchmarks in the Budget Process

March-April 2002

Department of Administrative Services

Oregon Progress Board

www.econ.state.or.us/opb

2

Overview

• Why measure performance?

• Why Oregon Benchmarks?

• What makes a good performance measure?

• What is required in the budget process?

• Getting started

3

Handouts

• Logic models
– Logic model worksheets (yellow)
– Logic model examples (ochre)

• Submission forms
– Links to Oregon Benchmarks (blue)
– Performance Measure Data Summary (green)

• Evaluation forms
– PM criteria worksheet (off-white)
– Today’s training evaluations (purple)

4

Why measure performance?

It’s at the core of results-based management:

• Provides greater accountability: Is the ship on course?

• Fosters internal learning and improvement: Is the ship running well?

AND…it has been required since 1993.

See Appendix B.

5

Why link to Oregon Benchmarks?

They articulate Oregon’s hopes and expectations.

• “High-level outcomes” or measures of societal well-being.

• Beacons for the “ship” and the “fleet”.

• For budget, link only to those that relate to your core mission and goals (“primary linkages”).

6

Oregon has ninety benchmarks in seven broad categories:

• Economy
• Education
• Civic Engagement
• Social Support
• Public Safety
• Community Development
• Environment

7

What happens if your agency does not link to an Oregon Benchmark?

• That’s OK. You have two options:

– You may submit other high-level outcomes to gauge how Oregon is doing relative to your mission.

– Small agencies: if this is not feasible, you can “look up” to your mission and/or mandate.

• All high-level outcomes should pass the “so what” test. Do Oregonians care?

So what??

8

Logic models define the links.

Diagram: a goal (generally unmeasurable) sits at the top; below it, measurable performance measures form a chain. Agency inputs and activities produce output measures, which lead to intermediate outcome measures, which lead to high-level outcome (impact) measures.

A logic model embeds a continuum of measures in a “so that” chain:

(Increase) % of offenders with intake assessments (Output)
…so that…
% of offenders engaged in work, training, education and/or treatment is increased (Intermediate Outcome)
…so that…
% of offenders showing a measurable improvement in behavior and/or skill level is increased (Intermediate Outcome)
…so that…
% of paroled offenders convicted of a new felony within three years is decreased (High-Level Outcome, Benchmark #64)
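For readers who think in code, the chain above can also be written down as an ordered list of measures, from agency output up to the high-level outcome. The sketch below is purely illustrative and is not part of the original presentation; only the measure names from the example above are reused.

```python
# Illustrative sketch only: the "so that" chain as an ordered list of
# (measure level, measure, desired direction) entries, output first.
so_that_chain = [
    ("Output", "% of offenders with intake assessments", "increase"),
    ("Intermediate Outcome",
     "% of offenders engaged in work, training, education and/or treatment",
     "increase"),
    ("Intermediate Outcome",
     "% of offenders showing a measurable improvement in behavior and/or skill level",
     "increase"),
    ("High-Level Outcome (Benchmark #64)",
     "% of paroled offenders convicted of a new felony within three years",
     "decrease"),
]

# Printing the list top to bottom reproduces the "so that" logic.
for i, (level, measure, direction) in enumerate(so_that_chain):
    tail = "  ...so that..." if i < len(so_that_chain) - 1 else ""
    print(f"[{level}] {direction}: {measure}{tail}")
```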

10

What makes a good performance measure?

BASIC criteria required for 2003-05. Performance measures should:

1. Use GASB* terms and definitions
2. Gauge progress towards agency goals and benchmarks or other high-level outcomes
3. Focus on a few key indicators
4. Have targets
5. Be based on accurate and reliable data

*Governmental Accounting Standards Board

11

Basic criteria #1. Use GASB definitions

• OUTCOME = Result (the best kind of measure)
– High-level (societal) = OBM#11, Per capita income
– Intermediate = Average wage of agency job placements

• OUTPUT = Product or service (“widget”)
– # of job placements per quarter

• INPUT = Time, money, material or demand
– FTEs in the “Job Placement Unit”
– Dollars allocated to the “Job Placement Unit”
– Case load or number of complaints
– INPUTS ARE NOT STAND-ALONE PERFORMANCE MEASURES

• EFFICIENCY = Input per output
– # of days required to process a job application
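To make the efficiency arithmetic concrete, here is a small hypothetical sketch (not from the slides) using the “Job Placement Unit” example: an efficiency measure is simply an input divided by an output.

```python
# Hypothetical "Job Placement Unit" figures, for illustration only.
staff_days_spent = 1200         # INPUT: time
placements_per_quarter = 400    # OUTPUT: product/service ("widget") count
avg_wage_of_placements = 14.75  # INTERMEDIATE OUTCOME: result for those served

# EFFICIENCY = input per output, e.g. staff days per job placement.
efficiency = staff_days_spent / placements_per_quarter
print(f"Staff days per placement: {efficiency:.1f}")  # 3.0
```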

12

Two kinds of intermediate outcomes: chunks and stones

EXAMPLE: Benchmark #18, Ready to Learn

1. A “chunk” of the population is measured for the high-level outcome (HLO):
• % of children of served families who are ready to learn (versus % of all children in the county who are ready to learn)

2. A “stepping stone” toward the HLO is measured:
• % of trained parents who read regularly to their children (reading to kids is a stepping stone to being ready to learn)

Basic criteria #2. Measure progress towards agency goals and benchmarks

The goal to “reduce repeat offenders” is UNMEASURABLE; MEASURES gauge progress toward it:

# of intake assessments completed (Output)
% of offenders engaged in work, training, education and/or treatment (Intermediate Outcome)
% of offenders showing a measurable improvement in behavior and/or skill level (Intermediate Outcome)
% of paroled offenders convicted of a new felony within three years (High-Level Outcome, Benchmark #64)

14

Basic criteria #3. Focus on a few key measures.

• Represent the scope of agency responsibility

• No more than 30 measures (except for mega-agencies)

• Include the best measures for:
– “Is the ship on course?”
– “Is the ship running well?”

• Additional measures internal to your agency can provide more detailed management information.

Agencies should decide how “high up” to go for their key measures.

Consider the level of agency INFLUENCE: the agency has more influence over measures toward the output end of the chain, while measures toward the high-level outcome end carry more policy intent.

# of intake assessments completed (Output)
% of offenders engaged in work, training, education and/or treatment (Intermediate Outcome)
% of offenders showing a measurable improvement in behavior and/or skill level (Intermediate Outcome)
% of paroled offenders convicted of a new felony within three years (High-Level Outcome, Benchmark #64)

16

Basic criteria #4. Performance measures should have targets.

• TARGET = Desired level at any given point in time

• Should be ambitious but realistic

• Target setting is an art and a science, based on:
– trend data
– comparisons
– expert opinion

• Targets not required until Jan. 2003

Chart: recidivism now versus the recidivism TARGET.
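As a hypothetical illustration of the “art and science” of target setting (not part of the original slides), the sketch below fits a simple linear trend to made-up recidivism data and extrapolates it to 2005; comparisons and expert opinion would then adjust the resulting target up or down.

```python
# Hypothetical trend data, for illustration only (values are made up).
years = [1997, 1998, 1999, 2000, 2001]
recidivism_pct = [33.0, 32.1, 31.5, 30.8, 30.2]

# Ordinary least-squares line through the five data points.
n = len(years)
mean_x = sum(years) / n
mean_y = sum(recidivism_pct) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(years, recidivism_pct))
         / sum((x - mean_x) ** 2 for x in years))
intercept = mean_y - slope * mean_x

# Extrapolate the trend to the target year.
trend_2005 = slope * 2005 + intercept
print(f"Trend projection for 2005: {trend_2005:.1f}%")  # about 27.4%

# An "ambitious but realistic" target might be set a little below the trend,
# then checked against comparisons and expert opinion.
```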

17

Basic criteria #5. Accurate and reliable data.

• Without trustworthy data, the system is meaningless.

• Example: verifiable employment records are better than estimated job creation

• Each measure should have at least one data point, preferably several.

• Data should describe what is being measured.

18

Performance measure criteria: ADVANCED = required for the 2005-07 biennium

Performance measures should:

6. Link to an organizational unit

7. Cover organizational outcomes like efficiency and customer satisfaction

8. Allow comparisons

More training on Advanced Criteria later.

Budget Timeline for Performance Measures

• Budget instructions; TA & training on performance measures
• Submit Links to Oregon Benchmarks (March - August 2002)
• Criteria-based review (April - August 2002)
• Adjustments, optional (April - August 2002)
• Comments & measures accompany Governor’s Recommended Budget (November 2002)
• Performance Measure Data Summary to Ways & Means (January - June 2003)
• Agencies adjust measures and targets per legislature (June 2003)
• Annual Performance Reports submitted to DAS/LFO (annually in September)

See Guidelines pp. 10 & 11.

20

Hypothetical example #1

GOAL: Reduce juvenile crime.

AGENCY INPUT/ACTIVITY: Award grants to local contractors to conduct “best practice” juvenile crime prevention (JCP) programs.

Agency performance measures:
– OUTPUTS: # of grants awarded by county; # of days of TA delivered by county.
– INTERMEDIATE OUTCOMES: % of juveniles in JCP programs with significantly mitigated risk factors.

HLO (impact): Juvenile arrests (OBM#61).

21

Hypothetical example #2

GOAL: Healthy, thriving children.

AGENCY INPUT/ACTIVITY: Award grants to local contractors to design/deliver “best practice” parent education classes.

Agency performance measures:
– OUTPUTS: # of grants awarded by county; “best practice” guidelines done by …
– INTERMEDIATE OUTCOMES: % of children from participating (trained) families entering school ready to learn.

HLO (impact): % of kindergarteners ready to learn (OBM#18).

22

Hypothetical example #3

GOAL: Citizen involvement (C.I.) in land use planning.

AGENCY INPUT/ACTIVITY: Jointly sponsor, with cities, regional educational events for private citizens every quarter.

Agency performance measures:
– OUTPUTS: # of citizens trained; # of C.I. guidelines distributed.
– INTERMEDIATE OUTCOMES: % of participating citizens with improved understanding; customer satisfaction ratings.

HLO (impact): % of cities with neighborhood organizations.

23

Links to Oregon Benchmarks Form

Pertinent Benchmark or High-level outcome(s), i.e. related Oregon Benchmarks (OBMs) or High-Level Outcomes (HLOs):
HLO 1 - Percent of cities with active neighborhood organizations.

Sample entry:
– Agency Goal: Citizen involvement in land use planning
– OBM# / HLO#: HLO 1
– Key Performance Measure: Percent of participants with improved understanding
– PM #: Ag# - 1
– PM Since: 2002
– New or Mod.?: New
– 2000 Value: 55%
– 2005 Target: 70%
– Lead Division or Unit (Optional): Communications

(The remaining rows of the form are blank.)

24

 

Performance Measure Data Summary (for Ways and Means)

Columns: Performance Measure Definition (numbered as shown below); Data for 1997, 1998, 1999, 2000, and 2001; Targets for 2000, 2001, 2002, 2003, 2004, and 2005.

Rows: Agency # - 1 through Agency # - 8 (blank on the form).

Sample row: Percent of participants with improved understanding, with values 55%, 62%, 70%, 60%, 65%.
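A minimal sketch of how one row of the Data Summary could be held as a record of data points and targets. This is hypothetical and not part of the original slides; the year assignments below are assumptions for illustration only, since the transcript does not preserve which value belongs to which column.

```python
# Hypothetical record for one Data Summary row; year assignments are assumed.
measure = {
    "definition": "Percent of participants with improved understanding",
    "data":    {2000: 55, 2001: 62},           # historical values (assumed years)
    "targets": {2003: 60, 2004: 65, 2005: 70}, # target values (assumed years)
}

latest_actual = measure["data"][max(measure["data"])]
final_target = measure["targets"][max(measure["targets"])]
print(f"{measure['definition']}: {latest_actual}% now, {final_target}% by 2005")
```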

25

Helpful websites

Governmental Accounting Standards Board
GASB home page: www.gasb.org
http://accounting.rutgers.edu/raw/seagov/pmg/

National Center for Public Productivity, Rutgers
A Brief Guide to Performance Measurement in Local Government (1997)
http://newark.rutgers.edu/~ncpp/cdgp/Manual.htm#man1

John F. Kennedy School of Government, Harvard
An Open Memorandum to Government Executives - Get Results Through Performance Management (2001)
http://www.ksg.harvard.edu/visions/

26

Additional resources

• Books and reports
– Measuring Up, Jonathan Walters (1998)

– The Reinventor’s Fieldbook, David Osborne and Peter Plastrik, Chapter 7 (2000)

– Making Results-Based State Government Work, The Urban Institute (2001)

• Oregon Progress Board
– Technical Assistance

– Training

– Strategic Planning

27

George Dunford
Performance Measure Manager, DAS
(503) 540-1138
George.Dunford@state.or.us

Jeffrey L. Tryens
Executive Director, Progress Board
(503) 986-0039
Jeffrey.L.Tryens@state.or.us

Rita Conrad
Senior Policy Analyst, Progress Board
(503) 986-0031
Rita.R.Conrad@state.or.us

DAS/Oregon Progress Board
