Introduction to Evaluating the Minnesota Demonstration Program
Paint Product Stewardship Initiative
September 19, 2007, Seattle, WA
Matt Keene, Evaluation Support Division, National Center for Environmental Innovation, Office of Policy, Economics and Innovation, U.S. Environmental Protection Agency


TRANSCRIPT

Page 1:

Introduction to Evaluating the Minnesota Demonstration Program

Paint Product Stewardship Initiative

September 19, 2007, Seattle, WA

Matt Keene, Evaluation Support Division
National Center for Environmental Innovation
Office of Policy, Economics and Innovation
U.S. Environmental Protection Agency

Page 2:

Presentation Objective

Introduce the Paint Product Stewardship Initiative to the key steps in designing the demonstration program evaluation.

Page 3:

Session Agenda

Program Evaluation: Definition, Uses, Types

- What is Program Evaluation?

- Why Should We Evaluate?

Steps in the Evaluation Process

I. Select Program to Evaluate

II. Identify Evaluation Team

III. Describe the Program

IV. Develop Evaluation Questions

V. Identify Existing and Needed Data

VI. Select Data Collection Methods

VII. Select Evaluation Design

VIII. Develop Evaluation Plan

Page 4:

What is Program Evaluation?

Program Evaluation:

A systematic study that uses measurement and analysis to answer specific questions about how well a program is working to achieve its outcomes and why.

Performance Measurement:

The ongoing monitoring and reporting of program progress and accomplishments, using pre-selected performance measures.

Page 5:

Why Evaluate?

Good Program Management:

Ensure program goals and objectives are being met.

Help prioritize resources by identifying the program services yielding the greatest environmental benefit.

Learn what works well, what does not, and why.

Learn how the program could be improved.

Provide information for accountability purposes:

Government Performance and Results Act of 1993: Requires EPA to report schedules for and summaries of evaluations that have been or will be conducted and identify those that influence development of the Agency’s Strategic Plan.

Environmental Results Order 5700.7: Requires EPA grant officers and grant recipients to identify outputs and outcomes from grants and connect them to EPA’s strategic plan.

Page 6:

Steps for Designing an Evaluation

I. Select Program to Evaluate

II. Identify Evaluation Team

III. Describe the Program

IV. Develop Evaluation Questions

V. Identify Existing and Needed Data

VI. Select Data Collection Methods

VII. Select Evaluation Design

VIII. Develop Evaluation Plan

Page 7:

Assessing Whether to Evaluate Your Program (Evaluability Assessment)

1. Is the program significant enough to merit evaluation? Consider: program size, number of people served, transferability of the pilot, whether it is undergoing PART (OMB's Program Assessment Rating Tool) review.

2. Is there sufficient consensus among stakeholders on program’s goals and objectives?

3. Are staff & managers willing to make decisions about or change the program based on evaluation results?

4. Are there sufficient resources (time, money) to conduct an evaluation?

5. Is relevant information on program performance available or can it be obtained?

6. Is an evaluation likely to provide dependable information?

7. Is there a legal requirement to evaluate?

(Adapted from Worthen et al., 1997.)
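For teams screening several candidate programs at once, the checklist lends itself to a simple script. The sketch below is purely illustrative: the questions are the seven above, but the function name and the pass/fail threshold are hypothetical choices, not part of the original checklist.

```python
# A minimal sketch of the seven-question evaluability screen above.
# The questions come from the slide (adapted from Worthen et al., 1997);
# the function name and threshold are hypothetical.

QUESTIONS = [
    "Is the program significant enough to merit evaluation?",
    "Is there sufficient consensus among stakeholders on goals and objectives?",
    "Are staff and managers willing to act on evaluation results?",
    "Are there sufficient resources (time, money) to conduct an evaluation?",
    "Is relevant performance information available or obtainable?",
    "Is an evaluation likely to provide dependable information?",
    "Is there a legal requirement to evaluate?",
]

def evaluability_screen(answers: list[bool], threshold: int = 5) -> str:
    """Summarize yes/no answers to the screening questions above."""
    if len(answers) != len(QUESTIONS):
        raise ValueError("one answer is required per question")
    yes = sum(answers)
    verdict = ("worth designing an evaluation" if yes >= threshold
               else "revisit readiness first")
    return f"{yes}/{len(QUESTIONS)} criteria met: {verdict}"

# Example: a program that meets every criterion except data availability.
print(evaluability_screen([True, True, True, True, False, True, True]))
```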

Page 8:

Steps for Designing an Evaluation

I. Select Program to Evaluate

II. Identify Evaluation Team

III. Describe the Program

IV. Develop Evaluation Questions

V. Identify Existing and Needed Data

VI. Select Data Collection Methods

VII. Select Evaluation Design

VIII. Develop Evaluation Plan

Page 9:

Identify Evaluation Team Members

Select diverse team members:

• Individuals responsible for designing, collecting, and reporting information used in the evaluation

• Individuals with knowledge of the program

• Individuals with a vested interest in the conduct/impact of the program

• Individuals with knowledge of evaluation

• Identify a Skeptic!

Page 10:

Steps for Designing an Evaluation

I. Select Program to Evaluate

II. Identify Evaluation Team

III. Describe the Program

IV. Develop Evaluation Questions

V. Identify Existing and Needed Data

VI. Select Data Collection Methods

VII. Select Evaluation Design

VIII. Develop Evaluation Plan

Page 11:

Describe the Program

Describe the program using a logic model

Use the logic model to:

• Check assumptions about how the program is supposed to work

• Brainstorm evaluation questions

Page 12:

Page 13:

Elements of the Logic Model

Resources/Inputs: Programmatic investments available to support the program.

Activities: Things you do; the activities you plan to conduct in your program.

Outputs: Product or service delivery/implementation targets you aim to produce.

Customers: Users of the products/services; the target audience the program is designed to reach.

Short-term outcomes (attitudes): Changes in learning, knowledge, attitudes, skills, understanding.

Intermediate outcomes (behavior): Changes in behavior, practice, or decisions.

Long-term outcomes (condition): Changes in condition.

External Influences: Factors outside of your control (positive or negative) that may influence the outcome and impact of your program/project.

(In the original diagram, these elements read from PROGRAM to RESULTS FROM PROGRAM, tracing HOW the program operates through WHY it matters.)
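Read as a chain from program to results, these elements also map naturally onto a small data structure. The sketch below is a minimal illustration of that mapping, assuming nothing beyond the slide's own definitions; the class and field names are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    """Hypothetical container mirroring the logic model elements above."""
    inputs: list[str]        # resources/investments available to support the program
    activities: list[str]    # things you do: planned program activities
    outputs: list[str]       # product/service delivery targets you aim to produce
    customers: list[str]     # target audience the program is designed to reach
    short_term: list[str]    # changes in learning, knowledge, attitudes, skills
    intermediate: list[str]  # changes in behavior, practice, or decisions
    long_term: list[str]     # changes in condition
    external_influences: list[str] = field(default_factory=list)  # factors outside your control

    def summary(self) -> str:
        """Trace the HOW-to-WHY chain from program to results."""
        stages = [
            ("Inputs", self.inputs), ("Activities", self.activities),
            ("Outputs", self.outputs), ("Customers", self.customers),
            ("Short-term outcomes", self.short_term),
            ("Intermediate outcomes", self.intermediate),
            ("Long-term outcomes", self.long_term),
        ]
        return "\n".join(
            f"{name}: {'; '.join(items) or '(none listed)'}" for name, items in stages
        )
```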

Page 14:

PPSI Demonstration Program Logic Model (dated September 13, 2007)

Program Goal: Design, implement, and evaluate a fully funded statewide paint product stewardship program that is cost-effective and environmentally beneficial.

Program stages (left to right in the diagram): Planning and Needs Assessment Stage; Project Implementation Stage; Use and Transfer Stage.

Activities:

• Outreach/Education: establish relationships/partnerships; implement education/outreach and social marketing projects/campaign

• Measurement: collect baseline data; ongoing data collection; interim analysis

Outputs:

• Baseline information; program database

• Education materials; workshops; media; tools for consumers and retailers

• Interim reports and presentations

Customers: consumers, retailers, manufacturers, agencies, environmental groups, recyclers

Outcomes:

• Shorter-term (awareness): awareness of recycled paint and the waste hierarchy improves

• Intermediate (behavior): less waste paint; decisions based on the waste hierarchy

• Longer-term (condition): [not legible in the transcript]

Key: Management systems = collection, processing, transportation, recycling, disposal. Waste hierarchy = reduce, reuse, recycle, resource recovery.
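Using the hypothetical LogicModel sketch from Page 13, the content of this slide might be captured roughly as follows. Only text from the slide is used; the inputs and long-term boxes were not legible in the transcript, so they are left empty here.

```python
# Populate the hypothetical LogicModel with the PPSI slide's content.
ppsi = LogicModel(
    inputs=[],  # not legible in the transcript of this slide
    activities=[
        "Outreach/education: establish relationships/partnerships",
        "Implement education/outreach and social marketing projects/campaign",
        "Measurement: baseline data, ongoing data collection, interim analysis",
    ],
    outputs=[
        "Baseline information", "Program database", "Education materials",
        "Workshops", "Media", "Tools for consumers and retailers",
        "Interim reports and presentations",
    ],
    customers=[
        "Consumers", "Retailers", "Manufacturers", "Agencies",
        "Environmental groups", "Recyclers",
    ],
    short_term=["Awareness of recycled paint and the waste hierarchy improves"],
    intermediate=["Less waste paint", "Decisions based on the waste hierarchy"],
    long_term=[],  # long-term condition outcomes not legible in the transcript
)
print(ppsi.summary())
```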

Page 15:

Steps for Designing an Evaluation

I. Select Program to Evaluate

II. Identify Evaluation Team

III. Describe the Program

IV. Develop Evaluation Questions

V. Identify Existing and Needed Data

VI. Select Data Collection Methods

VII. Select Evaluation Design

VIII. Develop Evaluation Plan

Page 16:

What are Evaluation Questions?

Questions (at any point on the performance spectrum/logic model) that the evaluation is designed to answer.

They should reflect stakeholders’ needs.

Evaluation questions are KEY because they:

• Frame the scope of the evaluation

• Drive the evaluation design, data collection, and reporting

Page 17:

Types of Evaluations and Common Evaluation Questions

Evaluation type and common evaluation questions:

Design assessment: Is the design of the program well formulated, feasible, and likely to achieve the intended goals?

Process evaluation or implementation assessment: Is the program being delivered as intended to the targeted recipients? Is the program well managed?

Outcome evaluation: Are desired program outcomes obtained? Did the program produce unintended outcomes?

Net impact evaluation: Did the program cause the desired impact? Is one approach more effective than another in obtaining the desired outcomes?

Cost evaluation: What are the specific costs of implementing and operating the program? Is the program cost-efficient? Cost-effective?

(Adapted from Evaluation Dialogue Between OMB and Federal Evaluation Leaders: Digging a Bit Deeper into Evaluation Science, April 2005.)

Page 18:

The Evaluation Plan

• What: Brief document describing evaluation purpose, audience, scope, design, & methods.

• Why: The purpose is to clearly articulate and communicate expectations for the evaluation.

• Who: Developed by one or more team members based on team’s common understanding.

• When: Can be developed at any point from initial selection of the program through development of the research design.

• Plans are living documents and need to be revised to account for changes in evaluation objectives or methods.

Page 19:

Components of an Evaluation Plan

• Purpose of the evaluation / evaluation questions

• Primary audience

• Context (organizational, management, political)

• Data collection methods and analysis

• Evaluation design

• How evaluation findings will be reported

• Consider different formats for different target audiences

• Expectations for roles and communication among evaluators, program staff/managers, and key stakeholders

• Resources available for evaluation (staff, budget)

• Timeline for evaluation

• Note: Save sufficient time to develop evaluation questions and analyze data thoroughly.
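As a worked illustration, the components above could be kept as a fill-in template so nothing is dropped when the plan is revised. This is a hypothetical sketch, not an EPA format; every class and field name is an assumption.

```python
from dataclasses import dataclass

@dataclass
class EvaluationPlan:
    """Hypothetical template mirroring the plan components listed above."""
    purpose: str                      # purpose of the evaluation
    evaluation_questions: list[str]
    primary_audience: str
    context: str                      # organizational, management, political
    data_collection_and_analysis: str
    evaluation_design: str
    reporting: str                    # how findings are reported; formats per audience
    roles_and_communication: str      # evaluators, program staff/managers, stakeholders
    resources: str                    # staff, budget
    timeline: str                     # reserve time for questions and data analysis

    def missing_fields(self) -> list[str]:
        """List components still left blank, as a completeness check."""
        return [name for name, value in self.__dict__.items() if not value]
```

Because plans are living documents, a completeness check like `missing_fields()` gives a quick way to see which components still need attention after each revision.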

Page 20:

Steps for Designing an Evaluation

I. Select Program to Evaluate

II. Identify Evaluation Team

III. Describe the Program

IV. Develop Evaluation Questions

V. Identify Existing and Needed Data

VI. Select Data Collection Methods

VII. Select Evaluation Design

VIII. Develop Evaluation Plan

Page 21:

Contact

Matt Keene

(202) 566-2240

[email protected]

Evaluation Support Division
National Center for Environmental Innovation
Office of Policy, Economics and Innovation

U.S. Environmental Protection Agency

Page 22: