
Page 1: A Brief Overview

What is program evaluation? How is an evaluation conducted?

When should it be used? When can it be used?

Used with permission of: John R. Slate

Page 2: Presentation Outline

Definitions
Purposes
Types
Key concepts of evaluative research
Research designs
Requirements for program evaluation

Page 3: A Definition of “Program”

“An organized set of resources and activities directed toward a common purpose or goal”

Page 4: Two Definitions of “Program Evaluation”

“... an assessment, through objective measurement and systematic analysis, of the manner and extent to which Federal programs achieve intended objectives.” (Source - Government Performance and Results Act (GPRA))

“The application of scientific research methods to assess program concepts, implementation and effectiveness” (Source - General Accounting Office, Designing Evaluations, Report GAO/PEMD-10.1.4)

Pages 5-10: Where Does Evaluation Fit Into a Program’s Planning, Development and Implementation Process?

(The process is built up one step per slide across pages 5-10:)

1. Make the decision to create the program and set its strategic direction
2. Determine what the program will do and how it will do it; set targets and program objectives
3. Create the program infrastructure and the management, administrative, and information systems; develop performance measures
4. Determine the required levels of human resources and material support
5. Implement the action plan (program operation and management, performance measurement, corrective action, etc.)
6. Conduct program effectiveness, impact, and efficiency evaluations to determine the continued need for the program, alter the program design, adjust resource requirements, etc.

Pages 11-12: Types of Program Evaluations

Formative - judging the worth of a program while activities are forming or in process
• focus is on the process more than the outcome
• can help make in-process improvements
• often involves a small-scale field test

Summative - judging the effectiveness of a fully operating or completed program
• focus is on the outcome and overall program worth
• can help with decisions to expand, terminate, or modify
• usually encompasses the entire program

Page 13: Questions Asked by Program Evaluations*

Descriptive - statistics on inputs, outputs, and outcomes; how does the program work?

Normative - what is the expected performance (goal) of the program in relation to actual achievement?

Impact - if a goal is not met, why not? Program evaluations must establish a cause/effect relationship between an unmet goal and program activities or other, external factors.

*From Designing Evaluations, GAO/PEMD-10.1.4
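To make the three question types concrete, here is a hypothetical worked example; all figures and names below are invented for illustration, not drawn from the slides:

```python
# Hypothetical illustration of the three question types (all numbers invented).
from statistics import mean

participant_outcomes = [62, 71, 58, 66, 74]  # e.g., post-program test scores
program_goal = 70                            # target set during planning

# Descriptive: what happened? (statistics on outcomes)
observed = mean(participant_outcomes)        # 66.2

# Normative: how does actual achievement compare with the goal?
shortfall = program_goal - observed          # goal missed by 3.8 points

# Impact: why was the goal missed? This requires a counterfactual, such as
# a comparison group, to attribute the shortfall to program activities
# rather than to other, external factors.
comparison_outcomes = [60, 63, 59, 61, 64]
program_effect_estimate = observed - mean(comparison_outcomes)  # 4.8 points
print(observed, shortfall, program_effect_estimate)
```

Note that the descriptive and normative questions need only the program’s own data, while the impact question needs some basis for comparison.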

Page 14: Decisions Program Evaluation Can Help Make

Continue or discontinue a program

Improve policies and procedures

Add or drop specific program elements

Institute similar programs elsewhere

Allocate resources

Accept or reject approaches & theories

Page 15: Key Concepts of Evaluative Research

According to standards in the research and academic communities, program evaluations should strive for scientific proof and method, i.e., they should be:

empirical - based on valid, reliable data
replicable - the study can be repeated in exactly the same way in another time, place, or setting
falsifiable - hypothetical cause/effect relationships can be demonstrated or not
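To see what “falsifiable” means in practice, a cause/effect claim should be stated so that data could contradict it. The sketch below is a minimal illustration, assuming SciPy is available; the sample data and the 0.05 threshold are invented for the example:

```python
# Minimal falsifiability sketch (sample data invented for illustration).
# Claim: program participants score higher than non-participants.
# If the data show no reliable difference, the claim fails the test.
from scipy.stats import ttest_ind

participants     = [71, 74, 68, 77, 72, 75]
non_participants = [66, 69, 64, 70, 67, 68]

stat, p_value = ttest_ind(participants, non_participants)
if p_value < 0.05:
    print(f"Difference detected (p = {p_value:.3f}); the claim survives this test.")
else:
    print(f"No reliable difference (p = {p_value:.3f}); the claim is not supported.")
```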

Page 16: Evaluation Methodology (highly simplified)

Define program

Identify outcome goals and objectives

Model hypothetical cause/effect relationships between program activities and outcomes

Develop goal/objective measurement criteria and desired achievement levels

Locate, collect and analyze data on program participants (and maybe control group)

Compare actual results with target levels
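The last two steps (collect data, then compare actual results with target levels) might look like the following sketch; the goal names, targets, and measured values are hypothetical, not taken from the slides:

```python
# Hypothetical sketch of the "compare actual results with target levels" step.
targets = {  # desired achievement levels set during planning
    "completion_rate": 0.80,
    "job_placement_rate": 0.60,
}
actuals = {  # levels measured on program participants
    "completion_rate": 0.84,
    "job_placement_rate": 0.52,
}

for goal, target in targets.items():
    actual = actuals[goal]
    status = "met" if actual >= target else "NOT met"
    print(f"{goal}: target {target:.0%}, actual {actual:.0%} -> {status}")
```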

Pages 17-20: Evaluation Strategies and Designs*

(The table is built up one row per slide across pages 17-20; columns are Evaluation Strategy, Research Designs, and Description & (Purpose).)

Evaluation Strategy: Sample survey
Research Designs: Cross-sectional; Panel; Criteria-referenced
Description & (Purpose): Data collection from a sample to examine events and conditions (descriptive & normative)

Evaluation Strategy: Case study
Research Designs: Single; Multiple; Criteria-referenced
Description & (Purpose): Analytical description of an event, process, or program (descriptive & normative)

Evaluation Strategy: Field experiment
Research Designs: True experiment; Non-equivalent comparison; Before/after (incl. time series)
Description & (Purpose): Test case to isolate and control the program stimulus (impact)

Evaluation Strategy: Use of available data
Research Designs: Secondary data analysis; Evaluation synthesis
Description & (Purpose): Attempts to test hypotheses and answer questions based on existing information systems (descriptive, normative, & impact)

*Based on Designing Evaluations, GAO/PEMD-10.1.4
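For the field-experiment row, the “non-equivalent comparison” and “before/after” designs are often combined in a difference-in-differences estimate of impact. A minimal sketch, with all numbers invented for illustration:

```python
# Difference-in-differences sketch for a non-equivalent comparison,
# before/after design (all numbers invented).
from statistics import mean

program_before    = [55, 58, 52, 60]
program_after     = [66, 70, 63, 71]
comparison_before = [54, 57, 53, 59]
comparison_after  = [58, 60, 56, 62]

program_change    = mean(program_after) - mean(program_before)        # 11.25
comparison_change = mean(comparison_after) - mean(comparison_before)  # 3.25

# Subtracting the comparison group's change nets out trends that would
# have occurred without the program, isolating the program stimulus.
impact_estimate = program_change - comparison_change                  # 8.0
print(f"Estimated program impact: {impact_estimate:.2f} points")
```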

Page 21: Evaluability Assessment

A study to determine if, when, and how a program can be evaluated

Prevents premature impact and outcome evaluations

Usually involves:
• clarification of the intent of the program
• formulation of “testable” cause/effect statements
• determination of what can be measured and how
• assessment of the validity, reliability, and relevance of the data on which the evaluation would be based

Page 22: Circumstances When Program Evaluation is Not a Good Idea

When there is no question about the program

When there is no clear program structure or focus

When program activities cannot be distinguished from other activities

When people cannot agree on what the program is trying to achieve

When valid measurement methods and data do not exist and cannot be created

When the study would serve no useful purpose

Page 23: Program Evaluation Prerequisites

An operating “program”, i.e., a distinct set of activities and resources with a common purpose and focus

Agreement on program goals and objectives

Agreement on program metrics for its goals and objectives

Agreement on what constitutes program “success”

Existence of, or the ability to develop, valid data

Absence of legal, administrative, or cultural barriers to the study

Agreement on the intended use of the study