PPAS 4310 PROGRAM EVALUATION: BACKGROUND & THEORY By: Dr. Mojgan Rahbari



Page 1:

PPAS 4310
PROGRAM EVALUATION: BACKGROUND & THEORY
By: Dr. Mojgan Rahbari

Page 2:

Lecture Overview

1. Define what program evaluation is
2. Why is there a need for program evaluation?
3. Key questions to consider when designing a program evaluation
4. Core categories of program evaluation
5. Evaluation models
6. Guiding principles of evaluation: The Canadian Evaluation Association
7. Different types of program evaluation
8. Fundamental methodological tools for collecting data during evaluations

Page 3:

Introduction

No clear market signals exist to measure government’s service performance, so governments turn to evaluation instead. It serves several purposes:

1. Program improvement: formative evaluation
2. Accountability: summative evaluation
3. To generate more general knowledge
4. As a political ploy or for public relations—produce data to support a program or justify a decision

o Evaluation is unpopular
o It creates great temptations to shoot, or at least ignore, the messenger
o Evaluation became popular again in the mid-1990s

Page 4:

What is Program Evaluation?

It is the systematic collection of information about the activities, characteristics, & outcomes of programs to (Patton, 1997: 23):

i. Make judgments about the program
ii. Improve program effectiveness
iii. Inform decisions about future programming

• Evaluation typically is ex post analysis— after the launch of a program

The Canadian Evaluation Association [formed in 1986] http://www.evaluationcanada.ca/site.cgi?s=1

OISE Centre for the Advancement of Measurement, Evaluation, Research & Assessment (CAMERA) http://fcis.oise.utoronto.ca/~camera/


Page 5:

What is Program Evaluation?

• An organization’s mission states the overall goals that must be reached to accomplish the mission

• These goals often become programs

• Programs are organized methods of providing certain related services to constituents, e.g., clients, customers, patients, etc.

• Programs are evaluated to assess whether they are indeed useful to constituents

• Program evaluation involves collecting information about a program in order to make decisions about it

Page 6:

Why There is a Need for Program Evaluation

• Championed by the management consulting industry

• A practice borrowed from the private sector

• Fiscal pressures since the 1990s have encouraged program reviews & reallocation of resources

• Policy communities & the public increasingly demand accountability

o Policies & programs are co-produced with the private & nonprofit sectors

Page 7:


Steps in the Evaluation Process

Link: http://www.phac-aspc.gc.ca/php-psp/toolkit-eng.php

Page 8:

Key Questions to Consider When Designing a Program Evaluation

• For what purpose is the evaluation being done?

• Who are the audiences for the information from the evaluation?

• What kinds of information are needed to make your decisions?

• From what sources should the information be collected?

• How can that information be collected in a reasonable fashion?

• When is the information needed?

• What resources are available to collect the information?


Page 9:

Core Categories of Program Evaluation [Pal, 2010:310]

Process Evaluation [Program Activities]
i. What are the components of the program?
ii. How is the program delivered?

Efficiency Evaluation [Costs/Benefits]
i. What was the ratio of benefits to costs in this program?
ii. Given what we spent, did we get the most out of it?

Impact Evaluation [Outcomes]
i. Did the program have the intended effects?
ii. If not, why not?

Page 10:

Process-Based Evaluations

Aim to understand how a program works—how it produces the results that it does. Useful when:

i. Programs are long-standing
ii. Programs have changed over the years
iii. Employees or customers report a large number of complaints about the program
iv. There appear to be large inefficiencies in delivering program services
v. The program needs to be portrayed accurately as it truly operates (e.g., for replication elsewhere)

Can you think of questions to consider when designing an evaluation to understand &/or examine the processes in a program?

Page 11:

Outcomes-Based Evaluation

o Important for nonprofit organizations & funders

o Examines whether the organization is doing the right program activities to bring about the outcomes needed by clients

Outcomes are usually in terms of:

i. Enhanced learning (knowledge, perceptions/attitudes, or skills), or
ii. Conditions, e.g., increased literacy, self-reliance, etc.

Page 12:

Goals-Based Evaluation

Evaluating the extent to which programs are meeting predetermined goals or objectives

Can you think of examples of general types of goals programs try to achieve?


Page 13:

Different Types of Program Evaluation

• Needs evaluations
• Cost/benefit analysis
• Formative: effectiveness—focuses on process
• Summative: efficiency—focuses on outcome
• Goal-based
• Process
• Outcomes, etc.

Page 14:

Efficiency Evaluation

i. Cost-benefit analysis
ii. Cost-effectiveness analysis

• Both techniques focus on the problem of resource allocation

Steps:
i. Decide on the accounting unit
ii. Catalogue all costs & benefits over time
iii. Monetize those costs & benefits
iv. Discount those costs & benefits over the period of time the project or program will be operating
v. Determine the net social benefit
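The discounting and netting steps can be sketched in code. This is a minimal illustration: the function name, yearly figures, and 5% discount rate are all made up for the example, not taken from the lecture.

```python
# Hypothetical worked example of steps iii-v: monetized yearly benefits
# and costs for a three-year program, discounted at an assumed 5% rate.

def net_social_benefit(benefits, costs, rate):
    """Discount each year's (benefit - cost) to present value and sum (step v)."""
    return sum(
        (b - c) / (1 + rate) ** year
        for year, (b, c) in enumerate(zip(benefits, costs))
    )

# Year 0 is mostly setup cost; monetized benefits arrive in years 1 and 2.
nsb = net_social_benefit(benefits=[0, 120, 150], costs=[200, 20, 20], rate=0.05)
print(round(nsb, 2))  # positive: discounted benefits exceed discounted costs
```

A positive net social benefit suggests the program is worth its costs at the chosen discount rate; because the result is sensitive to that rate, the choice made in step iv matters a great deal.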

Page 15:

Key Factors in Process Evaluation

1. Clarity about the program:
• What is the program about?
• What are its intended objectives?

2. Logic model: ties inputs to activities, and activities to short-term, intermediate, & final outcomes

3. Judgment

4. Attribution:
o What contribution has the program made to a specific outcome?
o How much of the success or failure can we attribute to the program?

Page 16:

Key Factors in Process Evaluation

5. Credible indicators
• Tell you something important about the program & can be measured

6. Linking resources to results
• Should contribute to the wider process of governmental resource allocation
• Should encourage transparency in government decisions & accountability to citizens

7. Sustainability
• Should be part of an ongoing strategy of performance assessment

Page 17:

Process/Implementation Evaluation

• Monitors an existing program to assess the effort put into it—but it is not the same as measuring success

• It involves looking at “how something happens rather than or in addition to examining outputs & outcomes” [Patton, 2002: 159]

It aims at:

• Understanding the internal dynamics of how a program, organization, or relationship operates
• Improving the process whereby goals are met
• Linking evaluation to implementation

Page 18:

Logic Models

o Another way of mapping programs, bringing together program theory & implementation theory

o A model of how the program will work under certain environmental conditions to solve identified problems (McLaughlin & Jordan, 2004: 7)

o A chart or diagram that represents the narrative of:
• What the program is targeting
• How it works
• What it is trying to achieve
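As a sketch, the chain a logic model narrates can be written out as a simple data structure. The program and every entry below are hypothetical, chosen only to illustrate the input-to-outcome chain the slide describes:

```python
# Hypothetical logic model for an assumed adult-literacy program,
# following the chain: inputs -> activities -> outputs -> outcomes.
logic_model = {
    "inputs":                ["funding", "tutors", "classroom space"],
    "activities":            ["weekly reading workshops", "one-on-one tutoring"],
    "outputs":               ["sessions delivered", "participants reached"],
    "short_term_outcomes":   ["improved reading skills"],
    "intermediate_outcomes": ["increased literacy rates"],
    "final_outcomes":        ["greater self-reliance"],
}

# Print the chain stage by stage, in order.
for stage, items in logic_model.items():
    print(f"{stage}: {', '.join(items)}")
```

Reading the stages in order answers the three questions above: the inputs and activities show what the program targets and how it works, and the outcome stages show what it is trying to achieve.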

Page 19:

Key Elements of Logic Model

Source: A Results-based Logic Model for Primary Health Care (2004: 3).

Discussion: What would the logic model of the needle exchange program in Ottawa or Vancouver look like?

Page 20:

A Results-based Logic Model for Primary Health Care


Page 21:

Page 22:

Page 23:

Fundamental Methodological Tools for Collecting Data During Evaluations

Method: questionnaires, surveys, checklists
Overall purpose: when you need to quickly and/or easily get lots of information from people in a non-threatening way
Advantages: can be completed anonymously; inexpensive to administer; easy to compare and analyze; can be administered to many people; can get lots of data; many sample questionnaires already exist
Challenges: might not get careful feedback; wording can bias clients' responses; impersonal; in surveys, may need a sampling expert; doesn't get the full story

Method: interviews
Overall purpose: when you want to fully understand someone's impressions or experiences, or learn more about their answers to questionnaires
Advantages: get full range and depth of information; develops a relationship with the client; can be flexible with the client
Challenges: can take much time; can be hard to analyze and compare; can be costly; interviewer can bias the client's responses

Method: documentation review
Overall purpose: when you want an impression of how the program operates without interrupting it; drawn from review of applications, finances, memos, minutes, etc.
Advantages: get comprehensive and historical information; doesn't interrupt the program or the client's routine in the program; information already exists; few biases about the information
Challenges: often takes much time; info may be incomplete; need to be quite clear about what you're looking for; not a flexible means to get data; data restricted to what already exists

Page 24:

Fundamental Methodological Tools for Collecting Data During Evaluations (continued)

Method: observation
Overall purpose: to gather accurate information about how a program actually operates, particularly about processes
Advantages: view the operations of a program as they actually occur; can adapt to events as they occur
Challenges: can be difficult to interpret observed behaviors; can be complex to categorize observations; can influence the behaviors of program participants; can be expensive

Method: focus groups
Overall purpose: explore a topic in depth through group discussion, e.g., about reactions to an experience or suggestion, understanding common complaints, etc.; useful in evaluation and marketing
Advantages: quickly and reliably get common impressions; can be an efficient way to get much range and depth of information in a short time; can convey key information about programs
Challenges: responses can be hard to analyze; need a good facilitator for safety and closure; difficult to schedule 6-8 people together

Method: case studies
Overall purpose: to fully understand or depict clients' experiences in a program, and conduct a comprehensive examination through cross-comparison of cases
Advantages: fully depicts the client's experience in program input, process, and results; powerful means to portray the program to outsiders
Challenges: usually quite time-consuming to collect, organize, and describe; represents depth of information rather than breadth

Page 25:

Possible Client/Partner Organizations

o Family Services Toronto
o Riverdale Immigrant Women's Centre
o The Cross-Cultural Community Services Association
o Ontario Council of Agencies Serving Immigrants
o Access Alliance Multicultural Health & Community Services
o Across Boundaries - Ethnoracial Mental Health Centre
o COSTI - Caledonia Centre - Language & Skills Training Services
o YWCA Toronto
o Canadian Women's Foundation
o Yellow Brick House - emergency shelter for abused & homeless women
o Disabled Women's Network

Page 26:

Possible Client/Partner Organizations

United Way Toronto- www.unitedwaytoronto.com

Canadian Red Cross- www.redcross.ca

Ontario Nonprofit Network- www.ontariononprofitnetwork.ca

Ontario Ministry of the Attorney General- www.attorneygeneral.jus.gov.on.ca

Ontario Ministry of Transportation- www.mto.gov.on.ca

Ontario Ministry of the Environment- www.ene.gov.on.ca


Page 27:

Good Textbooks for the Course

Royse, D., Thyer, B. A., & Padgett, D. K. (2009) Program Evaluation: An Introduction

Wholey, J. S., Hatry, H. P., & Newcomer, K. E. (2010) Handbook of Practical Program Evaluation

Page 28:


References

City of Ottawa (2011), Needle Exchange Policy. Retrieved on 10/6/2011, from http://www.ottawa.ca/residents/health/living/alcohol_drugs_tobacco/drugs/site/policy_en.html

Canadian Evaluation Society (2011), CES Guidelines for Ethical Conduct, Retrieved 10/6/2011, from http://www.evaluationcanada.ca/site.cgi?s=1

McLaughlin, J. A. & Jordan, G. B. (2004) Using logic models. In J. S. Wholey, H. P. Hatry, & K. E. Newcomer (Eds.), Handbook of Practical Program Evaluation, 2nd Ed. (pp. 7-32). San Francisco, CA: Jossey-Bass.

OECD (2005) Modernising Government: The Way Forward. Paris: OECD.

Pal, L. A. (2010) Beyond Policy Analysis: Public Issue Management in Turbulent Times, 4th Ed., Toronto, Canada: Nelson Education Ltd.


Page 29:


References

Patton, M. Q. (1997) Utilization-focused Evaluation: The New Century Text (3rd ed.). Thousand Oaks, CA: Sage Publications.

Schacter, M. (1999), Means…ends…indicators: Performance measurement in the public sector. Ottawa: Institute on Governance.

Watson D. E., Broemeling A., Reid R.J, Black C. (2004) A Result-based Logic Model for Primary Health Care: Laying an Evidence-Based Foundation to Guide Performance Measurement, Monitoring and Evaluation, Centre for Health Services and Policy Research, College of Health Disciplines, The University of British Columbia, September 2004. Retrieved 11/6/2011, from http://www.chspr.ubc.ca/files/publications/2004/chspr04-19.pdf
