
The Path to Health and Wellness: Building Our Strength in Program Planning and Evaluation

Janis Weber, Ph.D.

Goal

Describe a general approach to evaluation planning using a program logic model, performance objectives, indicators, and data sources.

What comes to mind when you hear "program evaluation"?

What Is Program Evaluation?

"The systematic collection of information about the activities, characteristics, and outcomes of programs to make judgments about the program, improve program effectiveness, and/or inform decisions about future program development."

Why Invest in Program Evaluation?

• Demonstrate accountability to stakeholders
• Measure program achievement
• Manage program resources
• Document and improve program operations

Accountability

• Describe the relationship between activities and intended outcomes
• Monitor program implementation
• Measure intermediate and long-term outcomes
• Educate about realistic expectations for change

Evaluation

• Requires more in-depth data collection

• Measures short-term and intermediate outcomes

• Documents implementation and effectiveness

• Includes developing a program logic model

Surveillance and Evaluation

Surveillance Data
• Routine
• Existing resources
• Less flexibility

Evaluation Data
• Occasional
• Additional resources
• Flexible

Framework for Program Evaluation in Public Health

Standards for “Good” Evaluation

Utility

Feasibility

Propriety

Accuracy

Framework for Program Evaluation in Public Health

A Collaborative Approach

Reduces suspicion and fear

Encourages differing perspectives and many voices

Increases awareness and commitment

Increases the possibility of reaching objectives

Increases the possibility findings will be used

Identify Stakeholders—Who?

• Insiders: people who manage or work in the program
• Outsiders: those served or affected by programs, or who work in partnership with the program to achieve its goals
• Intended users of the evaluation: people who are in a position to decide something about the program

Who are your stakeholders?

Example HOPE Stakeholders

• Program staff

• Local and regional coalitions

• Community leaders, members, and grantees

• Local, state, and national partners

• Program funders

• Program users

• Federal, state, and county health departments

• Professional and business entities

• Schools and educational groups

• The medical community

• Community-based organizations

Example HOPE Stakeholders

• Universities and educational institutions

• Government (local and state legislators, and the governor)

• Families

• Privately owned businesses and business associations

• Faith organizations

Involve Stakeholders—Why?

• Ensure the evaluation is designed to answer questions important to stakeholders
• Increase the likelihood of continued support
• Build wider competency in evaluation
• Increase the possibility that evaluation findings will be used

Stakeholders as Team Members

Helpful team members at each step:
• Engaging stakeholders: diplomats; those with diverse networks
• Describing the program: people who understand program history, purpose, and operation
• Focusing the evaluation design: decision makers who guide program direction
• Gathering credible evidence: experienced evaluators; social or behavioral scientists
• Justifying conclusions: trusted people with few personal stakes
• Ensuring use and sharing lessons learned: advocates; clear communicators; creative thinkers; members of the power structure

Example Stakeholder Interests—Priority Questions

• What is the program doing to address the problem?
• How can I get involved in the program?
• Is the program making a difference?
• Is the program worth the cost?
• Can the program be made more efficient?

Who are your stakeholders?

How can stakeholders be identified or engaged?

What role(s) do stakeholders have in planning?

Who lends credibility to an evaluation or team?

How will we integrate stakeholder values?

Checklist—Engage Stakeholders

Identify stakeholders

Review list of stakeholders for inclusivity

Balance individuals and organizations

Reflect the situation of overweight in your setting

Establish a method of communication

Target key stakeholders for participation

Identify key areas for stakeholder input

Create a plan for strategic involvement

Bring stakeholders together as needed

Understand and reflect stakeholder values


Framework for Program Evaluation in Public Health

Our plans miscarry because they have no aim. When a man does not know what harbor he is making for, no wind is the right wind.

—Marcus Annaeus Seneca

Describe the Program

• Need

• Expected effects

• Activities

• Resources

• Stage of development

• Context

• Logic model

We Must Describe the Program Before We Can Begin Monitoring and Evaluation

Describing the program includes:

• Identifying which activities you will undertake, based on the needs identified from the data
• Clearly outlining what the activities will accomplish immediately
• Clearly outlining the impact the activities will have in the longer term

What Is a Logic Model?

• A disciplined way of mapping a program
• A platform for discussion
• A multi-purpose tool
• A means, not an end
• A presentation of the links in a chain of reasoning

Generic Program Logic Model

INPUTS → ACTIVITIES → OUTPUTS → OUTCOMES

• Inputs: program resources
• Activities: what we do with program resources to fulfill the mission
• Outputs: direct products of activities
• Outcomes: benefits for participants during and after program activities
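The chain above can be sketched as a simple data structure. The sketch below is a hypothetical illustration: the program entries are invented stand-ins, not taken from the HOPE program or this deck.

```python
# A minimal sketch of the generic logic model as a Python dictionary.
# All entries are hypothetical examples for illustration only.
logic_model = {
    "inputs": ["staff", "funding", "community partners"],        # program resources
    "activities": ["school-based physical activity sessions"],   # what we do with resources
    "outputs": ["number of sessions held", "students reached"],  # direct products of activities
    "outcomes": {                                                # benefits for participants
        "short_term": ["increased knowledge of activity guidelines"],
        "intermediate": ["more students active 30 minutes per day"],
        "long_term": ["reduced prevalence of overweight"],
    },
}

# Reading the keys left to right reproduces the chain of reasoning:
chain = ["inputs", "activities", "outputs", "outcomes"]
print(" -> ".join(chain).upper())
```

Keeping the model in one structure like this makes the links explicit, which is the point of a logic model: each stage should be traceable to the one before it.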

A Continuum of Outcomes—Why?

• Demonstrate a theory of change linked to time
• Educate stakeholders about realistic expectations for change
• Set ourselves up to demonstrate accountability now and in the future
• Tell a complete story

A comprehensive evaluation includes process and outcome measures.

Process and Outcome Evaluation

Process
• Were program activities accomplished?
• Were the activities implemented as planned?

Outcome
• Is the program having the intended impact?
• Is there progress toward larger program goals?

Process Evaluation

• The type and quantity of services provided
• The number of people receiving services
• What actually happens during implementation
• How much money the project is using
• The staffing for services/programs
• The number of coalition activities and meetings
• Assessment of program fidelity

Outcome Evaluation

• Results of program services
• Changes in individuals
• Changes in attitudes, beliefs, or behaviors
• Changes in the environment
• Changes in health disparities related to overweight and obesity

QUESTION: How do we transition from the logic model to program evaluation?

Evaluate Against Program Logic Model

Inputs → Activities → Outputs → Short-term Outcome → Intermediate Outcome → Long-term Outcome → Goal

Evaluation Questions

Is your program making a difference?

Is your program effective in reducing tobacco consumption?

Can your program be improved?

What exactly is your program doing?

Is your program accomplishing what it was intended to accomplish?

Was your program implemented as planned?

Are you using resources efficiently and effectively?

Is your program’s performance on par with established standards?

Evaluate Against Program Logic Model

Inputs → Activities → Outputs → Short-term Outcome → Intermediate Outcome → Long-term Outcome → Goal

For each of the short-term, intermediate, and long-term outcomes, define:
• Performance objectives
• Indicators
• Data sources

"Unless the goals are operationalized into specific objectives, it is unlikely that a plan can be implemented to meet them" (Rossi and Freeman 1982, p. 56).

Performance Objectives

Focus program priorities

Benchmark progress over time

Set targets for accountability

Strong Program Objectives Are SMART

• Specific
• Measurable
• Achievable
• Relevant
• Time-bound

Example: Increase the proportion of youth in grades 6 to 8 who engage in 30 minutes of physical activity per day from X percent in June 2007 to Y percent in June 2008.
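Because a SMART objective fixes a measurable quantity, a baseline, a target, and dates, progress toward it reduces to simple arithmetic. The percentages below are invented placeholders standing in for the deck's "X percent" and "Y percent"; real values would come from your own surveillance or survey data.

```python
# Hypothetical figures standing in for the objective's X (baseline) and
# Y (target) percentages; replace with actual measured values.
baseline_pct = 22.0   # proportion active 30 min/day, June 2007 (stand-in for X)
target_pct = 30.0     # target proportion, June 2008 (stand-in for Y)
observed_pct = 27.5   # hypothetical follow-up measurement

# Measurable and time-bound: the objective defines both the quantity to
# track and the dates, so progress is the share of the intended change
# actually achieved by the follow-up date.
progress = (observed_pct - baseline_pct) / (target_pct - baseline_pct)
print(f"Achieved {progress:.1%} of the intended change")
```

If an objective cannot be dropped into a calculation like this, it is usually missing either the Measurable or the Time-bound element.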


Framework for Program Evaluation in Public Health

Not everything that can be counted counts, and not everything that counts can be counted.

—Albert Einstein

Focus the Evaluation Design

Purpose

Users

Uses

Questions

Methods

Agreements

Focus the Evaluation Design

Define the purpose(s) of your evaluation

Include process and outcome evaluation

Identify evaluation questions

Link evaluation questions to goals, outcomes, and objectives

Identify the use(s) of results

Focus the Evaluation Design

Collect data to make comparisons

Review options for evaluation design

Consider a goal-based evaluation model

Make sure evaluation design “fits” questions

Seek expertise and/or review as needed

CDC Framework for Program Evaluation

“Don’t accept your dog’s admiration as conclusive proof that you are wonderful.”

—Ann Landers

Gather Credible Evidence

Indicators

Sources

Quality

Quantity

Logistics

Sources of Information—People

Clients, program participants/nonparticipants

Staff, program managers, administrators

Partner agency staff

General public

Key informants

Funders

Critics/skeptics

Representatives of advocacy groups

Elected officials, legislators, policy makers

Sources of Information—Documents

Grant proposals, newsletters, press releases
Meeting minutes, administrative records
Registration/enrollment forms
Publicity materials, quarterly reports
Publications, journal articles, posters
Previous evaluation reports
Needs assessments
Surveillance summaries
Database records
Records held by funders or collaborators
Web pages
Graphs, maps, charts, photographs, videotapes

Sources of Information—Observations

Meetings
Special events and/or activities
Job performance
Service encounters

Gather Credible Evidence

Confirm use of intended outcomes

Confirm outcomes are logically linked

Address a continuum of outcomes

Collect process and outcome data

Identify at least one indicator for each outcome

Link outcomes to indicators and data sources
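The two checklist items above — at least one indicator per outcome, with each outcome linked to indicators and data sources — can be kept machine-checkable in a small table. The outcomes, indicators, and sources below are hypothetical examples, not drawn from this deck.

```python
# Hypothetical outcome -> indicators/data-sources table. The checklist rule
# being enforced: every outcome has at least one indicator and at least one
# data source linked to it.
evaluation_plan = {
    "Increased daily physical activity": {
        "indicators": ["% of students active 30 minutes per day"],
        "data_sources": ["annual student health survey"],
    },
    "Improved access to healthy food": {
        "indicators": ["number of schools meeting nutrition standards"],
        "data_sources": ["school district records"],
    },
}

# Verify the checklist items for every outcome in the plan.
for outcome, plan in evaluation_plan.items():
    assert plan["indicators"], f"No indicator for outcome: {outcome}"
    assert plan["data_sources"], f"No data source for outcome: {outcome}"

print(f"{len(evaluation_plan)} outcomes, all linked to indicators and data sources")
```

Running the check whenever the plan changes catches outcomes that were added without deciding how they will be measured.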

Gather Credible Evidence

Ask if new data collection is necessary

Add evaluation questions to existing systems

Consider mixed-method data collection

Take into account available resources

Test new instruments to identify sources of error

Consider issues of timing

Framework for Program Evaluation in Public Health

Justify Conclusions

Standards

Analysis/synthesis

Interpretation

Recommendations

Sample Benchmarks for Performance Measurement

Needs of participants

Community values, expectations, norms

Program missions, objectives

Program protocols and procedures

Change in performance over time

Performance by similar programs

Performance by a control or comparison group


Resource efficiency

Mandates, policies, regulations, laws

Judgments by participants, experts, and funders

Institutional goals

Political ideology

Social equity

Human rights

Justify Conclusions

Check data for errors

Analyze data using appropriate techniques

Consider issues of context when interpreting

Discuss alternative explanations

Compare results with those of similar programs


Assess results against available literature

Use Healthy People 2010 (HP2010) as a key point for comparisons

Examine the limitations of the evaluation

Document potential biases

Framework for Program Evaluation in Public Health

Design

Preparation

Feedback

Follow-up

Dissemination

Ensure Use and Share Lessons Learned

Ensure Use

• Demonstrate that resources are well spent
• Aid in the formulation of budgets
• Compare outcomes with those of previous years
• Compare actual with intended outcomes
• Identify training and technical assistance needs
• Support annual and long-range planning
• Focus attention on issues important to the program
• Promote your program
• Identify partners for future collaborations
• Enhance the public image of your program
• Retain or increase funding
• Provide direction for program staff

Ensure Use and Share Lessons Learned

What can be done to increase the use of evaluation findings?

What can be done to reduce confusion or misinterpretation?

How should results be presented for different audiences?

How are program planning and evaluation related to one another?

Ensure Use and Share Lessons Learned

Design for intended use by intended users

Prepare stakeholders for eventual use

Provide continuous feedback to stakeholders

Disseminate procedures used and lessons learned to stakeholders

Draft recommendations based on audience

Ensure Use and Share Lessons Learned

Revisit the purpose(s) of the evaluation when preparing reports and recommendations

Tailor evaluation reports to audience(s)

Present clear and succinct findings in a timely manner

Avoid jargon when presenting information to others

Document limitations of the evaluation

Disseminate via multiple venues

Rethink Program Evaluation

• Expensive → Cost effective
• Too time consuming → Strategically timed
• Tangential → Integrated
• Technical → Accurate
• Not inclusive → Engaging
• Academic → Practical
• Punitive → Surprisingly helpful
• Political → Participatory
• Useless → Useful

Janis Weber
janisweber@hughes.net
