Program evaluation and outdoor education: An overview



Program evaluation
& outdoor education:
An overview

Dr James Neill, Centre for Applied Psychology, University of Canberra

16th National Outdoor Education Conference, Perth, Western Australia, Jan 10-13, 2010

Session abstract

This session will discuss program evaluation in outdoor education. What is it? Why do it? What methods are there? How can data be analysed? How can results be used? We will consider several example program evaluation studies and available tools and resources. There will also be an opportunity to workshop your own program evaluation needs.

More info: http://wilderdom.com/wiki/Category:NOEC/2010

Biography

James currently lectures in the Centre for Applied Psychology at the University of Canberra and conducts research in outdoor education, experiential learning, the natural environment, and new technologies. He also edits . James previously taught outdoor education at the University of New Hampshire and was the Research Coordinator and a senior instructor at Outward Bound Australia.

More info: http://wilderdom.com/JamesNeill.htm

Overview

Discuss program evaluation in outdoor education:

What is it?

Why do it?

What methods are there?

How can data be analysed?

How can results be used?

Image source: http://commons.wikimedia.org/wiki/File:Information_icon4.svg (license: public domain)

Image source: http://commons.wikimedia.org/wiki/File:Autoroute_icone.svg (license: CC-BY-A 2.5; author: http://commons.wikimedia.org/wiki/User:Doodledoo)

Overview

Consider example program evaluation studies and available tools and resources.

Opportunity to workshop your own program evaluation needs.


Contacts & resources

Resources:
http://wilderdom.com/wiki/Evaluation
http://wilderdom.com/wiki/Research
http://managementhelp.org/evaluatn/fnl_eval.htm

Email: [email protected]

What is it?

Image: "Blackboard" by pareeerica (http://www.flickr.com/photos/8078381@N03/3279725831/), license: CC-BY 2.0 (http://creativecommons.org/licenses/by/2.0/deed.en)

What is evaluation?

E-value-ation
(a systematic process of
determining value)


Research vs. evaluation

Research and evaluation are ways of answering questions.

Research = aims to generalise findings to the outside world

Evaluation = findings are specific and restricted to the evaluated program

(more info: Priest, 2001)

We are all natural researchers and evaluators. E.g., we are always assessing other people and judging value

If you are asked to buy new harnesses for a ropes course, you naturally research the answer to the question.

When it comes to assessing the quality of programming, do we systematically investigate?

Why do it?


Outdoor education life cycle:
Role of research & evaluation

As the field matures, there is a trend towards more programs becoming involved in research and evaluation.

International Life Cycle
(Priest, 1999)

Source: Priest, S. (1999). National life cycles in outdoor adventure programming. The Outdoor Network, 10(1), 16-17, 34-35.


[Figure & table: national life cycles in outdoor adventure programming, from Priest (1999)]

Organisational use of
experiential learning cycle principles

Why evaluate?

Two main motivations:

NECESSITY: we have to (for others)

MORALITY: we want to (to improve/develop)

The necessity argument runs something like this: if we don't get in and start building up systematic evaluation of our programs now, we will eventually, and probably increasingly, be forced by outside providers to do so.

The moral argument runs something like this: if we purport to affect psychosocial aspects of people's lives, then we have a moral obligation to maximise, in a rigorous and thorough way, the processes and outcomes of our programs.

(If we teach canoeing badly, but the kids survive, does it matter? If we teach about personal lives badly, however, we potentially interfere with and damage core aspects of a person's being. We are therefore as morally obligated, if not more so, to be as thorough and rigorous in our educational design, our training of facilitators, and our evaluation of outcomes as we are in thoroughly researching appropriate safety procedures.)

Safety audits are now common, but educational audits are not, though they are becoming more common.

Brookes' examination of fatalities in Australian outdoor education since the 1960s identified staff as one of the highest risk factors, because they are inclined to go outside program policies and normal behaviours. Likewise, while we discipline our students through systematic processes of feedback and reflection, we tend not to subject the educational quality of our programs to the same rigour of analysis.

Hierarchy of
research/evaluation motivations

-1. Intentional disinterest / non-engagement
0. Denial or non-awareness
1. Forced / compulsory
2. Marketing and funding purposes
3. Improve the quality of the program
4. Contribute to developing the profession
5. For the sake of humanity & the cosmos

Movement through the seven stages can start from any point and need not be linear (e.g., a change in leadership can cause a big leap up or down).

Types/models of evaluation
(Priest, 2001)

Needs assessment: What are some gaps that the program will fill?

Feasibility study: Given the constraints, can the program succeed?

Process evaluation: How is the implemented program progressing?

Outcome evaluation: Were program goals and objectives achieved?

Cost analysis: Was the program financially worthwhile or valuable?

Research: Will the program work elsewhere?


Ways of gathering data

Questionnaires, surveys, checklists (see the sketch after this list)

Interviews

Documentation review

Observation

Focus groups

Case studies

http://managementhelp.org/evaluatn/fnl_eval.htm#anchor1585345
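As a concrete starting point for the questionnaire option above, here is a minimal sketch in Python of summarising end-of-program survey ratings. The item names, the 1-8 response scale, and the data are all invented for illustration, not taken from any particular instrument.

```python
# Minimal sketch: summarising hypothetical end-of-program survey ratings.
from statistics import mean, stdev

# Each respondent rates each item on a 1-8 scale (items are invented examples).
responses = {
    "teamwork": [6, 7, 5, 8, 6, 7],
    "self_confidence": [5, 6, 6, 7, 5, 6],
    "program_overall": [7, 8, 6, 8, 7, 7],
}

# Report sample size, mean, and spread per item.
for item, ratings in responses.items():
    print(f"{item}: n={len(ratings)}, "
          f"mean={mean(ratings):.2f}, sd={stdev(ratings):.2f}")
```

Even this level of descriptive summary, repeated program after program, is enough to start tracking quality over time.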

Models for getting started

Internal: use existing staff resources

Advantages: cost-efficient; high level of specific program knowledge

External: consultant, university, another organisation, graduate student, etc.

Advantages: independent; expertise

Collaborative funding applications

Other options: grants; or a minimalist "just do it" approach, e.g., experiment with an end-of-program survey at the end of your next program.

Example evaluation studies

Outward Bound Australia Colonial Foundation Study (Neill, 2001)

Young Endeavour (Berman, Finkelstein, & Powell, 2005)

Outdoor Education Group (Learning Journeys)

Melbourne Children's Institute Resilience/Mental Health Study (2010-2012)


A typical evaluation process

Define purpose of evaluation

Audience: who needs to know?

Identify stakeholders

Establish program objectives & their operational definitions

Identify data collection methods

Establish research designs

Develop & pilot measures

Collect data

Analyse data (see the sketch after this list)

Report / disseminate, and get feedback

Consider/Act on recommendations

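For the "Analyse data" step, a common first analysis in outcome evaluation is a pre/post comparison. Here is a hedged sketch, assuming simple paired pre- and post-program scores on a single outcome measure; all numbers are hypothetical.

```python
# Sketch: basic pre/post outcome analysis on hypothetical paired scores.
from statistics import mean, stdev

pre = [4.2, 5.1, 3.8, 4.9, 5.5, 4.4]    # scores before the program
post = [5.0, 5.6, 4.6, 5.4, 6.1, 4.9]   # scores after the program

# Per-participant gains, their mean, and a standardised effect size
# (mean gain divided by the SD of gains, i.e., Cohen's d for paired data).
gains = [b - a for a, b in zip(pre, post)]
d = mean(gains) / stdev(gains)

print(f"mean gain = {mean(gains):.2f}, effect size d = {d:.2f}")
```

A design with a comparison or control group is stronger, but even a simple paired comparison like this turns an end-of-program survey into interpretable evidence.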

Define purpose of evaluation

What is your motivation? Evaluation or research?

Why do you want to evaluate? Internal?

External?

What do you want to do with the evaluation?

What is the research question?

Note: it's not research if you can't be surprised by the results.


Audience: Who needs to know?

Humanity?

Local society?

Funders?

Parents / School community?

Principal?

Program manager?

Instructors?

Students?


Identify stakeholders

Who has valuable information to help develop a comprehensive picture? Local community?

Outdoor education staff?

Client organisation staff?

Students?

Family?

Environment?




Program objectives
& operational definitions

Purposes / outcomes: Description

Recreational, Physical: Leisure (fun, relaxation, enjoyment), physical fitness, outdoor skills training

Educational: Direct (subject knowledge) and indirect (e.g., academic self-concept)

Developmental: Personal and social development, life skills, and functionality of behaviour

Therapeutic, Redirectional: Improve dysfunctional personal and group behaviour patterns

Environmental: Environmental attitude, knowledge, and behaviour

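To make the link between objectives and operational definitions concrete, here is a small, purely illustrative Python sketch that pins each abstract outcome to measurable indicators. Every construct, instrument, and item name below is invented for the example, not drawn from an actual program.

```python
# Hypothetical sketch: operationalising program objectives as measurable indicators.
objectives = {
    "personal development": {
        "instrument": "self-report questionnaire",
        "indicators": ["self-confidence rating (1-8)",
                       "time-management rating (1-8)"],
    },
    "environmental awareness": {
        "instrument": "pre/post knowledge quiz",
        "indicators": ["quiz score (0-20)"],
    },
}

# An objective only counts as operationally defined once it is tied to
# a specific instrument and specific indicators like these.
for outcome, spec in objectives.items():
    print(f"{outcome}: measured via {spec['instrument']}, "
          f"indicators: {', '.join(spec['indicators'])}")
```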

Dissemination / Reporting forms

Academic article

Conference presentation

Report: Technical? Non-technical?

Executive summary

Seminar / briefing

News release / Popular article

Student thesis

Website

Video


References

Berman, M., Finkelstein, J., & Powell, M. (2005). Tall ships and social capital: A study of their interconnections. International Journal of the Humanities, 2(2).

Priest, S. (1999). National life cycles in outdoor adventure programming. The Outdoor Network, 10(1), 16-17, 34-35.

Priest, S. (2001). A program evaluation primer. Journal of Experiential Education, 24(1), 34-40. Retrieved January 10, 2010, from http://academic.evergreen.edu/curricular/atpsmpa/Priest.pdf

More resources: http://wilderdom.com/wiki/Evaluation