Program Evaluation & Faculty Participation
By Group 4: Charlotte Featherston, Sara Martin, Christina (Gaupp) Stacy, and Elsy Thomas

Uploaded by berniece-dean on 18-Jan-2018


TRANSCRIPT

Page 1

Program Evaluation & Faculty Participation
By Group 4: Charlotte Featherston, Sara Martin, Christina (Gaupp) Stacy, and Elsy Thomas

Page 2

Historical Perspective & Background

• Early approaches to educational program evaluation

• Ralph Tyler’s behavioral objective model

Page 3

A Classic Model: The Tyler Model

• Often referred to as the “objective model”

• Emphasis on consistency among objectives, learning experiences, and outcomes

• Curriculum objectives indicate both behavior to be developed and area of content to be applied (Keating, 2006)

Page 4

Tyler’s Four Principles of Teaching

• Principle 1: Defining appropriate learning objectives

• Principle 2: Establishing useful learning experiences

• Principle 3: Organizing learning experiences to have a maximum cumulative effect

• Principle 4: Evaluating the curriculum and revising those aspects that did not prove to be effective (Keating, 2006)

Page 5

Primary Strengths of Tyler’s Model

• Clearly stated objectives are a good place to begin

• Involves the active participation of the learner (Prideaux, 2003)

• Simple linear approach to development of behavioral objectives (Billings & Halstead, 2009)

Page 6

Progression of Program Evaluation

• 1980s

• Outcome assessments

• State legislatures

• National League for Nursing (NLN)

• CCNE – 1990s

Page 7

Program Evaluation: 2000

• Sauter (2000) surveyed all baccalaureate nursing programs in the United States to determine how they develop, implement, and revise their program evaluation plans.

• In 2006, Suhayda and Miller reported on the use of Stufflebeam’s CIPP model to provide a framework for comprehensive program evaluation that would serve undergraduate and graduate nursing programs.

Page 8

Relevance & Justification

Page 9

Program Evaluation

Set Expectations → Collect Data → Use Data

Page 10

Importance of Evaluation

Page 11

Relevance

Page 12

The ABC’s of Evaluation

A. Needs assessment

B. Educational objectives: (1) cognitive, (2) psychomotor, (3) attitude

C. Plan and design the educational program: (1) content, (2) method, (3) material, (4) evaluation methodology, (5) environment

D. Implement the educational program

E. Evaluate the educational program

F. Feedback to: (1) learner, (2) teacher, (3) organization

G. Improve the educational program

Page 13

Impact Evaluation of the Program

Page 14

Focus of Impact Evaluation

• Participants’ perceptions and satisfaction

• Participants’ beliefs about teaching and learning

• Participants’ teaching performance

• Students’ perceptions of staff teaching performance

• Students’ learning

• Effects on the culture of the institution

Page 15

Resources to Conduct Impact Evaluation

• Reliable and valid instruments

• Trained data collectors

• Personnel with research and statistical expertise

• Equipment for data collection

• Equipment for data analysis

Page 16

When to Do Impact Evaluation

1. A new program is added to the curriculum

2. Pilot programs that are due to be markedly scaled up

3. Ongoing programs

Page 17

Evaluation of the Curriculum

• Curriculum design

• Discipline of knowledge

• Characteristics of the discipline

Page 18

Conclusion

• Program evaluation is collaborative, comprehensive, and complex.

• By understanding the history of program evaluation, we can better understand the theory behind it.


Page 19

Conclusion

• Evaluation should focus on a specific purpose with the goal of long-term improvement.

• Evaluators must consider program values along with societal expectations.

Page 20

Conclusion

“Development and implementation of a carefully designed theory-driven program evaluation plan will support continuous quality improvement for nursing education programs” (Billings & Halstead, 2009, p. 507).

Assess → Plan → Improve

Page 21

References

• Bastable, S. B. (2013). Nurse as educator (4th ed.). Sudbury, MA: Jones and Bartlett.

• Billings, D. M., & Halstead, J. A. (2009). Teaching in nursing: A guide for faculty (3rd ed.). St. Louis, MO: Elsevier Saunders.

• Denham, T. J. (2002). Comparison of two curriculum/instructional design models: Ralph W. Tyler and Siena College accounting class, ACCT205. Retrieved from ERIC Database. (ED 471734)

• Educational Development Programs, 6(2), 96-108. Retrieved from http://www.tandfonline.com/doi/abs/10.1080/13601440110090749

• Keating, S. (2006). Curriculum development and evaluation in nursing. Philadelphia, PA: Lippincott Williams & Wilkins.

• Klein, C. (2006). Linking competency-based assessment to successful clinical practice. Journal of Nursing Education, 45(9), 379-383.

• McDonald, M. C. (2014). Guide to assessing learning outcomes (3rd ed.). Sudbury, MA: Jones and Bartlett.

• Northeastern Illinois University. (n.d.). Classical model: Ralph Tyler, 1949, book summary. Retrieved from www.neiu.edu/~aserafin/New%20Folder/TYLER.html

• Oermann, M., & Gaberson, K. (2006). Evaluation and testing in nursing education (2nd ed.). New York, NY: Springer Publishing Company, Inc.

• Outline of principles of impact evaluation. (n.d.). Retrieved from http://www.oecd.org/dac/evaluation/dcdndep/37671602.pdf

• Prideaux, D. (2003). Curriculum design: ABC of learning and teaching in medicine. British Medical Journal, 326(7383), 268-270. Retrieved from http://www.ncbi.nlm.nih.gov/pmc/articles/PMC1125124/?tool=pubmed

• Ross, A. (2010). Survey data collection for impact evaluation. Retrieved from http://siteresources.worldbank.org/EXTHDOFFICE/Resources/5485726-1256762343506/6518748-1292879124539/25.Collecting-Quality-Data-for-Impact-Evaluation_Adam

• University of South Florida College of Education. (n.d.). Ralph Tyler’s little book. Retrieved from www.coedu.usf.edu/agents/dlewis/publications/tyler.htm