
Page 1: Kirkpatrick and beyond: A comprehensive methodology for

Kirkpatrick and beyond: A comprehensive methodology for influential training evaluations

Dr. Jan Ulrich Hense, Ludwig-Maximilians-Universität München, Department of Psychology

The 9th European Evaluation Society International Conference, October 6-8, 2010, Prague, Czech Republic
7 October 2010, 15:45h, Strand 1 - Methodological challenges III

Page 2: Kirkpatrick and beyond: A comprehensive methodology for

Evaluation of training

Kirkpatrick's four levels of training evaluation

Series of four articles, 1959/1960

Evaluating training programs, 3rd ed., 2006

Has been the dominant model for evaluation of training for decades.

Page 3: Kirkpatrick and beyond: A comprehensive methodology for

The four levels in a nutshell

Results: What impact had it on the organization?

Behaviour: What was applied at the workplace?

Learning: What skills & knowledge were learned?

Reaction: Did they like the training?


Page 4: Kirkpatrick and beyond: A comprehensive methodology for

Problems and criticism

Linearity

• Theoretically not plausible (e.g. interaction between reaction and learning)

Suggested causality ("chain of evidence")

• Weak empirical evidence (e.g. Alliger & Janak, 1989; Alliger et al., 1997)

• Suggests reaction as a proxy for subsequent levels ("happy sheet" evaluations)

"Black-box" view of training

• Narrow focus on outcomes (e.g. Holton, 1996)

• Ignores training process, individual influences, and contextual influences on learning and transfer (e.g. Bates, 2004)

"One size fits all" approach

• Ignores variety of training specifics

Page 5: Kirkpatrick and beyond: A comprehensive methodology for

Consequences

Limited use of four levels for influencing practice
• Unable to provide formative information on training improvement
• Risk of wrong summative conclusions

Need for a more complete view of training reality
• Inputs (e.g. participant characteristics)
• Processes (e.g. learning methods)
• Context (e.g. training and workplace conditions)

Use of theory
• Learning and instructional theory
• Transfer theory
• (cf. theory-based evaluation: Weiss, Chen & Rossi, Bickman, Donaldson, Rogers ...)

Page 6: Kirkpatrick and beyond: A comprehensive methodology for


Theory I: Learning and instruction

[Diagram: problem-based learning as a balance of construction and instruction, combining self-regulated learning, cooperative learning, and learning with new media within authentic, multiple, social, and instructional contexts (Reinmann & Mandl, 2006).]

Page 7: Kirkpatrick and beyond: A comprehensive methodology for


Theory II: Training transfer

(Baldwin & Ford, 1988)

Page 8: Kirkpatrick and beyond: A comprehensive methodology for


Evaluation of trainings for Six Sigma in a corporate setting

A case example

Page 9: Kirkpatrick and beyond: A comprehensive methodology for

Six Sigma Training

Six Sigma: Systematic, data-driven approach to quality improvement; goal: error-free processes

Methodology: DMAIC cycle and related tool set

Roles: Green & Black Belts (GB & BB), Master Black Belts (MBB)

Setting: International corporation specialized in industrial gases

The DMAIC cycle:

1. Define: What are the problems and goals?
2. Measure: What is the current state of things?
3. Analyze: What is wrong? What's causing the problem?
4. Improve: What can be done to improve the problem?
5. Control: How sustainable are the effects of improvements?

Page 10: Kirkpatrick and beyond: A comprehensive methodology for

(Six Sigma) Training theory

Inputs
• Training: goals, contents, design, methods
• Participants: learning preferences, motivation, previous knowledge, previous experiences, preparation

Training process
• Realization: learning sequences, media and materials, roles, learner activities, trainer activities
• Didactics: problem-based learning, cooperative learning, self-directed learning

Outcomes
• Reaction: satisfaction, usefulness
• Immediate effects: skills & knowledge, attitudes, self-efficacy
• Long-term effects: project finalization, career options

Transfer
• Usability of contents, application of skills and knowledge
• Support by coach, support by manager

Results (organization)
• Cost reduction, added value, cost-benefit ratio

Context
• Workplace: need for training, line manager support, learning conditions, transfer conditions
• Training context: facilities, resources

Kirkpatrick's four levels (Reaction, Learning, Behaviour, Results) are embedded in this model, which additionally covers the training concept, inputs, process, and context.

Page 11: Kirkpatrick and beyond: A comprehensive methodology for

Methods

Study I: Analysis of training concept
Sample: Manuals for trainers and participants
Method: Structured document analysis & interviews

Study II: Training process observation
Sample: ca. 80 training hours (BB training + GB training)
Method: Structured observation by two independent observers (event sampling with category scheme)

Study III: Pre- & posttest surveys
Sample: 40 participants of ongoing GB and BB courses
Method: Written surveys before and after the training

Study IV: Transfer study
Sample: 34 MBBs & 77 former BB participants
Method: Online transfer surveys; additional interviews with MBBs and former BB participants; additional analysis of finished Six Sigma projects

Page 12: Kirkpatrick and beyond: A comprehensive methodology for


[Diagram: the training theory model from the previous slide, with the data collection methods overlaid on its components: document analysis, interviews, observation, surveys, and test.]

Page 13: Kirkpatrick and beyond: A comprehensive methodology for


Selected results: Study II: BB training observation


Page 15: Kirkpatrick and beyond: A comprehensive methodology for


Selected results: BB training observation

Observed didactic activities (number of sequences / average min. per sequence):

Lecture: 35 / 10:34
Interactive lecture: 51 / 13:44
Demonstration: 16 / 13:04
Presenting case examples: 6 / 24:50
Scaffolding: 18 / 29:13
Feedback: 15 / 13:08
Documentation of results: 12 / 16:30
Six Sigma project reference: 9 / 15:13
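To make the aggregation behind this table concrete, here is a minimal Python sketch of how event-sampled observation records (one category and duration per observed sequence) could be turned into sequence counts and average durations per didactic category. The records and category labels below are hypothetical and not the study's actual coding data.

```python
# Hypothetical sketch: aggregating event-sampled observation records into
# sequence counts and mean durations per didactic category.
from collections import defaultdict
from datetime import timedelta

# Each observed sequence: (didactic category, duration)
observed_sequences = [
    ("Lecture", timedelta(minutes=12, seconds=30)),
    ("Interactive lecture", timedelta(minutes=9, seconds=45)),
    ("Lecture", timedelta(minutes=8, seconds=10)),
    ("Feedback", timedelta(minutes=14, seconds=0)),
]

# Group durations by category
durations = defaultdict(list)
for category, duration in observed_sequences:
    durations[category].append(duration)

# Report count and mean duration (min:sec) per category
for category, items in durations.items():
    mean_duration = sum(items, timedelta()) / len(items)
    minutes, seconds = divmod(int(mean_duration.total_seconds()), 60)
    print(f"{category}: {len(items)} sequences, avg {minutes}:{seconds:02d} min per sequence")
```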

Page 16: Kirkpatrick and beyond: A comprehensive methodology for


Selected results: Study III: Pre- & posttest surveys (BB-training participants)

[Bar chart: Skills & knowledge, pretest and posttest means on a scale of 1 to 5, self-assessed for five areas: overview of Six Sigma, Six Sigma tools, statistics, project management, and soft skills.]
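As an illustration of the kind of pre/post comparison summarized in this chart, the following is a minimal Python sketch that computes pretest and posttest means per skill area from paired survey ratings. The ratings below are invented for illustration; they are not the study's data.

```python
# Hypothetical sketch: pretest and posttest means per skill area from
# paired ratings on a 1-5 scale (illustrative data only).
from statistics import mean

# ratings[area] = list of (pretest, posttest) pairs, one per participant
ratings = {
    "Statistics": [(2, 4), (1, 4), (2, 5)],
    "Six Sigma tools": [(2, 4), (3, 5), (2, 4)],
}

for area, pairs in ratings.items():
    pre = mean(p for p, _ in pairs)
    post = mean(q for _, q in pairs)
    print(f"{area}: pretest mean {pre:.2f}, posttest mean {post:.2f}, gain {post - pre:.2f}")
```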

Page 17: Kirkpatrick and beyond: A comprehensive methodology for


Selected results: Study III: Pre- & posttest surveys (BB-training participants)

Self-efficacy (posttest means, scale 1 to 5; item means ranged from 3,53 to 4,13):

• I am better able to cope with my current non-Six Sigma related job demands.
• I am better able to apply the learning contents in my current daily job activities.
• I am better able to work more effectively in my current job.
• I am able to successfully realize my own Six Sigma projects.

Page 18: Kirkpatrick and beyond: A comprehensive methodology for


Selected results: Study III: Pre- & posttest surveys (BB-training participants)

[Stacked bar chart: Project finalization at posttest. For each DMAIC phase (Define, Measure, Analyze, Improve, Control), the number of participants whose project phase was finished, ongoing, or not started yet, shown as shares from 0% to 100%.]

Page 19: Kirkpatrick and beyond: A comprehensive methodology for


Selected results: Study IV: Transfer study

Transfer conditions (means, scale 1 to 5; three item means near 3,0 and one at 4,21):

• My executive manager gives me sufficient time for working as a Black Belt.
• My work environment supports me in my Black Belt role.
• I have the necessary resources at my workplace to successfully work as a Black Belt.
• I am able to get help from a Master Black Belt whenever I need it.

Page 20: Kirkpatrick and beyond: A comprehensive methodology for

Consequences

Selected recommendations to training management:

Input
• Improve preparation of participants (e.g. obligation to define own project with MBB)
• Slightly reduce training contents

Process
• Reduce number and length of (interactive) lectures
• Refer to learners' own projects more regularly
• Synchronize training weeks with project progress

Context
• Strengthen transfer conditions (e.g. line manager support, post-training workshops)
• Process control and support of participants' projects

Page 21: Kirkpatrick and beyond: A comprehensive methodology for

Discussion

Merits of the evaluation approach

The multi-method approach, comprising input, process, outcome, and transfer analyses, generated multiple formative insights into possible improvements of the trainings' concept and implementation.

This would not have been possible with an entirely outcome-oriented evaluation.

Page 22: Kirkpatrick and beyond: A comprehensive methodology for

Conclusion

On a general evaluation theory and policy level, the study can serve as an example of the limitations of pure impact evaluations for influencing practice.

In many practical contexts they will not provide the information necessary to guide improvements of a program.

Inclusion of theoretical considerations on input, context and process influences is often indispensable for evaluation to sustainably influence practices and policies.

Page 23: Kirkpatrick and beyond: A comprehensive methodology for


Thank you for your attention!

[email protected]

Page 24: Kirkpatrick and beyond: A comprehensive methodology for

References

Alliger, G. M. & Janak, E. A. (1989). Kirkpatrick's levels of training criteria: Thirty years later. Personnel Psychology, 42, 331-342.

Alliger, G. M., Tannenbaum, S. I., Bennett, W., Traver, H. & Shotland, A. (1997). A meta-analysis of the relations among training criteria. Personnel Psychology, 50, 341-358.

Baldwin, T. T. & Ford, J. K. (1988). Transfer of training: A review and directions for future research. Personnel Psychology, 41 (1), 63-105.

Bates, R. (2004). A critical analysis of evaluation practice: the Kirkpatrick model and the principle of beneficence. Evaluation and Program Planning, 27, 341-347.

Holton, E.F. (1996). The flawed four-level evaluation model. Human Resource Development Quarterly, 7, 5-21.

Kirkpatrick, D. (1959a). Techniques for evaluating training programs. Part 1 - Reaction. Journal of the American Society of Training Directors, 13 (11), 3-9.

Kirkpatrick, D. (1959b). Techniques for evaluating training programs. Part 2 - Learning. Journal of the American Society of Training Directors, 13 (12), 21-26.

Kirkpatrick, D. (1960a). Techniques for evaluating training programs. Part 3 - Behavior. Journal of the American Society of Training Directors, 14 (1), 13-18.

Kirkpatrick, D. (1960b). Techniques for evaluating training programs. Part 4 - Results. Journal of the American Society of Training Directors, 14 (2), 28-32.

Kirkpatrick, D. L. & Kirkpatrick, J. D. (2006). Evaluating training programs: The four levels (3rd ed.). San Francisco: Berrett-Koehler.

Reinmann, G. & Mandl, H. (2006). Unterrichten und Lernumgebungen gestalten [Teaching and design of learning environments]. In A. Krapp & B. Weidenmann (Eds.), Pädagogische Psychologie. Ein Lehrbuch [Educational psychology: A textbook] (5th ed., pp. 613-658). Weinheim, Germany: Beltz.