
Research and Best Practice for the Design and Delivery of

Training

Richard Clark

ICT Immersive Training Workshop


Important Problems and Questions?


Questions for Today’s Workshop

I will discuss five questions:

1. What have we learned recently that could help us solve pressing problems in training?

2. Do some technologies or media influence learning or motivation more than others?

3. What kind of training leads to adaptable skills and knowledge?

4. How can instructors and developers help trainees learn study skills?

5. How can we decide whether training should be delivered in the classroom or by multimedia?


1. Training Impact Studies

Over the past two decades, even the best training has produced only about a 20% increase in learning (meta-analysis results).

In task analysis, top experts provide only about 30% of the information about how they perform tasks.

50% of trainees are wrong when asked on reaction forms how much they learned from training.

Providing learner control over the sequencing of lessons or content harms learning.

First, the Bad News


Training Impact Studies

Technology-based distance learning is no more effective than classroom-based, instructor-led training.

Requiring students to discover what they need to learn (e.g. by solving problems to learn the solution) results in about 40% less learning than demonstrating how to solve problems.

– The most vulnerable students are harmed the most.

– Multimedia programs often distract students and reduce learning for the most vulnerable students.

Training Impact Studies

There is evidence that the way we most often design serious games makes them significantly less effective for learning, and more expensive, than more direct ways to train.

Adjusting training for different learning styles does NOT increase learning (e.g. Multiple Intelligences, Myers-Briggs, visual or verbal ability).

30% of feedback strategies damage performance; another 30% have no impact.

WHY IS THE EVIDENCE SO COUNTER-INTUITIVE AND DISMAL?

2. Our Mental Architecture Limits Learning

Thinking is limited to 3-4 ideas at one time.

Evolution prevented rapid, self-destructive learning. Emotionality and stress take up thinking space.

When we exceed mental capacity we “zone out”; daydreaming is pleasurable and unconscious.

We have two different knowledge systems:

– One is conscious and takes up space in thinking.

– One is unconscious and does not require thinking space.

We have two knowledge systems

1. Knowledge about What and Why (Declarative)

– Conscious, easier to learn and forget, can be wrong.

– Helps us imagine and handle novelty.

– About 10%-30% of adult knowledge.

2. Knowledge about When and How (Procedural)

– Unconscious, hard to learn and forget, mostly right.

– Helps us automate strategies that work without taking up any of the 3 to 4 thinking spaces.

– About 70%-90% of all adult knowledge.

Problems caused by two knowledge systems

1. We are unaware of procedural knowledge and so we focus training on declarative knowledge.

– People do not learn how to think or decide.

– They learn 20% and get the rest by trial and error.

2. Experts (as SMEs or trainers) are 70% unaware of their own mental strategies.

– SMEs provide only 30% of the “how” but believe they’ve given 100%.

SOLUTION?

Possible Solutions to Problems

Three New Training Strategies:

1. Use Cognitive Task Analysis to capture unconscious expertise.

2. Use new “Direct Instruction” to design training.

3. Avoid cognitive overload with new screen design.

Let’s look at all three in the context of ADDIE.

Effective Training

We currently use the ADDIE model:

1) Analysis (What and who should be trained?)

2) Design (What plan is best for this training?)

3) Development (What TSPs, media and materials?)

4) Implementation (How should we deliver training?)

5) Evaluation (How will we measure success?)

Let’s look at what we’ve learned about each stage:

I. Effective Training: Analysis

SMEs’ role in analysis? Two issues to consider:

1. SMEs must have successfully and recently performed the tasks being analyzed.

– They most often have obsolete skills because the operational environment evolves quickly.

2. Even the best SMEs are only 30% aware of how they perform the mental part of even routine tasks.

– Convincing evidence from many different fields: they can only describe what they can watch themselves do.

– Incomplete or wrong information in training puts trainees at risk in operational environments.

I. Effective Training: Analysis

The new cognitive task analysis interview captures unconscious expertise.

Accuracy increases to 80% (Chao, 1994; Clark, 2006).

When used in training, performance impact doubles (from 20% to 45%; Lee, 2004).

There is recent evidence that distance-learning (DL) dropout decreases.

Dr. Ken Yates will describe this and provide examples this afternoon.

II. Effective Training: Design

Design (What plan is best for this training?)

We have a number of challenges in design:

1. ADDIE does not dictate training methods, and the ones we select are often ineffective.

2. Asking trainees to “construct” or “discover” how to do something in order to solve a problem is the least effective method.

3. Reviews of training studies over the past 50 years indicate that “direct instruction” is best.

II. Effective Training: Design

David Merrill has reviewed all effective direct instruction training systems (military and civilian) and suggests that they share five key elements:

1) Authentic Problems (from mission environment)

2) Connections to trainees’ prior knowledge (analogies, examples)

3) Demonstration of “when and how to act and decide,” captured from highly successful experts with recent experience in the current operational environment

4) Part and whole-task practice with immediate corrective feedback

5) Transfer and automation help (field exercises, immersive simulations, serious games)


II. What training leads to adaptability?

1. Trainees must first learn why and how to perform in an authentic, routine setting.

2. Providing analogies that relate the how and why to previous experience increases adaptability.

3. “How to” knowledge must be applied in increasingly novel environments (varied practice).

4. All practice must be “hands on” with supportive and corrective feedback.

5. When trainees are asked to explain why the strategy they are using did not work, flexibility increases.


II. What training leads to Study Skills?

TEACH STUDENTS SIX L2L STRATEGIES:

1. Take notes and outline key points by translating course information into their own words.

2. Connect what they are learning to imagined previous experiences and to familiar examples and analogies.

3. Focus their attention on key points.

4. Ask higher order questions (Why? What Evidence?) to check their deeper understanding of course content.

5. Write summaries of main ideas in their own words and check them.

6. Think about alternative points of view (What if? How does the enemy or that other person think about it?).

III. Effective Training: Development

Development (What media and materials?)

1. Best practice requires that we begin with a plan (design) and then develop media and materials based on the training plan.

– Currently, we most often begin with development of materials without a plan.

2. Conclusive proof that media influence cost of and access to training but NOT learning.

– SMEs influence accuracy of the task description.

– Design dictates training methods.

– Development influences access, time and cost.

III. Effective Training: Development

Development (What media and materials?)

3. Design and development must be compatible: all five direct instruction elements must be in all training exercises. Yet:

– If we don’t capture when and how to make decisions during analysis, we can’t teach essential knowledge.

– Our DL software makes practice and immediate corrective feedback mostly impossible.

– Serious games are most often based on discovery methods but could be based on direct instruction.

– Multimedia training often overloads students with too much information that is not relevant to objectives.


Multimedia Design

Give information with narration, not print.

Prevent novice or intermediate trainee control over:

1. Sequence of lessons, modules or information

2. Learning activities, for example, extra information or additional problems

Allow limited control of pacing

When presenting process or procedures, start with overview and use pointer words such as “first,” “second,” or “as a result.”

Development: Multi Media Development

10 Principles (students learn better when …), with median % increase in learning and number of tests (Mayer, 2009):

1. Extraneous visuals and sound are eliminated: 31% (5 tests)

2. On-screen text is not read aloud; narrate without text: 23% (5 tests)

3. Labels are placed next to the part of the graphic they describe: 38% (5 tests)

4. Narration and video are simultaneous, not separated: 42% (5 tests)

5. Pacing is student-controlled rather than system-controlled: 31% (3 tests)

6. Conceptual information comes just before the “how to”: 27% (5 tests)

7. Graphics + narration are used, not text + animation: 32% (7 tests)

8. Pictures are added to narration, not narration alone: 42% (11 tests)

9. Style is conversational (“you” or “we”), not formal: 36% (11 tests)

10. A human voice is used rather than a machine voice: 25% (3 tests)
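As a rough illustration (not part of the workshop materials), Mayer’s principles can be treated as a prioritized checklist: the figures below are transcribed from the summary above, and the ranking logic is simply an assumption that a designer would fix the highest-payoff violations first.

```python
# Illustrative sketch: rank Mayer's multimedia principles by the reported
# median % learning gain, so a course designer can prioritize revisions.
# Percentages are transcribed from the summary table (Mayer, 2009).

PRINCIPLES = {
    "cut extraneous visuals and sound": 31,
    "narrate instead of reading on-screen text": 23,
    "place labels next to the graphic part described": 38,
    "keep narration and video simultaneous": 42,
    "let students control pacing": 31,
    "give conceptual info just before the how-to": 27,
    "use graphics + narration, not text + animation": 32,
    "add pictures to narration": 42,
    "use conversational style ('you' or 'we')": 36,
    "use a human voice, not a machine voice": 25,
}

def by_payoff(principles):
    """Return (principle, % gain) pairs sorted from largest to smallest gain."""
    return sorted(principles.items(), key=lambda kv: kv[1], reverse=True)

# Show the three highest-payoff principles to address first.
for name, gain in by_payoff(PRINCIPLES)[:3]:
    print(f"{gain}%  {name}")
```

Simultaneous narration/video and pictures-with-narration (42% each) come out on top under this ordering, matching the table.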


IV. Effective Training: Implementation

Implementation (How should we deliver training?)

How do we decide between classroom and electronic media?

1) Can sensory modes and application context be simulated by available delivery media? (video, computer, instructor)

2) Can practice be observed and feedback provided by computer?

3) If “no” to any part of 1) or 2), offer classroom or synchronous live instruction for those parts.

4) If 1) and 2) are both “yes,” use the ingredients method of cost/utility analysis; if the computer is cost-beneficial, use it.
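The four steps above amount to a small decision procedure. As a minimal sketch (the function name, parameters, and cost figures are illustrative assumptions, not from the workshop):

```python
# Hypothetical sketch of the media-selection steps above. A "no" on either
# question sends that training component to the classroom; if both are "yes",
# the cheaper option wins under an ingredients-style cost comparison.

def choose_delivery(sensory_ok, practice_feedback_ok, computer_cost, classroom_cost):
    """Recommend a delivery mode for one training component.

    sensory_ok           -- can available media simulate the needed sensory
                            modes and application context? (question 1)
    practice_feedback_ok -- can practice be observed and corrective feedback
                            given by computer? (question 2)
    computer_cost,       -- total "ingredients" costs (development, delivery,
    classroom_cost          instructor time) for each option; figures assumed
    """
    # Step 3: any "no" means classroom or synchronous live instruction.
    if not (sensory_ok and practice_feedback_ok):
        return "classroom"
    # Step 4: both "yes" -> use the computer only if it is cost-beneficial.
    return "computer" if computer_cost < classroom_cost else "classroom"

# A component the computer can both simulate and give feedback on, where
# distance delivery is cheaper than running a classroom course:
print(choose_delivery(True, True, computer_cost=40_000, classroom_cost=65_000))   # -> computer
# The same component, but practice cannot be observed by computer:
print(choose_delivery(True, False, computer_cost=40_000, classroom_cost=65_000))  # -> classroom
```

In practice the decision is made per component, so one course can mix computer-delivered parts with classroom parts, as step 3 implies.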


V. Effective Training: Evaluation

Evaluation (How will we measure success?) –

Kirkpatrick 4 Level Model:

1. Reactions only tell us about student confidence and values – NOT learning or performance.

2. Learning assessment must tap both conceptual knowledge and application skill.

– Reactions can be negative and learning positive.

– Learning may not transfer.

3. Transfer assessment must capture performance, not opinions about performance.

4. Reactions, learning and transfer may be positive but the problem we were trying to solve may not be solved.


Questions? Comments?

THANK YOU!

CLARK@USC.EDU
