
A technology using feedback to manage experience based learning

TIM DORNAN, MARTIN BROWN, DAN POWLEY & MIKE HOPKINS
UMIST and Hope Hospital, University of Manchester School of Medicine, UK

Correspondence: Dr Tim Dornan, Hope Hospital, Stott Lane, Salford, Manchester M6 8HD, UK. Tel: +44 (0) 161 206 5153; fax: +44 (0) 161 206 5989; email: [email protected]

SUMMARY The aim was to establish how ICT could apply feedback principles to experience based learning. Based on a survey of student and staff requirements, we developed a personalized educational technology (‘iSUS’) that: (1) made clear to students what they should learn; (2) helped them meet appropriate real patients; (3) encouraged reflective feedback; (4) calculated benchmarks from accumulated feedback; (5) compared individual students’ feedback against those benchmarks; (6) matched clinical activities to curriculum objectives; (7) gave feedback to teachers and course leads. Bench testing proved the system usable. During seven weeks of real-time use, a whole year group of 111 students fed back on 1183 learning episodes. Five hundred and forty-one (46%) of the feedback episodes were self-initiated. We have successfully prototyped an application of feedback principles to experience based learning that students seem to find useful.

Introduction

Even in problem-based learning (PBL) curricula, there are many obstacles to integrative, self-directed learning in clinical settings (Dornan et al., 2004a; Patel et al., 2002). We have used IT to make the many activities of a university hospital more accessible (Foster & Dornan, 2003), but students remained very unclear which activity to choose. We hypothesized that the technology would perform better if it made students more aware of their curriculum objectives and encouraged them to ‘close the learning loop’ by giving feedback. Not only would that help their individual learning by encouraging reflection, but it could also help other students make informed choices and show teachers, course leaders and managers how cost-effectively they were meeting students’ learning needs. We decided to test the hypothesis by designing, building and evaluating a technology to tackle this complex process.

Methods

This research was conducted within a curriculum that continues PBL into clerkships (O’Neill, 1998). Signups are pre-arranged, one-off attendances at clinical activities that complement what a student’s placement provides (Foster & Dornan, 2003). The technology was named the ‘intelligent Signup System’ (iSUS) because it would tailor the presentation of information to individual need (Figure 1). Five students, two teachers and one education manager took part in semi-structured, in-depth, audio-recorded interviews to determine how feedback could be gathered and presented back, and how experiences could be recommended. Two project workers analyzed the interview transcripts, generated use cases, storyboarded the main functions, and presented them back to users at a design workshop. They prototyped iSUS using a three-tier architecture: a relational database (MS Access) and a web/application server (MS ASP/IIS) delivered HTML pages to the client browser.
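For readers unfamiliar with the pattern, the following is a minimal sketch of the three-tier idea in Python rather than the classic ASP/Access stack the project actually used; the table and column names are hypothetical illustrations, not the real iSUS schema.

```python
# Minimal sketch of the three-tier idea only; iSUS itself used MS Access,
# classic ASP and IIS. Table and column names here are hypothetical.
import sqlite3

def data_tier(db_path: str, student_id: int) -> list[tuple[str, str]]:
    """Data tier: fetch the signups a student has booked."""
    with sqlite3.connect(db_path) as conn:
        return conn.execute(
            "SELECT s.title, s.scheduled_for "
            "FROM booking b JOIN signup s ON s.id = b.signup_id "
            "WHERE b.student_id = ?",
            (student_id,),
        ).fetchall()

def application_tier(db_path: str, student_id: int) -> str:
    """Application tier: turn database rows into the HTML sent to the browser."""
    rows = data_tier(db_path, student_id)
    items = "".join(f"<li>{title} ({when})</li>" for title, when in rows)
    return f"<html><body><h1>Your signups</h1><ul>{items}</ul></body></html>"

# Presentation tier: the student's browser simply renders the returned HTML.
```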

intelligent Signup System (iSUS) personalisation

Each student, teacher or course leader had a personal homepage. A student, for example, could review the objectives of their current module, what signups they had attended, what reflective comments they had recorded, and how their cumulated experience compared with that of their peer group.

Objectives

Curriculum objectives were made the organizing principle of iSUS, both in its database structure and by organizing the homepage and screens around them. The screens gave priority to a student’s current module but allowed them to record learning related to other modules whenever opportunities presented themselves.

Helping students meet appropriate patients

The centrality of curriculum objectives allowed iSUS to rank signups ‘intelligently’ by how well they matched the user’s current learning need. Relevance was calculated by matching a student’s aggregated experience at that point in time to the aggregate of objectives other students had met by attending the signup. This design emulates commercial websites that give individualized recommendations based on other customers’ feedback. Signups could also be ranked by teacher, availability, date and time, and other students’ rating of their quality. They could be browsed freely, always with the facility to view other students’ feedback on them.
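The exact relevance formula is not given here, so the sketch below is only one plausible reading of the matching idea: weight each objective by how far the student still is from full coverage, and credit a signup according to how much coverage past attendees reported for that objective (using the 0–2 scale described under ‘Feedback form’). All names and data are hypothetical.

```python
# Illustrative relevance scoring only; not the formula iSUS actually used.
from collections import defaultdict

def relevance(student_scores: dict[str, int],
              signup_feedback: list[dict[str, int]],
              target: int = 2) -> float:
    """student_scores: objective -> student's accumulated 0-2 coverage.
    signup_feedback: one objective->score dict per past feedback episode."""
    if not signup_feedback:
        return 0.0
    # Average coverage past attendees reported for each objective.
    totals: dict[str, float] = defaultdict(float)
    for episode in signup_feedback:
        for objective, score in episode.items():
            totals[objective] += score
    mean_coverage = {obj: s / len(signup_feedback) for obj, s in totals.items()}
    # Weight each objective by how far the student still is from the target.
    return sum(
        max(target - student_scores.get(obj, 0), 0) * cover
        for obj, cover in mean_coverage.items()
    )

# Example: a student weak on "heart failure" sees a clinic that past
# attendees say covered it rank above one that did not.
me = {"heart failure": 0, "diabetes": 2}
clinic_a = [{"heart failure": 2, "diabetes": 1}, {"heart failure": 2}]
clinic_b = [{"diabetes": 2}]
assert relevance(me, clinic_a) > relevance(me, clinic_b)
```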

Feedback form

Designed to take fewer than five minutes to complete, the form contained:

• Check-boxes to record whether either the student or the teacher did not attend, and whether the experience allowed active participation;


• A quantitative (Likert) rating of the quality of the experience;
• Radio buttons that assigned a value of 0, 1 or 2 to each module objective, according to what students learned from it;
• Textual, reflective feedback on the experience.

The system ‘knew’ which signups students had attended and asked for feedback when they next logged in, giving them only two chances to defer before blocking future bookings until the feedback had been received. A student could also call up the feedback form at will to make reflective entries on self-initiated learning.
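As a concrete illustration of the form’s contents and the deferral rule, here is a minimal sketch under the assumption that each pending feedback request simply counts how often it has been deferred; the class and field names are illustrative, not the real iSUS data model.

```python
# Illustrative data model and booking gate; not the real iSUS schema.
from dataclasses import dataclass, field
from typing import Optional

MAX_DEFERRALS = 2  # students could put feedback off only twice

@dataclass
class FeedbackForm:
    student_attended: bool
    teacher_attended: bool
    active_participation: bool
    quality: int                      # Likert rating of the experience
    objective_scores: dict[str, int]  # 0, 1 or 2 per module objective
    reflection: str = ""              # free-text reflective feedback

@dataclass
class PendingFeedback:
    signup_id: int
    deferrals: int = 0
    form: Optional[FeedbackForm] = None

@dataclass
class Student:
    pending: list[PendingFeedback] = field(default_factory=list)

    def defer(self, signup_id: int) -> None:
        """Record one more deferral of an outstanding feedback request."""
        for p in self.pending:
            if p.signup_id == signup_id and p.form is None:
                p.deferrals += 1

    def may_book(self) -> bool:
        """Future bookings are blocked once any outstanding feedback has
        been deferred the maximum number of times."""
        return all(p.form is not None or p.deferrals < MAX_DEFERRALS
                   for p in self.pending)
```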

Presentation of cumulated feedback

To the individual student

The programme presented a student’s aggregate clinical experience on a bar chart and benchmarked it against the mean of the peer group and an absolute criterion of adequacy. A mouse click on the graph led the student from reflection on their accumulated experience to a menu of learning opportunities that could supplement it. Another click led to the homepage of an individual signup, a further one checked its availability, and a final one booked a place to attend it.
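A minimal sketch of the benchmark comparison, assuming the aggregate clinical experience can be summarized as a single cumulated score; the threshold and the wording are illustrative, and the real iSUS presented the comparison graphically.

```python
# Illustrative benchmark comparison; iSUS showed this as a bar chart.
def benchmark(my_total: float, peer_totals: list[float], criterion: float) -> str:
    """Compare a student's cumulated score with the peer mean and with an
    absolute criterion of adequacy."""
    peer_mean = sum(peer_totals) / len(peer_totals) if peer_totals else 0.0
    vs_peers = "above" if my_total >= peer_mean else "below"
    adequate = "meets" if my_total >= criterion else "does not yet meet"
    return (f"Your score {my_total:.1f} is {vs_peers} the peer mean "
            f"({peer_mean:.1f}) and {adequate} the criterion ({criterion:.1f}).")

print(benchmark(7.0, [5.0, 6.0, 9.0], criterion=8.0))
# Your score 7.0 is above the peer mean (6.7) and does not yet meet the criterion (8.0).
```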

To a teacher

Likewise, numerical ratings and textual comments could be aggregated and compared with other activities or with the equivalent activity in previous years.

To academic leads and managers

It was anticipated that these ‘super users’ would have less predictable information needs. Accordingly, their access to the data was to be provided in the form of pivot tables, allowing them to choose dimensions (signups, students, learning objectives, time, etc.) for comparison.
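As an illustration of the pivot-table idea, the following sketch uses pandas; the column names stand in for whatever the iSUS feedback table actually contained and are assumptions.

```python
# Illustrative pivot over hypothetical feedback records.
import pandas as pd

feedback = pd.DataFrame([
    {"signup": "Cardiology clinic", "student": "A", "objective": "heart failure", "score": 2},
    {"signup": "Cardiology clinic", "student": "B", "objective": "heart failure", "score": 1},
    {"signup": "Diabetes clinic",   "student": "A", "objective": "diabetes",      "score": 2},
])

# A course lead could choose any dimensions to compare, e.g. mean
# objective score by signup:
table = feedback.pivot_table(index="signup", columns="objective",
                             values="score", aggfunc="mean")
print(table)
```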

Evaluation

Bench testing

Twelve tasks representing a typical user session were tested on twelve users (ten students and two teachers). Each user was asked to:

• Provide a solution to each task;
• Rate two Likert items evaluating how well the system supported the task and how usable it was;
• Provide (optionally) a free-text statement describing any issues that had arisen from the task.

A researcher checked that the subject had understood each scenario and gave help when needed. Each user then completed a questionnaire recording their previous IT experience and their overall rating of the system.

Field trial

After pilot use by three groups of eight students, analysed in detail and reported elsewhere, iSUS was advanced to real-time use over seven weeks by a whole third-year group of 111 Hope Hospital students, evaluated by analysing the system’s databases.

Results

Bench testing

Ninety-three percent of tasks were successfully completed and median ratings for support and usability were all above the midpoint of the scale. In their free-text comments, two student users commented adversely on the complexity of the system, whilst others commented favourably on its usability and the guidance it gave.

Field trial

The 111 students fed back on 1183 learning episodes (1.5 per student per week). Six hundred and forty-two episodes of feedback were prompted by the system and 541 (46%) were initiated by the student.
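A quick arithmetic check of the reported figures (illustrative only):

```python
# Reported field-trial figures and the derived rates quoted in the text.
students, weeks, episodes = 111, 7, 1183
prompted, self_initiated = 642, 541

assert prompted + self_initiated == episodes
print(round(episodes / (students * weeks), 1))  # ~1.5 episodes per student per week
print(round(100 * self_initiated / episodes))   # ~46% self-initiated
```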

Discussion

Throughout our integrative phase 2 and 3 curriculum, students have hospital specialty placements to provide access to an appropriate case mix and to help them feel they belong in the clinical environment. Concurrently, they spend one day per week in primary care. We originally pinned our hopes on PBL tutorials to give students clear objectives for, and an opportunity to feed back on, placement learning, but we were rather disappointed (Dornan et al., 2004a).

Figure 1. The iSUS learning cycle. Students come to their individual homepages having had experience in the clinical environment, enter feedback, review their progress against the module objectives and the progress of their peers, identify gaps in their learning, identify experiences to fill them, and review their peers’ feedback on the experience before committing themselves to attend it.


Teachers’ interests and the diseases encountered opportunistically seemed to have a disproportionate influence. This study shows how ICT could focus learning back on the curriculum objectives, promote feedback, and help students, in the language of reflective learning, to ‘plan new actions’ down to a particular activity at a particular time in a particular place. The system’s potential to evaluate the curriculum is as yet untapped, but is a focus for our continuing research (Dornan et al., 2004b). Studies on reflective/portfolio learning show the importance of mentoring (Pearson & Heywood, 2004), so we predict that iSUS will not reach its maximum potential until self-evaluation is supplemented by a small group in which there is reflective debriefing on students’ experience based learning, tutored by a practitioner/mentor.

Practice points

• IT can be used to support and partially direct a medical student’s learning, and to ‘close the feedback loop’.
• Even in a self-directed curriculum, students value advice and guidance about what to learn and how to learn it.
• Feedback is of interest to several groups of stakeholders: students, administrative staff, academic course leaders, and those responsible for quality management and enhancement.
• This study shows how IT can manage the process, as opposed to the content, of learning.

Notes on contributors

TIM DORNAN, Consultant Physician and Educationalist, had the original idea of signups and of developing a directive learning management system. He supervised educational and clinical aspects.

MARTIN BROWN, Senior Lecturer in Computing and Mathematics at UMIST, devised the feedback strategy embodied in iSUS and the relevance index, and supervised computational aspects.

DAN POWLEY developed iSUS as project work for his computing science MSc degree and continues to develop it.

MIKE HOPKINS co-developed iSUS and also wrote an MSc thesis on it.

References

DORNAN, T., SCHERPBIER, A., KING, N. & BOSHUIZEN, H. (2004a) Clinical teachers and problem based learning: phenomenological study, Med. Educ., in press.

DORNAN, T., BOSHUIZEN, H., CORDINGLEY, L., HIDER, S., HADFIELD, J. & SCHERPBIER, A. (2004b) Evaluation of self-directed clinical education: validation of an instrument, Med. Educ., in press.

FOSTER, M. & DORNAN, T. (2003) Self-directed, integrated clinical learning through a signup system, Med. Educ., 37, pp. 656–659.

O’NEILL, P.A. (1998) Problem-based learning alongside clinical experience: reform of the Manchester curriculum, Education for Health, 11, pp. 37–48.

PATEL, L., BUCK, P., DORNAN, T.L. & SUTTON, A. (2002) Child Health and Obstetrics-Gynaecology in a problem-based curriculum: accepting the limits of integration and the need for differentiation, Med. Educ., 36, pp. 261–271.

PEARSON, D.J. & HEYWOOD, P. (2004) Portfolio use in general practice vocational training: a survey of GP registrars, Med. Educ., 38, pp. 87–95.
