Looking backwards to move forwards: seminal research that has influenced key researchers in the field of Computer Assisted Assessment


DESCRIPTION

Findings from a survey of key researchers in the computer assisted assessment field to identify the seminal research in this field to date.

TRANSCRIPT

Looking backwards to move forwards

Denise Whitelock
Institute of Educational Technology

The Open University

d.m.whitelock@open.ac.uk

Seminal research that has influenced key researchers in the field of Computer Assisted Assessment

Outline

Seminal literature

Analysis of responses

Quick survey

Discussion

Overview


Objectives

1. To understand what is meant by seminal work in the field
2. To investigate key researchers' understanding of seminal work
3. To analyse the findings
4. To identify the gaps that need to be filled in the current context


Seminal literature for CAA

• What are the classics?
• What's in the CAA archive?
• What should all your students read?


Ask the experts

• 12 subjects

• Epistolary interviews

• Facilitated dialogue

• Seminal literature redefined

• What is seminal in our time?

• Context changes focus


Subjects

• 5 women

• 7 men

• Average age approx. 48 years

• Majority professorial status

• Publications that have had impact


The experts' response

• Difficult question
• Why do you want to know?
• What influenced me in my research
• What I think is a big influence for now
• Paper often before its time
• Ignore citation index
• Looking more to a 4* paper?

Top articles

Journal article (number of responses):

• Bennett, R.E. (2002) Inexorable and Inevitable: The Continuing Story of Technology and Assessment. Journal of Technology, Learning and Assessment, Vol. 1, No. 1 (6 responses)

• Collins, Hawkins and Frederiksen (1994) Three Different Views of Students: The Role of Technology in Assessing Student Performance. Journal of the Learning Sciences, 3(2), 205-217 (5 responses)

• Sleeman and Brown (Eds.) (1982) Intelligent Tutoring Systems. Academic Press (7 responses)

• Nicol and Macfarlane-Dick (2006) Formative Assessment and Self-Regulated Learning: A Model and Seven Principles of Good Feedback Practice. Studies in Higher Education, 31(2), 199-218 (9 responses)

• Ashton, H.S., Beevers, C.E. et al. (2006) Automatic Measurement of Mathematical Ability in Secondary Education. BJET, 37(1), 93-119 (4 responses)

• Landauer, Laham and Foltz (2003) Automatic Essay Assessment. http://www.tandfonline.com/doi/abs/10.1080/0969594032000148154 (7 responses)

• Whitelock, D., Watt, S., Raw, Y. and Moreale, E. (2003) 'Analysing Tutor Feedback to Students: First Steps Towards Constructing an Electronic Monitoring System'. Association for Learning Technology Journal (ALT-J), Vol. 11, No. 3 (3 responses)


Main categories

• Automatic essay marking

• Modelling

• History recaps

• Automatic Assessment in Anger

• Feedback, both automatic and non-automatic


Automatic marking of free text entry

• Open comment

• Mitchell

• Science at the OU

• Jordan

• SafeSea: a new EPSRC project (Whitelock and Pulman)


Open Mentor: feedback to tutors

What is Open Mentor?
• A learning support tool for tutors that provides reflective comments on the feedback they add to students' assignments

How does it work?
• Flanders' categories proved inappropriate
• Bales' categories were adopted instead
• Open Mentor provides tutors with guidance by analysing the comments and grouping them into four major categories, as sketched below
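The grouping step can be pictured with a minimal sketch. This is not the actual Open Mentor implementation, which the slides do not detail; the cue phrases, rules and function names below are invented assumptions, illustrating only the idea of sorting tutor comments into the four conflated Bales' categories used in the H801 analysis that follows.

```python
# Minimal sketch only: Open Mentor's real classifier is not described
# in the slides, so the cue phrases and rules below are invented.

# The four conflated Bales' Interactional Categories.
CATEGORIES = {
    "A": "Positive reactions",
    "B": "Responses",
    "C": "Questions",
    "D": "Negative reactions",
}

# Hypothetical cue phrases; a real system would use a trained model
# or a much richer rule set.
POSITIVE_CUES = ["well done", "excellent", "good point", "clearly argued"]
NEGATIVE_CUES = ["weak", "incorrect", "fails to", "muddled"]

def classify_comment(comment: str) -> str:
    """Assign one tutor comment to a conflated Bales' category."""
    text = comment.lower()
    if "?" in text:
        return "C"   # Questions
    if any(cue in text for cue in POSITIVE_CUES):
        return "A"   # Positive reactions
    if any(cue in text for cue in NEGATIVE_CUES):
        return "D"   # Negative reactions
    return "B"       # Responses (neutral answers and suggestions)

def profile(comments: list[str]) -> dict[str, int]:
    """Count comments per category across one marked assignment."""
    counts = dict.fromkeys(CATEGORIES, 0)
    for comment in comments:
        counts[classify_comment(comment)] += 1
    return counts

print(profile([
    "Well done, this section is clearly argued.",
    "Have you considered the counter-evidence?",
    "See chapter 3 for the standard derivation.",
    "The final paragraph is muddled.",
]))  # {'A': 1, 'B': 1, 'C': 1, 'D': 1}
```

Whatever the real mechanism, the output is the same shape as this profile: per-category counts per script, which is what the trend graphs on the next slides plot.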


Identifying trends: H801

[Bar chart: Bales' Interactional Categories at each pass level (A-D, Pass 1-4) against number of incidences (0-25)]

Graph shows conflated Bales’ categories against mean number of incidences in H801 scripts


Identifying trends: H801

Pie chart shows the mean number of incidences per pass per conflated Bales' Interactional Category for all four levels of pass in H801 scripts:

A = Positive reactions: 5.96
B = Responses: 17.13
C = Questions: 5.73
D = Negative reactions: 1.61


HEA-funded Synthesis Report on Assessment and Feedback

• Consult the academic community on useful references
  – Seminar series
  – Survey
  – Advisors
  – Invited contributors
• Prioritise evidence-based references
• Synthesise main points
• For readers:
  – Academics using technology enhancement for assessment and feedback
  – Learning technologists
  – Managers of academic departments


Evidence-based literature

• 142 references

• Technology-enhanced methods

• Use for assessment and feedback

• Type of evidence

• Ease of access (18 could not be retrieved)


Categories of evidence used

Category 1a: Peer-reviewed generalizable study providing effect size estimates and which includes (i) some form of control group or treatment (may involve participants acting as their own control, such as before and after), and/or (ii) blind or preferably double-blind protocol.

Category 1b: Peer-reviewed generalizable study providing effect size estimates, or sufficient information to allow estimates of effect size.

Category 2: Peer-reviewed 'generalizable' study providing quantified evidence (counts, percentages, etc.) short of allowing estimates of effect sizes.

Category 3: Peer-reviewed study.

Category 4: Other reputable study providing guidance.


Number of references recommended in each evidence category

Evidence category: number of references recommended (cumulative %)

1a: 15 (12.1%)
1b: 8 (18.5%)
2: 12 (28.2%)
3: 49 (67.7%)
4: 40 (100.0%)

Total: 124
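These cumulative percentages follow directly from the per-category counts (124 retrievable references of the original 142); a quick Python sketch, assuming nothing beyond the counts shown above, reproduces them:

```python
# Derive the cumulative percentages in the table above from the
# per-category counts (124 retrievable references in total).
counts = {"1a": 15, "1b": 8, "2": 12, "3": 49, "4": 40}

total = sum(counts.values())  # 124
running = 0
for category, n in counts.items():
    running += n
    print(f"{category}: {n} refs, cumulative {100 * running / total:.1f}%")

# Output:
# 1a: 15 refs, cumulative 12.1%
# 1b: 8 refs, cumulative 18.5%
# 2: 12 refs, cumulative 28.2%
# 3: 49 refs, cumulative 67.7%
# 4: 40 refs, cumulative 100.0%
```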


How do the findings compare with Gilbert, Whitelock and Gale HEA study?

• All Journal articles

• No practice guides

• More technical papers

• History of deep questions showing the early struggles in the field

• David Nicol's work common to both

• Whitelock’s work common to both


Advice for Action

Characteristics and descriptors:

Authentic: involving real-world knowledge and skills
Personalised: tailored to the knowledge, skills and interests of each student
Negotiated: agreed between the learner and the teacher
Engaging: involving the personal interests of the students
Recognises existing skills: willing to accredit the student's existing work
Deep: assessing deep knowledge, not memorization
Problem oriented: original tasks requiring genuine problem-solving skills
Collaboratively produced: produced in partnership with fellow students
Peer and self assessed: involving self-reflection and peer review
Tool supported: encouraging the use of ICT

Elliott’s characteristics of Assessment 2.0 activities


What's under the bonnet? Algorithms or heuristics?

• Well! It's all code
• Pros and cons
• Formalising models
• Pedagogical theory operationalised: OPEN TO TEST (see the sketch below)
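The point that operationalised pedagogy becomes open to test can be made concrete with a small hypothetical sketch: one feedback principle is (crudely) reduced to a checkable rule, so the rule itself can be inspected, tested and disputed. The word list and threshold are illustrative assumptions, not anyone's published model.

```python
# Illustrative assumption: reduce "feedback should be informative,
# not just evaluative" to an explicit, testable heuristic. Once the
# rule is code, it is open to test, and open to challenge.

BARE_JUDGEMENTS = {"good", "fine", "ok", "poor", "weak", "wrong"}

def is_informative(feedback: str, min_words: int = 8) -> bool:
    """Heuristic: informative feedback says more than a bare judgement."""
    words = [w.strip(".,:;!") for w in feedback.lower().split()]
    return len(words) >= min_words and not set(words) <= BARE_JUDGEMENTS

# Because the rule is explicit, its failure cases are explicit too:
assert not is_informative("Good.")
assert is_informative(
    "Good: the argument links the evidence in section 2 to your conclusion."
)
```

The design choice is the slide's point: a vague guideline cannot be falsified, but a formalised model, however simplistic, can be run against real tutor comments and shown to be right or wrong.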


e-Assessment futures

• Free text entry
• Adaptive testing
• Automatic marking
• Advice for Action
• Learning analytics and data mining
• Motivation: badges and Dweck


Four assessment special issues

• Brna, P. & Whitelock, D. (Eds.) (2010) Special issue of the International Journal of Continuing Engineering Education and Life-long Learning, focusing on electronic feedback: Feasible progress or just unfulfilled promises? Vol. 20, No. 2

• Whitelock, D. (Ed.) (2009) Special issue of the British Journal of Educational Technology (BJET) on e-Assessment: Developing new dialogues for the digital age. Vol. 40, No. 2

• Whitelock, D. and Watt, S. (Eds.) (2008) Reframing e-assessment: adopting new media and adapting old frameworks. Learning, Media and Technology, Vol. 33, No. 3

• Whitelock, D. and Warburton, W. (2010) Special issue of the International Journal of e-Assessment (IJEA) entitled 'Computer Assisted Assessment: Supporting Student Learning'
