ASSESSMENT: DIVERSITY OF STRATEGIES - Chris Rust, Oxford Brookes University

TRANSCRIPT

  • Slide 1
  • ASSESSMENT: DIVERSITY OF STRATEGIES - Chris Rust, Oxford Brookes University
  • Slide 2
  • Student learning & assessment
    "Assessment is at the heart of the student experience" (Brown, S. & Knight, P., 1994)
    "From our students' point of view, assessment always defines the actual curriculum" (Ramsden, P., 1992)
    "Assessment defines what students regard as important, how they spend their time and how they come to see themselves as students and then as graduates... If you want to change student learning then change the methods of assessment" (Brown, G. et al., 1997)
  • Slide 3
  • Session outline: 6 strategies
    1. Change the criteria
    2. Change the task
    3. Mechanise assessment*
    4. Assess the process
    5. Assess groups
    6. Involve the students*
    plus: strategic programme decisions*
    (* can reduce staff workload)
  • Slide 4
  • Purposes of assessment (adapted from Brown, G. et al., 1997). To:
    - motivate students
    - diagnose a student's strengths and weaknesses
    - help students judge their own abilities
    - provide a profile of what each student has learnt
    - provide a profile of what the whole class has learnt
    - grade or rank a student
    - permit a student to proceed
    - select for future courses
    - license for practice
    - select, or predict success, in future employment
    - provide feedback on the effectiveness of the teaching
    - evaluate the strengths and weaknesses of the course
    - achieve/guarantee respectability and gain credit with other institutions and employers
  • Slide 5
  • Purposes of assessment 2
    1. Motivation
    2. Creating learning activities
    3. Providing feedback
    4. Judging performance (to produce marks, grades, degree classifications; to differentiate; gatekeeping; qualification)
    5. Quality assurance
    Purposes 1, 2 & 3 concern learning and perform a largely formative function; they should be fulfilled frequently. Purposes 4 & 5 are largely summative functions; they need to be fulfilled infrequently, but well.
  • Slide 6
  • Formative vs summative
    Formative: the focus is to help the student learn.
    Summative: the focus is to measure how much has been learnt.
    The two are not necessarily mutually exclusive, but summative assessment tends to:
    - come at the end of a period or unit of learning
    - focus on judging performance, grading, differentiating between students, and gatekeeping
    - be of limited or even no use for feedback
  • Slide 7
  • Constructive alignment & issues of validity
    "The fundamental principle of constructive alignment is that a good teaching system aligns teaching method and assessment to the learning activities stated in the objectives, so that all aspects of this system are in accord in supporting appropriate student learning" (Biggs, 1999)
  • Slide 8
  • Constructive alignment: 3-stage course design
    1. What are the desired outcomes?
    2. What teaching methods require students to behave in ways that are likely to achieve those outcomes?
    3. What assessment tasks will tell us whether the actual outcomes match those that are intended or desired?
    This is the essence of constructive alignment (Biggs, 1999).
  • Slide 9
  • Change the criteria
    - Essay: library and journals example
    - Laboratory reports: Information Communication Technology* skills example (* spreadsheets, statistical packages, word-processing, graphics, etc.)
    Task: Fill in the skills checklist for the average student at a particular stage on one of your courses. Where you have high importance ratings but low skills ratings, consider ways those desired skills could be highlighted by changing the criteria.
  • Slide 10
  • Change the task
    Traditional assessment samples a narrow range of abilities. Issues to consider:
    - Validity
    - Transferability
    - Relevance/interest/motivation
    - Plagiarism and cheating
    NB: sense of audience and real purpose
  • Slide 11
  • Change the task: task
    Take the most traditional assessment from one of your courses and invent as many different assessment tasks as possible. Keep especially in mind the issue of validity and the learning outcome(s) being assessed, and try to ensure that each new task has a sense of real audience and purpose.
  • Slide 12
  • Mechanise assessment
    1. Statement banks
    2. Computer-aided assessment
    3. Assignment attachment sheets
  • Slide 13
  • Mechanise assessment 1: statement banks
    Write out frequently used feedback comments, for example:
    1. I like this sentence/section because it is clear and concise
    2. I found this paragraph/section/essay well organised and easy to follow
    3. I am afraid I am lost. This paragraph/section is unclear and leaves me confused as to what you mean
    4. I would understand and be more convinced if you gave an example/quote/statistic to support this
    5. It would really help if you presented this data in a table
    6. This is an important point and you make it well
    etc.
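    A statement bank lends itself to very simple tooling: number the standard comments once, then assemble each student's feedback from the numbers jotted down while marking. The Python sketch below is purely illustrative; the dictionary layout and the compile_feedback function are assumptions made for the example, and only the comment texts themselves come from the slide above.

    # Illustrative sketch of a statement bank: frequently used feedback
    # comments stored once and referenced by number when marking.
    # The comment texts come from the slide; the function and its name
    # are hypothetical, not part of the original presentation.
    STATEMENT_BANK = {
        1: "I like this sentence/section because it is clear and concise.",
        2: "I found this paragraph/section/essay well organised and easy to follow.",
        3: ("I am afraid I am lost. This paragraph/section is unclear and "
            "leaves me confused as to what you mean."),
        4: ("I would understand and be more convinced if you gave an "
            "example/quote/statistic to support this."),
        5: "It would really help if you presented this data in a table.",
        6: "This is an important point and you make it well.",
    }

    def compile_feedback(comment_numbers, extra_note=""):
        """Assemble a feedback sheet from statement-bank numbers noted while marking."""
        lines = [STATEMENT_BANK[n] for n in comment_numbers]
        if extra_note:
            lines.append(extra_note)
        return "\n".join(lines)

    # Example: the marker notes "2, 4, 6" against an essay and adds one bespoke remark.
    print(compile_feedback([2, 4, 6], extra_note="See me about your referencing."))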
  • Slide 14
  • Weekly CAA testing case study data
    Student | Week 1 | Week 2 | Week 3 | Week 4 | Week 5 | Week 6 | Week 7
    A       |   57   |   63   |   21   |   35   |   40   |   27   |   20
    B       |   68   |   71   |   45   |   79   |   83   |   80   |   77
    C       |   23   |   21   |   11   |   -    |   -    |   -    |   -
    D       |   45   |   51   |   45   |   79   |   83   |   80   |   77
    E       |   -    |   -    |   -    |   -    |   -    |   -    |   -
    F       |   63   |   -    |   51   |   -    |   47   |   -    |   35
    G       |   54   |   58   |   35   |   50   |   58   |   60   |   62
    (Brown, Rust & Gibbs, 1994)
  • Slide 15
  • CAA quizzes
    Scenario: First-term, first-year compulsory law module. A new subject for most (75%) students. High failure rate (25%) and poor general results (28% 3rd class, 7% 1st).
    Solution: Weekly optional WebCT quizzes (50% take-up).
    Outcome:
    - Quiz takers: 4% fail, 14% 3rd class, 24% 1st
    - Non-quiz takers: same pattern as before
    - Overall: 14% fail (approx. half the previous figure), 21% 3rd class, 14% 1st (double the previous figure)
  • Slide 16
  • Assess groups: major issues
    Reasons: active learning/engagement/exploratory talk; to develop interpersonal/group skills; to produce a bigger, more complex product/outcome; pragmatic, logistical reasons (e.g. staff time/limited resources)
    Scale: size of group (pairs, triads, 4-6); length of time; size/complexity of outcome
    Composition of the group, and how it is chosen
    Need for preparation, training and/or guidance
    Assessment: none; process vs product; formative/feedback only; summative (N.B. fairness)
  • Slide 17
  • Assess groups: preparation, training and guidance
    - Reflection on previous groupwork experience(s)
    - Negative brainstorm, leading to a set of guidelines and possibly a contract
    - Definition/allocation of roles
    - Guidelines on process, e.g. minutes, project plan, etc.
    - Consideration of how to deal with problems
    - Team skills development checklist
  • Slide 18
  • Assessing groups: yellow card system
    The assignment referee (or dealing with dysfunctional group members)
    - White card: the offence has been noted and the group/tutors have voted for a white card. A recorded warning, but no further penalty.
    - Green card: a further offence has been recorded; a green card has been voted for by the group and seconded by tutors. 5 penalty points, and a further offence will incur a yellow card.
    - Yellow card: a further offence has been recorded; a yellow card has been voted for by the group and seconded by tutors. 10 further penalty points, and a further offence will incur a red card.
    - Red card: exclusion from the group and 0 marks for the project (this means you would be required to re-take the 7410 module). This individual has been judged: (a) not to have made any meaningful contribution to the group over term 2 and at least half of term 3; (b) their behaviour has seriously disrupted the efforts of the rest of the group.
    (Retail Management Field, Oxford Brookes Business School)
  • Slide 19
  • Involve the students 1: self-assessment
    - Strengths of this piece of work
    - Weaknesses in this piece of work
    - How this work could be improved
    - The grade it deserves is...
    - What I would like your comments on
    "it is the interaction between both believing in self-responsibility and using assessment formatively that leads to greater educational achievements" (Brown & Hirschfeld, 2008)
  • Slide 20
  • Involve the students 2: peer marking using model answers
    Scenario: Engineering students had weekly maths problem sheets marked, plus problem classes. Increased student numbers meant marking became impossible and the problem classes were big enough to hide in. Students stopped doing the problems and exam marks declined (average 55% to 45%).
    Solution: A course requirement to complete 50 problem sheets, peer-assessed at six lecture sessions, though the marks do not count. Exams and teaching unchanged.
    Outcome: Exam marks increased (average 45% to 80%).
  • Slide 21
  • Involve the students 3: peer feedback
    Scenario: Geography students wrote two essays but showed no apparent improvement from one to the other, despite lots of tutor time spent writing feedback. Increased student numbers made the tutor workload impossible.
    Solution: Only one essay, but a first draft is required part way through the course. Students read and give each other feedback on their draft essays, then rewrite the essay in the light of the feedback. In addition to the final draft, students also submit a summary of how the 2nd draft has been altered from the 1st in the light of the feedback.
    Outcome: Much better essays.
  • Slide 22
  • Involve the students 4: peer feedback (Zeller, 2000*)
    The Praktomat system allows students to read, review, and assess each other's programs in order to improve quality and style. After a successful submission, the student can retrieve and review a program of some fellow student selected by Praktomat. After the review is complete, the student may obtain reviews and re-submit improved versions of his program. The reviewing process is independent of grading; the risk of plagiarism is narrowed by personalized assignments and automatic testing of submitted programs. In a survey, more than two thirds of the students affirm