
Page 1: Evaluating Electronic Voting Systems for Enhancing Student  Experience (EEVS)

Evaluating Electronic Voting Systems for Enhancing Student Experience (EEVS)
Marija Cubric and Amanda Jefferies, University of Hertfordshire

Heads of E-Learning Forum (HeLF) 27th Meeting, 31st October 2012, University of Westminster, London

Assessment and Feedback Programme, Strand B Projects: Evidence & Evaluation

Page 2: Evaluating Electronic Voting Systems for Enhancing Student  Experience (EEVS)

Background

• UH Assessment and Feedback project 2010-11
  – Enhancing student experience and teaching support
  – EVS, QMP, online marking and feedback
  – 3,845 EVS handsets, 8 academic schools
  – Handset registration against student IDs
  – Integration of test results with student records (see the data-linkage sketch at the end of this slide)

• iTEAM JISC-funded project 2011-13
  – Further enhancements of students' assessment experience
  – 3,000 more handsets and more schools (11)

• Early adopters (prior to 2010)
  – Life Sciences, Health and Emergency Professions, Business
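To make the handset-registration and results-integration steps above concrete, here is a minimal, purely illustrative data-linkage sketch in Python/pandas. The column names, IDs and values are hypothetical assumptions for illustration; they are not the project's actual schema, export format or tooling.

# Hypothetical sketch: link an EVS session export (keyed by handset) back to
# student records via a handset-to-student registration table.
import pandas as pd

# Handset registrations: which handset is registered to which student ID (hypothetical)
registrations = pd.DataFrame({
    "handset_id": ["H001", "H002", "H003"],
    "student_id": ["S1001", "S1002", "S1003"],
})

# Raw EVS test export, keyed by handset rather than student (hypothetical)
evs_results = pd.DataFrame({
    "handset_id": ["H001", "H002", "H003"],
    "question":   ["Q1",   "Q1",   "Q1"],
    "correct":    [True,   False,  True],
})

# Attach student IDs to the test results so they can be written to student records
linked = evs_results.merge(registrations, on="handset_id", how="left")
print(linked[["student_id", "question", "correct"]])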

Page 3: Evaluating Electronic Voting Systems for Enhancing Student  Experience (EEVS)

EEVS Project Objectives

• To provide an up-to-date view of the student and staff experiences of EVS and their opinions of what makes for successful use of the technology, within a large-scale project and across multiple disciplines.

• To identify, from evaluating the staff and student experiences, a set of critical success factors for introducing and maintaining the use of EVS in support of an institutional assessment and feedback strategy.

Page 4: Evaluating Electronic Voting Systems for Enhancing Student  Experience (EEVS)

Literature Review

• 300+ papers on the use of EVS in HE (1998-2011)
  – Brown, Davis, Draper, Kennedy, Nicol, Oliver, White et al.
  – Early adopters from UH: Lorimer, Hilliard, Thornton, Willis

• Focus on local use in classes:
  – to engage students,
  – to encourage interactivity,
  – to support a constructivist approach to L&T

• Lack of literature on institutional deployment
  – Exception: Twetten et al. (2007) Successful Clicker Standardization, Educause Quarterly, 4

Page 5: Evaluating Electronic Voting Systems for Enhancing Student  Experience (EEVS)

Methodology

• Evaluation research position: friend
  – Worked with the iTEAM project members "to define focus, gather data and share provisional findings" (Cousins, 2003).

• Research approach: pragmatic and exploratory
  – Focus on process improvements
  – Open to capturing unexpected as well as intended outcomes

• Data collection: mixed-method, mainly cross-sectional data:
  – Students' survey (N=590, RR=14.4%)
  – Staff survey (N=88, RR=5.9%)
  – Students' blogs (N=26)
  – Staff interviews (N=12)

• Sampling: mainly by self-selection, some purposive

• Analysis: SPSS v19, thematic analysis, SUS method (Bangor, Kortum & Miller, 2008) – see the SUS scoring sketch below
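Since the findings that follow quote a SUS (System Usability Scale) score, here is a minimal sketch of the standard SUS scoring formula as used in Bangor, Kortum & Miller (2008): odd-numbered items contribute (response − 1), even-numbered items contribute (5 − response), and the sum is multiplied by 2.5 to give a 0-100 score. The example responses are hypothetical, not EEVS data.

# Standard SUS scoring: 10 items rated 1-5; odd items score (r - 1),
# even items score (5 - r); the sum is scaled by 2.5 to a 0-100 range.

def sus_score(responses):
    """responses: list of 10 Likert ratings (1-5) for SUS items 1-10."""
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item responses")
    total = sum((r - 1) if i % 2 == 1 else (5 - r)
                for i, r in enumerate(responses, start=1))
    return total * 2.5

# One respondent's (hypothetical) ratings
print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # -> 85.0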


Page 6: Evaluating Electronic Voting Systems for Enhancing Student  Experience (EEVS)


Page 7: Evaluating Electronic Voting Systems for Enhancing Student  Experience (EEVS)

Usability and accessibility

EVS training is important regardless of the 'overwhelming' perception of the ease of use.
• SUS score of 74%, i.e. higher perceived usability than 70% of other products tested
• "It is very easy to use the EVS as all you have to do is press the button..." (blogger AJ2)
• "I could use the basic features on it, but some buttons I feel afraid to touch, as I don't know what they do." (blogger MC1)

Accessibility might be an issue for students with sight or dexterity impairments.
• 40 (6.8%) students disagreed that EVS is suitable for students with disabilities
• "… it is a fairly small screen to look at and use may also hit the wrong button because I feel the buttons are really close together" (blogger AJ6)

Page 8: Evaluating Electronic Voting Systems for Enhancing Student  Experience (EEVS)

Impact on learning and satisfaction

EVS had a highly positive (perceived) impact on students' learning and satisfaction (a sketch of how such agreement rates are computed follows at the end of this slide):
• Responding to questions made me think about the course material (84%)
• EVS provided me with an immediate check of understanding (83%)
• Using the EVS allowed problem areas to be identified (75%)
• I enjoy using EVS in my learning (71%)

Summative use of EVS has in some cases created unnecessary tension and anxiety, and indicated inadequacies of the technology for formal examinations:
• Not given enough time for answering questions
• Not testing how easily you can use the device but how much you know about the subject
• Should be able to cancel answers as it is possible to press a wrong button accidentally
• Unable to amend your answers once the question has moved on
• Using the handsets for a test made the test feel less important, almost 'gimmicky'
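As an aside on how agreement rates like those above are typically derived, here is a small illustrative sketch using hypothetical Likert responses (not the EEVS survey data): agreement is the proportion of 'agree' or 'strongly agree' answers per item.

# Illustrative only: percentage agreement per survey item from Likert responses.
import pandas as pd

responses = pd.DataFrame({
    "EVS provided an immediate check of understanding":
        ["strongly agree", "agree", "neutral", "agree", "disagree"],
    "I enjoy using EVS in my learning":
        ["agree", "agree", "neutral", "strongly agree", "neutral"],
})

agreement = responses.isin(["agree", "strongly agree"]).mean() * 100
print(agreement.round(1))  # per-item percentage agreement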

Page 9: Evaluating Electronic Voting Systems for Enhancing Student  Experience (EEVS)


Page 10: Evaluating Electronic Voting Systems for Enhancing Student  Experience (EEVS)

Technology acceptance

Overall positive attitude to EVS acceptance, with a more positive attitude in Social Sciences and Arts than in other subject groups.
• 83% of staff intend to use EVS in the next 6-12 months.
• The odds of having a positive attitude to the use of EVS in teaching were 5 times higher in the SSH group than in other subject groups (χ²(2) = 5.061, p < 0.05) – the kind of test illustrated in the sketch below.
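For readers less familiar with the statistic quoted above, this is an illustrative sketch of a chi-square test of independence (subject group × attitude) and an odds-ratio calculation of that general kind. The contingency counts and group labels are hypothetical assumptions, not the EEVS data.

# Illustrative only: chi-square test of independence and an odds ratio
# comparing one subject group against the rest (hypothetical counts).
import numpy as np
from scipy.stats import chi2_contingency

# Rows: subject groups; columns: attitude to EVS (positive, not positive)
table = np.array([
    [25,  5],   # SSH (Social Sciences, Arts and Humanities)
    [18, 12],   # other group A
    [16, 12],   # other group B
])

chi2, p, dof, _ = chi2_contingency(table)
print(f"chi2({dof}) = {chi2:.3f}, p = {p:.3f}")

# Odds ratio: SSH vs. the other groups combined (2x2 collapse)
ssh_pos, ssh_neg = table[0]
other_pos, other_neg = table[1:].sum(axis=0)
print(f"odds ratio = {(ssh_pos / ssh_neg) / (other_pos / other_neg):.2f}")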

Resistance to the use of EVS came mainly from those who had the change imposed on them, or who were faced with large classes in which to use it.
• "One of the difficulties that we have is a cultural change for the staff. They are very wedded to assessment being in a certain format." (Lecturer)

In some cases the effort expectancy was considered to be too high compared to the perceived performance gains.
• "I think it takes longer to set up and use than is beneficial for a 50 minute class" (anonymous member of staff)

Page 11: Evaluating Electronic Voting Systems for Enhancing Student  Experience (EEVS)

Impact on teaching, satisfaction, workload

EVS led to positive changes in teaching practice:
• EVS made them think more about the interactions in the lecture (60% of staff)
• The lecturer addressed relevant topics/issues identified by student responses (62% of students)

Moderate agreement that EVS led to higher job satisfaction:
• I enjoy using EVS in my teaching (43%)
• "It's so quick, I could come back from a lecture, it's immediately exportable to Excel and it just goes up on StudyNet and the students can know their marks." (Lecturer)

Although there is no positive impact yet on staff workload, there is an understanding that adoption is a 'Long March' (Kanter) process:
• My workload pattern has changed (in a positive way) as a result of using EVS (7.7%)
• "Once it's embedded, then the workload for staff should be greatly reduced in terms of marking, writing assignments." (Lecturer)

Page 12: Evaluating Electronic Voting Systems for Enhancing Student  Experience (EEVS)


Page 13: Evaluating Electronic Voting Systems for Enhancing Student  Experience (EEVS)

Other factors influencing successful institutional deployment

• Technical competence of tutors and tutors' experience with writing MCQ questions
  – Ongoing staff development sessions, including support for developing and changing their pedagogy

• Efficient handset distribution and adequate replacement cost
  – Consistent implementation (e.g. centralized operational procedures related to cost, ownership and distribution)

• Facilitating conditions such as:
  – technology-ready classrooms and reliable software

• Continuous availability of technical support, especially:
  – availability of local support via drop-in and 1-1 help sessions

Page 14: Evaluating Electronic Voting Systems for Enhancing Student  Experience (EEVS)

Conclusions

• Patterns of previous technology adoption at UH are mirrored:
  – The example of the MLE saw early student enthusiasm, but academics took more time to feel fully at ease with the change in practice and pedagogy.

• "Everybody who's used it and got it to work comes back and say it's great and they're going to use it again, it's getting over that initial hurdle and that means that the technology has got to be easy so that people can get in the swing of it" (Biosciences leader)

Page 15: Evaluating Electronic Voting Systems for Enhancing Student  Experience (EEVS)

References

Bangor, A., Kortum, P., & Miller, J.A. (2008). The System Usability Scale (SUS): An Empirical Evaluation. International Journal of Human-Computer Interaction, 24(6).

Cousins, G. (2008). Researching Learning in Higher Education. Routledge.

Chickering, A.W., & Gamson, Z.F. (1987). Seven Principles for Good Practice in Undergraduate Education. http://www.uis.edu/liberalstudies/students/documents/sevenprinciples.pdf

D'Inverno, R., Davis, H., & White, S. (2003). Using a personal response system for promoting student interaction. Teaching Mathematics and its Applications, 22(4), 163-169.

Draper, S.W., & Brown, M.I. (2002). Use of the PRS handsets at Glasgow University, Interim Evaluation Report: March 2002. http://www.psy.gla.ac.uk/~steve/evs/interim.html (accessed April 2012).

Draper, S.W., & Brown, M.I. (2004). Increasing interactivity in lectures using an electronic voting system. Journal of Computer Assisted Learning, 20(2), 81-94.

JISC (2004). http://www.jisc.ac.uk/media/documents/publications/effectivepracticeelearning.pdf

Kanter, R.M., Stein, B.A., & Jick, T.D. (1992). The Challenge of Organizational Change. New York: Free Press.

Kennedy, G.E., & Cutts, Q.I. (2005). The association between students' use of an electronic voting system and their learning outcomes. Journal of Computer Assisted Learning, 21(4), 260-268.

Lorimer, J., & Hilliard, A. (2009). Use of an Electronic Voting System (EVS) to Facilitate Teaching and Assessment of Decision Making Skills in Undergraduate Radiography Education. Paper presented at the 8th European Conference on e-Learning, Bari, Italy.

Moore, G.A. (1991). Crossing the Chasm. New York: Harper Business.

Nicol, D., & Draper, S. (2009). A blueprint for transformational organisational change in higher education: REAP as a case study. In J.T. Mayes (Ed.), Transforming Higher Education through Technology-Enhanced Learning.

Oliver, M. (2006). New pedagogies for e-learning? ALT-J: Research in Learning Technology, 14(2), 133-134.

Robins, K. (2011). EVS in the Business School. University of Hertfordshire Internal Report.

Thornton, H.A. (2009). Undergraduate Physiotherapy students' choice and use of technology in undertaking collaborative tasks. Open University, Milton Keynes, UK.

Twetten, J., Smith, M.K., Julius, J., & Murphy-Boyer, L. (2007). Successful Clicker Standardization. EDUCAUSE Quarterly, (4).

Venkatesh, V., Morris, M.G., Davis, G.B., & Davis, F.D. (2003). User acceptance of information technology: toward a unified view. MIS Quarterly, 27(3), 425-478.

Willis, J. (2009). Using EVS in the School of Life Sciences. University of Hertfordshire Internal Report.