TRANSCRIPT
Reporting Module Evaluation Outcomes
An Evolving Approach
Dr Tim Linsey, Head of Academic Systems & Evaluation
Directorate for Student Achievement, Kingston University
London, [email protected]
Bluenotes Global, August 2019
Overview
• Introduction & Context
• Delivery of MEQs
• Reporting & Analysis, and evaluation of reporting approaches
• Data Warehouse
• University Dashboards
• University Annual Monitoring and Enhancement process
• Issues and Developments
Introduction
Academic Systems and Evaluation:
• TEL Systems
• Attendance Monitoring
• Learning Analytics
• Student Voice
– MEQs
– Kingston Student Survey
– National Student Survey
– Postgraduate Survey
Key considerations
• Creating a culture of continuous improvement supported by analytics
• Consistent approach embedded in Quality Assurance and Evaluation processes
• Closing the feedback loop and valuing the student voice
• Effectively engaging students throughout the curriculum
• MEQs are a key part of the student feedback approach
Reintroduction of MEQs
• Decision taken in January 2017 to reintroduce MEQs
• MEQ Working Group
• 10 Quantitative + 2 Qualitative questions
• 2018/19 – all online surveys
• MEQ Environment: Blue from Explorance
• Lead Indicator for Teaching Quality KPI
VLE Integration
My Module Evaluations
Processes & Timing
• Consistent approach across all modules
• Automated scheduling and publishing of MEQs
• Scheduled to allow time for in-class discussion
• MEQs run all year, but there are two main survey windows (16 days)
• Reports automatically published into the VLE within a few hours of an MEQ completing
Response rates
Orchestrated approach
– Briefing guide and PowerPoint for all module leaders
– Set of agreed statements to be conveyed to students
– Student-created video introducing MEQs
– Staff asked to find a slot in class
– Staff requested to leave class for 15 mins
– Use of course representatives
2018/19 (to March)
• 832 MEQ reports generated (exceeding the minimum threshold of 4 responses)
• 76% of student responses contained qualitative feedback
• 38% of students completed one or more MEQs
• 47% completed via mobile devices
Module Reports
Staff and student reports are similar, except that the student version excludes comments and comparisons (Department and Faculty averages).
[Sample report extract: the two open-comment questions, "Best things" and "Improve"]
Further Reports
– Department, Faculty and University aggregate reports
– Summary reports for each Faculty
– Modules with zero responses or that did not meet the threshold
– Custom reports
Summary Report for all Modules 2016/17
Summary table ranking all modules by their mean overall score.
Colour coded: ≥ 4.5 and ≤ 3.5.
Summary Report for all Modules 2017/18
– Colour coding was problematic
– Staff suggestion to rank by standard deviation from the overall university mean
Additionally:
– Comparison of 2016/17 vs 2017/18
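The staff suggestion above amounts to ranking modules by a z-score: each module's mean expressed as a number of standard deviations from the university-wide mean. A minimal sketch, with entirely illustrative module codes and scores:

```python
# Rank modules by how many standard deviations their mean overall
# score sits from the university-wide mean (illustrative data only).
from statistics import mean, pstdev

module_means = {            # hypothetical module mean scores (1-5 scale)
    "GE4001": 4.6,
    "CS5002": 3.1,
    "BU6003": 4.1,
    "NU5004": 2.8,
}

uni_mean = mean(module_means.values())
uni_sd = pstdev(module_means.values())

# z-score: positive => above the university mean, negative => below
z_scores = {m: (s - uni_mean) / uni_sd for m, s in module_means.items()}

for m, z in sorted(z_scores.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{m}: {z:+.2f} SD from university mean")
```

One advantage over fixed colour-coding thresholds is that the ranking adapts each year to where the university mean actually sits.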
Summary Report for all Modules 2018/19
Reviewed our approach to consider issues raised in the literature:
• Comparisons between modules of different types, levels, sizes, functions, or disciplines
• Averaging ordinal-scale data
• Bias
• Internal consistency
(e.g. Boring, 2017; Clayson, 2018; Hornstein, 2017; Wagner et al., 2016)
Education Committee Summary Report
• Sorted by Faculty, Level and Response rate
• Includes statistical significance and statistical confidence
Methodology: Dillman, D.A., Smyth, J.D., & Christian, L.M. (2014). Internet, Phone, Mail, and Mixed-Mode Surveys: The Tailored Design Method. John Wiley & Sons.
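Confidence around a survey estimate of this kind is typically a margin of error for a proportion, with a finite population correction that matters for small module cohorts. A sketch under that assumption (the numbers are invented, not Kingston's):

```python
# Margin of error for a sample proportion at 95% confidence,
# with finite population correction for small module cohorts.
import math

def margin_of_error(p, n, N, z=1.96):
    """p: observed proportion, n: responses received, N: students enrolled."""
    se = math.sqrt(p * (1 - p) / n)          # standard error of the proportion
    fpc = math.sqrt((N - n) / (N - 1))       # finite population correction
    return z * se * fpc

# e.g. 30 of 40 responders agree, on a module of 120 enrolled students
moe = margin_of_error(p=30 / 40, n=40, N=120)
print(f"75% agree, +/- {moe * 100:.1f} percentage points")
```

Note that when every enrolled student responds (n = N), the correction drives the margin to zero: a census has no sampling error.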
Ranking by % Agree
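% Agree is commonly computed as the share of responses at the top of the scale (4 or 5 on a five-point Likert scale); a minimal sketch under that assumption, with made-up responses:

```python
# Percent-agree for a 5-point Likert question: share of 4s and 5s.
def pct_agree(responses, agree_levels=(4, 5)):
    return 100 * sum(r in agree_levels for r in responses) / len(responses)

responses = [5, 4, 4, 3, 2, 5, 4, 1, 5, 4]   # illustrative module responses
print(f"% Agree: {pct_agree(responses):.0f}%")
```

Ranking by % Agree avoids averaging the ordinal scale directly, which is one of the literature concerns noted above.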
Frequency Distributions
– Request that staff also review the frequency distribution of their responses
– Is the distribution bimodal, and if so, why?
[Example chart: a bimodal distribution with mean = 2.9]
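The point of the example above — a mean of 2.9 can hide two opposed groups of students — is easy to see by tabulating the frequency distribution rather than reporting only the mean. The responses below are invented to reproduce the effect:

```python
# A bimodal distribution whose mean looks mid-scale but whose
# frequency table reveals a polarised class (illustrative data).
from collections import Counter
from statistics import mean

responses = [1, 1, 2, 1, 2, 5, 5, 4, 5, 3]
counts = Counter(responses)

print(f"mean = {mean(responses):.1f}")       # mid-scale, looks unremarkable
for level in range(1, 6):                    # text histogram of responses
    print(f"{level}: {'#' * counts[level]}")
```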
Aggregating Questions to Themes
– Teaching
– Assessment
– Academic Support
– Organisation
We noted:
– Care needed to be taken with aggregated data and the inferences drawn from it
– An individual MEQ report is informative for a module team that knows the local context, but care is needed if it is read without looking at trends and holistically across other metrics
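Aggregation to themes can be sketched as a mapping from questions to themes, averaging each theme's question-level means. The mapping and scores below are hypothetical, and — per the caveat above — theme averages of ordinal data need careful interpretation:

```python
# Aggregate question-level mean scores into themes (hypothetical
# question-to-theme mapping; averaging ordinal data carries the
# caveats noted in the text above).
from statistics import mean

theme_of = {
    "Q1": "Teaching", "Q2": "Teaching",
    "Q3": "Assessment", "Q4": "Assessment",
    "Q5": "Academic Support", "Q6": "Organisation",
}
question_means = {"Q1": 4.2, "Q2": 3.9, "Q3": 3.5,
                  "Q4": 3.7, "Q5": 4.0, "Q6": 4.4}

themes = {}
for q, score in question_means.items():
    themes.setdefault(theme_of[q], []).append(score)

for theme, scores in themes.items():
    print(f"{theme}: {mean(scores):.2f}")
```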
Data Warehouse
– Raw data passed to the KU Data Warehouse
– Tableau Dashboards (Strategic Planning and Data Insight Department)
• Dashboards accessible by all staff, including showing the top 5 and bottom 5 modules for each level
• Data aggregated, with the ability to drill down to module level
– Annual Monitoring and Enhancement Process
University Dashboards
[Dashboard overview: Course Metrics, Module Metrics, MEQs, Progression, Attainment Gap, NSS, KSS, BME VA, KPIs, Employability, Learning Analytics]
Drilling up and down
Department View
Annual Monitoring and Enhancement Process
Module Enhancement Plans / Course Enhancement Plans
– Online
– Pre-populated with MEQ and other metrics
– Module leaders and course leaders review the performance of the module / course, citing evidence and pre-populated metrics
MEPs (extract)
Issues & Developments
– When should the MEQ be distributed? Focus group feedback
– Automation – administration & analysis
– Text analytics
– 47% of students completing MEQs via mobile devices
– Analysis across different student voice surveys
– Staff being named in qualitative feedback & issues of etiquette
– Students concerned about anonymity
– Response rates – followed up with modules with high response rates
– Feedback to students
– Demographic analysis
Collaborative
• Directorate for Student Achievement
• Academic Systems & Evaluation Team
• Strategic Planning and Data Insight (incl. all dashboard development)
• Academic Registry
• Information & Technology Services
• Faculties via the MEQ Working Group / Student Surveys Group
• Student Course Representatives
• Explorance
References
• Boring, A. (2017). Gender biases in student evaluations of teaching. Journal of Public Economics, 145, 27–41.
• Clayson, D. (2018). Student evaluation of teaching and matters of reliability. Assessment & Evaluation in Higher Education, 43(4), 666–681.
• Dillman, D.A., Smyth, J.D., & Christian, L.M. (2014). Internet, Phone, Mail, and Mixed-Mode Surveys: The Tailored Design Method. John Wiley & Sons.
• Hornstein, H. (2017). Student evaluations of teaching are an inadequate assessment tool for evaluating faculty performance. Cogent Education, 4, 1–8.
• Wagner, N., Rieger, M., & Voorvelt, K. (2016). Gender, ethnicity and teaching evaluations: Evidence from mixed teaching teams. Economics of Education Review, 54, 79–94.
Any Questions?