TRANSCRIPT
Examining the Effect of a Real-Time Student Dashboard on Student Behavior and Student Achievement
Robert Bodily, Charles Graham, Tarah Kerr, and Ben Mackley
Brigham Young University
Why do student dashboards matter?
● Most learning analytics systems are concerned with collecting data, but then what?
● Provide concept-, assignment-, unit-, or course-level feedback to help students identify knowledge gaps
● Help students develop metacognitive or self-regulation strategies
● Provide information that can be understood at a glance
Review of Research: Dashboard Effect on Behavior
1. 21% of students accepted the system recommendation to view additional content
2. Students participating in courses using the system were more likely to continue taking classes than those who did not enroll in these courses
3. Students who enabled notifications (on 2 out of 3 systems) showed increased contributions in the social network space
4. Students visited the discussion space more frequently but did not post more frequently
5. The percentage of posts viewed increased for all students, but there were few sustained changes
6. The number of students completing assignments increased and LMS use increased
7. About 50% of students accepted recommendations from the system
8. There was an 83.3% increase in student interaction after recommendations were given
9. Students completed assignments more quickly and were able to complete the entire course more quickly
10. *For two of the three visualizations, student post quantity increased; for one of the three, it decreased
11. *Students logged in more frequently, completed their coursework more quickly, completed more questions, and answered more questions correctly on assignments
12. *There were no significant differences between the treatment and control groups in terms of learning efficiency

*Sample size greater than 150 and conducted an actual experiment (randomized control trial or other equivalent group mean difference testing method)
Review of Research: Dashboard Effect on Achievement
1. No significant achievement differences
2. Increased A’s and B’s; decreased C’s and D’s
3. No significant achievement differences
4. Students received more passing grades
5. Frequency and quality of posts was affected both positively and negatively
6. Students performed significantly better on the evaluation task
7. Treatment group performed significantly better on the final exam
8. *No significant differences between treatment and control
9. *No significant achievement differences
10. *No significant achievement differences, but one course had an effect with Pell-eligible students
11. *Treatment group performed significantly better on the final exam

*Sample size greater than 150 and conducted an actual experiment (randomized control trial or other equivalent group mean difference testing method)
Review of Research: Dashboard Effect on Skills
1. Significant increase in student self-awareness accuracy
2. Female students had increased interest when they had a choice to use the system; male students reported higher interest with mandatory notifications
Context 1: Fall 2015, Design
● Instructor advocated for video use
● Quizzes were short (3-5 questions), easy, and based on the videos
● Dashboard was accessed in the LMS next to the videos
● Access was given after the first exam
● Design: content recommender dashboard
Context 1: Fall 2015, Methods
● Students were randomly placed in the dashboard treatment group or the control group
● T-tests were used to make sure groups were equivalent across all covariates (Exam 1 score, quiz scores, video use)
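The covariate balance check described above can be sketched as one independent-samples t-test per covariate. The data below is synthetic (randomly generated stand-ins for the real Exam 1, quiz, and video-use measures), so this is a minimal illustration of the method, not the study's actual analysis.

```python
# Covariate balance check: an independent-samples t-test per covariate,
# with synthetic data standing in for the real student measures.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical scores for 100 treatment and 100 control students,
# drawn from the same distribution (as random assignment should yield).
covariates = {
    "exam1_score": (rng.normal(75, 10, 100), rng.normal(75, 10, 100)),
    "quiz_score": (rng.normal(8, 1.5, 100), rng.normal(8, 1.5, 100)),
    "video_use": (rng.normal(12, 4, 100), rng.normal(12, 4, 100)),
}

for name, (treatment, control) in covariates.items():
    t_stat, p_value = stats.ttest_ind(treatment, control)
    # A large p-value suggests no detectable pre-existing group difference.
    verdict = "balanced" if p_value > 0.05 else "imbalanced"
    print(f"{name}: t = {t_stat:.2f}, p = {p_value:.3f} ({verdict})")
```

A significant difference on any covariate would suggest the groups were not equivalent before the dashboard was introduced, which is why the study ran this check before comparing outcomes.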
Context 1: Fall 2015, Results
● The randomized control trial showed no significant differences between treatment and control for student behavior or student achievement
● Low student use
○ 11.5 clicks per session, 2 sessions per student, 42% of students accessed the dashboard
○ Data quality of the dashboard?
○ Not useful to students?
● Evaluation showed many students:
○ Did not know they had access to the dashboard
○ Did not know how to use the system
○ Did not think it would be useful
○ Did not have time
○ Had a lot of other resources
Context 2: Winter 2016, Design
● Put videos within a videos tab in the dashboard to increase use (help students see the data visualizations more frequently)
● Longer quizzes, but still formative (unlimited attempts)
● Everyone had access
● Design: scatterplot dashboard
Context 2: Winter 2016, Results
● Low student use
○ Data quality in the dashboard?

                      Fall    Winter
Percent access        42%     48%
Sessions per student  1.98    1.89
Clicks per session    11.5    14.2
Context 3: Spring 2016, Design
● Everyone had access
● Demo the dashboard for everyone at the beginning of the semester
● Quizzes are graded and attempts are limited to 3 (high stakes)
● Teach the TAs about the dashboard and have them encourage its use
● Have the instructor mention the benefits of the system
● Provide more resources (videos, practice quizzes, and web resources)
● Design: content recommender 2 dashboard
Context 3: Spring 2016, Results
● Increased frequency of use
● Decreased clicks per session (students are more efficient)
● Perceptions of the dashboard

                      Fall    Winter    Spring
Percent access        42%     48%       80%
Sessions per student  1.98    1.89      3.28
Clicks per session    11.5    14.2      10.6
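As a quick sanity check, the Fall-to-Spring changes behind these bullets can be computed directly from the numbers reported in the table (a minimal Python sketch; the figures are taken verbatim from the slide above):

```python
# Percent change in each usage metric from Fall 2015 to Spring 2016,
# using the values reported in the results table.
usage = {
    "percent_access": {"Fall": 42, "Winter": 48, "Spring": 80},
    "sessions_per_student": {"Fall": 1.98, "Winter": 1.89, "Spring": 3.28},
    "clicks_per_session": {"Fall": 11.5, "Winter": 14.2, "Spring": 10.6},
}

for metric, by_term in usage.items():
    fall, spring = by_term["Fall"], by_term["Spring"]
    change = (spring - fall) / fall * 100  # percent change, Fall -> Spring
    print(f"{metric}: {fall} -> {spring} ({change:+.1f}%)")
```

The computed changes match the bullets: access and session frequency rose substantially, while clicks per session fell.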
Principles Learned
● Students need to be aware of and trained in a new system
○ Send students reminders
○ The instructor and teaching assistants can discuss the benefits of the system
○ Demo the system for the students
● Systems need to be directly related to helping students achieve their goals
○ Unit-level feedback helped with the test review process
○ Help students identify knowledge gaps
○ Remediate knowledge gaps with videos, text resources, and practice questions
● Usability tests and system evaluations are necessary
○ We changed a lot after our usability and evaluation tests
Future Research
● How can we support students as they engage with online feedback?
● More rigorous evaluations and measured-effects research (randomized control trials, quasi-experimental methods)