TRANSCRIPT
ELO-based supplemental course evaluations
Teaching Academy Assessment Subcommittee
David Baum & Janet Batzli
The Assessment Grid
Goals
• Campus-wide, quantitative course evaluations
  – Not to replace departmental evaluations
• Address Essential Learning Outcomes
  – Track student learning
  – Assess courses/programs
  – Educate students about the ELOs
• Post summary data on the web
  – Incentivize participation
  – Help students select courses
  – Be open with stakeholders
Why should instructors care?
• Know how we are doing
• Be judged on what students gain from the course (not charisma)
• Get students to reflect on learning rather than enjoyment
UW Essential Learning Outcomes
http://www.learning.wisc.edu/welo2010.pdf
Approach
• Initial questions based on ELO language
• Survey Center conducted three rounds of focus groups to sequentially improve questions
• Also piloted data presentation
Initial question
How much did this course enhance your knowledge of the human or natural world?
Final Recommendation
In general, how much did this course enhance your knowledge of the world, such as knowledge of human cultures, society, or science? [Not at all, A little, Somewhat, Quite a bit, A great deal]
– While students were not always sure what "…of the world" meant, the defining examples conveyed the intended idea
Initial question
How much did this course help you develop intellectual and practical skills, such as critical and creative thinking, written and oral communication, teamwork, and problem-solving?
Combining intellectual and practical/pre-professional skills was confusing to students – split into two questions
Final Recommendations
How much did this course help you develop intellectual skills, such as critical or creative thinking, quantitative reasoning, and problem solving?
How much did this course help you develop professional skills, such as written and oral communication, computer literacy, and working in teams?
[Not at all, A little, Somewhat, Quite a bit, A great deal]
Initial question
How much did this course affect your values and sense of personal and social responsibility, for example by increasing your knowledge of policy issues, engagement in community and civic affairs, intercultural knowledge, or ability to reason ethically?
Final Recommendation
How much did this course increase your sense of social responsibility, that is, increased your knowledge of cultures or provided you with opportunities for civic or community involvement? [Not at all, A little, Somewhat, Quite a bit, A great deal]
• There was some confusion between feeling vs. acting responsibly
• The question mixes an impact on the student with the contents of a course
• We tried "reason ethically," but this confused students
  – More editing might be needed
Initial question
How much did this course advance your ability to integrate diverse areas of knowledge?
Final Recommendations
How much did this course improve your ability to combine knowledge or skills from different fields of study? [Not at all, A little, Somewhat, Quite a bit, A great deal]
• If we interpret the ELO to mean interdisciplinarity within a course, then the question is working
Initial question
How would you rate this course for its overall quality and educational impact?
Quality and impact were found to be different.
Final Recommendation
How would you rate the overall educational value of this course, that is, the extent to which the course improved your all-around education or prepared you for the future? [Very poor, Poor, Fair, Good, Very good]
– Students seem to understand the intent
Final Recommendation
How would you rate the overall quality of this course, that is, the extent to which it was structured and taught in order to maximize its educational value? [Very poor, Poor, Fair, Good, Very good]
– Unclear to some students whether they should assess the professor or the course structure
– We want the focus to be on the course rather than the instructor
– We may need to narrow, split, or drop this question
Final Recommendation
Students felt it would be helpful to know whether a student was or was not “a major”
Question added (at the start of the survey):
At the time you enrolled, did you primarily take this course to fulfill a requirement for your major? [yes/no]
– Students seemed to understand this question as intended
General student reactions to the survey
• The survey made students reflect on their courses in ways they hadn't before; they felt it asked about aspects of courses where they did gain something
• Students see that you can appreciate classes for different reasons
• Some students had mentally compared the answers they gave from one course to another
• ELOs became clearer
• The overall reception of the utility of the survey was very positive
Ideal implementation
• After each semester, students receive a link to a personalized survey covering all the courses they took in the previous semester
• Data are collated and made available to students and the public (CourseGuide?)
Sample Data Presentation
• The students liked the idea of having such data available
  – It would definitely help because there isn't currently a way to know if a class is good or bad besides "anecdotes"
  – Better than "RateMyProfessor.com": those ratings don't necessarily tell you what the class is going to be like
• But they are realistic:
  – …a student shouldn't put all his "trust into what the graph might say."
[Sample chart: mean ratings for majors vs. nonmajors across six hypothetical courses – Imaginary studies 101 (331 students; 105 majors), 102 (183; 80), 201 (113; 83), 202 (89; 33), 401 (23; 12), 402 (19; 17)]
Sample data presentation – Needs work!
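As a rough illustration of the kind of collation behind the sample chart, the sketch below aggregates survey responses into per-course mean ratings split by major status. This is an assumption about the eventual pipeline, not the committee's actual implementation; the 1–5 numeric mapping of the response scale, the course names, and the responses are all made up for illustration.

```python
from collections import defaultdict
from statistics import mean

# Assumed numeric mapping for the five-point response scale.
SCALE = {"Not at all": 1, "A little": 2, "Somewhat": 3,
         "Quite a bit": 4, "A great deal": 5}

# Each record: (course, taken-for-major?, response to one ELO question).
responses = [
    ("Imaginary studies 101", True,  "Quite a bit"),
    ("Imaginary studies 101", False, "Somewhat"),
    ("Imaginary studies 101", False, "A little"),
    ("Imaginary studies 201", True,  "A great deal"),
    ("Imaginary studies 201", True,  "Quite a bit"),
    ("Imaginary studies 201", False, "Somewhat"),
]

def collate(records):
    """Return {course: {"majors": mean, "nonmajors": mean}} for posting."""
    groups = defaultdict(lambda: {"majors": [], "nonmajors": []})
    for course, is_major, answer in records:
        key = "majors" if is_major else "nonmajors"
        groups[course][key].append(SCALE[answer])
    return {course: {k: round(mean(v), 2) for k, v in g.items() if v}
            for course, g in groups.items()}

print(collate(responses))
# → {'Imaginary studies 101': {'majors': 4.0, 'nonmajors': 2.5},
#    'Imaginary studies 201': {'majors': 4.5, 'nonmajors': 3.0}}
```

Splitting means by the "major" screening question is what makes the majors-vs.-nonmajors comparison in the sample presentation possible.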
Other things students want (but we don't expect to provide)
• How much work the course requires
• How easy it was to get an A
• Correlation between the amount of work students put in and how much they feel they actually learned from the course
• Data for specific professors or TAs
Next steps
• Run a pilot survey on a large sample of students
• Use focus groups to improve data presentation style
• Consult with central administration on whether funds would be available to establish ongoing surveys and posting of data for all courses
• Seek faculty/staff buy-in (allowing instructors to opt out)
Thanks
• Janet Batzli
• TA Executive Committee and Assessment Subcommittee
• UW Survey Center
  – John Stevenson; Jennifer Dykema; Jaime Faus; Tara Piche
• Mo Bischoff
• UW Assessment Council