TRANSCRIPT
Making Assessment More Meaningful and Sustainable with the Multi-State Collaborative
Jera Lee, Taskstream
Kent Johnson, Indiana University–Purdue University Fort Wayne
assessment ∙ accreditation ∙ e-portfolios
HLC, April 2016
Outcomes
• Understand the goals and scope of the Multi-State Collaborative (MSC) to Advance Learning Outcomes Assessment
• Explain the benefits and experiences for participants
• Appreciate how simple technology can reduce barriers to doing direct assessment of student learning outcomes (SLOs)
• Consider how the technology could be used on your campuses to engage faculty and start or improve assessment initiatives
• Founded in 2000
• 500+ client institutions
• Proven, reliable technology
• Experienced, caring team
• Dedicated to client success
Taskstream partners with institutions to improve student learning and institutional quality through
effective assessment practices.
Aqua is the latest member of our product suite to support your outcomes assessment journey…
AMS – Assessment Planning & Documentation
• Collect program documentation
• Organize plans and findings
• Prep for regional accreditation

LAT – Student Assessment & e-Portfolios
• Track student progress over time
• Engage students in assessment
• Encourage integrative learning

Aqua – Streamlined Outcomes Assessment
• Get to outcomes data quickly
• Collect student work in bulk
• Distribute work for scoring
Why is it so difficult to engage in meaningful, sustainable assessment of student learning?
Software doesn’t do assessment. People do assessment.
If people aren’t engaged in the work, software isn’t going to help.
The Multi-State Collaborative
(MSC) to Advance Learning
Outcomes Assessment
VALUE
Gates-funded project to assess essential learning outcomes with authentic student work scored by faculty using VALUE rubrics
The MSC Model
1. Collect student work samples from 69 two- and four-year institutions in 9 states to assess 3 outcomes: Written Communication, Quantitative Literacy, Critical Thinking
• Sample from upper-division students (75% of coursework completed)
• Minimum of 75-100 artifacts per outcome per institution
• Limit of 7-10 artifacts collected from any one faculty member or course
• Limit of one artifact per student
• Limit of one outcome assessed per artifact
2. Randomly distribute redacted artifacts for scoring with VALUE rubrics
• Faculty scorers required to attend in-person calibration workshop
• Faculty not allowed to score work sampled from their institution
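The two-step MSC model above can be sketched in code. The following is an illustrative sketch only: the field names, helper functions, and default limits are assumptions for the example, not the MSC's actual tooling or data model.

```python
import random
from collections import Counter

def check_sample(artifacts, min_per_outcome=75, max_per_source=10):
    """Validate a proposed artifact sample against the collection rules:
    one artifact per student, at most max_per_source artifacts from any
    one faculty member or course, and at least min_per_outcome artifacts
    per assessed outcome. Each artifact is a dict with (assumed) keys
    "student", "course", and "outcome"."""
    problems = []
    for student, n in Counter(a["student"] for a in artifacts).items():
        if n > 1:
            problems.append(f"student {student} appears {n} times")
    for course, n in Counter(a["course"] for a in artifacts).items():
        if n > max_per_source:
            problems.append(f"{course} contributed {n} artifacts (limit {max_per_source})")
    for outcome, n in Counter(a["outcome"] for a in artifacts).items():
        if n < min_per_outcome:
            problems.append(f"{outcome}: only {n} artifacts (minimum {min_per_outcome})")
    return problems

def distribute_artifacts(artifacts, scorers, reads_per_artifact=2):
    """Randomly assign redacted artifacts to faculty scorers, never
    giving a scorer work sampled from their own institution."""
    assignments = {s["name"]: [] for s in scorers}
    for artifact in artifacts:
        # Only scorers from a *different* institution are eligible.
        eligible = [s for s in scorers
                    if s["institution"] != artifact["institution"]]
        k = min(reads_per_artifact, len(eligible))
        for scorer in random.sample(eligible, k):
            assignments[scorer["name"]].append(artifact["id"])
    return assignments
```

The key design point the MSC rules encode is blind review: redaction plus the home-institution exclusion keeps scorers from rating work they might recognize.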
Institutional Perspective
Institutional Overview
• Master’s Level University (Very High Undergraduate)
• Public – Managed by Purdue, Granting Degrees from Purdue University and Indiana University
• Primarily non-residential
• High transfer-in – High Swirl
• State mandated general education articulation agreement
• Location – Ivy Tech CC Campus situated adjacent to IPFW
• Statewide participation in MSC
MSC Participation
MSC By the Numbers
• Faculty Participants = 12
• Assignments Submitted = 250
• Faculty Scorers = 2
MSC Participation
Statewide participation – about half of Indiana publics are actively engaged in MSC
IPFW participation is focused on:
• Written and Oral Communication
• Quantitative Literacy
• Critical Thinking
Artifacts (existing student work gathered from IPFW courses)
• Year One: Written Communication and Critical Thinking
• Year Two: Adding Quantitative Literacy
VALUE Rubrics applied to score artifacts
Institutional Plan
• Presently engaged in the HLC Assessment Institute
• Project focused on developing a “culture of learning” where assessment is integrated in teaching and learning processes
• Grounded in embedded and authentic assessment
• Initial emphasis in general education on written and oral communication, critical thinking, and quantitative reasoning
• Mapped IPFW General Education Outcomes in WC, OC, CT, QR, and the Baccalaureate Framework to VALUE Rubrics
• MSC serves as pilot for our institutional plan to use Aqua
• Delayed slightly by Purdue System project examining common platforms for all Purdue campuses
Project Status
• Leveraging the project to train faculty on VALUE Rubrics and to familiarize a group of faculty with using Taskstream and VALUE Rubrics to assess student learning
• Artifacts submitted last year, with additional artifacts being submitted this year
• Hope to leverage experience assessing student work to develop “signature assignments” for General Education areas and assess work in Taskstream
• Finally – Taskstream’s “ease of use”
Technical Challenge: Scaling Direct Assessment
Technical Challenge: Engaging Users
• All with day jobs and other responsibilities
• Each role had distinct goals, tasks
• Limited opportunity to train
• Limited time to complete their work
• Focused on one activity at a time
PROJECT LEADS
DATA LEADS
FACULTY SCORERS
How easy can we make this?
Visual tracking reports to show progress of collection and scoring
PROJECT LEADS
DATA LEADS
FACULTY SCORERS
Engaging the MSC users: How we did it
Track progress of collection across the project and by state (PROJECT LEADS)
Track progress of scoring overall and by outcome (PROJECT LEADS)
Simple tools for tagging assignments and bulk uploading artifacts
Create and tag assignments with key metadata (DATA LEADS)
Bulk upload student artifacts (papers) (DATA LEADS)
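The data leads' two tasks, tagging an assignment and bulk-uploading its artifacts, amount to a very small data model. The sketch below is purely illustrative; the field names and functions are assumptions for the example, not Aqua's actual API.

```python
def tag_assignment(name, outcome, course, term):
    """Create an assignment record tagged with key metadata.
    Field names here are illustrative, not Aqua's actual data model."""
    return {"name": name, "outcome": outcome, "course": course,
            "term": term, "artifacts": []}

def bulk_upload(assignment, file_paths):
    """Attach a batch of student artifact files (e.g. paper PDFs)
    to a tagged assignment in one step."""
    assignment["artifacts"].extend(file_paths)
    return assignment
```

Tagging once at the assignment level is what keeps the upload step simple: every artifact inherits the outcome, course, and term from its assignment rather than being labeled individually.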
Simple, intuitive and tablet-friendly evaluation experience to make scoring a breeze
A simple queue of work (FACULTY SCORERS)
Easy interaction with the content (FACULTY SCORERS)
Quick access to the rubric (FACULTY SCORERS)
Clear way to review scores (FACULTY SCORERS)
Little User Training Needed
• Demonstrated the “VALUE Rubric Scoring System” to MSC participants
• Training was a 30-minute demo and 15 minutes of questions
How easy did we make it?
“Taskstream took any concerns about technology out of the scoring experience!”
“Taskstream was a useful system that allowed me to assess the student artifacts quickly.”
“I was pleased with the ease of using and dependability of Taskstream.”
“Taskstream was great. It was an easy system to use, understand, and navigate.”
“Taskstream is very user-friendly.”
MSC pilot study participant feedback
Reflections and Evolution
• Improve user experience for uploading student work
• Present assessment results with visually engaging reports
• Offer additional ways to bring work into the system
• Support multiple artifacts per student, assessment of multiple outcomes for each assignment, and even “live” evidence
• Provide the means for institutions to engage in similar initiatives locally
Beyond the MSC: Local Assessment
A simple path for direct assessment of learning
Critical Thinking ∙ Written Communication ∙ Intercultural Knowledge

What? Which outcome(s) will you assess?
Where? Where will you get the evidence? (course assignments)
Who? Who will score student work?
How? How will you collect the evidence?
Simple Setup
A simple path for direct assessment of learning
What? Outcome(s)
Where? Assignments
Who? Scorers
Simple Setup
Data! Streamlined Scoring Experience
MSC Webinar Series
www1.taskstream.com/resources
Upcoming Webinar:
www1.taskstream.com/event/webinarsmay2016aqua/