TRANSCRIPT
Building Rubrics For Large-scale, Campus-wide Assessment
Afternoon Session
Thomas W. Zane [email protected]
Diane L. Johnson [email protected]
Jodi Robison [email protected]
Afternoon Session Agenda
1. Build a critical thinking rubric designed for adjuncts who teach a GE course.
2. Build a written communications literacy rubric designed for a group of senior portfolio reviewers.
3. Build a rubric based on the generic criteria method.
4. Q&A and Wrap-up.
Afternoon Session Assumptions
You are tasked to take the lead on a large-scale assessment project at your institution.
You face two big issues:
◦ What are the steps for building the scoring rubrics?
◦ How could you guide faculty through the process?
Case #1: Critical Thinking
Our institution has decided to measure critical thinking across the curriculum. Together we form the committee charged with building a way to measure this college-wide outcome.
Critical Thinking Rubric – Early Decisions
Purpose – Measure critical thinking ability based on student submissions in MANY types of courses.
Target of Measurement – Ability level as perceived from written submissions.
Define Critical Thinking
◦ Critical thinking is the conscious and deliberate use of thinking skills and strategies to guide what to think, believe, or do.
Critical Thinking Rubric – Search for Criteria
1. Found thousands of articles and rubrics.
2. Identified primary sources.
3. Collected criteria.
4. Defined the criteria.
Critical Thinking – Define the Criteria
1. Interpretation
The primary definition of interpretation is the act of making sense of various inputs. Interpretation requires that we clarify the purpose, issue, problem/question, meaning, etc.
2. Analysis
Analysis means to break down, examine, or otherwise explore the issues, available information, arguments, etc. With analysis, we must manipulate, process, or otherwise make active changes to the inputs to make better sense of them.
3. Evaluation
To evaluate means to determine the merit, value, efficacy, advantages, worth, authenticity, validity, impact, or significance of something (e.g., the evidence, claims, assumptions, biases, perspectives, etc.).
4. Inference
This broad term covers reasoning coupled with the use of evidence and standards that together are necessary for synthesizing, coming to a conclusion, making decisions, identifying alternatives, generalizing, planning, predicting, etc.
5. Explanation (Communication)
Communicate the outcomes of thinking, such as stating results, justifying procedures, explaining meaning, presenting arguments, etc. This is considered critical thinking because of the mental processes involved in designing a well-written (or well-spoken) message.
6. Self-regulation (Metacognition)
During all of the above (and sometimes following the thinking as well), reflect, self-examine, pose questions about thinking, self-correct, etc.
Critical Thinking Rubric – Design The Scale
Critical thinking is a human ability that increases in cognitive demand.
So… build a scale that reflects this!
Four points (standard for our online systems).
Define the scale points.
Scale Definitions
Determine what each score level means to you.
Define the level as best you can.
Critical Thinking – Select the Specific Aspects for Each Row
Each of the six areas of critical thinking is still too broad.
We need to break them down into smaller aspects.
We went back to the Internet to find rows in rubrics that fit our definitions.
Created a sample of rows for faculty to use.
Used the same six categories to allow for aggregation of data across the campus.
Split each of the six into different sorts of approaches to fit various disciplines.
Promised to come to the department if a row could not be found. (None have requested this.)
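Keeping the same six categories while letting each department choose its own rows is what makes campus-wide aggregation possible. A minimal Python sketch of that idea (the category names come from the slides; the department names, scores, and the `aggregate` helper are purely illustrative assumptions):

```python
from collections import defaultdict
from statistics import mean

# The six shared critical thinking categories from the slides.
CATEGORIES = [
    "Interpretation", "Analysis", "Evaluation",
    "Inference", "Explanation", "Self-regulation",
]

# Hypothetical scores: each department uses its own row wording,
# but every score is keyed to one of the six shared categories.
scores = [
    {"dept": "English", "category": "Analysis",  "score": 3},
    {"dept": "Biology", "category": "Analysis",  "score": 4},
    {"dept": "English", "category": "Inference", "score": 2},
]

def aggregate(scores):
    """Mean score per shared category, pooled across all departments."""
    by_category = defaultdict(list)
    for s in scores:
        # The shared category keys are what make the campus-wide rollup work.
        assert s["category"] in CATEGORIES
        by_category[s["category"]].append(s["score"])
    return {cat: mean(vals) for cat, vals in by_category.items()}

print(aggregate(scores))  # {'Analysis': 3.5, 'Inference': 2}
```

The design point is simply that local flexibility (row wording) is decoupled from the shared reporting keys (the six categories).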
Sample (starting point) Rows
Classification
Lists information.
Incorporates information.
Classifies information.
Examines reasoning for information classifications.
Consider Biases
Does not detect any biases.
Identifies biases.
Contests biases.
Overcomes biases.
Decision Making Based On Support
Does not make a decision.
Makes a decision but does not provide backing.
Makes and supports a good decision.
Justifies why a given decision is the best.
Build Your Rubric
Assume you are a department assessment coordinator now.
Assume our signature assignment is to write a paper about off-roading equipment. (Assignment is in the workbook.)
Review Appendix F to select a row for each of the six categories.
Localize the Descriptors
Add assignment-specific language to the rubric descriptors to support more natural feedback to students.
Original Example: Clarifying Questions
Does not ask questions.
Identifies some questions.
Asks good questions.
Analyzes insightful questions.
Adapted Version: Clarifying Questions
Does not ask questions about the budget problem.
Identifies some basic common or obvious questions about the budget problem.
Asks relevant questions that guide further research into the budget problem.
Analyzes insightful questions showing a deep understanding of how the questions can guide the research.
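One lightweight way to manage localization is to keep generic descriptors with a placeholder slot and fill in the assignment-specific topic. A hypothetical Python sketch (the descriptor wording echoes the slide example; the `{topic}` templating approach is an assumption, not the presenters' method):

```python
# Generic descriptors with a {topic} slot, localized per assignment.
GENERIC = {
    1: "Does not ask questions about {topic}.",
    2: "Identifies some basic, common, or obvious questions about {topic}.",
    3: "Asks relevant questions that guide further research into {topic}.",
}

def localize(descriptors, topic):
    """Substitute the assignment-specific topic into each descriptor."""
    return {level: text.format(topic=topic)
            for level, text in descriptors.items()}

adapted = localize(GENERIC, "the budget problem")
print(adapted[1])  # Does not ask questions about the budget problem.
```

This keeps the generic rubric as the single source of truth while each course gets descriptors that read naturally for its own assignment.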
Case #2: Written Communications
• Use the same assumptions.
• You need a college-wide measure, but you need buy-in from ALL departments.
• Let’s build a new rubric using the same methods.
Written Communications Literacy – Early Decisions
Purpose – measure written communications quality from student submissions.
Target of Measurement – writing qualities (trait-based).
Define Written Communications Literacy.
Written Communications – Define the Construct
A search for rubrics on this trait yielded over 100K hits. So, we dropped back and punted: we used the AAC&U VALUE rubric as a starting point. In addition, we consulted the commonly used 6+1 Traits Rubric. From these, we found criteria and potential wording for some rubric rows.
Scale Definitions
We worked with our English teachers to narrow down what the scores should mean.
1. Major writing errors necessitating major revision or rewrite
2. Minor quality errors that could be resolved with minor revisions
3. Competent writing that would pass as is
4. Excellent writing that went beyond minimum standards
Define the Criteria
• Content Development
• Genre and Discipline-Specific Conventions
• Claims
• Credible Evidence
• Analysis
• Control of Syntax and Mechanics
• Overall Impact
Written Communications Rubric Development Procedure
1. Print/Open a copy of the Starting Point Rubric for Written Communication.
2. Review the section of the document that corresponds to each row in the rubric template.
3. Select ONE approach for measuring that criterion (row) from the “Descriptor Categories” in column one of the examples tables.
4. Place the description into the “3” cell on that row.
5. Write a descriptor for the remaining column cells along that row.
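The procedure above anchors each row at score “3” and then authors the remaining cells around it. A small Python sketch of that workflow (the criterion name and all descriptor wording are hypothetical examples, not taken from the starting-point rubric):

```python
# Illustrative sketch of the slide procedure: anchor a rubric row at the
# "3" cell with a chosen descriptor (step 4), then write the rest (step 5).

def build_row(criterion, anchor_descriptor):
    """Start a rubric row with only the '3' cell filled in."""
    return {"criterion": criterion,
            "cells": {1: None, 2: None, 3: anchor_descriptor, 4: None}}

# Steps 3-4: select ONE approach and place it in the "3" cell.
row = build_row("Claims", "States a clear, debatable claim.")

# Step 5: author descriptors for the remaining cells along the row.
row["cells"][1] = "No identifiable claim."
row["cells"][2] = "Claim present but vague or not debatable."
row["cells"][4] = "Claim is clear, debatable, and nuanced."

# A quick completeness check: no cell should be left empty.
missing = [level for level, text in row["cells"].items() if text is None]
print(missing)  # []
```

Anchoring at “3” first mirrors the scale definitions: “3” is the competent, passes-as-is level, so the other cells are written as departures from it.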
Review Quality of Our Rubric
Turn back to section 7 in the morning session notes.
Review our rubric in light of the checklist.
Case #3: Build From Generic Criteria
Take a look in your workbooks at the generic criteria listing.
We argue that these criteria could lead you to most of the criteria you might want to use in future assignment-based rubrics.
Examples (Workbook)
Two examples are in the workbook.
Bolded criteria were selected as the most important.
Which do you feel were most important?
Now It Is Your Turn
Select a favorite assignment (or one of the examples).
Select 3-4 important criteria.
Complete a rubric with 3-4 rows, including:
◦ Criteria definitions.
◦ Score scale definitions.
◦ Descriptors.
Wrap Up
Questions?
What Worked?
What Needs Work?
Reminder: If you want feedback on your first attempts at a rubric, send them to [email protected]