Student Learning Assessment: An Interactive Exploration
Megan Oakleaf, MLS, PhD
NELIG Annual Program, June 8, 2007


TRANSCRIPT

Page 1:

Student Learning Assessment: An Interactive Exploration

Megan Oakleaf, MLS, PhD
NELIG Annual Program
June 8, 2007

Page 2:

Overview

• Purposes of Assessment

• The Assessment Cycle

• Major Assessment Tools

• Choosing the Best Tools for Your Campus

• Reporting Assessment Results

• Facing Challenges

Page 3:

Purposes of Assessment

Page 4:

Why should I assess student learning?

• To respond to calls for accountability

• To participate in accreditation processes

• To inform decision-making regarding program structure/performance

• To improve teaching skills

• To improve student learning

Page 5:

The Assessment Cycle

Page 6:

Identify outcomes

Create services

Enact services

Gather data

Interpret data

Enact decisions

Review goals

Information Services Assessment Cycle
Adapted from Peggy Maki, PhD, & Marilee Bresciani, PhD, by Megan Oakleaf, PhD

Page 7:

Identify learning outcomes

Create learning activities

Enact learning activities

Gather data to check learning

Interpret data

Enact decisions

Review learning goals (IL standards)

ILI Assessment Cycle
Adapted from Peggy Maki, PhD, & Marilee Bresciani, PhD, by Megan Oakleaf, PhD

Page 8:

Major Assessment Tools

Page 9:

What assessment tools are available?

• Satisfaction surveys

• Tests

• Performance assessments

• Rubrics

• Classroom Assessment Techniques (CATs)

Page 10:

Why the focus on outcomes-based tools?

“Outcomes assessment alerts us to what students know about or do not know about library research, thus allowing librarians to adapt instruction to the needs of the students. It also helps us to determine what we are doing right and what we are doing wrong, what needs more emphasis, and what students already ‘get.’ In short, our instruction is better because we know how we are doing.”

Carter, Elizabeth W. “‘Doing the Best You Can with What You Have’: Lessons Learned from Outcomes Assessment.” Journal of Academic Librarianship 28.1 (2002): 36-41.

Page 11:

Tests Defined

• Are primarily multiple choice in format

• Strive for objectivity

• Are grounded in early behaviorist educational theory

Page 12:

Tests – Benefits, 1 of 2

Learning

• Measure acquisition of facts

Data

• Are easy and inexpensive to score

• Provide data in numerical form

• Collect a lot of data quickly

• Tend to have high predictive validity with GPA or standardized test scores

• Can be made highly reliable (by making them longer)

• Can be easily used to make pre/post comparisons

• Can be easily used to compare groups of students

Article forthcoming by Megan Oakleaf
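A standard psychometric footnote to the reliability point above (general background, not part of the slides): the Spearman-Brown prophecy formula predicts the reliability of a test lengthened by a factor $k$ from its current reliability $r$:

$$r_k = \frac{k\,r}{1 + (k - 1)\,r}$$

For example, doubling a test ($k = 2$) with reliability $r = 0.70$ predicts $r_2 = (2)(0.70)/(1 + 0.70) \approx 0.82$, which is why longer tests tend to score as more reliable.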

Page 13:

Tests – Benefits, 2 of 2

If locally developed…

• Help librarians learn what they want to know about student skills

• Are adapted to local learning goals and students

• Can be locally graded, and interpretation of results can be controlled

If non-locally developed…

• Can be implemented quickly

• Reduce staff time required for development and scoring

Other

• Are widely accepted by the general public

Article forthcoming by Megan Oakleaf

Page 14:

Tests – Limitations, 1 of 2

Learning

• Measure recognition rather than recall

• Reward guessing

• Include oversimplifications

• Do not test higher-level thinking skills

• Do not measure complex behavior or “authentic” performances

• Do not facilitate learning through assessment

Article forthcoming by Megan Oakleaf

Page 15:

Tests – Limitations, 2 of 2

Data

• May be designed to create “score spread”

• May be used as “high stakes” tests

If locally developed…

• May be difficult to construct and analyze

• Require leadership and expertise in measurement

• May not be useful for external comparisons

Page 16:

Multiple Choice: The Basics

Page 17:

Multiple Choice Test Questions, 1 of 3

What student skills do you want to measure?

Which skills are important enough to measure?

Keep in mind…

Stem

• Direct questions are better than incomplete sentences

Page 18:

Multiple Choice Test Questions, 2 of 3

Answer choices

• Write the correct answer first

• Limit obviously incorrect choices; wrong answers should be plausible

• Use parallel construction and similar length

• Avoid negatively phrased answers

• Avoid “all of the above” and “none of the above”

• “Select best” is more challenging than “select correct”

Page 19:

Multiple Choice Test Questions, 3 of 3

In general…

• Avoid unintentional clues

• Keep vocabulary, phrasing, & jargon simple

• Avoid extreme words (all, never, always) and vague words (may be, usually, typically)

• Omit needless words
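These wording guidelines lend themselves to a quick automated screen. Below is a hypothetical Python sketch (the slides include no code; the word lists and function name are illustrative assumptions, not exhaustive rules):

    # Hypothetical sketch: flag multiple-choice items that break the
    # wording guidelines above. Word lists are illustrative only.
    EXTREME_WORDS = {"all", "never", "always"}
    VAGUE_PHRASES = ("may be", "usually", "typically")
    BANNED_CHOICES = {"all of the above", "none of the above"}

    def lint_item(stem, choices):
        """Return a list of guideline warnings for one question."""
        warnings = []
        text = stem.lower()
        if not stem.rstrip().endswith("?"):
            warnings.append("Stem is not phrased as a direct question.")
        for word in EXTREME_WORDS & set(text.replace("?", "").split()):
            warnings.append("Stem uses extreme word: " + word)
        for phrase in VAGUE_PHRASES:
            if phrase in text:
                warnings.append("Stem uses vague wording: " + phrase)
        for choice in choices:
            if choice.lower().strip(" .") in BANNED_CHOICES:
                warnings.append("Avoid answer choice: " + choice)
        return warnings

    # Example: expect warnings for vague wording and "All of the above".
    print(lint_item("Peer review may be part of which process?",
                    ["Scholarly publishing", "All of the above"]))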

Page 20:

Interactive Exploration

Retrieve worksheets from folder.

At each table, divide into 2 groups.

Page 21:

Interactive Exploration: Multiple Choice Test Questions

1. Select a question from the test provided.

2. Does it adhere to the multiple choice guidelines?

3. What is the answer to the question? Does your group agree? Why or why not?

4. What might you do to improve the question?

5. If you had 100 students’ answers to this question, what would you be able to do with that data? What decisions could you make based on the data? (See the item-analysis sketch below.)
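For question 5, one concrete possibility is a basic item analysis: tally the answer distribution and compute item difficulty, the proportion of students answering correctly. A hypothetical Python sketch with made-up data (not part of the worksheet):

    # Hypothetical sketch: item analysis for one question, 100 answers.
    # The simulated responses stand in for real student data.
    from collections import Counter
    import random

    random.seed(0)
    answers = random.choices("ABCD", weights=[55, 20, 15, 10], k=100)
    correct = "A"

    distribution = Counter(answers)                    # picks per option
    difficulty = distribution[correct] / len(answers)  # proportion correct

    print("Answer distribution:", dict(distribution))
    print("Item difficulty:", round(difficulty, 2))
    # A rarely chosen distractor suggests an implausible wrong answer;
    # a very low difficulty value may signal an unclear stem to revise.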

Page 22:

Performance Assessments Defined

• Focus on students’ tasks or products of those tasks

• Simulate real life application of skills, not drills

• Strive for contextualization & authenticity

• Are grounded in constructivist, motivational, and “assessment for learning” theory

Page 23:

Performance Assessments – Benefits

Learning

• Align with learning goals

• Integrate learning and assessment

• Capture higher-order thinking skills

• Support learning in authentic (real life) contexts

• Facilitate transfer of knowledge

Data

• Supply valid data

Other

• Offer equitable approach to assessment

Article forthcoming by Megan Oakleaf

Page 24:

Performance Assessments – Limitations

Data

• May have limited generalizability to other settings and populations

Other

• Require time to create, administer, and score

Article forthcoming by Megan Oakleaf

Page 25:

Interactive Exploration: Performance Assessments

1. Select one of the outcomes below.

• The student will develop a realistic overall plan and timeline to acquire needed information.

• The student will construct and implement effectively designed search strategies.

• The student will analyze information to identify point of view or bias.

• The student will acknowledge the use of information sources through documentation styles.

2. What “tasks” would reveal students’ ability to accomplish this outcome?

3. What “products” could serve as evidence of their ability?

4. Create a list of tasks and/or artifacts that could be assessed to measure the outcome.

Page 26:

http://www.ioxassessment.com/images/PerformanceAssessment.jpg

Page 27:

Rubrics Defined

Rubrics…

• describe student learning in 2 dimensions:
1. parts, indicators, or criteria, and
2. levels of performance

• formatted on a grid or table

• employed to judge quality

• used to translate difficult, unwieldy data into a form that can be used for decision-making

Article forthcoming by Megan Oakleaf
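To make the two dimensions concrete, here is a hypothetical Python sketch of a rubric as a criteria-by-levels grid with a scoring helper (the criteria and level names are illustrative assumptions, not from the slides):

    # Hypothetical sketch: a rubric as criteria (rows) x performance
    # levels (columns), plus a helper that turns ratings into a score.
    LEVELS = {"beginning": 1, "developing": 2, "exemplary": 3}
    CRITERIA = ["identifies author", "checks credentials", "cites evidence"]

    def score(ratings):
        """Translate per-criterion level ratings into a total score."""
        return sum(LEVELS[ratings[criterion]] for criterion in CRITERIA)

    student = {"identifies author": "exemplary",
               "checks credentials": "developing",
               "cites evidence": "beginning"}
    print(score(student))  # 3 + 2 + 1 = 6 of a possible 9

The grid is what translates messy student work into decision-ready numbers, which is the “unwieldy data” point above.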

Page 28:

http://www.southcountry.org/BROOKHAVEN/classrooms/btejeda/images/rubric%20big.JPG

Rubrics are often used to make instructional decisions and evaluations.

Page 29:

Rubric for Assessing Student Ability to Evaluate Websites for Authority

Article forthcoming by Megan Oakleaf

Page 30:

Rubrics – Benefits, 1 of 2

Learning

• Articulate and communicate agreed-upon learning goals

• Focus on deep learning and higher-order thinking skills

• Provide direct feedback to students

• Facilitate peer- and self-evaluation

• Make scores and grades meaningful

• Can focus on standards

Article forthcoming by Megan Oakleaf

Page 31:

Rubrics – Benefits, 2 of 2

Data

• Facilitate consistent, accurate, unbiased scoring

• Deliver data that is easy to understand, defend, and convey

• Offer detailed descriptions necessary for informed decision-making

• Can be used over time or across multiple programs

Other

• Are inexpensive to design and implement
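The consistency claim can be checked empirically by having two raters apply the rubric to the same work and measuring their agreement. A hypothetical sketch with invented ratings (not from the slides), computing simple percent agreement:

    # Hypothetical sketch: percent agreement between two raters scoring
    # the same ten artifacts on one rubric criterion. Ratings invented.
    rater_a = ["exemplary", "developing", "beginning", "developing",
               "exemplary", "developing", "beginning", "exemplary",
               "developing", "developing"]
    rater_b = ["exemplary", "developing", "developing", "developing",
               "exemplary", "beginning", "beginning", "exemplary",
               "developing", "developing"]

    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    print("Percent agreement:", matches / len(rater_a))  # 8/10 = 0.8
    # Low agreement usually means the level descriptions need sharpening
    # or the raters need norming, which ties into the training
    # limitation on the next slide.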

Page 32:

Rubrics – Limitations

Other

• May contain design flaws that impact data quality

• Require time for development

• Require time for training multiple rubric users

Article forthcoming by Megan Oakleaf

Page 33:

Interactive Exploration: Rubrics

1. Choose an outcome to assess:

• The student will develop a realistic overall plan and timeline to acquire needed information.

• The student will construct and implement effectively designed search strategies.

• The student will analyze information to identify point of view or bias.

• The student will acknowledge the use of information sources through documentation styles.

2. What “criteria” make up this outcome?

3. What does student performance “look like” at a beginning, developing, and exemplary level?

4. Enter the criteria and performance descriptions in the rubric provided.

Page 34:

CATs Defined

• Are short, formative, & ongoing

• Are focused on individual classroom environments

Page 35:

CATs & The Assessment Cycle

1. Choose a class.

2. Select an outcome to assess.

3. Choose a CAT.

4. Teach the class, use the CAT, & collect data.

5. Analyze the data.

6. Reflect on results.

7. Formulate adjustments.

8. Deploy adjustments in the next class.

http://www.lib.ncsu.edu/instruction/assessment/
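As one illustration of steps 4-5, a classic CAT such as the “muddiest point” yields short free-text responses that can be coded into themes and tallied. A hypothetical Python sketch with invented data (not from the slides):

    # Hypothetical sketch: tally coded "muddiest point" responses so the
    # most common confusion drives adjustments in the next class.
    from collections import Counter

    responses = ["boolean operators", "choosing databases",
                 "boolean operators", "citing sources",
                 "boolean operators", "choosing databases"]

    for theme, count in Counter(responses).most_common():
        print(theme, count)
    # boolean operators 3   -> re-teach first
    # choosing databases 2
    # citing sources 1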

Page 36:

Identify learning outcomes

Create learning activities

Enact learning activities

Gather data to check learning

Interpret data

Enact decisions

Review learning goals (IL standards)

Revise learning goals

Reflect

Revise on the fly based on formative feedback

Gather formative data to check comprehension

ILI Assessment Cycle
Adapted from Peggy Maki, PhD, & Marilee Bresciani, PhD, by Megan Oakleaf, PhD

Page 37:

Choosing the Best Tools for Your Campus

Page 38:

Choosing the Best Tool for Your Campus

PURPOSE

• Why are we conducting this assessment?

• Are we conducting assessment to respond to calls for accountability or accreditation processes?

• Are we conducting assessment to strengthen program performance and structure?

• Are we conducting assessment to improve student learning and librarian skills?

Article forthcoming by Megan Oakleaf & Neal Kaske

Page 39:

Choosing the Best Tool for Your Campus

STAKEHOLDER NEEDS

• Who are the stakeholders of this assessment effort?

• Are our stakeholders internal, external, or both?

• Will our audience prefer qualitative or quantitative data? Will they have other data preferences?

Article forthcoming by Megan Oakleaf & Neal Kaske

Page 40:

Choosing the Best Tool for Your Campus

UTILITY

• Will our assessment tell us what we want to know?

RELEVANCE TO LEARNING

• What will our assessment reveal about student learning?

• Will this be new information or just newly formatted old information?

Article forthcoming by Megan Oakleaf & Neal Kaske

Page 41:

Choosing the Best Tool for Your Campus

MEASURABILITY

• Will our assessment be trustworthy and accurate?

• Can we use a sample or do we need to assess an entire population?

• Do we have a baseline or do we need to establish one?

Article forthcoming by Megan Oakleaf & Neal Kaske
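As general background on the sample-vs-population question (supplied here, not from the slides): survey-sampling practice estimates the sample size needed to measure a proportion $p$ within margin of error $e$ at a confidence level with critical value $z$ as

$$n \approx \frac{z^2\,p(1 - p)}{e^2}$$

With the conservative $p = 0.5$, $z = 1.96$ (95% confidence), and $e = 0.05$, this gives $n \approx 385$ students regardless of campus size; a finite-population correction shrinks it further on small campuses.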

Page 42:

Choosing the Best Tool for Your Campus

COST

• What are the initial and ongoing costs of our assessment approach?

• What time costs will we incur?

• What financial costs will we incur?

• What personnel costs will we incur?

Article forthcoming by Megan Oakleaf & Neal Kaske

Page 43:

Choosing the Best Tool for Your Campus

INSTITUTIONAL ISSUES

• Will the tool support the goals of our overall institution?

• How will the results be used by our institution?

• If the tool does not support institutional goals, is it still worthwhile to the library internally?

Article forthcoming by Megan Oakleaf & Neal Kaske

Page 44:

Large-Scale vs. Classroom Assessment

Large-Scale Assessment

• Formal

• Objective

• Time efficient

• Cost efficient

• Centrally processed

• Reduced to single scores

• Not focused on diagnosing and targeting needs of individual learners

• Politically charged

• Designed to support program decision-making

Classroom Assessment

• Informal

• Locally developed, scored, & interpreted

• Includes instructionally valuable tasks

• Shows short-term changes in student learning

• Provides feedback to students

• Useful for making changes to curricula/activities/assignments

• Conducted in a trusting environment

• Designed to support instruction

Lorrie Shepard

Page 45:

Reporting Assessment Results

Page 46:

Identify learning outcomes

Create learning activities

Enact learning activities

Gather data to check learning

Interpret data

Enact decisions

Review learning goals (IL standards)

Report data & decisions

ILI Assessment Cycle
Adapted from Peggy Maki, PhD, & Marilee Bresciani, PhD, by Megan Oakleaf, PhD

Page 47:

Why Document & Report Results?

• No one knows you’re engaged in assessment unless you document and report it.

• Learning takes place when documenting; it enables you to “close the loop.”

• Documenting gives you evidence of accomplishments and evidence of a plan for improvement.

• Accreditation requires documentation.

Bresciani

Page 48:

The Reporting Process

• Briefly report assessment method for each outcome.

• Document where the outcome was met.

• Document where the outcome was not met.

• Document decisions made for improvements.

• Refine and repeat assessment after improvements are implemented.

Bresciani

Page 49:

Know your Data & Tell a Story

• Understand your data.

• Consider professional literature and experiences.

• Look for patterns.

• Identify the data that tells you the most about your outcome and is most helpful in making improvements.

• Summarize.

• Determine which audiences need to know about what information in order to make improvements.

Bresciani

Page 50:

Reporting to Administrators

Use a 3-part reporting strategy:

1. Provide background about the assessment effort itself.

2. Provide assessment results and answer questions stakeholders are likely to have.

3. Provide a follow-up on the status of efforts for improvement and effectiveness of changes.

What about “bad” data?

http://www.ncrel.org/sdrs/areas/issues/methods/assment/as600.htm

Page 51:

What challenges might I face?

• Lack of institutional support

• Lack of faculty support

• Lack of library administrative support

• Lack of coworker support

• Lack of money

• Lack of time

• Lack of assessment or statistical expertise

Page 52:

http://www.archives.gov/publications/prologue/2004/winter/images/we-can-do-it.jpg

Page 53:

Thank You!

Questions?

[email protected]