
Selecting and Developing Assessment Approaches and Methods
Presented by Jennifer Fager, Xavier University, for the University of Wisconsin-Superior Enhancement Day, 1/19/2011

TRANSCRIPT

  • Slide 1
  • Presented by Jennifer Fager, Xavier University, for the University of Wisconsin-Superior Enhancement Day, 1/19/2011
  • Slide 2
  • Guiding Principle: The assessment of student learning should become a system in which planning, data collection, analysis, and improvement are all included. Reactions?
  • Slide 3
  • Burning Questions: What sorts of things should a professor try to assess? Having decided on what to assess, how should a professor go about assessing it? What sorts of things should faculty try to assess in their programs? Having decided on what to assess, how should faculty go about assessing these things?
  • Slide 4
  • What Is Your Assessment Pattern? Why do you assess what you assess? What are the reasons professors construct and use assessment instruments? To identify areas of deficiency and to understand end-of-instruction targets.
  • Slide 5
  • Data-Driven Assessment: Professors make inferences, and then decisions based upon those inferences. What does an A mean? A B? A C? An F? Thus, it is important to clarify, before the test is created, the decisions that will be influenced by students' performances.
  • Slide 6
  • How Do You Do That? It's time to give your midterm. Before you do so, there are several questions that need to be addressed: What should be tested? What topics were discussed, and for how long? What type of items will you use? How long will students have to take the exam? How many items/points should be used, given the amount of time available?
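One way to make those blueprint questions concrete is a quick proportional-allocation calculation. The sketch below is a hypothetical illustration only: the topic names, minutes of instruction, and per-item timing are invented, and the presentation does not prescribe any particular formula.

```python
# Hypothetical exam blueprint: allocate items in proportion to the
# instructional time spent on each topic, within the exam's time limit.
instruction_minutes = {   # class time per topic (invented numbers)
    "Topic A": 150,
    "Topic B": 100,
    "Topic C": 50,
}
exam_minutes = 60         # time students have for the midterm
minutes_per_item = 2      # rough estimate of time needed per item

total_items = exam_minutes // minutes_per_item
total_instruction = sum(instruction_minutes.values())

for topic, minutes in instruction_minutes.items():
    share = minutes / total_instruction
    print(f"{topic}: about {round(share * total_items)} items "
          f"({share:.0%} of instruction time)")
```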
  • Slide 7
  • Two Fundamental Questions: What evidence do you have that students achieve your stated learning outcomes? In what ways do you analyze and use evidence of student learning?
  • Slide 8
  • Another Question or Two: What changes have you made to your programs, your institution, or your courses based upon evidence collected? What evidence do you currently possess that might inform essential decisions that need to be made?
  • Slide 9
  • Defining Evidence: Information that tells you something, directly or indirectly, about the topic of interest. Evidence is neutral, neither good nor bad, and requires context to be meaningful. There are two types of assessment evidence: direct and indirect.
  • Slide 10
  • Direct Evidence: Students show achievement of learning goals through performance of knowledge and skills: scores and pass rates on licensure/certification exams; capstone experiences; individual research projects, presentations, and performances; collaborative (group) projects/papers that tackle complex problems; score gains between entry and exit; ratings of skills provided by internship/clinical supervisors; substantial course assignments that require performance of learning; portfolios; course assignments; others?
  • Slide 11
  • Indirect Evidence: Attitudes, perceptions, satisfaction, and experiences of learning and the learning environment: students' self-assessments of learning; local student, alumni, and employer surveys and questionnaires; course evaluations; national engagement and satisfaction surveys (NSSE, CCSSE, FSSE, BCSSE, SSI (Noel Levitz)); focus groups (student, faculty, employer); interviews (student, faculty, employer); others?
  • Slide 12
  • Finding Evidence: An Evidence Inventory lets you discover the evidence you already have, such as Institutional Research data, Student Life data, exit surveys (seniors), and alumni surveys. Start with the obvious, but don't stop there.
  • Slide 13
  • Finding Evidence: Perils and Pitfalls. Institutional history ("We've already done that, and it didn't tell us anything!"); territory and politics (fighting for scant resources); institutional policy/culture about sharing information ("I don't want somebody policing my classrooms!"); who owns the evidence?
  • Slide 14
  • Finding Evidence: Appropriateness. Does the evidence address student learning issues appropriate to the institution or the program? Does the evidence tell you something about how well the institution or program is accomplishing its mission and goals? The questions you have about student learning should guide your choice of appropriate existing evidence and identify gaps where a new type of evidence might be needed.
  • Slide 15
  • Evidence Example: Attached to this packet are data and analysis examples: Writing Results Rubric; IR Survey; CAAP results; "Students Will Think Critically" form; Student Affairs data; library data.
  • Slide 16
  • Assisting Academic Departments: Some Assumptions. Faculty are intensely interested in what students are learning. Assessment occurs in classrooms and academic departments every day. Evidence of student learning already exists in academic departments. The challenge is not to convince academic departments to gather evidence, but rather to help them recognize and use evidence they already have.
  • Slide 17
  • Assisting Academic Departments: Addressing Common Barriers. "This is a lot of work!" Use some sort of evidence inventory to help faculty understand how existing academic practices yield evidence. Keep expectations reasonable, given limited time and resources. Offer assistance and rewards. Remember: it is not necessary to gather all the evidence all of the time.
  • Slide 18
  • Assessment Inventory: One Example. Inventory of Written Statements and Plans: 1. Do you have a written mission statement or statement of purpose? yes / no. If yes, please attach a copy or reference where this can be found: ________ 2. Do you have a written statement of intended educational outcomes describing what a student should know or be able to do when they have completed this program? yes / no. 3. Do you have a written method of assessment for measuring student outcomes? yes / no. 4. Does your program have a separate accreditation process? yes / no.
  • Slide 19
  • Assessment Inventory: One Example. Direct Methods of Assessment: 1. ________ Comprehensive Examinations 2. ________ Writing Proficiency Examinations 3. ________ National Examinations assessing subject matter knowledge 4. ________ Graduate Record Exam General Test 5. ________ Graduate Record Exam Subject Test 6. ________ Certification Examinations 7. ________ Licensure Examinations 8. ________ Locally developed pre-test or post-test for subject matter knowledge 9. ________ Major paper/project 10. ________ Program/course portfolios 11. ________ Capstone coursework 12. ________ Audio/video tape of presentations/performances
  • Slide 20
  • Assisting Academic Departments: Addressing Common Barriers. "How do I know you won't use this against me?" Be consistent and firm in the message that assessment is not faculty evaluation and that results will only be reported in the aggregate. Partner with faculty willing to engage in the process and make their evidence public. Link assessment results to allocation of resources, ideally through a strategic planning process. If appropriate, develop policies regarding assessment.
  • Slide 21
  • Assisting Academic Departments: Addressing Common Barriers. "My students pass the tests. Why isn't that good enough?" Tests often measure only content knowledge. Learning = what students know (content knowledge) + what they can do with what they know (performance). Grades are generally not linked to specific learning outcomes and don't aggregate well. Modify course tests to measure learning outcomes by adding performance assessments.
  • Slide 22
  • Modifying Tests to Gather Direct Evidence of Learning: Identify questions on the test that provide evidence of a learning outcome: five questions that require the use of deductive reasoning to arrive at the right answer; open-ended questions that require students to solve a unique problem given the knowledge/skills learned. Isolate those questions and look for patterns of performance: the average grade in the class was a B, but 85% of the students missed four of the questions requiring deductive reasoning; 70% of students were able to use a particular theory/approach to resolve the problem.
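A minimal sketch of the item-level pattern analysis this slide describes, assuming per-item scores have already been pulled out of a gradebook; the item numbers and student response records are invented for illustration.

```python
# Isolate the questions tagged to one learning outcome (here, deductive
# reasoning) and look at performance on just those items, independent of
# overall course grades.
deductive_items = [3, 7, 12, 15, 21]   # items requiring deductive reasoning

# Each record maps item number -> 1 (correct) or 0 (incorrect); invented data.
students = [
    {3: 0, 7: 1, 12: 0, 15: 0, 21: 1},
    {3: 1, 7: 0, 12: 0, 15: 1, 21: 0},
    {3: 0, 7: 0, 12: 1, 15: 0, 21: 1},
]

for item in deductive_items:
    pct_correct = sum(s[item] for s in students) / len(students)
    print(f"Item {item}: {pct_correct:.0%} correct")

# If most of these items show low percentages even when overall grades are
# fine, that pattern is direct evidence about the deductive-reasoning outcome.
```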
  • Slide 23
  • Meaningful Evidence: Situated within the institutional and departmental mission and context; addresses relevant questions; analyzed and interpreted in relation to other evidence. Examples?
  • Slide 24
  • Meaningful Evidence: Facts + Context Fact: National survey data indicates seniors do not feel a sense of engagement and belonging on our campus.
  • Slide 25
  • Meaningful Evidence: Facts + Context Fact: Seniors feel disengaged from our campus (national survey data) Fact: Seniors would recommend this institution to other people (senior exit surveys)
  • Slide 26
  • Meaningful Evidence: Facts + Context Fact: Seniors feel disengaged from our campus (national survey data) Fact: Seniors would recommend this institution to other people (senior exit surveys) Context: Over the past five years, an average of 82% of first-year alums donated to the institution
  • Slide 27
  • Recognizing Meaningful Evidence: How compelling is your evidence? Does it make you want to do something? Will it make others want to do something? How relevant is your evidence? To what is it linked: departmental mission, institutional initiatives? How trustworthy is your evidence? How was it gathered? Who does it represent? Is it one piece? Several pieces?
  • Slide 28
  • HLC Expanded Fundamental Questions: What evidence do you have that students achieve your stated learning outcomes? * Who actually measures the achievement of student learning outcomes? * At what points in the curriculum or co-curricular activities are essential institutional (including general education), major, or program outcomes assessed? * How is evidence of student learning collected? * How extensive is the collection of evidence? In what ways do you analyze and use evidence of student learning? * Who analyzes the evidence? * What is your evidence telling you about student learning? * What systems are in place to ensure that conclusions are drawn and actions taken on the basis of the analysis of evidence? * How is evidence of the achievement of student learning outcomes incorporated into institutional planning and budgeting?
  • Slide 29
  • Meaningful Evidence: Example. Senior exit surveys indicate dissatisfaction with the amount of time spent on clinical skills. Departmental assessment of skill ability and development finds that, of the critical skills required, students are outstanding on three of them, satisfactory on two, and not acceptable on two. Internship evaluations from supervisors consistently cite lack of ability in clinical skills.
  • Slide 30
  • Meaningful Evidence: Qualitative Data. Appropriate uses: exploring an issue in more depth; answering specific questions about individual experience (e.g., How are you different now than you were before? How did living with a host family inform your understanding of the culture?); including student voices.
  • Slide 31
  • Qualitative Data Analysis: Open-Ended Questions. Read the data. Strip and code the data while looking for themes and patterns. Present the data thematically (it will lead you somewhere): academic advising, general education, student perceptions of particular courses.
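As a rough illustration of the strip-and-code step, the sketch below tallies open-ended responses against a keyword dictionary of themes. The themes, keywords, and responses are invented; in practice, coding is interpretive work done by readers, with software at most assisting.

```python
from collections import Counter

# Hypothetical theme dictionary: theme -> keywords that suggest it.
themes = {
    "academic advising": ["advisor", "advising", "registration"],
    "general education": ["gen ed", "general education", "core course"],
    "course concerns": ["instructor", "fail", "prerequisite", "420"],
}

# Invented open-ended survey responses.
responses = [
    "My advisor never answered my questions about registration.",
    "I thought I was going to fail 420 and I'm a good student.",
    "The gen ed requirements felt disconnected from my major.",
]

# Code each response: count it under every theme whose keywords appear.
counts = Counter()
for response in responses:
    text = response.lower()
    for theme, keywords in themes.items():
        if any(k in text for k in keywords):
            counts[theme] += 1

for theme, n in counts.most_common():
    print(f"{theme}: {n} response(s)")
```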
  • Slide 32
  • Qualitative Data Example: "420 was a senior-level course, but I felt like a freshman! There was no way I knew all of that stuff." "I thought I was going to fail 420, and I'm a good student." "I didn't know how to do anything in 420, and the instructor didn't care. We kept saying we didn't know, but he just kept going. It was ridiculous."
  • Slide 33
  • Qualitative Data Example: Drill down into the data by asking pertinent questions: What are the learning goals of 420? How did students perform in 420? What are the assumptions about students entering 420 (skill level? knowledge base?)? Analyze the program curriculum map: Where do students learn prerequisite skills and/or knowledge? How and where are program and course learning outcomes (expectations) assessed? Are they assessed?
  • Slide 34
  • Using Assessment Results: Inform policy decisions; strategic allocation/reallocation of resources; make changes in curriculum; support new initiatives; accountability; inform stakeholders about expectations and results; improve teaching and learning on campus.
  • Slide 35
  • Presenting Assessment Results: Consider the audience: Who are they? What's important to them? How will they use assessment information in their lives? Choose an appropriate presentation: present data thematically/topically; link data and interpretations to institutional initiatives or departmental strategic planning (provide a context).
  • Slide 36
  • Assessing and Improving Assessment: Were the assessments reasonable and manageable? Did they answer your questions? Did they tell you something about student learning? Were you able to use the evidence you gathered? What else do you need to know?
  • Slide 37
  • Questions and Comments: Where do you need to go from here? What is your assessment system for the program in which you teach? Is the system understood by all stakeholders? Does the system reflect the discipline?
  • Slide 38
  • Helpful Sources: Diamond, Robert M. Designing and Assessing Courses & Curricula (1998); Allen, Mary J. Assessing Academic Programs in Higher Education (2004); Huba, Mary E., and Jann E. Freed. Learner-Centered Assessment on College Campuses (2000); Suskie, Linda. Assessing Student Learning: A Common Sense Guide (2004); Walvoord, Barbara E. Assessment Clear and Simple (2004).