Selecting Assessment Tools for Gathering Evidence of Learning Outcomes
February 24, 2010
Lehman College Assessment Council


Page 1: Selecting Assessment Tools for Gathering Evidence of Learning Outcomes

Selecting Assessment Tools for Gathering Evidence of Learning Outcomes


February 24, 2010

Lehman College Assessment Council

Page 2

Timeline

Fall 2009
• Articulate learning goals and objectives for majors and programs.
• Identify learning opportunities in the curriculum and places where students demonstrate learning of objectives. (February 16 target date)

Spring 2010
• First Assessment Plan
• Programs begin gathering evidence.
• Supporting workshops
• Results and Analysis reported (end of May)
• Learning objectives on syllabi continue

Fall 2010
• First completed assessment cycle of student learning goals
• Report on how assessment results were used
• Identify a second goal and begin to gather evidence on it
• Supporting workshops through the fall semester

Spring 2011
• Middle States report due April 1
• Second completed assessment cycle of student learning goals
• Analyze evidence
• Report on how assessment results were used (May)

Ongoing assessment

Page 3

Timeline: Spring 2010

February 16: Curriculum Maps; Assessment Plans
April 16: Assessment Council workshop
May 31: Assessment Results (with supporting documents)

Ongoing: evidence gathering; meetings with ambassadors; development opportunities; *** Syllabi ***

Page 4

What do we want our students to learn?

What knowledge, skills, abilities, and habits of mind do we expect graduates of our program to have?

Page 5

Assessment Toolbox

Assessment tools recommended by Suskie (2004):
• Portfolios
• Tests (blueprinted, i.e., mapped back onto objectives)
• Focus groups
• Interviews
• Assignment scoring guides/rubrics
• Surveys

See also: Suskie, Assessing Student Learning, 2nd ed., ch. 2

Page 6

Direct vs. Indirect Evidence

Direct evidence of student learning is tangible, visible, self-explanatory evidence of exactly what students have and haven't learned.

Indirect evidence provides signs that students are probably learning, but evidence of exactly what they are learning may be less clear and less convincing.

While indirect evidence (feedback/surveys) can be useful, direct evidence is often best for getting concrete indications that students are learning what we hope they are learning.

Page 7

This Assessment Cycle: Direct Evidence

• Embedded course assignments (written/oral)
• Department-wide exams (blueprinted)
• Standardized tests (blueprinted)
• Capstone projects
• Field experiences
• Score gains, pre-test/post-test
• Videotape and audiotape evaluation of one's own performance
• Portfolio evaluation and faculty-designed examinations
• Summaries and assessments of electronic class discussion threads
• Student reflections on outcomes related to values, attitudes, beliefs

See: Suskie (2009), ch. 2, table 2.1

Page 8

What Is a Scoring Guide or Rubric?

• A list or chart describing the criteria used to evaluate or grade completed student assignments, such as presentations, papers, performances, etc.
• Includes guidelines for evaluating each of the criteria
• Serves both as a means of evaluating student work and as a way of providing meaningful feedback to students

Page 9

Using Scoring Guides/Rubrics to Assess Program Goals

How can rubrics be used to assess program learning goals?

• Embedded course assignments
• Capstone experiences
• Field experiences
• Employer feedback
• Student self-assessments
• Peer evaluations
• Portfolios

Page 10

Using a Scoring Guide/Rubric: Advantages

• Clarify vague, fuzzy statements, e.g., "Demonstrate effective writing skills"
• Help students understand expectations
• Help students self-improve (metacognition)
• Make scoring easier and faster
• Make scoring accurate, unbiased, and consistent
• Reduce arguments with students
• Help improve teaching and learning

Page 11

Developing a Scoring Guide/Rubric: Steps

Step I: Look for models.

Step II: Define the traits or learning outcomes to assess: structure, content, evidence, presentation, technical accuracy, etc.

Step III: Choose the scale/levels of performance (5-point/3-point, letter grades, Excellent to Poor, etc.).

Step IV: Draw a table.

Step V: Describe the characteristics of student work at each level. Start with the high end, then the low end, and then describe the points in between.

Step VI: Pilot-test the rubric.

Step VII: Discuss the results.
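Steps II through V amount to building a small lookup structure: traits, a scale, and a descriptor for each cell. As a minimal sketch (every trait name, level label, and descriptor below is invented for illustration, not taken from any program's actual rubric), the result might look like this in code:

```python
# Minimal sketch of Steps II-V: a rubric as a plain data structure.
# All trait names, levels, and descriptors here are hypothetical.

# Step II: traits to assess; Step III: a 3-point scale;
# Step V: a descriptor of student work at each level.
rubric = {
    "Structure": {3: "Carefully organized", 2: "Has a focus", 1: "No apparent organization"},
    "Content":   {3: "Accurate and complete", 2: "Generally accurate", 1: "Inaccurate or overly general"},
    "Evidence":  {3: "Convincing evidence", 2: "Some evidence", 1: "No supporting evidence"},
}

def score_work(ratings: dict[str, int]) -> int:
    """Sum the level chosen for each trait into a total score."""
    if set(ratings) != set(rubric):
        raise ValueError("Every trait must be rated exactly once")
    return sum(ratings.values())

# One rater's judgment of one assignment:
total = score_work({"Structure": 3, "Content": 2, "Evidence": 2})
print(total)  # 7 out of a possible 9
```

Laying the rubric out this way also makes the pilot test (Step VI) concrete: several raters score the same sample of work, and disagreements point to descriptors that need sharpening.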

Page 12

Developing a Scoring Guide / Rubric: List the Things You’re Looking For

• Why are we giving students this assignment?
• What are its key learning goals? What do we want students to learn by completing it?
• What skills do we want students to demonstrate in this assignment?
• What specific characteristics do we want to see in completed assignments?

Page 13

Using Scoring Guides/Rubrics: Rating Scale


Source: Assessing Student Learning: A Common Sense Guide by Linda Suskie

Page 14

Using a Scoring Guide/Rubric: Descriptive Rubric


Source: Assessing Student Learning: A Common Sense Guide by Linda Suskie

Page 15

How It All Fits Together: Course Embedded

EXAMPLE 1: Communication (any discipline)

Program Goal I: Students will be able to communicate effectively.

Learning Objective IA: Express ideas in a clear and coherent manner in an oral presentation.

Class: Speech 101

Assignment: Students will make a persuasive argument (pro or con) on a current domestic or international issue (health care, Afghan war, financial crisis, etc.).

Assessment Technique: Using an oral presentation rubric, students will be evaluated on Organization, Content, and Style.

Page 16

How It All Fits Together: Course Embedded Rubric

Organization
• Below Expectations (0-2): No apparent organization. Evidence is not used to support assertions.
• Satisfactory (3-5): The presentation has a focus and provides some evidence that supports conclusions.
• Exemplary (6-8): The presentation is carefully organized and provides convincing evidence to support conclusions.

Content
• Below Expectations (0-2): The content is inaccurate or overly general. Listeners are unlikely to learn anything or may be misled.
• Satisfactory (5-7): The content is generally accurate, but incomplete. Listeners may learn some isolated facts, but they are unlikely to gain new insights about the topic.
• Exemplary (10-13): The content is accurate and complete. Listeners are likely to gain new insights about the topic.

Style
• Below Expectations (0-2): The speaker appears anxious and uncomfortable, and reads notes rather than speaks. Listeners are largely ignored.
• Satisfactory (3-6): The speaker is generally relaxed and comfortable, but too often relies on notes. Listeners are sometimes ignored or misunderstood.
• Exemplary (7-9): The speaker is relaxed and comfortable, speaks without undue reliance on notes, and interacts effectively with listeners.

Total Score: ____

Source: Assessing Academic Programs in Higher Education by Mary J. Allen
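Totaling a rubric like the one above is simple arithmetic, but the per-criterion score bands are worth validating so a rater cannot record an out-of-range value. A minimal, hypothetical sketch (the maximum points come from the bands on this slide; the function and variable names are invented):

```python
# Hypothetical sketch: totaling the course-embedded rubric above.
# Maximum points per criterion, taken from the top of each band on
# the slide (Organization 0-8, Content 0-13, Style 0-9).
MAX_POINTS = {"Organization": 8, "Content": 13, "Style": 9}

def total_score(scores: dict[str, int]) -> int:
    """Validate each criterion score against its band, then sum."""
    for criterion, points in scores.items():
        if not 0 <= points <= MAX_POINTS[criterion]:
            raise ValueError(f"{criterion} score {points} is outside 0-{MAX_POINTS[criterion]}")
    return sum(scores.values())

# One student's presentation, rated on all three criteria:
print(total_score({"Organization": 6, "Content": 11, "Style": 7}))  # 24 of a possible 30
```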

Page 17

How It All Fits Together: Capstone

EXAMPLE 2: Research (any discipline)

Program Goal I: Students will understand how to conduct research.
Program Goal II: Students will be able to demonstrate proficiency in writing mechanics.
Program Goal III: Students will understand sociological concepts and theories.

Objectives: Several objectives pertaining to each of the above goals will be assessed (see the program's learning goals and objectives).

Class: Sociology 495

Assignment: Students will write a 20-page research paper on a topic in sociology. Students are expected to develop a thesis, gather and analyze information, synthesize this information to support their argument, and demonstrate proficiency in writing mechanics.

Assessment Technique: Using a research rubric developed and tested by department faculty, students' work will be evaluated on six criteria. The rubric is attached.

Page 18

How It All Fits Together: Capstone Rubric

Criteria: Thesis/Problem/Question; Information Seeking/Selecting and Evaluating; Analysis; Synthesis; Documentation; Product/Process

Level 4
• Thesis/Problem/Question: Student(s) posed a thoughtful, creative question that engaged them in challenging or provocative research. The question breaks new ground or contributes to knowledge in a focused, specific area.
• Information Seeking/Selecting and Evaluating: Student(s) gathered information from a variety of quality electronic and print sources, including appropriate licensed databases. Sources are relevant, balanced, and include critical readings relating to the thesis or problem. Primary sources were included (if appropriate).
• Analysis: Student(s) carefully analyzed the information collected and drew appropriate and inventive conclusions supported by evidence. The voice of the student writer is evident.
• Synthesis: Student(s) developed an appropriate structure for communicating the product, incorporating a variety of quality sources. Information is logically and creatively organized with smooth transitions.
• Documentation: Student(s) documented all sources, including visuals, sounds, and animations. Sources are properly cited, both in-text/in-product and on Works-Cited/Works-Consulted pages/slides. Documentation is error-free.
• Product/Process: Student(s) effectively and creatively used appropriate communication tools to convey their conclusions and demonstrated thorough, effective research techniques. The product displays creativity and originality.

Level 3
• Thesis/Problem/Question: Student(s) posed a focused question involving them in challenging research.
• Information Seeking/Selecting and Evaluating: Student(s) gathered information from a variety of relevant sources, print and electronic.
• Analysis: The product shows that good effort was made in analyzing the evidence collected.
• Synthesis: Student(s) logically organized the product and made good connections among ideas.
• Documentation: Student(s) documented sources with some care. Sources are cited, both in-text/in-product and on Works-Cited/Works-Consulted pages/slides. Few errors noted.
• Product/Process: Student(s) effectively communicated the results of the research to the audience.

Level 2
• Thesis/Problem/Question: Student(s) constructed a question that lends itself to readily available answers.
• Information Seeking/Selecting and Evaluating: Student(s) gathered information from a limited range of sources and displayed minimal effort in selecting quality resources.
• Analysis: The conclusions could be supported by stronger evidence. The level of analysis could have been deeper.
• Synthesis: Student(s) could have put greater effort into organizing the product.
• Documentation: Student(s) need to use greater care in documenting sources. Documentation was poorly constructed or absent.
• Product/Process: Student(s) need to work on communicating more effectively.

Level 1
• Thesis/Problem/Question: Student(s) relied on teacher-generated questions or developed a question requiring little creative thought.
• Information Seeking/Selecting and Evaluating: Student(s) gathered information that lacked relevance, quality, depth, and balance.
• Analysis: The conclusions simply involved restating information and were not supported by evidence.
• Synthesis: The work is not logically or effectively structured.
• Documentation: Student(s) clearly plagiarized materials.
• Product/Process: Student(s) showed little evidence of thoughtful research. The product does not effectively communicate research findings.

Page 19

Group Exercise

Write a Rubric!

Page 20

Using Surveys as an Assessment

Potential Advantages
• Measure attitudes, dispositions, values, and habits of mind
• Complement other forms of assessment data
• Triangulate data from different perspectives about how well a goal or objective is met
• Efficient way of gathering information from program completers or alumni in the workforce

Page 21

Example: Using Complementary Survey Data to Evaluate Program Outcomes

ECCE Undergraduate Program Goal: Candidates must be able to plan instructional tasks and activities appropriate to the needs of students who are culturally diverse and those with exceptional learning needs in elementary schools. They must be able to teach the literacy skills of listening, speaking, reading, and writing to native English speakers and students who are English language learners at the childhood level, including methods of reading enrichment and remediation. (NYSDOE, ACEI)

Divisional surveys of education program completers, conducted every semester, showed that graduates overall do not feel they are well prepared to teach English language learners effectively.

Page 22

Example: Using Complementary Survey Data to Evaluate Program Outcomes

ECCE courses where the goal is addressed: ECE 300, ECE 301, ECE 431, ECE 432, ECE 433

ECCE program assessment data:
• Student portfolios (lesson plans demonstrating differentiated instruction for ELLs and effectiveness at having ELLs meet lesson objectives)
• Student teaching evaluations (completed by students, teachers, supervisors)
• Online survey data at the focus school from graduates and "experts" (cooperating teachers, college supervisors) using a Likert rating scale

Recent changes in the program:
• Explicit attention to working with ELLs during field placements attached to courses prior to final-semester student teaching
• Program faculty are revisiting course content at program meetings (a return to curriculum mapping)

Page 23

Example: Using Complementary Survey Data to Evaluate Program Outcomes

One Goal. Three Assessments. Triangulation:
• Direct evidence: portfolio assessment
• Direct evidence: teaching performance evaluations scored using rubrics
• Indirect evidence: a later survey (graduates, cooperating peers) regarding student preparedness for real-world teaching

Page 24

Using Surveys as an Assessment

Potential Limitations
• Self-report data from a survey may or may not accurately reflect student learning (an indirect measure of student learning)
• Responses may be influenced by participants' perceptions of what they believe the evaluator wants to hear, e.g., if the course instructor is conducting the survey
• If used as an "add-on" assessment, participation is voluntary and may require follow-up
• Issues of sampling, validity, and reliability
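Summarizing Likert-scale survey results comes down to simple descriptive statistics, but the sampling concern above means the response rate should always be reported alongside the mean. A minimal sketch, with a hypothetical item and invented responses on an assumed 5-point scale:

```python
from statistics import mean

# Hypothetical responses to one survey item on a 5-point Likert scale
# (1 = strongly disagree ... 5 = strongly agree). Item text and data invented.
item = "I feel well prepared to teach English language learners."
responses = [4, 3, 5, 2, 4, 3, 4]   # one rating per respondent
invited = 20                         # completers who were sent the survey

avg = mean(responses)
response_rate = len(responses) / invited

# Report the response rate with the mean: even a high average from a
# small, self-selected sample is weak evidence on its own.
print(item)
print(f"mean = {avg:.2f} on a 5-point scale, response rate = {response_rate:.0%}")
```

Pairing a summary like this with direct evidence (portfolios, rubric-scored performances) is the triangulation described on the preceding slides.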

Page 25

Assessment Council Membership

• Salita Bryant (English), [email protected]
• Nancy Dubetz (ECCE), [email protected]
• *Robert Farrell (Lib), [email protected]
• Judith Fields (Economics), [email protected]
• Marisol Jimenez (ISSP), [email protected]
• Teresita Levy (LAPRS), [email protected]
• Lynn Rosenberg (SLHS), [email protected]
• Robyn Spencer (History), [email protected]
• Minda Tessler (Psych), [email protected]
• Janette Tilley (Mus), [email protected]
• Esther Wilder (Soc), [email protected]

*Committee Chair

Administrative Advisor, Assessment Coordinator:
• Ray Galinski, [email protected]

Page 26

References/Resources

Suskie, L. (2004). Assessing student learning: A common sense guide. San Francisco: Anker Publishing Co.

Suskie, L. (2009). Assessing student learning: A common sense guide (2nd ed.). San Francisco: John Wiley & Sons.

SurveyMonkey: www.surveymonkey.com