Plan Implement Assess Improve


Plan

Implement

Assess

Improve

Outcomes Assessment

The process of providing credible evidence of the outcomes of higher education, undertaken for the purpose of improving programs and services within the institution.

Banta, T. W.

ASSESSMENT . . .

“a rich conversation about student learning informed by data.”

-- Ted Marchese --

AAHE

Assessment of Individual Student Development

Assessment of basic skills for use in advising
• Placement
• Counseling

Periodic review of performance with detailed feedback

End-of-program certification of competence
• Licensing exams
• External examiners
• CLAST

Key Results of Individual Assessment

Faculty can assign grades

Students learn their own strengths and weaknesses

Students become self-assessors

A Second Look

Across students

Across sections

Across courses

Where is learning satisfactory?

What needs to be retaught?

Which approaches produce the most learning for which students?

Group Assessment Activities

• Classroom assignments, tests, projects
• Questionnaires for students, graduates, employers
• Interviews, focus groups
• Program completion and placement
• Awards/recognition for graduates
• Monitoring of success in graduate school
• Monitoring of success on the job

Use of Results of Group Assessment

• Program improvement

• Institutional and/or state peer review

• Regional and/or national accreditation

Some Purposes of Assessment

1. Students learn content

2. Students assess own strengths

3. Faculty improve instruction

4. Institutions improve programs/services

5. Institutions demonstrate accountability

Outcomes Assessment Requires Collaboration

In setting expected program outcomes

In developing sequence of learning experiences (curriculum)

In choosing measures

In interpreting assessment findings

In making responsive improvements

Barriers to Collaboration in the Academy

1. Graduate schools prepare specialists

2. Departments hire specialists

3. Much of our scholarship is conducted alone

4. Promotion and tenure favor individual achievements -- interdisciplinary work is harder to evaluate

Campus Interest in Assessment

WHAT WORKS in….

increasing student retention?
general education?
use of technology in instruction?
curriculum in the major?

Good assessment is good research . . .

An important question
An approach to answer the question
Data collection
Analysis
Report

-Gary R. Pike (2000)

To Foster Collaboration

Name interdisciplinary committees
Read and discuss current literature on learning/assessment
Attend conferences together
Bring experts to campus
Share good practices
Work together on learning communities

Most Faculty Are Not Trained as Teachers

FACULTY DEVELOPMENT
Can Help Instructors:

Write clear objectives for student learning in courses and curricula

Individualize instruction using a variety of methods and materials

Ask questions that make students active learners

Develop assessment tools that test higher order intellectual skills

Taxonomy of Educational Objectives

(Bloom and Others, 1956)

Cognitive domain categories and sample verbs for outcomes:

Knowledge: Identifies, defines, describes
Comprehension: Explains, summarizes, classifies
Application: Demonstrates, computes, solves
Analysis: Differentiates, diagrams, estimates
Synthesis: Creates, formulates, revises
Evaluation: Criticizes, compares, concludes

Organizing for Assessment

Goal               Course    Measure      Findings    Uses
Write                        Portfolio
Speak                        Speech
Think                        Test
Find Information             Project

Some General Education Objectives

Differentiate between fact and opinion

Gather, analyze, and interpret data

Apply ethical principles to local, national, global issues

Communicate ideas in writing effectively

Learning Outcomes in Science

1. Define and explain basic principles, concepts, theories of science

2. Identify characteristics that distinguish math and science from each other and from other ways of obtaining knowledge

3. Illustrate how developments in science can raise ethical issues

4. Solve theoretical or experimental problems in science

5. Evaluate the validity and limitations of theories and scientific claims in interpreting experimental results

6. Evaluate scientific arguments at a level encountered by informed citizens

Critical Assessment Questions

1. What should a major know and be able to do?

2. What curriculum experiences promote student attainment of

This knowledge?

These skills?

3. Are these experiences taking place?

4. How do we know students are attaining

The knowledge?

The skills?

Planning for Learning and Assessment

1. What general outcome are you seeking?

2. How would you know it (the outcome) if you saw it? (What will the student know or be able to do?)

3. How will you help students learn it? (in class or out of class)

4. How could you measure each of the desired behaviors listed in #2?

5. What are the assessment findings?

6. What improvements might be based on assessment findings?

Some Assessment History

1970 – Alverno, NE Missouri

1979 – Tennessee

1985 – VA, NJ, CO

1998 – HE Amendments - Accreditors

Purposes for Assessment

Accountability: to satisfy external stakeholders

Improvement: to make things better internally

Some external impetus is necessary to initiate outcomes assessment in higher education.

Organizational Levels for Assessment

National

Regional

State

Campus

College

Discipline

Classroom

Student

Licensing/Certification Tests

• National Teacher Exam: common and specialty areas
• Engineer in Training Exam
• NCLEX in Nursing
• CPA exam in Accounting
• Bar exam in Law
• NCARB exam in Architecture
• Board exams in Medicine, Social Work, Planning

Major Field Achievement Tests from Educational Testing Service

Princeton, New Jersey

Biology
Chemistry
Computer Science
Economics
Education
Engineering
Geology
History
Literature in English
Mathematics
Music
Physics
Political Science
Psychology
Sociology

Definitions and Assessment Methods for

Critical Thinking, Problem Solving, and Writing

By

T. Dary Erwin, James Madison University

for the National Postsecondary Education Cooperative

(U.S. Dept. of Education, National Center for Education Statistics)

Student Outcomes Pilot, Cognitive Working Group

Washington, DC 1998

Website: nces02.ed.gov/evaltests

Are Standardized Tests the Answer?

Not available in many fields
Do not measure all that is taught
Usually assess knowledge, not performance
May be standardized on unrepresentative norm group
Provide few, if any, subscores
Do not indicate why scores are low

Start with Measures You Have

Assignments in courses

Course exams

Work performance

Records of progress through the curriculum

Primary Trait Scoring

Assigns scores to attributes (traits) of a task

STEPS
Identify traits necessary for success in the assignment
Compose a scale or rubric giving a clear definition to each point
Grade using the rubric

Can Develop a Research Paper

1. Narrows and defines topic

2. Produces bibliography

3. Develops outline

4. Produces first draft

5. Produces final draft

6. Presents oral defense

Rating scale for each trait: Outstanding, Acceptable, Unacceptable

Bibliography

Outstanding – References current, appropriately cited, representative, relevant

Acceptable – References mostly current, few citation errors, coverage adequate, mostly relevant

Unacceptable – No references, or references with many citation-format errors, inadequate coverage, or irrelevant material
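To make the scoring step concrete, here is a minimal sketch, assuming point values of 3/2/1 for the Outstanding/Acceptable/Unacceptable labels, of how a primary trait rubric like the one above might be represented and applied. It is illustrative only, not part of the original presentation; the criterion notes and function name are assumptions.

# Minimal sketch of primary trait scoring (illustrative assumptions, not
# taken from the presentation): label performance on each trait, then
# convert labels to points and average them.
SCALE = {"Outstanding": 3, "Acceptable": 2, "Unacceptable": 1}

# Traits for the research-paper assignment, with short criterion notes.
TRAITS = {
    "Narrows and defines topic": "topic is focused and clearly stated",
    "Produces bibliography": "references current, correctly cited, relevant",
    "Develops outline": "outline is logical and complete",
    "Produces first draft": "draft covers all outline sections",
    "Produces final draft": "revision responds to critique",
    "Presents oral defense": "defense is clear and well supported",
}

def score_paper(ratings):
    """Average the point values for one student's per-trait ratings."""
    missing = set(TRAITS) - set(ratings)
    if missing:
        raise ValueError(f"No rating given for: {sorted(missing)}")
    points = [SCALE[label] for label in ratings.values()]
    return sum(points) / len(points)

# Example: one student's ratings across the six traits.
example = {trait: "Acceptable" for trait in TRAITS}
example["Narrows and defines topic"] = "Outstanding"
example["Presents oral defense"] = "Unacceptable"
print(round(score_paper(example), 2))  # 2.0 on the 1-3 scale

A rubric stored this way can also be aggregated across students and sections to support the group-level "second look" described earlier.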

Mapping Course Outcomes to Program Outcomes

Outcomes    Course 1    Course 2    Course 3
1
2
3
4
5
6
7
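As a rough, hypothetical illustration (not from the original slides), a curriculum map like this can be kept as a simple data structure and checked automatically for program outcomes that no course addresses; the course names and outcome assignments below are placeholders.

# Hypothetical curriculum map: which of program outcomes 1-7 each course
# addresses. Course names and assignments are placeholders.
curriculum_map = {
    "Course 1": {1, 2, 4},
    "Course 2": {2, 3, 5, 6},
    "Course 3": {4, 6, 7},
}

program_outcomes = set(range(1, 8))

# Outcomes covered somewhere in the curriculum, and any gaps left uncovered.
covered = set().union(*curriculum_map.values())
gaps = program_outcomes - covered

print("Covered:", sorted(covered))      # Covered: [1, 2, 3, 4, 5, 6, 7]
print("Gaps:", sorted(gaps) or "none")  # Gaps: none

Gaps flagged this way point to places where the curriculum, rather than the assessment, needs attention.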

Sophomore Competence in Mathematics
(Multiple-choice responses & supporting work)

Score   Criterion
3       Clear conceptual understanding, consistent notation, logical formulation, complete solution
2       Adequate understanding, careless errors, some logic missing, incomplete solution
1       Inadequate understanding, procedural errors, logical steps missing, poor or no response
0       Problem not attempted or conceptual understanding totally lacking

Ball State University

Assessment in Sociology and Anthropology

Focus groups of graduating students

Given a scenario appropriate to the discipline, a faculty facilitator asks questions related to outcomes faculty have identified in 3 areas: concepts, theory, methods.

2 faculty observers use a 0-3 scale to rate each student on each question

GROUP scores are discussed by all faculty

- Murphy & Goreham

North Dakota State University

Journal Evaluation

1. Entries accurately and vividly record objective observations of site experiences (events, people, actions, setting)

2. Entries convincingly record subjective responses to site experience (thoughts, emotions, values, judgments)

3. Entries effectively analyze/ evaluate your experiences (find insights, patterns, meaning, causes, effects)

Rating scale for each criterion: Well done, Satisfactory, Unsatisfactory

Direct Measures of Learning
Assignments, exams, projects, papers

Indirect Measures
Questionnaires, inventories, interviews

- Did the course cover these objectives?
- How much did your knowledge increase?
- Did the teaching method(s) help you learn?
- Did the assignments help you learn?

Fast Feedback
(at end of every class)

Most important thing learned
Muddiest point
Helpfulness of advance reading assignments for day’s work in class
Suggestions for improving class/assignments

- Bateman & Roberts

Graduate School of Business

University of Chicago

Student Suggestions for Improvement

Install a portable microphone
Increase type size on transparencies
Leave lights on when using projector
Don’t cover assigned reading in detail
Provide more examples in class

College Student Experiences Questionnaire (4th Edition)

SCALES

Computer and information technology
Course learning
Writing experience
Experience with faculty
Art, music, theater
Campus facilities
Clubs and organizations
Personal experiences
Student acquaintances
Scientific and quantitative experiences
Conversations
The college environment
Estimate of gains

College Student Experiences Questionnaire (sample item)

Library Experience

used library as quiet place to study
used online catalogue
asked librarian for help
read reserve book
used indexes to journal articles
developed bibliography
found interesting material by browsing
looked for further references cited
used specialized bibliographies
read document other authors cited

Self-Reports

How time is spent
- Studying (IUB)
- In all activities (Miami)

Social interactions

Diaries, journals

Portfolios

Assessing Student Growth
The Portfolio - Some Examples of Content

Course assignments
Research papers
Materials from group projects
Artistic productions
Self-reflective essays (self-assessment)
Correspondence
Taped presentations

Student Electronic Portfolio

Students take responsibility for demonstrating core skills

Unique individual skills and achievements can be emphasized

Multi-media opportunities extend possibilities

Metacognitive thinking is enhanced through reflection on contents

- Sharon J. Hamilton

IUPUI

Using Electronic Assessment Methods

We can:

Track progress in assignments
Evaluate contributions to group projects
Conduct immediate process checks to evaluate instruction
Assess the quality of written work

Faculty-Developed Exam in Religious Studies

Components:

1. Identification of topic for comprehensive paper in senior seminar (faculty critique)
2. Development of bibliography for paper (faculty critique)
3. Development of outline for paper (faculty critique)
4. Writing of first draft of paper (faculty critique)
5. Writing of final paper (faculty critique according to a set of guidelines; critique by external consultants using the same guidelines)

University of Tennessee, Knoxville

Authentic Assessment at
Southern Illinois University - Edwardsville

Business - Case Study Analysis with Memo
Education - Professional Portfolio
Psychology - Poster on Research Project
Engineering - Senior Design Project
Nursing - Plan of Care for Patient

Responses to Assessment at
Southern Illinois University - Edwardsville

Business - More case studies and research
Education - More practice in classroom management
Psychology - Curriculum change in statistics
Engineering - More practice in writing and speaking
Nursing - Simulation lab with computerized patients

In a Comprehensive Assessment Program...

INVOLVE
Students
Faculty
Student Affairs Staff
Administrators
Graduates
Employers

Guidance from Alumni

Alumni surveys emphasized that graduates valued skills in writing, speaking, working collaboratively, and information literacy

Now the Faculty Senate’s General Education Committee has developed 5 learning elements, at least 3 of which must be integrated in any course approved for general education

-Michael Dooris

Penn State University

Involving Employers

Combination of survey and focus groups for employers of business graduates

Identified skills, knowledge, personality attributes sought by employers
Encouraged faculty to make curriculum changes
Motivated students to develop needed skills
Strengthened ties among faculty, students, employers

- Kretovics & McCambridge

Colorado State University

Colorado State University
College of Business

Curriculum changes based on employer suggestions:

1 credit added to Business Communications for team training and more presentations

Ethics & social responsibility now discussed in intro courses

New Intro to Business course emphasizing career decision-making

More teamwork, oral & written communication, problem-solving in Management survey courses

- Kretovics & McCambridge

Longwood College

In 1989 – MFAT scores at 35th percentile; “a marginal program”

In 1991 – New dean engaged faculty in assessment and continuous improvement

In 1998 – MFAT scores at 96th percentile
Satisfaction of students and faculty ranked 2nd of 7 peers
AACSB accreditation with highest rating

Building a Scholarship of Assessment

- Banta & Associates

Jossey-Bass Publishers

April 2002

The Scholarship of Assessment Involves

basing assessment studies on relevant theory/practice

gathering evidence

developing a summary of findings

sharing findings with the assessment community

Some Research Traditions Underlying Assessment

Program evaluation
Organizational change and development
Cognitive psychology
Student development
Measurement
Informatics

Assessment Methods

Improve instruments to measure

content knowledge at more complex levels

affective development

effects of educational interventions

changes in learning over time

Organizational Behavior & Development

How can assessment be combined with other systemic changes to improve teaching & learning?

What patterns of organizational behavior promote and sustain assessment?

What methods of providing and managing assessment information are most effective?

Which public policy initiatives are most effective in promoting improvement on campuses?

Targets for Research on Engaging Faculty

How can we determine the interests and commitments of stakeholders?

How should we educate stakeholders for choosing methods?

How can we reduce costs and maximize assessment’s benefits?

What ethical principles should guide our work?

Derived from Michael Quinn Patton’s Utilization-Focused Evaluation (1997)

Success Factors

1. Committed leadership
2. Collaboration between faculty and student affairs leaders
3. Teamwork in planning and implementation
4. Supportive campus climate: concern for students, continuous improvement
5. Involvement in design of assessment
6. Results effectively communicated
7. Conscientious follow-up
8. Persistence

The Future

Need for evidence of accountability will increase
More faculty will recognize benefits of assessment
More electronic assessment methods will be developed
More sharing of assessment methods will take place
Faculty will learn more about learning, and student learning will improve