LACCD Program Assessment


DESCRIPTION

Presentation given by RCC’s Arend Flick during the Program Assessment Workshop held on October 16, 2009, at Pierce College.

TRANSCRIPT

Assessing program-level learning outcomes:

theory and practice

Presentation for the Los Angeles Community College District, October 16, 2009

Arend Flick, Associate Professor, English

A reminder: why we’re here . . .

“It is extremely difficult to argue as a responsible academic that it is wrong to gather information about a phenomenon, or that it is inappropriate to use the results for collective betterment.” (Peter Ewell)

“Seasoned observers have pointed out the irony of the academy, as an institution dedicated to discerning the truth through evidence, being so seemingly resistant to measuring quality through evidence.” (American Association of State Colleges and Universities)

What you should know and be able to do after this session . . .

• You should be able to write learning outcomes for a program.

• You should become more familiar with methods (direct and indirect) available to you to assess learning in a particular program.

• You should become more familiar with how you might make assessment of programs a systematic part of your college’s processes.

• You will have produced draft program-level outcomes statements and draft assessment plans for modification and (eventually) adoption by your colleges.

Why Assess Programs?

• To demonstrate that they work (i.e., for accountability purposes)

• To help in planning, resource allocation, etc.

• To identify problem areas (e.g., misalignment of courses with program goals) that can lead to improvement

What Is a Program?

– Title V (section 55000) says it’s “an organized sequence of courses leading to a defined objective, a degree, a certificate, a diploma, a license, or transfer to another institution of higher education.”

– Title V also stipulates that 18 semester hours are required at a minimum.

Which means . . .

• LACCD has literally hundreds of programs to assess (partly because you have so many majors).

• The largest program at any of your colleges is general education.

• But you also have other interdisciplinary programs (e.g., honors, Puente, basic skills, study abroad, IGETC).

• Your career-tech programs are probably already far along in the process of assessing learning (even if they don’t know it).

And Means . . .

• We need to find ways of identifying and assessing program-level SLOs that:

– are not unduly burdensome (or intrusive),

– lead to improvement in teaching and learning,

– satisfy increasingly rigorous accountability demands,

– assist in planning processes,

– AND meet ACCJC standards.

Why program-level assessment is the greatest challenge for CCs

• So many programs (and many “phantom” programs)

• So many interdisciplinary programs, often with no one directly responsible for ensuring their effectiveness

• Assessment methods that work well for university programs often don’t work as well for us

adapted in part from Mary J. Allen's work at CSU Bakersfield

Assessing Program-Level Outcomes

1. Define the program’s learning outcomes (what we want students to be able to think, do, or know when they’ve completed it).

2. Check for alignment between curriculum and outcomes.

3. Develop an assessment plan.

4. Collect assessment data.

5. Use this information for improvement, for planning, for accountability, for resource allocation.

6. Routinely examine the assessment process itself.

What Are SLOs? (a Refresher)

• They can be defined at any instructional level, ranging from the specific lesson all the way up to the institution.

• As opposed to “objectives,” they emphasize application of knowledge (what a student can do at the end of a course of instruction).

• They are not discrete or highly specific skills but aggregated complexes of knowledge, ability, and attitudes. They represent the broadest goals of the course or program.

Program-Level SLO Suggestions

• The fewer the better.

• Don’t get too hung up on language: this is only the first (and by far the simplest) part of the assessment cycle, and you can change/refine SLOs later as necessary.

• Make sure the outcome is something that can be assessed.

• Work collaboratively as much as possible.

• Solicit advice from advisory groups, licensing/accrediting boards, etc., and do a Google or ERIC search of counterpart program SLOs at other community colleges.

• Think about course-program alignment, but recognize that some course SLOs won’t map to program SLOs (and some program SLOs will depend on OLEs for achievement).

Some Examples of Programmatic SLOs

• Computer/Business Applications Certificate:

1. Productively work as a team member with people from diverse backgrounds in a workplace environment.

2. Communicate effectively in support of a business office, including production and design of complex electronic and paper-based correspondence and documents.

3. Use the Internet, a wide variety of computer applications, and standard business procedures to compute, analyze business performance, and solve problems.

4. Actively assist in implementing general office procedures, including records management.

5. Demonstrate a high degree of self-management and self-awareness in terms of workplace responsibility and productivity.

• Music:

1. Demonstrate understanding of the fundamental melodic, harmonic, and rhythmic structure of music.

2. Demonstrate fluency with the language of music in written and aural form.

3. Perform on an instrument (or voice) at college sophomore level.

4. Perform effectively in a musical ensemble.

5. Use the piano keyboard to demonstrate and apply musical concepts.

6. Demonstrate understanding of the historical development of music.

• English:

1. 80% of a sample of graduating English majors in a literature survey course will be able to score at least 70% on a test designed to measure their success in identifying authors, in placing them in their historical periods, and in knowing the titles of their major works. (an operational SLO: with a sample of 30 majors, for instance, at least 24 would need to score 70% or higher)

2. At graduation, students are able to write a clear, coherent, and persuasive essay demonstrating their ability to analyze and interpret texts, to apply secondary criticism to them, and to explain their contexts.

Some Practical Advice in Writing Programmatic SLOs

• Identify the most important things a student should leave your program being able TO DO (or know). Address student competency rather than content coverage. (Try for no more than five SLOs.)

• Consider: course SLOs and major assignments; transfer alignment needs; external licensure/accreditation requirements; employer needs; alumni feedback. Use active verbs to craft sentences that are clear and intelligible to students.

• Ensure that the SLO is assessable and measurable.

• Share draft SLOs with colleagues to sharpen focus.

Some hands-on practice

• Using the worksheet, let’s spend 20 minutes or so trying to develop a short (but comprehensive) list of outcomes for a degree or certificate in our discipline.

• In the last five minutes, let’s trade with someone outside the discipline for feedback.

Program SLO Checklist

• Outcomes are written using action verbs.

• The language indicates the program’s big picture, not nuts & bolts.

• Outcomes describe what students can DO, not what the program’s goals are.

• They address student competency (how they apply what they’ve learned) rather than content coverage.

Once We Have Programmatic SLOs . . .

• Where should they appear?

– In the college catalog?

– On the college website (and any program-focused web pages)?

– On brochures, posters, etc. that describe our programs?

– Other?

Some direct (i.e., performance-based) methods to assess learning in programs

• Look at work that students are already doing in courses to determine if, and to what extent, it demonstrates their achievement of program-level competencies.

• Administer nationally normed (or locally developed) tests of program-level competencies.

• Have students reflect on their values, attitudes, and beliefs (if developing these is an intended outcome of the program).

• In C-T programs, have employers rate the skills of recent graduates.

• In C-T programs, use scores and pass rates on appropriate licensure exams that can be aligned with program SLOs.

Using student work produced in classes to assess programs

• In capstone courses (or de facto capstone courses)

• Through portfolios, either paper or electronic (i.e., by taking a second look at essays, projects, presentations, etc. that students are already doing in courses)

• Through individual faculty assessments at the course level, either informally or through an assessment data management system

• As a beginning, through a course-program assessment matrix

A course-program assessment matrix

• If you’ve defined (and are assessing) course-level SLOs, you can use a matrix to gather information about course-level learning that potentially maps to program-level competencies (a minimal sketch follows the caveats below).

• Two caveats:

– A student may not be in a specific program just because she is taking a class required by that program. (Most “GE” courses serve three or more “programs” simultaneously.)

– Students may achieve some program-level outcomes through co-curricular activities, not coursework.
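For concreteness, here is a minimal sketch of such a matrix, with hypothetical course numbers, outcome names, and mappings (none drawn from an actual program; Python is used only for illustration). The check at the end flags any program-level outcome that no course addresses, the kind of misalignment “Why Assess Programs?” asks us to catch.

    # A minimal, hypothetical course-program assessment matrix.
    # Rows are courses; columns are program-level SLOs; True means
    # course-level assessment evidence maps onto that program outcome.
    matrix = {
        "ENG 101": {"analyze texts": True,  "write persuasively": True,  "explain contexts": False},
        "ENG 210": {"analyze texts": True,  "write persuasively": False, "explain contexts": False},
        "ENG 250": {"analyze texts": False, "write persuasively": True,  "explain contexts": False},
    }

    # Flag program-level outcomes that no course in the matrix addresses.
    program_slos = sorted({slo for mapping in matrix.values() for slo in mapping})
    for slo in program_slos:
        if not any(mapping.get(slo) for mapping in matrix.values()):
            print(f"Alignment gap: no course maps to '{slo}'")

The point of the matrix is exactly this visibility: a gap (here, “explain contexts”) signals either a curriculum problem or, per the second caveat, an outcome achieved co-curricularly.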

Some problems with direct methods to assess programs

• They are often labor-intensive

• They often don’t allow for cross-institutional comparison

• They too often tacitly depend on the judgments of individual instructors, working in isolation from each other, about their own students

• In the case of externally designed standardized tests, instructors may mistrust results

Some indirect methods to assess learning in programs

• Student surveys of self-perceived learning gains

• Student engagement surveys (e.g., the CCSSE)

• Alumni surveys

• Focus groups (of exiting students, alumni, etc.)

• Faculty surveys

• For C-T courses, surveys of or interviews with employers

• For C-T courses, employment data

• IR data on student retention and success in programs (though these can be deceptive indicators of “student learning”)

G.D. Kuh, "Using student and alumni surveys for accountability in higher education" (2005)

A note on student self-reports

They are likely to be moderately valid measures of student learning when:

1. The information requested is known to students.

2. Questions are phrased clearly and unambiguously.

3. Questions refer to recent activities.

4. Students think questions merit a serious and thoughtful response.

5. Questions don’t threaten or embarrass the student.

Developing an assessment plan

Some models:

• Riverside Community College District

• Cabrillo College

• Community College of Baltimore County

• University of South Alabama (using TracDat)

Some features of good systematic program-level assessment

• Units are expected to undertake program-level assessment (PLA) cycles regularly and report results annually. Reports include all Nichols-column information: 1) the SLO(s) assessed, 2) the assessment method(s) employed, 3) a brief description of the data generated by the assessment, and 4) how results were used for improvement. (A record sketch appears under “Reporting assessment results” below.)

• Assessment reports are read and used as part of strategic planning processes. Resource allocation decisions take assessment results into consideration (e.g., requests for new technology are fulfilled on the basis of how it has been shown to improve learning).

• It is a component of the program review process, which itself is a component of planning processes.

Reporting assessment results
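As one possible standardized format (a minimal sketch with hypothetical field names and example content, not a prescribed template), a report record can carry exactly the four Nichols columns listed above:

    from dataclasses import dataclass

    @dataclass
    class AssessmentReport:
        """One annual program-level assessment record (the four Nichols columns)."""
        slo_assessed: str     # 1) the SLO(s) assessed this cycle
        method: str           # 2) the assessment method(s) employed
        data_summary: str     # 3) brief description of the data generated
        use_of_results: str   # 4) how results were used for improvement

    # A hypothetical example record:
    report = AssessmentReport(
        slo_assessed="Write a clear, coherent, and persuasive essay",
        method="Portfolio sample scored with a shared rubric",
        data_summary="42 of 60 sampled essays met the rubric benchmark",
        use_of_results="Added a documented-essay unit to the capstone course",
    )

Records this small are easy to keep in a standardized format and to collect into the searchable database recommended below.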

Some final thoughts on making this work for us . . .

• Annual assessment updates should be short, in a standardized format, and preferably organized into a searchable database.

• A routine expectation of administrators and faculty is that they participate regularly in the assessment process, with faculty preferably working collaboratively to define and assess outcomes and using results for improvement.

• Hold off (for now) on interdisciplinary program assessment, but eventually assign responsibility (maybe to “coordinating disciplines”).

• New programs should have SLOs and assessment plans as a condition of approval.

• Planners need to use assessment results and program reviews to define what the institution does well (and not well). Their decisions need to be driven by empirical evidence related to student learning.
