Data Collection and Closing the Loop: Assessment’s Third and Fourth Steps
Office of Institutional Assessment & Effectiveness, SUNY Oneonta, Spring 2011


TRANSCRIPT

Page 1

Data Collection and Closing the Loop: Assessment’s Third and Fourth Steps

Office of Institutional Assessment & Effectiveness, SUNY Oneonta, Spring 2011

Page 2

Important Assessment “Basics”

Establishing congruence among institutional goals, programmatic and course objectives, learning opportunities, and assessments

Linkages to disciplinary (and, as appropriate, accreditation/certification) standards

Using a variety of measures, both quantitative and qualitative, in search of convergence

Value of course-embedded assessment

Course- vs. program-level assessment

Page 3

Course- vs. Program-Level Assessment

Focus of SUNY Oneonta assessment planning is programmatic student learning objectives

Not about assessment of individual students or faculty

Rather, the question is: To what extent are students achieving programmatic objectives?

Data collection will still, for the most part, take place in the context of the classroom (i.e., course-embedded assessment)

However, program must have process in place for compiling and aggregating data across courses and course sections, as appropriate

Page 4

What You’ve Done So Far

1. Development of programmatic student learning objectives

Including discipline-appropriate as well as college-wide expectations for student learning

Covering cognitive, behavioral, and attitudinal characteristics as appropriate

2. Curriculum mapping

Determining the extent to which learning objectives correspond to curricular experiences

Reviewing rationale for program requirements and structure

Exploring potential for developing “assessment database,” leading directly to Step 3

Page 5

Sample Curriculum Map – Linking Step 2 to Step 3

Courses (rows) are mapped against SLOs 1–7 (columns); cell entries indicate the assessment(s) used for a given objective in a given course:

Introductory Course: E; E
History/Theories: E; P; P
Methods: E, L; E, L; L
Required Course 1: E; E; E, P; P
Required Course 2: P; P; P
Required Course 3: E; E; P
Required Course 4: I, PO; I, PO
Capstone: PO; PO; PO; PO; PO

Assessment Key: P = Paper, E = Exam, PO = Portfolio, O = Oral Presentation, L = Lab Assignment, I = Internship
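Where a program pursues the “assessment database” idea from Step 2, the curriculum map can also be kept as a small data structure. Below is a minimal sketch in Python, offered purely as an illustration: the course names, SLO assignments, and method codes are hypothetical, not taken from the sample map above. It records which assessment covers which SLO in each course and flags objectives that no course currently assesses.

```python
# Hypothetical curriculum map: course -> {SLO number: assessment code}
# Codes follow the key above (P=Paper, E=Exam, PO=Portfolio, O=Oral, L=Lab, I=Internship).
curriculum_map = {
    "Introductory Course": {1: "E", 2: "E"},
    "Methods":             {2: "E,L", 3: "L"},
    "Capstone":            {1: "PO", 3: "PO", 5: "PO"},
}

PROGRAM_SLOS = range(1, 8)  # SLOs 1-7

def uncovered_slos(cmap, slos=PROGRAM_SLOS):
    """Return the SLOs that no course in the map currently assesses."""
    covered = {slo for methods in cmap.values() for slo in methods}
    return [slo for slo in slos if slo not in covered]

print(uncovered_slos(curriculum_map))  # -> [4, 6, 7] for this hypothetical map
```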

Page 6

Collecting Assessment Data: Assessment’s Third Step

Finding Evidence that Students are Achieving Programmatic Goals

Page 7

Important Preliminary Activities

Reach consensus as a faculty on what constitutes good assessment practice

No point in collecting meaningless data!

Develop strategies for assuring that measures to be used are of sufficient quality

Review by person/group other than the faculty member who developed the measure

Use of checklist that demonstrates how measure meets good practice criteria developed by program faculty

Decide how issue of “different sections” will be addressed

Will same measures be used? If not, how will comparability be assured?

Page 8

Assuring Quality of Plan: Questions to Ask

Are assessment measures direct?

Student perceptions of the program are valuable, but cannot be the only indicator of learning

Is there logical correspondence between the measure(s) and the learning objective(s) being assessed?

Is there a process for establishing reliable scoring of qualitative measures?

Are data being collected from a range of courses across the program (i.e., are they representative)?
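On the question of reliable scoring of qualitative measures, one common check is to have two raters score the same sample of student work with the shared rubric and measure how often they agree. The sketch below is a hypothetical Python illustration (the category labels and ratings are invented); it computes simple percent agreement and Cohen’s kappa, which corrects agreement for chance.

```python
from collections import Counter

def percent_agreement(rater_a, rater_b):
    """Proportion of items on which two raters assign the same category."""
    return sum(a == b for a, b in zip(rater_a, rater_b)) / len(rater_a)

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement: (p_observed - p_expected) / (1 - p_expected)."""
    n = len(rater_a)
    p_obs = percent_agreement(rater_a, rater_b)
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    p_exp = sum(counts_a[c] * counts_b[c] for c in counts_a) / (n * n)
    return (p_obs - p_exp) / (1 - p_exp)

# Invented ratings from two faculty raters on the same ten student papers
a = ["meets", "exceeds", "meets", "approaches", "meets", "meets", "exceeds", "meets", "approaches", "meets"]
b = ["meets", "meets", "meets", "approaches", "meets", "exceeds", "exceeds", "meets", "meets", "meets"]
print(round(percent_agreement(a, b), 2), round(cohens_kappa(a, b), 2))  # 0.7 0.42
```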

Page 9

Suggestions for Maximizing Value of Assessment Data

Use a variety of assessment measures

Quantitative and qualitative

Course-embedded and “stand-alone” measures (e.g., ETS Major Field Tests, CLA results)

Use benchmarking as appropriate and available

Ultimately, convergence of assessment results is ideal (i.e., triangulation)

Establish a reasonable schedule for collecting assessment data on an ongoing basis (i.e., approximately 1/3 of learning objectives per year)

Page 10

Suggestions for Maximizing Value of Assessment Data (cont.)

For each learning objective, collect assessment data from a variety of courses at different levels as much as possible

Helps assure results aren’t “idiosyncratic” to one course or faculty member

Can provide insight into extent to which students are “developing” (cross-sectionally, anyway)

Page 11

Also Consider the Following:

The value of a capstone experience for collecting assessment data

“Double dipping” (i.e., using the same evaluative strategies and criteria to assign grades and produce programmatic assessment data)

Working closely with other faculty in developing measures, especially when teaching courses with multiple sections

Do measures have to be the same? No, but the more different they are, the harder it will be to compile data and reach meaningful conclusions

Page 12

From Learning Objectives to Assessment Criteria

Once measures are selected, establish clear and measurable a priori “success” indicators

For each measure, determine what constitutes meeting and not meeting standards

While these definitions may vary across faculty, programs will need to use the same categories for results (e.g., exceeding, meeting, approaching, not meeting standards)

Otherwise, reaching conclusions about “program effectiveness” will be difficult

Again, the more faculty collaborate with each other in establishing standards, the easier it will be to organize results and reach meaningful conclusions

Ultimately, it’s a programmatic decision
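As a concrete illustration of a priori success indicators, the sketch below shows one hypothetical way a program might translate a raw score on a measure into the shared reporting categories named above; the 80/70/60 cutoffs are invented for illustration and would be set by program faculty.

```python
# Hypothetical cutoffs mapping a 0-100 score to the shared reporting categories.
CUTOFFS = [(80, "exceeding standards"), (70, "meeting standards"), (60, "approaching standards")]

def category(score):
    """Return the reporting category for a raw score on one measure."""
    for cutoff, label in CUTOFFS:
        if score >= cutoff:
            return label
    return "not meeting standards"

print(category(85), "|", category(72), "|", category(55))
```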

Page 13

Post-Assessment Considerations

Once data are collected, they must be organized and maintained in a single place

An Excel spreadsheet will work just fine

They will also have to be compiled in some fashion, although the form this takes will depend on the program’s approach

One possibility: Examine for each learning objective the overall percentage of students who met or failed to meet standards (using averages)

Or: Break these percentages down by course level

Ultimately, some systematic organization and categorization of assessment results is necessary in order to move on to Step 4
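Compilation of this kind can be done directly in the spreadsheet; the sketch below simply illustrates the same calculation in Python. It assumes, hypothetically, that the spreadsheet has been exported to a CSV file with columns named objective, course_level, and category (one row per student per measure), and it reports the percentage of students meeting or exceeding standards for each objective by course level.

```python
import csv
from collections import defaultdict

MET = {"meeting standards", "exceeding standards"}

def percent_meeting(path):
    """Return {(objective, course_level): % of students who met or exceeded standards}."""
    totals, met = defaultdict(int), defaultdict(int)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            key = (row["objective"], row["course_level"])
            totals[key] += 1
            if row["category"].strip().lower() in MET:
                met[key] += 1
    return {key: 100 * met[key] / totals[key] for key in totals}

# Hypothetical file name and layout: objective,course_level,category
for (objective, level), pct in sorted(percent_meeting("assessment_results.csv").items()):
    print(f"{objective} ({level}-level): {pct:.0f}% met or exceeded standards")
```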

Page 14

Closing the Loop: Assessment’s Fourth Step

Using Assessment Data to Improve Programs, Teaching, and Learning

Page 15

Now That You’ve Gone to All This Trouble…

The only good reason to do assessment is to use the results to inform practice

Can and should happen at the individual faculty level, but in the context of program assessment, the following needs to happen:

Provision of compiled, aggregated data to faculty for review and consideration

Group discussion of those data

DOCUMENTATION of the assessment process and results, conclusions reached by faculty, and actions to be taken (more about this later)

Page 16

What Should Be the Focus of the Closing the Loop Process?

Identification of “patterns of evidence” as revealed by the assessment data

How are data consistent? Do students at different course levels perform similarly? Eventually, it will be possible to look at this issue over time

How are they distinctive? Do students perform better on some objectives than others?

Comparison of expected to actual results

What expectations were confirmed? What came as a complete surprise?

What are possible explanations for the surprise?

Page 17

What Should Be the Focus of the Closing the Loop Process? (cont.)

The decision as to whether assessment results are “acceptable” to faculty in the program

What strengths (and weaknesses) are revealed? What explains the strengths and weaknesses?

Do they make sense, given results of curriculum mapping process and other information (e.g., staffing patterns, course offerings)?

And, most important, what should (and can) the program do to improve areas of weakness?

Process also provides an ideal opportunity to make changes in assessment process itself as well as in programmatic objectives for the next assessment round

Page 18

Some Possible Ways to Close the Assessment Loop

Faculty, staff, and student development activities

Program policies, practices, and procedures

Curricular reform

Learning opportunities

Page 19

A Final Issue: The Importance of Documenting Assessment

Increasing requirements related to record-keeping on assessment and actions that are taken based on assessment results

Frequently, actions that are taken don’t “match” results

Documentation need not be highly formal, and in fact can be effectively done in tabular form for each objective, to include:

Summary of results

Brief description of strengths and weaknesses revealed by data

Planned revisions to make improvements as appropriate

Planned revisions to the assessment process itself

Provides record that can then be referred to in later assessment rounds and a way of monitoring progress over time
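As a hypothetical illustration of that tabular format, a program’s documentation might devote one row to each learning objective, with the entries below serving only as placeholders:

Objective | Summary of Results | Strengths and Weaknesses Revealed | Planned Program Revisions | Planned Revisions to Assessment Process
SLO 1 | [brief summary] | [brief description] | [planned actions, if any] | [planned changes to measures or process]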

Page 20

Developing an Assessment Plan: Some Important Dates

May 3, 2010: Submission of Step 1 (Establishing Objectives) of college guidelines

December 1, 2010: Submission of Step 2 (Activities & Strategies) of guidelines

June 1, 2011: Submission of Steps 3 (Assessment) and 4 (Closing the Loop) [plans only]

2011-12 academic year: First round of data collection

Page 21

APAC Members

Paul French, Josh Hammonds, Michael Koch, Richard Lee, Patrice Macaluso, William Proulx, Anuradhaa Shastri, Bill Wilkerson (Chair), Patty Francis (ex officio)