Different Perspectives on the Assessment Mandate: The Results of a Survey


Page 1: Different Perspectives on the Assessment Mandate: The Results of a Survey

2007 NASPA Assessment & Retention Conference

Different Perspectives on the Assessment Mandate: The Results of a Survey

Neil Pagano, Associate Dean
Columbia College Chicago
[email protected]

Page 2: Different Perspectives on the Assessment Mandate: The Results of a Survey


139th Belmont Stakes
Belmont S. (G1) | 1 1/2 Miles | Open | 3 Year Olds Stakes | Purse: $1,000,000

Prg. # | Horse | Jockey | Trainer | Wt. | ML
1 | Imawildandcrazyguy | Guidry M | Kaplan William A | 126 | 20-1
2 | Tiago | Smith M E | Shirreffs John | 126 | 10-1
3 | Curlin | Albarado R J | Asmussen Steven M | 126 | 6-5
4 | C P West | Prado E S | Zito Nicholas P | 126 | 12-1
5 | Slew's Tizzy | Bejarano R | Fox Gregory | 126 | 20-1
6 | Hard Spun | Gomez G K | Jones J Larry | 126 | 5-2
7 | Rags to Riches | Velazquez J R | Pletcher Todd A | 121 | 3-1

Page 3: Different Perspectives on the Assessment Mandate: The Results of a Survey


Question Posted to Assess Listserv (2004)

“Is there any evidence that a higher education assessment/evaluation of student learning program has indeed produced positive (or negative) change in the quantity or quality of what students actually learn. There seems to be a lot of anecdotal information about how assessment/evaluation programs were created and implemented, but not any actual empirical support. Considering the logistical and personnel-related ramifications of such an undertaking, any success in getting a program off the ground and moving is certainly noteworthy and to be commended. However, I am trying to prepare a report on the actual effectiveness of assessment/evaluation programs. Is there a program that is doing what it is purported to do: improving student learning. If so, is there any (weak, so-so, or solid) empirical evidence to this effect? Any good studies?”

Page 4: Different Perspectives on the Assessment Mandate: The Results of a Survey


Two Responses:

• “Assessment done well is effective and assessment done poorly is not effective.”

• Three possible explanations:
  1) Assessment is still relatively new
  2) Assessment is “decentralized or course-embedded”
  3) “Faculty already do a darned good job teaching, and their assessment results simply document that.”

Page 5: Different Perspectives on the Assessment Mandate: The Results of a Survey


Prior Research

• Peterson, Einarson, Augustine, & Vaughan (1999), Institutional Support for Student Assessment: Methodology and Results of a National Survey

• Survey - ISSA: Purposes, Methods, Structures, & Impact
  – Preparing for self-study or accreditation (1st in importance)
  – Improving the achievement of undergraduate students (2nd in importance)

Page 6: Different Perspectives on the Assessment Mandate: The Results of a Survey


Conclusions… “Institutions do not routinely use student assessment data in internal decision-making or monitor its impact on important areas of institutional and student performance. Given the extensive claims made for the value of students’ assessment and the substantial human and financial resources invested in student assessment activities, institutions need to give greater priority to examining how student assessment data is used, and how it impacts the performance of individual students and the institution itself.”

Page 7: Different Perspectives on the Assessment Mandate: The Results of a Survey


Follow-Up Study

• Peterson, Vaughan, & Perorazio (2002). Student Assessment in Higher Education: A Comparative Study of Seven Institutions
  – “Exemplary” institutions for “benchmarking”
  – Ten domains, including Initiating Conditions, Institutional Approach, Culture, and Utilization
  – Only one institution (Wake Forest University) used assessment results “extensively”

Page 8: Different Perspectives on the Assessment Mandate: The Results of a Survey


Research Questions

1. What are the reasons for undertaking assessment?

2. What assessment methods are used and which are valued?

3. How effective have these assessment efforts been?

4. What variables (institution type, control, respondent position) affect responses to Questions 1, 2, and 3?

Page 9: Different Perspectives on the Assessment Mandate: The Results of a Survey


Survey: Four Sections

I. Purpose
II. Methods Used
III. Methods Valued
IV. Effect of Assessment Efforts

Page 10: Different Perspectives on the Assessment Mandate: The Results of a Survey


Survey Distribution and Responses

• Two Listservs: Assess (University of Kentucky) and Communities of Practice

• Snowball Sampling for Further Coverage
• 331 Total Completes

Page 11: Different Perspectives on the Assessment Mandate: The Results of a Survey


Limitations

• Sampling Method not Random
  – Purposive method likely to recruit the “choir”
• Mixture of Respondents
  – Some from same institution
• Basic Statistical Analysis
  – ANOVA, t-tests, and chi-square

Page 12: Different Perspectives on the Assessment Mandate: The Results of a Survey


Survey Respondents by Institutional Type and Position

Position | Associate | Baccalaureate | Masters/Doctoral | Total
Faculty | 51 | 45 | 61 | 157
Assessment Leader | 34 | 28 | 63 | 125
Administrator | 21 | 5 | 23 | 49
Total | 106 | 78 | 147 | 331
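Not part of the original slides: given these counts, a chi-square test of independence is a natural way to check whether the position mix differs across institution types (the deck applies chi-square tests later, to methods valued). A minimal sketch, assuming Python with numpy and scipy available:

```python
import numpy as np
from scipy.stats import chi2_contingency

# Observed counts from the table above.
# Rows: Faculty, Assessment Leader, Administrator
# Columns: Associate, Baccalaureate, Masters/Doctoral
counts = np.array([
    [51, 45, 61],
    [34, 28, 63],
    [21,  5, 23],
])

chi2, p, dof, expected = chi2_contingency(counts)
print(f"chi2({dof}) = {chi2:.2f}, p = {p:.3f}")
```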

Page 13: Different Perspectives on the Assessment Mandate: The Results of a Survey


Purposes of Assessment by Institution Type - ANOVA

Importance for Conducting Assessment at Institution a

Purpose | All (n = 331) | Associate (n = 106) | Baccalaureate (n = 78) | Masters/Doctoral (n = 147) | F | p
1. Preparing for accreditation or responding to the requirements of accrediting association | 3.72 (.55) | 3.72 (.55) | 3.60 (.71) | 3.78 (.45) | 2.505 | .083
2. Meeting state reporting requirements | 2.96 (1.03) | 3.40 (.74) | 2.42 (1.10) | 2.93 (1.04) | 22.620 | .000
3. Guiding internal resource allocation decisions | 2.80 (.98) | 3.01 (.92) | 2.57 (.95) | 2.78 (.89) | 5.157 | .006
4. Guiding undergraduate academic program improvement | 3.32 (.83) | 3.38 (.79) | 3.29 (.89) | 3.29 (.82) | .450 | .638
5. Demonstrating student achievement | 3.33 (.79) | 3.45 (.73) | 3.26 (.86) | 3.29 (.80) | 1.631 | .197
6. Improving faculty instructional performance | 2.88 (.94) | 3.03 (.99) | 2.68 (.97) | 2.87 (.87) | 3.135 | .045

a 1 = no importance; 2 = minor importance; 3 = moderate importance; 4 = very important. Standard deviations shown in parentheses.
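The F statistics above can be reproduced from the published summary statistics alone, since a one-way ANOVA needs only each group's n, mean, and SD. A sketch (not from the deck) for item 2, "Meeting state reporting requirements"; the small gap from the reported F = 22.620 comes from rounding in the published means and SDs:

```python
from scipy.stats import f as f_dist

# (n, mean, sd) per institution type: Associate, Baccalaureate, Masters/Doctoral,
# copied from the row for "Meeting state reporting requirements".
groups = [(106, 3.40, 0.74), (78, 2.42, 1.10), (147, 2.93, 1.04)]

n_total = sum(n for n, _, _ in groups)
grand_mean = sum(n * m for n, m, _ in groups) / n_total

# Between- and within-group sums of squares from summary statistics.
ss_between = sum(n * (m - grand_mean) ** 2 for n, m, _ in groups)
ss_within = sum((n - 1) * s ** 2 for n, _, s in groups)
df_between, df_within = len(groups) - 1, n_total - len(groups)

F = (ss_between / df_between) / (ss_within / df_within)
p = f_dist.sf(F, df_between, df_within)
print(f"F({df_between}, {df_within}) = {F:.2f}, p = {p:.4f}")  # about 23, vs. reported 22.620
```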

Page 14: Different Perspectives on the Assessment Mandate: The Results of a Survey


Purposes of Assessment by Position - ANOVA

Importance for Conducting Assessment at Institution a

Purpose | All (n = 331) | Faculty (n = 157) | Assessment Leader (n = 125) | Administrator (n = 49) | F | p
1. Preparing for accreditation or responding to the requirements of accrediting association | 3.72 (.55) | 3.75 (.56) | 3.66 (.55) | 3.73 (.53) | .902 | .407
2. Meeting state reporting requirements | 2.96 (1.03) | 3.06 (1.05) | 2.73 (1.04) | 3.22 (.82) | 5.756 | .003
3. Guiding internal resource allocation decisions | 2.80 (.93) | 2.84 (.96) | 2.63 (.91) | 3.10 (.77) | 4.937 | .008
4. Guiding undergraduate academic program improvement | 3.32 (.83) | 3.23 (.84) | 3.35 (.82) | 3.53 (.77) | 2.691 | .069
5. Demonstrating student achievement | 3.33 (.79) | 3.29 (.79) | 3.30 (.85) | 3.57 (.61) | 2.591 | .076
6. Improving faculty instructional performance | 2.88 (.94) | 2.87 (.97) | 2.85 (.93) | 2.96 (.89) | .247 | .781

a 1 = no importance; 2 = minor importance; 3 = moderate importance; 4 = very important. Standard deviations shown in parentheses.

Page 15: Different Perspectives on the Assessment Mandate: The Results of a Survey


Purposes: A Comparison to the ISSA

Purpose | Administrator (n = 49) Mean (SD) | ISSA (n = 1379) Mean (SD) | t
1. Preparing for accreditation or responding to the requirements of accrediting association b | 3.73 (.53) | 3.86 (.65) | -.68
2. Meeting state reporting requirements | 3.22 (.82) | 2.89 (1.18) | 2.70*
3. Guiding internal resource allocation decisions | 3.10 (.77) | 2.71 (.91) | 3.48*
4. Guiding undergraduate academic program improvement | 3.53 (.77) | 3.43 (.72) | .90
5. Demonstrating student achievement | 3.57 (.61) | 3.48 (.71) | 1.10
6. Improving faculty instructional performance | 2.96 (.89) | 3.02 (.82) | -.46

a 1 = no importance; 2 = minor importance; 3 = moderate importance; 4 = very important
b The item on the ISSA was slightly different: “Preparing institutional self-study for accreditation.” All other items on the survey were identical to the ISSA.
*p < .01
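These t values can likewise be approximated from the published means, SDs, and group sizes. A sketch (not from the deck) using the unequal-variance (Welch) form for item 3; small differences from the reported values reflect rounding and whichever t variant the author actually ran:

```python
import math

def welch_t(m1, s1, n1, m2, s2, n2):
    """Welch's t and approximate df from group means, SDs, and sizes."""
    v1, v2 = s1 ** 2 / n1, s2 ** 2 / n2
    t = (m1 - m2) / math.sqrt(v1 + v2)
    df = (v1 + v2) ** 2 / (v1 ** 2 / (n1 - 1) + v2 ** 2 / (n2 - 1))
    return t, df

# Item 3: Administrator respondents vs. the ISSA sample.
t, df = welch_t(3.10, 0.77, 49, 2.71, 0.91, 1379)
print(f"t = {t:.2f}, df ~ {df:.0f}")  # roughly 3.5, vs. the reported 3.48
```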

Page 16: Different Perspectives on the Assessment Mandate: The Results of a Survey


Assessment Methods Used

Scale: 1 = not used; 2 = used in some areas; 3 = used in most areas; 4 = used in all areas

Assessment Method (N = 331) | Mean
Student Surveys | 2.74
Student Papers | 2.57
Student Projects | 2.51
Alumni Surveys | 2.31
Capstone Courses | 2.24
Anecdotal Evidence | 2.23
Departmental Exams | 2.22
Student Portfolios | 2.05
Commercial Exams | 1.90
Student Interviews and Focus Groups | 1.82
Employer Surveys | 1.75
Transcript Analysis | 1.72

Page 17: Different Perspectives on the Assessment Mandate: The Results of a Survey


Assessment Methods Used by Institution - ANOVA

Extent of Method Used a

Method | All (n = 331) | Associate (n = 106) | Baccalaureate (n = 78) | Masters/Doctoral (n = 147) | F | p
Student Surveys | 2.74 (.84) | 2.71 (.91) | 2.82 (.85) | 2.72 (.79) | .48 | .619
Alumni Surveys | 2.31 (.80) | 2.10 (.78) | 2.35 (.84) | 2.43 (.77) | 5.37 | .005
Departmental Exams | 2.22 (.86) | 2.22 (.93) | 2.09 (.81) | 2.28 (.84) | 1.17 | .309
Student Papers | 2.57 (.80) | 2.42 (.93) | 2.79 (.75) | 2.57 (.82) | 4.82 | .009
Student Projects | 2.51 (.77) | 2.37 (.79) | 2.63 (.75) | 2.55 (.76) | 2.81 | .061
Student Portfolios | 2.05 (.58) | 1.89 (.52) | 2.07 (.55) | 2.15 (.62) | 5.94 | .003

a 1 = not used; 2 = used in some areas; 3 = used in most areas; 4 = used in all areas. Standard deviations shown in parentheses.

Page 18: Different Perspectives on the Assessment Mandate: The Results of a Survey


Assessment Methods Used by Institution – ANOVA (cont.)

Extent of Method Used a

Method | All (n = 331) | Associate (n = 106) | Baccalaureate (n = 78) | Masters/Doctoral (n = 147) | F | p
Capstone Courses | 2.24 (.87) | 1.78 (.75) | 2.66 (.90) | 2.36 (.77) | 29.39 | .000
Transcript Analysis | 1.72 (.89) | 1.78 (.95) | 1.61 (.77) | 1.73 (.90) | .862 | .423
Commercial Exams | 1.90 (.77) | 1.85 (.83) | 1.86 (.72) | 1.95 (.74) | .594 | .553
Employer Surveys | 1.75 (.61) | 1.80 (.69) | 1.56 (.62) | 1.82 (.52) | 5.17 | .006
Student Interviews/Focus Groups | 1.82 (.54) | 1.64 (.52) | 1.88 (.54) | 1.91 (.53) | 8.61 | .000
Anecdotal Evidence | 2.23 (.94) | 2.18 (.95) | 2.33 (1.02) | 2.22 (.88) | .609 | .545

a 1 = not used; 2 = used in some areas; 3 = used in most areas; 4 = used in all areas. Standard deviations shown in parentheses.

Page 19: Different Perspectives on the Assessment Mandate: The Results of a Survey


Assessment Methods Valued

Method (n = 330) | Yes (%) | No (%)
Student Projects | 78.80 | 21.20
Student Portfolios | 77.00 | 23.00
Student Surveys | 74.80 | 25.20
Student Papers | 73.90 | 26.10
Capstone Courses | 73.30 | 26.70
Alumni Surveys | 70.30 | 29.70
Student Int/Focus Groups | 66.70 | 33.30
Departmental Exams | 60.60 | 39.40
Employer Surveys | 52.40 | 47.60
Commercial Exams | 42.40 | 57.60
Anecdotal Evidence | 34.80 | 65.20
Transcript Analysis | 26.70 | 73.30

Page 20: Different Perspectives on the Assessment Mandate: The Results of a Survey


Methods Used vs. Methods Valued

Rank | Methods Used | Methods Valued (use rank in parentheses)
1 | Student Surveys | Student Projects (3)
2 | Student Papers | Student Portfolios (8)
3 | Student Projects | Student Surveys (1)
4 | Alumni Surveys | Student Papers (2)
5 | Capstone Courses | Capstone Courses (5)
6 | Anecdotal Evidence | Alumni Surveys (4)
7 | Departmental Exams | Stu. Int./Focus Groups (10)
8 | Student Portfolios | Departmental Exams (7)
9 | Commercial Exams | Employer Surveys (11)
10 | Stu. Int./Focus Groups | Commercial Exams (9)
11 | Employer Surveys | Anecdotal Evidence (6)
12 | Transcript Analysis | Transcript Analysis (12)
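One way to quantify how much the two orderings agree (this computation is not in the slides) is a Spearman rank correlation across the twelve methods, assuming scipy is available:

```python
from scipy.stats import spearmanr

# Per method, its rank in the "used" ordering and in the "valued" ordering,
# read directly off the two columns above (methods listed in used-rank order).
used_rank   = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12]
valued_rank = [3, 4, 1, 6, 5, 11, 8, 2, 10, 7, 9, 12]

rho, p = spearmanr(used_rank, valued_rank)
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")  # moderate agreement, rho ~ 0.68
```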

Page 21: Different Perspectives on the Assessment Mandate: The Results of a Survey


Methods Valued by Institution Type - Chi Square

• Associate Institutions placed less value in:
  – Alumni Surveys (p = .008)
  – Capstone Courses (p = .002)

• Baccalaureate Institutions placed less value in Employer Surveys (p = .008)

Page 22: Different Perspectives on the Assessment Mandate: The Results of a Survey


Methods Valued by Position – Chi Square

• Faculty placed relatively less value in 9 of the 12 methods. Statistically significant differences in:
  – Departmental Exams (p = .006)
  – Student Papers (p = .001)
  – Student Portfolios (p = .016)
  – Capstone Courses (p < .001)
  – Commercial Exams (p = .046)
  – Student Interviews/Focus Groups (p = .023)

• Faculty placed more value in Anecdotal Evidence, though the difference was not statistically significant (p = .132)

Page 23: Different Perspectives on the Assessment Mandate: The Results of a Survey


Perspectives on the Effects of the Assessment Mandate: Four Survey Items

1. Our institutional assessment efforts have been effective.

2. Our institutional assessment efforts have identified areas where we need to make curricular/programmatic changes.

3. We have made curricular/programmatic changes as a result of our assessment.

4. It is important that every institution have an assessment plan.

• No differences by institution type or control

Page 24: Different Perspectives on the Assessment Mandate: The Results of a Survey


Perspective on Effect of Assessment by Position - ANOVA

Perspective | All (n = 328) | Faculty (n = 156) | Assessment Leader (n = 123) | Administrator (n = 49) | F | p
1. Assessment efforts have been effective. | 3.31 (.99) | 3.13 (1.04) | 3.43 (.91) | 3.61 (.91) | 5.965 | .003
2. Assessment efforts have identified areas for curricular/programmatic changes. | 3.66 (.92) | 3.50 (1.01) | 3.80 (.81) | 3.79 (.82) | 4.303 | .014
3. We have made curricular/programmatic changes. | 3.59 (.99) | 3.44 (1.06) | 3.69 (.90) | 3.80 (.93) | 3.415 | .034
4. It is important that every institution have an assessment plan. | 4.35 (.83) | 4.22 (.87) | 4.53 (.74) | 4.33 (.87) | 4.922 | .008

a 1 = strongly disagree; 2 = disagree; 3 = neither agree nor disagree; 4 = agree; 5 = strongly agree. Standard deviations shown in parentheses.
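The table reports F and p but no effect sizes. As an illustrative extra step (not in the deck), eta-squared for a one-way ANOVA can be recovered from F and its degrees of freedom; with n = 328 and three position groups, the df are 2 and 325:

```python
def eta_squared(F, df_between, df_within):
    """Proportion of variance explained, recovered from an F statistic."""
    return (F * df_between) / (F * df_between + df_within)

# Reported F values from the four survey items above.
reported = [("efforts effective", 5.965),
            ("identified areas for change", 4.303),
            ("made changes", 3.415),
            ("plan is important", 4.922)]

for label, F in reported:
    print(f"{label}: eta^2 = {eta_squared(F, 2, 325):.3f}")
# All four position effects are statistically significant but small (eta^2 < .04).
```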

Page 25: Different Perspectives on the Assessment Mandate: The Results of a Survey


Closing Comments

• Institution Type Matters
  – Different institutions have different priorities and purposes
• Position Matters
  – Faculty, Assessment Leaders, and Administrators differ on purposes, methods valued, and the ultimate effect of the mandate

Page 26: Different Perspectives on the Assessment Mandate: The Results of a Survey


Closing Comments

• Accreditation is an Important Lever
  – Effects of revised expectations
• Need to Know More
  – US Higher Ed Post-Spellings Commission
  – What is “Assessment?”

Page 27: Different Perspectives on the Assessment Mandate: The Results of a Survey


Different Perspectives on the Assessment Mandate: The Results of a Survey

Neil Pagano, Associate Dean
Columbia College Chicago
[email protected]