A Basic Toolbox for Assessing Institutional Effectiveness

Michael F. Middaugh
Assistant Vice President for Institutional Research and Planning
University of Delaware
Commissioner and Vice Chair, Middle States Commission on Higher Education

Posted on 22-Dec-2015

Page 1

Page 2

Workshop Objectives

• Identify context for why assessment of institutional effectiveness is important

• Identify key variables in developing measures for assessing institutional effectiveness

• Identify appropriate data collection strategies for measuring those variables

• Identify appropriate strategies for communicating information (NOTE THAT I DID NOT SAY DATA!) on institutional effectiveness

Page 3

Context

Page 4

Robert Zemsky and William Massy - 1990

“[The academic ratchet] is a term to describe the steady, irreversible shift of faculty allegiance away from the goals of a given institution, toward those of an academic specialty. The ratchet denotes the advance of an entrepreneurial spirit among faculty nationwide, leading to increased emphasis on research and publication, and on teaching one’s specialty in favor of general introduction courses, often at the expense of coherence in an academic curriculum. Institutions seeking to enhance their own prestige may contribute to the ratchet by reducing faculty teaching and advising responsibilities across the board, enabling faculty to pursue their individual research and publication with fewer distractions. The academic ratchet raises an institution’s costs, and it results in undergraduates paying more to attend institutions in which they receive less attention than in previous decades.”

(Zemsky and Massy, 1990, p. 22)

Page 5

Boyer Commission on Educating Undergraduates - 1998

“To an overwhelming degree, they [research universities] have furnished the cultural, intellectual, economic, and political leadership of the nation. Nevertheless, the research universities have too often failed, and continue to fail, their undergraduate populations…Again and again, universities are guilty of advertising practices they would condemn in the commercial world. Recruitment materials display proudly the world-famous professors, the splendid facilities and ground breaking research that goes on within them, but thousands of students graduate without ever seeing the world-famous professors or tasting genuine research. Some of their instructors are likely to be badly trained or untrained teaching assistants who are groping their way toward a teaching technique; some others may be tenured drones who deliver set lectures from yellowed notes, making no effort to engage the bored minds of the students in front of them.”

(Boyer Commission, pp. 5-6)

Page 6

U.S. News “America’s Best Colleges” - 1996

“The trouble is that higher education remains a labor-intensive service industry made up of thousands of stubbornly independent and mutually jealous units that support expensive and vastly underused facilities. It is a more than $200 billion-a-year economic enterprise – many of whose leaders oddly disdain economic enterprise, and often regard efficiency, productivity, and commercial opportunity with the same hauteur with which Victorian aristocrats viewed those ‘in trade’… The net result is a hideously inefficient system that, for all its tax advantages and public and private subsidies, still extracts a larger share of family income than almost anywhere else on the planet…”

(America’s Best Colleges, p. 91)

Page 7

National Commission on the Cost of Higher Education - 1998

• “…because academic institutions do not account differently for time spent directly in the classroom and time spent on other teaching and research activities, it is almost impossible to explain to the public how individuals employed in higher education use their time. Consequently, the public and public officials find it hard to be confident that academic leaders allocate resources effectively and well. Questions about costs and their allocation to research, service, and teaching are hard to discuss in simple, straightforward ways and the connection between these activities and student learning is difficult to draw. In responding to this growing concern, academic leaders have been hampered by poor information and sometimes inclined to take issue with those who asked for better data. Academic institutions need much better definitions and measures of how faculty members, administrators, and students use their time.”

(National Commission on Cost of Higher Education, p. 20)

Page 8

Spellings Commission on the Future of Higher Education - 2006

“We believe that improved accountability is vital to ensuring the success of all of the other reforms we propose. Colleges and universities must become more transparent about cost, price, and student success outcomes, and must willingly share this information with students and families. Student achievement, which is inextricably connected to institutional success, must be measured by institutions on a “value-added” basis that takes into account students’ academic baseline when assessing their results. This information should be available to students, and reported publicly in aggregate form to provide consumers and policymakers an accessible, understandable way to measure the relative effectiveness of different colleges and universities.”

(Spellings Commission, p.4)

Page 9

Middle States Accreditation Standards
Expectations: Assessment & Planning

It is the Commission’s intent, through the self-study process, to prompt institutions to reflect on those assessment activities currently in place (both for institutional effectiveness and student learning), to consider how these assessment activities inform institutional planning, and to determine how to improve the effectiveness and integration of planning and assessment.

Page 10

MSCHE Linked Accreditation Standards: Standard 14: Student Learning Outcomes

Assessment of student learning demonstrates that, at graduation, or other appropriate points, the institution’s students have knowledge, skills, and competencies consistent with institutional and appropriate higher education goals.

Page 11

Selected Fundamental Elements for MSCHE Standard 14

• Articulated expectations for student learning (at institutional, degree/program, and course levels)

• Documented, organized, and sustained assessment processes (that may include a formal assessment plan)

• Evidence that student learning assessment information is shared and used to improve teaching and learning

• Documented use of student learning assessment information as part of institutional assessment

Page 12

MSCHE Linked Accreditation Standards: Standard 7: Institutional Assessment

The institution has developed and implemented an assessment process that evaluates its overall effectiveness in achieving its mission and goals and its compliance with accreditation standards.

Page 13

Selected Fundamental Elements for MSCHE Standard 7

• Documented, organized, and sustained assessment processes to evaluate the total range of programs and services, achievement of mission, and compliance with accreditation standards

• Evidence that assessment results are shared and used in institutional planning, resource allocation and renewal.

• Written institutional strategic plan(s) that reflect(s) consideration of assessment results

Page 14

MSCHE Linked Accreditation Standards: Standard 2: Planning, Resource Allocation, and Institutional Renewal

An institution conducts ongoing planning and resource allocation based on its mission and goals, develops objectives to achieve them, and utilizes the results of its assessment activities for institutional renewal. Implementation and subsequent evaluation of the success of the strategic plan and resource allocation support the development and change necessary to improve and to maintain quality.

Page 15

Selected Fundamental Elements for MSCHE Standard 2

• Clearly stated goals and objectives that reflect conclusions drawn from assessments that are used for planning and resource allocation at the institutional and unit levels

• Planning and improvement processes that are clearly communicated, provide for constituent participation, and incorporate the use of assessment results

• Assignment of responsibility for improvement and assurance of accountability

Page 16

Variables

While we will discuss several variables today that contribute to assessment of institutional effectiveness, keep in mind that you don’t have to measure everything.

PRIORITIZE within the context of your institution’s culture and needs.

Page 17

Students

• Admitted
• Entering
• Continuing
• Non-Returning
• Graduating
• Alumni

Page 18

Environmental Issues

• Student and Faculty Engagement
• Student and Staff Satisfaction
• Employee Productivity
• Compensation
  - Market
  - Equity
• Campus Climate
• Economic Impact

Page 19

STUDENTS

Page 20

Admitted Students

• What can we learn from monitoring admissions cycles?

• What additional drill down is needed to fully understand student admissions behavior?

Page 21

A Typical Admissions Monitoring Report

Eastern Seaboard State University
Weekly Admissions Monitoring Report

Campus Summary: First-Time Freshman Applicants, Their SAT Scores and Predicted Grade Index, by Admission Status and by Residency Status, for the Entering Classes in the Fall of 2005 as of 09/15/05; Fall 2006 as of 09/18/06; and Fall 2007 as of 09/13/07.

Each cell lists the 2005 / 2006 / 2007 values; column groups run: All Applicants | Admission Denied | Admission Offered | Admission Accepted | Ratio of Offers to Total Applications | Ratio of Accepts to Offers (Yield).

Counts
- Resident:      2,340 / 2,332 / 2,088 | 109 / 148 / 172 | 1,940 / 1,877 / 7,147 | 1,362 / 1,255 / 1,174 | 0.83 / 0.80 / 3.42 | 0.70 / 0.67 / 0.61
- Nonresident:  18,984 / 19,209 / 20,133 | 7,506 / 5,871 / 5,838 | 7,295 / 8,101 / 8,489 | 2,078 / 2,201 / 2,348 | 0.38 / 0.42 / 0.42 | 0.28 / 0.27 / 0.28
- Total:        21,324 / 21,541 / 22,221 | 7,615 / 6,019 / 6,010 | 9,235 / 9,978 / 15,636 | 3,440 / 3,456 / 3,522 | 0.43 / 0.46 / 0.70 | 0.37 / 0.35 / 0.23

SAT Verbal (All Applicants | Denied | Offered | Accepted)
- Resident:     557 / 562 / 569 | 456 / 444 / 445 | 567 / 576 / 584 | 559 / 564 / 575
- Nonresident:  572 / 579 / 584 | 541 / 535 / 536 | 609 / 615 / 620 | 592 / 600 / 604
- Total:        571 / 577 / 582 | 540 / 532 / 533 | 600 / 607 / 614 | 579 / 587 / 594

SAT Math (All Applicants | Denied | Offered | Accepted)
- Resident:     563 / 567 / 574 | 454 / 446 / 452 | 547 / 582 / 589 | 566 / 570 / 580
- Nonresident:  596 / 598 / 605 | 561 / 551 / 554 | 636 / 635 / 643 | 619 / 620 / 627
- Total:        592 / 595 / 602 | 560 / 549 / 551 | 623 / 625 / 634 | 598 / 602 / 611

Predicted Grade Index (All Applicants | Denied | Offered | Accepted)
- Resident:     2.67 / 2.75 / 2.83 | 1.66 / 1.60 / 1.71 | 2.75 / 2.78 / 2.85 | 2.75 / 2.78 / 2.85
- Nonresident:  2.90 / 2.94 / 2.94 | 2.40 / 2.36 / 2.41 | 2.98 / 3.04 / 3.05 | 2.98 / 3.05 / 3.04
- Total:        2.82 / 2.88 / 2.88 | 2.22 / 2.30 / 2.37 | 2.90 / 2.95 / 2.98 | 2.89 / 2.95 / 2.98
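The two ratio columns in the report are simple quotients of the count columns. A minimal sketch of the arithmetic, using the Fall 2005 resident counts from the table above:

```python
# Offer rate and yield, as computed in the weekly monitoring report.

def offer_rate(offered: int, applicants: int) -> float:
    """Ratio of admission offers to total applications."""
    return offered / applicants

def yield_rate(accepted: int, offered: int) -> float:
    """Ratio of accepts to offers (yield)."""
    return accepted / offered

# Fall 2005 resident counts from the table above.
applicants, offered, accepted = 2340, 1940, 1362

print(f"{offer_rate(offered, applicants):.2f}")  # 0.83
print(f"{yield_rate(accepted, offered):.2f}")    # 0.70
```

Tracking these two ratios week by week is what makes the report useful: a falling offer rate signals rising selectivity, while a falling yield signals weakening student demand.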

Page 22

Drilling Down

• Why do some students to whom we extend an offer of admission choose to attend our institution?

• Why do other students to whom we extend an offer of admission choose to attend a different school?

• How is our institution perceived by prospective students within the admissions marketplace?

• What sources of information do students draw upon in shaping those perceptions?

• What is the role of financial aid in shaping the college selection decision?

Page 23

Survey Research is Useful in Addressing These Questions

• “Home-Grown” College Student Selection Survey

• Commercially Prepared
  - College Board Admitted Student Questionnaire
  - College Board Admitted Student Questionnaire-Plus

• Commercially prepared instruments allow for benchmarking

Page 24

Academic Preparedness of Respondent Pool

Average High School Grades of ASQ Respondents

                              All Admitted   Enrolling    Non-Enrolling
                              Students       Students     Students
Average High School Grades
  N (%)                       1664 (65%)     1011 (66%)   653 (64%)
  A (90 to 100)               76%            72%          82%
  B (80 to 89)                24%            27%          18%
  C (70 to 79)                *%             *%           *%
  D (69 or Below)             0%             0%           0%

Page 25

Academic Preparedness of Respondent Pool
Average SAT Critical Reading Scores of ASQ Respondents

                              All Admitted   Enrolling    Non-Enrolling
                              Students       Students     Students
SAT Critical Reading
  N (%)                       1489 (59%)     897 (59%)    592 (58%)
  Mean Score                  610            594          634
  Median Score                610            600          630
  700 and Above               13%            9%           20%
  650 to 690                  19%            16%          24%
  600 to 640                  26%            26%          27%
  550 to 590                  24%            27%          19%
  500 to 540                  12%            14%          8%
  450 to 490                  4%             6%           1%
  400 to 440                  2%             2%           1%
  350 to 390                  *%             0%           *%
  300 to 340                  *%             *%           *%
  Below 300                   *%             *%           *%

Page 26

Academic Preparedness of Respondent Pool

Average SAT Mathematical Scores of ASQ Respondents

                              All Admitted   Enrolling    Non-Enrolling
                              Students       Students     Students
SAT Mathematical
  N (%)                       1490 (59%)     897 (59%)    593 (58%)
  Mean Score                  630            617          651
  Median Score                640            620          660
  700 and Above               19%            14%          26%
  650 to 690                  27%            24%          31%
  600 to 640                  24%            25%          23%
  550 to 590                  19%            22%          13%
  500 to 540                  8%             9%           6%
  450 to 490                  3%             4%           1%
  400 to 440                  1%             2%           *%
  350 to 390                  *%             *%           *%
  300 to 340                  *%             *%           *%
  Below 300                   0%             0%           0%

Page 27

The survey allows respondents to rate 16 items with respect to their influence on the college selection decision. In choosing which institution to attend, the top three considerations for both enrolling and non-enrolling students are availability of majors, academic reputation of the institution, and commitment to teaching undergraduates. These are followed closely by educational value for price paid.

Page 28

What Admitted Students are SeekingImportance of Selected College Characteristics to ASQ Respondents

                                          All Admitted   Enrolling    Non-Enrolling
                                          Students       Students     Students

Availability of Majors                    89%            90%          88%
Academic Reputation                       81%            82%          80%
Undergraduate Teaching Commitment         78%            80%          74%
Value for the Price                       77%            79%          73%
Quality of Academic Facilities            69%            72%          64%
Personal Attention                        64%            65%          63%
Quality of Social Life                    64%            67%          60%
Attractiveness of Campus                  61%            66%          54%
Cost of Attendance                        60%            62%          56%
Surroundings                              60%            62%          56%
Quality of Campus Housing                 58%            60%          55%
Extracurricular Opportunities             55%            56%          54%
Availability of Recreational Facilities   55%            58%          50%
Access to Off-Campus Activities           35%            37%          33%
Special Academic Programs                 33%            32%          35%
Prominent Athletics                       27%            27%          26%

Page 29

Survey respondents are then asked to rate the focal institution on the 16 dimensions, compared with other institutions to which they applied and were accepted.

NOTE: Student perceptions don’t have to be accurate to be real. It is the reality of student perceptions that must be addressed.

Page 30

Important Perceptions about University of Delaware

Importance and Rating of Selected Institutional Characteristics

A. Less Important and the University Rated Higher
   - Special Academic Programs

B. Very Important and the University Rated Higher
   - Attractiveness of Campus
   - Cost of Attendance
   - Value for the Price
   - Quality of Social Life
   - Availability of Recreational Facilities
   - Extracurricular Opportunities

C. Less Important and the University Rated Lower
   - Access to Off-Campus Activities
   - Prominent Athletics

D. Very Important and the University Rated Lower
   - Academic Reputation
   - Personal Attention
   - Quality of On-Campus Housing
   - Undergraduate Teaching Commitment
   - Surroundings
   - Quality of Academic Facilities
   - Availability of Majors

"Characteristics Considered Very Important" were those rated "Very Important" by at least 50 percent of respondents. "Characteristics for Which the University was Rated High" were those for which the mean rating of the University was higher than the mean rating for all other institutions. The characteristics are listed in decreasing order of the difference between the mean rating for the University and the mean rating for all other institutions.

Page 31

Impressions of the University Among ASQ Respondents

                        All Admitted   Enrolling     Non-Enrolling
                        Students       Students      Students
COLLEGE IMAGES N (%)    2545 (100%)    1530 (100%)   1015 (100%)

Fun                     61%            68%           50%
Friendly                57%            63%           47%
Comfortable             53%            61%           41%
Large                   41%            43%           39%
Highly Respected        41%            50%           27%
Partying                40%            41%           38%
Intellectual            36%            45%           23%
Challenging             34%            43%           21%
Athletics               30%            34%           26%
Career-Oriented         30%            38%           17%
Selective               29%            33%           23%
Prestigious             28%            35%           16%
Diverse                 26%            29%           22%
Personal                19%            23%           13%
Average                 13%            7%            22%
Back-Up School          12%            5%            21%
Difficult               9%             12%           4%
Isolated                6%             2%            11%
Not Well Known          6%             4%            8%

Page 32

Financial Aid as a Factor in College Selection

• Survey allows respondents to report financial aid awards from “The college you plan to attend.”

• Work study awards at UD and at competitors are virtually identical, while the average loan award at competitors is about $1,000 higher than at UD.

• The average need-based grant at competitors is double that for UD.

• The average merit grant at competitors is about $5,000 higher than at UD.

• The average total financial aid package awarded by competitors is about double that awarded by UD.

Page 33

A Potential Competitive Disadvantage for UD

                                   Average Aid Awarded by   Average Aid Awarded by
                                   U of D                   School Attended
                                   (Enrolling Students)     (Non-Enrolling Students)

Students Reporting Work-Study Awarded
  N                                89                       79
  Average Award                    $2,045                   $2,107

Students Reporting Loan Awarded
  N                                316                      193
  Average Award                    $3,948                   $4,915

Students Reporting Need-Based Grant Awarded
  N                                143                      104
  Average Award                    $4,310                   $9,354

Students Reporting Merit Grant Awarded
  N                                301                      218
  Average Award                    $7,180                   $12,024

Students Reporting Total Aid Awarded
  N                                482                      297
  Average Award                    $8,281                   $16,011
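The bullets on the previous slide summarize this table as gaps between the two columns; the arithmetic is straightforward subtraction over the reported averages (figures copied from the table above):

```python
# Aid gap between competitor awards (reported by non-enrolling students)
# and UD awards (reported by enrolling students).

awards = {  # (UD, school attended)
    "Work-Study":       (2045, 2107),
    "Loan":             (3948, 4915),
    "Need-Based Grant": (4310, 9354),
    "Merit Grant":      (7180, 12024),
    "Total Aid":        (8281, 16011),
}

for kind, (ud, competitor) in awards.items():
    gap = competitor - ud
    print(f"{kind}: competitors higher by ${gap:,}")
```

The loan gap of $967 and merit-grant gap of $4,844 are what the earlier slide rounds to "about $1,000" and "about $5,000"; the total-aid ratio, $16,011 to $8,281, is the basis for "about double."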

Page 34

Entering Students

• ACT College Student Needs Assessment Survey: Asks respondents to identify skill areas – academic and social – where they feel they will need assistance in the coming year.

• College Student Expectations Questionnaire: Asks respondents to assess their expectations for intellectual, social, and cultural engagement with faculty and other students in the coming year.

Page 35

Continuing/Returning Students

• Student Satisfaction Research
  - ACT Survey of Student Opinions
  - Noel-Levitz Student Satisfaction Inventory

• ACT Survey of Student Opinions

Student use of, and satisfaction with, 21 programs and services typically found at a college or university (e.g., academic advising, library, computing, residence life, food services, etc.)

Student satisfaction with 43 dimensions of campus environment (e.g., out-of-classroom availability of faculty, availability of required courses, quality of advisement information, facilities, admissions and registration procedures, etc.)

Self-estimated intellectual, personal, and social growth; Overall impressions of the college experience

NOTE: Survey is available in four-year and two-year college versions.

Page 36

What About Non-Returning Student Research?

TABLE 1: ENROLLMENT, DROPOUT RATES AND GRADUATION RATES FOR FIRST-TIME FRESHMEN ON THE NEWARK CAMPUS (Total)

Entering                 1st     2nd     3rd     4th     5th     6th     within  within  within
Fall Term                Fall    Fall    Fall    Fall    Fall    Fall    3 yrs   4 yrs   5 yrs   Total

1995  N                  3154    2673    2439    2355    599     113     21      1721    2219    2344
      % enrollment       100.0%  84.7%   77.3%   75.3%   19.0%   3.6%    0.7%    54.6%   70.4%   74.3%
      % dropout          0.0%    15.3%   22.7%   24.7%   26.4%   26.1%

1996  N                  3290    2804    2585    2489    606     108     22      1825    2302    2427
      % enrollment       100.0%  85.2%   78.6%   76.3%   18.4%   3.3%    0.7%    55.5%   70.0%   73.8%
      % dropout          0.0%    14.8%   21.4%   23.7%   26.1%   26.7%

1997  N                  3180    2766    2523    2436    581     117     27      1827    2284    2401
      % enrollment       100.0%  87.0%   79.3%   77.5%   18.3%   3.7%    0.8%    57.5%   71.8%   75.5%
      % dropout          0.0%    13.0%   20.7%   22.5%   24.3%   24.5%

1998  N                  3545    3080    2830    2762    653     118     22      2079    2621    2727
      % enrollment       100.0%  86.9%   79.8%   78.5%   18.4%   3.3%    0.6%    58.6%   73.9%   76.9%
      % dropout          0.0%    13.1%   20.2%   21.5%   22.9%   22.7%

1999  N                  3513    3126    2871    2757    526     83      31      2193    2632    2684
      % enrollment       100.0%  89.0%   81.7%   79.4%   15.0%   2.4%    0.9%    62.4%   74.9%   76.4%
      % dropout          0.0%    11.0%   18.3%   20.6%   22.6%   22.7%

2000  N                  3128    2738    2524    2453    496     85      24      1884    2297    --
      % enrollment       100.0%  87.5%   80.7%   79.2%   15.9%   2.7%    0.8%    60.2%   73.4%
      % dropout          0.0%    12.5%   19.3%   20.8%   23.9%   23.8%

2001  N                  3358    2976    2746    2674    472     0       31      2138    --      --
      % enrollment       100.0%  88.6%   81.8%   80.6%   14.1%   0.0%    0.9%    63.7%
      % dropout          0.0%    11.4%   18.2%   19.4%   22.3%   0.0%

2002  N                  3399    3055    2866    2787    0       0       42      --      --      --
      % enrollment       100.0%  89.9%   84.3%   83.2%   0.0%    0.0%    1.2%
      % dropout          0.0%    10.1%   15.7%   16.8%   0.0%    0.0%

2003  N                  3433    3035    2808    0       0       0       --      --      --      --
      % enrollment       100.0%  88.4%   81.8%   0.0%    0.0%    0.0%
      % dropout          0.0%    11.6%   18.2%   0.0%    0.0%    0.0%

2004  N                  3442    3064    0       0       0       0       --      --      --      --
      % enrollment       100.0%  89.0%   0.0%    0.0%    0.0%    0.0%
      % dropout          0.0%    11.0%   0.0%    0.0%    0.0%    0.0%
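Every percentage in Table 1 is a count divided by the size of the entering cohort. A minimal sketch of the computation, using the 1995 cohort figures (second- and third-Fall enrollment, and the graduation counts):

```python
# Retention and graduation rates as percentages of the entering cohort.

def rates(entering: int, counts: list[int]) -> list[float]:
    """Express each count as a percentage of the entering class."""
    return [round(100 * c / entering, 1) for c in counts]

entering = 3154                  # first-time freshmen, Fall 1995
later_falls = [2673, 2439]       # still enrolled in 2nd and 3rd Fall
graduated = [1721, 2219, 2344]   # within 4 yrs, within 5 yrs, total

print(rates(entering, later_falls))  # [84.7, 77.3]
print(rates(entering, graduated))    # [54.6, 70.4, 74.3]
```

The dropout percentages are the complement of the enrollment rate until students begin graduating; after that, dropout is computed net of graduates (for example, the 1995 fifth-Fall dropout of 26.4% is 100 − 19.0 enrolled − 54.6 graduated within four years).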

Page 37

What About Non-Returning Student Research? Drilling Deeper…

• Commercial instruments exist, but response rates tend to be low, and the reported reasons for leaving tend to be the politically correct ones – personal or financial.

• For the last several years, we have administered the Survey of Student Opinions during the Spring term to a robust sample of students across freshman, sophomore, junior, and senior classes.

• The following Fall, the respondent pool is disaggregated into those who took the Survey and returned in the Fall, and those who took the Survey, did not return in the Fall, and did not graduate.

• Test for statistically significant differences in response patterns between the two groups.
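The last bullet can be implemented with a standard two-proportion test on each survey item. A sketch in pure Python; the counts below are illustrative, not taken from the Delaware survey:

```python
import math

def two_proportion_z(x1: int, n1: int, x2: int, n2: int) -> float:
    """z statistic for H0: the two groups answer alike (pooled SE)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Illustrative item: share rating academic advising "satisfactory" --
# 420 of 500 students who returned the next Fall, vs. 150 of 200 who
# neither returned nor graduated.
z = two_proportion_z(420, 500, 150, 200)
print(abs(z) > 1.96)  # True -> significant at the .05 level
```

Items on which non-returners differ significantly, and unfavorably, point to the programs and services implicated in attrition – the payoff of this design over exit surveys.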

Page 38

Campus Pulse Surveys

• Based upon information gleaned from the Survey of Student Opinions, we annually develop five or six short, focused, web-based Campus Pulse Surveys directed at specific issues that surfaced. Among recent Campus Pulse Surveys:

– Registration Procedures Within a PeopleSoft Environment
– Quality of Academic Advising at the University
– Personal Security on Campus
– Issues Related to Diversity within the Undergraduate Student Body

Page 39

Note: While there are a number of instruments that allow for assessment of student satisfaction among undergraduate students, there is very little in the way of instrumentation for survey research on graduate students. Graduate students are virtually forgotten when it comes to any facet of student research, and data collection instruments are generally locally developed, if they exist at all.

We have just developed a Graduate Student Satisfaction Survey and will be happy to share it with interested parties.

Page 40

Student Engagement

• College Student Expectations Questionnaire (CSXQ)

• College Student Experiences Questionnaire (CSEQ)

• National Survey of Student Engagement (NSSE)

Page 41

Benchmarks of Effective Educational Practice (NSSE)

• Level of academic challenge
  – Course prep, quantity of readings and papers, course emphasis, campus environment emphasis

• Student interactions with faculty members
  – Discuss assignments/grades, career plans & readings outside of class, prompt feedback, student-faculty research

• Supportive campus environment
  – Resources to succeed academically, cope w/ non-academic issues, social aspect, foster relationships w/ students, faculty, staff

• Active and collaborative learning
  – Ask questions & contribute in class, class presentations, group work, tutor peers, community-based projects, discuss course-related ideas outside class

• Enriching educational experiences
  – Interact w/ students of a different race or ethnicity, w/ different religious & political beliefs, different opinions & values; campus environment encourages contact among students of different economic, social, & racial or ethnic backgrounds; use of technology; participation in a wide range of activities (internships, community service, study abroad, independent study, senior capstone, co-curricular activities, learning communities)

Page 42

IV.A.1. TO WHAT EXTENT HAS YOUR EXPERIENCE AT THIS INSTITUTION CONTRIBUTED TO ACQUIRING A BROAD GENERAL EDUCATION?

[Bar chart: mean scores for UD Freshmen and UD Seniors, 2001 vs. 2005; reported values range from 3.10 to 3.28.]

Both 2005 University of Delaware freshmen and seniors estimate greater gains in general education than in 2001. The 2005 University of Delaware responses are not statistically significantly different from those at national research and peer universities.

[Bar chart: 2005 Freshmen and 2005 Seniors, UD vs. Research Universities vs. Peer Universities; reported values range from 3.14 to 3.28.]

1 = VERY LITTLE; 2 = SOME; 3 = QUITE A BIT; 4 = VERY MUCH

Page 43

V.B.2. HOW WOULD YOU BEST CHARACTERIZE YOUR RELATIONSHIPS WITH FACULTY AT THIS INSTITUTION?

[Bar chart: mean scores for UD Freshmen and UD Seniors, 2001 vs. 2005; reported values range from 4.94 to 5.43.]

2005 University of Delaware freshmen and seniors both report a decline in the quality of their relationships with faculty. That said, the University of Delaware responses from both groups are not statistically significantly different from those at national research and peer universities.

[Bar chart: 2005 Freshmen and 2005 Seniors, UD vs. Research Universities vs. Peer Universities; reported values range from 4.85 to 5.29.]

1 = UNAVAILABLE, UNHELPFUL, UNSYMPATHETIC; 7 = AVAILABLE, HELPFUL, SYMPATHETIC
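The "not statistically significantly different" judgments on these slides rest on comparing group means. A sketch of such a comparison using Welch's t statistic from summary statistics; the means resemble the reported scale values, but the standard deviations and sample sizes are hypothetical:

```python
import math

def welch_t(m1: float, s1: float, n1: int,
            m2: float, s2: float, n2: int) -> float:
    """Welch's t statistic for two independent group means."""
    return (m1 - m2) / math.sqrt(s1**2 / n1 + s2**2 / n2)

# Seniors at the focal university vs. peer-university seniors on the
# 7-point scale; SDs and Ns are assumed for illustration.
t = welch_t(5.29, 1.3, 600, 5.23, 1.3, 900)
print(abs(t) > 1.96)  # False -> difference not significant at ~.05
```

With samples this large the normal critical value 1.96 is a close stand-in for the exact t cutoff, which is why survey reports routinely phrase the result simply as "significant at the .05 level" or not.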

Page 44

EMPLOYMENT AND EDUCATIONAL STATUS OF BACCALAUREATES BY CURRICULUM GROUP
CLASS OF 2005

                                  Number   Number of   Response  Full-Time   Part-Time   Pursuing    Still
                                  in       Respon-     Rate      Employment  Employment  Further     Seeking
Curriculum Group                  Class    dents¹      (%)       (%)         (%)         Educ. (%)   Empl. (%)
Agriculture & Natural Resources     175        63       36.0       58.7        12.7        19.0         7.9
Arts & Sciences
  Humanities                        417       145       34.8       67.6        10.3        13.1         9.0
  Social Sciences                   904       295       32.6       71.2         4.4        14.2         8.5
  Life & Health Sciences            158        63       39.9       73.0         --         25.4         --
  Physical Sciences                 166        61       36.7       72.1         --         16.4         9.8
Business & Economics                571       227       39.8       89.9         3.1         2.2         4.4
Engineering                         189       103       54.5       85.4         --         11.7         1.9
Health Sciences                     390       144       36.9       72.9         2.8        21.5         2.8
Human Services, Educ. &
  Public Policy                     557       213       38.2       85.9         2.8         4.2         5.6
2005 University Total             3,527     1,314       37.3       77.2         4.0        11.9         5.9

NOTE: Percentages may not total to 100.0 because of rounding
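The response-rate column of the Class of 2005 table is simply respondents divided by number in class; a quick recomputation of a few rows:

```python
# Recomputing the response-rate column of the Class of 2005 table:
# respondents divided by number in class. Figures come from the table above.
groups = {
    "Agriculture & Natural Resources": (175, 63),
    "Business & Economics": (571, 227),
    "Engineering": (189, 103),
}
for name, (in_class, respondents) in groups.items():
    print(f"{name}: {100 * respondents / in_class:.1f}%")

print(f"University total: {100 * 1314 / 3527:.1f}%")  # 37.3, as published
```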

Page 45

Alumni Research

• Commercially prepared instruments exist. Decide if they meet your needs or if you have to develop your own.

• Decide early on why you are doing this research: Are you assessing the continuing relevance of the college experience? Are you cultivating prospects for the Development Office? Both? Be up front if fund raising is a component.

• Decide which classes you need to survey; don’t go after every living alumnus unless you are a very young institution.

Page 46

Assessing Student Learning Outcomes

• I’ll provide only a brief overview, as there are others (Linda Suskie, Trudy Banta, Jeff Seybert) who are far better versed than I am.

• That said, understand that assessment of student learning is at the core of demonstrating overall institutional effectiveness.

• Assessment of student learning is a direct response to the inadequacy of student grades for describing general student learning outcomes.

Page 47

According to Paul Dressel of Michigan State University (1983), Grades Are:

“ An inadequate report of an inaccurate judgment by a biased and variable judge of the extent to which a student has attained an undefined level of mastery of an unknown proportion of an indefinite material. ”

Page 48

There is no “one size fits all” approach to assessment of learning across the disciplines

None of these should be applied to evaluation of individual student performance for purposes of grading and completion/graduation status.

1. Standardized Tests
   • General Education or Discipline Specific
   • State, Regional, or National Licensure Exams

2. Locally Produced Tests/Items
   • “Stand Alone” or Embedded

3. Portfolios/Student Artifacts
   • Collections of Students’ Work
   • Can Be Time Consuming, Labor Intensive, and Expensive

4. Final Projects
   • Demonstrate Mastery of Discipline and/or General Education

5. Capstone Experiences/Courses
   • Entire Course, Portion of a Course, or a Related Experience (Internship, Work Placement, etc.)

Page 49

Non-Student Measures of Institutional Effectiveness:

Teaching Productivity, Instructional Costs,and Externally Funded Scholarship

Page 50

Budget Support Metrics

Page 51

In 1988, the University of Delaware…

• Was transitioning from a highly centralized, “closed” management style with respect to sharing of information.

• Had grown from a total enrollment of 7,900 students in the late 1960s to 20,000+ students in the mid-1980s.

• When financial data were examined, found itself with $9 million in recurring expenses on non-recurring revenue sources.

• Needed to eliminate 240+ full time positions from the basic budget to achieve a balanced budget.

Page 52

Ground Rules in Making Budgetary Decisions

• Decisions would be rooted in quantitative and qualitative data that would be collegially developed and broadly shared.

• Savings would initially be achieved by eliminating vacant positions, outsourcing non-essential functions, and taking advantage of technology.

• In eliminating human and fiscal resources, to the largest extent possible the academic core of the University would be insulated. However, over the long term, resources would need to be reallocated between and among academic units.

Page 53

How Best to Make Resource Reallocation Decisions Within Academic Units?

• A series of budget support metrics would be developed for measuring instructional productivity and costs within academic departments and programs.

• Input as to which variables should be used was sought from deans and department chairs. These variables were supplemented by those identified by the Office of Institutional Research and Planning, and a final set of productivity/cost variables was reached through consensus.

• The resulting product became known as “Budget Support Notebooks.”

Page 54

Looking at a Budget Support Notebook Page from a Humanities Department Within the College of Arts and Science

Page 55

BUDGET SUPPORT DATA                          College of Arts and Science
1996-97 Through 1998-99                      Department X

A. TEACHING WORKLOAD DATA

                                    FALL    FALL    FALL   SPRING  SPRING  SPRING
                                    1996    1997    1998    1997    1998    1999
FTE MAJORS
  Undergraduate                       38      31      39      38      40      39
  Graduate                             0       0       0       0       0       0
  Total                               38      31      39      38      40      39

DEGREES GRANTED
  Bachelor's                        -----   -----   -----     20      19      19
  Master's                          -----   -----   -----      0       0       0
  Doctorate                         -----   -----   -----      0       0       0
  TOTAL                             -----   -----   -----     20      19      19

STUDENT CREDIT HOURS
  Lower Division                    6,246   5,472   5,448   4,518   6,156   5,478
  Upper Division                      726     638     869   1,159     951     966
  Graduate                            183     153     129     195     276     135
  Total                             7,155   6,263   6,446   5,872   7,383   6,579

% Credit Hours Taught by
  Faculty on Appointment              77%     81%     77%     82%     91%     82%
% Credit Hours Taught by
  Supplemental Faculty                23%     19%     23%     18%      9%     18%
% Credit Hours Consumed
  by Non-Majors                       98%     97%     98%     96%     98%     97%

Page 56

BUDGET SUPPORT DATA                          College of Arts and Science
1996-97 Through 1998-99                      Department X

A. TEACHING WORKLOAD DATA (continued)

                                    FALL    FALL    FALL   SPRING  SPRING  SPRING
                                    1996    1997    1998    1997    1998    1999
FTE STUDENTS TAUGHT
  Lower Division                     416     365     363     301     410     365
  Upper Division                      48      43      58      77      63      64
  Graduate                            20      17      14      22      31      15
  Total                              485     424     435     400     504     445

FTE FACULTY
  Department Chair                   1.0     1.0     1.0     1.0     1.0     1.0
  Faculty on Appointment            15.0    16.0    15.0    15.0    15.0    15.0
  Supplemental Faculty               1.5     1.0     1.3     1.0     0.8     1.5
  Total                             17.5    18.0    17.3    17.0    16.8    17.5

WORKLOAD RATIOS
  Student Credit Hrs./FTE Faculty  408.9   347.9   373.7   345.4   440.8   375.9
  FTE Students Taught/FTE Faculty   27.7    23.6    25.2    23.5    30.1    25.4
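The workload ratios are derived directly from the credit-hour and faculty counts on these pages; for example, for Fall 1996:

```python
# Deriving the Fall 1996 workload ratios from the counts in the tables above.
sch_total = 7155       # total student credit hours, Fall 1996
fte_students = 485     # total FTE students taught, Fall 1996
fte_faculty = 17.5     # chair (1.0) + appointed (15.0) + supplemental (1.5)

print(round(sch_total / fte_faculty, 1))     # 408.9, matching the table
print(round(fte_students / fte_faculty, 1))  # 27.7, matching the table
```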

Page 57

BUDGET SUPPORT DATA                          College of Arts and Science
1996-97 Through 1998-99                      Department X

B. FISCAL DATA

                                              FY 1996     FY 1997     FY 1998
                                                 ($)         ($)         ($)
RESEARCH AND SERVICE
  Research Expenditures                            0       5,151         499
  Public Service Expenditures                      0           0           0
  Total Sponsored Research/Service                 0       5,151         499
  Sponsored Funds/FTE Fac. on Appointment          0         312          31

COST OF INSTRUCTION
  Direct Instructional Expenditures        1,068,946   1,141,927   1,144,585
  Direct Expense/Student Credit Hour              81          84          88
  Direct Expense/FTE Student Taught            1,198       1,229       1,301

REVENUE MEASURES
  Earned Income from Instruction           3,960,208   4,366,720   4,311,275
  Earned Income/Direct Instructional Exp.       3.73        3.82        3.77
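The fiscal ratios combine this page with the teaching workload pages. A quick recomputation of the FY 1998 figures, under the assumption (an inference that reproduces the published values, not something stated on the slide) that the fiscal year pairs Fall 1998 with Spring 1999:

```python
# Reproducing the FY 1998 cost and revenue ratios. Assumption: the fiscal
# year pairs Fall 1998 with Spring 1999 (this reproduces the published values).
direct_expense = 1_144_585   # direct instructional expenditures, FY 1998
earned_income = 4_311_275    # earned income from instruction, FY 1998
sch = 6_446 + 6_579          # student credit hours, Fall 1998 + Spring 1999
fte_taught = 435 + 445       # FTE students taught, Fall 1998 + Spring 1999

print(round(direct_expense / sch))               # 88   (direct expense per SCH)
print(round(direct_expense / fte_taught))        # 1301 (per FTE student taught)
print(round(earned_income / direct_expense, 2))  # 3.77 (income/expense ratio)
```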

Page 58

In the Initial Stages, Summary Data Such as Budget Support Data May Be Challenged, and Must Be Supported by Solid Background Analysis

Page 59

Who Is Teaching What to Whom?

Page 60

Table 1: Departmental Workload Verification Data

                          #                    Course         %              Student  Teaching  S-Contract?
  Course(s)    Sections  Credits               Type         Load  Enrolled  Cr. Hrs.  Credits   Yes/No

Thomas Jones, Chairperson, Tenured, Sociology
  SOC 454         1      3 Hrs.      Lecture       100       9        27        3        No
  SOC 964         0      3-12 Hrs.   Supv. Study   100       0         3        1        No
  Total                                                      9        30        4

Mary Smith, Professor, Tenured, Sociology
  SOC 201         1      3 Hrs.      Lecture       100     246       738        3        No
  SOC 203         1      3 Hrs.      Lecture       100     100       300        3        No
  Total                                                    346     1,038        6

William Davis, Professor, Tenured, Sociology
  CSC 311         1      3 Hrs.      Lecture       100       8        24                          Cross-listed with SOC 311
  SOC 311         1      3 Hrs.      Lecture       100      38       114        3        No       Cross-listed with CSC 311
  SOC 327         1      3 Hrs.      Lecture       100      13        39        3        No
  SOC 366         0      1-3 Hrs.    Supv. Study   100       1         1        1        No
  SOC 866         0      1-6 Hrs.    Supv. Study   100       1         3        1        No

Roger Brown, Assistant Professor, Tenure Track, Sociology
  SOC 467         1      3 Hrs.      Lecture       100       7        21        3        No       400-level meets with 600-level
  SOC 667         1      3 Hrs.      Lecture       100       1         3                 No       600-level meets with 400-level
  SOC 213         1      3 Hrs.      Lecture       100      77       231        3        No       Cross-listed with WOMS 213
  WOMS 213        1      3 Hrs.      Lecture       100      21        63                 No       Cross-listed with SOC 213
  Total                                                    106       318        6
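A quick consistency check on Table 1: for a lecture section, the student-credit-hour column equals enrollment times course credits.

```python
# Consistency check on Table 1: for a lecture section, student credit hours
# equal enrollment times course credits. Rows are taken from the table above.
rows = [
    ("SOC 454", 9, 3, 27),
    ("SOC 201", 246, 3, 738),
    ("SOC 203", 100, 3, 300),
    ("SOC 213", 77, 3, 231),
]
for course, enrolled, credits, reported_sch in rows:
    assert enrolled * credits == reported_sch, course
print("all rows consistent")
```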

Page 61

At What Cost?

Page 62

Departmental Expenditures, by Object and by Function: Fiscal Year 1999
Undergraduate Department in Humanities

                                                  Departmental  Org. Activity             Public    Academic
                                     Instruction  Research      Educ. Depts.  Research    Service   Support
Expenditures                           (01-08)      (09)          (10)        (21-39)     (41-43)   (51-56)

Salaries
  Professionals                         27,570          0            0             0          0         0
  Faculty
    Full-Time (Incl. Dept. Chair)      993,612          0            0             0          0         0
    Part-Time (Incl. Overload)          23,985          0            0             0          0         0
  Graduate Students                          0          0            0             0          0         0
  Post Doctoral Fellows                      0          0            0             0          0         0
  Tuition/Scholarship                        0          0            0             0          0         0
  Salaried/Hourly Staff                 59,601          0            0             0          0         0
  Fringe Benefits                            0          0            0             0          0         0
  Subtotal                           1,104,768          0            0             0          0         0

Support
  Miscellaneous Wages                    1,884          0            0           499          0         0
  Travel                                12,275      6,510            0             0          0         0
  Supplies and Expenses                 10,496      2,925            0             0          0         0
  Occupancy and Maintenance              1,270          0            0             0          0         0
  Equipment                                  0          0            0             0          0         0
  Other Expenses                        13,932        230            0             0          0         0
  Credits and Transfers                      0          0            0             0          0         0
  Subtotal                              39,817      9,665            0           499          0         0

TOTAL EXPENDITURES                   1,144,585      9,665            0           499          0         0

Page 63

Ground Rules For Using Budget Support Data

• Decisions are never made on the basis of a single year of data. Trend information is the goal.

• Data are not used to reward or penalize, but rather as tools of inquiry as to why productivity/cost patterns are as they appear.

• It is understood that there are qualitative dimensions to productivity and cost that are not captured in this analysis.

Page 64

Budget Support Data

• Gave academic units a sense of buy-in and ownership of the data being used for academic management and planning.

• Helped transform academic departments from 54 loosely confederated fiefdoms into a coherent university, with each unit contributing in meaningful ways to realization of the University’s mission areas of teaching, research, and service.

Page 65

Extending Budget Support Analysis….

• As useful as comparisons between and among like departments within the University are, they would be substantially enhanced if, for example, the History Department at the University of Delaware could be compared with History Departments at actual peer universities and at universities to which the University of Delaware aspires.

• Out of this need for external comparative information, the Delaware Study of Instructional Costs and Productivity was born.

Page 66

Delaware Study of Instructional Costs and Productivity

• Over the past decade, the Delaware Study of Instructional Costs and Productivity has emerged as the tool of choice for benchmarking data on faculty teaching loads, direct instructional costs, and externally funded faculty scholarship, all at the academic discipline level of analysis.

• The emphasis on disciplinary analysis is non-trivial. Over 80 percent of the variance in instructional expenditures across four-year postsecondary institutions is accounted for by the disciplines that comprise a college’s or university’s curriculum.

Page 67

Delaware Study:Teaching Load/Cost Data Collection Form

Page 68

Using Delaware Study Data

Page 69

• We provide the Provost with data from multiple years of the Delaware Study, looking at the University indicators as a percentage of the national benchmark for research universities.

• The Provost receives a single sheet for each academic department, with graphs reflecting the following indicators: Undergraduate Fall SCRH/FTE Faculty; Total Fall SCRH/FTE Faculty; Total AY SCRH/FTE Faculty (All); Fall Class Sections/FTE Faculty; Direct Cost/SCRH; External Funds/FTE Faculty.
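The transformation behind those graphs is simple: each departmental indicator is expressed as a percentage of the national research-university benchmark. A minimal sketch, with invented values rather than actual UD data:

```python
# Sketch of the benchmark-percentage transform behind the provost's graphs:
# each indicator is shown as a percentage of the national benchmark.
# The numbers here are illustrative, not actual UD data.
def pct_of_benchmark(dept_value, national_value):
    return 100 * dept_value / national_value

print(round(pct_of_benchmark(210.0, 240.0), 1))  # 87.5: below-benchmark teaching load
print(round(pct_of_benchmark(110.0, 100.0), 1))  # 110.0: above-benchmark cost per SCRH
```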

Page 70

Science Department

[Six trend graphs, UD vs. national research-university benchmark (Nat'l), 1994-1999 (FY95-FY00 for the last panel):
• Undergraduate Student Credit Hours Taught per FTE T/TT Faculty
• Total Student Credit Hours Taught per FTE T/TT Faculty
• Total Class Sections Taught per FTE T/TT Faculty
• Total Student Credit Hours Taught per FTE Faculty (All Categories)
• Direct Instructional Expenditures per Student Credit Hour
• Separately Budgeted Research and Service Expenditures per FTE T/TT Faculty]

Page 71

The Delaware Study – Next Steps

• As useful as Delaware Study teaching load/cost benchmarks are, they do not address the non-classroom dimensions of faculty activity in an institution and its academic programs.

• It is possible that quantitative productivity and cost indicators for a given program/discipline may differ significantly from institutional, peer, and national benchmarks for wholly justifiable reasons of quality that are reflected in what faculty do outside of the classroom.

• However, this cannot be determined unless measurable, proxy indicators of quality are collected.

Page 72

The Delaware Study – Faculty Activity Study

• In Fall of 2001, the University of Delaware was awarded a second three-year FIPSE grant.

• This grant underwrote the cost of developing data collection instruments and protocols for assessing out-of-classroom facets of faculty activity.

• Once again, the grant supported an Advisory Committee charged with responsibility for refining and enhancing data collection instrumentation, data definitions, and study methodology.

• The Faculty Activity Study collects data on 43 discrete variables related to instruction, scholarship, service to the institution, service to the profession, and public service.

Page 73

Delaware Study:Faculty Activity Study Data Collection Form

Page 74

Transparency in Assessment

www.udel.edu/ir

Page 75

Other Issues Related to Institutional Effectiveness

• In order to attract and retain the most capable faculty and staff, you have to compensate them.

AAUP's Academe (March/April issue) annually publishes average faculty salaries, by rank, for most 2-year and 4-year institutions in the country.

The CUPA-HR and Oklahoma salary studies annually publish average faculty salaries by rank, discipline, and Carnegie institution type, as well as the average salary for newly hired Assistant Professors.

Salary equity and salary compression studies

Page 76

Other Issues Related to Institutional Effectiveness

• In order to attract and retain the most capable faculty and staff, you have to provide a hospitable workplace.

Employee Satisfaction Studies

Campus Climate Studies

Page 77

Economic Impact Studies

• While not a direct measure of institutional effectiveness, economic impact studies can be a powerful tool in shaping college/government relations for public institutions in particular. The methodology is straightforward, and we will share it if you contact our office.

“In 2007, the University, its employees, and students spent approximately $410 million in Delaware. Using U.S. Department of Commerce Bureau of Economic Analysis multipliers, University of Delaware expenditures have an overall impact on the local economy of approximately $751 million. In addition to the intellectual capital that the University provides to the State of Delaware, there is significant financial capital as well. University expenditures total more than three times the State’s appropriation to the institution, and the overall economic impact of those expenditures is over six times the State appropriation.”
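The quoted figures imply an output multiplier of roughly 1.83. A quick back-calculation; note that the multiplier and the roughly $125M appropriation are inferences from the quote, not numbers stated on the slide:

```python
# Back-calculating the quoted impact figures. The implied multiplier and the
# state appropriation are inferred from the quote, not stated on the slide.
direct_spending = 410e6   # University, employee, and student spending, 2007
total_impact = 751e6      # reported overall economic impact

print(round(total_impact / direct_spending, 2))   # 1.83 implied multiplier

appropriation = 125e6     # hypothetical, consistent with "more than three times"
print(round(direct_spending / appropriation, 1))  # 3.3 times the appropriation
print(round(total_impact / appropriation, 1))     # 6.0 times the appropriation
```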

Page 78

Institutional “Dashboards” that report on key success indicators can be a succinct means of reporting on basic measures of institutional effectiveness.

www.udel.edu/ir/UDashboard

Page 79

Claims of institutional effectiveness are stronger when the focal institution’s standing on important measures is compared with that of peer institutions.

Page 80

Choosing Peer Groupings

• Scientific – Cluster Analysis or Other Multivariate Tool

• Pragmatic - e.g. Compensation Peers

• “Whatever!” - e.g. Admissions Peers
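The "scientific" approach above can be sketched with a bare-bones k-means clustering. The feature values here are invented for illustration; a real study would use normalized institutional indicators such as enrollment, endowment, and research expenditures:

```python
import random

# A bare-bones k-means sketch of the "scientific" approach to peer selection.
# Each institution is a point of normalized indicators; the data are invented.
def kmeans(points, k, iters=50, seed=1):
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k),
                          key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            clusters[nearest].append(p)
        # Recompute each center as its cluster's mean; keep old center if empty.
        centers = [tuple(sum(col) / len(cl) for col in zip(*cl)) if cl else centers[i]
                   for i, cl in enumerate(clusters)]
    return centers, clusters

# Normalized (enrollment, research expenditures) for eight invented schools.
data = [(0.10, 0.10), (0.15, 0.12), (0.20, 0.10), (0.80, 0.90),
        (0.85, 0.80), (0.90, 0.95), (0.50, 0.50), (0.55, 0.45)]
centers, clusters = kmeans(data, k=2)
# Institutions landing in the focal institution's cluster are candidate peers.
```

Multivariate tools like this only propose candidates; in practice the resulting list is still vetted against the pragmatic considerations on this slide.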

Page 81

We are extending the Dashboard concept to include key variables related to the University’s new Strategic Plan that enable us to compare our position vis-à-vis actual peers and aspirational peers.

Page 82

Student Faculty Ratio: Actual Peer Group

[Bar chart; values paired with labels in plotted order: Syracuse 12, U. Miami 13, Southern Methodist 14, St. Louis U. 15, Pittsburgh 15, Clemson 16, Marquette 16, Connecticut 17, Penn State 17, Delaware 17, Baylor 18, Kansas State 19, U. Mass-Amherst 19, Rutgers NB 20, USC-Columbia 20, Auburn 20, Virginia Tech 21, Miami U. Oxford 22, SUNY at Binghamton 26]

Student Faculty Ratio: Aspirational Peer Group

[Bar chart; values paired with labels in plotted order: Tufts 7, Brown 9, Case Western 9, Lehigh 9, Rochester 9, Tulane 9, Carnegie Mellon 10, Georgetown 11, New York U. 11, William and Mary 11, Boston C. 13, Notre Dame 13, Wisconsin-Madison 13, UNC Chapel Hill 14, Michigan-Ann Arbor 15, Rensselaer 15, Virginia 15, Delaware 17, Texas at Austin 18, U. California-Irvine 19]

Page 83

Freshman to Sophomore Retention Rate: Actual Peer Group

[Bar chart; institution labels not recoverable. Rates: 79%, 83%, 83%, 84%, 86%, 86%, 87%, 89%, 89%, 89%, 89%, 89%, 90%, 90%, 90%, 90%, 91%, 92%, 93%, 94%]

Freshman to Sophomore Retention Rate: Aspirational Peer Group

[Bar chart; institution labels not recoverable. Rates: 90%, 91%, 92%, 93%, 93%, 93%, 93%, 93%, 94%, 94%, 95%, 96%, 96%, 96%, 97%, 97%, 97%, 97%, 98%]

Page 84

Six Year Graduation Rate: Actual Peer Group

[Bar chart; institution labels not recoverable. Rates: 59%, 63%, 63%, 66%, 73%, 73%, 73%, 74%, 74%, 74%, 75%, 75%, 76%, 78%, 78%, 79%, 79%, 81%, 82%, 85%]

Six Year Graduation Rate: Aspirational Peer Group

[Bar chart; institution labels not recoverable. Rates: 71%, 76%, 77%, 78%, 78%, 79%, 80%, 82%, 84%, 84%, 86%, 86%, 87%, 91%, 91%, 92%, 92%, 94%, 94%, 96%]

Page 85

Dashboard Variables Being Collected

STUDENT CHARACTERISTICS
• Percent of Accepted Applicants Matriculated
• Total Minority Percentage in Undergraduate Student Body
• Freshman to Sophomore Retention Rate
• Four Year Graduation Rate
• Six Year Graduation Rate

RESEARCH ACTIVITY
• Total R&D Expenditures per Full Time Faculty
• Total Service Expenditures per Full Time Faculty

FINANCE
• University Endowment as of June 30
• Alumni Annual Giving Rate

INSTRUCTION
• Full Time Students per Full Time Faculty
• Total Degrees Granted
• Total Doctoral Degrees Granted
• Percent of Faculty With Tenure
• Percent of Faculty That Are Full Time
• Percent of Women Among Full Time Faculty
• Percent of Minorities Among Full Time Faculty
• Percent of Minorities Among Full Time Staff
• Total Faculty Who Are National Academy Members

Page 86

A Possible Format for the Strategic Dashboard

Page 87

Another Potential Format for Strategic Dashboard

Page 88

That’s All, Folks!

• What have I missed that you would like covered?

• Other questions?

[email protected]