
Institutional Effectiveness Handbook

3rd edition


INSTITUTIONAL EFFECTIVENESS HANDBOOK, Edition 3.0

Published by Southeastern University, Office of Institutional Effectiveness, 1000 Longfellow Blvd., Lakeland, FL 33801.

Copyright © 2007, 2010, 2013, 2018 by Southeastern University, Inc. All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, photocopying, recording, or otherwise, without the prior permission of the copyright owner.

Authors of the 3rd edition: Justin E. Rose, Associate Director of Institutional Effectiveness; Cody J. Lloyd, Executive Director of Information Management

Author of previous editions: Andrew Miller, Associate Provost, School of Unrestricted Education

Primary Sponsor: Dr. Andrew H. Permenter, Vice President of Institutional Research and Effectiveness


Contents

Introduction 5

History of Institutional Effectiveness at Southeastern University 6

Assessment Today at SEU 8

Foundations of Quality Enhancement 10

Assessment Committee 13

How to Conduct Effective Outcomes Assessment 14

Academic Program Assessment 26

Academic Program Review 27

Administrative Unit Assessment 28

Faculty and Staff Assessment 29

Assessment of Student Success & Student Development 30

Competency-Based Education 40

Assessment Calendar 41

Conclusion 43

References 44

Appendices 46

Appendix A: Glossary 47

Appendix B: The Assessment Process (Stanford University) 50

Appendix C: University Mission, Vision, & Strategic Plan 52

Appendix D: Completed MPA 58

Appendix E: Completed PLO 59

Appendix F: Checklist for Feedback on Administrative Assessment Plans and Reports 63

Appendix G: Checklist for Feedback on Academic Assessment Plans and Reports 64

Appendix H: Curriculum Map 65

Appendix I: Foundational Document Example 66

Appendix J: Campus Labs Tutorials 69

Appendix K: Academic Program Assessment (American University: Cairo) 78

Appendix L: Creating Learning Outcomes (Stanford University) 85

Appendix M: Planning for Dissemination and Use (Stanford University) 90

Appendix N: Academic Program Review Schedule 91

Appendix O: Administrative Unit Assessment (American University: Cairo) 92

Appendix P: Useful Assessment Websites 97

Appendix Q: IPEDS Peer Data 98

Appendix R: Foundational Core Outcomes 100

Appendix S: Survey Policy & Procedure 101

Notes 103


Introduction

Institutional Effectiveness 3.0 – This is the third edition (and 4th revision) of the Institutional Effectiveness Handbook. The Institutional Effectiveness plan and processes have made significant strides since January 2007, when the first Institutional Effectiveness Handbook was produced. The following sections explain in detail where we have been and where we hope to go during the next phase of the plan for Institutional Effectiveness at Southeastern University.

We can measure the baseline achievement levels of our students’ knowledge, skills, and abilities through the use of direct, internally generated assessments such as tests, essays, research papers, portfolios, capstone projects, and internship evaluations.

We can measure our students’ levels of knowledge in general education and discipline specific areas through the use of nationally benchmarked instruments such as the ETS Proficiency Profile test, discipline specific major field tests (ETS), and other tests such as the GRE, LSAT, and MCAT.

We can measure our students’ opinions and attitudes regarding the effectiveness of classroom pedagogy, student services, and academic support indirectly through the use of internal surveys such as the Graduating Student Survey, the Freshman Survey, course evaluations, departmental surveys, and focus groups.

We can measure student satisfaction with all areas of campus life (classroom, student services, academic support, housing, food service, administrative functions) through the use of nationally benchmarked instruments such as the National Survey of Student Engagement (NSSE) and the Noel-Levitz Student Satisfaction Inventory (SSI). With regard to NSSE, research has documented a significant correlation between levels of student engagement and student learning.

This Handbook will guide the university into the next phase of Institutional Effectiveness at Southeastern University. Through the increased use of the Campus Labs assessment and planning software, we will gather and tabulate hundreds of additional data points that relate specifically to the outcomes we identify in academics, student support, student life, and administrative functions. This additional data, grouped by outcome category, will enable us to obtain clear, relevant measures of where we really are relative to identified outcomes (baseline data), how well we are progressing (longitudinal data, trends), and where we are relative to other colleges and universities (benchmark data).

Through the increased collection, analysis, and continual review of this relevant data, many important objectives can be achieved, not the least of which will be measurable improvements in academic quality, academic support, administrative processes, student life, and the professional preparation of our students.

A Glossary of Terms used throughout this Handbook can be found in Appendix A. (AP)


History of Institutional Effectiveness at Southeastern University

A new era, a new plan – Beginning in 2007, a new assessment system was launched, the Master Plan of Advance (MPA), in which every department, whether academic, academic support, administrative, or student support, completes goal grids to include outcomes, assessments/plan, results/analysis, and recommendations for improvements. Departmental goals are organically connected to the university’s Mission, Mission in Action, and Institutional Goals (Appendix C). Under the plan, each department would enter goals and assessments into the Campus Labs software. Every departmental goal must be linked to an institutional goal. Since the MPA goals and assessments are submitted by all campus units, they deal with campus processes, programs, and organizational issues. Ultimately, all MPA goals should also, at least indirectly, connect to student learning.

Also beginning in 2007, the assessment cycle for the MPA was changed from one to three years. The first three-year cycle, 2007 to 2010, ended at the end of the fiscal year, June 30, 2010. The next three-year cycle began in 2010 and ended June 30, 2013. In each cycle, the outcomes and assessments of the MPA grids were evaluated by the Assessment Committee. A follow-up review was conducted on all results and recommendations for a particular cycle. See Appendix D for Sample MPA.

More specifically and directly related to student learning are the Program Learning Outcome (PLO) grids, which were also developed in 2007 for the academic departments' use and are confined to measuring discipline-specific and general education student learning outcomes. The PLO process is designed to continually monitor levels of student learning, analyze assessment results, and propose and implement program, curricular, and pedagogical enhancements in order to improve those results. The PLO grid is created and submitted annually by all academic departments and reviewed and evaluated by the Office of Institutional Effectiveness. See Appendix E for Sample PLO.

In early 2007, the Office of Institutional Effectiveness produced the Institutional Effectiveness Handbook and distributed it to faculty and campus administrators. The Handbook outlined the process, mechanics, and timeline for a campus-wide system of goal creation, assessment, gathering results, and quality enhancement for Southeastern University. It also included three initiatives proposed by the then Director of Institutional Effectiveness, which were based on internal campus data, external benchmarks, and best practices. These three initiatives were: 1) Campus-wide system of Institutional Effectiveness, 2) General Education Initiative, and 3) Retention Initiative. The Handbook also included appendices with a glossary, university mission and institutional goals, templates for creating unit mission statements, goal grids, and assessments, various survey and assessment results, best practices and peer review information, and web links to other schools’ assessment information and plans.

In August 2010, a more comprehensive, superior handbook was developed by the Dean of Institutional Research & Retention and the Assessment Coordinator. The revised handbook serves as a comprehensive guide to creating a strategic plan for improved student learning, assessment, and quality enhancement. Significantly expanding on the previous edition, the new handbook provides a solid philosophical foundation for institutional effectiveness, new techniques for assessment collection/reporting, and five-year initiatives designed to enhance the university.

During the 2010-13 cycle, the university continued to advance the role of assessment and planning in the institution. PLOs were collected, measured, and analyzed for every academic major. Additionally, the MPA process was continued for all administrative departments. In all, significant strides were made in every area of Institutional Effectiveness. As we launch into our next phase (2013-17), we seek to establish a stronger foundation of data-informed decision making and change. (AP/AM)

The 2013-17 strategic planning cycle witnessed the achievement of the majority of planned initiatives, the successful completion of the SACSCOC Fifth-Year Interim Report, and the expansion of the role and responsibilities of the Office of Institutional Effectiveness. As the Strategic Plan was updated and a new plan was formulated for 2017-2022, the MPA was extended to a 5-year cycle to align with the new 5-year University Strategic Plan.

Renewed focus on student success – Since the conclusion of the 2013-17 strategic planning cycle, the university has developed a strong interest in cultivating a paradigm of student success, informed by the concept of thriving in college. This paradigm incorporates a holistic approach to conceptualizing, designing, implementing, and evaluating outcomes and experiences that allow students to discover and realize their divine design. This interest led to a collaborative effort among students, faculty, staff, administrators, alumni, and other stakeholders to design a Foundational Core curriculum as an evolution of the existing General Education curriculum; it describes institutional outcomes for graduates of Southeastern University, based on the AAC&U LEAP outcomes and the university's commitment to its particular mission. The new focus on student success also informed the 2017-2022 Strategic Plan, which aims to elevate the university's commitment to both institutional effectiveness and the distinct needs of the 21st century learner.

In response to these changes and the exponential growth of the institution in non-traditional educational deliveries, the Office of Institutional Effectiveness wrote this 3rd edition of the Institutional Effectiveness Handbook in order to both broaden the focus of the university faculty to include concepts like student success and sharpen their competencies around delivering high-quality educational programming, informed by foundational theories of outcomes assessment and pedagogy. The new edition will serve as both a comprehensive guide to understanding and implementing useful assessment and planning for continuous improvement (in the spirit of the 2nd edition) and as a resource for all departments and stakeholders at Southeastern University (both academic and non-academic administrative) who are endeavoring to better understand and serve their constituents, as well as improve their contribution to the university's mission. (JR)


Assessment Today at SEU

At Southeastern, we currently utilize a myriad of direct and indirect assessments to measure student learning, student satisfaction, student achievement, and campus services. Here are some of our major assessments.

Direct Assessments:

❖ Course-based assessments such as tests, in-class essays, oral presentations, recitals, and research papers.

❖ Major Field Tests (ETS) for most majors prior to graduation (Senior Exit Exams).

❖ Capstone projects, major research papers, and/or portfolios for most students prior to graduation.

❖ National Student Clearinghouse tracks graduates’ enrollment in graduate school and/or continuing education programs.

❖ Rubrics, pre/post exam scores, embedded question sets, and other assessment mechanisms.

❖ Florida DOE tracks graduates working in Florida.

❖ Faculty Professional Activities Reports.

❖ Administrative and fiscal benchmarks.

General Education Assessments:

❖ ETS© Proficiency Profile (formerly MAPP) pre (entering freshmen) and post (prior to graduation) test for all students. Measures English, Math, Writing, and Critical Thinking proficiency.

❖ Math (Pearson’s) pre and post tests for entering students (Math for the Liberal Arts).

❖ Writing Essay pre and posttest with rubric (Composition I).

❖ Critical Thinking Essay with rubric (Composition II).

❖ Oral Communication pre and posttest with rubric (Fundamentals of Speech).

❖ Bible knowledge pre and posttest for entering and graduating students.

Indirect Assessments:

❖ Course Evaluations every semester.

❖ Noel-Levitz Student Satisfaction Inventory (SSI) every two years.

❖ National Survey of Student Engagement (NSSE) every two years.

❖ Graduating Student Survey (twice per year).

❖ Alumni Survey (once per year).

❖ New Student Survey (once per year).

❖ Numerous departmental surveys.


❖ Numerous focus groups.

❖ Thriving Quotient

❖ HERI Faculty Survey

Assessment results are gathered and analyzed in an effort to create a culture of evidence, evidence being the primary catalyst for positive change and quality enhancement. At least annually, results of student learning assessments are gathered and evaluated by each department relative to their majors. Results are compared to prior years for longitudinal studies. When possible, results are compared to national benchmarks. The greatest challenge in utilizing assessment results for program improvement is in gathering aggregate results by department for majors (or for General Education), and taking time to review, discuss, and analyze results for programmatic, course, or curricular changes designed to improve quality and student learning. Assessment results may also indicate the need for specific kinds of professional development for faculty, or needed changes in administrative processes or policies.

Future challenges, objectives – Continuing into the next phase of our Institutional Effectiveness plan, several broader issues with assessment are being addressed. They are:

● To calibrate every institutional unit’s assessment plan in alignment with the strategic plan and institutional outcomes of Southeastern University.

● To expand and integrate student learning outcomes assessment beyond academic affairs units into student affairs and support services.

● To build a general knowledge of student success as a shared responsibility and model for operations and assessment across all institutional units.

● To correlate student learning assessment with actual changes in pedagogy and curriculum so as to improve the student learning experience.

● To educate the larger campus community and executive leadership as to the value of assessment with regard to accreditation standards, quality enhancement, budgeting, strategic planning, and development.

● To leverage the Learning Management System, D2L Brightspace, as a tool for automating assessment data capture and organization, thereby mitigating the amount of labor required to conduct effective outcomes assessment.

● To integrate the Institutional Effectiveness processes into the day-to-day processes of campus units and eliminate the impression that IE is a tacked-on report created for accreditors at the end of the year, without organic connection or relevance to the actual work of the unit.

(AP/AM/JR)


Foundations of Quality Enhancement

In order to effectively enhance the quality of our academic programs, academic support units, administrative functions, and student development initiatives, Institutional Effectiveness at Southeastern University strives to be mission-driven, inter-dependent, data-informed, outcomes-oriented, and peer-reviewed.

Mission-Driven – The mission of the university is articulated by the governing board and implemented by campus administration, faculty, and staff. It is critical to the Institutional Effectiveness plan that departmental missions and outcomes focus on supporting and achieving the larger mission and goals of the institution. In this way, the entire campus community is working together toward the same ultimate purposes: fulfilling the vision and mission of the institution. Every departmental outcome should clearly relate to the larger mission of the university.

Data-Informed – Data is the evidence used to certify assessment results and levels of quality enhancement. Levels and degrees of student learning are verified by the results (data) of academic assessments such as tests, essays, research papers, internship evaluations, portfolios, major field tests, and graduates’ job-placement rates. Administrative and academic support data verifies levels of quality with regard to student services and other administrative functions. Baseline data attests to preliminary levels of student knowledge and the overall quality of campus programs, services, and processes. Longitudinal data establishes levels of change and improvement. Benchmark data matches comparable internal measures against consortia, regional, and national norms.


Inter-Dependent – Pursuit of the larger goals and mission of the institution necessarily creates overlap between academic departments, administrative units, and campus divisions. In nearly every case, campus units are required to work together in order to achieve the larger mission of the university. Achievement of academic outcomes requires communication and cooperation between the academic units and those in academic support such as advising, testing, and tutoring. Retention initiatives require cooperation between academic departments, academic support units, and Student Financial Services. Improvements and quality enhancements that result from cooperation between units and divisions in turn benefit all campus stakeholders: students, faculty, parents, alumni, development, and others.

Outcomes-Oriented – Direct measures of academic achievement ultimately produce evidence that is virtually indisputable with regard to levels of academic quality and student learning. Focus on outcomes allows for a realistic assessment of baseline, longitudinal, and benchmark results. Focus on outcomes also allows for an ongoing process of monitoring results, noting increasing or decreasing levels of achievement or quality, and continually responding to assessment results with increasingly refined prescriptions for improvement.

Direct, quantitative assessment results are invaluable in measuring levels of student learning, academic quality, and program improvement. Qualitative assessments are valuable for those areas that are less easily measured quantitatively, such as fine arts, and as additional, indirect measures of students’ attitudes and opinions. They are gathered through instruments such as course evaluations, surveys, and focus groups.


Peer-Reviewed – All university programs, policies, and procedures should be periodically reviewed by colleagues from peer institutions. In this way, levels of quality in academic programs, academic support, administrative services, and student life are maintained and enhanced. Similar to benchmarked instruments, peer review allows for independent, objective assessment, analysis, and recommendations for academic programs and other university functions. Accreditation compliance and more specifically, accreditation reaffirmation are forms of peer review because colleagues from other colleges and universities thoroughly review all campus programs, processes, and procedures. Regular, internal program review is also encouraged; it provides a less formal, less costly means of ongoing program review and evaluation. Discipline-specific accreditations are encouraged because they provide even more thorough, ongoing review and scrutiny for academic quality and continual improvement based on the latest research and best practices.

(AM/AP)


Assessment Committee

History – Over the past several years, the Assessment Committee has met on an ad-hoc basis, its members consisting of the deans, chairs, and department heads (essentially the Senior Leadership Council). The committee has met at key intervals in order to evaluate the assessment process and the specific assessment plans of all campus units. When the three-year assessment cycle was instituted for the Master Plan of Advance outcomes and assessments, the Assessment Committee met and its members evaluated one another’s goals and assessments, entering written comments regarding mission statements, outcomes, and assessments, answering questions such as:

❏ Is the Mission Statement clear and does it reflect the mission of the institution?

❏ How do the outcomes enhance student learning?

❏ Are the outcomes measurable by direct means?

❏ Do the outcomes reflect “big picture” goals that can be assessed by a variety of means?

❏ When will the assessments be administered, and by whom?

❏ Will longitudinal measures be gathered and tabulated?

❏ Are comparative, benchmark measures available?

❏ Do the outcomes and assessments reveal an understanding of assessment as it relates to quality enhancement?

❏ Do the outcomes and assessments reveal an understanding of assessment as it relates to the larger vision, mission, and goals of the university?

Standing Assessment Committee – Accreditation requirements and best practices indicate the need to form a permanent, standing Assessment Committee that will regularly review departmental assessment plans as well as cast a vision for campus-wide assessment that reflects the broader mission of the university. The Assessment Committee will ensure that all campus units are aware of their roles in achieving the university’s mission. Beginning with the mission and institutional goals, the Assessment Committee will also attempt to incorporate the use of software that will facilitate gathering assessment results from a variety of sources in order to create coherent, verifiable results of outcome achievements. The Assessment Committee will consist of the Institutional Effectiveness Office, one representative from each academic department, and five at-large members from administrative, academic support, and student development areas.

Generally, the Assessment Committee will meet early in the Fall Semester each year in order to review the academic Program Learning Outcomes (PLO) grids that have been submitted by each department for that year. Later in the Fall Semester, the committee will meet again to review and approve the PLO goal grids, including any changes that were made following the first review. Following the Spring Semester, the committee will meet to review and approve the results and recommendations for each academic department, noting any improvements or suggestions for future assessment plans.

The Assessment Committee will utilize this same process every year with regard to the Master Plan of Advance goal cycle that all campus departments (academic, academic support, and administrative) perform. In all cases, a standard rubric/checklist for feedback will be utilized to evaluate and communicate committee findings and recommendations of Administrative (see Appendix F) and Academic (see Appendix G) assessment plans, procedures, and results. (AM/AP)


How to Conduct Effective Outcomes Assessment

Outcomes assessment is a systematic process of identifying learning/programmatic outcomes based upon the mission; identifying appropriate assessments; ensuring the collection, analysis, reporting, and communication of assessment results; and initiating change based upon the results. A properly administered assessment plan ensures an ongoing process of quality enhancement, designed to safeguard the long-term viability and sustainability of Southeastern University. The process is guided by the following principles:

Consistency – refers to the ongoing process of ensuring that outcomes, assessments, and results are consistently measured from year-to-year. This principle also requires consistent assessment of student learning/administrative processes across academic departments, administrative units, and campus divisions (Academic Affairs, Enrollment Management, Finance & Administration, Student Development, and University Advancement). Consistency is the hallmark of a well-planned and well-executed assessment plan.

Collaboration – refers to the input of deans/chairs/directors in harmony with faculty members, departmental staff, and the university Leadership Team. The assessment process should be articulated and driven at the executive level of the university and permeate every part of the institution. Assessment is the responsibility of every member of the university faculty, staff, and administration.

Communication – refers to the systematic process of communicating the need for assessment, the plan for assessing student learning/administrative processes, the assessment results, and the recommendations for change. Communication is essential to maintain buy-in from university constituents.

Community Support – refers to the process of ensuring that adequate personnel are allocated to prepare, execute, and report the assessment plan. It is recommended that each department use someone other than the dean/chair/director to coordinate these assessment activities. Academic departments may give faculty release time to complete assessment activities.

Community Building – refers to the process of educating members about the importance of assessment, communicating the assessment plan, and providing an opportunity for assessment results to improve departmental activities and/or student learning. Assessment results should not be esoteric or overly complicated. The goal is to inform and improve the university. (AM)


Executive Summary of Outcomes Assessment

The process of outcomes assessment can be organized into five distinct steps, which are summarized below and explored in more detail in the next five sections of the handbook:

1. Define the Unit Foundational Document - Develop a mission statement, identify constituent populations, list critical processes, articulate your values, and describe your vision.

2. Identify Programmatic and/or Learning Outcomes - Specific, well-defined, realistic, action-oriented, future-tense goals that reflect the knowledge, skills, or abilities students should possess upon graduation/program completion in academic or co-curricular programming, or reflect the desired programmatic outcomes of an administrative unit (e.g. quality enhancement, strategic growth, customer satisfaction).

3. Identify Appropriate Assessments - The unit determines the type of measure that will return needed actionable data for the outcomes it has identified and considers various conceptual factors that may affect its design, administration, and use. Assessments may be formative or summative, internally or externally developed, and indirect or direct measures. It is a best practice to conduct more than one assessment measure for each outcome.

4. Collect, Analyze, Report, and Communicate Findings - The unit’s assessment coordinator (usually a department or program leader, or someone tapped by leadership to coordinate assessment) defines the assessment measure’s timeline. Assessment data is collected through rubrics, surveys, exams, focus groups, data requests, and/or other mechanisms. Data is analyzed via thematic/critical reflection on qualitative data or statistical analysis of quantitative data. Results and analysis, along with recommendations for improvement and results of actions taken from previous assessments, are reported, generally in the Campus Labs Planning site. Reports should be communicated to the unit’s stakeholders.

5. Take action based on findings - The utility of outcomes assessment depends on continuous improvement. Information and insights resulting from the findings of an assessment measure should be used to improve learning or programmatic outcomes. This may take the form of changes to curriculum, pedagogy, or assessments in academic and co-curricular units or changes to administrative operations, budgets, personnel, strategic planning, or assessments in administrative units. Assessment is a perpetual work in progress, so the results of actions taken should be regularly measured as part of continuing assessment efforts after the initial assessment has been conducted.


1. Define the Unit Foundational Document

Overview – Assessment of academic and administrative units begins with the development of a mission statement, which reflects the university mission statement (See Appendix C). The unit mission statement should describe the purpose of your department or unit and clearly communicate: What you do, Why you do it, and How you do it. It is important to obtain consensus from the members (faculty, directors, staff, etc.) on the department’s constituents, critical processes, values, and vision. Thus, the time spent articulating the unit mission is an important step in the institutional effectiveness process.

Writing a Foundational Document

Mission – What is our purpose?

Constituents – Whom do we serve?

Critical Processes – What are our daily, weekly, and quarterly operations?

Values – What are our essential, guiding principles?

Vision – What do we want to do in the future?

The mission statement should include the following sections: (1) a clear description of the purpose of the department along with its primary function, (2) identification of department constituents such as students, alumni, staff, faculty, donors, etc., and (3) a description of how the department contributes to the development of students and/or the university.

Questions for Consideration – Below are the questions you should consider when writing the unit foundational statements.

1. Mission – What is your department’s primary educational/administrative purpose? Academic: Does your program/department represent specific skills (Bible knowledge, communication, writing, and critical thinking skills) or broad academic disciplines (religion, English, biology, mathematics, education, communication, etc.)? Administrative: What service(s) does your department represent (e.g. business affairs, financial aid, environmental services, etc.)?

2. Constituents – Who are the department’s key stakeholders/customers? In other words, whom does your program primarily serve (e.g. undergraduate students, graduate students, residential/commuter students, alumni, donors, graduating seniors, etc.)?

3. Critical Processes – What is your department providing to meet the stated purpose? Academic: What activities does your department utilize to facilitate learning (e.g. coursework, labs, research projects, etc.)? Administrative: What specific service(s) does your department offer to university constituents (e.g. housekeeping, oversight, scholarship, fundraising, student advisement, registration, graduation audits, licensed counseling, etc.)?

4. Values – What do you hold to be valuable and essential to your department? In other words, what are your over-arching, guiding principles (e.g. honesty, integrity, diversity, innovation, data integrity, professionalism, etc.)?

5. Vision – What are your long-term aspirations? Academic: What type of careers or further study will the academic programs prepare students for (e.g. full-time ministry, seminary, corporate workplace, high school education, etc.)? Administrative: What are the long-term goals of your efforts (e.g. efficient data management and reporting, strong university endowment, student leaders, beautiful campus, etc.)?

See Appendix I for an example of a Unit Foundational Document that incorporates these five parts.

Writing the Mission Statement – Using the responses to these questions, you are now prepared to write the unit mission statement.

The mission of [Enter Department Name] is to [Enter your response from question #1] by providing [Enter your response from question #2] with [Enter your response from question #3] in order to [Enter your response from question #5].

Academic Example – The mission of the Department of Foreign Languages is to expose students to a variety of perspectives in Portuguese language and Iberian and Latin American culture, civilization and history by providing majors with training in writing and communication as well as cultural and historical analysis in order to develop students into critical and global thinkers prepared for careers in business, social service, and government or for graduate study in Iberian and Latin American Studies.

Administrative Example – The Office of Institutional Effectiveness (IE) supports the mission of Southeastern University through assessment, planning, research, reporting, accreditation support, and the integration of technology to promote a culture of data-informed decision-making, accountability, and quality enhancement.

Mission Statement Checklist – Now that you have created your mission statement, use the following checklist to help determine if your statement is effective and clearly defines the goals and vision of the department.

For each question, mark Yes or No:

❏ Is the mission statement brief and memorable?

❏ Is the mission statement distinctive? (Can it stand on its own and distinguish itself from other programs if the program’s name were removed?)

❏ Does it clearly state the purpose of the program?

❏ Does it indicate the primary function or activities that the department offers?

❏ Does it identify the major stakeholders/customers?

❏ Does it support the University and unit (Academic Affairs, Finance and Administration, or Development) mission statements?

(Material adapted from the University of Central Florida (UCF) Academic Program Assessment Handbook, February 2005, Information, Analysis, and Assessment.)

The comprehensive unit mission statement is articulated in the Foundational Document by providing a complete list of the Mission, Constituents, Critical Processes, Values, and Vision. (Appendix I)

The departmental Foundational Statements will be entered into Campus Labs Planning. Specific instructions for entering planning data into Campus Labs Planning are provided in Appendix J.


2. Identify Programmatic and/or Learning Outcomes

Overview – Once the mission is identified, departments are to create outcomes which reflect the departmental mission statement. For academic/student development departments, outcomes will reflect the knowledge, skills, and abilities students should possess and can demonstrate by graduation (e.g. students will demonstrate knowledge of biblical text, demonstrate oral communication skills, demonstrate knowledge and skill in effective writing, etc.). For administrative departments, outcomes will reflect the programmatic outcomes of the department (e.g. improve customer satisfaction, provide more scholarships, improve the budgeting process, etc.).

Recommendations – Before preparing a list of outcomes, consider the following recommendations:

1. Outcomes should be specific and well defined. Outcomes should contain clear and concise terminology, which addresses the desired skill or outcome. They should exclude the greatest number of alternatives so the outcome can be measured.

Academic Example – The outcome, “Students completing the B.S. Gen. Biology should be well practiced in the relevant skills of the field,” is too vague. The outcome does not provide a baseline for measuring the relevant skills of the field. A better example would be, “Students completing the B.S. Gen. Biology will demonstrate knowledge of the various cell structures and their functions.”

Administrative Example – The outcome, “The admission department will improve the training of admission counselors,” is too vague. The outcome does not provide the baseline for required training or even the specific evidences of a quality training program. A better example would be, “The admission department will improve the training of admission counselors by employing standardized processes, policies, and training manuals/exercises.”

2. Outcomes should be realistic. It is important to ensure that outcomes are attainable. Academic: Outcomes must be formulated in view of the students’ abilities, the available resources in the college/department, and the accumulation of other assessments. Administrative: Outcomes should represent a manageable list, which is reasonable for completion within 3 to 5 years. With this in mind, departments should avoid verbose language.

3. Outcomes should rely on active verbs in the future tense. It is important to state outcomes in the future tense as a statement of what is expected of students/department. The outcome might include the following phrases: “Students will demonstrate . . .” or “The admission office will . . .”

4. There should be a sufficient number of outcomes. You should include three to five outcomes in your assessment plan. Fewer than three outcomes do not provide adequate information to verify a process of assessment. More than five outcomes is usually too many because the data becomes complicated to collect, track, and synthesize.

5. Outcomes should be simple. The outcomes should be stated in a clear and simple manner. Avoid the use of compounded statements that join the elements of two or more outcomes. Outcomes should address only one goal.


Writing the Outcomes – Based on the previous recommendations, each department should write three to five outcomes to include the following: (1) the beneficiary of the service (i.e. students, university administrators, prospective applicants, etc.), (2) a focus on the end state (i.e. “Students will. . .” or the “The admission office will . . .”), and (3) the clearly articulated outcome.

Academic Examples –

➢ Students will be able to apply the scientific method to solve problems.

➢ Students will be able to apply algorithmic techniques to solve problems and obtain valid solutions.

➢ Students will be able to test hypotheses and draw correct inferences using quantitative analysis.

➢ Students will be able to communicate business knowledge both orally and in writing.

Administrative Examples –

➢ Library patrons will have access to the appropriate information resources needed for learning and research.

➢ Users will receive prompt assistance in resolving technical problems related to university networking services.

➢ Campus units will receive the technical support they need to conduct effective assessment.

➢ Eligible employees will have the information they need to make appropriate decisions regarding employee benefits packages.

➢ The university’s senior administrators will have the information they need for decision-making related to budgets and financial planning.

Outcome Checklist –

Now that you have written your outcomes, use the following checklist to help determine if each outcome is effective and clearly supports the goals and vision of the department.

For each outcome, mark Yes or No:

❏ Outcome can be directly measured and/or observed.

❏ Relies on action verbs in the future tense.

❏ Is useful in identifying areas for improvement. (Outcomes are not a list of standard office tasks; they are outcomes designed for quality enhancement.)

❏ Describes what students are intended to learn / what the department is intending to achieve.

(Material adapted from Stanford University: Assessment Tools)

Entering Departmental Outcomes – The departmental learning/administrative outcomes will be entered into Campus Labs Planning (seu.campuslabs.com/planning). Specific instructions for entering planning data into Campus Labs Planning are provided in Appendix J.


3. Identify Appropriate Assessments

Overview – Assessments should not be complicated, and when properly developed, collected, and analyzed, can be a powerful tool for planning, budgeting, staffing, and programmatic changes. Assessment data affords us the opportunity to understand where we are currently and how well the department is progressing towards its outcomes, like taking a pulse of the department in order to understand the current state of affairs. With this in mind, assessments should be administered as part of the ongoing, regular departmental processes. It is also important to understand that assessment is not an evaluation of an individual employee. Rather, assessments should be designed to measure the accomplishments and progress of the entire department.

Current Inventory – When identifying the appropriate assessments, each department should first take an inventory of the tools the department is currently using to measure student learning/administrative processes. What information are you collecting? What kinds of assessments are you already using or familiar with? What kinds of assessments are used by your profession/academic discipline?

(Material adapted from American University at Cairo: Assessment-A Guide to Developing and Implementing Effective Outcomes Assessment for Academic Support and Administrative Units, 2007)

Choosing the Assessment Method – Next, departments are to choose the assessment methods to be utilized. According to the NPEC Sourcebook of Assessment, Volume 1 (2000), the following elements should be reviewed:

1. Formative vs. Summative Evaluations – Formative Evaluation: the evaluation is designed to provide feedback, with the intent of improving pedagogical skills, student learning, curricula, institutional processes, structure, and programs. Summative Evaluation: the evaluation is designed to assist in the decision-making process (e.g. programmatic changes, resource allocation, personnel, etc.).

2. Internally vs. Externally Developed – Internally Developed: if there is not an external method of evaluation that assesses the student learning/administrative process, an assessment tool may need to be developed internally (e.g. Graduating Student Surveys, Department Satisfaction Surveys, etc.). Externally Developed: a commercial/widely accepted tool provides an opportunity for institutional comparison. Externally developed assessments prevent departments from reinventing the wheel. Examples include Noel-Levitz, NSSE, ETS field exams, and the like.

3. Conceptual Considerations – Decision Making: determine if the assessment data will be used for decision making regarding important policy issues, and determine the relevance of the assessment to the decision-making process. For instance, if you are measuring the ability of students to write effectively in the business world, the assessments should include a resume and other office communiqués rather than the literary analysis of a poem. Applicability: determine if the assessment measures the related stakeholders (students, donors, etc.). In other words, to what extent will the assessment data yield results that can be used by multiple groups? An example is the Graduating Student Survey, which was designed to offer student satisfaction scores for multiple departments. Interpretability: determine if the assessment data will be provided in a format that is comprehensible to individuals with different backgrounds. Traditionally, the Office of Institutional Effectiveness uses a 1 to 5 Likert scale for satisfaction scores; this is a standard means of assessing student satisfaction. Credibility: assessment credibility is based on the time, energy, and expertise required; the psychometric qualities of the test; the ease of interpretation of results; and the amount of detail provided.

4. Indirect vs. Direct – Indirect Assessments capture students’ attitudes, perceptions, opinions, and experiences. These include questionnaires, interviews, focus groups, satisfaction surveys, studies, and advisory boards. Direct Assessments prompt students/departments to represent or demonstrate their knowledge, skills, abilities, and measurable achievements. Academic: exams, performance evaluations, standardized testing, field tests, licensure exams, oral presentations, projects, demonstrations, case studies, simulations, portfolios, research papers, juried activities, etc. Administrative: budget numbers, enrollment figures, usage rates, number of complaints, average wait time, number of applications, dollars raised, etc. External benchmarks: graduation and retention rates, best practices, job and graduate school placement rates, administrative benchmarks for salaries and numbers of faculty and staff, etc.

5. Other Considerations – Sample size, the length of the assessment, the audience, and the assessment design (pre/post-testing) are just a few variables that greatly affect assessment outcomes.

For each learning/administrative outcome, you should have two assessments, at least one of which should be a direct assessment. Do not reinvent the wheel. Other departments/colleges are using assessments which might be applicable to your department. Furthermore, the Office of Institutional Effectiveness manages a number of annual assessments. For a complete list, see the Assessment Calendar and Institutional Effectiveness Schedule.

Entering Departmental Assessments – The departmental learning/administrative assessments will be entered into the Campus Labs Planning site. Specific instructions for entering planning data into Campus Labs Planning are provided in Appendix J.


4. Collect, Analyze, Report, and Communicate Findings

Overview – Once the assessment plan (Foundational Document, outcomes, assessments) is developed, the institutional effectiveness process needs to be implemented. It is recommended that each department appoint a faculty or staff member to be the assessment coordinator (AC). This could be the director, dean, chair, or another staff/faculty member. Based upon the assessment plan, the AC should develop an assessment timeline, which outlines when the assessments will take place, when the assessment results are due, when the results will be tabulated and analyzed, when the department will meet to discuss the results and recommend changes, and who will oversee the entire process. This cannot be stressed enough!

Collection – Collection of results is often overlooked in the assessment process. Therefore, it is important to invest a considerable amount of time, energy, and resources to design the collection mechanisms. If an academic department is using a rubric, the AC should establish a deadline for submission and provide a template for submitting rubric scores. Below is an example of a rubric score template that could be used.

Course Number | Section Number | Student I.D. # | Student Name  | Organ. Sub-score | Grammar Sub-score | Content Sub-score | Thesis Sub-score
ENGL 1133     | 01             | 178373         | John Doe      | 20               | 23                | 21                | 19
ENGL 1133     | 01             | 186154         | Michael Smith | 19               | 18                | 17                | 15
ENGL 1133     | 01             | 156893         | Jane Taylor   | 23               | 24                | 25                | 22
ENGL 1133     | 01             | 187539         | Bethany Todd  | 16               | 18                | 20                | 16

Assessment results should be collected in the easiest format possible, in a consistent manner. This applies to administrative as well as academic units. Some assessment collection may require the use of assistive technologies (Infomaker, Jenzabar, etc.). The Office of Institutional Effectiveness can assist your department in collecting institutional data (e.g. retention, enrollment, faculty-to-student ratio, etc.) and launching surveys using Baseline, our assessment software. Future assessments will be collected and housed in the Baseline software with built-in rubrics, testing, surveying technology, and data entry capabilities.
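For departments that collect rubric scores in a spreadsheet like the template above, a short script can verify that a submission is complete before the results are analyzed or entered into Campus Labs Planning. The Python sketch below is offered only as an illustration, not an official tool; the file name rubric_scores.csv and the column names are assumptions that mirror the example template and would need to match a department's actual export.

import csv

# Column headings matching the example rubric score template above (assumed names)
EXPECTED = ["Course Number", "Section Number", "Student I.D. #", "Student Name",
            "Organ. Sub-score", "Grammar Sub-score", "Content Sub-score", "Thesis Sub-score"]

def load_rubric_scores(path):
    """Read the rubric template and flag rows with blank or non-numeric sub-scores."""
    rows, problems = [], []
    with open(path, newline="", encoding="utf-8") as f:
        reader = csv.DictReader(f)
        missing = [c for c in EXPECTED if c not in (reader.fieldnames or [])]
        if missing:
            raise ValueError(f"Template is missing columns: {missing}")
        for line_no, row in enumerate(reader, start=2):   # row 1 holds the headers
            for col in EXPECTED[4:]:                      # the four sub-score columns
                if not (row.get(col) or "").strip().isdigit():
                    problems.append(f"Row {line_no}: '{col}' is blank or non-numeric")
            rows.append(row)
    return rows, problems

if __name__ == "__main__":
    scores, issues = load_rubric_scores("rubric_scores.csv")  # hypothetical export file
    print(f"Loaded {len(scores)} student rows; found {len(issues)} problems")
    for message in issues:
        print(message)

A check of this kind lets the assessment coordinator return an incomplete submission to the instructor immediately rather than discovering gaps during analysis.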

Analysis – Analysis of assessment data depends upon the mode of assessment. Qualitative results should surface specific themes which arise out of focus groups, student comments, and/or audit reports. If the assessment data is quantitative, the department should employ statistical techniques for analyzing data such as:

✓ Frequency Distributions – Frequency distributions are an easy way to summarize the number and range of scores. For instance, you may be interested in determining the number of students who received a particular score on an objective test or rubric. Frequencies are easily converted to percentages by dividing each count by the sum of all frequencies and then multiplying by 100. For example, a frequency distribution can summarize the range of scores earned on the ETS Proficiency Profile.


✓ Calculating Measures of Central Tendency – Measures of central tendency represent the average or typical score. The three most common are the mean, median, and mode. Mean – the sum of all data divided by the sample size. The danger in using the mean is that it fluctuates with outlying scores (those that fall outside the normal range), so it may be more appropriate to report the median. Median – the counting average, found by arranging the scores in ascending order and counting up to the midpoint. Mode – the most frequently occurring score.

✓ Calculating Measures of Dispersion or Variability – Measures of dispersion represent how scores are spread above and below the measure of central tendency. The range, variance, and standard deviation are the most common measures of variability.

(Material adapted from Stanford University: Assessment Tools)

Note: Most statistical analysis can be completed in programs such as Excel or SPSS.
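As an alternative to Excel or SPSS, the same summary statistics can be produced with Python's standard library. The following is a minimal sketch using made-up rubric scores (the numbers are purely illustrative); it simply walks through the frequency, central tendency, and dispersion calculations described above.

from collections import Counter
from statistics import mean, median, mode, pstdev, pvariance

# Hypothetical total rubric scores for one learning outcome
scores = [20, 23, 21, 19, 19, 18, 17, 23, 24, 25, 22, 16, 18, 20, 16, 19]

# Frequency distribution, with counts converted to percentages
freq = Counter(scores)
total = sum(freq.values())
for score in sorted(freq):
    pct = freq[score] / total * 100   # divide the count by the sum of all frequencies, multiply by 100
    print(f"Score {score}: n={freq[score]} ({pct:.1f}%)")

# Measures of central tendency
print("Mean:", round(mean(scores), 2))
print("Median:", median(scores))
print("Mode:", mode(scores))

# Measures of dispersion or variability
print("Range:", max(scores) - min(scores))
print("Variance:", round(pvariance(scores), 2))
print("Standard deviation:", round(pstdev(scores), 2))

The same figures can, of course, be produced in a spreadsheet; the point is only that the calculations are simple enough to automate and repeat consistently each assessment cycle.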

Ensure that all of the data is available in a consistent format so as to guarantee the reliability of the scores. Now that you have analyzed the data, you should compare the results of your assessments against the learning/administrative outcome.

Did the results follow a certain pattern? For example, did certain professors have lower or higher scores on a grading rubric? On which sections of the rubric or knowledge exam did students fall short? Is there evidence of improvement based upon the assessment results? The final step is to flesh out the themes that arise from both the quantitative and qualitative results.

Reporting – After the assessments are tabulated and analyzed, the results need to be reported in a format that is easily interpreted. The Office of Institutional Effectiveness prescribes the use of Campus Labs as the primary portal for assessment reporting. The departmental learning/administrative assessment results will be entered into Campus Labs Planning. Specific instructions for entering planning data into Campus Labs Planning are provided in Appendix J.


Communication – The assessment results should fuel an ongoing discussion of institutional effectiveness. The outcomes, assessments, and results/analysis for the department should be recorded on the appropriate outcome template (MPA/PLO; see Appendices D & E) and presented to the department. An example is provided below.

Outcome
1. Data Integrity, Entry and Retrieval: To work with other departments to work toward the proper entry, greater integrity and enhanced retrieval of needed data.

Assessments
1.1 More of the data variables listed in the goal will be "clean" and available in the 2007-2008 fiscal year.
1.2 Common Data Set and Fact Book produced in fall and made available to campus faculty, staff and administration.
1.3 All necessary data submitted in a timely fashion for Fact Book, Common Data Set and reports and surveys by fall 2010. All needed data will be reported and have excellent integrity by fall 2010.

Results/Analysis
1.1.1 All data variables listed, except for high school class rank, are now being effectively entered and readily captured.
1.2.1 Common Data Set and Fact Book have been produced each year. Templates have been created to allow for more automation in gathering and entering values.
1.3.1 All data has been gathered in a timely fashion and has been certified by IR and departments providing the information.

Recommendations
1.1.1.1 Continue automation and innovation in entering data variables. Work toward more automation in populating database, and decreasing manual data entry.
1.2.1.1 Further automate production of Fact Book and Common Data Set.
1.3.1.1 Clarify deadlines for submission of data, IR data review, and data certification.


5. Take action based on findings

Overview – Closing the loop is the last phase in the assessment cycle and involves making decisions about how to respond to assessment data. Academic and administrative departments should discuss assessment-related data at least twice per year: at the beginning of each year to affirm learning/administrative outcomes, and at the end of the year to discuss the results of assessments. Academic departments should strive to identify areas of weakness and recommend possible pedagogical, curriculum, and/or program changes. Administrative departments should assess their progress and recommend possible changes to structure, budget, personnel, and/or processes. Following the year-end meeting, recommendations should be recorded in the completed MPA in Campus Labs Planning and reflected in departmental and committee minutes.

Examples of changes that may be implemented as a result of assessment

Changes to the Assessment Plan

❖ Revision of intended learning outcomes

❖ Revision of measurement approaches

❖ Changes in data collection methods

Changes to the Curriculum

❖ Changes in teaching techniques

❖ Revisions in prerequisites

❖ Revisions of course sequence

❖ Revisions of course content

❖ Addition of courses

❖ Deletion of courses

Changes to the Departmental Processes

❖ Revision of admission requirements

❖ Revision of advising standard or processes

❖ Improvements in technology

❖ Changes in personnel

❖ Changes in customer practices

(Adapted from the University of Central Florida Academic Program Assessment Handbook, February 2005: Information, Analysis, and Assessment)

It is important to remember that recommendations and subsequent changes are generally gradual and cumulative rather than immediate. Departments should engage in ongoing assessment with attention to necessary budgetary, personnel, and resource allocation priorities.

(AM/AP)


Academic Program Assessment

Overview – Academic program assessment is at the very heart of the university’s mission because it speaks to the university’s main purposes: student learning and student success. Assessment of academic programs is best accomplished through a systematic, ongoing, broad-based, collaborative process involving the academic deans, chairs, and faculty of each academic program. Academic program assessment is used to measure basic levels of student knowledge, skills, and abilities (baseline measures), to track the rate or degree of change or improvement in student achievement (longitudinal studies), and to make comparisons with peer institutions (benchmark data).

As outlined briefly here and in much more detail in Appendix K, a good assessment plan and process consists of eight steps:

❏ Define the departmental mission

❏ Identify the program learning outcomes

❏ Ensure that curriculum adequately covers identified outcomes

❏ Define assessment measures, instruments

❏ Create assessment plan

❏ Administer assessments

❏ Collect, analyze, report, and communicate assessment results

❏ Implement changes or improvements in assessment plan, program, and curriculum based on assessment results.

Student Learning Outcomes – A comprehensive process for creating student learning outcomes can be found in Appendix L.

Curriculum Maps – To ensure that identified outcomes are adequately taught and measured across the curriculum, departments should use curriculum maps. An example is provided in Appendix H, and an illustrative sketch of how a map can be analyzed follows below. View the following learning module to learn how to chart and analyze a curriculum map: Curriculum Map learning module
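As a purely illustrative sketch (the course codes, PLO labels, and I/R/M coding below are hypothetical, not a prescribed SEU format), a curriculum map can be treated as a simple matrix of courses by outcomes and checked for gaps in coverage:

```python
# Hypothetical curriculum map: course -> {PLO: level}, where the levels are
# I = introduced, R = reinforced, M = mastered. Codes are invented examples.
curriculum_map = {
    "CORE 1101": {"PLO 1": "I", "PLO 2": "I"},
    "CORE 2210": {"PLO 1": "R", "PLO 3": "I"},
    "CORE 3320": {"PLO 2": "R", "PLO 3": "R"},
    "CORE 4990": {"PLO 1": "M", "PLO 2": "M"},  # capstone course
}
program_outcomes = ["PLO 1", "PLO 2", "PLO 3"]

# Flag outcomes that are never taught, or taught but never brought to mastery.
for plo in program_outcomes:
    coverage = {course: levels[plo] for course, levels in curriculum_map.items() if plo in levels}
    if not coverage:
        print(f"{plo}: not covered anywhere in the curriculum")
    elif "M" not in coverage.values():
        print(f"{plo}: covered in {sorted(coverage)} but never assessed at the mastery level")
    else:
        print(f"{plo}: coverage appears complete -> {coverage}")
```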

Student Development Application – The Academic Program Assessment process is also applicable to the Student Development unit. In higher education literature, residence halls are routinely referred to as “living, learning communities.” In this model, student life personnel are recognized as co-educators with faculty in that they seek to integrate student life with the academic processes of the university. Based on this model, student development departments should articulate learning outcomes for students as a baseline for measuring student success, faith integration, leadership skills, etc. More on this area of assessment can be found in the Assessment of Student Success & Student Development chapter of this handbook.

(AP/JR)


Academic Program Review

Through a self-reflective process, academic units will investigate a number of key performance indicators, including program learning objectives, curriculum relevance, teaching and learning methods, student success, and administrative practices and procedures. The program review cycle, timeline, APR section guides and supplemental learning presentations, tools, and other resources can be found in the Academic Program Review Handbook (3rd ed.). This handbook should not be perceived as a prescriptive document, designed to evaluate departmental compliance with predetermined standards. Rather, the APR process embodies a philosophy of self-reflection and self-improvement, wherein departments articulate their outcomes and assess the extent to which these are achieved. Furthermore, the undergirding philosophy of this process is in keeping with the University’s model of shared governance.

The current proposed review cycle is as follows:

Academic Years – Academic Unit

2018-19 – Barnett College of Christian Ministries & Religion (School of Divinity*); School of Unrestricted Education

2019-20 – School of Business Administration*

2020-21 – Department of Humanities; Department of Visual Arts

2021-22 – Department of Communication

2022-23 – School of Leadership Studies; School of Legal Studies; Department of Music

2024-2025 – College of Natural & Health Sciences (Department of Nursing*); College of Education (Department of Undergraduate Studies*)

2025-2026 – College of Behavioral & Social Sciences (Social Work*)

*Units with state or professional accreditations may be exempt from aspects of the APR process dealing with academic content and quality. These units will focus primarily on the MVR and the AP, and will not require an additional on-site visit.


Administrative Unit Assessment

Overview – Administrative unit assessment is an ongoing, continuous, and systematic process of assessing the administrative activities of your department. Every department on campus, including academic, student life, administrative, and development units, is to assess its administrative processes. The results of administrative unit assessment provide the empirical data necessary for developing annual and long-range plans and for determining resource allocation. Assessment can benefit departments by:

● Helping clarify the department/unit’s mission and its role in achieving the university mission, and identifying the activities necessary to achieve the department/unit’s mission

● Providing coherence and direction to the department’s work

● Providing staff with clear expectations and a means of evaluation

● Providing administrators with better information about how their services are viewed by students (customers)

● Helping administrators make informed decisions about budgeting, personnel, staff hires, the need to improve or expand services, and more

● Ensuring that resources are being allocated in the most effective manner

(Material adapted from American University at Cairo: Assessment-A Guide to Developing and Implementing Effective Outcomes Assessment for Academic Support and Administrative Units, 2007)

As outlined briefly here and in much more detail in Appendix O, a good assessment plan and process consists of seven steps:

❏ Define the unit’s foundational statements

❏ Identify the most important outcomes of the unit

❏ Define assessment measures, instruments

❏ Create assessment plan

❏ Administer assessments

❏ Collect, analyze, communicate, and report assessment results

❏ Implement changes or improvements in assessment plan, program, and department based on assessment results

- AP/AM


Faculty and Staff Assessment

Faculty Assessment – Faculty assessment is the responsibility of the Faculty Development Committee in coordination with the deans and chairs. In the fall of each year, faculty members prepare goals for the academic year to address areas of the university teaching vocation. The areas of vocational commitment are as follows:

● Teaching

● Advising

● Scholarship and Professional Contributions

● Community Service

● University Service

● Administrative Goals (for faculty who have assigned/paid administrative positions)

The faculty member meets with his or her dean/chair to discuss the plan for the academic year. Goals are then mutually agreed upon by the faculty member and the dean or department chairperson. Once the goals are agreed upon, they officially become part of the Professional Activities Contract (PAC). Data concerning each goal are gathered in a variety of ways, including the following:

❖ Supervisor Classroom Observations

❖ Student Course Assessment: Course evaluations are distributed via the Campus Labs: Course Evaluation tool twice a semester.

❖ Peer Classroom Assessment

❖ Advisor Evaluation

At the end of each year, the faculty member completes a Professional Activities Report (PAR) in which he/she explains whether or not the Professional Activities Contract goals were met. This report also includes lists of committees, sponsorships, and other contributions to the institution, as well as a record of professional memberships, conferences, workshops attended, and papers presented or published.

After receiving the Professional Activities Report, near the end of each spring semester, the department chair or dean reviews all the preceding assessment sources and completes the Department Chair Summary Report. The completed Annual Faculty Development and Assessment Portfolio is then reviewed by the faculty member in conference with the dean/department chair. Both the faculty member and supervisor sign the portfolio and it becomes part of the assessment record for the faculty member.

Staff Assessment – Staff and administrative assessment is administered by the Human Resource Office. For information related to this process, see the Staff Handbook (revised 02/2018).

The Office of Institutional Effectiveness aids supervisors in the performance management process with the administration of 360 Evaluations on the Qualtrics platform. For information about the Qualtrics 360 Evaluation at Southeastern University, contact the Executive Director of Information Management ([email protected]).

(AM/JR)


Assessment of Student Success & Student Development

Southeastern University’s mission is “Equipping students to discover and develop their divine design to serve Christ and the world through Spirit-empowered life, learning, and leadership.” It is therefore the responsibility of all units at the institution, especially those that work directly with students, to design, execute, and improve opportunities for students to discover, develop, live, learn, and lead through an intentional, scaffolded educational journey. This endeavor requires the engineering of outcomes that describe in a detailed manner how students will become beneficiaries of the university mission, as well as curricular and co-curricular learning and development experiences that deliver those outcomes.

Student Success

Historically, student success has been construed as the ability to retain students and enable them to persist through to graduation. This historic emphasis on attainment rates is one of the most basic definitions of student success and takes into account only access to college and degree completion (Schreiner, Louis, & Nelson, 2012). A concerted effort has been undertaken to understand the factors that contribute to retention, persistence, and graduation rates, and this focus has uncovered a vast set of variables that influence persistence. The amalgamation of these variables forms theories of student persistence that align with the following disciplinary perspectives: sociological, psychological, organizational, cultural, and economic (Kinzie, 2012).

Perspectives on Student Success.

● Sociological Perspective The sociological perspective of student success takes into account two main factors that influence persistence. First, numerous studies have evaluated the impact of social structures that influence college students (Braxton, 2000; Braxton, Doyle, & Jones, 2013; Tinto, 1986). Some of these social structures include college peers, socioeconomic status, socialization processes, and support from others (Braxton et al., 2013). Second, the sociological perspective considers the shared behaviors that promote a common outcome such as student persistence (Kinzie, 2012).

● Psychological Perspective The psychological perspective of student success focuses primarily on individual students and their psychological characteristics that influence persistence and departure decisions (Astin, 1977, 1993; Bean & Eaton, 2000, 2001). This perspective emphasizes the impact of numerous variables on student persistence including “individual attributes, beliefs, coping skills, levels of motivation, and interactions with other members of the campus community” (Kinzie, 2012, p. xvii).

● Organizational Perspective The organizational perspective of student success considers the impact of institutional factors such as behaviors, policies, and practices that impact persistence. Research conducted by Kuh, Kinzie, Buckley, Bridges, and Hayek (2007) categorized these factors into the following groupings: “institutional size, selectivity, resources, faculty-student ratios…control, mission, and location” (p. 15). These elements impact the level of commitment students have towards an institution, their sense of belonging, and overall satisfaction and, in turn, their likelihood of persisting (Bean, 1980, 1983, 1985).

● Cultural Perspective The cultural perspective of student success evaluates the unique challenges faced by underrepresented student groups. As a result of their unique lived experiences and underlying institutional constructs, these students are often less likely to benefit from the learning environment of an institution (Astin, 1977, 1993; Kuh, Kinzie, Schuh, Whitt, & Associates, 2005b; Mayhew, Rockenbach, Bowman, Seifert, & Wolniak, 2016). When these theories of involvement and engagement are applied, students from different cultural backgrounds are often overlooked.

● Economic Perspective Finally, the economic perspective of student success weighs the costs and benefits of higher education. Specifically, this perspective considers how students conduct an informal cost-benefit analysis of their college experience and the activities they choose to participate in. Braxton (2003) found that students who perceive that the value of their education is not worth the cost are more likely to depart before completing a degree.

Expanded Vision of Student Success – As noted previously, the view of student success held by many of the theories outlined above focuses on persistence and degree attainment as the primary measure of success. By applying many of the theories contained within the disciplinary perspectives outlined above, institutions aim to increase the percentage of students who are retained, persist, and graduate. Unfortunately, this is an oversimplification of student success and of the purpose of a college education. In recent years, scholars have begun to expand their view of student success. Some of the additional metrics now being considered in the research include learning gains, talent development, student satisfaction, and student engagement (Kinzie, 2012).

Taking into account the foundational perspectives of student success and integrating the expanded visions discussed above, Kuh, Kinzie, Buckley, et al. (2007) provide a more holistic definition of student success as “academic achievement; engagement in educationally purposeful activities; satisfaction; acquisition of desired knowledge, skills, and competencies; persistence; and attainment of educational objectives” (p. 10). An expanded vision of student success takes into account a plethora of factors. Because of the complex, sometimes confounding variables that affect student success, Kinzie (2012) states that there is no single solution that helps students succeed; rather, a concerted, campus-wide effort from all stakeholders must take into account the factors outlined in this section.

Thriving – The pioneering work of Schreiner (2010) continues to expand the view of student success in college. While Schreiner’s conceptual framework takes into account many of the expanded views outlined previously, her work provides a working theory of holistic student well-being that supports not only student success but human flourishing. Grounded in positive psychology, Schreiner defines student thriving as being “fully engaged intellectually, socially, and emotionally in the college experience” (p. 4). Schreiner (2010) has found that students who are thriving tend to engage in deep learning, work intentionally toward goals, value the perspectives of others, foster healthy relationships, and maintain an overall positive perspective. As a result, thriving students experience personal and academic success that promotes retention, persistence, and graduation and, ultimately, are able to experience the full benefits of a college education (Schreiner, 2010).

Assessing Student Success – Southeastern University’s Office of Institutional Effectiveness administers a number of measures that return data relevant to student success at the institution. The majority of those assessments are described elsewhere in this handbook. They include the National Survey of Student Engagement, which measures student involvement in a number of engagement indicators associated with a successful undergraduate experience (including Higher-Order Learning, Reflective & Integrative Learning, Collaborative Learning, and a Supportive Environment) as well as High-Impact Educational Practices such as First-Year Experience, learning communities, mentoring, study abroad, capstone experiences, and other enriching educational practices that are positively associated with student learning and retention; Ruffalo Noel-Levitz’s Student Satisfaction Inventory, which measures both student satisfaction and priorities on a number of items relevant to the undergraduate experience; the Thriving Quotient, an instrument developed to measure the academic, social, and psychological aspects of a student’s college experience that are most predictive of academic success, institutional fit, satisfaction with college, and ultimately graduation; and a number of institutionally developed measures such as the Graduating Student Survey and regular focus groups, all described in this handbook. National measures like the NSSE and the SSI return valuable national benchmark and consortia data for comparison.

Although the above measures are useful (and perhaps indispensable to institutional effectiveness), they do not begin to exhaust the potential for measuring student success at Southeastern University. Student Development, Academic Services, Information Management, and other units all have roles to play in the development of a culture of evidence of student success at the institution. Outcomes assessment is the standard strategy for accomplishing this significant undertaking.

Student Learning & Student Development

Assessment of student learning is not confined to the learning that takes place within an academic department’s curricula or a student’s academic course work. Student learning has become a model for thinking about the primary aim of nearly every conceivable planned experience, activity, collaboration, service, and network in which a student might participate while enrolled in a postsecondary institution. One’s “life at college” is more than the sum of auxiliary activities designed by student affairs offices to enrich one’s time outside of the classroom in a 4-year degree program. Not only Student Development but every other unit at the institution that is primarily charged with supporting or serving students must, like Academic Affairs, develop outcomes related to the student knowledge, ability, and thought it intends to foster and measure as students move from matriculation to commencement. The following excerpt from Assessment Reconsidered: Institutional Effectiveness for Student Success describes how student affairs professionals should be understood as educators, just as much as their faculty colleagues:

Student affairs educators have long had as their professional mission to support personal and social maturation. While there has been some recognition that such maturation is a complex learning process, the explicit acknowledgement that learning outside the classroom is still learning . . . has come to be explicit only relatively recently. Many student affairs professionals have not thought of themselves as educators; indeed, some resist that label . . . There is, however, no conflict between providing excellent services and supporting learning (Keeling, et al., p. 8).

Thus, as educators, student affairs professionals must assess learning outcomes in order to enhance learning, ensure alignment with institutional outcomes, and integrate the curricular and co-curricular journeys of our students. This responsibility is not merely a trend in higher education. As Fernando Padro, a Faculty Fellow in NASPA, says, as quoted in Building A Culture of Evidence in Student Affairs, “The era of the learner outcome is here to stay for the foreseeable future” (Mason and Meyer, 2012, p. 61). It should be noted that a distinction can be made between learning outcomes and developmental outcomes. Mason and Meyer state that developmental outcomes “assess affective dimensions or attitudes directed toward a person, object, place, or idea that predispose people to behave in certain ways” (p. 68). Learning outcomes, on the other hand, “assess the intellectual or cognitive learning you want to occur” (Mason and Meyer, 2012, p. 64). While student affairs units may emphasize development over learning or learning over development in a given co-curricular space, both are critical to student success, and both may be assessed with some of the following examples. (CL/JR)

The following example describes the stages of planning an assessment of student learning that transpires in a co-curricular learning experience:

Topic: Seat at the Table (Multicultural Affairs series)

Identifying an educational need – Needs assessment: administering an assessment tool, such as a survey, and analyzing data that indicate the need for change in knowledge, attitude, or behaviors.

Conceptualizing – Categorizing the identified needs; developing a scope and sequence of learning goals that illustrates who should know what, and in what general order or sequence the content should become known, as a result of the educational programming.

Planning – Working collaboratively with others (representatives of constituent groups) to identify who or what department will create each workshop; what knowledge, skills, or attitudes students should develop as a result of the workshops (i.e., student learning outcomes); when and where each workshop will take place; how the learning opportunities will be made known to students; what assessment method will be used to determine level of impact; how assessment data will be gathered; and how that data will be analyzed, synthesized, reported, and responded to. Note that assessment data are never collected for their own sake; assessment serves a purpose beyond assessment itself.

Implementing – Delivering the workshops; the format, content, and methods should respond to the learning goals, fit the learning style of the participants, and anticipate the assessment process.

Evaluating – Developing the assessment strategy and methods; providing the assessment to the learners; gathering, analyzing, and synthesizing data; reporting findings; developing a strategy to respond to data findings; describing the data findings and proposed responses to key stakeholders.

* Adapted from Assessment Reconsidered: Institutional Effectiveness for Student Success (Keeling, et al., 11)


The Planning stage in the example above requires perhaps the most creative and critical thinking about precisely what students will take away from a given learning/development experience, especially given the varying degrees of preparedness and capacities among postsecondary student populations. Students are whole persons on journeys of learning and development, not mere information processors or one-dimensional cogs within the institution. Such complexity requires a scaffolded approach to the development of learning outcomes, one which recognizes that learning and development occur gradually, from lower to higher levels of complexity and rigor. The example below provides sample learning outcomes related to the domain of citizenship, based on taxonomies associated with student class level and type of learning. This is not a prescriptive set of outcomes, though it can be adapted or used as a template for nearly any area of student development or student success.

Domain: Citizenship

First-year students
Cognitive: Define what it means to be a campus student leader.
Psychomotor: Demonstrate how to constructively facilitate a class discussion.
Affective: Awareness of how interpersonal skills express basic social competencies like respect, cooperation, and patience.

Sophomores
Cognitive: Explain why interpersonal skills are core elements of citizenship.
Psychomotor: Engage in conflict mediation through student organizations, classroom discussions, or debate.
Affective: Distinguish levels of citizenship advocacy like activism, radical activism, and civil disobedience.

Juniors
Cognitive: Apply principles of civility when faced with peer pressure (e.g., setting boundaries, engaging refusal skills, and clear communication).
Psychomotor: Assist first-year and sophomore students in clarification activities, both formally (in student organization activities) and informally (in residence halls, service learning activities, or alternative spring break experiences).
Affective: Describe one’s personal sense of integrity.

Seniors
Cognitive: Describe and discuss characteristics of citizenship and incorporate those into campus leadership programs.
Psychomotor: Analyze leadership portfolios of first-year student scholarship applicants.
Affective: Assist college administrators by synthesizing student leaders’ accomplishments and creating an annual report.

* Adapted from Assessment Reconsidered: Institutional Effectiveness for Student Success (Keeling, et al., 26)


Best Practices

Developmental, learning, and program assessment have a place in all major areas in student affairs. As shown in the following examples, colleges and universities across the United States are employing a variety of learning outcomes, developmental outcomes, program outcomes, and other assessment strategies, including rubrics.

Academic Advising At The University of Vermont, the division of Academic Support Programs tracks usage; conducts needs assessment of faculty, students, and volunteers in the Note Taking Program; tracks student grade point average and retention through outcomes assessment; provides satisfaction studies; measures its resource effectiveness; and sets benchmarking standards. To strategically plan for the future, the division uses CAS standards to review the program with an external panel (Assessment Continuum, 2010). The University of Wisconsin–Milwaukee’s College of Engineering and Applied Science uses cognitive, skills, and affective student learning outcomes to measure the value of its academic advising (ACCESS to Success, 2011).

Admissions The Office of Admission and Financial Aid at Pitzer College in California has created a goal for prospective students and their families to demonstrate learning outcomes. To achieve the eight established student learning outcomes, the office created eight means of communication, ranging from information sessions to websites to individual visits at high schools throughout the United States (Office of Admission Student Learning Outcomes, 2012).

Campus Centers/Student Unions The Hulman Memorial Student Union (HMSU) at Indiana State University contributes to the division’s Master Assessment Plan by conducting guest satisfaction surveys, usage reports, reservation surveys, programming surveys, and need surveys. A report on the measures of student behavior was created after students attended a professional skills development workshop. Specific behavioral outcomes were identified and measured, including strengths and weaknesses (HMSU Research and Assessment, 2012). The Ohio State University Ohio Union used CAS College Union standards and the Association of College Unions International Core Competencies to create desirable student learning outcomes such as intellectual growth, effective communication, enhanced self-esteem, realistic self-appraisal, clarified values, and career choices (Burden et al., 2008). The union’s use of tables and specific examples of outcome behaviors is readily transferable to a working rubric document.

Career Services The Career Center at Boston College in Massachusetts uses a rubric to assess practice interviews. During a 1-hour practice interview with a career advising staff member, the advisor offers students insights and suggestions to improve their interviewing skills. The discussion is guided by a rubric that defines the skills students need to demonstrate: verbal and nonverbal communication, listening, value of previous experience, and preparation and interest. At the end of the session, students are ranked on a scale from “occasionally” to “consistently.” The rubric is used as a teaching tool to add a learning element summarizing the practice interview (Using a Rubric to Assess Practice Interviews, 2012). Learning outcomes have taken center stage at Indiana State University’s Career Center since at least 2004. Learning outcomes are used for student internship evaluations and to measure student behavior after workshops and interviews (Career Center Research and Assessment, 2012).


Counseling and Health Programs The Center for Health and Wellbeing at The University of Vermont administers usage and student satisfaction surveys, and the university also tracks the number of clinical visits (Utilization and Student Satisfaction at CHWB, 2011). The Indiana State University Student Counseling Center created the Counseling Outcomes Assessment Study to determine whether clients would report learning in one or more of the 13 counseling behavior areas (Report of Outcomes of Student Counseling Clients, 2007).

Disability Support Meredith College in North Carolina created learning outcomes for students who used the counseling center’s disability services, as well as learning objectives for faculty (Welcome to Disability Services, 2011).

Distance Education Ventura College in California keeps it simple with two college-level student learning outcomes: information competency, and critical thinking and problem solving. Units develop and measure specific, goal-oriented outcomes. Outcome statements look like this: “At least 20% of the faculty completing distance education training provided by the college will use one or more teaching tools/techniques in their distance education course” (Distance Education, 2011).

Housing and Residence Life Pathways is Boston College’s first-year residential life experience for the 306 students living in Hardey House and Cushing Hall. The program purposefully integrates the school’s mission into a first-year residential program that includes overall student experience, overall resident advisor experience, Frosh.0 (small discussion groups), resident assistant training, an alternative spring break, and an academic initiative. Each of the components has an assessment strategy, such as short answer surveys, rubrics, focus groups, observations, and other evidence of learning that correlates to outcomes (What Is Pathways?, 2012). The Office of Residence Life and Housing at Bridgewater State University in Massachusetts has created a multitude of learning outcomes for programs and services, ranging from general student outcomes geared toward living in a residential community (e.g., Students will be able to effectively communicate with their fellow residents) to first-year student housing assignments (e.g., Students will be able to recognize the importance of respecting the needs of others) to programming (e.g., Residents will be able to expand their knowledge to challenge current beliefs). Additional outcomes have been created for the First Year Residential Experience program, the programs held during the crucial first six weeks of the school year, the Leaders Emerging and Developing series, Community Watch Committee programs, and Residence Life and Housing Sustainability Committee programs (e.g., Students will be able to identify on- and off-campus resources that promote sustainable practices). Each residential learning community has specific outcomes focused on its community purpose, and staff members (students and professionals) are provided with outcomes for every step of employment, from resident assistant recruitment, application, and experience processes to staff training. Bridgewater even includes facilities in its learning outcomes, from work orders to damage billing to appeals. Additionally, every residence hall student organization has specific learning outcomes (B. Moriarty, personal communication, April 30, 2012).

International Students Ventura College helps international students set their course by focusing on three college-level student learning outcomes: (1) information competency, (2) critical thinking and problem solving, and (3) social interaction and life skills. These learning skills are translated into service unit outcomes and assessed periodically. The following are examples of outcomes:


● International students will demonstrate knowledge about their immigration status and understand the requirements for maintaining their visa status.

● International students will demonstrate success by maintaining satisfactory academic progress.

● International students will demonstrate an understanding of the United States by their successful integration into the community. (International Students, 2011)

The W.E.B. DuBois International House (I-House) at Morehouse College in Georgia brings together academic affairs, student affairs, and wellness services to offer an integrated living-learning experience for international students. I-House has established a mission, goals, and three learning outcomes for all international students and U.S. citizens who live there (W.E.B. DuBois International House, 2009).

Internships and Cooperative Education As part of their undergraduate humanities program, students at the State University of New York Maritime College may enroll in internship hours, complete with learning outcomes. Students must acquire skills in three areas to demonstrate the acquisition and retention of understandings and competencies. In the area of communication skills, students must demonstrate accomplishments in oral and written communication evidenced by daily logs, e-mail communication with faculty, and the clear and persuasive expression of ideas. Students must demonstrate at least six learning outcomes in the category of cognitive skills, which may include organizing and maintaining information, negotiating and arriving at a decision, or working in cross-cultural or multinational systems. Additionally, students must accomplish at least eight professional skills learning outcomes, such as exercising leadership, behaving ethically, teaching others, and participating as a member of a team (Maritime Studies Internship, 2011).

Learning Assistance Programs/Tutoring In 2006, Brazosport College in Texas crafted a quality enhancement plan called Creating a Connected, Integrated Transitional Education Program. The plan came into being after a review of institutional research data, discussions with college faculty and staff, and examination of national research data confirmed that transitional education offered the best opportunity for improving student learning outcomes. To make the desired changes, Brazosport activated the plan, adopted four learner outcomes, and created institutional and program goals to develop a structure that supports these goals. The plan includes assessment strategies, implementation tasks, and timelines (Brazosport College Quality Enhancement Plan, 2006).

Multicultural Student Programs and Services The Office of AHAN (African American, Hispanic, Asian American, and Native American) Student Programs at Boston College “Opened the DOR” with its Dialogues on Race program, demonstrating programmatic and student learning outcomes. After completing the session, 100% of the participants understood and could correctly define institutional racism, and 100% would recommend the program to a peer. Additional learning outcomes included the ability to articulate the importance of learning about different experiences based on race, culture, and ethnicity; ability to demonstrate an increased level of comfort in discussing issues of race in academic and social settings; and ability to demonstrate a clear understanding of institutional racism and how it affects society (Open the DOR, 2012).

Orientation Students who participate in new student orientation and first-year programs at Bowling Green State University in Ohio develop personal action plans related to academic success, career development, leadership and engagement, and personal and fiscal responsibility. Students also must demonstrate the ability to recognize Bowling Green’s learning outcomes, understand how the outcomes are connected to their curricular and cocurricular goals, and describe their rights and responsibilities in achieving these outcomes (Invest, Engage, Attain, 2012). Lourdes University in Ohio offers a simple mission statement for orientation: “Orientation provides new students with information and campus resources to help begin their college experience. Orientation welcomes new students to campus and establishes new connections with the Lourdes community” (Mission and Outcomes, 2012, para. 1). Lourdes learning outcomes require students who complete orientation to demonstrate their ability to navigate the campus, identify learning opportunities outside the classroom, identify campus services, and outline campus rules and expectations (Mission and Outcomes, 2012).

Parent and Family Programs Parents contribute to University of Minnesota students’ success by supporting the university’s goals for student learning and development outcomes. The university asks parents to:

● Challenge their student to identify, define, and solve problems independently.

● Have their student set and achieve personal goals and make responsible decisions in relation to academics, career planning, social interactions, and community engagement.

● Understand and support the university’s commitment to academic excellence and integrity, ethical behavior, diversity, and civility.

● Empower their student to examine personal values.

● Encourage their son or daughter to learn about and respect the values and beliefs of others.

● Allow their student to accept the consequences of his or her actions and accept responsibility for personal errors. (Desired Outcomes for Parent/Family Involvement, 2012)

The University of the Pacific in California has detailed learning outcomes and assessment measures for each new student and family program. The result is a readable chart that outlines outcome progress, from the stated objective to how the outcome will be assessed to the results of the evidence and what that means to the program director (Student Learning Outcomes for New Student and Family Programs, n.d.).

Registrar Programs and Services Union College in New York recognizes that all student services areas can create program objectives. The objectives for the Office of the Registrar include: (1) provide accurate transcripts to current and former students in a timely manner; (2) import student schedules and courses requiring final exam scheduling and arrange them to produce a conflict-free schedule, with the fewest exams in one day for each student; (3) process change of major/minor forms promptly; (4) meet with seniors to ensure that they will complete graduation requirements; (5) register students for classes in a timely manner; and (6) collect grades from faculty and report them to students (Assessment: Registrar’s Objectives, 2011).

Service Learning Programs The Volunteer and Service Learning Center at Boston College wanted to determine whether students were learning and developing as mentors in their roles as Big Brothers and Big Sisters. The center conducted personal interviews pertaining to learning outcomes, student involvement, and program operations. The interviews revealed that although participants were enthusiastic about the program, many had difficulty articulating the value of the mentoring relationship. The study authors recommended the creation of more opportunities for students to recognize their own personal growth and articulate it through structured reflection and training (Volunteer and Service Learning Center, 2012).


Student Conduct Programs Learning outcomes for the student conduct program at Lafayette College in Pennsylvania are simple and measurable: (1) Students will know that policies and expectations related to student behavior are explained in the Student Handbook and where the Student Handbook is located; (2) students will have a basic understanding of their rights and responsibilities as members of the Lafayette community; and (3) students who meet with staff members regarding violations will be able to articulate how their decisions may affect the attainment of their personal and academic goals (Division of Campus Life, 2012).

Student Life and Leadership Programs Students at Eastern Michigan University can participate in LeaderShape, which helps participants achieve four primary outcomes: (1) increase their commitment to acting consistently with core ethical values, personal values, and convictions; (2) increase their capability to develop and enrich relationships as well as to increase their commitment to respecting the dignity and contribution of all people; (3) embrace the belief in a healthy disregard for the impossible; and (4) develop the capability to produce extraordinary results. In addition, student participants learn to work in high-performance teams; practice decision making for ethical dilemmas; learn to deal with change; clarify personal values and standards; and understand and respect the values of others (LeaderShape, 2012).

Women Student Programs Learner outcomes set the stage for a three-tiered assessment of the effectiveness of bystander intervention education at Boston College. Students complete a pretest before attending a 1-hour presentation; complete a posttest after the presentation; and are surveyed again 3 months later to measure whether their behavior has changed. The curriculum was modified on the basis of student feedback and approved for full, strategic implementation (Bystander Intervention Education Assessment, n.d.).

*Excerpt From: Culp, Marguerite McGann and Dungy, Gwendolyn Jordan. Building a Culture of Evidence in Student Affairs: A Guide for Leaders and Practitioners.


Competency-Based Education

Traditionally, educational delivery in North American postsecondary institutions has relied primarily on the credit-hour (seat-time) model both for awarding college credit and for assessing progress through an academic program. In recent years, however, the competency-based education (CBE) model has emerged as an attractive alternative to the established paradigm. As a distinct model of undergraduate and graduate education, CBE departs from the credit-hour system in two significant ways. First, instead of awarding credit hours for the completion of requisite seat time and passing course grades, academic programs award competencies to students who achieve them by demonstrating skill, ability, or knowledge in relevant disciplines or fields. Second, CBE necessitates a different approach to the assessment of learning. As the reader will recall, learning outcomes assessment in the traditional postsecondary setting establishes course-, program-, and institution-level goals for students in terms of knowledge, skill, and ability; identifies the extent to which students achieve those goals; and makes improvements based on the resulting evidence. CBE, on the other hand, requires a detailed assessment of whether students achieve each competency in the overall suite of competencies that comprise the program or learning series. Whereas traditional outcomes assessment asks about student performance against a learning goal and may express that performance with various metrics or assessment narratives, CBE assessment is a binary report: did the student demonstrate achievement of the competency, or not? Typically, CBE coaches or instructors intervene strategically, providing the scaffolding and support students need to move through the increasingly complex and rigorous competencies of their program, based on assessment of competency achievement. A minimal sketch of this binary, per-competency record appears below.
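The following sketch is illustrative only; the competency statements and student record are invented and do not represent an SEU program or information system.

```python
# Invented competencies and student record -- for illustration only.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class CompetencyRecord:
    student_id: str
    # CBE reporting is binary per competency: demonstrated or not yet demonstrated.
    competencies: Dict[str, bool] = field(default_factory=dict)

    def mark(self, competency: str, demonstrated: bool) -> None:
        self.competencies[competency] = demonstrated

    def not_yet_demonstrated(self) -> List[str]:
        """Competencies still outstanding -- where a coach or instructor would intervene."""
        return [name for name, done in self.competencies.items() if not done]

record = CompetencyRecord("S0001", {
    "Summarize institutional data with descriptive statistics": True,
    "Design a survey instrument": False,
    "Interpret assessment results for stakeholders": False,
})
print("Still to demonstrate:", record.not_yet_demonstrated())

# A traditional outcome, by contrast, might be reported as a continuous metric,
# e.g., "78% of students scored 3 or higher on the capstone rubric."
```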

According to the Competency-Based Education Network (C-BEN), “Competency-based education combines an intentional and transparent approach to curricular design with an academic model in which the time it takes to demonstrate competencies varies and the expectations about learning are held constant. Students acquire and demonstrate their knowledge and skills by engaging in learning exercises, activities and experiences that align with clearly defined programmatic outcomes. Students receive proactive guidance and support from faculty and staff. Learners earn credentials by demonstrating mastery through multiple forms of assessment, often at a personalized pace” (http://www.cbenetwork.org/competency-based-education/). At the time of this edition’s publication, CBE has yet to be implemented as a full programmatic educational delivery model at Southeastern University, but it is anticipated that a number of academic units will find it to be a useful strategy for recruiting, retaining, and graduating more students in the near future. See the C-BEN’s Quality Framework for Competency-Based Education Programs.

(JR)


Assessment Calendar

The Office of Institutional Effectiveness maintains an up-to-date Google Calendar, which delineates all planned institutional and departmental surveys, course evaluations, and other measures. Please consider the timing of your survey or assessment when planning, as overlapping measures may lead to survey fatigue among respondent populations. The calendar can be accessed by adding Institutional Effectiveness or [email protected] to your Google Calendar under “Add a coworker’s calendar”.

August

➢ Initiate Program Learning Outcome (PLO) process (Annual)

➢ Initiate Master Plan of Advance (MPA) process (5 year cycle)

➢ Launch Summer Course Evaluations: Session B

➢ Collect, analyze, and disseminate Summer Course Evaluation results

September

➢ Prepare General Education assessments (Campus Labs Planning)

➢ Prepare assessments for various departments (i.e. BCMT, ACE, OAAS, etc.)

➢ Launch Alumni Survey

➢ Prepare Noel-Levitz Satisfaction Inventory – SSI (Odd years, Fall)

➢ Prepare National Survey of Student Engagement – NSSE (Even years, Spring)

➢ Register for CCCU Coordinated Assessment Project (CAP)

➢ Register and Prepare for Thriving Quotient (as needed)

➢ Academic Assessment Committee meeting

October

➢ Prepare Noel-Levitz Satisfaction Inventory (Odd years)

➢ Prepare National Survey of Student Engagement – NSSE (Even years, Spring)

➢ Administer Thriving Quotient (as needed)

➢ Launch Course Evaluations: Session A Evening/Online/Graduate

➢ Academic Assessment Committee meeting

November

➢ Hold Freshman Focus Group

➢ Launch Fall Graduating Student Survey

➢ Launch Course Evaluations: Session B & Traditional

➢ Academic Assessment Committee meeting

December

➢ Collect, analyze, and disseminate Graduating Student Survey results


➢ Collect, analyze, and disseminate Alumni Survey results

➢ Collect, analyze, and disseminate Course Evaluation results

January

➢ Prepare General Education assessments (Campus Labs Planning)

➢ Prepare assessments for various departments (i.e. BCMT, ACE, OAAS, etc.)

➢ Administer National Survey of Student Engagement – NSSE

➢ Academic Assessment Committee meeting

➢ APR-related surveys and focus groups

February

➢ Administer National Survey of Student Engagement – NSSE

➢ Academic Assessment Committee meeting

March

➢ Administer National Survey of Student Engagement – NSSE

➢ Launch Course Evaluations: Session A Evening/Weekend/Online/Graduate

➢ Hold Freshman Focus Group

➢ Academic Assessment Committee meeting

➢ University Assessment Committee Meeting

April

➢ Launch Spring Graduating Student Survey

➢ Launch Course Evaluations: Session B & Traditional

➢ Academic Assessment Committee meeting

May

➢ Collect, analyze, and disseminate Graduating Student Survey results

➢ Collect, analyze, and disseminate Course Evaluation results

➢ Collect and analyze MPA & PLO Reports

June/July

➢ Return analysis to academic/administrative departments

➢ Launch Summer Course Evaluations: Session A


Conclusion

Though relatively short, this handbook provides the comprehensive guidelines, specific examples, bibliographical information, and web-based resources needed to create and sustain a state-of-the-art system of institutional effectiveness and assessment at Southeastern University for many years to come. The content and principles articulated here create a framework that welcomes creativity and innovation, while maintaining the foundational principles and procedures associated with best practices in higher education assessment.

Significant amounts of data are available internally through the Jenzabar EX system, Common Data Set, Fact Book, and Power BI Reports and Dashboards. Also available are the IPEDS reports and other reports and surveys submitted regularly by the Office of Institutional Effectiveness (see Appendix Q). Accreditation compliance, best practices in higher education, Southeastern’s mission, and student success, as well as practical issues involving student recruiting, budgets, physical facilities, and institutional development, must all factor into plans for a promising future.

The information provided here, if utilized, should lead to improved educational quality, improved student services, better outcomes for graduates, a stronger campus experience, and an enhanced ability to market the university and its programs.

(AP/AM)


References

A National Consortium for Designing, Developing and Scaling New Models for Student Learning. (n.d.). Retrieved from http://www.cbenetwork.org/competency-based-education/

Allen, M. J. (2001). Assessing Academic Programs in Higher Education. Bolton, MA: Anker.

Allen, M. J. & Noel, E. (2002). Outcomes Assessment Handbook. California State University Bakersfield.

American Association for Higher Education. (1991). Principles of Good Practice for Assessing Student Learning. Sterling, VA: Stylus.

Angelo, T. A., & Cross, K. P. (1993). Classroom Assessment Techniques: A Handbook for College Teachers (2nd ed.). San Francisco: Jossey-Bass.

Banta, T. W. (2010). Building a Scholarship of Assessment. San Francisco: Jossey-Bass.

Banta, T. W. (2004). Portfolio Assessment: Uses, Cases, Scoring, and Impact. San Francisco: Jossey-Bass.

Belanoff, P., & Dickson, M. (1991). Portfolios: Process and Product. Portsmouth, NH: Boynton/Cook.

Bloom, B. S., Engelhart, M.D., Furst, E. J., Hill, W. H. & Krathwohl, D. R. (1956). The Taxonomy of Educational Objectives, the Classification of Educational Goals, Handbook I: Cognitive domain. New York: David McKay.

Bresciani, M. J. (2007). Assessing Student Learning in General Education: Good Practice Case Studies. Bolton: Anker.

Bridgewater State College (2000). Assessment Guidebook. Bridgewater, MA.

California State University (1999). PACT Outcomes Assessment Handbook. Bakersfield, CA.

Camus, A. (1955). The Myth of Sisyphus: And Other Essays (O’Brien, J., Trans.). New York: Vintage Books. (Original work published 1942).

Commission on Colleges of the Southern Association of Colleges and Schools (2007). Resource Manual for the Principles of Accreditation: Foundations for Quality Enhancement. Decatur, Georgia.

Culp, M. M., & Dungy, G. J. (2012). Building a Culture of Evidence in Student Affairs: A Guide for Leaders and Practitioners. Washington, DC: NASPA-Student Affairs Administrators in Higher Education.

Crocker, L., & Algina, J. (1986). Introduction to Classical and Modern Test Theory. New York: Harcourt Brace & Jovanovich.

Huba, M. E., & Freed, J. E. (2000). Learner-Centered Assessment on College Campuses: Shifting the focus from teaching to learning. Boston: Allyn and Bacon.

Keeling, R. P., Wall, A. F., Underhile, R., & Dungy, G. J. (2008). Assessment Reconsidered: Institutional Effectiveness for Student Success. United States: ICSSIA.

Kemmis, S., & McTaggart, R. (2000). Participatory Action Research. In N. Denzin & Y. Lincoln (Eds.), Handbook of Qualitative Research (2nd ed.) (pp. 567-605). Beverly Hills, CA: Sage.

Kuh, G. D. (2010). Student success in college: Creating conditions that matter. Hoboken: Wiley.


Linn, R. L. & Miller, M. D. (2005). Measurement and Assessment in Teaching (9th Ed.). Upper Saddle River, NJ: Pearson.

Maki, P. L. (2004). Assessing for Learning: Building a Sustainable Commitment across the Institution (1st ed.). Sterling, VA: Stylus.

Nichols, J. O., Nichols, K. W., et al. (2005). A Road Map for Improvement of Student Learning and Support Services through Assessment. New York: Agathon.

Palomba, C.A., & Banta, T.W. (1999). Assessment Essentials: Planning, Implementing, and Improving Assessment in Higher Education. San Francisco: Jossey-Bass.

Rossi, P. H., Lipsey, M. W., & Freeman, H. E. (2003). Evaluation: A Systematic Approach (7th ed.). Newbury Park, CA: Sage.

Stevens, D.D. & Levi, A.J. (2004). Introduction to Rubrics: An Assessment Tool to Save Grading Time, Convey Effective Feedback and Promote Student Learning. Sterling, VA: Stylus.

Suskie, L. (2004). Assessing Student Learning: A Common Sense Guide. Bolton, MA: Anker.

University of Central Florida. (2005). University of Central Florida Assessment Handbook. Orlando, FL.

University of Massachusetts Amherst. (2001). Program-Based Review and Assessment: Tools and Techniques for Program Improvement. Amherst, MA.

University of Virginia. Assessment Guide: Seven Steps to Developing and Implementing a Student Learning Outcomes Assessment Plan. Charlottesville, VA.


Appendices

Appendix A: Glossary 32

Appendix B: The Assessment Process (Stanford University) 35

Appendix C: University Mission, Vision, & Strategic Plan 37

Appendix D: Completed MPA 39

Appendix E: Completed PLO 42

Appendix F: Checklist for Feedback on Administrative Assessment Plans and Reports 44

Appendix G: Checklist for Feedback on Academic Assessment Plans and Reports 45

Appendix H: Curriculum Map 46

Appendix I: Foundational Document Example 49

Appendix J: Campus Labs Tutorials 52

Appendix K: Academic Program Assessment (American University: Cairo) 61

Appendix L: Creating Learning Outcomes (Stanford University) 68

Appendix M: Planning for Dissemination and Use (Stanford University) 73

Appendix N: Academic Program Review Schedule 74

Appendix O: Administrative Unit Assessment (American University: Cairo) 75

Appendix S: Useful Assessment Websites 80

Appendix T: IPEDS Peer Data 81


Appendix A: Glossary

Assessment: A continuous process of gathering, evaluating, and communicating information to improve learning and institutional effectiveness.

Assessment of Student Learning: The third element of a four-part cycle: developing articulated student learning outcomes, offering students opportunities to achieve those outcomes, assessing achievement of those outcomes, and using the results of those assessments to improve teaching and learning and inform planning and resource allocation decisions.

Baseline Data: A current measure of achievement levels or administrative success, quantitative or qualitative, with regard to a particular variable such as average scores on standardized tests, number of students who use the library, or number of crimes committed on campus. All departments should establish baseline data in connection with outcomes in order to show improvement over time.

Benchmark: A standard of comparison (regional, national, or consortial) against which performance can be measured or assessed.

Classroom Assessment Techniques (CATs): Assessment tools that faculty members can use to gather timely feedback about a single lecture or discussion. Examples include the Minute Paper, the One Sentence Summary, and Direct Paraphrasing.

Criteria: An accepted standard, measure, or expectation used in evaluation or decision making.

Critical Success Factors (CSFs): Key areas of activity where positive results are necessary for the organization to achieve its goals.

Curriculum Mapping: The process of identifying a program learning outcome and its component parts, identifying the specific courses where the learning outcome is addressed and its level of coverage (i.e. introduce, reinforce, emphasize, and advanced: I, R, E, A), and plotting these data on a grid for each major. A similar exercise could compare general education outcomes against discipline-specific courses.

Direct Measures: Measures which are directly tied to performance. In assessing student learning using direct measures, students’ work or performance provides information directly linked to students’ attainment of knowledge or skills. Direct measures are more reliable indicators of student learning than indirect measures. Examples include classroom and homework assignments, examinations and quizzes, capstone projects, research papers, student portfolios, retention and graduation statistics, and artistic performances. Direct assessments of administrative functions could include tracking numbers of clients served, average response time for inquiries, personnel per student ratios, or budgetary outlays per student.

E-Portfolio: A portfolio that is maintained online, containing student work in a digital format.

Freshman Cohort: Technically, the group of new freshmen entering in the traditional fall semester. These students have no, or very few, prior college credits. The cohort is tracked to produce retention and graduation rates in coming years.

Goal: What the organization wants to achieve; desired outcomes for the organization or program, rather than actions. Goals are related to the institution or department’s mission and vision. Synonymous with outcome.

Graduation Rate: The percentage of students in the freshman cohort who graduate from the same institution within six years (12 semesters) of their original matriculation. Graduation rates are often calculated for particular demographics (gender, race) and other variables (GPA, first-generation).
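For illustration only, the short Python sketch below shows how a six-year graduation rate could be computed for a fall cohort; the records, field names, and values are hypothetical and are not drawn from any SEU data system.

    # Illustrative sketch: cohort records and field names are hypothetical.
    cohort = [
        {"id": "A001", "graduated": True, "years_to_degree": 4},
        {"id": "A002", "graduated": True, "years_to_degree": 7},   # graduated, but outside the six-year window
        {"id": "A003", "graduated": False, "years_to_degree": None},
        {"id": "A004", "graduated": True, "years_to_degree": 5},
    ]

    # Count cohort members who earned a degree from the institution within six years (12 semesters).
    grads_within_window = sum(
        1 for s in cohort if s["graduated"] and s["years_to_degree"] <= 6
    )

    graduation_rate = grads_within_window / len(cohort) * 100
    print(f"Six-year graduation rate: {graduation_rate:.1f}%")  # 50.0%

The same calculation can be repeated for particular demographics by filtering the cohort before counting.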


Indirect Measures: Measures which are not directly tied to performance; often subjective impressions of educational quality and services. Indirect measures rely on perception and are less meaningful for assessment than direct measures. They are, however, helpful to corroborate the results of direct measures. Examples include exit surveys, student opinion surveys, alumni surveys, grades not based on scoring guidelines, career development over time, and student activities.

Institutional Effectiveness: The extent to which an institution has a clearly defined mission and institutional outcomes, measures progress towards achieving those outcomes, and engages in continuous efforts to improve programs and services.

Key Performance Indicators (KPIs): Quantifiable goals that measure performance. These goals should be well-defined, critical to an organization’s success, and reflect the organization’s mission and goals. KPIs are usually measured against benchmarks.

Learning Outcomes: The knowledge, skills, abilities, values, and attitudes that students gain from a learning experience.

Longitudinal Data Studies: Assessment results and averages computed over several years. These are important because they can show improvement over time (value added).

Mission: The purpose of an organization or program; its reason for existing. Mission statements provide the strategic vision or direction of the organization or program and should be simple, easily understood, and widely communicated.

Objective: The tasks to be completed in order to achieve a goal/outcome. Objectives are specific and measurable and must be accomplished within a specified time period.

Outcomes: Synonymous with goals. Outcomes are tied to the mission and are something that the organization, department, program, or unit wants to achieve. Outcomes should be specific, measurable, use action verbs, and focus on the ends rather than the means.

Peer Review: The thorough, independent review and evaluation of academic programs, policies, processes, procedures, curriculum, assessment plan, and outcomes. Generally, peer review is conducted by a team of peers from a similar institution and academic discipline. Regional accreditation reaffirmation provides a forum for peer review. Discipline-specific accreditations provide a more rigorous form of peer review. Peer review for all academic programs should be conducted periodically.

Portfolio: An accumulation of evidence about individual achievement or progress towards goals. Student portfolios used for assessment purposes may include but are not limited to projects, journals, research papers, creative writing, presentations, and video or recordings of speeches and performances.

Program Review: Periodic self-studies in which departments are asked to present their mission statements; resources, including the number of faculty, faculty qualifications and productivity, teaching load, curriculum, and technology; learning outcomes and assessment measures; the ways in which departments have shared assessment results and used those results to inform departmental decision-making; and plans for improving learning.

Qualitative Data: Data that cannot be measured or expressed in numerical terms and relates to or is based on the quality or character of something. Qualitative data describes or characterizes something using words rather than numbers. Examples of qualitative data include surveys, focus groups, and feedback from external reviewers.


Quantitative Data: Data that is capable of being measured or expressed in numerical terms. Examples of quantitative data include test scores, grades, certification exam results, graduation and retention rates, number of clients served, and budget amounts.

Recommendations: Based on assessment results, recommendations for quality enhancement, structural changes, improved processes, budget requests, and strategic planning are created. Recommendations should heavily inform outcomes for the upcoming cycle.

Retention Rate: Technically, the percentage of students from the preceding fall’s freshman cohort who are retained (re-enrolled) the following fall.
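As a purely illustrative sketch (the student IDs below are hypothetical, not actual enrollment data), the fall-to-fall retention rate can be computed by checking how many members of the preceding fall's freshman cohort appear in the current fall's enrollment:

    # Illustrative sketch: student IDs are hypothetical.
    fall_2017_freshman_cohort = {"A001", "A002", "A003", "A004", "A005"}
    fall_2018_enrollment = {"A001", "A003", "A004", "B101", "B102"}

    retained = fall_2017_freshman_cohort & fall_2018_enrollment
    retention_rate = len(retained) / len(fall_2017_freshman_cohort) * 100
    print(f"Fall-to-fall retention rate: {retention_rate:.0f}%")  # 60%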

Rubric: A criteria-based scoring guideline that can be used to evaluate performance. Rubrics indicate the qualities the judge/reviewer will look for in differentiating levels of performance and assessing achievement.

Stakeholders: The individuals and groups a unit serves, such as students, faculty, staff, administrators, parents, alumni, and others.

Unit Foundational Document: The mission statement at the department level. (See Appendix I for an example.) The unit mission should include the Mission Statement, Customers, Critical Processes, Values, and Vision. Once developed, this can be reviewed periodically, but revised only when changes are needed.

Some material adapted from American University in Cairo, Assessment: A Guide to Developing and Implementing Effective Outcomes Assessment for Academic Support and Administrative Units, 2007.

(AP/AM)


Appendix B: The Assessment Process (Stanford University)

The Assessment Process

Assessment is an ongoing, iterative process that uses results to inform decisions and make improvements. In order to improve, careful planning is necessary. Learning goals and outcomes must be clearly specified, appropriate measures must be selected, data collection must be carefully executed, and, most importantly, results must be shared for improvements to occur. The figure below illustrates a cycle of interlinked activities that facilitate continuous improvement. (Figure adapted from Maki, 2004)

Seven Steps to Closing the Loop

Step 1: Creating an Infrastructure for Assessment: Organizing an Assessment Committee

Before beginning, it is important to set up the appropriate infrastructure for assessment in order to ensure that the process is self-sustaining. We strongly suggest that programs establish a committee with a rotating chair who will lead the process. This will alleviate faculty workload as well as provide quality assurance for planning and dissemination. The assessment process is more likely to be self-sustaining if faculty collectively agree on what is important, buy into assessment procedures, and decide as a group what the data mean and how to improve.


Step 2: Defining the Mission of the Program

Each program should formulate a mission statement that will constitute a broad statement of its goals, values, and aspirations.

Step 3: Defining the Program Learning Outcomes

Each program should formulate at least three learning outcomes that describe the specific abilities, knowledge, values, and attitudes it wants students to acquire as a result of the program.

Step 4: Selecting assessment methods and identifying targets

Programs may use several different methods to measure student learning outcomes and must include direct measures of learning for each learning outcome. This is a WASC requirement. They should also identify expected levels of performance for each outcome.

Step 5: Collecting the Data

It is important to determine how the data collection will be implemented (i.e., who will collect the data, where it will be collected, and who will be sampled). All data should be reported in the form of group data to ensure the privacy of those who are assessed.

Step 6: Analyzing the Results

It is important to summarize and report the data in a meaningful way to communicate findings to program faculty. One person in every program should be in charge of writing up the report.

Step 7: Closing the Loop

No matter how results turn out, they are worthless unless they are used. The results of assessment data should be disseminated to faculty in the program as well as faculty outside of the program to obtain their ideas about how to improve the program. WASC is particularly interested in seeing documentation for this step. In some cases changes will be minor and easy to implement, while in other cases they will be more difficult and will have to be implemented over multiple years.


Appendix C: University Mission, Vision, & Strategic Plan

OUR MISSION

Equipping students to discover and develop their divine design to serve Christ and the world through Spirit-empowered life, learning, and leadership.

OUR CORE VALUES

1. A university absolutely committed to Christ-like formation.
2. A university of educational breadth and depth.
3. A university of faculty distinction.
4. A university that thinks globally.
5. A university committed to serving human need in our community and in communities around the world.
6. A university characterized as a community of grace.
7. A university that is student-focused at all times.

OUR CULTURE

Every organization has a culture that defines it. Southeastern University is no different. Our values, our mission, and our strategies flow into practices and decisions that shape the kind of spirit and ethos that people who work here and attend here emulate. We believe that culture creation is so important that as much time and thought goes into designing the Southeastern culture as an architect would put into designing an award-winning building.

We feel there are many elements that will define our culture but that the most important of all is our people. More than any other element, our people will determine our effectiveness at achieving our mission. We feel these elements define the type of people we aspire to be at SEU:

1. Christ-centered - Our culture will be defined first and foremost as people who live out the Jesus Way – loving God and others.

2. Collaborative - The main thing in our culture is getting the job done. Our culture will encourage people to collaborate freely, not be hierarchical, sharing information and decisions, and encouraging teamwork.

3. Embraces Change - Private Higher Education today assumes a great education. The great universities rise above the rest through innovating. Innovation means taking risks and embracing change – lots of change.

4. Bias to Perform - The contribution of high performers to low performers in an organization is typically 8 to 1. High performers do 8 times the work of low performers. We want high performers in every position. Our mission is too important to settle for anything less.

5. Makes Decisions - As organizations grow they become more complex and chaotic. The traditional solution to complexity and chaos is process. Responsible people thrive on freedom and responsibility - they like making decisions. While healthy systems and process are needed, too much process drives these high performing people away. We will constantly manage the tension between maintaining healthy levels of process and increasing employee freedom and responsibility as we grow so that we can continue to keep and attract high performing people.

6. Student Focused - We will never settle with satisfying our students – satisfaction is the given. They must become raving fans for life.

7. Courageous - Our people will say what they think even if it is controversial, make tough decisions without agonizing, take smart risks, and question actions inconsistent with our values. Unhealthy people create drama, courageous people prevent drama.


4.2 Identify, recruit, and enroll underrepresented student populations.

4.2.1 Recruit cross-department team to create, execute and evaluate an effective First Generation strategy.
4.2.2 Recruit cross-department team to create, execute and evaluate an effective student success strategy for ecumenically diverse populations.
4.2.3 Recruit cross-department team to create, execute and evaluate an effective student success strategy for minority populations.
4.2.4 Recruit cross-department team to create, execute and evaluate an effective student success strategy for adult learners.
4.2.5 Recruit cross-department team to create, execute and evaluate an effective student success strategy for online learners.

4.3 Increase conversion of dual enrolled students to full-time FTIC students to 20%.

4.3.1 Hire a dedicated admissions counselor for dual enrolled students.
4.3.2 Targeted marketing and recruitment efforts to dual enrolled students.
4.3.3 Develop dual enrolled programs that enable students to seamlessly move from their dual enrolled program into a bachelor's program at SEU.

5. STRONG FINANCIAL BASE

5.1 Grow the university endowment.

5.1.1 Develop a young alumni program.
5.1.2 Create a donor cultivation plan.
5.1.3 Hire an experienced grant writer with a proven track record.
5.1.4 Increase annual unrestricted giving to $1.2 million per year.
5.1.5 Increase the endowment through gifts and growth to $15 million.

5.2 Achieve a Composite Financial Index of at least 2.0.

5.2.1 Create a budget contingency that grows to 2% of revenues within five years.
5.2.2 Invest at least 120% of depreciation expense each year in paying down debt principal and in capital spending.
5.2.3 Reduce long-term debt by 10% through planned, annual pay downs.
5.2.4 Build a year-end cash reserve of at least $12 million.

5.3 Create a culture of transparency backed by data-informed decision-making.

5.3.1 Develop an integrated planning and budgeting model.
5.3.2 In conjunction with 6.2.3, create a decision support team to research and provide relevant information for budgetary decisions.
5.3.3 Design and implement a predictable and equitable pay management system.
5.3.4 Create an academic program ROI.
5.3.5 Establish and track financial goals.

6. CULTURE OF HIGH PERFORMANCE & QUALITY

6.1 Hire and retain first-rate employees.

6.1.1 Improve the on-boarding process for new employees.
6.1.2 Develop a world-class internal professional development and training program.
6.1.3 Create 360-Evaluations for all administrative staff.
6.1.4 Research and Implement a New Human Resources Information System (HRIS).
6.1.5 Develop a Succession Plan and Workforce Mobility Plan.
6.1.6 Identify Key Performance Indicators on Organizational Culture & Employee Performance.
6.1.7 Conduct Salary Studies and Realign Compensation Strategies to Align with Industry Best Practice.

6.2 Create a culture of quality enhancement through data-informed decision-making.

6.2.1 Develop data analytics and metrics to track organizational performance and decision support.
6.2.2 Form a data governance committee.
6.2.3 Implement a Decision Support Team for objective, comprehensive decision analysis.
6.2.4 Expand the Business Intelligence platform to campus-wide use.

OUR FIVE YEAR PLAN

To create a university of curricular and co-curricular excellence with a faculty of distinction, diverse student population, strong financial base, and a culture of high performance and quality.

STRATEGIC GOALS

1. CURRICULAR EXCELLENCE. We will continue to create a curriculum of educational breadth and depth, with a global perspective.

2. CO-CURRICULAR EXCELLENCE. We will provide a co-curricular experience to allow students to develop academically, socially and spiritually.

3. FACULTY DISTINCTION. We will continue to recruit and retain high quality faculty.

4. DIVERSE & EXPANDING STUDENT POPULATION. We will recruit and retain a diverse and dynamic student population.

5. STRONG FINANCIAL BASE. We will propel the university into a place of financial strength.

6. CULTURE OF HIGH PERFORMANCE & QUALITY. We will create a culture of high performance and quality for university stakeholders.

7. UNRESTRICTED EDUCATION. We will develop and expand the university’s nontraditional academic programs to provide unrestricted access to students.


1. CURRICULAR EXCELLENCE

1.1 Enhance faculty and student engagement.

1.1.1 Develop a faculty/student undergraduate research initiative.
1.1.2 Improve the student-to-faculty ratio.
1.1.3 Create more interdisciplinary courses that align with teaching/learning interests and encourage co-teaching opportunities.
1.1.4 Expand professional learning communities, e.g., Enactus.
1.1.5 Reduce the number of online courses completed by traditional students.

1.2 Enhance the university's commitment to Liberal Arts.

1.2.1 Create a Foundational Core curriculum that meets AAC&U learning outcome benchmarks.
1.2.2 Develop a coherent course of study that produces an academic experience in alignment with SEU’s mission, provides for distinct learning experiences, and is tailored to each student's divine design.
1.2.3 Hire a full-time foundational core director for the traditional program.

1.3 Increase participation in experiential learning.

1.3.1 Facilitate student research opportunities at national & international schools & organizations of academic prestige.
1.3.2 Increase exposure to course offerings in and delivery systems for foreign languages relevant to academic programs.
1.3.3 Increase experiential learning opportunities in both global and local contexts.

1.4 Enhance support for academic excellence in all colleges and departments.

1.4.1 Increase funding for library resources.
1.4.2 Strengthen the Academic Program Review process.
1.4.3 Expand and support the Center for Student Success.
1.4.4 Pursue national accreditations in education (NCATE), music (NASM), and counseling (CACREP).
1.4.5 Launch first Ph.D. program.

2. CO-CURRICULAR EXCELLENCE

2.1 Improve FYE, helping students integrate into the community to promote thriving academically, socially, & spiritually.

2.1.1 Prepare faculty and staff to effectively engage with Generation Z/Next students through research and training.
2.1.2 Revamp the Christ, Culture, U. course as a first-year seminar with a strengths-based approach.
2.1.3 Revamp the student advising model with a strengths-based approach.
2.1.4 Increase first-year student participation in campus events to promote student integration.

2.2 Empower & support students with inclusive opportunities for life, learning, & leadership to promote community and retention.

2.2.1 Expand peer mentoring and spiritual leadership opportunities to develop Christian leaders.
2.2.2 Redesign the DSF experience to be more student-focused and student-led.
2.2.3 Develop more dedicated student spaces on campus.
2.2.4 Strategically create and encourage more student activities and organizations.
2.2.5 Create inclusive opportunities to promote student involvement.

2.3 Increase "student fit" and satisfaction through strategic use of facilities and services.

2.3.1 Optimize housing to increase student fit and occupancy rate.
2.3.2 Provide effective opportunities for commuters and extended student engagement.
2.3.3 Offer services and activities meeting the needs of today's student.
2.3.4 Create spaces across campus that optimize and encourage group activity and study.

2.4 Assist graduates' transition into the workforce or graduate school.

2.4.1 Implement a strengths-based career service model that connects personal awareness with career planning.
2.4.2 Increase business and ministry partnerships to provide internship and employment opportunities.
2.4.3 Increase mentoring and network opportunities by 200%.
2.4.4 Establish capstone courses within majors or senior experiences geared towards career and continuing education transitions.

2.5 Increase FTIC Retention Rate to 75% and first-year persistence rate for transfers to 75%.

2.5.1 Develop and implement a five-year Student Success Strategy following proven best practices.
2.5.2 Increase participation in experiential learning experiences (see 1.3).
2.5.3 100% of FT Student Development & Student Success staff attend one on-campus and one external first-year training intensive.
2.5.4 All academic departments create strategies and measures enabling them to meet the minimum 75% retention goal by 2022.

2.6 Increase graduation rate of both FTIC and transfer students to 45%.

2.6.1 See 2.5.1 & 2.5.2.
2.6.2 Increase students' fit in their chosen major and support in years 3 and 4.
2.6.3 All academic departments create strategies and measures enabling them to meet the minimum 45% graduation goal by 2022.

3. FACULTY OF DISTINCTION

3.1 Increase opportunities for faculty development & continuing education to increase scholarship & raise faculty profile.

3.1.1 Hire grant writer to assist with locating grant opportunities and writing/submitting grant proposals.
3.1.2 Increase faculty development funds to support faculty scholarship.
3.1.3 Increase funding for terminal degree attainment.
3.1.4 Increase funding for sabbatical opportunities to promote faculty scholarship.
3.1.5 Expand university-based faculty training and development services.
3.1.6 Implement standard, intensive training for new faculty that clearly elucidates roles and responsibilities.

3.2 Continue to provide a superior classroom experience.

3.2.1 Hire additional full-time faculty to achieve an appropriate student-to-faculty ratio.
3.2.2 Utilize graduate and teaching assistants to assist with course facilitation, where appropriate.
3.2.3 Research and add high impact educational practices to promote engaged learning.

3.3 Review and refine the faculty evaluation, promotion, and tenure processes.

3.3.1 Conduct a self study of the faculty evaluation, promotion, and tenure processes, in the spirit of the Academic Program Review protocols with external reviews and peer benchmarking.
3.3.2 Establish and implement a plan to address the recommendations from the self study in 3.3.1.

4. DIVERSE & EXPANDING STUDENT POPULATION

4.1 Integrate Admissions process for all student populations, lowering per student cost of recruitment and increasing student quality.

4.1.1 Move online marketing and recruitment in house.
4.1.2 Analyze and revise scholarshipping.
4.1.3 Create a more "customer friendly" virtual and physical Welcome Center.



• Launch first Ph.D. Program.

• Create student space to optimize group activity and study.

• Increase retention and graduation rates.

• Expand unrestricted education through innovative programs and deliveries.

• Create a virtual and physical Welcome Center.

• Develop a world class professional development and training program.

6.3 Develop and follow business processes in accordance with best practices.

6.3.1 Clearly define standard operating procedures.
6.3.2 Establish an Administrative Program Review process.
6.3.3 Develop information security protocol.

6.4 Leverage technology to create an effective, robust, and functional organization.

6.4.1 Create a CRM for Student Support.
6.4.2 Implement a campus-wide project management system.
6.4.3 Initiate a workflow task force to design a plan for workflow mapping and process automation in conjunction with 6.3.2.
6.4.4 Redevelop SFNET to become a centralized self-service portal for faculty and staff.

7. UNRESTRICTED EDUCATION

7.1 Transition the School of Extended Education to a fully functioning nontraditional academic division.

7.1.1 Hire a vice president for extended education.
7.1.2 Hire additional full-time faculty to achieve appropriate student-to-faculty and full-time-to-part-time ratios (see 3.2.1).
7.1.3 Transition curriculum development/maintenance and learning management system (and related training/support functions) from The Learning House to the School of Extended Education.
7.1.4 Review and refine the faculty evaluation, promotion, and tenure processes for extended education faculty (see 3.3).
7.1.5 Increase opportunities for faculty development and continuing education to increase scholarship and raise faculty profile in the academic sector (see 3.1).

7.2 Increase number and quality of extended education (online and site-based) degree programs.

7.2.1 Add 1-2 programs in design and communication studies by 2022.
7.2.2 Add 1-2 programs in behavioral sciences (e.g., psychology and human services) studies by 2022.
7.2.3 Add 1-2 programs in business and leadership studies by 2022.
7.2.4 Add 1-2 programs in ministry & theology studies by 2022.
7.2.5 Add 1-2 graduate programs under the purview of the School.
7.2.6 Adapt the Foundational Core to the extended education program and deliveries.
7.2.7 Research and add high impact educational practices to promote engaged learning (see 3.2.3).
7.2.8 Complete the Academic Program Review process for the School of Extended Education, including a review of degree programs and modalities (see 1.4.2).

7.3 Add competency-based education (CBE) degree-seeking and non-degree-seeking programs.

7.3.1 Add 1-2 CBE certificate programs each year.
7.3.2 Add 1-2 CBE bachelor programs by 2022 (e.g., Business Administration or Ministerial Leadership).
7.3.3 Add 1-2 CBE master's programs by 2022 (e.g., Organizational Leadership or Higher Education).

7.4 Increase number, type, and quality of extended locations.

7.4.1 Add 5 to 10 church-based sites or campuses each year.
7.4.2 Add 5 to 10 business or non-profit sites or campuses by 2022.
7.4.3 Conduct an annual review of existing sites/campuses to identify performance gaps and improvement plans.
7.4.4 Hire an academic director for each regional campus.

7.5 Offer executive education and other professional development programs.

7.5.1 Offer 1-2 executive education and professional development seminars (or similar experiences) each year.
7.5.2 Investigate and offer technology (or related field) "bootcamps".

7.6 Increase global learning, including international and domestic, opportunities and partnerships.

7.6.1 Add 2-3 experiential learning opportunities in both global (international and domestic) and local (City of Lakeland) contexts.
7.6.2 Add foreign language course offerings and delivery systems for traditional and extended education programs.

7.7 Implement a student success infrastructure for extended education students.

7.7.1 Transition student success and retention services from The Learning House to the School of Extended Education.
7.7.2 Implement a first-year, strengths development student success initiative for extended education students (see 2.1).
7.7.3 Develop and implement a five-year Student Success Strategy following proven best practices (see 2.5.1).


Appendix D: Completed MPA Update

Academic Center for Enrichment

1.0 Student Focused

Outcome Overall Progress: 15%

Outcome Status: On Track

Outcome Start: 07/01/2017

Outcome End: 06/30/2018

Assessments and/or Action Plans:

1.1 Strive for a Student Satisfaction rating of 80% or higher across all services. On-site tutoring services will be surveyed at point of service, and additional opportunities for providing feedback about general services will be developed and implemented in the 2017-2018 assessment cycle.
1.2 Offer services and activities that are responsive to the needs of today's student. Data collected through the center will be analyzed and used to identify developing trends and student requests at the end of each major term.
1.3 Implement the university's strategic plan to meet the academic and learning needs of students from underrepresented populations (first generation, minority, adult learners, online students, etc.).
1.4 Implement Strengths Based Advising.

2017-2018 Update:

1.1 Students reported that they strongly agree/agree 96%-98% across all measures of student satisfaction in the point of service survey. This survey is administered to all students that utilize tutoring services. The alternative point of service survey instruments were not implemented in the 2017-2018 year due to not having the kiosk system implemented. There are plans to obtain the technology in the 2018-2019 fiscal year, and the software for administering the survey has been integrated with center services.
1.2 Data is collected and reviewed on a daily, weekly, monthly, semesterly, and annual basis. Due to the increase in center visits in the Fall 17 semester, ACE secured 4 classrooms to be used for overflow space and workshops. Additionally, secondary reception staff was scheduled for Thursday evenings, the night with the highest volume of traffic. The Men's Soccer Study Hall program also adjusted its requirements to allow its student athletes to attend during any hours of operation, and this provided substantial relief to the general flow in the center. Spring reports indicate that center traffic will not be significantly decreasing, so strategies for securing additional space (classrooms, library) are being reviewed.
1.3 The university initiated the beginning stages of developing the strategic plan through its partnership with Credo. Currently, the university is in an assessment phase, and task forces begin meeting in the Fall of 2018.
1.4 While Strengths Based Advising has not yet been launched campus-wide, the center has already begun examining its pedagogical practices in light of Strengths Based Advising, Thriving Student, and Positive Psychology approaches for learning and academic coaching and counseling. Additionally, Dr. Laurie Schreiner, a leading expert in the field of Strengths Based Advising, will be presenting to the university in the fall.

Attached Files: Center Usage by Hour SP18 Color Coded.pdf, Types of Reports.docx, Monthly Report Calculations.xlsx, Monthly Reports 20180531T195549Z001.zip


Appendix E: Completed PLO

Learning Outcome Number: 1.0
Learning Outcome Title: Preparation for Further Study
Learning Outcome Description: Graduates in the MA in Theological Studies will demonstrate a developed knowledge of systematic, historical, biblical, ethical, and practical theologies, preparing for further study, theological reflection, articulation, and teaching.
Learning Outcome Start: 6/30/2017
Learning Outcome End: 6/29/2018
Responsible Department: M.A. Theological Studies
Overall Progress: On-Track

Assessments:

1.1 Grading rubric for Homiletics: Methods of Biblical Preaching and Teaching (PMIN 5213)

PMIN 5213 Historical Overall Score Averages for all offerings:

Criterion FY 14-15 FY 15-16 FY 16-17 FY 17-18

Content 4.6 4.09 5 5

Relevance 4.45 3.81 4.67 4.61

Reasoning 4.8 3.97 4.5 4.33

Illustration 4.58 3.99 4.33 4.49

Delivery 4.73 4.03 4.6 4.62

Authenticity 5 4.78 5 5

Overall Grade 4.69 4.11 4.68 4.68

FY 17-18 PMIN 5213 Online Offering Average Scores

Content Relevance Reasoning Illustration Delivery Authenticity Overall Grade

5 4.67 4.5 4.33 4.6 5 4.68

FY 17-18 PMIN 5213 Face to Face Offering Average Scores

Content Relevance Reasoning Illustration Delivery Authenticity Overall Grade

5 4.5 4 4.8 4.67 5 4.7


1.2 Grading rubric for Biblical Exposition and Faith Integration

BIBL 5223 Historical Overall Score Averages for all offerings:

Criterion FY 15-16 FY 16-17 FY 17-18

Intro 4.75 4.2 4.16

Historical Information 13.92 12.8 12.17

Literary Information 19 15.4 17

Pericope 26.58 27.4 24.5

Contemporary Application 8.75 9.6 8.33

Mechanics 8.33 8 7.5

Bibliography 8.58 6.8 7.33

Final Grade 89.92 84.2 81

FY 17-18 BIBL 5223 Online Offering Average Scores

Intro  Historical Info  Literary Info  Pericope  Contemporary Application  Mechanics  Bibliography  Final Grade

5  14  19  28  8  5  6  85

FY 17-18 BIBL 5223 Face to Face Average Scores

Intro  Historical Info  Literary Info  Pericope  Contemporary Application  Mechanics  Bibliography  Final Grade

4  11.8  16.6  23.8  8.4  8  7.6  80.2


Analysis of Assessment Data:

1.1.1 - MA(TS) students (n=3) completed PMIN 5213, with strong scores across all categories, indicating that students are successfully applying intended learning outcomes of the course.
1.1.2 - The lowest scoring category for MA(TS) students across all categories was Reasoning (4.33/5).
1.1.3 - The highest scoring categories for MA(TS) students across all categories were Content (5/5) and Relevance (5/5).
1.1.4 - Delivery methods of the course had similar scores across all categories, with the exception of the Reasoning and Illustration categories. The difference in the Reasoning category can be attributed to the fact that only one MA(TS) student completed the course in the face-to-face offering. A greater number of MA(TS) students enrolled in the course would lead to a more robust data analysis in this respect. The difference in Illustration may indicate that online students find it difficult to provide effective illustrations when preaching primarily to a camera without an audience.
1.2.1 - MA(TS) students (n=6) completed BIBL 5223. There is a negative trend in scores across delivery options in all categories except Bibliography and Literary Information.
1.2.2 - Lower scores may be the result of one student in the course receiving a 41/100 as a final score, with low scores across all categories of the rubric. When this outlier is removed, scores increase significantly and indicate a positive trend from the 16-17 academic year. The student who received the lowest grade is an ESL student.
1.2.3 - Scores in the Mechanics and Bibliography categories, historically the categories with the most room for improvement, were high in the 17-18 academic year when adjusted to take the outlier score into account. There was a discrepancy in the scores in these categories between Online and Face-to-Face students.
1.2.4 - Student scores decreased from the 16-17 academic year in the substantive categories of Historical Information, Literary Information, and Theological Pericope Analysis (even when adjusted for the outlier score).

Recommendation(s) for Improvement:

1.1.1 - Student scores are high, indicating that ILOs are being met for the course. Still, improving the lower scoring categories should become a priority. Special attention in course lessons should be given to the importance of reasoning in a sermon.
1.1.2 - Attention should be given to the Illustration category for online students who may not be preaching to an audience.
1.2.1 - Mechanics and Bibliography scores were lower for online students than face-to-face students. Online students should be directed to the new SmartThinking proofreading application and informed of their access to library resources (i.e. e-books, scanning of reference materials, etc.).
1.2.2 - In order to address lower scores in the substantive categories of Historical Information, Literary Information, and Theological Pericope Analysis, instructors may want to provide clear examples of what is expected of each section or require students to turn in portions of the final paper incrementally, so that feedback can be given to improve final scores on these sections. An incremental submission process was utilized in the 15-16 academic year for the face-to-face sections of the course. The efficacy of this process for increasing scores is evident in the fact that scores in the 15-16 academic year were significantly higher than scores in the following two academic years.

Action Classification(s): Modification of Pedagogical Strategies (Classroom), Modification of Curriculum Design (Departmental)

Priority of Recommendations: Moderately Important

Results of Action:

* Standardized rubric has worked well for tracking data historically in PMIN 5213 Homiletics

* Standardized rubric has worked well for tracking data historically in BIBL 5223 Biblical Exposition
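The adjustment described in item 1.2.2 above is simple arithmetic: recompute the average after excluding the single lowest (outlier) score and compare it with the raw average. The following is a minimal sketch; the scores are hypothetical placeholders, not the actual BIBL 5223 results.

```python
# Minimal sketch of the outlier adjustment described in 1.2.2.
# The scores below are hypothetical placeholders, not actual BIBL 5223 data.
scores = [88, 92, 85, 90, 86, 41]  # final scores for n=6 students; 41 is the outlier

def mean(values):
    return sum(values) / len(values)

raw_average = mean(scores)                   # includes the outlier
adjusted_average = mean(sorted(scores)[1:])  # drops the single lowest score

print(f"Average with outlier:    {raw_average:.1f}")       # 80.3
print(f"Average without outlier: {adjusted_average:.1f}")  # 88.2
```

When an adjusted figure is reported, it is good practice to state both averages so readers can see the size of the outlier's effect.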

Appendix F: Checklist for Feedback on Administrative Assessment Plans and Reports

Over the last fiscal year, each division of the institution has worked toward accomplishing its 4-year assessment and planning goals (MPAs). As a part of this process, each area must submit an annual update on overall outcome progress. The checklist below will be used as a measure to ensure all needed items are submitted. To access and edit your departmental MPA you must log in with your SEU credentials to seu.campuslabs.com/planning. Any questions or concerns regarding this process can be directed to Justin Rose via email ([email protected]) or phone (863-667-5386).

Outcome 1.0

☐ The action and/or assessment plans have been adequately updated (e.g. Detailed narrative of what accomplishments/drawbacks/observations have occurred over the past fiscal year. Narrative should include an overview of statistics or relevant examples of progress.)

☐ The action and/or assessment plans that are behind the established timeline have an adequate rationale and plans for correction.

☐ There is adequate supporting documentation added to the outcome (e.g. assessment results, reports, marketing material, etc.)

☐ The outcome's overall progress has been estimated (e.g. 20%).

☐ The outcome status has been marked as “On-Track”, “At Risk”, “Off-Track”, “Completed”, or “Canceled.”

Notes on Outcome 1.0: Click here to enter text.

Appendix G: Checklist for Feedback on Academic Assessment Plans and Reports

Over the last academic year, each program in the institution has worked toward accomplishing its Program Learning Outcomes (PLOs). As a part of this process, each area must submit an annual update on overall progress. The checklist below will be used as a measure to ensure all needed items are submitted. To access and edit your departmental PLO you must log in with your SEU credentials to seu.campuslabs.com/planning. Any questions or concerns regarding this process can be directed to Justin Rose via email ([email protected]) or phone (863-667-5386).

Outcome 1.0

☐ Learning Outcome Progress | The learning outcome progress has been marked as “Completed”.

☐ Longitudinal (Trend) Assessment Data | All assessment data from the 2012-13 and 2013-14 academic years have been listed under the corresponding assessments.

☐ Rubric/Exam Subscale Data | If rubrics (or exams with subscales) were used to assess the PLO, then rubric subscale averages should be reported.

☐ Analysis of Assessment Data | Strengths and weaknesses of student learning are identified.

☐ Recommendations for Improvement | Based on the data analysis, recommendations are stated on how student learning or assessment will be improved.

☐ Results of Actions | What was the impact of the 2012-13 recommendations for improvement?

Notes on Outcome 1.0: Click here to enter text.

Appendix H: Curriculum Map

An editable template is available for download; please click here or navigate to the following URL: https://drive.google.com/file/d/1KUnYYtjxtjy3p0U0HqxomfZAbMuOOhGA/view?usp=sharing

Appendix I: Foundational Document Example

Foundational Document: Institutional Effectiveness

The Mission: The Office of Institutional Effectiveness (IE) supports the mission of Southeastern University through assessment, planning, research, reporting, accreditation support, and the integration of technology to promote a culture of data-informed decision-making, accountability, and quality enhancement.

Motto/Tagline: Quality enhancement through data-informed decision-making.

Customers: The mission of the Office of Institutional Effectiveness supports the following groups:

● Board of Trustees

● University Leadership

● Faculty/Staff

● Students/Alumni

● External constituents such as SACS COC, CCCU, ICUF, the state and federal governments, the Alliance, and various private and public research organizations.

Critical Processes: The mission of the Office of Institutional Effectiveness (IE) is accomplished through the following critical processes:

1. Assessment & Planning – Implementing a comprehensive program of goal creation, evaluation of assessments, and the use of results for quality enhancement. Assisting faculty, staff and leadership in articulating learning and/or programmatic goals and identifying appropriate assessments and/or action plans. Working directly with all departments to ensure timely, appropriate data collection, analysis, reporting, and the integration of university planning and budgeting procedures. Analyzing and summarizing data for reports. Collecting and housing results and plans for improvement annually for all departments, disciplines, and general education. Maintaining and enforcing reporting deadlines. Leading the University Assessment Committee, which meets periodically in order to heighten awareness of and review assessment procedures. Building capacities for campus assessment leaders through periodic training and professional development.

2. Assessment Design & Administration – Assisting faculty, staff, and leadership with the creation and administration of in-house and published instruments for measuring attitudes, skills, knowledge, abilities, or trends, including the Noel-Levitz, NSSE, Graduating Student Survey, Alumni Survey, and other surveys. Preparing and administering course evaluations each semester and coordinating the distribution of faculty and administrator reports. Conducting focus groups and other qualitative measures with various constituents. Analyzing, interpreting, and reporting survey, course evaluation, and qualitative data. Creating action plans based on said data.

3. Academic Program Reviews - Leading the Academic Program Review (APR) initiative, a self-reflective process, wherein academic units will investigate a number of key performance indicators, including learning objectives, curriculum, teaching and learning methods, student learning assessments, and administrative processes and procedures.

4. Strategic Planning – Collecting, analyzing, and summarizing disaggregated institutional data to identify trends and thereby inform the strategic planning process. Collecting action plan and assessment data to ensure alignment with departmental assessment/planning processes.

Assisting departments with the cascading of cultural values and strategic goals, objectives, action plans, and criteria for success. Integrating various planning, budgeting, and performance management processes.

5. Research & Reporting – Collecting, analyzing, and reporting data that support decision-making processes, comply with reporting requirements of external agencies, and respond to ad hoc requests for information. Developing innovative methods to advance data collection, analysis, and reporting. Assisting with data governance philosophy and implementation.

6. Accreditation Support – Producing narrative and documentation in compliance with SACS COC guidelines. Assisting the Associate Provost with all accreditation activities, including extension site reviews, new program notifications, prospectus submissions, substantive change applications, quality enhancement processes, and reaffirmation of accreditation proceedings.

7. Technology – Innovating and integrating the use of technology for accreditation support, assessment/planning, and research/reporting processes. Implementing the Institutional Analytics and Data Warehouse system for the purposes of systematic and on-demand analysis and reporting. Maintaining the Campus Labs system for assessment, accreditation, and planning purposes, including Compliance Assist, Baseline, and Course Evaluations. Managing the university’s electronic portfolio tool. Collaborating on the use of Noble Hour for collecting and reporting student community service hours. Developing and maintaining a public website for dissemination of departmental resources.

8. Coordination of Effort – Consistently communicating with departments for coordination of efforts. Committee memberships: Academic Council, Academic Programs Committee, Academic Program Review Committee, Curriculum Committee, Data Governance Committee, Faculty Council, General Education Committee, QEP Committee, Strategic Planning Task Force, and the University Assessment Committee, as well as numerous Ad Hoc Committees.

Cultural Value: The Office of Institutional Effectiveness (IE) supports the institution with a mindset of collegiality, wherein all actions and decision-making processes are informed by the following cultural values:

1. Lead by Example – Staff consistently model actions that demonstrate institutional effectiveness.

2. Relationships – Staff create and maintain Christ-like, amicable, and supportive relationships with other departments as the office pursues its goals.

3. Collaboration – Staff seek to create a decision-making environment which models consultation and collective responsibility of all stakeholders from every level.

4. Constituent-Focus – Staff dedicate time and energy to ensure every question or concern is answered in a timely and complete manner.

5. Quality Enhancement – Staff create a climate of continual improvement, wherein assessment and data inform decision-making.

6. Data Integrity – Staff disseminate complete, accurate, unbiased data that are useful for evaluation and decision-making.

7. Transparency – Staff share information and processes via respectful and open communication.

8. Evidence-Based – Staff commit to the use of substantive data and evaluation to show whether the university’s programs, processes, and services are effective and contribute significantly to the institution’s ability to fulfill its mission.

9. Innovation – Staff research, implement, and evaluate new methods to fulfill the departmental mission.

10. Proactivity – Staff seek to consistently advance and positively influence the institution by developing circles of influence.

11. High Performance – Staff never accept the status quo and always perform beyond the call of duty.

Vision: Institutional Effectiveness seeks to engage itself in a process of quality enhancement through data-informed decision-making. Through continual evaluation, we seek to improve data integrity; through continual education and innovation, we seek to increase the speed with which we provide information; through continual communication, we seek to harmonize and coordinate the efforts of various departments toward the ultimate ends of the University’s Mission. In doing so, we seek to nurture a culture of quality enhancement throughout the University.

Appendix J: Campus Labs Tutorials

The following tutorials are for the Campus Labs – Planning Module.

Accessing Projects (MPAs, PLOs, Foundational Documents)

Step 1: Navigate to Campus Labs Planning site

The first step to accessing Campus Labs Planning is to open your web browser and type in the following web address: seu.campuslabs.com/planning

Step 2: Login

Next, log in to the site with the same username and password you use to access your other SEU accounts (do not include @seu.edu).

Step 3: My Dashboard

Once in the Planning site, you will see two to three icons on the upper left side of the page. By default, you will land on the Home icon called Dashboard. Click on the second one from the top, called Plans.

Creating new Administrative Outcomes for the MPA

If you have not already done so please follow steps 1-3 above.

Step 1: Select Appropriate Project:

Once in Plans, you will see several project options: administrative departments will see Foundational Document & Master Plan of Advance, academic departments will see Foundational Document & Academic Assessment Plan, and units with both academic and administrative assessment roles may see all three. Select the Master Plan of Advance. Also ensure that the Fiscal Year (FY) dropdown above Master Plan of Advance is set to the correct year.

Step 2: Select your Department/Program

Once the appropriate project and year are selected, select your department or program from the organizational unit hierarchy on the left side of the page.

Step 3: Create a New Administrative Outcome

On the right side of the page, you should see a dropdown field that says “+ Plan Item”. Click on that dropdown and select administrative outcome.

Step 4: Number and Title Outcome

In the first field type the number of the outcome in the format shown below. The outcome title is a short description of the outcome and the outcome description provides the details of the outcome. For information on writing administrative outcomes please see the previous section of this handbook. *In order to save any text entered in the Planning fields, you must click anywhere outside of the text field. A green check mark should appear when your entry is successfully saved.*

Step 5: Select Start and End Dates and Responsible Department

In this next set of boxes select the appropriate dates, 7/1/2017-6/30/2022, and the responsible department.

Step 6: Record Assessments and/or Action Plans

In this section, record planned assessments and/or action plans. You will see several other sections below this one but for the first part of this project you will only need to complete the MPA to this point. For a review of how to write strong outcomes please see the previous sections of this handbook. *In order to save any text entered in the Planning fields, you must click anywhere outside of the text field. A green check mark should appear when your entry is successfully saved.*

Creating or Editing Program Learning Outcomes

If you have not already done so, please follow steps 1-3 above.

Step 1: Select Appropriate Project:

Once in Plans, you will see several project options: administrative departments will see Foundational Document & Master Plan of Advance, academic departments will see Foundational Document & Academic Assessment Plan, and units with both academic and administrative assessment roles may see all three. Select the Academic Assessment Plan. Also ensure that the Fiscal Year (FY) dropdown above Academic Assessment Plan is set to the correct year.

Step 2: Select your Department/Program

Once the appropriate project is selected, you will see a hierarchy of the University in the middle of the screen; use this hierarchy to find and select your department/program.

Step 3: Create a New Program Learning Outcome

On the right side of the page, you should see a dropdown field that says “+ Plan Item”. Click on that dropdown and select program learning outcome.

Step 4: Number and Title Outcome

In the first field type the number of the outcome in the format shown below. The outcome title is a short description of the outcome and the outcome description provides the details of the outcome. For information on writing PLOs please see the previous section of this handbook. *In order to save any text entered in the Planning fields, you must click anywhere outside of the text field. A green check mark should appear when your entry is successfully saved.*

Step 5: Select Start and End Dates and Responsible Persons

In this next set of boxes select the appropriate dates and Responsible Department.

Step 6: List Assessments/Raw Results

In this section, list each of your assessment methods as outlined in the previous section of this handbook on creating and assessing learning outcomes. When the assessment cycle ends, input the raw results as an indented paragraph under the corresponding assessment (see PLO example in Appendix E). *In order to save any text entered in the Planning fields, you must click anywhere outside of the text field. A green check mark should appear when your entry is successfully saved.*

Step 7: Analysis of Assessment Data (END OF CYCLE)

Provide a detailed analysis of assessment data as outlined in the below image and PLO example in appendix E.

Step 8: Recommendation(s) for Improvement (END OF CYCLE)

Step 9: Action Classification

Use this section to select all applicable actions that will be taken as a part of the recommendations for improvement. You can also select the priority of your recommendations.

Step 10: Results of Action (END OF CYCLE)

Appendix K: Academic Program Assessment (American University: Cairo)

Eight Steps to Effective Outcomes Assessment

Step 1: Define the mission of your department or program

Your program’s mission serves as the foundation for assessment planning. The mission statement should describe the purpose of the program as well as reflect the mission of the university.

For academic departments, the mission should focus on educational values, areas of knowledge in the curriculum, and careers or future studies for which graduates are prepared. Ideally, it should be stated concisely, in a few sentences.

The following are examples of mission statements:

Example 1: Construction Engineering (AUC)

To provide a high quality engineering education within a liberal arts context to students from Egypt as well as from other countries. The aim is to produce generations of engineers who will be leaders in their profession and able to manage projects and construction organizations. The pursuit of excellence is central to the department's mission, maintaining high standards of academic achievement, professional behavior, and ethical conduct.

Example 2: Engineering Services (AUC)

The mission of Engineering Services at the American University in Cairo is to provide high quality training and service to the industrial community in Egypt and other countries.

Example 3: The Writing Center (AUC)

The Writing Center is committed to developing students’ communication abilities by providing services to enhance critical thinking, presentation, and writing skills for both graduates and undergraduates in all disciplines. As a function of this mission, we support the efforts of teaching and non-teaching faculty in all disciplines.

Step 2: Identify the most important outcomes of the department or program

Learning outcomes are the knowledge, skills, values, and attitudes that students gain from a learning experience. They address the following questions:

− What should students know and be able to do when they have finished their particular program at AUC?

− What knowledge, skills, or attitudes distinguish graduates from your program from other students?

− How do these outcomes tie in with the university’s mission and educational goals?

Answering these questions produces statements of learning outcomes or learning goals (the two phrases are used interchangeably). The list does not need to include all learning outcomes, only the most important; more than two and fewer than eight is ideal.

Learning outcomes need to be specific, clear, and measurable and ideally include knowledge that students acquire, skills that students demonstrate, and attitudes that students develop. Well-defined outcomes are often stated as: “Students will …” or “Upon graduation, students will…”

In addition, when developing outcomes:

− Focus on the ends, not the means -- what students will do after completing the course or program, what the desired “end state” should be.

− Use an “action” verb to describe in an observable way what students should be able to do.

− Try not to be too broad or too specific.

Finally, share outcomes with students and staff. Students learn more effectively when they are given clear goals to help them focus on what’s most important, understand how individual assignments or courses fit with the goals of the department, and how this course or program will help prepare them for life or careers after graduation. Program outcomes should be listed on the program’s website, and course outcomes should be listed on course syllabi.

Sample Departmental Outcomes:

Example 1: Business Administration (Bowling Green State University) (Student Achievement Assessment Committee (SAAC) 2007) Graduates will be able to:

✓ Demonstrate problem-solving, critical-thinking, oral and written communications, and team and leadership skills

✓ Apply business tools and concepts in domestic and global contexts

✓ Integrate foundational and functional business areas in making decisions

✓ Show commitment to ethical values and behavior, continuous learning, and professional growth

✓ Show understanding and appreciation for cultural, racial, and gender differences

Example 2: Computer Science (Bowling Green State University) ("Department and Program Learning Outcomes” 2007) Graduates will be able to:

✓ Program in a higher-level language

✓ Work effectively with a client and members of a software development team to analyze, specify, design, implement, test, and document software that meets the client's needs

✓ Acquire new computer-related skills independently as technologies evolve

✓ Communicate technical concepts to non-technical persons, both orally and in writing

✓ Develop a plan to integrate hardware and software into a particular environment

✓ Conduct themselves in an ethical and professional manner

Example 3: Biology (AUC)

The graduates of the Biology Department will be able to:

✓ Think critically, identify biological issues and formulate solutions to biological problems.

✓ Use computers and information technology effectively to address biological problems.

✓ Function effectively in a teamwork environment.

✓ Apply knowledge in basic mathematics, general chemistry, calculus-based physics, and statistics to solving biological problems.

✓ Use their knowledge and comprehension of basic biological principles, concepts, and theories.

✓ Evaluate and synthesize information and ideas from a variety of sources and formats.

✓ Competently collect, analyze, organize, evaluate, and present scientific data.

✓ Understand, analyze, and evaluate original research literature in support of current research projects.

✓ Compete effectively for entry level employment and/or placement in graduate or professional training facilities.

Step 3: Ensure that students have adequate opportunities to achieve these outcomes

A program’s curriculum needs to ensure that all students in the program have the opportunity to achieve these goals before they graduate. Program planners need to ask, “In what courses or experiences do students learn these skills or acquire this knowledge?”

A matrix can be a useful tool to map outcomes with the curriculum and learning experiences to ensure that all students are presented with adequate learning opportunities.

Step 4: Define how you will assess progress towards these outcomes

Assessments don’t have to be complicated and, when used well, can be a powerful tool for improvement, providing better information for planning, budgeting, changes in curriculum, new programs, staffing, and student support. Student learning assessment data helps us understand what our students are learning, where they might be having difficulty, and how we can change the way we teach and how we can shape our curriculum to help them learn better. Assessment is not an evaluation of individual students, faculty or courses.

Start by taking an inventory of the kinds of tools your department or program is already using.

Many departments and programs are already assessing student learning outcomes. These assessments might take the form of capstone courses, theses, papers, individual or group projects, performances, documentaries, presentations, student portfolios, alumni or employer surveys, student opinion surveys, focus groups, standardized tests, entry or exit tests or surveys, reports from internship supervisors, or other measures. Additionally, many offices on campus collect and analyze institutional data. These offices include IPART, Career Advising and Placement Services (CAPS), Alumni, Student Affairs, Graduate Studies, and others. This data can be analyzed to provide your program with information about your students and alumni.

Listed below are direct and indirect measures of student learning. Effective assessment plans must include a mix of direct and indirect methods of assessment.

Direct methods of evaluating student learning provide tangible evidence that a student has acquired a skill, demonstrates a quality, understands a concept, or holds a value tied to a specific outcome. They answer the question, “What did students learn as a result of this (assignment/project/exam…)?” and “How well did they learn?” Direct methods generally result in student “products” like term papers or performances.

Direct Methods of Assessing Student Learning:

❖ Capstone courses

❖ Review of senior projects by external evaluators (using scoring guidelines – see appendix 3)

❖ Licensure or certification exams

❖ Places in the curriculum where multiple faculty members examine student work, e.g. theses, video documentaries, art projects, research projects, etc. Scoring guidelines should be used – see appendix 3.

❖ Portfolios and e-portfolios, with material showing progression throughout major (See Appendix 5 for more information.)

❖ Homework assignments, examinations and quizzes, term papers and case studies

❖ Evaluations of student performance in internships, research projects, field work, or service learning.

❖ Classroom Assessment Techniques (CATs) (See Appendix 4)

❖ Standardized tests

❖ Videotape of oral presentations or performances

❖ Entry and exit exams

Indirect methods provide more intangible evidence, demonstrating characteristics associated with learning but only implying that learning has occurred. When a student answers a question correctly, there is direct evidence that he or she has learned. When a student says that he or she has an excellent understanding of the topic, there is indirect evidence. While both methods of assessing learning are valuable, indirect evidence is more meaningful when it is tied to direct evidence.

Indirect Methods of Assessing Student Learning:

❖ Retention and graduation statistics

❖ Job placement or graduate school acceptance

❖ Career development over time

❖ Student perception surveys

❖ Course evaluations, with questions added regarding learning

❖ Alumni surveys or focus groups

❖ Employer surveys or focus groups

❖ Student activities

❖ Teaching strategies that promote learning

❖ Course grades not based on scoring guidelines or not linked to clear learning goals.

❖ Number of student hours spent on homework

❖ Number of student hours spent on service learning

❖ Number of student hours spent on cultural or intellectual activities related to learning outcomes

❖ Entry and exit student surveys

At the course level, course learning outcomes should be listed on the syllabi, and the course should be structured so that there are multiple opportunities for students to achieve the course outcomes.

Aren’t Course Grades Enough? Assessment tries to link student performance to specific learning outcomes. Grades can be an excellent assessment tool, if the performance being graded is linked to a specific outcome. Traditional course grades tend to provide a summary measure of students’ performance across many outcomes, which doesn’t provide the kind of specific feedback necessary to link student performance to improvement. They can also include factors like attendance, participation, and test-taking skills. Course grades can provide insight, however, into a student’s understanding of the course content and can serve as an indirect method of assessment.

What about Course Evaluations? Course evaluations are not a direct measure of student learning because they focus more on student perceptions of the quality of teaching than on learning outcomes. Some universities have modified their course evaluations to include questions that address student perceptions of learning as well. These kinds of questions would ask students how well they thought they achieved the learning goals of the course. An example of a revised course evaluation that does both is available at http://www.idea.ksu.edu/StudentRatings/index.html.

Step 5: Develop the assessment plan

Once the mission, learning outcomes and assessment methodologies have been developed, the assessment plan must be completed. See Appendix 6 for a template for an assessment plan at the program level. Program assessment coordinators should use this template to develop their plans and reports or create a text document that provides the same information in a similar format, e.g. assessment measures and benchmarks should be listed for each outcome, along with results and action plans for each outcome. This template can also be helpful for faculty planning assessment at the course level. When completed, the plan should be shared with the Dean and a copy sent to IPART.

Remember, not all outcomes need to be assessed – only those that are the most important. More than two and fewer than eight is generally a manageable number. In addition, not all outcomes must be assessed each year. Departments and programs can schedule assessment of outcomes over several years, if needed.

Before starting your plan, consider the following:

❏ Are your learning outcomes well-stated? Are they measurable? Do they focus on outcomes rather than the process? Are they tied to AUC’s institutional learning outcomes?

❏ Are all of your outcomes being taught? Are they taught in a sensible sequence?

❏ Are different sections of the same course sharing the same outcomes? While course content and teaching methods can differ, it often helps to ensure that all sections of the same course share the same learning goals.

❏ When and how often will assessment information be collected and shared? With whom will it be shared?

❏ How will you use the information? How will it be used to inform the department’s decision-making? How will it affect course content and sequencing, testing, availability of labs and library resources, faculty-student interaction, course staffing, class size, student advising, and more?

Additional information on assessment, training, workshops, and other assistance is available from IPART and from the Center for Learning and Teaching. IPART’s website also hosts a wide range of information and online resources as well as copies of this guide and the assessment plan template in downloadable format.

Step 6: Carry out the assessment

Once the plan is developed and submitted, the assessment process needs to be implemented. Remember, for program assessment, the goal is to assess program-level outcomes, not to evaluate individual students or faculty members. The assessment coordinator, or chair of an assessment committee, will manage the program’s assessment process and will create a detailed timeline for the assessment cycle. The timeline might include dates for when work will be collected, when results will be tabulated and analyzed across the program, and when faculty will meet to discuss the results of the process and recommend changes. Items to consider include which courses and learning experiences are better suited for assessment, timelines and schedules, whether all students should be assessed or only a sample, and how to best protect the confidentiality of the students being assessed.
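If a department chooses to score only a sample of student work, the draw can be done with a short script. The sketch below assumes a hypothetical export file (capstone_submissions.csv) and column names; neither the file nor the sample size is prescribed anywhere in this handbook.

```python
# Minimal sketch (hypothetical file and column names): draw a random sample of
# student artifacts for program-level scoring and replace student identifiers
# with anonymous codes before distributing the work to faculty raters.
import csv
import random

random.seed(2018)  # fixed seed so the same sample can be re-drawn if questioned

with open("capstone_submissions.csv", newline="") as f:
    submissions = list(csv.DictReader(f))  # columns: student_id, file_name, section

sample = random.sample(submissions, k=min(25, len(submissions)))

# Anonymous codes protect the confidentiality of the students being assessed.
anonymized = [
    {"code": f"S{i + 1:03d}", "file_name": row["file_name"], "section": row["section"]}
    for i, row in enumerate(sample)
]

for row in anonymized:
    print(row)
```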

Step 7: Collect, analyze, communicate, and report on your findings

After assessment information is collected, the results need to be analyzed and communicated in useful ways to the faculty, who can consider changes to teaching methods, the curriculum, resource availability and scheduling, course content, and other factors.

At the end of the year, faculty members should complete an assessment report, similar in format to the plan, stating each course’s learning outcomes, assessment tools used, results of the assessment, and how the results were used to make changes to help students and improve learning. A template for the report is included in the appendix.

The program’s assessment coordinator should collect and tabulate results across the program and/or department and report that information back to the department or program faculty. The program’s assessment coordinator should share the department/program’s overall report with the Dean or Area Head and send a copy to IPART, which will provide timely feedback and comments. Departments and programs are encouraged to share their results with all stakeholders.

Assessment results should be used in preparation of departmental budgets and changes to the long-range plans. The results should also be used to review and adjust the department’s assessment plans, to improve student learning.

Step 8: Take action based on those findings

Assessment results are meant to be used: to improve teaching and inform decision-making and resource allocation. Once assessment results have been collected and analyzed, faculty need to return to the department or program’s learning goals – how do the results of the assessment meet those expectations? Were the standards that were set appropriate? Should performance expectations be changed? What aspects of the assessment process worked well and what changes might make it more effective? What were the most effective assessment tools? Can they be shared and used in other courses or programs?

Examples of some of the changes departments and programs might take include:

✓ Increasing the credit value of a key course, or dividing a course into two courses

✓ Developing a capstone course

✓ Requiring students in their last semester to complete an independent project

✓ Developing rubrics with which faculty teams can better review students’ projects

✓ Hiring or re-assigning faculty

✓ Increasing classroom space

✓ Adding new courses

✓ Re-designing the curriculum

✓ Increasing contacts with alumni

✓ Improving the website

✓ Providing training to faculty and staff

Keep track of planned changes to teaching practices, the curriculum, or other aspects of your program based on assessment results, those changes that have already been carried out in response to assessment results, and the impact those changes had on student learning and performance.

Assessment results are important evidence on which to base requests for additional funding, curriculum changes, new faculty lines, and more. Most importantly, the use of assessment results to make these kinds of changes to improve student learning and inform decision-making and planning is the reason why we assess. Even negative assessment results can have powerful, positive impact when they are used to improve the learning process.

Appendix L: Creating Learning Outcomes (Stanford University)

What Are Student Learning Outcomes?

Learning outcomes are statements of the knowledge, skills and abilities individual students should possess and can demonstrate upon completion of a learning experience or sequence of learning experiences. Before preparing a list of learning outcomes consider the following recommendations:

Learning outcomes should be specific and well defined. When developing a list of student learning outcomes, it is important that each statement be specific and well defined. Outcomes should explain in clear and concise terms the specific skills students should be able to demonstrate, produce, and know as a result of the program’s curriculum. They should also exclude the greatest number of possible alternatives so that they can be measured. For example, the learning outcome “Students completing the BS in Chemistry should be well practiced in the relevant skills of the field” is too vague. In this example, we do not know what the relevant skills of the field of chemistry include. This will create problems in measuring the behavior of interest and drawing valid conclusions about the program’s success.

Learning outcomes should be realistic. It is important to make sure that outcomes are attainable. Outcomes need to be reviewed in light of students’ abilities, developmental levels, their initial skill sets, and the time available to attain these skill sets (i.e., 4 years). They should also be in line with what is being taught.

Learning outcomes should rely on active verbs in the future tense. It is important that outcomes be stated in the future tense in terms of what students should be able to do as a result of instruction. For example, the learning outcome “Students have demonstrated proficiency in…” is stated in terms of students’ actual performance instead of what they will be able to accomplish upon completion of the program. Learning outcomes should also be active and observable so that they can be measured. For example, outcomes like “Students will develop an appreciation of, and will be exposed to…” are latent terms that will be difficult to quantify. What does it mean to have an appreciation for something, or to be exposed to something?

Learning outcomes should be framed in terms of the program instead of specific classes that the program offers. Learning outcomes should address program goals and not specific course goals since assessment at the University is program-focused. For example, the learning outcome “Students completing Chemistry 101 should be able to…” is focused at the course level. It does not describe what a graduating senior in Chemistry should be able to demonstrate as a result of the program.

There should be a sufficient number of learning outcomes. You should include between three and five learning outcomes in your assessment plan. Fewer than three will not give you adequate information to make improvements; more than five may be too complicated to assess. It is important to note that not all programs will assess all learning outcomes in all classes. The program may choose to focus on one or two per class.

 

Learning outcomes should align with the program’s curriculum. The outcomes developed in your plan need to be consistent with the curriculum goals of the program in which they are taught. This is critical in the interpretation of your assessment results in terms of where changes in instruction should be made. Using curriculum mapping is one way to ensure that learning outcomes align with the curriculum. A curriculum map is a matrix in which learning outcomes are plotted against specific program courses. Learning outcomes are listed in the rows and courses in the columns. This matrix will help clarify the relationship between what you are assessing at the program level and what you are teaching in your courses.
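A curriculum map does not require special software; a spreadsheet works well. For departments that prefer to generate or check the matrix programmatically, the following is a minimal sketch; the outcome names, course numbers, and the “Introduced/Reinforced/Mastered” marking scheme are hypothetical placeholders, not requirements of this handbook.

```python
# Hypothetical curriculum map sketch: learning outcomes in rows, courses in columns.
# "I" = Introduced, "R" = Reinforced, "M" = Mastered (one common marking scheme).
# All outcome and course names below are placeholders.
courses = ["DEPT 101", "DEPT 310", "DEPT 480"]
curriculum_map = {
    "PLO 1: Apply disciplinary methods":      ["I", "R", "M"],
    "PLO 2: Communicate findings in writing": ["I", "",  "M"],
    "PLO 3: Evaluate research literature":    ["",  "I", "R"],
}

# Print a simple aligned matrix for review at a department meeting.
print(" " * 42 + "  ".join(f"{c:>9}" for c in courses))
for outcome, marks in curriculum_map.items():
    row = "  ".join(f"{m or '-':>9}" for m in marks)
    print(f"{outcome:<42}{row}")

# A quick completeness check: every outcome should reach mastery ("M") somewhere.
for outcome, marks in curriculum_map.items():
    if "M" not in marks:
        print(f"Warning: {outcome} is never brought to mastery in the curriculum.")
```

A scan like the final loop can flag outcomes that are taught early but never brought to mastery before graduation, which is exactly the kind of gap the mapping exercise is meant to surface.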

Learning outcomes should be simple and not compound. The outcomes stated in your plan should be clear and simple. Avoid the use of bundled or compound statements that join the elements of two or more outcomes into one statement. For example, the outcome “Students completing the BS in mathematics should be able to analyze and interpret data to produce meaningful conclusions and recommendations and explain statistics in writing” is a bundled statement. This outcome really addresses two separate goals, one about analyzing and interpreting data and another about writing.

Learning outcomes should focus on learning products and not the learning process. Learning outcomes should be stated in terms of expected student performance and not on what faculty intend to do during instruction. The focus should be on the students and what they should be able to demonstrate or produce upon completion of the program. For example, the learning outcome “Introduces mathematical applications” is not appropriate because its focus is on instruction (the process) and not on the results of instruction (the product).

(Diagram adapted from Linn & Miller, 2005.)

Constructing Learning Outcomes

Considering Taxonomies – Taxonomies of educational objectives can be consulted as useful guides for developing a comprehensive list of student outcomes. Taxonomies attempt to identify and classify all different types of learning. Their structure usually attempts to divide learning into three types of domains (cognitive, affective, and behavioral) and then defines the level of performance for each domain. Cognitive outcomes describe what students should know. Affective outcomes describe what students should think. Behavioral outcomes describe what students should be able to perform or do. (Adapted from OAPA Handbook PROGRAM-Based Review and Assessment. UMass Amherst)

Bloom’s Taxonomy of Educational Objectives (1956) is one traditional framework for structuring learning outcomes. Levels of performance for Bloom’s cognitive domain include knowledge, comprehension, application, analysis, synthesis, and evaluation. These categories are arranged in ascending order of cognitive complexity where evaluation represents the highest level. The table below presents a description of the levels of performance for Bloom’s cognitive domain.

Level – Description

Knowledge (represents the lowest level of learning) – To know and remember specific facts, terms, concepts, principles, or theories

Comprehension – To understand, interpret, compare, contrast, explain

Application – To apply knowledge to new situations to solve problems using required knowledge or skills

Analysis – To identify the organizational structure of something; to identify parts, relationships, and organizing principles

Synthesis – To create something, to integrate ideas into a solution, to propose an action plan, to formulate a new classification scheme

Evaluation (represents the highest level of learning) – To judge the quality of something based on its adequacy, value, logic, or use

Adapted from California State University, Bakersfield, PACT Outcomes Assessment Handbook (1999)

Using Power Verbs

When composing learning outcomes, it is important to rely on concrete action verbs that specify a terminal, observable, and successful performance as opposed to passive verbs that are not observable. For example, the statements “be exposed to,” “be familiar with,” and “develop an appreciation of,” are not observable and would be difficult to quantify. The table below provides a list of common active verbs for each of Bloom’s performance levels.

Knowledge: define/state, identify, indicate, know, label, list, memorize, name, recall, record, relate, duplicate, select, underline, tell, translate, sketch, read, use

Comprehension: classify, describe, discuss, explain, express, identify, locate, paraphrase, recognize, report, restate, review, suggest, summarize, translate, cite, question, distinguish, solve

Application: apply, compute, construct, demonstrate, dramatize, employ, give examples, illustrate, interpret, investigate, operate, organize, practice, predict, inspect, inventory, articulate, assess, collect

Analysis: analyze, appraise, calculate, categorize, compare, contrast, criticize, debate, determine, diagram, differentiate, distinguish, examine, experiment, propose, set up, infer, solve, test

Synthesis: arrange, assemble, collect, compose, construct, create, design, formulate, manage, organize, perform, plan, prepare, produce, select, value, model, integrate

Evaluation: appraise, assess, choose, compare, contrast, decide, estimate, evaluate, grade, judge, measure, rate, revise, score, argue, critique, interpret, criticize, defend

Sample Learning Outcomes

Languages and Literature:

Students will be able to apply critical terms and methodology in completing a literary analysis following the conventions of standard written English.

Students will be able to locate, apply, and cite effective secondary materials in their own texts.

Students will be able to analyze and interpret texts within the contexts in which they are written.

French students will be able to demonstrate oral competence with suitable accuracy in pronunciation, vocabulary, and language fluency.

French students will be able to produce written work that is substantive, organized, and grammatically accurate.

French students will be able to accurately read and translate French texts.

Humanities and Fine Arts:

Students will be able to demonstrate fluency with formal vocabulary, artistic techniques, and procedures of two-dimensional and three-dimensional art practice.

Students will demonstrate in-depth knowledge of artistic periods used to interpret works of art, including the historical, social, and philosophical contexts.

Students will be able to critique and analyze works of art and visual objects.

Students will be able to identify musical elements, take them down at dictation, and perform them at sight.

Students will be able to communicate both orally and in writing about music of all genres and styles in a clear and articulate manner.

Students will be able to perform a variety of memorized songs from a standard of at least two foreign languages.

Students will be able to apply performance theory in the analysis and evaluation of performances and texts.

Students will be able to analyze and interpret scripts.

Students will demonstrate in-depth knowledge and understanding of contemporary theatre forms and artists.

Students will be able to demonstrate proficiency in a variety of dance styles, including ballet, modern dance, jazz, and tap.

Physical and Biological Sciences:

Students will be able to demonstrate an understanding of core knowledge in biochemistry and molecular biology.

Students will be able to apply critical thinking and analytical skills to solve scientific data sets.

Students will be able to apply the scientific method to solve problems.

Students will be able to demonstrate written, visual, and/or oral presentation skills to communicate scientific knowledge.

Students will be able to acquire and synthesize scientific information from a variety of sources.

Students will be able to apply techniques and instrumentation to solve problems.

Mathematics:

Students will be able to translate problems for treatment within a symbolic system.

Students will be able to articulate the rules that govern a symbolic system.

Students will be able to apply algorithmic techniques to solve problems and obtain valid solutions.

Students will be able to judge the reasonableness of obtained solutions.

Social Sciences:

Students will be able to write clearly and persuasively to communicate their scientific ideas.

Students will be able to test hypotheses and draw correct inferences using quantitative analysis.

Students will be able to evaluate theory and critique research within the discipline.

Business:

Students will be able to work in groups and be part of an effective team.

Students will be able to communicate business knowledge both orally and in writing.

Students will be able to recognize and respond appropriately to an ethical and regulatory dilemma.

Students will be able to recognize and diagnose accounting problems.

Students will demonstrate disciplinary competence in a field of business.

(NOTE: These samples were gathered from a variety of sources including UR assessment plans, program assessment statements at other institutions, etc.)

Appendix M: Planning for Dissemination and Use (Stanford University)

“ …The gods condemned Sisyphus to endlessly roll a rock up a hill, whence it would return each time to its starting place. They thought with some reason that there was no punishment more severe than eternally futile labor…” – The myth of Sisyphus (Camus 1955)

Just like the myth of Sisyphus, there is no punishment more severe than investing time and effort for nothing. As foolish as it may seem, the last and most important step in assessment, “closing the loop,” is often ignored after faculty have spent much time and effort developing a plan and collecting data.

Although it is impossible to predict what uses will be made of the assessment results until activities are conducted and results are considered, it is still important to think about how information will be shared and acted upon. This will include planning how results will be shared with faculty members in the program and what types of changes could be made in light of the assessment results (i.e., changes in your curriculum, teaching materials, or instruction).

Here is an example of what a plan for dissemination and use might look like:

Anecdotal evidence (professor reports) suggests that there is a wide discrepancy in the skill level of students entering Chemistry 470. Assessment data will be used to make decisions about the structure of existing courses, more specifically the progression of our courses as students enter Chemistry 470. We will use the collected assessment data to help answer some of these questions by comparing assessment results for students who have taken Chem230 versus those who have not, while controlling for ability (SAT). In addition, we are interested in determining whether our current assessment instrument and rubric effectively assess application of scientific reasoning (i.e., the instrument’s sensitivity). Results will be shared via a faculty retreat and on our program’s website. The entire staff and faculty will participate in reviewing the assessment data at a faculty retreat held each summer. Results will be presented by the assessment committee through a formal presentation. Additionally, we will post the assessment results online annually for transparency.
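For the kind of comparison sketched in the example above (students who have taken Chem230 versus those who have not, controlling for SAT), one common approach is an ordinary least squares model with SAT as a covariate. The sketch below assumes a hypothetical data file (chem470_assessment.csv) and column names; it is an illustration, not the analysis prescribed by the plan.

```python
# Sketch of the comparison described above: assessment scores for students who
# took Chem230 versus those who did not, controlling for ability (SAT).
# The file name and column names are hypothetical placeholders.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("chem470_assessment.csv")  # columns: score, took_chem230 (0/1), sat

model = smf.ols("score ~ took_chem230 + sat", data=df).fit()
print(model.summary())

# The coefficient on took_chem230 estimates the score difference between the two
# groups after adjusting for SAT; its p-value indicates whether the gap is larger
# than would be expected from chance alone.
```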

Use the template below to develop your plans for dissemination and use of results to include in your assessment plan.

Template for Plan for Dissemination and Use

Assessment data will be used to make decisions about [insert first item], [insert second item if appropriate], and [insert third item as appropriate].

Results will be shared via [insert dissemination vehicle].

NOTE: This template is provided to help you develop your statement on how you plan to disseminate and use your assessment results. You are not required to use this wording but you should include its components.

Appendix N: Academic Program Review Schedule

Each unit will be reviewed on a 2-year schedule. The process begins in August of the first year and ends in May of the second year. Within an academic year, up to four units will be involved in the review process at various points. Below is a proposed review cycle for the current departments.

The current proposed review cycle is as follows:

Academic Year     Academic Unit(s)

2018-19     Barnett College of Christian Ministries & Religion (School of Divinity*); School of Unrestricted Education

2019-20     School of Business Administration*

2020-21     Department of Humanities; Department of Visual Arts

2021-22     Department of Communication

2022-23     School of Leadership Studies; School of Legal Studies; Department of Music

2024-25     College of Natural & Health Sciences (Department of Nursing*); College of Education (Department of Undergraduate Studies*)

2025-26     College of Behavioral & Social Sciences (Social Work*)

*Units with state or professional accreditations may be exempt from aspects of the APR process dealing with academic content and quality. These units will focus primarily on the MVR and the AP, and will not require an additional on-site visit.


Appendix O: Administrative Unit Assessment (American University: Cairo)

How to Conduct Effective Outcomes Assessment

As with student learning assessment, assessment of supporting units begins with the development of a mission statement, based on the university’s mission statement, and follows with the development of measurable outcomes, which reflect the units’ critical success factors – those characteristics or activities that are essential for the organization to achieve its mission. Supporting units assess progress towards those outcomes, communicate results, and use those results to improve their processes and inform decision-making and resource allocation. The following sections provide detail on how to develop an effective outcomes assessment plan, including how to write a mission statement, how to develop outcomes, and how to assess progress towards those outcomes.

Step 1: Define the mission of your department or unit.

Your mission serves as the foundation for assessment planning. The mission statement should describe the purpose of the department or unit as well as reflect the mission of the university.

Mission statements should be no more than three or four lines and should clearly and effectively communicate WHAT you do, WHY you do it, and HOW you do it.

Example 1: Main Library (AUC): To develop, maintain and enhance the resources, services, and environment necessary to provide the highest level of support for the instructional and research needs of the AUC community.

Example 2: Engineering Services (AUC): The mission of Engineering Services at the American University in Cairo is to provide high quality training and service to the industrial community in Egypt and other countries.

Example 3: The Writing Center (AUC): The Writing Center is committed to developing students’ communication abilities by providing services to enhance critical thinking, presentation, and writing skills for both graduates and undergraduates in all disciplines. As a function of this mission, we support the efforts of teaching and non-teaching faculty in all disciplines.

Step 2: Identify the most important outcomes of the department or unit.

An outcome is a specific statement that describes the benefit that a department hopes to achieve or the impact on a “customer” or the institution that is a result of the work that your unit performs. Outcomes should be challenging but attainable. A department should identify at least one outcome for each of its functional responsibilities.


Hints for writing outcomes:

● Begin the outcome statement with the beneficiary of the service you provide: “Students are aware of…,” “Administrators have the…,”

● Focus on the ends, not the means -- what the desired “end state” should be.

● An outcome should be stable over a number of years, not time dependent. If it is time-dependent, you are probably writing an objective rather than an outcome.

● Outcomes also need to be measurable and related directly to the work of your department.

● List only key outcomes – between three and five is ideal.

Examples of Departmental Outcomes

➢ “Prospective applicants will meet a welcoming and informative environment when they enquire about graduate studies.”

➢ “Evaluation and testing of the English language proficiency of incoming graduates will be timely and accurate.”

➢ “Library patrons have access to the appropriate information resources needed for learning and research.”

➢ “Users will receive prompt assistance in resolving technical problems related to university networking services.”

➢ “Faculty, staff, and students will be able to identify EO/AA laws, policies, and procedures and know how and from where to seek assistance.”

➢ “University departments and units will have the technical support needed to effectively assess their programs and services.”

➢ “Campus units will receive the technical support they need to conduct effective assessment.”

➢ “Eligible employees have the information they need to make appropriate decisions regarding employee benefits packages.”

➢ “The university’s senior administrators have the information they need for decision-making related to budgets and financial planning.”

➢ “Students will be able to write an effective resume.”

➢ “Faculty members effectively use technology to promote student learning.”

➢ “Clients of the Counseling Center will be able to use two or more ways to manage emotions.”

➢ “All inquiries from the news media will be answered in a timely and appropriate manner.”

What is the difference between outcomes, objectives and strategies?

All AUC departments and units are engaged in planning as well as assessment. Assessment plans detail expected outcomes, progress towards those outcomes, and how results will be used to improve effectiveness. While the results of assessment are intended to inform planning, they are not substitutes for plans and are not the place to detail administrative strategies, objectives, and planned administrative actions.


Outcomes are something that the department or unit wants to achieve; they are desired end results for the organization or program, rather than actions. Outcomes are related to the institution or department’s mission and vision, and focus on the benefit to the recipient of the service.

Example of an Outcome Statement: All academic support and administrative units at AUC conduct ongoing and effective assessment of their activities and services and use the results of assessment to inform planning, decision-making and resource allocation.

Objectives are the tasks to be completed in order to achieve a goal. Objectives are specific and measurable and must be accomplished within a specified time period. There should be two to three objectives per goal.

Example of an Objective: By the end of 2009-2010, all academic support and administrative units will have outcomes assessment plans in place.

Strategies are the means you plan to use to achieve your objectives. Use the minimum number of strategies needed to achieve each objective.

Example of a Strategy: Develop and distribute assessment materials in hard-copy and online forms. These will include an assessment guide, plan, and report templates, examples, evaluative rubrics to provide feedback on plans and reports, online links to additional resources, etc.

Finally, share outcomes with staff and with the university community. Staff perform more effectively when they are given clear goals to help them focus on what’s most important and understand how individual responsibilities or tasks fit with the goals and outcomes of the department.

Step 3: Define how you will assess progress towards these outcomes

Assessments don’t have to be complicated and, when used well, can be a powerful tool for improvement, providing better information for planning, budgeting, new programs, staffing, and student support. Assessment data help us understand where our unit is in our progress towards our expected outcomes, where we might be having difficulty, and how we might change the way we work to improve our effectiveness. Assessment is not a performance evaluation of individual staff members.

Start by taking an inventory of the kinds of tools your department or program is already using. What information are you already collecting? What kinds of assessments are you already using or are already familiar with? What kinds of assessments are recommended by your profession? For each expected outcome, describe methods you are using or plan to use to measure how well your department or unit is actually performing in relation to the outcome. Assessment measures can be direct or indirect, qualitative or quantitative, objective or subjective, and multiple measures should be used for each outcome. An assessment method or measure can also be used to assess progress towards more than one outcome.

Develop targets or benchmarks for each measure, for example, “80% of users responded that they are satisfied with the service.”
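
For simple measures, the benchmark check itself can be automated. The following is a minimal Python sketch of such a check; the response values, the 80% target, and all variable names are illustrative assumptions rather than part of any prescribed SEU or AUC procedure.

# Minimal sketch: checking a satisfaction measure against its benchmark.
# The responses list is illustrative; in practice it would come from your
# survey tool's export.
responses = ["satisfied", "satisfied", "neutral", "satisfied", "dissatisfied"]

benchmark = 0.80  # target: 80% of users report they are satisfied
satisfied_rate = responses.count("satisfied") / len(responses)

print(f"Satisfied: {satisfied_rate:.0%} (benchmark {benchmark:.0%})")
print("Benchmark met" if satisfied_rate >= benchmark else "Benchmark not met")

The same pattern extends to other count-based measures, such as average wait time compared against a target number of minutes.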

Additionally, many offices on campus collect and analyze institutional data. These offices include IPART, Career Advising and Placement Services (CAPS), Alumni, Student Affairs, Graduate Studies, and others. This data can be analyzed to provide your program with information about your clients or customers.


Examples of assessment measures for supporting units include:

❖ Student satisfaction surveys

❖ Number of complaints

❖ Count of program participants

❖ Growth in participation

❖ Average wait time

❖ Comparisons to professional organizations’ best practices

❖ Statistical reports

❖ Average service time

❖ Staff training hours

❖ Number of applications

❖ Processing time for requests

❖ Number of users

❖ Focus groups

❖ Opinion surveys

❖ External review

❖ Number of staff trained

❖ Dollars raised

❖ Attendance at events

❖ Student participation in clubs and activities

Step 4: Complete the assessment plan

Once the mission, expected outcomes, and assessment methods and benchmarks have been developed, the assessment plan must be completed. Assessment coordinators should use the assessment plan template to develop their plans and reports, or create a text document that provides the same information in a similar format; e.g., assessment measures and benchmarks should be listed for each outcome, along with results and action plans for each outcome. When completed, the plan should be shared with the Area Head and a copy sent to IPART.

Remember, not all outcomes need to be assessed – only those that are the most important. Three to five is generally a manageable number. In addition, not all outcomes must be assessed each year. Departments and units can schedule assessment of outcomes over several years, if needed.

Additional information on assessment, training, workshops, and other assistance is available from IPART. IPART’s website also hosts a wide range of information and online resources as well as copies of this guide and the assessment plan template in downloadable format.


Step 5: Carry out the assessment

Once the plan is developed and submitted, the assessment process needs to be implemented.

Remember, for outcomes assessment, the goal is to assess department outcomes, not to evaluate individual students or faculty members. The assessment coordinator will manage the program’s assessment process and will create a detailed timeline for the assessment cycle. The timeline might include dates for when work will be collected, when results will be tabulated and analyzed across the program, and when the department’s staff will meet to discuss the results of the process and recommend changes.

Step 6: Collect, analyze, communicate, and report on your findings

After assessment information is collected, the results need to be analyzed and communicated in useful ways to department colleagues, who can consider changes to service methods, resource availability and scheduling, and other factors. At the end of the year, the department should complete an assessment report, similar in format to the plan, stating expected outcomes, assessment tools used, results of the assessment, and how the results were used to make changes to improve effectiveness. A template for the report is included in the appendix.

The department’s assessment coordinator should share the department overall report with the Area Head and send a copy to IPART, which will provide timely feedback and comments. Departments and units are encouraged to share their results with all stakeholders. Assessment results should be used in preparation of departmental budgets and changes to the long-range plans. The results should also be used to review and adjust the department’s assessment plans, to ensure that the highest quality information is available to assist the department in meeting its expected outcomes.

Step 7: Take action based on those findings

Assessment results are meant to be used: to improve effectiveness and inform decision-making and resource allocation. Once assessment results have been collected and analyzed, the department needs to return to its expected outcomes – how do the results of the assessment meet those expectations? Were the standards that were set appropriate? Should performance expectations be changed? What aspects of the assessment process worked well, and what changes might make it more effective? What were the most effective assessment tools? Can they be shared and used in other courses or programs? In what areas does the department excel, and in what areas does it need to improve?

Keep track of planned changes, those changes that have already been carried out in response to assessment results, and the impact those changes had on performance and effectiveness.

Then, start the process all over again, for continuous quality improvement.

Assessment results are important evidence on which to base requests for additional funding, curriculum changes, new faculty and staff lines, and more. Most importantly, the use of assessment results to make these kinds of changes to improve effectiveness and inform decision-making and planning is the reason why we assess. Even negative assessment results can have powerful, positive impact when they are used to improve performance, effectiveness, and ultimately, the university’s ability to achieve its mission.


Appendix P: Useful Assessment Websites

Accreditation Council for Business Schools and Programs
Alverno (Student Assessment of Learning)
American Academy for Liberal Education
Annapolis Group
Assessment Commons
Assessment Institute - IUPUI
Assessment Update (Jossey-Bass)
Associated Colleges of the South
Association of American Colleges and Universities – VALUE: Valid Assessment of Learning in Undergraduate Education
Association for the Assessment of Learning in Higher Education (AALHE)
Association of Theological Schools
Association to Advance Collegiate Schools of Business (AACSB)
Boise State: Institutional Analysis, Assessment, and Reporting
C-BEN (Competency-Based Education Network)
Council for Accreditation of Counseling and Related Educational Programs
Higher Education Data Sharing Consortium
Independent Colleges and Universities of Florida (ICUF)
Indiana University: Planning and Institutional Improvement
Integrated Postsecondary Education Data System (IPEDS) Data Center
Malcolm Baldrige National Quality Award
National Association of Schools of Music
North Carolina State: University Planning & Analysis
National Council for Accreditation of Teacher Education
National Institute for Learning Outcomes Assessment (NILOA)
National Survey of Student Engagement
Old Dominion University: Institutional Research and Assessment
Samford University: Assessment at Samford
Southern Association of Colleges and Schools (SACS)
Stanford University: Institutional Research & Decision Support
Truman State: Assessment
University of Colorado-Boulder: Office of Planning, Budgeting, and Analysis
University of Florida - Assessment
University of Wisconsin-Madison
University of Wisconsin-Stout (MBA)


Appendix Q: IPEDS Peer Data

Overview – The Integrated Postsecondary Education Data System (IPEDS) is a comprehensive data-collection tool designed to collect and aggregate various forms of data from every Title IV institution (colleges, universities, and technical schools). Title IV institutions are those receiving federally funded grants, scholarships, and student loans. IPEDS is sponsored by the U.S. Department of Education’s Institute of Education Sciences.

Title IV institutions are required to submit several reports to IPEDS annually. Each year, the Office of Institutional Research and Retention (IRR) compiles and submits the following data reports: (1) Institutional Characteristics, (2) Fall Enrollment, (3) 12-Month Enrollment, (4) Completions, (5) Graduation Rates, (6) Student Financial Aid, (7) Finance, and (8) Human Resources. The data is collected to describe—and analyze trends in—postsecondary education in the U.S., in terms of the number of students enrolled, staff employed, dollars expended, degrees earned, and many other variables.

Submitted Data –

1. Institutional Characteristics - Collects basic institutional characteristics including types of academic programs, mission statement, organizational control, award levels, calendar system, student enrollment organization (part-time, full-time, etc.), fall enrollment estimates, corporate structure, admission requirements, student charges, and athletic associations.

2. Fall Enrollment – Collects enrollment of part-time and full-time undergraduate and graduate students, enrollment by ethnicity and gender, enrollment by major, enrollment by age and gender, first-time undergraduate data, retention rates, and student-to-faculty ratio.

3. 12-Month Enrollment – Collects the unduplicated headcount of students within an academic year and the 12-month instructional activity.

4. Completions – Collects completions by major (CIP code), by award level (UG or GR), by major sequence, by gender, and by ethnicity.

5. Graduation Rates – Collects the six-year graduation rate for a particular fall freshman cohort, completions within 150% of normal time, completions by length of time (4, 5, and 6+ years), and transfers and exclusions.

6. Student Financial Aid – Collects the quantity and percentage of federal, state, and institutional grant aid for a particular freshman cohort.

7. Finance – Collects data with regard to the institution’s financial position (assets, liabilities, plant, property, and equipment), changes in net assets (revenue, expenses, gains, and losses), amount of scholarships and fellowships, revenue and investment returns, expenses by functional and natural classifications, and the value of endowment assets.

8. Human Resources – Collects data on employees by assigned position (full-time and part-time), headcount of instructional staff by rank and contract length, salaries of staff, fringe benefits of staff, headcount of faculty with tenure, salary class intervals by classification, and headcount of new hires.


The massive quantity of reported data allows for a comprehensive process of peer analysis and benchmark studies. In addition to a data collection system, the IPEDS datacenter provides public access to every institution’s reported data.

How to find institutional data –

1. Go to the IPEDS Data Center.
2. Select the “Look up an Institution” option.
3. Select the “Continue” button.
4. Enter and select the “Institution Name” or Unit ID (SEU Unit ID = 137564).
5. Select one of three options (Institutional Profile, Reported Data, or Data Feedback Report).

Instructional Video: How to Find Institutional Data

Peer Analysis and Benchmark Studies – For more advanced analysis, you might consider the use of the IPEDS Executive Peer Tool, which compares individual institutions against a select comparison group. Each year the Office of Institutional Research and Retention maintains a custom comparison group, which includes a selection of sister Assemblies of God (AG) schools, Council for Christian Colleges & Universities (CCCU) South schools, a portion of the Independent Colleges and Universities of Florida (ICUF), and a select number of other institutions.

How to compare institutional data –

1. Go to the IPEDS Data Center.
2. Select the “ExPT and DFR” option.
3. Enter and select the focus institution (i.e. Southeastern University).
4. Choose the appropriate comparison group (Recommended: “Use institution-defined custom comparison group”).
5. Select one of two options (Institutional Profile or Reported Data, and Data Feedback Report).
6. Select the “Next” button once your comparison group is selected.
7. Choose one or more variables and select “Next.”
8. On the next page, you will have an option to view and download the data.

Instructional Video: How to Compare Institutional Data
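
If the comparison data from step 8 are downloaded for further analysis, a spreadsheet or a short script can be used to benchmark SEU against the peer group. The following is a minimal Python sketch assuming the export has been saved as a CSV with one row per institution and hypothetical column names (“institution” and “retention_rate”); the file name and column names should be adjusted to match the variables actually selected.

# Minimal sketch: benchmarking a focus institution against an IPEDS peer group.
# The file name and column names below are assumptions; adjust them to match
# the variables selected in the IPEDS export.
import pandas as pd

FOCUS = "Southeastern University"

df = pd.read_csv("ipeds_peer_export.csv")
peers = df[df["institution"] != FOCUS]

focus_value = df.loc[df["institution"] == FOCUS, "retention_rate"].iloc[0]
peer_median = peers["retention_rate"].median()
share_below = (peers["retention_rate"] < focus_value).mean()  # share of peers below SEU

print(f"{FOCUS}: {focus_value:.1f}")
print(f"Peer-group median: {peer_median:.1f}")
print(f"Share of peers below {FOCUS}: {share_below:.0%}")

Summaries like these can then feed directly into the peer analysis and benchmark studies described above.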


Appendix R: Foundational Core Outcomes

Upon graduation from Southeastern University, students will be able to:

Spiritual Formation

1. Demonstrate a basic knowledge of the biblical basis of Christian theology, the history of Christian traditions, and the various expressions of global Christianity.

2. Articulate a biblically-based statement of Christian identity as it relates to theology, ethics, habits and practices, and vocation.

Personal and Social Responsibility

3. Apply critical consideration of ethical principles and actions to their lived experiences as Christians.

4. Demonstrate intellectual and practical fluency in the stewardship of God’s creation, including their own health & wellness, personal & family finances, civic engagement, and the environment.

5. Engage with diverse populations and belief systems.

Intellectual & Practical skills

6. Demonstrate the ability to reason and develop evidence-based decisions using numerical information and to use relevant mathematical and logical methods to analyze and solve problems effectively.

7. Demonstrate information literacy and critical thinking by locating and evaluating information that assists them in examining ideas and evidence before formulating an opinion or conclusion.

8. Demonstrate effective written and oral communication with and response to varied audiences.

Engaging Human Culture and the Natural World

9. Participate in the discourse of human cultures and its expression in the humanities (e.g. arts, history, literature).

10. Evaluate the relationship between Christian faith and the sciences (natural, behavioral, and social) and their approaches to understanding and resolving unanswered questions.


Appendix S: Survey Policy & Procedure

Surveys are an indispensable tool within higher education for collecting useful data about populations (whether they be students, alumni, faculty, staff, or others). Typically, surveys ask respondents to answer questions about their experiences, perspectives, or opinions. Surveys administered at Southeastern take a variety of forms, depending on factors such as the department or program administering the survey, the survey population, the rationale for the survey, and the kind of data and information needed. Some surveys administered by the university are part of national benchmarking efforts, which not only return information about our institution but also provide valuable comparison data. A number of surveys are designed to return information that is mandated for government reporting or that must be gathered for regional accreditation (SACSCOC) documentation and reporting. The most powerful aspect of survey data is its capacity to inform institutional stakeholders so they can make data-driven decisions for improvement. Still, it is important to recognize that survey data is always a matter of indirect assessment, insofar as the responses coming from any population are self-reported rather than a direct assessment of objective facts.

Demand at Southeastern University for surveys has grown substantially over the years as stakeholders have realized the potential for quality enhancement and continuous improvement afforded by survey instruments. This growth has necessitated the development of official policies and procedures to optimize the frequency and quality of survey administration and to ensure the integrity of all data obtained in such measures. The following policies and procedures are binding for all institutional stakeholders at Southeastern University, with a number of exceptions listed below.

Survey Request
Individuals or departments at SEU interested in conducting a survey of any population are expected to consult with the Office of Institutional Effectiveness prior to the development and/or administration of a survey. Use the following form to submit the details of your proposed survey to the Office of Institutional Effectiveness, including the text of your questionnaire, administration dates, and other pertinent information: SEU Survey Proposal Form

Online Survey Calendar
The Office of Institutional Effectiveness maintains an up-to-date Google Calendar, which delineates all planned institutional and departmental surveys, course evaluations, and other measures. Please consider the timing of your survey, as overlapping measures may lead to survey fatigue among respondent populations. The calendar can be accessed by adding Institutional Effectiveness or [email protected] to your Google Calendar under “Add a coworker’s calendar”.

Institutional Review Board (IRB)
Some surveys require approval through the Institutional Review Board. Please review the IRB requirements to determine whether your survey needs IRB approval prior to launch. The Office of Institutional Effectiveness is not responsible for IRB applications. You can learn more about IRB requirements at the following site: www.seu.edu/irb/start-here/

Survey Action Plan Forms
After an office/department has conducted a survey, the Action Plan Form must be completed within 3 months of the survey closing. This form asks for information regarding the key survey findings and proposed actions for using the survey results (for program/departmental planning and/or improvement). The form can be accessed via the following link: SEU Survey Action Plan form


Exceptions
These guidelines do not apply to the following common survey types, which do not need to be submitted for review by the IE office; however, they may still require IRB approval:

● Surveys administered by faculty for scholarly research (collaboration with the Office of Institutional Effectiveness is still encouraged in this case).
● Surveys administered for thesis/dissertation research.
● Surveys administered by students for class projects.
● Surveys administered by the University Leadership Team.



Southeastern University

Office of Institutional Effectiveness
1000 Longfellow Blvd.
Lakeland, FL 33801
Phone: 863.667.5703
Fax: 863.667.5200

