
ACADEMIC UNITS ASSESSMENT HANDBOOK

Guidelines for Assessing Institutional Effectiveness

May 2020 Edition

ipa.fsu.edu [email protected]


TABLE OF CONTENTS

OVERVIEW OF INSTITUTIONAL EFFECTIVENESS
  What is Institutional Effectiveness?
  Why do we evaluate Institutional Effectiveness?
  How do we assess Institutional Effectiveness?
  What is the appropriate number of Program and Student Learning Outcomes?
  Who governs Institutional Effectiveness?
  When do we assess Institutional Effectiveness?
  Which national, regional, and state entities are Institutional Effectiveness stakeholders?

STUDENT LEARNING OUTCOMES
  Description of Student Learning Outcomes
  Assessing Student Learning Outcomes
  Recording and Analyzing Results
  Formulating Improvement Plans

PROGRAM OUTCOMES
  Description of Program Outcomes
  Assessing Program Outcomes
  Recording and Analyzing Results
  Formulating Improvement Plans

Appendix A: Bloom's Taxonomy
Appendix B: Assessment Components of Student Learning Outcome
Appendix C: Assessment Components of Program Outcome


OVERVIEW OF INSTITUTIONAL EFFECTIVENESS

What is Institutional Effectiveness? A planning, implementation, and assessment process that allows us to evaluate whether our practices are meeting our goals. Institutional effectiveness (IE) activities help assess performance and provide university accountability. The IE process reinforces instructional and administrative quality and effectiveness through a systematic review of goals and outcomes that are consistent with FSU's mission.

Why do we assess Institutional Effectiveness? For three reasons: 1) to self-evaluate and improve educational activities to benefit students, 2) to demonstrate the product of our efforts to the public and campus community, and 3) to meet the requirements for accreditation. "Student outcomes—both within the classroom and outside of the classroom—are the heart of the higher education experience" at FSU (SACSCOC Resource Manual, p. 66). The IE process is a key way to measure how well we are meeting student learning and student experience goals.

How do we assess Institutional Effectiveness? IE is typically evaluated at the level of individual academic programs, which are defined as distinguishable degree programs, commonly assigned a unique CIP (Classification of Instructional Programs) code and offering one or more majors. For example, the academic department of Chemistry and Biochemistry has three academic degree programs: Biochemistry, Chemical Science, and Chemistry. Each academic program, in turn, offers at least one academic major at one or more degree levels (Bachelor's, Master's, Specialist, Doctorate). In addition to degrees, academic programs can also offer certificates. The current inventory of academic programs is available in the FSU Fact Book and in an interactive visualization. Every academic program sets annual performance goals that are measured and evaluated to determine how well the program performed on them in a given year.

SLOs – Academic programs develop student learning outcomes that specify the knowledge, skills, values, and attitudes that students will attain throughout their studies in a program or in a specific course. Assessment methods and desired levels of student competency are established in accordance with discipline-specific expectations and at levels that are appropriate for post-graduation success. If appropriate, SLOs can be written to conform to the requirements of a discipline-specific accrediting agency.

POs – All university units (academic, administrative, and academic and student support services) define and set expectations for their program outcomes. POs are the broader goals of the academic program and may align with FSU Strategic Plan implementation, state funding metrics, the strategic plans of the unit's College, and/or the requirements of a discipline-specific accrediting agency.

What is the appropriate number of Program and Student Learning Outcomes? Each academic program is required to formulate at least 1 PO and at least 2 SLOs for all degree levels except Bachelor's. Due to increased accountability for undergraduate educational outcomes, Bachelor's-level programs are requested to articulate at least 5 SLOs, at least 3 of which must be assessed in 3000-4000-level courses and focus on 3 different categories from the following list: content/discipline knowledge and skills, communication skills, critical thinking skills.
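For programs that keep an inventory of their Outcomes in a spreadsheet or script, these minimums can be double-checked automatically. The following Python sketch is illustrative only and is not part of the IE portal; the field names and the sample program record are hypothetical.

    # Illustrative sketch only (not part of the IE portal); field names and the
    # sample record are hypothetical. Checks the minimum Outcome counts described
    # above: at least 1 PO and 2 SLOs for every program, and for Bachelor's-level
    # programs at least 5 SLOs, of which at least 3 are assessed in 3000/4000-level
    # courses and cover 3 different SLO categories.

    def meets_outcome_minimums(program):
        pos = program["program_outcomes"]
        slos = program["student_learning_outcomes"]
        if len(pos) < 1 or len(slos) < 2:
            return False
        if program["degree_level"] != "Bachelor's":
            return True
        if len(slos) < 5:
            return False
        upper = [s for s in slos if 3000 <= s["course_level"] <= 4999]
        categories = {s["category"] for s in upper}
        return len(upper) >= 3 and len(categories) >= 3

    sample_program = {
        "degree_level": "Bachelor's",
        "program_outcomes": [{"name": "Two-year graduation rate"}],
        "student_learning_outcomes": [
            {"category": "Discipline/Content Knowledge and Skills", "course_level": 3000},
            {"category": "Communication Skills", "course_level": 4000},
            {"category": "Critical Thinking Skills", "course_level": 4000},
            {"category": "Discipline/Content Knowledge and Skills", "course_level": 2000},
            {"category": "Communication Skills", "course_level": 1000},
        ],
    }
    print(meets_outcome_minimums(sample_program))  # True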


An off-campus instructional site is a teaching site located geographically apart from the main campus, or a site at which an institution provides electronic delivery and where students go to access support services (e.g., Republic of Panama, Panama City, Sarasota, or Distance Learning). Please note that academic programs offered on multiple campuses and/or in multiple modalities are expected to have the same SLOs, but they may have different POs. Instructional faculty from different locations/modalities should jointly decide which student knowledge, skills, and abilities to select for SLOs. However, for distance/online learning experiences and for approved off-campus sites, the IE assessment needs to be conducted and reported separately within the IE portal. A comparative assessment of SLOs should be included when reporting and analyzing results and/or formulating improvement plans; it may note any significant differences in achieving Outcomes between different delivery types or locations.

Who governs Institutional Effectiveness? The Office of the Provost and Executive Vice President is responsible for the overall coordination of the university assessment processes. The Office of Institutional Performance and Assessment (IPA) within the Office of the Provost provides assistance to FSU units during all stages of their IE assessment cycles. The final review and approval of entries in the IE portal is the responsibility of the Executive Vice President for Academic Affairs or designee. At the level of individual academic units, the IE assessment process is a shared responsibility among the academic degree program faculty, the assessment coordinators, the department chair(s), and the associated dean(s). As such, they are all involved in an annual workflow that ensures that Outcomes are appropriately designed, measured, analyzed, and reported in a timely fashion. Each academic program creates an assessment governance structure most suitable to its size and functions.


Typically, each academic program designates one or two faculty members as assessment coordinators who lead and manage the assessment process and the implementation of improvements at the level of their academic program. However, it is expected that all program faculty understand, provide input for, agree with, and participate in the improvement of educational activities. IE assessment activities should be carried out in close coordination with existing department/program curriculum committees, especially in cases when new curricular actions or changes to academic policies are being proposed in furtherance of continuous improvement of student learning. Before, or shortly after, academic program assessment coordinators submit the descriptions of the IE assessment components into the IE portal, department chairs or designees review and approve the submissions. The final review and approval should be conducted by the dean of the college or an authorized designee. Suggested rubrics for evaluating the IE submissions are developed by the IPA Office and distributed on its website.

When do we assess Institutional Effectiveness? While the process of improvement is always continuous and ongoing, we only formally evaluate attainment of Program and Student Learning Outcomes once a year, at the conclusion of the academic year cycle. Start and end dates for an academic year are determined by the academic programs based on their instructional activities timeline. Common academic year cycles are: 1) Fall and Spring, 2) Fall, Spring, Summer, 3) Summer C, Fall, Spring, Summer A & B.


Because academic programs have varying faculty and staff availability throughout the calendar year, there are two recommended calendars for engaging in and completing the various components of the IE assessment: one option is to begin IE assessment activities in May, and the other is to begin them in August. Importantly, all academic programs, provided they have available human resources at that time, are allowed and encouraged to complete their IE assessment components before the specified deadlines.

Which national, regional, and state entities are Institutional Effectiveness stakeholders? We provide information about student learning outcomes and program outcomes to the Southern Association of Colleges and Schools Commission on Colleges (SACSCOC), the regional body for the accreditation of degree-granting higher education institutions in the Southern states. SACSCOC accreditation is required to maintain eligibility for federal funding, including student financial aid and research grants. Per Regulation 8.016, the Florida Board of Governors (BOG) requires all institutions in the State University System of Florida to establish a process for certifying that each baccalaureate graduate has completed a program with clearly articulated expected core student learning outcomes. These outcomes constitute the state-mandated Academic Learning Compacts (ALCs). Many national discipline-specific accrediting bodies also require FSU academic programs to document and achieve a range of student educational outcomes and to provide evidence of efforts toward continuous improvement.

STUDENT LEARNING OUTCOMES

Description of Student Learning Outcomes

Based on the definitions provided by SACSCOC and the BOG, SLOs specify the content knowledge, discipline-specific skills, communication skills, critical thinking skills, values, and attitudes that students are expected to attain throughout their studies in a program or in a specific course. When developing expectations for student learning, program faculty are asked, in addition to relying on their own expert opinion, to take into consideration the perspectives of appropriate constituencies, such as potential employers and graduate programs, regarding the knowledge and skills graduates need in the global marketplace and society. When defining what faculty want students to learn in a program or in a particular course, it is recommended to use specific action verbs from Bloom's taxonomy (see Appendix A) because they describe measurable stages of learning.


Here’s an example of the Bachelor’s Biological Science SLO description:

• Provide a succinct name for the SLO: SLO Name: Understanding Core Genetic Principles.

• Identify what knowledge, skills or abilities students will have learned: Student Learning Outcome: Upon completion of the course of instruction, the student will be able to demonstrate knowledge of the core concepts of genetics: the nature, inheritance, and expression of genetic information.

• Assign proper categorization for the SLO (Discipline/Content Knowledge and Skills, Communication Skills, Critical Thinking Skills): SLO Category: Discipline/Content Knowledge and Skills.

Assessing Student Learning Outcomes

SLO assessment should rely on observable measurements of student learning. These measurements must be methodologically sound, reliable, and delivered in a consistent fashion from year to year. When designing an assessment methodology for a SLO, it is useful to adhere to the S.M.A.R.T. guidelines – outcomes of student learning should be Specific, Measurable, Achievable, Relevant, and Time-bound. Assessment instruments may include a standardized or instructor-constructed quiz/test/exam, select items on a quiz/test/exam, a lab assignment, a capstone project, a juried performance, or a final paper. Assessment should be aligned with the discipline. Critical thinking skills of students in Political Science may be best assessed through an evaluative report that provides an analysis of a political organization. Communication skills of students studying Russian may be best showcased through an oral presentation and evaluated by means of a grading rubric that allows for scoring of pronunciation and use of vocabulary. Discipline-specific skills of students in Chemistry may be best measured through a lab report describing the use of proper equipment and techniques. Final course letter grades are not suitable for assessment because they are summative measures that do not allow for an evaluation of specific skills or knowledge sets. Here's an example of the Bachelor's Nursing SLO assessment methodology:

• Describe how the assessment of the SLO will be conducted (relevant details may include information about who will assess student learning, in which course(s), during which semester(s), under what circumstances, and how the assessment instrument will be used):


Assessment Process: Students' compassionate nursing care skills will be assessed in the Professional Nursing Internship (NUR 4945) capstone course taught every Spring term. Two teaching faculty will conduct an in-person, 1-hour-long observation of a care-giving session delivered by the student. The rating that the student will receive on the ‘Compassionate Care’ item in the clinical performance evaluation form will be used to measure student learning.

• Specify a measurable assessment standard that defines success: Goal/benchmark: 90% of students in the course will demonstrate performance at the level of ‘Satisfactory’.

• Provide information about the assessment instrument: Assessment Instrument: Clinical Evaluation.
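For illustration only, a short Python sketch of how a goal/benchmark of this kind might be checked once the clinical evaluation ratings are collected; the rating values and the list below are hypothetical, not taken from the actual evaluation form.

    # Illustrative sketch only; ratings are hypothetical, not actual clinical
    # evaluation data. Computes the share of students rated 'Satisfactory' or
    # better on the 'Compassionate Care' item and compares it to the 90% goal.

    ratings = ["Satisfactory", "Excellent", "Satisfactory", "Unsatisfactory",
               "Satisfactory", "Excellent", "Satisfactory", "Satisfactory",
               "Satisfactory", "Excellent"]           # one rating per student

    passing = {"Satisfactory", "Excellent"}           # assumed passing levels
    share = sum(r in passing for r in ratings) / len(ratings)

    print(f"{share:.0%} of students rated Satisfactory or better")
    print("Goal/benchmark met" if share >= 0.90 else "Goal/benchmark not met")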

Recording and Analyzing Results

During the academic year, program faculty deliver instruction and carry out other (co-)curricular activities as planned before the start of the academic year. Designing good assessment mechanisms and delivering instructional activities focused on promoting student learning are the keys to successful IE reporting. At the end of each assessment cycle, academic programs aggregate information/data and then conclude whether the SLO goals/benchmarks selected at the start of the cycle were achieved. The results are analyzed to determine the reasons why these Outcomes were attained by students at that particular level. A proper results statement for each SLO is largely quantitative and provides ample methodological detail. Per FERPA guidelines, do not provide information about an individual student's academic performance, and exercise caution when reporting aggregated assessment results for groups of fewer than 5 students. Here's an example of the Master's Geography SLO reporting of results and analysis:

• Present information regarding the levels at which the SLO was achieved: Results Statement: 17 (81%) out of 21 students enrolled in the Research Methods (GEO 5118) course in Fall 2019 and Spring 2020 received a score of 85 or above on their term papers. The benchmark set for this SLO (at least 80% of students receiving a score of 85 or above) was achieved.

• Examine the reason(s) for the attained results:


Analysis of Results: Compared to the Fall 2018-Spring 2019 results, in which 76% of students scored at or above the benchmark, a greater proportion of students achieved the SLO this academic year. We hypothesize that the increase was due to improvements in the instructional materials for, and pedagogical approaches to, teaching how to choose the appropriate statistical test for different research questions (this topic was the most problematic for students last academic year). We also noticed that student performance on the SLO correlated with attendance: those who attended fewer classes showed lower levels of mastery on the term paper.

Formulating Improvement Plans

The most intensive component of the assessment cycle is devising plans for, and committing to, continuous improvement. Formulating sound improvement plans requires participation, engagement, and meaningful contribution on the part of instructional faculty and curriculum committees. Whether SLOs have been met or not, it is the responsibility of the program faculty and leadership to determine a plan of action for the next year. Occasionally, the level of student learning does not meet the desired goal/benchmark. In this case, academic programs should provide reasons why these goals/benchmarks were not met and then develop an improvement plan for the upcoming year. These plans should be well thought out and describe specific changes to be implemented, such as revising instructional materials, adding or removing topics from taught content, or adopting a different textbook. Improvement plans may also require new or modified assessment practices or professional development. In cases when SLOs are being consistently achieved at a high level, it is recommended either to increase the desired goal/benchmark or to derive a SLO that addresses other important learning outcomes. If these changes are not feasible, academic programs should consider how they expect to maintain a high level of student learning. Most importantly, "Plans to make improvements do not qualify as seeking improvement, but efforts to improve a program that may not have been entirely successful certainly do." (SACSCOC Resource Manual, page 68). Here's an example of the Panama City Campus Bachelor's Professional Communication SLO improvement plan:

• Describe specific plans to improve or sustain performance. Improvement Plan: Because the goal/benchmark has been consistently achieved for the last three academic years, the academic program faculty and curriculum committee decided to raise it above the current 75%. Next year, at least 85% of students enrolled in the SPC 2608 Public Speaking course will receive an average ‘Good’ or ‘Excellent’ peer rating on the Articulation and Body Language questions on the grading rubric.


In addition to increasing the goal/benchmark for this SLO, we also want to make a change to one of the class assignments. The results showed that the lowest ratings were consistently given for the in-class delivery of the Persuasive Speech. Students shared that the time allotted for this speech was insufficient. We will increase the time students are given to deliver the Persuasive Speech from 5-7 minutes to 6-8 minutes. The results from different campuses are comparable: students enrolled in the same course offered on the Tallahassee campus also demonstrated a high level of competency in the SLO for Vocal and Physical Delivery in Public Speaking (PC Campus = 84% and Tallahassee Campus = 86%).

PROGRAM OUTCOMES

Description of Program Outcomes

Academic programs have a direct impact on student learning and student success. Unlike SLOs, which are focused on improving student content knowledge and skills, POs reflect broader goals of the academic program and may align with FSU Strategic Plan implementation, state funding metrics, the strategic plans of the unit's College, and/or the requirements of a discipline-specific accrediting agency. Chosen POs should be the result of an academic unit's analysis of the program's strengths and weaknesses and should reflect its commitment to advancing the University's mission. Academic POs, like many university-wide metrics, may require planning for improvement that is long-term and is a product of multiple strategies. For example, increasing student graduation rates is a PO that requires successfully implemented efforts aimed at retaining students in a major, ensuring their completion of required course work, and ensuring that academic maps do not have ‘bottlenecks’. In addition, graduation-impacting changes put into effect in a given academic year may yield noticeable results in four or more years – which is how long it takes for a new cohort of students to fully benefit from the full range of programmatic improvements. Here's an example of the Doctoral Biomedical Sciences PO description:

• Provide a succinct name for the PO: PO Name: Manuscript submissions.

• Identify the improvement to be made at the academic program level: Program Outcome: Manuscripts with doctoral students and faculty as co-authors will be written and submitted to scientific journals.


Assessing Program Outcomes

Assessment of POs should be methodologically sound, reliable, and measured in a consistent fashion from year to year. When designing an assessment methodology, it is useful to adhere to the S.M.A.R.T. guidelines – program-level outcomes should be Specific, Measurable, Achievable, Relevant, and Time-bound. In cases when an academic program selects a PO focused on university-level priorities, it may be more appropriate to use officially reported data. For example, many indicators of student achievement (e.g., retention and graduation rates, post-graduation employment success) are available on the FSU Institutional Research (IR) official website. When feasible, IPA and IR will provide additional data and analytic support to academic programs in need of custom reports and datasets. If an academic program chooses a special PO that is not tracked centrally, then information/data collected and/or generated by individual academic programs should be used. Here's an example of the Bachelor's Criminology PO assessment methodology:

• Describe how the assessment of the PO will be conducted: Assessment Process: 2-year graduation rates of undergraduate students who transferred to FSU and declared Criminology as their major will be assessed by dividing the number of transfer students who graduated from FSU by the end of their second year by the total number of transfer students in the original cohort. Graduation rates of transfer students will be retrieved from the Graduation/Retention reports published by the FSU Office of Institutional Research at https://ir.fsu.edu/graduation_retention_secure.aspx.

• Specify a measurable assessment standard that defines success: Goal/benchmark: Beginning with Summer/Fall 2020 cohort, at least 50% of transfer students in Criminology will graduate from FSU in two years.
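The calculation described in this example reduces to a simple cohort ratio. A minimal Python sketch, using hypothetical cohort counts rather than figures from the IR reports:

    # Illustrative sketch only; cohort counts are hypothetical, not IR data.
    # 2-year graduation rate = transfer students who graduated within two years
    # divided by the total number of transfer students in the original cohort.

    cohort_size = 120               # transfer students who declared the major
    graduated_within_two_years = 66

    rate = graduated_within_two_years / cohort_size
    print(f"2-year transfer graduation rate: {rate:.0%}")    # 55%
    print("Goal/benchmark met" if rate >= 0.50 else "Goal/benchmark not met")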

Recording and Analyzing Results

Guidelines for recording and analyzing results for a PO are similar to the guidelines for SLOs. At the end of each assessment cycle, academic programs either aggregate information/data collected internally or retrieve it from centrally maintained sources. The results are used to determine whether the PO was met or not. The culmination is a brief report with an analysis of why the Outcome selected at the start of the cycle was achieved at that particular level. A proper results statement for a PO is largely quantitative and provides ample methodological detail. FERPA guidelines should be followed when recording and analyzing POs.


Here’s an example of the Bachelor’s Sociology PO reporting of results and analysis:

• Present information regarding the levels at which the PO was achieved: Results Statement: Out of 28 instructional faculty in the program, 17 (61%) participated in at least one teaching workshop offered by the FSU Center for the Advancement of Teaching (CAT) in the 2018-19 academic year (Fall, Spring, Summer). Nine faculty members participated in more than one workshop. The goal/benchmark set for this PO in Summer 2018, to have at least 50% of instructional faculty participate in this kind of professional development, was achieved.

• Examine the reason(s) for the attained results: Analysis of Results: The reasons why almost two thirds of our faculty attended at least one teaching workshop offered by the CAT are our faculty’s desire to continuously improve their pedagogical knowledge and skills and the strong encouragement and support provided by the department Chair. At the departmental meeting before the start of the Fall semester, she informed the faculty of this opportunity and highlighted that teaching excellence is one of the priorities outlined in our college strategic plan (Goal 3) and the FSU Strategic Plan (Goals IV-V).

Formulating Improvement Plans

Guidelines for formulating improvement plans for a PO are similar to the guidelines for SLOs. The most intensive component of the assessment cycle is devising plans for, and committing to, continuous improvement. Formulating sound improvement plans requires participation, engagement, and meaningful contribution on the part of academic program leadership, instructional faculty, and curriculum committees. Whether a PO has been met or not, it is the responsibility of the program faculty and leadership to determine a plan of action for the next year. Occasionally, an Outcome does not meet the desired goal/benchmark. In this case, academic programs should provide reasons why these goals/benchmarks were not met and then develop an improvement plan. These plans should be well thought out and describe specific changes to be implemented, such as hiring new faculty, adding or removing prerequisite courses, or changing faculty service expectations. Improvement plans may also require new or modified assessment practices or selecting a different PO. In cases when a PO is being consistently achieved at a high level, it is recommended either to increase the desired goal/benchmark or to derive a new PO that addresses other important aspects of the University mission. If these changes are not feasible, academic programs should consider how they expect to maintain a high level of performance.


Most importantly, “Plans to make improvements do not qualify as seeking improvement, but efforts to improve a program that may not have been entirely successful certainly do.” (SACSCOC Resource Manual, page 68). Here’s an example of the Bachelor’s Hospitality Management PO improvement plan:

• Describe specific plans to improve or sustain performance: Improvement Plan: This past academic year we had some success in increasing the number of students who take entrepreneurship courses. Our academic program wants to continue building entrepreneurial and innovative mindsets and skills among our students. The program instructional faculty and the curriculum committee decided that instead of offering ENT 4114 Business Plan Design as an elective course, we will designate it as a required course and place it on the academic maps for both majors offered by our academic program (Global Club Management & Leadership and Hospitality & Tourism Management). We communicated our intent to the Dean of the Jim Moran College of Entrepreneurship, who agreed to open an additional section of this course in the next academic year. We are optimistic that this change will help our graduates be successful and will support FSU Strategic Plan Goal I and Performance-Based Funding Metric #10, which are aimed at increasing the number of graduates who have taken an entrepreneurship class.


Appendix A: Bloom’s Taxonomy Action Verbs from Stanny (2016)

Knowledge: arrange, choose, cite, copy, define, describe, draw, duplicate, identify, indicate, label, list, locate, match, memorize, name, order, outline, quote, read, recall, recite, recognize, record, relate, repeat, reproduce, review, select, state, tabulate, tell, underline, write

Understand: articulate, associate, characterize, cite, clarify, classify, compare, contrast, convert, defend, demonstrate, describe, differentiate, discuss, distinguish, estimate, explain, express, extend, extrapolate, generalize, give, give examples, identify, illustrate, indicate, infer, interpolate, interpret, locate, match, observe, organize, paraphrase, predict, recognize, relate, report, represent, restate, review, rewrite, select, summarize, tell, translate

Apply: act, adapt, apply, back/back up, calculate, change, choose, classify, complete, compute, construct, demonstrate, develop, discover, dramatize, employ, experiment, explain, generalize, identify, illustrate, implement, interpret, interview, manipulate, modify, operate, organize, paint, practice, predict, prepare, produce, relate, schedule, select, show, simulate, sketch, solve, translate, use, utilize, write

Analyze: analyze, appraise, break, break down, calculate, categorize, classify, compare, conclude, contrast, correlate, criticize, debate, deduce, detect, diagnose, diagram, differentiate, discover, discriminate, dissect, distinguish, divide, evaluate, examine, experiment, figure, group, identify, illustrate, infer, inspect, inventory, investigate, order, organize, outline, point out, predict, prioritize, question, relate, select, separate, solve, subdivide, survey, tell/tell why, test

Evaluate: appraise, argue, arrange, assess, attach, choose, compare, conclude, contrast, core, counsel, create, criticize, critique, decide, defend, describe, design, determine, discriminate, estimate, evaluate, explain, grade, invent, judge, manage, mediate, prepare, probe, rate, rearrange, reconcile, release, rewrite, select, set up, supervise, synthesize, test, value, verify, weigh

Create: arrange, assemble, categorize, choose, collect, combine, compile, compose, construct, create, design, develop, devise, estimate, evaluate, explain, facilitate, formulate, generalize, generate, hypothesize, improve, integrate, invent, make, manage, modify, organize, originate, plan, predict, prepare, produce, propose, rate, rearrange, reconstruct, relate, reorganize, revise, rewrite, role-play, set up, specify, summarize, synthesize, write


Appendix B: Assessment Components of Student Learning Outcome
Bachelor's Psychology Example

• Provide a succinct name for the SLO: SLO Name: Research Ethics.

• Identify what knowledge, skills or abilities students will have learned: Student Learning Outcome: Upon completion of the course of instruction, the student will be able to demonstrate knowledge of the ethical treatment of human and animal research participants/subjects.

• Assign proper categorization for the SLO (select one of the following: Discipline/Content Knowledge and Skills, Communication Skills, Critical Thinking Skills): SLO Category: Discipline/Content Knowledge and Skills.

• Describe how the assessment of the SLO will be conducted (relevant details may include information about who will assess student learning, in which course(s), during which semester(s), under what circumstances, and how the assessment instrument will be used): Assessment Process: We will assess this outcome by testing students in all sections of PSY 3213C (Research Methods in Psychology) offered during the academic year (Fall, Spring, Summer). PSY 3213 classes were chosen for this assessment because this is the "core" research methodology course for students with majors in Psychology. To assess this outcome, we will use a departmental exam that was written and is curated by our faculty. The exam is multiple-choice and is given at the end of the semester. The entire exam assesses five learning outcomes associated with knowledge of research methodology in psychology. Five items from the exam were designed to assess the Research Ethics learning outcome.

• Specify a measurable assessment standard that defines success: Goal/benchmark: We will consider that we have met our goal/benchmark when 85% of enrolled students demonstrate knowledge of the material by correctly answering at least three out of five items on Research Ethics.

• Provide information about the assessment instrument: Assessment Instrument: Department Assessment.


• Present information regarding the levels at which the SLO was achieved:

Results Statement: 225 (95%) out of 237 students enrolled in the PSY 3213C Research Methods in Psychology course in Fall 2018, Spring 2020 and Summer 2020 correctly answered at least three out of five items on Research Ethics. The goal/benchmark set for this outcome (at least 85% of students) was achieved.

• Examine the reason(s) for the attained results:

Analysis of Results: We hypothesize that the high level of student knowledge on this outcome is due to the fact that the main lecture part of the course and the lab part of the course that cover and emphasize research ethics are of high instructional quality. Students typically rate the course materials and the overall course quality highly on the SPOT forms. In addition, the instructors who teach this course are experienced and give special attention to this content area. Students repeatedly give high marks to the instructors for their explanation of the material and the overall teaching of the course.

• Describe specific plans to improve or sustain performance.

Improvement Plan: This outcome has been achieved significantly above the goal/benchmark for the last four academic years, ever since this outcome was selected. There were similar findings for other SLOs that were also assessed by the same department exam. We decided to review this assessment instrument as a committee. The committee consisted of instructors of PSY 3213 and members of the Undergraduate Studies Committee for the Department of Psychology. To establish the content validity of the test, the committee examined the items to verify that they reflected the intended learning goal. Our analysis showed that the department exam had good validity properties. As there were no validity issues found, we wondered if perhaps the exam was too easy. To investigate this possibility, we conducted an item difficulty analysis that showed there were several test items whose difficulty levels were very low (over 92% of students responded correctly). We decided to redesign the exam to test higher levels of learning. For example, a few items that measure lower levels of Bloom’s taxonomy (knowledge and understanding) will be rewritten to test middle levels (application and analysis). We plan to deploy the redesigned exam during the next academic year.
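An item difficulty analysis of the kind mentioned in this improvement plan is commonly computed as the proportion of examinees answering each item correctly. The Python sketch below is illustrative only; the scored responses are hypothetical, not the department's exam data.

    # Illustrative sketch only; the scored responses are hypothetical. Item
    # difficulty here is the proportion of students answering an item correctly;
    # values close to 1.0 flag items that may be too easy.

    # Each inner list is one student's scored responses (1 = correct, 0 = incorrect)
    # on five hypothetical exam items.
    responses = [
        [1, 1, 1, 1, 1],
        [1, 1, 0, 1, 1],
        [1, 1, 1, 1, 0],
        [1, 0, 1, 1, 1],
        [1, 1, 1, 1, 1],
    ]

    n_students = len(responses)
    for item in range(len(responses[0])):
        difficulty = sum(student[item] for student in responses) / n_students
        flag = "  <- may be too easy" if difficulty > 0.92 else ""
        print(f"Item {item + 1}: {difficulty:.2f} answered correctly{flag}")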


Appendix C: Assessment Components of Program Outcome
Bachelor's Spanish Example

• Provide a succinct name for the PO: PO Name: Spanish Language Class Size.

• Identify the improvement to be made at the academic program level: Program Outcome: The program will reduce average class size to facilitate engaged learning and the ability of students to connect with their instructors.

• Describe how the assessment of the PO will be conducted: Assessment Process: We will assess this outcome by counting how many undergraduate class sections with a capped enrollment of 19 students will be offered in the 2019-20 academic year. We will calculate the percentage of these class sections out of the total offered sections. We will use the university's official data reporting system (Oracle Business Intelligence) to retrieve data on the number of class sections and students enrolled past the drop/add week.

• Specify a measurable assessment standard that defines success: Goal/benchmark: 70% of class sections with enrollment of 19 or fewer students.

• Present information regarding the levels at which the PO was achieved:

Results Statement: 49 (73%) out of 67 undergraduate class sections offered by the Spanish program in the last academic year (Summer C 2019, Fall 2019, Spring 2020, Summer A 2020, and Summer B 2020) had enrollment of 19 students or fewer. The goal/benchmark set for this outcome was achieved.

• Examine the reason(s) for the attained results:

Analysis of Results: Reducing average class size was a university-wide initiative. Our program received financial and logistic support from the FSU administrative and academic leadership. Spanish program faculty and graduate teaching assistants successfully worked out new teaching schedules to accommodate the increased number of Spanish language sections.

• Describe specific plans to improve or sustain performance.

Improvement Plan: In order to continue offering many small-size class sections, our program will seek a permanent increase in funding to grow the number of graduate teaching assistants and to add one tenure-track faculty line.
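For a class-size outcome such as this one, the assessment reduces to counting capped sections against the total number of sections offered. A minimal Python sketch with hypothetical enrollment figures (not data retrieved from Oracle Business Intelligence):

    # Illustrative sketch only; enrollments are hypothetical, not data retrieved
    # from the university reporting system. Computes the share of sections with
    # 19 or fewer students and compares it to the 70% goal/benchmark.

    section_enrollments = [18, 19, 22, 15, 17, 19, 24, 12, 19, 16]

    small_sections = sum(1 for n in section_enrollments if n <= 19)
    share = small_sections / len(section_enrollments)

    print(f"{small_sections} of {len(section_enrollments)} sections "
          f"({share:.0%}) enrolled 19 or fewer students")
    print("Goal/benchmark met" if share >= 0.70 else "Goal/benchmark not met")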