
Authors: Elizabeth Bledsoe Piwonka, Alyce Odasso, Alicia Dorsey

2020-2021 ACADEMIC PROGRAM ASSESSMENT

ASSESSMENT REVIEW GUIDELINES

April 2020


Table of Contents

Foreword .......................................................................................................................... 3

Components of Academic Program Assessment ................................................................... 4

Academic Program Assessment Cycle at Texas A&M University ............................................. 6

Using AEFIS to Document Annual Program Assessment ......................................................... 7

Changes to Assessment Forms ............................................................................................ 9

Assessment Support ..........................................................................................................10

The Assessment Plan

Program Description .........................................................................................................11

Program Learning Outcomes ..............................................................................................13

Measures .........................................................................................................................15

Targets .............................................................................................................................18

The Assessment Report

Findings ...........................................................................................................................19

Data-Informed Actions ......................................................................................................22

The Assessment Reflections & Closing the Loop Report

Assessment Reflections .....................................................................................................24

Closing the Loop ...............................................................................................................26

Appendix

Information for Assessment Liaisons ..................................................................................29

Information for Department Approvers ..............................................................................35

Glossary ...........................................................................................................................37

2020-2021 Workflow Graphic ............................................................................................39


Foreword

The purpose of academic program assessment is for faculty to gather information about what and how students are learning, discuss that information as a faculty group, and use it to inform continuous improvement efforts within the academic program. By extension, these efforts aid in enhancing the educational experience for students, improving program learning outcome (PLO) assessment results, further developing students’ skills in the identified PLOs, and actively involving program faculty in the curricular assessment and quality improvement process. The information presented in each section of this manual defines Texas A&M University’s expectations for the documentation of PLO assessment. This “how-to” manual is designed to guide academic programs through the academic program assessment process, highlight best practices, and facilitate self- and peer-review of Assessment Plans and Assessment Reports.

As of Spring 2019, academic programs across TAMU document program assessment efforts in AEFIS (Assessment, Evaluation, Feedback, and Intervention System), an integrated, comprehensive, online assessment platform. Based on feedback from faculty and staff involved in the assessment process over the years, and in an effort to support a more manageable and meaningful assessment process, the following steps were taken with the implementation of AEFIS:

1. The documentation of academic program assessment has been segmented into three distinct components: (1) the Assessment Plan, (2) the Assessment Report, and (3) the Assessment Reflections and Closing the Loop Report;

2. The assessment review cycle has been extended to approximately 18 months, more reflective of a typical academic program assessment cycle (see pg. 40); and,

3. New strategies have been implemented to provide more timely feedback regarding ongoing assessment efforts.

These changes are briefly outlined in the following sections.


Components of Academic Program Assessment

Assessment Plan. The Assessment Plan, completed each spring semester, identifies which program learning outcomes (PLOs) will be assessed during the upcoming academic year, as well as the measures and targets that will be used to assess each PLO. Programs may identify as many PLOs as they see fit to assess in a given year as long as at least one is assessed each academic year.

Components making up the Assessment Plan:

• Program Description

• Program Learning Outcome(s)

• Measures

• Targets

Assessment Report. The Assessment Report, completed each fall semester, summarizes assessment findings or results gathered over the course of the previous academic year (as outlined in the previously established Assessment Plan). Also included in the Assessment Report are data-informed actions, which are based on the assessment results. These data-informed action(s) are changes which will be implemented in the future, and at least one data-informed action designed to strengthen/improve one of the assessed PLOs (i.e., a curricular change) is required each year.

Components making up the Assessment Report:

• Findings

• Data-Informed Actions

Assessment Reflections and Closing the Loop Report (and DE/Alternate Geographic Location Reporting, if applicable). Like the Assessment Report, this final report is completed each fall semester. All programs respond to two sets of prompts--Assessment Reflections and Closing the Loop. Where applicable, a third set of prompts related to the delivery of distance education programs or programs available at alternate geographic locations is included.

Assessment Reflections. Program Coordinators are asked to explicitly discuss the involvement of program faculty and leadership in the assessment process. Next, they are asked to reflect on the effectiveness and usefulness of assessment efforts undertaken during the previous academic year (as addressed in the Assessment Report). Anticipated changes to assessment strategies for the upcoming cycle (e.g., revised rubrics, expanded sources of data, updated targets, etc.) are also described in the Assessment Reflections section of this report.

Closing the Loop. Program Coordinators are asked to identify a previously implemented curricular change (i.e., a change implemented at least 2-3 years ago) designed to improve a specific PLO. In this narrative, program faculty summarize subsequently gathered assessment data used to determine whether or not the described change(s) led to improvements in the targeted PLO(s).

Distance Education and Alternate Geographic Location Reporting. Distance Education programs (i.e., those for which >50% of the curriculum is available via technology) and programs offered at alternate geographic locations (i.e., those which can be completed away from the “home” campus at an approved off-site teaching location) respond to an additional set of questions in the Assessment Reflections & Closing the Loop report. Programs are asked what data are regularly reviewed and how those data are used to ensure high-quality learning and a rich educational experience regardless of mode of delivery and/or geographic location. These prompts and guidelines for responding to them can be found in the Distance Education and Alternate Geographic Location Reporting Guidelines document, accessible from https://assessment.tamu.edu.


Academic Program Assessment Cycle at Texas A&M University

Academic programs engage in an approximately 18-month assessment process during which faculty assess the effectiveness of their programs in meeting identified PLOs. The specific dates vary slightly by year.

AY 2020-2021

NOTE: Assessment Plans are typically due each Spring semester; however, the AY 2020-21 Plan deadlines have been pushed to the Fall semester to better accommodate faculty as the University institutes temporary changes in response to the spread of COVID-19.

ASSESSMENT PLAN:

• September 4, 2020: DRAFT of Assessment Plans submitted to internal Assessment Liaisons for feedback

• October 2, 2020: Assessment Liaisons submit feedback to programs

• November 20, 2020: Final Assessment Plans submitted to OIEE¹

FALL 2020 – SPRING 2021: Data collection efforts ongoing as described in the AY 2020-21 Assessment Plan

ASSESSMENT REPORT:

• October 22, 2021: DRAFT of Assessment Reports submitted to internal Assessment Liaisons for feedback

• By November 12, 2021: Assessment Liaisons submit feedback to programs

• December 3, 2021: Final Assessment Reports submitted to DEPARTMENT APPROVERS

• December 17, 2021: Last date for Department Approvers to submit approved Reports to OIEE²

o OIEE will submit Assessment Report feedback to programs no later than January 31, 2022

o Programs should acknowledge feedback in AEFIS by February 28, 2022

ASSESSMENT REFLECTIONS & CLOSING THE LOOP REPORT (+ DE/Alternate Geographic Location Reporting):

• December 3, 2021: Report submitted to OIEE

o OIEE will submit feedback no later than January 31, 2022

o Programs should acknowledge feedback in AEFIS by February 28, 2022

AY 2019-2020

ASSESSMENT REPORT / ASSESSMENT REFLECTIONS & CLOSING THE LOOP REPORT (+ DE/Alternate Geographic Location Reporting):

• October 16, 2020: DRAFT Assessment Report submitted to internal Assessment Liaisons for feedback

• By November 30, 2020: Feedback submitted to programs

• December 9, 2020: Final Assessment Reports submitted to DEPARTMENT APPROVERS

• December 18, 2020: Last date for Department Approvers to submit approved reports to OIEE AND deadline for programs to submit the Assessment Reflections & Closing the Loop Report (and DE/Alternate Geographic Location prompt response, if applicable)

¹ Office of Institutional Effectiveness & Evaluation

² Department Approvers are encouraged to review the Assessment Reports as soon as they are submitted by Program Coordinators in order to ensure that Program Coordinators have enough time to make requested revisions.


Using AEFIS to Document Annual Program Assessment

Getting Started

Faculty and staff who are responsible for the submission of Assessment Plans/Reports are called Program Coordinators in AEFIS. Program Coordinators use their NetID and password to log in to AEFIS (tamu.aefis.net). Program Coordinators may refer to the AEFIS User QuickStart Guide for a step-by-step walkthrough of logging in, accessing, and submitting assessment forms. This visual guide also includes useful tips and things to remember as program assessment is documented using the AEFIS system.

Accessing Program Assessment Forms

Assessment forms assigned to Program Coordinators will appear in the Action Items list on the right side of the browser after logging in. Click the blue pencil icon to edit the information in the form. If the Action Items list does not automatically appear, it can be accessed by clicking on the bell icon at the top right of the screen.

Please pay particular attention to the academic year listed on the form in which you are working. At any given time, there are Program Assessment forms available for two assessment cycles—the cycle for which the Plan is being documented and the cycle for which the assessment is occurring and the Report is being documented. Sometimes both forms will be visible in the Action Items list. Program Coordinators should verify they are working in the correct form.

Upon opening the 2020-21 assessment form, Program Coordinators will find information is already entered in some fields. The following information has been pre-populated in the 2020-21 forms:

• The mission statement from the 2019-20 form (pre-populated into the “Discipline-specific purpose” text box in the Program Description section)

• All Program Learning Outcomes (PLOs) entered into the 2018-19 and 2019-20 forms

• All Measures and Targets from the 2019-20 form under each PLO

Submitting Program Assessment Forms

Throughout the assessment documentation cycle, Program Coordinators will submit the Plan twice, the Report twice, and the Assessment Reflections & Closing the Loop Report once (see pg. 40 for a graphic representation of the assessment cycle). When the 2020-21 Plan is submitted for the first time, it is sent to the Assessment Liaison for internal feedback. Simply click the “I’m Finished, Submit” button at the bottom of the form to do so.

For all submissions after the initial submission, an additional step is required. There are two additional buttons above the “I’m Finished, Submit” button: “Approve Form” and “Reject Form.” In order to successfully submit the form, “Approve Form” must be selected first. This button indicates the form should move to the next step in the workflow. The “Reject Form” button indicates the form should move back a step in the workflow. Program Coordinators will likely not use the “Reject Form” button very often, if at all (for example, it may be used in rare cases when the Assessment Liaison asks for the form to be sent back to them).


NOTE: After receiving feedback on the Assessment Plan from OIEE, Program Coordinators may update the Program Description, PLOs, Measures, and/or Targets as they see fit. However, the form should NOT be submitted again after making revisions. The form will not be submitted again until the Assessment Report (Findings and Data-Informed Actions) is due the following fall semester. Simply use the “Continue Later” button to save any changes made to the form.

Once the Program Coordinator submits a form, it will not display on the Action Items list; however, it is still accessible to the Program Coordinator from the dashboard widget labeled My Data Collection Forms.

Email Notifications

When feedback is sent to Program Coordinators—whether from the Assessment Liaison or OIEE staff—the system automatically sends an email notification indicating that an assessment form is available on the Program Coordinator’s Action Items list with feedback. The sender of these notifications is listed as “The Office of Institutional Effectiveness & Evaluation,” but the notifications are sent automatically by the AEFIS system. Please read these email notifications carefully as they provide important information such as who provided feedback, next steps and future deadlines, and technical information about the AEFIS system.

NOTE: If you have a student email address (@email.tamu.edu) in addition to your work email address (@tamu.edu), you may need to forward these notifications from your student account to your work account. AEFIS receives a nightly update from the University’s Student Information System, during which student email addresses overwrite work email addresses (even those that have been manually entered). Therefore, if you do not believe you are receiving these notifications, please check your student email account and set up the forwarding service.


Changes to the Assessment Forms

Program Coordinators who completed assessment forms in AEFIS for the 2018-19 and 2019-20 cycles will notice some changes have been made to the form fields for the 2020-21 cycle.

The table below summarizes field-specific changes by section of the Plan/Report. The prompts for the Assessment Reflections, Closing the Loop, and Distance Education Reporting sections have also been updated for clarification. Please see the relevant sections of this manual for details about those prompts.

A significant change in the 2020-21 form deals with the selection of PLOs to assess in the upcoming academic year. In previous cycles, Program Coordinators were still able to see all of the PLOs entered into the assessment form after OIEE provided feedback on the Assessment Plan, including those that were not selected. As of the 2020-21 cycle, only PLOs submitted in the final Assessment Plan (i.e., Step 3 of the workflow; see pg. 40) will be visible to Program Coordinators at later steps. Please see the related FAQ in the Program Learning Outcomes section of this manual for more information.


Assessment Support

Academic programs and internal Assessment Liaisons are supported in their assessment efforts by OIEE assessment consultants. Each academic college/branch campus is assigned to one of the following assessment consultants within OIEE:

• Alicia Dorsey ([email protected])

• Elizabeth Bledsoe Piwonka ([email protected])

• Alyce Odasso ([email protected])

The colleges, schools, and branch campuses served by these consultants are: College of Dentistry; College of Architecture; College of Agriculture & Life Sciences; College of Medicine; Bush School of Government & Public Service; College of Education & Human Development; College of Nursing; College of Engineering; College of Geosciences; College of Pharmacy; College of Liberal Arts; Mays Business School; College of Veterinary Medicine and Biomedical Sciences – professional DVM program; Texas A&M University – Qatar; College of Science; School of Law; College of Veterinary Medicine and Biomedical Sciences – Undergraduate and Graduate programs; School of Public Health; Texas A&M University – Galveston.

OIEE staff work closely with the college/branch campus Assessment Liaisons throughout the year to ensure training, expectations, and deadlines are clearly communicated to all programs. Requests for additional training or questions can be submitted to [email protected]. Program Coordinators and program faculty involved in the assessment process are welcome to contact the relevant consultant listed above for guidance on assessment efforts.

On the following pages, each section of the Assessment Plan and Report is outlined in detail. Prompts for Assessment Reflections, Closing the Loop, and DE/Alternate Geographic Location reporting are also provided and explained. Please use this manual as a guide as the program works to plan and document the assessment process.


Program Description

The Program Description section of the Assessment Plan outlines the discipline-specific purpose and focus of the program(s) to be assessed. A strong program description communicates the skills and knowledge students will have at the end of their experience and what students will be prepared to do with these skills and knowledge after graduation. Along with the description, programs are asked to provide information about the following³:

1. Geographic location of delivery

Programs should clearly state the physical geographic location of program delivery. This refers to the campus (College Station, Galveston, Qatar) and/or the alternate geographic location (e.g., City Centre in Houston, HSC in Bryan, Dallas, McAllen, etc.). If the program is available at multiple geographic locations, please enter each site, separated by a comma or semicolon.

NOTE: Fully online programs (i.e., programs that do not offer any of the curriculum face-to-face) should simply state “delivered via technology” in this text box.

2. Mode of delivery

Programs are prompted to select one of three options:

(1) Face-to-Face: Most of the curriculum (>50%) is only available to students in person.

(2) Via technology (>50% of the curriculum): More than half of the curriculum is available to students through asynchronous web-based delivery and/or through synchronous delivery where content is delivered in real time but the instructor and student(s) are in different geographic locations.

(3) Both: The program is offered both face-to-face and via technology. That is, students can complete >50% of their degree requirements either face-to-face or through the technological means defined above.

3. Format of delivery

Programs are prompted to select one of three options:

(1) Synchronous: Instruction is available to and accessed by students in real time with the instructor.

(2) Asynchronous: Instruction does not occur in real time. Instructors provide content which the student can access via technology.

(3) Both: Programs utilizing both synchronous and asynchronous methods should select this option.

³ The Program Description section does not have specific feedback criteria. Only qualitative feedback is provided.


NOTE: In cases where two or more programs with different modes and/or formats of delivery are included in the same Assessment Plan, please follow these instructions:

1. In the Program Description text box, clearly state the geographic location, mode of delivery, and format of delivery for each credential included in the Assessment Plan. This is in addition to the discipline-specific purpose and focus of the program.

2. For Mode of Delivery and Format of Delivery, please select the option that applies to the Distance Education program included in the Assessment Plan. If the Assessment Plan includes more than one Distance Education credential with differing modes and formats of delivery, DO NOT select “Both” from the drop-down menu. Choose only one of these credentials and select the options that apply to that credential. In the Program Description text box, be sure to clearly state the credential to which these selections apply.

FAQs

Q: The Assessment Plan includes two (or more) academic programs--how should I respond to the Program Description prompts?

A: In cases where more than one program is included in the Assessment Plan (i.e., two or more similar programs of the same degree level or two or more programs of the same discipline with different degree levels), please be sure to address the purpose and focus of each individual credential. In addition, clearly list the geographic location, mode, and format of delivery for each program in the Program Description text box. Please see the NOTE above for more detailed information.

Q: What should be entered for “geographic location of delivery” if the program is only offered 100% online?

A: “Delivered via technology” is the recommended response.


Program Learning Outcomes

All programs are expected to establish a minimum of three program learning outcomes (PLOs) as part of the program’s comprehensive Assessment Plan. However, programs need only select and assess one PLO annually. PLOs should represent the knowledge and skills students are expected to possess upon graduation. Feedback on PLOs will be based on the presence or absence of the criteria described below.

Feedback Criteria

1. Outcome is clearly written, reflecting what students are expected to have learned upon completion of the program

To an external party reading the Assessment Plan it should be clear what knowledge and/or skills a graduate of the program is expected to possess. A strong PLO is written in clear, straightforward terms and is appropriate to the level of the degree offered.

2. Outcome is measurable

Simple and specific PLOs are the most straightforward to measure. Consider the following:

• Use of action verbs—such as those found in Bloom’s Taxonomy—to define specific expectations makes measurement of student achievement more concrete.

• Compound and/or complex outcomes containing multiple skills are often difficult to fully capture, particularly if only one or two measures are used. PLOs focusing on a single skill are easier to measure comprehensively.

• PLOs related to students’ beliefs and values are not directly measurable and should be avoided (e.g., Students will develop a greater appreciation of a differing culture).

3. Outcome is mapped appropriately to university-level learning outcome(s)

Texas A&M University (TAMU) has identified university-level learning outcomes which describe the knowledge and skills undergraduate, master’s, and doctoral students should possess upon graduation from TAMU. Additionally, the Texas A&M University System (TAMUS) has identified learning outcomes (called EmpowerU outcomes) that apply to undergraduate degree programs and undergraduate-level certificates. Programs should appropriately document how their PLOs align with each set of outcomes using the Relevant Associations feature in the assessment forms.

FAQs

Q: The program has external accreditation requirements for the discipline which require that specific PLOs be assessed. Can we use those outcomes in this Assessment Plan?

A: Yes, in fact we encourage accredited programs to ensure close alignment between the annual program assessment process and program accreditation requirements.


Q: Do we have to measure the same outcomes every year? / Can we measure the same outcomes every year?

A: Program faculty should guide the assessment process, including determining which outcomes are measured and when. Some programs place their learning outcomes on two- or three-year rotations, focusing on just one or two PLOs in a given academic year. However, assessment planning should be an intentional process. In some cases, this might mean measuring the same PLOs every year, and in others this might mean measuring PLOs on a rotation. Even programs that assess their outcomes on a planned rotation might need to deviate from their rotation from time to time. Again, these decisions should be driven by faculty and the observations they make.

Q: Can this Assessment Plan include program objectives like participation in events, publication productivity, etc.?

A: The primary purpose of the assessment reporting process is to document student learning. Other objectives (e.g., tracking the number of manuscripts submitted by students) may be included as part of the Assessment Report as additional objectives; however, programs should ensure they are meeting the minimum expectation of reporting assessment results for one Program Learning Outcome annually, and that any programmatic objectives are in addition to PLO(s).

Q: For the associations with the university-level learning outcomes, is it better to indicate all that are somewhat associated or to be more selective?

A: The associations should be as closely aligned as possible; that is, any given PLO should only be associated with the university-level learning outcome(s) it most closely resembles. If two associations are closely related to the PLO, both may be selected. One purpose of the associations is to demonstrate how the program is assessing the university-wide outcomes through its annual assessment practices.

Q: I received feedback from OIEE on my 2020-21 Assessment Plan. I can’t see all of my outcomes anymore. What happened?

A: This is a new characteristic of assessment forms after the final Assessment Plan has been submitted (i.e., after Step 3 in the workflow; see pg. 40 for a graphical representation of the workflow). Whereas in the past Program Coordinators have had access to all of their PLOs even through the Report stages of the workflow, this is no longer the case. The reason for this limited view is to encourage meaningful and intentional assessment planning. Program faculty should discuss their intentions for the upcoming year of assessment well before the assessment takes place.


Measures

A measure describes the methods of collecting and evaluating assessment data. A strong measure description makes the assessment strategy easy to follow for an external party who is not intimately involved in the day-to-day operations of the program. The Measures section can be thought of as a miniature methods and data analyses section of a research paper—it is a description of methods used to gather and analyze assessment data.

For program learning outcomes (PLOs), there are two types of measures: direct and indirect.

Direct measures require students to demonstrate and display their competency or ability in some way that is evaluated for measurable quality by an expert such as faculty, assessment professionals, internship supervisors, or industry representatives. Some examples of direct measures include:

• Written assignments, oral presentations, portfolios, or demonstrations to which a rubric or other detailed criteria are applied

• Exam questions focused on a particular PLO

• Scores on standardized exams (e.g., licensure, certification, or subject area tests)

• Employer or internship supervisor ratings of student performance

• Other assignment grades based on detailed criteria

Indirect measures provide secondhand information about student learning. Whereas direct measures are concerned with the quality of student work as it relates to certain PLOs, indirect measures are indicators that students are probably learning. Often, indirect measures are too broad to represent the achievement of specific PLOs. Some examples of indirect measures include:

• Tasks tracked by recording completion or participation rates

• Completion of degree requirements

• Number of students who publish manuscripts or give conference presentations

• Survey questions students answer about their own perception of their abilities

• Job placement

• Course grades and some exam grades (see FAQs below)

• GPAs

• Course enrollment data

All PLOs need at least one direct measure. Indirect measures may supplement direct measures, but the focus of the Assessment Plan should be on the direct measurement of PLOs. Measures will be evaluated based on the presence or absence of the criteria described below.

Feedback Criteria

1. Measure is a direct measure of student learning and clearly aligns with the outcome

A direct measure of student learning requires direct evaluation, meaning the quality of student work or the student’s demonstration of that specific skill should be observed and clearly reported. Programs should also ensure alignment between outcomes and their measures. For example, if the PLO states students will articulate certain discipline-specific concepts, the measure should describe a written or oral activity through which students discuss those concepts (versus identifying the concepts on a multiple-choice exam, for example).

2. The data collection is clear (i.e., where the data are coming from)

The external reviewer of the Assessment Plan will not have an in-depth understanding of the program and its curriculum, so it is important to clearly communicate where data are coming from by including the following information, where appropriate: the course designation, the point in the curriculum when the data is collected, who collects the data, sampling and/or recruiting methods, etc.

3. Methodological processes are clear (i.e., how the data will be evaluated and reported)

This criterion is concerned with measurement and data analysis. The measure description should provide a clear picture of how the data are evaluated and reported, specifically for assessment of the program learning outcome. The program should address how the data collected through the assessment process are aggregated and analyzed for a program-level discussion about student learning.

4. All referenced rubrics/surveys are attached or sufficiently described

Programs should attach instruments used in the assessment process (i.e., rubrics, prompts, surveys, exam items, etc.) as supporting documentation in the Assessment Plan when feasible. Examples of instruments that programs are not expected to attach as supporting documents are standardized tests (such as certification exams or third-party assessment instruments) and qualifying/preliminary examination prompts.

FAQs

Q: Why aren’t course grades and GPAs considered direct measures?

A: Final course grades and GPAs are very broad metrics. PLOs are much more specific. Although a great deal of instruction in a particular course might focus on developing skills related to a PLO, the course likely addresses a variety of other skills and knowledge as well. It is difficult to determine how much of a course grade or GPA is actually due to how well students demonstrate a single, specific learning outcome. Furthermore, many course grades also take into account activities like class participation and attendance which further complicate how much a course grade actually captures a specific learning outcome.

Q: Why aren’t exam grades considered direct measures?

A: Similar to course grades (see above), comprehensive or multi-unit exam grades are often broad. However, some exams are focused on specific topics and/or skills, which can make them candidates for use in direct assessment. When it comes to exam scores, program faculty should always consider which data would be more meaningful: a comprehensive exam grade or performance on specific exam items (e.g., essay questions or a grouping of multiple-choice items).


Q: Should we use more than one measure? Do we have to use more than one measure?

A: Consider this—you wouldn’t award a diploma based on a single exam grade. Relying on a single measure to capture collective student performance on a PLO will provide only limited information about the extent to which students are achieving that PLO. Programs are strongly encouraged to use more than one measure to assess student ability as this will provide a more complete picture of the curriculum and what student learning in the program entails. As a byproduct, use of multiple measures will facilitate conversations about continuous improvement.

Q: We received previous feedback about using a comprehensive rubric score as evidence of a PLO. Why is this a problem?

A: Aggregating rubric scores is not inherently a problem. Many multi-criteria rubrics exist, all of which might directly relate to the overall learning outcome (for example, AAC&U rubrics). However, depending on what the program wishes to accomplish it might be more useful to report the results for each rubric criterion separately. Breaking down the results like this can uncover gaps in learning that might not have been as obvious in the aggregate score alone. More granular results also make continuous improvement opportunities easier to identify. Additionally, some rubrics may include criteria for unrelated skills. For example, the rubric to evaluate a research paper might include criteria for the Literature Review, Methods, Analysis, and Discussion, but then also include a criterion for Grammar, Syntax, and Mechanics. If the PLO is specific to research skills, only scores from the research-related criteria should be reported.


Targets

A target is the level at which a program considers its program learning outcome (PLO) to be “met” or achieved on a given measure. Strong targets are clear, static levels of achievement. Targets will be evaluated based on the presence or absence of the criteria described below.

Feedback Criteria

1. The standard is clearly presented

“Standard” refers to the minimally acceptable student performance. For example, for a measure utilizing a 6-point rubric criterion, the standard might be that students earn at least 4 out of 6 points on the criterion. The standard should be meaningfully selected; it might be more or less rigorous depending on the degree level or on past student performance. Program faculty should collectively determine appropriate standards.

2. The proportion of students expected to meet the standard is clearly stated

The program should determine how many of their students are expected to meet the set standard. For particularly rigorous standards, the proportion of students expected to meet the standard might be lower than for a more conservative standard. This proportion should be collectively determined by program faculty.

3. The target clearly aligns with the outcome and measure

The target(s) for a given measure should align with the measure description by using consistent language and format. In cases where a rubric is used, make sure the target statement specifically refers to the minimally acceptable performance level on the rubric. It also helps to refer back to the specific PLO being assessed. For example: 80% of students will earn ‘Accomplished’ or ‘Exemplary’ on the rubric criterion specific to written communication skills.

FAQs

Q: What are some examples of strong targets that include a specific proportion of students expected to meet the standard?

A: Here are some examples of strong targets:

• 85% of students will earn at least 7 out of 10 points on the critical thinking essay question.

• 100% of students will achieve the ‘Competent’ threshold on the Content Development rubric criterion.

• 70% of students will score above the 80th percentile on the ACS standardized exam.


Findings

Findings are the results from the analysis of assessment data. Strong Assessment Reports will consistently communicate findings in a clear manner that aligns with the language of the related measure and target. In addition to the finding statement itself, programs should select the appropriate designation (called a target status indicator) from the provided menu—whether the target was Met, Not Met, or Partially Met. Please see the FAQs section for information about the appropriate use of Partially Met.

If there are no findings to report for a given measure/target, programs may select one of the two other options in the target status indicator dropdown menu⁴: (1) No students enrolled or (2) No data collected/reported. Appropriate use of each is briefly described here:

• No students enrolled: Select this option if there were no students enrolled in the program during the academic year for which the Report is being prepared.

• No data collected/reported: This option might be selected for a number of reasons. Most often it will be selected if there are fewer than 5 students enrolled in a given academic year, or if there are fewer than 5 students at the point in the curriculum where assessment data is collected. Programs are not required to report assessment results for fewer than 5 students. If this option is selected, it MUST be accompanied by a brief explanation. Programs that experience consistently low enrollment should refer to the FAQs section for more information.

Findings will be evaluated based on the presence or absence of the criteria described below.

Feedback Criteria

1. Findings align with the measure and target as described

Language used in the finding statements should mirror the language used in the measure and target descriptions. The format in which the results are reported should clearly relate to the process(es) as outlined in the measure and target descriptions.

2. Target status indicator is used appropriately

Target status indicators are used to indicate whether the target was Met, Not Met, or Partially Met. In addition to selecting target status indicators, the finding statement should support the selected indicator. If no findings are reported, either No students enrolled or No data collected/reported should be selected (with an explanation accompanying the latter selection).

3. Findings include a brief discussion regarding the meaning/value of results for purposes of continuous improvement

⁴ Please note that these two options do not appear in the 19-20 Report forms. For the 19-20 Report, please use the Findings text box to communicate that no students are enrolled or why no data was collected/reported.


Finding statements should go beyond simply reporting results; they should also include an explanation and/or reflection about the practical significance of the results. This can be achieved by comparing the current findings to those of previous years, by discussing what was surprising or affirming about the results, or by further drilling down into the data to discover more granular information.

4. Where appropriate, findings are disaggregated (i.e., by program, by mode of delivery, by geographic location)

Provided here are some examples of Assessment Reports in which results should be disaggregated:

• The Assessment Plan includes two programs with different credentials (e.g., MS/PhD or MS/MEd combined in a single Plan)

• The Assessment Plan includes two or more programs with different modes of delivery (e.g., a Distance Education program in the same Plan as a face-to-face program)

• The Assessment Plan includes a single program, but that program is offered both face-to-face and via technology OR face-to-face in two different geographic locations

FAQs

Q: What does ‘Partially Met’ mean and when should it be used?

A: Partially Met should be used only when reporting findings for compound/complex targets. For example: A program uses a four-criteria, four-point rating scale rubric to measure written communication. The target states that 80% of the students will achieve a score of ‘3’ or higher on all criteria of the rubric. The results show that 83% of students achieved ‘3’ or better on two of the criteria, but only 75% achieved a ‘3’ or better on the other two criteria. This Target would be Partially Met. Partially Met should not be used if the target was only close to being met.

Q: There is consistently low enrollment in the program—can we always select ‘No data collected/reported’ if there are fewer than 5 students on which to report assessment results?

A: No. Only programs that occasionally experience low enrollment or only occasionally assess fewer than 5 students should select this option. Programs with consistently low enrollment must utilize other methods of reporting results. We recommend combining current assessment results with assessment results from the past two or three cycles in which the same measures were used. This creates a larger sample and results in more data on which to base continuous improvement efforts.

Q: All of the targets are met. If we just say “This is an indication that our students are performing well,” will the program meet the criterion of discussing the meaning/value of results for purposes of continuous improvement?

A: Saying the findings are an indication that students are performing well is essentially the same as indicating the target is Met. The finding statement should go one step further by contextualizing the results. This can be done in multiple ways (see Feedback Criterion #3), but one of the most powerful ways to discuss the meaning of results for purposes of continuous improvement is to describe the longitudinal trend. How have students performed on this outcome/measure over the past few assessment cycles? Is progress being made? If not, to what might faculty attribute this trend?

Q: How should finding statements be structured?

A: There is not a prescribed template all finding statements must follow. However, the following is a template programs might find useful:

• First sentence: Present the assessment results in the context of the measure (e.g., 85% of students achieved at least 3 points on the ‘Written Communication’ rubric criterion).

• Second sentence: Reiterate the target, stating whether it was met, not met, or partially met (e.g., The target of 80% achieving at least a 3 was met).

• Third sentence: Contextualize the results by discussing longitudinal data trends, presenting other supporting data (if available), and/or by reflecting on whether results were surprising or affirming and why.


Data-Informed Actions

Data-informed actions are the actions the program intends to take in response to the assessment results. These actions should have a close, clear connection to the data collected during the assessment reporting cycle. Every program is expected to submit at least one data-informed action that fulfills the criteria below. In addition, programs are expected to address the use of results for each individual finding statement. See the FAQs section for additional information.

Feedback Criteria

1. Course of action is designed to improve/strengthen student learning

Data-informed actions should clearly articulate a specific course of action designed to improve future assessment results for a targeted PLO. There should be enough detail included that an external reviewer is able to understand what specific changes are being made to effect positive change in achievement of a specific PLO. See FAQs for examples of appropriate actions/curricular changes to improve PLO results.

2. Action includes specific implementation details (e.g., timeline, responsible party, etc.)

Including a timeline and identifying the responsible party demonstrates the action has been carefully considered and implementation has been discussed amongst program faculty. Consider including an estimate of when the impact of the action might first be observed in assessment results.

3. Action description addresses why the program believes the action will lead to improvements in learning

Data-informed actions should identify how and/or why the program believes the action will improve student learning. This might be a description of how the action will directly affect students, how the action addresses identified deficiencies contributing to current assessment results, or why faculty believe this action will help improve the program and the PLO results overall.

FAQs

Q: How specific does a data-informed action need to be?

A: The development of a data-informed action should be a collaborative decision-making process involving program faculty. It should reflect a specific, systematic response to the findings. This does not mean the action must be a large, resource-demanding overhaul to the program/curriculum. Rather, the action should be specific, identifiable, and should be able to be implemented in a systematic and intentional way. Listing possible curricular changes without committing to a specific action does not demonstrate a clear and intentional use of assessment data for improvement.

Examples include, but are not limited to:

• A course-level adjustment at any point in the curriculum

• New text, assignments, etc.


• Guest lecturer in a specific course

• New programming or activities designed to enhance and improve PLO results

• A pilot program

• Prerequisite or other curriculum-based adjustment

• Changes to assignment requirements

• Changes to meeting requirements

• Changes to advising strategies

• Additional required trainings for faculty, staff, or students

Q: Can a data-informed action focus on a change to the program’s assessment strategies?

A: It is expected that at least one data-informed action is a curricular change designed to improve student learning. Changes to assessment strategies and/or to the overall assessment process do not fit this criterion. Programs should primarily discuss changes to assessment strategies in the Assessment Reflections & Closing the Loop Report.

Q: Do we have to enter a data-informed action for every finding?

A: Text responses are required in all data-informed action text boxes, meaning the program should have a response to all of the reported findings. If the program establishes a data-informed action for only one finding, responses to the other findings might be less involved. Here are a few examples: (1) the program will continue monitoring student achievement on the PLO; (2) the program will continue collecting data for X number of cycles in an effort to identify a specific trend in the data; (3) the program will continue to gather data until there is sufficient data for analysis.

Q: How do we write a data-informed action when all of the targets are met?

A: Met targets are a sign the program is functioning well and the established PLOs are achievable. It does not mean, however, that all of the work is done and there is no further need for assessment or attention to continuous improvement. Therefore, the program should still consider how the collected data can inform continuous improvement efforts. Strategies for continuous improvement based on met targets include, but are not limited to:

• Drilling down into the results further, perhaps by demographics, course section, or some other dimension in an effort to identify possible gaps or disparities

• Adjusting the target

o If the program follows this strategy, it is critical to include a discussion of what action the program will take in order to help students meet the new target. This keeps the focus of the data-informed action on a curricular change rather than simply on updating the target (which would be considered a change to the assessment strategy and should be documented in the Assessment Reflections & Closing the Loop Report).


Assessment Reflections & Closing the Loop Report

This report serves two main purposes. It is an opportunity for programs to demonstrate: (1) that program assessment is a faculty-owned and faculty-driven process and (2) how they engage in the most important part of the assessment process—closing the loop on program learning outcome results.

In addition to these two sets of prompts, Distance Education programs and programs offered at alternate geographic locations are asked to discuss what the program has learned about the educational experience and quality of learning for students in these unique modalities/locations. This additional section of the report is covered in detail in the Distance Education and Alternate Geographic Location Reporting Guidelines document, accessible from https://assessment.tamu.edu.

Assessment Reflections

In this section programs reflect on the assessment processes and practices they have employed over the course of the last year. Specifically, programs should address each of the following:

1. PROGRAM LEADERSHIP AND FACULTY INVOLVEMENT: How are program leadership and faculty involved in the sense-making of assessment data and decisions regarding continuous improvement efforts?

2. CHANGES TO ASSESSMENT PRACTICES: Think about the assessment cycle you just completed and the challenges you faced (or potential challenges you face) in using the data you collected. What could the program do differently next time to ensure the data that is gathered and reported continues to be useful in guiding curricular/programmatic change?

Be sure to include the following:

i. Are there changes you need to make regarding what kind of assessment data is gathered? (Is it the right data?)

ii. Are there changes you need to make regarding how data are analyzed and/or reported so that they are useful for continuous improvement? (Is the data specific enough to guide changes?)

Feedback Criteria

1. The role of faculty and program leadership in assessment is sufficiently described

Program assessment should be a faculty-owned and faculty-driven process. Individuals who hold leadership positions in the program/department should also be involved in some capacity. Responses to the first prompt should describe the role of the program’s faculty and leadership throughout the assessment process. To guide the response, consider the following questions: At what stage(s) of the assessment process were faculty and program leadership involved? In what capacity? What role did they play in data sense-making and in the decision-making processes related to continuous improvement and future assessment?


2. Response includes considerations of the quality and utility of the assessment data for continuous improvement

Programs should reflect on whether the data they collected is meaningful and/or sufficient to guide continuous improvement efforts. For example, if 100% of students meet a target every year, is it useful to keep that target/measure in the Assessment Plan? Could faculty learn more from reporting the average result for each individual rubric criterion versus the average comprehensive rubric score? These are the kinds of questions to consider. However, responses to this prompt might also include a variety of other considerations: the need for establishing (or more frequently convening) an assessment committee, revising the PLOs, changing the timing of assessment data collection, etc.

3. Changes to assessment methodology are sufficiently described

Based on lessons learned throughout the assessment process (including the considerations presented above), programs should clearly state what changes, if any, are being implemented with regard to the way the program approaches PLO assessment. Describe the concrete steps or new processes that are being implemented in response to what the program learned about their assessment practices.

FAQs

Q: What level of detail should be provided with respect to the role of faculty and program leadership in assessment?
A: Programs should provide a brief narrative about the breadth and scope of faculty involvement, given that assessment is a faculty-driven process. The response does not need to include specific names but should be detailed enough to capture the overall process. This could be done by attaching and briefly describing minutes from a program faculty meeting where assessment data were discussed and data-informed actions identified. When in doubt, err on the side of providing more detail rather than less.

Q: If we created a data-informed action that addresses changes to the program's assessment process, do we provide the same information here?
A: Actually, such actions should be reported here and not as a data-informed action in the Assessment Report. Data-informed actions should be focused on curricular changes, not changes to assessment practices. Any planned changes to assessment practices (e.g., changing rubrics, revising surveys, redefining outcomes, changing how data will be analyzed) should be described in this section.


Closing the Loop

Programs are asked to reflect on the impact of a previously implemented data-informed action OR some other curricular change specifically designed to strengthen a program learning outcome (PLO). The identified action/curricular change should address a learning outcome for which assessment data has since been collected (i.e., after the change has been implemented and program faculty could reasonably see whether the change made a difference or not).

1. PREVIOUSLY IMPLEMENTED CHANGE: What curricular change did you make (approximately 2-3 years ago) in an attempt to improve a specific program learning outcome? Be explicit about:

i. the specific program learning outcome;
ii. the specific original assessment results that prompted the change (i.e., quantitative or qualitative findings); and,
iii. the nature of the change.

2. FINDINGS: Did the change make a difference? Be sure to include:
i. what data you gathered;
ii. what the specific new assessment results were (i.e., quantitative or qualitative findings);
iii. whether the data suggest the change made a difference in the program learning outcome; and,
iv. what implications there are for future programmatic/curricular changes.

Feedback Criteria

1. Targeted program learning outcome and assessment findings that prompted the development of the action/change are described

Indicate the specific PLO that was the focus of the implemented data-informed action/curricular change. Briefly describe the assessment findings on which the action/change was based. This should include a short description of the measure from which the results were derived. Be sure to state the specific findings, avoiding vague statements such as “the target wasn’t met.”

2. Action/change that was implemented is described (including contextual information)

Provide a brief but descriptive summary of the previously implemented data-informed action/curricular change. It should be clear what specific action was taken, when the action was taken, and who was involved in the implementation process.

3. Subsequently gathered assessment data used to determine whether the action/change led to an improvement in the program learning outcome are summarized

Clearly state the specific results of the subsequent PLO assessment and how these results compare to the previously gathered data (as discussed in response to Part 1 of this prompt). In doing this, the program may wish to describe the methodology that was used (e.g., describe the scale or rubric if the result is a mean situated on a scale). The results should be clear enough that an external party reading the report would have no unanswered questions about the interpretation of the results.


4. Implications of subsequently gathered assessment data are discussed

Consider the impact the data-informed action/curricular change may have had on the learning outcome results. Whether or not results improved, reflect on what role the action/change may have played and discuss how the program aims to further improve outcome achievement in the future.

FAQs

Q: What if there was no improvement in the targeted outcome(s)?
A: The purpose of this process is to examine whether or not program changes made a difference. In cases where improvement was not observed, this is valuable information in and of itself. Reflect on what might be done differently in the future to guide improvement.

Q: What if we don't have any follow-up results yet?
A: The only action/curricular change discussed here should be one that has either been fully implemented or implemented enough that program faculty could reasonably tell if it made a difference in PLO results. "Closing the Loop" means being able to provide a complete narrative about actions the program has taken and whether those actions made a difference. However, see the rest of the FAQs for exceptions.

Q: What if the program is brand new/new to being formally assessed?
A: If a program is only 2 or 3 years old, it is possible that not enough time has passed for program faculty to fully implement an action and/or reassess the PLO. In such cases, please clearly indicate when systematic assessment of the program began and what efforts are currently being made to use previously collected assessment data for continuous improvement.

Q: Can we discuss an action that was implemented more than 2-3 years ago?
A: Yes. Some actions/curricular changes take longer to implement than others, so the assessment findings you discuss in Part 1 of this prompt might be from more than 2 or 3 years ago.

Q: The program is using different measures than before, so the pre- and post-action data aren't directly comparable. Is this an issue?
A: No, this is not an issue. Assessment is not a hard science, so it is not necessary for the methodology to stay the same throughout the process. Assessment itself is a process, so it makes sense for measures to change as the program evolves. The program's reflection on the efforts made to improve student learning is more important than ensuring directly comparable assessment results.

Q: How do we respond to these prompts if we plan to discuss a curricular change that wasn't included in a previous Assessment Report?


A: Curricular changes not previously documented in an Assessment Report should be discussed in the same way as previous data-informed actions. These other curricular changes should still be based on some kind of data/results/observations about student learning. If this information is qualitative in nature (e.g., a discussion in a faculty meeting in which gaps in student learning came to light), please be sure to describe that information in detail.

Q: What if none of the program's previous data-informed actions were curricular in nature/none were designed to improve PLOs?
A: As this documentation is specific to program learning outcome assessment, the action discussed in the "Closing the Loop" section should be curricular in nature (and specifically related to a PLO). If none of the program's previous data-informed actions (i.e., actions submitted in previous Assessment Reports) were curricular in nature, the program should instead discuss some other curricular change made over the last few years (see the FAQ above).


Information for Assessment Liaisons

The Liaison Role in the Assessment Process

Assessment Liaisons provide internal feedback to programs twice each cycle—once on the Plan and then again on the Report. The purpose of this feedback is to give programs usable input prior to the final submission, ensuring clear, aligned, high-quality Assessment Plans and Reports. In addition to providing feedback, Liaisons are encouraged to work with program faculty throughout the year to provide support and emphasize the importance of the assessment process.

Program assessment is inextricably tied to TAMU's SACS-COC accreditation status. TAMU's 10-year reaffirmation document, which will include assessment work, is due Fall 2021. The following will be reviewed by a SACS-COC Peer Reviewer:

• AY2018-19 – Select examples of Assessment Plans/Reports to demonstrate the new assessment process adopted by TAMU
• AY2019-20 – Large sample of Assessment Plans/Reports and Assessment Reflections & Closing the Loop Reports
• AY2020-21 – Large sample of Assessment Plans [Note: All Assessment Reports will be made available during the onsite review in March 2022]

Providing Feedback

This manual can serve as a guide for Liaisons as well as for Program Coordinators. Liaisons are encouraged to refer to the Plan and Report sections of this manual when providing feedback to programs. Feedback in this program assessment review process takes two forms:

1. Categorical: Yes, No, Not Applicable

Yes should be selected only in cases where the criterion is completely fulfilled. The Not Applicable option is new as of the 20-21 assessment cycle; it should be used in instances where a particular criterion is not relevant to the program, measure, finding, etc. For example, an Assessment Plan for a single fully face-to-face, College Station-based program is not expected to disaggregate results by program, geographic location, or mode of delivery, so Not Applicable is the appropriate selection for that criterion in the Findings section.

2. Qualitative

Each section of the Plan/Report includes a text box where qualitative feedback can be provided. OIEE recommends providing qualitative feedback in sections where No is selected for any of the related criteria. Qualitative feedback helps Program Coordinators understand what specific revisions are being requested and why.

If the Liaison would like to leave feedback but does not wish to review the Plan or Report a second time, leave the appropriate feedback and select the "Approve Form" button. Once the button is highlighted, click "I'm Finished, Submit" to send the feedback forward in the workflow to the Program Coordinator.


The "Reject Form" button can and should be used in cases where the Liaison would like to review the Plan or Report again before advancing it through the rest of the workflow. Select the "Reject Form" button and click "I'm Finished, Submit" to send the form with feedback backwards in the workflow to the Program Coordinator. Please be aware that Program Coordinators are NOT notified when Liaisons reject a form. Liaisons will need to notify Program Coordinators outside of the AEFIS system.

Liaisons should use their best judgment (in concert with this manual) to determine which programs might benefit from multiple iterations of internal feedback.

Changes to Feedback Fields

Some changes have been made to the feedback fields as of the 20-21 cycle. Most notably, Liaisons are now able to edit the feedback they provided on the Plan after Program Coordinators submit the first draft of the Report (i.e., Plan feedback provided at Step 2 is editable at Step 6 of the workflow). This allows Liaisons to update their prior feedback if Program Coordinators have made revisions based on Liaison and/or OIEE feedback.

In some sections, feedback fields were combined, deleted, or re-worded for clarification. Please see the relevant sections of this manual for a comprehensive list of the feedback fields for each part of the Plan and Report.

Specific Feedback Considerations for Plan and Report

Program Description

• Program Coordinators can freely describe the discipline-specific purpose of their program(s). This does not necessarily need to resemble a mission statement. There are no review criteria specifically related to the description they provide.

• Of greatest import is how programs respond to the prompts about geographic location, mode of delivery, and format of delivery. When reviewing this section, ensure the Program Coordinators of combined Assessment Plans (i.e., Plans that include two or more programs, including those that name a single program but actually offer it in both modalities) respond to these three prompts accordingly:

o Geographic location of delivery: Programs are provided a single-line text box in which to enter information; they should list locations for each program in the Plan (e.g., "College Station campus, online" or "College Station campus, Houston City Centre").

o Mode of delivery (Face-to-face, Via Technology, Both): Programs select one option from the three listed. For combined Assessment Plans the response should only reflect ONE of the programs. Both should not be selected to reflect the mode of delivery for both/all programs in the combined Plan, but rather to reflect the Distance Education program if the combined Plan is offered via both methods. For example: The MS in Mathematics is offered both face-to-face and fully online, so Both would be the appropriate selection. Alternatively, in the combined Assessment Plan for the MS and MEd in Educational Administration, the appropriate selection would be Via Technology because the MEd is a Distance Education program (and the MS is offered only face-to-face).

o Format of delivery (Synchronous, Asynchronous, Both): As with the mode of delivery, in combined Assessment Plans the response should only reflect ONE of the programs, and it should be the same program for which the response to Mode of delivery was selected. Both is indicative of a single program for which instruction is offered both synchronously and asynchronously.

• In the Program Description section of this manual, Program Coordinators for combined Assessment Plans are also instructed to list each program's location, mode, and format of delivery in the "Discipline-specific purpose" text box, so as to minimize confusion.

• If a Program Coordinator responsible for the Plan in AEFIS does not know this information, encourage them to work closely with someone in the program who has this knowledge. A number of sections of the Assessment Report require specific consideration of delivery methods, so it is strongly recommended that someone with this knowledge be involved throughout the assessment process.

Program Learning Outcomes (PLOs)

• Selection of PLO(s) to assess in the upcoming academic year should be an intentional, faculty-driven decision. Encourage Program Coordinators to open this discussion with faculty in their program as early as possible during each Spring semester. Even programs with a rotating schedule for assessing their PLOs might benefit from having this conversation annually.

• Appropriate selection of Relevant Associations is important because OIEE uses those associations to run a variety of reports for institution and system reporting. In instances where a PLO is broadly linked to a number of Relevant Associations, provide feedback encouraging Program Coordinators to select only those that most closely align with the PLO. In the case of undergraduate programs, PLOs should be linked to both the University Undergraduate Learning Outcomes and the EmpowerU (TAMUS) outcomes. For PLOs in combined MS/PhD Assessment Plans, Relevant Associations should be selected from both the Masters and Doctoral University Learning Outcomes.

Measures

• Every PLO must be assessed using a direct measure, but programs are strongly encouraged to use more than one measure per PLO. During the assessment planning process, program faculty should discuss measurement strategies for the upcoming academic year, and should be reminded that measures do not necessarily have to be created for program assessment. Existing assignments, artifacts, and exams are often great sources of assessment data so long as they are discussed relative to the program learning outcomes and not only within the context of the course or milestone from which they are taken.

• Indirect measures can be included in the Assessment Plan as they can provide valuable supplemental information about outcomes. However, indirect measures should not constitute the entire measurement strategy for a given PLO. See the Measures section of this manual for examples of direct and indirect measures.

• Supporting documents, such as rubrics, assignments, project instructions/descriptions, and surveys, are always useful to include, but they are not crucial as long as all aspects of the measure are thoroughly described.

Targets

• Most targets should have two parts: the standard (minimally acceptable performance level) and the proportion of students expected to meet the standard (an illustrative two-part target follows this list). In instances where the standard is an average score of all students who were assessed on the given measure, the appropriate feedback selection for the "proportion of students" criterion is Not Applicable. For example: "The average score on the critical thinking category of the rubric will be at least 3 out of 5."

• Pay especially close attention to measures utilizing rubrics as the evaluation strategy. The targets for these measures should only refer to comprehensive rubric scores if all rubric criteria directly relate to the PLO as it is described. If only a single rubric criterion directly relates to the measure, the target statement should refer to only that criterion. Otherwise, No should be selected for the feedback criterion about alignment.
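
As an illustration of a two-part target (a hypothetical example, not drawn from any particular program): "At least 80% of students assessed will score 3 or higher (out of 5) on the critical thinking criterion of the rubric." Here the standard is a score of 3 out of 5 on the relevant criterion, and the proportion of students expected to meet that standard is 80%.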

Findings

• Results should be disaggregated by program/credential, mode of delivery, and/or geographic location, as applicable. Refer to the information provided in the Program Description section—this should help Liaisons determine whether results should be disaggregated in a given Assessment Report. If results are not disaggregated, or if no explanation is provided as to why results are not disaggregated, the Liaison is encouraged to reject the form and request more information.

• If Program Coordinators select No data collected/reported, they are expected to provide an explanation as to why. There is a separate text box specifically labeled and meant for a response to this selection. There might be a variety of reasons why there are no findings to report; see the Findings FAQs for examples.

• Programs are not required to report findings for a given measure if fewer than five students were assessed. However, programs with consistently low enrollment are not exempt from the assessment process. Assessment procedures should be defined regardless of how many students are enrolled. Programs with consistently small numbers are encouraged to aggregate their results across multiple assessment cycles to achieve a larger sample of student data on which to report (e.g., a two- or three-year running average; a brief worked example follows this list). Of course, some programs experience uncharacteristically low enrollment from time to time that results in a very small student population. In such cases programs may use the No data collected/reported selection and provide an explanation.

• Finding statements should include more than just the current results. Programs should contextualize their results in some way, though it is up to them how they do this. OIEE often recommends discussing the longitudinal trend of the findings, discussing whether the results were affirming or surprising, or performing further analysis on the data for more granular results. Sometimes programs address this in the related Data-Informed Actions text box, which is acceptable.
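
To illustrate the multi-cycle aggregation suggested above (hypothetical numbers, not taken from any program's actual data): if 4, 3, and 6 students were assessed in three consecutive cycles, with 3, 2, and 5 of those students meeting the standard, the aggregated three-year finding would be 10 of 13 students (approximately 77%) meeting the standard, a more stable basis for interpretation than any single year alone.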

Data-Informed Actions

• "Pre-actions" are not appropriate data-informed actions (e.g., "We will meet to discuss assessment results" or "We will review the curriculum to determine the best place for an intervention"). The assessment reporting cycle has been purposefully adjusted to allow more time in the Fall semester for program faculty to discuss assessment results from the previous academic year. By the time the Report is submitted, program faculty should have convened to discuss the assessment results and establish data-informed action(s).

• As long as at least one data-informed action reflects a curricular change clearly designed to improve student learning, it is acceptable for Assessment Reports to include other data-informed actions that are less related to specific curricular responses. OIEE encourages Liaisons to provide feedback on these process-oriented actions reminding Program Coordinators to also discuss the changes in the Assessment Reflections section of the Report.

• Program Coordinators are required to enter information in each data-informed action text box. This information might reflect changes to the assessment process, intent to continue collecting data before any action is taken, or monitoring student performance/achievement on the given outcome/measure. These are all appropriate responses only if the Report includes at least one curricular data-informed action. Responses like "N/A" or "No action needed" warrant qualitative feedback encouraging programs to reflect on what was learned through the assessment process and what the next steps will be.

Assessment Reflections, Closing the Loop, and Distance Education/Alternate Geographic Location Reporting

Although Liaisons do not provide internal feedback on this report, they are strongly encouraged to work closely with programs as they craft responses to these prompts. Here are some considerations to keep in mind:

Assessment Reflections

• Liaisons can continually reinforce the message within the college/school that assessment is a faculty-owned and faculty-driven process. Not only should faculty be collecting assessment data from students, but they should also be the decision-makers when establishing Assessment Plans and creating and implementing data-informed actions. In order for continuous improvement efforts to be effective, faculty must be intimately involved at every step.

• Programs working toward greater involvement of faculty and those in leadership positions should clearly relate this in their response and provide as much detail as possible. Furthermore, both faculty involvement and leadership involvement should be addressed.

• Changes to the assessment process should be reported in this section. Programs are prompted to reflect on the usefulness of the assessment data they collected and relate changes they have made or intend to make to measurement strategies that will ensure greater usefulness of data in the future. If more guidance is needed, encourage Program Coordinators to review the feedback on previous Assessment Reports. Feedback from 18-19 is accessible in AEFIS, and Liaisons have access to previous years' feedback reports via the Reports landing page at assessment.tamu.edu.

Closing the Loop

• Closing the Loop is arguably the most important part of the assessment process. Though these prompts have been revised for clarity, Liaisons should work closely with Program Coordinators to ensure that there is no confusion about the appropriate, effective way to close the loop. The action discussed in this narrative should be curricular in nature, and the outcome(s) it was meant to improve should have been re-assessed so that follow-up findings can be reported. Some programs have struggled to develop curricular data-informed actions in the past—remind program faculty that the action discussed in this narrative does not need to have been submitted in a previous Assessment Report. Please see the Closing the Loop section of the manual for additional information.

Distance Education and Alternate Geographic Location Reporting

Please refer to the Distance Education and Alternate Geographic Location Reporting Guidelines document, accessible from https://assessment.tamu.edu, for detailed information about suggested metrics to report on for different kinds of distance programs and programs at alternate geographic locations. Encourage Program Coordinators to review that document and contact their OIEE Liaison if more guidance is needed.


Information for Department Approvers

Department leadership should have an active role in the assessment process, particularly when it comes time to analyze assessment data and develop data-informed actions. However, this step in the workflow provides Department Heads or Associate Department Heads with an opportunity to complete a final quality check of Assessment Reports before they are submitted to OIEE for review. All of the information in the Plan and Report, including internal and external feedback provided up to that point, is embedded in the form and viewable by the Department Approver.

Department Approvers will be notified via email when Assessment Reports are available for final review. Please refer to the information below:

Email Notifications

AEFIS automatically sends an email notification to the Department Approver when a Program Coordinator submits their Assessment Report for final review. The sender of these notifications is listed as "The Office of Institutional Effectiveness & Evaluation", but they are sent automatically by the AEFIS system. Please be sure to carefully read these email notifications, as they provide detailed information about logging in to the system, accessing Reports, and rejecting or approving Reports.

Logging in to AEFIS

Go to https://tamu.aefis.net to log in to AEFIS. The website will automatically redirect you to authenticate through CAS using your NetID and password.

Accessing Assessment Reports

Assessment Reports will appear in the Action Items list on the right side of the browser after logging in. Click the blue pencil icon to edit the information in the form. If the Action Items list does not automatically appear, it can be accessed by clicking on the bell icon at the top right of the screen.

Once in the form, you will notice that none of the fields are editable except for a Feedback text box at the bottom of the form.

Rejecting and Approving Assessment Reports

Review the Report, paying special attention to the Findings and Data-Informed Actions. If you wish to request revisions before the final submission, provide specific feedback in the text box at the bottom of the form, click "Reject Form" and then "I'm Finished, Submit". This will send the form backwards in the workflow to the Program Coordinator. Please note the Program Coordinator will need to be directly informed of this action, as there are no automatic notifications for this process.


If the Report is in good shape, the form can be approved and submitted. In this instance, qualitative feedback can still be provided, but Program Coordinators will not see it until after OIEE feedback has been provided. To submit, click "Approve Form" and then click "I'm Finished, Submit."


Glossary

Alternate geographic location

Degree program or certificate which can be completed away from the "home" site of the program at an approved off-site teaching location

Assessment Liaison

Texas A&M University and AEFIS terminology for college, school, or campus liaisons who provide internal feedback on program learning outcome assessment and work with the Office of Institutional Effectiveness & Evaluation on behalf of their respective college, school, or campus

Asynchronous delivery

Instruction that does not occur in real time; content is made available to students and accessed at their convenience

Closing the loop

Analyzing results from outcome assessment, using the results to make changes to improve student learning, and re-assessing outcomes in order to determine the effect those changes had on student learning

Continuous improvement

An ongoing, coordinated, faculty-driven effort to ensure delivery of quality education to all students; includes establishing intentionally designed program learning outcomes, assessing the extent to which students achieve those learning outcomes, and informed reflection to reinforce and enhance current success and redirect efforts as needed to ensure future success

Data-informed action

Typically, a curricular change based on outcome assessment results, designed to improve student learning; can also explain "next steps" to be taken in light of assessment data

Department Approver

Texas A&M University and AEFIS terminology for Department Heads and/or Associate Department Heads who provide the final review of the Assessment Report before submitting it to the Office of Institutional Effectiveness & Evaluation for feedback

Distance education program

Degree program or certificate which allows a student to complete more than one-half of the semester credit hours required for the program through a combination of distance education courses (distance.tamu.edu)

Finding

Results from the analysis of assessment data

Format of delivery

The method by which program faculty deliver instruction to students in a traditional face-to-face or distance education program, categorized as synchronous delivery, asynchronous delivery, or both


Measure

A term describing the process by which assessment data is collected and evaluated, including what the data is, who it is collected by and from, and the methodology used to analyze it

Mode of delivery

The method by which most of the program’s curriculum is offered to students, categorized as face-to-face or via technology

Program Coordinator

Texas A&M University and AEFIS terminology for individuals who are responsible for documenting assessment information and submitting assigned program assessment forms

Program learning outcome (PLO)

A skill or competency students are expected to articulate or demonstrate by the time they graduate from the academic program and/or complete the requirements for a certificate

Student learning outcome (SLO)

A skill or competency students are expected to articulate or demonstrate; SLOs are defined at multiple levels (e.g., course, program, institution, and system levels)

Synchronous delivery

Instruction that is available to and accessed by students in real time with the instructor

Target

The clearly stated level at which a program considers its program learning outcome (PLO) to be "met" or achieved on a given measure
