
Wilmington University College of Education Educator Preparation Program

Quality Assurance & Assessment Manual 2018-2019

Michele Brewer, Ed.D.

Written & Approved:
Dean
Director of Advanced Programs
Director of Initial Teacher Preparation Programs
Office of Technology, Assessment, and Compliance
Office of Clinical Studies


Wilmington University College of Education

Educator Preparation Program ASSESSMENT MANUAL

Table of Contents

Introduction

I. College and EPP Organizational Structures and Programs

II. Educator Preparation Program's Mission and Conceptual Framework

III. Development of the EPP's Assessment System

IV. Types of Assessments Defined

V. Alignment of Assessment Instruments with Conceptual Framework

VI. Assessment of EPP Operations

VII. Fairness, Accuracy, Consistency, and Elimination of Bias

VIII. Use of Information Technologies

IX. Procedures for Data Collection, Analysis and Use

X. Timeline

XI. Quality Assurance System Graphic


Introduction

Welcome to Wilmington University College of Education and Educator Preparation Program

The purpose of the Quality Assurance and Assessment Manual is to describe in writing the EPP's assessment system so that all professional education faculty members in the EPP are aware of and understand the system. Continuous improvement is essential to maintaining and improving the quality of our programs. Improving the quality of our programs helps prepare our candidates to be effective decision makers who are committed to teaching diverse learners. The Quality Assurance and Assessment Manual describes how our EPP assessment system supports the accreditation process, how we are organized, and the processes by which we assess ourselves for continuous improvement.

The mission of Wilmington University is rooted firmly in building exemplary and innovative academic programs within the context of a student-centered learning environment. Relevance of curriculum is our focus, and it is in this spirit that we routinely assess our academic programs to determine the extent to which learning has occurred and student educational needs have been met. A partnership exists between the Delaware Department of Education (DEDOE) and the Council for the Accreditation of Educator Preparation (CAEP). Both share a common core of accreditation requirements, utilizing Specialized Professional Association (SPA) and state program reviews that lead to CAEP accreditation for the EPP.

Each member of our faculty and staff is familiar with these descriptions, systems, and processes. The Quality Assurance and Assessment Manual is a living document designed to help capture our progress in preparing competent educators. Because this is a continuous process, the manual and our assessment structures can be modified quickly based on authentic, current data.

The faculty formally adopted and implemented the assessment system in 2004, and the plan was (and is) published on the WilmU website. The College of Education (COE) updates and revises assessments annually, and sometimes more often, based on ongoing needs. At the recommendation of NCATE during the 2007 visit, we adopted a digital suite of tools (Watermark/Taskstream) to increase the efficiency of our Outcomes Assessment data collection, reporting, and analysis. Since then, the EPP Digital Assessment System has become a reality. The latest review and revision occurred in June 2014, reflecting new state regulations/legislation, changes in certification rules, new CAEP Standards and requirements for evidence review, and the Delaware Educator Preparation Data System, a new statewide electronic system that houses data on all educators as part of the Delaware Educator Preparation Program Reports (Scorecards). Please address any questions, comments, or suggestions about the Educator Preparation Program Assessment Manual to:

John Gray, Ed.D. Dean College of Education

Michele Brewer, Ed.D., Chair | Technology, Assessment, and Compliance

College of Education


I. COLLEGE AND EDUCATOR PREPARATION PROGRAM ORGANIZATIONAL STRUCTURES AND PROGRAMS

A. Organization of the University

The Board of Trustees of Wilmington University (WilmU) and the University President, Dr. LaVerne T. Harmon (appointed in 2017), work together to lead the University. Administrators and faculty regularly contribute their experience, guidance, and recommendations to the University's leadership for continuous improvement. The President's Executive Team meets weekly, as does the Academic Council (composed of all academic deans), and communication between the two groups is open and frequent. Staff, students, and alumni all have a valuable role in supporting and monitoring the effective application of University policies and standard procedures to best serve students. The University is currently organized into seven colleges, each headed by a dean who reports to an academic vice president. Program directors and chairs report to their respective dean.

• College of Arts and Sciences
• College of Business
• College of Education
• College of Health Professions
• College of Online and Experiential Learning
• College of Social and Behavioral Sciences
• College of Technology

The Wilmington University Timeline provides historical details tracing 50 years, from the school's founding to the present day. (This timeline served as one of several fact sources for this document.) Wilmington University opened in 1968 as a four-year college with three bachelor's degree programs and 147 students. Academic programs and enrollment grew dramatically over the next five decades to more than 150 degree and certificate programs enrolling more than 20,000 students. Initial Middle States Accreditation (MSA) was awarded in 1975 and was a significant benchmark. Each MSA Self-Study since that time (the most recent in 2015) has resulted in full accreditation with no conditions. An interesting demographic trend emerged in the early 1970s and continues today: the proportion of non-traditional students (working adults age 24 or older) grew to represent the majority of our student population. Most (87%) WilmU students work full or part-time. Gender and attending status also began to shift.

The table below highlights the dramatic change in gender and attending status that has occurred over the past 45 years. Females now make up two-thirds of the student population, and 70% attend part-time. These figures continue to align with the non-traditional student age group of 24 and older.

                      Gender                Attending Status
Enrollment Year       Male      Female     Full Time    Part Time
1972 (N=630)          76%       24%        64%          36%
2017 (N=20,480)       37%       63%        30%          70%


Most noteworthy is the increasing popularity of online courses. Data reported in WilmU Facts at a Glance show that about 45% of all classes are now taken online. WilmU has responded to these clear student preferences by adding more online courses and now offers more than 100 fully online programs at all degree levels.

Diversity of the student body has been a focal point for recruitment and for the design of academic programs and student services. Educating employees, faculty, and students in managing issues pertaining to diversity has been a University priority, particularly when the international student population surged to 1,000 in 2013. Students came from multiple Asian, Middle Eastern, and African countries, and most of them stayed with the University until they graduated. Since its inception, the College of Education has maintained a climate of high expectations, caring, and respect for the worth of every individual. We are committed to providing an environment of opportunity, equity, and access that is sensitive to context and culture. We work hard to help our students learn how to adjust and adapt educational methodologies in an equitable, contextually appropriate, and culturally sensitive manner. The COE Access and Equity Committee is an example of that commitment and works hard to recruit and retain underserved populations.

WilmU's first graduate degree program, an MBA, became available to students in 1977. Graduate programs in education were launched in the mid-1980s and soon grew to become Wilmington College's largest programs. Undergraduate teacher preparation programs in elementary education and a doctoral program in educational leadership launched in the late 1980s. Over the next 25 years, a variety of additional programs were developed in response to market demands, school/district needs, and students' career interests.

Wilmington College became a university in the late 2000s. This change reflected WilmU's broad base of academic offerings at all levels (undergraduate, graduate, doctoral) and its first 100% online program offerings. The school could now offer courses at multiple locations and enrolled more than 11,500 students. University status advanced WilmU's ability to increase partnerships with other schools and expand its presence in the tristate area. New locations continued to open in DE, NJ, MD, and PA, most recently the newly built Brandywine Campus in northern Delaware, near the state borders of PA and NJ.

B. Organization of the College of Education

Education programs grew along with the institution. By 2009, academic department divisions converted to colleges within WilmU. The College of Education (COE) formed with its own organization, vision, mission, and goals (see the COE Conceptual Framework). Today, we are a major provider of educators for schools in Delaware and the region.

The College of Education Programs under CAEP review consist of two types of degree programs leading to licensure and certification: Initial Licensure Teacher Preparation Programs and Advanced Programs:

Initial Licensure Teacher Preparation Programs
• BS Elementary Education (Grades K-6)
• BS Middle Level Education (Grades 6-8)
• M.Ed. Elementary Studies (Grades K-6)
• Master of Arts in Secondary Teaching: Grades 7-12
• M.Ed. Special Education K-12

Advanced Programs
• M.Ed. in Elementary and Secondary School Counseling
• M.Ed. in Reading
• M.Ed. in School Leadership


Wilmington University's COE is led by an academic dean, supported by directors of Initial Teacher Preparation and Advanced Programs. The dean and directors collaborate in the selection and appointment of a (faculty) Program Chair for each program. Program Chairs assume a leadership role in the program to advocate for their faculty and their department/program. They oversee practical operations to ensure continued program quality designed for licensure and continued accreditation. Click on this link for the College of Education Organizational Structure.

WilmU COE prepares candidates to satisfy the Delaware Department of Education licensure and certification requirements; therefore, it is essential that all programs involved in educator preparation work collaboratively when developing programs, placing candidates in field experiences and internships, and collecting and analyzing data for decision-making. The Office of Clinical Studies manages placements for educator preparation programs. Program Chairs manage placement services for students in Advanced Programs.

C. Organizations and Programs Connecting the EPP with the College and Community

Educator Preparation Programs include relationships with schools and organizations in the community. These relationships are an active part of the COE and are maintained through ongoing initiatives and structures that actively involve faculty, candidates, community members, school personnel, and students. These affiliations offer reciprocal benefits: other stakeholders become part of the COE, and the EPP becomes part of the fabric of the community and its schools. A list of COE partnerships is available upon request.

II. EDUCATOR PREPARATION PROGRAM'S MISSION AND CONCEPTUAL FRAMEWORK

About the COE Vision

We believe that effective professional educators must also be learners—learners who want to share challenging ideas and successful practices with their colleagues. Educators prepared at Wilmington University believe in the importance of hard work and persistence, and in reflecting on and improving the quality of that work. They are committed to collaborating with parents, colleagues, and community stakeholders. They want to create teaching/learning environments that support personal, physical and emotional development; intellectual growth; and high levels of student achievement ... and which encourage innovation, exploration, creativity and problem solving. We try to maintain a climate of high expectations, caring, and respect for the worth of every individual. We view ourselves as "Professional Partners, Creating Environments for Learning."

Mission

The College of Education at Wilmington University prepares educators to work successfully with children from birth through adolescence, and to work closely with all education stakeholders. Our programs prepare candidates to work effectively with students with a wide variety of learning needs and from diverse cultural, socioeconomic, and linguistic backgrounds. An important goal of our programs is the translation of theory into practice. All programs are standards-driven. All programs emphasize the importance of data-based decision-making, practical experiences in classrooms and schools, content knowledge, knowing and understanding learner needs, and the application of research-based best practices.

Conceptual Framework

The COE Conceptual Framework reflects the vision and mission of the University and articulates the College's philosophy and goals. The Conceptual Framework is the fundamental theoretical architecture and basis for all degree programs. The framework includes eight specific Program Attributes essential for the preparation of effective educators.


These attributes appear on every course syllabus.

1. The Vision and Mission of Institution and Unit
2. The College of Education Philosophy, Purpose, and Goals
3. Knowledge Bases That Inform the Unit's Conceptual Framework
   Organizing Theme: "Professional Partners Creating Environments for Learning"
   Program Attributes which Define the Conceptual Framework
4. References

III. DEVELOPMENT OF THE EPP’S ASSESSMENT SYSTEM

Wilmington University formalized its efforts at using assessment for continuous improvement through implementation of an "Institutional Effectiveness Assessment Plan" in 2003. That plan is updated annually as "Proof Positive³," focusing on three major facets of university life: (1) the candidate (performance and satisfaction), (2) the workplace (campus climate and staff/faculty satisfaction and development), and (3) policies and processes (and their continuous review and improvement). Further defining the original Institutional Effectiveness Assessment Plan, an Academic Affairs Assessment Plan was implemented, extending the focus to the academic side of the University through (1) assessment of candidate learning outcomes, (2) assessment of candidate satisfaction, (3) assessment of teaching effectiveness, and (4) promotion of educational values (added in an August 2011 update of the plan). These two plans establish the framework at the University for regular and frequent assessment of outcomes for the purpose of continuous institutional, unit, and program improvement.

As part of the Academic Affairs Assessment Plan, in 2004 the academic colleges implemented processes for assessing candidates and for collecting data on candidate performance on the Graduation Competencies established by the University for both undergraduate and graduate candidates. Each teaching block (7-week session) and/or semester, data on candidate performance on the Graduation Competencies were collected and forwarded to a newly established Office of Institutional Research for maintenance and reporting. Candidate outcomes data were reviewed by the colleges to determine the level at which candidates in the various programs perform relative to the Graduation Competencies.

All academic programs at the University, in addition to having the institution-wide Graduation Competencies, also have sets of Program Competencies which define what candidates are expected to know and be able to do upon successful completion of their particular preparation programs. In the College of Education, Program Competencies were (and still are) developed around the standards of the State and the standards of the specialized professional associations (SPAs) for the respective preparation programs.


At the time of the first NCATE on-site visit in March 2007, the assessment system measuring candidate attainment of Graduation and Program Competencies (as well as State and SPA standards) and making program improvements based on the data collected for those competencies was in place in the College of Education. However, and rightly so, among the "Areas for Improvement" cited by the NCATE visitation team, one of the four recommendations stated that "the assessment system lacks a consistent unit process for aggregating and using data." While we had implemented and were fully utilizing various assessments and forms of data for the improvement of candidate performance, the unit, and our preparation programs, there were variations among programs in the data collection process, and we had not formalized our unit assessment system. As a result of that recommendation, we (1) committed to writing and institutionalizing our unit assessment system, (2) improved our process of collecting and aggregating data in a way that continually informs our decision making, and (3) established a regular cycle of review for our educator preparation programs.

Our unit assessment system is a three-tiered system establishing a process for review of data for the purpose of continuous unit and program improvement. Tier 1 includes candidate performance data related to the institution's Graduation Competencies. These data, known as CECRAM (Course-Embedded Criterion-Referenced Assessment Measures), are collected and shared with the Office of Institutional Research every semester and are used as part of our regular program review process. Tier 2 includes candidate performance data on the Program Competencies, which, as noted above, are aligned with State and SPA standards and with the eight attributes that serve as the core of our Conceptual Framework. These are the data we use for our SPA and State reports, and they are also regularly reviewed as part of our own program review process. Tier 3 of the unit assessment system includes multiple sets of data that inform faculty of both program quality and operational effectiveness.

Along with a consistent process for using data, institutionalized by the Unit Assessment System, we also worked to implement an improved, more consistent process for aggregating data. At the time of the 2007 NCATE on-site visit, our NCATE Coordinator, with assistance from program coordinators, was responsible for gathering and organizing the outcomes assessment data for the initial teacher preparation programs. Program coordinators themselves gathered the data for the other professional education programs and shared those data with the NCATE Coordinator, who then shared the results with faculty and with the Office of Institutional Research.

Because of the NCATE recommendation cited above, in fall 2008 the College of Education began implementing an electronic database system to serve as the tool for collecting, organizing, maintaining, analyzing, and reporting the candidate outcomes assessment data dictated by the Unit Assessment System. An initial vendor could not provide the flexibility needed for the comprehensive system, so a new vendor was placed under contract in 2010. The electronic database now also serves as an e-portfolio that candidates use to organize their work around the Graduation and Program Competencies. In January 2011, the College added a new position of Technology Chair to manage the continued implementation of the electronic database. At this point, all preparation programs in the College of Education are part of the system, and the system has established the consistent process for aggregating data recommended by NCATE in 2007. The College now has a dedicated Office of Technology, Assessment, and Compliance, which includes the Chair, a Data Analyst, and an Administrative Coordinator/Data Specialist.

There is an annual college-wide review of all programs. The review examines multiple sources of data to make determinations about candidate performance, faculty effectiveness, and program quality. CECRAM and SPA assessment data are reviewed on a regular basis with each Program Chair, and as an EPP during summer retreats, to evaluate candidate performance as it reflects on program performance. Many other sources of data are reviewed throughout the academic year during College faculty meetings. These sources of data include, but are not limited to, (1) candidate seat counts, (2) candidate GPAs, (3) candidate grades, (4) IDEA and practicum/student teaching evaluation results, (5) results of student evaluations from the ten extra IDEA questions that assess the Program Attributes of the Conceptual Framework, (6) alumni and employer survey results, (7) multiple data sources related to CAEP's Annual Reporting measures, (8) institutional and division-wide demographic data, and (9) results from various surveys of candidates and mentors completed periodically. The compilation of all of these assessments and other sources of data allows College faculty to regularly evaluate candidate and program performance.


The assessment system, using internal and external assessments, aligns with professional, state, national, and institutional standards and spans both initial and advanced programs. Since its inception, the basic design of the assessment system has been modified, streamlined, and improved to provide the data necessary for program improvement. The assessment system continues to afford the EPP a structure around which effective programs can be built, maintained, and assessed. In 2018, the first Assessment Manual was written, clarifying and organizing the required processes, procedures, and assessments used to assure program quality and candidate excellence. Since then, the Assessment Manual has been updated yearly to keep pace with substantive changes and requirements in teacher preparation locally, statewide, and nationally.

This system has four broad themes:

1. Data collection to support assessment of competence for certification of candidates.
2. Data collection to assess the quality and effectiveness of programs.
3. Data collection to assess the effectiveness of the Educator Preparation Program, including field experiences, clinical preparation, and partnerships.
4. Data collection to track the performance of graduates in their field of specialization.

To ensure that data are collected and posted in a timely manner, the Chair of the Office of Technology, Assessment, and Compliance has developed a key assessment blueprint for administrators, faculty, staff, and candidates that assures postings by clearly delineated deadlines, typically set on a semester basis.

The Dean of the College of Education, in conjunction with the faculty, is responsible for final decision making on proposed changes to EPP policies and procedures, responses to changes in state/national standards and requirements, EPP operating procedures and policies, and other related decisions. Program Chairs are responsible for using posted data to evaluate programs and to propose any program or course changes, which must be submitted, assessed, and approved through the COE and the university's curriculum review process.

Data Management

Data Management is made up of the individuals from the Office of Technology, Assessment, and Compliance who come into contact with, process, and analyze critical student data. In the College of Education, these positions include:

• Chair | Technology, Assessment, and Compliance
• Data Analyst
• Administrative Coordinator/Data Specialist

The Office of Clinical Studies manages placement data on licensure candidates. Placements at the Advanced level are maintained by each Program Chair.

Data collected by College of Education offices are critical to decisions about admitting candidates, placing candidates in field experiences and internships, recommending candidates for initial certification, and tracking candidates' success following certification.

The data collected are used to develop and support a variety of College and Educator Preparation Program reports including, but not limited to the following:

• Specialized Professional Associations (SPA)
• Council for Accreditation of Educator Preparation (CAEP) Annual Report
• Delaware Professional Standards Commission Program Approval Reports
• United States Department of Education Title II
• Delaware Educator Preparation Reports (Scorecards)
• WilmU Academic Affairs Annual Reports


IV. TYPES OF ASSESSMENTS DEFINED

A. Structured External Assignments

A Structured External Assignment (SEA) is a key assessment of each individual candidate actively enrolled in a particular course. Candidates upload SEAs into Watermark/Taskstream and faculty determine whether a candidate has met, not met, or reached the target for the identified elements/standards on a rubric. The College is held accountable for ensuring that candidates meet standards in their education programs. Candidates’ artifacts collected via Watermark/Taskstream inform instructors regarding candidate knowledge, skills, and dispositions. Instructors use this information to improve course content and pedagogy.

B. Key Program Assessments

A Key Program Assessment is an assessment developed and utilized by a specific program to assess faculty, candidates, and/or cooperating teachers/university supervisors. Program assessments are aligned to the conceptual framework, InTASC, state, CAEP and SPA (when appropriate) standards and are used to collect data useful in completing required annual program reports. In the program assessments, the individual candidate evaluations are aggregated and shown in means, percentages, and graphs, then disaggregated by standard sets to reveal program strengths and areas for improvement. Key Program Assessments provide data needed for specific program improvement. Key Program Assessments are identified by the Program Chairs in conjunction with program faculty, and the Office of Technology, Assessment, and Compliance.

Key Program Assessment data are collected and stored within the COE’s Watermark/Taskstream Exhibit Center and are available to all College of Education faculty within their accounts. Praxis data are stored in a COE Database on WilmU’s SharePoint site and maintained by the EPP’s Data Management Team.
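
To make the aggregation and disaggregation described above concrete, the sketch below rolls hypothetical rubric scores up into a mean and a percentage at each level, then breaks them out by standard. The record format, the three-level scale (Not Met, Met, Target), and the standard labels are illustrative assumptions, not the EPP's actual Watermark/Taskstream data model.

```python
# Illustrative sketch only: aggregates hypothetical rubric scores into means and
# percentages, then disaggregates them by standard, in the spirit of the Key
# Program Assessment reporting described above. Field names and the 1-3 scale
# (1 = Not Met, 2 = Met, 3 = Target) are assumptions, not the EPP's export format.
from collections import defaultdict
from statistics import mean

# Each record: one candidate's score on one rubric element, tagged with a standard.
scores = [
    {"candidate": "A", "standard": "InTASC 4", "score": 3},
    {"candidate": "B", "standard": "InTASC 4", "score": 2},
    {"candidate": "A", "standard": "InTASC 7", "score": 2},
    {"candidate": "B", "standard": "InTASC 7", "score": 1},
]

LEVELS = {1: "Not Met", 2: "Met", 3: "Target"}

def summarize(records):
    """Return the mean score and the percent of scores at each rubric level."""
    values = [r["score"] for r in records]
    pct = {label: 100 * sum(v == level for v in values) / len(values)
           for level, label in LEVELS.items()}
    return {"mean": round(mean(values), 2), "percent_at_level": pct}

# Aggregate across the whole assessment, then disaggregate by standard.
overall = summarize(scores)

by_standard = defaultdict(list)
for record in scores:
    by_standard[record["standard"]].append(record)
per_standard = {std: summarize(recs) for std, recs in by_standard.items()}

print("Overall:", overall)
for std, summary in sorted(per_standard.items()):
    print(std, summary)
```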

V. ALIGNMENT OF ASSESSMENT INSTRUMENTS WITH CONCEPTUAL FRAMEWORK

Initial Programs and Advanced Programs

The EPP’s assessment system for the initial and advanced programs addresses the COE’s conceptual framework. The assessment system also addresses the CAEP standards by gathering evidence of candidate selectivity, content knowledge, pedagogical content knowledge, professional and pedagogical skills, ethics and dispositions, and effects on student learning. Curriculum maps have been developed that align with the COE’s conceptual framework and relevant SPA standards. The College utilizes the Assessment Management System (AMS) within Watermark to maintain these maps and monitor program goals and outcomes.

VI. ASSESSMENT OF EPP OPERATIONS

EPP operations are activities undertaken by the EPP pertaining to governance, planning, budget, personnel, facilities, services and procedures (such as advising and admission), and resources that support the COE's mission in preparing educators.

Some of the ways in which EPP operations are assessed are the following:

A. The EPP ensures that the assessments are aligned to the COE's conceptual framework and to state, InTASC, and SPA (as appropriate) standards.


B. Candidate Evaluation of the Initial Teacher Education Program (Exit Survey): This is a cumulative survey from the candidate’s perspective regarding the preparation programs. Data gathered from the survey assist the faculty, Dean, Program Chairs, and staff to assess operations regarding program admission procedures, advising and support services, student teaching/internship placement, availability of courses, technology resources, quality of instruction, and overall quality of the program. Data are used for EPP and program improvement. The Exit Survey is anonymous and conducted using Panorama in order to assure candid evaluations.

C. Field Experience Survey: All candidates enrolled in any field experience are required to complete a survey regarding the appropriateness of the placements, expertise of the mentor teacher, and alignment with course objectives and standards.

D. Candidate Evaluation of University Supervisor and Candidate Evaluation of Cooperating Teacher: All candidates enrolled in the initial education program complete these surveys at the end of the clinical internship semester. These forms help the Chair of Clinical Studies assess the quality of the supervision and support that the candidates received from the college supervisor and from the cooperating teacher.

E. Cooperating Teacher Evaluation of the Clinical Experience: All cooperating teachers supervising clinical interns complete a survey evaluating the clinical experience provided by WilmU.

VII. FAIRNESS, ACCURACY, CONSISTENCY, AND ELIMINATION OF BIAS

The EPP uses the following strategies to ensure fairness, accuracy, consistency, and elimination of bias throughout its assessment system:

• The EPP ensures that the assessments are linked to the COE's Conceptual Framework, and the CAEP Standards are demonstrated in the alignment of all evaluation measures.
• The COE informs all initial undergraduate and graduate candidates of all requirements in the education program when they initially meet with their advisor. Information about the conceptual framework, dispositions expected of candidates, transition points, key assessments, and other requirements is included during the orientation meetings, within the initial efolio course where candidates activate their Watermark/Taskstream accounts, and in the Clinical Internship Handbook.

• Advanced candidates receive information regarding program requirements during a Program Planning Conference when they meet with the Graduate Advisor or Program Chair.

• Rubrics to assess candidates’ work are shared with the candidates before the rubrics are used. Thus, candidates know what they will be assessed on, what is expected of them, and the level of proficiency associated with each scoring decision.

• Electronic Forms are used to assess clinical interns, cooperating teachers, and clinical faculty/university supervisors.

• Reliability studies are conducted each year on a rotational basis for EPP Key Assessment Rubrics (a minimal sketch of one possible rater-agreement check appears at the end of this section).
• The most recent validity studies for assessments were conducted in fall 2018, with highly positive results regarding the nature of the assessment elements and content validity. New assessments are vetted for validity.

• EPP faculty members review and discuss professional dispositions and evaluation of key assessments at professional education meetings. These discussions are documented in program, department, committee, and COE meeting minutes.

• In order to show candidate progress, some performance assessments may be administered multiple times at different points in the candidates' progression through the program. For example, dispositions are first assessed by professionals during a group interview. Dispositions are then measured during an early field experience (Practicum I), in Practicum II and III, and finally in Internship. This enables the faculty to assess the challenges and areas for growth in dispositions across the program and for each individual candidate.
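
The manual notes that reliability studies are run on a rotating basis but does not name the statistic used. As one illustrative possibility, the sketch below computes exact and adjacent agreement between two raters scoring the same artifacts on a three-level rubric; the paired ratings are hypothetical.

```python
# Illustrative sketch only: one simple way to check rater agreement on a rubric.
# The manual does not specify the statistic the EPP uses; the paired ratings
# below are hypothetical (1 = Not Met, 2 = Met, 3 = Target).
rater_a = [3, 2, 2, 1, 3, 2, 3, 2]
rater_b = [3, 2, 1, 1, 3, 2, 2, 2]

pairs = list(zip(rater_a, rater_b))
exact = sum(a == b for a, b in pairs)              # raters chose the same level
adjacent = sum(abs(a - b) <= 1 for a, b in pairs)  # raters within one level

print(f"Exact agreement:    {100 * exact / len(pairs):.0f}%")
print(f"Adjacent agreement: {100 * adjacent / len(pairs):.0f}%")
```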

VIII. USE OF INFORMATION TECHNOLOGIES

The EPP uses Taskstream to maintain the majority of data on its initial and advanced education candidates. Candidates are required to purchase a Taskstream account similar to purchasing a textbook for the course. Candidates have access to the account at no extra charge for six years. Candidates submit key assessments via Taskstream. Also, cooperating teachers and college supervisors submit evaluations via Taskstream.

Each year, the Taskstream Administrator for the EPP develops an Assessment Blueprint that includes information on the key common EPP assessments and on the program assessments that are administered via and stored in Taskstream. The information includes: course number, who completes the assessment, name of the assessment, date when the assessment is available for use in Taskstream, and date when data are available for faculty to review.
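
To illustrate the kind of record the Assessment Blueprint holds, the sketch below models a single entry using the fields listed above. The course number, assessment name, and dates are hypothetical, not drawn from an actual WilmU blueprint.

```python
# Illustrative sketch only: one Assessment Blueprint entry built from the fields
# the manual lists. The course number, dates, and assessment name are hypothetical.
from dataclasses import dataclass
from datetime import date

@dataclass
class BlueprintEntry:
    course_number: str             # course in which the assessment is embedded
    completed_by: str              # who completes it (faculty, supervisor, cooperating teacher)
    assessment_name: str           # name of the key assessment
    available_in_taskstream: date  # when the assessment opens in Taskstream
    data_available: date           # when results are ready for faculty review

entry = BlueprintEntry(
    course_number="EDU 000",          # hypothetical course number
    completed_by="faculty",
    assessment_name="Unit Plan SEA",  # hypothetical assessment name
    available_in_taskstream=date(2018, 9, 1),
    data_available=date(2019, 1, 15),
)

print(entry)
```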

In addition, the Taskstream platform is linked from the WilmU Learning Management System (LMS) and MyWilmU portal to permit candidates to review course-specific performance assessments, task instructions, and rubrics, and to enable them to monitor their own progress throughout their programs.

In addition to the student-centric focus of Taskstream, there are multiple systems for data collection about programs and operations. For example, Cognos is a data collection and report generation system that Institutional Research administers and makes accessible to all COE personnel. COE leadership and Program Chairs utilize Cognos to run reports on all aspects of College operations, including but not limited to enrollment statistics, student progress, program-level data, and university demographics. These systems support the ability to monitor operational effectiveness (e.g., setting program priorities and data tracking). There are also several other significant collection systems and uses:

1. SharePoint is a password-protected, web-based collaborative platform that integrates with Microsoft Office. It is used as a document management and storage system housing COE Praxis scores on all candidates. Multiple users access the system for a variety of purposes.

2. Smartsheet is a software-as-a-service application for collaboration and work management used to assign tasks, track project progress, manage calendars, share documents, and manage other work through a spreadsheet-like user interface. The COE utilizes this platform for program revisions, project management, collaboration with partners, and program planning.

3. The ETS® Data Manager for the Praxis® tests is a collection of services related to Praxis score reporting and analysis. Access to the ETS Data Manager is provided via the internet from a secure website using a login and password. The COE uses this system to view data for different test-taker groups based on variables such as gender, ethnicity, educational level, and type of educator preparation program.

4. ProEthica® is a secure, password-protected website comprised of a series of interactive modules with real-life scenarios related to dispositions. Aligned to the Model Code of Ethics for Educators (MCEE), these learning modules are accessed by COE candidates. The Office of Technology, Assessment, and Compliance manages candidates' enrollment status, manages payment vouchers, and accesses reports and other information outlining educator progression.

5. The IMPACT Dashboard is a COE-created infrastructure used to collect, manage, and track our candidates' impact on student learning in their programs and to link it to their impact on student learning as practicing educators for five years in the classroom (ITP Standard Four Compendium 1 - IMPACT Dashboard/DPAS II - Danielson Framework).


These systems for data collection, reporting, and sharing inform multiple stakeholders, guide decisions at all levels of the College, and contribute to the overall Quality Assurance System.

IX. PROCEDURES FOR DATA COLLECTION, ANALYSIS AND USE

A. Collecting and Entering Data into Taskstream

Candidates upload key assessments within each course and at transition points. Faculty score these assessments within two weeks of the due date provided to candidates. Data from all common EPP and program assessments are collected each block/semester and entered into Taskstream by the individual designated to complete the assessment form (i.e. faculty, supervisor, cooperating teacher) according to timelines provided by Program Chairs and/or the Chair of the Office of Clinical Studies.

Data for the Field Placement Module (university supervisor data, cooperating teacher data, candidate data, and site demographics) are input by the Office of Clinical Studies Placement Specialists.

B. Aggregating and Disaggregating Data

1. EPP data from all common EPP assessments are aggregated for the EPP, and then disaggregated for each program and level, through Taskstream by the Taskstream Administrator within one month after the final exam week has ended each semester.

2. Program data from all specific program assessments are aggregated for the program through Taskstream by the Taskstream Administrator within one month after the final exam week has ended each semester.

C. Forwarding Data

1. The aggregated EPP data will be forwarded by the Taskstream Administrator to the Program Chairs and Dean within three weeks after final exam week has ended each semester.

2. The disaggregated EPP data (by program/level) will be forwarded by the Taskstream Administrator to Program Chairs within one month after final exam week has ended each semester. The disaggregated program data for key assessments will be copied into an Assessment Report housed in the AMS system for use by the programs.

D. Summarizing, Analyzing, Reporting, and Disseminating Data

1. Outcomes Assessment Summit

An annual Summit is held to review progress to date of the outcomes assessment process, to discuss strengths and weaknesses of the process, and, most importantly, to present, analyze, and discuss examples of data-based decision making ("closing the loop") related to Academic Assessment. Attendees at the Summit include the Chief Academic Officer, the Assistant Vice Presidents for Academic Affairs, the College Deans, and the Manager of Institutional Research, as well as other invited guests. A summary report is presented to the Faculty Senate. In addition, regular reports are offered throughout the academic year to the Faculty Senate by the Deans as a way to inform and motivate faculty regarding the effective use of outcomes assessment data and subsequent decisions based on those data. The minutes of Faculty Senate (Faculty Senate Blackboard site) include these presentations and any related materials, such as copies of handouts and PowerPoint slides, for review at a later time.


2. College Meetings

It is critical for all faculty to be informed of academic assessment results. Members of the faculty participate in regular college meetings to review data and processes. During these meetings, successes are celebrated and specific changes in areas such as curriculum, pedagogy or policy are made. In addition, academic program and course related meetings are regularly scheduled throughout the year with full time and adjunct faculty. The various programs routinely present assessment results to stakeholders for review and input. These meetings consist of members of the community, practitioners in the field, and faculty members as well as student representatives.

3. Program Advisory Committees

a) Program Chairs meet regularly with their program faculty and/or their Advisory Committee to review aggregated specific program data that include:

1) for the EPP and for each key program assessment instrument, the number and percent of candidates performing in each cell on the scoring scale, along with an assessment report template;
2) strengths and weaknesses identified in a course(s) or in the program based on the data; and
3) change(s) in a course(s) or in the program that will be made based on the data, if any changes are appropriate at that time.

b) Program Chairs work with the Subject Matter Experts (SMEs), program faculty, and the Office of Technology, Assessment, and Compliance to modify key assessments if appropriate.

4. Summer Data Retreat

a) Faculty from Initial and Advanced Programs meet during a full-day assessment retreat in the summer to review data focusing on a specific area for improvement. Sources of data that are reviewed include, but are not limited to, (1) candidate seat counts, (2) candidate GPAs, (3) candidate grades, (4) IDEA and practicum/student teaching evaluation results, (5) alumni survey results, (6) employer survey results, (7) completer results at the institutional level, (8) institutional and division-wide demographic data, and (9) results from various surveys of candidates completed periodically, along with CECRAM and SPA assessments. The compilation of all of these assessments and other sources of data allows faculty to regularly evaluate candidate and program performance.


X. TIMELINE

The Educator Preparation Program adheres to the following assessment timeline:

• Prior to the beginning of each semester: The Taskstream Coordinator receives information from the Program Chairs and develops the Assessment Blueprint for that semester.

• During each fall and spring semester: Data are collected via Taskstream according to dates identified by the Program Chairs and the Chair of the Office of Technology, Assessment, and Compliance.

• During each fall and spring semester: Progress of initial and advanced candidates is monitored at designated transition points by the Program Chairs, the Office of Clinical Studies, and the Office of Technology, Assessment, and Compliance.

• Within one month after the final exam week each semester: The Taskstream Coordinator aggregates the data from the common EPP assessments for the EPP, disaggregates the EPP data for each program, and aggregates the program data from the specific program assessments.

• Ongoing: Program assessment reports are presented at monthly COE faculty meetings and at annual advisory board meetings when scheduled by programs.
• By October 1 annually: The COE Dean prepares an Outcome Assessment Report on candidate performance for the EPP for the academic year. This report is shared with WilmU leadership and COE faculty for recommendations for improvement.
