
COLLEGE OF EDUCATION & HUMAN DEVELOPMENT UNIT ASSESSMENT SYSTEM

Step 1: Program-level

Welcome

The Unit Assessment System as a decision-making framework

The significance of the changes we are implementing

Your leadership role

The College’s expectations

Objectives

To provide an overview of the Unit Assessment System

To review the “big picture” – how and why we are implementing the Unit Assessment System

To describe the annual time line for implementation of the system

To distribute and describe the Program Data Yearbooks

To describe the format and content of the annual program-level reports

The Big Picture

A little history: NCATE 2003, 2013

Program-level experiences with SPAs and GMU’s APR

NCATE Unit Standard 2

What a focused visit means, and why we are having one

Since 2011: CEHD Reorganization, Divisions, API

Continuous improvement & core values

Unit Assessment System & strategic decision making for CI

What is the Unit Assessment System?

An integrated decision-making framework that involves multiple levels of decision making

Program

Division

College

What is the Unit Assessment System?

At the program level, APCs and program faculty review evidence in the Data Yearbook to inform decisions related to improving candidate performance on standards

Program decisions and strategic goals inform Division Directors’ decisions related to strategic goals, resources, and staffing

The above informs the Executive Team and the Dean on decisions related to resource allocation, organizational structures and processes, and strategic goals and objectives adopted for the unit.

Time line for Unit Assessment System

February 4th: Data Yearbooks distributed

March 15th: APCs submit program reports

April 1st: Division Directors submit reports

May 6th: GSE faculty receive briefing on Executive Team decisions, next steps

May 15th: Executive Team final written summary; NCATE team begins Institutional Report

Activity: Using the Data Yearbook

What questions would you like to answer about your programs, based on a review of the following types of data?

Candidate information

Admissions data; Candidate demographics

Candidate performance data

Performance on key assessments; Performance disaggregated by location

Graduate & employer surveys

GMU graduate exit survey; CEHD exit survey (satisfaction); Graduate, employer follow-up

Internships

Field supervisor qualifications; Field supervisor demographics; Field site characteristics

Faculty

Faculty qualifications; Faculty demographics; Course evaluations (disaggregated by location, ft/adjunct)

Program accreditation

State accreditation matrix; Standards alignment

GALLERY WALK & (brief) BREAK

Program review process

How well are candidates performing on each of your key assessments, across the calendar year?

How well are candidates performing on standards, across the calendar year?

What opportunities exist for continuous improvement of your program?

What objectives for improvement will you commit to as a program?

Components of the data yearbook

For each category of evidence:

What is this evidence? Where did it come from?

Suggested ways of using it

What might this evidence imply?

What other evidence might you use to triangulate?

An action research approach

Root cause analysis

Solution development and action planning

Reflective practice and evaluation

Collaborative problem diagnosis

Candidate information

Admissions data: # of applicants, # accepted or denied admission, etc.

Admissions is the “gateway” into your program

What do trends suggest?

Are you satisfied that your process yields desired outcomes?

Candidate demographics

Snapshot of diversity represented among candidates in your program

Assessment of candidate performance

Candidate performance on key assessments (by assessment “bin”)

Are candidates in our program demonstrating what they know and are able to do with consistency?

Are there specific standard elements on which candidates seem to excel, or to have difficulties?

What does the data suggest about assessment processes?

Candidate performance disaggregated

Data will be distributed in February

Candidate assessment of dispositions (2013)

Graduate and employer surveys

Mason graduate exit survey (May 2012 grads)

CEHD graduate exit survey

Satisfaction with various aspects of candidate experience at GMU and in your program

May help answer “why” questions

CEHD graduate follow-up survey (2013)

CEHD employer follow-up survey (2013)

Internship, field experiences

Internship/supervisor qualifications

Internship/supervisor demographics

What are the characteristics of field supervisors?

How does this relate to the quality and diversity of field experiences?

Internship/field placement supervisor evaluations

What do candidates say about supervision?

Internship/field placement site characteristics

How diverse are placement sites?

Faculty information

Faculty qualifications (by f/t, p/t, adjunct)

Does your program have a sufficient cadre of highly qualified instructors?

Faculty demographic characteristics

Are candidates taught by a diverse group of instructors?

Course evaluations (by on/off campus; f/t, p/t, adjunct)

Do candidates perceive teaching to be high quality?

Course syllabus review

Program accreditation

State accreditation matrix

Standards alignment crosswalk

Report template

Part 1: Program goals

What, if any, goals and objectives did your program pursue during 2012?

Part 2: Candidate performance

How well are candidates performing on each key assessment, across the calendar year?

How well are candidates performing on standards, across the calendar year?

What evidence did you consult to support your conclusions?

Report template

Part 3: Examination of program data

What does candidate admissions and demographic data tell you about the quality, quantity, and diversity of candidates?

What do candidate and employer surveys suggest about program efficacy?

What, if any, areas represent a concern?

What does evidence related to internship and field experiences (if applicable) suggest about the quality, quantity, and diversity of placements?

What does evidence suggest about the quality, quantity, and diversity of faculty, including candidate evaluation of faculty teaching?

Report template

Part 4: Program improvement objectives

What opportunities exist for improvement?

What are your program’s long-term (3-5 years) and/or short-term (1 year) goals and objectives?

What resources do you need to accomplish these?

Part 5: CI: Program assessments

What have you done to study assessment consistency?

What have you found as a result?

What changes have you made?

What the program report is…& is not

The program report is…

Evidence of your program faculty’s examination of data related to continuous improvement

A conduit for your program to communicate its accomplishments, goals, and resource needs

A means for division directors & deans to learn from you about your program

The program report is not…

A repetition of the data presented in the yearbook

Activity: What to do now?

How will you take this information back to program faculty?

What is your action plan?

How will evidence in the Data Yearbook help you answer the questions you posed earlier?

Dates & times for support

Tuesday, February 26th, 1-2 pm PhD, PE, Ed Psych, IOT, LT

Tuesday, February 26th, 2-3 pm ELMS

Tuesday, March 5th, 11am-Noon SPED

Tuesday, March 5th, Noon-1pm APTDIE

PLUS – BY APPOINTMENT, IF NEEDED

Questions

The complaint department is currently closed. However, if you have questions… (Libby will be happy to answer them)