
1

EDUCATOR EFFECTIVENESS:

An Orientation for Principals and Assistant Principals

2

The purpose of the Wisconsin Educator Effectiveness System is to help educators grow as professionals in order to increase student learning.

3

The Educator Effectiveness System in Wisconsin

• DPI has established minimum expectations for educator evaluation.

• Districts have the authority to add to the system requirements but cannot do less (e.g., a district could require two SLOs each year)

• Some aspects of the EE System are left to local discretion (e.g., which educators fit the definition of teacher)

4

Who is in Which Year of the Cycle?

Supporting Year 1? Supporting Year 2? Summary Year?

Continuous Improvement Using Multiple Measures: Practice and Outcomes

6

Changing Our Thinking…

TRADITIONAL EVALUATION

• If you are good at something, it isn’t hard

• You set goals to “demonstrate” your strengths and abilities

• Struggles or challenges demonstrate weakness

WI EE

• The path to mastery is hard

• Educators set goals to focus their improvement efforts

• From the most novice to the most expert, everyone can improve some aspect of their practice

7

THE EFFECTIVENESS CYCLE

First year, and every third year after

9

MULTIPLE MEASURES

Balancing Multiple Measures

• A Summary based on evidence of Educator Practice

• A Summary based on evidence of Student Outcomes

Educator Practices

The Educator Practices Summary comprises scores for each of the components in the Wisconsin Framework for Principal Leadership.

12

13

Assistant Principal Evaluation Components

Components 1.1.3 and 1.2.5 are added if an Assistant Principal evaluates teachers

Required Principal Observations

• Announced School Visit Observation (no specified length)

• School Sampling Visits* (mandated range of 2-3)

*School Sampling Visits are less formal opportunities for the principal’s evaluator to get a sense of the normal flow of the school day and observe the principal in their varied roles.

15

Collecting Evidence of Practice

• Evidence can be collected through scheduled observations and school visits as well as through other sources such as interviews, surveys, or artifacts as determined during the planning session.

• A list of possible artifacts linked to subdomains and components can be found in Appendix C of the Principal Process Manual.

16

What is an Artifact?

• A source of evidence used to document effectiveness at the component level

• Some artifacts will provide evidence for multiple components

• Evaluators will use the rubric to identify the performance level that best matches the evidence of practice within each uploaded artifact

17

A Few Considerations…

• Is there value in aligning some of your artifacts? (e.g., PD you led or organized related to an element of the School Improvement Plan, paired with the schedule of walk-throughs used to monitor implementation)

• When does an artifact become evidence? If I upload a certificate from a conference…what does it prove? Consider the value of a short reflection to give meaning to an artifact.

OUTCOMES

School Learning Objectives (SLOs)

• 1 SLO

• Educator self-approves and scores in all years.

• SLO is part of the EEP

Creating the SLO Score

Each educator self-scores his or her SLO annually using the Revised SLO Scoring Rubric.

The rubric contains 2 criteria: one related to results (Did students meet the goals you set?) and one related to process (Did you engage fully in the SLO process?).

SLO Quality Indicators (each indicator is paired with a column for Reflections/Feedback/Notes for Improvement)

Baseline Data and Rationale
• The educator used multiple data sources to complete a thorough review of student achievement data, including subgroup analysis.
• The data analysis supports the rationale for the SLO goal.
• The baseline data indicates the individual starting point for each student included in the target population.

Alignment
• The SLO is aligned to specific content standards representing the critical content for learning within a grade level and subject area.
• The standards identified are appropriate and aligned to support the area(s) of need and the student population identified in baseline data.
• The SLO is stated as a SMART goal.

Student Population
• The student population identified in the goal(s) reflects the results of the data analysis.

Targeted Growth
• Growth trajectories reflect appropriate gains for students, based on identified starting points or benchmark levels.
• Growth goals are rigorous, yet attainable.
• Targeted growth is revisited based on progress monitoring data and adjusted if needed.

Interval
• The interval is appropriate given the SLO goal.
• The interval reflects the duration of time the target student population is with the educator.
• Mid-point checks are planned, data is reviewed, and revisions to the goal are made if necessary.
• Mid-point revisions are based on strong rationale and evidence supporting the adjustment mid-course.

Evidence Sources

SLO SCORING RUBRIC (Score | Criteria Description, not exhaustive)

Score 4: Student growth for SLO(s) has exceeded the goal(s).
• Educator engaged in a comprehensive, data-driven SLO process that resulted in exceptional student growth.
• Evidence indicates the targeted population’s growth exceeded the expectations described in the goal.
• Educator set rigorous, superior goal(s); skillfully used appropriate assessments; continuously monitored progress; strategically revised instruction based on progress monitoring data.

Score 3: Student growth for SLO(s) has met the goal(s).
• Educator engaged in a data-driven SLO process that resulted in student growth.
• Evidence indicates the targeted population met the expectations described in the goal.
• Educator set attainable goal(s); used appropriate assessments; monitored progress; adjusted instruction based on progress monitoring data.

Score 2: Student growth for SLO(s) has partially met the goal(s).
• Educator engaged in an SLO process that resulted in inconsistent student growth.
• Evidence indicates the targeted population partially met the expectations described in the goal.
• Educator set a goal; used assessments; inconsistently monitored progress; inconsistently or inappropriately adjusted instruction.

Score 1: Student growth for SLO(s) has not met the goal(s).
• Educator engaged in an SLO process that resulted in minimal or no student growth.
• Evidence indicates the targeted population has not met the expectations described in the goal.
• Educator set inappropriate goal(s); inconsistently or inappropriately used assessments; failed to monitor progress; failed to adjust instruction based on progress monitoring data.

• Using the Revised SLO Scoring Rubric, the evaluator will assign a holistic score (based on a 1-4 scale) after considering all SLOs.

• Score is based on the preponderance of evidence from documentation.

In the typical 3-year Effectiveness Cycle, the educator will have three SLO processes that inform the final holistic score.

Educators in the Summary Year this year (our first official year of implementation) will have only one SLO process that informs their final holistic score at the end of the 2014-15 school year.

26

Turn and Talk

• What have you heard that’s new?

• What questions do you still have?

27

SUMMARIZING THE EFFECTIVENESS CYCLE

Final Effectiveness Summary

At the conclusion of the Summary Year, the evaluator determines a score for each Principal Framework component and also determines one holistic SLO score.

Reporting Scores

The component scores (practice) and the holistic SLO score (outcome) are uploaded by Teachscape to DPI’s WISEdash secure, where only the educator and his or her administrators will be able to view the results.

Final Effectiveness Summary

Within WISEdash, the scores for the components are combined to result in a final Educator Practices Summary.

The holistic SLO score, the Reading/Graduation Rate score, and Principal Value-Added score (when available) are combined to result in a final Student Outcomes Summary.

Practice Summary

Principals:

• Component scores averaged = Practice Summary
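For districts that want to sanity-check this step, here is a minimal Python sketch of the averaging rule described above. The function name, the sample scores, and the one-decimal rounding are illustrative assumptions, not part of the DPI specification.

def practice_summary(component_scores):
    # component_scores: one 1-4 score per Wisconsin Framework for
    # Principal Leadership component, as assigned in the Summary Year.
    # Rounding to one decimal place is an assumption for readability.
    return round(sum(component_scores) / len(component_scores), 1)

print(practice_summary([3.0, 4.0, 3.0, 2.0]))  # 3.0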

Outcomes Summary

• Individual measure scores weighted proportionally

• Weighted scores added together

• Summary rounded to the nearest decimal on a scale of 1-4

• Example:

Principal

SLO = 3.0 x .475 = 1.425

Value-Added = 3.0 x .475 = 1.425

School-wide Reading = 3.0 x .05 = .15

OUTCOME SUMMARY = 1.425 + 1.425 + .15 = 3.0
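The same arithmetic can be checked with a short Python sketch. The weights below follow the worked example above (SLO 47.5%, Value-Added 47.5%, School-wide Reading 5%) and should be confirmed against current DPI guidance; the function name is illustrative.

def outcomes_summary(slo, value_added, reading):
    # Weight each 1-4 measure proportionally, add the weighted scores,
    # and round the summary to the nearest decimal on the 1-4 scale.
    weighted = slo * 0.475 + value_added * 0.475 + reading * 0.05
    return round(weighted, 1)

print(outcomes_summary(3.0, 3.0, 3.0))  # 3.0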

Effectiveness Summary Graph

34

Summary Year Overview and Timeline

35

Turn and Talk

• What are the similarities and differences between the Supporting Years and the Summary Year?

36

Our Effectiveness Coaches

• Who they are

• How they can help

37

Local Talking Points: Has your district discussed…

• setting parameters for the number of artifacts an educator may upload?

• the number of School Sampling Visits (the mandated range is 2-3) and whether or not one or more will happen in the Supporting Years?

• whether principals will be directed to write School Learning Objectives that align to district goals?

• whether teachers will be directed to write Student Learning Objectives that align to school or district goals?

• what is an artifact?

• whether you will ask for any aligned artifacts?

• the structure, process, and documentation of peer review?

38

What is Next?

• View the first Module for Step 4

• Attend additional Teachscape training opportunities

• Complete “Beginning of the Year” activities

39

CESA #4 Educator Effectiveness Support

Billie Finco 608-786-4830 bfinco@cesa4.k12.wi.us

Want to get all the latest information and updates, or just ask a question? Join the CESA #4 Educator Effectiveness Google+ Community:

http://bit.ly/CESA4EE

Sherri Torkelson 608-786-4855 storkelson@cesa4.k12.wi.us

For more information and resources related to the Wisconsin Educator Effectiveness System,

please visit the WIEE website at:

ee.dpi.wi.gov

40
