March 8, 2011 Student Outcomes and Principal Evaluation: Key Questions for PEAC Principal Evaluation Subcommittee


Page 1

March 8, 2011

Student Outcomes and Principal Evaluation: Key Questions for PEAC Principal Evaluation Subcommittee

Page 2

Overview of the Webinar

1. Review of guiding questions for sub-committee consideration

2. Introduction and review of value-added measures and update on value-added models being created in CPS

3. Discussion of guiding questions

New Leaders for New Schools

Page 3

Overview of the Webinar

1. Review of guiding questions for sub-committee consideration

2. Introduction and review of value-added measures and update on value-added models being created in CPS

3. Discussion of guiding questions


Page 4

Guiding questions on student outcomes

• What measures should be used in evaluating principals?

• What is the right balance between value-added growth measures and attainment measures?

• How, if at all, should we adjust our judgments based on a school’s demographics and other characteristics, like student mobility?

• How many years of data should be used for any single year’s rating on student growth?

• What processes and parameters should guide local flexibility and adaptation to the state system over time?

For each of these categories, we identify specific questions (noted in bold) and considerations based on research and our experience (noted in italics).


Page 5

Measures of Student Outcomes: K-8

Should we use ISAT data?

• Better matched to principal evaluation than teacher evaluation:
– Larger pool of students in growth analyses allows for less variability in direction of results
– Clearer attribution of students to principal (with clear mobility parameters)
– Serves as one important element of the student outcomes piece, but helpful if balanced with non-test outcomes (in high school) or other assessment data (as long as it is consistent across the LEA)
– Important to use multiple years of information to establish trend

• Can be used to measure attainment (e.g., % of kids meeting proficiency), gain/growth (e.g., increase in % of kids meeting proficiency), and value-added
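The distinction among attainment, gain/growth, and value-added can be made concrete with a short sketch. All scores and the "expected gain" below are hypothetical, and the value-added step is deliberately simplified; a real model such as VARC's uses regression with student-level controls:

```python
# Three ways to summarize the same hypothetical test data.
scores_last_year = [210, 245, 230, 260, 225]   # same students, prior year
scores_this_year = [222, 250, 241, 268, 230]
proficiency_cut = 235
expected_gain = 9  # hypothetical district-average gain for similar students

# Attainment: share of students at or above the proficiency cut this year
attainment = sum(s >= proficiency_cut for s in scores_this_year) / len(scores_this_year)

# Gain/growth: change in the share proficient from last year to this year
prior = sum(s >= proficiency_cut for s in scores_last_year) / len(scores_last_year)
growth = attainment - prior

# Value-added (simplified): average actual gain minus the expected gain
gains = [t - l for l, t in zip(scores_last_year, scores_this_year)]
value_added = sum(gains) / len(gains) - expected_gain

print(attainment, growth, value_added)
```

Note how the same school can look different under each lens: attainment reflects where students end up, growth reflects movement of the proficiency rate, and value-added compares actual gains to an expectation.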

Should we use interim assessments?

• Technically sound, but with some cautions:
– More reliable than summative tests if computer adaptive
– Assessments may not cover all content
– Students may not take interim assessments seriously
– Such assessments are not meant for use as accountability tools

• From 2014-15, the PARCC assessments should provide an integrated solution to interim and summative assessments


Page 6

Measures of Student Outcomes: K-8 (continued)

Should we use school-level student outcome goals set by principals and their managers?

• Common practice, but depends on rigor of principal manager expectations

What other measures of student growth, beyond state tests, should we consider?

• Measures of student aspirations toward college in middle school grades

• Student attendance


Page 7

Measures of Student Outcomes: High School Considerations

Should we use PSAE data?

• Can be used to assess subjects beyond reading and math (i.e., writing, science)

• Can be used as an attainment measure (% of students reaching proficiency) and as a growth measure (increase in % of students reaching proficiency)

• Substantial technical issues in converting these data to value-added estimates:
– Gap between 8th grade ISAT and 11th grade PSAE, with data distortion from dropouts and retained students
– Anticipate improved ability to make value-added estimates using PARCC assessments in 2014-15 and onward

What other measures of student growth, beyond state tests, should we consider?

• High school student growth measures should expand beyond state tests to include “on track” to college measures:

– Student attendance
– Grade-to-grade progression
– Credit accumulation (potentially including “quality of credits”)
– Cohort graduation rates, and quality of diploma earned (if data exists)
– Note: These measures can be turned into “value added” metrics by looking at predicted values versus actual values at the school level
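The “predicted versus actual” idea can be sketched as follows. The predicted on-track rates stand in for the output of a regression model fit across the district, and all school names and figures are hypothetical:

```python
# Turn an "on track" measure into a value-added-style metric by comparing
# each school's actual rate to its predicted rate.
schools = {
    # school: (predicted on-track rate, actual on-track rate)
    "School A": (0.72, 0.78),
    "School B": (0.85, 0.83),
    "School C": (0.60, 0.69),
}

for name, (predicted, actual) in schools.items():
    value_added = actual - predicted  # positive = above expectation
    print(f"{name}: {value_added:+.2f}")
```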


Page 8

Balancing attainment, growth, and value-add

How should we weight attainment, growth and value-add within an overall rating?

• Focusing more on movement measures (i.e., gain/growth, value-added):
– Provides a better picture of the impact of the principal
– Creates stronger incentives for principals to work in lower-performing schools
– Pushes schools with higher-performing incoming students to keep advancing their performance (past “proficiency” to “college-ready”)
– Values all students by assessing progress from their starting points
– Requires districts to look at same-student comparisons rather than “cohort to cohort” comparisons whenever possible

• Where possible, use multiple growth measures

• Relative weight on attainment (or on maintenance of growth) might increase as performance level of school increases

Should we treat low-performing schools and high-performing schools differently or the same?

• There is a ceiling to growth on proficiency, suggesting two changes for high-performing schools:
– Give schools gain/growth points if they exceed a proficiency ceiling (e.g., the Chicago approach)
– Tie a portion of the gain/growth goal to their success in increasing the percent of students meeting the “advanced” category on current assessments


Page 9

Balancing attainment, growth, and value-add: An illustration


[Illustration: as a school’s performance level rises, the emphasis shifts from measures of growth, to growing the percentage of students reaching “advanced,” to rewarding principals for maintaining high levels of achievement.]

Page 10

DCPS Principal Evaluation Components

Evaluation Component                      % Allocated    Total
Professional Competencies                                30%
  Leadership Framework Assessments        30%
Student Outcomes                                         50%
  Value-Added Measure                     20%
  School Specific Goals                   10%
  DC CAS Gains Goals                      20%
Other                                                    20%
  Special Education Compliance            10%
  Teacher Retention                        5%
  Family Engagement                        5%

Page 11

New York City Principal Evaluation Components

Evaluation Component                      % Allocated
School’s Graded Progress                  32%
  Components of the grade:
  – Student growth measures: 60%
  – Absolute performance: 25%
  – School climate: 15%
School Specific Goals                     31%
Compliance with District Mandates         15%
School Quality Review*                    22%

*In New York City, a School Quality Review is a two- or three-day visit by experienced educators to a school. The visit typically includes classroom observations, conversations with school leaders and stakeholders, and examinations of student work. New York City has developed a rubric to guide the visits and to determine how well organized a school is to educate its students.

• 40-50% of the New York Evaluation is made up of Student Outcome data

• 26% of a school’s graded progress is focused on student outcomes

• 14-24% of the School Specific Goals are focused on student outcomes

Page 12

Chicago “Performance Calculators” for Principals

                        Elementary School    High School
Attainment              43%                  36%
Gain/Growth             43%                  64%
Value-Added             14%                  0%

Attainment (“Status”)
  Elementary: ISAT targets in reading, math, science, composite, and highest grade; attendance target
  High school: targets for ACT average; one-year dropout and freshmen “on track”; attendance; PSAE reading/math/science

Gain/Growth (“Trend”)
  Elementary: growth in ISAT in reading, math, science, composite, and highest grade; growth in attendance
  High school: growth in ACT average; one-year dropout and freshmen “on track”; attendance; PSAE reading/math/science; AP enrollment and success; reading and math scores from the Explore/Plan/ACT sequence

Value-Added (“Growth”)
  Elementary: reading and math (ISAT)
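The weights above can be read as a simple weighted composite. A minimal sketch, assuming each component has already been scored on a common 0-100 scale (the component scores below are made up):

```python
# Combine component scores into a single "performance calculator" score
# using the elementary and high school weights from the table above.
WEIGHTS = {
    "elementary": {"attainment": 0.43, "gain_growth": 0.43, "value_added": 0.14},
    "high_school": {"attainment": 0.36, "gain_growth": 0.64, "value_added": 0.0},
}

def composite(scores, school_type):
    """Weighted sum of component scores, each assumed to be on a 0-100 scale."""
    weights = WEIGHTS[school_type]
    return sum(weights[k] * scores.get(k, 0.0) for k in weights)

print(composite({"attainment": 70, "gain_growth": 80, "value_added": 60}, "elementary"))
```

Because the high school value-added weight is zero, the same school scored under the high school weights would depend only on attainment and gain/growth.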


Page 13

Adjusting for student characteristics

Should we include controls in the value-added growth models to account for student characteristics?

• Increases the accuracy of value-added estimates
– Controls can be changed from year to year to alter the approach to a given population (e.g., special education, English language proficiency, homelessness)

• There may be some value in excluding some controls – at the expense of maximal accuracy of estimates – in order to signal heightened responsibility for schools to accelerate achievement for low-income students of color.

Should we give extra weight for improving results for students who start out further behind?

• Set targets that expect faster growth for lower performing students in the district/state

How should we address the question of student mobility?

• VARC and others use methods that assign portions of value-added growth to a school based on the percentage of the school year a student has been enrolled at the school.
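The enrollment-share approach described above can be sketched as a “dosage”-weighted average. The student records below are hypothetical, and real models apportion each student’s growth across all schools attended:

```python
# Weight each student's growth by the fraction of the school year the
# student was enrolled at this school ("dosage" weighting).
students = [
    # (growth in scale-score points, fraction of the year enrolled here)
    (12.0, 1.00),  # full-year student
    (8.0, 0.50),   # mid-year arrival: half the growth attributed here
    (15.0, 0.25),  # brief enrollment: only a small share attributed here
]

total_weight = sum(frac for _, frac in students)
school_growth = sum(g * frac for g, frac in students) / total_weight
print(round(school_growth, 2))
```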


Page 14

Years of data used for judgments of principals

How many years of data should be used for any single year’s rating on student growth?

Given the variation in single-year results, evaluate student outcomes based on multi-year trends

- Note: Value-added estimates are more reliable at the school level than at the classroom level, since higher student numbers reduce the impact of year-to-year fluctuations. But we want to create incentives for long-term improvement, not quick fixes.

Provide additional time or use more years of data for early tenure principals

Plan for the availability of sufficient data before any significant consequences (e.g. ensuring most recent test data is available before making spring retention decisions)
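The multi-year idea can be sketched as a weighted average of yearly value-added estimates, with more weight on recent years; both the estimates and the weights below are hypothetical policy choices:

```python
# Pool several years of a school's value-added estimates into one rating.
yearly_estimates = [-1.2, 0.4, 1.1]  # oldest to most recent, hypothetical
year_weights = [0.2, 0.3, 0.5]       # heavier weight on recent years

rating = sum(e * w for e, w in zip(yearly_estimates, year_weights))
print(round(rating, 2))
```

For an early-tenure principal, the same approach could simply use fewer years or delay the rating until enough years are available.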


Page 15

Processes for adaptation

What guidelines do we put in place for all districts to follow if they want to design their own systems?

• The balance of growth and attainment should be fixed.

• Measuring success in other academic subjects depends on the presence of reliable local assessments.

• The technical capability to develop and implement value-added models is not present in most districts.

What should be the ongoing process for evaluating the system and adapting it?

• Among other things, the state will need to adjust its test measures when the PARCC assessments are rolled out in 2014-15


Page 16

Overview of the Webinar

1. Review of guiding questions for sub-committee consideration

2. Introduction and review of value-added measures and update on value-added models being created in CPS

3. Discussion of guiding questions


Page 17

Common Approaches to Measuring Student Success


Source: VARC (http://varc.wceruw.org/tutorials/Oak/index.htm)

Our overall goal is to measure the performance of a principal based on student performance. How is this accomplished?

Page 18

Understanding Value-Added Measures

Stephen Ponisciak

Value-Added Research Center

School of Education, University of Wisconsin-Madison


Page 19

Overview of the Webinar

1. Review of guiding questions for sub-committee consideration

2. Introduction and review of value-added measures and update on value-added models being created in CPS

3. Discussion of guiding questions


Page 20

Guiding questions on student outcomes

• What measures should be used in evaluating principals?

• What is the right balance between value-added growth measures and attainment measures?

• How, if at all, should we adjust our judgments based on a school’s demographics and other characteristics, like student mobility?

• How many years of data should be used for any single year’s rating on student growth?

• What processes and parameters should guide local flexibility and adaptation to the state system over time?
