
AN IMPLEMENTER’S PERSPECTIVE:

AN EVALUATION AND COMPENSATION PILOT

October 17, 2012
Richard Bowman, Ph.D.

• Context
• Proposal
• Results
• Buy-in
• Logistics
• Metrics
• Issues

Context
• Reform efforts
  • Teacher Evaluation Task Force (TETF)
  • Race To The Top (RTTT)
  • Planned legislation
  • Prior legislation
  • Shared urgency
• Four schools
  • SIG application
    • “Funds have been set aside to support performance pay.”
    • $1,000 per teacher assigned to the school (~$300K)
• Strategic Data Project Data Fellows
  • Highly trained and capable personnel pressed into service


The Proposal
• Performance compensation
• Teacher evaluation
• Pilot program
• Multiple measures
  • Student Surveys
  • Observations
  • School Value Added
  • Individual Value Added
  • PLC Student Learning Goals (SLOs)
  • Individual Student Learning Goals (SLOs)
• In-depth research

The Proposal at 50,000 feet

The Proposal at 10,000 feet

Student Surveys on the ground

I wonder who translated this?


Proposal Results

Most Valued: Student Surveys


Union Leader Buy-in (Negotiations)
• Voluntary participation
• Student Learning Goals (SLOs)
• Hold-harmless evaluation
• Bonus, not pay
• Percentages
• Ferguson’s research

Teacher Buy-in
• Emphasize shared background
• Time – personal time spent
• Focus on concerns
  • Negative consequences
  • Time taken
• Talk context
• Responsiveness
• Transparency


Logistics - Survey
• One friendly Denver SDP fellow
• Two local SDP fellows
• Four schools
• Eight survey versions
• 16 survey days
• 93 participants
• ~20,000 student surveys
• Blank photocopied surveys
  • Stuffed into envelopes by rough class count
  • Envelopes sorted by period and teacher
• ~20,000 student barcode labels
  • Cut by guillotine and bagged by class period

Logistics - Software
• Excel (report production; a report-merge sketch follows this list)
  • Lookup tables
  • VB macros
  • RDBMail (Outlook integration)
  • PDF generation
• Google Docs/Drive (collaboration)
• Stata (data analysis and management)
• Remark Office OMR (optical mark recognition)
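None of the Excel VBA behind the report production appears in the deck, so here is a hedged Python stand-in for the lookup-and-fill pattern it describes; the template fields and function name are hypothetical, not the pilot’s code.

```python
# A hedged stand-in for the lookup-table report merge; the pilot did this
# with Excel lookup tables and VB macros, then mailed PDFs via RDBMail.
from string import Template

REPORT = Template(
    "Survey report for $teacher\n"
    "Seven Cs composite: $composite\n"
    "School-wide composite: $school_composite\n"
)

def render_reports(rows: list[dict]) -> dict[str, str]:
    """Merge each teacher's score row into the template; key by teacher."""
    return {row["teacher"]: REPORT.substitute(row) for row in rows}

# Example:
# render_reports([{"teacher": "A. Smith", "composite": 3.8,
#                  "school_composite": 3.6}])
```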


Student Surveys on the ground

Metrics
• Survey
  • Average responses, assuming a 1–5 scale
  • Average for each of the “Seven Cs”
  • Composite average of the “Seven Cs”
• Reporting
  • Average for each of the “Seven Cs” and the composite, at three levels:
    • Individual
    • School-wide
    • Pilot
  • Histogram of the composite
  • Top two relative strengths and weaknesses
  • No item-level detail
• Evaluation
  • Thirds of composite score within school (a scoring sketch follows this list)
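A minimal Python sketch of this scoring, under stated assumptions: the item-to-construct mapping, column names, and function names are hypothetical stand-ins, not the pilot’s actual instrument (the “Seven Cs” themselves are the constructs from Ferguson’s Tripod student survey).

```python
# A minimal sketch of the survey scoring described above. The item-to-
# construct mapping and column names are hypothetical, not the pilot's.
import pandas as pd

SEVEN_CS = {  # hypothetical item assignments for each of the Seven Cs
    "Care": ["q1", "q2"], "Control": ["q3", "q4"], "Clarify": ["q5", "q6"],
    "Challenge": ["q7", "q8"], "Captivate": ["q9", "q10"],
    "Confer": ["q11", "q12"], "Consolidate": ["q13", "q14"],
}

def score_teacher(responses: pd.DataFrame) -> pd.Series:
    """Average 1-5 item responses into one score per C plus a composite."""
    scores = {c: float(responses[items].to_numpy(dtype=float).mean())
              for c, items in SEVEN_CS.items()}
    out = pd.Series(scores)
    out["Composite"] = out.mean()
    return out

def strengths_and_weaknesses(scores: pd.Series):
    """Top two relative strengths and weaknesses among the Seven Cs."""
    ranked = scores.drop("Composite").sort_values()
    return list(ranked.index[-2:]), list(ranked.index[:2])

def thirds_within_school(composites: pd.Series) -> pd.Series:
    """Place each teacher's composite into thirds within their school."""
    return pd.qcut(composites.rank(method="first"), 3,
                   labels=["bottom third", "middle third", "top third"])
```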

Metrics
• Value-added Model
  • Includes student, teacher, and school covariates
  • 1-year estimates for both teachers and schools
  • Best Linear Unbiased Predictor (BLUP), a.k.a. Bayesian or shrunken estimates (see the sketch after this list)
  • Relative to (normalized) district-wide student test scores, by year, in standard deviations
  • Random effects model estimated by REML
  • Estimates produced four times, based on SCA and SBA
• Reporting
  • Limited numbers
    • Concern about meaningfulness of numbers
    • Conversion to months of learning
  • Confidence level
    • Shown visually
    • Range of months of learning in text
  • Histogram
    • School estimate and teacher estimates
• Evaluation
  • Three possible ratings: significantly above, significantly below, and not significantly different than average
  • 95% significance level
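The shrinkage behind BLUP estimates can be illustrated with a deliberately simplified one-way random-effects model. This is a sketch, not the pilot’s REML model: the variance components, the months-per-SD conversion, and the column names are assumptions, and the real model included the covariates listed above.

```python
# A deliberately simplified one-way random-effects sketch of BLUP
# ("shrunken") teacher effects. The pilot fit a fuller model by REML;
# the constants and column names below are assumptions.
import numpy as np
import pandas as pd

SIGMA2_TEACHER = 0.04  # assumed between-teacher variance (scores in SD units)
SIGMA2_STUDENT = 0.81  # assumed residual (within-teacher) variance
MONTHS_PER_SD = 9.0    # assumed conversion: 1 SD of growth ~ one school year

def shrunken_effects(df: pd.DataFrame) -> pd.DataFrame:
    """df: columns 'teacher_id' and 'residual' (normalized student score
    residuals). Returns BLUPs, 95% intervals, and the three-way rating."""
    g = df.groupby("teacher_id")["residual"].agg(["mean", "count"])
    # Shrinkage factor k: raw means based on few students get pulled to zero.
    k = (g["count"] * SIGMA2_TEACHER
         / (g["count"] * SIGMA2_TEACHER + SIGMA2_STUDENT))
    g["blup"] = k * g["mean"]
    g["se"] = np.sqrt(SIGMA2_TEACHER * (1.0 - k))  # posterior SD of effect
    g["lo"] = g["blup"] - 1.96 * g["se"]
    g["hi"] = g["blup"] + 1.96 * g["se"]
    g["months"] = g["blup"] * MONTHS_PER_SD  # reported as months of learning
    g["rating"] = np.where(
        g["lo"] > 0, "significantly above average",
        np.where(g["hi"] < 0, "significantly below average",
                 "not significantly different than average"))
    return g
```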

Metrics
• Student Learning Goals
  • Teacher developed, measured, and reported
  • Individual goals and PLC goals
  • “Reach” goal – a goal that would be a stretch to achieve
  • “Expected” goal – a goal expected to be achieved
• Evaluation
  • Three possible ratings: Met Reach Goal, Met Expected Goal, Did not meet either goal (a small rating helper follows this list)
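The three-level rating reduces to a simple mapping. A trivial sketch, assuming boolean goal outcomes reported by the teacher (the function name and inputs are hypothetical):

```python
# A trivial sketch of the three-level Student Learning Goal rating; the
# boolean inputs stand in for whatever evidence teachers reported.
def slg_rating(met_expected: bool, met_reach: bool) -> str:
    if met_reach:
        return "Met Reach Goal"
    if met_expected:
        return "Met Expected Goal"
    return "Did not meet either goal"
```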

Metrics
• Observations
  • Administrator evaluates teachers on three domains of Danielson’s Framework for Teaching (FFT)
  • Reported electronically
  • Ratings converted to a four-point scale and averaged (a conversion sketch follows this list)
• Evaluation
  • Teachers reported whether or not the observation happened
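A minimal sketch of that conversion, using the four standard FFT performance levels; treating them as an equal-interval 1–4 scale is an assumption about the pilot, since the deck does not spell out the exact mapping.

```python
# A minimal sketch of the observation conversion: standard Danielson FFT
# performance levels mapped to four points and averaged across domains.
FFT_POINTS = {"Unsatisfactory": 1, "Basic": 2, "Proficient": 3,
              "Distinguished": 4}

def observation_score(component_ratings: list[str]) -> float:
    """Average the component ratings across the observed FFT domains."""
    points = [FFT_POINTS[r] for r in component_ratings]
    return sum(points) / len(points)

# e.g. observation_score(["Proficient", "Basic", "Proficient"]) -> ~2.67
```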


Observation Data

Issues
• Training and Professional Development
• Observation
  • Principal time
  • No consequences
  • Length
• Student Learning Goals
  • Ensuring rigor and relevance
  • Professional development
• Surveys
  • Special populations
  • Proctoring
• Value-added
  • Understanding, fairness, and use
• School-wide evaluations
  • Universally disliked
• Joint or shared evaluations
  • Largely disliked

Issues
• How can this information inform instruction?
• Evaluation metrics
  • Selection bias due to volunteers
  • Relative classifications
    • With or without a base year, they create problems
  • Absolute classification

Thank you!

Questions or Comments?
