
Value-Added Assessment: One Star in the Constellation of Organizational Development and Transformation

Dr. Jim Lloyd, Assistant Superintendent

Olmsted Falls City Schools

Advanced Organizers
Olmsted Falls is a SOAR District
Olmsted Falls will become part of BFK’s T-CAP

Lloyd (2008)—DVAS reported:
The need for further PD related to using data to impact teaching and learning
The need to “fit” EVAAS in with other data sets
The need to use EVAAS as an improvement tool

Objectives of the Presentation

1. Understand the following points:
Value-added data is one very important component of the continuous improvement process.
EVAAS is a rear-view-mirror analysis; the story behind the added value is most important. (A simplified sketch of the growth idea follows this list.)

2. Special programs do not lead to increases in student achievement or progress.

3. Changes in adult behavior do lead to increases in student achievement and progress.

4. Play “small ball” and do not try to hit a grand slam…get teachers to begin to do things differently and share those experiences.
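To make the “rear-view mirror” point concrete, here is a minimal sketch of the general growth idea behind value-added reporting: compare each student’s observed score with a score predicted from prior performance. The numbers and the prediction rule below are illustrative assumptions only; the actual EVAAS analysis uses a far more sophisticated statistical model.

# Toy illustration of the "growth = observed minus predicted" idea behind
# value-added reporting. The prediction rule (prior score plus the group's
# average gain) is a deliberate simplification, not the EVAAS methodology.
students = {
    # name: (prior-year scaled score, current-year scaled score)
    "A": (410, 428),
    "B": (395, 401),
    "C": (440, 455),
}

# The group's average gain serves as the "expected" growth for everyone.
average_gain = sum(cur - prior for prior, cur in students.values()) / len(students)

for name, (prior, cur) in students.items():
    predicted = prior + average_gain
    value_added = cur - predicted  # positive = more progress than expected
    print(f"Student {name}: observed {cur}, predicted {predicted:.1f}, value added {value_added:+.1f}")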

What’s in your folder?
Part III of a presentation that I gave to our middle school staff last year…I handed out the exploration questions that were created for the groups.
An article from the Principal Navigator
Chapter V
OFCS Power Walkthrough Template
OLAC Leadership Development Framework

What did Sanders & others tell us?

Factors related to student learning – District, School, and Teacher Influence on Student Progress
The following inferences were shared at the Governors Education Symposium (2004).

Based on 22 years of Value-Added Study, Dr. Sanders draws the following conclusions:

Variation in student academic progress can be attributed this way:

5% attributed to District quality

30% attributed to School quality

65% attributed to Teacher quality

Influences on student achievement:
Socio-economic status
Early educational opportunities
Parent’s educational level
School factors

Influences on student PROGRESS/GROWTH:
Teacher quality
Use of formative assessment
Clear learning targets
Quality instructional practices
School effects
Clear mission/vision
Goal setting
District effects

Things People Will Say about EVAAS
Districts & schools with high achievement scores can’t make gains to demonstrate growth…this model isn’t fair.

This model isn’t reliable and valid…there is discrepant research in the field about it.

How often do students score within the Top 3 Scaled Score Points two years in a row?

Subject | Students Considered | Percentage scoring within the top three scaled scores on the OAT two years in a row
4th Gd. Reading | 26,511 | 0.18%
4th Gd. Math | 26,511 | 0.15%
5th Gd. Reading | 26,695 | 0.12%
5th Gd. Math | 26,695 | 0.21%
6th Gd. Reading | 26,718 | 0.04%
6th Gd. Math | 26,718 | 0.05%
7th Gd. Reading | 26,699 | 0.04%
7th Gd. Math | 26,699 | 0.01%
8th Gd. Reading | 27,919 | 0.19%
8th Gd. Math | 27,919 | 0.05%

How did the Suburban Districts Do, in particular?

The highest percentage of students scoring within the top three scaled scores two years in a row was a little over 2%.

Five wealthy Ohio suburban school districts had the following highest (district best) rates of students scoring within the top 3 scaled scores 2 years in a row:
District A – 2/172 (1.16%): 8th gd. Reading
District B – 7/612 (1.14%): 5th gd. Math
District C – 5/266 (1.88%): 4th gd. Math
District D – 1/77 (1.30%): 4th gd. Math
District E – 1/58 (1.72%): 5th gd. Math

These were the highest rates these districts saw for any grade for students repeating top-3 scaled score performances across years within an OAT subject.
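The district rates above are simple proportions: repeaters divided by students considered. A small sketch of that arithmetic, reusing the counts quoted on this slide (the district labels are the anonymized ones above):

# Rate of students repeating a top-3 scaled score two years in a row,
# computed as repeaters / students considered (counts taken from the slide).
district_counts = {
    "District A (8th gd. Reading)": (2, 172),
    "District B (5th gd. Math)": (7, 612),
    "District C (4th gd. Math)": (5, 266),
    "District D (4th gd. Math)": (1, 77),
    "District E (5th gd. Math)": (1, 58),
}

for district, (repeaters, considered) in district_counts.items():
    rate = 100 * repeaters / considered
    print(f"{district}: {repeaters}/{considered} = {rate:.2f}%")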

Organizational Development Through Collaborative Exploration
Work of the Ohio Leadership Advisory Council (OLAC)
Things You Should Consider:
Establish a District Leadership Team
Establish Building Leadership Teams
Work on the work

About exploration…
Excellent with Distinction doesn’t mean much when you don’t know exactly why.
We needed to look at data points in order to see our constellation.

The Leadership for Learning Framework (Reeves, 2006)

The Olmsted Falls “Effect” Constellation:
EVAAS data
Classroom walkthroughs
OAT data
CASL data
Perception data
Graduation data
Implementation data
SAT/ACT
SOAR
End-of-course exams

We’re working on clearly defining the “Cause” constellation now

Our exploration mechanism

District/Building Leadership Teams

State Diagnostic Teams (SDTs) work with districts in corrective action

State Support Teams (SSTs) work with districts and schools in need of improvement

Educational Service Centers (ESCs) work with other districts requesting assistance

Ohio Improvement Process

STAGE 1: Identify Critical Needs of Districts and Schools
STAGE 2: Develop a Focused Plan
STAGE 3: Implement the Focused Plan
STAGE 4: Monitor the Improvement Process

Who is involved? Depending on the stage, District/Building Leadership Teams work with State Diagnostic Teams, State Support Teams, and Educational Service Centers; with Regional Service Providers, External Vendors, and Higher Education; and with Regional Managers and a Single Point of Contact.

How do these teams work in districts and schools?
Teams use data tools to identify critical needs (Stage 1).
Work with leadership to develop research-based strategies and action steps focused on the critical needs identified in Stage 1 (Stage 2).
Provide technical assistance and targeted professional development, and leverage resources (Stage 3).
Review data and gather evidence of implementation and impact (Stage 4).

Our process
Conduct a cause and effect analysis

Use an array of data points including both SOAR and ODE value-added information

Define a very limited number of goals

Our district foci—Get better at 2 things:
Clarity of Learning Targets
Student Feedback

Stated in measurable terms—By 2011 OFCS will have experienced a 5% increase in proficient students in all buildings in each core subject area when compared to 2008 baseline performance as measured by the OAT and OGT.

Specific, Measurable, Attainable, Results-oriented, Time-bound

Increase student proficiency in all buildings in the core…does this mean we should only aim for proficiency…NO!
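To make the target concrete, read the goal against a hypothetical building that was 80% proficient on the 2008 baseline (the 80% figure is an assumption used only for illustration). The two natural readings of “a 5% increase” give slightly different 2011 targets:

# Hypothetical 2008 baseline proficiency rate, used only to illustrate the goal.
baseline = 0.80

# Reading 1: five percentage points above the baseline.
target_points = baseline + 0.05      # 85% proficient

# Reading 2: a 5% relative increase over the baseline.
target_relative = baseline * 1.05    # 84% proficient

print(f"Baseline: {baseline:.0%}")
print(f"Five percentage points higher: {target_points:.0%}")
print(f"Five percent relative growth: {target_relative:.0%}")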


OFCS Goal

Strategy: Deconstruct, implement and monitor the most important learning targets by content area into degrees of cognitive complexity in order to more clearly articulate their meaning to students.

Make the learning targets clearer for students in the core curriculum in grades PreK—12.

Create an implementation system to determine whether or not the essential learning targets are clear to students prior to, during and after instruction.

Develop a balanced assessment system that emphasizes formative feedback to students during learning and has points of data collection after learning.

Provide time and support for teachers to collaborate on student learning


Making the Learning Targets Clearer

It establishes where the learners are in their learning.

It establishes where they are going.

It provides them with advanced organizers on how to get there.
If we don’t start with clear targets we won’t end with sound assessments.

Start with considering all indicators
Identify PIs by content area for each grade level
Link PIs to course content and course descriptions
Learning targets are written in student- and parent-friendly language
Unwrap learning indicators for the standards in order to identify concepts, skills, Essential Questions & Big Ideas
Use a learning taxonomy to identify complexity of learning targets


What do we mean by clarity?

Research indicates students can hit targets they can see

Increases opportunities for formative assessment and student feedback

Teachers talking about and agreeing on targets makes them clearer to everyone

Posting targets in the classroom and talking about them before, during and after instruction makes them more relevant

Breaking targets down into complexity makes them clearer to everyone

PD Implications of Clarity
ID Power Indicators and actually use them to make the learning targets clearer for students
Student-friendly learning targets prior to, during and after lessons
Big Ideas and Essential Questions prior to, during and after lessons
Asking students if the targets are clear
Monitor the implementation of our professional development to ensure it is changing instructional practice (classroom walkthroughs)

Finding 1: Classroom assessment feedback should provide students with a clear picture of their progress on learning goals and how they might improve.

Hattie (1992); Hattie & Timperley (2007); Bangert-Drowns, Kulik, Kulik & Morgan (1991)
Telling students whether they were correct or incorrect had a negative effect on their learning.
Explaining the correct answer and having them refine it was associated with gains in learning (20 percentile points).

Finding 1: Classroom assessment feedback should provide students with a clear picture of their progress on learning goals and how they might improve.

Fuchs & Fuchs (1986)—analyzed 21 studies
Graphic displays of results enhance student learning.
Results interpreted by a set of rules (like a rubric) enhanced student achievement by 32 percentile points.
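The percentile-point gains quoted in these findings are most naturally read as standardized effect sizes translated to a percentile scale: percentile gain = 100·Φ(d) − 50, where d is the effect size in standard-deviation units and Φ is the standard normal CDF. Assuming that conversion (an assumption about how the meta-analyses report results), the arithmetic behind figures like the 20, 26 and 32 points cited in these slides looks like this; the d values are back-calculated to match the slides, not quoted from the studies:

from statistics import NormalDist

# Convert a standardized effect size (in SD units) to a percentile-point gain,
# assuming normally distributed scores.
def percentile_gain(effect_size: float) -> float:
    return 100 * NormalDist().cdf(effect_size) - 50

# Effect sizes chosen to reproduce the percentile figures on these slides.
for d in (0.52, 0.70, 0.92):
    print(f"effect size {d:.2f} -> about {percentile_gain(d):.0f} percentile points")
# 0.52 -> ~20 points; 0.70 -> ~26 points; 0.92 -> ~32 points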

Finding 2: Feedback on classroom assessment should encourage students to improve

Kluger & DeNisi (1996): the manner in which feedback is communicated greatly affects whether it has a positive or negative effect on achievement.

When feedback is negative, it decreases achievement by 5.5 percentile points.

Marzano (2006) identified 2 characteristics of effective feedback.

Feedback must provide students with a way to interpret even low scores in a manner that does not imply failure.

Feedback must help students realize that effort on their part results in more learning.

65% = D

Finding 3: Classroom assessment should be formative

Black & Wiliam (1998)—analyzed 250 studies
Formative assessment done well results in student achievement gains of about 26 percentile points.
It has the highest impact on those students who have a history of being low achievers.

FORMATIVE ASSESSMENT
…is a planned process in which assessment-elicited evidence of students’ status is used by teachers to adjust their ongoing instructional procedures or by students to adjust their current learning tactics.

Popham, J. (2008). Transformative assessment. Alexandria, VA: ASCD.

Finding 4: Formative classroom assessments should be frequent

Bangert-Drowns, Kulik & Kulik (1991)—meta-analysis (29 studies)
Frequency of formative classroom assessments is related to student achievement.

For SPED students: 39 percentile points
Cues & corrective feedback: 37 percentile points
Cues, participation, reinforcement & corrective feedback: 27 percentile points
Reducing class size: 5 percentile points
Rewards & punishment: 5 percentile points
Teacher praise: 4 percentile points

PD Implications of Feedback
Establish data/learning teams and structure collaborative time
Provide opportunities for teachers to learn and share feedback strategies
Have teachers observe each other to see how it occurs
Monitor the implementation of our professional development to see if it is changing instructional practice (classroom walkthroughs)

Close Your Knowing-Doing Gap
Implement and monitor the things that you’re already doing
Provide people with time to reflect on the results