AUGUST 2015 PRINCIPAL MEETING ~ Billie Finco & Sherri Torkelson ~


Welcome! Tell us: who you are, where you are from, and what you do.

Meet the new CESA #4 administrator!

Cheryl Gullicksrud

Assessment Update

PALS for 15-16 (and then): annually assess each pupil enrolled in 4-year-old kindergarten to 2nd grade in the school district or in the charter school for reading readiness. The school board or operator shall ensure that the assessment evaluates whether a pupil possesses phonemic awareness and letter sound knowledge. A school board or operator may administer computer adaptive assessments.
- DPI pays (or prorates if the appropriation is insufficient)
- Can ditch your fall WKCE dates
- Fall Aspire requirement is gone

Policy passed in the budget:
- Request waiver from US Ed for WCER to select 3-5 assessments

More policy in the budget: summative assessments for all students in grades 3-10; assessments for grades 3-8 must be replaced. The replacements:
- May not use an assessment developed by SBAC
- Cover English, reading, writing, science, and mathematics
- Must meet Fed Ed rules
- Vertically scaled and standards-based
- Document progress toward College and Career Readiness Benchmarks
- Computer-based
- Predictive of college readiness as measured by IHE assessments

More policy in the budget:
- Looks like opt-outs won't count in proficiency rates
- Passing the Civics Test (60/100) required for graduation; retake until successful; SwD must take but do not need to pass

Tentative 2015-16 DPI Assessment Calendar

CALL Survey

Perhaps Sparta can fill us in.

Structures to support student learning

Assessment and Evaluation: use of formal plans, processes, and programs that may guide teaching practices in your school.

Creating a professional learning environment in your school

The work of school leaders to acquire resources, manage personnel and communicate with families

Maintaining a safe and effective learning environment.

EE Updates and Resources
- DPI specialist workflow coming sometime in September
- Teachscape: launching on the weekend of August 15th (most of Reflect); artifact/evidence updates coming over the next 3 months

Request for sample artifacts

Beginning-of-the-Year EE actions and considerations

Beginning-of-the-Year EE Planning conference guide

Evaluation scheduling tool

New Teacher and summary year educator orientation ppt

Page 1: Beginning of Year
Working collaboratively with their evaluator or a peer, educators draw upon the SLO and Outcome Summary Process Guide (see page 2) to develop a minimum of one SLO. The development of the SLO now must include the review of teacher and principal value-added, as well as graduation rates or schoolwide reading value-added (as appropriate to the role of the educator). Educators continue to document the goal within the appropriate online data management system (e.g., Teachscape or MyLearningPlan). Collaborative learning-focused conversations are required as part of the process, but flexibility exists in whom educators collaborate with in Supporting Years. However, in Summary Years, educators must conduct this process with their evaluators. What is different from what you did last year?

Page 2

What is new or different from last year?

Page 3: Teachers
Teacher Value-Added and Schoolwide Reading: When developing SLOs, teachers must review individually, as well as with teacher teams at both the grade level and across the content area (e.g., schoolwide reading value-added), to identify trends (i.e., strengths and areas for growth) across time. These trends can inform SLOs or professional practice goals, based on areas of need. Working in teams with other teachers could inform the development of a team SLO that may align to a School Learning Objective identified by the principal. Value-added trends may also illuminate strategies that have worked well, based on areas of strength, and can support ongoing instructional efforts. Working in teams with other teachers could provide the opportunity to share best practices and successful strategies which support school improvement plans and/or goals.

Let's walk through this.

Graduation Rate: When developing SLOs, high school teachers must review graduation rate data across time to identify positive or negative trends regarding the matriculation of their school's students. During this review, teachers should reflect on how their practice has supported the trends within the graduation rate data. Teachers should also review the data in vertical and horizontal teams to review school (and district) practices which positively and negatively impact graduation rates. This analysis can inform the development of SLOs, as well as professional practice goals, to support the improvement of graduation rates of the educator's students. This review can also illuminate the success of various college and career ready strategies implemented by teachers and across the school to be modified or duplicated. Educators are not required to develop a goal based on these data or to develop a goal with the intention to improve these data, unless the data indicates that is necessary.

As always, the purpose of the Educator Effectiveness System is to provide information that is meaningful and supports each individual educator's growth in their unique roles and contexts. By reviewing multiple data points, including those listed above, the educator has access to a more comprehensive view of their practice and a greater ability to identify areas of strength and need, both of which can inform the development of goals, as well as instructional/leadership strategies which can support progress towards goals.

Note: Due to the lag in data provided by DPI to districts, as well as the date in the year in which the data is provided to the districts (i.e., the following year), educators should only use the data to review trends across time when developing an SLO. Educators should not use the data to score SLOs.

We'll wrap this up at 10:45.

Should You Stay or Should You Go? The next portion of the meeting is devoted to Value-Added.

We'll walk through a ppt that you can take and use with your staff.

We will wrap up sometime between 12:00 and 12:30

Infusing Value-Added into the SLO Process (August 2015)

Turn and Talk: What do you know about value-added data?


Page 1: Beginning of Year
Working collaboratively with their evaluator or a peer, educators draw upon the SLO and Outcome Summary Process Guide (see page 2) to develop a minimum of one SLO. The development of the SLO now must include the review of teacher and principal value-added, as well as graduation rates or schoolwide reading value-added (as appropriate to the role of the educator). Educators continue to document the goal within the appropriate online data management system (e.g., Teachscape or MyLearningPlan). Collaborative learning-focused conversations are required as part of the process, but flexibility exists in whom educators collaborate with in Supporting Years. However, in Summary Years, educators must conduct this process with their evaluators. What is different from what you did last year?

Page 2

What is new or different from last year?

Page 3: Teachers
Teacher Value-Added and Schoolwide Reading: When developing SLOs, teachers must review individually, as well as with teacher teams at both the grade level and across the content area (e.g., schoolwide reading value-added), to identify trends (i.e., strengths and areas for growth) across time. These trends can inform SLOs or professional practice goals, based on areas of need. Working in teams with other teachers could inform the development of a team SLO that may align to a School Learning Objective identified by the principal. Value-added trends may also illuminate strategies that have worked well, based on areas of strength, and can support ongoing instructional efforts. Working in teams with other teachers could provide the opportunity to share best practices and successful strategies which support school improvement plans and/or goals.

Let's walk through this.

Graduation Rate: When developing SLOs, high school teachers must review graduation rate data across time to identify positive or negative trends regarding the matriculation of their school's students. During this review, teachers should reflect on how their practice has supported the trends within the graduation rate data. Teachers should also review the data in vertical and horizontal teams to review school (and district) practices which positively and negatively impact graduation rates. This analysis can inform the development of SLOs, as well as professional practice goals, to support the improvement of graduation rates of the educator's students. This review can also illuminate the success of various college and career ready strategies implemented by teachers and across the school to be modified or duplicated. Educators are not required to develop a goal based on these data or to develop a goal with the intention to improve these data, unless the data indicates that is necessary.

As always, the purpose of the Educator Effectiveness System is to provide information that is meaningful and supports each individual educator's growth in their unique roles and contexts. By reviewing multiple data points, including those listed above, the educator has access to a more comprehensive view of their practice and a greater ability to identify areas of strength and need, both of which can inform the development of goals, as well as instructional/leadership strategies which can support progress towards goals.

Note: Due to the lag in data provided by DPI to districts, as well as the date in the year in which the data is provided to the districts (i.e., the following year), educators should only use the data to review trends across time when developing an SLO. Educators should not use the data to score SLOs.

Our MISSION as educators is to improve teaching and learning.

When I ask educators if they agree with that statement, they overwhelmingly agree. They sometimes quibble a bit about the wording, and about who else is also responsible, but they don't disagree that at our core, as educators, we want students to know more, be able to do more, and BE more when they leave us than when they came to us; we want the same for the adults in the system. I call that continuous improvement.

Mindset of Improvement
"You don't have to be sick to get better!" - Michael Josephson

Sometimes the mindset of improvement is difficult. One of the greatest obstacles to improvement is ego. If the idea of getting better (learning new ways to do things, new strategies to deal with people, new ways to motivate ourselves) is thought of as an implicit criticism that we weren't good enough before, we are likely to reject it. That you are a better parent, manager, or person now doesn't mean you were bad or inadequate before. The key is to welcome personal growth as proof of your strength, not evidence of prior weakness. The phrase "You don't have to be sick to get better" simply means that improvement is always possible. It is also always desirable.

Mindset of Improvement
Continuous Improvement is for EVERYONE

CI is a mindset we want to develop in all adults (administrators, teachers, custodians, etc.), and all students. If our best and brightest think they do not need to work hard, learn, and get better, they are wrong. Continuous Improvement is about creating that mindset of "what is my next step toward getting better?" for everyone.

There are 2 general ways to look at student assessment data:

Attainment model: a point-in-time measure of student proficiency; compares the measured proficiency rate with a predefined proficiency goal.

Growth model: measures average gain in student scores from one year to the next; accounts for the prior knowledge of students.

Basically, there are two ways to measure student learning. Attainment measures the student's mastery of the content tested; it requires a single data point, like a WKCE score. Growth is determined by calculating the difference between current-year and previous-year average test scores. It allows us to determine how quickly a student, classroom, grade, school, or district is improving; it requires two consecutive years of scores.

What is Value-Added?
It is a type of growth model that measures the contribution of schooling to student performance on the WKCE in reading and in mathematics.
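The difference between the two models can be sketched in a few lines of Python (the scale scores and the proficiency cutoff here are hypothetical examples, not actual DPI values):

```python
# Attainment vs. growth, illustrated with hypothetical scale scores.
# The proficiency cutoff (2480) is made up for the example.

def attainment(score, proficiency_cutoff):
    """Attainment: a single point-in-time check against a cutoff."""
    return score >= proficiency_cutoff

def growth(prior_score, current_score):
    """Growth: the change in scale score across two consecutive years."""
    return current_score - prior_score

prior, current = 2418, 2449   # two consecutive years of scores

print(attainment(current, 2480))   # False: not yet proficient...
print(growth(prior, current))      # 31: ...but 31 points of growth
```

Note how attainment needs only one data point, while growth needs both years: the same student can look weak on attainment and strong on growth.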

Uses statistical techniques to separate the impact of schooling from other factors that may influence growth

Focuses on how much students improve on the WKCE (or our new assessment) from one year to the next, as measured in scale score points.

Value-added is a growth metric that tries to isolate the contribution of a school or grade to student performance and adjust for factors outside of the control of the school.

Holding a number of factors constant (e.g., Free/Reduced Lunch Status, ELL, Prior Achievement, Gender, Race), what impact did a school or grade have on student learning?

Value-Added Analysis measures overall school, grade, or classroom progress in student attainment, over time, by taking into account current and past student performance and adjusting for starting point and previous test, gender, free or reduced-price lunch, disability indicators, English language learner indicators, student mobility, and grade level.
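As a toy illustration of what "adjusting" means, expected growth can be thought of as the average growth of peers who share the same characteristics. The data below is invented, and the real DPI model uses regression rather than simple group means; this only sketches the idea:

```python
# Toy "expected growth" from group averages (hypothetical data;
# the actual value-added model uses regression, not group means).
from statistics import mean

# (prior_score, current_score, free_reduced_lunch, ell)
students = [
    (2418, 2449, True,  False),
    (2418, 2444, True,  False),
    (2418, 2436, True,  False),
    (2500, 2520, False, False),
]

def expected_growth(frl, ell):
    """Average growth among students with the same characteristics."""
    gains = [cur - pri for pri, cur, f, e in students if (f, e) == (frl, ell)]
    return mean(gains)

# A free/reduced-lunch, non-ELL student who grew 31 points,
# compared with similar peers (who averaged 25 points):
print(31 - expected_growth(True, False))   # 6 points above expectation
```

Comparing each student only with statistically similar peers is what lets the analysis separate the school's contribution from demographics.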

A Clearer Data Picture
Many data pieces give us a fuller picture:

- STAR
- WKCE or Badger
- AIMSweb
- ACT
- WorkKeys
- Classroom Assessments
- AP
- Surveys
- Aspire
- Observation Data
- PALS
- VA Data

Why would we care about Value-Added data?

VA allows for fairer growth comparisons to be made (in contrast to pure achievement).


School A: 90% proficiency, 6% free and reduced lunch. School B: 86% proficiency, 90% free and reduced lunch. Which school is better? Does this demographic change your thinking?

Controlling for demographic factors allows us to compare apples to apples instead of apples to oranges. By isolating the school's impact apart from the demographic makeup of its student population, it is possible to accurately compare schools that would appear to be very different.

VA allows for fairer growth comparisons to be made (in contrast to pure growth).

We know that in Wisconsin, certain groups of students do not grow (or achieve) at the same rate as others.

This can be due to the achievement level of a child (lowest students can grow the most)

This can also be related to demographics such as special education status, ELL status, race/ethnicity, economically disadvantaged status, etc.

Hi! I'm a 4th grade boy. I got a scale score of 2418 on my WKCE in reading this year!

And these are all the other boys in WI who had the exact same scale score as me. (4th grade)

For example: Let's say that the average growth of a 4th grade boy in WI with a starting scale score of 2418 is 16 points by the time he takes the test in 5th grade. If you have a boy in your school who has this same starting score in 4th grade but shows 20 points of growth...


Now I'm in 5th grade and just got a scale score of 2449 on my reading WKCE! I grew 31 points.

All of the other boys took the test again, too. Their average scale score was 2443. Their growth was 25 points. (5th grade)


So we would say that my teachers in 4th grade had a higher Value Add than would be expected.

(5th grade) Average growth was 25 points; I grew 31 points.
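The arithmetic behind this example is simply actual growth minus the expected growth of similar students:

```python
# The value-added idea from the slide example: compare one boy's
# growth with the average growth of boys who started at the same
# scale score (numbers taken from the slides).

def value_added(actual_growth, expected_growth):
    """Positive: grew more than similar peers; negative: grew less."""
    return actual_growth - expected_growth

actual = 2449 - 2418     # our boy grew 31 points
expected = 2443 - 2418   # similar boys averaged 25 points

print(value_added(actual, expected))   # 6 -> above expected growth
```

A positive difference is what the slides mean by "a higher Value Add than would be expected."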

Outside the school's influence:
- Race/Ethnicity
- Gender
- Section 504
- Economic Status
- Disability (by type)
- Prior Year Score (reading and math)
- English Proficiency (by category level)
- Mobility

Using the same process, VA controls for these factors. The expected growth for each student is calculated in the same way, by looking at the average growth across the state for each characteristic. DPI then looks at the actual growth for a given student with certain characteristics and determines if the school met the average growth expectation, or exceeded or fell below that expected growth. Once the VA score for each student is calculated, these are combined into grade-level and school VA estimates.

How do they decide what to control for?
In order to be considered for inclusion in the Value-Added model, a characteristic must meet 4 requirements. Think of the things that might go here: family income, whether or not a child has a learning disability. This is a must, and sometimes we don't have this. Think of something like the ability of a child to pay attention: it is critical and outside of the school's control, but there is really no reliable way to measure or isolate this within-child characteristic. Sometimes there is a way to collect related data, or proxy data. For economically disadvantaged status, what we really would want to know is the yearly income for each and every family. We don't have that, but in this case we do have data about free and reduced lunch status, which serves as a proxy for this characteristic. In the end, characteristics are only added to the model if they make the VA measure more accurate. WI currently controls for 8 characteristics.

Checking for Understanding
What would you tell a 5th grade teacher who said they wanted to include the following in the Value-Added model for their results?
- 5th grade reading curriculum
- Their students' attendance during 5th grade
- Their students' prior attendance during 4th grade
- Student motivation

- Would not be appropriate because it is not outside the school/teacher influence.
- Would not be appropriate because schools/teachers have a great deal of influence on attendance rates through student and parent engagement.
- Same as above.
- Difficult to collect reliable data or proxy data.

Reporting Value-Added
In the latest generation of Value-Added reports, estimates are color-coded based on statistical significance. This represents how confident we are about the effect of schools and teachers on student academic growth. Green and blue results are areas of relative strength; student growth is above average.

Gray results are on track. In these areas, there was not enough data available to differentiate this result from average.

Yellow and Red results are areas of relative weakness. Student growth is below average.


[Slide graphic: Grade 4 value-added example]

Value-Added is displayed on a 1-5 scale for reporting purposes.

About 95% of estimates will fall between 1 and 5 on the scale.

Most results will be clustered around 3. A result of 3.0 represents meeting predicted growth for your students.

Since predictions are based on the actual performance of students in your state, 3.0 also represents the state average growth for students similar to yours. Numbers lower than 3.0 represent growth that did not meet prediction.

Students are still learning, but at a rate slower than predicted.

Numbers higher than 3.0 represent growth that beat prediction.

Students are learning at a rate faster than predicted.

(Read through the orange box first, then click through.)

[Slide graphic: Grade 4 reading, value-added estimate 3.8 with a 95% confidence interval, 30 students]

Value-Added estimates are provided with a confidence interval.

Based on the data available for these thirty 4th Grade Reading students, we are 95% confident that the true Value-Added lies between the endpoints of this confidence interval (between 3.2 and 4.4 in this example), with the most likely estimate being 3.8.

Confidence Intervals

[Slide graphic: reading and math value-added estimates by grade, each shown with a confidence interval]

Color coding is based on the location of the confidence interval.

The more student data available for analysis, the more confident we can be that growth trends were caused by the teacher or school (rather than random events).
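One plausible way to turn an interval's position into a color, sketched below. The exact decision rule of the DPI reports is not spelled out on these slides, so this is an assumption for illustration:

```python
# Hypothetical color-coding rule: compare the whole confidence
# interval with the 3.0 (state average) line. This is an assumed
# rule for illustration, not the documented DPI algorithm.

def color_code(ci_low, ci_high, average=3.0):
    if ci_low > average:
        return "green/blue"    # entire interval above average
    if ci_high < average:
        return "yellow/red"    # entire interval below average
    return "gray"              # interval straddles average: "on track"

# The Grade 4 reading example: estimate 3.8, interval 3.2 to 4.4.
print(color_code(3.2, 4.4))    # green/blue
```

With fewer students, the interval widens, is more likely to straddle 3.0, and the result is more likely to be reported as gray ("on track").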

Let's Look at the Reports Available in School Access File Exchange (SAFE)

We begin with some caveats! VA is one data source among many that provides a different perspective on student growth.

VA should never be the sole data source to identify effective/ineffective schooling!

Taking VA out of the Student Outcome score allows each educator to decide how (or if) this data informs the SLO process.

This next section is intended for you to use with your own school VA reports. You may choose to use your snipping tool to insert your school data in the appropriate place. Printing color copies of your VA reports for staff might also be helpful.

Page 1: Introduction to VA Color Coding

Page 2With a partner: What are some observations you can make about this data as a school?

How might teacher teams use this data?

Share your thinking.

Let's look at our VA as a school
With a partner: What does this data suggest about how we are growing students in reading and in math?
(Use your snipping tool to insert a screenshot of the top section only of your own school VA report, page 2, here; it will look something like this.)
Share your thinking.

Let's look at our VA by grade
With a partner: What does this data suggest about how we are growing students across grades in our school?
(Use your snipping tool to insert a screenshot of the bottom section only of your own school VA report, page 2, here; it will look something like this.)
Share your thinking.

Pages 3 & 4

With a partner: What are some observations you can make about this subgroup data?

What questions do you have?

Let's look at our reading VA with subgroups
With a partner: How effective was our school in growing different groups of students in reading?

(Use your snipping tool to insert a screenshot of your own school VA report, page 3, here; it will look something like this.)

Let's look at our math VA with subgroups
With a partner: How effective was our school in growing different groups of students in math?

(Use your snipping tool to insert a screenshot of your own school VA report, page 4, here; it will look something like this.)
Share your thinking.

Page 5

Introduction to VA Scatter Plots
This is just a picture of page 5, an introduction to the scatter plots.

Notice that this graph plots achievement and growth together. The black lines in the center represent average. The vertical line represents typical or average growth, the horizontal line represents average proficiency rates on the state assessment. The shaded bands represent standard deviation above and below the state average and are considered within the average range. The shaded area in the horizontal band represents proficient/advanced scores that are within one standard deviation of the mean (average). The shaded area in the vertical band represents VA scores that are within one standard deviation of the mean (average).


Discuss...
With a partner: How might a teacher team use this data to identify an area of focus for their SLO?

Pages 6 & 7: Grade-level VA and Achievement
Let's look at our VA and achievement plotted together.

(Use your snipping tool to insert a screenshot of your own school VA/Achievement scatter plots here; they will look something like this. If these are too small to see, you may need to print colored copies for pairs or groups.)
With a partner: What stands out in our school data when we look at achievement and growth together? (Allow ample time for pairs to process.)
Share your thinking.

How do/don't these reports add to our total data picture?

How might today's learning apply to your own SLO?