What Are SLOs?
Transcript of Dr. Voltz's presentation at the Raising Student Achievement Conference
Dr. Richard Voltz, Associate Director, Illinois Association of School Administrators
PERA (Performance Evaluation Reform Act)
• Performance Evaluation Reform Act of 2010 (PERA)
• New evaluations for teachers and principals that address practice and student performance, in an effort to improve student achievement
• Guided by the work of PEAC (Performance Evaluation Advisory Council)
– 32 representative members, P-20
– Has met monthly since 2010
– State models and guidance for districts
– Open meetings
– Website info
Two Parts
Teacher Practice: 50% to 75%
Student Growth: 50% to 25%
Common Approaches to Measuring Student Growth
• Simple Growth Model: measures the difference in student attainment over time.
• Value-Added Model: measures the difference in student attainment over time, controlling for stable student factors (e.g., race, SES).
At least one Type I or Type II assessment
At least one Type III assessment
Type I
A reliable assessment that measures students in the same manner with the same potential assessment items, is scored by a non-district entity, and is administered beyond Illinois (norm-referenced).
Type II
Developed, adopted, approved, and utilized district-wide (example: a district-wide Algebra test).
Type III
Rigorous and aligned with the course curriculum. The evaluator and teacher determine the measures of student learning (classroom tests, portfolios).
Must have one from Type I or Type II and one from Type III
ISBE Assumptions
• Districts should pilot student growth for one year prior to implementation
• Districts should use PARCC as a Type I assessment for math and ELA
• Much work will be done outside of formal PERA Joint Committee meetings
Student growth is "demonstrable change in a student's learning between two or more points in time."
Who decides?
• The district PERA Joint Committee decides metrics and targets for teachers, including subgroups (ELL, etc.).
• The evaluator and principal agree upon metrics and targets for principals.
Questions about student growth
• What assessments will you choose?
• How will you measure core (tested) courses?
• How will you measure non-tested areas?
• If you use a portfolio, what is the rubric?
• What happens with co-teaching?
• What is the appropriate attendance/class time to consider?
• What if a student changes sections?
• How does block scheduling fit?
• What is the minimum number of students?
• What is the target growth?
• How do the 4 ratings fit into the scheme of student growth?
Student Growth Metrics Should Align to Education Best Practices
• Standards based
• Team teaching
• Professional learning communities
• Do not put teachers into competition with each other
• Each teacher should be compared to a standard, so all could potentially receive favorable ratings
Are SLOs Required? Why Would You Choose SLOs?
• Districts decide on their own.
• For school districts defaulting to the state model for student growth for Type III assessments, SLOs are the required measurement model for student growth.
What is the process?
• Design committee
• Formalized PERA Joint Committee
– The committee has 180 days to agree
– Then the ISBE "default plan" applies to whatever parts are not agreed upon
Plan Requirements
• Multiple data points
• One Type III required
• Decide on the Type III assessment:
– Teacher created
– Textbook created
– Student work samples or portfolios
– Student performance assessment
– Designed by grade-level experts
Plan Requirements
• Teachers without a Type I or II must include two Type III assessments
• Student growth expectations are consistent with the assessment and model selected
• Requires a midpoint review of progress, which may adjust expectations
• Determine how student characteristics (special education, ELL) are used
Suggested Timeline
• Each district should pilot its student growth approach for one year prior to full implementation
• PARCC assessments will be considered an appropriate Type I assessment for math and ELA when they are available
More Suggestions
• Gradual implementation
• Pilot without stakes
• Sample pilot
• Revise as you learn more
Decisions of the Joint Committee
• Determine which categories of teachers will be required to have a single Type III assessment and which will have two Type III assessments.
• Decide what types of SLOs will be allowed and under what conditions they can be used.
• Select and articulate each step the teacher and administrator should follow to develop an SLO.
• Select the appropriate Type III assessments for each category of teacher. Identify assessments that will need to be developed and the supports needed to do so.
• Select or develop an SLO review and documentation process.
• Decide how SLOs will be scored and combined with other measures of student growth. Determine what percentage or weight your district will attribute to the SLOs within the broader evaluation system.
• Develop a plan for monitoring and evaluating the SLO process.
Questions
• Are the assessments currently in use in your district aligned to the standards?
• What Type I, Type II, and Type III assessments does the district currently have available to use for student growth purposes?
– The district should list each category of teacher followed by the specific Type I, II, and III assessments available.
District Assessment Identification Tool
For each category of teacher, list the available Type I, Type II, and Type III assessments:
Early Elementary (Pre-K, K); 1st-5th Grade Core; Elementary PE; Elementary Resource; 6-8 Math; 6-8 ELA; 6-8 Science; 6-8 Social Studies; 6-8 PE; 6-12 Health; 6-8 Resource; HS Math; HS English; HS Biology; HS Physics; HS Social Studies; HS PE; HS Foreign Language; HS Driver Education; HS Business; HS CTE
Build, Buy, Borrow: Selecting Appropriate Assessments
• Does the assessment match the content that the teacher(s) intend to teach?
• Do a majority of the items on the assessment align with the curriculum standards identified?
• Does the assessment measure growth over the interval of instruction? How?
• Will the data from the assessment be beneficial to teachers? Students? The district? How?
• Are the assessments administered the same way?
• Are the assessments scored the same way?
SLOs
What are SLOs?
• Targets of student growth that teachers set at the start of the school year and strive to achieve by the end of the semester or school year.
• These targets are based on a thorough review of available data reflecting students' baseline skills, and are set and approved after collaboration and consultation with colleagues and administrators.
What is in an SLO?
• Baseline and trend data
– Specify the data used; it should be measurable and should target specific academic concepts, skills, or behaviors. What does the data show you about the students' starting points?
• Possible data
– Pre-assessment
– Review of students' previous performance
• Student population
– Which students will be included in this SLO? Include course, grade level, and number of students. The evaluator is involved in the process.
– ALL students should be included; exclusions need to align to PEAC and district guidelines.
– Example: "All of my 3rd period class of seventh grade science students. There are 18 students in the class."
• Interval of instruction
– What is the duration of the course that the SLO will cover? Include beginning and end dates.
– Example: "This is a unit SLO for Chemistry. This area of the curriculum generally runs from the beginning of December through the end of February."
• Standards and content
– What content will the SLO target? To what related standards is the SLO aligned?
– 11.A.3c Collect and record data accurately using consistent measuring and recording techniques and media.
– 12.C.3a Explain interactions of energy with matter including changes of state and conservation of mass and energy.
– 12.C.3b Model and describe the chemical and physical characteristics of matter (e.g., atoms, molecules, elements, compounds, mixtures).
– 13.A.3a Identify and reduce potential hazards in science activities (e.g., ventilation, handling chemicals).
– 13.B.3f Apply classroom-developed criteria to determine the effects of policies on local science and technology issues (e.g., energy consumption, landfills, water quality).
– CC.7.W.3.d Text Types and Purposes: Use precise words and phrases, relevant descriptive details, and sensory language to capture the action and convey experiences and events.
• The assessment(s) to be used
– What assessment(s) will be used to measure student growth for this SLO?
– Example: a department (PLC) created Chemistry unit exam which includes a hands-on component, a multiple-choice section, and a written essay response.
• Student characteristics
– What accommodations will you make to allow for the consideration of the characteristics of special student populations (special education, ELL, at-risk, etc.)?
– For special education students, IEP requirements will be followed; for example, some students will take an alternate form of the test with questions adapted to simpler language or read aloud. Growth goals will be adapted to each student on an individual basis based upon prior growth evidence.
– ELL students will be tested using a modified form of the exam. Growth goals will be adapted to each student on an individual basis based upon prior growth evidence.
– An at-risk/poverty student with absenteeism issues will have a less ambitious growth goal due to lack of exposure to the material during the unit. If a student misses more than 95% of the school year, removal from the SLO may result.
– All students scoring more than 95% on the pre-test will be given an alternate assessment for the post-test: an essay-style test covering the same standards in a different and higher-level manner, requiring students to show a deeper level of synthesis. It will be scored with the district-approved rubric for writing in the content area, and these students will be expected to score 3.5 or better to meet the growth goal.
– All students not identified in the above four categories will have rigorous but reasonable growth goals based upon prior baseline data indicators. (Most will be expected to grow a minimum of 15%.)
• Growth targets
– Considering all available data and content requirements, what growth target(s) can students be expected to reach?
– Growth targets should never be based on IEP goals. (SLOs are for groups of students; an IEP is for an individual student.)
– See the attached student roster of growth goals.
• Rationale for growth target
– What is your rationale for setting the above target(s) for student growth within the interval of instruction?
– Example: "This goal is reasonable because I will have ample time to instruct my students. There will be three chapter tests along the way, so I can monitor and adjust instruction when necessary. I have built in 3 days for full-class re-teaching if necessary. Kids on track will have alternate work those days."
• Mid-point learning data review
– What kind of mid-point data did you review in order to gauge student progress towards goals? What did your review reveal? What adjustments to instruction will be made (if any)?
– Example: "Review of the chapter 4 and 5 tests. Implemented two re-teach days so far. Re-taught the Bohr model to the whole class on day 18 after informal assessments revealed widespread misunderstanding."
Types of SLOs
• Course-level SLOs: focused on the entire student population for a given course, often across multiple classes
• Class-level SLOs: focused on the student population in a specific class
• Targeted student SLOs: separate SLOs for subgroups of students who need specific support
• Targeted content SLOs: separate SLOs for specific skills or content that students must master
• Tiered SLOs: course- or class-level SLOs that include differentiated targets for the range of student abilities
Teachers should not develop SLO assessments in isolation. Assessments should be developed by content and grade-level experts, or in a collaborative PLC learning environment made up of all the teachers in the subject and/or grade level.
What do we want all students to know and be able to do?
How will we deliver content?
How do we know all students are learning?
What will we do if students are not learning?
For teacher evaluation purposes, common formative assessments should gauge student growth in essential skills/knowledge, not student attainment on a specific subject test or quiz.
Growth is NOT attainment.
Student growth should cover a recurring set of standards/objectives.
1. Endurance: Will this standard or indicator provide students with knowledge and skills that will be of value beyond a single test date? This is information a student will need to know far beyond the last test the teacher gives.
2. Leverage: Will this provide knowledge and skills that will be of value in multiple disciplines? (For example, making inferences is a skill that can be used in many subjects.)
3. Readiness for the next level of learning: Will this provide students with essential knowledge and skills that are necessary for success in the next grade or the next level of instruction?
Ainsworth, L. (2003)
Aligning to Common Core Essential Skills/Knowledge
Distinguish content vs. skills. SLOs need to be focused on academic targets that are both long-term and measurable. PLCs are vital for providing input and answers for student growth measures.
• What will be assessed? What all students have to know and be able to do.
• How will it be assessed? Selected responses, constructed responses, performance.
• Determine the complexity of the assessment. Determine how many levels.
Sandoval School District SLO Process
The next slide is the most important slide of this entire presentation!
Baseline
• What do you know about your students?
• What does the data tell you?
• What are their strengths and weaknesses?
• How did your students perform on the pre-test?
• What student needs are identified from the data?
• Set your criteria ahead of time:
– Must be measurable
– Use allowable data to drive instruction and set growth targets
– Target specific academic concepts, skills, or behaviors
– What assessments are available in your district?
Population
• Identify all students being included in the SLO.
• Set your criteria ahead of time:
– Attendance (mobile students, late move-ins)
– Pre-test data must be available
– Exceptions are allowed with approval
Objective
• What is your long-term goal for advancing learning?
• What are the students expected to do or know by the end of the semester/year?
• Set criteria ahead of time:
– Rigorous
– Targets specific academic or behavioral skills
– Must use baseline data
– Must be measurable
– Collaboration is required
Examples
• Students will be able to write reflections responding to a narrative selection that demonstrate higher-order thinking skills.
• Students will increase their comprehension, vocabulary, and fluency in reading.
• Students will use the scientific method to organize, analyze, evaluate, make inferences, and predict trends using data from classroom experiments.
• Students will demonstrate an understanding of quadratics and exponent rules.
Rationale
• What is the compelling "why" behind choosing the objective?
• Why is it important to cover the content?
• Using your data analysis, how does the content relate to student strengths and weaknesses?
• Set criteria ahead of time:
– Align with school and district improvement plans
– Align with teaching strategies and learning content
– Classroom data is reviewed for strengths and needs by student group, subject, concept, skill, and behavior
Example
• According to the pre-assessment, students struggle with motive, inference, making predictions, and drawing conclusions from text, so I will focus on these specific reading skills. Most students (19/23) have mastered character traits, main idea, cause and effect, and summarizing.
Strategies
• How will you help your students achieve the objective?
• Set criteria ahead of time:
– Identify the type of instruction or key strategies
– Be appropriate for the learning content and skill level
– Research based
Targeted Growth
• How much growth is expected by the end of the evaluation cycle?
• Set criteria ahead of time:
– Maximum of 5 tiers
– Expressed in whole numbers
– Encourage collaboration
– Covers 75% of the population
– Based upon pre-assessment data
– Students can uphold high achievement
– Quantifiable goals
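To make the tier criteria concrete, here is a hypothetical sketch of tiered growth targets. The cut points and gains are invented for illustration, not district guidance; the idea is simply that students in lower pre-assessment bands get more ambitious whole-number targets, using no more than 5 tiers.

```python
def target_gain(pre_score):
    """Return a whole-number growth target for a pre-assessment score.

    Tiers are (upper bound of band, target gain); at most 5 tiers,
    per the criteria above. These specific numbers are invented.
    """
    tiers = [(20, 25), (40, 20), (60, 15), (80, 10), (101, 5)]
    for upper, gain in tiers:
        if pre_score < upper:
            return gain

print(target_gain(35))  # 20: low pre-score, ambitious target
print(target_gain(85))  # 5: already high-scoring, smaller target
```

Because targets come from pre-assessment bands rather than demographics, students with the same starting point get the same expectation.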
Assessment
• What assessment will be used to measure student growth?
• Set criteria ahead of time:
– Administered in a consistent manner and data secure
– Applicable to the purpose of the class and reflective of the skills being covered in the class
– Produces timely and useful data
– Standardized: same content, administration, and reporting of results
– Aligned with standards
SLO Expectations
• Elementary: ELA and math
• Middle school and high school: teachers of multiple content areas must have objectives in at least 2 content areas
• All students in the class must be assessed
Scoring SLOs
• Assign each SLO a value of 1-4
• SLOs are averaged (keep the decimal value)

Finalizing the Performance Evaluation Rating
• 75% Teacher Practice, 25% Student Growth
• Teacher Practice Rating (1-4) x 0.75 + Student Growth Rating (1.0-4.0) x 0.25 = Overall Rating
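The scoring rules above reduce to simple arithmetic. A worked example with invented sample ratings:

```python
# Hypothetical ratings: three SLOs, each scored 1-4.
slo_scores = [3, 4, 3]

# SLOs are averaged, keeping the decimal value.
student_growth = sum(slo_scores) / len(slo_scores)

# Combine with the teacher practice rating at 75%/25%.
teacher_practice = 3
overall = teacher_practice * 0.75 + student_growth * 0.25

print(round(student_growth, 2))  # 3.33
print(round(overall, 2))         # 3.08
```

Keeping the decimal on the averaged growth score matters: rounding it to a whole number before weighting would shift the overall rating.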
Student Demographics
• Do not adjust expectations for students based on a student's demographic or AYP classifications.
• Students with the same performance history should not have different achievement expectations based on their demographics.
Use External and Internal Assessments
• Student achievement growth should be derived from both external and internal assessments.
• These assessments need to be universally administered.
• Districts should not use different tests for different teachers in the same content area.
Measurement Model
• Per state statute (Illinois Administrative Code, Part 50), districts must adopt a measurement model that will be used to analyze changes in student test scores.
• Districts need to compare the student's projected achievement and the student's actual achievement as the measurement model for growth.
Student growth projections should be based on the same general methodology across all grades, subjects, tests, and rubrics.
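As an illustration only, the projected-vs-actual comparison might be sketched like this. The projection rule used here (pre-test score plus a district-set expected gain) is a placeholder assumption, not the state's actual model; the point is that the same rule applies across grades, subjects, and tests.

```python
# Hypothetical district-set expectation for the interval of instruction.
EXPECTED_GAIN = 10

def growth_vs_projection(pre_score, actual_post):
    """Compare actual achievement to projected achievement.

    Positive result: student exceeded the projection.
    Negative result: student fell short of it.
    """
    projected = pre_score + EXPECTED_GAIN
    return actual_post - projected

print(growth_vs_projection(50, 63))  # 3: exceeded projection
print(growth_vs_projection(50, 55))  # -5: fell short
```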
Reliability
• Research is conclusive in documenting that growth scores from multiple measures are more reliable than growth scores from a single measure.
• Combining growth scores into a single summative growth score for the teacher will greatly improve the reliability of the district's teacher evaluation system.
Common Misunderstandings
• The new ISBE growth value table model is unrelated to the default state growth model for teacher evaluation.
• Growth value tables are for NCLB purposes.
• The default state growth model is the work of PEAC, developed for principal and teacher evaluation.
ISAT is allowable for teacher evaluation.
Type III assessments need not be teacher created for use in the teacher's classroom.
Assessments meeting the definition of Type I and/or Type II can also be used as a Type III, provided they align to the curriculum.
A reliable and valid assessment does not ensure a reliable and valid system for measuring growth.
Student Learning Objectives (SLOs) as a methodology still require that the district adopt a measurement model to quantify how changes in student test scores reflect changes in student knowledge or skills.
Foundational Issues
• Assessment does not equal performance.
• A Type I can be a Type II and can be a Type III.
• If test results are within normal ranges, then the teacher practice score trumps.
• Focus on the reliability and validity of systems, not of tests.
• Combine local tests with norm-referenced tests to increase reliability.
• Focus on building good performance evaluation systems, not good tests.
• The only score that matters is the score you use for rating purposes.
• Reliability is a function of psychometric analysis.
For additional information contact:
Dr. Richard Voltz
[email protected]
217-741-0466
http://richvoltz.edublogs.org