Data-Based Problem Solving and Data Systems
Shelby Robertson, Ph.D.; Therese Sandomierski, MA; Pamela Sudduth, MA


  • Slide 1
  • Data-Based Problem Solving and Data Systems. Shelby Robertson, Ph.D.; Therese Sandomierski, MA; Pamela Sudduth, MA
  • Slide 2
  • This Session: Solidify a vision for problem solving at Tier 1 See some examples of what it looks like for different domains Become familiar with some resources that are available to support DBPS
  • Slide 3
  • DBPS Workgroup Develop a model/template for data-based problem solving across tiers Can be applied by schools and districts Primary outcomes will be the conceptual framework, training resources, and exemplars for professional development at the district level. Library for consultants
  • Slide 4
  • What Is Data-Based Problem Solving? Decisions in an MTSSS Framework are based on student performance data. Data-Based Problem Solving is infused in all components of MTSSS practice. At the screening level, data is used to make decisions about which students are at risk of their needs not being met. At the progress monitoring stage, data is used to make decisions about the effectiveness of interventions. Decisions to increase or decrease levels of intervention within a Multi-Tiered Systems of Support Framework are based on student performance data.
  • Slide 5
  • Why is Data-Based Problem Solving Important? Data-based decisions regarding student response to intervention are central to the MTSSS Framework. Important educational decisions about intensity and likely duration of interventions are based on individual student response to instruction across multiple tiers of interventions and are informed by data on learning rate and level.
  • Slide 6
  • Why is Data-Based Problem Solving Important? Knowing why and for what purpose data is being collected is imperative. When the purpose and intent of data collection are known, the data can be used to make various decisions.
  • Slide 7
  • What Should Schools Consider? Three types of data are gathered within an MTSSS process: data from universal screening is used to identify those students who are not making academic or behavioral progress at expected rates; data from diagnostic assessment is used to determine what students can and cannot do in important academic and behavioral domains; data from progress monitoring is used to determine if academic or behavioral interventions are producing desired effects.
  • Slide 8
  • Data collection leads to appropriate support and strategic instruction for ALL students. When looking at data, a team may decide: if the delivery of the core curriculum should be altered, if more information is needed, or if supplemental instruction needs to be added. Data that is collected will also inform the school whether or not the problem exists as a result of the classroom environment, intervention, curriculum, instruction, or learner.
  • Slide 9
  • Problem Solving Process (Response to Intervention, RtI): Define the Problem: What Do We Want Students to KNOW and Be Able to DO? Problem Analysis: Why Can't They DO It? Implement Plan: What Are WE Going To DO About It? Evaluate: Did It WORK?
  • Slide 10
  • Step 1/Tier 1 Integrated Guiding Questions: Problem Identification. What do we expect our students to know, understand, and be able to do as a result of instruction? Do our students meet or exceed these expected levels? (How sufficient is the core?) Are there groups for whom core is not sufficient?
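As a rough illustration of these Step 1 questions, the hypothetical Python sketch below computes the percentage of students meeting a screening benchmark overall and by subgroup. The student records, the subgroup labels, and the 80% core-sufficiency guideline are assumptions for illustration, not part of the original slides.

```python
# Hypothetical sketch: judging core sufficiency from universal screening data.
# Records, subgroup labels, and the 80% guideline are illustrative assumptions.

screening = [
    {"student": "A", "subgroup": "ELL",    "met_benchmark": True},
    {"student": "B", "subgroup": "ELL",    "met_benchmark": False},
    {"student": "C", "subgroup": "Gen Ed", "met_benchmark": True},
    {"student": "D", "subgroup": "Gen Ed", "met_benchmark": True},
]

def percent_meeting(records):
    """Percent of records where the student met the screening benchmark."""
    return 100 * sum(r["met_benchmark"] for r in records) / len(records)

# How sufficient is the core?
print(f"All students: {percent_meeting(screening):.0f}% meeting benchmark")

# Are there groups for whom core is not sufficient?
for group in sorted({r["subgroup"] for r in screening}):
    members = [r for r in screening if r["subgroup"] == group]
    pct = percent_meeting(members)
    note = "review core for this group" if pct < 80 else "core appears sufficient"
    print(f"{group}: {pct:.0f}% meeting benchmark ({note})")
```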
  • Slide 11
  • Full Option Graduates! Both domains focus on a common goal:
  • Slide 12
  • Academics and Behavior. What do we expect our students to know, understand, and be able to do as a result of instruction? Academics: NGSSS for all grade levels and content areas. Behavior: school-wide expectations, Character Education traits, school-wide social skills curricula. School/District mission statements. To effectively address student outcomes, schools must address both domains.
  • Slide 13
  • How sufficient is the core?
  • Slide 14
  • Are there groups for whom core is not sufficient?
  • Slide 15
  • How sufficient is the core?
  • Slide 16
  • Are there groups for whom core is not sufficient?
  • Slide 17
  • How to Answer the Questions (Behavior): attendance, tardies, suspensions, discipline referrals; surveys (locally developed, safety, climate, substance abuse); percent participating in Tier 1 system.
  • Slide 18
  • How sufficient is the core? www.swis.org
  • Slide 19
  • How sufficient is the core?
  • Slide 20
  • Are there groups for whom core is not sufficient? www.flrtib.org
  • Slide 21
  • Are there groups for whom core is not sufficient? www.flrtib.org
  • Slide 22
  • Step 2/Tier 1 Integrated Guiding Questions: Problem Analysis. If the core is NOT sufficient for either a domain or a group of students, what barriers have precluded or could preclude students from reaching expected levels?
  • Slide 23
  • Instruction, Curriculum, Environment, Learner (ICEL):
    Instruction: alignment with standards and across grade/school levels; relevancy to students' personal goals; content, pacing, progression of learning; differentiation
    Curriculum: cognitive complexity of questions and tasks; gradual release of responsibility; appropriate scaffolding; connection to students' personal goals, interests, and life experiences
    Environment: reward/consequence system; visual cues; climate/culture; quality of student/adult relationships; quality of peer relationships; high expectations for ALL students; collaboration and voice
    Learner: reinforcement preferences; perceptions of competence and control; perceived relevancy of instruction/education; integration and affiliation with school; academic/social-emotional skill development
  • Slide 24
  • Hypotheses and Data Source Examples:
    I (Instruction): Instruction did not include modeling and guided practice. Data sources: lesson plans, observations, report/survey data, permanent products.
    C (Curriculum): Skills targeted in the lessons did not align with the NGSSS. Data sources: lesson plans, observations of tasks, assignments, and assessments.
    E (Environment): School-wide reinforcement program includes few developmentally appropriate reinforcement options. Data sources: review of school-wide behavior plan, student survey and student focus group feedback.
    L (Learner): A substantial amount of instructional time is lost due to excessive absenteeism. Data sources: attendance, ODRs, suspensions.
  • Slide 25
  • Reaching Expected Levels: If the core is NOT sufficient for either a domain or a group of students, what barriers have precluded or could preclude students from reaching expected levels?
  • Slide 26
  • What potential barriers have precluded us from improving student outcomes? Lack of: common assessments, common planning, ongoing progress monitoring, curriculum mapping aligned with NGSSS and common assessments, resource availability, administrative support, professional development.
  • Slide 27
  • Analyzing Identified Problems. Possible data sources for analysis:
    I (Instruction): lesson plan review, instructional observations, survey data, permanent products
    C (Curriculum): lesson plans, observations of tasks, assignments, and assessments
    E (Environment): review of school-wide behavior plan, student survey and student focus group feedback, walk-through assessments, climate surveys, behavior plan/fidelity measures
    L (Learner): attendance, ODRs, suspensions, assessment of academic/social-emotional skill development
  • Slide 28
  • The school-wide reinforcement program IS NOT being implemented with fidelity
  • Slide 29
  • Step 3/Tier 1 Integrated Guiding Questions: Plan Development & Implementation. What strategies or interventions will be used? What resources are needed to support implementation of the plan? Planning for Step 4: How will sufficiency and effectiveness of core be monitored over time? How will the data be displayed? How will fidelity of interventions be monitored over time? How will fidelity of the problem solving process be monitored over time? How will good, questionable, and poor responses to intervention be defined?
  • Slide 30
  • What strategies or interventions will be used?
  • Slide 31
  • Math Resources What resources are needed to support implementation of the plan?
  • Slide 32
  • Literacy Resources What resources are needed to support implementation of the plan?
  • Slide 33
  • http://www.flrtib.org http://flpbs.fmhi.usf.edu
  • Slide 34
  • Tier 1 Interventions (Behavior): Based on the function of the problem behavior. Teach the skill, reward the skill, consequate effectively. Referrals broken down by expectation, context, motivation, and admin decision will help inform the possible function. See www.flpbs.fmhi.usf.edu for examples.
  • Slide 35
  • How will sufficiency and effectiveness of core be monitored over time? Common Assessment Example
  • Slide 36
  • Monitoring the Core (Behavior): Referrals per Day/Month www.flrtib.org
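As a loose sketch of the referrals-per-day calculation behind graphs like this one, the Python below tallies office discipline referrals (ODRs) per school day by month. The referral dates and school-day counts are invented for illustration; in practice, systems such as SWIS or the RtI:B database produce these reports directly.

```python
# Hypothetical sketch: office discipline referrals (ODRs) per school day, by month.
# Referral dates and school-day counts below are invented for illustration.
from collections import Counter
from datetime import date

referrals = [date(2013, 9, 4), date(2013, 9, 4), date(2013, 9, 18),
             date(2013, 10, 2), date(2013, 10, 9), date(2013, 10, 9),
             date(2013, 10, 23), date(2013, 11, 6)]

school_days = {"2013-09": 20, "2013-10": 22, "2013-11": 17}  # assumed calendar

monthly = Counter(d.strftime("%Y-%m") for d in referrals)
for month, days in school_days.items():
    print(f"{month}: {monthly.get(month, 0) / days:.2f} referrals per school day")
```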
  • Slide 37
  • How will fidelity be monitored over time? Fidelity of implementation is the delivery of instruction in the way in which it was designed to be delivered. Fidelity must also address the integrity with which screening and progress-monitoring procedures are completed and an explicit decision-making model is followed. Fidelity also applies to the problem solving process: bad problem solving can lead to bad decisions to implement otherwise good interventions.
  • Slide 38
  • Monitoring the Core (Behavior): Fidelity. Depends on the intervention! Lesson plans with built-in fidelity checklists, permanent products of lessons, token sign-out logs, counts of positive post cards, parent call logs, implementation measures (surveys, focus groups, observations).
  • Slide 39
  • Implementation Measures: PBS Implementation Checklist
  • Slide 40
  • Implementation Measures: Benchmarks of Quality
  • Slide 41
  • How will good, questionable, and poor responses to intervention be defined? Decision Rules: Positive Response: gap is closing; can extrapolate point at which target student(s) will come in range of target, even if this is long range. Questionable Response: rate at which gap is widening slows considerably, but gap is still widening; or gap stops widening but closure does not occur. Poor Response: gap continues to widen with no change in rate.
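One way to make these decision rules concrete is to compare the gap between expected and observed performance at successive assessment points. The sketch below is an assumed operationalization with invented scores; it is not an official scoring rule.

```python
# Hypothetical sketch: classifying response to intervention from the gap between
# expected and observed performance at successive assessment points (e.g., Fall,
# Winter, Spring). Scores and logic details are invented for illustration.

def classify_response(expected, observed):
    """Apply the positive/questionable/poor decision rules to a series of gaps."""
    gaps = [e - o for e, o in zip(expected, observed)]
    if gaps[-1] < gaps[0]:                      # gap is closing
        return "Positive"
    first_widening = gaps[1] - gaps[0]
    last_widening = gaps[-1] - gaps[-2]
    if last_widening <= 0 or last_widening < first_widening:
        return "Questionable"                   # widening slowed or stopped, no closure
    return "Poor"                               # gap keeps widening at the same rate

expected = [40, 55, 70]                         # benchmark at Fall, Winter, Spring
print(classify_response(expected, observed=[30, 50, 68]))   # Positive: gap closing
print(classify_response(expected, observed=[30, 42, 55]))   # Questionable: widening slows
print(classify_response(expected, observed=[30, 40, 50]))   # Poor: gap keeps widening
```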
  • Slide 42
  • Defining Adequate Response: Tier 1 for Behavior. School-wide screenings (< 20% identified); ODRs by October (< 2 majors); teacher nominations, ESE (EBD) referrals; declining trend* in discipline data; attendance, tardies.
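A small sketch of how a team might check school-wide data against these Tier 1 behavior criteria follows; the enrollment, referral counts, and exact thresholds used here are illustrative assumptions taken loosely from the slide.

```python
# Hypothetical sketch: screening the whole school against Tier 1 behavior criteria.
# Enrollment, referral counts, and thresholds are illustrative assumptions.

enrollment = 500
students_identified = 85            # students flagged by school-wide screening
majors_by_october = {"S001": 1, "S002": 3, "S003": 0}   # major ODRs per student

pct_identified = 100 * students_identified / enrollment
print(f"Identified by screening: {pct_identified:.0f}% (criterion: < 20%)")

at_risk = [s for s, majors in majors_by_october.items() if majors >= 2]
print(f"Students with 2 or more major ODRs by October: {at_risk}")
```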
  • Slide 43
  • Step 4/Tier 1 Integrated Guiding Questions: Plan Evaluation of Effectiveness. Have planned improvements to core been effective?
  • Slide 44
  • Positive Response to Intervention (graph: expected vs. observed performance across Fall, Winter, Spring): gap is closing; can extrapolate point at which target student(s) will come in range of target, even if this is long range.
  • Slide 45
  • Positive Response to Intervention (graph: expected vs. observed trajectory over time).
  • Slide 46
  • Questionable Response to Intervention (graph: expected vs. observed performance across Fall, Winter, Spring): rate at which gap is widening slows considerably, but gap is still widening; or gap stops widening but closure does not occur.
  • Slide 47
  • Questionable Response to Intervention (graph: expected vs. observed trajectory over time).
  • Slide 48
  • Poor Response to Intervention (graph: expected vs. observed performance across Fall, Winter, Spring): gap continues to widen with no change in rate.
  • Slide 49
  • Poor Response to Intervention (graph: expected vs. observed trajectory over time).
  • Slide 50
  • Have our interventions been effective?
  • Slide 51
  • Slide 52
  • Decisions: What to do if RtI is Positive. Continue intervention with current goal; continue intervention with goal increased; or fade intervention to determine if student(s) have acquired functional independence.
  • Slide 53
  • Decisions: What to do if RtI is Questionable. Was our DBPS process sound? Was intervention implemented as intended? If no, employ strategies to increase implementation integrity. If yes, increase intensity of current intervention for a short period of time and assess impact: if rate improves, continue; if rate does not improve, return to problem solving.
  • Slide 54
  • Decisions: What to do if RtI is Poor. Was our DBPS process sound? Was intervention implemented as intended? If no, employ strategies to increase implementation integrity. If yes: Is intervention aligned with the verified hypothesis? (Intervention Design) Are there other hypotheses to consider? (Problem Analysis) Was the problem identified correctly? (Problem Identification)
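The three decision slides above can be read as a single decision flow. The sketch below is one assumed encoding of that flow, paraphrasing the slides; it is not an official rubric.

```python
# Hypothetical sketch: next steps given the response to intervention (RtI) and
# whether the intervention was implemented with fidelity. Wording is paraphrased
# from the slides; the function itself is an assumed encoding, not an official tool.

def next_steps(response, implemented_with_fidelity):
    if response == "Positive":
        return ("Continue with current goal, raise the goal, or fade the "
                "intervention to check for functional independence.")
    if not implemented_with_fidelity:
        return "Employ strategies to increase implementation integrity."
    if response == "Questionable":
        return ("Increase intensity of the current intervention for a short period "
                "and assess impact; if no improvement, return to problem solving.")
    # Poor response, implemented as intended
    return ("Revisit intervention design, problem analysis, and problem "
            "identification: was the hypothesis verified and the problem correct?")

print(next_steps("Poor", implemented_with_fidelity=True))
```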
  • Slide 55
  • We CANNOT Continue to Ignore the Data Will we meet our goal of 100% by 2014?
  • Slide 56
  • FLDOE Race to the Top: Local Instructional Improvement System (LIIS) Minimum Standards. FLDOE identified nine component areas of a LIIS and specific requirements for each.
  • Slide 57
  • 6. Analysis and Reporting: The system will leverage the availability of data about students, district staff, benchmarks, courses, assessments, and instructional resources to provide new ways of viewing and analyzing data. 8. Data Integration: The system will include or seamlessly share information about students, district staff, benchmarks, courses, assessments, and instructional resources to enable teachers, students, parents, and district administrators to use data to inform instruction and operational practices.
  • Slide 58
  • The Reciprocal Nature of Academic & Behavior Outcomes: Academics & behavior influence one another in a multitude of ways. Systems & resources are being developed to support DBPS: the RtI:B database, Workgroup models & materials.
  • Slide 59
  • Thank You!