
1

EVALUATION TOOLS AS IMPLEMENTATION DRIVERS

Two Implementation Rubrics from the California SPDG

2

EARLY ERIA: EFFECTIVE READING INTERVENTIONS ACADEMY

ERIA was established in 2003-04, before the era of RtI

Focus on upper-elementary and middle schools

A diversity of approaches at sites in regional cohorts

Recently developed ERIA 2.0 with a middle and high school focus

Frequently Asked Questions

“What is ERIA?”

“How does this relate to RtI2?”

“What is ‘intervention’ and where do we get it?”

“How do we evaluate this?”

3

TWO INTEGRATIVE EVALUATION TOOLS SERVE AS IMPLEMENTATION DRIVERS

Program Guide articulates the PD model: a 16-page booklet that explicitly addresses both implementation and intervention practices to guide the design of a site-based program.

Implementation Rubric operationalizes the PD model: a 10-item instrument that provides a framework for trainers, coaches, site team members, and teachers to evaluate and discuss implementation, fidelity, and next steps.

Everyone is on the same page

Sustainability (beyond funding, staff turnover)

Scale-up (recruit new sites/districts, beyond SPDG)

Diversity of approaches enabled

4

EVALUATION TOOL: THE PROGRAM GUIDE

SPDG Evaluators Li Walter and Alan Wood synthesized content expert input and worked to make it readily accessible to a variety of stakeholders.

“5 Steps”
Step 1: Identify
Step 2: Assess
Step 3: Deliver
Step 4: Monitor
Step 5: Improve

5

Stakeholders from every level were included to provide input both on implementation and intervention practices.

A variety of existing documents and other resources were also synthesized into the Program Guide.

6

There is a Table of Contents for ease of use.

7

Implementation practices are outlined over a three-year schedule.

8

Implementation practices are described in detail.

9

The Site Team is key to implementing ERIA.

It and its supporting structures are detailed.

10

An expansive list of key roles is described in detail.

11

While the first half of the Program Guide focuses on implementation practices, the second half focuses on intervention practices and the 5 Steps:

Step 1: Identify
Step 2: Assess
Step 3: Deliver
Step 4: Monitor
Step 5: Improve

12

Step 1: Identify

“Identify struggling readers through universal literacy screening early in the school year using statewide English-language Arts test scores.”

13

Step 2: Assess (1 of 2)

“Assess the decoding, reading fluency, and comprehension skills of struggling readers to guide intervention placement and instruction.”

14

Step 2: Assess (2 of 2)

Presents a variety of assessment tools and strategies, both for basic and advanced implementation.

15

Step 3: Deliver (1 of 2)

“Deliver interventions to address specific skill needs for success in the core curriculum using evidence-based programs and practices with fidelity.”

16

Step 3: Deliver (2 of 2)

Presents a variety of intervention topics, programs, and models, and how they may be appropriate for both basic and advanced implementation.

17

Step 4: Monitor

“Monitor the progress of struggling students to ensure that interventions are helping students improve and to adjust intervention placements accordingly.”

18

Step 5: Improve

“Improve content literacy instructional practices to actively and effectively engage all students in the core curriculum.”

19

Student Outcomes

Past successes in increasing English-Language Arts proficiency, inclusive of students with disabilities, are detailed.

20

EVALUATION TOOL: IMPLEMENTATION RUBRIC

The 10 items are mostly focused on intervention practices, with additional site team and fidelity items.

The overall tool and the process of how the rubric is used drive the implementation practices:

Self-evaluate and reflect on learning and implementation

Shared with coaches and trainers to guide activities

Evaluates the fidelity of implementation of both the PD model and the interventions

Former 26-item, 3-point checklist lacked the specificity to be meaningful and useful.

21

IMPLEMENTATION RUBRIC, ADAPTED FROM “GOAL ATTAINMENT SCALES”

Amy Gaumer Erickson and Monica Ballay presented “goal attainment scales” on a June 17 SIG Network webinar: http://www.signetwork.org/content_pages/78

The Rubric explicitly describes 5 implementation levels for each of 10 items:

Levels 1, 2, and 3 reflect the “Not started,” “In progress,” and “Achieved” implementation levels of the former checklist.

Levels 4 and 5 detail concrete steps towards optimal implementation, beyond the basics.

Each implementation level for each item is explicitly described, building more meaning into the tool than our previous checklist format allowed.
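As a rough illustration only, the structure just described, 10 items with 5 explicitly described levels each, could be represented as in the minimal sketch below; the item wording and level descriptions are hypothetical placeholders, not text from the actual ERIA rubric.

```python
# Minimal sketch (hypothetical content): one rubric item with five explicitly
# described implementation levels, as described above. Placeholder wording only.
from dataclasses import dataclass

@dataclass
class RubricItem:
    item: str      # what the item measures
    levels: dict   # level number (1-5) -> explicit description of that level

example_item = RubricItem(
    item="Site team uses screening data to place students in interventions",
    levels={
        1: "Not started: no screening data reviewed",              # maps to former "Not started"
        2: "In progress: data reviewed, placements still ad hoc",  # maps to former "In progress"
        3: "Achieved: placements follow the screening protocol",   # maps to former "Achieved"
        4: "Placements re-checked and adjusted after each benchmark period",
        5: "Placement decisions documented and used to refine the protocol",
    },
)
```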


28

Automatically generated report summarizes three years of implementation in a two-page format.

The most-recent response to each item is summarized below the trend chart, including text which describes what each implementation level means in explicit terms.

29

The “Summary” text from each item’s most recent response is gathered in a one-page format.

30

The “Next Steps” text from each item’s most recent response is also gathered in a one-page format.

31

All data from all three years of collection is gathered in a single row on the “Hidden Formulas” tab for easy aggregation.

The rest of the sheet’s blank space stores the hidden formulas that make the Excel file work. Because the file doesn’t rely on Visual Basic or macros, we have avoided security and software-compatibility issues.
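As a sketch of that single-row idea outside of Excel (hypothetical year labels and data, not the actual workbook formulas), flattening three years of responses to the 10 items into one row per site might look like this:

```python
# Hypothetical sketch, not the actual workbook formulas: flatten three years of
# responses to the 10 rubric items into a single row per site, analogous to the
# single-row layout on the "Hidden Formulas" tab described above.
ITEMS = 10
YEARS = ("Year 1", "Year 2", "Year 3")  # placeholder labels for the three collection years

def flatten_responses(site, responses):
    """responses maps a year label to that year's list of 10 item levels (1-5)."""
    row = [site]
    for year in YEARS:
        row.extend(responses.get(year, [None] * ITEMS))
    return row  # 1 site column + 30 level columns, easy to stack across sites

example_row = flatten_responses(
    "Sample Middle School",
    {"Year 1": [2] * ITEMS, "Year 2": [3] * ITEMS, "Year 3": [4] * ITEMS},
)
```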

32

THE GUIDE AND RUBRICS: WELL-RECEIVED

Sites love it. Returning sites love the specificity and usefulness. New sites feel like they know what they are getting into.

Trainers and coaches can see how sites are progressing and where the needs are.

Evaluation Taskforce was encouraged by the possibility of greater ability to learn about the effectiveness of the professional development models.

33

BEST IMPLEMENTATION RUBRIC

Building Effective Schools Together


39

ERIA on the Web: http://calstat.org/effectivereading.html

BEST / PBS on the Web: http://calstat.org/bestpbs.html

Li Walter: li@sonic.net

Alan Wood: alan.wood@calstat.org (707) 287-0054
