
Assessing the Effects of McGraw-Hill’s FLEX Literacy Program on Oral Reading

Fluency Skills: A Pilot Study Using Single-Subject Design Methodology

SKF Educational Services, LLC
Shannon Flaum-Horvath, Ph.D.


Introduction

The Problem

It is no secret that, despite the emphasis of the No Child Left Behind Act, American students continue to struggle in reading. Findings from the 2009 National Assessment of Educational Progress (NAEP) reveal that, overall, American students’ reading scores are not significantly different from what they were in 2007: nationally, about 33% of fourth-grade students and 25% of eighth- and twelfth-grade students read below what is considered a ‘basic’ level of proficiency. Findings are even more alarming for particular subgroups of students; between 62% and 65% of students with special needs in grades four, eight, and twelve performed below a ‘basic’ level of proficiency in reading. Among students characterized as English language learners (ELL), between 71% and 78% of students in grades four, eight, and twelve performed below a ‘basic’ level of proficiency; in fact, twelfth-grade ELL students exhibited their worst performance since 1998, with 78% scoring below a basic level of proficiency.

One Proposed Solution

In an effort to address this need, McGraw-Hill has published FLEX Literacy, a unique, research-based reading intervention program currently in development. Using both digital and teacher-led content delivery, FLEX Literacy combines dynamic, individualized online instruction targeting discrete skills with teacher-led, small-group instruction focusing on reading strategies through shared-reading lessons. In addition, the program contains a component that focuses on the application of skills by providing students the opportunity to collaborate while completing projects. The program was designed to maximize and streamline classroom flow by balancing whole-class instruction, independent student tasks, and collaborative small-group work. FLEX Literacy contains three Learning Experiences: The Digital Experience reflects individualized online instruction provided through computer technology; The Print Experience reflects shared reading and writing experiences through teacher-led, small-group instruction; and The Project Experience reflects collaborative group work and activities. FLEX Literacy is designed to maximize student engagement by incorporating various methods of instruction and addressing all sensory modalities, which is important when instructing students with differing learning styles.


Purpose of Study

The purpose of this pilot study was twofold: first, to investigate the effect of FLEX Literacy on the oral reading fluency skills for a select group of students performing below grade level in reading; and second, to obtain information regarding the program’s efficacy. Specific questions of interest include:

1. What effect does FLEX Literacy have on the oral reading fluency performance of students who are exposed to the program?

2. What effect does FLEX Literacy have on the attitudes and behaviors of students and educators who are exposed to the program?

3. What recommendations, if any, do teachers and students provide for improving the program?

In an effort to address these questions, this report includes quantitative and qualitative information obtained through standardized measures, teacher surveys, and interviews with teachers, students, and program consultants.


Method

Description of Research Sites

Two sites were selected for this study: Berlin Middle School, serving students in grades five through eight and located in Somerset County, Pennsylvania; and Cheney Elementary, a primary school serving students in pre-kindergarten through grade five and located in Orange County, Florida. Information on the Pennsylvania State Department of Education website indicates that Berlin MS enrolled 314 students during the 2010-2011 school year. About 25% of students school-wide, and about 77% of the special needs population, performed ‘below basic’ on the Pennsylvania System of School Assessment (PSSA) test in reading. Information on the Florida State Department of Education website indicates that Cheney Elementary enrolled 503 students during the 2010-2011 school year. On the Florida state assessment in reading, 39% of third graders, 42% of fourth graders, and 45% of fifth graders attending Cheney scored below proficiency. Among students identified with special needs, 100% of third graders, 92% of fourth graders, and 82% of fifth graders scored below proficiency. About 50% of third-grade ELL students, 68% of fourth-grade ELL students, and 83% of fifth-grade ELL students scored below proficiency in reading.

Participants

Ten students, four attending Berlin MS and six attending Cheney Elementary, participated in this study. The four students attending Berlin MS are identified with special needs and attend a resource room for reading instruction; two of these students are diagnosed with autism. All four are boys, one in the fifth grade and three in the sixth grade. All participating students from Berlin MS are Caucasian and speak English as their native language. At Cheney, six students, all fourth graders, participated in the program: three boys and three girls. With the exception of one student, all are of Hispanic ethnicity and are characterized as ESL. One student is identified with special needs. Two teachers, one in each school, implemented FLEX Literacy during the three-week period.


Procedure

Program Implementation

The FLEX Literacy program was implemented at the beginning of May 2011 and remained in place for three weeks. Prior to implementation, teachers received on-site training from two program consultants familiar with the FLEX Literacy program. The program consultants provided ongoing support to the teachers and conducted on-site visits to ensure that the program was implemented as intended, providing feedback and instruction as needed.

Measures

The primary outcome measure selected for this study was oral reading fluency. Generally, oral reading fluency refers to the ability to read connected text smoothly, effortlessly, and with little focus on the mechanics of reading, such as decoding (Mather & Goldstein, 2001). There were several reasons for selecting this as the primary outcome measure. Oral reading fluency is one reading component identified by reading specialists as critical for overall reading development and achievement (Schilling, Carlisle, Scott, & Zeng, 2007). Research has demonstrated that oral reading fluency has strong predictive validity for overall reading skill (Schilling et al., 2007) and for performance on standardized reading assessments (Shapiro, Solari, & Petscher, 2008; Hixson & McGlincey, 2004). Additionally, oral reading fluency is relatively easy to measure and is frequently used as a screening and progress monitoring tool (Goffreda & DiPerna, 2010).

This study used the Oral Reading Fluency (ORF) subtest of the Dynamic Indicators of Basic Early Literacy Skills (DIBELS) 6th Edition, a standardized, individually administered curriculum-based measure consisting of various short, one-minute reading assessments designed to evaluate a student’s fluency on specific reading tasks (University of Oregon Center on Teaching and Learning). The ORF subtest provides a reading fluency score, defined as the number of words read correctly within a one-minute time period. Substituted or omitted words are counted as errors, as are hesitations of more than three seconds. Words self-corrected within the three-second criterion are scored as accurate. The number of words read aloud minus the number of errors represents the ORF score (University of Oregon Center on Teaching and Learning). The reliability and validity of the ORF subtest are well established; test-retest reliabilities for oral reading fluency among elementary students are documented to range from .92 to .97, and alternate-form reliabilities of different reading passages drawn from the same level have ranged from .89 to .94 (Tindal, Marston, & Deno, 1983). Criterion-related validity, examined in eight separate studies conducted during the 1980s, yielded coefficients ranging from .52 to .91 (Good & Jefferson, 1998).
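The scoring rule described above (words read aloud minus errors, with timely self-corrections counted as correct) can be sketched as a small function. The counts used here are hypothetical examples, not data from the study:

```python
def orf_score(words_attempted: int, errors: int) -> int:
    """ORF score = words read aloud in one minute minus errors.

    Substitutions, omissions, and hesitations longer than three
    seconds count as errors; words self-corrected within three
    seconds are scored as accurate, so they are simply not tallied
    as errors.
    """
    return words_attempted - errors

# A hypothetical student who reads 75 words in one minute with
# 6 errors earns an ORF score of 69 words correct per minute.
print(orf_score(75, 6))  # 69
```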


Prior to implementing FLEX Literacy, baseline assessments were conducted to determine student proficiency in oral reading fluency. This was important, as students’ baseline, or performance before exposure to FLEX Literacy, is the standard against which subsequent performance, or performance after exposure to FLEX Literacy, is compared. At the start, all students were reported by teachers as reading on a third-grade reading level and were administered the appropriate grade-level ORF probe. As part of the baseline assessment, students were administered three parallel forms of the ORF probe to obtain a stable baseline level. The median score on the three baseline administrations of the ORF subtest was recorded as the initial or baseline ORF score. Weekly progress monitoring assessments, using parallel forms of the DIBELS ORF, were administered to assess students’ response over time to FLEX Literacy. All assessments were conducted by two consultants familiar with DIBELS administration.
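The median-of-three-probes procedure can be illustrated as follows; the probe scores are hypothetical (the report states only final baseline values):

```python
from statistics import median

def baseline_orf(probe_scores: list[int]) -> float:
    """Baseline ORF is the median of three parallel ORF probes.

    Using the median rather than the mean damps the effect of one
    unusually strong or weak administration, yielding a more
    stable baseline.
    """
    assert len(probe_scores) == 3, "baseline uses three parallel probes"
    return median(probe_scores)

# Hypothetical probe scores for one student: the middle value
# becomes the baseline regardless of administration order.
print(baseline_orf([58, 63, 70]))  # 63
```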

Research Design and Data Analysis

Educators and researchers must seek practical and empirically robust tools for determining the presence, stability, and durability of a program’s treatment effects. Single-subject designs, as employed in this case, allow educators to investigate the process of change for a particular case, not the average case. Unlike correlational and descriptive methods, single-subject design methodology is experimental; its purpose is to determine causal relationships between variables (Horner, Carr, Halle, McGee, Odom, & Wolery, 2005). While there are many variants of single-subject designs, many involve only one participant or a small group of participants in a single study. Data are analyzed at the ‘case’ or unit level, where a ‘case’ may be a single participant, a classroom, or a school. The dependent variables are typically observations of a target behavior, and the independent variable is a specified program or intervention procedure that is actively manipulated and carefully monitored throughout the investigation. An effect is demonstrated when change in the target behavior covaries with implementation of the intervention. Single-subject designs can provide a strong basis for determining program effects and are increasingly used in education and clinical research. Recently, the utility of single-subject designs was recognized by the U.S. Department of Education Institute of Education Sciences What Works Clearinghouse, which deems such designs acceptable for demonstrating program effectiveness (Kratochwill, Hitchcock, Horner, Levin, Odom, Rindskopf, & Shadish, 2010).

This study utilized a multiple baseline design. Generally, multiple baseline designs contain the following elements: (a) repeated measurement of the dependent variable across at least two baselines; (b) staggered introduction of the treatment across baselines; and (c) immediate observed effects of the intervention, with no observable effects in conditions in which the intervention has not yet been implemented. In the multiple baseline across subjects design, the same intervention is ‘staggered’ over time, and the same behavior is monitored throughout the intervention.


Results

Results presented here reflect student performance during the three-week implementation of FLEX Literacy. Each graph reflects the performance of an individual student. In all cases, the first data point for each student reflects the median words per minute on the baseline ORF probes and represents that student’s performance prior to exposure to FLEX Literacy. This data point is followed by a vertical line, which demarcates the boundary between the student’s baseline, or pre-program, performance and performance during instruction with FLEX Literacy. Each data point to the right of the vertical line reflects the student’s performance on each subsequent ORF assessment, administered weekly. The y-axis represents the number of words read correctly per minute; the x-axis reflects the week of program instruction.

Students Identified with Special Needs

Figures 1 through 4 represent findings for the four students identified with special needs who attend a resource room for reading instruction.

Figure 1. Student 1 performance on the oral reading fluency subtest of the DIBELS


Figure 2. Student 2 performance on the oral reading fluency subtest of the DIBELS

Figure 3. Student 3 performance on the oral reading fluency subtest of the DIBELS


Figure 4. Student 4 performance on the oral reading fluency subtest of the DIBELS


Students Identified as ESL

Figures 5 through 10 represent the findings for those six students who are identified as ESL.

Figure 5. Student 5 performance on the oral reading fluency subtest of the DIBELS

Figure 6. Student 6 performance on the oral reading fluency subtest of the DIBELS


Figure 7. Student 7 performance on the oral reading fluency subtest of the DIBELS

Figure 8. Student 8 performance on the oral reading fluency subtest of the DIBELS


Figure 9. Student 9 performance on the oral reading fluency subtest of the DIBELS

Figure 10. Student 10 performance on the oral reading fluency subtest of the DIBELS


Table 1 presents the percentage of non-overlapping data (PND) by descriptive category for all students. The PND is a commonly used method for analyzing data in single-subject designs. It is calculated by first determining the number of data points in the intervention phase that exceed the highest data point in the baseline phase. This value is divided by the total number of data points in the intervention phase and multiplied by 100, yielding a percentage score. Descriptively, values of 90% or higher reflect “highly effective” interventions; values of 70% to under 90% reflect “moderately effective” interventions; values of 50% to under 70% reflect “mildly effective” interventions; and values below 50% reflect negligible effects (Ma, 2006).
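The PND calculation and its descriptive categories can be sketched directly from the definition above; the baseline and weekly scores used in the example are hypothetical, not taken from any student in this study:

```python
def pnd(baseline: list[float], intervention: list[float]) -> float:
    """Percentage of non-overlapping data: the share of intervention-
    phase data points that exceed the highest baseline data point."""
    ceiling = max(baseline)
    exceeding = sum(1 for x in intervention if x > ceiling)
    return 100 * exceeding / len(intervention)

def pnd_category(value: float) -> str:
    """Map a PND value to Ma's (2006) descriptive categories."""
    if value >= 90:
        return "Highly Effective"
    if value >= 70:
        return "Moderately Effective"
    if value >= 50:
        return "Mildly Effective"
    return "Negligible Effects"

# Hypothetical case: a baseline of 63 WPM and three weekly progress
# scores, all of which exceed the baseline ceiling.
score = pnd([63], [80, 100, 95])
print(score, pnd_category(score))  # 100.0 Highly Effective
```

Note that with only three weekly assessments, as in this pilot, the attainable PND values are coarse (0%, 33%, 67%, or 100%), which is why no student landed in the "moderately effective" band.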

Table 1. PND descriptive category for students (total)

Category               n    %
Highly Effective       2   20
Moderately Effective   0    0
Mildly Effective       5   50
Negligible Effects     3   30

About a fifth of the sample exhibited higher oral reading fluency scores, compared to baseline, on every administration of the progress monitoring assessments given during implementation of FLEX Literacy. According to the PND guidelines, the program was highly effective in increasing oral reading fluency for these students. Half of the students exhibited higher oral reading fluency scores than at baseline on two of the three administrations; for these students, the program was mildly effective. Finally, less than one-third of students exhibited higher oral reading fluency scores on one or fewer administrations; the effect on oral reading fluency skills for this group was considered negligible.


Table 2 presents the percentage increase above baseline in the number of words read per minute. The table details findings only for those students demonstrating at least ‘mild’ success under the PND criteria. The percentage increase ranged from a low of 7% to a high of 58%.
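The percentage-increase figures in Table 2 follow from a simple gain-over-baseline calculation, sketched below using the reported baseline and highest scores for one student:

```python
def percent_increase(baseline_wpm: float, highest_wpm: float) -> float:
    """Gain over baseline, expressed as a percentage of baseline."""
    return 100 * (highest_wpm - baseline_wpm) / baseline_wpm

# Student 2 in Table 2: 63 WPM at baseline, 100 WPM at best.
# (63 -> 100 is a 58.7% gain, reported in the table as 58%.)
print(f"{percent_increase(63, 100):.1f}%")  # 58.7%
```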

Table 2. Percentage increase above baseline in number of words read per minute

Student Number   Baseline WPM   Highest WPM   % Increase
1                159            172            8%
2                 63            100           58%
4                 72             91           26%
6                121            135           12%
7                128            137            7%
8                114            128           12%
9                107            127           19%

Teacher Survey

At the end of the pilot study, teachers were asked to complete a structured survey regarding their experiences with and perceptions of FLEX Literacy. The initial portion of the survey contained 46 statements constructed in a Likert format, each with six possible response options ranging from Strongly Agree to Strongly Disagree. Several questions were designed to elicit information about the teachers’ perceptions of the program as a whole, while others were more pointed, reflecting components of the Print, Digital, and Project Experiences. The final portion of the survey contained seven open-ended items on which teachers could expound upon their responses.

In regard to their perceptions of the program as a whole, both teachers responded either Strongly Agree or Moderately Agree to all eight items in this portion of the survey. Teachers felt that FLEX Literacy adequately addresses content standards and that the time allotted for instruction was generally appropriate. Teachers indicated that preparation and instruction were manageable; both selected Strongly Agree for these items. Teachers felt that the three learning experiences (Digital, Print, and Project) worked together to build student understanding, and both teachers indicated that they would recommend FLEX Literacy to others.

The second portion of the survey contained 15 items reflecting perceptions of the Digital Experience. Teachers generally gave favorable impressions, selecting either Strongly Agree or Moderately Agree for all items. Teachers consistently selected Strongly Agree to indicate that students felt successful during the Digital Experience and found the lessons highly engaging. Both teachers selected Strongly Agree to indicate that students were able to access lesson components and to focus on the learning activities for the entire block of time dedicated to this portion of the program, and that ample time was allotted for students to make daily progress through the lessons. Both teachers selected Moderately Agree to indicate that students were able to complete the learning activities independently. In regard to instructional design, teachers selected Strongly Agree to indicate that the design is of high quality and includes age-appropriate graphics, that content standards are addressed in the Digital Experience, and that the lesson content is appropriate.

When asked specifically about the format of the 2.B Targeted Interventions and Mastery Checks, feedback ranged from Strongly Agree to Mildly Agree. One teacher selected Strongly Agree to indicate that the Targeted Interventions and Mastery Checks were easy to understand and administer, while the other stated that she “did not have feedback on the computer to make informed decisions about the Targeted Interventions”. Both teachers selected Mildly Agree to indicate that the quantity/frequency of the 2.B Targeted Interventions and Mastery Checks was appropriate. One teacher selected Mildly Agree to indicate that the 2.B Targeted Interventions and Mastery Checks were useful for diagnosing and correcting student errors in understanding, while the other selected Moderately Agree for this statement.

The third portion of the survey contained 11 items that asked specifically about perceptions of the Print Experience. Both teachers responded Strongly Agree or Moderately Agree on all items save one, and response patterns were nearly identical. Teachers selected Strongly Agree to indicate that students felt successful during the Print Experience and found it engaging, that routines were easy to follow, that content standards were addressed, that the instruction to model student responses was clear, and that the vocabulary sections were helpful and informative. Both teachers selected Moderately Agree to indicate that the Print Experience was completed within the 25 minutes allotted for this component. Responses diverged on items addressing the amount of time devoted to the Interactive Reader and Skill Differentiation sections, with one teacher selecting Mildly Agree to indicate that the time is adequate and the other selecting Moderately Agree.

The final portion of the structured part of the survey contained 13 items that asked about perceptions of the Project Experience. One teacher (from Berlin MS) participated in the Project Experience and provided responses about that component of the program. The teacher selected Strongly Agree when asked whether students felt successful and found the projects engaging. Similarly, the teacher selected Strongly Agree to indicate that the Project Experience addresses content standards, has closely aligned project learning goals and activities, and contained manageable and appropriate technology resources. The teacher selected Moderately Agree to indicate both that the Turn & Talk activity was useful in assessing student understanding and that the Book Bound Foldable was useful as a project organizer. Similarly, the teacher selected Moderately Agree to indicate both that the project rubrics provided an excellent analysis of performance on each task and that the time allotted for each activity was adequate. The teacher selected Mildly Agree to indicate that the time allotted for each project and the overall duration of the project were appropriate.

The open-ended response portion of the survey contained seven items soliciting feedback regarding strengths of FLEX Literacy, the usefulness of various components, and recommendations for improvement. One teacher remarked that the “digital correlating with print” component makes FLEX Literacy unique, while the other indicated that “the ability of the students to highlight and write on the text, along with the scaffolded note taking” makes the program unique. Interestingly, the teacher of students with special needs listed the scaffolded note-taking as a perceived weakness of the program. When asked what grouping strategies were most successful, one teacher indicated that grouping students by “reading level ability” proved successful, while the other noted that rotating the students from activity to activity, as prescribed in the program, “worked great”. When asked how student responses were modeled during the Print Experience, one teacher responded “through use of a SMARTBoard”, while the other indicated that she attempted to use an overhead projector, which was “not that convenient, and it was hard to model it”. Teachers’ opinions varied about the benefit of saving the modeled written student responses from class to class: one teacher indicated “no, since the students had them in their books”, while the other indicated “yes, for review/prior knowledge”. Finally, when asked about recommended changes to FLEX Literacy, one teacher suggested that it would be helpful to have the computer portion devised so that “students could not click through instructions and have the instructions read aloud and highlighted to force students to listen and read all instructions (maybe this would only be necessary for the first 20 lessons or so)”. The other teacher provided several recommendations, including: “using a SMARTBoard for the Print Experience; making it available for the student to go back and repeat a lesson after remediation with BLM Targeted Intervention; correlating the projects to the story units”.


Teacher Interview

At the end of the pilot study and after completing the survey, both teachers provided additional feedback during a semi-structured interview, conducted via telephone by an independent research consultant. Both teachers indicated that they very much enjoyed using the program, describing the combination of print, digital, and project components as “very unique” and a feature not found in other reading programs. Both teachers especially liked the graphic organizers and the daily goals, which “set the tone for the day”. The design of the program includes rotating students from activity to activity; one teacher noted that she particularly liked this and found it “good for classroom flow”. In regard to the Print Experience, both teachers indicated that the students very much enjoyed the stories. One teacher noted: “the kids loved the stories, and they really held their interest. They found them really fun to read”. Both teachers indicated they “loved” that the stories are content-based and emphasize vocabulary acquisition. This was especially important to one teacher, whose classroom was largely comprised of Hispanic students, some of whom are ESL. When asked about the ease of implementation, both teachers indicated that the program was easy to implement and very well scripted, and that both seasoned and novice teachers would have “no problems” using it. One teacher illustrated this by reporting that she was absent for one day of instruction and her substitute “could use it and follow the script and had no problems-even without training”. When asked about the suitability of the program for a variety of learners, both teachers indicated that “all students can use it” and that it seems well equipped to address the unique needs of unique learners, especially ESL students, since “the program includes a lot of vocabulary and has one-on-one interaction with the teacher and small group discussion, which are things that ESL students need”. One teacher, teaching predominantly special education students, indicated that while all students could use the program, some of the vocabulary was a little advanced; by modifying the format (e.g., using a cloze procedure), her students demonstrated success. When asked if they would recommend the program to colleagues, both teachers indicated that they would, and, schedules permitting, both agreed to teach the program in the fall.

Teachers generally sang the praises of FLEX Literacy, and there were only a few minor recommendations for program improvement. Both teachers indicated that the use of a SMARTBoard greatly increased the feasibility of using the program, and the one teacher who at times used an overhead projector found it challenging. While teachers reported that students thoroughly enjoyed the Digital Experience, they noted that students tended to “click through the direction part” without reading it. Both teachers indicated that having the directions read aloud, or building a “forced pause” into the design, would be of benefit.


Student Interview

During the final administration of the DIBELS probes, the program consultants conducted a short, semi-structured interview with individual students. The first section of responses reflects those from the group of ELL students attending Cheney; the second provides responses from students identified with special needs attending Berlin MS.

When asked what, in particular, ELL students liked about the program, overwhelmingly students responded: “the computer part”. One student expounded upon this by stating: “I like the computer-when I make a mistake, the computer fixes it and tells me and helps me and I like the games and reading the stories”. Another student liked the highlighted words, which facilitated the identification of the “author, title, characters, and setting-it is easy”. Yet another student appreciated the breadth of material covered in the lessons, indicating that the program “teaches a lot of different stuff in each lesson”. Two students indicated that they particularly enjoyed the high-interest stories, one in particular about New Mexico. When asked what they would change about the program, three of six students said, “nothing”, while the remaining students unanimously recommended that the characters (avatars) be modified to make them more interactive. Students indicated that they want to “dress/decorate the room and be able to play with the character” and “be able to do things” with the characters. When asked if the program helped them learn and do better in reading, all students indicated that they felt they learned and benefitted from the program. Vocabulary acquisition and the linking of letters to sounds were two primary skills students felt they improved, along with an understanding of synonyms. One student demonstrated this by stating: “I learned that synonyms mean same and not different”.

The responses of students identified with special needs largely mirrored those of the ELL students. When asked what they liked best about the program, the majority of students cited the project and the computer components; one student summarized the experience by simply stating, "it is just fun". One student indicated that the story about Amelia Earhart was particularly interesting, while another remarked that "kicking the soccer ball through the cones" (during the Digital Experience) was a favorite. When asked what should be changed, one student recommended adding the soccer ball activity "for all levels", while other students indicated that "nothing" should be changed. All students felt that the program helped them "learn more words" and, interestingly, gave them "more reading time". Since this group of students participated in the Project Experience, they were asked about their experience with the group activity. All students reported that they very much enjoyed the Project Experience. One student remarked that "finding information about hamsters" was particularly enjoyable, while another was more general, stating that gathering information and "getting stuff ready" was fun. One student provided an interesting response, stating that he liked the group project because "they talk to me like a real person". Yet another student was particularly enthusiastic, citing "being with my friends and getting to do the presentation – this is the second one for me this week!".

Fidelity of Program Implementation

Two program consultants conducted weekly site visits, not only to administer progress-monitoring assessments but also to provide feedback regarding program implementation. Consultants noted that, generally, teachers presented the program the way it was intended. Teachers followed the program script each day, and students were rotated between the Digital and Print Experiences within the allotted time (25 minutes), which seemed "the right amount of time" to complete these components of the program.

ESL Classroom

During the Print Experience, one consultant observed some slight modifications, largely driven by the unique needs of ESL students. While teaching vocabulary words, the teacher had students tap rather than raise their hands to respond. The consultant noted that this particular group of students stumbled over vocabulary words and proper nouns, which extended the small-group/Print Experience by two or three minutes on a few days. The teacher had students use the first read as a "shared read" instead of the teacher reading to students. In addition, the teacher provided "bulleted notes" for students to copy instead of a paragraph, which was reportedly well received. The consultant for the ESL classroom indicated that, in her view, no modifications were made that might compromise the integrity of the program.

Special Education Classroom

During a letter-writing activity, the consultant reported that the teacher herself typed the letter and used a "cloze" procedure (i.e., having students fill in a missing word or phrase) for some of the words and phrases. Similarly, the teacher utilized this procedure for vocabulary, as it at times was "too advanced" for her special needs students. The consultant reported that students responded positively, and she felt that this was an appropriate strategy, given the needs of this particular group of students.


Discussion

The overarching goal of this study was to assess how FLEX Literacy works in action. In an effort to obtain a holistic view, we incorporated student outcome measures, quantitative and qualitative feedback from teachers and students, and information about program fidelity. The study utilized single-subject methodology, appropriate for a sample of this size and useful for determining program effects over a relatively short period of time.

Teachers and students provided feedback regarding their perceptions of FLEX Literacy, and the feedback was remarkably positive. Teachers generally felt that FLEX Literacy was easy to implement, addressed content standards, and met the needs of diverse learners, which in this study included students with special needs and students for whom English is not their native language. Students themselves expressed that they enjoyed the program and felt that they benefitted from it, a perception supported by the observed increases on oral reading fluency assessments.

Teacher, consultant, and student perceptions of the Project Experience, a feature that sets FLEX Literacy apart from reading programs in general, are worthy of note. Teachers, consultants, and students consistently reported that they "love" the Project Experience, and one teacher stated that she felt the "concept of the project [was] wonderful". Students were excited to demonstrate their knowledge and felt "important" during the presentation. One consultant reported that on presentation day, two of the students identified with special needs came "dressed for the occasion, in shirt and ties". Another group of students showcased their creativity during a presentation about fish by providing each member of the audience a cupful of Goldfish crackers. Certainly, the Project Experience is designed to increase general knowledge, vocabulary, language development, and the acquisition of 21st century skills, but there are perhaps some hidden, unintended benefits. During the interview, one student in particular noted that other students spoke to him "like he was a real person", which is in itself an interesting statement. Yet another student expressed a great deal of enthusiasm about the group projects and seemed thrilled that he had the opportunity to engage in such work "twice" that week. Increased social skills, feelings of group acceptance, and enthusiasm for learning are perhaps byproducts of this component of FLEX Literacy that, although unintended, are no less important.

While the student and teacher feedback is encouraging and certainly reflects positively on the program, it is important that use of the program be reflected in an increase in some measure of achievement. One of the goals of this pilot study was to determine the effect of FLEX Literacy on the oral reading fluency skills of a select group of students. Students were exposed to the program for three weeks, from about mid-May through the beginning of June, at the end of the 2010-2011 school year. Findings reveal that 7 of 10 students in the sample demonstrated growth in oral reading fluency and, using suggested criteria for determining program effectiveness for single-subject designs, experienced at minimum 'mild' success while receiving FLEX Literacy reading instruction. For two students, one identified with special needs and one identified as ELL, FLEX Literacy was considered 'highly effective' in increasing oral reading fluency skills. It should be noted that, in order to receive a 'highly effective' designation in a study of such short duration, the calculation method dictates that every score in the progress-monitoring phase exceed the median baseline score; although five students experienced only 'mild' success according to the PND calculation guidelines, two out of three (66%) of each of these students' ORF scores during program implementation in fact exceeded the median baseline ORF score. One of these students (student #4) experienced a 26% increase in words per minute above baseline. For this sample of students, increases in words per minute over baseline levels ranged from a low of 7% to a high of 58%, with 90% of students (n = 9) scoring highest on the final progress-monitoring assessment, given at week three. There were some students, however, for whom the program produced negligible effects. An interview with the program consultant revealed that two of these students have been diagnosed with autism; one presented some behavioral challenges during the assessment procedures, while the other is relatively lower functioning. In light of this, the findings as a whole are somewhat surprising, given the relatively short period of program exposure. Further, while single-subject methodology is particularly effective in detecting program effects over relatively small time increments, a three-week period of exposure to a new program might be considered the lower limit. It would be particularly interesting to see whether these trends hold with extended use of the program.
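The criterion described above (comparing each progress-monitoring score against the median baseline score) can be sketched in a few lines of Python. This is an illustrative sketch, not the study's actual analysis code: the function names and the sample ORF scores are assumptions, and the formal PND guidelines (Olive & Franco, 2007) contain considerations not reproduced here.

```python
from statistics import median

def pct_exceeding_median_baseline(baseline, intervention):
    """Percentage of progress-monitoring (intervention) scores that
    exceed the median of the baseline scores."""
    med = median(baseline)
    exceeding = sum(1 for score in intervention if score > med)
    return 100.0 * exceeding / len(intervention)

def pct_gain_over_baseline(baseline, intervention):
    """Percent increase of the final progress-monitoring score over the
    median baseline score (words correct per minute)."""
    med = median(baseline)
    return 100.0 * (intervention[-1] - med) / med

# Hypothetical ORF scores (words correct per minute) for one student:
# three baseline probes, then three weekly progress-monitoring probes.
baseline = [62, 65, 64]
intervention = [66, 70, 78]

print(pct_exceeding_median_baseline(baseline, intervention))      # 100.0
print(round(pct_gain_over_baseline(baseline, intervention), 1))   # 21.9
```

Under this scheme, a student whose three progress-monitoring scores all exceed the median baseline score reaches the 100% level required for a 'highly effective' designation, whereas two of three scores exceeding the median yields 66.7%.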

The results of this study indicate that FLEX Literacy is well received by teachers and students. Teachers and the students themselves feel that the students benefited from the program, evidenced not only by perceptions of increased achievement in reading but also by unintended effects, including an increased sense of peer acceptance and enthusiasm for learning. Preliminary results indicate that after a mere three weeks of instruction with FLEX Literacy, 90% of students in this study exhibited higher oral reading fluency scores than before instruction, with increases in words per minute ranging from 7% to 58% over baseline levels. In sum, preliminary findings suggest that FLEX Literacy is an easy-to-implement, unique reading program that increased students' reading fluency after three weeks of instruction.


References

Goffreda, C. T., & DiPerna, J. C. (2010). An empirical review of psychometric evidence for the Dynamic Indicators of Basic Early Literacy Skills. School Psychology Review, 39(3), 463-483.

Good, R. H., & Jefferson, G. (1998). Contemporary perspectives on Curriculum-Based Measurement validity. In M. R. Shinn (Ed.), Advanced applications of Curriculum-Based Measurement (pp. 61-88). New York: Guilford.

Hixon, M. D., & McGlinchey, M. T. (2004). The relationship between race, income, and oral reading fluency and performance on two reading comprehension measures [Electronic version]. Journal of Psychoeducational Assessment, 22(4), 351-364.

Horner, R. H., Carr, E. G., Halle, J., McGee, G., Odom, S., & Wolery, M. (2005). The use of single-subject research to identify evidence-based practice in special education. Exceptional Children, 71(2), 165-179.

Kratochwill, T. R., Hitchcock, J., Horner, R. H., Levin, J. R., Odom, S. L., Rindskopf, D. M., & Shadish, W. R. (2010). Single-case designs technical documentation. Retrieved June 6, 2011, from What Works Clearinghouse website: http://ies.ed.gov/ncee/wwc/pdf/wwc_scd.pdf

Ma, H.-H. (2006). An alternative method for quantitative synthesis of single-subject researches: Percentage of data points exceeding the mean. Behavior Modification, 30(5), 598-617.

Mather, N., & Goldstein, S. (2001). Retrieved June 6, 2011, from http://www.ldonline.org/article/6354.

Olive, M. L., & Franco, J. H. (2007). (Effect) size matters: And so does the calculation. Behavior Analysis Today, 8(2), 76-86. http://eric.ed.gov/PDFS/EJ800973.pdf

Schilling, S. G., Carlisle, J. F., Scott, S. E., & Zeng, J. (2007). Are fluency measures accurate predictors of reading achievement? [Electronic version]. The Elementary School Journal, 107(5), 429-448.

Shapiro, E. S., Solari, E., & Petscher, Y. (2008). Use of a measure of reading comprehension to enhance prediction on the state high stakes assessment [Electronic version]. Learning and Individual Differences, 18(3), 316-328.

Tindal, G., Marston, D., & Deno, S. L. (1983). The reliability of direct and repeated measurement (Research Rep. No. 109). Minneapolis, MN: University of Minnesota Institute for Research on Learning Disabilities.

University of Oregon Center on Teaching and Learning. DIBELS (Dynamic Indicators of Basic Early Literacy Skills). Retrieved October 14, 2008, from http://www.dibels.uoregon.edu/dibelsinfo.php

U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics. (2009). The nation's report card: National Assessment of Educational Progress (NAEP) 2009. Retrieved June 13, 2011, from http://nationsreportcard.gov/reading_2009