Literacy Assessments
Literacy Workgroup: Marcia Atwood, Michele Boutwell, Sue Locke-Scott, Rae Lynn McCarthy



TRANSCRIPT

Page 1: Literacy Assessments

Literacy Assessments

Literacy Workgroup
Marcia Atwood
Michele Boutwell
Sue Locke-Scott
Rae Lynn McCarthy

Page 2: Literacy Assessments

Getting Ready

We would like to have an opening activity here to activate background knowledge

Page 3: Literacy Assessments

Assess Frequently

In order to determine reading problems early

In order to monitor which skills are developing and which skills need more explicit instruction

In order to allow teachers to make informed instructional decisions at the point of need.

Page 4: Literacy Assessments

Types of Reading Assessments

Standards-based assessments

General outcome measures

Diagnostic

Progress monitoring

Page 5: Literacy Assessments

General Outcome Measures

General Outcome (Screening) Measures

A simple set of procedures that teachers can use to plan, adapt, individualize, and evaluate instructional programs for their students
– (Christine A. Espin & Anne Foegen; Deno, 1985; Deno & L.S. Fuchs, 1987; L.S. Fuchs & Deno, 1991; Shinn, 1989)

Provide a bottom-line evaluation of the effectiveness of a reading program and/or a teacher’s instruction to determine which children will need additional support in achieving important reading outcomes

Page 6: Literacy Assessments

Progress Monitoring

Determine if students are making adequate progress at their instructional level

Determine if students need more intervention to close the achievement gap

The use of direct, repeated measurement of student progress toward long-range instructional goals
– Standard tasks used as indicators of student proficiency
– (Espin & Foegen, 1996)

Page 7: Literacy Assessments

Progress Monitoring Frequency

Too few data points, taken too infrequently, mean that students will stay in ineffective interventions too long.

As the frequency of progress monitoring increases, so does the probable strength of the data’s ability to reliably inform instruction:
– 2x/week after 10 weeks: excellent with 1 probe
– 1x/week after 10 weeks: excellent with 1 probe
– Every 3 weeks after 10 weeks: poor with median of 3 probes
– Pearson Education, Inc.

Page 8: Literacy Assessments

Frequency of Assessment and Student Achievement

Bangert-Drowns, R. L., Kulik, J. A., & Kulik, C.-L. C. (1991). Effects of frequent classroom testing. Journal of Educational Research, 85, 89–99.

Page 9: Literacy Assessments

Benchmark Assessment vs. Progress Monitoring

Page 10: Literacy Assessments

Diagnostic Assessments

Help teachers plan instruction by providing in-depth information about students’ skills and instructional needs that impact general outcome measures

Page 11: Literacy Assessments

International Reading Association Standards for Literacy Assessment

Interests of the students are paramount in the assessment

The teacher is the most important agent of assessment

The primary purpose of assessment is to improve teaching and learning

Assessment must reflect and allow for critical inquiry into curriculum and instruction

Page 12: Literacy Assessments

Assessment must recognize and reflect the intellectually and socially complex nature of reading and writing and the important roles of school, home, and society in literacy development.

Assessment must be fair and equitable

The consequences of an assessment procedure are the first and most important consideration in establishing the validity of the assessment.

Page 13: Literacy Assessments

The assessment process should involve multiple perspectives and sources of data.

Assessment must be based in the local school learning community, including active and essential participation of families and community members.

All stakeholders in the educational community – students, families, teachers, administrators, policy makers, and the public – must have an equal voice in the development, interpretation, and reporting of assessment information.

Page 14: Literacy Assessments

Families must be involved as active, essential participants in the assessment process.

Page 15: Literacy Assessments

Before Assessing

The reason for the assessment and the use of the data must be clear

What do you want to know?

Page 16: Literacy Assessments

What do you do with the data?

Identify the need

Validate the need

Plan the intervention
– Determine the individual expected rate of improvement

Evaluate the intervention

Review outcomes

Page 17: Literacy Assessments

Based on your data, you can determine…

What’s working?

What’s not working?

Who is on target for achieving standards and benchmarks?

Who is at risk for reading difficulties?

Who is not making adequate progress to close the gap?

Page 18: Literacy Assessments

Identify System Patterns

Are there components of the big ideas mastered/not mastered by the majority of students?

Are there differences in the performance of subgroups (grades, teachers, etc.)?

Are there similarities among students’ performance?

Are additional data needed?

Page 19: Literacy Assessments

In the classroom teachers can use data to…

Group students for instruction

Target specific reading concepts and skills that students have not mastered

Determine instructional intensity

Monitor student progress

Identify personal professional development interests and needs

Page 20: Literacy Assessments

Changes can be made in…

Intensity (explicit, targeted, strategic) of instruction

Group size

Amount of time in intervention

Program

Assessment procedures

Page 21: Literacy Assessments

Instructional Difference

“We have research to indicate that when a student is performing below the grade level of the reading instruction being delivered in the general education program, the classroom program has little effect on the target student. Instead, tutoring accounts for the student’s growth.”

Dr. Lynn Fuchs

Reading Rockets

Page 22: Literacy Assessments

Reading Instruction Must…

Be explicit and systematic
Be paced appropriately
Be based on student assessment data
Allow opportunities to see it (modeling), guided practice, and independent practice
Be based on research

Page 23: Literacy Assessments

In order to achieve this, teachers must…

Understand that assessment is an important part of instruction

Understand how to administer different types of assessment and when to administer them

Analyze the data in order to use it to inform instruction

Page 24: Literacy Assessments

Assessing the Big 5

Phonemic Awareness

Phonics

Vocabulary

Fluency

Comprehension

Page 25: Literacy Assessments
Page 26: Literacy Assessments
Page 27: Literacy Assessments

Sample Phonics Assessments

Pronouncing the phonetic elements in isolation
Pronouncing a sampling of phonetic elements in real and nonsense words and in sentences

– Really Great Reading Diagnostic Decoding Surveys (complimentary download from www.reallygreatreading.com)

– DIBELS Nonsense Word Fluency

– Quick Phonics Screener, Read Naturally

Page 28: Literacy Assessments

Importance of Vocabulary

The second most important root cause for comprehension deficits

It is rare to find a child who is good at decoding and has good vocabulary knowledge but is weak in comprehension

Vocabulary knowledge predicts word reading ability – at first grade, it predicts comprehension 10 years later

Page 29: Literacy Assessments

Dimensions of Vocabulary
Maryanne Wolf, Ph.D.

Incrementality – degrees of knowing
Multidimensionality – morphology, syntax, pragmatics
Interrelatedness – features of a word and how it relates to other words
Polysemy – knowing the multiple meanings of a word; predicts comprehension, aids in word recognition in and out of context; 1/3 of English words are polysemous

Page 30: Literacy Assessments

Vocabulary Assessment

No universal screening tool yet
Receptive Vocabulary – matching a picture to a word
Definitional Knowledge – describing the meaning of the word
Multiple Meanings (semantics) – identify both pictures that represent the word
Communication Intent (syntax) – through ambiguous sentences and figurative language

Page 31: Literacy Assessments

Receptive Vocabulary

Point to the picture that means road.

Page 32: Literacy Assessments

Definitional Knowledge

“Tell me what map means.”

DIBELS – Word Use Fluency (WUF):
“Listen to me use the word in a sentence. (pause) ‘The rabbit is eating a carrot.’ Your turn: ‘rabbit.’”

Categories – choose the pictures that belong together

Page 33: Literacy Assessments

Polysemous Word Knowledge

Self-assessment
Ambiguous Sentences – find the two pictures that go with the sentence.

“We need a new bat.”

Pragmatics – “What does the girl mean when she says, ‘Go fly a kite’?”

Page 34: Literacy Assessments

Vocabulary Connections

The extent of students’ vocabulary knowledge relates strongly to their reading comprehension and overall academic success.

(Fran Lehr and associates commenting on the persistent evidence provided by Baumann, Kame’enui & Ash, 2003; Becker, 1977; Davis, 1942; Whipple, 1925)

Page 35: Literacy Assessments

Oral Reading Fluency

Predictor of later reading outcomes (Fuchs, Fuchs, Hosp, & Jenkins, 2001; Shinn, 1998)

Richard Wagner, FCRR:

The primary concern in preventing reading difficulties is decoding; it is the most serious threat to reading achievement.

Studies with thousands of children, replicated 3 times: nearly all poor comprehenders had decoding AND vocabulary deficits.

Only .2% to .5% of poor comprehenders were adequate decoders.

Page 36: Literacy Assessments

One Minute Probes

“counting the number of words read correctly from text under standardized 1-minute testing conditions is an excellent indicator of general reading achievement, including comprehension, for most students”
– Mark R. Shinn, Advanced Applications of Curriculum-Based Measurement (1998)
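
As a small worked example (not from the presentation), the Python sketch below scores a hypothetical 1-minute probe: words correct equals words attempted minus errors, prorated to 60 seconds if the timing was not exactly one minute. All numbers are invented for illustration.

    # Minimal sketch: scoring a 1-minute oral reading fluency (ORF) probe.
    # The counts and timing below are hypothetical examples.
    words_attempted = 118   # words the student read aloud before time expired
    errors = 6              # misread, substituted, or omitted words
    seconds = 60            # standardized 1-minute timing

    words_correct = words_attempted - errors
    wcpm = words_correct * 60 / seconds   # prorate if the probe ran slightly over or under 60 s
    print(f"Words correct per minute (WCPM): {wcpm:.0f}")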

Page 37: Literacy Assessments

Rate of Improvement

Fuchs et al. (1993): reasonable expectations for weekly rates of improvement for average, poor, and disabled readers

Page 38: Literacy Assessments

Calculating Expected Growth

Data points fluctuate significantly
– Establish an aim (goal) line based on expected growth per week and the number of weeks of instruction
– Calculate the trend line
– Calculate the R-squared value, which shows how closely the estimated values for the trend line correspond to the actual data
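
To make the goal-setting arithmetic concrete, here is a minimal Python sketch (not part of the original slides) that builds an aim line from a starting score and an assumed expected growth rate, fits a trend line to weekly progress-monitoring scores with ordinary least squares, and computes the R-squared value described above. The scores and the 1.5 words-per-week growth rate are hypothetical examples, not norms from this presentation.

    # Minimal sketch: aim line, trend line, and R-squared for weekly ORF progress monitoring.
    weeks = list(range(1, 11))                              # 10 weeks of monitoring
    orf_scores = [42, 45, 44, 48, 50, 49, 53, 55, 54, 58]   # words correct per minute (invented data)

    expected_growth_per_week = 1.5                          # assumed expected rate of improvement (wcpm/week)
    baseline = orf_scores[0]

    # Aim (goal) line: baseline plus expected growth per week times weeks of instruction
    aim_line = [baseline + expected_growth_per_week * (w - 1) for w in weeks]
    goal = aim_line[-1]

    # Trend line via ordinary least squares
    n = len(weeks)
    mean_x = sum(weeks) / n
    mean_y = sum(orf_scores) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(weeks, orf_scores)) / sum((x - mean_x) ** 2 for x in weeks)
    intercept = mean_y - slope * mean_x
    predicted = [intercept + slope * x for x in weeks]

    # R-squared: how closely the trend line corresponds to the actual data points
    ss_res = sum((y - p) ** 2 for y, p in zip(orf_scores, predicted))
    ss_tot = sum((y - mean_y) ** 2 for y in orf_scores)
    r_squared = 1 - ss_res / ss_tot

    print(f"Aim-line goal after {n} weeks: {goal:.1f} wcpm")
    print(f"Actual trend: {slope:.2f} wcpm/week, R-squared = {r_squared:.2f}")

Comparing the fitted slope with the expected growth per week shows whether the student is on track to reach the aim-line goal; a low R-squared warns that the trend is not yet a reliable summary of the data.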

Page 39: Literacy Assessments

Correlation Between ORF and Statewide Assessments

Third-grade fluency rate and the third-grade end-of-year state assessment: .66 (Crawford et al., 2001)

Spring ORF and the Oregon reading assessment: .67 (Good et al., 2001)

ORF and reading comprehension in Iowa: .80 (Fuchs et al., 2001)

(FCRR – .50 makes us jump around the room)

Page 40: Literacy Assessments

Assessing Fluency

DIBELS (DIBELS Next) Oral Reading Fluency (Grades 1-6)

Aimsweb Reading CBM, Pearson, Inc. (Grades 1-8)

DRA2 – Developmental Reading Assessment, Second Edition

Intervention Central Oral Reading Fluency Passage Generator

Easy CBM (Grades 1-8)

Ohio Literacy Alliance (Grades 9-12)

Page 41: Literacy Assessments

Means to the End

Comprehension is influenced by:
– Accurate and fluent word reading
– Vocabulary and linguistic competence
– Conceptual and factual knowledge
– Knowledge and skill in the use of cognitive knowledge about what to do when comprehension breaks down

• Reid Lyon, NICHD

Page 42: Literacy Assessments

Sample Comprehension Assessments

• DIBELS DAZE
• Project PROACT Maze Reading Passages
– Vanderbilt University ([email protected])
• DRA-2 – inferential and factual questions
• Writing samples
• DIBELS Oral Reading Retell
• Easy CBM – multiple choice

Page 43: Literacy Assessments

Maze CBM

• Passage of connected text (a construction sketch follows below)
– First sentence intact
– Every nth (e.g., 7th) word deleted
– 3 choices provided
– Timed: 2:30–3:00 minutes

• Fuchs and Fuchs (1992) found that for students with mild disabilities, the stability of maze data was higher than that of other reading measures.
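
The Python sketch below illustrates the construction rules listed above: the first sentence is left intact, every 7th word thereafter is replaced with three choices, and the result would be administered under a 2:30–3:00 time limit. The sample passage and the simple distractor strategy (drawing distractors from elsewhere in the passage) are illustrative assumptions, not the procedure of any specific published maze measure.

    # Minimal sketch of maze passage construction under the rules above.
    import random

    def build_maze(passage, nth=7, seed=0):
        """Keep the first sentence intact; replace every nth later word with three choices."""
        random.seed(seed)
        sentences = passage.split(". ")
        first = sentences[0] + "."
        words = ". ".join(sentences[1:]).split()
        pool = [w.strip(".,") for w in words if w.strip(".,").isalpha()]

        out = []
        for i, word in enumerate(words, start=1):
            core = word.strip(".,")
            if i % nth == 0 and core.isalpha():
                distractors = random.sample([w for w in pool if w.lower() != core.lower()], 2)
                choices = [core] + distractors
                random.shuffle(choices)                      # correct word in a random position
                out.append("(" + " / ".join(choices) + ")" + word[len(core):])
            else:
                out.append(word)
        return first + " " + " ".join(out)

    # Invented sample passage for demonstration
    sample = ("The class walked to the library after lunch. They wanted to find books "
              "about the ocean because their teacher had planned a unit on tide pools. "
              "Each student chose one book and wrote three facts to share with the group.")
    print(build_maze(sample))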

Page 44: Literacy Assessments

Motivation to Read

• Expectancy-Value Theory (Eccles, 1983)

• Motivation is dependent on two factors:
– The extent to which the person expects success or failure
– The value or overall appeal that the person associates with the task

Page 45: Literacy Assessments

Decrease in Motivation to Read

• Motivation to read decreases with age. The decline begins at or about the fourth grade.
– (Durik et al., 2006; Kush & Watkins, 1996; McKenna et al., 1995)

Page 46: Literacy Assessments

Assessing Motivation

• Interest inventories

• Motivation to Read Profile (MRP; Gambrell, Palmer, Codling, & Mazzoni, 1996)
– Reading Survey
– Conversational Interview

Page 47: Literacy Assessments

Let’s Look at Data

• This is literacy data for a middle school

• What do the data tell you?

• What questions do they raise?

• What is missing?

Page 48: Literacy Assessments

How would you lead this district towards a QIP goal?

• Using the LQI tools that you have in your packet, determine what is needed specifically in the area of assessment for literacy in this district.

• What would your goal look like?

• What would your system objective look like?

• What would your student objective look like?

Page 49: Literacy Assessments

What Assessment Activities Would You Recommend?

• How would you assess the goal?

• What activities would you use to analyze the data?