Indiana Interpretive Guide for Statewide Assessments
2019–2020
Indiana Department of Education



Table of Contents

Introduction .......... 1
    Overview of Interpretive Guide .......... 1
    Overview of Indiana Assessments .......... 1
    Test Design Principles .......... 2
Principles of Reporting, Interpretation, and Use .......... 9
    Overall Scale Scores .......... 9
    Standard Error of Measurement .......... 10
    Proficiency Levels .......... 11
    Sample Reports .......... 15
    Individual Score Reports .......... 16
    Interpretation of Aggregate Scores .......... 34
    Interpretation of Aggregate-Level ORS Reports .......... 34
    Preliminary Results .......... 45
Glossary .......... 47
References .......... 49


List of Tables

Table 1: Blueprint and Item Specification Location by Assessment Program .......... 4
Table 2: Location of ILEARN PLDs .......... 7
Table 3: Location of I AM PLDs .......... 7
Table 4: Location of ISTEP+ Grade 10 PLDs .......... 8
Table 5: Location of IREAD-3 PLDs .......... 8
Table 6: ILEARN Proficiency Levels .......... 12
Table 7: I AM Proficiency Levels .......... 13
Table 8: ISTEP+ Grade 10 Proficiency Levels .......... 14
Table 9: IREAD-3 Proficiency Levels .......... 15
Table 10: Reports Available to Parents/Guardians and Educators .......... 16
Table 11: ILEARN Scale Score Ranges .......... 19
Table 12: I AM Scale Score Ranges .......... 20
Table 13: IREAD-3 Scale Score Ranges .......... 20
Table 14: ISTEP+ Grade 10 Scale Score Ranges .......... 21
Table 15: Reporting Category Score .......... 21
Table 16: Example of Reporting Category and Next Steps .......... 22
Table 17: ILEARN Condition Codes .......... 24
Table 18: ISTEP+ Grade 10 Condition Codes .......... 25
Table 19: ILEARN Narrative Writing Rubric .......... 25
Table 20: ILEARN Informational Writing Rubric .......... 26
Table 21: ILEARN Opinion Writing Rubric .......... 27
Table 22: ILEARN Explanatory Writing Rubric .......... 28
Table 23: ISTEP+ Grade 10 Writing Rubric Grades 5–12 .......... 31
Table 24: ISTEP+ Grade 10 Grammar and Usage Rubric Grades 9–12 .......... 32

List of Figures

Figure 1: Evidence-Centered Design .......... 3
Figure 2: Annotated ILEARN Mathematics Grade 3 Blueprint .......... 5
Figure 3: Precision in Measurement .......... 11
Figure 4: Individual Score Reports .......... 17
Figure 5: Aggregate-Level Subject Detail Report .......... 35
Figure 6: Aggregate-Level Reporting Category Detail Report .......... 37
Figure 7: Aggregate-Level Standard Detail Report .......... 39
Figure 8: Student Roster Subject Report .......... 41
Figure 9: Student Roster Reporting Category/Strand Report .......... 43
Figure 10: Preliminary Results Timeline .......... 45


Introduction

Overview of Interpretive Guide

The Indiana Interpretive Guide for Statewide Assessments is designed to help educators, parents, students, and other stakeholders understand and interpret the results of Indiana’s four assessments. These four assessments include: the Indiana Learning Evaluation Assessment Readiness Network (ILEARN); Indiana’s Alternate Measure (I AM); the Indiana Reading Evaluation and Determination, Grade 3 (IREAD-3); and the Indiana Statewide Testing for Educational Progress-Plus (ISTEP+) Grade 10 assessments. This guide provides information on how to appropriately interpret data in Individual Student Reports (ISRs) and other reports available to educators through the Online Reporting System (ORS).

Overview of Indiana Assessments

Indiana’s assessments support instruction and student learning by providing feedback to educators and parents. Educators and parents can use this feedback to inform instructional strategies and content that help guide student learning. All Indiana assessment programs discussed in this guide are criterion-referenced and summative. Criterion-referenced assessments measure students’ knowledge and ability based on specific standards, such as the Indiana Academic Standards (IAS).

Summative assessments provide useful data for measuring growth, proficiency, and/or learning gaps between different groups of students at a specific point in time. Summative assessments are frequently considered high-stakes evaluations. Results of high-stakes evaluations can inform accountability measures for a school or determine a student’s promotion to the next grade level. Corporations, schools, educators, and parents use summative assessments to evaluate student learning at the end of an instructional unit, such as at the end of the school year, by comparing assessment results against a standard or benchmark. Educators and parents can compare performance across students, classes, schools, and corporations.

The 2019–2020 Indiana Assessments Overview Chart on the Indiana Department of Education’s (IDOE’s) website provides an overview of the key aspects of each program (ILEARN, I AM, IREAD-3, and ISTEP+ Grade 10) discussed in this guide.

The Released Items Repository (RIR) for each program, available on the Indiana Assessment Portal, allows parents, educators, and students to view sample assessments for each grade and content area. The RIR helps parents, educators, and students prepare for testing by allowing users to practice taking assessments before the official test administration. Users can view the layout of each test, interact with test content and features, and learn about accommodations available for each assessment.


Test Design Principles

This section of the interpretive guide provides an overview of the test design principles used in the development of all Indiana assessments. Following these principles means the assessments are grounded in research and allow for specific claims about student knowledge and understanding. Outlined below are definitions of key terms and ideas. While this section provides useful background on the assessments, a thorough understanding of these ideas is not necessary to interpret the reports.

Test validity is the extent to which an assessment measures what it is supposed to measure. An assessment is created to measure what a specific group of students knows about a topic or topics under specific conditions (AERA, APA, and NCME, 2014; NRC, 2007). The results of an assessment are valid only if it is taken by the intended group under the intended conditions. For example, if first grade students take a reading comprehension test intended for fifth grade students, the test will not be valid; it will not be measuring what it is supposed to measure, which, in this case, is the reading comprehension of fifth grade students. Similarly, if students are given extra time to complete a timed exam, the exam will not be valid. In both examples, the exam is not measuring what was intended; in the second case, the intent was to measure what students can complete within a specific length of time.

Reliability refers to the consistency of a measurement over time. For example, the score should remain relatively consistent, or reliable, if the same student takes the same assessment on different occasions. When used as designed, test data can provide useful information.

Evidence-Centered Design

Each Indiana assessment was developed with Indiana educators using evidence-centered design. Assessments are designed using this process by gathering reliable evidence to support claims about student learning and knowledge. Evidence-centered design uses test blueprints and item specifications to ensure the validity and reliability of the assessment results. Test blueprints pinpoint the content to be assessed. Item specifications show how each standard should be assessed.

Assessment design begins with a clear outline of the desired content claims in relation to student learning. Content claims are statements that can be made about student learning based on student performance on an assessment. The primary content claim made about Indiana students is whether students are college- and career-ready. Test design principles consider the evidence that is collected at each step to ensure that reporting can represent the claim being made about students.


Figure 1: Evidence-Centered Design

A student’s ability is estimated based on the evidence gathered from each item (i.e., a student’s responses to an item). The overall test results, or how a student performed on the test, will indicate whether a student has achieved the content claim and is college- and career-ready.

The IAS were designed to ensure that students across grades are receiving the instruction they need to be on track for college- and career-readiness by graduation. The IAS were approved by the Indiana State Board of Education in April 2014 for English/Language Arts (ELA) and Mathematics and March 2015 for Social Studies. The IAS for Science were updated in 2016 to reflect changes in Science content. Indiana’s Alternate Academic Standards, or Content Connectors, were approved by the Indiana State Board of Education in June of 2018. More information about the IAS and Content Connectors can be found on IDOE’s website: https://www.doe.in.gov/standards.

Blueprints define the essential content that an assessment will measure by determining which standards are the most important to assess. Standards are prioritized based on the knowledge, skills, and abilities educators believe students need to be on track for college- and career-readiness, post-secondary education, or integrated employment. Indiana educator committees created the blueprints for all Indiana assessments. Test blueprints do not change once they have been set for an assessment program unless standards or policy changes require edits. The standards on the blueprint are measured with test items.

Item specifications define how each standard will be measured and include the cognitive complexity of the standard (i.e., the difficulty level of that standard), the evidence required to show a student has mastered the standard, and possible item types for the assessment. An item specification exists for every standard measured on each of the Indiana assessments. Item writers use item specifications to carefully develop items. Item specifications are also a resource for educators. Links to blueprints and item specifications for all grades and subjects can be found in Table 1.

Evidence-centered design culminates in the delivery of the tests to the student. Successful student performance can show mastery of IAS. Proficiency levels on ISRs demonstrate a student's performance.

Test Blueprints

Test blueprints reflect reporting categories. Reporting categories are groups of similar standards that are assessed within each grade and subject. These sets of standards can be used to identify each student’s relative strengths or weaknesses for different subdomains of a content area. For example, in Mathematics, some reporting categories include Algebraic Thinking and Data Analysis, Computation, and Number Sense. Student performance within a reporting category indicates how proficient a student is with that subdomain of content.

IDOE worked closely with Indiana educators to create blueprints that guide the item development process for all grades and subjects. During a workshop, educators discussed which potential reporting categories and reporting frameworks would best support instruction in Indiana. Educators and IDOE used the results to create blueprints. ILEARN blueprints represent educator feedback on how the grades 3–8 standards need to be assessed to show that students are on track to be college- and career-ready. Educators followed the same process to develop the I AM blueprints, considering the ILEARN priorities, the Content Connectors, and the expectation that students be on track for post-secondary education or integrated employment. ISTEP+ Grade 10 was established as a reflection of students’ college- and career-readiness as the final high school assessment mandated by the state for accountability and graduation requirements. IREAD-3 blueprints were defined to represent the priority of standards for reading foundations by the end of grade 3.

Table 1 gives the blueprint and item specification location for each assessment program.

Table 1: Blueprint and Item Specification Location by Assessment Program

Blueprint and Item Specification Location

ILEARN

I AM

ISTEP+ Grade 10

IREAD-3


Figure 2 shows an annotated ILEARN Mathematics grade 3 blueprint.

Figure 2: Annotated ILEARN Mathematics Grade 3 Blueprint


The bottom line indicates the total number of operational items used in calculating student scores. The grade 3 Mathematics ILEARN assessment has between 46 and 48 operational items. The Reporting Category column indicates the overall percentage of the assessment characterized by each specific reporting category. The last column indicates the overall item range for each specific reporting category. In this example, the first reporting category is Algebraic Thinking and Data Analysis. Each grade 3 Mathematics assessment has 9–11 Algebraic Thinking and Data Analysis items that cover 19–24 percent of the overall test length.

The Standard column lists all standards assessed for the given grade and subject by reporting category. The Standard Item Range and Standard % of Test columns indicate the minimum and maximum number of items per standard on each assessment, as well as how much of the assessment is about each standard. For example, each grade 3 Mathematics assessment has a minimum of one and a maximum of three items from Standard 3.AT.1. These items cover 2–7 percent of the overall test length. There is a minimum of 0 and a maximum of 2 items from Standard 3.AT.2, covering 0–4 percent of the overall test length. The different values for minimum and maximum item counts were determined by the standard weights applied by educators during blueprint construction. Educators determined that Standard 3.AT.1 was a higher priority, so it has a higher minimum and maximum, and that Standard 3.AT.2 was a lower priority, so it has a lower minimum and maximum.
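The percentages in the blueprint follow from dividing an item count by the number of operational items on a form. As a rough illustration only (using the grade 3 Mathematics figures quoted above, not an official IDOE calculation), the sketch below reproduces the approximate percent-of-test ranges:

```python
# Illustrative check of the blueprint percentages described above.
# Totals (46-48 operational items) and item ranges come from the grade 3
# Mathematics example in this guide; this is not an official IDOE calculation.

def percent_of_test(min_items, max_items, min_total=46, max_total=48):
    """Approximate percent-of-test range implied by an item range."""
    low = round(100 * min_items / max_total)   # fewest items on the longest form
    high = round(100 * max_items / min_total)  # most items on the shortest form
    return low, high

print(percent_of_test(9, 11))  # Algebraic Thinking and Data Analysis -> (19, 24)
print(percent_of_test(1, 3))   # Standard 3.AT.1 -> (2, 7)
print(percent_of_test(0, 2))   # Standard 3.AT.2 -> (0, 4)
```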

Educators identified some standards as important for inclusion on the assessment but did not prioritize them highly enough to be reported as a separate reporting category. The Process Standards for all ILEARN Mathematics assessments include this type of standard.

Educators, parents, and students can review a student’s reporting category performances on his or her ISR. The ISR reports reporting category performance as Below, At/Near, or Above the At Proficiency performance expectation for the given reporting category. Students characterized as Below Proficiency are below the At Proficiency performance expectation and need additional support and instruction to master the standards represented by the reporting category. Students characterized as Above Proficiency are above the At Proficiency performance expectation and almost always successfully answer questions related to the standards represented by the reporting category. Above Proficiency does not apply to I AM. Students characterized as At/Near the performance expectation perform close to the At Proficiency level, but there is not enough information to fully determine if they are above or below the performance expectation. Educators and parents may ask for additional information about the student’s proficiency to better target remediation and support.

Performance-Level Descriptors

A Performance-Level Descriptor (PLD) outlines the knowledge and skills that students performing at a given level demonstrate in each content area and at each grade level for each standard assessed. For example, in Mathematics, IAS 3.AT.1 states that students should be able to solve real-world problems involving addition and subtraction of whole numbers within 1000 (e.g., by using drawings and equations with a symbol for the unknown number to represent the problem). The PLD for a student demonstrating knowledge and skills at the Below Proficiency level is: Identifies real-world problems as addition or subtraction. The knowledge and skills required for this PLD are less complex than the knowledge and skills required for the At Proficiency PLD. The PLD for a student demonstrating knowledge and skills at the At Proficiency level is: Solves real-world problems involving addition and subtraction of whole numbers within 1000.

IDOE involved Indiana educators in the development of the PLDs for all Indiana summative assessments. Table 2, Table 3, Table 4, and Table 5 include the hyperlinks to access the PLDs for each assessment.

Table 2: Location of ILEARN PLDs

ILEARN Range PLDs, by subject and grade:
Mathematics: Grades 3–8
ELA: Grades 3–8
Science: Grades 4, 6, and Biology
Social Studies: Grade 5 and U.S. Government

Table 3: Location of I AM PLDs

I AM Range PLDs, by subject and grade:
Mathematics: Grades 3–8 and 10
ELA: Grades 3–8 and 10
Science: Grades 4, 6, and Biology
Social Studies: Grade 5


Principles of Reporting, Interpretation, and Use

Scores reflect student achievement on the IAS for ILEARN, IREAD-3, and ISTEP+ Grade 10 and on the Content Connectors for I AM. Indiana’s assessments support instruction and student learning by providing feedback to students, educators, and parents. Parents and educators may monitor student achievement at the student or school level using a variety of reporting metrics.

Parents and educators should interpret all assessment reports and scores with caution. Consider the following when reviewing reports and scores:

• Scale scores are estimates of true scores and have some level of error associated with them. See the Standard Error of Measurement for more details about the level of error and interpretation.

• Aggregated score reports represent group characteristics. Users must consider the number of students in each group when performance is compared across groups. The sections below outline additional details to consider when viewing aggregate reports.

• Summative assessment results should not be the only piece of information considered when reviewing student performance. Summative assessment results provide limited information. Other sources of data, such as classroom assessments and teacher evaluations, should be considered when making decisions regarding student learning. Assessment scores reflect a student’s performance on a single day and may vary depending on several factors.

Overall Scale Scores

For all Indiana assessments, students receive an overall subject area score called a scale score. Scale scores are standardized scores that are comparable across years and test forms. Items on assessments range in difficulty from easy to hard. Two students who correctly answered the same number of items might receive different scale scores because the difficulty of the items each student answered may differ. Scale scores are a consistent measure across test forms whether a student takes a fixed-form or computer adaptive test.

Attemptedness

There may be times when a student starts an assessment but does not finish it. Each assessment has rules that determine when a student has “attempted” an assessment and will receive a score, and how an incomplete assessment will be scored.

A student must answer a minimum number of items on the assessment to receive a useful and trustworthy score. The same is true for reporting category scores. Most students complete their assessment and receive both overall and reporting category scores across all grades, subjects, and assessments. In some cases, a student may receive an overall score but no reporting category scores. For ISTEP+ Grade 10, it is possible to receive one or more reporting category scores but not an overall score.

For all assessments, if a student started an assessment but did not answer enough items to demonstrate their knowledge and skills, their score report will show as Undetermined (UND). Students may have a UND for overall and reporting category scores or a UND for only reporting category scores, depending on the number of items the student answered. For ISTEP+ Grade 10, students could receive a UND for the overall score but receive scores for one or more reporting categories. For the I AM assessment, if a student demonstrates that he or she is not engaged with the assessment by continuously not responding to items, the score report will show No Mode of Communication (NMC).

If parents, teachers, principals, or other school personnel believe their student should have received reported scores, they should contact the Indiana Assessment Help Desk.

Standard Error of Measurement

All students have a given level of knowledge and skills. Assessments are designed to measure what students know and can do, and that process can be complex. Student performance on an assessment may vary due to a variety of reasons (e.g., they are not feeling well, or they are not feeling motivated). Student knowledge and skills cannot be measured precisely in the way an object’s weight can: an object’s weight will always remain the same, while student performance will vary depending on the circumstances in which the assessment is taken. Figure 3 illustrates this idea.

The Standard Error of Measurement (SEM) is the range in which a student’s “true score” is expected to fall. A student’s “true score” is what he or she knows and can do. SEM incorporates the factors that affect a student’s performance and is useful to students, parents, and teachers. It acknowledges the difficulty of measuring a true score by providing a likely range of the student’s knowledge and skills.


Figure 3: Precision in Measurement

SEM allows users to estimate the score range within which a student’s score would likely fall if the same assessment were given to the student multiple times. For example, a scale score of 2535 with an SEM of 22 indicates that if the student completed the same test multiple times, the score would likely fall between 2513 and 2557. Scale scores and SEMs will vary based on the test and student. All test scores, including scores on assessments and quizzes designed and administered by classroom teachers, are subject to some degree of measurement error.
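As a minimal sketch of the arithmetic behind that example (the reported scale score plus or minus one SEM; the values are the ones quoted above and the function is purely illustrative):

```python
# Illustrative arithmetic for the likely score range implied by one SEM.
# The values match the example above; they are not tied to any particular test.

def likely_range(scale_score, sem):
    """Band of plus or minus one SEM around a reported scale score."""
    return scale_score - sem, scale_score + sem

print(likely_range(2535, 22))  # -> (2513, 2557)
```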

Proficiency Levels

The Every Student Succeeds Act (ESSA) is federal legislation that requires student achievement to be reported in terms of at least three proficiency levels. Proficiency levels are ranges on a student achievement scale that classify students according to their mastery of the content standards. One of these proficiency levels must be designated as the proficient level. Each state must determine the number of proficiency levels to use and the meanings associated with those levels. Educators and parents can use proficiency level information to help plan individual instructional goals for the student. ILEARN, I AM, ISTEP+ Grade 10, and IREAD-3 assessments each have a slightly different purpose, so the proficiency levels for each assessment vary.
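To make the idea of proficiency levels as score ranges concrete, the short sketch below classifies a scale score against a set of cut scores. The cut scores and the example score are made up for illustration only; the actual scale score ranges for each assessment appear in Table 11 through Table 14.

```python
# Illustrative classification of a scale score into a proficiency level.
# The cut scores below are hypothetical; the actual scale score ranges for
# each assessment and proficiency level appear in Tables 11-14.

ILEARN_CUTS = [
    (2510, "Level 2: Approaching Proficiency"),  # hypothetical cut score
    (2540, "Level 3: At Proficiency"),           # hypothetical cut score
    (2575, "Level 4: Above Proficiency"),        # hypothetical cut score
]

def proficiency_level(scale_score, cuts=ILEARN_CUTS):
    """Return the highest level whose cut score the scale score meets."""
    level = "Level 1: Below Proficiency"  # default when no cut score is met
    for cut, label in cuts:
        if scale_score >= cut:
            level = label
    return level

print(proficiency_level(2535))  # -> "Level 2: Approaching Proficiency"
```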

ILEARN

IDOE convened a committee of Indiana stakeholders in May 2018 to determine the proficiency levels for ILEARN. Stakeholders determined that four proficiency levels were appropriate for the ILEARN grade-level assessments and Biology End-of-Course Assessment (ECA). These four proficiency levels demonstrate the varying levels of proficiency a student can achieve and indicate whether a student is on track for college- and career-readiness. The committee agreed to have two proficiency levels (Below Proficiency and Approaching Proficiency) to describe students who were not yet proficient. These levels illustrate both a student’s current knowledge and the instructional support that a student needs to demonstrate proficiency. The committee also determined two proficiency levels (At Proficiency and Above Proficiency) to describe proficient students. Students who are At Proficiency have mastered grade-level standards and are on track to be college- and career-ready. Students who are Above Proficiency have mastered standards at the next grade level and are also on track for college- and career-readiness.

In February 2019, a group of Indiana stakeholders convened to determine the proficiency levels for the ILEARN U.S. Government ECA. The ILEARN U.S. Government ECA is an optional assessment. Students achieve an At Proficiency or Below Proficiency level of performance. The description of each ILEARN proficiency level is included in Table 6 below and on IDOE's website.

Table 6: ILEARN Proficiency Levels

ILEARN Proficiency Levels Proficiency Level Description

Level 1: Below Proficiency

Indiana students below proficiency have not met current grade level standards. Students may require significant support to develop the knowledge, application, and analytical skills needed to be on track for college- and career-readiness.

Level 2: Approaching Proficiency

Indiana students approaching proficiency have nearly met current grade level standards by demonstrating some basic knowledge, application, and limited analytical skills. Students may require support to be on track for college- and career-readiness.

Level 3: At Proficiency

Indiana students at proficiency have met current grade level standards by demonstrating essential knowledge, application, and analytical skills to be on track for college- and career-readiness.

Level 4: Above Proficiency

Indiana students above proficiency have mastered current grade level standards by demonstrating more complex knowledge, application, and analytical skills to be on track for college- and career-readiness.

I AM

IDOE convened a committee of Indiana stakeholders in August 2018 to determine the proficiency levels for I AM. The stakeholders determined that the number of proficiency levels for I AM would be different from the number for ILEARN. The goal of proficiency in ILEARN is college- and career-readiness, while the goal of proficiency in I AM is to be on track for post-secondary education or competitive integrated employment. The stakeholders recommended three proficiency levels for I AM – Below Proficiency, Approaching Proficiency, and At Proficiency. They determined that only one proficiency level was required to describe whether the student was proficient. They also determined that two proficiency levels below proficiency would allow parents and educators to gain more specific information about the students who are not meeting proficiency levels. Table 7 and IDOE’s website include information on how these levels are defined.


Table 7: I AM Proficiency Levels

I AM Proficiency Levels Proficiency Level Description

Level 1: Below Proficiency

Indiana students below proficiency have not met current grade level Content Connectors. Students may require significant support to develop the knowledge, application, and skills to be on track for post-secondary education or competitive integrated employment.

Level 2: Approaching Proficiency

Indiana students approaching proficiency have nearly met current grade level Content Connectors by demonstrating some basic knowledge, application, and skills. Students may require support to be on track for post-secondary education or competitive integrated employment.

Level 3: At Proficiency

Indiana students at proficiency have met current grade level Content Connectors by demonstrating essential knowledge, application, and skills to be on track for post-secondary education or competitive integrated employment.

ISTEP+ Grade 10

ISTEP+ Grade 10 proficiency levels are also connected to the test’s purpose. ISTEP+ Grade 10 is a Graduation Qualifying Examination (GQE). Parents can find information on Indiana high school graduation requirements on IDOE's website. Students in cohorts 2019 through 2022 are required to pass this assessment to fulfill their high school graduation requirements. Students who achieve Pass or Pass+ meet the graduation requirement. Students who achieve Did Not Pass should receive extra support and remediation from their school and may participate in retest opportunities as needed. Table 8 and IDOE’s website include information on the Did Not Pass, Pass, and Pass+ proficiency levels for ELA and Mathematics.


Table 8: ISTEP+ Grade 10 Proficiency Levels

ISTEP+ Grade 10 Proficiency Levels Proficiency Level Subject Area Description

Did Not Pass

ELA

Tenth grade students performing at the Did Not Pass level demonstrate limited understanding when reading, comparing, and responding to a range of grade-level appropriate texts, including literature and nonfiction. Students display limited writing skills using basic, appropriate Standard English conventions when producing different writing forms.

Mathematics

Did Not Pass students demonstrate limited mathematical and problem-solving skills. Students may have difficulty when solving problems with linear and compound inequalities, quadratics, and systems of equations, and the complexity of algebra may be an obstacle for Did Not Pass students. Also, math topics including geometry, measurement, data analysis, and statistics can be stumbling blocks for students. Did Not Pass students may have difficulty making decisions about how to approach problem-solving situations, how to communicate their ideas, and how to apply mathematical knowledge to other situations.

Pass

ELA

Tenth grade students performing at the Pass level demonstrate proficient understanding when reading, comparing, and responding to a range of grade-level appropriate texts, including literature and nonfiction. Students display proficient writing skills using mostly appropriate Standard English conventions when producing different writing forms.

Mathematics

Pass students demonstrate proficient mathematical and problem-solving skills. Students are capable of solving problems with linear and compound inequalities, quadratics, and systems of equations, and they are competent in the areas of geometry, measurement, data analysis, and statistics. Pass students are skilled with algebra concepts, such as writing and solving linear, exponential and quadratic equations. Pass students experience success when solving problems, communicating ideas, and applying mathematical knowledge to a variety of situations.


Pass+

ELA

Tenth grade students performing at the Pass+ level demonstrate advanced understanding when reading, comparing, and responding to a range of grade-level appropriate texts, including literature and nonfiction. Students display advanced writing skills using appropriate Standard English conventions when producing different writing forms.

Mathematics

Pass+ students demonstrate advanced mathematical and problem-solving skills. Students solve multi-step problems with rational and irrational numbers, exponents, and square roots and demonstrate knowledge in the areas of geometry, measurement, data analysis, statistics, and probability. Pass+ students display highly developed skills with algebra concepts and functions, including writing and solving linear and compound inequalities, quadratics, and systems of linear equations. Pass+ students solve sophisticated problems, support their solutions, and generalize the results to other situations.

IREAD-3

The purpose of IREAD-3 is to determine whether a student is proficient in foundational reading standards through grade 3. IREAD-3 has two proficiency levels: Pass and Did Not Pass. Table 9 includes information on how these proficiency levels are defined. IDOE issued guidance for parents and educators regarding the retention of a student who does not pass IREAD-3 prior to the student’s projected grade 4 year.

Table 9: IREAD-3 Proficiency Levels

IREAD-3 Proficiency Levels Proficiency Level Description

Did Not Pass: Students demonstrate limited understanding when reading and responding to grade-level literary and informational texts. Students have difficulty identifying and comprehending new variations of word meaning and new text-based vocabulary.

Pass: Students demonstrate proficient understanding when reading and responding to grade-level literary and informative texts. Students identify and comprehend most new variations of word meaning and new text-based vocabulary.

Sample Reports

Table 10 shows the various reports that are available to parents and educators by program. Schools distribute ISRs to families in a secure manner. Educators may access ISRs, the Aggregate-Level Reporting Category Detail Report, the Aggregate-Level Subject Detail Report, the Aggregate-Level Standard Detail Report, the Student Roster Subject Report, and Student Roster Reporting Category/Strand Report.

Table 10: Reports Available to Parents/Guardians and Educators

The sections below provide an explanation of the various components included in the ISRs and the other reports available in ORS.

Individual Score Reports

Interpretation of Individual Score Reports

ISRs provide details about a student’s performance on an assessment. The information included in a student’s ISR will vary based on the assessment and subject. Figure 4 includes the different sections of the ISR for each assessment.


Figure 4: Individual Score Reports


Common Elements Across All Assessments

ILEARN, I AM, ISTEP+ Grade 10, and IREAD-3 ISRs include the following components that are common across all assessments.

Basic Test Information

All Indiana ISRs include Basic Test Information. Basic Test Information includes the student’s name, the student’s Student Testing Number (STN), the name of the assessment, and the school year. Basic Test Information is indicated by numeral 1 in Figure 4.

Proficiency Levels

All Indiana ISRs include the proficiency level related to the student’s score on the assessments. Numerals 6, 7, and 8 in Figure 4 provide examples of how this information appears on each assessment’s ISR. The Proficiency Levels section on page 11 provides more information on the proficiency levels for each assessment. The proficiency level for ILEARN 3–8, ILEARN Biology ECA, and I AM assessments is indicated by the heading, Proficiency Level. The proficiency level for ISTEP+ Grade 10 is indicated by the heading, Performance Level. Performance Levels are discussed on page 22. The proficiency level for IREAD-3 and ILEARN U.S. Government ECA is indicated by the heading, Passing Status. It is discussed in the Passing Status section on page 22.

Scale Scores

Each ISR includes a scale score. The scale score is indicated by numeral 3 in Figure 4. Table 11, Table 12, Table 13, and Table 14 provide information on which proficiency level is associated with which scale score range for ILEARN, I AM, ISTEP+ Grade 10, and IREAD-3, respectively. Except for the ILEARN U.S. Government ECA, ISRs for each assessment include a scale score that is associated with the student’s proficiency level. ISRs for ILEARN 3–8, ILEARN Biology ECA, I AM, ISTEP+ Grade 10, and IREAD-3 also include a bar graph that visually depicts the student’s scale score and proficiency level.

Table 11: ILEARN Scale Score Ranges


Table 12: I AM Scale Score Ranges

Table 13: IREAD-3 Scale Score Ranges


Table 14: ISTEP+ Grade 10 Scale Score Ranges

Reporting Category Performance Measure Table

ISRs provide detailed information about student performance within reporting categories. Reporting categories represent groups of similar standards that are assessed within each grade and subject. Test blueprints for each grade and subject list these reporting categories. Table 1 includes the location of the test blueprints for each program.

There are a variety of ways to report reporting category scores as outlined in Table 15. ILEARN uses a reporting category scale and SEM to determine a student's reporting category performance level (e.g., Above, At/Near, Below). I AM and IREAD-3 assessments report percentage correct. Refer to numeral 10 in Figure 4. ISTEP+ Grade 10 assessments use the Indiana Performance Index (IPI) and an indication of Mastery. Refer to numeral 12 in Figure 4. IPI is further discussed in the Indiana Performance Index section on page 33.

Table 15: Reporting Category Score

Assessment: Reporting Category Score
ILEARN: Reporting Category Performance
I AM: Percent Correct
IREAD-3: Percent Correct
ISTEP+ Grade 10: IPI and Indication of Mastery
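For the ILEARN row above, this guide notes that a reporting category scale score and its SEM determine the Below, At/Near, or Above classification, but it does not spell out the exact rule. The sketch below illustrates one common way such a comparison can work; the 1.5 SEM band width, the function name, and the example numbers are assumptions for illustration, not IDOE's actual procedure.

```python
# Illustrative sketch: classifying reporting category performance as Below,
# At/Near, or Above the At Proficiency expectation. The guide says ILEARN uses
# the reporting category score and its SEM for this, but does not give the
# exact rule; the 1.5 x SEM band and the numbers below are assumptions.

def category_performance(score, sem, proficiency_cut, band_width=1.5):
    """Compare a score band around the category score with the proficiency cut."""
    upper = score + band_width * sem
    lower = score - band_width * sem
    if upper < proficiency_cut:
        return "Below"    # the whole band falls below the expectation
    if lower > proficiency_cut:
        return "Above"    # the whole band falls above the expectation
    return "At/Near"      # the band overlaps the cut; not enough information

print(category_performance(2520, 30, 2540))  # -> "At/Near" (hypothetical values)
```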

There is a feature in ORS for students who take the ILEARN and I AM assessments called Next Steps. Educators and parents may use the Next Steps feature to better understand student test results and help further support their student. The Next Steps information suggests activities educators and parents/guardians may do with their student to help improve their student’s knowledge and performance on future assessments.

ILEARN displays Next Steps information at the reporting category level. I AM displays Next Steps information at the overall subject proficiency level. An example of an ILEARN ELA grade 6 Reporting Category and Next Steps is listed in Table 16.


Table 16: Example of Reporting Category and Next Steps

Reporting Category: Key Ideas and Textual Support/Vocabulary

Next Steps: Ask your student to read a story or nonfiction text and explain how the author develops central ideas, events, and characters. Ask your student to determine the meaning of unfamiliar words and discuss how specific words and phrases shape the text.

Other Considerations

There are additional elements on some student ISRs that apply only to certain assessments. Additional information is available below.

Passing Status

ISRs for ILEARN U.S. Government ECA and IREAD-3 indicate a student’s passing status. Numeral 8 in Figure 4 provides an example of how Passing Status appears on an ISR. A single cut score determines whether the student passed or did not pass. A cut score is the lowest possible score on a standardized test that a student must earn to either pass or be considered proficient.

For the ILEARN U.S. Government ECA, a student will be designated as either Below Proficiency or At Proficiency. ILEARN U.S. Government ECA is an optional assessment that students can take after completing their U.S. Government course. Some schools use student assessment results to determine a student’s final course grade.

For IREAD-3, a student will be designated as either Pass or Did Not Pass. All grade 3 students enrolled in accredited public and private schools are required to pass IREAD-3 per Indiana state legislation. Students can retake the assessment in grades 4 or 5 if they do not pass in grade 3.

Performance Level

There are three PLDs for ISTEP+ Grade 10 that indicate how a student performed on the assessment: Pass+, Pass, and Did Not Pass. The performance level indicates where the student is categorized based on the student’s overall scale score. Refer to numeral 7 in Figure 4 for an example of how the performance level appears on an ISR. The cut score for the Pass performance level represents the point on the scale at which a student is considered proficient. Students have met proficiency if they are categorized as Pass+ or Pass. Additional information about PLDs and examples at each level can be found on IDOE's website.

Lexile® Measure

ILEARN ELA and IREAD-3 student ISRs include information on student Lexile® measures. Numeral 4 in Figure 4 provides an example of how the Lexile® measure appears on an ISR. Lexile® measures are a single score followed by the letter L. For example, a grade 3 student will receive a Lexile® measure in the range of 415L–760L. This score reflects a student’s reading ability and the text complexity of different reading materials. Educators use this score to match students with appropriate texts to ensure academic growth and success. This score can also help educators identify any areas where a student might be struggling and need additional support. Educators and parents can track a student’s Lexile® score from grade 3 through ILEARN grade 8 to ensure that the student is growing academically and on a path for college- and career-readiness.

Quantile® Measure

ILEARN Mathematics student ISRs include information on student Quantile® measures. Numeral 5 in Figure 4 provides an example of how the Quantile® measure appears on the ISR. A Quantile® measure is a single score indicated by a number followed by the letter Q. For example, a grade 3 student will receive a Quantile® measure in the range of 305Q–555Q. Quantile® measures range from 0Q to above 1400Q and span kindergarten through high school. Quantile® scores are available for ILEARN Mathematics assessments only.

A Quantile® score reflects a student’s mathematical achievement. This score can help educators determine which skills and concepts a student is ready to learn. It can also assist educators in determining the level of success the student is expected to have with an upcoming mathematical skill. This score can also identify any gaps in a student’s learning so that instruction can be provided to support student growth. It also reflects how the student is growing in mathematical knowledge on a single scale across grade levels.

Comparison Scores Table

ILEARN and I AM student ISRs include a comparison table. Numeral 2 in Figure 4 provides an example of how the comparison scores table appears on an ISR. The comparison scores table shows average scale scores at the school, corporation, and state levels for all ILEARN assessments, except U.S. Government ECA. For ILEARN U.S. Government ECA, the comparison scores table shows the number of students that participated in the assessment at the school, corporation, and state levels. For I AM, the table shows the percentage of students who are performing At Proficiency at the school, corporation, and state levels. Once results are final, educators and parents can use this data to compare their students’ results to those of other students in their school, corporation, or state.

Hand-scored Open-ended Items and Condition Codes

ILEARN and ISTEP+ Grade 10 assessments contain test items that have open-ended responses. These items allow a student to provide a short answer containing several lines of text, or an essay answer containing several paragraphs of text. These items are hand-scored by human scorers prior to score reporting.


Hand-scorers use a combination of scoring tools and resources to ensure they assign correct and consistent scores during the hand-scoring process. These resources include item anchor sets, practice sets, and qualification sets.

Anchor sets are examples of previously scored student responses. The scored responses are accompanied by notes from the hand-scoring leaders that explain the reasoning behind the given score. Hand-scorers use these examples during training and while scoring student responses. Anchor sets show hand-scorers why a student response received the score it did and aid them in making decisions about other student responses they will score.

After the hand-scorers have reviewed the anchor sets, they review the practice sets. Practice sets are a set of student responses that help the hand-scorers apply the scoring rules illustrated in the anchor sets. Hand-scorers practice scoring on these student responses to confirm understanding. After the hand-scorers determine scores for each item in the practice set, they are provided with the correct score for each response and the reasoning for the score.

Finally, hand-scorers complete a qualification set, which will ‘qualify’ each of them to be a hand-scorer. Like a practice set, a qualification set consists of a set of student responses. A qualification set is used to assess the hand-scorers’ understanding of the scoring rules before they can begin live-scoring student responses. After hand-scorers have reviewed and discussed the practice set, they take two qualification sets to ensure each scorer understands the scoring rubric.

In most instances, a student will receive a numeric score on an assessment item. However, there are some instances in which a student might receive a condition code in hand-scoring instead of a numeric score. These codes are assigned to responses that do not meet the scoring rules or criteria for a numeric score. For example, a student response that is blank would receive a specific condition code for blank responses. Table 17 and Table 18 provide information on the condition codes for both ILEARN and ISTEP+ Grade 10.

Table 17: ILEARN Condition Codes

ILEARN Condition Codes (Value: Description)
A: Blank/No Response/Refusal
B: Illegible
C: Written predominantly in a language other than English
D: Insufficient Response/Copied from text
E: Response not related to test questions or scoring rules


Table 18: ISTEP+ Grade 10 Condition Codes

ISTEP+ Condition Codes (Value: Description)
B: Blank Essay, Not Tested (e.g., no response, erased, refusal)
I: Insufficient / Copied from text
L: Non-scorable language
T: Off topic (essay only)
M: Off purpose (essay only)
X: Illegible

ILEARN Writing Dimensions

The writing part of the ILEARN ELA assessment is scored based on the Performance Task (PT) writing rubric for each criterion. Numeral 11 in Figure 4 provides an example of how writing dimension scores appear on the ISR. Each student completes one writing prompt as the final question in a PT. Students may engage in one of the following writing types: Narrative Writing (grades 3–8), Informational Writing (grades 3–5), Opinion Writing (grades 3–5), Explanatory Writing (grades 6–8), or Argument Writing (grades 6–8). Each writing type is scored using a writing rubric specific to the writing type, as illustrated in the tables below. For more information on each rubric, please visit IDOE's website.

Table 19: ILEARN Narrative Writing Rubric

ILEARN Narrative PT Writing Rubric (Grades 3–8)

Organization/Purpose

4 points The organization of the narrative, real or imagined, is fully sustained and the focus is clear and maintained throughout.

3 points The organization of the narrative, real or imagined, is adequately sustained, and the focus is adequate and generally maintained.

2 points The organization of the narrative, real or imagined, is somewhat sustained and may have an uneven focus.

1 point The organization of the narrative, real or imagined, may be maintained but may provide little or no focus.

NS

• Insufficient (includes copied text) • In a language other than English • Off-topic • Off-purpose

Development/Elaboration

4 points The narrative, real or imagined, provides thorough, effective elaboration using relevant details, dialogue, and/or description.

3 points The narrative, real or imagined, provides adequate elaboration using details, dialogue, and/or description.


2 points The narrative, real or imagined, provides uneven, cursory elaboration using partial and uneven details, dialogue, and/or description.

1 point The narrative, real or imagined, provides minimal elaboration using few or no details, dialogue, and/or description.

NS

• Insufficient (includes copied text) • In a language other than English • Off-topic • Off-purpose

Conventions

2 points The response demonstrates an adequate command of conventions.

1 point The response demonstrates a partial command of conventions.

0 points The response demonstrates little or no command of conventions.

NS

• Insufficient (includes copied text) • In a language other than English • Off-topic • Off-purpose

Table 20: ILEARN Informational Writing Rubric

ILEARN Informational PT Writing Rubric (Grades 3–5)

Organization/Purpose

4 points

The response has a clear and effective organizational structure, creating a sense of unity and completeness. The organization is sustained between and within paragraphs. The response is consistently and purposefully focused.

3 points

The response has an evident organizational structure and a sense of completeness. Though there may be minor flaws, they do not interfere with the overall coherence. The organization is adequately sustained between and within paragraphs. The response is generally focused.

2 points

The response has an inconsistent organizational structure. Some flaws are evident, and some ideas may be loosely connected. The organization is somewhat sustained between and within paragraphs. The response may have a minor drift in focus.

1 point The response has little or no discernible organizational structure. The response may be related to the topic but may provide little or no focus.

NS

• Insufficient (includes copied text) • In a language other than English • Off-topic • Off-purpose

Evidence/Elaboration

4 points

The response provides thorough elaboration of the support/evidence for the controlling/main idea that includes the effective use of source material. The response clearly and effectively develops ideas, using precise language.

3 points

The response provides adequate elaboration of the support/evidence for the controlling/main idea that includes the use of source material. The response adequately develops ideas, employing a mix of precise and more general language.

2 points

The response provides uneven, cursory elaboration of the support/evidence for the controlling/main idea that includes uneven or limited use of source material. The response develops ideas unevenly, using simplistic language.

1 point

The response provides minimal elaboration of the support/evidence for the controlling/main idea that includes little or no use of source material. The response is vague, lacks clarity, or is confusing.

NS

• Insufficient (includes copied text) • In a language other than English • Off-topic • Off-purpose

Conventions

2 points The response demonstrates an adequate command of conventions.

1 point The response demonstrates a partial command of conventions.

0 points The response demonstrates little or no command of conventions.

NS

• Insufficient (includes copied text) • In a language other than English • Off-topic • Off-purpose

Table 21: ILEARN Opinion Writing Rubric

ILEARN Opinion PT Writing Rubric (Grades 3–5)

Organization/Purpose

4 points

The response has a clear and effective organizational structure, creating a sense of unity and completeness. The organization is sustained between and within paragraphs. The response is consistently and purposefully focused.

3 points

The response has an evident organizational structure and a sense of completeness. Though there may be minor flaws, they do not interfere with the overall coherence. The organization is adequately sustained between and within paragraphs. The response is generally focused.

2 points

The response has an inconsistent organizational structure. Some flaws are evident, and some ideas may be loosely connected. The organization is somewhat sustained between and within paragraphs. The response may have a minor drift in focus.

1 point The response has little or no discernible organizational structure. The response may be related to the opinion but may provide little or no focus.

NS

• Insufficient (includes copied text) • In a language other than English • Off-topic • Off-purpose

Evidence/Elaboration

4 points

The response provides thorough and convincing elaboration of the support/evidence for the opinion and supporting idea(s) that includes the effective use of source material. The response clearly and effectively develops ideas, using precise language.

3 points

The response provides adequate elaboration of the support/evidence for the opinion and supporting idea(s) that includes the use of source material. The response adequately develops ideas, employing a mix of precise with more general language.


2 points

The response provides uneven, cursory elaboration of the support/evidence for the opinion and supporting idea(s) that includes partial or uneven use of source material. The response develops ideas unevenly, using simplistic language.

Evidence/Elaboration 1 point

The response provides minimal elaboration of the support/evidence for the opinion and supporting idea(s) that includes little or no use of source material. The response is vague, lacks clarity, or is confusing.

NS

• Insufficient (includes copied text) • In a language other than English • Off-topic • Off-purpose

Conventions 2 points The response demonstrates an adequate command of conventions.

1 point The response demonstrates a partial command of conventions.

0 points The response demonstrates little or no command of conventions.

NS

• Insufficient (includes copied text) • In a language other than English • Off-topic • Off-purpose

Table 22: ILEARN Explanatory Writing Rubric

ILEARN Explanatory PT Writing Rubric (Grades 6–11) Organization/Purpose

4 points

The response has a clear and effective organizational structure, creating a sense of unity and completeness. The organization is fully sustained between and within paragraphs. The response is consistently and purposefully focused.

3 points

The response has an evident organizational structure and a sense of completeness. Though there may be minor flaws, they do not interfere with the overall coherence. The organization is adequately sustained between and within paragraphs. The response is generally focused.

2 points

The response has an inconsistent organizational structure. Some flaws are evident, and some ideas may be loosely connected. The organization is somewhat sustained between and within paragraphs. The response may have a minor drift in focus.

1 point The response has little or no discernible organizational structure. The response may be related to the topic but may provide little or no focus.

NS

• Insufficient (includes copied text) • In a language other than English • Off-topic • Off-purpose

Evidence/Elaboration 4 points

The response provides thorough elaboration of the support/evidence for the thesis/controlling idea that includes the effective use of source material. The response clearly and effectively develops ideas, using precise language.

3 points

The response provides adequate elaboration of the support/evidence for the thesis/controlling idea that includes the use of source material. The response adequately develops ideas, employing a mix of precise and more general language.


2 points

The response provides uneven, cursory elaboration of the support/evidence for the thesis/controlling idea that includes uneven or limited use of source material. The response develops ideas unevenly, using simplistic language.

Evidence/Elaboration 1 point

The response provides minimal elaboration of the support/evidence for the thesis/controlling idea that includes little or no use of source material. The response is vague, lacks clarity, or is confusing.

NS

• Insufficient (includes copied text) • In a language other than English • Off-topic • Off-purpose

Conventions 2 points The response demonstrates an adequate command of conventions.

1 point The response demonstrates a partial command of conventions.

0 points The response demonstrates little or no command of conventions.

NS

• Insufficient (includes copied text) • In a language other than English • Off-topic • Off-purpose

Table 20: ILEARN Argumentative Writing Rubric

ILEARN Argumentative PT Writing Rubric (Grades 6–11) Organization/Purpose

4 points

The response has a clear and effective organizational structure, creating a sense of unity and completeness. The organization is fully sustained between and within paragraphs. The response is consistently and purposefully focused.

3 points

The response has an evident organizational structure and a sense of completeness. Though there may be minor flaws, they do not interfere with the overall coherence. The organization is adequately sustained between and within paragraphs. The response is generally focused.

2 points

The response has an inconsistent organizational structure. Some flaws are evident, and some ideas may be loosely connected. The organization is somewhat sustained between and within paragraphs. The response may have a minor drift in focus.

1 point The response has little or no discernible organizational structure. The response may be related to the claim but may provide little or no focus.

NS

• Insufficient (includes copied text) • In a language other than English • Off-topic • Off-purpose

Evidence/Elaboration 4 points

The response provides thorough and convincing elaboration of the support/evidence for the claim and argument(s) including reasoned, in-depth analysis and the effective use of source material. The response clearly and effectively develops ideas, using precise language.

3 points

The response provides adequate elaboration of the support/evidence for the claim and argument(s) that includes reasoned analysis and the use of source material. The response adequately develops ideas, employing a mix of precise with more general language.


2 points

The response provides uneven, cursory elaboration of the support/evidence for the claim and argument(s) that includes some reasoned analysis and partial or uneven use of source material. The response develops ideas unevenly, using simplistic language.

Evidence/Elaboration 1 point

The response provides minimal elaboration of the support/evidence for the claim and argument(s) that includes little or no use of source material. The response is vague, lacks clarity, or is confusing.

NS

• Insufficient (includes copied text) • In a language other than English • Off-topic • Off-purpose

Conventions 2 points The response demonstrates an adequate command of conventions.

1 point The response demonstrates a partial command of conventions.

0 points The response demonstrates little or no command of conventions.

NS

• Insufficient (includes copied text) • In a language other than English • Off-topic • Off-purpose

Each dimension is independently scored and reported on student reports. For the overall prompt score, the Organization/Purpose and Evidence/Elaboration scores are averaged and rounded to an integer. The overall writing prompt score ranges from 0 to 6.
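
To make the arithmetic concrete, the sketch below shows one possible reading of the combination rule described above: the two 4-point dimensions are averaged and rounded, and the 0–2 Conventions score is added so the total falls in the stated 0–6 range. The function name and the exact combination rule are illustrative assumptions, not the published scoring specification.

def overall_writing_score(org_purpose, evidence_elab, conventions):
    # Illustrative only: assumes the two 4-point dimension scores are averaged
    # and rounded, then the 0-2 Conventions score is added (0-6 overall).
    averaged = round((org_purpose + evidence_elab) / 2)
    return averaged + conventions

# Example: 3 (Organization/Purpose), 4 (Evidence/Elaboration), 2 (Conventions)
# round(3.5) -> 4 in Python (halves round to the nearest even integer), so overall = 6
print(overall_writing_score(3, 4, 2))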

ISTEP+ Grade 10 Open-Ended Items ISTEP+ Grade 10 scores open-ended writing items using four writing dimensions (also called traits). Numeral 13 in Figure 4 provides an example of how scores for open-ended items appear on the ISR. Student responses are scored for Ideas and Content, Organization, Style, and Voice on a 6-point rubric. Student responses are scored for grammar and usage on a 4-point rubric. These rubrics are summarized in the tables below. Additional information on these rubrics can be found on the IDOE website.


Table 21: ISTEP+ Grade 10 ELA 2-Point Constructed-Response Rubric

ELA 2-point Constructed-Response (CR) Rubric

2 points Proficient The response fulfills all the requirements of the task. The information given is text-based and relevant to the task.

1 point

Partially Proficient The response fulfills some of the requirements of the task, but some of the information may be too general, too simplistic, or not supported by the text.

0 points

Not Proficient The response does not fulfill the requirements of the task because it contains information that is inaccurate, incomplete, and/or missing altogether.

NOTE: This rubric is applied to reading-comprehension items. Student responses to these items are scored for reading comprehension only.

Table 23: ISTEP+ Grade 10 Writing Rubric Grades 5–12

ISTEP+ Grade 10 Writing Rubric Grades 5–12

6 points

A Score Point 6 paper is rare. It fully accomplishes the task in a thorough and insightful manner and has a distinctive quality that sets it apart as an outstanding performance.

5 points

A Score Point 5 paper represents a solid performance. It fully accomplishes the task, but lacks the overall level of sophistication and consistency of a Score Point 6 paper.

4 points

A Score Point 4 paper represents a good performance. It accomplishes the task, but generally needs to exhibit more development, better organization, or a more sophisticated writing style to receive a higher score.

3 points

A Score Point 3 paper represents a performance that minimally accomplishes the task. Some elements of development, organization, and writing style are weak.

2 points

A Score Point 2 paper represents a performance that only partially accomplishes the task. Some responses may exhibit difficulty maintaining a focus. Others may be too brief to provide sufficient development of the topic or evidence of adequate organizational or writing style.

1 point

A Score Point 1 paper represents a performance that fails to accomplish the task. It exhibits considerable difficulty in areas of development, organization, and writing style. The writing is generally either very brief or rambling and repetitive, sometimes resulting in a response that may be difficult to read or comprehend.


Table 24: ISTEP+ Grade 10 Grammar and Usage Rubric Grades 9–12

ISTEP+ Grammar and Usage Rubric Grades 9–12

4 points

Does the writing exhibit superior command of language skills? A Score Point 4 paper exhibits a superior command of written English language conventions. The paper provides evidence that the student has a thorough control of the concepts outlined in the Indiana Academic Standards associated with the student’s grade level. In a Score Point 4 paper, there are no errors that impair the flow of communication. Errors are generally of the first-draft variety or occur when the student attempts sophisticated sentence construction.

3 points

Does the writing exhibit good control of language skills? In a Score Point 3 paper, errors are occasional and are often of the first-draft variety; they have a minor impact on the flow of communication.

2 points

Does the writing exhibit fair control of language skills? In a Score Point 2 paper, errors are typically frequent and may occasionally impede the flow of communication.

1 point

Does the writing exhibit minimal or less than minimal control of language skills? In a Score Point 1 paper, errors are serious and numerous. The reader may need to stop and reread part of the sample and may struggle to discern the writer’s meaning.

NOTE: The elements of this rubric are applied holistically; no element is intended to supersede any other element. The variety and proportion of errors in relation to the length of the writing sample are considered. A very brief paper consisting of only a few sentences may receive no more than 2 score points.

College- and Career-Readiness Indicator ILEARN student ISRs include a College- and Career-Readiness Indicator. Numeral 9 in Figure 4 provides an example of how the College- and Career-Readiness Indicator appears on the ISR. Indiana educators recommended that ILEARN cut scores reflect what is expected for students to be considered college- and career-ready, based on the IAS for each tested grade and subject. A student in either the At Proficiency or Above Proficiency performance level is on track for college- and career-readiness based on ILEARN assessment results; the College- and Career-Readiness Indicator on the ISR will read “Yes.” A student in either the Below Proficiency or Approaching Proficiency performance level is not on track for college- and career-readiness based on ILEARN results; the College- and Career-Readiness Indicator on the ISR will read “No.” Parents should proceed with caution when reviewing this indicator, as it is just one element of a student’s overall assessment results.
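
A minimal sketch of this rule follows; the performance-level names match those used in this guide, while the function itself is hypothetical and is shown only to make the mapping explicit.

def college_career_ready_indicator(performance_level):
    # "Yes" for At Proficiency or Above Proficiency; "No" otherwise (ILEARN only)
    return "Yes" if performance_level in {"At Proficiency", "Above Proficiency"} else "No"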

Parents are encouraged to visit IDOE’s website to review the ILEARN PLDs and the references made to College- and Career-Readiness for more information.


Indiana Performance Index The Indiana Performance Index (IPI) indicates a student’s performance on the IAS. The IPI is a statistical value that reflects the number of items a student would have answered correctly if the student had responded to 100 similar items for a specific reporting category on the test; it is a better measure of the student’s performance than a simple percentage of correct answers on a small number of questions. IPI target scores for ISTEP+ Grade 10 strands (reporting categories) depend on the items included on the assessment and are calculated separately for online, paper-and-pencil, and breach test forms. Numeral 12 in Figure 4 provides an example of how IPI target scores appear on an ISR.

Target scores at the aggregate level are the average target for all tested students in the selected group. When viewing the Aggregate Performance on Each Strand report, use caution when interpreting the target scores and average strand scores if the group includes combinations of online, paper-and-pencil, and/or breach form testers. In these cases, the Total Percent Mastered indicator may provide a better understanding of how students in the group performed. Target IPI scores on an ISR reflect the calculated targets for that student’s mode of testing (online, paper-and-pencil, or breach).
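
The guide does not publish the IPI formula, but the description above suggests it behaves like an expected percent correct scaled to 100 items. The sketch below illustrates that interpretation together with the aggregate quantities mentioned here (average strand score, average target, and Total Percent Mastered). The variable names, the use of a model-estimated probability of a correct response, and the example numbers are assumptions for illustration only.

from statistics import mean

def ipi_from_probability(p_correct):
    # Illustrative IPI: expected items correct out of 100 similar items, assuming
    # p_correct is an estimated probability of answering a typical item in the
    # reporting category correctly (IDOE's actual computation is not given here).
    return round(100 * p_correct)

def aggregate_strand_summary(strand_scores, target_scores):
    # Average strand score, average target, and Total Percent Mastered for a group.
    mastered = [score >= target for score, target in zip(strand_scores, target_scores)]
    return {
        "average_strand_score": mean(strand_scores),
        "average_target_score": mean(target_scores),
        "total_percent_mastered": 100 * sum(mastered) / len(mastered),
    }

# Example: three students, possibly tested in different modes (hence different targets)
print(aggregate_strand_summary([62, 48, 71], [55, 55, 60]))
# average_strand_score ~60.3, average_target_score ~56.7, total_percent_mastered ~66.7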

Rescore Process As required under Indiana legislation, principals and parents may request a rescore of open-ended, hand-scored items on the ILEARN and ISTEP+ Grade 10 assessments. Rescore windows are scheduled for each ILEARN and ISTEP+ Grade 10 First Time Administration after the conclusion of the test window and align with the release of scores in ORS. ISTEP+ Grade 10 retest assessments are automatically rescored on behalf of students.

Principals facilitate rescore requests with parents or guardians in attendance at the school. Before accessing the secure materials, reviewers must sign a non-disclosure agreement to view items and associated responses in the Test Information Distribution Engine (TIDE), the student registration database. Principals have access to students’ individual open-ended item-level scores in TIDE and to students’ overall scale scores in ORS. The principal accesses TIDE to view a student response along with the student’s score for the item. Secure scoring documentation, including the anchor sets and scoring rubrics used to score the item, is also available in TIDE. Anchor sets include examples of student responses with notes used by hand-scorers during initial training and for self-monitoring during the hand-scoring window. Scoring rubrics contain examples of responses at each score point; for example, these resources explain why a response would receive a “2” rather than a “1” on an item.

Principals can download a student ISR along with hand-scoring materials, review them with a parent or guardian, and determine whether a rescore should be requested for the student. The ORS User Guide provides step-by-step instructions on how to download these reports. The ORS User Guide is available on the Indiana Assessment Portal.


Parents can request a copy of their student’s ISR from ORS prior to the rescore window. Parents should contact their student’s teacher to retrieve these reports. Parents will be able to view only the student’s overall scale score on these reports. Individual item-level scores are viewable in TIDE only during the rescore process with the principal at the student’s school.

Rescore requests cannot be reversed once principals request them. A student's score can go up or down based on the rescore result. Therefore, caution should be taken when requesting rescores for students. Released scores in ORS are considered preliminary during the rescore process, as the results are not yet final.

Interpretation of Aggregate Scores Aggregate Score Reports provide average classroom-, school-, and corporation-level information about overall and reporting category performance on student assessments. This information allows users to see how a student performed in comparison to other students. Aggregate Score Reports include information about student strengths and weaknesses that can be used to improve teaching and student learning. Aggregate Score Reports are available in ORS for ILEARN, I AM, and ISTEP+ Grade 10.

The Aggregate-Level Subject Detail Report, the Aggregate-Level Reporting Category Detail Report, and the Aggregate-Level Standard Detail Report present summary results for a selected aggregate unit (corporation, school, or classroom) and summary results for the next-higher aggregate unit. A corporation is the highest-level aggregate unit and includes multiple schools and classrooms; a school includes multiple classrooms; a classroom is the lowest-level aggregate unit and includes only itself. Selecting a different level of aggregate unit changes the results presented to the user.

Aggregate Score Reports include all students who meet attemptedness criteria and have completed enough of the assessment to receive a reliable score. Students with Invalidated or Undetermined scores are not included in the Aggregate Score Reports. Students taking I AM who receive a condition code of NMC are also not included. More information about the various included aggregates is provided below. Aggregate Score Reports are available to authorized Indiana educators in ORS and are not available to parents/guardians.

Interpretation of Aggregate-Level ORS Reports

Aggregate-Level Subject Detail Report The Aggregate-Level Subject Detail Report includes a summary of student performance within a grade/subject area for ILEARN, I AM, and ISTEP+ Grade 10 assessments.


State-level aggregate data for this report is available for the ILEARN and I AM assessments. The state-level data is suppressed until the assessment window closes and IDOE approves release of the data. State-level aggregate data is not available for the ISTEP+ Grade 10 Retests.

Figure 5 includes the various elements of the Aggregate-Level Subject Detail Report. This report can illustrate student performance for all students or for a specific demographic subgroup of students, such as gender. The breakdown by filter segments the score data by a specific demographic subgroup. When educators select a subgroup, the report expands to display the corresponding data for that subgroup.

Figure 5: Aggregate-Level Subject Detail Report

The following is a list of data that appears on the Aggregate-Level Subject Detail Report (a brief computational sketch of these summary statistics follows the list):

• Number of Students (Numeral 1 in Figure 5)—The number of students to date who have completed and submitted the assessment.


• Average Scale Score (Numeral 2 in Figure 5)—The average scale score for students who completed the assessment. Scale scores are standardized scores that are comparable across test forms and years.

• Percent Proficient/Percent Passed (Numerals 3 and 6 in Figure 5)—The percentage of students for each selected administration who demonstrated proficiency on the selected test.

o For ILEARN, Percent Proficient information shows the percentage of students to date who scored At Proficiency or Above Proficiency on the selected assessment.

o For I AM, Percent Proficient information shows the percentage of students to date who scored At Proficiency on the selected assessment.

o For ISTEP+ Grade 10, Percent Passed information shows the percentage of students to date who scored Pass or Pass+ on the selected assessment.

• Percent in Each Proficiency Level/Percent in Each Performance Level (Numerals 4 and 7 in Figure 5)—The distribution of students across each of the achievement levels.

• Number of Students in Each Proficiency Level/Number of Students in Each Performance Level (Numerals 5 and 8 in Figure 5)—The number of students across each of the achievement levels.
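
The sketch below shows, under simple assumptions about the input data, how the summary statistics named above could be derived from a roster of completed tests. The record layout, level names, and example scores are illustrative only and do not represent the actual ORS data model.

from collections import Counter
from statistics import mean

# Hypothetical roster: (scale_score, proficiency_level) for students who completed the test
roster = [
    (510, "Above Proficiency"),
    (478, "At Proficiency"),
    (452, "Approaching Proficiency"),
    (430, "Below Proficiency"),
]

PROFICIENT_LEVELS = {"At Proficiency", "Above Proficiency"}  # ILEARN "Percent Proficient" rule

number_of_students = len(roster)
average_scale_score = mean(score for score, _ in roster)
counts_by_level = Counter(level for _, level in roster)          # number in each level
percent_by_level = {level: 100 * n / number_of_students for level, n in counts_by_level.items()}
percent_proficient = 100 * sum(level in PROFICIENT_LEVELS for _, level in roster) / number_of_students

print(number_of_students, average_scale_score, percent_proficient)
# Expected output for this example roster: 4 467.5 50.0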

Aggregate-Level Reporting Category Detail Report The Aggregate-Level Reporting Category Detail Report is available for ILEARN 3–8, ILEARN Biology ECA, ISTEP+ Grade 10, and I AM assessments. It provides aggregate summaries of student performance in each reporting category for a grade and subject. Like the Aggregate-Level Subject Detail Report, this report presents the summary results for the selected aggregate unit (corporation, school, or classroom). Summary data also appears for the aggregate unit above the selected aggregate unit; for example, if “classroom” is selected as the aggregate unit, then the report will also contain summary data at the school level. Summaries can be presented for all students within a corporation, school, or classroom and for students within a defined demographic subgroup, such as gender. Figure 6 includes the various elements of the Aggregate-Level Reporting Category Detail Report.


Figure 6: Aggregate-Level Reporting Category Detail Report

The following is a list of data that appears on the Aggregate-Level Reporting Category Detail Report:

• Reporting Category/Strand (Numeral 1 in Figure 6)—The reporting categories, or strands, within the selected subject. The reporting categories, or strands, for each assessment can be found in each blueprint. Links to the blueprints are included in Table 1.

• Percent at Each Performance Category (Numeral 2 in Figure 6)—The percentage of students in each reporting category achievement category who took the selected ILEARN assessment. The detail page shows the distribution of students Below, At/Near, and Above standard for each reporting category.

• Average Percent Correct (Numeral 3 in Figure 6)—The average percentage of total points earned for each reporting category for I AM.

• Average Strand Score (Numeral 4 in Figure 6)—The average strand score for students who completed the ISTEP+ Grade 10 assessments.

• Total Percent Mastered (Numeral 5 in Figure 6)—The percentage of students who achieved mastery of each strand (i.e., achieved at or above the target score on each strand) for ISTEP+ Grade 10 assessments.

Aggregate-Level Standard Detail Report The Aggregate-Level Standard Detail Report is available only for ILEARN computer-adaptive assessments. Computer-adaptive assessments adjust the difficulty of the items presented based on the responses the student provides during the assessment. For Spring 2019, this report is available only for ILEARN ELA and Mathematics assessments. For School Year 2019–2020, this report is available for ILEARN Biology ECA assessments administered in December 2019 and February 2020. ILEARN was not administered in Spring 2020.

This report provides performance information about students’ strengths and weaknesses at the standard-level. The Aggregate-Level Standard Detail Report provides information on how a group of students in a school or corporation performed on a given standard compared to the proficiency cut score. This report is not available for individual classrooms. Figure 7 includes the various components of the Aggregate-Level Standard Detail Report.

The Areas Where Performance Indicates Proficiency section on the report includes a performance indicator. It includes four possible values: Above, Borderline, Below, and Insufficient Information. This performance indicator reflects how an aggregate unit performed on the standard compared to the proficiency cut scores.
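
The classification logic is described only qualitatively in this guide. The sketch below shows one plausible way such an indicator could be assigned from a group’s average performance relative to the proficiency cut score, with a minimum-evidence check that yields Insufficient Information. The band width, the minimum student count, and the function name are illustrative assumptions, not IDOE’s actual rule.

def standard_performance_indicator(student_estimates, cut_score,
                                   borderline_band=0.05, min_students=10):
    # Classify a group's performance on one standard relative to the proficiency cut.
    # student_estimates: per-student performance estimates on the same scale as cut_score.
    if len(student_estimates) < min_students:
        return "Insufficient Information"
    average = sum(student_estimates) / len(student_estimates)
    if average > cut_score * (1 + borderline_band):
        return "Above"
    if average < cut_score * (1 - borderline_band):
        return "Below"
    return "Borderline"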


Figure 7: Aggregate-Level Standard Detail Report

Please note that the legend in the standards-level report differs from the legends in other aggregate reports in ORS. The symbols used on the standard report represent the aggregate “average” performance for a group within a corporation or school, while the symbols used on other reports represent an individual student’s performance. Because the two legends represent different measures, the labels are intentionally different.

The following is a list of data that appears on the Aggregate-Level Standard Detail Report:

• Standards (Numeral 1 in Figure 7)—The content standards within the selected subject.

• Areas Where Performance Indicates Proficiency (Numeral 2 in Figure 7)—This measure provides information on how a group of students in a school or corporation performed on the standard compared to the cut score that demonstrates proficiency. It shows whether the performance for this group of students was Above, Borderline, or Below what is expected of students at the proficient level (the proficiency cut score). It also shows whether there is Insufficient Information to determine student performance on the standard.

o Above—The group of students performed Above the proficiency standard for this standard.

o Borderline—The group of students performed At or Near the proficiency standard for this standard.

o Below—The group of students performed Below the proficiency standard for this standard.

o Insufficient Information—Not enough information is available to determine whether performance on this standard is Above, At or Near, or Below the proficiency standard. An asterisk (indicating Insufficient Information) will appear more often for lower-priority standards (such as the Process Standards in Mathematics). An asterisk may also appear if the standard was not assessed at all within the aggregate group or if not enough students received and responded to items that measured the standard.

Student Roster Subject Report The Student Roster Subject Report is available for ILEARN, I AM, and ISTEP+ Grade 10 assessments. This report includes information about how a group of students individually performed on a selected test. It displays a row of data for each student in a school or corporation. Figure 8 includes the various components of the Student Roster Subject Report.


Figure 8: Student Roster Subject Report

The following is a list of data that appears on the Student Roster Subject Report:

• Name (Numeral 1 in Figure 8)—The name of the student.
• STN (Numeral 2 in Figure 8)—The student’s unique identifier.
• Scale Score (Numeral 3 in Figure 8)—The student’s scale score.
• Proficiency Level/Performance Level (Numerals 7 and 8 in Figure 8)—The proficiency classification associated with the student’s score for the test.
• Reported Lexile® Measure/Reported Quantile® Measure (Numerals 4 and 5 in Figure 8)—The reported Lexile® Measure shows a single score that reflects the student’s reading ability. The reported Quantile® Measure shows a single score that reflects the student’s mathematical achievement. This column is available only for ILEARN ELA and Mathematics assessments.

• College- and Career-Readiness Indicator (Numeral 6 in Figure 8)—This attribute indicates whether a student is on track to becoming college- and career-ready based on the student’s performance on the assessment. This column is available only for ILEARN assessments.

Student Roster Reporting Category/Strand Report The Student Roster Reporting Category/Strand Report is available for ILEARN 3–8, ILEARN Biology ECA, I AM, and ISTEP+ Grade 10 assessments. It displays a row of data for each student who belongs to a classroom, school, or corporation. Students taking the ILEARN 3–8, ILEARN Biology ECA, and I AM assessments receive reports related to student performance on reporting categories. Students taking ISTEP+ Grade 10 receive reports related to student performance on strands. Figure 9 includes the various components of the Student Roster Reporting Category/Strand Report.


Figure 9: Student Roster Reporting Category/Strand Report

The following is a list of data that appears on the Student Roster Reporting Category/Strand Report:

• Name (Numeral 1 in Figure 9)—The name of the student.
• STN (Numeral 2 in Figure 9)—The student’s unique identifier.
• Scale Score (Numeral 3 in Figure 9)—The student’s scale score for I AM and ILEARN.
• College- and Career-Readiness Indicator (Numeral 4 in Figure 9)—This attribute indicates whether a student participating in ILEARN assessments is on track to becoming college- and career-ready based on the student’s performance on the assessment.


• Reporting Category Achievement Category (Numeral 5 in Figure 9)—The student’s achievement category in each of the reporting categories for a specific ILEARN assessment. See Table 1 for links to the blueprints and reporting category information. Student performance on reporting categories is reported as one of three achievement categories:

o Students who score Below demonstrated performance in the reporting category that was clearly below proficient.

o Students who score At/Near demonstrated performance in the reporting category that was exactly at or immediately below/above proficient.

o Students who score Above demonstrated performance in the reporting category that was clearly at/above proficient.

• Reporting Category Proficiency Level (Numeral 6 in Figure 9)— The proficiency classification associated with the student’s score for the reporting category for ILEARN and I AM assessments.

• Reporting Category Percent Correct (Numeral 7 in Figure 9)—The percentage of items the student answered correctly for each reporting category for a specific I AM assessment. See Table 1 for links to the blueprints and reporting category information.

• Strand Score (Numeral 8 in Figure 9)—The student’s score for each strand for ISTEP+ Grade 10 assessments. See Table 1 for links to the blueprints and strand information.

• Strand Performance (Numeral 9 in Figure 9)—Indicates whether the student achieved mastery of each strand (i.e., achieved At or Above Target Score; Strand Mastered) for ISTEP+ Grade 10 assessments (a brief sketch of this determination follows this list).

o Students who achieved At or Above Target Score; Strand Mastered demonstrated mastery of the strand.

o Students who scored Below Target Score; Strand Not Mastered did not demonstrate mastery of the strand.

• Strand Target Score (Numeral 10 in Figure 9)—The score expected of students earning At or Above Target Score; Strand Mastered for each strand for ISTEP+ Grade 10 assessments.
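
As a concrete illustration of the rules listed above, the sketch below classifies a strand score against its target and a reporting category score against the labels used in this report. The strand rule follows the guide’s description directly; the band around the proficient cut used for At/Near is an assumption, since the underlying cut points are not published here, and all names and numbers are illustrative.

def strand_performance(strand_score, target_score):
    # ISTEP+ Grade 10 strand mastery rule as described in this guide.
    if strand_score >= target_score:
        return "At or Above Target Score; Strand Mastered"
    return "Below Target Score; Strand Not Mastered"

def reporting_category_achievement(category_score, proficient_cut, near_band=2.0):
    # Illustrative ILEARN reporting category label; near_band is an assumed width.
    if category_score > proficient_cut + near_band:
        return "Above"
    if category_score < proficient_cut - near_band:
        return "Below"
    return "At/Near"

print(strand_performance(47, 43))                 # At or Above Target Score; Strand Mastered
print(reporting_category_achievement(498, 500))   # At/Near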


Specific Information for Educators

Preliminary Results Preliminary results are initial scores in ORS that have not yet been finalized by the Indiana Department of Education (IDOE). The purpose of preliminary scores is to provide educators and parents/guardians with an early indicator of test results shortly after a student completes an assessment. I AM, IREAD-3, and ISTEP+ Grade 10 do not have preliminary score results.

For ILEARN assessments, preliminary scores will appear in ORS on a rolling basis during the test window after tests are completed and scored. Most ILEARN assessments include test items that require human hand-scoring prior to score reporting in ORS. When items are hand-scored, it can take up to 12 business days for scores to become available in ORS.

Preliminary results begin to populate in ORS after the first few weeks of testing. Students who take the assessment earlier in the test window will see their preliminary score results earlier than students who test later. It is important to note that early preliminary results reflect only the students who have tested so far and may not represent the full student population. Educators should proceed with caution when reviewing student, school, and corporation aggregate results while scores are still preliminary and not yet final. Results for ILEARN assessments are considered preliminary until a date established by IDOE on an annual basis.

Figure 10: Preliminary Results Timeline


Additional Resources The additional educator resources listed below provide more information on ORS and the process for requesting a rescore on the ILEARN and ISTEP+ Grade 10 assessments. The resources are available on the Indiana Assessment Portal.

ORS User Guide Educators can use the ORS User Guide for all Indiana programs. It is located in the CAI Systems User Guides folder under Resources on the Indiana Assessment Portal. This guide provides authorized users with information on how to access student score reports, such as ISRs or corporation- and school-level reports.

ORS Training Module Educators can use the ORS Training Module for all Indiana programs. It is located in the Training Resources folder under Resources on the Indiana Assessment Portal. This module provides a detailed overview of ORS and explains how to access detailed score reports and online corporation and school data files for each assessment.

Request a Rescore Training Module Principal designees at schools should review this module before the first rescore window for ILEARN and ISTEP+ Grade 10 each school year. It is located in the Training Resources folder under Resources on the Indiana Assessment Portal. This module provides an overview of how to define the principal role in TIDE, maintain the security of test materials, and create and view rescore requests in TIDE.


Glossary Anchor sets. Examples of previously scored student responses with notes from hand-scoring leaders that further explain the reasoning applied to student responses.

Content claims. Statements that can be made about student learning based on student performance on an assessment. Assessments are designed using evidence-centered design to gather evidence that supports the content claims.

Criterion-referenced assessment. An assessment that evaluates and reports a student’s mastery based on a set of specific standards, such as the Indiana Academic Standards.

Lexile® measure. A single score followed by the letter L. A Lexile measure reflects a student’s reading ability and a text’s complexity.

Performance-Level Descriptor. A statement that outlines the knowledge and skills students can demonstrate at a given level within each content area and grade level.

Proficiency Cut Score. The score that differentiates between proficiency or performance levels on an assessment. For tests with four proficiency levels there would be three proficiency cut scores.

Proficiency levels. Areas on a student achievement scale that classify students by how much of the content standards they know.

Quantile® measure. A single score indicated by a number followed by the letter Q. A Quantile® measure reflects a student’s mathematical achievement and can help identify areas of strength and weakness. A Quantile® measure also determines which skills and concepts a student is ready to learn and the level of success the student is expected to have with an upcoming mathematical skill.

Reliability. The consistency of a measurement. If a student takes the assessment several times, the score should remain relatively consistent over time.

Reporting categories. Groups of similar standards that are assessed within each grade and subject. Reporting categories separate content into smaller sets of related standards for reporting and other purposes. They can be used to identify each student’s relative strengths or weaknesses for different subdomains of a content area.

Scale scores. Standardized scores that are comparable across years and test forms.

Standard error of measurement. A measure of how much a given test score is spread around a student’s “true score,” or what he or she knows and can do.
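
For readers who want the conventional formula behind this definition, classical test theory expresses the standard error of measurement in terms of the score standard deviation and the test’s reliability, as sketched below. This is the standard textbook relationship, not necessarily the exact procedure used to compute the error bands reported for Indiana’s scale scores.

\mathrm{SEM} = \sigma_X \sqrt{1 - \rho_{XX'}}, \qquad \text{reported range} \approx \text{scale score} \pm \mathrm{SEM}

Here \sigma_X is the standard deviation of observed scores and \rho_{XX'} is the reliability coefficient.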

Summative assessments. An assessment that provides useful data for measuring growth, proficiency, and/or learning gaps between different groups of students. This type of assessment is frequently considered a high-stakes evaluation (e.g., accountability measures, graduation requirements).
