Identifying Top‐Performing High Schools for the “Best High Schools” Rankings
Analytical Methodology and Technical Appendices
Prepared for U.S. News & World Report by RTI International
May 2015
Acknowledgments

This report was produced for U.S. News & World Report (U.S. News) by RTI International. The authors—Ben Dalton, Elisabeth Hensley, Erich Lauff, and Colleen Spagnardi—would like to acknowledge the many people and organizations that made these rankings possible.
First, we would like to express gratitude to officials and staff at state education agencies who have helped provide the data necessary for the analysis and answered questions about the data. In particular, we wish to thank staff in Arizona and North Carolina for their rapid turnaround of our data requests. Without states’ express help and the considerable work that goes into making such data available generally, these rankings would not be possible.
Second, we would like to express our appreciation to the U.S. News staff for their patience and assistance throughout the project. This was the first year RTI conducted the rankings analysis for U.S. News, and their flexibility and encouragement helped ensure that the rankings were produced in a timely and effective manner.
Third, we would like to thank the researchers at the American Institutes for Research (AIR) and their predecessors for developing the methodology employed in this year’s rankings. Where possible, RTI followed the methodology outlined in AIR’s documentation to the 2014 “Best High Schools” rankings. The current document is also based on AIR’s 2014 analytical methodology and technical appendices.
For questions about the 2015 “Best High Schools” rankings, please contact
Robert J. Morse
Chief Data Strategist
U.S. News & World Report
1050 Thomas Jefferson St. NW
Washington, DC 20007
[email protected]
Contents
Acknowledgments ...................................................................................................................................... i
Analytical Methodology ............................................................................................................................ 1
The “Best High Schools” Method .............................................................................................................. 2
Method Overview ..................................................................................................................................... 2
Data Sources ............................................................................................................................................. 4
Sample Sizes in Different Steps of the Analysis........................................................................................ 4
Step‐by‐Step Process Details: Indicators and Criteria .............................................................................. 6
Step 1: Identify High Schools That Performed Better Than Expected on State Accountability Assessments ......................................................................................................................................... 7
Step 2: Identify High Schools That Performed Better Than the State Average for Their Least Advantaged Students .......................................................................................................................... 10
Step 3: Identify High Schools That Performed Best in Providing Students With Access to Challenging College‐Level Coursework ............................................................................................... 13
Data Notes .............................................................................................................................................. 16
References.............................................................................................................................................. 19
Technical Appendices .............................................................................................................................. 20
Appendix A. ............................................................................................................................................. 21
Appendix B. ............................................................................................................................................. 24
Alabama .............................................................................................................................................. 25
Alaska .................................................................................................................................................. 26
Arizona ................................................................................................................................................ 27
Arkansas .............................................................................................................................................. 28
California ............................................................................................................................................. 29
Colorado ............................................................................................................................................. 30
Connecticut ......................................................................................................................................... 31
Delaware ............................................................................................................................................. 32
District of Columbia ............................................................................................................................ 33
Florida ................................................................................................................................................. 34
Georgia ............................................................................................................................................... 35
Hawaii ................................................................................................................................................. 36
Idaho ................................................................................................................................................... 37
Illinois .................................................................................................................................................. 38
Indiana ................................................................................................................................................ 39
Iowa .................................................................................................................................................... 40
Kansas ................................................................................................................................................. 41
Kentucky ............................................................................................................................................. 42
Louisiana ............................................................................................................................................. 43
Maine .................................................................................................................................................. 44
Maryland ............................................................................................................................................. 45
Massachusetts .................................................................................................................................... 46
Michigan ............................................................................................................................................. 47
Minnesota ........................................................................................................................................... 48
Mississippi ........................................................................................................................................... 49
Missouri .............................................................................................................................................. 50
Montana ............................................................................................................................................. 51
Nebraska ............................................................................................................................................. 52
Nevada ................................................................................................................................................ 53
New Hampshire .................................................................................................................................. 54
New Jersey .......................................................................................................................................... 55
New Mexico ........................................................................................................................................ 56
New York ............................................................................................................................................. 57
North Carolina .................................................................................................................................... 58
North Dakota ...................................................................................................................................... 59
Ohio .................................................................................................................................................... 60
Oklahoma............................................................................................................................................ 61
Oregon ................................................................................................................................................ 62
Pennsylvania ....................................................................................................................................... 63
Rhode Island ....................................................................................................................................... 64
South Carolina .................................................................................................................................... 65
South Dakota ...................................................................................................................................... 66
Tennessee ........................................................................................................................................... 67
Texas ................................................................................................................................................... 68
Utah .................................................................................................................................................... 69
Vermont .............................................................................................................................................. 70
Virginia ................................................................................................................................................ 71
Washington ......................................................................................................................................... 72
West Virginia....................................................................................................................................... 73
Wisconsin ............................................................................................................................................ 74
Wyoming ............................................................................................................................................. 75
PAGE 1 | Identifying Top‐Performing High Schools Analytical Methodology
Analytical Methodology
The “Best High Schools” Method

U.S. News & World Report (U.S. News) publishes the “Best High Schools” rankings to identify the top-performing high schools in the United States. These rankings are based on three aspects of school performance: (1) the performance of all students on state assessments in reading and mathematics; (2) the performance of disadvantaged student subgroups—defined as Black/African-American students, Hispanic/Latino students, and students who are eligible for free or reduced-price lunch or who are economically disadvantaged as determined by the state—on these assessments; and (3) the degree to which high schools prepare students for college by offering a college-level curriculum.
This 2015 version of the rankings, which uses data from the 2012–13 school year, is based on U.S. News’s documentation of the 2014 methodological approach, with adjustments made at U.S. News’s request.
More information and a list of the top‐performing high schools are available on the “Best High Schools” website (www.usnews.com/education/best‐high‐schools).
Method Overview

The technical methods used to create the rankings were designed to:
• Identify high schools that have succeeded in serving their students—including those from disadvantaged student subgroups—as measured by academic performance on state assessments in reading and mathematics.
• Evaluate how well high schools have prepared their students for college, as measured by participation in and performance on Advanced Placement (AP) or International Baccalaureate (IB) examinations.
A three‐step process was used to generate the “Best High Schools” rankings:
• Step 1: Identify high schools that performed better than expected on state accountability assessments, given their population of economically disadvantaged students.
• Step 2: Identify high schools whose disadvantaged students performed better than the state average for disadvantaged students.
• Step 3: Identify high schools that performed best in providing students with access to challenging college-level coursework.
Step 1 and Step 2 of the method were based on state-by-state analyses designed to evaluate high schools on the performance of their students on state assessments. Step 1 identified high schools within each state that performed better on state reading and mathematics assessments than their poverty level would lead one to expect. Step 2 identified high schools with disadvantaged student subgroups that performed better than the state average for these subgroups. (Disadvantaged student subgroups were defined as Black/African-American, Hispanic/Latino, and economically disadvantaged students.) High schools that passed these initial steps were considered at least bronze-medal high schools and were analyzed further. High schools that did not pass Step 1 or Step 2 were not eligible for a medal.
High schools that met the criteria for Step 1 and Step 2 then proceeded to Step 3, which examined the extent to which these high schools prepared their students for college, as determined by participation in and performance on AP or IB examinations (computed as the college readiness index; see page 14). High schools with a college readiness index (CRI) at or above the median CRI for all high schools in the country were eligible for silver or gold medals. The high schools with the top 500 CRI scores received a gold medal, while all other high schools at or above the national median CRI received a silver medal. In cases where schools tied on CRI scores, a set of tiebreakers based on AP or IB examination results was used to determine ranking. (For more information on tiebreakers, see Substep 3.5 of this report.) To summarize:
• Bronze-Medal High Schools: Passed Step 1 and Step 2 and had a CRI below the median or did not have a CRI value.
• Silver-Medal High Schools: Passed Step 1 and Step 2 and had a CRI at or above the median but did not rank in the top 500 for CRI among high schools across all states that passed Step 1 and Step 2.
• Gold-Medal High Schools: Passed Step 1 and Step 2, had a CRI at or above the median, and ranked in the top 500 for CRI among high schools across all states that passed Step 1 and Step 2.
All other schools were categorized as “not ranked.” Exhibit 1 illustrates the three‐step process for ranking the high schools.
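The medal-assignment logic described above can be expressed as a short sketch. This is an illustration only; the function and parameter names are assumptions, not code from the rankings.

```python
def assign_medal(passed_steps_1_and_2, cri, median_cri, cri_rank):
    """Categorize a school under the three-step rules summarized above.

    cri may be None when a school has no college readiness index value;
    cri_rank is the school's national CRI rank (1 = highest) among
    schools that passed Step 1 and Step 2.
    """
    if not passed_steps_1_and_2:
        return "not ranked"
    if cri is None or cri < median_cri:
        return "bronze"   # CRI below the national median, or no CRI value
    if cri_rank <= 500:
        return "gold"     # top 500 CRI among Step 1/2 passers
    return "silver"       # CRI at or above the median, outside the top 500
```

For example, a school that passed both steps with a CRI above the median but a national CRI rank of 900 would receive a silver medal.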
Exhibit 1. High School Performance Ranking System for Step 1, Step 2, and Step 3
Data Sources

The data from the 2012–13 school year that were used to produce these rankings came from the following sources:
• School-level state assessment results were retrieved from state education agency websites or directly from state education agencies.
• The universe of high schools and associated demographic data were retrieved from the Common Core of Data (CCD) (http://nces.ed.gov/ccd) at the U.S. Department of Education’s National Center for Education Statistics (NCES). Only public high schools (including charter high schools) were included in the analysis.
• AP examination results for the 2013 cohort were provided by the College Board (https://www.collegeboard.org). IB examination results for the 2013 cohort were provided by the International Baccalaureate Organization (http://www.ibo.org).
Sample Sizes in Different Steps of the Analysis

Although the data requested from states for the purpose of ranking high schools did not include individual student-level achievement data, many states had data-suppression rules based on the Family Educational Rights and Privacy Act
(http://www.ed.gov/policy/gen/guid/fpco/ferpa/index.html) or their own restrictions that limited data availability for some schools. In the 2012–13 school year, according to the CCD, there were 29,070 public schools serving at least one grade in grades 9–12 in the United States. To be eligible for the 2015 “Best High Schools” rankings, high schools were required to meet both of the following criteria:
• Their lowest grade is grade 9 or their highest grade is grade 12. This excludes elementary, middle, and junior high schools but includes schools that serve grade 12 along with one or more grades below high school (e.g., grades 7–12); and
• They have at least 15 students in one of grades 9–12; or, if grade-level enrollment counts are missing, at least 15 students tested in mathematics or reading; or, if both grade-level enrollment and numbers tested are missing, a total enrollment of at least 15. These rules ensure that schools meet minimum size requirements while maximizing data availability.
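The two eligibility rules can be sketched as follows. This is a hypothetical helper; the parameter names and missing-data conventions are assumptions, not details from the report.

```python
def is_eligible(lowest_grade, highest_grade, grade_enrollments,
                n_tested_math, n_tested_reading, total_enrollment):
    """Apply the grade-span and minimum-size rules described above.

    grade_enrollments maps grade (9-12) to an enrollment count, or is
    None when grade-level counts are missing. The tested counts and
    total enrollment may likewise be None when unreported.
    """
    # Grade-span rule: lowest grade is 9 or highest grade is 12.
    if not (lowest_grade == 9 or highest_grade == 12):
        return False
    # Size rule, with fallbacks used only when counts are missing.
    if grade_enrollments is not None:
        return any(n >= 15 for n in grade_enrollments.values())
    if n_tested_math is not None or n_tested_reading is not None:
        return max(n_tested_math or 0, n_tested_reading or 0) >= 15
    return (total_enrollment or 0) >= 15
```

A grade 7–12 school with no grade-level counts but 20 students tested in mathematics would qualify; a middle school would not.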
There were 21,179 schools that met the grade-level criterion and 19,753 schools that met both the grade-level and enrollment criteria. These schools were eligible for bronze medals. Additional criteria applied in Step 3: schools were eligible for a silver or gold medal only if they had at least 15 students in grade 12 and at least 10 students taking one or more AP or IB exams. In addition, many states had state-specific suppression rules (e.g., rules governing the minimum number of reportable students in a subgroup) to protect the identities of their students and thus did not report complete data. As a result of suppression or missing data, 19,278 high schools were eligible to be included in Step 1. During the analysis, 7,280 high schools passed Step 1. Of these, 763 high schools did not pass Step 2 (either because of insufficient data or because they did not meet the Step 2 analysis criterion), leaving 6,517 to be considered for Step 3. A total of 2,527 high schools that passed Step 1 and Step 2 also had a qualifying CRI. Exhibit 2 presents the number of high schools at each step of the analysis.
Exhibit 2. Number of Public High Schools in Analysis Sample

Analysis Sample                                        Number of High Schools   Reasons for Decrease in the Number of High Schools
Total public schools serving one or more grades 9–12   21,179                   —
Schools considered for analysis                        19,753                   1,426 schools did not meet grade-level or size requirements.
Schools considered during Step 1                       19,278                   475 high schools did not have enough data to calculate a performance index.
Schools considered during Step 2                        7,280                   11,998 high schools did not pass Step 1.
Schools considered during Step 3                        6,517                   763 high schools did not pass Step 2.
Schools considered for gold and silver medals           2,527                   3,990 high schools that passed Step 1 and Step 2 did not administer AP or IB examinations, had fewer than 15 grade 12 students, or had fewer than 10 students who took AP or IB examinations.

— Not applicable.
Step-by-Step Process Details: Indicators and Criteria

Exhibit 3 provides an overview of each step of the process. Following are more detailed explanations, along with descriptions of the different metrics used to calculate the rankings.

Exhibit 3. Detailed Breakdown of the Technical Approach for Step 1, Step 2, and Step 3
Step 1: Identify High Schools That Performed Better Than Expected on State Accountability Assessments

Step 1 of the “Best High Schools” method identified high schools that performed better on state reading and mathematics assessments than would be expected given the proportion of students identified as economically disadvantaged. To pass Step 1, high schools needed to have higher achievement than high schools with similar proportions of economically disadvantaged students.
The relationship between academic achievement and socioeconomic status has been studied extensively, and the literature indicates a reasonably consistent moderate‐to‐large relationship between the two (e.g., Caldas & Bankston, 1997; Crosnoe, 2009; Crosnoe & Schneider, 2010; Rumberger & Palardy, 2005; Sirin, 2005; White, 1982). For this reason, Step 1 of the rankings aimed to identify high schools that performed above expectations, controlling for the proportion of economically disadvantaged students. Correlation does not establish causality, and therefore it cannot be stated that economically disadvantaged students should have lower expectations placed on them. Rather, this relationship simply indicates that for most (but not all) high schools, the challenge of educating disadvantaged students has not yet been overcome. (In the analysis, the relationship between school poverty and school average achievement was negative in all states, though the strength of the relationship varied from state to state).
Substep 1.1: Calculate the Performance Index for Each High School
A performance index was computed for each high school that was based on student performance on 2012–13 state reading and mathematics assessments.1 The performance index is designed not only to reward high schools for the number of students at the proficient level but also to assign more weight for students who are performing at levels above the proficient benchmark (as determined by the state). The index valued proficient as 1.0 point, with one level above proficient assigned 1.5 points and two levels above proficient assigned 2.0 points. One level below proficient—considered approaching proficient in this method—was assigned a value of 0.5 points.2 No points were awarded for performance at two or three levels below proficient.
The high school’s performance index was calculated by multiplying the percentage of students scoring at each performance level (e.g., proficient, above proficient) by the index value for that level (e.g., 1.0, 1.5). For example, if a high school participated in an examination with four performance categories—below proficient, approaching proficient, proficient, and above proficient—and all students scored above proficient, the high school would receive a performance index of 150 because 100 percent of students fell in the above proficient category, which is given a weight of 1.5. Exhibit 4 presents information for calculating the performance index for a sample high school.
1 In cases where states assessed students on reading as well as English/language arts, the reading assessment was used. If no reading assessment was reported, English/language arts results were analyzed.
2 When only one level was reported below proficient, that level received a value of 0.
Step 1 was conducted within each state; high schools were not compared with each other across states.
Exhibit 4. Example of Calculating the Performance Index

Subject Area    Below Proficient   Approaching Proficient   Proficient     Above Proficient   Total Test Takers
                (Weight = 0)       (Weight = 0.5)           (Weight = 1)   (Weight = 1.5)
Reading         5%                 22%                      58%            15%                120
Mathematics     7%                 15%                      60%            18%                145
The performance index for this high school would be computed as follows:

Reading index = (0 × 5) + (0.5 × 22) + (1.0 × 58) + (1.5 × 15) = 91.5
Mathematics index = (0 × 7) + (0.5 × 15) + (1.0 × 60) + (1.5 × 18) = 94.5
Performance index = [(91.5 × 120) + (94.5 × 145)] / (120 + 145) ≈ 93.1
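The same computation can be written as a brief code sketch, using the Exhibit 4 figures. The function names are illustrative; the weights and test-taker pooling follow the description in Substep 1.1.

```python
# Weights for each performance level, as defined in Substep 1.1.
WEIGHTS = {"below": 0.0, "approaching": 0.5, "proficient": 1.0, "above": 1.5}

def subject_index(pct_by_level):
    """Index for one subject: sum of (percentage at level x level weight)."""
    return sum(WEIGHTS[level] * pct for level, pct in pct_by_level.items())

def performance_index(subjects):
    """Pool subject indices, weighting by the number of test takers."""
    total_tested = sum(n for _, n in subjects)
    return sum(subject_index(pcts) * n for pcts, n in subjects) / total_tested

# Exhibit 4 figures: (percent of students at each level, number tested).
reading = ({"below": 5, "approaching": 22, "proficient": 58, "above": 15}, 120)
math = ({"below": 7, "approaching": 15, "proficient": 60, "above": 18}, 145)
print(round(performance_index([reading, math]), 1))  # 93.1
```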
The state assessment data used in these analyses were available in a variety of formats and disaggregations, depending on the state. In some states, overall numbers tested and the percentage reaching each proficiency level were provided directly, and the performance index could be calculated immediately. In other states, results were provided only disaggregated by grade level, subject area, and/or disadvantaged student subgroup (e.g., subgroups defined by race/ethnicity and/or poverty status). In those cases, weighted means were used to combine data from the various subgroups. For example, calculating the performance index in Step 1 would first require combining disaggregated reading and mathematics proficiency data by grade level and disadvantaged student subgroup. To create the performance index, grade levels would then be pooled using a weighted average based on the number of tested students.
In addition, some states heavily suppressed values (or reported no values) for the numbers tested in reading and mathematics. For example, Alabama and Mississippi provided no data on numbers tested, while values in Virginia were heavily suppressed. In these cases, the overall number tested in reading and mathematics was pulled from grade-level-appropriate counts from the CCD (for states where assessments were tied to a single grade) or was weighted by the proportion of numbers tested for the entire state (for states in which assessments could be taken by students in multiple grades). For example, missing values for numbers tested in Virginia were assigned a weight of 25 for reading and 75 for mathematics, emulating the proportion of all test-takers in the state.
See Appendix A for more detailed information on the assessments used in this analysis, the ranges of potential performance index values, and the various proficiency levels by state.
Substep 1.2: Calculate the Percentage of Economically Disadvantaged Students
The percentage of students in poverty was calculated with enrollment values retrieved from the CCD’s eligibility counts for free or reduced‐price lunch, relative to the total number of students at a school. The weighted mean value of the state was used when poverty values were missing for a school.
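As a minimal sketch of this substep (the function name and missing-data convention are illustrative assumptions):

```python
def poverty_pct(frl_eligible, total_enrollment, state_weighted_mean):
    """Percentage of students eligible for free or reduced-price lunch.

    Falls back to the state's weighted mean when a school's counts are
    missing, as described above.
    """
    if frl_eligible is None or not total_enrollment:
        return state_weighted_mean
    return 100.0 * frl_eligible / total_enrollment
```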
Substep 1.3: Regress the Performance Index on the Percentage of Economically Disadvantaged Students
Linear regression analysis was used to determine the state‐specific relationship between the school‐level performance index and school‐level poverty.
Substep 1.4: Use Residuals to Establish the Performance Zone
Using the linear regression, residuals (the difference between a high school’s observed and expected performance index values) were used to establish a performance zone around the regression line. The upper and lower boundaries of the performance zone were set to +/–0.33 standard deviation of the residual values, a change from the 2014 rankings, which used +/–0.50 standard deviation. With this smaller multiple of the residual standard deviation, more schools could pass Step 1 and be eligible for further analysis. See Appendix B for state-by-state scatterplot graphs showing this relationship and the distribution of high schools.
Substep 1.5: Create the Risk‐Adjusted Performance Index
Each high school’s residual measured the degree to which the school differed from its statistically expected performance on reading and mathematics assessments, controlling for the proportion of economically disadvantaged students. A risk-adjusted performance index was defined as the ratio of each high school’s residual to one-third of a standard deviation of the residuals. Index values of one or greater indicated that the high school performed better than would be statistically expected.
Substep 1.6: Select High Schools That Surpass the Performance Threshold, and Proceed to Step 2
High schools with risk‐adjusted performance index values at or above 1 (with the value of 1 corresponding to the upper threshold of the performance zone of one‐third of a standard deviation) were considered performing beyond expectations, according to U.S. News, and advanced to Step 2.
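Substeps 1.3 through 1.6 can be sketched together with a small NumPy example. The data are synthetic and the function is an illustration of the approach, not the production analysis.

```python
import numpy as np

def risk_adjusted_index(performance_index, pct_disadvantaged):
    """Substeps 1.3-1.6 in miniature.

    Regresses the school-level performance index on the percentage of
    economically disadvantaged students, then scales each school's
    residual by one-third of the residual standard deviation.
    Values of 1 or greater pass Step 1.
    """
    x = np.asarray(pct_disadvantaged, dtype=float)
    y = np.asarray(performance_index, dtype=float)
    slope, intercept = np.polyfit(x, y, 1)       # state-specific linear fit
    residuals = y - (intercept + slope * x)      # observed minus expected
    return residuals / (residuals.std() / 3.0)   # risk-adjusted index

# Five hypothetical schools in one state: achievement falls as poverty rises.
idx = risk_adjusted_index([120, 95, 80, 110, 60], [10, 40, 70, 30, 90])
passes_step_1 = idx >= 1                         # schools that advance to Step 2
```

Because the index divides each residual by the same constant, a school passes exactly when its residual exceeds one-third of the residual standard deviation, the upper bound of the performance zone.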
For example, as shown in Exhibit 5, 261 high schools in an example state (Florida) performed at or above the upper threshold of the performance zone, after controlling for the proportion of economically disadvantaged students, and would have progressed to Step 2. These schools had a value of 1 or higher on the risk‐adjusted performance index. The performance index value needed by a high school to pass Step 1 is higher for high schools with a lower proportion of economically disadvantaged students than for high schools with a higher proportion of economically disadvantaged students.
Exhibit 5. Example of Step 1 Performance Index Analysis (State of Florida)
Step 2: Identify High Schools That Performed Better Than the State Average for Their Least Advantaged Students

Step 2 identified high schools in which disadvantaged students—defined as Black/African American, Hispanic/Latino, or economically disadvantaged as determined by state criteria (often defined as students eligible for free or reduced-price lunch through the National School Lunch Program)—had combined reading and mathematics proficiency levels that were at least equal to the state’s average reading and mathematics proficiency levels for all disadvantaged students. The purpose of Step 2 was to make sure that high schools progressing to Step 3 successfully educated all students, regardless of their socioeconomic or racial/ethnic backgrounds.
Substep 2.1: Calculate the Combined Reading and Mathematics Proficiency (RaMP) Rate for Disadvantaged Student Subgroups for Each High School
The first task in this process was identifying disadvantaged student subgroups in each of the high schools that passed Step 1. These student subgroups included Black/African American students, Hispanic/Latino students, and economically disadvantaged students. After the subgroups were identified, the aggregate school-wide reading and mathematics proficiency (RaMP) rate was calculated for the disadvantaged student subgroups, which weighted each of the three subgroups by their relative size to create a single weighted proficiency rate. In other words, the RaMP rate is a weighted average of the percentage of students in each group at or above the proficient level. (As in Step 1, Step 2 was conducted within each state; high schools were not compared with each other across states.)
The example in Exhibit 6 illustrates how a RaMP rate is calculated. In this example, each of the subgroups completed state tests in reading and mathematics. A weighted average percentage of students scoring at or above proficient has been computed. The exact formula for computing the RaMP index for this sample school is provided below Exhibit 6.
Exhibit 6. Example of Calculating the Reading and Mathematics Proficiency Rate for One School
Group | Subject | Proportion Tested or In School¹ | % Below Proficient | % Approaching Proficient | % Proficient | % Above Proficient
Black/African American | Reading | 0.13 | 5 | 10 | 55 | 30
Black/African American | Mathematics | 0.15 | 7 | 8 | 65 | 20
Hispanic/Latino | Reading | 0.14 | 4 | 11 | 65 | 20
Hispanic/Latino | Mathematics | 0.14 | 5 | 10 | 55 | 30
Economically Disadvantaged | Reading | 0.22 | 9 | 16 | 60 | 15
Economically Disadvantaged | Mathematics | 0.20 | 9 | 6 | 65 | 20

¹ Proportion calculated from either numbers tested that were provided from assessment data or from the CCD. See “Accounting for Missing Subgroup Numbers Tested” on page 17.
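The weighted calculation in Exhibit 6 can be sketched in a few lines. This is an illustrative sketch, assuming “at or above proficient” is the sum of the % Proficient and % Above Proficient columns and that each subgroup‐by‐subject cell is weighted by its tested proportion:

```python
def ramp_rate(cells):
    """Weighted average of the percentage of students at or above proficient,
    weighting each subgroup-by-subject cell by its share of tested students."""
    numerator = sum(weight * pct for weight, pct in cells)
    denominator = sum(weight for weight, _ in cells)
    return numerator / denominator

# Cells from Exhibit 6: (proportion tested, % proficient + % above proficient)
cells = [
    (0.13, 55 + 30),  # Black/African American, reading
    (0.15, 65 + 20),  # Black/African American, mathematics
    (0.14, 65 + 20),  # Hispanic/Latino, reading
    (0.14, 55 + 30),  # Hispanic/Latino, mathematics
    (0.22, 60 + 15),  # Economically disadvantaged, reading
    (0.20, 65 + 20),  # Economically disadvantaged, mathematics
]
print(round(ramp_rate(cells), 2))  # 82.76
```

Under these assumptions, the sample school’s RaMP rate is roughly 82.8 percent of disadvantaged test-takers at or above proficient.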
Substep 2.2: Calculate the State Average RaMP Rate for Disadvantaged Student Subgroups
A weighted state average for the disadvantaged student subgroups was calculated using student subgroup performance across all high schools in the state. To create the state average RaMP rate, all high school RaMP values were averaged, weighting the three subgroups by their relative size (e.g., the total number of tested disadvantaged students in a high school) to create a single proficiency rate.
Substep 2.3: Calculate the Proficiency Gap Differential for Disadvantaged Student Subgroups
To calculate the disadvantaged student proficiency gap differential, the high school‐specific RaMP rate for the disadvantaged student subgroups present in the school was compared with the state average for disadvantaged student subgroups. Values greater than or equal to zero indicated that a high school’s disadvantaged student subgroups outperformed the state average or equaled it. Values lower than zero meant that a high school’s disadvantaged student subgroups performed worse than the state average.
Substep 2.4: Select High Schools That Do as Well as or Better Than the State Average, and Proceed to Step 3
High schools with disadvantaged student subgroups that performed as well as or better than the state average advanced to Step 3. That is, all high schools that had a value of 0 or higher for the disadvantaged student proficiency gap differential passed Step 2. As with earlier versions of the “Best High Schools” rankings, high schools that passed Step 1 and did not have disadvantaged student subgroups automatically moved to Step 3. See Exhibit 7 for an illustrative example.
Exhibit 7. Example of School Performance of Disadvantaged Student Subgroups on State Assessments for Three Schools

School | High School’s State Test Proficiency Rate for Disadvantaged Student Subgroups | State Average of State Test Proficiency Rate for Disadvantaged Student Subgroups | Proficiency Gap Differential for Disadvantaged Student Subgroups | Continue to Step 3?
School A | 90.1 | 66.8 | 23.3 | Yes. Disadvantaged student subgroups in School A performed better than the state average for disadvantaged student subgroups.
School B | — | 66.8 | — | Yes. School B passed Step 1, and there were no data to disqualify it in Step 2. School B had no student subgroups meeting the required minimum size.
School C | 65.9 | 66.8 | –0.9 | No. Disadvantaged student subgroups in School C performed worse than the state average for disadvantaged student subgroups.

— Not applicable.

High schools that passed Step 1 and Step 2 were automatically considered bronze‐medal high schools and were further analyzed to determine whether they qualified for a silver or a gold medal.
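The Step 2 decision rule illustrated in Exhibit 7 amounts to a simple comparison; a minimal sketch (the function name is hypothetical):

```python
def passes_step2(school_ramp, state_avg_ramp):
    """Step 2 screen: a school advances when its disadvantaged-subgroup RaMP
    rate meets or exceeds the state average (gap differential >= 0). Schools
    with no qualifying subgroups (school_ramp is None) advance automatically."""
    if school_ramp is None:
        return True
    return school_ramp - state_avg_ramp >= 0

state_avg = 66.8
print(passes_step2(90.1, state_avg))   # School A: gap +23.3 -> True
print(passes_step2(None, state_avg))   # School B: no subgroups -> True
print(passes_step2(65.9, state_avg))   # School C: gap -0.9 -> False
```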
Step 3: Identify High Schools That Performed Best in Providing Students With Access to Challenging College‐Level Coursework

Step 3 of the analysis measured the extent to which students were prepared for college‐level work. The college readiness index (CRI)—created for the “Best High Schools” rankings—accounted for 12th‐grade student participation in and performance on AP or IB examinations. The CRI was used to determine which high schools passed Step 3 to become silver‐medal high schools and also was used to rank high schools across states to distinguish the gold‐medal high schools.
Participation in Step 3 required that at least 10 students were administered at least one AP or IB examination and that a high school have at least 15 students in grade 12. If a high school did not meet these criteria, the high school did not participate in Step 3 even if it had passed Steps 1 and 2. In high schools that offered both AP and IB examinations, the CRI was calculated for the examination with more test‐takers.3
Substep 3.1: Calculate Student Participation in AP and/or IB Examinations for Each High School
An AP/IB participation rate was created for each high school by calculating the percentage of 12th graders who took at least one AP or IB examination at some point during high school.
Substep 3.2: Calculate Student Performance on AP and/or IB Examinations for Each High School
A quality‐adjusted AP/IB participation rate was created for each high school by calculating the percentage of 12th graders who passed at least one AP or IB examination at some point during high school. Passing rates for this analysis were based on students achieving a score of 3 or higher on AP examinations and 4 or higher on IB examinations.
Substep 3.3: Calculate the CRI for Each High School
As indicated in Exhibit 8, the CRI was calculated by combining the AP/IB participation rate (weighted 25 percent) and the quality‐adjusted AP/IB participation rate (weighted 75 percent). The CRI is designed to measure both access to college‐level material (participation) and the ability to master this material (performance).
3 For high schools with both AP and IB programs, choosing one program over another was an attempt to assign more weight to the larger program within the high school. It is recognized, however, that this approach may understate the level of college readiness at the high school.
Note: Step 3 was the only step conducted across states because the college readiness index (CRI) is a common metric—i.e., it was computed in the same way across states using the same variables.
Exhibit 8. Calculation of the College Readiness Index (CRI)
The CRI measures both the breadth and depth of the college‐level curriculum in high schools. The purpose of the CRI is to avoid creating an incentive for high schools to improve their ranking by offering more AP or IB courses and examinations, regardless of whether their students are prepared to succeed in them.
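The 25/75 weighting described above can be written directly; a minimal sketch with hypothetical participation and passing rates:

```python
def college_readiness_index(participation_rate, quality_adjusted_rate):
    """CRI = 25% weight on the AP/IB participation rate plus 75% weight on the
    quality-adjusted participation rate (the share of 12th graders who passed
    at least one exam), both expressed as percentages of 12th graders."""
    return 0.25 * participation_rate + 0.75 * quality_adjusted_rate

# Hypothetical school: 60% of seniors took an AP/IB exam, 40% passed one.
cri = college_readiness_index(60.0, 40.0)
print(cri)            # 45.0
print(cri >= 19.42)   # True: above this year's national median threshold
```

Because performance carries three times the weight of participation, adding exam seats without preparing students to pass moves the CRI only modestly.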
Substep 3.4: Calculate the National Median CRI to Select High Schools to Be Ranked
The threshold for the CRI was set at the median of all CRI values—which, in this year’s analysis, was 19.42. That is, half the sample for which AP or IB data were available had CRI values higher than this value. High schools that passed Step 1 and Step 2, participated in AP or IB, and were at or above this median benchmark were eligible for silver or gold medals.
Substep 3.5: Rank High Schools and Assign Medals
High schools were awarded bronze medals if they passed Step 1 and Step 2 and either (1) participated in AP or IB programs but did not meet the CRI threshold of 19.42 or (2) did not participate in AP or IB programs. High schools that passed Step 1 and Step 2 and met or exceeded the CRI threshold were awarded a silver or a gold medal.
Though 19,753 high schools initially were considered for the rankings, 475 schools did not have sufficient data with which to calculate a performance index, leaving 19,278 eligible high schools. Of the high schools initially considered, 3,990 (20.2 percent) were awarded bronze medals, 2,027 (10.3 percent) were awarded silver medals, and 500 (2.5 percent) were awarded gold medals. All gold‐medal high schools in this year’s rankings had a CRI of 52.43 or higher. See Exhibit 9.
Exhibit 9. High School Performance Pyramid
In cases in which gold‐medal high schools were tied on their CRI, secondary rankings were calculated to create tiebreakers. The first tiebreaker was the average number of examinations passed per student among students who took and passed at least one test. The second tiebreaker was the number of examinations per test‐taker, which calculated an average number of tests taken per student among students who took at least one test. The third tiebreaker was the percentage of students taking and passing at least one examination.
Gold: 500 high schools
Silver: 2,027 high schools
Bronze: 3,990 high schools
No medal: 12,761 high schools
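The tiebreaking for gold‐medal schools tied on CRI is a lexicographic sort on the three secondary measures in priority order; a minimal sketch in which all school names and values are hypothetical:

```python
# Hypothetical gold-medal schools tied on CRI. Each tuple holds:
# (name, CRI, exams passed per passing student, exams per test-taker,
#  % of students taking and passing at least one exam)
schools = [
    ("School X", 52.5, 3.1, 4.0, 41.0),
    ("School Y", 52.5, 3.4, 3.8, 39.5),
    ("School Z", 52.5, 3.4, 4.2, 38.0),
]

# Rank descending on CRI, then on each tiebreaker in turn.
ranked = sorted(schools, key=lambda s: (s[1], s[2], s[3], s[4]), reverse=True)
print([s[0] for s in ranked])  # ['School Z', 'School Y', 'School X']
```

Here Schools Y and Z remain tied after the first tiebreaker (3.4 exams passed per passer), so the second tiebreaker (exams per test‐taker) puts School Z first.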
Data Notes

Schools without a Performance Index. To be considered for the rankings, a high school needed available assessment data for at least one subtest used in the state‐specific analyses for Step 1 and Step 2. Approximately 2.4 percent of the high schools (475 high schools) in the initial group of schools considered for the rankings did not have enough data to calculate a performance index and were thus removed from the analysis. Some of the reasons for this exclusion were missing state assessment data, missing state assessment data for the “all students” category, missing state assessment data for relevant subtests, missing records in the CCD, and suppressed state assessment data. In particular, the state of Alabama was missing all values for numbers tested; in this case, data from the CCD grade‐level enrollments were used to generate a performance index. In addition, Virginia had a high level of suppression for numbers tested; to properly balance the reading and mathematics tests in Virginia’s performance index, the ratio of tests taken by high school students at the state level was used to weight the assessment results (25 percent reading and 75 percent mathematics).
Thresholds to Identify “Best High Schools”

For the 2015 rankings, a new threshold was applied in Step 1 to include more high schools. The rankings from 2012 to 2014 used a threshold of 0.5 standard deviations for the performance zone in Step 1; in 2015, the threshold was reduced to 0.33 standard deviations. The rankings prior to the 2012 release used a threshold of 1.0 standard deviation for the performance zone in Step 1. In addition, the rankings prior to 2012 used a threshold of 20 for the CRI in Step 3 to identify silver‐medal schools; the 2013 through 2015 rankings used a CRI threshold based on the national median of all calculated indexes.

Starting in 2012, the “Best High Schools” rankings identified the top 500 high schools as gold‐medal high schools, instead of the top 100 high schools recognized in previous versions. In addition, gold‐ and silver‐medal schools were ranked; previous versions ranked only schools receiving gold medals.

The “Best High Schools” rankings no longer separately acknowledge high schools that did not pass Step 1 and Step 2 but had CRI values as high as those of the top‐ranked gold‐medal high schools. (These high schools formerly had received “honorable mention.”)
Data could have been suppressed for various reasons, including to protect the identification of students overall, to protect the identification of students in particular subgroups, and to hide unreportable data. It is possible that in some of these cases the data were suppressed from public view because a high percentage of students in the school achieved the same standard (e.g., more than 90 percent of students scored above proficient). States in which a performance index could be calculated for less than 85 percent of schools were Delaware, Maine, Utah, and Wyoming. Oklahoma was also missing a significant proportion of schools entirely from its state assessment file, although a performance index could be calculated for 100 percent of the school records that were received. The numbers of schools without a performance index are noted at the bottom of each state‐by‐state scatterplot in Appendix B.
Accounting for Missing Subgroup Numbers Tested. If high schools were missing numbers tested for subgroups, CCD enrollment data were used as a substitute. To avoid choosing specific grade or school‐level enrollment counts from the CCD to match the different grades and courses tested within each state, the proportions of subgroups in the school overall were used as the weights in calculating RaMP. That is, instead of using the numbers tested that were directly provided in the assessment data or, as substitutes, CCD enrollment counts, the proportion of students in each subgroup was calculated from either the assessment data or from the CCD (if assessment values were missing). This procedure has the advantage of providing consistency across states and missing subgroups, is less prone to error, and allows for mixed patterns of missing and non‐missing subgroup values within a school. Specifically, the steps are as follows:
• First, using state assessment data, calculate the percentage of students (Black, Hispanic, White, economically disadvantaged, and non‐economically disadvantaged) being tested in each school.
• Second, using CCD data, calculate the percentage of students in each subgroup (same groups as above) as a percentage of the entire school.
• Third, if the percentage of students in each group being tested is missing, substitute the percentage calculated from the CCD data.
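The substitution rule in the steps above can be sketched as follows; the subgroup labels and proportions are hypothetical:

```python
def subgroup_weights(assessment_pcts, ccd_pcts):
    """For each subgroup, use the proportion computed from the assessment data
    when available; otherwise substitute the proportion computed from the CCD."""
    return {
        group: assessment_pcts.get(group)
               if assessment_pcts.get(group) is not None
               else ccd_pcts[group]
        for group in ccd_pcts
    }

# Hypothetical school: the Hispanic tested proportion is suppressed (None).
assessment = {"black": 0.13, "hispanic": None, "econ_disadvantaged": 0.22}
ccd = {"black": 0.12, "hispanic": 0.15, "econ_disadvantaged": 0.25}
print(subgroup_weights(assessment, ccd))
# {'black': 0.13, 'hispanic': 0.15, 'econ_disadvantaged': 0.22}
```

This mirrors the document’s rationale: mixing assessment‐based and CCD‐based proportions within one school is allowed, so partial suppression does not force a school out of the analysis.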
Alabama and Mississippi suppressed all subgroup values for numbers tested, and Maine was missing almost all of its economically disadvantaged student data. Illinois, Kentucky, and Virginia were also missing significant proportions of subgroup numbers tested.
Free and Reduced‐Price Lunch Eligibility. Technical documentation for the CCD (http://nces.ed.gov/ccd/pdf/2015009.pdf) notes changes in the number of students eligible for free and reduced‐price lunch from 2011–12 to 2012–13. The District of Columbia and West Virginia reported decreases of more than a third in students eligible for reduced‐price lunch due to community eligibility options (Keaton, 2014, D‐4 and D‐9). Changes in community eligibility options also accounted for significant changes in free and reduced‐price lunch eligibility in Illinois, where the number of students eligible for free lunch more than doubled and the number eligible for reduced‐price lunch fell by 80% (Keaton, 2014, D‐4). Maine reported issues with the 2011–12 free and reduced‐price lunch data, which caused an almost 600% increase in students eligible for free lunch in 2012–13 (Keaton, 2014, D‐5). Texas attributed an increase of 23% in eligibility for free
lunch to changes in data reporting methods. The number of students eligible for free lunch increased by almost a third in Utah, but no reason was reported for the change (Keaton, 2014, D‐9).
Use of Advanced Placement Data. States provided assent to use aggregated Advanced Placement test participation data from The College Board. Three states, however, did not respond to requests for consent: Alabama, Minnesota, and South Dakota. Schools in these states were eligible for a bronze medal (by passing Step 1 and Step 2) and were eligible for a silver or gold medal based only on their International Baccalaureate test participation. In addition, in providing assent to use AP data, some states requested suppression of particular AP data values. These states were Colorado, Florida, Idaho, New York, and Tennessee.
References

Caldas, S. J., & Bankston, C. III. (1997). Effect of school population socioeconomic status on individual academic achievement. Journal of Educational Research, 90(5), 269–277.
Crosnoe, R. (2009). Low‐income students and the socioeconomic composition of public high schools. American Sociological Review, 74(5), 709–730.
Crosnoe, R., & Schneider, B. (2010). Social capital, information, and socioeconomic disparities in math course work. American Journal of Education, 117(1), 79–107.
Keaton, P. (2014). Documentation to the NCES Common Core of Data Public Elementary/Secondary School Universe Survey: School Year 2012‐13 Provisional Version 1a (NCES 2015‐009). U.S. Department of Education. Washington, DC: National Center for Education Statistics. Retrieved from http://nces.ed.gov/pubsearch/pubsinfo.asp?pubid=2015009.
Rumberger, R. W., & Palardy, G. J. (2005). Does segregation still matter? The impact of student composition on academic achievement in high school. Teachers College Record, 107(9), 1999–2045.
Sirin, S. R. (2005). Socioeconomic status and academic achievement: A meta‐analytic review of research. Review of Educational Research, 75(3), 417–453.
White, K. R. (1982). The relation between socioeconomic status and academic achievement. Psychological Bulletin, 91(3), 461–481.
Technical Appendices
Appendix A. State Assessments and Performance Levels Used to Calculate the Performance Index and Disadvantaged Student Subgroup Proficiency Rates (2011–12)

The following table shows the state assessments (reading and mathematics) used to calculate the performance index and disadvantaged student subgroup proficiency rates. It also shows the range of potential performance index values and the levels reported below proficient and at or above proficient. The proficient level was assigned a value of 1.0 points, with 1.5 points for one level above proficient and 2.0 points for two levels above proficient. One level below proficient was assigned a value of 0.5 points. Two levels below proficient received a value of 0 (three levels below proficient was also assigned a value of 0 points; only Oregon reported three levels below proficient). When only one level was reported as below proficient, that level also received a value of 0 (these levels are shown under the “2 levels below proficient” column below).
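The point assignment just described can be sketched as follows. The scaling of level percentages by point values, which reproduces the 0–150 and 0–200 index ranges shown in the table, is an inference from those ranges rather than a formula stated in this appendix, and the level distribution is hypothetical:

```python
# Point values from Appendix A: two levels below proficient scores 0.0,
# one level below 0.5, proficient 1.0, one above 1.5, two above 2.0.
POINTS = {"2_below": 0.0, "1_below": 0.5, "proficient": 1.0,
          "1_above": 1.5, "2_above": 2.0}

def performance_index(pct_by_level):
    """Index = sum over reported levels of (percentage of students at the
    level) x (point value). With a top level of 1.5 the index ranges 0-150;
    with a top level of 2.0 it ranges 0-200."""
    return sum(pct * POINTS[level] for level, pct in pct_by_level.items())

# Hypothetical school in a state reporting four levels (range 0-150):
dist = {"2_below": 10, "1_below": 20, "proficient": 50, "1_above": 20}
print(performance_index(dist))  # 10*0 + 20*0.5 + 50*1.0 + 20*1.5 = 90.0
```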
State | Assessment Name | Range of Potential Performance Index Values | Levels Reported (✓): 2 Levels Below Proficient (0.0), 1 Level Below Proficient (0.5), Proficient (1.0), 1 Level Above Proficient (1.5), 2 Levels Above Proficient (2.0)
Alabama | Alabama High School Graduation Exam (AHSGE) | 0–150 | ✓ ✓ ✓
Alaska | Standards Based Assessment (SBA) | 0–150 | ✓ ✓ ✓ ✓
Arizona | Arizona’s Instrument to Measure Standards (AIMS) | 0–150 | ✓ ✓ ✓ ✓
Arkansas | Mathematics EOCs¹ and Grade 11 Literacy EOG² | 0–150 | ✓ ✓ ✓ ✓
California³ | California High School Exit Exam (CAHSEE) and Standardized Testing and Reporting (STAR) | 0–1000 | Not applicable
Colorado | Transitional Colorado Assessment Program (TCAP) | 0–150 | ✓ ✓ ✓ ✓
Connecticut | Connecticut Academic Performance Test (CAPT) | 0–200 | ✓ ✓ ✓ ✓ ✓
Delaware | Delaware Comprehensive Assessment System (DCAS) | 0–150 | ✓ ✓ ✓ ✓
District of Columbia | DC Comprehensive Assessment System (DC CAS) | 0–150 | ✓ ✓ ✓ ✓
Florida | Florida Comprehensive Assessment Test (Reading FCAT 2.0) and Mathematics EOCs | 0–200 | ✓ ✓ ✓ ✓ ✓

¹ End‐of‐course assessments are given at the end of a particular course of study, such as Algebra I or Geometry. They are typically referred to by states, and abbreviated throughout this table, as EOCs.
² End‐of‐grade assessments are given at the end of a particular grade and are often referred to as EOGs by states.
³ This analysis used results from California’s Academic Performance Index (API) 2013 Growth Report rather than proficiency‐level results from state assessments; API values are calculated by the state using results from both the CAHSEE and the STAR Program.
Georgia | EOC Tests (EOCT) | 0–150 | ✓ ✓ ✓
Hawaii | Hawaii State Assessment (HSA) | 0–150 | ✓ ✓ ✓ ✓
Idaho | Idaho Standards Achievement Tests (ISAT) | 0–150 | ✓ ✓ ✓ ✓
Illinois | Prairie State Achievement Examination (PSAE) | 0–150 | ✓ ✓ ✓ ✓
Indiana | End‐of‐Course Assessments (ECAs) | 0–100 | ✓ ✓
Iowa | Iowa Assessments | 0–150 | ✓ ✓ ✓
Kansas | Kansas State Assessment (KSA) | 0–150 | ✓ ✓ ✓
Kentucky | Kentucky Performance Rating for Educational Progress (K‐PREP) EOCs | 0–150 | ✓ ✓ ✓ ✓
Louisiana | EOCs | 0–150 | ✓ ✓ ✓ ✓
Maine | Maine High School Assessment (MHSA) | 0–150 | ✓ ✓ ✓ ✓
Maryland | Maryland High School Assessment (HSA) | 0–150 | ✓ ✓ ✓
Massachusetts | Massachusetts Comprehensive Assessment System (MCAS) | 0–150 | ✓ ✓ ✓ ✓
Michigan | Michigan Merit Examination (MME) | 0–150 | ✓ ✓ ✓ ✓
Minnesota | Minnesota Comprehensive Assessments (MCA) | 0–150 | ✓ ✓ ✓ ✓
Mississippi | Subject Area Testing Program, 2nd Edition (SATP2) | 0–100 | ✓ ✓ ✓
Missouri | Missouri Assessment Program (MAP) EOCs | 0–150 | ✓ ✓ ✓ ✓
Montana | Montana Comprehensive Assessment System (MontCAS) Criterion‐Referenced Test | 0–150 | ✓ ✓ ✓ ✓
Nebraska | Nebraska State Accountability (NeSA) | 0–100 | ✓ ✓
Nevada | Nevada High School Proficiency Examination (HSPE) | 0–150 | ✓ ✓ ✓ ✓
New Hampshire | New England Common Assessment Program (NECAP) | 0–150 | ✓ ✓ ✓ ✓
New Jersey | High School Proficiency Assessment (HSPA) | 0–150 | ✓ ✓ ✓
New Mexico | New Mexico Standards Based Assessment (NMSBA) | 0–150 | ✓ ✓ ✓ ✓
New York | Regents Examinations | 0–150 | ✓ ✓ ✓ ✓
North Carolina | End‐of‐Course (EOC) Tests | 0–150 | ✓ ✓ ✓ ✓
North Dakota | North Dakota State Assessment | 0–150 | ✓ ✓ ✓ ✓
Ohio | Ohio Graduation Test (OGT) | 0–200 | ✓ ✓ ✓ ✓ ✓
Oklahoma | Oklahoma Core Curriculum Tests (OCCT) End of Instruction (EOI) Assessments | 0–150 | ✓ ✓ ✓ ✓
Oregon⁴ | Oregon Assessment of Knowledge and Skills (OAKS) | 0–150 | ✓ ✓ ✓ ✓
Pennsylvania | Keystone End‐of‐Course Exams | 0–150 | ✓ ✓ ✓ ✓
Rhode Island | New England Common Assessment Program (NECAP) | 0–150 | ✓ ✓ ✓ ✓
South Carolina | High School Assessment Program (HSAP) | 0–150 | ✓ ✓ ✓ ✓
South Dakota | Dakota State Test of Educational Progress (Dakota STEP) | 0–150 | ✓ ✓ ✓ ✓
Tennessee | Tennessee Comprehensive Assessment Program (TCAP) EOC | 0–150 | ✓ ✓ ✓ ✓
Texas | State of Texas Assessments of Academic Readiness (STAAR) EOC Assessments | 0–150 | ✓ ✓ ✓
Utah | State Core Criterion‐Referenced Tests (Core CRT) | 0–100 | ✓
Vermont | New England Common Assessment Program (NECAP) | 0–150 | ✓ ✓ ✓
Virginia | Standards of Learning (SOL) EOCs | 0–150 | ✓ ✓
Washington | High School Proficiency Exam (Reading HSPE), Mathematics EOCs | 0–150 | ✓ ✓ ✓
West Virginia | West Virginia Educational Standards Tests (WESTEST 2) | 0–200 | ✓ ✓ ✓ ✓
Wisconsin | Wisconsin Knowledge and Concepts Examinations (WKCE) | 0–150 | ✓ ✓ ✓
Wyoming | Proficiency Assessment for Wyoming Students (PAWS) | 0–150 | ✓ ✓ ✓

⁴ Oregon also reported proficiency results for two levels below proficient.
Appendix B. State Assessment Regression Analyses for the Performance Index

The following pages contain state‐by‐state scatterplot graphs showing the relationship between the performance index (as measured by performance on state assessments in reading and mathematics) and the poverty rate. High schools above the performance zone (the green band) are deemed to be performing above expectations in their state for their poverty levels. These high schools passed Step 1 of the analyses.
The relationship between the performance index and the poverty rate was negative across all states; in other words, in each state, the performance index decreased as the level of poverty increased. Schools for which a performance index could not be calculated from available data were not included in the analysis. The number of such schools appears below each state‐specific table.
Alabama

Step 1 Analysis: Actual Performance Versus Expected Performance
Total number analyzed statewide (calculated PI): 359
Number of high schools performing above expectations in Step 1: 131
Percentage of schools performing above expectations in Step 1: 36

Grades‐Based Testing
Subject | 9th | 10th | 11th | 12th
Reading | ✓
Mathematics | ✓
Alaska

Step 1 Analysis: Actual Performance Versus Expected Performance
Total number analyzed statewide (calculated PI): 109
Number of high schools performing above expectations in Step 1: 45
Percentage of schools performing above expectations in Step 1: 41

Grades‐Based Testing
Subject | 9th | 10th | 11th | 12th
Reading | ✓ ✓
Mathematics | ✓ ✓
Arizona

Step 1 Analysis: Actual Performance Versus Expected Performance
Total number analyzed statewide (calculated PI): 463
Number of high schools performing above expectations in Step 1: 200
Percentage of schools performing above expectations in Step 1: 43

Grades‐Based Testing
Subject | 9th | 10th | 11th | 12th
Reading | ✓
Mathematics | ✓
Arkansas

Step 1 Analysis: Actual Performance Versus Expected Performance
Total number analyzed statewide (calculated PI): 283
Number of high schools performing above expectations in Step 1: 110
Percentage of schools performing above expectations in Step 1: 39

Grades‐Based Testing
Subject | 9th | 10th | 11th | 12th
Reading | ✓
Mathematics | ✓ ✓

End‐of‐grade tests include Grade 11 Literacy, and end‐of‐course tests include Algebra I and Geometry.
California

Step 1 Analysis: Actual Performance Versus Expected Performance
Total number analyzed statewide (calculated PI): 2,045
Number of high schools performing above expectations in Step 1: 950
Percentage of schools performing above expectations in Step 1: 46

Grades‐Based Testing
Subject | 9th | 10th | 11th | 12th
Reading | ✓
Mathematics | ✓
Colorado

Step 1 Analysis: Actual Performance Versus Expected Performance
Total number analyzed statewide (calculated PI): 357
Number of high schools performing above expectations in Step 1: 131
Percentage of schools performing above expectations in Step 1: 37

Grades‐Based Testing
Subject | 9th | 10th | 11th | 12th
Reading | ✓ ✓
Mathematics | ✓ ✓
Connecticut

Step 1 Analysis: Actual Performance Versus Expected Performance
Total number analyzed statewide (calculated PI): 193
Number of high schools performing above expectations in Step 1: 63
Percentage of schools performing above expectations in Step 1: 33

Grades‐Based Testing
Subject | 9th | 10th | 11th | 12th
Reading | ✓
Mathematics | ✓
Delaware

Step 1 Analysis: Actual Performance Versus Expected Performance
Total number analyzed statewide (calculated PI): 26
Number of high schools performing above expectations in Step 1: 10
Percentage of schools performing above expectations in Step 1: 38

Grades‐Based Testing
Subject | 9th | 10th | 11th | 12th
Reading | ✓ ✓
Mathematics | ✓ ✓
District of Columbia

Step 1 Analysis: Actual Performance Versus Expected Performance
Total number analyzed statewide (calculated PI): 32
Number of high schools performing above expectations in Step 1: 13
Percentage of schools performing above expectations in Step 1: 41

Grades‐Based Testing
Subject | 9th | 10th | 11th | 12th
Reading | ✓
Mathematics | ✓
Florida

Step 1 Analysis: Actual Performance Versus Expected Performance
Total number analyzed statewide (calculated PI): 694
Number of high schools performing above expectations in Step 1: 261
Percentage of schools performing above expectations in Step 1: 38

Grades‐Based Testing
Subject | 9th | 10th | 11th | 12th
Reading | ✓ ✓
Mathematics | ✓ ✓ ✓ ✓
Georgia

Step 1 Analysis: Actual Performance Versus Expected Performance
Total number analyzed statewide (calculated PI): 440
Number of high schools performing above expectations in Step 1: 157
Percentage of schools performing above expectations in Step 1: 36

Grades‐Based Testing
Subject | 9th | 10th | 11th | 12th
Reading | ✓ ✓ ✓ ✓
Mathematics | ✓ ✓ ✓ ✓

End‐of‐course tests include 9th‐grade Literature, American Literature, Math 1, Math 2, GPS Algebra, GPS Geometry, and Coordinate Geometry.
Hawaii

Step 1 Analysis: Actual Performance Versus Expected Performance
Total number analyzed statewide (calculated PI): 54
Number of high schools performing above expectations in Step 1: 21
Percentage of schools performing above expectations in Step 1: 39

Grades‐Based Testing
Subject | 9th | 10th | 11th | 12th
Reading | ✓
Mathematics | ✓
Idaho

Step 1 Analysis: Actual Performance Versus Expected Performance
Total number analyzed statewide (calculated PI): 162
Number of high schools performing above expectations in Step 1: 66
Percentage of schools performing above expectations in Step 1: 41

Grades‐Based Testing
Subject | 9th | 10th | 11th | 12th
Reading | ✓ ✓
Mathematics | ✓ ✓
Illinois

Step 1 Analysis: Actual Performance Versus Expected Performance
Total number analyzed statewide (calculated PI): 667
Number of high schools performing above expectations in Step 1: 215
Percentage of schools performing above expectations in Step 1: 32

Grades‐Based Testing
Subject | 9th | 10th | 11th | 12th
Reading | ✓
Mathematics | ✓
Indiana

Step 1 Analysis: Actual Performance Versus Expected Performance
Total number analyzed statewide (calculated PI): 395
Number of high schools performing above expectations in Step 1: 145
Percentage of schools performing above expectations in Step 1: 37

Grades‐Based Testing
Subject | 9th | 10th | 11th | 12th
Reading | ✓ ✓ ✓ ✓
Mathematics | ✓ ✓ ✓ ✓
Iowa

Step 1 Analysis: Actual Performance Versus Expected Performance
Total number analyzed statewide (calculated PI): 340
Number of high schools performing above expectations in Step 1: 126
Percentage of schools performing above expectations in Step 1: 37

Grades‐Based Testing
Subject | 9th | 10th | 11th | 12th
Reading | ✓
Mathematics | ✓
Kansas

Step 1 Analysis: Actual Performance Versus Expected Performance
Total number analyzed statewide (calculated PI): 318
Number of high schools performing above expectations in Step 1: 103
Percentage of schools performing above expectations in Step 1: 32

Grades‐Based Testing
Subject | 9th | 10th | 11th | 12th
Reading | ✓
Mathematics | ✓
Kentucky

Step 1 Analysis: Actual Performance Versus Expected Performance
Total number analyzed statewide (calculated PI): 271
Number of high schools performing above expectations in Step 1: 105
Percentage of schools performing above expectations in Step 1: 39

Grades‐Based Testing
Subject | 9th | 10th | 11th | 12th
Reading | ✓ ✓ ✓
Mathematics | ✓ ✓ ✓
Louisiana

Step 1 Analysis: Actual Performance Versus Expected Performance
Total number analyzed statewide (calculated PI): 318
Number of high schools performing above expectations in Step 1: 106
Percentage of schools performing above expectations in Step 1: 33

Grades‐Based Testing
Subject | 9th | 10th | 11th | 12th
Reading | ✓ ✓ ✓ ✓
Mathematics | ✓ ✓ ✓ ✓
PAGE 44 | Identifying Top‐PerformingHigh Schools Technical
Maine

Step 1 Analysis: Actual Performance Versus Expected Performance
Total number analyzed statewide (calculated PI): 95
Number of high schools performing above expectations in Step 1: 29
Percentage of schools performing above expectations in Step 1: 31%

Grades‐Based Testing
Subject        9th   10th   11th   12th
Reading         ✓
Mathematics     ✓
Maryland

Step 1 Analysis: Actual Performance Versus Expected Performance
Total number analyzed statewide (calculated PI): 232
Number of high schools performing above expectations in Step 1: 96
Percentage of schools performing above expectations in Step 1: 41%

Grades‐Based Testing
Subject        9th   10th   11th   12th
Reading         ✓
Mathematics     ✓
High School Assessment tests include English 2 and Algebra/Data Analysis.
Massachusetts

Step 1 Analysis: Actual Performance Versus Expected Performance
Total number analyzed statewide (calculated PI): 352
Number of high schools performing above expectations in Step 1: 108
Percentage of schools performing above expectations in Step 1: 31%

Grades‐Based Testing
Subject        9th   10th   11th   12th
Reading         ✓
Mathematics     ✓
Michigan

Step 1 Analysis: Actual Performance Versus Expected Performance
Total number analyzed statewide (calculated PI): 834
Number of high schools performing above expectations in Step 1: 309
Percentage of schools performing above expectations in Step 1: 37%

Grades‐Based Testing
Subject        9th   10th   11th   12th
Reading         ✓
Mathematics     ✓
Minnesota

Step 1 Analysis: Actual Performance Versus Expected Performance
Total number analyzed statewide (calculated PI): 535
Number of high schools performing above expectations in Step 1: 232
Percentage of schools performing above expectations in Step 1: 43%

Grades‐Based Testing
Subject        9th   10th   11th   12th
Reading         ✓
Mathematics     ✓
Mississippi

Step 1 Analysis: Actual Performance Versus Expected Performance
Total number analyzed statewide (calculated PI): 243
Number of high schools performing above expectations in Step 1: 97
Percentage of schools performing above expectations in Step 1: 40%

Grades‐Based Testing
Subject        9th   10th   11th   12th
Reading         ✓
Mathematics     ✓
Missouri

Step 1 Analysis: Actual Performance Versus Expected Performance
Total number analyzed statewide (calculated PI): 504
Number of high schools performing above expectations in Step 1: 190
Percentage of schools performing above expectations in Step 1: 38%

Grades‐Based Testing
Subject        9th   10th   11th   12th
Reading         ✓     ✓     ✓     ✓
Mathematics     ✓     ✓     ✓     ✓
End‐of‐course tests include English I, English II, Algebra I, Algebra II, and Geometry.
Montana

Step 1 Analysis: Actual Performance Versus Expected Performance
Total number analyzed statewide (calculated PI): 121
Number of high schools performing above expectations in Step 1: 37
Percentage of schools performing above expectations in Step 1: 31%

Grades‐Based Testing
Subject        9th   10th   11th   12th
Reading         ✓     ✓
Mathematics     ✓     ✓
Nebraska

Step 1 Analysis: Actual Performance Versus Expected Performance
Total number analyzed statewide (calculated PI): 230
Number of high schools performing above expectations in Step 1: 87
Percentage of schools performing above expectations in Step 1: 38%

Grades‐Based Testing
Subject        9th   10th   11th   12th
Reading         ✓
Mathematics     ✓
Nevada

Step 1 Analysis: Actual Performance Versus Expected Performance
Total number analyzed statewide (calculated PI): 112
Number of high schools performing above expectations in Step 1: 37
Percentage of schools performing above expectations in Step 1: 33%

Grades‐Based Testing
Subject        9th   10th   11th   12th
Reading         ✓
Mathematics     ✓
New Hampshire

Step 1 Analysis: Actual Performance Versus Expected Performance
Total number analyzed statewide (calculated PI): 84
Number of high schools performing above expectations in Step 1: 31
Percentage of schools performing above expectations in Step 1: 37%

Grades‐Based Testing
Subject        9th   10th   11th   12th
Reading         ✓
Mathematics     ✓
New Jersey

Step 1 Analysis: Actual Performance Versus Expected Performance
Total number analyzed statewide (calculated PI): 405
Number of high schools performing above expectations in Step 1: 105
Percentage of schools performing above expectations in Step 1: 26%

Grades‐Based Testing
Subject        9th   10th   11th   12th
Reading         ✓
Mathematics     ✓
New Mexico

Step 1 Analysis: Actual Performance Versus Expected Performance
Total number analyzed statewide (calculated PI): 179
Number of high schools performing above expectations in Step 1: 62
Percentage of schools performing above expectations in Step 1: 35%

Grades‐Based Testing
Subject        9th   10th   11th   12th
Reading         ✓
Mathematics     ✓
New York

Step 1 Analysis: Actual Performance Versus Expected Performance
Total number analyzed statewide (calculated PI): 1,239
Number of high schools performing above expectations in Step 1: 457
Percentage of schools performing above expectations in Step 1: 37%

Grades‐Based Testing
Subject        9th   10th   11th   12th
Reading         ✓     ✓     ✓     ✓
Mathematics     ✓     ✓     ✓     ✓
Regents’ exams include English, Integrated Algebra, Geometry, and Algebra 2/Trigonometry.
North Carolina

Step 1 Analysis: Actual Performance Versus Expected Performance
Total number analyzed statewide (calculated PI): 550
Number of high schools performing above expectations in Step 1: 148
Percentage of schools performing above expectations in Step 1: 27%

Grades‐Based Testing
Subject        9th   10th   11th   12th
Reading         ✓
Mathematics     ✓
North Dakota

Step 1 Analysis: Actual Performance Versus Expected Performance
Total number analyzed statewide (calculated PI): 118
Number of high schools performing above expectations in Step 1: 43
Percentage of schools performing above expectations in Step 1: 36%

Grades‐Based Testing
Subject        9th   10th   11th   12th
Reading         ✓
Mathematics     ✓
Ohio

Step 1 Analysis: Actual Performance Versus Expected Performance
Total number analyzed statewide (calculated PI): 871
Number of high schools performing above expectations in Step 1: 427
Percentage of schools performing above expectations in Step 1: 49%

Grades‐Based Testing
Subject        9th   10th   11th   12th
Reading         ✓
Mathematics     ✓
Oklahoma

Step 1 Analysis: Actual Performance Versus Expected Performance
Total number analyzed statewide (calculated PI): 80
Number of high schools performing above expectations in Step 1: 25
Percentage of schools performing above expectations in Step 1: 31%

Grades‐Based Testing
Subject        9th   10th   11th   12th
Reading         ✓     ✓     ✓     ✓
Mathematics     ✓     ✓     ✓     ✓
Core Curriculum tests include English II and Algebra I.
Oregon

Step 1 Analysis: Actual Performance Versus Expected Performance
Total number analyzed statewide (calculated PI): 293
Number of high schools performing above expectations in Step 1: 139
Percentage of schools performing above expectations in Step 1: 47%

Grades‐Based Testing
Subject        9th   10th   11th   12th
Reading         ✓
Mathematics     ✓
Pennsylvania

Step 1 Analysis: Actual Performance Versus Expected Performance
Total number analyzed statewide (calculated PI): 671
Number of high schools performing above expectations in Step 1: 254
Percentage of schools performing above expectations in Step 1: 38%

Grades‐Based Testing
Subject        9th   10th   11th   12th
Reading         ✓
Mathematics     ✓
Rhode Island

Step 1 Analysis: Actual Performance Versus Expected Performance
Total number analyzed statewide (calculated PI): 53
Number of high schools performing above expectations in Step 1: 13
Percentage of schools performing above expectations in Step 1: 25%

Grades‐Based Testing
Subject        9th   10th   11th   12th
Reading         ✓
Mathematics     ✓
South Carolina

Step 1 Analysis: Actual Performance Versus Expected Performance
Total number analyzed statewide (calculated PI): 218
Number of high schools performing above expectations in Step 1: 81
Percentage of schools performing above expectations in Step 1: 37%

Grades‐Based Testing
Subject        9th   10th   11th   12th
Reading         ✓
Mathematics     ✓
South Dakota

Step 1 Analysis: Actual Performance Versus Expected Performance
Total number analyzed statewide (calculated PI): 141
Number of high schools performing above expectations in Step 1: 51
Percentage of schools performing above expectations in Step 1: 36%

Grades‐Based Testing
Subject        9th   10th   11th   12th
Reading         ✓
Mathematics     ✓
Tennessee

Step 1 Analysis: Actual Performance Versus Expected Performance
Total number analyzed statewide (calculated PI): 347
Number of high schools performing above expectations in Step 1: 123
Percentage of schools performing above expectations in Step 1: 35%

Grades‐Based Testing
Subject        9th   10th   11th   12th
Reading         ✓     ✓     ✓     ✓
Mathematics     ✓     ✓     ✓     ✓
End‐of‐course tests include English I, English II, English III, Algebra I, and Algebra II.
Texas

Step 1 Analysis: Actual Performance Versus Expected Performance
Total number analyzed statewide (calculated PI): 1,664
Number of high schools performing above expectations in Step 1: 545
Percentage of schools performing above expectations in Step 1: 33%

Grades‐Based Testing
Subject        9th   10th   11th   12th
Reading         ✓     ✓     ✓     ✓
Mathematics     ✓     ✓     ✓     ✓
Utah

Step 1 Analysis: Actual Performance Versus Expected Performance
Total number analyzed statewide (calculated PI): 134
Number of high schools performing above expectations in Step 1: 53
Percentage of schools performing above expectations in Step 1: 40%

Grades‐Based Testing
Subject        9th   10th   11th   12th
Reading         ✓
Mathematics     ✓     ✓     ✓
Vermont

Step 1 Analysis: Actual Performance Versus Expected Performance
Total number analyzed statewide (calculated PI): 53
Number of high schools performing above expectations in Step 1: 19
Percentage of schools performing above expectations in Step 1: 36%

Grades‐Based Testing
Subject        9th   10th   11th   12th
Reading         ✓
Mathematics     ✓
Virginia

Step 1 Analysis: Actual Performance Versus Expected Performance
Total number analyzed statewide (calculated PI): 323
Number of high schools performing above expectations in Step 1: 113
Percentage of schools performing above expectations in Step 1: 35%

Grades‐Based Testing
Subject        9th   10th   11th   12th
Reading         ✓     ✓     ✓     ✓
Mathematics     ✓     ✓     ✓     ✓
Washington

Step 1 Analysis: Actual Performance Versus Expected Performance
Total number analyzed statewide (calculated PI): 450
Number of high schools performing above expectations in Step 1: 183
Percentage of schools performing above expectations in Step 1: 41%

Grades‐Based Testing
Subject        9th   10th   11th   12th
Reading         ✓
Mathematics     ✓
West Virginia

Step 1 Analysis: Actual Performance Versus Expected Performance
Total number analyzed statewide (calculated PI): 94
Number of high schools performing above expectations in Step 1: 35
Percentage of schools performing above expectations in Step 1: 37%

Grades‐Based Testing
Subject        9th   10th   11th   12th
Reading         ✓     ✓     ✓
Mathematics     ✓     ✓     ✓
Wisconsin

Step 1 Analysis: Actual Performance Versus Expected Performance
Total number analyzed statewide (calculated PI): 459
Number of high schools performing above expectations in Step 1: 166
Percentage of schools performing above expectations in Step 1: 36%

Grades‐Based Testing
Subject        9th   10th   11th   12th
Reading         ✓
Mathematics     ✓
Wyoming

Step 1 Analysis: Actual Performance Versus Expected Performance
Total number analyzed statewide (calculated PI): 68
Number of high schools performing above expectations in Step 1: 27
Percentage of schools performing above expectations in Step 1: 40%

Grades‐Based Testing
Subject        9th   10th   11th   12th
Reading         ✓
Mathematics     ✓
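Each state table above reports three figures: the number of high schools analyzed statewide, the number performing above expectations in Step 1, and the resulting percentage. The percentages appear to be the above-expectations count divided by the total analyzed, rounded to the nearest whole number. A minimal sketch checking that relationship against a few of the states above (the dictionary name and structure are illustrative, not part of the rankings methodology):

```python
# Check that each reported Step 1 percentage equals the above-expectations
# count divided by the total analyzed, rounded to a whole number.
# Figures are taken directly from the state tables above.
step1_figures = {
    # state: (total analyzed, above expectations, reported percentage)
    "Illinois": (667, 215, 32),
    "Indiana": (395, 145, 37),
    "New York": (1239, 457, 37),
    "Texas": (1664, 545, 33),
}

for state, (total, above, reported) in step1_figures.items():
    computed = round(100 * above / total)
    print(f"{state}: computed {computed}%, reported {reported}%")
```

For these states the computed and reported percentages agree, which is consistent with simple rounding of the ratio rather than truncation.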
Prepared for U.S. News & World Report by

RTI International
3040 East Cornwallis Road
Research Triangle Park, NC 27709‐2194
www.rti.org

RTI International is a registered trademark and trade name of Research Triangle Institute.