TRANSCRIPT
www.GeorgiaEducation.org
Data Analysis
This presentation is intended to accompany the Georgia School Council GuideBook.
Why So Many Tests?
Assessments are a central part of the accountability system required by the federal No Child Left Behind Act and the Georgia A+ Education Reform Act of 2000.
Both laws require annual testing of all students in specific grades and subjects.
See p. 2.15 of the Georgia School Council GuideBook for a chart of all required assessments in Georgia.
Purpose of Assessments
To identify students performing below grade level, at grade level, and above grade level so that instruction can be tailored to individual student needs.
To provide teachers with information to guide instruction.
To assist schools and school systems in setting priorities.
Two Kinds of Assessments
Criterion-referenced tests (CRTs) measure how well the student has learned a particular curriculum.
A group of experts decides how many questions must be answered correctly for a student to pass or to receive a score in a particular category (e.g., Pass, Pass Plus). The number of correct answers required is called the cut score.
The Criterion-Referenced Competency Tests (CRCTs), End of Course Tests (EOCTs), and graduation tests are criterion-referenced assessments used in Georgia.
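The idea of a cut score can be sketched in a few lines of code. The cut scores below are made up for illustration; actual cut scores for Georgia's tests are set by panels of experts for each test and subject.

```python
# Hypothetical illustration of how a cut score turns a raw score on a
# criterion-referenced test into a reported category. The thresholds
# (30 and 45 correct answers) are invented for this sketch.

def classify(raw_score, cut_pass=30, cut_pass_plus=45):
    """Map a raw score (number of correct answers) to a CRT category."""
    if raw_score >= cut_pass_plus:
        return "Pass Plus"
    if raw_score >= cut_pass:
        return "Pass"
    return "Fail"

print(classify(50))  # Pass Plus
print(classify(32))  # Pass
print(classify(20))  # Fail
```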
Two Kinds of Assessments
Norm-referenced tests (NRTs) measure how well students perform compared to all the other students taking the test.
Scores are reported as percentiles from 1 to 99.
The 50th percentile is the median. Half of the students did better; half did worse.
The Iowa Test of Basic Skills (ITBS) and the Stanford 9 are the two most commonly used NRTs in Georgia.
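A percentile rank simply locates one student within the whole group of test takers. This sketch uses a common midpoint convention, not the actual ITBS or Stanford 9 scoring method, and invented scores.

```python
# A sketch of a percentile rank: the percent of test takers who scored
# below a given student (counting ties at half weight). This is an
# illustration, not the actual norm-referenced scoring procedure.

def percentile_rank(score, all_scores):
    below = sum(1 for s in all_scores if s < score)
    equal = sum(1 for s in all_scores if s == score)
    return round(100 * (below + 0.5 * equal) / len(all_scores))

scores = list(range(1, 100))        # 99 hypothetical students scoring 1..99
print(percentile_rank(50, scores))  # 50 -- the median student
```

As the slide notes, the student at the 50th percentile is the median: half the group scored higher and half scored lower.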
Compare and Contrast
Georgia’s criterion-referenced tests show how well students have mastered the curriculum adopted by the State Board of Education.
Norm-referenced tests, such as the Iowa Test of Basic Skills (ITBS), compare our students’ mastery of general knowledge to other students in the nation.
Discuss
Some criticize the requirement for annual assessments, saying it leads to teachers "teaching to the test." Do you agree?
Does your answer depend on whether a CRT or NRT is used?
If assessments were not used, how would student achievement be measured and monitored?
The National Test: NAEP
The National Assessment of Educational Progress (NAEP) is sometimes referred to as the nation's report card.
Only certain students in certain grades in certain schools take the test.
Results are provided only at the state level.
Federal law requires states to participate in NAEP reading and math assessments in grades 4 and 8 every other year.
NAEP Is A Check and Balance
Each state adopts its own criterion-referenced test and sets the standards to be met. No Child Left Behind requires NAEP to be given to ensure states do not use easy tests or low standards.
Each state uses its own tests and standards to determine whether or not it meets the goals (Adequate Yearly Progress) of NCLB.
Comparing the percentage of students who meet the standards on the state test to the percentage of those who meet the NAEP standards can reveal the rigor of the state test.
NAEP vs. CRCT Results
NAEP scores are reported as Advanced, Proficient, Basic, and Below Basic. The expectation is that students will be Proficient or Advanced.
CRCT scores are reported as Exceeding the Standard, Meeting the Standard, and Did Not Meet the Standard.
Because one test has four score categories and the other has three, it is difficult to compare them directly. The fairest way is to compare the percent Below Basic on NAEP with the percent Not Meeting the Standard on the CRCT.
Comparing the percent of students who scored "Below Basic" on NAEP and "Does Not Meet Standard" on CRCT in 2003
[Bar chart: NAEP vs. CRCT percentages for 4th Grade Reading, 4th Grade Math, 8th Grade Reading, and 8th Grade Math; vertical axis 0%–45%]
NAEP vs. CRCT Results
The closer the results are, the more likely it is that the tests have a similar level of difficulty and a similar cut score set.
4th Grade Math has very similar results on the two tests.
8th Grade Math is not quite as close.
The largest disparity is in 4th Grade Reading; 8th Grade Reading also shows a large difference in the results.
It appears that the NAEP Reading tests are more rigorous than the Georgia CRCT in Reading.
Where To Find Georgia Assessment Data
Office of Student Achievement: www.gaosa.org
Georgia Department of Education: www.gadoe.org
Georgia School Council Institute: www.GeorgiaEducation.org
Georgia Public Policy Foundation: www.gppf.org
Where To Find National Data
Standard & Poor's: www.schoolmatters.org
Education Trust: www.edtrust.org
NAEP: http://nces.ed.gov/nationsreportcard/
Resources
Pages 2.3–2.10 in the Georgia School Council GuideBook are a step-by-step guide to analyzing test scores.
The GuideBook and this presentation are written using the Georgia School Council Institute’s website. Test scores are found in the Center for School Performance section at www.GeorgiaEducation.org.
Purpose of Data Analysis
Are all students learning what we expect them to know?
Which students are not succeeding?
How do we improve the achievement of all students?
“That which is not measured cannot be improved.”
Three Levels of Test Data
There are three levels of test data available to the public:
State Level
System Level
School Level
Individual schools have class level and student level results.
Analyzing Test Data
Begin with state level data.
Begin with State Level Data
Understanding the state statistics helps put your school and system data into perspective.
Learning the terminology helps you identify what is relevant.
First, look at the Profile Report to see the demographics of the state and the changes that are occurring.
State Level Profile Report
When you are provided data using percentages, always be clear as to whether you are looking at a change in percent or a change in percentage points.
Understanding the Trends
When changes in percentages are listed, it is a change in percentage points, not the percent change itself.
There was a 3 percentage point increase in Hispanic students' share of enrollment (4% in 2000 and 7% in 2004).
48,366 more Hispanic students is an 87 percent increase from 2000 to 2004.
7% of the students in 2004: 1,486,125 x .07 = 104,029
4% of the students in 2000: 1,391,579 x .04 = 55,663
Percent increase: 48,366 / 55,663 = 87%
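The two measures on this slide can be written as small functions. The enrollment totals and percentages come from the slide; `point_change` and `percent_change` are just names chosen for this sketch.

```python
# Percentage-point change compares two shares directly; percent change
# compares a new count to the old count it grew from.

def point_change(pct_new, pct_old):
    """Change in percentage points."""
    return pct_new - pct_old

def percent_change(count_new, count_old):
    """Percent change relative to the old count."""
    return round(100 * (count_new - count_old) / count_old)

hispanic_2000 = round(1_391_579 * 0.04)   # about 55,663 students
hispanic_2004 = round(1_486_125 * 0.07)   # about 104,029 students
print(point_change(7, 4))                            # 3 percentage points
print(percent_change(hispanic_2004, hispanic_2000))  # 87 percent increase
```

The same small share of enrollment can hide a large percent change: 3 percentage points of share, but an 87 percent jump in the number of students.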
Understanding the Trends
Percentage point changes tell only part of the story.
What are the demographics of the students moving into the schools?
If Georgia gained 94,546 students in five years and 48,366 of them were Hispanic, then 51% (48,366 / 94,546) of the new students were Hispanic.
It is as important to know what the population trends are in your school as it is to know the demographic percentages.
Discuss
The number of Hispanic students has grown tremendously, but the percent of students in the Limited English Proficient (LEP) program has remained steady. What conclusions can you draw?
What impact on student achievement could changing demographics have?
Pop Quiz
In 2000, 43% of all 1,391,579 students were eligible for free or reduced lunch (FRL).
In 2004, 46% of all 1,486,125 students were eligible for FRL.
How many more students are eligible in 2004?
What is the percent increase in those eligible?
Answers
In 2000: 598,379 students were eligible for FRL.(1,391,579 x .43 = 598,379)
In 2004: 683,618 were eligible.(1,486,125 x .46 = 683,618)
85,239 more students were eligible for FRL.(683,618 - 598,379 = 85,239)
That is a 14% increase in the number of students eligible. (85,239 / 598,379 = 14%)
Analyzing Test Data
Next, look at test scores for all the students at the state level.
State Level Test Scores Report
What Do the Numbers Say?
The percent Exceeding the Standard should be going up, and the percent Not Meeting the Standard should be going down.
If the percent Meeting the Standard drops, which category is increasing? Are more students moving to a higher level or a lower level?
Are there changes in the number of students tested?
Has improvement been greater in one subject? In one particular grade? Is there a reason?
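The check described above — when the percent Meeting drops, did students move up or down? — can be sketched with two years of hypothetical category percentages. The numbers below are invented for illustration.

```python
# Percentage-point change in each scoring category between two years.
# A drop in "Meets" is good news if "Exceeds" absorbed it, bad news if
# "Does Not Meet" did.

def category_shifts(year1, year2):
    return {cat: year2[cat] - year1[cat] for cat in year1}

y1 = {"Exceeds": 20, "Meets": 55, "Does Not Meet": 25}  # hypothetical
y2 = {"Exceeds": 26, "Meets": 51, "Does Not Meet": 23}  # hypothetical
print(category_shifts(y1, y2))
```

In this made-up case, Meets fell 4 points, but students moved to a higher level: Exceeds rose 6 points while Does Not Meet fell 2.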
Keep in Mind
Trend information is more important than comparing one year to another.
The same group of students is not being compared.
One year's results alone do not indicate a trend.
Look for Achievement Gaps
After looking at scores for all students, look at the scores for subgroups of students.
This is called disaggregating the data.
Federal and state law require the disaggregation and reporting of scores by ethnicity, gender, socioeconomic status, disability and migrant status.
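Disaggregation in miniature: group hypothetical student records by subgroup and compute each group's percent meeting the standard. The records and group names below are made up; real disaggregation uses the reporting categories the law requires.

```python
# Made-up (subgroup, met_standard) records for illustration only.
from collections import defaultdict

students = [
    ("Group A", True), ("Group A", True), ("Group A", False),
    ("Group B", True), ("Group B", False), ("Group B", False),
]

totals = defaultdict(lambda: [0, 0])   # subgroup -> [met, tested]
for group, met in students:
    totals[group][1] += 1
    if met:
        totals[group][0] += 1

for group, (met, tested) in sorted(totals.items()):
    print(f"{group}: {100 * met // tested}% meeting the standard")
```

The overall pass rate (50% here) hides the gap between the two groups — which is exactly why the scores are disaggregated.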
Look for Achievement Gaps
On the Test Scores Report, click on a subject under “Achievement Gap Analysis” to see the current year’s scores of each subgroup in graph form.
Look for Achievement Gaps
On the Test Scores Report, click on the box labeled “View Scores by Group” and select a subgroup. You will see the disaggregated data by year. This allows you to see the trends in the scores of the subgroup.
Analyzing Achievement Gaps
Compare the scores of the different groups of students. Differences in scores reveal the achievement gaps.
Which group has the highest proportion of students performing below grade level?
Are some groups doing better than others?
Are the differences the same in every subject? In every grade?
Exercise
Jones Elementary is excited because the percent of fourth grade students Exceeding the Standard in reading has increased by 16 percentage points in the last four years.
Some members of the school council are concerned because the percent of students who Did Not Meet the standard in reading did not decrease this year.
Is this grounds for concern?
What should the school council look at to answer this question? Is additional data needed?
What Did You Decide?
Results over time (4 years) are more significant than one year's results.
Look at the number of students tested overall and in each subgroup.
If more students were tested, the lack of change in the percent not meeting the standard is not necessarily grounds for concern.
If some subgroups improved, that is a positive change. If some subgroups lost ground, this may be something for the school council to keep in mind as they look at future data. Think of it as a caution light rather than a red flag.
Analyzing test scores is more than just comparing numbers. It is comparing numbers in a way that puts them in perspective and gives them meaning.
System and School Analysis
Do the same kind of analysis for the school system and your school.
Start with the Profile Report. Look for changes in the student population.
Look at Test Score Reports for all students. In each subject area, check the change in the percent Exceeding and the percent Not Meeting the standard.
Has the number of students being tested changed?
Compare scores to the system and state. Check for achievement gaps.
Comparing Schools or Systems
The unique part of GeorgiaEducation.org is the ability to compare test scores of schools and systems that are demographically similar. The “Similar Systems” and “Similar Schools” Reports will give you this information.
Comparing Schools
You will first see a listing of your school, the state, and schools with similar demographics.
Comparing Test Scores
Click “Test Scores Comparisons” to see the test scores of all the schools.
Comparing Test Scores
A disaggregation box is available.
The list can be sorted by clicking on "M/E" (Meets and Exceeds) to see the schools ranked by the highest percentage of students meeting and exceeding the standard. Clicking on "DNM" (Did Not Meet) puts the lowest achieving at the top.
Each subject can be sorted by achievement.
Comparing Test Scores
Comparing in Graph Form
If you prefer to look at a graph, click on a subject area below "Target schools by subject."
Comparing in Graph Form
The graph will include your school and the five most similar schools.
The SAT Debate
SAT and ACT Scores: What Do They Tell Us?
Both tests are used for college admissions, but they test different skills.
The SAT is more of a critical thinking and problem-solving test designed to measure a student's potential to learn.
The ACT is a more curriculum-based test designed to measure what a student has learned.
SAT Scores
The SAT was designed to predict how well any given student would perform in his or her freshman year of college.
Because the SAT is taken by students in all 50 states, SAT scores are used by the media to rank the quality of public education in the 50 states.
Within the state, SAT scores are used to rank high schools.
Georgia’s low ranking is often attributed to the high percent of students taking the test. Is that a valid argument?
What happens if only states with similar demographics and similar participation rates are compared?
Comparison of 2004 Georgia SAT Scores to States with Similar Participation and Demographics

State  Participation  Average  Rank  % Asian  % Black  % Hispanic  % White  % Other
       Rate           Score
MD     68%            1026     34    7%       27%      5%          58%      3%
VA     71%            1024     35    7%       19%      5%          66%      4%
NJ     83%            1015     38    9%       13%      10%         63%      4%
NY     87%            1007     39    8%       13%      12%         62%      6%
NC     70%            1006     41    3%       23%      3%          68%      3%
DE     73%            999      46    4%       19%      4%          70%      3%
FL     67%            998      47    4%       15%      18%         57%      5%
GA     73%            987      49    5%       28%      3%          61%      3%
Are Our Students Prepared for the SAT?
If the purpose of the SAT is to determine college readiness, students taking it should be on the college prep track.
Is the number of students who receive a college prep diploma similar to the number who take the SAT?
If the percent of students eligible for a HOPE scholarship is used to estimate grade point average, what does that indicate about the preparedness of Georgia’s students?
Students appear to be well-prepared to take the SAT.
[Line chart, 1998–2004: % College Prep Diplomas, % Taking SAT, and % Eligible for HOPE; vertical axis 0%–70%]
SAT Conclusions
Without a student information system, the state cannot know for sure that the students receiving college prep diplomas are the ones taking the SAT, but the close correlation of the numbers suggests they are.
It is also not clear that only those eligible for HOPE are taking the SAT.
Taken together, though, it appears that the students taking the SAT should be considered well-prepared.
How does Georgia compare on the ACT?
[Line chart: national vs. Georgia average ACT composite scores for 1994, 1997, 2001, and 2004; vertical axis 19–21. Georgia's national rank by year: 38th (1994), 41st (1997), 45th (2001), 47th (2004)]
Graduation Rates
The goal of the K-12 system is to graduate students prepared for postsecondary work, whether that is technical school, college, or employment.
Graduation rates and the credentials awarded tell how well that goal was accomplished.
The credentials awarded may be a diploma or a certificate of attendance. Students who earn the required number of Carnegie units but do not pass the graduation test receive a certificate of attendance.
Graduation Rates
There are different ways to determine a graduation rate. Without a student information system, the state cannot track individual students.
Looking at the number of 9th graders and the number of graduates four years later gives an approximation of the graduation rate. Independent analyses which use a statistical variation of this cohort method yield a graduation rate within 3 percentage points of this simple method.
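The simple cohort approximation described above is a single division: graduates in a given year over 9th grade enrollment four years earlier. The enrollment figures below are hypothetical, chosen only to show the calculation.

```python
# Approximate cohort graduation rate: graduates divided by the 9th grade
# class four years earlier. Hypothetical counts for illustration.

def cohort_grad_rate(ninth_grade_enrollment, graduates_four_years_later):
    return round(100 * graduates_four_years_later / ninth_grade_enrollment)

print(cohort_grad_rate(120_000, 78_000))  # 65 percent, approximately
```

Because students transfer in and out, this is an approximation, but as the slide notes, more sophisticated cohort methods land within a few percentage points of it.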
9th Grade Enrollment vs. Completion 4 Years Later
[Bar chart, 1999–2004: 9th grade enrollment vs. graduates four years later vs. diplomas four years later; vertical axis 0–140,000]
Credentials Awarded in 2004
College Preparatory Diploma 46.3%
Both College and Technology 18.6%
Technology/Career Diploma 23.7%
Certificate of Attendance 6.9%
Special Education Diploma 4.5%
Data Analysis Summary
Don't analyze data in a vacuum. Context is critical.
Analyzing the data should lead to additional questions. Consider what other information might be needed to explain it more fully.
Effective data analysis can guide improvement in student learning, classroom instruction, and the school environment.
Always look for the meaning behind the data. Don't take it at face value. Analyze the "spinning" of the data.
Summary
Data analysis is just the beginning of the improvement process. It is how we interpret and use the data that can make a difference.
Data Analysis
This presentation is intended to accompany the Georgia School Council GuideBook.