DOCUMENT RESUME

ED 366 616                                                TM 020 987

AUTHOR          Barton, Paul E.; Coley, Richard J.
TITLE           Testing in America's Schools. Policy Information Report.
INSTITUTION     Educational Testing Service, Princeton, NJ. Policy Information Center.
PUB DATE        94
NOTE            46p.
AVAILABLE FROM  Policy Information Center, Mail Stop 04-R, Educational Testing Service, Rosedale Road, Princeton, NJ 08541-0001 ($7.50 prepaid).
PUB TYPE        Statistical Data (110) -- Reports - Evaluative/Feasibility (142)
EDRS PRICE      MF01/PC02 Plus Postage.
DESCRIPTORS     Ability; Accountability; *Achievement Tests; Criterion Referenced Tests; Disadvantaged Youth; Educational Diagnosis; Educational Practices; *Educational Testing; Elementary Secondary Education; Essay Tests; Ethnic Groups; Instructional Improvement; Language Tests; Mathematics Tests; Multiple Choice Tests; Norm Referenced Tests; Portfolios (Background Materials); Profiles; Program Evaluation; Racial Differences; School Districts; Standardized Tests; *State Programs; Teacher Made Tests; *Testing Programs; *Test Use; Writing Tests
IDENTIFIERS     *Alternative Assessment; North Central Regional Educational Laboratory; *Performance Based Evaluation; State Student Assessment Program Database 1992-93

ABSTRACT
This report provides a profile of state testing programs in 1992-93, as well as a view of classroom testing practices by state, school district, school, or individual teacher. Information, taken from a variety of sources, including the National Assessment of Educational Progress and a General Accounting Office study, indicates that the multiple-choice test remains dominant. The most prevalent purposes of state programs are accountability, instructional improvement, and program evaluation. Virtually all states test in mathematics and language, and most also test in science, writing, and social studies. Thirty-eight programs include writing samples, and 34 states use norm-referenced tests while 34 use criterion-referenced tests. Seventeen use some form of performance assessment, and six collect student portfolios. At least 36 percent of all students were tested in state programs in 1992-93. Only one state uses a norm-referenced test for high school graduation purposes, while 20 use criterion-referenced tests. In the classroom, in contrast, non-multiple-choice tests appear to be the predominant mode. It is also concluded that patterns of traditional and alternative testing in the classroom are very similar for students of different races, ethnicity, ability groups, and resource adequacy. Fifteen figures and three tables present study information. Document notes refer readers to information sources. Four appendix tables amplify the information. (SLD)



[Cover page]

U.S. DEPARTMENT OF EDUCATION
Office of Educational Research and Improvement
EDUCATIONAL RESOURCES INFORMATION CENTER (ERIC)

This document has been reproduced as received from the person or organization originating it. Minor changes have been made to improve reproduction quality. Points of view or opinions stated in this document do not necessarily represent official OERI position or policy.

"PERMISSION TO REPRODUCE THIS MATERIAL HAS BEEN GRANTED BY ... TO THE EDUCATIONAL RESOURCES INFORMATION CENTER (ERIC)."

BEST COPY AVAILABLE


This report was written by Paul E. Barton and Richard J. Coley of the ETS Policy Information Center.

Additional copies of this report can be ordered for $7.50 (prepaid) from:

Policy Information Center
Mail Stop 04-R
Educational Testing Service
Rosedale Road
Princeton, NJ 08541-0001
(609) 734-5694

Copyright © 1994 by Educational Testing Service. All rights reserved. Educational Testing Service is an Affirmative Action/Equal Opportunity Employer.

Educational Testing Service and ETS are registered trademarks of Educational Testing Service.

Contents

Preface 2
Acknowledgments 2
Summary and Highlights 3
Introduction 5
The Purposes of State Testing Programs 6
Subjects Tested 8
Types of Tests Used in State Testing Programs 8
Number of Students Tested 10
Universe and Sampling Approaches 12
Types of Tests Used for Selected State Testing Purposes 14
Test Design of Systemwide Tests 16
Types of Systemwide Tests 16
Teacher Reports on the Frequency of Testing 18
Student Reports on the Frequency of Testing 20
Frequency of Multiple-Choice Testing 22
The Frequency of Teacher-Developed Tests 24
Nontraditional State Assessment 26
The Status of Alternative State Testing Programs 28
Subjects Assessed in Nontraditional State Testing Programs 30
Teacher Use of Different Types of Tests, by Race/Ethnicity 32
Teacher Use of Different Types of Tests, by Ability Group 34
Teacher Use of Different Types of Tests, by Resource Availability 36
In Conclusion 38
Notes and References 38
Appendices 39


Preface

There has been an explosion in testing in the United States over the past 20 years or so. It has almost reached the point where education reform, at least in some circles, has come to mean "more testing." Given the volume of school testing, and the variety of purposes it serves, it is hard to portray accurately both the quantity and kinds of testing occurring in today's classrooms. This report attempts to gather together new information on this topic. As the nation enters a new era of expecting more from tests, while at the same time broadening the use of various forms of performance assessment, we need a baseline from which to observe change.

Paul E. Barton
Director
Policy Information Center

Acknowledgments

Much of the data in this report is drawn from the State Student Assessment Program Database, 1992-93, produced by the Council of Chief State School Officers (CCSSO) and the North Central Regional Educational Laboratory (NCREL). This database contains descriptions of the state testing programs in the United States, and we are grateful to the state testing directors for supplying the information. We are also grateful to the National Assessment of Educational Progress, from which we used data.

The report was reviewed by Ed Roeber of CCSSO and by Linda Bond and Arie van der Ploeg of NCREL. At ETS, the report was reviewed by John Mazzeo, Howard Wainer, and Michael Zieky. Shilpi Niyogi was the editor, David Freund provided data analysis of NAEP, Carla Cooper supplied desktop publishing services, and Ric Bruce designed the cover.


Summary and Highlights

Testing in America's Schools provides a profile of state testing programs in 1992-1993, as well as a view of classroom testing practices, whether required by the state, school district, school, or individual teacher. Tests used at the state or district level, of course, are typically standardized, whether they are multiple-choice or alternative assessments. Teacher-constructed tests typically are non-standardized, and may include a variety of tasks, including multiple-choice, short response, essay, and performance tasks.

The nation is entering an era of change in testing and assessment. Efforts at both the national and state levels are now directed at greater use of performance assessment, constructed-response questions, and portfolios based on actual student work. However, as will be seen in the following pages, the multiple-choice, machine-scored approach is still very much dominant. At the same time, alternative assessment approaches seem to be spreading rapidly, and have reached further into the testing system than many may be aware. The states are moving purposefully in pursuing alternative forms of assessment, but show no signs yet of abandoning traditional assessment programs.

This report tries to convey the present status of assessment in a way that can help to track changes as we head toward the year 2000. It is principally a statistical report, not an attempt to evaluate the quality of the testing programs of the U.S., or make judgments about the adequacy of our nation's testing practices. Highlights are presented below.

State Testing Programs

Purposes. The three most prevalent purposes of state testing programs are accountability, instructional improvement, and program evaluation. Next are student diagnosis/placement and high school graduation.

Subjects. Virtually all states test students in mathematics and language, and most also test in science, writing, and social studies. A few test students in vocational subjects.

Types of Tests. Thirty-eight state programs include writing samples, 34 use norm-referenced tests and 34 use criterion-referenced tests. Seventeen use some form of performance assessment and six collect student portfolios.

Volume of Testing. At least 14.5 million students were tested in 1992-1993, representing at least 36 percent of all K-12 students. However, the testing is concentrated in certain grades, particularly at grades four and eight, where about half of the students are tested. Testing is lightest in kindergarten and grades 1, 2, and 12. Testing for exiting school is limited in the U.S., although it is standard practice in many countries.

Testing for Accountability. Of the states that test for accountability or program evaluation, 41 test every student in a grade. Just one state uses sampling, and three use a combination of sampling and universal testing. The less intensive sampling approach used by the National Assessment of Educational Progress has not been adopted by the states in their efforts to judge the effectiveness of schools and districts.

State testing programs use a variety of tests for different purposes:

For accountability and program evaluation purposes, most state programs use norm- or criterion-referenced tests (36 and 40 programs, respectively). But 22 are using performance tests, eight are using portfolios, and nearly all the state programs use writing samples.

For diagnosis and placement purposes, most state programs use traditional kinds of tests. Twenty-one and 24 programs, respectively, are using norm- and criterion-referenced tests; and 23 use writing samples. Ten use performance tests, and two use portfolios.

For student promotion purposes, three and nine state programs, respectively, are using norm- and criterion-referenced tests; three use a performance test, two use portfolios, and six use writing samples.

For high school graduation purposes, just one state program uses a norm-referenced test, and 20 use criterion-referenced tests. Two are using performance tests, 13 use writing samples, and no state program uses portfolios for this purpose.

Test Design. Seventy percent of tests given in statewide systems are multiple-choice, 12 percent are writing samples, and 18 percent are multiple-subject performance tests.

Kinds of Tests. Eighty percent of the state systemwide tests given are achievement tests, 8 percent are aptitude tests, 6 percent assess vocational interest, 3 percent school readiness, and 3 percent assess other areas.

Alternative Assessment. States are busy developing and implementing alternative forms of assessment. Some are ready to use, some are in pilot stage, some are in the beginning stage, and some have been funded but not yet started. Fourteen states are involved with enhanced multiple-choice items, 18 with short-answer open-ended questions, 22 with extended-response open-ended items, 14 with individual performance assessments, seven with group performance assessments, nine with portfolios or learning records, five with projects or demonstrations, three with interviews, and four with observations.

These alternative item types are being used in the subjects generally tested for accountability purposes.

Testing in the Classroom

While the debate continues about switching more to alternative forms of assessment in systemwide testing, non-multiple-choice testing appears to be the predominant mode in the classroom.

Multiple-choice Tests. At the fourth grade, 6 percent of students have teachers who give multiple-choice tests once or twice a week, 43 percent once or twice a month, and 51 percent yearly or never. The comparable percentages for eighth graders are 4, 30, and 66.

Problem Sets. About half of the fourth graders have teachers who use problem sets once or twice a week, 39 percent once or twice a month, and 9 percent yearly or never. The comparable percentages for grade 8 are 58, 32, and 10.

Written Responses. Forty-four percent of fourth graders are given tests requiring written responses at least monthly, 16 percent once or twice a year, and 40 percent never or hardly ever. For eighth graders the comparable percentages are 44, 22, and 33.

Projects, Portfolios, Presentations. For grade 4 students, 20 percent are given these forms of assessment at least monthly, 25 percent once or twice a year, and 54 percent are never or hardly ever given these assessment types. For grade 8 the comparable percentages are 21, 32, and 47.

Student reports on thefrequency of math tests

Nine percent of students in fourth grade report taking math tests almost every day and 30 percent at least once a week.

At grade 8, 6 percent of students report taking math tests almost every day and 55 percent at least once a week.

Four percent of all twelfth graders report taking math tests almost every day and 50 percent at least once a week.

Of twelfth graders taking math courses, 4 percent report taking math tests almost every day, and 57 percent report taking them at least once a week.

Teacher reports on thefrequency of math tests

Among the 44 jurisdictions participating in the 1992 NAEP math assessment, the percentages of students taking multiple-choice tests at least once a week ranged from zero in Guam and Nebraska to 22 percent in the District of Columbia. The average was 4 percent.

Nationally, about 60 percent of teachers give their own math tests to students at least once a week. The percentages vary from 88 percent in Louisiana to 40 percent in Oregon.

Testing Equity. The patterns of student exposure to traditional and alternative tests in the classroom were very similar for students of different race/ethnicity, students in classes of different ability groupings, and students in classes with differing resource adequacy as reported by their teachers.


Introduction

The nation has launched a serious effort to improve education, and testing is one of the levers being used to raise achievement standards. In the 1970s, we had an explosion in state standardized testing programs. The momentum carried into the 1980s under state leadership in the era of the Excellence Movement. As this gave way to a new wave of reform in the late 1980s, culminating in the Charlottesville Education Summit, calls were made for broadening educational assessment to include performance assessment, portfolios, and constructed responses.

The purpose of this Policy Information Report is to provide a current summary of testing in America's schools. Data are drawn from several sources, but the survey of state assessment programs conducted by the Council of Chief State School Officers and the North Central Regional Educational Laboratory (CCSSO/NCREL) is used as the major source of information on state programs.* Data from the National Assessment of Educational Progress (NAEP) and information from a U.S. General Accounting Office study are also used in this report. Those interested in the details and intricacies of particular state testing programs will need to obtain the CCSSO/NCREL survey (see page 38 for ordering information).

These data sources enable us to view testing in the schools through several different windows, and it is hard to see the whole house through any one of them. The report starts by examining statewide testing programs, where the development and deployment of broader forms of assessment are largely taking place. But that is not all of the standardized testing going on in the schools; individual districts and schools have their own testing programs, beyond what comes from the state capitol. And all this is only a fraction of the testing that occurs in the American classroom. For it is the individual teacher who conducts most of the testing and constructs most of the tests. So we also provide data from teachers on what kinds of tests they use, and how frequently they give them.

The report format presents data graphically on the right-hand page and provides a narrative description of the data, along with information about data sources, on the left-hand page.

*In the 1992-1993 school year, 46 states had statewide testing programs. Iowa, Nebraska, New Hampshire, and Wyoming were the four states without a statewide program. Most states have more than one "program." California's testing program, for example, has three components. The Career-Technical Assessment Project is used to determine a student's readiness to enter the workforce or enter post-secondary study; Golden State Exams are used to qualify students for awards; and Performance Assessment, Grades 4, 5, 8, and 10, provides on-demand assessment used to support the state curriculum frameworks, to facilitate good instruction, and to demonstrate accountability.


The Purposes of State Testing Programs

States test their students for a variety of reasons:

to demonstrate accountability to taxpayers

to use the results to improve instruction and evaluate the effectiveness of education programs

to diagnose student strengths and weaknesses and help in student course placement

to certify that students are ready for the next grade or to graduate from high school, or to certify their competence to employers

to accredit or approve schools or school districts

to identify students and schools for rewards

to certify that children are ready for school

Figure 1 provides a count.* School performance reporting/accountability, instructional improvement, and program evaluation are the major purposes of state testing programs.

*While a few states have only one testing program, or component, most states have several different ones, each designed for a different purpose or purposes. For example, Louisiana uses the Kindergarten Development Readiness Screening Program to screen all entering kindergarten students; employs the Louisiana Education Assessment Program to test grade-appropriate state curriculum skills in grade promotion decisions and to allocate state-funded remediation; and uses the Louisiana Statewide Norm-Referenced Testing Program to provide comparisons with other states, demonstrate school accountability, and provide a basis for program evaluation.

Source for Figure 1: Data are drawn from the "State Student Assessment Program Database, 1992-1993," Council of Chief State School Officers and North Central Regional Educational Laboratory.


Accountability, instructional improvement, and program evaluation are major purposes of state programs.

Figure 1: The Purposes of State Testing Programs, 1992-93

[Figure 1 is a bar chart (axis 0 to 50 states) showing the number of states reporting each purpose: School Reporting/Accountability, Instructional Improvement, Program Evaluation, Student Diagnosis/Placement, High School Graduation, Other, School Accreditation, Student Promotion, High School Skills Guarantee, Student Award or Recognition, School Award or Recognition, and Kindergarten or Grade 1 Readiness.]


Subjects Tested in State Testing Programs

Virtually all of the states that have testing programs assess students in mathematics and language. Most include science, writing, and social studies as well. A few states test students in vocational subjects, as well as to determine readiness and aptitude. Figure 2 shows the number of states reporting assessment by subject.

Types of Tests Used in State Testing Programs

Thirty-eight states use a student writing sample as part of their assessment program. Norm-referenced tests and criterion-referenced tests* are each used by 34 states. Seventeen states use some form of performance assessment and six states collect student portfolios. These counts are shown in Figure 3.

While the multiple-choice tests in this survey have been categorized as norm-referenced or criterion-referenced, this is not the case for the other types of tests reported, which could be either norm- or criterion-referenced.

*In a norm-referenced test, student performance is compared to that of other students taking the same test; in a criterion-referenced test, student performance is compared to a predetermined performance criterion.
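The distinction drawn in the footnote can be made concrete with a small illustrative sketch (not from the report; the peer scores and cutoff below are invented for illustration): a norm-referenced reading ranks a score against other test takers, while a criterion-referenced reading compares it with a fixed cutoff.

```python
from bisect import bisect_left

def norm_referenced_percentile(score, peer_scores):
    # Norm-referenced: compare the student with others taking the same test.
    ranked = sorted(peer_scores)
    return 100 * bisect_left(ranked, score) / len(ranked)

def criterion_referenced_pass(score, cutoff):
    # Criterion-referenced: compare the student with a predetermined criterion.
    return score >= cutoff

peers = [48, 55, 61, 67, 70, 74, 78, 83, 88, 95]
print(norm_referenced_percentile(72, peers))  # 50.0 -- middle of this peer group
print(criterion_referenced_pass(72, 75))      # False -- below the fixed cutoff
```

Note that the same raw score of 72 looks average under the first reading and failing under the second; the two schemes answer different questions.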

Source for Figures 2 and 3: Data are drawn from the "State Student Assessment Program Database, 1992-1993," Council of Chief State School Officers and North Central Regional Educational Laboratory.


State testing programs assess students in core curriculum subjects.

Figure 2: Subjects Tested in State Testing Programs, 1992-93

[Figure 2 is a bar chart (axis 0 to 50 states) showing the number of states testing each subject: Mathematics, Language (Including Reading), Writing, Science, Social Studies, Vocational (4 states), Readiness (3), and Aptitude (2).]

States use a variety of test types in their testing programs.

Figure 3: Types of Tests Used in State Testing Programs, 1992-93

Writing Sample: 38
Norm-Referenced: 34
Criterion-Referenced: 34
Performance Event: 17
Portfolio: 6

(Number of states)


Number of Students Tested in State Testing Programs

During the 1992-1993 school year, the CCSSO/NCREL survey revealed that at least 14.5 million students were tested in state programs (this does not include students tested for district or classroom purposes).* Figure 4 allows a look at the grade levels of the tested students. While relatively few students are tested at the beginning and end of their school years, large numbers of students are tested between grades 3 and 11, with the highest numbers reported in grades 4 and 8.

The survey also reports that the 14.5 million students tested represented at least 36 percent of all students. Nationally, at grades 4 and 8, about half of all students were tested, on average. These numbers vary considerably, however, by state.

While it is common in many countries for graduating secondary school students to be tested, relatively few seniors are tested at all in the U.S., compared to most other grades.

*Since numbers were missing for several states, the counts presented here represent a minimum.
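The 36 percent figure can be checked arithmetically from the report's own numbers. The K-12 enrollment implied by the two figures (a quantity the report does not state directly) is simply the tested count divided by the tested share:

```python
# Both inputs are taken from the text above; the enrollment is derived.
students_tested = 14_500_000   # minimum tested in state programs, 1992-93
share_tested = 0.36            # "at least 36 percent" of all K-12 students

implied_enrollment = students_tested / share_tested
print(f"{implied_enrollment / 1e6:.1f} million K-12 students implied")  # 40.3 million
```

Because both inputs are stated as minimums, the implied enrollment of roughly 40 million is only a rough consistency check, not an enrollment estimate.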

Source for Figure 4: Data are drawn from the "State Student Assessment Program Database, 1992-1993," Council of Chief State School Officers and North Central Regional Educational Laboratory.


Testing peaks in grades 4 and 8, and is lowest in grades K, 1, 2, and 12.

Figure 4: Number of Students Tested in State Testing Programs, by Grade, 1992-93

[Figure 4 is a bar chart of students tested by grade. Legible values: kindergarten, 154,281; grade 1, 105,687; grade 2, 282,592; grade 12, 396,202. Grades 3 through 11 range from roughly 0.9 million to 2.3 million students each, with the two largest bars (2,012,662 and 2,256,164) at grades 4 and 8.]


Universe and Sampling Approaches

States were asked to specify the sampling frame they used for accountability and program evaluation assessment, the largest categories of state testing. The results are shown in Figure 5. Forty-one of the states indicated that they conduct "universal or census" testing, which means that they test all students within a chosen grade or grades.

Only one state, Minnesota, uses a simple random sample of students. Three states use a mixture: Kansas samples students on math and reading, Kentucky samples students on a performance assessment, and Vermont uses sampling in its mathematics assessment. These three states conduct census testing for other program components, however.

Source for Figure 5: Data are drawn from the "State Student Assessment Program Database, 1992-1993," Council of Chief State School Officers and North Central Regional Educational Laboratory.


Nearly all of the states that test for accountability purposes test all students at the targeted grades.

Figure 5: Type of Student Samples Used for State Accountability and Program Evaluation Testing, 1992-1993

Type of Sample (number of states)

Universe or Census: 41
Simple Random: 1
Mixed: 3


Types of Tests Used for Selected State Testing Purposes

As shown earlier, some important purposes of state testing programs are student diagnosis and placement, student promotion, program evaluation and accountability, and high school graduation. Figure 6 shows the types of tests used by state programs* for each of these purposes.

For student diagnosis and placement, traditional tests are typically used: norm- and criterion-referenced tests and writing samples. This pattern also holds for accountability and program evaluation purposes.

Alternative forms of assessment are not frequently used for student diagnosis and placement, although 10 state programs are now using performance assessment for this purpose. For accountability and program evaluation, however, it's interesting to note that 22 state programs use a performance test and eight use portfolios.

For gatekeeping purposes, student promotion and high school graduation, criterion-referenced tests prevail. However, only nine states are using criterion-referenced tests for promotion. In tests for promotion, students are assessed on what they are expected to know to advance within the educational system. Thirteen states also require students to complete a writing sample in order to earn a high school diploma.

*Note that numbers refer to the number of state programs, not the number of states. Some states use more than one testing program for a particular purpose.

Source for Figure 6: Data are drawn from the "State Student Assessment Program Database, 1992-1993," Council of Chief State School Officers and North Central Regional Educational Laboratory.


Alternative forms of assessment are now used in many state programs, particularly for program evaluation and accountability.

[Figure 6: Types of Tests Used for Major State Testing Purposes, 1992-93. Four bar panels, one per purpose (Diagnosis or Placement; Student Promotion; Program Evaluation or Accountability; High School Graduation), chart the number of programs using criterion-referenced tests, writing samples, norm-referenced tests, performance tests, other tests, and portfolio assessments. The horizontal axis is Number of Programs, 0 to 40.]


Test Design of Systemwide Tests

Types of Systemwide Tests

In its report to Congress on testing in the U.S., the General Accounting Office found that while performance and criterion-referenced tests were more popular among educators, actual testing practice differed (Figure 7).

In its 1990-1991 survey of testing, the GAO found that multiple-choice tests constituted 70 percent of all tests given. Writing samples or multiple-choice tests with writing samples comprised 12 percent; multiple-subject performance tests constituted 18 percent.

That same GAO study categorized systemwide tests according to the main purpose intended by the test-makers. As shown in Figure 8, 80 percent of all such tests taken in 1990-91 were achievement tests, those that attempt to measure a student's accumulated knowledge or skill. Most of these were commercial tests and many were adapted to match a state's curriculum.

Another 8 percent were designed to measure aptitude or ability, e.g., an intelligence test. Three percent of the systemwide tests were designed to measure school "readiness."


Source for Figures 7 and 8: Data are drawn from U.S. General Accounting Office, Student Testing: Current Extent and Expenditures, with Cost Estimates for a National Examination, January 1993.


Multiple-choice tests continue to comprise the great majority of systemwide tests in the United States.

Figure 7: Test Design Features for Systemwide Tests in the U.S., 1990-1991

Multiple-choice Tests (70%)

Writing Samples or Multiple-choice Tests with Writing Sample (12%)

Multiple-subject Performance Tests (18%)

Achievement tests are the most commonly used systemwide tests in the United States.

Figure 8: Types of Systemwide Tests in the U.S., 1990-1991

Achievement (80%)

Aptitude or Ability (8%)

Vocational Interest (6%)

Readiness (3%)

Other (3%)


Teacher Reports on the Frequency of Testing

In the 1992 mathematics assessment, the National Assessment of Educational Progress (NAEP) asked teachers how frequently they assessed their students using four different types of tests:

Multiple-choice tests
Problem sets*
Written responses
Projects, portfolios, or presentations

The results are summarized in Figure 9 for grades 4 and 8. In general, these data indicate that teachers are using what have been called "alternative" types of assessment in their classrooms. At both grade levels, over half of the students were taught by teachers who used problem sets once or twice a week; four out of ten were given assignments requiring written responses at least monthly; and about one-fifth were given projects, portfolios, or presentations at least monthly. Only 6 and 4 percent of fourth and eighth graders, respectively, were given multiple-choice tests once or twice a week.

While these data indicate that teachers are emphasizing alternative assessment much more than might be expected, and using multiple-choice tests less frequently, there is room for greater use of alternative assessments. Figure 9 also shows that about half of the students at both grade levels had teachers who never or hardly ever used projects, portfolios, or presentations as assessment tools in their classrooms.

*Problem sets are a set of one to five problem situations or word problems, taken from the textbook or constructed by the teacher (written or verbal).


In Figure 9, note that the frequency categories for multiple-choice tests and problem sets are different from those for written responses and projects, portfolios, and presentations.

These teacher reports cover all the tests they give in the classroom, including tests they make themselves, while most previous pages in this report refer to statewide testing programs.

Source for Figure 9: Data are drawn from National Center for Education Statistics, Data Compendium for the NAEP 1992 Mathematics Assessment of the Nation and the States, Educational Testing Service, May 1993, p. 507. While these data are drawn from the Teacher Questionnaire, the data are student-based. That is, the percentage reported is the percentage of students whose teachers report...

See Appendix Table 1 for standard errors.


Many teachers are using alternative assessment within their classrooms once or twice a week; multiple-choice tests are more likely to be used once or twice a month or less.

Figure 9: Teacher Reports on the Frequency of Testing, 1992-93

[Figure 9: Four pairs of bar panels (Grade 4 and Grade 8) show the percentage of students whose teachers report each testing frequency for multiple-choice tests, problem sets, written responses, and projects, portfolios, or presentations. The horizontal axis is Percentage of Students, 0 to 70. See Appendix Table 1 for the figures and standard errors.]


Student Reports on the Frequency of Testing

The 1992 NAEP math assessment also asked students how often they were tested in math. The results are shown in Figure 10.

In the fourth grade, about 40 percent of the students were tested at least once a week. By the eighth grade this percentage increased to 60 percent.

Among all twelfth graders, just over half were tested once a week or more, and among twelfth graders taking math, that figure was just over 60 percent.


Source for Figure 10: Data are drawn from National Center for Education Statistics, Data Compendium for the NAEP 1992 Mathematics Assessment of the Nation and the States, Educational Testing Service, May 1993, p. 508.

See Appendix Table 2 for standard errors.


About half of 8th and 12th graders say they are tested in math at least weekly.

Figure 10: Student Reports on How Often They Take Math Tests, 1992

[Figure 10: Bar panels for Grade 4, Grade 8, All 12th Graders, and 12th Graders Taking Math show the percentage of students reporting they take math tests Almost Every Day, At Least Once a Week, or Less Than Weekly. The horizontal axis is Percentage of Students, 0 to 70. See Appendix Table 2 for the figures and standard errors.]


Frequency of Multiple-Choice Testing

The 1992 NAEP math assessment also provides separate information for each of 44 participating jurisdictions. Figure 11 shows the percentage of students whose teachers reported that they used multiple-choice tests to assess student progress at least once a week.

The percentages range from zero in Guam and Nebraska to 22 percent in the District of Columbia. The average for the nation was 4 percent.


Source for Figure 11: Data are drawn from National Center for Education Statistics, Data Compendium for the NAEP 1992 Mathematics Assessment of the Nation and the States, Educational Testing Service, May 1993, p. 510.

See Appendix Table 3 for standard errors.


Few teachers reported testing students with multiple-choice tests in math at least once a week.

Figure 11: Percentage of 8th Graders Whose Teachers Report They Use Multiple-choice Tests at Least Once a Week, by Jurisdiction, 1992

[Figure 11: A bar chart ranks the participating jurisdictions from Guam and Nebraska (0 percent) up to the District of Columbia (22 percent). The horizontal axis is Percentage of Students, 0 to 20. See Appendix Table 3 for the figures and standard errors.]


The Frequency of Teacher-Developed Tests

In the 1990 NAEP math assessment, teachers in participating jurisdictions were asked to indicate how often they gave teacher-developed tests to their math students. Data for these 40 jurisdictions are shown in Figure 12.

A little under half of the eighth grade students in Oregon, Minnesota, Oklahoma, and Michigan had teachers who gave their own tests at least once a week. At the other end of the scale, more than eight out of 10 students in Alabama, the District of Columbia, Rhode Island, and Louisiana had teachers who used their own tests on a weekly, or more frequent, basis. The national average was 61 percent.


Source for Figure 12: National Center for Education Statistics, The STATE of Mathematics Achievement: NAEP's 1990 Assessment of the Nation and the Trial Assessment of the States, Educational Testing Service, June 1991, p. 321.

See Appendix Table 4 for standard errors.


Nationally, about 60 percent of teachers give their own math tests to students at least once a week. The percentage varies from 88 percent in Louisiana to 40 percent in Oregon.

Figure 12: Percentage of 8th Graders Whose Teachers Report Using Teacher-Generated Math Tests Once or More a Week, by Jurisdiction, 1990

[Figure 12: A bar chart ranks the 40 jurisdictions from Oregon (40 percent) up to Louisiana (88 percent). The horizontal axis is Percentage of Students, 30 to 90. See Appendix Table 4 for the figures and standard errors.]


Non-traditional State Assessment

As shown in Figure 13, as well as in several of the following sections, the states are busy developing and implementing nontraditional item types. Figure 13 provides an overall sense of this activity level. While the chart gives a dot to a state regardless of the development stage of the assessment (some states have funded an assessment type while others have already implemented the assessment), the direction of state activity is clear.

Extended-response open-ended items, short answer open-ended items, and enhanced multiple-choice items* were the most frequent alternatives cited. Individual and group performance assessments and portfolio or learning records are also being used by a considerable number of states. A few states are using interviews, observations, projects, exhibitions, or demonstrations in their state testing programs.

"The states responding to the survey made their own interpre-tations of what was meant by "enhanced multiple-choice."Generally, these are efforts to extend a multiple-choice itemby asking for such things as an explanation of why a particu-lar option was chosen, or having more than one correctanswer.


Source for Figure 13: Data are drawn from the "State Student Assessment Program Database, 1992-1993," Council of Chief State School Officers and North Central Regional Educational Laboratory.

Note: A marker in a cell of Figure 13 means that the state is in some phase of developing the assessment type: funded, not started; begun development; completed development; piloted, being refined; or ready for use. States reporting that they "want to develop" the assessment type are not given a marker in the cell.


Many states are beginning to make considerable use of nontraditional items in their statewide testing programs.

[Figure 13: The Use of Nontraditional Test Items in State Assessment Programs, 1992-1993. A matrix marks, for each state, activity in ten item types: enhanced multiple-choice; short answer open-ended; extended-response open-ended; interview; observation; individual performance assessment; group performance assessment; portfolio or learning record; project, exhibition, demonstration; and other.]


The Status of Alternative State Testing Programs

Figure 14 provides a closer look at the status of state testing programs that use alternative item types. Each type of assessment is shown for each development category: ready to use; piloted, being refined; begun or completed development; and funded, not started.

A considerable number of states are ready to implement (or are already using) extended-response open-ended items, enhanced multiple-choice items, individual performance assessments, and short answer open-ended items. A number of states are also moving ahead in developing and piloting a variety of alternative item types.


Source for Figure 14: Data are drawn from the "State Student Assessment Program Database, 1992-1993," Council of Chief State School Officers and North Central Regional Educational Laboratory.

Note: In Figure 14, a state could be counted more than once if, for example, it had an enhanced multiple-choice test that was ready to use and another that was being piloted. Thus, the counts differ from Figure 13.


A considerable number of states are using, or are ready to use, alternative item types in their state testing programs.

[Figure 14: Number of States Using Alternative Item Types, 1992-1993, by Status. Four bar panels (Ready to Use; Begun or Completed Development; Piloted, Being Refined; Funded, Not Started) chart the number of states at each stage for each alternative item type: extended-response open-ended, enhanced multiple-choice, individual performance assessment, short answer open-ended, portfolio or learning record, group performance assessment, observation, project/exhibition/demonstration, interview, and other. The horizontal axis is Number of States, 0 to 20.]


Subjects Assessed in Non-traditional State Testing Programs

As might be expected, state alternative assessments are given in the same subjects that are the province of traditional multiple-choice tests. Figure 15 shows the variety of subjects in which nontraditional item types are used. The subjects of writing, mathematics, reading, science, and social studies top the list.


Source for Figure 15: Data are drawn from the "State Student Assessment Program Database, 1992-1993," Council of Chief State School Officers and North Central Regional Educational Laboratory.


States are using nontraditional items to assess student progress in the traditional subjects of writing, math, reading, and science.

Figure 15: Number of States Developing or Using Nontraditional Test Items, by Subject, 1992-1993

Writing 35
Math 29
Reading 21
Science 18
Social Studies 14
Other 5
Health Education 4
History 4
Physical Education 4
Speaking, Listening 4
Vocational Education 4
Foreign Languages 3
Geography 3
Music 3
Visual Arts 3

(Horizontal axis: Number of States, 0 to 40.)


Teacher Use of Different Types of Tests, by Race/Ethnicity

The 1992 NAEP mathematics assessment provides data that can be used to determine whether different groups of students are exposed to different assessment types. One can see, for example, whether students from minority groups, students in lower ability groups, or students in classrooms with lower levels of resources available to their teachers are being given different types of tests.

Table 1 shows the extent of multiple-choice tests, problem sets, written responses, and projects or portfolios given to students grouped by race/ethnicity. The pattern is about the same among the racial/ethnic groups after taking sampling error into account, and there are few differences that are significant.
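The idea of comparing two percentages "after taking sampling error into account" can be sketched numerically. The snippet below is an illustrative simplification, not the significance procedure used for NAEP itself (which relies on jackknife standard errors and adjustments for multiple comparisons): it treats two estimates as independent and forms an approximate z statistic from their reported standard errors.

```python
import math

def z_stat(p1, se1, p2, se2):
    """Approximate z statistic for the difference between two independent
    percentage estimates, using their reported standard errors."""
    return (p1 - p2) / math.sqrt(se1 ** 2 + se2 ** 2)

# Illustrative values from Table 1: weekly multiple-choice testing,
# White 4% (SE 1.1) vs. Black 6% (SE 3.0).
z = z_stat(4, 1.1, 6, 3.0)
significant = abs(z) > 1.96  # roughly the 5 percent level, two-sided
print(round(z, 2), significant)  # prints: -0.63 False
```

The large standard errors swamp the 2-point gap, which is why so few of the apparent differences in Table 1 are significant.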


Data in Table 1 are drawn from NAEP 1990, 1992 National Math Assessment, Data Almanac, Grade 8: Teacher Questionnaire, Weighted Percentages and Composite Proficiency Means.


Table 1: Teachers' Reports on How Frequently They Assess Students with Different Assessment Types, Grade 8, by Race/Ethnicity, 1992

                       White      Black      Hispanic   Asian      American Indian

Multiple-Choice
Once/Twice a Week      4% (1.1)   6% (3.0)   6% (1.5)   8% (4.1)   1% (0.6)
Once/Twice a Month     28 (3.1)   40 (3.4)   32 (3.0)   28 (5.7)   22 (5.6)
Once/Twice a Year      29 (2.6)   24 (3.9)   27 (3.3)   28 (5.0)   31 (4.6)
Never                  40 (2.5)   30 (3.4)   36 (2.6)   36 (5.2)   47 (6.7)

Problem Sets
Once/Twice a Week      61 (2.7)   50 (3.2)   51 (3.4)   63 (5.5)   67 (9.6)
Once/Twice a Month     29 (2.8)   43 (3.7)   38 (3.6)   33 (6.0)   26 (8.1)
Once/Twice a Year      6 (1.6)    4 (1.3)    7 (1.6)    3 (1.2)    5 (2.4)
Never                  5 (1.2)    3 (1.0)    4 (1.2)    2 (1.8)    3 (2.2)

Written Responses
Once/Twice a Week      8 (1.6)    12 (2.9)   14 (2.2)   8 (2.2)    3 (2.0)
Once/Twice a Month     37 (3.1)   33 (4.3)   31 (2.7)   30 (5.6)   20 (6.2)
Once/Twice a Year      22 (2.5)   26 (3.8)   22 (3.3)   25 (5.3)   28 (6.6)
Never                  34 (3.2)   29 (3.0)   34 (4.1)   38 (7.1)   49 (9.2)

Projects or Portfolios
Once/Twice a Week      2 (0.7)    5 (1.9)    3 (0.8)    3 (1.0)    1 (0.5)
Once/Twice a Month     18 (2.1)   19 (3.2)   20 (2.3)   25 (4.5)   16 (6.1)
Once/Twice a Year      32 (2.0)   32 (2.7)   32 (2.8)   21 (3.0)   31 (7.4)
Never                  48 (3.0)   44 (3.8)   45 (3.3)   51 (4.7)   52 (8.4)

Percentages are followed by standard errors (in parentheses).


Teacher Use of Different Types of Tests, by Ability Group

Table 2 shows how frequently teachers use different assessment types for each student ability group: low, middle, high, and mixed. Again, the pattern is very similar among the ability groups.


Data in Table 2 are drawn from NAEP 1990, 1992 National Math Assessment, Data Almanac, Grade 8: Teacher Questionnaire, Weighted Percentages and Composite Proficiency Means.


Table 2: Teachers' Reports on How Frequently They Assess Students with Different Assessment Types, Grade 8, by Ability Grouping, 1992

                       Low        Middle     High       Mixed

Multiple-Choice
Once/Twice a Week      2% (0.8)   5% (1.6)   3% (1.3)   5% (1.9)
Once/Twice a Month     30 (3.6)   35 (3.7)   22 (3.7)   29 (4.4)
Once/Twice a Year      28 (3.9)   25 (3.9)   34 (4.5)   27 (4.7)
Never                  40 (3.9)   35 (3.1)   42 (3.1)   39 (5.6)

Problem Sets
Once/Twice a Week      54 (6.4)   58 (3.6)   67 (3.5)   50 (5.8)
Once/Twice a Month     29 (3.8)   33 (3.9)   26 (3.3)   41 (5.3)
Once/Twice a Year      9 (3.5)    6 (1.7)    3 (1.4)    3 (1.6)
Never                  9 (4.3)    3 (1.0)    4 (0.9)    6 (2.4)

Written Responses
Once/Twice a Week      7 (1.9)    7 (1.5)    13 (3.5)   8 (2.1)
Once/Twice a Month     28 (4.6)   45 (3.2)   31 (3.6)   29 (4.0)
Once/Twice a Year      26 (4.5)   21 (2.6)   21 (2.5)   24 (4.7)
Never                  39 (5.3)   27 (2.9)   35 (4.7)   39 (4.6)

Projects or Portfolios
Once/Twice a Week      1 (0.4)    3 (1.6)    2 (0.9)    3 (1.2)
Once/Twice a Month     18 (3.0)   20 (3.0)   19 (2.8)   17 (3.8)
Once/Twice a Year      31 (4.9)   33 (3.6)   32 (4.7)   30 (3.9)
Never                  50 (5.9)   45 (3.7)   47 (4.7)   51 (4.6)

Percentages are followed by standard errors (in parentheses).


Teacher Use of Different Assessment Types, by Resource Availability

Table 3 shows the types of assessments students are exposed to when grouped by their teachers' estimation of the sufficiency of classroom resources. The pattern does not vary much with differences in the availability of classroom resources.


Data in Table 3 are from the 1992 National Assessment of Educational Progress Mathematics Assessment and were calculated by special computer runs conducted at Educational Testing Service.


Table 3: Teachers' Reports on How Frequently They Assess Students with Different Assessment Types, Grade 8, by Resource Availability, 1992

                       Get All Needed   Get Most of        Get Some or None
                       Resources        Needed Resources   of Needed Resources

Multiple-Choice
Once/Twice a Week      10% (4.5)        2% (0.9)           5% (1.8)
Once/Twice a Month     30 (6.3)         35 (4.2)           23 (3.2)
Once/Twice a Year      21 (4.6)         27 (3.1)           32 (4.9)
Never                  39 (6.3)         36 (3.4)           40 (4.0)

Problem Sets
Once/Twice a Week      56 (7.1)         58 (2.9)           59 (4.6)
Once/Twice a Month     32 (7.6)         31 (2.8)           33 (4.0)
Once/Twice a Year      8 (5.2)          6 (1.6)            4 (1.3)
Never                  5 (2.9)          5 (1.2)            4 (1.5)

Written Responses
Once/Twice a Week      14 (3.7)         6 (1.5)            11 (2.5)
Once/Twice a Month     35 (5.6)         35 (3.6)           36 (3.8)
Once/Twice a Year      14 (4.2)         24 (3.0)           24 (2.9)
Never                  37 (6.6)         35 (3.9)           30 (4.9)

Projects or Portfolios
Once/Twice a Week      2 (2.1)          2 (0.6)            3 (1.3)
Once/Twice a Month     23 (4.8)         19 (2.2)           17 (3.8)
Once/Twice a Year      26 (5.9)         30 (3.6)           35 (4.0)
Never                  48 (7.5)         50 (3.9)           44 (4.6)

Percentages are followed by standard errors (in parentheses).


In Conclusion

There is a lot of testing in American schools. A visitor who had not seen our schools for 20 years would likely see the volume of testing as the most noticeable change in American education. It appears that the future holds more testing, unless the alternative assessments under development start taking the place of traditional standardized tests.

Testing is very much alive in our schools, but a national investigation of testing has been under way now for a few years. The jury is still out on the condition of the testing system. The statistics in this report tell a lot about the quantity and type of testing, but nothing about its impact and its value. One feature of testing that needs much closer examination is the "test all students" approach for accountability in 41 state systems, with just one state sampling students (and three states with mixed systems). The National Assessment of Educational Progress has been operating on a sampling basis for over 20 years. Its sampling approach has been used at the state level and could be used at the district level, as well. This approach can serve accountability needs without intruding so much on instructional time; testing every student in a class can then be reserved for serving instructional purposes.

The progress in developing alternative item types in statewide programs (constructed responses, performance assessments, and portfolios) appears to be substantial. And when we look at the total of testing that goes on in the classroom, including teacher-constructed testing, alternative approaches appear to be doing well. There is, however, wide variation among the states in the frequency of multiple-choice testing. The individual teacher creates the tests most closely aligned with instruction, and this also varies considerably among the states, with 60 percent of 8th grade students getting such tests in mathematics at least once a week. Perhaps we need more attention given to helping teachers construct good assessments.

It remains to be seen whether these performance-type assessments will begin to substantially displace the traditional multiple-choice, machine-scored tests that have predominated. We do note that seven in 10 tests in state testing systems are still multiple-choice. Establishing a baseline, as we attempt to do in this report, will permit educators and policy makers to track the profile of testing in this period of ferment and change.

Notes and References

The State Student Assessment Program Database, 1992-1993 was produced by the Council of Chief State School Officers and the North Central Regional Educational Laboratory using results from a statewide survey of state testing programs. For information about database content, contact:

Edward Roeber, Director, Student Assessment Programs, CCSSO, 202-386-7045

Linda Bond, Director of Assessment, Regional Policy Information Center, NCREL, 317-328-9704

Student Testing: Current Extent and Expenditures, with Cost Estimates for a National Examination was prepared for Congress by the U.S. General Accounting Office. First copies of GAO reports are free. Additional copies are $2 each.


Order No. GAO/PEMD-93-8 from: GAO, PO Box 6015, Gaithersburg, MD 20877, or call 202-275-6241.

The NAEP 1992 Mathematics Report Card for the Nation and the States was prepared by Educational Testing Service under contract with the National Center for Education Statistics. For ordering information, write to:

Education Information Branch
Office of Educational Research and Improvement
U.S. Department of Education
555 New Jersey Ave., NW
Washington, DC 20208-5641

The Data Compendium for the NAEP 1992 Mathematics Assessment of the Nation and the States was prepared by Educational Testing Service under contract with the National Center for Education Statistics. For information, write to the address listed above.

The STATE of Mathematics Achievement: NAEP's 1990 Assessment of the Nation and the Trial Assessment of the States was prepared by Educational Testing Service under contract with the National Center for Education Statistics. For information, write to the address listed above.


Appendix Table 1: Teachers' Reports on the Frequency of Testing (Percentage of Students with Standard Error Reported in Parentheses)

Multiple-Choice Tests     Grade 4    Grade 8
Once or Twice a Week      6 (1.0)    4 (1.0)
Once or Twice a Month     43 (2.8)   30 (2.5)
Yearly or Never           51 (2.7)   66 (2.8)

Problem Sets              Grade 4    Grade 8
Once or Twice a Week      53 (2.8)   58 (2.3)
Once or Twice a Month     39 (2.3)   32 (2.4)
Yearly or Never           9 (1.4)    10 (1.7)

Written Responses         Grade 4    Grade 8
At Least Monthly          44 (2.6)   44 (2.7)
Once or Twice a Year      16 (1.5)   22 (2.0)
Never or Hardly Ever      40 (2.0)   33 (2.7)

Projects, Portfolios, or Presentations   Grade 4    Grade 8
At Least Monthly          20 (1.7)   21 (2.0)
Once or Twice a Year      25 (1.8)   32 (2.5)
Never or Hardly Ever      54 (2.4)   47 (2.6)
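A brief reader aid (not part of the original report): each entry in these tables is an estimated percentage followed by its standard error in parentheses, and an approximate 95 percent confidence interval is the estimate plus or minus about two standard errors. A minimal sketch of that calculation, using the grade 4 multiple-choice figure from Appendix Table 1 as the example:

```python
# Build an approximate 95% confidence interval for a reported
# "percentage (standard error)" entry, using the usual normal
# approximation (1.96 standard errors on each side).

def confidence_interval_95(pct, se):
    """Return (low, high) bounds for a reported percentage."""
    margin = 1.96 * se
    return (pct - margin, pct + margin)

# Appendix Table 1: 6 (1.0) -- 6% of grade 4 students have teachers
# who use multiple-choice tests once or twice a week.
low, high = confidence_interval_95(6, 1.0)
print(f"approx. 95% CI: {low:.1f}% to {high:.1f}%")  # about 4.0% to 8.0%
```

So the true grade 4 percentage is likely between roughly 4 and 8 percent; the same reading applies to every entry in the appendix tables.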

Appendix Table 2: Students' Reports on How Often They Take Mathematics Tests, Grades 4, 8, and 12, NAEP, 1992 (Percentage of Students with Standard Error Reported in Parentheses)

                                              Grade 12       Grade 12
                      Grade 4    Grade 8    All Students    Taking Math
Almost Every Day      9 (0.6)    6 (0.3)      4 (0.3)        4 (0.4)
At Least Once a Week  30 (1.2)   55 (1.2)     50 (1.2)       57 (1.4)
Less Than Weekly      61 (1.5)   39 (1.3)     46 (1.2)       39 (1.5)


Appendix Table 3: Percentage of Students Whose Teachers Report They Use Multiple-Choice Tests at Least Once a Week, 1992 (Percentage of Students with Standard Error Reported in Parentheses)

Alabama                 5 (1.7)
Arizona                 3 (1.2)
California              2 (0.7)
Colorado                3 (1.0)
Connecticut             2 (0.7)
Delaware                3 (0.2)
District of Columbia    22 (0.8)
Florida                 9 (1.9)
Georgia                 6 (1.3)
Hawaii                  2 (0.2)
Idaho                   4 (1.3)
Indiana                 3 (1.1)
Iowa                    1 (1.1)
Kentucky                2 (0.4)
Louisiana               8 (2.4)
Maine                   1 (1.1)
Maryland                3 (1.1)
Massachusetts           3 (1.2)
Michigan                4 (1.9)
Minnesota               2 (1.0)
Missouri                1 (0.2)
Nebraska                0 (0.2)
New Hampshire           4 (2.3)
New Jersey              7 (2.0)
New Mexico              1 (0.9)
New York                7 (2.2)
North Carolina          2 (0.8)
North Dakota            4 (2.0)
Ohio                    2 (1.0)
Oklahoma                5 (1.8)
Pennsylvania            1 (0.9)
Rhode Island            2 (0.2)
South Carolina          6 (1.8)
Tennessee               5 (1.7)
Texas                   4 (1.5)
Utah                    4 (1.5)
Virginia                3 (1.0)
West Virginia           3 (1.1)
Wisconsin               1 (0.6)
Wyoming                 1 (0.2)

Appendix Table 4: Teachers' Reports on How Often Students Take Teacher-Generated Mathematics Tests (Percentage of Students with Standard Error Reported in Parentheses)

                        At Least Several   About Once
                        Times a Week       a Week
Alabama                 9 (2.8)            70 (3.1)
Arizona                 7 (2.2)            47 (2.7)
Arkansas                6 (2.3)            61 (3.1)
California              5 (1.2)            52 (3.6)
Colorado                6 (1.5)            50 (3.8)
Connecticut             5 (1.7)            66 (2.9)
Delaware                2 (0.3)            65 (1.1)
District of Columbia    27 (1.0)           55 (1.1)
Florida                 11 (1.8)           61 (3.1)
Georgia                 7 (1.7)            66 (3.4)
Hawaii                  4 (0.4)            51 (0.9)
Idaho                   2 (0.4)            51 (2.2)
Illinois                4 (1.7)            46 (4.2)
Indiana                 5 (1.5)            49 (4.0)
Iowa                    6 (2.9)            44 (4.8)
Kentucky                6 (1.8)            52 (4.4)
Louisiana               11 (2.4)           77 (3.6)
Maryland                4 (1.4)            65 (3.4)
Michigan                5 (1.6)            44 (3.7)
Minnesota               3 (1.1)            41 (3.5)
Montana                 3 (0.8)            50 (2.9)
Nebraska                5 (1.2)            49 (3.5)
New Hampshire           5 (0.8)            69 (1.3)
New Jersey              11 (2.5)           64 (3.7)
New Mexico              4 (0.6)            64 (1.5)
New York                6 (2.1)            68 (3.3)
North Carolina          10 (1.6)           60 (3.2)
North Dakota            10 (0.6)           46 (3.0)
Ohio                    4 (1.3)            63 (3.8)
Oklahoma                2 (0.8)            45 (3.6)
Oregon                  3 (1.1)            37 (3.3)
Pennsylvania            5 (1.4)            66 (3.3)
Rhode Island            12 (1.0)           72 (1.6)
Texas                   3 (1.1)            65 (3.5)
Virginia                12 (2.2)           57 (3.4)
West Virginia           8 (2.1)            49 (4.6)
Wisconsin               5 (1.8)            52 (3.7)
Wyoming                 4 (0.7)            46 (1.2)
Guam                    10 (0.6)           41 (0.7)
Virgin Islands          5 (0.4)            58 (0.9)
