
DOCUMENT RESUME

ED 080 582 TM 003 098

TITLE State Educational Assessment Programs. 1973 Revision.

INSTITUTION Educational Testing Service, Princeton, N.J.; Education Commission of the States, Denver, Colo.

PUB DATE 73
NOTE 104p.

EDRS PRICE MF-$0.65 HC-$6.58
DESCRIPTORS *Data Analysis; Educational Quality; Educational Research; Elementary Grades; *Evaluation Methods; Interviews; Kindergarten; Program Descriptions; Secondary Grades; *State Programs; *State Surveys; *Student Evaluation; Technical Reports

IDENTIFIERS District of Columbia; Puerto Rico; United States; Virgin Islands

ABSTRACT This publication has two major parts. The first is a paper by Joan Beers, Pennsylvania Department of Education, and Paul Campbell, Educational Testing Service, which describes, analyzes and interprets the most significant portions of data collected in a second survey of the status of state educational assessment programs. The second part is a report of the assessment activities of each of the 50 states, the District of Columbia, Puerto Rico, and the Virgin Islands. Each report provides a standard description of the state's activities; the name, address and telephone number of each individual interviewed; and, where appropriate, a list of publications pertaining to the state's program. A copy of the interview guide is provided in the appendix. (Author)


U.S. DEPARTMENT OF HEALTH, EDUCATION & WELFARE
NATIONAL INSTITUTE OF EDUCATION

THIS DOCUMENT HAS BEEN REPRODUCED EXACTLY AS RECEIVED FROM THE PERSON OR ORGANIZATION ORIGINATING IT. POINTS OF VIEW OR OPINIONS STATED DO NOT NECESSARILY REPRESENT OFFICIAL NATIONAL INSTITUTE OF EDUCATION POSITION OR POLICY.


STATE EDUCATIONAL ASSESSMENT PROGRAMS1973 REVISION

Center for Statewide Educational Assessmentand the

ERIC Clearinghouse on Tests, Measurement and Evaluationat

Educational Testing Servicein collaboration with

Education Commission of the States

Educational Testing Service
Princeton, New Jersey • Berkeley, California • Evanston, Illinois


CONTENTS

v Foreword 47 Missouri

vii Introduction 49 Montana

1 Statewide Educational Assessment 50 Nebraska

State Descriptions 51 Nevada

9 Alabama 52 New Hampshire

10 Alaska 53 New Jersey

12 Arizona 56 New Mexico

13 Arkansas 57 New York

14 California 64 North Carolina

16 Colorado 65 North Dakota

18 Connecticut 66 Ohio

20 Delaware 67 Oklahoma

21 District of Columbia 69 Oregon

23 Florida 70 Pennsylvania

24 Georgia 71 Puerto Rico

27 Hawaii 73 Rhode Island

29 Idaho 75 South Carolina

30 Illinois 77 South Dakota

31 Indiana 78 Tennessee

32 Iowa 79 Texas

33 Kansas 81 Utah

35 Kentucky 83 Vermont

36 Louisiana 84 Virgin Islands

38 Maine 85 Virginia

39 Maryland 86 Washington

40 Massachusetts 87 West Virginia

42 Michigan 89 Wisconsin

44 Minnesota 90 Wyoming

46 Mississippi 95 Interview Guide


FOREWORD

This second survey of the status of state educational assessment programs brings up to date the information collected in early 1971 for the first publication. In the two-year period between surveys there have been changes of varying degrees, making many of the original descriptions obsolete.

Extensive interest in the earlier publication was a primary factor in our decision to undertake a revised report. We have been very pleased to learn from a variety of sources that the initial document stimulated a number of states to reexamine their own assessment efforts and that their reexamination was facilitated by having easy access to descriptions of the assessment efforts of other states.

We hope that this revised report will contribute further to the improvement of state assessment programs by serving as a useful source of information for state education leaders.

William W. Turnbull, President
Educational Testing Service
Princeton, New Jersey
June 1973


INTRODUCTION

This publication has two major parts. The first is a paper by Joan Beers, Pennsylvania Department of Education, and Paul Campbell, Educational Testing Service, which describes, analyzes and interprets the most significant portions of the data collected in the survey. The second part is a report of the assessment activities of each of the 50 states, the District of Columbia, Puerto Rico and the Virgin Islands. Each report provides a standard description of the state's activities, the name, address and telephone number of each individual interviewed and, where appropriate, a list of publications pertaining to the state's program. A copy of the interview guide may be found in the appendix.

In conducting the interviews, no attempt was made to restrict the definition of a state educational assessment program or to go beyond what the state personnel were willing to describe as their assessment program. Finally, in the interest of accuracy, the information appearing in this report was sent to the appropriate person or persons in each state for their review, modification and approval.

A project as large and complex as this one requires the cooperation of many people. We were most fortunate in obtaining generous assistance from all whom we called upon for help. We are particularly grateful to:

The individuals whose names appear at the end of each state report, who provided the basic data for the survey.

Charles Hoover, Director of ERIC at the National Institute of Education, for approving the development of the summary paper, utilizing the resources of the ERIC Clearinghouse on Tests, Measurement and Evaluation.

Staff members of the Center for Statewide Educational Assessment, the ERIC Clearinghouse on Tests, Measurement and Evaluation, the Office of Field Surveys, the Publications Division and the Regional Offices of Educational Testing Service for carrying out the project.

James Hazlett, Education Commission of the States, for his assistance in disseminating the results of this project to members of the Commission.

Richard O. Fortna, Project Director


STATEWIDE EDUCATIONAL ASSESSMENT

by Joan S. Beers, Pennsylvania Department of Education, and Paul B. Campbell, Educational Testing Service

Considerable expansion of statewide assessment has occurred in the two years since the results of the previous survey were published. In the report of that survey, Dyer and Rosenthal (1971) presented a concise statement of the history and development of assessment programs. They also identified some apparent trends which were beginning to emerge. Against this background, the present survey was designed to allow specific analysis of the nature of statewide assessment programs, as planned or as placed in operation.


To provide the information for this analysis, the survey utilizes a uniform set of questions and a common format.

The District of Columbia, the Virgin Islands and Puerto Rico were included with the 50 states and for the purposes of this survey are reported among the states.

The most obvious development is that there are now many more operational assessment programs than existed at the time of the earlier survey. Each of the 53 states has reported assessment activities either as operational, in a developmental process or in a planning stage.

Three Groups of Statewide Assessment Programs

After a first reading of statewide assessment program descriptions, it is easy to be deceived into thinking that statewide assessment programs are more alike than not. The similarities are many. State agency personnel are in charge. They select tests or guide and direct their construction, specify target populations, arrange for test administrations, make provisions for scoring, analyze the results or have them analyzed and assemble and disseminate reports. Assessment purposes are similar also: to assess development, to measure influences on learning, to assess needs, to measure growth. It takes a second and more careful reading, even a third and a fourth reading, to begin to detect that real differences do exist among programs. These differences, we discovered, center about one question: Who gets to use the results? When the 30 operational programs are divided into those for which data is collected for decision making by state agency personnel and those for which data is collected for decision making by teachers and administrators, many other differences fall into place.

The distinction between these two types of programs is sometimes subtle. Many programs are designed to serve both levels of decision making. Nevertheless, we took the risk of categorizing the 30 operational programs into two groups based upon program emphases. We screened the program summaries, separating those whose major focus is collecting information for state-level use from those whose major focus is collecting information for local use. A third group of 24 programs we labeled "emerging" because they are not yet operational. A first cycle of testing, analyzing and reporting has not been completed.

New York state has three assessment programs. The Pupil Evaluation Program is in Group 1 and The System for Pupil and Program Evaluation and Development is in Group 2. Therefore, New York is included twice. The Regents Examination, the third New York state program, is not categorized.

The 17 programs for which the emphasis is on collecting information for state-level decision making are in the following states:

Arizona, California, Colorado, Connecticut, District of Columbia, Florida, Maine, Massachusetts, Michigan, Nevada, New Jersey, New York, North Carolina, Rhode Island, South Carolina, Tennessee, Texas

The 13 programs for which the emphasis is on collecting information for local-level decision making are in these states:

Alabama, Arkansas, Delaware, Hawaii, Idaho, Iowa, Kentucky, Mississippi, New Hampshire, New Mexico, New York, North Dakota, Pennsylvania

The 24 emerging programs are in these states:

Alaska, Georgia, Illinois, Indiana, Kansas, Louisiana, Maryland, Minnesota, Missouri, Montana, Nebraska, Ohio, Oklahoma, Oregon, Puerto Rico, South Dakota, Utah, Vermont, Virgin Islands, Virginia, Washington, West Virginia, Wisconsin, Wyoming

In the following matrix, essential features of each of the three groups of programs are described. Reading across the matrix, differences are highlighted.


THREE GROUPS OF STATEWIDE ASSESSMENT PROGRAMS

Group 1. Program emphasis: collecting information for decision-making at the state level. N=17
Group 2. Program emphasis: collecting information for decision-making at the local level. N=13
Group 3. Emerging programs. N=24

1. Group 1: Assessment is mandated by the state legislature in 9 states (52%).
   Group 2: Assessment is mandated by the state legislature in 1 state.
   Group 3: Assessment is mandated by the state legislature in 6 states (25%).

2. Group 1: Assessment results are reported to the state legislature in 13 states (76%).
   Group 2: Assessment results are reported to the state legislature in 3 states (23%).
   Group 3: Reports are not yet compiled, but the focus for 20 (89%) of the programs is the collection of data for state-level decision-making.

3. Group 1: The collection of data for PPBS and/or a statewide MIS is specified for 7 programs (41%).
   Group 2: The collection of data for PPBS and/or a statewide MIS is specified for 4 programs (30%).
   Group 3: The collection of data for PPBS and/or a statewide MIS is planned for 9 programs (37%).

4. Group 1: Assessment data is used to allocate state and federal funds in 8 states (47%).
   Group 2: Assessment data is not used to allocate state and federal funds in any state.
   Group 3: Assessment data will be used to allocate state and federal funds in 10 states (42%).

5. Group 1: Participation is required in 10 states (59%).
   Group 2: Participation is voluntary in 12 states (92%).
   Group 3: All but 3 states report that participation will be voluntary.

6. Group 1: Samples of students rather than all students in the target populations are tested in 12 states (70%).
   Group 2: All students in the target populations are tested in 12 states (92%).
   Group 3: Seventeen states (71%) plan to draw samples of students for testing.

7. Group 1: Cognitive skills only are assessed in 11 states (65%).
   Group 2: Both affective and cognitive skills are assessed in 9 states (69%).
   Group 3: Thirteen states (59%) plan to assess both cognitive and affective skills.

8. Group 1: Criterion-referenced tests are administered in 9 states (52%).
   Group 2: Norm-referenced tests only are administered in 9 states (69%).
   Group 3: Of the 17 states which specified whether norm-referenced or criterion-referenced tests will be used, 11 states plan to administer criterion-referenced tests.

9. Group 1: Assessment is financed by state funds only in 6 states (35%), federal funds only in 4 states (24%), a combination of state and federal funds in 5 states (29%), and a combination of local, state, and/or federal funds in 2 states (11%).
   Group 2: Assessment is financed by state funds only in 2 states (15%), federal funds only in 3 states (23%), a combination of state and federal funds in 4 states (31%), and a combination of local, state, and/or federal funds in 4 states (31%).
   Group 3: Assessment is financed by state funds only in 5 states (21%), federal funds only in 14 states (58%), a combination of state and federal funds in 3 states (12%), and a combination of local, state, and/or federal funds in 2 states (8%).

10. Group 1: All but one of the interviewees reported that program objectives are being met successfully.
    Group 2: Five of the interviewees reported that program objectives are being met successfully; one reported that objectives are being met to a limited extent; three reported that objectives are not being met successfully; and four did not answer the question.
    Group 3: It is too early to determine whether or not program objectives are being met successfully for most of the emerging programs.

11. Interviewees reported the following as major problem areas:
    Group 1: a. Not enough money. b. Not enough staff. c. Difficulty with program coordination at the state level. d. Resistance from teachers to outside testing. e. Lag in the development of systematic use of the data. f. Mandatory rather than voluntary participation. g. Voluntary rather than mandatory participation. h. Lack of acceptance by teachers who see the program as a threat without providing any direct benefits. i. Magnitude of the program. j. Inability to use the data in significant ways. k. Inability to help school systems use the data. l. Inadequate dissemination. m. Difficulty in acclimating school personnel to the rationale and purposes of criterion-referenced testing. n. Opposition from teachers' organizations. o. Inability to use the program data to make decisions. p. Lack of understanding at all levels of the program and its purposes.
    Group 2: a. Not enough money. b. Not enough staff. c. Inability to make adequate use of project results. d. Difficulty in getting the program data into decision-making hands. e. Negative attitudes toward testing. f. Lack of understanding about the usefulness of tests by teachers and students. g. Difficulty in making meaningful interpretations of results. h. Difficulty in developing awareness at the local level. i. Difficulty in inspiring use of data at the local level. j. Inadequate dissemination. k. Improper use of results. l. Improper interpretation of results.
    Group 3: a. Not enough money. b. Not enough staff. c. Lack of coordination at the state level. d. Lack of agreement on future directions.


Comparisons and Contrasts Among the Three Groups of Programs

1. Legislative Mandates. Legislative mandates are the impetus for statewide assessment programs in 16 states. In all other states, assessment was introduced by state education agency personnel. Many programs began in response to ESEA Title I and Title III requirements. Of the 17 operational programs which have as their emphasis the collection of information for state-level decisions, more than half of them began with legislative mandates. In contrast, of the 13 operational programs which have as their emphasis the collection of data for local-level decisions, only the Pennsylvania program has a legislative mandate. Although the focus for most of the emerging programs is the collection of data for state-level use, few states have legislative mandates.

2. Reporting of Assessment Results. A consequence of legislative mandates is the required reporting of assessment results to legislators. Legislative reports are more likely to accompany programs designed for state-level decision making and less likely to accompany programs designed for local-level decision making.

3. Use of Data for PPBS and/or MIS. The use of assessment results highlights a third contrast between the two groups of operational programs. State agency personnel who direct programs intended primarily for state use are more likely to apply assessment results to Planning, Programming and Budgeting Systems (PPBS) and/or Management Information Systems (MIS) than are state agency personnel who direct programs intended primarily for school district use. Effective use of assessment data is a problem mentioned frequently by many of the interviewees. The ways in which assessment data are used for PPBS or MIS and the effectiveness of such data for these purposes might be some areas for further study.

4. Use of Assessment Data for Allocating Funds. In eight states where achievement data is collected primarily for state purposes, the results are used to allocate state and federal funds to school districts. In 10 other states, the intention is to use assessment information to distribute funds. When the results of achievement tests are linked to the distribution of money, do teachers and administrators treat the assessment program differently? The question might be worthwhile exploring.

5. Voluntary Versus Required Participation. There is a definite trend toward voluntary participation and away from required participation on the part of school districts, although most interviewees report that practically all invited schools do, in fact, participate. In 5 of the 10 states in which participation is required, interviewees reported that teacher resistance to statewide testing is a major problem. Whether or not teachers are more likely to resist statewide testing programs when they are required rather than voluntary is another possible area for further study. In the group of programs where the emphasis is on collecting information for school district purposes, only in Hawaii is participation required.

6. Sampling Versus Testing of All Students in Target Populations. The contrast is striking between the two groups of operational programs on the issue surrounding testing all students versus testing samples of students. Samples of students from target populations are tested in the majority of states where assessment data is used for state-level decisions. All students from target populations are tested, except in Iowa, in the states where assessment data is intended for local-level decisions. In most of the states where programs are not yet operational, samples of students from target populations rather than all students from target populations will be tested. Target populations are defined, usually, by age and grade levels. Target populations include, usually, specified students in elementary schools, middle schools and high schools. Rarely is every student at every age or grade level tested.

7. Measurement of Cognitive and Affective Achievement. What is measured provides another area of contrast between the two groups of operational programs. The measurement of verbal and mathematical achievement only is the focus for the majority of programs in the group where the emphasis is on information for state-level use. In contrast, in the group where the emphasis is on local-level use, attitudes as well as verbal and mathematical skills are assessed in the majority of states. In the group of states with emerging programs, there is a definite trend toward the measurement of attitudes, also. The large number of states which do or plan to assess attitudes and personal development represents a significant change since the 1971 study was conducted. At that time, Dyer and Rosenthal wrote that states were concerned mainly with how well their educational systems were succeeding in imparting basic skills. Few assessment programs measured beyond the three R's. The inherent complexities in measuring attitudes and personal-social development were once thought to be overwhelming in those states which were pioneers in this area. Interestingly, these difficulties are hardly mentioned by the interviewees in this study. Can it be that we are beginning to reach a stage in measurement where affective skills are no more difficult to assess than cognitive skills?

The affective domains mentioned number at least 12, but there are 4 which are most prevalent: attitudes toward school, self-concept or self-acceptance, citizenship and career development or orientation.

8. Norm-Referenced Versus Criterion-Referenced Testing. On the issue of norm-referenced versus criterion-referenced measurement, contrasts and comparisons among the three groups of programs again are evident. There is a definite trend toward the use of criterion-referenced testing in those states where the results are used for state-level decision making. This trend continues in those states where assessment is not yet operational. In the majority of states where results are used for local-level decision making, norm-referenced tests only are administered. The underlying reasons for these findings do not seem obvious. The findings may have occurred by accident rather than by design. If the findings did occur by design, the implication is that state-level decision makers consider the results from criterion-referenced testing more suitable for their purposes than the results from norm-referenced testing. As a corollary, the implication is that local-level decision makers consider the results from norm-referenced testing more suitable for their purposes than the results from criterion-referenced testing. Since the current push for the development of criterion-referenced testing comes from its use in individually prescribed instruction, a highly localized decision-making situation, this finding is a most curious anomaly. The issue should not be left here, but should be explored further.

9. Financing Statewide Assessment. Funding is varied among all groups of programs. State funds only is the most frequently mentioned source of money for programs in which the emphasis is on collecting information for state use. A combination of state and federal funds and the use of local funds in combination with state and/or federal funds are the most frequently mentioned sources of money for programs in which the emphasis is on collecting information for local use. In this group of programs, local funds are more likely to be appropriated than in any other group of programs. Federal funds only is the most frequently mentioned source of money for emerging programs. One question that might be asked is whether programs which are financed in full or in part by the state are more likely to be accompanied by a greater commitment on the part of the state agency than are programs which are supported solely by federal funds. A second question that can be asked is: In those states where school districts contribute money for statewide assessment, do school district personnel also play a policy-making role?

10. Meeting Program Objectives. Are program objectives being achieved satisfactorily? In those states where program emphasis is on state-level decision making, the interviewees were almost in unanimous agreement that they were meeting objectives. In those states where program emphasis is on local-level decision making, few interviewees stated that program objectives were being met satisfactorily.

11. Major Problems. What are the major problems related to assessment programs? This open-ended question brought forth answers that may say as much or more about statewide assessment as all of the other questions combined. Two eternal problems, not enough money and not enough staff, appear in practically all reports. What constitutes enough money is hard to define, but a closer look at staff numbers leads one to wonder how assessment can be accomplished at all. Twelve states have but 1 full-time staff member responsible for the program and not more than 10 states have 5 or more full-time staff members. If the objectives of the programs are being met satisfactorily, as many interviewees claim they are, other agencies must be assisting statewide educational assessment personnel. In 20 states, the state university cooperates with the state department to implement the programs. In 9 states, school district staff assist with assessment programs. In 36 states some aspect is contracted to outside personnel.

Interviewees in states with emerging programs report few problems. They can look forward to a great number, judging from the many problems reported by interviewees in states with operational programs. In states where the program emphasis is on collecting information for state use only, practically no problems were mentioned. It is in states where assessment results are used for both state-level and local-level decisions that problems are most numerous. Moreover, the difficulties reported from these states are more similar than different from the difficulties encountered by states where assessment results are used primarily for local-level decisions. The dissemination, interpretation, acceptance, understanding, awareness and utilization of assessment results by teachers and administrators are problems for state department personnel. Lack of money and staff may be contributing factors, but negativism and resistance on the part of teachers and administrators are factors which cannot be treated lightly.

Goals

The 1971 survey of state assessment programs suggested that there was an increasing concern with the involvement of citizens in goal setting. This involvement, although not universal, has continued as a major trend in the 1972-73 survey. Several such projects have been completed.

The usual approach is represented by procedures such as those used in Wyoming, where, with the assistance of the University of Wyoming, a series of discussions was held with the participation of a State Sounding Committee, the State Superintendent of Public Instruction, the State Board of Education, the State Education Agency, students, teachers, administrators, the teachers' association, staff from the National Assessment of Educational Progress and other educational experts. Goals prepared during these discussions were presented to the State Board for anticipated endorsement.

A very scholarly and thoughtful approach is represented by the Georgia Assessment Project. A series of papers on Georgia's future were commissioned from knowledgeable people in many professions. These papers in turn were critiqued by additional experts. From these activities a series of educational goals were derived, which in turn were assigned priority by a representation of Georgia citizens, including students, through use of the Delphi Technique (Weaver, 1971). The goals were then used to derive objectives which provided the basis for instrumentation in the Georgia Assessment Program.
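For readers unfamiliar with the technique, the sketch below illustrates one common Delphi variant: panelists rate each goal anonymously, see a summary of the group's spread, and re-rate in later rounds until the medians stabilize. The goal names, the 1-5 scale and the ratings here are invented for illustration; Weaver (1971) describes the method itself, and Georgia's actual procedure is not reproduced here.

```python
# A minimal sketch of one round of Delphi-style priority setting.
# All names and numbers are hypothetical, not Georgia's actual data.
from statistics import median

def round_medians(ratings):
    """Median priority per goal across all panelists (the consensus signal)."""
    return {goal: median(scores) for goal, scores in ratings.items()}

def round_feedback(ratings):
    """(min, median, max) per goal, shown to panelists before they re-rate."""
    return {goal: (min(s), median(s), max(s)) for goal, s in ratings.items()}

# Round 1: independent, anonymous 1-5 ratings from four panelists.
round1 = {
    "Basic Skills": [5, 5, 4, 5],
    "Career Education": [3, 4, 4, 2],
    "Human Relations": [4, 2, 3, 3],
}
print(round_feedback(round1))  # panelists see the spread of opinion
print(round_medians(round1))   # then revise outlying ratings in round 2
```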


This goal-setting activity has produced lists of goals ranging in number from 1 to 78. They can be classified in three general categories: learner outcome goals, process goals and institutional goals. The learner goals may be further classified among the following subsets:

1. Basic Skills
2. Cultural Appreciation
3. Self-realization
4. Citizenship and Political Understanding
5. Human Relations
6. Economic Understanding
7. Physical Environment
8. Mental and Physical Health
9. Creative, Constructive and Critical Thinking
10. Career Education and Occupational Competence
11. Lifelong Learning
12. Values and Ethics
13. Home and Family Relations

The general recognition of the societal expectation that school is more than a reading, writing and arithmetic class is evident from these goals. They constitute the bulk of the goal-setting activity, with the other categories appearing as facilitating learner outcomes.

The process category includes such ideas as the involvement of students and citizens in the planning of curriculum (Kansas) and the designing and implementation of instructional programs (Idaho).

An example of institutional goals can be found in Virginia, where the goals include standards for personnel, instructional materials and programs, planning and management. Another example, from South Carolina, is a goal to promote programs to provide adequate and qualified professional and paraprofessional personnel to staff the state's educational systems.

Although thoughtful and productive approaches have been demonstrated by the goal-setting activities of the states, a major problem appears at the point where goals are translated into program objectives and into data collection procedures. As a typical example, 27 of the 53 states have stated a goal concerned with human relations. However, only three states report that they have been able to conduct an assessment of progress toward such a goal. Several are in the process of developing the necessary instrumentation, but have not yet achieved results which are a satisfactory solution to the measurement problems, although tentative use of these instruments in a few states appears to offer positive benefits in that schools are encouraged to make instructional provision for the achievement of goals in human relations. While there appears to be some stability in group trends, measurement in an affective area such as this is in need of much additional development.

The unfortunate side effect of the limited application of assessment to noncognitive achievement is that many educators, failing to see assessment programs applied in affective areas, doubt the genuineness of the state commitment to these areas.

Finally, consideration of the worthy and carefully developed goals reported by most of the nation leads one to the question: Do assessment programs increase our chances of attaining the stated goals? Are program changes occurring because of assessment results? It is too early to specify a positive or negative answer to these questions, because most programs have not been in operation long enough. There is, however, spotty evidence from several states indicating that at least some assessment programs are stimulating change at the district level. For example, several Michigan districts have asked for their assessment results in punched-card form so that they can conduct their own intensive analysis. At least one district has designed a pre-post analysis of an intervention program which attempts to improve school attitude. Many Pennsylvania districts have also undertaken program changes as a result of assessment activities. A somewhat different example is Colorado's accreditation plan, which depends on objectives selected through assessment activities for program design. In general, however, state assessment programs have not had enough operational time to demonstrate the nature and extent of their influence on the goal orientation of programs.

Related Data

One expressed purpose for many statewide assessment programs is to measure influences on learning. How program designers define "influences on learning" can be illustrated best by the kinds of related data assessment specialists collect. Related data can be grouped under four categories: pupil characteristics, school characteristics, community characteristics and process variables. The specifics under each of these categories are:

Pupil Characteristics
1. age
2. sex
3. socioeconomic status
4. racial/ethnic background
5. bilingual or not

School Characteristics
1. dropout rate
2. attendance rate
3. percent minority
4. pupil/teacher ratio
5. per-pupil expenditure for education
6. cost of instructional programs
7. teacher data (salaries, education, course loads)
8. use of paraprofessionals
9. transportation costs

Community Characteristics
1. size
2. geography

Process Variables
1. classroom practices


No one state collects all of this information. Some states collect a great deal, others collect age and sex only and still others collect none. The question that must be raised is: What, if anything, do states do with this information?

The kinds of facts gathered under Pupil Characteristics and Community Characteristics can provide information about relationships between these variables and pupils' achievement, but the characteristics are impossible to change. The kinds of facts gathered under School Characteristics can be manipulated, to a greater or lesser degree, and may provide some clues as to what schools can do to improve the learning process. The best possibilities for measuring influences on learning probably can be found under Process Variables. However, only one state, Pennsylvania, reports that information about classroom practices is collected.
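As a concrete illustration of the kind of relationship such related data can reveal, the sketch below computes a simple correlation between one school characteristic and group achievement. The figures are invented; no state's actual data appears here, and a correlation of this sort shows association only, not what a school should change.

```python
# Minimal sketch: how related data might be examined against achievement.
# All figures are hypothetical. Requires Python 3.10+ for statistics.correlation.
from statistics import correlation

per_pupil_expenditure = [610, 540, 720, 480, 650, 590]     # dollars, one school each
mean_reading_score = [52.1, 49.3, 55.0, 47.8, 51.2, 50.4]  # school mean scores

r = correlation(per_pupil_expenditure, mean_reading_score)
print(f"Pearson r = {r:.2f}")
# r measures linear association only; a high r would not show that
# spending more causes higher scores.
```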

It is probably safe to say that statewide assessment will not produce any startling revelations about what can be done by teachers with pupils to help children learn more effectively. This conclusion is not meant to be as much an indictment of statewide assessment as it is a statement of its limitations. Revelations in teaching practices and methods can come only from intensive analysis within each school building and within each classroom. If statewide assessment data can whet the appetites of teachers and administrators for doing the kinds of evaluation only they can do for themselves, statewide assessment will serve its purposes well.

Accountability Revisited

The word "accountability" is not mentioned very often in the 53 statewide assessment summaries. But the concept can be detected easily in most of the program writeups. Accountability is the heart and soul of most assessment programs. More importantly, state education agencies in every state in the union are taking the leadership in helping or coercing school administrators to answer to the public's cries for better information about what children know and how well schools are doing their job.

Do statewide assessment programs and the manner in which they are conducted adequately define the dimensions of accountability? At least one authority does not think so. Stake (1973) defines an accountable school as one that discloses its activities, makes good on staff promises, assigns staff responsibility for each area of public concern and monitors its teaching and learning.

Most state accountability proposals call for more uniform standards across the state, greater prespecification of objectives, more careful analysis of learning sequences and better testing of student performance. These plans are doomed. What they bring is more bureaucracy, more subterfuge, and more constraints on student opportunities to learn. The newly enacted school accountability laws will not succeed in improving the quality of education for any group of learners... If state accountability laws are to be in the best interests of the people, they should protect local control of the schools, individuality of teachers, and diversity of learning opportunities. They should not escalate the bureaucracy at the state or local level. They should not allow school ineffectiveness to be more easily ignored by drawing attention to student performance. They should not permit test scores to be overly influential in schoolwide or personal decisions; the irreducible errors of test scores should be recognized. The laws should make it easier for a school to be accountable to the community in providing a variety of high quality learning opportunities for every learner.

To agree with Stake's vision of what accountability could or should be is uncomfortable. Not only are the ingredients of most statewide assessment plans in antithesis to what Stake is promoting, but the idea that these plans may in fact be harmful or damaging to the educational system is a serious charge.

The supposed dangers resulting from mass testing have been voiced before. Dyer (1966) reminds us how loudly the critics shouted in response to the plan for a National Assessment of Educational Progress. Some of the arguments raised against National Assessment were: (1) the tests would put undue pressure upon students; (2) the findings would lead to unfair comparisons; (3) teachers would teach for the tests to the neglect of important educational objectives; (4) the program would ultimately force conformity and impose federal control on the schools.

Dyer reacts by stating that "...one would suppose that to assess the educational enterprise by measuring the quality of its product is an egregious form of academic subversion" (1966, p. 69). Dyer sees the need for statewide testing programs for two reasons: continuity in the educational process and stability in educational systems. Statewide testing can help to bring greater continuity into the educational process if it can bring to teachers a continuous flow of information about the developmental needs of students regardless of where they are or where they have been and if tests are seen not so much as devices for selection or classification or evaluation, but as instruments for providing continuous feedback indispensable to the teaching-learning process.

Statewide testing can help to bring greater stability into the educational process by steering a well-planned educational program toward well-considered educational goals.

The issue should not be one of state-imposed accountability versus locally initiated accountability. State education agencies and state legislatures have their own reasons and suffer their own pressures for collecting information about students' educational achievements. School districts' accountability to the state should not be confused with school districts' accountability to their own communities or with teachers' accountability to their own school systems.

The issue should be whether state-imposed accountability systems encourage or discourage school administrators and teachers from developing their own accountability plans.

It can be said that accountability laws are the signs of the public's lack of faith in the effectiveness of schooling. It can be said, also, that accountability laws are the signs that school officials did not, or could not, respond on their own to accountability demands.

Whether or not statewide assessment programs emphasize local use of results, helping teachers and administrators interpret and use the results is a job most state assessment personnel either try to do or feel they should do. It is also where they report the greatest difficulties and frustrations.

Baker (1973) says it well: "...the fallacy of accountability is that it can be legislated in a monolith. Perhaps like most worthwhile things, it should be allowed to have an infancy and mature." Perhaps state education agencies are doing the best they can do and as much as they should do. Perhaps school board members, administrators and teachers now should do more.

REFERENCES

Baker, E.L. Opening accountability: A story in two parts. The Journal of Educational Evaluation, February 1973, 4(1), 7-8.

Dyer, H.S. The functions of testing: Old and new. In Testing Responsibilities and Opportunities of State Education Agencies. Albany, New York: New York State Education Department, 1966. Pp. 63-79.

Dyer, H.S. and Rosenthal, E. An overview of the survey findings. In State Educational Assessment Programs. Princeton, N.J.: Educational Testing Service, 1971. Pp. ix-xix. (ED 056 102)

Stake, R.E. School accountability laws. The Journal of Educational Evaluation, February 1973, 4(1), 1-3.

Weaver, W.T. The Delphi forecasting method. Phi Delta Kappan, 1971, 52, 267-271.


STATE DESCRIPTIONS

ALABAMA

Alabama Statewide Needs Assessment Program

Goals

A formal set of statewide educational goals has been prepared by the State Department of Education in cooperation with the University of Alabama and the Alabama Education Study Commission. These goals have been formally adopted by the State Department of Education and reflect a combination of input, process, and output. The goals are available upon request.

Program Title and Purposes
The official title of this program is the Alabama Statewide Needs Assessment Program. Its major purposes are to assess educational needs and to develop innovative measures to meet these needs.

Initiation
The program was initiated as a fulfillment of ESEA Title III requirements in 1969.

Policy
The State Department of Education, the Office of Planning and Evaluation, the University of Alabama, and state law all play a part in the determination of program policy.

Funding
State Department of Education funds are the sole source of funding for the program. It is unknown how much was spent on the program last year.

Administration
The Office of Planning and Evaluation and the University of Alabama coordinate the program statewide. Sixteen full-time professionals within the State Department of Education and seven full-time professionals within the University work on the program. The University of Alabama is involved in all phases of program development; the California Test Bureau (CTB), McGraw-Hill Inc., is involved in instrumentation and scoring.

Population
All public school students in grades K-12 are involved in the study. School dropouts are also included. Parochial and private schools may participate at their own discretion. School participation is voluntary (but encouraged), except for those schools trying to qualify for ESEA Title I and Title III funds. All eligible schools were included last year. Community characteristics are not used as a basis for selection.

Areas Assessed
The cognitive areas of aptitude, English, mathematics, reading, and vocational abilities are being assessed, as well as the noncognitive areas of attitudes, citizenship, personal values, and self-concept.

Instrumentation
The California Test of Mental Maturity and the Comprehensive Tests of Basic Skills are used, as well as some state-designed questionnaires. The questionnaires were tailor-made for the program by the University of Alabama. Item sampling was not used to construct the questionnaires, and they are used as a group-reliable measure only.

Related Data
Additional information collected to help interpret program data includes age, county, dropout rates, school name, school size, sex, and socioeconomic data.

Data Collection and Processing
Classroom teachers administer the tests. The outside contractor (CTB), the State Department, and the University of Alabama process program information.

Interpretation
The State Department of Education, the CTB, and the University of Alabama organize, analyze, and interpret all program data.

Use of Data
Program data is used for program planning and evaluation and to help individual school districts define their own needs.

Dissemination
Reports of program results are prepared by the State Department of Education and include state summaries and local board of education summaries. The State Department and local boards of education receive copies of program reports.

Overview
In general, the program is viewed favorably by the public, schools, administrators, and legislature, and program objectives are being achieved. The major problem has been informing the public.

Prospects for the Future
The program will continue with modifications and annual revision. The areas most likely to change are funding, instrumentation, data collection, and use of data.


ALABAMA

A Study of the Products of Vocational Education in Alabama

Goals
A formal set of statewide educational goals has been prepared in quantifiable terms. The primary criterion is the ratio of the number of students placed in jobs to the number of students trained. The Division of Vocational Education prepared these goals.
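Stated as a computation, the criterion is a simple proportion. A minimal sketch follows, with invented counts; the report does not give actual figures.

```python
# The Division's stated criterion, as we read it: students placed in jobs
# divided by students trained. The counts below are hypothetical examples.
def placement_rate(placed: int, trained: int) -> float:
    """Ratio of students placed in jobs to students trained."""
    return placed / trained

print(f"{placement_rate(415, 500):.0%}")  # prints 83%
```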

Program Title and Purposes
The official title of this program is A Study of the Products of Vocational Education in Alabama. Its major purpose is the evaluation of the process in vocational education.

Initiation
The program was initiated by the Division of Vocational Education in 1969.

Policy
Program policy is determined by the Division of Vocational Education.

Funding
The program is funded through both state funds (50 percent) and federal funds (50 percent). Actual amounts expended on the program have not yet been determined.

Administration
The Division of Vocational Education, with the equivalent of two and one-half full-time professionals working at the state level, coordinates the program statewide. Assistance is received from Auburn University for data interpretation.

Population
All vocational students in the state, from age 14 to adulthood, are included in the target population. A random sample of this population is used to select students for the study. Dropouts are included but parochial and private school students are excluded. All schools chosen by sampling are required to participate. Community characteristics are not considered.

Areas Assessed
All areas of vocational instruction, including attitudes and salable skills, will be assessed.

Instrumentation
The General Aptitude Test Battery (GATB) of the United States Employment Bureau and Interest Inventories will be used. Tests are chosen by the Division of Vocational Education.

Related Data
What related data, if any, will be used has not yet been determined.


Data Collection and Processing
It has not yet been decided who will be responsible for data collection and processing.

Interpretation
Auburn University and research staff members of the State Department of Education will be responsible for the organization, analysis, and interpretation of program data.

Use of Data
Program results will be used for program development, modification, and redirection.

Dissemination
Dissemination of program data has not yet been determined.

Overview
Reactions to the program cannot be judged at this time.

Prospects for the Future
The study is likely to continue with some modification.

Individual Interviewed
The primary source of information was Ledford Boone, Coordinator of the Office of Planning and Evaluation, State Department of Education, State Office Building, Montgomery, Alabama 36104 (telephone: 205 269-7271 or 269-7968).

REFERENCES

State Educational Assessment Programs. Princeton, New Jersey: Educational Testing Service, 1971.

Alabama State Department of Education. Educational Goals for Alabama Schools. Montgomery, Alabama, September 1972.

ALASKA

Alaska Educational Assessment and Model of Reasonable Expectation

Introduction
Alaska has elected to design its first statewide assessment of student skills with tests that are culturally free in relation to content and linguistically equivalent in relation to question wording. There are 30 school districts in the state, one of which is under state control. This district consists of 126 rural villages in which seven different Eskimo dialects are spoken in addition to three major Indian dialects and standard American English. The issue is further compounded in that students range from those who are genuinely bilingual to those who are nonlingual (that is, they do not have an adequate command of either the ancestral language or the English language).

Goals
Design specifications for the testing model were formulated in 1972. The specifications were established by the State Education Agency working in conjunction with the State Commissioner of Education and the State Board. These specifications have been formally adopted by the State Board, and a commitment was made to proceed with the work effort. The most important of these specifications were that test instruments must be: 1) diagnostic and prescriptive, and 2) capable of allowing the establishment of reasonable objectives for individual students, groups of students, and the state as a whole. Because a technique has been developed for the latter specification, the analytic process has been labeled the "Model of Reasonable Expectation."

Program Titles and Purposes
The assessment program in Alaska is known as the Alaska Education Assessment and Model of Reasonable Expectation. Its main objectives are to: assess cognitive development, assess educational needs (herein narrowly defined), measure educational growth, provide data for management information systems, set statewide, local and individual objectives and provide information for the state's planning-programming-budgeting system.

Initiation
The program was initiated by the Office of Research and Planning at the request of the State Commissioner of Education. Pilot work has been in progress since August 1972. Alaska expects to begin first statewide testing in three years if the necessary funding is received.

Policy
The State Board of Education, working in conjunction with the Commissioner, the State Education Agency and representatives of the school districts across the state, formulates program policy and changes. In Alaska, the Commissioner of Education is employed by the State Board.

Funding
Program funding comes from the state general education fund. Thus far $10,000 has been spent, and $100,000 has been requested for the initial work on pilot instruments.

Administration
If funding is received, a Test and Evaluation Section under the Office of Research, Planning and Information will be established. There will be, initially, one full-time professional working on the development of the state assessment system and educational evaluation in general. The Center for Northern Educational Research, University of Alaska, will have an important responsibility in that they will be responsible for the question of quality control in relation to the work of any contractor selected for actual test development.


Population
While it is not yet certain what the actual population to be tested under the pilot will be, major emphasis will be given to students from primary through junior high. Initial efforts will be extended to the area of reading and computational skills only. Pilot schools will be tested on a voluntary basis only and, for the most part, at the expense of the state. Parochial and private schools may or may not participate. Because the testing does not involve district comparisons in terms of ranking, community input characteristics will not be considered.

Areas Assessed
Cognitive areas which will be assessed include mathematics, reading and language skills. The language skills instruments will be developed to cover both English and non-English speaking students. While no noncognitive areas are being considered in the pilot, it is fully intended that success in the pilot will evolve into consideration of performance in the affective domain. Vocational areas are planned for later testing, but will not be covered in the pilot program.

Instrumentation
All instruments are being tailor-made for Alaskan students. Item sampling will be used in test construction, and all instruments will be pretested before statewide administration. All tests will be criterion-referenced measures and, because item sampling techniques will be used, tests will be interpretable initially only in terms of group reliability.
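Item sampling is the reason the pilot tests will be group-reliable only: each student answers just a fraction of the item pool, so individual scores are not comparable, but pooling across students estimates group performance on every item. Below is a minimal sketch under assumed numbers; the 60-item pool, 12-item forms and 70 percent success rate are invented, not Alaska's design.

```python
# Minimal sketch of item (matrix) sampling. All parameters are hypothetical.
import random

POOL_SIZE, FORM_SIZE = 60, 12
# Five disjoint 12-item forms together cover the 60-item pool.
forms = [list(range(start, start + FORM_SIZE))
         for start in range(0, POOL_SIZE, FORM_SIZE)]

def assign_forms(n_students):
    """Rotate forms across students so each form gets roughly equal use."""
    return [forms[i % len(forms)] for i in range(n_students)]

def group_percent_correct(responses):
    """Estimate the group's percent correct over the whole pool.

    responses: list of (form_items, set_of_item_ids_answered_correctly).
    """
    attempts = [0] * POOL_SIZE
    correct = [0] * POOL_SIZE
    for items, right in responses:
        for item in items:
            attempts[item] += 1
            correct[item] += item in right
    rates = [correct[i] / attempts[i] for i in range(POOL_SIZE) if attempts[i]]
    return 100 * sum(rates) / len(rates)

# Simulated administration: 100 students, each item answered correctly
# with probability 0.7, so the group estimate should land near 70.
responses = [(items, {i for i in items if random.random() < 0.7})
             for items in assign_forms(100)]
print(round(group_percent_correct(responses), 1))
```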

Related DataAdditional information to be collected to help analyzeinitial program data will be: age, dropout rates, informationon educational aides, boroughs (Alaska has no counties),percent minority, racial/ethnic data, school name and size,socioeconomic data, sex and teacher data.

Data Collection and ProcessingThe tests will be administered by classroom teachers, underthe supervision of SEA or contractor-trained administra-tors. The SEA will be responsible for processing all programdata, through its educational data processing section.

Interpretation
The outside contractors working in conjunction with the SEA will be responsible for analysis, organization and interpretation of all program data.

Use of Data

Data collected will be used for instruction, program evaluation, program planning and identification of exemplary programs. Its use will not be restricted by program authority.

Overview

The program proposal is viewed favorably by the state educators who have been contacted, but until actual testing takes place, professional and public opinion cannot be assessed. Program objectives are slowly but surely being achieved. The major problem is funding.

Prospects for the Future
Alaska State Assessment will continue if the program receives needed funding from the governor and state legislature. The only other problem the SEA is concerned with now is the development of instruments which will accurately measure language skills in each of the non-English dialects.

Individual Interviewed
The primary source of information was Ernest E. Polley, Coordinator, Office of Research and Planning, Alaska State Department of Education, Pouch F, Alaska State Office Building, Juneau, Alaska 99801 (telephone: 907 586-5380).

ARIZONA

ENAPA - Educational Needs Assessment Program for Arizona

Goals
A formal set of 10 goals has been prepared and is available in the Phase I Needs Assessment Program Report. The goals, prepared by State Education Agency (SEA) personnel, have been approved but not formally adopted by the State Board of Education. The goals are basically idealistic in nature, and modifications may be made as the program progresses. They are a combination of input, process, and output, being more product-oriented than input or process in nature.

Program Title and Purposes
The official name of the Arizona program is the Educational Needs Assessment Program for Arizona (ENAPA). The main purposes of the program are to: assess educational needs, assess cognitive and affective development, measure major influences on learning (very difficult because of the number of variables involved), provide data for a management information system, set statewide educational goals, and provide information for a planning-programming-budgeting system at the state level.

Initiation
ENAPA was initiated by the State Education Agency as a result of the federal requirements under ESEA Title III. The program has been in effect since the latter part of 1971, with the first test administration during the 1971-72 school year.

Policy
The Planning and Evaluation Division of the State Department of Education is responsible for all program policy and changes.


Funding
ENAPA is funded completely by federal monies. The bulk of funding is through ESEA Title III, with some funding from ESEA Title I and the Vocational Education Acts. Roughly $40,000 was spent on ENAPA during the last fiscal year.

Administration
The Planning and Evaluation Division, with one full-time state professional, is responsible for coordinating the program statewide. Evaluative Programs for Innovative Curriculum (EPIC) Diversified Systems Corporation of Tucson, Arizona, was involved in initial program development but is no longer working for ENAPA.

Population
In the 1971-72 school year, the target group was all students in grade 8. Dropouts are not included in the test population, but teachers are asked to identify students of the right chronological age who have dropped behind or skipped ahead in grade level. This year nine-year-olds are of primary concern. Delinquents and schools for the delinquent are purposely omitted from the program, as are parochial and private schools. All public schools and public school children of the target age are included in the target population. Samples are drawn from a statewide roster of student populations in each target age group. All students and schools drawn in the sample are required to participate. Community characteristics are not considered.

Areas Assessed
The last test administration assessed only the cognitive areas of reading, writing, and mathematics. At the time this information was collected, the areas to be assessed in the spring of 1973 had not yet been decided. Citizenship may be tested this year, but no noncognitive areas. ENAPA would like to test "self-concept," but hasn't yet found a valid instrument to use.

Instrumentation
The Iowa Tests of Basic Skills and the Iowa Cognitive Abilities Test were used last year for reading, writing, and mathematics in grade 8. No nonstandardized tests are being used at this time.

Related Data
Additional information collected to help analyze program data includes the following: age, cost of instructional programs, percentage minority, racial/ethnic background, school name, school size, family income, sex, whether or not student had kindergarten, whether or not student was a participant in an ESEA Title I or Title III program, size of community, and whether or not student is bilingual (language other than English spoken at home).

Data Collection and Processing
Paid consultants (usually professional educators) are trained by the Planning and Evaluation staff to administer the tests. Houghton Mifflin Co. scores the tests and returns score printouts for each student to the state.

Interpretation
The Planning and Evaluation staff works with computer programmers to produce program analyses from the computer printouts.

Use of Data
The program results are used primarily to influence the allocation of state and federal funds, and, hopefully, to identify exemplary programs. This in turn influences program planning. Use of program data is not restricted by state or local authority.

Dissemination
Reports of program progress (primarily state summaries) are prepared by the Planning and Evaluation Division. The following receive ENAPA reports: the SEA, the state governor, the state legislature, newspapers, other states (upon request), the State Board of Education, local school districts, and ERIC. ENAPA does not plan at this time to mail reports to parents, teachers, or students.

Overview
ENAPA is, of course, viewed favorably by state education people, but it is too early to assess public opinion toward the project. ENAPA personnel feel that their program standards are successfully being met, although they also feel that there is room for improvement. The two main stumbling blocks are funding and the lack of available trained personnel.

Prospects for the Future
ENAPA is likely to continue with modification. Two major modifications under consideration are: (1) to shift from a norm-referenced test to a criterion-referenced test, and (2) to develop a needs assessment program which would be based on the use of local needs assessment models. An Accountability Law has just been passed by the state legislature, which may influence the types of tests used by ENAPA.

Individual Interviewed
The primary source of information was William Raymond, Director of Planning and Evaluation in the Arizona State Department of Education, 1535 West Jefferson, Phoenix, Arizona 85007 (telephone: 602 271-5696).

ARKANSAS

Arkansas Needs Assessment Project

Goals
No formal goals have been prepared, but goal discussion has begun between the Title III Office and the Planning, Evaluation and Instructional Services Office. The goals are expected to express a combination of process and output, i.e., student outcomes: what the desired output of student education would be.

Program Title and Purposes
The official name of the Arkansas program is the Arkansas Needs Assessment Project. The major purposes of the project are to assess cognitive development, assess educational needs, and measure educational growth.

Initiation
The project began in the latter part of 1970 as an outgrowth of Title III. The Planning, Evaluation and Instructional Services office was responsible for project initiation.

Policy
Project policy is formulated by the State Education Agency (SEA).

Funding
The Arkansas Needs Assessment Project is funded through ESEA Title III. Funds from ESEA Title IV A have also been used. Roughly $15,000 was spent on the project in 1972.

Administration
The ESEA Title III Office is responsible for coordinating the project statewide. The equivalent of one full-time professional works on assessment at the state level. Science Research Associates (SRA) is the outside consultant for scoring services.

Population
In the 1971-72 school year, grades 3, 8, and 11 were tested; during the 1972-73 school year, grades 4, 9, and 12 will be tested. Dropouts are not included, nor are delinquent students and handicapped students (EMRs and physically handicapped in special classes). Schools for the delinquent and the handicapped are excluded, as are the vocational-technical schools. Schools were chosen for participation on the basis of geographic location, ethnic makeup, and school size. Parochial and private schools were invited to participate. Participation of any district was voluntary. Approximately 6 percent of the eligible schools participated in the project during the 1971-72 school year.

Areas Assessed
Mathematics, reading, language arts, natural science, and social science are being assessed in the cognitive areas. In the noncognitive areas, physical fitness and three areas in the affective domain (attitudes toward school, toward teachers, and general acceptance) are being assessed.

Instrumentation
The following instruments are used: Grade 4, the SRA Achievement Series and the Modern Math for Understanding Test; Grade 9, the SRA Multi-Level Achievement Test, the President's Council on Physical Fitness Test, and the state-designed Attitude Inventory; Grade 12, the President's Council on Physical Fitness Test and the state-designed Attitude Inventory (no achievement tests). The Attitude Inventory is a state-designed adaptation from the Instructional Objectives Exchange; all other instruments used are standardized tests. The tests were chosen by a State Committee. None of the instruments is considered criterion-referenced, and none is viewed only as group-reliable. The instruments all may be used for individual interpretation.

Related Data
Additional information collected to help analyze project data includes: age of student, racial/ethnic background, school name and size, and status (if involved) in the ESEA Title I Program.

Data Collection and Processing
Guidance counselors and some school administrative staff administer the tests in the schools. The SEA is responsible for processing project data, with scoring of the achievement tests done by Science Research Associates.

Interpretation
The SEA is responsible for organization, analysis, and interpretation of all project data.

Use of Data
Project results are used for comparative analysis across schools, guidance, project planning, and project evaluation. Project authority does not restrict the use of data.

Dissemination
Project reports are prepared and include the following types: state summaries, regional summaries, some school summaries, school system summaries, and summaries by grade. The State Education Agency prepares these reports and sends copies to other states, the State Board of Education, school districts, and ERIC.

Overview
The Assessment Project is viewed most favorably throughout the state. Project objectives are being achieved very satisfactorily. The major problems lie in coordinating effort and in getting project results into the hands of decision makers at the governmental and legislative levels.

Prospects for the Future
The Arkansas project is likely to continue, with some changes occurring in the target population (expansion), the areas assessed (expansion), the measuring instruments (development of more tailor-made instruments), and in the use of data (moving toward more efficient use). Very probably Arkansas will move into a project planning phase in late 1973 in order to evaluate program progress.


Individuals Interviewed
The primary sources of information were Sherman Peterson, Associate Director of Planning and Evaluation (telephone: 501 371-1561), and Charles Watson, Supervisor of ESEA Title III (telephone: 501 371-1245), State Department of Education, Education Building, Little Rock, Arkansas 72201.

CALIFORNIA

California Statewide Testing Program

Introduction
Beginning in May 1973, radical changes will occur in California's statewide programs. The California Statewide Testing Program and the Miller-Unruh Basic Reading Act Testing Program have been modified through the passage of Assembly Bill 665 in the summer of 1972. The old programs will be modified and put under one umbrella, the California Statewide Testing Program. The scope of areas is being broadened, the format of testing is being changed, and all tests will be provided by the state. Under Assembly Bill 665, the State Board of Education will no longer be authorized or required to adopt minimum academic standards for graduation from high school, including the previous requirement for high school seniors to demonstrate an 8.0 achievement level in reading and mathematics before graduation.

Goals
California does not have a formal set of statewide educational goals. Since 1969, however, the California Legislature's Joint Committee on Educational Goals and Evaluation has been guiding a developmental process of setting goals and objectives for the state's public school system. Recently this committee, together with the Department of Education, has initiated a process involving professional educators and other citizens in every county to determine school goals, objectives, and priorities. A three-volume guide, Education for the People, widely distributed throughout the state through county departments of education, provides assistance to local groups for participation in this goal-setting process. It is hoped that all local district school boards will have adopted district philosophies, goals, objectives, and priorities by November 1973, and that these local statements will become synthesized into state-approved goals by February 1974.

Program Title and Purposes
The official title of the program is the California Statewide Testing Program. The major purposes of state testing are now (and will be) to assess cognitive development as a first step in the (possible) development of a statewide educational management information system.


Initiation
The State Legislature has been the initiator of mandated state testing programs in California since the inception of state legislation for testing in 1961. The first mandated testing began in 1962. The California State Testing Act of 1972 (Leroy Greene's Assembly Bill 665) will supersede the previously used Miller-Unruh and General Statewide Testing Programs in 1973.

Policy

While the State Department of Education actually determines program policy and does initiate change, an advisory group of representatives from major school systems in the state works closely with the Department.

Funding
At this point it is necessary to differentiate between the current programs and those planned for 1973-74 and thereafter. The Office of the Legislative Analyst has estimated that state testing programs actually cost $800,000 last year (1971-72) and that 75 percent of this expense was borne by local districts, with the state bearing 25 percent of the expense. Of the state's portion, 50 percent came from the state's general fund and 50 percent from ESEA Title V. It is anticipated that the cost of the new program will be shared equally by the state and local school districts.
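As a check on the figures just quoted (an editorial illustration only, using no numbers beyond those in the paragraph above), the dollar breakdown works out as follows:

```python
total = 800_000                    # estimated 1971-72 cost of state testing
local_share = total * 0.75         # borne by local districts: $600,000
state_share = total * 0.25         # borne by the state:       $200,000
general_fund = state_share * 0.50  # state general fund:       $100,000
title_v = state_share * 0.50       # ESEA Title V:             $100,000
print(local_share, state_share, general_fund, title_v)
```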

Administration
The Office of Program Evaluation and Research of the State Department of Education is responsible for the administration of the program. In each county and each school district one person acts as a local coordinator and as liaison with the State Department. Until recently, the Office of Program Evaluation has had only two full-time professional staff members. Currently, five full-time professionals are working both on the old program and development of the new. Outside agencies and consultants are not being used at this time.

Population

The target population is all regularly enrolled students in the public schools of the state in grades 1, 2, 3, 6, and 12. Dropouts and educable mentally retarded, emotionally disturbed, non-English speaking, and physically handicapped students are generally excluded. However, the emotionally disturbed and the non-English speaking are excluded only if enrolled in special day classes for more than 50 percent of the school day, and the physically handicapped are excluded only upon recommendation of their teachers. All other pupils in the target grades will be tested, but each pupil will be tested with only a sample of items. Participation in the testing program is required of all schools with the exception of public schools for the handicapped and parochial schools. One hundred percent of eligible schools were included in last year's testing program. Community characteristics do not provide a basis for selection.


Areas Assessed

Aptitude, reading, mathematics, and effectiveness of writing are the cognitive areas being assessed. No attempt is being made to measure any noncognitive areas at the present time.

Instrumentation
The following instruments will be used in the program through May of 1973: reading (grades 1, 2, and 3), the Cooperative Primary Reading Test (Forms 12A, 23A, 23B), published by Educational Testing Service; scholastic aptitude (grades 6 and 12), the Lorge-Thorndike Intelligence Test, Verbal Battery Multi-Level edition (Form 1, Levels D and G), published by Houghton Mifflin Company; reading, language, and mathematics, the Comprehensive Test of Basic Skills, Level 2, Form Q, reading, language, and arithmetic sections, published by CTB/McGraw-Hill Company; achievement (grade 12), the Iowa Tests of Educational Development, Form X-4 (Class Period Version), Tests 3, 4, and 5, published by Science Research Associates. These are all recognized as standardized, commercially available, normative instruments. They are all individually reliable and not considered criterion-referenced. These instruments were chosen by the State Department of Education with the assistance of an advisory committee of testing specialists and school administrators.

Currently work is being undertaken by Department of Education staff to develop the instruments to be used in the program after May 1973. These instruments will use items from various sources (generally existing normative tests), with the items to be placed into a pool and proportioned among the subtests. All instruments will thus be tailor-made for the program, and the instrument actually administered to any one pupil will contain only a sample of all the items from the total test. These instruments will not be criterion-referenced. The new program will use matrix sampling only in grades 2, 3, 6, and 12. The grade 1 test will be a specially constructed, short, easy readiness test to be used as a baseline. All will be group reliable only.
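A minimal sketch of the matrix-sampling design just described (modern Python, purely illustrative; the pool size, number of forms, and subtest names are hypothetical rather than the Department's actual specifications): items are pooled, kept proportional by subtest, and dealt out into short forms so that each pupil takes only one form.

```python
import random

random.seed(1973)

# Hypothetical pool of items drawn from existing normative tests, tagged by subtest.
pool = [{"id": i, "subtest": random.choice(["reading", "language", "math"])}
        for i in range(120)]

def build_forms(pool, n_forms):
    """Partition the pool into n_forms short forms, keeping each subtest
    proportionally represented on every form (matrix sampling)."""
    forms = [[] for _ in range(n_forms)]
    for subtest in {item["subtest"] for item in pool}:
        items = [it for it in pool if it["subtest"] == subtest]
        random.shuffle(items)
        for k, item in enumerate(items):
            forms[k % n_forms].append(item)  # deal items round-robin
    return forms

forms = build_forms(pool, n_forms=6)
for k, form in enumerate(forms):
    print(f"form {k}: {len(form)} items")
# Each pupil is administered a single form; the forms jointly cover the whole
# pool, so results support group-level but not individual interpretation.
```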

Related Data

A variety of related data is collected, including demographic characteristics of the community, financial characteristics of the school district and community, pupil and parent characteristics, instructional and staff characteristics (elementary schools only at this time), and information on availability of special instructional programs.

Data Collection and Processing

Classroom teachers are responsible for administering the tests, and local school districts do their own scoring and processing (although they may contract with an outside scoring agency for this service).

Interpretation
The Department of Education is responsible for statewide analysis, organization, and interpretation of program data. The Department hopes, however, that school districts and local schools make individual interpretations for their own use.

Use of Data
The data at all grades is designed to provide the legislature and general public in California with basic information concerning the achievements of children in the public schools of the state. Out of this general use of information certain specific uses may be identified, such as identification of exemplary programs, program evaluation, comparative analysis across school districts, and the planning of instruction. Results of the Miller-Unruh program at grades 1, 2, and 3 are used in the allocation of funds for special reading instruction in these grades. The law does not restrict the use of the data as long as individual pupils are not identified.

Dissemination
Elaborate analyses of the results are prepared annually by the Department of Education and summarized by state, region, district, and test area. These annual reports are distributed to the State Education Agency, the governor and legislature, newspapers, other states upon request, the State Board of Education, local school districts and schools, students and parents, teacher organizations, and ERIC. Program results are disseminated throughout the state, both through newspaper releases and through the broadcasting media.

Overview

Opinion toward the program is generally favorable. The major problems related to the current program are the lack of acceptance of the program by professional educators, particularly teachers, who see the program as a threat to them without providing any direct instructional benefits; the mandated nature of the program (related to the immediately preceding problem); the magnitude of the program in a state as large as California; and the lack of resources within the Department of Education to utilize the data in significant ways and to help school systems utilize the data.

Prospects for the Future
It is hoped that the new program will continue indefinitely. It is supposed to interface with a set of educational objectives congruent with the wishes of educators, legislators, and other citizens. Since school personnel are active in assisting the Department of Education staff in defining the characteristics of the items that will form the new tests, it is hoped that the tests themselves will become more acceptable to and interpretable by teachers.

Individual Interviewed
The primary source of information was Dale Carlson, Consultant in the Office of Program Evaluation, California Department of Education, State of California, 721 Capitol Mall, Sacramento, California 95814 (telephone: 916 322-2200).


REFERENCES
Department of Education, Office of Program Evaluation. Elementary School Questionnaire. State of California.

California Legislature, Joint Committee on Educational Goals and Evaluation. Report: The way to relevance and accountability in education. Published by the Senate and Assembly of the State of California, April 1970.

California State Legislature, Joint Committee on Educational Goals and Evaluation. Education for the people. Vol. 1, Guidelines for total community participation in forming and strengthening the future of public elementary and secondary education in California; vol. 2, A resource book for school-community decision making, to be used in conjunction with volume 1; vol. 3, Descriptive booklet. Sacramento, 1972.

Assembly Bill 665, California State Legislature.

Statewide Testing Programs: 1972-73 and Beyond. Memo dated September 5, 1972, from Alexander I. Law, Chief of the Office of Program Evaluation, to district and county Superintendents of Schools.

COLORADO

Colorado Statewide Learner Needs Assessment Program

Goals
The latest set of statewide educational goals was formally adopted in March 1972 by the State Education Agency and the State Board of Education. These goals deal mainly with outcomes of education. An earlier set of goals and objectives was adopted by the State Board in 1962. The state goals have been derived from citizen statements gathered during statewide meetings, from teachers and administrators in local education agencies, and from research on how and why students learn. The current set of goals is available upon request.

Program Title and Purposes
The official title of the program is the Colorado Statewide Learner Needs Assessment Program. Its major purposes include assessing educational needs and cognitive/noncognitive development, measuring growth, providing data for local level management information systems, curriculum improvement, and, though in a formative stage, providing information for a planning-programming-budgeting system at the state level.

Initiation
Colorado assessment was initiated primarily by the State Education Agency, supported by an advisory committee with some involvement by the State Board of Education. Initiation of the idea came in April 1968, with the first test administration in May 1970.


Policy
The Chief State School Officer and the State Education Agency have primary responsibility for conducting the program, although representatives of major school systems throughout the state suggest program modifications.

Funding
Program expenses are financed through both federal and local funds. ESEA Title III and Section 402 funds cover approximately 75 percent of program expense, while operational funds of the State Education Agency support the remainder. As yet there are no regular funds provided by the legislature, although the State Accountability Act provides $40,000 which may be used, in part, for the program. Most of the state's funds are used to support the work of paid consultants. Program costs last year were approximately $45,000.

Administration
State Education Agency personnel coordinate the program statewide. The state ESEA Title III staff, with 1.5 full-time individuals, administer the program at the state level. They are assisted by the staff of the Laboratory of Educational Research at the University of Colorado. Consultants from LER are primarily responsible for instrument development, sampling, and data analysis.

Population
All public school students enrolled in grades 5 and 11 are eligible to participate in the program, with the exception of those educable mentally retarded and emotionally disturbed students in school who do not take their school's regularly administered achievement tests. Dropouts are not included. A sample of school districts, stratified by school size, is selected and asked to allow its schools to participate. School participation is not required, and an alternate school is substituted if a given school declines. Random samples of students are selected within each target grade at each school, resulting in approximately 3-5 percent of the total population being tested. Last year about 15-20 percent of all eligible schools in the state participated. Since new legislation may require comparisons across schools within districts, it is anticipated that the percentage of schools participating will increase.
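The selection procedure just described can be sketched as follows (Python, added editorially; the district roster, strata boundaries, and per-stratum counts are hypothetical, though the figure of 181 districts appears later in this report):

```python
import random

random.seed(5)

# Hypothetical roster of Colorado's 181 districts with enrollment counts.
districts = [{"name": f"district-{i}", "enrollment": random.randint(200, 20_000)}
             for i in range(181)]

def stratum(d):
    """Assign each district to a size stratum (boundaries are invented)."""
    if d["enrollment"] < 1_000:
        return "small"
    return "medium" if d["enrollment"] < 8_000 else "large"

def draw_sample(districts, per_stratum=5):
    """Draw districts within each size stratum; in practice an alternate
    school is substituted when a selected school declines."""
    sample = []
    for name in ("small", "medium", "large"):
        pool = [d for d in districts if stratum(d) == name]
        sample.extend(random.sample(pool, min(per_stratum, len(pool))))
    return sample

for d in draw_sample(districts):
    print(d["name"], stratum(d))
```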

Areas Assessed
English, health, mathematics, natural science, reading, social science, and citizenship are assessed in the cognitive area, while attitudes toward school, citizenship attitudes, personal values in social sciences (current school learning), and self-concept are assessed in the noncognitive area. Language skills will also be included for the spring 1973 assessment.

Instrumentation
All tests and inventories used in the program were tailor-made by the State Education Agency in conjunction with consultants from the Laboratory of Educational Research at the University of Colorado. Teacher committees throughout the state assisted in this process as item writers and reviewers. All cognitive area tests are criterion-referenced measures. All cognitive and noncognitive tests and inventories have been designed for individual measurement, rather than primarily as group-reliable measures.

Related Data
Related data collected during the course of the assessment program includes: age, grade, school and district code, national origin, language spoken other than English, racial/ethnic background, sex, socioeconomic data, and identification of students in bilingual or migrant programs.

Data Collection and Processing
A combination of district test coordinators, school administrative staffs, guidance counselors, and some classroom teachers give the tests in the schools. While the State Education Agency has the ultimate responsibility for program data, the Laboratory of Educational Research at the University of Colorado carries out the data processing. Some state services have been used.

Interpretation
The State Education Agency, with advice from the Laboratory of Educational Research, has the primary responsibility for analyzing, organizing, and interpreting the data, including the distribution of printouts of summary data and explanatory material. Local school districts, using the material sent to them by the SEA, are responsible for local interpretation and follow-up.

Use of Data
Primary use of the data is for allocation of federal funds to school districts, on the basis of identified needs, although some program evaluation, instructional improvements, and public relations uses have been cited. While there is no law restricting or limiting the use of the data from the program, the SEA emphasizes that the results may only be used in a constructive way.

Dissemination
Working papers and summary data reports are prepared by the Laboratory of Educational Research for the SEA. From these reports, the State Education Agency prepares state and school system summaries (item by item, average percentage correct by target area), school system summaries (expected percentage correct based on regression of SES and test data), summaries by target areas, and public information pieces for state newsletters and newspapers. Reports of the program are distributed to all State Education Agency departments, the governor, the legislature, other states, the State Board of Education, all 181 school districts in the state, participating schools, the Department of Institutions in the state, the USOE, college and university schools of education, and ERIC. Parents can also receive reports, but only upon request. Special news releases are prepared by the SEA for the state newsletter and for newspapers throughout the state.
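The "expected percentage correct" figures in these school system summaries come from regressing test scores on socioeconomic status, so that each system is compared against systems of similar SES rather than against the raw state average. A minimal sketch of that computation (Python, editorial illustration only; the SES indices and scores below are invented):

```python
# Simple least-squares regression of percent correct on an SES index.
ses = [0.2, 0.4, 0.5, 0.7, 0.9]           # hypothetical SES index per district
percent = [48.0, 55.0, 61.0, 66.0, 74.0]  # hypothetical percent correct

n = len(ses)
mean_x = sum(ses) / n
mean_y = sum(percent) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(ses, percent))
         / sum((x - mean_x) ** 2 for x in ses))
intercept = mean_y - slope * mean_x

for x, y in zip(ses, percent):
    expected = intercept + slope * x   # SES-adjusted expectation
    print(f"SES {x:.1f}: expected {expected:.1f}, actual {y:.1f}, "
          f"residual {y - expected:+.1f}")
```

A positive residual indicates a system performing above what its SES alone would predict, which is the constructive comparison the SEA intends.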

Overview
The program is viewed favorably by most state officials, including the governor, the state legislature, and the State Board of Education. Program objectives are being achieved, although there are two major obstacles: school resistance to outside testing (i.e., testing conducted by groups other than the local school personnel) and the lag in development of systematic data use.

Prospects for the Future
The program is likely to continue at least through the end of the 1972-73 school year, although it may be subject to changes through State Board of Education policy, existing accountability and PPBES laws, or new legislation. The area most likely to change is policy control: the program may not remain under the ESEA Title III office. The SEA also sees a need to establish a systematic way to educate people in local districts and schools to make the best use of program results. SEA personnel are also planning to include grade 12 students in the program within the next two years.

Individuals Interviewed
The primary sources of information were John Helper, Consultant, Colorado Department of Education, Colfax and Sherman, Denver, Colorado 80203 (telephone: 303 892-2238) and Arthur Olson, Director, Cooperative Accountability Project, 1362 Lincoln Street, Denver, Colorado 80203 (telephone: 303 892-2133).

REFERENCES
Colorado Department of Education. Educational Goals for Colorado Citizens. Published March 1972.

Colorado Department of Education. An Assessment of Learner Needs in Colorado, Academic Year 1970-71. Published by the Assessment and Evaluation Unit, Colorado Department of Education. ERIC Report: TM 001 852.

The Accountability Act of 1971.

CONNECTICUT

Connecticut Needs Assessment Program (CNAP)

Introduction
The statement in the 1971 publication State Educational Assessment Programs was in error. Connecticut was then, and still is, actively involved in establishing educational goals for the state and assessing the educational needs of its students. In addition to the Needs Study described below, the State Education Agency has developed a plan entitled Evaluation and Assessment Procedures for Public School Programs. The plan is designed to meet the concern of the State Board and the legislature for more extensive evaluation of pupil progress, with emphasis on the use of evaluative data for strengthening education. The plan, which includes proposals for assessment at both the state and local levels, was submitted to the general assembly in February 1972 in response to P.A. 665 of the 1971 session of the general assembly. It is expected that it will be funded in part by a state appropriation for 1973-74. An application has been made to NIE for additional funding.

When implemented, the program will incorporate the Needs Assessment (CNAP) as one of its major components, and the total program will be entitled the Connecticut Educational Program Evaluation Project (CEPEP).

Goals
Connecticut is currently in the process of establishing statewide educational goals. Beginning in early 1971 a Goals Response Study was carried out simultaneously with reading assessment throughout the state. The major purpose of the goal-setting effort is to establish the basis for a program designed to meet the needs of public school students in Connecticut. A consultant committee, the State Board of Education and the State Education Agency have been involved in this project, along with thousands of Connecticut citizens. The project has been monitored by an executive group of the State Department of Education as part of Connecticut's first state-sponsored educational needs study. Though the goals have not been formally adopted by the state, a report summarizing the results of the study has been published by the Connecticut State Board of Education, Connecticut Citizens Response to Educational Goals 1971-1972. It is hoped that the findings of the goal survey will receive wide public attention and that school districts will continue to develop goals and eventually a management-by-objectives program themselves. The goals discussed in this project deal primarily with the outcomes of education; however, it is anticipated that a comprehensive plan for evaluation will include a combination of input, process and output.

Program Title and Purposes
The official title of the program is the Connecticut Needs Assessment Program. Its major purposes are: assessment of cognitive development, assessment of educational needs, measurement of growth, providing data for a statewide management information system, setting statewide educational goals and providing information for a planning-programming-budgeting system.

Initiation
The State Education Agency staff initiated a proposal for the study of needs assessment. The proposal, given impetus by the initiation of the National Assessment of Educational Progress and ESEA Title III requirements, was submitted to the State Board of Education in 1970. In February 1972 the Board adopted a more comprehensive Plan for Evaluation and Assessment in response to Section 10-4 of the General Statutes as revised by P.A. 665 of the 1971 session of the general assembly. Initial assessments took place in the fall of 1971.

Policy

An executive group in the State Education Agency assigns specific tasks and responsibilities related to the Needs Study through the Bureau of Educational Management and Finance. Many departmental staff are involved in the studies, with all major divisions represented on the executive group.

Funding
The program has been financed 100 percent by federal funds (ESEA III and 402). Program expenses during the 1971-72 school year were approximately $150,000.

Administration
Program administration is coordinated by the executive group in the State Education Agency; however, all program details have been handled by the Institute for the Study of Inquiring Systems, Philadelphia, Pennsylvania, which has operated under contract with the State Board of Education.

Population
The target population is defined by age. Random samples of students aged 9, 13 and 17 are identified using a four-stage sampling process involving: community size, school district (with probability proportional to enrollment), school within district (one for each age group) and, finally, students randomly selected within each school. All students of a given age are included even if they have dropped behind grade level. Dropouts, the educable mentally retarded, the emotionally disturbed and non-English speaking students are excluded. School participation is voluntary, and non-public schools were excluded. No community characteristics (other than size) are considered during sampling.

During 1971-72, 100 percent of the schools invited participated in the program, resulting in the assessment of 7,000 students in 156 schools representing 54 of the state's 169 school districts.
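The four-stage draw described above can be sketched as follows (Python, added editorially; community categories, enrollments, and counts are hypothetical, and the district draw with probability proportional to enrollment is approximated with replacement):

```python
import random

random.seed(9)

# Hypothetical frame: community-size strata -> districts -> schools.
frame = {
    "small": [{"district": f"S{i}", "enrollment": random.randint(500, 2_000),
               "schools": [f"S{i}-school{j}" for j in range(3)]} for i in range(10)],
    "large": [{"district": f"L{i}", "enrollment": random.randint(5_000, 40_000),
               "schools": [f"L{i}-school{j}" for j in range(8)]} for i in range(5)],
}

def draw(frame, districts_per_stratum=2, pupils_per_school=30):
    sample = []
    for size, districts in frame.items():                  # stage 1: community size
        weights = [d["enrollment"] for d in districts]
        for d in random.choices(districts, weights=weights,
                                k=districts_per_stratum):  # stage 2: PPS district draw
            school = random.choice(d["schools"])           # stage 3: school in district
            pupils = random.sample(range(500), pupils_per_school)  # stage 4: pupils
            sample.append((size, d["district"], school, len(pupils)))
    return sample

for row in draw(frame):
    print(row)
```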

Areas Assessed
The cognitive area of reading was assessed during 1971-72. No noncognitive areas were assessed. Following the cycle of assessment carried out by the National Assessment of Educational Progress, it has been proposed that science be assessed during the 1973-74 school year.

Instrumentation
The instruments used for the reading assessment were adapted from the reading exercises of the National Assessment of Educational Progress. The instruments were selected by the Institute for the Study of Inquiring Systems in cooperation with the State Education Agency. The instruments are described as criterion-referenced measures. Individual students were not tested for measures of their achievement, but groups of students were assessed in their performance of particular reading skills.

Related Data
Age and sex of respondent are the only additional pieces of information collected during assessment to help analyze, interpret and organize the program data.

Data Collection and Processing
The Institute for the Study of Inquiring Systems was responsible for administering instruments for the reading assessment and for scoring and processing assessment data. The ISIS hired and trained field administrators, primarily Connecticut people with experience in schools, as exercise administrators. The State Education Agency observed administrations in a sample of 18 schools and concluded that the test administrators were doing an excellent job.

Interpretation
The Institute for the Study of Inquiring Systems had the responsibility for analyzing, organizing and reporting assessment data.

Use of Data
Although it is somewhat early to indicate specific uses of the program data, it is anticipated that the results will be used to establish priorities so that state and federal funds can be directed to meet identified needs. The results will also be used for budgeting, instructional purposes and program planning. Reading assessment data already is being used for these purposes. The authority for the program does not restrict or limit the use of the program data.

Dissemination

Program reports were prepared by the outside contractor in collaboration with the State Education Agency. The reports made comparisons in several categories: with established nationwide assessment of students in the same age groupings, with students in the northeast, by sex and by size of community in which the student attends school. Program reports are disseminated to the State Education Agency, state board of education, governor, legislature, school districts, schools, parents, teacher organizations, other states, newspapers and ERIC. These reports are available to students in school libraries.

Overview

No comments concerning attitudes toward the program are available at this time.

Prospects for the Future
Although funding is currently a major obstacle, it is expected that there will continue to be an emphasis by the SEA on securing data on pupil performance as well as educational program input and process. Hopefully, assessment and evaluation will not be limited to the cognitive domain; provisions for assessing the affective domain are included in presently formulated plans.


In addition to meeting the Title III needs assessment requirements, there is a sincere respect for the concept of accountability and a meaningful commitment to needs assessment and evaluation as an essential element in sound instructional programs.

Individuals Interviewed
The primary sources of information were Robert W. Stoughton, Associate Commissioner, Division of Instructional and Pupil Services; Alfred Villa, Chief of the Bureau of Education Management and Finance (telephone: 203 566-…83); and Joseph Cashman, Federal/State Evaluation, Division of Administrative Services (telephone: 203 566-4897). Their address is the Connecticut State Department of Education, Box 2219, Hartford, Connecticut 06115.

REFERENCES
Connecticut State Board of Education. Connecticut Citizens Response to Educational Goals 1971-1972.

Connecticut State Board of Education. Connecticut Reading Assessment 1971-1972.

DELAWARE

Delaware Educational Accountability System

Goals
In the fall of 1970, a study was conducted using the Delphi technique to determine what educational goals of Delaware would be generally accepted. Data derived from this study was used as the basis for eliciting citizen response through a series of statewide public forums. As a result of comments and suggestions, a list of learner goals and subgoals was prepared, and these were formally adopted by the State Board of Education in June 1972. The nine major goal areas covered include communication and basic skills, facilitating skills and attitudes, career and economic competence, citizenship, physical and mental health, human relations, self-realization, home and family relationships and aesthetic and cultural appreciation. Copies of these goal statements are available upon request.

Program Title and Purposes
The program is called the Delaware Educational Accountability System (DEAS), of which the Delaware Educational Assessment Project (DEAP) is a part. This systematic, comprehensive, long-range planning model was developed to improve education in Delaware's public schools. The subsystems of this plan are designed to answer four main questions: What do we want from our educational system? (goals and objectives) What have we attained? (assessment of current status) What are our strengths and weaknesses? (needs assessment and priorities) And what can be done? (examination of current programs, formulation and implementation of new programs).

Initiation
The Delaware Educational Accountability System (DEAS) was developed in 1970 by the Planning, Research and Evaluation Division (PRE) of the Department of Public Instruction under a federal grant, Title IV, Section 402 of PL 90-247. The Delaware Educational Assessment Program (DEAP) was initiated in the spring of 1972. The 1971-72 assessment battery was developed according to criteria determined by the Department of Public Instruction, several DPI advisory committees and local agency personnel in consultation with the assessment contractor, Educational Testing Service (ETS).

Policy
The Delaware Educational Accountability Advisory Council was formed to provide advice and counsel in the design and implementation of the DEAS model. This council consists of the following members: President and Past President, Delaware School Boards Association; President and Past President, Delaware Chief School Officers Association; President, Delaware PTA; Executive Secretary, Delaware State Education Association; Dean, College of Education, University of Delaware; President, United Forces for Education; a Delaware teacher; and the Director of the Planning, Research and Evaluation Division, Delaware Department of Public Instruction. The Administrative Council of the Department, the State Board of Education, DEAS Coordinators, Chief School Officers in local school districts and Department staff are involved in determining program policy. Efforts have been coordinated by the Planning, Research and Evaluation Division.

Funding
Funding of the DEAS is 50 percent state and 50 percent federal. State funding is a direct line item of the budget. Federal aid comes from ESEA Titles I, III, V and Title IV, Section 402. During fiscal year 1973, roughly $200,000 will be spent on the DEAS program, of which some $60,000 will be for the assessment program.

Administration
Although the Planning, Research and Evaluation Division is responsible for coordinating the implementation of the various subsystems of this planning cycle, the operation of the fully implemented DEAS will be a total Department enterprise.

Population
Regular students in grades 1, 4 and 8 of all public schools and students in one grade per year in parochial schools are eligible for testing. All eligible schools participated in the DEAP program in fiscal 1972, and it is anticipated that the participation in the 1973 DEAP will be comparable.


Areas Assessed
Aptitude, language arts, reading, science, mathematics, knowledge of careers and attitudes toward school were assessed in 1972. In 1973 knowledge of careers and attitudes toward school will be omitted and replaced by assessment of physical and mental health.

Instrumentation
In grade 1, the Cooperative Primary Tests in reading, listening and mathematics have been modified and adapted for Delaware's assessment program. In grades 4 and 8, the School and College Ability Tests (SCAT) are used for measures of verbal and quantitative aptitude, and the Sequential Tests of Educational Progress (STEP), modified and adapted to Delaware's criteria, are used in the basic skill areas.

Related Data
Information on student, school and community variables such as socioeconomic status, pupil/teacher ratios, percent minority and school size is collected. These variables will be related to student performance when analyzing test results.

Data Collection and Processing
Classroom teachers, guidance counselors and school administrative staff administer the tests in the classroom. Educational Testing Service scores all tests, and analysis is performed by the Planning, Research and Evaluation Division staff.

Interpretation
The Department of Public Instruction provides test interpretation sessions to local districts upon request. In-depth analysis and interpretation of test results are a shared responsibility of local school districts and DPI Planning, Research and Evaluation Division and Instructional Division staff. Manuals to aid schools and local districts in test interpretation have been developed and disseminated to every school in every district.

Use of Data
Data from the testing program is used to identify strengths and weaknesses in school, district or state programs. Planning, Research and Evaluation and Instructional Division staff of the Department of Public Instruction and University of Delaware personnel will work with selected schools or districts to improve curriculum or programs in areas demonstrated by the DEAP to be in need of assistance.

Dissemination
Reports prepared by the Planning, Research and Evaluation Division include: a needs assessment report, state summaries, district summaries, school summaries and individual student results. In addition, public information releases and summaries of Title I students are prepared. State summaries and the needs assessment report are released to the governor, the legislature, State Board of Education, Department of Public Instruction, school districts and local schools, as well as to ERIC, newspapers, other state agencies and interested educational personnel. Release of district and school reports is at the discretion of the local school district.

Overview
Statewide, opinion has been favorable to the Delaware Educational Accountability System (DEAS), with greater acceptance being gained as educators find use for the products developed. Subsystems of DEAS are being implemented and adapted as time and resources permit. The major constraints in the DEAS program are lack of sufficient funds and staff.

Prospects for the Future
The Delaware Educational Accountability System is expected to continue for the foreseeable future, with modifications to the system being made as experience is gained.

Individual Interviewed
The primary source of information was Wilmer E. Wise, Director of the Planning, Research and Evaluation Division, Delaware Department of Public Instruction, Dover, Delaware 19901 (telephone: 302 678-4583).

REFERENCE
Planning, Research and Evaluation Division. Goal Statements for Delaware Public School Students for the 70's and 80's. September 1972.

DISTRICT OF COLUMBIA

Pupil Assessment: The Academic Achievement Project

Introduction
The educational organization of the District of Columbia is atypical in that all public schools are under the administrative control of the District of Columbia School Board. Therefore, the Academic Achievement Project more closely resembles a large district testing program than a state program.

Goals
A formal set of educational goals has been adopted by the Division of Education, which is somewhat analogous to a State Superintendent's Office, and the District Board of Education. A combination of input, process and output goals is reflected in the goals statement, prepared with the assistance of citizens' advisory groups, the Board of Education and the Board of Higher Education. The goals are available upon request.


Program Title and Purposes
Pupil Assessment: The Academic Achievement Project (Clark Plan) is the official title of the program. Its major purposes include the assessment of educational needs, the measurement of growth and influences on learning and the provision of information for a planning-programming-budgeting system.

Initiation
The program was initiated during the 1970-71 school year by the District of Columbia School Board under recommendation from a consultant, Dr. Kenneth B. Clark, hired by the School Board. The project is now in the third year of operation.

Policy
The Board of Education and the Department of Pupil Personnel Services, Pupil Appraisal Section, are responsible for determining how the program will be conducted and what changes will be made.

Funding
The program is funded by regular local budget appropriations; no federal funds are utilized. Approximately $475,000 was spent on the program in 1972.

Administration
The Department of Pupil Personnel Services, Pupil Appraisal Section, coordinates the program, which is staffed by six full-time professionals. Personnel from the major test publishing companies are used as consultants and as staff for inservice workshops and training programs.

Population
All students enrolled in the regular curriculum in grades 1 through 9 participate in the program. Dropouts do not participate. Since vocational schools begin at the tenth grade, no vocational students are included in the program. Parochial and private schools are excluded from the program; all other schools are required to participate.

Areas Assessed
The testing program is limited to cognitive skills in reading and mathematics.

Instrumentation
Standardized achievement tests in reading and mathematics are administered in grades 1 through 3. The tests used in grades 4 through 9 are tailor-made editions of criterion-referenced tests, the Prescriptive Reading and Prescriptive Mathematics tests developed by CTB/McGraw-Hill. Other standardized tests in reading and mathematics are administered in grades 4 through 9. These tests are used only as group-reliable measures. The District Board of Education approves the tests used in the program.


Related Data
Presently no additional information is collected as an aid in analyzing and interpreting the citywide test data. However, plans are being developed for collecting additional information on a citywide basis in the future. The Department of Research does collect additional data on 16 schools in the Evaluation Systems Project.

Data Collection and Processing
Tests are administered by classroom teachers and guidance counselors with the assistance of university students and paraprofessionals who serve as proctors. The Pupil Appraisal Section in the Department of Pupil Personnel Services is responsible for collating the information obtained. The tests are scored by the test publisher.

Interpretation
The Pupil Appraisal Section assists individual schools in interpreting test results. Results are sent directly to the schools by the scoring service. A series of workshops is conducted by the Pupil Appraisal Section to instruct school counselors and testing chairmen in interpreting and presenting test results to students, teachers and parents.

Use of Data
The data collected is used for the identification of exemplary programs, program evaluation and planning, guidance and instruction. The publication of results for individual pupils is prohibited.

Dissemination
The Pupil Appraisal Section prepares data summaries for the school system. Program reports are submitted to the Board of Education, the Department of Pupil Personnel Services, other states, teacher organizations and the press. The interpretation of test results to students and their parents is the responsibility of school counselors and testing chairmen.

Overview
The criterion-referenced testing component of the program is viewed favorably by local school administrators and students and viewed very favorably by teachers, the Board of Education and the Pupil Appraisal Section. However, the standardized testing component is not viewed as favorably. The general consensus is that the program objectives are being achieved very well, although there are some problems with acclimating school personnel to the rationale and purposes of a criterion-referenced testing program.

Prospects for the Future
This program is likely to continue with some modifications being made at the discretion of the Board of Education. The Pupil Appraisal Section is planning to enlarge its data storage capabilities and to follow the progress of specific groups of students longitudinally. It is likely that the program's goals and objectives will change after the completion of the three-year Academic Achievement Project.

Individuals Interviewed
The primary sources of information were Robert B. Farr, Director of Pupil Appraisal, Department of Pupil Personnel Services, District of Columbia Public Schools, 9th Floor, Presidential Building, 415 12th Street N.W., Washington, D.C. 20004 (telephone: 202 737-1189) and William Rumsey, Chief, Education Division, Department of Human Resources, 1329 E Street N.W., Suite 1032, Washington, D.C. 20004 (telephone: 202 638-2406).

FLORIDA

Florida State Educational Assessment Program

Goals
A formal set of 10 goals, covering goals for student development and organizational goals, has been prepared. The first seven cover goals for student development, and the last three cover organizational goals. The last goal area, "Evaluation of Strategies," provides the basis for Florida assessment: "The performance of the state system of public education shall be evaluated in terms of the achievement of its students and the efficiency of its process." These goals were prepared in a coordinated effort by the State Commissioner of Education, the State Board of Education and an advisory group consisting of citizens, educators and students. The goals have been formally adopted by the State Board of Education and reflect a combination of input, process and output. They are available upon request.

Program Title and Purposes
The official name of the Florida student testing program is the Florida State Educational Assessment Program. Its major purposes are to: assess cognitive development, assess educational needs, assess noncognitive development (a future purpose), measure influences on learning, provide data for a management information system (a future purpose) and provide information for a planning-programming-budgeting system (a future purpose).

Initiation
In June 1971, the Florida State Educational Assessment Program was enacted into law and funded through the cooperative efforts of the state legislature and the State Department of Education.

Policy
As assigned by the state legislature, the State Commissioner of Education and the State Department of Education determine program policy.


Funding
Florida assessment is completely funded through state general revenue monies. Approximately $96,000 was spent on assessment in 1971 and $300,000 in 1972.

Administration
The Division of Elementary and Secondary Education coordinates the program statewide. The equivalent of three and one-half full-time professionals work on the program at the state level. In-state consultants work on test development, item writing and development of program objectives. The consultants include local school districts, private firms and all the state universities. The primary outside contractor involved in 1972 was Harcourt Brace Jovanovich, Inc. (New York), which printed and scored all test materials. The Center for the Study of Evaluation at UCLA was the primary outside contractor in 1971 and also helped to develop the tests.

Population
During the 1971-72 school year, ages seven and nine were tested; in the 1972-73 school year ages eight, eleven and thirteen will be tested. Students are drawn by a multiple-matrix sampling of all public school students in the target age groups. Students who have dropped behind in grade level are now included, although educators are questioning the worth of this, and the practice may be discontinued in future assessments. Excluded from the study are school dropouts, parochial and private schools, and schools for the trainable mentally retarded. All public schools receiving state monies are required to participate, and, in the 1971-72 school year, 99 percent of the eligible schools did participate. Community characteristics are not used as a basis for school eligibility.
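
The multiple-matrix sampling mentioned above spreads a large item pool across the sampled students so that no one student takes every item, yet group-level statistics can still be estimated. The following Python sketch illustrates the general idea only; the item counts, form size and student identifiers are hypothetical and are not taken from the Florida program.

    import random

    def matrix_sample(items, students, items_per_form):
        # Partition the shuffled item pool into short forms, then give
        # each sampled student one form at random. No student sees the
        # whole pool, but every item is answered by some students.
        random.shuffle(items)
        forms = [items[i:i + items_per_form]
                 for i in range(0, len(items), items_per_form)]
        return {student: random.choice(forms) for student in students}

    # Hypothetical example: a 120-item pool split into six 20-item forms,
    # assigned at random to four sampled students.
    assignments = matrix_sample(list(range(120)), ["s1", "s2", "s3", "s4"], 20)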

Areas Assessed

In the 1971-72 school year, only the cognitive area of reading was assessed. For the 1972-73 school year, the cognitive areas of reading, writing and mathematics are being assessed. At this time no noncognitive areas are assessed, but an attitudes inventory on human relations skills is being planned. Also, occupational skills will be included in future years.

Instrumentation
All test instruments currently in use were tailor-made for the program. Item sampling was used to construct all tests. The 1971 tests were developed by the Center for the Study of Evaluation at UCLA, using state-written items. The 1972 tests were developed by in-state staff of the school districts, state universities and the State Department of Education; they were printed by Harcourt Brace Jovanovich, Inc. The tests are considered to be criterion-referenced measures and at the present time are used only for statewide and school districtwide assessment. Schoolwide and individual student assessment will be incorporated in future years. In addition, annual tests are given to all students in grades 9 and 12 using nationally normed tests developed by Educational Testing Service. It is planned to redesign and adapt these instruments for use in the statewide assessment program.

Related Data
Additional information collected to help analyze program data includes: age of student, county, percent minority, racial/ethnic background, school name, sex of student and, if handicapped, the type of handicap.

Data Collection and Processing
Classroom teachers and school administrative staff administer the tests. The State Education Agency (SEA) and the outside contractor process the program data. Program data is scored by Harcourt Brace Jovanovich, Inc.

Interpretation
The SEA and the local school districts coordinate organization, analysis and interpretation of the program data.

Use of Data
The primary use of program data is for instructional purposes; secondary uses include program planning and evaluation and public relations. Program authority does not restrict the use of program data.

Dissemination
Program reports are prepared by the SEA and include state summaries, district summaries and public information pieces. Copies of these reports are sent to the legislature, the newspapers, other states upon request, the State Board of Education, local school districts and schools, teacher organizations and ERIC.

Overview
The state-level education hierarchy is very supportive of Florida assessment, as is the local education administrative staff. Teachers are less favorable in their views, perhaps because they are waiting to see its value to the instructional program. It is believed that as the program is more fully developed it will be a valuable tool for classroom teachers. Program objectives are being met with 90 percent effectiveness, with the main problem being insufficient lead time for instrument development.

Prospects for the Future
The program is likely to change, with modification in these areas: goal development, subjects assessed, development of measuring instruments, data collection and processing, data interpretation, data use and dissemination of program results.

Individual Interviewed
The main source of information was William Cecil Golden, Associate Commissioner for Planning and Coordination, Florida Department of Education, Tallahassee, Florida 32304 (telephone: 904 488-6539).


REFERENCES
Florida State Department of Education. Goals for Education in Florida.

The State Commissioner of Education, Floyd T. Christian. Memorandum on Goals for Education in Florida, published August 17, 1971, by the Office of the State Commissioner.

GEORGIA

Introduction
The 1971 issue of State Educational Assessment Programs presented the Georgia Assessment Project (GAP). Outcomes of Phase I were discussed and research activities to be implemented in Phase II were described. GAP Phase II received no state funding and is continued in the Atlanta Assessment Project. In addition, the state has funded a Statewide Testing Program. Both are described below.

Atlanta Assessment Project (AAP)

Goals
A formal set of program goals has been prepared by a citizens' advisory group and adopted by the State Board of Education. These goals are concerned with a combination of input, educational process and student output and are available upon request. The point of departure in the establishment of Goals for Education in Atlanta, 1985, was the product set of Goals for Education in Georgia developed in the first phase of the Georgia Assessment Project. The AAP conducted studies to determine the extent to which the Goals for Education in Georgia are applicable to the Atlanta area. The studies to establish goals involved use of the Delphi technique, a method of forecasting developed by the Rand Corporation. The technique is intended for use in answering questions about the future when a great deal of uncertainty and complexity surrounds the questions.

Two types of forecasting are involved in establishing educational goals for the future. One type forecasts what conditions probably will be at a given point in the future; the other forecasts what educational goals should be in the light of these probable future conditions. The Delphi technique has been used in making both types of forecasts. The AAP used the technique in forecasting what educational goals should be in order to prepare young people in the Atlanta area to live successfully in the future.

There were three Delphi studies, each involving a different group of participants with different areas of expertise with respect to education. The three groups were (1) professional, technical, managerial and community leaders in the Atlanta area (N=275); (2) student leaders in the 25 high schools in the Atlanta Public Schools (N=369); and (3) teachers, counselors, curriculum coordinators, principals and other administrators in the Atlanta Public Schools (N=429). Teachers represented every subject matter area offered in the high school curriculum. Both teachers and students were selected to be representative of the total teacher and student population in each high school with respect to race and sex. The professional and technical participants were selected to represent all the major occupational categories at this level in the Dictionary of Occupational Titles across races.

In the Delphi studies, each of the Goals for Education in Georgia (86 separate goal statements) was rated as to its relative importance in preparing young people in the Atlanta area to live successfully in 1985 and thereafter. Further, each participant was given an opportunity to suggest additional goals that he felt were important and should be added to the list.

In the AAP, the Delphi studies involved three rounds of making judgments regarding the relative importance of goals, with each subsequent judgment about a goal by a participant made in the light of a summary of the judgments of all participants about that goal in the previous round. The primary result of the studies was a ranking of goals on the basis of the mean rating of importance, for each of the three groups of participants separately and across groups. Participants suggested 35 additional goals, which were added to the original list of 86.
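
The ranking step just described reduces to averaging each goal's importance ratings within a round and ordering the goals by that mean. A minimal Python sketch follows; the goal statements and ratings are invented for illustration and are not the 86 actual statements or the participants' responses.

    from statistics import mean

    def rank_goals(ratings):
        # ratings maps each goal statement to the list of importance
        # ratings it received from participants in one Delphi round.
        means = {goal: mean(scores) for goal, scores in ratings.items()}
        # Order goals from highest to lowest mean importance; this is the
        # kind of summary fed back to participants between rounds.
        return sorted(means.items(), key=lambda kv: kv[1], reverse=True)

    # Hypothetical round of ratings on a 1-to-5 importance scale.
    round_one = {
        "communication skills": [5, 4, 5, 4],
        "occupational preparation": [3, 4, 4, 3],
        "aesthetic appreciation": [2, 3, 3, 2],
    }
    ranked = rank_goals(round_one)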

Program Title and Purposes
The activities of the AAP fall into five consecutive phases: (1) planning, (2) establishing Goals for Education in Atlanta, 1985, (3) developing behavioral objectives to operationally define each goal at the twelfth-grade level, (4) developing criterion-referenced measures to determine achievement with respect to each objective and (5) administering the tests to produce the first round of results to be used in administrative and instructional decision making.

There are three separate activities underway in the development of behavioral objectives: (1) developing taxonomic specifications for each goal, (2) reviewing and classifying existing behavioral objectives collected from a variety of sources in terms of the Goals for Education in Atlanta, 1985, and in terms of taxonomic categories in the cognitive, affective and psychomotor domains and (3) writing additional behavioral objectives as necessary to represent each goal. The first two activities are being performed by teachers and curriculum specialists within the Atlanta Public Schools who have had experience with behavioral objectives and with the cognitive, affective and psychomotor domains. The review and classification of existing objectives will greatly decrease the magnitude of the task of writing new objectives for the project. The third activity is being performed by a larger number of teachers and curriculum specialists who have received training through the project in the writing of behavioral objectives and the use of the taxonomies of objectives to meet the specifications of the AAP. Community resource personnel in the professional and technical areas and students are acting in an advisory capacity to the writers.


Initiation
The Atlanta Assessment Project (AAP), jointly initiated by the state and city departments of education, is a 2-1/2 year endeavor to develop techniques and tools for measuring the progress of Atlanta's graduating high school seniors, and those who are old enough to graduate but have not, toward the achievement of educational goals relevant to living in 1985 and thereafter. The model for assessment being developed in the AAP is expected to be adapted for use in the Georgia Assessment Project (GAP).

The first two phases, the planning and the establishment of Goals for Education in Atlanta, 1985, have been completed. The third phase, the development of behavioral objectives that will operationally define each goal, is currently underway.

Policy
Program policy is set by the Superintendent through the Division of Program and Staff Development.

Funding
The AAP is funded under Title III of the Elementary and Secondary Education Act, Public Law 89-10, as administered in the Georgia Department of Education by the Division of Program and Staff Development (Dr. James E. Bottoms, Director; Dr. Will G. Atwood, Associate Director; Lester M. Solomon, Coordinator of Title III and project officer for the AAP). Between March and August 1972, $43,000 was spent on assessment. From September 1972 to June 1973, it is estimated that $250,000 will be spent.

Administration
The AAP is administered and operated within the Atlanta Public Schools, Atlanta, Georgia (Dr. John W. Letson, Superintendent), through the Division of Instructional Services (Dr. E. Curtis Henson, Assistant Superintendent for Instructional Services). The project staff (Dr. Ray L. Sweigert Jr., Director) includes five professional members and two secretaries: the director, a research associate, a systems manager, two research assistants, a stenographer and a typist with skills in operating a computer terminal.

Population
The tests will be administered to students about to graduate from high school and to other young people in the area who are old enough to graduate though they may have dropped back a year or so or have left school altogether. A matrix sampling method will be used, so that each student will receive a subset of the total battery of measures.

Areas Assessed
Though not yet finalized, program plans include assessment of the following cognitive areas: aptitude, English, health, mathematics, natural science, fine arts, reading, social science, vocational and writing. Plans also include assessment of attitudes, citizenship, creativity, interests, personal values and self-concept in the noncognitive area.


Instrumentation
The behavioral objectives developed in the project will be used to design criterion-referenced tests to measure the achievement of the objectives. Test development will be contracted to an outside agency, with task forces of objective writers in the project reviewing test items to determine the extent to which the items adequately reflect the intent of those who wrote the objectives. Items will be tried out on samples of students in the target population to determine the statistical characteristics of the measures.

Related Data
To help analyze and interpret the test data, the AAP will collect the following information: age, sex, dropout rates, teacher data, school size and socioeconomic and racial/ethnic data.

Data Collection and Processing
It has not yet been determined who will be responsible for administering the tests or processing the resulting information.

Interpretation
The State Education Agency, in conjunction with the Atlanta Public Schools, will be responsible for analyzing, organizing and interpreting the data.

Use of Data
The primary outcome will be criterion-referenced data as to the extent to which young people in the Atlanta area are mastering the Goals for Education in Atlanta, 1985. The data will be available for making decisions as to what instructional changes need to be made in order to provide even better education for the young people of Atlanta. Decision making resulting from the use of this information would ideally involve all levels of the system, with each level making those decisions appropriate to the resources available for discretionary use at that level.

Another important outcome is involvement: the involvement of teachers and other school personnel in the setting of goals and the writing of objectives, the involvement of students and the involvement of community leaders.

A final outcome, not least in importance, is a model for periodically reviewing and revising goals and objectives. In a world changing as fast as ours, and threatening to change even faster, it is increasingly necessary to review and update what it is that students are expected to do and to assess where they are with respect to meeting these expectations. The AAP will produce a model for doing that.

Dissemination
Reports of program results are prepared by the project staff in the form of school system summaries and sent to the State Education Agency, the State Board of Education, other states, school districts, schools, teacher organizations, ERIC and newspapers.


Overview
The program is viewed favorably by the governor and legislature and very favorably by the State Education Agency, the Boards of Education, local school administrators, teachers, students and the general public. Program objectives are being achieved very well, and no major problems have been encountered to date.

Prospects for the Future
The program is likely to continue with modification through the 1973-74 school year and, except for minor variations in the program, this description should remain accurate through that period. With the National Assessment model (paced tapes, testing out-of-school 17-year-olds and so on) being used next year, the element most likely to change is data collection and processing.

REFERENCES
For further information regarding the Atlanta Assessment Project, please contact the project office at 1001 Virginia Avenue, Suite 313, Hapeville, Georgia 30354 (telephone: 404 768-1047).

Statewide Testing Program (SWTP)

Goals
The State Education Agency prepared statewide educational goals, and the State Board of Education has adopted them. The fundamental goal of the SWTP is the improvement of achievement levels; the program is therefore output oriented.

Program Title and Purposes
The Statewide Testing Program (SWTP) endeavors to collect achievement data on students and comparative information to be used in evaluating the student, teacher, curriculum and learning tools. Identification and correction of weaknesses can thereby be effected.

Initiation
The governor and the state legislature initiated the program. Testing took place in its first year, fiscal 1972.

Policy
The Georgia State Department of Education is responsible for policy decisions.

Funding
Costs for the SWTP for fiscal year 1972 were approximately $350,000. This sum represents the cost of testing materials, teachers' manuals and correction and data compilation services. Funds were provided 100 percent by the state.


Administration
The SWTP is administered in the Georgia Department of Education by the Division of Program and Staff Development. There are two full-time employees involved with the SWTP (Dr. James E. Bottoms, Director, and Dr. Jerry W. Waites, Coordinator of SWTP).

Population
During fiscal year 1972 tests were administered to 250,000 students in grades 4, 8 and 12 throughout Georgia's 188 school systems. Dropouts and the educationally mentally retarded were excluded; delinquent, handicapped, parochial and private schools were not included. All students in the target population were tested, and participation was required (because schools receive state monies). Community characteristics were not a factor in target selection.

Areas Assessed
Basic skills in reading, vocabulary, work-study habits, mathematics and comprehension were measured in fourth and eighth graders. Achievement in reading, English usage and mathematics was measured in twelfth graders. All students were also given verbal and nonverbal ability tests. Testing plans for fiscal year 1973 involve grade 11 rather than grade 12.

Instrumentation
A committee selected standardized tests developed by the University of Iowa as instruments for the program. None are criterion-referenced or group-reliable.

Related Data
The following additional data is collected to help analyze and interpret the test data: student's age, cost of instructional programs, county, dropout rates, school name, school size, student's sex and socioeconomic and teacher data.

Data Collection and Processing
Teachers are responsible for administering tests for the program. In 1972, workshops were offered before and after testing to orient teachers unfamiliar with standardized testing. These sessions were not given in 1973 because they were not felt to be necessary. National Computer Systems is responsible for scoring the tests, and the State Education Agency is responsible for processing the information obtained for the program.

Interpretation
Local schools and teachers receive data on the overall achievement level of the students. Teachers may also examine individual students' answers to test questions.

Use of Data

Results of the program can be used for comparative analysis across schools, guidance, program evaluation and planning, teaching aids and improvement of students' achievement levels.


Dissemination

Program reports are jointly prepared by the State Education Agency and outside contractors in the form of state, regional, school system, school and student summaries. Public information pieces are also prepared. Copies of appropriate reports go to the State Board of Education, the governor, other states, school districts and newspapers.

Overview

The program is viewed favorably by students, teachers, local school administrators and the general public and even more favorably by the Boards of Education, the State Education Agency, the legislature and the governor. Generally speaking, program objectives are being well achieved. The major problem related to the SWTP is insufficient professional staff.

Prospects for the Future
The program is likely to continue. Some changes may be made with regard to measuring instruments: future plans involve administering additional tests prior and subsequent to the main battery of exams.

Individual Interviewed
The primary source of information was James E. Bottoms, Division of Planning, Research and Evaluation, Georgia State Department of Education, 231 State Office Building, Atlanta, Georgia 30312 (telephone: 404 656-2556).

HAWAII

Statewide Testing Program

Introduction
In discussing Hawaii's educational programs, it must be kept in mind that Hawaii is not like other states in its educational organization. The public schools of the state are under a single state administration, so the assessment program is comparable to a large district testing program rather than to a typical state assessment program. All public schools are under the district administrative control of the State Department of Education.

Goals
There is a master plan for education which includes a formal set of goals, available upon request. The State Department of Education was responsible for developing the master plan and goals, although no formal state adoption has been made. The agency is, of course, charged with the responsibility of operating the schools; thus, any further adoption of the plan is not technically necessary. The goals are both process and output in nature.


Program Title and Purposes
The Hawaii assessment program is officially known as the Statewide Testing Program. Its major purposes are to assess cognitive development and to measure growth. A management information system is in the development stage, and some of the results from the testing program will be used for this system, although the testing program was not developed primarily for this purpose. Its original and present conception is to provide instructional aid to the teacher.

Initiation
The program was initiated by the State Education Agency in 1957.

Policy
An Advisory Committee makes recommendations concerning the program to the Assistant Superintendent for Instructional Services who, in turn, makes recommendations to the Superintendent for final action.

Funding
The program is financed by state money from the General Education Fund. The cost for the program is not accounted for as a separate entity. There is an annual budget of about $30,000 for purchase of materials and for new students gained from increased enrollment. Most other costs are absorbed within various State Education Divisions. Scoring and other data processing is handled through the State Electronic Data Processing Division facility. Planning, coordination and general administration are handled by the Evaluation Section within the State Department of Education.

Administration
As indicated above, the State Department of Education is responsible for administering the program. One person within the department has primary full-time responsibility for the Statewide Testing Program. While the Advisory Committee is composed primarily of personnel from the State Department offices and the schools, it includes one member from the University of Hawaii Department of Educational Psychology and one member from the Community Colleges System Administrative Staff, Student Affairs.

Population
Because all public schools in the state are operated through the State Department of Education, the target population is the total student population attending public schools. The following grades indicate the levels at which various tests are given: grades 4, 5 and 6 and grades 7 through 12. Grade level, not age, is the basis for testing, and school dropouts are not included; educable retardates and the physically handicapped are also excluded. All public schools except those for the blind and the deaf are included, but no parochial or private schools are tested. School participation is required, and all eligible schools were included last year. Because all public schools are included, no community characteristics are used as a basis for selection.

Areas Assessed
The cognitive areas tested include differential aptitudes, English, mathematics, natural science, reading, social science and writing. No noncognitive areas are being assessed at this time.

Instrumentation
Appropriate test levels for each grade will be used in the following tests: the Sequential Tests of Educational Progress (STEP) in reading, mathematics, writing, science and social studies; the School and College Ability Tests (SCAT), forms 5A, 4A, 3A, 2A and 2B; and the Differential Aptitude Test (DAT), form L, revised (for grade 9 only). None of these tests has been tailor-made for this program, although there is ongoing work in the area of instrument development. As outlined above, the Advisory Committee makes its recommendations of standardized tests to be used to the Assistant Superintendent, and the final decision on which tests will be used is made by the Superintendent. None of the testing instruments currently in use was developed as a criterion-referenced measure, and all are used as individually reliable measures rather than as group-reliable measures.

Related Data
At present no additional data is collected for use in analysis and interpretation of test results. A separate effort is underway in the Planning Division to utilize a variety of socioeconomic and related data as part of the analyses.

Data Collection and Processing
Classroom teachers administer the tests. The tests are scored and processed by the State Education Agency, with the assistance of the State Department of Budget and Finance, Electronic Data Processing Division.

Interpretation
The State Education Agency is responsible for the organization, analysis and interpretation of all program data.

Use of Data
The Evaluation Section uses the program results in the instructional process, guidance, program evaluation, program planning and public relations. There is no legal restriction on the use of program data except for preserving the confidentiality of individual student test scores.

Dissemination
Reports of program results are prepared by the State Education Agency, and they include both state and district summaries. Information is also provided to individual schools. Program results are generally available to the governor, the legislature, the press and the public at large.


Overview

Among the problems presently related to the program are developing negative attitudes toward all testing, a lack of understanding of the usefulness of tests on the part of teachers and students, and the difficulty of making meaningful interpretations of results at both the legislative and public levels.

Prospects for the Future
The program is likely to continue indefinitely, with some modification. Changes will be gradual. There are no specific plans for change at the present time, although the goals or objectives are likely to become more explicit and sharply defined and communicated more fully to people in the state, the areas assessed may be broadened and changes in measuring instruments may occur. Future program reports will attempt to make more meaningful interpretations for both public and school usage.

Individuals Interviewed
The primary sources of information were Ronald Johnson, Administrator of Evaluation, Department of Education, Honolulu, Hawaii 96804 (telephone: 808 548-5891); Carl Fischer, Testing Specialist, Department of Education, Honolulu, Hawaii 96804 (telephone: 808 548-5941); and Kellett Min, Planning Division, Department of Education, Honolulu, Hawaii 96804 (telephone: 808 548-6485).

REFERENCES
Hawaii Department of Education. Hawaii Statewide Testing Program. January 1972.

Hawaii Department of Education. Summary Report of Minimum Testing Program 1970-71. Evaluation Report No. 80. December 1971. No. TAC 72-4558. Office of Instructional Services, Evaluation Section, Testing Unit.

Hawaii Department of Education. Minimum Testing Program:Policy and Regulations. Code nos. 2520 and 2520.1.

IDAHO

State Testing Program (STP)

Goals

A set of statewide educational goals has been jointly prepared by a citizens' advisory group, the Chief State School Officer, the State Board of Education and the State Education Agency and has been formally adopted by the State Education Agency and the State Board of Education. The goals concern process and are available upon request.

Program Title and Purposes

The official name of this effort is the State Testing Program. The major purposes are to assess cognitive development, to provide data for a management information system and to provide a counseling tool for local schools.


Initiation
The program was initiated six years ago by the State Education Agency.

Policy

The STP policy is determined by the Chief State School Officer, the State Board of Education, the State Education Agency and representatives of major school systems.

Funding
Program funding is 100 percent federal through ESEA Title III. Last year roughly $24,000 was spent.

Administration
Pupil Personnel Services of the State Department of Education coordinates the program statewide. The professional staff consists of one state and .01 local full-time equivalents.

Population

The target groups are defined by grades 9 and 11. The state excludes only dropouts from the target population and tests the remainder. All other eliminations were self-initiated, as participation is voluntary. Last year 95-97 percent of all school districts with grades 9 and 11 participated.

Areas Assessed

The STP tested only the cognitive areas of aptitude, English, mathematics, natural science, reading and social science.

Instrumentation
The Differential Aptitude Test (DAT) was administered to ninth graders, and the Iowa Tests of Educational Development (ITED) were administered to eleventh graders. The DAT and ITED were chosen by an advisory committee. Neither test is considered criterion-referenced. Both are used as group-reliable measures by the State Education Agency.

Related Data

To help in the analysis and interpretation of data, the following information was utilized: age, dropout rates, school name, school size and sex.

Data Collection and Processing

Guidance counselors are responsible for giving the tests to the students. The University of Idaho State Testing Service is responsible for scoring and data processing.

Interpretation
The State Education Agency and the University of Idaho are responsible for analyzing, organizing and, to an extent, interpreting test data. The major responsibility for interpretation lies with the local school districts.


Use of Data
The results of the program are used for comparative analysis across schools, guidance, program evaluation, program planning and public relations. The authority of the STP does not restrict use of the data.

Dissemination
Reports are jointly prepared by the University of Idaho and the State Education Agency and sent to the State Board of Education, school districts, schools, students and parents. Reports are prepared in the form of a state summary, school system summaries, school summaries and individual student reports.

Overview
Program objectives are not being achieved as well as had been hoped.

Prospects for the Future
The program will continue, although with modification. Goals and objectives, interpretation of data and use of data are among the aspects most likely to change. The current description should remain accurate for approximately one year.

Individual Interviewed
The primary source of information was Ronald J. Dent, Consultant to Pupil Personnel Services, State Office Building, 650 State Street, Boise, Idaho 83707 (telephone: 208 384-2115).

ILLINOIS

Illinois Inventory of Educational Progress (IIEP)

Goals
Illinois has a formal working set of goals in nine broad subject areas. These goals were prepared by the Office of the Superintendent of Public Instruction (OSPI) and are based on ideas and suggestions collected at six public hearings held throughout the state between June and August of 1971. The existing goals are presently being reviewed through another series of statewide public hearings, and they have been adopted by the Assessment and Evaluation Planning Section of the Office of the Superintendent of Public Instruction as the focus for the assessment program. The goals have been mailed to all school systems and are available upon request.

Program Title and Purposes
The official title of the assessment program is the Illinois Inventory of Educational Progress. The major purposes of the program are to assess cognitive and noncognitive development, to measure growth and to provide information for state-level decision-making bodies. The two major goals of the assessment program are (1) to make available the first census-like data on the educational attainments of Illinois students and (2) to measure any growth or decline which takes place over time.

Initiation
The Office of the Superintendent of Public Instruction, to meet ESEA Title III requirements, initiated the first phase of the program in 1971. After approximately four years of field testing, the first data will be collected on a statewide sample basis in September 1975.

Policy
The following groups advise the OSPI on how the program should be conducted and what changes should be made: a citizens' advisory group, representatives of school systems and the state legislature.

Funding
This program has been financed by federal ESEA Title III and ESEA Title V in the past. In the future, Illinois plans to utilize state funds.

Administration
County and local educators and the state Assessment and Evaluation Planning personnel coordinate the program statewide. Four full-time professionals at the state level work on assessment. Individuals from the state universities now serve as consultants. Plans are being made for the Data Services Section of the Office of the Superintendent of Public Instruction to conduct the data analysis.

Population
The target population, areas assessed and instrumentation described here all refer to plans for September 1975. Illinois plans to test both by age (9-, 13- and 17-year-olds) and by grade (grades 4, 8 and 11). It is hoped that dropouts and students of a given age falling behind the corresponding grade level will be included. Local participation will be voluntary. A sample will be drawn by the school districts and will be stratified using size and type of school district, size and type of community, sex, race, per pupil expenditure and other community and socioeconomic variables.
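
Proportional stratified sampling of the kind planned here allocates the total sample across strata (district size and type, community type and so on) in proportion to each stratum's share of the population, then draws at random within each stratum. The Python sketch below is illustrative only; the stratum labels and counts are assumed, not Illinois figures.

    import random

    def stratified_sample(strata, total_n):
        # strata maps a stratum label to its list of eligible students.
        # Each stratum contributes a share of the sample proportional to
        # its size; members are drawn at random within the stratum.
        population = sum(len(members) for members in strata.values())
        sample = {}
        for label, members in strata.items():
            n = round(total_n * len(members) / population)
            sample[label] = random.sample(members, min(n, len(members)))
        return sample

    # Hypothetical strata with toy rosters.
    strata = {
        "large urban districts": ["u%d" % i for i in range(1000)],
        "small rural districts": ["r%d" % i for i in range(250)],
    }
    drawn = stratified_sample(strata, total_n=125)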

Areas Assessed
In the cognitive domain, English, mathematics, natural science, reading, social science, vocational education, writing and the arts are being considered. In the noncognitive domain, citizenship, creativity, personal values, self-concept, interests in relation to work and the future and psychomotor performance are being considered.

Instrumentation
Formal testing on a statewide basis will not be done until September 1975. Between now and 1975, pretesting and existing data from data banks will be used to construct guidelines for choosing or designing instruments. The instruments will be in the form of exercises, some borrowed from National Assessment. Standardized tests, if any, will be chosen by a committee. The instruments will be used as group-reliable measures; as presently conceived, at least 75 percent of the instruments will be criterion-referenced measures.

Related Data

Cost of instructional programs, dropout rates, paraprofessionals or education aides, percentage minority, racial/ethnic, socioeconomic, sex, school size and teacher data will be collected and used in the analysis and interpretation of the data.

Data Collection and Processing
Classroom teachers, guidance counselors, curriculum coordinators and paid proctors will be used to collect data. The State Education Agency will have the responsibility for analyzing, interpreting and disseminating the findings of the assessment program.

Interpretation
The OSPI is also responsible for analyzing, organizing and interpreting the data.

Use of Data
It is anticipated that the results will be used in the following areas: guidance, program evaluation and planning, comparative analysis of school districts by general characteristics, allocation of state and/or federal funds and public relations. Schools will not be compared with other schools. Other restrictions do not exist at this time; however, the state is proceeding with caution, and additional restrictions may be imposed in the future.

Dissemination

Reports will be prepared at the appropriate time by the Superintendent of Public Instruction. State, regional, target area and school system characteristics summaries will be prepared, as will public information pamphlets. Reports will be sent to the governor, the legislature, other states, school districts and ERIC. The reports will also be available to newspapers, students and parents upon request.

Overview

The bill put before the legislature requesting authority for and outlining the proposed program has been viewed very favorably. The basis of this bill is the development of a systematic information-gathering program to provide state-level decision makers and the general citizenry with information about the direct outcomes of education as exhibited by students. The ultimate purpose is to provide information that can be used to improve the educational process, to improve education at all levels by using information about what students know, what skills they have developed and what their attitudes are.


Prospects for the Future

The program will be developed as a continuous and comprehensive assessment of student knowledge, understanding, skills and attitudes. A four- or five-year cycling process will reveal changes over a continuum. Pilot programs will be conducted prior to the fall of 1975.

Individual Interviewed
The primary source of information was Thomas Springer, Director of the Assessment and Evaluation Planning Section, Office of the Superintendent of Public Instruction, Springfield, Illinois 62706 (telephone: 217 525-4980).

REFERENCE
Action Goals for the Seventies: An Agenda for Illinois Education, May 1972. Office of the Superintendent of Public Instruction, Illinois.

INDIANA

Indiana State Needs Assessment Program

Goals

A set of nine goals for education in Indiana has been prepared. The present set of goals does not deal directly with assessment, but with educational institutions and management. The goals were prepared by a committee consisting of the following four Department of Public Instruction employees: the Director for State Planning (the only educator), the Assistant Superintendent for Finance, the Chief Accountant in the Department and the Assistant Superintendent for Administration Services. The goals have not been formally adopted by the state, and it is quite likely that they will be changed drastically in the near future. They are available on request.

Program Title and Purposes

The official name of the program is the Indiana State Needs Assessment Program. Its main purposes are to assess cognitive development and to assess educational needs, with limited attention to the affective domain.

Initiation
The program was initiated by the State Title III Office in August 1971. The first test administration was planned for February 1973.

Policy

Program policy is determined by the State Title III Office with the aid of some consultants.

Funding

Funding is through the federal ESEA Title III. Roughly $30,000 was spent on the program last year.


Administration
The State ESEA Title III Office coordinates the program statewide, with one full-time professional working on assessment at the state level. Local education agencies (districts) did much of the pretesting work for the pilots, and Dr. Terry Schurr of Ball State University did much of the item analysis and instrumentation.

Population
The target population will consist of public school students in grades 4, 8 and 12. A sample will be drawn from this population, and those students drawn will be asked to participate. Public school participation is voluntary. Community characteristics (whether rural, urban or suburban, and size) are used as one basis of selection. Parochial and private schools and school dropouts are excluded from the study.

Areas Assessed
The cognitive areas of English, mathematics, natural science, reading, social science and vocational education will be assessed. The noncognitive areas of citizenship, self-concept and attitude toward school will also be assessed.

Instrumentation
There is only one testing instrument, the Needs Assessment Instrument, consisting of 200 items (80 reading items and 120 items divided equally among all other areas). This is a tailor-made instrument, with some items borrowed from National Assessment. The vocational items in the test will be given only to grade 12 students. Field testing is being done by the ESEA Title III Division to determine instrument validity. The instrument development was a coordinated effort by the State Title III Office and consultants. The Needs Assessment Instrument will be used as a group-reliable measure. It is not considered a criterion-referenced measure.

Related Data
The age and grade level of each participating student will be collected to help analyze program data.

Data Collection and Processing
It has not been determined who will administer the test, but it will probably be classroom teachers and guidance counselors. Ball State University will process program data.

Interpretation
Data analysis, organization and interpretation will probably be done by Dr. Terry Schurr at Ball State University.

Use of Data
Program data will be used to identify target Title III programs for the future. It will also be used to identify the most important exemplary programs throughout the state. Program authority does not restrict the use of program data.


Dissemination
Program reports are being prepared by the State Title III Office and will be disseminated statewide in the form of state summaries. These reports are directed toward school people (schools and school districts), but others, such as the governor, the legislature and state newspapers, do have access to the reports.

Overview
It is too early to determine opinions statewide, but the state education hierarchy does seem to view Indiana assessment favorably. About 60-70 percent of the state objectives are being achieved, despite problems of program coordination and occasional duplication of effort (e.g., the Right to Read Project).

Prospects for the Future
The Indiana program is likely to continue, with possible changes occurring near the end of 1973. The State Department of Education offices are elective, and new officials bringing new ideas may change the whole structure of the assessment program.

Individual Interviewed
The main source of information was Ivan Wagner, Director for State Planning and Evaluation, State Department of Public Instruction, Room 225, Capitol Building, 309 West Washington Street, Indianapolis, Indiana 46204 (telephone: 317 633-4963).

IOWA

Iowa Needs Assessment Program (INAP)

Goals
Iowa does not have a formal set of educational goals at this time but is presently working on a process for ranking goals. Field tests should be completed in the spring of 1973.

Program Title and Purposes
The program's working title is the Iowa Needs Assessment Program (INAP). Whether or not this will become the official title has not yet been decided. The primary purposes of this program are to assist local districts to do their own assessment, measure growth, assess cognitive and noncognitive development and assess educational needs. Secondary purposes are to provide data for a management information system and to set statewide educational goals. Indirectly the program will also provide information for a planning-programming-budgeting system.

Initiation
The INAP was initiated in 1969 by the Planning, Research and Evaluation office. Although the office carried out testing last year, this year it is working on processes for ranking goals.


Policy
Program policy is jointly determined by the Chief State School Officer, the State Board of Education, the State Education Agency and, to a degree, representatives of major school systems.

Funding
The program is completely federally funded. The major source of funds is ESEA Title III, with some funds from Title IV and Title V. Last year the program cost approximately $85,000.

Administration
The Planning, Research and Evaluation office, of which ESEA Title III is a part, is responsible for coordinating the program statewide. Two professional staff members work full time on state assessment with the help of two outside contractors. Iowa State University assisted with sampling procedures and selection, and Measurement Research Corporation (MRC) assisted with data collection and analysis.

Population
This year the INAP is doing no testing, as it is concentrating on other aspects of the program; however, last year it did test at grades 4, 7 and 12. Excluded from the target population were dropouts, the educationally mentally retarded and the emotionally disturbed, physically handicapped students and parochial and vocational-technical schools. A 20 percent sample of school districts was randomly selected last year, and 83 of the state's 450 districts participated. Participation was voluntary and was not confined to communities with specific characteristics.

Areas Assessed
The cognitive areas of reading, literature and science have been assessed, as well as the noncognitive areas of attitude toward school and self-concept.

Instrumentation
National Assessment exercises were used last year for the cognitive areas of reading, science and literature. Affective instruments in Attitudes Toward School and Self-Concept developed at the Instructional Objectives Exchange were used in the 1970-71 school year. None of the test material was tailor-made. The instruments were chosen for the program by the State Education Agency and were group-reliable. The National Assessment exercises for reading, science and literature were criterion-referenced.

Related Data
Age, sex, urban vs. suburban regions, grades, school name, racial/ethnic information and school size were utilized in analyzing and interpreting the data.

Data Collection and Processing
The INAP hired proctors to administer the tests. MRC was then responsible for processing and analyzing the answer sheets and test results.


Interpretation
The Planning, Research and Evaluation office of the State Education Agency organized and interpreted the data for its own use.

Use of Data
The data is useful to assist local districts in making comparisons of their results with state results. The use of the data is not restricted by state authority.

Dissemination
A report has been prepared by the Planning, Research and Evaluation staff in the form of a state summary by region. Reports were made available to the State Education Agency, the State Board of Education and other states. Reports also went to school districts and schools that had participated and to other interested parties on request.

Overview
The State Education Agency, State Board of Education and local school districts and administrators are responding to the program with mixed feelings. The intent of the program is to aid local school districts in conducting their own assessment. However, to date, local participation has been limited. The state feels that program objectives are not being fully accomplished at this time. The main problem seems to be developing awareness at the local levels and inspiring use.

Prospects for the Future
The program will likely continue, with the following areas subject to change: areas assessed, instruments used and interpretation, use and dissemination of data. The present description of the INAP should remain accurate at least through 1973.

Individual Interviewed
The primary source of information was Max Morrison, Department of Public Instruction, Grimes State Office Building, Des Moines, Iowa 50319 (telephone: 515 281-5274).

KANSAS

Kansas Needs Assessment Model

Introduction
The Kansas Needs Assessment Model is still in the relatively early stages of its development. For this reason, much of the information concerning data collection, interpretation, use and dissemination is tentative or general in nature, while information on reactions to the program and evaluation of its success or achievement is lacking. Most complete at this time is information concerning the background of the program, its goals and objectives, the target population to be assessed and the proposed areas of assessment.


Goals
On July 6, 1972, the Kansas State Board of Education formally adopted a set of statewide goals and objectives for education. The 4 major goals and 35 subgoals are the result of work and participation by approximately 8,000 teachers, the Kansas State Department of Education staff, representatives of the U.S. Office of Education, students, administrators and citizens. The goals stress the need to involve lay citizens and students in the planning of the school's curriculum and to provide specific programs for dropouts, exceptional children and persons in penal institutions. The goals also reflect the need for better communication among students, teachers, parents and administrators. The current set of goals is available upon request.

Program Title and Purposes
Although the title is unofficial, the program is currently known as the Kansas Needs Assessment Model. Its major purposes include identifying statewide goals, subgoals and objectives, reaffirming existing goals and revising them as necessary, assessing cognitive/noncognitive development and educational needs and providing data for a management information system. The program is aimed at uncovering discrepancies between goals and objectives and actual student behavior and attitudes. The Model is not concerned with the performance or growth of individuals, but rather with district and statewide assessment of educational needs.
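
The discrepancy notion described above can be made concrete as a comparison of each objective's target attainment level with observed group performance, where a positive gap marks an unmet need. The following is a minimal Python sketch with invented objectives and percent-mastery figures; nothing here is drawn from the actual Kansas instruments.

    def needs_discrepancies(targets, observed):
        # Subtract observed group performance from the target level for
        # each objective; a positive gap signals an unmet educational need.
        return {obj: targets[obj] - observed.get(obj, 0.0) for obj in targets}

    # Hypothetical statewide percent-mastery figures.
    gaps = needs_discrepancies(
        targets={"reading comprehension": 90.0, "consumer education": 80.0},
        observed={"reading comprehension": 72.0, "consumer education": 81.0},
    )
    unmet = {obj: gap for obj, gap in gaps.items() if gap > 0}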

Initiation
Project SEEK (State Educational Evaluation of Kansas) initiated the setting of goals for education in 1969 when it assessed the educational needs of Kansas students (grades K-12); another contributory effort was made by the Mid-continent Regional Educational Laboratory (McREL), which attempted to rank by priority the educational needs as perceived by the State Department of Education staff. Three years after the inception of the present phase of statewide goals for education, the program was undertaken.

Policy
The State Board of Education has been involved in determining how the program will be conducted and has approved plans for the Kansas Needs Assessment Model. Other groups involved include a citizens' advisory group, the Chief State School Officer, representatives of major school systems, the State Education Agency and ESEA Title III; some of these groups, although not currently very active in the program, will be involved to a greater extent as the program develops.

Funding
So far 100 percent of the program expenses have been financed by ESEA Title III federal funds. Future plans call for state fund involvement, but the funds to be utilized have not, as yet, been identified. There is no data available as to last year's program costs, as this is the program's first year of operation.


Administration
The ESEA Title III and Kansas State Department of Education Planning, Evaluation and Research Council staff coordinate the program statewide. The professional staff size at the state level is one full-time equivalent; other staff members are involved as needed. The needs assessment activities are performed by the Bureau of Educational Research of the University of Kansas under contract with the Kansas State Department of Education. There are 1.5 professional full-time equivalents on the state level devoting their time to needs assessment.

Population
Representative samples will be tested from the target population of public school students, enrolled in grades 6 and 12, who attend regular classes. No particular type of school district or individual school is required to participate in the program; all schools will be encouraged to administer the tests on a voluntary basis, though only those selected will be included in the sample. Private, parochial, delinquent or handicapped schools are not included in the target population, nor are any students who are institutionalized or in special classes.

Areas Assessed
Cognitive areas assessed include English, health, humanities, consumer and environmental education, mathematics, natural and social sciences, reading, writing, and vocational areas. Noncognitive areas to be assessed are citizenship, interests, self-concept and personal values. During the spring of 1973 Kansas will be pilot testing science and mathematics instruments.

Instrumentation
The measures to be administered for the program have not been selected or developed at this time, although it is planned to develop and/or select measures from existing instruments. Although these instruments will probably be criterion-referenced measures, it has not yet been decided whether item sampling will be used in their construction. It is the intent of the program to develop group-reliable measures; no individual feedback is planned.

Related Data
No decision on the collection of related data has been made at this time.

Data Collection and Processing
No decision has yet been made as to who will be responsible for administering the tests, inventories and so forth for the program. The State Education Agency will be responsible for processing the information obtained from the program.

Interpretation
The State Education Agency has the primary responsibility for analyzing, organizing and interpreting the data. However, this does not preclude local districts from making additional analyses on their own.



Use of Data
Anticipated uses for the program results are: allocation of state and federal funds, budgeting, instruction, program planning and public relations. It is State Department policy that individual data will not be identified.

Dissemination
Plans indicate that annual public information pieces and state, regional, school system, school and target area summaries will be prepared by the State Education Agency. Specific recipients of these program reports have not yet been identified, but, generally, all Kansas school districts and other interested parties will receive copies.

Overview

It is too early to say how the program is generally viewed, or whether it is believed that the program is achieving its objectives. The major problems facing the program at this stage are financial.

Prospects for the Future
This program is likely to continue although, as evidenced by the previous information given, the areas to be assessed, the measuring instruments to be used and the data collection procedures to be employed are likely to change in the near future. Funding is another element likely to change as the program is developed. As society changes, it is expected that the goals and needs of education will change. The Kansas Needs Assessment Model will be a continuous process which will allow updating of goals and identification of critical needs.

Individuals Interviewed
The primary sources of information were Phillip S. Thomas, Director of Title III Section, Kansas State Department of Education, 120 East 10th Street, Topeka, Kansas 66612 (telephone: 913 296-3128); and Dick Tracy, State Consultant, University of Kansas School of Education, Lawrence, Kansas 66044 (telephone: 913 864-3758 or 913 296-0111).

REFERENCE
Statewide Goals for Education in Kansas, published July 6, 1972, by the Kansas State Board of Education, Topeka.

KENTUCKY

Kentucky Assessment Program

Introduction
Phase I of the Kentucky Needs Assessment Study delineated learner needs. Phase II has been establishing, and continues to establish, needed objectives and the criteria for those objectives. Phase II work is described here.


Goals
Several sets of statewide educational goals have been prepared by different lay and professional educator groups as a part of the Kentucky Needs Assessment Study. There are both similarities and differences in these sets of statements since they were prepared somewhat independently by the different groups. Personnel from various offices and divisions of the Kentucky Department of Education are now attempting to reconcile the differences and synthesize these different sets into a single set acceptable to all who participated in the goal-setting process. It is anticipated that any dissimilarities will be reconciled and that the State Education Agency will adopt the final set. These goals will be concerned with both process and output.

Program Title and Purposes
It is anticipated that the official name of the program will be the Kentucky Educational Assessment Program and that its major purposes will probably include assessment of educational needs, measurement of pupil growth and utilization for the development of a planning-programming-budgeting system.

Initiation
The program was initiated by the State Education Agency. Phase I was completed in September of 1970. Phase II is now in its third year.

Policy
All policy-making functions are vested in the State Education Agency.

Funding
Funding of the program in 1971-72 came from federal and state general funds. About 70 percent of total expenses were provided by ESEA Title V. The program cost approximately $50,000 in 1971-72.

Administration
Responsibility for statewide program coordination lies within the recently established Office of Planning and Research. Its four divisions are Research, Planning, Evaluation and Dissemination. Two full-time professionals work on the program at the state level. EPIC Diversified Systems Corporation (Tucson, Arizona) provided consultant service to earlier phases of the program and will continue to provide assistance in planning computer service and data analysis through 1972-73.

Population
The target population is defined by grades 4, 8 and 11. Samples of students are drawn from each grade. Dropouts are excluded because of unavailability. Parochial, private and vocational/technical schools have not been included in this study. No student is excluded by reason of any special designation or diagnosis. A representative sample of districts is selected within each of the state's 17 educational development districts, and they are invited to participate.





In 1971-72, 80 districts of a state total of 189 were invited to participate. During 1972-73, 120 school districts are expected to participate. Community characteristics were not considered as a basis for selection.

Areas Assessed
The cognitive areas of reading and mathematics and the noncognitive areas of citizenship, psychomotor (physical fitness) skills and attitudes toward school, peers and self are being assessed in Phase II.

Instrumentation
Measuring instruments in the cognitive areas include: Comprehensive Test of Basic Skills, Form Q, Level II (fourth grade); Comprehensive Test of Basic Skills, Form Q, Level III (eighth grade); and Comprehensive Test of Basic Skills, Form Q, Level IV (eleventh grade). The Physical Fitness Test of the American Association for Health, Physical Education and Recreation was used to assess psychomotor skills. These tests were selected by teacher committees throughout the state in an effort to obtain measuring instruments which appeared to be most nearly congruent with selected state goals of education. Attitude measures were developed by teacher committees with consultation from EPIC. The test battery is considered to be a combination of criterion- and norm-referenced measures. Though some tests are individually reliable, the state is using results for group interpretation only.

Related Data
Several variables were selected by the State Department of Education to be used in the analysis of data. These included geographic representation, sex, adjusted gross family income, per-pupil expenditure for education and total student population by grade level.

Data Collection and Processing
In each school selected for the sample, one person from the school administrative staff is selected to be responsible for administering the tests and inventories used in the program. The State Department of Education is responsible for processing all information.

Interpretation
EPIC has provided services for preparing analyses and interpretations of program data. However, the Department of Education staff directs EPIC in the kinds of reports and analyses which are to be prepared.

Use of Data
Results of the assessment are used in program planning, program evaluation and instruction. Since the standardized tests selected for this program are those which are assumed to have the highest correlation with the instructional program, considerable emphasis is placed upon an examination of those test items missed by 50 percent of the students and the percentage of students missing those items deemed most related to relevant instructional outcomes.


Dissemination
No restrictions are placed upon the use or dissemination of the assessment program data. Reports are prepared by EPIC under Department of Education guidance and distributed to all school districts in the state and the Department of Education. Reporting is on a statewide, regional and local district basis.

Prospects for the Future
This program will undoubtedly continue, with expansion of areas assessed and modification of data collected on associated variables, interpretation, use and dissemination of information. A significant first effort in this planned expansion is the establishment of the Office of Planning and Research within the Department of Education and the move by the department to obtain clear agreement on and commitment to the goals of education for Kentucky by personnel within the department.

Individuals Interviewed
The primary sources of information were David Shannon, Head of the Office of Planning and Research, 17th Floor, Capital Plaza, Frankfort, Kentucky 40601 (telephone: 502 564-4394), and D.E. Elswick, Director of the Division of Education Research, 17th Floor, Capital Plaza, Frankfort, Kentucky 40601 (telephone: 502 564-4398).

REFERENCES
Kentucky State Department of Education. Kentucky Interstate Project: Mid-Atlantic Region Interstate Project. Final Report, December 1972.

Kentucky State Department of Education. Kentucky Educational Needs Assessment Study. Phase II, Learner Needs. September 1971.

Kentucky State Department of Education. Learner Outcomes. Kentucky Educational Needs Assessment Study. October 1972.

LOUISIANA

Louisiana Learner Needs Assessment Program

Goals
There are 14 major priorities for Louisiana education, as formulated by State Superintendent of Schools Louis J. Michot. A combination of input, process and output, they have not yet been formally adopted by the state. Copies of Priorities for Progress in Public Education are available upon request.

Program Title and Purposes
The official title of the Louisiana program is the Louisiana Learner Needs Assessment Program. Its major purposes are to: assess cognitive development, assess learner needs, assess noncognitive development (a future objective), provide data for a management information system, measure growth and set statewide educational goals.


Initiation
The Assessment Program is an outgrowth of the ESEA Title III office. Program development began in the latter part of 1969 with a statewide assessment of facilitating needs. The first test administration in reference to learner needs is set for the early spring of 1973.

Policy
The State Superintendent of Education and the State Education Agency are responsible for setting program policy.

Funding
The program is now funded by federal monies under ESEA Title III. Projections indicate, however, that for the fiscal year 1973-74, 50 percent of the funds will come from the state, and some of the federal funds used will be from ESEA Title I. Roughly $140,000 has been spent on the developmental stages of the program.

Administration
The state Bureau of Experimental Programs is responsible for coordinating the Assessment Program statewide. Four full-time professionals work on the program at the state level. No regular outside consultants are used, although advice was sought from the staff of the National Assessment of Educational Progress (NAEP).

Population
Ages 9, 13 and 17 will be assessed in the first test administration. School dropouts and students who have dropped behind in grade level will be included. Students will be chosen by a random sampling of all students in the target age group. All schools receiving state assistance will be required to participate. Parochial and private schools and schools for the handicapped (blind and deaf) will be excluded from the study. Community characteristics (i.e., community size and whether rural, urban or suburban) are being considered.

Areas Assessed
For the 1973 test administration, only the cognitive area of reading will be assessed. Future plans call for testing in the cognitive areas of mathematics, natural science, social science and vocational education and the noncognitive areas of attitudes, citizenship and personal interests. There is also discussion regarding the development of instruments to be used in penal institutions and instruments to be used for special education and career education.

Instrumentation
The reading instrument to be used was tailor-made by the Bureau of Experimental Programs, with assistance from NAEP personnel and reading experts throughout the state. Item sampling was used in test construction, and the test is, as a result, only group-reliable. No individual student interpretations will be made. The test is a criterion-referenced measure.


Related Data
The only additional data to be collected to help program analysis are the age and sex of the students tested.

Data Collection and Processing
Paid consultants, trained by the SEA, will be responsible for administering the tests; they will probably be substitute teachers and/or retired teachers and school administrators. The Bureau of Experimental Programs will be responsible for processing all program data.

Interpretation
Program data will be analyzed, organized and reported by the Bureau of Experimental Programs. Interpretation of the data will involve all facets of the state's community.

Use of Data
Louisiana plans to use initial program data for program planning; future use may be made in the area of fund allocation. Data will also probably be used by local schools, parishes (districts), state and private universities and local citizens' groups for individual interpretations. Use of the data will not be restricted by program authority.

Dissemination
Program reports will be prepared by the Bureau of Experimental Programs and will include: state and regional summaries; summaries by target area; summaries of comparisons (state to national, state to regional, state to state); and summaries of cross-matrix comparisons. Wherever possible, comparisons will be made between the results obtained in Louisiana and those reported by NAEP. The following will probably receive copies of program reports: the governor, the legislature, newspapers, other states, the State Board of Education, local school districts, local schools, parents, teacher organizations and ERIC.

Overview
The program is viewed favorably by members of the State Department of Education, but it is too early to determine opinion statewide. Program objectives are being adequately achieved, with the major problem being unavailability of qualified professionals.

Prospects for the Future
The Louisiana Assessment Program is likely to continue with some modification. The present work will probably remain constant through the end of 1973. Changes will occur in the areas of policy control, state and federal funding, the types of measuring instruments used and the number of areas assessed.

Individual Interviewed
The main source of information was Dan K. Lewis, Director of the Bureau of Experimental Programs, State Department of Education, P.O. Box 44064, Capital Station, Baton Rouge, Louisiana 70804 (telephone: 504 389-6211).


MAINE

Maine Assessment of Educational Progress

Goals
A formal set of statewide educational goals has been prepared as part of the "Proposed Philosophy for Maine Schools," which contains not only goals for education, but also a philosophy of education, educational concerns and state guidelines for education. It was prepared by the State Curriculum Committee, composed of teachers, the State Superintendent of Schools, principals and members of the State Department of Education. The goals have been formally adopted by the State Board of Education and reflect a combination of input, process and output. The "Proposed Philosophy for Maine Schools" is available upon request.

Program Title and Purposes
The official title of the Maine program is the Maine Assessment of Educational Progress. Its major purposes are to: assess cognitive development, assess educational needs, assess noncognitive development, measure influences on learning, measure growth, provide data for a management information system and provide information for a planning-programming-budgeting system.

Initiation
Early in 1972 the idea for the assessment program was proposed by the State Board of Education and the State Education Agency. A five-phase cycle is in effect, with Phase II covering January 1 through December 31, 1973. Phase I was completed December 31, 1972 (pilot testing); Phase V will be completed by December 31, 1976.

Policy
Program policy is determined by the State Education Agency, the State Board of Education and citizens' advisory groups.

Funding
Roughly 75 percent of program funds come from federal monies, under ESEA Title I, Title III, Title V and Title IV, Section 402. The other 25 percent comes from the allocated funds for State Planning and Evaluation. Approximately $96,000 was spent on the program in 1972.

Administration
The Office of Planning and Evaluation coordinates the program statewide. Five full-time professionals work on the program at the state level. The Department of Educational and Cultural Services (Maine Department of Education) acts as consultant for test development, analysis and scoring.

Population
The target population is all students aged 9, 13 and 17 in the schools of Maine. Students are chosen by a random sampling of the target population. Participation is voluntary. Students who have dropped behind in grade level are included, but educationally mentally retarded, emotionally disturbed and physically handicapped students are excluded. In 1972, 10 percent of the eligible student population was tested. The ages of 9, 13 and 17 will be tested in all five phases of the first program cycle. Community characteristics are not used as a basis for student selection.

Areas Assessed
The noncognitive area of citizenship was tested in Phase I of the first cycle. Assessment will cover the following cognitive areas: in Phase I, writing; in Phase II, science and reading; in Phase III, literature and career and occupational development; in Phase IV, music and mathematics; and in Phase V, art and social studies.


Instrumentation
The National Assessment of Educational Progress tests are used for all areas and ages. These tests were adapted for use in Maine schools by the Department of Educational and Cultural Services. They are all considered criterion-referenced measures and are used as group-reliable measures. No individual student use is anticipated.

Related Data
Additional information collected to help analyze program data includes: age, cost of instructional programs, county, grades, paraprofessional educational data, percent minority, previous course work, racial/ethnic background, school name and size, sex, socioeconomic data and teacher data.

Data Collection and Processing
Consultants chosen by the Department of Educational and Cultural Services (DECS) administer the tests in the classroom. The DECS scores the tests, and the information is then processed by the State Education Agency (SEA).

Interpretation
The data collected is analyzed, organized and interpreted by DECS in conjunction with the SEA.

Use of Data
Program results are used for allocation of state and federal funds, for instruction and for program planning. Program authority does not restrict or limit the use of program data.

Dissemination
Program reports are prepared by the SEA and include state summaries, regional summaries and public information pieces. The following receive copies of program reports: the governor and legislature, newspapers, other states upon request, the State Board of Education, school districts and ERIC.

Overview
The program is viewed most favorably by laymen and all levels of the state and local education hierarchy.


Program objectives are being achieved very well, with the main problem being one of dissemination of information.

Prospects for the Future
The Maine Assessment of Educational Progress is likely to continue at least through 1976, with changes occurring in the areas of program funding and dissemination of program information.

Individual Interviewed
The primary source of information was Horace P. Maxcy Jr., Assistant to the Commissioner, Department of Education, Education Building, State House Complex, Augusta, Maine 04330 (telephone: 207 289-2321).

REFERENCES
Educational Needs Assessment Plan for Maine, January 1972 through December 1976. Prepared by the Office of Planning, Evaluation and Research; revised January 17, 1972.

Educational Needs Assessment Plan for Maine: A Progress Report. June 1, 1972, Department of Education.

Proposed Philosophy for Maine Schools. Prepared by the State Curriculum Committee, 1971.

MARYLAND

Maryland Educational Accountability Program

Goals
Maryland's first set of statewide educational goals validated by a Goal Validation and Needs Assessment Project was published in a report dated November 30, 1972. Ten general goals and 37 specific goals, all describing the output of public education, have been identified and validated by various publics. Local school system educators, interested public groups, students, the State Superintendent of Schools, various other public officials and legislators, the State Board of Education and the State Department of Education took part in the validation of goals identified by the staff of the Department. The State Board has officially adopted five of the general goals.

Program Title and Purposes
The official name of this program is the Maryland Educational Accountability Program. It received impetus from an accountability law which took effect on July 1, 1972. Major initial purposes of the program are to assess cognitive development and educational needs, to measure influences on learning, to provide data for a management information system and to set and evaluate attainment of statewide educational goals.


Initiation
With the assistance of a USOE grant under Sec. 402, Title IV, the State Department of Education began in 1970 to develop a statewide evaluation strategy, a planning capability and a management information plan. In 1971 an educational Goal Validation and Needs Assessment Project was launched. In June 1972 the accountability law extended these earlier efforts by formalizing the need for goal setting and achievement assessment in the areas of reading, writing and mathematics. No new testing programs were mandated, and none has yet been recommended by task forces or advisory committees.

Policy
A broad-based Advisory Council on Accountability (both citizen and educator) and the local boards of education make recommendations to the State Department of Education relative to the development and implementation of the program. Policy decisions are made by the State Superintendent of Schools and the Maryland State Board of Education.

Funding
This program has received no regular funding. Activities to this point have made use of existing resources and staff time. The recently completed Goal Validation and Needs Assessment Project, funded by ESEA Title III and Title V monies, furnished much data relevant to the Accountability Program. The Needs Assessment Project report will provide a jumping-off point for the Advisory Council on Accountability. Last year the program cost approximately $54,000.

Administration
A State Coordinator on Accountability has been appointed to serve in the Superintendent's Office and to coordinate the work of the Accountability Council and the goal-setting and assessment team efforts of the State Department of Education staff and local educator advisors. A number of State Department staffers have been assigned to this effort on a half-time basis. No new positions have been made available to the department.

Population
All public schools K-12 are included in the program. Grade levels for reporting purposes have not been selected. No doubt the educationally mentally retarded and the emotionally disturbed will be excluded when data are assembled. The law does not require the inclusion of children and youth who attend private or parochial schools or special schools for the delinquent or handicapped.

Areas Assessed
At the present time, as the Accountability Program is being planned, the cognitive areas assessed are vocabulary, reading, language (spelling, capitalization, punctuation, usage), work study (map reading, reading graphs and tables, use of reference materials) and mathematics. No noncognitive areas are being assessed at the present time on a statewide basis.




Instrumentation
Presently a variety of existing measures is used in the 24 school systems; none has been tailor-made for the program. Each school system chooses its own instruments, all of which are individually reliable. In cases where schools use the same instrument, grade level and administration dates vary. Current efforts are being made to develop writing measures based on National Assessment, and a criterion-referenced reading test is in the developmental phases.

Related Data
As the Accountability Program takes shape, additional information will be collected to help analyze and interpret test data. At present, socioeconomic and aptitude data definitely are considered; other types of data to be gathered are yet to be determined.

Data Collection and Processing
Currently tests are given by classroom teachers. Data obtained is processed by local school systems.

Interpretation
Data supplied by local school systems will be analyzed by those systems and by the State Education Agency. Decisions on analyses of data for publication of state and local standings have not been made.

Use of Data
Data obtained from the program points up educational needs and establishes priorities. In highlighting the faults and fortes of the various systems, the results may be used for program evaluation, guidance, school management guidelines and identifying exemplary programs. Also, the data furnishes a basis for reports to the public. There is no restriction in the law whatsoever on the use of the program results.

Dissemination
Three kinds of reports are to be prepared by the State Education Agency: state summaries, school system summaries and school summaries. Students and parents may receive these publications if they request copies; automatic distribution includes the State Education Agency, the governor, the legislature, newspapers, other states, the State Board of Education, school districts, the schools, teacher organizations and ERIC.

Overview
Because the program is so young, views of the various groups involved cannot be obtained. An evaluation of how well the program objectives are being achieved would be premature. One can say, however, that at this point in time there are two definite limiting factors: lack of funds and lack of full agreement on how to proceed.

Prospects for the Future
The program will continue indefinitely. Major planning changes may take place before the program is crystallized.


Individual Interviewed
The primary source of information was Richard K. McKay, Assistant State Superintendent in Research, Evaluation and Information Systems, P.O. Box 8717, Baltimore, Maryland 21240 (telephone: 301 796-8300, ext. 320).

MASSACHUSETTS

Massachusetts Design for Assessment

Goals
During 1971-72 the Department of Education, through an extensive participatory process, directed considerable attention to the identification of needs and the evolution of goal statements. Many groups participated in the preparation of the statewide educational goals: the citizens' advisory group, students, the Chief State School Officer, the State Board of Education and the legislature. Formally adopted by the State Board of Education, the goals provide a framework for the integration of testing and evaluation activities and deal primarily with output. Paraphrased, the goal statements are as follows: physical and emotional well-being, basic communication skills, effective use of knowledge, capacity and desire for lifelong learning, citizenship in a democratic society, respect for the community of man, occupational competence, understanding of the environment, individual values and attitudes and creative interests and talents. These 10 educational goals are intertwined; no one goal stands in isolation from the rest. They will help to define performance objectives for learners, identify tasks to be performed by local and state educational agencies in giving life to those objectives and help to determine means for evaluating learners' progress toward the goals.

Program Title and Purposes
The major purpose of the Massachusetts Design for Assessment program is to assess public education in the Commonwealth in light of statewide educational goals. Facets of the plan for the evaluation of educational programs are as follows: 1) identification of needs of children, youth and society in general, 2) establishment of goals for education consistent with these identified needs, 3) development of programs designed to meet or satisfy the needs, 4) development of assessment and evaluation activities to measure learner achievement and program success in light of the goals and 5) repeating the above four-step cycle through the redefinition of goals and the revision of programs.

Initiation
The Massachusetts Design for Assessment was initiated primarily by the State Education Agency during 1970-72, with educational goals jointly prepared by the citizens' advisory group, students, the Chief State School Officer, the State Board of Education and the legislature in 1971.


Policy
While major responsibility for conducting the program and instituting changes rests with the State Education Agency and the State Board of Education, it is recognized that many groups (teachers, parents, school committees, taxpayers, administrators) have a stake in the assessment and evaluation process.

Funding
Program expenses are completely financed by ESEA Title III federal funds. Program costs were approximately $140,000 for 1970-71 and approximately $60,000 for 1971-72.

Administration
The Massachusetts Division of Research, Planning and Evaluation has assigned two full-time individuals to coordinate the program statewide. Outside agencies and consultants (CTB/McGraw-Hill, Westinghouse Learning) have assisted in the administration of this program.

Population
State assessment in Massachusetts included in 1971 the testing of all fourth-grade pupils in the area of basic skills, with all data reports and analyses made on the comparison of achievement in relation to ability. Evaluation activities in 1972 included repetition of the fourth-grade testing philosophy on a 10 percent sample of eighth-grade pupils and demonstration of mastery testing in the areas of science and citizenship on two 10 percent samples of seventh graders. Participation by all schools is required, and all types of schools are included in the program. Participation is not confined to communities with special characteristics. In addition to the above testing activities, two studies were conducted in the areas of kindergarten pupil assessment and measurement of affective behavior. Now underway, 1972-73 activities deal with the submission of reading and mathematics objectives to be used as a basis for future mastery testing.

Areas Assessed
The following cognitive areas are being assessed: aptitude, English, mathematics, natural science, reading, study skills, citizenship and areas in which previous evaluation experience may render some contribution.

Instrumentation
Instruments used for the testing of all fourth-grade students and the 10 percent sample of eighth-grade pupils in the area of basic skills were the Test of Academic Aptitude and the Comprehensive Test of Basic Skills by CTB/McGraw-Hill. Mastery testing in the areas of science and citizenship included submission of National Assessment objectives in science and citizenship for ranking by the participating school systems (specifically, those Massachusetts school teachers involved in each of the samples) in terms of their curricular priorities and the development of achievement instruments by Westinghouse Learning from National Assessment items consistent with the objective ranking.


Data from these criterion-referenced materials was reported only in terms of item performance and then summarized in relation to the selected objectives. All of these tests, chosen by the State Education Agency, are used only as group-reliable measures (although the California Test of Academic Aptitudes and the Comprehensive Test of Basic Skills can be used as individually reliable measures).

Related Data
Additional information collected to help analyze and interpret the data includes the cost of instructional programs and the percentage of minority students.

Data Collection and Processing
Classroom teachers are responsible for giving the tests, while outside contractors or the Department's computer facility staff are responsible for processing the information obtained from the program.

Interpretation
The State Education Agency, with the aid of local schools and local school districts, has the primary responsibility for analyzing, organizing and interpreting the data.

Use of Data
The results of this program, interpreted in relation to statewide educational goals, will be used in the area of instruction. There is no law for the program which restricts or limits the use of data.

Dissemination
The State Education Agency has prepared state summaries of the results of the program which are distributed to the State Board of Education and the school districts. Statewide reporting of testing activities did not identify individual schools or individual districts. Opportunities were provided in the process to permit each teacher, school and district to consider its results on a highly individualized basis.

Overview
The program is viewed favorably by the State Education Agency, with varied reactions on the part of boards of education, local school administrators and teachers. High interest was shown by participants, particularly in the National Assessment criterion-referenced materials, which measure item mastery. The program objectives are being achieved, although a major problem is funding.

Prospects for the Future
It is hoped that the present program will expand. The elements in the program most likely to change in the near future are: the areas assessed, the measuring instruments used, the interpretation of data and the use of data.

Individual Interviewed
The primary source of information was James F. Baker, Associate Commissioner for Research, Planning and Evaluation, Massachusetts Department of Education, 182 Tremont Street, Boston, Massachusetts 02111 (telephone: 617 727-8477).

REFERENCES
Massachusetts Department of Education Assessment Task Force. Massachusetts Design for Assessment: A Position Paper.

Massachusetts Department of Education Assessment Task Force. Testing White Paper.

Massachusetts Application of National Assessment Items in Citizenship and Science.

Massachusetts Fourth Grade Testing Program 1971, "R & D Test Bulletin No. 1."

Report of the Task Forces assisted by the Advisory Committee on Educational Goals to the Board of Education. Educational Goals for Massachusetts. The Commonwealth of Massachusetts, Board of Education, September 1971.

MICHIGAN

Michigan Educational Assessment Program (Fourth and Fifth Assessments)

Introduction
Michigan Assessment, an ongoing program since 1969, is now in a transitional phase. During the January 1973 test administration (the Fourth Assessment), previously used tests were given again. In the meantime, however, there was concurrent development of criterion-referenced tests for the Fifth Assessment, to be administered in October 1973. The report which follows gives information, where possible, on both the present program and on future program development.

Goals
Michigan has established 22 goals in three major areas, plus an Appendix on Educational Improvement. The 22 goals primarily reflect student output; the Appendix provides standards of input and process. The goals (and Appendix) were developed during 1970-71 by the following groups working in conjunction with the State Education Agency: a citizen task force, the Superintendent of Public Instruction and the State Board of Education. Formally adopted in September 1971 by the State Board of Education, these goals will remain the same for both the Fourth and Fifth Assessments. They are available upon request.

Program Title and Purposes
The official title of this program is the Michigan Educational Assessment Program. Its major purposes are to: assess cognitive development, assess educational needs, assess noncognitive development (with the Fifth Assessment to be the first phase), measure influences on learning, measure educational growth (Fourth Assessment data will provide initial analysis), provide data for a management information system (a very important purpose) and provide information for a planning-programming-budgeting system (with the Fifth Assessment providing initial data).

Initiation
Program initiation was a cooperative effort by the governor's office, the State Board of Education, the State Education Agency (SEA), the state legislature and the Superintendent of Public Instruction. Program development began in 1969, with the first statewide testing conducted during the 1970-71 school year.

Policy
The following groups determine how the program is conducted and what changes will be made in the nature of the program: a citizen task force, the Superintendent of Public Instruction, the SEA, the State Board of Education and the state legislature (for funding only).

Funding
Michigan Assessment is completely funded by state money under general appropriations given to the State Department of Education. Approximately $375,000 was spent on the program in 1972.

Administration
Intermediate school districts,* coordinators in local school districts and professionals within the State Department of Education work together on statewide program coordination. There are five full-time state-level professionals, 59 intermediate school district coordinators and 530 local school district coordinators assisting in the administration of the project. Outside consultants include: Educational Testing Service (ETS) of Princeton, New Jersey, which has been responsible for past instrument development, scoring and analysis, and the California Test Bureau/McGraw-Hill, Inc. (CTB) of Monterey, California, which is cooperating with four local school districts and the Department in the development of the criterion-referenced tests to be used in the October 1973 administration (the Fifth Assessment).

*Reorganized county districts, considered a regional service agency for operating local school districts.

Population
For both the January 1973 and October 1973 administrations, all public school students in grades 4 and 7 are being assessed. Dropouts are not included in the regular assessment program, but dropout rates are reported within individual school systems. Delinquent students and schools, educable mentally retarded and emotionally disturbed students, schools for the handicapped and private schools are all excluded from the study. Parochial schools are not included in the regular assessment study, but have a separate state-provided assessment study, funded through ESEA Title V under the old National Defense Education Act (NDEA). Last year 75 percent of all parochial schools in the state participated in this study. All public schools are required to participate in the regular Michigan Assessment Program, and 100 percent of the eligible schools participated in the Third Assessment Study. Future plans call for a random sampling of the target population. Community characteristics are not used as a basis for school selection.

Areas Assessed
The cognitive areas of mathematics, reading and the mechanics of written English were assessed in January 1973, with word relationships being assessed under aptitudes. For the October 1973 administration, the cognitive areas of reading and mathematics will be assessed. Aptitude may be assessed, but the instrument to be used has not yet been determined. During the first two years of Michigan Assessment, an Attitudes Instrument was used to assess self-concept and attitudes toward school and school achievement. The instrument was found to be technically deficient and was removed from the program. Work is now being completed on a new Attitudes Instrument which may be administered statewide as part of the Fifth Assessment.

Instrumentation
The Standardized Michigan Educational Assessment Test Battery in mathematics, reading and the mechanics of written English (grade levels 4 and 7) was used for the January 1973 test administration. This test battery was developed by Educational Testing Service to the specifications of the Michigan State Department of Education. These tests are individually reliable, norm-referenced measures. The new instruments, to be used for the October 1973 test administration in mathematics and reading (grade levels 4 and 7), have been developed to state specifications by the California Test Bureau and four Michigan school districts under contract to the Department of Education. These tests have been developed as criterion-referenced measures. Item sampling techniques were used to construct these measures, and they will be used both as group-reliable measures and for individual student profiles.

Related Data
Additional information collected to help analyze program data includes: cost of instructional program, dropout rates in district, student grade assignments, paraprofessional educator data, percentage of racial/ethnic minority, school name and size, socioeconomic data and education and salaries of teachers involved in the study.

Data Collection and Processing
Classroom teachers are primarily responsible for administering the tests. The State Education Agency (SEA), in conjunction with Educational Testing Service, has been responsible for processing program data in the past. The outside consultant to be used for data processing of the criterion-referenced tests has not yet been chosen.

Interpretation
The SEA, in conjunction with an outside contractor (as yet undetermined), will be responsible for organizing, analyzing and interpreting program data.

Use of Data
Program results are used for the following: allocation of funds under the State Compensatory Education Program, some comparative analysis across schools, program planning and public relations. Program authority does not restrict the use of program data, although the SEA restricts the use of individual student scores.

Dissemination
Program reports are prepared and include: state summaries, regional summaries, school system summaries, school summaries, student summaries (containing student scores), summaries of target areas and public information pieces. The following receive copies of program reports: the SEA, the state governor, the state legislature, newspapers, other states upon request, the State Board of Education, school districts, individual schools, students, parents, teacher organizations and ERIC. Program reports are prepared jointly by the SEA and the outside contractors.

Overview
The Michigan Program is gaining acceptance from all levels of the state educational hierarchy and the general public. Program objectives are being met very satisfactorily. The main problems lie in the area of dissemination, gaining an understanding of the program and an awareness of what it is trying to accomplish and being able to properly use the program data in decision making.

Prospects for the Future
The program is likely to continue. The areas most likely to change in 1973-74 are: the areas assessed (broadening subject matter), improvement of measuring instruments, improvement of instruction in the statewide objectives, data collection and processing procedures and dissemination techniques. As mentioned before, Michigan is planning to use random sampling of target populations in future assessments. The state is also developing a special test, to be used at the entry into first grade, to assess the student's readiness for school.

Individual Interviewed
The main source of information was David L. Donovan, Director of Research, Evaluation and Assessment Services, Michigan Department of Education, P.O. Box 420, Lansing, Michigan 48902 (telephone: 517 373-1830).

REFERENCES
Michigan Department of Education. The Common Goals of Michigan Education. 1971.

Michigan Department of Education. Objectives and Procedures. The first report of the 1972-73 Michigan Educational Assessment Program. October 1972.

Michigan Department of Education. Individual Pupil Report: Explanatory Materials. The second report of the 1971-72 Michigan Educational Assessment Program. April 1972.




MINNESOTA

Minnesota Statewide Educational Assessment Program

Goals
A working set of statewide educational goals and objectives has been developed in the areas of reading and mathematics. Additional content areas are being developed using the following criteria:
1) The objectives are considered important by subject-matter specialists.
2) School personnel recognize the objectives as tasks which the schools are attempting to accomplish.
3) Thoughtful lay citizens of the state consider the objectives worthy of attainment.
The development of the goals and objectives is coordinated through the Assessment Advisory Council. The objectives, very much output oriented, are available upon request.

Program Title and Purposes
The purposes or goals of the Minnesota Statewide Educational Assessment Program (MSEAP) are
1) To provide the educational decision-makers in Minnesota with a program to diagnose relevant strengths and weaknesses in Minnesota schools, and subsequently
2) To provide resources at the various levels of indicated need for purposes of ameliorating weaknesses and capitalizing upon the strengths within the educational systems of the state.
The specific objectives assigned to the program are as follows:
1) To determine the level of performance of students in this state in the cognitive, affective and psychomotor domains.
2) To identify the variables which account for the variations in student performance.
3) To report the results of this investigation to educational decision-makers in the executive and legislative branches of state government, the State Board of Education, the Department of Education, local school administrators, local school boards and interested citizens of the state, providing a guide for the allocation of school resources.
4) To report longitudinally the extent to which progress is being made in Minnesota schools toward improving student performance within the State of Minnesota.
These purposes and objectives were presented to, and approved by, the State Board of Education on October 4, 1971. No legislation has been mandated by the legislature or requested by the Department. However, the Department is requesting financial support for the program from the legislature through its program budget.

Initiation
The idea for the program was initiated in 1971 by the Chief State School Officer. The first pilot testing took place in the spring of 1972, and the implementation of Phase I occurred during the spring of 1973.

Policy
The Assessment Advisory Council is a 25-member body with representation from professional educational organizations, other state agencies, higher education, non-public education, the governor's office, the state legislature, citizen groups concerned with education and representatives from the Department of Education. With the participation of these groups, the various audiences with which the Department must communicate are represented to express opinions and identify concerns that surface in these audiences.
Additionally, a five-member Technical Advisory Committee has been formed to provide technical expertise in areas related to educational assessment, measurement and evaluation. The membership includes public school personnel with research and evaluation responsibilities and professors from higher education institutions with particular expertise in research and measurement.
The responsibilities of both advisory groups are limited to general policy establishment; specific responsibilities relative to implementation remain with the Commissioner of Education and his designees within the Department of Education. Each advisory group meets about once every six weeks.

Funding
Though the state does contribute staff time, the Pilot Phase of the program was supported by federal ESEA Title III monies. Approximately $100,000 was spent on this phase of the program. Implementation of Phase I of the program (spring 1973) was made possible by a grant from the Hill Family Foundation which the Department of Education is matching with state and federal resources. Future phases of the program are to be financed through the appropriations requested by the department from the current legislative session.

Administration
ESEA Title III personnel within the state coordinate the program with a professional staff size of one full-time equivalent. Reading and mathematics consultants from the University of Minnesota and Minneapolis Public Schools have worked with the Department on the development of objectives and exercises. Research Triangle Institute (RTI) has conducted a comprehensive planning study on the assessment program for the Department and also has been contracted to implement Phase I of the program this spring. Phase I involves the assessment of 17-year-olds and eleventh graders in the area of reading.

Population
The target groups are defined by both age and grade level. Students of a given age dropping behind grade level are included, whereas dropouts are not included at this time. Educationally mentally retarded, emotionally disturbed, non-English speaking and physically handicapped students and handicapped schools are excluded from the program. Samples are then drawn from the target population. Participation is voluntary, with 100 percent cooperation to date. Last year the pilot reading and mathematics instruments and an attitudes survey were administered to a five percent sample of third- and sixth-grade classrooms across the state. Participation was not confined to communities with special characteristics. Phase I of the assessment program includes approximately 232 secondary schools and 5,100 students.

Areas Assessed
In the Pilot Phase the cognitive areas of reading and mathematics and the general noncognitive area of attitudes were assessed. The attitudinal assessment included the following areas:
1) Attitude Toward Mathematics
2) Attitude Toward Reading
3) Attitude Toward School
4) Attitude Toward School Achievement
5) Attitude Toward Self and Others
6) School Atmosphere Index
7) Environmental-Ecological Attitudes
8) Citizenship
9) Careers and the World of Work
In the future, Minnesota will assess the cognitive areas measured by the National Assessment Program and, additionally, the areas of health and physical education. In the noncognitive domain, Minnesota will test attitudes (toward subject areas, school, careers, etc.), citizenship and, hopefully, interests, personal values and self-concept.
Phase I of the program is confined to the area of reading, to be followed by mathematics and social studies.

Instrumentation
The reading and mathematics exercises used in the Pilot Phase were tailor-made by consultants from the University of Minnesota and Minneapolis Public Schools. Phase I of the assessment program involves released exercises from the National Assessment Program and Minnesota-developed exercises. National Assessment Program exercises account for one-fourth of the total exercises, and Minnesota-developed exercises account for three-fourths of the total. Minnesota exercises were written by Dr. Mark Aulls, Dr. P. David Pearson and Mr. Alan Farstrup of the University of Minnesota. Their efforts have been coordinated with a Statewide Reading Assessment Review Committee. Similar procedures will be utilized in additional phases of the assessment program. Students are sampled to provide reportage on the performance of 9-year-olds and fourth graders; 13-year-olds and eighth graders; and 17-year-olds and eleventh graders. The age levels will be used for comparison of Minnesota results to the National Assessment Program's national and regional results. Grade levels will be used for within-state reportage. The exercises are referenced to specific Minnesota educational objectives and are used as group-reliable measures.


Related Data
Minnesota is collecting additional information to help analyze and interpret the test data. Background variables include sex, age, grade level, socioeconomic conditions (home and family background), size and type of community and school financial, personnel and programmatic characteristics.

Data Collection and Processing
Paid exercise administrators, such as the National Assessment Program uses, are responsible for giving the inventories and exercises. An outside contractor, Research Triangle Institute, is responsible for processing the program data.

Interpretation
Research Triangle Institute and the Department of Education will be responsible for analyzing and organizing the data. State and/or private universities, the State Education Agency and outside contractors will cooperatively be involved with interpretation, although the majority of the responsibility will rest with the state.

Use of Data
Program results will be used for program planning, public communication and relations, allocation of state and/or federal funds, instruction and comparative analysis across regions. At this time program authority does not limit the use of program data.

Dissemination
Reports will be prepared by the State Education Agency in conjunction with the outside contractors and sent to the governor, the legislature, newspapers, other states, the State Board of Education, school districts, schools, teacher organizations, ERIC and, indirectly, to students and parents. State, regional and target area reports and public information releases will be prepared.

Overview

Reactions to the program have been very positive on both the state and local level, and it is felt that program objectives are definitely being achieved. The major problem is the uncertainty of funding.

Prospects for the Future
The program will continue and it is hoped that funding will improve. The present description will remain accurate through the summer of 1973, at which time more information will be available on the advanced stages of the program.

Individual Interviewed
The primary source of information was John W. Adams, Director, State Educational Assessment, Department of Education, 728 Capitol Square Building, St. Paul, Minnesota 55101 (telephone: 612 296-3885).



MISSISSIPPI

Continuing Plan for Education in Mississippi

Goals
The Mississippi State Department of Education is in the process of identifying statewide educational needs. A general educational need was defined as "A lack of something requisite, desirable or useful"1 in relation to public school education. No attempt was made in this study to identify specific learner needs.

This study was conducted on the basis that school programs should be developed in direct relation to the identified needs. In other words, determining needs should be the first step in comprehensive planning, which was defined as follows: Comprehensive planning is the logical process of (1) establishing goals and priorities which are based on given evidences of needs, (2) writing specifically stated objectives for achieving each goal after developing and communicating strategies and techniques, (3) implementing the plan in consideration of diversified resources such as categorical aid programs and varying levels of manpower quantities and skills, (4) evaluating the results of the plan on the basis of the stated objectives and (5) recycling the procedure as deemed necessary as a result of the evaluation.2
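The five-step definition above is essentially a recycling loop. The sketch below restates it as control flow; the functions are placeholders for the activities the definition names and are not drawn from the Mississippi study.

```python
# An illustrative sketch (not from the report) of the five-step
# comprehensive planning cycle. Only the control flow is the point:
# plan, implement, evaluate, and recycle as deemed necessary.

def establish_goals(needs):          # step 1: goals based on evidenced needs
    return [f"goal addressing {n}" for n in needs]

def write_objectives(goals):         # step 2: specifically stated objectives
    return [f"objective for {g}" for g in goals]

def implement(objectives):           # step 3: apply diversified resources
    return {"objectives": objectives, "resources": "categorical aid, staff"}

def evaluate(plan):                  # step 4: judge results against objectives
    return []                        # placeholder: no unmet needs found

needs = ["reading deficit", "facility shortage"]
while True:
    plan = implement(write_objectives(establish_goals(needs)))
    unmet = evaluate(plan)
    if not unmet:                    # step 5: recycle only if needs remain
        break
    needs = unmet
print(plan)
```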

Program Title and Purposes
The program title is "Continuing Plan for Education in Mississippi." The purpose of this study was to identify general educational needs in the public schools of Mississippi. Four major sources were used to collect data for determining those educational needs. One source was an Inventory of Educational Needs developed for the purpose of seeking the opinions of selected persons on certain items characteristic of educational programs; a second source was an analysis of the results of the state testing program conducted during the 1970-71 school year; a third source involved an analysis of the data compiled by the Selective Service System for Mississippi; and a fourth source was an analysis of the statistical data collected by personnel within the State Department of Education. A requisite for the administration of monies from many federal programs is that a comprehensive needs assessment be conducted among the intended recipients of said funds. An attempt was made through this study to coordinate all needs assessment activities required of state agencies within the Department of Education in Mississippi into one major assessment activity.

Initiation
In an effort to improve public school education in Mississippi, the state superintendent of education appointed a 12-member Council for Planning and Evaluation to study educational needs and to recommend programs designed to meet the identified needs. The initial meeting of the Council was held on February 2, 1971, with the membership of the Council representing all divisions and major programs within the Department of Education. The resulting study represented a coordinated effort for systematically and comprehensively identifying general educational needs in Mississippi and therefore became the first phase of the Mississippi Plan for Education.

Policy
The State Department of Education is responsible for coordinating program policy and change.

Funding
The program is funded 100 percent by federal ESEA Title IV, Section 402 funds, with the exception of state-contributed staff time.

Administration
The State Superintendent of Education and the Office of Planning and Evaluation of the State Department of Education are responsible for coordinating the program statewide.

Population
In the spring of the 1970-71 school year, 122 school districts participated in a state testing program. All students in public schools in grades 5 and 8 were tested. Eighty-seven percent of the school districts submitted the test results from their respective districts in mean raw score form for use on a statewide basis to the Department of Education.

Areas Assessed
The areas of reading, mathematics, language and spelling were assessed in spring 1971.

Instrumentation
Two instruments were involved in Mississippi's needs identification. The California Achievement Tests were administered to students in grades 5 and 8. The Inventory of Educational Needs was sent to various educational personnel and laymen throughout the state.

Although the California Achievement Tests (CAT, CTB/McGraw-Hill) yield individual results which the local schools used, the state only received and was only concerned with group results (in mean raw score form by district). State and district norms were then compared with national norms; however, the sample population from which the test was normed is unlike the test population in Mississippi. This had to be considered when drawing conclusions.
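The comparison described here, district mean raw scores set against a national norm, can be sketched as follows. The figures are hypothetical, and weighting district means by the number of pupils tested is an assumption; the report does not say how district results were aggregated.

```python
# Hypothetical district results (not Mississippi data): each entry is
# (district, mean raw score, number of pupils tested).
districts = [
    ("District A", 41.2, 1200),
    ("District B", 36.8, 450),
    ("District C", 44.5, 900),
]

national_norm = 43.0  # hypothetical published norm for the grade level

# Combine district means into a state mean, weighting by pupils tested
# (an assumed aggregation rule, used here only for illustration).
total_pupils = sum(n for _, _, n in districts)
state_mean = sum(score * n for _, score, n in districts) / total_pupils

print(f"State mean: {state_mean:.1f} vs. national norm: {national_norm}")
# Any gap must be read cautiously: the norming sample differed from
# the Mississippi test population, as the report notes.
```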

Selected people within Mississippi were given the opportunity to express their opinions about the state's general educational needs by responding to an instrument entitled "Inventory of Educational Needs." This instrument, which contained 276 separate items, was developed through the cooperative efforts of the members of the Council for Planning and Evaluation. Local school superintendents, selected school board members, principals, teachers, teacher aides, students and parents were given the opportunity to provide data for the development of the instrument. An investigation of related literature was conducted. Also, each of the state departments of education and the U.S. Office of Education were requested to send information relating to needs assessment studies that might have been undertaken. Additionally, consideration was given to each of four evaluation studies concerning education in Mississippi. Those four studies were as follows: "State-Wide Education Study,"3 "Improving Organization, Accounting and Data Processing Operations,"4 "Report of a Management Review,"5 and "Report to the Governor."6

Through the review of literature, ideas were gained but no procedure was found that was considered applicable to Mississippi for determining educational needs. The instrument was then pretested for the purpose of refining the final draft and determining a reliability coefficient. The Inventory of Educational Needs was designed to be mailed to all superintendents, school board members and principals in Mississippi. Selected teachers, students, parents and community leaders (presidents of all local chambers of commerce) in the state were also asked to respond to the survey.
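The report does not name the reliability coefficient computed during the pretest. One conventional choice for a multi-item opinion inventory is Cronbach's alpha, sketched below with hypothetical responses (rows are respondents, columns are inventory items).

```python
# Cronbach's alpha for a multi-item inventory. The choice of alpha is
# an assumption; the report only says a reliability coefficient was
# determined. Response data are hypothetical.

def cronbach_alpha(rows):
    """alpha = k/(k-1) * (1 - sum(item variances) / variance of totals)"""
    k = len(rows[0])

    def variance(values):
        m = sum(values) / len(values)
        return sum((v - m) ** 2 for v in values) / (len(values) - 1)

    item_vars = [variance([row[i] for row in rows]) for i in range(k)]
    total_var = variance([sum(row) for row in rows])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

responses = [
    [4, 5, 3, 4],
    [3, 4, 3, 3],
    [5, 5, 4, 4],
    [2, 3, 2, 3],
]
print(f"alpha = {cronbach_alpha(responses):.2f}")
```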

Related Data
The Selective Service records as compiled from the reports of the local draft boards representing each county in Mississippi were used as a source of data for identifying general educational needs in Mississippi. The statistical reports compiled within the State Department of Education and other state agencies were used as another source of data for identifying general educational needs. School enrollment figures were compared with the potential enrollment figures as indicated by the appropriate live birth figures. Dropout rates and the holding power of the public schools were also researched. Comparisons by type of school district were made for per-pupil expenditures, for average salaries of instructional personnel and for the percent of school revenue received from local, state and federal sources.

Data Collection and Processing
The California Achievement Tests were administered to the students on a local basis and graded by CTB/McGraw-Hill. Because results of the California Achievement Tests were made available by the testing company only to the participating districts, a form was necessary for collecting the results for use on a statewide basis. A letter was prepared with the accompanying form and mailed to the superintendents of the 122 participating school districts requesting the test results from their respective districts in mean raw score form. After the due date of return, an appeal through telephone contact was made directly with the superintendents who had not responded. A final usable return of achievement test results represented 106 school districts.

Interpretation
The Department of Education was responsible for analyzing, organizing and interpreting both the test data received from the superintendents and the other program data.


Use of Data
Program data was used first to systematically identify needs and then to establish statewide educational goals, which represents Phase II of the "Continuing Plan for Education in Mississippi."

Dissemination
The State Department of Education has prepared a comprehensive program report, General Educational Needs Assessment in Mississippi, which is available upon request.

Overview
Generally the program has been very favorably received; however, it is too early to make any specific observations or comments.

Prospects for the Future
The general educational needs assessment is only the first step of a "Continuing Plan for Education in Mississippi."

Individual Interviewed
The primary source of information was Jerry R. Hutchinson, Coordinator, Office of Planning and Evaluation, State Department of Education, Post Office Box 771, Jackson, Mississippi 39205 (telephone: 601 354-7328).

REFERENCE
General Educational Needs Assessment in Mississippi, 1972. State Department of Education.

1Webster's Seventh New Collegiate Dictionary (Springfield, Massachusetts: G. and C. Merriam Company, 1970), p. 565.

2Thomas Burns, BESE Regional Conference (conference held in Washington, D.C., October 27, 1971).

3"State-Wide Education Study" (study prepared by Booz-Allen &Hamilton Incorporated, Chicago, Illinois, 1967).

4"Improving Organization, Accounting and Data Processing Opera-tions" (study prepared by Peat, Marwick, Mitchell & Company,Jackson, Mississippi, February 9, 1968).

5"Report of a Management Review" (study prepared cooperativelyby the U.S. Office of Education and the Mississippi State Depart-

ment of Education, December 1970).

6"Report to the Governor" (study prepared by the Mississippi Com-mission on Efficiency and Economy in State Government, Decem-ber 1969).

MISSOURI

Missouri Statewide Assessment Project

Goals
A set of statewide educational goals was adopted by the Missouri State Board of Education. Prepared by the State Education Agency, the goals deal with the outcomes of education. A copy of the goals is available on request.

Program Title and Purposes

The purposes of the program, called the Missouri Statewide Assessment Project, are to measure the status and progress of the students of the state toward educational goals and objectives and to identify educational needs.

Initiation
Missouri's Assessment Project was initiated by the State Board of Education. The present phase of development was begun during the fall of the 1972-73 school year.

Policy

The State Board of Education, with the assistance of advisory groups, determines the course of the program.

Funding

ESEA Title III and Section 402 funds have been used for administrative costs. Other state and local funds have been in the form of staff time. No specific state funds have been allocated. Specific program costs have not been identified; because of the early stage of development, the time spent on the assessment program has not been directly charged to the project.

Administration
A State Education Agency Steering Committee coordinates the program under the chairmanship of the Director of Planning and Evaluation. There has been considerable involvement by the professional staff of the State Department of Education; however, the amount of staff time spent depends on the phase of the planning or implementation that is being conducted. Outside consultants will be involved in the development of the measurement instruments, scheduled to commence in July 1973.

Population
Public schools will comprise the target population. Statewide data only will be collected through a representative sample of students.

Areas Assessed

The educational goals are categorized as follows:

Goal Area I. Intellectual Development
  A. Communication
  B. Quantitative Thinking
  C. Social Processes
  D. Scientific Understanding
  E. Decision Making
  F. Aesthetic Appreciation

Goal Area II. Physical Development
  A. Growth and Maturation
  B. Health
  C. Recreation

Goal Area III. Social Development
  A. Social and Physical Environment
  B. Cultural Awareness
  C. Government Institutions/Citizenship
  D. Avocational Pursuits
  E. Concept of Self, Morality, Values

Goal Area IV. Career Development
  A. Social Significance of Work
  B. Occupational Exploration
  C. Occupational Preparation
  D. Occupational Education (Adult)

Assessment will be developed in all areas as resources and expertise allow.

Instrumentation
The tests to be used for the Missouri Assessment Project are not known at this time; however, previously developed standardized tests, criterion-referenced exercises, or specially developed instruments that meet validity and reliability standards will be considered. A request for proposals for test development will be issued in the summer of 1973.

Related Data
Statewide data on such variables as cost of instructional programs, dropout rates, socioeconomic and teacher data (e.g., teacher salaries, teacher education, course load, etc.) are available through other departmental programs.

Data Collection and Processing
Classroom teachers and/or trained administrators may be used in the instrumentation phase. It is not known how the data will be processed at this time, since requirements for processing will depend on the type of measurement undertaken.

Interpretation
Data will be reported on a statewide basis in relationship to goal areas.

Use of Data

It is anticipated that the data may be used to determine allocation of resources in terms of consultant assistance and categorical aid.

Dissemination
It is anticipated that separate reports will be generated for each unique audience (e.g., the general public and local school personnel).

Overview

The second phase of the Assessment Project involved the development of statewide objectives. Over 250 school districts voluntarily participated in their development and review.

Prospects for the Future
This program, still in its beginning stages, is dependent on future federal and state funding.


Individual Interviewed
The primary source of information was John F. Allan, Director of Planning and Evaluation, State Department of Education, Jefferson City, Missouri 65101 (telephone: 314 751-3501).

REFERENCE
Missouri State Department of Education. Educational Goals for the State of Missouri. November 1972.

MONTANA

Goals
Montana currently is formulating a set of statewide educational goals. The formulation is a phase of the statewide needs assessment program under sponsorship of ESEA Title III. The goals will mainly be output in nature. A citizens' advisory group, the Chief State School Officer, the State Education Agency, other educational professionals and students are now participating in the preparation of these goals.

Program Title
The statewide needs assessment program has no separate official title at this time. Its major purpose is to assess educational needs statewide in order to focus responsive effort into areas where the needs are critical.

Initiation

The program was initiated in 1968 by the Chief State School Officer and the ESEA Title III staff of her office.

Policy

The Chief State School Officer and her staff interpret requirements of federal ESEA Title III, set program policy and determine how the program is conducted and what changes are made.

Funding
The program is completely funded federally through ESEA Title III. During the past year roughly $30,000 was spent.

Administration
Although the ESEA Title III staff has a role in administration, the primary responsibility for coordination of the needs assessment rests with the State Agency's Research, Planning, Development and Evaluation component and the Chief State School Officer. The professional staff effort is one and one-half full-time equivalents at the state level. At present no outside agencies or consultants are involved.

Population
Target groups will be defined by grade, including all types of public schools in the state. From these target grades, representative samples will be drawn from schools whose officials agree to participate.

Areas Assessed

Though still conjectural at this point in time, the state is planning to assess in all the behavioral domains described by the statewide goals. Though not yet validated by the state population, these goals probably include some of the following: communication skills, career education, cognitive ability and the noncognitive areas of citizenship, attitudes (for example, toward school and subject matter), interests, personal values and self-concept.

Instrumentation
The instruments have not been defined as yet.

Related Data

Age, sex and geographic area by zip code will probably be identified in the goal-validation survey, permitting comparisons of the eastern and western parts of the state (thought to differ in philosophic outlook) as well as rural and urban areas.

Data Collection and Processing
At this time it is undecided who will be directly responsible for administering any tests and inventories or for processing the resulting data and information.

Interpretation
The Research, Planning, Development and Evaluation component will be responsible for the analysis, organization and interpretation of the data. It is undecided whether outside help will be solicited.

Use of Data
The primary purpose of the program is to identify and rank educational needs. The assessed criticality of these needs will then be a basis for program planning. Montana also will use the results to help determine the allocation of state or federal funds. The program data will be universally available.

Dissemination
A report will be prepared presenting the planning, processing and results of the program. It will be in the form of a statewide summary but will include comparisons of east vs. west and rural vs. urban. Copies of the reports will go to the State Education Agency, the State Board of Education, school district officials and the newspapers.

Overview
Groups representative of schoolmen, teachers, school boards, students and the general public have expressed favorable opinions about the program. So far, program objectives are being achieved.

Prospects for the Future
The program will continue as long as funds are available.




Most likely to change in the future are policy control, funding, administration and data collection. The present description should remain accurate until the end of the 1972-73 academic year.

Individuals Interviewed
The primary sources of information were Ray Bitney, Instrument and Analysis Specialist for the Research, Planning, Development and Evaluation component, Office of the Superintendent of Public Instruction, State Capitol, Helena, Montana 59601 (telephone: 406 449-3693) and J. Michael Pichette, Reporting Services Coordinator, Research, Planning, Development and Evaluation, Office of the Superintendent of Public Instruction, State Capitol, Helena, Montana 59601 (telephone: 406 449-3693).

NEBRASKA

Nebraska State Assessment Program

Goals
A formal set of goals for Nebraskan education has been prepared, covering student achievement and educator, citizen and institutional responsibility. These goals were prepared by citizens, including students, educators and laymen from the entire state, and have been formally adopted by the State Board of Education. They represent a combination of input, process and output. The goals are available upon request.

Program Title and Purposes
The official name of this program is the Nebraska State Assessment Program. Its major purposes are to assess cognitive development, assess educational needs, assess noncognitive development and measure influences on learning (a future development).

Initiation
The program began as a result of a legislative mandate passed in the latter part of 1969. Program development has been ongoing for the past two years, and the first statewide test administration was scheduled for March and April of 1973.

Policy
Program policy is determined by the Research, Planning and Evaluation Division. The state legislature influences program policy within the limitations of funding the program.

Funding
State funds are appropriated and designated State Assessment Program Funds by the state legislature. Roughly $5,000 was spent on State Assessment in 1972.


Administration
The Research, Planning and Evaluation Division coordinates the program statewide, with two full-time professionals at the state level. Houghton Mifflin Company has been involved with the state as a consultant in program development.

Population
The student group being assessed in 1973 consists of a random sample drawn from the target population of public school students in grades 4, 5 and 6. Educable mentally retarded, emotionally disturbed, school dropouts and non-English speaking students are excluded from the study; parochial and private schools are also excluded from the pilot study. Vocational-technical schools and students will not be included in the first assessment, but will be included in future assessments. School participation is voluntary, and community size is used as a basis for selection.

Areas Assessed
Mathematics and reading are being assessed in the cognitive areas, and self-concept as a learner and attitudes toward school are being assessed in the noncognitive areas. Vocational awareness tests are planned for future assessments.

Instrumentation
All test instruments being used in the Assessment Program were tailor-made for the program by the outside consultants. The instruments are "mini tests," each with an objective, instructions and five to ten exercises. After the system is working, schools will be able to tailor some parts of their testing programs with these instruments. Appropriate measures have been designed for each target area and grade with the aid of concept sampling. These measures are described as criterion-referenced measures, and they will be used for individual reliability.
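The "mini test" structure described above lends itself to a simple sketch: one objective, its instructions and five to ten exercises, scored against the objective rather than a norm group. The field names and the mastery criterion below are hypothetical, not Nebraska's specification.

```python
from dataclasses import dataclass

@dataclass
class MiniTest:
    objective: str
    instructions: str
    exercises: list          # one entry per exercise (five to ten)
    mastery_fraction: float  # assumed criterion for attaining the objective

    def attained(self, answers_correct):
        """Criterion-referenced scoring: compare the fraction of
        exercises answered correctly against the mastery criterion."""
        fraction = sum(answers_correct) / len(self.exercises)
        return fraction >= self.mastery_fraction

test = MiniTest(
    objective="Add two-digit whole numbers",
    instructions="Work each exercise; choose the best answer.",
    exercises=["23+45", "18+34", "56+27", "40+39", "12+88"],
    mastery_fraction=0.8,
)
print(test.attained([True, True, True, False, True]))  # True: 4/5 = 0.8
```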

Related Data
Sex and age of the students tested will be collected to help analyze program data.

Data Collection and Processing
Classroom teachers will administer the tests. The outside contractor will process all program data.

Interpretation
The outside contractor in conjunction with the Research, Planning and Evaluation Division will organize, analyze and interpret program data.

Use of Data
Program results will be used for curriculum development and to determine the best areas of allocation of state resources. Program authority does not restrict the use of program data.

Dissemination
Program reports will be prepared as a coordinated effort by the Research, Planning and Evaluation Division and the outside contractor. The types of reports which will be prepared have not yet been determined, although copies of these reports will probably be sent to the legislature, the general public, state educators and teachers colleges. The basis of these reports will be the attainment of objectives.

Overview
The program is viewed favorably statewide. Program objectives are being achieved despite the limitation of appropriations.

Prospects for the Future
The Nebraska State Assessment Program probably will continue through 1976. Discontinuance of the program would only occur as a result of political changes or nonappropriation of funds.

Individual Interviewed
The main source of information was Joseph Mara, State Consultant to the Research, Planning and Evaluation Division of the Nebraska State Department of Education, 233 South 10th Street, Lincoln, Nebraska 68509 (telephone: 402 471-2531).

REFERENCE
Educational Goals Statements for Nebraskans, Nebraska State Department of Education.

NEVADA

Nevada Needs Assessment Program (NNAP)

Goals

Nevada has prepared a formal set of statewide educational goals which are available on request. The 10 main goal areas are fostering creativity, vocational productivity, continuing education, intergroup acceptance, motivation to learn, citizenship and social competence, self-understanding and acceptance, mastery of basic skills, physical and emotional health and intellectual development. The goals were originally developed by the Far West Regional Educational Laboratory, then located in Berkeley, California. The State Department of Education took the goals and slightly modified them to suit its purposes. The goals have been formally adopted by the State Department of Education and are basically output in nature.

Program Title and Purposes
The official title of the program is the Nevada Needs Assessment Program. NNAP's major purposes are to assess cognitive and noncognitive development, measure growth, provide data for a management information system and assess learner needs (cognitive, affective and psychomotor).


Initiation

The program was initiated by the State Department of Education in the spring of 1971. Six months later the first pretest was administered.

Policy

Program changes and policy are jointly determined by representatives of school systems and the State Department of Education.

Funding
The program is entirely federally funded. The main source is ESEA Title III, although ESEA Title I and ESEA Title IV Section 402 funds are also used. Last year the program cost approximately $25,000.

Administration
The Division of Planning and Evaluation in the State Department of Education is responsible for coordinating the program statewide. The state has four full-time professional employees involved in state assessment. Some counties also have employees in jobs related to assessment. Nevada has contracted outside agencies for help in scoring and summarizing data. During the 1971-72 academic year they used Evaluative Programs for Innovative Curriculum (EPIC) in Tucson, Arizona; in 1972-73 they are using Central Data Processing, a state agency.

Population
During the 1971-72 year the target group consisted of grade 3 students; however, during the 1972-73 school year grade 4 is included as well. Educationally mentally retarded and emotionally disturbed students are excluded, as are schools for the delinquent and handicapped. Parochial and private schools are also excluded. From the target population, a sample of 4,500 students was drawn. Although participation was not required, 92 percent of the students participated. Some community characteristics were considered in selection.
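The description above notes that community characteristics were considered in drawing the 4,500-student sample. A proportional stratified draw is one way such a selection could work; the strata, counts and procedure below are hypothetical, not Nevada's actual method.

```python
import random

random.seed(1973)

# Hypothetical enrollment by community type (the stratifying
# characteristic assumed here) and the target sample size.
enrollment = {"urban": 12000, "small town": 6000, "rural": 4000}
sample_size = 4500

total = sum(enrollment.values())
for stratum, n in enrollment.items():
    take = round(sample_size * n / total)   # proportional allocation
    chosen = random.sample(range(n), take)  # stand-in pupil indices
    print(f"{stratum}: {len(chosen)} of {n} pupils sampled")
```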

Areas Assessed

Reading and mathematics are being assessed in the cognitive areas. Self-appraisal and school sentiment are being assessed in the noncognitive area.

Instrumentation
The Comprehensive Test of Basic Skills (CTBS) is used to test the cognitive areas of reading and mathematics. In the affective areas, Nevada adapted the School Sentiment Index-Primary Level and the Self-Appraisal Inventory-Primary Level from the Instructional Objectives Exchange (IOX) in Los Angeles. Item sampling was not used in adapting the inventories. The affective instruments are criterion-referenced in the sense that an objective is formulated, then measures are devised to assess the objective's attainment. The emphasis, however, has been more on obtaining baseline affective data and obtaining group-reliable data. The CTBS was chosen by a group of teachers at workshops for that purpose in the spring of 1971.




Related Data
School name, school size, sex, racial/ethnic information and geographical designations are taken into consideration to help analyze and interpret the data.

Data Collection and Processing
Classroom teachers are responsible for giving the CTBS. The affective tests are given by nonschool personnel, usually either parents or teacher aides from another school. (IOX felt that administration by school personnel could contaminate the results since some questions refer to the teacher, principal, et al.) Outside contractors, last year EPIC and this year Central Data Processing, are responsible for scoring and summarizing the data.

Interpretation
The State Department of Education is responsible for analyzing, organizing and interpreting the data.

Use of Data
The results of the program are primarily used for instructional guidance and to some extent for determining learner needs for program development. The program authority prohibits release of interdistrict or individual affective data.

Dissemination
Reports prepared by the State Department of Education are summarized by state, region, school, classroom, student (cognitive only) and ethnic group. The statewide reports are then disseminated by the State Department of Education to other states, the State Board of Education, school districts, schools, teachers and newspapers. The state does not release individual, class, school or district data except to the individual, class, school or district concerned.

Overview
In general, local school administrators, teachers and the general public view the program favorably, whereas the State Department of Education and the State Board of Education view the program very favorably. Cognitive program objectives are being achieved very well. A major problem has been teacher apprehension over how the program results are to be used.

Prospects for the Future
The program will continue, with the current description remaining accurate until the fall of 1973. Changes can be expected in objectives, target population (in 1973-74 the target population will be grades 3, 5 and 7), areas assessed, measuring instruments, data collection and use of data.

Individual Interviewed
The primary source of information was R.H. Mathers, Consultant, Assessment and Evaluation, Heroes Memorial Building, State Department of Education, Carson City, Nevada 89701 (telephone: 702 882-7111).


NEW HAMPSHIRE

Goals
New Hampshire does not yet have a formal set of statewide educational goals. In its 1970 Biennial Report, the State Department of Education issued a list of 12 present and future goals; another report is presently being prepared and will contain a composite list of goals from the various divisions. Since management by objectives is employed within the department, clearer goals will soon be stated. In the 1970 Biennial Report, goals involve a combination of the input, process and output of education. All elementary schools are now required to develop goals. A current educational needs assessment study ultimately will develop a set of statewide goals.

Program Title and Purposes
The present assessment program has no title. Its major purpose is to assess educational needs, with assessment of cognitive development and establishment of statewide and local goals as additional purposes. The State Department of Education Report of New Hampshire Educational Needs Assessment for January-June 1972 presents sets of needs for the learner, the staff and the system, followed by a rank listing of 16 critical educational needs.

Initiation
New Hampshire assessment was initiated mainly by the State Education Agency using federal funds. The State Board of Education and the Commissioner of Education also contributed to its development. Testing began in 1953. Lack of funds precluded state support for testing for 1972-73 and may prevent future State Department support.

Policy
The following groups have been involved in policy-making: the Commissioner of Education, the State Board of Education and the State Education Agency.

Funding
Both federal and local funds have been utilized in covering program expenses. Local funds provided 60 percent of the expenses; the remaining 40 percent came from federal monies from ESEA Title I, ESEA Title III, NDEA V-A, Vocational Education Acts and Aid to the Handicapped. No state funds have been used directly. Last year, local education agencies tested 40,000 students and were reimbursed $25,000. Total program costs were roughly $65,000.

Administration
Statewide coordination of the program has been in the hands of the Director of Testing and Research of the Division of Instruction, assisted by personnel from the Planning and Evaluation Unit.


Population
Target groups were defined by grades 4, 6, 8 and 10 and do not include dropouts. Each school decided when to participate and whether or not to exclude students because of unusual mental, emotional, physical or social situations. No particular type of school was excluded from the program, although most private schools did not participate. In some areas private schools serve as local public schools, and in these cases they were included. Participation was voluntary. Last year, about 85 percent of the eligible schools were included. Communities with special characteristics were not given special attention.

Areas Assessed
In the cognitive domain, mathematics, reading, social studies and science have been assessed. Except for a career knowledge questionnaire, no measures have been used to assess the noncognitive areas.

Instrumentation
None of the tests previously used was tailor-made for the program; none was criterion-referenced or used as a group-reliable measure. The State Education Agency and an Advisory Committee, of which Walter Durost was a member, selected the Stanford Achievement Test Battery, the Otis-Lennon Tests of Mental Ability and the School and College Ability Test. In 1970-71, all students in grades 4, 6, 8 and 10, including those in parochial schools, were given the achievement battery and an ability test. In 1971-72, each school was allowed to choose the date, grade level and test to be administered for grades 8 and below. Testing procedures for grade 10 remained the same. In 1972-73, no reimbursement was provided by the state.

Related Data
Additional information on age was collected to help analyze the data.

Data Collection and Processing
Various unidentified groups had the responsibility of administering the tests in the schools. Local school districts and the State Education Agency then processed the information obtained. Some scoring was done through services available at the University of New Hampshire.

Interpretation
Local school districts and the State Education Agency were responsible for analyzing, organizing and interpreting the data.

Use of Data

Program results have been used primarily at the local level by the local education agencies. No legal restrictions were placed on use of the data.

Dissemination
The State Education Agency prepared various reports of the results of the assessment program. State summaries, school summaries and student reports were compiled and made available to schools, students, parents, school districts and the State Agency.

Overview

Official group reactions to the program are not available. On the basis of individual reactions, however, one can say that the program has been viewed most favorably by the State Education Agency and by the various local boards of education. The governor and local school administrators are not quite as enthusiastic; the legislature, the teachers and students and the general public are somewhat less favorable in their opinions. Objectives have not been achieved too well due to problems with funding, data processing and uses on the state and local levels.

Prospects for the Future
The program is not likely to continue, primarily because of lack of funds. If changes are made, they are likely to occur with respect to funding, areas assessed and measuring instruments used.

Individual Interviewed
The primary source of information was H. Stuart Pickard, Director of Planning and Evaluation, Commissioner's Office, State Department of Education, State House Annex, Concord, New Hampshire 03301 (telephone: 603 271-2340).

NEW JERSEY

New Jersey Educational Assessment Program

Introduction
New Jersey has two programs relating to state educational assessment, the New Jersey Educational Assessment Program and the "Our Schools" Project. The programs parallel each other and are constantly coordinated.

Goals
A formal set of statewide educational goals has been adopted by the State Board of Education. These goals concern both educational output and process. A citizens' advisory group, the Chief State School Officer, the governor, outside contractors, the State Board of Education, the State Education Agency, students, school board members, teachers, parents and business and community leaders were all involved in the determination of the goals. The current set of goals is available upon request.

Program Title and Purposes
Officially named the New Jersey Educational Assessment Program, its major purpose is to provide information useful to the development of programs and thrusts designed to move education closer to attainment of the state goals. The goals include assessing student achievement, assessing educational needs, measuring growth and influences on learning and providing data for educational management.

Initiation
The New Jersey Educational Assessment Program was initiated by the State Board of Education and supported by the governor's office, the state legislature and the Chief State School Officer. Approximately three years have elapsed between initiation of the program and its present stage of development.

Policy
The State Department of Education primarily determines how the program will be conducted, with citizens' advisory groups, the Chief State School Officer, the governor's office, the State Board of Education, teachers and administrators and local boards of education throughout the state suggesting possible program modifications.

Funding
The program is totally supported by state funds; program costs last year were approximately $300,000.

Administration
The state's Office of Educational Assessment, with a current staff size of four full-time individuals, coordinates the program statewide. Educational Testing Service, Princeton, New Jersey, was involved in constructing, administering and scoring the tests.

Population
All schools are required to participate in the program except for delinquent, handicapped, parochial and private schools. All public school students in grades 4 and 12 (with the exception of educationally mentally retarded, emotionally disturbed, non-English speaking, dropout and physically handicapped students) were assessed.

Areas Assessed
Mathematics and reading were assessed in the cognitive area.

Instrumentation
Tailor-made achievement tests were developed by Educational Testing Service, the State Department of Education and various state teacher committees.

Related Data
Information on age, cost of instructional programs, dropout rates, paraprofessionals or teacher aides, percent minority, previous course work, racial/ethnic data, school name and size, sex, socioeconomic and teacher data is collected to help analyze the test data.

Data Collection and Processing
Classroom teachers, guidance counselors and school administrative staff (under the direction of a district coordinator) are responsible for giving the tests. The State Department of Education and Educational Testing Service process the information obtained from the assessment instruments.

Interpretation
Local schools and school districts, with the aid of the State Department of Education, organize, analyze and interpret the data.

Use of Data
The results of the program will be used to assist the State Board of Education in the allocation of existing resources, to provide information to local districts which can be used to suggest and/or modify courses of instruction, to identify exemplary programs, for guidance and for program evaluation and planning. State Board of Education policy requires that all data, except for information related to individual students or teachers, be released to the public. Program data will not be released without proper interpretive materials, which must be developed by each educational level to which information will be provided.

Dissemination
The State Department of Education prepares interpreted state, regional, district type and county reports. Local educators prepare interpreted school system and individual school reports for public dissemination. These reports are available to the State Department of Education, the governor, the legislature, newspapers, other states, the State Board of Education, school districts, schools, students, parents, teacher organizations and the general public.

Overview
The program is viewed favorably by most state officials, local school board members, local school administrators, teachers, students and the general public. Some opposition from the New Jersey Education Association is seen as the one major problem related to the program.

Prospects for the Future
The program will continue, with probable modifications in target population, areas assessed, measuring instruments, data collection and processing, interpretation and use of data and dissemination procedures.

NEW JERSEY

The "Our Schools" Project

Goals
A formal set of statewide goals, based on information derived from the participation of 6,000 people (including students, local school board members, teachers, parents and business and community leaders) through two years of various activities, was formally adopted by the State Board of Education. The goals deal with educational output and process factors.

Program Title and Purposes

The purposes of the program, officially titled the "Our Schools"/Needs Assessment Project, are to set and continually reassess statewide educational goals, assess statewide status in relation to the goals, identify educational needs, determine influences on learning and growth and provide data for current educational management on the state level, including the instigation of the development of programs to address identified needs.

Initiation
The State Board of Education, with support from the governor's office, the State Department of Education and the Chief State School Officer, initiated the idea for this program. Approximately three years have elapsed between initiation and the present stage of development.

Policy

A citizens' advisory group, the State Board of Education, the Chief State School Officer and the State Department of Education determine how the program will be conducted and what changes will be made in the nature of the program.

Funding
The program expenses are totally supported by federal funds from ESEA Title III. The salaries of one and one-third full-time staff members are funded by departmental appropriations of state funds. The program cost was approximately $25,000 in 1972.

Administration
The Office of Planning coordinates the program statewide with a full-time professional complement of three. Educational Testing Service aided in the development of instruments, while Opinion Research Corporation conducted a statewide public opinion survey for the program.

Population
All students in all levels of public education in the state (with the exception of higher education) are included in the "Our Schools" program, including dropouts and adult and continuing education students. Participation in the program is voluntary, and nearly all school systems have participated in some manner.

Areas Assessed

All cognitive and noncognitive areas of the educational goals established by the state will be assessed in some manner. Examples of some of these areas are: mathematics, reading, social science, personal values, self-concept and natural sciences.


Instrumentation
Achievement tests covering basic skills in reading and mathematics were used for the initial target area. Other methods of assessment, including special sampling techniques, observations and oral and written surveys, are being considered to provide statewide summary information relevant to other goals.

Data Collection and Processing

Data will be collected by an assortment of methods, dependent upon the nature of the goal statement under consideration. The State Department of Education and outside contractors are responsible for processing the information obtained from the program.

Interpretation

The State Department of Education is responsible for analyzing, organizing and interpreting the data on the state level. Local school districts are encouraged to utilize project models and information.

Use of Data

Results of the program will be used in allocation of existing educational resources by the State Board of Education and the Department of Education in such areas as budgeting, analysis of needs and priorities, identification of exemplary programs, program evaluation, program planning and public information.

Dissemination

Summaries and public information pieces have been prepared for the program by the State Department of Education. Copies of these reports are sent to the governor, the legislature, newspapers, other states, the State Board of Education, school districts, schools, students, parents and teacher organizations and are available to the public at large.

Overview

The program is viewed very favorably by the governor, the legislature, the State Department of Education and the Board of Education. Other groups generally view the program with favor, including local school districts, administrators and teachers, students and the general public. On the whole the "Our Schools" program is achieving its objectives. The major problem seen facing this program is the necessity of bringing together representatives from across the state at regular intervals.

Prospects for the Future
The program is likely to continue with modifications and new directions in the areas of target population, areas assessed, measurement instruments, data collection and processing procedures and use of data.

Individuals Interviewed
The primary sources of information were Gordon Ascher, Director of the Educational Assessment Program (telephone: 609 292-7983) and Glenn H. Tecker, Assistant Educational Planner (telephone: 609 292-7600). Their address is Research, Planning and Evaluation Bureau, Department of Education, 225 West State Street, Trenton, New Jersey 08625.

NEW MEXICO

New Mexico Evaluation Program

Goals

A formal set of 11 statewide educational goals has been prepared by the State of New Mexico. The goals, a combination of input, process and output, were derived through the participation of citizens' advisory groups, the Chief State School Officer, the legislature, the State Education Agency and students. Though not formally adopted by the state, they are available on request.

Program Title and Purposes
The official title of the program is the New Mexico Evaluation Program. The major purposes are assessing cognitive and noncognitive development, assessing educational needs, measuring growth and influences on learning and, though in its formative stage, providing data for a management information system and a planning-programming-budgeting system leading to formalized statewide educational goals.

Initiation
The program was initiated by the State Education Agency in early 1971. The first standardized tests administered on a statewide basis were given in 1970. The first series of field tests for the objective-based testing occurred in 1972.

Policy

The Chief State School Officer, the State Board of Education and a citizens' advisory group, under the direction of the Director of Evaluation of the State Education Agency, determine how the program is conducted and what changes will be made.

Funding
Federal ESEA Title III and ESEA Title I monies finance about 33 percent of the program expenses, and local school districts finance the remainder. (Local costs include scoring services and test materials.) Total cost of the program last year was approximately $160,000.

Administration
The State Department of Education, in conjunction with the district superintendents, local boards and committees, coordinates the program statewide. The local committees are composed of students, teachers, administrators and community representatives. Five state professionals implement the evaluation design for the state. A state committee of counselors, psychologists, school administrators and others acts in an advisory capacity to the State Department Evaluation group. The State Department of Education is assisted by the staff of Educational Evaluation Associates in Los Angeles, California, in validating instrumentation, checking out procedures and auditing. This agency also serves as a liaison between the State Department and the legislature.

Population
All students in grades 1, 5 and 8 from public, parochial, private and vocational/technical schools are assessed utilizing standardized tests. Students in Title I programs (preschool) and dropouts are included; however, students from schools for the delinquent and the handicapped are excluded. No delinquent, educationally mentally retarded and emotionally disturbed students are assessed. All eligible schools were included in the assessment last year. Objective-based tests are administered to all students at grades 6, 9 and 12 in public schools statewide.

Areas Assessed
In the cognitive area, aptitude, English, mathematics, natural science, reading, social science and career education are assessed. In the noncognitive area, attitudes toward teachers, other children and the community are assessed.

Instrumentation
Standardized tests selected by the State Education Agency and committees were the Otis-Lennon Mental Ability Test and the Comprehensive Test of Basic Skills. Though the tests yield individually reliable results, they are used only as group-reliable measures by the state. Locally constructed instruments were developed by the State Education Agency, state professionals and the staff of Educational Evaluation Associates. As of 1973, 40 short objective-based tests had been constructed and field tested. Eventually, measures will be developed for each objective in the state's cognitive skills objective bank.

Related Data
Additional information collected to help analyze or interpret the data included cost of instructional programs, transportation costs, area density, dropout rates, attendance rates, percent of minority groups, racial/ethnic balances, school size, sex, socioeconomic data and teacher data (salaries, education, course loads and so on).

Data Collection and Processing
Classroom teachers are responsible for administering the tests. The State Education Agency in conjunction with Educational Evaluation Associates processes the program data.

Interpretation of Data
The State Education Agency and outside contractors analyze, organize and interpret the program data.


Use of Data
The results of the program are primarily used for instruction, comparative analyses across schools, identification of exemplary programs, guidance, program evaluation, program planning and community relations. A five-year limit on the use of the data has been established by the State Board of Education; however, this restriction has not yet been put into effect. Test results distributed to school districts and personalized reports to students hopefully will assist the local level administrators in improving the quality of their curriculum.

Dissemination
Reports of the program results are prepared by the State Education Agency. These include: state summaries, school system summaries, school summaries, student reports and public information pieces. The reports are issued directly to the State Education Agency, governor, legislature, State Board of Education, schools, school districts, teacher organizations and ERIC. State coordinators in each district receive a copy of all monographs. Students receive a copy of their scores through their schools. Standard reports to parents relative to the students' performance on the standardized testing may be furnished, although this is an option left to the local educational agency.

Overview
The governor, legislature and State Education Agency have reacted very favorably to the program. Boards of education, local school administrators, teachers, students and the general public show a generally favorable attitude. Program objectives are being achieved, but synthesizing and distributing information to the state's 300,000 students and 18,000 teachers in a lucid manner is the chief communication problem.

Prospects for the Future
The program is projected to continue. Areas assessed will be changed and measuring instruments are being reviewed in order to facilitate data collection, the use of data and dissemination of materials. These areas are expected to improve as the program progresses into its third, fourth and fifth years. Information derived in the statewide evaluation effort will be utilized in conjunction with other information to formulate the basis for school accreditation.

Individual Interviewed
The primary source of information was Alan Morgan, State Director, Evaluation and Assessment, State Department of Education, Santa Fe, New Mexico 87501 (telephone: 505 827-2928).

NEW YORK

Introduction
The New York State Assessment and Evaluation System is made up of the following components:
1. PEP (Pupil Evaluation Program) and
2. the Regents Examinations, criterion measures of quality and essentially norm-referenced tests;
3. SPPED (System for Pupil and Program Evaluation and Development), a resource for instructional program development, which includes a new developing system known as CAM (Comprehensive Achievement Monitoring);
4. PIE (Performance Indicators in Education), in a sense a delivery system of information, which provides measures of school effectiveness and uses output from BEDS;
5. BEDS (Basic Educational Data System), the statewide information system, which will be extended this year to include a student file (information is currently based on teachers and their programs, finances and buildings);
6. NYSEIS (New York State Education Information System), a statewide plan for 12-13 regional data processing centers. (These centers will facilitate the transmission of information by the schools, districts and regions to the State Education Department for statewide analysis.)
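The component list above implies a reporting pipeline: schools report through districts and the planned regional processing centers to the State Education Department for statewide analysis. The sketch below is a hypothetical illustration of that flow, not a description of NYSEIS itself.

```python
from collections import defaultdict

# Hypothetical submissions: (region, district, school, records sent).
reports = [
    ("Region 1", "District A", "School 1", 310),
    ("Region 1", "District B", "School 2", 280),
    ("Region 2", "District C", "School 3", 450),
]

# Regional centers batch the submissions from their schools...
regional_totals = defaultdict(int)
for region, _, _, n in reports:
    regional_totals[region] += n

# ...and forward them to the department for statewide analysis.
statewide_total = sum(regional_totals.values())
print(dict(regional_totals), statewide_total)
```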

In the future, the State Education Department plans to assist local school administrators in interpreting the information relayed back to them. The Education Department also plans the construction of a simulation model, by which local administrators could test policies, that is, personnel, use of technology, educational aid and so on.

Detailed individual program descriptions follow.

Goals
The Education Department is in the process of preparing a formal statement of educational goals, encompassing all testing programs and information banks in the state assessment system. Plans called for the completion of the first draft by January 1973 and the presentation of the goals to the Board of Regents for approval. This draft is not the final version, however; the goals will be submitted for statewide discussion before an amended version is adopted.

New York has also been involved in a two-year statewide commission on the cost, quality and financing of education, chaired by Manly Fleischmann. The recommendations of the Commission, known as the "Fleischmann Report," may affect future assessment plans in New York. This report emphasizes the need for establishing a statewide accountability system, involving all programs (see above) and the Office of Long-Range Planning, which would be incorporated into the statewide operating package.

First, the report recommends that accountability be centered on achievement at the building level, with annual reports to be sent to the districts and the Education Department and to be made available to parents. The annual reports should be based on pupil achievement by grade, including the measures of any standardized tests in different fields that are involved.

Second, the report recommends that a Statewide Testing Program be initiated at specified grade levels through secondary school, emphasizing the core skills, that is, English composition, reading and mathematics. This testing program would eliminate Regents Exams in the high schools and substitute nationally normed achievement tests such as the National Assessment or College Board Examinations. Throughout the report, "regionalization" is emphasized: regional achievement report preparation, regional data processing and so on. Suggested also is the transferral of PIE and the delivery system to the regional level so that operations would be conducted in a regional-local sense rather than an Education Department-centered sense.

The report further recommends that school achievement accountability be coupled with fiscal accountability in standardized budgeting and auditing procedures; this system would be established by the Education Department. The achievement and fiscal accountability would have as a basic link the statewide comprehensive information systems to provide facts for long-range planning, evaluation and enforcement of state mandates.

Pupil Evaluation Program (PEP)

Title and Purpose
The official program title is the New York State Pupil Evaluation Program. Its major purpose is to provide a single, uniform set of statewide test data that will identify educationally disadvantaged students and give an objective picture of the reading and mathematics achievement of these students.

Initiation
PEP was established by the State Education Department in the fall of 1965 after enactment of ESEA Title I.

Program Policy
The State Education Department determines how the program is conducted, with advice and assistance from local school districts.

Funding
PEP is financed almost entirely by federal funds, ESEA Title I; state funds are utilized for overhead costs (office space and so on). Last year the program cost approximately $250,000.

Administration
The Division of Educational Testing of the State Education Department coordinates the program statewide. Three professional staff members devote full time to PEP and related test advisory services.


Population
The target group consists of all students in grades 3, 6 and 9. PEP is an every-pupil testing program; however, severely mentally, physically or emotionally handicapped children are not required to participate. (Non-English-speaking students may be excused, but their scores must be entered as zero.) All schools, public and nonpublic, are required to participate.

Target Areas
Reading and mathematics achievement in grades 3, 6 and 9 are the target areas.

Instrumentation
The PEP tests are standardized survey tests in mathematics and reading developed by the State Education Department with the aid of teacher committees. All instruments were carefully pretested, and all are used as group-reliable measures.

Related Data
Additional information collected to help analyze and interpret the data includes size and type of school and type of community (large or small cities, rural areas and village-suburban areas). Most of this information is used to create multiple reference groups for the program.

Data Collection and Processing
Classroom teachers are responsible for administering the tests. The local school or district personnel score the tests, prepare distributions of part and total scores by school building and then send this distribution to the State Education Department on machine-readable forms.

Interpretation
The State Education Department analyzes and summarizes the distributions of raw scores provided by the schools into individual school building reports. These reports provide percentile ranks of median scores and cumulative percents of pupils in nine levels of achievement (stanines). The reports also contain similar information for relevant reference groups of pupils, including all pupils in the school district. School building and school district administrators receive copies of the reports for the pupils in their schools, and copies are kept on file in the Department.
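The arithmetic behind those two report statistics is straightforward. The Department's actual routines are not reproduced in this report, so the short Python sketch below is only an illustration, with invented function names and made-up score data:

    from bisect import bisect_left

    def percentile_rank(value, reference_values):
        """Percent of the reference group scoring below `value` (hypothetical helper)."""
        ordered = sorted(reference_values)
        return 100.0 * bisect_left(ordered, value) / len(ordered)

    def stanine_cumulative_percents(counts_by_stanine):
        """Cumulative percent of pupils at or below each of the nine stanines."""
        total = sum(counts_by_stanine)
        cumulative, running = [], 0
        for count in counts_by_stanine:   # index 0 = stanine 1 ... index 8 = stanine 9
            running += count
            cumulative.append(round(100.0 * running / total, 1))
        return cumulative

    # Made-up figures: one building's median score against other buildings'
    # medians, and its pupil counts in stanines 1 through 9.
    print(percentile_rank(34, [28, 30, 31, 33, 35, 36, 38, 40]))        # 50.0
    print(stanine_cumulative_percents([2, 4, 7, 12, 20, 14, 8, 5, 3]))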

Use of Data
The results of PEP are used for allocating state and federal funds, budgeting, instructional planning, comparative analyses across schools, research (identification of exemplary programs), program evaluation, program planning and other accountability purposes.

Dissemination
State summaries are prepared by the State Education Department. These reports are available to the governor, the legislature, newspapers, teacher organizations and the public.


Overview
The program appears to be achieving the objectives for which it was established. A major problem has been the tendency among some groups, lacking technical background, to use the test results in isolation as a measure of the quality of the educational program.

Prospects for the Future
PEP will probably continue for the near future, and the above description will remain accurate for two or three years.

Regents Examination Program

Title and Purpose
The official program title is Regents Examination Program. Its major purposes are (1) to evaluate achievement and progress, (2) to establish and maintain standards, (3) to provide supervisory tools for improving instruction and (4) to serve a guidance function.

Initiation
The first examinations were administered in 1865. The State Education Department oversees the program and has made many changes in it since that time to keep it abreast of changes in education. All public high schools are required to make "general use" of the examinations in order to be eligible for state aid.

Program Policy
Program policy is determined by the State Education Department and the Board of Regents with the advice and help of representatives of local school systems.

Funding
In fiscal year 1972, the program was supported by 60 percent state monies and 40 percent federal funds, ESEA Title III. The total cost was about $500,000.

Administration
The Division of Educational Testing of the State Education Department coordinates and administers the program statewide. The validity of the tests and appropriate interpretation and use of the results, however, are a joint responsibility shared with the subject specialists in the Division of General Education. Approximately 25 Department professional staff members provide about 12 full-time equivalent man-years of work on the program each year.

Population
The target groups are defined by the courses of study. All students in grades 9-12 who take Regents courses of study, which are designed for pupils of average and above average ability, are expected to take Regents examinations in those courses. Other students may also take Regents examinations with the approval of the high school principal.


Areas Assessed
Examinations are provided in 21 subject areas within the cognitive areas of English, foreign languages, mathematics, natural science, social science and business.

Instrumentation
The examinations vary from end-of-year tests, as in mathematics and natural sciences, to comprehensive examinations encompassing learning objectives of several years of instruction, as in English and social sciences. Both essay and objective-type questions are used. The examinations are prepared by the State Education Department with the assistance of hundreds of teachers each year who serve as item writing and examination review consultants and as members of committees preparing the final examinations. All questions are carefully checked against course objectives, and all objective questions are pretested. Entirely new examinations are prepared for each of three administration periods each year (January, June and August), and the difficulty level of each new examination is carefully controlled to match that of previous examinations.
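The report does not describe how that difficulty control is carried out; one classical technique for the purpose is linear (mean-sigma) equating of a new form to a previous one through a common group of pretested examinees. The Python sketch below, with invented scores, is offered only as an illustration of that idea, not as the Department's procedure:

    import statistics

    def mean_sigma_equate(new_scores, reference_scores):
        """Map scores on a new form onto the reference form's scale so the
        equated scores share the reference mean and spread (illustrative only)."""
        m_new, s_new = statistics.mean(new_scores), statistics.pstdev(new_scores)
        m_ref, s_ref = statistics.mean(reference_scores), statistics.pstdev(reference_scores)
        return [m_ref + (x - m_new) * s_ref / s_new for x in new_scores]

    # Invented pretest scores for the same anchor group on two forms:
    print(mean_sigma_equate([58, 61, 70, 75], [62, 66, 73, 79]))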

Related Data
The Department collects information concerning Regents course enrollments and percent passing each examination.

Data Collection and Processing
Classroom teachers administer and score the examinations under the supervision of the high school principal in accordance with directions provided in administration manuals and rating guides provided by the Department. The Department reviews a random sample of papers in each subject to verify maintenance of rating standards.

Interpretation
The passing score on each examination is 65 percent. Normative data is also prepared for all pupils writing each examination.

Use of Data
The results of Regents examinations are used for instructional planning, guidance, program evaluation and public information. Regents examination scores are placed on students' school records and transcripts. The extent to which the marks are used to determine local grades is optional with schools. The scores are also used by students to earn a state high school diploma or a state endorsement on the local school diploma.

Dissemination
Statistical summaries are prepared and disseminated to school districts, individual schools and teachers. The reports are also available to the public. Pupils, of course, are provided with their test scores.

Overview
The program is considered to be achieving its objectives. There is some criticism of overemphasis on Regents marks.


Another criticism is that not all students are involved. The program is likely to continue, with the above description remaining accurate for at least 4 to 5 years.

System for Pupil and Program Evaluation and Development (SPPED)

Introduction
There are three major parts of the SPPED project: (1) a bank of behavioral objectives, test items and instructional resources known as BOIR; (2) a computer-based evaluation system employing Comprehensive Achievement Monitoring (CAM); and (3) sets of planning and training materials that support system implementation. The BOIR is a state-developed and state-maintained computerized storage and retrieval system which, when complete, will provide school personnel with objectives, test items, learning resources and past achievement data. The information will assist with curriculum development, test construction and instructional program development.

The BOIR is objective-based in that both the test items presently in it and the items which will be added to it will be cross-coded to particular objectives. Plans call for the addition of an instructional resources component which will be coded to particular objectives. This means that once an objective is selected for use, test items and lists of instructional resources related to that objective can be provided. The objectives within the bank are all behavioral and of various types, ranging from terminal behavioral objectives, suitable for defining the end-parts of an instructional program, to objectives designed to meet day-to-day instructional needs. Test items in the bank also vary from fill-in to multiple-choice measures. As the bank is expanded, observation measures requiring more extensive written responses will be included along with the test items. The instructional resources part of the bank is designed to hold entries of all kinds of resources that could be used to provide instruction on a particular objective, along with data about that resource. This data will include such things as the mode of instruction for which the resource is appropriate, reading level, the past performance of students who have used the resource, the price, the producer and so on.

The second SPPED component consists of a computer-based evaluation system that can generate data related to student performance on objectives. The Comprehensive Achievement Monitoring (CAM) evaluation technique is based on an evaluation design which incorporates item-examinee sampling and a sliding evaluation design which allows manipulation of the amount of pre-test, post-test and retention data that is collected on student performance in a testing operation. The reports generated using CAM can be used to make decisions related to individual student progress and to evaluate the instructional program formatively and summatively. CAM computer processing is done by a single program, with two smaller programs helping to produce data that is required to initiate the main program.


The first of these programs, a Test Construction Program, generates the item-examinee sampling design given input on the number of subjects, the number of items per test form, and the number of objectives to be tested across the form. In this process, test forms are generated at random, with each objective having equal opportunity of appearing on a test form. The other program is the Test Scheduling Program, used to randomly assign individuals to test forms based on student aptitude or past performance; consequently, the CAM-generated sample estimates of class performance by objective are unbiased.
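Neither supporting program is printed in this report; the Python sketch below is only a rough present-day illustration of the two steps just described, with invented data structures. Forms are assembled so that every objective has an equal chance of contributing an item, and pupils are then dealt to forms in shuffled round-robin fashion within each ability stratum so that every form receives a comparable mix of pupils:

    import random

    def build_forms(items_by_objective, n_forms, items_per_form):
        """Generate forms; each slot draws its objective uniformly at random."""
        objectives = list(items_by_objective)
        forms = []
        for _ in range(n_forms):
            form = []
            for _ in range(items_per_form):
                objective = random.choice(objectives)   # equal opportunity per objective
                form.append(random.choice(items_by_objective[objective]))
            forms.append(form)
        return forms

    def schedule_pupils(pupils_by_stratum, n_forms):
        """Deal pupils to forms in shuffled round-robin within each stratum."""
        assignment = {}
        for pupils in pupils_by_stratum.values():
            shuffled = random.sample(pupils, len(pupils))
            for i, pupil in enumerate(shuffled):
                assignment[pupil] = i % n_forms
        return assignment

    bank = {"obj-1": ["item-1a", "item-1b"], "obj-2": ["item-2a"],
            "obj-3": ["item-3a", "item-3b"]}
    forms = build_forms(bank, n_forms=4, items_per_form=3)
    print(schedule_pupils({"low": ["p1", "p2"], "high": ["p3", "p4"]}, len(forms)))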

This system of evaluation is also designed to be supported by a complete test construction operation set up in the local school districts, allowing these districts to construct more reliable and valid tests than those developed previously. The items for the tests are drawn from the bank. The construction operation is focused around the development of a form which will provide reliable data for decision-making and yet meet certain criteria related to composition and duplication of the required number of test forms.

The third and final component of the SPPED Project consists of sets of materials for planning and implementing the project and for training the people who will use it. The training materials come in both the simulation format and as self-instructional modules. They are designed to enable staff to handle skills ranging from selection of objectives to interpretation of data for decision-making.

Program Title and Purposes
The official title of this program is the System for Pupil and Program Evaluation and Development (SPPED). Its major purposes are to assess educational needs, measure influences on learning, measure growth, provide data for a management information system, set statewide educational goals and provide information for a planning-programming-budgeting system.

Initiation
The State Education Department initiated the idea for SPPED with support from foundations, ongoing work in universities and continuing development in the Department itself. Four years have elapsed since work with CAM began, and three years have elapsed between the initiation of the more comprehensive SPPED project and its present stage of development.

Program Policy
Program policy is determined by the State Education Department in conjunction with university groups who contribute to the system.

Administration
The Division of Research coordinates the program statewide. The professional staff within the Division consists of seven professionals. Additional State Education Department personnel are drawn on a part-time basis from the Mathematics and Reading Bureaus. Other assistance is provided by personnel from various regional educational centers throughout New York State, the University of Massachusetts, the University of Rochester, Stanford University, Random House, Sequoia Union High School in California, and the Title III Project in Hopkins, Minnesota.

Population
The target group is composed of all students K-12. Dropouts and students below grade level are included; educationally mentally retarded students and handicapped students are not. Participation is not confined to communities with special characteristics, and school participation is voluntary. Six percent of the schools in New York participated last year.

Areas Assessed
SPPED concentrates mainly on mathematics and reading; however, there is some involvement in all other content areas found in public schools. Noncognitive areas such as creativity and self-concept are not included in the assessment program, although these areas are involved in some research projects. Aptitude is measured, but only to classify participants into low, medium or high ability for assignment to testing forms.

Instrumentation
All the tests used in SPPED are from the item and objective banks provided by the state (see Introduction). All the instruments are custom-made for SPPED and are criterion-referenced measures. Either past achievement, criterion-referenced measures or standardized achievement tests are used to measure the aptitude of the participants. The specific standardized achievement tests include the Metropolitan, Iowa and Stanford Achievement Tests.

Related Data
Additional data collected for analysis and interpretation includes age, cost of instructional programs, aptitude, grade level, number of objectives, groups of objectives, courses, programs and teacher information.

Data Collection and Processing
Classroom teachers are responsible for administering the instruments. If the local schools have a computer, they are responsible for processing the information obtained from the program. Otherwise, the data is processed in one of the state's regional computing centers or by commercial agencies.

Interpretation
Local schools are responsible for analyzing and interpreting the information.

Use of Data
The results of the program are used to manage individual student performance, to formatively evaluate a single part of an instructional program or a complete program, to compare programs and to validate the curriculum which is based on the objectives themselves. Use of the data is not restricted by law.

Dissemination
Technical reports and the content of the behavioral-objectives bank are sent to ERIC. In some cases, the State Education Department publishes reports; in most cases, the local school districts prepare their own summaries. These summaries are disseminated to every audience, from the student to the superintendent.

Overview
In general, SPPED is viewed favorably by various audiences within the New York State educational community, who feel the program fills various needs. On the Department level, it is viewed positively since it provides for clear program specification and data production related to the adequacy of that program. School administrators share this same interest; they also feel that SPPED serves as a source for building and improving these programs. Response by teachers, students and parents is also generally positive, based on the SPPED delivery of specific information directly related to the class and individual student. Other audiences such as the governor and legislature have probably only recently become aware of the system through the Fleischmann Report (see Introduction) and have not registered any opinion. Interest in the system by school districts across New York State is growing, and the Division of Research expects that 10 percent of the state's districts will in some way be using the system by the 1973-74 school year.

The major problems are (1) lack of familiarity with the system's approach on the part of school personnel, (2) need to restructure the schools in order to enable teachers to meet and work together regularly, (3) lack of familiarity with the computer, (4) lack of funds at the local level and (5) lack of sufficient funding at the state level. This lack of funds has caused uneven development of the project; that is, the SPPED Reading Bank needs test items and the instructional resources bank is undeveloped.

Prospects for the Future
The program, based on a 10-year plan, is very likely to continue. The only changes likely to occur are that (1) the funding basis may improve and shift to other types of support and (2) the areas assessed will expand to include personal values, morals and moral judgment.

Performance Indicators in Education (PIE)

Program Title and Purposes
Performance Indicators in Education is the official title of the program. The purposes of the program are to measure school effectiveness and to provide information to help state and local school officials improve educational programs through better allocation of resources.


Initiation
The forerunner of PIE, the Quality Measurement Project, was initiated by the State Education Department in 1956. The Office of Planning and Innovation proposed methods of developing indicators of educational performance in 1966, and the Bureau of School Program Evaluation subsequently began the actual development of performance indicator models. Four years elapsed from the initiation of the idea to actual implementation.

Program Policy
The Associate Commissioner for Research and Evaluation, the Director of the Division of Evaluation and the Chief and staff of the Bureau of School Programs Evaluation make program decisions consistent with the policy of the Commissioner and the Board of Regents. Advice is also obtained from local school officials and BOCES superintendents.

Funding
PIE is funded by ESEA Title V and State Education Department funds; however, the State Education Department's primary contribution is staff. The total budget for fiscal year 1972 was approximately $60,000.

Administration
The Bureau of School Program Evaluation coordinates the project. The Chief of the Bureau of School Program Evaluation is responsible for planning, liaison with other units, budget preparation, supervision and the editing of reports. The Chief's staff consists of two research associates, one research assistant and one typist. Educational Testing Service was involved in a preliminary feasibility study. RRC International assisted in developing computer programs for generating performance estimates of educational programs.

Population
To date, the project has concentrated on grades 3 and 6. Profiles showing the standing of a district were prepared for 628 districts in 1972; that figure represents about 80 percent of the districts in the state. The six largest districts in New York were excluded from the program in 1972; however, these districts will be included in the 1973 analysis. Over 100 districts were not included because complete data for them was not available at the time statistical analyses were made. Non-public schools were not included.

Target Area
PIE concentrated mainly on reading and mathematics in 1972.

Instrumentation
No single instrument is required; scores on any achievement test could be used as criterion measures. Presently, the project uses output data from the Pupil Evaluation Program (see description), specifically reading and mathematics tests for grades 3 and 6, and a variety of information contained in the Basic Educational Data System (see description). Plans call for using other data, available through other State Education Department units and local standardized testing using the ETS-OE Anchor Test method.

Related Data
Data is collected on surrounding conditions, that is, those influences in the environment (home, school, community) likely to affect educational outcomes.
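The report does not state the statistical model behind PIE's performance estimates. A common construction for such indicators, offered here purely as an illustration under that assumption, regresses district achievement on the surrounding-condition measures and reads the residual (actual minus predicted achievement) as the indicator of effectiveness; the Python sketch below uses invented variable names and figures:

    import numpy as np

    # One row per district: hypothetical condition measures such as median
    # parent education and percent disadvantaged (all figures invented).
    conditions = np.array([[11.2, 0.31], [12.8, 0.12], [10.5, 0.45], [13.1, 0.08]])
    achievement = np.array([54.0, 63.5, 49.0, 66.0])   # mean district scores (invented)

    X = np.column_stack([np.ones(len(conditions)), conditions])  # add an intercept
    coef, *_ = np.linalg.lstsq(X, achievement, rcond=None)       # least-squares fit
    indicator = achievement - X @ coef   # positive values: a district scores above
                                         # districts with similar conditions
    print(np.round(indicator, 2))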

Data Collection and Processing
The State Education Department is currently responsible for collecting and processing the data.

Interpretation
The State Education Department is responsible for interpreting the data.

Use of Data
Results are being used by district staffs to identify areas for further study and analysis in order to improve reading and mathematics programs.

Dissemination
School system summaries have been prepared and disseminated to 628 school districts. The reports indicate school district performance in reading and mathematics, grades 3 and 6. Local school officials receive individual reports for their districts; the reports are also available to staff members of the State Education Department. A description of the program is available through ERIC and has been disseminated to other state education departments.

Overview
The project is viewed favorably by State Education Department leaders and legislative staff members. Local administrators' reactions have been mixed but generally favorable. The program objectives are being achieved fairly well, although PIE staff would like to collect other kinds of data (in addition to the data already collected by the Department), for example, socioeconomic data.

Prospects for the Future
PIE is very likely to continue, with this description remaining accurate for one year. Elements most likely to change in the near future are the levels and areas assessed, the data collected and analytical procedures.

Basic Educational Data System (BEDS)

Introduction
Because BEDS is an information system and not specifically an assessment program, the topics "Goals," "Instrumentation" and "Related Data" have been eliminated.

Program Title and Purpose
The program is entitled the Basic Educational Data System (BEDS). Its purpose is to improve the quality of information available to interested parties by centralizing and standardizing the data collection.

Initiation
The State Education Department initiated the program in the fall of 1967, approximately five and one-half years ago. Planning and consolidation of forms systems began in 1959.

Policy
The Director of the Information Center on Education, the Chief of the Bureau of Educational Data Systems and the Chief of the Bureau of Statistical Services, all State Education Department based, determine how the program is conducted and also effect any changes, with advice from Department and school district representatives.

Funding
The program is financed by 82 percent state funds and 18 percent federal funds.

Administration
The Chief, Bureau of Educational Data Systems, and the Chief, Bureau of Statistical Services, coordinate the program statewide. The Information Center on Education and the Division of Electronic Data Processing jointly developed the program's mark-sense forms.

Population
All teachers in New York state public schools are included. The program is mandatory; all public schools are required to participate.

Target Areas
BEDS calls for the annual collection, analysis, storage and retrieval of basic information on public schools' professional staff, students, curricula and facilities.

Data Collection and Processing
As mentioned under "Administration," two mark-sense forms are used: one by classroom teachers and one by nonclassroom professional staff. In addition, a school summary form is prepared by each school principal, and a school district summary form is prepared by the district central office. Individual school districts are responsible for collecting data on "Information Day" each fall. A digitizer (mark-sense reader) in the State Education Department processes the mark-sense forms.

Interpretation
All data is summarized by computer.

Use of Data
The data is primarily used to provide a starting point at state and local levels for the educator in decision-making and planning curriculum development, experimental programs, staffing and instructional facilities. Summary data can provide individual school districts with meaningful comparisons to help measure their own needs. Salary levels of educators can be readily compared with those of other professionals and with current cost-of-living figures. Use of the data is not restricted by law; however, a strict Department policy on confidentiality does restrict the use of data associated with individuals.

Dissemination
Standard reports are prepared and distributed to divisions within the State Education Department and to individual school districts. Special reports based on state or regional schools may be prepared upon request, usually for the legislature, the State Education Department, the commissioner, school districts, professional associations, researchers and others. Interested parties may request information for any data level. Certain personal data is not available on an individual basis.

Overview
BEDS is generally viewed favorably.

Prospects for the Future
The BEDS staff is now developing a consolidated data base which will include the full BEDS file, all PEP data, extensive financial information and the 1970 census data. The program will also break out sub-systems, for example, disadvantaged, handicapped or vocational education. The purposes of this computer-accessed data base are to enable quicker access to the information and to enable easier and quicker manipulation of the data.

New York State Education Information System (NYSEIS)

(Because of the nature of this program, the standard format is not appropriate.)

The New York State Education Information System is a structure of 13 regional data processing centers. These centers provide administrative data and processing services to school districts; the information includes attendance, grade reporting, financial accounting, personnel data and payroll application. Future plans are to include educational program management information in the bank.

The State Education Department will then receive 13 tapes of information from these regional data processing centers as a spin-off from their regular activities. This process will facilitate their assessment and evaluative procedures by making current district information easily available at state, regional, local district, school and class levels.

Individuals Interviewed
The primary sources of information were Lorne H. Woollatt, Associate Commissioner for Research, Evaluation and Communication; Robert O'Reilly, Chief, Bureau of School and Cultural Research; Victor Taber, Director, Division of Educational Testing; John Stigelmeier, Director, Information Center on Education; Alan Robertson, Director of Division of Educational Evaluation; and David Irvine, Chief, Bureau of School Programs Evaluation. All are in the New York State Education Department, Albany, New York 12224 (telephone: 518 474-3878).

REFERENCES
Robert P. O'Reilly, The SPPED Reading Objectives: A State Bank Trial Project, 1972.

NORTH CAROLINA

North Carolina State Assessment Program

Goals
A formal set of statewide educational goals was adopted by the Executive Staff of the State Department of Public Instruction in the spring of 1971. The establishment of goals and specific objectives for the agency marked the beginning of a major movement dedicated to improving the management of the public schools in North Carolina, both at the state and local levels. The formulation of goals for the state agency came only after considerable input was received from a Governor's Study Commission on Education and from persons interested in the educational enterprise in North Carolina, such as superintendents, teachers, members of the State Department of Public Instruction, school board members and others. Concurrent activities were begun in planning and needs assessment so that education in North Carolina could move toward meeting its goals. The assessment program is described below.

Program Title and Purpose
The assessment program is entitled The Statewide Assessment of Educational Progress at the Sixth Grade. The program will provide performance information to citizens, legislators, local school boards and those responsible for educational leadership on the state level, for the purpose of improving the quality of education in North Carolina. The information will assist in identifying statewide educational needs and will facilitate setting priorities in North Carolina. Local school districts will be aided in planning and priority setting; they will also have access to meaningful state, regional and type-of-community norms for comparison purposes. The general public will hopefully achieve a greater understanding of the needs, problems and accomplishments of their school systems. The assessment program will also provide the state legislature with educational information needed for the enactment of legislation appropriate to educational needs.

Initiation
The State Superintendent of Public Instruction, aware of a need for more performance information, initiated the state assessment as part of a total effort in better management practices. The planning grant for assessment was approved by the State Board of Education in 1970; intensive briefing and preparation followed on all levels in 1971, and the first instruments were administered in April of 1972.

Policy
The State Superintendent of Public Instruction is responsible for providing the overall leadership for the assessment program. The planning and implementation for the assessment is conducted by the Assistant Superintendent for Research and Development in collaboration with other Assistant Superintendents.

Funding
State research funds and ESEA Titles I, III and V were utilized for the direct costs of the program. State funds also purchased materials, travel and certain administrative expenses. In 1971-1972, the program's direct cost was approximately $115,000. Seventy percent of this amount came from federal monies. An additional $65,000 of local support for training and administration was combined with $40,000 of state in-kind contribution in salaries of persons involved in implementing the many stages of the program. This amounted to a grand total of $220,000.

Administration
The Research Triangle Institute (RTI), a North Carolina non-profit research firm, was responsible for sample selection and data analysis. Except for the sampling process, all other phases of state assessment were conducted by the State Department of Public Instruction in collaboration with the Research Triangle Institute.

Population
The target population was defined by grade (grade 6). The only exclusions from the program were trainably mentally retarded students and students not in the public school system. The participating students were randomly selected in such a way that information on the state as a whole was obtained (that is, across the state's geographic regions, the types of communities, the racial population and so on). School participation was voluntary. All of the randomly selected schools participated in the program.

Areas Assessed
The cognitive areas assessed were reading, language arts, mathematics, career awareness and academic ability. The noncognitive areas assessed were student's attitude toward school, teacher in the learning environment, parent-home environmental influence, self-concept, self-motivation and need to achieve, and peer influence.

Instrumentation
The Department of Public Instruction selected the cognitive instruments; these instruments were reviewed by a group of school psychologists and local school superintendents. With their concurrence, the final selection of the Iowa Tests of Basic Skills was made. The assessment program also included the Lorge-Thorndike Intelligence Test. The noncognitive test was developed by the Division of Research, Department of Public Instruction.

Related Data
The State Education Agency, the school principals and the students' homeroom teachers supply school information concerning size, enrollment, percent minority students, adequacy of several school components and information about teachers. Student information was secured which included sex, age, socioeconomic status, race and previous participation in ESEA Title I.

Data Collection and Processing
Local school superintendents appointed their own testing specialists for test administration. These specialists were used in order to insure greater standardization and control of the testing situation. Measurement Research Center (MRC) scored the tests after they had been reviewed and edited by the test administrators, the Assessment Coordinator and Research Triangle Institute.

Interpretation of Data
Research Triangle Institute was responsible for the initial data analysis phase. Interpretation of results and further analyses were completed by the Division of Research, State Department of Public Instruction.

Use of Data
The data will be used for program planning and priority setting on both the state and local levels. The data will also identify educational needs and provide baseline data against which appropriate comparisons can be made in subsequent years. The information obtained from the student's background questionnaire and the Principal Questionnaire will be used to determine the status of many learning and environmental factors thought to influence learning. The data obtained from the Skills Test will be used to compare the performance of North Carolina students with southeastern and national norms.

Dissemination
State summaries and regional summaries were prepared and disseminated to the State Education Agency, the governor, the legislature, newspapers, the State Board of Education, the local school districts and the schools. A slide/sound presentation was prepared for these audiences, as were magazine articles in professional journals.

Overview
The program has generally been well received.

Prospects for the Future
The assessment program is very likely to continue as a result of state appropriations for assessment by the 1973 legislature. The scope and character of the assessment instrumentation package will be modified, and additional grade levels will be added as assessment points.

Providing the local school districts with technical assistance in developing information systems for decision making will also be an added dimension to next year's assessment model.

Individual Interviewed
The primary source of information was William L. Brown, Director of Research, State Department of Public Instruction, Raleigh, North Carolina 27602 (telephone: 919 829-3809).

NORTH DAKOTA

State of North Dakota Assessment of Educational Needs

Goals
At this time North Dakota does not have a formal set of statewide educational goals; however, it is in the process of developing a set of goals.

Program Title and Purposes
The official title of the program is the State of North Dakota Assessment of Educational Needs. Its major purposes are to set statewide educational goals, to assess cognitive development, to assess educational needs and to measure growth.

Initiation
The idea for the program was initiated in January 1969 by the State Education Agency, but only a testing program was initiated at that time. Other phases have been added since then.

Policy
The State Education Agency is responsible for determining program policy and change with the aid of a citizens' advisory group.

Funding
The program is financed by approximately 70 percent federal funds, 20 percent state funds and 10 percent local funds. Federal funds are provided by ESEA Title III (for guidance purposes) and ESEA Title IV Section 402. The state provides funds from its Guidance Division. Estimated cost for 1972-73 is $23,300.

Administration
ESEA Title III and Title IV, Planning and Development and Pupil Personnel Services personnel coordinate the program statewide. At this time no outside agencies or consultants are involved, except in data processing. Two full-time equivalents have been assigned to the program.

Population
North Dakota defines its target group by grade, testing grades 3, 5, 7, 9 and 11. Needs in reading and attitudes were assessed in grade 4. Dropouts are not included. The only individuals excluded are migrants. Except for reading, which will be done on a sample basis, all individuals in the target group are tested. All schools are included, but participation is voluntary. Last year 93 percent of the elementary schools and 87 percent of the secondary schools participated, bringing total participation to 90 percent.

Target Areas

The cognitive areas of English, mathematics, natural science, reading, social science and writing are being assessed. Attitudes toward peers, family, school and self are being assessed in grade 4.

Instrumentation
North Dakota used the Iowa Tests of Basic Skills, the Iowa Tests of Educational Development and an Attitude Scale developed by Instructional Objectives Exchange (IOX). All tests were chosen by the State Education Agency and will be used as group-reliable measures, though some yield individually reliable results.

Related Data

To help analyze and interpret the test results, North Dakota will consider minority percentages, racial/ethnic information, school name, school size, sex, teacher data, dropout rates and cost of instructional programs.

Data Collection and Processing

Guidance counselors are responsible for giving the tests and inventories. Outside contractors, Measurement Research Corporation and Science Research Associates, then score and process the tests.

Interpretation
The outside contractors aid the State Education Agency and local schools to analyze, organize and interpret the test results.

Use of Data

The program results will be used for instruction, identification of exemplary programs, guidance, program evaluation and program planning. Use of the data is not restricted beyond the understanding that confidentiality be respected.

Dissemination
The only reports prepared are score reports sent to participating schools. Schools receive local norms and percentile rankings for each student. Each school is sent only results that pertain to that school.


Overview
The program is viewed favorably by the governor, legislature, State Board of Education, students and general public, and even more favorably by the State Education Agency, local school administrators and teachers. Program objectives are being achieved to a limited extent. A major problem lies in the proper utilization and interpretation of test and program data.

Prospects for the Future
The program will continue, with the present description remaining accurate for one year. At that time, measurement instruments and data collection and processing procedures will probably change. Use and interpretation of data will increase.

Individual Interviewed
The primary source of information was Lowell L. Jensen, Director of the Division of Planning and Development, State Department of Public Instruction, Bismarck, North Dakota 58501 (telephone: 701 224-2269).

OHIO

The Statewide Search for Consensus

Goals

A formal set of statewide educational goals, output in nature, was prepared and tentatively adopted by the State Board of Education in April 1972. Since that time a review and refinement of the goals has involved approximately 120,000 people. Citizens' advisory groups, the State Superintendent of Schools, the State Board of Education, the Division of Planning and Evaluation and statewide groups of students have been included in this process. The goals are available upon request.

Program Title and Purposes
Officially entitled The Statewide Search for Consensus, the program has as its major purposes assessing cognitive development (not yet in effect), providing data for a management information system, providing information for a planning-programming-budgeting system, and meeting a legislative mandate.

Initiation
The program was initiated in February 1972 by a legislative mandate (two points in law), with the State Superintendent of Schools involved in program initiation. The first pilot testing should be completed by June 30, 1973.

Policy

Personnel from the following jointly determine program policy: citizens' advisory committees, the State Superintendent of Schools, the governor's office, representatives of major school systems, the State Board of Education, the State Education Agency and the state legislature.

Funding
The pilot program is funded through budget appropriations for the Department of Education. Roughly $110,000 was spent in 1972 for the program.

Administration
The Planning Division of the State Department of Education will coordinate program efforts statewide. Eight full-time professionals work on the program at the state level. The Evaluation Center at Ohio State University is developing program strategies and assessment procedures.

Population
Individuals will be selected for the program by sampling from the target population. The pilot target population will be all students in grades K-12 in the state. Students who have dropped behind in grade level will be included, as will school dropouts. Only delinquent students will be excluded from the target population. Community characteristics (type and size) are considered in sampling.

Areas Assessed

The cognitive areas of aptitude, English, mathematics, natural science, reading, social science, vocational education and writing will be assessed, as will the noncognitive areas of citizenship, creativity, interests, personal values, self-concept and attitudes toward society and school.

Instrumentation
The test instruments have not yet been developed. It is hoped that they will be developed as criterion-referenced measures.

Related Data
It is planned to collect the following information to help interpret program data: age, cost of instructional programs, county, dropout rates, paraprofessional educator data, percent minority, racial/ethnic background, school name, school size, sex and teacher data.

Data Collection and Processing
Classroom teachers will be responsible for administering the tests. The local schools will be responsible for processing program data.

Interpretation
The local schools and the SEA will organize, interpret and analyze program data.

Use of Data
Program results will probably be used for the following purposes: budgeting, program evaluation, program planning and public relations. Program authority will not restrict the use of program data.


Dissemination
State summaries and public information pieces will probably be prepared either by the SEA or an outside contractor. The following will probably receive copies of program reports: the SEA, the governor, the legislature, newspapers, the State Board of Education, school districts, schools, students, parents, teacher organizations and ERIC.

Overview
Public opinion seems to be favorable toward the project, as does state educator opinion, but opinion will be easier to assess after pilot testing has been completed. Program objectives are being achieved, with the main problem being the lack of time for proper program development. Because of the legislative mandate, pilot testing must be completed by June 30, 1973.

Prospects for the Future
The program is likely to continue, with probable modification in the areas of data collection, processing, interpretation and use.

Individual Interviewed
The primary source of information was Roger J. Lulow, Director of the Division of Planning and Evaluation, State Department of Education, Room 615, Ohio Departments Building, 65 South Front Street, Columbus, Ohio 43215 (telephone: 614 469-4838).

OKLAHOMA

An Assessment of Educational Needs for Students in Oklahoma

Goals
A set of statewide educational goals, both product and process oriented, has been prepared and published in the 1971-1972 Annual Report of the Oklahoma State Department of Education, entitled Measuring Up...Moving On. This report is available upon request.

Program Title and Purposes
The official name of the assessment program is An Assessment of Educational Needs for Students in Oklahoma. The program is aimed at assessing educational needs, setting statewide educational goals, and providing information for a management system.

Initiation
Although most of the initiative for developing the program has come from within the State-Federal Programs Division of the State Department of Education, other groups throughout the state have participated. Progress has been made from the first contemporary statewide assessment made in 1969, which considered educational program needs, through the 1970 phase, which considered academic and socioeconomic characteristics of students, to the current phase of establishing educational goals for the total state public education effort.

Policy
Oklahoma's State Board of Education has taken responsibility for doing the needs assessment and has specifically delegated that portion dealing with the public elementary and secondary schools to its Planning, Research and Evaluation Section. Contributions to policy decisions have also been made by the state legislature, the Oklahoma Education Association, the Oklahoma School Boards Association and representatives of major school systems. Operational policy includes making sure that the data gathered is from a representative population, using data which has already been collected (census, annual reports, evaluation) or is being collected, gathering information aimed at what people should know or be able to do as a result of their education, using but not supplanting or duplicating local needs assessment results, and investigating areas of student need not currently measurable by means of standardized tests.

Funding
Roughly $10,000 to $15,000 of federal funds are used yearly in the administrative implementation of the program; approximately $20,000 of state and local funds are used in the data collection and processing phase. ESEA Title III has provided a majority of the federal monies while being supplemented by ESEA Titles I and V and NDEA V-A expenditures. Most of the state funds have been administrative, and the local funds have come from instruction categories.

Administration
Administrative responsibility lies with the State Board of Education and its Planning, Research and Evaluation Section. The Oklahoma State University Department of Education was consulted during the development of the goals survey questionnaire. Professional staff in the Planning, Research and Evaluation Section administer the program as a part of their overall function. At the state level one position has full responsibility for the program, with additional personnel available as needed. As yet no county or local level positions have been established as a part of the state program.

Population
Target populations for the goal survey included a stratified random sample of regions, communities, schools and publics. Publics included official community organization representatives; business or industry owners, managers and employees; governmental representatives; parents; high school and college students and educators; and State Department of Education representatives. Schools were chosen from three different population levels within each region: average daily attendance of 1,000 or more, of 250 to 999 and of 249 or below. Likewise, different sizes of communities were represented, as well as 15 different higher education institutions, both private and public.
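As a rough modern illustration of such a stratified draw (not a reconstruction of the actual Oklahoma procedure), the following Python sketch selects schools at random within each region-by-attendance stratum; the data layout, counts and the middle attendance band are invented or assumed for the example:

    import random

    # Attendance strata as described in the text (the middle band is assumed
    # to be 250-999); all other details here are hypothetical.
    ADA_LEVELS = ["1,000 or more", "250 to 999", "249 or below"]

    def stratified_sample(schools, per_stratum=2, seed=1973):
        """schools: list of dicts with 'name', 'region' and 'ada_level' keys."""
        rng = random.Random(seed)          # fixed seed keeps the draw reproducible
        sample = []
        for region in sorted({s["region"] for s in schools}):
            for level in ADA_LEVELS:
                stratum = [s for s in schools
                           if s["region"] == region and s["ada_level"] == level]
                # Draw up to `per_stratum` schools at random from this stratum.
                sample.extend(rng.sample(stratum, min(per_stratum, len(stratum))))
        return sample

    schools = [{"name": f"school-{i}", "region": i % 15 + 1,
                "ada_level": random.choice(ADA_LEVELS)} for i in range(300)]
    print(len(stratified_sample(schools)))   # at most 15 regions x 3 levels x 2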

Areas Assessed
Topical and item organization was concerned with the following: principles of democracy; political competence; cultural values; morality; general intelligence; communication skills; reading-writing; social studies; humanities; fine arts; sciences; economic and vocational competence; attitudes; mental, physical and environmental health; and selected school and community factors. Respondents were asked to indicate the importance they attached to statements of need in the above categories in terms of their own viewpoint and their own community.

Instrumentation
Instrumentation consisted of three different survey booklets with a total of 107 questions. These booklets were specifically prepared for the program. Item sampling was used by Oklahoma State University in the process of identifying appropriate items for the goal survey questionnaire. The resultant items were then included in the three survey booklets published by the Oklahoma State Department of Education.

Related Data
Other information was available and utilized in the interpretation of the data obtained. Data directly related to the survey included county, school district and school size. Data collected in other surveys (including previous years' data which considered factors such as dropout rates, grades, teachers, minorities, race, socioeconomic status and so on) could be related to the goals survey data because of these identifying data.

Data Collection and Processing
The surveys were administered by the Planning, Research and Evaluation Section with the assistance of local college students from each of the 15 regions in the state. Information derived was processed by the State Department of Education Data Center.

Interpretation
The Planning, Research and Evaluation Section was responsible for analyzing and interpreting the data.

Use of Data
Results will be used principally for goal setting and program planning; eventually, evaluations will be related to the objectives derived therefrom. State laws have imposed no restrictions on the use of the data, but legislative intent and privileged-information statutes and rulings indicate that school systems should not be compared with each other, nor should individual students.


Dissemination
Reports to be prepared by the State Department of Education will include state, regional and public summaries. State educators, legislators and governmental executives will automatically receive copies of reports. Other groups may receive them on request or on selected mailing.

Overview
Most state agency personnel have a favorable attitude toward the program. Local administrators favor the concept but have a qualified hesitancy to accept all definitions of assessment as being of equivalent value. This is justified by such facts as the tendency for a few teachers to "teach for the test"; lack of measurements for attitude, motivation and future performance; conflicts between different levels of goals and means used to measure them; paucity of personnel who know how to interpret test results; and respect for the privacy of the student to enhance his feeling of dignity and self-worth. One major problem developed and continues: the difficulty encountered in convincing the uninformed (including test makers and testing company personnel) that needs assessment is not merely getting results on standardized tests or criterion-referenced test items in the academic or nonacademic areas, but includes the whole range of human needs which may or may not relate to the way our schools are currently operating.

Prospects for the Future
The program is likely to continue with expected modification in the topical areas assessed. Instruments used to quantify the measurement of objective accomplishment will be selected or developed as the next step.

Individual Interviewed
The primary source of information was James L. Casey, Coordinator of Planning, Research and Evaluation, State Department of Education, State Capitol, Oklahoma City, Oklahoma 73105 (telephone: 405 478-0351).

OREGON

Statewide Assessment of the Oregon Board of Education Learning Goals

The major effort in the state to date has involved a contract with Science Research Associates to conduct testing programs in grades 4, 9 and 11 for the following areas: reading, language arts, mathematics, social studies, science and use of sources. The program also included the development and administration of two instruments to assess 1) career awareness and 2) general awareness (self-concept, citizenship, human relations). A general dissatisfaction with the approach resulted in the conceptual design of a new program.


Goals

The Oregon Board of Education is in the process of identifying and publicly validating statewide goals. The goals relate to student outcomes and management processes.

Program Title and Purposes
The official title is Statewide Assessment of the Oregon Board of Education Learning Goals. The major purposes of the program are to assess educational performance, identify educational needs, establish biennial priorities and inform the various publics on the educational condition of their schools.

Initiation
The program was initiated in August 1972 by the State Department of Education.

Policy

The State Board of Education and the State Department of Education determine program policy and make decisions relative to changes that are to be made.

Funding
It has been proposed to the state legislature that a budget of $600,000 be appropriated for the 1973-75 biennium to finance the program.

Administration
The State Department of Education, through the Assistant Superintendent, Division of Planning, Development and Evaluation, will administer the program. The conceptual design of the program was developed through a contract with the University of Oregon. The test development, administration and data analysis will be contracted to some agency.

Population
Grades 1, 3, 6, 11 and 12 are the target populations identified in the model.

Areas Assessed

The program calls for the following schedule of assessment.
1974: grade 1, self-help, self-concept, group interaction
1974: grade 3, reading, arithmetic
1974: grade 6, self-knowledge, social studies, science
1975: grades 1, 3, 6, as above for the 1974 testing
1975: grade 12, self-knowledge, social studies, science
1976: grades 1, 3, 6, 12, as above for the 1975 testing
1976: grade 11, career orientation, consumer problems

Instrumentation
The explicit behavioral objectives from which statewide assessment tests are developed will make evident which particular skills are being assessed. The test items will refer directly to the behavioral objectives which are established for each skill.

The criterion-referenced tests developed for assessment of reading performance at the end of the third grade will most probably include oral performance items, so that tests will be individually administered to each student. The arithmetic test will, in all probability, be a group-administered paper-and-pencil test.

Basic knowledge tests will probably consist primarily of paper-and-pencil type tests but may include checklists to be completed by the teacher, parent, counselor or other secondary observer.

The materials used to assess applied performance will probably consist of two types:
1) An empirically developed checklist which describes specific behaviors and actions.
2) "Test-teach-test" items. Games, simulated environments, role-playing and so on may all be used in this format.

Related Data

Relevant demographic data will be collected and will include school size and geographic location.

Data Collection and Processing

Classroom teachers will be responsible for test administration. Scoring will be done by teacher and district personnel.

Interpretation
Local school district personnel will be responsible for interpreting test data in terms of the educational goals, programs and resources of their district. These interpretations may also receive input from a selected "jury" (parents, teachers, laymen) from that district. School district interpretation will take into account the individual test interpretations provided by the teachers.

The testing contractor will be responsible for investigating the validity of test interpretations provided by individual teachers and local school districts.

The Oregon Board of Education will receive individual, local school district and statewide analysis of test results from the testing contractor. It will interpret this data according to state level educational priorities.

Use of Data

The data will be used in reporting and making recommendations to local boards of education, the State Board of Education and the legislature.

Dissemination

District and state level summaries will be prepared for public information.

Overview
It is believed that this approach to assessment will result in more meaningful information than past efforts have produced.

Prospects for the Future

The implementation schedule will depend upon the allocation of resources by the state legislature.


Individuals Interviewed
The primary sources of information are Mary Hall, Assistant Superintendent, Planning, Development and Evaluation; R.B. Cleaner, Coordinator, Planning and Evaluation, Oregon State Department of Education, 942 Lancaster Drive, NE, Salem, Oregon 97310 (telephone: 503 378-3074); and Robert Mattson, Associate Dean, School of Education, University of Oregon, Eugene, Oregon 97403 (telephone: 503 686-3406).

PENNSYLVANIA

The Educational Quality Assessment Program

Goals

A formal set of statewide educational goals has been prepared for Pennsylvania. The following were involved in goal preparation: citizens' advisory groups, the Secretary of Education, the governor, outside contractors and the State Board of Education. The goals have been formally adopted by the State Board of Education, have been published and are available upon request.

Program Title and Purposes
The official name of the program is The Educational Quality Assessment Program. It is also called "The Pennsylvania Plan for the Assessment of Educational Quality" and "The Pennsylvania Plan." The major purposes of the program are to: assess cognitive development, assess educational needs, assess noncognitive development, measure influences on learning and measure growth.

Initiation
The EQA Program was initiated August 8, 1963, by the state legislature under Section 290.1 of Act 299. The first statewide testing was conducted in the fall of 1970.

Policy

Program policy is determined by citizens' advisory groups, the Chief State School Officer, the State Board of Education and the State Education Agency.

Funding
Two-thirds of the EQA Program monies come from the State General Education Budget. The other third comes from federal ESEA Title III funds. Roughly $250,000 was spent on the program in 1972.

Administration
The Division of Educational Quality Assessment coordinates the program statewide. Eight full-time professionals work on the program at the state level. Educational Testing Service was a consultant for general program development; National Computer Systems handles scanning and scoring; and the computer facilities at Pennsylvania State University are used for program analyses.

Population
In 1973-74, the target testing population will be all public school students in grades 5, 8 and 11 except educationally mentally retarded, emotionally disturbed and migrant students. Parochial and private schools, delinquents and delinquent schools are also excluded from the study. School participation is voluntary. Twenty percent of the eligible students participated in the program last year, and one-third are expected to participate annually in the period 1973-1976. Community characteristics are not used as a basis for school/student selection.

Areas Assessed
The cognitive areas of mathematics and reading are being assessed, as well as the noncognitive areas of citizenship, creativity, self-esteem, understanding others, interest in school, health behavior, vocational development, appreciating human accomplishments and preparation for a changing world.

Instrumentation
Tests and inventories are used in each area, with appropriate levels for each grade. Except for the cognitive areas of mathematics and reading, these instruments were constructed within the State Education Agency. The tests are norm-referenced and are used as group-reliable measures.

Related Data
Additional information collected to help analyze program data includes: cost of instructional programs, percent minority, school size, sex, attendance, class size, classroom practices, accessibility of resources and socioeconomic and teacher data.

Data Collection and Processing
Classroom teachers and guidance personnel are responsible for test administration. The outside contractors and the EQA Division process the program data.

Interpretation
Program data is organized, analyzed and interpreted by the EQA staff. Interpretation visits are made to each participating school district by division staff members.

Use of Data
Program results are used for instruction, comparative analysis across schools, program planning and in attempts to identify exemplary programs. Individual students are not identified; scores are aggregated by school. Program authority does not restrict the use of program data, but the confidentiality of individual school scores is respected.

Dissemination
State and school summaries of program results are prepared by the EQA Division. Copies of program reports are sent to school districts, local schools and to ERIC. Individual school reports are released only through the superintendents of participating districts.

Overview
The EQA Program is viewed favorably by educators, laymen and various state education departments. Program objectives are being well achieved, despite the problems of inadequate budgeting and staff.

Prospects for the Future
The Pennsylvania Plan is likely to continue, with some modification occurring within the next two years. The measuring instruments used, data collection and processing procedures and data interpretation are the areas most subject to change.

Individual Interviewed
The primary source of information was Thomas E. Kendig, Chief of the Division of Educational Quality Assessment, Pennsylvania Department of Education, P.O. Box 911, Harrisburg, Pennsylvania 17126 (telephone: 717 787-4234).

PUERTO RICO

Puerto Rico Needs Assessment Program

Goals
The administrative staff of the Area for Educational Planning and Development, appointed by the Secretary of Education, has prepared a set of island-wide educational and social goals. These goals are supplemented by sub-goals and operational objectives. They have been formally adopted by the island, and they deal with a combination of input, process and outcome of education.

Program Title and Purposes
The program title is the Puerto Rico Needs Assessment Study. Its purpose is to discover the discrepancy between desired learner outcomes and the learners' current status. (A student need is considered the discrepancy between the desired student behavior and the current level of behavior.)

Initiation
The Secretary of Education initiated the idea for the Needs Assessment Program. The program strategy consists of five phases; these phases are planned to extend over three years, 1971-1974.

Policy
Program policy is determined by the Secretary of Education, with the aid of a working committee of administrative personnel from the Department of Education.



Funding
The Puerto Rico State Assessment Program is funded by 60 percent federal ESEA Title III funds and 40 percent state funds.

Administration
The working committee of administrative personnel from the Department of Education, Division of Educational Evaluation and Development, coordinates the program island-wide. The committee includes the secretaries of regular programs and Educational Planning and Development, along with the directors of Educational Evaluation and Development, Spanish and mathematics programs. The professional staff consists of planners, evaluators, research technicians and a systems analyst. Personnel from the Electronics Center for Educational Data and consultants from the University of Puerto Rico also are involved in planning and conducting the program.

The Council of Higher Education formulated a questionnaire; Educational Testing Service (ETS), Princeton, New Jersey, constructed the sampling design. ETS is also in the process of preparing achievement tests and is under contract to design answer sheets.

Population
The study will be carried out by using a stratified sample representative of public school students enrolled in grades 3, 6 and 9. Participation is not confined to communities with special characteristics; however, in stratifying the population, the following variables will be taken into consideration: metropolitan, urban and rural areas; and rural or urban zones, in order to study socioeconomic status. Public school participation is required.

Areas Assessed
Spanish and mathematics are being assessed. Noncognitive areas are not involved.

Instrumentation
Achievement tests in Spanish and mathematics are being administered, as well as a questionnaire to determine socioeconomic level. The test was constructed by ETS, and the questionnaire by the Council of Higher Education, specifically for use in the Puerto Rico State Assessment Program.

Related Data
In order to help analyze and interpret the data, information regarding socioeconomic status, sex and general ability is collected.

Data Collection and Processing
Evaluators and planners from the Division of Educational Evaluation and Development selected and trained classroom teachers and evaluation coordinators from the school districts for the administration of instruments. Evaluation coordinators, local and regional educational planners and ETS are responsible for processing the information obtained from the program.

Interpretation
Central educational planners are involved in comparative analysis of test results. Personnel from the Spanish and mathematics programs then determine the needs in achievement in those two areas according to the administered tests.

Use of Data
The data will be used to provide the information needed to carry out deeper investigation in smaller geographical areas, which will result in the preparation of innovative projects leading to the solution of identified critical educational needs.

Dissemination
By the end of the first stage of the study of needs (May 1973), all findings will be disseminated to the system with pertinent recommendations. Involved in the dissemination of information are the Central Educational Planners, the Central and Information Offices and the Division of Evaluation and Educational Development.

Overview
No comments concerning attitudes toward the assessment are available at this time.

Prospects for the Future
Long-range plans have been developed for 1973-74 through 1976-77. The main activities will involve development of appropriate instruments in the affective and psychomotor domains and the administration of them to ascertain the needs in those areas. Concurrently, all school districts will participate in the administration of math, Spanish and general ability tests, as well as a socioeconomic questionnaire, to a sample of students in grades 3, 6 and 9 with the purpose of updating the information.

As a final activity, the school districts whose performance has been identified as below the minimum competence level of achievement will participate in a comprehensive testing program geared to produce a detailed identification of pupils' needs and related factors influencing their performance. This comprehensive administration will provide a framework for the design and implementation of compensatory and innovative programs.

In summary, the whole loop of operational activities in the area of educational development should be visualized as a dual component action of needs assessment and program development. The first is a research-based activity geared to the development of operational concepts that will help in sharpening the focus of instructional activities on a more relevant and rational curriculum basis.

Individual Interviewed
The primary source of information was Miguel Plaud, Director, Division of Evaluation and Educational Development, Department of Education, Urb. Tres Monjitas, Hato Rey, Puerto Rico 00918 (telephone: 809 764-3445).

RHODE ISLAND

Introduction
The Rhode Island Statewide Testing Program is presently in a crucial state of transition. A more or less routine program of standardized testing at logical points in the curriculum is being transformed into a more flexible program that will be more congruent with the Statewide Needs Assessment Program. The present program will be continued for the following year (1973); however, a standardized program of reading assessment at both the primary and intermediate levels will be added. In addition, career counseling information will be provided for students at the senior high level through the use of a variety of standardized instruments. Finally, when the results of the first Statewide Needs Assessment are examined this spring (1973), standardized and objective-referenced instruments will be used to provide more detailed information about characteristics of students in Rhode Island. (Mr. Henry W. Stevenson Jr., Assistant Commissioner for Research, Planning and Evaluation).

Statewide Needs Assessment

Goals
An open systems planning process was employed to determine a formal set of statewide educational goals. The State Board of Regents has approved this set of goals, which was selected by the Board of Education of the State Board of Regents, state legislators, the Chief State School Officer, a citizens' advisory committee, students and others. The current set of goals, which is primarily learner-oriented, is available upon request.

Program Title and Purposes
The program, officially titled the Statewide Needs Assessment Project, will assess cognitive development and educational needs and will provide data for a management information system and for a planning-programming-budgeting system.

Initiation
The Statewide Needs Assessment was initiated by the State Board of Regents, the superintendents and teacher associations. In May 1972, the first stage of the project was completed. In May 1973, the first questionnaires will be analyzed.

Policy
The Chief State School Officer, members of the state legislature, the State Board of Regents, a State Education Commission, the State Education Agency, outside committees and a coordinating council determine how the program is conducted and what changes will be made in its nature.

Funding
Federal funds from ESEA Title III and ESEA Title V finance this Needs Assessment Project. The project utilized funds only for staff time last year, and about $30,000 were used in 1972. Funds in the amount of $30,000 have been budgeted for 1973.

Administration
Planning, Evaluation and Research personnel coordinate the program statewide. There are two full-time equivalents at the state level.

Population
The target population includes all elementary and secondary schools with the exception of delinquent schools, schools for the handicapped, parochial and private schools. Non-English speaking, educationally mentally retarded and emotionally disturbed students are not included in the program. Dropouts will be assessed, as well as students who have dropped behind grade level. All schools receiving state monies are required to participate in the program. The program is in its first year.

Areas AssessedAll areas related to the Rhode Island Goals for Educationare assessed.

Instrumentation
A "tailor-made" public opinion poll was developed by the State Education Agency. The opinion poll asks questions concerning what does exist and what should exist in education in Rhode Island. The questions were developed by the staff and relate directly to headings under the Rhode Island Goals for Education. The opinion poll is administered to samples of grades 4, 8 and 12 students, their parents, teachers, administrators, State Education Agency personnel, legislators and other citizen groups.

Related Data
Additional information collected to help analyze or interpret data includes age, sex, grade, school name and size, socioeconomic data and teacher data (for example, teacher salaries, teacher education, course load and so on).

Data Collection and Processing
The classroom teachers, with the aid of public educational television, are responsible for giving the tests for the program. The State Education Agency is primarily responsible for processing the information obtained from the program; the computer center at Rhode Island College is the contractor for analysis and interpretation of data as requested by the State Education Agency.





Interpretation
The State Education Agency has primary responsibility for analyzing, organizing and interpreting the data.

Use of Data
The results of the program will be used for program planning. No restrictions or limits have been placed on the use of the data, which is open to all.

Dissemination
Because the program is in an early stage of development, no reports have yet been prepared. It is planned that the State Education Department will prepare school system summaries, state summaries, student reports and public information pieces to be distributed to the State Education Agency, the governor, the legislature, newspapers, other states (as requested), the State Board of Regents, students, school districts, teacher organizations and ERIC.

Overview
At this stage of development, it is difficult to say how the program is viewed; however, the State Education Department and the State Board of Regents are favorable. The program is generally seen as achieving its goals. Anticipating future difficulties and making changes before problems make change impossible is seen as a major difficulty.

Prospects for the Future
The program was only planned to run one year. It is not expected that it will be operated in this way again for another five years, if the program is repeated.

Statewide Standardized Testing

Goals
The formal set of statewide educational goals approved by the Rhode Island State Board of Regents was the result of an open systems planning process which involved great numbers of people. These goals, determined by the State Board of Regents, Board of Education, students, a citizens' advisory committee, the state legislators, Chief State School Officer and others, mainly concern learner-oriented (output) goals. The current set of goals is available upon request.

Program Title and Purposes
The main goals of this program, which is officially called the Statewide Standardized Testing Program, include assessing cognitive development and educational needs, providing data for a management information system and providing information for a planning-programming-budgeting system.

Initiation
The State Department initiated this program and submitted relevant legislation to the Board of Education. There have been about five years of testing since the inception of the program.

Policy
The Chief State School Officer, the State Board of Regents, the State Education Agency and a State Education Commission determine how the program is conducted and what changes will be made in the nature of the program. Also involved are outside committees and a coordinating council.

Funding
The Statewide Standardized Testing Program is wholly supported by state funds, which are provided by general appropriations for statewide testing. Program costs last year were approximately $140,000. For 1973, $400,000 has been appropriated.

Administration
The program is coordinated by the Research, Planning and Evaluation Agency, with the equivalent of one full-time professional staff member administering it at the state level. University and college professors and local system administrators travel across the state to interpret test results for local people. About 20 people are employed annually for this purpose.

Population
The target group participating in the program was selected by grades (for grades K, 4, 8 and 9) and age (corresponding to grades 2, 5 and 6). Dropouts, educationally mentally retarded, emotionally disturbed and non-English speaking students are not included in the Statewide Standardized Testing Program. All students in the target population participate in the program. Delinquent schools and schools for the handicapped are excluded from the program; schools receiving state monies are required to participate. Ninety-nine percent of the eligible schools participated in the program last year. No special characteristics are considered in selecting schools.

Areas Assessed
Aptitude, mathematics, reading and vocational areas are assessed in the cognitive domain.

Instrumentation
The Metropolitan Readiness Test was used in kindergarten, the Iowa Tests of Basic Skills in grades 2, 4, 5, 6 and 8, the Lorge-Thorndike in grades 4 and 8, the Cognitive Abilities Test in grades 5 and 6, and the Differential Aptitude Test in grade 9. None of the tests have been "tailor-made" for the Rhode Island program and none are considered criterion-referenced. These tests were chosen for the program by the State Education Agency and a testing committee, and all are described as individually reliable.

Related Data
Additional related information collected to help analyze and interpret the data from the assessment program includes: age, grade, school name, sex and minimum socioeconomic data.

Data Collection and Processing
Classroom teachers and the school administration staff are responsible for giving the tests. The State Education Agency is responsible for processing the information from the program and has issued a contract to Rhode Island College for the analysis and interpretation of data.

Interpretation
The State Education Agency (in conjunction with Rhode Island College) is responsible for analyzing, organizing and interpreting the data; local schools and local school districts are responsible for the interpretation of data at the local level.

Use of Data
The results of the program will be used for instructional purposes and for making a comparative analysis across schools. No law restricts or limits the use of the data.

Dissemination
Reports on program results have been prepared by the State Education Agency and Rhode Island College; they are distributed to the State Education Agency, school districts, schools and teacher organizations. Further dissemination is determined locally. The Board of Regents has created a study commission to determine the manner and form in which test scores shall be reported to the public. The concept of disclosure has been approved, but the precise method of disclosure is under study and will be recommended to the Board of Regents when decided.

Overview

The program is viewed favorably by the legislature; however, there are many different viewpoints depending on which area is being discussed. The testing program is presently under study, and general feelings about the success of the program cannot really be judged at this time. The program's objectives are seen as being generally achieved; however, procedures for revising the program are inadequate and must be made more effective. Also, additional staff to interpret and analyze test data is needed.

Prospects for the Future
This program is likely to continue, although it will be modified to a certain extent. The program will be expanded and will emphasize areas other than standardized tests. Also likely to change are the target population, areas assessed, the measuring instruments utilized and the use and dissemination of data. (See also Mr. Stevenson's Introduction.)

Individuals Interviewed
The primary sources of information were Henry W. Stevenson Jr., Assistant Commissioner for Research, Planning and Evaluation, State Agency for Elementary and Secondary Education, Roger Williams Building, Hayes Street, Providence, Rhode Island 02908 (telephone: 401 277-2640); Carol Kominski, State Testing Consultant, room 216, Roger Williams Building, Hayes Street, Providence, Rhode Island 02908 (telephone: 401 277-2050); and Cynthia Ward, Research Coordinator, room 212, Roger Williams Building, Hayes Street, Providence, Rhode Island 02908 (telephone: 401 277-3126).

SOUTH CAROLINA

Annual Assessment System

Introduction
South Carolina draws upon data from its Statewide Testing Program for assessment purposes. Private and parochial schools and numerous measurement devices are included in the Statewide Testing Program.

Goals
The findings of a 1968 needs assessment study and additional facts about the state's educational system were utilized by key staff members in the Department to develop a proposed set of long-range educational objectives for South Carolina. These objectives were presented to the State Board of Education for consideration, and on May 8, 1970, the Board adopted "South Carolina's Educational Objectives for 1975." Its eleven major objectives were consistent with the Board's "Statement of Educational Philosophy," which stated that there should be a five-year plan to improve the state's system of public education.

The goals are: (1) reduce dropouts by 50 percent; (2) reduce the number of students repeating first grade from 15 percent to 5 percent; (3) establish a statewide program of public kindergarten for five-year-olds; (4) measurably improve the basic verbal and quantitative skills of in-school students; (5) provide an adequate occupational training program for 100 percent of the secondary school students who choose it; (6) increase the number of high school graduates entering post-high school training to at least 50 percent; (7) develop an adequate educational program for youth with physical, mental or emotional handicaps; (8) increase the total adult educational enrollment in basic and high school programs by 100 percent; (9) promote programs to provide adequate professional and paraprofessional personnel to staff the state's educational system; (10) develop and maintain a system of continuous evaluation and upgrading of education (which includes nine continuing objectives, output in nature); and (11) insure the implementation of at least a defined minimum educational program in each local school district.

The tenth goal inspired and is the main objective of the Annual Assessment System. The five-year plans to meet these goals, which are input, process and output oriented, are available upon request.




Program Title and Purposes
The main purposes of the Annual Assessment System are to assess observable pupil performance, primarily in cognitive development, as a first step toward identifying educational needs and to provide data for the system of management which is utilized in meeting those needs.

Initiation
The State Board of Education, the State Education Agency (SEA), the Chief State School Officer and the state legislature were all involved with the initiation of the present program. An outgrowth of State Educational Goal No. 10, the initial assessment system was initiated in 1968, while the state educational goals were still informal.

Policy
Program policy and changes are cooperatively determined by the Chief State School Officer, the State Board of Education, the State Education Agency, the state legislature and local school boards.

Funding
During fiscal year 1973, the program was funded by 90 percent federal monies (ESEA Titles I, III and V) and 10 percent state monies. During fiscal year 1974, the assessment program will be financed to a greater extent with state funds. In 1973 the program cost roughly $200,000.

Administration
Research and Evaluation personnel coordinate the program on the state level using a professional staff of 10 full-time equivalents. The SEA's Executive Planning Committee is extensively involved in the management process at the state level. On the local level, school administrators assume much of the responsibility for the assessment effort.

Population
The Annual Assessment draws data from the Statewide Testing Program (STP), which tests by grade but is being modified to assess by age. All grade 4 and 7 students are included in the STP. An approximate 10 percent sample of the grade 9 and 12 students is also included in the system. Participation is voluntary and is not confined to communities with special characteristics. Last year participation was almost 100 percent in grades 4 and 7.

Areas Assessed
The cognitive areas of aptitude, English grammar, basic mathematics skills, natural science, social science and reading were assessed last year. Future test administrations will include inventories in the noncognitive domains of citizenship, self-concept and interests as well as in other cognitive subject areas.

Instrumentation
The Statewide Testing Program uses four major tests for assessment purposes. They are (1) the California Tests of Basic Skills (CTBS), for grades 4 and 7; (2) the California Test of Mental Maturity (CTMM), an abbreviated form of which is administered to grade 4; (3) the Iowa Tests of Basic Skills (ITBS), for a sample of grades 9 and 12; and (4) a Student Response Test, under development. These tests were selected by Research and Evaluation and State Education Agency personnel in cooperation with the school districts. None of the tests are criterion-referenced and all are used by the state as group-reliable measures.

Related Data
South Carolina is in the process of preparing district profiles which will be used to help analyze and interpret assessment data. The profiles contain age, dropout rates, paraprofessionals or education aides, percent minority, racial/ethnic, sex, socioeconomic, teacher (salary, education, course load, etc.) and general demographic (census data, occupation of the head of the household, rural/urban/suburban) data.

Data Collection and Processing
Although the option is left to the district, classroom teachers generally are responsible for giving the tests. The publishing companies score their tests for the state.

Interpretation of Data
Research and Evaluation personnel are responsible for analyzing, organizing, interpreting and reporting assessment data.

Use of Data
Assessment data is used to prepare state and district reports which serve planning purposes, to prepare budget requests for state and federal funds and to improve instruction and departmental services to school districts. Program authority does not restrict the use of program assessment data at this time.

Dissemination
Assessment reports, prepared by Research and Evaluation personnel, are in the form of state, district, school and Title I and III summaries. The reports are available to the State Education Agency, the State Board of Education, school districts, teacher organizations, students, parents, the governor, the legislature, newspapers and the general public.

Overview
Reactions to the program have been favorable, especially within the State Education Agency and State Board of Education. Program objectives are being achieved. Major problems are insufficient personnel, funding and resources to honor commitments to timelines.

Prospects for the Future
The assessment program will continue on an annual basis, at least through 1975. Preliminary assessment reports are currently available, and a more definitive report schedule is now under development. The changes most likely to occur include an expansion of the areas assessed, the selection of more effective instruments and better interpretation and use of data.

Individuals Interviewed
The primary sources of information were W. Edward Ellis, Director, Office of Research, Department of Education, Rutledge Building, Columbia, South Carolina 29201 (telephone: 803 758-2169) and Diana J. Ashworth, Director, Office of Planning, 1206 Rutledge Building, Columbia, South Carolina 29201 (telephone: 803 758-2115).

REFERENCE
1975 Objectives for South Carolina Public Schools: A Five Year Plan. Department of Education, South Carolina.

SOUTH DAKOTA

South Dakota Statewide Needs Assessment

Introduction
At present, two major programs are being conducted in South Dakota: the Statewide Needs Assessment, a pilot study, and a Feasibility Study for the Multi District Vocational Education Concept. In 1971 the South Dakota State Board of Vocational Education approved federal funding for an Exemplary Secondary Multi District Vocational Education Center Project. The project began operation of the center in the fall of 1971. Much interest was developed in the operation of the center and in the possibility that this method of providing secondary vocational education could be utilized throughout South Dakota. In the spring of 1972 the South Dakota State Board of Vocational Education approved federal funds for a Feasibility Study for the Multi District Vocational Education Concept for secondary school students in South Dakota. The study was designed to incorporate the activities of the Exemplary Project with data and information compiled by the study staff to provide recommendations for utilization of the Multi District Concept in South Dakota. Recommendations have been provided to the South Dakota State Board of Vocational Education, and the State Board has requested public interest and information meetings to be held throughout the state. Multi District Vocational Education operations will provide increased vocational education opportunities to secondary school students in South Dakota, through which students may have available from three to four times the present program offerings. If the people of the state desire such centers, legislative action will be necessary to allow for their successful operation.

The South Dakota Statewide Needs Assessment Program will be discussed in the following paragraphs.


Goals

A formal set of 10 statewide objectives dealing with input, process and output was published in Bulletin 99 three years ago and is available upon request. These objectives were prepared by a citizens' advisory group, the Chief State School Officer and college-level professional educators and were formally adopted by the State Board of Education.

Program Title and Purposes
The official name of the program is the South Dakota Statewide Needs Assessment. Its major purpose is to assess educational needs on a statewide basis and to provide the capability for local districts to assess their needs in relation to those of the state and its regions.

Initiation
The program was initiated by the State Education Agency in conformance with ESEA Title I and Title III specifications. The state legislature recommended community involvement in educational affairs. Initiation of the idea came in March of 1972, with the launching of the pilot study occurring nine months later in December 1972.

Policy
The State Education Agency, in conjunction with the local educational agencies, has primary responsibility for conducting and making changes in the program.

Funding

Program expenses are financed by federal and state funds. Two-thirds of the financing is through ESEA Title I and Title III federal funds, with the remaining one-third provided by state appropriations. Program costs last year were approximately $25,000 to $30,000.

Administration
The State Education Agency, with a staff of six full-time personnel, coordinates the program statewide. They have been assisted by Commonwealth Learning, Arlington, Virginia, which drafted the instruments for the pilot study.

Population
The target group includes high school students, professional educators, administrators, laymen, board members, legislators and other government officials. Elementary school students are not included. For the pilot study, individuals were selected for the program by a stratified random sampling technique. A sample of 100 individuals in each of six planning regions (those regions were determined by population) was selected. For the final study, individuals will be selected at random for every school district. No school is required to participate in the program. For the pilot study, roughly 10 percent of all eligible schools participated last year.
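
The region-by-region draw described above is ordinary stratified random sampling: the population is partitioned into strata (here, the six planning regions) and a fixed number of respondents is drawn at random within each stratum. The Python sketch below is only a minimal illustration of that procedure, not code from the program; the roster structure, region labels and seed are hypothetical.

    import random

    def stratified_sample(roster, per_stratum=100, seed=1972):
        """Draw a fixed-size random sample, without replacement, from each
        stratum; roster maps a stratum label to its list of respondent IDs."""
        rng = random.Random(seed)  # fixed seed so the draw is repeatable
        return {
            region: rng.sample(respondents, min(per_stratum, len(respondents)))
            for region, respondents in roster.items()
        }

    # Hypothetical roster: six planning regions of varying size.
    roster = {"Region %d" % i: ["R%d-%d" % (i, n) for n in range(1, 400 + 50 * i)]
              for i in range(1, 7)}
    sample = stratified_sample(roster)
    print({region: len(ids) for region, ids in sample.items()})  # 100 per region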

Areas Assessed

English, foreign language, health, mathematics, natural science, reading, social science, vocations, writing and fine arts (optional) are the cognitive areas assessed for this program. Assessment of noncognitive areas includes: attitudes toward school, minorities and society, citizenship attitudes, personal values, self-concept and, to a limited extent, assessment of creativity and interests.

Instrumentation
Questionnaires are used for this program. The questionnaire, South Dakota Department of Public Instruction Questionnaire to Determine Educational Needs As You See Them, includes 74 basic questions and 15 optional fine arts questions. For teachers there are an additional 12 questions and for administrators, 22 additional questions. Item sampling was used to construct this tailor-made questionnaire. It was developed by planning consultants and ESEA Titles I and III personnel.

Related Data
Related data collected during the course of the assessment program includes: age, sex, school district and length of residence.

Data Collection and Processing
Staff members of the State Education Agency are responsible for distributing the questionnaires. The information obtained from the pilot study was processed by a staff from Commonwealth Learning, Arlington, Virginia.

Interpretation
The State Education Agency has responsibility for analyzing, organizing and interpreting the data.

Use of Data
The results will be used for program planning and evaluation. There are no present restrictions or limitations on the use of the data.

Dissemination

Preliminary reports of the results of the program have been prepared by the State Education Agency. These reports are state summaries of the program's stage-by-stage progress. Copies of the program reports are available upon request.

Overview
The program is viewed favorably by the governor, the state legislature, boards of education, local school administrators and teachers; and it is viewed quite favorably by students, the general public and the State Education Agency. Program objectives are being achieved. The major problem related to the program is staff availability.

Prospects for the Future
The program is likely to continue for two more years, although subject to modifications. The elements in the program most likely to change in the near future are: target population, measuring instruments, data collecting and processing procedures, interpretation of data, use of data and more effective methods for the dissemination of data.

Individuals Interviewed
The primary sources of information were Henry G. Kosters, Associate Superintendent, Department of Public Instruction, Capitol Building, Pierre, South Dakota 57501 (telephone: 605 224-4290) and Emmett B. Olson, State Director, Division of Vocational and Technical Education, SDREA Building, 222 West Pleasant Drive, Pierre, South Dakota 57501 (telephone: 605 224-3423).

TENNESSEE

Tennessee State Assessment Program

Goals
It is anticipated that statewide goals will be prepared by a citizens' advisory group, the State Commissioner of Education, the State Board of Education, the State Department of Education and selected students working together statewide. These goals are expected to reflect a combination of input, process and output.

Program Title and Purposes
The official name of the program is the Tennessee State Assessment Program. Its major purposes are to: assess educational needs, provide data for a management information system, set statewide educational goals and provide information for a programming-planning-budgeting system.

Initiation
The State Assessment Program was initiated in 1968 following ESEA Title III State Administration Guidelines. The State Department of Education was involved in program initiation, with initial planning done in conjunction with the Bureau of Educational Research and Services at Memphis State University. The "Design for Tennessee Assessment and Evaluation," published in March 1969, outlined the first three-year cycle for assessment activities.

Policy

Program policy has not yet been determined.

Funding
The program is completely financed through federal funds, using ESEA Title III and ESEA Title IV, Section 402. Roughly $26,000 was spent in 1972 on state assessment.

Administration
An interdepartmental planning and evaluation committee serves in an advisory capacity for planning assessment activities. Activities are implemented through the Office of Planning and Evaluation in cooperation with ESEA Title III staff and the Testing and Evaluation Center. The equivalent of three full-time professionals work on state assessment at the state level.


Population
The target population is defined by grade, with grades 5 and 8 being currently assessed. Dropouts will not be included. Individuals will be selected for the program from a random sample drawn on all public school students in the target grades. Special schools for the handicapped, vocational, parochial and private schools will be excluded from the study. Program participation will be strictly voluntary. Some community characteristics will be considered (size and whether urban or rural).

Areas Assessed
Cognitive areas to be assessed include mathematics, natural science, reading and social science. Attitudes toward learning will be assessed in the noncognitive area. The testing phase of the program has begun.

Instrumentation
The Stanford Achievement Tests (Harcourt Brace Jovanovich, Inc.) will be used for both grades 5 and 8. The Attitudes Toward Learning Instrument will be administered only in grade 8. The Stanford Achievement Tests were chosen by the State Department of Education. The Attitudes Instrument was tailor-made for the program, with item sampling and analysis done by the State Testing Center. This instrument was developed by the State Department of Education working in conjunction with Memphis State University. Tennessee does not consider any of its testing instruments to be criterion-referenced. All instruments are individually reliable.

Related Data
Additional information to be collected to facilitate program analyses includes the cost of the instructional program, the students' grades, school name, school size, socioeconomic data and teacher data.

Data Collection and Processing
Classroom teachers will be responsible for the administration of all test instruments. The State Department of Education will be responsible for processing the program data.

Interpretation
The organization, analysis and interpretation of all program data will be done by the State Department of Education.

Use of Data
Program results will be used for instructional improvement, for guidance purposes, for program planning and program evaluation. At present, the authority for the program does not restrict or limit use of assessment data.

Dissemination
Program reports are prepared and include: state summaries, school system summaries, school summaries and student summaries by subject and grade. These reports are available to decision makers in the State Department of Education.


Overview

It is difficult to determine statewide opinion of the Tennessee State Assessment Program, although it is viewed favorably by the State Department of Education. Program objectives are being adequately achieved despite problems of insufficient funds, unavailability of trained personnel and the difficulties of developing program coordination at the state level.

Prospects for the Future
It is likely that the Tennessee State Assessment Program will continue, with probable major changes occurring during the latter part of 1973. It is difficult to determine at this time all the areas in which these changes may occur.

Individuals Interviewed
The primary sources of information were Howell W. Todd, Coordinator of the Office of Planning and Evaluation, Tennessee State Department of Education, 263 Cordell Hull Building, Nashville, Tennessee 37219 (telephone: 615 741-3206) and Patsy Burress, Director of ESEA Title III, Tennessee State Department of Education, 135 Cordell Hull Building, Nashville, Tennessee 37219 (telephone: 615 741-1776). John Arms, Chairman of the State Assessment Advisory Committee, Tennessee State Department of Education, 135 Cordell Hull Building, Nashville, Tennessee 37219 (telephone: 615 741-1776), was also contacted.

TEXAS

Educational Needs Assessment: A Statewide Design for Texas

Goals
The State Board of Education formally adopted a set of goals for public school education in Texas on October 3, 1970. They include a combination of factors based upon "desired conditions" projected into the "Long-Range Plan for Needs Assessment." Goals cover student development; organizational efficiency in the public school system of Texas; and accountability, which includes the continuing evaluation of the public school system in terms of the competence of its products and the efficiency of its structure and processes. The goals, prepared under the direction of a citizens' advisory group, the Chief State School Officer, the governor and outside contractors, are available upon request.

Program Title and Purposes
Educational Needs Assessment: A Statewide Design for Texas is the official title of the program. Major purposes include the assessment of cognitive and noncognitive development, the assessment of educational needs and providing data for a management system. The revision of educational goals is also a major purpose of the program.

Initiation
The State Education Agency was primarily responsible for initiating the Texas Public School Assessment Program in 1967. Approximately four years have elapsed from its initiation to its present stage of development.

Policy
The Texas Education Agency (Commissioner, State Board of Education and State Department of Education), working with representatives of regional education service centers and school districts, determines how the program is conducted and what changes will be made in the nature of the program.

Funding
Federal funds finance about 80 percent of the program expenses. The source of the federal funds varies by the area of concern being assessed. The state supports about 15 percent, and the remaining expenses are derived from local levels. The estimated cost for last year's program is $100,000.

Administration
Resources required for coordinating the program statewide are obtained from a wide range of personnel at the state, regional and school district levels. The Office of Planning in the Texas Education Agency has primary responsibility for the program, and this staff of four full-time professionals is assisted by personnel from the following other TEA offices: ESEA Titles III and V, elementary and secondary education, pupil personnel services, research and vocational education. Representatives of regional education service centers, school districts and contracting companies provide additional assistance in the administration of the needs assessment program.

Population
The program provides for assessment of a wide range of target populations defined by ethnicity, SES, location and so on. A major concern of the program is to insure that various age groups (11-year-olds, 14-year-olds, 17-year-olds, adults and out-of-school youth) will be assessed, although grades are also used to identify target populations. School participation is requested but not mandatory. All schools are eligible for testing, but sampling varies according to the test administered. The 1971 Texas Achievement Appraisal Study (TAAS) was administered to a sample of approximately one-half of the high school seniors in Texas. Criterion-referenced mathematics and reading tests in the fall of 1971 were administered on a sixth grade level. Schools were selected by size and type of community, Title I or non-Title I participation and the ethnic distribution of respondents. Approximately 10 percent of eligible schools were included in the testing, and approximately 10 percent of the pupils being taught at the sixth grade level responded. Desired learner outcomes for career education are being developed for 17-year-old students this year.

Areas Assessed
The areas currently being studied in the Long-Range Plan for Needs Assessment (1967-1975) are mathematics, reading and career education. Other areas of concern in the Long-Range Plan include the noncognitive areas of student attitudes and aspirations, personal and social relations and affective behavior.

Instrumentation
In 1967 and again in 1971, the American College Test (ACT), the Texas Achievement Appraisal Study (TAAS) and a pupil questionnaire were administered to samples of high school seniors. In the fall of 1971 the Prescriptive Reading Test (PRT), Texas Edition, the Prescriptive Mathematics Inventory (PMI), Level B, and a Pupil Identification Form (PIF) were administered to samples of sixth grade students. The ACT measured developed abilities and yielded normative data. Both the PRT and PMI are criterion-referenced tests designed to provide teachers with information on individual students which could facilitate instructional planning. Assessment instruments are selected by the State Education Agency from those available from pure or applied research projects conducted by universities and educational laboratories or from commercial test companies. Although some tests are chosen or designed so that local schools can receive individual feedback, the data is used strictly as group-reliable measures on a statewide basis. Item sampling has not been used to construct any of these tests.

The Career Maturity Inventory and the Assessment of Career Development were administered to 15,000 students in the spring of 1973. Both ninth and eleventh grade students were tested to determine (1) the usefulness of the data yielded by the selected instruments for program planning at the local, regional and state levels and (2) the extent to which information is provided on the status of the desired learner outcomes for career education. The evaluation of the 1973 pilot testing will determine the instrumentation for the statewide assessment to be conducted in the area of career education.

Related Data
Additional items involved in the analysis and interpretation of data include: age, county, percent minority, previous course work, racial/ethnic status, school name and size, sex and socioeconomic factors.

Data Collection and Processing
Administrative staff and school personnel administered the tests and inventories. The State Education Agency and outside contractors were responsible for processing the information. Experts from various sectors of the educational enterprise reviewed preliminary test data. Companies publishing the instruments were contracted to produce test reports.


Interpretation of Data
Outside contractors were responsible for analyzing, organizing and interpreting the gross statewide results. Further processing and analysis of information was done by the Texas Education Agency.

Use of Data
Results of the programs are used for instruction, guidance and program planning on a long-range basis. Needs assessment leads to the choice of priority pupil needs toward which the resources of the Texas Education Agency can be focused. Pupil assessment, evaluation and planning are viewed as vital elements in establishing and renewing Agency programs for sharper focusing of local efforts upon public needs. No restrictions have been placed on the use of the data. Reports sent to various Agency divisions, information stored in the Management Information Center and data collected and recorded by staff members of other Agency divisions are examined for information that will serve to identify the existence of educational needs within the schools.

Information available in the ERIC documents, state government libraries and other specialized resource centers has been examined for materials relevant to the identification of priority areas of educational concern. Special consideration is being given to a search for existing information about pupil organizations, opinions and activities not readily available in the current materials that deal with the educational problems and concerns of the schools.

The assessment results from each study of a priority area of concern are to be used as a basis for establishing statewide learner objectives. These objectives become long-range targets for accomplishment in the elementary and secondary school system in Texas.

Dissemination
Reports of program results are prepared by the Texas Education Agency. Summaries are prepared by state, target area, region and local systems. Student reports are often issued on an individualized basis. Public information news releases are prepared. The State Board of Education is informed of identified needs and is provided with recommendations for alleviating these needs. Slide-tape presentations about the results are prepared for use with audiences of educators and lay citizens.

Overview

State officials and educational agencies in Texas view the program very favorably. Objectives are being achieved although certain obstacles exist such as limited resources, time lag and communication gaps.

Prospects for the Future
Because the Texas program is a long-range plan for needs assessment, target populations, areas assessed and data collection vary from one year to another. Each year's results are recorded and serve as a basis for updating and renewal of the Statewide Design.


Individual Interviewed
The primary source of information is Keith L. Cruse, Program Director, Needs Assessment, Texas Education Agency, Austin, Texas 78701 (telephone: 512 475-2066).

REFERENCES
Texas Education Agency. Educational Needs Assessment: A Statewide Design for Texas. September 1972.
Texas Education Agency. 1971 Texas Achievement Appraisal Study. May 1972.
Texas Education Agency. Sixth Grade Mathematics. 1972.
Texas Education Agency. Sixth Grade Reading. 1972.
Texas Education Agency. Assessment of Selected Educational Needs: A Local Perspective. April 1971.

UTAH

Utah Education Evaluation Improvement Effort

Introduction
In the past eight years there have been two major assessment projects which were conducted within a larger program. The Education Evaluation Improvement Effort is a working title only since the present program is still primarily in the design stage. The program is primarily concerned with improving the quality of educational decision making through evaluation, measurement, research and assessment techniques. A program report on the present work can be expected in 1974.

Goals
A formal set of eight statewide educational goals has been prepared, covering the areas of intellectual, social, emotional, physical, productive, aesthetic, environmental and ethical-moral-spiritual maturities. There is a range of 8 to 25 sub-objectives within these major goal areas. Citizens' Advisory Groups, the Chief State School Officer, the State Board of Education, the State Education Agency and students all participated in goal preparation. The goals have been formally adopted by the State Board of Education, the State Textbook Commission and the State Course and Study Committee. The goals are primarily long-range student output in nature. They are available upon request.

Program Title and Purposes
The unofficial title of the program is the Utah Education Evaluation Improvement Effort. Its major purpose is to improve the quality of educational decision making through research, evaluation, measurement and assessment in such areas as cognitive development, educational needs, noncognitive development and influences on learning.





Initiation
The program was initiated in 1965 by the State Education Agency, with the first legislative report on activities prepared in 1969.

Policy
The responsibility for conducting the program and its changes belongs to the State Board of Education, the Office of the Utah State Board of Education and representatives of major school systems.

Funding
In the past the program has been entirely federally funded using ESEA Title IV, Section 402 and Title V funds. Last year roughly $30,000 was spent. In 1973 the program is asking for state funds, but it is not certain yet what state law the appropriations will come under.

Administration
The Planning Unit coordinates the program statewide. The professional staff consists of two full-time individuals on a state level.

Population
As a part of the overall program, Utah conducted an assessment project in 1970 in which the target group was defined by grade, with grades 2, 3, 4, 6, 8 and 10 being assessed. Individuals not systematically sampled included dropouts, delinquents, migrants and non-English speaking students. Parochial, private and vocational-technical schools were excluded from the study. Samples were drawn from the target groups and participation was voluntary. Special community characteristics were not taken into consideration in drawing the sample.

Areas Assessed

The cognitive areas of reading, speaking abilities, arithmetic, science, social studies and general comprehension and learning skills as well as the noncognitive areas of psychomotor skills, social adjustment, maturity, personal adjustment, flexibility, self-confidence, healthy aspirations, leadership, optimism and other attitudes (toward self, self as learner, others, learning, school and community) were assessed.

Instrumentation
Utah did not use a uniform testing program for assessment, but used already existing local testing programs. The local districts chose tests for their own purposes and the state drew on the resulting data.

The State Education Agency developed a Student Information System (SIS) as a means of obtaining and storing affective and limited cognitive data to complement the local testing programs.

The following three SIS inventories have been used to gather data from students:

The Student Questionnaire Level I (SQI) for elementary students
The Student Questionnaire Level II (SQII) for students in junior and senior high school
The Student Questionnaire Level III (SQIII) for young people who have left the school program

One SIS instrument, the Student Check List (SCL), has been used to gather student data from teacher ratings (subsections measure students' various achievement factors, learning related problems and diversified behavioral characteristics). The various scales from SIS provide for both teacher-perceived and self-perceived output.

Related Data
The following additional information was used to help analyze data: age, grades, percent minority, school size, sex, socioeconomic and racial/ethnic status.

Data Collection and Processing
For the last assessment project (1969), classroom teachers were responsible for giving the tests and collecting the data. Local school districts and the Office of the Utah State Board of Education were responsible for processing the information obtained.

Interpretation
The Office of the Utah State Board of Education was also responsible for analyzing, organizing and interpreting the data.

Use of Data
The primary use of the resulting data was production of a legislative report. To some extent the data was also used for program evaluation, program planning and public relations.

Dissemination
The State Education Agency prepared a state summary report, including reports by target areas. Copies of the program report went to the State Board of Education, the state legislature, ERIC and, by means of newspaper releases, to school districts, schools, students, parents and teacher organizations.

Overview

The program has been viewed passively in the past; however, it is doing well in meeting program objectives. The major problem is felt to be placing proper emphasis upon the possible uses of assessment information. Since the current program is undergoing change, it is not timely to evaluate or judge its performance.

Prospects for the Future
The program is expected to continue, with modification planned. The current description of the program is only expected to be accurate for a brief period. A more accurate and current description should be available in the near future. Two areas most likely to change are dissemination and use of data.


Individual Interviewed

The primary source of information was Stephen L. Murray, Specialist in Evaluation, The Office of the Utah State Board of Education, 136 E. South Temple, 1400 University Club Building, Salt Lake City, Utah 84111 (telephone: 801 328-5888).

VERMONT

Public School Approval Process in Vermont

Goals (1973)
The Vermont Department of Education Agency Policy Statement contains the goals for the Vermont Department of Education. Also, goals concerned with process and student output known as the Vermont Design for Education are available upon request.

Program Title and Purposes
The official title of the program is the Public School Approval Process in Vermont. Its major purposes are described by Vermont Statute:

3165. Approval of public schools

Approval of a public school shall be a finding by the state board that the school conforms to all requirements of law and to such other standards relating to instruction, faculty, curriculum, library, educational materials, physical facility which the state board shall establish by regulation as necessary to provide an acceptable educational opportunity for pupils in the school. Approval of a school shall be for a term deemed appropriate by the state board. Added 1969, No. 298 (Adj. Sess.), 316, eff. July 1, 1970.

A Blue Ribbon Study Group is working with the State Department of Education in developing regulations for the above statute and has arrived at a consensus on eight steps. In the process for educational designs (nursery through 12+), by which schools in a supervisory union or supervisory district may be approved, the steps at the local level are:

Establish Philosophy of Education
Establish clearly defined Goals and Objectives
Determine present status with respect to the Goals and Objectives
Define Needs
Develop Plans
Hold a public hearing
Present final plan to State Department of Education for action by Commissioner. Decision may be appealed to State Board of Education.*

*According to a designated multiyear cycle.


Initiation
This program was initiated by the State Department of Education acting upon a charge from the State Board of Education on November 15, 1971.

Policy

The State Board of Education sets policy. The State Education Agency through the Blue Ribbon Study Group involves a broad representation of citizens in working out the procedure.

Funding
State and local funds are used. A program cost figure is not available at this time.


Administration
The Division of Instruction (Elementary and Secondary Education) personnel coordinate the program statewide. The professional staff includes a Chief of Educational Assessment. There are no outside agencies or consultants involved in the administration of the assessment program at present.

Population
Guidelines for the student population sampled have not been established as yet. It is proposed that use be made of data from testing programs already administered at the local level. The "Anchor System" is under study.

Areas Assessed
At this point, it is proposed that data from the cognitive areas of mathematics and reading be used in the "trial run" period.

Instrumentation
Vermont is updating its information on the status of testing in the local school districts.

Data Collection and Processing
The State Education Agency will have the responsibility for processing the information obtained.

Interpretation
The State Education Agency will also be responsible for analyzing, organizing and interpreting the processed data.

Use of Data
Confidentiality will be respected. Assessment results are intended to be used for program planning. School approval will be by state acceptance of a plan to meet priority needs.

Dissemination
The Commissioner of Education will receive a copy of the plan and take appropriate action as described in the approval process.

Overview

Problems are foreseen due to lack of sufficient staff and funds.



Prospects for the Future
The set of elementary and secondary schools in each school Supervisory Union or district will be acted upon periodically according to a designated multi-year cycle.

Individual Interviewed
The primary source of information was Karleen V. Russell, Director of Instruction, State Department of Education, Montpelier, Vermont 05602 (telephone: 802 828-3111).

VIRGIN ISLANDS

Assessment of Educational Needs

Introduction
The U.S. Virgin Islands Assessment of Educational Needs has been divided into three phases. Phase I is a community-wide survey to determine which educational goal (that is, learning skills, citizenship, enjoyment of learning, self-concept and so on) is considered most important, and which is most in need of improvement. In Phase II, the Virgin Islands will take the goals selected in Phase I and identify what they want each learner to accomplish. Initially, these objectives will not be prepared for all grade levels but only for several selected grades. There will be special workshops held for teachers, principals and supervisors to train them in preparing the objectives. At this point testing instruments will be selected and/or prepared and administered. In Phase III, educational programs will be developed to eliminate the learner needs that were established.

Goals
A formal set of educational goals for the entire territory has been prepared and adopted by the Board of Education and is available upon request. The focal point of the needs assessment is the learner and the end results of his educational process.

Program Title and Purposes
At the end of the 1972-73 school year, the Virgin Islands Assessment of Educational Needs will provide the following types of data: diagnostic data in reading and math to second and fifth grade teachers, base line data on what second and fifth grade students know about other peoples in the Caribbean, data on which variables seem to influence learner achievement and a list of learner needs in which the students have not achieved the desired proficiency level.

Initiation
The Department of Education initiated a needs assessment in the 1970 fiscal year which was interrupted until March 1972, when the current activities were resumed.


Policy
A citizens' advisory group, the Chief State School Officer, representatives of two school districts and the State Education Agency cooperatively determine program policy and changes.

Funding
The program is funded 100 percent by federal ESEA Title III monies.

Administration
ESEA Title III personnel coordinate the program throughout the territory, in close coordination with the State Director of Planning, Research and Evaluation. The territory has a professional staff of 1.5 full-time equivalents working on educational assessment. EPIC Diversified Systems Corporation in Tucson, Arizona, will be responsible for overall study design, data analysis and test consulting.

Population
The target groups in this first year of the needs assessment will be grades 2 and 5. All students in these grades in the 27 public schools in the Virgin Islands will be tested in May of this year.

Areas Assessed
The community-wide survey (Phase I) of students, parents, educators, business people, college students and senior citizens determined that the basic skills were considered the most important goal of the education system and that greater emphasis should be placed on teaching the students to understand the way of life of other peoples in the Caribbean. Therefore, math, reading, and student understanding of Caribbean culture will be assessed this year.

Instrumentation
A standardized reading achievement test will be administered; however, only selected test items which have been related to specific objectives will be utilized in the analysis of the test results. In mathematics and Caribbean culture, locally prepared inventories will be administered. These inventories were prepared by teachers in the State Education Agency. The mathematics and reading inventories are criterion-referenced. The test of Caribbean culture is being given in preparation for curriculum development in the fall of 1973. Although for assessment purposes the results will only be used as group-reliable data, the individual results will go to the students' teachers.

Related Data
Although a final determination has not been made, items under consideration include demographic data (age, sex, language spoken in home, number of years in the Virgin Islands and place of birth), student educational backgrounds (for example, special remedial help given in reading, whether retained in grade, missed years of school), teacher data (years of experience, educational background, years teaching in Virgin Islands) and school data (name, size, graded, ungraded).

Data Collection and Processing
The Department of Education is responsible for data collection and its consultant is responsible for data processing.

Interpretation
EPIC will be responsible for analyzing program data and the Department of Education will be responsible for interpreting the data.

Use of Data
The objectives of the needs assessment state the types of data that will be provided. The diagnostic data will be used by teachers for instructional purposes, and the data on the children's knowledge of the Caribbean will be used in curriculum development. Data on which educational variables influence learner achievement will be used to determine the use of financial resources and establish educational policies, while data on learner needs will be used to establish need priorities and to develop educational programs to eliminate these needs.

Dissemination
The results of the community-wide survey (Phase I) will be sent to educators, PTA presidents, legislators and all heads of community organizations. A public forum is planned to discuss the survey results. Needs assessment results will be disseminated in various ways depending on the type of data and its proposed use.

Overview

ESEA Title III personnel and the Department of Education are very pleased with the prospects of the program.


Prospects for the Future
The present description will remain accurate through the end of the 1972-73 academic year. The program will continue but with modification. Phases II and III will be carried out, and expansion can be expected in the areas of objectives within goal areas and grade levels to be tested. After the completion of Phase III, the new educational programs must be tested to see if the students are meeting the objectives that have been set.

Individuals Interviewed
The primary source of information was Kurt Komives, Research and Evaluation Specialist, Director, ESEA Title III, Department of Education, P.O. Box 630, St. Thomas, Virgin Islands 00801 (telephone: 809 774-5886).

VIRGINIA

Virginia Educational Needs Assessment Study

Goals
A formal set of statewide educational goals has been jointly prepared by the Chief State School Officer, the legislature, the State Board of Education, the State Education Agency, citizens' advisory groups and other consultants and experts. The goals have been formally adopted by the legislature and deal with a combination of input, educational process and student output. They are available upon request.

Program Title and Purposes
The program is entitled the Virginia Educational Needs Assessment Study. Following the completion and implementation of Phase I (the cognitive and affective domains), Phase II, the assessment of educational needs in the psychomotor domain, was initiated and conducted during the 1971-72 school year. The purposes of the program are: (1) to delineate the needs for psychomotor programming within elementary schools, (2) to draw inferences for educational planning in terms of teacher education, instructional practices, classroom management, selection of curriculum materials and individualization of instruction in the psychomotor domain; and (3) to correlate findings of the psychomotor study (Phase II) with the findings of Phase I, and also with the standards of quality approved by the Board of Education and enacted by the General Assembly.

Initiation
The Virginia Needs Assessment Study was initiated by the State Superintendent of Public Instruction. Planning was accomplished through a task force composed of representatives of all interested divisions within the State Educational Agency and the N.C. Kephart Glen Haven Achievement Center.

Policy

The State Department of Education Task Force retains the authority for modification and change in the planning and future implementation action on the final findings of the study.

Funding
The study was initiated in response to the requirements of ESEA Title III, and the major costs of the study are being borne by Title III.

Administration
The Federal Programs Office coordinated the program statewide. The N.C. Kephart Glen Haven Achievement Center conducted the study and was responsible for selection, combination and development of the tests and the statistical verification by analysis of data collected from the sample population.




Population
The Division of Educational Research and Statistics, Virginia Department of Education, selected a stratified sample of 4,500 children, grades K-4 from public schools, for an initial screening in the form of a teacher checklist. An additional sample of 1,500 children was utilized to procure additional data in response to pupil test programs administered under the direction of the N.C. Kephart Glen Haven Achievement Center.

Areas Assessed
Psychomotor function was assessed in Phase II.

Instrumentation
Three instruments were primarily used in the testing program:

1. The Purdue Perceptual-Motor Survey (PPMS)
2. The Virginia Psycho-Motor Screening Instrument (to provide the classroom teacher with an instrument with which she can evaluate her students in the areas of psychomotor development)
3. The Test of Non-Verbal Auditory Discrimination, TENVAD (to assess auditory discrimination in young children; patterned after the model of the Seashore Test of Musical Talent, 1960)

Related Data
To help analyze and interpret the test data, the following additional information is also collected: age, sex, academic grades and rural/urban data.

Data Collection and Processing
Selected graduate students from Virginia schools of higher education assisted with the testing process. The N.C. Kephart Glen Haven Achievement Center trained these students and then utilized them in the testing under the supervision of personnel of its association. As a result of this experience the students will be available to educational agencies in the future as additional personnel with specialized training in the psychomotor domain.

Use of Data
The program is designed to improve guidance, instruction, program planning and program evaluation. To avoid identification of individual schools or school districts, results will be reported only in terms of the six geographic regions of the state.

Dissemination
Analysis results and recommendations for implementing psychomotor training were available early in 1973. The state summary will be distributed to the governor, the legislature, the State Educational Agency and newspapers.

Overview
Response to the program is generally favorable; actual implementation has not occurred. It is too early to judge how well the program objectives are being achieved.


Prospects for the Future
At this time, subsequent action to update the study is planned. Implications for curriculum change and teacher training are anticipated.

Individuals Interviewed
The primary sources of information were Robert V. Turner, Special Assistant, Federal Programs and Relations, Virginia Department of Education, P.O. Box 6-Q, Richmond, Virginia 23216 (telephone: 703 770-3170) and Clarence L. Kent, Supervisor, Guidance and Testing Service, State Board of Education, Richmond, Virginia 23216 (telephone: 703 770-2617).

REFERENCES
Woodrow W. Wilkerson. Standards of Quality and Objectives for Public Schools in Virginia 1972-74.
State Department of Education. Manual for Implementing Standards of Quality and Objectives for Public Schools in Virginia 1972-1974.

WASHINGTON

State Assessment of Educational Outcomes

Introduction
Washington's assessment program is still in the developmental stage and most of this description will deal with that development and the proposed operational program. The program will become operational during the 1974-75 academic year.

Goals
A formal set of 18 statewide educational goals was cooperatively prepared by the Superintendent of Public Instruction and the State Board of Education and adopted by the latter in January 1972. The Delphi method was used in identifying these goals, thereby also involving students, teachers, school administrators, parents, professional groups and representatives of labor and industry. A series of statewide public hearings was conducted as a means of gaining ideas and reviewing the goals. Of the 18 goal statements, 10 deal with student outcome. The goals are available upon request.

Program Title and Objectives
Major purposes of Washington's program, State Assessment of Educational Outcomes, are to assess educational needs, cognitive and noncognitive development, provide data for a management information system and provide information for a planning-programming-budgeting system.

Initiation
The program was jointly initiated by the State Board of Education, the Superintendent of Public Instruction and the state legislature approximately two years ago. The first model was pilot tested a year later in May 1972.

Policy
At the present the pilot program is under the auspices of the State Education Agency, which is working in cooperation with the participating districts. As the program develops, an advisory committee and technical task force will play a larger role in program policy.

Funding
At the present, state funds are the primary source of monies. ESEA Title III and Title V funds have been used.

Administration
SEA Program Evaluation personnel presently coordinate the program with a staff of two full-time equivalents working on this project. Washington has consulted with Educational Testing Service, other testing companies and the National Assessment Program office.

Population
For the pilot study Washington chose grades 4 and 6 for its target population; eventually it hopes to test all of K-8. Samples were drawn from a voluntary target population.

Areas Assessed

The area of mathematics is being assessed at this time (springs of 1972 and 1973). Plans are being developed to assess reading-communication skills and mathematics during the 1973-74 school year on one grade level.

Instrumentation

The instrumentation consisted of Washington-made criterion-referenced tests. They were developed by the State Education Agency with the guidance of teachers. It was very important that the teachers identify the objectives.

Related Data

For the pilot studies Washington has considered school district size, demographic variables and student variables in analyzing results.

Data Collection and Processing

This year teachers, trained by the SEA, are responsible for administering the tests, and the SEA is processing the data.

Interpretation
The State Education Agency is responsible for analyzing and interpreting the data with the aid of teachers and administrators from cooperating districts.

Use of Data

The program results will be used to improve local instruction and state program planning.

Dissemination

Reports of pilot studies are prepared by the SEA and sent to the school districts, schools and teachers involved. The reports are in the form of state, school, class and student summaries.

Overview

The program is viewed favorably by the SEA, the local school administrators and teachers. Program objectives are being achieved; however, funding and a shortage of trained personnel are major problems.

Prospects for the Future

The program will continue with the present description remaining accurate through June of this year. During the 1973-74 school year, the focus will be on assessing a small statewide sample of children on one grade level in the areas of reading-communication skills and mathematics.

Individual Interviewed
The primary source of information was Robert C. Munson, Supervisor, Program Evaluation, Old Capitol Building, Olympia, Washington 98504 (telephone: 206 753-3449).

WEST VIRGINIA

West Virginia Educational Assessment Program

Goals
The Bureau of Planning, Research and Evaluation is presently designing a plan for soliciting the opinions of the citizens of the state concerning goals for student learning. The final set of goals should be established and approved by the State Board of Education in late 1973.

Program Title and Purposes
The official title of the program is the West Virginia Educational Assessment Program. The major purpose of the program is to assess educational need in those goal areas ranked highest by the state's citizens. An additional purpose is to develop a computer-based system for measuring school effectiveness. The system will consider the surrounding conditions of home, school and community. Finally, a basic part of the program will deal with upgrading the Department of Education's information system.

Initiation
The present program for the assessment of student learning was initiated in 1972 with the work conducted by the Bureau of Planning, Research and Evaluation in conjunction with participation in the Mid-Atlantic Region Interstate Project. During calendar year 1972, the Bureau of Planning, Research and Evaluation adapted an assessment model, revised data collection procedures, implemented computer programs for the statistical analysis of variables associated with the evaluation model and completed a pilot study using the evaluation model. With the advent of student learner assessment efforts, the institutional assessment of the past was phased out in 1972.

Policy
The following groups determine program policy and what changes will be made: an assessment advisory council, the Chief State School Officer and the State Board of Education.

Funding
Titles III and V of ESEA are primary sources of funding for the program. State funds support the statewide testing program which provides much of the raw data for the assessment effort.

Administration
The program is administered by the Bureau of Planning, Research and Evaluation within the West Virginia Department of Education. The program requires the equivalent of two full-time professional staff members. One and one-half of this equivalency is within the Bureau, with the remainder in the Division of Guidance, Counseling and Testing. The necessary administration of instruments at the school level requires the equivalent of two additional staff members. At this time, no outside agencies or consultants are involved in the program's administration.

Population
The student target population was defined on the basis of grade, with dropouts and educable mentally retarded excluded. Grade 6 will be assessed and all individuals within the target grade will be included. All public and non-public schools were included. Participation in the program is in the form of an informal requirement, with 100 percent of the eligible schools included. Because of the nature of participation (all schools), community characteristics were not a selection factor.

Areas Assessed
In 1972, the cognitive areas of aptitude, English, mathematics, natural science, reading and social science will be assessed. An additional area of assessment will be the students' attitude toward school subjects. In the present year, prior to completion of the selection of goals for student learning and the ranking by the state's citizens, the Bureau of Planning, Research and Evaluation has assumed that both skills and attitudes will be given high priority.

Instrumentation
The data on the students' skills and attitudes will be derived from the administration of the Scholastic Testing Service's Educational Development Series. This battery is administered at grades 3, 6, 9 and 11. The instrument was recommended by a state test selection committee and approved by the State Education Agency. No part of the battery is criterion-referenced. The test results are reliable for use with individual students.


Related Data
School condition variables, community condition variables and input variables will be used in the computer analysis of the data.

Data Collection and Processing
Trained classroom teachers, guidance counselors and supervisors and school administrative staff members are responsible for giving the tests and inventories. West Virginia's State Education Agency is responsible for processing the resulting information.

Interpretation
The State Education Agency will be responsible for organizing, analyzing and interpreting statewide data. School personnel with State Education Agency and local district assistance will be responsible for individual school interpretations.

Use of Data
The assessment program results will be used for purposes of instruction, guidance, program evaluation, decision-making and program planning. By a policy agreement with the local education agencies, individual school data is restricted. A basic analysis and interpretation by individual school is forwarded to each respective school. The school receiving the report decides upon the release of the data and analysis.

Dissemination
Reports of the results will be prepared by the State Education Agency. The reports will include state summaries and school summaries. Individual schools will receive only the state summary and their own summary.

Overview
From 1967 to 1972, statewide assessments of education were institutional in nature. Beginning in 1972, the Bureau of Planning, Research and Evaluation initiated steps which will lead to an assessment of student learning. To fully implement a student learner assessment entails the collection of different data, a revision of the present information system and the pilot testing of numerous computer programs. The implementation of a total assessment system will provide information to those in decision-making positions. This should lead to a decrease in the number of decisions based on intuition and an increase in the number of decisions based on information gained from the sound treatment of data.

Prospects for the Future
Because the assessment program for student learning is relatively new, there will be much modification within the basic design in the next few years.

Individuals Interviewed
The primary sources of information were Philip F. Thornton, Consultant, Assessment and Evaluation, State Department of Education, Capitol Building No. 6, Room B-163, Charleston, West Virginia 25305 (telephone: 304 348-7814) and John McClure, Consultant, Assessment and Evaluation, State Department of Education, Room 337B, Capitol Complex, Charleston, West Virginia 25305 (telephone: 304 348-2698).

WISCONSIN

Statewide Assessment Program

Goals
A set of statewide educational goal areas dealing mainly with educational outcomes has been developed by an advisory task force composed of educators, non-educators, members of the legislature, students and other citizens interested in education. These goal areas have been recommended to the State Superintendent for Public Instruction but have not yet been formally adopted by him. The report of the task force to the State Superintendent is available upon request.

Program Title and Purposes
The program, officially called the Wisconsin Statewide Assessment Program, seeks to assess educational needs, measure influences on learning, measure growth, provide data for a management information system and set statewide educational goals.

Initiation
The Chief State School Officer and the governor initiated the idea for the program in the 1971-73 biennial state budget. The legislative mandate was passed in October 1971, officially launching the program, which is currently in its first operational year. First testing was in May 1973.

Policy

A citizens' advisory group, the Chief State School Officer, representatives of school systems, the State Education Agency and the state legislature all have some input in determining how the program is conducted and what changes will be made.

Funding
Approximately 80 percent of the program is financed by federal ESEA funds. The remaining 20 percent is financed by a special category of state funds. The program costs were approximately $90,000 through June 1973.

Administration
Planning and Evaluation personnel are responsible for coordinating the program statewide. One full-time equivalent position is supported by Title III funds, another full-time position by state funds and a half-time professional staff person is jointly supported by a combination of federal and state funds. The Research Triangle Institute developed the sampling design for the assessment program.

Population
The program is confined to grade 3 and grade 7 students in public elementary schools and excludes handicapped and nonpublic schools. A random sample of schools within the target population is chosen to participate in the program. Students who have dropped behind grade level are included in the program; however, educationally mentally retarded, emotionally disturbed and non-English speaking students are not. Not all eligible students participate in the program; a sample of students is drawn from the target population.

Areas Assessed

Mathematics and reading are being assessed in the cognitive areas; in the noncognitive area, student attitudes towards the educational environment are being assessed.

Instrumentation
In mathematics, criterion-referenced tests developed by math teachers and specialists across the state are used. The math test was "tailor-made" and was constructed using item sampling. In reading, a criterion-referenced test, adopted and revised for state purposes, is used. Both tests are used as "group reliable" measures.
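Item sampling of the kind mentioned for the math test means the item pool is split across several short forms, each student answers only one form, and results are aggregated at the group level. The pool size, number of forms and sample size below are assumptions made for the sketch, not Wisconsin's actual design:

    import random

    # Hypothetical item-sampling (matrix sampling) construction: a pool of
    # items is dealt out across short forms; each student takes one form.
    random.seed(1973)

    pool = [f"math_item_{k:03d}" for k in range(120)]   # assumed item pool
    n_forms = 6

    random.shuffle(pool)
    forms = [pool[f::n_forms] for f in range(n_forms)]  # 20 items per form

    # Rotating form assignment across an assumed sample of 900 students
    # gives each item about 900 / 6 = 150 responses: enough for group-level
    # ("group reliable") estimates, though not for individual scores.
    students = [f"student_{s:04d}" for s in range(900)]
    assignment = {name: i % n_forms for i, name in enumerate(students)}
    print(len(forms[0]), "items per form;",
          len(students) // n_forms, "responses per item")

The trade-off is the one the text notes: coverage of a large objective pool at the group level, at the cost of individually interpretable scores.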

Related Data

Although nonachievement data to be collected has not been fully determined, it is planned to collect information on per pupil instructional costs, socioeconomic status, school size, teacher data, student age and sex.

Data Collection and Processing
Local school staff will be responsible for giving the tests. The State Education Agency, with the assistance of the Wisconsin State Testing Service, will process the information obtained from the program.

Interpretation
The State Education Agency and curriculum specialists will be responsible for analyzing, organizing and interpreting the data.

Use of Data
The results of this year's assessment will be used to build a statewide profile for third and seventh graders concerning the mathematics and reading skills they possess. These state profiles will then be released to the public and other interested parties for their perusal. In addition, the achievement and nonachievement data will be analyzed to determine what conditions/relationships are in need of further research.

Dissemination
It is planned that the State Education Agency and a Technical Advisory Committee will prepare state summaries, target area summaries and public information pieces. These reports will be distributed by the State Education Agency to the governor, legislature, school districts, schools, students, parents, teacher organizations, ERIC and other interested parties.

Overview
As the program has been outlined thus far, reaction has been favorable. However, the program has not yet become operational. The program is proceeding according to schedule, and the program objectives are being achieved as planned. The major problem is the lack of funds to design and implement a comprehensive program which would assess a larger sample of the population and adequately collect and analyze nonachievement data which would provide for expanded decision-making.

Prospects for the Future
The program is likely to continue, with modifications in funding, areas assessed and the development of additional measurement instruments.

Individual Interviewed
The primary source of information was Robert Gonell, Section Chief, Research Assessment and Evaluation, 126 Langdon Street, Madison, Wisconsin 53702 (telephone: 608 266-1782).

WYOMING

Goals
The first phase of the statewide educational needs assessment program was the identification of the learner-oriented goals and objectives for Wyoming public elementary and secondary education. The goals became available May 1, 1973, in the Phase I Assessment Report. Work copies of the identified goals and objectives for the various subject areas and the instructional support areas of Student and Media Services were widely distributed in the state during the developmental period and later through routine dissemination of sub-reports. The procedures for the identification of the goals and objectives involved nine basic steps:

1. Review of literature and pertinent documents such as State Curriculum Guides, school district documents, national goals and objectives, State Board of Education publications and policies, school law and educational research documents by a consultant team at the University of Wyoming.
2. Preparation of rationale and tentative goals statements by assessment area consultant teams.
3. The refinement of tentative goals statements by the state's leading educators serving on committees for each instructional and instructional support area treated in the assessment.
4. Survey of the professional and lay public for reaction to appropriateness, meaningfulness and clarity of the rationale and the goals statements.
5. Revision of goals statements by the consultant team based on the results of the surveys.
6. Identification of sub-goals, general and specific objectives, following the previously described procedures.
7. Review of identified goals and objectives by recognized regional and national authorities.
8. Final revision of goals and objectives.
9. Publication and dissemination.

Goals and objectives were identified for the following areas:

Affective Education
Art Education
Communication Arts: Secondary English, Language Arts, Reading
Exceptional Children
Foreign Language
Health Education
Mathematics
Music
Occupational Education: Business Education, Distributive Education, Home Economics
Industrial Education: Industrial Arts, Trade & Industry
Physical Education
Psychomotor Education
Science
Social Studies
Pupil Personnel Services
Media Services

The goals deal with the output of education, except for Student and Media Services which involve input and process. Involved in the process of goal identification have been the personnel from school districts, faculty and staff members of the University of Wyoming, College of Education, specialists from the State Department of Education, the administrative offices of the College of Education and the State Department of Education, the Title III, ESEA State Advisory Committee, the Wyoming Education Association and 28 professional education associations and organizations in the state. Legislators and representatives of the lay and professional leadership in the state were represented in the State Sounding Committee. The State Sounding Committee reviewed progress in all areas treated in Phase I, providing reaction and input. Over 2,000 individuals in the state were surveyed in the initial goals identification.

Resources and materials from other states, major curriculum projects and the resources and materials of the National Assessment of Educational Progress have been used extensively in Phase I activities.

Endorsement of the goals by the State Board of Education is anticipated. A number of school districts in the state have used the WYENAP goals and objectives as springboard materials for the development of local goals and objectives.

Program Title and Purposes
Officially titled the Wyoming Educational Needs Assessment Project (WYENAP), the program claims 10 objectives which focus on the learner in the classroom: establishing measurement and evaluation standards, developing and pilot testing a measurement program, ascertaining learner outcome levels, identifying learner needs and learner-related needs, determining critical levels of those needs, recommending measures to alleviate critical needs, informing the public of the needs assessment project's activities and making recommendations relevant to establishing an ongoing state assessment program. These objectives are incorporated into a 5-phase schedule running through 1973. The overall purpose of the needs assessment is to facilitate statewide planning for the improvement of elementary and secondary education and at the same time meet the special requirements of ESEA Title III for statewide educational needs assessment.

Initiation
WYENAP was initiated by the State Department of Education under a contract with the University of Wyoming College of Education, Center for Research, Service and Publication in 1971. Projected activities are presently outlined through 1978 with a measurement development and measurement schedule which includes some recycling of assessment in special subject areas.

Policy

WYENAP serves directly under the Center for Research, Service and Publication, College of Education, University of Wyoming. While the contractual relationship between the University and the State Department of Education specifies the roles, duties and responsibilities for the conduct of the project, the practical and philosophical thrust of the relationship is that of cooperation and joint effort in the service of the schools of the state. The University of Wyoming is the only institution of higher learning in the state beyond the community college level. The University of Wyoming College of Education and the State Department of Education have traditionally provided the bulk of educational services to the schools of the state. The needs assessment project, in a formal arrangement, continues to further these services.

The Project Steering Committee coordinates and directs the WYENAP activity. Voting members of the committee include four State Department of Education members and three faculty members of the University of Wyoming College of Education. Ex-officio members of the steering committee are the Dean of the College of Education and the State Superintendent of Public Instruction. The State ESEA Title III Director serves as chairman of the committee and the project director serves as committee executive secretary.

While policy rests primarily with the contracting agency and is set forth in the WYENAP contract and proposal, recommendations for policy changes and recommendations for project modification and revision are acted upon by the project steering committee. Appropriate recommendations are forwarded by the committee chairman to the State Superintendent of Public Instruction for final approval.

Funding
Federal funds from ESEA Titles I and III finance all expenditures. The contract also provides overhead for the University which covers office space, furniture and equipment and makes available to the project many additional University-based services such as computer services, duplication and printing, use of motor pool and the like. While the primary funding is by contract, school districts have provided released time for teachers and administrators serving on the various committees to attend meetings. This has included district payment for substitute teachers. One hundred twenty-five educators from the schools of the state have served on such committees. The committees met two to four times during the Phase I operation. While it is rather difficult to estimate the school district contribution in terms of monies, it has been considerable and certainly vital to the project's success. Last year's expenses were roughly $80,000. The current contract for Phase II operation is $84,050.

Administration
The WYENAP staff includes the director, an assistant to the director, a full-time secretary and part-time data processing, research and clerical help. Graduate assistants also served as research aides in the Phase I operation. WYENAP staff are university employees; the project operates under University policy, rules and regulations. Consultants from the University of Wyoming faculty and from the State Department of Education, for the most part, headed up the committees for the identification of goals and objectives for the special subject and instructional support areas, as well as psychomotor and affective education. Some educators from the field with appropriate expertise and experience also served on the consultant teams. Additional consultant teams served the project to provide the demographic and historical research for the Phase I publication.

Last year approximately 50 consultants and 125 teachers were directly involved in the conduct of the study. Consultants for measurement and development served this past year to organize and supervise the development of criterion-referenced measurement programs in literature, reading, science and writing.

Population
Seventeen-year-olds in school will undergo assessment in November 1973 in the cognitive and affective domains. Assessment instruments will be administered to a proportionally stratified random sample of approximately 2,500 students. This is about one-third of the in-school seventeen-year-old student population. Parochial and private non-profit schools are invited to participate. Students in state institutions, such as the reformatories for boys and girls, will not be included at this time; however, the use of measurement tools will be made available to these institutions upon request. The educational programs of these state institutions are not under the jurisdiction of the State Board of Education. Also, in November, students in grades K-6 will undergo psychomotor assessment. A similarly sized proportionally stratified random sample will be drawn for the psychomotor assessment. Reporting categories are anticipated to be: age, grade, school district size by population, attendance center size by population, cost per pupil by school district and comparison with NAEP "p-values" for appropriate exercises.
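A proportionally stratified random sample allocates the overall sample to strata in proportion to their population shares, n_h = n x N_h / N. The stratum labels and counts below are invented for illustration; only the 2,500-student target and the roughly one-third sampling fraction come from the text:

    # Hypothetical proportional allocation: n_h = n * N_h / N per stratum h.
    population = {            # assumed counts of in-school seventeen-year-olds
        "large districts": 3600,
        "medium districts": 2550,
        "small districts": 1350,
    }
    N = sum(population.values())   # 7,500 in this invented frame
    n = 2500                       # sample target, about one-third of N

    allocation = {h: round(n * n_h / N) for h, n_h in population.items()}
    print(allocation)  # {'large districts': 1200, 'medium districts': 850, ...}

Because each stratum is represented in proportion to its size, estimates for reporting categories such as district size fall out of the same sample without reweighting.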

Areas Assessed

Tryouts of the affective and cognitive domain (science, reading, writing and literature) instruments were conducted in April 1973 in three school districts. Pilot testing in one school district was conducted in April 1973 for the psychomotor assessment screening instrument following validation of the screening instrument in another Wyoming school district in February and March 1973.

The long-range assessment for WYENAP includes the following schedule of initial measurement administration:

1973-74: Writing, Science, Reading, Literature, Affective, Psychomotor
1974-75: Citizenship, Music, Social Studies, Physical Education*, Affective, Psychomotor
1975-76: Mathematics, Foreign Language*, Health & Safety*, Affective, Psychomotor
1976-77: Career & Occupational Development, Affective, Psychomotor
1977-78: Art, Affective, Psychomotor

*New development

Recycling of assessment will begin in 1975-76 with the tentative designation of reading and science scheduled for reassessment at that time. Assessment related to input and process for the areas of Media Services and Student Services will continue through the 5-year assessment period.

Instrumentation
In the development of criterion-referenced testing programs, released exercises and data from the National Assessment of Educational Progress (NAEP) were utilized as a first priority for adaptation and keying to WYENAP goals and objectives. Other sources of exercises utilized on a second priority basis were released exercises from other state assessments, The Instructional Objectives Exchange (IOX), BSCS and SAPA. Third priority was originally developed exercises. The affective assessment instrument items were drawn from a combination of sources originating in and out of the state; however, the bulk of items in the instrument were developed originally by the WYENAP consultants.

Workshops began in January 1973 to orient consultants to the criterion-referenced measurement development process and to train those who developed the measurement programs for writing, reading, science and literature.


The specifications and procedures for exercise development were developed for WYENAP, based heavily on NAEP resources and materials for exercise development. A 15-step procedure was devised for the development and keying of exercises. This included monitoring and supervision of development and a series of reviews.

The psychomotor assessment screening instrument has been developed specifically for WYENAP assessment purposes and specifications.

For both pilot testing and later assessment, exercises will be administered in five packages, each package containing items from the five areas. Administration of each package will not take more than one hour of testing time. The sampling plan is presently based on 150 responses per exercise or item in the cognitive and affective domains.
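If, as seems implied, each exercise sits in exactly one of the five packages and each sampled student takes exactly one package, the 150-responses-per-exercise target translates directly into a minimum number of students per package. A sketch of that arithmetic, under those stated assumptions:

    # Bookkeeping for the five-package plan, assuming each exercise appears
    # in exactly one package and each student takes exactly one package.
    n_packages = 5
    target_responses = 150   # responses per exercise or item (from the text)
    sample_size = 2500       # cognitive/affective sample described earlier

    students_per_package = sample_size // n_packages   # 500 per package
    # Every exercise in a package is answered by all students assigned to
    # it, so 500 responses per exercise comfortably exceeds the 150 target.
    assert students_per_package >= target_responses

    print("minimum sample meeting the target:",
          n_packages * target_responses)               # 750 students

The slack between 750 and 2,500 is what lets the plan also support reporting breakdowns (age, district size, cost per pupil) without dropping below the response floor in any category.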

Related Data

The WYENAP Phase I report of the identified goals and objectives places these value statements into context and perspective through accompanying demographic and historical reports. The development of the demographic study and the treatment of the history of Wyoming public school finance and the growth of elementary and secondary education in the state have been an integral part of the Phase I operation of WYENAP. The assessment's activities with respect to Media Services, Student Services and exceptional children have been designed to produce an information base to apply in Phases III and IV operation for the identification of learner-related needs and establishment of levels of criticality. At the present time the State Department of Education has included in the Management Information System the annual collection and analysis of program information for Wyoming secondary schools. This information base will be utilized in the analysis of results in a manner similar to that stated above in Phases III and IV operations.

Data Collection and Processing

WYENAP staff will administer tests for the program. Actual processing of data will be done under subcontracts with other agencies for scoring and analysis.

Interpretation

WYENAP will be responsible for the analysis, organization and interpretation of the data. The setting of levels of criticality will involve the State Department of Education, the organized profession, school district representatives, lay public and others. A similar process of involvement and input to that of identifying the goals and objectives will be utilized.

Use of Data

Program results will be used in statewide educational planning, in covering ESEA Titles I and III requirements and in covering comprehensive planning requirements for individual school districts. All data will become confidential state property, to be published only with written permission from the State.


Dissemination

WYENAP staff have prepared and distributed progress reports to all people involved to this point. The Phase I Report was published in the spring of 1973.

Overview

Overall opinion of the program is favorable. The State Education Agency is most positive. Objectives for the project are being achieved very well. Time and money are the only major problems with respect to the long-term program outlined for measurement development, assessment and utilization of data.

Prospects for the Future

WYENAP is expected to continue, with possible modifications in funding. This description will remain accurate throughout 1973.

Individual Interviewed

The primary source of information was James L. Headlee, Director, WYENAP, College of Education, University of Wyoming, Laramie, Wyoming 82070 (telephone: 307 766-6490).

REFERENCE

WYENAP: Wyoming Educational Needs Assessment Project. University of Wyoming, State Department of Education, Education Profession, School Districts, Wyoming Citizens.



INTERVIEW GUIDE FOR THE SURVEY OF STATE EDUCATIONAL ASSESSMENT PROGRAMS

© FALL 1972

State: ______________  Interviewer: ______________

Interviewee: ______________  Date: ______________

Return to: Educational Testing Service, Room B-005, Princeton, New Jersey 08540. Telephone: 609 921-9000

Statewide Educational Goals

1. Has a formal set of statewide educational goals been prepared?
a. Yes
b. No
c. In process

2. If statewide educational goals have been prepared, are they available on request?
a. Yes
b. No

3. Which of the following groups participated in the preparation of the statewide educational goals?
a. Citizens' advisory group
b. Chief State School Officer
c. Governor
d. Legislature
e. Outside contractors
f. State Board of Education
g. State Education Agency
h. Students
i. Other

4. Have the statewide educational goals been formally adopted by the state?
a. Yes
b. No
c. In process

5. What legally constituted state authority adopted these goals?
a. Governor
b. Legislature
c. State Board of Education
d. State Education Agency
e. Other

6. Are the goals about:
a. Input
b. Process
c. Output
d. Combination

State Educational Assessment Program

Program Title and Objectives

7. What is the official name of this program?

8. The major purpose(s) of this program is/are to:
a. Assess cognitive development
b. Assess educational needs
c. Assess noncognitive development
d. Measure influences on learning
e. Measure growth
f. Provide data for a management information system
g. Set statewide educational goals
h. Provide information for a planning-programming-budgeting system
i. Other

Definition of the Target Population

Individuals

9. The target group(s) is/are defined by:
a. Age
b. Grade
c. Other

10. Are dropouts included?
a. Yes
b. No

11. If all students of a given age are included, do you include those who have dropped behind grade level?
a. Yes
b. No

12. Which of the following students are excluded from the program?
a. None
b. Academic
c. Delinquent
d. Educationally mentally retarded
e. Emotionally disturbed
f. Gifted
g. Migrant
h. Non-English speaking
i. Physically handicapped
j. Vocational
k. Other

13. How are individuals selected for the program?
a. All in target population are included
b. Samples are drawn from the target population
c. Other



Schools

14. What types of schools are excluded from the program?
a. None
b. Delinquent
c. Handicapped
d. Parochial
e. Private
f. Vocational-technical
g. Other

15. Are certain types of schools required to participate in the program?
a. None
b. Schools receiving federal monies
c. Schools receiving state monies
d. Other

16. Is school participation required?
a. Yes
b. No

17. What percent of eligible schools were included last year?

Community

18. Is participation confined to communities with special characteristics?
a. Yes
b. No

19. If yes, what community characteristics are considered?
a. Rural
b. Urban
c. Suburban
d. Size
e. Socioeconomic status
f. Other

Initiation of the Program

20. Which of the following groups initiated the idea for this program?
a. Governor's office
b. Independent organization
c. State Board of Education
d. State Education Agency
e. State Legislature
f. Chief State School Officer
g. Teachers' association
h. Other

21. How long from initiation of the idea to either administration of instruments in the schools or to your present stage of development?

Program Policy

22. Which of the following groups determine how the program is conducted and what changes will be made in the nature of the program?
a. Citizens' advisory group
b. Chief State School Officer
c. Governor's office
d. Representatives of major school systems
e. State Board of Education
f. State Education Agency
g. State legislature
h. Other

Funding

23. Roughly what proportion of the program expenses is financed by:
a. Federal funds
b. Local funds
c. Other

24. What categories of state funds are utilized?

25. Which of the following federal funds are utilized?
a. ESEA Title I
b. ESEA Title III
c. ESEA Title V
d. ESEA Title VI
e. ESEA Title VII
f. ESEA Title VIII
g. NDEA V-A
h. Vocational Education Acts
i. Other

26. Roughly how much did the program cost last year?

Administration of the Program

27. What state, county or local education agency personnel coordinate the program statewide?
a. ESEA (specify which title)
b. Elementary and secondary education bureau
c. Planning and evaluation
d. Pupil personnel services
e. Research
f. Vocational education
g. County
h. School district
i. Local
j. Other

28. What is the professional staff size in terms of full-time equivalents?
a. County
b. Local
c. State

29. Are outside agencies or consultants involved?
a. Yes
b. No

If yes, please identify the organization or consultants and describe their responsibilities.

Target Areas

30. Which of the following cognitive areas are being assessed?
a. Aptitude
b. English
c. Foreign language
d. Health
e. Mathematics
f. Natural science
g. Reading
h. Social science
i. Vocational
j. Writing
k. Other

31. Which of the following noncognitive areas are being assessed?
a. Attitudes (specify)
b. Citizenship
c. Creativity
d. Interests
e. Personal values
f. Self-concept
g. Other


Instrumentation

32. What tests, inventories or other measures are being used for each target area at each age or grade level?

33. Of those tests, inventories or other measures listed in 32, which have been "tailor-made" for the program?

34. Was item sampling used to construct any of the "tailor-made" instruments?
a. Yes (specify)
b. No

35. Who developed these "tailor-made" tests?
a. Consultants
b. Outside contractors
c. State Education Agency
d. Teacher committees in the state
e. Other

36. How were the standardized tests listed in 32 chosen?
a. Committee
b. Consultants
c. State Education Agency
d. Other

37. Which of the instruments in 32 would you describe as criterion-referenced measures?

38. Which of the instruments in 32 are used only as group-reliable measures?

Related Data

39. What other additional information is collected to help analyze or interpret the data?
a. None
b. Age
c. Cost of instructional programs
d. County
e. Dropout rates
f. Academic grades
g. Use of paraprofessionals or educational aides
h. Percent minority
i. Previous course work
j. Racial/ethnic background
k. School name
l. School size
m. Sex
n. Socioeconomic status
o. Teacher data*
p. Other

*e.g., teacher salaries, teacher education, course load, etc.

Data Collection and Processing

40. Who is responsible for giving the tests, inventories, etc. for the program?
a. Classroom teachers
b. Guidance counselors
c. Paid consultants
d. School administrative staff
e. Other

41. Who is responsible for processing the information obtained from the program?
a. Local schools
b. Local school districts
c. Outside contractors
d. State Education Agency
e. State or private universities
f. Other

Interpretation of Data

42. Who is responsible for analyzing, organizing and interpreting the data?
a. Local schools
b. Local school districts
c. Outside contractors
d. State Education Agency
e. State or private universities
f. Other

Use of Data

43. How are the results of the program used?
a. Allocation of state or federal funds
b. Budgeting
c. Instruction
d. Comparative analysis across schools
e. Identification of exemplary programs
f. Guidance
g. Program evaluation
h. Program planning
i. Public relations
j. Other

44. Does the law or authority for the program restrict or limit use of the data?
a. Yes
b. No

Dissemination

45. Are reports of the results of the program prepared?
a. Yes
b. No

46. What kinds of reports are prepared?
a. State summaries
b. Regional summaries
c. School system summaries
d. School summaries
e. Student reports
f. Summaries by target areas
g. Public information pieces
h. Other

47. Who receives a copy of the program reports?
a. State Education Agency
b. Governor
c. Legislature
d. Newspapers
e. Other states
f. State Board of Education
g. School districts
h. Schools
i. Students
j. Parents
k. Teacher organizations
l. ERIC
m. Other

48. Who prepares the reports?
a. College or university
b. Outside contractors
c. State Education Agency
d. Other


Overview

49. In general, how is the program viewed by: (rated on a scale from Very Unfavorably to Very Favorably)
a. Governor
b. Legislature
c. State Education Agency
d. Boards of education
e. Local school administrators
f. Teachers
g. Students
h. General public

50. In general, how well would you say the program objectives are being achieved? (rated on a scale from Very Poorly to Very Well)

51. What are the major problems related to the program?

Prospects for the Future

52. Is the program likely to continue?
a. Yes
b. No
c. With modification

53. If yes, how long do you expect this description to remain accurate?

54. If no, will it discontinue because of:
a. Lack of funds
b. Lack of interest
c. Unable to meet objectives (specify)
d. Only planned to run "x" years
e. Widespread opposition
f. Other

55. Which of the following elements in your program are most likely to change in the near future?
a. Goals or objectives
b. Target population
c. Policy control
d. Funding
e. Administration
f. Areas assessed
g. Measuring instruments
h. Data collection and processing procedures
i. Interpretation of data
j. Use of data
k. Dissemination
l. Other

56. Is a report of State Educational Assessment Programs covering all states likely to be useful in revising your program in the future?
a. Yes
b. No
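The checklist format above lends itself to simple cross-state tabulation of the kind reported in the first part of this publication. As a minimal sketch (the records and the second state's answers are hypothetical, not survey data), responses to a multiple-choice item such as question 43 might be tallied like this:

```python
from collections import Counter

# Hypothetical completed-guide records: one dict per state, mapping
# question numbers to the option letters checked by the interviewee.
responses = {
    "Wyoming": {"43": ["a", "g", "h"]},
    "Colorado": {"43": ["b", "g"]},
}

# Tally, across states, how often each use of data (question 43) was checked.
tally = Counter(
    option
    for answers in responses.values()
    for option in answers.get("43", [])
)
print(tally.most_common())  # e.g., [('g', 2), ('a', 1), ...]
```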
