
DOCUMENT RESUME

ED 274 679 TM 860 539

AUTHOR Vale, C. David
TITLE Implementation of a Microcomputer-Based Testing System in a Military Training Environment.
INSTITUTION Assessment Systems Corp., St. Paul, Minn.
SPONS AGENCY Office of Naval Research, Arlington, Va. Personnel and Training Research Programs Office.
REPORT NO ONR-RR-85-2
PUB DATE 11 Nov 85
CONTRACT N00014-83-C-0634
NOTE 24p.
PUB TYPE Reports - Research/Technical (143)
EDRS PRICE MF01/PC01 Plus Postage.
DESCRIPTORS *Achievement Tests; *Adaptive Testing; Adults; Computer Assisted Instruction; *Computer Assisted Testing; Diagnostic Tests; Item Banks; Microcomputers; *Military Training; Postsecondary Education; Standardized Tests; Technical Education; *Test Construction; *Testing
IDENTIFIERS *MicroCAT Testing System

ABSTRACT
A computerized testing system was installed on an experimental basis at the Basic Electricity and Electronics School of the Naval Training Center in San Diego. The system consisted of a network of IBM Personal Computers running a slightly modified version of the commercially available MicroCAT Testing System. It was configured to fit transparently into the school's computer-managed instruction system. After a few minor adjustments and a few added features, the system met its goal of paralleling the paper-and-pencil version of the tests with a minimum of change in standard testing procedures. Now in place, the system provides a base on which diagnostic testing research can begin. Diagnostic testing will be implemented using the custom interface included in MicroCAT, which allows users to link FORTRAN or Pascal procedures to MicroCAT. (Author/JAZ)



unclassified

REPORT DOCUMENTATION PAGE

1a. REPORT SECURITY CLASSIFICATION: unclassified
1b. RESTRICTIVE MARKINGS:
2a. SECURITY CLASSIFICATION AUTHORITY:
2b. DECLASSIFICATION/DOWNGRADING SCHEDULE:
3. DISTRIBUTION/AVAILABILITY OF REPORT: approved for public release; distribution unlimited
4. PERFORMING ORGANIZATION REPORT NUMBER(S): ONR-85-2
5. MONITORING ORGANIZATION REPORT NUMBER(S):
6a. NAME OF PERFORMING ORGANIZATION: Assessment Systems Corporation
6b. OFFICE SYMBOL (If applicable):
6c. ADDRESS (City, State, and ZIP Code): 2233 University Avenue, Suite 310, St. Paul, MN 55114
7a. NAME OF MONITORING ORGANIZATION: Office of Naval Research
7b. ADDRESS (City, State, and ZIP Code): 800 N. Quincy Street, Code 442, Arlington, VA 22217
8a. NAME OF FUNDING/SPONSORING ORGANIZATION: Personnel and Training Research
8b. OFFICE SYMBOL (If applicable):
8c. ADDRESS (City, State, and ZIP Code): Office of Naval Research, 800 N. Quincy Street, Code 442, Arlington, VA 22217
9. PROCUREMENT INSTRUMENT IDENTIFICATION NUMBER: N00014-83-C-0634
10. SOURCE OF FUNDING NUMBERS: PROGRAM ELEMENT NO.: ; PROJECT NO.: N 507-002; TASK NO.: ; WORK UNIT ACCESSION NO.:
11. TITLE (Include Security Classification): Implementation of a Microcomputer-Based Testing System in a Military Training Environment
12. PERSONAL AUTHOR(S): C. David Vale
13a. TYPE OF REPORT: technical report
13b. TIME COVERED: FROM 9/1/83 TO 11/11/85
14. DATE OF REPORT (Year, Month, Day): 85 Nov 11
15. PAGE COUNT: 12
16. SUPPLEMENTARY NOTATION:
17. COSATI CODES (FIELD, GROUP, SUB-GROUP):
18. SUBJECT TERMS: adaptive testing, computerized testing, diagnostic testing, item banking, item response theory, MicroCAT, software, tailored testing, training
19. ABSTRACT: A computerized testing system was installed on an experimental basis at the Basic Electricity and Electronics School of the Naval Training Center in San Diego. The system consisted of a network of IBM Personal Computers running a slightly modified version of the commercially available MicroCAT(tm) Testing System. It was configured to fit transparently into the school's computer-managed instruction system. After a few minor adjustments and a few added features, the system met its goal of paralleling the paper-and-pencil version of the tests with a minimum of change in standard testing procedures. Now in place, the system provides a base on which diagnostic testing research can begin.
20. DISTRIBUTION/AVAILABILITY OF ABSTRACT: unclassified/unlimited
21. ABSTRACT SECURITY CLASSIFICATION: unclassified
22a. NAME OF RESPONSIBLE INDIVIDUAL: Charles E. Davis
22b. TELEPHONE (Include Area Code): 202-696-4046
22c. OFFICE SYMBOL:

DD FORM 1473, 84 MAR. 83 APR edition may be used until exhausted; all other editions are obsolete.

SECURITY CLASSIFICATION OF THIS PAGE: unclassified


ABSTRACT

A computerized testing system was installed on an experimental basis at the Basic Electricity and Electronics School of the Naval Training Center in San Diego. The system consisted of a network of IBM Personal Computers running a slightly modified version of the commercially available MicroCAT(tm) Testing System. It was configured to fit transparently into the school's computer-managed instruction system. After a few minor adjustments and a few added features, the system met its goal of paralleling the paper-and-pencil version of the tests with a minimum of change in standard testing procedures. Now in place, the system provides a base on which diagnostic testing research can begin.


TABLE OF CONTENTS

Introduction 1

Design of the Testing System 1

System Requirements 1

Analysis of the Systems 3

MIISA: The Navy's Computer-Managed Instruction System 3

The MicroCAT Testing System 4

Integrating the Resources 6

Implementation of the System 7

Description of the Initial System 7

Initial Evaluation 8

System Revisions 8

Additional Features 8

Evaluation of the System 10

Future Plans for Diagnostic Testing 11

References 12


INTRODUCTION

Achievement testing takes up a substantial portion of a trainee's time in a self-paced military service technical school because continual assessment of the trainee's skills is necessary to pace the instruction. Obviously, anything that can be done to make testing more efficient or to extract better information from the testing process will enhance the quality of training. Several forms of computerized testing, including computerized adaptive testing (Weiss, 1982, 1985) and computer-based diagnostic testing (Tatsuoka & Tatsuoka, 1983), offer the promise of such an improvement.

Computer-based instruction and testing in the service schools require reliable, inexpensive computer equipment that can handle a variety of presentation forms. Among the forms such equipment must handle are standard computer-based instruction and conventional, adaptive, or diagnostic testing. Although a variety of software systems for computer-based instruction are available, very few software systems are available for implementing adaptive or diagnostic testing. The MicroCAT(tm) Testing System (Assessment Systems, 1984) is a generic testing system that can be used for most forms of testing and many forms of computer-based instruction.

The development of the MicroCAT system was partially supported by funds from the Office of Naval Research (ONR). A major objective of ONR in supporting this development was to provide a testing system to meet the needs of the training and achievement-testing environment. To test its effectiveness in this environment, MicroCAT was implemented in a Navy Training Center as a means of introducing diagnostic testing into one of the service technical schools.

The system was implemented at the Basic Electricity and Electronics (BE&E) School at the Naval Training Center in San Diego, California. The overall implementation plan was to introduce a computerized testing system into the current testing process and, once this system was in place and tested, to extend the program to diagnostic testing. This report describes the design and initial implementation of this system.

DESIGN OF THE TESTING SYSTEM

When this project began, the design of the MicroCAT Testing System was nearly complete and many of the MicroCAT programs had been developed. The objectives of the design of the testing system for the BE&E School were: (1) to assess the testing needs of the school, (2) to expand the MicroCAT system to allow the strategies for diagnostic testing to be implemented, and (3) to integrate the system into the testing environment and the computer-managed instructional system that were already in place at the school.

System Requirements

Students at the BE&E School are tested approximately once a day. The student studies a particular subject and then takes a test on that subject.


The achievement tests used in the BE&E curriculum contain 8 to 50 questions on basic electricity and electronics knowledge. Typically they consist of some form of graphic (e.g., a schematic or a chart) and a question, often using special symbols (such as an omega for ohms). Figure 1 shows a sample item (not actually used in the BE&E curriculum) on resistance analysis. To solve this problem, the examinee must know how to apply Ohm's law and must either recognize that the bridge on the right of the schematic is balanced (thus providing a computational shortcut) or apply an appropriate network theorem to determine the overall resistance, and thus the current, in the system.

Figure 1. Sample Electronics Item

[Schematic of a resistive circuit containing a battery and a balanced bridge; the graphic and component values are not recoverable from this copy.]

How much current will flow in this circuit?

A. 21 milliamperes
B. 63 milliamperes
C. 127 milliamperes
D. 254 milliamperes

To take a test in the conventional paper-and-pencil format, a student reports to a testing room and is assigned a microfiche card containing the test. The student then goes to a testing carrel containing a microfiche reader, loads the test into it, and responds to the questions by marking an optically scannable answer sheet. After the student completes the test, he or she puts the answer sheet into an optical scanner, which reads the answer sheet and transmits the information to MIISA, the computer-managed instruction system running on a mainframe computer in Memphis, Tennessee. MIISA determines that the test the examinee took was the proper one, scores it, reports the results, and updates the student's record in the database.


The student receives the reported results on a printing computer terminal connected to the optical scanner. This report tells the student his or her score and what test to take next.

During the initial phases of implementation of the computerized testing system, students could take tests using either the computerized system or the conventional microfiche cards. The computerized tests had to be psychometrically comparable to the microfiche tests because all scores would be interpreted on the same scale. It was important that the tests be psychologically comparable as well, because if students perceived a difference in the difficulty of the tests, either real or imagined, they might avoid the form they considered to be more difficult or troublesome. Three factors that contribute to the psychological comparability of the forms are: (1) speed of system response to the examinee, (2) fault tolerance during system failures, and (3) support of standard test-taking strategies. To avoid giving the examinee the impression that the computerized version is slower than the conventional version, a goal for the maximum system time between the examinee's response and the presentation of the next item was set at less than five seconds. It is also important for examinees to feel confident that their work will not be lost because of equipment failure. And finally, a major test-taking strategy that must be supported is the examinee's ability to skip items and then return to them at the end of the test.

The computerized testing system also had to fit into the existing computer-managed instructional system without requiring any programming on the part of the Navy. This essentially meant that no changes could be made to the testing process that would be detected by MIISA.

Finally, the system had to be able to handle special cases. An example of a special case in the traditional testing mode would be a mis-scanned answer sheet that failed to give credit for all correct responses. Another would be the loss of an examinee's record after its receipt had been acknowledged by MIISA. In the conventional testing format, special cases are handled by the test proctor, who interacts with MIISA on the printer terminals used to return examinee test results. A similar means of proctor intervention had to be made available with the computerized testing system.

Analysis of the Systems

MIISA: The Navy's Computer-Managed Instruction System

All instruction and testing at the BE&E School is managed by MIISA, the program that assigns and scores tests and tracks student progress throughout the entire course of study. It is a very large program running on a mainframe computer at a central computer installation. Because of its size and distance from the BE&E School, it is very difficult to make any changes to the program. Therefore, the computerized testing system had to use existing MIISA interfaces. The most convenient interface was with the printer terminals through which scanned test responses are transmitted and score reports are received.


The printer terminals used are General Electric Terminet terminals. These terminals contain sufficient intelligence to read the data from the optical scanner, add a header of approximately 20 characters, and transmit the transaction to MIISA. Data are transmitted from the Terminet through a standard RS232 serial port. The data are communicated through a 1200-baud modem to a local concentrator and then transmitted to MIISA at 9600 baud.

The transactions sent to MIISA are all single lines of ASCII characters terminated with a carriage return. Data returned from MIISA are score reports formatted for the Terminet's printer. Among the transactions of interest to this project are score reports and requests for tests to be taken. It was apparent that a convenient way to connect to the existing system was to emulate the Terminet terminals, sending proper Terminet transactions and receiving score reports.
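As a rough illustration of this emulation approach, the sketch below sends one carriage-return-terminated ASCII transaction over a serial link and waits for the printable report that comes back. It is a minimal modern reconstruction, assuming the pyserial library; the 20-character header, port name, and transaction text are placeholders, not the actual MIISA protocol.

# Minimal sketch of Terminet-style emulation, assuming pyserial.
# Header layout and transaction text are placeholders; the actual
# MIISA transaction formats are not documented in this report.
import serial

def exchange(port, header, body):
    """Send one single-line ASCII transaction, terminated with a
    carriage return, and return MIISA's printable reply."""
    port.write((header + body).encode("ascii") + b"\r")
    reply = port.read_until(b"\r")  # block until the report arrives
    return reply.decode("ascii", errors="replace")

# The proctoring station talked to MIISA through a 1200-baud modem.
link = serial.Serial("/dev/ttyS0", baudrate=1200, timeout=30)
print(exchange(link, "HDR".ljust(20), "REQUEST-FOR-TEST ..."))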

The MicroCAT Testing System

The MicroCAT Testing System was designed to be a self-contained system for developing, administering, and analyzing adaptive tests. The system is packaged into four subsystems: the Development Subsystem, the Examination Subsystem, the Assessment Subsystem, and the Management Subsystem. The programs available in each of these subsystems are shown in Table 1.

The Development Subsystem contains programs for entering and editing test items consisting of text and graphics and for arranging those items into tests using a number of conventional and adaptive testing strategies. Tests are specified in MCATL, an authoring language designed especially for specifying tests. (This specification may also be accomplished by filling in blanks in pre-defined strategy templates.) MCATL is compiled to an intermediate form of code that can be executed quickly during the testing process.

The Examination Subsystem administers the tests. The programs in this subsystem read the test specification instructions (the intermediate code file generated by compiling the test specification), present the test items, accept the examinee's responses, score the responses, and report the results in a data file.

The Assessment Subsystem contains programs for analyzing tests that have been administered. One program, ASCAL, estimates item response theory (IRT) item parameters. Other programs in this subsystem reformat data for analyses, perform conventional item analyses, evaluate characteristics of item pools, and perform test validation analyses.

The Management Subsystem is intended for use with a network of testing stations. Some programs in the Management Subsystem allow a proctor to monitor testing at a number of testing stations from a single terminal; others store individual examinee data in a master data file.


Table 1. MicroCAT Components

Development Subsystem

BANK: Enters and edits text and graphics items
MAKEFONT: Generates special-purpose character sets
CREATE: Creates tests using pre-defined templates
EDIT: Enters and edits MCATL test specifications
COMPILE: Compiles test specifications

Examination Subsystem

TESTONE: Tests one examinee and writes the score to a file
TESTMANY: Tests examinees repeatedly and writes scores to a file

Assessment Subsystem

COLLECT: Collects and formats item response data
ANALYZE: Performs conventional item and test analyses
ESTIMATE: Estimates IRT item parameters using ASCAL(tm)
EVALUATE: Pre-evaluates a test's potential using IRT
VALIDATE: Performs test validation analyses

Management Subsystem

RESERVE: Reserves disk space for communication
PROCTOR: Proctors test administration from a command station
RETRIEVE: Retrieves data from the master data file

Two substantial modifications to the MicroCAT system were planned to incorporate it into the Naval Training Center (NTC) testing environment. At the time this implementation was planned, the MicroCAT system had no facility by which an examinee could skip an item and later review it, or change responses to any items previously administered. Such capabilities are not typically allowed in adaptive testing. However, to maintain psychological comparability to the existing conventional testing process, such an addition was necessary. The second addition was the incorporation of communication facilities so that the MicroCAT Testing System could communicate with MIISA. The planned approach was to enhance the proctoring program so that it could communicate with MIISA by emulating the General Electric Terminet terminals and the standard transaction protocols.
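The review capability can be pictured as a simple two-pass loop over the test: skipped items are queued and re-presented at the end. This is only an illustrative sketch of the behavior described above, not MicroCAT's actual implementation; the item identifiers and the presentation callback are hypothetical.

# Illustrative sketch of skip-and-review test administration.
# 'items' is a list of hashable item identifiers; 'present' shows an
# item and returns the examinee's response, or None for a skip.
def administer_with_review(items, present):
    responses = {}
    skipped = []
    for item in items:              # first pass through the test, in order
        answer = present(item)
        if answer is None:
            skipped.append(item)    # examinee chose to skip this item
        else:
            responses[item] = answer
    for item in skipped:            # return to skipped items at the end
        answer = present(item)
        if answer is not None:
            responses[item] = answer
    return responses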


Integrating the Resources

The MicroCAT Testing System runs on IBM Personal Computers. Each individual testing station has one such computer. An IBM Personal Computer consists of three major components: a monitor, a system unit, and a standard keyboard with several additional function keys added at each end.

For the fault-tolerant testing system required for the NTC implementation, 18 IBM Personal Computers were connected via an Ethernet local area network. A diagram of the system is shown in Figure 2.

Figure 2. Structure of the NTC Implementation

[Diagram: an Ethernet cable linking Server 1 and Server 2, fifteen testing stations (Stations 1 through 15), and a proctor station; the proctor station has an outside connection to MIISA.]

The two network servers shown at the top of Figure 2 contain the tests that are administered and the data that are collected. The two servers in the NTC system are IBM PC-XT computers. Each has 256 kb of RAM, one 360-kb diskette drive, and one 10-mb hard-disk drive. Each server contains all of the tests to be administered. In normal operation, each server serves half of the testing stations. If either of the servers fails, the other one is capable of handling the entire testing system.


All remaining terminals on the network are IBM PCs with 192 kb of RAM and a single diskette drive. The diskette contains only those programs necessary to link each terminal into the network, plus the current examinee's responses, kept for test recovery in case the testing station fails.

Two of the testing stations are configured to function as proctoring stations. In addition to the standard testing system hardware, they contain a serial port to communicate with MIISA and a printer to print test results. In normal operation only one proctoring station is used; the other serves as a standard testing station and is available as a backup if the proctoring station fails.

IMPLEMENTATION OF THE SYSTEM

Description of the Initial System

The initial system was organized functionally as described above and in Figure 2. To operate the system, the test proctor first has to start the network servers by turning on the power, entering the date and time, and making a single keystroke to start each network server in the normal fashion.

After starting the servers, the proctor turns on the proctor station and all of the testing stations. All of these terminals automatically link into the network and establish a connection with the proper server. The server to which each testing station connects is determined by data contained on the diskette within the testing station.

The proctoring station presents a message asking the proctor if the test request queue should be cleared. In the case of a normal start, this is always done. Only in unusual circumstances, such as recovery after a power failure, would the proctor not clear the test request queue. The communications link with MIISA is established automatically by turning on the modem. At this point the system is ready for operation.

When an examinee arrives to take a test, the proctor assigns him or her to one of the available testing stations. Each available testing station displays a message telling the examinee to enter his or her Social Security number and press the return key. When the examinee does this, the Social Security number is passed through the network to the server and from the server to the proctoring station, where it is formatted into a transaction asking MIISA what test should be assigned to the examinee. MIISA then responds with a report, and the program running on the proctoring station extracts the test identifier from that report. It passes the test identifier back through the network to the testing station where the examinee is waiting for a test. This process typically takes between 5 and 10 seconds.
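In outline, the proctoring station's role in this exchange looks something like the sketch below. Everything here is hypothetical shorthand: the transaction text, the identifier format, and the exchange callback (the carriage-return-terminated serial exchange sketched earlier) stand in for details the report does not give.

# Hypothetical outline of the login/test-assignment round trip.
def assign_test(ssn, exchange):
    """Ask MIISA what test the examinee should take next and return
    the test identifier extracted from MIISA's printed report."""
    report = exchange("WHAT-TEST " + ssn)   # round trip: typically 5-10 seconds
    for token in report.split():
        if token.startswith("TEST-"):       # placeholder identifier format
            return token                    # sent back to the waiting station
    raise ValueError("no test identifier found in MIISA report")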

As items are administered at the testing station, each response is edited to ensure that it is valid. When the examinee finishes a test, the response record is passed through the network to the proctoring station, which formats it into a transaction and transmits it to MIISA.


MIISA then scores the test, updates the examinee's course record, and transmits a report to the proctoring station. The proctoring station then passes this report to the system printer, from which the examinee obtains his or her score report. The testing process is complete at this point.

Initial Evaluation

For the most part, the initial system ran without error. Students taking tests on the system were reliably tested and always received proper reports from MIISA. However, the Navy chiefs in charge of the testing process noticed two potential difficulties with the system. First, they determined that it was possible for a student to run two terminals simultaneously. By doing this, a student could preview the items on one station and then answer them on a second station. Since no feedback was given about the correctness of responses, there was really no advantage to be gained from doing this, but it was nevertheless of concern to the chiefs. The second potential problem was that students could reset their testing stations with several combinations of keys (e.g., control-c, and the system reset combination of control-alt-delete).

System Revisions

To alleviate the first problem identified in the initial evaluation, a lockout buffer was incorporated into the proctoring station to prevent a student from operating more than one station at a time. When a student logs into the system, his or her Social Security number is kept in a buffer and is not deleted until he or she completes the test. If the student tries to log in at another station, a message appears informing him or her that this is not allowed, and the proctor is alerted at the proctoring station.
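The lockout buffer amounts to a small set of active Social Security numbers held at the proctoring station. The sketch below is a minimal reconstruction of that rule, with the proctor alert reduced to a hypothetical printed message.

# Minimal sketch of the lockout buffer: one active session per student.
active_ssns = set()  # Social Security numbers of students currently testing

def log_in(ssn):
    """Admit a student unless he or she is already testing elsewhere."""
    if ssn in active_ssns:
        print("PROCTOR ALERT: " + ssn + " attempted a second login")  # stand-in alert
        return False          # the second station refuses the login
    active_ssns.add(ssn)
    return True

def complete_test(ssn):
    active_ssns.discard(ssn)  # entry is deleted only when the test is finished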

To solve the reset problem, most of the control key combinations that could reset the testing station were disabled. However, the control-alt-delete combination is buried deep in the hardware of the IBM Personal Computer as the system reset, and there is no way to disable it from software. This was considered a relatively minor problem, however, because it is extremely unlikely that a student would hit this combination of keys accidentally, and anyone who was determined to reset the station could always do so by turning the power off, even if the control-alt-delete combination could have been disabled.

Additional Features

Several additional features were added to the system, some of which had not been initially intended. The first was a modification to allow students to take remedial tests on the computerized testing system. (Remedial tests are for students who fail a particular portion of a test and must retake only that portion after additional study.) In the microfiche mode, the student simply answers the questions in that section and leaves all of the other sections on the answer sheet blank. If the student accidentally answers items in any other section of the test, the test record is rejected by MIISA. To allow remedial examinations in the computerized testing system, two modifications were made.


First, the standard testing mode was altered to allow students in remedial mode to skip sections of the test. Remedial test sections without any responses are simply ignored by MIISA. A second modification was necessary to solve the problem that occurred when an examinee accidentally answered an item in the wrong section, causing MIISA to reject the entire test record. In this case, the response vector is analyzed at the proctoring station, and any section of the test that has some but not all of its items answered is completely blanked, as if the examinee had answered no items in that section. That section is then ignored by MIISA, and only the section that has all items answered is scored. With these modifications, the computerized mode is virtually identical to the paper-and-pencil mode of remedial testing.
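The blanking rule is simple enough to state in a few lines of code. The sketch below assumes a hypothetical layout in which the response vector is grouped by section, with None marking a blank; any section that is partially answered is wiped before the record is sent to MIISA.

# Sketch of the section-blanking rule applied at the proctoring station.
# Assumed layout: {section name: list of responses, None = left blank}.
def blank_partial_sections(vector):
    for section, responses in vector.items():
        answered = sum(r is not None for r in responses)
        if 0 < answered < len(responses):              # some but not all answered
            vector[section] = [None] * len(responses)  # blank the whole section
    return vector

# Section "B" was accidentally touched, so it is blanked and MIISA
# will ignore it; fully answered section "A" is scored as usual.
record = {"A": ["c", "b", "a"], "B": [None, "d", None]}
assert blank_partial_sections(record)["B"] == [None, None, None]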

A second feature that was added to the system was the capability to retransmit an examinee's test record directly from the proctoring station. Occasionally, the MIISA system would accept a test record and produce a report but then lose the test record. The proctors then had to re-enter the record by hand using the communication capability provided in the proctoring station. To solve this problem, a facility was incorporated into the proctoring program that would retransmit the entire test record from the recovery file on the testing station's diskette.

Because the data collected by computer administration were to be analyzed by the University of Illinois, a data transfer scheme was needed. The MIISA link is a real-time link in that testing waits for communication. Transferring the data to the University of Illinois, on the other hand, had to be done only when the data were needed or when the disks on the NTC network were full. A system was developed whereby the test proctor periodically dumped the data from the system disks to two sets of diskettes, one for the University of Illinois and one for backup. After dumping the data, the proctor was instructed to mail one set to the University of Illinois and to keep the backup set until receipt was confirmed. The data on the system disk were erased after the diskettes were made. Except for the difficulty of getting the proctor to make the data diskettes on a regular basis, this scheme worked well.
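As a rough sketch of that dump procedure, the following copies each result file to the two diskette sets and erases the original only afterwards. The directory paths and the .dat extension are hypothetical, not taken from the actual system.

# Sketch of the periodic data dump: two diskette sets, then erase.
# Paths and file extension are hypothetical.
import shutil
from pathlib import Path

def dump_results(system_disk="results", illinois_set="A:/", backup_set="B:/"):
    for data_file in Path(system_disk).glob("*.dat"):
        shutil.copy2(data_file, illinois_set)  # set mailed to the University of Illinois
        shutil.copy2(data_file, backup_set)    # set kept until receipt is confirmed
        data_file.unlink()                     # erase from the system disk afterwards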

Testing has not been interrupted because of any system problems. It was interrupted for several weeks, however, by the implementation of new versions of the tests. The frequent changes in tests, which had not been anticipated when the system was installed, required frequent communication with the University of Illinois. It had been intended that the University of Illinois would do the test development and then either manually install the tests in the San Diego system or mail complete test files with installation programs to be run by the proctor. However, as the test changes became more frequent, it became apparent that it would be more efficient for NTC personnel to make the changes themselves and install the tests.

Test development in the MicroCAT system is a three-stage process. First, the items are authored using the system's Graphics Item Banker. Then the test is specified using an authoring language. Finally, the authoring language is compiled, a process that reformats the items and processes the instructions in a manner that allows items to be presented rapidly. Implementing a test in the NTC system required the further step of copying the compiled test onto the appropriate disk volume.


NTC test administration personnel mastered the process with relative ease. However, a few problems did arise. One problem was that if diskettes were swapped while the item banker was running, a bank would be destroyed. Although this problem is easily circumvented by not swapping diskettes, that solution was obviously not optimal. A utility program that could recover a bank destroyed in this manner was developed.

A second problem was that two people sharing a disk volume using the Ethernet network from 3Com can, under certain circumstances, destroy each other's work. For example, NTC personnel destroyed an item bank by writing portions of a memo over it. Fortunately, the new program was able to restore most of what was lost.

EVALUATION OF THE SYSTEM

The MicroCAT Testing System was implemented at the BE&E School to provide a vehicle for diagnostic testing and to evaluate the MicroCAT system in a full-scale operational testing environment. In general, the MicroCAT system has performed admirably. To date, approximately 2,400 items have been banked for this application. From these, approximately 50 different tests have been implemented, and approximately 1,500 tests have been administered. Informal evidence from the BE&E School suggests that the system is fast enough for all testing needs, that examinees perceive it as psychologically parallel to the microfiche form of testing, and that it is adequately fault-tolerant. Although the local testing system rarely fails, the capability to retransmit data if MIISA loses the original transmission has been very valuable.

As an evaluation site for the MicroCAT system, the NTC environment has been less than optimal. To date, only the conventional testing capabilities of the MicroCAT Testing System have been evaluated to any degree. The considerable power for adaptive test administration and analysis that is a major strength of the MicroCAT system has not been evaluated at all in the NTC implementation. Fortunately, some of the commercial sites in which the MicroCAT Testing System is used have provided more thorough tests of the system's adaptive testing capabilities. Even there, however, it may be several years before all of the extensive capabilities of the MicroCAT Testing System are given a challenging test.

FUTURE PLANS FOR DIAGNOSTIC TESTING

The MicroCAT Testing System has not yet been used for diagnostic testing; insufficient data have been collected to allow diagnostic tests to be developed. The programs are ready to implement such testing, however.

MicroCAT does not include the diagnostic testing strategies because they are still under development and not widely used. Diagnostic testing will be implemented using the custom interface included in MicroCAT. The custom interface allows users to link FORTRAN or Pascal procedures to MicroCAT. New scoring procedures can be included this way and are treated by MicroCAT in a manner similar to the standard scoring procedures (i.e., they are executed each time a score is needed).


Similarly, test execution can jump directly to a custom procedure through the execution of a procedure call in the test specification.
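The report says only that the real hooks are FORTRAN or Pascal procedures linked into MicroCAT. The Python sketch below is a stand-in that shows the calling pattern such a scoring hook might take, with the erroneous-rule-spotting idea of Tatsuoka and Tatsuoka (1983) reduced to a hypothetical lookup table.

# Illustrative stand-in for a custom scoring procedure. The real
# interface links FORTRAN or Pascal into MicroCAT; only the calling
# pattern is sketched here, and the rule table is hypothetical.
def diagnostic_score(response_pattern, rule_table):
    """Called whenever a score is needed, like the standard procedures.
    'response_pattern' is a sequence of right/wrong flags; 'rule_table'
    maps patterns to diagnoses of erroneous rules of operation."""
    return rule_table.get(tuple(response_pattern), "no erroneous rule identified")

# Hypothetical usage: one pattern is diagnosed as a consistently
# applied erroneous rule.
rules = {(True, False, False): "drops the sign when subtracting"}
print(diagnostic_score([True, False, False], rules))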

Using these custom interfaces, programmers at the University of Illinois will develop and revise the diagnostic procedures as needed. No modification to the MicroCAT Testing System itself will be required.


REFERENCES

Assessment Systems Corporation. (1984). User's manual for the MicroCAT Testing System (Research Rep. No. ONR-85-1). St. Paul, MN: Author.

Tatsuoka, K. K., & Tatsuoka, M. M. (1983). Spotting erroneous rules of operation by the individual consistency index. Journal of Educational Measurement, 20(3), 221-230.

Weiss, D. J. (1982). Improving measurement quality and efficiency with adaptive testing. Applied Psychological Measurement, 6(4), 473-492.

Weiss, D. J. (1985). Adaptive testing by computer. Journal of Consulting and Clinical Psychology, 53(6), 774-789.


Distribution List

Personnel Analysis Division, AF/MPXA
5C360, The Pentagon
Washington, DC 20330

Air Force Human Resources Lab
AFHRL/MPD
Brooks AFB, TX 78235

Dr. Earl A. Alluisi
HQ, AFHRL (AFSC)
Brooks AFB, TX 78235

Dr. Erling B. Andersen
Department of Statistics
Studiestraede 6
1455 Copenhagen
DENMARK

Dr. Phipps Arabie
University of Illinois
Department of Psychology
603 E. Daniel St.
Champaign, IL 61820

Technical Director, ARI
5001 Eisenhower Avenue
Alexandria, VA 22333

Dr. Eva L. Baker
UCLA Center for the Study of Evaluation
145 Moore Hall
University of California
Los Angeles, CA 90024

Dr. Isaac Bejar
Educational Testing Service
Princeton, NJ 08450

Dr. Menucha Birenbaum
School of Education
Tel Aviv University
Tel Aviv, Ramat Aviv 69978
ISRAEL

Dr. Arthur S. Blaiwes
Code N711
Naval Training Equipment Center
Orlando, FL 32813


Dr. R. Darrell Bock
University of Chicago
Department of Education
Chicago, IL 60637

Cdt. Arnold Bohrer
Sectie Psychologisch Onderzoek
Rekruterings-En Selectiecentrum
Kwartier Koningen Astrid
Bruijnstraat
1120 Brussels, BELGIUM

Dr. Robert Breaux
Code N-095R
NAVTRAEQUIPCEN
Orlando, FL 32813

Dr. Robert Brennan
American College Testing Programs
P.O. Box 168
Iowa City, IA 52243

Dr. Patricia A. Butler
NIE Mail Stop 1806
1200 19th St., NW
Washington, DC 20208

Mr. James W. Carey
Commandant (G-PTE)
U.S. Coast Guard
2100 Second Street, S.W.
Washington, DC 20593

Dr. James Carlson
American College Testing Program
P.O. Box 168
Iowa City, IA 52243

Dr. John B. Carroll
409 Elliott Rd.
Chapel Hill, NC 27514

Dr. Robert Carroll
NAVOP 01B7
Washington, DC 20370

Dr. Norman Cliff
Department of Psychology
Univ. of So. California
University Park
Los Angeles, CA 90007


Distribution List (Continued)

Director, Manpower Support and Readiness Program
Center for Naval Analysis
2000 North Beauregard Street
Alexandria, VA 22311

Dr. Stanley Collyer
Office of Naval Technology
Code 222
800 N. Quincy Street
Arlington, VA 22217-5000

Dr. Hans Crombag
University of Leyden
Education Research Center
Boerhaavelaan 2
2334 EN Leyden
The NETHERLANDS

CTB/McGraw-Hill Library
2500 Garden Road
Monterey, CA 93940

Dr. Dattprasad Divgi
Center for Naval Analysis
4401 Ford Avenue
P.O. Box 16268
Alexandria, VA 22302-0268

Dr. Hei-Ki Dong
Ball Foundation
800 Roosevelt Road
Building C, Suite 206
Glen Ellyn, IL 60137

Defense Technical Information Center
Cameron Station, Bldg 5
Alexandria, VA 22314
Attn: TC (12 Copies)

Dr. Stephen Dunbar
Lindquist Center for Measurement
University of Iowa
Iowa City, IA 52242

Dr. James A. Earles
Air Force Human Resources Lab
Brooks AFB, TX 78235

Dr. Kent Eaton
Army Research Institute
5001 Eisenhower Avenue
Alexandria, VA 22333

Dr. John M. Eddins
University of Illinois
252 Engineering Research Laboratory
103 South Mathews Street
Urbana, IL 61801

Dr. Susan Embretson
University of Kansas
Psychology Department
Lawrence, KS 66045

ERIC Facility-Acquisitions
4833 Rugby Avenue
Bethesda, MD 20014

Dr. Benjamin A. Fairbank
Performance Metrics, Inc.
5825 Callaghan
Suite 225
San Antonio, TX 78228

Dr. Leonard Feldt
Lindquist Center for Measurement
University of Iowa
Iowa City, IA 52242

Dr. Richard L. Ferguson
American College Testing Program
P.O. Box 168
Iowa City, IA 52240

Dr. Gerhard Fischer
Liebiggasse 5/3
A 1010 Vienna
AUSTRIA

Prof. Donald Fitzgerald
University of New England
Department of Psychology
Armidale, New South Wales 2351
AUSTRALIA

Mr. Paul Foley
Navy Personnel R&D Center
San Diego, CA 92152


Distribution List (Continued)

Dr. Carl H. Frederiksen
McGill University
3700 McTavish Street
Montreal, Quebec H3A 1Y2
CANADA

Dr. Robert D. Gibbons
University of Illinois-Chicago
P.O. Box 6998
Chicago, IL 60680

Dr. Janice Gifford
University of Massachusetts
School of Education
Amherst, MA 01003

Dr. Robert Glaser
Learning Research & Development Center
University of Pittsburgh
3939 O'Hara Street
Pittsburgh, PA 15260

Dr. Bert Green
Johns Hopkins University
Department of Psychology
Charles & 34th Street
Baltimore, MD 21218

Dr. Ronald K. Hambleton
Prof. of Education & Psychology
University of Massachusetts at Amherst
Hills House
Amherst, MA 01003

Ms. Rebecca Hetter
Navy Personnel R&D Center
Code 62
San Diego, CA 92152

Dr. Paul W. Holland
Educational Testing Service
Rosedale Road
Princeton, NJ 08541

Prof. Lutz F. Hornke
Universitat Dusseldorf
Erziehungswissenschaftliches
Universitatsstr. 1
Dusseldorf 1
WEST GERMANY

Dr. Paul Horst
677 G Street, #184
Chula Vista, CA 90010

Mr. Dick Hoshaw
NAVOP-135
Arlington Annex
Room 2834
Washington, DC 20350

Dr. Lloyd Humphreys
University of Illinois
Department of Psychology
603 East Daniel Street
Champaign, IL 61820

Dr. Steven Hunka
Department of Education
University of Alberta
Edmonton, Alberta
CANADA

Dr. Huynh Huynh
College of Education
Univ. of South Carolina
Columbia, SC 29208

Dr. Robert Jannarone
Department of Psychology
University of South Carolina
Columbia, SC 29208

Dr. Douglas H. Jones
Advanced Statistical Technologies Corporation
10 Trafalgar Court
Lawrenceville, NJ 08148

Dr. G. Gage Kingsbury
Portland Public Schools
Research and Evaluation Department
501 North Dixon Street
P.O. Box 3197
Portland, OR 97209-3107

Dr. William Koch
University of Texas-Austin
Measurement and Evaluation Center
Austin, TX 78703


Distribution List (Continued)

Dr. Leonard Kroeker
Navy Personnel R&D Center
San Diego, CA 92152

Dr. Michael Levine
Educational Psychology
210 Education Bldg.
University of Illinois
Champaign, IL 61801

Dr. Charles Lewis
Faculteit Sociale Wetenschappen
Rijksuniversiteit Groningen
Oude Boteringestraat 23
9712 GC Groningen
The NETHERLANDS

Dr. Robert Linn
College of Education
University of Illinois
Urbana, IL 61801

Dr. Robert Lockman
Center for Naval Analysis
4401 Ford Avenue
P.O. Box 16268
Alexandria, VA 22302-0268

Dr. Frederic M. Lord
Educational Testing Service
Princeton, NJ 08541

Dr. James Lumsden
Department of Psychology
University of Western Australia
Nedlands W.A. 6009
AUSTRALIA

Dr. William L. Maloy
Chief of Naval Education and Training
Naval Air Station
Pensacola, FL 32508

Dr. Gary Marco
Stop 31-E
Educational Testing Service
Princeton, NJ 08451

Dr. Clessen Martin
Army Research Institute
5001 Eisenhower Blvd.
Alexandria, VA 22333

Dr. James McBride
Psychological Corporation
c/o Harcourt, Brace, Jovanovich Inc.
1250 West 6th Street
San Diego, CA 92101

Dr. Clarence McCormick
HQ, MEPCOM
MEPCT-P
2500 Green Bay Road
North Chicago, IL 60064

Mr. Robert McKinley
University of Toledo
Department of Educational Psychology
Toledo, OH 43606

Dr. Barbara Means
Human Resources Research Organization
1100 South Washington
Alexandria, VA 22314

Dr. Robert Mislevy
Educational Testing Service
Princeton, NJ 08541

Headquarters, Marine Corps
Code MPI-20
Washington, DC 20380

Dr. W. Alan Nicewander
University of Oklahoma
Department of Psychology
Oklahoma City, OK 73069

Dr. William E. Nordbrock
FMC-ADCO Box 25
APO, NY 09710

Dr. Melvin R. Novick
356 Lindquist Center for Measurement
University of Iowa
Iowa City, IA 52242

Director, Manpower and Personnel Laboratory
NPRDC (Code 06)
San Diego, CA 92152


Distribution List (Continued)

Library, NPRDC
Code P201L
San Diego, CA 92152

Commanding Officer, Naval Research Laboratory
Code 2627
Washington, DC 20390

Dr. James Olson
WICAT, Inc.
1875 South State Street
Orem, UT 84057

Office of Naval Research
Code 1142PT
800 N. Quincy Street
Arlington, VA 22217-5000
(6 Copies)

Special Assistant for Marine Corps Matters
ONR Code 00MC
800 N. Quincy St.
Arlington, VA 22217-5000

Dr. Judith Orasanu
Army Research Institute
5001 Eisenhower Avenue
Alexandria, VA 22333

Wayne M. Patience
American Council on Education
GED Testing Service, Suite 20
One Dupont Circle, NW
Washington, DC 20036

Dr. James Paulson
Department of Psychology
Portland State University
P.O. Box 751
Portland, OR 97207

Dr. Roger Pennell
Air Force Human Resources Laboratory
Lowry AFB, CO 80230

Dr. Mark D. Reckase
ACT
P.O. Box 168
Iowa City, IA 52243

Dr. Malcolm Ree
AFHRL/MP
Brooks AFB, TX 78235

Dr. Carl Ross
CNET-PDCD
Building 90
Great Lakes NTC, IL 60088

Dr. J. Ryan
Department of Education
University of South Carolina
Columbia, SC 29208

Dr. Fumiko Samejima
Department of Psychology
University of Tennessee
Knoxville, TN 37916

Mr. Drew Sands
NPRDC Code 62
San Diego, CA 92152

Dr. Robert Sasmor
Army Research Institute
5001 Eisenhower Avenue
Alexandria, VA 22333

Dr. Mary Schratz
Navy Personnel R&D Center
San Diego, CA 92152

Dr. W. Steve Sellman
OASD(MRA&L)
2B269 The Pentagon
Washington, DC 20301

Dr. Kazuo Shigemasu
7-9-24 Kugenuma-Kaigan
Fujisawa 251
JAPAN

Dr. William Sims
Center for Naval Analysis
4401 Ford Avenue
P.O. Box 16268
Alexandria, VA 22302-0268


Distribution List (Continued)

Dr. H. Wallace Sinaiko
Manpower Research and Advisory Services
Smithsonian Institution
801 North Pitt Street
Alexandria, VA 22314

Dr. Richard Sorensen
Navy Personnel R&D Center
San Diego, CA 92152

Dr. Paul Speckman
University of Missouri
Department of Statistics
Columbia, MO 65201

Dr. Martha Stocking
Educational Testing Service
Princeton, NJ 08541

Dr. Peter Stoloff
Center for Naval Analysis
2000 North Beauregard Street
Alexandria, VA 22311

Dr. William Stout
University of Illinois
Department of Mathematics
Urbana, IL 61801

Maj. Bill Strickland
AF/MPXOA
4E168 Pentagon
Washington, DC 20330

Dr. Hariharan Swaminathan
Laboratory of Psychometric and Evaluation Research
School of Education
University of Massachusetts
Amherst, MA 01003

Mr. Brad Sympson
Navy Personnel R&D Center
San Diego, CA 92152

Dr. Kikumi Tatsuoka
CERL
252 Engineering Research Laboratory
Urbana, IL 61801

Dr. Maurice Tatsuoka
220 Education Bldg
1310 S. Sixth St.
Champaign, IL 61820

Dr. David Thissen
Department of Psychology
University of Kansas
Lawrence, KS 66044

Mr. Gary Thomasson
University of Illinois
Educational Psychology
Champaign, IL 61820

Dr. Robert Tsutakawa
The Fred Hutchinson Cancer Research Center
Division of Public Health Sci.
1124 Columbia Street
Seattle, WA 98104

Dr. Ledyard Tucker
University of Illinois
Department of Psychology
603 E. Daniel Street
Champaign, IL 61820

Dr. Vern W. Urry
Personnel R&D Center
Office of Personnel Management
1900 E. Street, NW
Washington, DC 20415

Dr. David Vale
Assessment Systems Corp.
2233 University Avenue
Suite 310
St. Paul, MN 55114

Dr. Frank Vicino
Navy Personnel R&D Center
San Diego, CA 92152

Dr. Howard Wainer
Division of Psychological Studies
Educational Testing Service
Princeton, NJ 08541


Distribution List (Continued)

Dr. Ming-Mei Wang
Lindquist Center for Measurement
University of Iowa
Iowa City, IA 52242

Mr. Thomas A. Warm
Coast Guard Institute
P.O. Substation 18
Oklahoma City, OK 73169

Dr. Brian Waters
Program Manager
Manpower Analysis Program
HumRRO
1100 S. Washington St.
Alexandria, VA 22314

Dr. David J. Weiss
N660 Elliott Hall
University of Minnesota
75 E. River Road
Minneapolis, MN 55455

Dr. Ronald A. Weitzman
NPS, Code 54Wz
Monterey, CA 92152

Major John Welsh
AFHRL/MOAN
Brooks AFB, TX 78223

Dr. Rand R. Wilcox
University of Southern California
Department of Psychology
Los Angeles, CA 90007

German Military Representative
ATTN: Wolfgang Wildegrube
Streitkraefteamt
D-5300 Bonn 2
4000 Brandywine Street, NW
Washington, DC 20016

Dr. Bruce Williams
Department of Educational Psychology
University of Illinois
Urbana, IL 61801

Dr. Hilda Wing
Army Research Institute
5001 Eisenhower Ave.
Alexandria, VA 22333

Dr. Martin F. Wiskoff
Navy Personnel R & D Center
San Diego, CA 92152

Mr. John H. Wolfe
Navy Personnel R&D Center
San Diego, CA 92152

Dr. George Wong
Biostatistics Laboratory
Memorial Sloan-Kettering Cancer Center
1275 York Avenue
New York, NY 10021

Dr. Wendy Yen
CTB/McGraw Hill
Del Monte Research Park
Monterey, CA 93940
