
Technical Report 490

SIMULATOR FIDELITY: A CONCEPT PAPER

Robert T. Hays

SIMULATION SYSTEMS TECHNICAL AREA

U.S. Army Research Institute for the Behavioral and Social Sciences

November 1980

Approved for public release; distribution unlimited.

U.S. ARMY RESEARCH INSTITUTE FOR THE BEHAVIORAL AND SOCIAL SCIENCES

A Field Operating Agency under the Jurisdiction of the Deputy Chief of Staff for Personnel

JOSEPH ZEIDNER, Technical Director
FRANKLIN A. HART, Colonel, US Army, Commander

NOTICES

DISTRIBUTION: Primary distribution of this report has been made by ARI. Please address correspondence concerning distribution of reports to: U.S. Army Research Institute for the Behavioral and Social Sciences, ATTN: PERI-TP, 5001 Eisenhower Avenue, Alexandria, Virginia 22333.

FINAL DISPOSITION: This report may be destroyed when it is no longer needed. Please do not return it to the U.S. Army Research Institute for the Behavioral and Social Sciences.

NOTE: The findings in this report are not to be construed as an official Department of the Army position, unless so designated by other authorized documents.

REPORT DOCUMENTATION PAGE

Report Number: Technical Report 490
Title: SIMULATOR FIDELITY: A CONCEPT PAPER
Author: Robert T. Hays
Performing Organization: U.S. Army Research Institute for the Behavioral and Social Sciences (PERI-OU), 5001 Eisenhower Avenue, Alexandria, VA 22333
Controlling Office: U.S. Army Research Institute for the Behavioral and Social Sciences (PERI-OU), 5001 Eisenhower Avenue, Alexandria, VA 22333
Report Date: November 1980
Number of Pages: 31
Security Classification: Unclassified
Distribution Statement: Approved for public release; distribution unlimited.
Key Words: Simulation training; simulators; simulator fidelity; fidelity requirements

Abstract: This report reviews the literature on simulator fidelity and shows that there is a great deal of confusion in the usage of the term fidelity. A two-aspect definition of fidelity is proposed which focuses on the physical and functional similarity of the training device to the actual equipment for which training is undertaken. This definition is discussed as it applies to several parameters of the training situation, such as type of task, trainee's stage of learning, or total training context, to determine training effectiveness. Methodological issues in the empirical determination of fidelity requirements are discussed. A pilot research strategy to begin the empirical study of fidelity requirements is outlined.

Technical Report 490

SIMULATOR FIDELITY: A CONCEPT PAPER


Robert T. Hays


Submitted by:

Frank J. Harris, Chief

SIMULATION SYSTEMS TECHNICAL AREA

Approved by:

Edgar M. Johnson, Director
ORGANIZATIONS AND SYSTEMS RESEARCH LABORATORY

U.S. ARMY RESEARCH INSTITUTE FOR THE BEHAVIORAL AND SOCIAL SCIENCES
5001 Eisenhower Avenue, Alexandria, Virginia 22333

Office, Deputy Chief of Staff for Personnel
Department of the Army

November 1980

Army Project Number 2Q162717A790    Simulation Systems


ARI Research Reports and Technical Reports are intended for sponsors of R&D tasks and for other research and military agencies. Any findings ready for implementation at the time of publication are presented in the last part of the Brief. Upon completion of a major phase of the task, formal recommendations for official action normally are conveyed to appropriate military agencies by briefing or Disposition Form.


FOREWORD

The Simulation Systems Technical Area of the Army Research Institute for the Behavioral and Social Sciences (ARI) performs research and development in areas that include training simulation with applicability to military training. Of special interest is research in the area of simulation fidelity requirements. The necessary levels of simulator fidelity must be determined before any training system may be developed and procured for use in the Army training community.

This report surveys the relevant literature on simulator fidelity and recommends a workable definition of simulator fidelity as well as outlining an approach to the empirical determination of simulator fidelity requirements.

This report is responsive to the requirement of the Office of the Project Manager for Training Devices (PM TRADE) and to the Army's Project 2Q162717A790, "Defining Simulation Fidelity Requirements."

JOSEPH ZEIDNER
Technical Director


SIMULATOR FIDELITY: A CONCEPT PAPER

BRIEF

Requirement:

To review the relevant literature on simulator fidelity in order to determine how the term fidelity has been used. From this review, to recommend a workable definition of fidelity and outline a course of empirical investigation to determine the necessary levels of simulator fidelity to insure adequate training effectiveness.

Procedure:

An extensive review of literature on simulator fidelity was conducted. It was shown that there is much confusion in the usage of the term "fidelity." A two-aspect definition of fidelity was proposed which focuses on the physical and functional similarity of the training device to the actual equipment for which training is undertaken. This definition is discussed as it applies to several parameters of the training situation, such as type of task, trainee's stage of learning, or total training context, to determine training effectiveness. Methodological issues in the empirical determination of fidelity requirements are discussed and a pilot research strategy is outlined.

Findings:

Confusion surrounding the use of the term "fidelity" may be substantially reduced if we limit the use of the term and let fidelity refer only to the degree of similarity, both physical and functional, between a training device and the actual equipment for which the training was undertaken.

Utilization of Findings:

This report can be used by researchers in determining how they may design empirical studies to determine necessary levels of simulator fidelity, and by the military training community to determine the nature of training devices that must be designed and procured.


SIMULATOR FIDELITY: A CONCEPT PAPER

CONTENTS

Page

DEFINITIONS OF FIDELITY ..................................................  2
    General Approaches to the Definition of Fidelity ....................  2
    Specific Breakdowns of Fidelity ......................................  2

RECOMMENDED APPROACH TO FIDELITY .........................................  6

REASONS FOR DEPARTURE FROM HIGH FIDELITY .................................  9

FIDELITY AND TRAINING EFFECTIVENESS ...................................... 10
    Fidelity and the Training Context .................................... 11
    Fidelity and Type of Task ............................................ 11
    Fidelity and Stage of Learning ....................................... 11
    Fidelity and Task Analysis ........................................... 12
    Fidelity and the Trainee's Abilities ................................. 14
    Fidelity and Psychological Principles ................................ 14
    Training Effectiveness as Determinant of Fidelity Requirements ....... 15

METHODOLOGICAL ISSUES .................................................... 15

CONCLUSIONS AND RECOMMENDATIONS .......................................... 18

REFERENCES ............................................................... 19

DISTRIBUTION ............................................................. 23

LIST OF TABLES

Table 1. Varieties of fidelity terms .....................................  7
      2. Overlapping fidelity concepts ...................................  8
      3. General relationship between stages of learning, training
         objectives, and types of training devices ....................... 13
      4. Types of training devices associated with stages of learning .... 13

LIST OF FIGURES

Figure 1. Sample design for a pilot study to determine the "best"
          combination of fidelity levels for a given task ................ 17


SIMULATOR FIDELITY: A CONCEPT PAPER

Today's Army is faced with training problems like none previously encountered, in that the high technology of its advanced weapons systems requires correspondingly sophisticated operators and maintainers. This requirement places a heavy burden on training, particularly given the prevalence of large numbers of Category IV enlistees. The traditional approach of training personnel on actual equipment is becoming more and more prohibitive because of the cost of these systems. To cope with this prohibitive expense and to improve the effectiveness of its training, the Army has turned, and will continue to turn, increasingly toward the use of simulators. This movement is being paralleled by the other armed services (Miller, 1974; Spangenberg, 1976; Fink & Shriver, 1978; Purifoy & Benson, 1979; Malec, 1980). Apart from potential cost savings, simulators have several unique instructional advantages over the use of either the classroom or actual equipment for training.

Spangenberg (1976) discusses seven unique advantages of simulation for training. Simulators can (1) provide immediate feedback, (2) increase the number of crises, conflicts, equipment malfunctions, and emergencies to provide the trainee with experience which would be unavailable on actual equipment, (3) compress time so a complex sequence of tasks may be accomplished in the time it would take to run through only one or two tasks on the actual equipment, (4) vary the sequence of tasks to maximize training efficiency, (5) provide guidance and stimulus support to the trainee in the form of prompts and feedback, (6) vary the difficulty level to match the skill level of each individual trainee, and (7) provide the trainee with an overview from which the trainee may form an overall understanding of the whole situation. These advantages, in addition to their potential cost-effectiveness, have led the Army (PM TRADE) to the conclusion that simulators should be utilized to greater advantage in its training program.

But though they are intended to reduce training costs, training simulators may not be cost effective if they duplicate the physical and functional characteristics of operational equipment more precisely than is required for effective transfer of training, i.e., if simulation fidelity exceeds training requirements. However, designing the appropriate level of fidelity is not an easy task. At the very least, a set of consistently used operational definitions of fidelity is required. The term fidelity has been used with a variety of meanings, and it is vital that we agree on a workable definition if we are to develop user-oriented guidance in the design, acquisition, and deployment of training simulators. This paper attempts to develop such a definition of fidelity by reviewing and synthesizing the relevant literature. The paper then attempts to place the concept in the proper context of the Army's training needs. Finally, it outlines a variety of strategies for implementing further research to determine specific fidelity requirements for the wide range of tasks carried out by the modern Army.

DEFINITIONS OF FIDELITY

The literature on simulation for training discusses the problem of the necessary levels of fidelity in simulators. The concept of fidelity is unclear because it has been used in a wide variety of ways with a wide variety of meanings. This section surveys the relevant literature and delineates the most widely used definitions of fidelity. Some definitions are very general while others tend to be more specific.

General Approaches to the Definition of Fidelity

Several recent discussions of training simulation have taken a general approach to the definition of fidelity. Two instances will serve as exemplars of this approach. Miller (1974) states that fidelity refers to the accuracy of reproduction of the actual equipment by the simulator; it is the degree to which a simulator resembles the real operational equipment or situation. An even more general approach is taken by Malec (1980). He states that fidelity has been achieved if activity on the simulator is "sufficiently like exercising the actual equipment" (Malec, 1980, p. 16). Neither of these definitions is sufficiently precise or operational to be useful in the development of guidance for simulator designers. We must know what "similar" means or what being "sufficiently like" the actual equipment entails. There have been other definitions of fidelity which are much more specific than those above.

Specific Breakdowns of Fidelity

One of the most frequently cited breakdowns of the concept of fidelity was provided by Kinkade and Wheaton (1972). They detail three types of fidelity: equipment, environmental, and psychological.

•  Equipment fidelity is the degree to which the simulator duplicates the appearance and "feel" of the operational equipment.

•  Environmental fidelity is the degree to which the simulator duplicates the sensory stimulation (excluding control feel) which is received from the task situation.

•  Psychological fidelity is the degree to which the simulator is perceived by the trainee as being a duplicate of the operational equipment and the task situation.

(Kinkade & Wheaton, 1972, p. 679)

Kinkade and Wheaton provide examples of each of these types of fidelity. Equipment fidelity would be high for a driver trainer designed to teach driving if "an actual automobile cab, with steering wheel, steering wheel feedback dynamics, speedometer, fuel gauge, etc." (Kinkade & Wheaton, 1972, p. 679) were included. Environmental fidelity would be high if the trainer provided "motion cues and a three-dimensional, dynamic visual presentation of the external world (i.e., the road, trees, sky, etc.)" (Kinkade & Wheaton, 1972, p. 679). Finally, the level of psychological fidelity would be high if the trainee perceives the simulator as being highly realistic, even though it might deviate substantially from the actual equipment it is supposed to represent. These three types of fidelity do not cover all aspects of the concept.

A series of four papers (Wheaton, Mirabella, & Farina, 1971; Wheaton & Mirabella, 1972; Mirabella & Wheaton, 1974; Prophet & Boyd, 1970) discuss task fidelity. Task fidelity refers to the correspondence between tasks performed on the actual equipment and tasks performed on the training simulator. A similar behavioral approach to fidelity is taken by Matheny (1978). We thus have four facets to the concept of fidelity so far. One facet deals with the physical configuration of the simulator (equipment fidelity), one deals with the total context which the simulator tries to duplicate (environmental fidelity), one takes a perceptual viewpoint (psychological fidelity), and finally we have a behavioral approach (task or behavioral fidelity). Some approaches do not break the concept of fidelity down quite so much.

Fink and Shriver (1978) discuss only two types of fidelity--functional and physical fidelity. Functional fidelity is the attempt to represent faithfully the stimulus and response options provided by all or portions of a piece of equipment. The previous example of the motion and visual cues in a driver trainer would also be an example of functional fidelity. Physical fidelity is the attempt to represent accurately the appearance and "feel" of the actual equipment. The example of the parts included in the driver trainer is also an example of physical fidelity. Fink and Shriver's designations of functional and physical fidelity correspond to Kinkade and Wheaton's designations of environmental and equipment fidelity respectively. The term "psychological fidelity" is not used by Fink and Shriver. This may be because they consider it a function of the other two types of fidelity.

Slenker and Cream (1977) not only leave out psychological fidelity but physical fidelity as well. They discuss only functional fidelity. This they define as the attempt to duplicate all of the stimulus conditions of the actual equipment. They apparently consider functional fidelity to encompass physical fidelity. This approach would therefore be guided by precise stimulus analysis to ensure that the simulator provided the same cues as provided by the actual equipment.

Freda (1979) defines fidelity as having two aspects. Physical fidelity is the "engineering (hardware) representation of features in the operational equipment" and psychological fidelity is the "behavioral (functional) representation of the information processing demands of the operational equipment" (Freda, 1979, p. 2). This approach approximates that of Fink and Shriver (1978). Freda's physical fidelity agrees with physical fidelity as defined by Fink and Shriver. Freda's psychological fidelity is very close to Fink and Shriver's functional fidelity. Any differences between these two approaches are those of emphasis. Freda appears to focus more closely on the trainee through the behavioral aspects of "psychological" fidelity, while Fink and Shriver focus on the hardware through the stimulus and response aspects of "functional" fidelity. Since one may define behavior as stimuli and responses, it is not difficult to see the parallel between these two approaches.

It can be easily seen that the term fidelity has been used in a variety of contexts. Hardware, environmental, perceptual, and behavioral applications of the term fidelity tend to obscure one's understanding of the term. In addition, different "types" of fidelity are used to label the same concept, such as equipment and physical fidelity or environmental and functional fidelity. The application of the term "fidelity" in so many contexts and in so many variations constitutes an overextension of the term and negates the usefulness of the concept. We need more consistency and clarity in our definitions of fidelity concepts if they are to be useful in the design of training simulators. Perhaps fidelity concepts have been developed too academically and our efforts should be directed to other areas that are oriented more toward applications.

Often it is possible to gain greater understanding of a concept if we study the way it is used by practitioners as opposed to researchers. In other words, let us now change the direction of our focus from the academic side to the more pragmatic approach taken by contractors who actually attempt to build simulators and by the armed services which attempt to provide user guidance in evaluating and procuring the devices that the contractors build.

Simulator developers have not been much more successful in consistently conceptualizing fidelity. Consider for example the work of four potential contractors who submitted designs for a proposed maintenance training simulator, the Army Maintenance Training and Evaluation Simulation System (AMTESS).

Hughes Aircraft Corporation (1980) takes a two-factor approach to the definition of fidelity. They define physical fidelity as the degree of realism presented in every sensory mode. This seems to be a combination of both of Fink and Shriver's distinctions, physical and functional fidelity. Realism of simulator controls, as well as realism in the way the controls are displayed and the total environment in which the simulator is used, would be indicative of physical fidelity as the term is used by Hughes. Hughes also uses the term "psychological fidelity" in a new sense. It is defined as how fully the training environment exercises the mental skills required in performing the job tasks. Other distinctions of fidelity are also made, such as behavioral fidelity, engineering fidelity, and equipment fidelity, but it is not clear (at least to this reviewer) how these terms are used by Hughes.

Seville Research Corporation (1980) defines fidelity as the details of the characteristics of the equipment or item which are represented in the simulation and the "mode" in which those details are represented, and which are specifically included for training purposes. This approach analyzes fidelity in terms of the entire training context. The equipment characteristics may need to be modified to deal with different tasks or stages in the training process. They emphasize, however, that the question of fidelity only concerns those characteristics which need to be included to enhance training effectiveness and not those related to maintainability, reliability, or cost.


Grumman Aerospace Corporation (1980) does not specifically define fidelity. They do allude to its definition by stating that fidelity requirements analysis specifies the degree or extent of physical and functional equipment similarity required to achieve satisfaction of the training objectives. They thus focus on physical and functional similarities of the simulator to the actual equipment in defining fidelity. They emphasize the need to analyze behavioral tasks in terms of the necessary cues which the simulator must provide to enable the trainee to discriminate among the cues provided by the actual equipment.

Honeywell Corporation (1980) does not attempt to define fidelity. They simply assume that it means the similarity of the simulator to the actual equipment. They state that it is important to determine the minimum level of fidelity necessary to train required tasks.

It can be easily seen that these practitioners are more concerned with the development of equipment than with the development of conceptual definitions. They are willing to assume that fidelity somehow deals with the physical and functional similarity between the simulator and the actual equipment and then to design their equipment to be as similar as necessary to train to criterion levels. This approach works for the construction of simulators.

It is more difficult for the individuals who are responsible for procuring these devices because they must be able to judge the output of the contractor before they invest huge sums of money to develop the proposed training simulators. It is therefore important that they determine their fidelity requirements before attempting to evaluate proposed training simulators. There are several guidance aids which attempt to assist in this process.

Although several documents have been developed to provide guidance to the individuals responsible for procuring training devices, the limited progress in conceptualizing fidelity is still seen. For example, in the Training Device Requirements Guide (1979), the concept of fidelity is discussed only in terms of the physical similarity of the displays (cues) and controls (responses) between the simulator and the actual equipment (pp. 153-160). In this guide, the term fidelity is not used. Functional similarity, which compares the operator's behavior in terms of information flow from each display to the operator and from the operator to each control, is also emphasized, but again without the use of the term fidelity. This document makes it very clear that the most important factor is not the physical similarity of the simulator "but whether the operator acts on the same amount of information in the same way in both operational and training situations" (p. 157). A simulator might be physically dissimilar to the actual equipment (i.e., a flat panel device) and still provide the information the trainee requires to perform in the operational setting. This document provides guidelines for the analysis of tasks to be trained and also for the analysis and evaluation of the simulator in terms of physical and functional similarity.

A similar approach is taken by the Interservice Procedures (1975), a TRADOC procedures pamphlet. Although this publication uses the term "fidelity" in a different context than discussed previously (here it refers to how well the actions, conditions, cues, and standards of the Job Performance Measures approximate those of the tasks), it nevertheless acknowledges the importance of task analysis in the development of training devices. This pamphlet provides guidelines for the analysis of physical characteristics and behavioral requirements.

An approach which avoids the use of the concept of fidelity in the selection of training devices is the TRAINVICE II model (Swezey & Evans, 1980). This model incorporates techniques from both of the previous publications to analyze the physical and functional characteristics of the training devices and also the behavioral requirements which must be trained. As with both of the other guidebooks discussed, this model focuses upon the importance of both the physical and the functional characteristics of the simulator or other training device. The term "fidelity" is not employed in this discussion, and the model compares simulator characteristics not with operational equipment characteristics but with learning guidelines.

An additional guidebook for training device design was developed by Honeywell for the Air Force Human Resources Laboratory (Miller, McAleese, & Erickson, 1977). This guidebook defines fidelity as "the degree of correspondence between operational and simulation devices in terms of cues, responses, and actions that can be performed" (pp. 51-52). According to this guidebook, the level of fidelity may be determined by task analysis. Tasks are analyzed in terms of criticality, frequency, and difficulty. Task statements are developed and subsequently used to determine the necessary level of fidelity for the device designed to train the tasks.

One other publication which addresses this issue is provided by the Air Force Human Resources Laboratory (Purifoy & Benson, 1979). This model discusses four considerations which must be evaluated when discussing the appropriate levels of simulator fidelity. They are environmental conditions, stimuli, response situations, and control-display relationships. In this guidebook, fidelity is viewed in the whole training context. It is assumed that fidelity cannot be isolated but must be incorporated into an overall training requirements analysis. Here we must not only deal with physical and functional similarity but also with how similarity levels interact with other variables in the training situation. A similar conclusion is reached by Eddowes and Waag (1980). They define fidelity as the physical similarity of the learning environment to the performance environment and feel that a simulator must be incorporated into the total training context if it is to be effective.

RECOMMENDED APPROACH TO FIDELITY

Table 1 shows the variety of fidelity terms that were discussed previously. It can easily be seen that many terms have been used in connection with the term "fidelity." Not only have many different terms been used, but various fidelity concepts have been described with different labels. Table 2 shows how several of the same concepts have been labeled with various terms. Variations in terminology and multiple labeling of concepts tend to confuse the issue.

Table 1

Varieties of Fidelity Terms

Author/publication                           Fidelity terms used

Kinkade & Wheaton (1972)                     Equipment fidelity; environmental
                                             fidelity; psychological fidelity
Wheaton, Mirabella, & Farina (1971)          Task fidelity
Matheny (1978)                               Behavioral fidelity
Fink & Shriver (1978)                        Physical fidelity; functional fidelity
Slenker & Cream (1977)                       Functional fidelity
Freda (1979)                                 Physical fidelity; psychological fidelity
Seville (1980)                               Fidelity (total context)
Grumman (1980)                               Physical fidelity requirements;
                                             functional fidelity requirements
Honeywell (1980)                             Similarity
Training Device Requirements Guide (1979)    Physical similarity; functional similarity
Interservice Procedures (1975)               Fidelity of job performance measures
Miller, McAleese, & Erickson (1977)          Degree of correspondence (cues,
                                             responses, actions)
Purifoy & Benson (1979)                      Fidelity (total context)
Eddowes & Waag (1980)                        Physical similarity

Table 2

Overlapping Fidelity Concepts

Concept         Fidelity terms used           Author/publication

Physical        Physical                      Fink & Shriver (1978)
fidelity        Equipment                     Kinkade & Wheaton (1972)
                Functional                    Slenker & Cream (1977)
                Physical                      Freda (1979)
                Physical                      Hughes (1980)
                Fidelity (context)            Seville (1980); Purifoy & Benson (1979)
                Physical requirements         Grumman (1980)
                Physical similarity           Honeywell (1980); Training Device
                                              Requirements Guide (1979); Eddowes &
                                              Waag (1980)
                Physical correspondence       Miller, McAleese, & Erickson (1977)

Functional      Functional                    Fink & Shriver (1978)
fidelity        Environmental                 Kinkade & Wheaton (1972)
                Functional                    Slenker & Cream (1977)
                Psychological                 Freda (1979)
                Fidelity (context)            Seville (1980); Purifoy & Benson (1979)
                Functional requirements       Grumman (1980)
                Functional similarity         Honeywell (1980); Training Device
                                              Requirements Guide (1979)
                Functional correspondence     Miller, McAleese, & Erickson (1977)

Behavioral      Task fidelity                 Wheaton, Mirabella, & Farina (1971)
                Behavioral fidelity           Matheny (1978)
                Psychological fidelity        Hughes (1980)
                Job performance measure       Interservice Procedures (1975)

Perceptual      Psychological                 Kinkade & Wheaton (1972)

It is important to realize that the real issue is the acquisition by the trainees of skills which they may use on the actual equipment. We are thus concerned with behavior, and it is important that we not confuse fidelity with behavior. Fidelity has been defined in terms that cover the entire gamut from the physical characteristics of the training simulator to the perceptions of the trainees and their behaviors. In the opinion of this author, the term "fidelity" should be restricted to descriptions of the configuration of the equipment and not be used when discussing behaviors. We must ultimately measure and evaluate behaviors, but there are many concepts already available, such as sensation, perception, learning, retention, or reinforcement, to deal with these. The issue of fidelity only becomes muddled if we attempt to use the same term to cover all the various aspects of the training situation. This is not to say we should throw out the remainder of the gamut; rather, we should use labels for these concepts that do not confuse them with fidelity.

Of the approaches discussed above, the one which best isolates the equipment characteristics from the behaviors of trainees on that equipment is the approach of Fink and Shriver (1978). As was stated previously, Fink and Shriver discuss only functional and physical fidelity. In both functional and physical fidelity we are concerned with the equipment on which training is to proceed. The following definition of fidelity is therefore proposed:

Fidelity is the degree of similarity between the simulator and the equipment which is simulated. It is a measurement of the physical characteristics of the simulator (physical fidelity) and the informational or stimulus and response options of the equipment (functional fidelity).

With this restricted use of the term fidelity in mind, let us now discuss why the level of simulator fidelity need not necessarily be as high as possible.

REASONS FOR DEPARTURE FROM HIGH FIDELITY

In his considerations for the design of simulators, Miller (1974) states that studies have never shown that high fidelity is associated with poorer training. This statement may not be true. In a study by the Air Force (Martin & Waag, 1978), it was shown that flight simulators with very high fidelity (6 degrees of motion) provided too much information for novice trainees and actually detracted from simulator efficiency. There are also other considerations which force the designers and procurers of training simulators to desire lower levels of fidelity.

Cox et al. (1965) studied the effects of a wide variety of fidelity levels on the training of fixed-procedures tasks. They concluded that the requirements for fidelity are really quite low as long as the controls and displays of the device remain clearly visible to the individual trainee and as long as the functional relations between parts are maintained. They are careful to state that these conclusions apply only in a fixed-procedures task paradigm. It is probably safe to conclude that at least some departures from high fidelity will not produce detrimental effects in the training effectiveness of a simulator on other types of tasks. Since high fidelity is associated with higher simulator costs, it is prudent to determine just how much fidelity is necessary in a training simulator.

Blaiwes et al. (1973) state four reasons for departure from perfect physical fidelity. The first deals with training effectiveness. It has been shown (Martin & Waag, 1980) that it is contrary to good training practice to make training an exact duplicate of certain real jobs. The training situation affords the opportunity for immediate feedback and enhanced stimulus cues which aid in training but which may not be available on actual equipment. When these features are incorporated in a simulator, the level of fidelity is lowered.

A second reason is the cost effectiveness of lower fidelity simulators. A simulator may be less costly than the actual equipment, and if it can be designed with a lower level of fidelity and still provide adequate transfer of training, even more money can be saved.

The third reason for departure from perfect fidelity is the safety of the training. It may be too dangerous to train people on the actual equipment. Not only could individuals be injured, but the actual equipment may be damaged if certain malfunctions are induced for the purposes of training.

The final reason given by Blaiwes et al. concerns the technological barriers to duplicating the operational environment. It may not always be possible to produce a training simulator that exhibits perfect fidelity. The level of fidelity necessary to produce trained personnel must be determined by accounting not only for the previous considerations but also for considerations inherent in the training context. Training context is defined as the way in which a training simulator is incorporated into a specific program of instruction. Although we may see fidelity as a central question in any attempt to design or evaluate a training simulator, it is not. The central question is not its fidelity but its training effectiveness. As we shall see, fidelity interacts with several parameters to determine training effectiveness.

FIDELITY AND TRAINING EFFECTIVENESS

The question of the necessary level of fidelity has been asked ever since simulators began to be used in training. Not much progress was made in determining fidelity requirements until it was realized that fidelity is not really the question at all. Bunker (1978) states that progress was made only when, "instead of pondering how to achieve realism, we should ask how to achieve training" (p. 291).

The same point was made by Kinkade and Wheaton (1972) several years earlier: "The overall level of fidelity required in a training device is partially determined by the desired amount of transfer of training" (p. 679).

What these statements tell us is that fidelity is not a concept that may be discussed in isolation. The effect of fidelity on training effectiveness is not simple. It is modified by the total context in which the equipment is used.


Fidelity and the Training Context

Micheli (1972) makes the point that it may be more important how a device is used than how it is designed. Montemerlo (1977) states that the training effectiveness of a simulator is a function of the total training environment and not just the characteristics of a particular piece of equipment. Wheaton et al. (1976) concluded after an extensive review of fidelity literature that fidelity per se is not sufficient to predict device effectiveness and that the effect of fidelity varies as a function of the type of task to be trained. It becomes increasingly clear that we cannot productively deal with the concept of fidelity in isolation but rather as a function of the total training context--which includes the training tasks, the stage of learning of the trainees, and the instructional system designed to train the tasks. Training effectiveness (i.e., the amount of transfer of training from the simulator to the actual equipment in a specific context) may be used to determine necessary simulator fidelity levels.

Fidelity and Type of Task

Several authors have made the point that we must define necessary fidelity levels in terms of the tasks which must be trained. Different tasks may require different levels of fidelity. Fink and Shriver (1978) distinguish between the necessary instruction for maintenance tasks versus that needed for operational tasks. Maintenance training should entail "instruction in decision processes to a greater degree than operations training. This instruction must include formulation of decision rules, identification of decision alternatives, and actual decision making" (p. 5). The fidelity level necessary for decision making may be very different than that necessary for operating the controls on a piece of equipment. The important point here is that we are attempting to train individuals to behave in certain ways and that the ultimate behaviors must guide our choice of fidelity levels. Miller (1980) states the point in this way:

While physical identity of operational and training equipment does facilitate transfer, the controlling factor is not physical congruence but the (sometimes subtle) cues that guide the behavior involved in the performance of a task. Careful analysis and exploitation of these cue/behavior patterns is the basis of all synthetic or simulation-based approaches to training (p. 5).

Stated another way, "The important point is that the behavioral requirements dictate the physical characteristics rather than some perceived physical fidelity dictating behavior" (Matheny, 1978, p. 4).

What is necessary is for us to bridge the gap between the output behaviors which we desire and the input behaviors with which the trainees arrive.

Fidelity and Stage of Learning

An important consideration in determining the fidelity levels necessary to build this "behavioral bridge" is the stage of learning in which we find the trainee. Fink and Shriver (1978) discuss four stages of learning, each of which has different general training objectives and lends itself to the use of different types of training devices with different levels of fidelity. Table 3 shows these stages of learning and their relationship to training objectives and types of training devices. Fink and Shriver are of the opinion that "no one class of training aid or devices need carry the entire training load" (p. 22). They feel it is more cost effective to train the first stages of learning on low fidelity devices before switching to the more expensive simulators required in later learning stages. Kinkade and Wheaton (1972) make the same point. They delineate five stages of training for which different training devices are appropriate. Table 4 shows the types of training devices associated with these five stages of training.

Fidelity and Task Analysis

We can see that fidelity levels cannot be determined outside of the training context. The determination of the necessary fidelity becomes a repetitive process with the analysis of tasks and behavior objectives as its guiding principles. "The major decisions in this process are selection of the subset of tasks that will actually be trained in the device and determination of the degree of fidelity necessary to train each of the tasks in the subset" (Eggemeir & Cream, 1978, p. 18). Task analysis therefore is the important first step in determining not only fidelity requirements but also whether a simulator is necessary at all.

Tasks have been analyzed in various ways. Smode (1971) analyzes tasks as training objectives: what is to be trained, under what conditions the training will occur, and the standard of performance desired at the end of training. For each objective, tasks are stated in terms of performance elements, how these will be measured, and the conditions under which performance occurs. Smode provides a detailed strategy for developing these training objectives. Similar strategies are provided in the guidebooks which were discussed above (Interservice Procedures, 1975; Lenzychi & Finley, 1980; Training Device Requirements Guide, 1979).

Another approach to task analysis is to rank each task in terms of its criticality, frequency, and difficulty (CFD) (Slenker & Cream, 1977; Eggemeir & Cream, 1978; Freda, 1979). Tasks that are rated high on these dimensions would be included in the training device, with the fidelity level determined by a repetitive process. Freda (1979) also uses the CFD approach to provide a framework for analyzing learning tasks in terms of the information processing demands on the individual in the operational setting. "These information processing demands can be viewed as a sequential flow of three information processing stages" (p. 6).

The first stage is sensory input and "refers to the degree of CFD involved in the apprehension of operational stimulus parameters for supra-threshold input processing" (p. 9). The second stage is central processing and "refers to the degree of CFD involved in using cognitive skills and strategies for selecting the appropriate psychomotor output based on the sensory input" (p. 9). The third stage is psychomotor output and "refers to the degree of CFD involved in the expression of the appropriate behavioral response" (p. 9).


Table 3

General Relationship Between Stages of Learning, Training Objectives,
and Types of Training Devices (Fink & Shriver, 1978, p. 23)

Stages of       General training
learning        objectives                      Types of training devices

1st stage       Acquire enabling skills         Demonstrators--wall charts, films,
                and knowledges                  TV, mock-ups, etc.
                                                Nomenclature & parts location trainers

2nd stage       Acquire uncoordinated           Part-task trainers
                skills and unapplied            Procedures trainers
                knowledges

3rd stage       Acquire coordinated             Troubleshooting logic trainers
                skills and ability to           Job segment trainers
                apply knowledges                Skills trainers

4th stage       Acquire job proficiency         Operational equipment
                in job setting                  Actual equipment trainers

Table 4

Types of Training Devices Associated with Stages of Learning
(adapted from Kinkade & Wheaton, 1972, p. 673)

Stages of learning            Types of training devices

Indoctrination                Films, TV, mockup

Procedural training           Photographs, nonfunctional mockup,
                              functional mockup

Familiarization training      Functional equipment
                              Part-task trainer

Skill training                Functional, with man-machine dynamics
                              represented
                              Part-task trainer

Transition training           Part-task simulator


Depending upon the stage of the information processing demands in which a given task is classified and depending upon its CFD rating, the fidelity requirements of a simulator to train that task may be determined (Freda, 1979; Lenzychi & Finley, 1980).

An alternative (or perhaps complementary) approach to task analysis is to analyze the behavioral cues which stimulate the required performance (Pearson & Moore, 1978; Grumman, 1980). In this approach, the degree of fidelity necessary in the simulator is determined on the basis of the amount of information the students must derive from the cue to make the cue worthwhile. "The desired fidelity of a given cue is that level which will enable a trainee discrimination equivalent to that which he could obtain from that cue and similar cues provided by the operational equipment" (Grumman, 1980, p. 128). This approach would make a useful addition to the approach discussed above where tasks are analyzed in terms of CFD. Once it is determined that a given task is above the criterion level on CFD, it may then be analyzed in terms of its behavioral cues to determine the necessary level of fidelity to train the task.
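The screening logic described above can be illustrated with a short sketch. The example below is hypothetical (the task names, ratings, criterion value, and stage labels are invented and are not drawn from Freda, 1979, or Grumman, 1980): tasks are rated on criticality, frequency, and difficulty, those at or above an assumed criterion are retained for the training device, and each retained task would then be passed to a behavioral cue analysis to set its fidelity level.

```python
# Illustrative sketch only: ratings, criterion, and stage labels are hypothetical.
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    criticality: int   # rated 1 (low) to 5 (high)
    frequency: int     # rated 1 to 5
    difficulty: int    # rated 1 to 5
    stage: str         # "sensory input", "central processing", or "psychomotor output"

def cfd_score(task: Task) -> int:
    """Combine the three CFD ratings into a single screening score."""
    return task.criticality + task.frequency + task.difficulty

def tasks_to_train(tasks: list[Task], criterion: int = 9) -> list[Task]:
    """Retain only tasks whose CFD score meets the assumed criterion."""
    return [t for t in tasks if cfd_score(t) >= criterion]

# Hypothetical maintenance tasks for a generic piece of equipment.
candidate_tasks = [
    Task("locate test points", 3, 4, 2, "sensory input"),
    Task("select fault hypothesis", 5, 3, 5, "central processing"),
    Task("replace module", 4, 2, 2, "psychomotor output"),
]

for task in tasks_to_train(candidate_tasks):
    # Each retained task would next be analyzed for its behavioral cues
    # to decide the fidelity level needed to support cue discrimination.
    print(f"{task.name}: CFD = {cfd_score(task)}, stage = {task.stage}")
```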

Fidelity and the Trainee's Abilities

An important area of research which relates to task analysis is the determination of how the skills which are necessary for a given task relate to general abilities. This relationship has been utilized in rating tasks on CFD (Freda, 1979). Tasks have been organized into 11 general categories which are very close to general abilities (Aagard & Braby, 1976). Not only can these general learning categories be used to evaluate tasks in determining necessary fidelity levels, but it is also possible to use the output from such evaluations to determine which individuals should be selected for training in the first place. By testing individuals on basic abilities prior to their inclusion in training programs, it should be possible to screen out those trainees who do not possess the required abilities to benefit from a given training program. Again, we must keep in mind that the issue of fidelity should be secondary to the real issue, which is the production of well trained personnel. The "best" training course with perfect fidelity levels will not be productive if the trainees selected for the course do not have the requisite abilities to master the tasks trained in the course.

Fidelity and Psychological Principles

The determination of the necessary fidelity level for a training simulator may be facilitated if we are guided by established psychological principles in the design of the simulator. Adams (1979) discusses five psychological principles which should underlie the design and use of any simulator. These principles are (1) human learning is dependent on knowledge of results (KOR), so any simulator should incorporate KOR; (2) perceptual learning, or the ability to extract information from stimulus patterns, increases as a result of experience which may be provided on a simulator; (3) if a response is to be made to a stimulus, then the stimulus and the control for response to it must be in the simulator; (4) transfer is highest when the similarity between the simulator and the actual equipment is high, although transfer is possible with low similarity; and (5) the trainee must be motivated if learning is to occur. One additional principle (Grumman, 1980) is relevant. It is the phenomenon of perceptual constancy. This phenomenon occurs when familiar objects are perceived as maintaining their perceived characteristics almost independent of changes in stimulus conditions. This principle allows reduced fidelity in simulators, to a certain point. Each of these principles should be taken into account when the fidelity levels of a simulator are determined.

Training Effectiveness as Determinant of Fidelity Requirements

As was stated previously, training, rather than fidelity, is the real issue. We are in the business of training individuals in specific tasks, and all the hardware and software are simply aids in this training process. The necessary fidelity level becomes easier to determine by using training effectiveness as our criterion. The necessary fidelity level for a given simulator may be empirically determined by the training effectiveness of the simulator. As stated previously, training effectiveness may be operationalized as the amount of transfer of training from the simulator to the actual equipment. However, this method has problems, as pointed out by Adams (1979). The most critical problem is the cost of transfer of training experiments. It would be very costly to design several simulators with different levels of fidelity, train individuals on them, and then evaluate the trainees' performance on actual equipment. However, this is the only method that will provide us with the empirical data necessary to determine fidelity requirements. It may be possible to make this method more cost effective by limiting the number of fidelity levels evaluated and by beginning our investigations with lower levels of fidelity. As Fink and Shriver (1978) wrote: "The answer to maintenance training lies not in how much but how little simulation to use. This is not to say that fewer simulators should be used, merely that low-cost, low-fidelity simulators should be employed where they can be made effective" (p. 26). Eddowes (1978) also said: "As soon as the capabilities of a simulator support practice of a set of training tasks, stop adding to it. Everything else the simulator does will cost more, and while it may not detract from training effectiveness, it probably won't add to it" (p. 53). Once we have determined that a lower fidelity level is adequate for a given simulator, it will not be necessary to evaluate simulators which incorporate higher fidelity levels. Let us now discuss some of the methodological issues involved in empirically determining necessary levels of simulator fidelity.

METHODOLOGICAL ISSUES

Several methodological issues arise when we attempt to empirically determine the necessary fidelity level for a simulator to train a given task. As was discussed earlier, various tasks will require different levels of fidelity, as will different stages of the learning process. It is important that general guidelines be established for required fidelity levels, and the only way to establish these guidelines is through empirical investigations of the relationship between level of fidelity and training effectiveness. These empirical investigations should begin at a general level and proceed to a more specific level. The studies at the general level may be considered pilot studies which will direct later investigations focusing on specific tasks and specific configurations of training simulators.

The first step in the empirical investigation of the relationship between level of fidelity and training effectiveness is to develop a scale to measure level of fidelity. If we follow the definition of fidelity that was developed previously, we will have two factors to measure: the physical characteristics of the simulator (physical fidelity) and the informational or stimulus and response options of the equipment (functional fidelity).

We therefore need to measure the physical and functional aspects of fidelity before we can attempt to empirically determine how these factors relate to the training effectiveness of the simulator. An example of this two-factor approach to the measurement of fidelity is the work of Wheaton, Mirabella, and Farina (1971), Wheaton and Mirabella (1972), and Mirabella and Wheaton (1974). In these three investigations, two scales were used to measure the characteristics of training simulators. Panel layout indexes developed by Fowler, Williams, Fowler, and Young (1968) were used to measure the physical fidelity of the simulators. Another scale, which was assumed to vary independently of the above scale, was used to measure the functional fidelity of the simulators. This scale, called the Display Evaluative Index (DEI), was developed by Siegel, Miehle, and Federman (1962) and is a measure of the effectiveness with which information flows from displays to corresponding controls via the operator. This approach yields two fidelity measures which define the two aspects of fidelity. Depending on other parameters, such as task type and level of training, these aspects would interact and lead to a given level of training effectiveness.

An example of the design for one of these pilot studies is shown in Figure 1. Here we see the two aspects of fidelity (physical and functional) being treated as factors in a 3 x 3 factorial design. Depending on how a fidelity factor fell on the measurement scale chosen for the experiment, each factor would be represented as low, medium, or high. Values in the cells of the design would be derived from measures of proficiency on the actual equipment.

The measurement of training effectiveness is another problem which must be solved if our empirical investigations are to be meaningful. While there are a variety of methodologies and measures that have been used to measure the training effectiveness of simulators, the transfer of training experiment is the most useful for experimental purposes (Adams, 1979). Of the various measures of training effectiveness that may be derived from a transfer of training experiment, the transfer effectiveness ratio (Povenmire & Roscoe, 1971) may be best adapted for this research. It has been used successfully in Army simulator effectiveness research (Holman, 1979). A transfer of training type of study would have to be applied to each training task to determine the "best" combination of physical and functional fidelity for that task. A review of theories and models of transfer of training is provided by Wheaton et al. (1976).
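A small numerical sketch may help make the transfer effectiveness ratio concrete. The computation below follows the commonly cited form of the ratio (savings in trials to criterion on the actual equipment divided by the trials invested in the simulator); all of the numbers are invented for illustration.

```python
def transfer_effectiveness_ratio(control_trials: float,
                                 experimental_trials: float,
                                 simulator_trials: float) -> float:
    """
    Transfer effectiveness ratio: trials to criterion for a control group
    trained only on the actual equipment, minus trials to criterion for a
    group pretrained in the simulator, divided by the simulator trials used.
    """
    return (control_trials - experimental_trials) / simulator_trials

# Hypothetical data: the control group needs 20 trials on the actual
# equipment; a group given 10 simulator trials then needs only 12 trials.
ter = transfer_effectiveness_ratio(20, 12, 10)
print(f"TER = {ter:.2f}")  # 0.80: each simulator trial saved 0.8 equipment trials
```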


Figure 1. Sample design for a pilot study to determine the "best" combination of fidelity levels for a given task. [The figure shows a 3 x 3 design: physical fidelity at low (control absent), medium (drawing), and high (3-D) levels, crossed with functional fidelity at low (doesn't work mechanically), medium (works with no effect), and high (works with effect) levels.]

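A sketch of how the data from such a pilot study might be tabulated is given below. It assumes, purely for illustration, that each cell of the 3 x 3 design in Figure 1 holds the mean proficiency score (measured on the actual equipment) of the trainees assigned to that combination of physical and functional fidelity; the level labels follow Figure 1 and the scores are invented.

```python
# Illustrative sketch: level labels follow Figure 1; all scores are hypothetical.
PHYSICAL_LEVELS = ["low (control absent)", "medium (drawing)", "high (3-D)"]
FUNCTIONAL_LEVELS = ["low (doesn't work)", "medium (works, no effect)",
                     "high (works with effect)"]

# Mean proficiency on the actual equipment for each cell, keyed by
# (functional level, physical level). Values shown are placeholders.
cell_means = {
    ("low (doesn't work)", "low (control absent)"): 41.0,
    ("low (doesn't work)", "medium (drawing)"): 55.5,
    # ... remaining cells would be filled in from the experiment ...
}

def print_design(cells: dict) -> None:
    """Print the 3 x 3 design with whatever cell means are available."""
    for functional in FUNCTIONAL_LEVELS:
        row = []
        for physical in PHYSICAL_LEVELS:
            value = cells.get((functional, physical))
            row.append(f"{value:6.1f}" if value is not None else "   -- ")
        print(f"{functional:26s}" + "  ".join(row))

print_design(cell_means)
```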

Once such pilot studies have narrowed our focus down to a few testable concepts, we may develop training programs which actually use simulators to train individuals. Then we can measure the level of training effectiveness on a given task. Our research efforts should determine the training effectiveness of every type of fidelity configuration. Data from these efforts may then be used to determine which configuration is most cost effective to produce the highest level of training effectiveness on a given task. If lower levels of fidelity produce adequate training effectiveness, then there is no necessity to produce simulators with higher fidelity. As was stated previously, more fidelity than is necessary may not reduce the efficiency of a simulator, but it will probably not improve it either. Just enough fidelity to motivate the trainee while providing him or her with the necessary skills on the actual equipment is the goal toward which we should strive. In this way we provide the most cost effective training for the Army's personnel.
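The selection rule implied here (adopt the least costly configuration whose measured training effectiveness is still adequate) can be sketched as follows. The configurations, costs, effectiveness values, and adequacy criterion are all invented for illustration.

```python
# Illustrative sketch: choose the cheapest configuration that still trains adequately.
# All names and numbers below are hypothetical.
configurations = [
    {"name": "low physical / low functional",   "cost": 1.0, "effectiveness": 0.55},
    {"name": "low physical / high functional",  "cost": 2.5, "effectiveness": 0.82},
    {"name": "high physical / high functional", "cost": 6.0, "effectiveness": 0.85},
]

ADEQUATE_EFFECTIVENESS = 0.80  # assumed criterion for acceptable transfer of training

adequate = [c for c in configurations if c["effectiveness"] >= ADEQUATE_EFFECTIVENESS]
choice = min(adequate, key=lambda c: c["cost"])
print(f"Selected: {choice['name']} "
      f"(cost {choice['cost']}, effectiveness {choice['effectiveness']})")
```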

CONCLUSIONS AND RECOMMENDATIONS

Confusion surrounding the use of the term "fidelity" may be substantially reduced if we limit the use of the term and let fidelity refer only to the degree of similarity, both physical and functional, between a training device and the actual equipment for which the training was undertaken. By dealing with the configuration of the equipment itself, we have something concrete which we can manipulate to change the trainee's behaviors. Other concepts which have been discussed under the label "fidelity" are still useful but should be separated from the concept of fidelity by the use of other labels. We may then ask how fidelity interacts with these concepts. Many researchers have concluded that fidelity, as defined in this paper, interacts with at least four factors: the stage of learning of the trainee, the type of task to be trained, the way the simulator is used and accepted by students and instructors, and the individual sensory and perceptual differences among subpopulations of trainees. Further research needs to be conducted to at least generally determine how the above factors actually interact with the physical and functional configuration of training simulators in producing given amounts of transfer of training. Knowledge about the interaction of these factors with physical and functional fidelity will allow us to determine the principles of simulator training which produce the most effective training program for any given task. Once these principles are quantified, we will then be able to determine not only the most effective simulator configuration but also the training context in which the simulator may be used most effectively.

The proposed definition of fidelity and the research strategy which uses this definition should enable us to quantify the concept of simulator fidelity for any given task. We may then use these data as one source of input to our efforts to prescribe simulator device characteristics for devices not yet produced and also to predict the training effectiveness of devices that already exist. Our efforts will pay off in more cost effective programs of instruction which better train individuals for the needs of the Army.

References

Aagard, J.A. & Braby, R.B. Learning guidelines and algorithms for types of training objectives. TAEG Report No. 23, March 1976.

Adams, J.A. On the evaluation of training devices. Human Factors, 1979, 21(6), 711-720.

Blaiwes, A.S., Puig, J.A., & Regan, J.J. Transfer of training and the measurement of training effectiveness. Human Factors, 1973, 15(6), 523-533.

Bunker, W.M. Training effectiveness versus simulation realism. In the Eleventh NTEC/Industry Conference Proceedings. Naval Training Equipment Center: Orlando, Florida, 1978.

Cox, J.A., Wood, R.O., Boren, L.M. & Thorne, H.W. Functional and appearance fidelity of training devices for fixed-procedures tasks. Technical Report 65-4. Alexandria, VA: Human Resources Research Organization, 1965.

Crawford, M.P. Simulation in training and education. Professional Paper 40-67. Alexandria, VA: HumRRO, 1967.

Eddowes, E.E. Flight simulator fidelity and flying training effectiveness. In Proceedings of a Symposium at the Annual Meeting of the American Psychological Association, August 1978, Toronto, Canada.

Eddowes, E.E. & Waag, W.L. The use of simulators for training in-flight and emergency procedures. NATO Advisory Group for Aerospace Research and Development (AGARD), June 1980.

Eggemeier, F.T., & Cream, B.W. Some considerations in development of team training devices. In D.E. Erwin (Ed.), Psychological Fidelity in Simulated Work Environments: Proceedings of a Symposium at the Annual Meeting of the American Psychological Association, August 1978, Toronto, Canada.

Fink, C.D. & Shriver, E.L. Simulators for maintenance training: Some issues, problems and areas for future research. Technical Report AFHRL-TR-78-27, July 1978. Air Force Human Resources Laboratory, Technical Training Division, Lowry Air Force Base, Colorado.

Fowler, R.L., Williams, W.E., Fowler, M.G., & Young, D.D. An investigation of the relationship between operator performance and operator panel layout for continuous tasks. Report No. AMRL-TR-68-170. Aerospace Medical Research Laboratory, Wright-Patterson Air Force Base, Ohio, 1968.

Freda, J.S. Fidelity in training simulation. Unpublished paper: Army Research Institute, 1979.

Gilbert, T.F. Human Competence: Engineering Worthy Performance. New York: McGraw-Hill, 1978.

Grumman Aerospace Corporation. AMTESS-1 Final Report. April 1980.

Holman, G.L. Training effectiveness of the CH-47 flight simulator. ARI Research Report 1209, May 1979.

Honeywell Corporation. AMTESS-1 Final Report. March 1980.

Hughes Aircraft Company. AMTESS-1 Final Report. March 1980.

Interservice Procedures for Instructional Systems Development. TRADOC Pamphlet 350-40. 1 August 1975.

Kinkade, R.G. & Wheaton, G.R. Training device design. In H.P. Van Cott & R.G. Kinkade (Eds.), Human Engineering Guide to Equipment Design. Washington, D.C.: American Institutes for Research, 1972.

Lenzycki, H.P. & Finley, D.L. How to determine training device requirements and characteristics: A handbook for training developers. Draft Technical Report TR-80. U.S. Army Research Institute, Alexandria, VA, May 1980.

Malec, V.M. Simulation in maintenance training. Unpublished paper: Navy Personnel Research and Development Center, San Diego, CA, 1980.

Martin, E.L. & Waag, W.L. Contributions of platform motion to simulator effectiveness: Study 1 - Basic contact. Interim Report AFHRL-TR-78-15. Air Force Human Resources Laboratory, June 1978.

Matheny, W.G. The concept of performance equivalence in training systems. In D.E. Erwin (Ed.), Psychological Fidelity in Simulated Work Environments: Proceedings of a Symposium at the Annual Meeting of the American Psychological Association, August 1978, Toronto, Canada.

Micheli, G.S. Analysis of the transfer of training, substitution and fidelity of simulation of training equipment. Training Analysis and Evaluation Group, Naval Training Equipment Center, Orlando, Florida, 1972.

Miller, G.G. Some considerations in the design and utilization of simulators for technical training. Technical Report AFHRL-TR-74-65. Air Force Human Resources Laboratory, August 1974.

Miller, K.E. Simulation in maintenance training. Unpublished paper: Naval Training Equipment Center, Orlando, Florida, 1980.

Miller, L.A., McAleese, K.J. & Erickson, J.M. Training Device Design Guide: The Use of Training Requirements in Simulation Design. Air Force Human Resources Laboratory. Contract F33615-76-C-0050, June 1977.

Mirabella, A. & Wheaton, G.R. Effects of task index variations on transfer of training criteria. Technical Report NAVTRAEQUIPCEN 72-C-0126-1. Naval Training Equipment Center, Orlando, Florida, January 1974.

Montemerlo, M.D. Training device design: The simulators versus stimulation controversy. Technical Report NAVTRAEQUIPCEN IH-287. Naval Training Equipment Center, Orlando, Florida, July 1977.

Narva, H.A. Development of a systematic methodology for the application of judgement data to the assessment of training device concepts. Research Memorandum 79-7. US Army Research Institute, Alexandria, Virginia, May 1979.

Pearson, T.E. & Moore, E.O. Analysis of software simulation in computer-based electronic equipment maintenance trainers. TAEG Report No. 57. Training Analysis and Evaluation Group, Orlando, Florida, September 1978.

Prophet, W.W. & Boyd, H.A. Device task fidelity and transfer of training: Aircraft cockpit procedures training. Technical Report 70-10. Alexandria, Virginia: Human Resources Research Organization, July 1970.

Povenmire, H.K. & Roscoe, S.N. An evaluation of ground-based flight trainers in routine primary flight training. Human Factors, 1971, 13(2), 109-116.

Purifoy, G.R. & Benson, E.W. Maintenance Training Simulators Design and Acquisition: Summary of Current Procedures. Technical Report AFHRL-TR-79-23. Air Force Human Resources Laboratory, November 1979.

Seville Research Corporation. AMTESS-1 Final Report. March 1980.

Siegel, A.I., Miehle, W. & Federman, P. Information transfer in display control systems. VI. A manual for the use and application of the display evaluative index (Sixth Quarterly Report). Wayne, PA: Applied Psychological Services, 1962.

Slenker, K. & Cream, B.W. Part-task trainer for the F-106A MA-1 radar/infrared fire control system: Design, specification, and operation. Technical Report AFHRL-TR-77-52, September 1977. Air Force Human Resources Laboratory, Advanced Systems Division, Wright-Patterson Air Force Base, Ohio.

Smode, A.F. Human factors inputs to the training device design process. Technical Report NAVTRADEVCEN 69-C-0298-1. Naval Training Equipment Center, Orlando, Florida, September 1971.

Spangenberg, R.W. Selection of simulation as an instructional medium. In Proceedings, Volume IV: Future of simulators in skills training. First International Learning Technology Conference & Exposition on Applied Learning Technology, Society for Applied Learning Technology, July 21-23, 1976, p. 65.

Swezey, R.W. & Evans, R.A. Guidebook for Users of TRAINVICE II. Alexandria, Virginia: US Army Research Institute, Contract No. MDA-903-79-C-0428, May 1980.

Training Device Requirements Guide. Orlando, Florida: PM TRADE and Ft. Eustis, VA: Army Training Support Center, 5 January 1979.

Wheaton, G.R., Mirabella, A., & Farina, A.J. Trainee and instructor task quantification: Development of quantitative indices and a predictive methodology. Technical Report NAVTRADEVCEN 69-C-0728-1. Naval Training Device Center, Orlando, Florida, January 1971.

Wheaton, G.R. & Mirabella, A. Effects of task index variations on training effectiveness criteria. Technical Report NAVTRAEQUIPCEN 71-C-0059-1. Naval Training Equipment Center, Orlando, Florida, October 1972.

Wheaton, G.R., Rose, A.M., Fingerman, P.W., Korotkin, A.L. & Holding, D.H. Evaluation of the effectiveness of training devices: Literature review and preliminary model. Research Memorandum 76-6. US Army Research Institute for the Behavioral and Social Sciences, April 1976. Contract No. DAHC 19-73-C-0049.
