
U.S. Coast Guard Research and Development Center

1082 Shennecossett Road, Groton, CT 06340-6096

Report No. CG-D-05-01

EVALUATING SIMULATORS FOR ASSESSMENT OF MARINER PROFICIENCY


FINAL REPORT January 2001

DISTRIBUTION STATEMENT A Approved for Public Release

Distribution Unlimited

This document is available to the U.S. public through the National Technical Information Service, Springfield, VA 22161

Prepared for:

U.S. Department of Transportation United States Coast Guard National Maritime Center

Arlington, VA 22203-1804


NOTICE

This document is disseminated under the sponsorship of the Department of Transportation in the interest of information exchange. The United States Government assumes no liability for its contents or use thereof.

The United States Government does not endorse products or manufacturers. Trade or manufacturers' names appear herein solely because they are considered essential to the object of this report.

This report does not constitute a standard, specification, or regulation.

Marc B. Mandler, Ph.D. Technical Director United States Coast Guard Research & Development Center 1082 Shennecossett Road Groton, CT 06340-6096

Technical Report Documentation Page

1. Report No.
CG-D-05-01

2. Government Accession Number

3. Recipient's Catalog No.

4. Title and Subtitle
EVALUATING SIMULATORS FOR ASSESSMENT OF MARINER PROFICIENCY

5. Report Date
January 2001

6. Performing Organization Code
Project Number: 3302.3.2

7. Author(s)
Mireille Raby, Alice M. Forsythe, Marvin C. McCallum, and Myriam W. Smith

8. Performing Organization Report No.
R&DC 150

9. Performing Organization Name and Address
Battelle Seattle Research Center, 4500 Sand Point Way N.E., Seattle, WA 98105-3900
U.S. Coast Guard Research and Development Center, 1082 Shennecossett Road, Groton, CT 06340-6096

10. Work Unit No. (TRAIS)

11. Contract or Grant No.
DTCG39-94-D-E00777, TO 96-F-E00148(0012)

12. Sponsoring Organization Name and Address
U.S. Department of Transportation, United States Coast Guard, National Maritime Center, Arlington, VA 22203-1804

13. Type of Report & Period Covered
Final

14. Sponsoring Agency Code
Director (NMC-4B), National Maritime Center, Arlington, VA 22203-1804

15. Supplementary Notes
The R&D Center's technical point of contact is Dr. Anita M. Rothblum, 860-441-2847, email: [email protected].

16. Abstract (MAXIMUM 200 WORDS)

This report describes a process for evaluating the capability of simulators to support performance-based assessment of mariner proficiencies. The step-by-step approach produces an evaluation protocol for examining the following capabilities of a subject simulator: to provide flexible exercise programming to the assessor; to replicate the characteristics of actual equipment; to simulate the operational conditions required to demonstrate the desired performance; and to support thorough debriefing of the assessment candidate. The general method is illustrated with a case study, examining the capability of two PC-based simulators to support the assessment of mariner performance in Automatic Radar Plotting Aid (ARPA) operation. The approach can be generalized to the evaluation of simulators of other marine equipment and of equipment in other industries, and to the evaluation of simulators for training. The method is compliant with the requirements of the International Maritime Organization (IMO) Seafarers' Training, Certification and Watchkeeping (STCW) Code and with the best practices of Instructional Systems Development (ISD).

17. Keywords
skills, performance tests, simulators, training devices, training management, transfer of training, computer-aided instruction

18. Distribution Statement
This document is available to the U.S. public through the National Technical Information Service, Springfield, VA 22161

19. Security Class (This Report)
UNCLASSIFIED

20. Security Class (This Page)
UNCLASSIFIED

21. No. of Pages
95

22. Price

Form DOT F 1700.7 (8/72) Reproduction of form and completed page is authorized.


ACKNOWLEDGEMENTS

The authors would like to acknowledge several individuals who provided support on this project. Captain John Nunnenkamp and Captain George Sandberg of the U.S. Merchant Marine Academy provided technical support in the early phases of this work. Captain William Bennett, Mr. Perry Stutman, Mr. David Field, and Mr. John Bobb from our sponsoring agency, U.S. Coast Guard, National Maritime Center (NMC), provided ongoing assistance and support during reviews of our methodology. Dr. Anita Rothblum and Ms. Irene Gonin of the U.S. Coast Guard Research and Development Center (R&DC) provided careful reviews of draft reports. Ms. Elizabeth Weaver of the R&DC shepherded the final version through the review process. Dr. Marc Mandler and Mr. Robert Richards, also of the R&DC, provided support and encouragement during this project. Finally, we would like to recognize the influential role of the simulator developers whose products we reviewed. These individuals agreed to take part in a review that was in a developmental stage. Their efforts in supporting our evaluation and explaining their products' capabilities had a far-reaching effect on our methodology and findings. We gratefully acknowledge their participation.


EXECUTIVE SUMMARY

Introduction

The United States Coast Guard (USCG) has introduced significant changes in the requirements for the training and qualifications of mariners. These changes occur in the context of heightened concern for the safety of navigation, increasingly challenging shipboard conditions, and changing international agreements, all of which have resulted in greater demands on the competence of ships' crews. The new requirements emphasize the practical demonstration and critical assessment of mariner performance to ensure competence for licensing. To provide technical support to the USCG National Maritime Center (NMC) in the implementation of these new requirements, the USCG Research and Development Center (R&DC) investigated some of the central issues in assessment. Project findings provide model processes for the development of rigorous and practical performance-based assessment procedures by industry, and guidelines for the critical review of these procedures by the USCG or third-party reviewers.

Objectives and Technical Approach

The primary objective was to develop an approach for evaluating a simulator's capability to support specific, performance-based assessment procedures. The basic assumption of the approach is that performance-based objectives for the assessment of specific mariner proficiencies should serve as the technical foundation for simulator evaluations. That is, the performance that an assessment candidate must demonstrate, and the conditions required for demonstrating that performance, should provide the basis for specifying the required characteristics of a simulator to support a valid assessment. This report documents a step-by-step method for determining the required simulator characteristics and for preparing a protocol to be used for evaluating a simulator. The approach developed is based on: (1) mariner performance requirements; (2) assessment conditions required for demonstrating performance; and (3) operational requirements for the shipboard equipment used by the mariner. This approach can be applied to simulators of varying capabilities and cost. This will help to identify the most cost-effective simulator for a valid assessment of mariner performance.

To test the approach, an assessment of Automatic Radar Plotting Aid (ARPA) operation was used as a case study. The ARPA operation was chosen because it is one of the proficiencies mandated by the STCW Code. STCW-defined mariner performance assessment objectives for ARPA operation were taken as a starting point, and the conditions for their assessment were used as a basis for determining the required simulator characteristics. These requirements were incorporated into a simulator evaluation protocol, which was applied to two commercially available desktop ARPA simulators.

Conclusions

The project demonstrated the feasibility of the approach to simulator evaluation. The ARPA simulator case study successfully identified important differences in the capabilities of two different desktop simulators to support performance-based assessments. Further, the present evaluation protocol explicitly incorporated standards for simulators established by the IMO STCW Code. Thus, the more general simulator evaluation method can be fully consistent with these IMO standards.

The present application of the simulator evaluation method was limited to PC-based ARPA simulators. However, the method is generic and has a broad range of potential applications, such as: more complex ARPA simulators; other bridge and engine room equipment simulators; other maritime simulators (such as vessel traffic system simulators); and simulators used in other industries (e.g., flight simulators, driving simulators, power plant control room simulators). Although the present application focused on the use of simulators for performance assessment, the evaluation of simulators for use in training applications would be similar. Once objectives are established for a training program, simulator evaluation criteria could be developed based on training requirements.

There is a broad range of potential users of the evaluation method. Training institutions (e.g., maritime academies, colleges, and commercial training centers) could use the method for selecting cost-effective simulators to meet their needs. The USCG, other regulatory agencies, or third-party reviewers could adapt the method in developing standardized evaluation procedures for different types of simulators. Simulator manufacturers could use the method to identify the features and capabilities needed in new simulators and in upgrades to existing simulators.

Recommendations

The following actions by the USCG, maritime academies and other training institutions will make the most effective use of the research findings:

• The USCG should make the current methodology widely available to the maritime community and encourage its inclusion in performance-based assessment courses or in "train-the-trainer" courses.

• The simulator evaluation method documented in this report should be applied to a wide range of simulators so as to create a library of simulator evaluation protocols. The USCG should encourage the maritime academies and other appropriate institutions to apply the methodology to other types of simulators, share general lessons learned, model protocols for other types of simulators, and share actual results of these evaluations.

• The USCG should develop, or encourage appropriate institutions to develop, standardized evaluation procedures for various types of simulators. These procedures could include standard scenarios and conditions, as well as guidelines and cut-off scores for accepting or not accepting a simulator or a course based on it.

• The USCG should make the ARPA simulator evaluation protocol widely available and consider requiring its use as a standard evaluation for ARPA simulators used in training courses.


TABLE OF CONTENTS

EXECUTIVE SUMMARY v

TABLE OF CONTENTS vii

LIST OF ILLUSTRATIONS viii

LIST OF TABLES viii

LIST OF ACRONYMS AND ABBREVIATIONS xi

INTRODUCTION 1

Role of Simulators in Performance-Based Assessments 1

Basis for Simulator Evaluation 2

Report Organization 2

A METHOD FOR EVALUATING SIMULATORS USED IN PERFORMANCE-BASED

ASSESSMENT 4

Step 1: Define Performance-Based Mariner Assessment Requirements 5

Step 2: Define Simulator Evaluation Objectives, Evaluation Conditions, and Evaluation Criteria 7

Define Simulator Evaluation Objectives 7

Define Simulator Evaluation Conditions 9

Define Simulator Evaluation Criteria 10

Step 3: Develop Simulator Evaluation Protocol 10

Step 4: Conduct Simulator Evaluation 12

Step 5: Summarize and Analyze Findings 13

ILLUSTRATIVE ANALYSIS OF ARPA SIMULATOR CAPABILITIES 14

CONCLUSIONS AND RECOMMENDATIONS 18

Conclusions 18

Recommendations 19

Simulator Evaluation Method 19

ARPA Simulator Evaluation Protocol 20


TABLE OF CONTENTS (Continued)

REFERENCES 21

APPENDIX A ARPA Simulator Evaluation Objectives, Evaluation Conditions, and Evaluation Criteria A-1

APPENDIX B ARPA Simulator Evaluation Protocol B-1

APPENDIX C Worksheets for Compiling and Analyzing Simulator Evaluation Data C-1

LIST OF ILLUSTRATIONS

Figure 1. Method for evaluating simulators 5

Figure 2. Percentage of criteria met by each ARPA simulator, in four evaluation categories 15

Figure 3. Percentage of assessment objectives supported by each ARPA simulator, in six assessment objective categories 16

LIST OF TABLES

Table 1. Assessment objective categories 6

Table 2. Performance measures and standards for ARPA mariner assessment objective 2.2 7

Table 3. IMO general performance standards for simulators 8

Table 4. Procedures for conducting a simulator evaluation 12

Table 5. Summary of simulator capabilities for evaluation objective 2.1, selection of display presentation, orientation, and vector mode 14

Table A-l. ARPA simulator evaluation objectives in the exercise programming category A-2

Table A-2. ARPA simulator evaluation objectives in the equipment set-up category A-8

Table A-3. ARPA simulator evaluation objectives in the simulation category A-11

Table A-4. ARPA simulator evaluation objectives in the debriefing category A-19

Table B-l. Initial conditions for IMO scenario I B-5

Table B-2. Worksheet for recording ARPA simulator test data for IMO scenario I B-5

Table B-3. Initial conditions for IMO scenario II B-6


LIST OF TABLES (Continued)

Table B-4. Worksheet for recording ARPA simulator test data for IMO scenario II B-6

Table B-5. Initial conditions for IMO scenario III B-7

Table B-6. Worksheet for recording ARPA simulator test data for IMO scenario III B-7

Table B-7. Initial conditions for IMO scenario IV B-8

Table B-8. Worksheet for recording ARPA simulator test data for IMO scenario IV B-8

Table B-9. Vessel data at 00:00 for exercise A (McCallum et al., 2000) B-10

Table B-10. Exercise A - Evaluation worksheet for exercise programming criteria B-12

Table B-11. Exercise A - Evaluation worksheet for equipment set-up criteria B-13

Table B-12. Exercise A - Evaluation worksheet for simulation criteria B-15

Table B-13. Vessel data at 00:00 for exercise B (McCallum et al., 2000) B-20

Table B-14. Exercise B - Evaluation worksheet for exercise programming criteria B-21

Table B-15. Exercise B - Evaluation worksheet for equipment set-up criteria B-22

Table B-16. Exercise B - Evaluation worksheet for simulation criteria B-23

Table B-17. Exercise B - Evaluation worksheet for debriefing criteria B-25

Table B-18. Vessel data at 00:00 for exercise E (McCallum et al., 2000) B-27

Table B-19. Exercise E - Evaluation worksheet for exercise programming criteria B-28

Table B-20. Exercise E - Evaluation worksheet for simulation criteria B-28

Table B-21. Vessel data at 00:00 for exercise F (McCallum et al., 2000) B-30

Table B-22. Exercise F - Evaluation worksheet for equipment set-up criteria B-31

Table B-23. Exercise F - Evaluation worksheet for simulation criteria, first iteration B-31

Table B-24. Exercise F - Evaluation worksheet for simulation criteria, second iteration B-32

Table C-1. Detailed summary of Simulator Y's ability to satisfy debriefing criteria C-2

Table C-2. Worksheet for recording a detailed summary of a simulator's ability to satisfy exercise programming criteria C-3

Table C-3. Worksheet for recording a detailed summary of a simulator's ability to satisfy equipment set-up criteria C-5

Table C-4. Worksheet for recording a detailed summary of a simulator's ability to satisfy simulation criteria C-7

Table C-5. Worksheet for recording a detailed summary of a simulator's ability to satisfy debriefing criteria C-11


LIST OF TABLES (Continued)

Table C-6. General summary of Simulator Y's ability to satisfy simulator evaluation objectives in the debriefing category C-12

Table C-7. Worksheet for summarizing findings of an ARPA simulator evaluation, organized by simulator evaluation objective C-13

Table C-8. Summary of the capability of Simulator Y to support mariner assessment objectives in category 1, setting up and maintaining displays C-15

Table C-9. Worksheet for recording a summary of a simulator's capability to support mariner assessment objectives C-16

LIST OF ACRONYMS AND ABBREVIATIONS

ARPA        Automatic Radar Plotting Aid
COLREGS     International Rules for the Prevention of Collisions at Sea
CPA         Closest point of approach
DIW         Dead in the water
EBL         Electronic bearing line
ECDIS       Electronic Chart Display and Information System
GMDSS       Global Maritime Distress and Safety System
IMO         International Maritime Organization
kt          Knot
MAO         Mariner assessment objective
nm          Nautical mile
NMC         National Maritime Center
NVIC        Navigation and Vessel Inspection Circular
PC          Personal computer
SART        Search and Rescue Transponder
STCW Code   Seafarers' Training, Certification and Watchkeeping Code
TCPA        Time to closest point of approach
USCG        United States Coast Guard
VRM         Variable range marker


INTRODUCTION

In this report we describe a method for evaluating the capability of simulators to support mariner performance assessments. The method is illustrated with the development of an evaluation protocol and its application in evaluating two Automatic Radar Plotting Aid (ARPA) simulators. The simulator evaluation method described in this report may be applied to other bridge and engine room simulators used in mariner training and assessment, as well as to simulators designed for use in other industries.

The research leading up to this report was part of a broader research project conducted for the United States Coast Guard (USCG) National Maritime Center (NMC) by the USCG Research and Development Center (R&DC). The overall objective of the research project is to establish a methodology for conducting assessments of mariner proficiencies using practical demonstration. As part of this broader project, a rigorous method for developing performance-based assessment procedures was developed and documented (McCallum, Forsythe, Smith, Nunnenkamp, & Sandberg, 2000), hereafter referred to as "McCallum et al. (2000)." This assessment development method provided much of the theoretical foundation for the present effort.

Role of Simulators in Performance-Based Assessments

The International Maritime Organization (IMO) Seafarers' Training, Certification and Watchkeeping (STCW) Code indicates that mariner proficiency should be assessed by practical demonstration (IMO, 1996). This means that to be considered proficient, a mariner must be able to perform a variety of shipboard operations safely and effectively in a real or operationally realistic setting to the satisfaction of an expert. As discussed in McCallum et al. (2000), these demonstrations of performance should be assessed using an assessment procedure based on a well-defined set of performance objectives. This type of assessment is called performance-based assessment.

Performance-based assessment can occur aboard a ship, using a simulator, or in a classroom setting. Marine simulators frequently are used for performance-based assessment because they provide a safe and convenient alternative to shipboard assessment. Using a simulator, a mariner can be assessed under controlled conditions on a wide variety of shipboard operations. These operations include emergency or hazardous situations that often cannot be replicated for shipboard assessment because of safety and cost considerations. Additionally, simulators allow a mariner to practice exercises repeatedly, and many simulators provide performance feedback. All of these characteristics are useful for training, as well as assessment, because they allow mariners to be introduced to new tasks without any danger to themselves or their coworkers (Goldstein, 1993).

There is a broad range of commercially available marine simulators that can be used to support any given mariner assessment. These simulators range from those designed to provide highly realistic operational environments, controls, and displays to those that are designed to represent a facsimile of some limited portion of operational information. Examples of more complex simulators are full-mission bridge simulators and ARPA simulators featuring high-fidelity inputs into actual operational equipment. Examples of less technically complex simulators are personal computer (PC)-based simulators featuring standard PC displays and controls in place of actual operational equipment. Obviously, a simulator that is a full-scale mock-up of the operational setting is likely to have many of the shipboard equipment features and functions necessary to conduct a valid mariner assessment. However, with the advances in PC processing power and the incorporation of PC processing in shipboard equipment, the advantages of the more complex and elaborate simulators are becoming less pronounced. When these factors are considered in conjunction with the affordability of PC-based simulators, it becomes evident that the capability of PC simulators to support mariner assessment must be considered seriously.

The amended STCW Code specifically requires that simulators be used to assess mariner proficiency in radar and ARPA operation. The STCW Code also specifies that simulators may be used for demonstrating proficiency in other areas, such as maintaining a safe engineering watch or monitoring the loading and unloading of cargo. In addition, other competencies not emphasized in the STCW amendments, such as operating a Global Maritime Distress and Safety System (GMDSS) or an Electronic Chart Display and Information System (ECDIS), also can be trained and assessed using simulators. Marine simulators varying significantly in fidelity and cost are available for each of these areas.

ARPA simulators provide a wide range of capabilities. In earlier work on this project we used a simulator based on full-scale operational equipment to develop a mariner assessment procedure for ARPA operation. However, there are a number of PC-based ARPA simulators providing many of the features and functions available on the more costly full-scale simulators. The simulator evaluation procedure we developed in the present effort was applied to two PC-based ARPA simulators.

Basis for Simulator Evaluation

The present approach evaluates simulators on the basis of their ability to support performance-based mariner assessments. These simulator evaluation requirements are developed from a set of mariner assessment objectives, which define how a mariner must demonstrate his or her proficiency in a given operational area. For example, in the case of ARPA operation, one mariner assessment objective is Demonstrate use and limitations of ARPA operational warnings (McCallum et al., 2000). Based on assessment objectives such as this one, all further mariner assessment requirements (i.e., assessment conditions, performance measures, and performance standards) are developed.

The assessment objectives guiding mariner assessment development serve as the basis for developing simulator evaluation requirements. For example, to ensure thorough mariner assessment, an ARPA simulator must be able to replicate the operational warnings typically found on an ARPA. Because mariner assessment objectives are so fundamental to the present approach to simulator evaluation, at any point it should be possible to trace a given simulator evaluation requirement back to the mariner assessment objective on which it is based. We designed the simulator evaluation procedure in the present report to meet this requirement.
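The traceability requirement described above can be sketched as a simple data model. This is an illustrative sketch only: the class and variable names are invented, and the report itself prescribes no software for maintaining this linkage.

```python
from dataclasses import dataclass

@dataclass
class MarinerAssessmentObjective:
    """A critical, measurable requirement of job performance (e.g., MAO 2.2)."""
    number: str
    description: str

@dataclass
class SimulatorEvaluationCriterion:
    """A required simulator capability, carrying a back-reference to the
    mariner assessment objective on which it is based."""
    description: str
    source_objective: MarinerAssessmentObjective

mao_2_2 = MarinerAssessmentObjective(
    "2.2",
    "Appreciation of the uses, benefits, and limitations of ARPA operational warnings")

criterion = SimulatorEvaluationCriterion(
    "Simulator replicates the operational warnings typically found on an ARPA",
    mao_2_2)

# Any evaluation criterion can be traced back to its originating objective.
assert criterion.source_objective.number == "2.2"
```

Keeping the back-reference explicit, rather than implied by document ordering, makes the required audit trail from criterion to objective mechanical to verify.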

Report Organization

The main body of this report has three sections. The first section, A Method for Evaluating Simulators Used in Performance-based Assessment, describes a procedure for evaluating simulators used in mariner assessment and illustrates how this procedure can be applied. We used ARPA simulators as an example throughout the discussion. The second section, Illustrative Analysis of ARPA Simulator Capabilities, provides examples of analyses possible with the resulting findings. Finally, the Conclusions and Recommendations section presents conclusions regarding the technical value and practical applications of this method, as well as recommendations for refining and implementing the general evaluation method and our ARPA protocol.

In addition to the main body, three appendices provide the analytical basis for the evaluation and the method for evaluating ARPA simulators. Appendix A, ARPA Simulator Evaluation Objectives, Evaluation Conditions, and Evaluation Criteria, documents the requirements for the ARPA simulator evaluation. Appendix B, ARPA Simulator Evaluation Protocol, provides the evaluation protocol that we developed and applied in the evaluation of two ARPA simulators. Appendix C contains a set of worksheets for compiling and analyzing evaluation results.

A METHOD FOR EVALUATING SIMULATORS USED IN PERFORMANCE-BASED ASSESSMENT

Figure 1 depicts the present method for developing and conducting a simulator evaluation. It is a five-step process. In the first step, performance-based mariner assessment requirements are defined. The performance-based assessment requirements are based on an analysis of the mariner assessment objectives. These assessment objectives are identified through a review of mariner skill and knowledge requirements in specified operational areas. Basic mariner assessment requirements include assessment conditions, performance measures, and performance standards. In the second step, simulator evaluation objectives (i.e., the specific items on which the simulator is evaluated), simulator evaluation conditions, and simulator evaluation criteria are defined. All three are derived from the performance-based assessment requirements defined in the first step.

In the third step, the evaluation protocol is developed. An evaluation protocol is a plan for executing a simulator evaluation. The protocol is organized by simulator evaluation objective, and is divided into four categories: exercise programming, equipment set-up, simulation, and debriefing. In the fourth step, the simulator evaluation is conducted. A separate evaluation should be conducted by at least two evaluators to ensure the results of the evaluation are reliable. Finally, in the fifth step, the findings of the evaluation are summarized and analyzed. This section contains a discussion of these five steps, along with examples of how we applied this method to the evaluation of two ARPA simulators.
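As a sketch of the fifth step, the findings can be tallied as the percentage of criteria met in each of the four protocol categories (the basis of Figure 2). All data values and names below are invented for illustration and are not taken from the actual evaluations.

```python
def percent_met(results):
    """results maps each protocol category to a list of True/False values,
    one per evaluation criterion (True = criterion met)."""
    return {category: 100.0 * sum(checks) / len(checks)
            for category, checks in results.items()}

# Hypothetical evaluator worksheet for one simulator.
results = {
    "exercise programming": [True, True, False, True],
    "equipment set-up":     [True, False],
    "simulation":           [True, True, True],
    "debriefing":           [False, True],
}

print(percent_met(results))
# {'exercise programming': 75.0, 'equipment set-up': 50.0,
#  'simulation': 100.0, 'debriefing': 50.0}
```

With at least two evaluators, the same tally can be computed per evaluator and compared to check that the results are reliable.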

For the ARPA simulator evaluation, we evaluated two PC-based simulators. We selected simulators with different processing characteristics and cost to ensure our evaluation protocol was sufficiently flexible for application to a range of simulators, and sufficiently sensitive to discriminate among them. Additionally, because simulator manufacturers frequently change and upgrade simulator capabilities and offer systems in multiple configurations, we opted not to identify the two simulators by name. Instead, we refer to them as Simulator X and Simulator Y.

In general, Simulator X was a lower-cost system with fewer simulator features. Simulator Y was a higher-cost system with a wider range of features and functions. Simulator X was designed to mimic the display information of an ARPA unit, but not to replicate actual ARPA processing characteristics. Simulator Y was designed to duplicate the processing characteristics of an actual ARPA. During the evaluations, we had the cooperation and participation of both simulator manufacturers. Their participation greatly assisted us in understanding and considering the capabilities of their simulators.

1. Define performance-based mariner assessment requirements → 2. Define simulator evaluation objectives, conditions, and criteria → 3. Develop simulator evaluation protocol → 4. Conduct simulator evaluation → 5. Summarize and analyze findings

Figure 1. Method for evaluating simulators.

Step 1: Define Performance-Based Mariner Assessment Requirements

The first step in developing a simulator evaluation protocol is to define the requirements of the performance-based mariner proficiency assessment that will be conducted using the simulator. The mariner assessment requirements include the assessment objectives, assessment conditions, performance measures, and performance standards. This step can be time-consuming, but simulator evaluation is not its primary purpose: these requirements form the basis of mariner assessments and need to be developed whether assessment will be conducted using a simulator or in another setting. For a detailed discussion of these concepts, see McCallum et al. (2000). For helpful materials, including a manual, on how to develop assessments, see McCallum, Forsythe, Barnes, Smith, Macaulay, Sandberg, Murphy, & Jackson (2000).

The mariner assessment objectives are the critical requirements of job performance that can be measured and assessed. These objectives should reflect the skills and knowledge required for a job in a specified operational area. All of the corresponding mariner assessment requirements (the mariner assessment conditions, performance measures, and performance standards) should be based on the assessment objectives. Ultimately, all of the mariner assessment requirements should be based on a review of job and task requirements in the operational setting. Resources available to support this review include the STCW Code (IMO, 1996), the U.S. Code of Federal Regulations, documented job procedures, technical manuals, and knowledgeable job incumbents.

The growing trend toward performance-based assessment in the U.S. maritime industry may mean that, in the future, developers could leverage documented mariner assessment procedures for use in simulator evaluation protocol development. However, at the present time few documented performance-based mariner assessment procedures exist in the public domain. One that is available is the ARPA assessment developed for McCallum et al. (2000). This procedure is our source for the ARPA assessment requirements used in the present simulator evaluation protocol.

McCallum et al. (2000) specified 27 performance-based ARPA assessment objectives. The objectives requiring use of a simulator are listed in Appendix A in the column labeled "Mariner Assessment Objective." These assessment objectives were organized under six operational categories that have general applicability to assessment objectives for the operation of complex equipment (see Table 1). It is reasonable to expect that a set of assessment objectives addressing the operation of various types of shipboard equipment, such as an ECDIS or a GMDSS, could be organized under the general headings presented in Table 1. As an example, one of the ARPA mariner assessment objectives is 1.2, Selection, as appropriate, of required speed and compass input to ARPA, which was in the category corresponding to "equipment initialization."

Table 1. Assessment objective categories.

(1) Equipment initialization.

(2) Basic understanding of equipment output.

(3) Technical limitations of the equipment.

(4) Advanced technical operations.

(5) Broad application of skill and knowledge to the job.

(6) Operational warnings and systems tests.

After the performance-based assessment objectives are specified, the conditions for assessing mariner performance corresponding to each objective should be identified. The assessment conditions define the setting, tools, reference aids and safety precautions that are required for assessment of mariner proficiency. The conditions for each objective should be precisely specified so that comparable conditions can be replicated from one assessment session to the next. Assessment conditions can vary widely in their detail and complexity, as governed by the technical content of a given mariner assessment objective. For instance, some objectives may require conditions in which the equipment is simply initialized (e.g., to demonstrate basic operational set-up features of the equipment), whereas other objectives may require that a specific simulated exercise be running (e.g., to demonstrate technical limitations of the equipment).

For example, one of the ARPA simulator evaluation objectives specified in McCallum et al. (2000) requiring the use of a simulator is 2.5, Selection of vector time scale (see Appendix A). We tested this objective during the equipment set-up phase before an ARPA test exercise. No exercise needed to be running for the assessment of this particular objective. However, mariner assessment objective 2.2, Appreciation of the uses, benefits, and limitations of ARPA operational warnings, required a specific exercise to be running to test the mariner's response to operational warnings.

The other basic requirements for a performance-based assessment are the specification of performance measures and standards. A performance measure is a recordable, observable action, or an indication of an action. A performance standard is an established minimum level of performance based on relevant assessment criteria. Taken together, the assessment measures and standards determine what specific mariner performance must be elicited and recorded during assessment.

In their ARPA assessment, McCallum et al. (2000) specified three performance measures for mariner assessment objective 2.2, Appreciation of the uses, benefits, and limitations of ARPA operational warnings. Table 2 shows these performance measures and their respective performance standards. The candidate was measured on his or her performance in setting safe limits, and in responding to safe limit warnings and guard zone warnings. To meet the performance standards, the candidate had to set safe limits in accordance with the assessor's instructions, and he or she had to identify the safe limit warning and the guard zone warning correctly.

Table 2. Performance measures and standards for ARPA mariner assessment objective 2.2.

Mariner Assessment Objective: 2.2, Appreciation of the uses, benefits, and limitations of ARPA operational warnings

Performance Measure          Performance Standard
2.2.1 Safe limit setting     Safe limit set in accordance with assessor instructions
2.2.2 Safe limit warning     Safe limit warning correctly identified
2.2.3 Guard zone warning     Guard zone warning correctly identified
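The measure-and-standard structure shown in Table 2 can be represented as data. The sketch below is illustrative only; the candidate results are hypothetical, and the class and field names are our own invention, not part of the McCallum et al. (2000) procedure.

```python
from dataclasses import dataclass

@dataclass
class PerformanceMeasure:
    measure_id: str     # e.g., "2.2.1"
    description: str    # observable action to be recorded
    standard: str       # established minimum level of performance

# Measures and standards for ARPA mariner assessment objective 2.2 (Table 2)
measures = [
    PerformanceMeasure("2.2.1", "Safe limit setting",
                       "Safe limit set in accordance with assessor instructions"),
    PerformanceMeasure("2.2.2", "Safe limit warning",
                       "Safe limit warning correctly identified"),
    PerformanceMeasure("2.2.3", "Guard zone warning",
                       "Guard zone warning correctly identified"),
]

# Hypothetical results recorded by an assessor (True = standard met)
results = {"2.2.1": True, "2.2.2": True, "2.2.3": False}

# The objective is satisfied only when every performance standard is met
objective_met = all(results[m.measure_id] for m in measures)
print(objective_met)  # False: the guard zone warning was not identified
```

This kind of explicit mapping from measure to standard makes it easy to verify that every recorded observation corresponds to a specified standard.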

Step 2: Define Simulator Evaluation Objectives, Evaluation Conditions, and Evaluation Criteria

In this step, the simulator evaluation objectives, evaluation conditions, and evaluation criteria are defined. These simulator evaluation requirements should be derived from the mariner assessment requirements developed in the first step. Precise definitions of the simulator evaluation requirements are essential to designing an evaluation protocol that is comprehensive and focused on the simulator's ability to support mariner assessment.

Define Simulator Evaluation Objectives

The factors on which the simulator is evaluated are called simulator evaluation objectives. These objectives are based on the mariner assessment objectives defined in the previous step, as well as on the performance standards for simulators specified by the IMO in the amended STCW Code, Section A-I/12, Part 1, 2.1-2.6 (IMO, 1996). Table 3 summarizes these general standards.

Table 3. IMO general performance standards for simulators.

(1) Ability to satisfy the specified assessment objectives.

(2) Ability to simulate the operating capabilities of the shipboard equipment concerned, to a level of realism appropriate to the assessment objectives, and include the capabilities, technical limitations, and possible errors of such equipment.

(3) Ability to provide sufficient behavioral realism to allow an assessment candidate to exhibit the knowledge and skills appropriate to the assessment objectives.

(4) Ability to provide an interface through which a mariner can interact with the equipment and simulated environment.

(5) Ability to provide a controlled operating environment capable of generating a variety of conditions, including emergency, hazardous, or unusual situations relevant to the assessment objectives.

(6) Ability to permit an assessor to control, monitor, and record exercises for the effective assessment of candidate performance.

The first three standards relate to the simulator's ability to replicate, in a realistic manner, the operational capabilities and limitations of the shipboard equipment. Operational capabilities and limitations refer to the functions and features of the equipment relative to its intended use. For example, one ARPA function is to acquire and track vessel targets. This function has some limitations, such as delays in computing the speed and heading of multiple targets. Simulators should be evaluated on their ability to replicate operational limitations such as this one.

The fourth standard relates to the simulator's ability to provide a display interface through which the mariner can view the operations of the simulator and interact with the equipment and simulated environment. The fifth standard relates to the simulator's ability to provide a controlled operating environment capable of generating a variety of conditions for use in assessment exercises. Finally, the sixth standard addresses the simulator's ability to provide adequate means for an assessor to control, monitor, and record exercises, and to debrief a mariner following an assessment.

Our development of the simulator evaluation method was intended to follow STCW direction. As summarized in Table 3, the STCW Code requires that simulators used in mariner proficiency assessment possess an "interface through which a mariner can interact with the equipment and simulated environment" (IMO, 1996). No mention is made, however, of any necessity for simulator controls to replicate the controls of a particular equipment manufacturer. The evaluation of specific controls is a possibility not only for an ARPA simulator, but also for a PC-based simulator of any type of equipment. If an evaluation of a simulator's capability to mimic specific controls is desired, such objectives could be built into the evaluation protocol.

The six standards presented in Table 3 can be divided into four main categories of simulator capability:

• Exercise programming (standard 5).

• Equipment set-up (standard 4).

• Simulation (standards 1-3).

• Debriefing (standard 6).

The order used to list these four main categories of simulator capability differs from the order adopted by the IMO. The current order better reflects the sequence of steps one follows when conducting a simulator evaluation.

Simulator evaluation objectives should be defined for each of these categories of simulator capability. Exercise programming objectives should address the simulator's ability to permit an assessor to control exercise conditions. For ARPA simulators, this includes the programming of own ship hydrodynamic and maneuvering characteristics, land masses, and environmental characteristics such as wind, current, etc. Equipment set-up objectives should address all items pertinent to initiating and setting up the display and equipment. (The controls characteristic of a specific manufacturer could be included in this category of objectives.) Simulation objectives should address all operational capabilities and technical limitations requiring a dynamic, simulated exercise, including equipment failures, errors, and alarms. Lastly, debriefing objectives should address the simulator's ability to permit the assessor to monitor, record, replay, and print the results of exercises.

As noted earlier, in addition to the above standards, the simulator evaluation objectives are based on the mariner assessment objectives defined in the previous step. The operational capabilities and limitations required for each assessment objective should be identified and represented in the simulator evaluation objectives. As an example, for our ARPA simulator evaluation, we derived 33 simulator evaluation objectives from the 27 ARPA assessment objectives specified in McCallum et al. (2000). These ARPA simulator evaluation objectives include all of the operational capabilities, controls, and displays required for a simulator to run the ARPA assessment in McCallum et al. (2000). An example is mariner assessment objective 5.3, The operation of the trial maneuver facility. To address this requirement, we defined a corresponding simulator evaluation objective 3.14, Operation of the trial maneuver facility. Appendix A contains a set of tables delineating all of the evaluation objectives we specified for ARPA simulators.

Define Simulator Evaluation Conditions

After the simulator evaluation objectives have been determined, the evaluation conditions for each objective can be defined. Evaluation conditions refer to the context of the equipment's intended use. The context of use includes the tasks that mariners perform, as well as the circumstances influencing the behavior of the equipment. This context also includes environmental conditions that could degrade the performance of the equipment. For example, sea clutter is an important evaluation condition for ARPA simulators, because sea clutter can induce operator errors by masking target information. An ARPA simulator should be evaluated on both its ability to display sea clutter and its ability to reproduce the masking of target information when sea clutter is present.

When defining simulator evaluation conditions, the mariner performance-based assessment requirements specified in step 1 should be considered. These assessment requirements dictate the assessment conditions to be supported by the simulator, and form the basis for the evaluation conditions. The evaluation conditions should specify whether the simulator should be evaluated in a dynamic mode (i.e., simulating an operational exercise), or in a static mode (i.e., initialized but not running an exercise). They should also specify the situation to replicate (e.g., location and vessels involved) and actions to perform (e.g., appropriate course change).

Evaluation conditions can take the form of standardized exercises. The standardized exercises used for simulator evaluation can be the ones developed for the mariner assessment, or they can be simpler exercises designed to test specific simulator operations. Ideally, these exercises should be developed with the assistance of individuals knowledgeable about the operational capabilities of the actual shipboard equipment.

For our ARPA simulator evaluation, we used standardized exercises developed for McCallum et al. (2000). Within each exercise, we specified the evaluation conditions for each evaluation objective, and then categorized the conditions as either dynamic or static. Simulator evaluation objective 3.5, Use of graphic representation of danger areas, is an example of an objective that we evaluated using a dynamic exercise, because it was necessary to observe the navigation of own ship close to danger areas to see how the simulator represented danger areas. Simulator evaluation objective 2.5, Selection of vector time scale, is an example of an objective that we evaluated when the simulator was in a static mode. To meet this objective, an ARPA simulator had to provide both adjustable time and fixed time scales, and it had to indicate which vector time scale was in use. No specific exercise needed to be running to evaluate these capabilities. Appendix A lists the conditions for each evaluation objective that we specified for ARPA simulators.

Define Simulator Evaluation Criteria

When the simulator evaluation objectives and evaluation conditions have been defined, the evaluation criteria can be specified. Evaluation criteria refer to the simulator's ability to provide the specific feature (a control or a display) needed to meet a specific evaluation objective. A control enables the operator to perform a given function (e.g., a rotary knob used to select a heading). A control criterion can also represent the underlying function enabled by a control. A display is a visual or auditory representation of the function or the environment (e.g., North-up display of a vessel's course).

Simulator evaluation criteria can be based on one or more of the following: STCW simulator requirements (IMO, 1996); IMO performance standards for the actual equipment (IMO, 1971, 1979); and the requirements needed to satisfy the specified mariner assessment objectives (Bole & Dineley, 1990; McCallum et al., 2000). For example, simulator evaluation objective 2.5, Selection of vector time scale, has evaluation criteria that are based on both STCW simulator requirements and IMO performance standards for actual equipment. The evaluation criteria for this objective consist of one control feature, 2.5.C1, Ability to select adjustable time scale or fixed time scale; and one display feature, 2.5.D1, Indication of time scale of vector in use. The tables in Appendix A contain the complete list of control and display criteria we specified for the ARPA simulator evaluation objectives.

Step 3: Develop Simulator Evaluation Protocol

The simulator evaluation objectives, evaluation conditions, and evaluation criteria that were defined in the previous step provide the foundation for the simulator evaluation protocol. The protocol can be organized around the individual simulator evaluation objectives, and divided into four sections corresponding to the simulator evaluation objective categories: exercise programming, equipment set-up, simulation, and debriefing. The protocol should address the following information for each objective:


• Evaluation conditions.

• Evaluation criteria for controls.

• Evaluation criteria for displays.

• Availability of specific control or display.

• Simulator performance rating for each control and display criterion.

• Comments.

• Specifications worksheet.
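One way to make the protocol items above concrete is to model each one as a structured record. The sketch below is a hypothetical data layout, not part of the report's protocol; the class and field names are our own, chosen to mirror the list above.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class CriterionRecord:
    criterion_id: str                 # e.g., "2.5.C1" (control) or "2.5.D1" (display)
    description: str
    available: Optional[bool] = None  # was the control/display present on the simulator?
    rating: Optional[str] = None      # "yes" | "partial" | "no"

@dataclass
class ProtocolItem:
    objective_id: str
    objective: str
    category: str       # exercise programming, equipment set-up, simulation, or debriefing
    conditions: str     # dynamic exercise or static mode, situation, and actions
    controls: list = field(default_factory=list)
    displays: list = field(default_factory=list)
    comments: str = ""

# Example item for evaluation objective 2.5, evaluated in static mode
item = ProtocolItem(
    objective_id="2.5",
    objective="Selection of vector time scale",
    category="equipment set-up",
    conditions="Static mode; no exercise running",
    controls=[CriterionRecord("2.5.C1",
                              "Ability to select adjustable time scale or fixed time scale")],
    displays=[CriterionRecord("2.5.D1",
                              "Indication of time scale of vector in use")],
)
```

A record like this keeps the evaluation conditions, criteria, availability, rating, and comments for each objective in one place, which simplifies later summarization across evaluators.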

The evaluation conditions tell the evaluators what specific conditions need to be present to evaluate each objective. Evaluators need this information to ensure comparable conditions exist for each evaluation.

The evaluation criteria for controls specify which control features should be evaluated. Our evaluation criteria address the underlying processing characteristics and limitations represented by a simulator's controls, rather than strictly the physical resemblance of the controls to actual equipment. This follows the approach set forth in the STCW standards for simulators (see Table 3). However, the physical fidelity of controls may be a consideration in some applications or assessments where actual equipment operation is an objective.

The evaluation criteria for displays specify which display features should be evaluated. All controls and displays relating to the evaluation objectives should be addressed, and there may be more than one control and display for each objective.

The last three areas of the protocol (availability of specific control or display, simulator performance rating for each control and display criterion, and comments) are for recording evaluator observations. An evaluator can note whether or not the specified control or display was available on the simulator, and he or she can rate how well the simulator satisfied the evaluation criteria. A variety of different rating scales can be used. At a minimum, the rating scale should permit evaluators to distinguish among simulators that meet, partially meet, and do not meet the evaluation criteria. By providing comments, an evaluator can note any other pertinent information that should be included in the evaluation.

In addition, the evaluation protocol should include a specifications worksheet for each simulator evaluated. Equipment specifications are useful for describing and comparing simulators. Items to include on the specifications worksheet are the manufacturer, model, hardware, software, network configuration, actual equipment interface of the simulator (i.e., the manufacturer and model that the simulator replicates, if any), cost, etc. The list of items on this worksheet depends on the type of simulator being evaluated.

In developing the protocol, instructions should be written for evaluators so they have a standardized process for rating each simulator. Written instructions should define the rating scales and rating criteria, as well as the purpose of each section of the protocol. Explicit instructions can help to ensure evaluation results are comparable across simulators.

The protocol we used for the ARPA simulator evaluation is provided in Appendix B. This protocol includes a specifications worksheet and a set of standardized exercises designed to replicate operational limitations of actual shipboard equipment. It also includes a set of exercises designed to evaluate the conditions and criteria necessary to support mariner assessment.

Step 4: Conduct Simulator Evaluation

The fourth step in evaluating simulators is to conduct the evaluation to determine how well the simulator supports performance-based assessment. When conducting an evaluation, it is beneficial to have the cooperation and participation of the simulator manufacturers, to assist evaluators in programming exercises and operating the equipment, as well as to ensure understanding and consideration of all the simulator's capabilities. In addition, manufacturers may benefit from a thorough evaluation by discovering the strengths and weaknesses of their products.

If possible, two or more evaluators should participate in the evaluation. Comparing and integrating the findings of multiple raters results in a more reliable evaluation. Also, an objective evaluation requires evaluators to follow well-defined procedures for administering the protocol. The procedures might vary depending on the nature of the simulator being evaluated. For example, the evaluation of a single-component simulator, such as an ECDIS, would be much less complex than the evaluation of a multi-component simulator, such as an engine room simulator. Table 4 summarizes the procedures we followed when conducting evaluations of the two ARPA simulators.

Table 4. Procedures for conducting a simulator evaluation.

(1) Provide the manufacturer with the evaluation conditions (including standardized exercises) to be programmed ahead of time, if applicable.

(2) Ensure a knowledgeable person is available to demonstrate the simulator's capabilities.

(3) Ask the manufacturer to provide the specifications for the equipment being evaluated.

(4) Evaluate each item in the order presented: programming capabilities, set-up capabilities, simulation capabilities, and debriefing capabilities. Ensure all simulator evaluation objectives have been evaluated.

(5) Evaluate each item requiring a dynamic evaluation using the appropriate exercise. Run each exercise separately.

(6) Conclude the evaluation process by asking the manufacturer to discuss any features that might have been overlooked during the evaluation.

Before evaluating each ARPA simulator, we forwarded an overview of the evaluation objectives and a detailed description of the evaluation exercise scenarios to each manufacturer. Subsequently, we sent a team of three evaluators to conduct the simulator evaluations at each company. During these evaluations, evaluators observed the simulators running the standardized exercises, and each evaluator independently completed an evaluation form. Evaluators used a rating scale of yes, partial, or no. A yes score indicated the simulator met the criterion. A partial score indicated the simulator partially met the criterion, and a no score indicated the simulator did not meet the criterion.


Step 5: Summarize and Analyze Findings

The final step in a simulator evaluation is to summarize and analyze the findings. Summarizing the findings entails integrating observations and scores across evaluators for each evaluation criterion. Analyzing the findings entails combining and extracting selected simulator evaluation criterion scores to address specific issues with respect to the capability of a simulator to support mariner assessment.

To summarize the results of the evaluation, evaluators' ratings and observations should first be combined for each simulator evaluation criterion. Methods for combining differing observations can be based on either a consensus-building approach or an averaging approach. Using a consensus-building approach, differing observations and scores are identified and discussed among the evaluators until an agreement concerning the observation or score is reached. This approach is time-consuming but has the advantage of addressing subtle or highly technical issues reflecting a simulator's capabilities. Using an averaging approach, observations and ratings are combined by determining the central tendency among evaluators. This approach is more efficient but may tend to obscure subtle or highly technical concerns regarding simulator characteristics.
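The averaging approach described above can be sketched in a few lines. This is an illustrative implementation under our own assumptions: the function name is hypothetical, the median is one reasonable choice of central tendency, and a disagreement flag is added to mark criteria that would merit consensus discussion.

```python
from statistics import median

# Numeric values for the three-level rating scale used in the report
SCORE = {"yes": 1.0, "partial": 0.5, "no": 0.0}

def combine_ratings(ratings):
    """Combine one criterion's ratings from multiple evaluators.

    Returns the median numeric score (the averaging approach) and a flag
    marking disagreements that could be resolved by consensus discussion.
    """
    scores = [SCORE[r] for r in ratings]
    disagreement = len(set(scores)) > 1
    return median(scores), disagreement

print(combine_ratings(["yes", "yes", "partial"]))  # (1.0, True)
print(combine_ratings(["no", "no", "no"]))         # (0.0, False)
```

Flagging disagreements lets an evaluation team apply the efficient averaging approach by default, while reserving the time-consuming consensus discussion for the criteria where evaluators actually differ.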

Using the resulting simulator evaluation criteria scores as a basis, the evaluation team can conduct a series of analyses addressing the capabilities of the simulator or simulators under evaluation. Three general types of analysis can be conducted. First, the scores for each simulator evaluation criterion can be combined to obtain scores for each of the separate simulator evaluation objectives. Second, the separate evaluation objective scores can be combined to evaluate capabilities with respect to each of the four evaluation categories: exercise programming, equipment set-up, simulation, and debriefing. Third, selected simulator evaluation scores can be extracted and summarized to address specific issues, such as the capability to support individual mariner assessment objectives and IMO simulator requirements. These types of analyses are illustrated in the next section by the results of our ARPA evaluation.


ILLUSTRATIVE ANALYSIS OF ARPA SIMULATOR CAPABILITIES

Using our evaluators' observations and scores for the two ARPA simulators, we performed the three types of analyses listed in the previous section. Selected results of our analyses are presented below to illustrate both the analysis procedures and the types of comparisons that can be made. The worksheets that we used to perform these analyses are presented in Appendix C.

Capability of simulators to support individual simulator evaluation objectives. In the present ARPA simulator evaluation, scores for the three separate evaluators were reviewed, discrepancies among evaluators were identified, and each issue was discussed until a consensus score for each evaluation criterion was obtained. By assigning a numerical value of 1 (yes), 0.5 (partial), or 0 (no), we calculated simulator scores for each evaluation objective. These scores allowed us to identify the relative strengths and weaknesses of each simulator.

Table 5 compares the scores for Simulator X and Simulator Y on each of the evaluation criteria corresponding to simulator evaluation objective 2.1, Selection of display presentation, orientation, and vector mode. This table indicates that Simulator X fully met six criteria, partially met three criteria, and did not meet two criteria, resulting in a score of 7.5 for simulator evaluation objective 2.1. In comparison, Simulator Y fully met nine of the criteria and did not meet two of the criteria, resulting in a score of 9.0 for this simulator evaluation objective.

Table 5. Summary of simulator capabilities for evaluation objective 2.1, selection of display presentation, orientation, and vector mode.

Simulator Evaluation Criterion (C = Control, D = Display)             Simulator X   Simulator Y
2.1.C1   Ability to toggle between sea- and ground-stabilized modes       No            No
2.1.D1   Indication of display mode                                       No            No
2.1.C2   Ability to toggle between North-up, and either course-up
         or head-up azimuth stabilization                                 Partial       Yes
2.1.D2   Indication of display orientation mode                           Yes           Yes
2.1.C3   Ability to toggle between relative and true motion               Yes           Yes
2.1.D3   Indication of display vector mode                                Partial       Yes
2.1.C4   Ability to use ARPA on the following ranges:
         (a) 3 or 4 miles, and (b) 12 or 16 miles                         Yes           Yes
2.1.C5   Fixed range rings available                                      Yes           Yes
2.1.D5.1 Indication of range scale in use                                 Yes           Yes
2.1.D5.2 Indication of distance between range rings                       Partial       Yes
2.1.C6   Availability of variable range marker (VRM)                      Yes           Yes
Summary Score (Yes = 1, Partial = 0.5, No = 0)                            7.5           9.0
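The summary scores reported for Table 5 can be reproduced with a short script. The per-criterion rating lists below follow the counts stated in the text (Simulator X: six yes, three partial, two no; Simulator Y: nine yes, two no).

```python
# Numeric values for the rating scale (Yes = 1, Partial = 0.5, No = 0)
SCORE = {"yes": 1.0, "partial": 0.5, "no": 0.0}

def summary_score(ratings):
    """Sum the numeric values of a simulator's ratings for one evaluation objective."""
    return sum(SCORE[r] for r in ratings)

# Ratings for the 11 criteria of evaluation objective 2.1
sim_x = ["no", "no", "partial", "yes", "yes", "partial",
         "yes", "yes", "yes", "partial", "yes"]
sim_y = ["no", "no"] + ["yes"] * 9

print(summary_score(sim_x))  # 7.5
print(summary_score(sim_y))  # 9.0
```

Computing the scores this way also makes it trivial to recompute summaries if a rating changes during consensus discussion.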

The scoring approach illustrated in Table 5 represents a modest level of technical sophistication. A more advanced approach would be to apply different weights to the separate simulator evaluation criteria prior to calculating summary scores. Higher weights would indicate those criteria that are considered relatively more important than other criteria. Scores would be multiplied by the respective weight for each criterion to obtain a weighted score. Valid criterion weights could be obtained through structured reviews with subject matter experts.
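The weighted variant described above amounts to a weighted sum. The sketch below uses hypothetical weights; in practice, as noted, valid weights would come from structured reviews with subject matter experts.

```python
# Numeric values for the rating scale (Yes = 1, Partial = 0.5, No = 0)
SCORE = {"yes": 1.0, "partial": 0.5, "no": 0.0}

def weighted_score(ratings, weights):
    """Weighted summary score: each criterion's score times its importance weight."""
    return sum(SCORE[r] * w for r, w in zip(ratings, weights))

# Hypothetical example: three criteria, the first judged twice as important
ratings = ["yes", "partial", "no"]
weights = [2.0, 1.0, 1.0]
print(weighted_score(ratings, weights))  # 2.5 (= 2.0*1 + 1.0*0.5 + 1.0*0)
```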

Capability of simulators to support each of the four general simulator evaluation categories. After analyzing simulator capabilities at the detailed level of individual simulator evaluation criteria, analyses can address the more general issues corresponding to the four evaluation categories. Here, the scores for the individual evaluation objectives can be summed within the corresponding evaluation category to provide summary scores for each category. These more general scores provide a broader basis for evaluating the simulator's main strengths and weaknesses. Figure 2 depicts the percentage of possible criterion scores met by each ARPA simulator within each simulator evaluation category. This figure reveals consistently high scores for Simulator Y (between 80 and 98 percent of the criteria were met for each evaluation objective); and more varied, but consistently lower scores for Simulator X (between 20 and 56 percent of the criteria were met).
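The category-level roll-up described above can be sketched by grouping per-criterion scores by evaluation category and converting each category's total to a percentage of the possible score. The ratings below are hypothetical, chosen only to illustrate the computation.

```python
from collections import defaultdict

# Numeric values for the rating scale (Yes = 1, Partial = 0.5, No = 0)
SCORE = {"yes": 1.0, "partial": 0.5, "no": 0.0}

# Hypothetical (category, rating) pairs, one per evaluation criterion
criteria = [
    ("exercise programming", "yes"), ("exercise programming", "partial"),
    ("equipment set-up", "yes"),     ("equipment set-up", "no"),
    ("simulation", "yes"),           ("simulation", "yes"),
    ("debriefing", "partial"),       ("debriefing", "no"),
]

totals = defaultdict(float)   # score achieved per category
possible = defaultdict(int)   # maximum possible score per category

for category, rating in criteria:
    totals[category] += SCORE[rating]
    possible[category] += 1

for category in totals:
    pct = 100.0 * totals[category] / possible[category]
    print(f"{category}: {pct:.0f}%")
```

Percentages of this kind are what a figure comparing simulators across the four evaluation categories would plot.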

[Figure 2: bar chart of the percentage of criteria met by Simulator X and Simulator Y in each of the four evaluation categories: Exercise Programming, Equipment Set-up, Simulation, and Debriefing.]

Figure 2. Percentage of criteria met by each ARPA simulator, in four evaluation categories.

Capability of simulators to support mariner assessment. A simulator's capability to support the mariner assessment objectives is a fundamental consideration in the analysis of evaluation results. Analysis of these issues can be conducted by compiling simulator evaluation objective scores corresponding to each of the mariner assessment objectives. Figure 3 compares the capabilities of Simulator X and Simulator Y to support the six assessment objective categories specified in McCallum et al. (2000). This figure reveals consistently high percentage scores for Simulator Y (between 83 and 100 percent of the assessment objectives were supported), and more varied percentage scores for Simulator X (between 17 and 100 percent of the objectives were supported).

[Figure 3: bar chart of the percentage of assessment objectives supported by Simulator X and Simulator Y in each of the six assessment objective categories: Setting Up & Maintaining Displays, Situation Assessment, Factors Affecting Performance & Accuracy, Parallel Indexing, Application of COLREGS, and Operational Warnings & Systems Tests.]

Figure 3. Percentage of assessment objectives supported by each ARPA simulator, in six assessment objective categories.

Interpretation of the findings. Overall, our evaluation summaries indicated that Simulator X had limited capabilities in each of the four simulator evaluation categories: exercise programming, equipment set-up, simulation, and support for debriefing. As an example, the system could simulate landmasses and environmental conditions, but it did not provide flexibility in specifying the strength of those conditions. The main strength of its simulation was its capability to support parallel indexing. It had minimal capability to record exercises and support debriefing. On the other hand, Simulator Y supported the bulk, although not all, of the requirements in each of these categories. Simulator Y's strengths included its ability to generate complex and varied exercise conditions and its ability to record exercises. Weaknesses in its simulation included an inability to specify danger areas on one of the radar display interfaces we tested.


Simulator X also had limitations in supporting mariner assessment objectives. This system mimicked ARPA display features, but not the underlying processing characteristics. For example, its target display information did not reflect the temporary processing delays and inaccuracies typical of actual ARPA systems. For use in mariner assessment, Simulator X would need augmented vessel target processing capabilities so assessment candidates could experience the conditions exhibited by an actual ARPA unit. Simulator Y offered greater support for mariner assessment. Its ability to duplicate the actual display, control, and processing characteristics of an ARPA unit allowed it to generate varied exercise scenarios and faithfully simulate target ship processing.

Generalizing from the analysis. In this report, we have presented and illustrated a highly structured method for evaluating the capability of simulators to support an extremely demanding application: performance-based assessment of mariner proficiency. For our ARPA example, we have identified the particular strengths and weaknesses of two simulators in their capabilities for mimicking individual ARPA features, for providing broader ARPA simulator functions, and finally for supporting the IMO requirements for assessment of mariner proficiency in ARPA operation. Whereas both simulators support many of the assessment objectives, one more than the other, neither one supports 100 percent of the objectives. Could either one of the simulators we evaluated support assessment of mariner proficiency in ARPA operation? The answer is not a simple "yes" or "no." The mariner assessment requirements cannot all be met, even by the more capable of the two simulators. However, the potential advantages of using simulators rather than real equipment remain.

A more productive question to be answered by this structured evaluation might be, "What is the best use of proposed simulator technology?" A simulator might be used for a preliminary assessment of mariner performance to ensure that the individual is ready to make the best use of an opportunity for assessment on real equipment in a laboratory or at sea. As an alternative, a preliminary assessment on a simulator might be augmented by a later, more limited assessment on real equipment. Given either of these approaches, a decision would have to be made as to whether the greater effectiveness of a more costly simulator, or of assessment on real equipment, is worth the increased cost. The detailed evaluation method that we have presented is a tool not only for the assessment developer, but also for the simulator designer. The evaluation identifies weak features and the potential value of their improvement to the user, especially for mariner assessment. After identifying the weaknesses, the evaluator can consider the value of a potential improvement in relation to its cost to the manufacturer and to future buyers. We have proposed a systematic method for evaluating simulators and must leave it to others in the maritime industry to design a broader program of performance-based assessments that benefit from the capabilities of simulators. A parallel effort to ours, to systematically identify the features needed by engine room simulators to support mariner assessment, reached a similar conclusion: that simulators need to be incorporated into a broader program of mariner assessment (Stutman, 1999).

17

CONCLUSIONS & RECOMMENDATIONS

The following are our conclusions and recommendations, based on our experiences and findings from the present research effort. The conclusions address the technical feasibility, practicality, and potential applications of this method. The recommendations identify actions for the refinement and implementation of both the general simulator evaluation method and the ARPA simulator evaluation protocol.

Conclusions

Based on our experience and findings during the development and application of the simulator evaluation method, we conclude that this approach is technically feasible and practical. In addition, because this method is based on the requirements of mariner performance-based assessment, we also conclude it can be applied in a broad range of domains. The five major conclusions of this report are presented below.

The simulator evaluation method is technically feasible and practical. We demonstrated the technical feasibility and practical value of our approach for evaluating the capability of marine simulators to support mariner assessment. A structured method for evaluating simulators based on performance-based assessment requirements was defined. An example evaluation protocol for ARPA simulators was then developed and successfully applied in the evaluation of two PC-based ARPA simulators. This application allowed us to refine both the method and the evaluation procedure, and to verify the technical feasibility and practical value of the general method and the ARPA evaluation protocol.

The simulator evaluation method is fully compliant with STCW Code standards for simulators. The STCW Code establishes a set of performance standards for simulators supporting mariner assessments (see Table 3). These standards represent the basic requirements for any simulator to be able to support mariner assessment. The present method is fully compliant with these standards. Each of the STCW Code simulator standards is explicitly incorporated in the present method.

The simulator evaluation method could be applied to a broad range of simulators in the maritime and other industries. The present application of the simulator evaluation method was limited to PC-based ARPA simulators. However, the method has a much broader range of potential applications. It could be applied to the full spectrum of ARPA simulators, as well as a wide range of bridge and engine room simulators (e.g., ECDIS simulators, GMDSS simulators, and diesel engine simulators). In addition, the method could be applied to other maritime simulators (e.g., vessel loading simulators and vessel traffic system simulators) or simulators designed for assessment of performance in other industries (e.g., flight simulators, driving simulators, and power plant control simulators).

The simulator evaluation method can be generalized to the evaluation of training simulators. The method described in this report focused on the evaluation of simulators for use in mariner performance assessment. Performance assessment is an important use of simulators, but an equally important use is training. As in the case of assessment procedures, training programs can be developed with the explicit identification of performance objectives and

18

performance measures. Given the specification of these training requirements, simulator evaluation criteria could be developed to determine the capability of a simulator to support a training program.

Training institutions, regulatory agencies, and simulator manufacturers can apply the simulator evaluation method. There is a wide range of potential users of this evaluation method. The full range of training institutions (academies, colleges, and commercial training centers) could use this methodology for the selection of cost-effective simulators. The USCG or other regulatory agencies also could use the method to develop standardized evaluation procedures for different types of simulators. In addition, simulator manufacturers could use the established evaluation standards, as well as completed evaluations, to determine those features and capabilities that should be modified in future design upgrades.

Recommendations

The overall objective of this research effort was to develop an approach to simulator evaluation for use by the USCG and the maritime industry in their response to the 1995 STCW amendments. Given the demonstrated feasibility and success of this approach, we recommend that the USCG, maritime academies, simulator manufacturers, and other organizations take the following actions to bring this approach into practice. The recommendations are organized into two sections corresponding to the primary products of this effort: the simulator evaluation method, and the ARPA evaluation protocol.

Simulator Evaluation Method

This document provides a relatively general summary of the simulator evaluation method we developed and refined. We recommend that this description serve as a guide for implementation by the USCG and members of the maritime educational community. We recommend the following actions to ensure this simulator evaluation method best supports the purpose of increasing the effectiveness of simulator and training course evaluations in the maritime industry.

Distribute this simulator evaluation protocol. A complete and general method for the development of a simulator evaluation protocol can serve as a reference for the maritime community. The USCG should make the current methodology widely available to the industry by publishing this report as a public domain document, and by encouraging its inclusion in courses on performance-based assessment or "train-the-trainer" courses.

Encourage the development of a library of simulator evaluation protocols. The evaluation method documented in this report should be applied to a wide range of simulators so as to create a library of simulator evaluation protocols. The USCG should encourage maritime academies and other appropriate institutions to apply the methodology to other types of simulators and then share general lessons learned, model protocols for other types of simulators, and actual results of evaluations. Examples of simulators that could be evaluated using the present approach are ECDIS simulators, GMDSS simulators, vessel traffic system simulators, cargo loading simulators, and diesel engine simulators.

Standardize simulator evaluation procedures for selected types of simulators. When the methodology is better understood and accepted, the USCG should develop, or encourage

19

appropriate institutions to develop, standardized evaluation procedures for various types of simulators. These procedures could include standard scenarios and conditions, as well as guidelines and cut-off scores for accepting or not accepting a simulator or a course based on it.
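Once standardized procedures and cut-off scores exist, the acceptance decision could reduce to comparing the fraction of evaluation criteria a simulator meets against a threshold. The sketch below is purely illustrative: the report proposes no specific scoring rule, and the 0.90 cut-off and the criterion results are invented for the example.

```python
def acceptance_decision(criteria_results: dict, cutoff: float = 0.90):
    """criteria_results maps criterion codes (e.g., "2.4.C1") to pass/fail.

    Returns the fraction of criteria met and the accept/reject decision
    under a simple proportional cut-off (an assumed scoring rule, not one
    taken from the report).
    """
    met = sum(1 for passed in criteria_results.values() if passed)
    fraction = met / len(criteria_results)
    return fraction, fraction >= cutoff

# Hypothetical results for three criteria: two met, one not.
fraction, accept = acceptance_decision(
    {"2.4.C1": True, "2.4.D1": True, "2.5.C1": False})
# Two of three criteria met falls below the 0.90 cut-off, so this
# simulator would be rejected under the assumed rule.
```

In practice, the report's framework suggests a richer rule than a flat percentage (for example, treating some criteria as mandatory), but the structure of the decision is the same.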

ARPA Simulator Evaluation Protocol

This report provides an example of applying a structured methodology in developing an ARPA simulator evaluation protocol based on mariner assessment requirements. In addition, it provides a useful approach to the actual evaluation of ARPA simulators. We recommend the following actions to ensure this protocol best contributes to the effectiveness of ARPA simulator evaluations in the maritime industry.

Publish the ARPA simulator evaluation protocol and encourage its review and use. The USCG should make the ARPA simulator evaluation protocol widely available and encourage its review and use by appropriate institutions, including public and private maritime educational institutions and shipping companies.

Use the ARPA simulator evaluation protocol in the assessment of simulator requirements for approval of training courses. With greater understanding and acceptance of the approach, the USCG should use it or require its use as a standard evaluation of ARPA simulators for use in training courses.

20

REFERENCES

Bole, A. G., & Dineley, W. O. (1990). Radar and ARPA manual. Oxford, England: Butterworth-Heinemann.

Goldstein, I. L. (1993). Training in organizations: Needs assessment, development, and evaluation. Pacific Grove, CA: Brooks/Cole.

International Maritime Organization. (1971). Performance standards for navigational radar equipment. Resolution No. A.222(VII). London: Author.

International Maritime Organization. (1979). Performance standards for Automatic Radar Plotting Aids (ARPA). Resolution No. A.422(XI). London: Author.

International Maritime Organization. (1996). Seafarer's Training, Certification, and Watchkeeping (STCW) Code. London: Author.

International Maritime Organization. (1998). Guidance on shipboard assessments of proficiency. (Marine Safety Committee Circular No. 853). London: Author.

McCallum, M. C., Barnes, A. E., Forsythe, A. M., Smith, M. W., Maynard, G. E., Blanchard, R. T., Hempstead, S. C., Martinez, N., Murphy, J., & Jackson, P. (2000). Conducting Mariner Assessments (Report No. R&DC 204). Groton, CT: U.S. Coast Guard Research & Development Center.

McCallum, M. C., Forsythe, A. M., Barnes, A. E., Smith, M. W., Macaulay, J., Sandberg, G. R., Murphy, J., & Jackson, P. (2000). A Method for Developing Mariner Assessments (Report No. R&DC 202). Groton, CT: U.S. Coast Guard Research & Development Center.

McCallum, M. C., Forsythe, A. M., Smith, M. W., Nunnenkamp, J. M., & Sandberg, G. R. (2000). Developing Performance-Based Assessments of Mariner Proficiency (Report No. R&DC 303). Groton, CT: U.S. Coast Guard Research & Development Center.

Stutman, P. (1999). Comparing the Assessment of Marine Engineers Using Engine Room Simulators Versus Shipboard Practical Demonstrations. Proceedings of the Fourth International Conference on Engine Room Simulators, ICERS/4, Vallejo, CA, June 27-30, 1999.

21


APPENDIX A

ARPA Simulator Evaluation Objectives, Evaluation Conditions, and Evaluation Criteria

Appendix A provides a detailed description of the 33 simulator evaluation objectives we derived from the performance-based mariner assessment objectives specified in McCallum et al. (2000), STCW standards for simulators (IMO, 1996), and IMO standards for actual equipment (IMO, 1971, 1979). The appendix is divided into four tables (Tables A-1 through A-4), corresponding to the four evaluation objective categories: exercise programming, equipment set-up, simulation, and debriefing.

Each table is divided into eight columns. The first column describes the simulator evaluation objectives in detail. The second column lists the corresponding mariner assessment objectives from McCallum et al. (2000). The third column gives the evaluation type, either static or dynamic, depending on the objective. The fourth column describes the evaluation conditions and specifies the exercise used to evaluate the objective.

In the next four columns the evaluation criteria for each objective are described, and the reference for each criterion is specified. The criteria consist of the controls and displays that correspond to each evaluation objective. For example, simulator evaluation objective 2.4, Selection of safe limits, has one control criterion, 2.4.C1, Ability to select safe limits according to distance (CPA) and time (TCPA), and one display criterion, 2.4.D1, Indication of safe limits. Both of these criteria are required for an ARPA simulator to support mariner assessment objective 2.2, Appreciation of the uses, benefits, and limitations of ARPA operational warnings. The source for each evaluation criterion is indicated in the reference column. Each criterion is derived either from the corresponding mariner assessment objective (MAO) in column two; IMO Resolution A.222(VII), Performance standards for navigational radar equipment (IMO, 1971); Resolution A.422(XI), Performance standards for automatic radar plotting aids (IMO, 1979); or Section A-I/12 of the amended STCW, Standards governing the use of simulators (IMO, 1996).
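The eight-column table structure described above can be sketched as a small data model. The Python below is illustrative only: the class and field names are ours, and the evaluation type, conditions, and criterion references for the worked example are assumed placeholders rather than values copied from the report's tables.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Criterion:
    code: str          # e.g., "2.4.C1" (control) or "2.4.D1" (display)
    kind: str          # "control" or "display"
    description: str
    reference: str     # MAO, IMO A.222(VII), IMO A.422(XI), or STCW A-I/12

@dataclass
class EvaluationObjective:
    code: str
    title: str
    mariner_assessment_objectives: List[str]  # column two: corresponding MAOs
    evaluation_type: str                      # column three: "static" or "dynamic"
    conditions: str                           # column four: evaluation conditions
    criteria: List[Criterion] = field(default_factory=list)

# The worked example from the text: objective 2.4, Selection of safe limits,
# supporting mariner assessment objective 2.2. The evaluation type,
# conditions, and references here are assumptions for illustration.
obj_2_4 = EvaluationObjective(
    code="2.4",
    title="Selection of safe limits",
    mariner_assessment_objectives=["2.2"],
    evaluation_type="static",
    conditions="(exercise as specified in the appendix tables)",
    criteria=[
        Criterion("2.4.C1", "control",
                  "Ability to select safe limits according to distance (CPA) "
                  "and time (TCPA)",
                  "IMO A.422(XI)"),
        Criterion("2.4.D1", "display",
                  "Indication of safe limits",
                  "IMO A.422(XI)"),
    ],
)

def supports(objective: EvaluationObjective, met_codes: set) -> bool:
    """An objective is supported only if every one of its criteria is met,
    matching the text's statement that both 2.4.C1 and 2.4.D1 are required."""
    return all(c.code in met_codes for c in objective.criteria)
```

Encoding the 33 objectives this way would let an evaluator tally, per simulator, which objectives are fully supported and trace each gap back to the missing control or display criterion.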

A-1

[Tables A-1 through A-4: ARPA simulator evaluation objectives, evaluation conditions, and evaluation criteria for the four objective categories (exercise programming, equipment set-up, simulation, and debriefing). The table content on pages A-2 through A-15 is not legible in this scan.]

Q. CO 5

o co »- CO o -*-*

LU

o> CM .y m c (_> 0J ts c 1_ T- .2 S» m T— T3 i. 5 co C «

5 CO

^- C •!= *- T- " £ <» Q J iS c c\i co .£ «> ^ cö <S.<2 CO Q. t T3

5 OT O CD

'■§:§ 2 x T3 CD C "O — C (M — Q J

co Q_ CO

o <

o <

O < 5

XI < - E O co

CD O)

CO CD

3 C0 O <° ©je. o> c £

2§c

n A CO S O -3

5-1 2 E xi J? E co

CO Q_ C O X>

O CD

= iS co "o S^ cö CO < -O Q. CO

CD c CD ig

3 'S 'C er O) co

co o co

°. CO g X E

C\l C0 3 CD "O T- CD 'S -O C cö E 5 .£ «

■^. C X >-= CD

1= C0 "O XI CD .E < ■£> —

O .£ Q. x -E CD

>. CD C0 ~0 ^ O) _ c xi < co o 3

co C\J co

li <= 5 CD Q.

"D CD

co E o= co tu m =

IS co — 3 T3

o _co CD .

O 5

8 .s s S

x:

3 O

T3

C0

c

cö C0 CD

LU

CD © .b CD c « >- t CO x: co CD H co xi

"CD ©

CD O)

« C0

er « O T3

t!

CD c:

c o

.b T3 CO c !2 Cö

Q- CD x:

CO CD 3= CO Ä o CD CÖ c > S CO XI

CD O

„ « ~ -2 CD i; co -a

C0 CD "O SQ-c.Cc .co ~ ra m o o x - — CD CD O ~ E X CO ^ CÖ 0

LU 3 c Ei

c o

'■s "> CO Q. 3 >. CO i >

LU

E CÖ c >, Q

o

'E Cö c >. Q

E CÖ c >. Q

E cO c >. Q

CD E

CD

in o co a> V xt co O <

co

O O c

§>«£©

y co

< Q. er

0>;C -<-

=5 C0 ,_ CD i^ co 5 Ä ^ *: ,_ CD o co co CD o xi co "O

a. 5 c S o Z S

co "CD t5 CD O) CD O g 5= co

c5 ™ ^ «

H c c I co1

3 co *; o

CO O O CO "D

c o _2

"cö c co .2

O)-^ o C CO Q. S ® r-^

C0

-t— T3 C0

^.£ E

Ig cö S CO O

g^ CD .— c co = r> x

CD Cvl TJ

Q. ^".£

i; 5 CD o o ■> *^ *— »M

C0 'S *-

— CO ü 3^5 E co 5" ■n > O V> LU

<2 T3

O 5 <0

ro CD " - °> c g c o

—i 2 05 . c

<

<

co o co 3W o co CÖ "O

Q. g C 5 o

CD D) CD o >- s: co £ -2 CD -s

E .£ £ co r- 3 CO ~ 7;

^. ,br Cö C .2 CJ O CO "O CO

co .2 Q- _ .t= ^ O co o"o -. ro Q- CD CD CO c C 3 •— rö

x P CM CD .E 1- -o cö cö.c E

c :° 11 o *"' ^_ 0 >-a> -o -j= > CD C 3 C CD CD C "O £ cö — co Q. © E

A-16

<v a c *^ c o u

u o 0*

C5 u c ©

•■c .3 a to a»

o c ©

• p*

ca

> u ©

.2 Ö E

3

DC

r o m o Ti CD r T> — o ,_ E n L- () co o

II) CO >

c o m '^ TI CO o o E "^ k_

r (i) (i) > T> 3 (!)

r • m u F Tt 1— CO

CD .9-

CO o 3 5 Er

CO CO c o o

3 " ® c m"a> iS E

5 o CD tr

CD x: .E

CO O 3 fc

CD cc

CM CM 5 W

O CAJ

CO <

O <

c o ü

5> > co co < c co„

■* o = 2

CM ~ .2 -i= T- ■- *; d

. X) CO o CO CO oici-

O CD

^c £

< Ä ^ CO CD 2 o

-Q E Ü _C >- 3 = c rX O O O S E

co co > 5 i= .£

JO -Q 3 TJ

ES c o

'co O CO u CO CD

>•,__« o c

Ü3 | ,_ 'c

= S > X!

21 c a..c . O m ~J

CO =

T> O

"O (D °i

o ex ^, o.<2 ** ■•»"° IS O CM tS - ü ^ E

CO ,_ CD C

. co >. CO 3 TJ

. CO

.t; CD

5 E < ~

CO

n ® <~> -o ■* = -

co .E X)

CO

ÜÖ

3 XI

> ° i3 o

CD >? CO 5 a 8

< o o .!2 g CD 4= ~* ~a c° CO _ CD c >.

g g « "5 § 2> UJ w £ E S B

3 CD C

T3 £ E o .

rJ= ®

2t£S» .«2 -2 o «

<D CD ,_ ü g X W (jj eg CD HID > ü c

CO o Q.

C o CO Q. 3 >.

>

E CO c >. Q

E CO c >. Q

St CD i. 0) Sr «> c .£ .5 w o "= CO CD CO CU -zr

CO O <

o CD .ti CD ?

ill c cp co $ -Q ©

• ?'. I- -g CO o

■ S g CD CO »t >

CO

CD > CD

CD c a. co ° E CD _

-C CO

CO

00 CD

lO .c

r" CD o > m CD

r CD m o E CD CO

2 >.

o «

t" CD " >-x: CD S ü c

CO CO TJ

< _ 0- CO en •= <^ o£

3 3

o 2 CO .b

= c o

i_ CO CO

> C CO

2^.2

E O -D

i; £ <D o o > 'S '5 ~ — CO o 33 J E co 5"

co i3 o

Is > CD 5 > CD u|>

■>, O CD *-

"5.-^ CD CD

.52 CD .£ x, Q o co c co

CO C CO o 00 CO CD CD o

<- ±± 3 ni CO O CO 3 2 $

^t- -C = T- *" Ü

A-17

T3 0> 3 a

o U

u o DO cu

u S O ••e

V3 at

'■B w a>

o c o !8

u ©

£

OH

I

es H

a> u c a> v »^ a> tr

O -D T3 © C ©

w

CO

Ü CD

. 1 a. CO CO °

co

O CD CD g)

O 2

in c

o o

c CD CO i»

O Ü

■o c

-2 © CD O CD :s Q. CO CO ü

CO fc 3 CD CO

> o T- CO

W ffl Ö) Q Q..S in E E

co

° © „,

oS 5 *- "n O LUc U „, CO T3 CO

CM ~ CO O Q CO TJ to

co o ».£

Q. CO

? ^ TJ CO

If 6-B o w E

£^:

m 2

m c CD co c: «) ^ CO o _CD £ CO o o

o

o h-

o H CO

5 CM

W <

.t: T3

o -©•

"* CO D) T- — -2 O ^TJ CO ® © T- CO ©

• CD CJ-

.*: "o

<1- OJ ^ CO

PS 2. in CD p T- B) S

■ CD O CO S O

w £ ni — .-. Q- ° CO '■•=

CO m <r © CO o O) co - E CD <

«> ^1 S 5 8 I c c co -^:

T 2 , „ „ CO Q- CO CO O

> CO >

Jg co ■—

IS

CD CD a. co

©

CO .© CO Q -2

co CO CO a. E o o

T3 _CD JD CO

.© O) O©

c o '** CD CO Q. 3 >.

> til

CO

>. Q

E CO c >. Q

c ^ a> CD C E CO ^ CO «J CD > CO

CO <

■P-2 CD CD C

« 5 co

CD J2

JS jo 3 CD

< © m '

CO O CO

C T3 — c

CO

CO Ü

tr © CO

CD O i_-

co -a E2 •§"5

CD CO

T- £ <£ 2 co" o

73

IS© S.E co o - CO

CO c CO CD co co ^-

p 3 Q.

o .E <

O CO

i o 3 2 C CO CD ° O -Ä ~ © ra "

Ü- C D © ^

CO JC CO CD ^ «

< a. DC < D) CO

.£ © o 's

£ o ♦= ©"cici

oj w "cö CD © E

-3 <5

i = E o S'-g

E

© " * Q. CO _C0

© o is j*- CD co >?£Z CD ~ O c

<. a. tr ■ <

CD CO T3

c5~2? 24! Ü o

^■io_ SCO

CO co CD ro^o

C CO >

Ü CO .b E O -Q

S 5 a> o o > *^ ■— ._

™ m "5 3 3 CD E co la in £ O

CD Ü

£ >-" CO to

>- CO o >- o © Q- jg

« s CO o

■o

^^ g.e CO o

> CO CO c CO CD

c 3 a.

o .£ <

CO

o

^' c?

c?oS -c - o S- © £ ■§ CO « CO

- ^ c D> C E o

_ C 3 CD ■■= CD-^Si-K o 1- CO CO SO c

co © E w 5

A-18

u o W) 0)

C3

«ft •8 J3

0) >•

o e o

• I"*

« 13 >

cu u o *J s

S

OH

I

.5 *^ a> ** ü c o TO

_3 TO >

LU &- O TO 3 E

cc

CD CD CO >. o >

O

C CO CD C

O '°> O CO

f« CO X!

£ CL E O CO s

■a . c

- - £= CO - C CD CD CD CO > XI CO <; CO CL 5 Q_ CO

co u .co

« 2 S-P I

o CO CD

CD

a> X) CO

_o Q. a. co

CD £ en CD

* 8" .op

■* co C-

r co co -a x: co o CC

T- CM Q Q c\i CM'

3 O i2

c 5 CD

5 CD

c CD > > > n. CD r CO

(11 CO u CM en x: CO

Q o U LC

oo CD X CD

,_ ■*

T- CM

c . . O D) E.E

X) CO

< t» Q CD Ö DC H c •*« -

CO

CO

CM

o I- co

o h-

= o XI x: < 5

>, CD " ^ 0)0

S5.C0 o

o

3 <

O CM

T3

CO

DC LL

CD CO 3 CO D-

T- W P) >t

CD O = Set •" -C ^s ^y := C X! CD CO S"2 S < CD < CO w

O O) x: o 1 J^ CO CD o *. O ~ x

m.£ CD C m = "> 3 t CL CD ■* CL O

CO ü

g CD CD X ~ CD

CD O

IS 3 TJ

i2 o

CD O) CO

^ >* ID CD o m > a> CO Xi CD

y o t- co

X CD m LU CC a.

CO CD

CO *- a>.<=

,<2 >- O .CO m CL CD Q. X CD £ CD

LU C£ ■=. DC

CD E

co CO

>. CO

|-2 CD Q. co

© O CD W *" CO CJ ^'o CD '— CD X XI x

LU < CD

CO "CD

CD C

CÜ CO

CD "^ CO

.2 o-£

^11 x o ü LU 2 CO

c o

'■5 "> CO CL 3 >. TO I- >

LU

E CO c

a

E co c >. Q

E CO c >. Q

E a) i- CD !r CD C .2 ^ I- *- .5 co o j= co CD TO 0) ■— 5 » 5

co O <

CD > o

^- .s> 2 -Q TO O 3 C E.2

CO TO _3 TO >

LU

CL CD

CC

CM

A-19

[This page intentionally left blank.]

A-20

APPENDIX B

ARPA Simulator Evaluation Protocol

This simulator evaluation protocol is designed to address PC-based ARPA simulators with different capabilities to determine the simulators' utility in performance-based assessment. The mariner assessment objectives addressed in this simulator evaluation are those specified in McCallum et al. (2000). The protocol has three sections:

• Simulator Specifications. This section includes a two-page worksheet for recording simulator specifications. The evaluator uses this worksheet to identify and briefly describe each simulator.

• IMO Scenarios. This section consists of four scenarios specified in IMO Resolution A.422 (XI), Performance standards for automatic radar plotting aids (IMO, 1979). These scenarios are designed to test the simulator's ability to replicate a specific ARPA operational limitation, the time delay associated with inaccurate sensor inputs. The instructions for this section are on page B-4.

• Operational Exercises. This section contains the main portion of the evaluation form. It is divided into four sections corresponding to exercises A, B, E, and F from McCallum et al. (2000). The general instructions for this section are on page B-9; instructions for the exercises precede the evaluation worksheets for each exercise.

B-1

SIMULATOR SPECIFICATIONS

Date of Evaluation Evaluator(s)

Manufacturer ____________ Model ____________

Hard Drive(s) and Monitor(s) For all required hard drives, describe the console type (trainee or instructor), manufacturer, processor, bytes of RAM, bytes of storage, and network card. For all required monitors, describe manufacturer, size, and color and video specifications.

                         Console Type    Hard Drive    Monitor
Test Model:           1.
                      2.
Minimum Requirements: 1.
                      2.

Operating System and Source Code

Test Model Others Available

Accessories Check all that are available and circle those used with test model.

• Keyboard • Mouse • Touchscreen • Trackball • Actual radar keyboard

• Other

Radar Interface Check all that are available and circle the interface used on test model.

• Furuno • Kelvin Hughes • Racal Decca • Raytheon
• Sperry • Military • Other(s) • No specific model

Radar Display

Radar Controls

Algorithmic Model

Actual Radar Simulated

B-2

ARPA Controls Check all that are available, and circle those used with test model.

Trainee Instructor

Icons • •

Menus • •

Function keys • •

Other • •

Peripherals Check all that are available, and circle those used with test model.

• Printer • Plotter • ECDIS • 360° sea visualization • Ship controls

• Other(s)

Land Mass

Number in Library

Instructor Can Create

Company Engineers Can Create

Own Ship •

Target •

Number of target ships that can be tracked

Exercise Programming

• Pre-programmed exercises can be modified

Number of pre-programmed exercises available

Custom exercises can be programmed

System Troubleshooting

• Operating manual

Networking Capability

• Instructor console

Demonstration of system malfunctions

Other trainee consoles Number of trainee consoles

Cost of Test Model Trainee Console Instructor Console

Hardware Software Total $ $

Complete System

$

B-3

IMO SCENARIOS

This section provides the means to determine whether the simulator can achieve the performance standards stipulated in four standardized scenarios developed by IMO (cited in Bole & Dineley, 1990, p. 364).

Rationale The level of accuracy of actual ARPA units varies depending on the sensor input data and technical equipment specifications. For this reason, IMO has specified standards that should be achieved by an actual ARPA unit under four specific operational conditions. The goal of this section is to verify that the simulator replicates the operational limitations of an actual ARPA in each of these four conditions.

Instructions The ARPA simulator should be tested on the four standardized IMO scenarios specified in the following pages. First, the scenario's initial conditions should be programmed into the simulator. Each scenario includes only one target and own ship. Second, the scenario should be run and the target steadily tracked. After the first minute, the simulator's performance data should be recorded. The tracking should be continued, and after three total minutes, the simulator's performance data should be recorded again. After the scenario ends, the entire process should be repeated five more times. Repeating the scenarios more than once enables evaluators to verify that the simulator faithfully replicates the limitations of actual equipment. Each iteration should generate a different set of data.
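The trial procedure above (six repetitions per scenario, with performance data recorded after one minute and after three total minutes of steady tracking) can be sketched as follows. This is an illustrative sketch only: the function names and the tracking-error model are assumptions, standing in for whatever performance data the evaluator actually records.

```python
import random

def tracked_speed_error(minutes_tracked: float, rng: random.Random) -> float:
    """Toy stand-in: tracking error shrinks as the ARPA smooths more sensor data."""
    return 2.0 / minutes_tracked + rng.gauss(0.0, 0.25)

def run_scenario(scenario_id: int, n_trials: int = 6) -> list:
    """Run one IMO scenario n_trials times, recording data at 1 and 3 minutes."""
    rng = random.Random()  # unseeded on purpose: each iteration yields different data
    records = []
    for trial in range(1, n_trials + 1):
        records.append({
            "scenario": scenario_id,
            "trial": trial,
            "error_at_1_min": tracked_speed_error(1.0, rng),
            "error_at_3_min": tracked_speed_error(3.0, rng),
        })
    return records

data = run_scenario(scenario_id=1)
```

Note that the protocol's requirement that "each iteration should generate a different set of data" is modeled here by the unseeded random source; a simulator that returned identical numbers on every run would fail that check.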

B-4

Scenario 1

[Page B-5 of the original report is rotated and did not survive text extraction. It presents the initial conditions for IMO scenario 1 (own ship and one target) and a data recording form listing the evaluation criteria at Accuracy Level 1, with columns for five one-minute accuracy trials and five three-minute accuracy trials.]

B-5

Scenario 2

[Page B-6 of the original report is rotated and did not survive text extraction. It presents the initial conditions for IMO scenario 2 and a data recording form listing the evaluation criteria at Accuracy Level 2, with columns for five one-minute accuracy trials and five three-minute accuracy trials.]

B-6

>> "c o <o o a u Ul

ID Ul

U O

T3

> O Ul Cu (D Ul

-o <D ID CU to

T3 c a (D C/3 u 3 O o (D

13

u O

4—»

03

3

(D 43

O

03 <D

H G

03 T3

<D 43

O o (D Ul

to"

<3

2 a

a a

a O

c 2 S

o o u, <D

C 3

PQ

43

H

c D

43 H

aj 3 C

o> c o

4-1 o

.2 S 'u c3

<D -O

u .O

Ul <D

s -s

T3 O

<-" G <D £

<D

10

<D 43

T3 u. O O CD Ul

5 a

a a

^ G £ 2 -a '3

E~H ul

° .§ '—l 'to G (D S £ = & o o

<D

C

T3 (D

T3 O O

(/) (D

es c o

73 T3 es G 03

G (D iD

X> <D > 03

43

03

<D 43

4-<

< 03

^ u O <D ^ ."G

<D

<D 43

03 Ul 13

u G O u

Ü u

0-,

03 u H

<D 43 4—»

03 _3 "S3 > ID

-1—1

C

> ID

a

a a

-o G 03

<u a a

5U C O

ID 43

1)

03

O

03 O

c iD

« i—I

G <D

G O O D

-a ID 03

<D G

r-i U

2 « -u' (D CM -g

G -tJ

O

o <D > ID

.5 ° 03 H3

^ s <D O

c «

o o <D u

>

(D ,G

J3 (/3 *-> (D

<D Ul O

3 «2 m -Ö G 03

O ■~ o 03

Ul 03 >

3 O

43

03 -a

a, D (D 43

+—» . 03 to (D ~ & -S <D C

as ^

"S C5 S a> u in

O S1

I-

1/3 a o

T3 c o w

H

■B 1) j«:

5 s. 4*

o CM

o in

(M CtO

«> "O 2 « 0)

LO

CD

a> en u as 1- a>

Q) <o o 3 i; oo

.>- 3 CO ^- .° OJ

O

D> C

LO (Q Tj-

H) o EQ

a> o> t- c c

5 00

■Q Q> a> ^ Q. LO

o. CO

JC to c 5 O

u. o 3 o O o o

o «S C <u u

Q £

c2 es -w es

T3 -toi C« 01

+-> id O es

"3 Cfl ™

OH

WD S

o u a> t-i u a a> v

J=

!- O

V© ■

cu

es H

10

.2 *ü* t- >. u CO 1— 3 Ü O < B 3 C

"i ■ cu u

JC H

in

«*

CO

CM

-

JO ro 'u h- >. u (0 u 3 O o < 9)

■*-*

3 C

c O

m

tr

CO

CN

T-

Acc

ura

cy

Leve

l3

CO CD

a co

]§ '5

CO

■8 CD

E

E c

d c

JZ

%

c E

T—

c

'I

o CO

CO

c JZ

'1

j5 o

c JZ

'I .CO

Eva

luatio

n

Crite

ria

5 o 5 K

CD

S2 3 o o CD

■a CD CD

CD

1 CD

c CO

c CO CD

CD

!+3

'3 > u

42

O C

3 O

3 u

•a c

D O O

< U H

H

c

T3 C3

o crt bl)

OJ C j= J^ <)

Pit u 0) m >^

T3 Cu P3

43 P

B-7

Scenario 4

[Page B-8 of the original report is rotated and did not survive text extraction. It presents the initial conditions for IMO scenario 4 and a data recording form listing the evaluation criteria at Accuracy Level 4, with columns for five one-minute accuracy trials and five three-minute accuracy trials.]

B-8

OPERATIONAL EXERCISES

This section constitutes the core of the simulator evaluation form. We organized the evaluation form on the basis of four of the operational ARPA exercises specified in McCallum et al. (2000). The four exercises (A, B, E, and F) begin with a general description of the exercise parameters and the initial data (name, bearing, range, course, speed, and vessel type) for all targets involved in the exercise. Following the initial target data, we divided each exercise into at most four sections, corresponding to the simulator evaluation objective categories: exercise programming, equipment set-up, simulation, and debriefing. Each exercise has simulation objectives; however, only exercises A, B, and E have exercise programming objectives, and only exercises A, B, and F have equipment set-up objectives. Exercise B is the only one with debriefing objectives.

In the exercise programming section, we evaluated each simulator on its ability to program the initial conditions necessary to create the ARPA exercise. In the equipment set-up section, we evaluated each simulator on its ability to initialize different ARPA features, such as the display orientation. In the simulation section, we evaluated each simulator on its ability to simulate the operational capabilities of an actual ARPA unit. In the debriefing section, we addressed each simulator's ability to permit an instructor to monitor, record, replay, and print exercises. The following data are addressed on the evaluation form:

• Time. The time frame for the exercise.

• Evaluation Conditions. The required actions the simulator should perform. The numbered conditions address specific aspects of the simulator evaluation objectives listed in Appendix A. The non-numbered conditions are described in the cover page for each exercise.

• Evaluation Criteria. Descriptions of the control and display features to be evaluated.

• Availability. An indication of the presence or absence of the feature.

• Performance Rating. An indication of how well the simulator performed each criterion.

• Comments. A record of any other pertinent information about the criterion.

For example, the equipment set-up section of exercise A contains simulator evaluation objective 2.1 - Selection of display presentation, orientation, and vector mode. The evaluation conditions for this objective specify that the simulator should be set to a North-up display orientation. The evaluation criteria for objective 2.1 include eleven separate controls and displays. The criteria relating to display orientation are 2.1.C2 - Ability to toggle between North-up, and either course-up or head-up azimuth stabilization; and 2.1.D2 - Indication of display orientation. When evaluating these criteria, an evaluator would indicate in the availability column whether the simulator has these capabilities. Then, under performance ratings, the evaluator would indicate to what extent the simulator met the criteria. "Y" (yes) indicates the simulator fully meets the criteria; "P" (partial) indicates the simulator partially meets the criteria; and "N" (no) indicates the simulator does not meet the criteria.
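One worksheet row as described above can be modeled in a few lines. This is a minimal sketch: the report's worksheets are paper forms, so the class and field names here are illustrative assumptions, while the Y/N availability codes and the Y/P/N performance ratings are taken from the text.

```python
from dataclasses import dataclass

AVAILABILITY = {"Y", "N"}   # feature present / absent
RATINGS = {"Y", "P", "N"}   # fully / partially / does not meet the criterion

@dataclass
class CriterionResult:
    """One row of the evaluation worksheet (field names are assumptions)."""
    objective: str      # e.g., "2.1"
    criterion: str      # e.g., "2.1.D2 - Indication of display orientation"
    availability: str   # Y/N
    performance: str    # Y/P/N
    comments: str = ""

    def __post_init__(self):
        if self.availability not in AVAILABILITY:
            raise ValueError("availability must be Y or N")
        if self.performance not in RATINGS:
            raise ValueError("performance must be Y, P, or N")

row = CriterionResult(
    objective="2.1",
    criterion="2.1.D2 - Indication of display orientation",
    availability="Y",
    performance="P",
    comments="Orientation shown for North-up only",
)
```

Restricting the rating fields to the worksheet's own codes keeps recorded results comparable across evaluators, which is the point of the fixed Y/P/N scale.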

B-9

Evaluation Worksheets for ARPA Exercise A

Exercise A is an open waters scenario with 21 ships and 3 buoys, including 6 container ships and 15 fishing vessels; precipitation clutter is present near the buoys. The initial data for the candidate's own ship and the target ships are in Table B-9. The following events occur in this scenario:

• Own ship and target E maneuver; target D reduces speed.

• Target C is lost (due to instructor moving target), resulting in lost target alarm actuation.

• The speed log and gyrocompass are disabled.

• Sea clutter appears at end of the exercise near own ship.

These events are noted under "Condition" in Table B-12. Refer to the "Time" column of that table for the timing of each event.

Table B-9. Vessel data at 00:00 for exercise A (McCallum et al., 2000).

Target Name    Bearing    Range (nm)    Course    Speed (kt)    Target Type
Own Ship                                090°      20            Container
A              073°       10.4          DIW       DIW           Container
B              090°       10.0          090°      7             Container
C              131°       10.6          000°      23            Container
D              285°       8.2           090°      25            Container
E              050°       10.0          200°      13            Container
F                                       DIW       DIW           Buoy
G                                       DIW       DIW           Buoy
H                                       DIW       DIW           Buoy
I                                       DIW       DIW           Fishing
J                                       DIW       DIW           Fishing
K                                       DIW       DIW           Fishing
L                                       DIW       DIW           Fishing
M                                       DIW       DIW           Fishing
N                                       DIW       DIW           Fishing
O                                       DIW       DIW           Fishing
P                                       DIW       DIW           Fishing
Q                                       DIW       DIW           Fishing
R                                       DIW       DIW           Fishing
S                                       DIW       DIW           Fishing
T                                       DIW       DIW           Fishing
U              045°       10                                    Container

B-10

Instructions

Program the data in Table B-9 into the ARPA simulator. Review the exercise programming conditions given in column 2 of Table B-10. Evaluate the simulator's ability to generate the target characteristics and location required for this exercise. Next, review the equipment set-up conditions in Table B-11. Determine whether the simulator can be set up per the required conditions. Third, review the simulation conditions in Table B-12. Determine whether the simulator can simulate the events in a realistic and dynamic manner. Under "Availability," note whether the simulator has the required control or display. Under "Performance ratings," indicate the extent to which the simulator satisfies each evaluation criterion. Y (yes) indicates the simulator fully meets the criterion; P (partial) indicates the simulator partially meets the criterion; and N (no) indicates the simulator does not meet the criterion. If limitations are found for any criterion, detail them under "Comments."
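Programming the Table B-9 data into a simulator could look like the following sketch. The record layout and the DIW convention (dead in water: no course, zero speed) come from the table itself; the parser, type names, and the choice to represent missing cells as None are illustrative assumptions, not part of the protocol.

```python
from typing import NamedTuple, Optional

class Target(NamedTuple):
    name: str
    bearing_deg: Optional[float]   # true bearing from own ship; None if not given
    range_nm: Optional[float]      # None if not given in the table
    course_deg: Optional[float]    # None for DIW (dead in water) targets
    speed_kt: float                # 0.0 for DIW targets
    kind: str                      # Container / Buoy / Fishing

def make_target(name, bearing, rng, course, speed, kind) -> Target:
    """Build a Target from one Table B-9 row, mapping DIW to no course / zero speed."""
    diw = str(course).upper() == "DIW" or str(speed).upper() == "DIW"
    return Target(
        name=name,
        bearing_deg=None if bearing is None else float(bearing),
        range_nm=None if rng is None else float(rng),
        course_deg=None if diw else float(course),
        speed_kt=0.0 if diw else float(speed),
        kind=kind,
    )

# Three rows from Table B-9 (exercise A, time 00:00):
targets = [
    make_target("A", 73, 10.4, "DIW", "DIW", "Container"),
    make_target("B", 90, 10.0, 90, 7, "Container"),
    make_target("C", 131, 10.6, 0, 23, "Container"),
]
```

Loading the rows into a structure like this makes it straightforward to check the evaluation criteria that depend on target counts (for example, the ability to program a minimum of 20 targets) before the exercise is run.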

B-11

Table B-10. ARPA exercise A evaluation worksheet: exercise programming.

[This table is rotated in the original report; only the entries below survived text extraction. Columns: Time; Condition; Simulator Evaluation Criteria; Availability (Y/N); Performance Rating (Y/P/N); Comments.]

Recoverable conditions:
• 1.5 Reproduce critical equipment malfunctions
• 1.6 Reproduce critical ARPA operational limitations

Recoverable evaluation criteria:
• Ability to generate precipitation in specific location and density
• Ability to program future maneuvers (course change to 120° at 06:00)
• Ability to obtain range and bearing for all targets
• Ability to program removal and addition of target from exercise area
• Ability to select model of target ship
• Ability to program minimum of 20 targets
• Ability to program speed log malfunction
• Ability to program gyrocompass malfunction

B-12

Table B-11. ARPA exercise A evaluation worksheet: equipment set-up.

[This table is rotated in the original report; only the entries below survived text extraction. Columns: Time; Condition; Simulator Evaluation Criteria; Availability (Y/N); Performance Rating (Y/P/N); Comments.]

Recoverable conditions:
• 2.1 Selection of display presentation, orientation, and vector mode

Recoverable evaluation criteria:
• Ability to toggle between sea- and ground-stabilized modes
• Ability to toggle among North-up, course-up, and head-up display orientations
• Indication of display orientation
• Ability to use the following ranges: (1) 12 or 16 mile; (2) 3 or 4 mile
• Ability to set speed log with 1 kt resolution: (1) manual

B-13

.2 "E

U

§■

a %> U a CU

is 3

.2 rj

I

CU

!- CU

n CU

H

(0 *-» c V E E o ü

Per

form

ance

R

atin

g

Z

a

>-

Ava

il-

abili

ty

Y/N

.2 CD

o c o m 3 CO >

HI

o

3 E

CO

Q. CO

b

CO

c CO

ED M- Q. O C

.2 co

8 g- ? o — o

CO co CO Q. E o o o >> TO O c o

"cO .2 3 "O D. -E .£

TO c

c o

In '3 cr 0 CO

O c 0

"co 0

XJ

.0

CO E 0 CD

3 t> CO 0

^ «= §•2 "co.SS O 3

T5 C" £ co

CO 3

CO "O

1! §•£ 08 » O 3

'■5 e'- er ° JE co

CO CD i_ CO

c 0 co 3 0 X CD

O C g to 0 xs

ö c o ü

Abi

lity

to s

et c

ompa

ss lo

g w

ith 1

° re

solu

tion:

(1) m

anua

l in

put co

CO CO Q. E o o o a_

>. TO

s

o

03 '3 cr o CO

"5 CO

o ><

.■Ü CO = TO

Abi

lity

to s

elec

t ta

rget

s m

anua

lly w

hile

in a

utom

atic

ac

quis

ition

CD i_

'3 cr ü CO

_>. "co 3 c CO

£ 0 >.i2 ~ a> = TO < 2 A

bilit

y to

sel

ect

excl

usio

n ar

ea a

ccor

ding

to:

(1)

bear

ing

(2)

rang

e

c o

45 c o Ü

o

"So to o CO <5 CO O

n ®

E|

CM

c\i

TO c

o

il Q. 3 3

O CD < CO

CO cvi 2.

6

Exc

lusi

on a

rea

set t

o 3

nm n

orth

of

ow

n sh

ip

CU

E Q. 3

® C75

3

"5

3

B-14

Table B-12. ARPA exercise A evaluation worksheet: simulation.

[This table is rotated in the original report; only the entries below survived text extraction. Columns: Time; Condition; Simulator Evaluation Criteria; Availability (Y/N); Performance Rating (Y/P/N); Comments.]

Recoverable conditions:
• 1.2 Replicate environmental conditions
• 3.2 Track one or more targets using automatic acquisition
• 3.2 Track one or more targets using manual acquisition

Recoverable evaluation criteria:
• Target detection is influenced by location and density of precipitation
• Ability to simultaneously display information for at least 20 targets
• Ability to simultaneously display information for at least 10 targets
• Ability to acquire, track, process, and continuously update information for at least 20 targets
• Ability to acquire, track, process, and continuously update information for at least 10 targets

B-15

[Rotated worksheet table (simulation conditions, continued); most of the scanned text is not recoverable. Legible criteria: visual and/or audible warning when a new target enters the acquisition area; visual and/or audible warning when the target store is full; target data (course, speed, CPA, TCPA, range, bearing) show trend after first minute; ability to cancel the display of unwanted target data; ability to activate or deactivate the new target warning; ability to request the display of target data (course, speed, CPA, TCPA, range, bearing). Legible conditions: 3.3 New target enters the acquisition area; 3.3 Additional target enters acquisition area while ARPA is already tracking maximum number of targets ("target store full" warning actuates).]

B-16

[Rotated worksheet table (simulation conditions, continued); most of the scanned text is not recoverable. Legible criteria: target data (course, speed, CPA, TCPA, range, bearing) show trend after first minute; ability to activate or deactivate the lost target warning. Legible conditions: 3.9 Display of target vectors/data following own ship course change; Loss of track for target C and sounding of "lost target" alarm.]

B-17

.2 "S CU

"C ej

S

•-B •3 3

u a •^ <u <u

J3

o ^ ^ 3 = S s a at O 5 <-> «5 w

>

I

0>

.22 "o u 0) IX

W

r* ■

M

3 «8 H

CO

c 0) E E o O

Per

form

ance

R

atin

g

z

Q.

>-

Ava

il-

abili

ty

Y/N

.5 V

*^ O c o

■*■*

(0 3 CO >

UJ

o _C0

3

E CO

CO

Q. 0) Q

Dis

play

at

leas

t fo

ur e

qual

ly

time-

spac

ed p

ast

posi

tions

of

any

tar

get(

s) b

eing

tr

acke

d ov

er a

per

iod

of a

t le

ast

eigh

t m

inut

es

If ta

rget

has

bee

n tr

acke

d fe

wer

than

eig

ht m

inut

es,

num

ber

of p

ast

posi

tions

di

spla

yed

refle

cts

the

time

trac

ked

A ta

rget

just

acq

uire

d ha

s no

ves

sel

hist

ory

Tar

get

data

(co

urse

, sp

eed,

C

PA

, T

CP

A,

rang

e,

bear

ing)

sho

w t

rend

afte

r fir

st m

inut

e

CO 03

I1 £$ ££ « XI CO ©

| « j- CD CD CO D3'o

CO 2 H Q-

TJ CD CD Q. CO CD .O

co .£ ._ c -5 «

1* CO o

is CO 03

> o

CO o

■— cO

^■%

H O T3 CD

CO S

§ s- CD ~\

S ® 2 s?

uJ iS

Ö

c o Ü

CD CO CO CD > o >, CD CO CD Q. CO CO O T3

— £ S to < Jz A

bilit

y to

res

et d

isab

led

spee

d lo

g

c o

*** TO C o ü

3.6

A

ctiv

atio

n of

ve

ssel h

isto

ry

disp

lay

3.8

D

ispl

ay o

f tar

get

vect

ors/

data

fo

llow

ing

E co

urse

cha

nge

and

D s

peed

ch

ange

3.15 S

peed

log

di

sabl

ed

E i-

uS CO T— t—

o o

co

i— o

B-18

[Rotated worksheet table (simulation conditions, continued); most of the scanned text is not recoverable. Legible criteria: only head-up relative motion display is available when gyrocompass fails; target detection is influenced by location and density of precipitation; ability to suppress unwanted echoes from sea clutter, rain, and other types of precipitation; ability to manually and continuously adjust the sea and precipitation anti-clutter; ability to toggle between vector modes without losing the tracking information; false echoes when sea clutter appears near buoys, and also from precipitation clutter present throughout exercise. Legible condition: 3.13 Switch from relative motion to true motion display to identify aspect of target. END OF EXERCISE.]

B-19

Evaluation Worksheets for ARPA Exercise B

In this exercise, the candidate's own ship is maneuvering in the Strait of Gibraltar. Three vessel targets (A, B, C) proceed in various directions, while nine fishing vessels remain in fixed positions. A racon (D) is on the coastline at Isla Tarifa. The display presentation is set to North-up. This scenario features additional programming requirements and events:

• A zone of shallow depth is in the northeast corner of the exercise area.

• A radar blind sector is in the southwest corner of the exercise area.

• Using parallel index lines, own ship maneuvers to 270° to stay 3 nm from Isla Tarifa.

Table B-13. Vessel data at 00:00 for exercise B (McCallum et al., 2000).

Target Name Bearing Range nm Course Speed kt Type of Target

Own Ship 255° 20 Container

A 230° 4.8 075° 15 Container

B 255° 11.7 090° 16 Container

C 255° 6.0 165° 10 Container

D 281° 6.7 DIW Racon

E 069° 5.8 255° 30 Container

F 207° 5.8 DIW Fishing

G 207° 6.7 DIW Fishing

H 214° 6.3 DIW Fishing

I 207° 4.3 DIW Fishing

J 296° 4.0 DIW Fishing

K 296° 3.0 DIW Fishing

L 311° 4.0 DIW Fishing

M 312° 3.0 DIW Fishing

N 300° 3.0 DIW Fishing
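Simulators that take Cartesian rather than polar initial positions need the bearing/range pairs in Table B-13 converted with basic trigonometry (bearings measured in degrees true, clockwise from north). A minimal sketch, transcribing a few rows of the table; the function name is illustrative:

```python
import math

def polar_to_xy(bearing_deg, range_nm):
    """Convert a bearing (degrees true, clockwise from north) and a range (nm)
    into east/north offsets from own ship, in nautical miles."""
    theta = math.radians(bearing_deg)
    east = range_nm * math.sin(theta)
    north = range_nm * math.cos(theta)
    return east, north

# A few rows transcribed from Table B-13: target -> (bearing, range nm)
targets = {"A": (230, 4.8), "B": (255, 11.7), "C": (255, 6.0), "D": (281, 6.7)}

for name, (brg, rng) in sorted(targets.items()):
    e, n = polar_to_xy(brg, rng)
    print(f"{name}: {e:+.2f} nm east, {n:+.2f} nm north")
```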

Instructions

Program the data in Table B-13 into the ARPA simulator. Review the exercise programming conditions given in column 2 of Table B-14. Evaluate the simulator's ability to generate the target characteristics and location required for this exercise. Next, review the equipment set-up conditions in Table B-15. Determine whether the simulator can be set up per the required conditions. Third, review the simulation conditions in Table B-16. Determine whether the simulator can simulate the events in a realistic and dynamic manner. Lastly, review the debriefing conditions in Table B-17. Determine whether the simulator can provide the required summaries of exercise activities. On all worksheets, under "Availability," note whether the simulator has the required control or display. Under "Performance ratings," indicate the extent to which the simulator satisfies each evaluation criterion. If limitations are found for any criterion, detail them under "Comments."
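The recording scheme the instructions describe (an "Availability" Y/N column, a Y/P/N "Performance rating," and a "Comments" field for any limitations) can be modeled directly. A minimal sketch; the class and field names are hypothetical, not taken from the report:

```python
from dataclasses import dataclass

@dataclass
class CriterionResult:
    """Hypothetical record mirroring one worksheet row."""
    criterion: str
    available: bool          # "Availability" column: Y/N
    rating: str              # "Performance rating": Y, P, or N
    comments: str = ""       # required when limitations are found

    def __post_init__(self):
        if self.rating not in ("Y", "P", "N"):
            raise ValueError("rating must be Y, P, or N")
        # Worksheet convention: any limitation (P or N) must be explained.
        if self.rating in ("P", "N") and not self.comments:
            raise ValueError("P/N ratings require a comment on the limitation")

r = CriterionResult("Ability to suppress acquisition in certain areas",
                    available=True, rating="P",
                    comments="Exclusion zone limited to circular areas only")
```

Enforcing the comment in the constructor keeps a completed worksheet self-documenting: no partial or failed criterion can be recorded without its explanation.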

B-20

[Table B-14. Exercise programming conditions for exercise B (rotated table; the scanned text is not recoverable). This worksheet covers the simulator's ability to generate the target characteristics and locations required for the exercise.]

B-21

[Table B-15. Equipment set-up conditions for exercise B (rotated table; most of the scanned text is not recoverable). Legible criteria: ability to select safe limits according to (1) distance (CPA), (2) time (TCPA); availability of (1) a time-adjustable scale or (2) a fixed time scale; ability to suppress acquisition in certain areas. Legible condition: 2.6 Exclusion zone set to exclude southerly fishing vessels and land.]

B-22

[Table B-16. Simulation conditions for exercise B (rotated table; most of the scanned text is not recoverable). Legible criteria: visual and/or audible warning when target violates "safe limit" criteria; indication of target causing "safe limit" warning; visual and/or audible warning when target enters "guard zone" area; indication of target causing "guard zone" warning; ability to activate or deactivate "safe limit" warning; ability to activate or deactivate "guard zone" warning. Legible conditions: 3.2 Acquire and track all targets, except fishing vessels to the north, using manual acquisition; 3.3 Target violates safe limit area and activates warning; 3.3 Target enters guard zone area and activates warning.]

B-23

[Table B-16. Simulation conditions for exercise B, continued (rotated table; most of the scanned text is not recoverable). Legible criteria: ability to measure distance between parallel index line and land mass; ability to measure bearing of parallel index line; ability to measure range between own ship and parallel index line; ability to generate radar interference with closest target. Legible condition: 3.12 Use of parallel index lines to maintain 3 nm at bearing 180° from Isla Tarifa. END OF EXERCISE.]

B-24

[Table B-17. Debriefing conditions for exercise B (rotated table; most of the scanned text is not recoverable). Legible criteria: log of own ship course and speed at a given time; log of target ship bearing, range, course, speed, CPA, and TCPA at a given time; ability to specify which voyage parameters to display; ability to rewind, fast forward, pause, and save.]

B-25

[Table B-17. Debriefing conditions for exercise B, continued (rotated table; most of the scanned text is not recoverable). Legible criteria: printout of exercise events in (1) chart view, (2) radar view; ability to monitor trainee station using (1) chart view, (2) radar view; ability to print exercise activities following exercise. Legible condition: 4.3 Ability to print exercise activities.]

B-26

Evaluation Worksheets for ARPA Exercise E

In exercise E, the candidate's own ship is navigating through a narrow channel in New York City's Upper Harbor. The following characteristics are present in this scenario:

• A cross current and/or wind is present, requiring "crabbing" of own ship down the channel; crabbing is facilitated by the use of ground-stabilized mode.

• Own ship is outbound.

• Target A is inbound.

• Target B is in the harbor, dead in the water.

Table B-18. Vessel data at 00:00 for exercise E (McCallum et al., 2000).

Target Name Range nm Bearing Course Speed kt Target Type

Own Ship 180° 15 Container

A 7.0 167° 347° 11 Container

B 4.0 172° DIW Container

Instructions

Program the data in Table B-18 into the ARPA simulator. Review the exercise programming conditions given in column 2 of Table B-19. Evaluate the simulator's ability to generate the current and wind. Next, review the simulation conditions in Table B-20. Determine whether the simulator can be set up per the requirements of this scenario. Evaluate the simulator's ability to simulate the events in a realistic and dynamic manner. Under "Availability," note whether the simulator has the required control or display. Under "Performance rating," indicate the extent to which the simulator satisfies each evaluation criterion. If limitations are found for any criterion, detail them under "Comments."
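The "crabbing" the scenario calls for can be quantified with the standard velocity-triangle (course-to-steer) calculation: the heading is offset so that the cross-track component of the current is cancelled. A minimal sketch; the set and drift values below are illustrative assumptions, since the scenario does not specify them:

```python
import math

def course_to_steer(track_deg, speed_kt, set_deg, drift_kt):
    """Heading to steer so the resultant track over ground equals track_deg.

    track_deg: desired track over ground (degrees true)
    speed_kt:  own ship speed through the water
    set_deg:   direction toward which the current flows (degrees true)
    drift_kt:  current speed
    """
    # Current component perpendicular to the desired track
    cross = drift_kt * math.sin(math.radians(set_deg - track_deg))
    # Crab angle needed to cancel it (steer "into" the current)
    crab = math.degrees(math.asin(cross / speed_kt))
    return (track_deg - crab) % 360

# Illustrative values only: own ship makes 15 kt down a 180° channel
# against a 2 kt current setting 090° (toward the east).
print(course_to_steer(180, 15, 90, 2))
```

In ground-stabilized mode the displayed vectors reflect motion over ground, which is why that mode facilitates holding the crabbed track down the channel.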

B-27

[Table B-19. Exercise programming conditions for exercise E (rotated table; most of the scanned text is not recoverable). Legible criteria: ability to generate current; ability to generate wind; effect of current on own ship's hydrodynamic model; effect of current on own ship's course and speed; effect of wind on own ship's hydrodynamic model; effect of wind on own ship's course and speed. Legible conditions: 1.3 Effects of environmental characteristics on own ship; 1.4 Effects of environmental characteristics on target ships.]

B-28

[Table B-20. Simulation conditions for exercise E (rotated table; most of the scanned text is not recoverable). Legible criteria: after changing display mode, plotting information is available within a period not exceeding 4 scans; after changing display orientation, plotting information is available within a period not exceeding 4 scans; ability to toggle between presentation modes, (1) ground-stabilized and (2) sea-stabilized; ability to toggle between display orientations, (1) course-up and (2) head-up. Legible conditions: 3.2 Autodrift (ground-lock) set on target B (bearing 172°, range 4.0 nm); Target A acquired; Own ship navigates down channel avoiding buoys, land, and other targets; 3.1 Toggle between presentation modes during navigation down channel; 3.1 Toggle between display orientations while in ground-stabilized mode to identify aspect of target(s). END OF EXERCISE.]

B-29

Evaluation Worksheets for ARPA Exercise F

Exercise F is an open-waters scenario with vessel targets A, B, and C starting at the same bearing (220°) but at different ranges (11.5, 9.5, and 7.5 nm). Targets A and C are on a collision course. The situation requires the candidate's own ship to maneuver to starboard to stay outside the established safe-limit closest point of approach (CPA) of 1 nm.

Table B-21. Vessel data at 00:00 for exercise F (McCallum et al., 2000).

Target Name Range nm Bearing Course Speed kt Target Type

Own Ship 270° 20.0 Container

A 11.5 220° 005° 26.5 Container

B 9.5 220° 295° 29.0 Container

C 7.5 220° 335° 16.8 Container

D 6.0 048° 270° 20.0 Container

Instructions

Program the data in Table B-21 into the ARPA simulator. Review the equipment set-up conditions given in column 2 of Table B-22. Determine whether the simulator can be set up per the required conditions. Then, review the simulation conditions in Table B-23. Evaluate the simulator's ability to replicate the events in a realistic and dynamic manner. To evaluate the accuracy of data and check the simulator's capability to generate a Search and Rescue Transponder (SART), repeat the simulation portion of this exercise, using the worksheet provided in Table B-24 to record the results of the second iteration. On all worksheets, under "Availability," note whether the simulator has the required control or display. Under "Performance ratings," indicate the extent to which the simulator satisfies each evaluation criterion. If limitations are found for any criterion, detail them under "Comments."
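The safe-limit criterion in this exercise is stated in terms of CPA, which an ARPA derives from the standard relative-velocity solution. A minimal sketch of that computation, applied to own ship and target A from Table B-21; the function name is illustrative:

```python
import math

def cpa_tcpa(rng_nm, brg_deg, own_crs, own_spd, tgt_crs, tgt_spd):
    """Closest point of approach (nm) and time to CPA (hours),
    from the standard relative-motion triangle."""
    # Target position relative to own ship (east, north), in nm
    px = rng_nm * math.sin(math.radians(brg_deg))
    py = rng_nm * math.cos(math.radians(brg_deg))
    # Relative velocity = target velocity minus own velocity, in kt
    vx = tgt_spd * math.sin(math.radians(tgt_crs)) - own_spd * math.sin(math.radians(own_crs))
    vy = tgt_spd * math.cos(math.radians(tgt_crs)) - own_spd * math.cos(math.radians(own_crs))
    v2 = vx * vx + vy * vy
    if v2 == 0:                       # no relative motion: range never changes
        return rng_nm, float("inf")
    tcpa = -(px * vx + py * vy) / v2  # hours; negative means already diverging
    cpa = math.hypot(px + vx * tcpa, py + vy * tcpa)
    return cpa, tcpa

# Own ship (course 270°, 20.0 kt) vs. target A from Table B-21
cpa, tcpa = cpa_tcpa(11.5, 220, 270, 20.0, 5, 26.5)
print(f"CPA {cpa:.2f} nm in {tcpa * 60:.1f} min")
```

With the Table B-21 data the computed CPA falls well inside the 1 nm safe limit, which is consistent with the scenario's requirement that own ship maneuver to starboard.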

B-30

[Table B-22. Equipment set-up conditions for exercise F (rotated table; most of the scanned text is not recoverable). Legible condition: 3.10 Three vessels on same bearing (A, B, C), with A & C on collision course.]

[Table B-23. Simulation conditions for exercise F (rotated table; most of the scanned text is not recoverable). Legible criteria: data for three targets on same bearing are initially less accurate than for one target (wait until data for first target steady before acquiring next target, then record how long it takes for data to steady); simulate the effect on all tracked targets of an own ship maneuver; simulate without interrupting the update of target information; trial maneuver initiated by depression of either a spring-loaded switch or a function key. Legible conditions: 3.10 Display of target data for targets acquired in the following order: target C, then B, then A; 3.14 Use of trial maneuver to calculate required new course to maintain minimum CPA of 1 nm (course change must be executed by 0510). END OF EXERCISE.]

B-31

[Table B-24. Simulation conditions for the second iteration of exercise F (rotated table; most of the scanned text is not recoverable). Legible criteria: data for three targets on same bearing are initially less accurate than for one target; indication of target fading and replacement with SART; ability to auto-acquire three targets on same bearing concurrently; ability to obtain range and bearing of any object on display. Legible conditions: 3.10 Automatically acquire all three targets on same bearing; Allow targets to collide in order to test SART capability. END OF EXERCISE.]

B-32

APPENDIX C

Worksheets for Compiling and Analyzing Simulator Evaluation Data

This appendix provides a set of worksheets for compiling and analyzing simulator evaluation data. The appendix is divided into three main sections. The first section contains four worksheets (Tables C-2 through C-5), on which the evaluator can provide a detailed summary of the findings for each simulator evaluation criterion. The second section contains Table C-7, a worksheet that the evaluator can use to summarize the general capability of a simulator to support the four simulator evaluation objective categories. Lastly, the third section contains a worksheet (Table C-9) that the evaluator can use to indicate a simulator's capability to support the current set of mariner assessment objectives.

C-1

WORKSHEETS FOR RECORDING A DETAILED SUMMARY OF THE SIMULATOR EVALUATION RESULTS

Tables C-2, C-3, C-4, and C-5 are worksheets that the evaluator can use to summarize the detailed findings of a simulator evaluation. The four tables correspond to the simulator evaluation categories: exercise programming, equipment set-up, simulation, and debriefing. The tables address the findings for all the individual simulator evaluation criteria within each category. As noted in the reference column, each criterion is derived from the present set of mariner assessment objectives (MAO) (McCallum et al., 1999); IMO Resolution A.222 (VII) (IMO, 1971); Resolution A.422 (XI) (IMO, 1979); or Section A-I/12 of the amended STCW Code (IMO, 1996). In the next section, the detailed findings for each evaluation objective are tabulated and presented by category in Table C-7.

In the present application to ARPA simulators, we used a subjective ratings approach (classification into three or more pre-defined levels) to score each criterion. We used the following definitions for each level: Y (yes), indicating the simulator fully satisfied the criterion; P (partial), indicating the simulator partially satisfied the criterion; and N (no), indicating the simulator did not satisfy the criterion. Table C-1 shows an example of how we used the worksheet depicted in Table C-5 (debriefing) to summarize our evaluation of Simulator Y on simulator evaluation objectives 4.1 through 4.4. As noted in the "Rating" column, Simulator Y was capable of meeting 8 out of the 10 evaluation criteria noted below. Our comments note the different capabilities of the simulator with respect to various criteria.

Table C-1. Detailed summary of Simulator Y's ability to satisfy debriefing criteria.

Simulator Evaluation Objectives Simulator Evaluation Criteria Reference Rating Comments

4.1 Record exercise

4.1.C1 Ability to specify which voyage parameters to display

4.1.D1 Log of voyage activities:

(1) own ship course and speed at a given time

(2) target ship bearing, range, course, speed, CPA and TCPA at a given time

(3) applicable COLREGS for each target

(4) operational warnings

STCW Y

Y Log of activities for (1) and (2) are currently available. Manufacturer indicated that logs for (3) and (4) would be available soon.

4.2 Replay exercise

4.2.C1 Ability to:

(1) rewind

(2) fast forward

(3) pause, and

(4) save

4.2.D1 Chart view

4.2.D2 Radar view

STCW Y

Y

Y

C-2

Table C-1. Detailed summary of Simulator Y's ability to satisfy debriefing criteria. (Continued)

Simulator Evaluation Objectives Simulator Evaluation Criteria Reference Rating Comments

4.3 Print exercise 4.3.C1 Ability to print screen while exercise is running

4.3.D1 Printout of screen

4.3.C2 Ability to print a hard copy of exercise activities in different views following exercise

4.3.D2 Printout of exercise events in (a) chart view (b) radar view

N

N

Y

Y

Not available.

Printer was not available at the time of evaluation.

Printout of radar view is not available.

4.4 Monitor exercise

4.4.D1 Ability to monitor trainee station using: (a) chart view (b) radar view

STCW Y
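A summary like the one above ("8 out of the 10 evaluation criteria") can be tallied mechanically from the Rating column. A minimal sketch; the ratings below transcribe the Simulator Y debriefing example from Table C-1:

```python
from collections import Counter

# Ratings keyed by criterion ID (Y / P / N), transcribed from Table C-1.
ratings = {"4.1.C1": "Y", "4.1.D1": "Y", "4.2.C1": "Y", "4.2.D1": "Y",
           "4.2.D2": "Y", "4.3.C1": "N", "4.3.D1": "N", "4.3.C2": "Y",
           "4.3.D2": "Y", "4.4.D1": "Y"}

tally = Counter(ratings.values())
print(f"{tally['Y']} of {len(ratings)} criteria fully satisfied "
      f"({tally['P']} partial, {tally['N']} not met)")
```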

Table C-2. Worksheet for recording a detailed summary of a simulator's ability to satisfy exercise programming criteria.

Simulator Evaluation Objective

Simulator Evaluation Criteria Reference Rating Comments

1.1 Create an exercise area that enables the simulation of land masses

1.1.C1 Ability to generate actual coastline and narrow channel

1.1.D1.1 Indication of land masses

1.1.D1.2 When the radar antenna is mounted at 15 m, the equipment, in the absence of clutter, gives a clear indication of a coastline

MAO

MAO

A.222

1.2 Replicate environmental conditions critical to navigation of own ship

1.2.C1 Ability to generate depth characteristics

1.2.C2 Ability to generate current

1.2.C3 Ability to generate tidal condition

1.2.C4 Ability to generate wind

1.2.C5 Ability to generate precipitation in specific location and density

1.2.D5.1 Indication of precipitation area

1.2.D5.2 Target detection is influenced by location and density of precipitation

1.2.C6 Ability to control precipitation clutter

1.2.D6 Indication of reduced precipitation clutter

MAO

MAO

C-3

Table C-2. Worksheet for recording a detailed summary of a simulator's ability to satisfy exercise programming criteria. (Continued)

Simulator Evaluation Objective

1.3 Specify own ship parameters to create realistic navigational characteristics

1.4 Specify target parameters to create realistic navigational scenario

Simulator Evaluation Criteria

1.3.C1 Ability to select model of ship

1.3.C2 Ability to program vessel specifications

1.3.C3 Ability to program maneuvering characteristics

1.3.C4 Ability to program initial set-up conditions

1.3.D4 Indication of own ship vector

1.3.C5 Ability to program future maneuvers

1.3.D6 Effect of current on own ship hydrodynamic characteristics

1.3.D7 Effect of current on own ship course and speed

1.3.D8 Effect of wind on own ship hydrodynamic characteristics

1.3.D9 Effect of wind on own ship course and speed

1.4.C1 Ability to program minimum of 20 targets

1.4.C2 Ability to select model of ship

1.4.D2 Visual symbol for each target type

1.4.C3 Ability to program vessel specifications

1.4.C4 Ability to program vessel maneuvering characteristics

1.4.C5 Ability to select racon, buoy, SART

1.4.D5 Indication of target code, if appropriate

1.4.C6 Ability to obtain range and bearing of target

1.4.D6 Indication of target position

1.4.C7 Ability to program initial set-up conditions

1.4.C8 Ability to program future maneuver

1.4.C9 Ability to program removal or addition of target from the exercise area

1.4.C10 Ability to program target fading and replacement with SART

1.4.D11 Effect of current on target hydrodynamic characteristics

1.4.D12 Effect of current on target course and speed

1.4.D13 Effects of wind on target hydrodynamic characteristics

1.4.D14 Effect of wind on target course and speed

Reference

MAO

MAO

MAO

MAO

MAO

MAO

A.422

MAO

MAO

MAO

Rating Comments

C-4

Table C-2. Worksheet for recording a detailed summary of a simulator's ability to satisfy exercise programming criteria. (Continued)

Simulator Evaluation Objective

Simulator Evaluation Criteria Reference Rating Comments

1.5 Reproduce critical equipment malfunctions

1.5.C1 Ability to program speed log malfunctions

1.5.D1 Indication of speed log malfunction

1.5.C2 Ability to program gyrocompass malfunctions

1.5.D2.1 Indication of gyrocompass malfunction

1.5.C3 Ability to program ARPA failure

1.5.D3 Indication of ARPA malfunction

STCW

STCW

STCW

1.6 Reproduce critical ARPA operational limitations (i.e., effect of limitations on ARPA operations)

1.6.C1 Ability to program density and area covered by sea clutter

1.6.D1 Indication of sea clutter area

1.6.C2 Ability to control sea clutter

1.6.D2 Indication of reduced sea clutter

1.6.C3 Ability to generate automatic radar interference with closest target or other ship

1.6.C4 Ability to generate false echo

1.6.D4 Indication of false echo

1.6.C5 Ability to simulate blind sector

1.6.D5 Indication of areas without radar coverage

Table C-3. Worksheet for recording a detailed summary of a simulator's ability to satisfy equipment set-up criteria.

Simulator Evaluation Objective Simulator Evaluation Criteria Reference Rating Comments

2.1 Selection of display presentation, orientation, and vector mode

2.1.C1 Ability to toggle between sea- and ground-stabilized modes MAO

2.1.D1 Indication of display mode A.422

2.1.C2 Ability to toggle between North-up, and either course-up or head-up azimuth stabilization STCW, A.422

2.1.D2 Indication of display orientation mode A.422

2.1.C3 Ability to toggle between relative and true motion STCW, A.422

2.1.D3 Indication of display vector mode A.422

2.1.C4 Ability to use ARPA on the following ranges: (a) 3 or 4 miles and (b) 12 or 16 miles A.422

2.1.C5 Fixed range rings available A.222


Table C-3. Worksheet for recording a detailed summary of a simulator's ability to satisfy equipment set-up criteria. (Continued)

Simulator Evaluation Objective Simulator Evaluation Criteria Reference Rating Comments

2.1.D5.1 Indication of range scale in use A.422

2.1.D5.2 Indication of distance between range rings A.222

2.1.C6 Variable electronic range marker available A.222

2.2 Selection of required speed and compass input

2.2.C1 Ability to set speed log input with 1 knot resolution: (a) manual (b) automatic STCW

2.2.D1.1 Indication of manual speed input

2.2.D1.2 Indication of auto speed log

2.2.C2 Ability to set compass log input with 1° resolution: (a) manual (b) gyrocompass

2.2.D2.1 Indication of manual compass input

2.2.D2.2 Indication of gyrocompass input

2.3 Selection of ARPA plotting controls and manual/automatic acquisition

2.3.C1 Ability to select acquisition rings or areas STCW

2.3.D1 Indication of acquisition rings or areas A.422

2.3.C2 Ability to select targets and initiate manual target acquisition

2.3.D2 Indication of manual acquisition mode A.422

2.3.C3 Ability to select targets and initiate automatic target acquisition

2.3.D3 Indication of automatic acquisition mode A.422

2.3.C4 Ability to select target manually while in automatic acquisition A.422

2.4 Selection of safe limits

2.4.C1 Ability to select safe limits according to distance (CPA) and time (TCPA) STCW

2.4.D1 Indication of safe limits

2.5 Selection of vector time scale

2.5.C1 Ability to select time-adjustable or fixed time scale STCW, A.422

2.5.D1 Indication of time scale of vector in use A.422

2.6 Selection of exclusion areas when automatic acquisition is employed

2.6.C1 Ability to suppress acquisition in certain areas (i.e., to select exclusion area according to bearing and range) STCW, A.422

2.6.D1 Indication of the area of acquisition A.422

2.7 Selection of danger area

2.7.C1 Ability to create a danger area MAO

2.7.D1 Indication of danger area

Table C-4. Worksheet for recording a detailed summary of a simulator's ability to satisfy simulation criteria.

Simulator Evaluation Objective Simulator Evaluation Criteria Reference Rating Comments

3.1 Display characteristics when alternating between ground- and sea-stabilized modes

3.1.C1 Ability to toggle between presentation modes: (a) ground-stabilized (b) sea-stabilized MAO

3.1.D1 After resetting display mode, plotting information is available within a period not to exceed 4 scans A.422

3.1.C2 Ability to toggle between display orientations: (a) course-up (b) head-up

3.1.D2 After changing display orientation, plotting information is available within a period not to exceed 4 scans A.422

3.2 Use of manual and automatic acquisition

3.2.C1 Ability to acquire, track, process and continuously update information manually for at least 10 targets STCW, A.422

3.2.D1.1 Ability to display information simultaneously for at least 10 targets in manual mode A.422

3.2.D1.2 Indication of manually tracked targets A.422

3.2.C2 Ability to automatically acquire, track, process and continuously update information for at least 20 targets A.422

3.2.D2.1 Ability to display information for at least 20 targets simultaneously in automatic mode A.422

3.2.D2.2 Indication of automatically tracked targets A.422

3.2.C3 Ability to suppress automatic acquisition mode A.422

3.2.C4 Ability to groundlock a target MAO

3.2.D4 Indication of groundlocked target MAO

3.3 Use and limitations of ARPA operational warnings

3.3.C1 Ability to activate or deactivate "safe limit" warning A.422

3.3.D1.1 Visual and/or audible warning when target violates safe limit criteria A.422

3.3.D1.2 Indication of target causing "safe limit" warning A.422

3.3.C2 Ability to activate or deactivate guard zone warning A.422

3.3.D2.1 Visual and/or audible warning when target enters guard zone area A.422

3.3.D2.2 Indication of target causing "guard zone" warning A.422

3.3.C3 Ability to activate or deactivate "lost target" warning A.422

3.3.D3.1 Visual and/or audible warning when target is lost A.422


Table C-4. Worksheet for recording a detailed summary of a simulator's ability to satisfy simulation criteria. (Continued)

Simulator Evaluation Objective Simulator Evaluation Criteria Reference Rating Comments

3.3.D3.2 Indication of last tracked position A.422

3.3.C4 Lost target can be reacquired A.422

3.3.C5 Ability to activate or deactivate "new target" warning

3.3.D5.1 Visual and/or audible warning when new target enters the acquisition zone

3.3.D5.2 Indication of target causing "new target" warning

3.3.C6 Ability to activate or deactivate "target store full" warning

3.3.D6 Visual and/or audible "target store full" warning

3.4 Detection and identification of false echoes, sea returns, racons, and SARTs

3.4.C1 Ability to suppress unwanted echoes from sea clutter, rain and other types of precipitation STCW

3.4.D1 Indication of reduced precipitation clutter A.222

3.4.C2 Ability to adjust the sea and precipitation anti-clutter manually and continuously A.222

3.4.D2 Indication of reduced sea clutter A.422

3.4.C3 Ability to obtain range and bearing of any object on display

3.4.D3.1 Indication of racon code

3.4.D3.2 Indication of SART code

3.4.C4 Indication of target fading and replacement with SART

3.5 Use of graphic representation of danger areas

3.5.D1 Indication of danger areas

3.5.D2 Visual and/or audible warning

3.6 Use of vessel history trails

3.6.C1 Ability to select vessel history display STCW

3.6.D1 Display at least 4 equally time-spaced past positions of any targets being tracked over a period of at least 8 minutes A.422

3.6.D2 If target has been tracked less than 8 minutes, number of past positions displayed reflects the time tracked

3.6.D3 A target just acquired has no vessel history


Table C-4. Worksheet for recording a detailed summary of a simulator's ability to satisfy simulation criteria. (Continued)

Simulator Evaluation Objective Simulator Evaluation Criteria Reference Rating Comments

3.7 Speed and direction of a target's relative movement and the identification of critical echoes

3.7.C1 Ability to request the display of ARPA data STCW, A.422

3.7.D1.1 Acquired data (course, speed, CPA, TCPA, range, bearing) show trend (low accuracy) for first minute A.422

3.7.D1.2 Precise target data (course, speed, CPA, TCPA, range, bearing) appear after three minutes A.422

3.7.D1.3 Acquired target data (course & speed) should be displayed in a vector or graphic form which indicates the target's predicted motion A.422

3.7.D1.4 ARPA information does not obscure radar information A.422

3.7.C2 Ability to cancel the display of unwanted ARPA data STCW, A.422

3.8 Limitations of vessel data following changes in target's course or speed, or both

3.8.D1 Target data (course, speed, CPA, TCPA, range, bearing) show trend (low accuracy) for first minute A.422

3.8.D2 Precise target data (course, speed, CPA, TCPA, range, bearing) appear after three minutes A.422

3.9 Limitations of vessel data following changes in own ship course, speed, or both

3.9.D1 Target data (course, speed, CPA, TCPA, range, bearing) show trend (low accuracy) for first minute A.422

3.9.D2 Precise target data (course, speed, CPA, TCPA, range, bearing) appear after three minutes A.422

3.10 Limitations of radar range and bearing on the accuracy of ARPA data

3.10.C1 Ability to program numerous targets on the same bearing MAO

3.10.C2 Ability to auto-acquire concurrently numerous targets on same bearing MAO

3.10.D2 Data for three targets on same bearing are initially less accurate than data for 1 target

3.11 The circumstances causing "target swap" and their effects on display data

3.11.D1 Visual and/or audible "lost target" warning

3.11.D2 Erroneous indication of swapped target's data


Table C-4. Worksheet for recording a detailed summary of a simulator's ability to satisfy simulation criteria. (Continued)

Simulator Evaluation Objective Simulator Evaluation Criteria Reference Rating Comments

3.12 Use of parallel index lines to maintain position on planned course and to identify time of maneuver

3.12.C1 Ability to measure distance between parallel index line and land mass MAO

3.12.D1 Ability to draw parallel index lines maintaining a given distance from land MAO

3.12.C2 Ability to measure bearing of parallel index line MAO

3.12.D2 Indication of parallel index lines MAO

3.12.C3 Ability to measure range between own ship and parallel index line

3.12.C4 Availability of navigation lines (optional)

3.13 Display characteristics when alternating between true and relative vectors

3.13.C1 Ability to switch between vector modes without losing tracking information

3.13.D1 Indication of vector mode A.422

3.14 The operation of the trial maneuver facility

3.14.C1 Simulation is initiated by depression of either a spring-loaded switch, or a function key STCW

3.14.D1 Identification of trial maneuver mode A.422

3.14.C2 Ability to use a static or dynamic display A.422

3.14.D2 Simulate the effect on all tracked targets of an own ship maneuver A.422

3.14.C3 Ability to include a time delay MAO

3.14.D3 Simulate without interrupting the update of target information A.422

3.15 Performance checks of radar, compass, speed input sensors, and ARPA

3.15.C1 Ability to reset disabled speed log STCW

3.15.D1.1 Visual and/or audible "speed log error" warning

3.15.D1.2 Erroneous own ship and target speed and course indications

3.15.C2 Ability to reset disabled compass log STCW

3.15.D2.1 Visual and/or audible "compass log error" warning

3.15.D2.2 Erroneous own ship and target speed and course indications

3.15.D2.3 Only head-up relative motion display is available when gyrocompass fails

3.16 Methods of testing for malfunctions of ARPA systems including functional self-testing

3.16.C1 Test programs are available to assess ARPA's overall performance against a known solution STCW, A.422


Table C-5. Worksheet for recording a detailed summary of a simulator's ability to satisfy debriefing criteria.

Simulator Evaluation Objective Simulator Evaluation Criteria Reference Rating Comments

4.1 Record exercise

4.1.C1 Ability to specify which voyage parameters to display STCW

4.1.D1 Log of voyage activities:

(5) own ship course and speed at a given time

(6) target ship bearing, range, course, speed, CPA and TCPA at a given time

(7) applicable COLREGS for each target

(8) operational warnings

4.2 Replay exercise

4.2.C1 Ability to: (1) rewind, (2) fast forward, (3) pause, (4) save STCW

4.2.D1 Chart view

4.2.D2 Radar view

4.3 Print exercise

4.3.C1 Ability to print screen while exercise is running

4.3.D1 Printout of screen

4.3.C2 Ability to print a hard copy of exercise activities in different views following exercise

4.3.D2 Printout of exercise events (a) chart view (b) radar view

4.4 Monitor exercise

4.4.D1 Ability to monitor trainee station using: (a) chart view (b) radar view STCW


WORKSHEET FOR SUMMARIZING A SIMULATOR'S GENERAL CAPABILITY TO MEET SIMULATOR EVALUATION OBJECTIVES

This section contains a worksheet that the evaluator can use to record a simulator's capabilities in the following general categories: exercise programming, equipment set-up, simulation, and debriefing. Since the evaluation of an ARPA simulator using the present approach can be quite extensive (over 170 evaluation criteria), this worksheet (Table C-7) should be used to record only the general findings. The general findings for each evaluation objective are the result of tabulating the detailed findings for each criterion recorded in Tables C-2 through C-5. For example, in our evaluation of the debriefing capabilities of ARPA Simulator Y (see Table C-1) we found that Y could replicate 8 out of 10 of the evaluation criteria required for debriefing. Table C-6 shows how these results were recorded onto a general summary worksheet.

Table C-6. General summary of Simulator Y's ability to satisfy simulator evaluation objectives in the debriefing category.

Simulator Evaluation Objective Evaluation Criteria Met Comments

4. Debriefing 80%

4.1 Record exercise 2/2 Each exercise can be recorded and kept in memory for an extensive period of time.

4.2 Replay exercise 3/3 Each exercise can be replayed in real and fast time using either the radar view (instructor or trainee console) or the chart view (instructor console).

4.3 Print exercise 2/4 Manufacturer indicated that all the logs of voyage activities, as well as chart and radar views of the exercises, can be printed. Printer was not available during the evaluation. 'Print screen' feature is not available.

4.4 Monitor exercise 1/1 Instructor can monitor the exercise using either the chart or radar view.

To obtain these results, we first integrated the detailed findings (addressed in Table C-1) to determine the extent that Simulator Y met the debriefing objectives. Then, we assigned a numerical score [1 (yes), 0.5 (partial) and 0 (no)] to each criterion. Next, we summed these scores across all the evaluation criteria within each simulator evaluation objective to obtain a final score for each objective. In the column labeled "evaluation criteria met," we recorded the ratio of the criteria met to the total criteria available for each objective. For example, simulator evaluation objective 4.3, Print exercise, has four separate evaluation criteria (see Table C-1 for a listing of these individual criteria). Simulator Y fully met two criteria (2 points) and did not meet two other criteria (0 points) for a total score of 2 out of 4 points. This score is recorded above as "2/4." Simulator Y's overall percentage score (80%) is also recorded on this worksheet, as are our comments addressing the simulator's strengths and weaknesses with respect to each evaluation objective. Evaluators can also use the comments section to note those objectives they were unable to evaluate due to constraints of the exercises used in the evaluation.
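The scoring arithmetic described above can be sketched in a few lines. This is an illustrative Python fragment, not part of the report's procedure; the rating labels and function name are hypothetical.

```python
# Hypothetical helper illustrating the scoring rule described above:
# each evaluation criterion is rated yes (1), partial (0.5), or no (0);
# scores are summed and reported against the number of criteria.
SCORES = {"yes": 1.0, "partial": 0.5, "no": 0.0}

def objective_score(ratings):
    """Return (points earned, total criteria) for one evaluation objective."""
    earned = sum(SCORES[r] for r in ratings)
    return earned, len(ratings)

# Objective 4.3, "Print exercise": two criteria fully met, two not met,
# recorded on the worksheet as the ratio 2/4.
earned, total = objective_score(["yes", "yes", "no", "no"])
print(f"{earned:g}/{total}")
```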


Table C-7. Worksheet for summarizing findings of an ARPA simulator evaluation, organized by simulator evaluation objective.

Simulator Evaluation Objective Evaluation Criteria Met* Comments

1. Exercise Programming %

1.1 Create an exercise area that enables the simulation of land masses /3

1.2 Replicate environmental conditions critical to navigation of own ship /9

1.3 Specify own ship parameters to create realistic navigational characteristics /10

1.4 Specify target parameters to create realistic navigational scenario /17

1.5 Reproduce critical equipment malfunctions /6

1.6 Reproduce critical ARPA operational limitations (i.e., effects of limitations of ARPA operations) /9

2. Equipment Set-Up %

2.1 Selection of display presentation, orientation, and vector mode /11

2.2 Selection of required speed and compass input /6

2.3 Selection of ARPA plotting controls and manual/automatic acquisition /7

2.4 Selection of safe limits /2

2.5 Selection of vector time scale /2

2.6 Selection of exclusion areas when automatic acquisition is employed /2

2.7 Selection of danger area /2

3. Simulation %

3.1 Display characteristics when alternating between ground- and sea-stabilized modes /4

3.2 Use of manual and automatic acquisition /9

3.3 Use and limitations of ARPA operational warnings /15

3.4 Detection and identification of false echoes, sea returns, racons, and search and rescue transponders (SART) /8

3.5 Use of graphic representation of danger areas /2

3.6 Use of vessel history trails /4

* The denominator represents the total number of evaluation criteria for each evaluation objective. (See Tables C-2 through C-5 for a complete listing of the simulator evaluation criteria for each objective.)


Table C-7. Worksheet for summarizing findings of an ARPA simulator evaluation, organized by simulator evaluation objective. (Continued)

Simulator Evaluation Objective Evaluation Criteria Met Comments

3.7 Speed and direction of a target's relative movement and the identification of critical echoes /6

3.8 Limitations of vessel data following changes in target course or speed, or both /2

3.9 Limitations of vessel data following changes in own ship course, speed, or both /2

3.10 Limitations of radar range and bearing on the accuracy of ARPA data /3

3.11 The circumstances causing "target swap" and their effects on display data /2

3.12 Use of parallel index lines to maintain position on planned course and to identify time of maneuver /6

3.13 Display characteristics when alternating between true and relative vectors /2

3.14 The operation of the trial maneuver facility /6

3.15 Performance checks of radar, compass, speed input sensors, and ARPA /7

3.16 Methods of testing for malfunctions of ARPA systems including functional self-testing /1

4. Debriefing %

4.1 Record exercise /2

4.2 Replay exercise /3

4.3 Print exercise /4

4.4 Monitor exercise /1

WORKSHEET FOR SUMMARIZING A SIMULATOR'S CAPABILITY TO SUPPORT MARINER ASSESSMENT OBJECTIVES

This section contains a worksheet (Table C-9) on which evaluators can record a simulator's capability to support the 27 mariner assessment objectives described in McCallum et al. (1999). Table C-9 summarizes a simulator's performance in terms of its ability to reproduce the assessment conditions required for each mariner assessment objective. To determine the percentage of assessment objectives that a simulator could support, the evaluator can assign a numerical score of 1 (yes), 0.5 (partial), and 0 (no) to each assessment objective, and sum these scores across objectives within an assessment objective category. Then, the evaluator can divide the result by the total score possible for that category. For example, Table C-8 below shows an excerpt of our summary of the ability of Simulator Y to support mariner assessment objectives in category 1, setting up and maintaining displays. In this category, Simulator Y fully supported four assessment objectives (4 points) and partially supported two objectives (1 point), resulting in a total score of 5 out of 6 possible points, or 83 percent.
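The category-level percentage described above can be checked the same way. This hypothetical Python fragment reproduces the Simulator Y example: four objectives fully supported and two partially supported, or 5 out of 6 points.

```python
# Hypothetical check of the category percentage: yes = 1, partial = 0.5,
# no = 0, summed across the objectives in a category and divided by the
# total points possible for that category.
SCORES = {"yes": 1.0, "partial": 0.5, "no": 0.0}

def category_percent(ratings):
    """Percentage of assessment objectives supported within one category."""
    return round(100 * sum(SCORES[r] for r in ratings) / len(ratings))

# Category 1, setting up and maintaining displays: 4 yes + 2 partial = 5/6,
# which rounds to 83 percent.
print(category_percent(["yes"] * 4 + ["partial"] * 2))
```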

Table C-8. Summary of the capability of Simulator Y to support mariner assessment objectives in category 1, setting up and maintaining displays.

Mariner Assessment Objective

Simulator Evaluation Objective

Ability to Meet Requirements Comments

1. Setting up and maintaining displays 83%

1.1 The selection of display presentation; stabilized relative motion display and true motion display

2.1 Yes

1.2 The selection, as appropriate, of required speed and compass input to ARPA

2.2 Yes Manual input as well as automatic speed log and gyrocompass are available. Resolution is 0.1 knot or degree.

1.3 The selection of ARPA plotting controls, manual/automatic acquisition, vector/graphic display of data

2.1,2.3 Yes

1.4 The selection of the vector time scale 2.5 Yes

1.5 The use of exclusion areas when automatic acquisition is employed by ARPA

2.6 Partial Although exclusion areas are not directly available, they can be indirectly specified using two acquisition areas or set of rings.

1.6 Display characteristics and an understanding of when to use ground- or sea-stabilized modes 1.1, 1.2, 3.1 Partial Although ground-stabilized mode is not available, the availability of randomly generated currents, along with the use of EBL and VRM, enables the trainee to understand the effect of current on navigation.


Table C-9. Worksheet for recording a summary of a simulator's capability to support mariner assessment objectives.

Mariner Assessment Objective Simulator Evaluation Objective Ability to Meet Requirements Comments

1. Setting up and maintaining displays

1.1 The selection of display presentation; stabilized relative motion display and true motion display

2.1

1.2 The selection, as appropriate, of required speed and compass input to ARPA

2.2

1.3 The selection of ARPA plotting controls, manual/automatic acquisition, vector/graphic display of data

2.1,2.3

1.4 The selection of the vector time scale 2.5

1.5 The use of exclusion areas when automatic acquisition is employed

2.6

1.6 Display characteristics and an understanding of when to use ground- or sea-stabilized modes 1.1, 1.2, 3.1

2. Situation assessment %

2.1 Understanding the criteria for the selection of targets by automatic acquisition 3.2

2.2 Uses, benefits and limitations of ARPA operational warnings 2.4, 3.3

2.3 Detection and identification of false echoes, sea return, racons, and SART

3.4

2.4 The use of graphic representation of danger areas

2.7, 3.5

2.5 Knowledge and recognition of historic data as a means of indicating recent maneuvering of targets

3.6

2.6 The speed and direction of a target's relative movement and the identification of critical echoes (in both relative and true motion modes of display)

3.7

2.7 Detecting target course and speed changes and the limitations of such information (in both relative and true motion modes of display)

3.8

2.8 The effect of changes in own ship's course or speed or both (in both relative and true motion modes of display)

3.9


Table C-9. Worksheet for recording a summary of a simulator's capability to support mariner assessment objectives. (Continued)

Mariner Assessment Objective Simulator Evaluation Objective Ability to Meet Requirements Comments

3. Knowledge of factors affecting performance and accuracy; and ability to operate and interpret system performance and accuracy, tracking capabilities and limitations, and processing delays %

3.1 Knowledge of the effect of limitations of radar range and bearing on the accuracy of ARPA data

3.10

3.2 The circumstances causing "target swap" and their effect on display data

3.11

3.3 The effects on tracking of "lost" targets and target fading 3.3

3.4 An appreciation of the IMO performance standards for ARPA, in particular the standards relating to accuracy 1.6, 3.15 IMO exercises

4. Parallel indexing %

4.1 Plotting parallel index lines to maintain position on planned course 1.1, 3.12

4.2 Using parallel index lines to identify time of maneuver 3.12

5. Application of COLREGS; and deriving and analyzing information, critical echoes, exclusion areas and trial maneuvers %

5.1 The benefit of switching between true and relative vectors

3.13

5.2 Analysis of potential collision situations from displayed information, determination and execution of action to avoid close-quarters situations in accordance with COLREGS

3.7

5.3 The operation of the trial maneuver facility

3.14

6. Use of operational warnings and system tests %

6.1 Performance checks of radar, compass, speed input sensors and ARPA

1.5,3.15

6.2 Methods of testing for malfunctions of ARPA systems including functional self-testing

1.5,3.16

6.3 Precautions to be taken after a malfunction occurs

3.15

6.4 Ability to perform system checks and determine data accuracy of ARPA, including the trial maneuver facility, by checking against basic radar plot

1.5, 1.6, 3.14, 3.16


[This page intentionally left blank.]