
1

Evaluation of TEWA in a Ground Based Air Defense Environment

Presenters: Basie Kok, Andries Heyns

Supervisor: Prof. Jan van Vuuren

2

Overview

• Context and motivation

• Evaluation Overview

• Modelling TEWA components

• Simulation evaluation of TEWA

• Proposed measures of performance

• Demonstration

• Status / Further Work

3

Context and Motivation

• Forms part of the TEWA program underway at the University of Stellenbosch.

• Aim: To evaluate a fully fledged TEWA system in a GBADS environment.

• High-stress environments are of particular interest in the GBADS application

• Collaborators: RRS, IMT, UDDC, CSIR…

– Duvenhage B & le Roux WH, A Peer-to-Peer Simulation Architecture, Proceedings of the 2007 High Performance Computing and Simulation Conference, pp. 684-690, 2007.

– le Roux WH, Implementing a Low Cost Distributed Architecture for Real-Time Behavioural Modelling and Simulation, Proceedings of the 2006 European Simulation Interoperability Workshop, Stockholm, pp. 81-95, June 2006.

– Roux J & van Vuuren JH, Threat evaluation and weapon assignment decision support: A review of the state of the art, Orion Journal of Operations Research, Submitted June 2007.

• Evaluation of the TEWA models developed to date is an imperative step in producing an effective system:
– Investigation of computational costs (e.g. line-of-sight calculations)
– Effectiveness and relevance of various model sub-components

• Prototype testing often infeasible due to high cost.

• Evaluation of large, complex systems in the design phase often involves simulation:
– Repetition (statistical certainty)
– Variation (realistic scope traversal)
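The line-of-sight cost mentioned above comes from repeatedly sampling terrain along each sensor-to-target line. A minimal sketch of one such calculation (the heightmap representation and sampling scheme are illustrative assumptions, not the project's implementation):

```python
def line_of_sight(terrain, a, b, samples=100):
    """Return True if point b is visible from point a over a heightmap.

    terrain: 2D list of ground heights; a, b: (row, col, height) tuples.
    Samples the straight sight line between a and b and checks that it
    clears the terrain at every sample point.
    """
    (r0, c0, h0), (r1, c1, h1) = a, b
    for i in range(samples + 1):
        t = i / samples
        r = int(round(r0 + t * (r1 - r0)))
        c = int(round(c0 + t * (c1 - c0)))
        h = h0 + t * (h1 - h0)          # height of the sight line here
        if terrain[r][c] > h:           # terrain blocks the line
            return False
    return True

# Flat terrain: a sensor at height 10 sees a target at height 5.
flat = [[0.0] * 50 for _ in range(50)]
print(line_of_sight(flat, (0, 0, 10.0), (49, 49, 5.0)))   # True

# A 100 m ridge between them blocks the same line.
ridge = [row[:] for row in flat]
ridge[25] = [100.0] * 50
print(line_of_sight(ridge, (0, 0, 10.0), (49, 49, 5.0)))  # False
```

Running such a check for every sensor-threat pair at every time step is what makes line of sight a natural focus of a computational-cost investigation.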

4

Evaluation Overview

Objectives:
– Demonstrate workability of system
– Evaluate performance of various TE and WA methodologies in different contexts
– Evaluate system performance as a whole and identify focus areas for future development

Overview:
1. Model TEWA components
• Threat evaluation models
• Combining threat lists
• Weapon assignment models
2. Simulate scenarios using repetition and statistical variation
• Robustness of components
• Performance of components
3. Performance measure analysis

5

1. Modelling TEWA

Threat Evaluation (TE)
• Flagging Models / Qualitative
• Deterministic / Quantitative
– Projected Passing Distance
– Bearing towards Assets
– Course towards Assets (2D & 3D)
– Time to Asset
• Probabilistic [Jacques du Toit & Willa Lotz, Jaco Roux & Jan van Vuuren]

Weapon Assignment (WA)
• Rule Based / Heuristic [Francois du Toit & Cobus Potgieter]
• Mathematical / Computational [Grant van Dieman]

6

TE: Flagging Models

• Provide information regarding major changes in attributes of monitored threats:
– Speed (Afterburner)
– Altitude (Pitch)
– Course (Maneuvers)

• Binary output

• Calibration options:
– Absolute
– Dynamic
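As a sketch, a speed flag with the two calibration options might look as follows (the threshold and window values are illustrative assumptions, not the project's):

```python
from collections import deque

class SpeedFlag:
    """Binary flag for a sudden speed change (e.g. afterburner use).

    mode='absolute': flag when speed exceeds a fixed threshold.
    mode='dynamic' : flag when speed jumps well above the threat's
                     own recent average.
    """
    def __init__(self, mode="absolute", threshold=300.0, window=5, factor=1.5):
        self.mode = mode
        self.threshold = threshold
        self.factor = factor
        self.history = deque(maxlen=window)   # recent speed samples

    def update(self, speed):
        if self.mode == "absolute":
            flagged = speed > self.threshold
        else:
            avg = sum(self.history) / len(self.history) if self.history else speed
            flagged = speed > self.factor * avg
        self.history.append(speed)
        return flagged

flag = SpeedFlag(mode="dynamic")
for s in [200, 205, 210, 208, 420]:   # afterburner kicks in at the end
    print(flag.update(s))             # False four times, then True
```

Analogous flags for altitude and course changes would follow the same absolute/dynamic pattern.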

7

TE Models

• Course

• Projected Passing Distance

• Course Variation

• Bearing

• Estimated Time-To-Weapon-Release (TTWR)

• Determine threat values between 0 (minimum) and 1 (maximum)
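Each deterministic model maps a kinematic quantity onto a threat value in [0, 1], and the values induce a ranked threat list. A minimal sketch in the style of the projected-passing-distance model (the linear mapping and the distances are illustrative assumptions; the distances are chosen to reproduce the PPD list shown on a later slide):

```python
def ppd_threat_value(passing_distance, max_distance=10000.0):
    """Map projected passing distance (metres) to a threat value in [0, 1].

    Distance 0 (direct overflight) -> 1.0; at or beyond max_distance -> 0.0.
    """
    return max(0.0, 1.0 - passing_distance / max_distance)

def rank_threats(values):
    """Build a threat list: aircraft sorted by descending threat value."""
    return sorted(values.items(), key=lambda kv: kv[1], reverse=True)

distances = {"a": 2000.0, "b": 9000.0, "c": 3000.0, "d": 12000.0}
values = {name: ppd_threat_value(d) for name, d in distances.items()}
print(rank_threats(values))   # aircraft a ranks highest, then c, b, d
```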

8

TE Model Threat Lists

[Figure: Course model; aircraft a, b, c and d relative to the defended asset]

Threat Rank   Aircraft   Threat Value
1             a          0.9
2             c          0.8
3             b          0.4
4             d          0.0

9

TE Model Threat Lists

[Figure: PPD model; aircraft a, b, c and d relative to the defended asset]

Threat Rank   Aircraft   Threat Value
1             a          0.8
2             c          0.7
3             b          0.1
4             d          0.0

10

TE Model Threat Lists

[Figure: Course Variation model; aircraft a, b, c and d relative to the defended asset]

Threat Rank   Aircraft   Threat Value
1             c          0.9
2             a          0.4
3             d          0.3
4             b          0.0

11

TE Model Threat Lists

[Figure: Bearing model; aircraft a, b, c and d relative to the defended asset]

Threat Rank   Aircraft   Threat Value
1             b          0.9
2             a          0.4
3             c          0.2
4             d          0.1

12

TE Model Threat Lists

[Figure: TTWR model; aircraft a, b, c and d relative to the defended asset]

Threat Rank   Aircraft   Threat Value
1             b          0.9
2             c          0.7
3             a          0.6
4             d          0.1

13

Combined Asset Threat List

Course:
Rank   Aircraft   Threat Value
1      d          0.8
2      b          0.5
3      c          0.3
4      a          0.1

PPD:
Rank   Aircraft   Threat Value
1      d          0.9
2      b          0.7
3      a          0.4
4      c          0.1

CV:
Rank   Aircraft   Threat Value
1      c          0.8
2      a          0.7
3      d          0.6
4      b          0.2

Bearing:
Rank   Aircraft   Threat Value
1      a          0.9
2      c          0.8
3      b          0.6
4      d          0.0

TTWR:
Rank   Aircraft   Threat Value
1      a          0.8
2      c          0.7
3      b          0.5
4      d          0.0

Asset (combined):
Rank   Aircraft   Threat Value
1      a          0.8
2      c          0.7
3      b          0.5
4      d          0.0

14

Asset Threat Lists

[Figure: aircraft a, b, c and d relative to the defended asset]

Threat Rank   Aircraft   Threat Value
1             a          0.8
2             c          0.7
3             b          0.5
4             d          0.2

15

Asset Threat Lists

[Figure: aircraft a, b, c and d relative to the defended asset]

Threat Rank   Aircraft   Threat Value
1             a          0.9
2             c          0.7
3             d          0.3
4             b          0.1

16

Combined System Threat List

Asset a:
Rank   Aircraft   Threat Value
1      a          0.9
2      c          0.7
3      d          0.3
4      b          0.1

Asset b:
Rank   Aircraft   Threat Value
1      a          0.8
2      c          0.7
3      b          0.5
4      d          0.2

System (combined):
Rank   Aircraft   Threat Value
1      a          0.9
2      c          0.7
3      d          0.3
4      b          0.2

17

System Threat Lists

[Figure: aircraft a, b, c and d relative to the defended system]

Threat Rank   Aircraft   Threat Value
1             a          0.9
2             c          0.7
3             d          0.3
4             b          0.2

18

Combination Procedures

• Maximize Agreement Heuristic, Distance-Based Solution, Additive Model, Analytic Hierarchy Process

• Adapted from industrial applications to be applied to TEWA

• TE model importance and Asset priorities taken into account by weighting

• Aim to maximize flexibility to satisfy end-user requirements
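Of the procedures listed, the additive model is the simplest to sketch: each TE model's threat values contribute to the combined list, scaled by a model-importance weight. The weights and values below are illustrative, not the project's:

```python
def additive_combine(model_lists, weights):
    """Combine per-model threat lists into one list by weighted sum.

    model_lists: {model name: {aircraft: threat value}}
    weights:     {model name: importance weight}, assumed to sum to 1
                 so the combined values stay in [0, 1].
    """
    combined = {}
    for model, values in model_lists.items():
        w = weights[model]
        for aircraft, value in values.items():
            combined[aircraft] = combined.get(aircraft, 0.0) + w * value
    # Rank by descending combined threat value.
    return sorted(combined.items(), key=lambda kv: kv[1], reverse=True)

lists = {
    "Course":  {"a": 0.9, "b": 0.4, "c": 0.8, "d": 0.0},
    "PPD":     {"a": 0.8, "b": 0.1, "c": 0.7, "d": 0.0},
    "Bearing": {"a": 0.4, "b": 0.9, "c": 0.2, "d": 0.1},
}
weights = {"Course": 0.5, "PPD": 0.3, "Bearing": 0.2}
print(additive_combine(lists, weights))   # a ranks highest, then c, b, d
```

The same weighting idea extends to asset priorities when asset threat lists are combined into the system list.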

19

2. Simulation of TEWA

• Constructive discrete event simulation

– Hill RR, Miller JO & McIntyre, Applications of discrete event simulation modelling to military problems, Proceedings of the 2001 Winter Simulation Conference, 2001.

– Clymer JR, System design and evaluation using discrete-event simulation with artificial intelligence, Proceedings of the 1993 Winter Simulation Conference, 1993.

• System components:
– Defended Assets
– Weapon sensors & effectors
– Terrain
– Monitored Threats (fixed wing aircraft)

• System track (3D)

• Attack technique [Jacques du Toit & Willa Lotz]
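A constructive discrete event simulation can be sketched as a time-ordered event queue whose handlers schedule further events. The event names below are hypothetical, not the project's architecture:

```python
import heapq

def run_simulation(initial_events, handlers, until=100.0):
    """Process (time, event, data) tuples in time order.

    handlers: {event name: function(time, data) -> list of new events}.
    Each handler may schedule further events, which are pushed back
    onto the queue; the loop stops at the time horizon.
    """
    queue = list(initial_events)
    heapq.heapify(queue)               # min-heap keyed on event time
    log = []
    while queue:
        time, event, data = heapq.heappop(queue)
        if time > until:
            break
        log.append((time, event))
        for new in handlers.get(event, lambda t, d: [])(time, data):
            heapq.heappush(queue, new)
    return log

# Hypothetical events: a track update triggers a threat re-evaluation
# 0.1 s later; the evaluation schedules nothing further.
handlers = {
    "track_update": lambda t, d: [(t + 0.1, "evaluate_threat", d)],
    "evaluate_threat": lambda t, d: [],
}
log = run_simulation([(0.0, "track_update", "aircraft a"),
                      (1.0, "track_update", "aircraft b")], handlers)
print(log)
```

Repetition (many runs) and variation (randomised scenario parameters) then wrap around a loop of this kind.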

20

System Infrastructure

21

3. Performance Measures

• Asset preservation

• Resource utilisation:
– Weapon cache / asset preservation ratio

• Threat evaluation accuracy:
– Intent vs action
– Estimated capability vs actual capability

• Assignment optimality:
– Temporal optimality
– Weapon allocation optimality
– Weapon assignment optimality

• Expert analysis

Challenges:
– Performance measures are difficult to quantify!
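As one concrete reading of the weapon cache / asset preservation ratio, a per-scenario score could be computed as below. This is a sketch under assumed definitions only; as noted above, the actual measures remain difficult to quantify:

```python
def resource_utilisation(weapons_fired, weapons_available,
                         assets_surviving, assets_total):
    """Weapon cache usage relative to asset preservation.

    Lower is better: few weapons spent per fraction of assets preserved.
    Undefined (infinite) when no assets survive.
    """
    preservation = assets_surviving / assets_total
    usage = weapons_fired / weapons_available
    return usage / preservation if preservation > 0 else float("inf")

# Scenario A: 4 of 20 missiles fired, all 5 assets survive.
print(resource_utilisation(4, 20, 5, 5))    # 0.2
# Scenario B: 12 of 20 fired, only 3 of 5 assets survive -> worse score.
print(resource_utilisation(12, 20, 3, 5))   # 1.0
```

Averaging such scores over repeated simulation runs would give the statistical certainty the earlier slides call for.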

22

Demonstration

23

HMI

24

Status / Further Work

• TE: Flagging model and deterministic model infrastructure implemented for multiple assets and multiple aircraft.

• Threat list generation and system threat calculation implemented.

• Probabilistic threat models, WA models, and discrete event simulation of multiple aircraft and multiple assets in various attack scenarios will follow, in order to evaluate the TEWA models.

25

References

Duvenhage B & le Roux WH, A Peer-to-Peer Simulation Architecture, Proceedings of the 2007 High Performance Computing and Simulation Conference, pp. 684-690, 2007.

le Roux WH, Implementing a Low Cost Distributed Architecture for Real-Time Behavioural Modelling and Simulation, Proceedings of the 2006 European Simulation Interoperability Workshop, Stockholm, pp. 81-95, June 2006.

Roux J & van Vuuren JH, Threat evaluation and weapon assignment decision support: A review of the state of the art, Orion Journal of Operations Research, Submitted June 2007.

Hill RR, Miller JO & McIntyre, Applications of discrete event simulation modelling to military problems, Proceedings of the 2001 Winter Simulation Conference, 2001.

Clymer JR, System design and evaluation using discrete-event simulation with artificial intelligence, Proceedings of the 1993 Winter Simulation Conference, 1993.