
TRANSCRIPT

Page 1:

Improving Confidence in the Assessment of System Performance in Differing Scenarios

T D Clayton

Cardinal Consultants

19 ISMOR, Aug 2002

Page 2:

1. Context

2. Scenario Dependency of Input Data

3. Choosing Scenarios to Assess

4. Modelling Widely Differing Scenarios

5. Example Study

6. Summary and Conclusions

Pages 3-6:

[Diagram, built up over four slides: SYSTEM EFFECTIVENESS ASSESSMENT at the centre, surrounded by the related study areas: Warhead / Fuze Performance, Combat modelling, Sensor Performance, Operator Performance, Guidance System, Wargaming, Tactical / Strategic studies, and other subsystems.]

Page 7:

Purpose of System Effectiveness Studies

• Research / long-term development objectives

• Medium-term procurement objectives

• Design optimisation

• Procurement decisions

• Input to Operational / Tactical Studies

Page 8:

But, whatever the purpose, scenario assumptions are critical. Or, we should assume they are, unless proven otherwise.

Page 9:

Rule 1

Everything is scenario-dependent.

Pages 10-11:

[The system effectiveness diagram from pages 3-6 repeated twice: first with the top-level elements highlighted, then with all elements highlighted.]

Page 12:

Pk = 0.47

Page 13:

A single figure such as this depends, among other things, on:

• Nature of the ground around the target

• Presence of adjacent trees, or protective earthworks

• Azimuth distribution

• Elevation distribution

• Relative value of M-kill, F-kill, P-kill, K-kill

• Likelihood of multiple hits

• Using an MFK value as a probability?
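The implication of the list above is easier to see with numbers. Below is a minimal illustrative sketch (in Python; every figure and band in it is invented, not taken from the study) of how a single headline Pk dissolves into a scenario-weighted quantity once geometry is made explicit:

```python
# Illustrative only: hypothetical values, not from the study.
# Pk per single hit, indexed by (range band, azimuth sector).
PK_TABLE = {
    ("short",  "front"): 0.30, ("short",  "side"): 0.60,
    ("medium", "front"): 0.20, ("medium", "side"): 0.47,
    ("long",   "front"): 0.10, ("long",   "side"): 0.25,
}

def pk(range_band: str, azimuth_sector: str) -> float:
    """Look up single-hit kill probability for one engagement geometry."""
    return PK_TABLE[(range_band, azimuth_sector)]

# A scenario supplies the weights over geometries, e.g. mostly
# long-range frontal engagements in open terrain:
weights = {("long", "front"): 0.7, ("medium", "front"): 0.2, ("medium", "side"): 0.1}
scenario_pk = sum(w * pk(*geom) for geom, w in weights.items())
print(f"Scenario-weighted Pk = {scenario_pk:.3f}")   # 0.157 here, not 0.47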

Page 14:

The Multi-Disciplinary Problem

Lethality Expert

Systems Modeller

Combat Modeller

Page 15:

The Management Solution

Establish roles and responsibilities for managing the interfaces between expert groups.

Page 16:

Responsibilities of the Interface Manager

• Understand methodologies and assumptions at all levels

• Organise training / briefings to help expert groups widen their knowledge

• Conduct studies to measure scenario dependencies of results

• Maintain a knowledge base of dependencies and “corrections”

• Involvement in planning of studies, addressing assumptions

• Involvement in reporting of studies, especially assumptions

Page 17:

[Framework diagram: Study 1, Study 2 and Study 3 feed a MAIN DATABASE OF STUDY RESULTS; Comparison & Analysis of those results builds a DATABASE OF SCENARIO COMPENSATION FACTORS; the databases support ‘offline’ analysis tools, study planning and analysis, and data provided to other studies.]

Page 18:

[Feedback diagram on the same framework: SCFs are calculated from new studies, assessed and compared against the stored set, and the modified SCFs are written back to the DATABASE OF SCENARIO COMPENSATION FACTORS.]
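To make the compensation-factor framework concrete, here is a minimal sketch of how an SCF might be stored and applied. The structure, names, and the simple multiplicative form of the correction are assumptions for illustration, not the method described in the talk:

```python
# Hypothetical sketch of the scenario-compensation-factor idea,
# assuming a simple multiplicative correction.
from dataclasses import dataclass

@dataclass
class StudyResult:
    system: str
    scenario: str
    moe: float                 # measure of effectiveness from the study

# SCF estimated from paired studies: ratio of MoE in the target scenario
# to MoE in the source scenario (invented value).
scf = {("desert", "temperate"): 0.8}

def transfer(result: StudyResult, target_scenario: str) -> StudyResult:
    """Re-express a stored result in another scenario via a stored SCF."""
    factor = scf[(result.scenario, target_scenario)]
    return StudyResult(result.system, target_scenario, result.moe * factor)

old = StudyResult("System A", scenario="desert", moe=10.5)
print(transfer(old, "temperate"))   # System A's 10.5 becomes 8.4 in 'temperate'
```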

Page 19:

Rule 2

You will never assess the right scenarios.

Page 20:

Scenario Parameters

Climate - Temperature - Precipitation

Ground - Vegetation - Topology - Roads

Geography & Politics - Geographic isolation - Neighbouring countries - Local civilian population

Opposing Max. Cap. - Nuc., Chem., Bio. - Short range / Long range

Opposing Troops - Numbers - Capability

Opposing Ground Equipment - Technology - Numbers - Own Intell.

Posture & Deployment - Posture (defensive, attacking) - Deployment and detectability

Air Capability - Aircraft types - Level of technology - Numbers - Own Intell.

Anti-Air Capability - Numbers of units - Capability - Own Intell.

Maritime - Maritime involvement - Capability

BLUE ROLE - Peace keeping, combat (defensive), combat (hunt and kill)

Page 21:

[Histogram over Scenario 1, Scenario 2 and Scenario 3.]

Page 22:

[Graph: scenarios positioned along a continuous parameter.]
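One way to act on Rule 2 is to stop treating scenarios as a handful of fixed cases and instead sample the continuous parameter space of page 20. The sketch below assumes that framing; the parameter names and ranges are invented for illustration:

```python
# Hypothetical sketch: a scenario as a point in a continuous parameter
# space, sampled many times rather than fixed at three cases.
import random

PARAM_RANGES = {
    "mean_engagement_range_m": (500.0, 4000.0),
    "fraction_night_engagements": (0.0, 0.5),
    "enemy_armour_fraction": (0.1, 0.9),
}

def sample_scenario(rng: random.Random) -> dict:
    """Draw one scenario by sampling each parameter uniformly in its range."""
    return {name: rng.uniform(lo, hi) for name, (lo, hi) in PARAM_RANGES.items()}

rng = random.Random(0)
scenarios = [sample_scenario(rng) for _ in range(100)]
# Each sampled scenario would then feed the effectiveness calculation,
# giving a distribution of results rather than three point estimates.
```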

Page 23:

Rule 3

A combat model cannot address widely differing scenarios.

Page 24:

Example Study

Comparative assessment of two potential candidates for a cannon system for light armoured vehicles.

Page 25:

                          System A    System B
Weight                    190 kg      105 kg
Range                     6 km        4 km
Rounds on vehicle         70          180
Accuracy                  2 mil       3 mil
Dispersion at 2 km        5 m sd      10 m sd
Single round Pkh (truck)  0.06        0.04
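A crude first cut makes the trade-off concrete. Multiplying rounds carried by the single-round Pkh against a truck (purely illustrative: it ignores range, accuracy, target mix and engagement dynamics):

    System A:  70 × 0.06 ≈ 4.2 expected kills per ammunition load
    System B: 180 × 0.04 ≈ 7.2 expected kills per ammunition load

Even this naive product shows why rounds on the vehicle can matter as much as per-round lethality.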

Page 26:

ORIGINAL STUDY PLAN

Input data → Engagement Model (developed for this study) → Combat model (existing), run across 3 scenarios

Page 27:

REVIEW OF PROVIDED DATA

1. When multiple hits are likely, SSKP may not be appropriate.

2. Lethality figures give no azimuth dependency.

3. No information on range dependency.

4. Data required for wider range of target types.

Lethality models were re-run, in concert with the Engagement model.

Page 28:

REVIEW OF EXISTING COMBAT MODEL

1. Tends to choose tanks as preferred target type.

2. All targets are land vehicles.

3. Terrain in all 3 scenarios tends to give long engagement ranges.

4. No variations in met-vis or day/night, again leading to long ranges.

5. Same Blue positions for both System A and System B.

6. Units are static when firing.

Page 29:

[Bar chart: Military Value of Kills per ammunition load, 0-12, for Scenarios 1, 2 and 3, comparing System A with System B.]

Page 30:

THE ALTERNATIVE APPROACH

1. Use a range of methods, including Military Judgement, to derive intermediate data and distributions reflecting a wide range of scenarios.

• relative frequencies of target types engaged

• engagement range distributions

• azimuth distributions

• probability of kill per burst - function of range and target type

Page 31:

THE ALTERNATIVE APPROACH

1. Use a range of methods, including Military Judgement, to derive intermediate data and distributions reflecting a wide range of scenarios.

2. Develop a simple tool to calculate specific Measures of Effectiveness from the input data and distributions.

MoE 1: Military Worth of kills per burst

MoE 2: Military Worth of kills per ammunition load
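A minimal sketch of how such a simple tool might compute the two MoEs, assuming a straightforward expected-value calculation; all distributions, worth values and the burst size below are invented placeholders:

```python
# Hypothetical sketch of the simple MoE tool.
target_freq = {"tank": 0.2, "apc": 0.5, "truck": 0.3}   # relative frequency engaged
range_dist  = {2000: 0.6, 4000: 0.4}                    # engagement range (m): prob.
worth       = {"tank": 3.0, "apc": 2.0, "truck": 1.0}   # military worth of one kill
rounds_per_load, rounds_per_burst = 180, 10

def pk_burst(target: str, rng_m: float) -> float:
    """Placeholder kill probability per burst, by target type and range."""
    base = {"tank": 0.05, "apc": 0.15, "truck": 0.25}[target]
    return base * (2000.0 / rng_m)          # invented fall-off with range

# MoE 1: expected military worth of kills per burst.
moe1 = sum(ft * pr * pk_burst(t, r) * worth[t]
           for t, ft in target_freq.items()
           for r, pr in range_dist.items())
# MoE 2: expected military worth of kills per ammunition load.
moe2 = moe1 * (rounds_per_load / rounds_per_burst)
print(f"MoE 1 = {moe1:.3f} per burst, MoE 2 = {moe2:.2f} per load")
```

MoE 2 here is just MoE 1 scaled by the number of bursts in a load; the value of such a tool is that every input distribution is visible and open to scrutiny and sensitivity study.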

Page 32:

THE ALTERNATIVE APPROACH (continued)

Desired properties of the simple tool (step 2):

• quick to develop

• quick to run

• facilitates review and scrutiny of data

• stores data and maintains audit trails

Page 33:

THE ALTERNATIVE APPROACH (continued)

The tool should also permit results to be adjusted by Military Judgement to account for factors not addressed by the calculations, for example:

- the value of the ability to fire on the move

- the value of the greater manoeuvrability afforded by the lighter system

Pages 34-39:

[Six slides of screenshots of the tool (data-entry and results screens); no text was captured in the transcript.]

Page 40:

SUMMARY AND CONCLUSIONS

Appropriate methods of addressing scenario dependencies are essential to ensure study conclusions are valid.

1. ALL DATA should be regarded as scenario-dependent.

It is very useful to have an analyst in every team with special responsibility for addressing this problem.

2. Using combat models to compare the performance of systems can be hazardous.

Consider using a range of methods to generate intermediate results which are open to scrutiny and to sensitivity studies.



Page 43:

Further Development of the CST Tool

1. Development of a proper library of routines

2. Improved statistical routines for increased speed

3. Automated methods for parametric studies

4. Use of EDMS technologies to manage and access study reports

Page 44:

CURRENT ISSUES / PROBLEMS WITH CST-01

1. It is not clear how best to address the problem of firing multiple bursts at a target, depending upon whether it is perceived to be killed (one possible treatment is sketched below).

2. It is not clear whether (and how) costs (or numbers of units) should be included, or handled separately.
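For issue 1, one possible treatment is a shoot-look-shoot loop in which firing continues until the target is perceived to be killed. This is a hypothetical sketch with invented probabilities, not the CST-01 method:

```python
# Hypothetical shoot-look-shoot model: fire bursts until the firer
# *believes* the target is dead, or gives up.
import random

def bursts_used(pk_burst: float, p_recognise_kill: float, max_bursts: int = 5) -> int:
    """Return the number of bursts fired at one target."""
    dead = False
    for n in range(1, max_bursts + 1):
        dead = dead or (random.random() < pk_burst)   # a burst may kill the target
        # Perception is imperfect: a kill may go unrecognised, so further
        # bursts (and ammunition) are spent on an already-dead target.
        if dead and random.random() < p_recognise_kill:
            return n
    return max_bursts

samples = [bursts_used(pk_burst=0.25, p_recognise_kill=0.8) for _ in range(10_000)]
print(f"Mean bursts per engagement ~ {sum(samples) / len(samples):.2f}")
```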