
WHITE PAPER: TEST, EVALUATION & ACCEPTANCE BEST PRACTICE
BMT’s experience of UOR Capability Delivery

Executive Summary

BMT Defence Services (BMT) has provided support to the UK MoD (primarily DE&S) for the acquisition, and ensuing delivery to theatre, of a number of Land Platform Urgent Operational Requirement (UOR) programmes. We have subsequently identified a number of ‘lessons’ that can be used in future acquisitions to ensure a robust, thorough, yet still agile systems engineering approach to capability delivery. This white paper discusses BMT’s recognised ‘best practice’ relating to the test, evaluation and acceptance process and how our ‘lessons learnt’ from UOR programmes over the past three years can be used to deliver front line capability to tight time and cost parameters, yet still achieve the levels of quality and performance required.


Contents

Executive Summary ... 1
Contents ... 2
Introduction ... 3
  Aim ... 3
  Definitions ... 3
  Background ... 3
Requirements and Acceptance Management ... 8
  Requirements Background ... 8
  Requirements and Acceptance Strategy ... 8
Integrated Test, Evaluation and Acceptance ... 10
  ITEA Principles ... 10
ITEA Process ... 12
  Plan the Approach ... 12
  Define Verification & Validation Criteria ... 13
  Determine the Evidence Needs (Detailed Trials and Tests Planning) ... 15
  Detail the ITEAP ... 19
    ITEAP Structure ... 23
    Stakeholders and Working Groups ... 23
    Linkage of Requirements Management to Acceptance ... 25
    VVRM ... 26
    ITEAP Schedule ... 27
  Collect Evidence (Execute ITEAP) ... 27
    Factory Acceptance Tests ... 28
    Scenario Based Acceptance ... 29
  Evaluate and Recommend (Evidence Management) ... 30
  Acceptance Declaration ... 31
    Sub-System Contractual Acceptance ... 31
    System Acceptance ... 32
    Capability Acceptance ... 32
Summary ... 33
Conclusion ... 33
Case Studies ... 34
  Project TALISMAN ... 34
  Explosive Line Charge ... 35
  Soldier Short Gap Crossing ... 35
  MoD Projects ... 35


INTEGRATED TEST, EVALUATION & ACCEPTANCE BEST PRACTICE

“Although much needs improvement in the planning and delivery of longer term requirements, it is notable, and to the DE&S’s great credit, that the equipment acquisition system works best when needs are greatest. The UOR process, which is designed to provide battle-winning equipment at short notice to current operations, appears able to deliver better trade-offs between performance, cost and time in the interests of ensuring that, by and large, the front line receives the right kit at the right time.”

Bernard Gray, Review of Acquisition for the Secretary of State for Defence, 2009

INTRODUCTION

Aim

This white paper discusses the lessons that BMT Defence Services (BMT) has learnt managing test, evaluation and acceptance, in support of DE&S, to deliver Land Platform Urgent Operational Requirement (UOR) capability to demanding timescales. The aim of this paper is to enable the reader to use this experience in applying test and evaluation ‘best practice’ to any future programme or project.

Definitions

Test – A controlled event designed to measure the performance of an entity in controlled circumstances (typically stimulus/load and environment)1.

Evaluation – The formal analysis of existing information or test results in order to inform an acceptance decision, or the action of determining the overall worth of a solution and how that worth might be increased, on balance, across the properties of effectiveness, cost, time and achievability1.

Acceptance – A process, under the control of the Sponsor as the Acceptance Authority, confirming that the user’s needs for military capability have been met by the systems supplied1.

Integrated Test, Evaluation and Acceptance (ITEA) – Confirms that the supplied solution meets the user’s needs. It is also a method of identifying and managing technical and operational risks – and hence time and cost – throughout the programme1.

Background

As the primary intention for all new programmes is to develop and release into service a system or capability that will meet a specified need and be accepted by its intended user, it is necessary to understand why the Testing and Evaluation (T&E) of a developing system is important and how these activities support the systems engineering lifecycle (shown in Figure 1).

1 Definition from Ministry of Defence AOF Glossary


T&E occurs throughout a number of stages of the lifecycle: from initial requirements, through acceptance and entry into service, and sustained through the life of the project (re-activated for upgrade and modification programmes). These stages include:

• Development of the concept and the associated requirement sets;
• Down-selection of competing systems;
• Contract Acceptance, through verification of system requirements;
• Validation of user requirements, especially the Key User Requirements (KURs) detailed in the Business Case;
• Characterisation and optimisation of interoperability with other systems.

Although this approach has worked well in the past for longer-duration Equipment Programmes, UOR programmes have become increasingly technically complex in order to deal with a constantly evolving threat or changing operational environment, and require a robust, yet agile, T&E process. These activities therefore need careful, hands-on management if the pace of the UOR programme is to be sustained.


Figure 1: The Systems Engineering Lifecycle – showing the link between requirements and T&E activity.


BMT has previously described a streamlined approach to acquisition in its ‘FAST’ Acquisition White Paper2, summarised in the next three paragraphs, which aligns with the MoD recommended guidance for test, evaluation and acceptance due to its iterative nature, improved configuration and continuous cycle of feedback.

The Systems Engineering ‘V’ diagram, pictured in Figure 1, forms the basis of the Concept, Assessment, Demonstration, Manufacture, In-Service, Disposal (CADMID) cycle upon which the Acquisition Operational Framework (AOF) is securely grounded. If the AOF itself is to be streamlined for UOR delivery then it follows that the methodology, or technique, on which it is based should be further analysed.

Figure 2 shows a modified version of the ‘V’ diagram, now termed the ‘O’ diagram. The left hand side of the ‘V’ has been ‘straightened’ to show a more rapid progression through the requirements specification. This occurs through the utilisation and concurrency of consultancy activities to produce an agreed architecture and robust requirements to which industry can Design, Manufacture/Modify, and Test. This is in contrast to the original ‘V’ model, in which a ‘waterfall’ approach is taken from the User Requirements, through the System Requirements, and then on to the Architecture design in a sequential, and therefore more time-consuming, procedure.

INTEGRATED TEST, EVALUATION & ACCEPTANCE BEST PRACTICE

Figure 2: The Systems Engineering ‘O’ Diagram – An adaptation of the ‘V’ Diagram providing a more agile approach

[Figure 2 labels: Project start; Requirements; Architecture; Design; Manufacture; Unit Test; Factory Test; Trials – verification; Trials – validation; In service; Review; Optioneering; Analysis; Comparison; Decision; Action; Stakeholder contributions.]

2 Published in RUSI Defence Analysis, February 2010.


The continual review process in the design, manufacture and test phases ensures that issues can be identified early and resolved with stakeholder input. Once the system has been delivered by industry, its acceptance is made easier by the presence of robust and testable requirements against which it can be assessed. A highly important factor with this approach is the feedback loop forming the completed ‘O’. This enables feedback on the fielded capability’s performance to be provided to inform the equipment programme and for subsequent planning of modifications to enable uplift from the originally delivered 80% capability.

BMT believes that it is possible to combine MoD guidance with the lessons learnt through BMT’s development of this ‘FAST’ process and its experience of test, evaluation and acceptance activities to determine a ‘best practice’ approach which can be applied to future projects, increasing project success and reducing the costs associated with such activities.

This paper does not cover the requirements scoping, elicitation, prioritisation and agreement processes, but focuses on the strategy for requirements management, leading into the strategy needed to manage successful completion of capability acceptance.

Figure 3 (below) has been included to provide an overview of the documentation set required to successfully follow the recommended acceptance process. The detail behind this diagram is explained throughout the remainder of this paper. It is worth noting, however, that the documents displayed in green are supporting documents which are developed throughout other stages of the project.


[Figure 3 shows the documentation set: Key User Requirements; User Requirements Document; System Requirements Document; Requirements and Acceptance Management Plan; RAMP (Acceptance Strategy); Master Data and Assumptions List; Trials Administration Orders; Trials Directive Plans; Existing Trials Data; Verification and Validation Requirements Matrix; ITEA Schedule; Stakeholder Responsibility Matrix; GFX Plan; Asset & Contract Delivery Schedules; Capability Integration Plan; Test Team Master Schedule; Business Case; Relationship Between ITEAP and CIP; Integrated Test, Evaluation and Acceptance Plan; Capability Acceptance Report; Safety Case; First Impressions Report; Test/Trials Report; Exception Report; Other DLoD Reports.]

Figure 3: Documentation Set Overview


REQUIREMENTS AND ACCEPTANCE MANAGEMENT

Requirements Background

For any project, the Business Case will state the KURs reflecting the capability, services and performance that the Sponsor expects of the delivered system. As far as is possible, no constraints should be placed on the engineering solution, although constraints relating to systems-of-systems considerations may be necessary to ensure interoperability with other systems. Approaches to fulfilling the capability using candidate Military Off The Shelf (MOTS) system solutions and Government Furnished Equipment (GFE) components should also be considered.

For more complex programmes, a User Requirements Document (URD) may be developed from the KURs, which can be extremely valuable for developing the requirements for the non-equipment capability (non-EC) Defence Lines of Development (DLoD). Responsibility for non-EC DLoD planning is delegated to DLoD owners, their remit being to plan and manage fulfilment of the particular DLoD requirements. The DLoD owners are expected to report evidence of satisfaction of the non-EC DLoDs as part of the overall capability acceptance case. The System Requirements Document (SRD), derived from the KURs or URD, identifies the engineering constraints and targets that the user has decided to place on the assets and equipment. The System Requirements will be used as the basis for acquisition of the solution, against which the supplied capability will be verified.

Where possible, a Prime Contractor will be responsible for supplying the integrated solution to meet the SRD. Sometimes this is not the case and DE&S takes on the role of system integrator, with sub-systems procured by DE&S from different suppliers. These sub-systems should be specified against a subset of the SRD; it is important that the Measures of Performance (MoPs) for each sub-system requirement refer to the contribution that the supplier is expected to make rather than the performance of the overall system. This will avoid ambiguity, discussion and delay when it comes to sub-system acceptance. The balance of the overall performance may be due to another sub-system or due to an item of GFE.

Requirements and Acceptance Strategy

Integrated Test, Evaluation and Acceptance (ITEA) is a way of progressively confirming that the Users’ needs have been met by the developed solution and that each of the DLoDs combine effectively to deliver the capability. It also provides a defined method of identifying and managing the technical and operational risks related to the project. However, evidence against a set of acceptance criteria needs to be gathered to determine that this is the case, so that approval can be gained from the Sponsor (acting as Acceptance Authority) and Stakeholders from the relevant Capability Integration Working Group (CIWG), responsible for the range of DLoDs, before the system can be accepted into service. For this reason, it is important for each requirement to be explicit, atomic, quantified and testable.



Explicit requirements reduce the possibility that the supplier has misinterpreted what has been asked for, and ensure that the recipient is able to identify that what has been delivered meets the identified need. Quantified requirements allow the performance to be established against the identified need and, by being testable, make it possible to reach a consensus among Stakeholders that the requirement has been met, rather than relying on purely subjective opinion. Each requirement should therefore have at least one testable characteristic, with traceability demonstrated between the requirements and the gathered acceptance evidence.
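As an illustrative sketch only, a requirement of this kind might be captured as a simple record that makes the testable characteristic and the evidence trace explicit. The record layout, field names and example values below are assumptions for illustration, not part of any MoD toolset:

```python
from dataclasses import dataclass, field

@dataclass
class Requirement:
    """One explicit, atomic requirement with a quantified, testable MoP."""
    req_id: str                  # hypothetical identifier, e.g. "SRD-042"
    statement: str               # the explicit need, stated once (atomic)
    measure_of_performance: str  # the characteristic to be measured
    threshold: float             # quantified pass value agreed with Stakeholders
    unit: str
    evidence_refs: list = field(default_factory=list)  # gathered acceptance evidence

    def is_testable(self) -> bool:
        # Testable here means a named MoP with a quantified threshold.
        return bool(self.measure_of_performance) and self.threshold is not None

    def is_traceable(self) -> bool:
        # Traceable here means at least one linked item of acceptance evidence.
        return len(self.evidence_refs) > 0

r = Requirement("SRD-042", "The vehicle shall ford water obstacles.",
                "Fording depth", 1.0, "m")
r.evidence_refs.append("Trials Report TR-017")  # invented evidence reference
```

Holding each requirement in this shape makes the two checks described above (testability and traceability) mechanical rather than a matter of opinion.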

The requirements and acceptance strategy for any project is typically based upon the DE&S AOF recommended systems engineering ‘V’ process model. As the outline of the requirements and acceptance process, shown in Figure 4 below, indicates, the testing, evaluation and acceptance of a system needs to be considered very early on in the project lifecycle and continue throughout its development. In particular, the strategy for acceptance should be defined as early as Project Initiation and captured within the Requirements and Acceptance Management Plan (RAMP)3. The strategy contained within the RAMP defines the process and management of the acceptance activities and outlines the principal goals for the capability acquisition. These goals can then inform the development of the validation criteria in the URD and the verification criteria in the SRD.


[Figure 4 labels: Capability Need; User Requirements; System Requirements; Suppliers Design; Sub-Systems Trials & Testing; Sub-System FATs; System Trials & Testing; System Acceptance; Manufacture System; Characterisation (Operational) Trials & Tests; Capability Solution @IOC; Deploy @FOC; Capability Acceptance; Indicative Contractual Boundary; Verify results against Satisfaction Criteria using Evidence & WG decision; Validate using Evidence & CIWG agreement; ITEA Timeline: ITEA Preparation & Start-up, Test & Trials Initiation, Test & Trials Management, Test & Trials Completion.]

Figure 4: Requirements and Acceptance Process – linking acceptance activity to requirements definition at the earliest opportunity in a project’s lifecycle

3 Additional information regarding the content and structure of the RAMP can be found within the AOF.


INTEGRATED TEST, EVALUATION AND ACCEPTANCE

ITEA Principles

The key principles of ITEA revolve around early engagement of Stakeholders across all DLoDs, in order to ensure that a full understanding of all evidence collection requirements is obtained and that the most effective method of evidence collection can be determined. ITEA also provides an opportunity to identify any dependencies between testing and evaluation activities so that these can be effectively managed throughout the lifecycle. The use of a Combined Test Team (CTT) to collect both developmental and operational evidence may help to ensure that cost-effective and timely delivery of acceptance evidence is achieved.

Wherever possible, a requirement should be tested only once, and as early in the programme as possible, with the objective of minimising the time and cost of live trials and reducing the dependence on critical resources such as MoD trial facilities which are, at present, in heavy demand. However, a balance must be struck between the cost of reducing uncertainty and the risks associated with less than full test coverage. Experience has shown that, with careful planning, several requirements can usually be tested simultaneously, and that risk can be reduced by integrating operational evaluation and test activities. Applying a progressive and risk-based approach to evidence collection will not only assist in determining the most important elements of testing and evaluation, and therefore which activities may need additional effort, but also ensures that suitably representative test methods are used throughout development rather than being left until critical stages where remedial action may be unfeasible. Regression testing may also be required in some circumstances.
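As a toy illustration of the ‘test several requirements together, test each only once’ principle, a simple greedy pass can pick a small set of trial events that covers every requirement at least once. The trial names and requirement IDs are invented; real trials planning also weighs cost, schedule and facility availability, which this sketch ignores:

```python
# Greedy selection of trial events: each candidate trial covers a set of
# requirements; repeatedly pick the trial that covers the most still-untested
# requirements until every requirement is covered at least once.
def plan_trials(candidate_trials, requirements):
    uncovered = set(requirements)
    plan = []
    while uncovered:
        best = max(candidate_trials, key=lambda t: len(candidate_trials[t] & uncovered))
        gained = candidate_trials[best] & uncovered
        if not gained:  # no remaining trial tests these requirements
            raise ValueError(f"No candidate trial covers: {sorted(uncovered)}")
        plan.append(best)
        uncovered -= gained
    return plan

# Invented example data: three candidate trials, five system requirements.
trials = {
    "Mobility field trial": {"SRD-01", "SRD-02", "SRD-03"},
    "Factory firing test":  {"SRD-03", "SRD-04"},
    "Climatic chamber":     {"SRD-05"},
}
plan = plan_trials(trials, {"SRD-01", "SRD-02", "SRD-03", "SRD-04", "SRD-05"})
```

Here the mobility trial is scheduled first because it discharges three requirements in one event, reflecting the aim of minimising live trial time on scarce facilities.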

The implementation of an ITEA approach can lead to a number of benefits being realised, including the development of an agreed approach to the collection and evaluation of evidence covering all requirements, all DLoDs and relevant to all Stakeholders, and the ability to determine a clear understanding of the time, cost and risks associated with the test, evaluation and acceptance activities. This approach aims to reduce duplicated effort and make the most efficient use of available test resources by:

• Identifying the tests that most effectively measure capability and performance;
• Planning the tests to make efficient use of all resources by testing the right aspects at the right time;
• Identifying the need to demonstrate the test;
• Running the tests, recording all of the relevant data and, where possible, testing once but using the data many times, with a focus on optimising the test programme and preventing unnecessary testing through early identification of the evidence required for acceptance;
• Identifying and managing technical and operational risks.


The overarching details of the test, evaluation and acceptance process for a project should be contained within a consolidated Integrated Test, Evaluation and Acceptance Plan (ITEAP), as this can be used as a single reference point for all ITEA plans, processes and activities and reduces the need for duplication of information within a number of related documents. Although the ITEAP is primarily a documented plan, it also includes a Verification and Validation Requirements Matrix (VVRM) and an ITEA Schedule, as explained in the following sections of this paper, and should contain references to other documentation which may be relevant, such as detailed test plans. This ensures that any relevant required information is always up to date, as only one plan needs to be modified throughout system development.
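A minimal sketch of how a VVRM might be held and checked for coverage follows. The structure, keys and example entries are illustrative assumptions, not the AOF-defined format; the point is that each requirement is linked to a planned V&V method and an evidence source, and any requirement without both is a gap in the acceptance case:

```python
# Each VVRM row links a requirement to its planned V&V method and the
# evidence source that will discharge it (all entries invented).
vvrm = {
    "URD-01": {"method": "Operational Field Trial", "evidence": "Trials Report TR-101"},
    "SRD-01": {"method": "Supplier Test",           "evidence": "FAT Certificate FAT-12"},
    "SRD-02": {"method": "Independent Analysis",    "evidence": "Analysis Report AN-7"},
}

def coverage_gaps(matrix, requirement_ids):
    # A requirement with no planned method or evidence source is a gap in
    # the acceptance case and must be resolved before trials begin.
    return [r for r in requirement_ids
            if r not in matrix
            or not matrix[r].get("method")
            or not matrix[r].get("evidence")]

gaps = coverage_gaps(vvrm, ["URD-01", "SRD-01", "SRD-02", "SRD-03"])
```

Because the matrix lives in one place, adding or amending a row keeps the single ITEAP reference point up to date rather than scattering the change across documents.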

The AOF suggests following a seven-stage process for ITEA, shown in Figure 5 below. However, experience has shown that to get the full benefit of this process, it should be carried out incrementally and not as a series of independent steps. This iterative process ensures that the acceptance methods and activities are constantly reviewed as the project develops and remain relevant and feasible throughout development, leading to the successful acceptance of the system.


[Figure 5 shows the seven stages: Plan the Approach; Define V&V Criteria; Determine the Evidence Needs; Detail the ITEAP; Collect Evidence; Evaluate and Recommend; Acceptance Declaration.]

Figure 5: ITEA Process


ITEA PROCESS

Plan the Approach

As mentioned above, the approach to acceptance, which outlines the capability acquisition goals, should be considered during Project Initiation and detailed within the RAMP. A thorough strategy will ensure that Stakeholder expectations are managed from the outset and that confidence is gained in the approach to be taken. Planning the approach in such a way was one of the key factors in BMT winning the contract to provide acceptance support to the TALISMAN Project4.

The acceptance strategy should answer a number of questions relating to the plans for the acceptance activities, primarily:

• Who will carry out the process of Acceptance, and what contractual mechanism will be used to enable this?
• Has the scope and type of the evidence to be collected been agreed with key Stakeholders, and confirmed with relevant technical experts as sufficient?
• When will the testing be completed, and will this be representative of the final product?
• Will all products be tested, or will this be limited to a percentage of products or factory tests of component performance?
• How will pass/fail criteria be established as part of any test plan, and how will successful testing be planned and scheduled to inform the acceptance process?

The strategy must consider any assumptions made in the project Master Data and Assumptions List (MDAL) and, as a minimum, define the following aspects of the project:

• The Acceptance Authority, DLoD Owners and Stakeholder roles and responsibilities;
• Initial Operating Capability (IOC);
• System Acceptance;
• The contribution of other Lines of Development;
• The relationship between the ITEAP and the Capability Integration Plan (CIP);
• Any interdependencies;
• Third party generation of evidence (such as suppliers or agencies);
• The responsibility for installation and integration testing;
• Test policy implications, including the use of ranges, simulation and responsibility for test facility validation;
• Technical issues such as design certification;
• Arrangements for UORs and any associated risks;
• Any multinational funding arrangements;
• Access and security of acceptance evidence; and
• The risk management approach.

4 Project TALISMAN is a complex Route Proving and Clearance project. Please see Case Studies for more information.
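One lightweight way to make such a strategy reviewable is to hold it as a structured record whose keys mirror the minimum aspects listed above, so that any aspect still undefined can be flagged for follow-up. The sketch below is purely illustrative; the key names and placeholder values are assumptions, not a prescribed RAMP format:

```python
# Keys mirror the minimum aspects the acceptance strategy should define;
# None or an empty list marks an aspect not yet agreed with Stakeholders.
acceptance_strategy = {
    "acceptance_authority": "Sponsor",
    "dlod_owner_responsibilities": [],
    "initial_operating_capability": None,
    "system_acceptance": None,
    "other_lines_of_development": [],
    "iteap_cip_relationship": None,
    "interdependencies": [],
    "third_party_evidence": [],
    "installation_integration_testing": None,
    "test_policy": None,
    "design_certification": None,
    "uor_arrangements": None,
    "multinational_funding": None,
    "evidence_access_and_security": None,
    "risk_management_approach": None,
}

# Flag every aspect the strategy has not yet defined.
undefined = [key for key, value in acceptance_strategy.items()
             if value is None or value == []]
```

Reviewing the `undefined` list at each iteration of the RAMP gives a simple, auditable view of which aspects of the strategy still need Stakeholder agreement.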


A key lesson learnt during the support to the TALISMAN project was that whilst it is important to plan the approach to ITEA as thoroughly as possible, it is also necessary to ensure that this approach is flexible should any project changes arise. For example, acceptance activities should not, ideally, be planned so tightly that there is a risk of delay to the entire project should one area be delayed. Any flexibility that can be built into the plan will reduce the need for constant rescheduling of activities should delays arise. Where risks such as this become apparent during the planning stages, they must be highlighted and managed accordingly. Consideration should also be given to early testing of some system components as a method of risk reduction, although it is recommended that Technology Readiness Level (TRL) Assessments are used to determine the feasibility of such testing.

Define Verification & Validation Criteria

The mechanism for acceptance of the system and its constituent sub-systems will be Verification and Validation (V&V) of the solution against the agreed SRD and URD respectively. Verification determines that the system which has been developed meets the requirements specified within the SRD, i.e. that the system was built correctly. Validation determines that the system which has been developed meets the needs of the user as specified in the URD, i.e. that the system is the correct solution for the user (that the correct system was built). The tracing between the requirement sets and their fulfilment through the acceptance process is shown in Figure 6 below.

[Figure 6 labels: MoPs; VVRM; Supplier Evidence (References); Trials & Test Results; GFE Information; MOTS Sub-System Information; Operational (Characterisation) Trials; System Acceptance Trials & Tests; with Verifies, Validates, Satisfies and Satisfied By traceability links.]

Figure 6: Traceability from Requirements to Acceptance


In order to gain acceptance from the Stakeholders, suitable evidence will have to be obtained to prove that the identified needs have been suitably met. All V&V activity will need to be planned to cover the entirety of the project’s requirement set, to ensure that a full acceptance case is made.

This evidence collection will typically be achieved through a series of trials, testing, and inspection of the supplied design solution, with proposed methods for V&V developed in line with the documented acceptance strategy as the User and System Requirements are matured. A number of methods can be used for V&V activities (listed below) and the most appropriate method should be determined based upon the requirement to be tested.

These activities can range from visual inspections, to simulations, to live trials, although it is worth noting that more than one method may be necessary to fully test a requirement. Additionally, in the case of UORs, full trials may not always be possible and in these cases adapted test methods may be required, although the reasons behind this should be specified within the ITEAP. The International Council on Systems Engineering (INCOSE) Systems Engineering Handbook provides details of the Inspection, Demonstration, Analysis, Test (IDAT) structure, which may be a beneficial reference during system verification.

Standard Verification methods:

• Design Review;
• Supplier Test;
• Supplier Analysis;
• Certification;
• Independent Test;
• Independent Analysis;
• Service Trial – TFT/TLT;
• Service Trial – OFT;
• In-Service Support Evaluation.

Standard Validation methods:

• Factory Test & Evaluation;
• System Integration Test & Evaluation;
• Technical Field Trials;
• Operational Field Trials;
• In-Service Support Evaluation.


It is both possible and recommended that progressive verification is carried out throughout the development of the system, with new evidence building on evidence which has already been gathered and evaluated. This provides the customer with early and increasing confidence in the performance of the system and offers a means of managing expectations.

BMT applied this approach for the acceptance activities during the TALISMAN project, although this was, in part, forced by programme delays. In this case, an acceptance review was held around

the time of the critical design review to check that the anticipated performance of the system of

systems would meet the high level requirements.

Determine the Evidence Needs (Detailed Trials and Tests Planning)

To determine the level of evidence required for acceptance, it is important to understand the need for the evidence, to ensure that the correct evidence is being gathered, and to consider the risks and benefits of collecting or not collecting it. To ensure that all aspects are given an appropriate level of detail throughout the collection of evidence, the following steps, as described in the T&E process, should be conducted:

• Identify the evidence gaps across all requirements, whether equipment related or not;

• Identify, agree and endorse the level of evidence required for all requirements;

• Identify and review existing evidence, including the consideration of evidence that

may not appear to be directly applicable;

• Identify remaining evidence gaps;

• Assess the risk associated with the need for evidence;

• Identify appropriate method for collecting evidence still required;

• Produce the necessary trials schedule;

• Conduct the trials;

• Evaluate the evidence against the acceptance criteria.
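The gap-identification and method-selection steps above can be sketched in code. The following is a minimal illustration only, not a BMT or MoD tool; the `Requirement` record, its field names and the example requirements are all hypothetical.

```python
from dataclasses import dataclass, field

# Hypothetical requirement record for illustration only; the field names
# are assumptions, not drawn from any MoD or BMT data model.
@dataclass
class Requirement:
    req_id: str
    text: str
    evidence: list = field(default_factory=list)  # references to evidence already held
    method: str = ""                              # planned V&V method, once chosen

def identify_evidence_gaps(requirements):
    """Step: identify remaining evidence gaps (requirements with no evidence)."""
    return [r for r in requirements if not r.evidence]

def assign_methods(gaps, chooser):
    """Step: identify an appropriate collection method for each remaining gap."""
    for r in gaps:
        r.method = chooser(r)

reqs = [
    Requirement("SR-001", "Vehicle shall ford 1 m of water",
                evidence=["Supplier test report T-17"]),
    Requirement("SR-002", "Crew shall mount and dismount within 30 s"),
]

gaps = identify_evidence_gaps(reqs)
assign_methods(gaps, chooser=lambda r: "Technical Field Trial")
print([r.req_id for r in gaps])  # only SR-002 still needs evidence
```

As each gap is closed, the evidence references would be appended and the record carried forward into the VVRM, preserving the traceability the process requires.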

As this process is followed, the VVRM should be populated with the evidence requirements, the

evidence as it is gathered and finally the acceptance arguments which will then lead to a fully

populated acceptance case.

A number of factors may influence the methods used for evidence collection, including whether there

is a need to conduct the activity in a controlled environment. The following are examples of possible

influencing factors which should be considered:

• Where the capability will be deployed;

• Safety and security;

• The need and feasibility for testing in a realistic environment;

• The use of simulated environments;

• The maturity of the system being tested;


• The use of destructive or non-destructive testing;

• System interfaces and availability;

• Availability of and confidence in a suitable test capability;

• Legal and regulatory requirements;

• Test duration and relationship to the critical path;

• Test cost;

• Whether ‘probability’ is an issue.

In addition to the type of evidence collection required, the use of certain test facilities is mandated

or advised by the MoD, including the Land Systems Reference Centre (LSRC) for land digitisation

integration in accordance with JSP 602. The Defence Test and Evaluation Strategy 2008 also

requires that MoD users of T&E use centrally-managed facilities unless a

compelling Business Case can be made for an alternative facility. The latest guidance should be

obtained from the Trials, Evaluation Services and Targets (TEST) Team during the early stages of

planning. The AOF recommends that the TEST Team’s Test and Evaluation Master Catalogue should

be used to identify alternative T&E capabilities should these be required.

Ranges and Test and Reference Facilities (T&RF) should be capable of emulating fielded systems

and real world conditions and can provide the following benefits:

• Cost savings against the expense of proving systems in the field;

• Avoidance of interference with operational activities;

• Contextual proving of systems before they are fielded;

• Safe and secure stressing of a system under test beyond limits experienced

in peacetime operations;

• Staged release of acquisition capability into service.

Most Land projects will follow current MoD policy of using the British Army Trials Development Units

(TDUs) for the ITEA tests and trials, although additional manpower requirements over and above those

which the TDUs can supply, such as Combat Medics for live firings, need to be considered. Early

liaison with the Trials Planning Office (TPO) for Regular Army Assistance to Trials (RAAT) troops is

therefore essential. There may also be a requirement for additional Army assets during the testing

and trials activities and the management of the availability of these assets should occur as part of

the detailed planning for each trial. For any elements of testing which involve human participants, the ethical aspects of the test in question must now also be considered. A statement addressing the ethical aspects of the programme must be included within the ITEAP, as mandated by JSP 536

(Ethical Conduct and Scrutiny in MoD Research Involving Human Participants).

In order to ensure their availability and to determine that they can be configured as required, potential

trials facilities should be identified and planned as early as possible. In particular, early planning is

required for supporting roles, operational trials, calibration and simulation and model validation.


ITEA trials & testing activities need to be planned in detail using Trial Directive Plans and Trial

Administration Orders (TAO) which include a detailed description of the planned test stages that will

be applied for the project. Working Group (WG) involvement, through planned meetings, will provide

advice on the scope of the testing and will enable the trial facilitator (or otherwise agreed agency) to

formulate the technical Trials Plan or Directive. Each Trial Directive will contain the detailed technical

planning & scheduling for the trial, and will provide strong guidance on the scope of work for the

agency(ies) undertaking the testing. Local detailed test plans and schedules are dependent on:

• Source and scope of test requirements;

• Test supplier and contractual agreement;

• Test items and resources (critical if more than one stakeholder group is involved);

• Test agency’s facilities;

• GFX needs.

This plan will be issued to Stakeholders in advance, in conjunction with a TAO. The TAO deals

with the organisational issues and non-technical planning of the test/trial including safety, resources,

assets and guidance to the agency on deliverable outputs. As a minimum, it should address

the following issues:

• Introduction/trials objective;

• Safety;

• List of test assets, prerequisites and GFE arrangements;

• Special conditions;

• List of agencies involved in testing and the coordination authority;

• Trial location(s);

• Special accommodation arrangements;

• Trial programme & schedule;

• Logistics, transport & vehicle drivers;

• Personnel list including contacts for errors, omissions and disruption to programme;

• Contractual conditions;

• List of reports & other deliverables.

Proposed programmes should be developed in discussion with the Sponsor, DLoD Owners,

Prime Contractor and other project Stakeholders with the responsibilities for planning, conducting,

evaluating and reporting on T&E being agreed between them. These responsibilities and their

agreement must then be captured in the ITEAP, appropriate contractual documents and Internal

Business Agreements (IBA).

In most cases, the aim of a trial will be to provide the Project Manager (PM), via the trial sponsor,

with formal evidence of testing of the equipment against a sponsor-supplied set of requirements.

The requirement list will form the trial agency’s scope of work and will include MoPs or satisfaction

criteria for each requirement. The trial agency’s response must address the results of testing

against each requirement.


It is the responsibility of the test agency to ensure that any draft Test Directives are agreed in advance

of the test/trial and to ensure that the trial technical aims, methods and collection of results are

sound, achievable within the trial schedule and conducted in a safe manner. However, the level of

effort required for successful trials & testing activities can place a heavy load on project staff for any

trials for which the Prime Contractor is not responsible. This is an area where specialist contractor

support may prove invaluable, such as that provided by BMT for the TALISMAN project.

For the ITEAP master scheduling, testing and trials booking dates must be secured with

Stakeholders as far in advance as possible, particularly those agencies who already have busy

schedules with MoD UOR projects. In BMT’s experience, it is also important to reschedule activities

as soon as possible should programme delays affect the trials schedule as this will ensure that

facilities can be used by other programmes if required.

The detailed Test/Trials planning will amplify and expand on the initial master ITEAP planning to

provide a tactical level of detail. Figure 7 below illustrates the process.

Figure 7: Test & Trials Planning

[Figure 7 shows a swimlane diagram of the test and trials planning process across the Equipment PM/Project Officer, Working Group and Test/Trial Facilitator roles: provide subject requirements information; collate and supply information; satisfy pre-requisites; organise assets and resources; issue the Trial Admin Order; agree the scope of the trial; formulate and issue the (technical) Trial Plan; execute the Trial Plan and produce the Trial Report; provide advice on fulfilment of requirements; and record and communicate acceptance verification.]


Detail the ITEAP

The ITEAP details the Project Team’s plan for achieving acceptance of the required capability,

specifying and executing the defined acceptance strategy against the planned project schedule.

This includes assigning responsibilities for test activities, determining any required customer

supplier agreements and monitoring the progress of acceptance throughout the development of the

system. It is therefore important that the ITEAP is maintained as a living document which is tailored

to the needs of the project throughout its life. The ITEAP will also cover those aspects of T&E that

are required to be conducted to inform other processes such as the development of training and

definition of safe operating environments.

The AOF suggests that the ITEAP should be initially drafted and then developed in an iterative

manner as the project progresses so that early scoping activities can be carried out and a better

understanding of the required acceptance activities can be obtained. Experience has shown that

applying an iterative process as suggested can produce effective and complete ITEAPs which

provide a suitable level of detail to which acceptance activities can be carried out. An eight

stage process is recommended, as shown in Figure 8 below, to ensure that all elements of the

system are considered and planned appropriately.

[Figure 8 depicts the eight-stage ITEAP development process as a cycle:

1. Identify Sources of Need for T&E;
2. Scope Outline T&E Requirements;
3. Assess Achievability;
4. Perform Risk Assessment;
5. Consolidate Tests to Form ITEAP, ITEA Schedule and VVRM;
6. Document and Cross-Reference in Project Documentation;
7. Develop Information Processes;
8. Optimise the ITEAP and Obtain Stakeholder Endorsement.]

Figure 8: ITEAP Development Process


In developing the ITEAP, it is important first to identify the sources of need, considering what is required not only for V&V of the requirements but also for design certification, technology readiness, contract acceptance, and meeting safety and environmental needs.

The outline T&E activities can then be scoped for each of the identified sources of need including

the development of individual test cases, detailed costs, priorities, responsibilities and dependencies

as required. Consideration should be given to the resources required to conduct T&E activities

including manpower, facilities, and the use of shared working environments, existing qualification

data and Government Furnished Assets (GFX). As the URD and SRD start to evolve, these

requirements should be reviewed to ensure their continuing appropriateness and thought given to

the impact which activities may have on any related asset or contract delivery schedules.

In order to manage the cost associated with ITEA activities, a balance should be obtained between

the costs of reducing uncertainty and the level of risk reduction possible with full test coverage. ITEA

management is a complex task which requires a combination of specialist knowledge, experience

and judgement and in order to be successful, Stakeholders and SMEs should be engaged early on

to ensure that significant issues are not overlooked. These may include the following, for example:

• User;

• TPO;

• Trials Organisations;

• The Independent Technical Evaluator (ITE);

• Defence Ordnance Safety Group (DOSG);

• ISS Network Technical Authority (NTA);

• DLoD Owners;

• Internal Stakeholders;

• Contractors.

Following this scoping activity, it is important to assess the achievability of the T&E activities. This

assessment must include the assessment of the estimated cost against budget, schedule to critical

path milestones and the demand for resources against their availability. Any issues should be

highlighted as specified within the project risk management documentation.

Although the Sponsor typically owns the ITEAP, it is common for it to be managed within the project

team by the Requirements Manager or an Acceptance Manager. The Requirements Manager

manages and processes inputs and evidence from the relevant DLoD owners to ensure the

capability can be achieved and escalates issues to the Sponsor through the Programme Board for

resolution if required. To relieve some of the workload of the Project Manager and Requirements

Manager, ITEA Management is another area where specialist contractor support may be of great

benefit. BMT have experience in carrying out this role for a number of projects, including for each

of the projects discussed in the Case Studies section of this paper.


Additionally, a risk and opportunity process should be defined in the ITEAP to provide Stakeholders

with a means of raising new risks and opportunities and a risk assessment carried out, including risks

which arise due to assumptions made. The following should be considered as a minimum:

• Non-availability of test articles and facility on the day of test;

• Failure of (integration) tests;

• Consequent collapse of project management plan.

Possible mitigation actions should be identified which may be implemented if required. It is therefore

recommended that:

• The uncertainty of cost and schedule estimates is quantified;

• Best practice is used, across MoD and Industry, to build the ITEA schedule;

• The implications of schedule uncertainty on resource availability are considered;

• The use of operational SMEs is maximised at the design stages;

• SMEs are involved in the early stages of ITEA; and

• There is sufficient use of incremental verification to ensure there are no surprises

at completion of manufacture.

Where Government Furnished Assets (GFA) are involved it is important that the financial

implications are understood by both the supplier and MoD as dependence on GFA is a risk that

needs to be managed.

Once these initial stages have been carried out, it is possible to consolidate the T&E activities

and document them within the ITEAP, ITEA schedule and VVRM. Details of the content of each of

these elements can be found below. It is important to maintain links between the tests which are

to be completed and the requirements which they will satisfy to ensure traceability throughout.

This traceability will be important as the V&V activities progress and evidence is gathered of the

system’s compliance with requirements. BMT’s experience in this area has, on occasion, identified

unnecessary testing which had been included simply because the test agency had always done it.

By removing these additional tests, the workload and cost of test activities can be reduced, without

affecting the overall integrity or capability of the equipment being tested.

A number of steps can be taken to ensure the robustness of the ITEAP, including:

• Refinement of the dependencies and sequencing to resolve bottlenecks including

updates to the project master schedule as required;

• Identification of which body will be required to evaluate evidence, and when;

• Development of the plan to include all sources of test and contributing organisations;

• Explicit definition of accountabilities and liabilities;

• Checking whether the ITEAP protects any relevant Intellectual Property Rights (IPR)

in test results and whether there will be any problems with accounting, liability and

insurance where assets are ‘loaned’ across organisation boundaries;

• Checking to ensure that each required facility can operate at the required security level.


To prevent duplication of effort and ensure that the most up to date information is available, the

ITEAP should document and cross-reference the Stakeholder Responsibility Matrix (SRM), GFX Plan

and Asset and Contract Delivery Schedules as appropriate. It is however recommended that when

referring to another document, an overview of the content being discussed is given in the referring

document.

It is important to develop formal processes and procedures regarding the management of test and

evaluation activities which cross organisational boundaries including the collation and evaluation

of V&V evidence, the management of disputes, test facilities and GFX, and integration and

interoperability testing. These processes must include the incorporation of the non-EC DLoDs into

validation testing and might cover areas such as:

• The verification of individual DLoDs;

• The collation and management of evidence;

• The evaluation of evidence, and the conduct of acceptance decision-making;

• The management of disputes, provisos, and remedial action;

• The management of test articles;

• The management of test facilities and availability problems;

• The management of GFX and availability problems;

• Installation and test;

• Integration and interoperability testing;

• The Test, Evaluation and Acceptance of UORs.

Finally, the ITEAP can be optimised and stakeholder endorsement obtained. This should include

regulatory Stakeholders such as:

• Quality Assurance;

• Air and Sea worthiness if appropriate;

• Safety;

• Environment;

• Security.

Any conclusions drawn in the achievability and risk assessment stages should be reflected in the

ITEAP. This can then be baselined and published, along with the ITEA Schedule, and maintained in

line with project configuration control processes. The SRM, Asset and Contract Delivery Schedules

and other dependent plans should also remain aligned with the ITEAP and ITEA Schedule. A

copy of each project’s ITEAP needs to be logged with the TEST Team who manage the UK Test

and Evaluation capability portfolio and hold responsibility for managing the future T&E capability

requirements on behalf of Cap JTES.


ITEAP STRUCTURE

Although the above section details the processes which are recommended for developing an ITEAP,

it is important to structure the document, as with all project documents, in a way which promotes

readability. It is suggested that the structure determined by the AOF is used where possible to

maintain consistency between projects. This structure is as shown in Figure 9 below with additional

detail on selected areas provided in the following sections:

Stakeholders and Working Groups

A CIWG will provide the overall assessment of the capability being procured across all the DLoDs

to the relevant Capability Lead. BMT has attended a number of CIWGs in support of Manoeuvre

Support Team (MST) and Protected Mobility Team (PMT). Usually chaired by a full Colonel with

full representation by Stakeholders, a CIWG is an authoritative, dynamic entity with considerable

influence over the destiny of a capability.

[Figure 9 shows the recommended ITEAP structure as a hierarchy. Top-level sections include: Strategic Context & ITEA Objectives; Military Capability Context; DLoDs and Capability Integration; Project ITEA Schedule & Milestones; ITEA Stakeholders & Organisation; Test and Evaluation Strategy; Risks, Assumptions & LFE; Stakeholder Responsibility Matrix; VVRM or VVR Management Plan; and ITEA Annexes. Sub-elements cover areas such as aim and objectives, project strategy, system description, the T&E process and schedule, trials programme, requirements management, evidence management and evaluation, verification methodologies, combined testing, contractual requirements, resources, GFX, stakeholders and responsibilities, the acceptance strategy, process, milestones, organisation and criteria, DLoD T&E, LFE, and ITEA risks, opportunities, assumptions and organisation.]

Figure 9: ITEAP Recommended Structure


Stakeholders assigned as DLoD Owners will provide DLoD maturity readiness evidence to the CIWG.

The DE&S Project and Requirements Managers (RM) will also contribute evidence, their principal

viewpoint being the satisfaction of requirements within the Equipment and Logistic DLoDs, as

defined in the SRD and the Integrated Logistical Support Statement of Work (ILS SOW)5.

Specialist WGs/Panels may be formed to deal with specialist Equipment Capability and any other

DLoD acceptance assessment issues, and will typically be engaged throughout the test and

acceptance phase. Specialist WGs may include some or all of the following:

• CIWG;

• ILS WG;

• Safety and Environment WG (Safety Panel);

• Electromagnetic Environmental Effects (E3) Working Group. Members may include

Defence E3 Authority (DE3A), Dstl, DOSG and 3rd Party trials managers

(e.g. Electromagnetic Compatibility (EMC) test establishments). Having chaired the

E3WG for a number of projects, BMT has found it to be an invaluable forum for trials

planning as well as assessment of evidence.

Specific System Requirements may be allocated to a WG when specialist advice is required for

verification. It will then be the responsibility of the WG to assess evidence from trials and other

sources to determine if each requirement MoP has been fulfilled. This delegation is a very effective

way of avoiding bottlenecks in the acceptance process and is a more effective utilisation of

Stakeholders’ valuable time.

The Terms of Reference for WG members will typically include:

• To provide specialist requirement verification advice to the Project within the remit

supplied by the WG Chair. All safety issues arising are considered to be within the remit;

• To recommend any amendments to requirement Verification/Validation Criteria,

MoPs and requirement priorities;

• To provide technical trial planning as input into the ITEAP;

• To assess and recommend acceptance or rejection of evidence gathered to demonstrate

the acceptance of requirements.

Other key Stakeholders who may need to be consulted include:

• The TPO as the point of contact for the allocation of trials tasks to the Land TDUs,

and for manpower allocation through RAAT. BMT has regular contact with the team

who runs the office, and has experience of completing the TPO Form 1 (Trial Proposal

and Tasking Proforma). BMT has worked with 6 of the 7 TDUs over the last 3 years and

has found that they are an essential part of the trials and acceptance process. As well as


5 There is often duplication between the ILS SOW and the SRD. In BMT’s view the ILS section of the SRD should be restricted to equipment design issues that affect the support solution such as supportability, maintainability, reliability etc rather than ILS processes and deliverables that support the project management of procuring the support solution.


running specific trials to gather acceptance evidence, they can offer invaluable user

input to the design process;

• The TEST team who publish the T&E Catalogue which has been compiled with the

co-operation of teams in the Defence community, including the MoD, Industry and

academia, and lists the wide range of evaluation capabilities available to the UK;

• The Test and Evaluation Coordination Cell as part of the TEST team, who compile a

Master List of T&E Activities which may be consulted by projects considering the

availability of trials facilities;

• The Arms and Services Directors (ASDs) who are responsible for providing policy,

direction and professional advice to their respective Arm or Service and to sustain the

delivery of military capability from the Army both now and in the future. One ASD will be

appointed as the lead for a new capability, but others may well be involved for specialist

areas. BMT has worked successfully with a range of post holders within EinC(A),

DRAC and Dinf.;

• The DE&S Operational Vehicles Office (OVO) who determine priorities for vehicle

programmes and will mediate when there may be a programming conflict for trials

facilities between different projects.

Linkage of Requirements Management to Acceptance

As mentioned previously, traceability between requirements and acceptance evidence needs to be

maintained throughout system development. BMT uses the IBM Rational DOORS® requirements

database version 9.2 with the KEYPAQ add-on as the principal tool for all requirements and

acceptance management. The DOORS®/KEYPAQ system allows for electronic links to be created

between information to provide traceability and can be used to import and export the requirements

set in Microsoft Office document formats so that this method can be used for projects with little or no

access to the DOORS® software.

The following requirement attributes are used in the SRD to provide sufficient information for linkage

to acceptance:

• Measure of Performance: Text and numerical data specifying the required effectiveness

or performance envelope of each requirement. The lower ‘Threshold’ figure indicates

the level of performance below which the requirement ceases to provide an acceptable

level of benefit. The upper ‘Objective’ figure indicates the level above which there is

no justification, financial or otherwise, for further improvements in performance.

There is no need to assign both upper and lower targets to every requirement;

• Verification Criteria: These attributes list the primary criteria against which it will be

demonstrated that the requirement has been met, and the primary methodology to be

used to demonstrate achievement of the requirement;

• Verification Authority: This attribute shows who is responsible for the final decision on

whether the requirement criteria are validated/verified against the evidence supplied

from acceptance activities. Where specialist advice may be required, for example in


interpreting evidence, a specialist WG/Panel may be nominated. Although this attribute

is not mandated by the AOF, it has proved to be very useful in BMT’s experience,

especially throughout the TALISMAN project.
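The Threshold/Objective logic described under Measure of Performance can be illustrated with a short sketch. This is an interpretation of the attribute as described above, not project code; the function name and the road-speed figures are assumptions for illustration, and the sketch assumes higher measured values are better.

```python
def assess_mop(measured, threshold=None, objective=None):
    """Classify a measured value against hypothetical Threshold/Objective
    MoP figures (assumes higher values are better). Below the Threshold the
    requirement ceases to provide an acceptable level of benefit; at or above
    the Objective no further improvement is justified. As noted above, either
    figure may be absent for a given requirement."""
    if threshold is not None and measured < threshold:
        return "below threshold"
    if objective is not None and measured >= objective:
        return "meets objective"
    return "acceptable"

# Illustrative road-speed requirement: Threshold 80 km/h, Objective 100 km/h
print(assess_mop(92, threshold=80, objective=100))   # acceptable
print(assess_mop(74, threshold=80, objective=100))   # below threshold
print(assess_mop(105, threshold=80, objective=100))  # meets objective
```

A result between the two figures is acceptable but leaves headroom; this is the kind of evaluation a Verification Authority or nominated WG would apply to the evidence supplied.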

VVRM

Test activity, like experimentation and analysis, generates volumes of data and meta-data that must be transformed into information and evaluated to create knowledge and inform acceptance decisions. That data, information and knowledge must be consolidated, distributed, used and

archived. The VVRM provides a method of tracing the requirements through to the relevant tests and

trials which are to be conducted in order to determine whether the system meets the stakeholder

needs and should be accepted. There should be references to:

• User and System Requirements;

• Evidence and information, including tests and results;

• Acceptance criteria and recommendations;

• Responsibilities;

• Progress and milestones.

The VVRM will usually be managed by the Authority, although it is accepted that the Prime Contractor

will typically be heavily involved in its completion with respect to the contracted requirements. To

ensure completeness, it is important to ensure that the VVRM covers the V&V of any non-contracted

requirements as well as the validation of the URD.

Typically this will be developed in DOORS® or Microsoft Excel®, but the key aspects are that the

information should be displayed in a table, as this shows the progression of each requirement

through the development of the system in an easy to view manner, and that traceability can be

demonstrated and maintained between the requirements and the associated information.

Traceability management can prove extremely complex and any methods which can be utilised

to reduce this are highly recommended. Reference to TRLs and System Readiness Levels (SRLs)

may be useful to demonstrate progression of the system.

The headings required within the VVRM can vary depending on the project, although it is possible to

combine information related to test procedures and their results with the V&V information to minimise

duplication of effort. A possible VVRM layout is shown in Table 1 below:


| URID | User Requirement | SRID | System Requirement | Test Procedure | Expected Outcome | Test Results | Responsibility | References |

Table 1: Example VVRM
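A VVRM along the lines of Table 1 can be held as simple tabular records, with traceability queries run over it. The sketch below uses the example column set from Table 1; the rows, IDs and test references are invented purely for illustration.

```python
# Columns follow the example VVRM layout in Table 1; row content is invented.
VVRM_COLUMNS = ["URID", "User Requirement", "SRID", "System Requirement",
                "Test Procedure", "Expected Outcome", "Test Results",
                "Responsibility", "References"]

vvrm = [
    {"URID": "UR-01", "SRID": "SR-001", "Test Procedure": "TP-12",
     "Expected Outcome": "No water ingress", "Test Results": "Pass",
     "Responsibility": "ILS WG", "References": "Trial report TR-12"},
    {"URID": "UR-01", "SRID": "SR-002", "Test Procedure": "TP-13",
     "Expected Outcome": "Mount/dismount < 30 s", "Test Results": "",
     "Responsibility": "CIWG", "References": ""},
]

def outstanding(rows):
    """System requirements whose test results have not yet been recorded."""
    return [r["SRID"] for r in rows if not r.get("Test Results")]

def trace(rows, urid):
    """System requirements and tests tracing back to one user requirement."""
    return [(r["SRID"], r["Test Procedure"]) for r in rows if r["URID"] == urid]

print(outstanding(vvrm))    # system requirements still awaiting results
print(trace(vvrm, "UR-01")) # progression of UR-01 through its tests
```

Whether held in DOORS®, Microsoft Excel® or otherwise, the essential properties are the same as here: each requirement's progression is visible row by row, and traceability from user requirement to evidence can be demonstrated on demand.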


ITEA SCHEDULE

The ITEA Schedule is part of a Project’s Master Planning Schedule and shows the expected dates

and durations of key project and acceptance activities and milestones, including when resources

are required. As with the ITEAP, this schedule shows the pan-DLoD activities throughout the lifecycle and will be a living document: especially during the initial stages of the project, activities are likely to change as the understanding of what is required develops, and opportunities to run a number of activities concurrently may become apparent. At Initial Gate (IG) the schedule is likely to

include a number of aspirations identifying key T&E milestones and the possible resources required

rather than specific dates or events. At Main Gate (MG) these dates will be more precise providing

a schedule which contractors and DLoD owners can contract/plan against. By this stage, all assets

required throughout the life of the programme should have been identified and costed within the

ITEAP.

The key acceptance stages for a typical Land project are outlined below and should be identified

within the ITEA Schedule:

• Sub-Systems Factory Acceptance Testing;

• Suppliers Contract Acceptance;

• System Acceptance;

• Characterisation (Operational) Trials;

• Logistics & other non-EC DLoDs Acceptance;

• Capability Acceptance.

The TEST Team maintains a Master Schedule covering the usage of all UK T&E capabilities over a period of 20 years, and a catalogue of all UK facilities (MoD and civilian), both of which should be consulted when determining the ITEA Schedule: understanding the availability of the assets required to conduct activities enables a more realistic schedule for the acceptance of the project capability into service to be developed.

It is recommended that the ITEA Schedule is produced and managed within Microsoft Project, with a baseline created so that the initially planned activities can be compared against the actual dates and durations of the activities as they are carried out.
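The baseline comparison described above can be sketched as follows. This is an illustrative sketch only: the activity names echo the key acceptance stages listed earlier, and all dates are invented.

```python
# Illustrative sketch: comparing baselined ITEA activity dates against
# actuals to surface slippage, as a Microsoft Project baseline would.
# Activity names echo the key acceptance stages; all dates are invented.
from datetime import date

baseline = {
    "Sub-System FAT": date(2011, 3, 1),
    "System Acceptance": date(2011, 6, 15),
    "Characterisation Trials": date(2011, 8, 1),
}
actual = {
    "Sub-System FAT": date(2011, 3, 10),
    "System Acceptance": date(2011, 6, 15),
}

def slippage_days(baseline, actual):
    """Days late (+) or early (-) for each activity completed so far."""
    return {name: (actual[name] - planned).days
            for name, planned in baseline.items() if name in actual}

print(slippage_days(baseline, actual))
# {'Sub-System FAT': 9, 'System Acceptance': 0}
```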

Collect Evidence (Execute ITEAP)

Evidence collection activities occur for all requirements across all DLoDs. Although many of these

activities may just require confirmation that a specific activity has taken place and been signed off

by the appropriate organisation, Equipment DLoD evidence collection activities, such as tests

and trials, tend to be more complex and typically require some sort of Demonstration, Analysis,

Test or Inspection.


Once the ITEAP has been developed, the plan can be executed with trials facilities being booked,

tests carried out, evidence collected and results and evidence collated within the VVRM. Tests should

be carried out at a number of relevant test points, including design reviews and development

milestones, and the progress of DLoDs should be managed and recorded within the ITEAP as

the acceptance process progresses.

BMT has organised the MoD and independent trials elements of various projects using MoD facilities

as recommended by the Subject Matter Expert (SME) Stakeholders, with support from industry test

facilities such as MIRA where necessary. When production programmes slip, a lot of effort is required

to reschedule trials to meet the delayed programme. It is worth considering contracting this support

to the Prime Contractor in an attempt to make them bear the consequences of delivery delays.

Following execution of the trials, the test agency's (or agencies') reports will be issued as soon as possible to the PM, whereupon, if necessary, the WG will be consulted regarding the interpretation of the results and whether the requirements have been fulfilled. A number of lessons learnt during the execution of ITEA activities are given below.

Factory Acceptance Tests

Thorough Factory (or Production) Acceptance Tests (FATs) of every unit will support the supplier's

Quality Assurance process prior to delivery to the customer. For a newly developed system, the test

documentation to support the FAT will itself need significant development effort. Lessons learnt

about FATs include:

• Production schedules will inevitably be under pressure, but sufficient time needs to be

allowed for setting to work and other preparations for acceptance, otherwise these will

end up being done as part of testing and cause a lot of disruption. Preparations for

acceptance can be included as a check-list in the FAT document;

• Software functionality testing can be very time-consuming if every function is to be tested. Full functional testing only needs to be done for each new software release, rather than on every production unit. However, testing a subset of the platform's software functionality can be a useful method of confirming that interconnecting cables have been correctly installed;

• Representing the customer, BMT has sat alongside the supplier during early FATs to

develop confidence in the supplier’s test processes. BMT provided a number of

observations on the FAT documentation which the supplier acted on. Once that

confidence was established it was possible to limit oversight to inspection of the

completed FAT documents with occasional ‘dip-tests’;

• It will normally be possible for the customer to sign off a significant number of system

requirements during a FAT, including software functionality and visual inspection of

installations. BMT has regularly represented the customer, calling on the RM or the

TDUs for specialist input where necessary;


• The MoD Quality Assurance Representative (QAR) is a key stakeholder in the

acceptance process, and it is essential to engage with them when developing the

FAT process and documentation.

However exhaustive the FAT process and Quality Inspections, DSDA 932 inspections may still

pick up some deficiencies; the production schedule should ideally contain contingency for

rectification and re-inspection.

Scenario Based Acceptance

It is always worth considering the value of a systems view and looking at capability evaluation based

on realistic scenarios. This means testing the overall system (or system of systems) and capabilities

as well as atomic requirement testing at lower equipment or subsystem levels. The scenarios may

involve some level of simulation and emulation.

This system level testing may reveal previously unknown interactions between supposedly

independent functions, which would not be revealed by testing the atomic system requirements.

BMT helped to gather a great deal of TALISMAN acceptance evidence during Exercise Comanche

Charge run by EinC(A) and Royal Engineers Trials and Development Unit (RETDU) in Jordan in

Autumn 2009. Although this was a unit-level training exercise for the TALISMAN squadron that was about to deploy the capability to theatre, the acceptance activities covered most of the DLoDs and included:

• Acceptance testing on a number of vehicles prior to handover to the squadron due to

lack of time in UK prior to shipping;

• Sign-off of about half the functional requirements by RETDU, in representative light levels,

dust and heat. One component of the TALISMAN capability, the High Mobility Engineering

Excavator (HMEE), performed exceptionally well, transforming previously negative

impressions formed during trials in the UK in unrepresentative conditions;

• Feedback on maintainability by the Royal Electrical and Mechanical Engineers (REME)

unit in support;

• Interoperability testing that was not possible in UK revealing a significant problem.

BMT was tasked to organise an investigation back in the UK, co-ordinating a number of

SMEs and test facilities; this identified quick fixes that could be implemented in time for

the theatre deployment;

• Assessment of the effectiveness of concepts and doctrine embodied in the TTPs used

by the squadron;

• Assessment of the TALISMAN organisation and manning;

• Assessment of the individual and unit level training developed for TALISMAN;

• DOSG testing and clearance for smoke-dischargers.


Evaluate and Recommend (Evidence Management)

The management of evidence throughout the acceptance process is itself iterative and should be applied and reviewed throughout the development of the system. The acceptance

evidence should be built up within the VVRM and results from tests and trials assessed by

appropriate Stakeholders and SMEs, with compliance statements recorded as appropriate.

As mentioned previously, the progress of DLoDs should be maintained within the ITEAP and ITEA

Schedule with any issues highlighted as they arise, including the need for further evidence collection.

A number of trials reports may be beneficial to aid evidence management including:

• First Impressions Report (FIR);

• Test / Trial Report;

• Exception Report;

• Other DLoD Reports.

Following the assessment of these results, a period of evidence evaluation needs to be carried out.

This needs to be scheduled into the ITEA schedule early on in the project as there is often a need

for several different agencies/Stakeholders to review the same evidence. Sufficient time must be

allocated for this to happen and for the production of the appropriate reports.

Where there is a non-compliance, the report must clearly identify the gap between the achieved

performance and the performance required, the impact of the non-compliance on system

performance, and any remedial action required. Agreement should be sought from Stakeholders

and alternative options discussed where agreement is not possible. These options may include:

• Re-design;

• Quality investigation;

• Modification;

• Trade off;

• Re-trial or re-plan.
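The reporting requirement above can be sketched as a minimal non-compliance record. This is an illustrative sketch only: the field names are assumptions, the option names paraphrase the bullet list (splitting "Re-trial or re-plan" into two entries), and all example values are invented.

```python
# Illustrative sketch: a minimal non-compliance record capturing the gap,
# the impact and the remedial action. Field names are assumptions and the
# option names paraphrase the bullet list above; example values are invented.
from dataclasses import dataclass

OPTIONS = {"Re-design", "Quality investigation", "Modification",
           "Trade off", "Re-trial", "Re-plan"}

@dataclass
class NonCompliance:
    sr_id: str
    required: str            # performance required
    achieved: str            # performance achieved
    impact: str              # effect on system performance
    remedial_action: str     # agreed option, drawn from OPTIONS
    stakeholders_agreed: bool = False

    def __post_init__(self):
        # Reject options outside the agreed set.
        if self.remedial_action not in OPTIONS:
            raise ValueError(f"Unknown option: {self.remedial_action}")

nc = NonCompliance("SR-017", "Range >= 500 m", "Range 430 m",
                   "Reduced stand-off distance for the operator",
                   "Re-trial")
print(nc.remedial_action)  # Re-trial
```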

Figures 10 and 11: Exercise Comanche Charge


System failures, component failures, operability issues, maintainability issues, reliability issues and

safety concerns are all examples of incidents that may be identified during acceptance activities. It is important that they are captured as early as possible, sentenced appropriately and acted upon. BMT

has supported full incident sentencing using codes in accordance with Def Stan 00-42, and also

has developed a simpler process that is more appropriate to the timescales of a UOR. For more

information on this process, please contact BMT using the contact details at the end of this paper.

BMT has often found that the user or tester will identify potential design improvements that are

outside the scope of the endorsed requirements. These should still be logged as incidents but

may be sentenced as ‘Capability Improvements’ that can be addressed if funds allow, or in a future

capability increment.

The pace of a UOR programme means that trials and testing may reveal some endorsed

requirements that cannot be met. Some form of concession or requirement trade will need to be

sought at the appropriate level, supported by an impact statement and proposed way ahead.

Acceptance Declaration

At each of the key milestones throughout the project, a recommendation should be made to the

Acceptance Authority. These milestones shall include:

• Contract Acceptance;

• Systems Acceptance;

• In-Service Date (ISD);

• Initial Operating Capability (IOC);

• Full Operating Capability (FOC);

• Any agreed Equipment Delivery Dates (EDDs).

Sub-System Contractual Acceptance

Acceptance for sub-systems will be achieved when compliance has been demonstrated by the

suppliers through providing performance evidence against the subset of System Requirements,

using a VVRM, and depends upon the completion of all sub-system integration test and

trials activities.

The VVRM will be generated by the Project Manager from the System Requirements set and supplied to the supplier in either DOORS® or spreadsheet format. Sub-system suppliers will record the compliance status, achieved performance and references to evidence for each sub-system requirement listed, and return the completed VVRM to the PM.


System Acceptance

Final System Acceptance is achieved when compliance has been demonstrated, with evidence, against all the individual requirements of the SRD.

The DOORS® database will have been populated with all the information necessary to ascertain the

status of each system requirement, and will be used to generate reports listing the tests and trials achieved, with summary results.

The PM or RM should retain a pack of acceptance evidence as an electronic folder. This can be

managed by BMT on their behalf if required, and will contain electronic copies of reports and other

evidence documents that are referred to within the DOORS® requirements set as requirements

satisfaction evidence.

Capability Acceptance

The FOC Acceptance milestone is the final milestone and marks the culmination of the ITEA activities.

Capability acceptance comprises the final analysis, reporting and presentation of the acceptance case to the Capability lead by the RM. This presentation can be made when:

• Any System Acceptance and Operational (characterisation) re-testing is completed;

• Capability and safety issues and risks have been reviewed and mitigated where possible;

• A full set of requirements satisfaction evidence has been collected and entered into the

DOORS® database and is available for review;

• The Safety Case has been issued;

• Development of the non-EC DLoDs (Training and Logistics) is sufficiently matured to

support the IOC.
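Taken together, the conditions above amount to a simple readiness gate, which can be sketched as follows. The condition names paraphrase the bullet list and the states shown are invented.

```python
# Illustrative sketch: the capability acceptance conditions as a gate check.
# Condition names paraphrase the bullet list above; the states are invented.
conditions = {
    "re-testing complete": True,
    "risks reviewed and mitigated": True,
    "evidence set complete in DOORS": True,
    "safety case issued": False,
    "non-EC DLoDs sufficiently mature": True,
}

# The acceptance case can only be presented when nothing is outstanding.
outstanding = [name for name, done in conditions.items() if not done]
ready_for_presentation = not outstanding

print(ready_for_presentation, outstanding)  # False ['safety case issued']
```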

The PM will need to supply the RM with sufficient information for the final Capability Acceptance report; the inputs are expected to comprise:

• Status of the satisfaction of all Key User (including non–EC DLoDs) and

System Requirements;

• Any Concessions raised against supplier-delivered sub-systems (equipment);

• Any delivered equipment issues, in plain English, regarding safety and capability for

linkage into any CIP. This completes the feedback loop described in BMT’s ‘FAST’

Acquisition White Paper.


SUMMARY

BMT has provided trials, evaluation and acceptance support to a number of Land UOR projects

in recent years. This paper combines the lessons learnt during these support activities with MoD

guidance to provide a number of recommendations on how best to proceed on future projects.

Using the information given within this paper, it is possible to achieve a robust and comprehensive

ITEAP, ITEA Schedule and VVRM, ensuring that all aspects of a system that need to be tested or

evaluated are considered with processes and evidence managed accordingly. The BMT ‘lessons

learnt’ highlight some of the key issues surrounding ITEA and suggest methods for overcoming

related problems during future projects.

Conclusion

Whilst these lessons have been learnt in the execution of UOR programmes, they can be applied

to any project. It is certainly worth considering contracting out the detailed management of trials,

evaluation and acceptance, allowing the MoD PM and RM to concentrate on and direct the high level

programme issues.


CASE STUDIES

Project TALISMAN

BMT staff provided requirements, trials and acceptance support to the MST for Project TALISMAN, a

complex Route Proving and Clearance (RP&C) Capability involving the integration of 3 different types

of manned vehicle, an unmanned ground vehicle and an unmanned aerial vehicle.

In the first phase of the work, BMT produced the system requirements for the various platforms comprising the overall capability, and developed a robust ITEAP. During the second phase, BMT

matured the ITEAP, liaising with trials and test organisations and platform contractors in order to

develop a robust trials schedule and the processes to accept the RP&C military capability. Firm

linkages between the ITEA schedule and training schedule were established.

In the final phase, BMT provided extensive programme, trials and acceptance support. BMT

proactively managed the trials programme to overcome production delays and other technical issues

with the result that all major programme milestones were achieved. Members of BMT were among

individuals receiving special praise from MST for their hard work and professionalism in organising

trials and technical solutions to resolve a late emerging mutual interference problem. BMT built

strong relationships with the full range of MoD trials organisations (TPO, ATDU, RETDU, CSSTDU, JADTEU etc) and commercial trials organisations (MIRA, QinetiQ etc), as well as MoD technical experts in DOSG, Dstl, FPDT, DE3A etc.

Figure 12: TALISMAN undergoing trials

Explosive Line Charge

The Explosive Line Charge project addressed the need for rapid breaching of mine/IED obstacles in

support of high tempo operations and involved a large number of Stakeholders across MoD (MST,

EinC(A), EOD, JEODS, DSTL, 26&33 ENGR Regt). BMT provided the following deliverables:

• KUR/SRD: Following a series of stakeholder meetings, BMT established a comprehensive

set of KURs, and associated System Requirements for endorsement;

• Acceptance Matrix: A comprehensive VVRM was produced for MST in order to enable

the subsequent capture and evaluation of trials evidence against the complete

requirements set;

• Market Survey Report: In support of the associated trials programme, BMT led a market

survey identifying the key COTS/MOTS solutions and evaluating their performance

against the agreed requirements;

• Test Specification: A detailed test specification was generated to enable the evaluation

of down-selected stores against common criteria, and specifically capture evidence

to populate the VVRM.

Soldier Short Gap Crossing

BMT staff supported the MST Project Manager throughout the lifecycle of the capability, from initial capability requirements elucidation and clarification, through a down-select assessment phase (with equipment trials of concepts), into dialogue with the preferred bidder to generate the fielded

Soldier Short Gap Crossing solution.

MoD Projects

In support of specific MoD projects, BMT provided trial support at very short notice. In only one

week BMT had identified a suitable trials site and developed an assessment scheme for the three

candidate systems based on the User Requirements and liaison with Stakeholders. The following

week BMT supervised trials on consecutive days which included managing significant safety

aspects. Two weeks later, after analysing the results, BMT issued a report summarising the findings

to support the Business Case.

BMT staff also supported the development of the SRD for a specific MoD project. The SRD was

initially based on the requirements for a similar Land platform programme, also developed by BMT.

BMT then developed and conducted the trials plan for the User Acceptance Trials in conjunction with

ITDU and the User. As trials reports were published, BMT collated the evidence against the system

requirements to support the acceptance case.


ABOUT THE PUBLISHERS

BMT Group

BMT is an international design, engineering and risk management consultancy, working principally in

the defence, energy and environment, marine risk and insurance, maritime transport and ports and

logistics sectors. BMT invests significantly in research. Its customers are served through a network of

international subsidiary companies. The assets are held in beneficial ownership for its staff.

Web site: www.bmt.org

Systems Engineering at BMT Defence Services

The capability of the defence environment is maintained through the interaction and collaboration of

a complex series of systems. Success depends on performing the right activities at the right time in a

simple, effective and cost-effective manner. BMT Defence Services understands this completely and

so we bring a structured and unbiased view to this complex world.

With systems engineering we understand that no two systems have the same set of objectives and

so we combine innovation with our in-depth knowledge. This gives our customer confidence in our

ability to deliver and maintain affordable, safe and capable systems. When designing and

maintaining capable Combat Systems, our unique whole-platform, whole-life perspective makes

sure that our designs continue to work in harmony with platform and operational needs through life.

We develop, supply and support Information Systems that are tailor-made to our customer’s

requirement. Our awareness of our customer’s procedures, organisation and information

management needs helps us ensure our systems are both affordable and effective whilst

complementing and enhancing their processes.

Web site: www.bmtsystemsengineering.co.uk or www.bmtdsl.co.uk

Contact: Mark Stanton, Head of Systems Engineering, [email protected]

Paper written by Stephen Walters and Cara Perks.

COPYRIGHTS AND ACKNOWLEDGEMENTS

© BMT Defence Services Limited 2011.

Diagrams and images are the property of BMT Defence Services and should not be reproduced without prior permission.
