Ballistic Missile Defense System Performance Assessment is a Wicked Problem
INCOSE – Chesapeake Bay Chapter Dinner Lecture
15 February 2012
Clyde Smithson



Background: Definitions of some terms used in this brief

System: A functionally, physically, and/or behaviorally related group of regularly interacting or interdependent elements; that group of elements forming a unified whole [JP 1-02 & JP 3-0, DOD SE Guide to SoS].

System of Systems: An SoS is defined as a set or arrangement of systems that results when independent and useful systems are integrated into a larger system that delivers unique capabilities. Both individual systems and SoS conform to the accepted definition of a system in that each consists of parts, relationships, and a whole that is greater than the sum of the parts; however, although an SoS is a system, not all systems are SoS. [DoD, 2004(1), DOD SE Guide to SoS]

Complex Systems [Russ Abbott, Complexity, 2007]:
“They are multi-scalar
These systems are often best understood in terms of layered architectures
These systems are physical … not just mathematical abstractions
These systems generally have no predefined bottom level
These systems often take the form of … systems of systems
These systems are generally intimately entangled with their environment
These systems are typically both deployed and under development at the same time”

BMDS Performance Assessment Problem
“Judge a man by his questions rather than his answers.” – Voltaire

System Performance Assessment of the operational BMDS means assessing the capability of the system in terms of metrics such as Probability of Engagement Success, Defended Area, Launch Area Denied, Operating Area, and others. To date there have been a number of problems in producing a credible System Performance Assessment:
• The operational system cannot be tested across the range of potential conditions
• Modeling and Simulation (M&S) is MDA’s approach of choice. There are many M&S of various resolution, scope, and fidelity, but there are no accredited M&S to perform the assessment
• There are disparate sets of stakeholders and test venues performing tests, collecting data, and executing analysis on the system
• There is no central methodology being applied to produce the performance assessment [This last comment is circa 2010; MDA has spent considerable effort to address this issue in recent years]

BMDS Performance Assessment Goals

The 2010 Ballistic Missile Defense Review established the following as one of six policy priorities:
• “Before new capabilities are deployed, they must undergo testing that enables assessment under realistic operational conditions.” [1]

In the past MDA has produced capability assessments for the purpose of evaluating the BMDS in three areas:
• System Verification – verifying that BMDS system-level requirements are met
• System Functionality Assessment – demonstrating that the BMDS performs its designed functions
• System Performance Assessment – determining how well the BMDS performs its mission

MDA recognized the need for a multi-year assessment campaign:
• Integrated Master Assessment Plan (IMAP)
• The IMAP process is not inconsistent with concepts presented in this brief
• IMAP is intended to leverage and align with MDA’s existing test program

[1] Ballistic Missile Defense Review (BMDR) Report, Secretary of Defense, 1 February 2010. www.defense.gov/bmdr

Complex Systems Approach to Addressing Performance Assessment

• System of Systems and Complex System Views of BMDS (Functional, Platform, Operational)
• BMDS System of Systems and Complex System Characteristics
• Wicked Problems and Social Complexity
• General Approach
  – Define the System Performance Assessment
  – Establish the System Context
  – Establish the Problem Context
  – Techniques, Analytical Methods, Tools
  – Defining the Referent System
  – Developing Parameters for the Referent System
  – Alignment of the Referent System to M&S
  – From the Referent System to Assessment
  – Assessing the BMDS
• Planning the System Performance Assessment
• Caveats and Limitations to This Approach

Functional View of the BMDS

[Figure: functional view of the BMDS. Source: MDA Fact Sheets, http://www.mda.mil/system/system.html]

Platform-centric View of the BMDS


BMDS Operational System Configuration


BMDS System of Systems Characteristics

• Operational Independence – The Elements/Components of the BMDS are able to, and do, operate independently. Aegis, THAAD, and Patriot can perform missile defense missions in the absence of the rest of the system. GMD can perform missile defense missions with a subset of the system.
• Managerial Independence – Elements of the BMDS have successfully operated independently for many years, such as Patriot and Aegis.
• Evolutionary Development – The BMDS is being built over many years and capabilities are being added incrementally.
• Emergent Behavior – The full spectrum of layered BMDS defense can only be provided by all Elements/Components operating together.
• Geographic Distribution – Elements/Components of the BMDS are distributed worldwide, may be mobile, and communicate via data networks.

Complexity Attributes of the BMDS (1 of 2)

Number of Elements (Simple: Small; Complex: Large; BMDS Problem: Complex)
• Internal Social: Multiple competing MDA Directorates, Services/Program Offices, Contractors, UARCs/FFRDCs, Warfighters
• Internal Technical: System has a large number of Elements and Components
• External Social: Congress, POTUS, OTA, Services, Warfighters
• External Technical: Other services necessary, such as communications, not under MDA control

Interactions (Simple: Few; Complex: Many; BMDS Problem: Complex)
• Technical: The BMDS interacts between Elements/Components and externals through numerous links and interfaces
• Social: In the context of this problem, interactions occur across multiple Assessment, Analysis, and Flight/Ground Test teams within MDA, Contractors, and Program Offices

Predetermined Attributes (Simple: Yes; Complex: No; BMDS Problem: Complex)
• Wide variety of Elements/Components
• Multiple disparate stakeholders
• System organization changes – test to test, over time, based on operational systems
• System boundaries difficult to define – especially with regard to the operator being a part of the system or external to it

Complexity Attributes of the BMDS (2 of 2)

Interaction Organization (Simple: Highly Organized; Complex: Loosely Organized; BMDS Problem: Complex)
• Many activities are centrally organized by MDA but executed locally or distributed. From an assessment perspective, Flight Test, Ground Test, and Performance Assessment activities are not highly organized to each other even though they impact each other in many ways.
• Element, Component, Service, and Operators are differently organized.

Laws Governing Behavior (Simple: Well Defined; Complex: Probabilistic; BMDS Problem: Complex)
• System behavior is sometimes even less than probabilistic.

System Evolution Over Time (Simple: Does Not Evolve; Complex: Evolves; BMDS Problem: Complex)
• Overlapping complex planned system evolution at System, Element, and Component levels over different time frames.

Subsystems Pursue Own Goals (Simple: No; Complex: Yes (Purposeful); BMDS Problem: Complex)
• Programs and services are driven toward meeting the goals of their Elements/Components
• Many Elements are complex in their own right and have missions other than BMD (i.e., Aegis, Patriot)

Systems Affected by Behavioral Influences (Simple: No; Complex: Yes; BMDS Problem: Complex)
• Different service paradigms operating in a “joint” world
• Differences between R&D, Developers, Integrators, Testers, Operators

Predominantly Closed or Open to the Environment (Simple: Largely Closed; Complex: Largely Open; BMDS Problem: Complex)
• Primary purpose of the system is to defend against an external threat
• Subject to environment
• Depends upon external services (such as comms) to accomplish mission

Wicked Problems and Social Complexity*

1. You don’t understand the problem until you have developed a solution.
   • Every BMDS “solution” exposes new/different aspects of the problem.
   • The problem is ill-structured, with evolving, interrelated issues and constraints.
   • Different stakeholders have different viewpoints.
2. Wicked problems have no stopping rule.
   • Problem solving is usually limited by resources rather than discovery of an optimal solution.
3. Solutions to wicked problems are not right or wrong.
   • There is no optimal solution, but “better or worse”, “good enough or not good enough.”
4. Every wicked problem is essentially unique and novel.
   • Many factors and conditions are embedded in a dynamic socio-technical context.
5. Every solution to a wicked problem is a “one-shot operation.”
   • You cannot build a BMDS just to see how it works. But you can’t learn about the problem without trying solutions.
6. Wicked problems have no given alternative solutions.
   • There may be no solutions, or there may be many solutions. Creativity in selecting solutions and judging which are valid is critical.

*Jeff Conklin, Wicked Problems & Social Complexity, www.cognexus.org/wpf/wickedproblems.pdf

General Approach
“We fail more often because we solve the wrong problem than because we get the wrong solution to the right problem.” – Russell L. Ackoff

• Define the System Performance Assessment Problem
  – Performance reported with regard to effectiveness metrics defined by MDA documents such as: Technical Objectives and Goals (TOG), BMDS Accountability Report, Effectiveness Metrics Standard
• Determine System Performance Assessment Venues/Methods
  – Flight Test
  – Ground Test
  – Modeling & Simulation
  – SME Assessment/Opinion
• Define System Performance Assessment Campaign
  – Planning
  – Development
  – Execution
  – Analysis and Assessment
  – Reporting

Relative Quality of Assessment by Venues and Methods

• The validity of performance representation from highest to lowest by venue/method is:
  – Flight Test
  – Ground Test
  – Modeling & Simulation
  – SME Assessment/Opinion
• System Capability is reflected in three ways:
  – System Tests verify a delivered capability for point solutions
  – Modeling & Simulation either predicts or reflects system performance
  – Subject Matter Experts (SMEs) apply judgment to produce an assessment

[Figure: Validity of System Performance Represented (Low to High) versus Span of Capability Space Assessed (Narrow to Broad). Flight Test (FT), Hardware-in-the-Loop (HWIL) Ground Test (GT), Engineering/Mission/Campaign M&S, and Subject Matter Expert Opinion span progressively broader portions of the capability space at progressively lower validity.]

Establishing BMDS System Context

[Figure: the BMDS Baseline System Description (System Spec & ICDs, other Specs & Reqs) defines the Referent System, which abstracts to System Mission Scenarios. Developmental and Operational System Tests, built up as in a campaign (e.g. Ground Test, Flight Test), verify/capture Delivered Capability, measured by Flight Test, Ground Test, and other venues (e.g. M&S), and define/create M&S benchmarks that validate the Simulation System (multiple fidelity levels, e.g. Engineering, Mission, Campaign; must be accredited). The Simulation System predicts/reflects assessed System Capability, yielding Documented Capability and Declared Capability.]

BMDS Element/Component Logical (Functional) Referent System and Externals

• Define Referent System boundaries (see the data-model sketch below) in terms of:
  – Entities
  – Relationships
  – Permeability
  – Interdependence
  – Hierarchy
  – Relationship to External Factors (Environment, Operator, etc.)
• Define Referent System in terms of BMDS functional categories such as:
  – Sensing
  – Battle Management
  – Control/Execution of Weapons

[Figure: Referent System functional view surrounded by externals: OPERATE, TRAIN, TEST, DEVELOP/INTEGRATE; Mission Objectives; Threats & Scenarios; CONOPS, TTP, ROE; Environment.]
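As a concrete illustration of these boundary concepts, the following is a minimal Python data-model sketch. The class names, fields, and example entries are assumptions made for illustration only, not an MDA schema.

```python
from dataclasses import dataclass, field
from enum import Enum


class FunctionalCategory(Enum):
    """BMDS functional categories named on the slide."""
    SENSING = "sensing"
    BATTLE_MANAGEMENT = "battle management"
    WEAPON_CONTROL = "control/execution of weapons"


@dataclass
class Entity:
    """An Element/Component inside (or outside) the referent-system boundary."""
    name: str
    categories: set
    external: bool = False  # True for environment, operator, and other externals


@dataclass
class Relationship:
    """A link between two entities; 'permeable' marks interfaces that
    cross the referent-system boundary."""
    source: str
    target: str
    permeable: bool = False


@dataclass
class ReferentSystem:
    entities: list = field(default_factory=list)
    relationships: list = field(default_factory=list)

    def boundary_crossings(self):
        """Relationships that involve an external entity or are marked permeable."""
        externals = {e.name for e in self.entities if e.external}
        return [r for r in self.relationships
                if r.permeable or r.source in externals or r.target in externals]


# Illustrative usage with hypothetical entries.
rs = ReferentSystem(
    entities=[
        Entity("AN/TPY-2", {FunctionalCategory.SENSING}),
        Entity("C2BMC", {FunctionalCategory.BATTLE_MANAGEMENT}),
        Entity("Operator", set(), external=True),
    ],
    relationships=[
        Relationship("AN/TPY-2", "C2BMC"),
        Relationship("Operator", "C2BMC", permeable=True),
    ],
)
print([f"{r.source}->{r.target}" for r in rs.boundary_crossings()])
```

A structure of this kind makes the boundary decisions (what is internal, what is external, which interfaces are permeable) explicit and reviewable by stakeholders.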

Operational BMDS Physical Architecture (an example realization)

[Figure: physical architecture showing weapon systems and sensors connected to BMDS C2BMC through weapon, sensor, and C2/BM interface types. Elements include: Command Center Systems; OPIR (DSP, SBIRS, etc.); SBX (X-band sensor); CD & UEWR (UHF/L-band sensors); GMD weapon system (GFC, CLE/GBI-FGA, CLE/GBI-VAFB); AN/TPY-2 in Forward-Based Mode (X-band sensor); Airborne Infrared, Advanced IR, and external IR sensors; Aegis BMD weapon system (C&D/WCS, AN/SPY-1 S-band radar, SM-3 Blk IB/IIA/IIB, SM-2, far-term SBT) and Aegis Ashore (Adv C2); THAAD weapon system (THAAD FCC, AN/TPY-2 in THAAD Mode, THAAD interceptor); Patriot weapon system (Patriot ECS, AN/MPQ-53/65 X-band sensor, GEM/GEM+/PAC-2/3); and possible future capability.]

Extending the Referent System to Multiple Layers

• The Referent System can be extended to multiple levels within the BMDS
• The goal is to produce a representation from which conclusions about system behavior can be made
• Caution: different behaviors may be associated with emergent properties at different levels of scope and/or resolution of system representation

[Figure: BMDS Referent System surrounded by the same externals as the functional view: OPERATE, TRAIN, TEST, DEVELOP/INTEGRATE; Mission Objectives; Threats & Scenarios; CONOPS, TTP, ROE; Environment.]

Performance Parameters for the Referent System

TOG Metric – Definition
• Probability of Success – The probability that BMDS engagements prevent an adversary weapon from performing its duty (“negation”).
• Defended Area – The geographic area that the BMDS can defend against adversary ballistic missiles originating from specified launch areas.
• Launch Area Denied – The geographic area that the BMDS can defend against adversary ballistic missiles given a specified Defended Area.
• Operational Area – The geographic area in which BMDS Elements and Components can operate and still defend against adversary ballistic missiles from specified launch areas to specified Defended Areas.
• Raid Capacity – The number of adversary ballistic missiles that the BMDS can defend against simultaneously.
• Other Factors – Environmental conditions such as cloud, rain, dust, day/night, sea state; adversary ballistic missile characteristics; “-ilities”.
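These metrics lend themselves to simulation-based estimation. The sketch below is a minimal, hypothetical Python illustration (the function names, probabilities, and geometry are assumptions for illustration, not MDA values or tooling) of how a Probability of Success estimate and a Defended Area coverage fraction might be rolled up from engagement-level outcomes.

```python
import random


def simulate_engagement(p_detect=0.95, p_track=0.9, p_intercept=0.8):
    """Toy engagement chain: success requires detect, track, and intercept.
    The probabilities are illustrative placeholders, not real BMDS values."""
    return (random.random() < p_detect and
            random.random() < p_track and
            random.random() < p_intercept)


def probability_of_success(n_trials=10_000, **kwargs):
    """Monte Carlo estimate of the probability that an engagement
    negates the adversary weapon."""
    successes = sum(simulate_engagement(**kwargs) for _ in range(n_trials))
    return successes / n_trials


def defended_area_fraction(grid_points, launch_points, p_threshold=0.7):
    """Fraction of candidate defended-area grid points whose estimated
    engagement-success probability meets a threshold for every launch point.
    In practice the per-point probability would come from higher-fidelity M&S."""
    def engagement_probability(defended, launch):
        # Placeholder: degrade success with ground distance (illustrative only).
        distance = ((defended[0] - launch[0]) ** 2 +
                    (defended[1] - launch[1]) ** 2) ** 0.5
        return max(0.0, 0.95 - 0.0002 * distance)

    covered = [d for d in grid_points
               if all(engagement_probability(d, l) >= p_threshold
                      for l in launch_points)]
    return len(covered) / len(grid_points)


if __name__ == "__main__":
    print("P(success) ~", probability_of_success())
    grid = [(x, y) for x in range(0, 500, 50) for y in range(0, 500, 50)]
    launches = [(1200, 300), (1100, 700)]
    print("Defended-area fraction ~", defended_area_fraction(grid, launches))
```

The same pattern (a per-engagement model rolled up over a grid of conditions) extends naturally to Launch Area Denied, Operational Area, and Raid Capacity.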

Alignment of the Referent System to Modeling and Simulation
(Representing the Referent System in Models & Simulations)

[Figure: Real-world data from Developmental/Operational Test (Flight Test, Ground Test) provides real-world anchoring and V&V for Medium-Fidelity M&S (Wilma, EADSIM, other), which is predictive, and benchmark M&S and V&V for High-Fidelity M&S (DSA-P, engineering models, blends with tactical software), which is reflective. Both feed a Data/Parameter Repository supporting accreditation and the System Assessment.]
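The "real-world anchoring" step can be pictured as comparing model-predicted metric values with test-derived values at common anchor points. The following is a minimal Python sketch under that assumption; the event identifiers and values are hypothetical, and no Wilma/EADSIM/DSA-P interfaces are implied.

```python
from statistics import mean


def anchor_model_to_tests(predictions, observations):
    """Compare model-predicted metric values against test-derived values
    at matching anchor points and summarize the disagreement.

    predictions / observations: dicts mapping an anchor-point id
    (e.g., a flight- or ground-test event) to a metric value. Illustrative only."""
    common = sorted(set(predictions) & set(observations))
    if not common:
        raise ValueError("no common anchor points between model and test data")

    errors = [predictions[k] - observations[k] for k in common]
    return {
        "n_anchor_points": len(common),
        "mean_error": mean(errors),
        "max_abs_error": max(abs(e) for e in errors),
    }


# Hypothetical anchor points: engagement-success estimates per test event.
model_predictions = {"FTG-xx": 0.84, "FTI-xx": 0.78, "GTI-xx": 0.81}
test_observations = {"FTG-xx": 0.80, "FTI-xx": 0.75}

print(anchor_model_to_tests(model_predictions, test_observations))
```

An accreditation argument would then rest on whether the observed disagreement at the anchor points is small enough for the intended use of the model.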

From Referent System to Assessment

[Figure: System Performance Assessment Process. Level 1 and Level 2 Referents, informed by experimental design, are represented in M&S (Engineering, Mission, Campaign, other), in "real world" venues (Flight Test, Ground Test), and by SMEs. Emergent behavior at each level is characterized (e.g., expected value, decision trees) and, through consensus processes, rolled into a belief/confidence assessment of metrics such as Defended Area, producing the System Performance Assessment.]
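One simple way to picture the consensus/belief-confidence roll-up is an expected-value combination of metric estimates from the different venues, weighted by their relative validity. The Python sketch below assumes illustrative weights and values; it is not an MDA-defined algorithm.

```python
def roll_up_assessment(evidence):
    """Combine metric estimates from multiple venues into a single
    confidence-weighted value (a simple expected-value roll-up).

    evidence: list of (venue, value, weight) tuples. The weights encode the
    relative validity of each venue (flight test > ground test > M&S > SME)
    and are assumptions for illustration, not MDA-defined numbers."""
    total_weight = sum(w for _, _, w in evidence)
    value = sum(v * w for _, v, w in evidence) / total_weight
    spread = max(v for _, v, _ in evidence) - min(v for _, v, _ in evidence)
    return {"weighted_value": value, "spread_across_venues": spread}


defended_area_evidence = [
    ("flight_test", 0.78, 1.0),       # narrow span, highest validity
    ("ground_test", 0.74, 0.7),
    ("mission_level_m&s", 0.70, 0.4),
    ("sme_judgment", 0.65, 0.2),      # broad span, lowest validity
]
print(roll_up_assessment(defended_area_evidence))
```

The spread across venues is worth reporting alongside the weighted value: a large spread is itself a caveat on the assessment.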

Assessing the BMDS

• In addition to producing BMDS Metrics and their derivatives, System Performance Assessment should also produce results describing Delivered Capability:
  – Caveats and Limitations
  – Discussion of noted emerging behaviors
• The process for describing behaviors may include (see the sketch below):
  – Identify – that the behavior is occurring
  – Characterize – the nature of the behavior
  – Measure – produce behavior metrics
  – Analyze – the effect of the behavior (effectiveness, efficiency)
  – Assess – the impact of the behavior (so what?)
• System Issues (problems requiring resolution) should be documented.
• The System Performance Assessment should produce results of both a quantitative and qualitative nature.
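A minimal sketch of how one observed behavior could be carried through those five steps as a single record; the field names and example values are illustrative assumptions, not real assessment findings.

```python
from dataclasses import dataclass, asdict


@dataclass
class ObservedBehavior:
    """One record carrying an emergent-behavior observation through the
    Identify -> Characterize -> Measure -> Analyze -> Assess steps."""
    identified_in: str        # Identify: where the behavior was observed
    characterization: str     # Characterize: the nature of the behavior
    metrics: dict             # Measure: behavior metrics produced
    effect: str               # Analyze: effect on effectiveness/efficiency
    impact: str               # Assess: the "so what?"


# Hypothetical example record (placeholder content only).
example = ObservedBehavior(
    identified_in="GTI-xx distributed ground test",
    characterization="duplicate track reports under high sensor loading",
    metrics={"duplicate_track_rate": 0.03},
    effect="extra engagements planned against the same object",
    impact="interceptor inventory consumed faster than predicted",
)
print(asdict(example))
```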

Planning the System Performance Assessment Campaign

[Figure: notional IMAP capability-assessment timeline aligning Flight Test events (FT), Ground Test events (GTI-XX, GTD-XX, GT-XX campaign), M&S assessments (CD-XX), and TA/PA/OT milestones with the System Performance Assessment – XX assessment campaign.]

Phases/Processes to address topics such as:
• Planning (mission needs, requirements generation)
• Development (analysis specified/designed, internal tools specified & developed, external requirements {e.g., for M&S} developed)
• Execution (data/analyses collected, models executed, etc.)
• Analysis (Performance Parameters and metrics produced)
• Assessment (results assessed)
• Reporting (results reported in Event or EOY Reports, for example)
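A campaign plan along these lines can be pictured as a set of events, each tied to a venue, a phase, and the products it contributes. The Python sketch below uses notional event names (the -xx placeholders from the figure) and is an illustration, not MDA's planning tooling.

```python
from dataclasses import dataclass


@dataclass
class CampaignEvent:
    """A test or analysis event feeding the assessment campaign."""
    name: str       # e.g., "GTI-xx", "FTG-xx", "CD-xx M&S assessment"
    venue: str      # flight test, ground test, M&S, SME
    phase: str      # planning, development, execution, analysis, assessment, reporting
    products: list  # data or analyses the event contributes


# Notional campaign entries (placeholders only).
campaign = [
    CampaignEvent("GTI-xx", "ground test", "execution", ["distributed-test data"]),
    CampaignEvent("FTG-xx", "flight test", "execution", ["telemetry", "truth data"]),
    CampaignEvent("CD-xx", "M&S", "analysis", ["performance parameter estimates"]),
    CampaignEvent("EOY report", "assessment team", "reporting", ["capability assessment"]),
]

# Group events by campaign phase to check that every phase has coverage.
by_phase = {}
for event in campaign:
    by_phase.setdefault(event.phase, []).append(event.name)
print(by_phase)
```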

Caveats and Limitations to This Approach
(Factors which may drive acceptance and success)

Assessment Stakeholders
• Few people and budget to accomplish the task
• Assessment Stakeholders encompass multiple organizations and objectives

Multiple Complex Stakeholder Base
• Stakeholders must come to common agreement as to how the BMDS will be assessed at the SoS level, and must agree to the representation of their system at that level
• Element/Component Stakeholders have been resistant in the past to aggregated performance representations at the BMDS SoS level
• Variable level/quality of performance reporting
• Stakeholder imperatives may favor non-reporting of assessment data
• Never underestimate the power of indifference of Stakeholders to the system-level performance assessment campaign concept

Assessment Results & M&S
• Must be able to produce “independent”, community-accepted, defensible results
• Performance Assessments should be of use to senior officials in a timely fashion
• There are no accredited system-level M&S, either Medium or High fidelity
• The model is not the system
• Different resolution and fidelity models produce different emerging behaviors
• The historical System Performance Assessment process has failed to produce “good” simulation runs with results useable for accredited Performance Assessment

BMDS Architecture
• Elements/Components are often complex systems and some are SoS
• The BMDS depends on systems/services not within the control of the BMDS
• Valid Element/Component behavior may not produce desired system-level behavior
• It is sometimes difficult to distinguish between emergent behavior and improper function of the BMDS

BACKUPS

Wicked Problems*
(Rittel & Webber’s Original Description)

1. There is no definitive formulation of a wicked problem.
2. Wicked problems have no stopping rule.
3. Solutions to wicked problems are not true-or-false, but good-or-bad.
4. There is no immediate and no ultimate test of a solution to a wicked problem.
5. Every solution to a wicked problem is a “one-shot operation”; because there is no opportunity to learn by trial-and-error, every attempt counts significantly.
6. Wicked problems do not have an enumerable (or an exhaustively describable) set of potential solutions, nor is there a well-described set of permissible operations that may be incorporated into the plan.
7. Every wicked problem is essentially unique.
8. Every wicked problem can be considered to be a symptom of another problem.
9. The existence of a discrepancy representing a wicked problem can be explained in numerous ways. The choice of explanation determines the nature of the problem’s resolution.
10. The planner has no right to be wrong.

*Horst W. J. Rittel & Melvin M. Webber, Dilemmas in a General Theory of Planning, Policy Sciences 4, 1973, 155-169

System of Systems Management & Oversight and Operational Environment

Stakeholder Involvement
• Acknowledged SoS*: Stakeholders at both system and SoS levels (including the system owners), with competing interests and priorities; in some cases, the system stakeholder has no vested interest in the SoS; all stakeholders may not be recognized.
• Virtual SoS: A virtual SoS has no centrally established purpose; rather, the purpose expresses itself as the collective actions of the individual systems.
• Collaborative SoS: Stakeholders negotiate among themselves to establish a common purpose. The SoS is built to this purpose, and the individual systems negotiate among themselves to determine which part of this responsibility each fulfills. Central players often establish the ground rules by which other players participate.
• Directed SoS: A central SoS authority usually establishes the purpose to be achieved by the SoS. The SoS is built to this purpose, and the individual systems are generally directed by the central authority.

Governance
• Acknowledged SoS: Added levels of complexity due to management and funding for both the SoS and individual systems; the SoS does not have authority over all the systems.
• Virtual SoS: No central body controls the purpose or management of the SoS or individual systems. Governance may emerge from politics or policies agreed to by stakeholders, but none is compelled to comply.
• Collaborative SoS: In a collaborative SoS there is no central authority with the power to enforce a particular SoS purpose. A central authority may establish purposes, standards, etc., which are usually complied with, but does not have authority to enforce them.
• Directed SoS: Individual systems are governed by membership in a common SoS command structure, which usually includes a central governing authority.

Operational Focus
• Acknowledged SoS: Called upon to meet a set of operational objectives using systems whose objectives may or may not align with the SoS objectives.
• Virtual SoS: Individual systems are operated independently. Operation of the SoS is complex because there is no centrally directed/controlled purpose. Participation by systems is voluntary, and they often have conflicting purposes which they will try to attain simultaneously with other systems.
• Collaborative SoS: A collaborative SoS differs from a directed SoS in that a central authority is not able to enforce particular operation of the system. Systems collaborate of their own will to achieve a central purpose; however, from time to time SoS operational needs are subjugated to the needs of a particular system.
• Directed SoS: The systems are connected by command and control structures. The SoS directs the operation of individual systems to achieve the SoS purpose (a centralized control authority). Systems are usually allowed operational independence to deal with local situations.

*Table adapted and Acknowledged SoS definitions come from the DoD SE Guide for SoS; others are defined by this author.

System of Systems Implementation

Acquisition
• Acknowledged SoS*: Added complexity due to multiple system lifecycles across acquisition programs, involving legacy systems, systems under development, new developments, and technology insertion; typically have stated capability objectives upfront which may need to be translated into formal requirements.
• Virtual SoS: Component systems are acquired independently without regard to other systems, except in the context that another system may perform a beneficial function for that system (usually at little or no cost) and dependably.
• Collaborative SoS: Systems negotiate among themselves to determine how SoS objectives are to be met and which system is to provide which SoS capability. Agreements are made between central players to form a common acquisition strategy. This can be seen as a negotiated “political” objective as opposed to direction by a central authority.
• Directed SoS: Individual systems are acquired through different program offices and operated separately; however, there is a central authority directing, coordinating, and balancing the various program offices. Systems may be custom built to meet the needs of the SoS.

Test & Evaluation
• Acknowledged SoS: Testing is more challenging due to the difficulty of synchronizing across multiple systems’ life cycles, given the complexity of all the moving parts and the potential for unintended consequences.
• Virtual SoS: SoS testing generally occurs on an ad hoc basis. Individual systems test themselves. Testing at the SoS level is confined to aspects of the SoS at that level that affect the function and purpose of individual systems. In other words, a system only tests what is important to itself at the SoS level, if any SoS testing is conducted at all.
• Collaborative SoS: SoS testing is established by coordination and negotiation between the central SoS players. Testing tends to change over time as the SoS purpose evolves. For a directed SoS the testing tends to be directed from the top down, whereas for a virtual SoS it springs up organically; T&E for a collaborative system comes from a middle ground in which the central players establish goals that are tested by the entire SoS.
• Directed SoS: Testing occurs at multiple levels but is directed from the SoS level. At the SoS level, testing is directed to evaluate the central purpose of the SoS. Testing may occur with the entire SoS or portions of it. Additionally, testing occurs at the system level to establish that the system meets its individual requirements, including those supporting the system purpose.

*Table adapted and Acknowledged SoS definitions come from the DoD SE Guide for SoS; others are defined by this author.

System of Systems Engineering & Design Considerations

Boundaries and Interfaces
• Acknowledged SoS*: Focus on identifying the systems that contribute to the SoS objectives and enabling the flow of data, control, and functionality across the SoS while balancing the needs of the systems.
• Virtual SoS: Boundaries and interfaces evolve through adaptation and survival – successful standards live and are extended upon while others die out. Forces other than the technical merits of these may determine survival (e.g., VHS vs. Betamax). Systems choose to use or not use these at their own discretion. A standard may be created by an individual system and then be adopted by others.
• Collaborative SoS: Certain systems rise to be central players at the SoS level. These systems usually reach agreement on what the interface standards are and what services to provide. They usually create common standards for use by the entire SoS but do not enforce them (except by operationally excluding other systems that do not conform).
• Directed SoS: Interfaces are seen as a key integrating factor for the SoS. A central authority establishes the interface requirements, with input from the component systems. Similarly, the central authority establishes the boundaries between systems.

Performance & Behavior
• Acknowledged SoS: Performance across the SoS that satisfies SoS user capability needs while balancing the needs of the systems.
• Virtual SoS: The performance of the SoS is not directed, but rather is an emergent behavior. There are no established SoS performance requirements. Individual systems optimize to perform best for their own ends (i.e., best ROI at the system level), and SoS performance derives from that.
• Collaborative SoS: Like the virtual SoS, there are no minimum SoS performance requirements enforced by a central authority. Rather, the constituent systems agree to a set of mutual performance goals and behaviors which evolve over time. Individual systems may choose to sub-optimize to benefit the SoS.
• Directed SoS: All constituent systems must meet minimum performance requirements to satisfy SoS capability requirements. Individual systems may be operated sub-optimally to meet the SoS performance requirement. Generally, individual system performance is secondary to SoS performance.

*Table adapted and Acknowledged SoS definitions come from the DoD SE Guide for SoS; others are defined by this author.