

A GUIDE TO RED TEAMING

DCDC GUIDANCE NOTE

A Guide to Red Teaming, dated February 2010, is promulgated as directed by the Chiefs of Staff

Assistant Chief of the Defence Staff (Development, Concepts and Doctrine)

CONDITIONS OF RELEASE

This information is Crown copyright and the intellectual property rights for this publication belong exclusively to the Ministry of Defence (MOD). No material or information contained in this publication should be reproduced, stored in a retrieval system or transmitted in any form outside MOD establishments except as authorised by both the sponsor and the MOD where appropriate. This information may be subject to privately owned rights.


AUTHORISATION

The Development, Concepts and Doctrine Centre (DCDC) is responsible for publishing a variety of Joint Publications set at the Strategic, Conceptual and Doctrinal levels. Readers of DCDC publications who wish to quote them as reference material in other work should confirm with the DCDC Doctrine Editor whether the particular publication and amendment state remains authoritative. Comments on factual accuracy or proposals for amendment are welcomed by the Doctrine Editor at:

The Development, Concepts and Doctrine Centre Ministry of Defence Shrivenham SWINDON, Wiltshire, SN6 8RF

Telephone number: 01793 314216/7 Military Network: 96161 4216/4217 Facsimile number: 01793 314232 Military Network: 96161 4232 E-mail: [email protected]

DISTRIBUTION

Distribution of JDPs is managed by the Forms and Publications Section, DSDA Operations Centre, C16 Site, Ploughley Road, Arncott, Bicester, OX25 1LP. Requests for issue of this publication, or amendments to its distribution, should be referred to the DSDA Operations Centre. All other DCDC publications, including a regularly updated CD Joint Doctrine Disk (containing both JDPs and Allied Joint Publications (AJPs)), can also be demanded from the DSDA Operations Centre. DSDA Help Desk: 01869 256052 Military Network: 94240 2052

All publications (including drafts) are available to view and download on the Defence Intranet (RLI) at: www.dcdc.dii.r.mil.uk This publication is available on the Internet at: www.dcdc.mod.uk


FOREWORD JDP 0-01 British Defence Doctrine (3rd Edition) notes that – critically – it is the conceptual component of fighting power that provides commanders with the ability to understand the context within which they operate, and that this serves as the foundation upon which creativity, ingenuity and initiative may be achieved. This emphasises the importance that we place upon human factors in all aspects of operations. Subordinate operational level doctrine necessarily focuses on the largely procedural aspects that are used in training and education to help develop swift but consistent responses to difficult problems. Yet some problems cannot be managed by process and science alone; human and organisational factors that drive military thinking are the greatest determinant of success or failure in operations. This Guide seeks to bridge a gap in our current doctrine by articulating the whys and wherefores of red teaming, a powerful analytical tool by which to aid understanding and improve decision-making.

Red teaming is a practical response to a complex cultural problem. That problem is as much with our own organisational culture as it is with our understanding of other people, organisations and groups. Military groups have robust cultures that reinforce cohesion under stress and which sustain a strong moral component of fighting power. This characteristic, which is a strength in many circumstances, can also be a weakness if it constrains thinking, discourages people from speaking out or excludes alternative perspectives. In these circumstances decision-making may be distorted. Red teaming is a tool which may help to combat this effect, as well as enhance understanding of how other actors in the operating environment may behave. If done well by the right people using appropriate techniques, red teaming can generate constructive critique of a project, inject broader thinking to a problem and provide alternative perspectives to shape plans. In short, it can help us guard against our own cognitive vulnerabilities while promoting an open approach that identifies opportunities.

This publication draws on a number of sources and I am particularly grateful for contributions from:

• The Joint Services Command and Staff College.
• Permanent Joint Headquarters.
• Land Warfare Centre.
• Defence Intelligence Staff.
• Defence Science and Technology Laboratories (Dstl).
• US Joint Forces Command.
• US Joint Irregular Warfare Centre.
• US University of Foreign Military and Cultural Studies.


The Guide provides comprehensive guidance on the fundamentals of red teaming and how to get started. It attempts to move away from the mechanistic approach of traditional doctrine, but inevitably in some areas includes lists and processes. Your feedback will enable us to develop and refine later versions. I welcome your engagement.

Assistant Chief of the Defence Staff (Development, Concepts and Doctrine)


PREFACE

‘We cannot solve our problems with the same thinking we used when we created them’.

Albert Einstein

1. Red teaming is increasingly recognised as a valuable tool for commanders, leaders and managers, both within and outside the Ministry of Defence.1 To date, although red teaming is acknowledged as a useful activity by military commanders, little guidance is available within UK defence doctrine on how best to carry it out. This guide is designed to fill that gap, provide key information on the conduct of red teaming, and raise its profile throughout the defence community.

2. The guide is split into 3 parts to address different audiences. Part 1 is aimed principally at commanders, leaders and managers with responsibility for ensuring robust outcomes from projects or plans. It provides an overview of red teaming, describing a spectrum of red teaming activities, highlighting potential benefits, and outlining how and where the technique may be employed. This part concludes with a summary of the Command Principles of Red Teaming.

3. Part 2 is aimed at staffs and red teaming practitioners; it addresses the what, why and how of red teaming, as well as explaining how it should be conducted in a range of situations. Section I describes how red teaming should be applied; it first explains the conditions required for success and then provides guidance on the many potential applications of red teaming. This is followed by outline guidance on a range of techniques which may be useful for red teamers.

4. Part 2, Section II addresses the contribution of red teaming to operations and planning, amplifies the details found in JDP 5-00 (2nd Edition) Campaign Planning, and makes reference to alternative US techniques. This leads into Section III which provides advice on forming a red team, including the attributes required of the team and its individual members. It concludes with a short overview of where red teaming expertise may be found within defence. Finally, there is a lexicon of red teaming terms which explains specialist terms used in the guide.

5. Part 3 contains 3 annexes which amplify the guidance at Part 2, including: full details of the analytical techniques described in Section I; a short section on Cultural Capability2 to assist with the analysis of adversaries and others; and details of defence organisations with red teaming expertise.

1 Defence Intelligence Staff, Defence Academy, Land Warfare Centre, Defence Science and Technology Laboratories. Also the US military and an increasing number of commercial operations including defence industries.

This publication is not doctrine and does not provide a prescriptive checklist of red teaming actions or tasks, but is intended instead to provide guidance to commanders, leaders, managers and defence practitioners facing a range of situations where red teaming will bring benefit.

2 Defined as ‘The ability to understand culture, and to apply this knowledge to effectively engage in any environment. Cultural capability comprises 3 levels: awareness, understanding and competence’. (Joint Doctrine Note (JDN) 1/09 The Significance of Culture to the Military).


A GUIDE TO RED TEAMING

CONTENTS

Title Page
Authorisation and Distribution
Foreword
Preface
Contents

PART 1 – THE COMMANDER’S GUIDE

A Definition of Red Teaming
The Fundamentals
Red Teaming Benefits
Using the Red Team in Defence
Command Principles of Red Teaming

PART 2 – THE PRACTITIONERS’ GUIDE

Section I – How to Conduct Red Teaming
Conditions Required for Effective Red Teaming
Applying the Tool
Assumptions
Measures of Effectiveness
Analytical Techniques
Applying Structured Analytical Techniques

Section II – The Red Teaming Contribution to Operations and Planning
Red Teaming for Effective Planning

Section III – The Red Team
Attributes of Effective Red Teams
Team Challenges
Sourcing Red Teaming Expertise
Summary


PART 3 – DETAILED RED TEAMING INFORMATION

Annex A – Analytical Techniques

Annex B – Cultural Capability

Annex C – Red Teaming Expertise and Contacts

Lexicon


PART 1 – THE COMMANDER’S GUIDE

‘Red Teams are quintessential heretics. They are constantly trying to overthrow expectation.’1

101. The term red teaming is used in many contexts and is interpreted in a variety of ways by different people and organisations. In essence, it is a function or tool by which a commander, leader or manager may enhance his knowledge and understanding of a situation through consideration of alternative perspectives.

A Definition of Red Teaming

102. Various definitions of red team can be found amongst British and American publications; for the purposes of this guide, however, red teaming is defined as follows:

Red Teaming is the art of applying independent structured critical thinking and culturally sensitised alternative thinking from a variety of perspectives, to challenge assumptions and fully explore alternative outcomes, in order to reduce risks and increase opportunities.

The Fundamentals

103. Red teaming should: identify strengths, weaknesses, opportunities and threats, hitherto unthought-of; challenge assumptions; propose alternative strategies; test a plan in a simulated adversarial engagement; and ultimately lead to improved decision making and more effective outcomes. Red teaming is a complementary function to existing staff functions in J2, J3 and J52 – adding alternative thinking and an element of informed speculation to known or derived information sourced through the intelligence, operations and plans teams, or from independent research.

104. Red teaming activities vary in purpose, scope and process according to the context and the product under consideration. For example, red teams may be established to:

• Deliberately challenge own plans, programmes and assumptions.

• Challenge or test a system, plan or perspective through the eyes of an adversary, outsider or competitor.

• Understand options available to adversaries by generating plausible hypotheses of adversary behaviour and countering adversary deception.

1 Dr Jim Schneider, Professor of Theory, School of Advanced Military Studies, Fort Leavenworth.
2 J2, J3 and J5 are the military designations for the Joint Intelligence, Operations and Planning sections.


• Better understand partners, local populations and other influential actors; activity which is often referred to as white, green or brown-teaming.3

• Prepare an organisation to deal with surprises and strategic shocks.4

105. Figure 1.1 illustrates the spectrum of views which may be considered through red teaming. Each perspective is likely to require a tailored approach ranging from scrutiny and challenge, through testing and war gaming, to proposing alternatives or identifying the actions and reactions of others. Any one project may require a combination of these approaches, and can produce a range of outcomes as shown.

Figure 1.1 – The Red Teaming Spectrum
(Diagram: the spectrum of perspectives runs from analysis/critique of blue, through alternative white/green/brown perspectives, to the perspectives and likely actions of red. Corresponding approaches are to scrutinise and challenge assumptions and developing plans; testing; interactive war gaming; recognising external influences; and identifying red actions, reactions and deception. Outputs include strengths, weaknesses, opportunities, threats, vulnerabilities, surprises, shocks, understanding, 2nd and 3rd order effects, deception, counter-plans, measures of effectiveness, branches and sequels, information requirements, more robust assumptions and staff cohesion.)

106. The left hand end of the spectrum illustrates the critical thinking function to mitigate against the comfort or complacency of accepted assumptions and solutions. In this instance the red team should help to avoid group think and organisational bias, and should hedge against inexperience. The function is often described as playing devil’s advocate, or conducting prism or blue cell activity.

3 White-teaming refers to organisers and facilitators, green-teaming to friendly actors, brown-teaming to actors that are neither supportive nor adversarial.
4 It was in this context that red teaming gained significant traction in the US following the ‘failure of imagination’ of the US intelligence community to forewarn of 9/11.


107. At the right hand end of the spectrum is the classic red cell function, whereby the team acts as a surrogate adversary to propose opposing courses of action in order to help improve blue plans and decision making. Beyond this is the process of interactive war gaming, where campaign plans and operational plans are tested against as realistic a surrogate adversary as possible in a simulated conflict. This may be conducted in the latter stages of the planning process, as part of a campaign review, during mission rehearsal or to assist with a campaign work-up.

108. The red team should also consider the perspectives of non-adversarial actors who may be friendly, neutral or mildly antagonistic, and their likely impact on blue. These are illustrated in Figure 1.1 as white, green and brown perspectives, where white views are those of organisers and facilitators, green views are those held by friendly actors, and brown perspectives by a myriad of other actors who may be broadly neutral but in some sense partial. Consideration of these factions is especially important when operating in a Joint, interagency and multinational setting, with non-traditional coalitions, or when conducting irregular warfare or stabilisation operations amongst local populations. It is essential that the commander endeavours to understand how actors with differing views and perceptions may influence the campaign.5

109. Accurate cultural assessments contribute to the success of operations through risk reduction and the exploitation of opportunities, including the potential to influence behaviours and perceptions. This improves the ability to calculate and plan military outcomes, and leads to better informed strategic, operational and tactical decision-making. However, this is not easy; approaches and attitudes vary within all cultures (including our own) and we must be aware of the limitations of emulation or surrogate adversary interpretations, and apply an appropriate level of confidence to cultural judgements.

110. The requirement to critically review assumptions, consider the impacts of the environment from alternative perspectives, and carry out a ‘what if’ function to mitigate against surprises and shocks underpins red teaming activity across the whole spectrum.

Red Teaming Benefits

111. The complexity of today’s operational environment requires military leaders to see through multiple lenses. Events in Iraq and Afghanistan have confirmed the dynamic nature of the environment, the adversary and others in and around the Joint Operations Area (JOA). Adaptable, non-traditional adversaries seek new means to destroy, disrupt, or outwit us, requiring us to continually reassess how they are thinking and we are acting.

5 JDP 3-40 Security and Stabilisation: The Military Contribution, paragraph 805 and Annex 8B.


112. In addition to continuous reappraisal of the operational environment, the military should regularly examine its processes, structures and practices in a broader context. The Armed Forces as a whole require the organic capacity to adapt quickly to new, unanticipated requirements. Lessons from contemporary operations suggest we must improve our decision making, planning and execution of operations, not only in the front line but also in the underpinning support and policy areas.6 These lessons show that in every area of defence we need to develop the ability to rapidly understand, learn, adapt and anticipate.

113. Red teaming is a deliberate process executed by a suitably qualified team with access to relevant expertise, and capable of critical, alternative analysis. Red team outputs, in the form of written estimates, back briefs or war game outcomes, provide the commander, leader or manager with an independent capability to consider concepts, projects, plans and operational design from alternative perspectives. Thus red teaming is an analytical tool by which to engender the learning and understanding that underpins effective adaptation and anticipation. If it is to be successful, however, it should be conducted using robust methods and the team must incorporate the right people. Moreover, the results must be given serious consideration, even if they prove to be unexpected or unpalatable.

114. The benefits of red teaming include:

• Broader understanding of the Operational Environment.

• Filling gaps in understanding.

• Identifying vulnerabilities and opportunities.

• Reducing risks and threats.

• Avoiding group think, mirror imaging, cultural mis-steps and tunnel vision.

• Revealing how outside influences, adaptive adversaries and competitors could counter plans, concepts and capabilities.

• Identifying desired or undesired 2nd and 3rd order effects and unforeseen consequences.

Which lead to:

• More robust assumptions.

• Improved decision-making.

6 DCDC Operation TELIC and HERRICK – Recurring Themes, dated 22 January 2009.


• More effective action.

• Contingency plans.

• More holistic understanding of possible outcomes, including awareness of potential shocks.

• More focused intelligence collection.

• Better staff cohesion.

115. In the last 5 years the US military has embraced red teaming as an essential tool, and promoted its use throughout headquarters and decision making organisations. This follows a pivotal study by the Defense Science Board in 2003, which concluded that:

‘red teaming is especially important now… [to] challenge emerging operational concepts in order to discover weaknesses before real adversaries do... [and to] temper complacency’.7

Furthermore, in March 2005 the Robb-Silberman Report on WMD8 noted:

‘The widely recognised need for alternative analysis drives many to propose organisational solutions, such as ‘red teams’ and other formal mechanisms. Indeed, the Intelligence Reform and Terrorism Prevention Act mandates the establishment of such mechanisms to ensure that analysts conduct alternative analysis. Any such organs, the creation of which we encourage, must do more than just ‘alternative analysis’. The Community should institute formal systems for competitive – and even explicitly contrarian – analysis. Such groups must be licensed to be troublesome. Further, they must take contrarian positions, not just ones that take a harder line…’

116. Despite its many advantages, red teaming is not a silver bullet. As one would expect, the credibility of the output hinges on the quality and experience of the team, the team’s approach and toolset, the quality of the leadership and the overall context of the effort. An uninformed, overconfident or culturally biased team is unlikely to add value, and may be detrimental to the project. Furthermore, the product of a successful red team will be of no benefit if it is rejected or not considered by the commander.

7 Office of the Under Secretary of Defense for Acquisition, Technology, and Logistics, Defense Science Board Task Force on the Role and Status of Department of Defense Red Teaming Activities, September 2003.
8 Report to the President of the United States, 31 March 2005, The Commission on the Intelligence Capabilities of the United States Regarding Weapons of Mass Destruction (WMD), page 170. The report is commonly referred to as the ‘Robb-Silberman Report on WMD’, named after the two co-chairmen.


Using the Red Team in Defence

117. The concept of red teaming has been used for many years in government, military, and civilian circles. In the commercial world, red teaming usually means a critical peer review of a business concept or proposal. In government circles, it is normally associated with assessing vulnerabilities of systems or structures, especially within the information and security sectors. Armed forces use red teaming to consider a plan or operation from the point of view of an adversary, partner or other influential actor; the prime example being the use of red cells during Campaign Planning. This is similar to the UK Defence Intelligence Staff (DIS) technique of red team analysis, in which one places oneself in the shoes of the adversary.9

Getting Started

Figure 1.2 – A Model for Getting Started
(Diagram: identify the red team problem; identify the right red team; set red team objectives; plan how to use the red team outputs.)

118. Once it is recognised that red teaming is a useful tool to apply to a project or problem, the first step is to identify the analytical problem or issues for the red team to consider. These may be at the strategic, operational and tactical levels:

• At the strategic level to challenge ideas and underpinning assumptions, or acquisition decisions.

• At the operational level to challenge assumptions, force postures, a commander’s plan, or scope urgent operational requirements.

• At the tactical level to challenge military units in training, commanders’ plans at high tempo or programmes in development.

9 Other DIS techniques are variously described as devil’s advocacy, alternative analysis, threat emulation, Team A/Team B, and vulnerability assessments.


119. Red teaming may be beneficial in a number of key areas of Defence. Not only is it applicable to the planning of military and inter-agency operations, but it has much wider applicability, for example:

• Development of training and doctrine.

• Concept development and experimentation (not just an opposing force for an experiment but continuous challenge by red teams throughout the concept development process).

• Force Development.

• Security of complex networks and systems, for example communications systems.

• Activities where there are limited testing opportunities, for example, nuclear weapons stockpile issues.

• Technology assessment and acquisition decisions.

• Intelligence assessments.

120. Once the problem has been identified and initial analysis conducted, the next stage is to identify an appropriate red team. The team must possess the right mix of skills and expertise to fully address the problem. It is likely to comprise a combination of: subject matter experts, critical and creative thinkers, analysts, cultural experts and surrogate adversaries. To achieve such a team, the commander, leader or manager is likely to require a reach-out or reach-back capability by which he can call on experts from, for example, the broader military, academia, think-tank institutions and defence industries.10 Ideally, the red team should be a discrete entity, without broader tasking.

121. The red team requires clear objectives, guidance on their interaction with the blue team,11 and whenever possible a schedule for back briefing their findings. The commander, leader or manager should plan how he will incorporate the red team findings into his project or plan. Further details on how to conduct red teaming, the analytical tools of the trade and considerations for red team members can be found in Part 2 of this guide.
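By way of illustration only, the elements above – a defined problem, clear objectives, interaction guidance, a back-brief schedule and a plan for using the outputs – can be captured as a simple structured record before work begins. The Python sketch below is not a prescribed format; the field names and example values are invented for illustration.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class RedTeamTasking:
    """Illustrative terms of reference for a red team; field names are
    this sketch's own invention, not prescribed by the guide."""
    problem: str             # the analytical problem to consider
    objectives: list[str]    # clear objectives set by the commander
    interaction_rules: str   # guidance on interaction with the blue team
    back_briefs: list[date]  # schedule for back-briefing findings
    output_plan: str         # how findings will be fed into the plan

tasking = RedTeamTasking(
    problem="Challenge assumptions underpinning the draft campaign plan",
    objectives=["Test key assumptions", "Propose alternative red CoAs"],
    interaction_rules="Observe blue planning; formal critique at each back-brief",
    back_briefs=[date(2010, 3, 1), date(2010, 3, 15)],
    output_plan="Findings reviewed and incorporated before CoA evaluation",
)
print(tasking.problem)
```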

10 The Development, Concepts and Doctrine Centre (DCDC) is working on recommendations to identify a layered network of supporting expertise as a reach-back capability for operations (Enhancing Situational Understanding and Campaign Continuity). 11 The project/planning team.


Command Principles of Red Teaming

122. The art of good red teaming is founded on the following high level principles:

Command Principles of Red Teaming

• Create the right conditions. Red teaming needs an open, learning culture, accepting of challenge and criticism.

• Plan red teaming from the outset. It cannot work as an afterthought.

• Support the red team. Its contribution should be valued and used to improve outcomes.

• Provide clear objectives for the red and blue teams.

• Fit the tool to the task. Assemble an appropriate red team and ensure individuals have the right skills and experience to do the job.

• Ensure that the red team works with the blue not against them, but that the red team approach is critical and appropriately adversarial.

• Focus on key issues. Red teaming should contribute quality thinking rather than quantity.

• Remember that poorly conducted red teaming is pointless; it may be misleading and engender false confidence.

‘No plan survives first contact with the enemy.’12

Red teaming may help to anticipate and mitigate that effect.

12 Attributed to Helmuth von Moltke the Elder.


PART 2 – THE PRACTITIONERS’ GUIDE

SECTION I – HOW TO CONDUCT RED TEAMING

201. Part 2 provides guidance on how to set about and conduct successful red teaming. Section I explains the conditions required for effective red teaming, then provides guidance on the many potential applications of the tool. This is followed by a section which outlines a series of analytical techniques for red teamers.

Conditions Required for Effective Red Teaming

202. Learning Culture. A learning culture is the most important factor in effective red teaming. Red teaming can only thrive in an environment that tolerates and values internal criticism and challenge. The art of red teaming is to provide a balance to the military culture which enshrines team cohesion, selection and maintenance of the aim, and a can-do approach; this culture is a virtue in many circumstances but may detract from full consideration of other perspectives. Red teaming is necessarily a confrontational activity to guard against group-think and should generate constructive tension within staffs. For this reason it is essential that red teamers have the ability to present their findings in an objective and cogent manner, without provoking confrontation at the individual level.

203. Commanders’ Engagement. To be effective, red teams must have the commander’s confidence and support and must understand his broad direction. The team needs sufficient clout to raise issues that might not be welcome throughout the organisation. Top cover is essential to ensure that red teamers have the authority to appropriately challenge the blue commander and confidence that their insights will be seriously considered.

204. Independence. Red teams must balance the requirement to provide alternative views and avoid group think, by being independent of the planning process, against the need to remain sufficiently engaged with the staff to understand the mission and provide relevant outcomes.

205. Output-Oriented. Red teaming is not process driven. Nevertheless, a process is required to ensure that the output is seriously considered and the team is not marginalised. The red team is unlikely to have the capacity to mirror all blue activities; it should therefore focus on critical vulnerabilities, areas of uncertainty and in-depth analysis of decision points to contribute quality insights.

206. Interaction. Robust interaction between the red and blue teams is essential; it is not competitive. The objective is to establish an environment in which blue learns from the process and comes out with sharper insights or more robust solutions and greater appreciation of issues. When the red team is employed to offer alternative solutions it is important that they can challenge basic assumptions.

207. Timeliness. Red teams should be put in place before major problems arise and before significant resource expenditure so that problems can be anticipated. However, the red team should not stifle blue ideas or lead to pre-judgement of a situation.

208. Staff. The red team must contain suitably expert and experienced staff for the job. To source the expertise required it may be necessary to reach-out to broader defence organisations, academia, defence institutions and industry.

Figure 2.1 – The Red Teaming Toolset
(Diagram: critical thinking – scrutiny, challenge and devil’s advocacy – with an internal focus on blue assumptions, biases, perspectives, organisational challenges, blue objectives and MOE; adversarial thinking – surrogate adversary and threat emulation – with an adversarial focus on red assumptions, views, culture, objectives, capability, threats, the worst that red can do to blue and any asymmetric edge; and an external focus on white, green and brown culture, views and objectives. Together these test assumptions, provide alternative perspectives, test and compare developing plans and propose alternatives, measure effectiveness and anticipate shocks.)

Applying the Tool

209. Red teaming is a multi-faceted toolset as illustrated in Figure 2.1. It can be applied to a range of situations for differing or multiple outputs. At the left hand end of the red teaming spectrum are scrutiny and challenge of emerging thoughts, ideas and underpinning assumptions, to improve understanding, and identify strengths and weaknesses. This type of analysis is relevant in any sphere of operation and may usefully be applied to the full range of military projects.


210. To the right and centre of the red teaming spectrum, a red team will consider the alternative perspectives of adversaries, competitors or outsiders to determine their likely actions and reactions in the context of blue plans. This type of analysis is particularly pertinent in the operational domain, but may also be relevant in the acquisition area for assessing tenders, or when negotiating contracts.

211. Across the spectrum, red teaming is a function that can compare and test approaches and plans, by considering a range of hypotheses or alternative outcomes. The analysis should enable commanders1 to make choices and select preferred options, more cognisant of the opportunities, threats and downstream consequences associated with each option. Red teaming is also a vehicle by which to mitigate against potential shocks and surprises.2 These uses of the tool are pertinent to many military projects but are particularly relevant for acquisition decisions and campaign planning.

Assumptions

212. A key role for red teams is to challenge underpinning assumptions, identify invalid or unnecessary assumptions, validate robust assumptions and offer alternatives as appropriate. The red team should consider whether and when an assumption is a fact or an opinion, whether it is logical, reflects reality, and remains valid as the situation changes. The team should also identify critical assumptions and those that are key dependencies. The comments of General Michael Hayden (US Air Force), the Deputy Director of US National Intelligence, in May 2006, in response to the Iraq intelligence failures are instructive:

‘We just took too much for granted. We didn’t challenge our basic assumptions.’
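The checks described above lend themselves to systematic recording. The following Python sketch is one minimal, illustrative way of doing so; the test names and example assumptions are invented, not drawn from any plan or prescribed checklist.

```python
from dataclasses import dataclass

@dataclass
class Assumption:
    text: str
    is_fact: bool           # evidenced fact, or merely opinion?
    is_logical: bool        # does it follow from what is known?
    reflects_reality: bool  # does it match current reporting?
    still_valid: bool       # does it survive the latest situational change?
    critical: bool          # would the plan fail if it proved false?

def assumptions_to_challenge(assumptions):
    """Flag assumptions that are opinion rather than fact, illogical,
    out of step with reality or overtaken by events; critical ones first."""
    failing = [a for a in assumptions
               if not (a.is_fact and a.is_logical
                       and a.reflects_reality and a.still_valid)]
    return sorted(failing, key=lambda a: not a.critical)

checks = [
    Assumption("Host nation forces will secure the airhead",
               False, True, False, True, True),
    Assumption("Sea lines of communication remain open",
               False, True, True, True, False),
]
for a in assumptions_to_challenge(checks):
    print(("CRITICAL: " if a.critical else "") + a.text)
```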

Measures of Effectiveness

213. Linked to planning, an effective assessment system which provides quantitative and qualitative measures of effectiveness is an essential tool by which to assess progress and indicate where adjustments to plans may be required. Red teams can provide a different perspective on the assessment process and should offer alternative views on how adversaries, partners or other actors gauge progress, success and victory.
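One way of illustrating such an alternative view is to re-weight the same indicator scores from another actor’s perspective. The Python sketch below is a toy example; the indicators, scores and weightings are all invented and do not represent any real assessment system.

```python
# All indicators, scores (0-10) and weightings below are illustrative
# inventions, not taken from the guide or any real assessment.
indicators = {
    # name: (assessed score 0-10, blue weighting, red-perspective weighting)
    "security incidents reduced":        (6, 0.4, 0.1),
    "host-nation forces trained":        (7, 0.4, 0.2),
    "perceived legitimacy of authority": (3, 0.2, 0.7),
}

def weighted_moe(weights_index: int) -> float:
    """Weighted mean of the same scores under a given actor's weightings."""
    total = sum(v[weights_index] for v in indicators.values())
    return sum(v[0] * v[weights_index] for v in indicators.values()) / total

# The same data reads very differently through different lenses.
print(f"blue view of progress: {weighted_moe(1):.1f}/10")  # 5.8
print(f"red-team alternative:  {weighted_moe(2):.1f}/10")  # 4.1
```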

1 Throughout Part 2, the term commander(s) includes leaders and managers involved in red teaming.
2 As in Rumsfeld’s ‘known unknowns’.


Analytical Techniques

‘Alternative analysis seeks to help analysts and policy-makers stretch their thinking through structured techniques that challenge underlying assumptions and broaden the range of possible outcomes considered.’3

214. A plethora of analytical techniques exists to assist red teams in bringing critical thinking and alternative perspectives to a situation. This section refers to a number of the better known techniques and provides a guide as to how they may be employed by a military or defence red team. A more detailed explanation of each technique, including when and how they should be used and their likely value is at Annex A.

215. Diagnostic Techniques.

a. Key Assumptions Check – Review the key working assumptions on which fundamental judgments rest. A critique of key assumptions is likely to be important to the development of any project or plan. It is essential at the beginning of any project, but periodic rechecking should ensure that outcomes do not rest on flawed premises. Identifying hidden assumptions can be one of the most difficult challenges as they are ideas held to be true – but often unconsciously – and therefore are seldom examined or challenged.

b. Quality of Information Check – Evaluate integrity and reliability of available information. It is important to assess the accuracy and reliability of the information base to establish a confidence level for judgements and decisions. Checking the quality of information used in analysis should be an ongoing, continuous process throughout any project.

c. Indicators or Signposts of Change – Periodically review a list of observable events or trends to track events, monitor targets, spot emerging trends, and warn of unanticipated change. A team can create a list of indicators or signposts related to observable events that one would expect to see if a postulated situation is developing, for example: economic reform, military modernisation, political instability, or democratisation. The technique can be used when a team needs to track an event over time to monitor and evaluate changes.

d. Deception Detection – Systematic use of checklists to determine when deception may be present and how to avoid being deceived. This check is vital as part of campaign planning and should be part of the blue team process; however, the red team may bring an alternative perspective. The search for clues that deception is being conducted is often time consuming and requires extensive fact checking and hypothesis testing. Nonetheless, it can be critical in cases where the stakes are high. Red teams should consider whether the adversary has a history of practising deception, examine his motivations for an elaborate deception effort, and identify whether he has the opportunities and means at his disposal to carry it out.

3 Fishbein W and Treverton G, The Sherman Kent Center for Intelligence Analysis Occasional Papers: Volume 3, Number 2, October 2004, Rethinking ‘Alternative Analysis’ to Address Transnational Threats.

e. Analysis of Competing Hypotheses – Identification of alternative explanations (hypotheses) and evaluation of all evidence that will disprove rather than confirm hypotheses. This technique is based on developing an Analysis of Competing Hypotheses (ACH) matrix to set hypotheses against supporting evidence, thus allowing the red team to evaluate the strength and consistency of the evidence underpinning each theory, and hence judge the strength or otherwise of each hypothesis. The matrix also provides an audit trail enabling others to review the analysis. This systematic examination of the evidence makes the technique ideal for considering the possibility of deception and denial.
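The core of the ACH technique – a matrix of hypotheses against evidence, examined to find the hypothesis with the least inconsistent evidence – can be illustrated in a few lines of code. The Python sketch below assumes a simple consistent/inconsistent/neutral (C/I/N) marking scheme; the hypotheses and evidence items are invented for illustration.

```python
hypotheses = ["H1: adversary will defend in place",
              "H2: adversary will withdraw",
              "H3: adversary is staging a deception"]

evidence = {
    # evidence item: marks against (H1, H2, H3)
    "armour observed moving east": ("I", "C", "C"),
    "supply dumps being expanded": ("C", "I", "C"),
    "increase in radio traffic":   ("N", "N", "C"),
}

# ACH weighs disconfirmation: the strongest hypothesis is the one with
# the FEWEST inconsistent ('I') marks, not the most consistent ones.
inconsistencies = {h: 0 for h in hypotheses}
for marks in evidence.values():
    for h, mark in zip(hypotheses, marks):
        if mark == "I":
            inconsistencies[h] += 1

# Printing the tally (and the matrix itself) provides the audit trail
# that allows others to review the analysis.
for h, count in sorted(inconsistencies.items(), key=lambda kv: kv[1]):
    print(f"{count} inconsistent item(s): {h}")
```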

216. Contrarian Techniques.

a. Devil’s Advocacy – Taking an alternative point of view to draw out assumptions or reasoning. This technique is appropriate for any situation where strong consensus or an established mind-set exists. It is especially useful for challenging key assumptions in any project, including developing concepts or doctrine, and emerging outcomes within technological or acquisition projects.

b. Team A/Team B – Use of separate teams to contrast 2 (or more) strongly held views or competing hypotheses. If there are at least 2 competing views on a key issue, then Team A/Team B can be the most appropriate technique to use. A longstanding policy issue, a critical decision that has far-reaching implications, or a dispute that has obstructed effective cross agency cooperation can be grounds for using Team A/Team B. Teams blending personnel from all parties and views can be instructed to review all available evidence and create specific alternative arguments that can capture the essential differences, similarities, pros and cons of each relevant position, building consensus and informing the correct way forward.

c. High Impact/Low Probability Analysis – Highlights a seemingly unlikely event that would have major consequences if it happened. A contrarian technique that sensitises analysts to the potential impact of seemingly low probability events that would have major repercussions. Using this technique is advisable when analysts and policymakers are convinced that an event is unlikely and have given little thought to the implications if it did occur. In essence, this can be a warning that the intelligence and policy communities must be alert to an unexpected but not impossible event (a minimal illustrative sketch follows at the end of this list).

d. What If Analysis – Assumes that an event has occurred with potential (negative or positive) impact and explains how it might come about. A technique for challenging a strong mindset that an event will not happen or that a confidently made forecast may not be entirely justified. It is similar to High Impact/Low Probability Analysis, but it does not dwell on the consequences of the event as much as it accepts the significance and moves directly to explaining how it might come about.

e. Experimentation – A test under controlled conditions to examine the validity of a hypothesis, or determine the efficacy of something previously untried. This technique may be applied to test developing concepts and ideas, or demonstrate how certain proposals may play out within a particular context. Key methods for experimentation include (but are not limited to) facilitated workshops and judgement panels that can consider one or more potential scenarios, problems or timeframes, often most usefully through comparative analysis.

f. War Gaming – An event to simulate a military operation; testing underpinning assumptions and testing and comparing the impact of proposed courses of action. This is a useful technique to apply in the latter stages of campaign planning, during a campaign review or mission rehearsal. Further details are in Section II.
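As flagged under the High Impact/Low Probability technique above, a simple screen can surface unlikely but severe events for further ‘what if’ treatment. The Python sketch below is illustrative only; the events, probabilities, impact scores and thresholds are all invented.

```python
# Hypothetical event register; probabilities and impact scores are invented.
events = [
    # (event, estimated probability, impact on the campaign 0-10)
    ("key bridge demolished by adversary", 0.30, 6),
    ("regional power intervenes openly",   0.05, 9),
    ("epidemic closes main port",          0.02, 8),
]

# Flag events a probability-ranked view would bury: unlikely (< 10%)
# but severe (impact >= 8). Both thresholds are arbitrary choices
# made for this sketch.
for name, probability, impact in events:
    if probability < 0.10 and impact >= 8:
        print(f"High impact / low probability - develop a 'what if': {name}")
```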

217. Imaginative Thinking Techniques.

a. Brainstorming – An unconstrained group process designed to generate new ideas, theories and concepts. This group process allows others to build on an initial idea suggested by a member of the brainstorming session. It is a technique for stimulating new thinking and it can be applied to virtually any project as an aid to thinking. Typically, red teams will brainstorm when they begin a project to help generate a range of hypotheses about their issue. Brainstorming can be ineffective if the group tries to generate all ideas in plenary session. Tasking individuals to generate ideas first, which are then analysed and tested by the group, is a more reliable way of generating diverse thinking.

b. Outside-In-Thinking – Consideration of the external changes that might, over time, profoundly affect the project or issue. This technique is used to identify the full range of basic forces, factors and trends that would indirectly shape an issue, usually at the initial stages of a project when identifying critical external factors that could influence how a particular situation will develop. This is particularly relevant during the initial stages of the Campaign Planning Estimate to reduce the risk of missing important variables.

c. Alternative Futures Analysis – Systematically explores multiple ways a situation can develop when there is a high degree of complexity and uncertainty. Multiple futures development is a divergent thinking technique that tries to use the complexity and uncertainty of a situation to describe multiple outcomes or futures that the analyst and policymaker should consider, rather than to predict one outcome (see the illustrative sketch following this list).

d. Role Play/Surrogate Adversary (Prism Technique) – Models the behaviour of an individual or group by trying to replicate how an adversary or outsider would think about an issue. This is to avoid assuming that a foreign or non-military actor will have the same motives, values, or understanding of an issue as blue. History has shown that foreign leaders often respond differently to events because of different cultural, organisational, or personal experiences. Role Play tries to consciously place analysts in the same cultural, organisational, and personal setting in which the target individual or group operates (putting them in their shoes). For these techniques to succeed, it is essential that the red team contains individuals with appropriate cultural capability.

“I had perfect situational awareness. What I lacked was cultural awareness. Great technical intelligence…wrong enemy”.4
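The mechanics of Alternative Futures Analysis, referred to above, are often reduced to crossing the extreme outcomes of a small number of key uncertainties to generate a set of distinct futures. The Python sketch below shows the classic two-driver cross; the drivers and outcome labels are invented for illustration, and real analyses may use more drivers.

```python
from itertools import product

# Two illustrative key uncertainties, each with two extreme outcomes.
drivers = {
    "economy":    ("recovers", "collapses"),
    "insurgency": ("fragments", "unifies"),
}

# Crossing the extremes yields one candidate future per combination,
# giving planners a set of distinct outcomes to consider rather than
# a single-point prediction.
for combo in product(*drivers.values()):
    future = ", ".join(f"{d} {state}" for d, state in zip(drivers, combo))
    print(f"Future: {future}")
```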

Applying Structured Analytical Techniques

218. Analytical techniques are designed to help broaden thinking, enhance decision making, and avoid surprise. They are complementary to, but not a substitute for, open minded and creative thinking.

219. The analytical techniques described above can be applied in a variety of ways when red teams begin a new assessment. Some can be used effectively at multiple points in the process and can promote a team’s ability to keep an open mind, to consider multiple hypotheses – including the highly unlikely – to challenge conventional wisdom, and to assess the impact of important information gaps or deception on judgements and confidence levels. The paragraphs below provide some pointers for when to use analytical techniques as part of a red team assignment. It should be stressed, however, that every task is different and techniques must be selected according to their suitability for the situation under consideration.

4 Commander of US Third Infantry Division, Iraq 2003, quoted in R Scales, Culture-Centric Warfare, Proceedings, October 2004.


Initial Analysis – Understand, Scrutinise and Challenge

220. At the beginning of a project or plan, red teams should consider Brainstorming and Key Assumptions Checks to ensure that important factors are not being missed or taken for granted. Similarly, Outside-in-Thinking can put a project into a broader international context. For example, economic assumptions about the price of oil might be key to understanding the prospects for political stability in an oil-exporting country or an underdeveloped country entirely dependent on energy imports. A High Impact/Low Probability assessment can also sensitise red teamers early on to the significance of dramatic events that might affect the project. Alternative Futures analysis is useful at the beginning of a project and can provide the structure for the entire project or plan.

Test and Compare Developing Ideas

221. As a project or plan develops and hypotheses are being formed on the basis of key assumptions and intelligence, Contrarian Techniques including Team A/Team B and War Gaming can be used to test developing ideas and plans.

222. Techniques such as Indicators and Signposts or Analysis of Competing Hypotheses (ACH) can be useful throughout a project and may be revisited periodically as new information is absorbed and analysed. ACH in particular, is a good tool to use throughout a project to highlight evidence that underpins key arguments.
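A signpost list of the kind described above lends itself to a simple running tally that warns when a postulated situation appears to be developing. The Python sketch below is a toy illustration; the signposts, dates and alert threshold are invented.

```python
from datetime import date

# Invented signposts for a postulated development ("military
# modernisation"); None means the signpost has not yet been observed.
signposts = {
    "new air-defence radars observed": None,
    "officer exchanges with supplier state": date(2009, 11, 2),
    "defence budget rises above 4% of GDP": date(2010, 1, 15),
    "reserve call-up exercise announced": None,
}

observed = [name for name, when in signposts.items() if when is not None]
print(f"{len(observed)}/{len(signposts)} signposts observed")

# Alert threshold (half the list) is an arbitrary choice for this sketch.
if len(observed) >= len(signposts) / 2:
    print("Trend developing - warn planners and re-check assumptions:")
    for name in observed:
        print(" -", name)
```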

Alternative Analysis

223. If the assessment contains strong judgements about an adversary’s or outsider’s behaviour, then challenging this view with a Prism/Role Play/Surrogate Adversary intervention will provide a good corrective. This is, however, a time-consuming activity, requiring team members with suitable Cultural Capability.

A Final Check

224. As the red team assessment is being finalised, key assumptions should be reviewed as a sanity check on the underlying logic of the analysis (Key Assumptions Check). A final Quality of Information Check at this juncture can help to give a better degree of confidence in the information base and judgements reached in the assessment. Further Brainstorming may also ensure that no plausible hypothesis has been dismissed or left unaddressed. If a firm consensus has formed around an idea or plan and it has not been seriously questioned in some time, then a Devil’s Advocacy exercise should be undertaken.


SECTION II – THE RED TEAMING CONTRIBUTION TO OPERATIONS AND PLANNING

225. Operational planning is a long established art in which the contribution of red teaming is beginning to be recognised.5 As operations become more complex, involving a multiplicity of players including Joint, inter-agency and multinational partners, as well as a range of adversaries and neutral actors, a preceding phase of operational design is increasingly important. Operational design sets out to analyse and frame a problem before detailed planning commences. Red teaming is equally applicable to this phase of an operation, when it may be applied as an analytical tool to help refine and develop ideas.

226. The aim of red teaming as part of campaign planning, campaign review and mission rehearsal is to help: provide clarity; fully explore alternatives; realise opportunities; and identify vulnerabilities and threats, in order to ensure robust courses of action and recommend branches and sequels not previously considered.

227. A range of methods exist to aid red teams involved in the planning process, such as that used by the US Army and outlined in the US red teaming handbook.6 The guidance laid out in JDP 5-00 (2nd Edition) Campaign Planning is less prescriptive, but provides instruction on how and at what stages of the planning process the red team should engage. The following paragraphs provide amplification of JDP 5-00 guidance to maximise the value of red teaming during UK campaign planning. Figure 2.2 illustrates the sequencing of red and blue planning.

5 JDP 5-00 (2nd Edition) Campaign Planning, Annex I.
6 University of Foreign Military and Cultural Studies (UFMCS) Red Team Handbook, version 4, 12 October 2007.


Figure 2.2 – Sequencing of Red and Blue Planning
(Diagram: parallel red and blue operational planning sequences, each running through Step 1 – understanding the operating environment; Step 2 – understanding the problem (end-state, objectives, centres of gravity, decisive conditions); Step 3 – formulate potential CoAs; Step 4 – develop and validate CoAs; and Step 5 – evaluate CoAs, leading to the commander’s decision.)

228. The red team should participate at each stage of the planning process. The commander should decide who will consider white, green and brown perspectives and how these will be incorporated. He should also ensure his staff consider the linkage between the red and blue staffs,7 and decide whether and when the red team should interact with blue: whether it adopts a passive observing role followed by back briefs or critique, or alternatively develops independent red plans, either for presentation or to underpin challenges to the blue plan and inform war gaming. He should plan how he will evaluate any options or alternatives highlighted by the red team and consider how he will incorporate them into the final plan.

229. Step 1 (Understand the Operating Space, Work with Blue). At Step 1 the red team will draw on the same analysis data as blue. However, the red team should have access to additional expertise or alternative sources of information from which to broaden their situational understanding and validate assumptions. At this stage it is recommended that the red team works alongside blue to ensure that all observations and insights are considered and to achieve the greatest understanding of the operating environment.8

7 White, green and brown staff as required.
8 This includes understanding of the terrain, environment, populations and culture.


230. Although understanding of the operating space is key at the start of the planning process, it is an ongoing requirement throughout a campaign. Therefore the red team, in conjunction with J2 staff, should continue to monitor and analyse situational changes as they occur and feed them back to the blue planners. The red team should retain any alternative information or assumptions which are not incorporated by the blue planners to inform its own thinking and subsequent red planning.

231. Step 2 (Understand the Problem, Work Independently from Blue). From this stage it is recommended that the red team operates independently from blue to produce a red operational estimate and subsequently a discrete red plan. The red estimate should identify, from an informed red perspective:

• Alternative BLUE End-states, Objectives, Centres of Gravity (CoG) and Decisive Conditions.

• RED End-states, Objectives, CoG and Decisive Conditions.

232. Where there is more than one significant actor involved in the operation, it may be necessary to conduct a number of red estimates from different perspectives, including those of partners and neutrals. Once complete, the outcomes of the red estimates may be briefed to blue planners wholesale, to ensure that the widest possible range of views is considered as early as possible. Alternatively, and potentially more constructively, the red team may conduct a critique of the blue estimate, playing the role of devil’s advocate, questioning blue deductions and proffering alternative views, based upon the red estimates.

233. Steps 3-4 (Formulate, Develop and Validate Courses of Action). During this stage the red team should continue to develop its independent red plan, formulating Courses of Action (CoAs) from its informed red perspective to identify:

• Alternative BLUE CoAs.

• Alternative second and third order effects of BLUE CoAs.

• RED CoAs.

234. At this stage feedback should be provided by critiquing the developing blue plan. The red team should again act as devil’s advocate, robustly presenting alternative views and challenging the CoAs developed by the blue team, on the basis of the red estimate.

235. Steps 4-5 (Validate and Evaluate Courses of Action). Validation and evaluation of CoAs is best achieved by war gaming – an adversarial technique which plays friendly, neutral and hostile elements together to identify any shortcomings in potential or selected CoAs. Generally, elements of the blue staff play out the blue plans and often infer those of friendly forces. However, the red team should take on one or a combination of the following roles:9

• The red team plays red (and white, green, brown as required), invoking its red plans based on the perspectives of adversaries, partners and neutrals.

• Alternatively, the red team may play a free-thinking adversary who reacts to blue.

• J2 staff play red, while the red team stands back and takes an independent view of the game, offering advice to the commander as appropriate, based on its broad perspective of the overall plan.

• The red team manages the war game, sets the conditions and context and injects situational changes.

236. The commander must take a view of where and how the red team will add the most value, not only to the war game, but to the plan as a whole, and task them accordingly. Whichever role the red team takes, if done well, their participation should ensure that the event takes into account the alternative perspectives and challenges identified throughout the planning process. The red team should also be able to provide advice to the commander on how to best shape the war game itself to achieve the desired outcomes. These outcomes should provide:

• Better understanding of the likely actions and reactions of friendly, neutral and hostile actors within the Joint Operations Area (JOA).

• An indication of the likely effects of military activity and the associated risks, threats and potential opportunities.

• Assessment of blue CoAs versus red CoAs and potential outcomes.

• Refinement and development of CoAs.

237. The war game may also identify new perspectives or unexpected effects and outcomes for blue and red staffs to consider and incorporate into contingency plans.

9 The Red and White Perceptions paper currently under development by Dstl offers views on alternative roles for red teams during wargaming.
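The move and counter-move at the heart of interactive war gaming can be caricatured as a loop in which red reacts to each blue CoA and an umpire’s adjudication is logged for the commander. The Python sketch below is a deliberately crude illustration; the CoAs are invented, and the payoff numbers stand in for human adjudication rather than any real combat model.

```python
blue_coas = ["B1: rapid seizure of crossings", "B2: deliberate envelopment"]
red_coas  = ["R1: forward defence", "R2: delay and counter-attack"]

# payoff[blue][red] = adjudicated advantage to blue (-5 .. +5); invented.
payoff = {
    "B1: rapid seizure of crossings": {"R1: forward defence": 2,
                                       "R2: delay and counter-attack": -3},
    "B2: deliberate envelopment":     {"R1: forward defence": 3,
                                       "R2: delay and counter-attack": 1},
}

for blue in blue_coas:
    # Red reacts to blue by choosing the CoA that hurts blue most.
    red = min(red_coas, key=lambda r: payoff[blue][r])
    result = payoff[blue][red]
    verdict = "robust" if result > 0 else "shortfall - develop branch plan"
    print(f"{blue} vs {red}: {result:+d} ({verdict})")
```

Even this toy run shows the value of the technique: a blue CoA that looks strong in isolation may reveal a shortfall once a reactive red response is played against it, prompting a branch plan.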


‘The application of red teaming requires a deft touch. On the one hand, we don’t want to stifle good ideas by subjecting them too early to the most formidable opponents possible. On the other hand, we can’t wait too long to learn what adaptive enemies might have in store for our favourite idea.’10

Red Teaming for Effective Planning

238. Throughout the planning process red team input should be timely and tailored to the problem. Input should be made at the lowest appropriate level but elevated if the staffs fail to reach agreement; any tension between the red and blue staffs should be constructive.

Challenges

239. If it is to be effective at all stages of the planning process, the red team should be alert to the challenges outlined below:

• Group think – the desire for solidarity or unanimity within a staff constrains wider, alternative thinking.

• Focus on the current – failure to anticipate or to react to the situation changing.

• Paradigm blindness – a ‘why change what has worked in the past’ attitude leading to predictable actions or failure to recognise changes in adversary actions.

• Trends faith – blind adherence to trends without considering other problems or possible shocks.

• Mirror Imaging – applying own attitudes (for example values, beliefs, cultural concepts, capabilities) to others, thus gaining a flawed understanding of consequences and outcomes.

• Cultural contempt or misunderstanding – distinct from mirror imaging in that the staff recognise that cultural differences exist but fail to understand their significance or interpret them. The challenge for the red team here is to understand the culture of the adversary and other actors in the JOA.

• Over-optimism or Pessimism – assuming success will be the only outcome, or being unable to see the route to success.

10 Gold, T, Joint Advanced Warfighting Program, Institute for Defense Analyses, USA, January 2001.


• Oversimplification and tunnel vision – failure to take a holistic view of a complex problem with many variables, especially when time-constrained and operating with poorly integrated coalitions, leading to implicit or untested assumptions.

• Faulty perceptions/mindsets – a tendency to perceive the expected.

• The red team should be alert to the possible impact of strong prescriptive commanders and cohesive staffs, who may stifle dissent and discourage alternative thinking.

• Time constraints and limited situational understanding may also narrow thinking and lead to sub-optimal decision making.

‘The general who wins a battle makes many calculations in his temple ere the battle is fought. The general who loses a battle makes but few calculations beforehand. Thus do many calculations lead to victory, and few calculations to defeat: how much more no calculation at all! It is by attention to this point that I can foresee who is likely to win or lose’.11

SECTION III – THE RED TEAM

‘Red teaming is largely an intellectual process… but more an art than a science… requiring Red Team members to possess superb critical and creative thinking skills’.12

240. A red team should be tailored appropriately for the project under consideration. Red teams may comprise a diverse mix of skills and experience or may be focused in one particular area, depending upon the issue being addressed. Members of the team should be selected for their special subject matter expertise, professional or cultural perspective, imagination or penchant for critical analysis. Individuals may be drawn from internal staff, partners, specialists, external actors with alternative views and possibly from former or surrogate adversaries. Whatever their particular skill, all red team members should have a full understanding of the problem for analysis and ensure they are familiar with relevant systems and processes.

241. For any issue, the team should contain critical and creative thinkers who can approach the problem from different perspectives and deal with complex systems and challenging constructs. Where alternative views are being considered, familiarity with different cultural perspectives is required, including: economic, sociological, political and religious systems; military theory and culture; and attitudes to death and violence. Consideration should be given to understanding partners and neutrals as well as adversaries. An anthropological specialist is likely to be a useful member of the team.

11 Sun Tzu, The Art of War.
12 UFMCS Red Team Handbook, v4, 12 October 2007.

242. In addition to the attributes mentioned above, red teams must be capable of effective communication; they need the knack of asking questions that stimulate thought without alienating ‘blue’, and of maintaining a robust line of argument whilst avoiding antagonism. For large and diverse teams a facilitator will be required to cohere ideas, maintain focus and ensure outcomes are captured.

243. There are 3 categories of red team member, and individuals may fall into one or more. Teams may be made up of any combination of these, appropriate to the situation under consideration:

a. Surrogate Adversaries, Outsiders and Competitors. The purpose of these team members is to sharpen skills, expose vulnerabilities that adversaries may exploit and in general increase understanding of the options and responses available to adversaries, outsiders and competitors. These red teamers should possess the attributes to emulate the adversary or competitor as appropriate.

b. Devil’s Advocates. These red teamers offer critiques of, and in some cases alternatives to, assumptions, strategies, plans, concepts, programmes, projects and processes. The objective is to provide critical analysis in order to anticipate problems and avoid surprises. Devil’s advocates can also critically assess processes such as how an organisation conducts its business; an example is the Ballistic Missile Threat Committee that Secretary Rumsfeld chaired in 1998. The team examined the same data available to the intelligence community but identified alternative paths adversaries might take and came to different conclusions about the threat.13

c. Independent Thinkers. These individuals provide judgements that are independent of the unit’s normal processes and fields of experience. They are often personnel with wider experience, who bring a different, creative, innovative or visionary, although not necessarily analytical, view to a problem.

13 Arguably the external red team effort that has had the greatest effect on the management of the missile defense program was the 1998 report of the ‘Commission to Assess the Ballistic Missile Threat to the United States’. This effort was created by congressional legislation in the Fiscal Year 1997 National Defense Appropriations Act and chaired by Mr. D. Rumsfeld (then a former secretary of defense). This study helped to foster the organizational acceptance of subsequent red team analyses, liberating it from the bonds of standard intelligence assessments, which are typically based on relatively straightforward and limited extrapolations of what had actually been seen.


Attributes of Effective Red Teams

244. The success of red teaming depends on employing the right people. Staffing red teams presents special challenges, as some very talented individuals are not necessarily suited, temperamentally or motivationally, to be effective red team members. Furthermore, resource constraints may necessitate judicious selection of the right mix of talents and perspectives. When forming a red team the following attributes should be considered and included as appropriate:

a. The ability to see things from alternative perspectives.

b. Imagination, a particularly desirable attribute, enabling freedom of thought.

c. Familiarity with different cultural perspectives, cultural appreciation and empathy. Knowledge of culture is one of the most important aspects in meeting the challenges of contemporary conflict. This is true of conflict of any nature but is especially pertinent when conducting irregular or stabilisation operations where large numbers of diverse actors are involved.14

d. The ability to understand cultural differences and similarities then identify their relevance to the situation (cultural capability).

e. Understanding partners and neutrals. Cultural capability should enhance routine relations with friendly and neutral actors, including allies and partners.15

f. Knowledge of military theory, history and culture.

g. Understanding of Western and non-Western military thinking, ethos and ideology.

h. Self-awareness. ‘Know thy enemy but not yourself, wallow in defeat every time’ (Sun Tzu).

i. Knowledge of relevant foreign military capabilities and developments.

j. Understanding of the operational environment, its critical variables and the military decision making process.

k. Familiarity with war gaming and experimentation best practice.

l. The confidence to challenge conventional or established blue thinking.

14 JDP 3-40 Security and Stabilisation: The Military Contribution, paragraph 306.
15 Ibid.


m. The ability to communicate effectively.

n. Strong leadership.

o. Effective facilitation.

245. Identification of suitable red teamers may require reach into a broad range of organisations, both within and outside the Ministry of Defence (MOD). Whatever their background, however, in addition to the attributes above, red teamers must, above all, bring an open and agile mind to the problem.

Team Challenges

246. Red teaming is not easy; establishing an effective team and applying sound processes are challenges in themselves but are essential if red teaming is to add value. The red team must:

a. Have a clear objective.

b. Be independent from blue, but be close to the decision making process and have adequate interaction with blue.

c. Have the full support of the commander, but not be influenced by him.

d. Contain critical and creative thinkers with relevant expertise.

e. Capture the culture of the adversary or competitor when enacting surrogate adversaries.

247. The team and the individuals within it are at the heart of the success of any red teaming activity. The make-up of the team, the expertise of its constituent members, the way that they interact with each other and their motivations all require careful consideration. Likewise the authority afforded to the red team and the manner of integration with the project team must be planned from the outset. Red teaming cannot work as an add-on or an afterthought.


Sourcing Red Teaming Expertise

248. Experienced red teamers and specific subject matter experts can be found within Defence and its supporting areas, as indicated in Figure 2.3. Although the use of red teaming is not consistent throughout the MOD, there are a number of areas where red teaming is used or referenced; a description of the key areas of red teaming expertise with contact details is at Annex C.

249. Beyond the MOD, expertise may be found in a plethora of organisations, but those listed below are recommended as first options:

• Academia: King’s College London, Cranfield University, Exeter University, Birmingham University Global Facilitation Network for Security Sector Reform.

• Defence Institutions: Royal United Services Institute (RUSI), Chatham House, International Institute for Strategic Studies (IISS).

• Boeing UK and QinetiQ via the Co-operative Research and Development Agreement (CRADA) (accessed via DCDC).

Contact details for these organisations are included at Annex C.

[Figure 2.3 – Expertise Across the Red Teaming Spectrum: a spectrum running from internal workshops, through external critique, to red cells and surrogate/ex-adversary play; sources of expertise include J3/J5 and J2/DIS staffs, DCDC, the Defence Cultural Support Unit, the Research & Assessment Branch (Defence Academy), Dstl, CRADA partners, academics, generic specialists and the Warfare Centres.]


Summary

250. Red teaming is especially important to the military at a time when conflict is becoming increasingly complex and dynamic. Our operations are likely to be conducted in an expanded, unfamiliar operating space, against adaptive adversaries, in concert with a range of non-traditional partners.16 Understanding better the perspectives of opponents and outsiders within the JOA, and being self-critical of our own perspective, is essential to our thinking and planning, and ultimately aids decision-making. In addition, a key lesson of recent years is the need to challenge and review processes and practices in all spheres of operation, including policy, acquisition and support areas.

251. For these reasons it can be argued that now, more than ever before, robust red teams are needed to challenge emerging operational concepts and current MOD practices, in order to discover weaknesses before real adversaries do.

‘Because war is a phenomenon between thinking opponents, a broad approach to interactive red teaming is important to inform our thinking about future military challenges and explore ideas for dealing with them’.17

252. Successful red teaming helps lead to robust decision making, ameliorates risk and helps prepare for the unexpected.

‘Understand + Anticipate + Adapt’18

16 Future Character of Conflict (FCOC) Paper, 16 September 2009.
17 John F Sandoz, Joint Advanced Warfighting Program, Institute for Defense Analyses, USA, January 2001.
18 The Red Team Journal strapline (www.redteamjournal.com).


ANNEX A – ANALYTICAL TECHNIQUES

Diagnostic Techniques

1A1. Key Assumptions Check. List and review the key working assumptions on which fundamental judgements rest.

a. When to Use. This is most useful at the beginning of a project. An individual or a red team can spend time articulating and reviewing the key assumptions. Rechecking assumptions also can be valuable at any time prior to finalising judgements, to ensure that the assessment does not rest on flawed premises. Identifying hidden assumptions can be one of the most difficult challenges an analyst faces, as they are ideas held to be true – often unconsciously – and therefore are seldom examined and almost never challenged.

b. Value Added. Explicitly identifying working assumptions during a project helps:

• Explain the logic of the argument and expose faulty logic.

• Understand the key factors that shape an issue.

• Stimulate thinking about an issue.

• Uncover hidden relationships and links between key factors.

• Identify developments that would cause you to abandon an assumption.

• Prepare analysts for changed circumstances that could surprise them.

c. The Method. The aim is to consider how plans or ideas depend upon underpinning assumptions and to question the validity of those assumptions. A four-step process may be used:

(1) Review the current line of thinking on an issue; write it down for all to see.

(2) Articulate all the premises and assumptions, both stated and unstated, which are accepted as true for this line of thought to be valid.

(3) Challenge each assumption, asking why it ‘must’ be true and whether it remains valid under all conditions.

(4) Refine the list of key assumptions to contain only those that must be true to sustain your plan; consider under what conditions or in the face of what information these assumptions might not hold.


d. Questions to ask during this process include:

• How much confidence exists that this assumption is correct?

• What explains the degree of confidence in the assumption?

• What circumstances or information might undermine this assumption?

• Is a key assumption more likely a key uncertainty or key factor?

• Could the assumption have been true in the past but less so now?

• If the assumption proves to be wrong, would it significantly alter the plan? How?

• Has this process identified new factors that need further analysis?
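For teams that want the check to remain auditable, the steps above can be captured in a simple working aid. The Python sketch below is illustrative only and forms no part of the doctrine: the Assumption record, the 0.7 confidence threshold and all example data are hypothetical choices made for the example.

```python
from dataclasses import dataclass, field

@dataclass
class Assumption:
    """One working assumption underpinning the current line of thinking."""
    statement: str
    confidence: float                 # 0.0 (pure guess) to 1.0 (near certain)
    invalidating_conditions: list = field(default_factory=list)

def assumptions_to_challenge(assumptions, threshold=0.7):
    """Return assumptions whose confidence falls below the threshold,
    weakest first - candidates for explicit red team challenge."""
    weak = [a for a in assumptions if a.confidence < threshold]
    return sorted(weak, key=lambda a: a.confidence)

plan_assumptions = [
    Assumption("Host nation forces will secure the route", 0.4,
               ["local ceasefire collapses", "partner unit is re-tasked"]),
    Assumption("Adversary lacks anti-air capability", 0.9),
]

for a in assumptions_to_challenge(plan_assumptions):
    print(f"CHALLENGE: {a.statement} (confidence {a.confidence:.0%})")
    for c in a.invalidating_conditions:
        print(f"  watch for: {c}")
```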

1A2. Quality of Information Check. Evaluates completeness and soundness of available information sources.

a. When to Use. Weighing the validity of sources is a key feature of any critical thinking. Moreover, the confidence level a commander can have in his judgements and decisions depends upon the accuracy and reliability of the information base. Checking the quality of information used in analysis is an important and ongoing process. Receiving information from various sources is not necessarily a substitute for having good information that has been thoroughly examined. Red teams should perform periodic checks on the quality of the information base. Otherwise, important decisions can become anchored to weak information, and any caveats attached to that information in the past can be forgotten or ignored over time.

b. Value Added. A thorough review of information sources provides red teams with an accurate assessment of what we know and what we do not know. It is also an opportunity to confirm that sources have been cited accurately. In the case of Human Intelligence (HUMINT), this will require extensive review of the sources’ background information and access as well as his or her motivation for providing the information. Similarly, reviewing technical sourcing can sometimes reveal inadvertent errors in processing, translation, or interpretation that otherwise might have gone unnoticed. In addition, a quality of information check can be valuable to:

• Identify key intelligence gaps and new requirements for collectors.

• Assist commanders in understanding how much confidence to place in information and judgements derived from it.

• Help to detect possible deception and denial strategies by an adversary.


c. The Method. A red team might begin a quality of information check by developing a database in which information is stored according to source type and date, with additional notations indicating strengths or weaknesses of those sources. Ideally, the team should have a retrieval and search capability on the database to enable periodic reviews of the data. For the information review to be fully effective, the red team will need as much background information on sources as is feasible. Knowing the circumstances in which reporting was obtained is often critical to understanding its validity. With the data readily available the red team can:

• Review systematically all sources for accuracy.

• Identify information sources that appear most critical or compelling.

• Check for sufficient and strong corroboration of critical reporting.

• Re-examine previously dismissed information in light of new facts or circumstances that cast it in a different light.

• Consider whether ambiguous information has been interpreted and caveated properly.

• Indicate a level of confidence in sources which are likely to figure in future assessments.
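The database described in sub-paragraph c can be prototyped very simply. The sketch below is a minimal illustration, not a recommended system: the record layout, the source types and every report shown are hypothetical, and a real holding would also carry the background and caveat notes the method requires.

```python
from collections import defaultdict

# Each record: (report_id, source_type, date, claim, reliability_note)
reports = [
    ("R1", "HUMINT", "2009-11-02", "convoy moved north", "single source, unproven access"),
    ("R2", "IMINT",  "2009-11-03", "convoy moved north", "clear imagery"),
    ("R3", "HUMINT", "2009-11-05", "leader has left the area", "motivation uncertain"),
]

def corroboration(reports):
    """Group reports by claim and collect the independent source types
    behind each one, so single-source critical reporting stands out."""
    by_claim = defaultdict(set)
    for _, source_type, _, claim, _ in reports:
        by_claim[claim].add(source_type)
    return by_claim

for claim, sources in corroboration(reports).items():
    status = "corroborated" if len(sources) > 1 else "SINGLE-SOURCE - review"
    print(f"{claim}: {sorted(sources)} -> {status}")
```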

1A3. Indicators or Signposts of Change. Periodically review a list of observable events or trends to track events, monitor targets, spot emerging trends, and warn of unanticipated change.

a. When to Use. If a postulated situation is developing, for example economic reform, military modernisation, political instability, or democratisation, a red team can create a list of indicators or signposts of observable events that one would expect to see. Constructing the list might require only a few hours or as much as several days to identify the critical variables associated with a targeted issue. The technique can be used when a red team needs to track an event over time to monitor and evaluate changes. However, it can also be a very powerful aid in supporting other structured methods. In those instances, red teams would be watching for mounting evidence to support a particular hypothesis or low probability event. If there are sharply divided views on an issue, an indicators or signposts list can also depersonalise an argument by shifting attention to a more objective set of criteria, and provide clarity.

b. Value Added. By providing an objective baseline for tracking events or targets, indicators instil rigour into the process and enhance the credibility of judgements. An indicators list included in a finished product also allows the commander to track developments and builds a more concrete case for decision-making. By laying out a list of critical variables, red teams will be generating hypotheses about why they expect to see particular factors; their arguments will therefore be much more transparent to scrutiny by others.

c. The Method. Whether used alone, or in combination with other structured analysis, the process is the same:

• Identify a set of competing hypotheses or scenarios.

• Create separate lists of potential activities, statements, or events expected for each hypothesis or scenario.

• Regularly review and update the indicators lists to see which are changing.

• Identify the most likely or most correct hypotheses or scenarios, based on the number of changed indicators that are observed.

Developing 2 lists of indicators for each hypothesis or scenario may prove useful to distinguish between indicators that a development is or is not emerging. This is particularly useful in a ‘What If?’ analysis, when it is important to make a case that a certain event is unlikely to happen.
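The two-list approach lends itself to a simple tally. The sketch below is purely illustrative: the hypotheses, indicator lists and observations are invented, and the net-count scoring rule is an assumption for the example rather than doctrine.

```python
# Indicator lists per hypothesis: 'for' indicators suggest the development
# is emerging, 'against' indicators suggest it is not (the two-list approach).
hypotheses = {
    "military modernisation": {
        "for":     {"new equipment parades", "increased training tempo"},
        "against": {"defence budget cut"},
    },
    "political instability": {
        "for":     {"mass protests", "cabinet resignations"},
        "against": {"successful reform vote"},
    },
}

observed = {"new equipment parades", "mass protests", "cabinet resignations"}

def score(hypotheses, observed):
    """Net count of observed 'for' indicators minus observed 'against'
    indicators for each hypothesis - higher suggests stronger support."""
    return {
        name: len(ind["for"] & observed) - len(ind["against"] & observed)
        for name, ind in hypotheses.items()
    }

for name, s in sorted(score(hypotheses, observed).items(), key=lambda kv: -kv[1]):
    print(f"{name}: net indicators observed = {s}")
```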

1A4. Deception Detection. Systematic use of checklists can determine when deception may be present and how to avoid being deceived.

a. When to Use. Red teams should check for the possibility of deception, especially when there is a well-known history of its use. The search for clues that deception is being conducted is often time consuming and requires extensive fact checking and hypothesis testing. Nonetheless, it can be critical in cases where the stakes are high. Red teams should be most concerned about deception when the adversary would have a lot to gain through his efforts and has strong capabilities to deny or manipulate intelligence collection assets.

b. Value Added. Deception Detection can add rigour to analysis and reinforce the effectiveness of the other techniques covered in this guide. Red teams may place too much confidence in those techniques if they have not considered the possibility that deception is also present. For example, a well-developed set of indicators might actively mislead if it was partly developed from information purposely designed or fabricated by an adversary to mislead opponents. Posing the hypothesis of deception places a considerable cognitive burden on red teams: once the possibility is accepted, all the evidence is placed in question and it becomes difficult to draw any inferences from it with high confidence. A checklist of questions to detect possible deception can prevent paralysis of thinking.


c. The Method. If there is any possibility that deception could be present, the red team should assess the situation based on 4 sets of criteria:

• Does a foreign actor have the Motive, Opportunity and Means to deceive?

• Would this potential deception be consistent with Past Opposition Practices?

• Do we have cause for concern regarding the Manipulability of Sources?

• What can be learned from the Evaluation of Evidence?

In addition to using this Deception Detection technique, red teams can also use the technique of Analysis of Competing Hypotheses (ACH). This should explicitly pose deception as one of the multiple explanations for the presence or absence of information.
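The four criteria sets above can be run as a simple checklist tally. The sketch below is an illustration only: the yes/no answers are invented, and the two-criteria trigger for escalating to ACH is an assumed threshold, not part of the method.

```python
# The four criteria sets from the method, phrased as yes/no questions.
# Answers and the escalation threshold are illustrative assumptions.
checklist = {
    "Motive, Opportunity and Means to deceive":   True,
    "Consistent with Past Opposition Practices":  True,
    "Concern over Manipulability of Sources":     False,
    "Evaluation of Evidence raises anomalies":    True,
}

flags = sum(checklist.values())
print(f"{flags}/4 deception criteria met")
if flags >= 2:
    print("Treat deception as a live hypothesis and carry it into ACH.")
```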

1A5. Analysis of Competing Hypotheses. Identification of alternative explanations (hypotheses) and evaluation of all evidence that will disprove rather than confirm hypotheses.

a. When to Use. ACH is a highly effective technique when there is a large amount of data to absorb and evaluate. It is most effective when red team members actively challenge each other’s evaluation of the evidence. The red team should develop a matrix of hypotheses and input the supporting evidence for each hypothesis to examine where the weight of evidence lies. ACH is particularly appropriate for controversial issues when red teams want to develop a clear record that shows the theories they have considered and how they arrived at their judgements. The ACH matrix allows others to review the analysis and identify areas of agreement and disagreement. Evidence can be examined systematically, making the technique ideal for considering the possibility of deception and denial.

b. Value Added. ACH helps red teams overcome 3 common mistakes that can lead to inaccurate forecasts:

• Undue influence of a first impression, based on incomplete data, an existing line of thinking, or a single explanation that seems to fit well enough.

• Planning teams seldom generate a full set of explanations or hypotheses at the outset of a project.

• Evidence which supports a preferred hypothesis may also be consistent with other explanations.


In essence, ACH enables red teams to assist the commander to avoid picking the first solution that seems satisfactory, rather than going through all the possibilities to arrive at the best solution.

c. The Method. Explicitly identify all the reasonable alternative hypotheses, then array the evidence against each hypothesis, rather than evaluating the plausibility of each hypothesis one at a time. To create a level playing field, the process must:

• Ensure that all information and arguments are evaluated and given equal treatment or weight when considering each hypothesis.

• Prevent premature closure on a particular explanation or hypothesis.

• Protect against innate tendencies to ignore or discount information that does not fit comfortably with the preferred explanation at the time.

To accomplish this, the process should follow these steps:

• Brainstorming among red teamers with different perspectives to identify all possible hypotheses.

• List all significant evidence and arguments relevant to all the hypotheses.

• Prepare a matrix with hypotheses across the top and each piece of evidence on the side. Determine whether each piece of evidence is consistent, inconsistent, or not applicable to each hypothesis.

• Refine the matrix and reconsider the hypotheses – in some cases, red teams will need to add new hypotheses and re-examine the information available.

• Focus on disproving hypotheses rather than proving one. Identify and weigh up the evidence that is consistent with each hypothesis to see which explanations are strongest.

• Analyse how sensitive the ACH results are to a few critical items of evidence; should those pieces prove to be wrong, misleading, or subject to deception, how would it impact an explanation’s validity?

• Ask what evidence is not being seen but would be expected for a given hypothesis to be true. Is denial and deception a possibility?

• Establish the relative likelihood of the hypotheses and report all the conclusions, including the weaker hypotheses that should still be monitored as new information becomes available.


• Identify and monitor indicators that would be both consistent and inconsistent with the full set of hypotheses. In the latter case, explore what could account for inconsistent data.
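The matrix at the heart of ACH is easy to prototype. The sketch below is a minimal illustration under stated assumptions: the hypotheses, evidence and consistency ratings are all invented, and ranking by inconsistency count is a deliberately simple stand-in for a fuller weighting of the evidence.

```python
# Consistency of each piece of evidence with each hypothesis:
# 'C' consistent, 'I' inconsistent, 'N' not applicable (ratings invented).
hypotheses = ["H1 attack imminent", "H2 exercise only", "H3 deception"]
matrix = {
    "troops massing on border": {"H1 attack imminent": "C", "H2 exercise only": "C", "H3 deception": "C"},
    "no logistics build-up":    {"H1 attack imminent": "I", "H2 exercise only": "C", "H3 deception": "C"},
    "leadership travel abroad": {"H1 attack imminent": "I", "H2 exercise only": "C", "H3 deception": "N"},
}

def inconsistency_counts(matrix, hypotheses):
    """ACH focuses on disproof: the hypothesis with the least inconsistent
    evidence survives, not the one with the most supporting evidence."""
    counts = {h: 0 for h in hypotheses}
    for ratings in matrix.values():
        for h, rating in ratings.items():
            counts[h] += rating == "I"
    return counts

for h, n in sorted(inconsistency_counts(matrix, hypotheses).items(),
                   key=lambda item: item[1]):
    print(f"{h}: {n} piece(s) of inconsistent evidence")
```

Sorting by inconsistency, rather than by support, reflects the instruction above to focus on disproving hypotheses.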

Contrarian Techniques

1A6. Devil’s Advocacy. Devil’s Advocacy is challenging a single, strongly held view or consensus by building the best possible case for an alternative explanation.

a. When to Use. This technique is most effective when used to challenge a consensus or a key assumption regarding a critically important issue. On those issues that one cannot afford to get wrong, Devil’s Advocacy can provide further confidence that the line of thinking will hold up to close scrutiny. An individual red teamer can often assume the role of the Devil’s Advocate if he or she has some doubts about a widely held view, or a commander might designate a critical thinker to challenge the prevailing wisdom in order to reaffirm the group’s confidence in those judgements. In some cases, the red team can review a key assumption of a critical judgement in the course of their work, or more likely, a separate product can be generated that arrays all the arguments and data that support a contrary hypothesis. While this can involve time and effort, when a planning group has worked on an issue for a long period of time, a strong mind-set is likely to exist which may warrant the closer scrutiny provided by Devil’s Advocacy.

b. Value Added. The Devil’s Advocacy process can highlight weaknesses in thinking or alternatively help to reaffirm confidence in prevailing judgements by:

• Explicitly challenging key assumptions to see if they will not hold up under some circumstances.

• Identifying any faulty logic or information that would undermine the key judgements.

• Presenting alternative hypotheses that could explain the information available.

Its primary value is to serve as a check on a dominant mind-set that can develop over time even amongst the best planning teams, who have followed an issue and formed a strong consensus view. This mind-set phenomenon makes it more likely that contradictory evidence is dismissed or not given proper weight or consideration. An exercise aimed at highlighting such evidence and proposing another way of thinking about an issue can expose hidden assumptions and compel planners to review their information with greater scepticism. The technique should ensure one of 3 outcomes: current thinking is sound; the argument is strong but there are areas where further analysis is needed; or there are flaws in logic or supporting evidence suggesting that a different line of thinking is required or heavy caveats are needed.

c. The Method. To challenge the prevailing thinking the Devil’s Advocate must:

• Consider the main line of thinking and the key underpinning assumptions, then identify the supporting evidence.

• Select one or more assumptions – stated or not – that appear the most susceptible to challenge.

• Review the evidence to determine whether any is of questionable validity, whether deception is possibly indicated, or whether major gaps exist.

• Highlight any evidence that could support an alternative hypothesis or contradicts the current thinking.

• Present to the group the findings that demonstrate there are flawed assumptions, poor quality evidence, or possible deception at work.

• If the review uncovers major flaws, consider drafting a separate contrarian paper that lays out the arguments for a different conclusion.

• Be sure that any products generated clearly lay out the conventional wisdom and are explicitly identified as a Devil’s Advocate project; otherwise, the reader can become confused as to the current official view on the issue.

1A7. Team A/Team B. Use of separate teams that contrast 2 (or more) strongly held views or competing hypotheses.

a. When to Use. If there are at least 2 competing views within an organisation or competing opinions on a key issue, then Team A/Team B analysis can be used to help resolve those differences. Developing a full-blown Team A/Team B exercise requires a significant commitment of time and resources, so it is worth considering whether the issue merits this kind of attention. A longstanding strategic issue, a critical decision that has far-reaching implications, or a dispute within a community that has obstructed effective cross-agency cooperation would be grounds for using Team A/Team B. If those circumstances exist, then red teams will need to review all of the data to develop alternative papers that can capture the essential differences between the two viewpoints.

b. Value Added. The Team A/Team B approach can help opposing experts see the merit in the other group’s perspective. The process of conducting such an exercise can reduce the friction and even narrow the differences. At a minimum, it allows those holding opposing views to feel that their views have been given equal attention. For the commander, this technique helps to surface and explain important differences within the expert community. Often senior officials can learn more by weighing well-argued conflicting views than from reading an assessment that masks substantive differences or drives analysis to the lowest common denominator. By making the key assumptions and information used for each argument more transparent, a commander can judge the merits of each case, pose questions back to the red teams, and reach an independent judgement on which argument is the strongest. If opposing positions are well established, it can be useful to place staff on teams that will advocate positions they normally do not support; forcing a member of staff to argue for the other side can often make them more aware of their own mind-set.

c. The Method.

(1) Analysis Phase. A Team A/Team B exercise can be conducted on an important issue to:

• Identify the 2 (or more) competing hypotheses or points of view.

• Form teams or designate individuals to develop the best case that can be made for each hypothesis.

• Review all pertinent information that supports their respective positions.

• Identify missing information that would buttress their hypotheses.

• Structure each argument with an explicit presentation of key assumptions, key pieces of evidence, and careful articulation of the logic behind the argument.

(2) Debate Phase. A presentation of the alternative arguments and rebuttals in parallel fashion can then be organised for the benefit of other staff:

• Set aside time for a presentation of the alternative team findings; this can be an informal brainstorming session or a more formal debate.

• Have an independent jury of peers to listen to the presentation and be prepared to question the teams regarding their assumptions, evidence, or logic.


• Allow each team to present their case, challenge the other team’s arguments, and rebut the opponent’s critique of its case.

• Let the jury consider the strength of each presentation and recommend possible next steps for further research and collection efforts.

1A8. High-Impact/Low-Probability Analysis. Highlights a seemingly unlikely event that would have major policy consequences if it happened.

a. When to Use. This is a contrarian technique that enables red teams to explore and demonstrate the potential impact of seemingly low probability events that would have major repercussions on UK interests, operations or plans. Using this technique is advisable when commanders are convinced that an event is unlikely but have not given much thought to the consequences of its occurrence. In essence, this can be a warning that the intelligence and policy communities must be alert to an unexpected but not impossible event.

b. Value Added. Mapping out the course of an unlikely, yet plausible, event can uncover hidden relationships between key factors and assumptions; it also can alert red teams to oversights. In addition, an examination of the unthinkable allows red teams to develop signposts that may provide early warning of a shift in the situation. By periodically reviewing these indicators a red team is more likely to be able to counter any prevailing mind-set that such a development is highly unlikely.

c. The Method. If there is a strongly held view that an event is unlikely, then postulating precisely the opposite should not be difficult.

• Define the high-impact outcome clearly. This process is what will justify examining very unlikely developments.

• Devise one or more plausible explanations for or pathways to the low probability outcome. This should be as precise as possible, as it can help identify possible indicators for later monitoring.

• Insert possible triggers or changes in momentum if appropriate. These can be natural disasters, threats to key leaders, or plausible economic or political shocks. Brainstorming may be necessary to identify these unpredictable triggers of sudden change.

• Identify for each pathway a set of indicators or observable events that would help to recognise these situations developing.

• Identify factors that would deflect a bad outcome or encourage a positive outcome.


1A9. ‘What If?’ Analysis. Assumes that an event has occurred with potential (negative or positive) impact and explains how it might come about.

a. When to Use. A technique for challenging a strong mindset by positing that events may not happen as planned or that a confidently made forecast may not be entirely justified. It is similar to a High-Impact/Low-Probability analysis, but it does not dwell on the consequences of the event as much as it accepts the significance and moves directly to explaining how it might come about.

b. Value Added. By shifting the focus from whether an event could occur to how it may happen, the red team suspends judgement about the likelihood of the event and focuses more on what developments – even unlikely ones – might enable such an outcome. A red team might employ this technique and repeat the exercise whenever a critical judgement is made.

Using this technique is particularly important when a judgement rests on limited information or unproven assumptions. Moreover, it can free red teams from arguing about the probability of an event, allowing them to consider its consequences and to develop indicators or signposts for its possible emergence. It will help red teams understand the impact of an event, the factors that could cause or alter it, and the likely signposts that an event is imminent, and so provide relevant input to the planning process. A ‘What If?’ analysis can complement a difficult judgement and caution the commander against accepting the conventional wisdom without considering the costs and risks of being wrong. This can help commanders consider options, including the unlikely.

c. The Method. ‘What If?’ analysis must begin by stating clearly the accepted line of thinking and then stepping back to consider what alternative outcomes are too important to dismiss, even if unlikely. Brainstorming can develop one or more plausible scenarios by which the unlikely event occurs:

• Assume the event has happened.

• Select some triggering events that permitted the scenario to unfold to help make the ‘what if’ more plausible; for example, red teams might postulate the death of a leader, a natural disaster, or some economic event that would start a chain of other events.

• Develop a line of argument based as much on logic as evidence to explain how this outcome could have come about.

• Working backwards from the event in concrete ways – specifying what must actually occur at each stage of the scenario – is often very useful.

• Identify one or more plausible pathways or scenarios to the unlikely event; very often more than one will appear possible.


• Generate a list of indicators or observable events for each scenario that would help to detect the beginnings of the event.

• Consider the scope of the positive and negative consequences of each scenario and their relative impacts.

• Monitor the indicators developed on a periodic basis.

Imaginative Thinking Techniques

1A10. Brainstorming. An unconstrained group process designed to generate new ideas and concepts.

a. When to Use. This technique stimulates new thinking and is therefore a useful aid in virtually all of the other structured techniques. Typically, both red and blue teams will conduct brainstorming when they begin a project to help generate a range of hypotheses about the problem. Brainstorming, by definition, involves a group meeting to discuss a common challenge; a modest investment of time at the beginning or at critical points of a project can take advantage of the group members’ different perspectives to help structure a problem. This group process allows others to build on an initial idea suggested by a member of the brainstorming session. Individuals working alone can produce a wider range of ideas than a group might generate, without regard for others’ egos, opinions, or objections. However, an individual will not have the benefit of others’ perspectives to help develop the ideas as fully. Moreover, an individual may have difficulty breaking free of his or her cognitive biases without the benefit of a diverse group.

b. Value Added. This technique can maximise creativity in the thinking process, force team members to step outside their normal mind-sets, and suspend their judgement about the practicality of ideas or approaches. More generally, brainstorming allows team members to see a wider range of factors that might bear on the topic than they would otherwise consider. Military teams typically censor out ideas that seem far-fetched, poorly sourced, or irrelevant to the question at hand. Brainstorming gives permission to think more radically or outside the box. In particular, it can spark new ideas, ensure a comprehensive look at a problem or issue, raise unknowns, and prevent premature consensus around a single hypothesis – thus enabling the red team to provide a more wide-ranging set of ideas and thoughts to underpin a project or plan.

c. The Method. Paradoxically, to be most productive, brainstorming should be a structured process. An unconstrained, informal discussion might produce some interesting ideas, but usually a more systematic process is the most effective way to break down mind-sets and produce new insights. Brainstorming can be ineffective if the group tries to generate all ideas in plenary session. Tasking individuals to generate ideas first, which are then analysed and tested by the group, is a more reliable way of generating diverse thinking. In particular, the process involves a divergent thinking phase to generate and collect new ideas and insights, followed by a convergent phase in which ideas are grouped and organised around key concepts. Some of the simple rules to be followed include:

• Never censor an idea no matter how unconventional it might sound.

• Rather, find out what prompted the thought, as it might contain the seeds of an important connection between the topic and an unstated assumption.

• Give yourself enough time to do brainstorming correctly. It usually takes some time to set the rules of the game, get the group comfortable, and exhaust the conventional wisdom on the topic. Only then will the truly creative ideas begin to emerge.

• Involve at least one outsider in the process – that is, someone who does not share the same educational background, culture, technical knowledge or mindset as the core group, but has some familiarity with the topic.

• A two-phase, twelve-step, structured process is often used to get the most out of the brainstorming sessions:

d. Divergent Thinking Phase. Distribute Post-It notes and pens or markers to all participants. Typically, 10-12 people works best.

• Pose the problem in terms of a focal question. Display it in one sentence on a large easel or whiteboard.

• Ask the group to write down responses to the question, using key words that will fit on the small Post-It note.

• Stick all the notes on a wall for all to see – treat all ideas the same.

• When a pause follows the initial flow of ideas, the group is reaching the end of its conventional thinking and the new divergent ideas are then likely to emerge.

• End the collection stage of the brainstorming after two or three pauses.

e. Convergent Thinking Phase.

• Ask the participants as a group to rearrange the notes on the wall according to their commonalities or similar concepts. No talking is permitted. Some notes may be moved several times as notes begin to cluster. Copying some notes is permitted to allow ideas to be included in more than one group.

• Select a word or phrase that characterises each grouping or cluster once all the notes have been arranged.

• Identify any notes that do not easily fit with others and consider them either useless noise or the beginning of an idea that deserves further attention.

• Assess what the group has accomplished in terms of new ideas or concepts identified or new areas that need more work or further brainstorming.

• Instruct each participant to select one or two areas that deserve the most attention. Tabulate the votes.

• Set the brainstorming group’s priorities based on the voting and decide on the next steps for analysis.

1A11. Outside-In Thinking. Used to identify the full range of basic forces, factors, and trends that would indirectly shape an issue.

a. When to Use. At the beginning of a project, when the goal is to identify all the critical, external factors that could influence how a particular situation will develop. Outside-in Thinking can reduce the risk of missing important variables early in the planning process.

b. Value Added. Most military staff think from the inside – namely, what they control – out to the broader world. Conversely, thinking from the outside-in begins by considering the external changes that might, over time, profoundly affect a plan or issue. This technique enables the red team to get away from their immediate thinking and think about issues in a wider conceptual and contextual framework. By recasting the problem in much broader and fundamental terms, red teams are more likely to uncover additional factors, an important dynamic, or a relevant alternative hypothesis.

c. The Method. Develop a generic description of the problem or the phenomenon under study. Then:

• List all the key forces (social, technological, economic, environmental, and political) that could have an impact on the topic, but over which one can exert little influence (e.g. globalisation, social stress, the internet, or the global economy).

• Focus next on key factors over which an actor can exert some influence. In the business world this might be the market size, customers, the competition, suppliers or partners; in the military domain it might include the actions or behaviour of allies or adversaries.

• Assess how each of these forces could affect the problem.

• Determine whether these forces actually do have an impact on the particular issue based on the available evidence.
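The force-listing step can be organised around the social, technological, economic, environmental and political categories named above. The sketch below is only an illustrative aid: every force and impact rating is invented, and the simple high/medium/low scale is an assumption chosen for the example.

```python
# External forces grouped by the categories named in the method,
# each with an illustrative impact rating (all entries hypothetical).
steep_forces = {
    "social":        [("urbanisation", "medium")],
    "technological": [("spread of mobile internet", "high")],
    "economic":      [("global downturn", "high")],
    "environmental": [("drought in the region", "low")],
    "political":     [("sanctions regime", "medium")],
}

# Surface the high-impact forces first, so the team can test whether the
# available evidence actually supports each assessed impact.
for category, forces in steep_forces.items():
    for force, impact in forces:
        if impact == "high":
            print(f"Examine first: {force} ({category})")
```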

1A12. Surrogate Adversary/Role Play. Models the behaviour of an individual or group by trying to replicate how an adversary would think about an issue.

a. When to Use. When commanders face the challenge of forecasting how an adversary, competitor or other actor may behave, there is a risk of falling into a mirror-image problem. That is, we can sometimes impute to a foreign actor the same motives, values, or understanding of an issue that we hold. Traditional thinking sometimes assumes that foreign leaders or groups will behave as we would if faced with the same threats or opportunities. History has shown that others often respond differently to events because of different cultural, organisational or personal experiences. Red teams should try to consciously place themselves in the same cultural, organisational, and personal setting as the one in which the target individual or group operates (put themselves in the shoes of the adversary).

b. Value Added. Like Devil’s Advocacy and Team A/Team B techniques, playing a Surrogate Adversary is aimed at freeing blue from the prison of a well-developed mind-set; in this case, the blue players’ own sense of rationality, cultural norms, and personal values. Surrogate adversary techniques transform the red teamer into an actor operating within the adversary’s culture and political milieu. This form of role playing is useful when trying to replicate the mind-set of authoritarian leaders, terrorist cells, or other non-Western groups that operate under very different codes of behaviour or motivations. Often the technique can introduce new or different stimuli that might not have been factored into traditional thinking – such as the target’s familial ties or the international political, economic, and military pressures felt by the individual. For example, surrogate adversaries might ask themselves: “What would my peers, family, or tribe expect me to do?” Alternatively, a surrogate adversary might pose the question to his colleagues: “How do we perceive the external threats and opportunities?” Finally, this technique can factor in how personal power and status might influence a target’s behaviour.

c. The Method. For this technique to work, it is essential that the red team contains experts with in-depth knowledge of the adversary, competitor or other actor. They will need to understand as best they can relevant history and geography, politics, cultures, and customs of the focus group. It is likely that suitable experts will share an appropriate ethnic background or have worked or closely studied the group of interest. The team members should:


• Put themselves in the adversary’s circumstances and react to foreign stimuli as the target would.

• Develop a set of first-person questions that the adversary would ask, such as: “how would I perceive incoming information; what would be my personal concerns; or to whom would I look for an opinion?”

• Draft a set of policy papers in which the leader or group makes specific decisions, proposes recommendations, or lays out courses of actions. The more these papers reflect the cultural and personal norms of the adversary, the more they can offer a different perspective on the problem.

Playing a surrogate adversary is not easy. It requires significant time to develop a team of qualified experts who can think like the adversary. The red team has to distance itself from blue and work as though living in the world of the adversary. Without a sophisticated understanding of the culture, operational environment, and personal histories of the adversary, red teamers will not be able to behave or think like the enemy. Red teamers can never truly escape their own experiences and mindsets, but this technique can at least prevent them from unconsciously falling into mirror-imaging.

1A13. Alternative Futures Analysis. Systematically explores multiple ways in which a situation can develop when there is high complexity and uncertainty.

a. When to Use. This technique is most useful when a situation is viewed as too complex or the outcomes as too uncertain to trust a single outcome assessment. First, red teams must recognise that there is a high degree of uncertainty surrounding the topic in question. Second, they and the wider staff should recognise that they need to consider a wide range of factors that might bear on the question. And third, they must be prepared to explore a range of outcomes rather than be drawn to any preconceived result. Depending on how elaborate the problem is, the effort can amount to a considerable investment in time, resources, and money. A red team can spend several hours or days conducting brainstorming and developing multiple futures; alternatively, a larger-scale effort can require preparing a multi-day workshop that brings together a larger number of participants, including outside experts. Such an undertaking often demands the special skills of trained scenario-development facilitators and conferencing facilities. This technique is in sharp contrast to contrarian techniques, which try to challenge high confidence and relative certitude about an event or trend. Multiple futures development is a divergent thinking technique that tries to use the complexity and uncertainty of a situation to describe multiple outcomes or futures that the commander should consider, rather than to predict one outcome.


b. Value Added. This technique is useful in highly ambiguous situations, when commanders confront not only a lot of known unknowns but also unknown unknowns. What this means is that commanders recognise that there are factors, forces, and dynamics among key actors that are difficult to identify without the use of some structured technique that can model how they would interact or behave. As the outcomes are not known prior to the futures exercise, commanders must be prepared for the unexpected and be willing to engage in a more free-wheeling exchange of views than typically occurs in order to imagine the future. Futures analysis done well is resource and time intensive. From past experience, red teams have found that involving commanders in the alternative futures exercise is the most effective way to communicate the results of this exploration of alternative outcomes and sensitise them to key uncertainties. Most participants find the process of developing scenarios as useful as any finished product that attempts to capture the results of the exercise. Commanders benefit from this technique in several ways:

• It provides an effective means of weighing multiple unknowns or unknowable factors and presenting a set of plausible outcomes.

• It can help to bound a problem by identifying plausible combinations of uncertain factors.

• It provides a broader framework for calculating the costs, risks, and opportunities presented to commanders by different outcomes.

• It aids commanders in anticipating what otherwise would be surprising developments by forcing them to challenge assumptions and consider possible wild cards or discontinuous events.

• It generates indicators which can be used to monitor developments and assess trends.

c. The Method. The most common approach involves the following steps:

• Develop the focal issue by systematically interviewing experts and officials who are examining the general topic.

• Convene a group of experts (both internal and external) to brainstorm the forces and factors that could affect the focal issue.

• Select by consensus the two most critical and uncertain forces and convert these into axes or continua with the most relevant endpoints assigned.


• Establish the most relevant endpoints for each factor; e.g., if economic growth were the most critical uncertain force, the endpoints could be fast and slow or transformative and stabilising depending on the type of issue addressed.

• Form a futures matrix by crossing the 2 chosen axes. The 4 resulting quadrants provide the basis for characterising alternative future worlds.

• Generate narratives that describe these futures and how they could plausibly come about. Signposts or indicators of progress can then be developed.

• Participants can then consider how current decisions or strategies would fare in each of the four worlds and identify alternative plans that might work better either across all the futures or in specific ones. By anticipating alternative outcomes, commanders have a better chance of either devising strategies flexible enough to accommodate multiple outcomes or of being prepared and agile in the face of change.1
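The mechanics of crossing the two axes are simple enough to sketch. The example below is illustrative only: the two forces and their endpoint labels are invented, and a real exercise would go on to write the narratives, signposts and plan tests the steps above describe.

```python
from itertools import product

# The two most critical and uncertain forces, each with two endpoints
# (labels are purely illustrative).
axis_1 = ("economic growth: fast", "economic growth: slow")
axis_2 = ("governance: stabilising", "governance: fragmenting")

# Crossing the two axes yields the four quadrant 'worlds' of the matrix.
for i, (a, b) in enumerate(product(axis_1, axis_2), start=1):
    print(f"World {i}: {a} / {b}")
    # Next steps per the method: narrative, signposts, test current plans.
```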

1 Additional guidance on analytical techniques may be found in a US Government publication, A Tradecraft Primer: Structured Analytic Techniques for Improving Intelligence Analysis (March 2009), available at https://www.cia.gov/library/center-for-the-study-of-intelligence/csi-publications/books-and-monographs/Tradecraft%20Primer-apr09.pdf


ANNEX B – CULTURAL CAPABILITY

1B1. Cultural Capability is defined as:

‘The ability to understand culture, and to apply this knowledge to effectively engage in any environment. Cultural capability comprises three levels: awareness, understanding and competence.’1

1B2. People from different cultures may behave differently from the way we do and their underpinning perception of the world, beliefs and values may be different from ours. To understand why they act in the way they do, we need to try to see their world in the way that they do. We also need to appreciate how they perceive our world view.

1B3. Culture is underpinned by a large number of characteristics including:

a. Structures and Politics – the structures, dynamics and agendas of government, as well as defence, law and order, economics and commerce.

b. History – its effect in shaping national, regional and tribal attitudes, beliefs and relations, including perceptions of the West, the UK and religion.

c. Social – social, religious or cultural conventions which shape operational and social interaction, for example entering homes, searching, meetings, use of weapons, the acceptance of hospitality, alcohol and gifting.

d. Daily Life – routines, employment, education, worship, sport, literacy, poverty, diet, home ownership, access to utilities and wages.

e. Communications and Language – verbal and non-verbal greetings, insults, words, phrases, gestures and taboos.

1B4. Examination of the culture of actors or groups in isolation is only part of the consideration. Red teams must appreciate that the impact of culture on relationships is crucial and that there are many different perspectives in any set of relationships. These may broadly be summarised as:

• Blue views of Blue.

• Red views of Red.

• Blue views of Red (and White, Green, Brown).

• Red views of Blue (and White, Green, Brown).

1B5. The red team must recognise the existence of red and blue perspectives, as well as those of other actors who may have influence, take them into account and realise the implications when conducting analysis of the likely actions and reactions of adversaries, partners and neutrals.

1 Joint Doctrine Note (JDN) 1/09, The Significance of Culture to the Military.

1B6. A Cultural Analysis Template, developed to aid the analysis of any group (adversary, partner or neutral) and to contribute to situational understanding at Step 1 of the estimate process, or on any other occasion, is set out below.

Cultural Analysis Template

1B7. This template2 is under development to aid the analysis of any cultural group (adversary, partner or neutral).

a. The basic information required on any cultural/social group includes:

• Group name/primary identity. • Size of group and support base. • Basic demographical information. • Military disposition and threat, if relevant. • Area of operation and/or influence. • Identifiable cultural beliefs, symbols and taboos. • Dos and don’ts when interacting with members of the group.

1B8. The following questions may help to elicit the above details:

Q1. What makes the group a group?

• How does the group describe its history and where it came from? What are the key formative events in the group’s history, and what is their importance?
• How does the group perceive current and past events?
• Do the group members share religious beliefs? If so, what are they? If not, what other beliefs do they share?
• What are the important rituals that the group uses?

Q2. How is the group organised?

• What are the important relationships and does kinship play a role?
• Does class play a role?
• Does ethnicity play a role?
• Does tribalism play a role?

2 Derived from a combination of work: Professor A King’s (Exeter University) Cultural Estimate draft, J Bastrup-Birk’s ARIA Group Tool draft and Defence Intelligence Human Factors Generic Anthropological Assessment Template.


Q3. What motivates the group? Consider: political; resistance; economic gain; social change; service provision; and/or information provider.

• What are the key attitudes and motivations of individual members (e.g. same as group objective, power status, personal economic, social change, protection)?

• What are the principal means of achieving their objectives (e.g. political, non-violent social, violent)?

• How are their objectives expressed internally and externally?

Q4. What relationships are seen within the group and with people outside the group?

• How does the group relate to those around it (how isolated or otherwise is it)?
• What is the group’s support base?
• How often do people move into and out of the group? How long do people stay members of the group?
• What are the critical internal relationships within the group? What causes these relationships to become critical, and when?
• What are the critical external relationships to the group (who is interacting with whom)? What causes these relationships to become critical, and when?
• Who are the central players in the network and why?
• What is the nature of the group’s relationship to the wider population and the state?
• How similar is the group to others around it, and what values are shared with, or differ from, other groups?
• How do people become members of the group?
• Do people leave the group? If so, how and why?

Q5. What are the group’s primary economic resources?

• How do they gain these resources?
• Who, if anyone, does the group rely on to gain resources?
• What capabilities do these resources give to the group?

Q6. What forms does power and political organisation take in the group?

• What political influence does the group have?
• How is power achieved and maintained in the group?
• Does the group have leaders and, if so, how are they determined?
• What are the roles and characteristics of the leaders?


• What is the leadership style (e.g. directive, consensus-driven, laissez-faire)?
• Who are the key opinion formers and gatekeepers for information?
• How is information disseminated within the group?
• How is status determined?
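
Where completed templates are held digitally, the basic information fields at paragraph 1B7 map naturally onto a simple structured record. The following is a minimal, hypothetical sketch in Python; the type and field names are assumptions for illustration and do not form part of the template:

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class CulturalGroupRecord:
    """One completed Cultural Analysis Template entry (illustrative only)."""
    primary_identity: str          # Group name/primary identity
    size_and_support_base: str     # Size of group and support base
    demographics: str              # Basic demographical information
    military_disposition: str      # Military disposition and threat, if relevant
    area_of_influence: str         # Area of operation and/or influence
    beliefs_symbols_taboos: List[str] = field(default_factory=list)
    dos_and_donts: List[str] = field(default_factory=list)
    question_notes: Dict[str, str] = field(default_factory=dict)  # Free-text answers to Q1-Q6

# Example: a record is built up as the questions at 1B8 are worked through.
record = CulturalGroupRecord(
    primary_identity="(group name)",
    size_and_support_base="(size and support base)",
    demographics="(basic demographics)",
    military_disposition="(disposition and threat, if relevant)",
    area_of_influence="(area of operation/influence)",
)
record.question_notes["Q1"] = "What makes the group a group? - notes"
```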


ANNEX C – RED TEAMING EXPERTISE AND CONTACTS

1C1. Permanent Joint Headquarters and Deployed Headquarters. Permanent Joint Headquarters (PJHQ) and Deployed Headquarters use red cells as part of their campaign planning procedure in accordance with Joint Doctrine Publication (JDP) 5-00 (2nd Edition) Campaign Planning. Red cell members are generally appointed from within the staff (often J2) specifically for the planning event, and tend to adopt a surrogate adversary approach. Although considered an essential part of the planning process, the red teamers often have other planning tasks and generally have no formal red team training. This may be an area for development through further education and training, where there is a prospect of adding greater value to the planning and decision-making process.

1C2. PJHQ also employs a red team analyst on its J2 staff to provide longer-term challenge thinking on specific operational scenarios. Over the last year, the current post-holder has focused on inviting experts and institutions from outside government and the military to provide ‘red’ views at bespoke study days, presentations and round-table events. For example, last year Messrs Semple and Patterson, the diplomats expelled from Afghanistan in December 2007 for talking to the Taliban; Dr Antonio Giustozzi, an Afghan/Taliban specialist from the London School of Economics; and Professor Stephen Chan, a specialist on Zimbabwe, all attended red team events.

1C3. Joint Services Command and Staff College. During the Campaign Planning exercises at the Joint Services Command and Staff College (JSCSC), red cells are appointed from within the war gaming syndicates to challenge the thinking of the blue planners. Red cell members generally act as surrogate adversaries to the blue plans; using international students in this role potentially provides different perspectives and more critical analysis. Red teaming techniques do not form part of any of the JSCSC syllabi, and no formal red team training is provided prior to Campaigning. Contact: http://defac.ac.uk/

1C4. Defence Intelligence Staff. The Butler report of 2004 highlighted the need for challenge to analytical processes and assumptions within the Intelligence Community. A small analytical challenge team now forms part of the Strategic Futures section of the Defence Intelligence Staff (DIS). Their mission is to promote an analytical challenge function and culture within DIS, through the use of analytical techniques to aid structured thinking, test assumptions and consider a wider range of outcomes and scenarios. They may also provide a limited consultancy service to other MOD departments and Other Government Departments (OGDs); examples include regular interaction with the Development, Concepts and Doctrine Centre (DCDC) Strategic Trends section, provision of notes to the National Police Improvement Agency (NPIA) and advice to Her Majesty’s Revenue and Customs (HMRC) red team. Contact: http://www.mod.uk/DefenceInternet/AboutDefence/WhatWeDo/SecurityandIntelligence/DIS/

1C5. Training for the challenge thinking role in DIS is provided by the Defence School of Intelligence at Chicksands. New joiners to DIS undertake the Defence Intelligence Foundation Course (Analysis) (DIFC(A)), which lasts 4 days. This may be followed by the Defence Intelligence Service Analytical Course, lasting 6 days; this is primarily aimed at DIS new joiners but also serves as a refresher for longer-serving analysts. The courses are open to the MOD (military and civilian) and to wider government departments, as well as appropriate non-governmental organisations. Contact: DCI, Chicksands, Shefford, Bedfordshire, SG17 5PR.

1C6. The Research and Analysis Branch. The Research and Analysis Branch (R&AB) (formerly ARAG) within the Defence Academy has traditionally carried a red teaming capability, and has recently developed a new challenge technique called Prism. This essentially employs subject matter experts and analytical thinkers to consider concepts or issues through the different lenses of a prism, i.e. from different perspectives. The team considers the interactions of decision-takers, policymaking bodies, strategies and operational designs through various types of critical and creative thinking, including brainstorming, ‘what if’ analysis and role-play.1 The process is considered by the originator2 to be:

‘An inoculation against the most recurrent failure of international relations, that is misperception largely induced through self-deception…. The real value of a prismatic view comes from understanding and mapping the dynamics of risky relationships, a reality that is often glossed over in the pursuit of certitude and faith in a plan.’3

As with other red teaming methods, the technique is subject to continuous evolution and challenge, according to the context and circumstances of the subject, and is not a set process. Contact: http://defac.ac.uk/

1 Terms are defined in the Lexicon.
2 Mr Chris Donnelly (Senior Fellow, Defence Academy, and former Head of ARAG).
3 ARAG note to JFCOM on Multiple Futures Project, 18 March 2009.

1C7. Land Warfare Centre. The Land Warfare Centre (LWC) includes an established Red Teaming capability embedded in the Warfare Development Branch (WarDev) Research Team in the Land Warfare Development Group (LWDG). The WarDev Research Team’s purpose is to enhance the British Army’s understanding of the Contemporary and Future Operating Environments (COE, FOE) through operationally relevant cultural and historical research and analysis, and Red Teaming. The focus is on land operations at the tactical level, while remaining coherent with joint doctrine and concepts, and linked to the wider security and developmental work at the strategic level. Within this overall context, the Red Team provides formalised, structured and independent critical and creative thinking (Red Teaming) to enhance the robustness and depth of WarDev research and analysis. The Red Team is an integral component of the Land Forces Research, Development and Experimentation (RD&E) hub. Contact: SO1 Research (Red Team Leader), Warfare Development Branch, UK Land Warfare Centre, Imber Road, Warminster, Wiltshire, BA12 0DJ. Tel (Mil): 94381 2229, Tel (Civil): +44 1985 22 2229, Fax (Mil): 94381 3070, Fax (Civil): +44 1985 22 3070, E-mail (Mil): LWC-WarDev-SO1-Research. WarDev Research Team website: http://defenceintranet.diiweb.r.mil.uk/DefenceIntranet/Military/Army/ComdFdArmy/DGLW/ComdLWDG/ComdLWDG.htm

1C8. Defence Science and Technology Laboratory (Dstl) provides red teaming capability to support the MOD, particularly during war gaming and experimentation events such as JCDFE.4 Although Dstl has no single red teaming department, expertise can be drawn from across the organisation to provide red teams with the following capabilities:

• Critical thinking about plans and problems, combined with alternative and creative viewpoints, to ensure a more robust plan or argument is developed.
• Familiarity with various planning, modelling and analysis approaches, the ability to integrate well with other teams, and the provision of constructive feedback, solutions and opportunities.
• A broad base of expertise in many cultures and potential adversaries, alongside future systems capabilities, technology trends and the changing nature of warfare.
• Focused support.

4 Joint Campaign Development and Force Estimation.

Contact: Tel +44(0)1980 613121, Fax +44(0)1980 613004, E-mail [email protected]

1C9. US Military. The US military targets three distinct areas for red teaming:

The Intelligence area, to:

• Improve understanding of the enemy, resulting in better estimates and synchronisation of Intelligence and Operations.
• Support J2 development of the adversary’s intentions by providing alternative perspectives on enemy courses of action and reaction.
• Conduct Joint Intelligence Preparation of the Operational Environment (OE) from the adversary perspective.


• Ensure the enemy and other major elements found in the OE are appropriately portrayed and fought during the war game.

• Critically examine the friendly Measures of Effectiveness (MOEs) from the adversary’s perspective to ensure the Commander is measuring the important things.

In Joint Forces Quarterly, Captain Tyler Akers USN speaks of effective red teaming in intelligence as: ‘Taking Joint Intelligence Operations to the Next Level.... [by tapping] the expertise of critical and creative thinkers….to encourage consideration of overlooked possibilities, challenge assumptions, and present issues in a cultural context or from a different perspective.’

The Planning and Operations area, to:

• Improve decision-making in Plans and Operations by providing independent critical and creative thinking and alternative perspectives to inform commanders and staff.
• Provide alternatives during planning and operations.
• Conduct independent critical reviews and analysis of concepts, organisational designs, war games, experiments, and processes to identify potential weaknesses and vulnerabilities.
• Anticipate cultural perceptions of partners, adversaries and others.

The Critical Review and Analysis area, to improve decision-making and problem-solving by providing independent analysis. US Joint Forces Command has recently launched an initiative to recruit a pool of external red teamers who may be called on to challenge developing ideas and concepts. Perhaps controversially, this may include quasi- and former adversaries and other actors with the widest range of views possible.

1C10. Common to all these areas is the requirement to apply critical thinking and analysis to challenge and provide alternatives; over the last two years, therefore, the US military has established a red teaming training capability at the University of Foreign Military and Cultural Studies (UFMCS), Fort Leavenworth, Kansas. The pinnacle of this training is the Red Team Leader’s Course, an 18-week graduate-level education designed to help red teamers anticipate change, reduce uncertainty, and improve operational decisions. A UFMCS-trained red teamer is educated to look at problems from the perspectives of the adversary and multinational partners, with the goal of identifying alternative strategies. Shorter 6- or 9-week courses are available for red team members. The team’s responsibilities range from challenging planning assumptions to conducting independent analysis, examining courses of action and identifying vulnerabilities.


1C11. The UFMCS has recommended that red teams or trained personnel be added at every echelon of command from the Brigade Combat Team to Army Headquarters. Based on this recommendation, TRADOC5 has approved the addition of a red team to deploying divisional and corps headquarters. At the same time, red teaming is being included in updates to US Army doctrine manuals as a key enabler. UFMCS is also setting up a reach-back capability to provide access to expertise and databases, as well as to exchange lessons amongst red team members. The organisation will provide mobile training in some circumstances.

5 Training and Doctrine Command (US Army).

1C12. Defence Industries. Various defence industry partners have worked with a number of government and military organisations in the design, planning, conduct and analysis of red teaming activities. For example, QinetiQ has recently supported the Cabinet Office by providing red teaming support to crisis management exercises, helping to identify cultural and other limitations to good decision-making.

1C13. The recently agreed CRADA6 may provide opportunities for DCDC and the wider MOD to draw on QinetiQ and Boeing red teaming expertise. Contact: Development, Concepts and Doctrine Centre, Shrivenham, SWINDON SN6 8RF, Tel: +44 (0)1793 314347, Fax: +44 (0)1793 314211, DII Email: DCDC-Coord SO1.

6 The Development, Concepts and Doctrine Centre (DCDC) has entered into a Co-operative Research and Development Agreement (CRADA) with Boeing UK and QinetiQ, which will allow the MOD to explore DCDC concepts and themes in a risk-free environment. The CRADA provides access to state-of-the-art analytical and experimentation facilities and will enable DCDC to conduct more rigorous experimentation and analysis of conceptual themes and ideas.

Cultural Expertise

1C14. Cultural expertise resides in a number of different areas of the MOD and its supporting agencies as described below:

Defence Academy staff, including King’s College, Cranfield University and R&AB, can provide generic and some specific cultural expertise.

DIS Human Factors and Human Geographical Intelligence Analysis Branches provide expertise and background briefs; for example, Cultural Appreciation Booklets entitled The Arab World, Iraq and Afghanistan are available via DIS.

1C15. Deputy Chief of Defence Staff (Operations) is the champion for Cultural Awareness and oversees the training requirement against the standards set by Director Joint Capability (D Jt Cap) in consultation with relevant customers and OGDs, for example PJHQ and the FCO.7 Delivery of Collective Cultural Awareness training is the responsibility of the Command, supported by J9 (CIMIC), J7 (Security Sector Reform and Staff Training) and J3/J5 as appropriate; training packages are the primary means of delivery.

7 Defence Information Note (DIN) 2007DIN06-093: Defence Languages – Capstone Policy (includes Cultural Awareness policy).

1C16. The UK has recently begun to further develop its cultural capabilities through enhanced education and training. Cultural training and education, including focused Cultural Awareness training, are delivered at four levels according to rank and role.8

8 DIN 2008DIN07-075: Defence Language and Cultural Awareness Training Policy.

1C17. The Defence Cultural Specialist Unit (DCSU) has been created under the sponsorship of Director Joint Capability to provide deployable cultural specialists to assist operational commanders on operations in Afghanistan and elsewhere. Once fully operational, the DCSU will provide training and education to improve defence cultural understanding, ranging from language training to wider cultural studies. For details of opportunities and course qualifying criteria, interested individuals should contact their local education unit or chain of command.

1C18. The Intermediate Command and Staff Course (Land) is developing a cultural module comprising more generic education for all course attendees. The module is expected to commence in January 2010 and is likely to comprise approximately 20 hours of education and training.


LEXICON OF TERMS AND DEFINITIONS

Alternative Futures Analysis – Systematically explores multiple ways a situation can develop when there is a high degree of complexity and uncertainty.

Bandwagon Effect – A tendency to do or believe things because many other people do or believe the same.

Blue team – The project or planning team.

Brown cell – A brown cell considers an issue from the perspectives of actors that are neither supportive nor adversarial.

Command Support – Having a commander’s confidence and support, and understanding his broad direction.

Confirmation Bias – The tendency to search for or interpret new information in a way that confirms one’s preconceptions, and to irrationally avoid information or reject new evidence that contradicts an established view.

Culture – The shared concepts that guide what people believe, how they behave and how this behaviour is interpreted.

Cultural Capability – The ability to understand culture, and to apply this knowledge to effectively engage in any environment.

Cultural Contempt – Distinct from mirror imaging in that the staff recognises that cultural differences exist but fails to understand their significance.

Experimentation – A test under controlled conditions that is made to demonstrate a known truth, examine the validity of a hypothesis, or determine the efficacy of something previously untried.

False Certainty Effect – An inclination to make risk-averse choices if the expected outcome is positive, but risk-seeking choices to avoid negative outcomes.

Faulty Perceptions – A tendency to perceive the expected.

Green cell – A green cell considers an issue from the perspectives of friendly actors.

Group think – The desire for solidarity or unanimity within a staff constrains wider, alternative thinking.

Independence – Balancing the requirement to provide alternative views, and to avoid group think, by remaining apart from the planning process.

Information Check – Completeness and soundness of available information sources.

Learning Culture – A climate within an organisation which actively promotes and accepts learning and development.

Mirror Imaging – Applying own attitudes (for example values, beliefs, cultural concepts, capabilities) to others, thus gaining a flawed understanding of consequences and outcomes.

Operational Environment – A composite of the conditions, circumstances, and influences that affect the employment of military forces and bear on the decisions of the unit commander.

Operational Design – Sets out to analyse and frame a problem before detailed planning commences.

Optimism Bias – The systematic tendency for people to be over-optimistic about the outcome of a planned course of action, which is one reason why contingency planning is so important; branches and sequels in operational planning hedge against such over-optimism.

Output-Oriented – Focused on the outcomes of a plan or action rather than on the processes or activities to achieve it.

Over-optimism – To assume success will be the only outcome.

Over-pessimism – To be unable to see the route to success.

Paradigm blindness – A ‘why change what has worked in the past’ attitude, leading to predictable actions or failure to recognise changes in adversary actions.

Red – The adversary; this includes, but is not restricted to, hybrid, conventional, insurgent, terrorist, and criminal threats. Red may be a state or non-state actor.

Red Cell – The term for the opposing force in a war game.

Red Teaming – The art of applying independent structured critical thinking and culturally sensitised alternative thinking from a variety of perspectives, to challenge assumptions and fully explore alternative outcomes, in order to reduce risks and increase opportunities.

Role Play/Surrogate Adversary – Models the behaviour of an individual or group by trying to replicate how an adversary or outsider would think about an issue.

Signposts of Change – Periodically reviewing a list of observable events or trends to track events, monitor targets, spot emerging trends, and warn of unanticipated change.

Trends faith – Blind adherence to trends without considering other problems or possible shocks.

Tunnel vision – Failure to take a holistic view of a complex problem with many variables, especially when time constrained and operating with poorly integrated coalitions, leading to implicit or untested assumptions.

War Gaming – An event to simulate a military operation, testing underpinning assumptions and testing and comparing the impact of proposed courses of action.

White cell – A white cell considers an issue from the perspectives of organisers and facilitators.
