
Insider Threat Modeling: an Adversarial Risk Analysis approach

Chaitanya Joshi (with David Rios Insua & Jesus Rios)

Department of Mathematics & Statistics,

University of Waikato, New Zealand.

31st May 2019, GDRR 2019, Washington DC.


Adversarial Risk Analysis (ARA)

An improvement over game theory.

• Does not assume common knowledge.

• Solves the problem from the point of view of just one of the players (the defender).

• Incorporates the defender's knowledge and uncertainties about her own choices and outcomes, as well as about her adversary's preferences, choices and outcomes.

• Can incorporate different strategic thinking models for the adversary.


Insider threat

• A significant and widespread problem that is often underreported.

• Very little data available.

• Challenging to prevent, detect and counter.

Where?

• Security and intelligence agencies.

• Finance, business.

• Data/knowledge security, cyber security, etc.


A simple defend-attack-defend model.

Figure: Bi-agent Influence Diagram (BAID) for the Defend-Attack-Defend insider threat game.


A simple defend-attack-defend model.

• Initially, the organization must choose one of the available preventive measures d1 in the set D1.

• Having observed the preventive measure taken, the employee will adopt one of the actions a in A.

• The set S consists of the possible outcomes s that can occur as a result of the preventive measure d1 and the attack a adopted.

• Once the attack has been detected, the organization will choose to carry out one of the possible actions d2 in the set D2 to end the attack, limit any damage and possibly pre-empt future attacks, leading to the final outcomes of both agents, respectively evaluated through their utility functions uD and uA.


A simple defend-attack-defend model.

For its solution, the defender must first quantify the following:

1. The distribution pD(a|d1) modeling her beliefs about the attack a chosen at node A by the employee, given the chosen defense d1.

2. The distribution pD(s|d1, a) modeling her beliefs about the outcome s of the attack, given a and d1.

3. Her utility function uD(d1, s, d2), which evaluates the consequences associated with her first (d1) and second (d2) defensive actions as well as the outcome s of the attack.


A simple defend-attack-defend model.

Given these assessments, the defender first seeks to find the action d2*(d1, s) maximizing her utility,

\[
d_2^*(d_1, s) = \arg\max_{d_2 \in D_2} u_D(d_1, s, d_2), \tag{1}
\]

leading to the best second defense when the first one was d1 and the outcome was s. Then, she seeks to compute the expected utility ψD(d1, a) for each (d1, a) ∈ D1 × A as

\[
\psi_D(d_1, a) = \int u_D\bigl(d_1, s, d_2^*(d_1, s)\bigr)\, p_D(s \mid d_1, a)\, ds. \tag{2}
\]


A simple defend-attack-defend model.

Moving backwards, she computes her expected utility for each d1 ∈ D1 using the predictive distribution pD(a|d1) through

\[
\psi_D(d_1) = \int \psi_D(d_1, a)\, p_D(a \mid d_1)\, da. \tag{3}
\]

Finally, the defender finds her maximum expected utility decision d1* = arg max_{d1 ∈ D1} ψD(d1). This backward induction shows that the defender's optimal strategy is to first choose d1* and then, after having observed s, choose d2*(d1*, s).
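For finite action and outcome sets, this backward induction reduces to a few nested maximizations and sums. A minimal sketch in Python follows, where all sets, utilities and probabilities are hypothetical placeholders (they are not the assessments used in the example later in the talk):

```python
# Backward induction for the defend-attack-defend model (Eqs. 1-3),
# assuming finite sets D1, A, S, D2. All numbers are illustrative only.

D1 = ["d1_a", "d1_b"]
A  = ["small", "large"]
S  = ["success", "fail"]
D2 = ["upgrade", "no_upgrade"]

def u_D(d1, s, d2):                      # defender's utility (placeholder values)
    return ({"success": -50, "fail": 20}[s]
            - {"upgrade": 30, "no_upgrade": 0}[d2]
            - {"d1_a": 10, "d1_b": 5}[d1])

def p_s(s, d1, a):                       # p_D(s | d1, a), placeholder
    return 0.7 if (s == "success") == (a == "large") else 0.3

def p_a(a, d1):                          # p_D(a | d1), placeholder
    return 0.6 if a == "small" else 0.4

def d2_star(d1, s):                      # Eq. (1): best second defense
    return max(D2, key=lambda d2: u_D(d1, s, d2))

def psi_D_d1_a(d1, a):                   # Eq. (2): expected utility of (d1, a)
    return sum(u_D(d1, s, d2_star(d1, s)) * p_s(s, d1, a) for s in S)

def psi_D(d1):                           # Eq. (3): expected utility of d1
    return sum(psi_D_d1_a(d1, a) * p_a(a, d1) for a in A)

d1_star = max(D1, key=psi_D)
print(d1_star, {d1: round(psi_D(d1), 2) for d1 in D1})
```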

Catch! The above analysis requires the defender to elicit pD(a|d1). How do you elicit that?


A simple defend-attack-defend model.

• The defender could model the attacker's strategic analysis by assuming that the attacker will perform an analysis similar to hers to find the optimal attack a*. To do so, the defender would need to assess the attacker's utility function uA(a, s, d2) and probability distributions pA(s|a, d1) and pA(d2|d1, a, s).

• But these are not available to the defender; we can model her uncertainty about them through a random utility function UA(a, s, d2) and random probability distributions PA(s|a, d1) and PA(d2|d1, a, s).

• Once these random quantities are elicited, the defender solves the attacker's decision problem using backward induction. This is done by following a process similar to the one for her own decision problem, but taking into account the randomness in the judgments.


A simple defend-attack-defend model.

First, the defender finds the random expected utility for each (d1, a, s), averaging over d2 ∈ D2,

\[
\Psi_A(d_1, a, s) = \int U_A(a, s, d_2)\, P_A(d_2 \mid d_1, a, s)\, dd_2. \tag{4}
\]

Then, she finds the random expected utility for each pair (d1, a) ∈ D1 × A,

\[
\Psi_A(d_1, a) = \int \Psi_A(d_1, a, s)\, P_A(s \mid d_1, a)\, ds, \tag{5}
\]

and computes the random optimal attack A*(d1) given the defense d1,

\[
A^*(d_1) = \arg\max_{a \in A} \Psi_A(d_1, a). \tag{6}
\]


A simple defend-attack-defend model.

Finally, once the defender assesses A*(d1), she is able to solve her decision problem. The defender's desired predictive distribution about the attack a chosen, given the initial defense d1, is

\[
p_D(a \mid d_1) = p_D(A^* = a \mid d_1), \qquad
p_D(A^* \le a \mid d_1) = \int_0^a p_D(A^* = x \mid d_1)\, dx. \tag{7}
\]
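In practice, A*(d1), and hence pD(a|d1), is typically approximated by Monte Carlo: draw the random utilities and probabilities, solve the attacker's problem (4)-(6) for each draw, and record how often each attack comes out optimal. A minimal sketch is below; the parametric forms chosen for UA, PA(s|a, d1) and PA(d2|d1, a, s) are assumptions made purely for illustration, not the assessments used in the talk.

```python
# Monte Carlo approximation of p_D(a | d1) via the attacker's random
# decision problem (Eqs. 4-6). All distributional choices below are
# illustrative assumptions, not the talk's elicited quantities.
import random

D1 = ["anomaly_detection", "info_sec_training", "random_audits"]
A  = ["small", "medium", "large"]
S  = ["success", "fail"]
D2 = ["major_upgrade", "minor_upgrade", "no_upgrade"]

def sample_attacker_model(rng):
    """Draw one realisation of (U_A, P_A(s|a,d1), P_A(d2|d1,a,s))."""
    gain = {a: rng.uniform(5, 20) * (1 + A.index(a)) for a in A}   # random payoff scale
    def u_A(a, s, d2):
        return (gain[a] if s == "success" else -gain[a]) - 5 * (d2 == "major_upgrade")
    p_succ = {(a, d1): rng.betavariate(2, 2 + A.index(a)) for a in A for d1 in D1}
    def P_s(s, a, d1):
        return p_succ[(a, d1)] if s == "success" else 1 - p_succ[(a, d1)]
    def P_d2(d2, d1, a, s):
        return 1 / len(D2)                                         # attacker sees d2 as uniform
    return u_A, P_s, P_d2

def optimal_attack(d1, u_A, P_s, P_d2):
    """Eqs. (4)-(6): the attacker's expected-utility-maximizing attack for one draw."""
    def Psi_A(a):
        return sum(P_s(s, a, d1) *
                   sum(u_A(a, s, d2) * P_d2(d2, d1, a, s) for d2 in D2)
                   for s in S)
    return max(A, key=Psi_A)

def p_D_a_given_d1(d1, n_sim=10_000, seed=0):
    rng = random.Random(seed)
    counts = {a: 0 for a in A}
    for _ in range(n_sim):
        counts[optimal_attack(d1, *sample_attacker_model(rng))] += 1
    return {a: counts[a] / n_sim for a in A}

for d1 in D1:
    print(d1, p_D_a_given_d1(d1))
```

Note that each defense d1 gets its own predictive distribution, since the attacker's random success probabilities may depend on d1.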


Let’s get real!

Not all employees are the same!


ARA with segmented employees.

We classify the employees as A1 (the good), A2 (the bad) and A3 (the ugly), with S1, S2 and S3 being the corresponding outcome sets.

• A1: Correctly and promptly perform their duties, including following any procedures to prevent insider attacks. They have a positive impact on the productivity and work culture of the organization and will correctly report any suspicious activity, thus helping the organization to protect itself.

• A2: While not intentionally working to harm the organization, they help create an environment that could increase the chances of an insider attack through their accidental or deliberate actions, e.g. misusing the defensive procedures, creating a culture of mistrust and a loss in productivity. Their actions are therefore negative for the organization.

• A3: Actively aim at harming the organization. They are the ones who intend to launch an insider attack. Their actions are therefore very negative for the organization.


ARA with segmented employees.

Figure: Decision trees for the three games in the insider threat problem with segmented employees.


ARA with segmented employees.

• Actions by A1 may reduce the chance of insider attacks as well as the chance of one of them succeeding. Similarly, actions by A2 may increase the chance of an insider attack as well as the chance of one of them succeeding.

• Game [c] is identical to the simple Defend-Attack-Defend game.

• The ARA solutions to games [a] and [b] consist of identical sets of steps. Henceforth, we use Ai, i = 1, 2, and Si, i = 1, 2.

• Note that we differentiate between node Ai, which is an uncertainty node, and node A3, which is a decision node belonging to a different decision maker, since the latter is strategic.


ARA with segmented employees.

Figure: MAID for decision trees [a] and [b] in the segmented employees insider threat game.


ARA with segmented employees.

The defender must first quantify the following.

1. Her predictive distribution pD(ai|d1) about the action that will be chosen at node Ai, given the defense d1.

2. Her predictive distribution pD(si|d1, ai) about the outcome of that action, given ai and d1.

3. Her predictive distribution pD(a3|d1, ai, si) about the attack that will be chosen at node A3, given the outcome si and the actions ai and d1.

4. Her predictive distribution pD(s3|d1, ai, si, a3) about the outcome of the attack, given the outcome si and the actions a3, ai and d1.

5. Her utility function uD(d1, ai, si, a3, s3, d2), evaluating her first and second defensive actions, the outcomes si and s3, and the actions ai and a3.


ARA with segmented employees.

Given these, the defender works backwards along the decision trees in Figure 2 [a] or [b]. First, she seeks to find the action d2*(d1, ai, si, a3, s3) maximizing her utility,

\[
d_2^*(d_1, a_i, s_i, a_3, s_3) = \arg\max_{d_2 \in D_2} u_D(d_1, a_i, s_i, a_3, s_3, d_2). \tag{8}
\]

Then, for each (d1, ai, si, a3) ∈ D1 × Ai × Si × A3, she seeks to compute the expected utility ψD(d1, ai, si, a3) through

\[
\psi_D(d_1, a_i, s_i, a_3) = \int u_D\bigl(d_1, a_i, s_i, a_3, s_3, d_2^*(d_1, a_i, s_i, a_3, s_3)\bigr)\, p_D(s_3 \mid d_1, a_i, s_i, a_3)\, ds_3. \tag{9}
\]


ARA with segmented employees.

Next, she computes the expected utility ψD(d1, ai, si) for each (d1, ai, si) through

\[
\psi_D(d_1, a_i, s_i) = \int \psi_D(d_1, a_i, s_i, a_3)\, p_D(a_3 \mid d_1, a_i, s_i)\, da_3. \tag{10}
\]

She then finds the expected utility ψD(d1, ai) for each (d1, ai), as

\[
\psi_D(d_1, a_i) = \int \psi_D(d_1, a_i, s_i)\, p_D(s_i \mid d_1, a_i)\, ds_i, \tag{11}
\]

and her expected utility for each d1 ∈ D1, using her predictive distribution pD(ai|d1):

\[
\psi_D(d_1) = \int \psi_D(d_1, a_i)\, p_D(a_i \mid d_1)\, da_i. \tag{12}
\]
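Computationally this is the same backward induction as before, just with two extra layers of uncertainty (ai, si) between the first defense and the attack. A compact sketch for finite sets follows, where all sets and assessments are placeholders to be supplied by the analyst:

```python
# Backward induction for the segmented-employee game (Eqs. 8-12).
# D1, D2, A_i, S_i, A3, S3 are finite sets; u_D and the p_* functions are the
# defender's assessments, passed in by the analyst (placeholders here).

def solve_segmented(D1, D2, A_i, S_i, A3, S3, u_D, p_si, p_a3, p_s3, p_ai):
    def d2_star(d1, ai, si, a3, s3):                      # Eq. (8)
        return max(D2, key=lambda d2: u_D(d1, ai, si, a3, s3, d2))

    def psi_a3(d1, ai, si, a3):                           # Eq. (9): average over s3
        return sum(u_D(d1, ai, si, a3, s3, d2_star(d1, ai, si, a3, s3))
                   * p_s3(s3, d1, ai, si, a3) for s3 in S3)

    def psi_si(d1, ai, si):                               # Eq. (10): average over a3
        return sum(psi_a3(d1, ai, si, a3) * p_a3(a3, d1, ai, si) for a3 in A3)

    def psi_ai(d1, ai):                                   # Eq. (11): average over si
        return sum(psi_si(d1, ai, si) * p_si(si, d1, ai) for si in S_i)

    def psi(d1):                                          # Eq. (12): average over ai
        return sum(psi_ai(d1, ai) * p_ai(ai, d1) for ai in A_i)

    d1_star = max(D1, key=psi)                            # optimal first defense
    return d1_star, {d1: psi(d1) for d1 in D1}
```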


ARA with segmented employees.

• Finally, the defender finds her maximum expected utility decision as d1* = arg max_{d1 ∈ D1} ψD(d1).

• Thus, the defender's optimal strategy is to first choose d1* and then, after having observed ai, si, a3 and s3, choose action d2*(d1*, ai, si, a3, s3).

The above analysis requires the defender to elicit pD(a3|d1, ai, si) and pD(ai|d1).

• Ai being non-strategic, pD(ai|d1) can be elicited using historical data/research on employee behavior.


ARA with segmented employees.

Eliciting pD(a3|d1, ai, si) is less straightforward.

• It can be elicited by modeling the attacker's strategic thinking.

• To elicit it, the defender must assess UA(a3, s3, d2), PA(s3|a3, d1) and PA(d2|d1, a3, s3).

• Once these are elicited, the defender solves the attacker's decision problem using backward induction, similar to how she solved her own decision problem.


ARA with segmented employees - simultaneous actions!

• In the previous models, the employees were assumed to act sequentially, and only two types appear in any given game.

• But scenarios where any two, or all three, types of employees act simultaneously can also be easily modelled using ARA.


ARA with segmented employees - simultaneous actions!

Figure: MAID for the game where A1 and A2 act simultaneously, followed by A3.


ARA with segmented employees - simultaneous actions!

Figure: MAIDs for games where A3 acts simultaneously with either A1 or A2, or where both act simultaneously followed by A3.


Example: simple defend-attack-defend model

We consider an insider threat scenario for an organization dealing with information/data collection.

• It already has its sites and IT systems protected so that only authorized personnel are able to access them.

• However, anticipating attacks, the organization is considering implementing an additional security layer to defend itself.

• The defensive actions (D1) under consideration are:

1. anomaly detection/data provenance tools;
2. information security measures and employee training;
3. carrying out random audits.


Example: simple defend-attack-defend model

• The malicious insider's aim could be financial fraud, data theft, espionage or whistleblowing.

• Regardless, we assume that the attacker's options (A) refer to the attack's scale, say small, medium or large.

• Assume that the attack will either fully succeed (S) or fail (F).

• We assume that once the attack is detected, the organisation can choose to carry out one of the following defensive actions (D2):

1. major upgrade of defenses;
2. minor upgrade of defenses; or
3. no upgrade.


Example: simple defend-attack-defend model

Assessing defender’s utility function uD(d1, a, s, d2).

• We assume that uD aggregates the monetary costs c(d1) and c(d2) associated with actions d1 and d2, respectively, and the monetized perceived utilities associated with every (a, s) combination through

\[
u_D(d_1, a, s, d_2) = c(d_1) + c(d_2) + u(a, s). \tag{13}
\]

• The costs and perceived utilities are scaled from -100 to 100.


Example: simple defend-attack-defend model

Assessing defender’s utility function uD(d1, a, s, d2).

Table: Costs associated with defensive actions d1 and d2.

  d1                          c(d1)    d2               c(d2)
  Anom. det. & data prov.     -100     Major upgrade    -100
  Info. sec. & training        -60     Minor upgrade     -25
  Random audits                -50     No upgrade          0

Table: Monetized perceived utility for every combination (a, s).

  a         s          u(a, s)
  Small     Success      -25
  Small     Fail          30
  Medium    Success      -50
  Medium    Fail          60
  Large     Success     -100
  Large     Fail          80


Example: simple defend-attack-defend model

Eliciting pD(a|d1)

Table: pD(a|d1) elicited by the defender using her beliefs.

  D1                          A = small   A = med.   A = large
  Anom. det. & data prov.        0.8        0.15       0.05
  Info. sec. & training          0.2        0.6        0.2
  Random audits                  0.5        0.4        0.1

Table: pD(a|d1) elicited by the defender modeling the strategic analysis of the attacker.

  D1                          A = small   A = med.   A = large
  Anom. det. & data prov.        0.112      0.102      0.786
  Info. sec. & training          0.706      0.132      0.162
  Random audits                  0.399      0.119      0.482
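The expected utilities ψD(d1) on the next slide follow from Eqs. (1)-(3) and (13), combining the cost and utility tables with one of the pD(a|d1) tables above. A sketch of that computation is below; note that the success probabilities pD(s|d1, a) were not included in this transcript, so the values used here are assumed purely for illustration and the resulting numbers will not match the talk's table exactly.

```python
# psi_D(d1) for the worked example, via Eqs. (1)-(3) with u_D from Eq. (13).
# The success probabilities p_D(s | d1, a) below are ASSUMED (not in the slides),
# so the outputs are illustrative and differ from the talk's reported values.

c_d1 = {"anom_det": -100, "info_sec": -60, "audits": -50}      # cost of d1
c_d2 = {"major": -100, "minor": -25, "none": 0}                # cost of d2
u_as = {("small", "S"): -25, ("small", "F"): 30,               # monetized utility u(a, s)
        ("medium", "S"): -50, ("medium", "F"): 60,
        ("large", "S"): -100, ("large", "F"): 80}

p_a = {  # p_D(a | d1): the "beliefs" table above
    "anom_det": {"small": 0.8, "medium": 0.15, "large": 0.05},
    "info_sec": {"small": 0.2, "medium": 0.6, "large": 0.2},
    "audits":   {"small": 0.5, "medium": 0.4, "large": 0.1},
}
p_success = {  # assumed p_D(s = Success | d1, a); NOT given in the transcript
    "anom_det": {"small": 0.2, "medium": 0.3, "large": 0.4},
    "info_sec": {"small": 0.4, "medium": 0.5, "large": 0.6},
    "audits":   {"small": 0.3, "medium": 0.4, "large": 0.5},
}

def u_D(d1, a, s, d2):                       # Eq. (13)
    return c_d1[d1] + c_d2[d2] + u_as[(a, s)]

def psi_D(d1):                               # Eqs. (1)-(3) over the finite sets
    total = 0.0
    for a, pa in p_a[d1].items():
        for s, ps in (("S", p_success[d1][a]), ("F", 1 - p_success[d1][a])):
            d2 = max(c_d2, key=lambda d2_: u_D(d1, a, s, d2_))   # Eq. (1): best d2
            total += pa * ps * u_D(d1, a, s, d2)                 # Eqs. (2)-(3)
    return total

print({d1: round(psi_D(d1), 2) for d1 in c_d1})
```

With uD as in Eq. (13), the best second defense is always "no upgrade", since d2 enters only through its cost; a richer utility that rewards pre-empting future attacks would change this.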


Example: simple defend-attack-defend model

Defender’s expected utility ψD(d1) for each d1 using Eq. (3).

Table: How ψD(d1) changes based on the choice of pD(a|d1)!

  D1                          ψD with pD(a|d1) from beliefs   ψD with pD(a|d1) from strategic thinking
  Anom. det. & data prov.              -69.005                              -36.115
  Info. sec. & training                -29                                  -39.051
  Random audits                        -39.75                               -34.566

• Eliciting pD(a|d1) differently can lead to different optimal choices (as one would expect)!

• The same is true for any of the other subjective choices - utilities and probabilities - involved.


Summary & Discussion!

• ARA can be a practical tool for strategic decision making!

• Insider threat is a serious security risk, and an ARA model for it had not yet been developed!

Further Work:

• Bounded rationality.

• Evolving defend-attack-defend scenarios - using MDP.

• Robustness w.r.t. priors and utilities. ARA can be based entirely on subjective beliefs (and NO DATA)!


We are hiring! (www.waikato.ac.nz)
