
Trust in Networks and Networked Systems

John S. Baras
The Institute for Systems Research,
Electrical and Computer Engineering Department,
Applied Mathematics, Statistics and Scientific Computation Program
University of Maryland, College Park, USA

December 7, 2017
ANU Workshop on Systems and Control
ANU, Canberra, Australia

Acknowledgments

• Joint work with: Peixin Gao, Tao Jiang, Ion Matei, Kiran Somasundaram, Xiangyang Liu, George Theodorakopoulos

• Sponsors: NSF, ARO, ARL, AFOSR, NIST, DARPA, Lockheed Martin, Northrop Grumman, Telcordia (ACS)


Networked Systems

Infrastructure / Communication Networks

Internet / WWW, MANETs, Sensor Nets, Robotic Nets, Hybrid Nets (Comm, Sensor, Robotic and Human Nets)

Social / Economic Networks

Social Interactions: Collaboration, Social Filtering, Economic Alliances, Web-based social systems

Biological Networks

Community, Epidemic, Cellular and Sub-cellular, Neural, Insects, Animal Flocks


Networks and Trust

• Trust and reputation critical for collaboration
• Characteristics of trust relations:
– Integrative (Parsons 1937) – main source of social order
– Reduction of complexity – without it bureaucracy and transaction complexity increase (Luhmann 1988)
– Trust as a lubricant for cooperation (Arrow 1974) – rational choice theory
• Social Webs, Economic Webs
– MySpace, Facebook, Windows Live Spaces, Flickr, Classmates Online, Orkut, Yahoo! Groups, MSN Groups
– e-commerce, e-XYZ, services and service composition
– Reputation and recommender systems


Indirect Network Trust

[Figure: directed trust graph over users 1–8]

User 8 asks for access to User 1’s files. User 1 and User 8 have no previous interaction.

What should User 1 do? Use transitivity of trust (i.e. use references to compute indirect trust).


Indirect Trust: System Model

• System mapped to a weighted, directed graph
– Vertices: entities/users
– Edges: direct trust relations
– Weights: w(i,j) = how much i trusts j
• Establish an indirect trust relation between users that have not had direct interactions
– We assume that trust is transitive (at least partially)
• Trust computation: path problem on a graph
– Information about j that is useful to i: directed paths from i to j
– Combine information along each path, and then aggregate across paths


Semirings-Examples

• Shortest Path Problem
– Semiring: (ℝ⁺ ∪ {∞}, min, +)
– ⊗ is + and computes total path delay
– ⊕ is min and picks the shortest path
• Bottleneck Problem
– Semiring: (ℝ⁺ ∪ {∞}, max, min)
– ⊗ is min and computes path bandwidth
– ⊕ is max and picks the highest bandwidth
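Both examples fit one generic relaxation routine parameterized by the semiring operations. The toy graph, the Bellman–Ford-style scheme and the relaxation count are an illustrative sketch, not code from the talk:

```python
from math import inf

def semiring_paths(n, edges, src, plus, times, zero, one):
    """Single-source path weights over a generic semiring.
    plus aggregates across paths; times combines along a path."""
    d = [zero] * n
    d[src] = one
    for _ in range(n - 1):                 # Bellman-Ford-style relaxation
        for (i, j), w in edges.items():
            d[j] = plus(d[j], times(d[i], w))
    return d

edges = {(0, 1): 2.0, (1, 2): 3.0, (0, 2): 10.0}

# Shortest-path semiring: plus = min picks the best path, times = + sums delays.
delays = semiring_paths(3, edges, 0, min, lambda a, b: a + b, inf, 0.0)

# Bottleneck semiring: plus = max picks the widest path, times = min caps bandwidth.
bw = semiring_paths(3, edges, 0, max, min, 0.0, inf)
```

Here `delays[2]` is the total delay of the best path 0 → 1 → 2, while `bw[2]` is the bottleneck bandwidth of the direct (wider) edge 0 → 2.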


Trust Semiring Properties:Partial Order

• Combined along-a-path weight should not increase: a ⊗ b ⪯ a and a ⊗ b ⪯ b
• Combined across-paths weight should not decrease: a ⊕ b ⪰ a and a ⊕ b ⪰ b

[Figure: three-node path with edge weights a and b]


Trust Path Semiring

• 0 ≤ trust, confidence ≤ 1
• ⊗ (along a path) is componentwise multiplication: (t1, c1) ⊗ (t2, c2) = (t1 t2, c1 c2)
• ⊕ (across paths) picks the opinion with the highest confidence, breaking ties by the higher trust value


Trust Distance Semiring

• Motivated from Eisner’s Expectation Semiring (2002) (speech/language processing):

(a1, b1) ⊗ (a2, b2) = (a1 b2 + a2 b1, b1 b2)
(a1, b1) ⊕ (a2, b2) = (a1 + a2, b1 + b2)

• 0 ≤ trust, confidence ≤ 1; map opinions by (t, c) → (c/t, c), so S = [0, ∞] × [0, 1]
• ⊕ and ⊗ on S are as defined above


Computing Indirect Trust

• Path interpretation: along a path through user k, t(i→j) = t(i→k) ⊗ w(k→j); aggregate with ⊕ over all such paths

• Linear system interpretation: t = (W ⊗ t) ⊕ b, where b is the indicator vector of pre-trusted nodes

• Treat as a linear system over the semiring
– We are looking for its steady state.
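A minimal sketch of iterating t ← (W ⊗ t) ⊕ b to a steady state, over the (max, min) trust semiring; the matrix W, the pre-trusted vector b and the iteration cap are illustrative assumptions:

```python
def indirect_trust(W, b, iters=50):
    """Iterate t <- (W (x) t) (+) b over the (max, min) trust semiring.
    W[i][j]: direct trust i places in j; b marks pre-trusted nodes (1.0)."""
    n = len(b)
    t = b[:]
    for _ in range(iters):
        nxt = []
        for i in range(n):
            via = max(min(W[i][j], t[j]) for j in range(n))  # best referral
            nxt.append(max(b[i], via))
        if nxt == t:          # reached the semiring steady state
            break
        t = nxt
    return t

W = [[0.0, 0.9, 0.0],
     [0.0, 0.0, 0.7],
     [0.0, 0.0, 0.0]]
b = [0.0, 0.0, 1.0]          # node 2 is pre-trusted
trust = indirect_trust(W, b)
```

Node 0 ends up trusting node 2 at the bottleneck level of its referral chain 0 → 1 → 2.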


Attacks to Indirect Trust

• ATTACK the trust computation!
– Aim: increase t1→8 to a level that would grant access.
• How?
– Edge attack: change the opinion on an edge (trick a node into forming a false opinion)
– Node attack: change any opinion emanating from a node (gain complete control of a node)


Attacks to Indirect Trust

[Figure: trust graph over users 1–8; an edge attack alters the opinion on a single edge]

Edge Attack


Attacks to Indirect Trust

[Figure: trust graph over users 1–8; a node attack alters every opinion emanating from the compromised node]

Node Attack


Edge Tolerances

• The upper (lower) edge tolerance of an edge e, w.r.t. an optimal path p*, is the highest (lowest) weight of e that would preserve the optimality of p*.

• In a shortest path problem (min, +), the most vital edge is the path edge whose weight has the largest difference with its upper tolerance.

• In a maximum capacity problem (max, min), the most vital edge is the path edge whose weight has the largest difference with its lower tolerance.
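One way to exhibit the most vital edge is brute force: delete each edge of the optimal path in turn and measure how much the optimum degrades. The graph and helper below are an illustrative sketch (the talk characterizes vital edges via tolerances instead):

```python
import heapq

def shortest(adj, s, t):
    """Dijkstra: returns (distance to t, predecessor map)."""
    dist, prev, pq = {s: 0.0}, {}, [(0.0, s)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue
        for v, w in adj.get(u, []):
            if d + w < dist.get(v, float("inf")):
                dist[v], prev[v] = d + w, u
                heapq.heappush(pq, (d + w, v))
    return dist.get(t, float("inf")), prev

def most_vital_edge(adj, s, t):
    """Shortest-path edge whose removal increases the s-t distance most."""
    base, prev = shortest(adj, s, t)
    path, u = [], t
    while u != s:                     # recover the optimal path p*
        path.append((prev[u], u))
        u = prev[u]
    best, vital = -1.0, None
    for e in path:
        pruned = {x: [(v, w) for v, w in nb if (x, v) != e]
                  for x, nb in adj.items()}
        d, _ = shortest(pruned, s, t)
        if d - base > best:           # largest degradation so far
            best, vital = d - base, e
    return vital, best

adj = {0: [(1, 1.0), (2, 10.0)], 1: [(2, 1.0)], 2: []}
vital, degradation = most_vital_edge(adj, 0, 2)
```

In this toy graph the path 0 → 1 → 2 (cost 2) is optimal; removing either of its edges forces the direct edge of cost 10.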


Upper Tolerance Example

• Upper Tolerances for the Shortest Path Problem

[Figure: five-node shortest-path example with edge upper tolerances (∞ off the path) and the most vital edge marked]


Lower Tolerance Example

• Lower Tolerances for the Shortest Path Problem

[Figure: the same five-node example with edge lower tolerances (−∞ where unconstrained); the “smallest” required changes]


Attacked Edge on the Path

[Figure: trust edge attack on an edge of the optimal path p* (trust value t*)]

RESULT: Decrease Trust!


Attacked Edge not on the Path

[Figure: trust edge attack on an edge off the optimal path p* (trust value t*); a new optimal path p′ with trust value t′ emerges]

RESULT: Increase Trust! Change Path!


Tolerances for any Optimization Semiring

• Optimization semirings: ⊕ is min or max
• The ⊕-minimal (maximal) tolerance αe (βe) of edge e replaces the lower (upper) tolerance
• ⊘ is the inverse of ⊗, defined by: a ⊗ x = b ⟺ x = b ⊘ a
• w(e) is the weight of edge e; w(p) is the weight of path p
• The tolerance expressions are given by cases: e ∈ p* and e ∉ p*


Tolerances for the Trust Semiring

• Assume the (max, ·) semiring; essentially equivalent to our trust semiring.
• Tolerances: again given by the two cases e ∈ p* and e ∉ p*.

Distributed Kalman Filtering and Tracking: Performance Improvements from Trusted Core

• Realistic sensor networks: normal nodes, faulty or corrupted nodes, malicious nodes
• Hierarchical scheme – provide global trust on a particular context without requiring direct trust on the same context between all agents
• Combine techniques from fusion-centric, collaborative filtering, estimation propagation
• Trusted Core
– Trust particles: higher security, additional sensing capabilities, broader observation of the system, confidentiality and integrity, multipath comms
– Every sensor can communicate with one or more trust particles at a cost


Trust and Induced Graphs


Weighted directed dynamic trust graph Gt = (V, At), with the trust relation tc defined on V × V, induces the graph G = (V, A) with edge weights

w(i, j) = (c(i, j), t(i, j)[n])

Goals of Trusted System

1. All sensors that abide by the protocols of sensing and message passing should be able to track the trajectories.

2. This implies that nodes with poor sensing capabilities, or with corrupted sensors, should be aided by their neighbors in tracking.

3. Nodes that are malicious and pass false estimates should be quickly detected by the trust mechanism and their estimates discarded.


x[n+1] = A x[n] + B w[n]
z_i[n] = H_i[n] x[n] + v_i[n]      (regular sensor i)
z_tc[n] = H_tc[n] x[n] + v_tc[n]    (trusted core)

Trusted DKF and Particles


• Can use any valid trust system as trust update component

• Can replace DKF with any Distributed Sequential MMSE or other filter

• Trust update mechanism: Linear credit and exponential penalty
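The “linear credit and exponential penalty” rule can be sketched as follows; the credit step, penalty factor and clipping are assumed parameter choices, not values from the talk:

```python
def update_trust(trust, consistent, credit=0.05, penalty=2.0):
    """Linear credit, exponential penalty: a report consistent with the
    fused estimate earns a small additive credit; an inconsistent report
    is punished multiplicatively (repeated offenses decay trust
    exponentially fast)."""
    if consistent:
        return min(1.0, trust + credit)   # linear credit, clipped at 1
    return trust / penalty                # exponential penalty

# trust recovers slowly but collapses quickly under repeated bad reports
t = 0.5
for ok in [True, True, False, True]:
    t = update_trust(t, ok)
```

The asymmetry (slow gain, fast loss) is what lets the trusted-core scheme discard malicious estimates quickly while tolerating occasional sensor noise.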

Trusted DKF Performance


[Figures: open-loop performance, closed-loop performance, trust system performance]

Power Grid Cyber‐security

• Inter-area oscillations (modes)
– Associated with large interconnected power networks between clusters of generators
– Critical for system stability
– Require on-line observation and control
• Automatic estimation of modes
– Using currents, voltages and angle differences measured by PMUs (Phasor Measurement Units) that are distributed throughout the power system


System Model

• Linearization around the nominal operating points
– The initial steady-state value is eliminated
– Disturbance inputs consist of M frequency modes:

f(t) = Σ_{i=1}^{M} a_i exp(−σ_i t) cos(ω_i t + φ_i)

a_i: oscillation amplitudes;  σ_i: damping constants;  ω_i: oscillation frequencies;  φ_i: phase angles of the oscillations

– Consider two modes and use the first two terms in the Taylor series expansion of the exponential function; expanding the trigonometric functions:

f(t) ≈ a_1 (1 − σ_1 t) cos(ω_1 t + φ_1) + a_2 (1 − σ_2 t) [cos φ_2 cos(ω_2 t) − sin φ_2 sin(ω_2 t)]


Distributed Estimation

• To compute an accurate estimate of the state x(k), using:
– local measurements y_j(k);
– information received from the PMUs in its communication neighborhood;
– confidence in the information received from other PMUs, provided by the trust model

[Figure: PMUs synchronized via a GPS satellite; N multiple recording sites (PMUs) measure the output signals]


Trust Model

• To each information flow (link) j → i, we attach a positive value T_ij, which represents the trust PMU i has in the information received from PMU j;

• Trust interpretation:  – Accuracy– Reliability

• Goal: Each PMU has to compute accurate estimates of the state, by intelligently combining the measurements and the information from neighboring PMUs 

Trust‐based Multi‐agent State Estimation

• Main idea: pick the weights wij to be trust dependent

• Does not require global information about the power grid topology

• Ensures greater robustness in computing the state estimate
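The trust-dependent weighting could look like the sketch below. The normalisation w_ij = T_ij / (1 + Σ_k T_ik) and the scalar state are illustrative assumptions; the slides only require the weights to be trust dependent:

```python
def trust_weighted_estimate(x_own, neighbor_estimates, trust):
    """Fuse a PMU's own estimate with neighbours' estimates using
    trust-dependent weights; the own estimate gets weight 1 before
    normalisation, so low-trust neighbours barely move the result."""
    total = 1.0 + sum(trust.values())
    x = x_own / total
    for j, xj in neighbor_estimates.items():
        x += (trust[j] / total) * xj
    return x

# the outlier estimate 5.0 comes from a neighbour trusted at only 0.1,
# so the fused value stays close to the consistent estimates
x = trust_weighted_estimate(1.0, {1: 1.2, 2: 5.0}, {1: 0.9, 2: 0.1})
```

This is the robustness mechanism in miniature: a compromised PMU injecting false data sees its weight, and hence its influence on the fused state, shrink with its trust value.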


Numerical Example

• 3‐generators, 9‐bus system:


Numerical example (cont.)

• We assume that a PMU is placed at each bus, measuring the complex voltage at the bus and the complex currents toward adjacent buses.

• The state vector is formed by the voltages measured at the buses, i.e., X = (Ui), where Ui is the complex voltage at bus i.

• Measurement model, in the case the buses (i, j) are adjacent: the measurement vector is Zi(k)′ = (Ui(k), Iij(k)), where the line current satisfies Iij(k) = Yij (Ui(k) − Uj(k)), Yij is the admittance of line (i, j), and Vi(k) is the complex measurement noise.

Numerical Example (cont.)

• PMU network with a compromised node:


Numerical Example (cont.)

• Estimates of the voltage at bus 1 using Algorithm 1, with agent 8 injecting false data


Numerical Example (cont.)

• Estimates of the voltage at bus 1 using Algorithm 3, with agent 8 injecting false data


Numerical Example (cont.)

• The evolution of agent 4’s weights


MANET Trust Aware Routing – Trust/Reputation Systems

• Components
– Monitoring and detection modules
– Reputation control system
– Response modules
• Our approach is different: build and use a trusted Sentinel Sub-Network (SSN), which is responsible for monitoring and flooding reputation measures
• Logically decouple the trust/reputation system from other network functionalities
• Demonstrate that logical constraints on the SSN translate to constraints on the communication graph topology of the network
• Trade-off analysis between security and performance

Path Problems on Graphs – Delay and Trust Semirings

Notions of optimality: Pareto, lexicographic, max-ordering, approximation semirings

Each link (i, j) carries the pair (d(i, j), t(i, j)).

Delay semiring (ℝ⁺ ∪ {0}, min, +):

min_{p ∈ P_SD} d(p) = min_{p ∈ P_SD} Σ_{(i,j) ∈ p} d(i, j)

Trust semiring (ℝ⁺ ∪ {0}, max, min):

max_{p ∈ P_SD} t(p) = max_{p ∈ P_SD} min_{(i,j) ∈ p} t(i, j)

Vector-valued path map: f : P_SD → ℝ², f(p) = (d(p), t(p)), p ∈ P_SD

Trust Aware Routing – Multi-Criteria Optimization Problem

• Delay of a path p: d(p) = Σ_{(i,j) ∈ p} d(i, j)

• Trust of a path p (bottleneck trust): t(p) = min_{(i,j) ∈ p} t(i, j)

[Figure: bi-metric network with source i and nodes j1, …, j7]


Haimes Method – Two-Stage Recipe

[Figure: graph G = (V, E) with a source node]

1. G′ – reduced graph, O(|E|)
2. Shortest path on the reduced graph G′, O(|V|·|E|)
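The two-stage recipe can be sketched directly: prune edges whose trust falls below a bottleneck requirement tau (stage 1), then run an ordinary delay-based shortest-path search on the reduced graph (stage 2). The tau semantics and the toy graph are assumptions for illustration:

```python
import heapq

def trust_aware_route(adj, s, t, tau):
    """Stage 1: drop edges with trust < tau -- O(|E|).
    Stage 2: shortest delay path on the reduced graph."""
    reduced = {u: [(v, d) for v, d, tr in nb if tr >= tau]
               for u, nb in adj.items()}
    dist, pq = {s: 0.0}, [(0.0, s)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == t:
            return d
        if d > dist.get(u, float("inf")):
            continue
        for v, w in reduced.get(u, []):
            if d + w < dist.get(v, float("inf")):
                dist[v] = d + w
                heapq.heappush(pq, (d + w, v))
    return float("inf")   # no path satisfies the trust constraint

# edges: (neighbour, delay, trust)
adj = {0: [(1, 1.0, 0.9), (2, 1.0, 0.3)], 1: [(2, 2.0, 0.8)], 2: []}
d_secure = trust_aware_route(adj, 0, 2, tau=0.5)   # low-trust shortcut pruned
d_fast = trust_aware_route(adj, 0, 2, tau=0.2)     # shortcut allowed
```

Sweeping tau exposes exactly the security-vs-performance trade-off the slides discuss: raising the trust requirement forces longer-delay routes.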


What is a Network …?

• In several fields or contexts:

– social
– economic
– communication
– control
– sensor
– biological
– physics and materials

A Network is …

• A collection of nodes, agents, … that collaborate to accomplish actions, gains, … that cannot be accomplished without such collaboration

• Most significant concept for autonomic networks


The Fundamental Trade-off

• The nodes gain from collaborating
• But collaboration has costs (e.g. communications)
• Trade-off: gain from collaboration vs cost of collaboration
• Vector metrics typically involved
• Constrained coalitional games


Example 1: Network formation – effects on topology
Example 2: Collaborative robotics, communications
Example 3: Web-based social networks and services
Example 4: Groups of cancer tumor or virus cells

Gain

• Each node potentially offers benefits V per time unit to other nodes: e.g. V is the number of bits per time unit.

• Potential benefit V is reduced during transmissions due to transmission failures and delay

• Jackson–Wolinsky connections model; gain of node i:

w_i(G) = Σ_{j ≠ i} V δ^{r_ij}

• r_ij is the number of hops in the shortest path between i and j; r_ij = ∞ if there is no path between i and j
• δ, with 0 < δ < 1, is the communication depreciation rate


Cost

• Activating links is costly
– Example – cost is the energy consumption for sending data
– As in the wireless propagation model, the cost c_ij of link ij is a function of the link length d_ij:

c_ij = P d_ij^α

• P is a parameter depending on the transmitter/receiver antenna gain and the system loss not related to propagation
• α is the path loss exponent – depends on the specific propagation environment.


Pairwise Game and Convergence

• Payoff of node i from the network is defined as v_i(G) = w_i(G) − c_i(G)  (gain − cost)

• Iterated process
– Node pair ij is selected with probability p_ij
– If link ij is already in the network, the decision is whether to sever it; otherwise the decision is whether to activate the link
– The nodes act myopically, activating the link if it makes each at least as well off and one strictly better off, and deleting the link if it makes either player better off
– End: if after some time, no additional links are formed or severed
– With random mutations, the game converges to a unique Pareto equilibrium (underlying Markov chain states)
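A toy simulation of the myopic pairwise dynamics, with payoff v_i = w_i − c_i as in the gain/cost model above. The parameter values (V, δ, P, α), the node positions and the absence of random mutations are illustrative assumptions:

```python
import math, random

def payoff(i, links, pos, V=1.0, delta=0.5, P=0.1, alpha=2.0):
    """v_i(G): Jackson-Wolinsky gain V*delta^{r_ij} over reachable j,
    minus cost P*d^alpha over the links node i participates in."""
    n = len(pos)
    hops, frontier, k = {i: 0}, [i], 0        # BFS hop counts r_ij
    while frontier:
        k += 1
        nxt = [v for u in frontier for v in range(n)
               if v not in hops and ((u, v) in links or (v, u) in links)]
        for v in nxt:
            hops[v] = k
        frontier = nxt
    gain = sum(V * delta ** h for j, h in hops.items() if j != i)
    cost = sum(P * math.dist(pos[a], pos[b]) ** alpha
               for (a, b) in links if i in (a, b))
    return gain - cost

def simulate(pos, steps=300, seed=1):
    """Myopic pairwise link formation/severance."""
    rng = random.Random(seed)
    n, links = len(pos), set()
    for _ in range(steps):
        i, j = rng.sample(range(n), 2)
        e = (min(i, j), max(i, j))
        on, off = links | {e}, links - {e}
        gi = payoff(i, on, pos) - payoff(i, off, pos)   # marginal values
        gj = payoff(j, on, pos) - payoff(j, off, pos)
        if e in links:
            if gi < 0 or gj < 0:                        # either prefers severing
                links = off
        elif gi >= 0 and gj >= 0 and (gi > 0 or gj > 0):
            links = on                                  # mutually beneficial
    return links

pos = [(0.0, 0.0), (0.1, 0.0), (0.0, 0.1)]  # three nearby nodes
links = simulate(pos)                        # cheap links: all pairs connect
```

With nodes this close, every link's gain dwarfs its activation cost, so the dynamics settle on the grand coalition (the complete graph), illustrating the threshold behaviour discussed next.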

Coalition Formation at the Stable State

• The cost depends on the physical locations of nodes
– Random network where nodes are placed according to a uniform Poisson point process on the [0,1] × [0,1] square.
• Theorem (coalition formation at the stable state, as n → ∞):
– Given V, P and α, there is a sharp threshold, scaling with (ln n)/n, for establishing the grand coalition (number of coalitions = 1).
– For a range of the parameters, the threshold is less than a bound of the same (ln n)/n order.

(Illustration: n = 20.)


Topologies Formed


Trust and Collaborative Control/Operation

Two linked dynamics
• Trust/reputation propagation and collaborative control evolution
• Beyond linear algebra and weights: semirings of constraints, constraint programming, soft-constraint semirings, policies, agents
• Learning on graphs and network dynamic games: behavior, adversaries
• Adversarial models, attacks, constrained shortest paths, …
• Integrating network utility maximization (NUM) with constraint-based reasoning and coalitional games

An example of constrained coalitional games


Trust Dynamics and ‘Local’ Voting Rules

• Trust and mistrust spreading: initial “islands” of trust → trust spreads → trust-connected network
• ‘Generalized consensus’ problems
• Spin glasses (from statistical physics), phase transitions

s_i(k+1) = f({J_ij s_j(k) | j ∈ N_i})
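One concrete reading of the local voting rule is spin-glass-style sign dynamics; the choice f = sign and the keep-current-state tie rule are assumptions for illustration:

```python
def vote_step(s, J, neighbors):
    """One synchronous step of the local voting rule
    s_i(k+1) = sign( sum_{j in N_i} J_ij * s_j(k) ),
    with states +1 (trust/cooperate) and -1 (mistrust/defect);
    a zero field leaves the state unchanged."""
    out = []
    for i in range(len(s)):
        field = sum(J[i][j] * s[j] for j in neighbors[i])
        out.append(1 if field > 0 else -1 if field < 0 else s[i])
    return out

# a small "island of trust" (nodes 0 and 1) pulls node 2 to cooperate
J = [[0, 1, 0], [1, 0, 1], [0, 1, 0]]   # positive (trusting) couplings
neighbors = {0: [1], 1: [0, 2], 2: [1]}
state = vote_step([1, 1, -1], J, neighbors)
```

With all couplings positive, trust spreads until the network is trust-connected; negative J_ij entries (mistrust) create the frustrated, spin-glass-like configurations the slide alludes to.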


Results of Game Evolution

● Theorem: under a condition relating each node’s payoff x_i to its neighborhood trust weights Σ_{j ∈ N_i} J_ij (for all i ∈ N), there exists τ0 such that for a re-establishing period τ > τ0:
– the iterated game converges to a Nash equilibrium;
– in the Nash equilibrium, all nodes cooperate with all their neighbors.
● Compare games with and without the trust mechanism under the corresponding strategy updates.

[Figures: percentage of cooperating pairs vs negative links; average payoffs vs negative links]

Future Directions

• Constrained coalitional games, constraint satisfaction problems and spin glasses – better algorithms?
• Dynamic hypergraphs over semirings – replace weights with rules (hard and soft constraints, logic)
• Time-varying hypergraphs – mixing – statistical physics
• Generalized constrained shortest path problems – multiple semirings
• Design of high-integrity reputation systems, resilient trust
• Network self-organization to improve resiliency/robustness
• Role of trust in establishing composable security
• Networks of agents with dynamic positive and negative trust (recommendations)?

Thank you!

baras@isr.umd.edu
301-405-6606

http://dev‐baras.pantheonsite.io/

Questions?
