Scalable Information-Driven Sensor Querying and Routing for Ad Hoc Heterogeneous Sensor Networks


1

Scalable Information-Driven Sensor Querying and Routing for ad hoc Heterogeneous Sensor Networks

Maurice Chu, Horst Haussecker and Feng Zhao

Based upon slides by: Jigesh Vora, UCLA

2

Overview

- Introduction
- Sensing Model
- Problem Formulation & Information Utility
- Algorithm for Sensor Selection and Dynamic Routing
- Experiments
- Rumor Routing

3

Introduction

How to dynamically query sensors and route data in a network so that information gain is maximized while power and bandwidth consumption are minimized.

Two algorithms:

- Information-Driven Sensor Querying (IDSQ)
- Constrained Anisotropic Diffusion Routing (CADR)

4

What’s new?

- Introduction of two novel techniques, IDSQ and CADR, for energy-efficient data querying and routing in sensor networks
- The use of a general form of information utility that models the information content as well as the spatial configuration of a network
- A generalization of directed diffusion that uses both the communication cost and the information utility to diffuse data

5

Problem Formulation

z_i(t) = h(x(t), λ_i(t))    (1)

where h is based on parameters of the sensor, x(t) is the target state, and λ_i(t) and z_i(t) are the characteristics and measurement of sensor i, respectively.

For sensors measuring sound amplitude:

λ_i = [x_i, σ_i²]^T    (3)

where x_i is the known sensor position and σ_i² is the known additive noise variance.

z_i = a / ||x_i − x||^(α/2) + w_i    (4)

where a is the target amplitude, α is the attenuation coefficient, and w_i is Gaussian noise with variance σ_i².
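As a concrete illustration, here is a minimal sketch of the amplitude model in Eq. (4); the function and variable names are ours, not the paper's.

```python
import math
import random

def sense(target_pos, amplitude, sensor_pos, noise_var, alpha=2.0, rng=None):
    """Eq. (4): z_i = a / ||x_i - x||^(alpha/2) + w_i."""
    rng = rng or random.Random(0)
    dist = math.dist(sensor_pos, target_pos)        # ||x_i - x||
    w = rng.gauss(0.0, math.sqrt(noise_var))        # Gaussian noise, variance sigma_i^2
    return amplitude / (dist ** (alpha / 2)) + w

# A sensor 2 m from a target of amplitude 10, attenuation alpha = 2, low noise:
z = sense(target_pos=(0.0, 0.0), amplitude=10.0, sensor_pos=(2.0, 0.0), noise_var=0.01)
```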

6

Define Belief as ...

Representation of the current a posteriori distribution of x given measurements z_1, …, z_N:

p(x | z_1, …, z_N)

The expectation is taken as the estimate:

x̄ = ∫ x p(x | z_1, …, z_N) dx

The covariance approximates the residual uncertainty:

Σ = ∫ (x − x̄)(x − x̄)^T p(x | z_1, …, z_N) dx
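For a belief held as point samples, the estimate and residual uncertainty above reduce to weighted sums. A minimal sketch for a hypothetical discretized 1-D belief (names are ours):

```python
def belief_stats(xs, probs):
    """Mean and variance of a discretized 1-D belief p(x | z_1..z_N)."""
    total = sum(probs)
    p = [pi / total for pi in probs]                    # normalize
    mean = sum(x * pi for x, pi in zip(xs, p))          # estimate x-bar
    var = sum((x - mean) ** 2 * pi for x, pi in zip(xs, p))  # residual uncertainty
    return mean, var

xs = [0.0, 1.0, 2.0, 3.0]
mean, var = belief_stats(xs, [0.1, 0.4, 0.4, 0.1])
```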

7

Define Information Utility as …

The information utility function is defined as

ψ: P(R^d) → R

where d is the dimension of x. ψ assigns a value to each element of P(R^d) indicating the uncertainty of the distribution:

- smaller value → more spread-out distribution
- larger value → tighter distribution

8

Sensor Selection (in theory)

j_0 = arg max_{j ∈ A} ψ(p(x | {z_i}_{i ∈ U} ∪ {z_j}))

- A = {1, …, N} − U, where U is the set of sensors whose measurements have already been incorporated into the belief
- ψ is the information utility function defined on the class of all probability distributions of x
- Intuitively: select for querying the sensor j such that the information utility of the distribution updated by z_j is maximal
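The selection rule is a one-line arg-max over the unused sensors. A sketch, assuming a hypothetical utility_of_updated_belief callback that evaluates ψ on the belief updated with sensor j's measurement:

```python
def select_sensor(candidates, used, utility_of_updated_belief):
    """j_0 = argmax over A = candidates - used of psi(updated belief)."""
    A = [j for j in candidates if j not in used]
    return max(A, key=utility_of_updated_belief)

# Pretend querying sensor j would shrink the belief variance to var_after[j];
# utility is negative variance, so a tighter belief scores higher.
var_after = {1: 0.9, 2: 0.3, 3: 0.5}
j0 = select_sensor([1, 2, 3], used={1},
                   utility_of_updated_belief=lambda j: -var_after[j])
```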

9

Sensor Selection (in practice)

z_j is unknown before it is sent back, so in practice:

- best average case: j′ = arg max_{j ∈ A} E_{z_j}[ψ(p(x | {z_i}_{i ∈ U} ∪ {z_j})) | {z_i}_{i ∈ U}]
- maximizing the worst case: j′ = arg max_{j ∈ A} min_{z_j} ψ(p(x | {z_i}_{i ∈ U} ∪ {z_j}))
- maximizing the best case: j′ = arg max_{j ∈ A} max_{z_j} ψ(p(x | {z_i}_{i ∈ U} ∪ {z_j}))
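The best-average-case rule can be sketched by discretizing the unknown z_j into a few outcomes; all names and numbers below are illustrative, not from the paper:

```python
def select_expected(candidates, used, z_outcomes, psi_after):
    """Best-average-case: j' = argmax_j E_{z_j}[psi(belief updated with z_j)],
    with the expectation approximated over a discretized set of outcomes."""
    A = [j for j in candidates if j not in used]
    def expected_utility(j):
        return sum(prob * psi_after(j, z) for z, prob in z_outcomes[j])
    return max(A, key=expected_utility)

# Two candidates, two equally likely outcomes each; psi_after(j, z) is the
# (made-up) utility of the belief after incorporating outcome z from sensor j.
z_outcomes = {1: [("lo", 0.5), ("hi", 0.5)], 2: [("lo", 0.5), ("hi", 0.5)]}
utilities = {(1, "lo"): -1.0, (1, "hi"): -0.2, (2, "lo"): -0.9, (2, "hi"): -0.8}
j_best = select_expected([1, 2], set(), z_outcomes, lambda j, z: utilities[(j, z)])
```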

10

Sensor Selection Example

11

Information Utility Measures

- covariance-based: ψ(p_X) = −det(Σ), ψ(p_X) = −trace(Σ)
- Fisher information matrix: ψ(p_X) = det(F(x̄)), ψ(p_X) = trace(F(x̄))
- entropy of estimation uncertainty: ψ(p_X) = −H(P) (discrete), ψ(p_X) = −h(p_X) (continuous)
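A sketch of the covariance- and entropy-based measures for a 2-D belief; the 2×2 determinant and trace are hand-coded to avoid a linear-algebra dependency, and all names are ours:

```python
import math

def psi_det(cov):
    """psi(p_X) = -det(Sigma), 2x2 case: a smaller uncertainty ellipse scores higher."""
    return -(cov[0][0] * cov[1][1] - cov[0][1] * cov[1][0])

def psi_trace(cov):
    """psi(p_X) = -trace(Sigma)."""
    return -(cov[0][0] + cov[1][1])

def psi_entropy(probs):
    """psi(p_X) = -H(P) for a discrete belief."""
    return sum(p * math.log(p) for p in probs if p > 0)

tight = [[0.1, 0.0], [0.0, 0.1]]   # concentrated belief
loose = [[1.0, 0.0], [0.0, 1.0]]   # spread-out belief
```

All three assign the larger (better) value to the tighter distribution, as the definition of ψ requires.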

12

Information Utility Measures (Contd ...)

- volume of the high-probability region Ω_δ = {x ∈ S : p(x) ≥ δ}, with δ chosen so that P(Ω_δ) = β for a given β: ψ(p_X) = −vol(Ω_δ)
- sensor-geometry-based measures, for cases where the utility is a function of sensor location only: ψ(p_X) = −(x_i − x_0)^T Σ^{−1} (x_i − x_0), where x_0 is the mean of the current estimate of target location; this is also called the Mahalanobis distance
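A sketch of the Mahalanobis-distance utility for a 2-D estimate, assuming the inverse covariance Σ⁻¹ is already available (names are ours):

```python
def psi_mahalanobis(x_i, x_0, cov_inv):
    """psi(p_X) = -(x_i - x_0)^T Sigma^{-1} (x_i - x_0), 2-D case."""
    d = [x_i[0] - x_0[0], x_i[1] - x_0[1]]
    q = (d[0] * (cov_inv[0][0] * d[0] + cov_inv[0][1] * d[1])
         + d[1] * (cov_inv[1][0] * d[0] + cov_inv[1][1] * d[1]))
    return -q

# Uncertainty elongated along x (variance 4 vs 0.25): a sensor along the long
# axis is "closer" in Mahalanobis terms than one equally far along y.
cov_inv = [[0.25, 0.0], [0.0, 4.0]]             # inverse of diag(4, 0.25)
u_along = psi_mahalanobis((2.0, 0.0), (0.0, 0.0), cov_inv)
u_across = psi_mahalanobis((0.0, 2.0), (0.0, 0.0), cov_inv)
```

This is why the measure captures spatial configuration: sensors in the direction of greatest uncertainty score higher even at the same Euclidean distance.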

13

Composite Objective Function

M_c(λ_l, λ_j, p(x | {z_i}_{i ∈ U})) = α M_u(p(x | {z_i}_{i ∈ U}), λ_j) − (1 − α) M_a(λ_l, λ_j)

- M_u is the information utility measure
- M_a is the communication cost measure
- α ∈ [0, 1] balances their contributions
- λ_j is the characteristics of the current sensor j

j_0 = arg max_{j ∈ A} M_c(λ_l, λ_j, p(x | {z_i}_{i ∈ U}))
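A sketch of the composite objective with made-up utilities and costs, showing how α trades information gain against communication cost:

```python
def composite(j, alpha, M_u, M_a):
    """M_c = alpha * M_u(j) - (1 - alpha) * M_a(j), alpha in [0, 1]."""
    return alpha * M_u[j] - (1 - alpha) * M_a[j]

M_u = {1: 5.0, 2: 4.0}          # information utility per candidate sensor
M_a = {1: 10.0, 2: 1.0}         # communication cost per candidate sensor
info_only = max(M_u, key=lambda j: composite(j, 1.0, M_u, M_a))   # ignores cost
balanced = max(M_u, key=lambda j: composite(j, 0.5, M_u, M_a))    # trades both
```

With α = 1 the most informative (but expensive) sensor wins; at α = 0.5 the cheaper sensor overtakes it.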

14

Incremental Update of Belief

p(x | z_1, …, z_N) = c p(x | z_1, …, z_{N−1}) p(z_N | x)

- z_N is the new measurement
- p(x | z_1, …, z_{N−1}) is the previous belief
- p(x | z_1, …, z_N) is the updated belief
- c is a normalizing constant

For a linear system with Gaussian distributions, the Kalman filter is used.
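On a grid belief the incremental update is just a pointwise product followed by normalization. A sketch (the Gaussian/Kalman case is handled separately, as the slide notes):

```python
def bayes_update(prior, likelihood):
    """p(x | z_1..z_N) = c * p(x | z_1..z_{N-1}) * p(z_N | x), on a grid."""
    post = [p * l for p, l in zip(prior, likelihood)]
    c = 1.0 / sum(post)                       # normalizing constant
    return [c * p for p in post]

prior = [0.25, 0.25, 0.25, 0.25]              # previous belief over 4 grid cells
likelihood = [0.1, 0.7, 0.1, 0.1]             # p(z_N | x) at each cell
post = bayes_update(prior, likelihood)
```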

15

IDSQ Algorithm

16

CADR Algorithm - global knowledge of sensor positions

With global knowledge of sensor positions, the optimal position to route the query to is given by

x_o = arg_x [∇M_c = 0]    (10)

The query is then addressed directly to the sensor node closest to the optimal position.

17

CADR Algorithm - no global knowledge of sensor positions

Each node k picks the next hop j′ among its neighbors j by one of:

1. j′ = arg max_j M_c(x_j)
2. j′ = arg max_j (∇M_c)^T (x_k − x_j) / (|∇M_c| |x_k − x_j|)
3. instead of following ∇M_c only, follow

d = γ ∇M_c + (1 − γ)(x_o − x_j), with γ = f(|x_o − x_j|^{−1})

so that for a large distance to x_o the query follows (x_o − x_j), and for a small distance it follows ∇M_c.
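Rule 3's blended direction can be sketched as a convex combination; γ is passed in directly here, whereas the paper makes it a function of the distance to x_o (all names are ours):

```python
def routing_direction(grad_Mc, x_o, x_j, gamma):
    """Rule 3: d = gamma * grad(M_c) + (1 - gamma) * (x_o - x_j).
    gamma grows as the query nears x_o: far away it homes in on x_o,
    close in it climbs the information gradient."""
    return tuple(gamma * g + (1 - gamma) * (o - p)
                 for g, o, p in zip(grad_Mc, x_o, x_j))

# Far from x_o (small gamma): direction dominated by (x_o - x_j).
far = routing_direction(grad_Mc=(1.0, 0.0), x_o=(10.0, 10.0), x_j=(0.0, 0.0), gamma=0.1)
# Near x_o (large gamma): direction dominated by the gradient.
near = routing_direction(grad_Mc=(1.0, 0.0), x_o=(1.0, 1.0), x_j=(0.0, 0.0), gamma=0.9)
```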

18

IDSQ Experiments - sensor selection criteria

A. nearest-neighbor data diffusion: j_0 = arg min_{j ∈ {1, …, N} − U} ||x_l − x_j||

B. Mahalanobis distance: j_0 = arg min_{j ∈ {1, …, N} − U} (x_j − x_0)^T Σ^{−1} (x_j − x_0)

C. maximum likelihood: j_0 = arg max_{j ∈ {1, …, N} − U} p(x_j | {z_i}_{i ∈ U})

D. best feasible region (upper bound)

Reference on the Mahalanobis distance: http://www.itl.nist.gov/div898/software/dataplot/refman2/auxillar/matrdist.htm

19

IDSQ Experiments

20

IDSQ Experiments

21

IDSQ - Comparison of NN and Mahalanobis distance method

NN distance methodNN distance method Mahalanobis distance methodMahalanobis distance method

22

IDSQ Experiments

23

IDSQ Experiments

24

IDSQ Experiments

B uses the Mahalanobis distance; A is nearest-neighbor diffusion. “Performs better” means less error for a given number of sensors used.

25

IDSQ Experiments

C uses maximum likelihood.

26

IDSQ Experiments

D is the best feasible region approach. Requires knowledge of sensor data before querying.

27

CADR Experiments

M_c = α M_u − (1 − α) M_a

α = 1

Figure 12-1

28

CADR Experiments

M_c = α M_u − (1 − α) M_a

α = 0.2

Figure 12-2

29

CADR Experiments

M_c = α M_u − (1 − α) M_a

α = 0

Figure 12-3

30

Critical Issues

- Use of directed diffusion to implement IDSQ/CADR
- Belief representation
- Impact of choice of representation
- Hybrid representation

31

Belief Representation

- Parametric: each distribution is described by a set of parameters; poorer quality but lightweight, e.g. Gaussian distributions
- Non-parametric: each distribution is approximated by point samples; more accurate but more costly, e.g. grid or histogram approaches

32

Impact of Choice of Representation

- A non-parametric approximation is more accurate, but more bits are required to transmit it.
- A parametric approximation is poorer, but fewer bits are required to transmit it.

Hybrid approach:

- Initially, the belief is parameterized by a history of measurements.
- Once the belief looks unimodal, it can be approximated with a parametric form such as a Gaussian distribution.

33

Other Issues

The paper initially discusses mitigating link/node failures, but no experiments demonstrate this and the algorithms do not address it.

34

Rumor Routing

- Nodes that have observed an event send out agents, which leave routing information to the event as state in the nodes they pass
- Agents attempt to travel in a straight line
- If an agent crosses a path to another event, it begins to build the path to both
- Agents also optimize paths if they find shorter ones

Paper: David Braginsky and Deborah Estrin. Slide adapted from Sugata Hazarika, UCLA.

35

Algorithm Basics

- All nodes maintain a neighbor list and an event table
- When a node observes an event, the event is added to its table with distance 0
- Agents are packets that carry local event information across the network, aggregating events as they go

36

Agent Path

- The agent tries to travel in a “somewhat” straight path
- It maintains a list of recently seen nodes; when it arrives at a node, it adds that node’s neighbors to the list
- For the next hop, it tries to find a node not on the recently-seen list
- This avoids loops; the important thing is to find a path regardless of “quality”
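A sketch of this straightening heuristic over a simple adjacency-list graph; all names are ours, and the real agent would also cap the recently-seen list:

```python
import random

def next_hop(current, neighbors, recently_seen, rng=None):
    """Prefer a neighbor not on the recently-seen list; fall back to any
    neighbor so the agent keeps moving inside an already-visited region."""
    rng = rng or random.Random(0)
    fresh = [n for n in neighbors[current] if n not in recently_seen]
    choice = rng.choice(fresh if fresh else neighbors[current])
    recently_seen.update(neighbors[current])    # remember this node's neighborhood
    return choice

neighbors = {"a": ["b", "c"], "b": ["a", "d"], "c": ["a"], "d": ["b"]}
seen = {"a"}
hop = next_hop("a", neighbors, seen)            # picks "b" or "c", never back to "a"
```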

37

Following Paths

- A query originates from the source and is forwarded along until it reaches its TTL
- Forwarding rules:
  - if a node has seen the query before, it is sent to a random neighbor
  - if a node has a route to the event, forward to the neighbor along the route
  - otherwise, forward to a random neighbor using the straightening algorithm
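The forwarding rules can be sketched as follows; the node record and field names are hypothetical, and the straightening step is collapsed into a plain random choice:

```python
import random

def forward_query(node, query_id, event, rng=None):
    """Rumor-routing forwarding: a repeated query or no known route goes to a
    random neighbor; a route in the event table gives the next hop directly."""
    rng = rng or random.Random(1)
    if query_id in node["seen_queries"]:
        return rng.choice(node["neighbors"])
    node["seen_queries"].add(query_id)
    route = node["event_table"].get(event)      # (distance, next_hop) if known
    if route is not None:
        return route[1]
    return rng.choice(node["neighbors"])        # straightening step omitted

node = {"neighbors": ["n1", "n2"], "seen_queries": set(),
        "event_table": {"fire": (3, "n2")}}
hop = forward_query(node, "q7", "fire")         # route known -> "n2"
```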

38

Some Thoughts

- The effect of event distribution on the results is not clear.
- The straightening algorithm used is essentially a random walk; can something better be done?
- How to tune the parameters for different network sizes and node densities is not clear: there are no guidelines, only simulation results in one particular environment.

39

Simulation Results

- Assume that undelivered queries are flooded
- A wide range of parameters allows energy savings over either of the naïve alternatives (query flooding or event flooding)
- Optimal parameters depend on network topology, query/event distribution and frequency
- The algorithm was very sensitive to event distribution

[Figure: “10 Events, 4000 Nodes” — number of transmissions (thousands) vs. number of queries, comparing Query Flooding, Event Flooding, and rumor routing with A=28, La=500, Lq=1000 and with A=52, La=100, Lq=2000]

40

Questions?
