
Approximate Aggregation Techniques for Sensor Databases

John Byers
Computer Science Department
Boston University

Joint work with Jeffrey Considine, George Kollios and Feifei Li

Sensor Network Model

- Large set of sensors distributed in a sensor field.
- Communication via a wireless ad-hoc network.
- Nodes and links are failure-prone.
- Sensors are resource-constrained: limited memory, battery-powered, messaging is costly.

Sensor Databases

- Useful abstraction: treat the sensor field as a distributed database.
  - But: data is gathered, not stored or saved.
- Express queries in an SQL-like language: COUNT, SUM, AVG, MIN, GROUP-BY.
- Query processor distributes the query and aggregates responses.
- Exemplified by systems like TAG (Berkeley/MIT) and Cougar (Cornell).

[Figure: example sensor network of ten nodes (A through J), each labeled with its sensed value.]

A Motivating Example

- Each sensor has a single sensed value.
- Sink initiates one-shot queries such as: What is the maximum value? The mean value?
- Continuous queries are a natural extension.

MAX Aggregation (no losses)

- Build a spanning tree; aggregate in-network.
  - Each node sends one summary packet.
  - Summary has the MAX of its entire sub-tree.
- One loss could lose the MAX of many nodes.
  - Neighbors of the sink are particularly vulnerable.

[Figure: spanning-tree aggregation over the example network; each node forwards the MAX of its sub-tree; the sink computes MAX = 12.]

MAX Aggregation (with loss)

- Nodes send summaries over multiple paths.
  - Local broadcast is free.
  - Always send the MAX value observed.
- MAX is "infectious".
  - Harder to lose: just need one viable path to the sink.
- Relies on the duplicate-insensitivity of MAX.

[Figure: multipath aggregation over the example network; duplicate copies of a summary are harmless, and the sink still computes MAX = 12.]
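The duplicate-insensitivity that makes multipath MAX safe can be shown with a tiny sketch (function name is mine, not from the talk): merging with max() is idempotent, so hearing the same summary over several paths changes nothing.

```python
def merge_max(summaries):
    """Combine MAX summaries heard over any number of (possibly duplicate) paths."""
    result = None
    for s in summaries:
        result = s if result is None else max(result, s)
    return result

# A node with sensed value 7 overhears the summary 12 twice (two paths) and 9 once:
merge_max([7, 12, 12, 9])   # 12: the duplicate copy of 12 is harmless
```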

AVG Aggregation (no losses)

- Build a spanning tree; aggregate in-network.
  - Each node sends one summary packet.
  - Summary has the SUM and COUNT of its sub-tree.
- Same reliability problem as before.

[Figure: tree aggregation of (SUM, COUNT) pairs over the example network; the sink computes AVG = 70/10 = 7.]

AVG Aggregation (naive)

- What if redundant copies of data are sent?
- AVG is duplicate-sensitive.
  - Duplicating data changes the aggregate.
  - It increases the weight of the duplicated data.

[Figure: a duplicated (12, 1) summary arrives over two paths and inflates the result to AVG = 82/11 ≠ 7.]

AVG Aggregation (TAG++)

- Can compensate for the increased weight [MFHH'02]: send halved SUM and COUNT instead.
- Does not change the expectation!
- Only reduces the variance.

[Figure: halved summaries, e.g. (6, 0.5) sent to each of two parents; the sink computes AVG = 70/10 = 7 in expectation.]
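A small simulation (illustrative only; names are mine) shows why halving works: splitting the state (SUM, COUNT) = (12, 1) into two (6, 0.5) shares, one per parent, leaves the expected delivered SUM unchanged under independent link loss.

```python
import random

def split_summary(total, count, k=2):
    """TAG++-style: send a 1/k share of the (SUM, COUNT) state to each of k parents."""
    return [(total / k, count / k) for _ in range(k)]

def mean_delivered_sum(loss_rate=0.05, trials=10000, rng=random):
    """Average SUM delivered from one node with state (12, 1) and two lossy parents."""
    delivered = 0.0
    for _ in range(trials):
        for share, _cnt in split_summary(12, 1, k=2):
            if rng.random() > loss_rate:   # this parent's link survived
                delivered += share
    return delivered / trials              # expectation: 12 * (1 - loss_rate)
```

Compared with sending the full (12, 1) to one parent, the expectation is identical, but the all-or-nothing outcome is replaced by two half-sized gambles, which is exactly the variance reduction the slide refers to.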

AVG Aggregation (LIST)

- Can handle duplicates exactly with a list of <id, value> pairs.
- Transmitting this list is expensive!
- Lower bound: linear space is necessary if we demand exact results.

[Figure: explicit <id, value> lists, e.g. C,9; E,1; F,1; H,4 merged with F,1; I,10; duplicates collapse by id and the sink computes AVG = 70/10 = 7 exactly.]
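Exact duplicate handling can be sketched with pairs from the example (a toy illustration, not the paper's wire format): keying the list by sensor id makes the union idempotent, so AVG stays exact no matter how many paths a pair travels.

```python
def merge_lists(a, b):
    """Union two lists of (id, value) pairs; duplicates collapse by id."""
    merged = dict(a)
    merged.update(dict(b))
    return sorted(merged.items())

path1 = [("C", 9), ("E", 1), ("F", 1), ("H", 4)]
path2 = [("F", 1), ("I", 10)]            # ("F", 1) also arrived over another path
combined = merge_lists(path1, path2)     # ("F", 1) is counted exactly once
avg = sum(v for _, v in combined) / len(combined)   # 25 / 5 = 5.0
```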

Classification of Aggregates

- TAG classifies aggregates according to:
  - Size of partial state
  - Monotonicity
  - Exemplary vs. summary
  - Duplicate-sensitivity
- MIN/MAX (cheap and easy): small state, monotone, exemplary, duplicate-insensitive.
- COUNT/SUM/AVG (considerably harder): small state and monotone, BUT duplicate-sensitive.
  - Cheap if aggregating over a tree without losses.
  - Expensive with multiple paths and losses.

Design Objectives for Robust Aggregation

- Admit in-network aggregation of partial values.
- Let the representation of aggregates be both order-insensitive and duplicate-insensitive.
- Be agnostic to the routing protocol.
  - Trust the routing protocol to be best-effort.
  - Routing and aggregation can be logically decoupled [NG'03].
  - Some routing algorithms (e.g. multipath) are better than others.
- Exact answers incur extremely high cost.
  - We argue that it is reasonable to use aggregation methods that are themselves approximate.

Outline

- Introduction
- Sketch Theory and Practice
  - COUNT sketches (old)
  - SUM sketches (new)
  - Practical realizations for sensor nets
- Experiments
- Conclusions

COUNT Sketches

- Problem: estimate the number of distinct item IDs in a data set with only one pass.
- Constraints:
  - Small space relative to stream size.
  - Small per-item processing overhead.
  - A union operator on sketch results.
- Exact COUNT is impossible without linear space.
- First approximate COUNT sketch in [FM'85]: O(log N) space, O(1) processing time per item.

Counting Paintballs

- Imagine the following scenario:
  - A bag of n paintballs is emptied at the top of a long staircase.
  - At each step, each paintball either bursts and marks the step, or bounces to the next step: 50/50 chance either way.
- Looking only at the pattern of marked steps, what was n?
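The process is easy to simulate (a throwaway sketch, not from the talk): each ball's burst step is the number of consecutive "bounce" coin flips it wins, so marks pile up on the first few steps and peter out after about log2 n of them.

```python
import random

def drop_paintballs(n, rng=random):
    """Drop n paintballs down the staircase; return the set of marked steps."""
    marked = set()
    for _ in range(n):
        step = 0
        while rng.random() < 0.5:   # bounce to the next step (prob 1/2)...
            step += 1
        marked.add(step)            # ...otherwise burst and mark this step
    return marked

# For n = 1000, roughly the first log2(1000) ~ 10 steps are all marked,
# and marks become sparse beyond that.
```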

Counting Paintballs (cont)

- What does the distribution of paintball bursts look like?
  - The number of bursts at each step follows a binomial distribution: B(n, 1/2) at the 1st step, B(n, 1/4) at the 2nd, ..., B(n, 1/2^S) at the S-th.
  - The expected number of bursts drops geometrically.
  - Few bursts after log2 n steps.

Counting Paintballs (cont)

- Many different estimator ideas [FM'85, AMS'96, GGR'03, DF'03, ...].
- Example: let pos denote the position of the highest unmarked stair. Then
  E(pos) ≈ log2(0.775351 n)
  σ(pos) ≈ 1.12127
- Standard variance-reduction methods apply.
- Either O(log n) or O(log log n) space.

Back to COUNT Sketches

- The COUNT sketches of [FM'85] are equivalent to the paintball process.
  - Start with a bit-vector of all zeros.
  - For each item: use its ID and a hash function for coin flips, pick a bit-vector entry, and set that bit to one.
- These sketches are duplicate-insensitive:

  {x}    → 1 0 0 0 0
  {y}    → 0 0 1 0 0
  {x,y}  → 1 0 1 0 0

  ∀A,B: Sketch(A) ∪ Sketch(B) = Sketch(A ∪ B)
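A compact, simplified rendition of the [FM'85] scheme: the hash of an ID supplies the coin flips, insertion sets one bit, union is bitwise OR, and the position of the lowest zero bit drives the estimate. The 0.77351 correction constant is from [FM'85]; everything else here (function names, SHA-1 as the hash) is an illustrative choice.

```python
import hashlib

def fm_position(item_id):
    """Map an ID to bit position k with probability 2^-(k+1) (hash bits = coin flips)."""
    h = int(hashlib.sha1(str(item_id).encode()).hexdigest(), 16)
    pos = 0
    while h & 1:        # count trailing one-bits of the hash
        pos += 1
        h >>= 1
    return pos

def insert(sketch, item_id):
    return sketch | (1 << fm_position(item_id))

def union(a, b):
    return a | b        # duplicate-insensitive: OR-ing a sketch with itself is a no-op

def estimate(sketch):
    """n is roughly 2^pos / 0.77351, where pos is the lowest zero bit."""
    pos = 0
    while sketch & (1 << pos):
        pos += 1
    return (2 ** pos) / 0.77351
```

In practice many independent sketches are averaged to tame the variance, exactly as the k bit-vectors in the slides do.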

Application to Sensornets

- Each sensor computes k independent sketches of itself using its unique sensor ID.
  - Coming next: sensor computes sketches of its value.
- Use a robust routing algorithm to route sketches up to the sink.
- Aggregate the k sketches via in-network OR.
  - Union via OR is duplicate-insensitive.
- The sink then estimates the count.
- Similar to gossip and epidemic protocols.

SUM Sketches

- Problem: estimate the sum of values of distinct <key, value> pairs in a data stream with repetitions (value ≥ 0, integral).
- Obvious start: emulate value insertions into a COUNT sketch and use the same estimators.
  - For <k, v>, imagine inserting <k, v, 1>, <k, v, 2>, ..., <k, v, v>.
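The naive emulation fits in a few lines (hypothetical names; `position` stands for any COUNT-sketch hash such as the trailing-bits trick above): correct and duplicate-insensitive, but O(v) work per pair.

```python
def sum_insert_naive(sketch, key, value, position):
    """Emulate the v distinct insertions <key, value, 1> ... <key, value, v>."""
    for i in range(1, value + 1):
        sketch |= 1 << position((key, value, i))
    return sketch
```

Because each duplicate of <key, value> regenerates exactly the same v insertions, re-processing the pair leaves the sketch unchanged.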

SUM Sketches (cont)

- But what if the value is 1,000,000?
- Main idea (details on next slide):
  - Recall that all of the low-order bits will be set to 1 w.h.p. when inserting such a value.
  - Just set these bits to one immediately.
  - Then set the high-order bits carefully.

Simulating a set of insertions

- Set all the low-order bits in the "safe" region.
  - First S = log v − 2 log log v bits are set to 1 w.h.p.
- Statistically estimate the number of trials going beyond the "safe" region.
  - Probability of a trial doing so is simply 2^−S.
  - Number of trials ~ B(v, 2^−S). [Mean = O(log² v)]
- For trials and bits outside the "safe" region, set those bits manually.
  - Running time is O(1) for each outlying trial.
- Expected running time: O(log v) + time to draw from B(v, 2^−S) + O(log² v).
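The recipe above in toy form (the real system draws the binomial from precomputed tables; here the survivor count is simulated naively just to keep the demo self-contained):

```python
import math
import random

def sum_insert_fast(v, rng=random):
    """Return the set of bit positions hit when inserting value v.
    Positions 0 .. S-1 (S = log v - 2 log log v) are set wholesale;
    only the ~B(v, 2^-S) trials that escape the safe region are simulated."""
    if v < 16:
        S = 0
    else:
        S = int(math.log2(v) - 2 * math.log2(math.log2(v)))
    bits = set(range(S))                    # the "safe" low-order bits
    # Number of trials reaching position S is B(v, 2^-S); a real sensor
    # draws this from a table (next slides) instead of looping over v.
    survivors = sum(1 for _ in range(v) if rng.random() < 2.0 ** -S)
    for _ in range(survivors):              # finish each outlying trial by hand
        pos = S
        while rng.random() < 0.5:
            pos += 1
        bits.add(pos)
    return bits
```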

Sampling for Sensor Networks

- We need to generate samples from B(n, p) with a slow CPU, very little RAM, and no floating-point hardware.
- General problem: sampling from a discrete pdf.
- Assume we can draw uniformly at random from [0, 1].
- With an event space of size N:
  - O(log N) lookups are immediate: represent the cdf in an array of size N, draw from [0, 1], and do binary search.
  - Cleverer methods give O(log log N) or O(log* N) time.
- Amazingly, this can be done in constant time!
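The O(log N) baseline in one screen (using the toy pdf that appears on the next slide; `bisect` does the binary search):

```python
import bisect
import itertools

def make_cdf(pdf):
    """Precompute the running totals of a discrete pdf (an array of size N)."""
    return list(itertools.accumulate(pdf))

def draw(cdf, u):
    """Map a uniform u in [0, 1) to an outcome index: one binary search."""
    return bisect.bisect_right(cdf, u)

pdf = [0.10, 0.25, 0.05, 0.25, 0.35]   # outcomes A, B, C, D, E
cdf = make_cdf(pdf)
draw(cdf, 0.50)   # 3 (outcome D, since 0.50 falls in (0.40, 0.65])
```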

Walker's Alias Method

[Figure: step-by-step construction for the pdf A = 0.10, B = 0.25, C = 0.05, D = 0.25, E = 0.35. Probability mass is repeatedly moved from over-full outcomes into n columns of height 1/n, until every column holds at most two outcomes.]

Theorem [Walker '77]: For any discrete pdf D over a sample space of size n, a table of size O(n) can be constructed in O(n) time that enables random variables to be drawn from D using at most two table lookups.
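A compact implementation of Walker's construction (variable names are mine; the pdf is the A..E example from the figure): each of the n columns holds its own outcome plus at most one "alias", so a draw is one random column plus one biased coin.

```python
import random

def build_alias(pdf):
    """O(n) preprocessing: returns (prob, alias) tables of size n."""
    n = len(pdf)
    prob = [p * n for p in pdf]          # rescale so the average column is 1.0
    alias = [0] * n
    small = [i for i, p in enumerate(prob) if p < 1.0]
    large = [i for i, p in enumerate(prob) if p >= 1.0]
    while small and large:
        s, l = small.pop(), large.pop()
        alias[s] = l                     # top up the short column s from column l
        prob[l] -= 1.0 - prob[s]
        (small if prob[l] < 1.0 else large).append(l)
    return prob, alias

def alias_draw(prob, alias, rng=random):
    i = rng.randrange(len(prob))                        # table lookup 1
    return i if rng.random() < prob[i] else alias[i]    # table lookup 2
```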

Binomial Sampling for Sensors

- Recall we want to sample from B(v, 2^−S) for various values of v and S.
  - First, reduce to sampling from G(1 − 2^−S).
  - Truncate the distribution to make the range finite (recursion handles large values).
  - Construct tables of size 2^S for each S of interest.
  - Can sample B(v, 2^−S) in O(v · 2^−S) expected time.
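The reduction can be sketched in software (illustrative only: the geometric draw below uses the floating-point inverse-cdf trick, where a sensor would use the precomputed tables instead): successive geometric gaps hop from one success to the next, giving O(np) expected time for B(n, p).

```python
import math
import random

def geometric(p, rng=random):
    """Failures before the first success: floor(log(1 - u) / log(1 - p))."""
    return int(math.log(1.0 - rng.random()) / math.log(1.0 - p))

def binomial(n, p, rng=random):
    """Sample B(n, p) in O(np) expected time by hopping from success to success."""
    count, pos = 0, geometric(p, rng)
    while pos < n:
        count += 1
        pos += 1 + geometric(p, rng)
    return count
```

For p = 2^−S this is exactly the O(v · 2^−S) bound quoted above: the loop body runs once per success.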

The Bottom Line

- SUM inserts in:
  - O(log²(v)) time with O(v / log²(v)) space
  - O(log(v)) time with O(v / log(v)) space
  - O(v) time with the naive method
- Using the O(log²(v)) method, 16-bit values (S ≤ 8) and 64-bit probabilities:
  - Resulting lookup tables are ~4.5 KB.
  - The recursive nature of G(1 − 2^−S) lets us tune the size further.
- Can achieve O(log v) time at the cost of bigger tables.

Outline

- Introduction
- Sketch Theory and Practice
- Experiments
- Conclusions

Experimental Results

- Used the TAG simulator.
- Grid topology with the sink in the middle.
  - Grid size [default: 30 by 30]
  - Transmission radius [default: 8 neighbors on the grid]
  - Node, packet, or link loss [default: 5% link loss rate]
  - Number of bit vectors [default: 20 bit-vectors of 16 bits (compressed)]

Experimental Results

- We consider four main methods.
  - TAG1: transmit aggregates up a single spanning tree.
  - TAG2: send a 1/k fraction of the aggregated values to each of k parents.
  - SKETCH: broadcast an aggregated sketch to all neighbors at level i − 1.
  - LIST: explicitly enumerate all <key, value> pairs and broadcast to all neighbors at level i − 1.
- LIST vs. SKETCH measures the penalty associated with approximate values.

COUNT vs Link Loss (grid)

[Graphs: two result slides showing COUNT accuracy vs. link loss rate on the grid.]

SUM vs Link Loss (grid)

[Graph: SUM accuracy vs. link loss rate on the grid.]

Message Cost Comparison

Strategy   Total Data Bytes   Messages Sent   Messages Received
TAG1       1800               900             900
TAG2       1800               900             2468
SKETCH     10843              900             2468
LIST       170424             900             2468

Outline

- Introduction
- Sketch Theory and Practice
- Experiments
- Conclusions

Our Work in Context

- In parallel with our efforts:
  - Nath and Gibbons (Intel/CMU): What are the essential properties of duplicate insensitivity? What other aggregates can be sketched?
  - Bawa et al. (Stanford): What routing methods are necessary to guarantee the validity and semantics of aggregates?

Conclusions

- Duplicate-insensitive sketches fundamentally change how aggregation works.
  - Routing becomes logically decoupled.
  - Arbitrarily complex aggregation scenarios are allowable: cyclic topologies, multiple sinks, etc.
- Extended list of non-trivial aggregates.
  - We added SUM, MEAN, VARIANCE, STDEV, ...
- Resulting system performs better.
  - Moderate (tunable) cost for a large reliability boost.

Thank you!

More questions?

Multipath Routing

Braided paths:

[Figure: a field of sensor nodes between a source and a sink, highlighting two "braided" paths from the source to the sink that differ in at least two nodes.]