Static-Neighbor-Graph-Based Prediction, presented by Yftah Ziser, January 2015


Static-Neighbor-Graph-based prediction

Presented by Yftah Ziser, January 2015

Description

• The Static-Neighbor-Graph (SNG) method predicts the primary users' spectrum utilization by constructing an empirical probabilistic graph of primary user mobility.

How to build the graph?

• 1) If the SU observes a PU movement from point i to point j:
    1.1) If the edge (i, j) does not exist:
        1.1.1) Add edge (i, j) with weight 1.
    1.2) Else:
        1.2.1) Add 1 to the weight of edge (i, j).
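As a sketch, the update rule above could be implemented as follows (the function name `add_observation` and the dict-of-dicts graph representation are illustrative assumptions, not from the slides):

```python
from collections import defaultdict

def add_observation(graph, i, j):
    """Record an observed PU movement from location i to location j.

    graph maps a source vertex to a dict {destination: edge weight}.
    If edge (i, j) does not exist it is created with weight 1;
    otherwise its weight is incremented by 1.
    """
    if j not in graph[i]:
        graph[i][j] = 1    # 1.1.1) new edge with weight 1
    else:
        graph[i][j] += 1   # 1.2.1) increment existing edge weight

graph = defaultdict(dict)
for src, dst in [(25, 12), (12, 34), (34, 12), (25, 12)]:
    add_observation(graph, src, dst)

print(graph[25][12])  # edge (25, 12) observed twice -> weight 2
```

Both training and prediction touch only one vertex's adjacency dict per step, which is where the low time and space complexity claimed later comes from.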

Simple Example

[Figure series: a directed graph over the locations 12, 34, and 25, built incrementally as the SU observes PU transitions such as 25→12, 12→34, 34→12, and 34→25; each observed transition adds 1 to the weight of the corresponding directed edge.]

How to use the graph for prediction?

• Assuming that the current location of the PU is represented by vertex i, our prediction for the next location is the vertex j such that edge (i, j) has the maximum weight.
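A minimal sketch of this prediction rule, including the uniform random tie-breaking shown in the conflict example on the slides (`predict_next` is an illustrative name):

```python
import random

def predict_next(graph, current):
    """Predict the PU's next location: the neighbor j such that
    edge (current, j) has maximum weight; ties are broken uniformly
    at random, matching Rand({...}) in the conflict example."""
    neighbors = graph.get(current, {})
    if not neighbors:
        return None  # no movements observed from this vertex yet
    best = max(neighbors.values())
    candidates = [j for j, w in neighbors.items() if w == best]
    return random.choice(candidates)

graph = {12: {25: 2, 34: 1}}
print(predict_next(graph, 12))  # -> 25 (edge weight 2 beats 1)
```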

Simple Example

[Figure: with the PU currently at location 12, the outgoing edge with the maximum weight leads to 25, so 25 is predicted as the next location.]

Simple Example – with a conflict

[Figure: from location 12 the edges to 25 and to 34 have equal weight, so the prediction is drawn uniformly at random: Rand({25, 34}).]

Reduction to our problem

• What we have: an algorithm for predicting the next PU location.

• What we want: an algorithm for predicting the spectrum holes.

The reduction is quite simple.

The reduction

• Assuming we know all the primary users' locations, we can determine directly which of the spectrum bands are idle.

• We would also like to be able to predict that a station stays on the same frequency for the next few intervals. For this purpose the algorithm allows self-loops.
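A minimal sketch of the reduction, assuming a hypothetical mapping from each PU location to the spectrum band occupied there (`band_of` and `idle_bands` are illustrative names, not from the slides):

```python
def idle_bands(all_bands, predicted_locations, band_of):
    """Given the predicted PU locations, mark the bands those PUs
    occupy and return the remaining (idle) bands: the spectrum holes."""
    occupied = {band_of[loc] for loc in predicted_locations}
    return all_bands - occupied

all_bands = {1, 2, 3, 4}
band_of = {12: 1, 34: 3}   # hypothetical location -> band mapping
print(idle_bands(all_bands, [12, 34], band_of))  # -> {2, 4}
```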

Self-loops

• In order to differentiate the SNG algorithm from Hold as much as possible, the weights of self-loop edges are scaled down by a factor (0.1 in our case).
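A sketch of how the self-loop factor might enter the prediction step, assuming the 0.1 factor is applied when comparing edge weights (the names here are illustrative):

```python
import random

SELF_LOOP_FACTOR = 0.1  # factor stated on the slides

def predict_with_self_loops(graph, current, factor=SELF_LOOP_FACTOR):
    """Max-weight prediction where self-loop weights are scaled down
    by `factor`, so SNG only predicts "stay" when the self-loop
    clearly dominates, keeping its behavior distinct from Hold."""
    scored = {j: (w * factor if j == current else w)
              for j, w in graph.get(current, {}).items()}
    if not scored:
        return None
    best = max(scored.values())
    return random.choice([j for j, w in scored.items() if w == best])

graph = {12: {12: 5, 25: 2}}   # raw self-loop weight 5 vs. edge 2
print(predict_with_self_loops(graph, 12))  # 5*0.1 = 0.5 < 2 -> 25
```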

Pros

• The time and space complexity are very low (for both prediction and training).

• Works with a large number of data representations (including all the representations we introduced in the seminar).

Cons

• The prediction for the next step and for N steps ahead are the same.

• Works well on absolute patterns but is very inaccurate on relative ones.

Results

• In the following tables we present the relative error of the frequency prediction for each number of time intervals ahead (the "n" row), i.e., the relative frequency error.

Results

• The NW rows mean that for interval i the window size used is 10·i. The BW rows mean that the window size is the one that minimizes the sum of all the intervals' errors (for the SNG algorithm).

Results

Station 101

n        10      9       8       7       6       5       4       3       2       1
SNG NW   0       0       0       0       0       0       0       0       0       0
SNG BW   0       0       0       0       0       0       0       0       0       0
Hold     0       0       0       0       0       0       0       0       0       0

The best window is 10.

Results

Station 201

n        10      9       8       7       6       5       4       3       2       1
SNG NW   0.2824  0.2251  0.2481  0.2376  0.2247  0.3035  0.2435  0.2860  0.1764  0.3189
SNG BW   0.1764  0.1764  0.1764  0.1764  0.1764  0.1764  0.1764  0.1764  0.1764  0.1764
Hold     0.1764  0.1764  0.1764  0.1764  0.1764  0.1764  0.1764  0.1764  0.1764  0.1764

The best window is 20.

Results

Station 301

n        10      9       8       7       6       5       4       3       2       1
SNG NW   0.1313  0.1212  0.1303  0.1128  0.1093  0.1480  0.5565  0.2332  0.2822  0.2826
SNG BW   0.1149  0.1290  0.1203  0.1203  0.1093  0.1163  0.1239  0.1172  0.1060  0.1066
Hold     0.1364  0.1337  0.1407  0.1407  0.1250  0.126   0.1311  0.1340  0.1454  0.1350

The best window is 60.


Results

Station 401

n        10      9       8       7       6       5       4       3       2       1
SNG NW   0.3991  0.4547  0.3787  0.2786  0.3227  0.2644  0.1567  0.1209  0.1213  0.1124
SNG BW   0.1238  0.1341  0.1530  0.1522  0.1624  0.1296  0.1272  0.1209  0.1167  0.1011
Hold     0.1027  0.1079  0.1277  0.1269  0.1451  0.1061  0.1071  0.0994  0.0953  0.0882

The best window is 30.

Results Analysis

• In "constant wave" stations such as "101" we can clearly see that both algorithms, SNG and Hold, predict the frequency perfectly.

• In "frequency hop" stations such as "201", SNG and Hold perform practically the same.

• When the station's behavior is less Hold-like, we can see an improvement (301).

Alternative results

• For this section we allow the SNG algorithm to accumulate knowledge (reported as A-SNG in the tables below).

Alternative results

Station 101

n        10      9       8       7       6       5       4       3       2       1
SNG NW   0       0       0       0       0       0       0       0       0       0
SNG BW   0       0       0       0       0       0       0       0       0       0
Hold     0       0       0       0       0       0       0       0       0       0

The best window is 10.

Alternative results

Station 201

n        10      9       8       7       6       5       4       3       2       1
SNG NW   0.2824  0.2251  0.2481  0.2376  0.2247  0.3035  0.2435  0.2860  0.1764  0.3189
SNG BW   0.1764  0.1764  0.1764  0.1764  0.1764  0.1764  0.1764  0.1764  0.1764  0.1764
Hold     0.1764  0.1764  0.1764  0.1764  0.1764  0.1764  0.1764  0.1764  0.1764  0.1764

The best window is 20.

Alternative results

Station 301

n         10      9       8       7       6       5       4       3       2       1
SNG NW    0.1313  0.1212  0.1303  0.1128  0.1093  0.1480  0.5565  0.2332  0.2822  0.2826
SNG BW    0.1149  0.1290  0.1203  0.1203  0.1093  0.1163  0.1239  0.1172  0.1060  0.1066
A-SNG BW  0.0590  0.0589  0.0781  0.1025  0.1581  0.1693  0.1347  0.1139  0.0654  0.0226

The best window is 60.

Alternative results

Station 401

n         10      9       8       7       6       5       4       3       2       1
SNG NW    0.3991  0.4547  0.3787  0.2786  0.3227  0.2644  0.1567  0.1209  0.1213  0.1124
SNG BW    0.1238  0.1341  0.1530  0.1522  0.1624  0.1296  0.1272  0.1209  0.1167  0.1011
A-SNG BW  0.1921  0.1845  0.1742  0.1720  0.1854  0.1405  0.1345  0.1246  0.1170  0.1105

The best window is 30.

Future thoughts

• Predict frequency and time, addressing the limitation that the prediction for the next step and for N steps ahead are the same.

• Predict a probabilistic spectrum.

Simple Example – with a conflict

[Figure: from location 12 the edges to 25 and to 34 have equal weight; instead of Rand({25, 34}), the probabilistic variant predicts 50% 25, 50% 34.]

Any questions?
