Summary: BAYESian (Multi-) Sensor Tracking

Sensor Data Fusion - Methods and Applications, 3rd Lecture on April 24, 2019



Page 5: Summary: BAYESian (Multi-) Sensor Tracking

• Basis: In the course of time, one or several sensors produce measurements of targets of interest. Each target is characterized by its current state vector, which is expected to change with time.

• Objective: Learn as much as possible about the individual target states at each time by analyzing the 'time series' constituted by the sensor data.

• Problem: Sensor information is imperfect: inaccurate, incomplete, and possibly ambiguous. Moreover, the targets' temporal evolution is usually not well known.

• Approach: Interpret measurements and state vectors as random variables (RVs). Describe what is known about them by probability density functions (pdfs).

• Solution: Derive iteration formulae for calculating the pdfs! Develop a mechanism for initiation! In doing so, exploit all available background information! Derive state estimates from the pdfs along with appropriate quality measures!

Page 6: Bayesian Multiple Sensor Tracking: Basic Idea

Iterative updating of conditional probability densities!

kinematic target state x_k at time t_k, accumulated multiple sensor data Z^k

a priori knowledge: target dynamics models, sensor model, other context

• prediction:   p(x_{k-1}|Z^{k-1})  --dynamics model, context-->  p(x_k|Z^{k-1})

• filtering:    p(x_k|Z^{k-1})  --sensor data Z_k, sensor model-->  p(x_k|Z^k)

• retrodiction: p(x_{l-1}|Z^k)  <--filtering output, dynamics model--  p(x_l|Z^k)

− finite mixtures: inherent ambiguity (data, model, road network)
− optimal estimators: e.g. minimum mean squared error (MMSE)
− initiation of pdf iteration: multiple hypothesis track extraction

Page 7: Recapitulation: The Multivariate GAUSSian Pdf

− wanted: probabilities 'concentrated' around a center x̄

− quadratic distance: q(x) = ½ (x − x̄)^T P^{-1} (x − x̄)

  q(x) = const defines an ellipsoid around x̄, its volume and orientation being determined by a matrix P (symmetric: P^T = P, positive definite: all eigenvalues > 0).

− first attempt: p(x) = e^{−q(x)} / ∫dx e^{−q(x)} (normalized!)

  p(x) = N(x; x̄, P) = |2πP|^{-1/2} e^{−½ (x − x̄)^T P^{-1} (x − x̄)}

− GAUSSian mixtures: p(x) = Σ_i p_i N(x; x̄_i, P_i) (weighted sums)

Page 8: Very First Look at an Important Data Fusion Algorithm

Kalman filter: x_k = (r_k^T, ṙ_k^T)^T, Z^k = {z_k, Z^{k-1}}

initiation: p(x_0) = N(x_0; x_{0|0}, P_{0|0}), initial ignorance: P_{0|0} 'large'

prediction: N(x_{k-1}; x_{k-1|k-1}, P_{k-1|k-1})  --dynamics model F_{k|k-1}, D_{k|k-1}-->  N(x_k; x_{k|k-1}, P_{k|k-1})

  x_{k|k-1} = F_{k|k-1} x_{k-1|k-1}
  P_{k|k-1} = F_{k|k-1} P_{k-1|k-1} F_{k|k-1}^T + D_{k|k-1}

filtering: N(x_k; x_{k|k-1}, P_{k|k-1})  --current measurement z_k, sensor model H_k, R_k-->  N(x_k; x_{k|k}, P_{k|k})

  x_{k|k} = x_{k|k-1} + W_{k|k-1} ν_{k|k-1},   ν_{k|k-1} = z_k − H_k x_{k|k-1}
  P_{k|k} = P_{k|k-1} − W_{k|k-1} S_{k|k-1} W_{k|k-1}^T,   S_{k|k-1} = H_k P_{k|k-1} H_k^T + R_k
  W_{k|k-1} = P_{k|k-1} H_k^T S_{k|k-1}^{-1}   'KALMAN gain matrix'

A deeper look into the dynamics and sensor models is necessary!
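The prediction and filtering formulas above translate directly into a few lines of linear algebra. A minimal sketch in Python/NumPy, where the matrices F, D, H, R and all numerical values are illustrative assumptions, not taken from the slides:

```python
import numpy as np

def kf_predict(x, P, F, D):
    """Prediction step: propagate the estimate through the dynamics model."""
    x_pred = F @ x
    P_pred = F @ P @ F.T + D
    return x_pred, P_pred

def kf_filter(x_pred, P_pred, z, H, R):
    """Filtering step: update the predicted estimate with measurement z."""
    nu = z - H @ x_pred                      # innovation
    S = H @ P_pred @ H.T + R                 # innovation covariance
    W = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain matrix
    x_filt = x_pred + W @ nu
    P_filt = P_pred - W @ S @ W.T
    return x_filt, P_filt

# toy 1D constant-velocity example (illustrative values)
dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])
D = 0.01 * np.eye(2)
H = np.array([[1.0, 0.0]])                   # only position is measured
R = np.array([[0.5]])

x, P = np.zeros(2), 100.0 * np.eye(2)        # initial ignorance: P 'large'
x, P = kf_predict(x, P, F, D)
x, P = kf_filter(x, P, np.array([1.2]), H, R)
```

With a large initial covariance, a single measurement dominates: the filtered position lands close to z and its variance drops below the measurement variance R.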


Page 12: How to deal with probability density functions?

• pdf p(x): Extract probability statements about the RV x by integration!

• naively: positive and normalized functions (p(x) ≥ 0, ∫dx p(x) = 1)

• conditional pdf p(x|y) = p(x, y)/p(y): impact of information on y on the RV x?

• marginal density p(x) = ∫dy p(x, y) = ∫dy p(x|y) p(y): enter y!

• Bayes: p(x|y) = p(y|x) p(x)/p(y) = p(y|x) p(x) / ∫dx p(y|x) p(x): p(x|y) ← p(y|x), p(x)!

• certain knowledge on x: p(x) = δ(x − y) '=' lim_{σ→0} (1/(√(2π) σ)) e^{−(x−y)²/(2σ²)}

• transformed RV y = t[x]: p(y) = ∫dx p(y, x) = ∫dx p(y|x) p_x(x) = ∫dx δ(y − t[x]) p_x(x) =: [T p_x](y)  (T: p_x ↦ p, "transfer operator")


Page 15: A Useful Product Formula for GAUSSians

N(z; Hx, R) N(x; y, P) = N(z; Hy, S) N(x; y + Wν, P − WSW^T),

where the first factor N(z; Hy, S) is independent of x and

ν = z − Hy,   S = HPH^T + R,   W = PH^T S^{-1}.
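The product formula is an exact identity, so it can be checked numerically. A scalar sketch (all numerical values are illustrative; in 1D the matrices H, R, P reduce to numbers h, r, p):

```python
import math

def gauss(x, mean, var):
    """Scalar normal density N(x; mean, var)."""
    return math.exp(-0.5 * (x - mean) ** 2 / var) / math.sqrt(2 * math.pi * var)

# scalar instance of the formula (illustrative values)
h, r, y, p = 2.0, 0.5, 1.0, 4.0
z, x = 3.0, 1.4

s = h * p * h + r          # innovation covariance S = HPH^T + R
w = p * h / s              # gain W = PH^T S^{-1}
nu = z - h * y             # innovation nu = z - Hy

lhs = gauss(z, h * x, r) * gauss(x, y, p)
rhs = gauss(z, h * y, s) * gauss(x, y + w * nu, p - w * s * w)
print(abs(lhs - rhs))      # zero up to rounding
```

Both sides agree to machine precision for any choice of the parameters, which is exactly what makes the formula so useful in the Kalman filter derivation.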

Page 16: Affine Transforms of GAUSSian Random Variables

N(x; x̄, P)  --y = t + Tx-->  N(y; t + Tx̄, TPT^T)

p(y) = ∫dx p(x, y) = ∫dx p(y|x) p(x) = ∫dx δ(y − t − Tx) p(x)

A possible representation: δ(y − t − Tx) = N(y; t + Tx, R) with R → O!

p(y) = ∫dx N(y − t; Tx, R) N(x; x̄, P) for R → 0
     = N(y; t + Tx̄, TPT^T + R) for R → 0   (product formula!)

Also true if dim(x) ≠ dim(y)!
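The transform rule, including the case dim(x) ≠ dim(y), can be checked by Monte Carlo sampling. A sketch with illustrative values for x̄, P, t, T:

```python
import numpy as np

rng = np.random.default_rng(0)

# illustrative Gaussian N(x; x_bar, P) and affine map y = t + T x
x_bar = np.array([1.0, -2.0])
P = np.array([[2.0, 0.3], [0.3, 1.0]])
t = np.array([0.5, 0.0, 1.0])
T = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])   # dim(y) = 3, dim(x) = 2

samples = rng.multivariate_normal(x_bar, P, size=200_000)
y = samples @ T.T + t

# empirical moments should match t + T x_bar and T P T^T
mean_err = np.max(np.abs(y.mean(axis=0) - (t + T @ x_bar)))
cov_err = np.max(np.abs(np.cov(y.T) - T @ P @ T.T))
print(mean_err, cov_err)   # both small
```

Note that TPT^T is singular here (a 3×3 matrix of rank 2), so the transformed density is degenerate, yet the moment identities still hold.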


Page 18:

Interpret unknown object states as random variables, x [1D] or x [vector variate], characterized by corresponding probability density functions (pdf).

The concrete shape of the pdf p(x) contains the full knowledge on x!

Page 19:

Information on a random variable (RV) can be extracted by integration from the corresponding pdf!

at present: the one-dimensional case:

How probable is it that x ∈ (a, b) ⊆ R holds?

Answer: P{x ∈ (a, b)} = ∫_a^b dx p(x)  ⇒  p(x) ≥ 0

in particular: P{x ∈ R} = ∫_{−∞}^{∞} dx p(x) = 1  (normalization)

intuitive interpretation: "the object is somewhere in R"

loosely: p(x) dx is the probability of x having a value between x and x + dx


Page 22: How to characterize the properties of a pdf?

specifically: How to associate a single "expected" value to a RV?

The maximum of the pdf is sometimes but not always useful! (→ examples)

instead: Calculate the centroid of the pdf!

E[x] = ∫_{−∞}^{∞} dx x p(x) = x̄   "expectation value"

more generally: Consider functions g: x ↦ g(x) of the RV x!

E[g(x)] = ∫_{−∞}^{∞} dx g(x) p(x)   "expectation value of the observable g"

Example: Consider the observable ½ m x² (kinetic energy, x = speed)

Page 23: An important observable: the "error" of an estimate

• Quality: How useful is an expectation value x̄ = E[x]?

Consider special observables as distance measures:

g(x) = |x − x̄|  or  g(x) = (x − x̄)²

quadratic measures: computationally more convenient!

'expected error' of the expectation value x̄:

V[x] = E[(x − x̄)²],   σ_x = √V[x]   (variance, standard deviation)

Exercise 3.2: Show that V[x] = E[x²] − E[x]² holds.

The expectation value of the observable x² is also called the "2nd moment" of the pdf of x.

Page 24: Exercise 3.3

Calculate expectation and variance of the uniform density of a RV x ∈ R on the interval [a, b] (x: random variable; a, b: parameters):

p(x) = U(x; a, b) = { 1/(b − a)  for x ∈ [a, b]
                    { 0          otherwise

Pdf correctly normalized?  ∫_{−∞}^{∞} dx U(x; a, b) = 1/(b − a) ∫_a^b dx = 1

E[x] = ∫_{−∞}^{∞} dx x U(x; a, b) = (b + a)/2

V[x] = 1/(b − a) ∫_a^b dx x² − E[x]² = (b − a)²/12
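The two moments of the uniform density are easy to confirm by sampling. A sketch with illustrative interval bounds a, b:

```python
import numpy as np

# check E[x] = (a + b)/2 and V[x] = (b - a)^2 / 12 for U(x; a, b)
a, b = 2.0, 7.0
rng = np.random.default_rng(1)
x = rng.uniform(a, b, size=1_000_000)

print(x.mean())   # close to (2 + 7)/2 = 4.5
print(x.var())    # close to 25/12, about 2.083
```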

Page 25: Important example: x normally distributed over R (GAUSS)

− wanted: probabilities concentrated around µ

− quadratic distance: ||x − µ||² = ½ (x − µ)²/σ²   (mathematically convenient!)

− the parameter σ is a measure of the "width" of the pdf: ||x − µ||² = ½ at x = µ ± σ

− for 'large' distances, i.e. ||x − µ||² ≫ ½, the pdf shall decay quickly.

− simplest approach: p̃(x) = e^{−||x−µ||²}   (> 0 ∀ x ∈ R, normalization?)

− normalized by p(x) = p̃(x)/∫_{−∞}^{∞} dx p̃(x). A formula collection delivers: ∫_{−∞}^{∞} dx p̃(x) = √(2π) σ

An admissible pdf with the required properties is obviously given by:

N(x; µ, σ) = (1/(√(2π) σ)) exp(−(x − µ)²/(2σ²))

Page 26: Exercise 3.4

Show for the GAUSSian density p(x) = N(x; µ, σ):

E[x] = µ,   V[x] = σ²

E[x] = ∫_{−∞}^{∞} dx x N(x; µ, σ) = µ

V[x] = E[x²] − E[x]² = σ²

Use substitution and partial integration! Use ∫_{−∞}^{∞} dx e^{−x²/2} = √(2π)!
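Before doing the integrals by hand, the claimed moments can be sanity-checked by simple numerical integration of the density. A sketch with illustrative values for µ and σ, using the midpoint rule on a wide grid:

```python
import math

# numerically check E[x] = mu and V[x] = sigma^2 for N(x; mu, sigma)
mu, sigma = 1.5, 0.7

def gauss(x):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (math.sqrt(2 * math.pi) * sigma)

# midpoint integration over [mu - 10 sigma, mu + 10 sigma]
n, lo, hi = 20_000, mu - 10 * sigma, mu + 10 * sigma
dx = (hi - lo) / n
xs = [lo + (i + 0.5) * dx for i in range(n)]

norm = sum(gauss(x) for x in xs) * dx                     # should be ~1
mean = sum(x * gauss(x) for x in xs) * dx                 # should be ~mu
var = sum(x * x * gauss(x) for x in xs) * dx - mean ** 2  # should be ~sigma^2
print(norm, mean, var)
```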

Page 27: Create your own sensor simulator!

Exercise 3.1: Simulate normally distributed (radar) measurements!

Measurement interval: ΔT = 5 s, sensor position: r_s
State at time t_k = kΔT, k ∈ Z: x_k = (r_k^T, ṙ_k^T, r̈_k^T)^T

Use standard random number generators such as normrnd(0,1)!

With v_k = (normrnd(0,1), normrnd(0,1))^T we have p(v_k) = N(v_k; o, I);
for u_k = σ v_k we therefore have p(u_k) = N(u_k; o, σ²I).


Page 29: Create your own sensor simulator!

Exercise 3.1: Simulate normally distributed (radar) measurements!

Measurement interval: ΔT = 5 s, sensor position: r_s
State at time t_k = kΔT, k ∈ Z: x_k = (r_k^T, ṙ_k^T, r̈_k^T)^T

1. Simulate measurements of the Cartesian position components of the target state x_k:

z_k^c = (z_k^x, z_k^y)^T = H x_k + u_k = (I O O)(r_k^T, ṙ_k^T, r̈_k^T)^T + σ_c (normrnd(0,1), normrnd(0,1))^T

with a random number generator normrnd(0,1) producing normally distributed zero-mean and unit-variance random numbers, and σ_c = 50 m denoting the standard deviation of the sensor measurement errors. The sensor position r_s has no impact.

2. Simulate range/azimuth measurements of the target position r_k w.r.t. the sensor position r_s = (x_s, y_s)^T:

z_k^p = (z_k^r, z_k^φ)^T = ( √((x_k − x_s)² + (y_k − y_s)²),  arctan((y_k − y_s)/(x_k − x_s)) )^T + (σ_r normrnd(0,1), σ_φ normrnd(0,1))^T

with σ_r = 20 m, σ_φ = 0.2° denoting the standard deviations in range and azimuth.

3. Plot the Cartesian and polar measurements z_k^r (cos z_k^φ, sin z_k^φ)^T + r_s over the true target trajectory! Play with sensor positions and measurement error standard deviations!
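Steps 1 and 2 of the exercise can be sketched in Python/NumPy as follows. The straight-line trajectory, sensor position, and seed are illustrative assumptions; NumPy's standard_normal plays the role of normrnd(0,1), and arctan2 is used instead of arctan so the azimuth is resolved to the correct quadrant:

```python
import numpy as np

rng = np.random.default_rng(42)

# illustrative truth: a target on a straight line, sampled every dT = 5 s
dT, n_steps = 5.0, 20
r_s = np.array([-1000.0, 500.0])                 # sensor position (assumed)
pos0, vel = np.array([2000.0, 3000.0]), np.array([-15.0, 5.0])
truth = np.array([pos0 + k * dT * vel for k in range(n_steps)])

sigma_c = 50.0                                   # Cartesian error std [m]
sigma_r, sigma_phi = 20.0, np.deg2rad(0.2)       # range / azimuth error stds

# 1. Cartesian position measurements: z = r_k + noise
z_cart = truth + sigma_c * rng.standard_normal(truth.shape)

# 2. range / azimuth measurements w.r.t. the sensor position r_s
diff = truth - r_s
z_range = np.hypot(diff[:, 0], diff[:, 1]) + sigma_r * rng.standard_normal(n_steps)
z_phi = np.arctan2(diff[:, 1], diff[:, 0]) + sigma_phi * rng.standard_normal(n_steps)

# 3. convert the polar measurements back to Cartesian for plotting
z_polar_cart = r_s + np.column_stack((z_range * np.cos(z_phi),
                                      z_range * np.sin(z_phi)))
```

Plotting z_cart and z_polar_cart over the truth (step 3) shows the characteristic banana-shaped scatter of polar measurements at long range.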

Page 30: Exercise 3.5

Read (if you like) the Wikipedia article on John von Neumann's (1903-1957) algorithm for "rejection sampling":
http://en.wikipedia.org/wiki/Rejection_sampling

Generate (if you like) random numbers z_n with p(z_n) = N(z_n; 0, 1) from random numbers z_u with p(z_u) = U(z_u; 0, 1).

Read the Wikipedia article on the great mathematician, physicist, and computer pioneer John von Neumann:
http://en.wikipedia.org/wiki/John_von_Neumann

Read (if you like) the beautiful book: George Dyson (2012). Turing's Cathedral: The Origins of the Digital Universe.
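A minimal rejection-sampling sketch for the exercise: draw N(0,1) samples using only uniform random numbers. The truncation of the proposal to [-6, 6] is an assumption for simplicity (the Gaussian mass outside is negligible); M bounds the ratio of target to proposal density:

```python
import math
import random

random.seed(0)

def sample_normal_rejection():
    """Draw one N(0,1) sample from uniforms via von Neumann rejection."""
    lo, hi = -6.0, 6.0                        # truncated proposal support (assumed)
    q = 1.0 / (hi - lo)                       # proposal density U(lo, hi)
    m = (1.0 / math.sqrt(2 * math.pi)) / q    # envelope constant M >= p(x)/q(x)
    while True:
        x = lo + (hi - lo) * random.random()  # candidate from the proposal
        p = math.exp(-0.5 * x * x) / math.sqrt(2 * math.pi)
        if random.random() <= p / (m * q):    # accept with probability p/(M q)
            return x

samples = [sample_normal_rejection() for _ in range(50_000)]
mean = sum(samples) / len(samples)
var = sum(s * s for s in samples) / len(samples) - mean ** 2
print(mean, var)   # near 0 and 1
```

The acceptance rate is 1/M, here about 21%, which illustrates why rejection sampling becomes inefficient for poorly matched proposals.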

Page 31: Generalization to multiple random variables!

Vector states: x = (x_1, ..., x_{n-1}, x_n)^T, e.g. x = (r^T, ṙ^T)^T, x = (x_1^T, x_2^T)^T

Volume integral: P{x ∈ V} = ∫_V dx_1 ... dx_n p(x_1, ..., x_n)

vector variate or scalar expectation values: E[g(x)] = ∫dx g(x) p(x)

Independence: statements about x not influenced by y → p(x, y) = p(x) p(y)!

P{x ∈ X, y ∈ Y} = ∫_X dx_1 ... dx_n p(x) · ∫_Y dy_1 ... dy_m p(y)

Page 32: Some properties of joint densities

Non-negative: p(x, y) ≥ 0

Normalized: ∫dx dy p(x, y) = 1

Relation between p(x), p(y), and p(x, y):

p(x) = ∫dy p(x, y),   p(y) = ∫dx p(x, y)

p(x) is also called a marginal density of the joint density w.r.t. x.


Page 35: How does knowledge on a RV y affect knowledge on a RV x?

No impact if x, y are independent of each other → p(x|y) = p(x).

Feeling: p(x, y) and p(y) should enter into the definition of p(x|y).

A first attempt: p(x|y) = p(x, y)/p(y)

• ∫dx p(x|y) = (1/p(y)) ∫dx p(x, y) = p(y)/p(y) = 1  →  normalized!

• x, y mutually independent: p(x|y) = p(x, y)/p(y) = p(x) p(y)/p(y) = p(x)

p(x|y) ≥ 0 is obviously interpretable as a useful pdf that quantitatively describes the notions of statistical "dependency" and "independency":

conditional probability density function: p(x|y)


Page 38: How to calculate conditional pdfs? Use BAYES' rule!

Because of: p(x|y) p(y) = p(x, y) = p(y, x) = p(y|x) p(x)

we have in particular: p(x|y) = p(y|x) p(x) / p(y)

We can also write: p(y) = ∫dx p(y, x)  (marginal pdf)  = ∫dx p(y|x) p(x)  (definition of the conditional pdf)

and thus obtain:

p(x|y) = p(y|x) p(x) / ∫dx p(y|x) p(x)

• Whoever knows p(y|x) and p(x) can calculate how knowledge on y affects knowledge on x.

• Large parts of statistics are just an application of BAYES' rule. (Rev. Thomas BAYES, 18th century; fully understood by LAPLACE)
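Bayes' rule is easiest to see with discrete states. A tiny sketch, where the prior p(x) and the sensor model p(y|x) are purely illustrative numbers:

```python
# infer a state x ("near"/"far") from an observation y = "detected"
p_x = {"near": 0.3, "far": 0.7}           # prior p(x)
p_y_given_x = {"near": 0.9, "far": 0.2}   # sensor model p(y = "detected" | x)

# marginalization: p(y) = sum_x p(y|x) p(x)
p_y = sum(p_y_given_x[x] * p_x[x] for x in p_x)

# Bayes: p(x|y) = p(y|x) p(x) / p(y)
posterior = {x: p_y_given_x[x] * p_x[x] / p_y for x in p_x}
print(posterior)   # "near" becomes more probable after a detection
```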

Page 39: Google Rev. Thomas BAYES (1701-1761), perhaps starting with
http://en.wikipedia.org/wiki/Thomas_Bayes


Page 42: More Precise Formulation of the BAYESian Approach

Consider a set of measurements Z_l = {z_l^j}_{j=1}^{m_l} of a single or multiple target state x_l at time instants t_l, l = 1, ..., k, and the time series

Z^k = {Z_k, m_k, Z_{k-1}, m_{k-1}, ..., Z_1, m_1} = {Z_k, m_k, Z^{k-1}}.

Based on Z^k, what can be learned about the object states x_l at t_1, ..., t_k, t_{k+1}, ..., i.e. for the past, present, and future?

Evidently the answer is given by calculating the pdf p(x_l|Z^k)!

multiple sensor measurement fusion: Calculate p(x|Z^k_1, ..., Z^k_N)!

• communication lines   • common coordinate system: sensor registration


Page 44: How to calculate the pdf p(x_l|Z^k)?

Consider at first the present time: l = k.

an observation:

BAYES' rule: p(x_k|Z^k) = p(x_k|Z_k, m_k, Z^{k-1})

= p(Z_k, m_k|x_k, Z^{k-1}) p(x_k|Z^{k-1}) / ∫dx_k p(Z_k, m_k|x_k, Z^{k-1}) p(x_k|Z^{k-1})

(in numerator and denominator: likelihood function × prediction)

• p(x_k|Z^{k-1}) is a prediction of the target state at time t_k based on all measurements in the past.

• p(Z_k, m_k|x_k) ∝ ℓ(x_k; Z_k, m_k) describes what the current sensor output Z_k, m_k can say about the current target state x_k, and is called the likelihood function.

Page 45: Summary: BAYESian (Multi-) Sensor Tracking · Sensor Data Fusion - Methods and Applications, 3rd Lecture on April 24, 2019 — slide 1. Summary: BAYESian (Multi-) Sensor Tracking

• p(x_k|Z^{k−1}) is a prediction for time t_k based on all measurements in the past:

p(x_k|Z^{k−1}) = ∫dx_{k−1} p(x_k, x_{k−1}|Z^{k−1})                      (marginal pdf)

               = ∫dx_{k−1} p(x_k|x_{k−1}, Z^{k−1}) p(x_{k−1}|Z^{k−1})   (notion of a conditional pdf)

Here p(x_k|x_{k−1}, Z^{k−1}) is the object dynamics, while the factor p(x_{k−1}|Z^{k−1}) suggests the idea of an iteration!

sometimes: p(x_k|x_{k−1}) = N(x_k; F_{k|k−1} x_{k−1}, D_{k|k−1})   (linear GAUSS-MARKOV: F_{k|k−1} deterministic, D_{k|k−1} random)

• p(Z_k, m_k|x_k) ∝ ℓ(x_k; Z_k, m_k) describes what the current sensor output Z_k, m_k can say about the current target state x_k and is called the likelihood function.

sometimes: ℓ(x_k; z_k) = N(z_k; H_k x_k, R_k)   (1 target, 1 measurement)

iteration formula:

p(x_k|Z^k) = ℓ(x_k; z_k) ∫dx_{k−1} p(x_k|x_{k−1}) p(x_{k−1}|Z^{k−1}) / ∫dx_k ℓ(x_k; z_k) ∫dx_{k−1} p(x_k|x_{k−1}) p(x_{k−1}|Z^{k−1})
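The iteration formula can be sketched for a discretized scalar state (an illustrative toy setup with assumed numbers, not lecture material): the prediction integral becomes a matrix-vector product with the transition density, and filtering multiplies by the likelihood and renormalizes:

```python
import numpy as np

# Grid for a scalar state (hypothetical toy example).
x = np.linspace(-10.0, 10.0, 201)
dx = x[1] - x[0]

def gauss(u, mean, std):
    return np.exp(-0.5 * ((u - mean) / std) ** 2) / (std * np.sqrt(2 * np.pi))

# Transition density p(x_k|x_{k-1}): random walk with std 0.5 (assumed).
K = gauss(x[:, None], x[None, :], 0.5)

# Initial density p(x_0|Z^0) (assumed).
p = gauss(x, 0.0, 2.0)

for z in [1.0, 1.5, 2.2]:  # simulated measurements (assumed)
    # prediction: p(x_k|Z^{k-1}) = int dx_{k-1} p(x_k|x_{k-1}) p(x_{k-1}|Z^{k-1})
    p_pred = K @ p * dx
    # filtering: multiply by likelihood l(x_k; z_k) = N(z_k; x_k, R) and normalize
    p = gauss(z, x, 1.0) * p_pred
    p /= p.sum() * dx

mean = (x * p).sum() * dx  # posterior mean after three updates
print(round(mean, 2))
```

For linear-Gaussian models this grid recursion reproduces the Kalman filter result up to discretization error, which is why the closed-form filter on the later slides is preferred in practice.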


A Popular Model for Object Evolutions

Piecewise Constant White Acceleration Model

Consider state vectors x_k = (r_k^T, ṙ_k^T)^T (position, velocity).

For known x_{k−1} and without external influences we have, with ΔT_k = t_k − t_{k−1}:

x_k = ( I  ΔT_k I ) ( r_{k−1} )
      ( O     I   ) ( ṙ_{k−1} )   =: F_{k|k−1} x_{k−1},   see blackboard!

Assume during the interval ΔT_k a constant acceleration a_k causing the state evolution

( ½ΔT_k² I ) a_k =: G_k a_k,   a linear transform!
(   ΔT_k I )

Let a_k be a Gaussian random variable with pdf p(a_k) = N(a_k; o, Σ_k² I); we therefore have:

p(G_k a_k) = N(G_k a_k; o, Σ_k² G_k G_k^T).


Therefore: p(x_k|x_{k−1}) = N(x_k; F_{k|k−1} x_{k−1}, D_{k|k−1}) with

F_{k|k−1} = ( I  ΔT_k I ),   D_{k|k−1} = Σ_k² ( ¼ΔT_k⁴ I   ½ΔT_k³ I )
            ( O     I   )                      ( ½ΔT_k³ I    ΔT_k² I )
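A quick numerical check (an illustration with assumed values, not lecture material) confirms that Σ_k² G_k G_k^T, with G_k = (½ΔT_k² I, ΔT_k I)^T from the white-acceleration model, reproduces exactly this block matrix:

```python
import numpy as np

dT, sigma = 2.0, 1.5   # assumed values for Delta T_k and Sigma_k
I = np.eye(2)          # two-dimensional position space (assumption)

# G_k from the white-acceleration model: (1/2 dT^2 I, dT I)^T
G = np.vstack([0.5 * dT**2 * I, dT * I])

# D_{k|k-1} = Sigma_k^2 G_k G_k^T
D = sigma**2 * G @ G.T

# Closed-form block matrix from the slide.
D_ref = sigma**2 * np.block([
    [0.25 * dT**4 * I, 0.5 * dT**3 * I],
    [0.5 * dT**3 * I,  dT**2 * I],
])

print(np.allclose(D, D_ref))  # True
```

Note that D_{k|k−1} is singular by construction (rank of G_k), which is harmless in the prediction step since it is only added to a full-rank covariance.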



Exercise 3.6  Consider x_k = (r_k^T, ṙ_k^T, r̈_k^T)^T (position, velocity, acceleration).

Show that F_{k|k−1} and D_{k|k−1} = Σ_k² G_k G_k^T (constant acceleration rates) are given by:

F_{k|k−1} = ( I  ΔT_k I  ½ΔT_k² I ),   D_{k|k−1} = Σ_k² ( ¼ΔT_k⁴ I   ½ΔT_k³ I   ½ΔT_k² I )
            ( O     I      ΔT_k I )                      ( ½ΔT_k³ I    ΔT_k² I     ΔT_k I )
            ( O     O         I   )                      ( ½ΔT_k² I     ΔT_k I        I   )

with ΔT_k = t_k − t_{k−1}. Reasonable choice: ½ q_max ≤ Σ_k ≤ q_max.
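The covariance claimed in Exercise 3.6 can likewise be checked numerically (a sketch with assumed values; the Kronecker product builds the block matrices from their scalar patterns):

```python
import numpy as np

dT, sigma, d = 0.5, 2.0, 3   # assumed scan interval, noise level, spatial dimension
I = np.eye(d)

# G_k for a piecewise constant acceleration rate: (1/2 dT^2, dT, 1)^T per axis.
G = np.kron(np.array([[0.5 * dT**2], [dT], [1.0]]), I)

# D_{k|k-1} = Sigma_k^2 G_k G_k^T
D = sigma**2 * G @ G.T

# Block matrix claimed in Exercise 3.6.
D_ref = sigma**2 * np.kron(np.array([
    [0.25 * dT**4, 0.5 * dT**3, 0.5 * dT**2],
    [0.5 * dT**3,  dT**2,       dT],
    [0.5 * dT**2,  dT,          1.0],
]), I)

print(np.allclose(D, D_ref))  # True
```

This check verifies the covariance only; the transition matrix F_{k|k−1} follows from integrating the kinematics twice, as in the two-block case.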


A More Insightful Look at a Data Fusion Algorithm

KALMAN filter: x_k = (r_k^T, ṙ_k^T)^T,   Z^k = {z_k, Z^{k−1}}

initiation: p(x_0) = N(x_0; x_{0|0}, P_{0|0}),   initial ignorance: P_{0|0} 'large'

prediction (dynamics model: F_{k|k−1}, D_{k|k−1}): N(x_{k−1}; x_{k−1|k−1}, P_{k−1|k−1}) → N(x_k; x_{k|k−1}, P_{k|k−1})

  x_{k|k−1} = F_{k|k−1} x_{k−1|k−1}
  P_{k|k−1} = F_{k|k−1} P_{k−1|k−1} F_{k|k−1}^T + D_{k|k−1}

filtering (current measurement z_k, sensor model: H_k, R_k): N(x_k; x_{k|k−1}, P_{k|k−1}) → N(x_k; x_{k|k}, P_{k|k})

  x_{k|k} = x_{k|k−1} + W_{k|k−1} ν_{k|k−1},   ν_{k|k−1} = z_k − H_k x_{k|k−1}
  P_{k|k} = P_{k|k−1} − W_{k|k−1} S_{k|k−1} W_{k|k−1}^T,   S_{k|k−1} = H_k P_{k|k−1} H_k^T + R_k
  W_{k|k−1} = P_{k|k−1} H_k^T S_{k|k−1}^{−1}   ('KALMAN gain matrix')

A deeper look into the dynamics and sensor models is necessary!
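The prediction and filtering formulas above map one-to-one onto code. A minimal sketch (all numerical values assumed for illustration; one spatial dimension, position-only measurements):

```python
import numpy as np

def kalman_step(x, P, z, F, D, H, R):
    """One prediction + filtering cycle of the Kalman filter."""
    # prediction: x_{k|k-1}, P_{k|k-1}
    x_pred = F @ x
    P_pred = F @ P @ F.T + D
    # filtering: innovation nu, innovation covariance S, gain W
    nu = z - H @ x_pred
    S = H @ P_pred @ H.T + R
    W = P_pred @ H.T @ np.linalg.inv(S)
    x_filt = x_pred + W @ nu
    P_filt = P_pred - W @ S @ W.T
    return x_filt, P_filt

# Assumed example: 1-D position/velocity, dT = 1, white-acceleration noise.
dT, sigma2 = 1.0, 0.1
F = np.array([[1.0, dT], [0.0, 1.0]])
G = np.array([[0.5 * dT**2], [dT]])
D = sigma2 * G @ G.T
H = np.array([[1.0, 0.0]])      # position-only sensor
R = np.array([[0.5]])

x, P = np.zeros(2), 100.0 * np.eye(2)   # initial ignorance: P 'large'
for z in [0.9, 2.1, 3.0, 4.2]:          # simulated position measurements
    x, P = kalman_step(x, P, np.array([z]), F, D, H, R)

print(np.round(x, 2))  # filtered position and velocity estimate
```

Because the initial covariance is 'large', the first update essentially adopts the first measurement; the velocity component only becomes observable after the second measurement, exactly as the prediction/filtering structure suggests.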


Definitions: “Sensor Data and Information Fusion”

Llinas (2001). “Information fusion is an Information Process dealing with the association, correlation, and combination of data and information from single and multiple sensors or sources to achieve refined estimates of parameters, characteristics, events, and behaviors for observed entities in an observed field of view. It is sometimes implemented as a Fully Automatic process or as a Human-Aiding process for Analysis and/or Decision Support.” [1]

— — — — — — — — — — — — — — — — — — — — — — — — — — — — — — — —

JDL (1987). Data fusion is “a process dealing with the association, correlation, and combination of data and information from single and multiple sources to achieve refined position and identity estimates, and complete and timely assessments of situations and threats, and their significance. The process is characterized by continuous refinements of its estimates and assessments, and the evaluation of the need for additional sources, or modification of the process itself, to achieve improved results.” [2]

Hugh Durrant-Whyte (1988). “The basic problem in multi-sensor systems is to integrate a sequence of observations from a number of different sensors into a single best-estimate of the state of the environment.” [3]

Llinas (1988). “Fusion can be defined as a process of integrating information from multiple sources to produce the most specific and comprehensive unified data about an entity, activity or event. This definition has some key operative words: specific, comprehensive, and entity. From an information-theoretic point of view, fusion, to be effective as an information processing function, must (at least ideally) increase the specificity and comprehensiveness of the understanding we have about a battlefield entity or else there would be no purpose in performing the function.” [4]


Richardson and Marsh (1988). “Data fusion is the process by which data from a multitude of sensors is used to yield an optimal estimate of a specified state vector pertaining to the observed system.” [5]

McKendall and Mintz (1988). “...the problem of sensor fusion is the problem of combining multiple measurements from sensors into a single measurement of the sensed object or attribute, called the parameter.” [6]

Waltz and Llinas (1990). “This field of technology has been appropriately termed data fusion because the objective of its processes is to combine elements of raw data from different sources into a single set of meaningful information that is of greater benefit than the sum of the contributing parts. As a technology, data fusion is actually the integration and application of many traditional disciplines and new areas of engineering to achieve the fusion of data.” [7]

Luo and Kay (1992). “Multisensor fusion, ..., refers to any stage in an integration process where there is an actual combination (or fusion) of different sources of sensory information into one representational format.” [8]

Abidi and Gonzalez (1992). “Data fusion deals with the synergistic combination of information made available by various knowledge sources such as sensors, in order to provide a better understanding of a given scene.” [9]

Hall (1992). “Multisensor data fusion seeks to combine data from multiple sensors to perform inferences that may not be possible from a single sensor alone.” [10]

DSTO (1994). Data fusion is “a multilevel, multifaceted process dealing with the automatic detection, association, correlation, estimation, and combination of data and information from single and multiple sources.” [11]


Malhotra (1995). “The process of sensor fusion involves gathering sensory data, refining and interpreting it, and making new sensor allocation decisions.” [12]

Hall and Llinas (1997). “Data fusion techniques combine data from multiple sensors, and related information from associated databases, to achieve improved accuracy and more specific inferences than could be achieved by the use of a single sensor alone.” [13]

Goodman, Mahler and Nguyen (1997). Data fusion is to “locate and identify many unknown objects of many different types on the basis of different kinds of evidence. This evidence is collected on an ongoing basis by many possibly allocatable sensors having varying capabilities and to analyze the results in such a way as to supply local and over-all assessments of the significance of a scenario and to determine proper responses based on those assessments.” [14]

Paradis, Chalmers, Carling and Bergeron (1997). “Data fusion is fundamentally a process designed to manage (i.e., organize, combine and interpret) data and information, obtained from a variety of sources, that may be required at any time by operators or commanders for decision making. ... data fusion is an adaptive information process that continuously transforms available data and information into richer information, through continuous refinement of hypotheses or inferences about real-world events, to achieve refined (potentially optimal) kinematics and identity estimates of individual objects, and complete and timely assessments of current and potential future situations and threats (i.e., contextual reasoning), and their significance in the context of operational settings.” [15]

Starr and Desforges (1998). “Data fusion is a process that combines data and knowledge from different sources with the aim of maximising the useful information content, for improved reliability or discriminant capability, while minimising the quantity of data ultimately retained.” [16]


Wald (1998). “Data fusion is a formal framework in which are expressed means and tools for the alliance of data of the same scene originating from different sources. It aims at obtaining information of greater quality; the exact definition of greater quality will depend upon the application.” [17]

Evans (1998). “The combining of data from different complementary sources (usually geodemographic and lifestyle or market research and lifestyle) to build a picture of someone’s life”. [18]

Wald (1999). “Data fusion is a formal framework in which are expressed the means and tools for the alliance of data originating from different sources.” [19]

Steinberg, Bowman and White (1999). “Data fusion is the process of combining data to refine state estimates and predictions.” [20]

Gonsalves, Cunningham, Ton and Okon (2000). “The overall goal of data fusion is to combine data from multiple sources into information that has greater benefit than what would have been derived from each of the contributing parts.” [21]

Hannah, Ball and Starr (2000). “Fusion is defined materially as a process of blending, usually with the application of heat to melt constituents together (OED), but in data processing the more abstract form of union or blending together is meant. The ’heat’ is applied with a series of algorithms which, depending on the technique used, give a more or less abstract relationship between the constituents and the finished output.” [22]

Dasarathy (2001). “Information fusion encompasses the theory, techniques, and tools conceived and employed for exploiting the synergy in the information acquired from multiple sources (sensor, databases, information gathered by humans etc.) such that the resulting decision or action is in some sense better (qualitatively and quantitatively, in terms of accuracy, robustness and etc.) than would be possible, if these sources were used individually without such synergy exploitation.” [23]


Bloch and Hunter et al. (2001). “...fusion consists in conjoining or merging information that stems from several sources and exploiting that conjoined or merged information in various tasks such as answering questions, making decisions, numerical estimation, etc.” [24]

McGirr (2001). “The process of bringing large amounts of dissimilar information together into a more comprehensive and easily manageable form is known as data fusion.” [25]

Bell, Santos and Brown (2002). “Sophisticated information fusion capabilities are required in order to transform what the agents gather from a raw form to an integrated, consistent and complete form. Information fusion can occur at multiple levels of abstraction.” [26]

Challa, Gulrez, Chaczko and Paranesha (2005). Multi-sensor data fusion “is a core component of all networked sensing systems, which is used either to: - join/combine complementary information produced by sensors to obtain a more complete picture, or - reduce/manage uncertainty by using sensor information from multiple sources.” [27]

Jalobeanu and Gutiérrez (2006). “The data fusion problem can be stated as the computation of the posterior pdf [probability distribution function] of the unknown single object given all observations.” [28]

Mastrogiovanni et al. (2007). “The aim of a data fusion process is to maximize the useful information content acquired by heterogeneous sources in order to infer relevant situations and events related to the observed environment.” [29]

Wikipedia (2007). “Information Integration is a field of study known by various terms: Information Fusion, Deduplication, Referential Integrity and so on. It refers to the field of study of techniques attempting to merge information from disparate sources despite differing conceptual, contextual and typographical representations. This is used in data mining and consolidation of data from semi- or unstructured resources.” [30]


Wikipedia (2007). “Sensor fusion is the combining of sensory data or data derived from sensory data from disparate sources such that the resulting information is in some sense better than would be possible when these sources were used individually. The term better in that case can mean more accurate, more complete, or more dependable, or refer to the result of an emerging view, such as stereoscopic vision (calculation of depth information by combining two-dimensional images from two cameras at slightly different viewpoints). The data sources for a fusion process are not specified to originate from identical sensors. One can distinguish direct fusion, indirect fusion and fusion of the outputs of the former two. Direct fusion is the fusion of sensor data from a set of heterogeneous or homogeneous sensors, soft sensors, and history values of sensor data, while indirect fusion uses information sources like a priori knowledge about the environment and human input. Sensor fusion is also known as (multi-sensor) data fusion and is a subset of information fusion.” [31]

MSN Encarta (2007). “Data integration: the integration of data and knowledge collected from disparate sources by different methods into a consistent, accurate, and useful whole.” [32]

[1] D. L. Hall and J. Llinas (eds.), Handbook of Multisensor Data Fusion. CRC Press, USA, 2001.

[2] F. E. White, Jr., Data Fusion Lexicon. Joint Directors of Laboratories, Technical Panel for C3, Data Fusion Sub-Panel, Naval Ocean Systems Center, San Diego, 1987.

[3] H. F. Durrant-Whyte, Integration, Coordination and Control of Multi-sensor Robot Systems. Kluwer Academic Publishers, 1988.

[4] J. Llinas, Toward the Utilization of Certain Elements of AI Technology for Multi Sensor Data Fusion. In: C. J. Harris (ed.), Application of Artificial Intelligence to Command and Control Systems, Peter Peregrinus Ltd, 1988.

[5] J. M. Richardson and K. A. Marsh, Fusion of Multisensor Data. The International Journal of Robotics Research, Vol. 7, No. 6, pp. 78-96, 1988.

[6] R. McKendall and M. Mintz, Robust Fusion of Location Information. IEEE International Conference on Robotics and Automation, Philadelphia, United States, pp. 1239-1244, April 1988.


[7] E. L. Waltz and J. Llinas, Multisensor Data Fusion. Artech House, Norwood, MA, USA, 1990.

[8] R. C. Luo and M. G. Kay, Data Fusion and Sensor Integration: State-of-the-art 1990s. In: Data Fusion in Robotics and Machine Intelligence, Academic Press, San Diego, 1992.

[9] M. A. Abidi and R. C. Gonzalez, Data Fusion in Robotics and Machine Intelligence. Academic Press, San Diego, 1992.

[10] D. L. Hall, Mathematical Techniques in Multisensor Data Fusion. Artech House, 1992.

[11] DSTO (Defence Science and Technology Organisation) Data Fusion Special Interest Group, Data Fusion Lexicon. Department of Defence, Australia, 7 p., 21 September 1994.

[12] R. Malhotra, Temporal Considerations in Sensor Management. In: Proceedings of the IEEE National Aerospace and Electronics Conference, NAECON, 1995.

[13] D. L. Hall and J. Llinas, An Introduction to Multisensor Data Fusion. In: Proceedings of the IEEE, Vol. 85, Issue 1, pp. 6-23, January 1997.

[14] I. R. Goodman, R. P. Mahler and H. T. Nguyen, Mathematics of Data Fusion. Kluwer Academic Publishers, 1997.

[15] S. Paradis, B. A. Chalmers, R. Carling and P. Bergeron, Towards a Generic Model for Situation and Threat Assessment. SPIE Vol. 3080, 1997.

[16] Starr and M. Desforges, Strategies in Data Fusion - Sorting Through the Tool Box. In: Proceedings of the European Conference on Data Fusion, 1998.

[17] L. Wald, A European Proposal for Terms of Reference in Data Fusion. In: International Archives of Photogrammetry and Remote Sensing, Vol. XXXII, Part 7, pp. 651-654, 1998.

[18] M. Evans, From 1086 to 1984: Direct Marketing into the Millennium. Marketing Intelligence and Planning, 16(1), pp. 56-67, 1998.

[19] L. Wald, Some Terms of Reference in Data Fusion. IEEE Transactions on Geoscience and Remote Sensing, 37(3), pp. 1190-1193, 1999.

[20] A. N. Steinberg, C. L. Bowman and F. E. White, Revisions to the JDL Data Fusion Model. In: Proceedings of SPIE, Sensor Fusion: Architectures, Algorithms, and Applications III, pp. 430-41, 1999.


[21] P. G. Gonsalves, R. Cunningham, N. Ton and D. Okon, Intelligent Threat Assessment Processor (ITAP) Using Genetic Algorithms and Fuzzy Logic. In: Proceedings of the International Conference on Information Fusion, 2000.

[22] P. Hannah, A. Ball and A. Starr, Decisions in Condition Monitoring - An Exemplar for Data Fusion Architecture. In: Proceedings of the International Conference on Information Fusion, 2000.

[23] B. V. Dasarathy, Information Fusion - What, Where, Why, When, and How? Information Fusion 2, pp. 75-76, 2001.

[24] I. Bloch and A. Hunter (eds.), A. Appriou, A. Ayoun, S. Benferhat, P. Besnard, L. Cholvy, R. Cooke, F. Cuppens, D. Dubois, H. Fargier, M. Grabisch, R. Kruse, J. Lang, S. Moral, H. Prade, A. Saffiotti, P. Smets, C. Sossai, Fusion: General Concepts and Characteristics. International Journal of Intelligent Systems, 16, pp. 1107-1134, 2001.

[25] S. C. McGirr, Resources for the Design of Data Fusion Systems. In: Proceedings of the International Conference on Information Fusion, 2001.

[26] B. Bell, E. Santos and S. M. Brown, Making Adversary Decision Modeling Tractable with Intent Inference and Information Fusion. In: Proceedings of the 11th Conference on Computer Generated Forces and Behavioural Representation, 2002.

[27] S. Challa, T. Gulrez, Z. Chaczko and T. N. Paranesha, Opportunistic Information Fusion: A New Paradigm for Next Generation Networked Sensing Systems. In: Proceedings of the International Conference on Information Fusion, 2005.

[28] A. Jalobeanu and J. A. Gutiérrez, Multisource Data Fusion for Bandlimited Signals: A Bayesian Perspective. In: Proceedings of the 25th Workshop on Bayesian Inference and Maximum Entropy Methods (MaxEnt'06), Paris, France, August 2006.

[29] F. Mastrogiovanni, A. Sgorbissa and R. Zaccaria, A Distributed Architecture for Symbolic Data Fusion. In: IJCAI-07, pp. 2153-2158, 2007.

[30] Wikipedia, Information Fusion. URL: http://en.wikipedia.org/wiki/Information_Fusion [accessed February 13, 2007].

[31] Wikipedia, Sensor Fusion. URL: http://en.wikipedia.org/wiki/Sensor_fusion [accessed February 13, 2007].

[32] MSN Encarta, Data Fusion Definition. URL: http://encarta.msn.com/dictionary_701705479/data_fusion.html [accessed February 21, 2007].