
Random signals

Jan Černocký

UPGM FIT VUT Brno, [email protected]

Random signals

• Deterministic signals (those that can be represented by an equation) have one substantial drawback: they carry very little information (a cosine, for example, is fully described by its amplitude, frequency and phase shift).

• Real-world signals are hard to describe deterministically (for example, the physical model of speech production is very complex, and simplified anyway).

⇒ In signal theory, we will therefore treat these useful signals as random signals (processes): for example speech, the circulation of letters in offices of the Czech Post, the CZK/EUR exchange rate, ...

According to the character of the time axis, random signals are divided into continuous-time random signals (the time is defined for all t) and discrete-time random signals (defined only for discrete n).

These signals cannot be described at every time point (in that case they would be deterministic); we will rather look for characteristic properties of random signals such as the mean value, the probability density function, etc.

Definition of random process

• Continuous time: the system ξt of random variables defined for all t ∈ ℝ is called a random process; it is denoted ξ(t).

• Discrete time: the system ξn of random variables defined for all n ∈ ℕ is called a random process; it is denoted ξ[n].

Set of realizations of a random process

The set of realizations contains infinitely many possible "runs" of the random process (its realizations). We will limit ourselves to a finite number Ω of them and denote each realization ξω(t) or ξω[n]. When parameters are estimated on this set, we speak about ensemble estimates.

Example: the random signal is a recording of water flowing through the water pipe in my flat. 1068 realizations, each 20 ms long, were recorded. To demonstrate continuous-time random signals we will view this set as ξω(t); for discrete-time random signals, as ξω[n].
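The original water recordings are not available here, so as a minimal sketch (in Python/NumPy; all names — xi, Omega, N, fs — are illustrative assumptions) we can stand in a synthetic ensemble stored as an Ω × N matrix, one realization per row. Later sketches reuse this layout:

```python
import numpy as np

# Hypothetical stand-in for the water-pipe ensemble: the real 1068 recordings
# are not available, so we synthesize correlated noise instead.
rng = np.random.default_rng(0)
Omega, N, fs = 1068, 320, 16000        # realizations, samples each, assumed sample rate
                                       # (20 ms at 16 kHz -> 320 samples)
white = rng.standard_normal((Omega, N))
kernel = np.ones(5) / 5                # moving average gives neighboring samples
                                       # some correlation, like a real signal
xi = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="same"), 1, white)

print(xi.shape)                        # (1068, 320); xi[omega, n] = realization omega at time n
```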

[Figure: four realizations ξω(t) for ω = 1, 200, 500, 1000; time axis 0 to 0.018 s, amplitudes roughly −0.2 to 0.2.]

[Figure: four realizations ξω[n] for ω = 1, 200, 500, 1000; sample axis 0 to 300, amplitudes roughly −0.2 to 0.2.]

Distribution function

The distribution function is defined for one random variable; the random process at a given time t or n is such a random variable. Definition:

$$F(x,t) = P\{\xi(t) < x\}, \qquad F(x,n) = P\{\xi[n] < x\},$$

where P{ξ(t) < x} (or P{ξ[n] < x}) is the probability that the random variable at the given time is smaller than x. Note that x is nothing random; it is an auxiliary variable.

Ensemble estimation of the distribution function: we fix the given time t or n and take Ω realizations. For a given x we estimate

$$F(x,t) = \frac{1}{\Omega}\sum_{\omega=1}^{\Omega}\mathbf{1}[\xi_\omega(t) < x], \qquad F(x,n) = \frac{1}{\Omega}\sum_{\omega=1}^{\Omega}\mathbf{1}[\xi_\omega[n] < x],$$

where 1[·] equals 1 if the condition holds and 0 otherwise.
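A minimal sketch of this ensemble estimate, run on a synthetic Ω × N matrix of realizations (the data and all names are stand-ins, as above):

```python
import numpy as np

rng = np.random.default_rng(0)
xi = 0.13 * rng.standard_normal((1068, 320))     # synthetic stand-in ensemble

def F_est(x, n, xi):
    """Ensemble estimate of F(x, n): the fraction of realizations whose
    value at time n lies below x."""
    return np.mean(xi[:, n] < x)

xs = np.linspace(-0.5, 0.5, 101)
F = np.array([F_est(x, 100, xi) for x in xs])    # estimated F(x, n=100)
print(F[0], F[50], F[-1])                        # ~0, ~0.5, ~1
```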

[Figure: estimated distribution functions F(x, 0.1 ms), F(x, 3.1 ms), F(x, 6.3 ms), F(x, 9.4 ms) plotted against x from −0.3 to 0.5; all rise from 0 to 1.]

[Figure: estimated distribution functions F(x, 1), F(x, 50), F(x, 100), F(x, 150) plotted against x from −0.3 to 0.5.]

Probability density function

The probability density function is again defined for one random variable (the random process at a given time t or n is such a random variable). Definition:

$$p(x,t) = \frac{\partial F(x,t)}{\partial x}, \qquad p(x,n) = \frac{\partial F(x,n)}{\partial x}$$

Ensemble estimation of the probability density function: it can be obtained by numerical differentiation of the estimated F(x, t) or F(x, n), or it can be estimated by a histogram:

• choose the given t or n.

• choose L values of x from xmin to xmax, with regular step ∆ = (xmax − xmin)/(L − 1):

$$x_1 = x_{min},\ x_2 = x_{min} + \Delta,\ x_3 = x_{min} + 2\Delta,\ \ldots,\ x_{L-1} = x_{min} + (L-2)\Delta,\ x_L = x_{min} + (L-1)\Delta = x_{max}$$

In this way we obtain L "cages" (bins) of width ∆; for xi, the cage chi spans from xi − ∆/2 to xi + ∆/2. The left edge of the left-most cage (1) is stretched to −∞, the right edge of the right-most cage (L) to +∞.

• the estimate for xi is computed as

$$p(x_i,t) = \frac{1}{\Omega\Delta}\sum_{\omega=1}^{\Omega}\mathbf{1}[\xi_\omega(t) \in ch_i], \qquad p(x_i,n) = \frac{1}{\Omega\Delta}\sum_{\omega=1}^{\Omega}\mathbf{1}[\xi_\omega[n] \in ch_i]$$
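A sketch of this histogram estimate following the recipe above; the bin layout, limits and names are assumptions, and the stretching of the outer cages to ±∞ is done with infinite bin edges:

```python
import numpy as np

rng = np.random.default_rng(0)
xi = 0.13 * rng.standard_normal((1068, 320))      # synthetic stand-in ensemble

def pdf_est(n, xi, L=31, xmin=-0.5, xmax=0.5):
    """Histogram estimate of p(x, n): L cages of width Delta; the outer
    cages are stretched to -inf and +inf as described above."""
    Omega = xi.shape[0]
    delta = (xmax - xmin) / (L - 1)
    centers = xmin + delta * np.arange(L)
    edges = np.concatenate(([-np.inf], centers[:-1] + delta / 2, [np.inf]))
    counts, _ = np.histogram(xi[:, n], bins=edges)
    return centers, counts / (Omega * delta)       # count / (Omega * Delta)

centers, p = pdf_est(100, xi)
delta = centers[1] - centers[0]
print(np.sum(p) * delta)                           # integrates to 1 by construction
```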

[Figure: histogram estimates p(x, 0.1 ms), p(x, 3.1 ms), p(x, 6.3 ms), p(x, 9.4 ms) against x from −0.2 to 0.6; peak values around 3 to 3.5.]

[Figure: histogram estimates p(x, 1), p(x, 50), p(x, 100), p(x, 150) against x from −0.2 to 0.6.]

F(x, t), p(x, t) and probabilities

Our task is to determine the probability that the value of the random process at time t or n lies in the interval [a, b]. We can estimate this in two ways (shown only for continuous time):

• from the distribution function, keeping in mind its definition F(x, t) = P{ξ(t) < x}:

$$P\{a < \xi(t) < b\} = F(b,t) - F(a,t)$$

• from the probability density function; densities can be integrated:

$$P\{a < \xi(t) < b\} = \int_a^b p(x,t)\,dx$$
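Both routes can be checked numerically on the synthetic ensemble (a sketch; a, b, n are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)
xi = 0.13 * rng.standard_normal((1068, 320))     # synthetic stand-in ensemble
a, b, n = -0.1, 0.1, 100

# via the distribution function: F(b, n) - F(a, n)
p_from_F = np.mean(xi[:, n] < b) - np.mean(xi[:, n] < a)

# directly counting realizations that fall inside (a, b)
p_direct = np.mean((xi[:, n] > a) & (xi[:, n] < b))

print(p_from_F, p_direct)                        # the two estimates coincide
```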

Properties of the distribution function and the probability density function:

• the values of a random process will hardly be smaller than −∞, therefore F(−∞, t) = P{ξ(t) < −∞} = 0.

• the values of a random process will all be smaller than +∞, therefore F(+∞, t) = P{ξ(t) < +∞} = 1.

• the probability density function is the derivative of the distribution function; the inverse relation is given by an integral:

$$F(x,t) = \int_{-\infty}^{x} p(g,t)\,dg$$

• as F(+∞, t) = 1, p(x, t) must obey

$$\int_{-\infty}^{+\infty} p(x,t)\,dx = 1$$

• the value of the probability density function for a given x is not a probability!

Illustration of F(x, t), p(x, t) – a beer keg ...

A beer keg is being drunk from x = 18.00 till 22.00. We define the function p(x) as the instantaneous beer consumption (the drinking function) and F(x) as the amount of beer drunk so far (attention: x is time in this example).

The behavior is similar to that of the PDF and the distribution function; just replace 1 with "1 keg":

• F(x) is zero at time −∞ (in fact it is zero till 18.00), as no beer had been drunk yet.

• F(x) is 1 keg at time +∞ (actually already at 22.00), as all the beer was drunk and there will be no more.

• the amount of beer drunk by time x: F(x) = ∫_{−∞}^{x} p(g) dg.

• the total amount of beer drunk: F(+∞) = ∫_{−∞}^{+∞} p(x) dx = 1 keg.

• the amount of beer drunk from time x1 till x2 can be computed as the difference of two points on F(x) or by integration of p(x).

• a single value of p(x) (for example p(19.00)) can NOT be called "an amount of drunk beer": no one can drink anything in an infinitely short time.

Moments

In contrast to functions, moments are values characterizing the random process at time t or n.

The mean value (or "expectation") is the 1st moment:

$$a(t) = E\{\xi(t)\} = \int_{-\infty}^{+\infty} x\,p(x,t)\,dx, \qquad a[n] = E\{\xi[n]\} = \int_{-\infty}^{+\infty} x\,p(x,n)\,dx$$

Ensemble estimate of the mean value: for each time t or n, the estimate is simply the average of the samples over all realizations:

$$a(t) = \frac{1}{\Omega}\sum_{\omega=1}^{\Omega}\xi_\omega(t), \qquad a[n] = \frac{1}{\Omega}\sum_{\omega=1}^{\Omega}\xi_\omega[n]$$
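In NumPy the ensemble mean is just an average over the realization axis; a sketch on the synthetic ensemble:

```python
import numpy as np

rng = np.random.default_rng(0)
xi = 0.13 * rng.standard_normal((1068, 320))    # synthetic stand-in ensemble

a = xi.mean(axis=0)     # average over the Omega realizations -> one value per time n
print(a.shape)          # (320,)
print(a[100])           # close to 0: the synthetic ensemble has zero mean
```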

For our signals:

[Figure: ensemble mean a(t) for t = 0 ... 0.018 s and a[n] for n = 0 ... 300; the values are on the order of 10⁻³, fluctuating between about −15·10⁻³ and 5·10⁻³.]

Variance (dispersion), standard deviation

$$D(t) = E\{[\xi(t) - a(t)]^2\} = \int_{-\infty}^{+\infty} [x - a(t)]^2\,p(x,t)\,dx$$

$$D[n] = E\{[\xi[n] - a[n]]^2\} = \int_{-\infty}^{+\infty} [x - a[n]]^2\,p(x,n)\,dx$$

The standard deviation (std) is the square root of the variance:

$$\sigma(t) = \sqrt{D(t)}, \qquad \sigma[n] = \sqrt{D[n]}$$

Ensemble estimate of variance and std: for each t or n,

$$D(t) = \frac{1}{\Omega}\sum_{\omega=1}^{\Omega}[\xi_\omega(t) - a(t)]^2, \quad \sigma(t) = \sqrt{D(t)}, \qquad D[n] = \frac{1}{\Omega}\sum_{\omega=1}^{\Omega}[\xi_\omega[n] - a[n]]^2, \quad \sigma[n] = \sqrt{D[n]}$$
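The corresponding sketch for variance and std, again on the synthetic ensemble:

```python
import numpy as np

rng = np.random.default_rng(0)
xi = 0.13 * rng.standard_normal((1068, 320))    # synthetic stand-in ensemble

a = xi.mean(axis=0)                   # ensemble mean per time n
D = np.mean((xi - a) ** 2, axis=0)    # ensemble variance per time n
sigma = np.sqrt(D)                    # standard deviation

print(sigma[100])                     # close to 0.13, the std used to generate xi
```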

For our signals:

[Figure: ensemble standard deviation σ(t) for t = 0 ... 0.018 s and σ[n] for n = 0 ... 300; values between about 0.125 and 0.14.]

Correlation function

The correlation function quantifies the similarity between the values of the random process at times t1 and t2 (or n1 and n2):

$$R(t_1,t_2) = \int_{-\infty}^{+\infty}\int_{-\infty}^{+\infty} x_1 x_2\,p(x_1,x_2,t_1,t_2)\,dx_1\,dx_2,$$

$$R(n_1,n_2) = \int_{-\infty}^{+\infty}\int_{-\infty}^{+\infty} x_1 x_2\,p(x_1,x_2,n_1,n_2)\,dx_1\,dx_2,$$

where p(x1, x2, t1, t2), resp. p(x1, x2, n1, n2), is the two-dimensional probability density function between times t1 and t2, resp. n1 and n2. Theoretically, it can be computed from the two-dimensional distribution function

$$F(x_1,x_2,t_1,t_2) = P\{\xi(t_1) < x_1 \text{ and } \xi(t_2) < x_2\}, \qquad F(x_1,x_2,n_1,n_2) = P\{\xi[n_1] < x_1 \text{ and } \xi[n_2] < x_2\}$$

by differentiating along x1 and x2:

$$p(x_1,x_2,t_1,t_2) = \frac{\partial^2 F(x_1,x_2,t_1,t_2)}{\partial x_1\,\partial x_2}, \qquad p(x_1,x_2,n_1,n_2) = \frac{\partial^2 F(x_1,x_2,n_1,n_2)}{\partial x_1\,\partial x_2}$$

We will rather be interested in the ensemble estimation using a 2D histogram:

• similarly to the 1D histogram, define "cages", this time in the form of squares: cage chij spans from xi − ∆/2 to xi + ∆/2 in the first dimension and from xj − ∆/2 to xj + ∆/2 in the second dimension.

• the value of the 2D histogram for cage chij with values xi and xj is

$$p(x_i,x_j,t_1,t_2) = \frac{1}{\Omega\Delta^2}\sum_{\omega=1}^{\Omega}\mathbf{1}[(\xi_\omega(t_1),\xi_\omega(t_2)) \in ch_{ij}], \qquad p(x_i,x_j,n_1,n_2) = \frac{1}{\Omega\Delta^2}\sum_{\omega=1}^{\Omega}\mathbf{1}[(\xi_\omega[n_1],\xi_\omega[n_2]) \in ch_{ij}]$$
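A sketch of R(n1, n2) obtained, as on the slides, by numeric integration of the 2D histogram. The bin layout, limits and the synthetic data are assumptions, so the numbers will not match the water example; a direct ensemble average is printed alongside as a cross-check:

```python
import numpy as np

rng = np.random.default_rng(0)
white = rng.standard_normal((1068, 320))
kernel = np.ones(5) / 5               # moving average -> correlated neighboring samples
xi = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="same"), 1, white)

def R_est(n1, n2, xi, L=41, lim=1.5):
    """R(n1, n2) via numeric integration of the 2D histogram estimate of
    p(x1, x2, n1, n2): sum of x1 * x2 * p * Delta^2 over all cages.
    lim is chosen wide enough to cover (nearly) all samples."""
    Omega = xi.shape[0]
    delta = 2 * lim / (L - 1)
    edges = np.linspace(-lim - delta / 2, lim + delta / 2, L + 1)
    centers = (edges[:-1] + edges[1:]) / 2
    H, _, _ = np.histogram2d(xi[:, n1], xi[:, n2], bins=[edges, edges])
    p2 = H / (Omega * delta**2)                      # 2D PDF estimate
    X1, X2 = np.meshgrid(centers, centers, indexing="ij")
    return np.sum(X1 * X2 * p2) * delta**2

for n2 in (0, 1, 5, 11):
    print(n2, R_est(0, n2, xi), np.mean(xi[:, 0] * xi[:, n2]))  # histogram vs direct
```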

For our signals (only discrete time): n1 = 0, n2 = 0, 1, 5, 11.

[Figure: 2D histograms of (ξω[n1], ξω[n2]) for n1 = 0 and n2 = 0, 1, 5, 11; axes x1 and x2 from −0.4 to 0.4, counts shown as color scales.]

For n1 = 0 and n2 = 0, 1, 5, 11, the following coefficients were obtained after numeric integration:

• R(0, 0) = 0.0188: the process is very similar to itself (it is the same point).

• R(0, 1) = 0.0151: shifted to the neighboring sample, the process is still very similar to time n1 = 0.

• R(0, 5) = 0.0030: times n1 = 0 and n2 = 5 are not similar at all.

• R(0, 11) = −0.0133: at times n1 = 0 and n2 = 11 the process is similar to itself, but with inverted sign! It is probable that if the value ξ[n1] is positive, ξ[n2] will be negative, and vice versa.

Correlation function for n1 = 0, n2 = n1 + k, k = 0 ... 40, compared with n1 = 100, n2 = n1 + k, k = 0 ... 40:

[Figure: R(0, k) and R(100, 100 + k) for k = 0 ... 40; both curves oscillate between roughly −0.015 and 0.02 and look very much alike.]

STATIONARITY OF RANDOM PROCESS

Simply said, the behavior of a stationary random process does not change with time: its parameters do not depend on the time t or n, and the correlation function does not depend on the precise values of t1, t2 or n1, n2, but only on their difference τ = t2 − t1 or k = n2 − n1. For a stationary continuous-time signal:

F(x, t) → F(x),  p(x, t) → p(x)
a(t) → a,  D(t) → D,  σ(t) → σ
p(x1, x2, t1, t2) → p(x1, x2, τ),  R(t1, t2) → R(τ)

Similarly for discrete time:

F(x, n) → F(x),  p(x, n) → p(x)
a[n] → a,  D[n] → D,  σ[n] → σ
p(x1, x2, n1, n2) → p(x1, x2, k),  R(n1, n2) → R(k)

The "water" example signal was obviously stationary, as:

• the mean value was similar for all times (with more realizations, it would be even more similar);

• ditto for the standard deviation;

• the correlation functions for n1 = 0, n2 = n1 + k and for n1 = 100, n2 = n1 + k were similar.

Stationary vs. non-stationary signal:

[Figure: realizations xk(t) of a stationary and of a non-stationary signal, t from 0 to 3 s.]

ERGODICITY OF RANDOM PROCESS

Simply said, all parameters of an ergodic process can be estimated from a single realization:

Example of a stationary and ergodic random process and of a stationary but non-ergodic one:

[Figure: realizations xk(t) of the two processes, t from 0 to 3 s, amplitudes within ±2.]

All ensemble estimates can then be replaced by temporal estimates; we have at our disposal an interval of length T (continuous time) or N samples (discrete time). The only realization we have will simply be denoted x(t), resp. x[n]:

• histograms can be applied to estimate the distribution function and the PDF.

• mean value, variance, std:

$$a = \frac{1}{T}\int_0^T x(t)\,dt, \qquad D = \frac{1}{T}\int_0^T [x(t) - a]^2\,dt, \qquad \sigma = \sqrt{D}$$

$$a = \frac{1}{N}\sum_{n=0}^{N-1} x[n], \qquad D = \frac{1}{N}\sum_{n=0}^{N-1} [x[n] - a]^2, \qquad \sigma = \sqrt{D}$$

• correlation function:

$$R(\tau) = \frac{1}{T}\int_0^T x(t)\,x(t+\tau)\,dt, \qquad R(k) = \frac{1}{N}\sum_{n=0}^{N-1} x[n]\,x[n+k]$$
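A closing sketch of the discrete-time temporal estimates on a single synthetic realization. The slides leave the edge handling of x[n + k] implicit; here we sum only over the n for which n + k exists (an assumption):

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.convolve(rng.standard_normal(10000), np.ones(5) / 5, mode="same")  # one realization

N = len(x)
a = np.mean(x)                          # temporal mean
D = np.mean((x - a) ** 2)               # temporal variance
sigma = np.sqrt(D)                      # temporal std

def R_temporal(x, k):
    """Temporal estimate R(k) = (1/N) * sum_n x[n] * x[n+k].
    x[n+k] does not exist for the last k samples, so the sum runs over
    n = 0 .. N-1-k (edge handling assumed; the slides leave it implicit)."""
    return np.sum(x[:len(x) - k] * x[k:]) / len(x)

print(a, sigma)
print([round(R_temporal(x, k), 5) for k in (0, 1, 2, 5)])
```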