ICASSP-99 Tutorial, 15 March 1999
The Bootstrap: A Powerful Tool for Statistical Signal Processing with Small Sample Sets
A.M. Zoubir*
Australian Telecommunications Research Institute
School of Electrical & Computer Engineering
Curtin University of Technology
GPO Box U 1987, Perth, WA 6001, Australia
e-mail: zoubir@ieee.org
URL: www.atri.curtin.edu.au/csp
* Formerly with the Queensland University of Technology.

A.M. Zoubir, Curtin University of Technology
Outline
- Introduction
- Principle of the Bootstrap
  - The Non-Parametric Bootstrap
  - Examples
    1. Bias Estimation
    2. Variance Estimation
    3. Confidence Interval Estimation for the Mean
  - The Parametric Bootstrap
  - An Example: Confidence Interval Estimation for the Mean
  - An Example of Bootstrap Failure
Outline (Cont'd)
- Principle of the Bootstrap
  - The Dependent Data Bootstrap
  - An Example: Variance Estimation for Autoregressive Parameter Estimates
- The Principle of Pivoting
  - An Example: Confidence Interval for the Mean
- Variance Stabilisation
  - Examples
    1. Correlation Coefficient
    2. Coherence Gain for Engine Knock Data
Outline (Cont'd)
- Hypothesis Testing with the Bootstrap
  - An Example: Testing the Frequency Response for Zero
- Signal Detection
  - Examples
    1. Bootstrap Matched-Filter
    2. Detection of a Non-Gaussian Signal at Multiple Sensors
- Model Selection
  - Linear Models
Outline (Cont'd)
- Model Selection
  - Non-Linear Models
  - Order Selection in Autoregressive Models
- More Applications in Signal Processing
  - Confidence Intervals for Spectra
  - Bispectrum-Based Gaussianity Tests
  - Noise Floor Estimation in Radar
  - Confidence Intervals for Flight Parameters in Passive Sonar
  - Model Selection of Polynomial Phase Signals
- Summary
- Acknowledgements
Introduction
- Most techniques for computing variances of parameter estimators, or for setting confidence intervals, assume a large sample size [Bhattacharya & Rao (1976)].
- In many signal processing problems large sample methods are inapplicable [Fisher & Hall (1991)].
- The bootstrap was introduced [Efron (1979)] to calculate confidence intervals for parameters when few data are available.
- The bootstrap has been shown to solve many other problems which would be too complicated for traditional statistical analysis [Hall (1992), Efron & Tibshirani (1993), Shao & Tu (1995)].
Introduction (Cont'd)
- The bootstrap does with a computer what the experimenter would do in practice, if it were possible:
  1. The observations are randomly re-assigned, and the estimates re-computed.
  2. These assignments and re-computations are done many times and treated as repeated experiments.
- In an era of exponentially increasing computational power, computer-intensive methods such as the bootstrap are affordable.
Introduction (Cont'd)
Applications of the bootstrap to real-life problems have been reported in (see also the special session at ICASSP-94 [64]):
- radar signal processing [Nagaoka & Amai (1990, 1991)],
- sonar signal processing [Krolik et al. (1991), Böhme & Maiwald (1994), Reid et al. (1996)],
- geophysics [Fisher & Hall (1989, 1990, 1991), Tauxe et al. (1991)],
- biomedical engineering [Haynor & Woods (1989), Banga & Ghorbel (1993)],
- control [Lai & Chen (1995)],
Introduction (Cont'd)
- atmospheric environmental research [Hanna (1989)],
- vibration analysis [Zoubir & Böhme (1991, 1995)],
- power systems [Herman (1996)],
- computer vision [Lange et al. (1998), Kanatani & Ohta (1998)],
- image analysis [Archer & Chan (1996)],
- nuclear technology [Yacout et al. (1996)],
- metrology [Ciarlini (1997), Cox et al. (1997)],
- financial engineering [Bhar & Chiarella (1996), Ankenbrand & Tomassini (1996)].
Introduction (Cont'd)
- Bootstrap methods are potentially superior to large sample techniques for small sample sizes [Hall (1992)].
- A danger exists when applying bootstrap techniques in circumstances where standard approaches are judged inappropriate; in such circumstances the bootstrap may also fail [Freedman (1981)].
- Special care is therefore required when applying the bootstrap in real-life situations [Young (1994)].
Problem Statement
Let $\mathcal{X} = \{X_1, X_2, \ldots, X_n\}$ be a random sample from a completely unspecified distribution $F$. Let $\theta$ denote an unknown characteristic of $F$, such as its mean or variance.

Find the distribution of $\hat\theta$, an estimator of $\theta$, derived from the sample $\mathcal{X}$.

Possible solution: repeat the experiment a sufficient number of times and approximate the distribution of $\hat\theta$ by the so obtained empirical distribution.

Problem: this may be inapplicable for cost reasons or because the experimental conditions are not reproducible.
Principle of the Bootstrap
[Diagram, from Efron & Tibshirani (1993): in the REAL WORLD, an unknown probability model $F$ yields the observed data $\mathbf{x} = (x_1, x_2, \ldots, x_n)$, from which the statistic of interest $\hat\theta = s(\mathbf{x})$ is computed. In the BOOTSTRAP WORLD, the estimated probability model $\hat F$ yields a bootstrap sample $\mathbf{x}^* = (x_1^*, x_2^*, \ldots, x_n^*)$, from which the bootstrap replication $\hat\theta^* = s(\mathbf{x}^*)$ is computed.]
The Non-Parametric Bootstrap
1. Conduct the experiment to obtain the random sample $\mathcal{X} = \{X_1, X_2, \ldots, X_n\}$ and calculate the estimate $\hat\theta$ from $\mathcal{X}$.
2. Construct the empirical distribution $\hat F$, which puts equal mass $1/n$ at each observation $X_1 = x_1, X_2 = x_2, \ldots, X_n = x_n$.
3. From the selected $\hat F$, draw a sample $\mathcal{X}^* = \{X_1^*, X_2^*, \ldots, X_n^*\}$, called the bootstrap (re)sample.
4. Approximate the distribution of $\hat\theta$ by the distribution of $\hat\theta^*$ derived from $\mathcal{X}^*$.
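As a sketch, the four steps can be written in a few lines of Python (standard library only; the function name and parameters are illustrative, not from the tutorial):

```python
# Sampling from the empirical distribution F-hat, which puts mass 1/n
# on each observation, is exactly sampling with replacement.
import random
import statistics

random.seed(0)

def bootstrap_distribution(x, statistic, n_resamples=1000):
    """Approximate the distribution of `statistic` by resampling x."""
    n = len(x)
    return [statistic(random.choices(x, k=n))  # draw n values from F-hat
            for _ in range(n_resamples)]

# Example: distribution of the sample mean of a small sample
x = [random.gauss(10.0, 5.0) for _ in range(10)]
theta_star = bootstrap_distribution(x, statistics.mean)
```

The list `theta_star` then plays the role of the bootstrap distribution of $\hat\theta^*$ from which variances, biases, or quantiles are read off.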
The Bootstrap Procedure
[Flowchart: conduct the experiment to obtain the measured data $\mathbf{x}$ and compute the statistic $\hat\theta(\mathbf{x})$; resample to obtain bootstrap data $\mathbf{x}^*_1, \mathbf{x}^*_2, \ldots, \mathbf{x}^*_N$ and compute the statistics $\hat\theta^*_1, \hat\theta^*_2, \ldots, \hat\theta^*_N$; from these, generate the empirical distribution function $\hat F_{\hat\theta}(\hat\theta)$.]
Example: Bias Estimation
Consider the problem of estimating the variance of an unknown distribution $F_{\mu,\sigma}$, based on the random sample $\mathcal{X} = \{X_1, \ldots, X_n\}$. Two different estimators can be used:

$$\hat\sigma^2_u = \frac{1}{n-1} \sum_{i=1}^{n} \Bigl(X_i - \frac{1}{n}\sum_{j=1}^{n} X_j\Bigr)^2, \qquad \hat\sigma^2_b = \frac{1}{n} \sum_{i=1}^{n} \Bigl(X_i - \frac{1}{n}\sum_{j=1}^{n} X_j\Bigr)^2.$$

It can easily be shown that

$$E\hat\sigma^2_u = \sigma^2 \quad \text{and} \quad E\hat\sigma^2_b = \Bigl(1 - \frac{1}{n}\Bigr)\sigma^2.$$

With the bootstrap we estimate the bias $b(\hat\sigma^2) = E\hat\sigma^2 - \sigma^2$ by $E^*\hat\sigma^{*2} - \hat\sigma^2$, where $\hat\sigma^2$ is the maximum likelihood estimate of $\sigma^2$, i.e., $\hat\sigma^2_b$, and $E^*$ is expectation w.r.t. bootstrap sampling.
Example (Cont'd)
Step 0. Experiment. Collect the data into $\mathcal{X} = \{X_1, \ldots, X_n\}$. Compute the estimates $\hat\sigma^2_u$ and $\hat\sigma^2_b$.
Step 1. Resampling. Draw a random sample of size $n$, with replacement, from $\mathcal{X}$.
Step 2. Calculation of the bootstrap estimate. Calculate the bootstrap estimates $\hat\sigma^{*2}_u$ and $\hat\sigma^{*2}_b$ from $\mathcal{X}^*$ in the same way $\hat\sigma^2_u$ and $\hat\sigma^2_b$ were computed, but with the resample $\mathcal{X}^*$.
Step 3. Repetition. Repeat Steps 1 and 2 to obtain a total of $N$ bootstrap estimates $\hat\sigma^{*2}_{u,1}, \ldots, \hat\sigma^{*2}_{u,N}$ and $\hat\sigma^{*2}_{b,1}, \ldots, \hat\sigma^{*2}_{b,N}$.
Step 4. Bias estimation. Estimate $b(\hat\sigma^2_u)$ by $b^*(\hat\sigma^{*2}_u) = \frac{1}{N}\sum_{i=1}^{N} \hat\sigma^{*2}_{u,i} - \hat\sigma^2_b$ and $b(\hat\sigma^2_b)$ by $b^*(\hat\sigma^{*2}_b) = \frac{1}{N}\sum_{i=1}^{N} \hat\sigma^{*2}_{b,i} - \hat\sigma^2_b$.
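A minimal sketch of Steps 0-4 for the biased estimator (standard library Python; variable names are illustrative):

```python
# Bootstrap bias estimation for the ML variance estimate sigma^2_b,
# with n = 5 draws from the standard normal as in the slides.
import random

random.seed(1)

def var_biased(x):            # sigma^2_b: divide by n (the ML estimate)
    m = sum(x) / len(x)
    return sum((xi - m) ** 2 for xi in x) / len(x)

# Step 0: experiment
x = [random.gauss(0.0, 1.0) for _ in range(5)]
n, N = len(x), 999

# Steps 1-3: resample with replacement and recompute the estimate N times
stars_b = [var_biased(random.choices(x, k=n)) for _ in range(N)]

# Step 4: bias estimate = mean of bootstrap estimates minus the
# ML estimate computed from the original sample
bias_b = sum(stars_b) / N - var_biased(x)
```

Since $E^*\hat\sigma^{*2}_b = (1 - 1/n)\,\hat\sigma^2_b$, the value `bias_b` comes out negative, mirroring the negative bias of $\hat\sigma^2_b$.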
Example (Cont'd)
We considered a sample of size $n = 5$ from the standard normal distribution. With $N = 999$ and 1000 Monte Carlo simulations, we obtained the following histograms for $b^*(\hat\sigma^{*2}_u)$ and $b^*(\hat\sigma^{*2}_b)$.
[Two histograms of the bootstrap bias estimates (frequency of occurrence vs. bootstrap bias estimate): bootstrap estimation of $b(\hat\sigma^2_u)$, sample mean $-4.94 \times 10^{-4}$ (true value 0); bootstrap estimation of $b(\hat\sigma^2_b)$, sample mean $-0.1573$ (true value $-0.2$).]
Example: Variance Estimation
Consider the problem of finding the variance $\sigma^2_{\hat\theta}$ of an estimator $\hat\theta$ of $\theta$, based on the random sample $\mathcal{X} = \{X_1, \ldots, X_n\}$ from the unknown distribution $F_\theta$.
- If tractable, one may derive an analytic expression for $\sigma^2_{\hat\theta}$.
- Alternatively, one may use asymptotic arguments to compute an estimate $\hat\sigma^2_{\hat\theta}$ for $\sigma^2_{\hat\theta}$.
Problem: In many situations the conditions for the above are not fulfilled.
Solution: The bootstrap provides a simple and accurate alternative for approximating $\sigma^2_{\hat\theta}$ by $\hat\sigma^{*2}_{\hat\theta}$.
Example (Cont'd)
Step 0. Experiment. Conduct the experiment and collect the random data into the sample $\mathcal{X} = \{X_1, \ldots, X_n\}$.
Step 1. Resampling. Draw a random sample of size $n$, with replacement, from $\mathcal{X}$.
Step 2. Calculation of the bootstrap estimate. Evaluate the bootstrap estimate $\hat\theta^*$ from $\mathcal{X}^*$, calculated in the same manner as $\hat\theta$ but with the resample $\mathcal{X}^*$ replacing $\mathcal{X}$.
Step 3. Repetition. Repeat Steps 1 and 2 many times to obtain a total of $B$ bootstrap estimates $\hat\theta^*_1, \ldots, \hat\theta^*_B$. Typical values for $B$ are between 25 and 200 [Efron & Tibshirani (1993)].
Example (Cont'd)
Step 4. Estimation of the variance of $\hat\theta$. Estimate the variance $\sigma^2_{\hat\theta}$ of $\hat\theta$ by

$$\hat\sigma^2_{\mathrm{BOOT}} = \frac{1}{B-1} \sum_{b=1}^{B} \Bigl(\hat\theta^*_b - \frac{1}{B}\sum_{b=1}^{B}\hat\theta^*_b\Bigr)^2.$$

- Suppose $F_{\mu,\sigma}$ is $\mathcal{N}(10, 25)$ and we wish to estimate $\sigma_{\hat\theta}$ based on a random sample $\mathcal{X}$ of size $n = 50$.
- Following the above procedure with $B = 25$, a bootstrap estimate of the variance of $\hat\theta$ is found to be $\hat\sigma^2_{\mathrm{BOOT}} = 0.49$, as compared to the true $\sigma^2_{\hat\theta} = 0.5$.
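Steps 0-4 can be sketched as follows, with $F = \mathcal{N}(10, 25)$, $n = 50$ and $B = 25$ as on the slide (a sketch; since the draws are random, the numerical result will differ from run to run):

```python
# Bootstrap variance of the sample mean (standard library only).
import random

random.seed(2)

n, B = 50, 25
x = [random.gauss(10.0, 5.0) for _ in range(n)]        # Step 0
theta_hat = sum(x) / n

# Steps 1-3: B bootstrap replicates of the estimator
stars = [sum(random.choices(x, k=n)) / n for _ in range(B)]

# Step 4: sample variance of the replicates (sigma^2_BOOT)
m = sum(stars) / B
var_boot = sum((s - m) ** 2 for s in stars) / (B - 1)
# Compare with the true variance sigma^2 / n = 25 / 50 = 0.5
```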
Example (Cont'd)

[Histogram (frequency of occurrence vs. bootstrap variance estimate) of $\hat\sigma^{*2(1)}_{\hat\theta}, \hat\sigma^{*2(2)}_{\hat\theta}, \ldots, \hat\sigma^{*2(1000)}_{\hat\theta}$, based on a random sample of size $n = 50$ and $B = 25$.]
Example: Confidence Interval for the Mean
Let $\mathcal{X} = \{X_1, \ldots, X_n\}$ be from some unknown distribution $F_{\mu,\sigma}$. We wish to find an estimator and a $100(1-\alpha)\%$ confidence interval for $\mu$. Let

$$\hat\mu = \frac{X_1 + \ldots + X_n}{n}.$$

A confidence interval for $\mu$ is found by determining the distribution of $\hat\mu$, and finding $\hat\mu_L, \hat\mu_U$ such that

$$\Pr(\hat\mu_L \le \mu \le \hat\mu_U) = 1 - \alpha.$$

The distribution of $\hat\mu$ depends on the distribution of the $X_i$'s, which is unknown. If $n$ is large, the distribution of $\hat\mu$ could be approximated by the normal distribution as per the central limit theorem. What if $n$ is small?
Example (Cont'd)
Step 0. Experiment. Conduct the experiment and collect $X_1, \ldots, X_n$ into $\mathcal{X}$. Suppose $F_{\mu,\sigma}$ is $\mathcal{N}(10, 25)$ and $\mathcal{X} = \{-2.41, 4.86, 6.06, 9.11, 10.20, 12.81, 13.17, 14.10, 15.77, 15.79\}$ is of size $n = 10$. The mean of all values in $\mathcal{X}$ is $\hat\mu = 9.95$.
Step 1. Resampling. Draw a sample of 10 values, with replacement, from $\mathcal{X}$. One might obtain the bootstrap resample $\mathcal{X}^* = \{9.11, 9.11, 6.06, 13.17, 10.20, -2.41, 4.86, 12.81, -2.41, 4.86\}$. Note that some values from the original sample appear more than once while others do not appear at all.
Step 2. Calculation of the bootstrap estimate. Calculate the mean of $\mathcal{X}^*$. The mean of all 10 values in $\mathcal{X}^*$ is $\hat\mu^*_1 = 6.54$.
Example (Cont'd)
Step 3. Repetition. Repeat Steps 1 and 2 to obtain $N$ bootstrap estimates $\hat\mu^*_1, \ldots, \hat\mu^*_N$. Let $N = 1000$.
Step 4. Approximation of the distribution of $\hat\mu$. Sort the bootstrap estimates to obtain $\hat\mu^*_{(1)} \le \hat\mu^*_{(2)} \le \ldots \le \hat\mu^*_{(N)}$. We might get $3.48, 3.39, 4.46, \ldots, 8.86, 8.88, 8.89, \ldots, 10.07, 10.08, \ldots, 14.46, 14.53, 14.66$.
Step 5. Confidence interval. The $100(1-\alpha)\%$ bootstrap confidence interval is $(\hat\mu^*_{(q_1)}, \hat\mu^*_{(q_2)})$, where $q_1 = \lfloor N\alpha/2 \rfloor$ and $q_2 = N - q_1 + 1$. For $\alpha = 0.05$ and $N = 1000$, $q_1 = 25$ and $q_2 = 976$, and the 95% confidence interval is found to be $(6.27, 13.19)$, as compared to the theoretical $(6.85, 13.05)$.
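A sketch of Steps 1-5 applied to the sample from Step 0 (standard library Python; because the resamples are random, the endpoints will be close to, but not exactly, the slide's $(6.27, 13.19)$):

```python
# Percentile bootstrap confidence interval for the mean.
import random

random.seed(3)

x = [-2.41, 4.86, 6.06, 9.11, 10.20, 12.81, 13.17, 14.10, 15.77, 15.79]
n, N, alpha = len(x), 1000, 0.05

# Steps 1-3: N bootstrap means
stars = [sum(random.choices(x, k=n)) / n for _ in range(N)]

# Steps 4-5: sort and read off the order statistics
stars.sort()
q1 = int(N * alpha / 2)                      # floor(N * alpha / 2) = 25
q2 = N - q1 + 1                              # 976
lower, upper = stars[q1 - 1], stars[q2 - 1]  # 1-based ranks -> 0-based
```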
Example (Cont'd)

[Histogram (density scale) of $\hat\mu^*_1, \hat\mu^*_2, \ldots, \hat\mu^*_{1000}$, based on the random sample $\mathcal{X} = \{-2.41, 4.86, 6.06, 9.11, 10.20, 12.81, 13.17, 14.10, 15.77, 15.79\}$, together with the density function of a Gaussian variable with mean 10 and variance 2.5.]
Example (Cont'd)

[Histogram of 1000 bootstrap estimates of the mean of the $t_4$-distribution and the kernel probability density function obtained from 1000 Monte Carlo simulations. The 95% confidence interval is $(-0.896, 0.902)$ and $(-0.886, 0.887)$ based on the bootstrap and Monte Carlo, respectively.]
Example (Cont'd)
- The procedure described above can be substantially improved, because the interval calculated is, in fact, an interval with coverage less than the nominal value [Hall (1988)].
- Later, we shall discuss another way that will lead to a more accurate confidence interval for the mean.
- The computational expense of calculating the confidence interval for $\mu$ is approximately $N$ times greater than that needed to compute $\hat\mu$.
- This is acceptable given the ever-increasing capabilities of today's computers.
The Parametric Bootstrap
- Bootstrap sampling can be carried out parametrically.
- If one has partial knowledge of $F$, one may use $\hat F_{\hat\theta}$ instead of $\hat F$.
- Draw $N$ samples of size $n$ from the parametric estimate of $F$,
  $$\hat F_{\hat\theta} \longrightarrow (x^*_1, x^*_2, \ldots, x^*_n),$$
  and proceed as before.
- When used in a parametric way, the bootstrap provides more accurate answers, provided the model is correct.
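A sketch of the parametric variant, under the assumption of partial knowledge that $F$ is Gaussian: resample from $\mathcal{N}(\hat\mu, \hat\sigma^2)$ fitted to the data instead of from the empirical distribution (the function name is illustrative):

```python
# Parametric bootstrap: draw from the fitted model, not from F-hat.
import math
import random

random.seed(4)

x = [-2.41, 4.86, 6.06, 9.11, 10.20, 12.81, 13.17, 14.10, 15.77, 15.79]
n = len(x)
mu_hat = sum(x) / n
sigma_hat = math.sqrt(sum((xi - mu_hat) ** 2 for xi in x) / (n - 1))

def parametric_resample():
    """Draw one sample of size n from the fitted N(mu_hat, sigma_hat^2)."""
    return [random.gauss(mu_hat, sigma_hat) for _ in range(n)]

# N bootstrap means computed from parametric resamples
stars = [sum(parametric_resample()) / n for _ in range(1000)]
```

From here the confidence-interval construction proceeds exactly as in the non-parametric case: sort `stars` and read off the order statistics.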
Example: Confidence Interval for the Mean
Step 0. Experiment. Conduct the experiment and collect $X_1, \ldots, X_n$ into $\mathcal{X}$. Suppose $F_{\mu,\sigma}$ is $\mathcal{N}(10, 25)$ and $\mathcal{X} = \{-2.41, 4.86, 6.06, 9.11, 10.20, 12.81, 13.17, 14.10, 15.77, 15.79\}$ is of size $n = 10$. The mean of all values in $\mathcal{X}$ is $\hat\mu = 9.95$ and the sample variance is $\hat\sigma^2 = 33.15$.
Step 1. Resampling. Draw a sample of 10 values from $F_{\hat\mu,\hat\sigma}$. We might obtain $\mathcal{X}^* = \{7.45, 0.36, 10.67, 11.60, 3.34, 16.80, 16.79, 9.73, 11.83, 10.95\}$.
Step 2. Calculation of the bootstrap estimate. Calculate the mean of $\mathcal{X}^*$. The mean of all 10 values in $\mathcal{X}^*$ is $\hat\mu^*_1 = 9.95$.
Example (Cont'd)
Step 3. Repetition. Repeat Steps 1 and 2 to obtain $N$ bootstrap estimates $\hat\mu^*_1, \ldots, \hat\mu^*_N$. Let $N = 1000$.
Step 4. Approximation of the distribution of $\hat\mu$. Sort the bootstrap estimates to obtain $\hat\mu^*_{(1)} \le \hat\mu^*_{(2)} \le \ldots \le \hat\mu^*_{(N)}$.
Step 5. Confidence interval. The $100(1-\alpha)\%$ bootstrap confidence interval is $(\hat\mu^*_{(q_1)}, \hat\mu^*_{(q_2)})$, where $q_1 = \lfloor N\alpha/2 \rfloor$ and $q_2 = N - q_1 + 1$. For $\alpha = 0.05$ and $N = 1000$, $q_1 = 25$ and $q_2 = 976$, and the 95% confidence interval is found to be $(6.01, 13.87)$, as compared to the theoretical $(6.85, 13.05)$.
Example (Cont'd)

[Histogram (density scale) of $\hat\mu^*_1, \hat\mu^*_2, \ldots, \hat\mu^*_{1000}$, based on the random sample $\mathcal{X} = \{-2.41, 4.86, 6.06, 9.11, 10.20, 12.81, 13.17, 14.10, 15.77, 15.79\}$, together with the density function of a Gaussian variable with mean 10 and variance 2.5.]
An Example of Bootstrap Failure
Let $X \sim U(0, \theta)$ and $\mathcal{X} = \{X_1, X_2, \ldots, X_n\}$. We wish to estimate $\theta$ by $\hat\theta$ and its distribution $\hat F_{\hat\theta}(\hat\theta)$. The Maximum Likelihood (ML) estimator of $\theta$ is given by $\hat\theta = X_{(n)}$.
- To obtain an estimate of the density function of $\hat\theta$, we sample with replacement from the data and each time estimate $\hat\theta^*$ from $\mathcal{X}^*$.
- Alternatively, we could sample from $U(0, \hat\theta)$ and estimate $\hat\theta^*$ from $\mathcal{X}^*$ (parametric bootstrap).
- We ran an example with $\theta = 1$, $n = 50$ and $N = 1000$. The ML estimate of $\theta$ was found to be $\hat\theta = 0.9843$.
An Example of Bootstrap Failure (Cont'd)
The non-parametric bootstrap shows that approximately 62% of the values of $\hat\theta^*$ equal $\hat\theta$. In fact,

$$\Pr(\hat\theta^* = \hat\theta) = 1 - (1 - 1/n)^n \longrightarrow 1 - e^{-1} \approx 0.632 \quad \text{as } n \to \infty.$$
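This atom at $\hat\theta$ is easy to reproduce in a short simulation (standard library Python; $\theta = 1$, $n = 50$, $N = 1000$ as in the slide's example):

```python
# Fraction of nonparametric bootstrap replicates with theta*-hat equal
# to theta-hat; theory predicts ~ 1 - (1 - 1/n)^n, about 0.636 for n = 50.
import random

random.seed(5)

n, N = 50, 1000
x = [random.uniform(0.0, 1.0) for _ in range(n)]
theta_hat = max(x)                      # ML estimate X_(n)

hits = sum(max(random.choices(x, k=n)) == theta_hat for _ in range(N))
atom = hits / N
```

Because $\hat F$ places this large discrete mass at the sample maximum, the bootstrap distribution of $\hat\theta^*$ cannot mimic the continuous distribution of $\hat\theta$, which is the failure illustrated in the histograms below.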
[Two histograms of $\hat\theta^*$ (frequency of occurrence vs. $\hat\theta^*$): left, the non-parametric bootstrap; right, the parametric bootstrap.]
The Dependent Data Bootstrap
- The assumption of i.i.d. data can break down in practice either because the data is not independent, or because it is not identically distributed, or both.
- We can still invoke the bootstrap principle if we know the model that generated the data [Efron & Tibshirani (1993), Bose (1988), Kreiss & Franke (1992), Paparoditis (1996), Zoubir (1993)].
- For example, one way to relax the i.i.d. assumption is to assume that the data is identically distributed but not independent, as in autoregressive (AR) models.
The Dependent Data Bootstrap (Cont'd)
- If no plausible model such as AR is available for the probability mechanism generating stationary observations, we could make the assumption of weak dependence.
- Strong mixing processes¹, for example, satisfy the weak dependence condition.
- The moving blocks bootstrap [Künsch (1989), Liu & Singh (1992), Politis & Romano (1992, 1994)] has been proposed for bootstrapping weakly dependent data.

¹ Loosely speaking, a process is strong mixing if observations far apart (in time) are almost independent [Rosenblatt (1985)].
The Moving Blocks Bootstrap
[Schematic diagram of the moving blocks bootstrap for a stationary signal. The red circles are the original signal. A bootstrap realisation of the signal (green circles) is generated by choosing a block length ("3" in the diagram) and sampling with replacement from all possible contiguous blocks of this length.]
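One moving blocks resampling pass can be sketched as follows (standard library Python; the function name is illustrative, with block length 3 as in the diagram):

```python
# Moving blocks bootstrap: resample contiguous blocks, with
# replacement, and concatenate them to the original length.
import random

random.seed(6)

def moving_blocks_resample(x, block_len):
    """One bootstrap realisation preserving within-block dependence."""
    n = len(x)
    # All possible contiguous blocks of the chosen length
    blocks = [x[i:i + block_len] for i in range(n - block_len + 1)]
    out = []
    while len(out) < n:
        out.extend(random.choice(blocks))   # draw a block with replacement
    return out[:n]

x = [random.gauss(0.0, 1.0) for _ in range(30)]
x_star = moving_blocks_resample(x, block_len=3)
```

Dependence within each block of length 3 is preserved; only the joins between blocks break the original dependence structure, which is why the block length must grow with the sample size in theory.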
Other Block Bootstrap Methods
- The circular blocks bootstrap [Politis & Romano (1992), Shao & Yu (1993)] allows blocks which start at the end of the data and wrap around to the start.
- The blocks of blocks bootstrap [Politis et al. (1992)] uses two levels of blocking to estimate confidence bands for spectra and cross-spectra.
- The stationary bootstrap [Politis & Romano (1994)] allows blocks to be of random lengths instead of a fixed length.
An Example: Variance Estimation in AR Models

We generate $n$ observations $x_t$, $t = 0, \ldots, n-1$, from

$$X_t + a X_{t-1} = Z_t,$$

where $Z_t$ is white Gaussian noise with $EZ_t = 0$, $c_{ZZ}(u) = \sigma^2_Z \delta(u)$, and $a$ such that $|a| < 1$.

After de-trending the data, we fit the AR(1) model to the observations $x_t$. With $\hat c_{xx}(u) = \frac{1}{n}\sum_{t=0}^{n-|u|-1} x_t x_{t+|u|}$ for $0 \le |u| \le n-1$, we calculate the Maximum Likelihood Estimate (MLE) of $a$, $\hat a = -\hat c_{xx}(1)/\hat c_{xx}(0)$, which has approximate variance² $\hat\sigma^2_{\hat a} = (1 - a^2)/n$.

² Under some regularity conditions, an asymptotic formula for $\hat\sigma^2_{\hat a}$ can also be found in the non-Gaussian case; it is a function of $a$ and the variance and kurtosis of $Z_t$ [Porat & Friedlander (1989)].
Example (Cont'd)
Step 0. Experiment. Conduct the experiment and collect $n$ observations $x_t$, $t = 0, \ldots, n-1$, from an autoregressive process of order one, $X_t$.
Step 1. Calculation of the residuals. With the Maximum Likelihood Estimate $\hat a$ of $a$, define the residuals $\hat z_t = x_t + \hat a \, x_{t-1}$ for $t = 1, 2, \ldots, n-1$.
Step 2. Resampling. Create a bootstrap sample $x^*_0, x^*_1, \ldots, x^*_{n-1}$ by sampling $\hat z^*_1, \hat z^*_2, \ldots, \hat z^*_{n-1}$, with replacement, from the residuals $\hat z_1, \hat z_2, \ldots, \hat z_{n-1}$, then letting $x^*_0 = x_0$ and $x^*_t = -\hat a \, x^*_{t-1} + \hat z^*_t$, $t = 1, 2, \ldots, n-1$.
Example (Cont'd)
Step 3. Calculation of the bootstrap estimate. After centring the time series $x^*_0, x^*_1, \ldots, x^*_{n-1}$, obtain $\hat a^*$, using the above formulae but based on $x^*_0, x^*_1, \ldots, x^*_{n-1}$.
Step 4. Repetition. Repeat Steps 2-3 a large number of times, $N = 1000$, say, to obtain $\hat a^*_1, \hat a^*_2, \ldots, \hat a^*_N$.
Step 5. Variance estimation. From $\hat a^*_1, \hat a^*_2, \ldots, \hat a^*_N$, approximate the variance of $\hat a$ by

$$\hat\sigma^{*2}_{\hat a} = \frac{1}{N-1} \sum_{i=1}^{N} \Bigl(\hat a^*_i - \frac{1}{N}\sum_{i=1}^{N}\hat a^*_i\Bigr)^2.$$
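A compact sketch of Steps 0-5 (standard library Python; de-trending and centring are skipped since the simulated series is zero-mean by construction, and $N$ is reduced to 200 for speed):

```python
# Residual bootstrap for the AR(1) coefficient, a = -0.6, n = 128.
import random

random.seed(7)

a_true, n, N = -0.6, 128, 200

# Step 0: simulate X_t + a X_{t-1} = Z_t
x = [random.gauss(0.0, 1.0)]
for _ in range(n - 1):
    x.append(-a_true * x[-1] + random.gauss(0.0, 1.0))

def ar1_estimate(x):
    """a-hat = -c_xx(1) / c_xx(0) with biased covariance estimates."""
    n = len(x)
    c0 = sum(v * v for v in x) / n
    c1 = sum(x[t] * x[t + 1] for t in range(n - 1)) / n
    return -c1 / c0

a_hat = ar1_estimate(x)

# Step 1: residuals z-hat_t = x_t + a-hat x_{t-1}
z = [x[t] + a_hat * x[t - 1] for t in range(1, n)]

# Steps 2-4: rebuild bootstrap series from resampled residuals
a_stars = []
for _ in range(N):
    xs = [x[0]]
    for zt in random.choices(z, k=n - 1):
        xs.append(-a_hat * xs[-1] + zt)
    a_stars.append(ar1_estimate(xs))

# Step 5: bootstrap variance of a-hat
m = sum(a_stars) / N
var_a = sum((v - m) ** 2 for v in a_stars) / (N - 1)
```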
Example (Cont'd)

[Histogram (density scale) of $\hat a^*_1, \hat a^*_2, \ldots, \hat a^*_{1000}$ for $a = -0.6$, $n = 128$ and $Z_t$ Gaussian. The MLE for $a$ was $\hat a = -0.6351$ and $\hat\sigma_{\hat a} = 0.0707$. The bootstrap estimate was $\hat\sigma^*_{\hat a} = 0.0712$, as compared to $\hat\sigma_{\hat a} = 0.0694$ based on 1000 Monte Carlo simulations.]
The Principle of Pivoting
- A statistic $T(\mathcal{X}; \theta)$ is called pivotal if it possesses a fixed probability distribution independent of $\theta$ [Cramér (1967), Lehmann (1986)].
- Bootstrap confidence intervals or tests have excellent properties even for a relatively low fixed resample number [Hall (1992)].
- For example, one can show that the coverage error in confidence interval estimation with the bootstrap is $O_p(n^{-1})$, as compared to $O_p(n^{-1/2})$ when using the normal approximation.
- The accuracy claimed holds whenever the statistic is asymptotically pivotal [Hall & Titterington (1989), Hall (1992)].
An Example: Confidence Interval Estimation
We consider the construction of a confidence interval for the mean. Let $\mathcal{X} = \{X_1, \ldots, X_n\}$ be a random sample from some unknown $F_{\mu_X,\sigma_X}$. We wish to find an estimator of $\mu_X$ with a $100(1-\alpha)\%$ confidence interval.

Let $\hat\mu_X$ and $\hat\sigma^2_X$ be the sample mean and the sample variance of $\mathcal{X}$, respectively. As an alternative to the previous example, we will base our method for finding a confidence interval for $\mu_X$ on the statistic

$$\hat\theta_Y = \frac{\hat\mu_X - \mu_X}{\hat\sigma},$$

where $\hat\sigma$ is the standard deviation of $\hat\mu_X$. This statistic has, asymptotically for large $n$, a distribution free of unknown parameters.
Example (Cont'd)
Step 0. Experiment. Conduct the experiment and collect the random data into the sample $\mathcal{X} = \{X_1, X_2, \ldots, X_n\}$.
Step 1. Parameter estimation. Based on $\mathcal{X}$, calculate $\hat\mu_X$ and its standard deviation $\hat\sigma$, using a nested bootstrap.
Step 2. Resampling. Draw a random sample $\mathcal{X}^*$ of $n$ values, with replacement, from $\mathcal{X}$.
Step 3. Calculation of the pivotal statistic. Calculate the mean $\hat\mu^*_X$ of all values in $\mathcal{X}^*$ and, using a nested bootstrap, calculate $\hat\sigma^*$. Then form

$$\hat\theta^*_Y = \frac{\hat\mu^*_X - \hat\mu_X}{\hat\sigma^*}.$$
Example (Cont'd)
Step 4. Repetition. Repeat Steps 2-3 many times to obtain a total of $N$ bootstrap estimates $\hat\theta^*_{Y,1}, \ldots, \hat\theta^*_{Y,N}$.
Step 5. Ranking. Sort the bootstrap estimates to obtain $\hat\theta^*_{Y,(1)} \le \hat\theta^*_{Y,(2)} \le \ldots \le \hat\theta^*_{Y,(N)}$.
Step 6. Confidence interval. If $(\hat\theta^*_{Y,(q_1)}, \hat\theta^*_{Y,(q_2)})$ is an interval containing $(1-\alpha)N$ of the statistics $\hat\theta^*_Y$, where $q_1 = \lfloor N\alpha/2 \rfloor$ and $q_2 = N - q_1 + 1$, then

$$\bigl(\hat\mu_X - \hat\sigma\,\hat\theta^*_{Y,(q_2)},\ \hat\mu_X - \hat\sigma\,\hat\theta^*_{Y,(q_1)}\bigr)$$

is a $100(1-\alpha)\%$ confidence interval for $\mu_X$.

Such an interval is known as a percentile-t confidence interval [Efron (1987), Hall (1988)].
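The percentile-t procedure with a nested bootstrap for the standard error can be sketched as follows (standard library Python; the resample counts are kept small so the example runs quickly, at some cost in accuracy):

```python
# Percentile-t confidence interval for the mean with a nested bootstrap.
import math
import random

random.seed(8)

x = [-2.41, 4.86, 6.06, 9.11, 10.20, 12.81, 13.17, 14.10, 15.77, 15.79]
n, N, B_inner, alpha = len(x), 500, 50, 0.05

def mean(v):
    return sum(v) / len(v)

def boot_se(v, B):
    """Nested-bootstrap standard error of the mean of v."""
    ms = [mean(random.choices(v, k=len(v))) for _ in range(B)]
    mm = mean(ms)
    return math.sqrt(sum((m - mm) ** 2 for m in ms) / (B - 1))

mu_hat = mean(x)
se_hat = boot_se(x, B_inner)               # Step 1

t_stars = []
for _ in range(N):                         # Steps 2-4
    xs = random.choices(x, k=n)
    t_stars.append((mean(xs) - mu_hat) / boot_se(xs, B_inner))

t_stars.sort()                             # Step 5
q1 = int(N * alpha / 2)
q2 = N - q1 + 1
# Step 6: percentile-t interval (note the reversed quantiles)
lower = mu_hat - se_hat * t_stars[q2 - 1]
upper = mu_hat - se_hat * t_stars[q1 - 1]
```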
Example (Cont'd)
For the same random sample $\mathcal{X}$ as before, we obtained the confidence interval $(3.54, 13.94)$, as compared to $(6.01, 13.87)$.
- This interval is larger than the one obtained earlier and reinforces the statement that the interval obtained there has coverage less than the nominal 95%.
- It also yields better results than an interval derived using the assumption that $\hat\theta_Y$ is $\mathcal{N}(0, 1)$-distributed, or the (better) approximation that $\hat\theta_Y$ is $t_{n-1}$-distributed.
- The interval obtained here accounts for skewness in the underlying population and other errors [Hall (1988, 1992), Efron & Tibshirani (1993), Zoubir (1993)].
Variance Stabilisation

To ensure pivoting, the statistic is usually "studentised", i.e., we form

$$\hat T = \frac{\hat\theta - \theta}{\hat\sigma_{\hat\theta}}.$$

The percentile-t method is particularly applicable to location statistics, such as the sample mean, sample median, etc. [Efron & Tibshirani (1993)]. However, for more general statistics it may not be accurate.

Problem: Studentising results in confidence intervals with erratically varying lengths and end points.

Solution: Pivoting often does not hold unless an appropriate variance stabilising transformation is applied first. How do we "automatically" get a variance stabilising transformation?
Variance Stabilisation (Cont'd)
Step 1. Estimation of the variance stabilising transformation.
(a) Generate $B_1$ bootstrap samples $\mathcal{X}^*_i$ from $\mathcal{X}$ and for each calculate the value of the statistic $\hat\theta^*_i$, $i = 1, \ldots, B_1$. For example, $B_1 = 100$.
(b) Generate $B_2$ bootstrap samples from $\mathcal{X}^*_i$, $i = 1, \ldots, B_1$, and calculate $\hat\sigma^{*2}_i$, a bootstrap estimate of the variance of $\hat\theta^*_i$, $i = 1, \ldots, B_1$. For example, $B_2 = 25$.
(c) Estimate the variance function $\sigma(\cdot)$ by smoothing the values of $\hat\sigma^{*2}_i$ against $\hat\theta^*_i$, using, for example, a fixed-span 50% "running lines" smoother [Hastie & Tibshirani (1990)].
Variance Stabilisation (Cont'd)
Step 1. Estimation of the variance stabilising transformation (Cont'd).
(d) Estimate the variance stabilising transformation $h(\hat\theta)$ from

$$h(\theta) = \int^{\theta} \{\sigma(s)\}^{-1/2}\, ds.$$

Step 2. Bootstrap quantile estimation. Generate $B_3$ bootstrap samples and compute $\hat\theta^*_i$ and $h(\hat\theta^*_i)$ for each sample $i$. Approximate the distribution of $h(\hat\theta) - h(\theta)$ by that of $h(\hat\theta^*) - h(\hat\theta)$.
Example: Correlation Coefficient
Let $\theta = \rho$ be the correlation coefficient of two unknown populations, and let $\hat\rho$ and $\hat\sigma^2$ be estimates of $\rho$ and of the variance of $\hat\rho$, respectively, based on $\mathcal{X} = \{X_1, \ldots, X_n\}$ and $\mathcal{Y} = \{Y_1, \ldots, Y_n\}$.

Let $\mathcal{X}^*$ and $\mathcal{Y}^*$ be resamples, drawn with replacement from $\mathcal{X}$ and $\mathcal{Y}$, respectively, and let $\hat\rho^*$ and $\hat\sigma^{*2}$ be bootstrap versions of $\hat\rho$ and $\hat\sigma^2$.

By repeated resampling from $\mathcal{X}$ and $\mathcal{Y}$ we compute $\hat{s}^*$ and $\hat{t}^*$ such that, with $0 < \alpha < 1$,
$$\Pr\left( (\hat\rho^* - \hat\rho)/\hat\sigma^* \le \hat{s}^* \,\middle|\, \mathcal{X}, \mathcal{Y} \right) = \frac{\alpha}{2} = \Pr\left( (\hat\rho^* - \hat\rho)/\hat\sigma^* \ge \hat{t}^* \,\middle|\, \mathcal{X}, \mathcal{Y} \right).$$
The (percentile-t) confidence interval for $\rho$ is given by
$$I(\mathcal{X}, \mathcal{Y}) = \left( \hat\rho - \hat\sigma \hat{t}^*,\; \hat\rho - \hat\sigma \hat{s}^* \right).$$
Example: Correlation Coefficient (Cont'd)
There exists a transformation called Fisher's z-transform [Fisher (1921), Anderson (1984)], which is both variance stabilising and normalising:
$$\hat\zeta = \tanh^{-1}\hat\rho = \frac{1}{2}\log\frac{1+\hat\rho}{1-\hat\rho}.$$
We could first find a confidence interval for $\zeta = \tanh^{-1}\rho$ and then transform the endpoints back with the inverse transformation $\rho = \tanh\zeta$ to obtain a confidence interval for $\rho$.

For $X$ and $Y$ bivariate normal ($\hat\zeta \sim \mathcal{N}(\zeta, 1/(n-3))$), a 95% confidence interval for $\rho$, for example, is obtained from
$$\left( \tanh\!\left(\hat\zeta - 1.96/\sqrt{n-3}\right),\; \tanh\!\left(\hat\zeta + 1.96/\sqrt{n-3}\right) \right).$$
Example: Correlation Coefficient (Cont'd)
Let $X = Z_1 + W$ and $Y = Z_2 + W$, where $Z_1$, $Z_2$ and $W$ are pairwise i.i.d. Then $\rho_{XY} = 0.5$. We drew $n = 15$ realisations $z_{1,i}$, $z_{2,i}$ and $w_i$ from the normal distribution and calculated $x_i$, $y_i$, $i = 1, \ldots, 15$.

- Assuming a normal distribution, we found $\hat\rho_{XY} = 0.36$ and the 95% confidence interval $(-0.18, 0.74)$ for $\rho_{XY}$.
- Using the bootstrap percentile-t method with $N = 1000$, we found the 95% confidence interval $(-0.05, 1.44)$.
- Using Fisher's z-transform and the bootstrap (without assuming bivariate normality), the confidence interval was found to be $(-0.28, 0.93)$.

The interval found using the percentile-t method is over-covering.
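The percentile-t interval for the correlation coefficient can be sketched as follows. Pairs $(x_i, y_i)$ are resampled jointly, and the small inner bootstrap supplying $\hat\sigma^*$ is one common choice; all function names are illustrative, not from the tutorial:

```python
import numpy as np

rng = np.random.default_rng(1)

def percentile_t_ci_corr(x, y, alpha=0.05, n_boot=1000, n_inner=25):
    """Percentile-t bootstrap confidence interval for the correlation."""
    n = len(x)
    def corr(xs, ys):
        return np.corrcoef(xs, ys)[0, 1]
    def boot_sd(xs, ys, b):
        idx = rng.integers(0, n, size=(b, n))
        return np.std([corr(xs[i], ys[i]) for i in idx], ddof=1)
    rho_hat = corr(x, y)
    sd_hat = boot_sd(x, y, 100)
    t_star = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, n, size=n)              # resample pairs
        xs, ys = x[idx], y[idx]
        t_star[b] = (corr(xs, ys) - rho_hat) / boot_sd(xs, ys, n_inner)
    s_lo, s_hi = np.quantile(t_star, [alpha / 2, 1 - alpha / 2])
    # interval (rho - sigma*t, rho - sigma*s) from the slide above
    return rho_hat - sd_hat * s_hi, rho_hat - sd_hat * s_lo

z1, z2, w = rng.normal(size=(3, 15))
lo, hi = percentile_t_ci_corr(z1 + w, z2 + w)         # true rho = 0.5
print(lo < hi)
```

As the slide notes, with $n = 15$ the raw percentile-t interval can spill outside $[-1, 1]$; this is exactly the over-coverage that variance stabilisation addresses.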
Example: Correlation Coefficient (Cont'd)

[Figure: standard deviation vs. $\hat\upsilon^*$] Bootstrap estimates of the standard deviation of $B_3 = 1000$ (bootstrap) estimates of the correlation coefficient before variance stabilisation.
Example: Correlation Coefficient (Cont'd)

[Figure: $h(\hat\upsilon^*)$ vs. $\hat\upsilon^*$] Variance stabilising transformation for the correlation coefficient, estimated using $B_1 = 100$ and $B_2 = 25$. The solid line is a plot of Fisher's z-transform.
Example: Correlation Coefficient (Cont'd)

[Figure: standard deviation vs. $h(\hat\upsilon^*)$] Bootstrap estimates of the standard deviation of $B_3 = 1000$ (new bootstrap) estimates of the correlation coefficient after variance stabilisation, obtained through the bootstrap. The confidence interval found was $(0.06, 0.97)$.
Example: Correlation Coefficient (Cont'd)

[Figure: standard deviation vs. $h(\hat\upsilon^*)$] Bootstrap estimates of the standard deviation of $B_3 = 1000$ (bootstrap) estimates of the correlation coefficient after applying Fisher's variance stabilising transformation $\tanh^{-1}$.
Example: Coherence Gain for Knock Data

[Figure] Vibration sensors distributed on the block of a Volkswagen Passat four-cylinder engine with 1.8 l, 79 kW and 10:1 compression ratio.
Coherence Gain for Knock Data (Cont'd)

[Block diagram: the pressure signal $S_t$ is predicted once from the full sensor vector $Z_t = (X_t', Y_t)'$ and once from $X_t$ alone, via the prediction filters $g_{1,t}$ and $g_{2,t}$, giving predictions $\hat{S}_{1,t}$, $\hat{S}_{2,t}$ and errors $E_{1,t}$, $E_{2,t}$.]

- $Z_t$: vector of vibration signals, observed;
- $S_t$: cylinder pressure signal, observed;
- $g_{1,t}$: prediction filter impulse response, unknown;
- $g_{2,t}$: prediction filter impulse response, unknown;
- $E_{i,t}$: prediction error, $E_{i,t} = S_t - \hat{S}_{i,t}$, $i = 1, 2$, unknown.
Coherence Gain for Knock Data (Cont'd)

The closeness to zero of $\vartheta(\omega) = R^2_{SZ}(\omega) - R^2_{SX}(\omega)$, the coherence gain explained by the sensor with output signal $Y_t$, is an irrelevancy measure of this sensor among the remaining sensors.

This suggests to test
$$H: R^2_{SZ}(\omega) - R^2_{SX}(\omega) \le \vartheta_0(\omega) \quad\text{against}\quad K: R^2_{SZ}(\omega) - R^2_{SX}(\omega) > \vartheta_0(\omega), \qquad 0 < \vartheta_0(\omega) < 1.$$

Problem: The distribution and a variance stabilising transformation for the test statistic $\hat{T}(\omega) = (\hat\vartheta(\omega) - \vartheta_0(\omega))/\hat\sigma(\omega)$, with $\hat\vartheta(\omega) = \hat{R}^2_{SZ}(\omega) - \hat{R}^2_{SX}(\omega)$, are unknown.
Example (Cont'd)

[Figure: standard deviation vs. $\hat\vartheta^*$] Standard deviation of bootstrap estimates at one mode frequency without variance stabilisation. Herein, $\hat\vartheta(\omega) = \hat{R}^2_{SZ}(\omega) - \hat{R}^2_{SX}(\omega)$.
Example (Cont'd)

[Figure: $h(\hat\vartheta^*)$ vs. $\hat\vartheta^*$] Estimated variance stabilising transformation. The transformation was found using $B_1 = 200$, $B_2 = 25$ and a fixed-span running lines smoother with a span of 50%.
Example (Cont'd)

[Figure: standard deviation vs. $h(\hat\vartheta^*)$] Standard deviation of (new) bootstrap estimates $\hat\vartheta^*(\omega)$ at one mode frequency after variance stabilisation.
Hypothesis Testing with the Bootstrap
Consider a random sample $\mathcal{X} = \{X_1, \ldots, X_n\}$ observed from an unspecified probability distribution $F$. Let $\theta$ be an unknown parameter of $F$.

We wish to test the hypothesis
$$H: \theta \le \theta_0 \quad\text{against}\quad K: \theta > \theta_0,$$
where $\theta_0$ is some known constant. Let $\hat\theta$ be an estimator of $\theta$ and $\hat\sigma^2$ an estimator of the variance $\sigma^2$ of $\hat\theta$.

Define the statistic
$$\hat{T} = \frac{\hat\theta - \theta_0}{\hat\sigma}.$$
Hypothesis Testing (Cont'd)
Step 1. Draw $\mathcal{X}^*$, with replacement, from $\mathcal{X}$.

Step 2. Calculate
$$\hat{T}^* = \frac{\hat\theta^* - \hat\theta}{\hat\sigma^*},$$
where $\hat\theta^*$ and $\hat\sigma^*$ are versions of $\hat\theta$ and $\hat\sigma$ computed from $\mathcal{X}^*$.

Step 3. Repeat Steps 1 and 2 to obtain $\hat{T}^*_1, \ldots, \hat{T}^*_N$.

Step 4. Rank $\hat{T}^*_1, \ldots, \hat{T}^*_N$ into $\hat{T}^*_{(1)} \le \cdots \le \hat{T}^*_{(N)}$. Reject $H$ if $\hat{T} \ge \hat{T}^*_{(q)}$, where $q = \lfloor (N+1)(1-\alpha) \rfloor$.
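The four steps can be sketched as follows, illustrated for $\theta$ equal to the mean with $\hat\sigma = s/\sqrt{n}$ (an illustrative choice of estimator, not prescribed by the tutorial):

```python
import numpy as np

rng = np.random.default_rng(2)

def bootstrap_test(x, theta0, alpha=0.05, n_boot=999):
    """Bootstrap test of H: theta <= theta0 (Steps 1-4 above)."""
    n = len(x)
    theta_hat, sigma_hat = x.mean(), x.std(ddof=1) / np.sqrt(n)
    t_obs = (theta_hat - theta0) / sigma_hat
    t_star = np.empty(n_boot)
    for b in range(n_boot):
        xs = rng.choice(x, size=n, replace=True)     # Step 1: resample X*
        t_star[b] = (xs.mean() - theta_hat) / (xs.std(ddof=1) / np.sqrt(n))
    t_star.sort()                                    # Step 4: rank T*_(1) <= ... <= T*_(N)
    q = int((n_boot + 1) * (1 - alpha))              # q = floor((N+1)(1-alpha))
    return t_obs >= t_star[q - 1]                    # reject H?

x = rng.normal(loc=3.0, size=30)
reject = bootstrap_test(x, theta0=0.0)
print(reject)
```

Note that the bootstrap statistic is centred at $\hat\theta$, not at $\theta_0$: resampling estimates the null distribution of the studentised statistic.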
Hypothesis Testing (Cont'd)
For a two-sided alternative we would test
$$H: \theta = \theta_0 \quad\text{against}\quad K: \theta \ne \theta_0,$$
where $\theta_0$ is some known constant. The statistic used is given by
$$\hat{T}_d = \frac{|\hat\theta - \theta_0|}{\hat\sigma}.$$
We would proceed as before and rank the bootstrap statistics $\hat{T}^*_{d,1}, \ldots, \hat{T}^*_{d,N}$ into $\hat{T}^*_{d,(1)} \le \cdots \le \hat{T}^*_{d,(N)}$. Then we would reject $H$ at level $\alpha$ if $\hat{T}_d \ge \hat{T}^*_{d,(q)}$, where $q = \lfloor (N+1)(1-\alpha) \rfloor$.
Hypothesis Testing: Variance Estimation

If an estimator $\hat\sigma^2$ is unavailable, use the bootstrap: given $\hat\theta^*_1, \ldots, \hat\theta^*_{B_1}$, we estimate the variance $\hat\sigma^2$ of $\hat\theta$ by
$$\hat\sigma^2_{\mathrm{BOOT}} = \frac{1}{B_1 - 1} \sum_{b=1}^{B_1} \left( \hat\theta^*_b - \frac{1}{B_1} \sum_{b'=1}^{B_1} \hat\theta^*_{b'} \right)^2.$$

In the case of $\hat\sigma^{*2}$, the procedure involves two nested levels of resampling. For each resample $\mathcal{X}^*_b$, $b = 1, \ldots, B_1$, we draw resamples $\mathcal{X}^{**}_b$, $b = 1, \ldots, B_2$, evaluate $\hat\theta^{**}_b$ from each resample to obtain $B_2$ replications, and calculate
$$\hat\sigma^{*2}_{\mathrm{BOOT}} = \frac{1}{B_2 - 1} \sum_{b=1}^{B_2} \left( \hat\theta^{**}_b - \frac{1}{B_2} \sum_{b'=1}^{B_2} \hat\theta^{**}_{b'} \right)^2.$$
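The $\hat\sigma^2_{\mathrm{BOOT}}$ formula translates directly into code; the nested $\hat\sigma^{*2}$ estimate simply applies the same function to each first-level resample (a minimal sketch, with illustrative names):

```python
import numpy as np

rng = np.random.default_rng(3)

def boot_variance(x, stat, b=200):
    """sigma^2_BOOT: sample variance of B bootstrap replications of stat."""
    n = len(x)
    reps = np.array([stat(rng.choice(x, size=n, replace=True)) for _ in range(b)])
    return reps.var(ddof=1)      # 1/(B-1) * sum (theta*_b - mean(theta*))^2

x = rng.exponential(size=50)
v = boot_variance(x, np.median)  # variance of the sample median, no formula needed
print(0 < v < 1)
```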
Variance Estimation (Cont'd)
The jackknife can be thought of as drawing $n$ samples of size $n-1$ each, without replacement, from the original sample of size $n$ [Miller (1974)].

The jackknife is based on deleting one observation at a time from the sample: $\mathcal{X}_{(i)} = \{X_1, X_2, \ldots, X_{i-1}, X_{i+1}, \ldots, X_n\}$, $i = 1, 2, \ldots, n$, is called the $i$th jackknife sample. For each $i$th jackknife sample we calculate the $i$th jackknife estimate $\hat\theta_{(i)}$ of $\theta$, $i = 1, \ldots, n$, and compute
$$\hat\sigma^2_{\mathrm{JACK}} = \frac{n-1}{n} \sum_{i=1}^{n} \left( \hat\theta_{(i)} - \frac{1}{n} \sum_{i'=1}^{n} \hat\theta_{(i')} \right)^2,$$
which is less expensive than the bootstrap if $n$ is less than the number of replicates used by the bootstrap for standard deviation estimation.
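A sketch of the delete-one jackknife; for the sample mean, $\hat\sigma^2_{\mathrm{JACK}}$ reduces exactly to $s^2/n$, which makes a convenient check:

```python
import numpy as np

def jackknife_variance(x, stat):
    """sigma^2_JACK from the n delete-one estimates theta_(i)."""
    n = len(x)
    theta_i = np.array([stat(np.delete(x, i)) for i in range(n)])
    return (n - 1) / n * np.sum((theta_i - theta_i.mean()) ** 2)

x = np.arange(1.0, 11.0)            # 1, 2, ..., 10
v = jackknife_variance(x, np.mean)
print(v)                            # equals s^2/n = 0.91666...
```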
An Example: Testing the Frequency Response
[Block diagram: the $r$ vector-valued signal $S_t$ passes through the filter $g_t$; noise $E_t$ is added to give the output $Z_t$.]

- $S_t$: $r$ vector-valued stationary signal, observed;
- $Z_t$: output signal, observed;
- $g_t$: filter impulse response, unknown;
- $E_t$: noise, unknown; $E_t$ and $S_t$ independent for $t = 0, \pm 1, \pm 2, \ldots$.

$$Z_t = \sum_{u=-\infty}^{\infty} g'_u S_{t-u} + E_t.$$

Problem: Which element $G_l(\omega)$, $1 \le l \le r$, is zero at a given $\omega$?
Example (Cont'd)

Let $G(\omega) = (G^{(l)}(\omega)', G_l(\omega))'$. Given $S_t$ and $Z_t$ for $n$ independent observations of length $T$ each, we wish to test
$$H: G_l(\omega) = 0 \;\; (G^{(l)}(\omega) \text{ unspecified}) \quad\text{against}\quad K: G_l(\omega) \ne 0.$$
We compute the frequency data $\mathbf{d}_s(\omega) = (\mathbf{d}_{S_1}(\omega), \ldots, \mathbf{d}_{S_r}(\omega))$, $\mathbf{d}_{S_l}(\omega) = (d_{S_l}(\omega, 1), \ldots, d_{S_l}(\omega, n))'$, $l = 1, \ldots, r$, and $\mathbf{d}_Z(\omega) = (d_Z(\omega, 1), \ldots, d_Z(\omega, n))'$, with
$$d_Z(\omega, i) = \sum_{t=0}^{T-1} w(t/T)\, Z_{t,i}\, e^{-j\omega t}, \qquad i = 1, \ldots, n,$$
and consider the complex regression
$$\mathbf{d}_Z(\omega) = \mathbf{d}_s(\omega)\, G(\omega) + \mathbf{d}_E(\omega).$$
Let the least-squares estimate (LSE) of $G(\omega)$ be
$$\hat{G}(\omega) = \left( \mathbf{d}_s(\omega)^H \mathbf{d}_s(\omega) \right)^{-1} \left( \mathbf{d}_s(\omega)^H \mathbf{d}_Z(\omega) \right).$$
Example (Cont'd)
Conventional techniques assume $T$ large, so that $\mathbf{d}_E(\omega)$ becomes complex Gaussian [Brillinger (1981)]. Under this condition and $H$, the statistic
$$\hat{T}(\omega) = (n - r)\, \frac{\left\| \mathbf{d}_Z(\omega) - \mathbf{d}_s^{(l)}(\omega)\, \hat{G}^{(l)}(\omega) \right\|^2 - \left\| \mathbf{d}_Z(\omega) - \mathbf{d}_s(\omega)\, \hat{G}(\omega) \right\|^2}{\left\| \mathbf{d}_Z(\omega) - \mathbf{d}_s(\omega)\, \hat{G}(\omega) \right\|^2}$$
is $F_{2, 2(n-r)}$-distributed, where
$$\mathbf{d}_s^{(l)}(\omega) = \left( \mathbf{d}_{S_1}(\omega), \ldots, \mathbf{d}_{S_{l-1}}(\omega), \mathbf{d}_{S_{l+1}}(\omega), \ldots, \mathbf{d}_{S_r}(\omega) \right)$$
is obtained from $\mathbf{d}_s(\omega) = (\mathbf{d}_s^{(l)}(\omega), \mathbf{d}_{S_l}(\omega))$ by deleting the $l$th vector $\mathbf{d}_{S_l}(\omega)$, and $\hat{G}^{(l)}(\omega) = ( \mathbf{d}_s^{(l)}(\omega)^H \mathbf{d}_s^{(l)}(\omega) )^{-1} ( \mathbf{d}_s^{(l)}(\omega)^H \mathbf{d}_Z(\omega) )$ [Shumway (1983)].
Example (Cont'd)
Step 0. Experiment. Conduct the experiment and calculate $d_S(\omega, 1), \ldots, d_S(\omega, n)$ and $d_Z(\omega, 1), \ldots, d_Z(\omega, n)$.

Step 1. Resampling. Conduct two totally independent resampling operations in which $\{d^*_S(\omega, 1), \ldots, d^*_S(\omega, n)\}$ is drawn, with replacement, from $\{d_S(\omega, 1), \ldots, d_S(\omega, n)\}$, where $d_S(\omega, i) = (d_{S_1}(\omega, i), \ldots, d_{S_r}(\omega, i))$, $i = 1, \ldots, n$, and a resample $\{d^*_{\hat{E}}(\omega, 1), \ldots, d^*_{\hat{E}}(\omega, n)\}$ is drawn, with replacement, from $\{d_{\hat{E}}(\omega, 1), \ldots, d_{\hat{E}}(\omega, n)\}$, collected into the vector $\mathbf{d}_{\hat{E}}(\omega) = (d_{\hat{E}}(\omega, 1), \ldots, d_{\hat{E}}(\omega, n))'$, so that
$$\mathbf{d}_{\hat{E}}(\omega) = \mathbf{d}_Z(\omega) - \mathbf{d}_s(\omega)\, \hat{G}(\omega).$$
Example (Cont'd)
Step 2. Generation of bootstrap data. Centre the frequency data resamples and compute
$$\mathbf{d}^*_Z(\omega) = \mathbf{d}^*_s(\omega)\, \hat{G}(\omega) + \mathbf{d}^*_{\hat{E}}(\omega).$$
The joint distribution of $\{ (d^*_S(\omega, i), d^*_Z(\omega, i)),\ 1 \le i \le n \}$, conditional on $\mathcal{X}(\omega) = \{ (d_S(\omega, 1), d_Z(\omega, 1)), \ldots, (d_S(\omega, n), d_Z(\omega, n)) \}$, is the bootstrap estimate of the unconditional joint distribution of $\mathcal{X}(\omega)$.

Step 3. Calculation of bootstrap estimates. With the new $\mathbf{d}^*_Z(\omega)$ and $\mathbf{d}^*_s(\omega)$, calculate the LSE $\hat{G}^*(\omega)$, using the resamples $\mathbf{d}^*_Z(\omega)$ and $\mathbf{d}^*_s(\omega)$ in place of $\mathbf{d}_Z(\omega)$ and $\mathbf{d}_s(\omega)$, respectively.
Example (Cont'd)
Step 4. Calculation of the bootstrap statistic. Calculate the statistic, replacing $\mathbf{d}_s(\omega)$, $\mathbf{d}_s^{(l)}(\omega)$, $\mathbf{d}_Z(\omega)$, $\hat{G}(\omega)$ and $\hat{G}^{(l)}(\omega)$ by their bootstrap counterparts, to yield $\hat{T}^*(\omega)$.

Step 5. Repetition. Repeat Steps 1-4 a large number of times, say $N$, to obtain $\hat{T}^*_1(\omega), \ldots, \hat{T}^*_N(\omega)$.

Step 6. Distribution estimation. Approximate the distribution of $\hat{T}(\omega)$ by the distribution of the $\hat{T}^*(\omega)$ obtained.

An alternative bootstrap approach to the one described here can be obtained by expressing $\hat{T}(\omega)$ using multiple coherences [Zoubir (1993, 1994), Zoubir & Boashash (1998)].
Example (Cont'd)
Let there be $n = 20$ independent records of $S_t$ with $r = 5$, and let
$$S_{l,t} = \sum_{k=1}^{K} A_{k,l} \cos(\omega_k t + \Phi_{k,l}) + U_{l,t}, \qquad l = 1, \ldots, 5,$$
where $A_{k,l}$ and $\Phi_{k,l}$ are mutually independent random amplitudes and phases, respectively, $\omega_k$ are arbitrary resonance frequencies for $k = 1, \ldots, K$, and $U_{l,t}$ is a white noise process, $l = 1, \ldots, r$.

With $K = 4$ and $T = 128$, we generated data for $\Phi$ and $A$ from a uniform distribution on the interval $[0, 2\pi)$ and $[0, 1)$, respectively. We selected $f_1 = 0.1$, $f_2 = 0.2$, $f_3 = 0.3$ and $f_4 = 0.4$, where $f_k = \omega_k/2\pi$, $k = 1, \ldots, 4$. The added noise was uniformly distributed.
Example (Cont'd)

[Figure: magnitude in dB vs. normalised frequency] Spectral estimate of $S_{l,t}$, $l = 1, \ldots, r$, $\mathbf{d}_{S_l}(\omega)^H \mathbf{d}_{S_l}(\omega)/n$, obtained by averaging $n = 20$ periodograms.
Example (Cont'd)

We generated bandstop filters (FIR filters with 256 coefficients) with bands centred about the four resonance frequencies $f_1$, $f_2$, $f_3$ and $f_4$.

[Figure: magnitude in dB vs. normalised frequency] Frequency response of the first channel, $G_1(\omega)$, obtained using an FIR filter with 256 coefficients.
Example (Cont'd)

We filtered $S_t$ and added independent uniformly distributed noise $E_t$ to generate $Z_t$ (SNR = 5 dB with respect to the component $S_{l,t}$, $l = 1, \ldots, r$, $t = 0, \ldots, T-1$, with highest power).

[Figure: magnitude in dB vs. normalised frequency] Spectral estimate of $Z_t$, $\mathbf{d}_Z(\omega)^H \mathbf{d}_Z(\omega)/n$, with $n = 20$.
Example (Cont'd)

We selected $G_l(\omega)$, $l = 1, \ldots, r$, arbitrarily and tested $H$.

[Figure: density vs. bootstrap statistic] Histogram of 1000 bootstrap values of the statistic $\hat{T}^*(\omega) = (\hat\vartheta^*(\omega) - \hat\vartheta(\omega))/\hat\sigma^*(\omega)$ at a frequency bin where $H: G_2(\omega) = 0$ was retained. The solid line represents a kernel density estimate using 1000 Monte Carlo simulations.
Regression Analysis (Cont'd)

[Figure: density vs. bootstrap statistic] Histogram of 1000 bootstrap values of the statistic $\hat{T}^*(\omega) = (\hat\vartheta^*(\omega) - \hat\vartheta(\omega))/\hat\sigma^*(\omega)$ at a frequency bin where the hypothesis $H: G_4(\omega) = 0$ was retained. The solid line represents a kernel density estimate using 1000 Monte Carlo simulations.
Signal Detection

- Detection of signals in interference is a key area in signal processing applications such as radar, sonar, and telecommunications.
- Detection theory is well established when the interference is Gaussian.
- In many applications, such as high-resolution radar and radar at low grazing angles, interference such as clutter is non-Gaussian.
- Existing methods for the detection of signals in non-Gaussian interference are often cumbersome and/or non-optimal.
The Signal Detection Problem

[Block diagram: the transmitted signal $s$ is scaled by $\theta$ and corrupted by additive noise $W$ at the receiver.]

$$X = \theta s + W$$

Test the hypotheses:
$$H: \theta = 0 \qquad K: \theta > 0$$

Receiver structure: the statistic $\hat{T}(X)$ is compared with a threshold; $\hat{T}$ above the threshold decides $K$, otherwise $H$.
Signal Detection with the Matched Filter

[Block diagram: $x$ is correlated with $s$ and normalised by $\sigma\sqrt{s's}$.]

$$\hat{T} = \frac{s' x}{\sigma \sqrt{s' s}} = \frac{s' P_s x}{\sigma \sqrt{s' s}} \sim \mathcal{N}\!\left( \frac{\theta \sqrt{s' s}}{\sigma},\; 1 \right), \qquad P_s = \frac{s s'}{s' s}.$$
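The statistic above in code form (a minimal sketch; note that the projection $P_s$ drops out of the scalar statistic, which is standard normal under $H$):

```python
import numpy as np

rng = np.random.default_rng(4)

def matched_filter_stat(x, s, sigma):
    """T = s'x / (sigma * sqrt(s's)); standard normal when x is pure noise."""
    return s @ x / (sigma * np.sqrt(s @ s))

t = np.arange(10)
s = np.sin(2 * np.pi * t / 6)
w = rng.normal(size=10)
t_noise = matched_filter_stat(w, s, sigma=1.0)            # noise only
t_signal = matched_filter_stat(2 * s + w, s, sigma=1.0)   # theta = 2
# adding theta*s shifts T by exactly theta*sqrt(s's), per the N(.,1) law above
print(t_signal - t_noise)
```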
Performance of the Matched Filter

[Figure: detection probability vs. voltage signal-to-noise ratio, one curve per false alarm probability $P_{FA} = 10^{-1}, \ldots, 10^{-6}$.]
Signal Detection with the CFAR Matched Filter

[Block diagram: the matched-filter output is normalised by a noise-power estimate formed from the component of $x$ orthogonal to $s$.]

$$T = \frac{s' P_s x \,/\, \sqrt{\sigma^2 s' s}}{\sqrt{x'(I - P_s)x \,/\, ((n-1)\sigma^2)}} \sim t_{n-1}\!\left( \frac{\theta \sqrt{s' s}}{\sigma} \right)$$
Performance of the CFAR Matched Filter

[Figure: detection probability vs. voltage signal-to-noise ratio for sample sizes $N = 2, 3, 6, 11, 21, \infty$.]
Limitations of the Matched Filter

- The matched filter and the CFAR matched filter are designed (and optimal) for Gaussian interference.
- Although they show a high probability of detection in the non-Gaussian case, they are unable to maintain the preset level of significance for small sample sizes.
- The matched filter fails when the interference/noise is non-Gaussian and the data size is small.
- The goal is to develop techniques which require little in the way of modelling and assumptions.
The Bootstrap Matched Filter

Step 0. Experiment. Run the experiment and collect the data $x_t$, $t = 0, \ldots, n-1$.

Step 1. Estimation. Compute the LSE $\hat\theta$ of $\theta$, $\hat\sigma_{\hat\theta}$, and
$$\hat{T} = \left. \frac{\hat\theta - \theta}{\hat\sigma_{\hat\theta}} \right|_{\theta = 0}.$$

Step 2. Resampling. Compute the residuals
$$\hat{w}_t = x_t - \hat\theta s_t, \qquad t = 0, \ldots, n-1,$$
and, after centring, resample the residuals, assumed to be i.i.d., to obtain $\hat{w}^*_t$, $t = 0, \ldots, n-1$.
The Bootstrap Matched Filter (Cont'd)

Step 3. Bootstrap test statistic. Compute new measurements
$$x^*_t = \hat\theta s_t + \hat{w}^*_t, \qquad t = 0, \ldots, n-1,$$
the LSE $\hat\theta^*$ based on the resamples $x^*_t$, $t = 0, \ldots, n-1$, and
$$\hat{T}^* = \frac{\hat\theta^* - \hat\theta}{\hat\sigma_{\hat\theta^*}}.$$

Step 4. Repetition. Repeat Steps 2 and 3 a large number of times to obtain $\hat{T}^*_1, \ldots, \hat{T}^*_N$.

Step 5. Bootstrap test. Sort $\hat{T}^*_1, \ldots, \hat{T}^*_N$ to obtain $\hat{T}^*_{(1)} \le \cdots \le \hat{T}^*_{(N)}$. Reject $H$ if $\hat{T} \ge \hat{T}^*_{(q)}$, where $q = \lfloor (1-\alpha)(N+1) \rfloor$.
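Steps 0-5 can be sketched as follows; the particular LSE of $\theta$ and the residual-based $\hat\sigma_{\hat\theta}$ are illustrative implementations of the quantities named above, not code from the tutorial:

```python
import numpy as np

rng = np.random.default_rng(5)

def bootstrap_mf_detect(x, s, alpha=0.05, n_boot=999):
    """Bootstrap matched filter: residual resampling gives the threshold."""
    n = len(x)
    theta_hat = s @ x / (s @ s)                    # Step 1: LSE of theta
    w_hat = x - theta_hat * s                      # Step 2: residuals
    w_hat = w_hat - w_hat.mean()                   # centre before resampling
    sd = np.std(w_hat, ddof=1) / np.sqrt(s @ s)
    t_obs = theta_hat / sd                         # T evaluated at theta = 0
    t_star = np.empty(n_boot)
    for b in range(n_boot):
        x_star = theta_hat * s + rng.choice(w_hat, size=n, replace=True)  # Step 3
        th_star = s @ x_star / (s @ s)
        sd_star = np.std(x_star - th_star * s, ddof=1) / np.sqrt(s @ s)
        t_star[b] = (th_star - theta_hat) / sd_star
    t_star.sort()                                  # Step 5: sort and threshold
    q = int((1 - alpha) * (n_boot + 1))
    return t_obs >= t_star[q - 1]

t = np.arange(10)
s = np.sin(2 * np.pi * t / 6)
detected = bootstrap_mf_detect(5 * s + rng.normal(size=10), s)
print(detected)
```

No Gaussian assumption is used anywhere: the threshold comes entirely from the resampled residuals, which is what makes the detector distribution-free.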
Simulation Results (Gaussian Case)

Let $s_t = \sin(2\pi t/6)$, $n = 10$, $N = 999$ (25 for variance estimation), $\alpha = 5\%$, and the number of independent runs be 5,000.

$\mathcal{N}(0, 1)$ noise, SNR = 7 dB:

Detector                   P̂f [%]   P̂d [%]
Matched Filter (MF)           5        99
CFAR MF                       5        98
Bootstrap (σ known)           4        99
Bootstrap (σ unknown)         5        98
Simulation Results (Non-Gaussian Case)

Let $s_t = \sin(2\pi t/6)$, $n = 10$, $N = 999$ (25 for variance estimation), $\alpha = 5\%$, and the number of independent runs be 5,000.

$W_t \sim \sum_{i=1}^{4} a_i\, \mathcal{N}(\mu_i, \sigma_i^2)$ with $a = (0.5, 0.1, 0.2, 0.2)$, $\mu = (-0.1, 0.2, 0.5, 1)$, $\sigma = (0.25, 0.4, 0.9, 1.6)$, and SNR = -6 dB:

Detector                   P̂f [%]   P̂d [%]
Matched Filter (MF)           8        83
CFAR MF                       7        77
Bootstrap (σ known)           5        77
Bootstrap (σ unknown)         7        79
Interpretation of the Results

- The bootstrap is able to maintain a constant false alarm rate while achieving a reasonably high detection power.
- These results can be improved in several ways, and the methods extended to the correlated-data case.
- The bootstrap is not proposed as an alternative to existing non-parametric/parametric signal detection schemes in the non-Gaussian interference case.
- The examples suffice to show the power of the bootstrap in signal detection when little is known about the distribution of the interference.
Detection of a Non-Gaussian Signal Common to Two Sensors

Problem: Detection of a non-Gaussian signal common to two sensors, embedded in interference which is either mutually independent at each sensor or has a vanishing cross bispectrum [Tugnait (1993), Ong et al. (1997)].

[Diagram: a source drives a noisy system; Sensor 1 receives $x(t)$ and Sensor 2 receives $y(t)$.]
Cross-Bispectral Detection Scheme

- Let $x_t$ and $y_t$ be two jointly stationary, zero-mean, discrete-time random processes modelling the sensor output signals.
- A detection scheme for the problem is based on testing the cross bispectrum of the sensor output signals for zero, i.e.
$$H: C(j, k) = 0, \qquad K: C(j, k) \ne 0, \qquad \forall (j, k) \in D_0,$$
where $C(j, k) = C_{xxy}(2\pi j/n, 2\pi k/n)$ is the cross bispectrum of $x_t$ and $y_t$ at the discrete frequencies $\omega_j = 2\pi j/n$, $\omega_k = 2\pi k/n$, and $D_0 = \{ (j, k) : |j| < k < n/2 \}$.
Limitations of Existing Methods

- Current detection methods [Tugnait (1993)] based on the cross bispectrum assume enough data are available for asymptotic results to apply.
- In some applications these assumptions are not valid, leading to degradation in the detector performance.
- Specifically, for small sample sizes, the probability of false alarm is not maintained at the nominal level.

Problem: Seek a solution in the case where the data size is small and asymptotic tests are inapplicable.
Principle of the Bootstrap Procedure

We use the following approximate regression:
$$I_{xxy}(j, k) = C_{xxy}(j, k) + \varepsilon(j, k)\, V(j, k),$$
where
$$I_{xxy}(j, k) = \frac{1}{n}\, d_x(j)\, d_x(k)\, \overline{d_y(j+k)}$$
and
$$V(j, k)^2 = n\, [1 + \delta(j - k)]\, C_{xx}(j)\, C_{xx}(k)\, C_{yy}(j + k).$$
This regression is consistent with asymptotic results.
Bootstrap Procedure (Cont'd)

Step 1. Calculate $I^{(i)}(j, k)$, $i = 1, \ldots, P$, and $\hat{C}(j, k) = \frac{1}{P} \sum_{i=1}^{P} I^{(i)}(j, k)$, $(j, k) \in D_0$.

Step 2. Form the residuals of the regression:
$$\hat\varepsilon^{(i)}(j, k) = \frac{I^{(i)}(j, k) - \hat{C}(j, k)}{\hat{V}^{(i)}(j, k)}.$$

Step 3. Repeat $N$ times (after mean-subtracting $\hat\varepsilon^{(i)}(j, k)$):
$$I^{(i)*}_b(j, k) = \hat{C}(j, k) + \hat\varepsilon^{(i)*}_b(j, k)\, \hat{V}^{(i)}(j, k),$$
$$\hat{C}^*_b(j, k) = \frac{1}{P} \sum_{i=1}^{P} I^{(i)*}_b(j, k), \qquad b = 1, \ldots, N.$$
Bootstrap Procedure (Cont'd)

Step 4. Calculate the test statistic
$$\hat{T} = \sum_{(j,k) \in D_0} \left| \frac{\hat{C}(j, k) - C_0(j, k)}{\hat\sigma(j, k)} \right|.$$

Step 5. Calculate the bootstrap statistics
$$\hat{T}^*_b = \sum_{(j,k) \in D_0} \left| \frac{\hat{C}^*_b(j, k) - \hat{C}(j, k)}{\hat\sigma^*_b(j, k)} \right|, \qquad b = 1, \ldots, N.$$

Step 6. Rank $\hat{T}^*_1, \ldots, \hat{T}^*_N$ to obtain $\hat{T}^*_{(1)} \le \ldots \le \hat{T}^*_{(N)}$.

Step 7. Reject the null hypothesis if $\hat{T} > \hat{T}^*_{(\lfloor (N+1)(1-\alpha) \rfloor)}$.
GLRT Simulation Results

Interference                        % False Alarms   % Detected
Common Gaussian       i.i.d.              5              99
interference          AR(1)               6              99
                      AR(5)               6              99
Independent           i.i.d.             31              99
exponential           AR(1)              30              99
interference          AR(5)              28              99

Detection results for a common MA(10) exponential signal in Gaussian and non-Gaussian interference for $n = 512$ and $\alpha = 5\%$.
Bootstrap Simulation Results (Cont'd)

Interference                        % False Alarms   % Detected
Common Gaussian       i.i.d.              4              93
interference          AR(1)               1              94
                      AR(5)               1              93
Independent           i.i.d.              8              94
exponential           AR(1)               1              93
interference          AR(5)               1              95

Detection results for a common MA(10) exponential signal in Gaussian and non-Gaussian interference for $n = 512$ and $\alpha = 5\%$.
Detection of a Non-Gaussian Signal Common to Multiple Sensors

Problem: How can the detection scheme that uses measurements from two sensors be extended when measurements are available from multiple sensors, so that a higher probability of detection is achieved?

Relevance: This problem is important in array processing applications such as sonar, radar, and communications.

Solution: We propose the use of the Bonferroni test of multiple hypotheses coupled with a bootstrap method [Ong & Zoubir (1997)].
Multiple Tests

- Let $x_{1,t}, \ldots, x_{L+1,t}$, $t = 0, \ldots, n-1$, be the discrete-time measurements from $L+1$ sensors.
- They are assumed to be zero-mean, jointly stationary random sequences.
- A non-Gaussian signal common to any two adjacent sensors can be detected by testing the multiple hypotheses
$$H_l: C_{x_l x_l x_{l+1}}(j, k) \equiv 0, \qquad K_l: C_{x_l x_l x_{l+1}}(j, k) \not\equiv 0, \qquad l = 1, \ldots, L,$$
where $C_{x_l x_l x_{l+1}}(j, k)$ is the cross bispectrum of $x_{l,t}$ and $x_{l+1,t}$ evaluated at the discrete bifrequencies $(2\pi j/n, 2\pi k/n)$.
Multiple Tests (Cont'd)

1. Apply a detection method for two sensors on $x_{l,t}$ and $x_{l+1,t}$ for $l = 1, \ldots, L$.
2. Form a set of $L$ p-values, $P_1, \ldots, P_L$.
3. Compare the minimum p-value, $P_{(1)}$, to $\alpha/L$, where $\alpha$ is the nominal test level.
4. If $P_{(1)} < \alpha/L$, conclude that a non-Gaussian signal is present in at least two adjacent sensors.

Using the Bonferroni level, $\alpha/L$, limits the global level to $\alpha$. This procedure is equivalent to running the two-sensor tests at level $\alpha/L$ instead of $\alpha$ and rejecting if any test rejects.
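The Bonferroni step itself is a one-liner; the p-values would come from the two-sensor bootstrap test, and the numbers below are made up purely for illustration:

```python
import numpy as np

def bonferroni_detect(p_values, alpha=0.05):
    """Declare a detection if the smallest p-value is below alpha / L."""
    p = np.asarray(p_values)
    return bool(p.min() < alpha / len(p))

# L = 3 adjacent-sensor tests at nominal global level alpha = 0.05
print(bonferroni_detect([0.30, 0.004, 0.20]))  # 0.004 < 0.05/3: signal declared
print(bonferroni_detect([0.30, 0.040, 0.20]))  # 0.040 > 0.05/3: no detection
```

Note that 0.040 would reject at the per-test level 0.05 but not at the Bonferroni level 0.05/3, which is exactly how the global false alarm rate is kept at $\alpha$.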
Simulation Results

Detection of an MA(10) exponential signal in AR(1) (left plot) and AR(5) (right plot) Gaussian interference (512 data points):

[Figure: two plots of % detected vs. SNR (dB), each with curves for 2, 3 and 5 sensors.]

In all cases, % false alarms = 0 (100 simulation runs).
Simulation Results (Cont'd)
Dete tion of an MA(10) exponential signal in AR(1) (left plot) andAR(5) (right plot) exponential interferen e (512 data points):
[Figure: detection rate (%) versus SNR (dB), from −10 dB to 6 dB, for 2, 3 and 5 sensors; left plot, AR(1) interference; right plot, AR(5) interference.]
In all cases, the false-alarm rate was below 3% (100 simulation runs).
Outline
• Hypothesis Testing with the Bootstrap
  – An Example: Testing the Frequency Response for Zero
• Signal Detection
  – Examples
    1. Bootstrap Matched Filter
    2. Detection of a Non-Gaussian Signal at Multiple Sensors
• Model Selection
  – Linear Models
Model Selection

• Several model selection procedures exist. They include Akaike's information criterion, Rissanen's minimum description length criterion, and Hannan and Quinn's criterion.
• Bootstrap methods for model selection are simple and computationally efficient.
• If one uses a bootstrap approach for model selection and for the subsequent inference, the bootstrap observations generated for model selection can also be used in the inference procedure.
• Thus, model selection comes at no extra computational cost.
Model Selection in Linear Models

Consider the linear model

  Y_t = x_t' b + Z_t,   t = 0, ..., n−1,

where Z_t is a set of i.i.d. random variables of unknown distribution with μ_Z = 0 and σ_Z² > 0, and b is an unknown p-vector parameter. Here, x_t is the t-th value of the p-vector of explanatory variables, assumed to be known.

With Y = (Y_0, ..., Y_{n−1})', x = (x_0, ..., x_{n−1})', b = (b_1, ..., b_p)' and Z = (Z_0, ..., Z_{n−1})', we have

  Y = x b + Z.
Model Selection in Linear Models (Cont'd)

With α being a subset of {1, ..., p}, the model corresponding to α is given by

  Y = x_α b_α + Z,

where b_α is the sub-vector of b containing the components of b indexed by the integers in α, and x_α is the matrix containing the columns of x indexed by the integers in α.

Problem: Estimate α_0 based on y_0, ..., y_{n−1}, where α_0 is such that b_{α_0} contains all the non-zero components of b, and only those.
Model Selection in Linear Models (Cont'd)

We consider an estimator of the mean-squared error given by

  Γ_n(α) = (1/n) ∑_{t=0}^{n−1} (Y_t − x_{α,t}' b̂_α)² = ||Y − x_α b̂_α||² / n,

with x_{α,t}' being the t-th row of x_α and b̂_α the LSE of b_α. Then

  E[Γ_n(α)] = σ_Z² − σ_Z² p_α/n + Δ_n(α),

where Δ_n(α) = n^{−1} μ'(I − x_α(x_α' x_α)^{−1} x_α')μ, with μ = E[Y], h_α = x_α(x_α' x_α)^{−1} x_α' the n × n projection matrix and p_α the size of b_α. If α is correct, then Δ_n(α) = 0.

Model Selection: Minimise E[Γ_n(α)] over α.
Model Selection in Linear Models (Cont'd)

• A bootstrap model selection approach would minimise over α

  Γ̃_n(α) = (1/n) ∑_{t=0}^{n−1} E*[(y_t − x_{α,t}' b̂*_α)²] = E*[ ||y − x_α b̂*_α||² / n ],

where b̂*_α is the LSE based on (y*_t, x_{α,t}).

• The estimator Γ̃_n(α) is biased. A better estimator is given by

  Γ̂*_{n,m}(α) = E*[ ||y − x_α b̂*_{α,m}||² / n ],

where b̂*_{α,m} is obtained from y*_t = x_{α,t}' b̂_α + ẑ*_t, t = 0, ..., n−1, with ẑ*_t being a resample from √(n/m)(ẑ_t − z̄)/√(1 − p/n) and z̄ = n^{−1} ∑_{t=0}^{n−1} ẑ_t.
Model Selection in Linear Models (Cont'd)

Step 1. Based on y_0, ..., y_{n−1}, compute the LSE b̂ and the residuals ẑ_t = y_t − x_t' b̂, t = 0, ..., n−1, where α = {1, ..., p}.
Step 2. Resample with replacement from √(n/m)(ẑ_t − z̄)/√(1 − p/n) to obtain ẑ*_t, where m/n → 0 and (n/m) max_{t≤n} h_{α,t} → 0 for all α in the class of models to be selected.
Step 3. Compute y*_t = x_{α,t}' b̂_α + ẑ*_t, t = 0, ..., n−1, and the LSE b̂*_{α,m} from (y*_t, x_{α,t}).
Step 4. Repeat Steps 2–3 to obtain b̂*(i)_{α,m} and Γ̂*(i)_{n,m}(α) for i = 1, ..., N.
Step 5. Average Γ̂*(i)_{n,m}(α) over i = 1, ..., N and minimise over α to obtain α̂_0.
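Steps 1 to 5 can be sketched as follows. The function name, the use of numpy.linalg.lstsq for the LSE, and the representation of a candidate model α as a tuple of column indices are my choices for illustration, not part of the original procedure:

```python
import numpy as np

def bootstrap_model_select(y, x, candidates, m, N=100, rng=None):
    """m-out-of-n residual bootstrap model selection for Y = x b + Z.
    `candidates` is a list of index tuples, one per candidate model."""
    rng = np.random.default_rng(rng)
    n, p = x.shape
    # Step 1: full-model LSE and centred, inflated residuals
    b_full, *_ = np.linalg.lstsq(x, y, rcond=None)
    z = y - x @ b_full
    z_scaled = np.sqrt(n / m) * (z - z.mean()) / np.sqrt(1 - p / n)
    scores = {}
    for alpha in candidates:
        xa = x[:, list(alpha)]
        ba, *_ = np.linalg.lstsq(xa, y, rcond=None)
        total = 0.0
        for _ in range(N):
            # Steps 2-3: resample residuals, rebuild y*, refit the model
            z_star = rng.choice(z_scaled, size=n, replace=True)
            ba_star, *_ = np.linalg.lstsq(xa, xa @ ba + z_star, rcond=None)
            # Step 4: score the refitted coefficients on the observed data
            total += np.sum((y - xa @ ba_star) ** 2) / n
        scores[alpha] = total / N  # Step 5: average over the N resamples
    return min(scores, key=scores.get), scores
```

The returned dictionary makes it easy to inspect the averaged criterion Γ̂*_{n,m}(α) for every candidate, not just the winner.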
Example: Trend Estimation

Let x_t = (1, t, t², t³)', t = 0, ..., 63, and b = (0, 0, 0.035, −0.0005)'. We simulate Y_t = x_t' b + Z_t by adding standard normal and t_3-distributed noise. With N = 100 and m = 2 we obtain the following results (based on 1,000 simulations).

                          N(0,1)            t_3
  Model α            Γ̂*   AIC   MDL    Γ̂*   AIC   MDL
  (0, 0, b_2, b_3)   100    91    98    99    89    98
  (0, b_1, b_2, b_3)   0     5     1     1     5     1
  (b_0, 0, b_2, b_3)   0     3     1     0     3     1
  (b_0, b_1, b_2, b_3) 0     2     0     0     3     0

Empirical probabilities (%), excluding models not selected by any method.
Outline
• Model Selection
  – Non-Linear Models
  – Order Selection in Autoregressive Models
• More Applications in Signal Processing
  – Confidence Intervals for Spectra
  – Bispectrum-Based Gaussianity Tests
  – Noise Floor Estimation in Radar
  – Confidence Intervals for Flight Parameters in Passive Sonar
  – Model Selection of Polynomial Phase Signals
• Summary
• Acknowledgements
Model Selection in Non-Linear Models

We define a nonlinear model by

  Y_t = g(x_t, b) + Z_t,   t = 0, ..., n−1,

where Z_t is a noise sequence of i.i.d. random variables of unknown distribution with μ_Z = 0 and σ_Z² > 0, and g is a known function.

Define the collection of subsets of {1, ..., p} by B, and g_{α,t}(b_α) = g_α(x_{α,t}, b_α), where α ∈ B and g_α is the restriction of the function g to the admissible set of (x_{α,t}, b_α). Let B̃ be the admissible set for b.

With ġ(c) = ∂g(c)/∂c and M_α(c) = ∑_{t=0}^{n−1} ġ_{α,t}(c) ġ_{α,t}(c)', a consistent bootstrap procedure for selecting α is as follows.
Model Selection in Non-Linear Models (Cont'd)

Step 1. With y_t, t = 0, ..., n−1, find b̂_α, the solution of ∑_{t=0}^{n−1} (y_t − g_{α,t}(c)) ġ_{α,t}(c) = 0 for c ∈ B̃, and the residuals ẑ_t = y_t − g_{α,t}(b̂_α), t = 0, ..., n−1.
Step 2. Obtain ẑ*_t by resampling from √(n/m)(ẑ_t − z̄)/√(1 − p/n).
Step 3. Compute b̂*_{α,m} = b̂_α + M_α(b̂_α)^{−1} ∑_{t=0}^{n−1} ẑ*_t ġ_{α,t}(b̂_α).
Step 4. Repeat Steps 2–3 to obtain b̂*(i)_{α,m}, i = 1, ..., N.
Step 5. To find α̂_0, minimise over α

  Γ̂*_{n,m}(α) = N^{−1} ∑_{i=1}^{N} ∑_{t=0}^{n−1} (y_t − g_{α,t}(b̂*(i)_{α,m}))² / n.
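The one-step update of Step 3 can be sketched as follows, assuming the gradient rows ġ_{α,t}(b̂_α) have already been evaluated; the function and argument names are hypothetical:

```python
import numpy as np

def one_step_bootstrap_estimates(b_hat, grad, residuals, m, N=100, rng=None):
    """One-step bootstrap replicates for a nonlinear LSE (Steps 2-4):
    b* = b_hat + M(b_hat)^{-1} * sum_t z*_t g_dot_t(b_hat), where `grad`
    holds the n x p gradient rows g_dot_t(b_hat)."""
    rng = np.random.default_rng(rng)
    n, p = grad.shape
    M = grad.T @ grad                                   # M_alpha(b_hat)
    z = np.sqrt(n / m) * (residuals - residuals.mean()) / np.sqrt(1 - p / n)
    reps = []
    for _ in range(N):
        z_star = rng.choice(z, size=n, replace=True)    # Step 2: resample
        reps.append(b_hat + np.linalg.solve(M, grad.T @ z_star))  # Step 3
    return np.array(reps)                               # Step 4: N replicates
```

Avoiding a full nonlinear refit on every resample is exactly what makes the one-step update attractive here: each replicate costs only one linear solve.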
Example: Oscillations in Noise

Let Y_t = cos ω_1 t (1 + cos ω_2 t) + Z_t, t = 0, ..., 39. In this case B = {α_k, k = 1, 2, 3}, so that g_{α_1,t}(b_{α_1}) = 2 cos ω_1 t (ω_2 = 0), g_{α_2,t}(b_{α_2}) = 1 + cos ω_2 t (ω_1 = 0), and g_{α_3,t}(b_{α_3}) = cos ω_1 t (1 + cos ω_2 t) (ω_1, ω_2 ≠ 0).

We chose ω_1 = 0.2π and ω_2 = 0.1π and ran simulations at −1.2 dB SNR with m = 35.

  Method      α_1   α_2   α_3
  Bootstrap     3     0    97
  AIC           0     3    97
  MDL           0     5    95

Empirical probabilities (%) based on 100 simulations.
Order Selection in Autoregressive Models

Consider

  Y_t = b_1 Y_{t−1} + b_2 Y_{t−2} + ··· + b_p Y_{t−p} + Z_t,   t ∈ Z,

where p is the order, b_k, k = 1, ..., p, are unknown parameters and Z_t are i.i.d. random variables with μ_Z = 0 and σ_Z² > 0. Let (y_{−p}, ..., y_{−1}, y_0, ..., y_{n−1}) be the observations and b̂ the LSE of b = (b_1, ..., b_p)'.

Select a model α from B = {1, ..., p}, where each α corresponds to the autoregressive model of order α, i.e., Y_t = b_1 Y_{t−1} + ··· + b_α Y_{t−α} + Z_t.

Objective: Find the optimal order, i.e., α_0 = max{k : 1 ≤ k ≤ p, b_k ≠ 0}, where p is the largest candidate order.
Order Selection in Autoregressive Models (Cont'd)

Step 1. Resample the residuals (ẑ_t − z̄) to obtain ẑ*_t.
Step 2. Find b̂*_{α,m}, the LSE of b_α under α, from y*_t = ∑_{k=1}^{α} b̂_k y*_{t−k} + ẑ*_t for t = −p, ..., m−1, with m replacing n and (y*_{−p}, ..., y*_0) replacing (y*_{−2p}, ..., y*_{−p−1}).
Step 3. Repeat Steps 1–2 to obtain b̂*(1)_{α,m}, ..., b̂*(N)_{α,m} and

  Γ̂*_{n,m}(α) = N^{−1} ∑_{i=1}^{N} ∑_{t=0}^{n−1} (y_t − ∑_{k=1}^{α} y_{t−k} b̂*(i)_{k,m})² / n.

Step 4. Minimise Γ̂*_{n,m}(α) over α to find α̂_0.

The procedure is consistent for m → ∞ and m/n → 0 as n → ∞.
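A rough sketch of this m-out-of-n residual bootstrap for AR order selection might look like the following; the helper names and the least-squares fitting route are illustrative, and the initial conditions are handled more simply than in the slide:

```python
import numpy as np

def ar_fit(y, order):
    """Least-squares fit of an AR(order) model; returns (b_hat, residuals)."""
    X = np.column_stack([y[order - k : len(y) - k] for k in range(1, order + 1)])
    target = y[order:]
    b, *_ = np.linalg.lstsq(X, target, rcond=None)
    return b, target - X @ b

def bootstrap_ar_order(y, p_max, m, N=50, rng=None):
    """Select the AR order minimising the averaged bootstrap criterion."""
    rng = np.random.default_rng(rng)
    n = len(y)
    scores = []
    for order in range(1, p_max + 1):
        b, z = ar_fit(y, order)
        zc = z - z.mean()
        total = 0.0
        for _ in range(N):
            # Steps 1-2: resample residuals, generate a length-m series
            z_star = rng.choice(zc, size=m, replace=True)
            y_star = list(y[:order])          # simple initial conditions
            for t in range(m):
                y_star.append(np.dot(b, y_star[-1:-order - 1:-1]) + z_star[t])
            b_star, _ = ar_fit(np.array(y_star), order)
            # Step 3: score the refitted coefficients on the full data
            X = np.column_stack([y[order - k : n - k] for k in range(1, order + 1)])
            total += np.mean((y[order:] - X @ b_star) ** 2)
        scores.append(total / N)
    return int(np.argmin(scores)) + 1, scores  # Step 4: minimise over alpha
```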
Example: Order Selection in an AR Model

We consider determining the order of the process described by

  Y_t = −0.4 Y_{t−1} + 0.2 Y_{t−2} + Z_t,   t ∈ Z,

where Z_t is a standard Gaussian variable. With n = 128 we obtained:

  Method      α = 1   α = 2   α = 3   α = 4
  Bootstrap    28.0    65.0     5.0     2.0
  AIC          17.8    62.4    12.6     7.2
  MDL          43.2    54.6     2.1     0.1

Empirical probabilities (%) of selecting the true AR model, α_0 = 2, with n = 128 and m = 40, based on 1,000 simulations.
More Applications in Signal Processing

The relatively simple examples have shown that the bootstrap is powerful. This suggests its use in real-life applications and more complex problems. We now show how the bootstrap can be applied to:

1. Confidence Intervals for Spectra [Franke & Härdle (1992), Politis et al. (1992), Zoubir & Iskander (1996)]
2. Bispectrum-Based Gaussianity Tests [Zoubir & Iskander (1996, 1999)]
3. Noise Floor Estimation in Radar [Zoubir & Boashash (1996)]
4. Confidence Intervals for Flight Parameters in Passive Sonar [Reid et al. (1996), Zoubir & Boashash (1998)]
5. Model Selection of Polynomial Phase Signals [Zoubir & Iskander (1998, 1999)]
Confidence Intervals for Spectra

• Several methods for spectrum estimation exist [Brillinger (1981), Priestley (1981), Marple (1987), Kay (1989)].
• It is often desirable to provide a confidence interval for the spectrum, based on the estimate, as an accuracy measure.
• We present three different bootstrap-based methods for estimating confidence bands for the power spectrum.
• These methods are compared with the chi-squared approximation for both Gaussian and non-Gaussian weakly dependent time series.
Confidence Intervals for Spectra (Cont'd)

Let X_0, ..., X_{n−1} be observations from a strictly stationary real-valued time series X_t with μ_X = 0, σ_X² > 0 and spectral density

  C_XX(ω) = (1/2π) ∑_{τ=−∞}^{∞} E[X_0 X_{|τ|}] e^{−jτω}.

Denote the periodogram [Brillinger (1981), Marple (1987)] by

  I_XX(ω) = (1/2πn) | ∑_{k=0}^{n−1} X_k e^{jkω} |²,   −π ≤ ω ≤ π.
Confidence Intervals for Spectra (Cont'd)

We will consider a kernel estimate of C_XX(ω),

  Ĉ_XX(ω; h) = (1/nh) ∑_{k=−M}^{M} K((ω − ω_k)/h) I_XX(ω_k),

where
• K(·) is symmetric and nonnegative (here the Bartlett–Priestley window [Priestley (1981)]),
• h is its bandwidth,
• M denotes the largest integer ≤ n/2, and
• ω_k = 2πk/n, −M ≤ k ≤ M.
Confidence Intervals for Spectra (Cont'd)

Asymptotically, for large n, Ĉ_XX(ω_1), ..., Ĉ_XX(ω_M) are independent variates, Ĉ_XX(ω_k) being distributed as C_XX(ω_k) χ²_{4m+2}/(4m+2), k = 1, ..., M, with m = ⌊(hn−1)/2⌋.

A 100γ% confidence interval [Brillinger (1981)] is given by

  (4m+2) Ĉ_XX(ω; h)/χ²_{4m+2}((1+γ)/2) < C_XX(ω) < (4m+2) Ĉ_XX(ω; h)/χ²_{4m+2}((1−γ)/2),

where χ²_ν(δ) is such that Pr[χ²_ν < χ²_ν(δ)] = δ.

An alternative method, proposed by Franke & Härdle (1992), is presented below. It exploits the approximate regression ε_k = I_XX(ω_k)/C_XX(ω_k), k = 1, ..., M.
Confidence Intervals for Spectra (Cont'd)

Step 1. Compute residuals. Choose an initial bandwidth h_i > 0 which does not depend on ω and compute

  ε̂_k = I_XX(ω_k)/Ĉ_XX(ω_k; h_i),   k = 1, ..., M.

Step 2. Rescaling. Rescale the empirical residuals to

  ε̃_k = ε̂_k/ε̄,   k = 1, ..., M,   ε̄ = (1/M) ∑_{j=1}^{M} ε̂_j.

Step 3. Resampling. Draw independent bootstrap residuals ε̃*_1, ..., ε̃*_M from the empirical distribution of ε̃_1, ..., ε̃_M.
Confidence Intervals for Spectra (Cont'd)

Step 4. Bootstrap estimates. With a bandwidth g, find

  I*_XX(ω_k) = I*_XX(−ω_k) = Ĉ_XX(ω_k; g) ε̃*_k,   k = 1, ..., M,
  Ĉ*_XX(ω; h) = (1/nh) ∑_{k=−M}^{M} K((ω − ω_k)/h) I*_XX(ω_k).

Step 5. Confidence band estimation. Repeat Steps 3–4 and find ξ_U (and proceed similarly for ξ_L) such that

  Pr*( √(nh) (Ĉ*_XX(ω; h) − Ĉ_XX(ω; g))/Ĉ_XX(ω; g) ≤ ξ_U ) = β.

That is, {1 + ξ_U (nh)^{−1/2}}^{−1} Ĉ_XX(ω; h) is the upper bound of a (1 − 2β)·100% confidence interval for C_XX(ω).
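A compact sketch of this residual-based scheme follows. A simple box kernel stands in for the Bartlett–Priestley window, the initial bandwidth h_i is taken equal to h, and the √(nh) scaling is folded directly into the resampled ratios; all of these are simplifications of mine, not the tutorial's:

```python
import numpy as np

def periodogram(x):
    """Periodogram |FFT|^2 / (2*pi*n) at the nonnegative Fourier frequencies."""
    n = len(x)
    return np.abs(np.fft.rfft(x)) ** 2 / (2 * np.pi * n)

def kernel_smooth(I, h):
    """Box-kernel smoother over discrete frequencies with reflecting ends."""
    w = max(1, int(h * len(I)))
    pad = np.r_[I[w:0:-1], I, I[-2:-w - 2:-1]]
    return np.convolve(pad, np.ones(2 * w + 1) / (2 * w + 1), mode="valid")

def spectrum_ci(x, h=0.05, g=0.1, N=199, alpha=0.05, rng=None):
    """Residual-based bootstrap confidence band for the spectrum (a sketch
    of the Franke & Haerdle scheme with h_i = h)."""
    rng = np.random.default_rng(rng)
    I = periodogram(x)[1:]              # drop the DC bin
    C_h = kernel_smooth(I, h)           # Step 1: initial estimate
    eps = I / C_h
    eps = eps / eps.mean()              # Step 2: rescale residuals
    C_g = kernel_smooth(I, g)           # oversmoothed estimate for Step 4
    ratios = []
    for _ in range(N):
        eps_star = rng.choice(eps, size=eps.size, replace=True)  # Step 3
        C_star = kernel_smooth(C_g * eps_star, h)                # Step 4
        ratios.append((C_star - C_g) / C_g)
    lo = np.quantile(ratios, alpha / 2, axis=0)    # Step 5: percentile bounds
    hi = np.quantile(ratios, 1 - alpha / 2, axis=0)
    return C_h / (1 + hi), C_h / (1 + lo)          # lower and upper band
```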
Simulation Results

Consider the AR process

  X_t = 0.5 X_{t−1} − 0.6 X_{t−2} + 0.3 X_{t−3} − 0.4 X_{t−4} + 0.2 X_{t−5} + ε_t,

where ε_t is an i.i.d. N(0, 1) process. With n = 256, N = 399 and h = 0.1, we obtain the following 95% confidence intervals.

[Figure: sample power spectrum with 95% confidence bands; left plot, residuals-based method; right plot, χ² approximation.]
Simulation Results (Cont'd)

Consider the AR process

  Y_t = Y_{t−1} − 0.7 Y_{t−2} − 0.4 Y_{t−3} + 0.6 Y_{t−4} − 0.5 Y_{t−5} + η_t,

where η_t is an i.i.d. U(−2.5, 2.5) process. With n = 256, N = 399 and h = 0.1, we obtain the following 95% confidence intervals.

[Figure: sample power spectrum (log scale) with 95% confidence bands; left plot, residuals-based method; right plot, χ² approximation.]
The Block of Blocks Bootstrap

The block of blocks bootstrap, initially suggested in [Künsch (1989)], was proposed for setting confidence bands for spectra in [Politis & Romano (1992)]. An application of the block of blocks bootstrap to higher-order cumulants can be found in [Zhang et al. (1993)].
[Figure: principle of the block of blocks bootstrap; the data X are divided into Q length-L segments (overlap M), and the resulting estimates are in turn grouped into q blocks of l estimates (overlap h).]
Block of Blocks Bootstrap for Spectra

Step 1. First block. Given X_0, ..., X_{n−1}, obtain Q overlapping (0 < M < L) or non-overlapping (M = 0) segments of L samples each, and estimate Ĉ(i)_XX(ω), i = 1, ..., Q.
Step 2. Second block. Divide Ĉ(1)_XX(ω), ..., Ĉ(Q)_XX(ω) into q overlapping (0 < h < l) or non-overlapping (h = 0) blocks, say C_j, j = 1, ..., q, each containing l estimates.
Step 3. Resampling. Generate k bootstrap samples y*_1, ..., y*_k, of size l each, from C_1, ..., C_q.
Step 4. Reshaping. Concatenate y*_1, ..., y*_k into a vector Y* and estimate Ĉ*_XX(ω).
Step 5. Confidence interval. Repeat Steps 3–4 and proceed as before to obtain a confidence interval for C_XX(ω).
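The two-level segmentation and resampling of Steps 1 to 4 can be sketched as follows; a mean-power statistic stands in for the per-segment spectral estimate, purely for illustration:

```python
import numpy as np

def block_of_blocks(x, L, M, l, h, k, rng=None):
    """Sketch of block-of-blocks resampling: split the data into length-L
    segments (overlap M), form per-segment statistics, group them into
    blocks of l (overlap h), then resample k blocks and concatenate."""
    rng = np.random.default_rng(rng)
    n = len(x)
    # Step 1: first-level segmentation and per-segment statistics
    segs = [x[i : i + L] for i in range(0, n - L + 1, L - M)]
    stats = np.array([np.mean(s ** 2) for s in segs])  # stand-in statistic
    # Step 2: second-level blocks of l consecutive statistics
    blocks = [stats[j : j + l] for j in range(0, len(stats) - l + 1, l - h)]
    # Steps 3-4: draw k blocks with replacement and concatenate
    picks = rng.integers(0, len(blocks), size=k)
    return np.concatenate([blocks[j] for j in picks])
```

Repeating the draw-and-concatenate step and recomputing the statistic of interest on each concatenated vector yields the bootstrap distribution used in Step 5.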
Simulation Results

Consider the AR process driven by an i.i.d. N(0, 1) process:

  X_t = 0.5 X_{t−1} − 0.6 X_{t−2} + 0.3 X_{t−3} − 0.4 X_{t−4} + 0.2 X_{t−5} + ε_t.

With n = 2,000, L = 128, M = 20, l = 6, h = 2 and N = 100, we obtain the following 95% confidence intervals, based on 100 runs.

[Figure: sample power spectrum with 95% confidence bands; left plot, block of blocks bootstrap method; right plot, circular block bootstrap method.]
Simulation Results (Cont'd)

Consider the AR process driven by an i.i.d. U(−2.5, 2.5) process:

  Y_t = Y_{t−1} − 0.7 Y_{t−2} − 0.4 Y_{t−3} + 0.6 Y_{t−4} − 0.5 Y_{t−5} + η_t.

With n = 2,000, L = 128, M = 20, l = 6, h = 2 and N = 100, we obtain the following 95% confidence intervals, based on 100 runs.

[Figure: sample power spectrum (log scale) with 95% confidence bands; left plot, block of blocks bootstrap method; right plot, circular block bootstrap method.]
Testing for Departure from Gaussianity

• Tests for departure from Gaussianity [Subba Rao & Gabr (1980), Hinich (1982)] have received considerable interest among signal processing practitioners [Swami et al. (1995)].
• A limitation of these tests is the large amount of data required for the asymptotic distribution of the test statistics to hold.
• The bootstrap can be used to test for departure from Gaussianity with high power, while maintaining the level of significance, even for small sample sizes [Zoubir & Iskander (1999)].
• The bootstrap can also be used to set confidence bands for the bicoherence [Zoubir & Iskander (1996)].
Testing for Departure from Gaussianity (Cont'd)

• Let X_0, ..., X_{n−1} be observations from a strictly stationary real-valued time series X_t, t ∈ Z, with μ_X = 0, σ_X² > 0 and bispectrum C_XXX(ω_j, ω_k), −π ≤ ω_j, ω_k ≤ π.
• For a Gaussian process (as well as any other process with a symmetric distribution), C_XXX(ω_j, ω_k) ≡ 0. Testing the bispectrum for zero may therefore be seen as testing for departure from Gaussianity.
• Rejection of the hypothesis implies that the process is non-Gaussian; otherwise, it may still be non-Gaussian.
• Due to symmetry, we need to test C_XXX(ω_j, ω_k) for zero only when ω_j and ω_k are restricted to 0 ≤ ω_k ≤ ω_j, ω_k + 2ω_j ≤ 2π.
Testing for Departure from Gaussianity (Cont'd)

Divide the observations X_t, t = 0, ..., T−1, into P non-overlapping segments of n consecutive measurements and calculate the biperiodogram I(i)_XXX(ω_j, ω_k) for each segment i = 1, ..., P,

  I(i)_XXX(ω_j, ω_k) = (1/n) d(i)_X(ω_j) d(i)_X(ω_k) d̄(i)_X(ω_j + ω_k),   −π ≤ ω_j, ω_k ≤ π,

where d(i)_X(ω) is the finite Fourier transform of the i-th segment and d̄(i)_X is its complex conjugate.

An estimate of C_XXX(ω_j, ω_k) is obtained through

  Ĉ_XXX(ω_j, ω_k) = (1/P) ∑_{i=1}^{P} I(i)_XXX(ω_j, ω_k).
Testing for Departure from Gaussianity (Cont'd)

Our procedure is based on the approximate regression

  I(i)_XXX(ω_j, ω_k) = C_XXX(ω_j, ω_k) + ε_{j,k} V(ω_j, ω_k),   (j, k) ∈ D,

with

  V(ω_j, ω_k)² = n C_XX(ω_j) C_XX(ω_k) C_XX(ω_j + ω_k) × [1 + δ(j − k) + δ(n − 2j − k) + 4δ(n − 3j)δ(n − 3k)].

Herein, C_XX(ω) is the spectrum of X_t, δ(k) is Kronecker's delta function, ω_j = 2πj/n and ω_k = 2πk/n are discrete frequencies, and D = {0 < k ≤ j, 2j + k ≤ n}.

We shall assume that the ε_{j,k} are independent and identically distributed random variates, which holds for reasonably large n.
Testing for Departure from Gaussianity (Cont'd)

Step 1. Calculate I(i)_XX(ω_j), I(i)_XXX(ω_j, ω_k), Ĉ_XX(ω_j), Ĉ_XXX(ω_j, ω_k), σ̂(ω_j, ω_k) (using the bootstrap) and

  C̃ = ∑_{(j,k)∈D} |Ĉ_XXX(ω_j, ω_k)| / σ̂(ω_j, ω_k).

Step 2. For each segment, estimate the residuals

  ε̂(i)_{j,k} = (I(i)_XXX(ω_j, ω_k) − Ĉ_XXX(ω_j, ω_k)) / V̂(ω_j, ω_k),   (j, k) ∈ D.

Step 3. Centre the residuals to obtain ε̃(i)_{j,k} = ε̂(i)_{j,k} − ε̄(i), i = 1, ..., P, where ε̄(i) is the average over all ε̂(i)_{j,k}.
Testing for Departure from Gaussianity (Cont'd)

Step 4. Draw independent bootstrap residuals ε̃*(i)_{j,k}.
Step 5. Compute the bootstrap biperiodogram ordinates

  I*(i)_XXX(ω_j, ω_k) = Ĉ_XXX(ω_j, ω_k) + ε̃*(i)_{j,k} V̂(ω_j, ω_k).

Step 6. Obtain the bootstrap bispectral estimate

  Ĉ*_XXX(ω_j, ω_k) = (1/P) ∑_{i=1}^{P} I*(i)_XXX(ω_j, ω_k).

Step 7. Compute the statistic

  C̃* = ∑_{(j,k)∈D} |Ĉ*_XXX(ω_j, ω_k) − Ĉ_XXX(ω_j, ω_k)| / σ̂*(ω_j, ω_k).
Testing for Departure from Gaussianity (Cont'd)

Step 8. Repeat Steps 4–7 a large number of times to obtain a total of N bootstrap statistics* C̃*_1, ..., C̃*_N.
Step 9. Rank the collection C̃*_1, ..., C̃*_N in increasing order to obtain C̃*_(1) ≤ ··· ≤ C̃*_(N). Reject the hypothesis of Gaussianity at level α if C̃ > C̃*_(q), where q = ⌊(N + 1)(1 − α)⌋.

*σ̂*(ω_j, ω_k)² is obtained as follows. For each I*_XXX(ω_j, ω_k), repeat Steps 3–6 (nested bootstrap) a small number of times (e.g., B = 25), replacing Ĉ_XXX(ω_j, ω_k) and I*(i)_XXX(ω_j, ω_k) by Ĉ*_XXX(ω_j, ω_k) and I**(i)_XXX(ω_j, ω_k), respectively. Then compute

  σ̂*(ω_j, ω_k)² = (1/(B−1)) ∑_{b=1}^{B} ( Ĉ**(b)_XXX(ω_j, ω_k) − (1/B) ∑_{b'=1}^{B} Ĉ**(b')_XXX(ω_j, ω_k) )²,

where Ĉ**(b)_XXX(ω_j, ω_k), b = 1, ..., B, is a bootstrap version of Ĉ*_XXX(ω_j, ω_k).
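The decision rule of Steps 8 and 9 is generic to bootstrap tests and can be sketched on its own; the function name is hypothetical and the bootstrap statistics are assumed to have been computed already:

```python
import numpy as np

def bootstrap_reject(stat, boot_stats, alpha=0.05):
    """Steps 8-9: rank the N bootstrap statistics and reject at level
    alpha if the observed statistic exceeds the q-th order statistic,
    with q = floor((N + 1) * (1 - alpha))."""
    s = np.sort(np.asarray(boot_stats, dtype=float))
    N = s.size
    q = int(np.floor((N + 1) * (1 - alpha)))
    return bool(stat > s[q - 1])   # s[q - 1] is the q-th order statistic
```

With N = 99 and α = 0.05, for instance, q = 95, so the observed statistic is compared with the 95th largest of the 99 bootstrap statistics.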
Simulation Results
We considered three processes:
• a white process,
• an autoregressive process of order five [AR(5)], and
• a moving-average process of order two [MA(2)].

Specifically, we assumed X_{1,t} = Y_t,

  X_{2,t} = 0.5 X_{2,t−1} − 0.6 X_{2,t−2} + 0.3 X_{2,t−3} − 0.4 X_{2,t−4} + 0.2 X_{2,t−5} + Y_t,
  X_{3,t} = 0.5 Y_t + 0.3 Y_{t−1} + 0.5 Y_{t−2},   t ∈ Z,

where Y_t is an independent random process with distribution F_Y.
Simulation Results
Rejection rates (α = 5%) using Subba Rao and Gabr's test (T = 256):

  F_Y        i.i.d.   AR(5)   MA(2)
  N(0,1)        6       4      16
  U(0,1)        2       7       6
  χ²_2         92      91      88
  χ²_8         39      34      14
  Laplace      23      27      20
  K(1,1)       62      66      44
  LogN        100     100      98

Rejection rates (α = 5%) using the bootstrap test (T = 256, n = 22):

  F_Y        i.i.d.   AR(5)   MA(2)
  N(0,1)        5       2       1
  U(0,1)        5       2       1
  χ²_2         95      72      63
  χ²_8         69      22      12
  Laplace      16       2       1
  K(1,1)       78      42      20
  LogN        100      91      77
Simulation Results
Rejection rates (α = 5%) using Subba Rao and Gabr's test (T = 512):

  F_Y        i.i.d.   AR(5)   MA(2)
  N(0,1)        8       8       7
  U(0,1)        7       4      10
  χ²_2        100     100      98
  χ²_8         61      48      32
  Laplace      39      38      45
  K(1,1)       85      82      68
  LogN        100     100      96

Rejection rates (α = 5%) using the bootstrap test (T = 512, n = 22):

  F_Y        i.i.d.   AR(5)   MA(2)
  N(0,1)        3       2       3
  U(0,1)        6       1       2
  χ²_2        100      94      89
  χ²_8         89      60      38
  Laplace      14       8       3
  K(1,1)       98      84      57
  LogN        100     100      93
Interpretation of the Results
• Subba Rao and Gabr's test is unable to maintain the 5% level of significance for symmetric processes, e.g. for coloured Gaussian or independent/coloured Laplace processes.
• The bootstrap-based test maintains the nominal level of significance at or below 5%, except in the case of the independent uniform and Laplace processes for T = 512.
• A comparison of power is appropriate only if the tests maintain the same nominal level of significance. We found that the bootstrap test achieves power comparable to, or better than, Subba Rao and Gabr's test.
Confidence Band Estimation for the Bicoherence

• We can use the above method to set confidence bands for higher-order spectra or cumulants.
• Repeating Steps 4–6 a large number of times, one can obtain a total of N bootstrap estimates of the bicoherence, as

  |Ĉ*(1)_XXX(ω_j, ω_k)|/σ̂(ω_j, ω_k), ..., |Ĉ*(N)_XXX(ω_j, ω_k)|/σ̂(ω_j, ω_k).

• After sorting these estimates at each frequency pair (ω_j, ω_k), one can determine the confidence bands using estimated percentiles, obtained as in the above procedure [Zoubir & Iskander (1996, 1999)].
Confidence Band Estimation for the Bicoherence (Cont'd)

[Figure: lower (left) and upper (right) 95% confidence bands, magnitude in dB, for the bicoherence of a white Gaussian process over the bifrequency plane (ω_j, ω_k); T = 4096 and n = 64.]
Noise Floor Estimation in Radar
• The noise floor is one of the main radar performance statistics.
• It is computed by taking a trimmed mean of power estimates in the Doppler spectrum, after excluding the low-Doppler power associated with any stationary targets or ground clutter.
• The radar operator requires an optimal value for the trim, and also supplementary information describing the accuracy of the computed noise floor estimate.
• The limited number of samples available and the non-Gaussian nature of the noise field make it particularly difficult to estimate confidence measures reliably. This suggests the use of the bootstrap.
Optimal Selection of the Trim

• The goal is to use the bootstrap to optimally select the amount of trimming, β, say.
• The problem can be stated as one of estimating θ from a class of estimates {θ̂(β) : β ∈ A} indexed by a parameter β, which is selected according to some optimality criterion.
• It is reasonable to use the estimate that leads to a minimum variance. Because the variance of θ̂(β) may depend on unknown parameters, the optimal parameter β_o is unknown.
Optimal Selection of the Trim (Cont'd)

• The objective is to find β by estimating the unknown variance of θ̂(β) and selecting the β which minimises the variance estimate.
• The problem of trimming has been solved theoretically [Jaeckel (1971)] for the mean of symmetric distributions. The trim is, however, restricted to the interval [0, 25]%.
• In addition to the fact that asymptotic results are invalid for small sample sizes, explicit expressions for the asymptotic variance may not be available outside the interval [0, 25]%.
An Example: Selection of the Trim for the Mean

• Let X = {X_1, ..., X_n} be a random sample drawn from a distribution function F.
• Let X_(1), ..., X_(n) denote the order statistics.
• For an integer β less than n/2, the β-trimmed mean based on the sample X is given by

  θ̂(β) = (1/(n − 2β)) ∑_{i=β+1}^{n−β} X_(i).

• For an asymmetric distribution we use the (α, β)-trimmed mean:

  θ̂(α, β) = (1/(n − α − β)) ∑_{i=α+1}^{n−β} X_(i),   α, β < n/2.
Example
Step 1. Initial conditions. Select the initial trims α = 0 and β = 0.
Step 2. Resampling. Using a pseudo-random number generator, draw a large number, say N, of independent samples

  X*_1 = {X*_{11}, ..., X*_{1n}}, ..., X*_N = {X*_{N1}, ..., X*_{Nn}}

of size n from X. Each sample is drawn with replacement.
Step 3. Calculation of the bootstrap statistic. For each bootstrap sample X*_j, j = 1, ..., N, calculate the trimmed mean

  θ̂*_j(α, β) = (1/(n − α − β)) ∑_{i=α+1}^{n−β} X*_{j(i)},   j = 1, ..., N.
Example (Cont'd)

Step 4. Variance estimation. Using the bootstrap trimmed-mean values, calculate the variance estimate

  σ̂*² = (1/(N−1)) ∑_{j=1}^{N} ( θ̂*_j(α, β) − (1/N) ∑_{j'=1}^{N} θ̂*_{j'}(α, β) )².

Step 5. Repetition. Repeat Steps 2–4 using different combinations of the trims α and β.
Step 6. Optimal trim. Choose the setting of the trim that results in a minimal variance estimate.
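Steps 1 to 6 can be sketched as follows. To keep the sketch short, one set of bootstrap samples is reused across all trim settings rather than redrawn in Step 5; the function names are illustrative:

```python
import numpy as np

def trimmed_mean(x, lo, hi):
    """(lo, hi)-trimmed mean: drop the lo smallest and hi largest values."""
    s = np.sort(x)
    return s[lo : len(s) - hi].mean()

def optimal_trim(x, max_trim, N=200, rng=None):
    """Choose the (lo, hi) trim minimising the bootstrap variance of the
    trimmed mean (Steps 1-6 of the slide, as a sketch)."""
    rng = np.random.default_rng(rng)
    n = len(x)
    boots = rng.choice(x, size=(N, n), replace=True)   # Step 2: resample
    best, best_var = None, np.inf
    for lo in range(max_trim + 1):                     # Step 5: all settings
        for hi in range(max_trim + 1):
            vals = np.array([trimmed_mean(b, lo, hi) for b in boots])  # Step 3
            v = vals.var(ddof=1)                       # Step 4: variance
            if v < best_var:
                best, best_var = (lo, hi), v           # Step 6: keep minimum
    return best, best_var
```

On a sample contaminated by large positive outliers, the upper trim selected this way is typically nonzero, which is the behaviour exploited for the radar noise floor below.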
Noise Floor Estimation in Radar
• In radar, we estimate residuals for the Doppler spectrum by taking the ratio of the periodogram to a spectral estimate of the raw data. The residuals are used for resampling, to generate bootstrap spectral estimates as in a previous example.
• We proceed as in the example for the trimmed mean, replacing X_i by Ĉ_XX(ω_i), where ω_i = 2πi/n, i = 1, ..., n, are discrete frequencies and n is the number of observations.
• The procedure makes the assumption that the spectral estimates are i.i.d. at distinct discrete frequencies.
• The optimal noise floor is found by minimising the bootstrap variance estimate of the trimmed spectrum w.r.t. (α, β).
Noise Floor Estimation in Radar (Cont'd)
The radar return is assumed to consist of 128 observations from a complex-valued AR(4) process, driven by a Gaussian process.

[Figure: Doppler spectrum (power spectrum in dB versus normalised Doppler frequency) with the target peak, C_XX, C_NN and the trimmed-mean noise floor (TM) indicated. With N = 99, θ̂(α̂_0, β̂_0) = 0.80.]
Noise Floor in Over-the-Horizon Radar

[Figure: Doppler spectrum and estimated noise floor for a real over-the-horizon radar return (no target present); C_XX, C_NN and the trimmed-mean noise floor (TM) indicated. With N = 99, θ̂(α̂_0, β̂_0) = 2.0738 × 10³.]
Noise Floor in Over-the-Horizon Radar (Cont'd)

[Figure: Doppler spectrum and estimated noise floor for a real over-the-horizon radar return (target present); C_XX, C_NN and the trimmed-mean noise floor (TM) indicated. With N = 99, θ̂(α̂_0, β̂_0) = 8.9513 × 10³.]
Confidence Intervals for an Aircraft's Flight Parameters

[Figure: schematic of the passive acoustic scenario; an overflying aircraft, its acoustic emission, and a microphone on the ground.]
The Passive Acoustic Approach

• Information about the flight parameters is contained in the acoustic signal as heard by the stationary observer.
• The frequency of the observed acoustic signal s(t) from the overflying aircraft undergoes a time-varying Doppler shift.
• Thus, the phase of s(t) undergoes a time-varying rate of change.

A simple model for the aircraft acoustic signal, as heard by a stationary observer, is given by

  X(t) = z(t) + U(t) = A e^{jφ(t)} + U(t) = A e^{j2π ∫_{−∞}^{t} f(τ)dτ} + U(t),

where z(t) = s(t) + jH[s(t)] and H[·] is the Hilbert transform.
Problem Description
[Figure: geometry of the overflying aircraft and the ground observer, with the aircraft speed, altitude, slant range r(t) and elevation angle shown.]
Schematic of the geometric arrangement of the aircraft and observer in terms of the physical parameters of the model.
Problem Description (Cont'd)
Context: Estimation of an aircraft's flight parameters (height, speed, range and acoustic frequency) from the acoustic signal as heard by a stationary observer [Reid et al. (1997), Ferguson (1992), Ferguson & Quinn (1994)].

Problem: To estimate an aircraft's flight parameters using passive acoustic techniques and to determine confidence bounds given only a single acoustic realisation.

Solution: Estimate the aircraft flight parameters from the time-varying phase of the observed acoustic signal. Use bootstrap techniques to estimate the confidence bounds [Reid et al. (1996), Zoubir & Boashash (1998)].
The Passive Acoustic Model
An observer phase model φ(t) describes the phase of the observed acoustic signal in terms of the flight parameters and is given by

$$\phi(t) = 2\pi f_a \frac{c^2}{c^2 - v^2} \left( t - \frac{1}{c} \sqrt{h^2 + v^2 t^2 + \frac{2 v^2 t h}{c}} \right) + \phi_0, \qquad -\infty < t < \infty,$$

where fa is the source acoustic frequency, c is the speed of sound in the medium, v is the constant velocity of the aircraft, h is the constant altitude of the aircraft and φ0 is an initial phase constant. The time t0 at which the aircraft is directly overhead is taken as t0 = 0 above; for general t0, t is replaced by t − t0.
The Passive Acoustic Model (Cont'd)
From the phase model, the instantaneous frequency (IF), relative to the stationary observer, can be expressed as

$$f(t) = \frac{1}{2\pi} \frac{d\phi(t)}{dt} = f_a \frac{c^2}{c^2 - v^2} \left( 1 - \frac{v^2 (t + h/c)}{\sqrt{h^2 (c^2 - v^2) + v^2 c^2 (t + h/c)^2}} \right).$$

For a given f(t), or φ(t), −∞ < t < ∞, and c, the aircraft parameters collected in the vector θ = (fa, h, v, t0)′ can be uniquely determined from the phase or observer IF model.

Consider appropriately sampled versions of the continuous-time signals and denote by φt and Xt the phase and the observed signal as functions of t = 0, ±1, ±2, …
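The observer IF model above is easy to evaluate numerically. The sketch below uses the parameter values of the example profile on the next slide (v = 75 m/s, h = 300 m, fa = 85 Hz, t0 = 0 s); the speed of sound c = 340 m/s is an assumed value, not stated on the slides.

```python
import numpy as np

def observer_if(t, fa=85.0, v=75.0, h=300.0, c=340.0):
    """Instantaneous frequency f(t) heard by a stationary ground observer."""
    s = t + h / c
    doppler = v**2 * s / np.sqrt(h**2 * (c**2 - v**2) + v**2 * c**2 * s**2)
    return fa * c**2 / (c**2 - v**2) * (1.0 - doppler)

t = np.linspace(-15.0, 15.0, 601)
f = observer_if(t)
# f decreases monotonically through the overhead time, from roughly
# fa*c/(c - v) on approach towards fa*c/(c + v) on departure.
```

The two limiting values are the familiar one-sided Doppler shifts; the model interpolates smoothly between them as the aircraft passes overhead.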
The Instantaneous Frequency
[Figure: observer IF in Hz vs. time in s, decreasing smoothly from about 110 Hz to about 70 Hz.]
Typical time-varying acoustic frequency profile of an overflying aircraft as described by the observer frequency model. For this example, va = 75 m/s, h = 300 m, fa = 85 Hz and t0 = 0 s. The cartwheel symbol indicates the time at which the aircraft is directly overhead the observer.
The Instantaneous Phase
[Figure: observer phase in radians vs. time in s, increasing from 0 to about 16000 radians.]
Typical time-varying acoustic phase profile of an overflying aircraft as described by the observer phase model. For this example, va = 75 m/s, h = 300 m, fa = 85 Hz and t0 = 0 s. The cartwheel symbol marks the time at which the aircraft is directly overhead the observer.
A Bootstrap Approach
Step 0. Collect and sample the data to obtain Xt, t = −n/2, …, n/2 − 1.

Step 1. Unwrap the phase of the signal Xt to provide a non-decreasing function φ̂t which approximates the true phase φt.

Step 2. Obtain θ̂, an initial estimate of the aircraft parameters, by fitting the non-linear observer phase model φ_{t;θ} to φ̂t in a least-squares sense.

Step 3. Compute the residuals

$$\hat{\varepsilon}_t = \phi_{t;\hat{\theta}} - \hat{\phi}_t, \qquad t = -n/2, \ldots, n/2 - 1.$$
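Steps 0-1 amount to extracting and unwrapping the phase of the observed analytic signal. A minimal sketch, in which the phase law, noise level and sample size are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 256
t = np.arange(-n // 2, n // 2)
true_phase = 2 * np.pi * (0.05 * t + 1e-4 * t**2)   # illustrative phase law
noise = 0.05 * (rng.normal(size=n) + 1j * rng.normal(size=n))
x = np.exp(1j * true_phase) + noise                  # noisy analytic signal

# Step 1: np.angle wraps the phase into (-pi, pi]; np.unwrap removes the
# 2*pi jumps, recovering a smooth approximation to the true phase
# (up to an additive multiple of 2*pi).
phase_hat = np.unwrap(np.angle(x))
```

Unwrapping is reliable only while the phase increment per sample stays well below π, which is one reason a sufficiently high sampling rate matters in Step 0.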
A Bootstrap Approach (Cont'd)
Step 4. Compute σ̂i, a bootstrap estimate of the standard deviation of θ̂i, i = 1, …, 4.

Step 5. Draw a random sample X* = {ε̂*_{−n/2}, …, ε̂*_{n/2−1}}, with replacement, from X = {ε̂_{−n/2}, …, ε̂_{n/2−1}} and construct

$$\hat{\phi}_t^* = \phi_{t;\hat{\theta}} + \hat{\varepsilon}_t^*.$$

Step 6. Obtain and record the bootstrap estimates θ̂* of the aircraft parameters by fitting the observer phase model to φ̂*t in a least-squares sense.
A Bootstrap Approach (Cont'd)
Step 7. Estimate the standard deviation of θ̂*i using a nested bootstrap step, then compute and record

$$\hat{T}_i^* = \frac{\hat{\theta}_i^* - \hat{\theta}_i}{\hat{\sigma}_i^*}, \qquad i = 1, \ldots, 4.$$

Step 8. Repeat Steps 5 through 7 a large number of times N.

Step 9. Order the bootstrap estimates as T̂*_{i,(1)} ≤ T̂*_{i,(2)} ≤ … ≤ T̂*_{i,(N)} and compute the (1 − α)100% confidence interval

$$\left( \hat{\theta}_i - \hat{T}_{i,(q_1)}^* \hat{\sigma}_i,\; \hat{\theta}_i - \hat{T}_{i,(q_2)}^* \hat{\sigma}_i \right),$$

where q1 = N − ⌊Nα/2⌋ + 1 and q2 = ⌊Nα/2⌋.
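Steps 3-9 can be sketched compactly in code. In the sketch below the nonlinear observer phase model is replaced by a simple quadratic phase model fitted with `np.polyfit`, so that the resampling logic (residuals → refit → studentise → percentile-t interval) is shown self-contained; the model, the data and all numerical values are illustrative assumptions, not taken from the slides.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit(t, phi, deg=2):
    return np.polyfit(t, phi, deg)          # stand-in for the nonlinear LS fit

def model(t, theta):
    return np.polyval(theta, t)

n = 128
t = np.arange(-n // 2, n // 2) / 16.0
phi = model(t, [0.3, 1.5, 4.0]) + rng.normal(0.0, 0.2, n)   # "unwrapped phase"

theta_hat = fit(t, phi)                      # Step 2: initial estimate
resid = phi - model(t, theta_hat)            # Step 3: residuals

def boot_std(theta, eps, B=50):
    """Bootstrap standard deviation of each parameter (Steps 4 and 7)."""
    est = np.empty((B, theta.size))
    for b in range(B):
        phi_b = model(t, theta) + rng.choice(eps, eps.size, replace=True)
        est[b] = fit(t, phi_b)
    return est.std(axis=0, ddof=1)

sigma_hat = boot_std(theta_hat, resid)       # Step 4

N = 200                                      # Step 8: number of resamples
T_star = np.empty((N, theta_hat.size))
for i in range(N):
    phi_star = model(t, theta_hat) + rng.choice(resid, n, replace=True)  # Step 5
    theta_star = fit(t, phi_star)                                        # Step 6
    sigma_star = boot_std(theta_star, phi_star - model(t, theta_star))   # Step 7
    T_star[i] = (theta_star - theta_hat) / sigma_star

T_star.sort(axis=0)                          # Step 9: percentile-t interval
alpha = 0.05
q1 = N - int(N * alpha / 2)                  # 0-based index for N - |_Na/2_| + 1
q2 = int(N * alpha / 2) - 1                  # 0-based index for |_Na/2_|
lower = theta_hat - T_star[q1] * sigma_hat
upper = theta_hat - T_star[q2] * sigma_hat
```

The nested call in Step 7 is what makes percentile-t intervals expensive: each of the N outer resamples triggers B inner resamples.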
Simulation Results
- An n = 320 point passive acoustic signal zt is generated at three levels of SNR (15 dB, 20 dB and 30 dB).
- We set h = 304.8 m, v = 102.89 m/s, fa = 1 Hz, t0 = 0 s, sampling frequency fs = 8 Hz, and bootstrap variables B1 = 25, B2 = 100 and B3 = 1000.
- The bootstrap confidence bounds are computed and compared with those obtained by Monte Carlo simulation, where the aircraft parameters are calculated for 1000 independent realisations of zt.
Simulation Results (Cont'd)
                           h [m]            v [m/s]          t0 [s]           fa [Hz]
SNR [dB]   Actual          304.8            102.89            0.000           1.000
                           BS      MC       BS      MC        BS      MC      BS     MC
30     Upper bound         308.40  308.0    102.94  103.04    0.004   0.002   1.000  1.000
       Lower bound         301.4   301.9    102.67  102.74   -0.005  -0.006   1.000  1.000
       Interval length     6.6     6.1      0.27    0.30      0.009   0.008   0.000  0.000
20     Upper bound         309.5   314.5    103.17  103.34    0.011   0.016   1.000  1.000
       Lower bound         290.8   295.2    102.26  102.44   -0.020  -0.016   0.999  0.999
       Interval length     18.7    19.3     0.91    0.90      0.031   0.032   0.001  0.001
15     Upper bound         315.7   320.7    103.54  103.65    0.040   0.027   1.001  1.001
       Lower bound         286.0   288.6    102.14  102.12   -0.008  -0.027   0.999  0.999
       Interval length     29.7    32.1     1.40    1.53      0.048   0.054   0.002  0.002

The bootstrap-derived 95% confidence bounds for each of the parameters are compared with the 95% confidence bounds determined by Monte Carlo simulation.
Simulation Results (Cont'd)
[Figure: eight histograms (count vs. parameter value) — panels (a)-(d) bootstrap and (e)-(h) Monte Carlo estimates of source frequency, speed, height and time reference.]
The histograms of the bootstrap distribution (left column) are compared with the histograms obtained by Monte Carlo simulation (right column) for each of the parameters at 20 dB SNR. The true values are indicated by the dashed line.
Real Experiment

[Figure: flight path plotted in seconds of longitude vs. seconds of latitude, relative to the ground location.]
The complete flight path of an aircraft acoustic data experiment conducted at Bribie Island, showing five fly-overs. The ground location is indicated by the cartwheel symbol.
Real Experiment (Cont'd)

                           h [m]    v [m/s]   t0 [s]   fa [Hz]
Run 1   Nominal value      149.05   36.61      0.00    76.91
        Upper bound        142.03   36.50      0.02    76.99
        Lower bound        138.16   36.07     -0.03    76.92
        Interval length      3.87    0.43      0.05     0.07
Run 2   Nominal value      152.31   52.94      0.00    77.90
        Upper bound        161.22   53.27     -0.04    78.18
        Lower bound        156.45   52.64     -0.06    78.13
        Interval length      4.77    0.62      0.02     0.05
Run 3   Nominal value      166.52   47.75      0.00    75.94
        Upper bound        232.30   57.29      0.10    76.48
        Lower bound        193.18   53.47     -0.14    75.87
        Interval length     39.13    3.83      0.24     0.60
Run 6   Nominal value      233.01   56.56      0.00    77.65
        Upper bound        243.02   57.15      0.74    77.96
        Lower bound        209.72   53.69      0.68    77.80
        Interval length     33.30    3.46      0.06     0.16

The bootstrap-derived 95% confidence bounds for four real test signals using the unwrapped-phase-based parameter estimator.
Interpretation of the Results
- The proposed bootstrap and unwrapped-phase-based estimation techniques can be used to provide a practical flight parameter estimation scheme.
- The techniques do not assume any statistical distribution for the parameter estimates and can be applied in the absence of multiple acoustic realisations.
- The obtained confidence bounds are in close agreement with those obtained from Monte Carlo simulations.
- Similar experiments with Central Finite Difference (CFD) estimates were performed. The confidence bounds presented here are much tighter than those of the CFD-based estimates.
Modelling Polynomial Phase Signals
- Many non-stationary signals encountered in radar, sonar, telecommunications, seismology or biomedical engineering can be expressed in the general form of a complex analytic signal

$$z(t) = a_0 \exp\{j \varphi(t)\},$$

where a0 and φ(t), t ∈ [T1, T2], T1, T2 < ∞, are the amplitude and the phase of the signal, respectively.

- In practice, z(t) is observed in stationary complex noise U(t) and sampled, yielding values xt, t = 0, …, n − 1, from the model

$$X_t = a_0 \exp\{j \varphi_t\} + U_t, \qquad t = 0, \ldots, n-1.$$
Modelling Polynomial Phase Signals (Cont'd)
- Provided that certain regularity conditions are fulfilled [Kreyszig (1989)], φ(t) can be modelled by

$$X_t = a_0 \exp\Big\{ j \sum_{q=0}^{Q} b_q \psi_{t,q} \Big\} + U_t, \qquad t = 0, \ldots, n-1,$$

where bq, q = 0, …, Q, are unknown real-valued parameters, and {ψ_{t,q}} is an arbitrary set of basis sequences.

- The signal z_t = a_0 exp{ j Σ_{q=0}^{Q} b_q ψ_{t,q} }, t ∈ ℤ, is referred to as a frequency modulated (FM) signal.
Modelling Polynomial Phase Signals (Cont'd)
- In the special case where ψ_{t,q} = t^q, zt is called a polynomial phase signal.
- Problem: Given a short segment of length n of noisy observations of the polynomial phase signal, model zt. Modelling involves selection of the model and conditional estimation of the parameters [Akaike (1978)].
- Model selection consists of choosing an appropriate set of sequences t^q, q ∈ γ, γ ⊆ {0, …, Q}, where Q is an arbitrarily large model order.
- Conditional estimation of the model parameters refers to estimation conditioned on the unknown model.
Bootstrap Model Selection Based on LSE
- Assuming that SNR = a0²/σU² is large, one can write, for t = 0, …, n − 1 [Tretter (1985), Djurić & Kay (1990)],

$$X_t = a_0 \exp\Big\{ j \sum_{q=0}^{Q} b_q t^q \Big\} + U_t \approx a_0 \exp\Big\{ j \Big( \sum_{q=0}^{Q} b_q t^q + W_t \Big) \Big\},$$

with Wt real, zero-mean, i.i.d. and σW² = σU²/(2a0²).

- The estimation problem is reduced to

$$\boldsymbol{\phi} = \boldsymbol{H} \boldsymbol{b} + \boldsymbol{W},$$

where φ = (φ0, …, φ_{n−1})′, W = (W0, …, W_{n−1})′, b = (b0, …, bQ)′ and H = (h0, …, h_{n−1})′ with h_t = (1, t, …, t^Q)′, t = 0, …, n − 1.
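The reduced linear model can be set up and solved in a few lines. A minimal sketch, in which n, Q, the coefficient values and the noise level are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
n, Q = 64, 4
t = np.arange(n, dtype=float)
H = np.vander(t, Q + 1, increasing=True)      # rows h_t' = (1, t, ..., t^Q)
b = np.array([0.5, 0.05, 0.0, 2e-4, 0.0])     # illustrative phase parameters
phi = H @ b + rng.normal(0.0, 0.05, n)        # unwrapped phase plus noise W_t

# Least-squares estimate of b from the noisy unwrapped phase.
b_hat, *_ = np.linalg.lstsq(H, phi, rcond=None)
```

High-order coefficients are recovered very accurately because the regressors t^q grow quickly with t; in practice the phase must first be unwrapped, which is where the large-SNR assumption enters.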
Residuals-Based Bootstrap Model Selection
With γ a subset of {0, …, Q}, the optimal model is the model γo such that b_{γo} contains all non-zero components of b, where b_γ is a vector containing the components of b indexed by the integers in γ.

Step 1. Select the largest possible model, γ = {0, …, Q}, and find the least-squares estimate b̂ of b = (b0, …, bQ)′.

Step 2. Compute the residuals

$$\hat{w}_t = \phi_t - \boldsymbol{h}_t' \hat{\boldsymbol{b}}, \qquad t = 0, \ldots, n-1.$$

Step 3. Centre and scale the residuals to obtain

$$\tilde{w}_t = \Big( \hat{w}_t - \frac{1}{n} \sum_{t=0}^{n-1} \hat{w}_t \Big) \Big/ \sqrt{1 - \frac{Q+1}{n}}, \qquad t = 0, \ldots, n-1.$$
Bootstrap Model Selection (Cont'd)
Step 4. For all models γ ⊆ {0, …, Q}:

(a) Draw ŵ*t, with replacement, from w̃t, t = 0, …, n − 1.

(b) Compute

$$\phi_t^* = \boldsymbol{h}_{\gamma,t}' \hat{\boldsymbol{b}}_\gamma + \hat{w}_t^*, \qquad t = 0, \ldots, l-1,$$

where l is such that l/n → 0 and l → ∞.

(c) Find the least-squares estimate b̂*_{γ,l} from (b).

(d) Compute

$$\Gamma_{n,l}^*(\gamma) = \frac{1}{n} \sum_{t=0}^{n-1} \big( \phi_t - \boldsymbol{h}_{\gamma,t}' \hat{\boldsymbol{b}}_{\gamma,l}^* \big)^2.$$
Bootstrap Model Selection (Cont'd)
Step 4. (Continued)

(e) Repeat steps (a)-(d) a number of times N (e.g. N = 100) to obtain the bootstrap statistics Γ*(1)_{n,l}(γ), …, Γ*(N)_{n,l}(γ), and compute

$$\hat{\Gamma}_{n,l}^*(\gamma) = \frac{1}{N} \sum_{b=1}^{N} \Gamma_{n,l}^{*(b)}(\gamma).$$

Step 5. Choose γ̂ for which Γ̂*_{n,l}(γ) is minimum w.r.t. γ.

The procedure above is consistent, in that lim_{n→∞} Pr{γ̂ = γo} = 1, provided l is such that l/n → 0 and l → ∞ [Shao (1996), Zoubir & Iskander (1999)].
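The selection loop (Steps 1-5) can be sketched for a plain linear-regression version of the problem; the FM/phase front end is omitted so the loop stays short. All dimensions and coefficient values are illustrative assumptions, and taking the first l design rows for the size-l resample is one simple choice, not prescribed above.

```python
import numpy as np
from itertools import chain, combinations

rng = np.random.default_rng(2)
n, l, Q = 64, 48, 3                            # l/n should be small, l large
t = np.arange(n, dtype=float) / n              # scaled regressor (conditioning)
H = np.vander(t, Q + 1, increasing=True)
b_true = np.array([2.0, -3.0, 0.0, 10.0])      # component 2 is really zero
y = H @ b_true + rng.normal(0.0, 0.02, n)

# Steps 1-3: full-model fit, residuals, centring and scaling.
b_full = np.linalg.lstsq(H, y, rcond=None)[0]
w = y - H @ b_full
w_tilde = (w - w.mean()) / np.sqrt(1.0 - (Q + 1) / n)

N = 100
models = list(chain.from_iterable(
    combinations(range(Q + 1), k) for k in range(1, Q + 2)))
score = {}
for m in models:
    Hm = H[:, list(m)]
    bm = np.linalg.lstsq(Hm, y, rcond=None)[0]
    total = 0.0
    for _ in range(N):                                        # Step 4, N times
        w_star = rng.choice(w_tilde, l, replace=True)         # (a)
        y_star = Hm[:l] @ bm + w_star                         # (b) size-l data
        b_star = np.linalg.lstsq(Hm[:l], y_star, rcond=None)[0]   # (c)
        total += np.mean((y - Hm @ b_star) ** 2)              # (d) full-data loss
    score[m] = total / N                                      # Step 4(e)

best = min(score, key=score.get)                              # Step 5
```

Models that drop a genuinely non-zero coefficient pay a squared-bias penalty in (d), while over-parameterised models pay through the extra variance of fits on only l points; this trade-off is what makes the criterion consistent when l/n → 0 and l → ∞.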
Simulation Results
- Consider a quadratic FM signal of the form

$$z_t = \exp\{ j (0.5 + 0.05\,t + 0.0002\,t^3) \}, \qquad t = 0, \ldots, n-1,$$

embedded in i.i.d. noise Ut, t = 0, …, n − 1. The true model of this quadratic FM signal is γo = {b0, b1, b3}.

- The noise is selected to be Gaussian, although the distribution of the noise is not relevant provided it has a finite variance.
- The signal-to-noise ratio ranges from 5 dB to 15 dB.
- 100 bootstrap resamples are used. The number of samples in each realisation is set to n = 64 and l = 48 (l should be such that Q/l is reasonably small).
Simulation Results (Cont'd)
Model γ                 Γ̂*_{n,l}(γ)   AIC    MDL    HQ     AICC
(b0, 0, b2, b3, 0)        0             0      0      0      0
(b0, 0, b2, b3, b4)       0             5.1    2.4    5.8    4.8
(b0, b1, 0, b3, 0)      100.0          87.5   96.0   82.7   88.9
(b0, b1, 0, b3, b4)       0             3.4    0.9    5.1    3.0
(b0, b1, b2, b3, 0)       0             3.5    0.7    5.0    3.1
(b0, b1, b2, b3, b4)      0             0.5    0      1.4    0.2

Empirical probabilities (in percent) of model selection for a quadratic FM signal embedded in Gaussian noise; the true model is (b0, b1, 0, b3, 0). SNR = 15 dB, n = 64, l = 48. Models not selected by any of the methods are not shown.
Simulation Results (Cont'd)

Model γ                 Γ̂*_{n,l}(γ)   AIC    MDL    HQ     AICC
(b0, 0, 0, b3, 0)         0.5           0      0      0      0
(b0, 0, 0, b3, b4)        0             1.1    1.8    1      1.1
(b0, 0, b2, 0, 0)         0.2           0      0      0      0
(b0, 0, b2, 0, b4)        0             1      1.8    0.7    1.2
(b0, 0, b2, b3, 0)        0.3          13.8   14.3   13.7   13.9
(b0, 0, b2, b3, b4)       0             4.1    2.8    5      3.5
(b0, b1, 0, 0, b4)        6.2           1      1.6    0.8    1.2
(b0, b1, 0, b3, 0)       84.9          56.9   62.8   53.4   58.9
(b0, b1, 0, b3, b4)       0             1.6    0.3    2.5    1.2
(b0, b1, b2, 0, 0)        5.3           1.1    1.4    1.1    1.1
(b0, b1, b2, 0, b4)       0             3.1    0.9    3.4    2.3
(b0, b1, b2, b3, 0)       2.6           3.1    2.2    3.5    3.1
(b0, b1, b2, b3, b4)      0            13.2   10.1   14.9   12.5

Empirical probabilities (in percent) of model selection for a quadratic FM signal in Gaussian noise; the true model is (b0, b1, 0, b3, 0). SNR = 5 dB, n = 64, l = 48.
Interpretation of Results
- Bootstrap techniques can be applied to constant-amplitude polynomial phase modelling.
- Results have shown that the empirical probability of correctly selecting the model of a constant-amplitude polynomial phase signal is high at a reasonable SNR.
- A comparison with other techniques demonstrates the superiority of bootstrap techniques.
- Other bootstrap techniques based on the discrete polynomial phase transform have also been devised. They consist of a two-stage approach involving hypothesis testing [Zoubir & Iskander (1999)].
Summary
- It is often necessary to find the sampling distributions of parameter estimators, so that the respective means and variances can be calculated and, more generally, confidence intervals for the true parameters can be set.
- Most techniques for computing variances or confidence intervals assume that the size of the available set of sample values is sufficiently large, so that "asymptotic" results can be applied.
- In many signal processing problems this assumption cannot be made because, for example, the process is non-stationary and only small portions of stationary data are considered.
Summary (Cont'd)
- Bootstrap techniques are an alternative to asymptotic methods.
- The bootstrap does with a computer what the experimenter would do in practice, if it were possible: repeat the experiment.
- With the bootstrap, the observations are randomly re-assigned and the estimates re-computed. This process is repeated thousands of times to simulate repeated experiments.
- In an era of exponentially increasing computational power, such computer-intensive methods are becoming increasingly attractive.
Summary (Cont'd)
- The tutorial provides the fundamental concepts and methods needed by the signal processing practitioner to decide when and how to apply the bootstrap successfully.
- The tutorial focuses on the independent data bootstrap. The assumption of i.i.d. data can break down in practice either because the data is not independent, or because it is not identically distributed, or both.
- The bootstrap can still be invoked if we know the model that generated the data. In other cases we can make the reasonable assumption that the data is identically distributed but not independent, as in autoregressive processes.
Summary (Cont'd)
- For confidence interval estimation or hypothesis testing, it is essential that the statistic used is asymptotically pivotal.
- It has been shown that working on a variance-stable scale is better than studentising.
- When a variance-stabilising transformation is not known, the bootstrap can be used to estimate it.
- Many applications have been presented to demonstrate the power of the bootstrap. Special care is, however, required when applying the bootstrap in real-life situations.
Acknowledgements
- Many of the examples and simulations were implemented by Dr. D. Robert Iskander and Mr. Hwa-Tung Ong, who also designed and implemented the bootstrap demo.
- The passive acoustic problem was the subject of Dr. David Reid's PhD. He collected the data and implemented the algorithms.
- The data from the JINDALEE over-the-horizon radar system was provided by the Defence Science & Technology Organisation in Salisbury, South Australia.
- I wish to thank my colleagues and students for their support, and the Australian Research Council, which partly provided financial support for some of these projects.
Cited References

[1] H. Akaike. Comments on "On Model Structure Testing in System Identification". International J. Control, 27:323-324, 1978.
[2] T. W. Anderson. An Introduction to Multivariate Statistical Analysis. John Wiley and Sons, 1984.
[3] T. Ankenbrand and M. Tomassini. Predicting Multivariate Financial Time Series Using Neural Networks: the Swiss Bond Case. In Proceedings of the IEEE/IAFE 1996 Conference on Computational Intelligence for Financial Engineering (CIFEr), pages 27-33, New York, USA, 1996. IEEE.
[4] G. Archer and K. Chan. Bootstrapping Uncertainty in Image Analysis. In Proceedings of the 12th Symposium in Computational Statistics, pages 193-198, Heidelberg, Germany, 1996. Physica-Verlag.
[5] C. Banga and F. Ghorbel. Optimal Bootstrap Sampling for Fast Image Segmentation: Application to Retina Image. In Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP-93), Minnesota, 1993.
[6] R. Bhar and C. Chiarella. Bootstrap Algorithms and Applications. In Proceedings of the IEEE/IAFE 1996 Conference on Computational Intelligence for Financial Engineering (CIFEr), pages 168-182, New York, USA, 1996. IEEE.
[7] R. N. Bhattacharya and R. R. Rao. Normal Approximation and Asymptotic Expansions. Wiley, New York, 1976.
[8] J. F. Böhme and D. Maiwald. Multiple Wideband Signal Detection and Tracking from Towed Array Data. In M. Blanke and T. Söderström, editors, Proceedings of the 10th IFAC Symposium on System Identification (SYSID 94), volume 1, pages 107-112, Copenhagen, July 1994. Danish Automation Soc.
[9] A. Bose. Edgeworth Correction by Bootstrap in Autoregressions. The Ann. of Statist., 16:1709-1722, 1988.
[10] D. R. Brillinger. Time Series: Data Analysis and Theory. Holden-Day, 1981.
[11] P. Ciarlini. Bootstrap Algorithms and Applications. In Advanced Mathematical Tools in Metrology III, pages 12-23, Singapore, 1997. World Scientific.
[12] M. G. Cox, P. M. Harris, M. J. T. Milton, and P. T. Woods. A Method for Evaluating Trends in Ozone-Concentration Data and its Application to Data from the UK Rural Ozone Monitoring Network. In Advanced Mathematical Tools in Metrology III, pages 171-177, Singapore, 1997. World Scientific.
[13] H. Cramér. Mathematical Methods of Statistics. Princeton University Press, 1967.
[14] P. M. Djurić and S. M. Kay. Parameter Estimation of Chirp Signals. IEEE Transactions on Acoustics, Speech and Signal Processing, 38(12):2118-2126, December 1990.
[15] B. Efron. Bootstrap Methods: Another Look at the Jackknife. The Ann. of Statist., 7:1-26, 1979.
[16] B. Efron. Computers and the Theory of Statistics: Thinking the Unthinkable. SIAM Review, 4:460-480, 1979.
[17] B. Efron. Better Bootstrap Confidence Intervals. J. Amer. Statist. Assoc., 82:171-185, 1987.
[18] B. Efron and R. Tibshirani. An Introduction to the Bootstrap. Chapman and Hall, 1993.
[19] B. G. Ferguson. A Ground Based Narrow-Band Passive Acoustic Technique for Estimating the Altitude and Speed of a Propeller Driven Aircraft. J. Acoust. Soc. Am., 92(3):1403-1407, 1992.
[20] B. G. Ferguson and B. G. Quinn. Application of the Short-Time Fourier Transform and the Wigner-Ville Distribution to the Acoustic Localization of Aircraft. J. Acoust. Soc. Am., 96(2):821-827, 1994.
[21] N. I. Fisher and P. Hall. Bootstrap Confidence Regions for Directional Data. J. Amer. Statist. Assoc., 84:996-1002, 1989.
[22] N. I. Fisher and P. Hall. New Statistical Methods for Directional Data - I. Bootstrap Comparison of Mean Directions and the Fold Test in Palaeomagnetism. Geophys. J. International, 101:305-313, 1990.
[23] N. I. Fisher and P. Hall. Bootstrap Algorithms for Small Samples. J. Statist. Plan. Infer., 27:157-169, 1991.
[24] N. I. Fisher and P. Hall. General Statistical Test for the Effect of Folding. Geophys. J. International, 105:419-427, 1991.
[25] R. A. Fisher. On the "Probable Error" of a Coefficient of Correlation Deduced from a Small Sample. Metron, 1:1-32, 1921.
[26] J. Franke and W. Härdle. On Bootstrapping Kernel Estimates. The Ann. of Statist., 20:121-145, 1992.
[27] D. A. Freedman. Bootstrapping Regression Models. The Ann. of Statist., 9:1218-1228, 1981.
[28] P. Hall. Theoretical Comparison of Bootstrap Confidence Intervals (with discussion). The Ann. of Statist., 16:927-985, 1988.
[29] P. Hall. The Bootstrap and Edgeworth Expansion. Springer-Verlag New York, Inc., 1992.
[30] P. Hall and D. M. Titterington. The Effect of Simulation Order on Level Accuracy and Power of Monte Carlo Tests. J. R. Statist. Soc. B, 51:459-467, 1989.
[31] S. R. Hanna. Confidence Limits for Air Quality Model Evaluations, as Estimated by Bootstrap and Jackknife Resampling Methods. Atmospheric Environment, 23:1385-1398, 1989.
[32] T. Hastie and R. Tibshirani. Generalized Additive Models. Chapman & Hall, 1990.
[33] D. R. Haynor and S. D. Woods. Resampling Estimates of Precision in Emission Tomography. IEEE Transactions on Medical Imaging, 8:337-343, 1989.
[34] R. Herman. Using the Bootstrap Method for Evaluating Variations Due to Sampling in Voltage Regulation Calculations. In Proceedings of the Sixth Southern African Universities Power Engineering Conference (SAUPEC-96), pages 235-238, Witwatersrand, South Africa, 1996. Univ. Witwatersrand.
[35] A. O. Hero III, J. A. Fessler, and M. Usman. Exploring Estimator Bias-Variance Tradeoffs Using the Uniform CR Bound. IEEE Transactions on Signal Processing, 44(8):2026-2041, 1996.
[36] M. J. Hinich. Testing for Gaussianity and Linearity of a Stationary Time Series. J. Time Series Anal., 3:169-176, 1982.
[37] L. B. Jaeckel. Some Flexible Estimates of Location. Ann. Math. Statist., 42:1540-1552, 1971.
[38] K. Kanatani and N. Ohta. Optimal Robot Self-Localization and Reliability Evaluation. In Proceedings of the 5th European Conference on Computer Vision (ECCV'98), volume 2, pages 796-808, Berlin, Germany, 1998. Springer-Verlag.
[39] S. Kay. Modern Spectral Estimation: Theory and Application. Prentice Hall, 1988.
[40] J. P. Kreiss and J. Franke. Bootstrapping Stationary Autoregressive Moving Average Models. J. Time Ser. Anal., 13:297-319, 1992.
[41] J.-P. Kreiss and G. Lien. Bootstrapping Autoregressions with Infinite Order. In Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP-94), volume 6, pages 97-100, Adelaide, South Australia, April 1994.
[42] E. Kreyszig. Introductory Functional Analysis with Applications. J. Wiley & Sons, 1989.
[43] J. Krolik, G. Niezgoda, and D. Swingler. A Bootstrap Approach for Evaluating Source Localization Performance on Real Sensor Array Data. In Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP-91), volume 2, pages 1281-1284, Toronto, Canada, May 1991.
[44] H. R. Künsch. The Jackknife and the Bootstrap for General Stationary Observations. The Ann. of Statist., 17:1217-1241, 1989.
[45] D. Lai and G. Chen. Computing the Distribution of the Lyapunov Exponent from Time Series: The One-Dimensional Case Study. International Journal of Bifurcation and Chaos in Applied Sciences and Engineering, 5:1721-1726, 1995.
[46] J. M. Lange, H. M. Voigt, S. Burkhardt, and R. Gobel. The Intelligent Inspection Engine - a Real-Time Real-World Visual Classifier System. In IEEE International Joint Conference on Neural Networks Proceedings, pages 1810-1815, New York, USA, 1998. IEEE.
[47] E. L. Lehmann. Testing Statistical Hypotheses. Wadsworth and Brooks, 1991.
[48] R. Y. Liu and K. Singh. Moving Blocks Jackknife and Bootstrap Capture Weak Dependence. In LePage and Billard, editors, Exploring the Limits of Bootstrap, New York, 1992. John Wiley.
[49] S. L. Marple Jr. Digital Spectral Analysis with Applications. Prentice-Hall, 1987.
[50] R. G. Miller. The Jackknife - A Review. Biometrika, 61:1-15, 1974.
[51] S. Nagaoka and O. Amai. A Method for Establishing a Separation in Air Traffic Control Using a Radar Estimation Accuracy of Close Approach Probability. J. Japan Inst. Navigation, 82:53-60, 1990.
[52] S. Nagaoka and O. Amai. Estimation Accuracy of Close Approach Probability for Establishing a Radar Separation Minimum. J. Navigation, 44:110-121, 1991.
[53] H. Ong, D. R. Iskander, and A. M. Zoubir. Detection of a Common Non-Gaussian Signal in Two Sensors Using the Bootstrap. In Proceedings of the IEEE Signal Processing Workshop on Higher-Order Statistics, pages 463-467, Banff, Alberta, Canada, 1997.
[54] H. Ong and A. M. Zoubir. Non-Gaussian Signal Detection from Multiple Sensors Using the Bootstrap. In Proceedings of the International Conference on Information, Communications & Signal Processing (ICICS'97), volume 1, pages 340-344, Singapore, September 1997.
[55] E. Paparoditis. Bootstrapping Autoregressive and Moving Average Parameter Estimates of Infinite Order Vector Autoregressive Processes. J. Multivar. Anal., 57:277-296, 1996.
[56] D. N. Politis and J. P. Romano. A General Resampling Scheme for Triangular Arrays of α-Mixing Random Variables with Application to the Problem of Spectral Density Estimation. The Ann. of Statist., 20:1985-2007, 1992.
[57] D. N. Politis and J. P. Romano. Bootstrap Confidence Bands for Spectra and Cross-Spectra. IEEE Transactions on Signal Processing, 40:1206-1215, 1992.
[58] D. N. Politis and J. P. Romano. The Stationary Bootstrap. Journal Am. Stat. Assoc., 89:1303-1313, 1994.
[59] B. Porat and B. Friedlander. Performance Analysis of Parameter Estimation Algorithms Based on High-Order Moments. International J. Adaptive Control and Signal Processing, 3:191-229, 1989.
[60] M. B. Priestley. Spectral Analysis and Time Series. Academic Press, 1981.
[61] D. C. Reid, A. M. Zoubir, and B. Boashash. The Bootstrap Applied to Passive Acoustic Aircraft Parameter Estimation. In Proceedings of the International Conference on Acoustics, Speech and Signal Processing (ICASSP-96), volume VI, pages 3153-3156, Atlanta, May 1996.
[62] D. C. Reid, A. M. Zoubir, and B. Boashash. Aircraft Parameter Estimation Based on Passive Acoustic Techniques Using the Polynomial Wigner-Ville Distribution. J. Acoust. Soc. Am., 102(1):207-223, 1997.
[63] M. Rosenblatt. Stationary Sequences and Random Fields. Birkhäuser, Boston, 1985.
[64] Special Session. The Bootstrap and Its Applications. In Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP-94), volume VI, pages 65-79, Adelaide, Australia, 1994.
[65] J. Shao. Bootstrap Model Selection. J. Am. Statist. Assoc., 91:655-665, 1996.
[66] J. Shao and D. Tu. The Jackknife and Bootstrap. Springer, 1995.
[67] Q. Shao and H. Yu. Bootstrapping the Sample Means for Stationary Mixing Sequences. Stochastic Processes and Their Appl., 48:175-190, 1993.
[68] R. H. Shumway. Replicated Time-Series Regression: An Approach to Signal Estimation and Detection. In D. R. Brillinger and P. R. Krishnaiah, editors, Handbook of Statistics, volume 3, pages 383-408. North-Holland, 1983.
[69] T. Subba Rao and M. M. Gabr. Testing for Linearity of a Stationary Time Series. J. Time Series Anal., 1:145-158, 1980.
[70] A. Swami, J. M. Mendel, and C. L. Nikias. Higher-Order Spectral Analysis Toolbox for Use with MATLAB. The MathWorks, Inc., 1995.
[71] L. Tauxe, N. Kylstra, and C. Constable. Bootstrap Statistics for Paleomagnetic Data. J. Geophysical Research, 96:11723-11740, 1991.
[72] S. Tretter. Estimating the Frequency of a Noisy Sinusoid. IEEE Transactions on Information Theory, 31:832-835, 1985.
[73] J. K. Tugnait. Two-Channel Test for Common Non-Gaussian Signal Detection. IEE Proceedings-F, 140:343-349, 1993.
[74] A. M. Yacout, S. Salvatores, and Y. G. Orechwa. Degradation Analysis Estimates of the Time-to-Failure Distribution of Irradiated Fuel Elements. Nuclear Technology, 113(2):177-189, 1996.
[75] G. A. Young. Bootstrap: More Than a Stab in the Dark? Statistical Science, 9:382-415, 1994.
[76] Y. Zhang, D. Hatzinakos, and A. N. Venetsanopoulos. Bootstrapping Techniques in the Estimation of Higher-Order Cumulants from Short Data Records. In Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP-93), volume IV, pages 200-203, Minnesota, 1993.
[77] A. M. Zoubir. Bootstrap: Theory and Applications. In F. T. Luk, editor, Advanced Signal Processing Algorithms, Architectures and Implementations, volume 2027, pages 216-235, San Diego, July 1993. Proceedings of SPIE.
[78] A. M. Zoubir. Multiple Bootstrap Tests and Their Application. In Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP-94), volume VI, pages 69-72, Adelaide, Australia, 1994.
[79] A. M. Zoubir and B. Boashash. Advanced Signal Processing for Over-The-Horizon Radar: Application of Bootstrap Methods. Technical report, Signal Processing Research Centre, Queensland University of Technology, March 1996. DSTO Report ref. 96/2 - Part I-IV.
[80] A. M. Zoubir and B. Boashash. The Bootstrap and Its Application in Signal Processing. IEEE Signal Processing Magazine, 15(1):56-76, 1998.
[81] A. M. Zoubir and J. F. Böhme. Bootstrap Multiple Hypotheses Tests for Optimizing Sensor Positions in Knock Detection. In Proceedings of the 13th World Congress on Computation and Applied Mathematics, IMACS'91, pages 955-958, Dublin, 1991.
[82] A. M. Zoubir and J. F. Böhme. Application of Higher Order Spectra to the Analysis and Detection of Knock in Combustion Engines. In B. Boashash, E. J. Powers, and A. M. Zoubir, editors, Higher-Order Statistical Signal Processing, pages 269-290. John Wiley, 1995.
[83] A. M. Zoubir and J. F. Böhme. Multiple Bootstrap Tests: An Application to Sensor Location. IEEE Transactions on Signal Processing, 43:1386-1396, 1995.
[84] A. M. Zoubir and D. R. Iskander. A Bispectrum Based Gaussianity Test Using the Bootstrap. In Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP-96), volume V, pages 3029-3032, Atlanta, 1996.
ICASSP-99 Tutorial, 15 Mar h 1999'&
$%
[85℄ A. M. Zoubir and D. R. Iskander. On Bootstrapping the Con�den eBands for Power Spe tra. Te hni al report, Signal Pro essing Resear hCentre, Queensland University of Te hnology, January 1996.[86℄ A. M. Zoubir and D. R. Iskander. Bootstrap Model Sele tion forPolynomial Phase Signals. In Pro eedings of the International Conferen eon A ousti s, Spee h and Signal Pro essing (ICASSP-98), pages2229{2232, Seattle, USA, May 1998.[87℄ A. M. Zoubir and D. R. Iskander. Bootstrapping Bispe tra: AnAppli ation to Testing for Departure from Gaussianity of StationarySignals. IEEE Transa tions on Signal Pro essing, Mar h 1999.[88℄ A. M. Zoubir and D. R. Iskander. Bootstrap Based Modelling of a Classof Non-Stationary Signals. IEEE Transa tion on Signal Pro essing,(under review).
A.M. Zoubir, Curtin University of Te hnology 232