Ill-Posed Inverse Problems: Algorithms and Applications
Total Least Squares

Rosemary Renaut, with Dr Hongbin Guo and Wolfgang Stefan
Department of Mathematics and Statistics, Arizona State University

February 16, 2006



Page 2: Outline

Inverse Problems

Total Least Squares Methods

Results

Conclusions

Other Research


Page 3: Parameter Estimation Problem

• Classical approach: linear least squares (A is exactly specified):

  x_LS = argmin_x ‖Ax − b‖_2^2

• The solution is the orthogonal projection of b onto the range of A.
• Dense A: form the QR decomposition A = QR and solve Rx = Q^T b directly.
• Sparse A: use iterative techniques (CG and other Krylov subspace methods, etc.).
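The dense-A route above can be sketched numerically. This is a minimal illustration (the matrix and right-hand side are made up), showing the QR factor-and-back-substitute pattern with NumPy:

```python
import numpy as np

# Sketch of the dense-A route: form A = QR, then solve R x = Q^T b.
# A, x_true, and b are illustrative, not from the talk.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
x_true = np.arange(1.0, 6.0)
b = A @ x_true                      # noise-free right-hand side

Q, R = np.linalg.qr(A)              # reduced QR: Q is 20x5, R is 5x5
x_ls = np.linalg.solve(R, Q.T @ b)  # solve the triangular system R x = Q^T b

print(np.allclose(x_ls, x_true))    # noise-free data recovers x
```

For sparse A one would instead hand `A` to an iterative solver and never form Q explicitly.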



Page 6: Integral Equations

  ∫_Ω input × system dΩ = output

• Given noisy output, determine the input and/or the system.
• General applications: image processing, signal identification.
• Our motivation: seismic and medical signal restoration.



Page 10: Invariant Model

• Signal degradation is modeled as a convolution

  g = f ⊗ h + n

  where g is the blurred signal, f is the unknown signal, h is the point spread function (PSF), assumed known, and n is noise.
• Matrix formulation: g = Hf + n, where H is Toeplitz (structured).
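The Toeplitz structure of H can be made concrete with a small sketch (signal and PSF values here are illustrative): each column of H is a shifted copy of the PSF, so multiplication by H reproduces the convolution.

```python
import numpy as np
from scipy.linalg import toeplitz

# Sketch: discrete convolution with PSF h = multiplication by a Toeplitz H.
f = np.array([0., 0., 1., 1., 1., 0., 0.])   # illustrative unknown signal
h = np.array([0.25, 0.5, 0.25])              # illustrative 3-tap PSF

# Build H so that (H f)[i] = sum_k h[k] f[i + 1 - k] (zero-padded edges):
# the diagonal carries h[1], the sub-diagonal h[2], the super-diagonal h[0].
col = np.r_[h[1], h[2], np.zeros(len(f) - 2)]
row = np.r_[h[1], h[0], np.zeros(len(f) - 2)]
H = toeplitz(col, row)

g = H @ f
print(np.allclose(g, np.convolve(f, h, mode="same")))  # matches convolution
```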



Page 12: Example of Convolution

g = f ⊗ h + n


Page 13: Inverse with Known PSF

• Find f from g = f ⊗ h + n, given g and h, with n unknown.
• Assuming normally distributed n yields the estimator

  f = argmin_f ‖g − f ⊗ h‖_2^2

• Reconstruction with n ~ N(0, 10^−7).
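A minimal sketch of this unregularized estimator, assuming a deliberately well-conditioned 3-tap blur (signal, PSF, and noise level are illustrative): at the slide's tiny noise level, plain least squares succeeds.

```python
import numpy as np

# Sketch: minimize ‖g − H f‖_2^2 directly, with H a Toeplitz blur matrix.
rng = np.random.default_rng(1)
n_pts = 50
f_true = (np.abs(np.arange(n_pts) - n_pts / 2) < 10).astype(float)  # box signal

# Mild tridiagonal blur (well-conditioned on purpose, for this sketch).
H = (0.6 * np.eye(n_pts)
     + 0.2 * np.eye(n_pts, k=1)
     + 0.2 * np.eye(n_pts, k=-1))

g = H @ f_true + rng.normal(0.0, 1e-7, n_pts)  # noise level as on the slide
f_hat, *_ = np.linalg.lstsq(H, g, rcond=None)
print(np.max(np.abs(f_hat - f_true)) < 1e-4)   # accurate at this noise level
```

With a realistic, wider PSF the matrix H becomes severely ill-conditioned, and even small noise is amplified in H⁻¹g; that failure is the ill-posedness the next slides address.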



Page 16: Ill-posedness

• Add more information about the signal, e.g. statistical properties or information about its structure (e.g. sparse deconvolution or total variation deconvolution).
• Regularize:

  f = argmin_f ‖g − f ⊗ h‖_2^2 + λR(f),

  where R(f) is the penalty term and λ is a penalty parameter.
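For the quadratic (Tikhonov) penalty R(f) = ‖Lf‖₂² the regularized estimator has a closed form, f = (HᵀH + λLᵀL)⁻¹Hᵀg, which can be sketched directly. The blur matrix, first-difference operator L, λ, and data below are all illustrative choices, not values from the talk:

```python
import numpy as np

# Sketch of the regularized solve with a Tikhonov penalty ‖L f‖_2^2.
rng = np.random.default_rng(2)
n_pts = 40
f_true = np.sin(np.linspace(0.0, np.pi, n_pts))   # smooth test signal

H = (0.6 * np.eye(n_pts)
     + 0.2 * np.eye(n_pts, k=1)
     + 0.2 * np.eye(n_pts, k=-1))                 # tridiagonal blur matrix
g = H @ f_true + rng.normal(0.0, 1e-3, n_pts)

L = np.eye(n_pts - 1, n_pts, k=1) - np.eye(n_pts - 1, n_pts)  # first difference
lam = 1e-2                                        # hand-picked penalty parameter
f_hat = np.linalg.solve(H.T @ H + lam * (L.T @ L), H.T @ g)
print(np.max(np.abs(f_hat - f_true)))             # small reconstruction error
```

In practice λ would be chosen systematically, e.g. by the L-curve approach mentioned later in the talk.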



Page 19: Standard Methods

• Tikhonov (TK):

  R(f) = TK(f) = ∫_Ω |∇f(x)|^2 dx.

• Total Variation (TV):

  R(f) = TV(f) = ∫_Ω |∇f(x)| dx.

• Sparse deconvolution (L1):

  R(f) = ‖f‖_1 = ∫_Ω |f(x)| dx.
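On a discretized signal these penalties become sums over differences, and a tiny sketch (signals invented for illustration, grid spacing absorbed into the sums) shows why TV tolerates edges while TK does not:

```python
import numpy as np

# Discrete sketches of the three penalties from the slide.
def tk(f):   # Tikhonov: sum of squared gradients
    return np.sum(np.diff(f) ** 2)

def tv(f):   # total variation: sum of absolute gradients
    return np.sum(np.abs(np.diff(f)))

def l1(f):   # sparsity: the 1-norm of the signal itself
    return np.sum(np.abs(f))

step = np.r_[np.zeros(5), np.ones(5)]   # one jump of height 1
ramp = np.linspace(0.0, 1.0, 10)        # same endpoints, gradual rise

# TV charges both signals the total rise (both ≈ 1), so edges are cheap;
# TK squares each difference, so the jump is far more expensive than the ramp.
print(tv(step), tv(ramp))
print(tk(step), tk(ramp))
```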



Page 22: Comparing TV and TK Regularization


Page 23: Cost Functional

  f = argmin_f ‖g − f ⊗ h‖_2^2 + λR(f)

• λ governs the trade-off between the fit to the data and the smoothness of the reconstruction; it can be picked by the L-curve approach.
• TV yields a piecewise constant reconstruction and preserves the edges of the signal.
• TK yields a smooth reconstruction.
• To find the minimum we use a limited-memory BFGS method.
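The limited-memory BFGS step can be sketched with SciPy's L-BFGS-B routine on the differentiable Tikhonov cost (blur matrix, λ, and data below are illustrative, not the talk's):

```python
import numpy as np
from scipy.optimize import minimize

# Sketch: minimize J(f) = ‖g − H f‖_2^2 + λ‖L f‖_2^2 with L-BFGS-B.
rng = np.random.default_rng(3)
n_pts = 30
H = (0.6 * np.eye(n_pts)
     + 0.2 * np.eye(n_pts, k=1)
     + 0.2 * np.eye(n_pts, k=-1))                 # illustrative blur matrix
f_true = np.sin(np.linspace(0.0, np.pi, n_pts))
g = H @ f_true + rng.normal(0.0, 1e-3, n_pts)
L = np.eye(n_pts - 1, n_pts, k=1) - np.eye(n_pts - 1, n_pts)
lam = 1e-2

def cost_grad(f):
    r = H @ f - g                                  # data residual
    p = L @ f                                      # penalized gradient of f
    J = r @ r + lam * (p @ p)
    grad = 2.0 * (H.T @ r) + 2.0 * lam * (L.T @ p)
    return J, grad

res = minimize(cost_grad, np.zeros(n_pts), jac=True, method="L-BFGS-B")
print(res.success, np.max(np.abs(res.x - f_true)))
```

The same driver handles the TV cost once its non-differentiable term is smoothed, which is the usual practical workaround.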



Page 27: Optimization

• TK is a linear least squares (LS) problem:

  f = argmin_f ‖g − Hf‖_2^2 + λ‖Lf‖_2^2

• The TV objective function is non-differentiable:

  J(f) = ‖g − Hf‖_2^2 + λ‖Lf‖_1

• Evaluation of the objective function and its gradient is cheap (some FFTs and sparse matrix-vector multiplications).
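Why the FFT makes these evaluations cheap can be sketched for the periodic case: when H is circulant (circular convolution with h), both Hf and Hᵀr cost O(n log n) via the FFT instead of O(n²) as dense products. The PSF and signal below are illustrative:

```python
import numpy as np

# Sketch: FFT evaluation of H f and H^T r for a circulant blur H.
rng = np.random.default_rng(4)
n_pts = 64
h = np.zeros(n_pts)
h[0], h[1], h[-1] = 0.6, 0.2, 0.2        # symmetric 3-tap circulant PSF
f = rng.standard_normal(n_pts)

h_hat = np.fft.fft(h)                     # the symbol of H, computed once

def H_mul(v):        # H v: circular convolution with h
    return np.real(np.fft.ifft(h_hat * np.fft.fft(v)))

def Ht_mul(v):       # H^T v: circular correlation, i.e. conjugate symbol
    return np.real(np.fft.ifft(np.conj(h_hat) * np.fft.fft(v)))

# Check against the explicit circulant matrix H[i, j] = h[(i - j) mod n].
H = np.array([[h[(i - j) % n_pts] for j in range(n_pts)]
              for i in range(n_pts)])
print(np.allclose(H_mul(f), H @ f), np.allclose(Ht_mul(f), H.T @ f))
```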



Page 30: Simulated PET

• Blur a segmented MRI scan using a Gaussian PSF.
• Add Gaussian noise to the resulting simulated PET image.
• Note: the PSF is known; regularize with TV.



Page 33: Real PET Data

• Reconstruction done using filtered back projection.
• The PSF is estimated by a Gaussian.
• TV regularization.


Page 34: Observations

• Image improvement is possible even with a rough estimate of the PSF (non-blind deconvolution).
• Total variation regularization (piecewise constant solution) is appropriate: intensity levels depend on the tissue type.
• Further improvement requires a better approximation of the PSF.
• Artifacts and noise increase. (More post-processing can improve this.)



Page 38: Outline

Inverse Problems
  Linear Parameter Estimation
  Integral Equations
  Structured Inverse Problem
  Regularization
  PET Examples
  Unstructured Inverse Problems

Total Least Squares Methods

Results

Conclusions

Other Research

Page 39: Matrix Formulation: H not Toeplitz

• Consider g = Hf + n, where either H is structured but not known, or H is estimated and unstructured.
• Example: dynamic PET. Estimate parameters K = (K1, k2, k3, k4), which satisfy

  y(t) = u(t) ⊗ (c1(K) e^{−λ1(K)t} + c2(K) e^{−λ2(K)t}).

  The nonlinear parameter estimation can be converted to linear form:

  H = [∫(u); ∫∫(u); ∫(y); ∫∫(y)],

  g = Hf, but f = f(K) is nonlinear.
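One way to read the linearized form is that the columns of H are single and double cumulative integrals of the sampled input u(t) and output y(t). The sketch below is an assumed discretization using trapezoidal quadrature; the input and output curves are invented, and the linear solve only illustrates the shape of the problem, not the talk's actual estimation procedure:

```python
import numpy as np

# Sketch: assemble H = [∫u, ∫∫u, ∫y, ∫∫y] by cumulative trapezoidal
# quadrature, then estimate the linear parameters f (which are nonlinear
# functions of K) by least squares. All data here are hypothetical.
def cumint(v, t):
    """Cumulative trapezoidal integral of samples v over grid t."""
    out = np.zeros_like(v, dtype=float)
    out[1:] = np.cumsum(0.5 * (v[1:] + v[:-1]) * np.diff(t))
    return out

t = np.linspace(0.0, 1.0, 50)
u = np.exp(-t)              # hypothetical input function u(t)
y = 1.0 - np.exp(-t)        # hypothetical measured output y(t)

H = np.column_stack([cumint(u, t),
                     cumint(cumint(u, t), t),
                     cumint(y, t),
                     cumint(cumint(y, t), t)])

f_hat, *_ = np.linalg.lstsq(H, y, rcond=None)   # linear LS estimate of f(K)
print(H.shape, f_hat.shape)
```

Because the columns built from noisy y are themselves error-contaminated, this is exactly the errors-in-variables setting that motivates total least squares in the next section.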



Page 41: Outline

Inverse Problems

Total Least Squares Methods
  History
  Total Least Squares
  Regularization
  Solution Techniques
  Theoretical Results
  Algorithm

Results

Conclusions

Other Research

Page 42: History

History

• Statistics: errors-in-variables regression (Gleser, 1981)
  • Measurement error models (Fuller, 1987)
  • Orthogonal distance regression (Adcock, 1878)
  • Generalized least squares (Madansky, 1959, ...)
• Numerical Analysis: Golub & Van Loan, 1981
• System Identification: global linear least squares (Staar, 1981)
  • Eigenvector method (Levin, 1964)
  • Koopmans-Levin method (Fernando & Nicholson, 1985)
  • Compensated least squares (Stoica & Söderström, 1982)
• Signal Processing: minimum norm method (Kumaresan & Tufts, 1983)


Total Least Squares

  min ‖(E, f)‖_F subject to (A + E)x = b + f

• Classical algorithm (Golub, Reinsch, Van Loan): solve

  [Â | b̂] (x_TLS; −1) = 0,  Â = A + E, b̂ = b + f.

• SVD solution: using the SVD of the augmented matrix [A | b],

  (x_TLS; −1) = −(1 / v_{n+1,n+1}) v_{n+1}.

  • Direct: compute the SVD and solve.
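A minimal sketch of the direct SVD route just described, assuming NumPy; the nongeneric-case guard and its tolerance are illustrative choices.

```python
import numpy as np

def tls(A, b):
    """Total least squares via the SVD of the augmented matrix [A | b]:
    x_TLS = -(1 / v_{n+1,n+1}) * v_{1:n, n+1}."""
    n = A.shape[1]
    _, _, Vt = np.linalg.svd(np.column_stack([A, b]))
    v = Vt[-1]                     # right singular vector of the smallest sigma
    if abs(v[n]) < 1e-12:          # v_{n+1,n+1} ~ 0: nongeneric TLS problem
        raise ValueError("nongeneric TLS problem")
    return -v[:n] / v[n]

# noise-free check: TLS recovers the exact solution
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 3))
x_true = np.array([1.0, -2.0, 0.5])
print(np.allclose(tls(A, A @ x_true), x_true))  # True
```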


Rayleigh Quotient Formulation for TLS

• An iterative approach:

  x_TLS = argmin_x φ(x) = argmin_x ‖Ax − b‖² / (1 + ‖x‖²).

• φ is the Rayleigh quotient of the matrix [A, b].

• TLS minimizes the sum of squared normalized residuals.
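A quick numerical sanity check of this formulation, assuming NumPy: at the SVD-based TLS solution, φ equals the smallest squared singular value of [A | b]. The random test data are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((30, 4))
b = A @ rng.standard_normal(4) + 0.01 * rng.standard_normal(30)

def phi(x):
    """phi(x) = ||Ax - b||^2 / (1 + ||x||^2), the Rayleigh quotient of
    [A | b]^T [A | b] evaluated at (x, -1)."""
    return np.linalg.norm(A @ x - b) ** 2 / (1.0 + np.linalg.norm(x) ** 2)

# TLS solution and singular values from the SVD of [A | b]
_, s, Vt = np.linalg.svd(np.column_stack([A, b]))
x_tls = -Vt[-1, :-1] / Vt[-1, -1]
print(np.isclose(phi(x_tls), s[-1] ** 2))  # True
```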


An Eigenvalue Problem for TLS

• Björck, Heggernes and Matstoms (BHM), 2000: an eigenvalue equation for TLS,

  [AᵀA  Aᵀb; bᵀA  bᵀb] (x; −1) = σ² (x; −1).

• σ² is to be minimized. The system matrix is constant.

• The first block equation gives the normal equations for TLS:

  (AᵀA − σ²I) x = Aᵀb.

• Use Rayleigh quotient iteration to find the minimal eigenvalue.

• Solve the shifted normal equations by preconditioned conjugate gradients at each iteration.
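A sketch of that iteration under stated assumptions: a dense solve stands in for preconditioned conjugate gradients on the shifted normal equations, the shift σ² is recomputed from the current iterate, and the function name and iteration count are illustrative.

```python
import numpy as np

def tls_shifted_ne(A, b, iters=20):
    """Fixed-point sketch of the BHM idea: repeatedly solve the shifted
    normal equations (A^T A - sigma^2 I) x = A^T b, where sigma^2 is the
    Rayleigh quotient ||Ax - b||^2 / (1 + ||x||^2) of the current iterate."""
    x = np.linalg.lstsq(A, b, rcond=None)[0]   # least squares start (sigma = 0)
    for _ in range(iters):
        sigma2 = np.linalg.norm(A @ x - b) ** 2 / (1.0 + np.linalg.norm(x) ** 2)
        # dense solve; BHM use preconditioned conjugate gradients here
        x = np.linalg.solve(A.T @ A - sigma2 * np.eye(A.shape[1]), A.T @ b)
    return x

# agrees with the SVD solution on a small-noise problem
rng = np.random.default_rng(2)
A = rng.standard_normal((50, 3))
b = A @ np.array([2.0, -1.0, 0.5]) + 1e-3 * rng.standard_normal(50)
v = np.linalg.svd(np.column_stack([A, b]))[2][-1]
print(np.allclose(tls_shifted_ne(A, b), -v[:-1] / v[-1], atol=1e-8))  # True
```

For small residuals the shift is tiny relative to the spectrum of AᵀA, so this fixed point contracts very quickly.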


Tikhonov TLS

Regularization Stabilize TLS with a realistic constraint on the solution, ‖Lx‖ ≤ δ. Here δ is prescribed from known information on the solution, and L is typically a differential operator.

Reformulation with Lagrange multiplier If the constraint is active, the solution x* satisfies the normal equations

  (AᵀA + λ_I I + λ_L LᵀL) x* = Aᵀb,  (x*)ᵀ LᵀL x* − δ² = 0,

  λ_I = −‖Ax* − b‖² / (1 + ‖x*‖²),  λ_L = µ(1 + ‖x*‖²),

  µ = −(1/δ²) ( bᵀ(Ax* − b) / (1 + ‖x*‖²) + ‖Ax* − b‖² / (1 + ‖x*‖²)² ).

Golub, Hansen and O'Leary, 1999. The solution technique is parameter dependent: λ_L, λ_I, δ, µ.


Rayleigh Quotient Formulation for RTLS

• Recall

  x_TLS = argmin_x φ(x) = argmin_x ‖Ax − b‖² / (1 + ‖x‖²).

• Formulate regularization for TLS in RQ form:

  min_x φ(x) subject to ‖Lx‖ ≤ δ.

• Augmented Lagrangian:

  L(x, µ) = φ(x) + µ(‖Lx‖² − δ²).


Eigenvalue Problem RTLS: Renaut & Guo (SIAM 2004)

  B(x*) (x*; −1) = −λ_I (x*; −1),

  B(x*) = [ AᵀA + λ_L(x*) LᵀL,  Aᵀb;  bᵀA,  −λ_L(x*) γ + bᵀb ],

  λ_L = µ(1 + ‖x*‖²).

• Seek the minimal eigenpair for B. Note that B is not constant.

• Specify the constraint ‖Lx*‖² ≤ δ²; use γ = δ² or γ = ‖Lx*‖².


Development

• At a solution the constraint is active: measure the normalized discrepancy

  g(x) = (‖Lx‖² − δ²) / (1 + ‖x‖²).

• Denote B(θ) = M + θN,  N = [ LᵀL  0; 0  −δ² ].

• Denote the eigenpair for the smallest eigenvalue of B(θ) by ϱ_{n+1}, (x_θᵀ, −1)ᵀ.

• Reformulation: for a constant δ, find a θ such that g(x_θ) = 0.
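The pieces above can be sketched numerically as follows, assuming NumPy and taking M = [AᵀA, Aᵀb; bᵀA, bᵀb] as in the TLS eigenvalue equation; the function names and dense eigensolver are illustrative choices.

```python
import numpy as np

def B_of_theta(A, b, L, delta, theta):
    """B(theta) = M + theta * N with M = [[A^T A, A^T b], [b^T A, b^T b]]
    and N = [[L^T L, 0], [0, -delta^2]]."""
    n = A.shape[1]
    M = np.block([[A.T @ A, (A.T @ b)[:, None]],
                  [(b @ A)[None, :], np.array([[b @ b]])]])
    N = np.zeros((n + 1, n + 1))
    N[:n, :n] = L.T @ L
    N[n, n] = -delta ** 2
    return M + theta * N

def x_theta(A, b, L, delta, theta):
    """First n components of the eigenvector for the smallest eigenvalue
    of B(theta), scaled so that its last component is -1."""
    w = np.linalg.eigh(B_of_theta(A, b, L, delta, theta))[1][:, 0]
    return -w[:-1] / w[-1]

def g(x, L, delta):
    """Normalized discrepancy (||Lx||^2 - delta^2) / (1 + ||x||^2)."""
    return (np.linalg.norm(L @ x) ** 2 - delta ** 2) / (1.0 + np.linalg.norm(x) ** 2)

# at theta = 0, x_theta reduces to the unregularized TLS solution
rng = np.random.default_rng(4)
A = rng.standard_normal((15, 3))
b = A @ np.array([1.0, 2.0, -1.0]) + 0.01 * rng.standard_normal(15)
v = np.linalg.svd(np.column_stack([A, b]))[2][-1]
print(np.allclose(x_theta(A, b, np.eye(3), 1.0, 0.0), -v[:-1] / v[-1]))  # True
```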


Matrix B

Lemma Assume bᵀA ≠ 0 and N(A) ∩ N(L) = {0}. Then the smallest eigenvalue of B(θ) is simple.

Lemma If [A, b] is a full-rank matrix, there exists one and only one positive number, denoted θ_c, such that B(θ_c) is singular, and
  (i) the null eigenvalue of B(θ_c) is simple;
  (ii) for 0 ≤ θ < θ_c, B(θ) is positive definite; and
  (iii) for θ > θ_c, B(θ) has exactly one negative eigenvalue.


Uniqueness

Lemma If bᵀA ≠ 0 and [A, b] is full-rank, then the solution of g(x_θ) = 0 is unique, and
  (i) there exists a λ*_L ∈ [0, θ_c] which solves g(x_θ) = 0;
  (ii) for λ_L ∈ (0, λ*_L), g(x_{λ_L}) > 0, and for λ_L ∈ (λ*_L, ∞), g(x_{λ_L}) < 0.

Observe This result shows that there is a unique solution to our problem, and that an algorithm for finding it should both update the Lagrange parameter λ_L and monitor the sign of g(x_{λ_L}).


The Update Equation for θ = λ_L

  λ_L^{(k+1)} = λ_L^{(k)} + ι^{(k)} (λ_L^{(k)} / δ²) g(x_{λ_L^{(k)}}),

  with 0 < ι^{(k)} ≤ 1 chosen such that g(x_{λ_L^{(k+1)}}) g(x_{λ_L^{(0)}}) > 0.

Theorem Given λ_L^{(0)} > 0, the iteration converges to the unique solution λ*_L.

Observe B(λ_L^{(k)}) is always positive definite. For 0 < λ_L^{(0)} < θ_c, the iteration is monotonically increasing or decreasing depending on whether λ_L^{(0)} < λ*_L or λ_L^{(0)} > λ*_L.


EXACT RTLS: Alternating Iteration on λ_L and x

Algorithm Given δ and λ_L^{(0)} > 0, calculate the eigenpair (ϱ_{n+1}^{(0)}, x^{(0)}). Set k = 0. Update λ_L^{(k)} and x^{(k)} until convergence.

1. While not converged Do: set ι^{(k)} = 1.
   1.1 Inner iteration: until g(x^{(k+1)}) g(x^{(0)}) > 0
       1.1.1 Update λ_L^{(k+1)}.
       1.1.2 Calculate the smallest eigenvalue ϱ_{n+1}^{(k)} and the corresponding eigenvector [(x^{(k+1)})ᵀ, −1]ᵀ of the matrix B(λ_L^{(k)}).
       1.1.3 If g(x^{(k+1)}) g(x^{(0)}) < 0, set ι^{(k)} = ι^{(k)}/2; else Break.
   1.2 Test for convergence. If converged, Break; else set k = k + 1.
2. End Do. x_RTLS = x^{(k)}.
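A compact sketch of this alternating iteration, assuming NumPy: a dense symmetric eigensolver stands in for whatever large-scale eigensolver would be used in practice, and the safeguard constants, tolerances, and function names are illustrative, not the authors' implementation.

```python
import numpy as np

def rtls(A, b, L, delta, lam0=1.0, tol=1e-10, max_iter=200):
    """Sketch of the EXACT RTLS alternating iteration: update lambda_L from
    the normalized discrepancy g, then recompute the smallest eigenpair of
    B(lambda_L) to obtain the next iterate x."""
    n = A.shape[1]
    M = np.block([[A.T @ A, (A.T @ b)[:, None]],
                  [(b @ A)[None, :], np.array([[b @ b]])]])
    N = np.zeros((n + 1, n + 1))
    N[:n, :n] = L.T @ L
    N[n, n] = -delta ** 2

    def x_of(lam):
        w = np.linalg.eigh(M + lam * N)[1][:, 0]  # smallest eigenpair of B(lam)
        return -w[:-1] / w[-1]

    def g(x):
        return (np.linalg.norm(L @ x) ** 2 - delta ** 2) / (1.0 + np.linalg.norm(x) ** 2)

    lam = lam0
    x = x_of(lam)
    g0 = g(x)
    for _ in range(max_iter):
        if abs(g(x)) < tol:
            break
        iota = 1.0
        while True:                               # inner iteration on iota
            lam_new = lam + iota * (lam / delta ** 2) * g(x)
            x_new = x_of(lam_new)
            if g(x_new) * g0 > 0 or abs(g(x_new)) < tol or iota < 1e-8:
                break
            iota /= 2.0
        lam, x = lam_new, x_new
    return x

# with L = I the converged solution sits on the constraint ||x|| = delta
rng = np.random.default_rng(3)
A = rng.standard_normal((20, 4))
b = A @ np.array([1.5, -1.0, 0.5, 1.0]) + 0.01 * rng.standard_normal(20)
x = rtls(A, b, np.eye(4), delta=1.0)
print(np.linalg.norm(x))
```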


Inner Iteration Solve I

• Find the eigenvector y^(k,j+1) = ((x^(k,j+1))ᵀ, −1)ᵀ such that

  B(λ_L^(k)) y^(k,j+1) = β^(k,j) y^(k,j),

  B(λ_L^(k)) = [ J^(k,j)  Aᵀb ; bᵀA  η^(k,j) ],

  J^(k,j) = AᵀA + λ_L^(k) LᵀL,   η^(k,j) = bᵀb − λ_L^(k) δ².


Inner Iteration Solve II

• Apply block Gaussian elimination:

  [ J^(j)  Aᵀb ; 0  τ_j ] [ x^(j+1) ; −1 ] = β_j [ x^(j) ; −(z^(j))ᵀ x^(j) − 1 ],

• where

  τ_j = η_j − bᵀA z^(j),   x^(j+1) = z^(j) + β_j u^(j),   β_j = τ_j / ((z^(j))ᵀ x^(j) + 1).

• z^(j) and u^(j) solve J^(j) z^(j) = Aᵀb and J^(j) u^(j) = x^(j).
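A minimal dense sketch of this elimination, assuming generic data: the explicit eigen-solve on B(λ) is replaced by two linear solves with the symmetric matrix J. The usage test verifies the underlying identity B(λ)[x^(j+1); −1] = β_j [x^(j); −1], i.e. that the two solves implement exactly one inverse-iteration step on B(λ).

```python
import numpy as np

def inner_step(A, b, L, lam, delta, x):
    """One inner-iteration step via block Gaussian elimination.

    Dense sketch; a practical code would reuse a factorization of J
    across the inner iterations rather than calling solve twice.
    """
    J = A.T @ A + lam * (L.T @ L)
    Atb = A.T @ b
    eta = b @ b - lam * delta**2
    z = np.linalg.solve(J, Atb)        # J z = A^T b
    u = np.linalg.solve(J, x)          # J u = x^(j)
    tau = eta - Atb @ z                # tau_j = eta_j - b^T A z^(j)
    beta = tau / (z @ x + 1.0)         # beta_j = tau_j / ((z^(j))^T x^(j) + 1)
    return z + beta * u                # x^(j+1) = z^(j) + beta_j u^(j)
```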


Practical Details

• The sign condition: If the condition g(x^(k+1)) g(x^(0)) > 0 is relaxed, a divergent sequence may result.

• Inexact update at the inner iteration: At the inner iteration it is crucial to maintain the sign condition on g. Hence it is sufficient to find an approximate eigenpair for which the sign condition holds.

• Do not shift: For the inexact update we do not use the shift, because the system matrix B is parameter dependent and it makes no practical sense to force convergence with the shift.

• Choice of γ: The theory is developed for γ = δ², but the algorithm can use γ = ‖Lx^(k,j)‖². If blow-up does not occur, the algorithm converges more quickly.


Computational Considerations: Generalized SVD

• All approaches need to solve the normal equations Jw = f,

  (AᵀA + λ_L LᵀL) w = f.

• GSVD: For the augmented matrix [A; L], 2m²n + 15n³ flops:

  A = U [ Σ  0 ; 0  I_{n−p} ] X⁻¹,   L = V [ M  0 ] X⁻¹.

  U, V are m × n and p × p, resp., orthonormal; X is n × n nonsingular; Σ, M are diagonal, p × p, with entries σ_i, µ_i, resp. Then

  [ ( Σ²  0 ; 0  I_{n−p} ) + λ_L ( M²  0 ; 0  0 ) ] X⁻¹ w = Xᵀ f.

• Efficient: With the GSVD in hand, the direct solution costs only 8n² flops; solve triangular systems for each λ_L.
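numpy does not expose the GSVD directly, so as a hedged illustration of the pay-the-factorization-once idea, here is the special case L = I, where the GSVD of (A, L) reduces to the plain SVD of A: one O(mn²) factorization, then each new λ costs only a diagonal scaling and two rotations.

```python
import numpy as np

def solve_for_all_lambdas(A, f, lambdas):
    """Solve (A^T A + lam I) w = f for many lam from a single SVD of A.

    Special case L = I of the GSVD approach on the slide: after the
    factorization, (A^T A + lam I) w = f becomes the diagonal system
    (s_i^2 + lam) (V^T w)_i = (V^T f)_i for each lam.
    """
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    Vtf = Vt @ f                                   # rotate once
    return [Vt.T @ (Vtf / (s**2 + lam)) for lam in lambdas]
```

Each returned w agrees with a direct solve of the normal equations, which the test checks for two values of λ.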


Experiments: Hansen's Regularization Toolbox

• ilaplace, phillips and shaw: all discretizations of continuous ill-posed Fredholm integral equations, constructed by quadrature.
• Generate A and x*, calculate b = Ax* exactly.
• Scale so that ‖A‖_F = ‖b‖₂ = 1.
• Add 5% Gaussian noise to b and A.
• L approximates the first derivative, (n − 1) × n.
• Tolerance τ = 10⁻⁴.
• Estimated solution x_est; relative error measured with respect to x*.
• Report the number of system solves, K.
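A sketch of this experimental setup; the ilaplace, phillips and shaw operators come from Hansen's Regularization Tools and are not reconstructed here, so any matrix A and exact x* stand in:

```python
import numpy as np

def make_noisy_problem(A, x_star, noise=0.05, seed=0):
    """Build a test problem as on the slide: b = A x* exactly, scale so
    ||A||_F = ||b||_2 = 1, then perturb both A and b by 5% Gaussian noise
    (noise normalized so the perturbation has relative size `noise`).
    """
    rng = np.random.default_rng(seed)
    b = A @ x_star                            # exact right-hand side
    A = A / np.linalg.norm(A, 'fro')          # ||A||_F = 1
    b = b / np.linalg.norm(b)                 # ||b||_2 = 1
    E = rng.standard_normal(A.shape)
    e = rng.standard_normal(b.shape)
    A_noisy = A + noise * E / np.linalg.norm(E, 'fro')
    b_noisy = b + noise * e / np.linalg.norm(e)
    return A_noisy, b_noisy
```

The (n − 1) × n first-difference operator L of the slide can be formed as `np.eye(n - 1, n, 1) - np.eye(n - 1, n)`.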


Inexact and Exact Algorithms: convergence of −λ_I^(k)

[Figure] The dotted and dashed lines show convergence for the exact and inexact algorithms, resp. The first row shows the whole convergence history; the second row shows the first 10 steps. Left to right: ilaplace, shaw and phillips.


Inclusion of Shift: convergence history of −λ_I

[Figure] Using γ = δ² (top) and γ = ‖Lx^(k,j)‖² (bottom); curves show shifted, no shift, and shift after the first step. Left to right: ilaplace, shaw and phillips.


Comparison for γ: δ² or ‖Lx^(k,j)‖²

[Figure] Top: ilaplace and phillips; below: shaw. Solutions are shown on the left and the L-curve on the right.


Regularized Total Least Squares

Numerical experiments have been presented which verify the theory:

• The algorithm provides an efficient and practical approach for solving the RTLS problem when a good estimate of the physical parameter is provided.
• If blow-up occurs, a bisection search may yield a better solution satisfying the constraint condition.
• If no constraint information is provided, the L-curve technique can be successfully employed.
• The algorithm performs better than the quadratic eigenvalue problem (QEP) method in all of our tests.


Related Work

• Domain decomposition (PVDTLS) for large-scale TLS (2005).
• Applications:
  • Signal and image restoration (Geophysical Journal 2006)
  • Microarray analysis and pattern recognition: support vector machines
• Current directions:
  • Domain decomposition for large-scale RTLS
  • Total variation TLS
  • Large-scale implementation
  • Multiple right-hand sides


Removal of Noise from Difference Images for Longitudinal Studies

• Application: two PET scans of the same patient at different times.
• Question: are there any anatomical or functional changes?


Difference Image after Alignment by Maximization of Mutual Information

• Scans from different days have to be aligned.
• Noise and artifacts change from scan to scan.
• Small changes are hard to locate in the difference image.


Decomposed Difference Image: wavelets


Difference Image and u Part

[Figure panels: Difference Image | u Part]


u and v Part

[Figure panels: u Part | v Part]


Thank You

Any Questions?
