2 Conjugate Gradient Algorithm
TRANSCRIPT
8/4/2019 2 Conjugate Gradient Algorithm
We call two vectors x and y A-conjugate if x^T A y = 0. We can also see that if we denote the characteristic polynomial of A as

    p(\lambda) = \det(A - \lambda I),    (179)

then the Cayley-Hamilton theorem tells us that p(A) = 0 too. If we suppose p(\lambda) = \sum_i p_i \lambda^i, then

    p_n A^n + p_{n-1} A^{n-1} + \cdots + p_1 A + p_0 I = 0,    (180)

and so, as the matrix is non-singular (p_0 = p(0) = \det A \neq 0),

    A^{-1} = -\frac{1}{p_0} \sum_{i=1}^{n} p_i A^{i-1},    (181)

and the solution x must be

    x = A^{-1} b = \sum_{i=0}^{n-1} \chi_i A^i b,    (182)

where \chi_i = -p_{i+1}/p_0. Thus the solution can be expressed as a linear combination of the vectors A^i b, i = 0, \ldots, n-1. Let K_r = \mathrm{span}\{b, Ab, \ldots, A^r b\}; then one approximation strategy is to try to find approximations from the subspaces K_0, K_1, \ldots which successively minimise \phi. After n steps we must have the solution, but of course we hope that, when n is very large, we might obtain a good approximation after only a small number of steps. The obvious way to do this would be, at step r, approximating from K_r, to try to find coefficients q_j such that if x_r = \sum_{j=0}^{r} q_j A^j b then
    \phi(q_0, \ldots, q_r) = \frac{1}{2} \sum_{j=0}^{r} \sum_{k=0}^{r} q_j q_k \, b^T A^{j+k+1} b - \sum_{j=0}^{r} q_j \, b^T A^j b,    (183)
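To make (183) concrete, here is a minimal numerical sketch (the 2x2 matrix, right-hand side, and all variable names are illustrative assumptions, not from the notes): setting the derivatives of \phi with respect to the q_j to zero gives the linear system \sum_k q_k b^T A^{j+k+1} b = b^T A^j b, which we can form and solve directly for a small symmetric positive definite system.

```python
# Hypothetical 2x2 SPD example: minimising (183) over q_0, q_1 means
# solving the "normal equations"
#   sum_k q_k b^T A^(j+k+1) b = b^T A^j b,   j = 0, 1.

def matvec(A, v):
    return [sum(A[i][j] * v[j] for j in range(len(v))) for i in range(len(A))]

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

A = [[4.0, 1.0], [1.0, 3.0]]   # symmetric positive definite
b = [1.0, 2.0]

# Krylov vectors b, Ab and the powers of A applied to b needed below.
Ab = matvec(A, b)
pow_b = [b, Ab, matvec(A, Ab), matvec(A, matvec(A, Ab))]  # A^0 b ... A^3 b

# M[j][k] = b^T A^(j+k+1) b  and  g[j] = b^T A^j b.
M = [[dot(b, pow_b[j + k + 1]) for k in range(2)] for j in range(2)]
g = [dot(b, pow_b[j]) for j in range(2)]

# Solve the 2x2 system M q = g by Cramer's rule.
det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
q0 = (g[0] * M[1][1] - g[1] * M[0][1]) / det
q1 = (M[0][0] * g[1] - M[1][0] * g[0]) / det

# x_1 = q0 * b + q1 * A b is already the exact solution here, since
# for n = 2 the subspace K_1 is the whole space.
x = [q0 * b[i] + q1 * Ab[i] for i in range(2)]
Ax = matvec(A, x)
residual = [b[i] - Ax[i] for i in range(2)]
print(max(abs(ri) for ri in residual))  # essentially zero
```

Note, as the surrounding text points out, that this direct approach requires solving a matrix system whose size grows with r, which is exactly what conjugate gradients avoids.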
and so setting the derivatives of \phi with respect to q_0, \ldots, q_r to zero would mean solving another matrix system of growing size at each iteration. The idea of conjugate gradients is to find coefficients only once, in an expansion of the solution in terms of a growing set of basis vectors. To do this we need a different basis for K_r. Suppose we call this basis \{d_0, \ldots, d_r\} and that the basis vectors are A-conjugate. Now if we let

    x = \sum_{j=0}^{r} \alpha_j d_j,

then

    \phi(\alpha_0, \ldots, \alpha_r) = \sum_{j=0}^{r} \left( \frac{1}{2} \alpha_j^2 \, d_j^T A d_j - \alpha_j \, d_j^T b \right),    (184)

since the cross terms d_j^T A d_k, j \neq k, vanish by A-conjugacy. So now the minimisation problem is almost trivial: each component \alpha_j is independent of the others, and at each step the existing components do not change. So we only calculate the component in each direction once. Of course we do have to construct the basis \{d_0, \ldots, d_r\}, but since it has to span K_r, and at the r-th step we already have an A-conjugate basis for K_{r-1}, all we have to do is compute one additional basis vector at each step.
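A minimal numerical sketch of this separability (all concrete values are illustrative assumptions): build an A-conjugate basis for K_1 from \{b, Ab\} by one Gram-Schmidt-like step in the A-inner product, then read off each component \alpha_j = d_j^T b / (d_j^T A d_j) independently.

```python
def matvec(A, v):
    return [sum(A[i][j] * v[j] for j in range(len(v))) for i in range(len(A))]

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

A = [[4.0, 1.0], [1.0, 3.0]]   # symmetric positive definite (illustrative)
b = [1.0, 2.0]

# A-conjugate basis for K_1 = span{b, Ab}: d0 = b, and d1 is Ab with
# its A-projection onto d0 subtracted off.
d0 = list(b)
Ab = matvec(A, b)
Ad0 = matvec(A, d0)
proj = dot(Ab, Ad0) / dot(d0, Ad0)
d1 = [Ab[i] - proj * d0[i] for i in range(2)]
assert abs(dot(d1, Ad0)) < 1e-12   # d1^T A d0 = 0: the basis is A-conjugate

# With an A-conjugate basis, phi in (184) separates, so each component
# is computed independently: alpha_j = d_j^T b / (d_j^T A d_j).
alphas = [dot(d, b) / dot(d, matvec(A, d)) for d in (d0, d1)]
x = [alphas[0] * d0[i] + alphas[1] * d1[i] for i in range(2)]
print(x)  # solves A x = b exactly, since K_1 is the whole space here
```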
Theorem 2.2 If x_{r-1} minimises \phi over all vectors in K_{r-1}, then we must have

    d_s^T r_{r-1} = 0,  s = 0, \ldots, r-1,    (185)

where r_{r-1} = b - A x_{r-1} is the residual.

This enables us to carry out one iteration of the Conjugate Gradient algorithm.
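As a quick numerical check of this orthogonality (the 2x2 system is an illustrative assumption): minimise \phi along d_0 = b alone, and confirm the resulting residual is orthogonal to d_0.

```python
def matvec(A, v):
    return [sum(A[i][j] * v[j] for j in range(len(v))) for i in range(len(A))]

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

A = [[4.0, 1.0], [1.0, 3.0]]   # illustrative SPD matrix
b = [1.0, 2.0]

# Minimise phi(x) = x^T A x / 2 - b^T x over K_0 = span{b}:
# along x = t*b the minimiser is t = b^T b / (b^T A b).
t = dot(b, b) / dot(b, matvec(A, b))
x0 = [t * bi for bi in b]

# The residual r_0 = b - A x_0 must be orthogonal to every basis
# vector of K_0 (here just b), as the theorem states.
Ax0 = matvec(A, x0)
r0 = [b[i] - Ax0[i] for i in range(2)]
print(dot(r0, b))  # 0 (up to rounding)
```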
Suppose x_{r-1} \in K_{r-1} = \mathrm{span}\{b, Ab, \ldots, A^{r-1} b\} = \mathrm{span}\{d_0, \ldots, d_{r-1}\} minimises \phi over K_{r-1}, and the basis vectors d_0, \ldots, d_{r-1} are A-conjugate. Calculate r_{r-1} = b - A x_{r-1} \in K_r and suppose

    d_r = r_{r-1} + \beta_r d_{r-1},    (186)

where \beta_r is chosen to make d_r A-conjugate to d_{r-1},

    \beta_r = - \frac{r_{r-1}^T A d_{r-1}}{d_{r-1}^T A d_{r-1}}.    (187)

If

    \alpha_r = \frac{d_r^T r_{r-1}}{d_r^T A d_r}    (188)

is a search length, then

    x_r = x_{r-1} + \alpha_r d_r    (189)

minimises \phi over K_r and

    r_r^T d_s = r_r^T r_s = d_r^T A d_s = 0,  s = 0, \ldots, r-1.    (190)
Note that we can also show that

    \alpha_r = \frac{r_{r-1}^T r_{r-1}}{d_r^T A d_r},    (191)

and

    \beta_r = - \frac{r_{r-1}^T A d_{r-1}}{d_{r-1}^T A d_{r-1}} = \frac{r_{r-1}^T r_{r-1}}{r_{r-2}^T r_{r-2}},    (192)
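One full step of formulas (186)-(190) can be checked numerically, together with the alternative expressions (191)-(192) (the 2x2 system and variable names are illustrative assumptions, not from the notes):

```python
def matvec(A, v):
    return [sum(A[i][j] * v[j] for j in range(len(v))) for i in range(len(A))]

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

A = [[4.0, 1.0], [1.0, 3.0]]   # illustrative SPD system
b = [1.0, 2.0]

# Step 0: minimise over K_0 = span{d_0} with d_0 = b.
d0 = list(b)
alpha0 = dot(d0, b) / dot(d0, matvec(A, d0))
x0 = [alpha0 * v for v in d0]
Ax0 = matvec(A, x0)
r0 = [b[i] - Ax0[i] for i in range(2)]

# (187): beta_1 = - r_0^T A d_0 / (d_0^T A d_0).
Ad0 = matvec(A, d0)
beta1 = -dot(r0, Ad0) / dot(d0, Ad0)
# (192): the same number as a ratio of residual norms (the "previous
# residual" before the first step is b itself).
assert abs(beta1 - dot(r0, r0) / dot(b, b)) < 1e-12

# (186): d_1 = r_0 + beta_1 d_0, and (188)/(191) agree: d_1^T r_0 = r_0^T r_0.
d1 = [r0[i] + beta1 * d0[i] for i in range(2)]
Ad1 = matvec(A, d1)
alpha1 = dot(d1, r0) / dot(d1, Ad1)
assert abs(alpha1 - dot(r0, r0) / dot(d1, Ad1)) < 1e-12

# (189): x_1 = x_0 + alpha_1 d_1 -- exact for a 2x2 system.
x1 = [x0[i] + alpha1 * d1[i] for i in range(2)]
Ax1 = matvec(A, x1)
r1 = [b[i] - Ax1[i] for i in range(2)]
print(max(abs(v) for v in r1))  # essentially zero
```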
It is worth observing that we only ever need to store four vectors, x_{r-1}, d_{r-1}, r_{r-1} and A d_{r-1}, each of which can be replaced as the iteration proceeds, and other than the product A d_{r-1} there are only inner products to evaluate. The algorithm is very compact:
Algorithm:

    x = 0;  r = b;  d = r
    \delta_0 = r^T r;  \delta_1 = \delta_0
    while (\delta_1 > tolerance)
        q = A d
        \alpha = \delta_1 / d^T q
        x = x + \alpha d
        r = r - \alpha q
        \delta_0 = \delta_1
        \delta_1 = r^T r
        \beta = \delta_1 / \delta_0
        d = r + \beta d
    end while    (193)
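The compact algorithm above can be transcribed almost line for line into Python. This is a sketch, not a production implementation: the function name, the dense list-of-rows matrix format, the illustrative test system, and the choice to compare \delta_1 (a squared norm) against the tolerance are all assumptions.

```python
def conjugate_gradient(A, b, tolerance=1e-24, max_iterations=1000):
    """Plain conjugate gradient for a dense SPD matrix (list of rows),
    following the compact algorithm (193) above."""
    n = len(b)
    matvec = lambda v: [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
    dot = lambda u, v: sum(ui * vi for ui, vi in zip(u, v))

    x = [0.0] * n          # x = 0
    r = list(b)            # r = b
    d = list(r)            # d = r
    delta1 = dot(r, r)     # delta_0 = r^T r; delta_1 = delta_0

    for _ in range(max_iterations):
        if delta1 <= tolerance:                        # while delta_1 > tolerance
            break
        q = matvec(d)                                  # q = A d
        alpha = delta1 / dot(d, q)                     # alpha = delta_1 / d^T q
        x = [xi + alpha * di for xi, di in zip(x, d)]  # x = x + alpha d
        r = [ri - alpha * qi for ri, qi in zip(r, q)]  # r = r - alpha q
        delta0 = delta1                                # delta_0 = delta_1
        delta1 = dot(r, r)                             # delta_1 = r^T r
        beta = delta1 / delta0                         # beta = delta_1 / delta_0
        d = [ri + beta * di for ri, di in zip(r, d)]   # d = r + beta d
    return x

# Illustrative SPD system (not from the notes).
A = [[4.0, 1.0], [1.0, 3.0]]
b = [1.0, 2.0]
x = conjugate_gradient(A, b)
print(x)  # close to [1/11, 7/11]
```

As the text notes, only four vectors are alive at any time (x, d, r and q = A d), and apart from the matrix-vector product everything is an inner product or a vector update.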
We do not have time to develop an error analysis, but we can note that if the condition number of the matrix A is \kappa, then the error changes by a factor smaller than or equal to

    \frac{\sqrt{\kappa} - 1}{\sqrt{\kappa} + 1}    (194)

each iteration. This is considerably better than for steepest descents. Even further advantage can be gained by preconditioning the matrix A, by pre-multiplying the equation A x = b by an additional matrix. We shall not consider that aspect in this course.
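To see what the factor in (194) buys, here is a quick comparison (the steepest-descent per-iteration bound (\kappa - 1)/(\kappa + 1) is the classical result, quoted here as an assumption since the notes only say "considerably better", and \kappa = 100 is an illustrative choice):

```python
from math import sqrt, log

kappa = 100.0  # illustrative condition number

cg_factor = (sqrt(kappa) - 1.0) / (sqrt(kappa) + 1.0)  # (194): 9/11, about 0.818
sd_factor = (kappa - 1.0) / (kappa + 1.0)              # steepest descent: about 0.980

# Iterations needed to reduce the error by a factor of 10^6 under each bound.
target = 1e-6
cg_iters = log(target) / log(cg_factor)
sd_iters = log(target) / log(sd_factor)
print(cg_iters, sd_iters)  # roughly 69 versus roughly 691
```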
Last changed 2000-11-21