
Multivector differentiation and Linear Algebra

17th Santaló Summer School 2016, Santander

Joan Lasenby

Signal Processing Group, Engineering Department, Cambridge, UK

and Trinity College, Cambridge

[email protected], www-sigproc.eng.cam.ac.uk/~jl

23 August 2016


Overview

The Multivector Derivative.

Examples of differentiation wrt multivectors.

Linear Algebra: matrices and tensors as linear functions mapping between elements of the algebra.

Functional Differentiation: very briefly...

Summary


The Multivector Derivative

Recall our definition of the directional derivative in the a direction:

a·∇ F(x) = lim_{ε→0} [F(x + εa) − F(x)]/ε

We now want to generalise this idea so that we can find the derivative of F(X) in the A 'direction' – where X is a general mixed-grade multivector (so F(X) is a general multivector-valued function of X).

Let us use ∗ to denote taking the scalar part, i.e. P ∗ Q ≡ 〈PQ〉. Then, provided A has the same grades as X, it makes sense to define

A ∗ ∂X F(X) = lim_{τ→0} [F(X + τA) − F(X)]/τ
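This defining limit is easy to check numerically. Below is a small illustrative sketch (not part of the original slides): bivectors in G(3,0) are represented by their three coefficients on an orthonormal bivector basis, for which each basis bivector squares to −1, so 〈B1B2〉 = −∑J b1J b2J; the limit is evaluated by a finite difference for the linear function F(X) = 〈XB〉.

```python
# Numerical check of the defining limit A * ∂X F(X) = lim (F(X+τA) - F(X))/τ
# for F(X) = <XB>, with X, A, B bivectors in G(3,0).
# Bivectors are stored as coefficient triples on an orthonormal bivector basis;
# each basis bivector squares to -1, so <B1 B2> = -sum of products of coefficients.

def scalar_part(b1, b2):
    """<B1 B2> for bivectors B1, B2 in G(3,0)."""
    return -sum(x * y for x, y in zip(b1, b2))

def directional_derivative(F, X, A, tau=1e-6):
    """Finite-difference approximation of A * ∂X F(X)."""
    X_plus = [x + tau * a for x, a in zip(X, A)]
    return (F(X_plus) - F(X)) / tau

B = (0.5, -1.0, 2.0)
F = lambda X: scalar_part(X, B)   # F(X) = <XB>

X = (1.0, 2.0, -0.5)
A = (0.0, 1.0, 1.0)

# Since F is linear in X, the limit is exact: A * ∂X <XB> = <AB>.
print(abs(directional_derivative(F, X, A) - scalar_part(A, B)) < 1e-6)  # True
```

Because F is linear here, the finite difference agrees with the limit to rounding error; for nonlinear F a small τ gives an approximation.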


The Multivector Derivative cont...

Let {eJ} be a basis for X – i.e. if X is a bivector, then {eJ} will be the basis bivectors.

With the definition on the previous slide, eJ ∗ ∂X is therefore the partial derivative in the eJ direction, giving

∂X ≡ ∑J e^J (eJ ∗ ∂X)

where {e^J} is the reciprocal basis [since eJ ∗ ∂X ≡ eJ ∗ {∑I e^I (eI ∗ ∂X)}].

Key to using these definitions of multivector differentiation are several important results:


The Multivector Derivative cont...

If PX(B) is the projection of B onto the grades of X (i.e. PX(B) ≡ ∑J e^J〈eJB〉), then our first important result is

∂X〈XB〉 = PX(B)

We can see this by going back to our definitions:

eJ ∗ ∂X〈XB〉 = lim_{τ→0} [〈(X + τeJ)B〉 − 〈XB〉]/τ

= lim_{τ→0} [〈XB〉 + τ〈eJB〉 − 〈XB〉]/τ

= lim_{τ→0} τ〈eJB〉/τ = 〈eJB〉

Therefore

∂X〈XB〉 = ∑J e^J (eJ ∗ ∂X)〈XB〉 = ∑J e^J〈eJB〉 ≡ PX(B)
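This projection result can also be verified numerically. The sketch below (illustrative, not from the slides) uses the same coefficient representation of G(3,0) bivectors: each basis bivector squares to −1, so the reciprocal basis satisfies e^J = −eJ; the derivative ∂X〈XB〉 is assembled from the basis expansion ∑J e^J (eJ ∗ ∂X) by finite differences and compared with the bivector part of B.

```python
# Numerical check of ∂X<XB> = P_X(B) for X a bivector in G(3,0).
# Bivectors are coefficient triples on an orthonormal bivector basis;
# each basis bivector squares to -1, so <B1 B2> = -sum(b1_J * b2_J)
# and the reciprocal basis satisfies e^J = -e_J.

def scalar_part(b1, b2):
    return -sum(x * y for x, y in zip(b1, b2))

def multivector_derivative(F, X, tau=1e-6):
    """Assemble ∂X F(X) = sum_J e^J (e_J * ∂X) F(X) by finite differences.
    Returns bivector coefficients on the e_J basis (using e^J = -e_J)."""
    grad = []
    for J in range(3):
        X_plus = list(X)
        X_plus[J] += tau
        partial = (F(X_plus) - F(X)) / tau      # e_J * ∂X F
        grad.append(-partial)                   # multiply by e^J = -e_J
    return grad

B_biv = (0.5, -1.0, 2.0)                        # bivector part of B
F = lambda X: scalar_part(X, B_biv)             # F(X) = <XB>

grad = multivector_derivative(F, (1.0, 0.0, 3.0))
print(all(abs(g - b) < 1e-6 for g, b in zip(grad, B_biv)))  # True: ∂X<XB> = P_X(B)
```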


Other Key Results

Some other useful results are listed here (proofs are similar to that on the previous slide and are left as exercises):

∂X〈XB〉 = PX(B)

∂X〈X̃B〉 = PX(B̃)

∂X̃〈XB〉 = PX̃(B̃) = PX(B̃)

∂ψ〈Mψ⁻¹〉 = −ψ⁻¹ Pψ(M) ψ⁻¹

where X, B, M, ψ are all general multivectors.


Exercises 1

1. By noting that 〈X̃B〉 = 〈(X̃B)˜〉, show the second key result

∂X〈X̃B〉 = PX(B̃)

2. Key result 1 tells us that ∂X̃〈X̃B̃〉 = PX̃(B̃). Verify that PX̃(B̃) = PX(B̃), to give the 3rd key result.

3. To show the 4th key result

∂ψ〈Mψ⁻¹〉 = −ψ⁻¹ Pψ(M) ψ⁻¹

use the fact that ∂ψ〈Mψψ⁻¹〉 = ∂ψ〈M〉 = 0. Hint: recall that XAX has the same grades as A.


A Simple Example

Suppose we wish to fit a set of points {Xi} to a plane Φ – where the Xi and Φ are conformal representations (a vector and a 4-vector respectively).

One possible way forward is to find the plane that minimises the sum of the squared perpendicular distances of the points from the plane.

(Figure: points X1, X2, X3 and their projections X′1, X′2, X′3 onto the plane.)


Plane fitting example, cont....

Recall that ΦXΦ is the reflection of X in Φ, so that −X·(ΦXΦ) is proportional to the squared perpendicular distance of the point from the plane. Thus we could take as our cost function

S = −∑i Xi·(ΦXiΦ)

Now use the result ∂X〈XB〉 = PX(B) to differentiate this expression wrt Φ:

∂ΦS = −∑i ∂Φ〈XiΦXiΦ〉

Differentiating with respect to each of the two occurrences of Φ in turn (the two terms are equal, since the scalar part is invariant under cyclic reordering) gives

∂ΦS = −2 ∑i PΦ(XiΦXi) = −2 ∑i XiΦXi

=⇒ solve (via linear algebra techniques) ∑i XiΦXi = 0.


Differentiation cont....

Of course we can extend these ideas to other geometric fitting problems, and also to those without closed-form solutions, using gradient information to find solutions.

Another example is differentiating wrt rotors or bivectors.

Suppose we wished to create a Kalman-filter-like system which tracked bivectors (not simply their components in some basis) – this might involve evaluating expressions such as

∂Bn ∑_{i=1}^{L} 〈v^i_n Rn u^i_{n−1} R̃n〉

where Rn = e^{−Bn}, and the u's and v's are vectors.


Differentiation cont....

Using just the standard results given, and a page of algebra later (but one only needs to do it once!), we find that

∂Bn 〈vn Rn un−1 R̃n〉 = −Γ(Bn) + (1/|Bn|²) 〈Bn Γ(Bn) Rn Bn R̃n〉₂

+ (sin|Bn|/|Bn|) 〈Bn Γ(Bn) Rn Bn/|Bn|² + Γ(Bn) Rn〉₂

where Γ(Bn) = ½ [un−1 ∧ (R̃n vn Rn)] R̃n.


Linear Algebra

A linear function f, mapping vectors to vectors, satisfies

f(λa + µb) = λf(a) + µf(b)

We can now extend f to act on a blade of any grade via the outermorphism

f(a1∧a2∧...∧an) = f(a1)∧f(a2)∧...∧f(an)

Note that the resulting blade has the same grade as the original blade. Thus an important property is that these extended linear functions are grade preserving, i.e.

f(Ar) = 〈f(Ar)〉r
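As an illustrative sketch (not from the slides): in 3d, taking bivector components on the basis (e1∧e2, e1∧e3, e2∧e3), the extended action of f on a 2-blade is linear with matrix entries given by the 2×2 minors of the matrix of f, and one can check f(a∧b) = f(a)∧f(b) directly.

```python
# Outermorphism check in 3d: f(a ^ b) = f(a) ^ f(b).
# Bivector components are taken on the basis (e1^e2, e1^e3, e2^e3).

def matvec(F, a):
    return [sum(F[i][j] * a[j] for j in range(3)) for i in range(3)]

def wedge(a, b):
    """Components of a ^ b on (e1^e2, e1^e3, e2^e3)."""
    return [a[0]*b[1] - a[1]*b[0],
            a[0]*b[2] - a[2]*b[0],
            a[1]*b[2] - a[2]*b[1]]

def outermorphism2(F, B):
    """Extended action of F on a bivector: the matrix of 2x2 minors of F
    (the second compound matrix), applied to the bivector components."""
    minor = lambda r0, r1, c0, c1: F[r0][c0]*F[r1][c1] - F[r0][c1]*F[r1][c0]
    M = [[minor(0, 1, 0, 1), minor(0, 1, 0, 2), minor(0, 1, 1, 2)],
         [minor(0, 2, 0, 1), minor(0, 2, 0, 2), minor(0, 2, 1, 2)],
         [minor(1, 2, 0, 1), minor(1, 2, 0, 2), minor(1, 2, 1, 2)]]
    return [sum(M[i][j] * B[j] for j in range(3)) for i in range(3)]

F = [[1.0, 2.0, 0.0],
     [0.0, 1.0, 3.0],
     [4.0, 0.0, 1.0]]
a = [1.0, -2.0, 0.5]
b = [3.0, 1.0, -1.0]

lhs = wedge(matvec(F, a), matvec(F, b))   # f(a) ^ f(b)
rhs = outermorphism2(F, wedge(a, b))      # f(a ^ b)
print(all(abs(x - y) < 1e-9 for x, y in zip(lhs, rhs)))  # True
```

The fact that the minors appear is the Cauchy–Binet identity; grade preservation is visible in that a bivector goes to a bivector.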


Linear Algebra cont....

Matrices are also linear functions which map vectors to vectors. If F is the matrix corresponding to the linear function f, we obtain the elements of F via

Fij = ei·f(ej)

where {ei} is the basis in which the vectors the matrix acts on are written.

As with matrix multiplication, where we obtain a third matrix (linear function) by combining two other matrices (linear functions), i.e. H = FG, we can also write

h(a) = f[g(a)] = fg(a)

The product of linear functions is associative.
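A quick sketch of both statements (illustrative, not from the slides): recover the matrix of a linear function from Fij = ei·f(ej), and check that composing the functions corresponds to multiplying the matrices, H = FG.

```python
# Recover a matrix from its linear function via Fij = ei . f(ej),
# and check that composition of functions corresponds to the matrix product.

def matvec(F, a):
    return [sum(F[i][j] * a[j] for j in range(len(a))) for i in range(len(F))]

def matrix_of(f, n):
    """Fij = ei . f(ej) for the standard orthonormal basis."""
    cols = [f([1.0 if k == j else 0.0 for k in range(n)]) for j in range(n)]
    return [[cols[j][i] for j in range(n)] for i in range(n)]

F = [[1.0, 2.0], [3.0, 4.0]]
G = [[0.0, 1.0], [-1.0, 2.0]]

f = lambda a: matvec(F, a)
g = lambda a: matvec(G, a)
h = lambda a: f(g(a))                       # h(a) = f[g(a)]

H = matrix_of(h, 2)                         # matrix of the composed function
FG = [[sum(F[i][k] * G[k][j] for k in range(2)) for j in range(2)]
      for i in range(2)]
print(H == FG)  # True: H = FG
```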


Linear Algebra

We now need to verify that

h(A) = f[g(A)] = fg(A)

for any multivector A.

First take a blade a1∧a2∧...∧ar and note that

h(a1∧a2∧...∧ar) = fg(a1)∧fg(a2)∧...∧fg(ar)

= f[g(a1)∧g(a2)∧...∧g(ar)] = f[g(a1∧a2∧...∧ar)]

from which we get the result by linearity.


Exercises 2

1. For a matrix

F = ( F11  F12 )
    ( F21  F22 )

verify that Fij = ei·f(ej), where e1 = [1, 0]T and e2 = [0, 1]T, for i, j = 1, 2.

2. Rotations are linear functions, so we can write R(a) = RaR̃, where R is the rotor. If Ar = a1∧a2∧...∧ar is an r-blade, show that

RArR̃ = (Ra1R̃)∧(Ra2R̃)∧...∧(RarR̃)

Thus we can rotate any element of our algebra with the same rotor expression.


The Determinant

Consider the action of a linear function f on an orthogonal basis in 3d:

The unit cube I = e1∧e2∧e3 is transformed to a parallelepiped V, with

V = f(e1)∧f(e2)∧f(e3) = f(I)


The Determinant cont....

So, since f(I) is also a pseudoscalar, we see that if V is the magnitude of V, then

f(I) = V I

Let us define the determinant of the linear function f as the volume scale factor V, so that

f(I) = det(f) I

This enables us to find the form of the determinant explicitly (in terms of partial derivatives between coordinate frames) very easily in any dimension.
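The definition f(I) = det(f) I can be checked directly in 3d: the trivector f(e1)∧f(e2)∧f(e3) has a single component, and its coefficient is the familiar 3×3 determinant. A minimal sketch (illustrative, not from the slides):

```python
# In 3d, f(e1) ^ f(e2) ^ f(e3) = det(f) e1^e2^e3, so the coefficient of the
# trivector is the ordinary 3x3 determinant of the matrix of f.

def matvec(F, a):
    return [sum(F[i][j] * a[j] for j in range(3)) for i in range(3)]

def triple_wedge_coeff(a, b, c):
    """Coefficient of e1^e2^e3 in a ^ b ^ c (a 3x3 determinant by cofactors)."""
    return (a[0] * (b[1]*c[2] - b[2]*c[1])
          - a[1] * (b[0]*c[2] - b[2]*c[0])
          + a[2] * (b[0]*c[1] - b[1]*c[0]))

F = [[2.0, 0.0, 1.0],
     [1.0, 3.0, 0.0],
     [0.0, 1.0, 1.0]]

basis = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
images = [matvec(F, e) for e in basis]     # the columns of F

det_f = triple_wedge_coeff(*images)        # f(I) = det(f) I
print(det_f)  # 7.0
```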


A Key Result

As before, let h = fg; then

h(I) = det(h) I = f(g(I)) = f(det(g) I)

= det(g) f(I) = det(g) det(f) I

So we have proved that

det(fg) = det(f) det(g)

A very easy proof!
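A numerical sketch of this product rule (illustrative, not from the slides): compute the determinants of two maps and of their composition via the pseudoscalar definition f(I) = det(f) I, i.e. as the coefficient of f(e1)∧f(e2)∧f(e3).

```python
# Check det(fg) = det(f) det(g) using the pseudoscalar definition of det.

def matvec(F, a):
    return [sum(F[i][j] * a[j] for j in range(3)) for i in range(3)]

def det_via_wedge(f):
    """Coefficient of e1^e2^e3 in f(e1) ^ f(e2) ^ f(e3)."""
    a = f([1.0, 0.0, 0.0])
    b = f([0.0, 1.0, 0.0])
    c = f([0.0, 0.0, 1.0])
    return (a[0] * (b[1]*c[2] - b[2]*c[1])
          - a[1] * (b[0]*c[2] - b[2]*c[0])
          + a[2] * (b[0]*c[1] - b[1]*c[0]))

F = [[2.0, 0.0, 1.0], [1.0, 3.0, 0.0], [0.0, 1.0, 1.0]]
G = [[1.0, 1.0, 0.0], [0.0, 2.0, 1.0], [1.0, 0.0, 1.0]]

f = lambda a: matvec(F, a)
g = lambda a: matvec(G, a)
h = lambda a: f(g(a))                      # h = fg

print(abs(det_via_wedge(h) - det_via_wedge(f) * det_via_wedge(g)) < 1e-9)  # True
```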


The Adjoint/Transpose of a Linear Function

For a matrix F and its transpose FT we have, for any vectors a, b,

aTFb = bTFTa = φ (scalar)

In GA we can write this in terms of linear functions as

a·f(b) = f̄(a)·b

This new linear function, f̄, is called the adjoint.
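In matrix terms the adjoint corresponds to the transpose, and the defining property can be checked directly. A small sketch (illustrative, not from the slides):

```python
# Check the defining property of the adjoint: a . f(b) = f_bar(a) . b,
# where in matrix terms f_bar corresponds to the transpose of F.

def matvec(F, a):
    return [sum(F[i][j] * a[j] for j in range(len(a))) for i in range(len(F))]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

F = [[1.0, 2.0, 0.0], [0.0, 1.0, 3.0], [4.0, 0.0, 1.0]]
Ft = [[F[j][i] for j in range(3)] for i in range(3)]   # transpose of F

a = [1.0, -1.0, 2.0]
b = [0.5, 3.0, -2.0]

lhs = dot(a, matvec(F, b))    # a . f(b)
rhs = dot(matvec(Ft, a), b)   # f_bar(a) . b
print(abs(lhs - rhs) < 1e-12)  # True
```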


The Adjoint cont....

It is not hard to show that the adjoint extends to blades in the expected way:

f̄(a1∧a2∧...∧an) = f̄(a1)∧f̄(a2)∧...∧f̄(an)

See the following exercises to show that

a·f(b∧c) = f[f̄(a)·(b∧c)]

This can now be generalised to

Ar·f(Bs) = f[f̄(Ar)·Bs]   r ≤ s

f(Ar)·Bs = f[Ar·f̄(Bs)]   r ≥ s
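The r = 1, s = 2 case can be checked numerically in 3d. A sketch (illustrative, not from the slides; the contraction of a vector onto a 2-blade is computed via p·(q∧r) = (p·q)r − (p·r)q, and the adjoint is the transpose map):

```python
# Check a . f(b^c) = f[ f_bar(a) . (b^c) ] in 3d, with f_bar the transpose map.

def matvec(F, a):
    return [sum(F[i][j] * a[j] for j in range(3)) for i in range(3)]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def contract(p, q, r):
    """p . (q ^ r) = (p.q) r - (p.r) q, a vector."""
    pq, pr = dot(p, q), dot(p, r)
    return [pq * ri - pr * qi for qi, ri in zip(q, r)]

F = [[1.0, 2.0, 0.0], [0.0, 1.0, 3.0], [4.0, 0.0, 1.0]]
Ft = [[F[j][i] for j in range(3)] for i in range(3)]

a = [1.0, -1.0, 2.0]
b = [0.5, 3.0, -2.0]
c = [2.0, 0.0, 1.0]

# Left-hand side: a . f(b^c) = a . (f(b) ^ f(c)).
lhs = contract(a, matvec(F, b), matvec(F, c))

# Right-hand side: f applied to f_bar(a) . (b^c).
rhs = matvec(F, contract(matvec(Ft, a), b, c))

print(all(abs(x - y) < 1e-9 for x, y in zip(lhs, rhs)))  # True
```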


Exercises 3

1. For any vectors p, q, r, show that

p·(q∧r) = (p·q)r − (p·r)q

2. By using the fact that a·f(b∧c) = a·[f(b)∧f(c)], use the above result to show that

a·f(b∧c) = (f̄(a)·b)f(c) − (f̄(a)·c)f(b)

and simplify to get the final result

a·f(b∧c) = f[f̄(a)·(b∧c)]

58 / 78
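Exercise 1 can also be verified numerically. A sketch, again representing q∧r as the antisymmetric matrix q⊗r − r⊗q (my choice of encoding, not part of the exercise):

```python
import numpy as np

rng = np.random.default_rng(2)
p, q, r = rng.standard_normal((3, 3))

# q^r as an antisymmetric matrix; p.(q^r) is then the vector p @ B
B = np.outer(q, r) - np.outer(r, q)

assert np.allclose(p @ B, (p @ q) * r - (p @ r) * q)
```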


The Inverse

A_r·f(B_s) = f[f̄(A_r)·B_s]   r ≤ s

Now put B_s = I in this formula. Since f(I) = det(f) I,

A_r·f(I) = det(f)(A_r·I) = det(f) A_r I

= f[f̄(A_r)·I] = f[f̄(A_r) I]

We can now write this as

A_r = f[f̄(A_r) I] I⁻¹ [det(f)]⁻¹

59 / 78


The Inverse cont...

Repeat this here:

A_r = f[f̄(A_r) I] I⁻¹ [det(f)]⁻¹

The next stage is to put f̄(A_r) = B_r, i.e. A_r = f̄⁻¹(B_r), in this equation:

f̄⁻¹(B_r) = f[B_r I] I⁻¹ [det(f)]⁻¹

This leads us to the important and simple formulae for the inverse of a function and its adjoint:

f⁻¹(A) = [det(f)]⁻¹ f̄[A I] I⁻¹

f̄⁻¹(A) = [det(f)]⁻¹ f[A I] I⁻¹

62 / 78
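Acting on vectors, f⁻¹(a) = [det(f)]⁻¹ f̄(aI)I⁻¹ is the GA form of the classical adjugate (cofactor) formula F⁻¹ = adj(F)/det(F). A numpy sketch, building the adjugate directly from cofactors and comparing with the library inverse:

```python
import numpy as np

def adjugate(F):
    """Adjugate: transpose of the cofactor matrix of a square matrix F."""
    n = F.shape[0]
    adj = np.empty_like(F)
    for i in range(n):
        for j in range(n):
            minor = np.delete(np.delete(F, i, axis=0), j, axis=1)
            adj[j, i] = (-1) ** (i + j) * np.linalg.det(minor)  # transposed placement
    return adj

rng = np.random.default_rng(3)
F = rng.standard_normal((3, 3))
assert np.allclose(adjugate(F) / np.linalg.det(F), np.linalg.inv(F))
```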


An Example

Let us see if this works for rotations:

R(a) = R a R̃  and  R̄(a) = R̃ a R

So, putting this in our inverse formula:

R⁻¹(A) = [det(R)]⁻¹ R̄(A I) I⁻¹

= [det(R)]⁻¹ R̃(A I) R I⁻¹ = R̃ A R

since det(R) = 1. Thus the inverse is the adjoint ... as we know from R̃R = 1.

66 / 78
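The matrix analogue of this fact is the familiar property of rotation matrices: det(R) = 1 and the inverse equals the transpose (the matrix of the adjoint). A quick sketch for a rotation about the z-axis:

```python
import numpy as np

theta = 0.7  # rotation angle about the z-axis
c, s = np.cos(theta), np.sin(theta)
R = np.array([[c, -s, 0.0],
              [s,  c, 0.0],
              [0.0, 0.0, 1.0]])

assert np.isclose(np.linalg.det(R), 1.0)    # det(R) = 1
assert np.allclose(np.linalg.inv(R), R.T)   # inverse = adjoint (transpose)
```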


More Linear Algebra...

That we won't look at.....

The idea of eigenblades – this becomes possible with our extension of linear functions to act on blades.

Symmetric (f(a) = f̄(a)) and antisymmetric (f(a) = −f̄(a)) functions. In particular, antisymmetric functions are best studied using bivectors.

Decompositions such as the Singular Value Decomposition.

Tensors – we can think of tensors as linear functions mapping r-blades to s-blades. Thus we retain some physical intuition that is generally lost in index notation.

69 / 78
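In matrix terms the symmetric/antisymmetric split of a linear function is f = ½(f + f̄) + ½(f − f̄), i.e. the symmetric and antisymmetric parts of its matrix; note that an antisymmetric 3×3 matrix has three independent entries, which is why it is naturally encoded by a bivector. A minimal sketch:

```python
import numpy as np

rng = np.random.default_rng(4)
F = rng.standard_normal((3, 3))

S = 0.5 * (F + F.T)  # symmetric part:      s(a) = s-bar(a)
A = 0.5 * (F - F.T)  # antisymmetric part:  a(a) = -a-bar(a)

assert np.allclose(S, S.T)
assert np.allclose(A, -A.T)
assert np.allclose(S + A, F)
```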


Functional Differentiation

We will only touch on this briefly, but it is crucial to work in physics and has hardly been used at all in other fields.

∂_f(a)(f(b)·c) = (a·b)c

In engineering this, in particular, enables us to differentiate with respect to structured matrices in a way which is very hard to do otherwise.

73 / 78
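In matrix terms, φ(F) = f(b)·c = cᵀFb has gradient ∂φ/∂F_ij = c_i b_j, i.e. the outer product c⊗b; contracting that gradient with a reproduces (a·b)c, matching the formula above. A finite-difference sketch:

```python
import numpy as np

rng = np.random.default_rng(5)
F = rng.standard_normal((3, 3))
a, b, c = rng.standard_normal((3, 3))

phi = lambda F: c @ F @ b  # phi(F) = f(b).c

# numerical gradient d(phi)/dF_ij by central differences
eps = 1e-6
grad = np.zeros_like(F)
for i in range(3):
    for j in range(3):
        dF = np.zeros_like(F)
        dF[i, j] = eps
        grad[i, j] = (phi(F + dF) - phi(F - dF)) / (2 * eps)

assert np.allclose(grad, np.outer(c, b), atol=1e-5)   # d(phi)/dF = c b^T
assert np.allclose(grad @ a, (a @ b) * c, atol=1e-5)  # derivative along a: (a.b)c
```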


Summary

Multivector differentiation: we can usually differentiate most expressions using a set of standard rules.

Linear Algebra: we will see applications of the GA approach to linear algebra – using, in particular, the beautiful expressions for the inverse of a function.

Functional Differentiation: used widely in physics, with scope for much more use in engineering.

76 / 78
