
Methods For Nonlinear Least-Square Problems

Jinxiang Chai

Applications

• Inverse kinematics

• Physically-based animation

• Data-driven motion synthesis

• Many other problems in graphics, vision, machine learning, robotics, etc.

Problem Definition

Many optimization problems in these areas can be formulated as a nonlinear least-squares problem:

$$x^* = \arg\min_x \frac{1}{2} f(x)^T f(x) = \arg\min_x \frac{1}{2}\sum_{i=1}^{m} \big(f_i(x)\big)^2$$

where $f_i : \mathbb{R}^n \to \mathbb{R}$, $i = 1, \ldots, m$, are given functions and $m \ge n$.

Data Fitting


Inverse Kinematics

Find the joint angles θ that minimize the distance between the character's end-effector position and a user-specified target position

[Figure: a two-link arm with its base at (0,0), link lengths l1 and l2, joint angles θ1 and θ2, and target position C = (c1, c2)]

$$\arg\min_{\theta_1,\theta_2}\; \big(l_1\cos\theta_1 + l_2\cos(\theta_1+\theta_2) - c_1\big)^2 + \big(l_1\sin\theta_1 + l_2\sin(\theta_1+\theta_2) - c_2\big)^2$$
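The two-link objective above can be written directly in code; this is a minimal sketch, where the default link lengths l1 = l2 = 1 are illustrative, not from the slides:

```python
import numpy as np

def fk(theta, l1=1.0, l2=1.0):
    """Forward kinematics: end-effector position of a two-link arm."""
    t1, t2 = theta
    x = l1 * np.cos(t1) + l2 * np.cos(t1 + t2)
    y = l1 * np.sin(t1) + l2 * np.sin(t1 + t2)
    return np.array([x, y])

def ik_objective(theta, c, l1=1.0, l2=1.0):
    """Squared distance between the end effector and target c."""
    d = fk(theta, l1, l2) - np.asarray(c)
    return float(d @ d)
```

For example, with both angles zero the arm lies along the x-axis, so the end effector sits at (2, 0) and the objective vanishes for the target c = (2, 0).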

Global Minimum vs. Local Minimum

• Finding the global minimum for nonlinear functions is very hard

• Finding the local minimum is much easier

Assumptions

• The cost function F is differentiable and smooth enough that the following Taylor expansion is valid:

$$F(x+h) = F(x) + h^T \nabla F(x) + \frac{1}{2} h^T \nabla^2 F(x)\, h + O(\|h\|^3)$$

Gradient Descent

Objective function: $F(x)$

Which direction is optimal?

Gradient Descent

Which direction $h$ is optimal?

$$\lim_{\alpha \to 0} \frac{F(x+\alpha h) - F(x)}{\alpha} = h^T \nabla F(x) = \|h\|\,\|\nabla F(x)\|\cos\theta$$

The directional derivative is most negative when $\cos\theta = -1$, i.e. when $h = -\nabla F(x)$: the steepest-descent direction.

Gradient Descent

A first-order optimization algorithm.

To find a local minimum of a function using gradient descent, one takes steps proportional to the negative of the gradient of the function at the current point.

Gradient Descent

• Initialize k=0, choose x0

• While k<kmax

$$x_{k+1} = x_k - \alpha \nabla F(x_k)$$

where $\alpha > 0$ is the step size.
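The iteration above can be sketched in NumPy; the quadratic test function $F(x) = x_1^2 + 4x_2^2$ (minimum at the origin) and the step size are illustrative choices, not from the slides:

```python
import numpy as np

def gradient_descent(grad_F, x0, alpha=0.1, k_max=1000):
    """Minimize F by repeatedly stepping along the negative gradient."""
    x = np.asarray(x0, dtype=float)
    for _ in range(k_max):
        x = x - alpha * grad_F(x)
    return x

# Example: F(x) = x1^2 + 4*x2^2 has gradient (2*x1, 8*x2) and minimum (0, 0).
x_star = gradient_descent(lambda x: np.array([2 * x[0], 8 * x[1]]),
                          [3.0, 1.0], alpha=0.05)
```

A fixed step size is the simplest choice; in practice α is often chosen by a line search.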

Newton’s Method

• Quadratic approximation

• What’s the minimum solution of the quadratic approximation

$$f(x+\Delta x) \approx f(x) + f'(x)\,\Delta x + \frac{1}{2} f''(x)\,\Delta x^2$$

$$\Delta x = -\frac{f'(x)}{f''(x)}$$

Newton’s Method

• High dimensional case:

• What’s the optimal direction?

$$F(x+\Delta x) \approx F(x) + \nabla F(x)^T \Delta x + \frac{1}{2}\Delta x^T H(x)\,\Delta x$$

$$\Delta x = -H(x)^{-1} \nabla F(x)$$

Newton’s Method

• Initialize k=0, choose x0

• While k<kmax

$$x_{k+1} = x_k - H(x_k)^{-1} \nabla F(x_k)$$
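A minimal sketch of the Newton iteration, solving the linear system $H\,\Delta x = \nabla F$ rather than forming $H^{-1}$ explicitly; the quadratic test function is illustrative (on any quadratic, Newton's method lands on the minimum in one step):

```python
import numpy as np

def newton(grad_F, hess_F, x0, k_max=20):
    """Newton's method: x_{k+1} = x_k - H(x_k)^{-1} grad F(x_k)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(k_max):
        # Solve H dx = grad instead of computing the inverse Hessian.
        x = x - np.linalg.solve(hess_F(x), grad_F(x))
    return x

# F(x) = x1^2 + 4*x2^2: gradient (2*x1, 8*x2), constant Hessian diag(2, 8).
x_star = newton(lambda x: np.array([2 * x[0], 8 * x[1]]),
                lambda x: np.array([[2.0, 0.0], [0.0, 8.0]]),
                [3.0, 1.0])
```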

Newton’s Method

• Finding the inverse of the Hessian matrix is often expensive

• Approximation methods are often used

- conjugate gradient method

- quasi-Newton method

Comparison

• Newton’s method vs. Gradient descent

Gauss-Newton Methods

• Often used to solve non-linear least squares problems.

$$F(x) = \frac{1}{2}(x - 1)^2 + \frac{1}{2}(0.2x^2 - 1)^2$$

Define

$$f(x) = \begin{pmatrix} x - 1 \\ 0.2x^2 - 1 \end{pmatrix}$$

We have

$$F(x) = \frac{1}{2} f(x)^T f(x)$$

Gauss-Newton Method

• In general, we want to minimize a sum of squared function values:

$$x^* = \arg\min_x \frac{1}{2} f(x)^T f(x)$$

• Unlike Newton's method, second derivatives are not required.

Gauss-Newton Method

• In general, we want to minimize a sum of squared function values:

$$x^* = \arg\min_x \frac{1}{2} f(x)^T f(x)$$

• Linearize f around the current iterate:

$$f(x+\Delta x) \approx f(x) + J(x)\,\Delta x$$

where $J(x)$ is the Jacobian of $f$.

• Substituting gives a quadratic function of $\Delta x$:

$$\frac{1}{2} f(x+\Delta x)^T f(x+\Delta x) \approx \frac{1}{2}\big(f(x) + J(x)\Delta x\big)^T \big(f(x) + J(x)\Delta x\big)$$

• Setting its gradient with respect to $\Delta x$ to zero:

$$J(x)^T\big(f(x) + J(x)\Delta x\big) = J(x)^T f(x) + J(x)^T J(x)\,\Delta x = 0$$

• Solving for the step:

$$\Delta x = -\big(J(x)^T J(x)\big)^{-1} J(x)^T f(x)$$

Gauss-Newton Method

• Initialize k=0, choose x0

• While k<kmax

$$x_{k+1} = x_k - \big(J(x_k)^T J(x_k)\big)^{-1} J(x_k)^T f(x_k)$$
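As a sketch of the iteration, a minimal Gauss-Newton loop in NumPy, applied to a hypothetical one-parameter data-fitting problem (fitting $y = e^{a t}$ with true value $a = 0.5$; the data and starting point are invented for illustration):

```python
import numpy as np

def gauss_newton(f, J, x0, k_max=50):
    """Gauss-Newton: solve J^T J dx = -J^T f for the step at each iterate."""
    x = np.asarray(x0, dtype=float)
    for _ in range(k_max):
        r, Jx = f(x), J(x)
        dx = np.linalg.solve(Jx.T @ Jx, -Jx.T @ r)
        x = x + dx
    return x

# Hypothetical data: y = exp(0.5 * t) sampled at a few points.
t = np.array([0.0, 1.0, 2.0, 3.0])
y = np.exp(0.5 * t)
f = lambda x: np.exp(x[0] * t) - y              # residual vector
J = lambda x: (t * np.exp(x[0] * t)).reshape(-1, 1)  # m-by-1 Jacobian
a = gauss_newton(f, J, [0.1])
```

Note the loop uses a linear solve on the normal equations rather than an explicit matrix inverse.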

Gauss-Newton Method

• Any problem? The step requires inverting $J(x)^T J(x)$; when the Jacobian is rank-deficient this matrix is singular, so the solution of

$$J(x)^T\big(f(x) + J(x)\Delta x\big) = 0$$

might not be unique!

Gauss-Newton Method

• When the solution might not be unique, the remedy is to add a regularization term!

Levenberg-Marquardt Method

• In general, we want to minimize a sum of squared function values:

$$x^* = \arg\min_x \frac{1}{2} f(x)^T f(x)$$

• Linearize as before: $f(x+\Delta x) \approx f(x) + J(x)\,\Delta x$

• Add a regularization (damping) term to the quadratic approximation:

$$\frac{1}{2} f(x+\Delta x)^T f(x+\Delta x) \approx \frac{1}{2}\big(f(x)+J(x)\Delta x\big)^T\big(f(x)+J(x)\Delta x\big) + \frac{\lambda}{2}\,\Delta x^T \Delta x$$

• Setting the gradient with respect to $\Delta x$ to zero:

$$J(x)^T\big(f(x)+J(x)\Delta x\big) + \lambda\,\Delta x = 0$$

• For $\lambda > 0$ the matrix $J(x)^T J(x) + \lambda I$ is always invertible, so the step is unique:

$$\Delta x = -\big(J(x)^T J(x) + \lambda I\big)^{-1} J(x)^T f(x)$$

Levenberg-Marquardt Method

• Initialize k=0, choose x0

• While k<kmax

$$x_{k+1} = x_k - \big(J(x_k)^T J(x_k) + \lambda I\big)^{-1} J(x_k)^T f(x_k)$$
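A minimal Levenberg-Marquardt sketch on the same hypothetical exponential fit; the damping schedule (halve λ on an accepted step, double it on a rejected one) is one simple choice among many, not the specific strategy of the slides:

```python
import numpy as np

def levenberg_marquardt(f, J, x0, lam=1e-3, k_max=100):
    """LM: solve (J^T J + lam*I) dx = -J^T f, adapting lam by step quality."""
    x = np.asarray(x0, dtype=float)
    for _ in range(k_max):
        r, Jx = f(x), J(x)
        A = Jx.T @ Jx + lam * np.eye(len(x))
        dx = np.linalg.solve(A, -Jx.T @ r)
        x_new = x + dx
        if f(x_new) @ f(x_new) < r @ r:  # step reduced the cost: accept
            x, lam = x_new, lam * 0.5    # move toward Gauss-Newton
        else:                            # step increased the cost: reject
            lam *= 2.0                   # move toward gradient descent
    return x

# Hypothetical data: y = exp(0.5 * t), true parameter a = 0.5.
t = np.array([0.0, 1.0, 2.0, 3.0])
y = np.exp(0.5 * t)
a = levenberg_marquardt(lambda x: np.exp(x[0] * t) - y,
                        lambda x: (t * np.exp(x[0] * t)).reshape(-1, 1),
                        [0.1])
```

Large λ makes the step behave like a short gradient-descent step; small λ recovers the Gauss-Newton step.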

Stopping Criteria

• Criterion 1: the number of iterations reaches a user-specified maximum:

k > k_max

• Criterion 2: the current function value is smaller than a user-specified threshold:

F(x_k) < σ_user

• Criterion 3: the change in function value is smaller than a user-specified threshold:

||F(x_k) - F(x_{k-1})|| < ε_user
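The three criteria combine naturally into a single test; the parameter names mirror the slide's symbols and the thresholds are up to the user:

```python
def should_stop(k, k_max, F_k, F_prev, sigma_user, eps_user):
    """Stop when any of the three criteria fires:
    iteration budget exhausted, cost small enough, or cost change tiny."""
    return (k > k_max
            or F_k < sigma_user
            or abs(F_k - F_prev) < eps_user)
```

In practice the iteration-count criterion is kept as a safety net even when the threshold-based criteria are the intended stopping tests.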

Levmar Library

• Implementation of the Levenberg-Marquardt algorithm

• http://www.ics.forth.gr/~lourakis/levmar/

Constrained Nonlinear Optimization

• Finding the minimum value while satisfying some constraints

$$x^* = \arg\min_x F(x) \quad \text{s.t.} \quad g(x) \le c$$
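The slides do not specify a solution method for the constrained problem. One simple approach, assuming the feasible set $\{x : g(x) \le c\}$ is easy to project onto, is projected gradient descent: take a gradient step, then project back onto the feasible set. The objective, constraint (the unit disk), and step size below are all hypothetical illustrations:

```python
import numpy as np

def projected_gradient(grad_F, project, x0, alpha=0.1, k_max=500):
    """Projected gradient descent: gradient step, then project the
    iterate back onto the feasible set {x : g(x) <= c}."""
    x = np.asarray(x0, dtype=float)
    for _ in range(k_max):
        x = project(x - alpha * grad_F(x))
    return x

# Hypothetical example: minimize (x1-2)^2 + (x2-2)^2 s.t. ||x||^2 <= 1.
grad_F = lambda x: 2.0 * (x - np.array([2.0, 2.0]))
project = lambda x: x / max(1.0, np.linalg.norm(x))  # project onto unit disk
x_star = projected_gradient(grad_F, project, [0.0, 0.0])
```

The minimizer is the point of the unit disk closest to (2, 2), i.e. $(1/\sqrt{2},\, 1/\sqrt{2})$, which the iteration reaches quickly.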