Graph Optimization 3 - Optimization Update

From the last article, we derived the following negative log-likelihood function as our optimization target:

$$F(\mathbf{x}) = \sum_{\langle i,j \rangle \in \mathcal{C}} \mathbf{e}_{ij}^T \, \mathbf{\Omega}_{ij} \, \mathbf{e}_{ij}$$

where $\mathbf{e}_{ij} = \mathbf{e}_{ij}(\mathbf{x}_i, \mathbf{x}_j)$ is the error of the constraint between nodes $i$ and $j$, and $\mathbf{\Omega}_{ij}$ is its information matrix.

The optimization problem then becomes:

$$\mathbf{x}^* = \underset{\mathbf{x}}{\operatorname{argmin}} \; F(\mathbf{x})$$

This article explains how this optimization problem can be solved using the Gauss-Newton method.

If we take $\breve{\mathbf{x}}$ as an initial/rough estimate of $\mathbf{x}$, the next step is to look for an increment $\Delta\mathbf{x}$ that minimizes:

$$F(\breve{\mathbf{x}} + \Delta\mathbf{x}) = \sum_{\langle i,j \rangle \in \mathcal{C}} F_{ij}(\breve{\mathbf{x}} + \Delta\mathbf{x})$$

Each term $F_{ij}(\breve{\mathbf{x}} + \Delta\mathbf{x})$ can be expanded as:

$$F_{ij}(\breve{\mathbf{x}} + \Delta\mathbf{x}) = \mathbf{e}_{ij}(\breve{\mathbf{x}} + \Delta\mathbf{x})^T \, \mathbf{\Omega}_{ij} \, \mathbf{e}_{ij}(\breve{\mathbf{x}} + \Delta\mathbf{x})$$

Applying a first-order Taylor approximation to $\mathbf{e}_{ij}$, we get:

$$\mathbf{e}_{ij}(\breve{\mathbf{x}} + \Delta\mathbf{x}) \approx \mathbf{e}_{ij}(\breve{\mathbf{x}}) + \mathbf{J}_{ij} \, \Delta\mathbf{x}$$

where the Jacobian matrix $\mathbf{J}_{ij}$ is defined as:

$$\mathbf{J}_{ij} = \left. \frac{\partial \mathbf{e}_{ij}(\mathbf{x})}{\partial \mathbf{x}} \right|_{\mathbf{x} = \breve{\mathbf{x}}}$$

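As a side note, when an analytic Jacobian is tedious to derive, it can be approximated numerically. A minimal sketch using central finite differences (the error function `err` below is a made-up example for illustration, not from this article series):

```python
import numpy as np

def numerical_jacobian(error, x, eps=1e-6):
    """Central-difference approximation of J = d error(x) / dx."""
    e0 = np.asarray(error(x), dtype=float)
    J = np.zeros((e0.size, x.size))
    for k in range(x.size):
        dx = np.zeros_like(x)
        dx[k] = eps
        # Perturb the k-th state component in both directions.
        J[:, k] = (error(x + dx) - error(x - dx)) / (2.0 * eps)
    return J

# Hypothetical error e(x) = [x0^2, x0*x1]; analytic J = [[2*x0, 0], [x1, x0]]
err = lambda x: np.array([x[0]**2, x[0]*x[1]])
J = numerical_jacobian(err, np.array([1.0, 2.0]))
```

In practice analytic Jacobians are preferred for speed and accuracy, but the finite-difference version is a handy correctness check.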
Substituting the approximation back, we get:

$$F_{ij}(\breve{\mathbf{x}} + \Delta\mathbf{x}) \approx \mathbf{e}_{ij}^T \mathbf{\Omega}_{ij} \mathbf{e}_{ij} + 2 \, \mathbf{e}_{ij}^T \mathbf{\Omega}_{ij} \mathbf{J}_{ij} \, \Delta\mathbf{x} + \Delta\mathbf{x}^T \mathbf{J}_{ij}^T \mathbf{\Omega}_{ij} \mathbf{J}_{ij} \, \Delta\mathbf{x}$$

where $\mathbf{e}_{ij}$ is shorthand for $\mathbf{e}_{ij}(\breve{\mathbf{x}})$.

Now we define:

$$c_{ij} = \mathbf{e}_{ij}^T \mathbf{\Omega}_{ij} \mathbf{e}_{ij}, \qquad \mathbf{b}_{ij}^T = \mathbf{e}_{ij}^T \mathbf{\Omega}_{ij} \mathbf{J}_{ij}, \qquad \mathbf{H}_{ij} = \mathbf{J}_{ij}^T \mathbf{\Omega}_{ij} \mathbf{J}_{ij}$$

Then the sum over all constraints simplifies to:

$$F(\breve{\mathbf{x}} + \Delta\mathbf{x}) \approx c + 2 \, \mathbf{b}^T \Delta\mathbf{x} + \Delta\mathbf{x}^T \mathbf{H} \, \Delta\mathbf{x}$$

with $c = \sum_{\langle i,j \rangle} c_{ij}$, $\mathbf{b} = \sum_{\langle i,j \rangle} \mathbf{b}_{ij}$ and $\mathbf{H} = \sum_{\langle i,j \rangle} \mathbf{H}_{ij}$.

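In code, accumulating $\mathbf{b}$ and $\mathbf{H}$ is a loop over the edges. A minimal dense numpy sketch (real implementations exploit sparsity and only touch the blocks of nodes $i$ and $j$; the edge tuple layout here is my own assumption):

```python
import numpy as np

def build_system(edges, dim):
    """Accumulate b = sum e^T Ω J and H = sum J^T Ω J over all edges."""
    b = np.zeros(dim)
    H = np.zeros((dim, dim))
    for e, Omega, J in edges:   # one tuple (e_ij, Ω_ij, J_ij) per edge
        b += e @ Omega @ J      # b_ij^T = e_ij^T Ω_ij J_ij
        H += J.T @ Omega @ J    # H_ij  = J_ij^T Ω_ij J_ij
    return b, H

# One toy edge with identity information matrix:
edges = [(np.array([1.0, 2.0]), np.eye(2), np.eye(2))]
b, H = build_system(edges, dim=2)
```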
What's more, if $F$ is convex, or at least locally convex in the region close to $\breve{\mathbf{x}}$, then $\Delta\mathbf{x}^*$ is the minimizer if the derivative of $F(\breve{\mathbf{x}} + \Delta\mathbf{x})$ with respect to $\Delta\mathbf{x}$ equals 0,

which leads to:

$$\frac{\partial F(\breve{\mathbf{x}} + \Delta\mathbf{x})}{\partial \Delta\mathbf{x}} = 2 \, \mathbf{b} + 2 \, \mathbf{H} \, \Delta\mathbf{x} = \mathbf{0}$$

then we can solve $\Delta\mathbf{x}^*$ from:

$$\mathbf{H} \, \Delta\mathbf{x}^* = -\mathbf{b}$$

The above equation is a sparse linear system: $\mathbf{H}$ has non-zero blocks only where two nodes are connected by an edge, because each $\mathbf{H}_{ij} = \mathbf{J}_{ij}^T \mathbf{\Omega}_{ij} \mathbf{J}_{ij}$ involves only the state blocks of nodes $i$ and $j$.

Now $\Delta\mathbf{x}^*$ can be solved by sparse Cholesky factorization. Once $\Delta\mathbf{x}^*$ is found, we can compute:

$$\mathbf{x}^* = \breve{\mathbf{x}} + \Delta\mathbf{x}^*$$

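A sketch of the solve step with scipy. Here `spsolve` (SuperLU) stands in for the sparse Cholesky factorization (e.g. CHOLMOD) that dedicated graph optimizers use, since $\mathbf{H}$ is symmetric positive definite; the small matrix is made up for illustration:

```python
import numpy as np
from scipy.sparse import csc_matrix
from scipy.sparse.linalg import spsolve

# A small symmetric positive definite H with some sparsity, and a vector b.
H = csc_matrix(np.array([[4.0, 1.0, 0.0],
                         [1.0, 3.0, 0.0],
                         [0.0, 0.0, 2.0]]))
b = np.array([1.0, 2.0, 2.0])

dx = spsolve(H, -b)   # solve H Δx* = -b
```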
Obviously, the final $\mathbf{x}^*$ can be solved iteratively: take the new estimate as the linearization point, $\breve{\mathbf{x}} \leftarrow \mathbf{x}^*$, and repeat the steps above until $\Delta\mathbf{x}^*$ becomes sufficiently small.

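Putting the whole iteration together on a toy problem: fitting $y = \exp(a t + c)$ by Gauss-Newton with $\mathbf{\Omega} = \mathbf{I}$. This is a generic illustration of the update loop, not a pose-graph implementation; the test function and all names are my own:

```python
import numpy as np

def gauss_newton(residual, jacobian, x0, iters=20, tol=1e-12):
    """Repeat: linearize, solve H Δx* = -b, update x ← x + Δx*."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        e = residual(x)
        J = jacobian(x)
        b = J.T @ e                  # gradient term (Ω = I)
        H = J.T @ J                  # Gauss-Newton Hessian approximation
        dx = np.linalg.solve(H, -b)  # H Δx* = -b
        x = x + dx
        if np.linalg.norm(dx) < tol: # stop once the update is tiny
            break
    return x

# Toy fit: samples of y = exp(0.3 t + 0.5), starting near the solution.
t = np.linspace(0.0, 2.0, 10)
y = np.exp(0.3 * t + 0.5)
res = lambda x: np.exp(x[0] * t + x[1]) - y
jac = lambda x: np.stack([t * np.exp(x[0] * t + x[1]),
                          np.exp(x[0] * t + x[1])], axis=1)
x_star = gauss_newton(res, jac, [0.2, 0.4])
```

Because this toy problem has zero residual at the optimum and the initial guess is close, the iteration converges in a handful of steps; real pose graphs need the sparse machinery described above.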

I am an algorithm engineer focused on computer vision. I know it would be more elegant to shut up and show my code, but I simply can't stop myself from learning and explaining new things ...