# Graph Optimization 3 - Optimization Update

From the last article, we obtained the following negative log-likelihood function as our optimization target:

$$F(x) = \sum_{ij} e_{ij}^T \Omega_{ij}\, e_{ij}$$

where $e_{ij}$ is the error of the edge between vertices $i$ and $j$, and $\Omega_{ij}$ is its information matrix.

The optimization problem then becomes:

$$x^* = \operatorname*{argmin}_{x} F(x)$$

This article explains how this optimization problem can be solved with the Gauss-Newton method.

If we take $x_0$ as an initial (rough) estimate of $x$, the next step is to look for a $\Delta x^*$ that achieves:

$$\Delta x^* = \operatorname*{argmin}_{\Delta x} F(x_0 + \Delta x)$$

$F(x_0+\Delta x)$ can be expanded as:

$$F(x_0+\Delta x) = \sum_{ij} e_{ij}(x_0+\Delta x)^T\, \Omega_{ij}\, e_{ij}(x_0+\Delta x)$$

Applying a first-order Taylor approximation to $e_{ij}$, we get:

$$e_{ij}(x_0+\Delta x) \approx e_{ij}(x_0) + J_{ij}\,\Delta x$$

where the Jacobian matrix $J_{ij}$ is defined as:

$$J_{ij} = \left.\frac{\partial e_{ij}(x)}{\partial x}\right|_{x=x_0}$$
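As a quick aside, such a Jacobian can always be sanity-checked numerically by central differences. Below is a minimal sketch; the error function `e` is a made-up two-residual example, not one from the article:

```python
def e(x):
    """Hypothetical error function with 2 residuals over 2 variables."""
    return [x[0] * x[0] - x[1], x[0] + 2.0 * x[1]]

def numeric_jacobian(f, x0, h=1e-6):
    """J[r][c] = d f_r / d x_c at x0, by central differences."""
    m, n = len(f(x0)), len(x0)
    J = [[0.0] * n for _ in range(m)]
    for c in range(n):
        xp = list(x0); xp[c] += h
        xm = list(x0); xm[c] -= h
        fp, fm = f(xp), f(xm)
        for r in range(m):
            J[r][c] = (fp[r] - fm[r]) / (2.0 * h)
    return J

J = numeric_jacobian(e, [1.0, 0.5])
# analytic Jacobian at (1, 0.5) is [[2*x0, -1], [1, 2]] = [[2, -1], [1, 2]]
```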

Then we get (writing $e_{ij}$ for $e_{ij}(x_0)$):

$$e_{ij}(x_0+\Delta x)^T \Omega_{ij}\, e_{ij}(x_0+\Delta x) \approx e_{ij}^T \Omega_{ij} e_{ij} + 2\, e_{ij}^T \Omega_{ij} J_{ij}\, \Delta x + \Delta x^T J_{ij}^T \Omega_{ij} J_{ij}\, \Delta x$$

Now we define:

$$c_{ij} = e_{ij}^T \Omega_{ij} e_{ij}, \qquad b_{ij} = J_{ij}^T \Omega_{ij} e_{ij}, \qquad H_{ij} = J_{ij}^T \Omega_{ij} J_{ij}$$

and sum over all edges: $c = \sum c_{ij}$, $b = \sum b_{ij}$, $H = \sum H_{ij}$.

Then the equation system simplifies to:

$$F(x_0+\Delta x) \approx c + 2\, b^T \Delta x + \Delta x^T H\, \Delta x$$

What's more, if $F$ is convex, or at least locally convex in the region close to $x_0$, then $F(x_0+\Delta x)$ attains its minimum where its derivative with respect to $\Delta x$ equals 0:

$$\frac{\partial F(x_0+\Delta x)}{\partial \Delta x} \approx 2b + 2H\,\Delta x = 0$$

Then we can solve for $\Delta x^*$:

$$H\,\Delta x^* = -b$$

The above equation is a sparse linear system: each edge constrains only two vertices, so $H_{ij} = J_{ij}^T \Omega_{ij} J_{ij}$ contributes only to the blocks of $H$ associated with $x_i$ and $x_j$, and $H$ inherits the sparsity pattern of the graph.

Now $\Delta x^*$ can be solved by sparse Cholesky factorization. When $\Delta x^*$ is solved, we can compute:

$$x = x_0 + \Delta x^*$$
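To make the Cholesky step concrete, here is a dense toy version in pure Python: factor $H = LL^T$, then solve $H\,\Delta x = -b$ by forward and back substitution. The $2\times 2$ system at the bottom uses made-up numbers; a real solver such as the ones g2o wraps exploits the sparsity of $H$, which this sketch does not.

```python
import math

def cholesky(H):
    """Return lower-triangular L with H = L L^T (H must be SPD)."""
    n = len(H)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                L[i][j] = math.sqrt(H[i][i] - s)
            else:
                L[i][j] = (H[i][j] - s) / L[j][j]
    return L

def solve_spd(H, rhs):
    """Solve H dx = rhs via H = L L^T."""
    n = len(H)
    L = cholesky(H)
    # forward substitution: L y = rhs
    y = [0.0] * n
    for i in range(n):
        y[i] = (rhs[i] - sum(L[i][k] * y[k] for k in range(i))) / L[i][i]
    # back substitution: L^T dx = y
    dx = [0.0] * n
    for i in reversed(range(n)):
        dx[i] = (y[i] - sum(L[k][i] * dx[k] for k in range(i + 1, n))) / L[i][i]
    return dx

# Hypothetical small system H dx = -b
H = [[2.0, -1.0], [-1.0, 2.0]]
b = [0.0, -3.1]
dx = solve_spd(H, [-v for v in b])
# dx is approximately [1.0333, 2.0667]
```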

Obviously, the final $x$ can be solved iteratively, repeating the linearization and solve until $\Delta x^*$ becomes sufficiently small:

$$x_{k+1} = x_k + \Delta x_k^*$$
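The whole loop above can be sketched end to end on a toy 1-D "pose graph" (all numbers hypothetical): three vertices $x_0, x_1, x_2$ on a line, edges measuring $z_{ij} \approx x_j - x_i$, so $e_{ij} = (x_j - x_i) - z_{ij}$ with $\Omega_{ij} = 1$. Vertex $x_0$ is held fixed to remove the gauge freedom. Because these errors are linear in $x$, Gauss-Newton converges in a single iteration here; real pose graphs are nonlinear and need several.

```python
edges = [(0, 1, 1.0), (1, 2, 1.0), (0, 2, 2.1)]  # (i, j, z_ij), inconsistent on purpose
x = [0.0, 0.0, 0.0]          # initial guess x_0; x[0] stays fixed

for _ in range(5):           # iterate x <- x + dx*
    # Build H = sum J^T Omega J and b = sum J^T Omega e
    # over the two free variables (x1, x2).
    H = [[0.0, 0.0], [0.0, 0.0]]
    b = [0.0, 0.0]
    for i, j, z in edges:
        e = (x[j] - x[i]) - z
        # Jacobian of e w.r.t. (x1, x2): -1 at i, +1 at j, 0 for the fixed vertex
        J = [0.0, 0.0]
        if i > 0: J[i - 1] -= 1.0
        if j > 0: J[j - 1] += 1.0
        for r in range(2):
            b[r] += J[r] * e
            for c in range(2):
                H[r][c] += J[r] * J[c]
    # Solve H dx = -b (2x2 via Cramer's rule; g2o uses sparse Cholesky)
    det = H[0][0] * H[1][1] - H[0][1] * H[1][0]
    dx1 = (-b[0] * H[1][1] - H[0][1] * -b[1]) / det
    dx2 = (H[0][0] * -b[1] - -b[0] * H[1][0]) / det
    x[1] += dx1
    x[2] += dx2

# x converges to about [0, 1.0333, 2.0667], spreading the 0.1 of
# inconsistency in the measurements across both free vertices
```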


### Wangxin

I am an algorithm engineer focused on computer vision. I know it would be more elegant to shut up and show my code, but I simply can't stop myself from learning and explaining new things ...
