Next: 4.1.3 Steepest-descent Up: 4.1 Optimization Methods Previous: 4.1.1 Gradient Based Optimization


4.1.2 Step direction

Many optimization algorithms iteratively improve the solution; each iteration consists of four basic steps:

1. If the convergence conditions are satisfied, stop the algorithm with the current point $\vec{x}_k$ as the solution.
2. Compute a direction $\vec{p}_k$.
3. Compute a step length $l_k$.
4. Set the new current point $\vec{x}_{k+1} = \vec{x}_{k} + \vec{p}_k \cdot l_k$; go to step 1.
For multidimensional optimization problems, a major question is how to choose the step direction $\vec{p}_k$ for the next iteration so as to decrease the scalar objective function $f(\vec{x})$. Commonly used choices are the steepest-descent direction, the Newton direction, or combinations of the two.
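The four-step iteration above can be sketched as follows; this is a minimal illustration, assuming the steepest-descent direction $\vec{p}_k = -\nabla f(\vec{x}_k)$ and a simple backtracking rule for the step length $l_k$ (the function names, tolerances, and line-search constants are illustrative, not part of the original text):

```python
import numpy as np

def minimize(f, grad, x0, tol=1e-8, max_iter=1000):
    """Generic four-step iterative scheme with the steepest-descent
    direction and a backtracking (Armijo) line search for l_k."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        # Step 1: convergence test on the gradient norm.
        if np.linalg.norm(g) < tol:
            break
        # Step 2: step direction p_k (here: steepest descent).
        p = -g
        # Step 3: step length l_k via backtracking until a
        # sufficient-decrease (Armijo) condition holds.
        l = 1.0
        while f(x + l * p) > f(x) + 1e-4 * l * (g @ p):
            l *= 0.5
        # Step 4: update the current point and iterate.
        x = x + l * p
    return x

# Usage: minimize f(x) = (x0 - 1)^2 + (x1 + 2)^2, minimum at (1, -2).
xmin = minimize(lambda x: (x[0] - 1)**2 + (x[1] + 2)**2,
                lambda x: np.array([2*(x[0] - 1), 2*(x[1] + 2)]),
                [0.0, 0.0])
```

Swapping the line `p = -g` for a Newton or combined direction changes only step 2; the surrounding loop structure stays the same.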




R. Plasun