5.4 Global Optimization Methods

A gradient-based optimizer is generally unable to find the global extremum of a function. Depending on the chosen starting parameter vector (initial guess), it will most likely get stuck in a local optimum. Several global optimization strategies have been developed to overcome this disadvantage. A global optimizer does not use first- or second-order derivatives to determine the step direction and step width but relies on a heuristic method instead. This section presents two such strategies: a genetic optimization algorithm and optimization based on simulated annealing.
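The ability of such a heuristic to escape local optima can be illustrated with a minimal simulated-annealing sketch. This is not the optimizer used in this work, only a generic illustration: a random candidate step is always accepted if it improves the objective, and a worsening step is accepted with a probability that decreases with a temperature schedule. The step width, cooling rate, and the Rastrigin test function are illustrative choices.

```python
import math
import random

def simulated_annealing(f, x0, step=0.5, t0=1.0, cooling=0.99, n_iter=2000, seed=0):
    """Minimize f from x0 with a basic simulated-annealing loop.

    An improving candidate is always accepted; a worsening one is accepted
    with probability exp(-delta / T), where T shrinks each iteration.
    """
    rng = random.Random(seed)
    x, fx = list(x0), f(x0)
    best_x, best_f = list(x), fx
    t = t0
    for _ in range(n_iter):
        # Random perturbation of the current parameter vector.
        cand = [xi + rng.uniform(-step, step) for xi in x]
        fc = f(cand)
        delta = fc - fx
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            x, fx = cand, fc
            if fx < best_f:
                best_x, best_f = list(x), fx
        t *= cooling  # geometric cooling schedule
    return best_x, best_f

# Rastrigin function: strongly multimodal, global minimum 0 at the origin.
def rastrigin(x):
    return 10 * len(x) + sum(xi * xi - 10 * math.cos(2 * math.pi * xi) for xi in x)

xmin, fmin = simulated_annealing(rastrigin, [3.0, -2.5])
```

Note that no derivative of the objective appears anywhere in the loop; only function values are compared, which is what makes the scheme applicable to non-smooth or noisy objectives.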

One drawback of global optimization strategies compared to local ones is the relatively large number of function evaluations required to find an acceptable minimum. This fact, however, must be qualified in the context of parallelized optimization. As pointed out in Section 5.2, optimizations can be performed on a cluster of workstations. For a gradient optimization, the number of independent jobs is closely tied to the dimension of the parameter vector, which limits cluster utilization, especially for low-dimensional problems. The global optimization strategies presented here, on the other hand, can dispatch an arbitrary number of independent evaluations at once; the achievable parallelism is nearly independent of the problem dimension and thus scales much better with the available computing resources.
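The scaling argument can be sketched as follows: in a population-based optimizer, every individual of a generation is an independent job, so the number of parallel evaluations is set by the population size rather than by the parameter-vector dimension. The function names below are hypothetical, and the quadratic `fitness` merely stands in for an expensive simulator run.

```python
import random
from concurrent.futures import ProcessPoolExecutor

def fitness(x):
    # Stand-in for an expensive simulation; any top-level function works here.
    return sum(xi * xi for xi in x)

def evaluate_population(population, max_workers=4):
    # One job per individual: the degree of parallelism equals the population
    # size, independent of the dimension of each parameter vector.
    with ProcessPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(fitness, population))

random.seed(0)
# 16 individuals of a 2-dimensional problem: 16 parallel jobs, whereas a
# finite-difference gradient step would offer only ~2 independent jobs.
population = [[random.uniform(-5, 5) for _ in range(2)] for _ in range(16)]
scores = evaluate_population(population)
```

On a real cluster the process pool would be replaced by a job-submission layer, but the structure is the same: the population size is a free parameter that can be matched to the number of available workstations.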