

4.2 Genetic Algorithms

Outside the wide area of gradient-based optimization methods there exists a large variety of other methods. Genetic algorithms [68,23] offer one approach to solving multidimensional optimization problems.

The genetic algorithm is a model of machine learning that derives its behavior from the mechanisms of evolution in nature. A population of individuals with slightly different properties, the chromosomes, is created, and these individuals then go through a process of simulated evolution.

The implementation of genetic algorithms usually consists of the following steps:

1. Evaluation of the fitness of all individuals in the population.
2. Creation of a new population by applying operations such as selection, crossover, and mutation to the individuals whose fitness has just been evaluated.
3. Replacement of the old population by the new one for the next iteration.
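
The following sketch illustrates these steps for a simple real-valued encoding. It is only a minimal illustration: the objective model() is a placeholder for the (typically expensive) model function, and the names, operator choices, and parameter values are assumptions made for the example rather than part of the optimizer described here.

  import random

  def model(x):
      # Placeholder for the (expensive) model function; here a simple
      # sphere function that is to be minimized.
      return sum(xi * xi for xi in x)

  def fitness(x):
      # Higher fitness is better, so the objective value is inverted.
      return 1.0 / (1.0 + model(x))

  def genetic_algorithm(pop_size=20, dim=3, bounds=(-5.0, 5.0),
                        generations=50, mutation_rate=0.1, seed=0):
      rng = random.Random(seed)
      lo, hi = bounds
      # Initial population: random individuals within the parameter ranges.
      population = [[rng.uniform(lo, hi) for _ in range(dim)]
                    for _ in range(pop_size)]
      for _ in range(generations):
          # Step 1: evaluate the fitness of all individuals.
          ranked = sorted(population, key=fitness, reverse=True)
          # Step 2: create a new population by selection, crossover,
          # and mutation applied to the fitter half.
          parents = ranked[:pop_size // 2]
          children = []
          while len(children) < pop_size:
              a, b = rng.sample(parents, 2)
              child = [ai if rng.random() < 0.5 else bi
                       for ai, bi in zip(a, b)]
              if rng.random() < mutation_rate:
                  child[rng.randrange(dim)] = rng.uniform(lo, hi)
              children.append(child)
          # Step 3: replace the old population with the new one.
          population = children
      return max(population, key=fitness)

  print(genetic_algorithm())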

Genetic algorithms are often described as global search methods that do not use gradient information. They are capable of finding global optima. On the other hand, they need an enormous number of evaluations of the model function, which makes it necessary for each evaluation to be relatively fast.

Genetic algorithms are generally highly parallelizable because the evaluations required in each generation, one per individual of the population, are independent and can be performed separately. The size of the population in each generation depends on the number of parameters and their ranges.
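
A sketch of such a parallel evaluation is given below, assuming a hypothetical evaluate_model() that stands in for one independent model evaluation (for example a process or device simulation); the pool-based distribution is just one possible realization.

  from concurrent.futures import ProcessPoolExecutor

  def evaluate_model(x):
      # Placeholder for one expensive, independent model evaluation
      # (e.g. a process or device simulation).
      return sum(xi * xi for xi in x)

  def evaluate_population(population, workers=4):
      # The individuals of one generation are independent of each other,
      # so their evaluations can be distributed over several processes.
      with ProcessPoolExecutor(max_workers=workers) as pool:
          return list(pool.map(evaluate_model, population))

  if __name__ == "__main__":
      population = [[1.0, 2.0], [0.5, -0.5], [3.0, 0.0]]
      print(evaluate_population(population))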

Genetic algorithms are very robust, simple to implement, and very general. Because of the large number of required evaluations, however, genetic algorithms are not always the best optimization tools. If a good specialized optimizer exists, for example a gradient-based algorithm that is able to solve the problem, that method is more efficient.

For microelectronics applications the number of required evaluations, even when they are performed independently, is too large for practical examples. Hybrid algorithms that combine existing optimization methods with genetic algorithms, however, may give better results.
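
One possible form of such a hybrid scheme is sketched below under the assumption that SciPy is available: a deliberately short genetic search provides a starting point, which a gradient-based method then refines locally. The objective model() and all function names are again illustrative placeholders, not the methods used in this work.

  import random
  from scipy.optimize import minimize

  def model(x):
      # Placeholder objective (sphere function).
      return sum(xi * xi for xi in x)

  def coarse_genetic_search(dim=3, bounds=(-5.0, 5.0), pop_size=20,
                            generations=10, seed=0):
      # A deliberately short genetic search; only a rough solution is
      # needed as a starting point for the local optimizer.
      rng = random.Random(seed)
      lo, hi = bounds
      population = [[rng.uniform(lo, hi) for _ in range(dim)]
                    for _ in range(pop_size)]
      for _ in range(generations):
          population.sort(key=model)
          parents = population[:pop_size // 2]
          # Uniform crossover of two sampled parents plus Gaussian mutation.
          population = [[rng.choice(pair) + rng.gauss(0.0, 0.1)
                         for pair in zip(*rng.sample(parents, 2))]
                        for _ in range(pop_size)]
      return min(population, key=model)

  # The genetic stage locates the region of the optimum; the
  # gradient-based stage then converges quickly from that starting point.
  x0 = coarse_genetic_search()
  result = minimize(model, x0, method="BFGS")
  print(result.x)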


