8.4.3 A State of the Art Algorithm

This steady-state algorithm works as follows. Its parameters are the population size $ m$, the number $ s$ of steady-state individuals, the ratio $ \sigma/\ell$ of the Gaussian mutation operator (cf. Section 8.3.3), and the mutation probability $ p_m$.

  1. Set the time (i.e., the generation number) $ t:=0$.
  2. Initialize the population $ P(0)$ using random numbers.
  3. Evaluate $ P(0)$ using the given objective function $ f$.
  4. While the population has not converged and the maximum number of generations has not been reached, do
    1. Increase time, $ t:=t+1$.
    2. Construct the new generation $ P(t)$ from the old one $ P(t-1)$:
      1. Construct $ \vert P(t-1)\vert - s$ individuals using point in the middle crossover (cf. Section 8.3.2), where the parents are chosen using a linearly scaled roulette wheel (cf. Section 8.3.1).
      2. Mutate the population using the Gaussian mutation operator (cf. Section 8.3.3) with the given ratio $ \sigma/\ell$ and mutation probability $ p_m$.
      3. The new population $ P(t)$ consists of the best $ s$ individuals from $ P(t-1)$ and the $ \vert P(t-1)\vert - s$ individuals just constructed.
    3. Evaluate $ P(t)$ in parallel. Remember the best individual ever found.
    4. Repeat.
  5. Finally, return the best individual found.
Typical values for the parameters of the algorithm are $ m=50$, $ s=5$, $ \sigma/\ell=0.1$, and $ p_m=0.2$.
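The steps above can be sketched in code. The following is a minimal, serial rendering in Python, not a definitive implementation: the exact forms of the operators are assumptions here (midpoint crossover for Section 8.3.2, per-individual Gaussian mutation with step size $ \sigma = (\sigma/\ell)$ times the interval length for Section 8.3.3), and a plain shifted roulette wheel stands in for the linearly scaled wheel of Section 8.3.1.

```python
import random

def roulette(pop, fit, rng):
    """Fitness-proportional selection; a plain shifted wheel stands in
    here for the linearly scaled wheel of Section 8.3.1."""
    fmin = min(fit)
    weights = [g - fmin + 1e-12 for g in fit]   # shift so all weights are >= 0
    return rng.choices(pop, weights=weights, k=1)[0]

def steady_state_ga(f, dim, m=50, s=5, sigma_ratio=0.1, p_m=0.2,
                    max_gens=100, lo=-1.0, hi=1.0, seed=0):
    """Maximize f over [lo, hi]^dim with the steady-state GA sketched above."""
    rng = random.Random(seed)
    sigma = sigma_ratio * (hi - lo)             # assumed: sigma = (sigma/ell) * interval length
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(m)]
    fit = [f(x) for x in pop]                   # step 3: evaluate P(0)
    best = max(zip(fit, pop))                   # best individual ever found
    for t in range(1, max_gens + 1):            # step 4: generation loop
        # step 4.2.3: carry over the s best individuals unchanged
        elite = [x[:] for _, x in sorted(zip(fit, pop), reverse=True)[:s]]
        children = []
        for _ in range(m - s):                  # step 4.2.1: construct m - s individuals
            p1, p2 = roulette(pop, fit, rng), roulette(pop, fit, rng)
            child = [(a + b) / 2 for a, b in zip(p1, p2)]  # midpoint crossover (assumed form)
            if rng.random() < p_m:              # step 4.2.2: Gaussian mutation
                child = [min(hi, max(lo, g + rng.gauss(0.0, sigma))) for g in child]
            children.append(child)
        pop = elite + children
        fit = [f(x) for x in pop]               # step 4.3: evaluate P(t) (serially here)
        best = max(best, max(zip(fit, pop)))
    return best                                 # step 5: (fitness, individual)
```

Since the $ s$ best individuals survive each generation unchanged, the best fitness seen is monotonically non-decreasing over the run.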

The most notable difference from the classic algorithm is the use of a linearly scaled roulette wheel for selecting promising individuals. This ensures consistent selection pressure throughout the optimization. The selection process must strike the right balance in favoring individuals of high fitness, i.e., individuals likely to solve the given optimization problem. If their probability of proliferation is too close to that of individuals of low fitness, the population converges slowly or not at all. If the difference is too big, premature convergence to a local extremum is likely. The linearly scaled roulette wheel selection provides a good compromise.
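One common way to realize such a wheel is linear fitness scaling in the style $ f' = af + b$, with $ a$ and $ b$ chosen so that the average fitness is preserved and the best individual receives $ c$ times the average weight (with $ c \approx 2$); the exact scaling of Section 8.3.1 may differ, so the following is a hedged sketch under that assumption:

```python
import random

def linearly_scaled_wheel(pop, fit, rng, c=2.0):
    """Roulette wheel on linearly scaled fitness f' = a*f + b, chosen so
    that the average is preserved and the best gets c times the average
    weight; negative scaled values are clipped to zero (assumed form)."""
    avg = sum(fit) / len(fit)
    fmax = max(fit)
    if fmax > avg:                          # avoid division by zero when
        a = (c - 1.0) * avg / (fmax - avg)  # the population has converged
        b = (1.0 - a) * avg
    else:
        a, b = 1.0, 0.0
    weights = [max(0.0, a * g + b) for g in fit]
    if sum(weights) == 0.0:
        weights = [1.0] * len(fit)          # degenerate case: uniform choice
    return rng.choices(pop, weights=weights, k=1)[0]
```

Because the ratio of the best weight to the average weight is pinned at $ c$, the selection pressure stays roughly constant even as the raw fitness values cluster together late in the run.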

Furthermore, all new individuals are introduced by the point in the middle crossover operator (cf. Section 8.3.2), which is advantageous for locating local extrema and exploring the search space. In the following, the building blocks of genetic algorithms are discussed in more detail.
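Assuming the operator of Section 8.3.2 takes the componentwise midpoint of two real-valued parents (an assumption, not a quotation of that section), it can be written in a few lines:

```python
def point_in_the_middle_crossover(p1, p2):
    """Assumed form: the child is the componentwise midpoint of its two
    real-valued parents, pulling offspring toward promising regions
    between them."""
    return [(a + b) / 2.0 for a, b in zip(p1, p2)]

# Example: the child lies halfway between the parents in each coordinate.
child = point_in_the_middle_crossover([0.0, 2.0], [2.0, 0.0])  # -> [1.0, 1.0]
```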

Clemens Heitzinger 2003-05-08