GENETIC ALGORITHM PARAMETER TUNING USING EXPLORATORY LANDSCAPE ANALYSIS AND MACHINE LEARNING

  • M.V. Pikalov, ITMO University
  • A.M. Pismerov, ITMO University
Keywords: Evolutionary algorithms, parameter tuning, landscape analysis

Abstract

The choice of parameter values in evolutionary algorithms greatly affects their performance. Many
popular parameter tuning methods are constrained by the number of fitness function evaluations available
for finding a good set of parameter values. Recently, an approach to algorithm selection for optimization
problems has been proposed that uses fitness landscape analysis and machine learning to select the optimal
algorithm based on characteristics of the problem's landscape. This application of fitness landscape analysis
motivates further research, particularly in the context of parameter tuning for evolutionary algorithms.
Landscape features make it possible to identify similar problems and to reuse tuning data obtained on
benchmark problems, significantly reducing the number of fitness function evaluations required during
tuning. This work considers an approach to automatic parameter selection based on landscape analysis of
the objective function and machine learning, with a genetic algorithm as the example. The proposed solution
evaluates landscape characteristics of the optimization problem's objective function and suggests optimal
parameter values for the algorithm using a neural network. The network was trained on a dataset pairing
landscape characteristics, expressed as numerical features, with the corresponding optimal algorithm
parameter sets. In contrast to approaches for automatic algorithm selection for a specific problem, this
work addresses the regression of algorithm parameters rather than the classification of the most suitable
algorithm from a given set. Experiments on different configurations of the W-model problem, as well as on
the MAX-3SAT problem, show that the proposed approach to automatic parameter selection based on the
landscape of the objective function can help determine appropriate values for the static parameters of the
genetic algorithm: on average, the algorithm with the suggested parameter values requires fewer objective
function evaluations to find the optimum than the other algorithms considered.
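The pipeline described in the abstract — estimate numerical landscape features of the objective function, then map them to tuned parameter values learned on benchmark problems — can be sketched as follows. All names, the two toy features, and the feature-to-parameter table below are illustrative assumptions, not taken from the paper; a 1-nearest-neighbour lookup stands in for the trained neural network regressor.

```python
import math
import random

def landscape_features(f, n, samples=200, walk_len=100, seed=1):
    """Estimate two simple ELA-style features of a pseudo-Boolean
    objective f over bit strings of length n (illustrative only)."""
    rng = random.Random(seed)
    # Ruggedness proxy: lag-1 autocorrelation of fitness along a
    # random walk in the one-bit-flip neighbourhood.
    x = [rng.randint(0, 1) for _ in range(n)]
    walk = []
    for _ in range(walk_len):
        walk.append(f(x))
        x[rng.randrange(n)] ^= 1  # flip one random bit
    mean = sum(walk) / len(walk)
    num = sum((walk[i] - mean) * (walk[i + 1] - mean)
              for i in range(len(walk) - 1))
    den = sum((v - mean) ** 2 for v in walk) or 1.0
    autocorr = num / den
    # Dispersion proxy: fitness spread over uniform random samples.
    vals = [f([rng.randint(0, 1) for _ in range(n)]) for _ in range(samples)]
    spread = (max(vals) - min(vals)) / (abs(mean) + 1.0)
    return (autocorr, spread)

# Hypothetical benchmark data: feature vectors paired with tuned GA
# parameters.  In the paper a neural network is trained on such pairs;
# here a 1-nearest-neighbour lookup plays that role.
BENCHMARK = [
    ((0.9, 1.0), {"pop_size": 20, "mutation_rate": 0.01}),   # smooth
    ((0.5, 2.0), {"pop_size": 100, "mutation_rate": 0.05}),  # moderate
    ((0.1, 3.0), {"pop_size": 300, "mutation_rate": 0.10}),  # rugged
]

def suggest_parameters(features):
    """Return the parameter set of the closest known feature vector."""
    _, params = min(BENCHMARK, key=lambda row: math.dist(row[0], features))
    return params

onemax = lambda bits: sum(bits)  # smooth, unimodal test objective
print(suggest_parameters(landscape_features(onemax, n=32)))
```

Note that feature estimation itself consumes a fixed, small evaluation budget (here 300 calls to `f`), which is the source of the savings the abstract claims over tuning methods that evaluate the objective for every candidate parameter set.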

Published: 2024-05-28
Section: SECTION III. INFORMATION PROCESSING ALGORITHMS