
A supervised learning approach involving active subspaces for an efficient genetic algorithm in high-dimensional optimization problems / Demo, N.; Tezzele, M.; Rozza, G. - In: SIAM JOURNAL ON SCIENTIFIC COMPUTING. - ISSN 1064-8275. - 43:3(2021), pp. 831-853. [10.1137/20M1345219]

A supervised learning approach involving active subspaces for an efficient genetic algorithm in high-dimensional optimization problems

Demo N.; Tezzele M.; Rozza G.
2021-01-01

Abstract

In this work, we present an extension of the genetic algorithm (GA) that exploits the supervised learning technique known as active subspaces (AS) to evolve the individuals in a lower-dimensional space. In many cases, the GA requires more function evaluations than other optimization methods to converge to the global optimum, so complex and high-dimensional functions can become extremely demanding, from a computational point of view, to optimize with the standard algorithm. To address this issue, we propose to linearly map the input parameter space of the original function onto its AS before the evolution, performing the mutation and mating processes in a lower-dimensional space. In this contribution, we describe the novel method, called ASGA, presenting its differences from and similarities with the standard GA. We test the proposed method on n-dimensional benchmark functions (Rosenbrock, Ackley, Bohachevsky, Rastrigin, Schaffer N. 7, and Zakharov) and finally apply it to an aeronautical shape optimization problem.
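The abstract describes a two-stage procedure: estimate the active subspace of the objective, then perform the GA variation steps (crossover and mutation) in the reduced coordinates. The snippet below is a minimal NumPy sketch of that idea, not the authors' implementation: the names (active_subspace, asga_sketch), the naive back-mapping x = W1 z, the blend crossover, and all population settings are illustrative assumptions; a faithful implementation would also have to decide how full-dimensional individuals are reconstructed from reduced ones, which this toy version handles with a simple pseudo-inverse.

import numpy as np

def active_subspace(gradients, dim):
    """Leading eigenvectors of the gradient second-moment matrix C ~ E[grad grad^T]."""
    C = gradients.T @ gradients / gradients.shape[0]
    eigvals, eigvecs = np.linalg.eigh(C)          # ascending eigenvalues
    order = np.argsort(eigvals)[::-1]             # reorder to descending
    return eigvecs[:, order[:dim]]                # (n_dim, dim) projection matrix W1

def asga_sketch(f, grad_f, lo, hi, as_dim=2, pop=40, gens=50, seed=0):
    """Toy GA that recombines and mutates individuals in the active subspace."""
    rng = np.random.default_rng(seed)
    n_dim = lo.size
    # 1. sample the full space and estimate the active subspace from gradients
    X = rng.uniform(lo, hi, size=(200, n_dim))
    G = np.array([grad_f(x) for x in X])
    W1 = active_subspace(G, as_dim)
    # 2. project an initial population onto the reduced coordinates z = W1^T x
    Z = rng.uniform(lo, hi, size=(pop, n_dim)) @ W1
    best_x, best_f = None, np.inf
    for _ in range(gens):
        # naive back-map x = W1 z (only an assumption; the paper treats this step more carefully)
        X_full = np.clip(Z @ W1.T, lo, hi)
        fit = np.array([f(x) for x in X_full])
        if fit.min() < best_f:
            best_f, best_x = fit.min(), X_full[fit.argmin()]
        # 3. selection: keep the best half as parents
        parents = Z[np.argsort(fit)[: pop // 2]]
        # 4. blend crossover and Gaussian mutation, both in the low-dimensional space
        a = parents[rng.integers(0, len(parents), pop)]
        b = parents[rng.integers(0, len(parents), pop)]
        Z = 0.5 * (a + b) + 0.1 * rng.standard_normal((pop, as_dim))
    return best_x, best_f

# Usage on a 10-dimensional quadratic with one dominant direction (illustrative only)
w = np.linspace(1.0, 2.0, 10)
f = lambda x: (w @ x) ** 2 + 1e-3 * np.sum(x ** 2)
grad_f = lambda x: 2 * (w @ x) * w + 2e-3 * x
x_best, f_best = asga_sketch(f, grad_f, lo=-np.ones(10), hi=np.ones(10))

The point mirrored from the abstract is that fitness is still evaluated in the full parameter space, while the evolutionary operators act only on the as_dim reduced coordinates.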
Year: 2021
Volume: 43
Issue: 3
Pages: 831-853
DOI: 10.1137/20M1345219
arXiv: https://arxiv.org/abs/2006.07282
Authors: Demo, N.; Tezzele, M.; Rozza, G.
Files in this item:

File: 20m1345219.pdf
Access: open access
Description: publisher's PDF
Type: Publisher's Version (PDF)
License: not specified
Size: 1.71 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.11767/124578
Citations
  • Scopus: 16
  • Web of Science (ISI): 8