Multi-fidelity data fusion for the approximation of scalar functions with low intrinsic dimensionality through active subspaces / Romor, Francesco; Tezzele, Marco; Rozza, Gianluigi. - In: PROCEEDINGS IN APPLIED MATHEMATICS AND MECHANICS. - ISSN 1617-7061. - 20:S1(2021), pp. 1-8. (Paper presented at the 7th GAMM Juniors' Summer School on Applied Mathematics and Mechanics (SAMM20)) [10.1002/pamm.202000349].

Multi‐fidelity data fusion for the approximation of scalar functions with low intrinsic dimensionality through active subspaces

Romor, Francesco; Tezzele, Marco; Rozza, Gianluigi
2021-01-01

Abstract

Gaussian processes are employed for non-parametric regression in a Bayesian setting. They generalize linear regression by embedding the inputs in a latent manifold inside an infinite-dimensional reproducing kernel Hilbert space. We can augment the inputs with the observations of low-fidelity models in order to learn a more expressive latent manifold and thus increase the model's accuracy. This can be realized recursively with a chain of Gaussian processes of incrementally higher fidelity. We would like to extend these multi-fidelity model realizations to case studies affected by a high-dimensional input space but with a low intrinsic dimensionality. In these cases, physics-based or purely numerical low-order models are still affected by the curse of dimensionality when queried for responses. When the model's gradient information is provided, the presence of an active subspace can be exploited to design low-fidelity response surfaces and thus enable Gaussian process multi-fidelity regression, without the need to perform new simulations. This is particularly useful in the case of data scarcity. In this work we present a multi-fidelity approach involving active subspaces and we test it on two different high-dimensional benchmarks.
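
The abstract outlines a concrete recipe: estimate an active subspace from gradient samples, build a cheap low-fidelity response surface on the active variable, and feed its predictions as an extra input to a high-fidelity Gaussian process. The snippet below is a minimal illustrative sketch of that recipe, assuming a synthetic ridge-like test function and scikit-learn Gaussian processes; the function names and data are hypothetical and this is not the authors' implementation.

    # Minimal sketch, assuming synthetic data: (1) one-dimensional active subspace
    # from gradient samples, (2) low-fidelity response surface on the active
    # variable, (3) high-fidelity GP with the low-fidelity prediction as extra input.
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, ConstantKernel

    rng = np.random.default_rng(0)
    dim = 10                                # high-dimensional input space
    w_true = rng.normal(size=dim)
    w_true /= np.linalg.norm(w_true)        # hidden ridge direction (low intrinsic dimensionality)

    def f(x):                               # scalar model, effectively one-dimensional
        s = x @ w_true
        return np.sin(s) + 0.5 * s**2

    def grad_f(x):                          # gradients provided by the model
        s = x @ w_true
        return (np.cos(s) + s)[:, None] * w_true

    # (1) Active subspace: eigendecomposition of the uncentered gradient covariance
    X_grad = rng.uniform(-1, 1, size=(200, dim))
    G = grad_f(X_grad)
    C = G.T @ G / G.shape[0]
    eigvals, eigvecs = np.linalg.eigh(C)
    W1 = eigvecs[:, [-1]]                   # dominant eigenvector spans the active subspace

    # (2) Low-fidelity response surface on the active variable (no new simulations needed)
    X_lf = rng.uniform(-1, 1, size=(100, dim))
    gp_lf = GaussianProcessRegressor(ConstantKernel() * RBF(), normalize_y=True)
    gp_lf.fit(X_lf @ W1, f(X_lf))

    # (3) High-fidelity GP on few samples, inputs augmented with low-fidelity predictions
    X_hf = rng.uniform(-1, 1, size=(15, dim))   # data-scarce high-fidelity regime
    Z_hf = np.hstack([X_hf, gp_lf.predict(X_hf @ W1)[:, None]])
    gp_mf = GaussianProcessRegressor(ConstantKernel() * RBF(), normalize_y=True)
    gp_mf.fit(Z_hf, f(X_hf))

    X_test = rng.uniform(-1, 1, size=(500, dim))
    Z_test = np.hstack([X_test, gp_lf.predict(X_test @ W1)[:, None]])
    err = np.linalg.norm(gp_mf.predict(Z_test) - f(X_test)) / np.linalg.norm(f(X_test))
    print(f"relative L2 test error: {err:.3f}")

In the paper the gradients come from the simulation model itself, so the low-fidelity surrogate over the active variable is obtained from existing data rather than from new model evaluations, which is what makes the approach attractive under data scarcity.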
Year: 2021
Journal: PROCEEDINGS IN APPLIED MATHEMATICS AND MECHANICS
Volume: 20
Issue: S1
Pages: 1-8
Preprint: https://arxiv.org/abs/2010.08349
Publisher: Wiley
Authors: Romor, Francesco; Tezzele, Marco; Rozza, Gianluigi

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.11767/130250