Composite Bayesian Optimization in function spaces using NEON—Neural Epistemic Operator Networks


Bibliographic Details
Main Authors: Leonardo Ferreira Guilhoto, Paris Perdikaris
Format: Article
Language: English
Published: Nature Portfolio, 2024-11-01
Series: Scientific Reports
Subjects: Deep learning; Autonomous experimentation; Uncertainty quantification
Online Access: https://doi.org/10.1038/s41598-024-79621-7
collection DOAJ
description Abstract Operator learning is a rising field of scientific computing where inputs or outputs of a machine learning model are functions defined in infinite-dimensional spaces. In this paper, we introduce Neon (Neural Epistemic Operator Networks), an architecture for generating predictions with uncertainty using a single operator network backbone, which has orders of magnitude fewer trainable parameters than deep ensembles of comparable performance. We showcase the utility of this method for sequential decision-making by examining the problem of composite Bayesian Optimization (BO), where we aim to optimize a function $$f=g\circ h$$, where $$h:X\rightarrow C(\mathscr {Y},{\mathbb {R}}^{d_s})$$ is an unknown map which outputs elements of a function space, and $$g: C(\mathscr {Y},{\mathbb {R}}^{d_s})\rightarrow {\mathbb {R}}$$ is a known and cheap-to-compute functional. By comparing our approach to other state-of-the-art methods on toy and real-world scenarios, we demonstrate that Neon achieves state-of-the-art performance while requiring orders of magnitude fewer trainable parameters.
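The composite structure described in the abstract can be sketched in code. This is a minimal, hypothetical illustration (not the paper's implementation): `h` stands in for the unknown, expensive map whose output is a function discretized on a grid, and `g` is a known, cheap functional of that function; all names and the toy simulator are assumptions made for the example.

```python
# Illustrative sketch of the composite objective f = g ∘ h.
# h : X -> C(Y, R) is the unknown, expensive map (here a toy stand-in);
# g : C(Y, R) -> R is the known, cheap-to-compute functional.
import numpy as np

ys = np.linspace(0.0, 1.0, 101)  # discretization of the domain Y

def h(x: float) -> np.ndarray:
    """Unknown, expensive map: returns a function sampled on the grid ys."""
    return np.sin(x * ys) + x * ys**2  # toy simulator, purely illustrative

def g(s: np.ndarray) -> float:
    """Known, cheap functional: a discrete mean-square norm of the function."""
    return float(np.mean(s**2))

def f(x: float) -> float:
    """The composite objective f = g ∘ h that composite BO optimizes."""
    return g(h(x))

# Composite BO exploits that g is known: a surrogate with uncertainty (such
# as Neon) models only h, and candidate designs x are scored by pushing the
# surrogate's predicted functions through g. Here we just evaluate f directly.
best_x = max(np.linspace(0.0, 2.0, 21), key=f)
```

The point of the composite setup is that modeling `h` (with uncertainty) and evaluating `g` exactly is more sample-efficient than treating `f` as a black box.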
format Article
id doaj-art-ac03a9fac4b746bf83a8abbe9ca072cd
institution Kabale University
issn 2045-2322
language English
publishDate 2024-11-01
publisher Nature Portfolio
record_format Article
series Scientific Reports
affiliations Leonardo Ferreira Guilhoto: Graduate Group on Applied Mathematics & Computational Science, University of Pennsylvania; Paris Perdikaris: Mechanical Engineering & Applied Mechanics, University of Pennsylvania
topic Deep learning
Autonomous experimentation
Uncertainty quantification
url https://doi.org/10.1038/s41598-024-79621-7