Random Grid Neural Processes for Parametric Partial Differential Equations

Accepted version
Peer-reviewed

Type

Conference Object

Change log

Authors

Vadeboncoeur, Arnaud (ORCID: https://orcid.org/0000-0003-4124-6763)
Kazlauskaite, I 
Papandreou, Y 
Cirak, F 
Girolami, M 

Abstract

We introduce a new class of spatially stochastic, physics- and data-informed deep latent models for parametric partial differential equations (PDEs) which operate through scalable variational neural processes. We achieve this by assigning probability measures to the spatial domain, which allows us to treat collocation grids probabilistically as random variables to be marginalised out. Adopting this spatial statistics view, we solve forward and inverse problems for parametric PDEs in a way that leads to the construction of Gaussian process models of solution fields. The implementation of these random grids poses a unique set of challenges for inverse physics-informed deep learning frameworks, and we propose a new architecture, Grid Invariant Convolutional Networks (GICNets), to overcome them. We further show how to incorporate noisy data in a principled manner into our physics-informed model to improve predictions for problems where data may be available but whose measurement locations do not coincide with any fixed mesh or grid. The proposed method is tested on a nonlinear Poisson problem, Burgers' equation, and the Navier-Stokes equations, and we provide extensive numerical comparisons. We demonstrate significant computational advantages over current physics-informed neural learning methods for parametric PDEs while improving the predictive capabilities and flexibility of these models.
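
The central computational idea in the abstract, treating the collocation grid as a random variable drawn from a probability measure on the spatial domain and marginalising it out of the physics-informed loss, can be illustrated with a minimal sketch. The JAX snippet below is not the authors' implementation: the small MLP surrogate u_theta, the 1D Poisson forcing term, and the uniform grid measure on [0, 1] are illustrative assumptions, and the paper's GICNet architecture and variational neural process machinery are not reproduced.

```python
import jax
import jax.numpy as jnp

def init_params(key, widths=(1, 32, 32, 1)):
    # Small MLP surrogate u_theta(x); purely illustrative, not the paper's GICNet.
    params = []
    for d_in, d_out in zip(widths[:-1], widths[1:]):
        key, sub = jax.random.split(key)
        W = jax.random.normal(sub, (d_in, d_out)) / jnp.sqrt(d_in)
        params.append((W, jnp.zeros(d_out)))
    return params

def u_theta(params, x):
    # Scalar input x -> scalar output u(x).
    h = jnp.atleast_1d(x)
    for W, b in params[:-1]:
        h = jnp.tanh(h @ W + b)
    W, b = params[-1]
    return (h @ W + b)[0]

def residual(params, x):
    # Residual of an illustrative 1D Poisson problem u''(x) = f(x).
    f = lambda x: jnp.sin(jnp.pi * x)  # assumed forcing term
    d2u = jax.grad(jax.grad(lambda x: u_theta(params, x)))
    return d2u(x) - f(x)

def random_grid_loss(params, key, n_colloc=64):
    # Draw a fresh collocation grid from a probability measure on the domain
    # (uniform on [0, 1] here) and average the squared residual over it;
    # averaging over repeated draws marginalises out the random grid.
    xs = jax.random.uniform(key, (n_colloc,))
    r = jax.vmap(lambda x: residual(params, x))(xs)
    return jnp.mean(r ** 2)

# One Monte Carlo loss/gradient evaluation with a freshly resampled grid.
key = jax.random.PRNGKey(0)
params = init_params(key)
loss, grads = jax.value_and_grad(random_grid_loss)(params, jax.random.PRNGKey(1))
```

Resampling the grid at every evaluation, rather than fixing a mesh, is what lets measurement locations and collocation points differ freely from step to step; the paper builds its variational and Gaussian process treatment on top of this idea.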

Description

Keywords

Journal Title

Proceedings of Machine Learning Research

Conference Name

International Conference on Machine Learning 2023

Journal ISSN

2640-3498

Volume Title

Publisher

Publisher DOI

Publisher URL