Document detail
ID

oai:arXiv.org:2311.04457

Topic
Computer Science - Machine Learning; Computer Science - Artificial Intelligence; Physics - Computational Physics; Physics - Fluid Dynamics
Author
Dongre, Vardhan; Hora, Gurpreet Singh
Category

Computer Science

Year

2023

Listing date

11/15/2023

Keywords
science, physics, Bayesian, predictions
Abstract

The accessibility of spatially distributed data, enabled by affordable sensors as well as field and numerical experiments, has facilitated the development of data-driven solutions for scientific problems, including climate change, weather prediction, and urban planning.

Neural Partial Differential Equations (Neural PDEs), which combine deep learning (DL) techniques with domain expertise (e.g., governing equations) for parameterization, have proven to be effective in capturing valuable correlations within spatiotemporal datasets.
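
As a rough illustration of how a governing equation is folded into the learning problem, the following minimal sketch (not the authors' implementation; model, burgers_residual, and total_loss are hypothetical names) penalizes the residual of the 1D viscous Burgers equation alongside a data misfit on sparse observations, using PyTorch autograd:

import torch

def burgers_residual(model, x, t, nu=0.01):
    # PDE residual u_t + u*u_x - nu*u_xx of the network's prediction u(x, t),
    # computed with automatic differentiation.
    x = x.clone().requires_grad_(True)
    t = t.clone().requires_grad_(True)
    u = model(torch.cat([x, t], dim=1))
    u_x, u_t = torch.autograd.grad(u, (x, t), torch.ones_like(u), create_graph=True)
    u_xx = torch.autograd.grad(u_x, x, torch.ones_like(u_x), create_graph=True)[0]
    return u_t + u * u_x - nu * u_xx

def total_loss(model, x_obs, t_obs, u_obs, x_col, t_col, nu=0.01):
    # Data misfit on sparse, possibly noisy observations plus the physics
    # residual evaluated at collocation points.
    u_pred = model(torch.cat([x_obs, t_obs], dim=1))
    data_loss = torch.mean((u_pred - u_obs) ** 2)
    phys_loss = torch.mean(burgers_residual(model, x_col, t_col, nu) ** 2)
    return data_loss + phys_loss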

However, sparse and noisy measurements coupled with modeling approximation introduce aleatoric and epistemic uncertainties.

Therefore, quantifying uncertainties propagated from model inputs to outputs remains a challenge and an essential goal for establishing the trustworthiness of Neural PDEs.

This work evaluates various Uncertainty Quantification (UQ) approaches for both Forward and Inverse Problems in scientific applications.

Specifically, we investigate the effectiveness of Bayesian methods, such as Hamiltonian Monte Carlo (HMC) and Monte-Carlo Dropout (MCD), and a more conventional approach, Deep Ensembles (DE).
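
For reference, the two non-HMC approaches named above can be sketched in a few lines; this is a generic illustration assuming an ordinary PyTorch regressor, not the paper's code:

import torch

def mc_dropout_predict(model, x, n_samples=50):
    # Monte-Carlo Dropout: keep dropout layers active at test time and
    # aggregate repeated stochastic forward passes.
    model.train()  # train mode keeps dropout enabled
    with torch.no_grad():
        samples = torch.stack([model(x) for _ in range(n_samples)])
    return samples.mean(dim=0), samples.std(dim=0)  # predictive mean and spread

def deep_ensemble_predict(models, x):
    # Deep Ensembles: independently trained networks; disagreement across
    # ensemble members serves as the uncertainty estimate.
    with torch.no_grad():
        samples = torch.stack([m(x) for m in models])
    return samples.mean(dim=0), samples.std(dim=0)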

To illustrate their performance, we take two canonical PDEs: Burgers' equation and the Navier-Stokes equation.
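
For context, the standard forms of these two equations (the paper's exact non-dimensionalization may differ) are:

% 1D viscous Burgers' equation
\frac{\partial u}{\partial t} + u\,\frac{\partial u}{\partial x} = \nu\,\frac{\partial^2 u}{\partial x^2}

% Incompressible Navier-Stokes equations
\frac{\partial \mathbf{u}}{\partial t} + (\mathbf{u}\cdot\nabla)\mathbf{u}
  = -\frac{1}{\rho}\nabla p + \nu\,\nabla^2 \mathbf{u},
\qquad \nabla\cdot\mathbf{u} = 0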

Our results indicate that Neural PDEs can effectively reconstruct flow systems and predict the associated unknown parameters.
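
In the inverse setting alluded to here, an unknown physical parameter can simply be made trainable alongside the network weights; a hypothetical continuation of the earlier Burgers sketch (variable names are illustrative) is:

import torch

log_nu = torch.nn.Parameter(torch.tensor(0.0))  # learn nu > 0 through its logarithm
# Inside the training loop, reuse the physics-informed loss from above:
#   loss = total_loss(model, x_obs, t_obs, u_obs, x_col, t_col, nu=log_nu.exp())
#   loss.backward(); optimizer.step()  # updates both the network and log_nu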

However, we observe that the Bayesian methods tend to display a higher degree of certainty in their predictions than the DE approach.

This elevated certainty suggests that the Bayesian techniques may underestimate the true underlying uncertainty, and hence appear overconfident relative to the DE approach.

Comment: 8 pages, 4 figures, 1 table, AI for Science Workshop Attention Track, NeurIPS 2023

Dongre, Vardhan; Hora, Gurpreet Singh, 2023, Evaluating Uncertainty Quantification approaches for Neural PDEs in scientific applications
