BrunoScience / BrunoScience.github.io

The best that most of us can hope to achieve in science is simply to misunderstand at a deeper level.
https://BrunoScience.github.io

Modulus -- Physics-Driven AND Data-Driven #12

Open MarkBruns opened 1 year ago

MarkBruns commented 1 year ago

Physics-Informed Neural Networks by NVIDIA ... more generally, Physics-Informed Deep Learning on arXiv

NVIDIA Modulus (previously referred to as SimNet) is a physics-informed framework for developing machine learning models. It blends the power of physics, in the form of governing partial differential equations (PDEs), with data to build high-fidelity, parameterized surrogate models with near-real-time latency. Modulus provides a framework for modeling PDEs along with their boundary conditions, and it covers the end-to-end pipeline, from setting up the input tensor from geometry to training at scale. It thus provides a way to explicitly specify parameters over a range of values, so the surrogate model learns the design space and can infer multiple scenarios simultaneously.

The Modulus Extension integrates Modulus with Omniverse, which allows you to explore Modulus-based simulations interactively through various pre-configured examples. There are also curated neural network architectures that are effective for physics-informed machine learning, such as Fourier feature networks, sinusoidal representation networks, Fourier neural operators, and adaptive Fourier neural operators.
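
To make the physics-informed idea concrete, here is a minimal sketch in plain PyTorch (not the Modulus API): a small network is trained so that its output satisfies a 1D Poisson equation at random interior points and the boundary conditions at the domain ends. The equation, network sizes, and training loop are illustrative assumptions, not anything taken from Modulus itself.

```python
# Minimal, generic PINN sketch in plain PyTorch (NOT the Modulus API):
# fit u(x) to the 1D Poisson problem u''(x) = -pi^2 sin(pi x), u(0) = u(1) = 0,
# by penalizing the PDE residual at interior collocation points plus the BCs.
import torch

torch.manual_seed(0)

net = torch.nn.Sequential(
    torch.nn.Linear(1, 64), torch.nn.Tanh(),
    torch.nn.Linear(64, 64), torch.nn.Tanh(),
    torch.nn.Linear(64, 1),
)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

def pde_residual(x):
    """PDE residual r(x) = u''(x) + pi^2 sin(pi x), computed via autograd."""
    x = x.requires_grad_(True)
    u = net(x)
    du = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
    d2u = torch.autograd.grad(du, x, torch.ones_like(du), create_graph=True)[0]
    return d2u + (torch.pi ** 2) * torch.sin(torch.pi * x)

x_bc = torch.tensor([[0.0], [1.0]])       # boundary points, u = 0 there
for step in range(5000):
    x_int = torch.rand(128, 1)            # random interior collocation points
    loss = pde_residual(x_int).pow(2).mean() + net(x_bc).pow(2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
```

The same pattern generalizes: Modulus-style training replaces the hand-written residual above with symbolic PDE definitions, geometry-aware sampling, and distributed training, but the loss is still "data misfit plus physics residual."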

Operator learning aims to learn operators, or parameterized operators, between two function spaces. Modulus currently provides two network architectures that can handle this problem: DeepONet and the Fourier Neural Operator (FNO). Both support data-informed and physics-informed modeling. ... This tutorial on data-informed and physics-informed deep operator networks (DeepONet) in Modulus illustrates how to learn abstract operators. The Fourier Neural Operator (FNO), a novel deep-learning method, has shown promising results for predicting complex systems, such as spatio-temporal chaos, turbulence, and weather phenomena. The power of deep learning is best realized at scale, when models are trained on very large datasets (hundreds of terabytes or more) using thousands of GPUs with data and model parallelism.
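
For intuition on what "learning an operator" means, here is a minimal DeepONet-style sketch in plain PyTorch (again not the Modulus API): a branch net encodes an input function sampled at fixed sensor points, a trunk net encodes a query coordinate, and their dot product approximates the operator output. The toy antiderivative operator, sensor grid, and network sizes are illustrative assumptions.

```python
# Minimal DeepONet-style sketch in plain PyTorch (NOT the Modulus API).
# Branch net: encodes the input function u sampled at m fixed sensor points.
# Trunk net:  encodes a query location y.
# Their dot product approximates the operator output G(u)(y).
import torch

torch.manual_seed(0)
m, p = 100, 64                                # sensor points, latent width

branch = torch.nn.Sequential(                 # encodes the sampled input function
    torch.nn.Linear(m, 128), torch.nn.Tanh(), torch.nn.Linear(128, p))
trunk = torch.nn.Sequential(                  # encodes the query coordinate y
    torch.nn.Linear(1, 128), torch.nn.Tanh(), torch.nn.Linear(128, p))

def deeponet(u_samples, y):
    """G(u)(y) ~= <branch(u), trunk(y)> for each (function, query) pair."""
    return (branch(u_samples) * trunk(y)).sum(dim=-1, keepdim=True)

# Toy operator to learn: the antiderivative G(u)(y) = integral of u from 0 to y,
# with u drawn as random sine waves sampled on a fixed grid (assumed data).
x = torch.linspace(0, 1, m)
opt = torch.optim.Adam(list(branch.parameters()) + list(trunk.parameters()), lr=1e-3)
for step in range(3000):
    freq = torch.rand(256, 1) * 5 + 1
    u = torch.sin(freq * torch.pi * x)        # (256, m) input functions at sensors
    y = torch.rand(256, 1)                    # query points
    target = (1 - torch.cos(freq * torch.pi * y)) / (freq * torch.pi)  # exact integral
    loss = (deeponet(u, y) - target).pow(2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
```

This version is purely data-informed; a physics-informed variant would add a PDE-residual term to the loss, exactly as in the PINN sketch above.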

NVIDIA Applied Research Accelerator Program

farrate commented 1 year ago

The examples at:

https://gitlab.com/nvidia/modulus/examples

are unavailable. Any idea why?