NVIDIA Modulus (previously known as SimNet) is a framework for developing physics-informed machine learning models. It blends the power of physics, in the form of governing partial differential equations (PDEs), with data to build high-fidelity, parameterized surrogate models with near-real-time latency. Modulus lets you model PDEs together with their boundary conditions and provides an end-to-end pipeline, from constructing input tensors from geometry to training at scale. It also lets you explicitly specify training parameters over a range of values, so the surrogate model learns the design space and can infer multiple scenarios simultaneously. The Modulus Extension integrates Modulus with Omniverse, allowing you to explore Modulus-based simulations interactively through various pre-configured examples. Modulus also includes curated neural network architectures that are effective for physics-informed machine learning, such as Fourier feature networks, sinusoidal representation networks, Fourier neural operators, and adaptive Fourier neural operators.
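To make the physics-informed idea concrete, below is a minimal, framework-agnostic sketch in plain PyTorch (not the Modulus API): a small network trained so that its outputs satisfy a governing PDE at interior collocation points and the boundary conditions at the domain boundary. The specific equation (a 1D Poisson problem), source term, and network sizes are assumptions chosen purely for illustration.

```python
import torch
import torch.nn as nn

# Illustrative PINN sketch (not Modulus code): solve u''(x) = f(x) on [0, 1]
# with u(0) = u(1) = 0, where f(x) = -pi^2 * sin(pi x) (assumed for the demo),
# so the exact solution is u(x) = sin(pi x).

class MLP(nn.Module):
    def __init__(self, hidden=64, layers=4):
        super().__init__()
        mods, in_dim = [], 1
        for _ in range(layers):
            mods += [nn.Linear(in_dim, hidden), nn.Tanh()]
            in_dim = hidden
        mods.append(nn.Linear(in_dim, 1))
        self.net = nn.Sequential(*mods)

    def forward(self, x):
        return self.net(x)

def pde_residual(model, x):
    # Residual u''(x) - f(x), computed with automatic differentiation.
    x = x.requires_grad_(True)
    u = model(x)
    du = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
    d2u = torch.autograd.grad(du, x, torch.ones_like(du), create_graph=True)[0]
    f = -torch.pi ** 2 * torch.sin(torch.pi * x)  # assumed source term
    return d2u - f

model = MLP()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

for step in range(2000):
    x_int = torch.rand(256, 1)            # interior collocation points
    x_bnd = torch.tensor([[0.0], [1.0]])  # boundary points
    # Physics-informed loss: PDE residual + boundary-condition mismatch.
    loss = (pde_residual(model, x_int) ** 2).mean() + (model(x_bnd) ** 2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Modulus wraps this same pattern (geometry sampling, PDE and boundary constraints, loss aggregation) in higher-level abstractions and adds parameterization over design variables; the sketch only shows the underlying training loop.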
Operator learning networks aim to learn operators, or parameterized operators, that map between two function spaces. Modulus currently provides two network architectures for this task: the Deep Operator Network (DeepONet) and the Fourier Neural Operator (FNO), and both support data-informed and physics-informed modeling. The Modulus DeepONet tutorial illustrates how to learn abstract operators in both the data-informed and physics-informed settings. The FNO, a more recent deep-learning method, has shown promising results for predicting complex systems such as spatio-temporal chaos, turbulence, and weather phenomena. The power of deep learning is best realized at scale, when models are trained on very large datasets (hundreds of terabytes or more) using thousands of GPUs with data and model parallelism.
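For intuition, here is a minimal sketch of the DeepONet branch/trunk structure in plain PyTorch; it illustrates the architecture described above, not the Modulus implementation. All layer sizes, sensor counts, and the placeholder training data are assumptions for the example.

```python
import torch
import torch.nn as nn

# Hypothetical DeepONet sketch: the branch net encodes the input function u
# sampled at m fixed sensor points, the trunk net encodes a query coordinate y,
# and the operator output G(u)(y) is the dot product of the two embeddings.

class DeepONet(nn.Module):
    def __init__(self, m_sensors=100, p=64):
        super().__init__()
        self.branch = nn.Sequential(
            nn.Linear(m_sensors, 128), nn.Tanh(), nn.Linear(128, p)
        )
        self.trunk = nn.Sequential(
            nn.Linear(1, 128), nn.Tanh(), nn.Linear(128, p)
        )
        self.bias = nn.Parameter(torch.zeros(1))

    def forward(self, u_sensors, y):
        # u_sensors: (batch, m_sensors) samples of the input function
        # y:         (batch, 1) query locations
        b = self.branch(u_sensors)  # (batch, p)
        t = self.trunk(y)           # (batch, p)
        return (b * t).sum(dim=-1, keepdim=True) + self.bias  # G(u)(y)

# Data-informed usage: fit predicted G(u)(y) to target operator values.
model = DeepONet()
u = torch.randn(32, 100)     # placeholder sensor readings of input functions
y = torch.rand(32, 1)        # placeholder query points
target = torch.randn(32, 1)  # placeholder operator outputs from a dataset
loss = nn.functional.mse_loss(model(u, y), target)
```

A physics-informed variant replaces (or augments) the data term with a PDE residual evaluated at the query points, as in the earlier PINN sketch.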
See also: Physics-Informed Neural Networks by NVIDIA and, more generally, Physics-Informed Deep Learning on arXiv.