PPNN

PDE Preserved Neural Network

Published in Communications Physics: Multi-resolution partial differential equations preserved learning framework for spatiotemporal dynamics | arXiv version

Abstract

Traditional data-driven deep learning models often struggle with high training costs, error accumulation, and poor generalizability in complex physical processes. Physics-informed deep learning (PiDL) addresses these challenges by incorporating physical principles into the model. Most PiDL approaches regularize training by embedding governing equations into the loss function, yet this depends heavily on extensive hyperparameter tuning to weigh each loss term. To this end, we propose to leverage physics prior knowledge by “baking” the discretized governing equations into the neural network architecture via the connection between partial differential equation (PDE) operators and network structures, resulting in a PDE-preserved neural network (PPNN). This method, which embeds discretized PDEs through convolutional residual networks in a multi-resolution setting, largely improves generalizability and long-term prediction accuracy, outperforming conventional black-box models. The effectiveness and merit of the proposed method have been demonstrated across various spatiotemporal dynamical systems governed by PDEs, including the reaction-diffusion, Burgers’, and Navier-Stokes equations.
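To give a rough sense of the idea described above (this is an illustrative sketch, not the repository's actual implementation), the snippet below shows a minimal "PDE-preserved" residual block in PyTorch: a fixed finite-difference stencil (here a 2D Laplacian for a diffusion term) is hard-coded as a non-trainable convolution and added, together with a small trainable convolutional correction, to the current state via a residual connection. The class and parameter names (`PDEPreservedBlock`, `nu`, `dt`, `dx`) are assumptions made for this example only.

```python
# Illustrative sketch of a PDE-preserved residual block (not the repository's code).
# A fixed finite-difference Laplacian supplies the discretized-PDE part of the update;
# a small trainable CNN learns a correction on top of it.
import torch
import torch.nn as nn
import torch.nn.functional as F


class PDEPreservedBlock(nn.Module):
    def __init__(self, channels: int, nu: float = 0.01, dt: float = 0.01, dx: float = 1.0):
        super().__init__()
        self.nu, self.dt = nu, dt
        self.channels = channels
        # Non-trainable 5-point Laplacian stencil, applied depthwise (one per channel).
        lap = torch.tensor([[0., 1., 0.],
                            [1., -4., 1.],
                            [0., 1., 0.]]) / dx ** 2
        self.register_buffer("lap_kernel", lap.repeat(channels, 1, 1, 1))
        # Small trainable convolutional correction network.
        self.correction = nn.Sequential(
            nn.Conv2d(channels, 32, kernel_size=3, padding=1),
            nn.GELU(),
            nn.Conv2d(32, channels, kernel_size=3, padding=1),
        )

    def forward(self, u: torch.Tensor) -> torch.Tensor:
        # Discretized PDE term: explicit Euler step of a diffusion operator
        # with periodic (circular) boundary conditions.
        lap_u = F.conv2d(F.pad(u, (1, 1, 1, 1), mode="circular"),
                         self.lap_kernel, groups=self.channels)
        pde_update = self.nu * self.dt * lap_u
        # Residual connection: next state = current state + PDE term + learned correction.
        return u + pde_update + self.correction(u)


if __name__ == "__main__":
    u0 = torch.randn(4, 2, 64, 64)   # batch of 2-channel 64x64 fields
    block = PDEPreservedBlock(channels=2)
    u1 = block(u0)                    # one predicted time step
    print(u1.shape)                   # torch.Size([4, 2, 64, 64])
```

Autoregressively applying such a block rolls the state forward in time; because the discretized-PDE term is preserved in the architecture rather than penalized in the loss, the learned part only needs to model the residual dynamics.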

Figure: PPNN model structure

Code

Citation

If you find our work relevant to your research, please cite:

@article{liu2024multi,
  title={Multi-resolution partial differential equations preserved learning framework for spatiotemporal dynamics},
  author={Liu, Xin-Yang and Zhu, Min and Lu, Lu and Sun, Hao and Wang, Jian-Xun},
  journal={Communications Physics},
  volume={7},
  number={1},
  pages={31},
  year={2024},
  publisher={Nature Publishing Group UK London}
}