lululxvi / deepxde

A library for scientific machine learning and physics-informed learning
https://deepxde.readthedocs.io
GNU Lesser General Public License v2.1

curse of dimensionality and PINNs with DeepXDE #425

Closed fperiago closed 2 years ago

fperiago commented 2 years ago

Dear Lu, dear users, I wonder about the performance of the PINN algorithm when solving high-dimensional PDEs. Does it break the curse of dimensionality? Or are PINNs and DeepXDE more suitable for PDEs in spatial dimensions 1, 2, 3, even complicated ones? Can anyone share a DeepXDE-based code (e.g., for the Poisson or heat equation) in a dimension higher than 3? Many thanks in advance.

nih23 commented 2 years ago

I'm not experienced with DeepXDE, but in general the performance of PINNs in terms of GPU utilisation should be excellent for arbitrarily large problems, since these networks are typically CPU/GPU bound rather than I/O bound. However, the number of collocation points scales with the dimension (and stiffness) of your problem, meaning that you have to balance computational resources against training time. You can leverage interpolation in solution space and/or transfer learning, which to some degree reduce the implications of the curse of dimensionality.

My observation after experimenting with 3D problems (wave equation) is that significant improvements in the learning theory of PINNs are required to outperform state-of-the-art solvers.

fperiago commented 2 years ago

Thanks for your comments, which I appreciate very much. I am not trying to compare PINNs with state-of-the-art solvers in low dimensions. Rather, I am trying to see the performance of PINNs in very high dimensions, where classical numerical schemes (finite differences, finite elements, finite volumes) do not apply. It would be nice if someone could share a DeepXDE script in a dimension higher than 3. Thanks all.

lululxvi commented 2 years ago

I don't have a code for high dimensions, but it is straightforward to implement if you know how to solve the low-dimensional case. As @nih23 pointed out, the number of collocation points scales with the dimension, but how does it scale: polynomially or exponentially? This is still an open research problem.

fperiago commented 2 years ago

Thanks Lu for your comments. Of course, it also depends on the type of collocation points chosen. If random points are used for training, then, having in mind the formulation of PINNs, my intuition is that the number of points will scale as in Monte Carlo integration, and so PINNs would overcome the curse of dimensionality. This would be very good news for PINNs. I think it would be interesting to check this statement on the following example: the linear heat equation u_t = \Delta u on [0,1] \times [0,1]^d, which has the explicit solution u(t,x) = 2t + \frac{\Vert x \Vert^2}{d}, with the initial condition u(0,x) = \frac{\Vert x \Vert^2}{d} (indeed, u_t = 2 and \Delta u = \sum_{k=1}^{d} 2/d = 2, so the equation holds exactly for every d). One could solve the problem for d = 1, 10, 50, 100. I expect the error to increase linearly with the dimension.

Unfortunately, for beginners in DeepXDE like me, implementing the code for this problem is not easy, as I don't know the syntax for writing a Laplacian in dimension 100 or how to deal with the different boundaries. Since the main promise of PINNs for PDEs lies, in my modest opinion, in high-dimensional PDEs, such a script would be really helpful and would increase interest in DeepXDE.

Thanks a lot in advance
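
(A minimal, untested sketch of the syntax asked about here, assuming DeepXDE's usual TimePDE convention that the network input carries the d spatial coordinates first and time as the last column: the Laplacian in any dimension is then just a Python sum over the diagonal Hessian entries from `dde.grad.hessian`.)

```python
import deepxde as dde

d = 100  # spatial dimension

def pde(x, y):
    # Residual of u_t - Laplacian(u) = 0.
    # Time derivative: derivative of output 0 w.r.t. input column d (time).
    du_t = dde.grad.jacobian(y, x, i=0, j=d)
    # Laplacian: sum of the d pure second derivatives d^2 u / dx_k^2.
    laplacian = sum(dde.grad.hessian(y, x, i=k, j=k) for k in range(d))
    return du_t - laplacian
```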

lululxvi commented 2 years ago

If you know how to solve it in 1D/2D/3D, then you should be able to solve it in any dimension. Basically, only a few lines of code change.
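
For the heat-equation test proposed above, a rough end-to-end sketch (untested; the network size, point counts, and training settings are illustrative guesses, not tuned values) could look like:

```python
import numpy as np
import deepxde as dde

d = 10  # spatial dimension; try 1, 10, 50, 100

def exact(x):
    # Input columns: x_1, ..., x_d, t; exact solution u(t, x) = 2t + ||x||^2 / d.
    return 2 * x[:, d:d + 1] + np.sum(x[:, :d] ** 2, axis=1, keepdims=True) / d

def pde(x, y):
    # Same residual as sketched earlier: u_t - Laplacian(u).
    du_t = dde.grad.jacobian(y, x, i=0, j=d)
    laplacian = sum(dde.grad.hessian(y, x, i=k, j=k) for k in range(d))
    return du_t - laplacian

geom = dde.geometry.Hypercube([0] * d, [1] * d)
timedomain = dde.geometry.TimeDomain(0, 1)
geomtime = dde.geometry.GeometryXTime(geom, timedomain)

# Use the known exact solution for the Dirichlet boundary and initial conditions.
bc = dde.icbc.DirichletBC(geomtime, exact, lambda x, on_boundary: on_boundary)
ic = dde.icbc.IC(geomtime, exact, lambda x, on_initial: on_initial)

data = dde.data.TimePDE(
    geomtime, pde, [bc, ic],
    num_domain=4000, num_boundary=400, num_initial=400,
    solution=exact, num_test=10000,
)

net = dde.nn.FNN([d + 1] + [64] * 3 + [1], "tanh", "Glorot normal")
model = dde.Model(data, net)
model.compile("adam", lr=1e-3, metrics=["l2 relative error"])
model.train(iterations=20000)  # older DeepXDE releases use epochs= instead
```

The dimension only enters through the loop over Hessian entries, the Hypercube bounds, and the first layer of the network, which is the sense in which only a few lines change with d.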