maziarraissi / PINNs

Physics Informed Deep Learning: Data-driven Solutions and Discovery of Nonlinear Partial Differential Equations
https://maziarraissi.github.io/PINNs

Optimizer #24

Open pacorikos opened 3 years ago

pacorikos commented 3 years ago

This is a two-fold question. First, why are two optimizers used? One is applied at every iteration, while the other is only invoked after the 5,000 iterations. Second, does the AdamOptimizer need to be defined in the `__init__` method? I am interested in implementing a learning-rate schedule by initializing a new RMSProp optimizer at every iteration, but I am not sure how to do that with this setup.

nish-ant commented 3 years ago

For your first question, you can refer to my response to another similar query.
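
For background, the training here combines a first-order optimizer with a second-order one: Adam is stepped once per iteration to move the weights into a good region, and L-BFGS-B (through `tf.contrib.opt.ScipyOptimizerInterface`) is then called a single time to refine the solution to high precision. Below is a minimal sketch of that two-stage pattern, assuming TensorFlow 1.x and using a toy quadratic loss in place of the actual PINN loss:

```python
import numpy as np
import tensorflow as tf  # assumes TensorFlow 1.x (tf.contrib is removed in 2.x)

# Toy variable and quadratic loss as stand-ins for the PINN weights and residual loss.
w = tf.Variable([1.0, -2.0], dtype=tf.float32)
loss = tf.reduce_sum(tf.square(w - tf.constant([3.0, 0.5])))

# Stage 1: Adam, run once per iteration inside the Python training loop.
optimizer_Adam = tf.train.AdamOptimizer(learning_rate=1e-3)
train_op_Adam = optimizer_Adam.minimize(loss)

# Stage 2: L-BFGS-B via the SciPy interface, invoked a single time after the Adam loop.
optimizer_lbfgs = tf.contrib.opt.ScipyOptimizerInterface(
    loss,
    method='L-BFGS-B',
    options={'maxiter': 50000, 'ftol': 1.0 * np.finfo(float).eps})

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for it in range(5000):            # Adam phase: cheap first-order steps
        sess.run(train_op_Adam)
    optimizer_lbfgs.minimize(sess)    # L-BFGS-B phase: second-order refinement to convergence
    print('final loss:', sess.run(loss))
```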

You can try to implement a learning-rate schedule with the optimizer following the example here. I don't think you need to re-initialize the optimizer at every iteration if you use the scheduling utilities that TensorFlow already provides.
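
For illustration, here is a minimal sketch of that idea using TensorFlow 1.x's built-in `tf.train.exponential_decay` (the decay values and the toy loss are placeholders, not the repository's actual settings). Because the schedule is a tensor driven by `global_step`, a single optimizer instance created in `__init__` is enough; swapping `AdamOptimizer` for `tf.train.RMSPropOptimizer` does not change the pattern.

```python
import tensorflow as tf  # assumes TensorFlow 1.x

# Toy loss standing in for the PINN loss.
w = tf.Variable(5.0)
loss = tf.square(w)

# global_step is advanced by minimize(); the schedule reads it on every run,
# so one optimizer instance suffices -- no per-iteration re-initialization.
global_step = tf.Variable(0, trainable=False)
learning_rate = tf.train.exponential_decay(
    learning_rate=1e-3,   # initial rate (illustrative)
    global_step=global_step,
    decay_steps=1000,     # decay every 1000 iterations (illustrative)
    decay_rate=0.9,
    staircase=True)

# tf.train.RMSPropOptimizer(learning_rate) would slot in the same way here.
optimizer = tf.train.AdamOptimizer(learning_rate)
train_op = optimizer.minimize(loss, global_step=global_step)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for it in range(5000):
        sess.run(train_op)
        if it % 1000 == 0:
            print(it, sess.run(learning_rate), sess.run(loss))
```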