Open jouvetg opened 1 year ago
Thanks for the feature request. We will look into this. The literature you shared above has no citations so far: https://scholar.google.com/scholar?cluster=7468386216636877262&hl=en&as_sdt=0,5
@fchollet @chenmoneygithub Please take a look! We're unsure whether L-BFGS-B is an optimizer we should add to Keras at this time.
@jouvetg Thanks for reporting the issue! We will add this optimizer to our monitoring list; once it gains a decent amount of interest and usage, we will offer it in Keras.
A bit of context about how we add new optimizers: we "lag" a bit behind the literature, as we need a clear signal that an optimizer works and has benefits before bringing it into a release. For example, Adafactor has shown its success in LLM pretraining.
@chenmoneygithub I think @jouvetg may have referred to a specific paper that doesn't have enough interest. If you look at https://link.springer.com/article/10.1007/BF01589116 and the use of this optimizer in https://www.sciencedirect.com/science/article/pii/S0021999118307125 , you will see that it is quite well known within the Sci-ML community. Many people prefer to write PINNs in PyTorch precisely because L-BFGS can be used there like any other optimizer.
It would be great to have L-BFGS-B available in Keras. It is now ubiquitously used in Sci-ML, and it is frustrating to rely on the fiddly workarounds mentioned above. In particular, for scientific applications it is often essential that the parameters stay within certain bounds.
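To make the bounds point concrete: this is not something Adam or other Keras optimizers support, but SciPy's L-BFGS-B does it natively via box constraints. A minimal sketch (toy quadratic, not a Keras model; the objective and bounds are made up for illustration):

```python
import numpy as np
from scipy.optimize import minimize

# Toy objective whose unconstrained minimum, x = [2, -3], lies outside the box.
def objective(x):
    return (x[0] - 2.0) ** 2 + (x[1] + 3.0) ** 2

def gradient(x):
    return np.array([2.0 * (x[0] - 2.0), 2.0 * (x[1] + 3.0)])

# Constrain both parameters to [0, 1]; L-BFGS-B keeps iterates inside the box.
bounds = [(0.0, 1.0), (0.0, 1.0)]
result = minimize(objective, x0=np.zeros(2), jac=gradient,
                  method="L-BFGS-B", bounds=bounds)
print(result.x)  # approximately [1.0, 0.0]: the box corner closest to [2, -3]
```

A Keras L-BFGS-B optimizer would presumably expose the same kind of per-parameter bound specification.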
According to recent literature, the L-BFGS-B optimizer appears to be very powerful for training PINNs, outperforming stochastic optimizers (Adam, etc.) -- L-BFGS-B can be used for fine optimization in a second stage, while Adam can be used for rough optimization in a first stage to avoid falling into a local minimum. Having L-BFGS-B available as a Keras optimizer, like Adam, would therefore be very valuable. Currently there are impractical workarounds (e.g. this one) based on the TensorFlow Probability BFGS minimizer; I tried it myself, but I found it rather inflexible. So, is there any plan to make the L-BFGS-B optimizer available? This request has already been raised in several TensorFlow issues, this one or this one. Given the growing interest in PINNs, it would be great to have this feature.
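The two-stage scheme described above can be sketched on a toy least-squares problem. This is only an illustration of the idea, not the TFP-based workaround linked above: plain gradient descent stands in for Adam in stage one, and SciPy's L-BFGS-B plays the role of the requested Keras optimizer in stage two (the data, bounds, and step size are all made up):

```python
import numpy as np
from scipy.optimize import minimize

# Synthetic noiseless linear regression problem.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
true_w = np.array([1.5, -0.5])
y = X @ true_w

def loss(w):
    residual = X @ w - y
    return 0.5 * np.mean(residual ** 2)

def grad(w):
    return X.T @ (X @ w - y) / len(y)

# Stage 1: coarse first-order optimization (stands in for Adam).
w = np.zeros(2)
for _ in range(50):
    w -= 0.1 * grad(w)

# Stage 2: fine second-order refinement with bounded L-BFGS-B,
# warm-started from the stage-1 iterate.
result = minimize(loss, w, jac=grad, method="L-BFGS-B",
                  bounds=[(-2.0, 2.0), (-2.0, 2.0)])
print(result.x)  # close to true_w = [1.5, -0.5]
```

With a first-class Keras L-BFGS-B optimizer, the second stage could reuse the same `model.fit` workflow as the first, instead of flattening weights into a vector for an external minimizer.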