lululxvi / deepxde

A library for scientific machine learning and physics-informed learning
https://deepxde.readthedocs.io
GNU Lesser General Public License v2.1
2.74k stars 756 forks

Failure to update the inferred unknown PDE parameters when using L-BFGS? #574

Open ZPLai opened 2 years ago

ZPLai commented 2 years ago

Dear @lululxvi and community, I'm using DeepXDE to infer several unknown parameters in a PDE/ODE system. I use a callback to monitor how these inferred parameters change during training. With the "adam" optimizer, the parameters are updated as training proceeds; however, when I switch to "L-BFGS", the reported parameters are not updated. Can anyone help me? The related code is below:

early_stopping = dde.callbacks.EarlyStopping(min_delta=1e-4, patience=5000)
fnamevar = "variables3.txt"
variable = dde.callbacks.VariableValue([theta_a, theta_b, theta_c], period=1000, filename=fnamevar)

model.compile("adam", lr=1e-3, external_trainable_variables=[theta_a, theta_b, theta_c])
losshistory, train_state = model.train(epochs=5000, callbacks=[variable, early_stopping])
dde.saveplot(losshistory, train_state, issave=True, isplot=True)

model.compile("L-BFGS", external_trainable_variables=[theta_a, theta_b, theta_c])
losshistory, train_state = model.train(callbacks=[variable])
dde.saveplot(losshistory, train_state, issave=True, isplot=True)
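As an aside, the values logged by the VariableValue callback can be read back from the output file afterwards. The helper below is a hypothetical sketch (not part of DeepXDE); it assumes each line has the form "step [v1, v2, v3]", which is what the callback writes in recent versions — check your own variables3.txt before relying on this.

```python
import re

def read_variable_log(lines):
    """Parse (step, [values]) pairs from VariableValue output lines."""
    history = []
    for line in lines:
        # Each line: "<step> [v1, v2, ...]" (assumed format)
        step, rest = line.split(" ", 1)
        values = [float(v) for v in re.findall(r"[-+0-9.eE]+", rest)]
        history.append((int(step), values))
    return history

# Example input in the assumed format:
sample = [
    "0 [1.00e+00, 2.00e+00, 3.00e+00]",
    "1000 [9.87e-01, 2.05e+00, 2.96e+00]",
]
history = read_variable_log(sample)
```

With `open(fnamevar)` in place of `sample`, this gives the full trajectory of the inferred parameters for plotting.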

AHDMarwan commented 2 years ago

You can compile with L-BFGS first and then Adam. I had a similar problem, and this ordering worked for me:

model.compile("L-BFGS-B", loss_weights=[0.01, 0.01, 0.001, 0.001, 100, 100, 100, 100])
model.train()

learning_rate = 0.0001
model.compile("adam", lr=learning_rate, loss_weights=[0.01, 0.01, 0.001, 0.001, 100, 100, 100, 100], external_trainable_variables=[omegaI, omegaR])
model.train()

lululxvi commented 2 years ago

@AHDMarwan Good trick.

For L-BFGS in TensorFlow 1, due to the technical implementation of L-BFGS, we cannot show the intermediate values of the variables. Switching to Adam resolves the issue.
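As a plain-Python analogy of why this happens (hypothetical code, not DeepXDE's actual internals): Adam advances one step at a time, so a period-based callback fires repeatedly during training, whereas an external L-BFGS optimizer performs all of its iterations inside one opaque call, which the surrounding training loop sees as a single step.

```python
class VariableLogger:
    """Records a watched value every `period` training steps."""
    def __init__(self, get_value, period):
        self.get_value, self.period = get_value, period
        self.step, self.log = 0, []

    def on_step_end(self):
        self.step += 1
        if self.step % self.period == 0:
            self.log.append(self.get_value())

theta = [0.0]  # stand-in for a trainable PDE parameter

def adam_like_training(n_steps, logger):
    # Adam-style loop: the callback hook runs after every single step.
    for _ in range(n_steps):
        theta[0] += 0.1          # stand-in for one gradient update
        logger.on_step_end()

def external_lbfgs(logger):
    # External-optimizer style: all iterations happen inside one call,
    # invisible to the callback; the loop sees it as ONE step.
    for _ in range(1000):
        theta[0] += 0.1
    logger.on_step_end()

log = VariableLogger(lambda: theta[0], period=100)
adam_like_training(500, log)     # logs 5 snapshots
external_lbfgs(log)              # theta moves a lot, but nothing new is logged
```

This is why the variable history file shows updates during the Adam phase but appears frozen during the L-BFGS phase, even though the variables are in fact being optimized.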

haison19952013 commented 2 years ago

@AHDMarwan For example, if I want to do two-step training (Adam, then L-BFGS), should I do something like the following?

variable = ....
model.compile("adam", lr=1e-3, external_trainable_variables=[...])
losshistory, train_state = model.train(epochs=5000, callbacks=[variable])

model.compile("L-BFGS", external_trainable_variables=[theta_a, theta_b, theta_c])
losshistory, train_state = model.train(callbacks=[variable])

model.compile("adam", lr=1e-3, external_trainable_variables=[...])
losshistory, train_state = model.train(epochs=1, callbacks=[variable])

lululxvi commented 2 years ago

@haison19952013 Also see FAQ "Q: L-BFGS."