Closed — richieakka closed this issue 2 years ago
Your code doesn't have any BC.
Yes LuLu, this problem does not have any BC; there are only the ICs we mentioned. We get the correct exact solution using the `solve_ivp` function, but the predicted solution becomes flat and does not match the exact solution. Kindly help!
Hi LuLu,
Please help in what could be the issue with this problem.
Try a smaller time domain like [0, 1] first.
Dear LuLu,
The problem works on the domain [0, 20] but not beyond that. We need results on the domain [0, 300]. Please guide us on how to approach this.
Check the FAQ for input/output scaling.
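The scaling advice amounts to nondimensionalizing time: substitute t = T·τ so the network input τ lives on [0, 1] while the physical horizon T (here 300) multiplies the right-hand side. A minimal sketch of that change of variables, assuming the system posted below (dr/dt = r − r²/26 − 0.1211·r·p, dp/dt = 0.278·p·(1 − 0.0862·p) − 0.0105·r·p); the helper names `rhs` and `rhs_scaled` are illustrative, not part of DeepXDE:

```python
T = 300.0  # physical time horizon; the network sees tau = t / T in [0, 1]


def rhs(r, p):
    """Right-hand side of the system in physical time t."""
    dr = r - r * r / 26 - 0.1211 * r * p
    dp = 0.278 * p * (1 - 0.0862 * p) - 0.0105 * r * p
    return dr, dp


def rhs_scaled(r, p, T=T):
    """Right-hand side in rescaled time tau = t / T.

    Chain rule: dr/dtau = (dr/dt) * (dt/dtau) = T * dr/dt, same for p.
    """
    dr, dp = rhs(r, p)
    return T * dr, T * dp
```

In the DeepXDE setup this means using `TimeDomain(0, 1)` and multiplying the non-derivative part of each residual by T. Scaling the outputs may also help, since p grows to roughly 1/0.0862 ≈ 11.6 while r starts at 1.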
Dear LuLu,
We are trying to solve the following set of equations:
```python
def de_system(x, y):
    r = y[:, 0:1]
    p = y[:, 1:2]
    d_r_over_d_t = tf.gradients(r, x)[0]
    d_p_over_d_t = tf.gradients(p, x)[0]
    return [
        d_r_over_d_t - (r - (r * r) / 26 - 0.1211 * r * p),
        d_p_over_d_t - (0.278 * p * (1 - 0.0862 * p) - 0.0105 * r * p),
    ]


def boundary(_, on_initial):
    return on_initial


geom = dde.geometry.TimeDomain(0, 200)
ic1 = dde.IC(geom, lambda X: 1, boundary, component=0)
ic2 = dde.IC(geom, lambda X: 1, boundary, component=1)

data = dde.data.PDE(geom, de_system, [ic1, ic2], 5000, 2, num_test=200)  # changed here: number of training points

layer_size = [1] + [50] * 5 + [2]  # changed here: 5 hidden layers instead of 3
activation = "tanh"
initializer = "Glorot normal"
net = dde.maps.FNN(layer_size, activation, initializer)

model = dde.Model(data, net)
model.compile("adam", lr=0.001)
losshistory, train_state = model.train(epochs=40000)
model.compile("L-BFGS-B")
losshistory, train_state = model.train()
```
The training loss we achieve is of order 10^-8. Despite this, the predicted solution deviates largely from the true solution. Also, for your information, we have normalized these DEs. Can you please let me know where I am going wrong?
Thanking you in advance,
Richa Gupta