Closed — gmouts closed this issue 2 years ago.
Is your problem well defined? You only have boundary conditions for v, but no BC for u.
It should be well defined...we don't need a BC in u. The specimen is free to move in the x direction and is only restrained in the y direction at the very bottom. If we solved this problem with FEM then we would not need any BC in u but only in v, exactly as I define it here. I am not sure if that's the case with PINNs and we need something more...?
If it is free, then it means that the stress should be zero. In FEM, you use integration by parts and the natural boundary conditions are then satisfied automatically. But in a PINN, natural BCs need to be imposed explicitly.
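Concretely, "explicit" here means the traction residual σ · n is evaluated at boundary points and added to the loss, instead of vanishing automatically as in a weak form. A scalar sketch of the two residual components on the left edge (plane-strain Hooke's law, using the Lamé constants that appear later in this thread; the helper name is illustrative, not a DeepXDE API):

```python
def traction_residual_left(du_dx, du_dy, dv_dx, dv_dy, lam=110.0, mu=80.0):
    """Components of sigma . n on the left edge x = 0, where n = (-1, 0).

    A free boundary requires both components to vanish, so a PINN adds
    their squares to the loss instead of getting them "for free" as FEM does.
    """
    sigma11 = (lam + 2 * mu) * du_dx + lam * dv_dy  # normal stress
    sigma12 = mu * (du_dy + dv_dx)                  # shear stress
    return -sigma11, -sigma12  # sigma . n with n = (-1, 0)
```

Both returned values are driven to zero by the boundary loss terms; the same expressions (up to the sign of n) are what the `OperatorBC` functions below compute with automatic differentiation.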
Hello, I major in mechanics. Your approach seems interesting, but why do you set fun(x) = 0 for the fixed boundary conditions? If so, as Dr. Lu said, I think you need to describe the free boundary conditions exactly. Maybe you can try to simplify these BCs the way it is done in plate problems. If you can answer my question, I would appreciate it. Thanks.
@lululxvi Ok, I get it now. Basically we need to explicitly define the zero-traction BCs wherever we don't have Dirichlet BCs. For example, on the left boundary we have the zero-traction BC σ · n = 0. After expanding, this becomes (λ + 2μ) ∂u/∂x + λ ∂v/∂y = 0 and μ (∂u/∂y + ∂v/∂x) = 0. In such a case I believe I need to use `OperatorBC` and have the following code.
```python
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function

import csv

import numpy as np
import tensorflow as tf
import deepxde as dde


def main():
    E = 78.2e9
    nu = 0.3
    lamda = 110.0
    mu = 80.0

    # Normal component of sigma . n on the left boundary (sigma11)
    def traction1(x, u, X):
        u_disp_x = dde.grad.jacobian(u, x, i=0, j=0)
        v_disp_y = dde.grad.jacobian(u, x, i=1, j=1)
        return (lamda + 2 * mu) * u_disp_x + lamda * v_disp_y

    # Shear component of sigma . n on the left boundary (sigma12)
    def traction2(x, u, X):
        u_disp_y = dde.grad.jacobian(u, x, i=0, j=1)
        v_disp_x = dde.grad.jacobian(u, x, i=1, j=0)
        return mu * (u_disp_y + v_disp_x)

    # Momentum balance with a body force of 200 in the y direction
    def pde(x, u):
        u_disp_x = dde.grad.jacobian(u, x, i=0, j=0)
        u_disp_y = dde.grad.jacobian(u, x, i=0, j=1)
        v_disp_x = dde.grad.jacobian(u, x, i=1, j=0)
        v_disp_y = dde.grad.jacobian(u, x, i=1, j=1)
        sigma11 = (lamda + 2 * mu) * u_disp_x + lamda * v_disp_y
        sigma22 = lamda * u_disp_x + (lamda + 2 * mu) * v_disp_y
        sigma12 = mu * (u_disp_y + v_disp_x)
        sigma11_x = tf.gradients(sigma11, x)[0][:, 0:1]
        sigma22_y = tf.gradients(sigma22, x)[0][:, 1:]
        sigma12_x = tf.gradients(sigma12, x)[0][:, 0:1]
        sigma12_y = tf.gradients(sigma12, x)[0][:, 1:]
        residual_x = sigma11_x + sigma12_y
        residual_y = sigma12_x + sigma22_y - 200.0
        return [residual_x, residual_y]

    def func_zero(x):
        return np.zeros((len(x), 1))

    spatial_domain = dde.geometry.Rectangle(xmin=[0, 0], xmax=[1.0, 1.0])

    def boundary_left(x, on_boundary):
        return on_boundary and np.isclose(x[0], 0)

    def boundary_right(x, on_boundary):
        return on_boundary and np.isclose(x[0], 1)

    def boundary_top(x, on_boundary):
        return on_boundary and np.isclose(x[1], 1)

    def boundary_bottom(x, on_boundary):
        return on_boundary and np.isclose(x[1], 0)

    bc_v_b = dde.DirichletBC(spatial_domain, func_zero, boundary_bottom, component=1)
    bc_l_tr1 = dde.OperatorBC(spatial_domain, traction1, boundary_left)
    bc_l_tr2 = dde.OperatorBC(spatial_domain, traction2, boundary_left)

    data = dde.data.PDE(
        spatial_domain,
        pde,
        [bc_v_b, bc_l_tr1, bc_l_tr2],
        num_domain=25000,
        num_boundary=25000,
        num_test=25000,
    )

    net = dde.maps.FNN([2] + 4 * [50] + [2], "tanh", "Glorot normal")
    model = dde.Model(data, net)
    model.compile("adam", lr=1e-3, loss_weights=[1, 1, 100, 100, 100])
    model.train(epochs=30000)
    model.compile("L-BFGS-B", loss_weights=[1, 1, 100, 100, 100])
    losshistory, train_state = model.train()
    dde.saveplot(losshistory, train_state, issave=True, isplot=True)

    X = spatial_domain.random_points(10000)
    output = model.predict(X)
    u_pred = output[:, 0]
    v_pred = output[:, 1]
    X1 = X[:, 0]
    X2 = X[:, 1]
    with open("u.csv", mode="w") as output_file:
        output_writer = csv.writer(output_file, delimiter=",")
        output_writer.writerow(("x", "y", "z"))
        for i in range(len(u_pred)):
            output_writer.writerow((X1[i], X2[i], u_pred[i]))
    with open("v.csv", mode="w") as output_file:
        output_writer = csv.writer(output_file, delimiter=",")
        output_writer.writerow(("x", "y", "z"))
        for i in range(len(v_pred)):
            output_writer.writerow((X1[i], X2[i], v_pred[i]))


if __name__ == "__main__":
    main()
```
Is this the right thing? (It is implied that I will do the same for the top and right boundaries; I just skipped them here to save space and communicate the idea.)
@supersuxy I think the formulation you are proposing comes from plate theory, which is similar to beam theory. The variable w in your equations refers to the deflection that is perpendicular to the plane of the plate. This is a completely different problem than what I am trying to do here, so it does not apply.
@gmouts Yes, you are right. I am sorry, I misunderstood your model.
@gmouts You are right.
@lululxvi Thanks for the help. I have a follow-up question. After taking your suggestion for enforcing the traction boundary conditions into account, I have the following code:
```python
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function

import csv

import numpy as np
import tensorflow as tf
import deepxde as dde


def main():
    E = 78.2e9
    nu = 0.3
    lamda = 110.0
    mu = 80.0

    def traction_xx(x, u, X):
        u_disp_x = dde.grad.jacobian(u, x, i=0, j=0)
        v_disp_y = dde.grad.jacobian(u, x, i=1, j=1)
        return (lamda + 2 * mu) * u_disp_x + lamda * v_disp_y

    def traction_yy(x, u, X):
        u_disp_x = dde.grad.jacobian(u, x, i=0, j=0)
        v_disp_y = dde.grad.jacobian(u, x, i=1, j=1)
        return (lamda + 2 * mu) * v_disp_y + lamda * u_disp_x

    def traction_xy(x, u, X):
        u_disp_y = dde.grad.jacobian(u, x, i=0, j=1)
        v_disp_x = dde.grad.jacobian(u, x, i=1, j=0)
        return mu * (u_disp_y + v_disp_x)

    def pde(x, u):
        u_disp_x = dde.grad.jacobian(u, x, i=0, j=0)
        u_disp_y = dde.grad.jacobian(u, x, i=0, j=1)
        v_disp_x = dde.grad.jacobian(u, x, i=1, j=0)
        v_disp_y = dde.grad.jacobian(u, x, i=1, j=1)
        sigma11 = (lamda + 2 * mu) * u_disp_x + lamda * v_disp_y
        sigma22 = lamda * u_disp_x + (lamda + 2 * mu) * v_disp_y
        sigma12 = mu * (u_disp_y + v_disp_x)
        # dde.grad.jacobian replaces the earlier raw tf.gradients calls,
        # e.g. tf.gradients(sigma11, x)[0][:, 0:1]
        sigma11_x = dde.grad.jacobian(sigma11, x, j=0)
        sigma22_y = dde.grad.jacobian(sigma22, x, j=1)
        sigma12_x = dde.grad.jacobian(sigma12, x, j=0)
        sigma12_y = dde.grad.jacobian(sigma12, x, j=1)
        residual_x = sigma11_x + sigma12_y
        residual_y = sigma12_x + sigma22_y - 200.0
        return [residual_x, residual_y]

    def func_zero(x):
        return np.zeros((len(x), 1))

    spatial_domain = dde.geometry.Rectangle(xmin=[0, 0], xmax=[1.0, 1.0])

    def boundary_left(x, on_boundary):
        return on_boundary and np.isclose(x[0], 0)

    def boundary_right(x, on_boundary):
        return on_boundary and np.isclose(x[0], 1)

    def boundary_top(x, on_boundary):
        return on_boundary and np.isclose(x[1], 1)

    def boundary_bottom(x, on_boundary):
        return on_boundary and np.isclose(x[1], 0)

    bc_v_b = dde.DirichletBC(spatial_domain, func_zero, boundary_bottom, component=1)
    bc_l_tr1 = dde.OperatorBC(spatial_domain, traction_xx, boundary_left)
    bc_l_tr2 = dde.OperatorBC(spatial_domain, traction_xy, boundary_left)
    bc_r_tr1 = dde.OperatorBC(spatial_domain, traction_xx, boundary_right)
    bc_r_tr2 = dde.OperatorBC(spatial_domain, traction_xy, boundary_right)
    bc_t_tr1 = dde.OperatorBC(spatial_domain, traction_yy, boundary_top)
    bc_t_tr2 = dde.OperatorBC(spatial_domain, traction_xy, boundary_top)

    data = dde.data.PDE(
        spatial_domain,
        pde,
        [bc_v_b, bc_l_tr1, bc_l_tr2, bc_r_tr1, bc_r_tr2, bc_t_tr1, bc_t_tr2],
        num_domain=40000,
        num_boundary=20000,
        num_test=30000,
    )

    net = dde.maps.FNN([2] + 5 * [100] + [2], "tanh", "Glorot normal")
    model = dde.Model(data, net)
    model.compile("adam", lr=1e-4, loss_weights=[1, 1, 100, 100, 100, 100, 100, 100, 100])
    model.train(epochs=50000)
    # L-BFGS-B ignores the learning rate, so none is passed here
    model.compile("L-BFGS-B", loss_weights=[1, 1, 100, 100, 100, 100, 100, 100, 100])
    losshistory, train_state = model.train()
    dde.saveplot(losshistory, train_state, issave=True, isplot=True)

    X = spatial_domain.random_points(10000)
    output = model.predict(X)
    u_pred = output[:, 0]
    v_pred = output[:, 1]
    X1 = X[:, 0]
    X2 = X[:, 1]
    with open("u.csv", mode="w") as output_file:
        output_writer = csv.writer(output_file, delimiter=",")
        output_writer.writerow(("x", "y", "z"))
        for i in range(len(u_pred)):
            output_writer.writerow((X1[i], X2[i], u_pred[i]))
    with open("v.csv", mode="w") as output_file:
        output_writer = csv.writer(output_file, delimiter=",")
        output_writer.writerow(("x", "y", "z"))
        for i in range(len(v_pred)):
            output_writer.writerow((X1[i], X2[i], v_pred[i]))


if __name__ == "__main__":
    main()
```
And here I am attaching the loss history during training.
```text
Using TensorFlow 2 backend.
Warning: 30000 points required, but 30276 points sampled. Compiling model... Building feed-forward neural network... 'build' took 0.048661 s
'compile' took 1.172279 s
Initializing variables... Training model...
Step Train loss Test loss Test metric
0 [1.45e+02, 3.45e+04, 8.46e-03, 3.48e+03, 8.17e+04, 2.20e+03, 6.32e+04, 4.05e+04, 5.06e+04] [1.42e+02, 3.43e+04, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00] []
1000 [1.16e+02, 7.83e+02, 1.00e+02, 7.17e+00, 2.96e+00, 2.38e+01, 1.37e+01, 1.84e+02, 1.57e+01] [7.88e+01, 5.48e+02, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00] []
2000 [8.26e+01, 4.05e+02, 7.67e+01, 3.55e-01, 5.99e+00, 9.52e+00, 8.31e+00, 1.21e+02, 1.22e+01] [5.59e+01, 2.85e+02, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00] []
3000 [6.48e+01, 2.14e+02, 4.58e+01, 2.66e+00, 5.91e+00, 5.69e+00, 7.39e+00, 7.88e+01, 1.15e+01] [4.37e+01, 1.51e+02, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00] []
4000 [4.96e+01, 9.66e+01, 2.23e+01, 3.54e+01, 4.00e+00, 2.01e+01, 7.21e+00, 7.76e+01, 1.01e+01] [3.33e+01, 6.56e+01, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00] []
5000 [3.80e+01, 4.25e+01, 1.08e+01, 1.35e+01, 3.27e+00, 2.49e+00, 5.72e+00, 3.26e+01, 7.15e+00] [2.53e+01, 3.06e+01, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00] []
6000 [2.90e+01, 2.98e+01, 6.33e+00, 1.38e+01, 2.04e+00, 1.62e+00, 3.93e+00, 2.24e+01, 4.63e+00] [1.90e+01, 2.26e+01, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00] []
7000 [2.28e+01, 2.67e+01, 4.76e+00, 1.23e+01, 1.21e+00, 1.10e+00, 2.50e+00, 1.63e+01, 2.93e+00] [1.47e+01, 2.07e+01, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00] []
8000 [1.83e+01, 2.49e+01, 4.31e+00, 1.07e+01, 6.93e-01, 7.73e-01, 1.56e+00, 1.22e+01, 1.86e+00] [1.17e+01, 1.93e+01, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00] []
9000 [1.50e+01, 2.34e+01, 4.27e+00, 9.24e+00, 4.02e-01, 6.59e-01, 9.99e-01, 9.53e+00, 1.22e+00] [9.50e+00, 1.79e+01, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00] []
10000 [1.23e+01, 2.26e+01, 4.30e+00, 4.65e+01, 9.38e-01, 2.57e+01, 7.04e-01, 5.39e+01, 1.17e+00] [7.71e+00, 1.57e+01, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00] []
11000 [1.06e+01, 2.09e+01, 4.43e+00, 3.02e+02, 4.07e+00, 1.80e+02, 2.60e+00, 3.71e+02, 3.68e+00] [6.66e+00, 1.46e+01, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00] []
12000 [9.09e+00, 1.90e+01, 4.60e+00, 6.32e+00, 6.42e-02, 8.28e-01, 2.96e-01, 5.07e+00, 3.54e-01] [5.63e+00, 1.39e+01, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00] []
13000 [7.81e+00, 1.77e+01, 4.72e+00, 5.67e+00, 5.67e-02, 1.06e+00, 2.63e-01, 4.48e+00, 2.76e-01] [4.81e+00, 1.28e+01, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00] []
14000 [6.95e+00, 1.65e+01, 4.85e+00, 5.22e+00, 3.21e-02, 8.24e-01, 1.70e-01, 3.62e+00, 1.70e-01] [4.28e+00, 1.18e+01, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00] []
15000 [6.44e+00, 1.55e+01, 4.92e+00, 8.19e+02, 1.03e+01, 4.97e+02, 9.06e+00, 1.10e+03, 9.34e+00] [4.14e+00, 1.14e+01, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00] []
16000 [5.56e+00, 1.41e+01, 5.03e+00, 9.21e+00, 5.59e-02, 3.92e+00, 2.18e-01, 9.55e+00, 1.82e-01] [3.41e+00, 9.85e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00] []
17000 [5.01e+00, 1.34e+01, 5.05e+00, 3.91e+00, 6.26e-02, 7.24e-01, 1.43e-01, 2.65e+00, 6.89e-02] [3.07e+00, 9.33e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00] []
18000 [4.48e+00, 1.27e+01, 5.03e+00, 3.44e+00, 1.04e-01, 8.80e-01, 2.11e-01, 2.68e+00, 6.09e-02] [2.74e+00, 8.74e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00] []
19000 [4.13e+00, 1.18e+01, 5.04e+00, 3.02e+00, 1.25e-01, 8.37e-01, 2.24e-01, 2.39e+00, 5.20e-02] [2.52e+00, 8.06e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00] []
20000 [3.81e+00, 1.09e+01, 5.06e+00, 2.94e+00, 9.90e-02, 7.96e-01, 2.00e-01, 2.32e+00, 3.77e-02] [2.33e+00, 7.47e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00] []
21000 [3.55e+00, 1.00e+01, 5.09e+00, 2.71e+00, 9.40e-02, 5.45e-01, 1.58e-01, 1.81e+00, 2.80e-02] [2.18e+00, 6.82e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00] []
22000 [3.26e+00, 9.34e+00, 5.05e+00, 2.41e+00, 9.90e-02, 5.24e-01, 1.56e-01, 1.68e+00, 2.44e-02] [2.01e+00, 6.32e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00] []
23000 [2.98e+00, 8.76e+00, 5.00e+00, 2.10e+00, 1.07e-01, 5.23e-01, 1.64e-01, 1.59e+00, 1.99e-02] [1.84e+00, 5.86e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00] []
24000 [2.67e+00, 8.23e+00, 4.92e+00, 3.02e+00, 1.32e-01, 1.19e+00, 2.55e-01, 2.96e+00, 3.40e-02] [1.66e+00, 5.21e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00] []
25000 [2.49e+00, 7.74e+00, 4.86e+00, 1.92e+00, 1.60e-01, 7.31e-01, 2.74e-01, 1.85e+00, 3.53e-02] [1.55e+00, 5.02e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00] []
26000 [2.42e+00, 6.91e+00, 4.87e+00, 1.49e+00, 1.02e-01, 3.69e-01, 1.44e-01, 1.10e+00, 2.68e-02] [1.52e+00, 4.58e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00] []
27000 [2.22e+00, 6.35e+00, 4.81e+00, 1.28e+00, 1.04e-01, 3.46e-01, 1.42e-01, 9.73e-01, 2.77e-02] [1.40e+00, 4.19e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00] []
28000 [2.02e+00, 5.97e+00, 4.72e+00, 1.05e+00, 1.11e-01, 3.35e-01, 1.32e-01, 9.19e-01, 2.57e-02] [1.28e+00, 3.91e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00] []
29000 [1.84e+00, 5.49e+00, 4.64e+00, 8.73e-01, 1.08e-01, 3.49e-01, 1.50e-01, 8.57e-01, 2.14e-02] [1.17e+00, 3.55e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00] []
30000 [1.51e+00, 6.56e+00, 4.46e+00, 1.07e+01, 2.11e-01, 6.66e+00, 4.10e-01, 1.86e+01, 2.56e-01] [9.53e-01, 3.61e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00] []
31000 [1.70e+00, 4.35e+00, 4.55e+00, 1.38e+02, 1.28e+00, 8.96e+01, 1.49e+00, 2.17e+02, 1.34e+00] [1.13e+00, 2.73e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00] []
32000 [1.49e+00, 4.01e+00, 4.44e+00, 4.76e-01, 8.43e-02, 2.69e-01, 1.08e-01, 4.28e-01, 2.93e-02] [9.67e-01, 2.61e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00] []
33000 [1.36e+00, 3.80e+00, 4.32e+00, 3.26e-01, 8.27e-02, 2.86e-01, 1.00e-01, 4.18e-01, 2.29e-02] [8.87e-01, 2.42e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00] []
34000 [1.28e+00, 3.31e+00, 4.26e+00, 2.76e-01, 6.86e-02, 2.55e-01, 9.15e-02, 2.80e-01, 2.64e-02] [8.43e-01, 2.16e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00] []
35000 [1.17e+00, 3.02e+00, 4.16e+00, 1.92e-01, 6.11e-02, 2.63e-01, 8.92e-02, 2.54e-01, 2.29e-02] [7.75e-01, 1.95e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00] []
36000 [1.13e+00, 2.89e+00, 4.05e+00, 3.46e+02, 2.63e+00, 2.28e+02, 2.98e+00, 5.55e+02, 2.59e+00] [7.56e-01, 1.64e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00] []
37000 [1.01e+00, 2.55e+00, 3.94e+00, 1.03e-01, 4.81e-02, 2.74e-01, 8.31e-02, 1.92e-01, 1.81e-02] [6.85e-01, 1.65e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00] []
38000 [9.65e-01, 2.25e+00, 3.86e+00, 8.08e-02, 3.89e-02, 2.50e-01, 7.71e-02, 1.07e-01, 2.02e-02] [6.55e-01, 1.51e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00] []
39000 [9.12e-01, 2.10e+00, 3.74e+00, 6.02e-02, 3.50e-02, 2.51e-01, 7.58e-02, 8.74e-02, 1.80e-02] [6.25e-01, 1.42e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00] []
40000 [8.65e-01, 1.97e+00, 3.62e+00, 4.94e-02, 3.17e-02, 2.52e-01, 7.64e-02, 7.33e-02, 1.67e-02] [5.98e-01, 1.33e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00] []
41000 [8.05e-01, 1.93e+00, 3.49e+00, 1.98e-01, 3.37e-02, 3.93e-01, 9.39e-02, 3.97e-01, 9.31e-03] [5.67e-01, 1.28e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00] []
42000 [7.95e-01, 1.73e+00, 3.40e+00, 4.45e-02, 2.53e-02, 2.54e-01, 7.76e-02, 5.24e-02, 1.60e-02] [5.56e-01, 1.20e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00] []
43000 [7.46e-01, 1.71e+00, 3.28e+00, 6.83e-02, 2.42e-02, 2.88e-01, 8.81e-02, 1.35e-01, 1.30e-02] [5.29e-01, 1.13e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00] []
44000 [7.40e-01, 1.54e+00, 3.20e+00, 4.96e-02, 2.07e-02, 2.56e-01, 7.70e-02, 4.09e-02, 1.69e-02] [5.23e-01, 1.09e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00] []
45000 [7.23e-01, 1.46e+00, 3.10e+00, 5.38e-02, 1.90e-02, 2.57e-01, 7.62e-02, 3.72e-02, 1.79e-02] [5.13e-01, 1.05e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00] []
46000 [7.12e-01, 1.39e+00, 3.00e+00, 4.88e-01, 2.16e-02, 5.74e-01, 8.16e-02, 8.54e-01, 1.60e-02] [5.08e-01, 1.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00] []
47000 [6.86e-01, 1.44e+00, 2.88e+00, 1.55e-01, 1.99e-02, 3.34e-01, 9.01e-02, 2.58e-01, 1.57e-02] [4.95e-01, 9.68e-01, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00] []
48000 [6.90e-01, 1.30e+00, 2.80e+00, 7.08e-02, 1.67e-02, 2.55e-01, 7.83e-02, 3.06e-02, 2.17e-02] [4.93e-01, 9.37e-01, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00] []
49000 [6.73e-01, 1.31e+00, 2.69e+00, 1.24e-01, 1.80e-02, 2.92e-01, 8.91e-02, 1.36e-01, 1.93e-02] [4.86e-01, 8.92e-01, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00] []
50000 [6.68e-01, 1.19e+00, 2.63e+00, 8.34e-02, 1.56e-02, 2.52e-01, 8.09e-02, 2.98e-02, 2.45e-02] [4.78e-01, 8.67e-01, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00] []
Best model at step 50000: train loss: 4.97e+00 test loss: 1.35e+00 test metric: []
'train' took 270215.095084 s
Compiling model... Warning: learning rate is ignored for L-BFGS-B 'compile' took 3.814844 s
Training model...
Step Train loss Test loss Test metric
50000 [6.68e-01, 1.19e+00, 2.63e+00, 8.34e-02, 1.56e-02, 2.52e-01, 8.09e-02, 2.98e-02, 2.45e-02] [4.78e-01, 8.67e-01, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00] []
51000 [5.63e-02, 9.00e-02, 2.23e-02, 3.10e-02, 2.73e-03, 1.40e-02, 5.55e-03, 1.74e-02, 1.16e-02]
52000 [8.01e-03, 9.49e-03, 1.84e-02, 6.63e-04, 3.60e-03, 6.17e-04, 3.06e-03, 2.00e-03, 9.36e-04]
52131 [7.47e-03, 9.11e-03, 1.84e-02, 6.04e-04, 3.33e-03, 5.90e-04, 3.31e-03, 2.31e-03, 1.04e-03] [6.61e-03, 6.02e-03, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00, 0.00e+00] []
Best model at step 52131: train loss: 4.62e-02 test loss: 1.26e-02 test metric: []
'train' took 11633.123929 s
Saving loss history to loss.dat ... Saving training data to train.dat ... Saving test data to test.dat ... Predicting... 'predict' took 0.037205 s
```
I read in other replies that the loss terms should drop to 10^-4 or lower, but in my case, as you can see, they stay around 10^-2. I have tried different things like more Adam iterations, more training points in the domain and on the boundary, smaller learning rates, more layers, and different weights on the boundary loss terms, but nothing seems to drive the loss lower than 10^-2. Any ideas or suggestions? Thanks!
At step 0, the loss is very large and also unbalanced. Also, if possible, it is usually beneficial to use hard constraints for the BCs.
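For the Dirichlet condition v = 0 on the bottom edge, a hard constraint can be built into the network output instead of penalized in the loss: multiply the raw v output by the distance to the constrained edge, so the BC holds exactly for any weights. A minimal scalar sketch of that transform (the function name is ours, not a DeepXDE API):

```python
def hard_constraint(point, raw_output):
    """Output transform enforcing v = 0 on the bottom edge y = 0.

    point: (x, y) coordinates; raw_output: (u_raw, v_raw) from the network.
    u is left free; v is multiplied by y, so it vanishes identically at
    y = 0 and the corresponding Dirichlet loss term can be dropped.
    """
    x, y = point
    u_raw, v_raw = raw_output
    return u_raw, y * v_raw
```

With DeepXDE's TensorFlow backend the same idea would be applied on batched tensors, e.g. (assuming the `apply_output_transform` method of the network) `net.apply_output_transform(lambda x, y: tf.concat([y[:, 0:1], x[:, 1:2] * y[:, 1:2]], axis=1))`.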
Dear @lululxvi
I have one question about the formulation of the traction condition in the code above. Wouldn't specifying the stress condition with a DirichletBC equal to 0 be the same as the condition specified above with OperatorBC? In a DirichletBC it would check the stress value calculated from the stress equation, right? The same as in OperatorBC, or is it different?
Dear @lululxvi
Thanks for the reply on this thread, which I am getting huge help from. As you said, it might be more beneficial to use hard constraints for the BCs, but how is that actually carried out?
For Dirichlet BCs, I have read other threads and know that we can implement them with an output transform on the model, but what about the OperatorBCs? Could they also be implemented by a transformation, or are there other methods?
Maybe not.
Hello,
I have started practicing with deepxde and I am trying to solve a simple linear elasticity problem. The domain is a unit square and there is a body force of 200 in the y direction. See the code below:
```python
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function

import csv

import numpy as np
import tensorflow as tf
import deepxde as dde


def main():
    E = 78.2e9
    nu = 0.3
    lamda = 110.0
    mu = 80.0
    # (the rest of the script is not reproduced here)


if __name__ == "__main__":
    main()
```
After solving the model I am getting the following vertical and horizontal displacements, respectively. The vertical one looks reasonable, but not the horizontal one. Am I missing something? Thank you!