weili101 / Deep_Plates

Physics-guided neural network framework for elastic plates
MIT License
30 stars 10 forks

The problem with the code for partial derivatives #1

Closed · wk36524 closed this issue 1 year ago

wk36524 commented 1 year ago
    Hello, thank you very much for sharing the code.
    In the "derivative" function, my understanding is that Net(x) is the output of the neural network, i.e. the predicted value "w". So what does "func(x)" do?
    I can see the specific definition of "func(x)" later in the code, but I don't understand its role inside "derivative".
    Thanks!

```python
def derivative(x, Net, func, order):

    w = Net(x) * func(x).view(-1, 1)                 # func(x)?????????????????

    if order == '0':
        return w

    else:
        # first derivatives of w with respect to the inputs (x, y)
        dw_xy = torch.autograd.grad(w, x, torch.ones_like(w),
                                    retain_graph=True, create_graph=True, allow_unused=True)
        dw_x = dw_xy[0][:, 0].view(-1, 1)
        dw_y = dw_xy[0][:, 1].view(-1, 1)

        if order == '1':
            return w, dw_x, dw_y

        else:
            # second derivatives, obtained by differentiating dw_x and dw_y again
            dw_xxy = torch.autograd.grad(dw_x, x, torch.ones_like(dw_x),
                                         retain_graph=True, create_graph=True, allow_unused=True)
            dw_xx = dw_xxy[0][:, 0].view(-1, 1)
            dw_xy = dw_xxy[0][:, 1].view(-1, 1)
            dw_yy = torch.autograd.grad(dw_y, x, torch.ones_like(dw_y), retain_graph=True,
                                        create_graph=True, allow_unused=True)[0][:, 1].view(-1, 1)
            return w, dw_x, dw_y, dw_xx, dw_yy, dw_xy
```
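As a sanity check on the autograd pattern used above, here is a self-contained sketch (the function w = x²y and the random points are assumptions, chosen because its derivatives are known in closed form) that reproduces the same `torch.autograd.grad` calls and verifies them analytically:

```python
import torch

# Toy field w = x^2 * y at 5 random (x, y) points; derivatives are known:
# dw/dx = 2xy, dw/dy = x^2, d2w/dx2 = 2y, d2w/dxdy = 2x
xy = torch.rand(5, 2, requires_grad=True)
x, y = xy[:, 0], xy[:, 1]
w = (x ** 2 * y).view(-1, 1)

# first derivatives, same call pattern as in derivative()
dw = torch.autograd.grad(w, xy, torch.ones_like(w),
                         retain_graph=True, create_graph=True)[0]
dw_x = dw[:, 0].view(-1, 1)
dw_y = dw[:, 1].view(-1, 1)

# second derivatives by differentiating dw_x again
dw_x2 = torch.autograd.grad(dw_x, xy, torch.ones_like(dw_x),
                            retain_graph=True, create_graph=True)[0]
dw_xx = dw_x2[:, 0].view(-1, 1)
dw_xy = dw_x2[:, 1].view(-1, 1)

assert torch.allclose(dw_x, (2 * x * y).view(-1, 1))
assert torch.allclose(dw_y, (x ** 2).view(-1, 1))
assert torch.allclose(dw_xx, (2 * y).view(-1, 1))
assert torch.allclose(dw_xy, (2 * x).view(-1, 1))
```

Note that `create_graph=True` on the first `grad` call is what makes the second-order derivatives possible.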
weili101 commented 1 year ago

Hi, you are right that Net(x) is the output of the neural network.

"func(x)" here modifies the output so that it satisfies certain boundary conditions. For example, if func(x) = x - 1, then w = Net(x)*(x - 1) is always zero at x = 1, so the boundary condition "w = 0 at x = 1" is enforced by construction.

You can pass func(x) as 1 (lambda x: 1) if you don't need this.
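For instance (a toy two-layer network and hand-picked points, just for illustration), multiplying by func(x) = x - 1 makes the output vanish on the boundary x = 1 regardless of the network weights:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
net = nn.Sequential(nn.Linear(2, 8), nn.Tanh(), nn.Linear(8, 1))  # toy Net
func = lambda p: p[:, 0] - 1.0        # vanishes exactly on the line x = 1

# two points on the boundary x = 1, plus one interior point
xy = torch.tensor([[1.0, 0.3], [1.0, 0.9], [0.5, 0.2]])
w = net(xy) * func(xy).view(-1, 1)

print(w[:2])   # exactly zero: the condition w = 0 at x = 1 is built in
```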

Hope that helps.

Best, Wei