lululxvi / deepxde

A library for scientific machine learning and physics-informed learning
https://deepxde.readthedocs.io
GNU Lesser General Public License v2.1

Unable to plot the graph of the solution of a two-dimensional partial differential equation #596

Closed doubao2 closed 2 years ago

doubao2 commented 2 years ago

Hello Lu, I am trying to solve a two-dimensional partial differential equation for the value of an option over time. The equation has no exact solution, so no reference solution is written in the code, but the result of the code does not show a graph of the obtained numerical solution. Where is the problem?
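
For reference, the equation, written out from the pde residual below, is

$$
u_\tau
- \max(18 - x,\, 0)\, u_y
- \bigl[\, B + C\omega \cos(\omega (T - \tau) + \phi)
  + \alpha \bigl( A + B (T - \tau) + C \sin(\omega (T - \tau) + \phi) - x \bigr)
  - \lambda \sigma \,\bigr] u_x
- \tfrac{1}{2} \sigma^2 u_{xx}
+ r u = 0
$$

for x in [-50, 50], y in [0, 2040], and tau in [0, 30].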

import numpy as np
import pandas as pd
import tensorflow as tf
import matplotlib.pyplot as plt
# tf.compat.v1.disable_eager_execution()
import deepxde as dde
A = 5.97
B = 1
C = 10.4
alpha = 0.237
omega = 1
sigma = 1.96
phi = -2.01
lamda = 0.08
r = 0.05
k = 10
tick = 1
T = 30
#@tf.function
def pde(x, u):
    # Inputs are ordered as (x, y, tau); derivatives of u with respect to each input
    u_tau = dde.grad.jacobian(u, x, i=0, j=2)
    u_x = dde.grad.jacobian(u, x, i=0, j=0)
    u_y = dde.grad.jacobian(u, x, i=0, j=1)
    u_xx = dde.grad.hessian(u, x, i=0, j=0)
    # PDE residual
    return (
        u_tau
        - tf.maximum(18 - x[:, 0:1], 0) * u_y
        - (B + C * omega * tf.cos(omega * (T - x[:, 2:3]) + phi)
           + alpha * (A + B * (T - x[:, 2:3]) + C * tf.sin(omega * (T - x[:, 2:3]) + phi) - x[:, 0:1])
           - lamda * sigma) * u_x
        - 0.5 * sigma ** 2 * u_xx
        + r * u
    )

spatial_domain = dde.geometry.Rectangle(xmin=[-50, 0], xmax=[50, 2040])
temporal_domain = dde.geometry.TimeDomain(0, 30)
spatio_temporal_domain = dde.geometry.GeometryXTime(spatial_domain, temporal_domain)

def boundary_t(x, on_boundary):
    return on_boundary and np.isclose(x[0], -50)
def boundary_b(x, on_boundary):
    return on_boundary and np.isclose(x[1], 2040)
def boundary_r(x, on_boundary):
    return on_boundary and np.isclose(x[2], 0)

bc_t = dde.DirichletBC(spatio_temporal_domain, lambda x: 0, boundary_t)
bc_b = dde.DirichletBC(spatio_temporal_domain, lambda x: 0, boundary_b)  # ideally float("inf"), i.e. infinity
ic = dde.IC(
        spatio_temporal_domain,
        lambda x: tf.maximum(k - tf.maximum(18 - x[:, 0:1], 0), 0),
        boundary_r,
    )
def boundary_init(x, _):
    return np.isclose(x[0], 50)
ic_2 = dde.NeumannBC(
        spatio_temporal_domain,
        lambda x: 0, boundary_init, component=0)
data = dde.data.TimePDE(
    spatio_temporal_domain,
    pde, [bc_t, bc_b, ic, ic_2],
    num_domain=3000,
    num_boundary=600,
    num_initial=360,
    num_test=3000,
)

net = dde.nn.FNN([3] + [100] * 3 + [1], "tanh", "Glorot normal")
model = dde.Model(data, net)
model.compile("adam", lr=0.001)
model.train(epochs=10000)
model.compile("L-BFGS")
losshistory, train_state = model.train()
dde.saveplot(losshistory, train_state, issave=True, isplot=True)
Building feed-forward neural network...
'build' took 0.043144 s

'compile' took 0.425968 s

Initializing variables...
Training model...

Step      Train loss                                            Test loss                                             Test metric
0         [8.91e-02, 6.05e-02, 1.93e-01, 3.24e+01, 5.41e-05]    [2.87e-02, 6.05e-02, 1.93e-01, 3.24e+01, 5.41e-05]    []  
1000      [3.52e-01, 4.67e-04, 1.17e-02, 6.15e-02, 1.58e-03]    [3.47e-02, 4.67e-04, 1.17e-02, 6.15e-02, 1.58e-03]    []  
2000      [2.90e-01, 3.46e-04, 6.38e-03, 1.85e-01, 1.31e-03]    [4.51e-02, 3.46e-04, 6.38e-03, 1.85e-01, 1.31e-03]    []  
3000      [2.56e-01, 5.64e-04, 4.67e-03, 3.80e-02, 1.24e-03]    [4.05e-02, 5.64e-04, 4.67e-03, 3.80e-02, 1.24e-03]    []  
4000      [2.05e-01, 1.20e-03, 3.41e-03, 5.00e-02, 1.18e-03]    [3.24e-02, 1.20e-03, 3.41e-03, 5.00e-02, 1.18e-03]    []  
5000      [1.53e-01, 2.57e-04, 1.21e-03, 8.19e-03, 9.83e-04]    [3.09e-02, 2.57e-04, 1.21e-03, 8.19e-03, 9.83e-04]    []  
6000      [1.41e-01, 1.40e-03, 1.96e-03, 6.46e-02, 6.13e-04]    [4.09e-02, 1.40e-03, 1.96e-03, 6.46e-02, 6.13e-04]    []  
7000      [4.61e-01, 2.13e-04, 1.07e-02, 7.75e-02, 1.53e-03]    [1.88e-02, 2.13e-04, 1.07e-02, 7.75e-02, 1.53e-03]    []  
8000      [3.05e-01, 1.95e-04, 9.44e-03, 3.41e-02, 2.80e-03]    [1.41e-02, 1.95e-04, 9.44e-03, 3.41e-02, 2.80e-03]    []  
9000      [1.99e-01, 1.07e-04, 3.97e-03, 1.32e-02, 1.54e-03]    [1.37e-02, 1.07e-04, 3.97e-03, 1.32e-02, 1.54e-03]    []  
10000     [1.38e-01, 2.20e-04, 4.77e-03, 1.55e-02, 1.77e-03]    [1.42e-02, 2.20e-04, 4.77e-03, 1.55e-02, 1.77e-03]    []  

Best model at step 10000:
  train loss: 1.60e-01
  test loss: 3.64e-02
  test metric: []

'train' took 511.775824 s

Compiling model...
'compile' took 0.234032 s

Training model...

Step      Train loss                                            Test loss                                             Test metric
10000     [1.38e-01, 2.20e-04, 4.77e-03, 1.55e-02, 1.77e-03]    [1.42e-02, 2.20e-04, 4.77e-03, 1.55e-02, 1.77e-03]    []  
INFO:tensorflow:Optimization terminated with:
  Message: b'CONVERGENCE: REL_REDUCTION_OF_F_<=_FACTR*EPSMCH'
  Objective function value: 0.140541
  Number of iterations: 124
  Number of functions evaluations: 144
10144     [1.29e-01, 5.70e-05, 2.42e-03, 7.31e-03, 1.70e-03]    [1.42e-02, 5.70e-05, 2.42e-03, 7.31e-03, 1.70e-03]    []  

Best model at step 10144:
  train loss: 1.41e-01
  test loss: 2.57e-02
  test metric: []

'train' took 10.865519 s

Saving loss history to C:\Users\豆包儿\Desktop\新建文件夹\loss.dat ...
Saving training data to C:\Users\豆包儿\Desktop\新建文件夹\train.dat ...
Saving test data to C:\Users\豆包儿\Desktop\新建文件夹\test.dat ...
lululxvi commented 2 years ago

I cannot understand your question "code result cannot display the graph of the obtained numerical solution".

doubao2 commented 2 years ago

This is the output of my code. It produces only one graph, the training loss; I cannot get the solution of the partial differential equation. Is some code missing? [image]

praksharma commented 2 years ago

When you use dde.saveplot(losshistory, train_state, issave=True, isplot=True), it saves several .dat files in the working directory. You can plot test.dat with anything you wish.
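
For example, a minimal sketch along these lines should work (I am assuming here that the first three columns of test.dat are the inputs x, y, tau and the last column is the predicted u; check the header line of the file):

import numpy as np
import matplotlib.pyplot as plt

# test.dat is written by dde.saveplot into the working directory
data = np.loadtxt("test.dat")            # the "#" header line is skipped automatically
x, y, tau, u_pred = data[:, 0], data[:, 1], data[:, 2], data[:, -1]

# Scatter the test points in the (x, y) plane, colored by the predicted u
plt.scatter(x, y, c=u_pred, s=5)
plt.colorbar(label="u (predicted)")
plt.xlabel("x")
plt.ylabel("y")
plt.show()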

This is exactly what you need:

https://github.com/lululxvi/deepxde/issues/276#issuecomment-827279923
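
Alternatively, you can evaluate the trained network yourself on a grid and plot it. A rough sketch, assuming the model object from your script above (the grid size and the fixed tau = 30 are just example choices):

import numpy as np
import matplotlib.pyplot as plt

# Regular grid over the spatial rectangle [-50, 50] x [0, 2040] at fixed tau = 30
xs = np.linspace(-50, 50, 101)
ys = np.linspace(0, 2040, 101)
X, Y = np.meshgrid(xs, ys)
tau = np.full_like(X, 30.0)
points = np.stack([X.ravel(), Y.ravel(), tau.ravel()], axis=-1)   # shape (N, 3)

# Network prediction on the grid, reshaped back to the grid layout
U = model.predict(points).reshape(X.shape)

plt.pcolormesh(X, Y, U, shading="auto")
plt.colorbar(label="u (predicted)")
plt.xlabel("x")
plt.ylabel("y")
plt.title("Predicted solution at tau = 30")
plt.show()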

doubao2 commented 2 years ago

I'm trying to visualize my results. Thank you very much for your answer, which is very helpful to me.