lu-group / deeponet-fno

DeepONet & FNO (with practical extensions)

Using a hard BC to constrain training in the 'darcy_rectangular_pwc' example #15

Open LanPeng-94 opened 11 months ago

LanPeng-94 commented 11 months ago

Hi Lu,

I am running your code for the darcy_rectangular_pwc example. I noticed that in your paper you used a hard constraint, as mentioned: "We use the coefficient 20 such that 20x(1−x)y(1−y) is of order 1 for x ∈ [0, 1] and y ∈ [0, 1]." I want to add it.
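For context, the factor 20x(1−x)y(1−y) vanishes whenever x ∈ {0, 1} or y ∈ {0, 1}, so multiplying the network output by it enforces the zero Dirichlet boundary condition by construction. A quick check of the coefficient (plain Python, not from the paper) confirms the order-1 claim:

# x(1 - x) attains its maximum of 1/4 at x = 0.5, and likewise for y,
# so the boundary factor peaks at 20 * (1/4) * (1/4) = 1.25
x = y = 0.5
print(20 * x * (1 - x) * y * (1 - y))  # 1.25, i.e. of order 1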

I then wrote the following code:

def output_transform(inputs, output):
    # inputs[1] holds the trunk-net coordinates (x, y)
    x_loc = inputs[1]
    x1, x2 = x_loc[:, 0:1], x_loc[:, 1:2]
    # multiply by 20 x1 (1 - x1) x2 (1 - x2) to enforce u = 0 on the boundary
    final_output = x1 * x2 * (1 - x1) * (1 - x2) * output * 20
    return final_output

net.apply_output_transform(output_transform)

However, it does not work for me; I get the following error:

"File "/public/home/hpc214801033/DeepONet/Darcy/src/BVx_H/deeponet_BVx_H.py", line 82, in output_transform finaloutput = x1 x2 (1 - x1) (1 - x2) (y * std + scaler.mean.astype(np.float32)) File "/public/home/hpc214801033/.conda/envs/tensorflow2.91/lib/python3.10/site-packages/tensorflow/python/util/traceback_utils.py", line 150, in error_handler return fn(*args, **kwargs)"

What is going wrong here?

Best regards,

Peng

LanPeng-94 commented 11 months ago

Hi,

Now I see the reason for the error: outputs has shape samples × sensors, i.e. (N, m), while x1 and x2 have shape (m, 1), so the elementwise product cannot broadcast. I can change x1 * x2 * (1 - x1) * (1 - x2) * output * 20 to output * 20 * tf.transpose(x1 * x2 * (1 - x1) * (1 - x2)), as shown below:

import tensorflow as tf

def output_transform(inputs, outputs):
    # inputs[1] holds the trunk-net coordinates, shape (m, 2)
    x_loc = inputs[1]
    x1, x2 = x_loc[:, 0:1], x_loc[:, 1:2]  # each of shape (m, 1)
    # outputs has shape (N, m), so transpose the boundary factor
    # from (m, 1) to (1, m) before multiplying, letting it broadcast
    final_outputs = outputs * 20 * tf.transpose(x1 * x2 * (1 - x1) * (1 - x2))
    return final_outputs

net.apply_output_transform(output_transform)
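As a sanity check (not part of the original code; N and m are hypothetical batch and sensor sizes), the transposed factor broadcasts against outputs as intended:

import tensorflow as tf

N, m = 8, 100                          # hypothetical batch size and sensor count
outputs = tf.ones((N, m))              # stand-in for the DeepONet outputs, shape (N, m)
x_loc = tf.random.uniform((m, 2))      # stand-in trunk coordinates in [0, 1]^2
x1, x2 = x_loc[:, 0:1], x_loc[:, 1:2]  # each of shape (m, 1)

factor = tf.transpose(x1 * x2 * (1 - x1) * (1 - x2))  # (m, 1) -> (1, m)
print((outputs * 20 * factor).shape)   # (N, m): broadcasting works as intended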

Is this correct?

Best regards,

Peng

lululxvi commented 11 months ago

Looks good.