Crunch-UQ4MI / neuraluq


Learn forward_poisson example #20

Open LanPeng-94 opened 1 year ago

LanPeng-94 commented 1 year ago

Hi, this is excellent work. I would like to implement some of my own ideas with your library. I am learning how to use HMC to build a B-PINN, and studying the forward Poisson example has given me some inspiration; I have obtained some results. [image attached] The learned result is actually not good. This seems to be due to the high noise level, as discussed in the paper. Is the 10% noise level the reason? What noise level can the B-PINN tolerate for this example? Looking forward to your answer. Yours

XuhuiM commented 1 year ago

Hi, thanks for your interest in our work. The noise level reflects the instrument error in experiments. We set it to 10% in the code to test our model, and you can change it to match your own problem setup. The B-PINN paper has a function-approximation example in which the noise level is 50%; you can take a look if it interests you. Thanks.
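For concreteness, here is a minimal sketch of how noisy observations at a given level could be generated. The `add_noise` helper and the proportional-to-magnitude convention are assumptions for illustration, not the repository's exact code, which may scale the noise differently:

```python
import numpy as np

def add_noise(u, level=0.10, seed=0):
    """Corrupt clean observations u with zero-mean Gaussian noise.

    `level` is the noise scale relative to the data magnitude,
    matching the 10% setting discussed above; adjust it for your
    own problem setup.
    """
    rng = np.random.default_rng(seed)
    return u + level * np.abs(u) * rng.standard_normal(u.shape)

u_clean = np.sin(np.linspace(0.0, np.pi, 5))
u_noisy = add_noise(u_clean, level=0.10)
```

Raising `level` (e.g. to 0.5, as in the 50% function-approximation example) widens the posterior the B-PINN infers.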

LanPeng-94 commented 1 year ago

Cheers! I have another question. I have reviewed your examples, and it seems that they all assume the presence of noise in the source term. Suppose I want to use a B-PINN to solve the following problem, where the PDE is:

h_zz + alpha * h_z = 0, where alpha is a known constant and 0 < z < 10.

I assume that the noise comes from the observation data of h (for example, h(1), h(2), h(3)). In this case the source term f is exactly 0, with the measurement data correct, as discussed in #7. When using NeuralUQ to construct the B-PINN (with HMC for posterior inference), only the likelihood of the h data needs to be established; since f contains no noise, it seems there is no need to establish a likelihood for it. What should I do?
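For reference, the equation residual that such a B-PINN would need to drive to zero can be checked independently. This is a dependency-free sketch using central finite differences and a hypothetical value of alpha; in a B-PINN the derivatives would instead come from automatic differentiation of the network:

```python
import numpy as np

ALPHA = 2.0  # the known constant in the PDE (hypothetical value)

def pde_residual(h, z, alpha=ALPHA, eps=1e-4):
    """Residual of h_zz + alpha * h_z = 0 at points z.

    Central finite differences stand in for the automatic
    differentiation a B-PINN would use on its surrogate network.
    """
    h_z = (h(z + eps) - h(z - eps)) / (2.0 * eps)
    h_zz = (h(z + eps) - 2.0 * h(z) + h(z - eps)) / eps**2
    return h_zz + alpha * h_z

# The exact solution h(z) = exp(-alpha * z) gives ~zero residual.
z = np.linspace(0.0, 10.0, 11)
res = pde_residual(lambda x: np.exp(-ALPHA * x), z)
```

Since the general solution is h(z) = C1 + C2 * exp(-alpha * z), noisy observations of h at a few points are what pin down C1 and C2 in the posterior.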

XuhuiM commented 1 year ago

No, you should also include a likelihood term for f. Otherwise, you would have to encode the equation exactly in the BNN. For now we simply assume that the data for f contain a very small measurement error, e.g., 1%. In this way, we can construct the likelihood for f in the same way as for h. Let me know if you find a better approach. Thanks.
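The suggestion above can be sketched as a joint Gaussian log-likelihood: the h observations enter with their real noise scale (e.g. 10%), and the noise-free source term f enters as pseudo-data with a small assumed error (e.g. 1%). The function names and sigma values here are illustrative, not NeuralUQ's API:

```python
import numpy as np

def gaussian_loglik(residual, sigma):
    """Log-likelihood of residuals under i.i.d. N(0, sigma^2) noise."""
    residual = np.asarray(residual, dtype=float)
    return -0.5 * np.sum(residual**2 / sigma**2
                         + np.log(2.0 * np.pi * sigma**2))

def joint_loglik(h_pred, h_obs, f_pred, sigma_h=0.1, sigma_f=0.01):
    """Joint likelihood over h observations and the source term f.

    f is exactly 0 here, but is treated as data with a small assumed
    error sigma_f (e.g. 1%) so it enters the posterior the same way
    the h data do.
    """
    return (gaussian_loglik(h_pred - h_obs, sigma_h)
            + gaussian_loglik(f_pred - 0.0, sigma_f))
```

The small sigma_f acts as a soft PDE constraint: the sharper it is, the more strongly HMC samples are pulled toward functions that satisfy the equation.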

LanPeng-94 commented 1 year ago

Hi, currently I want to transform the output of the neural network, and I noticed that neuq.surrogates.FNN has an output_transform option, but I don't know how it should be defined. Can you take a look at the following code snippet? Will it work?

# Imports as used in the NeuralUQ examples; adjust to your installation.
import tensorflow as tf
import neuraluq as neuq
import neuraluq.variables as neuq_vars


def output_transform(x):
    # Constrain the network output to be strictly negative:
    # -exp(x) < 0 for all x.
    return -tf.exp(x)


def Samplable(
    z_u_train, t_u_train, u_train, z_f_train, t_f_train, f_train, noise, layers
):
    # build processes
    process_u = neuq.process.Process(
        surrogate=neuq.surrogates.FNN(layers=layers, output_transform=output_transform),
        prior=neuq_vars.fnn.Samplable(layers=layers, mean=0, sigma=1),
    )
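As a quick sanity check on the transform itself, independent of NeuralUQ, -exp maps any real network output to a strictly negative value (here in NumPy form to keep it dependency-free):

```python
import numpy as np

def output_transform(x):
    # same transform as in the snippet above, in NumPy form
    return -np.exp(x)

x = np.linspace(-5.0, 5.0, 101)
y = output_transform(x)
# every transformed value is strictly negative, approaching 0 as x -> -inf
```

Note this also makes the output monotone in x and bounded above by 0, so make sure a sign-constrained h is actually appropriate for your problem.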