PredictiveIntelligenceLab / Physics-informed-DeepONets

PI_DeepONet_Burger.ipynb vmap issue. #3

Closed HyungJunNoh closed 2 years ago

HyungJunNoh commented 2 years ago

Hello Sifan, First of all, thank you for making this code public. I'm trying to run and understand the code in PI_DeepONet_Burger.ipynb. In particular, this part

# Generate training data for initial condition
u_ics_train, y_ics_train, s_ics_train = vmap(generate_one_ics_training_data, in_axes=(0, 0, None, None))(keys, u0_train, m, P_ics_train)

gives the following error.

UnfilteredStackTrace: ValueError: vmap got inconsistent sizes for array axes to be mapped:
  arg 0 has shape (1000, 2) and axis 0 is to be mapped
  arg 1 has shape (10, 101) and axis 0 is to be mapped
  arg 2 has shape () and axis None is to be mapped
  arg 3 has shape () and axis None is to be mapped
so
  arg 0 has an axis to be mapped of size 1000
  arg 1 has an axis to be mapped of size 10

The stack trace below excludes JAX-internal frames. The preceding is the original exception that occurred, unmodified.

As far as I can tell, to use vmap the sizes of the mapped input axes must be equal for all mapped positional arguments, but here keys and u0_train have different sizes along axis 0, which is why vmap raises the error.
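For reference, here is a minimal sketch (using a toy placeholder toy_fn instead of the actual generate_one_ics_training_data) that reproduces the same requirement: every argument mapped over axis 0 must have the same leading size.

import jax.numpy as jnp
from jax import random, vmap

def toy_fn(key_row, u_row, m, P):
    # toy stand-in for generate_one_ics_training_data
    return u_row.sum() + key_row.sum() + m + P

keys = random.split(random.PRNGKey(0), 1000)   # shape (1000, 2)
u0_ok = jnp.zeros((1000, 101))
u0_bad = jnp.zeros((10, 101))

# Works: both mapped arguments have 1000 rows along axis 0
vmap(toy_fn, in_axes=(0, 0, None, None))(keys, u0_ok, 101, 101)

# Raises the same ValueError: axis 0 sizes are 1000 vs 10
# vmap(toy_fn, in_axes=(0, 0, None, None))(keys, u0_bad, 101, 101)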

Is there anything I am missing? I've tried to run this code on both Colab and my server, and both give the same error. How did you run this without the vmap issue? Thank you!

Original Code Snippet

# Prepare the training data

# Load data
path = 'Burger.mat'  # Please use the matlab script to generate data

data = scipy.io.loadmat(path)
usol = np.array( data['output'])

N = usol.shape[0]     # number of total input samples
N_train = 1000        # number of input samples used for training
N_test = N - N_train  # number of input samples used for test
m = 101               # number of sensors for input samples
P_ics_train = 101     # number of locations for evaluating the initial condition
P_bcs_train = 100     # number of locations for evaluating the boundary condition
P_res_train = 2500    # number of locations for evaluating the PDE residual
P_test = 101          # resolution of uniform grid for the test data

u0_train = usol[:N_train,0,:]   # input samples
# usol_train = usol[:N_train,:,:]

key = random.PRNGKey(0) # use different key for generating test data 
keys = random.split(key, N_train)

# Generate training data for initial condition
u_ics_train, y_ics_train, s_ics_train = vmap(generate_one_ics_training_data, in_axes=(0, 0, None, None))(keys, u0_train, m, P_ics_train)

u_ics_train = u_ics_train.reshape(N_train * P_ics_train,-1)  
y_ics_train = y_ics_train.reshape(N_train * P_ics_train,-1)
s_ics_train = s_ics_train.reshape(N_train * P_ics_train,-1)

# Generate training data for boundary condition
u_bcs_train, y_bcs_train, s_bcs_train = vmap(generate_one_bcs_training_data, in_axes=(0, 0, None, None))(keys, u0_train, m, P_bcs_train)

u_bcs_train = u_bcs_train.reshape(N_train * P_bcs_train,-1)  
y_bcs_train = y_bcs_train.reshape(N_train * P_bcs_train,-1)
s_bcs_train = s_bcs_train.reshape(N_train * P_bcs_train,-1)

# Generate training data for PDE residual
u_res_train, y_res_train, s_res_train = vmap(generate_one_res_training_data, in_axes=(0, 0, None, None))(keys, u0_train, m, P_res_train)

u_res_train = u_res_train.reshape(N_train * P_res_train,-1)  
y_res_train = y_res_train.reshape(N_train * P_res_train,-1)
s_res_train = s_res_train.reshape(N_train * P_res_train,-1)
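As a quick way to catch this before the vmap calls, a shape check could be added right above the first vmap line (this is just a debugging sketch, not part of the original notebook):

# Debugging check (not in the original notebook): arguments mapped over
# axis 0 must share the same leading dimension.
print(keys.shape, u0_train.shape)
assert keys.shape[0] == u0_train.shape[0], \
    "keys and u0_train must have the same number of rows along axis 0"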

sifanexisted commented 2 years ago

Hi HyungJunNoh,

Thank you for your interest in our work. It seems that the first dimensions of your keys and u0_train are different. Can you print the shapes of these two? Perhaps you only generated 10 training samples?

HyungJunNoh commented 2 years ago

print(keys.shape)
print(u0_train.shape)

------------------------

(1000, 2)
(10, 101)

Yes, they are different. I generated Burger.mat with the given MATLAB script found in Physics-informed-DeepONets/Burger/Data/. Was there anything else I should have done when generating the data?

sifanexisted commented 2 years ago

I think the only thing you need to change is the value of N in Physics-informed-DeepONets/Burger/Data/gen_Burgers.m: change N = 10 to N = 1000. It may take some time to generate the full dataset, though.

Or, if you just want to get rid of this error and run the notebook, you can change N_train = 1000 to N_train = 10.
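If you go that route, a slightly more defensive variant (just a sketch, not part of the original notebook) is to cap N_train at however many samples Burger.mat actually contains:

# Cap N_train at the number of samples present in Burger.mat
N = usol.shape[0]
N_train = min(1000, N)
keys = random.split(random.PRNGKey(0), N_train)
u0_train = usol[:N_train, 0, :]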

Hope this helps.

HyungJunNoh commented 2 years ago

Thank you, Sifan!