Open jazberna1 opened 2 months ago
Hello @jazberna1,
Sorry for the delay, it has been a really busy week 😅 It seems like the code is failing when getting the input size, so I will take a look at the notebook in ZeroCostDL4Mic. I will see if I can reproduce the error and fix it; once I manage to solve the issue I will let you know here with another comment 🤗
One quick question to try to reproduce the error: when using section 2.3. Using weights from a pre-trained model as initial weights, did you use any previous weights?
Thank you for reporting the bug! ❤️
Hello Ivan,
Thanks for coming back to me. To your question: when using section 2.3. Using weights from a pre-trained model as initial weights, did you use any previous weights?
In section 2.3 I have:
Then I go straight to 5.1 Generate prediction(s) from unseen dataset
where I have:
Inside the results folder I have unet_2d_multlilabel/weights_best.hdf5, so like this:
results/unet_2d_multlilabel/weights_best.hdf5
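In case it helps anyone reproduce this, a small sanity-check snippet (hypothetical, not part of the notebook) can confirm that the weights file from section 2.3 is where section 5.1 expects it; the folder name is simply the one from my run:

```python
# Hypothetical sanity check, not part of the ZeroCostDL4Mic notebook:
# verify that the weights file produced in section 2.3 exists before running 5.1.
import os

weights_path = os.path.join('results', 'unet_2d_multlilabel', 'weights_best.hdf5')
print(weights_path, 'exists:', os.path.isfile(weights_path))
```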
Then I comment out the line to avoid the error:
print('Model input size: '+str(Input_size[0])+'x'+str(Input_size[1]))
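In case it is useful, here is a hedged sketch (not the notebook's actual code) of a version of that line that would not crash, assuming Input_size really is a one-element list holding the model's full input shape tuple, as described in the report below:

```python
# Sketch only: assumes Input_size looks like the value reported in this issue.
Input_size = [(None, 256, 256, 1)]  # example value taken from the report

if len(Input_size) >= 2:
    # Layout the original print seems to expect: [height, width, ...]
    print('Model input size: ' + str(Input_size[0]) + 'x' + str(Input_size[1]))
else:
    # Observed layout: a single (batch, height, width, channels) tuple
    height, width = Input_size[0][1], Input_size[0][2]
    print('Model input size: ' + str(height) + 'x' + str(width))
```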
Then I click 'Load and run' and get the predictions:
Hope this helps! Jorge
I get the following error in section 5.1 Generate prediction(s) from unseen dataset when running the notebook U-Net_2D_Multilabel_ZeroCostDL4Mic:

IndexError: list index out of range

The error happens in function_29, in the following line:

print('Model input size: '+str(Input_size[0])+'x'+str(Input_size[1]))
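For reference, the failure can be reproduced standalone with the value I see in the notebook (how the notebook actually builds Input_size is an assumption on my part; see the analysis just below):

```python
# Standalone reproduction using the Input_size value observed in the notebook;
# the way the notebook constructs this list is an assumption.
Input_size = [(None, 256, 256, 1)]

# Input_size[0] is the whole shape tuple; Input_size[1] does not exist.
print('Model input size: ' + str(Input_size[0]) + 'x' + str(Input_size[1]))
# -> IndexError: list index out of range
```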
I believe what is happening here is that Input_size is [(None, 256, 256, 1)], where the None in Input_size[0] is the batch size, so Input_size[1] is out of range. If I comment out Input_size everywhere in the function I am able to get the predictions.

To Reproduce
I have run the following sections of the notebook:
Use_Data_augmentation
2.3. Using weights from a pre-trained model as initial weights

Describe the bug
See beginning of the message
Expected behavior
Get model predictions
Screenshots
Desktop (please complete the following information):
My version of the notebook is:
Smartphone (please complete the following information): Does not apply