Closed · amorimdiogo closed this issue 5 years ago
tables.exceptions.NoSuchNodeError: group / does not have a child named data
The way I have the code set up, it expects the data file to have three groups: "data", "truth", and "affine". You can also add a "subject_ids" group to the data file. During training, the generator fetches the data from the "data" and "truth" groups in the h5 file.
Your problem is that your h5 file does not have the "data" group. It might not have the other groups as well, but missing the "data" group is what caused the error.
Of note, this is the same problem that @SnowRipple had in #95 when trying to create his own h5 file. I got confused when answering his question, and I'm pretty sure I totally messed him up!
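Before retraining, it can save time to confirm the file layout directly. This is a sketch (not project code) that lists the top-level datasets in an HDF5 file with `h5py` and reports which of the expected groups are missing; `missing_groups` and `EXPECTED` are names I made up for illustration:

```python
import h5py

# Groups the training generator expects to find at the top level of the file.
EXPECTED = ("data", "truth", "affine")  # "subject_ids" is optional

def missing_groups(path):
    """Return the expected top-level names that the file does not contain."""
    with h5py.File(path, "r") as f:
        present = set(f.keys())
    return [name for name in EXPECTED if name not in present]
```

Running `missing_groups("brats_data.h5")` on a file that raises the `NoSuchNodeError` above would report "data" (and possibly the other groups) as missing.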
So, I created an .h5 file with the following code (imports and the missing `close()` added for completeness):

```python
import h5py
import numpy as np

data_set = create_dataset(data_dir, shorten=True)  # data_set.shape = (m, 2, 128, 128, 128)
h5f = h5py.File('lits_data.h5', 'w')
h5f.create_dataset('data', data=np.expand_dims(data_set[:, 0, :, :, :], 1))
h5f.create_dataset('truth', data=np.expand_dims(data_set[:, 1, :, :, :], 1))
h5f.close()
```

Here `data_set`, of shape (m, 2, 128, 128, 128), is split into two arrays of shape (m, 1, 128, 128, 128) each, saved under "data" and "truth" respectively.
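The slicing step above can be sanity-checked on a small dummy array; only the shapes matter, so a sketch with 8³ volumes instead of 128³ behaves identically:

```python
import numpy as np

# Dummy stand-in for data_set: m=3 cases, channel 0 = scan, channel 1 = mask.
# (8x8x8 volumes purely to keep the example small; the real data is 128^3.)
data_set = np.zeros((3, 2, 8, 8, 8))

scans = np.expand_dims(data_set[:, 0, :, :, :], 1)
masks = np.expand_dims(data_set[:, 1, :, :, :], 1)

print(scans.shape)  # (3, 1, 8, 8, 8)
print(masks.shape)  # (3, 1, 8, 8, 8)
```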
I get this error:
```
Loading previous validation split...
Number of training steps: 1
Number of validation steps: 1
Epoch 1/500
Traceback (most recent call last):
  File "/home/albaroz/PycharmProjects/tese/3DUnetCNN-master/brats/train_isensee2017.py", line 120, in <module>
    main(overwrite=config["overwrite"])
  File "/home/albaroz/PycharmProjects/tese/3DUnetCNN-master/brats/train_isensee2017.py", line 114, in main
    n_epochs=config["n_epochs"])
  File "/home/albaroz/PycharmProjects/tese/3DUnetCNN-master/unet3d/training.py", line 88, in train_model
    early_stopping_patience=early_stopping_patience))
  File "/home/albaroz/anaconda3/envs/tese/lib/python3.6/site-packages/keras/legacy/interfaces.py", line 91, in wrapper
    return func(*args, **kwargs)
  File "/home/albaroz/anaconda3/envs/tese/lib/python3.6/site-packages/keras/engine/training.py", line 2230, in fit_generator
    class_weight=class_weight)
  File "/home/albaroz/anaconda3/envs/tese/lib/python3.6/site-packages/keras/engine/training.py", line 1877, in train_on_batch
    class_weight=class_weight)
  File "/home/albaroz/anaconda3/envs/tese/lib/python3.6/site-packages/keras/engine/training.py", line 1476, in _standardize_user_data
    exception_prefix='input')
  File "/home/albaroz/anaconda3/envs/tese/lib/python3.6/site-packages/keras/engine/training.py", line 123, in _standardize_input_data
    str(data_shape))
ValueError: Error when checking input: expected input_1 to have shape (2, 128, 128, 128) but got array with shape (1, 128, 128, 128)
Closing remaining open files: /home/albaroz/PycharmProjects/tese/3DUnetCNN-master/brats/lits_data.h5...done
```
So I'm really struggling with the .h5 file shape. All I have are volumes (data) and segmentation masks (truth), so I can only feed it two channels... How should I save my data? Thanks!
I think this is a problem with your model configuration parameters. The model is expecting data with shape (2, 128, 128, 128), but the h5 file is feeding it data of shape (1, 128, 128, 128). If you only have one image type/modality (not counting the segmentation image), then the h5 file is correct and the model is wrong. You need to delete the model file and create a new model with the `input_shape` variable set to (1, 128, 128, 128). If you are going off one of my scripts for training the model, you can change the input shape by setting `config["input_shape"] = (1, 128, 128, 128)`.
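As a sketch of that fix (the names mirror the config dict used in the brats training scripts, but this fragment is illustrative, not copied from the repo):

```python
# Single-modality setup: one input channel instead of the 4 BraTS modalities.
config = dict()
config["image_shape"] = (128, 128, 128)
config["nb_channels"] = 1  # just the scan; the segmentation mask is not an input channel
config["input_shape"] = tuple([config["nb_channels"]] + list(config["image_shape"]))

print(config["input_shape"])  # (1, 128, 128, 128)
```

Remember to delete (or rename) the previously saved model file first; otherwise the old model expecting (2, 128, 128, 128) input is reloaded and the config change has no effect.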
Hello. When I ran the code, I encountered the same problem:
```
Traceback (most recent call last):
  File "/home/weijiaxu/pythoncode/3DUnetCNN-master/brats/train.py", line 115, in
Process finished with exit code 1
```
@albatroz95 @ellisdg @alkamid You said above that you need to add some code to create brats_data.h5, but I don't know how to write this program or where to put it in that file. I'm just starting to learn deep learning, so I hope I can get your help. Can you share the code you wrote with me? Thank you.
I tried to create a new brats_data.h5 file under the brats folder, and then copied @albatroz95's code into the brats_data.h5 file, as follows:

```python
data_set = create_dataset(data_dir, shorten=True)  # data_set.shape = (m, 2, 128, 128, 128)
h5f = h5py.File('lits_data.h5', 'w')
h5f.create_dataset('data', data=np.expand_dims(data_set[:, 0, :, :, :], 1))
h5f.create_dataset('truth', data=np.expand_dims(data_set[:, 1, :, :, :], 1))
```
Then I ran the program and got this error:
```
Traceback (most recent call last):
  File "/home/weijiaxu/pythoncode/3DUnetCNN-master/brats/train.py", line 115, in
  File "H5F.c", line 604, in H5Fopen
    unable to open file
  File "H5Fint.c", line 1087, in H5F_open
    unable to read superblock
  File "H5Fsuper.c", line 277, in H5F_super_read
    file signature not found
End of HDF5 error back trace
Unable to open/create file '/home/weijiaxu/pythoncode/3DUnetCNN-master/brats/brats_data.h5'
Process finished with exit code 1
```

I hope I can get your help. Thank you @albatroz95 @ellisdg @alkamid
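"file signature not found" means brats_data.h5 is not actually an HDF5 file: pasting Python source text into a file named brats_data.h5 produces a plain text file, and H5Fopen then fails to find the binary HDF5 signature at the start of it. The snippet has to be run as a Python script so that h5py itself writes the file. A minimal sketch of that, with a small dummy array standing in for the output of the real preprocessing step:

```python
import h5py
import numpy as np

# Dummy stand-in for the preprocessed dataset; the real project builds an
# (m, 2, 128, 128, 128) array from the image files. Small dims keep this cheap.
data_set = np.zeros((2, 2, 8, 8, 8))

with h5py.File("brats_data.h5", "w") as h5f:
    h5f.create_dataset("data", data=np.expand_dims(data_set[:, 0], 1))
    h5f.create_dataset("truth", data=np.expand_dims(data_set[:, 1], 1))

# The file on disk now begins with the 8-byte HDF5 signature, so it can be opened.
```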
@alkamid @ellisdg @albatroz95 When I ran the code, I encountered the same problem. I hope I can get your help. Thank you.
```
Loading previous validation split...
Number of training steps: 63
Number of validation steps: 10
Traceback (most recent call last):
  File "/tmp/pycharm_project_271/3DUnetCNN/brats/train.py", line 134, in
Process finished with exit code 1
```
I'm also trying to apply the Isensee 3D U-Net model to the LiTS dataset and was wondering what results @albatroz95 and @love112358 or others are getting with this dataset and architecture. For the liver segmentation, I'm getting a mean Dice of 0.918 ± 0.049, median of 0.927, min of 0.693, max of 0.955. Even though the scores appear high, some of the predicted edges do not match the manually segmented edges well for the whole liver. The liver tumors did not perform well at all. I'm interested in suggestions to improve.
Hello! I have created a .h5 file from the LiTS dataset with shape (140, 2, 128, 128, 128): 140 volumes, 2 channels (a scan and a mask for each), and 128 slices of 128×128 px each. I adjusted the header of the train_isensee2017.py file accordingly. When I try to run it, I get the error below from the get_training_and_validation_generators() function. What could it be? Thanks in advance!