xamyzhao / brainstorm

Implementation of "Data augmentation using learned transforms for one-shot medical image segmentation"
MIT License

ValueError: Error when checking model : the list of Numpy arrays that you are passing to your model is not the size the model expected. Expected to see 4 array(s), but instead got the following list of 3 arrays: #11

Closed aven1995 closed 4 years ago

aven1995 commented 5 years ago

Hi, thank you for sharing the code. I used the data provided in this GitHub repo with your default settings and ran

python main.py trans --gpu 0 --data mri-100unlabeled --model flow-fwd
python main.py trans --gpu 0 --data mri-100unlabeled --model flow-bck

to get the spatial transform models, then edited main.py and ran

python main.py trans --gpu 0 --data mri-100unlabeled --model color-unet

to get the appearance model,

and then trained the segmentation network:

python main.py seg --gpu 0 --data mri-100unlabeled --aug_tm

This produced the following error:

Traceback (most recent call last):
  File "main.py", line 367, in <module>
    test_every_n_epochs=test_every_n_epochs)
  File "/home/aven/Research/brainstorm-master/src/experiment_engine.py", line 133, in run_experiment
    run_metadata=None,
  File "/home/aven/Research/brainstorm-master/src/experiment_engine.py", line 184, in train_batch_by_batch
    joint_loss, joint_loss_names = exp.train_on_batch()
  File "/home/aven/Research/brainstorm-master/src/segmenter_model.py", line 876, in train_on_batch
    self.X_train_batch, self.Y_train_batch, self.ul_train_ids = next(self.aug_train_gen)
  File "/home/aven/Research/brainstorm-master/src/segmenter_model.py", line 599, in _generate_augmented_batch
    colored_vol, color_delta, _ = self.color_aug_model.predict([source_X, X_colortgt_src, source_contours])
  File "/home/aven/anaconda3/envs/oneshot/lib/python3.6/site-packages/keras/engine/training.py", line 1817, in predict
    check_batch_axis=False)
  File "/home/aven/anaconda3/envs/oneshot/lib/python3.6/site-packages/keras/engine/training.py", line 86, in _standardize_input_data
    str(len(data)) + ' arrays: ' + str(data)[:200] + '...')
ValueError: Error when checking model : the list of Numpy arrays that you are passing to your model is not the size the model expected. Expected to see 4 array(s), but instead got the following list of 3 arrays: [array([[[[[0.],
          [0.],
          [0.],
          ...,
          [0.],
          [0.],
          [0.]],

         [[0.],
          [0.],
          [0.],
          ...,
          [0.],
       ...

I think the error is in _generate_augmented_batch in src/segmenter_model.py:

colored_vol, color_delta, _ = self.color_aug_model.predict([source_X, X_colortgt_src, source_contours])

The model expects 4 arrays, but this call passes only 3: source_X, X_colortgt_src, and source_contours.
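
For context, here is a minimal sketch that reproduces the check Keras is failing on. The model below is a toy 3D network with made-up shapes, not the repo's actual color_aug_model; the point is only that predict() must receive exactly as many arrays as the model declares inputs:

    import numpy as np
    from keras.layers import Concatenate, Conv3D, Input
    from keras.models import Model

    # Toy model declaring four inputs (shapes invented for illustration).
    ins = [Input(shape=(4, 4, 4, 1)) for _ in range(4)]
    out = Conv3D(1, 3, padding='same')(Concatenate()(ins))
    model = Model(inputs=ins, outputs=out)

    x = np.zeros((1, 4, 4, 4, 1))
    try:
        model.predict([x, x, x])   # three arrays -> the ValueError quoted above
    except ValueError as e:
        print(e)                   # "Expected to see 4 array(s) ..."
    model.predict([x, x, x, x])    # four arrays -> succeeds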

Is this a problem in your code, or have I done something wrong? I hope you can help me solve this. My environment is:

OS: Ubuntu 16.04 x86_64
Python version: Python 3.6.8 :: Anaconda, Inc.
tensorflow-gpu version: 1.9.0
Keras version: 2.1.6
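
To pin down which input is missing, the standard Keras Model attributes list what a loaded model declares. In the sketch below, color_aug_model stands for the appearance model from the traceback (the attribute name is taken from segmenter_model.py):

    # color_aug_model refers to the appearance model loaded by the segmenter;
    # these are standard Keras Model attributes.
    print(len(color_aug_model.inputs))   # how many arrays predict() expects
    print(color_aug_model.input_names)   # names of the declared input layers
    color_aug_model.summary()            # full layer listing, incl. each InputLayer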
xamyzhao commented 5 years ago

Thanks for pointing this out! Just checked in a quick fix -- let me know if you run into any further issues with it.