zsdonghao / u-net-brain-tumor

U-Net Brain Tumor Segmentation
https://github.com/zsdonghao/tensorlayer

While training (python train.py --task=all) I got an error #30

Open ksharm50 opened 6 years ago

ksharm50 commented 6 years ago

While training (python train.py --task=all) I got this error:

Traceback (most recent call last):
  File "train.py", line 250, in <module>
    main(args.task)
  File "train.py", line 65, in main
    import prepare_data_with_valid as dataset
  File "C:\delete\shared_f\braintumor\u-net-brain-tumor-master\prepare_data_with_valid.py", line 351, in <module>
    X_train_input = np.asarray(X_train_input, dtype=np.float32)
  File "C:\anaconda\lib\site-packages\numpy\core\numeric.py", line 492, in asarray
    return array(a, dtype, copy=False, order=order)
MemoryError

How can I solve this one? Thanks in advance for helping :)

zsdonghao commented 6 years ago

Three solutions:

  1. find a machine with more memory
  2. use a smaller batch size
  3. reimplement the data loading part with the TensorFlow Dataset API (a sketch follows below)

Hope it helps.
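A minimal sketch of option 3, assuming a TF 1.x-era tf.data pipeline and that each sample is a single NIfTI volume readable with nibabel; `volume_generator`, `make_dataset`, and the batch size are illustrative names, not the repository's actual code:

```python
import numpy as np
import nibabel as nib
import tensorflow as tf

def volume_generator(image_paths):
    # Yield one volume at a time instead of materialising the whole training
    # set with np.asarray (which is what raises the MemoryError above).
    for path in image_paths:
        yield nib.load(path).get_data().astype(np.float32)

def make_dataset(image_paths, batch_size=4):
    # Hypothetical helper: streams volumes lazily and keeps only a few
    # batches in memory at any time.
    ds = tf.data.Dataset.from_generator(
        lambda: volume_generator(image_paths),
        output_types=tf.float32)
    return ds.batch(batch_size).prefetch(1)
```

With a pipeline like this, the training loop pulls batches on demand rather than holding every volume in one giant array up front.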

ksharm50 commented 6 years ago

Hey :)

  1. I have a machine with 16 GB of RAM; is that enough?
  2. What do you mean by batch size exactly? (see the sketch after this list)
  3. Where is the data loading part exactly? Do you mean that I should replace this code?

     if DATA_SIZE == 'all':
         HGG_path_list = tl.files.load_folder_list(path=HGG_data_path)
         LGG_path_list = tl.files.load_folder_list(path=LGG_data_path)
     elif DATA_SIZE == 'half':
         HGG_path_list = tl.files.load_folder_list(path=HGG_data_path)[0:100]  # DEBUG WITH SMALL DATA
         LGG_path_list = tl.files.load_folder_list(path=LGG_data_path)[0:30]   # DEBUG WITH SMALL DATA
     elif DATA_SIZE == 'small':
         HGG_path_list = tl.files.load_folder_list(path=HGG_data_path)[0:50]   # DEBUG WITH SMALL DATA
         LGG_path_list = tl.files.load_folder_list(path=LGG_data_path)[0:20]   # DEBUG WITH SMALL DATA

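For context, a generic illustration of what "batch size" means here (not train.py's actual code): it is the number of samples fed to the network per optimisation step, and lowering it reduces the memory used during each step. Note, however, that the MemoryError in the traceback is raised while building X_train_input with np.asarray, before any batching, so a smaller batch size alone may not be enough. X_train_input comes from the traceback; y_train_target, t_image, t_seg, and train_op are hypothetical names.

```python
batch_size = 4  # smaller value -> less memory held per training step

for start in range(0, len(X_train_input), batch_size):
    # Take one small slice of the data per step instead of the whole array.
    x_batch = X_train_input[start:start + batch_size]
    y_batch = y_train_target[start:start + batch_size]
    # One gradient update on this slice, e.g.:
    # sess.run(train_op, feed_dict={t_image: x_batch, t_seg: y_batch})
```
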
ksharm50 commented 6 years ago

Actually, I am trying to run your solution with the BraTS 2018 dataset.

miaozhang0525 commented 6 years ago

I also got this same issue. I found it is not caused by the batch size; the error happens in nib.load(image_path).get_data(). Could you tell us which TensorFlow loading API we could use to replace this nibabel load?
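As far as I know, TensorFlow has no native NIfTI reader, so rather than replacing nibabel you can wrap it in a tf.data map so that only one volume is loaded per element. A hedged sketch, assuming TF 1.x's tf.py_func; `_load_nii` and `nii_dataset` are illustrative names:

```python
import numpy as np
import nibabel as nib
import tensorflow as tf

def _load_nii(path):
    # Called once per element, so only one volume is resident at a time.
    # py_func passes the path as bytes, hence the decode().
    return nib.load(path.decode()).get_data().astype(np.float32)

def nii_dataset(image_paths, batch_size=4):
    ds = tf.data.Dataset.from_tensor_slices(image_paths)
    ds = ds.map(lambda p: tf.py_func(_load_nii, [p], tf.float32),
                num_parallel_calls=2)
    return ds.batch(batch_size).prefetch(1)
```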

anitchris commented 5 years ago

How did you resolve this problem?