juntang-zhuang / LadderNet


How do you deal with the Memory Error when the number of patches is set to 760000 for the CHASE dataset? It's too big #6

Closed: BOBKINGS1101 closed this issue 4 years ago

juntang-zhuang commented 4 years ago

Hi, without any modification of the code, you can increase the system swap size; or you can change the dataloader so that each batch is loaded on the fly, without pre-storing everything in memory.

BOBKINGS1101 commented 4 years ago

Thank you! How do I change the dataloader to load batches on the fly?

BOBKINGS1101 commented 4 years ago

I trained and tested on the STARE dataset, but found that the result is not very good (Se is very low). Did you train on the STARE dataset? If you have experimented with it, could you tell me how many patches you extracted?

juntang-zhuang commented 4 years ago

The same as in the code: a very large number of patches. You can first generate those patches and store them on disk, then write a Dataset class to read them in batch by batch, as shown here: https://pytorch.org/tutorials/beginner/data_loading_tutorial.html
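
For reference, here is a minimal sketch of such a Dataset class, assuming the patches have already been extracted and saved one per file. The directory layout and file names (`patch_<i>.npy` / `mask_<i>.npy`) are hypothetical, not part of this repo:

```python
import numpy as np
import torch
from torch.utils.data import Dataset, DataLoader

class PatchDataset(Dataset):
    """Reads one pre-extracted patch per index from files on disk,
    so only the current batch ever lives in memory."""

    def __init__(self, patch_dir, num_patches):
        # Assumes a prior extraction step saved pairs of files named
        # patch_<i>.npy / mask_<i>.npy into patch_dir (hypothetical layout)
        self.patch_dir = patch_dir
        self.num_patches = num_patches

    def __len__(self):
        return self.num_patches

    def __getitem__(self, idx):
        # Load a single patch/mask pair from disk on demand
        patch = np.load(f"{self.patch_dir}/patch_{idx}.npy")
        mask = np.load(f"{self.patch_dir}/mask_{idx}.npy")
        return torch.from_numpy(patch).float(), torch.from_numpy(mask).float()

# The DataLoader then pulls batches lazily during training,
# so peak memory is bounded by batch_size, not the total patch count:
loader = DataLoader(PatchDataset("./patches", 760000),
                    batch_size=32, shuffle=True, num_workers=4)
```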