gregwchase / eyenet

Identifying diabetic retinopathy using convolutional neural networks
https://www.youtube.com/watch?v=pMGLFlgqxuY
MIT License

MemoryError #3

Closed: rockywind closed 6 years ago

rockywind commented 6 years ago

Hi, my configuration is Ubuntu 10.04 with Python 3. When I run the script image_to_array.py, the following error occurs:

Writing Train Array
Traceback (most recent call last):
  File "image_to_array.py", line 62, in <module>
    X_train = convert_images_to_arrays_train('../data/train-resized-256/', labels)
  File "image_to_array.py", line 38, in convert_images_to_arrays_train
    return np.array([np.array(Image.open(file_path + img)) for img in lst_imgs])
MemoryError

Thanks in advance.

gregwchase commented 6 years ago

@rockywind The issue is that you don't have enough DRAM. The file you're creating is one gigantic NumPy array of arrays, and the whole thing has to fit in memory at once. My suggestion would be to run the scripts in the cloud on a machine with plenty of memory (100 GB should be enough).
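For a rough sense of scale, here is a back-of-the-envelope estimate. The image count (roughly 35,000 in the Kaggle training set) and the 256x256 RGB shape are assumptions, not figures from this thread:

```python
# Rough memory estimate for loading every resized image into one NumPy array.
# n_images and the 256x256x3 shape are assumptions, not measured values.
n_images = 35_000
bytes_per_image_uint8 = 256 * 256 * 3                 # ~196 KB per image as uint8
bytes_per_image_float64 = bytes_per_image_uint8 * 8   # if later cast to float64

print(f"uint8:   {n_images * bytes_per_image_uint8 / 1e9:.1f} GB")
print(f"float64: {n_images * bytes_per_image_float64 / 1e9:.1f} GB")
# Roughly 6.9 GB as uint8 and 55 GB as float64, plus temporary copies made
# while building the list of arrays, which is why a high-memory machine helps.
```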

rockywind commented 6 years ago

Yes, thank you very much, but I don't have access to a cloud machine to run the file on, so I'd like to know whether I can solve it another way, for example by processing the pictures in stages and storing a .npy file for every 100 pictures. That way we wouldn't need such a large amount of memory.
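Something along these lines could work. This is only a minimal sketch, not the repo's code; the function name, chunk size, and paths are illustrative:

```python
import os
import numpy as np
from PIL import Image

def save_images_in_chunks(img_dir, lst_imgs, out_dir, chunk_size=100):
    """Convert images to arrays in chunks and save one .npy file per chunk,
    so the full dataset never has to sit in memory at once."""
    os.makedirs(out_dir, exist_ok=True)
    for i in range(0, len(lst_imgs), chunk_size):
        chunk = lst_imgs[i:i + chunk_size]
        arr = np.array([np.array(Image.open(os.path.join(img_dir, img)))
                        for img in chunk])
        np.save(os.path.join(out_dir, f"X_train_{i // chunk_size:04d}.npy"), arr)
        del arr  # free the chunk before loading the next one
```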

gregwchase commented 6 years ago

@rockywind In order to train the model, a GPU is all but mandatory. Otherwise, you won't get results back in a timely manner.

Assuming you have a GPU, you can read images into memory a few at a time. Keras should have documentation on this.
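For example, a sketch using Keras's ImageDataGenerator.flow_from_directory; the directory layout (one subfolder per class) and all hyperparameters here are assumptions:

```python
from keras.preprocessing.image import ImageDataGenerator

# Streams batches of images from disk instead of loading everything into RAM.
# Assumes ../data/train-split/ contains one subdirectory per retinopathy level.
train_datagen = ImageDataGenerator(rescale=1.0 / 255)
train_generator = train_datagen.flow_from_directory(
    '../data/train-split/',
    target_size=(256, 256),
    batch_size=32,
    class_mode='categorical')

# The generator is then passed to training, e.g. model.fit_generator(train_generator, ...)
```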

karankumar-07 commented 6 years ago

To avoid the memory error:

  1. Write a script to split the images into separate folders based on their levels (see the sketch after this list).
  2. Once they are in separate folders, run Keras on them.
  3. After predicting the labels, use sklearn to compute precision, kappa, etc. If you need the labels for the test dataset on Kaggle, the link is https://www.kaggle.com/c/diabetic-retinopathy-detection/discussion/16149 (I'll upload the code soon).
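
A minimal sketch of step 1, assuming a labels CSV with image and level columns like the Kaggle trainLabels.csv; the file names, column names, and paths are assumptions:

```python
import os
import shutil
import pandas as pd

def split_images_by_level(labels_csv, img_dir, out_dir):
    """Copy each image into a subfolder named after its retinopathy level,
    giving the directory layout Keras's flow_from_directory expects."""
    labels = pd.read_csv(labels_csv)  # columns assumed to be: image, level
    for _, row in labels.iterrows():
        level_dir = os.path.join(out_dir, str(row['level']))
        os.makedirs(level_dir, exist_ok=True)
        src = os.path.join(img_dir, row['image'] + '.jpeg')
        if os.path.exists(src):
            shutil.copy(src, level_dir)

# Precision and quadratic weighted kappa can then be computed with sklearn, e.g.
# sklearn.metrics.cohen_kappa_score(y_true, y_pred, weights='quadratic').
```
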
gregwchase commented 6 years ago

@CodeRed1704 My own code for what you're describing will be integrated soon. I'm fixing a few other bugs first, but it's on the roadmap.

karankumar-07 commented 6 years ago

Thanks, @gregwchase.