keras-team / keras

Deep Learning for humans
http://keras.io/
Apache License 2.0

Memory allocation error on GPU #1218

Closed ymcui closed 8 years ago

ymcui commented 8 years ago

Hi,

Recently, I used Keras for language modeling on a Tesla K40M GPU. I set up an LSTM language model, and it works well on a small dataset (~1M of text), consuming only about 300MB of GPU memory. But when I tried to train on a larger dataset (~150M of text), the program could not run and reported a "Memory Allocation Error": it tried to allocate more than 30GB of GPU memory, which is impossible on any regular GPU. The hyper-parameters for the small and large datasets are identical (hiddenSize=100, vocabulary=30k). I don't know whether memory allocation also depends on the input data size.
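One possible explanation (an assumption on my part, not confirmed in the thread): if the whole corpus is vectorized up front into a dense array, e.g. one-hot next-word targets over a 30k vocabulary, required memory grows linearly with the number of tokens, so a dataset 150x larger needs 150x the memory. A rough back-of-envelope sketch, with token counts for the two file sizes guessed for illustration:

```python
# Hypothetical memory estimate: dense one-hot targets for an entire corpus
# vs. plain integer indices. Token counts below are assumptions for
# illustration, not numbers from the issue.

def onehot_bytes(num_tokens, vocab_size, bytes_per_float=4):
    """Bytes needed to hold dense one-hot targets for every token at once."""
    return num_tokens * vocab_size * bytes_per_float

def index_bytes(num_tokens, bytes_per_int=4):
    """Bytes needed to hold integer word indices for every token."""
    return num_tokens * bytes_per_int

vocab_size = 30_000            # from the issue
large_tokens = 30_000_000      # assumed token count for a ~150M text file

dense = onehot_bytes(large_tokens, vocab_size)
sparse = index_bytes(large_tokens)

print(f"dense one-hot: {dense / 2**30:,.0f} GiB")   # explodes with data size
print(f"integer ids:   {sparse / 2**30:.2f} GiB")   # stays tiny
```

If something like this is happening, keeping targets as integer indices (or vectorizing per batch) would make memory independent of total corpus size.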

Thanks in advance.

jocicmarko commented 8 years ago

You could try reducing batch_size during training and prediction; at least, that helped me with big models and big datasets.
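My reading of this suggestion (a sketch, not code from the thread): if dense buffers such as one-hot targets are built per batch rather than for the whole dataset, peak memory is proportional to batch_size x vocab_size, not corpus length. A minimal pure-Python illustration:

```python
# Sketch of why a smaller batch_size lowers peak memory: dense one-hot
# targets are materialized only batch_size rows at a time. Toy data below
# is invented for illustration.

def batches(items, batch_size):
    """Yield successive slices of `items`, each of length <= batch_size."""
    for start in range(0, len(items), batch_size):
        yield items[start:start + batch_size]

def one_hot_batch(token_ids, vocab_size):
    """Dense one-hot rows for a single batch (plain lists for clarity)."""
    return [[1.0 if j == t else 0.0 for j in range(vocab_size)]
            for t in token_ids]

token_ids = [3, 1, 4, 1, 5, 9, 2, 6]   # toy corpus of word indices (assumed)
vocab_size = 10

for batch in batches(token_ids, batch_size=4):
    targets = one_hot_batch(batch, vocab_size)
    # peak dense memory here is 4 x 10 floats, however long token_ids is
```

Halving batch_size halves that per-batch buffer, at the cost of more (and possibly noisier) gradient updates per epoch.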

ymcui commented 8 years ago

@jocicmarko Thank you, I will try that.