Closed · hitmusic100 closed this issue 2 years ago
hps.n_samples = 1 if model in ('5b', '5b_lyrics') else 8
max_batch_size = 1 if model in ('5b', '5b_lyrics') else 16
I set the above variables to 1 as an experiment, as per https://github.com/openai/jukebox/blob/master/README.md
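For reference, the settings above can be sketched as below; `SimpleNamespace` stands in here for the actual `Hyperparams` object from the notebook, so this only illustrates the values being set, not the library's API:

```python
from types import SimpleNamespace

# Stand-in for the notebook's hyperparameters object (illustration only)
hps = SimpleNamespace()

model = '5b_lyrics'  # or '5b', '1b_lyrics'

# The large 5b models get the smallest sample count / batch size
# so sampling has a chance of fitting in GPU memory
hps.n_samples = 1 if model in ('5b', '5b_lyrics') else 8
max_batch_size = 1 if model in ('5b', '5b_lyrics') else 16

print(hps.n_samples, max_batch_size)  # → 1 1
```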
Colab needs to allocate a Tesla P100 - solved
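One way to confirm which GPU the Colab runtime actually allocated (assuming `nvidia-smi` is available, as it is on Colab GPU instances) is a quick check like this:

```python
import shutil
import subprocess

def gpu_name():
    """Return the name of the first NVIDIA GPU, or a fallback message."""
    if shutil.which("nvidia-smi") is None:
        return "no NVIDIA driver found"
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=name", "--format=csv,noheader"],
        capture_output=True, text=True,
    )
    return out.stdout.strip() or "no NVIDIA GPU detected"

print(gpu_name())  # e.g. 'Tesla P100-PCIE-16GB' on a P100 runtime
```

If Colab hands out a smaller GPU (e.g. a K80 or T4), disconnecting and reconnecting the runtime until a P100 appears is the usual workaround, since the GPU model cannot be requested directly.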
@hitmusic100 hi, how did you get Colab to allocate a Tesla P100? Or was there a change you made to the code?
I have Colab Pro, but keep getting this message using 5b_lyrics:
RuntimeError: CUDA out of memory. Tried to allocate 1.39 GiB (GPU 0; 14.76 GiB total capacity; 12.25 GiB already allocated; 1.08 GiB free; 12.61 GiB reserved in total by PyTorch)
If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
but I cannot find how to fix the error anywhere.
Would you know how to fix this error (without switching from 5b_lyrics to 1b_lyrics)? Thanks
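Regarding the `max_split_size_mb` hint in the traceback: PyTorch reads the `PYTORCH_CUDA_ALLOC_CONF` environment variable when CUDA is first initialized, so it has to be set before any CUDA work, e.g. in the first cell of the notebook. The value 128 below is just an example, not a recommendation:

```python
import os

# Must run before the first CUDA allocation (e.g. at the top of the notebook).
# Limits how large a cached block the allocator will split, which can reduce
# fragmentation at some performance cost.
os.environ["PYTORCH_CUDA_ALLOC_CONF"] = "max_split_size_mb:128"
```

Note this only addresses fragmentation; it may not be enough to fit 5b_lyrics on a GPU with less memory than a P100's 16 GiB.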
I am thinking this might be an elephant in the room, so I have created this new issue
referencing the same problem here (https://github.com/openai/jukebox/issues/145#issue-703124473)