openai / jukebox

Code for the paper "Jukebox: A Generative Model for Music"
https://openai.com/blog/jukebox/

Running Colab ENV With Local GPU/In Parallel #251

Open austinask opened 2 years ago

austinask commented 2 years ago

I have been able to get the Colab notebook and a local Jupyter notebook running on my system, but I'm hoping someone can help me configure the environment to use both GPUs on my Tesla K80 to prevent stalling.

Since the K80 is really two 12 GB GPUs on a single board, I end up running out of VRAM quite quickly and sampling just stalls.

I have tried running `!mpiexec -n {ngpus} python jukebox/sample.py` before loading the sampling model.
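For context, here's a minimal sketch (not jukebox's actual code) of what an `mpiexec` launch needs to do so each rank lands on its own GPU instead of both ranks sharing device 0. The Open MPI environment variable name below is an assumption; jukebox's `setup_dist_from_mpi` helper does the equivalent via mpi4py:

```python
import os

def pick_gpu_for_rank() -> int:
    """Map this MPI process to one GPU, so two ranks on a K80 each
    get a separate 12 GB device instead of fighting over GPU 0."""
    # Open MPI exports the per-node rank in this variable (assumption:
    # mpiexec is Open MPI; other launchers use different variable names).
    local_rank = int(os.environ.get("OMPI_COMM_WORLD_LOCAL_RANK", "0"))
    # Restrict this process to that single device before CUDA initializes.
    os.environ["CUDA_VISIBLE_DEVICES"] = str(local_rank)
    return local_rank

print(pick_gpu_for_rank())  # rank 0 prints 0 when run without mpiexec
```

If both ranks are piling onto GPU 0, that alone would explain running out of memory on one device while the other sits idle.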

I also changed line 111 of sample.py to `{hps.name}_{dist.get_rank()}/level_{level}` so each rank gets its own output directory.
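That per-rank directory change can be sketched like this; the helper name is hypothetical, and the environment-variable rank lookup is a stand-in for jukebox's dist adapter:

```python
import os

def rank_logdir(name: str, level: int) -> str:
    # Stand-in for dist.get_rank(): read the rank that mpiexec exports
    # (Open MPI variable name assumed), defaulting to 0 when run alone.
    rank = int(os.environ.get("OMPI_COMM_WORLD_RANK", "0"))
    # Each rank writes to its own directory so parallel samplers
    # don't clobber each other's intermediate outputs.
    return f"{name}_{rank}/level_{level}"

print(rank_logdir("sample_5b", 2))  # -> sample_5b_0/level_2 without mpiexec
```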

However, this gives me a "Killed" (signal 9) error, which generally means the process ran out of system memory, yet while monitoring my GPUs they're only using around 276 MiB on node 0.

I have done plenty of research, but none of it has proved helpful.

Since the Discord is no longer active and this project seems mostly dead, I'm not expecting much of a reply, but I would greatly appreciate any help.

As an aside, I'd rather not pay for Colab Pro given the GPUs already in my local machine.