YukiSakuma opened 2 years ago
I got it to work after upgrading to Pro for $10.. and then waiting a bit or restarting. https://github.com/borisdayma/dalle-mini/issues/207
It could definitely work with Pro, but I'm searching for a way, if possible, to get the mega model working on Kaggle with only 13 GB of RAM.
It seems more RAM really is needed, but I don't know the minimum. I was able to load mega fp16 together with CLIP on Paperspace's free tier with an 8 GB GPU, but that machine has 30 GB of RAM; generating a batch of 9 images takes about 5 minutes. Now I'm curious what the minimum RAM and GPU VRAM for inference would be once there is an open-source DALL-E 2 to play with.
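A rough back-of-envelope for why 13 GB is tight (the ~2.6B parameter count below is an assumption for illustration, not an official figure for mega):

```python
# Assumed parameter count, for illustration only; fp16 = 2 bytes/param.
n_params = 2.6e9
weights_gb = n_params * 2 / 1e9
print(f"one copy of the fp16 weights: ~{weights_gb:.1f} GB")

# If loading (or replicate) materializes a second host copy, that is
# already ~2x, before CLIP, the tokenizer, and JAX/XLA overhead.
print(f"two copies: ~{2 * weights_gb:.1f} GB")
```

Under those assumptions, two host copies plus overhead would already push past 13 GB, which matches what Kaggle reports.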
What if you don't load CLIP?
I tried, but it OOMs at this line:

```python
from flax.jax_utils import replicate

params = replicate(params)
```
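For context, here is a toy sketch (plain numpy, not the dalle-mini code) of why this line is memory-hungry: `replicate` stacks one copy of every parameter array per local device, so host memory roughly multiplies by the device count on top of the copy already loaded:

```python
import numpy as np

def replicate_like(tree, n_devices):
    # Mimics flax.jax_utils.replicate: stack n copies of each parameter
    # array along a new leading (device) axis.
    return {k: np.stack([v] * n_devices) for k, v in tree.items()}

params = {"w": np.ones((1024, 1024), dtype=np.float16)}  # ~2 MiB stand-in
replicated = replicate_like(params, 2)
print(replicated["w"].shape)  # one full copy per device
```

With multi-gigabyte weights, this stacking step alone can exceed Kaggle's 13 GB even when the initial load succeeded.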
I am using this edited notebook: https://www.kaggle.com/code/apolinario/dall-e-mega
I need `_do_init=False`; without it, loading the mega model OOMs. (The official inference notebook rejects `_do_init` as an unexpected keyword argument, which is why I'm using the edited one.) I also added the line below at the top:
```python
import os; os.environ["XLA_PYTHON_CLIENT_ALLOCATOR"] = "platform"
```

As provided here: https://github.com/saharmor/dalle-playground/issues/14
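For reference, the edited loading path looks roughly like this (a sketch assuming the `DalleBart.from_pretrained` API from the dalle-mini package; with `_do_init=False` it skips random initialization and returns the parameters separately instead of materializing the weights twice on the host):

```python
DALLE_MODEL = "dalle-mini/dalle-mini/mega-1-fp16:latest"

def load_model():
    # Lazy imports so this sketch can be read without jax/dalle-mini
    # installed; in a notebook these would be top-level imports.
    import jax.numpy as jnp
    from dalle_mini import DalleBart

    # With _do_init=False, from_pretrained returns (model, params)
    # rather than a model holding its own initialized weight copy.
    model, params = DalleBart.from_pretrained(
        DALLE_MODEL, dtype=jnp.float16, _do_init=False
    )
    return model, params
```

Actually downloading and replicating the weights still needs the host RAM discussed above.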
I'm using the provided notebook, but on Kaggle, with

```python
DALLE_MODEL = "dalle-mini/dalle-mini/mega-1-fp16:latest"
```

and it crashes with an out-of-memory error: "Your notebook tried to allocate more memory than is available. It has restarted."