C0untFloyd / bark-gui

🔊 Text-Prompted Generative Audio Model with Gradio

torch.cuda.OutOfMemoryError: CUDA out of memory #63

Closed · brehiner closed this issue 1 year ago

brehiner commented 1 year ago

Is there a way to configure the startup so this can work with an RTX 3060 Ti?

I'm getting this error:

```
torch.cuda.OutOfMemoryError: CUDA out of memory. Tried to allocate 12.00 MiB (GPU 0; 8.00 GiB total capacity; 7.30 GiB already allocated; 0 bytes free; 7.33 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
```
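As the message itself hints, one thing worth trying first is the allocator setting it names. A minimal sketch of how that could look; the value 128 is an arbitrary starting point, not a recommendation from this repo:

```python
import os

# Must be set before PyTorch initializes CUDA; max_split_size_mb caps
# the size of cached allocator blocks, which can reduce fragmentation.
# 128 MiB is an arbitrary example value, not a project recommendation.
os.environ["PYTORCH_CUDA_ALLOC_CONF"] = "max_split_size_mb:128"

import torch  # imported after the env var so the CUDA allocator sees it
```

This only addresses fragmentation; if the models themselves do not fit into 8 GiB, it will not be enough on its own.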

C0untFloyd commented 1 year ago

You shouldn't have problems with your state-of-the-art GPU. Did you try the different command-line arguments as described here?

ghn9264 commented 1 year ago

If your GPU has less than 10 GB of memory, you can only use the small models. Start bark-gui with the `-smallmodels` argument.
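For reference, upstream Bark also honors an environment variable that forces the half-size checkpoints, which is useful when driving Bark from Python directly. A minimal sketch; the variable name comes from the suno-ai/bark README rather than from this thread, and the prompt text is made up:

```python
import os

# Upstream Bark (suno-ai/bark) loads its smaller checkpoints when this is
# set; it must be defined before the models are (pre)loaded.
os.environ["SUNO_USE_SMALL_MODELS"] = "True"

from bark import generate_audio, preload_models

preload_models()  # now loads the small text, coarse and fine models
audio = generate_audio("Hello from a GPU with less than 10 GB of VRAM.")
```

The `-smallmodels` launch argument should be the equivalent switch for bark-gui itself.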