JarodMica / ai-voice-cloning


torch.cuda.OutOfMemoryError: CUDA out of memory. #62


nabnabdave commented 8 months ago

Hi,

I get this error message:

```
return t.to(device, dtype if t.is_floating_point() or t.is_complex() else None, non_blocking)
torch.cuda.OutOfMemoryError: CUDA out of memory. Tried to allocate 12.00 MiB. GPU 0 has a total capacty of 2.00 GiB of which 0 bytes is free. Of the allocated memory 472.67 MiB is allocated by PyTorch, and 3.33 MiB is reserved by PyTorch but unallocated. If reserved but unallocated memory is large try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
```
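
The error text itself suggests setting `max_split_size_mb` via the `PYTORCH_CUDA_ALLOC_CONF` environment variable. A minimal sketch of one way to do that, assuming you control the launching script; the 128 MiB value is illustrative, not a recommendation:

```python
import os

# PYTORCH_CUDA_ALLOC_CONF must be set before the CUDA caching allocator
# initializes, so export it before importing torch. 128 is illustrative;
# tune it for your GPU.
os.environ["PYTORCH_CUDA_ALLOC_CONF"] = "max_split_size_mb:128"

import torch

# Confirm how much VRAM is actually free before loading any model.
free_bytes, total_bytes = torch.cuda.mem_get_info()
print(f"free: {free_bytes / 2**20:.0f} MiB / total: {total_bytes / 2**20:.0f} MiB")
```

Note that this only mitigates fragmentation; on a 2 GiB card the underlying problem may simply be insufficient VRAM.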

RealCalumPlays commented 7 months ago

This means you either do not have enough VRAM to run inference, or, if you are prepping your dataset, you may need to choose a smaller Whisper model.
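
For the dataset-prep case, a minimal sketch of picking a smaller Whisper checkpoint, assuming the standalone openai-whisper package (the actual transcription step in ai-voice-cloning may wire this differently, and the audio path below is hypothetical):

```python
import whisper

# Smaller checkpoints ("tiny", "base", "small") need far less VRAM than
# "medium" or "large" and are more likely to fit on a 2 GiB GPU.
model = whisper.load_model("base")  # instead of e.g. "large-v2"

# "dataset/clip_001.wav" is a hypothetical path for illustration.
result = model.transcribe("dataset/clip_001.wav")
print(result["text"])
```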