facebookresearch / seamless_communication

Foundational Models for State-of-the-Art Speech and Text Translation

finetuning required GPU #341

Open zouberou-sayibou opened 10 months ago

zouberou-sayibou commented 10 months ago

torch.cuda.OutOfMemoryError: CUDA out of memory. Tried to allocate 318.00 MiB. GPU 0 has a total capacty of 6.00 GiB of which 0 bytes is free. Including non-PyTorch memory, this process has 17179869184.00 GiB memory in use. Of the allocated memory 5.00 GiB is allocated by PyTorch, and 271.88 MiB is reserved by PyTorch but unallocated. If reserved but unallocated memory is large try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF

I got this error, but the amount of GPU memory required by the code seems ridiculous. Is anyone else having the same problem?
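For what it's worth, the allocator hint at the end of the error message can be applied by setting PYTORCH_CUDA_ALLOC_CONF before torch is imported. This only mitigates fragmentation of the caching allocator; it does not shrink the memory the model itself needs, so it may not be enough on a 6 GiB card. A minimal sketch:

```python
import os

# Apply the hint suggested in the error message *before* importing torch.
# max_split_size_mb limits how large cached blocks can grow, which reduces
# fragmentation; it does not lower the model's total memory requirement.
os.environ["PYTORCH_CUDA_ALLOC_CONF"] = "max_split_size_mb:128"

import torch  # imported after the env var on purpose

print(torch.cuda.get_device_properties(0).total_memory / 1024**3, "GiB total")
```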

210023 commented 8 months ago

I have a similar problem. My GPU is a GeForce 4050 with a total capacity of 6 GiB; I am not sure if our problems are of the same type.

OutOfMemoryError: CUDA out of memory. Tried to allocate 1.36 GiB. GPU 0 has a total capacty of 5.77 GiB of which 284.31 MiB is free. Including non-PyTorch memory, this process has 5.48 GiB memory in use. Of the allocated memory 5.34 GiB is allocated by PyTorch, and 22.94 MiB is reserved by PyTorch but unallocated. If reserved but unallocated memory is large try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF

Maybe I should somehow tell seamless to do it more slowly?
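One way to "do it more slowly" is to keep the effective batch size but feed the GPU smaller micro-batches and accumulate gradients between optimizer steps. The sketch below is a generic PyTorch pattern, not the project's actual finetuning loop; the seamless finetuning recipe may or may not expose an equivalent option (check its --help for a batch-size flag):

```python
import torch
import torch.nn as nn

# Toy stand-ins: in practice the model and batches come from the finetuning recipe.
model = nn.Linear(512, 512).cuda()
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
loss_fn = nn.MSELoss()

accum_steps = 4  # 4 micro-batches of 8 behave like one batch of 32, at ~1/4 the peak activation memory
micro_batches = [(torch.randn(8, 512).cuda(), torch.randn(8, 512).cuda())
                 for _ in range(accum_steps)]

optimizer.zero_grad()
for i, (x, y) in enumerate(micro_batches):
    loss = loss_fn(model(x), y) / accum_steps  # scale so the summed gradient matches a full batch
    loss.backward()                            # gradients accumulate across micro-batches
    if (i + 1) % accum_steps == 0:
        optimizer.step()
        optimizer.zero_grad()
```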

FAb7D commented 3 months ago

Same here, but with the medium model, and using it right after turning on the PC, it works.
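If the large checkpoint does not fit on a 6 GiB card, switching to the medium checkpoint and half precision is the most direct saving for inference. A minimal sketch, assuming the Translator API and the seamlessM4T_medium / vocoder_36langs card names from the repo README (the import path and predict signature may differ between releases, so adjust to your installed version):

```python
import torch
from seamless_communication.inference import Translator  # older releases: seamless_communication.models.inference

# Medium model + fp16 has a much smaller footprint than the large checkpoint in fp32.
translator = Translator(
    "seamlessM4T_medium",
    vocoder_name_or_card="vocoder_36langs",
    device=torch.device("cuda:0"),
    dtype=torch.float16,
)

# Text-to-text translation example (English -> French).
text_output, _ = translator.predict(
    "Hello, how are you?",
    task_str="t2tt",
    tgt_lang="fra",
    src_lang="eng",
)
print(text_output[0])
```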