facebookresearch / seamless_communication

Foundational Models for State-of-the-Art Speech and Text Translation

CUDA out of memory error on a 40GB GPU #421

Closed developeranalyser closed 5 months ago

developeranalyser commented 5 months ago

I have an A100 GPU with 40GB of VRAM, and I still get a CUDA out of memory error.

How much GPU VRAM is needed for fine-tuning? I use Colab Pro; is it possible there?

Isn't that excessive?

Thank you.
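For a rough sense of why 40 GB can run out during full fine-tuning: with plain Adam in fp32, each parameter costs about 16 bytes (4 for weights, 4 for gradients, 8 for the two optimizer moments), before counting activations. The parameter counts below (~2.3B for the large model, ~1.2B for medium) are taken from the model cards and are approximate:

```python
def finetune_vram_gib(n_params: float, bytes_per_param: int = 16) -> float:
    """Rough VRAM for full fine-tuning in fp32 with Adam:
    weights (4) + gradients (4) + Adam m/v states (8) bytes per parameter,
    NOT counting activations, which add more on top."""
    return n_params * bytes_per_param / 2**30

large = finetune_vram_gib(2.3e9)   # SeamlessM4T large, ~2.3B params (approx.)
medium = finetune_vram_gib(1.2e9)  # SeamlessM4T medium, ~1.2B params (approx.)
print(f"large:  {large:.1f} GiB")   # ~34 GiB before activations -> tight on 40 GB
print(f"medium: {medium:.1f} GiB")  # ~18 GiB, leaves headroom for activations
```

This is only a back-of-the-envelope sketch; mixed precision, gradient checkpointing, or freezing parts of the model all change the numbers substantially.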

Angelalilyer commented 5 months ago

I also encountered this issue while using the new code. May I ask if you have resolved it?

developeranalyser commented 5 months ago

> I also encountered this issue while using the new code. May I ask if you have resolved it?

Not yet. I'm still trying... :(

developeranalyser commented 5 months ago

> I also encountered this issue while using the new code. May I ask if you have resolved it?

Were you able to solve this?

zrthxn commented 5 months ago

The large version will crash even on a Colab A100. Maybe try using the medium version.
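A minimal sketch of loading the medium checkpoint with the repo's `Translator` inference API, based on the README examples. The card names (`seamlessM4T_medium`, `vocoder_36langs`) and the `predict` call shape are assumptions from the documentation at the time; check the current README before relying on them. This needs a CUDA GPU and downloads the model weights on first run, so it is not runnable as-is in a bare environment:

```python
def translate_with_medium(text: str, tgt_lang: str, src_lang: str = "eng") -> str:
    """Text-to-text translation with the medium checkpoint, which needs far
    less VRAM than the large model (~1.2B vs ~2.3B parameters)."""
    import torch
    from seamless_communication.inference import Translator  # pip install seamless_communication

    translator = Translator(
        "seamlessM4T_medium",   # medium model card (assumed name, see README)
        "vocoder_36langs",      # vocoder card paired with the medium model (assumed)
        device=torch.device("cuda:0"),
        dtype=torch.float16,    # fp16 weights halve memory vs fp32
    )
    text_output, _ = translator.predict(text, "t2tt", tgt_lang, src_lang=src_lang)
    return str(text_output[0])

if __name__ == "__main__":
    print(translate_with_medium("Hello, how are you?", "fra"))
```

If even this runs out of memory, lowering the batch size (for fine-tuning) or keeping `dtype=torch.float16` (for inference) are the usual first levers.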

developeranalyser commented 5 months ago

> The large version will crash even on a Colab A100. Maybe try using the medium version.

Do you have a better idea? I want to use the V2 features for just three languages. Can you kindly guide me on how to do that?

developeranalyser commented 5 months ago

> The large version will crash even on a Colab A100. Maybe try using the medium version.

How can I get the features of v2 with a smaller memory footprint, for specific languages, and still get results as good as v2's?

Do you know a better model for this job than v2?