Open J-Korn opened 3 months ago
While trying to fine-tune the openai/whisper-medium model on the google/fleurs dataset, even when using only one language (Greek), I very quickly run out of VRAM on a 20 GB GPU.
Is there a way to reduce the VRAM consumption?
@J-Korn Please use a lower batch size.
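Beyond lowering the batch size, a few standard knobs can cut VRAM further. A minimal sketch, assuming the fine-tuning uses the Hugging Face `Seq2SeqTrainer` (the exact setup is not shown in the issue, so these argument values are illustrative, not the reporter's actual config):

```python
from transformers import Seq2SeqTrainingArguments

# Illustrative settings to reduce VRAM use when fine-tuning Whisper.
training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-medium-el",      # hypothetical output path
    per_device_train_batch_size=4,         # smaller per-step batch
    gradient_accumulation_steps=4,         # keeps effective batch size at 4 * 4 = 16
    gradient_checkpointing=True,           # trade compute for activation memory
    fp16=True,                             # half-precision training (on supported GPUs)
)
```

Gradient accumulation keeps the effective batch size (`per_device_train_batch_size * gradient_accumulation_steps`) constant while holding fewer samples in memory at once, and gradient checkpointing recomputes activations in the backward pass instead of storing them, which typically gives the largest memory savings for models of whisper-medium's size.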