lyuchenyang / Macaw-LLM

Macaw-LLM: Multi-Modal Language Modeling with Image, Video, Audio, and Text Integration
Apache License 2.0
1.56k stars · 127 forks

How much GPU memory is needed to fine-tune the model? #17

Closed · aixiaodewugege closed this issue 1 year ago

aixiaodewugege commented 1 year ago

Can I fine-tune it on my dataset with 4 * 3090 GPUs?

lyuchenyang commented 1 year ago

Hi, based on my estimates and feedback from other people, I think you can run it on 4 Nvidia 3090 GPUs with FP16. Training may just be slow: the 3090 only has 24 GB of VRAM, so you will need to use a smaller batch size.
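For reference, here is a minimal PyTorch sketch of the two knobs mentioned above for fitting training into a 24 GB GPU: FP16 mixed precision plus a small per-GPU batch with gradient accumulation. This is not the repo's actual training script; the model, batch sizes, and accumulation steps are placeholders you would replace with Macaw-LLM's own setup.

```python
import torch
from torch import nn

# Illustrative only: a tiny stand-in model, not Macaw-LLM.
device = "cuda" if torch.cuda.is_available() else "cpu"
model = nn.Linear(4096, 4096).to(device)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-5)
scaler = torch.cuda.amp.GradScaler(enabled=(device == "cuda"))

micro_batch = 1   # small per-GPU batch to stay under 24 GB (placeholder value)
accum_steps = 8   # accumulate gradients to recover a larger effective batch

optimizer.zero_grad()
for step in range(32):
    x = torch.randn(micro_batch, 4096, device=device)  # dummy input
    # FP16 autocast on GPU (bfloat16 fallback on CPU, which does not support fp16 autocast)
    with torch.autocast(device_type=device,
                        dtype=torch.float16 if device == "cuda" else torch.bfloat16):
        loss = model(x).pow(2).mean() / accum_steps
    scaler.scale(loss).backward()
    if (step + 1) % accum_steps == 0:
        scaler.step(optimizer)
        scaler.update()
        optimizer.zero_grad()
```

With 4 GPUs you would additionally wrap the model in DistributedDataParallel (or use the launcher the repo already provides); the memory-saving logic per GPU stays the same.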