
Macaw-LLM: Multi-Modal Language Modeling with Image, Video, Audio, and Text Integration
Apache License 2.0

GPU Memory Requirement #9

Closed SJY8460 closed 1 year ago

SJY8460 commented 1 year ago

Thank you for your awesome work! I would like to know the minimum GPU memory required to run this project. Can it run on 2×3090 GPUs?

wanghao-cst commented 1 year ago

> Thank you for your awesome work! I would like to know the minimum GPU memory required to run this project. Can it run on 2×3090 GPUs?

It works for me~

lyuchenyang commented 1 year ago

Hi, thanks for your interest. I think the model can fit on two 3090 GPUs (2 × 24 GB VRAM) with FP16 weights.
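As a rough sanity check on the maintainer's answer, FP16 weight memory can be estimated as 2 bytes per parameter. The sketch below assumes a ~7B-parameter LLaMA-style language backbone (the parameter counts are illustrative, not measured from this repo's checkpoints); activations, KV cache, and any vision/audio encoders add overhead on top:

```python
def fp16_weight_gib(n_params: float) -> float:
    """GiB needed to hold FP16 weights only (2 bytes per parameter)."""
    return n_params * 2 / (1024 ** 3)

# Assumed, illustrative parameter counts -- not taken from the repo.
llm_gib = fp16_weight_gib(7e9)  # ~7B-parameter language backbone

print(f"FP16 weights for a 7B model: ~{llm_gib:.1f} GiB")
# Weights alone leave headroom within 2 x 24 GiB, consistent with
# the report that inference works on two 3090s.
```

This counts weights only; inference also needs activation and KV-cache memory, and training would need far more (gradients plus optimizer states), so the 2×3090 figure should be read as an inference setup.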