Vision-CAIR / MiniGPT-4

Open-sourced codes for MiniGPT-4 and MiniGPT-v2 (https://minigpt-4.github.io, https://minigpt-v2.github.io/)
BSD 3-Clause "New" or "Revised" License

How to use DeepSpeed inference to start MiniGPT-4 on low-GPU-memory hardware? #112

Open yt7589 opened 1 year ago

yt7589 commented 1 year ago

I ran MiniGPT-4 on an Nvidia T4, which has 16 GB of memory. I could upload a picture, but when I asked a question about it, it reported CUDA out of memory. I want to use DeepSpeed inference to run MiniGPT-4, because DeepSpeed inference can offload parameters to CPU memory and swap them back to GPU memory when necessary. I wrote a shell script to launch inference with DeepSpeed, a DeepSpeed configuration file, and an initialization method to invoke DeepSpeed, but it failed to start correctly. Can anyone tell me how to do this?
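For reference, here is a minimal sketch of the kind of DeepSpeed configuration that enables CPU offloading of parameters (ZeRO stage 3 with `offload_param`), which is the mechanism usually used to fit a large model into 16 GB of GPU memory. This is an assumption about a plausible config, not the exact files I used or a tested MiniGPT-4 integration; the field names follow DeepSpeed's ZeRO config schema.

```python
import json

# Hypothetical DeepSpeed config: ZeRO stage 3 with parameters
# offloaded to pinned CPU memory, fp16 weights to halve GPU usage.
ds_config = {
    "train_batch_size": 1,          # inference only, single sample
    "fp16": {"enabled": True},
    "zero_optimization": {
        "stage": 3,                 # partition/offload full parameters
        "offload_param": {
            "device": "cpu",        # keep params in CPU RAM
            "pin_memory": True,     # pinned memory for faster transfers
        },
    },
}

# Write the config so it can be passed to the deepspeed launcher.
with open("ds_config.json", "w") as f:
    json.dump(ds_config, f, indent=2)
```

At startup, the model would then be wrapped with something like `deepspeed.initialize(model=model, config="ds_config.json")` (or `deepspeed.init_inference` for the pure inference path) before the demo's chat loop runs; exactly where to hook that into MiniGPT-4's `demo.py` is the part I could not get working.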

zzhanghub commented 1 year ago

I also want to learn how to use Deepspeed.

Xinzhe-Ni commented 1 year ago

I want to learn, too T^T