Open joseph16388 opened 4 months ago
Can 16GB-VRAM run Chat-UniVi-7B-v1.5 model? thanks
16GB of VRAM isn't enough to train the model; you can try LoRA fine-tuning and reducing the batch size.
For inference, 16GB of VRAM is sufficient.
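A rough back-of-the-envelope estimate shows why inference fits but full training doesn't. Assuming fp16/bf16 weights (2 bytes per parameter), a 7B model needs roughly 13GB just for the weights; full fine-tuning adds gradients and optimizer states on top, which is why LoRA (which trains only small adapter matrices) is suggested instead. This is a sketch, not an exact measurement — activations and the KV cache add further overhead:

```python
# Rough VRAM estimate for a 7B-parameter model (weights only).
# Assumption: fp16/bf16 stores 2 bytes per parameter; activations and
# the KV cache add extra overhead not counted here.
def weight_memory_gb(n_params_billion: float, bytes_per_param: int = 2) -> float:
    """Memory needed to hold the model weights, in GiB."""
    return n_params_billion * 1e9 * bytes_per_param / 1024**3

fp16_gb = weight_memory_gb(7)       # fp16 inference: ~13 GB, fits in 16 GB
fp32_gb = weight_memory_gb(7, 4)    # fp32: ~26 GB, already too large
print(f"fp16 weights: {fp16_gb:.1f} GB, fp32 weights: {fp32_gb:.1f} GB")
```

For full fine-tuning with an Adam-style optimizer, gradients plus fp32 optimizer states multiply the per-parameter cost several times over, which pushes a 7B model well past 16GB — hence the suggestion to use LoRA and a smaller batch size.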