Ethantequila opened this issue 1 year ago
Thank you for your interest in our work. Short answer is it needs 20G memory. Longer answer is here: https://github.com/mlpc-ucsd/BLIVA/issues/3
Thanks for your quick reply! I am wondering how much GPU memory is needed to run BLIVA_Vicuna 7B with INT8?
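For reference, a generic 8-bit load with transformers + bitsandbytes looks roughly like the sketch below; I am not sure whether BLIVA's own loader exposes this, and the checkpoint id here is just a placeholder:

```python
# Sketch only: generic INT8 loading via transformers + bitsandbytes.
# BLIVA's actual loading path may differ; the checkpoint id is a placeholder.
from transformers import AutoModelForCausalLM, BitsAndBytesConfig

quant_config = BitsAndBytesConfig(load_in_8bit=True)  # store weights in INT8

model = AutoModelForCausalLM.from_pretrained(
    "some-org/some-vicuna-7b-checkpoint",  # placeholder, not the real BLIVA id
    quantization_config=quant_config,
    device_map="auto",  # let accelerate place layers on the available GPU
)
```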
Hi, I tried to run the BLIVA_Vicuna 7B demo on my local machine (V100, 16 GB) and it fails with an OOM error: torch.cuda.OutOfMemoryError: CUDA out of memory. Tried to allocate 86.00 MiB (GPU 0; 15.77 GiB total capacity; 12.18 GiB already allocated; 54.88 MiB free; 12.38 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
So what is the minimum amount of GPU memory required? Or is there any way to reduce GPU usage?
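The error message itself suggests setting max_split_size_mb; a minimal sketch of that is below (the 128 value is just a guess, and this only reduces fragmentation rather than the ~12 GiB already allocated for model weights):

```python
# Suggested by the OOM message: limit allocator block splitting to reduce fragmentation.
# Note: this does not shrink the ~12 GiB of model weights, so a 16 GB V100
# may still run out of memory without something like INT8 loading.
import os
os.environ["PYTORCH_CUDA_ALLOC_CONF"] = "max_split_size_mb:128"  # 128 is an assumed value

import torch  # import after setting the env var so the CUDA allocator picks it up
print(torch.cuda.get_device_properties(0).total_memory // 2**20, "MiB total")
```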
Thanks a lot!