AI4Finance-Foundation / FinGPT

FinGPT: Open-Source Financial Large Language Models! Revolutionize 🔥 We release the trained model on HuggingFace.
https://ai4finance.org
MIT License

CUDA out of memory on Google Colab when trying to run beginners notebook #136

Open mithril9 opened 7 months ago

mithril9 commented 7 months ago

Hi,

I keep getting

OutOfMemoryError: CUDA out of memory. Tried to allocate 508.00 MiB. GPU 0 has a total capacty of 15.77 GiB of which 30.12 MiB is free. Process 44331 has 15.74 GiB memory in use. Of the allocated memory 14.89 GiB is allocated by PyTorch, and 1.11 MiB is reserved by PyTorch but unallocated. If reserved but unallocated memory is large try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF

When trying to run

model = AutoModel.from_pretrained(
    model_name,
    quantization_config=q_config,
    trust_remote_code=True,
    device='cuda'
)

I have paid for 100 compute units and am using an A100 GPU as the session type. I also tried reducing the batch size from 4 to 1, but that didn't help.
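
For reference, a minimal sketch of the quantized load plus the allocator setting that the error message itself suggests. The model name is a placeholder, the NF4 settings are an assumption rather than the notebook's actual q_config, and device_map="auto" is the standard accelerate-based placement used here instead of device='cuda':

```python
import os
import torch
from transformers import AutoModel, BitsAndBytesConfig

# Per the error message's own suggestion: cap the allocation block size to reduce
# fragmentation. Must be set before the first CUDA allocation of the session.
os.environ["PYTORCH_CUDA_ALLOC_CONF"] = "max_split_size_mb:128"

model_name = "THUDM/chatglm2-6b"  # placeholder; use the notebook's model_name

# 4-bit NF4 quantization keeps a ~6B-parameter model well under 16 GB of VRAM.
q_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.float16,
)

model = AutoModel.from_pretrained(
    model_name,
    quantization_config=q_config,
    trust_remote_code=True,
    device_map="auto",  # let accelerate place the layers instead of device='cuda'
)
```

If the session already holds a partially loaded model from an earlier attempt, restarting the runtime before re-running the cell frees that memory as well.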

mithril9 commented 7 months ago

The above error occurs when running your beginners Colab notebook.

Weiyao-Li commented 4 months ago

Please try reducing the batch size or calling torch.cuda.empty_cache(). You can also use nvidia-smi to monitor what's going on and adjust the model to fit your GPU. I have run the beginner script before. You may also refer to my repo and articles here: https://github.com/AI4Finance-Foundation/FinGPT-Research. Hope this helps.
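
A minimal sketch of those suggestions, assuming a standard transformers TrainingArguments setup (the output directory and accumulation steps below are hypothetical, not the notebook's actual values):

```python
import torch
from transformers import TrainingArguments

# Release cached blocks left over from earlier cells in the same Colab session.
torch.cuda.empty_cache()

training_args = TrainingArguments(
    output_dir="./finetuned_model",   # hypothetical path
    per_device_train_batch_size=1,    # reduced from the notebook's 4
    gradient_accumulation_steps=4,    # keeps the effective batch size at 4
    fp16=True,                        # half-precision training saves activation memory
)
```

From a Colab cell, !nvidia-smi shows which GPU is actually attached and how much memory each process holds. Note that the 15.77 GiB total reported in the error above matches a 16 GB card, so it is also worth confirming that a 40 GB A100 was really assigned to the session.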

edwarts commented 1 month ago

I have the same out-of-memory issue when running on my 4090.