Alpha-VLLM / LLaMA2-Accessory

An Open-source Toolkit for LLM Development
https://llama2-accessory.readthedocs.io/

LargeDiT T2i - how much GPU memory is needed for sampling? #185

Open SapirW opened 2 months ago

SapirW commented 2 months ago

For 1024 resolution, I seem to get a CUDA OOM on an A100 40GB machine. Is that expected? How much memory is needed?

gaopengpjlab commented 2 months ago

LLaMA-7B uses a lot of GPU memory. Currently, an A100 80GB is required for inference.

We will soon release a model that uses CLIP or Gemma-2B instead. Please keep an eye on the final version of LargeDiT-T2I: https://github.com/Alpha-VLLM/Lumina-T2X/
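As a rough sanity check on why 40GB is not enough, the weights alone already dominate the budget: a 7B-parameter text encoder in fp16/bf16 takes about 13 GiB before the diffusion backbone, activations, and attention buffers at 1024 resolution are counted. The sketch below is only an estimate; the exact parameter counts and dtypes of the released checkpoints are assumptions, not values confirmed in this thread.

```python
def weight_gib(num_params: float, bytes_per_param: int) -> float:
    """Lower bound on GPU memory (GiB) for model weights alone.

    Ignores activations, KV caches, and framework overhead, which
    can add many more GiB during 1024-resolution sampling.
    """
    return num_params * bytes_per_param / 1024**3

# LLaMA-7B text encoder in fp16/bf16 (2 bytes per parameter): ~13 GiB
llama_7b = weight_gib(7e9, 2)

# Hypothetical 3B diffusion backbone in fp16 (assumed size): ~5.6 GiB
dit_3b = weight_gib(3e9, 2)

print(f"weights only: {llama_7b + dit_3b:.1f} GiB")
```

With roughly 19 GiB consumed by weights alone, the remaining headroom on a 40GB card is easily exhausted by high-resolution activations, which is consistent with the 80GB recommendation above.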