iamwangyabin opened 10 months ago
@waitzkin thanks for your great work. I have a machine with 4 × A6000 (48G each) — is that enough to train Vicuna-7B? I'm not sure whether the model needs to be explicitly split across the GPUs. Thanks in advance for the clarification. What's the memory size of your A100 — 40G or 80G? And what was its memory consumption during training?
It was about 40G, so a single A100 machine will be enough to train models with Vicuna-7B.
@waitzkin thanks a lot for your response ^-^. Will try.
Hello, I apologize for the delayed response. For training models with Vicuna-7B, a GPU with VRAM greater than 24GB is required. For training models with FlanT5-XL, a GPU with 24GB VRAM is sufficient.
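For a rough sense of where figures like ~40G (and the >24GB threshold) come from, here is a back-of-the-envelope sketch. The assumptions are mine, not from this thread: an fp16 frozen backbone with a small trainable adapter trained with Adam, activations excluded; the function name and the 0.1B adapter size are hypothetical.

```python
def finetune_vram_gb(frozen_params_b, trainable_params_b):
    """Rough VRAM estimate (GB) for parameter-efficient fine-tuning.

    Frozen weights are held in fp16 (2 bytes/param); each trainable
    parameter costs 2 bytes (weight) + 2 bytes (gradient) + 8 bytes
    (fp32 Adam moment buffers). Activations are excluded -- they grow
    with batch size and sequence length and explain why observed usage
    can be far above the weight footprint alone.
    """
    frozen_bytes = frozen_params_b * 1e9 * 2
    trainable_bytes = trainable_params_b * 1e9 * (2 + 2 + 8)
    return (frozen_bytes + trainable_bytes) / 1024**3

# Frozen 7B backbone plus a hypothetical 0.1B trainable adapter:
print(round(finetune_vram_gb(7.0, 0.1), 1))  # -> 14.2
```

Weights alone land around 14 GB, which is why a 24GB card is borderline for Vicuna-7B once activations and CUDA overhead are added, while the smaller FlanT5-XL fits comfortably.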