francis2tm opened this issue 1 year ago
Hello! Clearly my PC is not as juicy as yours. I'm attempting to load the model with `LlamaForCausalLM.from_pretrained()`, but the script crashes because I don't have enough RAM (I only have 16 GB). Is there any way to load it with only 16 GB?

Thanks in advance
It seems LoRA training needs 18 GB: https://twitter.com/nash_su/status/1637423768665718784 Does `generate.py` also need 18 GB of VRAM?
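For what it's worth, the weights alone dominate memory here: a 7B-parameter model takes roughly 13 GiB in fp16, which leaves almost no headroom on a 16 GB machine once activations and the OS are counted. One commonly suggested workaround is 8-bit quantized loading with layer offloading. The sketch below is hedged: the `load_in_8bit` and `device_map="auto"` arguments exist in recent `transformers` versions (and require `bitsandbytes` and `accelerate` installed), but the exact version requirements and the `model_path` are assumptions, not something confirmed in this thread.

```python
def est_param_memory_gib(n_params: float, bits: int) -> float:
    """Rough memory for the weights alone (excludes activations and KV cache)."""
    return n_params * bits / 8 / 2**30

# 7B params: ~13 GiB at fp16, ~6.5 GiB at int8 -- int8 fits in 16 GB, fp16 barely does.
print(f"fp16: {est_param_memory_gib(7e9, 16):.1f} GiB")
print(f"int8: {est_param_memory_gib(7e9, 8):.1f} GiB")

def load_llama_8bit(model_path: str):
    """Hypothetical loader sketch: quantize weights to int8 and let
    `accelerate` place layers across GPU/CPU automatically.
    Requires `bitsandbytes` and a transformers version supporting `load_in_8bit`."""
    import torch
    from transformers import LlamaForCausalLM

    return LlamaForCausalLM.from_pretrained(
        model_path,
        load_in_8bit=True,   # int8 weights: halves memory vs fp16
        torch_dtype=torch.float16,
        device_map="auto",   # spill layers to CPU if the GPU is too small
    )
```

The same estimate explains the 18 GB figure for LoRA training: on top of the base weights you also hold optimizer state and gradients for the adapter layers, plus activations.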