Closed happisky-lyt closed 3 months ago
@happisky-lyt Hi, sorry to bother you, but I had the same problem. Have you solved it?
I first ran the demo on my own PC, and I set line 103 of geochat.model.builder.py
as follows:
model = GeoChatLlamaForCausalLM.from_pretrained(model_path, offload_folder="offload", low_cpu_mem_usage=True, **kwargs)
But during inference I hit an OOM error, so I uploaded the code to the server and got the error I mentioned before.
I believe the error was caused by setting offload_folder="offload", so I directly downloaded the repository on my server and ran inference successfully.
Hope this helps.
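For anyone who does want to keep disk offloading instead of moving to a bigger machine: in my experience, `offload_folder` only takes effect when accelerate is actually dispatching layers (e.g. with `device_map="auto"`), and the folder has to exist and be writable. Here is a minimal sketch of how I'd build the kwargs; `build_offload_kwargs` is a hypothetical helper, and `GeoChatLlamaForCausalLM` / `model_path` come from the GeoChat repo:

```python
import os

def build_offload_kwargs(offload_dir="offload"):
    """Build from_pretrained kwargs for a model that does not fit in GPU RAM.

    device_map="auto" lets accelerate split layers across GPU/CPU/disk;
    offload_folder is only used for layers that spill to disk.
    """
    os.makedirs(offload_dir, exist_ok=True)  # must exist and be writable
    return {
        "low_cpu_mem_usage": True,
        "device_map": "auto",
        "offload_folder": offload_dir,
    }

# Usage (not run here; requires the GeoChat checkpoint):
# model = GeoChatLlamaForCausalLM.from_pretrained(
#     model_path, **build_offload_kwargs()
# )
```

Note this is a sketch under those assumptions, not the repo's official loading path; passing `offload_folder` without a `device_map` would just be ignored or error out, which may be what happened in your run.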
I encountered the following issue when loading the checkpoint. Can you help me solve it?