Closed: Cohejh closed this issue 2 weeks ago.
https://github.com/oobabooga/text-generation-webui/wiki
Training is supported for the Transformers and GPTQ-for-LLaMa loaders. It is not possible with the Exllamav2HF loader you are using.
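As I understand it, LoRA training in the webui goes through the PEFT library on top of a model loaded with the Transformers loader, which is why that loader is required. Below is only a minimal sketch of that kind of PEFT setup; the model name and LoRA hyperparameters are placeholders, not the webui's actual defaults.

```python
# Sketch only: PEFT-based LoRA training setup on a Transformers model.
# The model name and hyperparameter values are placeholder assumptions.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

model_name = "facebook/opt-350m"  # placeholder; use the model loaded with the Transformers loader
model = AutoModelForCausalLM.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)

lora_config = LoraConfig(
    r=8,                                   # LoRA rank (placeholder value)
    lora_alpha=16,                         # scaling factor (placeholder value)
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],   # attention projections; depends on the model architecture
    task_type="CAUSAL_LM",
)

model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # confirms only the LoRA adapter weights are trainable
```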
Ok, thank you very much for your help
This issue has been closed due to inactivity for 6 months. If you believe it is still relevant, please leave a comment below. You can tag a developer in your comment.
Describe the bug
Whenever I try to train a LoRA, the process breaks when it tries to reload the model.
Is there an existing issue for this?
Reproduction
Screenshot
No response
Logs
System Info
Windows 11, NVIDIA GeForce RTX 3060.