EricLBuehler / xlora

X-LoRA: Mixture of LoRA Experts
Apache License 2.0

request for the training code #30

Open leonardxie opened 3 months ago

leonardxie commented 3 months ago

Hi, thank you for your excellent work. Do you have any plans to share the training code? I want to reproduce the training, but it raises the following error:

RuntimeError: Trying to backward through the graph a second time (or directly access saved tensors after they have already been freed). Saved intermediate values of the graph are freed when you call .backward() or autograd.grad(). Specify retain_graph=True if you need to backward through the graph a second time or if you need to access saved tensors after calling backward.
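For context (not from the X-LoRA repo itself), this is a generic PyTorch autograd error: calling `.backward()` twice on the same graph fails because the saved intermediate tensors are freed after the first call. A minimal sketch reproducing it, plus the two usual workarounds:

```python
import torch

# Minimal reproduction of the RuntimeError above (not X-LoRA-specific).
x = torch.tensor([2.0], requires_grad=True)
y = x * x

y.backward()      # first backward: OK; graph buffers are freed afterwards
try:
    y.backward()  # second backward on the same (freed) graph
except RuntimeError as e:
    print("RuntimeError:", e)

# Workaround 1: keep the graph alive for a second backward pass.
y2 = x * x
y2.backward(retain_graph=True)
y2.backward()  # succeeds now

# Workaround 2 (often the real fix in training loops): detach any tensor
# carried across iterations (e.g. a cached hidden state), so each step
# builds a fresh graph instead of re-entering the old one.
cached_state = y2.detach()
```

In a `Trainer`-style loop, hitting this error usually means some tensor (a cached activation or scaling value) is being reused across steps without `.detach()`, rather than that `retain_graph=True` is actually needed.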

XiaoYiWeio commented 2 months ago

me too

marcio-afr commented 2 months ago

I also tried to run a training script using the Trainer class from Hugging Face, and I faced several issues and errors, including: