jypppppp opened this issue 10 months ago
Thank you for your interest in our work! The hyper-parameters of LoRA are as follows:
The seed list is {0, 21, 42, 81, 100}, and the batch_size is 8.
I hope my response helps you.
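For reference, here is a minimal sketch of how these settings could be wired into a Hugging Face PEFT fine-tuning loop. Only the seed list and the batch size of 8 come from the reply above; the base model (`roberta-base`), rank, alpha, dropout, target modules, learning rate, and epoch count are placeholder assumptions, not the values used in the paper.

```python
# Minimal sketch of a LoRA fine-tuning setup (transformers + peft).
# Only SEEDS and BATCH_SIZE come from the reply above; everything else
# (base model, rank, alpha, dropout, lr, epochs, target modules) is a placeholder.
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    TrainingArguments,
    Trainer,
)
from peft import LoraConfig, get_peft_model

SEEDS = [0, 21, 42, 81, 100]           # seed list from the reply
BATCH_SIZE = 8                         # batch size from the reply

lora_config = LoraConfig(
    r=8,                               # placeholder rank; vary per experiment
    lora_alpha=16,                     # placeholder scaling factor
    lora_dropout=0.1,                  # placeholder dropout
    target_modules=["query", "value"], # placeholder; depends on the base model
    task_type="SEQ_CLS",
)

for seed in SEEDS:
    tokenizer = AutoTokenizer.from_pretrained("roberta-base")
    model = AutoModelForSequenceClassification.from_pretrained("roberta-base")
    model = get_peft_model(model, lora_config)  # wrap base model with LoRA adapters

    args = TrainingArguments(
        output_dir=f"lora_seed_{seed}",
        per_device_train_batch_size=BATCH_SIZE,
        learning_rate=5e-4,            # placeholder; not from this thread
        num_train_epochs=10,           # placeholder; not from this thread
        seed=seed,
    )
    trainer = Trainer(model=model, args=args, train_dataset=None)  # pass your dataset here
    # trainer.train()
```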
Hi, I was wondering if you use the same learning rate for all the rank settings. Looking forward to your help :)
Hi,
Thanks for your good work!
Can you clarify the learning rate, batch size (bsz), and number of epochs used for the baseline LoRA experiments across the different datasets?
Kind regards,
Jason