Closed bingwork closed 5 months ago
I do not really see what the point of this would be. The idea here is to fine-tune one model so that it can perform multiple tasks. If you want one LoRA per task, you should simply train a separate LoRA for each task individually (there is no need to add multiple heads to the Transformer in that case).
Also, I am the sole maintainer of this project right now, so please do not ping anyone else.
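For illustration only (this is not the transformer-heads API, just a minimal sketch of the underlying idea): "one LoRA per task" means each task gets its own independent low-rank update W_eff = W_frozen + (alpha/r)·B·A over a shared frozen base weight, with nothing tying the adapters together.

```python
# Minimal LoRA-per-task sketch (hypothetical names, not the toolkit's API).
# Each task owns an independent (A, B) pair; the base weight is shared and frozen.
import numpy as np

rng = np.random.default_rng(0)

d_out, d_in, r, alpha = 8, 8, 2, 4
W_frozen = rng.standard_normal((d_out, d_in))  # shared base weight, never updated

def make_lora():
    """One low-rank adapter: A is random, B starts at zero, so the
    adapter contributes nothing until it is trained."""
    A = rng.standard_normal((r, d_in))
    B = np.zeros((d_out, r))
    return A, B

# One adapter per task, each trained separately on that task's data.
adapters = {task: make_lora() for task in ("summarize", "classify")}

def forward(x, task):
    A, B = adapters[task]
    W_eff = W_frozen + (alpha / r) * (B @ A)  # LoRA reparameterization
    return W_eff @ x

x = rng.standard_normal(d_in)
# With B == 0 (untrained), every task reduces exactly to the base model.
assert np.allclose(forward(x, "summarize"), W_frozen @ x)
```

Since the adapters share nothing but the frozen base, there is no benefit to training them "simultaneously" in one run; each can be trained and swapped in independently.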
This is a great toolkit. I have read the notebook at https://github.com/center-for-humans-and-machines/transformer-heads/blob/main/notebooks/gpt2/joint_multitask_learning.ipynb.
However, it trains multiple heads with a single LoRA. Can I confirm whether this toolkit can be used to train multiple LoRAs simultaneously?
@yannikkellerde