center-for-humans-and-machines / transformer-heads

Toolkit for attaching, training, saving, and loading new heads for transformer models
https://transformer-heads.readthedocs.io/en/latest/
MIT License

Is it possible to use this toolkit to train multiple LoRAs simultaneously? #1

Closed bingwork closed 5 months ago

bingwork commented 5 months ago

This is a great toolkit. I have read the notebook at https://github.com/center-for-humans-and-machines/transformer-heads/blob/main/notebooks/gpt2/joint_multitask_learning.ipynb.

However, it seems that there are multiple heads with a single LoRA. I would like to confirm whether it is possible to use this toolkit to train multiple LoRAs simultaneously.

@yannikkellerde

yannikkellerde commented 5 months ago

I do not really see what the point of this would be. The idea here is to fine-tune one model to be able to do multiple tasks. If you want one LoRA per task, you should train a LoRA individually for each task (no need to add multiple heads to a transformer).

Also, I am the sole maintainer of this project right now, so please do not ping anyone else.