Alpha-VLLM / LLaMA2-Accessory

An Open-source Toolkit for LLM Development
https://llama2-accessory.readthedocs.io/

saving and loading multiple lora weights #99

Closed: wj210 closed this issue 9 months ago

wj210 commented 10 months ago

How can I save multiple LoRA weights and then load them sequentially without training over the previous LoRA weights? For example, suppose I want to train a pretrained LLaMA on three tasks, A, B, and C, sequentially with LoRA. The first model is saved with the only_save_trainable flag set to False, so the resulting model has additional weights lora_a and lora_b in each of the blocks. How can I subsequently train on tasks B and C without overwriting the previous LoRA weights?
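For what it's worth, here is a minimal sketch of the saving/loading part in plain PyTorch. This is not LLaMA2-Accessory's own checkpointing API; the helper names and file name are made up, and it just filters the state dict by the lora_a / lora_b naming mentioned above:

```python
import torch

def save_lora_weights(model, path):
    # Keep only the LoRA parameters, identified by the lora_a / lora_b
    # naming used in the blocks.
    lora_state = {k: v for k, v in model.state_dict().items() if "lora_" in k}
    torch.save(lora_state, path)

def load_lora_weights(model, path):
    # strict=False loads just the adapter tensors and leaves every other
    # weight (the base model and any other adapters) untouched.
    model.load_state_dict(torch.load(path), strict=False)

# e.g. after finishing task A:
# save_lora_weights(model, "lora_task_A.pth")
```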

The goal is to do multi-step inference where the final output is derived sequentially from tasks A, B, and C, rather than updating the same adapter weights after each task.

The only way I can think of is to initialize separate LoRA weights for each task and then sum up their modifications to the original output?
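Something like the following is what I mean, as a sketch under generic PyTorch assumptions (MultiLoRALinear and its parameters are hypothetical, not classes from this repo): one frozen base Linear plus an independent lora_a / lora_b pair per task, with the deltas of the chosen adapters summed onto the base output.

```python
import torch
import torch.nn as nn

class MultiLoRALinear(nn.Module):
    # Hypothetical wrapper: a frozen base Linear plus an independent
    # (lora_a, lora_b) pair per task; the deltas of the active adapters
    # are summed onto the base output.
    def __init__(self, base: nn.Linear, tasks=("A", "B", "C"), rank=8):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad = False  # the pretrained weights stay frozen
        self.lora_a = nn.ModuleDict(
            {t: nn.Linear(base.in_features, rank, bias=False) for t in tasks}
        )
        self.lora_b = nn.ModuleDict(
            {t: nn.Linear(rank, base.out_features, bias=False) for t in tasks}
        )
        for t in tasks:
            nn.init.zeros_(self.lora_b[t].weight)  # each adapter starts as a no-op

    def forward(self, x, active_tasks=("A",)):
        out = self.base(x)
        for t in active_tasks:
            # each adapter contributes an additive low-rank update
            out = out + self.lora_b[t](self.lora_a[t](x))
        return out
```

When training task B, one would freeze the A pair (set requires_grad = False on lora_a["A"] and lora_b["A"]) and optimize only the B pair; at inference, each adapter could be activated in turn for the sequential multi-step setup, or all three together.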

wj210 commented 9 months ago

Does anyone have a clue on this?