Clin0212 / HydraLoRA

[NeurIPS'24 Oral] HydraLoRA: An Asymmetric LoRA Architecture for Efficient Fine-Tuning

The position of the HydraLoRA module in the model? #6

Closed WonderLandxD closed 5 days ago

WonderLandxD commented 5 days ago

Hello, thank you for your contribution. I would like to ask whether your HydraLoRA module runs in parallel with each transformer module, or whether it is only added to the q and v matrices in the transformer. The layout in your Fig. 4B seems to show it being computed in parallel with each transformer module.

Clin0212 commented 5 days ago

Thank you for your question. Fig. 4B is intended to better visualize how the MoE routing over the B matrices works during inference. Like vanilla LoRA, HydraLoRA attaches adapters to each linear layer within the FFN layers.
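
For readers landing here, below is a minimal PyTorch sketch of what "adapters in each linear layer within the FFN layers" can look like with HydraLoRA's asymmetric structure: one shared A down-projection, several B up-projections, and a router that mixes the B heads per token. The class name `HydraLoRALinear`, the `num_b` and `router` names, and all hyperparameters are illustrative assumptions, not this repository's actual API; see the repo code for the real implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class HydraLoRALinear(nn.Module):
    """Illustrative HydraLoRA-style adapter around one linear layer
    (e.g. an FFN projection): a single shared A down-projection and
    several B up-projections mixed by a learned router."""

    def __init__(self, base_linear: nn.Linear, r: int = 8,
                 num_b: int = 3, alpha: float = 16.0):
        super().__init__()
        self.base = base_linear
        for p in self.base.parameters():   # pretrained weights stay frozen
            p.requires_grad_(False)
        in_f, out_f = base_linear.in_features, base_linear.out_features
        self.scaling = alpha / r
        # Shared low-rank down-projection A: (r, in_f)
        self.lora_A = nn.Parameter(torch.empty(r, in_f))
        nn.init.kaiming_uniform_(self.lora_A, a=5 ** 0.5)
        # num_b up-projections B: (num_b, out_f, r), zero-init so the
        # adapter contributes nothing before training
        self.lora_B = nn.Parameter(torch.zeros(num_b, out_f, r))
        # Router producing per-token mixture weights over the B heads
        self.router = nn.Linear(in_f, num_b, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        base_out = self.base(x)
        z = F.linear(x, self.lora_A)                   # (..., r)
        gates = torch.softmax(self.router(x), dim=-1)  # (..., num_b)
        # Per-head up-projection, then mix: sum_k gate_k * (z @ B_k^T)
        heads = torch.einsum('...r,kor->...ko', z, self.lora_B)
        delta = torch.einsum('...ko,...k->...o', heads, gates)
        return base_out + self.scaling * delta

# Hypothetical usage: wrap one FFN projection of a transformer block
ffn_up = nn.Linear(4096, 11008)
layer = HydraLoRALinear(ffn_up, r=8, num_b=3)
out = layer(torch.randn(2, 16, 4096))  # -> (2, 16, 11008)
```

The key contrast with vanilla LoRA is that A is shared while the B matrices are duplicated and routed, which is what Fig. 4B depicts.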

WonderLandxD commented 5 days ago

> Thank you for your question. Fig. 4B is intended to better visualize how the MoE routing over the B matrices works during inference. Like vanilla LoRA, HydraLoRA attaches adapters to each linear layer within the FFN layers.

I understand, thanks for your answer :)