Open kibru9399 opened 1 year ago
The checkpoints are hosted on GitHub. Which one are you having trouble downloading?
Thank you for answering. I meant the ones on this page, but I was able to find them on your other page. I have one question about the MNLI (3-class classification) LoRA adapter: you said we should use deberta-v2-xxl (from Hugging Face) as the base model for the adapter, but the deberta-v2-xxl model on Hugging Face doesn't have a classifier head for the three classes. How can we reproduce your result (97% accuracy) with your LoRA adapter if we can't find the classifier head? I hope what I am saying makes sense. The LoRA adapter only covers the attention matrices, so where are we supposed to get the classifier head weights for the MNLI task? The base deberta-v2-xxl doesn't have a trained classifier head for this task. Thank you for your attention.
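For context on why the classifier head matters: LoRA replaces a frozen weight `W` with `W + (alpha/r) * B @ A`, where only the small `A`/`B` matrices (here, on the attention projections) are stored in the adapter checkpoint. The classification head is a separate module that is not part of the low-rank update, so it has to be shipped and loaded alongside the adapter. A minimal numpy sketch of the merge (all shapes and values hypothetical, not taken from the repo's checkpoints):

```python
import numpy as np

rng = np.random.default_rng(0)

d, r, alpha = 8, 2, 16           # hidden size, LoRA rank, scaling -- illustrative values
W = rng.standard_normal((d, d))  # frozen base attention weight (from the base model)
A = rng.standard_normal((r, d))  # LoRA down-projection (from the adapter checkpoint)
B = np.zeros((d, r))             # LoRA up-projection (zero-initialized before training)

# Effective weight at inference time. Note that nothing here touches the
# classifier head: its weights must come from somewhere else, e.g. bundled
# in the released checkpoint or fine-tuned separately.
W_merged = W + (alpha / r) * (B @ A)

# With B still at its zero init, the merge is a no-op and the merged model
# behaves exactly like the base model.
assert np.allclose(W_merged, W)
```

If the released NLU checkpoints contain the classification-head weights in addition to the `A`/`B` matrices, loading them with something like PyTorch's `load_state_dict(..., strict=False)` over the base model would restore both; that depends on what the maintainers actually packaged, which this thread is asking about.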
I also have trouble downloading all of the checkpoints on the page https://github.com/microsoft/LoRA/tree/main/examples/NLU, and I am not able to find them on your other page.
None of the checkpoints on the page https://github.com/microsoft/LoRA/tree/main/examples/NLU can be downloaded.
Hello, I am not able to reproduce the MNLI task with the accuracy described in the paper. What does "using the LoRA adapter model" mean, exactly? Have you solved it now?
Can you share your LoRA checkpoints? I also need the checkpoints for the base and large RoBERTa models for a project. Thank you.
I remember there is another link in the repo that still contains the weights for all models; please check thoroughly.
Hi, can anyone share which page the LoRA checkpoint of MNLI for RoBERTa-base can be downloaded from?
I am not able to download the LoRA adapters for the NLU task this week; is there any other place I can find them?