microsoft / LoRA

Code for loralib, an implementation of "LoRA: Low-Rank Adaptation of Large Language Models"
https://arxiv.org/abs/2106.09685
MIT License
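For reference, the core `loralib` workflow documented in this repo's README: swap in LoRA-augmented layers, freeze everything except the low-rank matrices, and save/load only the LoRA parameters. A minimal sketch:

```python
import torch
import torch.nn as nn
import loralib as lora

# Replace a regular nn.Linear with a LoRA-augmented one (rank r=16).
model = nn.Sequential(
    lora.Linear(768, 768, r=16),
    nn.ReLU(),
    nn.Linear(768, 3),
)

# Freeze every parameter that is not a LoRA matrix.
lora.mark_only_lora_as_trainable(model)

# ... train as usual ...

# Save only the (small) LoRA parameters.
torch.save(lora.lora_state_dict(model), "ckpt_lora.pt")

# To restore: load base weights first, then the LoRA checkpoint.
# strict=False because the LoRA file deliberately omits the frozen weights.
model.load_state_dict(torch.load("ckpt_lora.pt"), strict=False)
```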

LoRA adapter checkpoints not downloadable #141

Open kibru9399 opened 1 year ago

kibru9399 commented 1 year ago

I am not able to download the LoRA adapters for the NLU task this week. Is there another place I can find them?

edwardjhu commented 1 year ago

The checkpoints are hosted on GitHub. Which one are you having trouble downloading?

kibru9399 commented 1 year ago

Thank you for answering. I meant the ones on this page, but I was able to find them on your other page. I have one question about the MNLI (3-class classification) LoRA adapter: you said we should use deberta-v2-xxl (from Hugging Face) as the base model for this adapter, but the deberta-v2-xxl model on Hugging Face doesn't have a classifier head (for the 3 classes). How can we reproduce your result (97% accuracy) using your LoRA adapter if we can't find the classifier head? I hope that makes sense. The LoRA adapter covers only the attention matrices, so how are we supposed to get the weights of the classifier head for the MNLI task? The base deberta-v2-xxl doesn't have a trained classifier head for this task. Thank you for your attention.
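One plausible way to wire this up (a sketch, not a confirmed recipe: the checkpoint filename and its exact contents are assumptions, and key names may differ between the repo's forked `transformers` and the stock Hugging Face one): instantiate the base model with `num_labels=3`, which attaches a freshly initialized 3-way head, then load the LoRA checkpoint non-strictly and inspect what it did or didn't cover.

```python
import torch
from transformers import AutoModelForSequenceClassification

# Base model from the Hub; num_labels=3 attaches a freshly initialized
# 3-way classification head (entailment / neutral / contradiction).
model = AutoModelForSequenceClassification.from_pretrained(
    "microsoft/deberta-v2-xxlarge", num_labels=3
)

# Hypothetical filename: substitute whichever checkpoint you obtained.
lora_sd = torch.load("deberta_v2_xxlarge_lora_mnli.bin", map_location="cpu")

# strict=False tolerates the frozen base weights that the LoRA file omits.
missing, unexpected = model.load_state_dict(lora_sd, strict=False)

# If the classifier head was shipped in the checkpoint, no 'classifier.*'
# keys should remain missing; otherwise the head still needs fine-tuning.
print([k for k in missing if k.startswith("classifier")])
print(unexpected)
```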

xumingyu2021 commented 1 year ago

I also have trouble downloading all of the checkpoints on the page https://github.com/microsoft/LoRA/tree/main/examples/NLU, and am not able to find them on your other page.

Edenzzzz commented 11 months ago

> The checkpoints are hosted on GitHub. Which one are you having trouble downloading?

None of the checkpoints can be downloaded.

rensushan commented 9 months ago

None of the checkpoints on the page https://github.com/microsoft/LoRA/tree/main/examples/NLU can be downloaded.

zxchasing commented 8 months ago

> […] How can we reproduce your result (97% accuracy) using your LoRA adapter if we can't find the classifier head? […] The base deberta-v2-xxl doesn't have a trained classifier head for the MNLI task.

Hello, I am also not able to reproduce the MNLI accuracy described in the paper. What does the mention of using the LoRA adapter model mean here? Have you solved it by now?
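If you do get the weights assembled, here is a quick way to check matched-validation accuracy (a sketch: it assumes the `model` loaded as in the snippet above, and that the head was trained with GLUE's label order 0=entailment, 1=neutral, 2=contradiction):

```python
import torch
from datasets import load_dataset
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("microsoft/deberta-v2-xxlarge")
dataset = load_dataset("glue", "mnli", split="validation_matched")

model.eval()
correct = 0
for ex in dataset:
    inputs = tokenizer(ex["premise"], ex["hypothesis"],
                       truncation=True, return_tensors="pt")
    with torch.no_grad():
        pred = model(**inputs).logits.argmax(dim=-1).item()
    # Assumes the classifier head uses GLUE's MNLI label order.
    correct += int(pred == ex["label"])

print(f"MNLI-matched accuracy: {correct / len(dataset):.4f}")
```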

sorobedio commented 4 months ago

Can you share your LoRA checkpoints? I also need the checkpoints for RoBERTa base and large for a project. Thank you.

kibru9399 commented 4 months ago

I remember there is another link in the repo that still contains the weights for all models; please check thoroughly.
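If you do locate the file in the repo tree, one way to fetch it programmatically is shown below. This is a sketch with a placeholder path; the actual checkpoint location is not confirmed here.

```python
import urllib.request

# PLACEHOLDER path: substitute the real location of the checkpoint you
# find in the repo; this exact URL is not confirmed to exist.
url = ("https://github.com/microsoft/LoRA/raw/main/"
       "examples/NLU/<path-to-checkpoint>.bin")
urllib.request.urlretrieve(url, "lora_checkpoint.bin")
```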


drjohnsmiths commented 2 months ago

> […] The LoRA adapter covers only the attention matrices, so how are we supposed to get the weights of the classifier head for the MNLI task? The base deberta-v2-xxl doesn't have a trained classifier head for this task.

Hi, can anyone share which page the MNLI LoRA checkpoint for RoBERTa base can be downloaded from?