Hi @BenjaminBossan
I think this logic should cover most of the cases. I haven't added prompt learning techniques here as they don't have a concrete prefix. Please let me know if I am missing any edge cases.
Yes @BenjaminBossan, let me know which test cases should be handled and I will add them.
I checked our existing tests and I think it would actually be fine to extend one of them and check that the missing_keys are empty. Take a look at this one:
There we're just loading two adapters without any further checks (the test only verifies that there is no error). We could assign the load_result and assert that there are no missing keys for both of them. WDYT?
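A minimal sketch of what that extension could look like, assuming the test loads the adapters via load_adapter (which returns a load result with a missing_keys attribute) and assuming the post-PR behavior where only adapter-specific keys are reported as missing; the model id, paths, and adapter names below are placeholders, not the ones in PEFT's actual test suite:

```python
from transformers import AutoModelForCausalLM
from peft import LoraConfig, PeftModel, get_peft_model


def test_load_multiple_adapters(tmp_path):
    base_model_id = "facebook/opt-125m"  # placeholder base model

    # Create and save two (identical) adapters so the sketch is self-contained.
    model = get_peft_model(AutoModelForCausalLM.from_pretrained(base_model_id), LoraConfig())
    model.save_pretrained(str(tmp_path / "adapter_0"))
    model.save_pretrained(str(tmp_path / "adapter_1"))
    del model

    # Reload the base model with the first adapter, then load the others on top.
    base_model = AutoModelForCausalLM.from_pretrained(base_model_id)
    model = PeftModel.from_pretrained(base_model, str(tmp_path / "adapter_0"))

    for adapter_name, path in [("other", tmp_path / "adapter_0"), ("yet_another", tmp_path / "adapter_1")]:
        # Instead of only checking that loading raises no error, assign the load
        # result and assert that no adapter keys were reported as missing.
        load_result = model.load_adapter(str(path), adapter_name=adapter_name)
        assert not load_result.missing_keys
```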
Apart from that, I just noticed that after your last pushes, when I go to https://github.com/huggingface/peft/pull/2084/files, for some reason I see 27 changed files (even though it only says 1 file on this page). Not sure what's going on. Maybe it's a GitHub bug, but it makes it super hard to review the PR. I would suggest waiting a day to see if it goes away by itself. If not, you may need to rebase on the latest main, or, if that doesn't help, create a fresh PR. Again, let's wait a day and only try those options if the error is still there then.
Hmm, I still see 27 changed files :-/ Could you please try rebasing? Otherwise, we may need a new PR with just the relevant changes.
Rebased the branch correctly and added test cases. Please review the PR.
Done with changes.
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
As you can see, the CI is failing because the test does not pass for VB-LoRA. The reason is that the vector bank in VB-LoRA is shared among all layers, so there isn't an individual one for each layer. I don't think there is an easy way to account for that in the code changes you made. My suggestion is therefore to simply skip the test for VB-LoRA, adding a comment explaining why. Could you please adjust the test accordingly?
To check locally that the error is fixed, just run pytest tests/ -k test_load_multiple_adapters -v.
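For reference, a rough sketch of what such a skip could look like, assuming the test is parameterized over config classes; the exact test signature and parameter names in the PEFT suite will differ:

```python
import pytest
from peft import VBLoRAConfig


def test_load_multiple_adapters(config_cls, config_kwargs, tmp_path):
    if config_cls == VBLoRAConfig:
        # VB-LoRA shares a single vector bank across all layers instead of keeping
        # per-layer adapter parameters, so the per-layer missing-keys check does not apply.
        pytest.skip("VB-LoRA shares its vector bank across layers; missing-keys check not applicable")
    ...  # rest of the test body
```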
Done, skipped the VB-LoRA test case and added an appropriate comment.
Yup, thanks @BenjaminBossan for guiding me. I would be happy to help with any other issues or feature requests :hugs:
Sounds good, thanks! You can watch the PEFT issues for potential contributions, especially those with the contributions-welcome tag.
Refer to #1932 for more context.
The main logic is that each adapter type has a specific prefix for its keys; if that prefix is present in any of the missing keys, we raise a warning, otherwise the adapter is considered to have loaded successfully.
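A minimal sketch of that check; the helper name is hypothetical and not the actual function added in this PR, which lives inside PEFT's adapter-loading code:

```python
import warnings


def warn_on_missing_adapter_keys(missing_keys, prefix, adapter_name):
    # Base-model keys are expected to be absent from an adapter checkpoint, so only
    # missing keys that contain the adapter-specific prefix indicate a real problem.
    adapter_missing_keys = [key for key in missing_keys if prefix in key]
    if adapter_missing_keys:
        warnings.warn(
            f"Found missing adapter keys while loading adapter '{adapter_name}': {adapter_missing_keys}"
        )
```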