Closed leaBroe closed 2 months ago
Hey @leaBroe,

In your code snippet for loading the adapter, you might be missing the `init()` call for `adapters`. That's why the `load_adapter()` method of PEFT is called instead of the one from `adapters`. With `adapters`, it is always required to first call `adapters.init()` to add the library's `add_adapter()`, `load_adapter()`, etc. methods to the model. Unfortunately, the PEFT library decided to copy the names of some methods in `adapters` (such as `add_adapter()` and `load_adapter()`), which causes these confusing error messages when PEFT is installed and `adapters.init()` is not called. `adapters` has no dependency on the PEFT library, so it is never necessary to have PEFT installed alongside `adapters` to use `adapters` functionality.
Hope this helps!
PS: the fix for the issue you raised a while ago is part of the v0.2.2 release :)
Yes I forgot to add the init() call. Thanks!
Environment info

- `transformers` version: 4.40.2

Information
I trained an EncoderDecoderModel with adapters using a "BnConfig" config and not using PEFT:
However, if I want to load the model and the adapter (in a different script):
I get the error:
Because `adapters` assumes that it is a PEFT adapter / a PEFT config by default (or at least this is how I understand it).
I also had this issue a couple of times when I wanted to train the adapter, when running:
where `adapters` also assumed a PEFT config. Unfortunately, I don't remember how I solved this issue back then; I only remember that at some point it worked after making a new environment and installing everything from scratch. So far, the environment that I am using now has always worked for the adapter training (without PEFT).
So, is there a way to load the adapter while not using PEFT / a PEFT config?
Maybe relevant: I installed `adapters` from GitHub using:
pip install git+https://github.com/adapter-hub/adapters.git
because of this issue I opened a few weeks ago.

Many thanks for any help!