adapter-hub / adapters

A Unified Library for Parameter-Efficient and Modular Transfer Learning
https://docs.adapterhub.ml
Apache License 2.0

Error Loading Adapter Without PEFT Configuration in EncoderDecoderModel #718

Closed: leaBroe closed this issue 2 months ago

leaBroe commented 2 months ago

Information

I trained an EncoderDecoderModel with adapters using a BnConfig configuration and not PEFT:

from transformers import EncoderDecoderModel
from adapters import BnConfig, init

# Build the encoder-decoder model from the two pretrained checkpoints
model = EncoderDecoderModel.from_encoder_decoder_pretrained("/ibmm_data2/oas_database/paired_lea_tmp/heavy_model/src/redo_ch/FULL_config_4_smaller_model_run_lr5e-5_500epochs_max_seq_length_512/checkpoint-117674391", "/ibmm_data2/oas_database/paired_lea_tmp/light_model/src/redo_ch/FULL_config_4_smaller_model_run_lr5e-5_500epochs_max_seq_length_512/checkpoint-56556520", add_cross_attention=True)
init(model)  # attach the adapters library's methods to the model

config = BnConfig(mh_adapter=True, output_adapter=True, reduction_factor=16, non_linearity="relu")

model.add_adapter("heavy2light_adapter", config=config)
model.set_active_adapters("heavy2light_adapter")
model.train_adapter("heavy2light_adapter")
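
For context, the final_adapter directory that I load below would typically be produced by saving the trained adapter after training, roughly like this (sketch; the output paths here are only illustrative):

# Sketch: saving the trained adapter and the surrounding model after training.
# The paths below are placeholders, not the ones actually used.
model.save_adapter("output_dir/final_adapter", "heavy2light_adapter")  # adapter weights + adapter_config.json
model.save_pretrained("output_dir")                                    # full EncoderDecoderModel checkpoint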

However, when I want to load the model and the adapter (in a different script):

model_name = "/ibmm_data2/oas_database/paired_lea_tmp/paired_model/BERT2BERT/heavy2light_model_checkpoints/save_adapter_FULL_data_temperature_0.5"

tokenizer = AutoTokenizer.from_pretrained("/ibmm_data2/oas_database/paired_lea_tmp/paired_model/BERT2BERT/heavy2light_model_checkpoints/save_adapter_FULL_data_temperature_0.5/checkpoint-336040")

model = EncoderDecoderModel.from_pretrained(model_name)
model.load_adapter("/ibmm_data2/oas_database/paired_lea_tmp/paired_model/BERT2BERT/heavy2light_model_checkpoints/save_adapter_FULL_data_temperature_0.5/final_adapter")
model.set_active_adapters("heavy2light_adapter")
model.to(device)

I get the error:

Traceback (most recent call last):
  File "/ibmm_data2/oas_database/paired_lea_tmp/paired_model/BERT2BERT/src/generate_sequences.py", line 17, in <module>
    model.load_adapter("/ibmm_data2/oas_database/paired_lea_tmp/paired_model/BERT2BERT/heavy2light_model_checkpoints/save_adapter_FULL_data_temperature_0.5/final_adapter")
  File "/home/leab/anaconda3/envs/adap_2/lib/python3.9/site-packages/transformers/integrations/peft.py", line 180, in load_adapter
    peft_config = PeftConfig.from_pretrained(
  File "/home/leab/anaconda3/envs/adap_2/lib/python3.9/site-packages/peft/config.py", line 151, in from_pretrained
    return cls.from_peft_type(**kwargs)
  File "/home/leab/anaconda3/envs/adap_2/lib/python3.9/site-packages/peft/config.py", line 118, in from_peft_type
    return config_cls(**kwargs)
TypeError: __init__() got an unexpected keyword argument 'config'

This seems to happen because adapters assumes it is a PEFT adapter / a PEFT config by default (or at least this is how I understand it).
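
As a quick check, one can inspect which library's load_adapter() is actually bound to the model (diagnostic sketch; the module name after init() is what I would expect, not something I verified):

# Diagnostic sketch: inspect which load_adapter() implementation the model exposes.
# Without adapters.init(model) this prints transformers.integrations.peft
# (matching the traceback above); after init(model) it should point to an adapters module.
print(model.load_adapter.__func__.__module__)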

I also had this issue a couple of times when I wanted to train the adapter, when running:

config = BnConfig(mh_adapter=True, output_adapter=True, reduction_factor=16, non_linearity="relu")

model.add_adapter("heavy2light_adapter", config=config)

where adapters also assumed a PEFT config. Unfortunately, I don't remember how I solved this issue back then; I only remember that at some point it worked after creating a new environment and installing everything from scratch. So far, the environment I am using now has always worked for adapter training (without PEFT).

So, is there a way to load the adapter while not using PEFT / a PEFT config?

Maybe relevant: I installed adapters from GitHub using pip install git+https://github.com/adapter-hub/adapters.git because of this issue I opened a few weeks ago.

Many thanks for any help!

calpt commented 2 months ago

Hey @leaBroe,

In your code snippet for loading the adapter, you might be missing the init() call for adapters. That's why the load_adapter() method of PEFT is called instead of the one from adapters.

With adapters, it is always required to first call adapters.init() on the model to add the library's add_adapter(), load_adapter(), etc. methods. Unfortunately, the PEFT library reuses the names of some methods from adapters (such as add_adapter() and load_adapter()), which leads to these confusing error messages when PEFT is installed and adapters.init() has not been called. adapters has no dependency on the PEFT library, so it is never necessary to have PEFT installed alongside adapters to use adapters functionality.
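
Concretely, your loading script would roughly look like this (a sketch based on your snippet above; the device handling is just an example):

import torch
import adapters
from transformers import AutoTokenizer, EncoderDecoderModel

model_name = "/ibmm_data2/oas_database/paired_lea_tmp/paired_model/BERT2BERT/heavy2light_model_checkpoints/save_adapter_FULL_data_temperature_0.5"

tokenizer = AutoTokenizer.from_pretrained(model_name + "/checkpoint-336040")
model = EncoderDecoderModel.from_pretrained(model_name)
adapters.init(model)  # the missing call: attaches adapters' add_adapter()/load_adapter()/set_active_adapters() methods
model.load_adapter(model_name + "/final_adapter")
model.set_active_adapters("heavy2light_adapter")

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model.to(device)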

Hope this helps!

PS: the fix for the issue you raised a while ago is part of the v0.2.2 release :)
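
To double-check which version you have installed, a quick sanity check (this assumes the installed package exposes __version__, which recent releases do):

# Print the installed adapters version to confirm the fix from v0.2.2 is present.
import adapters
print(adapters.__version__)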

leaBroe commented 2 months ago

Yes, I forgot to add the init() call. Thanks!