adapter-hub / adapters

A Unified Library for Parameter-Efficient and Modular Transfer Learning
https://docs.adapterhub.ml
Apache License 2.0
2.56k stars · 342 forks

Add EncoderDecoderModel support #203

Closed joao-alves97 closed 3 years ago

joao-alves97 commented 3 years ago

🚀 Feature request

Would it be possible to add adapters to the EncoderDecoderModel? I am fine-tuning an EncoderDecoderModel built from two mBERT models, and I would like to compare full fine-tuning with fine-tuning only the adapter layers.

joao-alves97 commented 3 years ago

@patil-suraj

calpt commented 3 years ago

Hey @joao-alves97, I started an adapter implementation for EncoderDecoderModel in https://github.com/calpt/adapter-transformers/tree/dev/encoder_decoder (currently work in progress). Expecting it to be available soon.

joao-alves97 commented 3 years ago

Awesome! Could you send me a message once it is available? Thanks

calpt commented 3 years ago

@joao-alves97 the EncoderDecoderModel implementation has been merged into master, so you should be able to use it when installing from there. We haven't done any extensive evaluation yet though, so happy to hear about results you get :)
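For anyone landing on this thread, a minimal sketch of how this could be tried after installing from master (`pip install git+https://github.com/Adapter-Hub/adapter-transformers.git`). The checkpoint and adapter names below are illustrative assumptions, not from the thread:

```python
# Sketch: adapters on an EncoderDecoderModel with adapter-transformers from master.
# Checkpoint and adapter names are assumptions for illustration.

ENCODER_CHECKPOINT = "bert-base-multilingual-cased"  # mBERT, as in the original question
DECODER_CHECKPOINT = "bert-base-multilingual-cased"
ADAPTER_NAME = "seq2seq_task"  # hypothetical task name


def build_model():
    # Import from the adapter-transformers fork, which extends
    # EncoderDecoderModel with adapter methods.
    from transformers import EncoderDecoderModel

    model = EncoderDecoderModel.from_encoder_decoder_pretrained(
        ENCODER_CHECKPOINT, DECODER_CHECKPOINT
    )
    model.add_adapter(ADAPTER_NAME)    # add adapter layers to encoder and decoder
    model.train_adapter(ADAPTER_NAME)  # freeze pretrained weights, train adapters only
    return model


if __name__ == "__main__":
    model = build_model()
    trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
    total = sum(p.numel() for p in model.parameters())
    print(f"trainable params: {trainable} / {total}")
```

With `train_adapter` active, only a small fraction of the parameters should show up as trainable, which is exactly the comparison the original question asks about.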

joao-alves97 commented 3 years ago

Awesome! Next week I'm going to try to do some experiments!

joao-alves97 commented 2 years ago

@calpt Sorry for replying after 2 months, but I'm trying to use adapters on top of an EncoderDecoderModel built from two XLM-R models for translation, and I keep hitting this error: `The model is not freezed. For training adapters please call the train_adapters() method`. I am already calling `model.train_adapter(adapter_name)`. Any idea how to solve this?
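For reference, a minimal sketch of the setup described in this comment (the checkpoint and adapter names are assumptions); something along these lines is what would need to reproduce the error:

```python
# Sketch of the reported setup: an EncoderDecoderModel from two XLM-R checkpoints,
# with an adapter added and then activated for training.
# Checkpoint and adapter names are assumptions, not from the thread.

ENCODER_CHECKPOINT = "xlm-roberta-base"  # XLM-R, as mentioned in the comment
DECODER_CHECKPOINT = "xlm-roberta-base"
ADAPTER_NAME = "translation"             # hypothetical adapter name


def reproduce():
    from transformers import EncoderDecoderModel  # adapter-transformers fork

    model = EncoderDecoderModel.from_encoder_decoder_pretrained(
        ENCODER_CHECKPOINT, DECODER_CHECKPOINT
    )
    model.add_adapter(ADAPTER_NAME)
    # The comment reports that the "model is not freezed" error appears
    # even though train_adapter() is called:
    model.train_adapter(ADAPTER_NAME)
    return model


if __name__ == "__main__":
    reproduce()
```

Note the call order: the adapter is added with `add_adapter` before `train_adapter` is called on the same name, which matches the usage the library documents.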

calpt commented 2 years ago

@joao-alves97 This error isn't expected. Would you mind opening a new bug report issue for it, ideally with a short snippet so we can reproduce? Thanks!