Closed: joao-alves97 closed this issue 3 years ago
@patil-suraj
Hey @joao-alves97, I started an adapter implementation for EncoderDecoderModel in https://github.com/calpt/adapter-transformers/tree/dev/encoder_decoder (currently work in progress). Expecting it to be available soon.
Awesome! Could you send me a message once it is available? Thanks
@joao-alves97 the EncoderDecoderModel implementation has been merged into master, so you should be able to use it when installing from there. We haven't done any extensive evaluation yet though, so happy to hear about results you get :)
Awesome! Next week I'm going to try to do some experiments!
@calpt sorry for replying after 2 months, but I'm trying to use adapters on top of an EncoderDecoderModel with two XLM-R models for translation, and I keep hitting this error:
The model is not freezed. For training adapters please call the train_adapters() method
I'm using the method model.train_adapter(adapter_name).
Any idea on how to solve this problem?
@joao-alves97 This error doesn't sound expected. Would you mind opening a new (bug report) issue for this (ideally with a short snippet for us to reproduce)? Thanks!
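For context, the check behind that error message is essentially: adapter training requires the base model's weights to be frozen first, and `train_adapter()` is what performs the freezing. Here is a minimal, dependency-free sketch of that pattern (the class and method bodies are hypothetical illustrations, not adapter-transformers internals):

```python
class TinyAdapterModel:
    """Toy model illustrating the freeze-then-train-adapter pattern.

    Names and logic are a simplified sketch, not the actual
    adapter-transformers implementation.
    """

    def __init__(self):
        # parameter name -> trainable flag (stands in for requires_grad)
        self.params = {"base.weight": True, "adapter.weight": True}
        self.model_frozen = False
        self.active_adapter = None

    def freeze_model(self, freeze=True):
        # Freeze everything except adapter parameters.
        for name in self.params:
            if not name.startswith("adapter."):
                self.params[name] = not freeze
        self.model_frozen = freeze

    def train_adapter(self, adapter_name):
        # Freezes base weights and activates the given adapter for training.
        self.freeze_model(True)
        self.active_adapter = adapter_name

    def check_trainable(self):
        # The guard that produces the error from the thread above.
        if not self.model_frozen:
            raise RuntimeError(
                "The model is not freezed. For training adapters "
                "please call the train_adapter() method"
            )


m = TinyAdapterModel()
m.train_adapter("translation")
m.check_trainable()  # passes once the base weights are frozen
```

So if the error still appears after calling `model.train_adapter(...)`, the freeze flag on the underlying model was likely not set, which is why a reproducible snippet in a bug report is the fastest way to track it down.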
🚀 Feature request
Would it be possible to add adapter support to the EncoderDecoderModel? I am fine-tuning an EncoderDecoderModel with two mBERT models, and I would like to compare full fine-tuning with training only the adapter layers.