prajdabre / yanmtt

Yet Another Neural Machine Translation Toolkit
MIT License

Unexpected keyword arguments prompt_params, adaptor_layers, deep_adaptor_tuning, deep_adaptor_tuning_ffn_only, parallel_adaptors #27

goru001 closed this issue 2 years ago

goru001 commented 2 years ago

Hi @prajdabre ,

Thanks for doing great work with this library. I was trying to use it and ran into this issue: the prompt_params, adaptor_layers, deep_adaptor_tuning, deep_adaptor_tuning_ffn_only, and parallel_adaptors params are being passed to forward here, but the MBartForConditionalGeneration class's forward function doesn't accept them.

Wanted to understand from you whether the fix is as simple as adding these params to the forward signature with a default value of None (in which case I'm guessing we would also need to change the forward function's implementation itself to actually use these params).

Let me know if you think I might be missing something here. Thanks!
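
(For illustration, the fix being asked about would look roughly like the sketch below; the class and the argument handling are a toy example I'm assuming for clarity, not yanmtt's actual code:)

```python
# Minimal sketch (illustrative, not the actual yanmtt/transformers code):
# the extra tuning arguments default to None/False, so existing callers
# that never pass them keep working unchanged.
class ToyMBartForConditionalGeneration:
    def forward(self, input_ids, attention_mask=None,
                prompt_params=None, adaptor_layers=None,
                deep_adaptor_tuning=False,
                deep_adaptor_tuning_ffn_only=False,
                parallel_adaptors=False):
        if prompt_params is not None:
            pass  # prompt-tuning path would go here
        if adaptor_layers is not None:
            pass  # adaptor injection would go here
        return input_ids  # stand-in for the real model outputs
```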

prajdabre commented 2 years ago

Oh lord. I forgot to push my latest changes. Gimme a moment.

prajdabre commented 2 years ago

Ok, please pull, run setup.py in the transformers folder, try again, and lmk.
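
(Concretely, the steps would be something like the following; whether setup.py should be run with install or develop is my assumption, not stated in the thread:)

```sh
git pull
cd transformers
python setup.py install   # or `python setup.py develop`; exact mode may vary
```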

goru001 commented 2 years ago

Thanks @prajdabre for the quick reply. The code still seems to fail here when we're not using --prompt_tuning, since prompt_params is None.

When we do use --prompt_tuning, it fails here because tgt_len is the same size as src_len in the attention mask.

prajdabre commented 2 years ago

Hi,

My bad again, I forgot to add the if/else conditions. Could you try using the following commit for now: https://github.com/prajdabre/yanmtt/commit/ed5da801ba6f7fad18a95a0643581aa197e68511

Just revert to this commit and try again.
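
(In case it helps, reverting would look roughly like this; re-running setup.py afterwards is my assumption, carried over from the earlier instruction:)

```sh
git checkout ed5da801ba6f7fad18a95a0643581aa197e68511
cd transformers
python setup.py install   # reinstall the bundled copy, as above
```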

Really should not make breaking changes before going on vacation! 🤦

prajdabre commented 2 years ago

If you want, you can also fix this by wrapping the lines you pointed out in an if condition that triggers only when prompt_params is not None.
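
(A rough sketch of that guard, assuming prompt_params is a (num_prompts, hidden_dim) tensor that gets prepended to the encoder inputs; the function and tensor shapes are illustrative, not the exact yanmtt lines:)

```python
import torch

def maybe_apply_prompts(inputs_embeds, attention_mask, prompt_params=None):
    # Illustrative guard, not the exact yanmtt code: skip the
    # prompt-tuning logic entirely when prompt_params is None, so the
    # plain fine-tuning path is untouched.
    if prompt_params is None:
        return inputs_embeds, attention_mask
    # Prepend the soft-prompt embeddings and widen the mask to match,
    # keeping the mask length consistent with the expanded inputs.
    batch_size = inputs_embeds.size(0)
    prompts = prompt_params.unsqueeze(0).expand(batch_size, -1, -1)
    inputs_embeds = torch.cat([prompts, inputs_embeds], dim=1)
    prompt_mask = attention_mask.new_ones(batch_size, prompts.size(1))
    attention_mask = torch.cat([prompt_mask, attention_mask], dim=1)
    return inputs_embeds, attention_mask
```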

goru001 commented 2 years ago

Got it, sure @prajdabre. Thanks a lot for your quick replies. I've switched to tag v2.0 for now, but ya, you're right, I can fix this on my end. I just didn't do it yet because I wanted to get more comfortable with the codebase first.

I hope to contribute back as well in the future. Thanks for doing the great work, and I hope you enjoy your vacation! :)

prajdabre commented 2 years ago

Fixed, please check @goru001.