Hi
I see the same error. Making the change
```diff
+++ b/fairseq/models/speech_to_text/xm_transformer.py
@@ -643,7 +643,7 @@ class XMTransformerModel(FairseqEncoderDecoderModel):
         base_architecture(args)
         if getattr(args, "load_pretrained_decoder_from", None) is not None:
             ckpt = torch.load(getattr(args, "load_pretrained_decoder_from", None))
-            decoder_args_dict = cls.get_decoder_args_from_checkpoint(ckpt["cfg"])
+            decoder_args_dict = cls.get_decoder_args_from_checkpoint(ckpt)
             args = cls.override_decoder_args(args, decoder_args_dict)
```
fixes this error. But then there are other configuration errors.
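The reason `ckpt["cfg"]` fails is that this particular `model.pt` stores its configuration under the legacy `args` key rather than `cfg` (see the checkpoint keys reported further down in this thread). A minimal sketch of a more defensive way to pull the config out of the checkpoint (illustrative only, not the actual fairseq code; the path is a placeholder):

```python
import torch

# Illustrative sketch, not the fairseq implementation: older fairseq
# checkpoints keep their configuration under "args", newer ones under "cfg".
ckpt = torch.load("model.pt", map_location="cpu")  # placeholder path

if ckpt.get("cfg") is not None:
    decoder_cfg = ckpt["cfg"]    # newer, Hydra-style checkpoints
elif ckpt.get("args") is not None:
    decoder_cfg = ckpt["args"]   # legacy checkpoints such as this model.pt
else:
    raise KeyError("checkpoint contains neither 'cfg' nor 'args'")
```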
best Barry
@bhaddow Hi, thank you for your advice. I am running into some (configuration) errors as well. How did you solve them?
@xutaima Hi, we are having trouble with the IWSLT 2023 baseline. I followed the instructions in the README, but it does not work. The IWSLT 2024 baseline has the same problem, and it does not appear to be maintained either.
For example, after applying the fix from https://github.com/facebookresearch/fairseq/issues/5083#issuecomment-1514713597, I got the following error:
```
omegaconf.errors.MissingMandatoryValue: Missing mandatory value: model.w2v_path
    full_key: model.w2v_path
    reference_type=Any
    object_type=dict
```
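For reference, this looks like the generic OmegaConf behaviour when a mandatory config field is never filled in. Here is a minimal sketch reproducing the same class of error, independent of fairseq (the wav2vec path below is just a placeholder):

```python
from omegaconf import MISSING, OmegaConf
from omegaconf.errors import MissingMandatoryValue

# Minimal reproduction of the error class above, independent of fairseq:
# a field declared as MISSING raises MissingMandatoryValue when accessed.
cfg = OmegaConf.create({"model": {"w2v_path": MISSING}})
try:
    _ = cfg.model.w2v_path
except MissingMandatoryValue as e:
    print(e)  # Missing mandatory value: model.w2v_path

# Filling the field (with a placeholder path here) makes the access succeed.
cfg.model.w2v_path = "/path/to/wav2vec_checkpoint.pt"
print(cfg.model.w2v_path)
```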
I tried to fix the errors one by one, but each fix just led to another error... We need a more fundamental solution.
best,
## ❓ Questions and Help
#### What is your question?
To reproduce, I am trying to follow the instructions below, but it does not work.
https://github.com/facebookresearch/fairseq/tree/iwslt2023/examples/simultaneous_translation
Ref: https://iwslt.org/2023/simultaneous
Error message:
This time, `ckpt.keys()` is `dict_keys(['args', 'model', 'optimizer_history', 'extra_state', 'last_optimizer_state'])`. The `.ckpt` is loaded with `torch.load` from the path given by the `--load_pretrained_decoder_from` option, so maybe the loaded model is the problem? The `--load_pretrained_decoder_from` value is `model.pt`, as in the instructions.
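A quick diagnostic to confirm what the downloaded checkpoint actually contains (a sketch; the path is a placeholder):

```python
import torch

# List the top-level keys of the checkpoint passed via
# --load_pretrained_decoder_from.
ckpt = torch.load("model.pt", map_location="cpu")
print(sorted(ckpt.keys()))
# ['args', 'extra_state', 'last_optimizer_state', 'model', 'optimizer_history']
# There is no 'cfg' key, which is presumably why ckpt["cfg"] fails without the
# patch from earlier in this thread.
```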
#### What's your environment?
- How you installed fairseq (`pip`, source): source, `pip install --editable ./`