Helsinki-NLP / Tatoeba-Challenge


Some modules seem to be missing #4

Closed: zhhao1 closed this issue 4 years ago

zhhao1 commented 4 years ago

In the .npz file of the model you saved, it seems that the W of the Logitout layer is not included. Is this an oversight?
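For reference, the stored parameters can be listed directly with numpy; a minimal sketch, assuming the downloaded model sits at a hypothetical local path model.npz:

```python
import numpy as np

# Hypothetical local path to one of the released Marian model files.
params = np.load("model.npz")

# List every stored parameter name together with its shape, which shows
# whether a dedicated output-projection (logit) weight is stored at all.
for name in sorted(params.keys()):
    print(name, params[name].shape)
```

Note that Marian models are often trained with tied embeddings, in which case the output projection reuses the embedding matrix rather than storing a separate W; whether that applies here would be a question for the Marian side.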

zhhao1 commented 4 years ago

Another question: the order of the modules you save differs from the order returned by torch.load. My reading is that l1 is the first layer, context stands for the masked multi-head attention, and ln stands for layer norm. Is that what you mean?
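To compare the two orderings, here is a small sketch that groups the stored names by layer prefix, again assuming a hypothetical local model.npz; the apparent key order in the archive need not match torch's module order, so only the names themselves carry meaning:

```python
from collections import defaultdict
import numpy as np

params = np.load("model.npz")  # hypothetical local path

# Group parameter names by their leading components, e.g.
# "decoder_l1_context_Wq" -> "decoder_l1", so that each layer's
# attention and layer-norm tensors line up for inspection.
groups = defaultdict(list)
for name in params.keys():
    prefix = "_".join(name.split("_")[:2])
    groups[prefix].append(name)

for prefix, names in sorted(groups.items()):
    print(prefix, names)
```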

jorgtied commented 4 years ago

Those models are created by MarianNMT, so you may need to ask for support through their support channels, or maybe I am misunderstanding something in your question. These are not native torch packages, if that is what is causing a problem for you.

zhhao1 commented 4 years ago

> Those models are created by MarianNMT, so you may need to ask for support through their support channels, or maybe I am misunderstanding something in your question. These are not native torch packages, if that is what is causing a problem for you.

Thanks a lot.