prajdabre / yanmtt

Yet Another Neural Machine Translation Toolkit
MIT License

Some questions about pretrain_nmt.py #33

Open raullese opened 2 years ago

raullese commented 2 years ago

I have some confusion about pretrain_nmt.py.

I just saw the first few lines of your pretrain_nmt.py:

from transformers import AutoTokenizer, MBartTokenizer, MBart50Tokenizer, BartTokenizer, AlbertTokenizer
from transformers import MBartForConditionalGeneration, BartForConditionalGeneration, MBartConfig, get_linear_schedule_with_warmup

As I understand it, you have rewritten some of the code, such as some classes in https://github.com/prajdabre/yanmtt/tree/main/transformers/src/transformers/models/mbart/modeling_mbart.py, in order to enable further pre-training based on mBART.

So why don't you use the code in your new modeling_mbart.py? I mean, why don't you import the classes from https://github.com/prajdabre/yanmtt/tree/main/transformers/src/transformers/models/mbart/modeling_mbart.py directly?
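For reference, a minimal sketch (not code from the repository; the paths in the comments are assumptions about a typical yanmtt checkout) of why those stock-looking imports can still pick up the modified classes: once the transformers fork bundled with yanmtt is installed, the package name transformers resolves to that fork, so the imported MBartForConditionalGeneration is already the patched class from the modified modeling_mbart.py.

# Minimal sketch: check which transformers installation the imports resolve to.
import inspect
import transformers
from transformers import MBartForConditionalGeneration

print(transformers.__file__)
# If the bundled fork is installed, this should point inside the yanmtt checkout,
# e.g. .../yanmtt/transformers/src/transformers/__init__.py (path is an assumption).
print(inspect.getsourcefile(MBartForConditionalGeneration))
# Likewise, this should point at the modified
# transformers/src/transformers/models/mbart/modeling_mbart.py rather than a stock pip release.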

raullese commented 2 years ago

Sorry, I missed the information about installing the Transformers library from your folder. I will try it.
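For anyone hitting the same confusion, here is a hypothetical sanity check (not part of yanmtt; it assumes an editable install of the bundled transformers directory and that it is run from the yanmtt checkout) to confirm that the bundled fork, rather than a stock pip release, is the one being imported before running pretrain_nmt.py:

# Hypothetical pre-run check: fail loudly if transformers is not loaded from the local fork.
import os
import transformers

expected_root = os.path.abspath("transformers")  # assumes the current directory is the yanmtt checkout
actual = os.path.abspath(transformers.__file__)
if not actual.startswith(expected_root):
    raise RuntimeError(
        "transformers is loaded from " + actual +
        "; install the fork bundled with yanmtt first (e.g. an editable pip install from inside its transformers directory)."
    )
print("OK: using the modified transformers fork at", actual)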