Chaitanya-Jadhav opened 1 year ago
Hi Team,

We are trying to train a model, but we are getting the following error: `ValueError: invalid literal for int() with base 10: '0a0'`

This is the command we are using for training: `fairseq-train corpus-bin-en-indic --save-dir transformer-en-indic --arch transformer --layernorm-embedding --task translation_multi_simple_epoch --sampling-method "temperature" --sampling-temperature 1.5 --encoder-langtok "tgt" --lang-dict lang_list.txt --lang-pairs en-asm,en-ben,en-brx,en-guj,en-hin,en-kas,en-kan,en-kok,en-mai,en-mal,en-mar,en-mni,en-nep,en-ori,en-pan,en-san,en-sid,en-tam,en-tel,en-urd --decoder-normalize-before --encoder-normalize-before --activation-fn gelu --adam-betas "(0.9, 0.98)" --batch-size 1024 --decoder-attention-heads 4 --decoder-embed-dim 256 --decoder-ffn-embed-dim 1024 --decoder-layers 6 --dropout 0.5 --encoder-attention-heads 4 --encoder-embed-dim 256 --encoder-ffn-embed-dim 1024 --encoder-layers 6 --lr 0.001 --lr-scheduler inverse_sqrt --max-epoch 51 --optimizer adam --num-workers 32 --warmup-init-lr 0 --warmup-updates 4000`
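For context, the `'0a0'` in the traceback looks like the alpha suffix of a PyTorch nightly/source-build version string (e.g. `1.10.0a0+git...`), which `int()` cannot parse when something splits the version on `.` and converts each segment. This is an assumed cause, not confirmed against the full traceback; a minimal sketch of the failure mode and a tolerant parse:

```python
import re

# Hypothetical nightly-style version string for illustration; its last
# segment '0a0' matches the literal in the reported error.
version = "1.10.0a0"

# A naive segment-by-segment int() conversion fails exactly like the
# reported error:
try:
    naive = [int(p) for p in version.split(".")]
except ValueError as exc:
    error_message = str(exc)  # invalid literal for int() with base 10: '0a0'

# A tolerant parse keeps only the leading digits of each segment:
tolerant = [int(re.match(r"\d+", p).group()) for p in version.split(".")]
```

If this is the cause, checking `python -c "import torch; print(torch.__version__)"` and switching to a stable (non-nightly) PyTorch build may avoid the error.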