Closed: johntiger1 closed this issue 6 years ago.
Actually, I see your comment here addresses it:
https://github.com/shrimai/Style-Transfer-Through-Back-Translation/blob/master/style_decoder/translate.py#L11

```python
## When using english-french trained MT model, uncomment -model
```
However, after doing that, I get the following error when I run this command:

```shell
python translate.py -model ../models/translation/english_french/english_french.pt -src ../data/political_data/democratic_only.train.en -output ../data/political_data/democratic_only.train.fr -replace_unk $true
```
```
Traceback (most recent call last):
  File "translate.py", line 139, in <module>
    main()
  File "translate.py", line 66, in main
    translator = onmt.Translator_style(opt)
  File "/h/johnchen/Desktop/git_stuff/Style-Transfer-Through-Back-Translation/style_decoder/onmt/Translator_style.py", line 12, in __init__
    checkpoint = torch.load(opt.decoder_model, map_location=lambda storage, loc: storage)
AttributeError: 'Namespace' object has no attribute 'decoder_model'
```
Any help? Thanks
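For context, that `AttributeError` is standard argparse behaviour when an option was never registered on the parser: `Translator_style` reads `opt.decoder_model`, but the command above only passes `-model`. A minimal, self-contained illustration (the parser here is hypothetical, not the repo's actual option set):

```python
import argparse

# A parser that, like the translation-only invocation above,
# defines -model but never registers -decoder_model.
parser = argparse.ArgumentParser()
parser.add_argument('-model')
opt = parser.parse_args(['-model', 'english_french.pt'])

# Translator_style reads opt.decoder_model, which this Namespace never received:
try:
    opt.decoder_model
except AttributeError as err:
    print(err)  # 'Namespace' object has no attribute 'decoder_model'
```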
You are using the Translator_style.py file. For english_french.pt, you have to use the Translator.py file (given in line 9 of example.sh). Uncomment line 65 and comment line 66 of translate.py.
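Concretely, the swap should look something like the sketch below (line numbers taken from the traceback above; the exact surrounding code in translate.py may differ):

```python
# translate.py, around lines 65-66 (sketch, not verbatim repo code):
translator = onmt.Translator(opt)          # line 65: plain MT models such as english_french.pt
# translator = onmt.Translator_style(opt)  # line 66: style-decoder models only
```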
Wow, thanks for the quick reply! So I tried that and I got a new error message:
```
(onmt) johnchen@vws53:~/Desktop/git_stuff/Style-Transfer-Through-Back-Translation/style_decoder$ python translate.py -model ../models/translation/english_french/english_french.pt -src ../data/political_data/democratic_only.train.en -output ../data/political_data/democratic_only.train.fr -replace_unk $true
/h/johnchen/Desktop/git_stuff/Style-Transfer-Through-Back-Translation/style_decoder/onmt/Dataset.py:68: UserWarning: volatile was removed and now has no effect. Use `with torch.no_grad():` instead.
  b = Variable(b, volatile=self.volatile)
Traceback (most recent call last):
  File "translate.py", line 139, in <module>
    main()
  File "translate.py", line 93, in main
    predBatch, predScore, goldScore = translator.translate(srcBatch, tgtBatch)
  File "/h/johnchen/Desktop/git_stuff/Style-Transfer-Through-Back-Translation/style_decoder/onmt/Translator.py", line 197, in translate
    pred, predScore, attn, goldScore = self.translateBatch(src, tgt)
  File "/h/johnchen/Desktop/git_stuff/Style-Transfer-Through-Back-Translation/style_decoder/onmt/Translator.py", line 70, in translateBatch
    encStates, context = self.model.encoder(srcBatch)
  File "/h/johnchen/anaconda3/lib/python3.6/site-packages/torch/nn/modules/module.py", line 491, in __call__
    result = self.forward(*input, **kwargs)
  File "/h/johnchen/Desktop/git_stuff/Style-Transfer-Through-Back-Translation/style_decoder/onmt/Models.py", line 33, in forward
    emb = pack(self.word_lut(input[0]), input[1])
  File "/h/johnchen/anaconda3/lib/python3.6/site-packages/torch/onnx/__init__.py", line 57, in wrapper
    return fn(*args, **kwargs)
  File "/h/johnchen/anaconda3/lib/python3.6/site-packages/torch/nn/utils/rnn.py", line 124, in pack_padded_sequence
    data, batch_sizes = PackPadded.apply(input, lengths, batch_first)
  File "/h/johnchen/anaconda3/lib/python3.6/site-packages/torch/nn/_functions/packing.py", line 19, in forward
    lengths_iter = reversed(lengths.tolist())
AttributeError: 'tuple' object has no attribute 'tolist'
```
To be clear, this is with lines 13-14 and line 65 uncommented, and lines 15-18 and line 66 commented.
Any idea on how to get around this?
Are you using PyTorch 0.4.0? This code is compatible with PyTorch 0.3.
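That would explain the traceback: in PyTorch 0.4 the packing code calls `lengths.tolist()`, while this 0.3-era code passes the sequence lengths as a plain tuple, which has no such method. A torch-free sketch of the failure (the variable names are illustrative):

```python
# In PyTorch 0.3 the lengths could be passed as a plain tuple/list;
# 0.4's internal packing code assumes a tensor and calls .tolist() on it.
lengths = (7, 5, 3)  # illustrative batch of sequence lengths

try:
    reversed(lengths.tolist())  # what torch 0.4 effectively does internally
except AttributeError as err:
    print(err)  # 'tuple' object has no attribute 'tolist'
```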
I think that may have solved the issue, although I haven't seen any results yet. I will report back and close if/when the issue is fixed. Thanks!
@shrimai I'm not sure if this is expected behaviour but I haven't seen any results yet still. This is what the prompt is hanging on:
```
(onmt) johnchen@vws53:~/Desktop/git_stuff/Style-Transfer-Through-Back-Translation/style_decoder$ python translate.py -model ../models/translation/english_french/english_french.pt -src ../data/political_data/democratic_only.train.en -output ../data/political_data/democratic_only.train.fr -replace_unk $true
/h/johnchen/Desktop/git_stuff/Style-Transfer-Through-Back-Translation/style_decoder/onmt/modules/GlobalAttention.py:49: UserWarning: mask is not broadcastable to self, but they have the same number of elements. Falling back to deprecated pointwise behavior.
  attn.data.masked_fill_(self.mask, -float('inf'))
/h/johnchen/Desktop/git_stuff/Style-Transfer-Through-Back-Translation/style_decoder/onmt/modules/GlobalAttention.py:50: UserWarning: Implicit dimension choice for softmax has been deprecated. Change the call to include dim=X as an argument.
  attn = self.sm(attn)
/h/johnchen/anaconda3/envs/onmt/lib/python3.6/site-packages/torch/nn/modules/container.py:67: UserWarning: Implicit dimension choice for log_softmax has been deprecated. Change the call to include dim=X as an argument.
  input = module(input)
```
Do you think this is typical behaviour? I am running on an Intel(R) Xeon(R) CPU E5-1620 v4 @ 3.50GHz.
Just logged in and saw the following results: `PRED AVG SCORE: -0.5515, PRED PPL: 1.7358`
Thanks for the help!
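For anyone sanity-checking those numbers: assuming the reported perplexity is the exponential of the negative average log-probability score, the two values are consistent:

```python
import math

pred_avg_score = -0.5515
ppl = math.exp(-pred_avg_score)
print(f"{ppl:.4f}")  # prints 1.7359, matching the reported 1.7358 up to rounding of the score
```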
https://github.com/shrimai/Style-Transfer-Through-Back-Translation/blob/ce44ea946eef5df4c725ccf0319392d105408495/style_decoder/example.sh#L10
I think this line should be updated according to the fix discussed in issue #1.