(torch1.8) nlp@nlp:~/projects/UVR-NMT-master$ sh test.sh
Traceback (most recent call last):
File "interactive.py", line 193, in <module>
cli_main()
File "interactive.py", line 189, in cli_main
main(args)
File "interactive.py", line 149, in main
translations = task.inference_step(generator, models, sample).long()
File "/home/nlp/projects/UVR-NMT-master/fairseq/tasks/fairseq_task.py", line 245, in inference_step
return generator.generate(models, sample, prefix_tokens=prefix_tokens)
File "/home/nlp/anaconda3/envs/torch1.8/lib/python3.7/site-packages/torch/autograd/grad_mode.py", line 27, in decorate_context
return func(*args, **kwargs)
File "/home/nlp/projects/UVR-NMT-master/fairseq/sequence_generator.py", line 376, in generate
scores.long().view(bsz, beam_size, -1)[:, :, :step]
File "/home/nlp/projects/UVR-NMT-master/fairseq/search.py", line 81, in step
torch.div(self.indices_buf, vocab_size, out=self.beams_buf)
RuntimeError: result type Float can't be cast to the desired output type Long
This is the first time I have run this code. I followed the README documentation, but when I run interactive.py the error shown above appears. I checked the code and could not find the problem. How can I solve this error?
The test.sh file is the script from the Inference section of the README.
Thank you. (I am using the Multi30K dataset, en-de.)
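For context, here is a minimal sketch of what I think is going wrong (this is my assumption, not confirmed): on newer PyTorch versions `torch.div` performs true (float) division, so its result can no longer be written into a Long `out=` buffer like `self.beams_buf` in search.py. Replacing the call with explicit integer floor division avoids the cast error:

```python
import torch

# Reproduce the shapes involved: flattened (beam * vocab) indices and a
# Long output buffer, as in fairseq/search.py's step().
indices_buf = torch.tensor([7, 12, 25])        # example flattened indices
vocab_size = 5
beams_buf = torch.empty_like(indices_buf)      # Long buffer, like self.beams_buf

# Fails on PyTorch >= 1.5 with "result type Float can't be cast to Long":
#   torch.div(indices_buf, vocab_size, out=beams_buf)

# Workaround: integer floor division keeps the result Long.
torch.floor_divide(indices_buf, vocab_size, out=beams_buf)
print(beams_buf.tolist())  # → [1, 2, 5]
```

On PyTorch 1.8 an equivalent form is `torch.div(indices_buf, vocab_size, rounding_mode='floor', out=beams_buf)`. I have not verified this against the UVR-NMT codebase itself.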