Closed gouldju1 closed 4 years ago
Looks like the binary data is incomplete. Please check the size of your .bin and .idx files; reprocessing the data should resolve this issue.
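A quick way to sanity-check the binarized output is to look for missing or zero-byte .bin/.idx pairs. This is a minimal sketch; the file stems passed in (e.g. "train.src") are assumptions, since the exact names fairseq-preprocess writes depend on your --source-lang/--target-lang flags:

```python
import os

def check_dataset_files(data_dir, prefixes):
    """Flag missing or zero-byte .bin/.idx pairs under data_dir.

    `prefixes` are the dataset file stems, e.g. "train.src" --
    the exact names depend on the fairseq-preprocess flags used.
    """
    problems = []
    for prefix in prefixes:
        for ext in ("bin", "idx"):
            path = os.path.join(data_dir, f"{prefix}.{ext}")
            if not os.path.exists(path):
                problems.append(f"missing: {path}")
            elif os.path.getsize(path) == 0:
                problems.append(f"empty: {path}")
    return problems
```

If this reports anything, rerun fairseq-preprocess before training or inference.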
Yes, it looks like this resolves the issue. However, now, after entering an input sentence during fairseq-interactive, I get the following:
Traceback (most recent call last):
File "/usr/local/bin/fairseq-interactive", line 11, in <module>
load_entry_point('fairseq', 'console_scripts', 'fairseq-interactive')()
File "/workspace/fairseq/fairseq_cli/interactive.py", line 213, in cli_main
main(args)
File "/workspace/fairseq/fairseq_cli/interactive.py", line 164, in main
translations = task.inference_step(generator, models, sample)
File "/workspace/fairseq/fairseq/tasks/fairseq_task.py", line 356, in inference_step
return generator.generate(models, sample, prefix_tokens=prefix_tokens)
File "/usr/local/lib/python3.6/dist-packages/torch/autograd/grad_mode.py", line 49, in decorate_no_grad
return func(*args, **kwargs)
File "/workspace/fairseq/fairseq/sequence_generator.py", line 161, in generate
return self._generate(sample, **kwargs)
File "/workspace/fairseq/fairseq/sequence_generator.py", line 261, in _generate
tokens[:, : step + 1], encoder_outs, self.temperature
File "/workspace/fairseq/fairseq/sequence_generator.py", line 726, in forward_decoder
incremental_state=self.incremental_states[i],
File "/workspace/ProphetNet/src/prophetnet/ngram_s2s_model.py", line 590, in forward
x_list, extra = self.extract_features(prev_output_tokens, encoder_out, incremental_state, **unused)
File "/workspace/ProphetNet/src/prophetnet/ngram_s2s_model.py", line 751, in extract_features
real_positions=real_positions
File "/usr/local/lib/python3.6/dist-packages/torch/nn/modules/module.py", line 532, in __call__
result = self.forward(*input, **kwargs)
File "/workspace/ProphetNet/src/prophetnet/ngram_s2s_model.py", line 365, in forward
real_positions=real_positions
File "/usr/local/lib/python3.6/dist-packages/torch/nn/modules/module.py", line 532, in __call__
result = self.forward(*input, **kwargs)
File "/workspace/ProphetNet/src/prophetnet/ngram_multihead_attention.py", line 244, in forward
saved_state = self._get_input_buffer(incremental_state)
File "/workspace/ProphetNet/src/prophetnet/ngram_multihead_attention.py", line 418, in _get_input_buffer
'attn_state',
File "/workspace/fairseq/fairseq/utils.py", line 91, in get_incremental_state
return module.get_incremental_state(incremental_state, key)
File "/usr/local/lib/python3.6/dist-packages/torch/nn/modules/module.py", line 576, in __getattr__
type(self).__name__, name))
AttributeError: 'NgramMultiheadAttention' object has no attribute 'get_incremental_state'
@gouldju1 Hi, the "no attribute" error is caused by the Fairseq version. The master branch of Fairseq keeps changing its API, so we built ProphetNet against v0.9.0. Please pip install fairseq==0.9.0 and check whether that works.
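For context on the traceback: the newer fairseq/utils.py shown there delegates to a get_incremental_state method on the module itself, which modules written against the 0.9.0 API (like ProphetNet's NgramMultiheadAttention) never define, hence the AttributeError. The sketch below illustrates a defensive lookup; the flat-dict key scheme in the fallback is a simplification for illustration, not fairseq's exact internal scheme:

```python
def get_incremental_state(module, incremental_state, key):
    # Newer fairseq delegates to a method on the module; modules written
    # against the 0.9.0 API (e.g. NgramMultiheadAttention) don't have it.
    getter = getattr(module, "get_incremental_state", None)
    if getter is not None:
        return getter(incremental_state, key)
    # Fallback: flat-dict lookup in the older style (simplified here --
    # real fairseq keys entries by a per-instance uid, not just the class name).
    if incremental_state is None:
        return None
    full_key = f"{type(module).__name__}.{key}"
    return incremental_state.get(full_key)
```

Pinning fairseq to 0.9.0, as suggested above, avoids the mismatch entirely.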
Yes, that works. Thank you!
Hello,
I performed the following to generate binaries:
preprocess_cnn_dm.py
fairseq-preprocess
When I run fairseq-train or inference with fairseq-generate, I get the following errors (attached: Train, Inference).
Inputs (attached): Train, Inference
Any idea how to handle this? Thank you.
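For reference, the pipeline described above can be sketched roughly as follows. This is a hypothetical outline: the data paths, language suffixes, and flags are assumptions, not the exact commands used in the report, and the fairseq steps only run where the CLI is installed:

```shell
DESTDIR=cnndm/processed

run_pipeline() {
  # Step 1: tokenize the raw CNN/DailyMail data (script from the ProphetNet repo).
  python preprocess_cnn_dm.py

  # Step 2: binarize into .bin/.idx files for fairseq.
  fairseq-preprocess \
    --source-lang src --target-lang tgt \
    --trainpref cnndm/train --validpref cnndm/valid --testpref cnndm/test \
    --destdir "$DESTDIR" --workers 8

  # Step 3: train, then generate (model/optimizer flags elided here).
  fairseq-train "$DESTDIR"
  fairseq-generate "$DESTDIR" --path checkpoints/checkpoint_best.pt
}

# Only run when the fairseq CLI is actually installed.
command -v fairseq-preprocess >/dev/null 2>&1 && run_pipeline
echo "destdir: $DESTDIR"
```

If the errors appear only at the train/generate stage, the .bin/.idx files produced in step 2 are the first thing to check, as noted in the first reply.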