atulkum / pointer_summarizer

PyTorch implementation of "Get To The Point: Summarization with Pointer-Generator Networks"
Apache License 2.0
904 stars · 243 forks

'Encoder' object has no attribute 'tx_proj' #57

Open jivatneet opened 3 years ago

jivatneet commented 3 years ago

I was able to run the LSTM-based pointer-generator successfully. While running the transformer_encoder branch with use_lstm=False, I encounter this error:

File "training_ptr_gen/train.py", line 400, in <module>
    train_processor.trainIters(config.max_iterations, args.model_file_path)
  File "training_ptr_gen/train.py", line 341, in trainIters
    loss = self.train_one_batch(batch)
  File "training_ptr_gen/train.py", line 273, in train_one_batch
    encoder_outputs, encoder_feature, encoder_hidden = self.model.encoder(enc_batch, enc_lens, enc_padding_mask)
  File "/srv/home/kaur/pointer_summarizer/lstmpg/lib/python3.6/site-packages/torch/nn/modules/module.py", line 727, in _call_impl
    result = self.forward(*input, **kwargs)
  File "/srv/home/kaur/pointer_summarizer/training_ptr_gen/model.py", line 108, in forward
    word_embed_proj = self.tx_proj(embedded)
  File "/srv/home/kaur/pointer_summarizer/lstmpg/lib/python3.6/site-packages/torch/nn/modules/module.py", line 779, in __getattr__
    type(self).__name__, name))
torch.nn.modules.module.ModuleAttributeError: 'Encoder' object has no attribute 'tx_proj'

Any help regarding this would be appreciated, thanks.
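For context, the traceback shows `Encoder.forward` referencing `self.tx_proj`, an attribute that `__init__` apparently never registers on that code path, which is exactly when `nn.Module.__getattr__` raises this error. The sketch below reproduces the failure mode and the usual fix (registering the submodule in `__init__`); the class names, dimensions, and the `nn.Linear` choice for `tx_proj` are assumptions for illustration, not the repository's actual code.

```python
import torch
import torch.nn as nn

class BrokenEncoder(nn.Module):
    """Reproduces the bug: forward() uses a submodule __init__ never created."""
    def __init__(self, vocab_size=100, emb_dim=8, hidden_dim=16):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim)
        # Note: self.tx_proj is never assigned here.

    def forward(self, x):
        embedded = self.embedding(x)
        return self.tx_proj(embedded)  # AttributeError: no attribute 'tx_proj'

class FixedEncoder(nn.Module):
    """The fix: register tx_proj in __init__ so forward() can find it."""
    def __init__(self, vocab_size=100, emb_dim=8, hidden_dim=16):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim)
        self.tx_proj = nn.Linear(emb_dim, hidden_dim)  # hypothetical projection

    def forward(self, x):
        embedded = self.embedding(x)
        return self.tx_proj(embedded)

tokens = torch.randint(0, 100, (2, 5))
try:
    BrokenEncoder()(tokens)
except AttributeError as e:
    print("broken:", e)
print("fixed output shape:", FixedEncoder()(tokens).shape)
```

In the actual repository the fix would mean ensuring the transformer branch of `Encoder.__init__` defines `tx_proj` (or whichever projection the transformer path needs) whenever `use_lstm` is `False`.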

v-chuqin commented 3 years ago

You can set use_lstm=True to handle this issue.

jivatneet commented 3 years ago

Thanks for the reply @v-chuqin, but I wanted to use the transformer-based encoder and hence set use_lstm=False. With use_lstm=True, it works fine.

Tinarights commented 2 years ago

How did you solve that? I want to use the transformer encoder and I got the same error.