tranquoctrinh / transformer

This is a PyTorch implementation of the Transformer model in the paper Attention is All You Need

Size error when testing my own dataset #2

Closed: Sherww closed this issue 1 year ago

Sherww commented 2 years ago

Hi, thanks for sharing the code. I fine-tuned the model on my own dataset, but when I run evaluate.py I get this error:

```
Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
Done load model on the cpu device
  0%|          | 1/6665 [00:00<46:19,  2.40it/s]
Traceback (most recent call last):
  File "D:\remote_download\sher\parser\completion\transformer-main\evaluate.py", line 158, in <module>
    main()
  File "D:\remote_download\sher\parser\completion\transformer-main\evaluate.py", line 154, in main
    bleus = calculate_bleu_score(model, source_tokenizer, target_tokenizer, configs)
  File "D:\remote_download\sher\parser\completion\transformer-main\evaluate.py", line 130, in calculate_bleu_score
    pred_trg = translate(model, sentence, source_tokenizer, target_tokenizer, configs["target_max_seq_len"], configs["beam_size"], device)
  File "D:\remote_download\sher\parser\completion\transformer-main\evaluate.py", line 57, in translate
    encoder_output = model.encoder.forward(source_tensor, source_mask)
  File "D:\remote_download\sher\parser\completion\transformer-main\models.py", line 169, in forward
    x = self.position_embedding(x)
  File "D:\Anaconda3\lib\site-packages\torch\nn\modules\module.py", line 1110, in _call_impl
    return forward_call(*input, **kwargs)
  File "D:\remote_download\sher\parser\completion\transformer-main\models.py", line 35, in forward
    x = x + pe
RuntimeError: The size of tensor a (148) must match the size of tensor b (128) at non-singleton dimension 1
```

I don't know how to solve this error; could you give me some suggestions? Thanks.
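(Editor's note: the traceback suggests the tokenized source sentence (148 tokens) is longer than the fixed positional-encoding table (128 positions), so the addition `x = x + pe` fails. A minimal sketch of the mismatch and one possible workaround, assuming a table built for `max_seq_len = 128`; the variable names here are illustrative, not the repository's code:)

```python
import torch

# Hypothetical illustration of the shape mismatch, not the repository's code.
# Assumes the positional-encoding table was built for max_seq_len = 128.
max_seq_len, d_model = 128, 512
pe = torch.zeros(1, max_seq_len, d_model)   # fixed-size positional table

x = torch.randn(1, 148, d_model)            # a sentence of 148 tokens
# x = x + pe                                # RuntimeError: 148 vs 128 at dim 1

# One possible workaround: cap the input at the table's length,
# then add only as many positions as the (truncated) input has.
x = x[:, :max_seq_len, :]
x = x + pe[:, :x.size(1), :]
```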

tranquoctrinh commented 2 years ago

Hi Sherww, can you pull the latest code and check whether the error is fixed?
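(Editor's note: the actual change in the repository is not shown in this thread. For readers hitting the same error, a sketch of how a sinusoidal positional-embedding layer can be made robust to variable-length inputs; the class and argument names are assumptions, not necessarily the repository's fix:)

```python
import math
import torch
import torch.nn as nn


class PositionalEmbedding(nn.Module):
    """Sketch of a sinusoidal positional embedding that tolerates inputs
    shorter than max_seq_len and fails with a clear message when they are longer."""

    def __init__(self, d_model: int, max_seq_len: int = 512, dropout: float = 0.1):
        super().__init__()
        self.dropout = nn.Dropout(dropout)
        # Precompute the sinusoidal table once: shape (1, max_seq_len, d_model).
        pe = torch.zeros(max_seq_len, d_model)
        position = torch.arange(0, max_seq_len, dtype=torch.float).unsqueeze(1)
        div_term = torch.exp(
            torch.arange(0, d_model, 2).float() * (-math.log(10000.0) / d_model)
        )
        pe[:, 0::2] = torch.sin(position * div_term)
        pe[:, 1::2] = torch.cos(position * div_term)
        self.register_buffer("pe", pe.unsqueeze(0))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Slice the table to the actual sequence length instead of assuming
        # the input is exactly max_seq_len tokens long.
        seq_len = x.size(1)
        if seq_len > self.pe.size(1):
            raise ValueError(
                f"Input length {seq_len} exceeds max_seq_len {self.pe.size(1)}; "
                "increase max_seq_len or truncate the input before encoding."
            )
        x = x + self.pe[:, :seq_len]
        return self.dropout(x)
```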

Sherww commented 2 years ago

This solution works for me, thanks!