facebookresearch / InferSent

InferSent sentence embeddings

Unpickling error following demo #122

Closed · jkamalu closed this issue 5 years ago

jkamalu commented 5 years ago

Hi there, I get an error when running the following demo code. Note that the model import works fine. I am using Python 3.6.8 and PyTorch 1.0.1.

```python
import torch
from models import InferSent  # the model import itself works fine

infer_sent = InferSent({
    'bsize': 64,
    'word_emb_dim': 300,
    'enc_lstm_dim': 2048,
    'pool_type': 'max',
    'dpout_model': 0.0,
    'version': 1
})
infer_sent.load_state_dict(torch.load("./InferSent/encoder/infersent1.pkl"))
```

Error with stack trace:

```
UnpicklingError                           Traceback (most recent call last)
<ipython-input> in <module>
      7     'version': 1
      8 })
----> 9 infer_sent.load_state_dict(torch.load("./InferSent/encoder/infersent1.pkl"))

~/anaconda3/envs/usc/lib/python3.6/site-packages/torch/serialization.py in load(f, map_location, pickle_module)
    366         f = open(f, 'rb')
    367     try:
--> 368         return _load(f, map_location, pickle_module)
    369     finally:
    370         if new_fd:

~/anaconda3/envs/usc/lib/python3.6/site-packages/torch/serialization.py in _load(f, map_location, pickle_module)
    530             f.seek(0)
    531
--> 532     magic_number = pickle_module.load(f)
    533     if magic_number != MAGIC_NUMBER:
    534         raise RuntimeError("Invalid magic number; corrupt file?")

UnpicklingError: invalid load key, '<'.
```

Can you offer any advice, or have you seen this problem with the demo code before? I've tried re-downloading the model file via `curl`, but to no avail. Thanks.
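For context, `invalid load key, '<'` means the very first byte pickle reads is `<`, which usually indicates that an HTML page (an error or redirect page, for instance) was saved in place of the binary model file. A minimal check, assuming the same path as in the demo snippet above:

```python
# Peek at the start of the downloaded file: a real torch/pickle file
# begins with binary magic bytes, while a saved HTML page starts with
# something like b'<!DOCTYPE html>' or b'<html'.
with open("./InferSent/encoder/infersent1.pkl", "rb") as f:
    print(f.read(64))
```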
jkamalu commented 5 years ago

By the way, I have also tried unpickling the .pkl state file myself, outside of PyTorch, with both Python 2 and Python 3, and I get the same error.

jkamalu commented 5 years ago

This is a duplicate of #110, which was closed but nonetheless unresolved. It looks like the problem is bad/outdated download links, so I'll keep this one open.

kanishkamisra commented 5 years ago

Hey! Were you able to resolve this issue? I'm facing the same error :(

marcoaleixo commented 5 years ago

The infersent.pkl file is empty. That is why pickle isn't able to open it. They need to re-upload it.
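An empty or truncated download is easy to confirm before involving pickle at all; a quick size check, assuming the path from the demo above:

```python
import os

# A healthy infersent1.pkl download is far larger than 0 bytes;
# a size of 0, or only a few KB, means the download itself failed
# (a few KB is almost certainly a saved HTML error page).
print(os.path.getsize("./InferSent/encoder/infersent1.pkl"))
```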

kanishkamisra commented 5 years ago

@marcoaleixo check this out: #127
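For anyone hitting this later: #127 points to the updated download links. A minimal re-download sketch in Python, where the URL is taken from the repository's current README and should be treated as an assumption (verify it against #127 before relying on it):

```python
import urllib.request

# Assumed URL per the updated links discussed in #127 / the README;
# urlretrieve follows HTTP redirects, so a moved file still downloads.
url = "https://dl.fbaipublicfiles.com/infersent/infersent1.pkl"
urllib.request.urlretrieve(url, "./InferSent/encoder/infersent1.pkl")
```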