SapienzaNLP / ewiser

A Word Sense Disambiguation system integrating implicit and explicit external knowledge.

Loading checkpoints #13

Closed. kanishkamisra closed this issue 3 years ago.

kanishkamisra commented 3 years ago

How does one load the pre-trained checkpoints?

I tried loading it with embs = torch.load('ewiser.semcor+wngt.pt', map_location='cpu'), but it raises an error:

---------------------------------------------------------------------------
ModuleNotFoundError                       Traceback (most recent call last)
<ipython-input-2-123918dcfdcf> in <module>
----> 1 embs = torch.load('ewiser.semcor+wngt.pt', map_location = 'cpu')

~/miniconda3/lib/python3.8/site-packages/torch/serialization.py in load(f, map_location, pickle_module, **pickle_load_args)
    606                     return torch.jit.load(opened_file)
    607                 return _load(opened_zipfile, map_location, pickle_module, **pickle_load_args)
--> 608         return _legacy_load(opened_file, map_location, pickle_module, **pickle_load_args)
    609
    610

~/miniconda3/lib/python3.8/site-packages/torch/serialization.py in _legacy_load(f, map_location, pickle_module, **pickle_load_args)
    785     unpickler = pickle_module.Unpickler(f, **pickle_load_args)
    786     unpickler.persistent_load = persistent_load
--> 787     result = unpickler.load()
    788
    789     deserialized_storage_keys = pickle_module.load(f, **pickle_load_args)

ModuleNotFoundError: No module named 'qbert'

mbevila commented 3 years ago

Check out https://github.com/SapienzaNLP/ewiser/blob/master/ewiser/spacy/disambiguate.py.
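Loading through the spaCy plugin looks roughly like the sketch below, based on the repo README. Names such as the lang argument, the 'wsd' component name, and the ._.offset extension are taken from that README and may differ slightly from the current code, so check disambiguate.py for the exact interface.

```python
# Minimal sketch: load the EWISER checkpoint via the spaCy plugin instead of
# a plain torch.load, which fails because the checkpoint pickles internal classes.
from spacy import load
from ewiser.spacy.disambiguate import Disambiguator

# The Disambiguator handles unpickling the checkpoint internally.
wsd = Disambiguator('ewiser.semcor+wngt.pt', lang='en').eval()
wsd = wsd.to('cpu')  # or 'cuda' if a GPU is available

nlp = load('en_core_web_sm', disable=['parser', 'ner'])
wsd.enable(nlp, 'wsd')  # register EWISER as a spaCy pipeline component

doc = nlp('The horse raced past the barn.')
for token in doc:
    if token._.offset:  # WordNet offset attached by the plugin
        print(token.text, token.lemma_, token.pos_, token._.offset)
```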

kanishkamisra commented 3 years ago

Thanks, I'll take a look. For some reason I mistakenly thought the .pt files were plain tensors rather than full model checkpoints. Thanks for the clarification!