Closed EricPeter closed 3 years ago
Hello! What's the code that triggers this error?
import torch
loaded_model = torch.load("mt_luganda.pt", map_location=torch.device('cpu'))
Are you sure? That's unrelated to transformers or huggingface, yet I do see a transformers cache error in your issue title.
I am trying to load that model on another machine.
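For context (a hedged sketch, not from the thread itself): loading a fully pickled model object on another machine only works if the model's class is importable under the same module path there, because pickle serializes a *reference* to the class, not the class's code. That is often the root cause of "works here, fails there" load errors. Plain stdlib `pickle` shows the same constraint; `LugandaTranslator` is a hypothetical stand-in class:

```python
import pickle

class LugandaTranslator:
    """Hypothetical stand-in for the real model class."""
    def __init__(self, weights):
        self.weights = weights

# Pickling stores "where to find the class" (module + name), not its definition.
blob = pickle.dumps(LugandaTranslator(weights=[0.1, 0.2]))

# The serialized stream records the class name; unpickling on another
# machine re-imports that exact module path, so it must exist there.
assert b"LugandaTranslator" in blob

restored = pickle.loads(blob)
print(restored.weights)  # data round-trips only because the class is importable here
```

This is why saving just the weights (a state dict) and reconstructing the model from code on the target machine is generally more portable than pickling the whole object.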
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.
Please note that issues that do not follow the contributing guidelines are likely to be ignored.
Same problem.
Any solution?
Same issue. Trying to load a pickled tokenizer inside of a Docker container:
import pickle
with open("t5-base_tokenizer.pkl", "rb") as f:
    tok = pickle.load(f)
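A hedged note on the Docker case: a pickled object can carry absolute filesystem paths (for example a cache directory) captured on the host at pickle time, and those paths usually do not exist inside the container. A minimal stdlib illustration of that failure pattern, with a hypothetical `CachedTokenizer` class and path:

```python
import os
import pickle

class CachedTokenizer:
    """Hypothetical stand-in for any object that remembers where its files lived."""
    def __init__(self, cache_dir):
        self.cache_dir = cache_dir  # absolute host path baked in at pickle time

    def cache_exists(self):
        # True on the original machine; typically False inside a fresh container.
        return os.path.exists(self.cache_dir)

tok = CachedTokenizer("/home/hostuser/.cache/huggingface")
blob = pickle.dumps(tok)

restored = pickle.loads(blob)
# The stale host path travels with the pickle unchanged:
print(restored.cache_dir)
```

This is why re-creating a tokenizer from its saved files on the target machine, rather than unpickling a live object, tends to be more robust across environments.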
This happens when I try to load the model on another device.
Environment info
transformers version:

Who can help
Information
Model I am using (Bert, XLNet ...):
The problem arises when using:
The tasks I am working on is:
To reproduce
Steps to reproduce the behavior:
1.
2.
3.
Expected behavior