Closed: MuhammedTech closed this issue 3 years ago
That looks strange to me:
input_ids = tokenizer(text, return_tensors="pt")['input_ids'].to(device)
out = model.generate(**input_ids,
Try maybe:
input_ids = tokenizer(text, return_tensors="pt").to(device)
out = model.generate(**input_ids)
Here the tokenizer returns a BatchEncoding (a dict with input_ids and attention_mask), so it is unpacked with ** into generate(). In your version, input_ids is a single tensor, and unpacking a tensor with ** raises an error.
I completed the training and saved the model and tokenizer like the following:
model.save_pretrained("/content/drive/MyDrive/gpt_sentiment/model_rugpt3-trainer", push_to_hub=False)
tokenizer.save_pretrained("/content/drive/MyDrive/gpt_sentiment/model_rugpt3-trainer", push_to_hub=False)
Then I am loading the model:
Here I am getting:
How can I properly load it and get a prediction for my text?
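Assuming the directory above contains the files written by save_pretrained() (config.json, the weights, and the tokenizer files), a minimal loading-and-generation sketch might look like the following. The model class is inferred via the Auto* classes (rugpt3 is a GPT-2-style causal LM, so AutoModelForCausalLM should resolve correctly); the generation settings here are illustrative, not taken from your training script:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

def load_and_generate(model_dir: str, prompt: str, max_new_tokens: int = 50) -> str:
    """Load a causal LM saved with save_pretrained() and generate a continuation."""
    device = "cuda" if torch.cuda.is_available() else "cpu"
    # The Auto* classes read config.json from the directory, so the
    # architecture used at training time is restored automatically.
    tokenizer = AutoTokenizer.from_pretrained(model_dir)
    model = AutoModelForCausalLM.from_pretrained(model_dir).to(device)
    model.eval()
    # tokenizer(...) returns a BatchEncoding (a dict), so it is unpacked
    # with ** into generate(), passing both input_ids and attention_mask.
    inputs = tokenizer(prompt, return_tensors="pt").to(device)
    with torch.no_grad():
        out = model.generate(**inputs, max_new_tokens=max_new_tokens, do_sample=False)
    return tokenizer.decode(out[0], skip_special_tokens=True)

# With your saved checkpoint:
# print(load_and_generate("/content/drive/MyDrive/gpt_sentiment/model_rugpt3-trainer",
#                         "your prompt here"))
```

from_pretrained() accepts either a local directory or a Hub model id, so the same function works for both.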