JhonDan1999 closed this issue 9 months ago
The models we uploaded are Parameter-Efficient Fine-Tuning (PEFT) checkpoints.
For example, to use prompteol-opt-1.3b, you need to load opt-1.3b first, and then load the PEFT checkpoint on top of it:
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

# Load the base model; the PEFT checkpoint is downloaded automatically.
tokenizer = AutoTokenizer.from_pretrained("facebook/opt-1.3b")
model = AutoModelForCausalLM.from_pretrained("facebook/opt-1.3b")
tokenizer.pad_token_id = 0
tokenizer.padding_side = "left"
model = PeftModel.from_pretrained(model, "royokong/prompteol-opt-1.3b", torch_dtype=torch.float16)
```
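Once the checkpoint is loaded, sentence embeddings are typically taken from the hidden state of the last token of the (prompted) input. Because `tokenizer.padding_side = "left"` puts padding at the start of each sequence, that last real token always sits at position `-1` in a batch. A minimal sketch of just this extraction step, using a dummy tensor in place of the model's real hidden states (the prompt template and variable names here are illustrative assumptions, not taken from the repo):

```python
import torch

# Dummy stand-in for outputs.hidden_states[-1] from a forward pass with
# output_hidden_states=True: (batch, sequence_length, hidden_dim).
batch_size, seq_len, hidden_dim = 2, 8, 16
hidden_states = torch.randn(batch_size, seq_len, hidden_dim)

# With left padding, every sequence ends at index -1, so the sentence
# embedding is simply the final position's hidden state.
embeddings = hidden_states[:, -1, :]
print(embeddings.shape)  # torch.Size([2, 16])
```

With right padding this indexing would instead have to locate each sequence's last non-pad token via the attention mask, which is why the example above sets `padding_side = "left"`.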
I am trying to use the model, but it gives me this error:
![Screenshot 2023-10-01 at 7 01 02 PM](https://github.com/kongds/scaling_sentemb/assets/145554661/4a9a32a0-2839-42e6-afff-c9fa83be2de9)
Can you please help me with the correct way to use the model with Hugging Face?