NanshaNansha opened 3 weeks ago
@NanshaNansha To load a model locally with the PeftModel class, you need to ensure that the base model weights and the adapter files are available locally. Try these steps and let me know if it works.
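As a quick sanity check before loading, you can verify that the adapter directory actually contains the files PEFT expects (the file names below are the PEFT defaults, `adapter_config.json` plus the adapter weights; adjust if yours differ). This helper is just an illustrative sketch, not part of the PEFT API:

```python
import os

def has_local_adapter(adapter_dir):
    """Return True if adapter_dir looks like a saved PEFT adapter directory."""
    if not os.path.isdir(adapter_dir):
        return False
    names = set(os.listdir(adapter_dir))
    has_config = "adapter_config.json" in names
    # Adapter weights are saved as safetensors by default, or as a .bin file.
    has_weights = any(
        n in names for n in ("adapter_model.safetensors", "adapter_model.bin")
    )
    return has_config and has_weights
```

If this returns False for your local directory, `from_pretrained` will fall back to treating the path as a Hub repo id and fail offline.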
```shell
pip install transformers peft
```
```python
from transformers import AutoModelForCausalLM
from peft import PeftModel

base_model_path = 'path/to/your/base_model'
model_id = 'path/to/your/local_model_directory'
cache_dir = 'path/to/your/cache_directory'

# PeftModel.from_pretrained expects a model instance as its first argument,
# not a path, so load the base model first.
base_model = AutoModelForCausalLM.from_pretrained(base_model_path)
model = PeftModel.from_pretrained(base_model, model_id, cache_dir=cache_dir)
```
```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained(base_model_path)

input_text = "Your input text here"
inputs = tokenizer(input_text, return_tensors="pt")
outputs = model(**inputs)
print(outputs)
```
Let me know if it works.
Thanks