Closed Tejaswi-kashyap-006 closed 7 months ago
Yes, sure.

You can load the model together with its adapter via:

```python
from peft import AutoPeftModelForCausalLM

model = AutoPeftModelForCausalLM.from_pretrained("your-adapter-repo")
```

and use it directly for inference.
If you wish to merge the adapter into the base model, do:

```python
model = model.merge_and_unload()
```

Now the adapter weights are folded into the base model, so it behaves like a plain model with the fine-tuned weights applied.
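For intuition, both options compute the same outputs: keeping the adapter on top evaluates the base path plus a low-rank path, while merging folds that low-rank update into the base weight once (conceptually `W' = W + (alpha/r) * B @ A`). Here is a tiny NumPy sketch of that equivalence; it is not PEFT's actual implementation, and the dimensions, rank, and alpha are made-up illustration values:

```python
import numpy as np

rng = np.random.default_rng(0)
d, r, alpha = 8, 2, 16        # hypothetical: hidden size, LoRA rank, scaling alpha

W = rng.normal(size=(d, d))   # frozen base weight
A = rng.normal(size=(r, d))   # LoRA down-projection
B = rng.normal(size=(d, r))   # LoRA up-projection (after training it is nonzero)

x = rng.normal(size=(d,))     # a dummy input vector
scale = alpha / r

# Option 1: keep the adapter on top of the base model at inference time
y_adapter = W @ x + scale * (B @ (A @ x))

# Option 2: merge once, then run a plain forward pass
W_merged = W + scale * (B @ A)
y_merged = W_merged @ x

# The two paths give the same result up to floating-point error
assert np.allclose(y_adapter, y_merged)
```

So the choice is mostly practical: keep the adapter separate if you want to swap adapters over one shared base model; merge if you want a single standalone checkpoint and slightly faster inference.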
Thanks @DRXD1000
I trained the model with SFT on a custom dataset using a LoRA config, which produced a LoRA adapter. Can we run inference with the base model and this adapter on top of it, or should we merge them?