lxe / simple-llm-finetuner

Simple UI for LLM Model Finetuning
MIT License

How can I use the finetuned model with text-generation-webui or KoboldAI? #16

Open Gitterman69 opened 1 year ago

Gitterman69 commented 1 year ago

How can I utilize a fine-tuned model with text-generation-webui or KoboldAI for generating text? What are the necessary steps to ensure the successful integration of the model with these interfaces, and are there any specific requirements or dependencies I need to be aware of?

lxe commented 1 year ago

I think you'll need to convert it using https://github.com/tloen/alpaca-lora/blob/main/export_hf_checkpoint.py. Let me figure it out and I'll add it to the readme.
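For reference, here is a minimal sketch of what such a conversion looks like with the `peft` library: it loads the base model, attaches the LoRA adapter produced by finetuning, merges the weights, and saves a plain Hugging Face checkpoint that text-generation-webui or KoboldAI can load from their models directory. This is not the exact `export_hf_checkpoint.py` script linked above; the model name and paths below are placeholders, so substitute the base model and adapter directory you actually used.

```python
# Sketch: merge a PEFT/LoRA adapter into its base model so the result can be
# loaded as a standard Hugging Face checkpoint.
# "decapoda-research/llama-7b-hf", "./lora-output", and "./merged-model" are
# illustrative placeholders.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_model_name = "decapoda-research/llama-7b-hf"  # hypothetical base model
lora_adapter_path = "./lora-output"                # hypothetical adapter directory
output_path = "./merged-model"

# Load the base model in fp16 on CPU to avoid GPU out-of-memory during the merge.
base_model = AutoModelForCausalLM.from_pretrained(
    base_model_name,
    torch_dtype=torch.float16,
    device_map={"": "cpu"},
)
tokenizer = AutoTokenizer.from_pretrained(base_model_name)

# Attach the LoRA weights, then fold them into the base weights.
model = PeftModel.from_pretrained(base_model, lora_adapter_path)
model = model.merge_and_unload()

# Save a regular checkpoint that can be dropped into the webui/KoboldAI models folder.
model.save_pretrained(output_path)
tokenizer.save_pretrained(output_path)
```

After saving, the merged folder can be placed in the other UI's models directory and loaded like any other Hugging Face model.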

Gitterman69 commented 1 year ago

Thanks lad!

kotykd commented 1 year ago

I would also like to know this.

Gitterman69 commented 1 year ago

Any updates?