lxe / simple-llm-finetuner

Simple UI for LLM Model Finetuning
MIT License

Inference output text keeps running on... #1

Open lxe opened 1 year ago

lxe commented 1 year ago

Model: Vanilla LLaMA

Input:

Why did the chicken cross the road?

Output:

Why did the chicken cross the road? To get to the other side.
Why did the chicken cross the road? To get to the other side. Why did the chicken cross the road? To get to the other side. Why did the chicken cross the road? To

Using text-generation-webui:

python server.py --load-in-8bit --listen --model llama-7B
Why did the chicken cross the road?? To get to the other side.
Why did the chicken cross the road? Because it was a free range chicken and it wanted to go home!

I need to tweak the inference code.
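Runaway repetition like this is usually addressed through the generation parameters rather than the model itself: in Hugging Face `transformers`, passing `repetition_penalty`, `eos_token_id`, and `max_new_tokens` to `model.generate(...)` typically stops the loop (text-generation-webui exposes the same knobs in its UI, which may explain the difference). As a sketch of what that penalty does, here is the CTRL-style rule `transformers` applies, reimplemented in plain Python on a toy logit vector. The function name and the 4-token vocabulary are illustrative, not part of this repo's code:

```python
def apply_repetition_penalty(logits, generated_ids, penalty=1.2):
    """CTRL-style repetition penalty: for every token id that has
    already been generated, divide its logit by `penalty` when the
    logit is positive and multiply when it is negative. Either way
    the logit moves down, so repeats become less likely."""
    out = list(logits)
    for tid in set(generated_ids):
        if out[tid] > 0:
            out[tid] /= penalty
        else:
            out[tid] *= penalty
    return out

# Toy vocabulary of 4 tokens; tokens 2 and 3 were already generated.
logits = [1.0, 2.0, 4.0, -1.0]
penalized = apply_repetition_penalty(logits, [2, 3], penalty=2.0)
# token 2: 4.0 / 2.0 = 2.0; token 3: -1.0 * 2.0 = -2.0
```

With `penalty=1.0` the logits pass through unchanged; values around 1.1-1.3 are the usual range before output starts to degrade.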

Gitterman69 commented 1 year ago

How did you get the finetuned model running in text-generation-webui? Please share your magic sauce :)