Open twalderman opened 3 months ago
BMO Chatbot fetches from the chat completion API.
It does not look like min_p is added to the list of parameters yet (https://github.com/ollama/ollama/blob/main/docs/modelfile.md#parameter).
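For context, here is a minimal sketch of how a `min_p` sampling option could be passed to Ollama's `/api/chat` endpoint once the parameter is supported. The payload shape follows Ollama's REST API conventions; the `min_p` key itself is the parameter under discussion and is an assumption here (servers without support will reject or ignore it).

```python
# Sketch: building a /api/chat request body with custom sampling options,
# including the not-yet-merged min_p parameter (assumed key name).

def build_chat_payload(model, prompt, min_p=0.05, temperature=0.8):
    """Build an Ollama /api/chat request body with sampling options."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
        "options": {
            # min_p keeps only tokens whose probability is at least
            # min_p * p(top token); 0.0 disables the filter.
            "min_p": min_p,
            "temperature": temperature,
        },
    }

payload = build_chat_payload("llama3:latest", "Say hello.", min_p=0.1)

# To actually send it (requires a running Ollama server):
# import json, urllib.request
# req = urllib.request.Request(
#     "http://localhost:11434/api/chat",
#     data=json.dumps(payload).encode(),
#     headers={"Content-Type": "application/json"},
# )
# print(urllib.request.urlopen(req).read().decode())
```

The network call is left commented out so the sketch stands on its own without a running server.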
They haven't released the merge yet. I jumped the gun; please hold onto it.
Enjoy this:
https://chat.openai.com/g/g-zbfRNM72V-endless-zork
I compiled this fork of Ollama, https://github.com/nathanpbell/ollama, which has min_p. It is still waiting to be merged into main, but the benefit is worthwhile. I call the parameter from my system prompt. Please consider exposing it in the UI: at the very least it will be ignored, and at best it's tweakable with this fork. Try it out; the smoothing is very nice.
I'm suddenly getting an error, "ResponseError: invalid options: min_p"
1) Upgraded to the latest release (v2.1.1)
2) Reset settings
3) Reset the Ollama REST API URL
4) Selected a model (either phi3:medium or llama3:latest); doesn't seem to matter
5) Left min_p at 0.00 or raised it to 0.10; doesn't seem to matter
@Spatacoli

The min_p parameter is not exposed in Ollama at this moment. I believe it will be merged soon (https://github.com/ollama/ollama/pull/1825#issuecomment-2101861720).

Ignore the errors for now. I think other users are currently running a forked version of Ollama that supports the min_p parameter, or running it with llama.cpp via Ollama, which already supports min_p.
Thank you, but I'm not getting any output, just that error. How can I ignore it?
Are you running on a standard Ollama server? I only received a warning which states, "invalid option provided."
Perhaps @twalderman or someone else can chime in since I have not made any custom changes to the server that would throw an error other than a warning.
Otherwise, we would have to wait for Ollama to support the min_p parameter, or I will need to remove the parameter until it is fully supported.
Interesting. I'm running a standard Ollama server. I'll try @twalderman's changes and see if that works, or I'll just wait until min_p is supported.
Oh... sorry I was on version 0.1.34 of Ollama. I just upgraded to 0.1.38 and it works. Thank you for your support.
I was just about to share with you this comment (https://github.com/longy2k/obsidian-bmo-chatbot/issues/71#issuecomment-2130382799) of the same issue.
Thank you for letting me know. I will tell the other user to update Ollama 😁
Ollama added min_p: https://github.com/ollama/ollama/pull/1825
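With the PR merged, the parameter should also be settable from a Modelfile, following the `PARAMETER` syntax in the modelfile docs linked earlier. A minimal sketch (the base model and value are illustrative only):

```
FROM llama3:latest
# Keep only tokens whose probability is at least 5% of the top token's.
PARAMETER min_p 0.05
```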