longy2k / obsidian-bmo-chatbot

Generate and brainstorm ideas while creating your notes using Large Language Models (LLMs) from Ollama, LM Studio, Anthropic, OpenAI, Mistral AI, and more for Obsidian.
https://ko-fi.com/longy2k
MIT License

Feature Request - Expose Min_p in Ollama Parameters #62

Open twalderman opened 3 months ago

twalderman commented 3 months ago

Ollama added min_p: https://github.com/ollama/ollama/pull/1825
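For context, min-p sampling (as proposed in that PR) keeps only tokens whose probability is at least `min_p` times the probability of the most likely token, then renormalizes before sampling. A minimal illustrative sketch, not Ollama's actual implementation:

```typescript
// Illustrative sketch of min-p filtering over a normalized distribution.
// Tokens below minP * (top token's probability) are dropped, and the
// remaining mass is renormalized before sampling.
function minPFilter(probs: number[], minP: number): number[] {
  const maxProb = Math.max(...probs);
  const threshold = minP * maxProb;
  const kept = probs.map((p) => (p >= threshold ? p : 0));
  const total = kept.reduce((a, b) => a + b, 0);
  return kept.map((p) => p / total);
}

// Example: with min_p = 0.1, any token less than 10% as likely as the
// top token is removed.
console.log(minPFilter([0.5, 0.3, 0.15, 0.04, 0.01], 0.1));
// -> roughly [0.526, 0.316, 0.158, 0, 0]
```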

longy2k commented 3 months ago

BMO Chatbot fetches from the chat completion API.

It does not look like min_p is added to the list of parameters yet (https://github.com/ollama/ollama/blob/main/docs/modelfile.md#parameter).
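For reference, here is roughly what a chat completion request to Ollama's /api/chat endpoint looks like; sampler settings travel under `options`, which is where `min_p` would go once upstream Ollama accepts it (the URL and values below are illustrative):

```typescript
// Sketch of a chat completion request against a local Ollama server.
// (Run inside an async context or an ES module with top-level await.)
const response = await fetch("http://localhost:11434/api/chat", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    model: "llama3",
    messages: [{ role: "user", content: "Hello" }],
    stream: false,
    options: {
      temperature: 0.8,
      // min_p: 0.05, // not accepted by upstream Ollama yet
    },
  }),
});
const data = await response.json();
console.log(data.message?.content);
```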

twalderman commented 3 months ago

They haven't released the merge yet. I jumped the gun. Please hold onto this request for now.

Enjoy this:

https://chat.openai.com/g/g-zbfRNM72V-endless-zork

twalderman commented 3 months ago

I compiled this fork of Ollama, https://github.com/nathanpbell/ollama, which has min_p; it is still waiting to be merged into main. The benefit is worthwhile. I set the parameter from my system prompt. Please consider exposing it in the UI: at the very least it will be ignored, and at best it's tweakable with this fork. Try it out. The smoothing is very nice.
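Not the plugin's actual code, but a rough sketch of what exposing it in the settings UI could look like with Obsidian's `Setting` API (the `minP` field and surrounding wiring are invented for illustration; this fragment would live inside a `PluginSettingTab.display()` method, where `containerEl` and `this.plugin` are in scope):

```typescript
import { Setting } from "obsidian";

// Hypothetical settings-tab fragment: a slider for min_p from 0 (disabled)
// to 1, persisted in the plugin's saved settings.
new Setting(containerEl)
  .setName("min_p")
  .setDesc("Drop tokens less likely than min_p times the top token (0 disables).")
  .addSlider((slider) =>
    slider
      .setLimits(0, 1, 0.01)
      .setValue(this.plugin.settings.minP)
      .setDynamicTooltip()
      .onChange(async (value: number) => {
        this.plugin.settings.minP = value;
        await this.plugin.saveSettings();
      })
  );
```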

Spatacoli commented 1 month ago

I'm suddenly getting an error, "ResponseError: invalid options: min_p"

1. Upgraded to the latest release (v2.1.1).
2. Reset settings.
3. Reset the Ollama REST API URL.
4. Selected a model (either phi3:medium or llama3:latest); it doesn't seem to matter.
5. Left min_p at 0.00 or raised it to 0.10; it doesn't seem to matter.
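For the record, this failure is reproducible outside the plugin: on an Ollama build that predates min_p support, a request that includes the key under `options` can be rejected outright. A sketch, assuming the default local server address:

```typescript
// Minimal repro sketch: some older Ollama servers reject unknown option keys.
// (Run inside an async context or an ES module with top-level await.)
const res = await fetch("http://localhost:11434/api/chat", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    model: "phi3:medium",
    messages: [{ role: "user", content: "test" }],
    stream: false,
    options: { min_p: 0.1 },
  }),
});
// On affected versions this prints an error body along the lines of
// {"error":"invalid options: min_p"} instead of a chat response.
console.log(res.status, await res.text());
```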

longy2k commented 1 month ago

@Spatacoli

The min_p parameter is not exposed in Ollama at this moment. I believe it will be merged soon (https://github.com/ollama/ollama/pull/1825#issuecomment-2101861720).

Ignore the errors for now. I think other users are currently running a forked version of Ollama that supports the min_p parameter, or running it with llama.cpp via Ollama, which already supports min_p.

Spatacoli commented 1 month ago

Thank you, but I'm not getting any output, just that error. How can I ignore it?

longy2k commented 1 month ago

Are you running a standard Ollama server? I only received a warning that states, "invalid option provided."

Perhaps @twalderman or someone else can chime in, since I have not made any custom changes to the server; on my end this only produces a warning rather than an error.

Otherwise, we would have to wait for Ollama to support the min_p parameter, or I will need to remove the parameter until it is fully supported.
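A sketch of that defensive option, assuming the plugin assembles its Ollama options in one place (the names here are hypothetical): only send `min_p` when the user has actually set it, so servers that do not recognize the key never see it.

```typescript
// Hypothetical helper: build the options payload, omitting min_p entirely
// unless the user has set a nonzero value. Servers without min_p support
// then never receive the key.
interface BMOSettings {
  temperature: number;
  minP?: number;
}

function buildOllamaOptions(settings: BMOSettings): Record<string, number> {
  const options: Record<string, number> = {
    temperature: settings.temperature,
  };
  if (settings.minP !== undefined && settings.minP > 0) {
    options.min_p = settings.minP;
  }
  return options;
}
```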

Spatacoli commented 1 month ago

Interesting. I'm running a standard Ollama server. I'll try @twalderman's changes and see if that works, or I'll just wait until min_p is supported.

Spatacoli commented 1 month ago

Oh... sorry, I was on version 0.1.34 of Ollama. I just upgraded to 0.1.38 and it works now. Thank you for your support.

longy2k commented 1 month ago

I was just about to share this comment with you (https://github.com/longy2k/obsidian-bmo-chatbot/issues/71#issuecomment-2130382799), which reports the same issue.

Thank you for letting me know. I will tell the other user to update Ollama 😁