longy2k / obsidian-bmo-chatbot

Generate and brainstorm ideas while creating your notes using Large Language Models (LLMs) from Ollama, LM Studio, Anthropic, OpenAI, Mistral AI, and more for Obsidian.
https://ko-fi.com/longy2k
MIT License

Feature Request: Support for Model settings #42

Open twalderman opened 5 months ago

twalderman commented 5 months ago

Support Min-P, Repeat Penalty, Repeat Penalty Tokens

longy2k commented 5 months ago

v1.8.4

I have not tested all the parameters, but let me know if this works for your use case.

Thanks

EDIT: I noticed the regen/edit is not as random as I want it to be. I set the seed parameter's default value to '0', which caused the model to generate the same response for the same prompt. I will clear the seed parameter tomorrow.

longy2k commented 5 months ago

v1.8.5

Cleared the seed parameter default value.

You may still have to go to Ollama Local LLMS > Advanced Settings > seed and clear the input field.
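To illustrate why the fix works: a fixed seed makes sampling deterministic, so omitting it from the request restores varied output. A minimal sketch of building an Ollama `/api/generate`-style request body, assuming the `options.seed` field; the helper name is hypothetical:

```python
def build_generate_payload(model, prompt, seed=None):
    # Only include options.seed when the user has actually set one;
    # a cleared seed field should be omitted entirely, not sent as 0.
    payload = {"model": model, "prompt": prompt}
    if seed is not None:
        payload["options"] = {"seed": seed}
    return payload
```

With `seed=None` (the cleared input field), the server picks its own seed each time and responses vary between regenerations.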

twalderman commented 4 months ago

It would be good if there were a way to save presets of Ollama settings.

twalderman commented 4 months ago

Min_P is not yet supported by Ollama. I have put in a feature request for it with the Ollama project.

More info:

https://www.reddit.com/r/LocalLLaMA/comments/17vonjo/your_settings_are_probably_hurting_your_model_why/
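For reference, Min-P (as described in the linked thread) keeps only tokens whose probability is at least `min_p` times that of the most likely token, then renormalizes. A rough sketch of the idea, not any particular library's implementation:

```python
def min_p_filter(probs, min_p=0.05):
    # Scale the cutoff by the top token's probability: a confident
    # distribution prunes aggressively, a flat one keeps more options.
    threshold = min_p * max(probs)
    kept = [p if p >= threshold else 0.0 for p in probs]
    total = sum(kept)
    # Renormalize the surviving probabilities so they sum to 1.
    return [p / total for p in kept]
```

Unlike a fixed Top-P cutoff, the threshold adapts to how peaked the distribution is, which is the argument the Reddit post makes for it.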