WilliamKarolDiCioccio / open_local_ui

OpenLocalUI: Native desktop app for Windows, macOS, and Linux. Easily run Large Language Models locally, with no complex setup required. Inspired by OpenWebUI's simplicity for LLM use.

Model specific settings file #66

Closed Rossi1337 closed 1 week ago

Rossi1337 commented 1 week ago

Started preparation for this. I introduced a model-specific JSON settings file. It will be located at `<app data dir>/models/<model name>.json`, for example: `C:\Users\myuser\AppData\Roaming\OpenLocalUI\OpenLocalUI\models\llama3_latest.json`

It can be used to specify all Ollama-related options and additionally allows you to set a model-specific "system prompt". This is helpful when you want to customize the context length, temperature, system prompt, etc. per model.
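
For illustration, a hypothetical `llama3_latest.json` might look like the sketch below. `num_ctx` and `temperature` are real Ollama options, but the key names and layout here (including the system prompt key) are assumptions, not the actual file format:

```json
{
  "systemPrompt": "You are a concise assistant.",
  "options": {
    "num_ctx": 8192,
    "temperature": 0.7
  }
}
```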

For now, the file needs to be created manually. If it does not exist, we fall back to the default settings (see the sketch below). This also means that in the "Settings" page we would later specify the defaults, and then allow overriding them per model in the "Models" section. A UI to override these settings per model would be the next step; we should extend the dialog that currently shows the model attributes.
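
A rough sketch of that fallback behavior in Dart; the function name, the file-name sanitization, and the merge layout are illustrative assumptions, not the actual implementation:

```dart
import 'dart:convert';
import 'dart:io';

/// Load the settings for [modelName], falling back to [globalDefaults]
/// when no model-specific file exists.
Map<String, dynamic> loadModelSettings(
  String modelsDir,
  String modelName,
  Map<String, dynamic> globalDefaults,
) {
  // Assumes model names like "llama3:latest" map to file names like
  // "llama3_latest.json", as in the example path above.
  final file = File('$modelsDir/${modelName.replaceAll(':', '_')}.json');
  if (!file.existsSync()) {
    return Map.of(globalDefaults);
  }
  final overrides =
      jsonDecode(file.readAsStringSync()) as Map<String, dynamic>;
  // Per-model values win; anything not overridden keeps its default.
  return {...globalDefaults, ...overrides};
}
```

Merging the two maps this way keeps every option that is not explicitly overridden at its global default, which matches the defaults-then-override model described above.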

This is related to issue #12

WilliamKarolDiCioccio commented 1 week ago

Considering this, it might be wise to move the temperature and useGPU settings to the chat page; maybe we can show them along with the model selection dropdown in a chat settings overlay toggled by a button.
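
A minimal sketch of what that toggle could look like, assuming Material 3's `MenuAnchor`; the widget name and placeholder entries are illustrative, not the app's actual code:

```dart
import 'package:flutter/material.dart';

/// A button next to the model dropdown that toggles a small
/// chat settings overlay.
class ChatSettingsButton extends StatelessWidget {
  const ChatSettingsButton({super.key});

  @override
  Widget build(BuildContext context) {
    return MenuAnchor(
      builder: (context, controller, child) => IconButton(
        icon: const Icon(Icons.tune),
        tooltip: 'Chat settings',
        onPressed: () =>
            controller.isOpen ? controller.close() : controller.open(),
      ),
      menuChildren: [
        // Placeholder entries; the real overlay would host a temperature
        // slider and a useGPU switch wired to the app's settings state.
        MenuItemButton(child: const Text('Temperature')),
        MenuItemButton(child: const Text('Use GPU')),
      ],
    );
  }
}
```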

Rossi1337 commented 1 week ago

My idea would be to set the global defaults that are valid for all models there. Then you can override these settings per model. See #68