Rossi1337 closed this 1 week ago
Considering this, it might be wise to move the temperature and useGPU settings to the chat page. Maybe we can show those along with the model selection dropdown, in a chat settings overlay toggled by a button.
My idea would be to set the global defaults there, valid for all models. Then you can override these settings per model. See #68
Started preparation for this. I introduced a model-specific JSON settings file. It will be located at /models/&lt;model_name&gt;.json
for example: "C:\Users\myuser\AppData\Roaming\OpenLocalUI\OpenLocalUI\models\llama3_latest.json"
It can be used to specify all Ollama-related options and additionally allows you to specify a model-specific "system prompt". This is helpful when you want to customize the context length, temperature, system prompt, etc. per model.
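An illustrative example of what such a file might contain (the key names here are assumptions sketching the idea, not a final schema; the option names mirror Ollama's request options):

```json
{
  "systemPrompt": "You are a helpful assistant.",
  "options": {
    "temperature": 0.7,
    "num_ctx": 8192
  }
}
```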
For now the file needs to be created manually. If it does not exist, we use the default settings. This also means that in the "Settings" page we would later specify the "defaults", and then allow overriding things per model in the "Models" section. A UI to override these settings per model would then be next; we should extend the dialog that currently shows the model attributes.
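The fallback logic described above (per-model file overrides the global defaults, otherwise defaults apply) could be sketched roughly like this. This is a hypothetical Python sketch, not the actual implementation; the default values and the file-name sanitization (e.g. "llama3:latest" becoming "llama3_latest.json", as in the example path) are assumptions:

```python
import json
from pathlib import Path

# Assumed global defaults, as they might be configured on the "Settings" page.
GLOBAL_DEFAULTS = {"temperature": 0.8, "num_ctx": 2048, "useGPU": True}

def load_model_settings(models_dir: Path, model_name: str) -> dict:
    """Merge the per-model JSON file (if present) over the global defaults.

    Assumption: the model name is sanitized for the file name by replacing
    ":" with "_", matching the "llama3_latest.json" example.
    """
    settings = dict(GLOBAL_DEFAULTS)
    path = models_dir / (model_name.replace(":", "_") + ".json")
    if path.exists():
        # Per-model values override the defaults; missing keys keep defaults.
        settings.update(json.loads(path.read_text()))
    return settings
```

The merge keeps the defaults for any key the per-model file does not mention, so a file only needs to list the options it actually overrides.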
This is related to issue #12