Description
Using a locally hosted model (Llama 3.1) with the OpenAI backend, the temperature hardcoded by the Mattermost plugin (1.0) is too high. The model tends to invent new words and produce incorrect sentences in foreign languages. Using a temperature of 0.5 solves the issue.
It would be nice to be able to set the temperature along with the other model parameters.
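For reference, the OpenAI-compatible chat completions API already accepts a `temperature` field in the request body, so exposing it as a plugin setting would mainly mean threading a configured value through to the request. A minimal sketch of such a payload builder (the function name and default are illustrative assumptions, not the plugin's actual code):

```python
# Sketch: build an OpenAI-compatible chat completion request body with a
# configurable temperature instead of a hardcoded 1.0.
# build_completion_payload and its defaults are hypothetical.

def build_completion_payload(messages, model="llama3.1", temperature=0.5):
    """Assemble the JSON body for POST /v1/chat/completions."""
    return {
        "model": model,
        "messages": messages,
        # Lower temperature reduces the invented words / broken sentences
        # observed with Llama 3.1 at the hardcoded 1.0.
        "temperature": temperature,
    }

payload = build_completion_payload(
    [{"role": "user", "content": "Bonjour, comment ça va ?"}]
)
print(payload["temperature"])
```

The default of 0.5 here reflects the value that fixed the issue locally; ideally it would come from the plugin's model configuration rather than being hardcoded.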