jupyterlab / jupyter-ai

A generative AI extension for JupyterLab
https://jupyter-ai.readthedocs.io/
BSD 3-Clause "New" or "Revised" License

Enable passing a temperature parameter in chat #1061

Open srdas opened 2 weeks ago

srdas commented 2 weeks ago

The chat interface does not let the user pass a temperature parameter to the LLM, nor does it reveal the default temperature for the model in use. One use case that science/research users have requested is to expose the temperature parameter in Settings so it can be changed. This would aid experimentation: the temperature parameter controls the degree of randomness in the LLM's responses. Some chat interfaces already expose temperature in a separate panel of the interface.

Proposed approach to enabling this in Jupyter AI: add a float field for "Temperature" to the Settings interface below the Region name:

[Screenshot: mockup of a "Temperature" float field in the Settings panel]

This will show the current temperature parameter and allow the user to change it.

An additional option is to let the user append the temperature to a query/prompt as a suffix, e.g. --temperature <float(0,1)> or -t <float(0,1)>, noting of course that the temperature parameter can only be in the range (0,1).

ykharkov commented 2 weeks ago

Yes, this would be a great enhancement! I'd also suggest adding "top-k" and "top-p" parameters.
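For context on the suggestion above: top-k and top-p restrict the pool of candidate tokens before the temperature-scaled draw. A toy illustration (not provider code) of the filtering step:

```python
# Toy sketch: keep the k most likely tokens, then the smallest prefix
# whose cumulative probability reaches top_p, and renormalize.


def filter_top_k_top_p(probs: dict[str, float], top_k: int, top_p: float) -> dict[str, float]:
    ranked = sorted(probs.items(), key=lambda kv: kv[1], reverse=True)[:top_k]
    kept, cumulative = [], 0.0
    for token, p in ranked:
        kept.append((token, p))
        cumulative += p
        if cumulative >= top_p:
            break
    total = sum(p for _, p in kept)
    return {token: p / total for token, p in kept}
```

Exposing all three (temperature, top-k, top-p) in the same Settings section would keep the sampling controls in one place.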