The chat interface does not let the user pass a temperature parameter to the LLM, nor does it reveal the default temperature of the model in use. One use case that science/research users have requested is to expose the temperature parameter in Settings and allow the user to change it. This would aid experimentation, since temperature controls the degree of randomness in the LLM's responses. Some chat interfaces already offer a temperature control in a separate panel.
Proposed approach to enabling this in Jupyter AI: add a float field labeled "Temperature" to the Settings interface, below the Region name field.
This field would display the current temperature and allow the user to change it.
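For illustration, here is a minimal sketch of how a value from such a settings field might be forwarded to the underlying model. Jupyter AI builds on LangChain, whose chat-model classes generally accept a `temperature` keyword; the `build_llm` helper and the shape of the settings dictionary are assumptions here, not existing Jupyter AI APIs.

```python
from langchain_openai import ChatOpenAI


def build_llm(settings: dict) -> ChatOpenAI:
    """Hypothetical sketch: forward a user-chosen temperature to the provider."""
    kwargs = {"model": settings["model_id"]}
    temperature = settings.get("temperature")
    # Fall back to the provider's default when the field is left blank.
    if temperature is not None:
        kwargs["temperature"] = float(temperature)
    return ChatOpenAI(**kwargs)
```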
An additional option is to let the user append the temperature to a query/prompt with a `--temperature <float>` or `-t <float>` flag, noting of course that the temperature parameter must lie in the range (0, 1).
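A minimal sketch of how the prompt-flag variant could be parsed; the function name and regex are illustrative, and the open-interval (0, 1) check follows the constraint stated above.

```python
import re

# Match a trailing --temperature/-t flag followed by a float literal.
FLAG = re.compile(r"\s(?:--temperature|-t)\s+(\d*\.?\d+)\s*$")


def split_temperature(prompt: str) -> tuple[str, float | None]:
    """Return (prompt_without_flag, temperature or None if no flag given)."""
    match = FLAG.search(prompt)
    if not match:
        return prompt, None
    value = float(match.group(1))
    if not 0 < value < 1:
        raise ValueError("temperature must be in the range (0, 1)")
    return prompt[: match.start()].rstrip(), value
```

For example, `split_temperature("Explain this error -t 0.2")` would return `("Explain this error", 0.2)`, while prompts without a flag pass through unchanged.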