jupyterlab / jupyter-ai

A generative AI extension for JupyterLab
https://jupyter-ai.readthedocs.io/

Default "temperature" parameter is incompatible with openai "o1" models #994

Open · pollackscience opened this issue 1 month ago

pollackscience commented 1 month ago

Description

An error is thrown when trying to invoke the new OpenAI o1 model series: BadRequestError: Error code: 400 - {'error': {'message': "Unsupported value: 'temperature' does not support 0.7 with this model. Only the default (1) value is supported.", 'type': 'invalid_request_error', 'param': 'temperature', 'code': 'unsupported_value'}}

Reproduce

  1. %config AiMagics.default_language_model = "openai-chat:o1-preview"
  2. %%ai What is 5+8?
  3. The error above is thrown due to the default "temperature" parameter (a consolidated reproduction is sketched after this list).
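
For reference, here is the same reproduction consolidated into two notebook cells. This is a sketch only: it assumes the magics are provided by the jupyter_ai_magics extension and that a bare %%ai cell falls back to the default model set via %config.

%load_ext jupyter_ai_magics
%config AiMagics.default_language_model = "openai-chat:o1-preview"

%%ai
What is 5+8?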

Expected behavior

Ideally, this model could be run without any additional modification to the configuration or to core package code.

dlqqq commented 1 month ago

@pollackscience Unfortunately, I'm not able to reproduce this, as I lack the tier-5 usage account necessary to use the latest OpenAI o1 models.

I've searched through our codebase and don't see any mention of us setting a default value for the temperature parameter, so I'm not sure why temperature is being set to 0.7 in your environment. This could be an upstream API issue, given that the o1 series is a very recent release.

For now, can you try using the --model-parameters argument to explicitly set the temperature to 1?

%%ai -m {"temperature": 1}
What is 5 + 8?
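
For anyone who wants to confirm the API-side restriction outside of JupyterLab, here is a minimal sketch using langchain-openai. Treating ChatOpenAI as the client behind the openai-chat provider is an assumption on my part, and the example expects an OPENAI_API_KEY in the environment:

# Sketch only: o1 models accept only the default temperature of 1;
# passing temperature=0.7 instead reproduces the 400 error quoted above.
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(model="o1-preview", temperature=1)
print(llm.invoke("What is 5 + 8?").content)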

pollackscience commented 1 month ago

@dlqqq explicitly setting the temperature to 1 does work! Thank you for your quick response. One item to note: the input must be {"temperature":1} (no space between ":" and "1"), or else a json.loads error is thrown.
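
To illustrate why the space matters, here is a small sketch. The assumption that the magic line is split on whitespace before json.loads runs is mine, not a description of jupyter-ai's actual parser:

import json

# With a space after ":", whitespace tokenization splits the JSON value
# across two tokens, so json.loads only ever sees the first fragment.
tokens = '-m {"temperature": 1}'.split()
print(tokens)  # ['-m', '{"temperature":', '1}']
try:
    json.loads(tokens[1])
except json.JSONDecodeError as err:
    print("fails:", err)

# Without the space, the JSON value survives as a single token.
tokens_ok = '-m {"temperature":1}'.split()
print(json.loads(tokens_ok[1]))  # {'temperature': 1}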

Not sure if you want to mark this issue as closed, but it's a simple and effective workaround. Thanks!