Closed EduardDurech closed 8 months ago
Thank you for opening your first issue in this project! Engagement like this is essential for open source projects! :hugs:
If you haven't done so already, check out Jupyter's Code of Conduct. Also, please try to follow the issue template as it helps other community members to contribute more effectively.
You can meet the other Jovyans by joining our Discourse forum. There is also an intro thread there where you can stop by and say Hi! :wave:
Welcome to the Jupyter community! :tada:
@EduardDurech Thanks for opening this! I was able to reproduce it by passing a similar override on the command line, using the tip of the main branch and JupyterLab 4.1.0.
jupyter lab --AiExtension.model_parameters openai-chat:gpt-4-1106-preview='{"max_tokens": 4095}'
I'm getting the same error with Bedrock-chat models. In fact, even passing a nonsense parameter like --AiExtension.model_parameters='bedrock-chat:anthropic.claude-v2={"anything":200}' produces the same error, complaining that it got multiple values for keyword argument 'anything'.
It looks like the parameter is being ingested into both provider_params and model_parameters. I logged the value of each variable immediately before instantiating llm in jupyter_ai/chat_handlers/default.py, and I found the same custom model parameter in both objects:
[I 2024-02-07 15:38:23.871 AiExtension] provider_params: {'model_id': 'gpt-4-1106-preview', 'max_tokens': 4095, 'openai_api_key': 'secret'}
[I 2024-02-07 15:38:23.871 AiExtension] model_parameters: {'max_tokens': 4095}
This is true whether the parameter name is max_tokens, anything, or anything else.
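For what it's worth, this is consistent with Python's keyword-argument semantics: unpacking two dicts that share a key into the same call raises exactly this TypeError. A minimal sketch, using a stand-in function rather than the actual provider class, with the values copied from the log output above:

```python
# Stand-in for the provider constructor; the real call site presumably
# unpacks both provider_params and model_parameters into the provider class.
def make_llm(model_id, **kwargs):
    return {"model_id": model_id, **kwargs}

provider_params = {"model_id": "gpt-4-1106-preview", "max_tokens": 4095}
model_parameters = {"max_tokens": 4095}

try:
    # Duplicate 'max_tokens' across the two unpacked dicts triggers the error.
    make_llm(**provider_params, **model_parameters)
except TypeError as e:
    print(e)  # mentions "multiple values for keyword argument 'max_tokens'"
```

So any key that ends up in both dicts will fail the same way, regardless of whether the provider actually recognizes it.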
Not sure if relevant, but I noticed that whatever I passed as the argument seemed to be autosaved to ~/.local/share/jupyter/jupyter_ai/config.json under a field called "Fields" or something similar (I cannot access the server to check right now, sorry about that).
There is an error when setting max_tokens for openai-chat in the jupyter_jupyter_ai_config.json.
Error
https://github.com/jupyterlab/jupyter-ai/blob/8ea0ff34335ab530d8c22a80602ce5668d5b85e6/packages/jupyter-ai/jupyter_ai/chat_handlers/base.py#L204-L209
Steps:
1. Create jupyter_jupyter_ai_config.json in a config path specified in jupyter --paths (I picked ~/.jupyter)
2. Parameters are successfully loaded:
[I FooDate AiExtension] Configured model parameters: {'openai-chat:gpt-4-1106-preview': {'max_tokens': 4095}}
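For anyone trying to reproduce this from a file rather than the CLI, the snippet below generates the config I used; the nesting is an assumption based on the standard traitlets JSON config layout, mirroring the --AiExtension.model_parameters override shown earlier in the thread:

```python
import json

# Hypothetical contents of ~/.jupyter/jupyter_jupyter_ai_config.json;
# the {"<Class>": {"<trait>": ...}} nesting follows the usual traitlets
# JSON config convention.
config = {
    "AiExtension": {
        "model_parameters": {
            "openai-chat:gpt-4-1106-preview": {"max_tokens": 4095}
        }
    }
}

print(json.dumps(config, indent=2))
```

Save the printed JSON as jupyter_jupyter_ai_config.json in one of the config paths and the "Configured model parameters" log line above should appear on startup.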
Searching the codebase, I cannot find where max_tokens is set, and https://api.python.langchain.com/en/latest/chat_models/langchain_community.chat_models.openai.ChatOpenAI.html does not set it by default either.