danny-avila / LibreChat

Enhanced ChatGPT Clone: Features Anthropic, AWS, OpenAI, Assistants API, Azure, Groq, o1, GPT-4o, Mistral, OpenRouter, Vertex AI, Gemini, Artifacts, AI model switching, message search, langchain, DALL-E-3, ChatGPT Plugins, OpenAI Functions, Secure Multi-User System, Presets, completely open-source for self-hosting. Actively in public development.
https://librechat.ai/
MIT License

[Bug]: Model Specs Settings a little buggy #3185

Closed: prononext closed this issue 3 months ago

prononext commented 3 months ago

What happened?

I noticed when using Model Specs (https://www.librechat.ai/docs/configuration/librechat_yaml/object_structure/model_specs) that when I have different endpoints across several specs, such as openai, google, and gptPlugins, and I switch to a spec that uses the plugins endpoint and then go back to one that uses an openai or google endpoint, the endpoint does not change and stays on plugins.

Setting prioritize: true did not help either; with that option enabled, the behavior is the same.

Steps to Reproduce

  1. Create a librechat.yaml with several modelSpecs, e.g. openai, google, and gptPlugins, and also define tools in the gptPlugins modelSpec (see the example below).
  2. Open the LibreChat GUI and click on the openai modelSpecs template.
  3. Then click on the gptPlugins modelSpecs template.
  4. Then click on the openai one again.
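
For reference, a minimal sketch of the kind of librechat.yaml I mean; the spec names, labels, and model IDs are placeholders and follow the structure from the model_specs docs linked above, not my exact config:

```yaml
# Minimal sketch; names, labels, and models are placeholders.
version: 1.1.4   # config version, adjust to your install
cache: true

modelSpecs:
  # prioritize: true   # setting this did not help, see below
  list:
    - name: "openai-spec"
      label: "OpenAI"
      preset:
        endpoint: "openAI"
        model: "gpt-4o"
    - name: "google-spec"
      label: "Google"
      preset:
        endpoint: "google"
        model: "gemini-1.5-pro"
    - name: "plugins-spec"
      label: "Plugins"
      preset:
        endpoint: "gptPlugins"
        model: "gpt-4o"
        tools:
          - "google"   # any plugin, just so tools are defined on this spec
```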

You will see that the endpoint does not change back to the one defined in the modelSpec's settings; it stays on the gptPlugins endpoint.

The only way to get it working is to manually change the endpoint.

It gets worse when you have set prioritize: true in one of your modelSpecs presets: then the only way to change the endpoint is to open a chat in your history that already uses that endpoint, since the endpoint selector somehow stops working.

Hopefully this is clear; if there are any questions, I am happy to help.

What browsers are you seeing the problem on?

No response

Relevant log output

No response

Screenshots

No response

Code of Conduct