sagemathinc / cocalc

CoCalc: Collaborative Calculation in the Cloud
https://CoCalc.com

custom llm: two minor bugs #7587

Open haraldschilly opened 4 months ago

haraldschilly commented 4 months ago

This is mainly a quick note to myself.

  1. When a setup has only custom OpenAI LLMs, the "enable openai" and "custom openai" settings are applied incorrectly. The underlying issue is that, historically, the "enable openai" setting has doubled as the switch for the overall LLM integration. The fix is to introduce a new global "enable LLM" configuration and use it everywhere "enable openai" is checked today (see the first sketch after this list).
  2. The validation function for the "default llm" setting only checks against the server-side list of known models (from the llm-utils file), so setting it via an environment variable fails validation. This is a chicken-and-egg problem. The solution is probably to relax the validation and accept any string for the "default llm"; at runtime, when the LLM is actually picked, a fallback mechanism is already in place that selects one of the existing models (see the second sketch after this list).
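
A minimal TypeScript sketch of the first point, assuming hypothetical setting names (the real keys live in CoCalc's server settings, which are not reproduced here): a single derived "any LLM enabled" flag that call sites would use instead of the overloaded "enable openai" check.

```ts
// Sketch only: the setting names below are illustrative, not CoCalc's actual keys.
interface ServerSettings {
  openai_enabled?: boolean;          // built-in OpenAI integration
  google_vertexai_enabled?: boolean; // Google models
  custom_openai_enabled?: boolean;   // user-configured OpenAI-compatible endpoints
  ollama_enabled?: boolean;          // local/self-hosted models
}

// Global "enable LLM" flag: true if *any* provider is switched on.
// Call sites that currently check only `openai_enabled` would use this instead.
export function anyLLMEnabled(settings: ServerSettings): boolean {
  return !!(
    settings.openai_enabled ||
    settings.google_vertexai_enabled ||
    settings.custom_openai_enabled ||
    settings.ollama_enabled
  );
}
```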
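And a sketch of the second point: relaxed validation that accepts any non-empty string, combined with the runtime fallback to a known model. The model list and function names are placeholders, not CoCalc's actual llm-utils API.

```ts
// Placeholder model list; in CoCalc the real list comes from llm-utils.
const KNOWN_MODELS = ["gpt-4o-mini", "gpt-4o", "gemini-1.5-flash"] as const;

// Relaxed validation: accept any non-empty string, since a custom model
// configured via an environment variable is not in the server-side list yet.
export function validateDefaultLLM(value: string): boolean {
  return typeof value === "string" && value.trim().length > 0;
}

// Runtime selection: if the configured default is not among the available
// models, fall back to one that exists.
export function pickModel(
  defaultLLM: string,
  available: readonly string[] = KNOWN_MODELS,
): string {
  return available.includes(defaultLLM) ? defaultLLM : available[0];
}
```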