Pull Request Type
Relevant Issues
resolves #2514
What is in this change?
1. Use the explicitly set `agentProvider` and `agentModel` combination. No verification is done to confirm the pairing is valid, only that it exists; the API will return any errors anyway.
2. Fall back to the workspace's `chatProvider` and `chatModel` overrides. This at least keeps answers within the same model preference the user defined for the workspace.
3. Lastly, attempt to use the `LLM_PROVIDER` env setting and its ENV default model. If no ENV preference can be found, assume a static model string that is generally available for that provider. If that cannot be determined, return `null` and most likely exit, since most LLM providers do not accept `null` for the model.
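The fallback chain above can be sketched roughly as follows. This is an illustrative sketch only, not the actual implementation: names like `workspace.agentProvider`, `env.LLM_MODEL_PREF`, and the `DEFAULT_MODELS` map are hypothetical stand-ins.

```javascript
// Illustrative static defaults per provider -- placeholder values,
// not an authoritative list.
const DEFAULT_MODELS = {
  openai: "gpt-4o",
  anthropic: "claude-3-5-sonnet-20241022",
};

// Resolve the provider/model pair for an agent, in priority order.
// `workspace` and `env` shapes are assumptions for this sketch.
function resolveAgentModelPref(workspace, env = process.env) {
  // 1. Explicit agent settings win. We only check that both exist;
  //    no validation that the pairing is valid -- the API will error if not.
  if (workspace.agentProvider && workspace.agentModel) {
    return { provider: workspace.agentProvider, model: workspace.agentModel };
  }

  // 2. Fall back to the workspace chat overrides so the agent stays
  //    within the same model preference the user set for the workspace.
  if (workspace.chatProvider && workspace.chatModel) {
    return { provider: workspace.chatProvider, model: workspace.chatModel };
  }

  // 3. Fall back to the system-level LLM_PROVIDER and its ENV model
  //    preference, else a generally available static model for that provider.
  const provider = env.LLM_PROVIDER ?? null;
  if (!provider) return null;
  const model = env.LLM_MODEL_PREF ?? DEFAULT_MODELS[provider] ?? null;

  // If nothing can be determined, return null (callers should treat this
  // as fatal, since most LLM providers reject a null model).
  return model ? { provider, model } : null;
}
```

A caller would typically treat a `null` result as unrecoverable and abort the agent invocation rather than pass `null` downstream.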
Additional Information
Developer Validations
- [x] I ran `yarn lint` from the root of the repo & committed changes