Adds an option `-o base_url https://...` to override the endpoint used for Command-R.
This is useful when accessing the model through a cloud provider or your own deployment. I tested this with a deployment on Azure: after deploying Command-R+ through Azure AI Studio, I get an endpoint like `https://....swedencentral.inference.ai.azure.com/v1`.
Ultimately I think this is something `llm` could manage in a standard way, like keys, since most popular models are now available from multiple cloud providers or as self-hosted deployments (where the weights are available). If you think this is worth pursuing, let me know and I'd be happy to contribute it.
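For reference, using the new option might look like the sketch below. The hostname is a hypothetical placeholder standing in for whatever URL your own deployment gives you, and `command-r` is assumed to be the model ID registered by this plugin:

```shell
# Hypothetical usage sketch: point Command-R at your own Azure-hosted
# endpoint instead of the default Cohere API.
# Replace the hostname with the URL from your deployment.
llm -m command-r \
  -o base_url https://my-deployment.swedencentral.inference.ai.azure.com/v1 \
  'Say hello'
```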