rohanarora opened 1 week ago
Primarily, instead of the `api_key` here, should we be leveraging `api_params['token']`? It is the token that is needed for authentication at this stage, as opposed to the API key.
Happy to make a PR if this is an acceptable solution.
Good point. Yes, a PR is welcome. I'll look into this today as well.
Reviewing again - no, I don't think that matters, as `api_params['token']` is already set here.
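To illustrate the distinction being discussed, here is a minimal, hypothetical helper (this is not litellm's actual code; the function name and shape are assumptions): the request ultimately carries the bearer token from `api_params['token']`, not the raw API key, so as long as the token is set before the request is built, the key itself is not needed at this stage.

```python
def auth_headers(api_params: dict) -> dict:
    """Build the Authorization header from api_params.

    Illustrative sketch only: watsonx authenticates requests with a bearer
    token (exchanged earlier from the API key), so the header uses
    api_params['token'], never the raw api_key.
    """
    token = api_params.get("token")
    if token is None:
        raise ValueError("expected api_params['token'] to be set before the request")
    return {"Authorization": f"Bearer {token}"}


# Example usage with a placeholder token:
headers = auth_headers({"token": "abc123", "api_key": "ignored-here"})
```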
Can you run the request with `litellm.set_verbose=True` and share the debug logs?
https://us-south.ml.cloud.ibm.com/ml/v1/text/chat?version=2024-03-13
Your URL looks off - here's what I would expect it to look like: `/ml/v1/deployments/{deployment_id}/text/chat`, where `deployment_id` is the model passed in.
It would help to see how you're calling `litellm.completion`.
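The URL difference above can be sketched as follows. This is an illustrative reconstruction, not litellm's internals: the base URL and version string are taken from the log line in this thread, and `chat_url` is a hypothetical helper showing the deployment path segment that the maintainer expects but that is missing from the reported URL.

```python
# Sketch (assumption): how the watsonx chat endpoint differs for a plain
# model id vs. a deployment id. Base URL and version are from the log above.
BASE = "https://us-south.ml.cloud.ibm.com/ml/v1"
VERSION = "2024-03-13"


def chat_url(deployment_id=None):
    """Return the text/chat endpoint, inserting the deployments segment
    when a deployment_id is given (the shape the maintainer expects)."""
    if deployment_id:
        return f"{BASE}/deployments/{deployment_id}/text/chat?version={VERSION}"
    # This is the shape reported in the issue, with no deployment segment:
    return f"{BASE}/text/chat?version={VERSION}"
```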
What happened?

Use the `watsonx` provider as opposed to the `watsonx_text` provider, and rename the `max_new_tokens` parameter to `max_tokens` as required.

Relevant log output
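The parameter rename mentioned above can be sketched like this. It is a hypothetical normalization helper, not litellm's actual implementation: it shows a call that used the watsonx-style `max_new_tokens` name being rewritten to the OpenAI-style `max_tokens` that `litellm.completion` expects.

```python
def normalize_params(params: dict) -> dict:
    """Rename the watsonx-style max_new_tokens key to the OpenAI-style
    max_tokens key (illustrative sketch of the rename described above)."""
    out = dict(params)  # avoid mutating the caller's dict
    if "max_new_tokens" in out:
        out["max_tokens"] = out.pop("max_new_tokens")
    return out


# Example: a request written against the old parameter name.
normalized = normalize_params({"max_new_tokens": 256, "temperature": 0.2})
```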