BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

Getting error while trying to make a request to an LLM (Large Language Model) provided by IBM Watson. #6620

Open bishwajitprasadgond opened 3 weeks ago

bishwajitprasadgond commented 3 weeks ago

Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True`.

Provider List: https://docs.litellm.ai/docs/providers

```
ValueError                                Traceback (most recent call last)
/usr/local/lib/python3.10/dist-packages/litellm/llms/watsonx.py in completion(self, model, messages, custom_prompt_dict, model_response, print_verbose, encoding, logging_obj, optional_params, acompletion, litellm_params, logger_fn, timeout)
    553                 # regular text generation
--> 554                 return handle_text_request(req_params)
    555         except WatsonXAIError as e:

9 frames

ValueError: Invalid isoformat string: '2024-11-06T16:29:58.343Z'

During handling of the above exception, another exception occurred:

WatsonXAIError                            Traceback (most recent call last)
WatsonXAIError: Invalid isoformat string: '2024-11-06T16:29:58.343Z'

During handling of the above exception, another exception occurred:

APIConnectionError                        Traceback (most recent call last)
/usr/local/lib/python3.10/dist-packages/litellm/litellm_core_utils/exception_mapping_utils.py in exception_type(model, original_exception, custom_llm_provider, completion_kwargs, extra_kwargs)
   2093                 exception_mapping_worked = True
   2094                 if hasattr(original_exception, "request"):
-> 2095                     raise APIConnectionError(
   2096                         message="{} - {}".format(exception_provider, error_str),
   2097                         llm_provider=custom_llm_provider,

APIConnectionError: litellm.APIConnectionError: WatsonxException - Invalid isoformat string: '2024-11-06T16:29:58.343Z'
```
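For context, the underlying `ValueError` can be reproduced with the standard library alone (this is not litellm code, just a sketch of the failure): `datetime.fromisoformat` on Python 3.10, the interpreter shown in the traceback paths, rejects the trailing `Z` (Zulu/UTC) suffix that appears in the watsonx timestamp; Python 3.11 added support for it. A common portable workaround is to rewrite `Z` as an explicit `+00:00` offset:

```python
from datetime import datetime, timezone

ts = "2024-11-06T16:29:58.343Z"  # the timestamp from the traceback above

# On Python 3.10 this raises:
#   ValueError: Invalid isoformat string: '2024-11-06T16:29:58.343Z'
# Python 3.11+ accepts the 'Z' suffix natively.
try:
    parsed = datetime.fromisoformat(ts)
except ValueError:
    # Portable workaround: replace 'Z' with an explicit UTC offset
    parsed = datetime.fromisoformat(ts.replace("Z", "+00:00"))

print(parsed)  # 2024-11-06 16:29:58.343000+00:00
```

Either branch produces the same timezone-aware datetime, so the snippet behaves identically across Python versions.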

bishwajitprasadgond commented 3 weeks ago

This is the code:

```python
import os
from litellm import completion

response = completion(
    model="watsonx/ibm/granite-13b-chat-v2",
    messages=[{"content": "What is your favorite color?", "role": "user"}],
    url="https://eu-de.ml.cloud.ibm.com",
    api_key="myapikeywasthere",
    project_id="projectidwasthere",
)
```
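A quick way to check whether your interpreter is affected by this parsing gap, without litellm or any credentials (a stdlib-only sketch; the helper name is made up for illustration): the `Z` suffix is only accepted by `fromisoformat` from Python 3.11 onward, so on the Python 3.10 runtime in the traceback this returns `False`.

```python
import sys
from datetime import datetime

def parses_zulu_suffix() -> bool:
    """Return True if this interpreter's fromisoformat accepts a trailing 'Z'."""
    try:
        datetime.fromisoformat("2024-11-06T16:29:58.343Z")
        return True
    except ValueError:
        return False

# Python 3.11+ returns True; 3.10 and earlier return False.
print(parses_zulu_suffix())
```

If this prints `False`, upgrading the runtime to Python 3.11+ is one possible way to sidestep the error while the parsing is fixed upstream.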