BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

[Bug]: unable to connect to the LiteLLM Proxy as the ChatOpenAI call leads to “SSLCertVerificationError” exception #1332

Closed: krrishdholakia closed this issue 8 months ago

krrishdholakia commented 9 months ago

What happened?

We need to run the LiteLLM proxy as an HTTPS/SSL endpoint instead of HTTP; currently it runs as an HTTP endpoint. We have deployed the LiteLLM proxy in an EKS container in AWS, with an AWS ALB providing the domain address. However, we have hit a roadblock: we are unable to connect to the LiteLLM proxy because the ChatOpenAI call raises an “SSLCertVerificationError” exception. We have tried adding the certificate so the endpoint is treated as a trusted source, but that doesn't work. I also know you can turn off SSL verification, e.g. via the verify flag in Python requests, but it isn't clear where to do this in the chain, based on the following code and the corresponding stack trace.
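For context, a minimal sketch of where that switch could live on the client side, assuming a recent langchain-openai where ChatOpenAI accepts a custom http_client (the proxy URL below is a placeholder):

import httpx
from langchain_openai import ChatOpenAI

# Hypothetical proxy URL; verify=False disables certificate checks,
# so treat it as a debugging aid, not a production setting.
llm = ChatOpenAI(
    api_key="anything",
    base_url="https://litellm.example.com",
    http_client=httpx.Client(verify=False),
)
print(llm.invoke("ping").content)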

Relevant log output

No response

Twitter / LinkedIn details

No response

ishaan-jaff commented 9 months ago

Working on this.

krrishdholakia commented 9 months ago

I believe this is an error raised by the openai client when the server's certificate cannot be verified. cc: @sestinj, I believe you dealt with this a while ago.

How did you solve this?

ishaan-jaff commented 9 months ago

relevant issue: https://community.openai.com/t/how-can-i-disable-ssl-verification-when-using-openai-api-in-python/110837/4

I'm looking for a way to repro this issue; I haven't found one so far.

ishaan-jaff commented 9 months ago

Potential fix from another user:

The SSL verification issue happened for me because our organization uses Zscaler, and a trusted certificate for the OpenAI URL was not available.

I was able to solve this by manually downloading the certificate and pointing the 'REQUESTS_CA_BUNDLE' environment variable at it in my code.
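A minimal sketch of that workaround (the bundle path is hypothetical; REQUESTS_CA_BUNDLE is honored by the requests library and tools that defer to it):

import os

# Hypothetical path to a corporate CA bundle, e.g. one that includes
# the Zscaler root certificate. requests will verify TLS connections
# against this bundle instead of the default certifi store.
os.environ["REQUESTS_CA_BUNDLE"] = "/path/to/corporate-ca-bundle.pem"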

sestinj commented 9 months ago

I know verify_ssl=False was one of our solutions, but mostly we would let people add their own certificates manually through ca_bundle_path. Not sure this is relevant here, unfortunately.
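Both of those options map onto the verify parameter of httpx, which the openai v1 Python client uses under the hood (the bundle path below is a placeholder):

import httpx

# Disable verification entirely (the verify_ssl=False equivalent).
insecure_client = httpx.Client(verify=False)

# Or keep verification but trust a custom CA bundle
# (the ca_bundle_path equivalent; the path is hypothetical).
pinned_client = httpx.Client(verify="/path/to/ca-bundle.pem")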

ishaan-jaff commented 8 months ago

This is fixed. For anyone looking to do this with the LiteLLM proxy: you can disable SSL verification on the OpenAI Python client by passing it an httpx client with verify=False:

import openai
import httpx

client = openai.OpenAI(
    api_key="anything",
    base_url="http://0.0.0.0:8000",  # litellm proxy url
    http_client=httpx.Client(verify=False),  # skip SSL certificate verification
)

# request sent to model set on litellm proxy, `litellm --model`
response = client.chat.completions.create(
    model="azure/chatgpt-v-2",
    messages=[
        {
            "role": "user",
            "content": "this is a test request, write a short poem",
        }
    ],
)

print(response)
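The async client takes the same escape hatch via an httpx.AsyncClient; a minimal sketch against the same proxy:

import asyncio
import httpx
import openai

async def main():
    client = openai.AsyncOpenAI(
        api_key="anything",
        base_url="http://0.0.0.0:8000",  # litellm proxy url
        http_client=httpx.AsyncClient(verify=False),  # skip SSL certificate verification
    )
    response = await client.chat.completions.create(
        model="azure/chatgpt-v-2",
        messages=[{"role": "user", "content": "this is a test request, write a short poem"}],
    )
    print(response)

asyncio.run(main())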