Closed: Manouchehri closed this issue 6 months ago.
Related to https://github.com/BerriAI/litellm/issues/2279.
@Manouchehri do you see this on latest?
We check for URLs that end with /v1:
- https://github.com/BerriAI/litellm/blob/e7b4882e9726c1d28d18246aecbc3a6de7f62176/litellm/router.py#L1998
We also have a test covering this - https://github.com/BerriAI/litellm/blob/e7b4882e9726c1d28d18246aecbc3a6de7f62176/litellm/tests/test_router.py#L141
where we verify that /v1 isn't appended twice.
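The check described above is roughly along these lines (a hypothetical sketch for illustration; the actual router logic lives at the linked line in router.py):

```python
def ensure_v1_suffix(api_base: str) -> str:
    """Append /v1 to an OpenAI-style base URL unless it already ends with it.

    Hypothetical sketch of the suffix check discussed in this thread;
    not the real litellm implementation.
    """
    base = api_base.rstrip("/")
    if base.endswith("/v1"):
        return base  # already versioned, don't double-append
    return base + "/v1"
```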
If you're still seeing this, can you share a url for repro?
I was seeing it on v1.35.31 for sure, and I'm pretty sure it's still a bug.
The URL from Cloudflare doesn't have a /v1 at the end at all; Cloudflare adds it internally before forwarding the request to OpenAI.
curl -X POST https://gateway.ai.cloudflare.com/v1/0399b10e77ac6668c80404a5ff49eb37/litellm-test/openai/chat/completions \
  -H 'Authorization: Bearer XXX' \
  -H 'Content-Type: application/json' \
  -d '{
    "model": "gpt-3.5-turbo-0125",
    "messages": [
      {
        "role": "user",
        "content": "What is Cloudflare?"
      }
    ]
  }'
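To illustrate why the end-of-string check doesn't catch this (a standalone sketch, not litellm code; ACCOUNT_ID is a placeholder): the gateway URL contains /v1 near the start as Cloudflare's own gateway API version, but does not end with /v1, so a suffix-only check still lets an extra /v1 get appended.

```python
# The Cloudflare AI Gateway base URL has "/v1" early in the path (the
# gateway's API version) but does NOT end with "/v1".
gateway_base = "https://gateway.ai.cloudflare.com/v1/ACCOUNT_ID/litellm-test/openai"

ends_with_v1 = gateway_base.rstrip("/").endswith("/v1")
print(ends_with_v1)  # False: a suffix check would append "/v1" here, and
                     # Cloudflare then adds its own internally when
                     # forwarding to OpenAI, producing ".../v1/v1"
```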
Got it. @Manouchehri, I just pushed a fix for this. Since the initial change was made to solve the problem for Azure AI Studio, I've added a flag to control that behavior.
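The shape of the fix could look something like this (a hypothetical sketch; the actual flag name and wiring in litellm may differ): gate the /v1 append behind a flag so providers like Cloudflare AI Gateway, which add the version segment internally, can opt out.

```python
def build_api_base(api_base: str, append_v1: bool = True) -> str:
    """Hypothetical sketch: only append /v1 when the flag allows it.

    append_v1 is an illustrative parameter name, not litellm's real flag.
    """
    base = api_base.rstrip("/")
    if append_v1 and not base.endswith("/v1"):
        base += "/v1"
    return base

# With the flag off, a Cloudflare AI Gateway URL is left untouched:
# build_api_base(".../litellm-test/openai", append_v1=False)
```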
Would welcome a PR with stricter testing here, to prevent future regressions.
Confirmed fixed, thank you!
I thought I was the regression tester? 😉
What happened?
https://github.com/BerriAI/litellm/commit/e05764bdb7dda49127dd4b1c2c4d02fa90463e71 caused OpenAI requests through Cloudflare AI Gateway to fail. Cloudflare already adds on the /v1, so automatically appending /v1 to api_base again results in /v1/v1.

Relevant log output
Twitter / LinkedIn details
https://twitter.com/DaveManouchehri