Closed · vonstring closed this issue 5 months ago
Able to repro - picking this up now. This is pretty serious. Thanks for flagging this @vonstring
hmm still seeing this even after removing the self. reference
@vonstring i have a fix out - targeting anthropic for now. once you can confirm it works for you too, i'll roll it out to the other LLMs
Yes, it works now! Thanks for addressing this so quickly.
Great - i'll roll this out to the other providers too. thank you for flagging this @vonstring
Curious - how're you using litellm today?
What happened?
When using the AsyncOpenAI client with an Anthropic model on litellm, any new completion request kills previous in-flight requests, causing an httpx.ReadError exception (see log output)
Minimal test case:
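(The original attachment isn't preserved here; below is a hedged reconstruction of the scenario described above. The base URL, API key, and model name are placeholders for an AsyncOpenAI client pointed at a litellm deployment serving an Anthropic model.)

```python
import asyncio
from openai import AsyncOpenAI

# Hypothetical setup: AsyncOpenAI client routed through litellm to an
# Anthropic model. base_url / api_key / model are placeholder values.
client = AsyncOpenAI(base_url="http://localhost:4000", api_key="sk-anything")


async def ask(prompt: str) -> str:
    resp = await client.chat.completions.create(
        model="claude-3-opus-20240229",
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content


async def main() -> None:
    # Firing a second completion while the first is still in flight
    # causes the first to fail with httpx.ReadError.
    print(await asyncio.gather(ask("Write a long story."), ask("Say hi.")))


asyncio.run(main())
```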
The issue seems to be with two lines in anthropic.py (hyperlinked in the original report), which create a new handler on every request and overwrite any reference to previous handlers, causing the httpx client in the AsyncHTTPHandler to be closed prematurely once there are no more references to the old handler.
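A toy sketch of the failure mode (simplified, not the actual litellm source): when the shared handler slot is overwritten on every call, the previous handler loses its last reference and its cleanup closes the underlying httpx client while an earlier request may still be reading from it.

```python
import asyncio
import httpx


class AsyncHTTPHandler:
    """Toy stand-in for litellm's handler; hypothetical and simplified."""

    def __init__(self) -> None:
        self.client = httpx.AsyncClient()

    async def post(self, url: str, **kwargs) -> httpx.Response:
        return await self.client.post(url, **kwargs)

    def __del__(self) -> None:
        # Cleanup when the handler is garbage-collected: closes the
        # underlying client. Any coroutine still mid-read on this client
        # then sees httpx.ReadError.
        if not self.client.is_closed:
            try:
                asyncio.get_running_loop().create_task(self.client.aclose())
            except RuntimeError:
                pass


async_handler = None  # shared slot (module-level or an instance attribute)


async def completion(payload: dict) -> httpx.Response:
    global async_handler
    # Bug pattern: a fresh handler per call overwrites the shared
    # reference, so the previous handler is collected and its client
    # closed out from under any still-running request.
    async_handler = AsyncHTTPHandler()
    return await async_handler.post(
        "https://example.com/v1/messages", json=payload
    )
```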
This quickfix diff causes the handler to be reused, thus making the test case above pass:
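(The actual diff lives in the issue; continuing the toy example above, the idea of the fix is to create the handler once and reuse it, so in-flight requests keep a live reference to the same httpx client.)

```python
async def completion(payload: dict) -> httpx.Response:
    global async_handler
    if async_handler is None:
        async_handler = AsyncHTTPHandler()  # created once, reused thereafter
    return await async_handler.post(
        "https://example.com/v1/messages", json=payload
    )
```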
A quick search suggests this could be an issue in several other LLM classes as well.
Relevant log output
Twitter / LinkedIn details
https://x.com/vonstrenginho