[Closed] krrishdholakia closed this issue 11 months ago
cc: @shauryr
I'm taking a look at this.
I was able to fix this locally
Here's my Deep Infra call:
TL;DR: this happened because our llama index integration was making text_completion requests for models like deep_infra, which only expose a chat-style completion endpoint.
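To illustrate the idea (this is a hypothetical sketch, not litellm's or llama index's actual routing code): the fix amounts to sending chat-completion requests for providers like deepinfra instead of text-completion requests. The provider set and function name below are assumptions for illustration only.

```python
# Hypothetical sketch of the routing idea behind the fix.
# Providers such as deepinfra expose chat-style endpoints, so their models
# must be called via chat completion, not text completion.
# CHAT_ONLY_PROVIDERS and request_kind are illustrative names, not real litellm APIs.

CHAT_ONLY_PROVIDERS = {"deepinfra"}  # assumption for illustration

def request_kind(model: str) -> str:
    """Decide which endpoint a model id like 'deepinfra/<model>' needs."""
    provider = model.split("/", 1)[0]
    if provider in CHAT_ONLY_PROVIDERS:
        return "chat_completion"
    return "text_completion"

print(request_kind("deepinfra/meta-llama/Llama-2-70b-chat-hf"))  # chat_completion
print(request_kind("text-davinci-003"))  # text_completion
```

The bug was the opposite of this: deep_infra models were falling into the text_completion path.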
@shauryr do you use llama index with streaming ?
Yes, I use streaming
I have a PR here that fixes regular (non-streaming) calls for Deep Infra: https://github.com/jerryjliu/llama_index/pull/7885
What happened?
Using llama index with litellm, calling deepinfra throws an error.
Relevant log output
No response
Twitter / LinkedIn details
No response