Open rc19 opened 6 months ago
🤖
Hello @rc19! 👋 I'm Dosu, a friendly bot here to assist you with your bugs, answer your questions, and even help you become a contributor. While we're waiting for a human maintainer to join us, I'm here to help. Don't worry, we'll get this sorted out! 🦙
The error message you're encountering, `<asyncio.locks.Event object at 0x13de7fe50 [unset]> is bound to a different event loop`, typically indicates that an asyncio object was created in one event loop and is being used in another. This can happen when using the `astream_chat` method of the `Anthropic` class, which is asynchronous and uses the `anthropic.AsyncAnthropic` client to create completions.
The `anthropic.AsyncAnthropic` client is initialized in the `Anthropic` class's constructor (the `__init__` method) and is likely bound to the event loop that was running at the time of its creation. If you then call the `astream_chat` method from a different event loop, you'll encounter this error.
To resolve this issue, ensure that the `anthropic.AsyncAnthropic` client is created and used in the same event loop. If you're using a framework or library that manages the event loop for you (like FastAPI or Jupyter), you might need to adjust your code so that the client is constructed inside the loop that will actually use it.
Unfortunately, without more information about how you're using the `Anthropic` class and the `astream_chat` method, it's difficult to provide a more specific solution. You can refer to the `Anthropic` class source code for more details.
As for the changes made in the `Anthropic` class between version 0.9.16 and the latest version 0.9.21 of LlamaIndex, I wasn't able to find specific information within the repository. It's possible that the answer is available elsewhere or that I missed it. If you could provide more details about your usage or any specific issues you're encountering, that would help me give a more accurate response.
This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.
Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant.
If you want to continue the conversation, start your reply with @dosu-bot.
@rc19 same issue
Same issue, but with Cohere Async streaming
Has anyone made headway here? I'm experiencing a similar issue when using the BEST/CONTEXT chat mode.
Bug Description
Hello, I am using llama-index (v0.9.16) with its async streaming API (`astream_chat`) and the simple chat engine. However, while the OpenAI SDK (for Anyscale and open models) returns full responses, with the Anthropic SDK (for Claude) I get the error.
I am confused as to whether this is caused by llama-index or by the Anthropic SDK, given that the same code works with the OpenAI call.
Any pointers?
Version
0.9.16
Steps to Reproduce
Relevant Logs/Tracebacks