Closed anthonyrs06 closed 3 weeks ago
I am getting the same error; let me know if you figure this out.
@anthonyrs06 @Mitthat, it looks like an `openai` library issue. As a workaround, downgrade to version 1.6.1 and the issue disappears.
That did the trick!
I think the recipe needs to be updated for the latest library versions, if anyone from Chainlit sees this.
https://github.com/openai/openai-python/commit/5429f69670e4db70f0cb7420ddb27c9bd11b9508 release: 1.14.0 (https://github.com/openai/openai-python/pull/1234)
See the reference docs for more information: https://platform.openai.com/docs/api-reference/assistants-streaming

We've also improved some of the names for the types in the assistants beta; a non-exhaustive list:

- `CodeToolCall` -> `CodeInterpreterToolCall`
- `MessageContentImageFile` -> `ImageFileContentBlock`
- `MessageContentText` -> `TextContentBlock`
- `ThreadMessage` -> `Message`
- `ThreadMessageDeleted` -> `MessageDeleted`
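For anyone updating old recipe code, the renames above can be applied mechanically. A minimal sketch under that assumption (the `RENAMED_TYPES` table comes straight from the release notes quoted here; the `migrate_import` helper is hypothetical, not part of either library):

```python
# Non-exhaustive mapping of the assistants-beta type renames from the
# openai-python 1.14.0 release notes quoted above.
RENAMED_TYPES = {
    "CodeToolCall": "CodeInterpreterToolCall",
    "MessageContentImageFile": "ImageFileContentBlock",
    "MessageContentText": "TextContentBlock",
    "ThreadMessage": "Message",
    "ThreadMessageDeleted": "MessageDeleted",
}

def migrate_import(line: str) -> str:
    """Rewrite an old-style import line to the 1.14.0 names.

    Longest names are substituted first so that e.g. 'ThreadMessageDeleted'
    is not partially matched by the shorter 'ThreadMessage'.
    """
    for old in sorted(RENAMED_TYPES, key=len, reverse=True):
        line = line.replace(old, RENAMED_TYPES[old])
    return line
```

This is plain string substitution, so review the result; it does not parse Python.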
Thanks for catching this! Super helpful.
I assume the best approach to stream assistant steps into Chainlit objects is directly via the EventHandler. This works inside app.py, but I'd love to move it into its own module to clean things up. How can we pass the context, though?
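One way to move the handler out of app.py is to inject the Chainlit objects through the constructor instead of relying on module-level globals. A minimal pure-Python sketch of the pattern (`StepSink` and `ChainlitEventHandler` are hypothetical names; in a real app the sink would wrap `cl.Step` / `cl.Message`, and the class would subclass `openai.AssistantEventHandler`):

```python
class StepSink:
    """Hypothetical stand-in for a Chainlit step/message object."""

    def __init__(self):
        self.tokens = []

    def stream_token(self, token: str):
        # In Chainlit this would be an async call on a cl.Step / cl.Message.
        self.tokens.append(token)


class ChainlitEventHandler:
    """Event handler that can live in its own module and be imported into app.py."""

    def __init__(self, sink: StepSink):
        # Context is injected at construction time, so no globals are needed.
        self.sink = sink

    def on_text_delta(self, delta: str):
        # Forward each streamed token to the injected Chainlit object.
        self.sink.stream_token(delta)
```

In app.py you would then construct the handler per request, e.g. `handler = ChainlitEventHandler(sink)`, and hand it to the streaming call.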
For streaming input, like the code interpreter's, there's no stream_token for input, only for output; doing step.input += token followed by update() on every token is inefficient.
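One workaround for the per-token update() cost is to buffer input tokens and flush them in batches. A sketch under that assumption (`BufferedInput` and `flush_every` are hypothetical; `update` stands in for the Chainlit step's update call):

```python
class BufferedInput:
    """Accumulate input tokens and flush in batches instead of updating per token."""

    def __init__(self, update, flush_every: int = 20):
        self.update = update          # callback, e.g. the step's update method
        self.flush_every = flush_every
        self._buf = []
        self.text = ""                # the full input accumulated so far

    def add(self, token: str):
        self._buf.append(token)
        if len(self._buf) >= self.flush_every:
            self.flush()

    def flush(self):
        # Append the buffered tokens and push a single update for the batch.
        if self._buf:
            self.text += "".join(self._buf)
            self._buf.clear()
            self.update(self.text)
```

Call `flush()` once more when the stream ends so no trailing tokens are lost.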
One design struggle I'm curious whether anyone has solved is submitting tool outputs for custom functions, since each submission starts a new stream.
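One way to cope with the new-stream-per-submission behavior is a small driver loop that hands the same UI context to each successive stream. A hypothetical sketch, where `start_stream`, `submit_outputs`, and `execute_tool` stand in for the client's run-streaming and submit-tool-outputs calls:

```python
def drive_run(start_stream, submit_outputs, execute_tool, sink):
    """Loop until the run stops requesting tools.

    Each tool-output submission opens a new stream, but `sink` (the Chainlit
    objects) persists across all of them, so output lands in the same UI.
    """
    tool_calls = start_stream(sink)  # first stream; may end requesting tools
    while tool_calls:
        outputs = [execute_tool(tc) for tc in tool_calls]
        tool_calls = submit_outputs(outputs, sink)  # new stream per submission
    return sink
```

The key design choice is that the handler/stream objects are throwaway, while the sink is the long-lived thing the streams write into.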
@hayescode I'm struggling with the same issue with custom functions...any luck on your side?
It seems that this has been fixed; I'm using openai==1.44.1 and it works like a charm! I'm closing the issue.
I keep getting this error when trying to run the Assistants API example. I have already run pip install openai. Any tips?
ImportError: cannot import name 'MessageContentImageFile' from 'openai.types.beta.threads'