xbreid / fastapi-assistant-streaming

Medium tutorial codebase showcasing OpenAI Assistant streaming with FastAPI
MIT License

how to integrate function calling #1

Open zishanahmed-tomtom opened 5 months ago

zishanahmed-tomtom commented 5 months ago

Great work, Brandon! How do you integrate function calling into it?

jackzhouusa commented 4 months ago

Same ask. It seems queue.put_nowait() doesn't work well with client.beta.threads.runs.submit_tool_outputs_stream: in some cases (usually when there are multiple function calls) the stream freezes with no tokens in the queue. @xbreid
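For context, this is roughly the consumer side of that pattern: an SSE generator reading tokens off an asyncio.Queue. This is a minimal sketch, not the repo's actual code; the timeout and the None sentinel are my additions so the response ends instead of hanging forever when the event handler stalls:

```python
import asyncio


async def stream_from_queue(queue: asyncio.Queue, timeout: float = 30.0):
    """Yield tokens pushed into the queue by the event handler.

    The handler side calls queue.put_nowait(token) per streamed token and
    pushes None (a sentinel, my assumption) when the run completes.
    """
    while True:
        try:
            token = await asyncio.wait_for(queue.get(), timeout)
        except asyncio.TimeoutError:
            # Guard against the freeze described above: if tool outputs are
            # never submitted, no tokens ever arrive, so give up cleanly.
            break
        if token is None:  # sentinel: producer signals completion
            break
        yield token
```

With a timeout like this, the "frozen without tokens" case at least terminates the HTTP response instead of hanging the client indefinitely.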

xu2xulim commented 4 months ago

I am using queue.put_nowait() and it seems to work fine with a demo assistant that uses two functions to create a response. I modified on_event in event_handler.py to handle the requires_action event: iterating over the tool calls (functions), I built up the tool outputs. Because I am still learning Python, I have yet to figure out how to call client.beta.threads.runs.submit_tool_outputs_stream, so my temporary solution is to call the API via httpx.AsyncClient(), watch for "thread.message.completed", and add the message to the queue. I also do a self.done.clear() when I get a 200 response after the API call. Not sure that helps.
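For anyone stuck on the same step, here is a hedged sketch of how the SDK call could replace the raw httpx request. The build_tool_outputs helper and the MY_FUNCTIONS registry are my own names, not part of this repo; the submit_tool_outputs_stream usage follows the openai Python SDK, which expects a fresh event handler instance per stream (reusing the original handler is a plausible cause of the frozen queue):

```python
import json


def build_tool_outputs(tool_calls, functions):
    """Map each required tool call to an output the Assistants API accepts.

    tool_calls: objects with .id, .function.name, and .function.arguments
    (a JSON string), as found on
    event.data.required_action.submit_tool_outputs.tool_calls.
    functions: dict mapping function name -> local Python callable.
    """
    outputs = []
    for call in tool_calls:
        fn = functions[call.function.name]
        args = json.loads(call.function.arguments or "{}")
        outputs.append({"tool_call_id": call.id, "output": str(fn(**args))})
    return outputs


# Sketch of wiring this into the repo's EventHandler.on_event
# (self.client, self.queue, and MY_FUNCTIONS are assumptions):
#
# def on_event(self, event):
#     if event.event == "thread.run.requires_action":
#         calls = event.data.required_action.submit_tool_outputs.tool_calls
#         outputs = build_tool_outputs(calls, MY_FUNCTIONS)
#         with self.client.beta.threads.runs.submit_tool_outputs_stream(
#             thread_id=event.data.thread_id,
#             run_id=event.data.id,
#             tool_outputs=outputs,
#             event_handler=type(self)(self.queue),  # handlers are single-use
#         ) as stream:
#             stream.until_done()
```

The key detail is the event_handler argument: the SDK's streaming handlers cannot be reused across streams, so the submit call needs a new instance that writes into the same queue.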

jackzhouusa commented 4 months ago

Thanks, @xu2xulim. I tried your solution, but it didn't work on my side.

It seems to me the queue issue is caused by a race condition; the problem occurs very randomly.

xbreid commented 4 months ago

Hey guys, I was on vacation for a couple of weeks and just got back. I’ll provide a solution I came up with as soon as I find some time outside of work.

jackzhouusa commented 2 months ago

Any update? @xbreid