RetellAI / retell-custom-llm-python-demo

MIT License

draft_response call in server.py for the advanced function call use case #9

Open IsmailAlaouiAbdellaoui opened 8 months ago

IsmailAlaouiAbdellaoui commented 8 months ago

I believe server.py needs to handle the funcResult variable. The stream_response function should accept this parameter so that it can be passed through to draft_response, since draft_response is called with that parameter in llm_with_func_calling.py.
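A minimal sketch of the change being proposed — note that the function signatures and the shape of the prompt here are assumptions for illustration; the actual code in this repo may differ:

```python
# Hypothetical sketch: thread a function-call result (func_result)
# from the server entry point through to draft_response, so the LLM
# can reference the tool output in its reply. Names stream_response
# and draft_response come from the repo; everything else is assumed.

def draft_response(request, func_result=None):
    """Build the LLM prompt, appending the function result when present."""
    prompt = [{"role": "user", "content": request["transcript"]}]
    if func_result is not None:
        # Assumed convention: surface the tool output as a
        # function-role message so the model's reply can use it.
        prompt.append({"role": "function", "content": str(func_result)})
    return prompt


def stream_response(request, func_result=None):
    """server.py entry point; forwards func_result to draft_response."""
    return draft_response(request, func_result=func_result)
```

With this change, server.py could pass the result of a completed function call into `stream_response(request, func_result=...)`, and `draft_response` would include it in the prompt; without it, the parameter simply defaults to `None` and the behavior is unchanged.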

In general, I suggest you guys create a new file called server_with_func_calling.py to include this.

Thanks.

toddlzt commented 6 months ago

Hi @IsmailAlaouiAbdellaoui, thanks for dropping a message here. Just to clarify, why does the server need funcResult? I thought users could use the result locally to perform any actions.

IsmailAlaouiAbdellaoui commented 6 months ago

You're right. I opened this issue back when I was trying out function calling; the server actually doesn't need funcResult, so you can close the issue. Thanks.