I am not sure why you want to have another FastAPI server and use HTTP requests. I would just use the Chainlit abstraction for function calling. If you need to expose something as an API, you can use custom API endpoints.
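For example, something along these lines (just a sketch, assuming the underlying FastAPI app is exposed as `chainlit.server.app` in your Chainlit version; the `/health` route is a made-up placeholder):

```python
# Sketch: add a custom endpoint to the same server that serves the Copilot,
# so no separate FastAPI process is needed.
from chainlit.server import app  # Chainlit's underlying FastAPI app
from fastapi.responses import JSONResponse

@app.get("/health")
async def health():
    # Placeholder route; replace with whatever you need to expose
    return JSONResponse({"status": "ok"})
```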
Sorry, I think I'm just not very good at JavaScript. To be more specific, how do I edit this code to send the query to the Chainlit server? Or does the callback() function already handle that?
// Listen for function calls emitted by the Chainlit server
window.addEventListener("chainlit-call-fn", (e) => {
  const { name, args, callback } = e.detail;
  if (name === "test") {
    console.log(name, args);
    // callback() sends the result back to the Chainlit server
    callback("You sent: " + args.msg);
  }
});
Also, can I check whether I edited the on_message function correctly to incorporate the Copilot's response? I used agent.stream_chat (I'm using LlamaIndex), so I'm not sure whether I also need to edit the CopilotFunction segment of the code to stream the response. The code works fine when I remove the CopilotFunction segment and run it as I would in a normal Chainlit web application.
@cl.on_message
async def on_message(message: cl.Message):
    agent = cl.user_session.get("agent")
    response_message = await cl.Message(content="").send()
    cl.user_session.set("response_message", response_message)
    response = await cl.make_async(agent.stream_chat)(message.content)
    if cl.context.session.client_type == "copilot":
        fn = cl.CopilotFunction(name="test", args={"msg": response.content})
        res = await fn.acall()
        await cl.Message(content=res).send()
Sorry, just to update that I've figured it out! Minimum working sample below.
@willydouhard sorry for the confusion and thanks for helping!
@cl.on_message
async def on_message(msg: cl.Message):
    llm = cl.user_session.get("llm")
    if cl.context.session.client_type == "copilot":
        # Stream the LLM completion into the Chainlit message token by token
        response_message = await cl.Message("").send()
        response = await cl.make_async(llm.stream_complete)(msg.content)
        for chunk in response:
            await response_message.stream_token(chunk.delta)
        await response_message.update()
        # Send the finished exchange back to the Copilot via the "test" function
        fn = cl.CopilotFunction(name="test", args={"msg": msg.content, "response": str(response_message)})
        res = await fn.acall()
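For the ReAct agent case from my original question, the same pattern should work with agent.stream_chat; here's an untested sketch that assumes LlamaIndex's StreamingAgentChatResponse exposes a response_gen token generator and that the agent is stored in the user session at chat start:

```python
@cl.on_message
async def on_message(msg: cl.Message):
    agent = cl.user_session.get("agent")  # LlamaIndex ReAct agent stored at @cl.on_chat_start
    if cl.context.session.client_type == "copilot":
        response_message = await cl.Message("").send()
        # stream_chat returns a StreamingAgentChatResponse; response_gen yields token strings
        response = await cl.make_async(agent.stream_chat)(msg.content)
        for token in response.response_gen:
            await response_message.stream_token(token)
        await response_message.update()
        fn = cl.CopilotFunction(name="test", args={"msg": msg.content, "response": response_message.content})
        res = await fn.acall()
```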
Hello!
I'm building a ReAct agent on top of Chainlit Copilot, which will be mounted on an HTML file. I'm able to code the invocation of the agent and the streaming of its response in Chainlit using Python, but the Chainlit Copilot requires a callback function within a JavaScript event listener.
I'm thinking of deploying my LLM agent as a FastAPI app returning a StreamingResponse that will be called by the callback function in the index.html file. How do I adapt the callback function to stream the response in the Copilot? Do you have a simple example of streaming a simple reply via an API call?
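Roughly what I have in mind is something like this (just a sketch; the /chat path, request model, and generator are placeholders I made up, not my actual code):

```python
from fastapi import FastAPI
from fastapi.responses import StreamingResponse
from pydantic import BaseModel

app = FastAPI()

class ChatRequest(BaseModel):
    message: str

async def stream_agent_reply(message: str):
    # In the real app this would iterate over the agent's token stream
    for token in ["You ", "said: ", message]:
        yield token

@app.post("/chat")
async def chat(req: ChatRequest):
    # Stream plain-text tokens back to the caller
    return StreamingResponse(stream_agent_reply(req.message), media_type="text/plain")
```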
Much appreciated!
Sample code:
api.py
And here's the HTML file: