Hi @devpulse01.
Could you share some of the code you are using? If I understand correctly, you are doing everything through the request handler and are not using the directConnection property. If this is correct, then I am a little confused about how the return { 'text': response.choices[0].delta.content }; comes into this context.
Just want to make sure I fully understand the problem before I give advice. Thank you!
Yes, sure. I have something like this:
request config:
{
  ...
  handler: (body, signals) => Proxy.handleRequest(url, method, headers, body, signals)
}
Then the handleRequest method:
async handleRequest(url, method, headers, body, signals) {
  try {
    const response = await utils.call(method, url, headers, body, maxRetry);
    // Forward the raw provider response to Deep Chat via the signals object.
    signals.onResponse(response);
  } catch (error) {
    signals.onResponse({ error: 'Error while calling proxy endpoint' });
  }
  ...
}
Then in onResponse:
onResponse = (response: any): any => {
  ....
  if (this.isProxy()) {
    if (this.isStream()) {
      return { 'text': response.choices[0].delta.content };
    }
    return { 'text': response.choices[0].message.content };
  }
  return response;
}
Ok, I see you are handling everything on your end. Unfortunately, this does mean that you will also have to handle the tool_calls responses yourself.
I must say it is not easy, especially if you are supporting both normal and streamed responses. The best thing I can do is point you to how Deep Chat handles these responses here. You should be able to copy and reuse a lot of the code there.
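To give you a rough idea of what that involves, below is a minimal sketch of handling a non-streamed tool_calls response inside a custom handler. It is not Deep Chat code: executeTool and openAiCall are hypothetical helpers you would implement yourself, and the response shape assumed is the standard OpenAI chat completions format.

// Sketch only: handle a non-streamed tool_calls response before returning text to the chat.
// "executeTool" and "openAiCall" are hypothetical helpers, not Deep Chat APIs.
async function handleToolCalls(response, messages, openAiCall, executeTool) {
  const choice = response.choices[0];
  // If the model did not request a tool, display the text as before.
  if (choice.finish_reason !== 'tool_calls') {
    return { text: choice.message.content };
  }
  // Keep the assistant message that contains the tool_calls in the conversation history.
  messages.push(choice.message);
  // Run every requested tool and append a "tool" role message with each result.
  for (const toolCall of choice.message.tool_calls) {
    const args = JSON.parse(toolCall.function.arguments);
    const result = await executeTool(toolCall.function.name, args);
    messages.push({
      role: 'tool',
      tool_call_id: toolCall.id,
      content: JSON.stringify(result),
    });
  }
  // Send a follow-up request so the model can produce the final answer for the user.
  const followUp = await openAiCall(messages);
  return { text: followUp.choices[0].message.content };
}

Streamed tool calls are more involved, since the function arguments arrive as fragments on delta.tool_calls and have to be accumulated before they can be parsed.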
Let me know if you have any difficulties. Thanks.
I suspected it. Now I have confirmation. Thank you for the link :) Best
Hi Ovidijus,
I'm nearing completion of the code, but I've run into an issue related to stream handling and tool_call. Specifically, the problem arises when I try to close the stream in the handler.
Here’s the sequence of actions in my current process:
The response appears correctly in the chat. However, I'm facing a challenge with the code for the stream handler, which is structured as follows:
....
onclose() {
  if (!service.asyncCallInProgress) {
    signals.onClose();
  }
  service.emitter.on('asyncCallTerminate', () => {
    signals.onClose();
    service.asyncCallInProgress = false;
  });
}
I emit the asyncCallTerminate event after sending the OpenAI response. However, when I call signals.onClose(), I get:
I apologize for all these questions and truly appreciate any guidance. Best
Hmmmmm, usually this error is given when no response was streamed to the chat, but you said the response appears correctly in the chat, which is a little strange. May I ask if you are using the stream property?
Also, the asyncCallInProgress property is meant to be used internally within Deep Chat. I don't think this makes any difference if you are using it in the handler, but may I ask why you are using it?
This information should hopefully help me solve the issue. Thanks!
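In the meantime, one thing worth checking in your snippet: the asyncCallTerminate listener is registered inside onclose, so every call to onclose adds another listener, and signals.onClose() can end up firing more than once. Here is a rough sketch of an alternative structure, assuming service.emitter is a standard EventEmitter-style object. This is only an illustration based on the snippets you shared, not Deep Chat code.

// Sketch only: register the terminate listener once, and ensure signals.onClose()
// can never fire more than once.
function createStreamHandlers(service, signals) {
  let closed = false;

  const closeOnce = () => {
    if (closed) return;
    closed = true;
    signals.onClose();
  };

  // Registered a single time, instead of inside onclose() where repeated calls
  // would stack up duplicate listeners.
  service.emitter.once('asyncCallTerminate', () => {
    service.asyncCallInProgress = false;
    closeOnce();
  });

  return {
    onclose() {
      // If no async tool call is pending, the stream can be closed immediately.
      if (!service.asyncCallInProgress) closeOnce();
      // Otherwise we wait for the asyncCallTerminate event registered above.
    },
  };
}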
Hey Ovidijus, I finally found a solution for this case. I'm closing the ticket. Thank you.
Glad you got it resolved @devpulse01. Could you share what the solution was if it was to do with Deep Chat, or was it something unrelated?
Hi Ovidijus!
I have a question regarding the use of the onResponse return value when dealing with a tool_call response from OpenAI.
Context:
While I can successfully return various response types such as text and files, as described here https://deepchat.dev/docs/connect#Response, I'm not sure how to integrate this with a tool_call.
For instance, when dealing with streaming, returning the response like this works perfectly:
return { 'text': response.choices[0].delta.content };
For non-streaming responses:
return { 'text': response.choices[0].message.content };
However, I'm unsure about handling a tool_call. Does this imply that in scenarios involving a proxy and a custom handler, I would need to handle the tool_call process on my own?
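For reference, the kind of response I am talking about is the standard OpenAI tool_calls payload, which (abridged, with example values) looks roughly like this; message.content is null, so there is nothing obvious to map to { 'text': ... }:

// Abridged example of a non-streamed OpenAI tool_calls response (example values only).
const exampleToolCallResponse = {
  choices: [
    {
      finish_reason: 'tool_calls',
      message: {
        role: 'assistant',
        content: null, // no text to display in the chat
        tool_calls: [
          {
            id: 'call_abc123', // example id
            type: 'function',
            function: {
              name: 'get_current_weather',        // example function name
              arguments: '{"location": "Paris"}'  // JSON-encoded string
            }
          }
        ]
      }
    }
  ]
};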
Thank you!