[Open] backslash112 opened this issue 2 months ago
Sorry, I was planning to fix this earlier but was in the middle of our big migration to a monorepo. Looking into this now.
Fixed in the latest release.
I would like to reopen this issue. Regarding the second question: I don't understand how to do this with `.chat`.
I can see that it is supposed to return a `ReadableStream`, and I have set `stream` to `true`, but I cannot get it to work.
Any examples or ideas, @dosco?
@taieb-tk have you looked at the streaming1.ts and streaming2.ts examples? `stream: true` enables streaming with the underlying LLM provider to speed things up; the final fields are not streamed out.
@dosco Yes, I did. I could not get it to work, probably a skill issue on my side. I tried to just use the following:
```ts
const ai = new ax.AxAIOpenAI({ apiKey: apiKey as string });
ai.setOptions({ debug: true });

const response = await ai.chat({
  chatPrompt: conversationHistory,
  config: {
    stream: true,
  },
  ...(tools?.length && { functions: normalizeFunctions(tools) }),
});
```
Not sure what to do with the response in the next step... Could you possibly help me? :)
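For context, the generic pattern I would expect to use for draining a web `ReadableStream` of text chunks is something like this; whether `ai.chat()` with `stream: true` actually returns a `ReadableStream` is my assumption here, and `collectChunks` plus the stand-in stream are just for illustration:

```typescript
// Generic helper for draining a web ReadableStream of text chunks.
// NOTE: whether ai.chat() with stream: true returns a ReadableStream
// like this is an assumption -- check the library's TypeScript types.
async function collectChunks(stream: ReadableStream<string>): Promise<string> {
  const reader = stream.getReader();
  let text = '';
  for (;;) {
    const { value, done } = await reader.read();
    if (done) break;
    text += value;
  }
  return text;
}

// Stand-in stream so the helper can be exercised without the library.
const demo = new ReadableStream<string>({
  start(controller) {
    for (const chunk of ['Hel', 'lo ', 'world']) controller.enqueue(chunk);
    controller.close();
  },
});

collectChunks(demo).then((text) => console.log(text)); // prints "Hello world"
```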
I'm submitting a ...
[x] question about how to use this project

Summary

I'm encountering two problems when working with the streaming example examples/streaming2.ts:

1. With `stream: true`, I get an error: `Missing required fields: answerInPoints`. What's causing this error, and how can I resolve it?
2. With `stream: true`, how can I access the result chunks? Are there methods similar to `for await (const chunk of result)` or `completion.data.on()` that I can use to process the incoming stream? (Similar to https://github.com/openai/openai-node/issues/18)

Any guidance on resolving these issues would be greatly appreciated. Thank you!
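For illustration, the `for await` pattern I have in mind, shown here on a stand-in async generator (the shape of the real completion object is exactly what this question is about, so everything below is a sketch, not the library's API):

```typescript
// Stand-in async generator; the real streaming object from the library
// may look different, so this is only illustrative.
async function* fakeCompletion(): AsyncGenerator<string> {
  for (const chunk of ['foo', 'bar', 'baz']) {
    yield chunk;
  }
}

// Collect every chunk with the `for await` pattern.
async function readAll(source: AsyncIterable<string>): Promise<string[]> {
  const chunks: string[] = [];
  for await (const chunk of source) {
    chunks.push(chunk);
  }
  return chunks;
}

readAll(fakeCompletion()).then((chunks) => console.log(chunks.join(''))); // prints "foobarbaz"
```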