vercel / modelfusion

The TypeScript library for building AI applications.
https://modelfusion.dev
MIT License

docs(AbstractOpenAIChatModel): add comments to explain parameters #204

Closed · bearjaws closed this 6 months ago

bearjaws commented 6 months ago

Added some docs for the commonly used props. I also noticed a bug with streaming output when using the n property. Maybe we need to throw an error there long term? I'm not sure how to separate the two outputs, since they both come in at the same time.

lgrammel commented 6 months ago

@bearjaws Thanks! Can you elaborate on the n=2 bug? It should be supported, even with streaming (just not available in the standard output).

bearjaws commented 6 months ago

> @bearjaws Thanks! Can you elaborate on the n=2 bug? It should be supported, even with streaming (just not available in the standard output).

How do I output the separate streams? I see that the return type is AsyncIterable, but I can't find any docs on how to capture each individual output in a streaming manner.

Using:

// assuming these imports from the modelfusion package
import { openai, streamText, OpenAIChatMessage } from "modelfusion";

const textStream = await streamText(
    openai.ChatTextGenerator({ model: "gpt-4-1106-preview", n: 2 }),
    [
        OpenAIChatMessage.user([
            { type: "text", text: "Create a story about shrek" },
        ]),
    ]
);

for await (const textPart of textStream) {
    process.stdout.write(textPart);
}

Result:

OnceOnce upon upon a a time time in in the a lush swamp, far verd removedant from sw theamps bustling of towns Far of Far the Away land, of there Far lived Far an Away og,re there lived named Sh anrek og.re Sh namedrek Sh wasrek a. solitary Sh creaturerek by rel natureished, his content solitude with and the the simple comfort pleasures of in his life swamp:, a a good lush mud retreat bath filled, with a the hearty sounds meal of of cro slakingugs frogs and and sn theails mur,mur and of the the occasional wind roar through to the keep trees the.

Both streams come out in realtime interleaved, which means the text is out of order.
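For context, this matches how the underlying OpenAI streaming API behaves: with n: 2, both completions arrive interleaved on a single stream, but every delta chunk carries a choice index that identifies which completion it belongs to. A minimal sketch using the official openai Node SDK (not ModelFusion's API, shown only to illustrate the mechanism) that separates the two completions by that index:

import OpenAI from "openai";

const client = new OpenAI(); // reads OPENAI_API_KEY from the environment

// Request two completions (n: 2) as a single stream of delta chunks.
const stream = await client.chat.completions.create({
    model: "gpt-4-1106-preview",
    messages: [{ role: "user", content: "Create a story about shrek" }],
    n: 2,
    stream: true,
});

// Collect the text of each completion separately, keyed by choice index.
const outputs = ["", ""];

for await (const chunk of stream) {
    for (const choice of chunk.choices) {
        // choice.index is 0 or 1; ignoring it is what interleaves the text.
        outputs[choice.index] += choice.delta.content ?? "";
    }
}

console.log(outputs[0]);
console.log(outputs[1]);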

lgrammel commented 6 months ago

> How do I output the separate streams? I see that the return type is AsyncIterable, but I can't find any docs on how to capture each individual output in a streaming manner.

I investigated this and it is currently not possible. I'll add it to the API. As a first step, I've fixed the output of the single stream: https://github.com/lgrammel/modelfusion/commit/af14f150b0cab1d451401e0632289b2b59d381b2 (will be included in the next release).
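To illustrate one direction such an API addition could take (purely a sketch under assumed types, not the actual ModelFusion implementation), an interleaved stream of indexed deltas can be demultiplexed into one independent async iterable per choice:

// Hypothetical shape of a streamed delta: the completion it belongs to
// (choice index) plus a text fragment. Not ModelFusion's actual type.
type IndexedDelta = { index: number; text: string };

// Split one interleaved delta stream into n independent async iterables,
// one per choice index, so each completion can be consumed on its own.
function demultiplex(
    source: AsyncIterable<IndexedDelta>,
    n: number
): AsyncIterable<string>[] {
    const queues: string[][] = Array.from({ length: n }, () => []);
    let done = false;

    // Shared signal, re-created on every push, so waiting consumers wake
    // up whenever new data (for any choice) arrives or the source ends.
    let wake: () => void = () => {};
    let signal = new Promise<void>((resolve) => (wake = resolve));

    // Pump the source once and route each delta into its choice's queue.
    (async () => {
        for await (const delta of source) {
            queues[delta.index].push(delta.text);
            const notify = wake;
            signal = new Promise<void>((resolve) => (wake = resolve));
            notify();
        }
        done = true;
        wake();
    })();

    return queues.map(async function* (queue) {
        while (true) {
            while (queue.length > 0) yield queue.shift()!;
            if (done) return;
            await signal; // suspend until the pump pushes more or finishes
        }
    });
}

Each returned iterable could then be consumed with its own for await loop, e.g. writing choice 0 and choice 1 to separate buffers.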